

For as long as computers have existed, physicists have used them as tools to understand, predict and model the natural world. Computing experts, for their part, have used advances in physics to develop machines that are faster, smarter and more ubiquitous than ever. This collection celebrates the latest phase in this symbiotic relationship, as the rise of artificial intelligence and quantum computing opens up new possibilities in basic and applied research.
As quantum computing matures, will decades of engineering give silicon qubits an edge? Fernando Gonzalez-Zalba, Tsung-Yeh Yang and Alessandro Rossi think so
Physicist and Raspberry Pi inventor Eben Upton explains how simple computers are becoming integral to the Internet of Things
Physics World journalists discuss the week’s highlights
James McKenzie explains how Tim Berners-Lee's invention of the World Wide Web at CERN has revolutionized how we trade.
Tim Berners-Lee predicts the future of online publishing in an article he wrote for Physics World in 1992
Jess Wade illustrates the history of the World Wide Web, from the technology that enabled it to the staple of everyday life it is today
Emerging technologies shaping our connected world
Fifth episode in mini-series revisits the birth of the Web and the challenges it now faces
Computing is transforming scientific research, but are researchers and software code adapting at the same rate? Benjamin Skuse finds out
There’s a scientific reason why Twisters is set in the US Great Plains rather than Argentina, and it has to do with the Gulf of Mexico
New gravitational field model quantifies the "divergence problem" identified in 2022
Relationship between mass, wing area and wingbeat frequency holds true for insects, bats, birds, whales and even a flapping robot
New "wavefunction matching" method correctly predicts nuclear radii of elements with atomic numbers from 2 to 58
New work sheds light on vortex flows involved in mixing and transporting ooplasmic components that cells need to develop
Simulations should be designed to minimize energy consumption, say physicists
Introducing artificial intelligence into the clinical workflow helps radiologists detect lung cancer lesions on chest X-rays and dismiss false positives
Algorithms help materials scientists recognize patterns in structure-function relationships
A deep learning algorithm detects brain haemorrhages on head CT scans with comparable performance to highly trained radiologists
An artificial intelligence model can identify patients with intermittent atrial fibrillation from scans performed during normal heart rhythm
Proof-of-concept demonstration done using two superconducting qubits
An image-based artificial intelligence framework predicts a personalized radiation dose that minimizes the risk of treatment failure
A machine learning algorithm can read electroencephalograms as well as clinicians
Hopping-based logic achieved at high fidelity
A new approach could put Majorana particles on track to become a novel qubit platform, but some scientists doubt the results’ validity
Proof-of-concept demonstration yields encoded magic states that are robust against any single-qubit error
Researchers discover that operating close to a phase transition produces optimal error suppression in so-called cat qubits
Confining ions with static magnetic and electric fields instead of an oscillating radiofrequency field reduces heating and gives better position control
Proposed device might aid the development of quantum computers