Tristan Greene, Editor, Neural by TNW
Tristan is a futurist covering human-centric artificial intelligence advances, quantum computing, STEM, physics, and space stuff. Pronouns: He/him
A team of researchers from Harvard recently published early work indicating that a complex theory of quantum physics could be exploited to create huge, high-resolution telescopes. Astronomers of the future will see the distant reaches of our universe through the magic of teleportation.
Technically, it’s called “entanglement,” but it works pretty much like teleportation. Basically, a couple of quantum particles become “entangled” with one another in such a way that anything that happens to one particle happens to the other, even if they’re separated by physical distance.
We wrote a whole thing about it you can read here.
The big idea here is that with the assistance of quantum technology, we can make really, really big telescopes. And that’s a pretty big deal. The biggest telescope we can make right now comes in the form of the Extremely Large Telescope (seriously, that’s its name). Its mirror is a mere 40 meters across and it cost a cool billion dollars.
Luckily, engineers have come up with a solution to the ridiculous expense of building giant telescope mirrors: they put smaller mirrors in groups called arrays. Unfortunately, these arrays can only get so big before there’s too much signal degradation – called noise – for scientists to get useful images from them. That limit is somewhere around the equivalent of a 330-meter mirror, if you judge by the largest array made so far.
These telescope systems represent what many experts consider to be the peak of what we’re capable of doing under today’s industrial limits. Sure, with a little more time and billions of dollars more we might reach some modest size increases, but we’re probably hovering around the edge.
Quantum entanglement could change all of that, but unfortunately the way it would (theoretically) work involves shooting a constant stream of entangled photons off into space. Despite recent breakthroughs, our grasp of quantum computing, as a species, isn’t strong enough to perform that kind of Herculean feat of quantum engineering. The sheer number of photons involved reaches into, well, astronomical proportions.
But what if there was a way to cut the number of entangled photons needed down to a more practical number?
That’s exactly what the Harvard team did.
Basically, their work indicates that by exploiting a phenomenon called “quantum memory,” the number of entangled photons needed for the telescope of the future to function can be cut dramatically.
According to the researchers:
The necessary rate of entanglement distribution is reduced by several orders of magnitude, which opens up realistic prospects for employing near-term quantum networks for high-resolution imaging.
What does this mean? For starters, the team suggests a telescope array could be created with a size-equivalent of 30 km – 100 times larger than the biggest by today’s standards.
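To get a feel for what that size jump buys, here’s a back-of-the-envelope sketch using the standard Rayleigh diffraction limit (smallest resolvable angle ≈ 1.22 × wavelength ÷ aperture). The 550 nm wavelength is our illustrative assumption for visible light, not a figure from the Harvard paper:

```python
import math

# Rayleigh diffraction limit: theta ~= 1.22 * wavelength / aperture (radians).
# 550 nm (green light) is an illustrative assumption, not from the paper.
WAVELENGTH_M = 550e-9
RAD_TO_MAS = 206_264.806 * 1000  # radians -> milliarcseconds

def resolution_mas(aperture_m: float) -> float:
    """Smallest resolvable angle, in milliarcseconds, for a given aperture."""
    return 1.22 * WAVELENGTH_M / aperture_m * RAD_TO_MAS

# Aperture sizes mentioned in the article.
for label, diameter in [("ELT mirror (40 m)", 40),
                        ("largest array (~330 m equivalent)", 330),
                        ("proposed quantum array (30 km)", 30_000)]:
    print(f"{label}: {resolution_mas(diameter):.5f} mas")
```

Resolution scales inversely with aperture, so a 30 km array would resolve details roughly 750 times finer than the ELT’s 40-meter mirror, at least in principle.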
As the Harvard team’s work informs further development – and quantum computing hardware keeps advancing – it’s looking likely that telescope arrays will grow larger and sharper while requiring less expensive materials.
Eventually this means costs should come down, which will go a long way towards ushering in the quantum era of astronomy.
It’s entirely feasible that we’ll one day have planet-sized telescopes moving throughout our galaxy, capturing images of planets being born, stars dying, and intergalactic pizza delivery drones bringing Earth’s greatest export to the hungry masses.
One day we’ll reveal the darkest corners of our universe. And we’ll teleport pictures of what dares dwell in them across its expanse, thanks to the wacky nature of quantum physics.