Napier Lopez is a writer based in New York City. He's interested in all things tech, science, and photography related, and likes to yo-yo in his free time. Follow him on Twitter.
Our phones get better at capturing images in low light every year, but as far as I know, no phone has been good enough to capture stars in the night sky. If a new leak is to be believed, Google might be the one to change that.
Spanish tech site Pro Android posted what appears to be an upcoming marketing video for the Pixel 4. The video shows off some features we already knew about, including air gestures using Project Soli, tight Google Assistant commands, and the Pixel’s famous Night Sight.
But then the video goes a step further, claiming “you even get the stars,” showing off what appears to be a high-quality photo of the night-time sky, enough to see the Milky Way take shape. I say “appears” because the video is potato quality. And here I thought we’d gotten over Mr. Blurrycam in 2019.
The original video has since been removed, but 9to5Google has re-uploaded it here:
It would be quite the effort for someone to put together a video this professional – not to mention one that introduces such a plausible new feature – as a hoax. Given the effort Google has put into making Night Sight the de facto standard in low-light mobile photography, it’s not surprising to see the company take things a step further.
If real, I’d be curious to see if and how Google is able to overcome typical problems with astrophotography. Does Google make any attempt to combat light pollution, for instance? I’d be impressed if it let me capture more than a handful of stars in NYC’s night sky. And if you want any visible foreground objects in your shot, you’ll also need to balance exposures for the sky and the ground differently.
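For the curious, that sky/ground problem boils down to a two-exposure blend: a long exposure that reveals the stars will blow out the ground, and a shorter one that exposes the ground correctly will leave the sky nearly black. Here’s a minimal sketch in Python with NumPy of how such a merge can work in principle – purely illustrative, and in no way a description of Google’s actual pipeline (the function and mask here are my own invention):

```python
import numpy as np

def blend_exposures(long_exp, short_exp, sky_mask):
    """Blend two exposures of the same scene.

    Uses the long exposure where sky_mask is 1.0 (the sky),
    the short exposure where it is 0.0 (the ground), and mixes
    linearly for intermediate mask values. All inputs are float
    arrays in [0, 1] with the same shape.
    """
    return sky_mask * long_exp + (1.0 - sky_mask) * short_exp

# Toy 4x4 "image": top two rows are sky, bottom two are ground.
long_exp = np.full((4, 4), 0.8)   # stars visible, ground blown out
short_exp = np.full((4, 4), 0.3)  # ground exposed, sky nearly black
sky_mask = np.zeros((4, 4))
sky_mask[:2, :] = 1.0             # mark the top rows as sky

result = blend_exposures(long_exp, short_exp, sky_mask)
```

A real implementation would need to detect the horizon to build the mask and feather the boundary, but the core idea – picking different exposures per region and blending – is the same.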
There are even cameras specially made or modified for astrophotography, such as the Nikon D810A, which come with optimized sensor filters that let them better capture some of the wavelengths typical of celestial objects. It’s possible Google might try to replicate those sensor filters by enhancing certain colors via AI, but that’s just my own guess.
Google typically announces its Pixel phones in early October, so it shouldn’t be much longer until we find out.