Through a combination of hardware, software and artificial intelligence technology, Google has made significant camera improvements to its smartphones. And CNET has the exclusive deep dive into exactly what the company has done.
Because better photography is a key reason to upgrade a smartphone, Google’s camera tech positions its $899 flagship Android phone to compete directly against Apple’s $1,099 iPhone 14 Pro Max.
Google’s Pixel phones haven’t sold well compared with models from Samsung and Apple. But they’ve earned high marks for photography year after year. And if anything is going to win customers over, it’ll be camera technology.
Last year’s Pixel 6 Pro sported a “camera bar” housing three rear cameras — its 50-megapixel main camera, a 0.7x ultrawide angle and a 4x telephoto. The Pixel 7 models’ main camera gets a new 50-megapixel sensor. On the Pixel 7 Pro, the ultrawide has the same sensor as last year but gets a macro mode, autofocus and a wider 0.5x zoom. The 7 Pro’s telephoto’s zoom magnification extends to 5x, and a new 48-megapixel telephoto sensor enables a 10x zoom mode without using any digital magnification tricks.
But the Pixel 7 Pro’s improved hardware foundation is only part of the story. New computational photography technology, enabled by new AI algorithms, speeds up Night Sight, unblurs faces, stabilizes video better and merges data from multiple cameras to improve image quality at intermediate zoom levels like 3x or 7x. That’s more like how a traditional camera behaves.
“What we really tried to do is give you a 12mm-240mm camera,” said Pixel camera hardware leader Alexander Schiffhauer, translating the Pixel 7 Pro’s 0.5x to 10x magnification range into traditional 35mm camera terms. “That’s like the Holy Grail travel lens,” an all-purpose setup that photo enthusiasts have long enjoyed for portability and flexibility.
Here’s a deeper look into what Google is up to.
Virtual 2x and 10x cameras on the Pixel 7 Pro
High-end smartphones now come with at least three rear-facing cameras so you can capture a wider range of shots. Ultrawide angle cameras are good for photographing people crammed into a room and interesting buildings. Telephoto cameras are better for portraits and more distant subjects.
But having a big gap between zoom levels can be a problem. The Pixel 7 Pro’s 50-megapixel main camera and 48-megapixel telephoto cameras offer something of a fix.
Ordinarily, those cameras use a technique called pixel binning to turn a 2×2 grid of pixels into a single larger pixel. That makes 12-megapixel shots with better color and a wider dynamic range of dark and light tones.
But by skipping pixel binning and using only the central 12 megapixels, the 1x main camera can take a 2x shot, and the 5x telephoto can take a 10x shot. The smaller pixels mean image quality isn’t as high, but it’s still useful, and Google applies its Super Res Zoom technology to improve color and sharpness, too. (The 2x mode works on Google’s Pixel 7, too.)
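The relationship between binning and the binning-free 2x mode can be sketched in a few lines. This is a simplified illustration, not Google's pipeline: real sensors bin same-colored pixels within a quad-Bayer mosaic, while here a plain grayscale array stands in for the sensor.

```python
import numpy as np

def bin_2x2(sensor):
    """Average each 2x2 block of pixels into one larger 'pixel'."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def central_crop(sensor, frac=0.5):
    """Use only the central region of the sensor. Half the width and
    height gives a quarter of the pixels but a 2x-zoomed field of view
    at full pixel density -- the basis of the binning-free 2x mode."""
    h, w = sensor.shape
    ch, cw = int(h * frac), int(w * frac)
    top, left = (h - ch) // 2, (w - cw) // 2
    return sensor[top:top + ch, left:left + cw]

# A toy 8x8 "sensor": binning yields a 4x4 image; the central crop is
# also 4x4 but covers half the scene, i.e. a 2x zoom.
sensor = np.arange(64, dtype=float).reshape(8, 8)
print(bin_2x2(sensor).shape)       # (4, 4)
print(central_crop(sensor).shape)  # (4, 4)
```

The same arithmetic scales up: a 50-megapixel sensor bins down to roughly 12 megapixels, and its central quarter is also roughly 12 megapixels at twice the magnification.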
Apple took the exact same approach with its iPhone 14 Pro phones, but only with the main 1x camera.
Where Apple goes further is in letting you shoot full 48-megapixel photos with the iPhone 14 Pro’s 1x camera. At 1x, Google always uses pixel binning and thus captures only 12 megapixels.
‘Zoom fusion’ for more photo flexibility
When shooting between 2.5x and 5x zoom, the Pixel 7 Pro blends image data from the wide and telephoto cameras to improve image quality. That improves photos compared with just digitally upscaling a photo from the main camera, Schiffhauer said.
But it’s difficult. The phone has to reconcile the two cameras’ perspectives, which means foreground objects block ones in the background differently. You can see this for yourself by covering first one eye and then the other to see how a scene changes. The two cameras also focus differently because of their different focal lengths.
To avoid discontinuities, Google uses artificial intelligence, also called machine learning, and other processing techniques to figure out which portions of each image to include or reject.
Zoom fusion takes place after other processing methods. Those include HDR+, which merges several frames into one image for better dynamic range, and an AI algorithm that monitors hand shake to take photos when the camera is most stable.
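The core blending step can be sketched as a per-pixel weighted merge. This is only an illustration of the general idea: in Google's pipeline the validity mask comes from ML-based alignment and occlusion detection, while here it is simply an input.

```python
import numpy as np

def fuse(wide_up, tele, valid_mask):
    """Blend an upscaled wide-camera image with a telephoto image.

    `valid_mask` holds 0..1 weights marking where the telephoto data
    is trustworthy; occluded or misaligned regions fall back to the
    upscaled wide image so no seams appear.
    """
    return valid_mask * tele + (1.0 - valid_mask) * wide_up

wide_up = np.full((4, 4), 0.5)   # digitally upscaled main-camera crop
tele = np.full((4, 4), 0.8)      # sharper telephoto data
mask = np.ones((4, 4))
mask[0, :] = 0.0                 # a parallax-occluded strip: keep wide data
fused = fuse(wide_up, tele, mask)
print(fused[0, 0], fused[1, 1])  # 0.5 0.8
```

Soft mask values between 0 and 1 let the blend feather gradually at region boundaries instead of switching abruptly between cameras.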
Unfortunately for those who like to shoot raw photos, an image format that offers higher quality and more editing flexibility, zoom fusion isn’t an option there. You’ll get full 12-megapixel raw images only at the Pixel 7 Pro’s fixed zoom levels of 0.5x, 1x, 2x, 5x and 10x.
The Pixel 7 Pro’s new unblurring technology
Google introduced technology in 2021 to marry data from the main and ultrawide cameras to cope with motion blur in faces that can spoil photos. This face unblur technology now kicks in three times more often, said Pixel camera software leader Isaac Reynolds.
Specifically, it’ll work more often in ordinary light, activate more when it’s dim, and function even on faces that aren’t in focus.
For processing photos after the fact, the Google Photos app gets a new tool to unblur shots. It works even with digitized film photos from the bygone age of analog photography.
Eye detection and other autofocus upgrades
The Pixel 7 Pro has several autofocus improvements, starting with the addition of autofocus hardware on the ultrawide camera. For all the cameras, though, Google now uses an AI algorithm to process focusing data from the image sensor.
The camera also will be able to spot eyes, not just faces, for autofocus that works more like that found in high-end cameras from companies like Sony, Nikon and Canon.
New AI technology also can focus better as people move in a scene. “If someone turns their head away from the camera or walks away, we can maintain focus on their head,” Reynolds said.
The Pixel 7 phones also do better when faces are hard to recognize, like with big hats or really large dark sunglasses.
And new AI-based autofocus technology makes it much faster to switch to telephoto shooting. The Pixel 6 Pro often pauses when activating its telephoto camera.
Better photo upscaling
From 5x to 10x zoom, the Pixel 7 Pro uses the central pixels of the 5x camera to take a 12-megapixel photo.
With digital magnification methods, though, the camera can reach up to 30x zoom, up from 20x in the Pixel 6 Pro. Google developed a method called Super Res Zoom that takes advantage of hand shake to gather more detailed data about the subject and zoom better.
Google’s digital zoom also can use AI techniques to magnify images. This year, Google trained its AI to predict new pixels better. The phone also calculates a scene attribute called an anisotropic kernel to model the subtle changes from one pixel to its neighbors, helping it fill in new data while magnifying.
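The idea behind an anisotropic kernel can be sketched as a Gaussian stretched along a local edge direction, so interpolation smooths along edges rather than blurring across them. This is a generic illustration of the concept; Google's actual kernel estimation is not public.

```python
import numpy as np

def anisotropic_kernel(size, theta, sigma_along, sigma_across):
    """Gaussian sampling kernel elongated along an edge direction.

    `theta` is the edge orientation in radians; weights spread widely
    along the edge (sigma_along) but stay tight across it
    (sigma_across), preserving sharp transitions while upscaling.
    """
    r = size // 2
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    # Rotate pixel offsets into the edge-aligned frame.
    u = xs * np.cos(theta) + ys * np.sin(theta)   # along the edge
    v = -xs * np.sin(theta) + ys * np.cos(theta)  # across the edge
    k = np.exp(-0.5 * ((u / sigma_along) ** 2 + (v / sigma_across) ** 2))
    return k / k.sum()

# For a horizontal edge (theta = 0), weight spreads sideways but not
# vertically, so new pixels are filled in from along-edge neighbors.
k = anisotropic_kernel(5, theta=0.0, sigma_along=2.0, sigma_across=0.5)
print(k[2, 0] > k[0, 2])  # True: more weight along the edge than across it
```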
“Obviously, the quality of 30x isn’t going to be quite what it is at 10x,” Schiffhauer said. “But you still get these really beautiful photos that you can share.”
Faster Night Sight photos
Night Sight, the pioneering and now widely copied technology to take better shots when it’s dim or dark, is now twice as fast. That’s because Google uses image frames it collects from before you tap the shutter release button. (That’s possible because the camera continuously collects imagery, stashing it in memory but only keeping it if you actually take a photo.)
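The pre-shutter trick described above is a form of zero shutter lag, which can be sketched with a ring buffer of recent viewfinder frames. The frame counts and the merge step here are illustrative assumptions, not Google's actual pipeline.

```python
from collections import deque

class ZeroShutterLagBuffer:
    """Keep the most recent viewfinder frames so a capture can include
    frames from *before* the shutter tap."""

    def __init__(self, capacity=8):
        self.frames = deque(maxlen=capacity)  # oldest frames drop off

    def on_viewfinder_frame(self, frame):
        self.frames.append(frame)  # continuously stashed in memory

    def capture(self, post_tap_frames):
        # Merge buffered pre-shutter frames with frames shot after the
        # tap, cutting the post-tap exposure time the user waits for.
        return list(self.frames) + list(post_tap_frames)

buf = ZeroShutterLagBuffer(capacity=4)
for f in range(10):
    buf.on_viewfinder_frame(f)   # only frames 6..9 survive
burst = buf.capture([10, 11])
print(burst)  # [6, 7, 8, 9, 10, 11]
```

Because half the frames already exist when you tap the shutter, the wait for a merged low-light photo is roughly halved, matching the speedup described above.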
“Users are waiting half as much time to capture a Night Sight photo,” Reynolds said. “They’re getting sharper results, and they’re paying no penalty on noise.”
For the Pixel line’s astrophotography mode — the extreme version of Night Sight — new AI technology for removing noise speckles now preserves stars better.
Better stabilization and other video improvements
Google also overhauled the Pixel 7 cameras’ video, a weak point compared with iPhones in many reviewers’ eyes. For starters, all Pixel 7 Pro cameras shoot up to 60 frames per second at 4K resolution now, but there’s a lot more:
- The cameras can shoot in a 10-bit HDR (high dynamic range) mode for better high contrast scenes, like those with bright skies or dark shadows.
- Google is now on its eighth generation of video stabilization technology. It helps particularly when tracking moving subjects with heavy zoom magnification.
- Speech enhancement technology better captures subjects’ voices when you’re shooting with the rear-facing cameras.
- More serious videographers can lock white balance, exposure and focus — something possible before only with photos.
- Cinematic blur lets you artificially blur backgrounds in video, a feature previously available only for photos.
- Copying the iPhone’s approach, Google now keeps time lapse videos to 15 to 30 seconds. Previously, you’d have to calculate the best settings yourself.
- And with a technology called blur injection, the Pixel 7 Pro can ease the sometimes jittery video that you get when it’s bright out and shutter speeds are extremely fast.
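The arithmetic behind a fixed-length time lapse, mentioned in the list above, is straightforward to sketch. The target length and playback frame rate below are illustrative assumptions, not Google's published values.

```python
def timelapse_interval(recording_seconds, target_output_seconds=30, fps=30):
    """Seconds between captured frames so that a recording of any length
    compresses into a fixed-length output clip."""
    frames_needed = target_output_seconds * fps
    return recording_seconds / frames_needed

# A 15-minute recording squeezed into a 30-second, 30fps clip needs one
# frame per second; record twice as long and the interval doubles.
print(timelapse_interval(15 * 60))  # 1.0
print(timelapse_interval(30 * 60))  # 2.0
```

This is the calculation users previously had to do by hand when choosing time lapse settings.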
Together, the improvements show Google is fighting hard to maintain its leading smartphone camera technology. “We’re pushing hardware, software and machine learning as far as you can go,” Schiffhauer said.