Google Pixel 4 could include new feature that dramatically improves display - PhoneArena

Google Pixel 4 could include new feature that dramatically improves display

When it comes to flagship smartphones, displays are often regarded as one of the most important features. Nowadays, most manufacturers opt for OLED panels that can display vibrant colors and deep blacks. But from the look of things, Google wants to take things a step further this year with a new feature that’ll automatically adjust the white balance based on ambient lighting conditions. 

A rival to Apple’s True Tone and Samsung’s Adaptive Display

The basis of Google’s new feature appears to be the concept of chromatic adaptation. Because of the way the human visual system works, an object appears to be the same color regardless of the lighting conditions it’s viewed under. Displays, however, are a totally different story and often appear too blue under certain lighting, which is where the new feature comes in.

Like Apple’s True Tone, Samsung’s Adaptive Display, and select other implementations from the likes of LG, Google’s upcoming feature should automatically adjust the white balance of the display to reflect the ambient lighting conditions. That way, the colors shown on the panel will appear consistent in all scenarios and look more natural. Another benefit should come in the form of reduced eye strain due to the lower amount of blue light.
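To make the idea concrete, here is a minimal Python sketch of how such a feature could work in principle. The sensor reading, function names, and gain model are illustrative assumptions, not Google’s actual implementation: it nudges the display’s white point from its neutral calibration toward the ambient correlated color temperature (CCT), which attenuates the blue channel under warm indoor light.

```python
# Illustrative sketch only: assumes an ambient light sensor that reports a
# correlated color temperature (CCT) in kelvin. Warmer ambient light
# (lower CCT) warms the display white point, reducing the blue cast.

def _clamp(x):
    """Keep a channel gain inside [0, 1]."""
    return max(0.0, min(1.0, x))

def white_balance_gains(ambient_cct, neutral_cct=6500.0, strength=0.5):
    """Return (red, green, blue) channel gains in [0, 1].

    The white point is nudged from the neutral calibration (typically
    ~6500 K / D65) toward the ambient CCT by the `strength` factor.
    The linear gain model below is a crude stand-in for a real
    chromatic-adaptation transform.
    """
    target = neutral_cct + strength * (ambient_cct - neutral_cct)
    shift = (neutral_cct - target) / neutral_cct  # positive when warming
    red = _clamp(1.0 + 0.3 * shift)
    blue = _clamp(1.0 - 0.3 * shift)
    return (round(red, 3), 1.0, round(blue, 3))

# Under warm 2700 K indoor lighting, the blue channel is attenuated:
print(white_balance_gains(2700.0))  # → (1.0, 1.0, 0.912)
```

A production version would work in a proper color space (e.g. CIE XYZ with a Bradford-style adaptation matrix) rather than per-channel gains, but the control loop — read ambient light, compute a target white point, shift the panel toward it — is the same.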

It should be noted that there’s no guarantee this feature will make it into the Pixel 4 and Pixel 4 XL later this year, but it seems likely. After all, in order to work it requires an entirely new sensor that isn’t found in any existing Google smartphones, meaning it could be one of the five visual sensors reportedly planned for the front of the next-gen flagships.

An all-new radar chip could make the cut too

Aside from the feature mentioned above, Google is rumored to be planning an all-new radar chip codenamed Project Soli that can track “micro motions.” It’s unclear at the moment what features the implementation will allow, but previous demos of the technology have pointed towards support for a variety of gesture-based controls.

Back in 2016, the Silicon Valley-based company showed off the technology working on both a smartwatch and a smart speaker. On the former, the Project Soli chip was used to control the interface, while on the latter it could be used to adjust the volume and play/pause music. Another idea Google showed off was a tablet user selecting a specific color in a drawing app.
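The demos above boil down to mapping a recognized micro-gesture to a media action. As a toy sketch of that mapping (the gesture names and state shape are assumptions for illustration, not Google’s actual API):

```python
# Toy gesture-to-action dispatcher in the spirit of the Soli demos:
# a virtual dial turn adjusts volume, a tap toggles play/pause.
# Gesture names and the player-state shape are illustrative assumptions.

ACTIONS = {
    "dial_turn_cw":  lambda s: {**s, "volume": min(100, s["volume"] + 5)},
    "dial_turn_ccw": lambda s: {**s, "volume": max(0, s["volume"] - 5)},
    "tap":           lambda s: {**s, "playing": not s["playing"]},
}

def handle_gesture(state, gesture):
    """Apply a recognized micro-gesture to the player state; ignore unknowns."""
    action = ACTIONS.get(gesture)
    return action(state) if action else state

state = {"volume": 50, "playing": False}
state = handle_gesture(state, "dial_turn_cw")  # volume up
state = handle_gesture(state, "tap")           # play/pause
print(state)  # → {'volume': 55, 'playing': True}
```

The hard part of a real system is of course the radar signal processing that classifies the gesture in the first place; once a gesture label exists, dispatch like this is straightforward.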

A square camera on the back and a "forehead" up front

Regarding the looks of the next-gen Pixel flagships, recent leaks have pointed towards the presence of a large square-shaped camera module on the rear that’ll house two cameras and an added ‘spectral sensor’ that has never been used on a smartphone before. The latter should be able to detect X-rays, ultraviolet, infrared, and other wavelengths that can’t be captured by standard cameras. Other features may include the ability to better identify materials and measure the distance to objects, thus potentially producing better portrait and low-light images.

In terms of the smartphone's front, rumor has it Google has chosen a design that includes minimal bezels, an extremely thin chin, and a “forehead” above that’ll house the typical selfie cameras in addition to a variety of sensors that could power a rival to Apple’s Face ID. This makes sense because the fingerprint scanner has seemingly been removed from the setup. Completing the flagship should be Qualcomm’s Snapdragon 855 paired with a minimum of 6GB of RAM and 128GB of storage, although other configurations are possible.
