How the Essential Phone 2 could pull off putting a camera behind the display

After rumors of Essential canceling plans for the Essential Phone 2, the company has recently filed a patent with WIPO (the World Intellectual Property Organization) describing a phone with a camera integrated into the screen itself. We do not mean a camera hole, like the one on the Samsung Galaxy S10, but a camera located under the screen.

The result would be a smartphone with a screen that goes from edge to edge without the need for sliding mechanisms.

Embedding a camera behind a display is a challenging feat to pull off, seeing as a traditional display is opaque, i.e., light cannot pass through it. Cameras capture images by registering the light that reaches their sensor, so how does Essential plan to make a camera work behind an opaque screen? Here is how we speculate Essential could pull that off.

Given the standard that mobile cameras have set with phones like the Pixel 3 and the Galaxy S10, making a selfie camera that can take pictures of that quality is difficult enough on its own, let alone putting it under the display.

The in-display fingerprint sensor in the OnePlus 6T is an optical fingerprint reader, which means it captures the light reflected off a fingertip to get a photo detailed enough to make out the fingerprint pattern and authenticate a login.

The Essential Phone 2 could build on that technology to capture images, through a combination of hardware and software. On the hardware side, only a small part of the screen needs to let light pass through to the sensor, which makes it cheaper and more feasible to refine that region of the display to be somewhat transparent for taking pictures.


Optical fingerprint scanners use a charge-coupled device, or CCD, similar to those found in digital cameras, to generate an image of your fingerprint. Light illuminates your finger, and light-sensitive diodes on the surface of the CCD measure the intensity of the reflected light at many points. The result is an image with light and dark patterns, essentially a black-and-white photo.
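
To make that concrete, here is a minimal sketch in Python, with made-up diode readings purely for illustration, of how a grid of reflected-light intensities becomes that kind of light-and-dark pattern:

```python
import numpy as np

# Hypothetical grid of reflected-light intensities read from the
# CCD's light-sensitive diodes (0 = no reflected light, 255 = max).
rng = np.random.default_rng(seed=42)
diode_readings = rng.integers(0, 256, size=(8, 8))

# Normalize the raw intensities to the 0.0-1.0 range.
normalized = diode_readings / 255.0

# Threshold into a light/dark pattern: readings brighter than the
# average become "light" (1), the rest "dark" (0). The result is
# effectively a black-and-white photo of the ridge pattern.
pattern = (normalized > normalized.mean()).astype(np.uint8)

print(pattern)
```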

A selfie camera, however, has to capture full color, and unless the display is completely transparent, it will distort some of those colors, if not all of them. That is where software would help: machine learning could be used to remove those discrepancies.

A system could be developed where images taken directly with a regular camera are compared against images of the same scene taken with the under-screen camera, and the differences used to calibrate the under-screen camera.
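
As a rough illustration of what that calibration could look like, here is a minimal Python sketch, assuming we already have matched pixel values from a reference camera and the under-screen camera. It fits a simple 3x3 color-correction matrix by least squares; a real system would likely train a machine learning model to handle blur and diffraction as well, but the idea of learning a mapping from paired images is the same:

```python
import numpy as np

def fit_color_correction(under_screen_rgb, reference_rgb):
    """Fit a 3x3 matrix M so that under_screen_rgb @ M approximates
    reference_rgb, using ordinary least squares. Both inputs are
    (N, 3) arrays of matching pixels of the same scene: one taken
    through the display, one with a normal camera."""
    matrix, _, _, _ = np.linalg.lstsq(under_screen_rgb, reference_rgb,
                                      rcond=None)
    return matrix

def correct(image_rgb, matrix):
    """Apply the learned correction to an (H, W, 3) image."""
    h, w, _ = image_rgb.shape
    corrected = image_rgb.reshape(-1, 3) @ matrix
    return np.clip(corrected, 0.0, 1.0).reshape(h, w, 3)

# Toy example: pretend the display dims red, mutes blue, and leaks
# a little of everything into the green channel.
rng = np.random.default_rng(0)
reference = rng.random((1000, 3))           # "ground truth" pixels
display_distortion = np.array([[0.9, 0.1, 0.0],
                               [0.0, 0.8, 0.0],
                               [0.0, 0.1, 0.6]])
through_display = reference @ display_distortion

M = fit_color_correction(through_display, reference)
recovered = through_display @ M
print("mean error after correction:", np.abs(recovered - reference).mean())
```

With a clean linear distortion like the toy one above, the fit recovers the original colors almost exactly; the real distortion through a display would be messier, which is why a trained model rather than a single matrix would probably be needed.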

Another option would be to use a transparent display.


LG, Panasonic, and even Samsung have made transparent displays for TVs. When they are turned off, they look completely transparent, like regular glass, which would allow a camera to take pictures through them quite comfortably.

OLED screens have a layer of glass on both sides of the OLED stack, which consists of an emissive and a conductive layer. Electrical impulses travel through the conductive layer and produce light at the emissive layer. Because OLEDs produce their own light, the screens can be much thinner and do not require a backlight. The narrow gaps between the pixels of the screen, as well as the clear cathodes within, allow the screens to be transparent.

Transparent LCDs use ambient light, such as sunlight, instead of an electrical backlight. The lack of a backlight allows the screens to be much thinner as well as see-through. See-through LCD screens are a cheaper alternative to OLEDs; however, their use is limited by that dependence on ambient light.


If Essential wants to go with any of these technologies, I think it will be a long time, maybe a year or two, before we see the phone.

What do you think? Have another idea? Leave a comment below.