Proximity sensors have been around on our phones for quite a while. Apart from detecting the presence of a human ear, they also lock the screen when the phone is in a pocket and enable gesture controls. As many of us have noticed, though, the proximity sensor is not always accurate and can lag in response.


Elliptic Labs made its presence felt at CES a few years ago, when it demonstrated 3D gestural interaction based on ultrasonic wave technology. The demo allowed users to navigate through apps, control music playback, and answer calls using gestures alone. This is something you will closely relate to if you have watched the first season of the TV series "Black Mirror".

Now the company has decided to use its ultrasonic technology for proximity sensing and take on infrared-based proximity sensors. The phone's earpiece acts as a transmitter, sending out short pulses of sound, while the microphones measure the intensity of the reflected waves, allowing Elliptic Labs' software to dim the display accordingly.
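The idea described above can be reduced to a simple decision rule: a strong echo means something is close to the earpiece, so the display should dim. Here is a minimal sketch of that logic in Python. The normalized intensity readings, the threshold value, and the function names are all illustrative assumptions, not Elliptic Labs' actual implementation.

```python
# Hypothetical sketch of intensity-based ultrasonic proximity sensing.
# The threshold and readings below are illustrative assumptions only.

NEAR_THRESHOLD = 0.6  # assumed normalized echo intensity marking "object nearby"

def is_near(echo_intensity: float) -> bool:
    """Classify a single reflected-wave intensity reading (0.0 to 1.0).

    A strong echo means the transmitted pulse bounced off something
    close to the earpiece, e.g. the user's ear during a call.
    """
    return echo_intensity >= NEAR_THRESHOLD

def display_state(echo_intensity: float) -> str:
    """Return the display state the phone would switch to."""
    return "dimmed" if is_near(echo_intensity) else "on"

# Example readings: weak echo (phone away from face), strong echo (ear close)
for reading in (0.12, 0.85):
    print(reading, "->", display_state(reading))
```

A real implementation would filter many readings over time rather than act on a single sample, but the core near/far decision is of this shape.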

Proximity sensing is one of the most basic features of any smartphone, but an IR sensor module leaves an ugly dark spot on the front of the phone, something an ultrasonic sensor won't do. The ultrasonic sensor promises to be more accurate, less obtrusive, and to take gesture control to the next level. It also reduces power consumption and manufacturing cost. This gives Elliptic Labs enough ammunition to attract hardware partners to incorporate its technology. The company expects ultrasonic proximity sensors to make it to production devices by the end of this year.
