Elliptic Labs Investor Relations Material
Norwegian Ultrasound Sensors
Elliptic Labs specializes in virtual smart sensors for a range of electronic devices. Its technology is widely used in smartphones, laptops, IoT devices, automotive applications, smart speakers, and televisions. The company combines AI, ultrasound, and sensor fusion to deliver solutions such as 3D gesture recognition, proximity sensing, presence detection, and monitoring of breathing and heartbeats. Elliptic Labs is headquartered in Oslo, Norway, and its shares are listed on the Oslo Stock Exchange.
A Research Spin-off
Elliptic Labs was founded in 2006 as a spin-off from ultrasound research at the University of Oslo; Laila Danielsen, a Norwegian technology executive, serves as the company's CEO. The company's early research and development efforts were directed toward integrating ultrasound technology into mobile devices, enabling users to control their gadgets through hand gestures without physical contact.
AI Virtual Smart Sensor Platform
Elliptic Labs' breakthrough came with its AI Virtual Smart Sensor Platform, which has been adopted by major smartphone manufacturers. By eliminating the need for physical sensors, this technology enhances device functionality while reducing hardware costs. Over the years, Elliptic Labs has expanded its applications to include smart home devices, laptops, and automotive systems. The company's technology is in use in over 500 million devices worldwide. Some of Elliptic Labs' publicly traded peers include Synaptics Incorporated, Cirrus Logic, and Infineon Technologies AG.
How it Works and Application Areas
Elliptic Labs' ultrasound-based approach offers several advantages over traditional sensors, including greater flexibility, reduced hardware costs, and the ability to create more intuitive and hygienic user interfaces. The company's technology uses ultrasound to enable touchless interaction with electronic devices. It begins with emitting ultrasonic waves from tiny transducers embedded in the device. When these waves encounter an object, like a user's hand, they reflect back to the device, where transducers capture the returning signals.
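Elliptic Labs does not publish its signal-processing pipeline, but the time-of-flight principle described above can be sketched in a few lines. The snippet below is an illustrative toy, not the company's method: it assumes a matched-filter (cross-correlation) approach, and the names `estimate_distance` and `SPEED_OF_SOUND` are hypothetical.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def estimate_distance(emitted: np.ndarray, received: np.ndarray,
                      sample_rate: float) -> float:
    """Estimate object distance from the round-trip delay of an echo.

    Cross-correlates the received signal with the emitted pulse; the lag
    of the correlation peak gives the round-trip travel time in samples.
    """
    correlation = np.correlate(received, emitted, mode="full")
    lag = np.argmax(correlation) - (len(emitted) - 1)  # delay in samples
    round_trip_time = lag / sample_rate
    return SPEED_OF_SOUND * round_trip_time / 2.0  # halve: out and back

# Simulate a 40 kHz ultrasonic pulse echoing off a hand ~10 cm away.
fs = 192_000  # sample rate in Hz
t = np.arange(0, 0.001, 1 / fs)
pulse = np.sin(2 * np.pi * 40_000 * t)
delay_samples = int((2 * 0.10 / SPEED_OF_SOUND) * fs)  # 10 cm round trip
received = np.concatenate([np.zeros(delay_samples), pulse, np.zeros(100)])
print(f"Estimated distance: {estimate_distance(pulse, received, fs):.3f} m")
```

A real device would repeat this per frame across multiple transducers, turning the per-transducer delays into the 3D position data discussed next.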
These signals undergo advanced processing to analyze the time delay and changes in frequency and amplitude, building a 3D map of the object's position and movement. An AI engine trained to recognize specific gestures, such as swipes, waves, or hovers, then translates this data into corresponding commands, allowing users to control their devices without physical contact. For example, users can turn on a screen, adjust the volume, or navigate menus with simple hand movements.
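To make the gesture-interpretation step concrete, here is a deliberately simple classifier over a window of per-frame distance estimates. It stands in for the trained AI engine the text describes; the thresholds, gesture labels, and the `classify_gesture` function are all illustrative assumptions, not Elliptic Labs' model.

```python
import numpy as np

def classify_gesture(distances: np.ndarray, sample_period: float = 0.05) -> str:
    """Toy classifier over a short window of distance estimates (meters).

    A production system would feed richer 3D features to a trained model;
    here we threshold simple motion statistics to separate three coarse
    gestures (thresholds are illustrative).
    """
    velocity = np.gradient(distances, sample_period)  # m/s between frames
    mean_speed = np.mean(np.abs(velocity))
    if mean_speed < 0.05:          # hand nearly static in front of device
        return "hover"
    if np.all(velocity < 0):       # distance shrinking every frame
        return "approach"
    return "wave"                  # distance oscillating back and forth

print(classify_gesture(np.full(10, 0.12)))                 # static hand
print(classify_gesture(np.linspace(0.30, 0.10, 10)))       # hand moving in
print(classify_gesture(0.2 + 0.05 * np.sin(np.linspace(0, 4 * np.pi, 10))))
```

Mapping each recognized label to a device command (wake the screen on "approach", mute on "hover", and so on) is then an ordinary event-handling problem.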