Gesture recognition technology shrinks to micro size
The use of augmented reality (AR) applications and wearable electronics is steadily increasing in industry. For instance, smart glasses can show an employee real-time instructions on how to assemble a device or help find parts that need service. Smart textiles based on sensor technology, such as smart gloves, can convert physical movements into virtual equivalents and quite literally guide the employee by the hand.
The smart glove must be able to accurately detect hand and finger movements and grip force. This is often done using deep neural networks, machine learning methods that mimic the function of the human brain and traditionally require a lot of computational power. Researchers at Aalto University have collaborated with HitSeed, a company that specialises in intelligent sensor technology, to develop gesture recognition that can run on even fingertip-sized microcontrollers.
‘Usually, sensor data collected by gloves needs to be sent over a network to a computer that processes it and sends the information back. The deep learning-based gesture recognition algorithms we have developed are so lightweight that they can do the same locally in an embedded system like smart gloves,’ says Yu Xiao, a researcher at Aalto University who is the leader of a research group that specialises in wearable systems development.
This means that the devices can be used anywhere, without the need for an internet connection or an external computer. The information can be transferred between the smart gloves and AR glasses over a Bluetooth Low Energy connection.
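As a rough illustration of how a companion device, such as the AR glasses or a test laptop, might receive recognised gestures from the gloves over Bluetooth Low Energy, the sketch below uses the Python bleak library; the device address, characteristic UUID and one-byte data format are hypothetical assumptions, not details of the actual system.

```python
# A minimal sketch, assuming the gloves expose a GATT characteristic that
# pushes one byte per recognised gesture. Address and UUID are hypothetical.
import asyncio
from bleak import BleakClient

GLOVE_ADDRESS = "AA:BB:CC:DD:EE:FF"                           # hypothetical
GESTURE_CHAR_UUID = "0000feed-0000-1000-8000-00805f9b34fb"    # hypothetical

def on_gesture(_sender, data: bytearray):
    # In this sketch, the first byte identifies the recognised gesture.
    print(f"Gesture id: {data[0]}")

async def main():
    async with BleakClient(GLOVE_ADDRESS) as client:
        # Subscribe to gesture notifications and listen for 30 seconds.
        await client.start_notify(GESTURE_CHAR_UUID, on_gesture)
        await asyncio.sleep(30.0)
        await client.stop_notify(GESTURE_CHAR_UUID)

asyncio.run(main())
```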
The technology could be used in a variety of embedded systems for sensor data in the future.
‘We can apply the developed technology to several types of measurement, such as keeping separate counts of multiple gestures, measuring improvements in motion during physiotherapy, or detecting the state of multiple running machines based on their vibration or sound spectrum,’ says HitSeed CTO Pertti Kasanen.
‘Smart sensors and augmented reality and virtual reality applications have endless opportunities in industry, healthcare and education,’ Xiao says.
The researchers used HitSeed’s fingertip-sized Sensor Computer, which supports Google’s TensorFlow Lite software library, to run convolutional neural networks (CNNs) on smart gloves. A CNN is a specialised type of neural network, often used for image classification, made up of neurons with learnable weights and biases. As a next step, the system will be extended to support local execution of long short-term memory (LSTM) networks, which are commonly used for processing entire sequences of data such as speech and video.
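To illustrate the general approach, a small 1D convolutional network for classifying fixed-length windows of glove sensor data could be defined and converted to TensorFlow Lite roughly as sketched below; the input dimensions, layer sizes, training data and file names are illustrative assumptions and not the researchers’ actual model.

```python
# A minimal sketch, assuming glove data arrives as windows of multi-channel
# sensor readings (here 50 time steps x 10 channels) and 8 gesture classes.
import numpy as np
import tensorflow as tf

NUM_TIMESTEPS, NUM_CHANNELS, NUM_GESTURES = 50, 10, 8   # assumed dimensions

# A deliberately tiny 1D CNN so the converted model fits the memory budget
# of a microcontroller-class device.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_TIMESTEPS, NUM_CHANNELS)),
    tf.keras.layers.Conv1D(16, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Train on labelled gesture windows (random placeholder data shown here).
x_train = np.random.rand(256, NUM_TIMESTEPS, NUM_CHANNELS).astype("float32")
y_train = np.random.randint(0, NUM_GESTURES, size=256)
model.fit(x_train, y_train, epochs=2, verbose=0)

# Convert to TensorFlow Lite so the model can run on the embedded device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # weight quantisation
tflite_model = converter.convert()
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)
```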
The research project received €100,000 seed funding from the European Union's Horizon 2020 ATTRACT project, which supports collaboration between research institutes and companies to develop technologies that change society. Next, researchers will seek partners for research aimed at commercialising the technology.
Contact
Professor Yu Xiao
Department of Communications and Networking
yu.xiao@aalto.fi
Pertti Kasanen
Partner, CTO/HitSeed
pertti.kasanen@hitseed.com