
Apple is relying on EMG and AI for new gesture control

by Milan
March 11, 2026
in News
Image: Shutterstock / NicoElNino

Apple is continuously working on new technologies in the field of artificial intelligence and human-computer interaction. In a recent research project, the company demonstrates how an AI model can recognize hand gestures that were not even included in the original training dataset. The goal of this research is to enable more precise and flexible control of wearable devices via muscle movements in the future.

The study, titled "EMBridge: Enhancing Gesture Generalization from EMG Signals through Cross-Modal Representation Learning," was published on Apple's Machine Learning Research blog. It will be presented at the ICLR conference in April 2026. The study focuses on a new framework called EMBridge, which combines EMG muscle signals with hand position data to recognize gestures more reliably.

The results show that Apple has developed a system that enables so-called zero-shot gesture recognition. This allows the AI to identify hand movements it has never seen before.

An important component of the study is the EMG (electromyography) technology. EMG measures the electrical activity that muscles generate during contraction. These electrical signals occur with every muscle movement, such as bending the fingers or making a fist.

EMG has been used in various fields for years. These include, among others:

  • medical diagnostics
  • Physiotherapy
  • Prosthetic control
  • Research on wearables and AR/VR systems

EMG is becoming increasingly important, especially in the field of wearable devices. One example is the Ray-Ban Display glasses from Meta, which use a so-called Neural Band. This device is worn on the wrist and interprets muscle signals to control the glasses' functions.

Apple's study also explores how such muscle signals can be used to control digital systems. The focus is particularly on improving the ability of AI models to recognize new gestures.

EMBridge: Apple's framework for gesture recognition

The central element of the study is EMBridge, a framework for cross-modal representation learning. The researchers developed this system to bridge the so-called modality gap between EMG signals and hand poses.

While EMG signals contain information about muscle activity, they are difficult to directly correlate with specific hand movements. Hand position data, on the other hand, describes the precise position of the fingers and joints. EMBridge combines both sources of information.

The model is trained in several steps. First, separate pre-training is performed with:

  • EMG signals
  • structured hand position data

The two representations are then aligned. In this process, the EMG encoder learns from the pose encoder, allowing it to interpret muscle signals more accurately.

In this way, the system can recognize patterns that arise from muscle signals when certain hand movements are performed.
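The article does not reproduce Apple's actual alignment objective, but cross-modal alignment of this kind is commonly implemented as a contrastive (InfoNCE-style) loss, where matching EMG/pose pairs are pulled together in a shared embedding space and mismatched pairs from the same batch are pushed apart. A minimal sketch, with all function and variable names our own:

```python
import numpy as np

def info_nce(emg_emb, pose_emb, temperature=0.1):
    """Contrastive alignment loss: each EMG embedding should be most
    similar to its own pose embedding, not to the other poses in the batch."""
    # L2-normalize both sets of embeddings
    e = emg_emb / np.linalg.norm(emg_emb, axis=1, keepdims=True)
    p = pose_emb / np.linalg.norm(pose_emb, axis=1, keepdims=True)
    logits = e @ p.T / temperature  # (batch, batch) similarity matrix
    # cross-entropy with the diagonal (true EMG/pose pairs) as targets
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

With correctly paired embeddings the diagonal dominates and the loss is near zero; shuffling the pose batch breaks the pairing and the loss rises, which is what drives the EMG encoder toward the pose encoder's representation.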

The data sets for training

For the development and evaluation of EMBridge, the researchers used two large datasets: emg2pose and several datasets from the NinaPro project.

emg2pose

The emg2pose dataset is a comprehensive open-source dataset containing approximately 370 hours of sEMG signals and synchronized hand position data. It includes:

  • Data from 193 voluntary users
  • 29 different behavioral groups
  • a large number of discrete and continuous hand movements

Examples of the included gestures are:

  • clenching a fist
  • counting to five with one's fingers

The hand position labels were generated using a high-resolution motion capture system. The dataset contains over 80 million pose labels, making it comparable in size to the largest datasets in the field of computer vision.

Each participant completed four recording sessions per gesture category. In each session, the EMG band was attached in a different position.

The sessions lasted between 45 and 120 seconds. During this time, users repeatedly performed a mixture of three to five similar gestures or free hand movements.

For training, the researchers used non-overlapping two-second windows as input sequences. Additionally, the EMG signals were preprocessed:

  • instance normalized
  • bandpass filtered in the range 2–250 Hz
  • cleaned with a 60 Hz notch filter
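These preprocessing steps can be reproduced with standard signal-processing tools. A sketch assuming a 2 kHz sampling rate (the rate reported for NinaPro DB2; the emg2pose rate is not stated here) and SciPy's filtering routines; function names are our own:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, iirnotch, tf2sos

FS = 2000  # assumed sampling rate in Hz

def preprocess_emg(emg, fs=FS):
    """Apply the three steps above to a (samples, channels) EMG array."""
    # 1. Instance normalization: zero mean, unit variance per channel
    emg = (emg - emg.mean(axis=0)) / (emg.std(axis=0) + 1e-8)
    # 2. Band-pass filter, 2-250 Hz (4th-order Butterworth, zero-phase)
    sos = butter(4, [2, 250], btype="bandpass", fs=fs, output="sos")
    emg = sosfiltfilt(sos, emg, axis=0)
    # 3. Notch filter at 60 Hz to suppress mains interference
    b, a = iirnotch(60, Q=30, fs=fs)
    emg = sosfiltfilt(tf2sos(b, a), emg, axis=0)
    return emg

def window(emg, fs=FS, seconds=2):
    """Split into non-overlapping two-second windows."""
    step = fs * seconds
    n = emg.shape[0] // step
    return emg[: n * step].reshape(n, step, emg.shape[1])
```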

NinaPro datasets

In addition to emg2pose, two datasets from the NinaPro project were used. The NinaPro DB2 dataset served as pre-training for the system. It contains paired EMG and pose data from 40 subjects. The dataset includes 49 different hand gestures, among them:

  • basic finger flexions
  • functional grasping movements
  • combined movements

EMG signals were recorded using 12 electrodes on the forearm. The sampling rate was 2 kHz. Simultaneously, a data glove recorded hand kinematics.

For the subsequent gesture classification, the researchers used NinaPro DB7. This dataset contains data from 20 non-amputee subjects who were recorded with the same EMG device and the same set of gestures as in the DB2 dataset.

Training methods of the model

After pre-training, the two representations were aligned. This allowed the EMG encoder to learn which hand poses correspond to specific muscle signals.

Another training step was masked pose reconstruction. In this step, the researchers masked parts of the hand position data. The model then had to reconstruct this missing information solely based on the EMG signals.

This method forces the system to learn deeper connections between muscle activity and hand movements.
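The study's exact masking scheme is not detailed in the article. In general, masked reconstruction training hides random joints from the pose data and scores the model only on the hidden entries, so the missing values must be inferred from the EMG signal. A hypothetical sketch with our own function names:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def mask_poses(pose_batch, mask_ratio=0.5):
    """Hide a random subset of joint values (True in the mask = hidden)."""
    mask = rng.random(pose_batch.shape) < mask_ratio
    visible = np.where(mask, 0.0, pose_batch)  # hidden joints zeroed out
    return visible, mask

def reconstruction_loss(pred_pose, true_pose, mask):
    """Score the model only on the joints it could not see,
    forcing it to infer them from muscle activity."""
    return ((pred_pose - true_pose) ** 2)[mask].mean()
```

A model that reproduces the true pose exactly on the hidden joints gets zero loss; any deviation on those joints is penalized, while the visible joints contribute nothing.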

Additionally, a problem that frequently occurs in training processes was taken into account: similar gestures are often mistakenly treated as completely different examples.

To mitigate this problem, the model learned to recognize when hand poses represent similar configurations. In such cases, the system generated soft target values instead of strictly separating the gestures.

This resulted in a better structured representation space of the model and improved the ability to generalize to previously unknown gestures.
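One common way to generate such soft targets, which may differ from Apple's exact method, is to weight every gesture class by how close the current hand pose is to that class's prototype pose, instead of assigning a hard one-hot label. A sketch under that assumption:

```python
import numpy as np

def soft_targets(pose, prototypes, temperature=0.5):
    """Turn pose similarity into a soft label distribution:
    nearby gesture prototypes get more probability mass."""
    # negative squared distance to each class prototype pose
    d2 = ((prototypes - pose) ** 2).sum(axis=1)
    logits = -d2 / temperature
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum()
```

Two nearly identical poses produce nearly identical target distributions, so the model is no longer punished for treating similar gestures as related.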

Study results

The researchers evaluated EMBridge using the emg2pose and NinaPro datasets. The results show that the system consistently outperforms existing methods, especially in zero-shot gesture recognition. This means that the model can identify gestures that were never seen during training.

It is also noteworthy that these results were achieved with only 40 percent of the training data.
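The article does not describe the inference procedure, but zero-shot recognition in a shared embedding space typically reduces to nearest-neighbor matching: the EMG embedding is compared against embeddings of candidate gestures (including ones never seen in training), and the closest one wins. A hypothetical sketch:

```python
import numpy as np

def zero_shot_classify(emg_emb, class_embs):
    """Assign a gesture by highest cosine similarity between the EMG
    embedding and the candidate class embeddings."""
    e = emg_emb / np.linalg.norm(emg_emb)
    c = class_embs / np.linalg.norm(class_embs, axis=1, keepdims=True)
    return int(np.argmax(c @ e))
```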

Possible applications for Apple devices

Although the study does not mention any specific products, potential applications can be deduced relatively easily. One conceivable practical application is wearable human-computer interaction. In scenarios such as VR and AR systems, or in the control of prostheses, a device worn on the wrist must continuously derive hand gestures from EMG signals.

Such technology could be used to control devices such as:

  • smartwatches
  • AR or VR headsets
  • smartphones
  • computers
  • other wearables
  • possible future smart glasses

Controlling a virtual avatar or a robotic hand could also be done via muscle signals. Furthermore, the technology could create new possibilities for accessibility and alternative input methods.

Limitations of research

The study also mentions an important limitation. The model is based on datasets that contain both EMG signals and synchronized hand position data. Such datasets are comparatively difficult to collect and require specialized measurement systems.

Therefore, training such a model still depends on high-quality and elaborately generated datasets.

Apple is further advancing gesture control with EMG and AI

The new study shows how Apple combines AI technologies with EMG sensors to improve gesture recognition on wearable devices. Using the EMBridge framework, researchers succeeded in developing a system that recognizes hand gestures even when they were not included in the original training dataset.

The ability to recognize gestures zero-shot represents a significant advancement. It could enable more flexible and natural interaction with digital devices in the future.

Although the study doesn't mention specific products, the research suggests that Apple is working intensively on new forms of human-computer interaction. EMG-based control could play a central role in wearables, AR and VR systems, and other digital devices in the future.
