Webinar: Exploring EMG-Driven Engineering Solutions

July 16 @ 10:00 am - 12:00 pm EDT

Exploring EMG-Driven Engineering Solutions with the Delsys API

Engineering research through the lens of electromyography (EMG) has significantly advanced the robustness and reliability of systems that interface with the human body. In fields like human-machine interaction, rehabilitation robotics, and extended reality (XR), high-quality EMG signals have become an established input for gesture recognition, precise control, posture analysis, and muscle fatigue assessment.

The Delsys API plays a critical role in this space by enabling seamless EMG streaming into Python, C#, and Unity environments. This functionality empowers researchers and developers to build real-time, low-latency, and customizable interfaces for a wide range of engineering applications—spanning academia, OEM development, and industry innovation.
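
To sketch what a downstream consumer of such a stream might look like, the short Python example below computes a rolling RMS amplitude over incoming EMG samples. It is a minimal illustration, not Delsys API code: read_emg_sample, the sampling rate, and the window length are hypothetical placeholders for whatever the streaming backend actually provides.

```python
import collections
import math
import random

# Minimal sketch of a real-time consumer for one streamed EMG channel.
# The Delsys API itself is not shown: read_emg_sample() is a hypothetical
# stand-in for whatever callback or queue the streaming backend exposes,
# simulated here with Gaussian noise.

FS = 2000                 # assumed sampling rate (Hz)
WINDOW = FS // 10         # 100 ms rolling window

def read_emg_sample():
    """Placeholder for one streamed EMG sample (simulated noise)."""
    return random.gauss(0.0, 0.05)

def rolling_rms(samples):
    """Root-mean-square amplitude of the current window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

buffer = collections.deque(maxlen=WINDOW)
for _ in range(5 * WINDOW):              # consume a few windows' worth
    buffer.append(read_emg_sample())
    if len(buffer) == WINDOW:
        amplitude = rolling_rms(buffer)
        # A real application would feed `amplitude` to a gesture
        # classifier, robot controller, or XR interface here.
```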

This webinar will feature use cases of the Delsys API in virtual reality, rehabilitation robotics, musculoskeletal modelling, and myoelectric control to demonstrate how EMG-driven projects and open-source toolkits are enabling unique integration strategies and real-world implementations. These shared resources aim to accelerate future research by removing common developmental barriers and fostering a more collaborative scientific community.

Program

Joanna Koshy, Application Engineer, Delsys

This webinar focuses on making EMG more accessible by lowering the technical barriers that often limit its use. Aimed at researchers in both industry and academia, it highlights practical tools and approaches to advance movement-based engineering and research applications.

Dr. Erik Scheme, Evan Campbell, & Ethan Eddy, Institute of Biomedical Engineering, University of New Brunswick, Canada 

This talk will provide an overview of myoelectric control and how it has evolved in recent years. Drawing on its historical roots in prosthetics, we will describe recent work on improving the robustness of continuous myoelectric control and its generalizability to unseen environments. We will then explore its potential as an input modality for human-computer interaction and the design implications of transitioning to emerging consumer applications. Finally, we will discuss the need for broader collaboration, transparency, and reproducibility in myoelectric control research and introduce LibEMG, an open-source framework for accelerating the development and translation of myoelectric control.
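
As a rough sketch of the pattern-recognition pipeline underlying this kind of myoelectric control (and that toolkits such as LibEMG package up), the example below windows a multichannel recording, extracts classic time-domain features, and trains a linear discriminant classifier. It uses simulated data and generic NumPy/scikit-learn code; it is not LibEMG's actual API, whose documentation should be consulted for the real interface.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Generic myoelectric-control sketch: windowing -> time-domain features ->
# classifier. Window/increment lengths and the simulated data are assumptions.

FS = 2000                  # assumed sampling rate (Hz)
WIN, INC = 400, 100        # 200 ms windows, 50 ms increments

def window(emg, win=WIN, inc=INC):
    """Slice a (samples, channels) recording into overlapping windows."""
    starts = range(0, emg.shape[0] - win + 1, inc)
    return np.stack([emg[s:s + win] for s in starts])

def features(windows):
    """Classic per-channel features: mean absolute value and waveform length."""
    mav = np.abs(windows).mean(axis=1)
    wl = np.abs(np.diff(windows, axis=1)).sum(axis=1)
    return np.hstack([mav, wl])

# Simulated 2-class, 4-channel training data stands in for real recordings.
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 0.05, (FS * 2, 4))
flex = rng.normal(0.0, 0.20, (FS * 2, 4))
X = np.vstack([features(window(rest)), features(window(flex))])
y = np.array([0] * len(window(rest)) + [1] * len(window(flex)))

clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict(features(window(flex[:WIN * 2]))))   # labels for new windows
```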

Dr. Sean Banerjee, Dr. Natasha Banerjee, & Dr. Ashutosh Shivakumar, Terascale All-Sensing Research Studio (TARS), Wright State University, USA 

Virtual Reality (VR) is a catalytic force, propelling the development and deployment of immersive, scalable, and customizable virtual simulations to understand and assess team performance and behavior in human-human and human-machine teaming scenarios. By incorporating relevant sensors in VR-based simulations, precise, user-generated multimodal data can be extracted, processed, and represented, producing insightful performance metrics that quantify and enhance collaborative performance.

This talk showcases the Terascale All-Sensing Research Studio (TARS)'s current research in “VR Augmented Human Performance Enhancement” at Wright State University in Dayton, Ohio. It discusses TARS's projects in which the state of the art in virtual reality is used to inform and enhance collaborative performance in human-human and human-machine teaming scenarios.
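
As a loose illustration of turning multimodal VR streams into a performance metric (not a TARS implementation), the sketch below resamples two timestamped signals, such as an EMG envelope and a controller speed trace, onto a common clock and scores how tightly they co-vary; all signals and names here are simulated assumptions.

```python
import numpy as np

# Illustrative only: align two irregularly sampled streams from a VR session
# on a shared timeline, then compute a simple co-variation score.

rng = np.random.default_rng(3)
t_emg = np.sort(rng.uniform(0, 10, 2000))        # irregular EMG timestamps (s)
t_ctrl = np.sort(rng.uniform(0, 10, 900))        # irregular controller timestamps (s)
emg_env = np.abs(np.sin(t_emg)) + rng.normal(0, 0.05, t_emg.size)
ctrl_speed = np.abs(np.sin(t_ctrl)) + rng.normal(0, 0.05, t_ctrl.size)

common_t = np.arange(0, 10, 0.01)                # shared 100 Hz timeline
emg_r = np.interp(common_t, t_emg, emg_env)      # resample both streams
ctrl_r = np.interp(common_t, t_ctrl, ctrl_speed)

coupling = np.corrcoef(emg_r, ctrl_r)[0, 1]      # simple muscle-movement coupling score
print(f"muscle-movement coupling: {coupling:.2f}")
```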

Mouhamed Zorkot, Rehabilitation and Assistive Robotics Group, EPFL, Switzerland 

In this talk, Mouhamed will present neuromodulation strategies for lower limb rehabilitation using EMG-driven approaches. The session will highlight how muscle activity can be acquired and processed to build control strategies that enhance transcutaneous spinal cord stimulation and functional electrical stimulation for gait rehabilitation in individuals with neurological disorders. He will also present the technical implementation and discuss the potential of this approach to support personalized therapies in both clinical and research settings.
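
One common form of EMG-driven stimulation control, sketched below under illustrative assumptions (filter settings, a baseline-relative threshold, simulated data), is to rectify and low-pass filter the signal into an envelope and gate stimulation on when that envelope crosses a threshold. The specific strategies used in this work may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hedged sketch of threshold-based, EMG-triggered stimulation gating.
FS = 1000                                    # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1 / FS)
emg = np.random.default_rng(1).normal(0, 0.05, t.size)
emg[FS:] += np.random.default_rng(2).normal(0, 0.3, t.size - FS)  # simulated muscle burst

def envelope(signal, fs=FS, cutoff=5.0):
    """Linear envelope: full-wave rectification + 5 Hz low-pass filter."""
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, np.abs(signal))

env = envelope(emg)
threshold = 3 * env[: FS // 2].mean()        # baseline-relative threshold (assumption)
stimulate = env > threshold                  # True where stimulation would be gated on
print(f"stimulation active for {stimulate.mean():.0%} of the trial")
```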

Dr. Vittorio Caggiano, MyoLab.AI, USA 

Understanding and replicating human motor control is a grand challenge spanning neuroscience, robotics, and artificial intelligence. Human movement emerges from the complex interplay of muscles, tendons, neural signals, and dynamic interactions with the environment. While experimentally capturing the full biological and physical complexity of this system is nearly impossible, computational modelling offers a scalable path to a holistic understanding. Yet most existing simulation tools are tailored to specific domains and do not scale, limiting their ability to capture the full complexity of human-environment behavioral interactions.

To bridge this gap, we developed MyoSuite, an open-source multidisciplinary ecosystem for high-performance, physiologically realistic simulation of human musculoskeletal and sensory systems, enabling agents to interact with complex, physically grounded environments. MyoSuite combines biomechanical fidelity with the scalability and speed demanded by modern AI research, making it a powerful platform for advancing embodied intelligence. Originally released with a set of physiologically plausible upper-limb and lower-limb models and control tasks for manipulation and locomotion, MyoSuite has since evolved toward more complex, full-body control and tasks. It now anchors a broader ecosystem encompassing multiple initiatives: MyoAssist, which explores human-AI co-adaptation and the design of exoskeletal support systems; MyoSuite-MJX, a hardware-accelerated backend built on MuJoCo MJX that improves scalability and enables faster training; and MyoUser, aimed at studying ergonomics, fatigue, and human-centered design in simulated work and everyday environments.
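
For orientation, the snippet below sketches MyoSuite's gym-style workflow: import the package to register its environments, make one, and drive it with random muscle activations. The environment id and import path are assumptions that vary across releases, so the MyoSuite documentation is the authority here.

```python
# Rough usage sketch, not a verified script for any particular MyoSuite release.
import gym
import myosuite  # noqa: F401  (importing registers the Myo* environments;
                 # newer releases may expose gym via myosuite.utils instead)

env = gym.make("myoElbowPose1D6MRandom-v0")   # assumed elbow-pose task id
env.reset()
for _ in range(100):
    action = env.action_space.sample()        # random muscle activations
    env.step(action)                          # a trained RL policy would go here
env.close()
```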

In this talk, I will also present the 4th edition of the NeurIPS MyoChallenge: an annual community-driven initiative designed to benchmark progress in motor control and reinforcement learning through standardized, reproducible tasks. These challenges promote collaboration across disciplines and provide a shared testbed for evaluating control, adaptation, and learning in physiologically realistic human embodiments. 

Through the open release of the platform, benchmarks, and tools, MyoSuite aims to democratize access to high-fidelity embodied simulations and to accelerate discoveries in motor learning, ergonomics, assistive robotics, and digital physiological twins. Together, these efforts set the stage for a new generation of AI systems that move, adapt, and collaborate with the dexterity and nuance of the human body.

The webinar will conclude with a live Q&A panel featuring all presenters, offering attendees the opportunity to ask questions and dive deeper into EMG tools, methods, and applications.
