Animae Ex Machina

About Us

We are a research group at the University of Washington, headed by Eric Rombokas, collaborating with the AMP Lab and the Departments of Mechanical Engineering and Electrical and Computer Engineering.

What we do

We develop hardware, collect data, and use machine learning to improve human-computer interaction. We measure how people move their bodies and their hands, find regularities that render that data easier to work with, and create bidirectional interfaces between the human and the machine.

We study how humans and other animals sense their bodies and the environment to control movement. We use this understanding to improve mobility for people with lower limb amputation – walking, negotiating stairs, running, jumping. We are building smarter controllers and sensory feedback for prosthetic limbs. We are performing basic research in neural control of movement, how the brain creates a sense of ownership of the body, and more!

We study soft robots and wearables: how to build them, make them move, and sense the state of their bodies and the things they interact with. We focus on data-driven sensing, combined sensor-actuators, and exploiting multimodal phenomena for state estimation.

You can find published papers on Eric's Google Scholar page, or browse the research on this site for a more anarchic view into what we're currently exploring. We also occasionally make videos about projects and post them on Eric's Twitter.

University of Washington AMPlify lab

The AMP Lab is a collaboration at the University of Washington between the College of Engineering and Rehabilitation Medicine that seeks to amplify human and robotic movement and performance. It works to advance our understanding of the dynamics and control of movement in order to design treatment strategies and assistive technologies that improve function, performance, and quality of life for people in health and disease.



Limb Simulator

The Limb Simulator aims to reduce phantom limb pain for those who have lost a limb. Most phantom limb pain treatments are palliative in nature – they target individual factors of pain without addressing the underlying cause. By simulating the experience of owning and using an intact limb, the Limb Simulator hopes to engage and remodel areas of the brain associated with the lost limb to impact both short- and long-term outcomes of phantom pain.

Bishop's Hand: Virtual Hand Embodiment

Why does your hand feel like a part of your body, but your screwdriver does not? We are investigating the mechanisms the brain uses to create a sense of ownership over the body. One way researchers study this is to create the illusion of body ownership over "false" limbs, like fake hands or prosthetic limbs. There are only a few ways to measure whether a person is experiencing this illusion, such as questionnaires or reactions to threats to the limb. We are working toward a quantitative understanding of how this "rubber hand illusion" is affected by sensory cues, movement, and expectation. In the laboratory, we can control the visual, tactile, and movement cues that people experience and determine how they contribute to feelings of body ownership.

Coordinated Movement Prosthesis Control

Could a prosthesis learn from examples to adapt to any real-world scenario, much like humans do? Such a self-driving prosthesis would be a radical shift from the current state of affairs, where assistive devices operate under strict "modes" of operation with terrain-specific movement profiles.


GazeToGrasp is a project that strives to make grasping objects more accessible to upper-limb prosthesis users. Many users of wrist-locked upper-limb prostheses rely on compensatory strategies to make up for the limb's restricted range of motion, and experience additional cognitive load as a result. GazeToGrasp uses deep learning to create a predictive control strategy for a virtual upper-limb prosthesis, with the goal of assisting prosthesis users in performing grasping tasks.

Attaching wearable devices to the body

Take a moment to be mindful of all the objects attached to your body - your clothing, your shoes, your watch, and possibly a pair of glasses resting on your nose. While these passive wearable objects are optimally designed to camouflage interaction forces between the human body and the device, the same degree of comfort and function is not available for active wearable devices, such as powered prostheses, exoskeletons, and haptic devices, which apply external forces and moments to the human body. This project seeks to develop the fundamental scientific principles behind the design of engineered physical interfaces between humans and machines.


Haplets are wireless, finger-worn haptic devices built around linear resonant actuators (LRAs). They were created to answer the question: "What is the minimum viable haptics that we can add to the fingers to augment hand tracking in AR?" Our paper covers the hardware design, validation, and a user study in which Haplets were used in conjunction with a pen. Our results suggest that users draw more accurately with Haplets.

Machine Learning for Control

Almost every application we work on uses machine learning techniques. We focus on data-driven discovery of representations that help us understand sensor data and control complex devices.
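As one illustration of what "data-driven discovery of representations" can mean, here is a minimal, hypothetical sketch (not taken from any of our published pipelines) that uses principal component analysis to compress correlated multi-channel sensor data down to a few underlying degrees of freedom:

```python
import numpy as np

def principal_components(X, k):
    """Reduce N samples x D sensor channels to k components via PCA (SVD)."""
    Xc = X - X.mean(axis=0)                 # center each channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                    # project onto top-k directions

# Toy example: 200 samples of 6 "sensor" channels driven by 2 latent factors.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))          # 2 true degrees of freedom
mixing = rng.normal(size=(2, 6))            # how latents appear in the sensors
X = latent @ mixing + 0.01 * rng.normal(size=(200, 6))
Z = principal_components(X, 2)
print(Z.shape)                              # a compact 2-D representation
```

The same idea, with linear and nonlinear variants, underlies work like our dimensionality-reduction studies of gait and hand kinematics listed below.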

Muscle Activation, EMG, and Gestures

New gesture-sensing systems use sophisticated cameras and data processing to achieve in-air interaction with computer systems and interaction with virtual and augmented reality. A key feature of human movement, however, is invisible to these cameras: the activation of the muscles. Electromyography (EMG) can sense the activation of muscles, but it is difficult to infer pose and movement from EMG alone. These two complementary technologies can be combined to improve human-machine interaction.
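A minimal sketch of that fusion idea, with synthetic stand-ins for camera-derived joint angles and EMG envelopes (the features, class structure, and nearest-centroid classifier here are illustrative assumptions, not our published pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def make_gesture(label, n=50):
    """Synthetic stand-ins: 5 camera joint angles + 4 EMG envelope channels."""
    pose = rng.normal(loc=label, scale=0.5, size=(n, 5))       # visible kinematics
    emg = rng.normal(loc=2 * label, scale=0.5, size=(n, 4))    # muscle activation
    return np.hstack([pose, emg])                              # early fusion: concatenate

# Build per-gesture centroids from "training" data for three gesture classes.
train = {g: make_gesture(g) for g in (0, 1, 2)}
centroids = {g: X.mean(axis=0) for g, X in train.items()}

def classify(x):
    """Nearest-centroid over the fused pose+EMG feature vector."""
    return min(centroids, key=lambda g: np.linalg.norm(x - centroids[g]))

sample = make_gesture(2, n=1)[0]
print(classify(sample))
```

The point of the sketch is the concatenation step: a classifier operating on the fused vector can exploit muscle activation that is invisible to the camera alone.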

Negshell Casting

Negshell casting is a fabrication method for making lightweight and strong soft robots. It eliminates intermediate core-removal steps and multi-step casting for silicone- or urethane-based soft robots, soft haptic actuators, and soft medical devices.

Sensory Feetback

With the loss of a limb comes not only a loss of function: the rich sensory experience we use to navigate the world, now lost, has no artificial equivalent. Without a way to feel out our surroundings, everyday tasks such as navigating the stairs become challenging and attention-consuming. Our goal is to create a prosthetic modification for patients who have undergone a cutting-edge surgical intervention called Targeted Reinnervation, enabling them to feel genuine sensation once more.

VisuoTactile Sensory Conflict

How do we as humans make sense of our world? This is a complex question that we are addressing by means of sensory conflict between vision and touch.


We utilize the actuating fluid itself as a sensing medium to achieve high-fidelity proprioception in a soft actuator. Because our sensors are somewhat unstructured, their readings are difficult to interpret with linear models. We therefore present a proof of concept for deriving the pose of the soft actuator using recurrent neural networks, toward the ultimate goal of closed-loop control of a highly complex soft robotic system.
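To make the recurrent-network idea concrete, here is a hypothetical sketch of the model structure in numpy: a tiny Elman-style RNN that consumes a sequence of fluid-pressure readings and emits a pose estimate. The layer sizes and weights are placeholders (in practice the weights are trained against ground-truth pose), so this shows only the shape of the computation, not our actual trained model:

```python
import numpy as np

class TinyRNN:
    """Elman-style RNN: pressure readings in, actuator pose estimate out.

    Weights here are random placeholders; in a real system they would be
    trained on pressure sequences paired with measured actuator poses.
    """
    def __init__(self, n_in=3, n_hidden=16, n_out=2, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(scale=0.3, size=(n_hidden, n_in))    # input weights
        self.Wh = rng.normal(scale=0.3, size=(n_hidden, n_hidden))  # recurrence
        self.Wo = rng.normal(scale=0.3, size=(n_out, n_hidden))   # readout

    def predict(self, seq):
        h = np.zeros(self.Wh.shape[0])
        for x in seq:                                   # one pressure sample per timestep
            h = np.tanh(self.Wx @ x + self.Wh @ h)      # recurrent state update
        return self.Wo @ h                              # pose estimate (e.g. bend, twist)

pressures = np.random.default_rng(1).normal(size=(100, 3))  # 100 timesteps, 3 channels
pose = TinyRNN().predict(pressures)
print(pose.shape)
```

The recurrence is what lets the model disambiguate unstructured pressure readings: the hidden state carries history, so the same instantaneous reading can map to different poses depending on how the actuator got there.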

Smart Step

Imagine yourself wearing a pair of ski boots. Imagine having to wear a medical boot because you tripped and sprained your ankle. Imagine wearing a prosthetic leg and having to walk down stairs. Smart Step is a smart wearable device that helps you descend stairs easily and intuitively.

Targeted Muscle Reinnervation Mapping

Targeted Reinnervation (TR) surgery is a groundbreaking intervention for amputees, enabling them to feel authentic sensations on their missing limb. Despite incredible innovation in creating prosthetic systems that take advantage of TR's sensory phenomena, there has been limited study characterizing the nature of the sensations and how they develop over time. We aim to explicitly outline the exact qualities of the stimulus at the surgery site that elicit particular sensations in the phantom limb, like pressure, itching, or scratching. This characterization will inform the design of future prostheses to more richly simulate the myriad feelings of the world with which we interact.


For the most up-to-date listing, please see Google Scholar.

Improving IMU-based prediction of lower limb kinematics in natural environments using egocentric optical flow, Abhishek Sharma, Eric Rombokas (2022)

Improving automatic control of upper-limb prosthesis wrists using gaze-centered eye tracking and deep learning, Maxim Karrenbach, David Boe, Astrini Sie, Rob Bennett, Eric Rombokas (2022)

Dimensionality Reduction of Human Gait for Prosthetic Control, David Boe, Alexandra A Portnova-Fahreeva, Abhishek Sharma, Vijeth Rai, Astrini Sie, Pornthep Preechayasomboon, Eric Rombokas (2021)

Descending 13 Real World Steps: A Dataset and Analysis of Stair Descent, Astrini Sie, Maxim Karrenbach, Charlie Fisher, Shawn Fisher, Nathaniel Wieck, Callysta Caraballo, Elisabeth Case, David Boe, Brittney Muir, Eric Rombokas (2021)

Haplets: Finger-Worn Wireless and Low-Encumbrance Vibrotactile Haptic Feedback for Virtual and Augmented Reality, Pornthep Preechayasomboon, Eric Rombokas (2021)

Comparison of texture-based classification and deep learning for plantar soft tissue histology segmentation, Lynda Brady, Yak-Nam Wang, Eric Rombokas, William R Ledoux (2021)

Sensuator: A Hybrid Sensor–Actuator Approach to Soft Robotic Proprioception Using Recurrent Neural Networks, Pornthep Preechayasomboon, Eric Rombokas (2021)

Virtual reality hand therapy: A new tool for nonopioid analgesia for acute procedural pain, hand rehabilitation, and VR embodiment therapy for phantom limb pain, Hunter G Hoffman, David A Boe, Eric Rombokas, Christelle Khadra, Sylvie LeMay, Walter J Meyer, Sam Patterson, Ann Ballesteros, Stephen W Pitt (2020)

Coordinated Movement for Prosthesis Reference Trajectory Generation: Temporal Factors and Attention, Vijeth Rai, Abhishek Sharma, Pornthep Preechayasomboon, Eric Rombokas (2020)

Negshell casting: 3D-printed structured and sacrificial cores for soft robot fabrication, Pornthep Preechayasomboon, Eric Rombokas (2020)

Linear and Non-linear Dimensionality-Reduction Techniques on Full Hand Kinematics, Alexandra A. Portnova-Fahreeva, Fabio Rizzoglio, Ilana Nisky, Maura Casadio, Ferdinando A. Mussa-Ivaldi, Eric Rombokas (2020)

Vibrotactile Feedback Improves Foot Placement Perception on Stairs for Lower-Limb Prosthesis Users, Nataliya Rokhmanova, Eric Rombokas (2019)

A Framework for Mode-Free Prosthetic Control for Unstructured Terrains, Vijeth Rai, Abhishek Sharma, Eric Rombokas (2019)

ConTact Sensors: A Tactile Sensor Readily Integrable into Soft Robots, Pornthep Preechayasomboon, Eric Rombokas (2019)

Mode-free Control of Prosthetic Lower Limbs, Vijeth Rai, Abhishek Sharma, Eric Rombokas (2018)

3D Printed lattice microstructures to mimic soft biological materials, L Johnson, C Richburg, M Lew, W Ledoux, P Aubin, E Rombokas (2018)

Sensitivity to Conflict Between Visual Touch and Tactile Touch, D Caballero, E Rombokas (2018)

A Lower Limb Prosthesis Haptic Feedback System for Stair Descent, A Sie, J Realmuto, E Rombokas (2017)

Sensory Feedback for Lower Extremity Prostheses Incorporating Targeted Muscle Reinnervation (TMR), E Rombokas (2016)

Gpu based path integral control with learned dynamics, G Williams, E Rombokas, T Daniel (2015)

A robotic model of inertial flight maneuvering in the hawkmoth, E Rombokas, L Scheuer, JP Dyhr, TL Daniel (2014)

Sensing from control: airframe deformation for simultaneous actuation and state estimation, BT Hinson, E Rombokas, JP Dyhr, TL Daniel, KA Morgansen (2013)

Vibrotactile sensory substitution for electromyographic control of object manipulation, E Rombokas, CE Stepp, C Chang, M Malhotra, Y Matsuoka (2013)

Reinforcement Learning and Synergistic Control of the ACT Hand, E Rombokas, M Malhotra, E Theodorou, E Todorov, Y Matsuoka (2012)

Comparison of remote pressure and vibrotactile feedback for prosthetic hand control, C Tejeiro, CE Stepp, M Malhotra, E Rombokas, Y Matsuoka (2012)

Tendon-driven control of biomechanical and robotic systems: A path integral reinforcement learning approach, E Rombokas, E Theodorou, M Malhotra, E Todorov, Y Matsuoka (2012)

Reduced dimensionality control for the ACT hand, M Malhotra, E Rombokas, E Theodorou, E Todorov, Y Matsuoka (2012)

Biologically inspired grasp planning using only orthogonal approach angles, E Rombokas, P Brook, JR Smith, Y Matsuoka (2012)

Tendon-Driven Variable Impedance Control Using Reinforcement Learning, M Malhotra, E Rombokas, E Theodorou, E Todorov, Y Matsuoka (2012)

Continuous vocalization control of a full-scale assistive robot, M Chung, E Rombokas, Q An, Y Matsuoka, J Bilmes (2012)

Task-specific dynamics for robotic hand control, E Rombokas, M Malhotra, Y Matsuoka (2012)

Task-specific demonstration and practiced synergies for writing with the ACT hand, Eric Rombokas, Mark Malhotra, Yoky Matsuoka (2011)


Eric Rombokas

Investigator Supreme

Alexandra (Sasha) Portnova

Postdoctoral Fellow

Abhishek Sharma

PhD Student
Mechanical Engineering

Maxim Karrenbach

Electrical Engineering


████ █████


Huiwen Guo

Mechanical Engineering

Luke Johnson

Mechanical Engineering

Lalit Palve

Mechanical Engineering

Nataliya Rokhmanova

Mechanical Engineering

Gaurav Mukherjee

Mechanical Engineering

Vijeth Rai

Electrical Engineering

Astrini Sie

Electrical Engineering

David Caballero

Electrical Engineering

David Boe

Prosthetologist & Neurobiologist

Jom Preechayasomboon

Mechanical Engineering


News section deprecated

The news section of this website has not turned out to be a useful platform for us, but I'm leaving it here as an archive. If you're interested in updates and news, consider Twitter: @erombokas, @prnthp, @gmukherg

All reality is virtual: how VR can transform the way we think about our bodies.

We presented virtually to an audience in AltspaceVR, hosted by Rotary International. Speaking in VR, to an audience using VR equipment, provides a unique opportunity to showcase our research toward improving physical, body-oriented virtual experiences.

Limb Simulator demo at AOPA National Assembly 2019

We will be presenting live demos at the American Orthotic and Prosthetic Association national assembly, September 25-28.

This is part of the VA Office of Research and Development Rehabilitation Roadshow. RR&D will be at Booth 1519 in the Exhibition Hall to present and demonstrate prosthetics-related devices developed by VA investigators. The objective is to increase ORD visibility, explore potential licensing opportunities, and solicit feedback that could help improve the devices or inspire new ones.

Featured device development:

- Louis Stokes VA, Cleveland: restoration of touch to upper- and lower-limb prosthetics; an upper-limb user will demonstrate the system
- Minneapolis VA: adaptable ankle, rehab ankle, 3D-printed variable heel height foot/ankle, socket fit sensor, Prosthetic Sock Management Tool
- Seattle VA: pivot flex foot, virtual reality phantom pain treatment

Cleveland researchers will also be presenting a symposium, "Prostheses that Feel: Clinical and Technical Considerations for Restoring Sensation to Upper and Lower Limb Amputees," on Sept. 28th from 1-5 PM. Contact me if you're interested in free admission to the expo hall on Saturday morning.

Limb Simulator in the news

Some recent media interest in our Limb Simulator for treating phantom limb pain: Seattle KIRO News story here and GeekWire story here.

3D Printed Microstructure Lattices in the news

Our recent work in using lattices to mimic biomechanical tissue properties is in the news. Physics World Article

Hamburger Helper Hackathon - Hungry Hamburger Helpers Wanted!

A group of Occupational Therapy fellows have been designing an assistive device for eating hamburgers. The Hamburger Helper is a device for keeping a hamburger together during eating. The initial idea and prototype was created by a bilateral upper limb amputee who was tired of hamburgers falling apart when he held them in his hook prostheses. Over the winter quarter, he, the OT fellows, and the experts at VA have developed some concepts and design specifications for advancing the idea.

If you have technical skills, come partner with some potential users of the device, and the clinicians, to realize and iterate on their designs using 3D printing and other fab techniques.

Thursday, March 29th, 2018, at 9:00am, meet at the AMP lab. The teams will work through the day, culminating with Hamburger Happy Hour at 4:00. Work with our faculty and professional engineers, learn about assistive devices, and get to use some fancy 3D printers!

RSVP to by Weds. March 28th, 11:59pm


There are more topics and projects than we have the capacity to explore. We are always seeking skilled, self-motivated, and indomitable people from a wide variety of disciplines. Good candidates for joining the lab are more than enthusiastic; they have a concrete set of skills to bring to bear on a particular problem or topic.

University of Washington

Seattle, WA 98195