MorphMind: Smart MorpheesPlug toolkit



about.
Shape-Changing Interfaces (SCIs) are redefining how we interact with technology, adding a physical, tactile layer to digital interfaces. But designing and testing these interfaces is still messy: full of manual tweaks, unreliable tracking, and slow iteration cycles. For my MSc dissertation, I set out to upgrade the existing MorpheesPlug toolkit into something smarter and more scalable.
I integrated Motion Capture (MoCap) and Machine Learning (ML) into the system to create a data-driven prototyping environment. My goal: eliminate guesswork, predict deformation in real time, and push SCI research toward real-world application. The project involved redesigning soft widgets, refining fabrication methods, building a custom MoCap setup, and training ML models that could learn how soft robotic forms behave before they even move.



problem.
Working with SCIs is unpredictable. The materials morph in nonlinear, often inconsistent ways, and there’s no easy way to track or predict those changes. Three core issues stood in the way:
Tracking Deformations
Traditional MoCap systems can’t reliably track soft, flexible materials. Markers fall off or get occluded, and embedded sensors restrict movement. Getting clean, accurate data from deformable TPU shapes was a major challenge.
Fabrication Constraints
Soft widgets made from flexible TPU would often leak or collapse under pressure. Thin prints lacked structural integrity, and the elasticity made marker placement a nightmare. Without airtight, stable structures, actuation couldn’t be tested reliably.
No Predictive Feedback
Without predictive models, every shape-change cycle required manual adjustment. There was no way to know in advance how much air pressure was needed or what the outcome would be. This made iteration slow and inconsistent.



solution.
To address these pain points, I approached the redesign in three parts:
Material & Widget Design
I modeled new versions of the Fold, Spiral, and Teeth widgets in Fusion 360, optimizing them for TPU printing. I then applied UV resin curing to seal the prints while maintaining flexibility, solving the leakage and durability issues.
Motion Capture Integration
I built a custom multi-camera MoCap setup using Qualisys Track Manager to capture high-resolution shape changes in real time. Through careful calibration and marker positioning, I was able to track even complex deformations across multiple angles (a streaming sketch follows this section).
Machine Learning Modelling
I trained ML models on the captured MoCap data to predict how each widget would deform based on input parameters like inflation time and pressure (see the modelling sketch below). Simpler forms like the spiral and fold showed high accuracy, while more complex widgets like the teeth revealed interesting nonlinear behaviors.
Together, these components formed an intelligent, feedback-driven toolkit for faster SCI prototyping.
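
For reference, here is a minimal sketch of how marker positions can be streamed from Qualisys Track Manager in real time, assuming the open-source Qualisys Python SDK (the qtm package). The host address is a placeholder, and the packet handler just prints each 3D marker position rather than feeding the real pipeline.

```python
import asyncio

import qtm  # Qualisys Python SDK: pip install qtm


def on_packet(packet):
    # Each packet carries one frame of tracked 3D marker positions (in mm).
    header, markers = packet.get_3d_markers()
    for i, marker in enumerate(markers):
        print(f"frame {packet.framenumber} marker {i}: "
              f"({marker.x:.1f}, {marker.y:.1f}, {marker.z:.1f})")


async def setup():
    # Connect to the machine running QTM (placeholder address).
    connection = await qtm.connect("127.0.0.1")
    if connection is None:
        raise RuntimeError("Could not connect to QTM")
    # Stream 3D marker frames and hand each one to on_packet.
    await connection.stream_frames(components=["3d"], on_packet=on_packet)


if __name__ == "__main__":
    asyncio.ensure_future(setup())
    asyncio.get_event_loop().run_forever()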
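
And a minimal sketch of the modelling step, assuming a scikit-learn regressor and a hypothetical CSV of logged trials (spiral_widget_trials.csv with columns inflation_time_s, pressure_kpa, peak_displacement_mm); the dissertation's actual model choices and features may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Hypothetical dataset: each row pairs actuation inputs with the peak
# marker displacement measured by the MoCap system.
data = np.loadtxt("spiral_widget_trials.csv", delimiter=",", skiprows=1)
X, y = data[:, :2], data[:, 2]  # (inflation_time_s, pressure_kpa) -> mm

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A random forest handles the nonlinear pressure/deformation relationship
# without hand-tuned features.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Report held-out error, then predict an untested setting:
# 1.5 s inflation at 40 kPa.
pred = model.predict(X_test)
print(f"MAE: {mean_absolute_error(y_test, pred):.2f} mm")
print(model.predict([[1.5, 40.0]]))
```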


impact.
This project turned a hands-on, manual workflow into a smarter, data-rich process. I was able to:
Predict shape transformations before they happen
Track deformation in soft materials with high precision
Cut down time spent on trial-and-error testing
Make fabrication more consistent and repeatable
The upgraded toolkit enables designers and researchers to prototype with confidence. Instead of hoping a widget behaves as expected, they can now simulate outcomes, adjust with precision, and focus on innovation, not troubleshooting.
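
As a hypothetical example of what that looks like in practice, the fitted predictor from the modelling sketch above can be swept over candidate pressures to pick a setting for a target displacement, replacing trial-and-error inflation on hardware. The target and sweep range here are illustrative.

```python
import numpy as np

# 'model' is the fitted regressor from the modelling sketch above.
target_mm = 12.0   # desired peak displacement (hypothetical)
inflation_s = 1.5  # fixed inflation time for this sweep
pressures = np.linspace(10.0, 60.0, 101)  # candidate pressures in kPa

# Build (inflation_time, pressure) inputs and predict displacement for each.
inputs = np.column_stack([np.full_like(pressures, inflation_s), pressures])
preds = model.predict(inputs)

# Choose the pressure whose predicted displacement is closest to the target.
best = pressures[np.argmin(np.abs(preds - target_mm))]
print(f"Inflate at ~{best:.1f} kPa for ~{target_mm} mm of displacement")
```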
conclusion.
Shape-changing interfaces aren’t just futuristic concepts; they’re possible today, with the right tools. This project showed me how far design can go when paired with computation. By combining real-time sensing and predictive intelligence, I created a toolkit that brings control, clarity, and speed to a space that was once unpredictable.
What started as a toolkit upgrade became a shift in how we can think about designing with movement, material, and machine learning together.