March 19th, 2026
5 min read
A ballerina with amyotrophic lateral sclerosis (ALS) gets to experience the joy of dance again. A DJ living with ALS can continue to create music and perform for an audience. People with physical limitations can compete, socialize and connect through e-sports.
These are just a few examples arising out of Project Humanity, an initiative aimed at creating a society where everyone can achieve well-being. It is a human-centered program that uses technology to enable people with physical limitations to connect to the world – and in the future it will be powered by the Innovative Optical and Wireless Network (IOWN), making those connections seamless, real-time and global.
What is Project Humanity?
Project Humanity is an initiative to support social inclusion and well-being for people with disabilities, dementia, autism spectrum disorder (ASD) and other conditions, as well as their families, friends and caregivers. We do this in ways that minimize burden and respect each person’s dignity. This includes developing and implementing technologies such as avatar control using bio-signals, including electromyography (EMG) and electroencephalography (EEG) signals.
In the case of EMG signals, the Project Humanity team has developed an EMG-based control interface that senses even the most subtle muscle movements and converts them into digital character actions. Using this technology, people experiencing muscle atrophy and limited mobility can intuitively control digital avatars or remote robots with slight muscle movements, giving them the ability to communicate and even perform on stage in digital environments. An EEG control interface, meanwhile, uses sensors to read brain waves and converts them into device operation information.
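The general idea behind an EMG control interface can be sketched in a few lines of code. This is only an illustration, not NTT's actual implementation: the smoothing window, threshold and gesture name below are all hypothetical, but they show how a faint muscle signal can be turned into a discrete avatar action.

```python
from collections import deque

# Hypothetical pipeline: raw EMG samples -> smoothed RMS envelope ->
# threshold crossing -> mapped avatar action. All parameters are
# illustrative; a real system would calibrate them per user.

def rms_envelope(samples, window=8):
    """Smooth the EMG signal with a sliding root-mean-square window."""
    buf, out = deque(maxlen=window), []
    for s in samples:
        buf.append(s * s)
        out.append((sum(buf) / len(buf)) ** 0.5)
    return out

def detect_actions(samples, threshold=0.3, action="raise_hands"):
    """Emit an avatar action each time the envelope rises past the threshold."""
    actions, active = [], False
    for i, e in enumerate(rms_envelope(samples)):
        if e >= threshold and not active:
            actions.append((i, action))   # rising edge -> trigger gesture
            active = True
        elif e < threshold:
            active = False
    return actions

# Even a brief, subtle burst of muscle activity crosses a low threshold:
signal = [0.02] * 10 + [0.5, 0.6, 0.55, 0.5] * 2 + [0.02] * 10
print(detect_actions(signal))
```

Because only a threshold crossing is needed, the same sketch works for residual muscle activity that is far too slight to move a limb.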
In addition, Project Humanity has developed voice synthesis technology, which reproduces a person’s voice from recorded speech and translates it into multiple languages while preserving the speaker’s unique vocal characteristics.
Project Humanity in action: DJ MASA’s story
In 2014, at age 27, DJ MASA, or Masatane Muto, was diagnosed with ALS, an incurable disease that progressively damages motor neurons responsible for voluntary movement. The diagnosis did not dim his passion for music.
He performs a variety of music live using eye-gaze technology, but his shows were missing an interactive component – the back-and-forth with the crowd. “I want to connect with the audience,” DJ MASA told us when Project Humanity was exploring the possibility of a collaboration with him.
To bring DJ MASA’s hope to life, we blended virtual and physical spaces so he could interact with the crowd through a digital avatar. We used our cross-lingual text-to-speech voice synthesis technology to enable him to address the audience in both Japanese and English using his own voice. Our EMG-based avatar control interface leveraged his residual muscle activity to control the avatar, enabling him to perform DJ call-and-response gestures such as raising his hands, clapping, pointing at the audience, jumping and swaying his body.

As a result, he was able to “connect with the audience,” as he wished. DJ MASA communicated with the crowd in Japanese and English using his own reproduced voice and performed live through his avatar at a self-organized live music festival and a variety of international exhibitions, such as the Ars Electronica Festival, Big Bang Ball and South by Southwest. When he lifted the avatar’s hands, the audience raised theirs in response. “I could feel I was connecting with the audience,” he said afterward.
Not only did we present a DJ performance, but we also showcased an avatar dance performance synchronized to the music that DJ MASA controlled with his gaze, using NTT’s technology called Style-specified Dance Choreography Generation from Music (SDCGM). “I continue to challenge the impossible,” DJ MASA said.
Waves of Will: Breanna Olson’s story
Breanna Olson, a former ballerina who now lives with ALS, provides another example of how this technology can bring joy and keep dreams alive.
Breanna is an American dancer and advocate who is widely known for her indomitable spirit and courage in the face of a life-changing diagnosis. Breanna’s dance journey began at a young age, with training in various genres including ballet, contemporary and jazz. Throughout her career, she has performed on stage and worked with several dance companies, earning high praise for her ability to blend emotion and technique.
After she was diagnosed with ALS in 2023, she wanted to continue to express herself through dance. In 2025, in collaboration with NTT and the global creative research and development network Dentsu Lab, she performed Waves of Will, a world-first live performance that used EEG signals to trigger and modulate predefined dance expressions. The system captured her brainwaves and analyzed their patterns to detect a specific, learned EEG signal, allowing her to select predefined movement options at the moments she chose; those movements were then executed by a digital virtual dancer. Rather than responding to passive stimuli, the interface enabled Breanna to actively communicate her intention. She used the system in pre-production to help shape the artistic direction of the piece, and during the performance itself to select the movements the virtual dancer would execute.
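The selection mechanism described above can be illustrated with a simple sketch. The real Waves of Will system is proprietary; this hypothetical version only shows the core idea: each window of EEG data is compared to a learned template, and a strong match confirms whichever predefined movement is highlighted at that moment. The movement names, template and thresholds are invented for illustration.

```python
# Hypothetical rotating menu of predefined dance movements.
MOVES = ["arabesque", "pirouette", "port_de_bras"]

def correlation(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def select_moves(eeg_windows, template, threshold=0.8):
    """Compare each EEG window to the learned template; a strong match
    selects the movement highlighted during that window."""
    chosen = []
    for i, window in enumerate(eeg_windows):
        highlighted = MOVES[i % len(MOVES)]   # menu rotates each window
        if correlation(window, template) >= threshold:
            chosen.append(highlighted)
    return chosen

template = [0.0, 1.0, 0.5, -0.5, -1.0, 0.0]   # learned EEG pattern
windows = [
    [0.1, 0.0, -0.1, 0.1, 0.0, -0.1],   # no intent -> "arabesque" skipped
    [0.0, 0.9, 0.6, -0.4, -1.1, 0.1],   # matches -> "pirouette" selected
    [0.2, -0.2, 0.1, -0.1, 0.2, -0.2],  # no intent -> skipped
]
print(select_moves(windows, template))
```

The key property, as in the performance itself, is that the dancer drives the selection: nothing happens until she actively produces the learned signal.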
It was the world’s first live performance to combine brainwave technology, data engineering and performance design.
“Even when my body stopped moving, my imagination never did,” said Breanna. “Working with NTT and Dentsu Lab, I am able to dance again. I have found a new language for creativity, one where my thoughts move, my ideas take shape, where I can keep expressing who I am.”

How does IOWN support Project Humanity?
Project Humanity is changing lives. IOWN can take this work even further. With its ultra-high speed and low latency, IOWN has the potential to make the technologies behind Project Humanity run more smoothly and quickly.
Just imagine what this can mean for people with physical limitations. For instance, everyone communicating online in a group – no matter where they are – will be able to see and react to things at almost the same time. By adjusting the system so that everyone stays in sync with the slowest connection in the group, the experience will be fair and coordinated for all involved.
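The “stay in sync with the slowest connection” idea can be sketched numerically. This is a simplified, hypothetical model, not IOWN's actual mechanism: each participant's feed is buffered just long enough that an event arrives everywhere at the same moment. The city names and latency figures are made up for illustration.

```python
# Hedged sketch of latency equalization: delay the fast connections so
# every participant perceives the same event simultaneously.

def equalizing_delays(latencies_ms):
    """Extra buffering per participant so each total delay matches the slowest."""
    slowest = max(latencies_ms.values())
    return {name: slowest - lat for name, lat in latencies_ms.items()}

latencies = {"Tokyo": 12, "Austin": 48, "Linz": 35}   # one-way latency, ms
delays = equalizing_delays(latencies)
print(delays)     # fast links wait; the slowest waits 0 ms

arrival = {name: latencies[name] + delays[name] for name in latencies}
print(arrival)    # every participant sees the event at the same time
```

The trade-off is that everyone shares the slowest participant's delay, which is exactly why IOWN's low baseline latency matters: the smaller the worst case, the tighter the shared experience.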
Digital twin technology – one of the pillars of the IOWN concept – is also key to creating a future where there are no barriers to participating in social activities for those with physical limitations. Digital twin technology allows for the creation of a virtual copy in real time of anything in the real world, including people. When coupled with EMG-based control interface technology, this can enable true, real-time communication using digital avatars.
Looking ahead to the future of Project Humanity
NTT has big plans for Project Humanity. We are working to develop communication assist technologies that further support the social participation of people with dementia, along with technologies that help people reframe how they think and feel so they can accept and cope with dementia in a positive, healthy way. We are also working to support the movement of people with spinal cord injuries.
Our initiatives also include a focus on neurodiversity – recognizing differences in the brain and nervous system, and the individual characteristics that result from them – while building mutual respect and supporting ways to put these differences to work in society. The principles behind Project Humanity are people-centered: NTT respects diversity, solves problems in ways that do not burden those involved, and aims to create a world where everyone can achieve well-being. Our goal is an inclusive society without division, where no one is left behind.
With Project Humanity and IOWN, NTT is creating a future where everyone can continue to participate in society and chase their dreams.