This is how you make a digital human
Soul Machines’ technology is changing how consumers interact with companies.
Imagine going online and having a personal chat with Olympic gold medal winner Michael Phelps. Phelps’ digital “clone” calls you by name, wants to know how you’re doing, and offers feedback on athletic training.
This hypothetical conversation isn’t as far-fetched as it may seem. For the better part of the last decade, the New Zealand company Soul Machines has been developing digital humans who can engage with customers, answer questions and even read a user’s emotional cues. Many of these “people” are used by companies like the Royal Bank of Scotland to help with customer service, bringing human warmth into what can otherwise be a cold experience. The company’s new project, currently under wraps, is building a celebrity’s digital twin (and no, it isn’t Michael Phelps).
How does a team of programmers and designers create a digital human? The process requires a lot more than good code and graphics. Team members at the New Zealand company study human physiology and brain processing and adapt them to their digital creations. Or, as the company’s material says, they use “neural networks to combine biologically inspired models of the human brain and key sensory networks.”
So far, Soul Machines’ digital twins are based on 20 real-life people of diverse demographic backgrounds. Their personalities and characteristics can be spliced together to create synthetic “people.”
While the market is still in its early-adopter phase, digital humans promise to revolutionize how companies interact with their customers.
“Digital humans are offering innovative ways for companies to humanize their brand and personalize customer service in a way that has not been possible before,” says Greg Cross, Soul Machines’ co-founder and Chief Business Officer.
The seed for Soul Machines was planted in 2012. Co-founder and animator Mark Sagar, who won Academy Awards for his work on “King Kong” and “Avatar,” wondered if it was possible to build autonomous animation, creating independently functioning characters. Sagar and Cross built a team of researchers — neuroscientists, developmental psychologists, and child psychologists — to see if they could build a digital model of a human brain.
The operating systems in Soul Machines’ creations are modeled on the structure of the human brain: the brain stem, or reptilian part, regulates basic life functions like breathing and heart rate; the limbic system drives emotional responses such as facial expressions; and the cortex oversees rational thinking.
Human engagement depends on these three systems, explains Cross. For example, if one person smiles at another, the limbic and reptilian systems usually prompt a smile in return. If the person receiving the smile chooses not to smile back, their rational brain has overridden the emotional one.
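As a rough illustration of that layered idea, here is a minimal Python sketch of a three-layer response model, with a reflexive “smile back” impulse that a rational layer can override. All class names and logic here are hypothetical stand-ins, not Soul Machines’ actual architecture.

```python
# Illustrative sketch only: a toy three-layer response model inspired by the
# reptilian / limbic / cortical split described above. Names and logic are
# hypothetical, not Soul Machines' implementation.
from dataclasses import dataclass


@dataclass
class Stimulus:
    other_person_smiling: bool


class ReptilianLayer:
    """Keeps basic 'life' functions ticking (breathing, heart rate)."""
    def tick(self) -> dict:
        return {"breathing": True, "heart_rate_bpm": 70}


class LimbicLayer:
    """Produces a reflexive emotional response to a stimulus."""
    def react(self, stimulus: Stimulus) -> str:
        return "smile" if stimulus.other_person_smiling else "neutral"


class CorticalLayer:
    """Rational layer that can override the emotional impulse."""
    def decide(self, impulse: str, suppress_smile: bool) -> str:
        if impulse == "smile" and suppress_smile:
            return "neutral"  # rational override of the emotional brain
        return impulse


def respond(stimulus: Stimulus, suppress_smile: bool = False) -> str:
    ReptilianLayer().tick()                  # autonomic background activity
    impulse = LimbicLayer().react(stimulus)  # reflexive emotional reaction
    return CorticalLayer().decide(impulse, suppress_smile)


print(respond(Stimulus(other_person_smiling=True)))                       # smile
print(respond(Stimulus(other_person_smiling=True), suppress_smile=True))  # neutral
```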
Digital humans built on Soul Machines’ model of the brain can “see” and “hear” through their perception systems, taking input from a device’s video camera and microphone. Part of this perception is emotional engagement, so digital humans can interact in emotionally appropriate ways: if a bank customer is happy or a hotel guest is disgruntled, the digital human can respond accordingly. Soul Machines has also built out full bodies for its digital humans, using scans of real-life people, which allow for richer interaction such as pointing and touching the screen. Digital humans can also learn and grow through their interactions, making them more human-like.
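In the same spirit, the sketch below shows how camera and microphone input could feed a perception step whose detected emotion shapes the tone of the reply. The detection and reply functions are placeholders assumed for illustration; a production system would rely on trained vision, speech, and dialogue models.

```python
# Illustrative sketch only: a hypothetical perception-to-response loop in which
# a perceived emotion (from camera/microphone input) shapes the reply's tone.
from typing import Literal

Emotion = Literal["happy", "disgruntled", "neutral"]


def detect_emotion(smile_score: float) -> Emotion:
    # Stand-in for a facial-expression model scoring webcam frames (0.0 to 1.0).
    if smile_score > 0.7:
        return "happy"
    if smile_score < 0.3:
        return "disgruntled"
    return "neutral"


def choose_reply(user_text: str, emotion: Emotion) -> str:
    # Match the tone of the scripted answer to the perceived emotion.
    if emotion == "happy":
        return f"Great to hear! Here's what I found about '{user_text}'."
    if emotion == "disgruntled":
        return f"I'm sorry about the trouble. Let me help with '{user_text}'."
    return f"Sure, here's some information on '{user_text}'."


print(choose_reply("my room booking", detect_emotion(0.2)))
```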
Soul Machines’ technology brings more personalization to a professional world that has grown increasingly digital in recent years. The technology can be applied in any context where there is a desire to extend a person’s reach, such as celebrity engagement. Companies (or individuals) design their digital humans and program them with scripts based on relevant content.
Reflecting on a future in which we’ll all interact with a lot more machines, Cross says, “Our philosophy is that we will find those machines more helpful, relatable, and trustable if they’re more like humans.”
Currently, the company is collaborating with a number of corporations to create customized digital humans. Along with the Japanese skincare brand SK-II, Soul Machines created Yumi, an AI-powered digital influencer. Yumi personably answers any questions consumers have about SK-II products or particular skincare needs.
Soul Machines recently launched a Digital DNA Studio, which taps into the company’s “DNA database,” built from the 3-D images of models they’ve captured to make realistic humans. The “virtual gene pool” contains information on skin color and texture, as well as bone and muscle structures. The result is a second type of digital creation, a “synthetic digital human,” which can be developed from the database in just a few days.
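To make the “virtual gene pool” idea concrete, here is a hypothetical sketch of how traits captured from scanned people might be recombined into a synthetic digital human. The field names and the random-mix logic are assumptions for illustration, not the Digital DNA Studio’s actual data model.

```python
# Illustrative sketch only: a toy "virtual gene pool" in which each trait of a
# synthetic digital human is drawn from a captured, scanned person.
import random
from dataclasses import dataclass


@dataclass
class CapturedPerson:
    name: str
    skin_tone: str
    skin_texture: str
    bone_structure: str
    muscle_structure: str


@dataclass
class SyntheticHuman:
    skin_tone: str
    skin_texture: str
    bone_structure: str
    muscle_structure: str
    sources: tuple  # names of the captured people the traits came from


def synthesize(pool: list, rng: random.Random) -> SyntheticHuman:
    """Pick each trait from a (possibly different) captured person."""
    traits = ("skin_tone", "skin_texture", "bone_structure", "muscle_structure")
    picks = {trait: rng.choice(pool) for trait in traits}
    return SyntheticHuman(
        skin_tone=picks["skin_tone"].skin_tone,
        skin_texture=picks["skin_texture"].skin_texture,
        bone_structure=picks["bone_structure"].bone_structure,
        muscle_structure=picks["muscle_structure"].muscle_structure,
        sources=tuple(sorted({p.name for p in picks.values()})),
    )


pool = [
    CapturedPerson("model_a", "light", "fine", "narrow", "lean"),
    CapturedPerson("model_b", "dark", "coarse", "broad", "athletic"),
]
print(synthesize(pool, random.Random(0)))
```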
Over the next few years, Soul Machines plans to work with companies to create thousands of digital workers, including assistants and companions, opening up a myriad of possibilities. While there is concern about computer automation becoming ubiquitous (some reports predict 30 million workers will lose their jobs because of AI), Cross says digital humans can fill critical gaps where human beings are already opting out of jobs.
“Basic services like healthcare and education present enormous opportunities for us to find better ways of delivering highly personalized services. These are some of the exciting things we imagine in the world,” notes Cross. “A lot of the future is based on how well we collaborate and cooperate with machines.”
For more information, see:
New Zealand’s Soul Machines puts a human face on AI
For related media inquiries, please contact story.inquiry@one.verizon.com