Are lifelike digital humans the future of customer experience?

A woman sits at a computer and interacts with an autonomously animated digital person created by Soul Machines.
Image Credits: Soul Machines

Soul Machines, a New Zealand-based company that uses CGI, AI and natural language processing to create lifelike digital people who can interact with humans in real time, has raised $70 million in a Series B1 round, bringing its total funding to $135 million. The startup will put the funds toward enhancing its Digital Brain technology, which uses a technique called “cognitive modeling” to recreate aspects of the human brain’s emotional response system in order to build autonomously animated characters.
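Soul Machines hasn’t published the internals of its Digital Brain, so the sketch below is only a loose illustration of what a cognitive-modeling loop could look like: the agent perceives a user’s emotional cues, nudges an internal affective state toward them, and conditions its reply and facial animation on that state. Every class, field and function name here is hypothetical.

```python
# Hypothetical sketch of a cognitive-modeling loop; not Soul Machines' actual Digital Brain.
from dataclasses import dataclass, field


@dataclass
class EmotionalState:
    # Internal affect values the agent carries between turns (illustrative only).
    valence: float = 0.0  # negative = unhappy, positive = happy
    arousal: float = 0.0  # low = calm, high = excited


@dataclass
class DigitalPerson:
    state: EmotionalState = field(default_factory=EmotionalState)

    def perceive(self, user_smile: float, user_voice_energy: float) -> None:
        # Nudge the internal state toward the user's observed affect ("emotional mirroring").
        self.state.valence += 0.5 * (user_smile - self.state.valence)
        self.state.arousal += 0.5 * (user_voice_energy - self.state.arousal)

    def respond(self, user_text: str) -> dict:
        # Condition both the utterance and the facial animation on the current emotional state.
        expression = "smile" if self.state.valence > 0.2 else "neutral"
        return {"text": f"Happy to help with: {user_text}", "facial_animation": expression}


agent = DigitalPerson()
agent.perceive(user_smile=0.8, user_voice_energy=0.4)
print(agent.respond("Tell me about this skincare product."))
```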

The funding was led by new investor SoftBank Vision Fund 2, with additional participation from Cleveland Avenue, Liberty City Ventures and Solasta Ventures. Existing investors Temasek, Salesforce Ventures and Horizons Ventures also participated in the round.

While Soul Machines does envision its tech will be used for entertainment purposes, it’s mainly pursuing a B2B play that creates emotionally engaging brand and customer experiences. The basic problem the startup is trying to solve is how to create personal brand experiences in an increasingly digital world, especially when the main interaction most companies have with their customers is via apps and websites.

The answer to that, Soul Machines thinks, is a digital workforce, one that is available at any time of the day, in any language, and mimics the human experience so well that humans have an emotional reaction, which ultimately leads to brand loyalty.

“It’s like talking to a digital salesperson,” Greg Cross, co-founder and chief business officer, told TechCrunch. “So you can be in an e-commerce store buying skincare products, for example, and have the opportunity to talk to a digital skincare consultant as part of the experience. One of the key things we’ve discovered, particularly during the COVID era, is more of our shopping and the way we experience brands is done in a digital world. Traditionally, a digital world is very transactional. Even chatbots are quite transactional – you type in a question, you get a response. What drives us as a company is to think about how do we imagine that human interaction with all of the digital worlds of the future?”

It’s worth noting that Soul Machines’ other co-founder, Mark Sagar, has won Academy Awards for his AI engineering work creating characters for the films “Avatar” and “King Kong.” Perhaps that skill at producing realistic digital humans is why Soul Machines reported that Yumi, a digital skincare specialist for SK-II, a P&G brand, drove a 4.6x increase in conversion rate and a 2.3% increase in customer satisfaction, and that customers were twice as likely to buy after interacting with her.

Ruth, digital person created by Soul Machines, for Nestle Tollhouse
Ruth, the digital cookie coach created by Soul Machines for Nestle Tollhouse.

The startup has worked with brands such as Nestle Tollhouse to create Ruth, an AI-powered cookie coach that can answer basic questions about baking cookies and help customers find recipes based on what they have in their kitchen. Soul Machines also teamed up with the World Health Organization to create Florence, a virtual health worker who is available 24/7 to provide digital counseling services to those trying to quit tobacco or learn more about COVID-19. There’s also Viola, who lives on the company’s website as an example of a digital assistant who can answer questions and interact with content, like YouTube videos or maps, that she pulls up.

“Soul Machines’s Digital People solution has been particularly well-received in the service industries where corporates are looking to enhance their online customer service experience beyond a text-based chat or audio-only call that typically have long wait times to talk to a live person,” Anna Lo, investment director at SoftBank Investment Advisers, told TechCrunch. “With autonomous animation, the customized persona which Soul Machines produces is also a useful customer acquisition tool for new product and service queries.”

Lo also pointed to potential applications in the telehealth sector, where patients would prefer a live video experience. Digital people could help provide a level of privacy and comfort for patients to ask sensitive questions, while freeing up doctors to handle more hands-on medical situations, said Lo.

For consumers, many digital assistants can feel more like a gimmick than a useful tool. But these assistants allow companies to collect first-party data on their customers, which can be used to acquire and retain customers and add more value, rather than having to spend huge sums of money to buy that data from social media platforms or Google AdWords, Cross said.

While Soul Machines has a clear outline of ways it can enhance the future of customer experience, it still has a ways to go to get the tech where it needs to be. The digital people (or shall we say digital women, because Soul Machines’ clients clearly subscribe to the woman-as-a-servant philosophy) that are on the market now feel like visual chatbots. They seem able to answer only scripted questions, or questions phrased in a specific way, and they recycle through a handful of responses.

Viola, digital person created by Soul Machines
Interacting with Viola, the digital person created by Soul Machines.

For example, Viola introduces herself to the user by saying: “I’m here to help you explore the world. Ask me a Who, What, Where or Why question, and let’s see where we go.”

You can ask Viola what she is, what Soul Machines is, and some other random questions that can be pulled from online encyclopedias like, “Where is New Zealand?” or “What is the Big Bang?” I asked her what cognitive modeling and deep learning were, and she said, “Sorry, I don’t know what that is.”

If the user asks a question that’s not easy to answer, Viola often provides a standard deflection, like any basic chatbot. Or she’ll respond in ways that are surprising, if clearly not intended. For example, I asked Viola: “Why do you look sad?” She responded by pulling up a YouTube video of “I’ll Stand by You” by The Pretenders. Not exactly the answer I was looking for, but Viola does at least appear to interact with the content she brings up, looking at and gesturing toward it, which suggests she is aware of the content in her digital world, Cross said.
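That deflection pattern is familiar from rule-based chatbots: if no scripted intent matches the user’s phrasing closely enough, fall back to a canned response. The snippet below is a purely illustrative sketch of that behavior, not Soul Machines’ code; the scripted answers and the similarity threshold are invented for the example.

```python
# Illustrative fallback logic, not Soul Machines' implementation.
from difflib import SequenceMatcher

SCRIPTED_ANSWERS = {
    "what is soul machines": "Soul Machines builds autonomously animated digital people.",
    "where is new zealand": "New Zealand is an island country in the southwestern Pacific Ocean.",
}
FALLBACK = "Sorry, I don't know what that is."


def answer(user_question: str, threshold: float = 0.75) -> str:
    """Return the closest scripted answer, or a deflection if nothing matches well enough."""
    question = user_question.lower().strip("?! .")
    best_key = max(SCRIPTED_ANSWERS, key=lambda k: SequenceMatcher(None, k, question).ratio())
    score = SequenceMatcher(None, best_key, question).ratio()
    return SCRIPTED_ANSWERS[best_key] if score >= threshold else FALLBACK


print(answer("What is Soul Machines?"))  # matches a scripted answer
print(answer("Why do you look sad?"))    # falls through to the deflection
```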

Florence, digital person created by Soul Machines, for the World Health Organization
Florence, the digital person created by Soul Machines for the World Health Organization, reacts to a human’s smile with a smile of her own.

Florence and Ruth were similarly limited: they understood questions only when phrased the way they were trained to recognize, and they could only answer questions within the limits of their operational design domains. For her part, Florence had a decent facial-mimicking feature. When I smiled at her, she smiled back, and it was a lovely, genuine-looking smile that actually endeared me to her.

As customers interact with any of Soul Machines’ digital people, information about their facial expressions and the way they react emotionally is collected, anonymized and used to train the Digital Brain so that it can interpret those responses and provide an appropriate answer.
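The article doesn’t describe how that data is structured, so the following is just a hypothetical sketch of what an anonymized expression record feeding such a training pipeline might look like; all field names are invented.

```python
# Hypothetical shape of an anonymized expression-telemetry record; the article only says the
# data is collected, anonymized and used for training, so all field names here are invented.
import hashlib
import json
import time


def anonymize_session(raw_session_id: str) -> str:
    # One-way hash so a training record can't be tied back to a specific customer.
    return hashlib.sha256(raw_session_id.encode()).hexdigest()[:16]


def make_training_record(raw_session_id: str, expression_scores: dict, agent_reply: str) -> str:
    record = {
        "session": anonymize_session(raw_session_id),
        "timestamp": int(time.time()),
        "expressions": expression_scores,  # e.g. {"smile": 0.8, "frown": 0.1}
        "agent_reply": agent_reply,
    }
    return json.dumps(record)


print(make_training_record("user-1234", {"smile": 0.8, "frown": 0.1}, "Happy to help!"))
```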

To measure progress in autonomous animation, Soul Machines has written a whitepaper proposing a framework of five levels, plus a baseline Level 0, “No Autonomy,” which is just a recorded animation, like a cartoon.

Five Levels of Autonomous Animation: Updated Framework to Improve Human-Machine Collaboration
Image Credits: Soul Machines

Levels 1 and 2 involve pre-recorded and human-authored animation (think of how animated characters mimicked the movements of real actors in movies like “Avatar” or “The Lord of the Rings”). Levels 3 through 5 involve real-time, dynamically generated, content-aware animation. Soul Machines currently puts itself at Level 3, or “Guided Animation,” which it defines as a “Cognitively Trained Animation (CTA) system [that] uses algorithms to generate a set of animations without the need for explicit authoring. Authors evolve into trainers solely focused on defining the scope of content and role. The system informs the trainers on areas of improvement.”
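For reference, here is the framework as described in this article, condensed into a simple lookup; only Levels 0, 3 and 4 are named explicitly in the text, so the other entries are paraphrased descriptions rather than official labels.

```python
# Condensed paraphrase of the framework as described in this article; Soul Machines'
# whitepaper is the authoritative source, and only Levels 0, 3 and 4 are named here.
AUTONOMOUS_ANIMATION_LEVELS = {
    0: "No Autonomy: a recorded animation, like a cartoon",
    1: "Pre-recorded, human-authored animation (not named or distinguished in the article)",
    2: "Pre-recorded, human-authored animation, e.g. performance capture (not named in the article)",
    3: "Guided Animation: a cognitively trained system generates animations without explicit authoring",
    4: "Goals-Based Animation: new animations generated dynamically to reach trainer-set goals",
    5: "Real-time, dynamically generated, content-aware animation (not named in the article)",
}

for level, description in AUTONOMOUS_ANIMATION_LEVELS.items():
    print(f"Level {level}: {description}")
```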

Soul Machines is working toward Level 4 autonomy, or “Goals-Based Animation,” in which the CTA system would generate new animations dynamically to help it reach goals set by a trainer, Cross said. The system tries new interactions and learns from each one under the guidance of that trainer. An example could be a virtual assistant that advises customers on complex financial situations and creates new behaviors on the fly, with those behaviors all staying in line with branding and marketing goals provided by the company.
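Read that way, Level 4 resembles a reinforcement-learning-style loop: try a behavior, get feedback from a trainer, update, repeat. The sketch below is only a loose analogy under that assumption, not Soul Machines’ system; the behaviors, scoring function and learning rate are all invented.

```python
# Purely illustrative "try, get trainer feedback, update" loop in the spirit of the Level 4
# description above; not Soul Machines' system. Behaviors, scoring and learning rate are invented.
import random

behaviors = ["offer_product_demo", "explain_pricing", "suggest_related_item"]
scores = {b: 0.0 for b in behaviors}  # running estimate of how well each behavior meets the goal


def trainer_feedback(behavior: str) -> float:
    # Stand-in for a human trainer scoring whether the behavior stayed on-goal and on-brand.
    return random.uniform(0.0, 1.0)


for _ in range(100):
    # Mostly exploit the best-scoring behavior, occasionally explore another one.
    behavior = random.choice(behaviors) if random.random() < 0.2 else max(scores, key=scores.get)
    reward = trainer_feedback(behavior)
    scores[behavior] += 0.1 * (reward - scores[behavior])  # simple running update

print("Currently preferred behavior:", max(scores, key=scores.get))
```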

Or it could be a company using a digital version of a celebrity brand ambassador to answer questions about its products in a digital showroom. Soul Machines recently announced an intention to build a roster of digital twins of celebrities. Last year, the company started working with basketball player Carmelo Anthony, of the Los Angeles Lakers, to create a digital likeness of him, something it has previously done with rapper will.i.am, whose digital twin was featured on a 2019 episode of “The Age of A.I.,” a YouTube Originals series hosted by actor Robert Downey Jr.

Anthony is already a Nike ambassador, so in theory, Soul Machines could use his likeness to create an experience that’s perhaps only available to VIP customers who own a set of NFTs to unlock that experience, Cross said. That digital Anthony might also be able to speak Mandarin or any other language in his own voice, which would open up brands to new audiences.

“We’re really preparing for this next big step from the 2D internet world of today, which I believe will still very much be our on ramp, to the metaverse for the 3D world where digital people will need to be fully animated,” Cross said.

Soul Machines currently has prototypes of digital people that can interact with one another by responding emotionally and answering questions that they’ve asked each other, according to Cross. The co-founder thinks this will be first applied to metaverse spaces like a digital fashion store inhabited by multiple people, some of whom are digital people and some of whom are avatars that are controlled by humans.

In the future, Soul Machines envisions a world where people can create digital replicas of themselves.

“We’re very much on the path to creating at some point in the future these hyper-realistic digital twins of ourselves and being able to train them just by interacting with them online,” Cross said. “And then being able to send them off into the metaverse to work for us while we play golf or lie on the beach. That’s a version of the world of the future.”

This article has been updated with a quote from SoftBank.
