As the interaction between humans and machines continues to advance, lives can be improved – for some, immeasurably. EG talks to two New Zealand startups whose creative engineering solutions – thought-controlled technology and digital humans – aim to make life easier and better for all.

Sarvnaz Taherian knows all about the power of the mind. Wearing a brain-sensing headband with a single sensor on her forehead, the co-founder of Auckland-based hi-tech startup Thought-Wired is able to tweet, update her Facebook status or browse the Internet – all without lifting a finger. “Basically anything I want to do with a keyboard and a mouse can be enabled by this technology,” she says.

Welcome to the world of brain-computer interface (BCI) technology, which promises to endow us with the ability to control our immediate worlds with nothing but our thoughts. It’s a scenario that has been dreamed of since 1924, when German psychiatrist Hans Berger first recorded the electrical activity of a human brain via electrodes placed on the scalp. The dream got a boost in the 1960s, thanks to neurophysiologist William Grey Walter’s work identifying brain activity associated with particular physical movements. Then, this century, the first brain-sensing headbands appeared, capable of registering signals and sending them to a computer.

For diehard gamers, the potential uses are obvious. The founders of Thought-Wired, however, are driven by a loftier purpose: they want to use BCI technology to help disabled people who can’t move or talk to communicate using their minds. “Making their lives better is our main motivation,” says Sarvnaz. This goal of improving people’s lives through innovative engineering is a theme that recurs among many of New Zealand’s most cutting-edge technology ventures.

Take the Auckland-based Artificial Intelligence (AI) startup Soul Machines, for example. The company’s first virtual assistant, an avatar called “Nadia”, was developed to help disabled people navigate Australia’s National Disability Insurance Scheme (NDIS) website (more on Soul Machines later). In the case of Thought-Wired, the venture was born when co-founder Dmitry Selitskiy saw a demonstration of brain-sensing hardware and recognised its potential to help a young cousin with severe cerebral palsy. The family had tried just about every type of assistive technology to help the child communicate, but with no luck – the body wouldn’t cooperate. Could harnessing the mind be the answer? Several years and many iterations later, the company is on the verge of commercial release. 

At the heart of Thought-Wired’s innovation is software called nous™, which teaches the user to control their thought patterns and select “yes” or “no” options on a computer or device screen. What will it bring for its users? Ultimately, it is hoped, all manner of things, from home automation to the control of motorised wheelchairs. “But our first use case is to enable communication,” says Sarvnaz.
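Thought-Wired hasn’t published how nous™ works internally, but the basic interaction it describes – watch a single-sensor signal and register a “yes” only when the trained mental state is held deliberately – might look something like the minimal Python sketch below. The focus score, threshold and timings are all assumptions for illustration, not the product’s actual logic.

```python
import time

FOCUS_THRESHOLD = 0.7   # assumed per-user value, set during calibration
HOLD_SECONDS = 1.5      # the state must be held this long to count as "yes"

def read_focus_level() -> float:
    """Stand-in for a headset SDK call returning a 0-1 focus score."""
    return 0.9  # pretend the user is currently holding the trained state

def wait_for_answer(timeout: float = 10.0) -> str:
    """Return "yes" if the user sustains the trained state, otherwise "no"."""
    held_since = None
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if read_focus_level() >= FOCUS_THRESHOLD:
            held_since = held_since or time.monotonic()
            if time.monotonic() - held_since >= HOLD_SECONDS:
                return "yes"    # sustained state: treat as a deliberate selection
        else:
            held_since = None   # any dip below the threshold resets the hold timer
        time.sleep(0.1)         # poll the sensor at roughly 10 Hz
    return "no"                 # no sustained state before the timeout

print(wait_for_answer())        # -> "yes" after about 1.5 seconds
```

Requiring the state to be held for a stretch, rather than firing on a momentary spike, is the kind of detail that makes a selection deliberate rather than accidental.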



Part of the Thought-Wired team

Taking brain-sensing technology out of the lab and into the lives of disabled people raised significant hurdles. As James Pau, the third of Thought-Wired’s three co-founders, puts it: “BCI is a technology that has been around more than 50 years, and there’s a reason why it hasn’t made it into everyday mainstream use.” James, a former Mechanical Engineering Research Fellow at the University of Auckland, says the innovations behind nous™ aren’t especially technical. “A lot of the challenges we had to overcome to get to this point were more about usability.” The key was to simplify the user interface, and the breakthrough came when the team realised they needed to change tack. “Initially we started with a machine-learning approach, but we flipped that on its head, so it now relies on training the person using nous™ in how to produce the brainwaves required for our system,” says James. The whole development approach has been web-driven. James describes the required mental state as almost meditative, involving complete focus. “It’s different for each person, but after enough practice it becomes almost second nature – like muscle memory.”
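James doesn’t spell out the mechanics, but the “flip” he describes – training the person rather than a model – suggests something as simple as a per-user rule: record the signal while the user relaxes, record it again while they hold their focused state, and fix a threshold between the two. The sketch below is a rough illustration under that assumption; the numbers and the choice of statistic are invented, not Thought-Wired’s method.

```python
from statistics import mean, stdev

def calibrate_threshold(relaxed: list[float], focused: list[float]) -> float:
    """Pick a cut-off between a user's relaxed and focused signal levels.

    No classifier is trained: the user learns to push their own signal past
    a fixed personal threshold, which is why practice can make the skill
    feel like muscle memory.
    """
    relaxed_ceiling = mean(relaxed) + 2 * stdev(relaxed)  # relaxed readings rarely exceed this
    focused_floor = mean(focused)                         # typical focused level
    return (relaxed_ceiling + focused_floor) / 2          # midpoint between the two

# Made-up readings from a short calibration session:
relaxed_readings = [0.21, 0.18, 0.25, 0.22, 0.19]
focused_readings = [0.62, 0.70, 0.66, 0.74, 0.68]
print(calibrate_threshold(relaxed_readings, focused_readings))  # ~0.47
```

The appeal of a rule this simple is that the adaptive part of the system is the human, not the software – exactly the inversion James describes.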


It’s early days, and much improvement is still required to make the hardware more sensitive and precise, and the software more responsive and personalised. But for the six million people globally who can’t move or speak because of severe disabilities yet have healthy minds, the prospect of being able to communicate is growing more real by the day.

Developing digital humans

Like Thought-Wired, Soul Machines is intent on making lives easier by improving the human-computer interface, albeit with a very different twist. A 2016 spin-out from the University of Auckland’s Bioengineering Institute, it’s a company of biomedical engineers, artificial intelligence researchers, neuroscientists, artists and others, focused on creating lifelike “digital humans” to humanise the interface between people and machines. As well as “Nadia”, developed for the Australian NDIS, the rollout so far includes an interactive infant prototype known as “BabyX”, a virtual customer service agent called “AVA” designed for US software outfit Autodesk, and “Sophie”, created in partnership with Air New Zealand as an experiment to explore how AI might be used to help travellers.


Soul Machines’ “Sophie”, a digital human created in partnership with Air New Zealand, is modelled on Rachel Love.

The common thread in all of these creations is that they are, to a greater or lesser degree, emotionally intelligent – imbued with the ability to “read” emotions by analysing an individual’s facial and vocal cues, and to respond in an appropriate, engaging way.

Rachel Love has been involved in the project since the second year of her biomedical engineering studies at Auckland, when she picked up summer holiday work at the Laboratory for Animate Technologies under its then-director Dr Mark Sagar, now CEO of Soul Machines. “I always wanted to do something that involved the brain,” says Rachel, whose first love is biology. “I didn’t want to do engineering until I learned that biomedical engineering existed. The Lab was trying to understand how the human brain works, and the fact they were combining neuroscience and engineering in a way nobody had before, I thought, was the coolest thing in the world.”

When Soul Machines launched, she joined full time, initially as an Avatar Engineer (the company now prefers the term “digital human” to avatar). The job was an eclectic one, involving her at times in animation, voice technology and Emotional Markup Language. More recently, she has evolved into a Conversation Engineer, a more specialised role that involves developing the content a digital human can talk about in a given context. “We’ll ask, ‘What are the logical things they need to provide answers to? Does the answer need to be broken down into nuggets?’ It deals with content, and with all the potential ways that a conversation can go wrong and how you can get it back on track.”

It’s not a role she envisaged when she began studying engineering – indeed, the job didn’t exist at the time. “But I’ve found engineering is a really good background, because so much of what we do here is about problem solving, finding solutions to unique problems. For engineers coming through uni at the moment, it is a really exciting space to be involved in,” adds Rachel, who stresses that Soul Machines is constantly recruiting. “It’s fun, it’s new and it is dealing with problems we haven’t had to think of before.”
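Rachel’s sketch of the Conversation Engineer’s craft maps onto familiar dialogue-design patterns: enumerate the questions a digital human must answer, break each answer into digestible “nuggets”, and define a recovery path for when the conversation drifts. The toy Python below illustrates those three ideas only; the topics, wording and keyword matching are invented, not Soul Machines’ tooling.

```python
# Each supported question maps to an answer broken into short "nuggets",
# delivered a piece at a time rather than as one wall of text.
ANSWERS = {
    "eligibility": [
        "Eligibility depends on a few simple criteria.",
        "The first step is an access request form.",
        "Would you like me to walk you through it?",
    ],
    "funding": [
        "Funding is worked out from an individual support plan.",
        "A planner helps decide which supports you need.",
    ],
}

def detect_topic(user_text: str) -> str | None:
    """Crude keyword matching, standing in for real intent detection."""
    lowered = user_text.lower()
    return next((topic for topic in ANSWERS if topic in lowered), None)

def reply(user_text: str, state: dict) -> str:
    topic = detect_topic(user_text)
    if topic:
        state["topic"] = topic          # remember where the conversation is
        return ANSWERS[topic][0]        # start that topic's nugget sequence
    if "topic" in state:
        # Off-track input: acknowledge it and steer back to the current topic.
        return f"Sorry, I didn't catch that. We were talking about {state['topic']}."
    return "I can help with eligibility or funding. Which would you like?"

state: dict = {}
print(reply("Tell me about eligibility", state))  # first nugget
print(reply("blorp?", state))                     # gets steered back on track
```

A production system would swap the keyword match for a trained intent model, but the content questions Rachel lists – what must be answered, and in what size pieces – stay the same either way.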

AI is certainly a fast-developing industry, with plenty of deep-pocketed global competition. Soul Machines’ edge rests in the naturalistic appearance of its digital humans – the sense you get of a fully formed persona – as well as the effectiveness with which they learn from every new interaction. The company’s big innovation lies in the way it marries cutting-edge technology with deep research into the workings of the human brain. As the website puts it: “We use Neural Networks that combine biologically-inspired models of the human brain and key sensory networks to create a virtual central nervous system we call our Human Computing Engine”. The result is a digital human able to respond dynamically and naturally to its human interlocutor, while delivering the appropriate information.
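The website’s description is high-level, but the architecture it implies – separate sensory channels feeding one shared internal state that colours every response – can be caricatured in a few lines. Everything below (the cue values, weights and tone labels) is invented for illustration; the real Human Computing Engine is far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class AffectiveState:
    """Toy stand-in for the 'virtual central nervous system' idea: facial and
    vocal cues continuously nudge an internal state that shapes replies."""
    valence: float = 0.0    # negative = user seems upset, positive = pleased
    decay: float = 0.9      # older impressions fade as new cues arrive

    def observe(self, facial_cue: float, vocal_cue: float) -> None:
        # Blend the two sensory channels into one running estimate.
        self.valence = self.decay * self.valence + 0.6 * facial_cue + 0.4 * vocal_cue

    def tone(self) -> str:
        if self.valence < -0.5:
            return "reassuring"   # user looks and sounds frustrated: soothe
        if self.valence > 0.5:
            return "upbeat"
        return "neutral"

state = AffectiveState()
state.observe(facial_cue=-0.8, vocal_cue=-0.6)  # a frown plus a tense voice
print(state.tone())                             # -> "reassuring"
```

The point of the running state is that the system responds to how the whole interaction has felt so far, not just the last utterance – which is part of what separates an emotionally intelligent interface from a lookup table.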

“The idea with all our projects is to make people’s interactions with computers better and easier, enabling a natural, face-to-face, more human-like experience,” says Rachel, who cites the NDIS example. “For those disabled users, it was a massive advantage to have a face on a screen that they could interact with, rather than getting frustrated and feeling isolated while struggling with the keyboard and the bureaucratic process.”

What’s next? While the company has made a lot of headway, there’s still plenty of development to come: from the way these digital humans look, to the way they talk, interact and engage, to the way they deliver new services and experiences. “There are always ways to improve performance so that they truly do engage and respond like a digital representation of a human, while also being upfront about the fact that they are not human.”