Huawei is working on giving its virtual assistant the ability to read a user's emotions
At the beginning of the year, research firm Gartner said that emotion AI will be able to use analysis to detect people's moods and respond with more personalized answers. According to Gartner's research vice president Annette Zimmermann, "By 2022, your personal device will know more about your emotional state than your own family." By reading facial expressions and analyzing the user's voice and behavior, virtual assistants will be able to understand the context of the commands they receive and respond with an answer that is more in line with the user's emotions.
According to Gartner, virtual assistants with the ability to read emotions would come in handy in automobiles and health care. A car outfitted with such a digital helper could detect whether the driver is tired, stressed, angry, or frustrated, and adjust the vehicle's behavior to make the ride safer. In health care, a virtual assistant with emotion AI could monitor a person's mental health around the clock and alert caregivers if a problem is detected.
James Lu, director of AI product management at Huawei's consumer business group, says that what his company has in mind is a virtual assistant that will stretch out conversations as long as possible so that the user does not feel alone. This requires giving the assistant both a high IQ and a high EQ (emotional quotient).
Huawei's latest top-of-the-line chipset, the Kirin 970, is equipped with a dedicated Neural Processing Unit (NPU) that drives AI capabilities on handsets powered by the chip. Currently, the SoC powers the Huawei P20 Pro and the Huawei P20.