We’ve discussed how the term artificial intelligence (AI) covers a wide array of applications; like many of those applications, affective computing (AC) is beginning to gain traction in the market. Spanning computer science, behavioral psychology, and cognitive science, affective computing uses hardware and software to identify human feelings, behaviors, and cognitive states through the detection and analysis of facial, body-language, biometric, verbal, and vocal signals.
While affective computing requires hardware such as a camera, a touch device, or a microphone to capture the signals described above, the bulk of AC technology relies on software to detect and analyze an individual’s current mood or cognitive state. Synthesis and analysis of the captured data rely heavily on AI and machine learning algorithms and models, as well as on computer vision (CV), natural language processing (NLP), and natural language understanding (NLU).
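To make that hardware/software split concrete, here is a minimal, hypothetical sketch in Python: a camera supplies the raw signal (the hardware side), and a machine learning model infers an affective state from it (the software side). The `EmotionModel` class and its fixed output are illustrative stand-ins, not any vendor’s actual API.

```python
# Minimal sketch of an affective computing pipeline (illustrative only):
# hardware captures the signal, software infers an affective state.
import cv2  # pip install opencv-python


class EmotionModel:
    """Placeholder for a trained emotion-recognition model (hypothetical)."""

    LABELS = ["neutral", "happy", "sad", "angry", "surprised"]

    def predict(self, frame) -> str:
        # A real system would run a CV/deep learning model on the frame;
        # a fixed label is returned here purely for illustration.
        return self.LABELS[0]


def main() -> None:
    camera = cv2.VideoCapture(0)   # hardware: capture the raw signal
    ok, frame = camera.read()
    camera.release()
    if not ok:
        raise RuntimeError("Could not read a frame from the camera")
    model = EmotionModel()         # software: analyze the signal
    print(f"Detected affective state: {model.predict(frame)}")


if __name__ == "__main__":
    main()
```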
Affective computing offers organizations a great opportunity to augment human capabilities and build trust between humans and the machines they interact with. Businesses are developing more and more use cases that incorporate affective computing, and AI vendors are building innovative capabilities to help organizations capitalize on this opportunity. Here are just a few of the vendors driving the affective computing market forward:
Using Affective Computing to Make Driving Safer and More Enjoyable
MIT Media Lab spin-off Affectiva uses in-cabin cameras and microphones in its Automotive AI product to capture facial and vocal signals, enabling its AI to recognize the cognitive and emotional states of drivers and passengers, including drowsiness and distraction. Affectiva’s key differentiator is Automotive AI’s multimodal emotion recognition technology, which uses deep neural networks, trained on an ever-growing emotional data repository of more than 8 million faces, to analyze facial and vocal signals together.
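Affectiva’s models are proprietary, but the core idea of multimodal emotion recognition can be illustrated with a simple late-fusion sketch: two modality-specific networks each output a probability distribution over emotional states, and the system combines them into one prediction. The labels, probability vectors, and weighting below are invented for illustration.

```python
# Illustrative late-fusion sketch for multimodal emotion recognition.
# The probability vectors are made up; a real system would obtain them
# from two trained deep neural networks (one facial, one vocal).
import numpy as np

LABELS = ["neutral", "joy", "anger", "drowsiness", "distraction"]


def fuse(face_probs: np.ndarray, voice_probs: np.ndarray,
         face_weight: float = 0.6) -> np.ndarray:
    """Weighted late fusion of two modality-specific probability vectors."""
    fused = face_weight * face_probs + (1.0 - face_weight) * voice_probs
    return fused / fused.sum()  # renormalize to a valid distribution


face_probs = np.array([0.10, 0.05, 0.05, 0.60, 0.20])   # facial model output
voice_probs = np.array([0.20, 0.05, 0.05, 0.50, 0.20])  # vocal model output
fused = fuse(face_probs, voice_probs)
print(LABELS[int(np.argmax(fused))])  # -> "drowsiness"
```

Late fusion like this keeps the two modalities independent until the final step, so a weak signal in one channel (say, a face partially out of view) can be compensated by the other.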
Beyond making consumer driving safer, affective computing solutions like Affectiva’s could make ridesharing, taxis, and fleet management safer by monitoring drivers and alerting them to distracted or unsafe driving behaviors, and by providing enhanced in-cabin experiences.
Building Trust in Financial Services with Emotional Analysis
Swiss company NVISO has created Insights ADVISE, an app that uses facial analysis to recognize emotions and build accurate behavioral profiles of financial services clients. The app uses biometric facial recognition technology to track 68 facial points and head pose as clients answer questions. NVISO’s deep learning network analyzes this information to read clients’ emotions in real time and build a corresponding financial behavioral profile.
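NVISO’s implementation is its own, but 68-point facial landmark tracking is a widely used convention, and the open-source dlib library ships a pretrained 68-point shape predictor. A minimal sketch of extracting those points, assuming the model file has been downloaded separately and the input image path is hypothetical:

```python
# Sketch of 68-point facial landmark extraction with dlib (not NVISO's
# actual implementation; the 68-point layout is a common standard).
# Requires: pip install dlib opencv-python, plus dlib's pretrained
# "shape_predictor_68_face_landmarks.dat" model file.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")


def facial_landmarks(image_path: str) -> list[tuple[int, int]]:
    """Return the 68 (x, y) landmark points for the first detected face."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return []
    shape = predictor(gray, faces[0])
    return [(shape.part(i).x, shape.part(i).y) for i in range(68)]


points = facial_landmarks("client_frame.jpg")  # hypothetical input frame
print(f"Tracked {len(points)} facial points")
```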
With this more accurate financial behavioral profile, the Insights ADVISE app empowers financial advisors to better understand their clients’ needs, risk appetite, and preferences and to serve them more effectively, building mutual trust and keeping goals and expectations realistic.
Using Behavioral Insights to Train Customer-Facing Employees
Cogito, an MIT Human Dynamics Lab spin-off, builds products that analyze behavioral signals in voice conversations to provide real-time guidance to customer-facing agents in service, sales, and care roles. Cogito’s products analyze over 200 behavioral signals, such as pitch, tone, pace, turn-taking, vocal effort, and vocal tension, and deliver in-call guidance that is simple for the agent to interpret. Cogito also records and bundles insights from customer interactions and delivers them to supervisors, who can use them to identify training opportunities and share best practices.
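Cogito’s 200-plus signals are proprietary, but a few analogous low-level voice signals (pitch, vocal effort, and a rough pace proxy) can be extracted with the open-source librosa library. A sketch under those assumptions, with a hypothetical recording path:

```python
# Illustrative extraction of a few voice signals analogous to those
# described above (pitch, vocal effort, pace). This is not Cogito's
# signal set; it uses librosa (pip install librosa) on a local WAV file.
import librosa
import numpy as np


def voice_signals(path: str) -> dict[str, float]:
    y, sr = librosa.load(path, sr=16000)
    # Pitch: fundamental frequency via probabilistic YIN.
    f0, voiced, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    # Vocal effort proxy: root-mean-square energy per frame.
    rms = librosa.feature.rms(y=y)[0]
    # Pace proxy: fraction of frames that are voiced (speech vs. silence).
    return {
        "mean_pitch_hz": float(np.nanmean(f0)),
        "mean_vocal_effort": float(rms.mean()),
        "voiced_ratio": float(np.mean(voiced)),
    }


print(voice_signals("agent_call.wav"))  # hypothetical call recording
```

A production system would compute features like these over short sliding windows so that guidance can be surfaced to the agent while the call is still in progress.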
This real-time guidance helps agents deliver more satisfying service to individual clients, and it also empowers them to augment their own emotional intelligence and improve their overall performance through goal-oriented insights.
Improving the Retail Experience for Customers and Businesses with Facial Recognition
Sightcorp uses computer vision and machine/deep learning to analyze and recognize faces in retail environments, generating in-store analytics that measure customer satisfaction and provide real-time customer insights. Sightcorp also builds privacy safeguards into its products, allowing businesses to activate automatic face blurring and declining to store images. By processing data on the edge and applying these safeguards, retailers can ensure that sensitive data never leaves, and is never vulnerable outside of, the device running the facial recognition and analysis.
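Sightcorp’s products are proprietary, but automatic face blurring on the edge can be illustrated with OpenCV’s bundled Haar cascade face detector: faces are detected and blurred in memory, and nothing needs to be written to disk or transmitted. The input filename below is hypothetical.

```python
# Sketch of on-device face blurring with OpenCV (illustrative only, not
# Sightcorp's implementation). Detected face regions are blurred in
# memory before any analytics or display step sees the frame.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def blur_faces(frame):
    """Blur every detected face in the frame (in place) and return it."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0
        )
    return frame


frame = cv2.imread("store_camera_frame.jpg")  # hypothetical in-store frame
anonymized = blur_faces(frame)
```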
Affective computing in retail delivers the insights businesses need to improve and optimize customer experiences while freeing human retail workers to focus on in-person customer interactions, improving both the immediate customer experience and long-term customer satisfaction.
Affective computing, like other AI applications, gives organizations the insights into their businesses and customers that they need to succeed. Delivering that business value, however, requires a comprehensive AI strategy that also navigates the ethics and privacy considerations this technology raises.
To learn more about incorporating affective computing into your organization’s AI strategy, watch IDC’s on-demand webcast “Affective Computing: Augmenting Human Emotional Intelligence with AI”.