Affective Computing in User Experience Design


As digital interfaces become more ubiquitous in the 21st century, high-quality User Experience Design (UXD/UX) has become increasingly important for acquiring and keeping users. UX refers to how users interact with products and services in engaging and meaningful ways, shaped by elements such as branding, page layout, and more. In many cases, the quality of a product’s UX correlates directly with its success: a well-designed experience encourages users to stay, while a poorly designed one drives them away.

Enter Affective Computing (AC), which uses Artificial Intelligence (AI) to generate responses that are in tune with a user’s emotional state. In the context of UXD, this can lead to significant breakthroughs in product design, allowing engineers to build products that adjust to how a user is feeling, whether frustration, confusion, or happiness, and provide a more intuitive experience.

Evolution of User Experience Design

Despite its strong association with digital products, UX design has a long history in products developed well before the digital age. These early products, like telephones and construction tools, were designed ergonomically to fit the human body and reduce operational challenges.

Some products, like the typewriter, involved even more nuanced UX decisions: the original typewriter designers had to arrange the keys in a way that prevented the letter hammers from jamming. The solution was the QWERTY keyboard layout still found on nearly all Latin-script keyboards.

By the turn of the 21st century, digital technology had become a prominent fixture as computers and the internet spurred new demand from individuals and companies alike. Devices like smartphones triggered a UX revolution as smaller touchscreens became the norm. Nearly every digital application suddenly required a new design to accommodate touch input, which in turn led to more advanced UX features like voice commands and hand gestures.

What is Affective Computing?

Affective computing, or Emotional AI, is a field of artificial intelligence that aims to recognize, interpret, and simulate human emotions in order to create stronger engagement with users. Coupled with natural language processing, this niche of artificial intelligence is enabling chatbots and apps that interact more naturally with humans. Examples include:

  • Voice Assistants: Assistants like Siri and Alexa use Emotional AI to analyze a user’s inflection, tone, and other characteristics of their speech to infer how they are feeling, producing more nuanced responses to user input.

  • Emotion-Sensing Wearables: Wearable products like smartwatches can sense a user’s heart rate, temperature, and other vital signs to estimate their stress level (a minimal sketch of this idea follows this list).

  • Mental Health Apps: Certain applications can help gauge a user’s mental state and well-being by analyzing their inputs, comparing them against training data, and suggesting coping strategies.

  • Adaptive Learning Systems: Affective computing has a strong place in education, recognizing a student’s frustration or confusion and offering personalized help with their problems.

  • Automotive Safety: Vehicles fitted with AI can detect when a driver is becoming drowsy or reckless and help them stay safe by directing them to the nearest rest stop.
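
To make the wearable example above concrete, here is a minimal Python sketch of how a product might map raw vitals to a coarse stress level and then adapt its interface. The thresholds, field names (VitalReading, estimate_stress, adapt_ui), and UI responses are illustrative assumptions, not the behavior of any real device or SDK.

```python
from dataclasses import dataclass

# Hypothetical vital-sign reading from a wearable; the fields are illustrative,
# not taken from any real device SDK.
@dataclass
class VitalReading:
    heart_rate_bpm: float          # current heart rate, beats per minute
    skin_temp_c: float             # skin temperature in Celsius
    resting_heart_rate_bpm: float  # the user's baseline heart rate

def estimate_stress(reading: VitalReading) -> str:
    """Map raw vitals to a coarse stress label using simple, assumed thresholds."""
    elevation = reading.heart_rate_bpm - reading.resting_heart_rate_bpm
    if elevation > 30 or reading.skin_temp_c > 37.5:
        return "high"
    if elevation > 15:
        return "moderate"
    return "low"

def adapt_ui(stress_level: str) -> str:
    """Choose a UI response based on the inferred stress level."""
    responses = {
        "high": "Pause notifications and suggest a short breathing exercise.",
        "moderate": "Soften the color scheme and reduce on-screen clutter.",
        "low": "Keep the standard interface.",
    }
    return responses[stress_level]

if __name__ == "__main__":
    reading = VitalReading(heart_rate_bpm=104, skin_temp_c=36.9, resting_heart_rate_bpm=62)
    level = estimate_stress(reading)
    print(level, "->", adapt_ui(level))
```

A production system would replace the hand-picked thresholds with a trained model, but the overall shape (sensor reading in, interface adjustment out) stays the same.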

The Direct Impact of Affective Computing on UXD

Affective computing and UX design serve a common purpose: adapting technology to the human form and condition. As Deep Learning makes AI more robust, the two fields are converging on products that accommodate not only human ergonomics but also mental and emotional states, something traditional computer programs cannot achieve.

  • Personalization: Content recommendation feeds, such as those in music streaming, can go a step further: rather than only recommending music from similar artists, they can identify the emotional tone of a user’s recent listening and suggest tracks that match it (see the sketch after this list).

  • Enhanced Feedback: In gaming, a popular proving ground for new AI systems, affective computing can detect whether a player is struggling and provide targeted tips based on their actions.

  • Predictive Analysis: Smart home devices like lighting and door cameras can detect an occupant’s mood by analyzing their posture with computer vision and adjust home settings to match.

  • Improved Accessibility: A wealth of potential exists for people with disabilities, such as smart wheelchairs that can interpret operating commands from their user’s emotional state through facial recognition and other signals.
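
As a rough illustration of the personalization idea above, the following Python sketch filters a catalog by the dominant mood tag in a user’s recent listening history. The catalog, mood labels, and functions (infer_current_mood, recommend_by_mood) are hypothetical; a real system would infer mood from audio features or affect signals rather than hand-written tags.

```python
from collections import Counter

# Hypothetical catalog: each track carries a pre-computed mood tag.
# Titles, artists, and mood labels are illustrative assumptions.
CATALOG = [
    {"title": "Slow Tide", "artist": "A", "mood": "calm"},
    {"title": "Night Drive", "artist": "B", "mood": "melancholy"},
    {"title": "Up and Out", "artist": "C", "mood": "energetic"},
    {"title": "Quiet Hours", "artist": "D", "mood": "calm"},
]

def infer_current_mood(recent_moods: list[str]) -> str:
    """Take the dominant mood tag from the user's recent listening history."""
    return Counter(recent_moods).most_common(1)[0][0]

def recommend_by_mood(recent_moods: list[str], limit: int = 5) -> list[dict]:
    """Recommend catalog tracks whose mood matches the user's recent emotional tone."""
    mood = infer_current_mood(recent_moods)
    return [track for track in CATALOG if track["mood"] == mood][:limit]

if __name__ == "__main__":
    # The user's last few plays were mostly tagged "calm".
    history = ["calm", "calm", "melancholy", "calm"]
    for track in recommend_by_mood(history):
        print(track["title"], "-", track["artist"])
```

The design choice is deliberately simple: emotional tone becomes just another matching dimension alongside artist or genre.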

UXD Using Affective Computing

Because Affective Computing can powerfully enhance UXD, companies are already building it into their products to deliver more responsive applications that cater to human emotion.

  • Duolingo: The popular language-learning app uses Emotional AI to analyze real-time data from users, adapting and fine-tuning lessons to each learner’s ability. The result feels more “in tune” for users who need extra help with complicated language material (a simplified sketch of this kind of adaptive loop follows this list).

  • Affectiva: This tech company uses Affective Computing and Deep Learning to gauge audience reactions, helping companies fine-tune their marketing campaigns. More relatable content can translate into higher conversion rates.

  • Emotion API: Released by Microsoft, this API gives game developers emotion-sensing capabilities that can be integrated into a game to create unique challenges for players or help them understand the game’s mechanics.

  • Embrace2: Empatica’s Embrace2 is a wearable healthcare device that uses Affective Computing to detect epileptic seizures, giving hospital staff faster response times that can save lives.
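
The common thread in these products is a feedback loop: an emotion signal comes in and the experience adjusts. Below is a minimal, hypothetical Python sketch of such a loop for an adaptive lesson or game level. The frustration score, thresholds, and adjustments are assumptions for illustration and do not reflect Duolingo’s, Microsoft’s, or Empatica’s actual systems.

```python
# Hypothetical adaptive loop: a frustration score in [0.0, 1.0] is assumed to come
# from an emotion-sensing model; the thresholds and responses below are illustrative.

def adjust_lesson(frustration: float, current_difficulty: int) -> tuple[int, str]:
    """Return a new difficulty level and a hint policy based on inferred frustration."""
    if frustration > 0.7:
        # Strong frustration: step down the difficulty and surface explicit help.
        return max(1, current_difficulty - 1), "show worked example"
    if frustration > 0.4:
        # Mild frustration: keep the difficulty but make hints easy to reach.
        return current_difficulty, "offer hint button"
    # Low frustration: the learner is coasting, so raise the challenge.
    return current_difficulty + 1, "no hint"

if __name__ == "__main__":
    difficulty, hint_policy = adjust_lesson(frustration=0.8, current_difficulty=3)
    print(difficulty, hint_policy)
```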

The Future of Affective Computing in UXD

With more advanced AI models emerging from innovations in Machine Learning, the potential for Affective Computing in User Experience Design couldn’t be brighter. New technologies like virtual and augmented reality are already creating exciting applications and provide a powerful platform for the next stage of UXD.

With the ability to step into the digital world, humans may soon enjoy digital environments that fully adapt to their emotional state, providing new forms of stress relief and entertainment that were once considered pure science fiction.

Keegan King

Keegan is an avid user and advocate for blockchain technology and its implementation in everyday life. He writes a variety of content related to cryptocurrencies while also creating marketing materials for law firms in the greater Los Angeles area. He was a part of the curriculum writing team for the bitcoin coursework at Emile Learning. Before being a writer, Keegan King was a business English Teacher in Busan, South Korea. His students included local businessmen, engineers, and doctors who all enjoyed discussions about bitcoin and blockchains. Keegan King’s favorite altcoin is Polygon.

https://www.linkedin.com/in/keeganking/