A Value Framework for Technology Potentials: Business Adoption of Emotion Detection Capabilities

Stefan Koch, Kemal Altinkemer
DOI: 10.4018/IJDSGBT.302636

Abstract

This paper proposes a modified approach for analyzing and presenting the value generation, capture, and distribution from organizational adoption of new technologies. The focal technology used is emotion detection, and technical approaches and possibilities, as well as some of the inherent challenges and limitations are briefly described. For this technology potential, a novel framework for value generation, capture, and distribution is developed using a design science approach. The framework allows organizations to select appropriate adoption areas and derive methods for risk and problem mitigation. This framework is then demonstrated within the selected setting of emotion detection capabilities.

Introduction

Creating business value from information technology (IT) has been a topic of discussion for decades (McAfee & Brynjolfsson, 2008). Kohli and Grover (2008) provided a thorough summary of this discussion together with an outlook, and Melville et al. (2004) reviewed the extant literature and proposed an integrated model of IT business value. The challenge of analyzing and selecting investment areas based on business value is especially relevant today, when many organizations focus on digital technologies, on how to use them for transformation (Fitzgerald et al., 2014), and on how to find and exploit the resulting opportunities (Muehlburger et al., 2020). This paper aims at developing a framework that supports this strategic task.

One example of a digital technology that offers several different application opportunities for organizations is emotion detection. Current technology is beginning to facilitate the detection and classification of human emotions, making it possible to measure levels of happiness, sadness, and other emotional states through video, photo, voice, gesture, text, other sensor data, or a combination thereof, and thereby to provide valuable decision-making data. This has led to rising interest in emotion detection in popular culture and media. However, while emotion detection could be used in pricing, advertising, service management, and many other areas, adoption by business organizations has thus far been comparatively limited. This means that, in an age of digital transformation, organizations face the challenge of deciding whether, how, and in which particular areas or application contexts to adopt such a new technology; as a result, organizations need support for these tasks.

Current uses can be broadly divided into two categories: 1) real-time applications that do not store emotions and are mostly related to human-computer interaction; 2) applications that record emotions for later analysis. Within this second group, a further classification can distinguish between anonymous data collection (e.g., for user interface design improvement) and individually attributable data collection. Importantly, the last category enables one-to-one marketing based on emotions. Companies have already implemented, or are close to implementing, programs that capture and leverage emotion data, including: attention monitoring for ad effectiveness; TV ratings via facial micro-expression analysis and eye tracking; call routing in call centers based on customer anger detected via voice pitch analysis; recommendation services, especially for entertainment; increasing security at malls, airports, sports arenas, and other public venues by detecting malicious intent; creating enhanced virtual reality experiences; accommodating different learning styles in educational environments; drowsiness detection in transportation; and, in medical settings, monitoring the emotions and communication of patients with autism via skin-conductivity sensors, or supporting pain management for pediatric patients via facial analytics. Chen, Chang et al. (2017) provided an overview of advantages in the context of emotion classification of YouTube videos, listing search results, recommendations, advertisement accuracy, policy adjustment, and Web intelligence. Lee et al. (2012) used the valence and arousal of Facebook users to explain event intentions. Mukhopadhyay et al. (2020) examined emotion detection in the context of online learning.
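To make the call-routing use case above concrete, the following is a minimal sketch of pitch-based anger routing. All names (`route_call`, `BASELINE_PITCH_HZ`), thresholds, and the scoring heuristic are illustrative assumptions, not any vendor's actual method; production systems calibrate per speaker and typically use trained acoustic models rather than fixed thresholds.

```python
from dataclasses import dataclass
from statistics import mean, stdev

# Hypothetical calibration constants -- real systems learn these from data.
BASELINE_PITCH_HZ = 180.0     # assumed neutral mean pitch
ANGER_PITCH_RATIO = 1.25      # elevation above baseline suggesting anger
ANGER_VARIABILITY_HZ = 40.0   # pitch variability suggesting agitation

@dataclass
class RoutingDecision:
    queue: str
    anger_score: float

def route_call(pitch_track_hz: list[float]) -> RoutingDecision:
    """Route a call using a crude pitch-based anger heuristic.

    `pitch_track_hz` is a sequence of fundamental-frequency estimates
    (one per voiced frame), assumed to come from an upstream pitch tracker.
    """
    m = mean(pitch_track_hz)
    s = stdev(pitch_track_hz) if len(pitch_track_hz) > 1 else 0.0
    # Combine pitch elevation and variability into a rough anger score.
    score = (m / BASELINE_PITCH_HZ) + (s / ANGER_VARIABILITY_HZ)
    elevated = m > BASELINE_PITCH_HZ * ANGER_PITCH_RATIO
    agitated = s > ANGER_VARIABILITY_HZ
    queue = "senior_agent" if (elevated or agitated) else "standard"
    return RoutingDecision(queue=queue, anger_score=round(score, 2))

# A calm, steady pitch track stays in the standard queue; an elevated,
# erratic one is escalated.
calm = route_call([170.0, 175.0, 180.0, 178.0])
angry = route_call([240.0, 280.0, 200.0, 300.0])
```

Note that this sketch also illustrates the paper's first usage category: the decision is made in real time and the pitch data need not be stored or attributed to an individual.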
