Creepy Technologies and the Privacy Issues of Invasive Technologies

Rochell R. McWhorter, Elisabeth E. Bennett
Copyright © 2021 | Pages: 20
DOI: 10.4018/978-1-7998-8954-0.ch083

Abstract

Technology has become increasingly invasive, and corporate networks are expanding into public and private spaces to collect unprecedented amounts of data and to provide new services, such as artificial intelligence delivered through unsettling human-like personas. The term “creepy technology” appears in the scholarly literature alongside concerns about privacy, ethical boundaries, cybersecurity, and mistaken identity, and in news articles that inform the public about technological advances affecting consumer privacy. Invasive technology provides the impetus for external adaptation in many organizations, and current trends require rapid adaptation to potential security threats. Usability also addresses how users respond and adapt to new technology. This chapter presents an exploratory study of how the public responded to various technology announcements (N = 689 responses); results indicated a significant response to invasive technologies and some sense of freedom to opine. The chapter also discusses interventions that are critical to both the public and private sectors.
Chapter Preview

Background

Examples of creepy technology are numerous in the literature (Martin, 2019; Purshouse & Campbell, 2019; Vladimir, 2018; Wilkins, 2018). For instance, research on creepy technology includes studies of institutions of higher education that use facial recognition technology (FRT) to monitor students (see Cole, 2019; Cuador, 2017; Lieberman, 2018; Reidenberg, 2014), as well as examinations of trends in invasive technology (Aratani, 2019; Brown, 2019; Symanovich, 2018). Notably, Wang and Kosinski’s (2018) controversial research attempted to predict sexual orientation by analyzing digital pictures; the researchers remarked that “given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women” (p. 246). Such predictions could harm people who are identified or misidentified and subjected to discrimination, as well as cause discomfort about something so personal.
