Event Title

Artificial Intelligence (AI) Assistant Helpfulness

Presenter Information

Katelyn Garza
Katrina Henley
Cameron Long

Faculty Mentor

Dr. Amy Hayes

Document Type

Poster Presentation

Date of Publication

2021

Abstract

Since Apple first introduced Siri in 2011, artificial intelligence (AI)-powered voice assistants (VAs) have become well-established features of mobile devices (Guzman, 2019). Following Siri, other prominent voice assistants have included Amazon's Alexa, Google's Google Assistant, and Microsoft's Cortana. Voice-based technology has grown rapidly in recent years, and many people now communicate with voice assistants daily in much the same way they would with other humans (Sundar et al., 2017). Although consumer research has shown that people generally prefer female voices over male ones (Griggs, 2011), the context in which users experience these voices matters. For instance, female-voiced computers designed to perform a dominant role, such as giving commands or rating performance, were evaluated more negatively by users than male-voiced computers performing the same role (Nass et al., 2006). The present study tested two hypotheses: First, we hypothesized that listening to a female AI voice assistant would increase sexism ratings, specifically benevolent sexism, compared to listening to a male AI voice assistant. Second, we hypothesized that listening to a female AI voice assistant would increase traditional attitudes toward women compared to listening to a male AI voice assistant. We created an online instrument that allows participants to interact with two versions of a digital AI helper (one male, one female) while completing a quiz and then rate how helpful and accurate the AI helper was. Data collection for the project is ongoing.

Keywords

Sexism, AI, Satisfaction

Persistent Identifier

http://hdl.handle.net/10950/3085

Comments

Chandler Sutton commented: "It’s interesting to think about how having a female voice used for AI assistants could possibly enforce stereotypical attitudes/sexism towards women. I hadn’t thought about how a simple technological assistant could have an influence on how we interact with each other with lesser or greater amounts of sexism. It was a great presentation, although it would be interesting to see what the results would be with a bigger and more varied participant sample. Good job!"

Garza_Lyceum_Poster.pdf (403 kB)
Garza_Poster
