People with autism turn to ChatGPT for advice on workplace issues


by Byron Spice, Carnegie Mellon University

A new study shows that many people with autism embrace ChatGPT and similar AI tools for help and advice as they confront problems in their workplaces. But does that use of AI make sense? Credit: JiWoong Jang and Sanika Moharana, both Ph.D. students in the Human-Computer Interaction Institute.

A new Carnegie Mellon University study shows that many people with autism embrace ChatGPT and similar artificial intelligence tools for help and advice as they confront problems in their workplaces.

But the research team, led by the School of Computer Science’s Andrew Begel, also found that such systems sometimes dispense questionable advice. And controversy remains within the autism community as to whether this use of chatbots is even a good idea.

“What we found is there are people with autism who are already using ChatGPT to ask questions that we think ChatGPT is partly well-suited and partly poorly suited for,” said Begel, an associate professor in the Software and Societal Systems Department and the Human-Computer Interaction Institute. “For instance, they might ask: ‘How do I make friends at work?'”

Begel heads the VariAbility Lab, which seeks to develop workplaces where all people, including people with disabilities and those who are neurodivergent, can successfully work together. Unemployment and underemployment affect as many as nine out of 10 adults with autism, and many workplaces don't have the resources to help employees with autism and their coworkers overcome social or communication challenges as they arise.

To better understand how large language models (LLMs) could be used to address this shortcoming, Begel and his team recruited 11 people with autism to test online advice from two sources: a chatbot based on OpenAI's GPT-4 and what looked to the participants like a second chatbot but was really a human career counselor.
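The article does not describe the study's actual prompts or configuration, but as a rough, hypothetical sketch, a GPT-4-backed advice chatbot of the kind tested can be as simple as a single call to OpenAI's chat completions API. The model name and system instructions below are illustrative assumptions, not the research team's setup.

```python
# Minimal sketch of a GPT-4-backed workplace-advice chatbot.
# The system prompt and model name are illustrative assumptions,
# not the configuration used in the CMU study.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def workplace_advice(question: str) -> str:
    """Ask the model a single workplace question and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": "You give brief, concrete advice about workplace "
                           "communication in plain language.",
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(workplace_advice("How do I make friends at work?"))
```

In the study's actual design, participants compared responses like these against answers typed by a human career counselor presented through the same chat interface.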

Somewhat surprisingly, the users overwhelmingly preferred the real chatbot to the disguised counselor. It’s not that the chatbot gave better advice, Begel said, but rather the way it dispensed that advice.

“The participants prioritized getting quick and easy-to-digest answers,” Begel said.

The chatbot provided answers that were black and white, without a lot of subtlety and usually in the form of bullets. The counselor, by contrast, often asked questions about what the user wanted to do or why they wanted to do it. Most users preferred not to engage in such back-and-forth, Begel said.

Participants liked the concept of a chatbot. One explained, “I think, honestly, with my workplace … it’s the only thing I trust because not every company or business is inclusive.”

But when a professional who specializes in supporting job seekers with autism evaluated the answers, she found that some of the LLM’s answers weren’t helpful. For instance, when one user asked for advice on making friends, the chatbot suggested the user just walk up to people and start talking with them. The problem, of course, is that a person with autism usually doesn’t feel comfortable doing that, Begel said.

Results from the experiment were presented by first author and HCII Ph.D. student JiWoong (Joon) Jang at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI 2024) in Honolulu. In addition to Begel and Jang, co-authors include HCII Ph.D. student Sanika Moharana and Patrick Carrington, an assistant professor in the HCII.

It’s possible that a chatbot trained specifically to address the problems of people with autism might be able to avoid dispensing bad advice, but not everyone in the autism community is likely to embrace it, Begel said. While some might see it as a practical tool for supporting autistic workers, others see it as yet another instance of expecting people whose brains work a bit differently than most people to accommodate everyone else.

“There’s this huge debate over whose perspectives we privilege when we build technology without talking to people. Is this privileging the neurotypical perspective of ‘This is how I want people with autism to behave in front of me?’ Or is it privileging the person with autism’s wishes that ‘I want to behave the way I am,’ or ‘I want to get along and make sure others like me and don’t hate me?'”

At heart, it’s a question of whether people with autism are given a say in research that is intended to help them. It’s also an issue explored in another CHI paper on which Begel is a co-author with Naba Rizvi and other researchers at the University of California, San Diego.

In that study, researchers analyzed 142 papers published between 2016 and 2020 on developing robots to help people with autism. They found that 90% of this human-robot interaction research did not include the perspectives of people with autism. One result, Begel said, was the development of a lot of assistive technology that people with autism didn’t necessarily want, while some of their needs went unaddressed.

“We noticed, for instance, that most of the interactive robots designed for people with autism were nonhuman, such as dinosaurs or dogs,” Begel said. “Are people with autism so deficient in their own humanity that they don’t deserve humanoid robots?”

Technology can certainly contribute to a better understanding of how people with and without autism interact. For instance, Begel is collaborating with colleagues at the University of Maryland on a project using AI to analyze conversations between these two groups.

The AI can help identify gaps in understanding by either or both of the speakers that could result in jokes falling flat or create the perception that someone is being dishonest. Technology could also help speakers prevent or repair these conversational problems, Begel said, and the researchers are seeking input from a large group of people with autism to get their opinions on the kind of help they would like to see.

“We’ve built a video calling tool to which we’ve attached this AI,” said Begel, who has also developed an Autism Advisory Board to ensure that people with autism have a say in which projects his lab should pursue.

“One possible intervention might be a button on this tool that says ‘Sorry, I didn’t hear you. Can you please repeat your question?’ when I don’t feel like saying that out loud. Or maybe there’s a button that says, ‘I don’t understand.’ Or even a tool that could summarize the meeting agenda so you can help orient your teammates when you say, ‘I’d like to go back to the first topic we spoke about.'”
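The lab's video calling tool is not described in technical detail here, but conceptually the quick-response buttons Begel describes amount to a small table of canned "conversation repair" phrases that a button press posts into the call. The sketch below is purely illustrative; its send_to_chat function and button labels are hypothetical and do not represent the lab's actual implementation.

```python
# Illustrative sketch of quick "conversation repair" buttons for a video call.
# The send_to_chat stand-in and the button labels are hypothetical; they do not
# describe the tool built in Begel's lab.
REPAIR_PHRASES = {
    "repeat": "Sorry, I didn't hear you. Can you please repeat your question?",
    "clarify": "I don't understand. Could you explain that another way?",
    "revisit": "I'd like to go back to the first topic we spoke about.",
}


def send_to_chat(message: str) -> None:
    """Stand-in for posting a message into the call's text channel."""
    print(f"[chat] {message}")


def on_button_press(button_id: str) -> None:
    """Send the canned phrase associated with a quick-response button."""
    phrase = REPAIR_PHRASES.get(button_id)
    if phrase:
        send_to_chat(phrase)


on_button_press("repeat")
```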

More information:
JiWoong Jang et al, “It’s the only thing I can trust”: Envisioning Large Language Model Use by Autistic Workers for Communication Assistance, Proceedings of the CHI Conference on Human Factors in Computing Systems (2024). DOI: 10.1145/3613904.3642894

Citation:
People with autism turn to ChatGPT for advice on workplace issues (2024, June 9)
retrieved 9 June 2024
from https://medicalxpress.com/news/2024-06-people-autism-chatgpt-advice-workplace.html

