By Alina Tugend, The New York Times Company
Imagine: robots that help teach social skills to children with autism. Translation software that provides deaf students with a smoother and more interactive experience. Data analysis to determine effective methods for identifying people with dyslexia.
These tools, which all incorporate artificial intelligence, aim to find better ways to detect, teach and help people with learning disabilities. Some are already in classrooms; others are still in the research phase.
Social robots, which are designed to interact with humans, can help teach social and educational skills to students of all abilities, including those with attention deficit hyperactivity disorder, hearing impairments, Down syndrome and autism.
Meeting the needs of children with autism is particularly urgent because of their numbers: 1 in 54 children is diagnosed with autism, according to the Centers for Disease Control and Prevention.
And those students tend to react to robots “in a way that they don’t react to puppets or pet therapies or many other kinds of things that we’ve tried,” said Brian Scassellati, professor of computer science, cognitive science and mechanical engineering at Yale University.
It may be because robots look like humans but are non-judgmental, he said. The robots come in a variety of designs, including a little boy, a classic sci-fi machine, and a furry snowman, and they go by energetic names like Kaspar, Nao, and Zeno.
In a recent study by Scassellati and colleagues, an early prototype of a robot named Jibo – which looks like a small table lamp with a round head that swivels in all directions and a glowing white circle on a touch screen as its face – worked every day for 30 days with 12 children and their caregivers. Jibo modeled social gazing behavior, such as making eye contact and sharing attention, and provided feedback and guidance during six interactive games played on screens.
“The robot’s job was to adjust the difficulty of the game based on the child’s performance,” Scassellati said. But the idea is not for the robot to replace a teacher or caregiver. “We never want to encourage kids to just react to technology; it does them no good,” he said. “We want to allow them to interact with people in a more substantial way.”
Research has shown that robots help improve educational and social skills, but much more study is needed to find out how to make these changes stick and translate to the real world.
How does AI play into this? Technology has advanced, as has research into how perceptions form, how people infer the feelings and thoughts of others, and what constitutes emotional intelligence. This information can be translated into algorithms that allow robots to interpret speech, gestures, and complex verbal and non-verbal signals, as well as learn from feedback.
Danielle Kovach, who teaches third-grade special education in Hopatcong, New Jersey, said she would be curious to see what further research shows. “A big part of teaching social skills to students with autism is reading facial expressions, reading body language, and picking up social cues from others. Is a robot able to mimic what we learn from humans?” Kovach is also president of the Council for Exceptional Children, an organization of special education professionals.
While social robots are primarily used in research studies, there is a nascent market for classrooms and individuals. For example, LuxAI, a company based in Luxembourg, has been selling the friendly QTRobot, designed for autistic children, to parents since the beginning of 2021; for now, it works only in English and French.
Children with autism interact with the robot daily for 10 minutes to an hour, depending on their age and the level of support needed, said Aida Nazari, co-founder of LuxAI. The company has sold a few hundred QTRobots, mostly to families in the United States, she added. But many families may find a social robot to be way too expensive at this point. QTRobot costs $2,000 plus a monthly software subscription of $129, which includes support services.
Rachel Ricci was the first person in Canada to order a QTRobot, which she received in February 2021. Her son, Caden, 10, was diagnosed with autism when he was 3 years old. Caden and his parents or therapist use tablets to play games aimed at improving his educational and social skills, such as recognizing and naming emotions. QTRobot serves as an encouraging third party, a friend and teacher.
He uses it for 30 minutes five days a week, and “QT helps him build his confidence,” Ricci said. Getting the robot during the pandemic was a lifesaver, she added. While most of his classmates at a Montreal school for people with autism regressed when the school closed and therapists weren’t available, Caden stayed on track. Ricci credits this to QTRobot.
But others said there was a big difference between individual use of a robot in a private home and use in a school setting.
While such technology can be “appealing,” it is “rarely used completely as intended in the classroom,” said Jordan Adcock, who teaches fifth grade in Forest Grove, Oregon, and has an autistic son. “What we really need is more teachers, assistants and a high quality curriculum.”
AI is also being used in a simpler way to help people with autism: through play. Maithilee Kunda, an assistant professor of computer science at Vanderbilt University, and her colleagues have created a video game called “Film Detective,” which will be tested this spring.
The concept: The player wakes up in the future — the year 3021 — and must help a scientist and her robot sidekick catch a bad guy who is stealing items from the Museum of Human History. The detective work involves using a series of film clips to decode people’s behavior in today’s world.
“A lot of people with autism have superior visual thinking but have great difficulty with social interaction,” Kunda said. “So we thought, what if we could give them visual ways to imagine theory of mind?” Theory of mind is the ability to imagine what other people are thinking or feeling – something people with autism can find particularly challenging, which can make social interactions harder.
The game taps into theory of mind using film clips, asking players to interpret why characters acted the way they did and what they might have thought.
Without AI, “it would have been possible to make the game and watch the movies together, but the unique thing we’re offering is a very detailed, explicit theory of how social reasoning works that can be simulated. We can use that as scaffolding to help teach children,” said Kunda, who also directs Vanderbilt’s Artificial Intelligence and Visual Analog Systems Lab and is a research fellow at Vanderbilt’s Frist Center for Autism and Innovation.
The use of AI to improve visual and auditory accessibility is also rapidly evolving.
For example, the National Technical Institute for the Deaf, one of nine colleges at the Rochester Institute of Technology, worked with Microsoft to customize existing technology and platforms to caption courses for deaf and hard-of-hearing students. Classes have sign language interpreters and stenographers, but extra help was needed.
For the institute’s needs, Microsoft Translator has been “taught” the specialized terminology used in courses as well as university-specific vocabulary, such as the names of certain buildings and people, said Wendy Dannels, a research faculty member who is deaf.
With AI, the translation from speech to written word is much smoother than automatic speech recognition was before, she said. And spurred by the pandemic, during which face coverings have made communication particularly difficult for many deaf and hard of hearing people, the institute has developed an app called TigerChat. The app turns speech into text messages, making it easy to chat with friends.
Additionally, faculty members at the institute work with Vuzix, a company that has developed eyewear capable of displaying text directly on the lenses. Roshan Mathew, a graduate student in human-computer interaction at Rochester Institute of Technology, has tried Vuzix glasses and loves them. “When I have to use a smartphone or laptop to talk to someone, I can’t maintain face-to-face contact,” said Mathew, who is deaf. “Communication is not just what we say, but what we see.”
A key use of AI in special education is its ability to detect patterns in large amounts of data to better identify and define certain disabilities.
Take dyslexia, for example. People with the condition typically have reading difficulties because they have difficulty connecting the letters and words on the page to the corresponding sounds they represent. As of 2020, 47 states required students to be screened for dyslexia at the start of elementary school. Yet there are no tools designed specifically for this, and dyslexia is often misdiagnosed – or missed entirely.
The most widely used assessment for dyslexia is a test called DIBELS (Dynamic Indicators of Basic Early Literacy Skills), usually given to all students in kindergarten through third grade to assess their overall level of reading and literacy, said Patrick Kennedy, senior research associate at the University of Oregon Center for Teaching and Learning. The test was not designed to detect dyslexia but is used “in the absence of other tools,” Kennedy said.
Kennedy and his colleagues plan to enroll 48 elementary schools in the United States and expand the DIBELS assessment to 4,800 students in kindergarten through third grade.
Over the next three years, they will examine the results – using machine learning – to determine patterns of reading and spelling development over time. Ultimately, researchers hope to assess whether DIBELS successfully identifies dyslexia and how it can be used most effectively.
“The goal of this project is to provide schools with better information to enable them to make better decisions,” Kennedy said.
This article originally appeared in The New York Times.