With Amazon’s acquisition of iRobot, many households around the world will have an Alexa, a Roomba, or both. But what do the children in these households make of these devices? That is exactly what developmental psychologists at Duke University set out to answer in a new study published Monday in the APA’s peer-reviewed journal Developmental Psychology. The researchers recruited 127 children between the ages of four and 11 from the Northeastern United States. The participating kids watched two 20-second clips about Alexa and Roomba before answering a few questions about each device.

Alexa more human-like

According to the university, the children judged Alexa to have more human-like thoughts and emotions than Roomba. Despite this, they felt that neither device deserved to be yelled at or harmed, a conviction that weakened as the children got older. The kids also believed that both devices were probably not ticklish and would not feel pain if pinched, understanding that machines cannot feel physical sensations the way people do. Interestingly, they gave Alexa high marks for mental and emotional capabilities, such as being able to think or get upset when someone is mean to it. Roomba did not fare as well.

“Even without a body, young children think that Alexa has emotions and a mind. And it’s not that they think every technology has emotions and minds - they don’t think the Roomba does - so it’s something special about Alexa’s ability to communicate verbally,” said lead author Teresa Flanagan in a press statement.

More insights into children and AI bonding

Studies like this could offer valuable insights into the relationships children form with increasingly “intelligent” devices. They also raise questions about how we treat machines and technologies, questions that could be especially important for developmental psychologists and parents. For example, should adults set a good example for kids by thanking Alexa, Siri, or even ChatGPT for completing tasks?

For now, the researchers are not sure why children think these technologies should not be mistreated. “It’s interesting with these technologies because there’s another aspect: it’s a piece of property,” Flanagan said. “Do kids think you shouldn’t hit these things because it’s morally wrong, or because it’s somebody’s property and it might break?”

One ten-year-old in the study, for example, said it was not okay to yell at either device because “the microphone sensors might break if you yell too loudly,” while another thought it was because “the robot will actually feel sad.”

Ultimately, further research will be needed to understand the motivations behind children’s beliefs about how such technologies should be treated.