A psychiatrist, writing for Fox News, says that Siri, the iPhone’s virtual assistant, could prove more psychologically toxic than violent video games, or even some street drugs.
Dr. Keith Ablow writes:
From my perspective as a psychiatrist, Siri, the iPhone’s virtual assistant, could prove more toxic psychologically than violent video games or some street drugs.
…
Just tell her, “Siri, I want pizza,” and Siri says, in a female voice, “I’m checking your current location . . . I found 13 pizza restaurants. Eight of them are fairly close to you.” She then lists the restaurants on the iPhone screen so you can choose one. With a polite tone, she will apologize if she doesn’t understand your voice, “OK, I give up . . . could you try it again?”
He goes on to list some of Siri’s more humorous replies to statements or questions made by users. Then he starts getting a little batty… He states that by being “coaxed” to interact with a “virtual entity” like Siri, people are dumbing down their interpersonal skills and are being encouraged to treat other people like machines.
He continues: “To the extent that people become ‘attached’ to Siri and ‘rely’ on Siri and think Siri is ‘funny,’ they are just a tiny, tiny bit less likely to value a friend’s responsiveness, or a colleague’s help or even to appreciate the nuances in tone of voice that real humans use to convey emotion and communicate with one another.”
By interacting with Siri, Ablow says, we begin to “Siri-ize” other humans, diminishing them in our eyes to something less than fully human.
He then relates a conversation he had with Siri: “That’s why I told Siri just now, ‘Siri, I hate you.’ She seemed irritated. She said, ‘Noted.’ But she’ll still give me directions and send my emails. So, it doesn’t really matter that I told her that. That’s the point. Scream into the void enough, and your words and emotions will eventually be no better than a machine’s.”
You don’t have to ask Siri why Dr. Ablow wrote his column for Fox. Obviously, by attempting to demonize a currently popular product, and attaching manufactured problems and issues to it, he gets his name in the news. Then, he hopes, he’ll get the gig the next time a trial lawyer needs a head doctor for a murder trial. “My client is obviously insane. What do you think drove him to commit this crime, Doctor?” “It’s obvious, counselor: Siri made him do it!”