Siri, the iPhone 4S assistant, claims to do three things really well:
Siri understands what you say, knows what you mean, and even talks back.
Here’s what that means:
- Siri can parse the syntax of natural language, extract key phrases, and derive semantic meaning from them (see the sketch after this list).
- Using contextual information like location awareness, time sensitivity, and access to your contacts, Siri can simulate a conversation as if it were a person. There are some cool examples of the software using that context to create alarms, set reminders, and so on.
- Siri has a large corpus of set phrases and words with sounds attached to them, so it can simulate a spoken human reply as well.
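To make that first point concrete, here is a minimal sketch of how a fixed command-and-pattern system in this style could work. Everything in it is invented for illustration (the intent names, the templates, the fallback reply); Apple has not published Siri’s pipeline. The point is that “understanding” can be faked by matching an utterance against a closed set of templates and pulling key phrases out as slots:

```python
import re

# A fixed grammar of intents, each a regex with named slots.
# These patterns and intent names are invented for illustration;
# Siri's real pipeline is proprietary and far more sophisticated.
INTENTS = [
    ("set_alarm",    re.compile(r"wake me (?:up )?at (?P<time>[\w: ]+)", re.I)),
    ("send_message", re.compile(r"(?:text|message) (?P<contact>\w+) (?:that |saying )?(?P<body>.+)", re.I)),
    ("weather",      re.compile(r"(?:what'?s the )?weather(?: in (?P<place>[\w ]+))?", re.I)),
]

def parse(utterance: str):
    """Return (intent, slots) for the first matching template, else a fallback."""
    for intent, pattern in INTENTS:
        match = pattern.search(utterance)
        if match:
            # "Semantic meaning" here is just the named capture groups.
            return intent, {k: v for k, v in match.groupdict().items() if v}
    # Outside the fixed command set: no understanding, only a canned reply.
    return "fallback", {"reply": "Sorry, I don't understand."}

print(parse("Wake me up at 6:30 am"))            # ('set_alarm', {'time': '6:30 am'})
print(parse("Text Anna that I'm running late"))  # ('send_message', ...)
print(parse("Time flows like a"))                # ('fallback', ...)
```

Anything that doesn’t fit a template falls straight through to a canned apology, which is exactly the “specific set of commands” limitation described below.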
Here’s what it doesn’t mean:
- Siri has no ability to catch the vocal inflections that accompany an emotional response. If you are distressed and in need of help, Siri doesn’t act any faster than when you are happy and jovial.
- Siri doesn’t learn your preferences the way a friend or human assistant would, so it won’t suggest anything unprompted. In other words, a query still needs to be formulated before Siri can help.
- In the same vein, Siri doesn’t spontaneously strike up a conversation about the things you do on an iPhone, like finding a contact or playing a game.
- Siri is terrible with alternative meanings, metaphors, and innuendo. A practical example comes from a recent episode of “Hypercritical”, where Siracusa notes its consistent failures when creating a calendar reminder. A more general implication is that the dictation element lacks the deeper semantic context carried by metaphor: Siri couldn’t finish the sentence “Time flows like a ____” and could only make best guesses. For humans, the natural response is river or stream*.
- You can’t talk to Siri about just anything. There is a specific set of commands that Siri handles, and a specific set of rules that you need to comply with.
The Personal Assistant
A personal assistant knows a lot about you. In the early days, personal assistants were a lifeline, displaying small forms of heroism for their bosses — bringing that gift for the daughter, providing reminders for appointments, having a pen and paper ready for dictation, making that favorite cup of coffee, and more. They see your failings and work to compensate for them so that you are prepared for the day.
Personal assistants also have motivation to be great — it’s their job and they get paid to do it. The implications are tremendous: PAs go out of their way to provide for you because they have to, not because they are a feature. A PA may also want a promotion or professional development. Perhaps a PA has an emotional attachment to the boss: not the kind that becomes office gossip, but the kind that links a mentor with a mentee. The PA learns from the mentor for his or her own benefit.
Personal assistants are human beings, and the above qualities come naturally to them. We are creatures that want to help and provide for others. Humans can be altruistic, or they can require tangible rewards. In either case, the end product is the same: people get results quickly and with meaning.
This will never happen:
Hey Siri — I’m not doing so well.
Why not, John?
My wife and I had a divorce, I can’t find a babysitter for the kids, and I am in big debt.
Sorry to hear that, John. How are the kids handling it?
They are OK, but they know something is wrong. I am being too nice to them, which is making my money issues worse.
Yeah, it is tough to find a balance. How about a finance app to balance your budget? Might I recommend QuickBooks?
That’s a good idea, Siri, but it still makes me feel a bit bummed.
I’ve always thought exercise was great. Why not try hitting the gym, or going for a run?
Maybe a walk would clear my head…
I’ll make a playlist for you to keep you motivated, and we can set goals for your performance.
But we could get close…
There are several elements in the above dialogue that need to be considered in simulating a personal assistant:
- Conversation before suggestion: In order to gather real context, a personal assistant should take time to collect data before suggesting what a person should do with their phone. Why not ask further questions to get the person talking first, before providing a targeted answer?
- Meaningful recommendations: A personal assistant should be helpful in more ways than just being a note taker or reminder engine. There is a great opportunity for an assistant to introduce a world of applications or tools to people, even when they didn’t know they wanted them.
- Enhanced memory of dialogue: Siri already does a good job of keeping a conversation going, but it only looks for key words to relate to previous exchanges. A better conversation engine would mine every part of a person’s discussion, bringing it back to the forefront whenever relevant, not just in a sequential manner (a toy sketch follows this list).
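As an illustration of that last point, here is a sketch of a conversation memory that indexes every utterance by its content words, so that a later remark can pull any relevant earlier turn back to the forefront instead of only the previous one. The design is invented for this post, not a description of how Siri actually works:

```python
from collections import defaultdict

STOPWORDS = {"the", "a", "an", "i", "my", "is", "are", "and", "to", "for", "it"}

class ConversationMemory:
    """Index every utterance by its content words so that later remarks
    can recall relevant history, not just the previous turn."""

    def __init__(self):
        self.turns = []                  # full transcript, in order
        self.index = defaultdict(set)    # word -> set of turn ids

    def remember(self, utterance: str):
        turn_id = len(self.turns)
        self.turns.append(utterance)
        for word in utterance.lower().split():
            word = word.strip(".,!?")
            if word and word not in STOPWORDS:
                self.index[word].add(turn_id)

    def recall(self, utterance: str):
        """Return past turns sharing content words with the new utterance."""
        hits = set()
        for word in utterance.lower().split():
            hits |= self.index.get(word.strip(".,!?"), set())
        return [self.turns[i] for i in sorted(hits)]

memory = ConversationMemory()
memory.remember("I can't find a babysitter for the kids")
memory.remember("I am in big debt")
memory.remember("Maybe a walk would clear my head")
print(memory.recall("How are the kids doing?"))  # surfaces the babysitter turn
```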
Siracusa also mentioned the issue of belaboring a task because of the need to correct Siri repeatedly, or the desire for more manual control. Siri could leverage well-established learning algorithms to speed up a conversation, and it could store conversations in the cloud so that past discussions inform later suggestions. The reason Siri takes time (other than server issues) is that it doesn’t actually understand meaning; it just simulates it.
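A “well-established learning algorithm” doesn’t have to be exotic, either. Even simple frequency counting over past requests would let an assistant start volunteering your usual choice. The sketch below uses invented names and an arbitrary threshold, purely to show the idea:

```python
from collections import Counter

class PreferenceLearner:
    """Count how often each choice follows a given kind of request,
    and volunteer the favorite once it clearly dominates."""

    def __init__(self, min_count: int = 3):
        self.counts = {}          # request kind -> Counter of choices
        self.min_count = min_count

    def observe(self, kind: str, choice: str):
        self.counts.setdefault(kind, Counter())[choice] += 1

    def suggest(self, kind: str):
        favorites = self.counts.get(kind)
        if not favorites:
            return None
        choice, n = favorites.most_common(1)[0]
        return choice if n >= self.min_count else None

learner = PreferenceLearner()
for _ in range(3):
    learner.observe("morning coffee", "flat white")
learner.observe("morning coffee", "espresso")
print(learner.suggest("morning coffee"))  # 'flat white'
```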
We have yet to crack what it means to actually understand**. While we know that context is a necessary element, other human concepts are also important. Emotion, feelings, dialogue, and connections are intertwined with our daily experiences. None of these exist in Siri today, but the software is on the right path. Siri has access to your phone’s experiences, which are directly correlated to your own — it just doesn’t leverage all of them. To me, this is the part of the personal assistant that will come alive in the next few years with further iterations of the Siri software.
Siri is an amazing feature that is groundbreaking in the field of AI, but it also serves as a reminder of how far we still have to go.
* – Perhaps Siri could be equipped to understand relational elements among the words in a sentence, using large analyses of corpora and the statistical likelihood of words appearing together (e.g., flow appears often with stream). Still, the actual meaning of the metaphor is lost in translation. Imagine if Siri could wonder: how can time flow if it is not actually made of water?
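For the curious, the statistical approach in that footnote fits in a few lines. The corpus below is hand-made and the counts are tiny, but a bigram model over it already “completes” the sentence with river by raw likelihood, without any grasp of the metaphor:

```python
from collections import Counter, defaultdict

# A tiny hand-made corpus; a real system would train on billions of words.
corpus = [
    "the river flows like a stream",
    "water flows like a river",
    "the stream flows like a river",
    "traffic flows like a river of steel",
]

# Count which word follows each word (a bigram model).
bigrams = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1

def complete(prefix: str):
    """Guess the next word by raw bigram frequency after the last word."""
    last = prefix.lower().split()[-1]
    candidates = bigrams.get(last)
    return candidates.most_common(1)[0][0] if candidates else None

print(complete("Time flows like a"))  # 'river': likelihood, not meaning
```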
** – See the Chinese Room thought experiment.