Oh Siri, Part 907

John Gruber posted about a tweet from Nilay Patel, editor-in-chief of The Verge, in which his Apple Watch responds to the question, “What time is it in London?” by showing the time in…London, Ontario.

Gruber later found that his Watch and HomePod gave different answers, which only underscores how fractured and broken the Siri experience is (if you didn’t click the first link, the HomePod gave the time for London, England).

While this is not the biggest error (or technically an error at all), it demonstrates how sort of dumb Siri is. When people ask what time it is in London, they are almost certainly asking about London, England. People understand this because London, England is one of the most famous cities in the world (sorry, London, Ontario). But Siri seems to (sometimes, but not always) go by proximity and misses the obvious answer.

And is often slow in doing so.

And will sometimes report no connection when there is, in fact, a connection (the servers at Apple apparently lose connection from time to time and Siri will not answer even the most basic questions when it is down).

As a side note, I asked Siri on my watch what time it was in London and it gave the time for London, Ontario. But worse, it didn’t even list the province. It just said “London” because I guess I’m in Canada and should automatically know which London it’s referring to? Even though if I did, I probably wouldn’t be asking what the time was in the first place.

Oh Siri.

Oh Siri, Part 87

Adding containers to a shopping list.

Attempt #1: Kool Aid
Attempt #2: Cooler
Attempt #3: Containers. Hooray.

I pronounced the word “container” the same way, with the same inflection each time. This is why the reports that say Siri is better than Alexa ring false to me (or they are testing something else, like depth of trivia knowledge). When Alexa fails, it’s usually because it can’t process the command, either because I’m asking something impossible, or just phrasing it in a way that it’s not been programmed to recognize. It could be as simple as omitting a key word.

Siri is different. Siri will sometimes just fail completely, offering up a baffling “no internet connection” error when the internet is right there, or asking me to try again later because maybe someone at Apple has tripped over the server’s power cord again. Or worse, insisting that I have no such list to add an item to, after which I will ask Siri to show me that list and it does–then it still refuses to let me add items to the list because it still doesn’t exist. But more often than these, Siri will misinterpret what I am saying, giving me Kool Aid instead of containers.

It does this often enough that it doesn’t surprise me. It doesn’t even bother me, really; I just accept that it’s part of the whole Siri experience. But Siri has been around since the iPhone 4S (2011)–it really should be a whole lot better than it is. Bad Apple.

Oh Siri, Volume 2

Me, dictating into the Message app:

Finished run, heading home.

Siri, interpreting:

Finished her own, heading home.

Somehow “run” becomes “her own.” Note the number of syllables isn’t even the same. Note that I’ve dictated this phrase before. Is it possible for AI to get dumber? I’m beginning to think so.

Oh Siri.

Oh Siri, Volume 1 or Hell jars, howard yoyo 2017

Mocking Apple technology for making mincemeat of spoken (or written) phrases is a tradition going back almost 25 years. This is from August 1993:

See this and other Newton strips on the official Doonesbury site

Today there are entire sites dedicated to how iMessage mangles text through auto-correct. Sure, some of the examples are probably manipulated for maximum comic effect (though it’s really not necessary, as the worst of autocorrect hardly needs a helping hand to look bad), but the fact that there are entire sections of the internet devoted to this stuff speaks to how ubiquitous it is. (Also, the best examples are the ones where people keep futilely typing the same autocorrected word over and over. You can almost feel the despair coming through their attempted messages.)

And then there’s Siri. Siri is great when it works properly, which for me is most of the time. But when Siri decides not to work, it gets really stubborn in insisting that you are speaking different words.

Here are two to start, the first I’ve mentioned before.

Pyramid: I try to tell Siri to play the album Pyramid. It tries to play the imaginary album Pure Mind. I was never able to get Siri to play Pyramid. I had to physically interact with my phone to listen to it. How 2007.

Winner: Siri

Pasta: I try to send the message “The pasta will be ready in two minutes.” Siri writes, “The pastor will be ready in two minutes.” I keep trying different pronunciations/inflections/accents for “pasta” and get these results:

pasta = pastor
pasta = pastor
pasta = pastor
pasta = pastor
pasta = pasta

I don’t know what finally made it work and I have no confidence it will ever work again. I’m just glad I wasn’t sending the message to a pastor.

Winner: Me

I don’t have a sassy wrap-up for this (it’s my first entry, cut me some slack) but I will note that I just spent half an hour at that stupid autocorrect site, laughing more than I’d like to admit.