AI Apple

Siri, What Time Is It in London? (daringfireball.net) 181

John Gruber, writing at Daring Fireball: Nilay Patel [Editor-in-Chief of news website The Verge] asked this of Siri on his Apple Watch. After too long of a wait, he got the correct answer -- for London, Canada. I tried on my iPhone and got the same result. Stupid and slow is a heck of a combination. You can argue that giving the time in London, Ontario isn't wrong per se, but that's nonsense. If you had a human assistant and asked them "What's the time in London?" and they honestly thought the best way to answer that question was to give you the time for the nearest London, which happened to be in Ontario or Kentucky, you'd fire that assistant.

You wouldn't fire them for getting that one answer wrong, you'd fire them because that one wrong answer is emblematic of a serious cognitive deficiency that permeates everything they try to do. You'd never have hired them in the first place, really, because there's no way a person this stupid would get through a job interview. You don't have to be particularly smart or knowledgeable to assume that "London" means "London, England"; you just have to not be stupid. Worse, I tried on my HomePod and Siri gave me the correct answer: the time in London, England. I say this is worse because it exemplifies how inconsistent Siri is. Why in the world would you get a completely different answer to a very simple question based solely on which device answers your question? At least when most computer systems are wrong they're consistently wrong.

  • OK voomer (Score:5, Insightful)

    by phantomfive ( 622387 ) on Friday May 22, 2020 @11:34AM (#60090860) Journal
    I don't fault people for having high expectations of the AI in Siri, especially with an entire movie about a guy falling in love with Siri. Still...

    If he's been using Siri for almost a decade now, and only now realized that Siri gives inaccurate/inconsistent answers sometimes, then something is wrong with his head.
    • by AmiMoJo ( 196126 )

      It's because Google Assistant is so much better. I don't know how Alexa compares but Google Assistant is much more consistent and more likely to be right, so naturally every other assistant gets compared to it.

      Don't feel too bad though, Samsung's Bixby is even dumber than Siri.

      • by lgw ( 121541 )

        Don't feel too bad though, Samsung's Bixby is even dumber than Siri.

        I had a Samsung "Smart" TV with voice control once. It was hilariously bad. I could never get it to consistently respond to my voice, but it would constantly respond to voices coming from the TV. Really Samsung engineers? Really? You didn't filter for that? FFS Samsung.

        Well, all Samsung products I've ever had have turned out to be crap, regardless of price point, so in hindsight it's expected. Man, Samsung went through the "Sony arc" fast.

        • My experience with Samsung was almost always that they had all the features and no thought for UX.

          I started avoiding them a while ago though; I assume their phones are quite nice, but I'm just basing that on popularity.
      • by Kaenneth ( 82978 )

        Alexa used to be able to answer "What time is it at the Eiffel Tower" correctly, meaning it could link the landmark to the location, to the time zone, to the current time.

        But I just asked it now, and it starts talking about when the tower was built. Oh well.

    • "If he's been using Siri for almost a decade now, and only now realized that Siri gives inaccurate/inconsistent answers sometimes, then something is wrong with his head."

      No, it just shows that Siri is actually like a real woman, with maybe more accurate and consistent answers.

    • Re:OK voomer (Score:5, Informative)

      by _xeno_ ( 155264 ) on Friday May 22, 2020 @12:23PM (#60091122) Homepage Journal

      When Apple introduced Siri, they sold it as understanding context. One of the example dialogs in the keynote that introduced Siri was asking Siri what the weather was, then following up with "how about in Chicago?" and having her understand that meant "what's the weather in Chicago?" In fact, that was one of the major selling points of the original Siri: it understands context.

      Except it doesn't. Or, in this case, it's trying too hard to understand context - it's using the user's current location to figure out how to decode "London." (This is a common problem in Apple Maps, too, incidentally: their geocoder is terrible and frequently confuses addresses. There are times when you can literally copy an address out of an Apple Maps point of interest, paste it into the search field, and find that Apple Maps all of a sudden locates the same street address in a completely different town. Never mind that you searched for a complete address, with state, town, and zip code given. This causes problems when you attempt to navigate to an address given in a meeting invite: there's no guarantee Apple Maps won't randomly change the town on you. Hope you checked the destination it picked carefully!)

      Siri was sold as being something you could "hold a conversation with" except that - well, you can't. Siri is, like pretty much every actual "VUI" (yeah, that's a real industry term), based entirely on keywords. Stray too far from a pre-written phrase, and it just doesn't work.

      It also helps that Siri is now an overloaded term. "Siri suggestions" are supposed to use on-device machine learning to learn things you frequently do and make suggestions based on that. So it would make sense that Siri as voice assistant would use that information. Except it can't, because that's all kept on-device (assuming you trust Apple is telling the truth). Instead, voice assistant Siri has its own database of "what you likely mean" that's kept on Apple's servers, because voice assistant Siri is all implemented server-side. (Your phone sends an audio recording, and all the speech-to-text and processing is done server-side.)

      So, in theory, if you asked voice assistant Siri about London, England enough, voice assistant Siri would learn that's what you meant when you say London. But Siri suggestion Siri wouldn't. Instead Siri suggestion Siri learns what you likely mean based on things you do on your device itself. Likewise, if Siri suggestion Siri sees you're looking at London, England in Apple Maps, voice assistant Siri won't pick up on that context.

      Incidentally, when I tried asking my iPhone what time it is in London, it just gave me the local time, completely ignoring the "in London" part.

    • Not only that, but it can be difficult to develop something that accurately handles context everywhere.

      For example, where I live, if somebody said "Do you want to do something *this weekend*?", and it was mid week, or it happened to be Sunday, they mean the upcoming weekend. But if they said that on Friday or Saturday, they mean the current weekend.

      I know this is different in different areas, because I once had to work on a bug for software that was running in various places throughout North Ame
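
      A minimal sketch of that "this weekend" rule in Python. The convention encoded here (mid-week or Sunday means the upcoming weekend; Friday or Saturday means the weekend already underway) is the one described in the comment above, and in a real system it would need to vary by region; the function name and dates are illustrative only:

      ```python
      from datetime import date, timedelta

      def this_weekend(today):
          """Return the (Saturday, Sunday) pair most likely meant by "this weekend":
          mid-week or Sunday -> the upcoming weekend; Friday/Saturday -> the
          weekend already underway (or starting tomorrow)."""
          weekday = today.weekday()          # Monday == 0 ... Sunday == 6
          if weekday == 6:                   # Sunday: people mean the *next* weekend
              saturday = today + timedelta(days=6)
          else:                              # Monday-Saturday: the nearest Saturday
              saturday = today + timedelta(days=5 - weekday)
          return saturday, saturday + timedelta(days=1)

      # Asked on Wednesday 2020-05-20 -> (2020-05-23, 2020-05-24);
      # asked on Sunday 2020-05-24 -> (2020-05-30, 2020-05-31).
      print(this_weekend(date(2020, 5, 20)))
      print(this_weekend(date(2020, 5, 24)))
      ```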

    • If he's been using Siri for almost a decade now, and only now realized that Siri gives inaccurate/inconsistent answers sometimes, then something is wrong with his head.

      I don't think it's so much that, as much as it is his attempt to call attention to the fact that it's been a decade and we're still dealing with these issues.

      • Oh, did you really think the state of AI has changed that much? Despite some flashy stories, it hasn't changed much (especially NLP, which had its flash with Watson).
  • by johnw ( 3725 ) on Friday May 22, 2020 @11:35AM (#60090864)

    Bill Bryson tells of an incident in one of his books where he was living in England and phoned his local travel agent to book a ticket to Brussels. The agent phoned him back a few minutes later to ask, "Would that be the Brussels in Belgium, Mr Bryson?".

    • by PPH ( 736903 )

      Wisconsin?

    • After thinking it over for a long, long time, Siri would have assumed he meant Brussels, Illinois [wikipedia.org] rather than Brussels, Wisconsin [wikipedia.org].
    • Bill Bryson tells of an incident in one of his books where he was living in England and phoned his local travel agent to book a ticket to Brussels. The agent phoned him back a few minutes later to ask, "Would that be the Brussels in Belgium, Mr Bryson?".

      Better than the risk of buying an expensive ticket to the wrong place (in case he misheard the town name).

      • Er, how would he buy "a ticket to Brussels" if it was Brussels, Wisconsin? It's a tiny town of fewer than 1,200 people; no one would be flying there, and a train or bus ticket from England would of course be even more impossible.

    • by tragedy ( 27079 )

      There are a number of anecdotes out there about people arriving in Auckland, New Zealand furious about the fact that they've just flown 17 hours to get to New Zealand when they thought they were buying a ticket to Oakland, California. So you have to watch out for homonyms and near-homonyms as well. Basically, verifying a location name or address completely is always a good idea. Just like with street names in any major metropolitan area. Many of them in the US and elsewhere are actually composed of several sm

      • by johnw ( 3725 )

        Several responses to my posting seem to have missed the point. The travel agent didn't say, "Sorry, did you say Brussels or Bristles or Bristol?" which would fit with the hypothesis that she just wanted to check that she'd heard correctly and wasn't booking tickets to a similar sounding place. The words used make it clear that, despite working in a travel agency, she wasn't familiar with Brussels as a destination, had done a bit of research, and wanted to make sure she'd found the right place. You have t

  • It depends on context. Yes, Siri's answer is bad. But it could be good. I have friends in London, Ontario. If Siri were smart, it could reasonably understand that I have been to Ontario, but never the UK. For me, that would be a right answer. But yeah, if it's just always answering Ontario, that's dumb. Smart assistants should always make the right decision based on context. Ignoring it in this case, and just saying you'd fire someone for giving you Ontario time, is just as dumb as Siri's response. Things should

  • by Anonymous Coward on Friday May 22, 2020 @11:40AM (#60090882)

    if you're going to overreact like that, the assistant is better off. If anything, the assistant probably did it on purpose because you're an asshole who deserves it. I hope that important meeting you missed by five fucking hours results in severe reputational damage to you. Fuck you.

  • Siri Is Not a Person. Neither is Alexa. Neither is Google's Assistant.

    I don't think Siri is good at all - but demonstrating the shortcomings of any of them by comparing its response to what a human would do under the same circumstances seems pointless. Guess what, John? They don't actually think at all!

    And, right off, I probably would have said "London England" when I asked, just because I know what I expect (and don't expect) from these things.

    • The oddity is that if you did a generic web search -- Google, Bing, or Bob's Discount Web Crawler -- you'd get the time for London, England in half a second. Why is a "smart" watch doing this differently? Maybe it's trying too hard for a personalized approach, and its primitive algorithm spends most of its time trying to pin down exactly which location is being referred to rather than give the common answer. I know my iPhone starts whining like a puppy when I refuse to allow it to use my location.

    • Siri Is Not a Person. Neither is Alexa. Neither is Google's Assistant.

      I wish the voice was the same as the voice in the game 'Portal'. Seems a little more proper.

  • Given how utterly oblivious most Americans are to anything outside the borders of the USA, I'm kind of shocked Siri didn't give him the current time in London, Ohio.

    • I visited London, California a few times. I remember that as a kid I was vaguely disappointed.

      Singing: "Do you know the way to San Jose, I mean the one that's north of LA, not the one that's west of San Juan."

  • London ON is PROBABLY the closest London to them (assuming the giant crybaby is American).

    Essentially, we have immature people throwing a tantrum because the decision trees of language processing didn't adequately understand what they in their imprecise question REALLY WANTED?

    1) remember, it's NOT actually AI
    2) it's still a correct answer, and given a limited set of criteria, decently correct because your dumb ass was vague

    Not everyone in the world who says London means London, UK.

  • by JBMcB ( 73720 )

    Siri and Google Assistant aren't AI. The only AI thing they even remotely use is the actual voice recognition. The responses are all based on pre-programmed decision trees.

    So, I live about an hour and a half away from London, ON. If I ask Siri for travel directions to London, am I talking about London, ON or London, England? If I ask for the weather, which am I talking about? How would it know what I am looking for? I guess you could argue that, if we are in the same time zone and I ask for the time, I'm pr

  • People in the UK wouldn't ask "what's the time in London", they would ask "what's the time", because they know it's the same everywhere in the country.

    London is the tenth largest city in Canada (learned that in a quiz show), so people in Canada might actually mean London, Ontario. People in Detroit as well. To them it's nearby but in another country so it might be in a different time zone.

    It was not what the guy expected, but the error is very noticeable, so it won't go undetected, and it's not that un
  • I find this reminiscent of US practice. I have heard people from there repeatedly saying things like "Paris, France" or "London, England".

    WTF do you think it is? If you don't specify that you mean a copy somewhere else, you are talking about the original. If I say "Edinburgh", I am not referring to somewhere in Indiana, or any of the 21 places in the USA that go by the name Glasgow or even the one that got swallowed up by Sydney in Australia. If you mean the original, you just say the name.

    • If you don't specify that you mean a copy somewhere else, you are talking about the original.

      There is Birmingham: two major cities with the same name. The council of Birmingham, UK, famously created a brochure about their lovely town and had to destroy about 600,000 copies because they had used stock images of Birmingham, USA.

    • It's because a lot of states in the US have towns or small cities named for the larger, more well-known European cities. If I say "Paris", do I mean the large city in France that's a few thousand miles away or the small town that's only 50 miles away? Usually you can tell from context, but not always.
  • "THERE IS AS YET INSUFFICIENT DATA FOR A MEANINGFUL ANSWER."
    also, the quote above should be in all caps, this is only here because the slashdot filter is wrong.
  • People use the term AI like it's a real thing, but it's not.
    At least, it's not what they think it is.

    I was watching a Jeopardy! episode where humans were playing against Watson.
    Every question, the machine was there with the right answer while the humans were still trying to grok the question.
    Until...there was a question that asked not for a fact, but for a definition.
    Then the humans had reasonable answers, while the machine was stalled and fumbling.

    That's when you realize that there is no I in AI.
    It's just a search

    • by lgw ( 121541 )

      People use the term AI like it's a real thing, but it's not.
      At least, it's not what they think it is.

      If everyone uses a word a certain way, then that's what that word means. Sorry, hackers. "AI" means what we have now. In 20 years, when "AI" is a lot better, "AI" will mean what we have then.

    • I was with you right up until you said full self-driving cars won't happen. That's because I don't believe self-driving technology actually requires the sort of "smart" AI you're thinking of.

    • by ceoyoyo ( 59147 )

      At least, it's not what they think it is.

      That one. If you're using the technical term properly, AI is pretty much anything a person can do. The term arguably applies to a calculator doing basic math.

      If you're using the colloquial term it means anything a computer *can't* do.

  • The assumption Siri is making here (that the London desired is in Ontario) is perfectly reasonable. The article's rant about this being the "wrong" London, and about how a personal assistant would be fired for doing this, just shows how completely clueless the author is. It's perfectly reasonable for Siri to pick the closest London, or the London in the same country, or the London the person goes to/near the most. Assuming London always means England is utterly silly. As a reverse example of this, I once
    • I would be curious to see the algorithm for determining the appropriate default of each location which shares a name with another location.
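
      Nothing about how any shipping assistant actually does this is public, but one plausible heuristic is to score each candidate by prominence (roughly, log of population) and add a bonus for candidates near the user. A minimal sketch in Python; the candidate list, coordinates, populations, and the 300 km "nearby" cutoff are all made-up illustration values:

      ```python
      import math

      # Hypothetical candidates; populations and coordinates are approximate.
      LONDONS = [
          {"name": "London, England",  "pop": 8_900_000, "lat": 51.51, "lon": -0.13},
          {"name": "London, Ontario",  "pop": 383_000,   "lat": 42.98, "lon": -81.25},
          {"name": "London, Kentucky", "pop": 8_000,     "lat": 37.13, "lon": -84.08},
      ]

      def distance_km(lat1, lon1, lat2, lon2):
          """Great-circle (haversine) distance in kilometres."""
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 6371 * 2 * math.asin(math.sqrt(a))

      def pick_default(candidates, user_lat, user_lon, locality_km=300):
          """Highest score wins: prominence (log10 of population) plus a bonus
          when the candidate is within locality_km of the user."""
          def score(c):
              d = distance_km(user_lat, user_lon, c["lat"], c["lon"])
              return math.log10(c["pop"]) + (3.0 if d < locality_km else 0.0)
          return max(candidates, key=score)

      # Asked from Toronto (~170 km from London, Ontario): the nearby one wins.
      print(pick_default(LONDONS, 43.65, -79.38)["name"])   # London, Ontario
      # Asked from New York (far from every London): prominence wins.
      print(pick_default(LONDONS, 40.71, -74.01)["name"])   # London, England
      ```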

  • Why buy Apple in 2020?
    They only put hardware on the market that is broken by design.

  • by Your.Master ( 1088569 ) on Friday May 22, 2020 @11:59AM (#60090992)

    I think this exposes a difference in how we use language, actually. If I ask what time it is in London, I'm almost certainly talking about London, Ontario. If I wanted to know London, England, I'd probably ask the time in the UK, because the entire UK is in one timezone but Canada spans 6, as does the US, with 4 of them overlapping (Canada going further east, the US further west, and the US having an exclave in Hawaii).

    I use the largest geographic entity that is unambiguous. Even Ontario isn't unambiguous: it spans multiple timezones. But just a couple sentences ago I said Hawaii and not Honolulu - I'd pick states and provinces where I'm pretty sure they are basically all one timezone, or at least the parts relevant to me are one timezone. Maybe it's because I grew up in a rural area and don't closely identify physical locations with major cities? Unclear.

    I don't know how common that is compared to people who think of everything in a city-first manner, but for people like me, London, Ontario is a better answer, while making it very unambiguously clear which London you are referring to in your answer, just in case. At least while I'm in North America. Maybe if I was in Paris or something you might infer that I'm checking whether my flight to London crosses a timezone.

    (I think it's highly likely I have asked a digital assistant about the time in London, but to be fair, I might have self-consciously added "London, Ontario" so the digital assistant wouldn't get confused, because I'm super explicit to those things).

  • by pz ( 113803 ) on Friday May 22, 2020 @12:02PM (#60091006) Journal

    The issue is that there are multiple places with the same name. There's more than one London, more than one Paris, more than one Berlin. For these more-than-one places, there's usually (usually) a really famous one, and one or more very much less famous ones. Sometimes the famous one is not the original (like Boston, Massachusetts named after Boston, England).

    And then, this gets complicated by a lot of other factors. First, where is the person who is referring to an ambiguous location relative to each of the places with the same name? As they get farther and farther away from the less famous one, chances are they want to know about the more famous one. But there's also something specific about how the place is referenced: if the requestor is asking about the time in "London" and they are located in London, Ontario, chances are they mean the one in England. But if they are in London, England, under what circumstances do they mean London, Ontario? Perhaps if you know that they have personal or business ties to Canada. Or are planning a trip there.

    At what radius away from London, Ontario do general references to "London" mean the one in England? That is a hard problem. And the best answer is to say, "which one do you mean?"

    I live in a town that shares its name with towns in two other US states (and a place in England). When I search for public school schedules without specifying the state, it should be obvious based on my location which one I mean, but I nearly always get one of the other ones. That's an easy case.

    But say I'm in Amarillo, Texas and ask, "how far is it from Amarillo to Paris?" which Paris do I mean? Say I'm in Dallas, now which do I mean? Say I'm somewhere in California, now which one? Say I'm in Nice, now which one? It's a hard problem. And, again, the best answer is for the service (here, Siri) to respond, "which Paris do you mean, the one in Texas, or the one in France?". Learning from the selection then becomes easier because you have context.
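
    The "ask which one you mean, then learn from the selection" flow described above could be sketched like this in Python. Everything here is hypothetical plumbing (the place table, the preference store, the ask callback); it is not anything Siri exposes, just an illustration of the behaviour being argued for:

    ```python
    PLACES = {
        "london": ["London, England", "London, Ontario", "London, Kentucky"],
        "paris":  ["Paris, France", "Paris, Texas"],
    }
    user_prefs = {}  # e.g. {"london": "London, England"} once learned

    def resolve(place, ask):
        """Resolve an ambiguous place name. `ask` is a callback that poses a
        clarifying question and returns the user's choice."""
        key = place.lower()
        candidates = PLACES.get(key, [place])
        if key in user_prefs:              # learned earlier, so don't ask again
            return user_prefs[key]
        if len(candidates) == 1:
            return candidates[0]
        choice = ask(f"Which {place} do you mean?", candidates)
        user_prefs[key] = choice           # learn from the selection
        return choice

    # The first query triggers the clarifying question; the second reuses the answer.
    first = resolve("Paris", ask=lambda question, options: options[0])
    second = resolve("Paris", ask=None)    # no question needed this time
    print(first, "/", second)
    ```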

  • LOL, working for this guy must be a nightmare. Calling people stupid and cognitively deficient and firing them after the first mistake. If the time in London is so critical to you, just use your watch or remember the time difference! Humor aside, I wonder why other platforms / services got it right and why Siri didn't. I mean, come on, they have the data. I wonder if privacy and other data influence this too, or if it's simply a coding issue. I wonder.
  • Here is the evidence:

    https://www.youtube.com/watch?... [youtube.com]

  • Artificial...Idiot?

    The AI term is oversold. All there is is a glorified statistical correlator supplemented with some simplistic, rigid decision-making algorithms (which is likely the case here). On the other hand, perhaps at a high enough level of complexity these systems will be comparable to our "wetware" intelligence?

  • Apple Watch, iPhone, and Mac all gave me the time for London, England.
  • This is a hard problem. Apart from using recent context to see if there was a reference in the last five minutes to a location particularly close to any London, what you're talking about does depend on where you are. If you're close to a non-UK London, then that's likely it, but if you're far from them all, then it's London, England.

    Of course, it should probably be learning. If you immediately respond asking for a different London, then it should learn to default to that one for you, and it should a

  • by Anne Thwacks ( 531696 ) on Friday May 22, 2020 @12:16PM (#60091080)
    If you had a human assistant and asked them "What's the time in London?" and they honestly thought the best way to answer that question was to give you the time for the nearest London, which happened to be in Ontario or Kentucky, you'd fire that assistant.

    If I could fire Google, I would.

    When I search on Google Maps for Edmonton - a place in London, England, about 6 miles from where I live - I get Edmonton in Canada or Australia. This is a persistent menace.

    Answers on the other side of the world replacing much nearer ones are a daily issue - including Google Maps sending people to other countries.

    If it is more than an hour's drive away, then at the very least it needs an "are you sure?" even if there is no obvious alternative - it could be a typo.

    And yes, I did mean "Las Vegas" the bar two miles away, and not the place with a similar sounding name 5,000 miles away.

    Artificial intelligence? No, it is

    Actual Idiocy

    • When I search on Google Maps for Edmonton - a place in London, England, about 6 miles from where I live - I get Edmonton in Canada or Australia. This is a persistent menace.

      I can just see you taking a taxi to Edmonton, the driver goes 200 miles to the coast, and tells you to swim the rest.

    • I don't know why people in the UK think that it's a good idea to name their towns after cities in Canada or Australia. You're creating your own problems.

  • by Trailer Trash ( 60756 ) on Friday May 22, 2020 @12:33PM (#60091156) Homepage

    Seriously. This is the kind of stuff that we've known forever makes complete sense to humans and no sense at all to computers and AI. Worse yet, if I lived just across the time zone line from London, KY, I might actually mean that London when I ask someone "What time is it in London?" Human interaction is really, really complex.

  • by BAReFO0t ( 6240524 ) on Friday May 22, 2020 @12:38PM (#60091184)

    This is exactly what I always said: to get this kind of question right, you have to have grown up in the same culture as the asker!
    Nothing can substitute that. Definitely nothing that assumes absolute truths. Let alone a shitty matrix of weights that falsely gets called "AI"!

    And no, you would not necessarily fire an assistant like that. Because, yes, people from far away from your culture might give such an answer too! Hell, *I* might assume you meant London, Ontario, if that place is close enough to where we are! Why would you ask for the weather halfway around the planet?
    At best, I would ask "Which London?", if in doubt.

    But yes, this is exactly and precisely why what is currently called "AI" will always be a trainwreck.
    Maybe you should have hired a psychologist or sociologist or neurologist, or really any scientist with a knack for the philosophical underpinnings and with an actual clue about humans and real brains? Instead of mathematicians and programmers who only know human behavior and scientific philosophy from hearsay. ;)

  • There was an "All in the Family" episode where Archie Bunker didn't get his Christmas bonus because he sent a package that was supposed to go to London, Ontario to London, UK.

  • Context is hard. Follow-up clarifying questions are even harder.

    Tell Alexa: "What's the pink elephant eating in the weather?" and she will give you the local weather.

    The current effort is, sadly, all about creating skills that provide useful features or that can be used to advertise paid-for services rather than linguistic capability. They aren't assistants. They're just fast keyboards in specific cases.

  • Slashdot has reached a new low.

    SlashDot is following the National Enquirer down the media toilet.

    I used to come here for news that has been filtered and above the "lowest common denominator" level....

    Should change the name to FaceDot!

  • I mean, how is Siri supposed to know that you mean the London picked by common sense rather than by some other criterion, such as distance? The vast majority of people take this for granted, but for AI, and even for some humans, context in spoken language is not an easy problem to solve. Especially if it's not something that the developer is already proficient with in their regular lives. Think about that for a little bit.

    A friend of mine gets completely stumped by a simple "Good morning!" because "is it really a 'good

  • Because that is all these things can do. Pattern matching, statistics and some hard-coded rules. That does not create intelligence.

    Also, don't expect this to get better in the next few decades and maybe not ever. This specific case may not get a hard-coded rule though.

  • A system that is actually intelligent, upon realizing there are multiple correct answers, should clarify which result you are asking about. If you lived near London, Ontario but were in another time zone, the right thing to do would be to ask "London, England or Ontario?" If you were in London, Ontario, England would be the assumption, because who asks for the time where they are by specifying the city? You would just ask what time it is.
  • They really needed someone like Jobs to scream at them about polish.
  • With the way that time-zones work in Kentucky, it might be a legitimate question to ask, "What time is it in London," and really mean London, Kentucky.

  • Let's face it: these personal assistants remain pretty much what they have been since their inception - good for grins and giggles and party games, and very little else. As soon as there is even a minimum of ambiguity in your query (as in the example in the article) they spin their wheels badly. As of today, they are stuck in a capabilities set whereby they are good at chores that, for the most part, you can do yourself faster and more efficiently, and hopeless at the kind of things you would really like a
  • by MountainLogic ( 92466 ) on Friday May 22, 2020 @05:04PM (#60092406) Homepage
    About a decade ago I was working for a global tech company in Redmond, Washington, USA (no, not Microsoft). I hosted a meeting to which an exec from another global company, living in the UK, was invited. Late the night before the meeting I got an email that said he would not be there until 10:30 AM. Turns out he told his company's travel dept that he had a meeting in Redmond. When he got off the plane at 10 PM the night before the meeting, he left the small airport, found the lone taxi, and gave the cabbie the address of the hotel. Without hesitating, the cabbie told him that the ride would be over $1k and take 8 hours if he wanted to get to Microsoft first thing in the morning. (He had arrived on the last flight into Redmond.) Turns out cab rides from Redmond, Oregon, USA to Redmond, Washington, USA happen more often than you might think. Redmond, Oregon has the small airport that serves tourists headed to Bend in central Oregon, while Seattle-Tacoma International Airport (SeaTac) serves Redmond, Washington. So type in Redmond, USA as your destination and you might get a surprise.
  • Location: London, England. A traveler hails a taxi cab (the real thing, not a Uber car).
    [cabbie] Where to, sir?
    [traveler] Waterloo.
    [cabbie] You mean the station?
    [traveler] I am sure we are late for the battle.

    (disclaimer: over time I have come across different takes on the same tale, some even claimed to be true. This is just my retelling)
