Tech CEOs Declare This the Era of Artificial Intelligence (fortune.com) 178

You will be hearing a lot about AI and machine learning in the coming years. At Recode's iconic conference this week, a number of top executives revealed -- and reiterated -- their growing efforts to capture the nascent technology category. From a Reuters report (condensed): Sundar Pichai, chief executive of Alphabet's Google, said he sees a "huge opportunity" in AI. Google first started applying the technology through "deep neural networks" to voice recognition software about three to four years ago and is ahead of rivals such as Amazon.com, Apple, and Microsoft in machine learning, Pichai said.
Amazon CEO Jeff Bezos predicted a profound impact on society over the next 20 years. "It's really early but I think we're on the edge of a golden era. It's going to be so exciting to see what happens," he said.
IBM CEO Ginni Rometty said the company has been working on artificial intelligence technology, which she calls a cognitive system, since 2005, when it started developing its Watson supercomputer.
Artificial intelligence and machine learning will create computers so sophisticated and godlike that humans will need to implant "neural laces" in their brains to keep up, Tesla Motors and SpaceX CEO Elon Musk told a crowd of tech leaders this week.
Microsoft, which was absent from the event, is also working on bots and AI technologies. One company that is seemingly out of the picture is Apple.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by SuperKendall ( 25149 ) on Friday June 03, 2016 @02:28PM (#52244701)

    Lots of people have long had the dream of putting together a chatbot that would represent them in online forums...

    Well I'm going the opposite route. I'm attaching a chatbot to my source code editor for work, leaving me free all day to do nothing but post in online forums!

    As for the work quality, I wouldn't worry about that - one of the neural inputs is StackOverflow recent answers.

    • Lots of people have long had the dream of putting together a chatbot that would represent them in online forums...

      Well I'm going the opposite route. I'm attaching a chatbot to my source code editor for work, leaving me free all day to do nothing but post in online forums!

      As for the work quality, I wouldn't worry about that - one of the neural inputs is StackOverflow recent answers.

      I love it!

      The MadLibs approach to coding!

    • I'm attaching a chatbot to my source code editor for work, leaving me free all day to do nothing but post in online forums!

      How are you going to afford access when you're unemployed?

  • by jeffb (2.718) ( 1189693 ) on Friday June 03, 2016 @02:28PM (#52244705)

    So, "accelerating"?

    Tech companies spend more resources on trendy topic because tech companies spending more on a topic makes it trendy. Film at 11.

    • by AK Marc ( 707885 ) on Friday June 03, 2016 @02:53PM (#52244893)
      We are no closer to AI now than we were 70 years ago. All we have now is better dB lookups. *yawn*. Call me when someone creates an approach that has a possibility of creating AI.
      • Is there an artificial person-in-software yet? No. However, there's no denying advances such as Watson, self-driving cars, and Siri and the like. All part of a general trend of increasing information density in human artifacts. And they are increasingly becoming potent enough to wipe out the livelihoods of many people in a single stroke.
        • by AK Marc ( 707885 )
          Google is a better AI than Siri. Siri just gets mentioned because it's voice, which is unrelated to AI.
          • Not related? Without voice/facial recognition the AI is deaf and blind.
            • by AK Marc ( 707885 )
              AI doesn't need to hear or see. At least until the Turing test is only given orally, or via ASL.
              • Ok. I can't really see Skynet sending Helen Keller Terminators back in time to save Sarah Connor, but who knows.
                • hmm, kill her, rather.
                • Skynet could build androids that could pass as human but couldn't get rid of the Austrian accent. No matter how complex the software there are always bugs.

                • by AK Marc ( 707885 )
                  Your stupid argument is that a car isn't a car without doors. They are irrelevant to the nature of it, but almost all have them. Voice and video recognition is solved anyway. Kinect does it as well as any AI needs. Plugging a Kinect into an AI is no more trouble than plugging in a keyboard. So I don't see what voice and video recognition has to do with AI. They are orthogonal, even if you'll find most AIs will have them.
      • Re: (Score:2, Informative)

        by Anonymous Coward

        Deep neural nets are not dB lookups. They mimic the way the brain stores and recalls patterns and responses to patterns. Specifically they (both the deep neural nets and the brain) store the patterns and responses to patterns in the form of synaptic weights of multiple layers of neurons. If that is what you want to call dB lookup, then well the brain just works through dB lookups too.
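
        A minimal sketch of that idea in Python (a toy illustration, not any particular framework): the net's entire "memory" of patterns lives in the layer weight matrices, and recalling a response is just multiplying an input through them.

        import numpy as np

        # Toy two-layer net: all learned "patterns and responses" live in W1 and W2,
        # playing the role of the synaptic weights described above.
        rng = np.random.default_rng(0)
        W1 = rng.standard_normal((4, 8))    # input -> hidden weights
        W2 = rng.standard_normal((8, 3))    # hidden -> output weights

        def forward(x):
            h = np.maximum(0.0, x @ W1)     # hidden layer, ReLU activation
            return h @ W2                   # output layer: the "response" to the pattern

        x = rng.standard_normal(4)          # a toy input pattern
        print(forward(x))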

        • by AK Marc ( 707885 )

          Deep neural nets are not dB lookups. They mimic the way the brain stores and recalls patterns and responses to patterns.

          It's not a dB lookup, it's a dB lookup that's stored like a brain. Still sounds like a dB lookup to me. The way they are used is to look up things. That the lookup table isn't defined is the only "smart" thing about them. What's the number of people that will buy bottled water tomorrow in Florida? That's a neural net (dB) lookup. That you don't fully define that it's based on weather patterns (water purchases increase under threat of hurricane), or day of week, or month of year, or nearby holidays, or
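
          The "undefined lookup table" being described can be made concrete with a toy sketch (all features and sales numbers below are invented for illustration): fit a tiny linear model on a couple of features, and the "lookup" for tomorrow is just evaluating it.

          import numpy as np

          # Made-up training data. Columns: [hurricane_threat (0/1), is_weekend (0/1)]
          X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
          y = np.array([100.0, 120.0, 400.0, 450.0])    # bottled-water sales

          # Least-squares fit: the "table" is just the learned weight vector.
          Xb = np.hstack([X, np.ones((4, 1))])          # add a bias column
          w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

          tomorrow = np.array([1.0, 0.0, 1.0])          # hurricane threat, weekday, bias
          print(tomorrow @ w)                           # the model's "lookup" for tomorrow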

  • It's great people are getting excited about AI. I'm looking forward to reading about it every fucking day, just like I did about voice recognition, how apps would change my life, etc. At the very least, I hope it means it will become slightly easier to say things like "set an alarm at 2.30" and not end up with a calendar entry which reads "self harming - tooth hurty" or whatever, but can we sort of pre-empt the whole thing and start thinking about what comes after AI so those of us who find it a little dul

    • by mlts ( 1038732 )

      AI seems to be one of those things that is always waiting in the wings, right next to the holographic storage drive, useful VR, 3D TV, memristors, flying cars, and the magic pill that does the job of 12 hours of sleep.

      In reality, the tech companies have not done much in the past 5-10 years. We have more cat picture sites, coupled with more intrusive ads, and consoles that can play the latest regurgitation of Call of Duty, but compared to the 1990s or 2000s where people started using computers

      • Those are "cute" things for AI to do. I'd rather see the following:

        Elimination of business cycles, instead ensuring monotonically increasing standard of living for all individuals (instead of a cyclically increasing average, even though some individuals never experience an increase).

        Figure out better education programs to help eliminate violent prejudices from society.

        Figure out how to placate various despots, etc. so that we can start pulling people out of oppression.

        Some of the medical things. Faster, mor

  • by Anonymous Coward

    Apple already provides Artificial Importance.

  • by Anonymous Coward

    One of the first jobs we're going to 'automate' with AI will be the CEO position.

    Biggest return, most savings.

    And an AI CEO won't go on TV and say stupid shit that tanks their stock.

  • .... before we can say that this is the era of that thing?

    What passes as AI so far is still just all smoke and mirrors.

    • What really matters isn't how real it is; but how profitable it is. So far, the best possible exit for an AI startup is to get bought by Google or HP, then sometimes flash a display of brilliance, or sometimes disappear, never to be heard of again.

      When we see unicorn AIs, then we'll have something.
    • .... before we can say that this is the era of that thing?

      What passes as AI so far is still just all smoke and mirrors.

      Yeah, but the shareholders don't want the CEO to declare this the era of smoke and mirrors.

    • Seriously... chat bots are supposed to be the new big thing since wearables didn't really take off. It's all about tech and tech journalism needing something to hype.
    • Sometimes I think that what passes for HUMAN intelligence is just all smoke and mirrors.

  • Great (Score:5, Insightful)

    by decipher_saint ( 72686 ) on Friday June 03, 2016 @02:42PM (#52244821)

    Let's replace CEOs and stupid tech blogs with AI and put them on their own internet

  • Honestly, AI will be part of the future, and the transition will not be pretty on a human level. Jobs will be lost, processes will change, and the world will adapt. Rushing this by tossing out meaningless declarations by CEOs will not change that. There is no timetable for the AI revolution. Let's adapt when ready.
    • The main cost of any business is labor. Expert systems (which is what they're really talking about when they say AI) will save billions, maybe trillions of dollars. The human cost is irrelevant, just like it was for the first 70 or so years of the Industrial Revolution.
  • We aren't even close to what I would call an AI era. We need about 100 billion (that's billion with a B) times more advanced AI than what we have today for anything even remotely approaching the technology needed for us to be in an AI era.

    • We need about 100 billion (that's billion with a B) times more advanced AI than what we have today

      This is almost certainly not true. The human brain has 100 trillion connections. Some artificial neural nets (ANNs) have over a million. So ANNs have about 100 million times (that's million with an M, not billion with a B) fewer. But the synapses in the brain fire 100 times per second, while ANNs can clock a million times faster. So now we are within a factor of 100 ... but that is not all. As far as we know, a brain stores ALL information in synapses. So you are using synapses to remember what your third grade teacher loo
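
      That back-of-envelope arithmetic can be written out explicitly (the counts are the commenter's round numbers, not measured values):

      # Back-of-envelope, using the round numbers from the comment above.
      brain_synapses = 100e12   # ~100 trillion connections in a human brain
      ann_weights = 1e6         # ~1 million weights in the ANNs cited

      size_gap = brain_synapses / ann_weights   # 1e8: 100 million (an "M", not a "B")
      speed_gap = 1e6                           # ANNs clock ~a million times faster
      remaining_gap = size_gap / speed_gap      # ~100

      print(f"{size_gap:.0e} {remaining_gap:.0f}")   # 1e+08 100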

      • by starless ( 60879 )

        As far as we know, a brain stores ALL information in synapses. So you are using synapses to remember what your third grade teacher looked like, your mother's voice, and what freshly baked cookies smell like. None of that is useful when you are, say, trying to ride a bicycle, and none of those other synapses are being used. But a computer only needs to load the synaptic data needed for a particular task, and leave the rest on a HDD.

        Actually, I tend to think that the availability of all the other "irrelevant" information is needed to allow a system/someone to make truly intelligent decisions in a flexible way.

      • You don't need AI to ride a bicycle.

        Fifth order linear control [youtube.com] (differential equations).

      • Here is a hint: computer "neural networks" are nothing like how brain neurons work. The fact that people bleat on about "neural networks" just shows that they don't know how AI works. NN are a dead end.
        • Here is a hint: computer "neural networks" are nothing like how brain neurons work.

          Although there are differences, ANNs are analogous to how a biological brain works. ANNs usually use a sigmoid or rectified linear activation function, while biological neurons use a step function (it is either activated or it isn't). ANNs are often fully connected, while BNNs are not, but since the weights can go to zero, that is not a big difference. ANNs usually have distinct layers, while BNNs are more random, but many ANNs are recurrent and have feedback from lower layers back to the top. Otherwise
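
          The activation functions being compared can be sketched in a few lines (a toy illustration for comparison, not any particular library's API):

          import numpy as np

          # Toy versions of the activations mentioned above.
          def step(x):       # all-or-nothing: the unit fires or it doesn't
              return (x > 0).astype(float)

          def sigmoid(x):    # smooth squashing, classic ANN activation
              return 1.0 / (1.0 + np.exp(-x))

          def relu(x):       # rectified linear, common in modern deep nets
              return np.maximum(0.0, x)

          z = np.linspace(-3.0, 3.0, 7)
          print(step(z), sigmoid(z).round(2), relu(z), sep="\n")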

        • by Etcetera ( 14711 )

          Here is a hint: computer "neural networks" are nothing like how brain neurons work. The fact that people bleat on about "neural networks" just shows that they don't know how AI works. NN are a dead end.

          They don't need to be like how brains work at the symbol organizational level, at least not any more. We can get there by just throwing interconnected data at it now. Sure, it would be more efficient to continue building that, but we're at the point where we're just a few Moore's Law updates away from being able to represent it at the synapse level without understanding the higher processing at all.

          In some ways, that's even scarier. We'll have AI and not understand how it works any more than how we understa

      • by garote ( 682822 )

        You expect to throw computing cycles and capacity at this thing you call a neural network and grow real intelligence from it? That's about the same as clearing a strip of jungle, building a tower out of palm trees and vines, and expecting planes to start landing.

        The human brain is only very loosely and very very partially described with this dime-store textbook concept called a neural network (which is not an "algorithm" by the way). There's shit going on in brains at the quantum level that biologists and

  • We have been creating Intelligences running on organic processors for all of human history. The two I helped to create have some bugs, but I blame the team programming effort with the wife. (we still argue about who introduced which bugs, and if a patch would ever be effective).

    A newborn is simply a set of default starter programs that interact with an increasing number of inputs over time.

    Partly cloudy and warm by the Beach

    • I don't think it means what they seem to think it means
    • "We have been creating Intelligences running on organic processors for all of human history. The two I helped to create have some bugs, but I blame the team programming effort with the wife. (we still argue about who introduced which bugs, and if a patch would ever be effective)."

      But because the manufacturing process involves harassment of women, the CEOs we're talking about will never get it past their HR departments.

  • AI has made steady progress over the last twenty years. Nothing has happened that puts it over the threshold of a revolution. New applications will be found, and new software will be developed, just like algorithms and information retrieval were key to search engines and Google Maps, but this didn't mean an era of algorithms and IR descended upon us.

    The more AI buys into the hype, the stronger the blowback will be when it fails to deliver. Read up about the AI winter which happened in exactly the same way i

    • blowback will be when it fails to deliver. Read up about the AI winter which happened in exactly the same way in the 1980s.

      I vividly remember this and attended colloquia with Dr. Hecht-Nielsen [wikipedia.org].

      Concerning blowback, I used to like to temper people's hype with reality, but found it more entertaining to add to the hype and watch the downfall while eating popcorn. Evil, I know, but I found schadenfreude much less stressful than living with a Cassandra complex.

  • If they say so it must be true.

  • by Okian Warrior ( 537106 ) on Friday June 03, 2016 @02:54PM (#52244911) Homepage Journal

    Relax everyone, we're nowhere close to having what is commonly perceived to be intelligent programs.

    What we have, and what we have finely honed, are clockworks: algorithms that perform a single specific task.

    Granted, a lot of what humans do can be replaced by a sufficiently well-designed clockwork. Lots of human tasks are repetitive, boring, and uncreative. Driving, for example, is repetitive, boring, and uncreative, and appears to be well suited to a clockwork.

    And this will bring about massive changes in how we view human activity. We will eventually have to change our notions of entitlement and human worth, and found a new school of economic theory.

    But each of these is only a clockwork, suited to only a single task. Humans, the only example of intelligence we have, can learn to do any of these tasks, and as far as we can tell there is no wiring in the human brain specific to any of them. Humans can learn to play chess, checkers, poker, or any of a hundred other games, but so far as anyone can tell there's no wiring in the brain specific to chess.

    A chess program can't learn to play checkers, but the human algorithm is universal.

    We're starting to automate our world, that's all.

    • Driving is repetitive, boring, and uncreative? You should show up in some of the autonomous-vehicle threads and use that statement to confront the "machines will never be able to share the road with humans" crowd.

      I'm pretty sure that human brains are no less "clockwork" than any of the things you mention -- just with more complex works, that are perhaps less reliable/predictable due to their implementation.

      As far as the "universality" of the "human algorithm", well, greater human minds than mine have found

  • by Archfeld ( 6757 ) <treboreel@live.com> on Friday June 03, 2016 @02:59PM (#52244941) Journal

    Tech CEOs, famous for spouting techno-babble, raising and losing enormous amounts of venture capital, and utilizing golden parachutes, declare something incredible is about to happen: just invest some money with us.

  • This is going to change things the way "The Year of the MOOC" [nytimes.com] changed everything!
  • All this push is mainly to monetize your every thought.

    Currently it's to monetize your every voice command (a superset)

    And before that it was to monetize your every question (a super-superset)

    And before that was to monetize your every transaction (a super-super-superset)

    And before that was to monetize your every 'access' (a super-super-super-superset)

    You get the point.

    I recall back in the '90s IBM's attempt to develop tech to charge a penny for every byte that went through a router... charge by byte vs a subs

  • What does the deep learning Era Naming AI think the current era should be called?

  • Everyone assumes artificial intelligence means a human-like consciousness of above-average intelligence.

    I think it more likely that artificial intelligence would start with the intelligence of a worm or a mouse, and then work its way up from there.

    Now, these humbler creatures *do* have intelligence and an ability to learn to *some* degree, and except for the very simplest of cases, we don't understand what intelligence even *is* in these situations, much less being ready to duplicate it in software.

    There ar

  • Since it won't be used to help humanity, but to remove work faster than it is replaced, kill it.

  • Right?

    What we have now is various AI networks/algorithms/etc. which cannot reason, cannot really use memory (in the sense that human beings do), and which are less "intelligent" than earthworms with three hundred neurons.

    Which automatically raises a question: if a creature with 300 neurons is more intelligent than our intelligent algorithms, then maybe we are still light years away from implementing proper AI, aka general AI.

    For some reason the media has conflated AI with general AI, but these two things are a hundred

  • I keep coming back to natural language compression prizes. The best hope we have of ameliorating human stupidity and ignorance is computer based education starting with a _neutral_ electronic genius with astronomical verbal intelligence. Verbal intelligence entails the ability to assess the verbal and cognitive character of your audience and modify your speech acts accordingly. The cost of electricity -- about 10 cents per kilowatt hour -- would be vastly lower than the cost of transferring benevolent _na

    • by garote ( 682822 )

      There are a whole lot of problems to be solved before that future comes into view, some of them problems with human nature. Young people don't wanna waste time chatting with Mr. Roboto The Professor Of Science. They want to eat, run around, and have sex. You want to install higher values in them, that's gonna take parents. I assume the next natural step is to clamor for a Mr. Parento, The Electronic Father Figure, and do away with all this boring parenting crap too.

      Then in another 200 years we can have

  • See also the Gartner Hype Cycle -- "the peak of inflated expectations", right before "the trough of disillusionment".

"Pull the trigger and you're garbage." -- Lady Blue

Working...