
Apple Is Working On a Dedicated Chip To Power AI On Devices (bloomberg.com)

According to Bloomberg, Apple is working on a processor devoted specifically to AI-related tasks. "The chip, known internally as the Apple Neural Engine, would improve the way the company's devices handle tasks that would otherwise require human intelligence -- such as facial recognition and speech recognition," reports Bloomberg, citing a person familiar with the matter. From the report: Engineers at Apple are racing to catch their peers at Amazon.com Inc. and Alphabet Inc. in the booming field of artificial intelligence. While Siri gave Apple an early advantage in voice-recognition, competitors have since been more aggressive in deploying AI across their product lines, including Amazon's Echo and Google's Home digital assistants. An AI-enabled processor would help Cupertino, California-based Apple integrate more advanced capabilities into devices, particularly cars that drive themselves and gadgets that run augmented reality, the technology that superimposes graphics and other information onto a person's view of the world. Apple devices currently handle complex artificial intelligence processes with two different chips: the main processor and the graphics chip. The new chip would let Apple offload those tasks onto a dedicated module designed specifically for demanding artificial intelligence processing, allowing Apple to improve battery performance.
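The face- and speech-recognition tasks Bloomberg describes come down to neural-network inference, which is dominated by large matrix multiplies plus cheap nonlinearities; that is the work a dedicated engine would take off the CPU and GPU to save power. Below is a minimal sketch of such a workload in plain NumPy -- purely illustrative, since Apple has published no API for this chip and the layer sizes here are invented:

```python
import numpy as np

# Toy two-layer network: the kind of dense linear algebra a "neural engine"
# would run in fixed-function hardware instead of on the CPU or GPU.
rng = np.random.default_rng(0)

x = rng.standard_normal(512)            # input features (e.g. an audio or image embedding)
W1 = rng.standard_normal((1024, 512))   # layer-1 weights
W2 = rng.standard_normal((10, 1024))    # layer-2 weights

h = np.maximum(W1 @ x, 0.0)             # matrix multiply + ReLU: the hot operation
logits = W2 @ h                         # another matrix multiply
probs = np.exp(logits - logits.max())   # numerically stable softmax
probs /= probs.sum()

print(probs.round(3))                   # scores over 10 made-up classes
```

Nearly all of the time and energy in a network like this goes into the multiply-accumulate steps, which is why a dedicated array of multiply-accumulate units can run such a graph with far less battery drain than a general-purpose core.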
  • Of course. (Score:3, Funny)

    by Fire_Wraith ( 1460385 ) on Friday May 26, 2017 @06:46PM (#54495137)
    What could possibly go wro-

    KILL ALL HUMANS
  • by Anonymous Coward on Friday May 26, 2017 @06:49PM (#54495159)

    Wouldn't it be easier for Apple to use its massive cash hoard and acquire Cyberdyne?

  • ASIC [wikipedia.org]s have always had their use (literally) but seem to have exploded into the mainstream with Bitcoin. Now it seems everyone is working on their own "AI" chip, which is fancy wording for "we put the most commonly used functions in silicon". Intel is now putting FPGAs into their Xeon chips so that customers can start speeding up their workflows.

    We've kind of tapped out x86 performance lately. My 6-year-old laptop is still fairly competitive. I have a phone 5 generations old and it's "good enough". Are companies going to now turn to ASICs to get the competitive edge?

    • Are companies going to now turn to ASICs to get the competitive edge?

      You do not design the hardware only to then find a suitable application. If a company wants a competitive edge it must first figure out what it wants to do. Then it finds the most efficient way to do it. This could involve an ASIC - but this is not required.

      Overall, I do not see a trend towards custom silicon. A limited market has always existed and it continues to exist. If anything, the reduced cost of general-purpose devices (CPUs, FPGAs) makes custom silicon far less attractive than before.

      • it must first figure out what it wants to do.

        I think they have. They want to listen for a certain word at very low power. How much more battery can Google save by putting 'OK Google' into silicon?

        On the "AI" part of things how much has TensorFlow changed recently? GPUs were a good stepping stone (like they were for BitCoin) but the next step in speeding up some of the basic functions is to move from GPUs to something less general.

        What is the BitC^H^H^H^H AI performance difference between a 40U rack of CPUs, 40U rack of GPUs and a 40U rack of ASICs?
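For context on the 'OK Google' point above: a wake-word detector is a tiny neural network, i.e. exactly the handful of "most commonly used functions" (convolutions and matrix multiplies) that a low-power chip can bake into silicon. Here is a rough sketch in TensorFlow/Keras; the input shape and layer sizes are invented for illustration, not taken from any vendor's actual model:

```python
import tensorflow as tf

# Toy keyword spotter: input is 49 frames of 40 MFCC audio features (~1 s of audio),
# output is "wake word" vs "not wake word". Everything in the graph is convolutions
# and matrix multiplies -- the operations an always-on accelerator implements in hardware.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(49, 40)),
    tf.keras.layers.Conv1D(64, kernel_size=8, strides=2, activation="relu"),
    tf.keras.layers.Conv1D(64, kernel_size=4, strides=2, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

model.summary()  # roughly 40k parameters -- small enough to run continuously at low power
```

A fixed-function block only needs to support convolution, matrix multiply, and a couple of activation functions to cover essentially this whole graph, which is the appeal of moving from a GPU to something less general.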

        Not just that, the deal w/ custom silicon is that it has a specific, i.e. limited, use, and thereby a limited market. At best, one could put it on an FPGA and run w/ it. The time to go from an FPGA to an ASIC is when one ramps up the volume to the point that a cost reduction is desperately needed. Otherwise, one has to run a minimum number of wafers at a fab to remain cost optimized. That's not possible if one is running a product w/ such a limited scope & market.
        • by tlhIngan ( 30335 )

          Not just that, the deal w/ custom silicon is that it has a specific, i.e. limited, use, and thereby a limited market. At best, one could put it on an FPGA and run w/ it. The time to go from an FPGA to an ASIC is when one ramps up the volume to the point that a cost reduction is desperately needed. Otherwise, one has to run a minimum number of wafers at a fab to remain cost optimized. That's not possible if one is running a product w/ such a limited scope & market.

          You keep forgetting Apple is a fabless semiconductor company...

    • ASIC [wikipedia.org]s have always had their use (literally) but seem to have exploded into the mainstream with Bitcoin. Now it seems everyone is working on their own "AI" chip, which is fancy wording for "we put the most commonly used functions in silicon". Intel is now putting FPGAs into their Xeon chips so that customers can start speeding up their workflows.

      We've kind of tapped out x86 performance lately. My 6-year-old laptop is still fairly competitive. I have a phone 5 generations old and it's "good enough". Are companies going to now turn to ASICs to get the competitive edge?

      Is the IP for making ASICs either cheap or as readily available as Linux or BSD source code? The way you describe it, it sounds like the average Billy Joe Blow would walk into a Microcenter, pick up an ASIC just as easily as he picks up a graphics card, plug it into his computer, and be off to the Bitcoin mining races.

      • ...pick up an ASIC just as easily as he picks up a graphics card, plug it into his computer, and be off to the Bitcoin mining races.

        https://www.element14.com/comm... for use on a $20 dev board.

        Once you prove out your designs there, I don't know what it would cost to get them manufactured, but it's definitely within the budget of Apple or Google to have it done.

  • Did I miss something? Hype, yes. But boom? I was always under the impression that a boom first and foremost requires some kind of product that you could sell.

    • Yes, you missed something. AI (or more accurately, deep learning) is everywhere now. Talk to your phone and it recognizes what you say? That's done with a neural network. Go to YouTube and it recommends some videos you might want to watch? That's another neural network (actually two of them). Google Translate? All done with deep learning now. Upload a photo to Facebook and it tags the people in it? More neural networks. The field is booming, and lots of companies are designing special hardware to run it.
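To make the "special hardware" point concrete: accelerators of this kind generally do not run the training framework at all; they consume a trained network that has been exported and usually quantized down to smaller integer math. A hedged sketch of that hand-off using TensorFlow Lite -- just one example of such a toolchain, not anything Apple has announced, and the tiny model here is a stand-in:

```python
import tensorflow as tf

# A trivial model standing in for a real, trained speech or vision network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Export for on-device use: convert the graph and let the converter quantize the
# weights, shrinking the file and matching the integer arithmetic that dedicated
# neural-network hardware tends to prefer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("toy_model.tflite", "wb") as f:
    f.write(tflite_model)
```

The exported file is what an on-device runtime hands to the accelerator; the training framework itself never ships to the phone.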

  • by Anonymous Coward

    We've tested the hell out of Siri. The only command we've found that works more than 25% of the time is the pattern "set timer for N minutes." Nothing else works well.

    • by Anonymous Coward

      This. I haven't found any other commands that work.

      • by Anonymous Coward

        try 'google '

    • Siri is certainly better than it was a few years ago, but it still does brain-dead things like trying to route you to another continent when you ask "How do I drive to ...", where "..." is usually just a suburb or three away.

      The amount of background noise seems to be the determining factor: if I don't turn off the radio, wind the windows up, and speak with about half-second gaps between my words, then I'll have to dictate messages to it three or four times before it gets them correct enough to actually send them.

    • by Anonymous Coward

      You're talking to it in the wrong way

  • by MouseR ( 3264 ) on Friday May 26, 2017 @08:27PM (#54495577) Homepage

    Apple Neural Engine + Boston Dynamics' Atlas + Fleshlight

    and I'm a buyer.

  • by Anonymous Coward

    is apple itself.

  • Du. Du hast. Du hast mich. ("You. You have. You have me" -- or, to the ear, "you hate me.") And with AI, you can take that literally.
  • by Anonymous Coward on Saturday May 27, 2017 @01:51AM (#54496347)

    Not a single time. The company I work for makes voice controls for machinery used by the disabled, so we were very curious about Siri. Even with high-quality microphones, training, and simple one-word commands, our stuff still isn't 100% reliable, more than twenty-five years after we delivered our first voice-controlled sewing machine to Goodwill. Voice control, especially speaker-independent voice control, isn't anywhere near ready for consumers.

  • The computer which controlled the machines, Skynet, sent two Terminators back through time.
  • We'll find out at WWDC if this thing is actually completed. If it is, there's a chance the 2017 iPhone models may incorporate this chip specifically to speed up Siri.

  • Apple Is Working On a Dedicated Chip To Power AI On Devices

    Dared to lowercase "a", but not "On", "To", or "On". I guess that makes this news More Impressive.

    So that gets us to:

    Apple Is Working on a Dedicated Chip to Power AI on Devices

    Oh oh, lameness filter activated:
    * nice main verb: "is working on"
    * nice pablum phrase: "dedicated chip"
    * nice cliche: "to power"
    * nice hipster slang: "devices"

    Oh, a sleeping drunkard
    Up in Central Park,
    And a lion-hunter
    In the jungle dark,
    And a Chinese dentist,
    And a British queen ...

  • You could call it the Apple Neural United Server

"Inquiry is fatal to certainty." -- Will Durant
