Apple Is Working On a Dedicated Chip To Power AI On Devices (bloomberg.com)
According to Bloomberg, Apple is working on a processor devoted specifically to AI-related tasks. "The chip, known internally as the Apple Neural Engine, would improve the way the company's devices handle tasks that would otherwise require human intelligence -- such as facial recognition and speech recognition," reports Bloomberg, citing a person familiar with the matter. From the report: Engineers at Apple are racing to catch their peers at Amazon.com Inc. and Alphabet Inc. in the booming field of artificial intelligence. While Siri gave Apple an early advantage in voice-recognition, competitors have since been more aggressive in deploying AI across their product lines, including Amazon's Echo and Google's Home digital assistants. An AI-enabled processor would help Cupertino, California-based Apple integrate more advanced capabilities into devices, particularly cars that drive themselves and gadgets that run augmented reality, the technology that superimposes graphics and other information onto a person's view of the world. Apple devices currently handle complex artificial intelligence processes with two different chips: the main processor and the graphics chip. The new chip would let Apple offload those tasks onto a dedicated module designed specifically for demanding artificial intelligence processing, allowing Apple to improve battery performance.
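A minimal sketch, assuming nothing about Apple's actual design, of what "offload those tasks onto a dedicated module" means in software terms: the runtime picks whichever compute unit can run a network most cheaply instead of defaulting to the CPU or GPU. All class names and energy figures below are made-up placeholders, not a real Apple API.

```python
# Hypothetical dispatch sketch: route a neural-network workload to the
# cheapest available compute unit. Names and numbers are illustrative only.

class ComputeUnit:
    def __init__(self, name, energy_cost):
        self.name = name
        self.energy_cost = energy_cost  # placeholder relative cost per inference

    def run(self, model_name):
        print(f"running {model_name} on {self.name}")

def pick_unit(units):
    # Prefer the unit with the lowest energy per inference, which is the whole
    # point of adding a dedicated neural engine next to the CPU and GPU.
    return min(units, key=lambda u: u.energy_cost)

units = [
    ComputeUnit("cpu", energy_cost=1.0),             # placeholder
    ComputeUnit("gpu", energy_cost=0.3),             # placeholder
    ComputeUnit("neural_engine", energy_cost=0.05),  # placeholder
]

pick_unit(units).run("face_recognition_model")
```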
Of course. (Score:3, Funny)
KILL ALL HUMANS
Why design it from scratch? (Score:3, Funny)
Wouldn't it be easier for Apple to use its massive cash hoard and acquire Cyberdyne?
Re: (Score:2)
I thought the Terminator had a 6502 compatible processor [pagetable.com].
Rise of ASICs? (Score:2)
ASIC [wikipedia.org]s have always had their use (literally) but seem to have exploded into mainstream with Bitcoin. Now it seems everyone is working on their own "AI" chip, which is fancy wording for "We put these most commonly used functions in silicon". Intel is now putting FPGAs into their Xeon chips so that customers can start speeding up their workflows.
We've kind of tapped out x86 performance lately. My 6 year old laptop is still fairly competitive. I have a phone 5 generations old and it's "good enough". Are companies going to now turn to ASICs to get the competitive edge?
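To make the "most commonly used functions in silicon" point concrete with the Bitcoin example above: the fixed function a mining ASIC hard-wires is essentially this inner loop, double SHA-256 over an 80-byte block header with a varying nonce. The header bytes and target below are dummy placeholders, not real chain data.

```python
import hashlib
import struct

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

header_prefix = b"\x00" * 76   # placeholder for version, prev hash, merkle root, time, bits
target = 2 ** 250              # deliberately easy target so the loop finds a hit quickly

for nonce in range(1_000_000):
    header = header_prefix + struct.pack("<I", nonce)
    if int.from_bytes(double_sha256(header), "little") < target:
        print("found nonce", nonce)
        break
```

An ASIC does nothing but this loop, massively in parallel, which is exactly why it crushes CPUs and GPUs at this job and is useless for anything else.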
Re: (Score:3)
We've kind of tapped out x86 performance lately. My 6 year old laptop is still fairly competitive.
Your six year old laptop is about to become an ancient slow piece of crap.
It takes more and more processing power just to run the OS because operating systems keep bloating up to match the specs of the latest processors. This is true for Windows and the mainstream Linux WMs, although not quite so bad on the Linux side of things.
PCs more long lasting. (Score:2)
They're not becoming slow. Unlike in the 90s, when kicking up the MHz resulted in a corresponding performance boost, the same does not happen when one increases the number of cores & threads. Software has to be written & fine-tuned for these many-core CPUs. Otherwise, all they are good at doing is running more processes, like, for instance, this Firefox session w/ 16 tabs.
I'm right now typing this on a Dell Inspiron 17 w/ a Core i7 and 8GB of RAM. Runs just fine. My other laptop is a Pentiu
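A minimal sketch of the point about cores, assuming nothing beyond the Python standard library: the same computation sees no benefit from extra cores until it is explicitly split across them.

```python
# Extra cores only help when the work is explicitly divided among them;
# a plain loop runs on one core no matter how many the machine has.
from multiprocessing import Pool

def work(chunk):
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    n = 10_000_000

    # Single-threaded: one core does everything.
    serial = work(range(n))

    # Parallel: the same work, split into four interleaved slices.
    with Pool(4) as pool:
        parallel = sum(pool.map(work, [range(start, n, 4) for start in range(4)]))

    assert serial == parallel
```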
Re: (Score:2)
Are companies going to now turn to ASICs to get the competitive edge?
You do not design the hardware only to then find a suitable application. If a company wants a competitive edge it must first figure out what it wants to do. Then it finds the most efficient way to do it. This could involve an ASIC - but this is not required.
Overall, I do not see a trend towards custom silicon. A limited market has always existed and it continues to exist. If anything, the reduced cost of general purpose devices (CPU, FPGA) makes custom silicon far less attractive than before.
Re: (Score:2)
it must first figure out what it wants to do.
I think they have. They want to listen for a certain word at very low power. How much more battery can Google save by putting 'Ok Google' into silicon?
On the "AI" part of things, how much has TensorFlow changed recently? GPUs were a good stepping stone (like they were for Bitcoin) but the next step in speeding up some of the basic functions is to move from GPUs to something less general.
What is the BitC^H^H^H^H AI performance difference between a 40U rack of CPUs, a 40U rack of GPUs, and a 40U rack of ASICs?
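A hedged sketch of what "putting 'Ok Google' into silicon" buys, with NumPy standing in for a trained keyword model: an always-on loop runs a tiny classifier on short audio frames and only wakes the big cores when it fires. The weights, features, and threshold below are stand-ins, not anyone's real detector.

```python
# Hypothetical always-on wake-word loop. On a phone this runs on a tiny DSP
# or dedicated block; the main CPU and the full speech recognizer stay asleep
# until the cheap detector fires.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal(40)        # stand-in for a trained keyword model
THRESHOLD = 5.0                    # placeholder decision threshold

def keyword_score(frame_features):
    # A real detector would be a small neural net over audio features;
    # one dot product is enough to show the shape of the computation.
    return float(frame_features @ W)

def wake_main_cpu():
    print("wake word detected: hand off to the full recognizer")

for _ in range(100_000):
    features = rng.standard_normal(40)   # stand-in for ~10 ms of audio features
    if keyword_score(features) > THRESHOLD:
        wake_main_cpu()
        break
```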
Re: (Score:2)
Re: (Score:2)
You keep forgetting Apple is a fabless semiconductor company.
Re: (Score:2)
ASIC [wikipedia.org]s have always had their use (literally) but seem to have exploded into mainstream with Bitcoin. Now it seems everyone is working on their own "AI" chip, which is fancy wording for "We put these most commonly used functions in silicon". Intel is now putting FPGAs into their Xeon chips so that customers can start speeding up their workflows.
We've kind of tapped out x86 performance lately. My 6 year old laptop is still fairly competitive. I have phone 5 generations old and it's "good enough". Are companies going to now turn to ASICs to get the competitive edge?
Is the IP for making ASICs either cheap or as readily available as Linux or BSD source code? The way you describe it, it sounds like the average Billy Joe Blow could walk into a Microcenter, pick up an ASIC just as easily as he picks up a graphics card, plug it into his computer, and be off to the Bitcoin mining races.
Re: (Score:2)
, pick up an ASIC just as easily as he picks up a graphics card, plug it into his computer, and be off to the Bitcoin mining races.
https://www.element14.com/comm... for use on a $20 dev board.
Once you prove out your designs there, I don't know what it would cost to get them manufactured. But it's definitely within the budget of Apple and Google to do it.
"The booming field of artificial intelligence"? (Score:2)
Did I miss something? Hype, yes. But boom? I was always under the impression that a boom first and foremost requires some kind of product that you could sell.
Re: (Score:2)
Yes, you missed something. AI (or more accurately, deep learning) is everywhere now. Talk to your phone and it recognizes what you say? That's done with a neural network. Go to YouTube and it recommends some videos you might want to watch? That's another neural network (actually two of them). Google Translate? All done with deep learning now. Upload a photo to Facebook and it tags the people in it? More neural networks. The field is booming, and lots of companies are designing special hardware to run these models more efficiently.
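A minimal sketch of what those networks boil down to at inference time, using NumPy with random placeholder weights: stacks of matrix multiplies plus a cheap nonlinearity, which is exactly the operation a dedicated AI chip accelerates.

```python
import numpy as np

rng = np.random.default_rng(1)
# Three dense layers with placeholder weights; a real model just has more
# (and bigger) layers, but the inner operation is the same.
layers = [rng.standard_normal((256, 256)) for _ in range(3)]

def forward(x):
    for W in layers:
        x = np.maximum(x @ W, 0.0)   # matmul + ReLU: the op an AI chip speeds up
    return x

print(forward(rng.standard_normal(256)).shape)   # (256,)
```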
Re: (Score:2)
Ok, so I didn't miss much, or at least anything important.
We make voice control systems... (Score:1)
We've tested the hell out of Siri. The only command we've found that works more than 25% of the time is the pattern "set timer for N minutes." Nothing else works well.
Re: We make voice control systems... (Score:1)
This. I haven't found any other commands that work.
Re: (Score:1)
try 'google '
Re: (Score:3)
Siri is certainly better than it was a few years ago, but it still does brain-dead things like trying to route you to another continent when you ask "How do I drive to ...", where "..." is usually just a suburb or three away.
The amount of background noise seems to be the determining factor: if I don't turn off the radio, wind the windows up, and speak with about half-second gaps between my words, then I'll have to dictate messages to it three or four times before it gets them correct enough to actually send them.
Re: (Score:1)
You're talking to it in the wrong way
Proposed synergy... (Score:4, Funny)
Apple Neural Engine + Boston Dynamics' Atlas + Fleshlight
and I'm a buyer.
The only bigger joke than Siri (Score:1)
is apple itself.
Let the ANE-L probing begin! (Score:2)
I have never seen Siri work (Score:4, Interesting)
Not a single time. The company I work for makes voice controls for machinery to be used by the disabled, so we were very curious about Siri. Even with high-quality microphones, training, and simple one-word commands, our stuff still isn't 100% reliable, and that's more than twenty-five years after we delivered our first voice-controlled sewing machine to Goodwill. Voice control, especially speaker-independent voice control, isn't anywhere near ready for consumers.
Judgment Day (Score:1)
Which means the chip is almost ready? (Score:2)
We'll find out at WWDC if this thing is actually completed. If it is, there's a chance the 2017 iPhone models may incorporate this chip specifically to speed up Siri.
Aye Yai (Score:2)
Dared to lowercase "a", but not "On", "To", or "On". I guess that makes this news More Impressive.
So that gets us to:
Oh oh, lameness filter activated:
* nice main verb: "is working on"
* nice pablum phrase: "dedicated chip"
* nice cliche: "to power"
* nice hipster slang: "devices"
Imagine a beowulf cluster of these! (Score:1)