Apple's A6 Details and Timeline Emerge
MojoKid writes "For a CPU that hasn't seen the light of day, there's a great deal of debate surrounding Apple's A6 and the suggestion that it may not appear until later in 2012. The A6 is a complex bit of hardware. Rumors indicate that the chip is a quad-core Cortex-A9 CPU built on 28nm at TSMC and utilizing 3D fabrication technology. While the Cortex-A9 is a proven design, Apple's A6 will be one of the first 28nm chips on the market. The chip will serve as a test case for TSMC's introduction of both 28nm gate-last technology and 3D chip stacking. This is actually TSMC's first effort with an Apple device. The A4 and A5 have both historically been manufactured by Samsung."
Unbelievable (Score:5, Funny)
Steve's not dead two weeks and already Apple fumbles the ball. STACKED chips? How is the next iPad going to be as thin as it can possibly be when they start stacking chips?
Re: (Score:2, Informative)
Not sure if this post was intended to be serious, but we're not talking about stacking them to a particularly large height. A single wafer is far thinner than any practical phone thickness, and a few of them stacked is still super-thin.
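The wafer-thickness point can be put in rough numbers. A quick sketch in Python (the die and body thicknesses below are assumptions for illustration, not figures from the thread):

```python
# Back-of-envelope: how much height does die stacking add?
# Assumed figures: dies thinned for stacking run on the order of
# 100 micrometers, and an iPhone 4-era body is about 9.3 mm thick.
die_um = 100      # assumed thickness of one thinned die, micrometers
phone_mm = 9.3    # assumed overall phone thickness, millimeters

stack_mm = 4 * die_um / 1000    # four stacked dies, in millimeters
fraction = stack_mm / phone_mm  # share of the phone's thickness

print(stack_mm)   # 0.4
print(fraction)   # ~0.04, i.e. about 4% of the body
```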
Re: (Score:2)
Re: (Score:2)
Indubitably.
Re: (Score:2)
Re: (Score:2)
A single wafer is far thinner than any practical phone thickness, and a few of them stacked is still super-thin.
Just try telling that to Mr. Creosote.
Re: (Score:2)
We have to go. Um... my wife is having her period.
Stacked Chips (Score:5, Funny)
I bet they'll try to patent this "innovation" -- even though they clearly stole the idea.
For goodness sake, Pringles has been stacking chips since the 1960's.
Re: (Score:3)
However there is more prior art - casinos have been stacking chips for many decades...
Re: (Score:2)
You're thinking of stacking the deck. Casinos have been stacking the deck for many decades.
Re:Stacked Chips (Score:5, Interesting)
Re:Stacked Chips (Score:5, Informative)
They lost a court case in the UK over this a couple of years back - for strange historical reasons, you pay VAT on crisps, but not on cakes.
The strange historical reasons being that some bright spark thought they could be really clever by only charging VAT on "non-essential" items, thus creating endless work for lawyers and committees arguing over what was "essential".
...and as anybody who watches QI knows, the official definition is that "cakes" go hard when they are stale [wikipedia.org], whereas biscuits* go soft.
* That's biscuits as in British English, i.e. cookies or crackers - not scones (which I guess are cakes).
Re: (Score:1)
The idea of not charging VAT on essentials is a good one in that it reduces the tax burden of the poor. But why exactly are cakes essential and biscuits aren't?
cat/dog food biscuits are taxed (Score:1)
I'm sure more dogs/cats eat biscuits than humans do.
Only humans eat cakes, and even then not more than 1-2 times a month. It's hardly a daily purchase, unless you consider bread to be a different cake, which in theory it is: a bland, plain cake.
Why don't we tax drugs instead? More $ in it.
Re: (Score:2)
Some bright spark also thought it would be clever to charge VAT on hot meals and not cold ones.
Re: (Score:2)
Sounds a lot like the U.S. If I get my Subway toasted, it gets taxed. Untoasted, it's not taxed. Wacky.
Re: (Score:2)
Re: (Score:2)
Maybe they'll sue Qualcomm and TI for violating their processors look and feel^W^W^W, sorry, trade dress.
Quad Core In a Tablet/Phone? (Score:5, Interesting)
I love my quad core desktop processor, but I find myself scratching my head at the idea of quad core CPU in a tablet. Even with iOS 5's enhancements there's no true multitasking in it or any other tablet/phone OS - every application is interacted with in a full-screen monolithic manner.
Dual core CPUs allow the OS to do one thing in the background and not bog down the device for the running application, but what on earth are you going to do with 4 CPUs when you can only interact with 1 program at a time? This seems like it would only be of benefit to games and a couple other niche uses, otherwise a processor with fewer cores and higher per-core performance like the A15 mentioned in the article would be far more beneficial.
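The fewer-fast-cores vs. more-slow-cores trade-off the poster raises can be framed with Amdahl's law. A quick sketch in Python (the 50% parallel fraction and the 30% per-core speed advantage are illustrative assumptions, not benchmarks):

```python
# Amdahl's law: overall speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction and n the number of cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.5                          # assume half the workload parallelizes
quad = speedup(p, 4)             # four slower cores
dual_fast = 1.3 * speedup(p, 2)  # two cores, each assumed 30% faster

print(quad)       # 1.6
print(dual_fast)  # ~1.73 -- the fewer-but-faster design wins here
```

For workloads that are mostly serial, per-core performance dominates; only highly parallel work (games, media) pulls ahead on extra cores.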
Re:Quad Core In a Tablet/Phone? (Score:4, Interesting)
Re:Quad Core In a Tablet/Phone? (Score:4, Informative)
That is assuming that Apple actually put multithreading into their iPhone SDK.
Of course there's threading [apple.com] in iOS. There are examples [xprogress.com] to be found if you google for them.
Re: (Score:2)
Re: (Score:1)
Classical threading is just one side of the story.
Internally, many frameworks are multithreaded, mostly the ones that deal with audio, video and image manipulations. And with blocks executing on user-created queues, the improvement can easily be felt.
Think applying real-time effects to a 1080p video stream (with a preview) and compressing it to H.264 on the fly. On your phone.
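A rough Python analogy to blocks executing on user-created queues (the real mechanism on iOS is Grand Central Dispatch; `apply_effect` is a made-up stand-in for per-frame work):

```python
# The pattern: hand independent work units to a pool of workers
# that can span cores, then collect the results in order.
from concurrent.futures import ThreadPoolExecutor

def apply_effect(frame):
    # stand-in for per-frame image processing
    return [px * 2 for px in frame]

frames = [[1, 2, 3], [4, 5, 6]]
with ThreadPoolExecutor(max_workers=4) as queue:
    processed = list(queue.map(apply_effect, frames))

print(processed)  # [[2, 4, 6], [8, 10, 12]]
```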
Re: (Score:2)
Think applying real-time effects to a 1080p video stream (with a preview) and compressing it to H.264 on the fly. On your phone.
A lot of that sort of stuff is also hardware-accelerated where you hand off a stream to the appropriate API and the device will encode/decode using hardware features while using very little CPU.
hardware acceleration (Score:1)
Re:Quad Core In a Tablet/Phone? (Score:5, Interesting)
Dual core makes sense because of power-saving issues: you can have one low-clocked core which is enough for basic phone functionality and is turned off once you start actively using the phone. In this sense I could even understand a triple-core chip: you would have one low-power core for when the phone's not being used, then when it is you can move OS/background processes to one core and have a third core for running the main process.
Surely a purpose built GPU would give far better gaming improvements than an additional A9 core.
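The power argument here can be put in toy numbers (both wattage figures below are invented for illustration; real SoCs power-gate idle cores so they draw close to nothing):

```python
# Toy model of energy-proportional core gating: idle cores are
# power-gated, so a quad-core chip running one core need not cost
# four times a single core's power.
ACTIVE_MW = 500   # assumed draw of one active core, milliwatts
GATED_MW = 5      # assumed leakage of a power-gated core

def chip_power(active, total=4):
    return active * ACTIVE_MW + (total - active) * GATED_MW

print(chip_power(1))  # 515 mW: one core up, three gated
print(chip_power(4))  # 2000 mW: everything lit for a short burst
```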
Re: (Score:2)
If TSMC are struggling with their 40nm process, what makes anyone think that they'll do better with 28nm? The very idea of lithographic shrinks is that once you have a stable process, you then shrink it in order to get more die per wafer, and hence, a theoretical cost down. Theoretical because in practise, a wafer on a finer lithography is going to be more expensive than a previous generation, particularly if new equipment, yield hits and other parameters are factored in. So initially, the new die would
Re:Quad Core In a Tablet/Phone? (Score:4, Informative)
The article is talking about things long in the past. I have a HD5850 in my machine that's almost two years old and built on a 40 nm process from TSMC. That process has been fairly stable for a long time now, even though it was a bit delayed and early yields weren't as good as hoped. Where they have really struggled is with their 32-34 nm - I don't remember exactly - process that should have gone into the last generation of chips. In short, they ended up simply skipping it since they were due to deliver 28 nm by the time it would be ready. And there are actually three 28 nm processes - LP, HPL and HP - which you can call low, mid and high-power. LP is really just for support chips, but it's rumored that HPL will be used for the next generation Cortex and AMD's Southern Islands, while nVidia is waiting on the HP process for their next generation. For the GPU business it just means progress is slower - both AMD and nVidia are stuck waiting for TSMC. For CPUs on the other hand, Intel and GlobalFoundries are heavy competitors - GF to take over the business, while Intel only produces for themselves - but being a process step behind is like fighting with one hand tied behind your back.
Re: (Score:3)
Going to a second supplier makes sense for most companies, including Apple. Reliance on one supplier for a critical part makes some companies nervous. As for Samsung jumping ahead of TSMC, it's unlikely to happen soon, as Samsung has only started making products on their 30 nm lines within this year. Going to the next step (22 nm) will take a few years for them.
I don't know about using separate cores clocked differently. That seems it would cause more problems than solving the power consumption problem espec
Economics 101 (Score:2)
Going to a second supplier makes sense for most companies including Apple.
"Most"? Actually, it does NOT make sense most of the time, for simple reasons of economics. Virtually all manufacturing has large fixed costs (tooling, engineering, setup, salaries, etc.) which have to be recouped somehow. If you produce a small number of units, your per-unit cost climbs steeply. This is 100% of the reason for volume discounts.
The problem with using a second supplier is that you are replicating all of these fixed costs but you can only amortize them over half the number of units. Worse, bot
Re: (Score:1)
If you have high volumes of a certain product, you absolutely want second, or third fabs for those. After all, all these foundries - TSMC, Samsung, UMC, Hynix, Vanguard, Nanya, et al - have multiple customers, all with supply agreements, and once Apple hits their max, a foundry would go into allocation if they tried allocating Apple more lots. That's why for high volume, second suppliers would be involved, and both suppliers capacity would be above optimal. End result being that for the 2nd fab, once App
Re: (Score:2)
Fixed costs and outsourcing (Score:2)
(1) Apple isn't getting two substantially different parts from two suppliers; they are getting two identical parts from two different suppliers as Apple designed the chip themselves.
That has no bearing on the economics of the situation. Both suppliers still have the same fixed costs to amortize. The fact that the product is identical is irrelevant. They both have to buy equipment, hire staff, engineer the build processes, etc. These are fixed costs that have to be paid even if they never actually produce a single unit. With a second supplier, Apple pays many of these costs twice but neither supplier can amortize them over as many units. This drives the price up. It HAS to cost m
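The amortization argument in toy numbers (the fixed and variable costs below are invented for illustration):

```python
# Per-unit cost = fixed costs / volume + marginal cost per chip.
FIXED = 100_000_000   # assumed setup cost per supplier, dollars
VARIABLE = 10         # assumed marginal cost per chip, dollars

def unit_cost(volume):
    return FIXED / volume + VARIABLE

single_source = unit_cost(50_000_000)  # one supplier builds everything
dual_source = unit_cost(25_000_000)    # each of two suppliers builds half

print(single_source)  # 12.0 dollars per chip
print(dual_source)    # 14.0 -- same fixed costs, half the units to spread them over
```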
Re: (Score:2)
Re:Quad Core In a Tablet/Phone? (Score:5, Informative)
Dual core CPUs allow the OS to do one thing in the background and not bog down the device for the running application, but what on earth are you going to do with 4 CPUs when you can only interact with 1 program at a time?
You do know that iPhone apps can do quite a lot in the background, even if only one app can have focus at one time, right? Right now apps are deliberately curtailed to only certain background activities because of the limitations of the amount of cores, adding in more cores and more powerful cores will allow apps to do more in the background.
The limitation of being able to interact with one app at a time is due to UI constraints. Even on a regular computer there isn't much case for multiple programs being visible to the user at one time. For the most part a user isn't able to fully interact with multiple programs at a time, the usual case is to view a document in one app while doing work in another. A better solution to this is to allow programs to share their display engines so that a single program can run and display documents from other programs while only having one program running at a time.
The model of one application running with a few lighter weight processes doing background work makes sense for devices with tight resources and that's the model that iOS is attempting to follow.
Re: (Score:1)
Right now apps are deliberately curtailed to only certain background activities because of the limitations of the amount of cores, adding in more cores and more powerful cores will allow apps to do more in the background.
I think Apple has been very upfront about the fact that limiting available background activities is primarily about power management and battery. Nothing about 2 cores prevents you from maxing them out, it's just that most of the apps that do such a thing do so because they are poorly and lazily coded. Apple's restrictions have always been about forcing developers to make apps that run in a way that will not kill the user's battery and several of the Android developers have made comments about wishing they
Re: (Score:2)
So your theory is that we need 4 cores to run many lightweight apps at the same time. That doesn't make much sense.
My "theory" is that there are a lot of apps that can benefit from having additional cores to run threads on. It doesn't matter if it is the front app doing parallel processing or "background" apps that have registered tasks to be run. Additional cores will get used on iOS devices and they provide additional flexibility to the software.
Re: (Score:2)
Do you have any example of a useful cpu-heavy "background app"? Sorry but alarm clocks and reminder apps only need 0.001 core. Timesharing is fine for them. Even mp3 decoding requires less than 0.1 core nowadays.
TomTom or Navigon?
Re: (Score:2)
Do you have any example of a useful cpu-heavy "background app"? Sorry but alarm clocks and reminder apps only need 0.001 core. Timesharing is fine for them. Even mp3 decoding requires less than 0.1 core nowadays.
I can think of a few foreground apps that would benefit from as many cores as you care to throw at them:
Multitrack recording apps, like GarageBand.
Video CODEC intensive apps, like iMovie.
Re: (Score:3)
You're making a few wrong assumptions. Having multiple cores doesn't mean all cores are powered on at the same time. It also ignores advances in battery tech and power management (something Apple pays particular attention to), as well as miniaturization allowing larger batteries due to smaller components. We've already seen this in later generations of iDevices.
This will be a boon to game makers to allow more complex AI as well as short term CPU boosts for processes that need it. It also ignores the innate possibi
Re: (Score:3)
Yeah, like sending all your data to Apple without your consent, for example.
Right, because that *totally* happened. It wasn't a file that was just sitting on the phone accumulating more and more data as everyone else has reported*. You have the truth because you have an axe to grind.
* Yes, that's bad enough, but let's not just make shit up, m'kay?
Re: (Score:3)
You do know that iPhone apps can do quite a lot in the background...[]... right?
Yeah, like sending all your data to Apple without your consent, for example.
You're confusing Apple with Android. Only with Android, all your data gets sent to some entity you have no identity for.
Re:Quad Core In a Tablet/Phone? (Score:4, Funny)
Re: (Score:3)
Maybe Apple has finally decided to support Flash?
Yeah, but with only 4 cores Flash will still drop frames.
Re: (Score:2)
Maybe Apple has finally decided to support Flash?
What is Flash? You mean a flash drive? Get the Camera Connection Kit [apple.com] and that will give you a USB port, and it apparently supports flash drives.
Re:Quad Core In a Tablet/Phone? (Score:4, Interesting)
This assumes that iOS will only ever allow you to interact with one program at a time. It also assumes that iOS doesn't do so already—ever play music while working with another app? It's a question of controls, and finding ways to work with multiple programs that work for the users.
If I were doing it, I'd consider a "half-screen" mode where you can have two apps open, one on each side of the screen. But that's worse than Apple-armchairing, that's UX-armchairing. *shudder*
Re:Quad Core In a Tablet/Phone? (Score:4, Funny)
One core for the OS, one for the apps, one for the antivirus and one for the rootkit.
Re:Quad Core In a Tablet/Phone? (Score:4, Funny)
And one core to rule them all .......
no, wait, wrong story ........
Re: (Score:2, Funny)
Re: (Score:1)
Re: (Score:3)
It's also about threading. But even then, while developers don't have access to APIs that spawn processes, the OS _does_ multitasking.
Also, it's not only a matter of performance, but it's also a matter of power. A quad core processor allows the thing to scale in an energy-proportional manner. Only need a single core? Appropriate performance and every other core will remain powered down - consuming a lot less power. And for mobile, battery life is King.
Need a lot more power? (games, for example) Yup, its the
Re: (Score:1)
Remember: "...640 kB should be enough..."
There are plenty of new applications which could make good use of multicore CPUs:
AI: facial recognition
automatic voice translation
lip reading
Desktop use:
replace th
Re: (Score:2)
A multi-threaded app would process data more quickly. It's a way of getting more processor power out without raising the clock speed any more.
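The claim can be sketched concretely: split one job into independent chunks and give each worker one. (A caveat: CPython's GIL limits this for pure-Python CPU work; native mobile code has no such constraint.)

```python
# Why threads scale with cores: partition the work, run a worker
# per partition, then combine the partial results.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000))

def chunk_sum(chunk):
    return sum(chunk)

chunks = [data[i::4] for i in range(4)]   # one disjoint slice per core
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(chunk_sum, chunks))

print(total == sum(data))  # True: same answer, work spread over 4 workers
```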
Re: (Score:3)
I can think of dozens of things that they are dying to use that power for: Pumping 4x the pixels for a high resolution display, doing processing related to speech recognition (even if the matching is done server side), running spotlight indexing on local content as you download it... (e.g. your email and docs from the cloud), playing HD video while doing all of the above, supporting a "mission control" style app switcher with live previews and spaces style switching, supporting airplay in the background w
Re: (Score:3)
Re: (Score:3)
Go use OmniGraffle on iPad. You'll want the 4 cores (easily threadable tasks, not enough cores).
Re: (Score:3)
Even with iOS 5's enhancements there's no true multitasking in it or any other tablet/phone OS
Technically incorrect. Both iOS and Android are TRUE multitasking operating systems, which iOS inherits from BSD, and Android inherits from Linux. So perhaps you only work with one app at a time, but there is far more going on than you realize... all those processes running on your phone in the background? Those are tasks. Even when you're not using it, it is probably multitasking away and you didn't even realize!
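A minimal illustration of the point: a "background" task keeps making progress while the "foreground" work runs, which is all preemptive multitasking means at the OS level.

```python
# Preemptive multitasking in miniature: the background thread is
# scheduled alongside the foreground code without either blocking
# the other.
import threading
import time

ticks = []

def background():
    for i in range(5):
        ticks.append(i)
        time.sleep(0.01)

t = threading.Thread(target=background)
t.start()
foreground = sum(range(100))   # the app the user is actually "in"
t.join()

print(foreground, len(ticks))  # 4950 5
```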
Re: (Score:3)
Music plays while Safari has the screen and is browsing websites.
Not to mention the file system and underlying OS operations, notification services, location services, and so on. There are a lot of things running in the background under iOS, and more cores will just help them run more smoothly.
Re: (Score:1)
Re: (Score:1)
Doesn't sound true (Score:3, Interesting)
Apple has already had problems in the past with low stock at launch. Why would they risk having even worse problems by using unproven tech at a fab they haven't used before? There are always supply problems when dealing with smaller fab tech, which will probably be worse with 3D thrown in.
Re: (Score:3)
Re: (Score:3)
I dunno. If there's one thing the last five years have shown, Apple are quite prepared to take calculated risks. Moving to x86 architecture, the iPhone and the iPad were all calculated risks which could easily have gone horribly wrong.
Re: (Score:2)
More than a few months... different manufacturer, but Intel has been shipping 28nm Arrandale processors for over a year now. :) I have one in the laptop sitting in front of me as I type this.
Re: (Score:2)
I sit corrected... 32nm for the Arrandale. It's the next generation that's 22nm.
Re: (Score:2)
I'm thinking it's more along the lines of "There is more demand when we can produce less, so let's start at a higher price point."
Later, when the pace of production can meet demand, they can just let the same price ride until competition shows up. Then, they can reap the benefits of an extra 6-9 months of higher prices, and then drop them when needed with no overhead.
Not sure, I'm not an Apple consumer, but has the price of an Apple product ever dropped until the next iProduct came out?
Re: (Score:2)
I'm thinking it's more along the lines of "There is more demand when we can produce less, so let's start at a higher price point." Later, when the pace of production can meet demand, they can just let the same price ride until competition shows up. Then, they can reap the benefits of an extra 6-9 months of higher prices, and then drop them when needed with no overhead.
Not sure, I'm not an Apple consumer, but has the price of an Apple product ever dropped until the next iProduct came out?
That pretty much never happens. Historically, Apple has a price point and it stays there across multiple hardware refreshes. This is true for the mobile devices as well as computers and laptops. If a price drops, it's typically when a new hardware version is released (like the shift down across the iMac and MacBook Pro models) and the drop is permanent. The only time I remember it happening during a product's life cycle was for the 1st gen iPhone. I seem to recall the subsidized price dropping $100 or so a c
I would guess it would depend on when (Score:3)
So TSMC's 28nm is going to be what is behind AMD and nVidia's next gen GPUs, despite their poor handling of 40nm for both companies. Those guys (nVidia in particular) also have a large first dibs on the production.
So if they are planning on the A6 from there later in 2012, well I could see it. Both nVidia and AMD want to launch new GPUs soon. I'm sure they want a Christmas launch though realistically it'll probably be early next year. Ok well they do those, tons o' chips are made with the 28nm process, the
Re: (Score:2)
This certainly changes the equation when deciding which client should have priority.
Re: (Score:2)
Would be far too late for that. If you want 28nm stuff the fab not only needs to be built now, it needs to be full of equipment and staff, and be producing test runs. Building a fab takes a -long- time. How long? Well to give you an idea Intel is already building Fab 42 in Chandler for 14nm processors. Please remember they don't even make 22nm processors yet, however they are already in construction of the fab after it.
Also in terms of altering contracts for this generation, it is too late. nVidia and AMD a
Re: (Score:1)
Re: (Score:1)
This is not really uncommon in the history of Apple.
Re: (Score:2)
Because they have to. Their competitors are using the 28nm tech also. If it works (which is likely, since you can already get chips produced with this tech from a couple companies) then Apple needs to be in on it rather than stuck with an older, slower, hotter, more power-hungry chip. If it fails, then it fails for everyone and Apple is no worse off than their competitors.
Historically? (Score:3)
The A4 and A5 are not even that old.
What is a 3D stacked chip for a fab? (Score:1)
Is TSMC now into doing assembly, in addition to wafers? Since when did it get into the packaging business? I thought that their business model was to ship their wafers to the assembly houses approved by their customers - in this case, Apple - and that the assembly houses involved would do the packaging for them. From "3D stacked chip," I'm assuming that they'll be stacking multiple die on each other, like in an MCP. What's it in the case of an A6 - 4 basic CPUs just stacked one over the other? Some of the
Re: (Score:1)
They are not taking wafers and stacking them up like a club sandwich; it's all on a wafer with multiple planes.
Re: (Score:1)
Re:What is a 3D stacked chip for a fab? (Score:4, Informative)
From 3D stacked chip, I'm assuming that they'll be stacking multiple die on each other, like in an MCP.
Stacked chips have been around for a long time. The A4 and A5 are stacked, with the CPU and the memory on top of each other. Technically there is no reason why they can't stack CPUs on top of each other. Practically, I suspect heat is a problem.
The other part of the question - iOS - is it something that's as SMP enabled as OS-X is? From what I've seen of i-PADs, they are not multi-tasking OS's at all - all they do is save the state of an app once you exit it, and resume from that point if you return. If that's the case, how do multiple cores help?
iOS is based on OS X, which is based on BSD, so yes, SMP is there. Your knowledge about iPads is very out of date. The hardware itself is capable of multitasking, as you can play music while surfing the web. The APIs that Apple exposes limit how applications access the multitasking. Fast-switching is the most commonly used version because most applications don't really need to keep running while not being used. However, Apple provides seven different multitasking models [wikipedia.org] in iOS 4, released more than a year ago.
Finally, Apple can make this chip even better for themselves by moving their macs and airbooks to this processor, so that they have just one CPU platform of their own, making it easier to have a common code base for their apps, like Safari, Mail, et al.
Except that ARM and x86 instruction sets are not compatible. You can emulate x86 in an ARM environment but it will be painfully slow. Emulating ARM in an x86 environment will work but there's no real point other than coding and debugging for something like iOS.
Re: (Score:1)
From what I've seen of i-PADs, they are not multi-tasking OS's at all
You must be a Windows user. Windows users eternally confuse operating systems with interfaces. The interface currently restricts focus to one app at a time, but backgrounding of apps, plus being based on BSD and Linux respectively, means that iOS and Android are both true multitasking operating systems.
HAHA! (Score:3)
" Given the iPhad's dominant market position, "
I wonder who slipped that in there?
"Stacking" (Score:2)
I assume by "stacking" they are referring to (and the article alluded to) something similar to Intel's Tri-Gate transistors?
http://hothardware.com/News/Intel-Announces-New-22nm-3D-Trigate-Transistors/ [hothardware.com]
And not simply stacking and interconnecting like this?
http://www.tomshardware.com/news/rochester-3d-processor,6369.html [tomshardware.com]
It is the GPU that will matter! (Score:4, Interesting)
Apple can afford to bring out iPad3 with a CPU that is not much faster than the current one.
What they cannot afford is stalling GPU performance.
If rumours are correct, and iPad3 will have a retina display, it will need a lot more shader performance to fill that screen with 3 million pixels. As it is now, it is hard enough to get 60fps on non retina displays with moderately complex OpenGL ES2 shaders.
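The arithmetic behind "a lot more shader performance" (2048x1536 is the rumored doubled resolution, not a confirmed spec):

```python
# Minimum pixel-shader throughput to fill a rumored retina iPad
# panel at 60 fps, ignoring overdraw.
width, height = 2048, 1536          # rumored retina resolution
pixels = width * height             # pixels per frame
per_second = pixels * 60            # shader invocations per second at 60 fps

print(pixels)       # 3145728, the "3 million pixels" figure
print(per_second)   # 188743680 -- roughly 189M pixels shaded per second
```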
Re: (Score:1)
If rumours are correct, and iPad3 will have a retina display
That's what the rumors said about the iPad2 ...
If you can trust apple to do anything, it's to provide no more than minor upgrades to their products, as we've seen with every iPhone and iPad so far.
It doesn't really matter what they release, millions will still buy it.
Re: (Score:1)
Do you figure if you just make shit up, it somehow gives your point credence?
Re: (Score:2)
Sorry, what part of my post do you think I "made up"?
The iPad2 was rumored to have a retina display, and every new generation of iPhone and iPad has been a fairly minor upgrade in terms of specifications and features.
Do you think that millions won't buy the next incremental upgrade?
Re: (Score:2)
Re: (Score:2)
I assume you think that adding a newer display somehow made it no longer a mundane update? I would expect a newer version to have a newer display!
I know that Steve told you it was "revolutionary", but it really was just another minor update. Take a look at the specs. Somewhat less awesome now, eh?
Re: (Score:2)
Re: (Score:2)
You're still on this? Okay, let's add some perspective.
Before the iPhone 4, Apple had one of the lowest pixel densities on the market. It's not like they were best-in-class here.
The LG Arena, way back in early 2009, had a 311 ppi display.
Even earlier, the Xperia X1 from late 2008 had a 311 ppi display.
Apple iPhone 4, in the middle of 2010 had a 326 ppi display.
Sure, it was the highest on the market at the time, but only slightly higher than phones that came out more than a year before.
So, yeah, it was just
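For reference, the ppi figures quoted in this subthread fall out of the screen geometry:

```python
# ppi = diagonal resolution in pixels / diagonal size in inches
import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

iphone4 = ppi(960, 640, 3.5)   # iPhone 4's 960x640 panel, nominally 3.5"
print(round(iphone4))          # ~330; Apple quotes 326 for the exact active area
```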
Re: (Score:2)
Re: (Score:2)
Sample quantities? For multiple, popular, retail products?
It seems that you're delusional. Not that it matters to me. Enjoy believing that the next mundane refresh of the iPhone will "change everything again".
Perhaps this time they'll have a notification system that works half as well as a 5 year old blackberry. It'll be revolutionary!
Quad Core is not just for handhelds (Score:4, Insightful)
Funny how it all comes full circle. Apple suffered from having its unique RISC architecture for many years. Then Apple conformed to x86 for just a few years and leveraged that to get enough marketshare that they can move back to an independent architecture again.
Seth
Re: (Score:3)
The move from PPC to Intel was more about logistics than performance. Apple might have been Motorola's and IBM's most high-profile customer, but they would really be a small customer in terms of volume. Due to the nature of Apple's consumer business, their chips would have to be heavily customized, requiring more R&D, and cheaper per unit as they were intended for consumers. IBM's internal server/workstation division would pay more for PPC chips as they were intended for higher-end computing. Apple wou
Re: (Score:1)
Was Apple buying their PPCs from just IBM, or Mot/Freescale as well? I thought the reason Apple dropped PPC is that these 2 didn't have a long term roadmap on performance upgrades, which is what they were looking for. And since that time, IBM has been improving the power management of the Power considerably, so that today, despite being tops in performance and used for things like SAP, it consumes remarkably low power. Apple might want to consider re-instating servers w/ the Power7.
Also, Power is now a
Re: (Score:2)
Was Apple buying their PPCs from just IBM, or Mot/Freescale as well? I thought the reason Apple dropped PPC is that these 2 didn't have a long term roadmap on performance upgrades, which is what they were looking for. And since that time, IBM has been improving the power management of the Power considerably, so that today, despite being tops in performance and used for things like SAP, it consumes remarkably low power. Apple might want to consider re-instating servers w/ the Power7.
Apple first bought all CPUs from Motorola, then switched to using desktop CPUs from IBM. They still relied on Motorola for mobile G4 chips, and IBM never released a mobile G5. I suspect heat and power consumption were not good enough for laptops. Even if IBM has been improving on power management, Apple is not likely to have their servers on one platform and the rest of their line on another.
Also, Power is now an open specification, so it's no longer restricted to just IBM and Motorola. Apple could take it to any fab, like TSMC, and have them make what they need.
Yes but Apple will have to do all the R&D themselves. They design their own A4 and A5 chips however significant portions of those chips like the ARM and graphic cores are licensed by Apple but not designed by them. It's not that Apple couldn't do so but it is a much larger undertaking than designing their iDevice CPUs.
Re: (Score:1)
Also, Power is now an open specification, so it's no longer restricted to just IBM and Motorola. Apple could take it to any fab, like TSMC, and have them make what they need.
Yes but Apple will have to do all the R&D themselves. They design their own A4 and A5 chips however significant portions of those chips like the ARM and graphic cores are licensed by Apple but not designed by them. It's not that Apple couldn't do so but it is a much larger undertaking than designing their iDevice CPUs.
Yeah, but Apple bought a company called PA Semiconductors in 2008, who were doing not an ARM, but a PowerPC based CPU. That company was not making any ARM processors until Apple bought them. So Apple could have taken their PWRficient designs, and designed their iPads, iPods and even Airbooks around it. As it is, OS-X already exists for the PPC, so it would have been a question of updating their recent versions so that it was supported.
Also, Apple was a far bigger consumer of PPCs than IBM itself - the number of Macs they sold, while low, easily dwarfed the number of IBM Power7 systems sold. Incidentally, did IBM ever migrate their other legacy products to Power7, or do they still make upgrades to those legacy platforms as well?
At their peak, Apple might have purchased maybe 2-3 million PowerPC CPUs a year from IBM with 2 million from Motorola. But remember these chips were cheaper than workstation/server grade POWER IBM would buy internally. While it is not clear how many units IBM sold to itself, the systems group sold about $24B worth of hardware and services. IBM also most likely sold more than just CPUs to themselves. In the ASIC business IBM has sold more processors to their other customer MS for Xbox than Apple and that processor hasn't changed in years. From the viewpoint of IBM which customer would you prioritize: Apple who needs new upgrades every year for cheaper processors and a lot of R&D, IBM who is internal but willing to pay a lot more per chip, or MS who needs the same processor year after year?
X-box would definitely be a bigger customer than Apple: I was thinki
Re: (Score:2)
Yeah, but Apple bought a company called PA Semiconductors in 2008, who were doing not an ARM, but a PowerPC based CPU. That company was not making any ARM processors until Apple bought them. So Apple could have taken their PWRficient designs, and designed their iPads, iPods and even Airbooks around it. As it is, OS-X already exists for the PPC, so it would have been a question of updating their recent versions so that it was supported.
I suspect Apple bought out PASemi more for their expertise and patents than their designs specifically for mobile devices not laptops. Again, it's not that Apple can't do it with enough resources or money but that the effort would be quite large. PASemi had maybe 150 engineers; I think Intel employs thousands for chip design. Also bear in mind what Apple wants from PowerPC was laptop and desktop CPUs for consumers. Even IBM with all their resources and chip expertise was never able to release laptop CPU
Re: (Score:3)