Apple A8X iPad Air 2 Processor Packs Triple-Core CPU, Hefty Graphics Punch
MojoKid writes: When Apple debuted its A8 SoC, it proved to be a modest tweak of the original A7. Despite packing double the transistors and an improved GPU, the heart of the A8 SoC is the same dual-core Apple "Cyclone" processor, tweaked to run at higher clock speeds and with stronger total GPU performance. Given this, many expected that the Apple A8X would be cut from similar cloth — a higher clock speed, perhaps, and a larger GPU, but not much more than that. It appears those projections were wrong. The Apple A8X is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB) and 2GB of external DDR3. It also uses an internal metal heatspreader, which the Apple A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance. The new A8X is a significant powerhouse in multiple types of workloads; in fact, it's the top-performing mobile device on Geekbench by a wide margin. Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4 fps to 31 fps. Onscreen results favor the Nvidia solution thanks to its lower-resolution screen, and the Nvidia device does take 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659.
Re: (Score:2)
I don't really see the point. (Score:3)
The iPad isn't used for number-crunching. It's not a high-end gaming platform - it certainly has a lot of games, but few of them are graphically demanding. It's used for a little light content creation, but nothing more than timeline video editing - not real-time effects compositing or rendering. Mostly it's used for document viewing and web browsing. So long as it's got enough power to comfortably decode video (and it has hardware H.264 acceleration anyway), why would you need to worry about just how much?
Re: I don't really see the point. (Score:4, Insightful)
Re: (Score:2)
We do seem to have reached the point where computers no longer become obsolete. I remember back at the turn of the millennium - you'd buy the best computer around, and barely had time to get it out of the box before a new one came along with a faster processor and twice the memory. These days most people can quite happily get along with an eight-year-old C2D, and the leading cause of replacement is hardware failure. Even on laptops - the keyboards start to fall apart before the processor speed becomes an issue.
Re: (Score:2)
Obligatory clip from The Onion movie [vimeo.com].
Re: (Score:2)
I remember people saying the same thing when I got my original iPad in late 2010. Four years later and it can't even run the latest iOS, let alone recent versions of most apps. (Even the apps that are compatible with its iOS version tend to run slowly or crash frequently.)
Re: (Score:2)
The original iPad is actually now really out of date, but then it was underpowered to start with. :)
It has a single core, 256MB of RAM, and frankly it is so far behind even an iPad 4, much less an Air or Air 2, that it really is not useful for much beyond e-mail and very light web browsing.
Re: (Score:2)
But, Mr. Monster does have a point. It's not just processor speed or graphics capability anymore. Apple, especially, loves changing core technologies and then leans on developers to upgrade their apps to take advantage of them. And of course, most developers are going to add functionality to the current version and not back port them. So, not only are you stuck with an old OS, but you're stuck with old apps.
At least Apple is smart enough to leave the old apps in place, so it's not like you're left with
Re: (Score:2)
Re: (Score:2)
I put a solid-state drive in an old C2D-equipped Dell D630 for a friend of mine, upgraded the RAM to 4GB, and installed Win 7 Pro in place of Vista. He just flipped out over how much better it runs. I was pretty surprised as well; it seems faster than some of the shitty consumer-level new stuff at Best Buy. Not bad for an antique.
Re: (Score:2)
Re: (Score:2)
It gets warm but not unbearably. I've worked on jet fighters for over 30 years now so I can't really say about the noise.
Re: (Score:2)
Celerons still take thirty seconds to process a mouseclick though. That hasn't changed.
Windows user? I've got openSUSE running on my old laptop from 2004--an Acer with a Celeron--and it still handles full-screen video (and mouseclicks) very nicely, thank you.
Re: (Score:1)
Re: (Score:2)
That sa
Re: (Score:2)
Extrapolate and it's pretty obvious the PC you use for everyday computing tasks is going to become small enough to fit in
Re: (Score:3)
Apple seems to be pushing their mobile CPUs forward quite fast - they're also way ahead of the curve in adopting 64-bit ARM. I wonder if there's a longer-term strategy to start migrating devices like the MacBook Air over to their A-series CPUs, instead of Intel. That could tie things together quite nicely for them.
Re: (Score:2)
ARM processors are not really comparable with x86. They compare well on certain synthetic benchmarks and on the mobile tasks they are optimized for, but they do far worse in many general computing tasks. Similarly, mobile GPUs can push a lot of pixels but are not really comparable to desktop GPUs, because they don't have nearly as much power available for advanced pixel shaders, tessellation, physics processing and the like. In other words, I don't think you are likely to see a MacBook with ARM pr
Re: (Score:2)
People buying MacBook Airs aren't buying them for heavy crunching, but for portability and battery life, where ARM makes sense. I could see OS X ported to ARM (if they don't already have a port) and software being released as fat/universal binaries, kinda like 68k/ppc under Classic or ppc/x86 under OS X.
Re: (Score:2)
I wonder how a quad-core A8X compares to one of the x86 GPUs. And if not the A8X, then the A9X. One of these days it will catch up, and since it's Apple's own design, it'll be cheap to drop one in as a GPU. Think about it: you'd have ARM cores in your MacBook Pro, in addition to the x86 CPU, being extremely energy efficient. With Grand Central Dispatch, it might even be possible to push some instructions over (or perhaps specially written code).
Re: (Score:2)
They have undoubtedly got internal prototypes of a MacBook Air running OS X on their own processors. And their development toolchain and libraries are merging iOS and OS X more and more every year. This year, there were a couple of WWDC talks specifically about sharing code between the two platforms.
I think it's fairly obvious that the technology stack is ready bot
Re: (Score:2)
Unlikely.
The whole reason for 64-bit ARM was that on ARMv8, AArch64 code runs significantly faster than AArch32 code. So if you want speed, you have to move everything to AArch64 - ARMv8 is onl
Re: (Score:3)
The iPad isn't used for number-crunching.
Have you ever tried running Photoshop on a portable device? There are portable editions of it now, with filters. The more power the device has, the more complex the filters that can reasonably be run on the device. That's just one easy example of an app that can benefit from a lot of CPU.
However, I'd imagine that the GPU is a good place to put a lot of the additional power today, because screen resolutions continue to increase. Intensive applications will just have to make use of it.
Re:I don't really see the point. (Score:4, Insightful)
I guess you haven't seen Pixelmator run on the iPad Air 2.
Re: (Score:2)
There are plenty of games (mostly racing games) that I believe push the iPad Air GPU close to its limits.
Another thing to keep in mind is that Apple will want to keep the door open for future products that would require top notch graphics capabilities on iOS. It would be a strategic flaw to lag behind Android devices in graphics power.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Ha ha ha, good one. Less than 1 million sold this last round, it must have been a deafening clamor.
Re: (Score:1)
Well, no one is going to write graphically demanding games until the hardware capable of running them is available.
Re: (Score:2)
It's not a high-end gaming platform - it certainly has a lot of games, but few of them are graphically demanding.
It is a chicken and egg problem...
If better CPU and GPU never arrive, then better games and applications won't either...
This will prompt software devs to move along in power and advancement, making everything run more smoothly while allowing more stuff to run in the background...
It won't happen tomorrow, and frankly we may need 3 more releases for this new power level to become "standard", but it has to start somewhere.
After all, the original iPad had, what... 256MB of RAM? The iPad 2 had 512MB, it
Re: (Score:2)
The iPad Air 2 has 2GB of RAM. Those three years just vanished overnight.
Re: (Score:2)
The iPad Air 2 has 2GB of RAM. Those three years just vanished overnight.
Read what he said; you even quoted it. He didn't say it will happen in 3 years, he said the move to 2GB (which has already happened) will matter in about 3 years. Which is to say, when it has finally become mainstream, developers will target it widely; currently, while you can target 2GB systems, they have comparatively very limited penetration.
Re: (Score:2)
^ this, thank you... you typed out what... frankly, I'm shocked has to be typed out on a web site with the tag "news for nerds" :)
This isn't even new, it has been going on in the general purpose computer world for a very long time. :)
Re: (Score:2)
Sigh... The point went right over your head...
The move to 2GB will actually be "required", and thus matter, in 3 years.
Just like a lot of apps no longer run on the iPad 1 or 2 due to the lower amount of RAM, app devs aren't going to make apps require 2GB of RAM for a while, but once enough iPads have it, then it becomes normal.
Do I REALLY have to spell that out on a tech site?
Re: (Score:2)
The iPad isn't used for number-crunching. It's not a high-end gaming platform
It's not used for these tasks because it doesn't have the capability, not that the use cases don't exist.
If it performed better then it could be "used for number-crunching" and could be a "high-end gaming platform"
I'm not saying it should be more capable, I just don't get what you're trying to say.
Re: (Score:2)
Cubasis with a number of software synthesizers and some filters on recorded tracks is heavy number crunching, and something the tablet format is excellent for. More computing power means more filters and synthesizers, meaning fewer limitations on what can be done right away.
Many synthesizers and trackers can be used for realtime performances, and there the limiting factor is raw CPU power. This move is excellent news for those using iPads to perform.
Re: (Score:2)
I don't know, but pretty much everywhere else on the internet everyone is screaming "haha, only 3 cores? Android has had octa-core processors since 1973" - I suppose this is one small step towards shutting them up (though Apple haters will usually find something else 'superior' to be smug about).
'Mobile' no more. (Score:5, Informative)
Per the Geekbench 3 CPU benchmark suite, the A8X scores ~4500.
The Surface Pro hybrid laptop's i3 scores 4750.
Apple's base model MacBook Air's i7 scores 5300.
(and for reference, the old Core 2 Quad Q6600 scores 4250.)
Meanwhile, the Intel chips in the Surface Pro and MacBook Air have a 15W TDP, while the A8X should be well south of 5W. Granted, a lot of that goes to the integrated GPUs, but the A8X is no slouch in graphics either. The iPad runs at a higher resolution than 90-plus-percent of PCs today and runs plenty of good-looking 3D games. It's good enough for consumer use, definitely.
Finally, Intel's 'recommended customer price' for their ULV chips is ~$300. Major purchasers like Apple and Microsoft no doubt negotiate a substantial discount, but I doubt it comes close to the ~$20 (plus in-house design costs) Apple pays for the A-series chips.
This may sound like an Apple fanboy post, but it isn't. It's an 'Intel needs to get their shit together' post. A decade ago Intel lost their way with the Pentium 4 and AMD took the lead for a few years. In the end, that gave us the vastly improved Core architecture. If Broadwell and Skylake don't put Intel out ahead of ARM designs in a hurry, the next few years could be very interesting.
Re:'Mobile' no more. (Score:5, Informative)
Correction - The Surface Pro's i3 scores 3250 [primatelabs.com]. Sorry.
Re: (Score:3)
Gah, editing again. The Surface Pro's i3 has an 11.5W TDP. The available i5 and i7 have 15W TDPs and performance close to the MacBook Air's i7.
Changed chips, forgot to switch numbers :-/
Re: (Score:1)
Except that the "base model" Macbook Air doesn't have an i7. It has an i5.
Re: (Score:2)
...and you're right. I need to stop doing three things at once. For some reason I thought the 4260 was an i7 part (though in the ULV chips, the only difference is a bit more cache).
Re: (Score:3)
Geekbench isn't a proper benchmark so one shouldn't draw too many conclusions from it. Let's see something like SPEC or a subset of it instead.
Also: TDP is just that, a design power. Modern chips can exceed it for a thermally insignificant period but in most cases power draw isn't near it. Think of it as a worst case power for sustained workloads.
Re: (Score:1)
Unfortunately, there is no SPEC benchmark available for iOS.
Re: (Score:2)
He didn't overlook them at all. He acknowledged "plus in-house design costs" IMMEDIATELY after saying $20 for the price of the chip.
Re:'Mobile' no more. (Score:4, Interesting)
It is amortized over the large number of iPhones and iPads sold. Let's say they have 1,000 people dedicated solely to the ARM CPU/GPU, at $200,000 per person per year, and they sell 150 million iPhones + iPads a year (more, but I rounded for easier calculation).
1,000 * 200,000 / 150,000,000 = $1.33
So, $21.33.
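The back-of-the-envelope math above can be sketched in a few lines (all figures are the commenter's assumptions — headcount, salary, unit volume, and the ~$20 chip cost are estimates, not Apple's actual numbers):

```python
# Amortizing an assumed in-house chip-design team over assumed unit sales.
engineers = 1_000             # assumed headcount dedicated to CPU/GPU design
cost_per_engineer = 200_000   # assumed fully-loaded annual cost, in dollars
units_per_year = 150_000_000  # assumed iPhones + iPads sold per year (rounded)
chip_cost = 20                # assumed per-chip manufacturing cost, in dollars

design_cost_per_unit = engineers * cost_per_engineer / units_per_year
total_per_chip = chip_cost + design_cost_per_unit
print(f"design cost per unit: ${design_cost_per_unit:.2f}")   # $1.33
print(f"effective cost per chip: ${total_per_chip:.2f}")      # $21.33
```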
Re: (Score:2)
What about GPU scores though? What'll be interesting is to see what a quad-core A8X would look like, and how it compares to Intel's HD 5000 and AMD's and Nvidia's discrete GPUs. Imagine dropping an A8X or an A9X into your new MacBook Pro as the GPU. All of a sudden, you have both x86 and ARM in one box. With Grand Central Dispatch, and some special code, you can even offload CPU-intensive tasks to the A*X. Metal API is pretty impressive to a non-coder like me, making it available on the MacBooks would be amaz
Re: (Score:1)
thin is in.
Re:Let's shit all over the customers (Score:5, Informative)
The new Mac Mini is twice as slow as the late 2012 model. So fuck you Apple.
The benchmarks [cpubenchmark.net] say that the CPU of the entry-level late 2014 Mac Mini is only 3.8% slower than the entry-level late 2012 Mac Mini. However, the TDP is also 57.1% lower (from 35W to 15W).
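The TDP figure quoted above is simple arithmetic to verify (the benchmark scores themselves come from cpubenchmark.net and aren't recomputed here):

```python
# Sanity-check the TDP claim: late-2012 entry Mac Mini at 35W vs. late-2014 at 15W.
def percent_lower(old, new):
    """How much lower `new` is than `old`, as a percentage of `old`."""
    return (old - new) / old * 100

print(f"{percent_lower(35, 15):.1f}% lower TDP")  # 57.1% lower TDP
```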
Re: (Score:1, Informative)
Nope, Apple apologizer. The new Mac Minis are all dual-core, compared to the late 2012 line, some of which were quad-core. So basically Apple cut the multi-core performance in half. Nice job, Apple.
http://www.primatelabs.com/blog/2014/10/estimating-mac-mini-performance/
http://ark.intel.com/products/83506/Intel-Core-i7-4578U-Processor-4M-Cache-up-to-3_50-GHz
Re: (Score:2)
Re: (Score:3)
They crippled it when they removed the optical drive IMO. It made for a really nice and discreet media center. No, adding an external drive won't do it as it kinda defeats the media center idea...
Re: Let's shit all over the customers (Score:1)
Have you ever heard of a NAS?
Considering top media center apps like Plex have broken or no support for DVD/BR playback, I don't understand your logic.
Re: (Score:3)
Yes, in fact I do have a file server here, but an optical drive is useful when you want to watch a DVD without ripping it first. Since a Mac mini is x86, it can run Windows Media Center, MythTV, XBMC, or any other software.
Re: (Score:2)
Oh well.......Apple is becoming less and less relevant anyway in the computer world.
What leads you to say that?
Re: (Score:2)
Because they don't really innovate anymore and most of what they do is a regression. The Mac Mini is a perfect example. It really not only failed to advance but in some ways went backwards. The newer OS upgrades are more about selling you some crap you don't want or need than increasing productivity. Mountain Lion was the last OS that actually seemed like an improvement. My computer ran better with that installation but Mavericks really seems sluggish, so much so that I wiped the drive and went back to ML.
Re: (Score:2)
Because they don't really innovate anymore and most of what they do is a regression. The Mac Mini is a perfect example. It really not only failed to advance but in some ways went backwards.
Ya, definitely disappointing there. But on the other hand, doesn't it have a higher CPU-power-per-watt rating? I imagine that this matters more to some people than pure CPU horsepower.
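That intuition is easy to check with the figures quoted elsewhere in the thread (~3.8% slower CPU, TDP down from 35W to 15W — assumed figures, and TDP is only a rough proxy for actual power draw, as another commenter notes):

```python
# Rough performance-per-watt comparison of the entry-level Mac Minis,
# using the assumed figures from the thread: ~3.8% slower, 35W -> 15W TDP.
perf_2012, tdp_2012 = 1.000, 35.0   # normalized performance, watts
perf_2014, tdp_2014 = 0.962, 15.0   # 3.8% slower

ratio = (perf_2014 / tdp_2014) / (perf_2012 / tdp_2012)
print(f"perf/watt improvement: {ratio:.2f}x")  # 2.24x
```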
The newer OS upgrades are more about selling you some crap you don't want or need than increasing productivity. Mountain Lion was the last OS that actually seemed like an improvement. My computer ran better with that installation but Mavericks really seems sluggish, so much so that I wiped the drive and went back to ML.
Huh, I've noticed the opposite with Mavericks. I only recently (a month ago) upgraded, but I have noticed significantly better battery life with it — especially with Safari not chewing up as many idle CPU cycles.
I hate that upgrades are tied to the Apple Store now. Why???
Ya, that drives me nuts too. However, I think it should still be possible to extract a .dmg installation image
Re:Let's shit all over the customers (Score:5, Informative)
An informed individual posted an explanation for this. Apparently, the new Intel chips have different pinout requirements between the dual-core and quad-core variants - this assumes you are soldering the CPU directly to the motherboard. Because of this difference, Apple cannot sell a quad-core CPU without designing a new motherboard. So they sell it with the fastest CPUs that operate within the given power constraints and support the required physical pinout.
In all likelihood, Apple will release a quad-core update sooner rather than later. Holding off for 6 months gives them plenty of time to design the new hardware while also giving them the opportunity to make headlines once again in 6 months' time.
The Mac Mini is a great little design. If one is in the market and wants the fastest one possible, it is probably best to either wait or purchase a quad-core of the previous model.
Re: (Score:1)
Re: (Score:2)
In all likelihood, Apple will release a quad-core update sooner rather than later. Holding off for 6 months gives them plenty of time to design the new hardware while also giving them the opportunity to make headlines once again in 6 months' time.
And that's what it's about, isn't it? There's no reason the richest tech company in the world couldn't have produced a quad-core variant in that time, but now they already have the next Mac Mini in reserve, ready to go.
Re: (Score:2)
Apple prides itself on producing fewer parts and models. They avoid multiple variations of anything.
So even fewer than the iterations before it? What's next? No choice in processor at all?
This supply chain philosophy goes all the way back to their founding and Steve Jobs, and is partly why they ARE successful.
Except that Mac Mini previously did have quad core options.
The fewer "options" you offer to the customer, the easier it is for them to make a purchase decision to buy. Adding more options just gives a customer more reason to delay their purchase decision.
So they eliminated the quad core option because up until now the Mac Mini line was too confusing?
Re: (Score:2)
Re: (Score:2)
The benchmarks [cpubenchmark.net] say that the CPU of the entry-level late 2014 Mac Mini is only 3.8% slower than the entry-level late 2012 Mac Mini. However, the TDP is also 57.1% lower (from 35W to 15W).
But it's a desktop PC. Sacrificing a bit of performance for a lower TDP is great in a laptop, but to go 2 years on a desktop and release a system with a decrease in performance and only a lower TDP to show for it is pretty lame.
Re: (Score:3)
I waited to upgrade my iPhone 4S until the recent release of iOS 8.1. It seems to work pretty well. Indeed, there is a bit of slowdown but it is quite acceptable. For a 3 year old device this is quite good support, I'd say.
I have not yet tried 8.1 on my iPad 2. Does anyone have experience with it?
Re: (Score:2)
For a 3 year old device this is quite good support, I'd say.
Would you say that about any other consumer device? Heck, the *warranty* on a new car is usually longer than that, and US laws mandate that the manufacturer support the car with parts and service for at least 10 years. If a major flaw is found, the company has to issue a recall and fix it for free. Your phone/tablet/computer? After a year you're on your own, and any updates - even for massive security flaws - are totally at the whim of the manufacturer. And this is considered good?
Re: (Score:2)
The iPhone 3GS was released in June 2009 and got a security update in 2/2014.
Re:Let's shit all over the customers (Score:4, Informative)
Please do not try to inject facts into a haterz rant. It's not as if Apple is better than Android in providing timely updates to all the devices they support. Or force you to wait for your manufacturer to provide the update. Or allow your carriers to screw you over by withholding updates. Oh wait...
Re: (Score:2)
A car costs tens to hundreds of thousands of dollars when new. The iPhone - not so much. An automobile has the real possibility of killing you or dozens of people around you - again, the iPhone not so much*.
They are not even remotely analogous.
* and for all of you serious Asperger's cases out there, don't go trying to make up some bizarre scenario where an iPhone connected to a cheap charger burns down an elementary school, OK?
Re: (Score:3)
"A car costs tens to hundreds of thousands of dollars when new. The iPhone - not so much"
With Canadian carrier prices, it's about the same after 2 years...
Re: (Score:2)
Re: (Score:2)
Works fine on the original iPad mini, so I assume the iPad 2 will be fine as well. Haven't upgraded my 4S since 7 is slower than 6 - enough to bug me - and I don't want to be bugged by my phone. But it's an early 4S and physically at death's door, so I'll probably get a new version.
PLEASE APPLE. It won't hurt you much. Keep the 4S size. And while you're at it, bring back the 17 inch MBP and the cheese grater. Pretty please.
(Stomps off to the basement to sniffle.)
Re: (Score:2)
I have upgraded my 4S to 8.1 and really, while it is a little slower here and there, it's perfectly usable. Why do you say it's at death's door? Mine runs just fine and I plan to use it for another year at least.
Re: (Score:3)
I upgraded my iPad 2 to iOS 8.1. There are occasional glitches and some things seem to take slightly longer to respond, but all in all it's not too bad.
Safari, unfortunately, is next to useless if you're using more than a single tab. Crashes don't seem quite as frequent as they were in iOS 7, but switch back to the other tab and the page reloads, every single time. I've switched to Chrome because this was getting too frustrating.
One interesting thing is how there is some framerate drop in transitions and ov
Re: (Score:1)
For what it's worth, my 5s is noticeably snappier with iOS 8 than it was with 7. So at least iOS 8 doesn't make that slightly-older phone slow and dysfunctional. I can't speak for other models.
Re: (Score:2)
I've installed 8.x.x on my iPad 2 and it works okay, except some apps (mostly apps my 3-year-old uses) now crash often (they were working just fine on 7.x.x).
I've also found that some apps have UI issues with 8.x (I've been led to believe that Apple has changed some semantics of how touch/scroll events behave on buttons in lists), but whatever the actual reason, this has broken a few things here and there.
Re: (Score:1)
Shut up and keep sucking bitch.
Re: (Score:2)
I dunno, I picked up some second hand (2009) 27" iMacs for about what I could get the same quality IPS screens for. These run Linux and Windows just fine, nice stylish all-in-one PC for peanuts. Sure, they're no powerhouse, but for non resource intensive tasks they work great.
There are certainly cases where running windows/linux on mac hardware makes perfect sense.
Re:Let's shit all over the customers (Score:5, Funny)
Re:Let's shit all over the customers (Score:5, Insightful)
Re: (Score:1)
Re: (Score:2)
Don't blame me, I voted for Kodos!
Re: (Score:2)
No, because they were better at being evil. Much better.
Re: Let's shit all over the customers (Score:3, Insightful)