Apple Hardware

Apple A8X iPad Air 2 Processor Packs Triple-Core CPU, Hefty Graphics Punch

MojoKid writes When Apple debuted its A8 SoC, it proved to be a modest tweak of the A7. Despite packing double the transistors and an improved GPU, the heart of the A8 is the same dual-core Apple "Cyclone" processor, tweaked to run at higher clock speeds and deliver stronger overall GPU performance. Given this, many expected the Apple A8X to be cut from similar cloth: a higher clock speed, perhaps, and a larger GPU, but not much more than that. It appears those projections were wrong. The Apple A8X is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB), and 2GB of external DDR3. It also uses an internal metal heat spreader, which the Apple A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance. The new A8X is a significant powerhouse across multiple types of workloads; in fact, it's the top-performing mobile device on Geekbench by a wide margin. Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4 fps to 31 fps. Onscreen results favor the Nvidia device thanks to its lower-resolution screen, and it also takes 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659.
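The margins quoted in the summary work out as follows. This is a quick editorial arithmetic check using only the figures as posted; nothing here is independently measured.

```python
# Check the benchmark margins quoted in the summary.
# All scores are taken directly from the post.

def lead_pct(winner, loser):
    """Percentage by which the winning score beats the losing one."""
    return (winner / loser - 1) * 100

# GFXBench Manhattan offscreen: iPad Air 2 (32.4 fps) vs. Nvidia Shield (31 fps)
manhattan = lead_pct(32.4, 31.0)     # ~4.5% -- a narrow win for the iPad

# 3DMark Ice Storm Unlimited: Shield (30,970) vs. iPad Air 2 (21,659)
ice_storm = lead_pct(30970, 21659)   # ~43% -- a wide margin for the Shield

print(f"Manhattan offscreen lead: {manhattan:.1f}%")
print(f"Ice Storm Unlimited lead: {ice_storm:.1f}%")
```

The contrast (a 4.5% win in one test, a 43% loss in another) is why single-benchmark comparisons between these chips are misleading.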
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by SuricouRaven ( 1897204 ) on Sunday October 26, 2014 @10:28AM (#48234075)

    The iPad isn't used for number-crunching. It's not a high-end gaming platform - it certainly has a lot of games, but few of them are graphically demanding. It's used for a little light content creation, but nothing more than timeline video editing - not real time effects composition or rendering. Mostly it's used for document viewing and web browsing. So long as it's got enough power to comfortably decode video (And it has hardware h264 acceleration anyway), why would you need to worry about just how much?

    • by thetoadwarrior ( 1268702 ) on Sunday October 26, 2014 @10:34AM (#48234109) Homepage
      For most people it is just a toy but that doesn't mean those who do more should go without. Everything you said applies to the vast majority of desktops too. You don't need to upgrade. My iPad air didn't magically turn to shit. It'll be a viable device for years to come.
      • We do seem to have reached the point where computers no longer become obsolete. I remember back at the turn of the millennium - you'd buy the best computer around, and barely had time to get it out of the box before a new one came along with a faster processor and twice the memory. These days most people can quite happily get along with an eight-year-old C2D, and the leading cause of replacement is hardware failure. Even on laptops - the keyboards start to fall apart before the processor speed becomes an issue.

        • I remember back at the turn of the millennium - you'd buy the best computer around, and barely had time to get it out of the box before a new one came along with a faster processor and twice the memory.

          Obligatory clip from The Onion movie [vimeo.com].

        • I remember people saying the same thing when I got my original iPad in late 2010. Four years later and it can't even run the latest iOS, let alone recent versions of most apps. (Even the apps that are compatible with its iOS version tend to run slowly or crash frequently.)

          • The original iPad is actually now really out of date, but then it was underpowered to start with. :)

            It has a single core, 256MB of RAM, and frankly it is so far behind even an iPad 4, much less an Air or Air 2, that it really is not useful for much beyond e-mail and very light web browsing.

            • But, Mr. Monster does have a point. It's not just processor speed or graphics capability anymore. Apple, especially, loves changing core technologies and then leans on developers to upgrade their apps to take advantage of them. And of course, most developers are going to add functionality to the current version and not back port them. So, not only are you stuck with an old OS, but you're stuck with old apps.

              At least Apple is smart enough to leave the old apps in place, so it's not like you're left with

          • The hardware in the original iPad would be more of a problem than the software, but unless you're obsessive about apps it still works as a device for reading books, surfing the net, watching videos, etc. I suspect the main reason the 1st iPad had a shorter support lifespan is that the hardware was a bit poor, but even with that in mind, and even if you are totally into apps, the idea of having to buy every version of the iPad is entirely unnecessary. It received OS updates for just over 2 years and
        • by amiga3D ( 567632 )

          I put a solid-state drive in an old C2D-equipped Dell D630 for a friend of mine, upgraded the RAM to 4GB, and installed Win 7 Pro in place of Vista, and he just flipped out over how much better it runs. I was pretty surprised as well; it seems faster than some of the shitty consumer-level new stuff at Best Buy. Not bad for an antique.

          • But the heat, my god, the heat and fan noise from those D630s are awful. I have one I keep around for drive cloning, but other than that it gets shelved like a book.
            • by amiga3D ( 567632 )

              It gets warm but not unbearably. I've worked on jet fighters for over 30 years now so I can't really say about the noise.

        • Celerons still take thirty seconds to process a mouseclick though. That hasn't changed.

          Windows user? I've got openSUSE running on my old laptop from 2004--an Acer with a Celeron--and it still handles full-screen video (and mouseclicks) very nicely, thank you.

      • The difference being that I have a vast trove of useful software to wring every last drop of performance out of my desktop hardware. What can I do to top out an iPad? The software stack is mostly useless for power and will remain that way for a very long time.
        • Most people don't need to do that, and unless you're doing serious 3D modelling, GIS data processing/interpretation, or gaming, most desktop apps do not need a Core i7 and 16GB of memory. I have an old ThinkPad (one of the last IBM-labelled ones) and with an SSD and Ubuntu it does everything I want except gaming (though it will play TF2, but gets hot as fuck), and it doesn't need all that memory or CPU to do the job. So while it's good that you can do that, for like 99% of people it's unnecessary.

          That sa
      • I've been an advocate for tablets and PDAs long before the iPad or iPhone came out. After using computers for 3 decades, it's pretty obvious what's happening. First the computer you used was the size of a desk. Then the size of a suitcase, Then the size of what we call a desktop computer. Then the size of a 2" laptop. Then the size of a 1" notebook. Now they're a 0.5" ultrabook.

        Extrapolate and it's pretty obvious the PC you use for everyday computing tasks is going to become small enough to fit in
    • Apple seem to be pushing their mobile CPUs forward quite fast - they're also way ahead of the curve in adopting 64-bit ARM. I wonder if there's a longer term strategy to start migrating devices like the MacBook Air over to their A-series CPUs, instead of Intel. That could tie things together quite nicely for them.

      • by AmiMoJo ( 196126 ) *

        ARM processors are not really comparable with x86. They compare well in certain synthetic benchmarks and in the mobile tasks they are optimized for, but they do far worse in many general computing tasks. Similarly, mobile GPUs can push a lot of pixels but are not really comparable to desktop GPUs, because they don't have nearly as much power available for advanced pixel shaders, tessellation, physics processing and the like. In other words, I don't think you are likely to see a MacBook with ARM pr

        • People buying MacBook Airs aren't buying them for heavy crunching, but for portability and battery life, where ARM makes sense. I could see OS X ported to ARM (if they don't already have a port) and software being released as fat/universal binaries, kinda like 68k/ppc under Classic or ppc/x86 under OS X.

        • I wonder how a quad-core A8X compares to one of the x86 GPUs. And if not the A8X, then the A9X. One of these days it will catch up, and since it's Apple's own design, it'll be cheap to drop one in as a GPU. Think about it: you'd then have ARM cores in your MacBook Pro, in addition to the x86 CPU, being extremely energy efficient. With Grand Central Dispatch, it might even be possible to push some instructions over (or perhaps specially written code).

      • by Bogtha ( 906264 )

        I wonder if there's a longer term strategy to start migrating devices like the MacBook Air over to their A-series CPUs, instead of Intel.

        They have undoubtedly got internal prototypes of a MacBook Air running OS X on their own processors. And their development toolchain and libraries are merging iOS and OS X more and more every year. This year, there were a couple of WWDC talks specifically about sharing code between the two platforms.

        I think it's fairly obvious that the technology stack is ready bot

      • by tlhIngan ( 30335 )

        Apple seem to be pushing their mobile CPUs forward quite fast - they're also way ahead of the curve in adopting 64-bit ARM. I wonder if there's a longer term strategy to start migrating devices like the MacBook Air over to their A-series CPUs, instead of Intel. That could tie things together quite nicely for them.

        Unlikely.

        The whole reason for 64 bit ARMs was because on ARMv8, AArch64 code runs significantly faster than AArch32 code. So if you want speed, you have to move everything to AArch64 - ARMv8 is onl

    • The iPad isn't used for number-crunching.

      Have you ever tried running Photoshop on a portable device? There are portable editions of it now, with filters. The more power the device has, the more complex the filters that can reasonably be run on the device. That's just one easy example of an app that can benefit from a lot of CPU.

      I would imagine, however, that the GPU is a good place to put a lot of the additional power today, because screen resolutions continue to increase. Intensive applications will just have to make use of it.

    • by ArcadeMan ( 2766669 ) on Sunday October 26, 2014 @11:15AM (#48234281)

      not real time effects composition or rendering

      I guess you haven't seen Pixelmator run on the iPad Air 2.

    • There are plenty of games (mostly racing games) that I believe push the iPad Air GPU close to its limits.

      Another thing to keep in mind is that Apple will want to keep the door open for future products that would require top notch graphics capabilities on iOS. It would be a strategic flaw to lag behind Android devices in graphics power.

    • by fermion ( 181285 )
      The layout of web pages is more complex. Unlike the Mac, there is no easy way to block animated content. My old iPad hangs on many web pages. For this reason alone, we need a fast processor. Apple is also trying to push the iPad into the laptop space. It is the affordable Apple device, at $1000 fully loaded, often cheaper than the MS Surface.
      • iPad and Surface Pro are not direct competitors. Ironically, it's the artists and designers clamoring for the Surface Pro and not Apple gear this time around.
        • Ha ha ha, good one. Less than 1 million sold this last round, it must have been a deafening clamor.

    • Well, no one is going to write graphically demanding games until the hardware capable of running them is available.

    • It's not a high-end gaming platform - it certainly has a lot of games, but few of them are graphically demanding.

      It is a chicken and egg problem...

      If better CPU and GPU never arrive, then better games and applications won't either...

      Better hardware will prompt software devs to push forward in power and sophistication, make everything run more smoothly, and allow more stuff to run in the background...

      It won't happen tomorrow, and frankly we may need 3 more releases for this new power level to become "standard", but it has to start somewhere.

      After all, the original iPad had, what... 256MB of RAM? The iPad 2 had 512MB, it

      • The move to 2GB will matter... in about 3 years.

        The iPad Air 2 has 2GB of RAM. Those three years just vanished overnight.

        • The move to 2GB will matter... in about 3 years.

          The iPad Air 2 has 2GB of RAM. Those three years just vanished overnight.

          Read what he said; you even quoted it. He didn't say it will happen in 3 years, he said the move to 2GB (which has already happened) will matter in about 3 years. Which is to say when it has finally become mainstream and developers target it widely. Currently, while you can target 2GB systems, they have comparatively very limited penetration.

          • ^ this, thank you... you typed out what... frankly, I'm shocked has to be typed out on a web site with the tag "news for nerds" :)

            This isn't even new, it has been going on in the general purpose computer world for a very long time. :)

        • Sigh... The point went right over your head...

          The move to 2GB will actually be "required", and thus matter, in 3 years.

          Just like a lot of apps no longer run on the iPad 1 or 2 due to the lower amount of RAM, app devs aren't going to make apps require 2GB of RAM for a while, but once enough iPads have it, it becomes the norm.

          Do I REALLY have to spell that out on a tech site?

    • The iPad isn't used for number-crunching. It's not a high-end gaming platform

      It's not used for these tasks because it doesn't have the capability, not because the use cases don't exist.

      If it performed better then it could be "used for number-crunching" and could be a "high-end gaming platform"

      I'm not saying it should be more capable, I just don't get what you're trying to say.

    • Cubasis with a number of software synthesizers and some filters on recorded tracks is heavy number crunching, and something the tablet format is excellent for. More computing power means more filters and synthesizers, meaning fewer limitations on what can be done right away.

      Many synthesizers and trackers can be used for realtime performances, and there the limiting factor is raw CPU power. This move is excellent news for those using iPads to perform.

    • by Cloud K ( 125581 )

      I don't know, but pretty much everywhere else on the internet everyone is screaming "haha, only 3 cores? Android has had octa-core processors since 1973" - I suppose this is one small step towards shutting them up (though Apple haters will usually find something else 'superior' to be smug about).

  • 'Mobile' no more. (Score:5, Informative)

    by pushing-robot ( 1037830 ) on Sunday October 26, 2014 @10:46AM (#48234165)

    Per the Geekbench 3 CPU benchmark suite, the A8X scores ~4500.
    The Surface Pro hybrid laptop's i3 scores 4750.
    Apple's base model MacBook Air's i7 scores 5300.
    (and for reference, the old Core 2 Quad Q6600 scores 4250.)

    Meanwhile, the Intel chips in the Surface Pro and MacBook Air have a 15W TDP, while the A8X should be well south of 5W. Granted, a lot of that goes to the integrated GPUs, but the A8X is no slouch in graphics either. The iPad runs at a higher resolution than 90-plus-percent of PCs today and runs plenty of good-looking 3D games. It's good enough for consumer use, definitely.

    Finally, Intel's 'recommended customer price' for their ULV chips is ~$300. Major purchasers like Apple and Microsoft no doubt negotiate a substantial discount, but I doubt it comes close to the ~$20 (plus in-house design costs) Apple pays for the A-series chips.

    This may sound like an Apple fanboy post, but it isn't. It's an 'Intel needs to get their shit together' post. A decade ago Intel lost their way with the Pentium 4 and AMD took the lead for a few years. In the end that gave us the vastly improved Core architecture. If Broadwell and Skylake don't put Intel out ahead of ARM designs in a hurry, the next few years could be very interesting.
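The performance-per-watt argument above reduces to simple division. A rough sketch using the figures as posted (note that the i3 score and the Air's CPU model are corrected in the follow-up comments, and the A8X's 5W figure is the poster's estimate, not a published TDP):

```python
# Rough performance-per-watt comparison from the Geekbench 3 scores
# and TDPs quoted in the parent post. Figures as posted; the A8X
# power number is an estimate, not an official spec.

chips = {
    "Apple A8X (iPad Air 2)": (4500, 5.0),   # (Geekbench 3 score, watts)
    "Intel i3 (Surface Pro)": (4750, 15.0),
    "Intel i7 (MacBook Air)": (5300, 15.0),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.0f} points/W")
```

Even granting Intel the benefit of every correction downthread, the A8X comes out well over twice as efficient per watt on these numbers, which is the post's core point.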

    • Re:'Mobile' no more. (Score:5, Informative)

      by pushing-robot ( 1037830 ) on Sunday October 26, 2014 @10:52AM (#48234191)

      Correction - The Surface Pro's i3 scores 3250 [primatelabs.com]. Sorry.

    • Gah, editing again. The Surface Pro's i3 has an 11.5W TDP. The available i5 and i7 have 15W TDPs and performance close to the MacBook Air's i7.

      Changed chips, forgot to switch numbers :-/

      • by Anonymous Coward

        Except that the "base model" Macbook Air doesn't have an i7. It has an i5.

        • ...and you're right. I need to stop doing three things at once. For some reason I thought the 4260 was an i7 part (though in the ULV chips, the only difference is a bit more cache).

    • by Megol ( 3135005 )

      Geekbench isn't a proper benchmark so one shouldn't draw too many conclusions from it. Let's see something like SPEC or a subset of it instead.

      Also: TDP is just that, a thermal design power. Modern chips can exceed it for a thermally insignificant period, but in most cases power draw isn't near it. Think of it as a worst-case power for sustained workloads.

    • What about GPU scores, though? What'll be interesting is to see what a quad-core A8X would look like, and how it compares to Intel's HD 5000 and AMD's and Nvidia's discrete GPUs. Imagine dropping an A8X or an A9X into your new MacBook Pro as the GPU. All of a sudden, you have both x86 and ARM in one box. With Grand Central Dispatch and some special code, you could even offload CPU-intensive tasks to the A*X. The Metal API is pretty impressive to a non-coder like me; making it available on the MacBooks would be amaz
