Portables Upgrades Apple

Apple's A6 Details and Timeline Emerge

MojoKid writes "For a CPU that hasn't seen the light of day, there's a great deal of debate surrounding Apple's A6 and the suggestion that it may not appear until later in 2012. The A6 is a complex bit of hardware. Rumors indicate that the chip is a quad-core Cortex-A9 CPU built on 28nm at TSMC and utilizing 3D fabrication technology. While the Cortex-A9 is a proven design, Apple's A6 will be one of the first 28nm chips on the market. The chip will serve as a test case for TSMC's introduction of both 28nm gate-last technology and 3D chip stacking. This is actually TSMC's first effort with an Apple device. The A4 and A5 have both historically been manufactured by Samsung."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by Anonymous Coward on Sunday August 28, 2011 @05:40AM (#37232728)

    Steve's not dead two weeks and already Apple fumbles the ball. STACKED chips? How is the next iPad going to be as thin as it can possibly be when they start stacking chips?

    • Re: (Score:2, Informative)

      by Trepidity ( 597 )

      Not sure if this post was intended to be serious, but we're not talking about stacking them to a particularly large height. A single wafer is far thinner than any practical phone thickness, and a few of them stacked is still super-thin.
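      A back-of-the-envelope check of that, sketched in C; the die and phone thicknesses below are assumed typical industry figures, not numbers from the article:

      ```c
      /* Rough thickness check for the parent's claim. Die in stacked
       * packages are thinned to roughly 50-100 um; figures here are
       * assumed typical values, not from the article. */
      #include <stdio.h>

      int main(void) {
          double die_mm   = 0.1;         /* one thinned die, ~100 um */
          double stack_mm = 4 * die_mm;  /* even a four-die stack    */
          double phone_mm = 9.3;         /* iPhone 4 body, for scale */
          printf("4-die stack: %.1f mm vs. phone: %.1f mm\n", stack_mm, phone_mm);
          return 0;
      }
      ```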

      • by Dunbal ( 464142 ) *
        Of course I have no idea what we're talking about, and neither do you. But hypothetically, adding extra layers and making a chip thicker automatically creates heat issues, since that extra layer or two must act as a thermal insulator, trapping heat in the middle.
        • by repetty ( 260322 )

          Indubitably.

          • Well, both you and the poster are correct. The current A4 [ifixit.com] and A5 [informationweek.com] are stacked, as are many other package on package [wikipedia.org] chips. However, normally the memory is stacked on the CPU. In the case of the A4 and A5, the L2 cache is stacked on the ARM cores. The CPUs themselves are not stacked, probably because of the heat problems that you mention.
      • by gmhowell ( 26755 )

        A single wafer is far thinner than any practical phone thickness, and a few of them stacked is still super-thin.

        Just try telling that to Mr. Creosote.

    • by narcc ( 412956 ) on Sunday August 28, 2011 @05:50AM (#37232762) Journal

      I bet they'll try to patent this "innovation" -- even though they clearly stole the idea.

      For goodness sake, Pringles has been stacking chips since the 1960's.

      • Re:Stacked Chips (Score:5, Interesting)

        by TheRaven64 ( 641858 ) on Sunday August 28, 2011 @06:53AM (#37232934) Journal
        Pringles claimed that they were stacking cakes. They lost a court case in the UK over this a couple of years back - for strange historical reasons, you pay VAT on crisps, but not on cakes. Pringles had been avoiding paying VAT by claiming that, because they were made from baked dough, they were cakes and not crisps.
        • Re:Stacked Chips (Score:5, Informative)

          by itsdapead ( 734413 ) on Sunday August 28, 2011 @07:12AM (#37232996)

          They lost a court case in the UK over this a couple of years back - for strange historical reasons, you pay VAT on crisps, but not on cakes.

          The strange historical reasons being that some bright spark thought they could be really clever by only charging VAT on "non-essential" items, thus creating endless work for lawyers and committees arguing over what was "essential".

          ...and as anybody who watches QI knows, the official definition is that "cakes" go hard when they are stale [wikipedia.org], whereas biscuits* go soft.

          * That's biscuits as in British English, i.e. cookies or crackers - not scones (which I guess are cakes).

          • by Anonymous Coward

            The idea of not charging VAT on essentials is a good one in that it reduces the tax burden of the poor. But why exactly are cakes essential and biscuits aren't?

            • I'm sure more dogs/cats eat biscuits than humans do.

              Only humans eat cakes, and even then not more than 1-2 times a month. It's hardly a daily purchase, unless you consider bread to be a different kind of cake, which in theory it is: a bland, plain cake.

              Why don't we tax drugs instead? More $ in it.

          • by mikael ( 484 )

            Some bright spark also thought it would be clever to charge VAT on hot meals and not cold ones.

            • Sounds a lot like the U.S. If I get my Subway toasted, it gets taxed. Untoasted, it's not taxed. Wacky.

          • by welcher ( 850511 )
            Surely a scone is a bread, albeit leavened with soda rather than a yeast...
    • by mjwx ( 966435 )
      Worse yet, Apple is looking at releasing a "new" ARM A9 processor when TI and Qualcomm are looking at releasing 28nm Cortex-A15 processors. Given Apple's history, they'll expect this to stay "current" for at least 18 months.

      Maybe they'll sue Qualcomm and TI for violating their processors' look and feel^W^W^W, sorry, trade dress.
  • by rsmith-mac ( 639075 ) on Sunday August 28, 2011 @05:48AM (#37232754)

    I love my quad-core desktop processor, but I find myself scratching my head at the idea of a quad-core CPU in a tablet. Even with iOS 5's enhancements there's no true multitasking in it or any other tablet/phone OS - every application is interacted with in a full-screen, monolithic manner.

    Dual core CPUs allow the OS to do one thing in the background and not bog down the device for the running application, but what on earth are you going to do with 4 CPUs when you can only interact with 1 program at a time? This seems like it would only be of benefit to games and a couple other niche uses, otherwise a processor with fewer cores and higher per-core performance like the A15 mentioned in the article would be far more beneficial.

    • by jpapon ( 1877296 ) on Sunday August 28, 2011 @06:05AM (#37232792) Journal
      Just because there is only one app running doesn't mean it is running in a single thread. While most apps might not take advantage of multithreading at the moment, if quad-core processors become the norm I'm sure you'll see them starting to use it. That is assuming that Apple actually put multithreading into their iPhone SDK.
      • by Graff ( 532189 ) on Sunday August 28, 2011 @06:18AM (#37232836)

        That is assuming that Apple actually put multithreading into their iPhone SDK.

        Of course there's threading [apple.com] in iOS. There are examples [xprogress.com] to be found if you google for them.

        • by jpapon ( 1877296 )
          Thanks. I wasn't saying that it doesn't exist, I just don't do any iPhone programming, so I didn't know.
        • by Anonymous Coward

          Classical threading is just one side of the story.

          Internally, many frameworks are multithreaded, mostly the ones that deal with audio, video and image manipulations. And with blocks executing on user-created queues, the improvement can easily be felt.

          Think applying real-time effects to a 1080p video stream (with a preview) and compressing it to H.264 on the fly. On your phone.
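          For concreteness, a minimal sketch of the "blocks on user-created queues" pattern using libdispatch's C API (build with clang -fblocks; the queue label and the printf are placeholders for real frame-processing work, not anything from the thread):

          ```c
          /* Minimal libdispatch (GCD) sketch: run a block on a user-created queue.
           * Build: clang -fblocks example.c  (OS X / iOS toolchains)             */
          #include <dispatch/dispatch.h>
          #include <stdio.h>

          int main(void) {
              /* A user-created serial queue; the label is purely illustrative. */
              dispatch_queue_t work = dispatch_queue_create("com.example.filters", NULL);
              dispatch_semaphore_t done = dispatch_semaphore_create(0);

              dispatch_async(work, ^{
                  /* Heavy per-frame work (say, a video filter) runs off the main
                   * thread; on a multi-core SoC it can land on a spare core.    */
                  printf("filtering frame on a background queue\n");
                  dispatch_semaphore_signal(done);
              });

              dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
              dispatch_release(work); /* manual release, pre-ARC convention */
              return 0;
          }
          ```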

          • by Graff ( 532189 )

            Think applying real-time effects to a 1080p video stream (with a preview) and compressing it to H.264 on the fly. On your phone.

            A lot of that sort of stuff is also hardware-accelerated where you hand off a stream to the appropriate API and the device will encode/decode using hardware features while using very little CPU.

            • The problem with hardware-acceleration chips is that they are typically single-purpose. Maybe as an iPhone owner who only downloads iTunes videos using a single codec that's great, but once you step outside the box and want to view a video in one of the other dozen common formats, four cores become much more important for decoding on the fly.
    • by Stevecrox ( 962208 ) on Sunday August 28, 2011 @06:07AM (#37232798) Journal
      For me it's more about the manufacturing yields: the article mentions TSMC are struggling with their 40nm production process, and this thing is 28nm being released next year. From what I understand, TSMC is being used to remove Apple's reliance on Samsung. I wouldn't be surprised if this allows Samsung, etc. to jump ahead, as TSMC doesn't sound ready to mass-produce the chip.

      Dual core makes sense because of power-saving issues: you can have one low-clocked core which is enough for basic phone functionality and which is turned off once you start actively using the phone. In this sense I could even understand a triple-core chip: one low-power core for when the phone's not being used; then, when it is, you can move OS/background processes to one core and have a third core for running the main process.

      Surely a purpose built GPU would give far better gaming improvements than an additional A9 core.
      • If TSMC are struggling with their 40nm process, what makes anyone think that they'll do better with 28nm? The very idea of lithographic shrinks is that once you have a stable process, you then shrink it in order to get more die per wafer, and hence, a theoretical cost down. Theoretical because in practise, a wafer on a finer lithography is going to be more expensive than a previous generation, particularly if new equipment, yield hits and other parameters are factored in. So initially, the new die would

      • by Kjella ( 173770 ) on Sunday August 28, 2011 @09:22AM (#37233542) Homepage

        The article is talking about things long in the past. I have an HD5850 in my machine that's almost two years old and built on a 40 nm process from TSMC. That process has been fairly stable for a long time now, even though it was a bit delayed and early yields weren't as good as hoped. Where they have really struggled is with their 32-34 nm - I don't remember exactly - process that should have gone into the last generation of chips. In short, they ended up simply skipping it, since they were due to deliver 28 nm by the time it would be ready.

        And there are actually three 28 nm processes - LP, HPL and HP, which you can call low, mid and high power. LP is really just for support chips, but it's rumored that HPL will be used for the next generation Cortex and AMD's Southern Islands, while nVidia is waiting on the HP process for their next generation. For the GPU business it just means progress is slower - both AMD and nVidia are stuck waiting for TSMC. For CPUs, on the other hand, Intel and GlobalFoundries are heavy competitors - GF to take over the business, while Intel only produces for themselves - but being a process step behind is like fighting with one hand tied behind your back.

      • Going to a second supplier makes sense for most companies, including Apple. Reliance on one supplier for a critical part makes some companies nervous. As for Samsung jumping ahead of TSMC, it's unlikely to happen soon, as Samsung has only started making products on their 30 nm lines within this year. Going to the next step (22 nm) will take a few years for them.

        I don't know about using separate cores clocked differently. That seems like it would cause more problems than it solves for power consumption, espec

          Going to a second supplier makes sense for most companies, including Apple.

          "Most"? Actual it does NOT make sense most of the time for simple reasons of economics. Virtually all manufacturing has large fixed costs (tooling, engineering, setup, salaries, etc) which have to be recouped somehow. If you produce a small number of units, your per-unit cost climbs steeply. This is 100% of the reason for volume discounts.

          The problem with using a second supplier is that you are replicating all of these fixed costs but you can only amortize them over half the number of units. Worse, bot
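          The amortization argument in toy numbers - all figures below are invented purely for illustration; real foundry economics are far messier:

          ```c
          /* Toy single- vs. dual-sourcing model with made-up numbers, just to
           * illustrate the fixed-cost amortization point in the parent post. */
          #include <stdio.h>

          static double per_unit(double fixed, double variable, long units) {
              return fixed / units + variable;
          }

          int main(void) {
              double fixed = 1e9;       /* hypothetical cost of bringing up one line */
              double var   = 10.0;      /* hypothetical variable cost per chip       */
              long   total = 100000000; /* 100M units over the product's life        */

              printf("single source: $%.2f/unit\n", per_unit(fixed, var, total));
              /* Dual sourcing pays the fixed cost twice, each half the volume: */
              printf("dual source:   $%.2f/unit\n", per_unit(fixed, var, total / 2));
              return 0;
          }
          ```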

          • If you have high volumes of a certain product, you absolutely want second or third fabs for those. After all, all these foundries - TSMC, Samsung, UMC, Hynix, Vanguard, Nanya, et al - have multiple customers, all with supply agreements, and once Apple hits their max, a foundry would go into allocation if it tried allocating Apple more lots. That's why for high volume, second suppliers would be involved, and both suppliers' capacity would be above optimal. End result being that for the 2nd fab, once App

          • There are two factors that you are neglecting: (1) Apple isn't getting two substantially different parts from two suppliers; they are getting two identical parts from two different suppliers, as Apple designed the chip themselves. (2) Apple doesn't manufacture anymore; that has been outsourced to Foxconn, with engineering and design remaining with Apple, so their costs of using two suppliers are substantially diminished. Really this is no different than Apple getting flash memory from two different suppliers.
            • (1) Apple isn't getting two substantially different parts from two suppliers; they are getting two identical parts from two different suppliers, as Apple designed the chip themselves.

              That has no bearing on the economics of the situation. Both suppliers still have the same fixed costs to amortize. The fact that the product is identical is irrelevant. They both have to buy equipment, hire staff, engineer the build processes, etc. These are fixed costs that have to be paid even if they never actually produce a single unit. With a second supplier, Apple pays many of these costs twice but neither supplier can amortize them over as many units. This drives the price up. It HAS to cost m

              • You are looking at this from the viewpoint of the manufacturers, TSMC and Samsung, and not from Apple's viewpoint, which is the point of this entire thread. From the viewpoint of Apple, they want more than one supplier for critical components, especially in this case where they are contracting two different companies to manufacture something of their design. While all your points are true for the manufacturer, they are somewhat irrelevant to Apple. TSMC and Samsung have to figure out exactly how to account for t
    • by Graff ( 532189 ) on Sunday August 28, 2011 @06:13AM (#37232816)

      Dual core CPUs allow the OS to do one thing in the background and not bog down the device for the running application, but what on earth are you going to do with 4 CPUs when you can only interact with 1 program at a time?

      You do know that iPhone apps can do quite a lot in the background, even if only one app can have focus at one time, right? Right now apps are deliberately curtailed to only certain background activities because of the limited number of cores; adding in more cores, and more powerful cores, will allow apps to do more in the background.

      The limitation of being able to interact with one app at a time is due to UI constraints. Even on a regular computer there isn't much case for multiple programs being visible to the user at one time. For the most part a user isn't able to fully interact with multiple programs at a time; the usual case is to view a document in one app while doing work in another. A better solution to this is to allow programs to share their display engines, so that a single program can run and display documents from other programs while only having one program running at a time.

      The model of one application running with a few lighter weight processes doing background work makes sense for devices with tight resources and that's the model that iOS is attempting to follow.

      • Right now apps are deliberately curtailed to only certain background activities because of the limited number of cores; adding in more cores, and more powerful cores, will allow apps to do more in the background.

        I think Apple has been very upfront about the fact that limiting available background activities is primarily about power management and battery. Nothing about 2 cores prevents you from maxing them out; it's just that most of the apps that do such a thing do so because they are poorly and lazily coded. Apple's restrictions have always been about forcing developers to make apps that run in a way that will not kill the user's battery, and several of the Android developers have made comments about wishing they

    • by perlith ( 1133671 ) on Sunday August 28, 2011 @06:13AM (#37232820)
      Maybe Apple has finally decided to support Flash?
      • by flosofl ( 626809 )

        Maybe Apple has finally decided to support Flash?

        Yeah, but with only 4 cores Flash will still drop frames.

      • Maybe Apple has finally decided to support Flash?

        What is Flash? You mean a flash drive? Get the Camera Connection Kit [apple.com] and that will give you a USB port, and it apparently supports flash drives.

    • by Crash Culligan ( 227354 ) on Sunday August 28, 2011 @06:17AM (#37232832) Journal

      rsmith-mac: what on earth are you going to do with 4 CPUs when you can only interact with 1 program at a time?

      This assumes that iOS will only ever allow you to interact with one program at a time. This also assumes that iOS doesn't do so already—ever play music while working with another app? It's a question of controls, and finding ways to work with multiple programs that work for the users.

      If I were doing it, I'd consider a "half-screen" mode where you can have two apps open, one on each side of the screen. But that's worse than Apple-armchairing, that's UX-armchairing. *shudder*

    • by Anonymous Coward on Sunday August 28, 2011 @06:17AM (#37232834)

      One core for the OS, one for the apps, one for the antivirus and one for the rootkit.

    • It's also about threading. But even then, while developers don't have access to APIs that spawn processes, the OS _does_ do multitasking.

      Also, it's not only a matter of performance; it's also a matter of power. A quad-core processor allows the thing to scale in an energy-proportional manner. Only need a single core? You get appropriate performance, and every other core will remain powered down - consuming a lot less power. And for mobile, battery life is King.
      Need a lot more power? (games, for example) Yup, it's the
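      A toy model of why that can pay off, using the usual dynamic-power rule of thumb, P ~ V^2 * f per active core (the voltages and frequencies are invented for illustration):

      ```c
      /* Toy dynamic-power model, P ~ V^2 * f per active core. Constants are
       * invented; the point is only that more, slower, lower-voltage cores
       * can cost less energy per unit of work than one fast core.          */
      #include <stdio.h>

      static double power(int cores, double v, double f) { return cores * v * v * f; }

      int main(void) {
          double p1 = power(1, 1.0, 1.0);   /* one fast core              */
          double p4 = power(4, 0.8, 0.5);   /* four slower, lower-V cores */
          printf("1 core:  power %.2f, throughput %.1f, energy/work %.2f\n",
                 p1, 1.0, p1 / 1.0);
          printf("4 cores: power %.2f, throughput %.1f, energy/work %.2f\n",
                 p4, 2.0, p4 / 2.0);
          return 0;
      }
      ```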

    • by Anonymous Coward

      Remember : " ...640 kB should be enough ..."

      There plenty of new applications which could make good use of multicore CPUs :
      AI : facial recognition
      automatic voice translation
      reading on lips ...
      Desktop use:
      replace th

    • ...but what on earth are you going to do with 4 CPUs when you can only interact with 1 program at a time?

      A multi-threaded app would process data more quickly. It's a way of getting more processor power out without raising the clock speed any more.
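      A minimal pthreads sketch of that idea - one task's data split across worker threads so spare cores do the work instead of a higher clock (the workload and sizes are arbitrary stand-ins):

      ```c
      /* Minimal data-parallel sketch with POSIX threads. Build: cc -pthread.
       * Summing an array stands in for any splittable per-app workload.    */
      #include <pthread.h>
      #include <stdio.h>

      #define NTHREADS 4
      #define N 1000000

      static int  data[N];
      static long partial[NTHREADS];

      static void *sum_range(void *arg) {
          long t = (long)arg;
          long lo = t * (N / NTHREADS), hi = lo + N / NTHREADS;
          for (long i = lo; i < hi; i++)
              partial[t] += data[i];
          return NULL;
      }

      int main(void) {
          for (long i = 0; i < N; i++) data[i] = 1;

          pthread_t tid[NTHREADS];
          for (long t = 0; t < NTHREADS; t++)
              pthread_create(&tid[t], NULL, sum_range, (void *)t);

          long total = 0;
          for (long t = 0; t < NTHREADS; t++) {
              pthread_join(tid[t], NULL);
              total += partial[t];
          }
          printf("sum = %ld\n", total);  /* work ran on up to 4 cores */
          return 0;
      }
      ```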

    • I can think of dozens of things that they are dying to use that power for: Pumping 4x the pixels for a high resolution display, doing processing related to speech recognition (even if the matching is done server side), running spotlight indexing on local content as you download it... (e.g. your email and docs from the cloud), playing HD video while doing all of the above, supporting a "mission control" style app switcher with live previews and spaces style switching, supporting airplay in the background w

    • OS X has core load sharing built into the OS. Even though you are only doing one task, it can split it up over multiple cores. iOS does do multitasking, but you are correct it is very limited, and almost nothing is exposed to 3rd party apps.
    • Go use OmniGraffle on iPad. You'll want the 4 cores (easily threadable tasks, not enough cores).

    • Even with iOS 5's enhancements there's no true multitasking in it or any other tablet/phone OS

      Technically incorrect. Both iOS and Android are TRUE multitasking operating systems, which iOS inherits from BSD, and Android inherits from Linux. So perhaps you only work with one app at a time, but there is far more going on than you realize... all those processes running on your phone in the background? Those are tasks. Even when you're not using it, it is probably multitasking away and you didn't even realize!

  • Doesn't sound true (Score:3, Interesting)

    by KClaisse ( 1038258 ) on Sunday August 28, 2011 @06:14AM (#37232826)

    Apple has already had problems in the past with low stock at launch. Why would they risk having even worse problems using unproven tech at a fab they haven't used before? There are always problems with supply when dealing with smaller fab tech, and they will probably be worse with 3D being thrown in.

    • by jpapon ( 1877296 )
      I agree. There's no way that Apple is trusting the manufacture of what could be tens of millions of chips to an unproven technology. Even if (and that's a big if) TSMC could manage to get chips delivered on schedule, there's no telling what sort of reliability issues you'd be seeing 6 months down the road... especially with something like "3D" chips. I really don't think that Apple's business execs are crazy enough to take a risk like that.
      • by jimicus ( 737525 )

        I dunno. If there's one thing the last five years have shown, it's that Apple is quite prepared to take calculated risks. Moving to the x86 architecture, the iPhone, and the iPad were all calculated risks which could easily have gone horribly wrong.

    • by Pikoro ( 844299 )

      I'm thinking it's more along the lines of "There is more demand when we can produce less, so let's start at a higher price point."
      Later, when the pace of production can meet demand, they can just let the same price ride until competition shows up. Then, they can reap the benefits of an extra 6-9 months of higher prices, and then drop them when needed with no overhead.

      Not sure, I'm not an Apple consumer, but has the price of an Apple product ever dropped before the next iProduct came out?

      • by flosofl ( 626809 )

        I'm thinking it's more along the lines of "There is more demand when we can produce less, so let's start at a higher price point." Later, when the pace of production can meet demand, they can just let the same price ride until competition shows up. Then, they can reap the benefits of an extra 6-9 months of higher prices, and then drop them when needed with no overhead.

        Not sure, I'm not an Apple consumer, but has the price of an Apple product ever dropped before the next iProduct came out?

        That pretty much never happens. Historically, Apple has a price point and it stays there across multiple hardware refreshes. This is true for the mobile devices as well as computers and laptops. If a price drops, it's typically when a new hardware version is released (like the shift down across the iMac and MacBook Pro models) and the drop is permanent. The only time I remember it happening during a product's life cycle was for the 1st gen iPhone. I seem to recall the subsidized price dropping $100 or so a c

    • So TSMC's 28nm is going to be what is behind AMD's and nVidia's next-gen GPUs, despite TSMC's poor handling of 40nm for both companies. Those guys (nVidia in particular) also have first dibs on the production.

      So if they are planning on the A6 from there later in 2012, well, I could see it. Both nVidia and AMD want to launch new GPUs soon. I'm sure they want a Christmas launch, though realistically it'll probably be early next year. OK, well, they do those, tons o' chips are made with the 28nm process, the

      • Are nVidia and AMD willing to finance a fab for TSMC? Apple has a history of paying manufacturers enough in advance for their product to finance the plant and equipment needed to build that product.

        This certainly changes the equation when deciding which client should have priority.
        • Would be far too late for that. If you want 28nm stuff, the fab not only needs to be built now, it needs to be full of equipment and staff, and be producing test runs. Building a fab takes a -long- time. How long? Well, to give you an idea, Intel is already building Fab 42 in Chandler for 14nm processors. Please remember they don't even make 22nm processors yet; however, they are already constructing the fab after that.

          Also in terms of altering contracts for this generation, it is too late. nVidia and AMD a

    • Low stock is not a problem when you have products as hot as Apple's. If there were viable competitors, you might have a point.
    • by Osgeld ( 1900440 )

      This is not really uncommon in the history of Apple.

    • by Kohath ( 38547 )

      Because they have to. Their competitors are using the 28nm tech also. If it works (which is likely, since you can already get chips produced with this tech from a couple companies) then Apple needs to be in on it rather than stuck with an older, slower, hotter, more power-hungry chip. If it fails, then it fails for everyone and Apple is no worse off than their competitors.

  • by A12m0v ( 1315511 ) on Sunday August 28, 2011 @06:27AM (#37232858) Journal

    The A4 and A5 are not even that old.

  • Is TSMC now into doing assembly, in addition to wafers? Since when did it get into the packaging business? I thought that their business model was to ship their wafers to the assembly houses approved of by their customers - in this case, Apple - and that the assembly houses involved would do the packaging for them. From 3D stacked chip, I'm assuming that they'll be stacking multiple die on each other, like in an MCP. What is it in the case of an A6 - 4 basic CPUs just stacked one over the other? Some of the

    • by Osgeld ( 1900440 )

      they are not taking wafers and stacking them up like a club sandwich; it's all on a wafer with multiple planes

      • Stacking of multiple die is always done at die level. Depending on the aspect ratios, it may or may not require spacer chips, which are dummy die between 2 die to enable wire-bonding between the 2. You never stack wafers.
    • by UnknowingFool ( 672806 ) on Sunday August 28, 2011 @12:15PM (#37234694)

      From 3D stacked chip, I'm assuming that they'll be stacking multiple die on each other, like in an MCP.

      Stacked chips have been happening for a long time. The A4 and A5 are stacked with the CPU and the memory on top of each other. Technically there is no reason why they can't stack CPUs on top of each other. Practically, I suspect heat is a problem.

      The other part of the question - iOS - is it something that's as SMP enabled as OS-X is? From what I've seen of i-PADs, they are not multi-tasking OS's at all - all they do is save the state of an app once you exit it, and resume from that point if you return. If that's the case, how does multiple cores help for this case?

      iOS is based on OS X, which is based on BSD, so yes, SMP is there. Your knowledge about iPads is very out of date. The hardware itself is capable of multitasking, as you can play music while surfing the web. The APIs that Apple exposes limit how applications access the multitasking. Fast-switching is the most commonly used model because most applications don't really need to keep running while not being used. However, Apple provided seven different multitasking models [wikipedia.org] in iOS 4, released more than a year ago.

      Finally, Apple can make this chip even better for themselves by moving their macs and airbooks to this processor, so that they have just one CPU platform of their own, making it easier to have a common code base for their apps, like Safari, Mail, et al.

      Except that ARM and x86 instruction sets are not compatible. You can emulate x86 in an ARM environment but it will be painfully slow. Emulating ARM in an x86 environment will work but there's no real point other than coding and debugging for something like iOS.

      • From what I've seen of i-PADs, they are not multi-tasking OS's at all

        You must be a Windows user. Windows users eternally confuse operating systems with interfaces. iOS and Android are true multitasking operating systems. The interface currently restricts focus to one app at a time, but backgrounding apps, as well as being based on BSD and Linux respectively, means that iOS and Android both are true multitasking operating systems.

  • by msauve ( 701917 ) on Sunday August 28, 2011 @07:06AM (#37232976)
    FTA:
    " Given the iPhad's dominant market position, "

    I wonder who slipped that in there?
  • Assume by "stacking" they are referring to (and the article alluded to) something similar to Intel's Tri-Gate transistors?

    http://hothardware.com/News/Intel-Announces-New-22nm-3D-Trigate-Transistors/ [hothardware.com]

    And not simply stacking and interconnecting like this?

    http://www.tomshardware.com/news/rochester-3d-processor,6369.html [tomshardware.com]

  • by Bram Stolk ( 24781 ) on Sunday August 28, 2011 @01:05PM (#37235056) Homepage

    Apple can afford to bring out iPad3 with a CPU that is not much faster than the current one.
    What they cannot afford is stalling GPU performance.

    If rumours are correct, and iPad3 will have a retina display, it will need a lot more shader performance to fill that screen with 3 million pixels. As it is now, it is hard enough to get 60fps on non retina displays with moderately complex OpenGL ES2 shaders.
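    The fill-rate arithmetic behind that point, for scale (resolutions are assumptions from the retina rumor - 2048x1536 - versus the shipping 1024x768 panel, not confirmed specs):

    ```c
    /* Rough fill-rate arithmetic: rumored 2048x1536 "retina" iPad panel
     * versus the shipping 1024x768 one. Figures assumed from the rumor. */
    #include <stdio.h>

    int main(void) {
        long current = 1024L * 768;   /*   786,432 pixels             */
        long retina  = 2048L * 1536;  /* 3,145,728 pixels, ~3 million */
        printf("pixel ratio: %ldx\n", retina / current);    /* 4x per frame */
        printf("pixels/sec at 60 fps: %ld\n", retina * 60); /* ~189 million */
        return 0;
    }
    ```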

    • by narcc ( 412956 )

      If rumours are correct, and iPad3 will have a retina display

      That's what the rumors said about the iPad2 ...

      If you can trust Apple to do anything, it's to provide no more than minor upgrades to their products, as we've seen with every iPhone and iPad so far.

      It doesn't really matter what they release, millions will still buy it.

      • Do you figure if you just make shit up, it somehow gives your point credence?

        • by narcc ( 412956 )

          Sorry, what part of my post do you think I "made up"?

          The iPad2 was rumored to have a retina display, and every new generation of iPhone and iPad has been a fairly minor upgrade in terms of specifications and features.

          Do you think that millions won't buy the next incremental upgrade?

          • Comment removed based on user account deletion
            • by narcc ( 412956 )

              I assume you think that adding a newer display somehow made it no longer a mundane update? I would expect a newer version to have a newer display!

              I know that Steve told you it was "revolutionary", but it really was just another minor update. Take a look at the specs. Somewhat less awesome now, eh?

              • Comment removed based on user account deletion
                • by narcc ( 412956 )

                  You're still on this? Okay, let's add some perspective.

                  Before the iPhone 4, Apple had one of the lowest pixel densities on the market. It's not like they were best-in-class here.

                  The LG Arena, way back in early 2009, had a 311 ppi display.

                  Even earlier, the Xperia X1 from late 2008 had a 311 ppi display.

                  The Apple iPhone 4, in the middle of 2010, had a 326 ppi display.

                  Sure, it was the highest on the market at the time, but only slightly higher than phones that came out more than a year before.

                  So, yeah, it was just
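                  Those ppi figures can be checked with the usual formula, ppi = sqrt(w^2 + h^2) / diagonal. A quick sketch (screen specs assumed from public spec sheets; the nominal 3.5 in diagonal is why the iPhone 4 computes slightly above Apple's quoted 326):

                  ```c
                  /* Pixel-density check for the parent's figures.
                   * ppi = sqrt(w^2 + h^2) / diagonal. Build: cc ppi.c -lm */
                  #include <math.h>
                  #include <stdio.h>

                  static double ppi(double w, double h, double diag) {
                      return sqrt(w * w + h * h) / diag;
                  }

                  int main(void) {
                      printf("LG Arena:  %.0f ppi\n", ppi(480, 800, 3.0)); /* ~311 */
                      printf("Xperia X1: %.0f ppi\n", ppi(480, 800, 3.0)); /* ~311 */
                      printf("iPhone 4:  %.0f ppi\n", ppi(640, 960, 3.5)); /* ~330 */
                      return 0;
                  }
                  ```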

                  • Comment removed based on user account deletion
                    • by narcc ( 412956 )

                      Sample quantities? For multiple, popular, retail products?

                      It seems that you're delusional. Not that it matters to me. Enjoy believing that the next mundane refresh of the iPhone will "change everything again".

                      Perhaps this time they'll have a notification system that works half as well as a five-year-old BlackBerry. It'll be revolutionary!

  • by SethJohnson ( 112166 ) on Sunday August 28, 2011 @01:35PM (#37235230) Homepage Journal
    Apple has been twisting Intel's arm (that IS a pun) about power consumption and threatening to dump their chips in favor of ARM. Another way Intel limits Apple is that Apple's product cycles are tied to Intel's product cycles, which constrains Apple to parity with other laptop vendors. By moving to a homebrewed CPU, Apple would gain even more architectural control and freedom, which would help differentiate Apple products from the competition.

    Funny how it all comes full circle. Apple suffered from having its unique RISC architecture for many years. Then Apple conformed to x86 for just a few years and leveraged that to gain enough market share that it can move back to an independent architecture again.

    Seth
    • The move from PPC to Intel was more about logistics than performance. Apple might have been Motorola's and IBM's most high-profile customer, but they were really a small customer in terms of volume. Due to the nature of Apple's consumer business, their chips would have to be heavily customized, requiring more R&D, and cheaper per unit, as they were intended for consumers. IBM's internal server/workstation division would pay more for PPC chips, as they were intended for higher-end computing. Apple wou

      • Was Apple buying their PPCs from just IBM, or Mot/Freescale as well? I thought the reason Apple dropped PPC was that these two didn't have a long-term roadmap on performance upgrades, which is what Apple was looking for. And since that time, IBM has been improving the power management of the Power considerably, so that today, despite being tops in performance and used for things like SAP, it consumes remarkably little power. Apple might want to consider reinstating servers with the Power7.

        Also, Power is now an open specification, so it's no longer restricted to just IBM and Motorola. Apple could take it to any fab, like TSMC, and have them make what they need.

          Was Apple buying their PPCs from just IBM, or Mot/Freescale as well? I thought the reason Apple dropped PPC was that these two didn't have a long-term roadmap on performance upgrades, which is what Apple was looking for. And since that time, IBM has been improving the power management of the Power considerably, so that today, despite being tops in performance and used for things like SAP, it consumes remarkably little power. Apple might want to consider reinstating servers with the Power7.

          Apple first bought all CPUs from Motorola, then switched to using desktop CPUs from IBM. They still relied on Motorola for mobile G4 chips, and IBM never released a mobile G5; I suspect heat and power consumption were not good enough for laptops. Even if IBM has been improving on power management, Apple is not likely to have their servers on one platform and the rest of their line on another.

          Also, Power is now an open specification, so it's no longer restricted to just IBM and Motorola. Apple could take it to any fab, like TSMC, and have them make what they need.

          Yes, but Apple will have to do all the R&D themselves. They design their own A4 and A5 chips; however, significant portions of those chips, like the ARM and graphics cores, are licensed by Apple but not designed by them. It's not that Apple couldn't do so, but it is a much larger undertaking than designing their iDevice CPUs.

          • Also, Power is now an open specification, so it's no longer restricted to just IBM and Motorola. Apple could take it to any fab, like TSMC, and have them make what they need.

            Yes, but Apple will have to do all the R&D themselves. They design their own A4 and A5 chips; however, significant portions of those chips, like the ARM and graphics cores, are licensed by Apple but not designed by them. It's not that Apple couldn't do so, but it is a much larger undertaking than designing their iDevice CPUs.

            Yeah, but Apple bought a company called PA Semiconductors in 2008, who were doing not an ARM but a PowerPC-based CPU. That company was not making any ARM processors until Apple bought them. So Apple could have taken their PWRficient designs, and designed their iPads, iPods and even Airbooks around it. As it is, OS-X already exists for the PPC, so it would have been a question of updating their recent versions so that it was supported.

            Also, Apple was a far bigger consumer of PPCs than IBM itself - the number of Macs they sold, while low, easily dwarfed the number of IBM Power7 systems sold. Incidentally, did IBM ever migrate their other legacy products to Power7, or do they still make upgrades to those legacy platforms as well?

            At their peak, Apple might have purchased maybe 2-3 million PowerPC CPUs a year from IBM, with 2 million from Motorola. But remember, these chips were cheaper than the workstation/server-grade POWER IBM would buy internally. While it is not clear how many units IBM sold to itself, the systems group sold about $24B worth of hardware and services. IBM also most likely sold more than just CPUs to itself. In the ASIC business, IBM has sold more processors to their other customer, MS, for the Xbox than to Apple, and that processor hasn't changed in years. From the viewpoint of IBM, which customer would you prioritize: Apple, who needs new upgrades every year for cheaper processors and a lot of R&D; IBM, which is internal but willing to pay a lot more per chip; or MS, who needs the same processor year after year?

            X-box would definitely be a bigger customer than Apple: I was thinki

            • Yeah, but Apple bought a company called PA Semiconductors in 2008, who were doing not an ARM but a PowerPC-based CPU. That company was not making any ARM processors until Apple bought them. So Apple could have taken their PWRficient designs, and designed their iPads, iPods and even Airbooks around it. As it is, OS-X already exists for the PPC, so it would have been a question of updating their recent versions so that it was supported.

              I suspect Apple bought out PASemi more for their expertise and patents than for their designs, specifically for mobile devices, not laptops. Again, it's not that Apple can't do it with enough resources or money, but the effort would be quite large. PASemi had maybe 150 engineers; I think Intel employs thousands for chip design. Also bear in mind what Apple wants from PowerPC: laptop and desktop CPUs for consumers. Even IBM, with all their resources and chip expertise, was never able to release laptop CPU
