
Apple and IBM Working Together on 64-bit CPUs

Currawong writes "eWeek reports that IBM Microelectronics is working with Apple on a 64-bit PowerPC processor called the GigaProcessor Ultralite (GPUL). Unlike previous reports, eWeek now says Apple is testing the chip for use in future hardware. IBM apparently also plans to use the processor in Linux-based servers. It's believed IBM will disclose some details of the processor in October at the upcoming Microprocessor Forum in San Jose, California. While this story is similar to recent stories about Apple using Power4-based IBM chips in future Macs, the GPUL, unlike the Power4, is smaller, runs cooler and consumes far less power, making it suitable for desktop machines and small servers. The processor is described as having the same 8-way superscalar design, fully supporting symmetric multiprocessing." We had a previous story about these new chips.
  • Cooler? (Score:4, Interesting)

    by greenhide ( 597777 ) <`moc.ylkeewellivc' `ta' `todhsalsnadroj'> on Friday September 20, 2002 @08:41AM (#4296649)
    the GPUL, unlike the Power4, is smaller, runs cooler and consumes far less power, making it suitable for desktop machines and small servers

    Does anyone know if the chip would actually be cool enough so that it would not require a fan? One of my favorite features of the G4 is that it requires no fan whatsoever. My PowerMac G4 makes so little noise that sometimes it's hard to tell if it's running or not without looking at the little glowing power button on the front.

    I think this is one of the nicest features of Macintosh computers, and if they need to add a fan, I think that will be a real shame. On the other hand, Motorola really hasn't gotten their act together, so Apple may not have a choice.
    • Re:Cooler? (Score:3, Informative)

      by danamania ( 540950 )
      It would be nice, but perhaps we won't be that lucky. The current g4 duals are horrifically loud, compared to their predecessors.

      a grrl & her server [danamania.com]
      • i have a new G4 dual under my desk and i can't hear it unless i put my ear within about three inches of it.

        on the other hand, the thing weighs a ton. closing the side panel feels like slamming a car door.
    • Re:Cooler? (Score:2, Interesting)

      by Beatbyte ( 163694 )
      Not only would it not require a fan, it would keep Apple's laptop offerings from being lap-burning machines.

      Plus, this could yield higher speeds: you could skip the cooling entirely, or add cooling anyway and crank the clock up.

      Either way, their major focus should be getting the speed up higher.

    • Re:Cooler? (Score:2, Informative)

      by GatorMarc ( 451163 )
      Except for some of the CRT iMacs which used convection to cool themselves, every G3 and G4 Mac has at least one fan.

      I have a feeling that although this chip runs cooler, it will still run hotter than the G3, and maybe the current G4.
    • Re:Cooler? (Score:5, Informative)

      by Lumpy ( 12016 ) on Friday September 20, 2002 @09:31AM (#4296967) Homepage
      BAH! The requirement of a fan on the processor is based on very poor heatsink design. Remember, a small plastic fan is cheaper than a large block of copper and aluminum. ANY processor, including the cook-your-egg AMDs, can use a fanless heatsink IF the heatsink is properly designed and sized, AND your case has a proper heat chimney and vents in its design so that convection will promote cooling.

      Using fans is the cheater's way out, or the cheap way out.
  • by sphealey ( 2855 ) on Friday September 20, 2002 @08:46AM (#4296677)
    A key question: will this chip have DRM (aka Digital Rights Reduction) features built-in? If NOT, there could be a good market here for IBM as the free alternative to Intel.

    sPh
    • Without DRM, will Office XI (or whatever the next version of Office for the Mac is called) run on it? Is Microsoft intending to take the 'Palladium' concept to all its products?
      • The next version of Office will run fine on non-Palladium computers and on Palladium computers running in insecure mode. Microsoft doesn't care if you want to distribute your Word docs, PowerPoint presentations, and Excel spreadsheets freely and openly.
    • by SmittyTheBold ( 14066 ) <[deth_bunny] [at] [yahoo.com]> on Friday September 20, 2002 @09:34AM (#4296990) Homepage Journal
      If Apple sticks to their old game, there will be no DRM whatsoever.

      After all, iTunes rips audio into MP3 format instead of some "protected" format. QuickTime does not (IIRC) support DRM, except for (weak) protections on streamed movies to prevent a person from saving the movie.

      Apple has made a market by keeping a user's options open. Closing that up is not a priority for them. The infrastructure to do such things is not only not there, it would take a lot of time to implement. I am sure Apple is more interested in getting a new processor to market than they are in restricting the rights of their target market - content creators.
      • Apple has made a market by keeping a user's options open.
        Apple sells computers with DVD drives, Apple DVD player software, and Firewire ports. Put those facts together, and one very obvious and intuitive and natural capability comes to mind. But it isn't there, on purpose. Apple does what it thinks it needs to do, thus they got a license from DVDCCA which came with ridiculous terms. They either had to do that, or be left behind where DVDs were concerned. Apple chose to survive, which is why they are still around today.

        Here is the future: the dark lord in Redmond is going to create a large unwitting/unwilling installed base of DRM implementations, and there's not a damned thing anyone can do to stop it. Once that installed base exists, then various mass-market media will be made by the "big players" (the ones with all the money, who are able to put asses into seats in theaters worldwide, the ones who can buy slots for radio play) and you can only play it if your computer implements DRM.

        Apple, the company that cares enough about multimedia that they got the studios to release movie trailers in their Quicktime format and the exclusively-licensed-to-Apple Sorensen codec, can either be a part of this or not. They can either throw up their hands and say, "Well, you need to be running Windows on x86/Palladium boxes to play that movie trailer" or they can say, "Yes, of course you can play that music "CD Next Generation" media on Macs too."

        Do you really have the slightest doubt which way they are going to go?

    • by gclef ( 96311 ) on Friday September 20, 2002 @10:13AM (#4297205)
      I was at a talk recently given by one of the security guys from Apple. He was asked about the whole TCPA thing, and his response was that Apple wasn't participating in it at present, and didn't really see what they could offer to it. Unless some sort of TCPA-like thing became law, or unless someone came up with some way for Apple to contribute, they were going to stay out of it.

      So, at least for now, they're staying out of the DRM wars. Of course, this is all subject to management whims, but that's the state as of now.
    • Incidentally... (Score:5, Informative)

      by artemis67 ( 93453 ) on Friday September 20, 2002 @12:07PM (#4298084)
      I don't have a DVD player at home, but I just got the new Monsters, Inc. DVD (yes, I know I need to buy a player, but I'm cheap...). I happened to bring a brand-new ThinkPad home from the office to do some work. No RCA out, just S-Video. Cool, I can work with that.

      So I pull out my S-Video cable, my computer speakers, and subwoofer, and get it all hooked up. Pop in the DVD and play it. Hmm... the TV is mirroring the laptop screen, but the video doesn't show up. After playing around with it for half an hour (and trying two different software players), I finally notice this little warning that says that "Copy protected DVD's will not output to the S-Video port" (or something like that).

      WTF? Why even have a DVD drive and an S-Video port if I can't combine them? Note to everyone: Don't buy a ThinkPad if you think that there's EVER a chance you'll want to play a DVD through the S-Video port. If IBM is so damned concerned about DRM, they need to put a big sticker on the laptop that this is a DRM-enabled system. I guarantee that I will never buy another ThinkPad.

      Anyway, next night, I bring home the Apple PowerBook. Hook everything up, pop in the DVD, hit play. No problemo.
  • or in an Apple Store. I've heard about the G5's for years and I know they are the next best thing. However, seeing is believing.
    • Again, thank you Motorola, for screwing us! I have a small feeling that IBM can be counted on a little more than Motorola, because IBM sells its PowerPC-based chips to more than just Apple, whereas, if I am not mistaken, Motorola only sells to Apple. So when times get tough for Motorola, like they have for the past few years, the R&D for PowerPC chips drops.
      • by sphealey ( 2855 ) on Friday September 20, 2002 @09:17AM (#4296876)
        Again, thank you Motorola, for screwing us! I have a small feeling that IBM can be counted on a little more than Motorola, because IBM sells its PowerPC-based chips to more than just Apple, whereas, if I am not mistaken, Motorola only sells to Apple
        The whole PowerPC thing was one of the most amazing displays of corporate loyalty I have ever heard of. Apple needed a new chip but was unwilling to abandon their historical supplier, so they forced IBM and Motorola to the table and knocked heads until they got a joint production agreement.

        Most companies would have said: "sorry Motorola - you are out of gas. We just signed with Digital (Alpha) [or IBM or Intel]. Thanks for the memories". Instead Apple force-fed the entire PowerPC thing.

        I wonder what their motivation was? And did Apple truly benefit in the long run?

        sPh

    • by Phoukka ( 83589 ) on Friday September 20, 2002 @09:26AM (#4296943)
      Except that the GPUL is not the next best thing. If you read the eWeek article, you'll find that the projected time-line reads, basically, the G5 first and then the next best thing after that. And it is very much up in the air what that next best thing will be. I know that Apple has had a long history of working with IBM and Motorola, and that adds a certain amount of probability to the conjecture that the GPUL will be the next best thing, but the existence of Apple's Marklar project shows that we cannot discount the possibility of a switch to x86 architecture. I think the most likely candidate within the x86 world is AMD's Hammer -- it will be available at desktop-processor-level prices, and will also be available in versions more suitable for servers. Since both markets are areas Apple has targeted, this makes the Hammer more appropriate than, say, a combination of Intel's Pentium4 on desktop and Itanium for servers.

      Again, though, let me reiterate that this is all just conjecture until "The Steve" makes some sort of formal announcement.
      • by BoomerSooner ( 308737 ) on Friday September 20, 2002 @09:36AM (#4297008) Homepage Journal
        So it may be a long wait! I got my G4 Tower a few months ago to see if I even would like Apple OSes. To my delight I love OS X (hell even OS 9) and OS X is everything X-Windows/Linux should have been striving for. I was going to sell my G4 and get a dual 1.25 but the one I have is more than enough for now and 1 to 1 1/2 years isn't too long to wait for the next Mac (besides I've still got to save for the 22" display!).

        I've tried to use Linux on the desktop since 0.98 (Slackware in '96) and never found it to my liking. I don't like to tweak and read man pages for hours; I just want the damn thing to work. That being said, all my company's servers run Linux (killed the SPARC the other day), and being able to sftp/ssh to my servers from a terminal in OS X was great. Plus, using Dreamweaver to do my JSP development makes a great environment.

        Hopefully 1 to 1 1/2 years is all I'll have to wait. I'm patient so I'll start saving now.
  • by Spencerian ( 465343 ) on Friday September 20, 2002 @08:50AM (#4296693) Homepage Journal
    I'm not a processor expert or anything, but this can't spell anything but good competition with Intel (not that they're evil or anything, but they haven't had a reason to make their chips better performers, and no, increasing clock cycles doesn't count). Won't hurt Apple either unless it requires their developers to rewrite stuff (haven't they done this enough already with the Mac OS X transition?)

    Multiple processors in a chip? Good. AltiVec or similar number-crunching in combination? Great. If Apple pursues this, their boxes might--might achieve a performance that easily blows away the still-powerful SGI workstations and their slow-clocks-but-very-powerful processors (MIPS? Alpha? Can't remember right now).

    I hope that some other enterprising company works up a PC mobo that can handle it for those not inclined to Apple products. That would light a fire under Wintel's corporate ass to build something better.
    • SGI makes MIPS processors. Alphas are/were made by HPaq, formerly Compaq, formerly DEC/Digital.
    • I'm not a processor expert or anything, but this can't spell anything but good competition with Intel (not that they're evil or anything, but they haven't had a reason to make their chips better performers, and no, increasing clock cycles doesn't count). Won't hurt Apple either unless it requires their developers to rewrite stuff (haven't they done this enough already with the Mac OS X transition?)

      I think Intel has had one big reason to make their chips better performers: AMD. I don't knock IBM, but the fact of the matter is that IBM hasn't been at the top of the microprocessor curve for a few years, in my opinion. While many systems still use IBM's mainframes, quite a few systems have converted to n-way multi-processing Intel-based architectures. As far as Apple's developers having to rewrite stuff, I believe that most if not all of Jaguar (OS X 10.2) is compiled with gcc3.1 - so, for Apple it would be as simple as ensuring a decent backend to gcc3.x for this new processor (chances are that this is already 'in the works' by IBM).

      Multiple processors in a chip? Good. AltiVec or similar number-crunching in combination? Great. If Apple pursues this, their boxes might--might achieve a performance that easily blows away the still-powerful SGI workstations and their slow-clocks-but-very-powerful processors (MIPS? Alpha? Can't remember right now).

      I'm not sure that SGI has any particular headway any longer. Maybe against certain machines in Apple's lineup, but I know here at my current employer, we've been using SGI Octanes and Octane IIs for heavy duty image processing in our products and we're getting ready to deploy a new architecture based on a dual-Xeon HP box running Linux (to replace Irix which we use on the SGIs). Performance of the image processing applications is unchanged or better and the cost savings to the company are very decent. Incidentally, the SGIs that I know of all use MIPS processors - only machines from Digital (DEC), now Compaq, use Alpha processors, to my knowledge.

      I hope that some other enterprising company works up a PC mobo that can handle it for those not inclined to Apple products. That would light a fire under Wintel's corporate ass to build something better.

      The motherboards used in current Apple products are, for all intents and purposes, 'PC' mobos. They have standard AGP & PCI slots, use PC RAM (DDR at 133MHz or more) and provide connectivity through a number of PC-compatible technologies (Intel's USB bus, IEEE 1394/FireWire, Ethernet, etc.). It's not really a matter of the processor/mobo combo being PC or not; it's a matter of what OS you want to run. You can get a Mac and run most of the popular flavors of Linux on it (notable exception: RedHat). No problem. I'm not sure that much of anything will light a fire under the Wintel monopoly. Just my opinion, though.

  • by xidix ( 594440 ) on Friday September 20, 2002 @08:55AM (#4296726)
    Yes, a new 64-bit PPC processor would be great, because the G4 is really showing its age. But I don't think this will be something to drive Wintel users over to Apple. If anything, it will just help Apple hang on to its existing marketshare.

    The thing to remember is that "switching" is expensive, and not just for the new hardware. When a longtime PC user switches to Apple, they have to replace all of their software with Mac versions (and in a lot of cases, say goodbye to certain titles altogether). A new PPC processor isn't going to make that any less of a reality (unless of course, it allows VirtualPC to run fast enough that it's actually usable).

    A 64-bit PPC would almost assuredly be backwards compatible with 32-bit PPC applications, so for current Apple users it will be a big boost in speed without having to reinvest in all of their software immediately (although, if you want the most speed, you'll eventually need to upgrade to the 64-bit versions of your apps).

    Great news for Apple, but it's not a "Windows killer".
    • Yeah, but to keep some of those titles, imagine how fast VirtualPC could run under this processor!

    • by Matthias Wiesmann ( 221411 ) on Friday September 20, 2002 @09:13AM (#4296853) Homepage Journal
      (although, if you want the most speed, you'll eventually need to upgrade to the 64-bit versions of your apps).
      Why would code that uses 32-bit pointers be slower than code that uses 64-bit pointers?
      Having 64-bit pointers is needed to address more than 4 gigabytes, but why would there be a performance gain? I would think that longer pointers imply moving more data into the CPU, and therefore would consume more memory bandwidth. Am I missing something?
      • You are missing this: random access memory is named that way for a good reason - no matter what memory address the system is accessing, the time to access that memory will be the same; there is no seek time in RAM (or ROM, for that matter). The size of the pointer does not matter since the access is not serial - it is parallel. That is the first point. Second point: yes, for some applications there will be a gain in performance if 64-bit addressing is used; namely, more addressable memory will become available to the application and the OS will have to do less paging. As simple as that.
    • by Graymalkin ( 13732 ) on Friday September 20, 2002 @09:50AM (#4297082)
      There's little to worry about with porting of apps. Unless you've got some seriously processor-dependent assembly in your PPC binary, there's little that will stop it from running on a POWER chip. The PowerPC instruction set is a subset of the POWER one, meaning POWER ostensibly has more instructions than PowerPC does. It is trivial to compile an app for generic PPC code that will run on every PPC chip you can find.

      I don't get what you mean by the G4 "showing its age"; it isn't some ancient chip pulled out of a tar pit. Its performance problems come from the low clock speed and the lack of multiple floating-point pipelines. That is more of an implementation issue than an overall design issue. The Athlon has 3 FP pipelines; the G4 has one. AltiVec is fine if you can find the parallelism it is good at in your code. Most people forgo that effort and stick to simple floating-point operations. Hence the Athlon's high floating-point performance.

      Please people, 64-bits does not equal performance, instructions per second is the important factor. With 8 way superscalar goodness the POWER4 design gets stuff done not with its 64-bit GPRs but the fact it can suck down multiple integer and floating point operations at once and out of order. You've got the potential of 4 FLOPs per cycle in the POWER4, at just 1.25GHz that's 5 GFLOPS of plain old floating point performance. That is twice the Athlon's performance at the same clock speed. A second core would effectively double that rate since the cores on a POWER4 share their L2 cache making them look like a single chip.
  • Not till LATE 2003 (Score:2, Interesting)

    by rgraham ( 199829 )
    Quote(s) from the article:

    Perhaps the most disappointing news for Mac fans, sources said, is that IBM does not expect to be finished with GPUL project until late summer 2003.

    But on the positive side:

    Meanwhile, sources said, the long-awaited PowerPC G5 CPU from Motorola is likely to break cover perhaps as soon as early 2003. The G5, according to published product road maps from Motorola, should be available as 32- and 64-bit products with backward compatibility, though Motorola has provided few additional details.

    As a laptop user I'm curious to know if these new chips will be a viable option (in terms of power and heat). Guess it's a good thing I'm not planning on upgrading for another 12-18 months.
  • Hmmm... (Score:3, Interesting)

    by rgoer ( 521471 ) on Friday September 20, 2002 @08:56AM (#4296736)
    So what is Apple's plan for all this horsepower? It seems that the current 7450/7455 G4 chips have more than enough "under the hood" to comfortably kick the likes of Photoshop and Illustrator around, not to mention the iApps and everybody's favorite, Final Cut Pro. So this news begs the question: where does the GPUL fit into Apple's master plan?

    Perhaps, just perhaps, Apple has something up their sleeve? Like a purchase of Alias|Wavefront to go along with their other recent acquisitions, to fully stack the high-end graphics deck? Or maybe Pro/E has finally gotten their act together and is releasing a Mac client? Or are there going to be some new Xserves based on this chip, so maybe we'll actually see some type of installed base start to grow in the Apple-branded server market?

    Who knows... but as big as this news is (for Apple-heads, at least), the upcoming developments this GPUL (potentially) foreshadows loom much larger.
    • by bill_mcgonigle ( 4333 ) on Friday September 20, 2002 @09:16AM (#4296872) Homepage Journal
      "New processor Z has just been released. Sources say the processor is so fast typical users won't have a need for it, but is expected to be popular among engineering and CAD users."

      I first started reading this line when the 386/25 came out. Replace CAD with 3D Graphics for this decade. Every time a new processor comes around, they say almost exactly the same thing - watch for it in the press. So far the prediction hasn't shown to be true.
      • by starseeker ( 141897 ) on Friday September 20, 2002 @10:48AM (#4297437) Homepage
        Actually, I'd argue for most users the Pentium II was the point where things got fast enough to be usable. For paper writing, email and web, a Pentium II will do just fine. I know because I've been able to do all of these on a Pentium 200, which is significantly slower. (granted I was using Linux, but still.)

        Where processor speed helps in my experience is a) heavy duty mathematical software and b) compiling software. For graphics, acceleration cards do far more than a processor upgrade, and memory is also a common bottleneck (or was - with the really cheap memory we have now I suspect it's less of a problem.) A fast processor can help if you have lots of excess toys running, but for doing your job the Pentium II was when that task was effectively solved.

        There is a reason the computer market is saturating. People don't feel the need to upgrade so much. If they upgrade their software, it may demand more resources, but people don't feel the need to use Office XP or whatever if 97 does the job. And despite what we all think of Microsoft, it does do the job. Hence Microsoft's consideration of subscription licenses - their revenue stream is likely falling off somewhat, or at least not growing as fast.

        Don't confuse Want with Need. From a marketing standpoint they may look the same, but they actually aren't. In a recession we notice that fact more.
    • Re:Hmmm... (Score:3, Funny)

      by banky ( 9941 )
      > So what is Apple's plan for all this horsepower?
      Are you kidding? I guess you haven't used OSX. Just THINK of all the new minimization effects we'll get! Imagine playing a dozen minimized Quicktime movies, all at once, with no dropped frames! Imagine Chimera loading quickly!

    • by edremy ( 36408 ) on Friday September 20, 2002 @09:37AM (#4297012) Journal

      G4 chips have more than enough "under the hood" to comfortable kick the likes of Photoshop and Illustrator around, not to mention the iApps, and everybody's favorite Final Cut Pro.

      You have *got* to be kidding. Enough power for FCP? Dude, I routinely run 30+ minute renders for a 3-minute chunk of video on a 933MHz G4, and I'm not even doing all that much. A few filters, some text generation, a mask or two, and it's walk-away-from-the-machine time.

      Apple could be shipping 8-way 2GHz G4s and it still wouldn't be enough.

      • by thatguywhoiam ( 524290 ) on Friday September 20, 2002 @11:11AM (#4297613)
        You do realize that this sort of thing took hours and hours on a $100,000 Avid previously. And now you're doing it on (approx.) $5k worth of Apple hardware with no special boards or drives.

        I feel your pain, but let's get some real perspective. Video is almost always going to need some sort of rendering, especially when dealing with uncompressed (or nearly) video. That's upwards of 600K per frame, times 30 per second. Just for the data.

        I used to have all these stats for explaining to clients why 'video rendering' always takes so long. My favourite: one minute of Cinepak (old-school!) video requires more math than the Apollo missions did. Sure, it's a whack stat, but it gets the point across, eh?

        The G4 is no slouch. Realtime Video Everything requires a massive bank of DSPs, or a CPU that does not yet live.

    • You've never worked with print-quality media in Photoshop, have you? Never rendered a movie with any amount of special effects in FCP or After Effects? Apply a filter to an image meant to be stuck on an 8-foot-tall poster. The G4, even at its fastest (previously the fastest), is not going to finish this process very quickly. Very few filters take advantage of AltiVec, so you're basically stuck with the G4's single FP pipeline. Next time use the apps in a real environment before saying the G4 is the fastest chip ever made.

      While I'd rather get stuff done on a Mac, as I like the environment ten times better than Windows, if you were going on a raw speed comparison an Athlon MP Windows system is going to mop the floor with even the fastest G4. A lot of software on MacOS is really great, in my opinion; the systems running said software have a lot of room for improvement.
  • by turgid ( 580780 ) on Friday September 20, 2002 @09:01AM (#4296771) Journal
    So, does this mean that to compete, Intel will have to migrate Itanium down to commodity hardware in a hurry? What about recouping their R&D costs, and what about the cooling issues and production costs?
    • I can't see that working out either. There's so much development effort (compilers) needed to get Itanium working well, yet nobody is adopting it. They can't sell such a huge chip cheaply (as you say, "production costs.")

      So maybe Itanium will be a massive abortion. Oh, well. They made a ton of money back when they had no competition and charged whatever they wanted.

    • If you look at Intel's price sheet and subtract off the cost of the large amount of cache, the Itanium 2 is not any more expensive than the P4. I think Intel may be making a killing on the markup on the cache, but we could have reasonably priced Itanium 2s today if there were any demand.

  • is the 2 or 4 cores per die. AFAIK this is the first time a multiple-core die has been used for a consumer-level chip. It is really cool that OS X, with its Unix underpinnings, combined with the great design of the POWER ISA, will be able to handle the transition to 64 bits and the additional thread handling etc. needed for the 2-8 cores per system (assuming Apple will use the 4-way core chips in SMP mode).
    • Currently, OS X's SMP abilities scale only to two processors. If they want to employ a 4-way chip, the OS is going to need some work. Is this a limitation imposed by Mach or BSD? Does BSD scale up to more than 2 chips?
      • by Graymalkin ( 13732 ) on Friday September 20, 2002 @09:32AM (#4296976)
        What are you smoking? The Darwin kernel can scale up to 32 processors. The 2-processor limit is definitely not in the kernel itself. It is actually a problem with the design of the G4. Instead of a point-to-point link to the memory controller, the G4s are on a shared bus. Stick more than two processors on a shared-bus topology like that and your overhead is going to eat any extra performance you can manage to get.
        • Ah, so the limit is in the chip, not the OS. Cool, that's good to know. Thanks!
        • That's the same as the Pentium architecture, but it's never stopped Intel from making 4-way Xeons. All serious SMP machines have crossbar switches, such as MIPS, SPARC, Alpha, IBM POWER and Athlon.
          • The four-way Xeons, IIRC, had two buses with point-to-point links to the memory controller. A pair of chips sat on each bus, but there were two busses for the chips to sit on. Even at the relatively low speeds of the Xeons, four processors on a single bus is far too much overhead. There are a few systems kicking around with up to 32 Xeon chips all linked together with custom crossbar switches and other engineering hackery.
  • Comment removed based on user account deletion
  • by autojive ( 560399 ) on Friday September 20, 2002 @09:10AM (#4296830)

    So I click on the story's link and this is what I see. Interesting, indeed. :-P

    Targeted advertising at its best [mac.com]
  • See macedition.com/nmr/nmr_20020914.php [macedition.com]

    (Disclaimer: Naked Mole Rat Reports [macedition.com] are usually hilarious. But for the first time, on Sept. 14 there was a "guest columnist," who wrote a lame parody of those Nigerian spam messages.)
  • by Anonymous Coward on Friday September 20, 2002 @09:12AM (#4296851)
    I can really see why Apple hates rumor-mongering like this. They go through a lot of trouble to get a machine design done and out in the marketplace, and two weeks later someone posts a rumor somewhere saying "G5 systems will be announced in three months!" so the user goes "well, I was going to buy a new machine, but I don't want to get screwed so I'll wait for the G5".

    This chip's project doesn't even complete until summer 2003; that doesn't even imply it'll be ready to fabricate or be in any kind of production then, even if it DOES pan out to be a useful design. I imagine by tomorrow Macosrumors will be touting it to be in the new uber-G4 to be released next month.

    How long has the G5 been 'almost ready' as far as rumor sites go? Two years now? It's great to spin up your readership with crap like that, but it really does a disservice when it's untrue.

    • well of COURSE any retail-ish company hates rumors.

      that's one of the many reasons Intel's Itanium 1 processor sold, i think, a grand total of 500 systems with the processor in them. the only reason anyone'd buy an Itanium 1 system was for the collector's value. rumors/plans of the Itanium 2 came out and preorders for the Itanium 1 dried up.

      if you can't beat 'em, join 'em. Apple should market their hardware (as a lot of hardcore Apple-ites already do) on the fact that "this hardware should last you X years before you want an upgrade, and Y years before you NEED an upgrade," according to fairly legitimate sources. instill trust and loyalty, rather than try to ward off fear and doubt.
      • by jafac ( 1449 )
        At some point though, Apple's gotta throw us a frickin bone. Something to let us know that the platform has a future. Judging by the course of development on the Hardware side for the past two years, wrt not only bus speed, but CPU development, with AltiVec being practically the ONLY high point, the Macintosh Hardware landscape is incredibly bleak. The only thing selling Macs now on the Hardware side is Gee-Whiz fancy cases, DVD burners, and LCD monitors.

        The SOFTWARE story, on the other hand, is BRILLIANT. But what the fuck are you going to run this tremendously asskicking OS on in 5 years?

        I don't give a crap what the rumor sites say - I'm *not* going to invest $3500 in a pro Mac until Apple brings its system architecture into the 21st century. I'm talking about bus bandwidth. I don't care if I have to squeeze another two years of life out of my heavily upgraded Beige G3. Apple's not getting my money until they offer a system that's worth it to me.

        If I see developments - rumors, in the positive direction, I'm more likely to wait for the worthy upgrade, than I am to say "FUCK Steve Jobs, I'm building an AMD box, and running Linux". It's as simple as that. A platform that has a future, that I can afford, versus one that does not have a future, that I can't buy at any price.
    • It's called the Osborne effect. Look up the history of the Osborne Computer Corporation. Apple is one of the two companies that completely ate up their market. Coincidentally, IBM is the other.
    • Sure, there's money to be made manipulating the flow of information to influence buying decisions. Just don't expect consumers to thank you for it afterwards.
    • There is an easy solution to that: openness. If Apple openly discussed the direction they were taking, what their future plans were, and what products they intended to ship and when... rumor sites would become rather pointless. If Apple doesn't like rumors, they can kill them instantly.

      Take Intel, for example: the public knows pretty clearly where Intel is headed, and when they change directions they announce it publicly. Microsoft is actually another good example of this; perhaps they go a bit overboard and talk about vaporware as if it were a shipping product, but at least you can't claim you don't know what they are thinking about.
    • Well. MOSR [macosrumors.com] is the only site that's both constantly incorrect and constantly paid attention to. There are other sites that are just as full of crap, but MOSR gets attention... well... because they act like they get attention.

      The frustrating thing with MOSR is that they never seem to fucking learn. They might always have well-placed sources for their info, but... those sources are so overly optimistic that they consistently make MOSR look like idiots.

      ThinkSecret [thinksecret.com] and MacRumors [macrumors.com] are both much better rumor sites, and I don't believe that they detract from Apple's sales in the slightest. Nick DePlume of ThinkSecret seems to care enough about accuracy that he doesn't make many long-distance predictions. I've never seen him be very incorrect. His steadfast accuracy has lately made me reconsider purchasing a PC desktop, because he says ATI is working on an All-in-Wonder card for the Mac. I believe him completely.

      MacRumors has a much higher volume of information, so sometimes they come up with crap, but they never make it sound more authoritative than it is. They don't act like you can bet the farm on their information.

      At this point, MOSR needs to curl up and die. Back in the day, they had enough viewers and sources that they could have been the premier rumor site indefinitely, even with Jobs' crackdown on leaks. But their BS predictions (and crappy management) probably alienated as many sources as they did readers. So now those sources go to ThinkSecret.
  • Apple is working with IBM? I guess Steve Jobs doesn't think IBM is "Big Brother" any more, or maybe he has joined them, and we can now call him "little brother."
    • umm... Apple has been working with IBM for over 10 years now. Apple, IBM, and Motorola jointly developed the PowerPC processor. IBM manufactures most (if not all) of the G3 processors currently being used by Apple.
    • Umm, PowerPC was the AIM group: Apple, IBM, Motorola. Anybody remember Taligent and Pink? No? Apple and IBM would probably prefer it that way. IBM also backed what is now StarOffice (and OpenOffice) as an MS Office competitor. Apple was one of the first ports after OS/2.

      My impression is that early Apple saw IBM as too big and slow to hurt "cool" Apple. In later years they saw IBM as an ally, kind of the Big Elephant that can take the incoming shots while Apple scurries behind its protection. I can't recall any animosity between them.
  • by smagoun ( 546733 ) on Friday September 20, 2002 @09:42AM (#4297035) Homepage
    The really important part here is that Apple would be using a new bus with these machines. What the bus is doesn't matter so much as the fact that it's not the MaxBus, which is what the G4 and its ilk use. MaxBus is designed for routers and other embedded apps, not high-performance desktop computers. Currently MaxBus runs at 167MHz, which is about as far as Motorola is willing to push it (167MHz single-pumped, mind you). As a result, even a single G4 can more than saturate the bus, and the dualies spend a *lot* of time idling (they share one memory bus). Big caches help the problem, but there's still a fundamental issue.

    Even if the new chips are clock-for-clock identical to the current G4, the mere fact that they're running on a newer bus will make the machines much more powerful.

    For more info about this, head over to Ars and check out the posts in the Mac Achaia by BadAndy from earlier this summer ("Altivec, anyone?" I think it was titled). He knows a hell of a lot more about this stuff than I do; it makes for fascinating reading, and you can really understand why faster CPUs alone won't cut it for Apple.
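    A rough back-of-the-envelope sketch of the point above: assuming a 64-bit-wide data path single-pumped at 167MHz (the exact bus width here is my assumption, not stated in the comment), the theoretical peak bandwidth is easy to compute, and two CPUs sharing the same memory bus can at best split it.

```python
# Illustrative arithmetic only; bus width is an assumed figure.
BUS_WIDTH_BYTES = 8         # 64-bit data path (assumption)
BUS_CLOCK_HZ = 167_000_000  # single-pumped: one transfer per clock

peak = BUS_WIDTH_BYTES * BUS_CLOCK_HZ  # bytes/second
print(f"theoretical peak: {peak / 1e9:.2f} GB/s")

# Two CPUs on one shared memory bus split that peak when both
# are memory-bound, so each sees at most half:
for cpus in (1, 2):
    print(f"{cpus} CPU(s): {peak / cpus / 1e9:.2f} GB/s each, best case")
```

    Under those assumptions the shared bus tops out around 1.3 GB/s total, which is why a faster CPU alone does little for a dual-processor machine starved by the same bus.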

  • Speaking as a lukewarm Apple fan & potential switcher, this sounds cool... but so have most of the daily "ray of hope" rumors that serious Apple fans have been kicking around for years.

    IBM has known for many years that an Intel/MS monopoly ain't good for IBM. (Anyone recall OS/2 for PowerPC?) Pumping up Apple with better CPUs would be good strategy, even if they make no money on the chips. But what's taken them so long?

    My impression is that Motorola's attitude & situation are so bad that Apple couldn't get much out of 'em with "we'll switch to IBM" threats.

    Now if someone can actually SHIP substantial quantities of non-defective chips BEFORE Intel is cranking out Pentium 6's & Itanium 4's at 10GHz...
  • by Billly Gates ( 198444 ) on Friday September 20, 2002 @10:41AM (#4297394) Journal
    My next computer will be a Mac. I never would have thought I would say this. The problem is that my used Pentium III 700 from June 2000 is almost as powerful as the current low-end Power Macs. Very pathetic.

    Motorola has no one to blame but themselves for this. If they had innovated and tried to keep up with the industry like everyone else, they would not have had this problem. They figured Mac users are suckers and will always buy anyway, so who cares. They guessed wrong.

    Believe it or not, consumers do look at the MHz rating as an indicator of performance and value for what they are paying. Some even look at the MHz rating for internet speed! If they see an expensive box with a low MHz rating, they will just shake their heads and move on to another PC. Consumers aren't real bright, and Apple needs to boost the MHz speed on these new chips, not just have them perform fast. Palladium scares the hell out of me and I want no part in it.

    Kudos to Apple. As soon as Palladium is out and these babies find their way into PowerBooks, I will be one of your first customers.

    Also, Mac OS X is one of the easiest versions of Unix out there! No RPM hell, no spending hours configuring text files, no waiting for Gentoo to compile everything, and all of the binaries, like on Windows, include the dependencies. I will still keep a copy of Linux around for the hell of it, but I would love Mac OS X!

  • by g4dget ( 579145 ) on Friday September 20, 2002 @01:21PM (#4298768)
    "Giga" is so 20th century. It has the ring of Dr. Evil's "One Million Dollars" to it (imagine backwards pinky to corner of mouth). The new marketing-compliant prefix is "Peta". Please take note.
