Apple and IBM Working Together on 64-bit CPUs
Currawong writes "eWeek reports that IBM Microelectronics is working with Apple on a 64-bit PowerPC processor called the GigaProcessor Ultralite (GPUL). Unlike previous reports, eWeek now reports that Apple is testing the chip for use in future hardware. IBM apparently also plans to use the processor in Linux-based servers. It's believed IBM will disclose some details of the processor in October at the upcoming Microprocessor Forum in San Jose, California. While this story is similar to recent stories about Apple using Power4-based IBM chips in future Macs, the GPUL, unlike the Power4, is smaller, runs cooler and consumes far less power, making it suitable for desktop machines and small servers. The processor is described as having the same 8-way superscalar design as the Power4 and fully supporting symmetric multiprocessing." We had a previous story about these new chips.
Cooler? (Score:4, Interesting)
Does anyone know if the chip would actually be cool enough so that it would not require a fan? One of my favorite features of the G4 is that it requires no fan whatsoever. My PowerMac G4 makes so little noise that sometimes it's hard to tell if it's running or not without looking at the little glowing power button on the front.
I think this is one of the nicest features of Macintosh computers, and if they need to add a fan I think that will be a real shame. On the other hand, Motorola really hasn't gotten their act together, so Apple may not have a choice.
Re:Cooler? (Score:3, Informative)
a grrl & her server [danamania.com]
Re:Cooler? (Score:2)
on the other hand, the thing weighs a ton. closing the side panel feels like slamming a car door.
Re:Cooler? (Score:2, Interesting)
Plus, this could yield higher speeds: the chip may not need the cooling, but you could add cooling anyway and crank the speed up.
Either way, their major focus should be getting the speed up higher.
Re:Cooler? (Score:2, Informative)
I have a feeling that although this chip runs cooler, it will still be hotter than the G3, and maybe the current G4.
Re:Cooler? (Score:5, Informative)
Using fans is the cheater's way out, or the cheap way out.
Re:Cooler? (Score:2)
Re:Actually the new Dual Systems have a fan it's (Score:2, Informative)
I like to leave it on all the time so that I can access my files from elsewhere without having to carry any form of media (e.g. floppy / CD-R / Zip disk), but if either my girlfriend or I want to work (old-fashioned pen and paper) at the desk, we really have to turn it off.
This is why people have a problem with fans - they are just too loud, even when they are quiet. A silent computer is a much more attractive idea. Obviously different people have different thresholds, but in a small apartment, your threshold is often lower.
Re:Computer noise, it's not so bad (Score:3, Interesting)
A bit off topic perhaps, but some people I know find they can get their young babies to go back to sleep by playing a recording of, say, a vacuum cleaner. Apparently the white noise is supposed to be similar to the sound of the womb.
I'm not at all sure what this says about you. Perhaps you want to go back...
To return to the topic, I find that I don't notice the "jet engine like" whine of my PC until I turn it off. It's then that I appreciate the peace and quiet. Frankly I'm all for more efficient CPUs.
Simon
Re:Computer noise, it's not so bad (Score:4, Funny)
I'm not at all sure what this says about you. Perhaps you want to go back...
Been trying non-stop since I was 14.
Re:Actually the new Dual Systems have a fan it'sOT (Score:5, Interesting)
The whine isn't bad until you realize you used to watch TV on 12, and now it's got to be 15.
In fact, our whole world (mine, anyway) is like this - far more noise than we were intended to hear regularly, and it slowly causes us to lose frequencies and ranges...
Do you find yourself trying to figure out what people said?
Convection only (Score:2)
If we used a central "convection column", put the processor low in the box (it would need a stand like the G4 Cube to allow airflow underneath), and positioned the other components around the column, we might be able to do it.
Of course, if you just want to leave out fans, and don't want to explore liquid cooling, you could use Peltier effect [kent.edu] (Ars Technica [arstechnica.com] has some details) coolers with heat sinks and the "convection column", or a heat distribution "tree" that spreads heat out along sinks until it can be expelled along the case sides...
It's possible, it would just take more effort than many are interested in.
Of course, you could always pipe Central Air into your case...
Re:Is that the only way you can tell? (Score:2)
Re:Cooler? (Score:2)
Hence, "non laptop"
Will it have DRM built-in? (Score:5, Interesting)
sPh
Re:Will it have DRM built-in? (Score:2)
Re:Will it have DRM built-in? (Score:2)
Re:Will it have DRM built-in? (Score:2)
First off, there is pretty good support for games under Mac, far better than on most platforms. No, it's nowhere near as strong as the support on PCs, but if you like to play the occasional game, a Mac isn't a bad choice. It's really the hardcore gamer who would be unsatisfied with a Mac.
But the bigger issue is that I'd tend to believe well over 75% of the home market would be satisfied with no (or lousy) game support. I see no evidence that game support is anywhere near that important for home PC sales. If it were, you'd see all sorts of game / computer bundles selling standard with most systems (yes, I know there are some bundles like this). Instead you see office productivity / computer bundles primarily. If the OS and office suites are what ship standard with PCs, then that is a pretty good indication that these are the core software needs of the users.
Re:Will it have DRM built-in? (Score:5, Insightful)
After all, iTunes rips audio into the MP3 format instead of some "protected" format. QuickTime does not (IIRC) support DRM, except for (weak) protections on streamed movies to prevent a person from saving the movie.
Apple has made a market by keeping a user's options open. Closing that up is not a priority for them. The infrastructure to do such things is not only not there, it would take a lot of time to implement. I am sure Apple is more interested in getting a new processor to market than they are in restricting the rights of their target market - content creators.
Apple will bend over and lube up when they need to (Score:3, Insightful)
Here is the future: the dark lord in Redmond is going to create a large unwitting/unwilling installed base of DRM implementations, and there's not a damned thing anyone can do to stop it. Once that installed base exists, then various mass-market media will be made by the "big players" (the ones with all the money, who are able to put asses into seats in theaters worldwide, the ones who can buy slots for radio play) and you can only play it if your computer implements DRM.
Apple, the company that cares enough about multimedia that they got the studios to release movie trailers in their QuickTime format and the exclusively-licensed-to-Apple Sorenson codec, can either be a part of this or not. They can either throw up their hands and say, "Well, you need to be running Windows on x86/Palladium boxes to play that movie trailer," or they can say, "Yes, of course you can play that 'CD Next Generation' music media on Macs too."
Do you really have the slightest doubt which way they are going to go?
Re:Will it have DRM built-in? (Score:5, Interesting)
So, at least for now, they're staying out of the DRM wars. Of course, this is all subject to management whims, but that's the state as of now.
Incidentally... (Score:5, Informative)
So I pull out my S-Video cable, my computer speakers, and subwoofer, and get it all hooked up. Pop in the DVD and play it. Hmm... the TV is mirroring the laptop screen, but the video doesn't show up. After playing around with it for half an hour (and trying two different software players), I finally notice this little warning that says that "Copy protected DVD's will not output to the S-Video port" (or something like that).
WTF? Why even have a DVD drive and an S-Video port if I can't combine them? Note to everyone: Don't buy a ThinkPad if you think that there's EVER a chance you'll want to play a DVD through the S-Video port. If IBM is so damned concerned about DRM, they need to put a big sticker on the laptop saying that this is a DRM-enabled system. I guarantee that I will never buy another ThinkPad.
Anyway, next night, I bring home the Apple PowerBook. Hook everything up, pop in the DVD, hit play. No problemo.
Come'on, dude (Score:2)
sPh
Re:Will it have DRM built-in? (Score:3, Insightful)
Re:Will it have DRM built-in? (Score:2)
I'll believe it when it's on the shelf at CompUSA (Score:2, Insightful)
Re:I'll believe it when it's on the shelf at CompU (Score:2, Insightful)
Re:I'll believe it when it's on the shelf at CompU (Score:5, Interesting)
Most companies would have said: "sorry Motorola - you are out of gas. We just signed with Digital (Alpha) [or IBM or Intel]. Thanks for the memories". Instead Apple force-fed the entire PowerPC thing.
I wonder what their motivation was? And did Apple truly benefit in the long run?
sPh
Re:I'll believe it when it's on the shelf at CompU (Score:2)
sPh
Re:I'll believe it when it's on the shelf at CompU (Score:2, Insightful)
What's most important here, I think, is that Intel/Windows has created a culture that believes that when Intel releases a new CPU, everyone needs to upgrade. This is great for Intel, as it guarantees an ROI for their research.
The Mac crowd, however, is not like this. Mac owners will typically keep their Macs for 3-5 yrs w/o upgrading. OS X isn't doing much to change that, as every release of OS X is progressively faster than the previous release on the same hardware. While people may need to upgrade now to take advantage of OS X's best features, an upgrade now will mean no more upgrades for the next few years.
I think Motorola was aware of this and realized that for the amount of R&D they needed to compete effectively with Intel/AMD, they weren't able to sell enough CPUs to make up for the cost of bringing a new chip to market.
Just my thoughts, though
Re:I'll believe it when it's on the shelf at CompU (Score:2)
Not quite the next best thing. (Score:4, Insightful)
Again, though, let me reiterate that this is all just conjecture until "The Steve" makes some sort of formal announcement.
The funny thing is I'm going to wait for a G5 (Score:4, Insightful)
I've tried to use Linux on the desktop since 0.98 (Slackware in '96) and never found it to my liking. I don't like to tweak and read man pages for hours, I just want the damn thing to work. That being said, all my company's servers run Linux (killed the SPARC the other day) and being able to sftp/ssh to my servers from a terminal in OS X was great. Plus using Dreamweaver to do my JSP development makes a great environment.
Hopefully 1 to 1 1/2 years is all I'll have to wait. I'm patient so I'll start saving now.
Big News for the Whole Industry (Score:3, Insightful)
Multiple processors in a chip? Good. AltiVec or similar number-crunching in combination? Great. If Apple pursues this, their boxes might--might achieve a performance that easily blows away the still-powerful SGI workstations and their slow-clocks-but-very-powerful processors (MIPS? Alpha? Can't remember right now).
I hope that some other enterprising company works up a PC mobo that can handle it for those not inclined to Apple products. That would light a fire under Wintel's corporate ass to build something better.
Re:Big News for the Whole Industry (Score:2)
Re:Big News for the Whole Industry (Score:3, Informative)
I think Intel has had one big reason to make their chips better performers: AMD. I don't knock IBM, but the fact of the matter is that IBM hasn't been at the top of the microprocessor curve for a few years, in my opinion. While many systems still use IBM's mainframes, quite a few systems have converted to n-way multi-processing Intel-based architectures. As far as Apple's developers having to rewrite stuff, I believe that most if not all of Jaguar (OS X 10.2) is compiled with gcc3.1 - so, for Apple it would be as simple as ensuring a decent backend to gcc3.x for this new processor (chances are that this is already 'in the works' by IBM).
I'm not sure that SGI has any particular lead any longer. Maybe against certain machines in Apple's lineup, but I know here at my current employer, we've been using SGI Octanes and Octane IIs for heavy-duty image processing in our products and we're getting ready to deploy a new architecture based on a dual-Xeon HP box running Linux (to replace IRIX, which we use on the SGIs). Performance of the image processing applications is unchanged or better and the cost savings to the company are very decent. Incidentally, the SGIs that I know of all use MIPS processors - only machines from Digital (DEC), now Compaq, use Alpha processors, to my knowledge.
The motherboards used in current Apple products are, for all intents and purposes, 'PC' mobos. They have standard AGP & PCI slots, use PC RAM (DDR at 133MHz or more) and provide connectivity through a number of PC-compatible technologies (Intel's USB bus, IEEE 1394/FireWire, Ethernet, etc.). It's not really a matter of the processor/mobo combo being PC or not, it's a matter of what OS you want to run. You can get a Mac and run most of the popular flavors of Linux on it (notable exception: RedHat). No problem. I'm not sure that much of anything will light a fire under the Wintel monopoly. Just my opinion, though.
I don't see the landscape changing too much... (Score:3, Interesting)
The thing to remember is that "switching" is expensive, and not just for the new hardware. When a longtime PC user switches to Apple, they have to replace all of their software with Mac versions (and in a lot of cases, say goodbye to certain titles altogether). A new PPC processor isn't going to make that any less of a reality (unless of course, it allows VirtualPC to run fast enough that it's actually usable).
A 64-bit PPC would almost assuredly be backwards compatible with 32-bit PPC applications so for current Apple users, it will be a big boost in speed without having to reinvest in all of their software immediately (although, if you want the most speed, you'll eventually need to upgrade to the 64-bit versions of your apps).
Great news for Apple, but it's not a "Windows killer".
Re:I don't see the landscape changing too much... (Score:2)
Yeah, but to keep some of those titles, imagine how fast VirtualPC could run under this processor!
Re:I don't see the landscape changing too much... (Score:5, Interesting)
Having 64-bit pointers is needed to address more than 4 gigabytes, but why would there be a performance gain? I would think that longer pointers imply moving more data into the CPU, and therefore would consume more memory bandwidth. Am I missing something?
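As a rough illustration of the bandwidth point above (an added sketch, not from the original post), the little C program below just prints the pointer and long sizes a compiler gives you; under a typical 32-bit ABI they come out as 4 bytes, and under an LP64-style 64-bit ABI as 8 bytes, so pointer-heavy data takes roughly twice the space in caches and on the bus.

    #include <stdio.h>

    /* Sketch only: shows how pointer width grows under a 64-bit ABI,
     * which is the extra data the parent post is worried about. */
    int main(void)
    {
        printf("sizeof(void *) = %u bytes\n", (unsigned) sizeof(void *));
        printf("sizeof(long)   = %u bytes\n", (unsigned) sizeof(long));
        /* Typically prints 4/4 when built 32-bit, 8/8 when built LP64. */
        return 0;
    }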
Re:I don't see the landscape changing too much... (Score:2)
Re:I don't see the landscape changing too much... (Score:3, Informative)
Re:I don't see the landscape changing too much... (Score:4, Informative)
I don't get what you mean by the G4 "showing its age"; it isn't some ancient chip pulled out of a tar pit. Its performance problems come from the low clock speed and the lack of multiple floating point pipelines. That is more of an implementation issue than an overall design issue. The Athlon has 3 FP pipelines, the G4 has one. AltiVec is fine if you can find the parallelism it is good at in your code. Most people forgo that effort and stick to simple floating point operations. Hence the Athlon's high floating point performance.
Please people, 64 bits does not equal performance; instructions per second is the important factor. With 8-way superscalar goodness the POWER4 design gets stuff done not with its 64-bit GPRs but with the fact that it can suck down multiple integer and floating point operations at once and out of order. You've got the potential of 4 FLOPs per cycle in the POWER4; at just 1.25GHz that's 5 GFLOPS of plain old floating point performance. That is twice the Athlon's performance at the same clock speed. A second core would effectively double that rate, since the cores on a POWER4 share their L2 cache, making them look like a single chip.
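To sanity-check those numbers, here is an added back-of-the-envelope sketch. It assumes two FPUs per core, each retiring one fused multiply-add (2 FLOPs) per cycle, which is where the "4 FLOPs per cycle" figure above would come from; the FPU count is an assumption, not something stated in the post.

    #include <stdio.h>

    /* Peak-FLOPS arithmetic only; the per-core FPU count is an assumption. */
    int main(void)
    {
        double clock_ghz      = 1.25; /* clock used in the post   */
        double fpus_per_core  = 2.0;  /* assumed FPUs per core    */
        double flops_per_fma  = 2.0;  /* multiply + add per FMA   */
        double cores_per_chip = 2.0;  /* POWER4-style dual core   */

        double per_core = clock_ghz * fpus_per_core * flops_per_fma;
        printf("peak per core: %.1f GFLOPS\n", per_core);                  /* 5.0  */
        printf("peak per chip: %.1f GFLOPS\n", per_core * cores_per_chip); /* 10.0 */
        return 0;
    }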
Re:I don't see the landscape changing too much... (Score:2)
Yeah, thank God I can still play Master of Magic on Win2k. I haven't tried DOSEMU under Linux yet, does anyone know if it works?
Jaysyn
Not till LATE 2003 (Score:2, Interesting)
But on the positive side:
As a laptop user I'm curious to know if these new chips will be a viable option (in terms of power and heat). Guess it's a good thing I'm not planning on upgrading for another 12-18 months.
Hmmm... (Score:3, Interesting)
Perhaps, just perhaps, Apple has something up its sleeve? Like a purchase of Alias|Wavefront to go along with their other recent acquisitions, and fully stack the high-end graphics deck? Or maybe Pro/E has finally gotten their act together and is releasing a Mac client? Or are there going to be some new Xserves based on this chip, and maybe we'll actually see some type of installed base start to grow in the Apple-branded server market.
Who knows... but as big as this news is (for Apple-heads, at least), the upcoming developments this GPUL (potentially) foreshadows loom much larger.
You're kidding, right? (Score:5, Interesting)
I first started reading this line when the 386/25 came out. Replace CAD with 3D graphics for this decade. Every time a new processor comes around, they say almost exactly the same thing - watch for it in the press. So far the prediction hasn't been shown to be true.
Re:You're kidding, right? (Score:5, Informative)
Where processor speed helps, in my experience, is a) heavy-duty mathematical software and b) compiling software. For graphics, acceleration cards do far more than a processor upgrade, and memory is also a common bottleneck (or was - with the really cheap memory we have now I suspect it's less of a problem). A fast processor can help if you have lots of excess toys running, but for doing your job, the Pentium II was the point at which that task was effectively solved.
There is a reason the computer market is saturating. People don't feel the need to upgrade so much. If they upgrade their software, it may demand more resources, but people don't feel the need to use Office XP or whatever if 97 does the job. And despite what we all think of Microsoft, it does do the job. Hence Microsoft's consideration of subscription licenses - their revenue stream is likely falling off somewhat, or at least not growing as fast.
Don't confuse Want with Need. From a marketing standpoint they may look the same, but they actually aren't. In a recession we notice that fact more.
Re:Hmmm... (Score:3, Funny)
Are you kidding? I guess you haven't used OSX. Just THINK of all the new minimization effects we'll get! Imagine playing a dozen minimized Quicktime movies, all at once, with no dropped frames! Imagine Chimera loading quickly!
Don't use FCP, do you? (Score:5, Interesting)
G4 chips have more than enough "under the hood" to comfortably kick the likes of Photoshop and Illustrator around, not to mention the iApps, and everybody's favorite, Final Cut Pro.
You have *got* to be kidding. Enough power for FCP? Dude, I routinely run 30+ minute renders for a 3 minute chunk of video on a 933MHz G4, and I'm not even doing all that much. A few filters, some text generation, a mask or two and it's walk away from the machine time.
Apple could be shipping 8-way 2GHz G4s and it still wouldn't be enough.
Re:Don't use FCP, do you? (Score:5, Insightful)
I feel your pain, but let's get some real perspective. Video is almost always going to need some sort of rendering, especially when dealing with uncompressed (or nearly uncompressed) video. That's upwards of 600K per frame, times 30 per second. Just for the data.
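Working out those round numbers (an added sketch, using the post's figures of roughly 600 KB per frame at 30 frames per second, before any filtering or codec work):

    #include <stdio.h>

    /* Raw data-rate arithmetic from the figures above; no codec overhead. */
    int main(void)
    {
        double kb_per_frame = 600.0;
        double fps          = 30.0;

        double mb_per_sec = kb_per_frame * fps / 1024.0;   /* ~17.6 MB/s */
        double gb_per_min = mb_per_sec * 60.0 / 1024.0;    /* ~1 GB/min  */

        printf("about %.1f MB/s, or %.1f GB per minute of video\n",
               mb_per_sec, gb_per_min);
        return 0;
    }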
I used to have all these stats for explaining to clients why 'video rendering' always takes so long. My favourite: one minute of Cinepak (old-school!) video requires more math than the Apollo missions did. Sure, it's a whack stat, but it gets the point across, eh?
The G4 is no slouch. Realtime Video Everything requires a massive bank of DSPs, or a CPU that does not yet live.
Re:Hmmm... (Score:2)
While I'd rather get stuff done on a Mac, as I like the environment ten times better than Windows, if you were going on a raw speed comparison an Athlon MP Windows system is going to mop the floor with even the fastest G4. A lot of software on MacOS is really great in my opinion; the systems running said software have a lot of room for improvement.
itanium in commodity hardware? (Score:3, Interesting)
Re:itanium in commodity hardware? (Score:2)
So maybe Itanium will be a massive abortion. Oh, well. They made a ton of money back when they had no competition and charged whatever they wanted.
Re:itanium in commodity hardware? (Score:3, Insightful)
The interesting thing to me (Score:2)
OS X only handles dual processors (Score:2)
Re:OS X only handles dual processors (Score:5, Informative)
Re:OS X only handles dual processors (Score:2)
Re:OS X only handles dual processors (Score:2, Informative)
Re:OS X only handles dual processors (Score:2)
Re: (Score:2)
Targeted advertising at its best. (Score:3, Funny)
So I click on the story's link and this is what I see. Interesting, indeed.
Targeted advertising at its best [mac.com]
This story was broken in the Naked Mole Rat Report (Score:2, Informative)
(Disclaimer: Naked Mole Rat Reports [macedition.com] are usually hilarious. But for the first time, on Sept. 14 there was a "guest columnist," who wrote a lame parody of those Nigerian spam messages.)
I can see why Apple hates rumors (Score:5, Insightful)
This chip's project isn't even complete until summer 2003, and that doesn't even imply it'll be ready to fabricate or be in any kind of production then, even if it DOES pan out to be a useful design. I imagine by tomorrow Macosrumors will be touting it to be in the new uber-G4 to be released next month.
How long has the G5 been 'almost ready' as far as rumor sites go? Two years now? It's great to spin up your readership with crap like that, but it really does a disservice when it's untrue.
Re:I can see why Apple hates rumors (Score:2)
That's one of the many reasons Intel's Itanium 1 processor sold, I think, a grand total of 500 systems with the processor in them. The only reason anyone would buy an Itanium 1 system was for the collector's value. Rumors/plans of Itanium 2 came out and preorders for Itanium 1 dried up.
If you can't beat 'em, join 'em. Apple should market their hardware (as a lot of hardcore Apple-ites already do) on the fact that "this hardware should last you X years before you want an upgrade, and Y years before you NEED an upgrade, according to fairly legitimate sources." Instill trust and loyalty, rather than try to ward off fear and doubt.
show me the future! (Score:3, Insightful)
The SOFTWARE story, on the other hand, is BRILLIANT. But what the fuck are you going to run this tremendously asskicking OS on in 5 years?
I don't give a crap what the rumor sites say - I'm *not* going to invest $3500 in a pro Mac until Apple brings its system architecture into the 21st century. I'm talking about bus bandwidth. I don't care if I have to squeeze another two years of life out of my heavily upgraded Beige G3. Apple's not getting my money until they offer a system that's worth it to me.
If I see developments - rumors - in the positive direction, I'm more likely to wait for the worthy upgrade than I am to say "FUCK Steve Jobs, I'm building an AMD box and running Linux". It's as simple as that. A platform that has a future, that I can afford, versus one that does not have a future, that I can't buy at any price.
Re:I can see why Apple hates rumors (Score:2)
Re:I can see why Apple hates rumors (Score:2)
Re:I can see why Apple hates rumors (Score:2)
Take Intel, for example: the public knows pretty clearly where Intel is headed, and when they change directions they announce it publicly. Microsoft is actually another good example of this; perhaps they go a bit overboard and talk about vaporware as if it were a shipping product, but at least you can't claim you don't know what they are thinking about.
Re:I can see why Apple hates rumors (Score:3, Interesting)
The frustrating thing with MOSR is that they seem to never fucking learn. They might always have well placed sources for their info, but... those sources are so overly optimistic that they consistently make MOSR look like idiots.
ThinkSecret [thinksecret.com] and MacRumors [macrumors.com] are both much better rumor sites, and I don't believe that they detract from Apple's sales in the slightest. Nick DePlume of Thinksecret seems to care enough about accuracy that he doesn't make many long-distance predictions. I've never seen him be very incorrect. His steadfast accuracy has lately made me reconsider purchasing a PC desktop, because he says ATI is working on an All-in-Wonder card for the Mac. I believe him completely.
MacRumors has a much higher volume of information, so sometimes they come up with crap, but they never make it sound more authoritative than it is. They don't act like you can bet the farm on their information.
At this point, MOSR needs to curl up and die. Back in the day, they had enough viewers and sources that they could have been the premier rumor site indefinitely, even with Jobs' crackdown on leaks. But their BS predictions (and crappy management) probably alienated as many sources as they did readers. So now those sources go to Thinksecret.
1984? (Score:2)
Re:1984? (Score:2)
Re:1984? (Score:2)
My impression is that early Apple saw IBM as too big and slow to hurt "cool" Apple. In later years they saw IBM as an ally, kind of the Big Elephant that can take the incoming shots while Apple scurries behind its protection. I can't recall any animosity between them.
new bus is the interesting part (Score:5, Interesting)
Even if the new chips are clock-for-clock identical to the current G4, the mere fact that they're running on a newer bus will make the machines much more powerful.
For more info about this, head over to Ars and check out the posts in the Mac Achaia by BadAndy from earlier this summer ("Altivec, anyone?" I think it was titled). He knows a hell of a lot more about this stuff than I do; it makes for fascinating reading, and you can really understand why faster CPUs alone won't cut it for Apple.
Nice hot air, could be good IBM strategy... (Score:2, Interesting)
IBM has known for many years that an Intel/MS monopoly ain't good for IBM. (Anyone recall OS/2 for PowerPC?) Pumping up Apple with better CPUs would be good strategy, even if they make no money on the chips. But what's taken them so long?
My impression is that Motorola's attitude & situation are so bad that Apple couldn't get much out of 'em with "we'll switch to IBM" threats.
Now if someone can actually SHIP substantial quantities of non-defective chips BEFORE Intel is cranking out Pentium 6's & Itanium 4's at 10GHz...
Wahoo. Kudos to apple and goodbye palladium (Score:3, Insightful)
Motorola has no one to blame but themselves for this. If they had innovated and tried to keep up with the industry like everyone else, they would not have had this problem. They figured Mac users are suckers and will always buy anyway, so who cares. They guessed wrong.
Believe it or not, consumers do look at the MHz rating as an indicator of performance and value for what they are paying for. Some even look at the MHz rating as an indicator of Internet speed! If they see an expensive box that has a low MHz rating, they will just shake their heads and move on to another PC. Consumers aren't real bright, and Apple needs to boost the MHz speed on these new chips and not just have them perform fast. Palladium scares the hell out of me and I want no part in it.
Kudos to Apple. As soon as Palladium is out and these babies find their way into PowerBooks, I will be one of your first customers.
Also, Mac OS X is one of the easiest versions of Unix out there! No RPM hell, no spending hours configuring text files, no waiting for Gentoo to compile everything, and all of the binaries, like on Windows, include their dependencies. I will still keep a copy of Linux around for the hell of it, but I would love Mac OS X!
Re:Wahoo. Kudos to apple and goodbye palladium (Score:2)
Yes, my close to 3-year-old machine can kick your 500MHz G4. It's sad but true. I haven't seen an Apple-sponsored RISC vs. CISC argument since the mid-1990s, when Intel began to overtake them.
Also, don't bother to tell me that Photoshop 7 is faster. There was a bug in it that kept MMX and caching from working properly on the Intel platform. With Photoshop 7.0.1 or higher, Intel's chips cream Macs 2:1 and sometimes 3:1 with the 2.8GHz Pentium 4s. Apple always uses version 7.0 in their benchmarks. I am not bashing Apple but rather Motorola. I will not buy an Apple machine until they include IBM chips, period.
I know a lot of Mac users like yourself are pissed at my comment, and you should be. But you should not be pissed at me but rather at Motorola. I only speak the truth.
Note to Marketing Department (Score:3, Funny)
Re:More... (Score:3)
That's one of the really nice things about Linux.. (Score:3, Interesting)
Microsoft has been dependent on Intel for a long time. Their one foray into another architecture (WinNT for the Alpha) was just a proof-of-concept, and didn't go anywhere, IIRC.
The Linux kernel covers several architectures. MIPS (SGI), x86, Alpha, PPC, and StrongARM are just a few.
It's really nice to finally see a real, immediate threat to Microsoft's dominance. Apple and IBM have enough revenue to run a massive advertising campaign. Even if it just involves OS-X, it'll still produce a large shift away from Microsoft's domain.
Re:That's one of the really nice things about Linu (Score:2, Informative)
Re:That's one of the really nice things about Linu (Score:2)
Now let's see if I can get a few more acronyms in there
Re:Shades of PowerPC (Score:2, Insightful)
Maybe, but then again it might just be a different version (like Windows XP has both a 32-bit version and a 64-bit version).
2) All of your current software will still work, but in some sort of weird "Compatibility Mode" that is ten times slower than it runs today.
Not likely. Just as the forthcoming AMD Hammer will have 32-bit backwards compatibility, I expect the IBM/Apple proc would do the same. You won't have to boot into a "32-bit mode"; it will just run 32-bit apps. And while it won't run them as fast as the 64-bit apps, it should run them at least as fast as a native 32-bit processor.
3) Developers will get screwed (again).
Only in the sense that they may have to decide whether to program only in 32-bit (for the widest compatibility with the least effort) or expend the extra effort to support two versions.
Re:Shades of PowerPC (Score:5, Insightful)
Look, I'm sorry, but I'm sick of these posts. The PPC instruction set was designed to be a 64-bit architecture. There is a 32-bit subset that all current Mac programs use and Mac CPUs understand. Theoretically, running 32-bit code on a 64-bit PPC should be as simple as setting a bit in a special register in the CPU, putting it in 32-bit mode.
In fact, it might make sense to make 64-bit mode an option for the developer. If they don't need very large integers or 4+GB of address space, they could use 32-bit mode. This would mean that you don't waste RAM and memory bandwidth using 64-bit pointers when you don't need them. The OS would still be 64-bit, of course.
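To put rough numbers on that trade-off (an added sketch, not from the original comment; the struct is hypothetical), a pointer-heavy node roughly doubles in size when pointers go from 4 to 8 bytes:

    #include <stdio.h>

    /* Hypothetical pointer-heavy node: with 4-byte pointers it is about
     * 12 bytes; with 8-byte (LP64) pointers it grows to about 24 bytes
     * after padding - the RAM and bandwidth cost a 32-bit mode would avoid. */
    struct node {
        struct node *next;
        struct node *prev;
        int          value;
    };

    int main(void)
    {
        printf("sizeof(struct node) = %u bytes\n",
               (unsigned) sizeof(struct node));
        return 0;
    }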
All applications should run flawlessly (if they did before :-). There is no emulation. And even if there were, how would that hurt the developers? The only time Apple has switched processor architectures before was 68k->PPC. I can still run a 1984 68k copy of MacPaint in Mac OS X's Classic environment. Hell, their 68k emulator was so good that they didn't update all of the OS to PPC straight away! Yes, the jump from OS 9 to OS X was difficult for developers, but this won't be, even if Apple had to use some sort of emulator (which they won't).
Re:Shades of PowerPC (Score:2)
I've never fiddled with anything that requires me to know cache line sizes. Anyone more knowledgeable have any info?
Re:Shades of PowerPC (Score:2, Interesting)
Clearly your sarcasm detector is set too high. When I said "the one you just bought," that means today (as in just, as in not 64-bit). So when Steve Jobs gets up and says "32 bits is dead," you're screwed. Just ask all the people who bought Quadras so they would be able to run OS X. Then it didn't appear for a few years and
Doubtful, if a 1GHz GPUL processor runs 2x faster than a 1GHz G4 processor
Clearly you have a short memory. The "emulated" 68k mode of PowerPCs (which was also supposed to be waaay faster) wasn't, because the emulator didn't fit in the cache. And for Christ's sake, who the hell believes what chip companies say about speed anymore?
Yea, right. Since Apple has done such a poor job of allowing old apps to continue to function with their new OS, NOT!
I hope you're fucking kidding. Clearly you're not a Mac developer if you haven't been repeatedly screwed [userland.com] by [userland.com] Apple [joelonsoftware.com].
Go back to sleep, you clearly need it
So what's your excuse?
Re:Shades of PowerPC (Score:2)
Who bought a Quadra to run Mac OS X, and are they interested in upgrading to a slightly used state-of-the-art PowerBook 3400? The last Quadra was discontinued in 1995, long before there was ever any speculation about X. The official minimum requirement for X is a beige G3, introduced in 1997, but there are hacks to get it installed on earlier PCI-based Macs, which date back to 1995. How far back should Apple have gone? Should Mac OS X be able to run on my 15.9 MHz SE/30?
Re:Shades of PowerPC (Score:2)
Luxury. Back in my day we published magazines on an 8MHz Mac 512k, and if we didn't like it we could lump it. But if you told young people today that that computer would become a multiprocessor RISC-based Unix workstation made of translucent plastic, they wouldn't have believed you.
Re:Shades of PowerPC (Score:3, Insightful)
> 68k mode of PowerPCs (which was also supposed to be
> waaay faster) wasn't, because the emulator didn't fit in
> the cache. And for Christ's sake, who the hell believes
> what chip companies say about speed anymore?
The very first PowerMacs ran 68K software faster than it had ever been run before. You are completely wrong.
The 32-bit compatibility mode you're talking about is an Intel thing, to make up for the fact that they've been bolting things onto their chips for 20 years, going from 8-bit to 32-bit currently. PowerPC is younger and benefited from a much more mature industry when it was designed. There are already 64-bit POWER chips, and some parts of the current 32-bit PowerPC are 64-bit and some are 128-bit. The switch to 64 bits was designed into PowerPC.
"Classic" Mac software runs in a partial emulator (some hardware is emulated, but not the CPU) on Mac OS X because Classic Mac apps have a 20 year history
The important thing to remember is that Apple has been on their current CPU for only a little more than five years, and on their current OS for only two years. They are RISC, they are 64-bit, they are UNIX, and they are ready for the future like nobody else. Every Mac sold for the past two years has had a Wi-Fi slot in it and antennas built in, as well as FireWire, and also Gigabit Ethernet on all pro machines for the past 18 months or so. The platform is in a great place for the future. In fact, the CPU is about the only thing that has been holding Apple back for the past few years.
Re:Shades of PowerPC (Score:5, Interesting)
I think he may also be referring to the death of OpenDoc, which badly burned many developers and for which I too still have not forgiven them. OpenDoc was brilliant and so, so close to being ready for prime time when it was killed. This was a one-two punch for many small developers -- once they spent perhaps eighteen months in their C conversion, they then spent another eighteen months or two years redesigning their application for an architecture that simply went up in smoke. I knew some small innovative software developers that had, perhaps, a two or three year lead over similar applications on the Windows end, who ended up behind, a place you simply can't afford to be if you are on a niche platform like the Mac. This experience soured many developers on Apple, and prepared many of them to be well disposed to open source.
Bitterness for past misdeeds aside, I expect a 32 bit to 64 bit conversion to go more smoothly than the 68K to PPC conversion, or the equivalent conversion on the Windows side.
paradigm shift (Score:2)
Systems exist for automatically marshalling software into behaving, or for helping developers write better software. The problem is they tend to be exceedingly slow. So, faster processors are a necessary step in reducing crappy software. It won't help with useless software or ugly software, but at least it helps with crashing and having l33t hackers 0wn your machine.
ummm... (Score:2)
I don't think that means what you think it means.. (Score:5, Informative)
With 64-bit you can address over 4 terabytes. Do you feel the need for more than that?
You can also work with integers up to 18,446,744,073,709,551,615, and floating-point numbers up to 1.7976931348623158E+308. Feel the need for more than that?
There are wider registers in the CPU (such as the dedicated SSE2 or Altivec registers), but for normal operation I think 64-bit should keep us going for quite a few years.
RMN
~~~
Today's winner of "That's an Understatement!"... (Score:3, Informative)
OK, folks, a 64-bit address goes up to 2^64, which is 2^4 * 2^60. Crudely, that's about 16 * (2^10)^6, or 16 * (10^3)^6. Now let's review our metric prefixes, shall we?
So, yes, a 64 bit processor can address more than 4 terabytes. Roughly 4 million times as much as that, actually. That could be of some importance. :-)
More seriously, I can foresee that within 5 years, addressing 4 terabytes would certainly not be enough. Indeed, you could predict somebody would whine about gnu tar's 4 terabyte limit, and how they now can't back up their RAID full of pr0n. :-)
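For the record, here is the parent's arithmetic in a few lines (an added sketch): one terabyte is 2^40 bytes, so a full 64-bit address space is 2^(64-40) = 2^24 terabytes.

    #include <stdio.h>

    /* 2^64 bytes expressed in terabytes: 2^(64-40) = 2^24 = 16,777,216 TB,
     * i.e. about 16 exabytes, or roughly 4 million times 4 TB. */
    int main(void)
    {
        unsigned long long terabytes = 1ULL << (64 - 40);
        printf("2^64 bytes = %llu terabytes\n", terabytes);
        return 0;
    }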
Re:Apple working on a CPU? Not likely... (Score:5, Informative)
Umm... since Woz started working in Steve Jobs' garage? One of their divisions is the "Hardware Engineering Division"
Even their boards are outsourced
I'm pretty sure that the design is done in-house. Some manufacturing may be outsourced.
let alone the actual chips.
I don't know if they STILL have any chip designers (I sort of doubt it), but when AIM first got started, the Somerset chip design facility was a joint venture between all three partners, including Apple. I believe some of the chip designers at the facility were technically on the books as Apple employees. At the very least, the chip designers at Somerset worked closely with Apple.
If Apple had any ability to develop their own CPUs they wouldn't still be stuck with the pre-historic G4, they would simply ditch IBM and use their own chips.
Despite the fact that they DO have hardware engineers, and may even have a few who specialise in chip design to evaluate and work with the other two AIM partners, it is obvious that they are not themselves, and are unlikely to become, a chip designer. Though because of the way the patent and license agreements between the AIM partners work, they probably could get into it. But that would be a nightmare: they would bear all the costs and still be stuck with a single supplier (themselves) that would likely fall behind the competition.
Re:Apple working on a CPU? Not likely... (Score:2)
Re:This could be good. (Score:5, Interesting)
After you've checked out IBM's prices [ibm.com] for PPC boxes you might not mind Apple's pricing so much...
OS X exists for x86 (Score:2)
Re:OS X exists for x86 (Score:2)
Re:First mover advantage (Score:2, Troll)