Apple Punishes ATI For Leaking The Cube?
Ruddy writes: "According to this story at InsideMacGames and an even juicier one at AppleInsider, graphics card company ATi leaked sacred knowledge about Apple's soon-to-be-unveiled products (the new G4 Cubes) in advance of the MacWorld announcements yesterday, effectively stealing some of Steve Jobs' coveted keynote thunder. The leaked info spread quickly over the Web during the peak of rumor-Cube-mania. In retaliation the Fearless Leader apparently then pulled the plug on the Radeon's rollout at MacWorld, all but publicly spanking ATi for its indiscretions and replacing the Cube's flashy Radeon card with the more mundane and stale Rage 128 Pro (talk about spite and noses!)." This story just keeps cracking me up.
Re:I don't buy it... (Score:1)
Re:Wrath of steve... (Score:2)
Ummm... Apple was the top computer maker in the world back in the Apple II days. Every elementary school teacher in America had seen one.
So what Sculley accomplished was that he built a household brand into a household brand.
Also worth noting is that the bulk of the early success that Apple enjoyed with the Macintosh came before Steve Jobs was ousted.
You gotta give the suits some credit for their successes, like the PowerPC transition, but the Sculley/Spindler era will be remembered for the "Performa" line (just like a Macintosh, but without all those pesky high-quality components), and for nearly bankrupting a multi-billion-dollar company that had Sony-esque brand recognition.
Re:Powermac Cubes (Score:1)
14 years of retail taught me a few things about why companies do what they do.
You ever wonder why the pro models (G4's) come with no software? See my first paragraph if you are a slow learner.
Give me a break (Score:4)
Step back and listen to yourself. The whole notion that Apple pulled the Radeon is ludicrous:
The wheel is turning but the hamster is dead.
Haiku (Score:2)
So it was true all the time
Steve Jobs is a square
Re:Wrath of steve... (Score:1)
Re:Thunder? (Score:1)
The phrase "stealing [someone's] thunder" actually comes from early 20th-century American theatre.
A stage producer was proud of the proprietary machine he had designed for creating a particularly loud thunder sound effect. Well, through the theatrical espionage that happened in those cutthroat days, his design was leaked to a rival theatre company. Soon the producer was attending a production of the rival company (to check out the competition). Eventually the storm scene began and the producer heard the unmistakable sound of his effects device. He stood and famously shouted, "He has stolen my thunder!"
-AC
Re:benchmark (Score:1)
I once read that human persistence of vision works out to about 26 frames per second. In the case of an FPS that gets 0.1 fps more with one card than with another, when both rates are above 60 fps there is no perceived difference. Benchmark numbers are mental masturbation. Features are what it's all about.
Besides, who needs to run Shogo at 1600x1200? I'll be damned if I've ever played a game above 800x600. After that point, all the detail in the world wouldn't matter... and 10 frames fewer than the TNT2 still isn't that big a deal. It's still 14 frames over persistence of vision.
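To put numbers on that, here's a rough back-of-the-envelope sketch in Python. The frame rates are just the illustrative figures from this post, not benchmarks of any particular card; the point is that your eye sees frame time, so tiny fps gaps above 60 fps amount to almost nothing:

# Rough sketch: how much frame time actually separates two frame rates.
# The fps figures below are illustrative only, taken from the post above.

def frame_time_ms(fps):
    # Milliseconds each frame stays on screen at a given frame rate.
    return 1000.0 / fps

pairs = [
    (60.0, 60.1),  # the "0.1 fps" difference when both cards are above 60
    (40.0, 50.0),  # "10 frames fewer than the TNT2", both well over 26 fps
]

for slow, fast in pairs:
    delta = frame_time_ms(slow) - frame_time_ms(fast)
    print(f"{slow:.1f} vs {fast:.1f} fps: the slower card holds each frame "
          f"{delta:.3f} ms longer")

At 60 vs 60.1 fps that works out to roughly 0.03 ms per frame, which nobody's eyes are going to register; even the 10-fps gap is only about 5 ms per frame.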
Pictures, Pictures (Score:4)
For what it's worth, here's a picture of the Cube's innards, with the Rage 128 card exposed:
http://www.go2mac.com/image.cfm?image=images/shows/MWNY2k/wednesday_2/cube_guts2.jpg [go2mac.com]
Can it be swapped for a Radeon? There's no point in me engaging in idle speculation--just judge for yourself. Eventually, Apple will post developer specifications and clarify exactly how the card's connected to the motherboard.
Re:That's one of the reasons why I stick with PC's (Score:2)
Uhh...
According to the Q3 financial results from both companies, ATi's total sales were $288.2 million. Apple's profits were $200 million.
What are you smoking? Baking powder [hulman.com]?
--
Re:Not in this lifetime, bub. (Score:1)
One single 2x AGP slot. It fits. Specs are here [apple.com], and in the photo at the top of this page [apple.com] the video card is plainly visible, plugged into a brown AGP slot.
I would bet my left testicle that this chipset is hardwired to the board.
That's OK, I don't really want it...
--
Re:Not in this lifetime, bub. (Score:2)
The key word here is DEDICATED. That means 'used only by'.
I don't think that either of us can really argue our way into a position of being right.
I propose this: I'm gonna buy a G4 Cube machine early next week (probably Monday). When it gets here, I'll crack that bad mamijami open and see for myself.
I'll either send you an email extolling my genius, or a photo of my severed nut.
Rami James
Desperate Guy
--
I can answer this in two words... (Score:3)
Hint: the first word refers to a male bovine. Get a grip, people! For starters, Radeon cards for Mac will not be ready until September [ati.com] -- two months from now. So there is NO way they were supposed to be in the Cube, which is on sale now [apple.com].
It's perfectly believable that Jobs was planning to let ATI show off Radeon during his Keynote, then snubbed them for leaks. Steve is known to be a vindictive SOB. But that's just one f***ing speech, 10 minutes of PR, not a complete shift of manufacturing!
Even if he wanted to, there is no way that Apple's Board of Directors would let Steve purposefully weaken the product line out of spite. Remember, they're answerable to the shareholders, and no matter how it looks on stage, Apple is not a one-man corporation.
i want to see those damn pictures!!! (Score:1)
They should be arriving in stores right now. (Score:2)
ATi has the best hardware in the business; their drivers usually leave something to be desired, however.
Re:I don't buy it... (Score:1)
No, they didn't *stop using* the Radeon, they just didn't give ATI a chance to publicly demo the card.
It's not available until September, so it won't be in OEM Apple systems until then. This hurt ATI's ability to 'hype' the card in a public setting, that's all.
Re:Does Anybody Else Feel Fucked by Apple??? (Score:2)
No PCI slots.
I need something to justify this monster G4 I bought just a couple of months ago.
You've been using it for two months.
I suppose they'll expect me to fork out another $120 for these revolutionary input devices.
Yes. If you're not happy about it, send your $120 to MacAlly, Logitech, Microsoft, or whatever other vendor you think makes a better product.
--
Re:Uhm.. So they won't require apple displays now? (Score:1)
There are two ports on the back of the new Cube and G4 Macs: one standard VGA, and one of those special connectors that carries power, video and USB to the new monitors. So if you want to use a cheap 3rd-party monitor, you can. The ATI chipset has nothing to do with this physical connection.
______________________
"A person is smart. People are dumb, panicky, dangerous animals and you know it!"
Just goes to show ya... (Score:1)
Information wants to be Steve's (Score:1)
Apple does do the cheap spy-thriller trick of releasing partially wrong info to its people and checking which numbers turn up on the rumor sites. "MOSR says 110MHz - fire Johnsson".
Re:Wrath of steve... (Score:1)
Re:Wrath of steve... (Score:1)
John Sculley is famous for his Newton and his days at Apple Computer. Before Amelio, before Spindler, John Sculley was steering the course at Apple. Hired by legendary founder Steve Jobs, who convinced him to leave Pepsi by asking Sculley if he wanted to "sell sugar water" for the rest of his life, Sculley became one of the leading figures in Apple mythology by heading the company from 1983 to 1993. Ironically, Sculley later ousted Jobs in a power struggle and then positioned himself as a technology visionary.
During his decade at Apple, the company's annual revenues rose from $600 million to $8 billion. True to his reputation as an unparalleled mass marketer, Sculley built Apple into a household brand.
Too lazy to link, but found at: www.msu.edu/~luckie/hallofame.htm
Rader
Re:that's hillarious (Score:1)
Re:Who would buy one of these with the junk video? (Score:2)
Wrath of steve... (Score:2)
Once again, it seems that somebody needs to control Steve. He's just a kid, and needs somebody to stop his temper tantrums before he destroys what good he's done for Apple.
Re:Hello (Score:1)
Re:i want to see those damn pictures!!! (Score:1)
Re:That's one of the reasons why I stick with PC's (Score:4)
I am a PC weenie too, but I think some people are really giving Apple an unfair shake by not keeping up on the facts. As for untested hardware and Apple staying in a "safe zone," I say phooey.
As for Jobs, he is a jerk who gets things done. It is highly doubtful that he "spanked" ATi other than by omitting them from the keynote. The Radeons will not even be ready next month. End of story.
That said, I can see where Apple is falling behind. They need to:
Even this proprietary connection can be good, save for the fact that Apple will have a solution "next month" for those who just want to buy a monitor for an older G3 or G4. It is all relative. It really is apples and oranges.
Re:Not in this lifetime, bub. (Score:2)
If you are designing a strategic product, one of only 5 or 6 model lines your company makes, you design fallback positions.
If I were designing a computer around a yet-unreleased video chip, I would design a fallback plan.
The moment an engineer came in and said "By the time we're selling these things, ATI will have this great new video chip", the manager's first reaction would be "Great. Get some engineering samples and make our system run on them. But design a version around the Rage, too. Just in case ATI flubs". Always have a fallback plan.
Re:Just goes to show ya... (Score:1)
Refrag
Re:Blame ATI, not Apple (Score:1)
Does IBM buy hard drives on the open market for their current machines? They've been getting a lot of press for their recent high-end drive designs.
Presumably IBM cares when a product designer decides to use a non-IBM drive in his latest design.
Wrong. Try again. (Score:1)
Apple developing x86 solutions? You, like so many other peecee Wintel leemurs, have such low expectations that you can't get your mind around what Apple's strategy is.
Apple is worse than Microsoft in this thing (Score:1)
Honestly, are Apple's closed-source, closed-hardware, legal-brief-tossing ways any better than Microsoft's? I think not. The silliness with the leaked photos would happen at no other company.
Steve is messing with the company's bottom-line dollars when important new items like state-of-the-art video cards are not shown as 'punishment'.
Apple has great hardware engineers and many great ideas, but they treat their customers like sheep and their partners like children. If they really want to sell hardware, they should get their engineers to work on some x86 solutions and let us have a CuMine Cube and an iPC. I'd buy those products in a heartbeat. But buying a Mac today with no memory management or multitasking is farcical.
Stunts like this help prevent Apple from creating new customers. The faithful Apple cabal of users grows smaller every day.
Re:Jobs cans mac components: news@11 (Score:1)
Re:Give me a break (Score:1)
Yes, but that isn't what I was addressing. The original poster was talking about the physical card in the machine.
The Cube is the most expensive Mac.
Not by a long shot. Have you even seen the thing at Apple's site? And since when does the iMac DV have a replaceable video card? (I have one. Do you?)
The wheel is turning but the hamster is dead.
Re:Give me a break (Score:2)
Besides, Apple is a huge client for ATI, and Apple gets its OEM supplies before the retail cards have a chance to ship to the shelves. One proof of this was the 8-month delay in the release of the Rage 128/TV card, which happened because ATI was ramping up production and supply of the Rage 128 Pro & AGP 2X for the new machines (the iMac DVs and the then-new G4 tower).
The Radeon is shipping now in its PCI form. AGP 2X will be available in September only. This can only mean one thing: someone else is stocking up on them.
And nobody said anything about the G4 Cube being the only machine with it. The G4 tower is a serious contender for that card, too.
The Cube is the most expensive Mac.
The announced price of the G4 Cube doesn't make sense compared to the G4 tower given the absence of the superior video card. Redo the math with the Radeon card, and the G4 Cube's price makes sense.
Re:Just goes to show ya... (Score:1)
... but I do like the box. If it had come in at around $1000 (even up to $1200) I'd have snapped one up -- it'd be my first mac. Silence is worth something.
Re:Powermac Cubes (Score:1)
This is also why the iBook has one mono speaker, no microphone, and no microphone jack. The only way to get sound in (like, say, to take advantage of the voice authentication feature of Mac OS 9) is with a USB microphone, which would have high markup.
--
Re:No way did they pull the Radon out of the Cube (Score:2)
"business sense"? what business sense? the guy could and has run one of the greatest computer companies into the ground b/c his ego writes checks his business sense can't cash.
Apple the two years before Steve Jobs took over: lost $1billion each year.
Apple the two years after Steve Jobs took over: made $1 billion each year.
If that's running a company into the ground, I say we need a whole lot more of it.
Re:Apple has to resort to dual processors!!! (Score:2)
----
Re:This is a rumor based on steve's infamous rep (Score:1)
According to this [insidemacgames.com] news report on Inside Mac Games [insidemacgames.com], ATI employees are claiming that's exactly what happened. ATI announced the new products and Steve got ticked and yanked them from the show floor. It's not clear that Apple actually planned on having Radeons built into the new Macs, but ATI's press release implied it, and I think that's part of what got Jobs so angry.
______________________
"A person is smart. People are dumb, panicky, dangerous animals and you know it!"
Re:I don't buy it... (Score:1)
Re:Isn't that a _bad_ idea for Apple? (Score:1)
Re:Not in this lifetime, bub. (Score:1)
Gee, let me get a high moderation by saying something that seems highly insightful even if it's dead wrong.
Re:i want to see those damn pictures!!! (Score:1)
Re:The real reason why no Radeon (Score:2)
Re:that's hillarious (Score:2)
Re:THIS IS NOT A TROLL (Score:2)
Oh, please. There's nothing inherent about any operating system that makes it ``superior'' for all people and all applications. There are generally-accepted things that MacOS is lacking, but if a user doesn't need them, why force them upon him?
Likewise, I use Netscape Communicator because I choose to. I'm much more familiar with it, and I would rather have a piece of software which largely does what I tell it to than something like IE that is convinced it's smarter than me and tries to surprise me at every turn.
Give your proselytizing a rest, ok?
Re:Wrath of steve... (Score:2)
Uhh..
Wouldn't they just be swapping video cards? The Cube just has a 4x AGP slot, right?
Yes, but the Radeon is a more expensive card than the 128, so it would also change the cost structure of the box. Not to mention the supply issues involved; Apple runs a couple of days of inventory and a just-in-time manufacturing process that would be screwed up by two days' notice of a swap.
It sure looks from the outside like they never intended to announce Radeon cards in the Cube or the other Macs.
Computers for moms (Score:3)
I don't think Apple computers are intended for the more technical users. For nearly a decade computer companies have tried to break out of the tech segment and sell to the moms, dads, grandparents and other non-computer users out there.
Apple got the artsy types, but couldn't get to the real non-computer users until the iMac.
What everyone complains is overly cute is actually non-threatening. It's not just the all-in-one design or the colors (ask Dell and Gateway about that): the Mac OS is generally easier for a newbie to use, and the cases of the iMac and the Cube look friendly.
Look at the number of iMac users who have never owned a computer and you will understand Apple's resurrection. They are finally selling to the lost (and largest) market segment.
The computer industry will never truly understand the iMac (and Apple in general) because the industry is filled with technical users who could never really appreciate the appeal of a harmless, non-threatening computer.
Re:No way did they pull the Radon out of the Cube (Score:2)
We're referring to how Jobs ran the company into the ground in the 80's, not today. Today he seems to be doing a fairly good job--at least, he's brought the company back from the brink. Or so we've been led to believe.
Then perhaps you should make this clear? "the guy could and has run one of the greatest computer companies into the ground" is in the present tense, not the past tense. And trust me, Apple has completely turned around from the miserable days of Gil Amelio...
I find it pretty amazing, though, that the CEO of one of the largest computer companies in the world is basically acting like a child.
But it doesn't seem like there's much evidence to support this allegation. ATi claim that he did this. I haven't seen Jobs/Apple's side of the story. There are at least as many reasons to not believe this as to believe it.
I've worked with Steve on a number of projects. He isn't the person portrayed in the biographies of him from the early 80s. I didn't know him back then, so I can't tell you if he has changed or if those claims were bogus in the first place. Steve cares a great deal about the success of Apple. I doubt very much that he removed the ATi cards from the new Macs at two days notice, even if he was (rightly) mad at ATi for screwing over the surprise factor for the new Cube machine.
Re:That's one of the reasons why I stick with PC's (Score:2)
Uh, no. USB is from Intel.
Steve M
Re:Who would buy one of these with the junk video? (Score:2)
Re:Then replace the junk video! 2xAGP slot folks.. (Score:3)
Any normal 2x AGP card WILL fit inside the CUBE, such as a 3Dfx VOODOO 5. The ATI Rage 128 Pro plays Quake III & Unreal Tournament quite well, thank you.
--
Re:Not in this lifetime, bub. (Score:2)
Sorry, but I believe you owe the Slashdot community a testicle.
Re:No way did they pull the Radon out of the Cube (Score:2)
Anyone know when Adobe software (Photoshop/Illustrator) will be made available for the new platform?
D
----
Re:Leaking the Cube (Score:3)
The way I see it, the Cube is meant to fill the niche between first-time buyers (iMacinites) and the über power-hungry creative set. The tower G4 pricing has changed very little, other than increased bang for the buck over the previous series, but you'll notice the iMac pricing is slipping lower and lower. US$800 for an entry-level iMac? Wait till September/October and you'll see that drop to $700, if not $650. Apple's creating a huge gap in their price range between the iMacs and the tower G4s. What fills that huge gap? The Cube.
O' course, I could be wrong... I just want two of the new 17" CRTs on my desktop
----
Re:Just goes to show ya... (Score:2)
retroactive downgrade of the G3s to lock out G4 upgrades). Now the company is stuck with a processor design that has next to no R&D dollars behind it
Um, no (sorry, couldn't resist). G3s upgrade easily, and the PowerPC chip has R&D without peer. Did you forget the multiprocessor announcement yesterday? Hmm, yes you did.
and hope they sell for a few more years.
Yes, Apple is going to die.
--
I don't buy it... (Score:4)
---
A total lie (Score:4)
When Apple, or Dell, or Compaq, or any one of 100 other big PC manufacturers, picks a component for their system, it's not the same process as Joe Blow going over to Tom's Hardware, reading about the fastest thing on the market, and then ordering one. No. Graphics cards and the like are rigorously tested for hundreds and hundreds of hours to ensure a.) compatibility and b.) stability. Oftentimes the overall design of the computer is influenced by the chip itself; if it kicks out too much heat it will be oriented a certain way, or the airflow in the case will be rearranged to accommodate it, etcetera. In addition, a lot of possible scenarios are thrown at the completed machine to make sure it's not going to crash. Building a mass-produced PC (or Mac) is a lot more of a science than those of us who consider it erector-set engineering give it credit for. A very obvious case would be the iMac, but even for G4s and G3s, a lot more thought goes into them than slamming together a bunch of components and praying for the best.
With that in mind, I hope you see how utterly impossible this story is. Jobs cannot simply switch out the graphics chip without seriously delaying the product and causing massive engineering and supply headaches.
--
FUD **The cube has no fans-Silent. (Score:4)
No way did they pull the Radon out of the Cube (Score:2)
Look at ATI's schedule: the Mac Radeon won't even be available until September. Who wants to bet that they'll become BTO options at that time?
Apple *could* use the opportunity (Score:2)
Now that would punish ATI, since I'm guessing that Apple does account for the bulk of their sales these days... would you use an ATI??!
If Steve wants to keep his creations that secret he should build them in his garage... hand-craft and nurture them... speak to them and make them translucent... and keep their carefully textured curves away from the public eye.
Re:Wrath of steve... (Score:2)
Not in this lifetime, bub. (Score:3)
ATI leaks a rumor of the G4 Cube, and Apple spanks them by not releasing their chip, which they have obviously taken great amounts of time and money to integrate into their new product.
They replaced an entire Video system in one week.
--
Obviously the writer of this completely ridiculous article has not worked anywhere near electronics design and manufacturing.
One week to change an entire chipset? You have got to be yanking my noodle!
Rami James
Guy with more clue than that.
--
Re:Just goes to show ya... (Score:3)
The trailer-park hotrodder says the same thing about his homebuilt dragster. You know what? Doesn't stop people buying Miatas.
Pretty much any computer out there is good enough right now to handle 85% of what any consumer level user would ever conceive of doing with it. In two years this will be 100%. Therefore style will become the competitive advantage for the vast majority of the market.
To go back to the car metaphor, everyone here is like the blacksmith-cum-artisans of the early days, used to performance being a matter of some concern in one's choice of car, sagely congratulating each other on their wisdom in denigrating that silly assembly-line thing that idiot Ford was thinking he could make something of.
I take my family Christmas dinners as an example. Computers had never been mentioned up until the last two years. In both of the last two the grande dames of clan Curylo have gone on at length about their new iMacs and how they got the drapes to match and found the right color for the seat cushion and yadayadayada.
You laugh, yes. But these are consumers and that is the future of the consumer computer market. Deal.
Re:I don't buy it... (Score:2)
My God, what a silly thread (Score:4)
That's as may be, and it's understood the Rage 128 is not a 'super accelerator' chip like a GeForce or maybe a Radeon. But there are some important issues, including one that _nobody_ seems to be picking up on:
OT, but anyone else find that ATI drivers suck? (Score:2)
Listen, sorry to be posting this here, because it's quite blatantly off-topic, and I'll happily take the karma hit that moderators will assess on me. It's just that this is about the only forum I can think of where I'll get intelligent "yup, I know what you mean" kinds of answers.
I support a number of Winblows 9x systems with ATI graphics cards of varying descriptions. Some of them are ATI 3D Chargers, some of them are Xpert@Play, Xpert@Play98, All-in-Wonder Pro, and All-in-Wonder 128, etc. Most of them are based on either the Rage Pro or the Rage 128 chipset.
And I can't get over how many driver problems I have with these things! I'm not an idiot, I know how to install drivers, and, in fact, I've gone so far as reading ATI's docs. (When in doubt, read the docs.)
In one particular case, 16 identical machines running Win 95B (OSR2) were equipped with the Xpert@Play98 (Rage Pro) PCI card, and *every last one of them* kept starting up with "New Hardware Found - PCI VGA Adapter" after I installed the drivers. Again, you'll note, this was after ATI's drivers were installed.
Those machines around the office with the All-In-Wonder cards have frequent crashes, showing invalid page faults in the video drivers.
Since ATI is a hometown company and seems to offer products that suit our needs, we've always used them, but I'm really convinced that their software people couldn't find their own rectal cavities with both hands and a flashlight.
Anyone else have similar experiences with their Windows drivers? How are the Linux and Mac drivers?
Re:Does anyone read the stories? (Score:2)
In retaliation the Fearless leader apparently then pulled the plug on the Radeon's rollout at MacWorld, all but publicly spanking ATi for its indiscretions and replacing the display demo Cube's flashy Radeon card with the more mundane and stale Rage 128 Pro
Does anyone read the stories? (Score:5)
face. The Mac Radeons weren't going to be available until September anyway. What was cut off was ATI's participation in the PR event, not ATI's participation in the Mac market, and that is at least an arguably reasonable sanction for "spilling the beans" about the PR event in the first place.
Re:Wrath of steve... (Score:2)
Re:Wrath of steve... (Score:2)
--
Blame ATI, not Apple (Score:3)
One proof of this is Slashdot. Macs aren't Linux machines per se (though they run it quite well), and yet Apple gets more press on Slashdot than any single Intel-based hardware company, including strictly-Linux vendors like VA Research (or do they sell non-Linux setups?).
So, when ATI was originally invited onto the Big Stage at one of the most coveted Mac events (and thus before the entire industry), it was granted a huge favor. Apple doesn't have to care about ATI. No more than it should about IBM, which supplies the hard drives in most of their machines.
But instead of taking this humbly, ATI's inflated heads went off and blew away some of the punch lines Apple (a.k.a. Steve) needed to keep the crowd alive, by pre-announcing products that were, up until that point, just rumors.
Keeping the faithful crowd happy is what saved Apple.
For sure, Radeon will ship in the Cubes within 2 months from now. You can bet that, since the machine is already up for order, there are actually quite a few already in the pipeline that have the Radeon cards inside.
Though for the launch, ATI got what it deserved: no chance at the spotlight, for having overshadowed Steve's.
Re:Wrath of steve... (Score:3)
That's preposterous. (Score:2)
Apple is selling a "supercomputer" here with 64 megs of RAM. Their inclusion of last year's so-so 3D graphics contender hurts Apple more than anyone else. I mean, come on, even S3 had better 3D performance.
Furthermore, I've not seen the inside guts of this cube, but I would be very surprised if the video is on a PCI card. In keeping with Apple tradition it's probably soldered right to the mainboard.
And, re-design the mainboard of your flagship product mere weeks before the launch just so that you can use the crappier version of an offending vendors product? It just doesn't make sense. Not even if you're insane.
The programming interface used by everything from the Mach64 through the Rage 128 is so similar that I'm pretty certain the only reason they put the Rage 128 in there is simplicity of driver support, i.e., they were too lazy to switch to 3dfx or nVidia.
Re:The real reason why no Radeon (Score:2)
First of all, Apple gets them before anyone else. Same as when the 128 card came out. So I imagine they are stockpiling Radeons as we speak.
Second, they often announce forthcoming options well before they're available. Remember the 3x 72GB drive array? I still don't think those are ready to ship, but they were announced quite a while ago, IIRC.
Re:who is this meant to punish? (Score:2)
Re:Computers for moms (Score:2)
> a market that hadn't been tapped before; That's no
> reason for it to not be upgradable
It's upgradeable. You can replace the CPU, add more RAM, or replace the hard drive. This is similar to most budget PC's, where the video and sound are on the motherboard. In addition, you can add external hard drives through FireWire on most models (there are eight or nine companies making FireWire hard drives), or add a USB hard drive. Of course, you can add other USB and FireWire peripherals as well.
> it's just an excuse to not put any good components
> into the systems
I'm not sure what you're talking about here. Apple uses components that are similar to, or even exactly the same as, those used by Compaq, Dell, IBM, and other top-tier PC manufacturers.
> not upgrade the operating system to work on
> x86 machines
Darwin already runs on Intel. It's the open source guts of Mac OS X. They have a complete, open source, BSD-based OS for Intel. As far as using x86 chips in their own products: understand that they have one (1) computer that has a fan of any kind in it, and that's the big PowerMacs. You can't put an x86 chip into an iMac without having to add a fan.
> and not make it flexible enough for both a novice
> and a real power user who needs all the things
> that Micro-sloth and *nix machines give us.
Mac OS X will be out in public beta in September and ship in early 2001. I already have Developer Preview 4, and it's a stable, mature product that they've been working on for a long time. It will be the first Unix that a novice can use. They have publicly stated that if a novice knows they're running Unix, then they haven't done their job, and if an experienced user can't find Unix, they also haven't done their job. It comes with Samba and Apache built in, as well.
Who are they punishing? (Score:3)
It's more likely that ATI wasn't sure they could get the Radeon out before the release or the final design stages, so they instead offered technology that they know works.
If Apple did make a conscious decision in that short amount of time, putting a lower-performing video card into their new product would only make them look like fools.
Although when I did see that the Rage 128 Pro card was going to be used, I wondered why they didn't go with others. The only reason I can come up with is that they don't want to put in something that might fail and make them look bad. Whatever their reason, I bet they'll offer new systems with a Radeon, 3dfx or nVidia card installed within the next few months, definitely before Christmas.
story (Score:2)
Well, the IMG article claims that they met with ATI staff about this (I can't read the other slashdotted article). Do they mean the people that demo stuff in the ATI booth or did they meet with anybody who actually knows anything? Maybe ATI is just covering their ass for some technical problem that hasn't been resolved.
Also, the article indicates that this is a temporary thing. Maybe Jobs wanted to prevent ATI from making a big splash, but I doubt he would bar all Radeon cards from all current Macs! As was mentioned by another poster, if he really wanted to punish ATI he would scrap them altogether and go with another vendor. Apple's recent marketing strategy has been based on these surprise announcements at huge events, and if ATI really did leak the info, Apple would be right to be a little annoyed.
Just like the rumored/denied/confirmed cube, we have to wait a bit to find out what's really going on.
-------
Real Losers (Score:2)
Neither one is going out of business over it, but in the long run, Cube buyers stuck with aging Rage 128 hardware are going to have to shell out the cash for a quality video card. They won't all buy Radeons, but those who do shell out the cash will give ATi a bigger profit margin than they would get from the built-in card. And I doubt the change will seriously impact the sales of the ubercool Cube.
So the consumers are the losers.
And Jobs is just a big loser.
Re:Not in this lifetime, bub. (Score:2)
Like any good urban legend, it plays on conventional wisdom (we all know that Jobs is a mad dictator, right?), and should be completely impossible to verify (we know that both companies would deny it if it were true, so even if they deny it the myth remains).
Best laugh I've had all morning.
Whoa... Wait a minute.... (Score:2)
If they can't produce enough of these babies, lay the blame on an early press release.
If there's a glitch with it, blame it on an early press release...
Do not, however, turn around, and say "Well, since you gave us up, and Apple enthusiasts know now, we're going to punish BOTH of you by both not allowing you to put your cards in our machines, and not allowing our customers to ENJOY having a quality card in the PC."
Someone was so obsessed with pee pee smacking ATI, that they forgot exactly who it is that's going to suffer for this.
That damn near sounds like something all the anti microsoft folks around here tout...
What a lot of these people fail to realize is that BUSINESS and REVENGE are like oil and water. Make money first.... Throw tantrum later, when at home sleeping on gold-threaded sheets.
krystal_blade
Re:Not in this lifetime, bub. (Score:2)
I would bet my left testicle that this chipset is hardwired to the board.
Rami James
Guy who honors his bets.
--
Uhm.. So they won't require apple displays now? (Score:2)
Re:FUD **The cube has no fans-Silent. (Score:2)
That's one of the reasons why I stick with PC's (Score:3)
Now with this nice little Stevey Jobsey plot twist, everybody's quick to pull the trigger and label him as an obsessive-compulsive transsexual nutcase. Well, some of that is probably true, but think about it just a minute: ATI's Radeon is brand spanking new and hasn't even hit the shelves here in Canada yet. It is my personal guess that maybe Apple isn't so sure about shipping with Radeon boards yet, simply because the product is not yet mature enough to live up to Apple's reputation. I'm guessing they were still unsure about shipping the Radeon so early, and alleging that ATI played an important suit-licking card too quickly is what tipped the scales in favor of reliability, hence the older, proven ATI Rage 128. ATI isn't losing a market-threatening wad of cash here; let's face it, it's Apple. They make less cash selling whole systems than ATI makes selling only video cards in the same fiscal quarter.
The real motor behind this brisk decision is probably just good old-fashioned Apple P.R. Anyone who's worked phone support would understand the difference between an ignorant user on a PC and an ignorant user on a Mac. The PC guy will yell at you in frustration, whereas the Mac guy will be calmer and more willing to listen. They listen to phone techs, they listen to their Mac dealer, they listen to Steve Jobs as if he were a god, because he gives off the image that his company "cares" for its customers more than the others do. That illusion is what keeps Apple safe from the rabid competition that's hemorrhaging everyone else.
This is a rumor based on steve's infamous rep (Score:2)
It seems to me that this story only has weight because of Steve's infamous temper. But to his credit, there has been very little evidence of that temper since he left Apple, tail tucked, many years ago.
Re:Then replace the junk video! 2xAGP slot folks.. (Score:2)
Re:Hello (Score:2)
who is this meant to punish? (Score:2)
it's the end-users who will be stuck with an ATI Rage Pro.
One of the main reasons I haven't been a fan of the recent Macs is their insistence on including sub-standard 3D cards from ATI, especially when Apple's marketing always lists them as the most amazing, "ultra-realistic" 3D you can get...
Re:Apple *could* use the opportunity (Score:2)
Certainly some budget PCs ship with them, but the G4 Cube is not a low-end budget machine. Even the iMac isn't particularly low cost.
Using a crappy chipset will really limit the sales of this machine, since nobody wants to shell out serious cash for a computer that already has a piece of largely obsolete technology in it.
Bad move, Steve. M$ seems to make more sense using a custom-designed nVidia chip in their Xbox.
Re:Not in this lifetime, bub. (Score:2)
Think: LAPTOPS HAVE AGP.
Rami James
Guy with two testicles.
--
You bet it, eh? (Score:2)
If you're making a statement, you'd better stick to it, or I lose ALL respect for you.
This makes no sense from a business standpoint (Score:2)
Re:Not in this lifetime, bub. (Score:2)
--
pulled from the SHOW, not the 'puters (Score:5)
Re:Wrath of steve... (Score:2)
Re:Who would buy one of these with the junk video? (Score:2)
I like my mac, and I want another, but I ain't gonna get the cube without a serious 3D solution. (It's not a luxury, but a necessity! I need my quake3!)