Ars Technica Reviews Intel iMacs 662
Milton Waddams writes "Ars kicks off what I'm sure will be a torrent of reviews of the new Intel iMac. Overall it looks like it's a bit faster than the iMac G5 and a bit slower than the PowerMac G5 dual core. I'm sure it will surprise many slashdotters to find out that Jobs' statements about the new iMac being twice as fast as the iMac G5 were slightly overoptimistic. AND it doesn't run Windows...yet..." I'm still waiting for the most important benchmark: frames per second in Molten Core combat.
Benchmarks, accuracy, and choice (Score:5, Insightful)
To be fair, Steve's statements were absolutely 100% accurate (assuming the figures are accurate, which I expect them to be). For that benchmark, the Intel machine is 2x-3x faster. If anyone really expected them to provide not-the-best benchmark results, can I have some of what you're smoking? And I have several bridges to sell you too...
My point is that the story write-up makes it sound like SJ is lying, and he's not. He's just presenting the best set of benchmarks he can, which is pretty much what I expect from the CEO of the company...
As for the multimedia-style benchmarks presented in the review, I think you can expect those to improve as Apple gets its collective head around SSE3. I would have thought the G5/G4 implementations would have been altivec'd to hell and back, and SSE doesn't have the immensely useful 'permute' operation, so the transform operation will have to be rewritten to SSE's strengths - I doubt that has happened yet...
Simon.
Re:Benchmarks, accuracy, and choice (Score:5, Interesting)
1000fps in glxgears? I can beat that by a good 50% with a four-year-old NVIDIA GeForce4 440 Go in my Compaq laptop.
watch what happens when there are nvidia drivers and ATI drivers available.
P.S. ATI 800 series cards do work (fully accelerated) on the development platform.
I'd post a link but the lawyers are loose.
Re:Benchmarks, accuracy, and choice (Score:2)
Sorry, I can beat that by a lot more:
1908 fps in glxgears.
That being said, people running OS X on an Intel D915GAG mobo (very close to the iMac and FULLY supported)
report that the GUI is faster than XP on the same hardware.
I am fairly sure that the Apple boxes feel even slicker.
And I am NO Apple fanboy (my girlie won't even let me buy the Intel board/CPU combo),
and yes, I am whipped.
Re:Benchmarks, accuracy, and choice (Score:3, Informative)
No way, that's a myth. (Score:5, Informative)
Try playing a video game at 25FPS, and then at 60FPS. Can't tell the difference? If you can't, you've got to be full of it.
Maybe your brain can't absorb all the information on each frame past a certain point, but *anyone* sure as hell can see the difference when it comes to smoothness and fluidity of movement.
And a note about GLXGears - the higher the number, the better your chances of getting more complex objects on the screen at a decent frame rate. If you haven't noticed, games are a little more detailed than GLXGears. So while you can spin a few objects at 2000FPS, you might only see 20FPS in the latest game title. But if you get 10,000FPS in GLXGears, you'll probably see much higher performance in the game. It's a BENCHMARK. Seriously.
And what does this mean: "a lot of resources are wasted computing and rendering"? Explain to me what else you want the computer doing when you're running a graphics benchmark? I want mine running the damned benchmark, what else? It's not like everyone's machines are attempting to cure cancer and we should let that happen at all costs. I buy fast computers because I want to use all of the speed, not have an abundance unused in the background.
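The way a benchmark like glxgears arrives at its number is simple: render as many frames as possible in a fixed interval and divide. A minimal sketch of that measurement loop (with a trivial stand-in for the real OpenGL draw call, which is an assumption for illustration):

```python
import time

def measure_fps(render_frame, duration=1.0):
    """Count how many frames render in a fixed interval, glxgears-style."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        render_frame()
        frames += 1
    return frames / duration

# Stand-in for an actual OpenGL draw call: the cheaper the "scene",
# the higher the score -- which is exactly why a glxgears number only
# indicates relative headroom, not real game performance.
def trivial_scene():
    pass

fps = measure_fps(trivial_scene, duration=0.1)
print(f"{fps:.0f} frames per second")
```

Because the scene cost dominates the result, two machines can only be compared fairly when they render the same scene.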
Re:No way, that's a myth. (Score:3, Interesting)
Re:No way, that's a myth. (Score:5, Informative)
Try playing a video game at 25FPS, and then at 60FPS. Can't tell the difference? If you can't, you've got to be full of it.
You have confused video and video games - video will tend to have good (or real) motion blur and hence can be perceived as smooth even at lower frame rates. Games tend to have poor motion blur, if any, and thus need a much higher frame rate to appear smooth.
LetterRip
Re:No way, that's a myth. (Score:4, Insightful)
Next time you watch a movie, pay attention to pans across landscapes and such. Usually a DVD is sourced from 24FPS film, so it applies here too. You can usually easily see the jerkiness of the video when it pans. Then, watch some panning video from a home camcorder which usually records at 60 interlaced frames a second. The difference is immediately noticeable.
My point is, the human eye is perfectly capable of perceiving well over 25FPS. 24FPS is the standard for movie film, and it's really the minimum you can use and still have it seem fluid enough. Any lower and it's distracting. Any higher and it looks strange because we're so conditioned to 24/25FPS. That's why home video tends to look like exactly that (cheesy) - it's a much higher frame rate.
Video games exacerbate the issue, and frame rates mean even more. 60FPS is smooth enough for most people that it seems perfectly fluid, which is why the industry has pretty much standardized on it as a baseline.
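The jerkiness argument comes down to simple arithmetic: how far an object jumps between consecutive frames. A quick illustration, using a made-up pan speed:

```python
# For a pan crossing a 1920-pixel-wide frame in 4 seconds, the jump
# between consecutive frames shrinks as the frame rate rises.
# (The pan speed is illustrative, not from any particular footage.)
pan_speed_px_per_s = 1920 / 4  # 480 px/s

for fps in (24, 60):
    jump = pan_speed_px_per_s / fps
    print(f"{fps} fps -> {jump:.1f} px jump per frame")
```

Without motion blur to smear those 20-pixel jumps at 24fps, the eye reads them as stutter; at 60fps the jumps are small enough to pass as continuous motion.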
true that - and another thing (Score:3, Insightful)
And the claim that the blurriness of video offsets the framerate is also debatable. I'd argue the opposite, in fact - 60fps video is much, much sharper than 24fps, because the motion blur obscures the detail (you only notice it on the edges, but it affects the entire frame).
Re:Benchmarks, accuracy, and choice (Score:5, Insightful)
What's the difference? Video and film have motion blur, which makes for smooth transitions between frames whereas games display things in discrete frames with no blur whatsoever.
Ever tried waving your hand underneath a strobe light going at 30 cycles/sec? That's 30fps, yet the motion still looks strange, since it's like you're seeing discrete frames and not continuous motion blurred between frames.
Re:Benchmarks, accuracy, and choice (Score:5, Informative)
The iBook and mini may use integrated graphics, but they will probably use newer chipsets with graphics faster than the GMA900.
Re:Benchmarks, accuracy, and choice (Score:4, Informative)
Don't mention it, then. Especially since the iMac Core Duo uses a PCI-Express ATI Radeon X1600. [arstechnica.com]
Re:Benchmarks, accuracy, and choice (Score:2, Insightful)
You've really just shown your bias haven't you? Absolutely 100% accurate, oh, unless they're not accurate.
Steve Jobs may not have been lying, but he was most certainly being deliberately deceitful.
I don't see such a huge moral gap between the two.
Re:Benchmarks, accuracy, and choice (Score:5, Insightful)
I think you've missed his point. This is a common industry practice used for just about every piece of hardware and software on the market. To single Steve Jobs out for this practice rather than accepting it as the "norm" shows a distinct anti-Mac bias.
Steve Jobs may not have been lying, but he was most certainly being deliberately deceitful.
It's hard to be deceitful when it comes to something as nebulous as benchmarks. Every benchmark you run will tell a different story. The result is that you can pull a variety of different conclusions from the benchmarks depending on how you spin it. Given that Steve Jobs is the CEO of Apple, we can expect that he will spin the benchmarks positively. On the flip-side, we can expect that the Mac haters will spin the benchmarks negatively. The ones to really listen to are the moderates who tell us whether we're generally being delivered what we're promised or not.
Re:Benchmarks, accuracy, and choice (Score:2)
Not necessarily. When AMD moved to their new speed rating system for the Athlon XPs, they *usually* performed equivalently to a P4 clocked at the "speed rating" on average. It did NOT reflect the peak performance of the CPU compared to a P4, but the average instead.
Re:Benchmarks, accuracy, and choice (Score:3, Insightful)
Re:Benchmarks, accuracy, and choice (Score:3, Funny)
Who cares what the parent said! I never listen to them anyway (especially now I'm in my 40's
Re:Benchmarks, accuracy, and choice (Score:3, Informative)
Re:Benchmarks, accuracy, and choice (Score:5, Insightful)
Average performance on a wide variety of applications is an excellent performance indicator. Raw clock rate and peak performance on a single app (the former being a favorite of Intel and the latter being classic Apple) are both crappy methods of measuring performance.
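"Average performance on a wide variety of applications" is usually computed as a geometric mean of per-benchmark speedup ratios (the approach SPEC takes), since it doesn't let one outlier workload dominate the way a peak number does. A sketch with hypothetical scores:

```python
import math

# Hypothetical speedups of machine A over machine B on four workloads.
ratios = [2.5, 1.1, 0.9, 1.3]

# Geometric mean: nth root of the product of the ratios.
geo_mean = math.prod(ratios) ** (1 / len(ratios))
peak = max(ratios)

print(f"peak speedup: {peak:.2f}x, geometric mean: {geo_mean:.2f}x")
```

A vendor quoting the 2.5x peak and a reviewer quoting the ~1.3x mean are both "accurate"; they're just summarizing the same data differently.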
Re:Benchmarks, accuracy, and choice (Score:3, Interesting)
Consistently, since even the early 486 and Pentium days, AMD (and in fact also Cyrix) CPUs routinely beat Intel CPUs running at somewhat higher clock rates. With a small exception near the end of the Pentium III's lifetime and prior to the introduction of the Pentium 4, AMD CPUs have almost never been available with equivalent or higher raw clock rates than Intel's finest. They HAVE been available with performance matching or beating Intel's finest.
Re:Benchmarks, accuracy, and choice (Score:5, Informative)
Too far back in history, chief. In the 486 days, the AMD and Cyrixes were nothing more than a "cheap" upgrade for a 386. They simply didn't compete. Consumers were thus able to somewhat trust the MHz rating of Intel chips as a general performance indicator.
When the Pentium era arrived, AMD was still not a competitor, but they did manage to produce chips that were "good enough" to be considered cheap alternatives to the Intel processors. As a result, the PC industry did start producing AMD-based machines for their "low cost" product lines.
When the Athlon and PIII came to market, however, is when things got interesting. For the first time, AMD managed to put out an extremely capable chip in comparison to Intel's offerings. But far more interesting was that AMD started ramping their chips to exceptionally high MHz levels to pass up Intel's chips. This practice gave the AMD chips a reputation for high performance, but extreme heat levels and lower reliability. This left Intel with the server market as their chips proved more reliable over the long haul, and performed just as well in most non-gaming situations.
Long story short, Intel and AMD traded various blows in performance, each one gaining a slight lead for a short time only to be quickly shown up by their competitor. AMD, however, managed to keep the MHz trophy the entire time.
Intel got the bright idea to beat AMD at its own game and thus produced the Pentium IV chip. Now the PIV isn't a very good chip, but it can be ramped up in MHz in ways that AMD's Athlons couldn't. AMD struggled for a while, but quickly realized that they could no longer win the MHz trophy. So they came up with a Jedi Marketing Trick to make consumers think that AMD's chips were running at the same clock levels. That trick was to assign an arbitrarily created "speed rating" that placed the direct competitors to Intel's chips right next to the Intel MHz rating. That way reviewers would pick up the slower AMD chip and compare it with the Intel chip, rather than look at the real MHz.
AMD got a lot of bad press for the decision, but it did eventually pay off. Consumers accepted the PR rating as the "real speed" of the chip, and Intel again started losing market share. AMD finally ripped the market away from Intel with their AMD64 platform, which proved to be much more popular with consumers than Intel's own Itanium line.
So to summarize, AMD started competing with Intel. Knowing that Intel customers used MHz to judge performance, they ramped up their chips to extreme levels. Intel responded with their PIV Northwood core and took the MHz trophy back. AMD got smart and skewed the market by printing a number on the box that wasn't actually a MHz rating, while convincing many consumers that it was.
Clear as mud?
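The gap between the PR rating and the actual clock is easy to see in a table. These figures are from memory and should be treated as illustrative:

```python
# A few Athlon XP model ratings vs. their (approximate) actual clocks in MHz.
# Figures from memory; treat them as illustrative, not authoritative.
athlon_xp = {
    "2000+": 1667,
    "2500+": 1833,
    "3200+": 2200,
}

for rating, clock in athlon_xp.items():
    implied = int(rating.rstrip("+"))
    print(f"Athlon XP {rating}: {clock} MHz actual, "
          f"marketed against a {implied} MHz Pentium 4")
```

The "2500+" label invites a comparison against a 2.5 GHz Pentium 4 even though the chip itself runs several hundred MHz slower.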
Re:Benchmarks, accuracy, and choice (Score:3, Informative)
AMD was never really in a position where their chips were the clear MHz winners. Up until the K6/K6-2/K6-3 series, they were always lagging in MHz. The Athlon changed that, and for the first time they could outperform the Intel equivalents.
There were brief moments where AMD did have the MHz advantage with the Athlon, but the advantage swapped often between Intel and AMD. Once the P4 was released, that MHz advantage disappeared forever... however the AMD chips could STILL compete in performance, in most cases.
Comment removed (Score:5, Interesting)
Re:Benchmarks, accuracy, and choice (Score:3, Interesting)
There's not nearly enough evidence to reach a conclusion either way. QuickTime export is one of Altivec's strongest areas, and Xbench scores are notoriously bad at having any relationship to reality. Let's wait and see how they do in real life; perhaps you'll find Apple really does have a clue.
Indeed, if the iMac G5 had undergone the same revisions that the PowerMac line had a few months ago, the chances are they'd be faster than the Pentium equivalents.
Re:Benchmarks, accuracy, and choice (Score:3, Informative)
One of the XBench tests where the iMac x86 gets "slaughtered" is the "User Interface" test.
It turns out that the iMac x86 runs this test at 67 frames per second. Which is quite consistent with some newer Apple t
Re:Benchmarks, accuracy, and choice (Score:3, Interesting)
He never claimed the new iMacs were 2-4 times as fast. Watch the keynote. He claimed that on the SPEC scores, which he said were key indicators of performance, they were 2-4 times as fast.
He then went on to say that the speed improvements won't be across the whole system, because other components (he singled out hard drives) aren't improved over the G5 models.
I say that he can't win because for years he put up Photoshop numbers, and many people around here said "show us the SPEC numbers!"
Now
Re:Benchmarks, accuracy, and choice (Score:4, Insightful)
Horse crap. Common industry practice or not, I think most slashdotters will call bullshit to these sort of claims whether it comes from Steve Ballmer, Steve Jobs or Linus Torvalds.
It's hard to be deceitful when it comes to something as nebulous as benchmarks
Well I don't know about that - seems pretty easy [macobserver.com] to be deceitful and called for it if you ask me.
Steve doesn't lie (Score:5, Funny)
He's Q, with a turtleneck and a pair of jeans.
Shut up! (Score:5, Insightful)
Anybody who says anything remotely positive about Apple, or especially about Steve Jobs, is a "fanboy." You don't want to be called a fanboy, do you? Then get with the program. Talk about how cheaply you can get a Gateway that's just as good as the new iMac or something, and insist that Woz is the only person who ever had anything to do with Apple worthy of any respect at all.
Oh... and maybe Tog, if you are a UI nerd.
Re:Shut up! (Score:4, Insightful)
[Warning: This is an OT rant, no hard feelings if modded down.]
Wish I had known that before I made a not-so-nice comment about Apple which resulted in several mods going well out of their way to mod me down until I couldn't post on Slashdot for a couple of weeks. (From a certain IP, anyway. At least now you understand the origins of my sig.)
If it has suddenly become a little too cool to hate Apple now, I blame extremist mods for it. Over the years I've made silly little quips about Apple that nobody on Earth should have taken too seriously and have been mod-bombed over it. I wouldn't be the least bit surprised if, out of anger, they were finally M2'd out and the replacements came in to even up the balance by shifting over to the extreme opposite view. (i.e. over-modding anti-Apple sentiment.) Too much zealotry will always lead to people with too much opposition to your view.
This has already happened with regards to Microsoft. Go back a few years and ANY comment ridiculing or insulting MS would be modded up, but polite criticisms of Linux would be modded down. Even uninformed posts (i.e. there still seems to be some impression that Win2K was built on the same kernel that Windows 98 was) would get modded up. 2K is nearly 6 years old now, XP is 4, and the BSOD is virtually gone. Yet, the blue screen jokes STILL fly with full karma around here. The result? People stand up and say "uh, you guys need to get with the 21st century." People whinge about MS fanboys flooding Slashdot. Sorry, can't see that from my point of view. Fire is being fought with fire. My advice? Don't give Apple praise for being wrong or Microsoft scorn for being right.
No, I'm not pro-MS or anti-Apple, I'm just tired of these karma-fueled battles happening every year. I appreciate Taco's desire to keep Slashdot 'democratic', but it's irritating that ordinary Homer Simpson'ish people are allowed to be cops.
Re:Shut up! (Score:5, Insightful)
You hit the nail on the head, Nanogator.
I have also noticed extremist Apple fanboy moderation around here lately. My Mac credentials extend back to the late 80s on System 6 and I've owned a half dozen Macs over the years. I'm even typing this from a Powerbook (running Linux admittedly). I'm a strong supporter of Apple and I love to read books about their history. Yet even the most mild criticism of Apple or MacOS on /. will result in my comments being moderated down as Flamebait, Troll and Overrated. I never get similar mistreatment for negative comments about Linux or Windows. It seems Apple fanboys have no qualms abusing the moderation system to ensure that only positive Apple comments are seen.
Unfortunately this isn't new behaviour for Apple fanboys. As far back as I can remember - including the glory days of Usenet - the Apple fanboys have been the most intolerant, the least receptive to criticism, the most judgemental and often the least educated of all the enthusiast groups. The negative moderation of any criticism of the latest Macs is yet another example of this behaviour. Anybody who thinks Linux fanatics can be over the top has never seen an Apple fanboy in full swing. Even the Amiga users were never so extreme. That sort of stupid fanaticism is what led to one of my earlier sigs: "I love Apple hardware but goddamn I hate Apple users".
The example at the start of this thread epitomizes everything I hate about Apple fanboyism. Steve said something that deservedly should be called out for being deceitful bullshit. If any other CEO - Gates, McNealy, Ellison - had said something similar we'd have people throwing figures around and using datasheets to prove that the CEO was a lying bastard. Even when a relative nobody from GNOME or Xorg attempts to massage the figures there will be 100s of /. comments crying "Bullshit". Yet when Steve does the same thing the Apple fanboys are rallying behind him, providing him with excuses, apologising for his behaviour, rationalising the lies, and moderating or shouting down anybody who points out that the emperor has no clothes. Apple gets "special treatment" and I find that despicable.
amount of ram in benchmark (Score:5, Insightful)
iMac Core Duo: 512MB
iMac G5: 1GB
PowerMac G5: 4.5GB
Wouldn't such a large difference in the amount of RAM have a significant impact on benchmarks?
Re:amount of ram in benchmark (Score:5, Interesting)
512MB may be slightly cramping the style of the new iMac, but it didn't look like any of Ars' benchmarks would need much more than that. Certainly 4.5GB isn't going to make any difference, and if you've been following Ars' articles, you'd know why that particular machine is so loaded. The CPU-bound, disk-bound, or graphics-bound benchmarks aren't going to notice the change in RAM amount. The Photoshop test, being done on a fairly large image, might have seen some impact from the difference in available memory.
Given how heterogeneous the systems are already, I'm not too concerned with a slight difference in memory size. Given the different instruction sets, execution hardware, cache layout, and memory controller, I think having only 512MB rather than a gig is unlikely to show up in the benchmarks or in most users' usage.
Re:amount of ram in benchmark (Score:3, Interesting)
My G4/1.4 GHz is definitely much snappier at everything with a gig than 512.
I think these benchmarks are a little off, because nobody in their right mind would leave this machine at 512 MB in this day and age.
Re:Missing 1 piece of information (Score:3, Informative)
Comment removed (Score:5, Funny)
Re:But does it run... (Score:2)
Notable, regarding Windows (Score:5, Informative)
I tried to boot from a Windows XP installer CD. No dice. I then tried booting from a Vista installer DVD (Build 5270). Again, no dice. When holding down the Option key, the only icon that appeared was for the iMac's internal hard drive. Holding down the D key to try to force booting off of the optical drive failed as well. With the Vista DVD, the optical drive churned a bit and the iMac hesitated as though it were contemplating whether it wanted to boot the foreign OS. Soon afterwards, the familiar gray Apple logo appeared on screen and Mac OS X finished booting.
The new Intel Macs don't have an EFI shell, so there's no way to directly get at the EFI. Someone is ultimately going to have to write and/or use an existing EFI shell to tell the EFI to boot from alternate media to get things going. Naturally, running Windows under virtualization [appleintelfaq.com], with technologies like Intel's VT/Vanderpool, which the Core Duo in the new Macs does support [appleintelfaq.com], are going to be the way to go for most users anyway.
Re:Notable, regarding Windows (Score:2)
Re:"D" key? (Score:3, Informative)
Holding the D key is for booting Apple Hardware Test only. You cannot boot from a CD normally in this way.
C is still the correct way of booting from CD on intel.
Re:"D" key? (Score:3, Informative)
D - diagnostic partitions only (Apple Hardware Test)
C - optical media
I stand corrected. Thank you.
Re:"D" key? (Score:3, Funny)
Watch boot video here. (Score:5, Interesting)
YouTube.com has a video of both systems booting. So if you're in to computer drag racing here ya go: http://www.youtube.com/?v=zmaAZwkhYeQ [youtube.com]
Re:Watch boot video here. (Score:5, Informative)
Re:Watch boot video & Incompetent Benchmarking (Score:5, Informative)
In the video, the G5 likely had more RAM installed, which would make it POST considerably slower. The boot time, however, is probably very representative of how much faster the Intel iMac is at booting. Other reasons the Core Duo may have booted so fast compared to the G5:
- Two processor cores!
- Mac OS X is expressly designed to boot fast by bringing up as much as possible in parallel. That's part of the point of launchd: to identify dependencies and kickstart multiple things at once. This is also why Apple gave up on displaying what was being booted in 10.4, and now just shows a progress bar (which is unrelated to what's actually happening, and only timed to match the previous boot time as a relative indicator). Reporting what servers are being launched would take longer than actually starting them. This parallelism would clearly benefit from multiple processor cores in the Core Duo.
- the G5 may have been booting for the first time, or they may have deleted the cache in an attempt to make the test "fair," not realizing that the cache has a huge impact on boot times. Among other things, Mac OS X caches the kernel extensions so that the next boot only stops to enumerate which kexts to load if something in hardware has changed. If you wipe your cache files (/Library/Caches, ~/Library/Caches and /System/Library/Caches), the next boot will take a lot longer while boot performance caching is rebuilt.
- other hardware may have been unfairly compared: how fast was the G5's drive? was something wrong with it? was the G5's drive full, and struggling to find space for cache files? was it bound to a directory server, and stalling on boot while looking for the server? was it full of 3rd party software, kexts, startup items, etc?
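The dependency-aware parallel startup that launchd performs, mentioned above, can be sketched as grouping services into waves, where everything in a wave has its dependencies satisfied and can launch concurrently. The service names and dependencies here are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical boot-time services and their dependencies. launchd keeps
# real job descriptions in plists; this only sketches the scheduling idea.
deps = {
    "disk": [],
    "network": [],
    "login_window": ["disk"],
    "file_sharing": ["network", "disk"],
}

def boot_order(deps):
    """Group services into waves: everything in a wave can start in parallel."""
    done, waves = set(), []
    while len(done) < len(deps):
        wave = [s for s in deps
                if s not in done and all(d in done for d in deps[s])]
        waves.append(wave)
        done.update(wave)
    return waves

for wave in boot_order(deps):
    with ThreadPoolExecutor() as pool:
        # Stand-in for actually launching each service in the wave.
        list(pool.map(lambda s: None, wave))
    print("started:", sorted(wave))
```

Independent services ("disk", "network") launch together in the first wave, so a second core directly shortens the wall-clock boot time.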
The video doesn't reveal anything about the demonstrators' competence at setting up a fair comparison, or their motivation, so we don't know.
Recall the comparison of database servers running on OS X server vs Linux, where they intended to be fair but their assumptions about how to do so were actually really bad?
Or look at the Ars review and benchmarks of the new iMac Core Duo vs the iMac G5. He does an array of benchmarks where the G5 has 1 GB of RAM, and the Intel iMac has 512MB! Sorry Ars, but that's just plain incompetent. Your benchmarks are WORTHLESS to even skim over. How about benchmarking the G5 iMac with 512 and 1 GB installed, and reporting if that makes any difference?!
Re: (Score:2)
The G5 is still quite the chip (Score:3, Insightful)
Re:The G5 is still quite the chip (Score:2)
Re:The G5 is still quite the chip (Score:5, Interesting)
Now, in retrospect, it looks like they have for Mac OS X, but maybe not for all the other applications (iLife, FCP, etc).
Now, given that the OS has a long history of multi-platform support, it is only a piece of the puzzle.
Application level changes are a bit harder, especially in relying upon functions specific to a chip. Which, for some applications, is the case. Others should be able to do a direct recompile, if the application is still around in source form, the author is interested, etc.
Back when I had access to NeXT Cubes, I didn't have to worry about it. However, when I later bought NeXTStep 486, I had to. There were lots of applications for the 680x0 systems; I sometimes had to search for 486 applications. I assume we are headed back into that world.
So, can it happen? Yes. However, I suspect that Apple will move on with the Intel architecture. I assume the PowerMac G5 will be a well-respected machine in the meantime, as it does great for video editing, something Windows machines still work hard to do poorly.
I suspect it might be like the Amiga. While the Amiga didn't get a lot of respect, those in the video editing world used it much longer than people anticipated.
But, in the end, the new macs will be Intel. As a side note, I just sold my G5 DP to someone looking to do video editing with FCP. Even with them knowing the Intel systems were coming out, they still wanted it.
Cell isn't a desktop processor (Score:5, Insightful)
Existing PPC binaries won't run fast on the Cell. In fact, they most likely won't run at all.
There is no way we'll see a general purpose desktop system based on the Cell - it's just not designed for that kind of purpose. We might see some sort of Cell coprocessor board become available though.
Re:The G5 is still quite the chip (Score:2)
I am waiting for some early adopters of the Intel iMac to sell their G5 Towers for much cheaper than what Apple is selling.
Another advantage is that most of these systems are loaded with RAM and Software. I have already started seeing some on craigslist.org.
Here is hoping there are a lot of early adopters ...
FireWire 800 (Score:3, Interesting)
It sounds like from the review that Apple's pro apps aren't well suited for the Intel-based Macs until they have the Universal Binary versions (suggested to be in late March). Maybe that's why they left FireWire 800 off the initial MacBook Pro -- if you need FireWire 800, you're probably doing pro work. So Apple left it out to reduce costs until they have a complete system for pros.
Out of topic but somewhat in topic though (Score:2, Interesting)
Re:Out of topic but somewhat in topic though (Score:3, Interesting)
Re:Out of topic but somewhat in topic though (Score:2)
Re:Out of topic but somewhat in topic though (Score:2)
Re:Out of topic but somewhat in topic though (Score:2)
PC's today really aren't PC's. PC's haven't been PC's since the introduction of USB and the PCI bus. Haven't you wondered why all these "Legacy Free" PC's were selling?
Re:Out of topic but somewhat in topic though (Score:3, Informative)
How do you define a PC? Is it the CPU architecture? Is it the manufacturer of the CPU? Is it the company that wrote the prevalent operating system? Is it the company that first called its product a 'PC', 25 years ago?
Why make things so hard when the answer is right in front of us?
PC = personal computer. All Macs ever produced have been PC's. For that matter, so were the Apple 8-bit computers.
Re:Out of topic but somewhat in topic though (Score:2)
It's Called A 'Lie' (Score:2, Interesting)
In the real world that is called a Lie.
Jobs doesn't look like he's even remotely concerned about making -plausible- performance claims about the Intel stuff.
Looks like Jobs is going to be doing some 'optimistic' spinning this year, given the mess Intel's Roadmap(tm) looks to be in. Had he been less of a pain in the ass to IBM, Apple wouldn't be in the mess it's in with their hardware.
Can't Boot Windows (Score:2)
Re:Can't Boot Windows (Score:3, Informative)
Also, apparently on the new Intel-based Macs, one holds 'D' in
OC? (Score:4, Funny)
And by OC, I don't mean "The O.C."
I mean Over Clocked.
I realize it isn't in exactly the best form factor to start pushing out extra heat, but someone's going to try it.
Best Features of the iMac (Score:4, Funny)
2) Automatically emails fan letter to Steve Jobs during start up
3) If you cup your palms over the domed base, your hair will rise in air
4) Sprouts set of cybernetic insectoid legs and scutters away when threatened
5) Perfectly matches the iBlouse
6) Screen is flat, which is good for some reason
7) Special drool tray catches saliva from enthralled technogeeks
8) Communications directly with human pineal gland by firing information-rich beam of pink light
9) Wuvs you
Stolen from The Onion of about 4 years ago but still true today.
Four years ago the lampshade model was out... (Score:3, Informative)
The "desk lamp" iMac design hasn't been around for a long while now. Yeah, boy, that flat screen sure is novel; nobody sees the advantages of that...
The G5 model has no tray to catch drool in, even. Slot loading drive, on the side.
Waiting for the second generation (Score:5, Interesting)
I know it's ungeeky of me not to want to be on the bleeding edge, but I'm waiting for the second-generation machines.
Re:Waiting for the second generation (Score:3, Informative)
The only big change on the horizon is the switch to Merom/Conroe/Woodcrest in the second half. This will bring the eventual switch to a fully 64-bit OS X.
the specs will not change drastically (Score:5, Informative)
Re:Waiting for the second generation (Score:5, Funny)
Re:Waiting for the second generation (Score:5, Informative)
In addition, the first Intel box is not a motherboard that Apple slapped together on its own, like it did for the first PPC boxes. It is a state-of-the-art Intel motherboard with all the latest doodads. Sure, Apple could stick in a faster graphics card or hard drive, but the motherboard support chips are all modern. I think the next rev of the MacBook Pro will include FireWire 800, which I assume Apple couldn't integrate into the MacBook Pro motherboard in time to meet their ship date, but that's more of an incremental change than was included in the second-generation PPC boxes. (And don't forget the problems with the then-new PCI-bus Macs. Networking was so broken -- remember the Open Transport fiasco? The brand-new networking architecture that wasn't ready at the time the 7200/7500/8500 were released, and which those new machines relied on, MacTCP having been deliberately obsoleted? It took several months after the boxes shipped for *any* PPP dialing software to work at all with the Mac, and more than a year after that until most of the more significant networking bugs were quashed.)
Seriously... (Score:2)
Does anyone really have any info on this? Even anecdotal evidence would be appreciated.
molten core (Score:4, Funny)
I'm pretty sure that if you overclock your Dual Core to the point where it becomes "molten", your FPS rate is going to be Zero.
32-bits? (Score:2)
I can't believe Intel is building anything new these days that isn't AMD64, but I've already had a couple people tell me I'm wrong about Core Duo.
If so, why would anybody buy it at these prices?
Yes, 32-bit... (Score:5, Informative)
Not that many people actually need 64-bit capability; it's mostly for programs that need very large memory access. iMac users certainly won't, and MacBook Pro users are a more questionable case. My guess is they'll upgrade that line with the 64-bit chip at the same time they release the Intel PowerMac equivalent.
Hmm, that leads me to wonder what the new name for the Powermac will be... MacMac?
Spanning (Score:5, Informative)
Memory Anyone? (Score:5, Insightful)
The older iMac was sporting twice the memory, and the G5 desktop had nine times the memory.
Clearly the memory disparity was a factor in many of the tests.
I would give more credence to a test where all three machines had the same amount of memory so that paging/swapping/caching would be more at parity.
Re:Memory Anyone? (Score:3, Insightful)
Why?
Seriously, what makes it so clear to you that this was a major factor? If all the tests run could fit in 512 MB without swapping, going to 1 GB wouldn't gain anything, right? Is there something about current Mac platforms that I don't know?
There were many differences between the machines. I'd be more inclined to point out that a significant minority of the benchmarks tested the graphics chips more than the CPU.
Rosetta in same thread as app or not? (Score:3, Insightful)
"Rosetta runs in the same thread as the application, and translates blocks of code as they come up. "
Then
"...That allows the translation to run on one core while the application thread executes on the other core, meaning that the translated code will have a short distance to travel."
So, which is it? Does Rosetta run in a separate thread or not? Maybe he meant it runs in the same process, I don't know.
Don't people WTFK??? Speed claims were qualified! (Score:5, Informative)
Let's all go to www.apple.com/quicktime/qtv/mwsf06/ and load the keynote up to 1:07:00.
Steve Jobs is completely up front about which tests produced the numbers (SPECint_rate2000 and SPEC_fp2000) and outright says "Now, everything is not going to run 2-3x, the discs aren't 2-3x faster, etc." He makes it very clear that his numbers are based on these two benchmarks. He claims they are the most important benchmarks of performance, which is debatable, but they are certainly a fair test of raw CPU power. Other than the chip and motherboard, the only other significant component that has changed is the GPU, going from a Radeon X600 to an X1600. Does anyone disagree that this is in the 2-3x faster range?
All in all, people are making a mountain out of a molehill rather than checking the source of the numbers. god bless the internet.
-justinb
Experimental Error? (Score:3, Insightful)
XBench is not great for benchmarking unless you repeat its tests about 10 times or more each; its results vary too much (even from one run to another on the same machine, never mind when comparing two different ones).
Come on, people: run many tests, compute the statistics, and adjust with Student's t-distribution. This is elementary stuff, yet no one does it.
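The parent's suggestion takes only a few lines to carry out. A minimal sketch, using hypothetical XBench scores (the numbers below are made up for illustration); the t critical value passed in is the standard two-sided 95% figure for 9 degrees of freedom, which scipy.stats.t.ppf can compute if you have it installed:

```python
import statistics

def t_confidence_interval(samples, t_crit):
    """Mean and confidence half-width for repeated benchmark runs.

    t_crit is the two-sided Student's t critical value for
    len(samples) - 1 degrees of freedom (e.g. 2.262 for 9 d.o.f.
    at the 95% level).
    """
    n = len(samples)
    mean = statistics.mean(samples)
    # Standard error of the mean: sample stdev over sqrt(n)
    sem = statistics.stdev(samples) / n ** 0.5
    return mean, t_crit * sem

# Ten hypothetical XBench scores from repeated runs on one machine:
scores = [142.1, 139.8, 145.0, 141.2, 138.9,
          143.7, 140.5, 144.2, 139.1, 142.8]
mean, half_width = t_confidence_interval(scores, t_crit=2.262)
print(f"score: {mean:.1f} +/- {half_width:.1f} (95% CI)")
```

If the confidence intervals of two machines overlap, the benchmark hasn't actually shown one is faster than the other.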
PowerMac Replacement? (Score:4, Interesting)
Another possibility would be for Apple to wait for the Extreme Edition of the Conroe, the Kentsfield. That would give them four cores, like the current PowerMacs. It won't be out until 2007, and Apple seems anxious to switch everything over ASAP. So they could go with Woodcrest, basically Conroe for servers. This might let them put together a dual-cpu/dual-core setup like they have with the current PowerMacs. This kind of setup was demonstrated by Intel [anandtech.com] last fall. There were also rumors [theinquirer.net] last year of Apple pressuring Intel to give them Woodcrest chips ahead of schedule.
And of course there's the more mundane question of what will they call the PowerMac replacement? They seem to want to get away from the Power prefix, while stressing the Pro tie-in to their Pro apps. So maybe Mac Pro? Seems too short. Maybe bring out the whole name, Macintosh Pro. Whatever it is, can it make people as upset as "MacBook" did?
Re:Why? (Score:5, Insightful)
Re:Why? (Score:5, Insightful)
It's not Apple's job to help you run Windows software, least of all on their hardware.
As the OP said, if I bought a new Mac, the last thing I'd want to do is try to figure out how to run software for Windows on it. Period.
Nobody is forcing you to buy a second machine to do anything. You can do without that software, buy a second machine, or (possibly) void your warranty by trying to get Windows to run on it. That doesn't mean you should expect Apple to roll over and give you a machine which it is easy to make run both OS's. They want to give you a good user experience if you bought their stuff.
If I buy a Honda Accord, is it reasonable to expect Honda to ensure that the turbo-kit I got for my Ford Escort runs on that Honda? Of course not. What does Honda care? And it's not about "the full Honda experience", I'll tell you that.
Apple would probably prefer you leave them out of the equation when it comes to running your Windows games. Specifically so they don't get calls from people who have either bodged their systems together from spare parts, or generally done stupid things with them.
You have complete freedom to buy, or not buy, an Apple computer, and all that implies. Whining about being 'forced' to own a second computer to be able to have another platform is a completely specious argument in my opinion -- how is this any different from when the computers were on completely different platforms?
Re:Why? (Score:2)
Indeed. I'm personally waiting for WINE to be ported to the Mactel platform. That should provide pretty much everything people want, as well as provide an easy migration path for video game producers. Or in other words:
1. Compile game against WINELIB
2. Release game for Macintosh
3. Profit!!!
Re:Why? (Score:2, Redundant)
Re:Why? (Score:2)
Of course, the only non-console game I still play is World of Warcraft, so it's a non-issue for me.
Re:Why? (Score:2, Insightful)
During the day I work on a multimedia engine that is currently Windows-only, but will soon be cross-platform. At night I hack on my Linux boxes, surf the web on whatever web browser is on my couch, and laugh along with my friends at the Flash animations they show me. Generally speaking, it doesn't matter what OS I'm running as long as I can browse the web and ssh places.
But when someone asks me a question about OSX, I don't have a
Re:Why? (Score:2)
Re:Why? (Score:5, Insightful)
Re:Why? (Score:3, Interesting)
As one who agrees with the OP: why indeed? I am seriously considering this new Mac (aka Unix) platform for my home PC as an upgrade to my Win2K desktop. I can't imagine dual-booting to something as lame as XP or that future hog, Vista.
For those of us who need to test stuff under Wintendo, such as Java apps or PHP scripts, there's a lot to be said for Virtual PC or VMware. Even on my Linux laptop I only run Windows apps in a Cross
Re:No AMD macs? (Score:2)
Re:No AMD macs? (Score:5, Informative)
http://www.tomshardware.com/2006/01/16/will_core_
Re:No AMD macs? Excuse me!! (Score:3, Insightful)
Excuse me, but while Apple is big on noise, they're not big on production. I'm sure AMD could have given them all the chips they need. They might not have been so forthcoming with the Marketing Money however.
For Intel, getting Apple is a coup worth paying enough for that even if they never make a cent from Steve Jobs, they've still silenced the biggest critic of the i86 architecture.
Their problem right now is keeping Dell/HP happy, both of whom s
Re:No AMD macs? Excuse me!! (Score:3, Informative)
I'm afraid that your information is several years out-of-date...
AMD's mobile CPUs now commonly draw less power than even the best of the Pentium-Ms to date. That's in addition to being cheaper and higher-performance at the same time.
For example:
mSempron 2800+ (1.6GHz) 25W
mTurion MT-34 (
this (Score:3, Informative)
Comparable to an Athlon 64 X2 (that's a desktop chip) with way less power draw (both idle and at peak load).
Other factors exist too... AMD used to have a reputation for poor QA on the line, and while they seem to have overcome it, history is a stinger when you are dealing with companies like Apple.
Re:No AMD macs? (Score:4, Insightful)
Apple chose the same processor that Dell so heavily relies on. Of all the reasons, I just don't believe AMD can't manufacture enough chips. I think Steve Jobs always wanted to use Dell as the model to follow, whether his mouth admits it or not.
Are you trolling or just slow in the head? Apple went with Intel for laptops. They needed a fast portable. AMD has nothing useful for laptops right now: their top chip uses 15-60% more power and is slightly slower than the Intel Duo. It uses more power idle than the Intel does at 100%. The choice was between going from a 6-hour battery with the G4 to a 3-hour battery with the Intel, or a 2-hour battery with the AMD. Gee, tough choice. Apple may very well ship AMD chips some day, but not in portables or all-in-ones until they get their power consumption under control (AMD 65nm is due Q4). As for business models, Dell is about cheap, cheap, cheap with little inventory and interchangeable supply. Apple is about grabbing the high end with innovative tech as a differentiator. The business models are very different.
Re:Same Good looks (Score:3, Insightful)
Re:Molten Core Combat? (Score:3, Funny)
It's an inside joke, referring to an area in the most popular MMORPG around, World of Warcraft. That's by this company called "Blizzard", and it runs on things called "computers". It comes on a couple of "CDs", which is short for "Compact Disc". Someday, when you get electricity in your cave, you might be able to try it too!
Re:Fat Binaries (Score:3, Informative)
we know photoshop is a little pokey (Score:3, Interesting)