G5 Benchmark Roundup

"The G5 is the fastest PC in the world." "Yes, it is." "No, it's not!" Whatever. Read on for more on the subject, if you really want to.
Matt Johnson writes "Well, it looks like we finally have our first comparison of G5 vs. AMD Opteron, completed by none other than Charlie White, the individual who gained much of his fame by publishing misleading benchmarks to make Apple's Final Cut Pro software look like a bad performer. Mr. White's latest comparison shows the Opteron running roughly 50% faster, but what he doesn't say is which compiler was used to generate those SPEC scores. When Apple released its benchmarks, I feared that whoever made the first comparison would likely make this mistake. It seems only fitting that Charlie White would be first."

An anonymous reader writes "In an ironic twist to the recent benchmark wars, Intel referred the Mac site MacFixIt to an analyst at Gartner Group who actually backed the PowerPC G5 platform with this assertion: 'These models certainly equal Intel's advanced 875 platform and should allow Apple to go until 2005 without a major platform refresh.'"

Another anonymous user writes, "While browsing the Xbench benchmark comparison site, I discovered some G5 benchmarks! The 'G5 Lab Machine at WWDC' got an overall score of 164.78, but much higher scores in certain areas. All of the tests are calibrated to give 100 on an 800MHz DP Quicksilver G4."

vitaboy writes "Sound Technology, one of the "leading UK distributors specialising in musical instruments, music software and pro-audio equipment," seems to have some data regarding the real-world performance of the G5 compared to the high-end PC. They state, 'The dual 2GHz Power Mac G5 with Logic Platinum 6.1 can play 115 tracks, compared with a maximum of 35 tracks on the Dell Dimension 8300 and 81 tracks on the Dell Precision 650 each with Cubase SX 1.051 ... More impressively, the 1.6GHz single-processor Power Mac G5 played 50 percent more tracks than the 3GHz Pentium 4-based system.'"

  • by NickV ( 30252 ) on Saturday June 28, 2003 @09:51AM (#6319294)
    That Charlie White gets off on doing nothing more than trashing the Mac and he often makes tons of things up...

    A real good point, and one that points to the fact that Charlie White's stats are COMPLETELY cooked up and fake, is that apparently AMD benchmarked against a SINGLE 2GHz G5 Power Mac...

    Hmm... Where did the single 2GHz G5 Power Mac come from? We know Apple doesn't make one...

    If you're gonna lie, at least do it right. Sigh.

    (And another thing, AMD has more credibility than Apple regarding self-reported benchmark scores? There is no reason for that other than bias.)
    • by GreenHell ( 209242 ) on Saturday June 28, 2003 @10:30AM (#6319461)
      Actually, if you look at Apple's G5 Performance page [apple.com] you'll see that those are the numbers for the dual 2GHz G5.

      Given that, I'm still inclined to take the comparison with an entire shaker of salt. I mean, if he's suspicious of Apple's numbers but not suspicious of numbers obtained from another processor manufacturer, then I don't know what to say other than 'Mr. White, your bias is showing.'

      I'm waiting until they hit the market so that the comparisons are done by people who actually got to test the machines themselves, not some guy who knew what he wanted the data to say before he even began writing. Until then, all I'll say is that it looks like nice hardware. But faster or better? Who knows.
    • by saden1 ( 581102 ) on Saturday June 28, 2003 @11:02AM (#6319602)
      Never trust a company that is trying to sell you something. The tests should have been run by him rather than provided by the companies themselves. This guy has zero credibility and I for one don't put much stock in what he says.

      Oh, and that last line about AMD having more credibility is just one of the stupidest things I have ever heard. I don't buy Macs because of the $$$ and I pack an AMD chip, but to say something like AMD has more credibility is very silly.

      It is good to see that the old adage of "like assholes, everyone has an opinion" holds true.
      • by CompVisGuy ( 587118 ) on Saturday June 28, 2003 @04:23PM (#6321158)
        I think that the $$$ argument is flawed -- it's a myth.

        If you compare the price of a G5 (or pretty much any Apple system) with an equivalently-specced PC from a reputable supplier (such as Dell) -- if you can in fact find an equivalent (and frequently you can't; Apple now out-performs PCs, and often you get standard features on an Apple that you can't get on a PC) -- then you will find the Dell system to be more expensive than the Apple.

        Granted, you will probably be able to build your own system, or buy from a local PC shop, a PC with a decent spec that is cheaper than an Apple system. However, I have had a couple of very bad experiences with small/mid-size PC builders, and a horrible experience building my own system (I'm a qualified electronics engineer, but I was let down by some dodgy components and very poor aftersales service). Others may have better experiences, but I think it's a matter of luck over judgement. So, ever since, I've vowed to only buy from the big boys.

        Next we come to software. On an Apple system, OS X is included in the price of the system -- on a PC you often have to pay extra for Windows (OK, Linux etc. are often cost-free if you want to go that route). Sometimes software is more expensive on the Mac or unavailable for it -- but in my line of work (statistical modelling), all the software I need is available. For Word-document monkeys, you also have MS Office on the Mac (I'm told it's better than the PC version). Games are slower to appear on the Mac -- that's a potential drawback.

        I spend almost every working day in front of a computer. If you had to drive around for a living, you'd want a decent vehicle: any extra cost of an Apple system over a PC system, amortised over the time spent using the system, is almost zero. Oftentimes, the Apple system is cheaper anyway.

        But here's the real reason to buy a Mac: The integration between the hardware (some of the best-engineered in the industry) and the OS (OS X is probably the best OS around at the moment). "It just works" is something I hear from people who make the 'switch' from PC to Mac, and it's true.

        That's my opinion. Maybe I'm an asshole, though.

        • I build my own systems and I sure as hell can build a screaming system for much less than any Mac out there.

          I have had experience with dodgy components; guess what I did with them, though? I sent them back. About ten miles away from me is an Internet computer store that is super cheap. It is really a miracle how they manage to keep their prices so low. I usually go in there and buy most of my stuff.

          $150 - CPU - AMD XP 2800
          $100 - Mobo - I'm an ASUS fanboy :p
          $110 - HD - IBM/Maxtor
          $60 - Burner - LiteOn
          $40 - D
          • The whole DIY vs. vendor-built argument is going to be harder to settle than emacs vs. vi or "FreeBSD is dying -- No it isn't."

            Right now I have two computers on my desk, a 17" iMac LCD 800 and an Athlon XP 1900+ with an ASUS mobo. The PC has a Lian-Li Al case with Stealth fans through the whole box. It still sounds like a freight-train coming down a hill. My stock XP install crashes about once for every 8 hours it's on.

            It was cheaper to build, but it took much longer to set up. I spent 2000 on the iMac and
          • by Paradise Pete ( 33184 ) on Sunday June 29, 2003 @11:34AM (#6325232) Journal
            I build my own systems and I sure as hell can build a screaming system for much less than any Mac out there.

            That's like dropping a crate full of automobile parts next to your neighbor's new sports car and saying "see how much cheaper mine is?"

          • Re: (Score:3, Insightful)

            Comment removed based on user account deletion
          • Couple questions:
            Does the ASUS have Firewire 400 and 800 built in? USB 2.0?
            Why did you go with a DVD ROM instead of a DVD burner?
            What about fans (case and processor)?
            Does the "midrange" video support dual digital displays?
            How fast is the FSB on the mobo?
            Max RAM on the mobo?
            Digital sound input/output?
            PCI-X slots? AGP 8x?
            Gigabit ethernet?
            Serial ATA?

            It sounds like a great system, but I want to make sure it's an even comparison to what the G5 is offering, otherwise price is irrelevant.
      • Never trust a company that is trying to sell you something.


        Hey, give those companies a break. Remember how before these fast chips came along they all used to advertise their computers as "hopelessly slow", "utterly inadequate" and (my favorite, from a Data General ad), "retarded to the extent of inducing migraines".

    • The SPEC marks use only one CPU for SPEC_int and SPEC_fp.

      Why doesn't Apple publish their marks on specbench.org? Why don't people look at
      this:
      http://www.specbench.org/cpu2000/results/res2003q2/cpu2000-20030421-02108.html [specbench.org]

      or

      this:
      http://www.specbench.org/cpu2000/results/res2003q2/cpu2000-20030421-02109.html [specbench.org]

      The compiler?
      Compiler: Intel C/C++ 7.0 build 20021212Z and
      Intel Fortran 7.0 build 20021212Z,
      Compaq Visual Fortran Compiler Version 6.6
      Update B, Microsoft Visual Studio .NET (libraries)7.0.9466,
      Mi
      • Apple did publish full disclosure information on the VeriTest site. Since it often takes SPEC a while to post results that have been submitted to them, we don't really know whether Apple submitted their results to SPEC or not.
    • Well, I wanted to judge for myself, so I went to White's site and read the piece. An excerpt:

      The G5 is impressive enough without cooking up any numbers or twisting any words. When I looked at its specifications, all I could say through my gaping jaw was a reverent "wow." This baby is a monster, with 64-bit processing, a 1GHz front-side bus for each processor, a couple of 2GHz chips, and lots more. If Apple actually ships this box in August, it will be a formidable contender in the content creation arena, n

  • Useless article (Score:5, Insightful)

    by cioxx ( 456323 ) on Saturday June 28, 2003 @10:01AM (#6319333) Homepage
    First benchmarks? This is a joke. He didn't even get to test any of the G5s, let alone bench them.
    DMN has obtained SPEC benchmark data from AMD

    Right! He obtained them.

    It's a biased opinion piece. Now I'm aware that Apple kick-started the G5 with lots of smoke, which is the nature of the business in the computer hardware world, but to discount these numbers just because of some hype during WWDC presentation is silly.

    How about we wait for the REAL benchmarks from Anandtech and put away the speculation from webmasters who can't even hire anyone older than 14 to design their websites?
    • Re:Useless article (Score:3, Interesting)

      by diverman ( 55324 )
      I agree. Wait for the REAL benchmarks.

      One thing I have to say about Apple's spin on benchmarks... Has Apple Marketing finally figured out how the rest of the companies play the game??? If so, Apple might stand a chance after all!!!

      -Alex
    • I have just obtained benchmark results that the G5 is 1000% faster than an Opteron at the same clock speed! Using the highly scientific method of picking random numbers that fit my own agenda I can conclusively say that the G5 is the fastest computer ever made, ever. In fact, it is so fast that you will never need to buy a computer ever again.

      To prove this, I draw your attention to the number 13821. As you can see, this number is over 10 times larger than the number 1259. That's right, a difference of

  • by boomerny ( 670029 ) on Saturday June 28, 2003 @10:08AM (#6319362)
    I'll wait til the systems are actually shipping and I've seen some independent real-world benchmarks before making any judgements. Xlr8yourmac.com should have some good information once they ship, and maybe barefeats.com
  • by flaroche76 ( 642126 ) on Saturday June 28, 2003 @10:14AM (#6319387)
    People obviously shouldn't form an opinion on a new platform in the first week following its much-hyped announcement. I think the only thing this first week proves is that at least Apple was able to put itself back on the map and be worthy of performance comparison with high-end systems. Or else, why would these PC-centric doofuses post early benchmarks and make asses out of themselves if not to try to defuse an apparent threat? What I want are options. I think Apple just gave me another one. But I won't base my judgement on the number of times Steve Jobs says the word 'awesome' in a keynote address or on shady benchmarks done on an apparently non-existent model (single 2GHz CPU)... I think people should let their emotions settle down and wait to get their hands on a real machine and try it out themselves...
    • by Anonymous Coward on Saturday June 28, 2003 @10:22AM (#6319427)
      People obviously shouldn't form an opinion on a new platform in the first week following its much-hyped announcement.

      Of course they should. That opinion is perfectly valid. And it is, "Wow. Those are going to be really fast. They look cool. I'm excited."

      Or else, why would these PC-centric doofus post early benchmarks and make asses out of themselves if not to try to defuse an apparent threat?

      In my experience, PC doofuses have always been big with the benchmarks. It's like a bragging right to them. "I tweaked my dual Smockron 4500 and got it up to 313.3 on SPECdickweed_base!"

      Meanwhile, us Mac doofuses (and I use the term with the greatest affection) spend that same time actually working. Because we need the extra cash to feed our $4000-a-year Mac habit.

      What I want are options.

      Oh, come on now. No you don't. What you really want is a computer that satisfies all of whatever your personal criteria for goodness are. If there were only one computer in the world but it were perfect, you'd be happy.

      The whole "what we really want is choice" thing just ain't so.
      • In my experience, PC doofuses have always been big with the benchmarks. It's like a bragging right to them. "I tweaked my dual Smockron 4500 and got it up to 313.3 on SPECdickweed_base!"

        Meanwhile, us Mac doofuses (and I use the term with the greatest affection) spend that same time actually working. Because we need the extra cash to feed our $4000-a-year Mac habit.


        Sure, certain models of x86 boxen cost less than Macs, but the benchmarks cost $500. [spec.org]. I mean validation might well be worth something to the
  • by GurgleJerk ( 568712 ) on Saturday June 28, 2003 @10:19AM (#6319409)
    Looking at everything I've seen so far, it looks like the G5 at 2.0 GHz is comparable to a current Xeon or P4 on raw speed. Maybe it lags a little bit in some areas, and in a few areas it can beat the Xeon or P4. But I think we've gotten a little too anal about the processor specs. If I'm not mistaken, Apple didn't claim "World's Fastest Processor," they claimed "World's Fastest Personal Computer."

    At 2.0 GHz, the G5 is on par with the current top processors, but what I think people need to look at is that the 1GHz bus is a monster. It allows data transfer rates that smoke other desktop systems. This is where Apple picks up a lot of speed, especially with disk-hungry programs like Photoshop. So the total system is significantly faster than the PC in terms of that kind of real-world performance.

    And there are two more things that give the G5 an advantage: price and GHz. If the claim of twelve months to 3.0GHz is true, then at 3.0GHz the G5 will be exponentially faster than a 3.5 or 3.6 GHz P4. I don't know precisely how fast the Intel chips will be in 12 months, but a whole GHz? Unlikely.

    Lastly, price is a fantastic advantage for the G5 systems. At $3000 you can buy the fastest Mac and a machine that can run certain apps twice as fast as PC systems. And it's cheaper than these top-of-the-line PCs by more than $1000. The G5 is simply the fastest, cheapest system with the most potential in the future to get even faster. When looked at in total, there really isn't a lot of debate on those points.
    • by FueledByRamen ( 581784 ) * <sabretooth@gmail.com> on Saturday June 28, 2003 @11:42AM (#6319792)
      This is where Apple picks up a lot of speed, especially with disk-hungry programs like Photoshop.
      That probably should read "memory-hungry." Disk transfers are still really, really slow -- although SATA (which is used in the G5) can go at 150 megs/sec, and so can full-duplex Gigabit Ethernet (also included). The real performance ass-kicker is the memory bus -- they use 128-bit DDR400, and I'm assuming it can be interleaved (since you're probably going to put multiple sticks in it anyway) for even better performance. They get 6.4GB/sec (gigabytes) out of it (stated at the Stevenote), which is pretty damn good. Not quite enough to saturate the processors' FSBs, but if you need to move a lot of stuff to/from RAM, PCI-X slots (optional), AGP, and the I/O controller (sound, ethernet, etc.), like in any game, any high-end 3D app, or any audio app that includes an effects processor (especially when running it on a real-time audio input, recording while also outputting the results, at 96kHz stereo), the G5 will dominate.
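      A ballpark version of that bandwidth argument is easy to try at home. The sketch below (illustrative Python, not a real STREAM benchmark; interpreter overhead means it reports a floor, not the machine's true bus bandwidth) just times big in-memory copies:

```python
import time

def copy_bandwidth_gb_s(size_mb=64, repeats=5):
    """Estimate effective memory bandwidth by timing large buffer copies.
    Rough sketch only: caching and interpreter overhead skew the result."""
    src = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst = bytes(src)              # one read pass + one write pass
        best = min(best, time.perf_counter() - t0)
    moved_gb = 2 * size_mb / 1024     # gigabytes read plus written
    return moved_gb / best

print(f"~{copy_bandwidth_gb_s():.1f} GB/s effective copy bandwidth")
```

      Run it on two machines and you get a crude but honest apples-to-apples copy-throughput number, which is closer to the "feed the CPU" story above than a clock-speed comparison.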
      • Here's something odd I noticed about Apple's new systems. You can only get the RAM in pairs of the same size. I wonder if this is just an artificial restriction in Apple's online store or some hardware trick they used to keep the G5 fed. I'm betting on the latter.
      • Canterwood's dual DDR400 and 800MHz FSB also do a very good job of getting data to and from the processor. So do the Opteron's dual DDR400 integrated memory controller and 3x 6.4GByte/sec HyperTransport links.

        Remember, the G5 isn't the only processor with insane memory and I/O bandwidth.
      • by Chief Typist ( 110285 ) on Sunday June 29, 2003 @11:42PM (#6328553) Homepage
        I got a chance to talk to the project leader for Photoshop during WWDC, and the memory bandwidth is exactly where they're seeing the major performance wins. This is also probably true of the music applications: both need to move large chunks of memory around.

        It's also interesting to note that Apple is aware of the new & cool things that having all of this bandwidth enables -- I asked at one of the graphics sessions if they had looked into using High Dynamic Range images as a standard part of Core Graphics (Quartz) -- they said "we're looking into it..."

        BTW: HDR images use a 32-bit floating point value for each component of a pixel (so you're no longer limited to values in the range of 0-255 to represent red, green, blue and alpha.) Using floating point values for each pixel gives you a lot more "headroom" when manipulating the image. A G5 with a fat & fast bus coupled with a kick-ass floating point vector unit will allow applications that Wintel can only dream of...
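        The extra headroom is easy to see in a toy example (illustrative Python, nothing to do with the actual Core Graphics API): push an overexposed image up and back down, and the 8-bit version loses the highlight for good while the float version round-trips.

```python
def brighten_8bit(pixel, gain):
    # 8-bit channels clip at 255; everything brighter than white is lost
    return [min(255, int(c * gain)) for c in pixel]

def brighten_float(pixel, gain):
    # float channels keep out-of-range values, so edits can be undone
    return [c * gain for c in pixel]

# Double the exposure of a bright highlight, then halve it again.
p8 = brighten_8bit(brighten_8bit([200, 180, 160], 2.0), 0.5)
pf = brighten_float(brighten_float([200.0, 180.0, 160.0], 2.0), 0.5)
print(p8)  # [127, 127, 127] -- clipped, detail destroyed
print(pf)  # [200.0, 180.0, 160.0] -- recovered exactly
```

        That round-trip is exactly the "headroom" being described: floating-point pixels cost 4x the memory traffic, which is where a fat bus earns its keep.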
    • And there are two more things that give the G5 an advantage: price and GHz. If the claim of twelve months to 3.0GHz is true, then at 3.0GHz the G5 will be exponentially faster than a 3.5 or 3.6 GHz P4. I don't know precisely how fast the Intel chips will be in 12 months, but a whole GHz? Unlikely.

      Extremely likely. If hobbyists at home are overclocking their P4s to 3.5GHz _now_, Intel shouldn't have any trouble getting there in twelve months.

      You shouldn't be thinking in terms of Mhz, you should be thinkin

  • real world apps (Score:5, Interesting)

    by goombah99 ( 560566 ) on Saturday June 28, 2003 @10:30AM (#6319458)
    The keynote address was fairly long, so I would guess most slashdot readers didn't actually watch all of it. In it they ran on-stage demos of tests they did with real-world apps.

    They showed four top-shelf apps: Photoshop, Mathematica, Emagic, and one other I'm spacing on. In each case the apps were not demoed by mac but rather by someone from the app company. And the examples they gave were clearly practical ones, not special cases no one would actually want to do. In the case of Photoshop it was actually a commercial product (a movie poster) recreated by replaying the artist's commands. In the case of Emagic it was the compositing of an actual musical composition that the musician had done. In the case of Mathematica it was the calculation of a fractal curve: Theodore Gray pointed out they had to dumb down the calculations so the Xeon would not run out of memory.

    In all cases the Apple ran more than 2X faster than the Xeon.

    Now you could try to say these were tweaked apps, but that won't wash. These are pro-sumer apps that these companies sell for a living. You'd better believe they would optimize the heck out of both the Wintel and Apple versions. Certainly, if there was any tweaking to be done, they had lots of time and no shortage of manpower and experts to do it on the Intel instruction set. Another test they did not demo live was the 40% higher frame rate in Quake.

    If all they had shown was some single case like Photoshop or Quake I might have been less convinced. But here are five different genres of applications, in the most demanding fields of imagery, music, (real-world) numerical math, gaming and others. Okay, so your application -- say MS Word or web browsing -- isn't so demanding. That's not the point, is it: you aren't doing things where the machine is the speed limit.

    I think it's pretty reasonable to assume that over time compilers for the new G5 will improve more than those for the i86 instruction set, since there are new things to exploit. Likewise, relatively few compilers do a good job of taking full advantage of the Altivec extensions yet. And with the fat, independent pipes to disk and memory, apps will need to be re-written, since many of the old bottlenecks they were designed to avoid aren't there anymore.

    So argue all you want about SPEC tests, but we're talking shaving ten or twenty minutes per hour of real-world usage. It's phenomenal. In my opinion the diversity of tests clearly shows the Mac is not only the fastest currently on-sale platform, but that there is not even any wiggle room to doubt it.

    • maybe (Score:5, Interesting)

      by jbolden ( 176878 ) on Saturday June 28, 2003 @11:05AM (#6319614) Homepage
      I think it's pretty reasonable to assume that over time compilers for the new G5 will improve more than those for the i86 instruction set, since there are new things to exploit.

      Actually, from an optimization standpoint x86 is pretty new too. What you need to do for the Pentium IV (pre-HyperThreading) is very different from what is needed for the Pentium III, and different again from what is needed for the PIV w/ HT. Further, the complexity is so great that the compiler science of today is really not up to the task.

      Conversely, the G5 is a much simpler problem due to better design. OTOH it's also much newer. It may be that in practice (especially when people are willing to lose 32-bit and/or G3 compatibility) you get some truly wonderful improvement.

      So I'm really not sure where there is more room for improvement over time. I just don't think it's nearly as easy to call as you had it above. In my opinion it's going to come down to a political choice regarding the G3s vs. advances in compiler technology.
      • Re:maybe (Score:5, Insightful)

        by FueledByRamen ( 581784 ) * <sabretooth@gmail.com> on Saturday June 28, 2003 @11:47AM (#6319819)
        Losing 32-bit compatibility shouldn't be a problem at all. That's the great thing about the Mach-O executable format (used by OS X) -- you can stick binaries for as many different architectures as you want in there. Hell, if Windows supported the format, you could stick an x86 and a PPC binary in there and run exactly the same file on both platforms. Ditto for Solaris, Linux, IBM's zOS -- you get the point.

        My guess is that Apple will make the 64-bit versions of the Mach-O binary loader look in a different place (I don't know how the Mach-O format is organized - the next slot? a different directory tree?) for a 64-bit native version, and fall back to the 32-bit version if one can't be found. The existing loaders will just keep looking in the same place they always have, and see the 32-bit version.
        • Re:maybe (Score:3, Informative)

          by Graymalkin ( 13732 ) *
          The fat binary format you're talking about is less a feature of the Mach-O format than of NeXT/OS X's binary loader. A fat binary is a single file image with multiple Mach-O binaries inside it. It has a header declaring the CPU types of the binaries in the file and their offset addresses so the right binary can be loaded. From there the Mach-O is loaded normally.

          It would be pretty trivial for a developer to release a fat version of their software assuming the PPC-64 port was fully func
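          For the curious, the header being described is small enough to parse by hand. Here's an illustrative Python sketch of that layout as I remember it (big-endian magic, a slice count, then a 20-byte record per architecture) -- treat the exact constants as assumptions, not gospel:

```python
import struct

FAT_MAGIC = 0xCAFEBABE              # big-endian magic of a fat binary
CPU_TYPE_POWERPC = 0x00000012
CPU_TYPE_POWERPC64 = 0x01000012     # 64-bit flag OR'd into the ppc type

def list_fat_archs(blob):
    """Return (cputype, offset, size) for each slice in a fat binary.
    A loader would pick the best match and fall back (ppc64 -> ppc)
    when the preferred slice is missing."""
    magic, nfat = struct.unpack_from(">II", blob, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat):
        cputype, _sub, offset, size, _align = struct.unpack_from(
            ">IIIII", blob, 8 + i * 20)
        archs.append((cputype, offset, size))
    return archs

# Build a minimal two-slice header by hand to exercise the parser.
hdr = struct.pack(">II", FAT_MAGIC, 2)
hdr += struct.pack(">IIIII", CPU_TYPE_POWERPC, 0, 4096, 100, 12)
hdr += struct.pack(">IIIII", CPU_TYPE_POWERPC64, 0, 8192, 100, 12)
print(list_fat_archs(hdr))
```

          The fallback logic described above (look for a 64-bit slice, fall back to 32-bit) is just a scan over that list.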
      • You can already trade G3 support for major speed gains if you use Altivec. Compiling a program for 64 bits will not give a *speed* gain, just address space and integer range. However, I'm sure Apple's and IBM's compilers are tweaked to take advantage of the G5's internal behavior quirks and performance hint instructions, and as in the example in the grandparent these optimizations might actually decrease performance on G4s and G3s.
    • isn't it great to see a /. comment thread where the people aren't flaming each other or shunning someone? :-) i love it

      but anyways... i completely agree. its not about numbers anymore. really, it hasn't been about numbers for the past 2 or 3 years once AMD Athlon XP became big. if all the little 12 year olds want to fight over decimals on benchmarks, let them. but when i saw the dual 2.0GHz G5 BLAZING past the dual 3.06GHz Xeon at WWDC, that was enough proof to me. they could show me all the numbers they
    • Re:real world apps (Score:5, Informative)

      by Andy_R ( 114137 ) on Saturday June 28, 2003 @01:42PM (#6320368) Homepage Journal
      They showed four top-shelf apps: Photoshop, Mathematica, Emagic, and one other I'm spacing on. In each case the apps were not demoed by mac but rather by someone from the app company.

      Emagic is the software company, not the program, and the fact that their Logic program was demoed by Gerhard from Emagic rather than someone from 'mac' (I think you meant Apple!) is a rather dubious distinction when you consider that Emagic is actually a subsidiary of Apple.

      Having said that, my contacts in the pro-audio community are hugely impressed by the specs that were being thrown around. Apple's decision to buy Emagic and discontinue development of the PC version of Logic was widely criticised, but I think the pay-off of having Logic optimised for the G5 will win Apple a lot of sales.
  • by xyrw ( 609810 ) on Saturday June 28, 2003 @10:34AM (#6319481) Homepage
    I'm surprised that slashdot is still stuck on benchmarks as an indication of processor speed. Hasn't it been pointed out over and over again that it is incredibly difficult to compare across platforms?

    I think it is best to leave the pointless statistics to hardware fanatics, and use whatever platform makes one most productive. As such, if any benchmark is even minimally admissible, it is `real world' benchmarks. Yet they do not complete the picture, since productivity is a function of other things, such as user experience, planning required (for the type of job), ease of use -- the list goes on, but you get the idea.

    After a point, increasing the number of FPS you get in Quake 3 is not going to make it any more fun for you; likewise, beyond a certain threshold, it becomes pointless trying to get those pro tools to run faster.
    • by FueledByRamen ( 581784 ) * <sabretooth@gmail.com> on Saturday June 28, 2003 @11:51AM (#6319840)
      Actually, a higher FPS rating in Quake3-based games (I think the magic number is around 125) lets you jump higher and run a little faster. The key is that the engine physics are computed per frame, and something about the way they're written (maybe a rounding problem somewhere in there, don't ask me) allows for higher jumps and faster movement when you hit around 115 - 125 FPS.
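      A toy model makes that mechanism plausible (illustrative Python; the real pmove code and its magic numbers differ): step the physics in whole milliseconds and snap velocity to integers each frame, and peak jump height stops being frame-rate independent.

```python
def max_jump_height(fps, jump_speed=270.0, gravity=800.0):
    """Peak height of a jump under per-frame physics.
    The engine steps in whole milliseconds and snaps velocity to an
    integer every frame, so the answer depends on the frame rate."""
    dt = round(1000 / fps) / 1000.0   # integer-millisecond timestep
    v, h = jump_speed, 0.0
    while v > 0:
        h += v * dt
        v = int(v - gravity * dt)     # velocity snapped each frame
    return h

for fps in (60, 90, 125):
    print(fps, round(max_jump_height(fps), 1))
```

      The point isn't the particular numbers this toy prints -- it's that identical inputs give different jump heights at different frame rates once timesteps and velocities get quantized.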
      • Using Quake 3 for game performance benchmarking has its good points and bad points. For the good, it is widely accepted as a good platform neutral benchmark. Platform neutrality is one of the biggest problems in creating a good benchmark. The bad part is that Quake 3 is an old engine. At this point, it is irrelevant that the G5 can get 325 fps vs 275 (?) on a P4. What I want to see is whether or not the G5 has a noticeable advantage on newer games like UT2k3 where the difference between 50 and 100 fps is im
    • by dbrutus ( 71639 ) on Saturday June 28, 2003 @12:20PM (#6319952) Homepage
      But past a certain framerate I can compile in background and still run Quake at an acceptable speed. Sure, it makes the compile slow down some but if you're going to take a 10 minute break, isn't it nice to be able to get some work done in the background at the same time?
  • real world apps?! (Score:5, Interesting)

    by andrewleung ( 48567 ) on Saturday June 28, 2003 @10:45AM (#6319529)
    now, how come everyone is just focusing on SPEC benchmarks?! which compiler, what options were set, etc.?!

    i saw the keynote, they had photoshop/mathematica/etc. going on there... photoshop has been out on PC for a while... REALLY enhanced with MMX/SSE/SSE2... and it probably was using the intel compiler... but the G5 version was only a few months old, barely optimized, and using whatever tools apple gave them (probably GCC 3.3)... and the G5s still kicked a lot of ass.

    benchmarks are important but it's not my job. if i can get shit done faster in photoshop with BSD guts, i'm all for it.

    fuck the benches. welcome to the REAL world...
    • Benchmarking (Score:4, Insightful)

      by Llywelyn ( 531070 ) on Saturday June 28, 2003 @05:22PM (#6321545) Homepage
      "The best benchmark is the app you want to use"

      Wisest advice I've ever heard--it was in my machine org and assembly textbook.

      *Any* cross-platform benchmark should be taken with a shaker full of salt--they simply do not represent real world performance.

      SPEC, for all of its nice points, also falls into this same category. In the end, when all is said and done, people prefer to confuse the model with reality--they think that real world performance follows SPEC scores.
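      In that spirit, the most useful "benchmark suite" is a few lines wrapped around the job you actually run every day. A minimal sketch (illustrative Python; substitute your own workload function):

```python
import time
import statistics

def bench(fn, repeats=5):
    """Time a workload several times; return (best, median) seconds.
    Best-of-N filters out scheduler noise; the median shows typical runs."""
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    return min(times), statistics.median(times)

def my_workload():
    # Stand-in for the app you care about (your filter, your compile...)
    sum(i * i for i in range(200_000))

best, median = bench(my_workload)
print(f"best {best * 1000:.1f} ms, median {median * 1000:.1f} ms")
```

      Run the same harness around the same workload on two machines and you have a benchmark that actually answers your question, which is more than any cross-platform synthetic score can claim.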
  • XBench and Altivec (Score:3, Informative)

    by norwoodites ( 226775 ) <pinskia@gm3.14159ail.com minus pi> on Saturday June 28, 2003 @11:38AM (#6319780) Journal
    The problem with the XBench Altivec test is that it uses some instructions (dst) that are very bad to use on the G5; see the technotes about tuning your program for the G5:

    The Altivec test uses the dst instruction every iteration through a loop, which slows down the G5 (it might also slow down the G4).
  • Logic is available for both Mac and Windows. Why were the PCs using Cubase instead? They are different pieces of software -- what an unfair comparison.
    • Quite simply because Logic is no longer being developed for Windows (now that doesn't happen every day!). So to make the comparison "fair" they chose the next competing product. Simple as that really.

      The above is based on the live feed done by iPalindrome @ arstechnica.com. The important bit is as follows:

      [14:51] Cubase on Windows vs. Logic on Mac
      [14:51] Complex music piece created for the Matrix trailer
      [14:51] Play the PC first then the Mac
      [14:52] PC CPU is spiking around 85-90%
      [14:52] A

  • by Arkham ( 10779 ) on Saturday June 28, 2003 @01:11PM (#6320206)
    At this point, does it really matter if Intel, AMD, or Apple is the slightly faster computer?

    They're all extremely fast and all run one or more UNIX-like Operating Systems (Linux or BSD or OSX). For the Slashdot crowd, Windows is an afterthought, but I'll mention it as well.

    What a person decides to buy is not going to be based on speed anymore. All of the fastest current machines will blaze playing Quake 3 or UT2003.

    People who buy Macs may enjoy the speed, but that's not why they buy them. They buy them because they're cool, they have a really nice, easy-to-use, elegant OS that allows them to be productive. Also, they can use the commercial applications (Photoshop, Office, FileMaker, etc.) they need on a stable, reliable UNIX platform.

    Linux/BSD users have a very different set of criteria. They're looking for cheap, super-secure, stable, configurable or some other particular criteria, but are not particularly concerned with the UI experience or with running commercial desktop applications.

    Windows users are a different group too. They want to run their commercial and vertical applications. They are not looking at Linux or Mac because their apps are not there.

    That's why there's not a lot of crossover right now between Mac and Intel/AMD. The audience is just different. Thanks to things like Lindows, there may be some Windows->Linux crossover, but this too is pretty small.
    • At this point, does it really matter if Intel, AMD, or Apple is the slightly faster computer?

      The simple answer is: YES! Speed does matter! Your argument reminds me of what all of us Mac users used to say in these last couple of years right before the G5 announcement. It felt as if you had a really small penis and tried to defend it with the good ol' saying that "It's not the size that matters. It's how you use it!". We were all going "Speed doesn't matter. My penis can surf the net, listen to music and
      • "This is true to a certain extent, but how is it for them who absolutely need to use a penis that is as big as possible to get their work done as fast as possible?"

        Then either:

        1) A 10% difference is not going to matter.

        2) They are going to use clusters, which mitigates the speed hit of the single system dramatically.

        3) They will likely be looking at those things which are specific to what they are doing.

        That being said, for the vast majority of people buying systems, a slight speed difference is not going to matter.
        • It's not the length, it's the width.

          I ran out of money recently and had to sell all my Amiga holdings (one A3000 and two A2000s). Temporary setback, I assure you.

          Managed to get a 733MHz VIA C3 back from a friend.

          My Amiga 3000 with a 68030 25MHz CPU was smoother than that C3 abomination. I was able to use it as an X terminal thanks to Xami. (I love the 17" LCD that I had attached to the VGA port on the Amiga.) Later set it up on the CyberVision64.

          So it isn't about the speed. It is however about the processor (and
      • The "size does matter" argument doesn't always go that way. It's more often pulled out by PC users who think the speed of their PC somehow makes up for their personal shortcomings in the anatomy department.

        Of course speed does matter to an extent, but my point was that at some speed, it ceases to make any real-world difference. When the G4 was half the speed of PCs in the market, and not able to keep up with Apple's "Aqua", this argument made some sense. If I am a Photoshop artist, a few extra FLOPS may make a difference.
  • by OrangeHairMan ( 560161 ) on Saturday June 28, 2003 @05:02PM (#6321431)
    ...is that these things don't have an optimized operating system yet. It's like running benchmarks of Photoshop on Windows ME on a dual Opteron or something.

    Once 10.3 comes out, and once 64 bit apps get optimized, this system will kick even more butt...

    Orange
    • get it optimized today. With a dual G5, you should get it installed in a few days.
      • But that will not help, because you do not get anything like the CHUD tools to optimize your programs; it will only optimize using gcc's scheduling, which is weak and needs some help for the G5/970/POWER4. There are cases where gcc should place noops so that the processor does not reject a dispatch group (the most common is the LSU reject, caused by a load and a store to the same address in the same dispatch group, which is very bad).
  • by garyebickford ( 222422 ) <gar37bic@@@gmail...com> on Saturday June 28, 2003 @07:06PM (#6322101)
    It's not always the cycles, it's how they're spread around and how you use them.

    I still have an original 25MHz NeXTstation. The CPU is a Moto 68040, plus a Motorola Digital Signal Processing (DSP) chip that does (most of?) the rendering for both the display and the laser printer.

    Back in 1999 I compared this box in actual usability with a Mac Powerbook 5300, admittedly the slowest and lamest PPC Mac ever built.

    I found that in general usage, opening windows, updating display, doing word processing, etc., the NeXT outran the PB 5300.

    Compiling speed sucked big time. Stuff that took a few minutes on the PB5300 ran overnight on the NeXTstation. This demonstrated to me the advantage of having a display coprocessor.

    The user interface was also far better than the Mac's at that stage. I used several 3rd-party enhancements, such as one that provided an infinite-size virtual window, so it's not a completely fair comparison. The NeXT also came with a bunch of cool apps, like Mathematica, Webster's, and Lotus Improv (a completely unique approach to spreadsheets, so far unduplicated).

    The most impressive thing about the NeXTstation was the industrial design. It is still the most elegant design I have ever seen in a desktop computer. For example, the ribbon cables from the mainboard to the floppy and the disk are about 1.5 inches each - just a 90 degree curve, essentially. Those are the only wires inside the box!

    I've still got the NeXT, though it's back in the original boxes. I'll probably sell it eventually. I've also got three Perq workstations from 1982-3, but I haven't benchmarked them.

    It's worth noting that NextStep's complete object integration across all apps was cited as a major inspiration for Tim Berners-Lee's original proposal for the World Wide Web. In fact, I even have a running copy of that first version of TBL's code, called (surprisingly) "WWW".
  • by nuckin futs ( 574289 ) on Saturday June 28, 2003 @09:18PM (#6322732)
    Intel (and others) could dispute every benchmark out there, but no matter how fast a P4 or Xeon is, it has one major problem which prevents me from buying one...
    It still can't run OS X.
    And no...rumors about an Intel based Mac running OS X deep inside Apple HQ doesn't count.
  • A glimmer of hope (Score:2, Interesting)

    by navig ( 683406 )
    Even with all the disputes for and against the new G5s, it is good to see Apple providing a worthwhile high-end machine.

    The fact that these benchmark arguments are even occurring is 'a good thing' for the Apple community.

    For the last few years, Apple owners have always had to begrudgingly admit that they had no hope of beating Intel/AMD on nearly any performance metric. Thanks to the G5 they now have a glimmer of hope (and pride)!

    It is also good to see Apple announcing a 3GHz edition of the G5 in the near future.
  • What a joke (Score:5, Insightful)

    by coolmacdude ( 640605 ) on Saturday June 28, 2003 @10:36PM (#6323027) Homepage Journal
    I stopped reading the article when I got to the subtitle where it refers to Apple as a "Cupertino Fruit Company." Look, Mr. White, if you aren't even going to show any respect at all, and even mock one of the companies in your so-called comparison, how do you expect anyone to take your evaluation seriously?
  • Real World Benchmark (Score:5, Interesting)

    by inertia187 ( 156602 ) * on Saturday June 28, 2003 @11:49PM (#6323322) Homepage Journal
    Here's my idea of a real world benchmark. Take 75 people with varying levels of technical know-how. Divide them into three groups of 25, and assign various real world tasks.

    Obviously one group of 25 is using only the latest and greatest that the Wintel people have to offer, while another group is using only the latest and greatest that Apple has to offer.

    What is the third group doing? Each person in the third group gets to choose which platform they can use.

    All three groups would be given real world objectives. Some would be as simple as writing a report. Some would be as technical as application development. Others would be as pointless as a Quake III tournament. All would be measured for how much time it took to complete, and/or other pertinent measurements to see which platform stood out. This is less of a performance test and more of a productivity test.

    What is the third group for? It's the preference control group. Do people really prefer one platform over the other AND are they more productive when they can choose? That's what I'd really like to know. Most companies are dead set on one side or the other (usually wintel). If anyone goes off the beaten path, they are the black sheep.

    Personally, I like to work on multiple platforms - some at the same exact time. If the current BitTorrent implementation is better on OS X, I'm using it. If the best IRC implementation is in the X Window system, I'm there. If it's quicker for me to pull up the Windows calculator when I'm trying to convert a decimal value to hex, that's what I'll do. But am I really being more productive (and why am I using BitTorrent and IRC to measure this)?
  • by nettdata ( 88196 ) on Sunday June 29, 2003 @12:16AM (#6323406) Homepage
    A lot of people seem to be making a big deal out of benchmarks, but at the end of the day, I'm still going to buy the thing because it's the fastest Mac on the planet, and I don't care HOW it compares to other chips/boxes.

    The only comparison I'm interested in is how it does against the G4... and it ROCKS.

    Now I'm just waiting for a dual proc G5 XServe to be released...

    *drool*

    • Now I'm just waiting for a dual proc G5 XServe to be released...

      I'm waiting for a 4-WAY XServe. Check out the section of Apple's site for Panther Server. They seem to be going after (caution - over-used bad business jargon coming) enterprise-wide applications and enterprise users (end jargon) pretty seriously. All I have to say is that if so, it is about time.

  • by caleugene ( 531964 ) on Sunday June 29, 2003 @05:43AM (#6324108)
    Spin doctoring Apple's spin doctoring...classy.

    Charlie White is quick to rattle off about Apple's marketing practices, but he seems to forget how, oh, the rest of the industry does this too. It's standard practice.

    AMD would have you believe their chips are 3200+ fast...whatever that means. As if Quantispeed isn't the current biggest marketing annoyance on the planet...I mean how can AMD sit around trying to convince people of the MHz Myth when they can't even convince themselves...forcing themselves to use Pseudo-Hertz...

    And lovable Intel...with their NetBurst Architecture...it makes the internet zippier! Or HyperPipeline Technology. It must be good...

    If Charlie White really wants to convince people the G5 sucks, he should be a little more candid about his bias.
  • by AvantLegion ( 595806 ) on Sunday June 29, 2003 @11:25PM (#6328500) Journal
    Isn't it "ironic" that the vast majority of users that argue over benchmarks are NOT people that run tasks where the +/- 5% differences would make a difference?

  • by White Roses ( 211207 ) on Monday June 30, 2003 @06:39PM (#6335179)
    I'd say the fact that this moron uses the haxial idiocy as a reference link to external information pretty much eliminates any credibility this guy had.

    Oh, and right on BOXX's homepage, it says Workstation. And speed? In fact, the fastest Opteron you can get is 1.8GHz. So, again, this guy is an idiot. And if he wants to spend about $1000 more (yes, that's right, check the dual 2GHz G5 against the dual 1.8GHz BOXX with similar specs) on his system, then he's fallen into the same trap that all us deluded Mac users have evidently fallen prey to: quality costs money. Perhaps it's the fact that a G5 costs $1000 less that makes it "not a workstation"? Hmmm? 'Praps? And anyway, it's an Opteron. If that's what the G5 is competing against, why is AMD bothering to make the Athlon64, which they freely admit is their desktop 64-bit processor? Let's see what these Opteron systems do against, say, a Power4.

    It's also so very nice of him to blindly trust AMD. Surely, they have nothing to gain by claiming that they have the fastest processor, oh no. And AMD naming their chips with blatantly misleading numbers, well, that's not marketing at all, is it? How can this Wintel court jester say that AMD has more or less credibility than Apple?

    And here it is, the crowning turd on the dung heap: "But then, there's credibility, which some people believe is everything." Though, evidently, not this delusional puppet, because he has none.

  • by afantee ( 562443 ) on Tuesday July 01, 2003 @06:27AM (#6338024)
    Anyone doubting the speed of G5 should take a look at this

    http://www.tpc.org/tpcc/results/tpcc_perf_results.asp?resulttype=noncluster&version=5

    In short, the IBM pSeries 690 with 32-way 1.7 GHz IBM Power 4 is 10% faster than the newly released HP 64-way 1.5 GHz Itanium 2 6M Madison, which means that, per processor, the Power 4 is about 220% as fast as Madison, and much faster than the 3 GHz Xeon.

    According to IBM, the Power 5 will be 400% faster than the Power 4 and is coming next year. It looks like Apple is in good company.
  • by afantee ( 562443 ) on Tuesday July 01, 2003 @07:02AM (#6338100)
    According to this article:

    http://www.computerworld.com/hardwaretopics/hardware/server/story/0,10801,82642,00.html

    A Dell 1.3 GHz Itanium 2 (Madison) server costs twice as much as a dual 2 GHz G5 Power Mac.

    There were 1900 Itanium 2 servers sold in the last 3 months - an embarrassing figure shared between so many OEMs. According to Intel, there are only 400 native programs for Itanium.

    In contrast, there are over 6000 native OS X programs that will run on the G5 with no modification, and there should be many 64-bit apps in the next few months. So why should anyone want to pay twice the money for a hot and noisy Dell with less performance, fewer features, less style, and much less software than the dual G5 Power Mac?
