Mac vs. PC Digital Photography Comparison Redux
Macmurph writes "Bibble Labs has released a lightning-fast version of its RAW image converter, MacBibble. According to MacBibble creator Eric Hyman, 'MacBibble 3.x is almost 10 times faster than the manufacturer's software when converting RAW files under OS X.' Preliminary tests indicate the Mac may be faster than PCs at RAW image conversion after all. This calls into question the relevance of the hotly debated article Rob Galbraith posted just 3 weeks ago and discussed here on Slashdot. Two thumbs up for the PowerPC G4's AltiVec vector processing engine, now being put to work in MacBibble."
Multi-processors (Score:4, Interesting)
Re:Multi-processors (Score:5, Informative)
It's probably a question of economics more than anything else. For most end-user applications, a 2-CPU system probably delivers a smaller percentage increase in performance than its percentage increase in cost right now. But up till now it's been cheaper to replace a single CPU with a faster single CPU than to invest more upfront in a multi-CPU system -- you have to keep it longer, which means you fall farther behind the current performance curve.
If it became 'standard' to have them, OS and app vendors would be able to deliver a performance jump out of 2 CPUs through better parallelism that would outweigh the associated increase in hardware costs.
In the PC world, there's also the historical problem of lack of mainstream OS support for multiple CPUs -- I can't remember if consumer XP even supports it, now that I think about it -- which creates that chicken-and-egg problem. NT4 was a highly marginal 'consumer' OS, Win2k had more reach but still not what the 9x series had, and XP adoption has been slower due to people just keeping PCs longer.
I've had a dual-CPU system at home for 3 years, and I'm not entirely sure I'd replace it with another one once I looked at the economics of it. The biggest single benefit I can think of is that it doesn't bottleneck the way a single CPU can when a single process pigs out at 100%: I still have a nearly-idle CPU to work with -- which is also the problem with 2 CPUs; one's nearly idle.
Re:Multi-processors (Score:5, Interesting)
Re:Multi-processors (Score:4, Insightful)
Essentially, the effect you're mentioning could be handled on a single-CPU machine simply by running a scheduler that guarantees that no process will get more than every second timeslice, or similar, penalizing single-threaded applications.
Re:Multi-processors (Score:2)
At work I have (because I'm too cheap) a 1.xx GHz PIII, and at home I have a dual PIII 667 (actually 650 overclocked a tiny amount). I think the work PC is faster overall and just as responsive, even though it's only a single-CPU system.
I realize that much of the advantage is the 133 vs. 103 MHz bus, but that's part of the point -- if you stick with cheaper, simpler single-CPU systems, you can get a new one faster and take advantage of other improvements on the motherboard sooner than you can with a more expensive (and hence kept longer) dual-CPU system.
Re:Multi-processors (Score:5, Insightful)
Could it be that there's also an element of laziness on the programmers' part? I expect it to be easier to write an application that expects to run on one processor (you don't need to worry about dividing tasks over multiple processors to optimize performance) than a multi-processing app.
And, who really tries to optimize performance today? IMO many programmers expect Moore's Law to take care of the performance increase (relative to the previous release of their program). I rarely see a version n+1 of an application that's faster than version n was on the same hardware.
Re:Multi-processors (Score:3, Insightful)
It's not laziness, it's priorities. Optimization is low priority in programming; if there are other things that need to be done, they need to be done first. And hardware optimization comes even lower on the priority meter, especially hardware that only a few users have, and especially hardware that will at most give you a 2x speed-up.
Re:Multi-processors (Score:4, Interesting)
Good point. The question is then, why is speed such a low priority when the #1 argument to buy a new PC is that 'it's faster', and the #1 comparison statistic (useless or not) is the processor speed?
Re:Multi-processors (Score:2, Insightful)
So long as software speed is perceived as "acceptable", that's generally where the optimization stops. This, of course, is not true for highly competitive software markets. Maybe we need a little more competition.
Re:Multi-processors (Score:2)
Re:Multi-processors (Score:2)
Re:Multi-processors (Score:3, Interesting)
XP Home does not support dual processors. As a matter of fact, this may render it an unusable OS for HT processors, because what they do is present the OS with two logical processors.
There's something else to consider about OS support for multiple processors: The UI needs to support it, too. I run Win2k on a dual machine, and because I'm aware of how to direct a process to an idle CPU, I'm able to make reasonably effective use of the second processor. Unfortunately, though, I have to go into Task Manager and hunt down the process I want to change. I'm able to use it, but I have a difficult time explaining this to other people. The big problem is that it's hard to tell, at a glance, which process is doing what.
I really hope that one day Windows (or whatever OS I end up using in the future) puts something in the titlebar of each app I'm running so I can set which processors it is active on. That would seriously be cool. I'm not a big fan of running apps in multi-threaded mode, because I like having the CPU resources to keep the interface etc. going. If, at a click, I could say "stop using this processor", I'd be one happy SMP adopter. I'd even be able to recommend this type of machine to less tech-savvy people.
I gotta ask, though: does anybody know of a hack for Windows that lets me do this?
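For what it's worth, the Win32 API does expose this via SetProcessAffinityMask; below is a minimal sketch of a command-line tool built on it, assuming you grab the target's PID from Task Manager (the tool name and usage here are just for illustration):

    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Usage: setaffinity <pid> <mask>
       mask is a CPU bitmask: 1 = CPU 0 only, 2 = CPU 1 only, 3 = both.
       Pins the target process to the CPUs set in the mask. */
    int main(int argc, char *argv[])
    {
        HANDLE h;

        if (argc != 3) {
            fprintf(stderr, "usage: setaffinity <pid> <mask>\n");
            return 1;
        }
        h = OpenProcess(PROCESS_SET_INFORMATION, FALSE, (DWORD)atoi(argv[1]));
        if (h == NULL || !SetProcessAffinityMask(h, (DWORD_PTR)atoi(argv[2]))) {
            fprintf(stderr, "failed to set affinity\n");
            return 1;
        }
        CloseHandle(h);
        return 0;
    }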
"The biggest single benefit I can think of is that it doesn't bottleneck the way a single CPU can when a single process pigs out at 100%, I still have a nearly-idle CPU to work with -- which is the problem with 2 CPUs, one's nearly idle."
True dat! Explorer is well multi-threaded, and is always very responsive on a dual machine. When I first started working on a dual, I couldn't believe how enthusiastic Explorer and IE were, even when running a rendering in the background or something. The dual processor support didn't buy me 2x the speed (though potentially it could get close, depending on how efficient I plan on being...); what it did buy me was that my computer was ALWAYS useful, as opposed to being completely busy chewing away at something.
Ah how I ache for a dual processor laptop.
Re:Multi-processors (Score:2)
Wrong.
XP Home does not support dual physical processors. However, it fully supports a hyperthreading CPU and its second logical CPU.
Re:Multi-processors (Score:2, Interesting)
It's an OS issue... (Score:2)
Even today the only way to get a branded SMP box is if you splash out a *lot* of money on a 'workstation' model. Most people just build their own box.
Funny how a lot of these same users bashed MacOS (up to 9.x) for not having true preemptive multitasking, etc. etc.
I am still having problems even now with SMP under Windows... the driver for my Handspring Tréo keeps crashing
Incredible! (Score:3, Funny)
Stay tuned for next week when we take the existence of bikini waxes to show that George W Bush is a paedophile.
Re:Incredible! (Score:2)
Re:Incredible! (Score:3, Insightful)
Re:Incredible! (Score:2, Insightful)
Re:Incredible! (Score:5, Informative)
The trick is getting programmers to take the time and effort to optimize for specific platforms. Writing quality code takes time and money, but in the era of Microsoft's timeline-driven products, quality software code is harder to come by.
Re:Incredible! (Score:5, Interesting)
Typically the PowerPC (seen in much of the www.top500.org list of fastest clusters) trounces Intel and even AMD at almost every benchmark.
Not just at the 10 famous benchmarks that form the composite in ByteMark, but at many other things, such as the RC5 contest.
According to the RC5 benchmarks, AMD is far slower than dual-CPU Macintoshes (half as fast). (Source is available for the core RC5 loops for most processors.)
The Mac dual 1 GHz G4 is faster than all existing dual AMD motherboards in the RC5 benchmark by almost 100%.
A 21,129,654 RC5 keyrate for a dual 1 GHz G4 system! And now Apple sells a dual 1.25 GHz stock, and this week a dual 1.42 GHz, which would be even faster.
A dual 1800+ AMD MP gets only HALF as many as the Mac: 10,807,034 RC5 keys!
Funny how the "MHz myth" shows itself there, I guess... Apple is now selling even FASTER machines than the one I mentioned, made over a year ago, but with smaller caches and slower read-write RAM (it now uses DDR on the newest boxes).
The Mac I mentioned uses a 2 MB L3 cache, and no AMD MP dual-CPU boards I know about have any L3 cache at all, so maybe that is why some common Macs are over twice as fast; it's not just meager AltiVec tweaks to RC5. AMD has similar, but less amazing, vector ops.
The Pentium 4 takes many cycles (over 7?) to do a simple left shift. That is why the Pentium is MUCH slower than even the AMD or Mac.
Most modern CPUs can do a left integer shift in 1 cycle, any barrel position, not 7 slow cycles.
(Shifting is used a lot in decryption, encryption, graphics processing and many other things.)
Another reason the Mac might be over twice as fast as a dual AMD MP board is not just the 2 MB L3 cache but the fact that the Mac can read and write a cold page of memory simultaneously FASTER than AMD MP designs, which are biased for linear access and streaming. Many memory-scatter benchmarks show this too. Apple's newest DDR-RAM machines might not offer this feature, though.
True, RC5 fits in primary cache of most machines, though interrupt services need larger caches depending on interrupt designs and load for the rest of the OS.
The RC5 benchmarks are never run with interrupts off, they use real world overhead.
The Macs made since September can also RAPIDLY service every PCI slot almost simultaneously, one 32-byte cacheline each, if needed. How can they do that? Three cool features of modern PCI:
* out-of-order completion
* address bus streaming
* intervention
Out-of-order completion allows the memory controller to optimize the data bus efficiency by transferring whichever data is ready, rather than having to pass data across the bus in the order the transactions were posted on the bus. This means that a fast DDR SDRAM read can pass a slow PCI read, potentially enabling the processor to do more before it has to wait on the PCI data.
Address-bus streaming allows a single master on the bus to issue multiple address transactions back-to-back. This means that a single master can post addresses at the rate of one every two clocks, rather than one every three clocks, as it is in the 60x bus protocol.
Intervention is a cache-coherency optimization that improves performance for dual-processor systems. If one processor modifies some data, that data first gets stored only in that processor's cache. If the other processor then wants that data, it needs to get the new modified values. In previous systems, the first processor must write the modified data to memory and then the second processor can read the correct values from memory. With intervention, the first processor sends the data directly to the second processor, reducing latency by a factor of ten or more.
AltiVec is not usually the reason a Mac performs better than Intel in benchmarks of properly compiled code, because the famous set of 10 algorithms in ByteMark did not use ANY AltiVec instructions.
And the AMD bests the Intel at RC5 mainly through integer features.
I laugh when PC people try to dismiss the fastest machines (Macs) by claiming AltiVec "cheating" all the time. The Mac people should be the ones to call foul: Intel was caught PAYING Adobe to slow down filters in one version of Photoshop to artificially make the 166 MHz Pentium MMX look faster. They got caught paying big bucks. Adobe replied that it was an unfortunate side effect of adding optimization for MMX and not keeping the code efficient in the non-MMX case as it was before. HA!
Almost every PC person likes to use benchmarks that use lots of assembly for Intel (Quake, etc.), but they shy away from benchmarks that offer source code in ANSI C.
I knew the Mac handled RAW better than PCs, and this news is no surprise to me.
Re:Incredible! (Score:5, Insightful)
Using RC5 as a benchmark is only relevant insofar as you want to compare RC5 processing speeds. The RC5 algorithm, as well as the specific implementation found in dnetc, has many aspects which make the results you obtain meaningless for general use. You simply cannot compare RC5 rates and hope to extrapolate or project them into general processor comparisons.
The RC5 algorithm relies heavily on bitwise rotates (left, if you're curious, ROTL) which is an operation that is not commonly found anywhere outside the world of RC5. This instruction is so underused, in fact, that many x86 architectures (AMD's K6 for instance) have taken to simply emulating the ROTL operation and eliminating true hardware support. This is why some conventionally powerful platforms (such as Sparc and Alpha based systems) do abysmally in RC5 as compared to x86 platforms containing a hardware ROTL implementation.
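(For the curious, here is what a rotate-left looks like in portable C -- a minimal sketch, with the function name mine. Whether the compiler can collapse it into a single hardware instruction is exactly the architectural difference described above.)

    #include <stdint.h>

    /* Rotate a 32-bit word left by k bits (0 < k < 32).
       On a CPU with a hardware barrel rotate this compiles to one
       instruction; on chips that dropped or emulate ROTL, the
       compiler emits a shift/shift/or sequence instead. */
    static uint32_t rotl32(uint32_t x, unsigned k)
    {
        return (x << k) | (x >> (32 - k));
    }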
Then again, this level of detail is probably lost on someone trying to compare a 1GHz G4 against an "AMD motherboard". AMD has made quite a number of CPUs in the past few years and their range of performance on RC5 is very broad. At one time, the AMD K5 was, in fact, the best-performing architecture in RC5 with the most keys per clock. AMD doesn't make any motherboards as far as I know.
The core of dnetc is also small and lean, often fitting entirely in L2 cache on many architectures. This means that dnetc does not adequately (if at all) exercise memory bus bandwidth. The cores also tend to be hand-tuned assembly, so they aren't as likely to exercise a processor's speculative execution routines. RC5 uses absolutely zero floating point math, also an uncommon scenario and not representative of many apps you would traditionally think of when you think of apps which require strong CPUs to perform well.
Many people enjoy having machines which perform well at RC5 and generate impressive distributed.net stats. Consequently, RC5 shows up as a metric in a great number of reviews and analyses on architectures and CPUs. I'm tickled whenever I see it and I think it's a great addition to any CPU review. However, it's not valid to try to make the claim that RC5 performance rates mean anything more than RC5 performance rates.
Moo!
Re:Incredible! (Score:4, Informative)
Let's start...
Uhh... I cannot even start to debunk this. Probably because I don't get what you mean, except that you were whoring for karma with the open-source crowd.
Never heard of them. How about the industry-standard SPEC benchmarks? Oh, wait, Macs are twice as slow when compared to Pentium IIIs with the same clock speed, IIRC. Apple is so ashamed of the processors they use, you won't see a single SPEC benchmark published by Apple.
I have covered that extensively on the Slashnet forum with DCTI. [slashnet.org] To make a long story short, the rotate operations (not bit shifts) were made available in the AltiVec instruction set, and MMX/SSE2 doesn't have them. Observe that these (for the most part) useless instructions are only provided on the x86 and PowerPC ISAs; all other major CPU architectures do not offer them. The more I think about it, the more it seems Apple was going for ultimate RC5 performance by including these rotate operations in AltiVec -- so they could have at least one benchmark where they'd always be ahead of everyone else, as long as they can keep their clock speed within 33-50% of x86 processors (that's 2-3 times less, if you haven't realized).
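(For reference, the AltiVec C interface exposes that rotate directly -- a sketch assuming a G4 and an AltiVec-aware compiler, e.g. gcc with -maltivec; the function name is mine:)

    #include <altivec.h>

    /* Rotate each of the four 32-bit lanes of x left by the per-lane
       amounts in r. vec_rl is the vector rotate discussed above;
       MMX/SSE2 has no direct equivalent, so x86 code has to
       synthesize it from shifts. */
    vector unsigned int rotl4(vector unsigned int x, vector unsigned int r)
    {
        return vec_rl(x, r);
    }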
Wrong, only 4 cycles.
This has to be the worst piece of BS I have ever read in my life.
This is where you have shown how you don't understand anything you're talking about. This cache-snooping protocol is a feature of the Athlon (I doubt the Macs have it), and it is valid for the whole range of memory and not only the PCI bus -- which probably is marked as uncacheable in the MTRR so reads and writes are not cached, you obviously don't want that for I/O data.
Quit the karma whoring, troll.
Re:Incredible! (Score:2)
I went to specmark.org to look for the proof of your statement, but they actually don't have any Motorola processors listed as tested. I would love to find the comparisons that you mentioned. I do remember one instance where Apple actually released SPEC benchmarks where the G4 beat either a P3 or P4 (I can't remember exactly), but there is zero chance of my finding that info now.
You tell'em buddy.
Re:Incredible! (Score:2, Informative)
Re:Incredible! (Score:3, Insightful)
"Widescreen" doesn't mean anything at all. My computer screen is 1.6:1. My TV screen is 1.78:1, though most of the content I watch on it is 1.85:1. When I go to the movies, the screen is usually 2.35:1. "Widescreen" can be applied with equal truthfulness to any or all of these.
And you wouldn't need the extra room for your apps if damn Apple would put back in the multiple terminals every other Unixy product in the world has. Damn Crippled Unix.
The only thing I can figure is that you're talking about virtual desktops. Either that, or you're an idiot. Maybe both.
You're getting shafted, paying for *way* more accessories than you need, but with a low powered CPU that will have you upgrading in a year, before you could even *find* a network that uses Gigabit ethernet.
My home network is 100% gigabit Ethernet. All you need to build a 100% gigabit Ethernet network is two Macs and a cable. It doesn't even have to be a crossover cable; all Macs are equipped with autosensing MDI-X ports.
That means a 3Ghz CISC still kills any G4 out there.
Except running Bibble, evidently. And BLAST. And all the other stuff that a G4 is faster at than a Pentium.
Apple is a speck on the PC world's radar.
I think you've got that backwards. The PC world is just a speck on Apple's radar. Apple is quite happy to go their own way and let the PC world go do... whatever it is that the PC world does. Every once in a while, the PC world takes a look at what Apple is doing and changes direction a bit, but that's about the limit of the interaction.
Re:Incredible! (Score:2)
Don't tell me that the quest for the meaningful, objective benchmark has ended in success.
who would have thought... (Score:4, Insightful)
I mean this is not rocket science! You would get similar results on most any machine using SSE2/MMX and hyper threading (perhaps...).
Re:who would have thought... (Score:5, Insightful)
Re:who would have thought... (Score:5, Interesting)
Rather than indicating that the distributed.net team would rather see PowerPC 74xx systems triumph in the key-crunching race, it would indicate that MMX/SSE2 are a royal pain in the ass to leverage unless you're coding/decoding pretty specifically what they were designed to code/decode - though IANAC++P...
and again /. fires off the flame/zealot war (Score:3, Insightful)
Biased... (Score:5, Insightful)
The reality remains that benchmarks prove little.
People who are in love with Macintosh have, throughout history, had the speed card in their deck. At this particular time, many would argue they don't. (Many would argue they do...)
People who are in love with other platforms, hardware and software, like their platforms for specific reasons, as well. Speed may be one of them.
But, I think, deep down, Mac users are attached to the platform for more than just speed. It's the efficiency of the operating system, the attention to detail, the clean interface, the simple plug-and-play, the good support, the Apple iLife products...
It's all in the eye of the beholder.
jrbd
Re:Biased... (Score:3, Interesting)
Re:Biased... (Score:3, Informative)
Not only is the hardware of decent quality, but it runs all the software I want. I get commercial packages such as Office and Cubase VST, and I can also drop to a terminal and apt-get basic Unix packages, courtesy of Fink.
I'm posting this from a dual 1 GHz G4 with 15K SCSI drives. It's a fast machine, but more importantly, it's a good machine.
Re:Biased... (Score:3, Insightful)
I have to disagree. I would say that the particular benchmark in the original article, and to a lesser extent even the much-maligned Steve Jobs Photoshop benchmark, are really the only kind that count - real-world tasks that (some) users will actually be using (a lot). The problem is when people imply or believe that these narrow benchmarks of specific tasks mean something about the performance of the chips more generally. A VERY fair criticism of Steve Jobs, though to be fair his audience is largely made up of people that make their living with Photoshop. It's not a fair criticism at all of Rob Galbraith, whose audience is entirely made up of people that not only make their living with Photoshop but spend most of their time working with RAW-formatted images.
The original article by Rob Galbraith was an extremely fair test of how well different platforms did a particular set of processor-intensive/time-consuming tasks that professional photographers are doing all day long. It doesn't matter WHY one performs better than the other. It doesn't matter if the test is "fair". It doesn't immediately matter* if the results are lousy on one because the software is written in a notoriously slow scripting language running under emulation, and great on the other because it was written in assembly - it only matters that I can process more photos in an hour on one machine than I can on the other.
*One caveat to the above - in the long run it might matter that the tests are at least somewhat "fair", because the next upgrade to the software might change things dramatically - which appears to be the case with this second test.
Hot Damn! (Score:2, Funny)
3DNow! (Score:5, Insightful)
Re:3DNow! (Score:5, Informative)
Re:3DNow! (Score:5, Informative)
result = vec_add( aVector, someOtherVector );
and it works properly regardless of what sort of vector you've chosen to use for aVector.
I've yet to see anything similar for 3D Now or SSE/SSE2. Everything I've seen for them is either a library that is too application specific (like a premade image recognition library), or requires using assembly and a compiler newer than VC++ 6.0 (maybe only SSE2 really requires that).
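(For comparison, the closest SSE2 analogue I know of looks like the sketch below -- it does need a compiler that ships emmintrin.h, i.e. newer than VC++ 6.0 as noted above; the function name is mine:)

    #include <emmintrin.h>  /* SSE2 intrinsics */

    /* Add four packed 32-bit integers: the SSE2 analogue of the
       vec_add() call above. Note the intrinsic name is tied to the
       element type, so changing the vector's element type also means
       changing the function name -- unlike the overloaded vec_add(). */
    __m128i add4(__m128i a, __m128i b)
    {
        return _mm_add_epi32(a, b);
    }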
Apple also provides a bunch of other libraries, like vDSP (I'm sure AMD and Intel provide an equivalent), and BLAS (this is a somewhat standardized library across platforms; my recall is that there is an SSE/SSE2 version, but Intel charges money for it, instead of giving it out for free). In general, they make it easier for Apple developers to take advantage of AltiVec than Intel does SSE2 or AMD does 3DNow. Unfortunately, a lot of developers want to maintain only one code base across all platforms, so they won't use the Apple-provided tools (there are free unoptimized versions of BLAS for every platform though, so developers should at least use that, so they can get the speed benefit on platforms that provide an optimized one). Which sucks, because GCC also sucks for speed, so people using vendor-supplied compilers on other platforms (like Intel's on Windows or Linux, SGI's on Irix, Sun's on Solaris) get a nice speed boost that would require hand assembly optimization to get on a G4.
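(To illustrate the BLAS point: because the interface is standardized, a call like this sketch compiles and runs unchanged whether it's linked against an AltiVec-tuned library, an SSE-tuned one, or the free unoptimized reference version.)

    #include <cblas.h>

    /* y = 2*x + y over four elements: the standard BLAS "saxpy"
       routine via the portable CBLAS binding. The source is
       identical on every platform; only the linked library differs. */
    int main(void)
    {
        float x[4] = {1, 2, 3, 4};
        float y[4] = {10, 20, 30, 40};
        cblas_saxpy(4, 2.0f, x, 1, y, 1);
        return 0;
    }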
Re:3DNow! (Score:2)
Hardly a fair comparison (Score:5, Interesting)
Hang on a moment. The last Mac vs PC test was conducted fairly - Photoshop on a Mac vs Photoshop on a PC. Using nearly-identical software the clear answer was that the fastest PC today was faster than the fastest Mac.
Now someone writes more efficient code for the Mac, then tries to claim that Macs are somehow quicker than PCs? Talk about an unfair test - that's like writing a pi calculator in BASIC for the PC and seeing how quickly it can calculate 1 million decimal places on a 2 GHz P4, then writing one in assembler for a Mac Classic. If the Mac Classic wins, does that mean the Mac* is faster at calculating pi than a PC?
* Macs in general
Re:Hardly a fair comparison (Score:3, Insightful)
The name's the same, but we're talking about assembly level optimizations which necessarily differ between the two platforms. It's a different program for a different machine.
The fair test would be to pit the fastest Mac app against the fastest PC app.
Re:Hardly a fair comparison (Score:4, Informative)
No, you didn't read the article - it conducted tests using several different pieces of software, but not one of them was Photoshop, although one was a third-party Photoshop plug-in. The tests were very narrowly focussed on a specific set of tasks of vital interest to professional digital photographers but of very little interest to anybody else.
The complaint at the time was that all of the software used was originally written for the PC and ported to the Mac. To use your analogy: in the original article the pi calculator was written in assembly for the PC but in BASIC for the Mac, and the Mac suffered for that. Despite the MHz gap, this was counterintuitive to those who follow this kind of issue, because it was exactly the kind of specialized task where the PowerPC's superior vector/SIMD performance should have (and was assumed to) MORE than compensated for its slower clock speed. Still, it's a perfectly fair test, because if you're interested in doing that task and this is the only software to do it with, it doesn't matter to you WHY one system is faster than the other, only that it IS.
NOW, however as a follow up on the original story & controversy
A fair criticism of
Re:Hardly a fair comparison - LIAR Mac faster in C (Score:4, Funny)
cache bribes? How much did Intel give them, 64K or more?
The Trick with RC5... (Score:2)
P4 -- Works on one key every other clock cycle.
AMD -- Works on one key per clock cycle.
G4 -- Can work on 4 keys at once.
Or something else along those lines--regardless of the actual numbers, it demonstrates how powerful AltiVec really is.
Re:Hardly a fair comparison (Score:2)
When will people realise... (Score:5, Insightful)
When will people realise that raw speed, although useful to designers and artists, is NOT the be-all and end-all of which platform is preferable for this industry?
The main reason why Macs are so dominant in publishing and art is because of the old (true) cliche - it just works. Designers are generally NOT technical people; they think with the other side of their brain all day long, and technology confuses them. So even if a PC goes 20% faster at some filters, if they can't figure out problems with DLLs, conflicts, registry problems and having to reinstall Windows every 9 months, then which is the better system for them?
How about usability and workflow (please comment on these only if you've used both machines (Win & OS X) in a demanding and very time-specific industry to a large extent)? OS X, hands down; it allows me to ignore the fact that I am using incredibly advanced technology and *do my job*.
This allows me (and hundreds of thousands of others) to get a much bigger performance boost out of my work than a faster processor.
What are the productivity gains of perfect networking, great UI, better support for FireWire, BlueTooth, wireless stuff etc. etc. etc.? It's not quantifiable, but it is much more important than slightly faster processors, so let's just stop the whole thing there.
So in brief: processor speed is important (and it's nice to see the Mac keeping up in one area), but not so important that it outweighs the other thousand reasons design professionals use Macs.
-Nex
Re:When will people realise... (Score:2, Insightful)
It's funny you should say that... the last 5 or 6 graphics PCs in our office have all come with Windows XP. We installed Photoshop, Illustrator, Pagemaker, hooked them up to our Samba server, network printers, scanners etc and... they just worked!
In fact, they have done for over a year now with virtually no problems at all.
And they have a right-mouse button and a scroll wheel!
Re:When will people realise... (Score:3, Informative)
No offence, but I use a PC for programming and my Mac for design work. I once tried to work solely on the PC for everything in the interim before getting my new Mac; I tried it for nine months and did some traditional high-end stuff. Speed aside, I found Win XP to be much less reliable to work with - I felt it was always trying to find a way to screw up behind my back - and I *do* think that your experience of XP is uncommonly good if you use the machines daily and hard.
But if it works, and you like it then good! (Although if you haven't used OS X (I'm guessing you switched from OS 9, or from no Macs at all) then I think you're missing out).
Oh, and search Slashdot for 'Mac +Mouse' to read the 13,000 posts that describe how you can plug any 'normal' mouse into OS X and it'll work instantly...
-Nex
Re:When will people realise... (Score:2)
I can only speak for myself - I don't doubt that many people DO have nightmares with XP, I'm just saying that (so far) we've been lucky.
I guess I'm sceptical of the "Macs never crash and are easier to use" line because of my own (very limited) experience of OS 8.6.
Re:When will people realise... (Score:2, Interesting)
Before I bought my PowerBook last year, my only experience with Macs was classic (7.5.5 through 8.6), and I wouldn't have gone near 'em with a barge pole, let alone blown two grand on a new Apple laptop. An afternoon with OS X changed all that, and this was back before Jaguar.
Seriously, give the modern Mac a real test-drive; your inner geek will love you for it.
Re:When will people realise... (Score:3, Interesting)
That part is actually the beauty of the USB spec.
If you design a device to the spec, then it will work on any OS that implements drivers to that spec. So you shouldn't need special Mac drivers for USB speakers, drives, mice, or keyboards if they were properly designed. I even had a Mac friend who wasn't sure how to hook up his computer to his new receiver that has a USB jack. I told him to just plug it in. And it worked. The maker didn't even include a CD, drivers or any of that crap.
Also, the right to sell products with the USB logo means that the company is required to have those products compatibility tested - which means any product with the USB logo is going to work, so you don't have to look up special hardware compatibility lists. The reason why computer standards didn't work as well in the past is because those standards didn't mandate compatibility tests.
I know USB has its downsides but it works.
Re:When will people realise... (Score:2)
Re:When will people realise... (Score:2)
I think Apple has already integrated most such drivers into OSX, because the public interface to many common devices is specified by the USB standard. Any good USB digital camera would show up as a hard drive. Tablets probably fall under the existing Human Interface Device spec. Scanners would fit too. I don't know about printers though, I imagine having a specific driver might allow you to use more features.
It is even easy to buy several specific development demo boards from Cypress Semiconductor (among other companies) that show how you can design and produce 100% USB compliant keyboards, mass storage and other devices.
Re:When will people realise... (Score:2, Interesting)
get a mac
I believe you're talking ..... (Score:2)
There's the USB spec; if you follow the spec, you can find out information about any device you plug in.
(like PCI)
This has nothing to do with actually using the device
A USB device will still require the device-specific control, data and interrupt messages to function properly. These are put into a driver, and that driver, 9 times out of 10, is OS-specific.
Re:I believe you're talking ..... (Score:2)
That way, you don't need a specific driver by Logitech for your specific mouse; as far as the OS is concerned, the basic functionality of the mouse is the same.
Re:When will people realise... (Score:2)
Yeah, so? The trackball I used to click 'Reply to This' has two left buttons, two right buttons, a scroll wheel, and six programmable buttons across the top (Kensington Turbo Mouse Pro). And it works just wonderfully on my Mac, as will any proper USB device. Oh yeah, and my computer colour matches properly, which I'm sure your PCs don't. And it may be a small thing, but I like my 'command' key on my Mac rather than 'ctrl+shift' on a PC.
Re:When will people realise... (Score:2, Interesting)
These guys - the guys I know, at least, are a close knit community. And technology just isn't that important to most of them. So they use a Mac because they know that if they get stuck, there's a whole host of other people trying to use the SAME exact software the SAME way on the SAME platform that will help them out.
I'll add to that (Score:2)
Me too!
I generally agree, though I would add a couple of points. First, Apple's dominance of this particular market is itself a compelling reason for those entering the market to stick with it. If you are a designer, you are going to be expected to use Macs wherever you work - what kind of machine are you going to buy for yourself? If you are hiring designers, you are going to be hiring people that have always used Macs and might *never* have used PCs - what kind of machines are you going to buy for them? If you freelance, everyone you work with will be sending you files from Macs and taking the files you provide and using them on a Mac - you can use a PC, but it will add at least a little hassle. If you write software for these people, which platform are you going to focus on? In the past the PC didn't even have the software required; that's changed now, but a lot of designers (especially somewhat older ones like me) still have the idea that you simply *can't* do professional design work using a PC.
Second, I think focussing on stability/reliability is a little unfair. I think Windows is a lot more reliable than Mac users believe, and MacOS 9 - which many designers are still using until Quark gets its act together - certainly had NOTHING to brag about in that department. The real advantage is more subtle but perhaps more significant, especially considering how it apparently compensated for OS 9's UNreliability. That is: when it worked, it really did work more intuitively. This was certainly true back when the competition was DOS, then Windows 3.x, and even Windows 95. After Windows started to improve by borrowing heavily from the "Mac way", the Mac was still more intuitively easy - partly from long familiarity by this point, and partly from the continued focus on that value, which shows up in Apple's focused attention to little details that Microsoft seems to treat only as afterthoughts.

I'd say OS 9 was an instance of getting all the big things wrong but getting all of the details, at least from the user's perspective, right. UNIX is probably the exact opposite (which is why MacOS X is so exciting), and Windows is an (unhappy?) compromise between the two, criticized from the UNIX side for getting the big things wrong (though not as wrong as the old MacOS) and criticized from the Mac side for getting the user-interface details wrong (though not as wrong as UNIX). MacOS X has a real chance of getting both the fundamental things and all the little interface details right, though it's still a little immature and suffers from having had to make some compromises. It is not quite there yet, though I think they are pulling ahead of the competition.
Re:When will people realise... (Score:3, Interesting)
The main reason why Macs are so dominant in publishing and art is because of the old (true) cliche - it just works... if they can't figure out problems with DLLs, conflicts, registry problems and having to reinstall Windows every 9 months, then which is the better system for them?
What are the productivity gains of perfect networking, great UI, better support for FireWire, BlueTooth, wireless stuff etc. etc. etc.? It's not quantifiable, but it is much more important than slightly faster processors, so let's just stop the whole thing there.
----- -----
I agree with your premise - that in the end, the result is what matters, and if you can save 10 hours of headaches with sacrificing a few seconds here and there, then you are probably better off.
However, to your point above, please see Rob Galbraith's post about 10 down from the top of the discussion forum related to his comparison [robgalbraith.com].
He states that he continues to use Macs as his primary machines. However:
"For a major project that ran through much of last year, I got up close and personal with Windows XP Professional running on the humble Dell box in the speed report. I connected a whole raft of pro digital SLR cameras, over a dozen card readers, plus several CD writers, several inkjet printers, a flatbed scanner and a film scanner. Every device connected and worked without a hitch, many of them sucking their own drivers from the ether and configuring themselves. Way, way cool."
"On the Mac, it was as it always has been for me dealing with pro digital photography peripherals, whether in OS X or earlier iterations of the operating system. Some devices worked fine, though many required the manual installation of drivers, while some devices, and especially USB and FireWire card readers didn't work at all. Or required a driver for OS X 10.1, then a different one for 10.1.2, then a driver change again in OS X 10.1.3. Ugh. I've had fairly serious ongoing fights with my film scanner, so much so that I only use it on the PC now, where it just works. Where's the true plug and play in that?"
"Part of this is just dumb luck of course, because with a different PC and different peripherals I could have been given a rougher ride by Windows XP, and an easier ride by the Mac. As it happens, however, life with Windows XP in 2002 was a breeze compared to the Mac. By OS X 10.2.3 things have settled down a lot on the Mac side, but for the speed report I experienced yet again an incompatibility between one card/reader combo that was not replicated on the two PCs. After awhile, these types of experiences make me think that Apple needs to spend more time delivering true plug and play for the pro digital photographer, and less time marketing the notion that they do."
"Keep in mind, my preference would be to remain on the Mac, and right now, two of the key applications I use everyday are Mac only, so I'll boot up my Mac first every day for a while yet. But I won't stay on the Mac because of what I now consider to be outdated notions about the Macs ease of setup and use, since my experience using the other platform is that life is okay over there, even preferable in certain, specific ways."
The speed of the hardware is irrelevant... (Score:4, Insightful)
Anyone remember the Byte benchmark? (Score:2, Interesting)
The reason it was faster was that the G3 had more on-chip cache, which suited the benchmark, and said absolutely NOTHING about the rest of the system.
A computer is as fast as its bottleneck... when evaluating performance, it's best to see as many REAL-WORLD benchmarks as possible. No use having a 12 GHz processor if you still use 33 MHz memory.
Sager notebook (prev article reference) (Score:3, Insightful)
I've completely stopped using the mac for all my conversion needs- maybe this app would be better, but really the speed difference is significant between the two platforms.
Maybe if I was willing to shell out $4k (USD) for a newer mac platform, just to get a few minutes faster at conversion, I could get some speed-up- but for that price I could buy two more of these laptops, with 2.8/3.06 Ghz procs and a gig of RAM. That's the typical Mac owner's conundrum.
Mind you, someone could write an SSE2-enabled RAW file converter, and it would perform the same way. Hand-crafted code that's optimized for speed using specialty CPU features is good for everyone, regardless of platform.
Now if only this guy could make CF cards transfer faster...
Re:Sager notebook (prev article reference) (Score:3, Insightful)
Blockquoth the poster:
See, it's crap like this that tells me most Mac bashers really have no clue what they're talking about. The absolute supreme top-of-the-line PowerMac that was just introduced this past Tuesday only costs $3800. You have to add a bloody RAID array to the configuration to even crack $4000...
...and that's only through Apple's own online store [apple.com]. Buy it anywhere else and you can get another $300 to $500 worth of free scanners, printers, cameras, and/or software.
Enough with the FUD already.
$2,420.00 (Score:2)
Based on MacBibble's specs, you should be able to convert a D1X Raw at 3008x1960 to JPG in about 2 seconds, if your storage media can even keep up.
I'm not doubting that your Sager is faster if you say it is - I'm just surprised you need it to be any faster than this.
Re:Sager notebook (prev article reference) (Score:2)
but in the article, the Alienware was 2-3x faster than the 800MHz G4 on the Bibble software. Perhaps you don't have an 800MHz G4; perhaps it's only 400MHz (I don't think they were ever any slower), which means that your Sager is possibly 4-6x faster than the slowest G4 I can imagine.
And in the previous article, it was shown that Bibble's software was about the same speed as the Nikon software, on the Alienware, and about twice as fast on the Mac, though the PC was still faster.
However, with the *new* MacBibble, the process is 10x faster than the native software, which means that if you have a 400MHz G4, MacBibble should be exactly the same speed as Bibble on your Alienware... and if you have anything faster, like 500MHz, or 550MHz, or 800MHz, or 867MHz, or whatever, your Mac (at least for RAW conversion) is now faster than your PC.
Why would you want to spend $4k when your current Mac might very well be faster than your current PC? What speed is your Mac, anyhow?
jeezopetes (Score:5, Insightful)
I'm not really sure how many times it has to be said, but a great number of Mac users don't use Macs because they're faster. In fact, let me say it again:
It's not about speed
I really can't believe that the Slashdot community--being so "in tune" with corporate ploys and runaway marketing tactics--still falls for the MHz propaganda, and the speed benchmarks that accompany it.
Since when is the most important thing about a computer the speed? Granted, if you're playing BitchBlaster 2023 that requires a GeForce9000 Mx2+3.144 video card, maybe.
But I'm not sure if people noticed: Most Mac people aren't die-hard gamers. Macs aren't great gaming platforms anyway. They're for people that do work with their computers and rely on them.
These people care not about the absolute speed of their Mac, rather, they care that it works every time that it is booted and that the end-user experience is much more pleasant than someone using something like Windows XP.
So please, people of Slashdot--I know you have above average intelligence:
It's not about speed.
Re:jeezopetes (Score:2, Funny)
You really haven't been here long have you.
It's All In The Interpretation (Score:3, Insightful)
It all comes down to a combination of hardware and software, and it's relatively easy to skew the results either way using these factors. So getting an unbiased test is going to be very unlikely, even in the best of conditions.
My motto is, if it works for you, go with it.
Dr. Wu
Adobe Photoshop 7.0.x AltiVecCore Update plug-in (Score:4, Informative)
No word on whether this gives the PS on G4s any kind of speed boost, though.
Perfect Example between Mac vs. PC (Score:2, Interesting)
PC Favoring Article:
Tests done on 4 computers, 4 different processors, 2 OSs; over 30 tasks done using 6 different programs.
Mac Favoring Article:
Tests done on 1 computer, 1 OS, 1 processor; 1 task done on 2 different programs.
It just appears to me that this is an unfair comparison. It seems that the conclusions of the former test are founded on principles of scientific testing and have more credibility, whereas the conclusions of the latter article are amusing at best.
Re:Perfect Example between Mac vs. PC (Score:2)
i am a scientist, and neither of these tests has any sort of scientific rigor about them. you got the second one ok, but the first one used a poor choice of software (should have been the leanest, most optimized package for each system), a very limited set of test processes, and incredibly disparate hardware - yes, it was supposed to be 'top of the line' systems, but it's really bananas to cherries here; it's like dragging an extremely elegant, sophisticated, state-of-the-art 800hp F1 car up against an old-school brute-force 8000hp Top Fuel dragster and declaring the dragster a much better overall car, because it can romp in the 1/4. by that reasoning, i could say that this rock repels tigers...
the original test (Score:2)
Now that the PowerMacs were just updated, it would be interesting to see how the results would differ.
(I argue that the original tasks were particularly x86-friendly, with focus on SSE etc. and no focus on the comparable AltiVec - basically a set of tasks chosen that would favour PCs all along, and not accurately reflect graphic designers' actual work habits.)
Sweet Mother of God (Score:5, Insightful)
I love Macs, I've used them exclusively for over 10 years now and don't see myself switching anytime soon. Given that...
To Mac zealots:
PCs are faster than Macs. Get over it. Yes, the PPC chip is more elegant and efficient, but it runs slow (relative to Intel). Good AltiVec applications are few and far between and don't really apply to the day-to-day home and business user. If the PPC 970 comes out this summer, then maybe Macs will again TEMPORARILY hold the speed crown, but until then, PCs are faster by using brute force. If sheer computing performance is your #1 requirement, then a PC should be your choice. If you're poor and only have $400 to make sure your child has a computer, then a PC is your only choice. Don't even start by saying with that money you could buy some 1997-era Mac either. Please.
To PC zealots:
The overall user experience on an OS X system outweighs the fact that Win XP may idle faster when running Word. In those applications that can take advantage of vector processing, Altivec is far superior to 3DNow and SSE. Plus, I see a lot of complaining about the program was written explicitly for the Mac so the comparison is unfair. Welcome to our world. Most software written to support hardware (scanners, cameras, etc.) is a blatant PC port of a hastily written "good enough" POS program. Plus, Mac laptops have better battery life AND get the full desktop chip, not some crippled "mobile" version designed to prevent penile burns and 20 minute battery life.
Personally, I'll take elegant and efficient any day. Quite frankly, I'm glad the PPC has temporarily lagged behind. It's forced Apple to really tighten up things to keep competitive and it shows. This might not have happened if the processor would make up for any code bloat and inefficiency. Look at Safari - 3MB download. Look at OS X speed from 10.0 to 10.2. Phenomenal. When the 970 comes around, OS X should theoretically run like a champ.
Who Cares? (Score:5, Interesting)
I have a 1.33GHz Athlon. I have a CPU usage graph sitting in my system tray. My CPU usage almost never goes above 20% (exceptions: compiling and encoding Oggs, which will use 100% CPU however fast your CPU is). On a new Mac, a lot of the GUI-related CPU load is shunted to the GPU, and PPC chips do run faster than x86 chips per MHz (this was never in dispute; the dispute is whether a 1GHz PPC can outperform a 3GHz x86, which stretches even my 'will-to-believe'). So, if I upgrade to a new Mac with dual 1.42GHz CPUs I get...
And the reason I'm still using a PC? Cost. At the moment, my 18-month old system really isn't slow enough to justify upgrading it.
Read the press release? PCs WEREN'T MENTIONED! (Score:2, Informative)
NOTHING! This says NOTHING about performance in relation to x86. NOTHING! How could this -possibly- shed ANY light on the previous debate about performance between PCs and Macs? It's not like the previous article used any benchmarks involving software the camera people released.
*sighs* At least slashdot posts something meaningful every now and then.
not even close (Score:2)
Macs are not faster than PCs, period. This is where I lose so much respect for the Mac community: when they resort to bullshit because the numbers don't add up. Especially with the high-end PowerMacs - I'm sorry, but you can buy a $1,500 PC that will be much faster than a $2,500 PowerMac. Notice that I used the word "faster"; the PC may be faster, but OS X makes the PowerMac more powerful.
Personally, I will take Linux on commodity hardware any day. I hate the idea of being limited to one vendor and the most exciting software development today is happening on Linux.
Re:not even close (Score:4, Insightful)
Well, you see, this is all a matter of choice and opinion. If you ask me where the most exciting software development is happening, I'm going to say it's on the Mac. Final Cut Pro/Express, DVD Studio Pro, Cinema Tools, Shake.
I do video/film editing for a living, and these programs significantly reduce the cost of running a professional shop. Now, instead of spending umpteen gazillion dollars for a high-end Avid system, you buy a $3000 PowerMac, spend $2000 on FCP and DVDSP, wisely invest in a good monitor and an NTSC monitor, and throw in a couple of terabytes of FireWire storage. You can have a broadcast-quality studio for a fraction of the cost of setting up an Avid or Media100 shop.
That, to me, is exciting development.
video is a no brainer (Score:2)
If you're doing video editing, a Mac is the obvious choice, no doubt about it. It used to be the same for graphics, but not anymore. When I'm talking about software development, I'm talking about software development as a whole, not a certain niche or category.
D60 support? (Score:2)
The last ten times I've been to look, it's been nothing but a maze of twisty passages (all alike) to discover the Mac version only handles D30 files, not D60.
So
Quality is also an issue (Score:2, Insightful)
The claim of a 10x speed improvement over vendor versions notes that it uses a Kodak colour system for accurate conversion, but it's still difficult to judge without an independent quality review.
Mis-titled article (Score:2)
Mostly because a) no one uses a Mac or a PC to do the actual photography, and b) what platform you do your post-processing on is independent of the camera you take the photos with.
Mac/PC Zealots (sp?) IGNORE THE OBVIOUS.... (Score:3, Interesting)
Second - of course Bibble Labs is going to say their product is faster on a Mac - that's MARKETING - look at their page: "This version of MacBibble is completely multithreaded and Altivec Optimized for G4 Process."
This is the exact same thing as saying that the SAFARI web browser is "FASTER" than IE. According to some benchmark on Apple.com's page it is, but in my own use on my dual G4 500, I don't notice a difference - except for the fact that Safari can NOT DRAW PAGES WORTH A DAMN - you can even try going to www.google.com with Safari - the page is all fucked up. So even if it is "faster", it DOESN'T MEAN IT'S BETTER...
And on that note - my preference in browsers is Opera, because it can turn off pop-ups - but this is just a disclaimer for all the IE/Safari zealots that would flame me otherwise.
Re:Mac/PC Zealots (sp?) IGNORE THE TROLL.... (Score:3, Informative)
I have been using Safari since the day it was announced and now I almost never use IE. Safari definitely renders
With Safari you can block pop-ups too. If you're stuck in a loop of 'em just hit cmd-K, or select "Block Pop-Up Windows" from the app menu.
BTW you do raise one excellent point: "even if it is 'faster' it DOESN'T MEAN IT'S BETTER..." - my point exactly when referring to the whole Mac/PC debate. A few years ago, when Macs really were quite a bit faster, I told myself that I would still prefer the Mac even if they weren't as fast as PCs. Guess what? Today it's debatable which is faster. I still don't care.
pudge is a troll (Score:2)
Re:Isn't this pointless? (Score:3, Interesting)
Re:Isn't this pointless? (Score:3, Interesting)
The Mac has the ability to do some cool wider-pipeline stuff and specialized vector processing - but you need to design stuff especially for it - otherwise it isn't as efficient, and you lose to the big-block Intel/AMD family.
I think the PlayStation 2 had this problem at first - it is *highly* optimized for vector processing, and the first batch of releases for it hadn't taken full advantage of that.
If I can come up with a scenario that is useful to me where I really *need* a Mac, I'd consider it - but at this point, they are simply cute as hell and that is about it.
Re:Isn't this pointless? (Score:2)
Everyone figures that a 5-liter engine must get worse mileage than a 2-liter, but they rarely consider the fact that more torque means fewer revs.
A 400hp 5.7-liter Chevrolet Corvette gets the same 28 MPG (highway) as a 140hp 1.8-liter Mazda Miata, and it weighs an extra 400kg.
Re:Isn't this pointless? (Score:2)
I have no idea why you would call the Corvette stupid when it's one of the world's best sports cars. C5Rs finished 1st and 4th *overall* in the 24 hour race at Daytona in 2001 against far more exotic (and apparently delicate) machinery.
Re:Isn't this pointless? (Score:3, Funny)
Re:Isn't this pointless? (Score:2, Interesting)
Re:Isn't this pointless? (Score:2)
Re:Isn't this pointless? (Score:2)
http://cedar.intel.com/media/pdf/games/
or any of the intrinsics in the Microsoft compilers (same search)
http://msdn.microsoft.com/library/defaul
or maybe one of the other 2,770 hits I got by doing that search.
In the worst case, you can bite the bullet and write your own SSE2 C wrappers like someone did for the AltiVec functionality?
There is no such thing as a "special" C interface.
Re:So what? (Score:3, Insightful)
point is that it's tech related and of interest to plenty of
Re:The mac is fastest at RC5 and tons of routines. (Score:4, Funny)
Re:Slashdot Slashdotted? (Score:2)
posted from Chimera ;)
Re:New phrase coined: (Score:2, Informative)
So, it's not a Beachball of Death. Maybe it is a Beachball of Partial Death.
Re:If I could only afford it ! (Score:2, Informative)
Re:RAW format (Score:3, Insightful)
Digital photographers prefer the RAW format over JPEG or TIFF for several reasons. A good analogy is to consider a RAW file as a digital negative, and a JPEG or TIFF as a color slide.
RAW images contain more information from the camera - they're unprocessed, like a digital negative. JPEGs will have much of the same information, and with a low compression ratio will often have similar 'quality'. When you bring these into Photoshop and try to modify or play with the pictures, a RAW file will give you more information to fiddle with.
Rob Galbraith [robgalbraith.com] explains this in greater detail [robgalbraith.com]. I've included some relevant quotes below: (snip)