NVIDIA Challenges Apple's iPad Benchmarks
MojoKid writes "At the iPad unveiling last week, Apple flashed up a slide claiming that the iPad 2 was 2x as fast as NVIDIA's Tegra 3, while the new iPad would be 4x more powerful than Team Green's best tablet chip. NVIDIA's response boils down to: 'it's flattering to be compared to you, but how about a little data on which tests you ran and how you crunched the numbers?' NVIDIA is right to call Apple out on the meaningless nature of such a comparison, and the company is likely feeling a bit dogpiled given that TI was waving unverified webpage benchmarks around less than two weeks ago. That said, the Imagination Technologies (PowerVR) GPUs built into the iPad 2 and the new iPad both utilize tile-based rendering. In some ways, 2012 is a repeat of 2001 — memory bandwidth is at an absolute premium because adding more bandwidth has a direct impact on power consumption. The GPU inside NVIDIA's Tegra 2 and Tegra 3 is a traditional immediate-mode design, which means it's subject to significant overdraw, especially at higher resolutions. Apple's comparisons may be bogus, but the Tegra 3 bandwidth issues they indirectly point to aren't. It will be interesting to see NVIDIA's next move and what their rumored Tegra 3+ chip might bring."
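A rough sketch of why tile-based rendering saves external memory bandwidth under overdraw. The overdraw factor and byte counts below are invented for illustration; this is a toy model, not a measurement of any actual Tegra or PowerVR part.

BYTES_PER_PIXEL = 4   # 32-bit color
OVERDRAW = 3.0        # assumed average times each pixel is shaded per frame

def immediate_mode_bytes(width, height):
    # Traditional renderer: each shaded fragment is a read-modify-write
    # against the framebuffer in external DRAM, so traffic scales with overdraw.
    return width * height * OVERDRAW * BYTES_PER_PIXEL * 2  # read + write

def tile_based_bytes(width, height):
    # Tiler: visibility is resolved in on-chip tile memory, so external DRAM
    # sees roughly one final write per pixel, regardless of overdraw.
    return width * height * BYTES_PER_PIXEL

for name, w, h in [("iPad 2", 1024, 768), ("new iPad", 2048, 1536)]:
    imm = immediate_mode_bytes(w, h) / 1e6
    tbr = tile_based_bytes(w, h) / 1e6
    print("%s: immediate %.0f MB/frame vs tiled %.0f MB/frame" % (name, imm, tbr))

The gap scales with resolution, which is exactly why the Retina panel makes the bandwidth question sharper.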
Re:This is funny. (Score:5, Insightful)
One also has to consider that the older iPad 2 smeared the floor with the Tegra 3, so why would they think that twice the performance is 'meaningless'? Considering Apple typically doesn't play too loose with the marketing statistics for metrics like battery life, real-world performance, etc., I don't find this to be a stretch. It will be interesting to see the real-world benchmarks when the hardware arrives.
http://hothardware.com/Reviews/Asus-Eee-Pad-Transformer-Prime-Preview/?page=7 [hothardware.com]
Re:This is funny. (Score:5, Interesting)
Smeared the floor with Tegra 3? I'm sorry, but meaningless benchmarks are meaningless. I hold both Tegra 2 (Ice Cream Sandwich) and iPad 2 devices in my hand at this very minute, and I can tell you that there is essentially no noticeable difference between the two in terms of responsiveness or 3D performance from the point of view of the end user (and that's despite the iPad 2 having a significantly lower-resolution screen than the Tegra 2 device. The latter has 30% more pixels than the iPad 2 does.)
For the iPad 2 to "wipe the floor" with the Tegra 3, the Tegra 3 would have to be significantly slower than the Tegra 2, and it isn't. Hence, these benchmarks can be nothing other than complete nonsense.
Re: (Score:3)
But your highly scientific benchmark is?
Re:This is funny. (Score:5, Informative)
http://hardware.slashdot.org/story/11/12/01/1518244/nvidias-tegra-3-outruns-apples-a5-in-first-benchmarks [slashdot.org].
Note that in the Apple infographic, they're claiming a 2X speed advantage over the Tegra 3 with the iPad 2's A5. That doesn't appear to be true, though the A5 has some advantages.
Whether the A5X has picked up enough ground to quadruple the Tegra's performance remains to be seen, and given Apple's debunked claims about the A5, it seems unlikely.
Re: (Score:3, Insightful)
According to the first link in your linked article, the iPad2 beat the Tegra 3 by a very good margin: http://hothardware.com/Reviews/Asus-Eee-Pad-Transformer-Prime-Preview/?page=7 [hothardware.com]
Re: (Score:3)
It may well be that the PowerVR GPUs are faster than NVidia's conventional GPUs, but that's still only one part
Re: (Score:3, Informative)
So... I'm guessing you just read the headline and skipped this part: "*Update - 3/9/12: We became aware of an error in calculation for our GLBenchmark Egypt Offscreen results here and have since updated the chart above. As you can see, the iPad 2 boasts a significant performance advantage in this test versus the Tegra 3-powered Transformer Prime."
Re:This is funny. (Score:4, Interesting)
> 12 years of membership
What is this, a comparison of e-penis size as implied by length of Slashdot membership? Maybe instead of trying to divert attention from the fact that you were called out as an Apple shill (which I don't need the anonymous coward to tell me, because I already know it's the case; this isn't the first time over the years I've seen your nick on this board attached to exactly this type of post), you should take it like a man and hang your head in deserved shame.
Re: (Score:3, Interesting)
I'll take any legitimate criticism if it's posted by an actual logged-in member, and as long as it is accurate - I don't mind that at all.
What I do mind is being accused of being someone else (I am not); being accused of being paid to post (I have never been, nor will I ever be); or, as some other posts have suggested, being one of several sock-puppet accounts for a PR firm.
I mention the length of time I've been on /. merely as an aside. It's not a dick waving contest - I don't even have a particularly lo
Re: (Score:2)
Ummm... Are you testing this by playing the same game on both?
Re: (Score:2)
Is there some commercial or ad you are referring to?
Re: (Score:3, Informative)
No, he's referring to a conspicuous weakness of the final lines of (quad-core, btw) G5 macs compared to the company's own first competing Intel offerings. Another not-so-well-known weakness is that they also drew more juice under load than most full-sized refrigerators.
Re: (Score:2)
I believe the parent is pointing to the fact that Apple doesn't appear to have put out any misleading marketing materials claiming that the PPC G5 was dominating Intel's chips. Is there some marketing benchmark out there that Apple lied about?
Re: (Score:2)
No, he's referring to a conspicuous weakness of the final lines of (quad-core, btw) G5 macs compared to the company's own first competing Intel offerings. Another not-so-well-known weakness is that they also drew more juice under load than most full-sized refrigerators.
The G5s were faster than the P4s of the same period, but the Core Duo was a significantly different animal from the P4s that were around when the G5 came out. I see no conflict here. Intel learned some things from the competition and stepped up their game with the Core series of processors.
Re: (Score:2)
Not sure about the water-cooled system comment, but they did add "4G" to the iPhone 4S with an OS patch... everyone knows the software didn't upgrade the hardware to LTE.
Re: (Score:3)
I guess it depends on what the carriers are calling "4G". I assume the menu displays whatever the carrier has termed 4G, since the 4S supports most of those "3.5G" technologies that have been rebranded as 4G.
Although sometimes software upgrades can upgrade hardware - remember the enforced $1.99 charge for the 802.11n patch on some early systems that shipped with draft-n hardware but no software support? (Yes, yes, I know that's not what has happened with the 4S.)
Re: (Score:2)
My point is they changed it with a software update... whereas the non-4G-but-marketed-as-4G Android phones have always been that way.
Sketchy marketing if you ask me.
(Whether or not AT&T was behind the decision doesn't matter, because Apple included it in THEIR software update.)
here you go :-) (Score:2)
http://www.youtube.com/watch?v=SvvcQpp3SYE&feature=youtube_gdata_player [youtube.com]
486 code on a Pentium, not water cooled (Score:2)
Apple doesn't play too loose with marketing statistics? You're simply forgetting the late PowerPC times, when a water-cooled Apple system was slower than an air-cooled Intel PC.
That is a bogus point. Those water-cooled G5s were the standard shipping system. It's entirely fair to compare a stock Mac against a stock PC.
The real "engineering" of the PPC vs x86 comparison was in the benchmarking utility. IIRC, Apple used a very old version of ByteMark that was compiled/optimized for the 486 even though they were running on a Pentium at the time. When ByteMark was recompiled to optimize for the Pentium, the PPC advantage faded.
Re: (Score:3)
Apple? Aren't they the scumbags who had ads pulled by the Advertising Standards Authority for being liars?
No. Apple didn't lie about anything. That nonsense was the agency deciding that devices without Flash were somehow not capable of accessing the full internet. They might as well have been complaining about Android or Apple browsers not liking something written to work with some non-standard behavior in Internet Explorer 5. Flash may have been popular, but it certainly isn't a protocol that defines the net. Even people that don't use Apple products should be grateful for what Apple has done to move us all
Re:This is funny. (Score:5, Interesting)
Considering Apple typically doesn't play too loose with the marketing statistics
What planet are you from?
"After a legal complaint by 70-year-old William Gillis over the "twice as fast for half the price" statement found in iPhone 3G marketing, Apple responded with a 9-page, 32-point rebuttal—one paragraph of which included this overly harsh, but very telling, statement:
Plaintiff's claims, and those of the purported class, are barred by the fact that the alleged deceptive statements were such that no reasonable person in Plaintiff's position could have reasonably relied on or misunderstood Apple's statements as claims of fact.
In other words, if you believe what Apple says in an Apple ad, you are not a reasonable person.
http://gizmodo.com/5101110/apple-no-reasonable-person-should-trust-their-marketing
Re:This is funny. (Score:5, Funny)
The irony is that this is coming from a company that presented chunks of wood as their next-gen graphics cards.
Hey! Some of us care about 'Green Computing' here, you earth-raping performance whore!
Re: (Score:2)
Sure, wooden graphics cards don't use as much electricity, but they have to cut down trees to make them. All in all, the wooden graphics cards are actually worse for the environment.
Re: (Score:3)
wooden graphics cards
We call them "drawing boards" where I live.
Re: (Score:2)
That just releases the carbon faster!
Re: (Score:2)
Hey! Some of us care about 'Green Computing' here, you earth-raping performance whore!
Which is why I only use Radeon HD 5xxx [wikipedia.org] cards.
Re:This is funny. (Score:5, Informative)
The irony is that this is coming from a company that presented chunks of wood as their next-gen graphics cards.
I had no idea what you were talking about but a quick search showed this:
http://semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/ [semiaccurate.com]
LOL... Nvidia faked a graphics board with a piece of PCB-looking plastic/wood that was screwed to the side of a PC with common hardware-store-grade wood screws.
Re: (Score:2)
Actually I am looking at a Galaxy SII as my next phone.
Currently using an iPhone 3GS and the upgrade choice (I'll buy the phone outright either way) is between the iPhone 4S and the Samsung Galaxy SII. Both are about even in the running so far.
Re: (Score:2)
Not true
Steve shit into a box, put a dock connector on it and even hardcore apple zealots said fuck no. It was off the market in like six months.
Can we be done with this meme? Sheesh.
Numbers are meaningless (Score:5, Insightful)
Does using the tablet feel smooth and instantly responsive? At the end of the day, that's all that matters. Tegra 100 or iPad 100 won't matter if the OS that uses it isn't smooth and doesn't keep up with user interactions. Consumers just care about the experience; how they get there isn't of interest to anyone other than nerds.
Re: (Score:3)
What about iOS/Android gamers? Some of those games are pretty taxing and require pretty heavy-duty GPUs to run smoothly...
Re:Numbers are meaningless (Score:5, Informative)
From what we know, the A5X is pretty much the same as the A5 except it uses four PowerVR SGX543 cores instead of two. This four-core GPU configuration is the same as the PS Vita's, albeit the Vita uses a four-core ARM CPU and drives a smaller 960×544 qHD screen. Comparatively, the Vita should beat the iPad for graphically intense games, given the hardware. For Angry Birds, it may not make much of a difference. At present, we don't know if Apple has tweaked the A5X in other ways to boost game performance.
The "New iPad" also has twice as much RAM as a Vita (1GB vs 512MB), which could make a significant difference to practical gaming capability. As you note, we also have no idea what else Apple tweaked in the chip. Combined with the difficulty of an apples-to-apples comparison between two very different devices, it'll be hard to ever know how different the raw specs are. I think it's reasonable to say, though, that the "New iPad" will be excellent for gaming, as will the Vita.
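The raw pixel math behind that comparison, as a quick sanity check (the resolutions are the ones quoted above):

vita_pixels = 960 * 544      # PS Vita qHD panel
ipad_pixels = 2048 * 1536    # new iPad Retina panel
print(vita_pixels, ipad_pixels)             # 522240 vs 3145728
print(round(ipad_pixels / vita_pixels, 1))  # ~6.0x the pixels for the same 4-core SGX543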
Re:Numbers are meaningless (Score:5, Interesting)
The Vita also has 128MB of dedicated VRAM, which the iPad (or, for that matter, any other smartphone or tablet I'm aware of) doesn't, making things even more difficult to compare.
Factor in the display changes as well (Score:2)
The Vita has a far smaller screen with a fraction of the pixels, which skews it even further. Then again, the Vita has to process more inputs.
Re:Numbers are meaningless (Score:5, Interesting)
One thing that I don't think has come up yet, but would be interesting to see, is whether Nvidia tries to turn their disadvantage into a bonus by doing more aggressive power scaling...
If, as TFA suggests, Tegra parts are held back by memory bandwidth because faster buses are power-hungry, this suggests that they might be able to substantially speed-bump their parts when the device is on AC power or otherwise not power-constrained. So long as the switchover is handled reasonably elegantly, that could turn out to be an advantage in the various HDMI dock/computer-replacement/etc. scenarios...
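Conceptually that's no more exotic than the CPU frequency governors phones already ship with, just applied to the memory bus. A minimal sketch of such a policy in Python; the sysfs paths and clock values are hypothetical, invented purely for illustration - no shipping Tegra exposes exactly this interface:

AC_ONLINE = "/sys/class/power_supply/ac/online"      # hypothetical path
MEM_FREQ = "/sys/class/devfreq/memory/target_freq"   # hypothetical path

BATTERY_KHZ = 400000  # conservative memory clock on battery (made-up value)
DOCKED_KHZ = 667000   # higher memory clock on AC power (made-up value)

def on_power_event():
    # Raise the memory bus clock whenever external power is present,
    # and fall back to the battery-friendly clock otherwise.
    with open(AC_ONLINE) as f:
        on_ac = f.read().strip() == "1"
    target = DOCKED_KHZ if on_ac else BATTERY_KHZ
    with open(MEM_FREQ, "w") as f:
        f.write(str(target))

The "reasonably elegantly" part is the hard bit: the clock switch has to be glitch-free mid-frame, or users will notice the hiccup more than the extra speed.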
Re: (Score:2)
I'm still inclined to believe it'
Re: (Score:2)
Interesting; even the first-generation Snapdragon in my old Desire renders Flash video smoothly... Sounds like Tegra might have a few driver issues...?
Re: (Score:2)
So why did Apple show their benchmark?
Re: (Score:2)
I'm doubtful that the $100 is worthwhile for anyone who intends primarily to use the iPad as a reader. The screen on the iPad 2 is "good enough" for that.
Re: (Score:3)
If the new iPad's screen is the same delta over the iPad 2 as the 3GS-to-4 switch was for the iPhone, only at 9.7 inches, then it absolutely is worth the extra $100 if you intend to do a lot of reading on it.
The high dpi of the iPhone 4's screen is extremely good for reading text, more so than almost any other benefit (I assume that HD movies will also be a big thing on the iPad, unlike the iPhone).
Re: (Score:3)
You are looking for ways to make Apple's marketing look bad, but failing.
High dpi at a small physical size already means high resolution, but I didn't think I'd have to specify that we're not reading the text on one of those jumbo screens (where the same resolution as an iPad would result in a low dpi).
The high dpi of the iPhone 4 screen (compared to the 3GS) is what makes the text readable. Now, you achieve that on a screen of the same physical dimensions by increasing the resolution of the panel, but in t
Re: (Score:2)
Not bonch. Not employed to post. Not paid to post.
This is getting silly, kid.
Re: (Score:2)
The sun doesn't "help" plants grow. The sun provides energy that drives an electron cascade. The plant uses this energy to synthesise ATP. If you remove this source of energy, the plant cannot do this, so it's essential (in the absence of artificial light) not just "a help".
Y'know, if we're being pedantic and everything.
Your last sentence makes no sense (and conveniently doesn't specify a font size, or a size relative to the screen size, given that they are different resolutions to start with).
Put it this w
Re: (Score:2)
No, it isn't.
Re: (Score:2)
Seems good enough to me.
Re: (Score:2)
Try it.
Re: (Score:2)
I have ... I don't know what you think is wrong with it, so I'm not sure how to address it any better.
I've read a couple of books with it. It's fine.
Re: (Score:2)
When did you read it on a retina display?
Re: (Score:2)
Do I need to, to know that the lower-res display was good enough? I'm not claiming the retina display isn't better.
Re: (Score:2)
You mentioned whether or not the $100 was worth it; I said try it.
Re:Numbers are meaningless, unless you lie (Score:3)
I like my iPad 1, though it's sluggish. I am (too) anxiously awaiting two new iPads due this Friday. I even kept the running commentary on the announcement up in a browser window (yes, I felt a bit dirty afterward). When I heard the proclamation of the speed difference, that certainly seemed to imply a quad-core processor. At least, that was in the realm of possibility (4 CPU cores and 4 GPU cores vs the Tegra). I'm not convinced now that the claim is valid except for very special conditions with a hos
Re: (Score:2)
Consumers just care about the experience; how they get there isn't of interest to anyone other than nerds.
True, but then why is Apple boasting about 2x, 4x, whatever?
Re: (Score:2)
Consumers just care about the experience; how they get there isn't of interest to anyone other than nerds.
True, but then why is Apple boasting about 2x, 4x, whatever?
Ah, because performance is important, except when Apple doesn't have it?
Re: (Score:2)
Does using the tablet feel smooth and instantly responsive? At the end of the day, that's all that matters.
Well, my experience with the iPad 2 is that it has frequent stalls and responsiveness problems, so I guess it matters, and I guess Apple has not addressed that. Good.
Re: (Score:2)
You mean, numbers are meaningless because Apple customers can't count past 2? (Joke, guys, joke. Never mind that Apple marketing apparently can't count past two [cnn.com].)
My take on it: 1) Apple needs Qualcomm's LTE modem, leaving little room for alternatives for that tablet model. 2) Apple wants to keep its margins up, and having only one processor design saves some money there. 3) Reality distortion goes into overdrive to distract Apple fans from the reality that the iPad Wi-Fi model is actually slower than the new rou
Re: (Score:2)
Does using the tablet feel smooth and instantly responsive? At the end of the day, that's all that matters. Tegra 100 or iPad 100 won't matter if the OS that uses it isn't smooth and doesn't keep up with user interactions. Consumers just care about the experience; how they get there isn't of interest to anyone other than nerds.
I don't play games on my tablet, but the ICS update significantly improved this on my TF101 (with Tegra 2). I own an iPad 2 too, and right now I prefer using the TF101 as my main tablet.
PowerVR, eh? (Score:4, Interesting)
I didn't know the PowerVR chips were still around. I had one of the early video cards based on the technology for my PC years ago. It worked OK, but that was long before things like shaders were an issue.
Still, we are talking about a portable device, so I'd think battery life would be more important than having the latest whizz-bang shaders. And just look at all the grief people are having with the Android lineup due to shader differences between vendors.
Thank God I focus on business programming, not video games. I've yet to hear of ANY tablet or smartphone having problems displaying graphs and charts.
Re:PowerVR, eh? (Score:4, Informative)
PowerVR GPUs are integrated into a lot of the ARM processors used by many mobile companies. It's not a secret, but only Apple-related articles like to poke fun at it. PowerVR went from being a "brand name" to being the developer behind a lot of the graphics on everything from PCs to game consoles to HDTVs to cell phones, etc.
For that matter, Samsung had been integrating both of them since before the iPhone in any flavor came out. And it continues to do so.
Re: (Score:3)
I didn't know the PowerVR chips were still around.
PowerVR has been doing very well in mobile devices for years. ARM's Mali is just starting to take away some market share from them, but before that most ARM SoCs came with a PowerVR GPU. Like ARM, they license the designs to anyone willing to pay, so if you wanted to make an SoC, a PowerVR GPU was an obvious choice - nVidia only uses the Tegra GPUs in their own chips; they don't license them. Now it's a less obvious choice, because you can license both CPU and GPU designs from ARM and the latest Mali st
Re: (Score:3)
3D data visualization does not require anything more complex than Gouraud shading, which does not rely on programmable shaders. You don't need ultra-realistic visuals to grasp the gist of a 3D visualization. In fact, I'd argue that using fancy shaders on such data would skew the correlation between shading and depth/Z-coordinate information.
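For reference, Gouraud shading is just per-vertex lighting plus linear interpolation across the triangle, which is why fixed-function hardware handles it. A minimal sketch (the vectors and weights are made-up sample values):

def lambert(normal, light):
    # Per-vertex diffuse term; both vectors assumed normalized.
    return max(0.0, sum(n * l for n, l in zip(normal, light)))

def gouraud(vertex_shades, barycentric):
    # The shade at an interior point is the barycentric blend of the
    # three vertex shades - plain linear interpolation, no shaders needed.
    return sum(s * w for s, w in zip(vertex_shades, barycentric))

light = (0.0, 0.0, 1.0)
normals = [(0.0, 0.0, 1.0), (0.6, 0.0, 0.8), (0.0, 0.6, 0.8)]
shades = [lambert(n, light) for n in normals]
print(gouraud(shades, (0.2, 0.3, 0.5)))  # shade at one interior sample point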
Note that the problem being encountered by game developers is not an inconsistent or unreliable implementation of the built-in OpenGL shaders, but the p
Last Tegra device I'll ever buy (Score:2, Informative)
Bought a Galaxy Tab for the Tegra 2; was so utterly disappointed. The real-world performance was atrocious, even compared to devices it officially benchmarked better than. Sold it within 3 months. Still waiting on a great Android tablet....
Re:Last Tegra device I'll ever buy (Score:5, Funny)
Well don't you worry, last week Apple announced Samsung's next tablet!
Don't worry, Nvidia! (Score:5, Insightful)
Or ask Amazon: Amazon releases 'Kindle' e-reader device. His Steveness declares "Nobody reads". And now Apple is pushing books, newspapers, and their own pet proprietary publishing platform...
Cheer up, emo Nvidia, all you have to do is sell Apple a Tegra N SoC, or even just the rights to include your GPU in their AN SoC, and Tim Cook will personally explain to the world that PowerVR GPUs are slow, weak, make you 30% less creative and are produced entirely from conflict minerals.
Re:Don't worry, Nvidia! (Score:5, Informative)
Just ask Intel about Apple's benchmarking strategy: For years, the finest in graphic design publicly asserted that PPC was so bitchin' that it was pretty much just letting Intel and x86 live because killing your inferiors is bad taste. Then, one design win, and x86 is suddenly eleventy-billion percent faster than that old-and-busted PPC legacy crap.
This wasn't totally misleading. The G4 was slightly faster than equivalent Intel chips when it was launched, and AltiVec was a lot better than SSE for a lot of things. More importantly, AltiVec was actually used, while a lot of x86 code was still compiled using scalar x87 floating-point stuff. Things like video editing - which Apple benchmarked - were a lot faster on PowerPC because of this. It didn't matter that hand-optimised code for x86 could often beat hand-optimised code for PowerPC; it mattered that the code people were actually running was faster on PowerPC. After about 800MHz, the G4 didn't get much by way of improvements, and the G5, while a nice chip, was expensive and used too much power for portables. The Pentium M was starting to push ahead of the PowerPC chips Apple was using in portables (which got a tiny speed bump but nothing amazing), and the Core widened the gap. By the Core 2, the gap was huge.
It wasn't just one design win; it was that the PowerPC chips for mobile were designs that competed well with the P2 and P3, but were never improved beyond that. The last few speed bumps were so starved for memory bandwidth that they came with almost no performance increase. Between the P3 and the Core 2, Intel had two complete microarchitecture designs and one partial redesign. Freescale had none, and IBM wasn't interested in chips for laptops.
cure for the blue face (Score:5, Informative)
This was totally misleading, for any informed definition of misleading.
Just as there are embarrassingly parallel algorithms, there are embarrassingly wide instruction mixes. The P6 architecture had a three-uop-per-cycle retirement gate, with a fat queue in front. If your instruction mix had any kind of stall (dependency chain, memory access, branch mispredict), retirement usually caught up before the queue filled. In the rare case (Steve Jobs' favorite Photoshop filter) where the instruction mix could sustain a retirement rate of four instructions per cycle, x86 showed badly against PPC. Conversely, on bumpy instruction streams full of execution hazards, x86 compared favourably, since it had superior OOO headroom.
Core Duo rebalanced the architecture primarily by adding a fair amount of micro-op fusion, so that one retirement slot effectively retired two instructions (without increasing the amount of retirement dependency checking in that pipeline stage). In some ways, the maligned x86 architecture starts to shine once your implementation adds the fancy trick of micro-op fusion, since the RMW addressing mode is fused at the instruction level. In RISC, these instructions are split into separate read and write portions. That was an asset at many lithographic nodes. But not at the Core Duo node, as history recounts. Now x86 has caught up on the retirement side, and PPC is panting for breath on the fetch stream (juggling two instructions where x86 encodes only one).
The multitasking agility of x86 was also heavily and happily used. It happens not to show up in pure Photoshop kernels. Admittedly, SSE was pretty pathetic in its early incarnations. Intel decided to add it to the instruction set, but implemented it double-pumped (two dispatch cycles per SSE operation). Of course they knew that future devices would double the dispatch width, so this was a way to crack the chicken-and-egg problem. Yeah, it was an ugly, slow, iterative process.
The advantage of PPC was never better than horses for courses, and PPC was picky about the courses. It really liked a groomed track.
x86 hardly gave a damn about a groomed track. It had deep OOO resources all the way through the cache hierarchy to main memory and back. The P6 was the generation where how you handled erratic memory latency mattered more for important workloads (ever heard of a server?) than the political correctness of your instruction encoding.
Apple never faltered in waving around groomed-track benchmark numbers, as if the average Mac user sat around running Photoshop blur filters 24 by 7. That was Apple's idea of a server workload.
mov eax, [esi]
inc eax
mov [esi], eax
That's a RISC program in x86 notation. Whether the first and second uses of [esi] refer to the same memory location as any other memory access that OOO might interleave is a big problem. That's a lot of hazard detection to do to maintain four-wide retirement.
Here is a CISC program in x86 notation. I can't show it to you in PPC notation, since PPC is a proper subset minus this feature.
inc [esi]
Clearly, with a clever implementation, you can arrange for the hazard check against potentially interleaved memory accesses to be performed once, not twice. It takes a lot of transistors to reach the blissful state of clever implementation. That's precisely the story of Core Duo. It finally hit the bliss threshold (helped greatly by the fact that the Prescott people and their marketing overlords were busy walking the green plank).
Did Apple tell any of this story in vaguely the same way? Nooooo. It waved around one embarrassingly wide instruction stream that appealed to cool people until it turned blue in the face.
Cure for the blue face: make an about face.
Do I trust this new iPad 3 benchmark? Hahahahahaha. You know, I've never let out my inner six-year-old in 5000 posts, but it feels good.
Re: (Score:2)
SPEC benchmarks are irrelevant to most users (they're actually irrelevant to most HPC users too - they're for dick waving, not for anything else). Important benchmarks are things like comparing complex Adobe Photoshop filter application times, because that's what translates to real money for users. Time spent waiting for the computer to do things is time spent not getting anything done that you get paid for. And these benchmarks were verified by other people.
As I said in my post, a lot of the differe
Re: (Score:3)
If you knew anything about specbenchmarks
I work on a compiler used in HPC. I know a fair amount about SPEC benchmarks and how little they're trusted outside of dick waving lists.
you wouldn't be making such a silly claim of a single Adobe filter being faster on one platform and trying to make the claim of winning the speed contest.
In the compiler world, we say that there is only one benchmark that really matters: your code. Apple's core market at that time was people running the Adobe creative suite. This suite ran faster on Macs than on Windows. Whether that was due to the processor, the compiler, or better code, is irrelevant to the user. The user cares about how much this expensive purchase
PPC v Intel x86 - A Mac game dev's perspective (Score:5, Interesting)
Overall, as a Mac game developer, I found it took a bit of effort to get Mac ports on a par with their PC counterparts. One caveat here, emphasis on "port": the games tended to have been written with only x86 in mind. Contrary to popular belief, it is entirely possible to write code in high-level languages that favors one architecture over the other, CISC or RISC, etc. So the x86 side may have had an advantage in that the code was naturally written to favor that architecture. However, a counterpoint would be that we did profile extensively and rewrite perfectly working original code where we thought we could leverage the PPC architecture. This included dropping down to assembly when compilers could not leverage the architecture properly. Still, this only achieved parity.
Again, note this was back in the day, with games that were not using a GPU, so it was more of a CPU-vs-CPU comparison.
Re: (Score:3)
Out of curiosity, did the Mac sales bring in enough revenue to be worth all the costs of doing the porting?
Re: (Score:2)
Out of curiosity, did the Mac sales bring in enough revenue to be worth all the costs of doing the porting?
I never saw financials, but the publisher continued to support the Mac throughout the PPC era. Now, in the x86 era, it's a little bit easier to do the port and the Mac market is many times larger. I think doing a Mac version of a game today is much less risky.
Re: (Score:3)
Here [apple.com]
Tablets not GPU-limited, they're money-limited (Score:4, Interesting)
Don't get me wrong, I love gaming on my iPad (or at least I like it enough to have no desire to get a PS Vita), but there are few games that truly push the GPU because there is just no money in doing so. Until people are willing to pay $30-40 for a top-notch game on their mobile device, we won't see them.
And before someone says that touchscreens are another factor: please, that's only a problem with ports, or with developers who think touchscreen games are just like console or handheld games (*cough* EA Sports *cough*). Fighting games that require you to hit a bunch of virtual buttons are wretched on a touchscreen device. Fighting games like Infinity Blade, on the other hand, are pretty fun because they take advantage of the touch screen rather than treating it like a virtual controller. I actually did like GTA III, but I often had to find alternative ways to complete missions because running and gunning was more difficult than using the sniper rifle.
Amazing (Score:2)
Nvidia is stupid enough to take the bait. Good job.
it should be true (Score:2)
The old iPad 2 is faster than the Tegra 3, according to Ars Technica, so it should make sense that the new iPad is even faster. I can't find the link but I saw it a few days ago, maybe here in a comment.
Re: (Score:2)
The old iPad 2 is faster than the Tegra 3, according to Ars Technica, so it should make sense that the new iPad is even faster. I can't find the link but I saw it a few days ago, maybe here in a comment.
Why is everyone focused on benchmarks when they have been notoriously unreliable? There are games available for both platforms, e.g. Riptide GP, and there are YouTube videos of them being played side by side. The Tegra 3 version had more detail and ran at a higher resolution. Seems to me that Tegra 3 won that bout. There are other examples of applications on both platforms. Unless your favorite application is a benchmark, let's hear about real applications that exist and are being used by actual pe
Apple's numbers make sense (Score:5, Insightful)
We recently saw a graphics benchmark of the A5 vs the Tegra 3 posted to /., and the A5 beat the Tegra in real-world-ish benchmarks, and more than doubled its score in fill rate. [hothardware.com]
The A5X is basically just the A5 with twice as many GPU cores, and graphics problems tend to be embarrassingly parallel, so unless it scales up really poorly with those extra cores (due to shared bandwidth limitations, or poor geometry scaling) it should have no problem beating the Tegra 3 by 2x, especially in terms of fill rate.
And when you quadruple the number of pixels on your screen, as Apple just did, which measurement matters? Fill rate.
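The back-of-the-envelope version of that argument, assuming a 60 fps target and one shaded sample per pixel (real workloads add overdraw and blending on top):

FPS = 60  # assumed target frame rate
for name, w, h in [("iPad 2", 1024, 768), ("new iPad", 2048, 1536)]:
    pixels = w * h
    print("%s: %.1f Mpix/frame, %.0f Mpix/s at %d fps"
          % (name, pixels / 1e6, pixels * FPS / 1e6, FPS))

The Retina panel has exactly 4x the pixels (about 3.1M vs 0.8M), so holding the same frame rate takes roughly 4x the fill rate, which is presumably why Apple doubled the GPU cores rather than just clocking the A5 higher.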
iPad 2 Already Beat Tegra 3 (Score:5, Interesting)
Considering that these graphics benchmarks from Anandtech [anandtech.com] show the iPad 2 GPU handily beating a Tegra 3, it doesn't seem like much of a stretch that the iPad 3 GPU should beat it further.
who the hell cares? (Score:5, Insightful)
tegra smegma a5x tri-dual-octo-quad core ACME RX3200 Rocket Skates GigaHertzMegaPixelPerSecond my asshole graphics is irrelevant.
the ONLY thing that matters is how it works when its in your hands.
does it drive 2048x1536 at least as well as the iPad 2 drives its own screen? yes or no.
the way i see it, neither NVIDIA nor Apple can say anything about relative performance, because there is nothing using Tegra at that resolution... you can benchmark/extrapolate all you want, but all that matters is the real world.
the "quad-core A5X GPU" damn well better be faster, because it's driving 4x as many pixels.
PowerVR has its drawbacks (Score:2)
Re: (Score:2)
It comes down to the laws of advertising, in which 'faster' is legally constrained to 'as fast as'. So both X can be faster than Y, and Y can be faster than X, if they are the same speed.
Re: (Score:2)
That's also what they said in the late '90s, when PowerVR was competing with the 3Dfx Voodoo add-in cards. Given that there have been at least 50 million PowerVR-based GPUs shipped so far, that's a heck of a footnote.