Spore, Call of Duty 4 Confirmed for OSX

1up is reporting that, along with the big announcements from yesterday's Macworld event, the welcome news trickles down that OS X will be getting some more games. The much-delayed Spore has been confirmed for the platform, as has the hit FPS title Call of Duty 4. "In Spore's case, the magic of cross-platform portability is achieved through the use of a special software layer supplied by Toronto-based TransGaming Technologies. This software is capable of interpreting hardware calls to Windows DirectX into Mac-capable instructions. Through use of this technology, Electronic Arts (and others) seem hopeful about bringing even more games to the Mac in the coming months."
  • by slcdb ( 317433 ) on Wednesday January 16, 2008 @03:48PM (#22070538) Homepage
    This "technology" provided by TransGaming is called "Cider". It has already been used to "port" some games to OS X. One such EA game that I've already purchased is Battlefield 2142. And let me tell you, Cider leaves much to be desired. The poor performance imposed by this emulation layer makes it play like it's on an old Pentium III machine, despite the fact that it's running on a quad-core Mac Pro. To top it off, the graphics quality, even when turned up all the way, is far lower than it should be. It seems as if the Cider emulation layer can't translate all of the DirectX APIs, so it only handles some of the more basic ones, leaving advanced graphics effects out.

    This is not what I would like to see as the future of gaming on OS X. I want to see *real* ports of games, not some bullshit emulation layer that makes the game think it is running on Windblows.
  • Re:Yawn (Score:3, Interesting)

    by slcdb ( 317433 ) on Wednesday January 16, 2008 @03:52PM (#22070594) Homepage

    So is this like using Wine to run Windows Games on Linux?
    Yes, that's exactly what it is. The emulation layer is called Cider, which is literally a fork of Wine (apparently from before Wine switched to the LGPL).

    TransGaming has another emulation layer called Cedega, which is for running Windows games on Linux.
  • by Kimos ( 859729 ) <kimos.slashdotNO@SPAMgmail.com> on Wednesday January 16, 2008 @04:01PM (#22070688) Homepage

    Cider leaves much to be desired. The poor performance imparted by this emulation layer causes it to play like it's on an old Pentium III machine, despite the fact that it's running on a quad-core Mac Pro. To top it off, the graphics quality, even when turned up all the way, is far lower than it should be. It seems as if the Cider emulation layer can't translate all of the DirectX APIs, so it only does some of the more basic ones, leaving advanced graphics effects out.
    Does anyone have some links/literature to substantiate this? I was scared this would be the case. I know that Wine Is Not supposed to be a Windows Emulator, but in my experience the performance is still awful. Even something like Picasa [google.com] running under Wine on Linux brings my system crawling to a halt.

    All these OS X "ports" really just bundle the cost of a streamlined Windows emulation layer into your Windows version of the game. In fact, it discourages developers from learning the OS X toolkits for games, because they can just write one Windows version, slap Cider on it, and sell it for OS X too.
  • by TheRaven64 ( 641858 ) on Wednesday January 16, 2008 @04:05PM (#22070744) Journal
    Uh, what? Why does Apple have to allow this? TransGaming markets Cedega, formerly WineX, a fork of WINE. Apple don't 'allow' them to do anything; they ported their codebase from X11+OpenGL to Quartz+OpenGL.
  • Re:Yawn (Score:4, Interesting)

    by p0tat03 ( 985078 ) on Wednesday January 16, 2008 @05:10PM (#22071464)

    The difference is that in Duke and Doom's day, the Mac platform was rapidly losing market share to Windows - so while porting was profitable for a short while, it eventually became uneconomical. Compare that with today, when OS X's market share is exploding - Macs are already a significant minority in the market, particularly in laptops. I do think the tide is turning, but it will be a slow process, and "light" games like The Sims will get ported long before "hardcore" titles like Crysis.

    The only doubt in my mind is what this means for DirectX. As an indie game dev I can say without a doubt that the DirectX API is simple and easy to work with, and the level of tool support for HLSL is far better than what we have for GLSL. OpenGL is lagging behind DX, but in this new market where porting is of increasing importance, will we see developers abandoning DirectX in favor of OpenGL?

  • by jdgreen7 ( 524066 ) on Wednesday January 16, 2008 @06:38PM (#22072622) Homepage

    As an employee of TransGaming, I take offense to that generalization. I've spent the better part of a year reporting and working around platform limitations for the various drivers that we have to work with. Many of the stability issues that we've had reported to us are present on the PC version of the games as well, and others are due to crashes inside the drivers over which we have no control. The biggest issue, however, is the speed at which OpenGL evolves as compared to DirectX.

    With DirectX, Microsoft can go to ATI and NVIDIA and say, "Hey, what do you guys want to do? We'll make a spec for it." With OpenGL, the API is designed by committee, which usually leads to a much more well-thought-out specification, but it takes quite a bit longer to get equivalent hardware features exposed to developers. Plus, individual vendors can pick and choose which extensions they want to support. And since OpenGL is less used by developers, its driver teams are smaller, and there are typically more driver bugs to work out than on Windows.

    So, game X comes along and decides it's going to use a newish method to render shadows. It picks a texture format that is well supported by most hardware on DirectX, then bases much of the engine on that assumption. As an example, many games use 32-bit floating-point single-channel render targets (D3DFMT_R32F). This is not new anymore for DirectX, and most hardware can handle it just fine. However, that same hardware under OpenGL cannot (with the exception of drivers that support the GL_FLOAT_R32_NV format, which means only certain NVIDIA cards, and not at all on Mac OS). So, in order to port the game, if we want to use the same concept of rendering to 32-bit float buffers, we end up having to use GL_RGB32F_ARB and ignore the two extra channels. That triples the amount of video card RAM needed to pull off the exact same technique. If OpenGL simply exposed this functionality from the get-go, we wouldn't be forced to take up so much more RAM. The extra VRAM usage starts adding up, and eventually we've blown past what the card can handle and have to start trimming graphics features from the game just to get it to run.

    That was just a single example, but there are many cases like this in the world of OpenGL. Things are starting to converge, but until it becomes the leading graphics interface, there will always be discrepancies like this. Game developers want to use the latest and greatest technologies to write new and pretty games. In order to do this currently, they are forced to use DirectX to get the most benefits from the hardware they want to target.

    So, the alternative, as you mentioned, is for game developers to write their own rendering engine based on OpenGL. This is all fine and dandy, but you are quite often left writing multiple different paths for accomplishing the exact same thing. While this is true of DirectX to some extent, the disparity is much greater on OpenGL. One vendor will implement support for a whole range of features, while another will only implement the basics - but that same vendor will have the whole feature set working just fine in their DirectX drivers. Not to mention the great libraries that Microsoft throws in with DirectX to handle everything a game might want (think texture loading from just about any format, all the math functions you could ever think of, a scriptable Effects architecture, mesh routines, audio, video playback, input, etc.). DirectX (and XNA by extension) has a very large array of features that game developers make wide use of.

    So, while in a perfect world all games could be written using a standard library of features that is cross-platform from the beginning, we are still pretty far from that dream. SDL, ClanLib, and other libraries have all tried, and succeeded to some degree, but none of them have the breadth of documentation, sample code, and support that DirectX has. Until that day comes, Cider and Cedega are a pretty good fit for filling the void in Mac and Linux gaming. Each game released provides a better engine than the one before it, so as a technology it will only get better with time. Is it perfect? Absolutely not, but then again, what is?
