X On OSX Now Free

ffejbean writes that OSXTalk (hey, they run Slashcode!) has an article up noting that XFree86 and MacOS X are getting friendlier every day. Now you don't have to purchase a lame commercial binary; you can just install it yourself. If only those iCubes didn't cost twice what they should, this might yet be a great platform. (BTW, I'm getting confused here, should I post this as Apple, X, or BSD? Ah well, close enough :)
  • It'll be nice to have X apps running on MacOS. Still, I think the ultimate thing to have would be a version of Xlib for MacOS that simply translates X11 calls to their Mac native equivalents. That way you don't have to have an X server running on the Mac to display X apps.

    It'd be nice to have that on Windows too, actually.
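    To make the idea concrete, here's a minimal, bog-standard Xlib client (nothing Mac-specific about it); every call below is the sort of thing such a translation layer would have to re-implement in terms of the native window system instead of the X wire protocol:

        /* minimal Xlib client -- each call here normally becomes
         * X protocol traffic to an X server; a native Xlib would
         * map these onto host window-system calls instead */
        #include <X11/Xlib.h>
        #include <stdio.h>

        int main(void)
        {
            Display *dpy = XOpenDisplay(NULL);  /* connect to the server */
            if (!dpy) {
                fprintf(stderr, "cannot open display\n");
                return 1;
            }
            int scr = DefaultScreen(dpy);
            Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                             10, 10, 200, 100, 1,
                                             BlackPixel(dpy, scr),
                                             WhitePixel(dpy, scr));
            XSelectInput(dpy, win, ExposureMask | KeyPressMask);
            XMapWindow(dpy, win);               /* make it visible */

            XEvent ev;
            for (;;) {
                XNextEvent(dpy, &ev);           /* events come back from the server */
                if (ev.type == KeyPress)
                    break;
            }
            XCloseDisplay(dpy);
            return 0;
        }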
    --
  • With some work, we will see the number of Linux applications increase. If this option becomes popular, many Mac users should become interested in Linux apps.
  • (As in bsD/Apple/X)

    I suggest you stick to a photo from the first 6 series of DS9.

  • by TWX_the_Linux_Zealot ( 227666 ) on Tuesday October 24, 2000 @06:50AM (#679869) Journal
    Well, since the product started with the very Apple GUI (blecch!) and one mouse button on the hardware, the Apple icon looked perfectly appropriate... Now that some actually useful stuff is there, like X support, and since an x86 edition is available and does something, it might almost be ready to graduate into being classified in the BSD arena. Unfortunately I haven't enough experience with it (the last time I played with it was a VERY early test on a school district computer, where no compilation tools were installed and what looked like Apple's Finder was the shell) to be more decisive. Besides, I thought all diehard Apple/UNIX fans ran A/UX anyway...
  • by Spazntwich ( 208070 ) on Tuesday October 24, 2000 @06:50AM (#679870)
    I think Jobs has the right idea here. Nobody cares much about Apple hardware if it's not even slightly compatible with any widely accepted software, besides of course what Microsoft decides to throw at it. Opening up the world of Linux to Apple hardware means it can now be used by the power users/graphic artists AND the geeks. If they drop the price on their hardware, I'd drop my Intel box for an Apple running X any day.
    ---
  • I know that Macs (like me) are chick magnets, and if a chick says she's into computers and is a computer geek, the odds are she's really into a Mac (right, Rob?), and wouldn't know a command line from a hole in the ground, so I keep a Mac around for the chicks who drop by.

    Anyhow, once these Macs are running OSX, are the chicks still gonna be interested in them? I'm trying to keep my bachelor pad up to date and keep the chick tractor beam still functioning.

    Thanks,
  • by eln ( 21727 ) on Tuesday October 24, 2000 @06:52AM (#679872)
    So now that the X part of OSX is free, when are they going to free the O and S? Quit discriminating against the other letters of the alphabet!
  • (I'm not a coder) But... why can't someone port Xlib over to MacOS X? It's got all of the Unix goodies, like gcc and emacs, built in. How hard would it be?

    Eric
  • Nobody cares much about Apple hardware if it's not even slightly compatible with any widely accepted software, besides of course what Microsoft decides to throw at it.

    Isn't this true of almost all non-x86 platforms? I can't name three platforms that the same piece of "widely accepted software" runs on, simply because most "widely accepted software" runs on x86 Windows. You can see this even more clearly when you start to hear about all of the applications that Microsoft didn't port to Windows NT for the Alpha processor.
  • ... when you do find that gal who is a command line junkie who uses Debian Linux or BSD, and simply blows you out of the water with her skills, and she's been BBSing since, like the 300 baud days, you're hers... <grin>
  • by otis wildflower ( 4889 ) on Tuesday October 24, 2000 @07:00AM (#679876) Homepage
    How about, instead of building a fully-fledged monolithic X system, extending the Mac interface to wrap X functionality? What I'd really rather have is to run my Mac interface and pop open X apps (using OSX widgets) as appropriate, instead of having to throw out the OSX interface (which, let's be honest, is why you buy the damn thing in the first place).

    I wonder how well the basic widgets map..

    Your Working Boy,
  • More Unix-like behavior in OSX is always a good thing, for the same reason many Linux aficionados still run Solaris on their Sun boxen:

    Support.

    When something goes down, the last thing you want to do while getting it fixed is waste time proving to the vendor that it's not the unsupported OS you added that's causing the problem.

    -
  • Cire wrote: (I'm not a coder) But... why can't someone port Xlib over to MacOS X? It's got all of the Unix goodies, like gcc and emacs, built in. How hard would it be?

    I think that is essentially what has already been done. Xlib requires an X server, though. That was the original poster's point: X would run much faster on MacOS X if, rather than working the way X normally does and going through a server, it just called MacOS APIs directly. This would lose the advantage of X's network features, but a lot of people don't use those anyway.
  • Both Linux and the Microsoft OSes have long been considered PC-based, which is part of the reason that Macs have had a hard time taking off. OS X has the potential to bring back Macintosh in a big way because suddenly both Mac and PC can work together with a common operating system. (LinuxPPC and other flavors for Mac are great hacks, but I think that they failed to take advantage of the architecture as OS X does). Once again, Linux has brought together mortal enemies, and has promoted a spirit of peace, love, and understanding. (Or something like that).

  • There's an application suite called "Exceed" that is an X server for Windows, and it can serve apps so that Windows is the native window manager, putting an entry on the taskbar and everything. It requires Exceed to be running as an app, but I'm sure that if enough people wanted it they might be willing to write another version that runs as a service...

    It's made by Hummingbird Software [hummingbird.com] and is expensive, but my school (Arizona State University) provides a license for student use.
  • Whether or not you think that it's a last ditch strategy or a fancy plot, it's a good one. MacOS X promises to be a stable UNIX OS combined with an integrated (and if Apple pulls through like they have in the past, usable and functional) GUI. Jobs has a better product to sell, and one that will benefit the community.

    Jobs also opened up the source (after a fashion) to the OS, allowing developers to port it to the x86 architecture (and perhaps others in the future). That isn't trying "to get macs more accepted". That's a sound strategy for deploying a new OS.
  • zpengo wrote: Linux has brought together mortal enemies, and has promoted a spirit of peace, love, and understanding. (Or something like that).

    Not quite. MacOS X isn't Linux; it's a BSD system. Still, I think you're right, the increased portability of software to the Mac hardware will be nice.
  • If only those iCubes didn't cost twice what they should, this may just be a great platform yet.

    What should they cost?

    I hear this complaint quite a bit. It seems that one of the enshrined bits of common wisdom (or myth?) when it comes to PC buying is that Mac HW costs more for the performance you get.

    Mac fans counter that it's the same or better, and give the following reasons:

    1) even though PPC clock speeds are slower, programs run faster because the processor can do more per clock cycle. I've been told to expect twice the performance from a G3 as from a similarly clocked PIII.

    2) productivity gain by less futzing about with hardware, due to standardization...

    Comments? Maybe even hard numbers? Balanced reasoning (ha!)?
  • Let's see here..... hmmmmm... the CEO of Apple making strategic business maneuvers to increase the acceptance, marketshare and sales of one or more of his company's products.

    Yup. Sounds like a conspiracy to me. Yup. Yup.
  • Or why not just write a server that translates the commands it receives from its X clients into native calls? That way, you wouldn't miss any of the networking features, but you'd still have real Mac windows and so on.

    And probably someone else has already written this somewhere below 0...
  • by FigWig ( 10981 ) on Tuesday October 24, 2000 @07:14AM (#679886) Homepage
    X doesn't natively have widgets -- hence Qt and GTK. There could be a wrapper for those libraries, but X itself just provides basic graphics services. The other stuff you see is dependent on the window manager, which again is built on top of X.
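    To illustrate where the widgets actually live, here's a toy GTK program (GTK 1.x-era C API; treat the details as a sketch): the button below is entirely the toolkit's creation, while X itself only ever sees windows, pixels, and events:

        /* the "button" is pure GTK -- X has no idea what a button
         * is; it just gets drawing requests and input events */
        #include <gtk/gtk.h>

        int main(int argc, char *argv[])
        {
            gtk_init(&argc, &argv);

            GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
            GtkWidget *button = gtk_button_new_with_label("A toolkit widget");

            gtk_container_add(GTK_CONTAINER(window), button);
            gtk_widget_show_all(window);    /* map window and child */

            gtk_main();                     /* event loop */
            return 0;
        }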

  • The ones spouting that myth were Apple themselves.
    arstechnica.com explains it best:

    "Apple also made it a point to reiterate their old bit about how, under certain circumstances with certain applications that certain Apple users often use, the G4 can sometimes be twice as fast as a PIII at the same MHz rating.

    Give it up, Apple, because not only are such statements both a) purposefully misleading and b) so vague as to not really mean much of anything, but not even the die-hards are buying it anymore.

    Yeah, the MHz rating isn't all that matters, but it does really matter. Anyone who tells you otherwise is probably trying to sell you a Mac."
  • Sorry -- nobody had made this point before I started writing. And I pressed submit by accident while switching windows, before I was able to develop my point. Then I read the new posts and saw somebody else saying what I was about to say, and I don't want to make a redundant post. Sorry, I'm a newbie. Yes, I know I should post this as AC.
  • by KFury ( 19522 ) on Tuesday October 24, 2000 @07:19AM (#679889) Homepage
    Most people's complaint about the Cube isn't its price/performance ratio compared to Intel boxen, but compared to other Macs. You're paying a premium for a Cube, and what you get is a machine that's slower than a Mac G4 dual-processor tower, with no expandability, for more money.

    This is why the Cube is the first CPU in Apple's history that hasn't met initial sales forecasts (this gleaned from their recent earnings conference call).

    I love the Cube, but I'm waiting until they come out with the low-cost version in Jan or Feb. They can't cut anything but the price, and I'm pretty sure the "low price" Cube will be the current Cube, and they'll introduce something faster at the premium price. This is what they've been doing with PowerBooks and iMacs for the last three years.

    Kevin Fox
  • G4s only really get fantastic performance when they can parallelize with their matrix operations. Coincidentally, Photoshop spends most of its time applying filters, which are... matrix operations! Now you know who bankrolls Apple - print media companies.
  • by Skyshadow ( 508 ) on Tuesday October 24, 2000 @07:21AM (#679891) Homepage
    I think Jobs' real strategy is to turn the Mac into the equivalent of a designer label in clothing. The iMacs were a start, but the Cubes are the *perfect* example of this. They look funky-cool. They have a well-recognized label. They cost a lot. Sound familiar?

    Jobs has had the insight to see the potential of the computer as a sort of renewed status symbol, and the new Mac Cubes are chasing that with a vengeance. Everyone has a computer, but not everyone has a really swanky one. Ask yourself: does the average mid-20's to mid-30's hipster who just surfs and emails and buys designer clothes and furniture want a beige lump or a sleek, cool-looking Mac? I'd be willing to bet that you'll start seeing cK and Ralph Lauren computers with sleek looks within the next couple of years -- Apple is just at the start of this trend.

    In this age where more and more of what you use a computer for is on the net anyhow, lack of software apps matters less and less. Style, on the other hand (and forgive me), never goes out of style.

    ----

  • There's no such thing as an iCube.

    -The Reverend (I am not a Nazi nor a Troll)
  • Conspiracy? You mean like using a cute, cuddly, fuzzy mascot to market evil opensource to children?

    Ha ha ha.

    It's not a conspiracy, it's good business.
  • Surprised there haven't been any comments pointing out that X is woefully out of date. Nostalgia aside, it's really fairly embarrassing that we'll still all be using X Window in 2001 - I would have thought a tech-savvy audience like Slashdot would have been the first to point this out.

    So, is it really so exciting that Apple now support X? I suppose in one sense it's great to have all those legacy applications, but it would be nice to see the state of the art pushed forward somewhat - I would certainly have expected this of Apple, one of the more forward-thinking old-school computer companies.

    Then again, I must admit there are no serious contenders to X currently visible on the radar. I've looked at WHY [apsoft.com] (fairly promising, but early days) and Berlin [berlin-consortium.org] (extremely interesting, but a little too bogged down in providing support for glitzy rotations and the like too early on in the development), but I don't see X being replaced in the foreseeable future, sadly.

    Perhaps this is because X Window was developed by academic experts who were basically employed to do this, whereas its putative replacements are being developed by enthusiastic amateurs (and this isn't intended as a knock to those developers, but merely a reflection of the truth - I am an enthusiastic amateur myself!).

    Specifically, one thing X certainly needs is FAST and CONSISTENT (across the whole desktop) sub-pixel anti-aliasing. Acorn users have had this since 1990, so why has it taken so long for the rest of the world to catch up?
  • I remember some in the late 1980s, but I don't know if they are still around. The X server is pretty portable. You have to supply about 30-50 kernel graphics routines in the driver.

  • I could say the same thing about cigarette companies using the cute joe camel cartoon character to sell cigarettes... and no one would argue with me on that point.
  • even though PPC clock speeds are slower, programs run faster because the processor can do more per clock cycle.

    That is true. (It's also true of sparc, alpha, MIPS and pretty much any non-x86 architecture.)

    I've been told to expect twice the performance from a G3 as from a similarly clocked PIII.

    That is unadorned horseshit. But don't take my word for it: go to www.spec.org [spec.org] and check out the numbers yourself. 20-30% is more the average gain, and that's cold comfort when you can buy 1.2GHz Athlon chips for less than $500 a pop.

    The "twice as fast as Wintel" claim is based on a small number of Adobe Photoshop operation benchmarks; usually filters that have been painstakingly optimized for the G4's "Altivec" vector processing unit. This isn't necessarily "cheating", since Photoshop is still one of the primary reasons to buy a mac, but if you are not a graphics professional, you are simply never ever going to see that kind of speed benefit using a Mac.

    In "regular use" applications, the scenario at the moment is even worse than you might guess based on the SPEC numbers: MacOS 9 is such a turgid, inefficient piece of crap, and the device drivers for 3rd-party Mac hardware so shoddily implemented, that MacOS applications will often run significantly slower than their Windows counterparts on similar hardware: just ask anybody [insidemacgames.com] if they're getting the same kind of Quake III framerates out of a G4/500 with a Radeon card as they would from a PIII/800 with the same graphics card.

    You just don't buy Macs for world-beating performance (Photoshop being the exception). You buy them for nice industrial design, an OS that for all of its architectural ugliness still offers a more compelling user experience than Windows, and more often than not just to maintain an existing investment in MacOS software.
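    For the curious: the inner loops those AltiVec-tuned filters optimize look something like this -- the same arithmetic applied independently to every pixel, which maps neatly onto a 4-wide vector unit. (Plain C sketch for illustration; this is not Adobe's actual code.)

        /* scale-and-bias over a pixel buffer: every iteration is
         * independent, so a 128-bit vector unit like AltiVec can
         * process four 32-bit values per instruction */
        void scale_bias(float *dst, const float *src,
                        float scale, float bias, int n)
        {
            int i;
            for (i = 0; i < n; i++)
                dst[i] = src[i] * scale + bias;
        }

    Code that doesn't break down into loops like this (game logic, general OS code) sees little of that benefit, which squares with the Quake III numbers above.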

  • A single 500MHz G4 was compared side-by-side with a single 1GHz Pentium III, running the same Photoshop script. The results were 124 seconds for the P3 and 108 seconds for the G4. So basically, the G4 was faster than a P3 running at twice the clock speed.

    This may not be true for *every* application, or Photoshop job for that matter, but it does support the claim that some applications, under some uses, perform at better than twice that of a similarly clocked P3.

  • We probably would, if not for the fact that you can't run A/UX on anything later than a Quadra... which topped out at 40MHz '040s. For all the supposed benefits of Unix, you sure can get a lot more work done on a 400MHz machine that crashes once a day on a bad day than on a machine a tenth the speed that doesn't crash at all.

    And will everyone quit ragging on the one mouse button already? It's such a tired argument.
  • by _xeno_ ( 155264 ) on Tuesday October 24, 2000 @07:28AM (#679900) Homepage Journal
    Lack of apps for Alpha NT actually has nothing to do with MS just not porting them. It has much more to do with the target audience - and the target market isn't using Alphas for desktop tasks. Simply put, there is no market for Office on WinNT Alpha. In fact, the only market for WinNT/Alpha machines is as servers (mainly due to cost). As it turned out, there really wasn't a market for WinNT on Alpha either - in fact Win on Alpha support has been completely dropped. Not by Microsoft, but by Compaq.

    Conspiracy theorists might decide that WinNT for Alpha was dropped because Compaq wanted to force the people buying Alphas to use Tru64. However, this really isn't the case, because apparently the market for WinNT Alpha machines was less than 10% of the market for Tru64 Alpha machines. WinNT on Alpha simply isn't commercially viable.

    However the Mac is an entirely different beast. The biggest difference is simply the target market - while Alpha machines are sold as high-end servers, Mac machines are sold as desktop boxes. That means that there is a market for applications on MacOS that there simply isn't for WinNT on Alpha.

    Since there is a definite market for desktop applications on MacOS that WinNT for Alpha lacked, it stands to reason that if people aren't porting applications to it, there is some other reason... Unfortunately for Apple, this isn't entirely true - there is a much larger market for Wintel applications than for any other type. That's why there are almost always Win32/x86 versions even when there aren't versions for other platforms.

    A rather good example is the fact that Java for Win32 (x86) is usually more advanced than Java for UNIX. (Keep in mind that Java for Linux is almost identical to Java for Solaris and Java for (Free)BSD. The differences are mostly in the JIT, along with thread support and other things that the OSes disagree on.) Sun may own Solaris, but Java developers are mostly interested in their applications running under Windows. As a result, Java for Windows gets the most attention and is usually released sooner than Java for any other platform - including Solaris.

    It's really a market thing. If Apple can create a market for MacOS apps, then companies will port. The market only has to be commercially viable - the cost of supporting the market cannot be prohibitive. From the few Mac developers I've talked with, this hasn't always been the case.

    In the case of WinNT for Alpha, though, it was too costly a market to support. There simply wasn't any demand. Outside the world of open source, the market determines what succeeds and what fails - not technology. Not stability. And, again, it's the market that will cause OSX to either succeed or fail.

  • Mojo... the enemy isn't Apple. The enemy is Microsoft, and a Unix market that is divided is much easier to damage. By the way, what the fuck do you know about Apple products and engineering?
  • I'm curious. How does the OSX Beta compare with other graphically-rich Unix flavors (specifically Irix)? I want the opinion of Unix users (not necessarily LinuxPPC folks like myself).

    Since my machine is not uber enough to run it [ridiculopathy.com], I need to take your word for it.

    I just hope Apple doesn't sue me [ridiculopathy.com] for asking.

  • No matter how you look at it, the G4 cube is vastly overpriced... And that's coming from a Mac fan. You're paying a huge premium over a regular G4, for a machine with no expansion slots, no internal drive bays, etc...

    Yes, I think that regular G3s and G4s so far have provided very competitive bang for the buck, but the G4 Cube isn't aiming for that niche... it's aimed squarely at those who think that something should look cool and don't mind spending twice as much as they should for it.

    The G4 Cube really should be positioned between the iMacs and the G4s in Apple's lineup, not between the cheapest G4 and the dual-processor G4s.
  • Is it a mix of fussing and putzing?
  • by rrwood ( 27261 ) on Tuesday October 24, 2000 @07:34AM (#679905) Homepage
    If you read the linked article, you'll realize that this is not really a port of XFree86 to MacOS X. This is a port of VNC, which is extremely cool, to MacOS X. For those of you who are unfamiliar with VNC, it is similar to Timbuktu or PCAnywhere in that it lets you access and control a GUI desktop on a remote machine pretty much as if you were sitting in front of the remote machine. VNC does this by implementing an X server to host the X apps on the remote machine, and then shooting the pixel data to the viewing "client" machine. Obviously key presses and mouse gestures are sent from the viewer client to the remote/host machine, too. The best part (or worst, depending on your point of view wrt security, etc.) is that the VNC session stays put even if you quit the client, so your desktop session is maintained as you move around in meatspace.

    Click here to visit the VNC homepage [att.com]

    So, to run X apps on MacOS X using this hack requires you to run the X app on top of the VNC server, and then use the VNC viewer/client app to interact with the X app.

    Sounds like it'll be pretty sluggish, to me. Still, it is kinda clever, and it does let you run an X app if you really need it now.
  • Anyone know of any hard benchmarks for video processing/capture/editing?

    I've pretty much given up on the PC (be it Windows, BeOS, or Linux) for video capture and editing and will probably get a powerbook for that application simply to avoid the headaches PC video capture/editing always entails (unless Linux video editing has matured by then, which is a very distinct possibility), but I would be curious if anyone has any pointers to hard benchmarks or in-depth, relatively unbiased comparisons of the two platforms vis-a-vis video and NLE.
  • the results were 124 seconds for the P3, and 108 seconds for the G4
    If Apple is claiming that 124 is twice as much as 108, then I don't think I'll be buying their stock any time soon. Besides, application level benchmarks are only worth something to people who are going to run one application. Otherwise, use stuff like SPEC.
  • by ahg ( 134088 ) on Tuesday October 24, 2000 @07:37AM (#679908)
    I favor the Apple category for MacOSX related posts.

    About a month ago there was a MacOSX article with the BSD daemon - the discussion was so Mac-centric that it didn't really seem to relate to the common underlying BSD base.

    Here's the distinction I would make:

    If it's about Darwin, Apple's open-source CLI edition of the OS that compiles on various platforms, it should be categorized as BSD.

    If it is specifically about MacOSX, which is tied to proprietary Apple hardware, or about an application running within that environment, then it is an Apple article.

    As for the 'X' option, while I can see it as a contender for this article... since this news is particular to one "minority" platform and less relevant to the larger X user community, I would still go with the Apple categorization.
  • by HiyaPower ( 131263 ) on Tuesday October 24, 2000 @07:39AM (#679909)
    Hmm... A SETI work unit on my Mac G4 at 450MHz takes 6 hrs; done on my 333MHz Pentium 2 it's 16 hrs under Windoze and 12 under BeOS (same hardware, thus a 33% penalty for the Microsquish OS over BeOS). My 700MHz Athlon does one in about 8 hours (direct scale from the P2) in Windoze. My dual 450 P2 does one in 12 hrs. Somehow, it seems that clock for clock, I am getting more out of the PowerPC. Further, stability is not an issue even though I am running a frankenmachengezelshaftcomputingmachin machine (a 7500 with a boatload of RAM, IDE disks off a 3rd-party card, a drop-in processor, etc.). Somehow the "rice pudding" model that M$ uses to make their OS is more the problem in stability than the hardware.
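    Working that out clock-for-clock from the numbers above (my arithmetic, not a published benchmark):

        vs. the P2 under Windows: (16 hrs x 333 MHz) / (6 hrs x 450 MHz) ~= 2.0
        vs. the P2 under BeOS:    (12 hrs x 333 MHz) / (6 hrs x 450 MHz) ~= 1.5

    So per clock the G4 is doing roughly twice the work of the P2 under Windows, on this one workload at least.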
  • Well, the O and S are intellectual property of Microsoft, so Apple can't release those (they are only using them under license). Granted, there are some people who claim that those letters were used before Microsoft came into existence, but they are the same crazies who deny that Microsoft invented the graphical user interface and the web browser.
  • Actually, I run Solaris on Sun hardware because it's a better enterprise-level OS than Linux currently is. When Linux surpasses Solaris in that respect, I'll switch. Want some examples? The ability to set system resources like shared memory, semaphores, etc., with a text file and a reboot, not a kernel recompile. Or the ability to fsck a disk in production. Stuff like that.
  • Sheesh.

    I'm tired of hearing about this stuff. X is excellent technology, and the reason it's been around since 1984 and is still working wonderfully (well, X11 has been around since 1987) is that it's EXTREMELY WELL DESIGNED. Despite people's griping about the X toolkit and protocol, the whole system is very well designed, and built to last. It lasts not because of the abundance of "legacy" applications (at one time there was a migration from X10 to X11 -- and that was very quick -- and think of the migration from Win 3.1 to Win95, etc.), but because it's excellent, excellent technology.

    A word about antialiasing. Most uses of X are in businesses, governments, and science. When you're controlling satellites, nuclear reactors, nuclear warheads, global databases, etc, does antialiasing do you ANY good whatsoever?

    And as people have pointed out numerous times, today's screen resolutions are so huge that antialiasing is outdated--it was designed to compensate for huge jaggies that no longer exist.

  • Almost completely off-topic, but I had to laugh when I saw this on EvangeList today. You think Slashdot readers and the Mac community might have significantly different priorities?

    Subject: [CTA] :CueCat Reader for Mac
    From: "Dan Fisher"
    Date: Tue, 24 Oct 2000 08:13:39 +1100

    Hey Guys,

    There's a really neat little product being pushed (FOR FREE) by RadioShack called the :CueCat Reader. There is a Mac version of the software in development, and they're gauging the response on their website with a form for Mac users to sign up if they're interested. Let's show them we want this! Go read up on it at the site, it's basically a barcode scanner that launches websites of the products or books, CDs, DVDs, whatever you scan into it.

    http://www.crq.com/mac.html [crq.com]

    Dan

    Personally, I agree with Joel On Software [editthispage.com] -- I can't imagine why I would want one of these, regardless of whose software it runs.

  • Actually, I run solaris on sun hardware because it's a better enterprise level OS than Linux currently is.

    I wasn't talking about Enterprise 6500s, I was talking about Ultra 2s. :-)

    -
  • by plsuh ( 129598 ) <plsuh&goodeast,com> on Tuesday October 24, 2000 @07:47AM (#679915) Homepage
    OK, I have moderator privs right now, but this thread is just too tempting to not jump into!

    1) Given the architecture of X Windows, you must have an X server running on your machine. Even local X Windows apps run by connecting to a local X server. Just compiling an Xlib will not give you much in the way of speed gains -- loopback calls under OS X's network architecture are very cheap (heck, this is true for most OS's).

    The relatively expensive part of the X Windows architecture (in terms of speed and resources) is the context switch that is necessary in this whole set-up: server process picks up mouse click and sends to client, client processes mouse click and sends display commands back to server, server processes display commands and puts them back onto the screen. Any mouse click requires at least two context switches (server to client, client to server), which are expensive under some OS's (MS WinNT/2K, Classic Mac OS). However, under the Mac OS X kernel, and indeed on most Unices, context switches are fast and cheap, so this is not much of a performance hit. (This is why Apache on Unix runs multiple processes, but is a single multi-threaded process on WinNT/2K.)

    Pushing the server-side functions into the client-side Xlib would only really save the cost of the loopback overhead plus the context switches, both of which are cheap in Mac OS X (and other Unices). Only on a Windows- or Classic MacOS-based system does it make sense to try and cut out the context switches.

    2) The X Windows server does in fact translate X calls into native Mac OS X calls. The implementation referenced above does it through the VNC application, which is extremely slow due to the massive number of layers involved -- one or two context switches are not so bad, but it looks to me that they're going through four or five. If you look at the Mac OS X graphics architecture, there is a lightweight graphics server underneath it all called Core Graphics Services, which is responsible for all drawing on the screen. Aqua, QuickTime, OpenGL, and QuickDraw all hook into this layer to do their actual drawing to the screen. It is possible (I don't claim it's easy, but it shouldn't be that hard) to write an X server that hooks in directly above the Core Graphics Services layer to translate X Windows calls to native, low-level CGS calls. This would make X Windows just as fast as the native libraries (aside from bottlenecks that might be inherent to X Windows) and allow for interleaved X and native windows on the screen. This is the Right Way To Do It (tm). :-)

    Disclaimer: I am an Apple employee, but these views are my own and not based on anything that is Apple Confidential. I work with WebObjects as part of Apple iServices, which is a bit away from the core Mac OS X dev teams.


    --Paul
  • you heard about the $300 rebate?
  • You're too late. Tenon has already done this:
    http://www.tenon.com/products/xtools/pre-release_beta/ [tenon.com]
    Simultaneous execution of X- and Aqua-based applications is provided. Support for both copy and paste functions is provided between X and native applications.
    Tenon plans to add full support for OpenGL, as well as a more convenient way to close X applications and start remote X clients. Although no widget libraries have been ported, these too are planned.
  • by maggard ( 5579 ) <michael@michaelmaggard.com> on Tuesday October 24, 2000 @07:49AM (#679918) Homepage Journal
    The article is about cobbling together VNC to talk to X under MacOS X. This is *not* the same as firing up X on one's Mac monitor and getting a plays-well-with-others X window (which is what the "lame commercial binary" does.)

    Will there be a retraction this time or will it slide?

  • I remember thinking this years ago, when the Mac was still a strong contender, and CD drives were first becoming available. I'd see ads in the papers, with the CD drive for the PC costing $100, and the one for the Mac $200. And this seemed to be the case for most any piece of hardware.

    As for performance: I played around with an iMac at the local science museum a few weeks ago, and the general response speed reminded me of my old 486 that occasionally gets used as an X server.

  • Read the original post next time - the G4 was more than twice as fast per clock cycle as the PIII. 500MHz G4, 1GHz PIII; you do the math.


    However, it doesn't really say much since Apple naturally picked the tasks the G4 does best compared to the PIII.
    --
    Niklas Nordebo | nino at sonox.com | +46-708-405095

  • I suspect you may be trolling here, but I'll play along anyway...

    I don't doubt that X is well designed. Unix is well designed, and that's been around for years too. However, if something has been around essentially unchanged for over ten years, then it CANNOT be said to represent the state of the art in its field. This is certainly true of X.

    There's nothing particularly wrong with it, except that we can now do better!

    As for your comments regarding anti-aliasing, I must strongly disagree here. Businesses, governments and science ALL benefit from anti-aliasing, simply because (in the most simple terms) it makes the writing on the computer screen easier to read! In fact these are three areas where operators would expect to read a lot of material from screen (eg. papers, reports, figures, etc etc.) and therefore where anti-aliasing would be of most benefit.

    This isn't intended to be a facetious question, but have you ever actually used a system with proper sub-pixel anti-aliasing throughout? Come back to anything else and your eyes will complain...

    As for the resolution issue, jaggies will ALWAYS occur no matter what the resolution, as at the end of the day you cannot perfectly approximate a curve by a series of rectangular dots on a CRT. Moreover, anti-aliasing makes small fonts MUCH easier to read, even at high resolutions, and prevents the "greeking" that so besets X's standard fonts (on my machine at least).

    Also, remember that some of us are forced to use resolutions such as 800x640, for either personal or financial reasons.

    (Note: just thought I'd say this early before some AC tries to be clever... anti-aliasing /= alpha transparency. The latter *can* (but need not) be used to achieve the former - that's all.)
  • one thing X certainly needs is FAST and CONSISTENT (across the whole desktop) sub-pixel anti-aliasing. Acorn users have had this since 1990

    What was the screen res of an Acorn in 1990? As your screen res increases, antialiasing becomes less and less important. But you're right, all else equal, it would be nice to have antialiasing as an option.

    After using and loving sharp, flicker-free, non-antialiased text on a 15" 1400x1050 notebook, I would say that the world doesn't need antialiasing as much as low dot pitch and high refresh rates (or just discrete pixel displays). Yes, that's hardware technology, beyond the scope of an OS, but all OSs will benefit from those developments. And your retinas will love you. No more antialiased, blurry flicker. Visual joy.
  • 1) Twice the performance? Maybe equal. I have a 350MHz G3 and a Compaq PII-400 running Win2k side by side on my desk, and generally speaking the Mac feels slower than the PC, albeit not by much. I don't run Photoshop, but I do pound the snot out of the I/O capabilities of both machines, and the Mac's I/O is doggy under OS9.

    2) Neither machine has been in need of much futzing. I've added a SCSI card to both machines without any problems, either OS or hardware. One thing that drives me nuts about the Mac lineup is the *lack* of standardization. The G3/G4 line notwithstanding, Macs have been all over the map in terms of hardware. PCs vary, but the behavior and components are largely swappable between a Dell and a Compaq and whatever. I think the PC lends itself to user-driven futzing because it can so easily be futzed with. Many Macs have lacked any expandability or changeability, so you didn't futz because you couldn't.

  • Funny, I thought Slashdot followed the Law of the Schoolyard:

    1. Don't tattle.
    2. Always make fun of those different from you.
    3. Never say anything unless you're absolutely sure everyone else thinks the same thing.
    I guess you learn something new every day.
  • by arnald ( 201434 ) on Tuesday October 24, 2000 @08:05AM (#679927)
    Agree with most of this, except the fan. Fans are EVIL and must be banished by better design. It's like power amplifiers - most designs require fans, but companies that put their minds to it can come up with very high powered amps that cool by convection, and actually run colder than an equivalent fan-cooled device.

    This must be done with computers. It's particularly important if (like me) you use a computer in your recording studio - this is one area where Macs are particularly popular, so evidently Apple are listening to the market here.

    (NB this is why I still use a silent Atari 1040STE for my sequencing...)

    However, I haven't used a Cube yet, so for all I know it might melt within half an hour. :-)
  • For all of its bluntness, this is one of the most effective summaries that I've seen of this entire discussion about the differences in chip speeds and when they are and aren't important or significant. It certainly underscores some of the very real reasons why Apple machines might be a rational purchase, while simultaneously pointing out some of the smoke screening Apple has been doing with regard to hardware performance.
  • I think Jobs' real strategy is to turn the Mac into the equivalent of a designer label in clothing.

    Here (in the UK) I see that most ads (90%?) that need to include a computer as part of the set are using iMacs and iBooks. That's a heck of a lot of 'free' advertising. And TV programmes too. Take 'Watchdog' for example (they pursue consumer complaints by hounding down the companies involved -- you don't want your company featured on this programme): they read email from viewers while in the studio on air, on a prominently placed iMac.

    Another example, I was checking out the latest models at one of the few department stores that carry macs, when a middle aged woman dragged her husband over to show him the machine. She said "Now that (hands gesturing with delight) is what I'd like to see at home..."

    Something is working in their campaign -- although I'd hate computers to go the way of the car....

  • by yerricde ( 125198 ) on Tuesday October 24, 2000 @08:18AM (#679934) Homepage Journal

    The "twice as fast as Wintel" claim is based on a small number of Adobe Photoshop operation benchmarks; usually filters that have been painstakingly optimized for the G4's "Altivec" vector processing unit. This isn't necessarily "cheating", since Photoshop is still one of the primary reasons to buy a mac, but if you are not a graphics professional, you are simply never ever going to see that kind of speed benefit using a Mac.

    How do you know the native core graphics drivers aren't also written in assembly language for Altivec? Painstaking optimization of graphics is part of what made the first QuickDraw so fast and Macs so attractive in the first place.

  • by Junks Jerzey ( 54586 ) on Tuesday October 24, 2000 @08:29AM (#679936)
    >I've been told to expect twice the performance from a G3 as from a similarly clocked PIII.

    That is unadorned horseshit. But don't take my word for it: go to www.spec.org and check out the numbers yourself. 20-30% is more the average gain, and that's cold comfort when you can buy 1.2GHz Athlon chips for less than $500 a pop.


    It was horseshit when Apple tried to say that the G3 was twice as fast as a Pentium II back in 1998. I repeat: it was bunk. But it has turned out that a 500 MHz G4 (not G3) is remarkably fast for its clock speed. Here's [cpuscorecard.com] a PC-oriented benchmark site, quoted on Slashdot a few weeks back, showing that a 500 MHz G4 is only 15% slower than a 1 GHz Athlon. That's impressive, especially when you look at the huge difference in power consumption.
  • Back in the days of old, Macs had better parts than PCs. The 2x price difference you'd see in CD drives etc. between the two platforms was mostly due to the fact that Macs were a SCSI-only outfit. This price difference is still true today: a 48x IDE CD-ROM drive will set you back between $40 and $50, while a 40x SCSI CD-ROM will set you back a little over $100. These drives are probably pretty comparable in data rate, given the inherent speed advantage of SCSI. At least that's the feeling I get looking at egghead [egghead.com].

    This has changed somewhat in the last 3-4 years. Macs are now shipped with IDE hard drives and CD drives. Any external devices are connected with cross platform USB cables rather than SCSI and ADB which were unique to Apple in the consumer market. In addition the PC industry has switched from 70 pin SIMMs to 168 pin DIMMs so memory for Macs is now the same parts as PCs. As a result of these changes it's now a lot easier to buy parts for a Mac and a lot cheaper too.

    Apple has chosen to adopt more industry-standard parts as an alternative to using only the best parts. This has led to cheaper Macs at the expense of some of the really great quality that used to be worth paying extra for.
    ________________
    They're - They are
    Their - Belonging to them

  • It would be basically impossible to implement a system wherein all X apps acted as actual Mac apps. The reason is that X, at its lowest level, does not know anything about high-level controls like buttons and scrollbars. These widgets are all implemented at a higher level by toolkits like Motif, Qt, and GTK. And they don't work the same way.

    However, it would be possible to write a port of GTK, Motif, or Qt (or any other toolkit, I'd imagine) that would translate the high-level controls into their MacOS equivalents, roughly along the lines of the sketch below. This is very similar to what TrollTech does for its Qt toolkit under Windows. However, TrollTech has no plans to make a port that runs under anything but Windows and X11. Maybe if you've got some free time on your hands... :-)
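    The shape of such a port, very roughly, is a thin backend interface the toolkit draws through, so the same widget code can target X11 or a native window system. A toy sketch of the idea (all names here are made up for illustration; this is not GTK's or Qt's real porting layer):

        /* hypothetical toolkit backend: high-level widget code calls
         * through this table, and only the table's implementation
         * knows whether it's talking to X11 or the native toolbox */
        #include <stdio.h>

        struct backend {
            void (*create_button)(const char *label);
        };

        static void x11_button(const char *label)
        {
            printf("X11 backend: draw button '%s' with X primitives\n", label);
        }

        static void mac_button(const char *label)
        {
            printf("Mac backend: create native control for '%s'\n", label);
        }

        int main(void)
        {
            struct backend x11 = { x11_button };
            struct backend mac = { mac_button };

            x11.create_button("OK");  /* same toolkit-level call... */
            mac.create_button("OK");  /* ...different native target */
            return 0;
        }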

  • In other words, a theoretical 500 MHz P3 would take 248 seconds to complete the task which took only 108 seconds on the G4.
    The 500MHz G4 did it in 108 seconds. If it were exactly twice as fast per clock cycle, it would have done it in 124 seconds -- half the time of that theoretical 500MHz P3 -- so it actually came out a bit better than 2x per clock. Regardless of the numbers, the benchmark itself is still worthless.
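    Spelled out from the figures quoted:

        per-clock speedup = (124 s / 108 s) x (1000 MHz / 500 MHz) ~= 2.3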
  • Like many things the rich buy, it also *performs* like a high-class machine. Ever hear someone complain about their 200k luxury car? PowerPC is a good technology; the Velocity Engine is much better than MMX. And the OS is more stable than Windows. So the cost is worth it to those who are willing to spend the cash.
  • While that's a noble goal, it would probably be much easier to just port the GTK libraries to OSX (using this X server) and use a OSX-like theme.
  • he plugged back into the Apple customer and made a real attempt to satisfy what Apple customers want

    (disclaimer -- I'm a long-time Macintosh owner & supporter (since '88)... as well as a Linux user (circa 1997))

    What has Steve Jobs done to satisfy what the Apple customer wants? Here are a few examples:

    • He killed the clones. Granted they were killing Apple's business, but Apple wasn't producing machines as fast or as sleek as the clone companies. It took almost 2 years for Apple to catch up to Power Computing's PowerTowerG3/275 (or something like that), and it took Apple almost 6 months to release a motherboard that ran at 66MHz (Motorola was close to shipping one right before the no clone policy.)
    • He doesn't ship any 6-PCI-slot machines. Many of the people in Apple's core markets need as many PCI slots as possible. SCSI (another sore subject), video capture cards, video cards (for multiple monitors), etc., are all put to good use inside a Mac system.
    • Their B&W G3 "Trojan Horse" firmware upgrade. They issued a required upgrade for their Blue & White G3 Macs that disabled the machine if you upgraded to a G4 processor before Apple wanted you to. They later fixed this situation, but most Mac users weren't happy.
    • A few weeks after the B&W incident, Apple tried to ship G4 machines @ 400, 450 & 500 MHz. They couldn't get enough 500MHz chips, so they dropped the speeds down 50MHz, and then raised the prices of some people's orders (ie, if I ordered the 450MHz machine before Apple changed speed ratings, I would be shipped the 450MHz machine at the old 500MHz price...)
    • Recently, Apple charged their customers $30 to debug their beta OS! Granted it is nice, but it is far from feature complete, and there are lots of thorny issues that Apple needs to straighten out (ie, do you display folders that begin with a "." -- it's a system file/directory in Unix, but it worked fine in OS 9... and a few CD-ROMs I own have folders beginning with a ".")
    • Mike at Xlr8YourMac [xlr8yourmac.com] had a petition to allow third-party video cards to be installed at the Apple Store. Almost 10,000 people signed it, but Apple seems to have ignored the wishes of their customers. (FYI, ATI is the current supplier (both Radeon & Rage128 Pro), but 3dfx has Mac support, as does ProFormance, and supposedly the GeForce2 MX is capable of supporting the Mac (no drivers, though).)

    I'm not saying I hate Apple, but I (and many other Mac users) have become very frustrated with the way Apple treats their die-hard group of users. There are tons more examples I (and other Mac users) can give you. I really hope this changes in the future.
  • My favorite X server for MS Windows is Reflection X. It allows you to do expect/send dialogs with a telnet server so that you don't have to diddle with a console window prior to starting X.

    I like the idea of running an X Server as a service in Windows, though. Even though it's essentially the same thing, it would definitely feel more natural.
    --
  • How do you know the native core graphics drivers aren't also written in assembly language for Altivec?

    Because (a) QuickDraw has been around a lot longer than the G4 units, (b) Apple still needs to support non-G4 macs, (c) graphics-intensive programs other than Photoshop tend not to exhibit the same gains (Quake III comes to mind instantly), and (d) I follow Apple's OS announcements pretty closely and haven't seen them announce any such thing for MacOS 9. Nor have ATI or 3dfx made any mention of re-tuning their device drivers for Altivec that I'm aware of. (Proof to the contrary will be happily accepted, on both counts.)

    Let me re-emphasize that first point: QuickDraw is really, really old. Re-writing the thing to be optimized for altivec when they've known for over a year now that they're going to be throwing it away in favor of Quartz would be a nonsensical waste of effort on Apple's part.

    The good news is that they are apparently optimizing quite a bit of the "Quartz" (aka "DisplayPDF") rendering engine for Altivec in OS X -- most reviews of the public beta have commented on how much snappier screen operations are on the G4 than even on higher-clock G3 machines.

  • by latneM ( 7876 ) on Tuesday October 24, 2000 @08:59AM (#679950)
    The second link in the post describes how to get XFree86 to run on MacOSX. Specifically, look here [darwinfo.org]. Even the page you refer to points to that article, and even says (direct quote) "As a side bonus, if you survive the 50+ mb download, you can log in to OS X's console and run the X server directly from there, if you so choose".


    How about reading the article before complaining about it?

  • As I said originally, application benchmarks are only worth using if you're only running that particular application. :-) Although I do point out that for general benchmarking, wanting to use a specific application generally means assembler-optimized. What would happen, for example, if the Photoshop filter were optimized for MMX2/KNI/SIMD/whatever the hell it is the P3 has?
  • OK - I'll reword it:

    While one can get X working under MacOS X, it doesn't run under Quartz.

    Running X on MacOS X isn't terribly impressive - it's been doable on MacOS X's progenitor Darwin for quite some time.

    What folks want is an X that can run under MacOS X's Quartz/Aqua environment. Then one could simultaneously run Classic, Carbon, Cocoa, Java & X applications. To date this is only doable using the "lame commercial binary".

  • I've seen the comment that Apple charged customers for a beta a lot. At least on the high end, the people who want early access to software end up paying more, since they end up with more support. I still think you shouldn't charge for betas, but it is hardly unusual.
  • According to the Darwin site,
    This really is a document about XFree86 on Darwin, but there's just a tiny trick to get things working on MacOS X. You should follow all the directions above, but to be able to get X working on MacOS X, you need to quit the MacOS GUI. To do this, log in as console, and it should drop you to a text prompt. From there you can start XFree86 the same as from Darwin.
    I'd call this a bit more than a "tiny trick;" it is more than a small matter that you need to quit the MacOS GUI.

    That means that you can't, at least not with XFree86-for-Darwin, run MacOS apps concurrently with X-based applications.

    It certainly represents a cool hack, but, in that it requires choosing not to use "MacOS," this rather diminishes the merits of having MacOS-X. If you haven't the GUI, how much better can "text mode" MacOS-X be than Linux or *BSD?

  • futzing is a term commonly used in Larry Niven novels, as a euphemism for "fucking." tanj dammit also comes to mind: "there ain't no justice!"
  • by Anne Marie ( 239347 ) on Tuesday October 24, 2000 @10:13AM (#679971)
    Behold: the Y Window System [hungry.com]. Check out the overview [hungry.com]. It shows promise, but then, we've been saying that about Berlin for years now.
  • > Nobody cares much about Apple hardware if it's
    > not even slightly compatible with any widely
    > accepted software, besides of course what
    > Microsoft decides to throw at it.

    Most "widely used" software on Windows consists of ports of Mac and Unix software. Word, Excel, Photoshop, Illustrator, GoLive, Director, Pro Tools, QuickTime ... these are all Macintosh apps that showed up on Windows around their 3.0 or 4.0 versions. In spite of the fact that Adobe, the second biggest software company in the world, ported their whole product line to Windows, they still make the majority of their money from Mac users.

    Internet Explorer on the Mac is probably the only widely used Windows-first app, but the Mac version uses a completely different codebase and UI, and has a standards-compliant rendering engine. Only the name is the same.
  • by gig ( 78408 ) on Tuesday October 24, 2000 @11:03AM (#679980)
    > It's really a market thing. If Apple can create a
    > market for MacOS apps, then companies will port.

    I'm a Mac user. Can somebody tell me: what are the apps from Windows that I am missing out on? Might as well leave out anything that already runs on Unix, since I'm running Mac OS X.

    Some of the major apps I'm running now: Pro Tools, Cubase, Peak, Photoshop, Dreamweaver, Fireworks, Director, Flash, FreeHand, Word (under duress), Internet Explorer, Acrobat, BBEdit, QuickTime Pro, VideoShop, ViaVoice, iMovie, RealPlayer, Shockwave, and about 50 smaller apps that do things like play MP3's or batch convert media files or whatever. I mean, what am I missing here?

    So far the only occasion I've had to actually have to use Virtual PC for something productive (as opposed to just for geek fun) was Ray Kurzweil's Poet Assistant, which is a small app that runs full speed in Virtual PC. I was sort of interested in Sonic Foundry's ACID for a while, but Bitheadz now has a Mac app called Phrazer that's the same thing.

    This is a legitimate question. I don't feel like I'm missing anything, but I'd like to know what these apps are. Never mind about games ... I'll get a console if I want those, although I have the Mac OS X version of Quake III.
  • Yes, but have you noticed how quickly fashion changes?

    Almost as quickly as systems become obsolete.

  • > He doesn't ship any 6 PCI slot machines.

    If you need more than 3, you need more than 6. People who do, use an expansion chassis. The kind of user you're talking about buys their Mac right from Digidesign or Avid, as a small part of an overall $10,000+ system. The expansion chassis is a minor expense. This "issue" with Apple is as much a real issue as the one-button mouse, which 70% of Mac users like better than two.

    I've always had at least one slot free on my Blue & White Power Mac. I mean, when you have FireWire and USB, PCI is much less important.
  • Yo,

    How about us poor Win2k users? The Radeon drivers are about 50% slower than the 98 ones. ATI isn't getting my money until they get their act together.

    ostiguy

    Of course, the reason nVidia isn't an option is because of their closed source binary drivers for Xfree86. I am just another MCSE who runs OpenBSD for a router/firewall.

  • Fact: Apple recommends 192 Megs of memory as a realistic minimum

    Where did you get this so-called "fact?" 128MB works great. And remember, this is still pre-release unoptimized code. The goal is to get it down to 64MB by 1.0.

    Windows 95 box that I use for web surfing. It has 32 Megs of memory. It runs on a Cyrix 5x86-120 (sort of a 486-DX-120). It flies.

    Windows 95 is hardly comparable to OSX. A lot of the aforementioned requirements for OSX go into supporting the Classic environment which is an entirely different OS. If you think Win95 is better overall than OSX, then feel free to continue using it.

    But this is a moot point since OSX on Intel isn't good business sense right now.

    - Scott

    ------
    Scott Stevenson
  • If only the G4 cubes didn't have hairline cracks in their clear plastic casings

    Whatever they are, it's not too surprising that they showed up. Nobody has tried to make a computer like this with these materials before. It's hard to blame engineering for shortcomings in an experimental product. It's easy to blame PR for the way they dealt with it.

    If only the G4 cube had a fan so it wouldn't overheat like a toaster.

    Ummmm, I actually haven't heard of any of the cubes overheating. PowerPCs take much less power and generate much less heat than most other chips.

    If only it had capacity for a true RAID cage

    I think this may be outside of Apple's target market.

    If only the G4 cube had an SVGA connector

    Agreed.

    If only it had room for more than 2 DIMMs.

    Hmmm, size sacrifices have to be made somewhere to shrink the case. Otherwise, why not get a tower?

    - Scott
    ------
    Scott Stevenson
  • It doesn't. OSX is not, and will not be, a Unix flavor. It is a proprietary user environment running on top of the Mach kernel.

    That's a bit misleading, as it acts very much like a Unix flavor in a number of ways -- certainly much more than NT ever will. The fact that it is based on Darwin says a lot, and Darwin is nothing if it is not Unix. I'm not qualified to compare OSX to Irix, though. There are a number of articles on the web as to how good a Unix MOSX is. Such a topic is beyond the scope of a single post.

    - Scott


    ------
    Scott Stevenson
  • Tenon has a beta of Xtools [tenon.com] available for download.

    According to their press release it is a:

    full implementation of the X Windows system running on Mac OS X. Based on X11R6.4, Xtools inherits the clean, fast, stable, and portable codebase from Xfree86. Integration with the Aqua environment is enabled by building the X server on top of Cocoa and QuickDraw, providing a rootless X windows display while still retaining the ability to use native applications.

    Of course, it costs real money, but it seems to be a smoother solution than VNC.

    -Andrew

  • by TheInternet ( 35082 ) on Tuesday October 24, 2000 @01:29PM (#680000) Homepage Journal
    The Mac is dying.

    You do realize how ridiculous that sounds, considering that 1) this has been said since 1986, and 2) Mac market share has been increasing recently?

    It can't compete with Durons/Athlons/Thunderbirds, PIII, PIV, SMP

    Actually, that's the funny part. Despite popular Slashdot belief, G4s do quite an admirable job of competing with processors at twice their clock speed. You'll note that IBM, Sun, etc. do not freak out that they sell high-end machines with low-megahertz processors in them. The real problem is that Motorola has not shipped faster chips in about a year. As for SMP, the G4 was designed with SMP in mind, as was OSX. You can get a dual G4 for $2500.

    The reason Mac lost is that they didn't realize the power of the commodity marketplace.

    Or maybe consider the option that Apple isn't really about that type of product. Do we really need another generic box maker?

    AMD is now doing SMP

    PowerPC has been doing SMP since the 604 days. This isn't that impressive.

    Motorola will be out of the PPC business within 2 years

    Hopefully.

    Why do you think they are stuck at 500 MHz?

    Because their fabrication process sucks. IBM had to come in and save the day.

    Right. No interest in going further.

    They just unveiled the G4 Plus at 1GHz. No idea when this will end up in an Apple machine, though.

    Motorola pulls out and Mac will croak.

    Strong words for somebody who has never run a multi-billion dollar computer company before (I'm assuming :).

    - Scott


    ------
    Scott Stevenson
  • by piecewise ( 169377 ) on Tuesday October 24, 2000 @02:09PM (#680004) Journal
    Just so people know, I bought a new Cube with a 17" monitor, and it cost me $1799. Well worth it. It's not only absolutely beautiful, but incredibly fast.

    As a big supporter of Linux, I suppose I'm not AS into aesthetics... but when I see a little power icon on the top of the Cube, but no button, and I go to press it and suddenly it's glowing (I can't tell from where!) - well, I'm very impressed. I'm finding these "stupid Apple design details" really make it wonderful. People say, we don't need Tangerine iBooks! I want something dull!
    Well you know what? My work can be pretty dull, and when I go and look at this beautiful machine that's ALSO very fast and very ahead of its time (OSX), I'm pretty damn inspired to produce some beautiful code.
    It's not for everyone, but it's definitely for me, and a few million other people.

    I'm running Mac OS X Public Beta on the Cube and I really love it. Yes there are some quirks, but it's just so amazing and a lot can happen between now and January.
  • If only the G4 cube had an SVGA connector...

    Then what's this "15-pin mini D-Sub VGA connector" on the spec sheet [apple.com]?

    Free clue: It has both the weird ADC connector and a standard SVGA connector.
  • It's not just a Photoshop benchmark, it's a Photoshop FILTER benchmark. They are:

    cross-platform
    used in multiple applications (not just Photoshop ... for example, Director and FreeHand both support Photoshop filters)
    very processor intensive
    (most of all) the user sits and stares at a progress bar while they happen.

    If you're encoding or compiling, you might start it and go away, maybe even overnight, but if you're applying Photoshop filters, you are going to watch a minute go by here, and two minutes there. The shootout that they do at the Expos is quite convincing in that case.

    PC Magazine also recreated those benchmarks. They put a dual 1GHz PC (note the "dual") against a dual 500MHz Mac, and the Mac won 6 of their 8 Photoshop filter tests, and tied a 7th. They also did a bunch of comparisons of 3D rendering and stuff, and the PC only barely beat the Mac in many cases. This is a dual 1GHz PC, and the Mac held its own very, very well, even in non-Altivec stuff. These Macs are fast machines.
  • True, but it'd be great if people would start working on an open source version...
  • Have you seen the $500 CRT? It comes ColorSync-calibrated and has a fully flat 16" viewable display area. It has a completely clear casing that's really beautiful. When you plug it into a Mac, all of the display's controls are accessible from within Mac OS, and the display's power switch works for the whole computer. It's also got a couple of USB ports and gets its USB, power and analog video through one cable. An entirely different device from the house-brand 17" CRT you pay $250 for from other vendors. I haven't used a CRT for about a year, but I was still blown away by this display when I saw it.
  • If only the G4 cubes didn't have hairline cracks in their clear plastic casings (I don't care whether they're cracks or just flashings from molding, those blemishes don't belong!).

    Yawn. They apparently fell a little short of crafting an absolutely seamless and perfect Cube out of transparent plastic. "If I had known that my Cube wasn't going to be geometrically perfect, I would have got a Compaq". Right.

    If only the G4 cube had a fan so it wouldn't overheat like a toaster.

    What are you talking about? Whose Cube overheated?

    If only it had capacity for a true RAID cage (not yet another über-clocked serial interface stretched to the limits).

    If you want a Cube with big storage, either 1) hook up two FireWire drives and turn them into a RAID with SoftRAID, or 2) hook up an actual hardware FireWire RAID, or 3) pay $200 extra for your Cube to get Gigabit Ethernet and hook up an Ethernet storage device. Are you really stretching your FireWire bus to its limits? What are you doing that causes that? Lucky for you, the 800 Mbps version is almost ready.

    If only the G4 cube had an SVGA connector so you could connect a decent 21-inch monitor to it instead of Apple's ultra-lucid offerings.

    The Cube has an SVGA connector, as well as a standard DVI+ connector, which Apple calls an "Apple Display Connector", same as they call 1394 "FireWire".

    * If only it had room for more than 2 DIMMs.

    You're breaking my heart. The Cube is 8 inches square and can take a GB of RAM. Boo-hoo.

    If only you could put in a less expensive IDE CD-recorder.

    You can get a USB model for $200 or less, or a FireWire one for a bit more, and use either on multiple machines. Que makes some really nice looking ones.

    If only it didn't look like a giant spider once you finished connecting all the external devices.

    That's a weird complaint to make about a machine whose standard cabling goes:

    mouse --- keyboard --- display --- Cube --- wall power

    all in one long line, and has antennas built in for wireless networking. The last time I checked, there weren't any other manufacturers doing anything at all about how cables look on their machines. Apple is the only company I've ever seen that actually shows its products in advertising with the cables attached and showing, such as the iMac ad that shows how to set up an iMac. HP is not going to show you what kind of cables are involved in their machines until after you buy.

  • There is a VGA port and an ADC port on the video cards that ship in Cube and tower Macs. If you buy the retail (boxed) version of the same two ATI cards you can get from Apple, they have a VGA and a plain DVI on them. I have yet to see a display card with a digital output that didn't also have a VGA on it. At least not in the last couple of years.

    The ADC is actually one of the standard DVI connectors. It is the same as the plain DVI connector, except that it has a few more pins on the end that carry VGA, USB and power. It is better than the plain DVI plug (since all digital flat panel displays also need USB and power as well as DVI), but it is not cheaper. Hence, it is not used by manufacturers of commodity PC's.

    I know it's natural to go "oh no, not another connector", but the reason that adapters to split ADC into plain DVI / USB / VGA / power are cheap is that all those signals are already within the ADC. The signals are the same. You're not converting anything, just re-cabling. It's an issue of three cables between two devices versus one cable between two devices, not a competing technology.

    Think about it, though: the ADC carries everything any display could possibly need in one cable, whether the display is analog (VGA) or digital (DVI). That's why all of Apple's displays (two digital LCDs and one analog CRT) use the ADC connector. If every computer had one of these connectors, hooking up a display would be as simple as plugging the display's cable into the computer, without having to know or worry about whether it's an analog or digital display. We ought to applaud Apple for going down this road.

    Why switch from the VGA connector to the plain DVI connector (as an industry) and not get a little more than just the analog-to-digital switch? The ADC also carries VGA, so you can adapt an existing VGA monitor design to an ADC connector easily; the signal is still there. The ADC is a good "universal" display connector, whereas the rest of the industry is going with both VGA and plain DVI on everything from now until probably forever, along with instructions not to hook up a display to both at once, and the requirement that you have a vague knowledge of which display is analog and which is digital.

    Billions of people hooking up billions of displays over the coming years will also have to run a separate power and USB cable. Why would you voluntarily have three cables going between two devices? So you can knock $20 off the price of the computer? Not worth it. If you use a Compaq or Dell or whatever brand of machine, you ought to be on them to get with this program. Think about it next time you're hooking up three cables between two devices.
  • Operating systems vendors invest a great deal of energy in getting applications developers to code products to the native API of the OS.

    The result is that it is very difficult for the developer to bring the product out on a competing platform, and it discourages users from moving to a different OS when they feel the vendor isn't serving their needs (because they can't get the solutions to their problems).

    If the developer doesn't want to deal with the OS vendor anymore, he's really got a problem - either suffer under the vendor's thumb, or make a great deal of personal sacrifice to move to a different operating system.

    I was sick of Apple, so I wrote "I'm worried about my future" [scruznet.com]. That's why I'm a Be developer.

    And in fact I shipped (and still do support) one of the first commercial applications for the BeOS, Spellswell, from Working Software [working.com].

    Nothing Be ever did made any sense, and while there are individuals at the company that I regard highly, on the whole I felt the company to be uniquely unresponsive and incompetent.

    And just when they were showing some promise of shipping enough BeOS [be.com] installations that I had some hope of making more than the measly couple hundred bucks I'd earned in royalties in the three years I'd been working on Spellswell, they announced a "change in focus" and said they weren't going to support the desktop anymore, except to the extent necessary to use it as a development platform for their new Strategy Du Jour, Internet Appliances.

    After I posted on BeDevTalk that Some of Us Work for a Living [escribe.com], the moderator told me he was fed up with a developer who was trying to discuss business issues of concern to Be's third-party developers on Be's third-party developer mailing list. That was my last message to bedevtalk - he unsubscribed me.

    I've been working on a really challenging C++ application for a few months, and after reading C++ Answers with Bjarne Stroustrup [slashdot.org] I got excited about really digging into the basics of programming - but from the perspective of a developer with 13 years of work experience and a lot of shipping products. [goingware.com]

    I bought a few books, mostly on C++ and also hit some websites and newsgroups, and I became a much better programmer as a result. And I really felt that I did better to spend my time on core architectural and language issues rather than dealing with OS-specific nits or tool issues. And so I wrote Study Fundamentals, Not APIs, Tools or OSes [goingware.com].

    So this brings me back to being used by operating system vendors to serve their material needs at my expense and the cost of much personal pain. If you become a better programmer by learning the basics better, you can fluidly go from OS to OS without much of a learning curve.

    But there's the problem that you have to use some API to code your application to, and while Java claims to be "platform-independent", it is really a proprietary platform in itself [att.com]. Just try making use of platform-specific code in a Java application: yes, you can do it with the Java Native Interface, but it is difficult, and it is an assault on the Java developer's senses to have to write a DLL in C or C++ to load into the runtime.
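
    To make that friction concrete, here is a minimal sketch of what dropping into platform code costs a Java developer. The Hello class and the "hello" library name are hypothetical, invented for illustration; the JNI boilerplate around them is the real API:

        // Java side (hypothetical): declare the native method and load a
        // separately built shared library at class-initialization time:
        //   class Hello {
        //       public native String platformName();
        //       static { System.loadLibrary("hello"); }
        //   }
        //
        // C++ side: this must be compiled into a DLL / shared library
        // named "hello", exporting a symbol whose name encodes the Java
        // class and method names.
        #include <jni.h>

        extern "C" JNIEXPORT jstring JNICALL
        Java_Hello_platformName(JNIEnv *env, jobject /* self */)
        {
        #if defined(_WIN32)
            const char *name = "Windows";
        #else
            const char *name = "not Windows";
        #endif
            // Even a plain string must be marshalled back through the JNI
            // environment; you cannot simply return a char*.
            return env->NewStringUTF(name);
        }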

    So what you really need is a cross-platform application framework that you can write in with a language such as C++, that comes preconfigured with easy-to-use preprocessor symbols so you can drop into OS-specific code at your whim, and will compile from a single sourcebase to native machine code for multiple operating systems.
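
    As a rough sketch of that idea (this is not ZooLib's actual API - just the shape of the pattern, with a made-up MilliSleep wrapper): the application codes against one portable call, and preprocessor symbols select the native implementation that gets compiled on each platform:

        #if defined(_WIN32)
        #  include <windows.h>
        #else
        #  include <unistd.h>       // POSIX: Mac OS X, Linux, BSD
        #endif

        // One portable call for the application to code against; the
        // OS-specific code is a compile-time branch, and the result is
        // native machine code on every platform from a single sourcebase.
        void MilliSleep(unsigned ms)
        {
        #if defined(_WIN32)
            Sleep(ms);              // Win32
        #else
            usleep(ms * 1000);      // POSIX
        #endif
        }

    A real framework wraps windows, threads, files and sockets the same way, which is how one sourcebase can target Mac, Windows, BeOS and Linux at once.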

    Funny that, since December '99 I've been writing a multithreaded special-purpose graphics editor that is also an HTTP client with just such a cross-platform application framework. I can develop on Mac or Windows as the need suits me and switch back and forth at a moment's notice (especially now that I've got file sharing between my machines). My client only asked for Mac and Windows versions, but I could port to BeOS or Linux in a few days. The framework is called ZooLib [zoolib.org].

    It was written by my friend Andrew Green of The Electric Magic Company [em.net], originally to insulate himself from Apple's API nonsense. (Do you remember when all progress on developer tools at Apple and Symantec stopped while they went off into the sunset to develop Bedrock, itself a cross-platform application framework and an immense investment of time and money - and then abandoned it? If it hadn't been for then-tiny Metrowerks [metrowerks.com] Apple would have gone out of business after shipping the first PowerPC Macs, because there would have been no native PPC compilers.)

    He felt that if he could code to his own layer and Apple changed their API, he'd just have to reimplement the OS-specific layer and he'd be working again. But then a little more work and he'd be cross-platform...

    If you click that link today you'll just get a placeholder page. But just wait a few days...

    (For practical reasons the source itself, mailing lists and so on will be provided at http://zoolib.sourceforge.net/ [sourceforge.net] once it's released.)

    While ZooLib is to be newly released to the public it is not new code. It has been in use in commercial products for about five years - and in development in my own since last December. Part of why Andy gave me the code and I've been working with it is to give him meaningful architectural feedback and detailed bug reports so he can prepare it for public release.

    I've been urging Andy to release the source as-is for a couple of years, but his standards are incredibly high for a programmer. Andy's code doesn't just work, it is correct.

    Andy spares no effort or time to fix the smallest problems (this is especially important in multithreaded code - think about reference-counted smart pointers that are operated on by different threads, as you can do with ZooLib), and part of why he's been delaying the release is to improve the overall architecture.

    For more details, including relevant quotes from Judge Thomas Penfield Jackson's Findings of Fact and Final Judgment discussing why Microsoft felt it was more important than anything to suppress cross-platform APIs, such as Netscape plug-ins, Java, Intel Native Signal Processing, Lotus Notes, Apple QuickTime (it runs on Windows too!) and RealNetworks' multimedia technology, please read my early draft of:

    The Cross-Platform Manifesto [goingware.com]

    Thank you for your attention.

    Regards,

  • "Any mouse click requires at least two context switches (server to client, client to server) ..."

    This is NOT true. X buffers requests and responses, so multiple X messages (including user input events) can be handled by the server or the client at a time. Therefore, rather than a context switch per mouse click, you get a context switch per N mouse clicks and other X events.
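
    As a sketch (ordinary Xlib calls, nothing exotic), here is the shape of a typical client event loop. XPending() reports events already sitting in the client-side buffer, so a whole burst of input gets dispatched on a single wakeup:

        #include <X11/Xlib.h>

        void dispatch(XEvent *ev);          // application-defined handler

        void eventLoop(Display *dpy)
        {
            XEvent ev;
            for (;;) {
                XNextEvent(dpy, &ev);       // blocks only when the queue is empty
                dispatch(&ev);
                while (XPending(dpy)) {     // these events are already buffered
                    XNextEvent(dpy, &ev);   // locally - no round trip, no extra
                    dispatch(&ev);          // context switch per event
                }
            }
        }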

    Context switch overhead is a factor, for sure, but it's not as bad as you make it out to be.

    I personally find X performance to be more than adequate for 100% of what I do. I guess if I were doing realtime 3D rendering (including 3D games) I would have something to complain about.

    But of course the advantages of X far outweigh its disadvantages, as everyone ought to know by now ...
  • And another thing - the reason that VNC is slow is because it is basically pumping over all of the bits necessary to refresh any window which changes. X is a MUCH smarter protocol that sends over "meta" information about what to draw and how to draw it, instead of sending over every pixel as VNC does multiple times per second.
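
    A concrete example of the difference: the Xlib call below becomes one small protocol request ("draw a line from here to there with this GC"), and it is even batched in the client's output buffer until a flush. VNC would instead ship every changed pixel of the rendered result:

        #include <X11/Xlib.h>

        void drawDiagonal(Display *dpy, Window win, GC gc)
        {
            // One compact request describing WHAT to draw...
            XDrawLine(dpy, win, gc, 0, 0, 639, 479);
            // ...queued client-side and sent in one small write. The server
            // rasterizes the line; the pixels themselves never cross the wire.
            XFlush(dpy);
        }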

    In my mind, X is a "problem solved". There are improvements to be made to X, for sure, but it is the Right Way to do windowing systems. The only reason it is not used in Windows and OSX is that the companies that make these products have an agenda, and that agenda includes locking software into proprietary APIs rather than using standardized open protocols like X.
  • Doesn't mean it's not CORRECT.

    X is simply a protocol for describing how clients and servers may communicate so that clients can draw window contents onto servers' screens. Period. It is a well-defined problem space, and X, as a solution, is pretty much IT. There have not been fundamental changes to X in 10 years because it is a correct, complete, efficient solution to the problem. Period.

    Yes, there are areas in which X can be improved, such as font support, but this is NO reason to chuck X. You try designing a network-transparent windowing system and see how far you get before all of the problems that X solves with respect to race conditions, efficiency, performance, correctness, etc, bite you in the butt and you give up and go with X.

    Xlib is a problem. It represents the minimal set of C API calls necessary to expose the full functionality of the X protocol to a client program. But it does not provide any kind of higher-level windowing system functionality such as buttons and scrollbars. Thus, many people have implemented these things in many different ways, most of them poor, and the result is that the typical X program looks and runs like crap.
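
    To see just how minimal Xlib is, here is roughly the smallest complete client: a bare window that waits for a click. Note that there is still no button, scrollbar or label in sight - every toolkit has had to reinvent those on its own:

        #include <X11/Xlib.h>

        int main()
        {
            Display *dpy = XOpenDisplay(0);   // connect to the X server
            if (!dpy) return 1;
            int scr = DefaultScreen(dpy);
            Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                             0, 0, 200, 100, 1,
                                             BlackPixel(dpy, scr),
                                             WhitePixel(dpy, scr));
            XSelectInput(dpy, win, ButtonPressMask);
            XMapWindow(dpy, win);             // make the window visible
            XEvent ev;
            do { XNextEvent(dpy, &ev); }      // wait for a mouse click
            while (ev.type != ButtonPress);
            XCloseDisplay(dpy);
            return 0;
        }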

    This is NOT the fault of X. It is the fault of the people who released X without releasing any kind of standardized, effective toolkit that won over a broad base of usage. It is the fault of the people who have and will continue to ruin Unix by refusing to engage in any kind of standardization whatsoever.

    The fragmentation of Unix systems and Unix desktops is a problem, but it IS NOT THE FAULT OF X!

    So stop blaming X already!

    X is state of the art because the "art" (network transparent windowing) has not changed, and will not change, in the same way that algebra is state of the art because the fundamental facts of mathematics do not change.

    BTW, there are resolutions at which jaggies do not occur, despite your assertion to the contrary: at any resolution where a single pixel is too small to be seen by the naked eye, there are no visible jaggies and no need for antialiasing. I predict that 95% of all computers will meet this criterion within 10 years.

    In the meantime, YES, we need support for antialiasing in X. There are standardized mechanisms for extending X to support things like this. The problem, once again, is that there is no common toolkit API that all X programs use, such that simply adding an antialiasing extension to the X server would magically fix X programs.

    Once again, not X's fault - it's the fault of toolkits and the general X developer community which failed to produce a single viable toolkit (and GTK makes me barf, by the way).

"Show me a good loser, and I'll show you a loser." -- Vince Lombardi, football coach

Working...