X On OSX Now Free 177
ffejbean noted that OSXTalk (hey, they run Slashcode!) has an article up reporting that XFree86 and MacOS X are getting more and more friendly every day. Now you don't have to purchase a lame commercial binary; you can just install it yourself. If only those iCubes didn't cost twice what they should, this might just be a great platform yet. (BTW, I'm getting confused here, should I post this as Apple, X, or BSD? Ah well, close enough :)
A big win for portability (Score:2)
It'd be nice to have that on Windows too, actually.
--
Good news (Score:1)
Icon for DAX? :) (Score:1)
I suggest you stick to a photo from the first 6 series of DS9.
Apple or BSDaemon, good question... (Score:3)
Jobs' business strategy (Score:3)
---
Do you still attract chicks with OSX (Score:2)
Anyhow, once these Macs are running OSX, are the chicks still gonna be interested in them? I'm trying to keep my bachelor pad up to date and keep the chick tractor beam still functioning.
Thanks,
What about the rest of it? (Score:4)
Are they going to free the O and S? Quit discriminating against the other letters of the alphabet!
Re:A big win for portability (Score:1)
Eric
Re:Jobs' business strategy (Score:2)
Isn't this true of almost all non-x86 platforms? I can't name 3 platforms that the same piece of "widely accepted software" runs on, simply because most "widely accepted software" runs on x86 Windows. You can see this even more clearly when you start to hear about all of the applications that Microsoft didn't port to Windows NT for the Alpha processor.
But... (Score:1)
X wrapper as a 'plugin' for OSX? (Score:4)
I wonder how well the basic widgets map..
Your Working Boy,
Re:Apple or BSDaemon, good question... (Score:2)
Support.
When something goes out, the last thing you want to do when getting it fixed is waste time proving to the vendor that it's not the unsupported OS you added that's causing the problem.
-
Re:A big win for portability (Score:2)
I think that is essentially what has already been done. Xlib requires an X server, though. That was the original poster's point: X would run much faster on MacOS X if, rather than working the way X normally does and going through a server, it just called MacOS APIs directly. This would lose the advantage of X's network features, but a lot of people don't use those anyway.
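For anyone who hasn't written against Xlib, here's a minimal sketch (plain Xlib compiled as C++, nothing Mac-specific assumed) of what "going through a server" means in practice -- every call below becomes a protocol request to a running X server, even when that server is on the same machine:

    // Minimal Xlib client: the very first step is opening a connection to a
    // display server; there is no direct-to-screen path in X.
    #include <X11/Xlib.h>
    #include <cstdio>

    int main()
    {
        Display* dpy = XOpenDisplay(NULL);   // connects to $DISPLAY, e.g. ":0"
        if (!dpy) {
            std::fprintf(stderr, "no X server to talk to\n");
            return 1;
        }
        Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                         10, 10, 200, 100, 1,
                                         BlackPixel(dpy, 0), WhitePixel(dpy, 0));
        XMapWindow(dpy, win);   // these are requests queued for the server,
        XFlush(dpy);            // not direct drawing calls
        XCloseDisplay(dpy);
        return 0;
    }

Cutting the server out entirely, as suggested above, would mean reimplementing everything behind those calls on top of the native MacOS drawing APIs.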
X on Macs (Score:1)
X on MS Windows (Score:1)
It's made by Hummingbird Software [hummingbird.com] and is expensive, but my school (Arizona State University) provides a license for student use.
Re:conspiracy! (Score:2)
Jobs also opened up the source (after a fashion) to the OS, allowing developers to port it to the x86 architecture (and perhaps others in the future). That isn't trying "to get macs more accepted". That's a sound strategy for deploying a new OS.
Re:X on Macs (Score:2)
Not quite. MacOS X isn't Linux; it's a BSD system. Still, I think you're right, the increased portability of software to the Mac hardware will be nice.
Price-Performance of "iCubes" and other Macs (Score:5)
What should they cost?
I hear this complaint quite a bit. It seems that one of the enshrined bits of common wisdom (or myth?) when it comes to PC buying is that Mac HW costs more for the performance you get.
Mac fans counter that it's the same or better, and give the following reasons:
1) even though PPC clock speeds are slower, programs run faster because the processor can do more per clock cycle. I've been told to expect twice the performance from a G3 as from a similarly clocked PIII.
2) productivity gains from less futzing about with hardware, due to standardization...
Comments? Maybe even hard numbers? Balanced reasoning (ha!)?
Re:conspiracy! (Score:1)
Yup. Sounds like a conspiracy to me. Yup. Yup.
Re:A big win for portability (Score:2)
And probably someone else has already written this somewhere below 0...
Re:X wrapper as a 'plugin' for OSX? (Score:3)
Re:Price-Performance of "iCubes" and other Macs (Score:1)
arstechnica.com explains it best:
"Apple also made it a point to reiterate their old bit about how, under certain circumstances with certain applications that certain Apple users often use, the G4 can sometimes be twice as fast as a PIII at the same MHz rating.
Give it up, Apple, because not only are such statements both
a) purposefully misleading and
b) so vague as to not really mean much of anything, but not even the die-hards are buying it anymore.
Yeah, the MHz rating isn't all that matters, but it does really matter. Anyone who tells you otherwise is probably trying to sell you a Mac."
Re:Good news (Score:1)
Re:Price-Performance of "iCubes" and other Macs (Score:5)
This is why the Cube is the first CPU in Apple's history that hasn't met initial sales forecasts (this gleaned from their recent earnings conference call).
I love the Cube, but I'm waiting until they come out with the low-cost version in Jan or Feb. They can't cut anything but the price, and I'm pretty sure the 'low price' cube will be the current cube, and they'll introduce something faster for the premium price. This is what they've been doing with PowerBooks and iMacs for the last three years.
Kevin Fox
Re:Price-Performance of "iCubes" and other Macs (Score:1)
Re:Jobs' business strategy (Score:5)
Jobs has had the insight to see the potential of the computer as a sort of renewed status symbol, and the new Mac cubes are chasing that with a vengeance. Everyone has a computer, but not everyone has a really swanky one. Ask yourself: does the average mid-20's to mid-30's hipster who just surfs and emails and buys designer clothes and furniture want a beige lump or a sleek, cool-looking Mac? I'd be willing to bet that you'll start seeing cK and Ralph Lauren computers with sleek looks within the next couple of years -- Apple is just at the start of this trend.
In this age where more and more of what you use a computer for is on the net anyhow, lack of software apps matters less and less. Style, on the other hand (and forgive me), never goes out of style.
----
FUD! (Score:1)
-The Reverend (I am not a Nazi nor a Troll)
Re:conspiracy! (Score:1)
Ha ha ha.
It's not a conspiracy, it's good business.
X? Come along... (Score:1)
So, is it really so exciting that Apple now supports X? I suppose in one sense it's great to have all those legacy applications, but it would be nice to see the state of the art pushed forward somewhat - I would certainly have expected this of Apple, one of the more forward-thinking old-school computer companies.
Then again, I must admit there are no serious contenders to X currently visible on the radar. I've looked at WHY [apsoft.com] (fairly promising but early days) and Berlin [berlin-consortium.org] (extremely interesting, but a little too bogged down in providing support for glitzy rotations and the like too early on in the development), but I don't see X being replaced in the foreseeable future, sadly.
Perhaps this is because X Window was developed by academic experts who were basically employed to do this, whereas its putative replacements are being developed by enthusiastic amateurs (and this isn't intended as a knock to those developers, but merely a reflection of the truth - I am an enthusiastic amateur myself!).
Specifically, one thing X certainly needs is FAST and CONSISTENT (across the whole desktop) sub-pixel anti-aliasing. Acorn users have had this since 1990, so why has it taken so long for the rest of the world to catch up?
XServers on Mac in past (Score:2)
There have been X servers on the Mac in the past, but I don't know if they are still around.
The X server is pretty portable. You have to supply about 30-50 kernel graphics routines in the driver.
Re:conspiracy! (Score:1)
Re:Price-Performance of "iCubes" and other Macs (Score:4)
That is true. (It's also true of sparc, alpha, MIPS and pretty much any non-x86 architecture.)
I've been told to expect twice the performance from a G3 as from a similarly clocked PIII.
That is unadorned horseshit. But don't take my word for it: go to www.spec.org [spec.org] and check out the numbers yourself. 20-30% is more the average gain, and that's cold comfort when you can buy 1.2GHz Athlon chips for less than $500 a pop.
The "twice as fast as Wintel" claim is based on a small number of Adobe Photoshop operation benchmarks; usually filters that have been painstakingly optimized for the G4's "Altivec" vector processing unit. This isn't necessarily "cheating", since Photoshop is still one of the primary reasons to buy a mac, but if you are not a graphics professional, you are simply never ever going to see that kind of speed benefit using a Mac.
In "regular use" applications, the scenario at the moment is even worse than you might guess based on the SPEC numbers: MacOS 9 is such a turgid, inefficient piece of crap, and the device drivers for 3rd-party Mac hardware so shoddily implemented, that MacOS applications will often run significantly slower than their Windows counterparts on similar hardware: just ask anybody [insidemacgames.com] if they're getting the same kind of Quake III framerates out of a G4/500 with a Radeon card as they would from a PIII/800 with the same graphics card.
You just don't buy Macs for world-beating performance (Photoshop being the exception). You buy them for nice industrial design, an OS that for all of its architectural ugliness still offers a more compelling user experience than Windows, and more often than not just to maintain an existing investment in MacOS software.
Did you watch the Mac Expo Europe? (Score:1)
This may not be true for *every* application, or every Photoshop job for that matter, but it does support the claim that some applications, under some uses, perform at better than twice the speed of a similarly clocked PIII.
Re:Apple or BSDaemon, good question... (Score:1)
And will everyone quit ragging on the one mouse button already? It's such a tired argument.
Re:Jobs' business strategy (Score:3)
Conspiracy theorists might decide that WinNT for Alpha was dropped because Compaq wanted to force the people buying Alphas to use Tru64. However, this really isn't the case, because apparently the market for WinNT Alpha machines was less than 10% of the market for Tru64 Alpha machines. WinNT on Alpha simply isn't commercially viable.
However the Mac is an entirely different beast. The biggest difference is simply the target market - while Alpha machines are sold as high-end servers, Mac machines are sold as desktop boxes. That means that there is a market for applications on MacOS that there simply isn't for WinNT on Alpha.
Since there is a definite market for desktop applications on MacOS that WinNT for Alpha lacked, it stands to reason that if people aren't porting applications to it there is some other reason... Unfortunately for Apple, this isn't entirely true - there is a much larger market for Wintel applications than any other type. That's why there are almost always Win32/x86 versions even when there aren't versions for other platforms.
A rather good example is the fact that Java for Win32(x86) is usually more advanced than Java for UNIX. (Keep in mind that Java for Linux is almost identical to Java for Solaris and Java for (Free)BSD. The differences are mostly in the JIT, along with thread support and other things that the OSes disagree on.) Sun may own Solaris, but Java developers are mostly interested in the applications running under Windows. As a result, Java for Windows gets the most attention and is usually released sooner than Java for any other platform - including Solaris.
It's really a market thing. If Apple can create a market for MacOS apps, then companies will port. The market only has to be commercially viable - the cost of supporting the market cannot be prohibitive. From the few Mac developers I've talked with, this hasn't always been the case.
In the case of WinNT for Alpha, though, it was too costly a market to support. There simply wasn't any demand. Outside the world of open source, the market determines what succeeds and what fails - not technology. Not stability. And, again, it's the market that will cause OSX to either succeed or fail.
Re:conspiracy! (Score:1)
OSX Beta report card (Score:1)
Since my machine is not uber enough to run it [ridiculopathy.com], I need to take your word for it.
I just hope Apple doesn't sue me [ridiculopathy.com] for asking.
Re:Price-Performance of "iCubes" and other Macs (Score:1)
Yes, I think that regular G3s and G4s so far have provided a very competitive bang for the buck, but the G4 Cube isn't aiming for that niche... it's aimed squarely at those who think that something should look cool and don't mind spending twice as much as they should for it.
The G4 Cube really should be positioned between the iMacs and G4s in Apple's lineup, not between the cheapest G4 and the dual-processor G4s.
Re: What's fuzting? (Score:1)
Read the article! (Score:5)
Click here to visit the VNC homepage [att.com]
So, running X apps on MacOS X using this hack requires you to run the X app on top of the VNC server, and then use the VNC viewer/client app to interact with the X app.
Sounds like it'll be pretty sluggish, to me. Still, it is kinda clever, and it does let you run an X app if you really need it now.
"Ask Slashdot" NLE and video capture comparisons? (Score:2)
I've pretty much given up on the PC (be it Windows, BeOS, or Linux) for video capture and editing, and will probably get a PowerBook for that application simply to avoid the headaches PC video capture/editing always entails (unless Linux video editing has matured by then, which is a very distinct possibility). But I would be curious whether anyone has pointers to hard benchmarks or in-depth, relatively unbiased comparisons of the two platforms vis-a-vis video and NLE.
Re:Did you watch the Mac Expo Europe? (Score:2)
should I post this as Apple, X, or BSD? (Score:4)
About a month ago there was a MacOSX article with the BSD daemon - the discussion was so Mac-centric that it didn't really seem to relate to the common underlying BSD base.
Here's the distinction I would make:
If it's about Darwin, Apple's CLI open source edition of the OS that compiles on various platforms, it should be categorized as BSD.
If it is specifically about MacOSX, which is tied to proprietary Apple hardware, or an application running within that environment - then it is an Apple article.
As for the 'X' option, while I can see it as a contender for this article... I guess because this news is particular to one "minority" platform and less relevant to the larger X user community, I would still go with the Apple categorization.
Re:Price-Performance of "iCubes" and other Macs (Score:5)
Re:What about the rest of it? (Score:2)
Re:Apple or BSDaemon, good question... (Score:2)
Re:X? Come along... (Score:1)
I'm tired of hearing about this stuff. X is excellent technology, and the reason it's been around since 1984 and is still working wonderfully (well, X11 has been around since '87) is that it's EXTREMELY WELL DESIGNED. Despite people's griping about the X toolkit and protocol, the whole system is remarkably well designed, and built to last. It lasts not because of the abundance of "legacy" applications (at one time there was a migration from X10 to X11--and that was very quick--and think of the migration from Win 3.1 to Win95, etc), but because it's excellent, excellent technology.
A word about antialiasing. Most uses of X are in businesses, governments, and science. When you're controlling satellites, nuclear reactors, nuclear warheads, global databases, etc, does antialiasing do you ANY good whatsoever?
And as people have pointed out numerous times, today's screen resolutions are so huge that antialiasing is outdated--it was designed to compensate for huge jaggies that no longer exist.
Slashdot, meet Mac users (Score:1)
Subject: [CTA] :CueCat Reader for Mac
From: "Dan Fisher"
Date: Tue, 24 Oct 2000 08:13:39 +1100
Hey Guys,
There's a really neat little product being pushed (FOR FREE) by RadioShack called the :CueCat Reader. There is a Mac version of the software in development, and they're gauging the response on their website with a form for Mac users to sign up if they're interested. Let's show them we want this! Go read up on it at the site, it's basically a barcode scanner that launches websites of the products or books, CDs, DVDs, whatever you scan into it.
http://www.crq.com/mac.html [crq.com]
Dan
Personally, I agree with Joel On Software [editthispage.com] -- I can't imagine why I would want one of these, regardless of whose software it runs.
Re:Apple or BSDaemon, good question... (Score:2)
I wasn't talking about Enterprise 6500s, I was
talking about Ultra 2s.
-
Re:A big win for portability (Score:5)
1) Given the architecture of X Windows, you must have an X server running on your machine. Even local X Windows apps run by connecting to a local X server. Just compiling an Xlib will not give you much in the way of speed gains -- loopback calls under OS X's network architecture are very cheap (heck, this is true for most OS's).
The relatively expensive part of the X Windows architecture (in terms of speed and resources) is the context switch that is necessary in this whole set-up: server process picks up mouse click and sends to client, client processes mouse click and sends display commands back to server, server processes display commands and puts them back onto the screen. Any mouse click requires at least two context switches (server to client, client to server), which are expensive under some OS's (MS WinNT/2K, Classic Mac OS). However, under the Mac OS X kernel, and indeed on most Unices, context switches are fast and cheap, so this is not much of a performance hit. (This is why Apache on Unix runs multiple processes, but is a single multi-threaded process on WinNT/2K.)
Pushing the server-side functions into the client-side Xlib would only really save the cost of the loopback overhead plus the context switches, both of which are cheap in Mac OS X (and other Unices). Only on a Windows- or Classic MacOS-based system does it make sense to try and cut out the context switches.
2) The X Windows server does in fact translate X calls into native Mac OS X calls. The implementation referenced above does it through the VNC application, which is extremely slow due to the massive number of layers involved -- one or two context switches are not so bad, but it looks to me that they're going through four or five. If you look at the Mac OS X graphics architecture, there is a lightweight graphics server underneath it all called Core Graphics Services, which is responsible for all drawing on the screen. Aqua, QuickTime, OpenGL, and QuickDraw all hook into this layer to do their actual drawing to the screen. It is possible (I don't claim it's easy, but it shouldn't be that hard) to write an X server that hooks in directly above the Core Graphics Services layer to translate X Windows calls to native, low-level CGS calls. This would make X Windows just as fast as the native libraries (aside from bottlenecks that might be inherent to X Windows) and allow for interleaved X and native windows on the screen. This is the Right Way To Do It (tm).
Disclaimer: I am an Apple employee, but these views are my own and not based on anything that is Apple Confidential. I work with WebObjects as part of Apple iServices, which is a bit away from the core Mac OS X dev teams.
--Paul
Re:Price-Performance of "iCubes" and other Macs (Score:1)
Re:X wrapper as a 'plugin' for OSX? (Score:1)
http://www.tenon.com/products/xtools/pre-release_beta/ [tenon.com]
Simultaneous execution of X- and Aqua-based applications is provided. Support for both copy and paste functions is provided between X and native applications.
Tenon plans to add full support for OpenGL, as well as a more convenient way to close X applications and start remote X clients. Although no widget libraries have been ported, these too are planned.
Earth to Cmdr Taco.... (Score:4)
Will there be a retraction this time or will it slide?
Re:Price-Performance of "iCubes" and other Macs (Score:1)
As for performance: I played around with an iMac at the local science museum a few weeks ago, and the general response speed reminded me of my old 486 that occasionally gets used as an X server.
Re:Did you watch the Mac Expo Europe? (Score:1)
However, it doesn't really say much since Apple naturally picked the tasks the G4 does best compared to the PIII.
--
Niklas Nordebo | nino at sonox.com | +46-708-405095
Re:X? Come along... (Score:1)
I don't doubt that X is well designed. Unix is well designed, and that's been around for years too. However, if something has been around essentially unchanged for over ten years then it CANNOT be said that it represents the state of the art in its field. This is certainly true of X.
There's nothing particularly wrong with it, except that we can now do better!
As for your comments regarding anti-aliasing, I must strongly disagree here. Businesses, governments and science ALL benefit from anti-aliasing, simply because (in the simplest terms) it makes the writing on the computer screen easier to read! In fact these are three areas where operators would expect to read a lot of material from the screen (e.g. papers, reports, figures, etc.) and therefore where anti-aliasing would be of most benefit.
This isn't intended to be a facetious question, but have you ever actually used a system with proper sub-pixel anti-aliasing throughout? Come back to anything else and your eyes will complain...
As for the resolution issue, jaggies will ALWAYS occur no matter what the resolution, as at the end of the day you cannot perfectly approximate a curve by a series of rectangular dots on a CRT. Moreover, anti-aliasing makes small fonts MUCH easier to read, even at high resolutions, and prevents the "greeking" that so besets X's standard fonts (on my machine at least).
Also, remember that some of us are forced to use resolutions such as 800x600, for either personal or financial reasons.
(Note: just thought I'd say this early before some AC tries to be clever... anti-aliasing
Antialiasing necessary? (Score:1)
What was the screen res of an Acorn in 1990? As your screen res increases, antialiasing becomes less and less important. But you're right, all else equal, it would be nice to have antialiasing as an option.
After using and loving sharp, flicker-free, non-antialiased text on a 15" 1400x1050 notebook, I would say that the world doesn't need antialiasing as much as low dot pitch and high refresh rates (or just discrete pixel displays). Yes, that's hardware technology, beyond the scope of an OS, but all OSs will benefit from those developments. And your retinas will love you. No more antialiased, blurry flicker. Visual joy.
Re:Price-Performance of "iCubes" and other Macs (Score:1)
2) Neither machine has been in need of much futzing. I've added a SCSI card to both machines without any problems, either OS or hardware. One thing that drives me nuts about the Mac lineup is the *lack* of standardization. The G3/G4 line notwithstanding, Macs have been all over the map in terms of hardware. PCs vary, but the behavior and components are largely swappable between a Dell and a Compaq and whatever. I think the PC lends itself to user-driven futzing because it can so easily be futzed with. Many Macs have lacked any expandability or changeability, so you didn't futz because you couldn't.
Re:Insightful (Score:1)
Re:If only... (Score:3)
This must be done with computers. It's particularly important if (like me) you use a computer in your recording studio - this is one area where Macs are particularly popular, so evidently Apple are listening to the market here.
(NB this is why I still use a silent Atari 1040STE for my sequencing...)
However, I haven't used a Cube yet, so for all I know it might melt within half an hour.
Re:Price-Performance of "iCubes" and other Macs (Score:2)
Re:Jobs' business strategy (Score:2)
I think Jobs' real strategy is to turn the Mac into the equivalent of a designer label in clothing.
Here (in the UK) I see most ads (90%?) that need to include a computer as part of the set are using iMacs and iBooks. That's a heck of a lot of 'free' advertising. And TV programmes too. Take 'Watchdog' for example (they pursue consumer complaints by hounding down the companies involved -- you don't want your company featured on this programme) -- they read email from viewers while in the studio on air, on a prominently placed iMac.
Another example, I was checking out the latest models at one of the few department stores that carry macs, when a middle aged woman dragged her husband over to show him the machine. She said "Now that (hands gesturing with delight) is what I'd like to see at home..."
Something is working in their campaign -- although I'd hate computers to go the way of the car....
Altivec optimizations in core graphics. (Score:3)
The "twice as fast as Wintel" claim is based on a small number of Adobe Photoshop operation benchmarks; usually filters that have been painstakingly optimized for the G4's "Altivec" vector processing unit. This isn't necessarily "cheating", since Photoshop is still one of the primary reasons to buy a mac, but if you are not a graphics professional, you are simply never ever going to see that kind of speed benefit using a Mac.
How do you know the native core graphics drivers aren't also written in assembly language for Altivec? Painstaking optimization of graphics is part of what made the first QuickDraw so fast and Macs so attractive in the first place.
Re:Price-Performance of "iCubes" and other Macs (Score:5)
That is unadorned horseshit. But don't take my word for it: go to www.spec.org and check out the numbers yourself. 20-30% is more the average gain, and that's cold comfort when you can buy 1.2GHz Athlon chips for less than $500 a pop.
It was horseshit when Apple tried to say that the G3 was twice as fast as a Pentium II back in 1998. I repeat: it was bunk. But, it has turned out that a 500 MHz G4 (not G3) is remarkably fast for its clock speed. Here's [cpuscorecard.com] a PC-oriented benchmark site, quoted on Slashdot a few weeks back, showing that a 500 MHz G4 is only 15% slower than a 1 GHz Athlon. That's impressive, especially when you look at the huge difference in power consumption.
Re:Price-Performance of "iCubes" and other Macs (Score:3)
This has changed somewhat in the last 3-4 years. Macs are now shipped with IDE hard drives and CD drives. Any external devices are connected with cross-platform USB cables rather than SCSI and ADB, which were unique to Apple in the consumer market. In addition, the PC industry has switched from 72-pin SIMMs to 168-pin DIMMs, so memory for Macs is now the same parts as for PCs. As a result of these changes it's now a lot easier to buy parts for a Mac, and a lot cheaper too.
Apple has chosen to adopt more industry-standard parts as an alternative to using only the best parts. This has led to cheaper Macs at the expense of some of their really great quality that used to be worth paying extra for.
________________
They're - They are
Their - Belonging to them
Re:A big win for portability (Score:2)
However, it would be possible to write a port of GTK, Motif, or Qt (or any other toolkit, I'd imagine) that would translate the high-level controls into their MacOS equivalents. This is very similar to what TrollTech does for its Qt toolkit under Windows. Unfortunately, TrollTech has no plans to make a port that runs under anything but Windows and X11. Maybe if you've got some free time on your hands... :-)
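To make the shape of such a port concrete, here's a hedged sketch -- none of these names are real GTK, Qt, or Carbon API; it just shows the usual pattern of one portable widget interface with a per-platform implementation chosen at build time:

    // Illustrative only: the portable interface the application codes against.
    #include <cstdio>
    #include <string>

    class PortableButton {
    public:
        virtual ~PortableButton() {}
        virtual void setLabel(const std::string& text) = 0;
        virtual void show() = 0;
    };

    // A real port would wrap a native control here (a Carbon control on MacOS,
    // an HWND on Windows, a GTK/Motif/Qt widget on X11). This stub just marks
    // where the translation layer lives.
    class StubButton : public PortableButton {
    public:
        void setLabel(const std::string& text) { label_ = text; }
        void show() { std::printf("[button: %s]\n", label_.c_str()); }
    private:
        std::string label_;
    };

    // Per-platform factory: the build selects exactly one implementation,
    // so application code never changes between platforms.
    PortableButton* CreateButton() { return new StubButton; }

    int main()
    {
        PortableButton* ok = CreateButton();
        ok->setLabel("OK");
        ok->show();
        delete ok;
        return 0;
    }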
Re:Did you watch the Mac Expo Europe? (Score:2)
Re:Jobs' business strategy (Score:2)
Re:X wrapper as a 'plugin' for OSX? (Score:2)
Re:Jobs' business strategy - Focus on Customer (Score:2)
(disclaimer--I'm a long-time Macintosh owner & supporter (since '88)...as well as a Linux user (circa 1997))
What has Steve Jobs done to satisfy what the Apple customer wants? I have a few examples.
I'm not saying I hate Apple, but I (and many other Mac users) have become very frustrated with the way Apple treats their die-hard group of users. There are tons more examples I (and other Mac users) can give you. I really hope that this changes in the future.
Re:Or try xmanager (Score:2)
I like the idea of running an X Server as a service in Windows, though. Even though it's essentially the same thing, it would definitely feel more natural.
--
Re:Altivec optimizations in core graphics. (Score:2)
Because (a) QuickDraw has been around a lot longer than the G4 units, (b) Apple still needs to support non-G4 macs, (c) graphics-intensive programs other than Photoshop tend not to exhibit the same gains (Quake III comes to mind instantly), and (d) I follow Apple's OS announcements pretty closely and haven't seen them announce any such thing for MacOS 9. Nor have ATI or 3dfx made any mention of re-tuning their device drivers for Altivec that I'm aware of. (Proof to the contrary will be happily accepted, on both counts.)
Let me re-emphasize that first point: QuickDraw is really, really old. Re-writing the thing to be optimized for altivec when they've known for over a year now that they're going to be throwing it away in favor of Quartz would be a nonsensical waste of effort on Apple's part.
The good news is that they are apparently optimizing quite a bit of the "Quartz" (aka "DisplayPDF") rendering engine for Altivec in OS X -- most reviews of the public beta have commented on how much snappier screen operations are on the G4 than even on higher-clock G3 machines.
Re:Earth to Cmdr Taco.... (Score:5)
How about reading the article before complaining about it?
Re:Did you watch the Mac Expo Europe? (Score:2)
Re:Earth to Cmdr Taco.... (Score:3)
While one can get X working under MacOS X, it doesn't work under Quartz.
Running X on MacOS X isn't terribly impressive - it's been doable on MacOS X's progenitor Darwin for quite some time.
What folks want is an X that can run under MacOS X's Quartz/Aqua environment. Then one could run Classic, Carbon, Cocoa, Java & X applications simultaneously. To date this is only doable using the "lame commercial binary".
Re:Jobs' business strategy - Focus on Customer (Score:2)
One Downside: Mutual Exclusivity (Score:4)
That means that you can't, at least not with XFree86-for-Darwin, run MacOS apps concurrently with X-based applications.
It certainly represents a cool hack, but, in that it requires choosing not to use "MacOS," this rather diminishes the merits of having MacOS-X. If you haven't the GUI, how much better can "text mode" MacOS-X be than Linux or *BSD?
Re: What's fuzting? (Score:2)
There already is (Score:3)
Re:Jobs' business strategy (Score:2)
> not even slightly compatible with any widely
> accepted software, besides of course what
> Microsoft decides to throw at it.
Most "widely used" software on Windows are ports of Mac and Unix software. Word, Excel, Photoshop, Illustrator, GoLive, Director, Pro Tools, QuickTime
Internet Explorer on the Mac is probably the only widely used Windows-first app, but the Mac version uses a completely different codebase, UI, and has a standards-compliant rendering engine. Only the name is the same.
Re:Jobs' business strategy (Score:4)
> market for MacOS apps, then companies will port.
I'm a Mac user. Can somebody tell me: what are the apps from Windows that I am missing out on? Might as well leave out anything that already runs on Unix, since I'm running Mac OS X.
Some of the major apps I'm running now: Pro Tools, Cubase, Peak, Photoshop, Dreamweaver, Fireworks, Director, Flash, FreeHand, Word (under duress), Internet Explorer, Acrobat, BBEdit, QuickTime Pro, VideoShop, ViaVoice, iMovie, RealPlayer, Shockwave, and about 50 smaller apps that do things like play MP3's or batch convert media files or whatever. I mean, what am I missing here?
So far the only occasion I've had to actually have to use Virtual PC for something productive (as opposed to just for geek fun) was Ray Kurzweil's Poet Assistant, which is a small app that runs full speed in Virtual PC. I was sort of interested in Sonic Foundry's ACID for a while, but Bitheadz now has a Mac app called Phrazer that's the same thing.
This is a legitimate question. I don't feel like I'm missing anything, but I'd like to know what these apps are. Never mind about games
Re:Jobs' business strategy (Score:2)
Almost as quickly as systems become obsolete.
Re:Jobs' business strategy - Focus on Customer (Score:2)
If you need more than 3, you need more than 6. People who do, use an expansion chassis. The kind of user you're talking about buys their Mac right from Digidesign or Avid, as a small part of an overall $10,000+ system. The expansion chassis is a minor expense. This "issue" with Apple is as much a real issue as the one-button mouse, which 70% of Mac users like better than two.
I've always had at least one slot free on my Blue & White Power Mac. I mean, when you have FireWire and USB, PCI is much less important.
Radeon, was Re:Price-Performance of "iCubes"... (Score:2)
How about us poor Win2k users? The Radeon drivers are about 50% slower than the 98 ones. ATI isn't getting my money until they get their act together
ostiguy
Of course, the reason nVidia isn't an option is because of their closed source binary drivers for Xfree86. I am just another MCSE who runs OpenBSD for a router/firewall.
Comment normalization (Score:2)
Where did you get this so-called "fact?" 128MB works great. And remember, this is still pre-release unoptimized code. The goal is to get it down to 64MB by 1.0.
Windows 95 box that I use for web surfing. It has 32 Megs of memory. It runs on a Cyrix 5x86-120 (sort of a 486-DX-120). It flies.
Windows 95 is hardly comparable to OSX. A lot of the aforementioned requirements for OSX go into supporting the Classic environment which is an entirely different OS. If you think Win95 is better overall than OSX, then feel free to continue using it.
But this is a moot point since OSX on Intel isn't good business sense right now.
- Scott
------
Scott Stevenson
Re:If only... (Score:2)
Whatever they are, it's not too surprising that they showed up. Nobody has tried to make a computer like this with these materials before. It's hard to blame engineering for shortcomings in an experimental product. It's easy to blame PR for the way they dealt with it.
If only the G4 cube had a fan so it wouldn't overheat like a toaster.
Ummmm, I actually haven't heard of any of the cubes overheating. PowerPCs take much less power and generate much less heat than most other chips.
If only it had capacity for a true RAID cage
I think this may be outside of Apple's target market.
If only the G4 cube had an SVGA connector
Agreed.
If only it had room for more than 2 DIMMs.
Hmmm, size sacrifices have to be made somewhere to shrink the case. Otherwise, why not get a tower?
- Scott
------
Scott Stevenson
Well, I don't know about that (Score:2)
That's a bit misleading, as it acts very much like a Unix flavor in a number of ways. Certainly much more than NT ever will. The fact that it is based on Darwin says a lot, and Darwin is nothing if it is not Unix. I'm not qualified to compare OSX to Irix, though. There are a number of articles on the web as to how good a Unix MOSX is. Such a topic is beyond the scope of a single post.
- Scott
------
Scott Stevenson
Rootless X display already available for Mac OS X (Score:2)
Tenon has a beta of Xtools [tenon.com] available for download.
According to their press release it is a:
Of course, it costs real money, but it seems to be a smoother solution than VNC.
-Andrew
Some gaps (Score:5)
You do realize how ridiculous that sounds, considering that 1) this has been said since 1986 and 2) Mac market share has been increasing recently?
It can't compete with Durons/Athlons/Thunderbirds, PIII, PIV, SMP
Actually, that's the funny part. Despite popular Slashdot belief, G4s do quite an admirable job of competing with processors at twice their clock speed. You'll note that IBM, Sun, etc. do not freak out that they sell high-end machines with low-megahertz processors in them. The real problem is that Motorola has not shipped faster chips in about a year. As for SMP, the G4 was designed with SMP in mind, as was OSX. You can get a dual G4 for $2500.
The reason Mac lost is that they didn't realize the power of the commodity marketplace.
Or maybe consider the option that Apple isn't really about that type of product. Do we really need another generic box maker?
AMD is now doing SMP
PowerPC has been doing SMP since the 604 days. This isn't that impressive.
Motorola will be out of the PPC business within 2 years
Hopefully.
Why do you think they are stuck at 500 Mhz?
Because their fabrication process sucks. IBM had to come in and save the day.
Right. No interest in going further.
They just unveiled the G4 Plus at 1GHz. No idea when this will end up in an Apple machine, though.
Motorola pulls out and Mac will croak.
Strong words for somebody who has never run a multi-billion dollar computer company before (I'm assuming
- Scott
------
Scott Stevenson
About the Cube (Score:3)
As a big supporter of Linux, I suppose I'm not AS into aesthetics... but when I see a little power icon on the top of the Cube, but no button, and I go to press it and suddenly it's glowing (I can't tell from where!) -- well, I'm very impressed. I'm finding these "Stupid Apple design details" really make it wonderful. People say, we don't need Tangerine iBooks! I want something dull!
Well you know what? My work can be pretty dull, and when I go and look at this beautiful machine that's ALSO very fast and very ahead of its time (OSX), I'm pretty damn inspired to produce some beautiful code.
It's not for everyone, but it's definitely for me, and a few million other people.
I'm running Mac OS X Public Beta on the Cube and I really love it. Yes there are some quirks, but it's just so amazing and a lot can happen between now and January.
Re:If only... (Score:2)
Then what's this "15-pin mini D-Sub VGA connector" on the spec sheet [apple.com]?
Free clue: It has both the weird ADC connector and a standard SVGA connector.
Re:Did you watch the Mac Expo Europe? (Score:2)
cross-platform
used in multiple applications (not just Photoshop)
very processor intensive
(most of all) the user sits and stares at a progress bar while they happen.
If you're encoding or compiling, you might start it and go away, maybe even overnight, but if you're applying Photoshop filters, you are going to watch a minute go by here, and two minutes there. The shootout that they do at the Expos is quite convincing in that case.
PC Magazine also recreated those benchmarks. They put a dual 1GHz PC (note the "dual") against a dual 500MHz Mac, and the Mac won 6 of their 8 Photoshop filter tests, and tied a 7th. They also did a bunch of comparisons of 3D rendering and stuff, and the PC only barely beat the Mac in many cases. This is a dual 1GHz PC, and the Mac held its own very, very well, even in non-Altivec stuff. These Macs are fast machines.
Tenon (Score:2)
Re:Price-Performance of "iCubes" and other Macs (Score:2)
Re:If only... (Score:2)
If only the G4 cubes didn't have hairline cracks in their clear plastic casings (I don't care whether they're cracks or just flashings from molding, those blemishes don't belong!).
Yawn. They apparently fell a little short of crafting an absolutely seamless and perfect Cube out of transparent plastic. "If I had known that my Cube wasn't going to be geometrically perfect, I would have got a Compaq". Right.
If only the G4 cube had a fan so it wouldn't overheat like a toaster.
What are you talking about? Whose Cube overheated?
If only it had capacity for a true RAID cage (not yet another über-clocked serial interface stretched to the limits).
If you want a Cube with big storage, either 1) hook up two FireWire drives and turn them into a RAID with SoftRAID, or 2) hook up an actual hardware FireWire RAID, or 3) pay $200 extra for your Cube and get Gigabit Ethernet and hook up to an Ethernet storage device. Are you really stretching your FireWire bus to its limits? What are you doing that causes that? Lucky for you, the 800 Mbps version is almost ready.
If only the G4 cube had an SVGA connector so you could connect a decent 21-inch monitor to it instead of Apple's ultra-lucid offerings.
The Cube has an SVGA connector, as well as a standard DVI+ connector, which Apple calls an "Apple Display Connector", same as they call 1394 "FireWire".
* If only it had room for more than 2 DIMMs.
You're breaking my heart. The Cube is 8 inches square and can take a GB of RAM. Boo-hoo.
If only you could put in a less expensive IDE CD-recorder.
You can get a USB model for $200 or less, or a FireWire one for a bit more, and use either on multiple machines. Que makes some really nice looking ones.
If only it didn't look like a giant spider once you finished connecting all the external devices.
That's a weird complaint to make about a machine whose standard cabling goes:
all in one long line, and has antennas built in for wireless networking. The last time I checked, there weren't any other manufacturers doing anything at all about how cables look on their machines. Apple is the only company I've ever seen that actually shows their products in their advertising with cables attached and showing, such as the iMac ad that shows how to set up an iMac. HP is not going to show you what kind of cables are involved in their machines until after you buy.
Re:That's NOT a 15-pin mini D-Sub. (Score:2)
The ADC is actually one of the standard DVI connectors. It is the same as the plain DVI connector, except that it has a few more pins on the end that carry VGA, USB and power. It is better than the plain DVI plug (since all digital flat panel displays also need USB and power as well as DVI), but it is not cheaper. Hence, it is not used by manufacturers of commodity PC's.
I know it's natural to go "oh no, not another connector", but the reason that adapters to split ADC into plain DVI / USB / VGA / power are cheap is that all those things are already within the ADC. The signals are the same. You're not converting anything, just re-cabling. It's only an issue of three cables between two devices, or one cable between two devices, not a competing technology.
Think about it, though: the ADC carries everything any display could possibly need in one cable, whether the display is analog (VGA) or digital (DVI). That's why all of Apple's displays (2 digital LCDs and one analog CRT) use the ADC connector. If every computer had one of these connectors, hooking up a display to a computer would be as simple as plugging in the cable from the display into the computer and that's it, without having to know or worry about whether it's an analog or digital display. We ought to applaud Apple for going down this road.
Why switch from the VGA connector to the plain DVI connector (as an industry) and not get a little more than just the plain analog-to-digital switch? The ADC also carries VGA, so you have a way to adapt an existing VGA monitor design to an ADC connector easily. The signal is still there. The ADC is a good "universal" display connector, whereas the rest of the industry is going with having both VGA and plain DVI on everything from now until probably forever, along with instructions not to hook up a display to both at once, and the requirement that you have a vague knowledge of which display is analog and which is digital.
Billions of people hooking up billions of displays over the coming years will also have to run a separate power and USB cable. Why would you voluntarily have three cables going between two devices? So you can knock $20 off the price of the computer. Not worth it. If you use Compaq or Dell or whatever brand of machine, you ought to be on them to get with this program. Think about it next time you're hooking up three cables between two devices.
Freeing the Developer from OS Vendor Shackles (Score:2)
The result is that it is very difficult for the developer to bring the product out on a competing platform, and it discourages users from moving to a different OS when they feel the vendor isn't serving their needs (because they can't get the solutions to their problems).
If the developer doesn't want to deal with the OS vendor anymore, he's really got a problem - either suffer under the vendor's thumb, or make a great deal of personal sacrifice to move to a different operating system.
I was sick of Apple, so I wrote "I'm worried about my future. That's why I'm a Be developer." [scruznet.com]
And in fact I shipped (and still do support) one of the first commercial applications for the BeOS, Spellswell from Working Software [working.com].
Nothing Be ever did made any sense, and while there are individuals at the company that I regard highly, on the whole I felt the company to be uniquely unresponsive and incompetent.
And just when they were showing some promise of shipping enough BeOS [be.com] installations that I had some hope of making more than the measly couple hundred bucks I'd earned in royalties in the three years I'd been working on Spellswell, they announced a "change in focus" and said they weren't going to support the desktop anymore, except for the extent necessary to use it as a development platform for their new Strategy Du Jour, Internet Appliances.
After I posted on BeDevTalk that Some of Us Work for a Living [escribe.com], the moderator told me he was fed up with a developer who was trying to discuss business issues of concern to Be's third-party developers on Be's third-party developer mailing list. That was my last message to bedevtalk - he unsubscribed me.
I've been working on a really challenging C++ application for a few months, and after reading C++ Answers with Bjarne Stroustrup [slashdot.org] I got excited about really digging into the basics of programming - but from the perspective of a developer with 13 years of work experience and a lot of shipping products. [goingware.com]
I bought a few books, mostly on C++ and also hit some websites and newsgroups, and I became a much better programmer as a result. And I really felt that I did better to spend my time on core architectural and language issues rather than dealing with OS-specific nits or tool issues. And so I wrote Study Fundamentals, Not APIs, Tools or OSes [goingware.com].
So this brings me back to being used by operating system vendors to serve their material needs at my expense and the cost of much personal pain. If you become a better programmer by learning the basics better, you can fluidly go from OS to OS without much of a learning curve.
But there's the problem that you have to use some API to code your application to, and while Java claims to be "platform-independent" it is really a proprietary platform in itself [att.com] - just try making use of platform-specific code in a Java application. Yes, you can do it with the Java Native Interface, but it is difficult, and it is an assault on the Java developer's senses to write a DLL in C or C++ to load into the runtime.
So what you really need is a cross-platform application framework that you can write in with a language such as C++, that comes preconfigured with easy-to-use preprocessor symbols so you can drop into OS-specific code at your whim, and will compile from a single sourcebase to native machine code for multiple operating systems.
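A minimal sketch of that kind of preprocessor escape hatch -- the macros below are the compilers' own predefined symbols, not ZooLib's actual configuration symbols (which I don't want to misquote):

    // Sketch only: dropping into OS-specific code behind one portable function.
    #include <cstdio>

    const char* PlatformName()
    {
    #if defined(_WIN32)
        return "Windows";          // defined by Win32 compilers
    #elif defined(__APPLE__)
        return "Mac OS";           // defined by Apple's compilers
    #elif defined(__BEOS__)
        return "BeOS";             // commonly defined on BeOS builds
    #else
        return "some flavor of Unix";
    #endif
    }

    int main()
    {
        std::printf("built for: %s\n", PlatformName());
        return 0;
    }

The same source compiles everywhere; only the branch taken changes per target.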
Funny that, since December '99 I've been writing a multithreaded special-purpose graphics editor that is also an HTTP client with just such a cross-platform application framework. I can develop on Mac or Windows as the need suits me and switch back and forth at a moment's notice (especially now that I've got file sharing between my machines). My client only asked for Mac and Windows versions, but I could port to BeOS or Linux in a few days. The framework is called ZooLib [zoolib.org].
It was written by my friend Andrew Green of The Electric Magic Company [em.net], originally to insulate himself from Apple's API nonsense. (Do you remember when all progress on developer tools at Apple and Symantec stopped while they went off into the sunset to develop Bedrock, itself a cross-platform application framework and an immense investment of time and money - and then abandoned it? If it hadn't been for then-tiny Metrowerks [metrowerks.com] Apple would have gone out of business after shipping the first PowerPC Macs, because there would have been no native PPC compilers.)
He felt that if he could code to his own layer and Apple changed their API, he'd just have to reimplement the OS-specific layer and he'd be working again. But then a little more work and he'd be cross-platform...
If you click that link today you'll just get a placeholder page. But just wait a few days...
(For practical reasons the source itself, mailing lists and so on will be provided at http://zoolib.sourceforge.net/ [sourceforge.net] once it's released.)
While ZooLib is to be newly released to the public it is not new code. It has been in use in commercial products for about five years - and in development in my own since last December. Part of why Andy gave me the code and I've been working with it is to give him meaningful architectural feedback and detailed bug reports so he can prepare it for public release.
I've been urging Andy to release the source as-is for a couple of years but his standards are incredibly high for a programmer. Andy's code doesn't just work, it is correct.
Andy spares no effort or time to fix the smallest problems (this is especially important in multithreaded code - think about reference counted smart pointers that are operated on by different threads, as you can do with Zoolib), and part of why he's been delaying the release is to improve the overall architecture.
For more details, including relevant quotes from Judge Thomas Penfield Jackson's Findings of Fact and Final Judgment discussing why Microsoft felt it was more important than anything to suppress cross-platform APIs, such as Netscape plug-ins, Java, Intel Native Signal Processing, Lotus Notes, Apple QuickTime (runs on Windows too!) and RealNetworks' multimedia technology, please read my early draft of:
The Cross-Platform Manifesto [goingware.com]
Thank you for your attention.
Regards,
Re:A big win for portability (Score:2)
This is NOT true. X buffers requests and responses so multiple X messages (including user input messages) may be handled by the server or client at a time. Therefore, rather than a context switch per mouse click, you have a context switch per N mouse clicks and other X events.
Context switch overhead is a factor, for sure, but it's not as bad as you make it out to be.
I personally find X performance to be more than adequate for 100% of what I do. I guess if I were doing realtime 3d rendering (including 3d games) I would have something to complain about.
But of course the advantages of X far outweigh its disadvantages, as everyone ought to know by now
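A rough sketch of the batching described above, using standard Xlib calls -- XPending() reports events already sitting in the client-side queue, so a burst of input gets drained without a server round trip per event:

    // Drain every event that is already buffered locally, then return.
    #include <X11/Xlib.h>

    void drain_events(Display* dpy)
    {
        XEvent ev;
        while (XPending(dpy) > 0) {   // events already read into the local queue
            XNextEvent(dpy, &ev);     // no extra server round trip for these
            // ... dispatch on ev.type (ButtonPress, Expose, etc.) ...
        }
    }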
Re:A big win for portability (Score:2)
In my mind, X is a "problem solved". There are improvements to be made to X, for sure, but it is the Right Way to do windowing systems. The only reason it is not used in Windows and OSX is that the companies that make these products have an agenda, and that agenda includes locking software into proprietary APIs rather than using standardized open protocols like X.
Just because it's old ... (Score:2)
X is simply a protocol for describing how clients and servers may communicate so that clients can draw window contents onto servers' screens. Period. It is a well-defined problem space, and X, as a solution, is pretty much IT. There have not been fundamental changes to X in 10 years because it is a correct, complete, efficient solution to the problem. Period.
Yes, there are areas in which X can be improved, such as font support, but this is NO reason to chuck X. You try designing a network-transparent windowing system and see how far you get before all of the problems that X solves with respect to race conditions, efficiency, performance, correctness, etc, bite you in the butt and you give up and go with X.
Xlib is a problem. It represents the minimal set of C API calls necessary to expose the full functionality of the X protocol to a client program. But it does not provide any kind of higher-level windowing system functionality such as buttons and scrollbars. Thus, many people have implemented these things in many different ways, most of them poor, and the result is that the typical X program looks and runs like crap.
This is NOT the fault of X. It is the fault of the people who released X without releasing any kind of standardized, effective toolkit that won over a broad base of usage. It is the fault of the people who have and will continue to ruin Unix by refusing to engage in any kind of standardization whatsoever.
The fragmentation of Unix systems and Unix desktops is a problem, but it IS NOT THE FAULT OF X!
So stop blaming X already!
X is state of the art because the "art" (network transparent windowing) has not changed, and will not change, in the same way that algebra is state of the art because the fundamental facts of mathematics do not change.
BTW, there are resolutions for which jaggies do not occur, despite your assertion to the contrary - any resolution where the pixel is too small to be seen by the naked eye will not have jaggies and will not require antialiasing. I predict that 95% of all computers will meet this criterion within 10 years.
In the meantime, YES, we need support for antialiasing in X. There are standardized mechanisms for extending X to support things like this. The problem once again is that there is no common toolkit API that all X programs are using such that simply adding an antialiasing extension to the X server will magically fix X programs.
Once again, not X's fault - it's the fault of toolkits and the general X developer community which failed to produce a single viable toolkit (and GTK makes me barf, by the way).