Want to read Slashdot from your mobile device? Point it at m.slashdot.org and keep reading!

 



Forgot your password?
typodupeerror
×
Apple Businesses

Raskin On 'Raskin On OS X' 197

Kelly McNeill writes: "A recent editorial appearing on osOpinion.com (and linked to here on Slashdot last Thursday) dealt with comments made by Mac creator Jef Raskin and his opinion of Apple's upcoming next generation operating system OS X. The somewhat controversial editorial generated a ton of mixed response both here as well as on the publishing site. As it seems, Mr. Raskin's thoughts on OS X (and Unix) were very misunderstood and he has since stepped up to the plate to clear the air and responded to the technology community at large."
This discussion has been archived. No new comments can be posted.

Raskin on 'Raskin on OS X'

Comments Filter:
  • by Anonymous Coward
    Down With GUIs!" [wired.com] by Jeff Raskin.

    Okay, so there's no whore like a karma whore. Sue me. Hey! Yeah, you. Looking for a l33t time, big boy?
  • by Anonymous Coward
    Raskin says (in his book "The Human Interface", if I recall correctly) that in hindsight, the one thing he would have changed about the Mac was the one-button mouse. So yes, he can admit mistakes once in a while. Mr. Raskin, for all his other shortcomings, is one of the few remaining innovators in the area of human-machine interfaces. It's good to see that he's still getting under people's skins... the community needs more crazy prophets to shake things up.
  • by Anonymous Coward
    Well, wait until you write a bunch of articles and then have no-one read what you wrote, but everyone flame you to hell for what someone else wrote about your work.

    He didn't say "what I meant was" because he wasn't correcting his own work - he was correcting the idiots who jumped to conclusions based on a rather biased magazine report of his work.

    His last remark was spot on, anyway. The people who flamed him the last time here on Slashdot were to a man dull and unimaginative. Hardly anyone stopped to think this guy might be worth listening to. Too busy correcting little errors in the reportage.

  • by Anonymous Coward
    Not anymore. The xfree86 guru's have had XWindows running inside of Quartz/Aqua for about a month now. So there's no need to shutdown - just start Gimp like you would any other tool. Then have a shootout with Photoshop running in Classic, and Gimp running on BSD....

    http://mrcla.com/XonX/

    Tom
  • If a train station is where the train stops....

    --

  • Note that Raskin was the one who insisted the Mac have a one button mouse...
  • I think you're exactly right. I don't have one myself, but the Palm is nice because you don't launch apps, they're always running. And since the storage isn't just like a filesystem, you can do global searches into the contents of the file. What systems are you talking about, that used smalltalk? I've heard that mentioned before..
  • I think he has a little trouble expressing his points. The title article is very much "I didn't mean that, you don't get it." However, let me interperet for him.

    I think his point isn't that the OS is bad, but the notion that the gui interface of your OS is bad. IE: Windows with no applications.

    Of course, most of us geeks don't make the mistake that everyone else does by assuming "OS Interface == OS". Mac and Windows users do this a lot though (most articles about Windows and Mac OS updates focus on the GUI improvements/changes).

    I highly advise the readers to read his wired article though. You've read this far of slashdot comments, the wired article is much more enlightening.
  • Perfect. Nailed him.

    I think his only excuse may be that the Wired article is fairly old (from 1993).

    -- Brian
  • A lot of that wouldn't be terribly difficult...

    Already I can double click on a url in many Mac apps and it will automatically be sent to my preferred web browser, or ftp client or whatever.

    But there would have to be a HUGE amount of thought put into it before it would be ready for prime time, to ensure that it would really work well, and that the perception of the interface that the users had was a good/useful one.
  • I agree. Nuances mean a lot - the difference Aaron made on MacOS 7.5 was amazing. The difference Norton Desktop made on Win 3.x was what that bum needed. And those are more major adjustments - Something as small as "I wish this window's border could be smaller" might really make someone happy. Sure, there are lots of people who will take whatever the OS dumps at them, but then there are others, like me, who strive to make those tiny adjustments that make using the GUI truly an extension of my intent.

    I say intent, though, because at times I can be caught up in what-ifs with the interface that waste a whole lot of time. :)

  • Ok.

    If you think that the standard out is the best you can get you are mistaken. Why do you think that the highest grade dvd players don't have them? Its because the wires are too thin and can have cross talk. This is eliminated by using component video in and out. Which sgi's have and apples do not. Ever see a monitor out on a television? Its unbearable.

    Ati's Allin wonder doesn't have component, svideo and composite video in and out, and can't handle uncompressed video. Uncompressed video has a very very high data rate. Not to mention variable hardware compression ratios which are all real-time. As you go up higher and higher, you can compress multiple streams at the same time in real-time as well.

    For more information go to http://www.sgi.com/peripherals/workstations_periph erals.html

    Don't forget the nobs and dials :)

    Also, check out the awesome Onyx2
    up to 256GB of ram 256 mb of texture memory and it can do 6.4_Billion_pixels a second. 16 pipelines and not to mention fullscreen hdtv antialiasing in real-time.

    Ok, enough, i think you get the idea.
  • I like some apple hardware. I'm really happy about OS X and hope apple will release it for x86 or ia32 or whatever its called now and windows will disappear.

    But i still hate the cube. The boss where i used to work was a mac zealot and thought the cube was so cool. It overheated and locked up at least 2 times a day and it started to crack on the top. He would always pretend like he couldn't see the crack.

  • The shell is not garbage. Eat your words.!

    It is an unassuming power tool chest.

    You can do anything in the shell and do it well.
    And there's nothing stopping you from running the shells in X.

    Unix would be poop without the shell.
    Ok, not poop, but just a stable windows.
    Ok, a hell of a lot better than that
    but it wouldn't be as cool and powerful.

    The more you use unix the more you appreciate the shell.
    And stop calling it command line, thats a windows thing.

  • You are right about chemistry. My chemistry teacher has shown me some cool software for molecule visualization on his apple, which doesn't surprise me considering apple's penetration into the educational market...but his processor was crying and it started to chop when he got to more complex models.

    Unix workstations are still the best for high-end anything.

    I'd be interested in seeing some of this software you speak of, it sounds really cool. I like software that simulates physics, especially visually and in real-time. And even more excited that its GPL so i don't have to pay :)

  • I wasn't aware there was a higher end GL card for apples. Could you provide a link, i would think that it would go well with the g4.

    But it remains a niche thing, but i'm still curious :)
  • Hey, you should talk to the company i work for. My boss would probably hook you up with a great deal on bandwidth for your servers in exchange for some advertising. www.vdi.net. I'm one of the sysadmin's there and we've got 2 ds3's and more on the way.

    If i were you i'd buy an O2 or an Octane from ebay and get the appropriate peripheral for editing and software. As for software, that will probably be expensive. For the peripherals that SGI has go to http://www.sgi.com/peripherals/workstations_periph erals.html and click on audio/video for either O2 or Octane.

    My point is, really, that if you buy an O2, your real limiting factor is how much you can spend on software, and once you get some revenue, you just buy more powerful software. If you're looking for software for it other than what it comes with, look at http://www.sgi.com/solutions/broadband/partners.ht ml . Hope i have been of some help.

    I could be all wrong about this because i don't have much experience in the area, but just remember that the best tool is the one that with which you do your job best.


  • Oh, i thought that OS 10 came with gcc.
    Sorry, my fault. Jumping the gun again and assuming.

  • GTK, QT.

    gdb gcc glibc

    emacs vi/vim.

    python perl

    I guess there aren't any tools for linux, oh wait, there are! And they're portable too. They are all getting polished every day and making their way to where apples are. But, Apples aren't portable. Of course they are cleaner and easier to write for, but run on 1 platform. Almost all the general unix tools are extremely polished and portable except for the gui libraries and they are getting there. I'm not bashing apple's just saying don't know unix tools, after all they've been around the longest and are the most polished (except for gtk and qt which are new on the block, relatively).

  • Mozilla can do it all. And what happened to making portable web pages? Develop for whats out there, not whats going to be out there.

  • She says she likes the look and feel. if you install Xfree86, that would defeat the purpose.

    So, to answer your question, you'll have to wait until there's an OS 10 (x, or whatever) version, which there will be eventually. But don't forget the drawback, youhave to buy apple hardware. $$$

  • No, the tv out is made so you can hook it up to a studio monitor. Apple makes no claim that the ntsc output is suitable for broadcast.

    True, it is a lot better than most pc video output, but cannot touch one of sgi's broadcast machines for output quality.

    Ever see a weather report? Thats directly out of an sgi O2. Those aren't expensive at all. The software is of course.

    eya.
  • Yea, i deal with this...but most of the time the client doesn't know what their IP address is let alone their login. But after i explain to them that they can connect to their server as if it were right beside them and their monitor was hooked up to it, they are so amazed. They ask if this is a special package we installed. Its hard for them to believe coming from a windows or mac background that you can have remote access to a comptuer without buying new software.

    Although everybody thinks pine and mutt are arcane. But you can see them from anywhere.
    Webmail sucks, so they usually go back to pine.

    _ramblings_

  • Dude, until just now, the graphics were old.
    Rage128 is hardly a powerhouse anymore.
    Nor was it last year.

    But the g4 is nice...and Altivec is the supercomputer part. But only if the software was written to take advantage of it.

    yea.
  • Who said anything about games?

    Ever hear of anybody doing any 3d modelling on a mac? And zealots don't count.

    Seriously, they have been behind the curve for quite some time. The cards aren't as fast at 2d as say a matrox g200 which is 3 years old. But i will admit that all their cards will drive any of their monitors quite well.

    As far as cpu goes, they've always been ahead of the pack with IBM/motorola. Intel sucks and AMD sucks less when compared to the G4. I'd still pick up a thunderbird with ddr-ram over an apple because of the cost.

    Intel tried to make something like altivec time and time again and can't seem to do anything with it. First it was mmx, then ssimd or somethign and i heard something about mmx2. Anyways, anything really make use of these? Maybe but do people ever notice a difference, no.

  • Ok, so its available, which is no surprise, its very portable, runs on every unix and apple even uses gcc standard on os X

    BUT! they do not include it.

    Probably because it wasn't the default shell on 4.4bsd or something.

    Or maybe they're all used to it or somethign
    I dunno.

  • Yea looks like a great site.

    unfortunately, you are using a version of netscape that is not supported.

    this site looks best in Internet Explorer.
    if you must use netscape, please use netscape 6.
  • I wasn't going either way, i was saying that if it did, then it would make a decent workstation.

    Thats more of a server thing than a workstation thing i guess.

    When it all comes down to it, it is what you use it for that makes it a workstation. If i have a linux box that i play games on only, then it isn't a workstation. It has potential, but it isn't.

  • Cool. Glad to hear it. :)

    Didn't mean to say that that it wasn't and that OS X was a toy.

    Just getting used to people refer to their mac's as workstations is gonna sound weird, thats all.


  • I didn't say it was a bad thing, i'm saying they won't be able to handle compiling and installing the standard unix way.

    Twist my words some more!

  • Yea, you can replace it. But how many people who use macs have ever compiled somethign before.

    They're used to getting a nice graphical installer where they click install and it does everyhthing then they click finish/done .

    My point is the default shell is not so hot.

    True people like sun use tcsh by default (last i heard) but people who use suns generally have at least some experience...or at least a sysadmin who will do it for them.

  • Dude, you can get octane's for the price of a mac on ebay, not to mention O2's. Then again, the software might be a little expensive. I dind't know you were a startup. But i wish you the best so you can get the workstation you really deserve :).

    Also, i didn't realize this was an internet thing... If this were a television thing a mac would certainly not do, but if its internet, its fine :)

    Anyways, check out ebay sometime, they've got people selling O2's and Octane's and the ocassional onyx or onyx2. But those start to get expensive even on ebay and do stuff that you don't need to do like edit multiple streams of uncompressed analog video non-linearly at the same time and anti-aliasing while rendering effects, etc.
  • My mother can handle window maker better than apple or windows. I just eliminated all the choices she doesn't need to make. To get on the internet she clicks the red dot and when it turns green she double clicks on the "globe" icon.
    Simple enough.

    Check out gnome or kde.
    Thats the idea, to make unix easier to use and make you more productive.

    www.gnome.org
    www.kde.org

  • Ever used their command line?
    Not the best.
  • Actually, Apple has done a fair amount of work in the "beginner" settings. The Finder in 8.5 (maybe 8.0) had a "Simple Finder" setting that removed a lot of the menu options. They've done a "panel" based interface with icons rendered as single-click buttons. (the button mode isn't in OS X PB). The MultipleUsers option has a simplified mode with only specified apps available. Combined with a home folder, it covers most of what, oh... my mom would use.
  • One promise that OS X holds is for simple apps that rpovide a GUI for various command-line functionality. RealBasic is capable fo driving the comamnd line, sort of VisualBasic for Unix :-). And the Cocoa development tools are relatively simple. On top of that, I expect to see a Hypercard update based on the Quicktime file format for easily creating media-rich applications

    The interest and tools just aren't there on Linux (or we'd have them already).

    Apple created a programming platform for Everyman with Hypercard, there were a LOT of amateur apps done in HC, personal finance, PIM, games... etc. Bring that to Unix will be a Good Thing. But don't expect it before MWSF2003.

  • OK, fine, click the close button to make the window go away.

    Now have Granny install new software, and see what happens, and you'll get the point if the whole argument.
  • You say: "sure, the tech underneath is crap (an i loathe that as much as any other geek)"

    I know you're trying to achieve some reconciliation with the Linux crowd with this comment, but you're totally selling Apple short. Let's be more precise: Apple's process scheduler and memory management is crap. There's actually a whole lot of stuff in MacOS below the GUI (the "tech underneath") that works really, really well.

    * Resources - sure, Unix has these too, but Apple's way is a lot better

    * File types - Double click any document, and it opens in the right program. Apple did it first. And not in a brain-dead way, either. You can control what the "right program" is; you can drag-drop it onto an alternative; and if the "right program" isn't around, you get a list of alternatives.

    * Applications - Drag a folder to install. There is no path to hack -- the OS knows where everything is.

    * The Macintosh toolbox - Over 5,000 data types and subroutines that any app can rely on. glibc, eat your heart out.

    * Hierarchical File System - B-trees are cool.

    * Open Transport

    * Sprockets

    * WorldScript - Don't forget, Apple co-invented Unicode.

    * NuBus, ADB, PCI, USB, SCSI, AirPort, Firewire, AppleTalk - NOBODY else does plug'n'play as well as Apple.

    * Multiple monitors - It just plain works. The way you expect it to.

    Sure, Apple is the subject of ridicule among PC masochists who believe anyone who doesn't suffer as much as they do must be inferior. And sure, Apple is derided by those too cheap to pay a little more for quality, style, and innovation.

    And sure, it's taken Apple a lot longer than it should have to add preemptive multitasking and protected memory to their kernel (although they have offered THREE different flavors of Unix prior to OS X Server). And sure, Apple's management is doing its best to kill the company, and only the loyalty of its customers keeps it alive.

    Apart from these minor nit-picks, Apple really does have terrific products with some awesome technology inside them.
  • Your comments about a friendlier command line are spot-on. Interesting how your syntax reads a lot like AppleScript. :-) The gap between "ought to be" and "is" often may be narrower than we think.

  • one of your suggested commands (almost) works ... Mac OS X PB contains an "open" command, which can be used to launch an GUI application from the command-line.

    open /path/to/app/or/document

    will open the application specified, or the document specified with the default app for its type.

    Now if apple could create an "AppleScript Shell" (assh?) then we could type ...

    $ tell application movieplayer to open porno movie

    and so on ...
  • "...That's where I've gone beyond the old form of command lines, and adopted the idea that issuing commands via a few alphabetic keystrokes is one of the best ways of operating a computer...."

    Commands as pre-set keywords? I wonder where he got that idea.....;)

    Sinclair ZX Spectrum forever!

    (for newer generation : http://www.sinclairspectrum.com/ )
  • by Anonymous Coward
    I agree with you. I'd also like to add a thought about pipes and pipelines.

    Pipes and pipelines are exceedingly important in some application areas. UNIX STREAMS is a great way of generalizing them, but STREAMS isn't the issue; the UI is.

    To connect a text document [a producer node] to a grep to a sort to a uniq is done like this in UNIX:

    cat mydoc.txt | grep myline | sort | uniq

    Quite easy for most of us to understand. It's possible to insert T-junctions using UNIX tee but that's already difficult and a more general graph would be almost impossible to express on the command line.

    But this important concept is more suited to a GUI than it is to a command-line. Why can't we have processors like grep and sort have graphical representations - e.g. grep has a place to type the regexp; sort has a "reverse order" checkbox, etc. But *also* they all have input and output connectors.

    Then, any user could wire (with the mouse) the text document into the grep box, then grep into sort, the sort into uniq, then uniq into maybe a text window to receive the output, so he can view it.

    Graphs with T-junctions and so on would be easy to construct, and future software could even deal with cases involving feedback loops.

    If we can take UNIX pipes and make a GUI around it, I think you could make a most powerful and flexible system - and people could learn the basics [rewiring stuff] in under a day.

  • If you want to start browsing, just typing http://slashdot.org into a commandline-like interface should be enough to bring up Netscape. If you want to send an email, typing louisjr@nospam.com should bring up the right email program

    Call up the Run box on Windows (hold down the Windows-symbol key and press R).

    Now type http://slashdot.org and it will start your default web browser and take you there. Type mailto:louisjr@nospam.com and it will start composing a new email message.

    The functionality you want is already there, just not in a conventional shell.

  • I realize you're probably just trolling, but OS9 is one of the major operating systems out there, as far as marketshare is concerned.

    And yes, I realize that Apple's marketshare is pretty low, but it's one of the most - if not the most - popular OSes outside of Redmond. Certainly in the consumer market.


    - Jeff A. Campbell
  • Hate to break it to you, but most of Raskin's work on the Mac was a direct response to the work already completed at Xerox PARC, including but not limited to bitmapped displays, icon based program launching, wysiwig programs, etc.

    Not trying to say that the Mac was totally unoriginal, but the Mac didn't spring forth fully formed as a creation of Apple alone.

    Raskin didn't claim the work was a creation of Apple, he claimed it was a creation of Raskin. Here's some of the evidence he would give for that contention: (from This page [mackido.com], emphasis added)

    My thesis in Computer Science, published in 1967, argued that computers should be all-graphic, that we should eliminate character generators and create characters graphically and in various fonts, that what you see on the screen should be what you get, and that the human interface was more important than mere considerations of algorithmic efficiency and compactness. This was heretical in 1967, half a decade before PARC started.

    Many of the basic principles of the Mac were firmly entrenched in my psyche. By the way, the name of my thesis was the "Quick-Draw Graphics System", which became the name of (and part of the inspiration for) Atkinson's graphics package for the Mac.

  • But can't you replace it? I mean, put in another shell or whatever.

    W
    -------------------
  • Well describe to me the REAL difference between word processing and writing email. Both involve the input of text into the computer. Rather than print said text you may want to send it to someone in the form of an email. If you separate everything into the most basic of components you end up with a fairly large number of combonations of "applications". The same GrabKeyboardInputAndBufferIt object could be used to put text into an email, insert text into a picture you're editing or send a command to the lower levels of the OS.
  • still believe if you can't ssh to it, it is not a proper workstation.

    ssh is included in Mac OS X. You can turn it on with the click of a checkbox.

    -jon

  • Good for you my Wiccan friend. Xfree86 has been ported to OSX, so don't worry about not being able to use GIMP. Also, check out Colorit!. It takes PS plugins and is under $50. I jump from GIMP to Painter, to Colorit!, playing to the strengths of each. The Nix geeks won't be able to resist OSX. They love new toys. Tenon has some tools for it too.
  • while there are some nice elements of System/MacOS, I find that using an OS designed for 3rd Graders/Grandmothers a bit annoying.

    Funny you should mention this.

    They guy I work with has a son who did a Geographic Information Systems project on wetlands while he was in the thrid grade. He used a Mac with the MapGrafix software from ComGrafix [comgrafix.com]. It kicked the shit out of a lot of college GIS projects.

  • Jef Raskin?
    Wow, I can't believe it. I just sent you an email last night. (A long rambling thing about wanting to write a new environment using your ideas.) Then I notice that one of my "old" posts on slashdot went from 7 replies to 8 replies and it was Jef Raskin himself.

    (I noticed your posts aren't getting moderated up. On slashdot, if you don't reply to an article right away it won't get noticed, because the moderators' attention goes when they are done with the article and move on to the next. Slashdot doesn't make a good bulletin board, but it does keep the topic moving.)

    Hmm, I guess I have nothing interesting to say other than I hope you have a chance to read the email I sent you earlier. :)

    Don Rivers
  • Yes he has. Why can't I import VI key movement keys into Netscape and KEdit and everything? Key movement is fairly global. All global operations should be settable to standards. I shouldn't have to learn a new interface to different applications. Such shouldn't be hard-coded into ea. app. His plan actually demotes applications and extends the OS. He just wants to see it done in such a way that users don't have to notice the OS while they use their system.
  • Well, I can't find the original Raskin article (if there even was one). Basically he says we got it all wrong...but then he doesn't clarify what he actually meant! So what *does* he mean by the OS being unnecessarily in the way?
  • In what way is your standard VGA/RGB output not of sufficiently high quality for broadcast? It seems to me that anything that looks ok on a monitor should be way overspec for conversion to NTSC and display on consumer grade TVs. (NB: I'm assuming you use some special purpose hardware for VGA->NTSC conversion, and not just using the TV-out option)

    Could you expand on the ways that SGIs are better to say an ATI-All In Wonder (just to pick a card at random, feel free to choose another).

  • The hollywood probly does smoothing; apparently studios (there was a slashcomment about this a few months back, but since we can't search archived discussions, it's effectively gone) do edge enhancement because 1) most TVs smooth the image, so the two cancel out 2) it makes people think the DVDs are sharper (a-la early CD players being unbearably "bright" in their sound to make make people notice that crystal clear "digital" sound). 1) implies that a good card optimising for monitor display would apply smoothing.

    As for the idct, that is just a bunch of cosine waves that need to be multiply-accumulated -- exactly the thing a DSP is good at, so it makes perfect sense to put that as special purpose hardware. However, I do have to add a bit of salt to your claim that all others do it in software, as MAC loops are bread-and-butter for all DSPs, and most video cards have DSPs (I'm guessing about that one).

    Maybe ATI are the only ones to have it completely in hardware, as opposed to firmware+DSP, but why would they do that?
  • The possibilities would be fantastic:

    > go to my document directory
    Shell> OK
    > edit my shopping list
    Shell> I don't know how to "edit"
    > open my shopping list
    Shell> I can't see a "shopping list" here
    > open the document shopping_list.doc
    Shell> You can't do that to a "shopping_list.doc"
    > load the damn text editor and let me edit shopping_list.doc!!
    Shell> OK (you don't need to swear!)
    > go up a level
    Shell> OK
    > go into pictures
    Shell> You are lost in a maze of twisty little directories. You can go up or down.
    > go into pr0n
    Shell> You are lost in a maze of twisty little directories. You can go up or down. There is a jpg here!
    > xyzzy
    Shell> that's the magic word! starting bash...

  • I think this speaks for itself: "Jef Raskin can be reached for response to this column by e-mailing him at: JefRaskin@aol.com."
  • Well, I just took Raskin's advice and read his article in WIRED, but to be honest, I'm still having trouble envisioning his proposal in practice. Perhaps he could mock up a demo of this new interface? Even a picture or diagram of some sort would help...
  • The Windows NT/2000 command line has tab-completion, just like *nix. This would do exactly what you wanted. Its not enabled by default, but once you install TweakUI and set it, you're all good to go.
  • ssh is included in Mac OS X. You can turn it on with the click of a checkbox.

    Turning on ssh with a...checkbox? There's something about that that's so unnatural
    --
  • This thought is called flexibility. And I can't underline this term even more. One of the key things why I use UNIX to it's full extend, and learned to love it, is flexibility. Small applications like sed, awk, find, grep, ls, cp and the others only contribute to this. Good editors like vi or emacs even extends this idea.


    Back in the old days of Apple had a technology that tried to address this called OpenDoc. Essentially it was a document centric technology, that pushed the application subordinate to the document. The result was that you'd have a document that was created by a bunch of small applications providing limited functionality on their own, but when combined gave the user real flexibility in what they wanted in a document. Basically, it was poised as a more flexible alternative to OLE, where it didn't require a larger application like word. IIRC, it was based off SOM. It was like a gui to java beans.

    Obvously, it never took off. It suffered from poor marketing and management, as many projects at Apple did at the time. Also it was really buggy. It's memory management really sucked. It was supposed to be cross platform too, but that got nixed when MS realized it competed with their technology ( and their whole development model).

  • Looking past the obvious semantics of "GNU's Not Unix," it is indeed flattering that Raskin should compliment the OS itself.

    I'm sure Microsoft would lust after the opportunity to do exactly what Apple did: build a well-designed Interface atop a core like OpenBSD. If it weren't for the fact that MS sold their Unix to another company, and are thus contractually obligated not to produce another Unix, their present push toward NT technology would have taken exactly this turn. (And the more I explore NT, the more I see thinly veiled impersonations of Unix.)

    My favorite book for getting outside the old paradigms of "files" and "applications" is Computers as Theatre by Dr. Brenda Laurel. "Actors" on a "Stage" makes for a much richer metaphor to the causal whole.

  • Will this spin doctor try to spin this spin on himself in about three weeks?

    Well since he was the actual genius who invented the Mac (not Jobs), I can cut him some slack as far as feeling he has some right to speak up. How big would anyone's ego be if they had invented the Mac. Heck, Gates, merely copied it for the most part, and look at him! [joke]

    Quoting from the article:

    For those who don't believe I invented the Mac, read the original documents. (Stanford University has put many of them online.) Lots of great people had a hand in it, and some made essential contributions, but at Apple, the dream, vision, and many fundamental design concepts, were first learned from me.
    This following part is what I think is the core of the issue:
    Many people missed, and Burg did not make clear, that I was talking about the *interface* to UNIX, not to UNIX itself, which I think is a work of genius and a masterpiece of elegant design. I was a UNIX user for years, but I know that we can do even better today.

    (...)

    Why is UNIX so much better than its interface? Because, its designers were experts at the theory and practice of software design and development. They were not experts at cognitive psychology, and the field of interface design barely existed when they were designing it. Now that we know a significantly more, it is time to fix the mistakes that they (understandably) made.

    OS X, on the other hand, was built recently, and thus there is no excuse for its designers to have done so badly in terms of interface design.

    'nuff said

    remember kids, some things require the ability to read books and things without pictures!

  • I have said, in print, that the one-button mouse was a mistake. It's in an appendix to my book. I also explain how the mistake came about, why it was a better idea than state-of-the art practice *at that time*, why the present multi-button mouses are wrongish, and how to do it better.

    I am not trying to sell books, just ideas. But *please*, get your fuel tank full of something more than hot air before you turn on your flame thrower.

    If you don't want to see me have the satisfaction of getting a trifle of royalties from your purchase of the book, then borrow it from a library.

  • I'm going to reply to a few comments here.

    I admit to being a bit of a tease, and somewhat deliberately so, hoping that people will get around their preconceptions.

    For example, people assume that "do the right thing" when you type (from the Wired article) is taken to mean to "open a word processor" Unless you make this assumption, there is no contradiction. Because opening a word processor (or opening any application) is never the right thing, for the reasons stated in my book, I am being consistent. Doing the right thing means to accept, display, and preserve what you have typed, in a way appropriate to the context.

    I am writing more articles and maybe another book to give more details. It is certainly too much to put into a posting.

    A freindly voice wrote that "I can't expect everyon to run out and buy" my book. If they want to read it, that's one good way to get it. Libraries are another. That's why I wrote the book, to put out ideas that didn't fit into a shorter format. I could have put it out on the web but a book from a major publisher carries more credibility, and makes it work better as a text (it is already in use in at least 11 universities). That's one good way to get ideas out to a particular audience.

    When I used to talk about my ideas, people said, "why don't you write a book." Now that I've written one, I seem to be hearing "you can't expect people to actually go out and read it".

    Is it really too much to ask that people who want to know about my ideas read my book and articles? If I want to know what John McPhee wrote about geology, or Wittgenstein about philosophy, I just get their books.

    I do like the question "just describe the intended behavior". That's exactly the right question. The answer is in the form of either a detailed spec or a working system. The former I'm not ready to make public, and the latter requires either money or a cadre of helpful programmers.

  • I didn't know that existed!

    I sorta wish Netscape were smarter too. That the default action when I start typing when it has focus is to redirect the text into the location-bar...

    Thanks!

    Geek dating! [bunnyhop.com]
  • I can't speak for his vision. I can only talk about my interpretation of his vision.

    If I get the gist correct, it's something along the lines that if the OS UI isn't helping you do your task, it's hurting you, and is unnecessary.

    At the extreme, it can be taken that the common Window, Icon, Mouse, Pointer interface is not necessarily best suited for a task. Why do we need icons? Why do we need widgets? Menu bars? Windows?

    That doesn't mean they aren't useful, it just means in certain situations, they aren't necessary. Like in a game of Quake, where the interface is mouse and keyboard, screen, and speakers. Or a game of Dance Dance Revolution, where the interface is scrolling arrows, flashing screen, and a large touchpad.

    It's difficult, I think, to make a mockup as you suggest, but here's a good example of using the OS UI as an advantage. Burning CDs under MacOS.

    Why should you open a program to burn CDs? Why should burning a CD be any different than writing to your floppy, to your network drive, to your hard disk? As demonstrated by Steve Jobs, you drag the files you want on your CD to the CD icon or CD folder, and the system burns it for you! I would expect making music CDs and mp3 CDs is only *margainally* more difficult, in which you would modify the filesystem of the CD in the same way you might change the filesystem on a floppy (PC or Mac), or properties of a network drive (private or open, who has write access, etc)

    Drag an mp3 to a music CD, and maybe, hopefully, it gets converted to CDA! Grab a music track from a CD, and with the help of CDDB, it should be compressed to an mp3, a wav, or some other format on the hard disk!

    Another example that is less WIMP in nature. Say you wanted to visit http://www.apple.com

    The GUI the Mac espouses is verb and noun in nature. Say there's a floating CLI; or even a hidden one. If it has focus, just typing

    go www.apple.com

    Should bring up netscape with the Apple website

    Typing

    find google powerbook titanium reviews

    Should bring up in google all the relevant search hits concerning the powerbook+titanium+reviews

    There's no icons, no mouse, no windows, no pointer. Just type away at the 'bare' OS. It's similar in action to you going up to a Mac and speaking into a mic:

    Computer, find, google, powerbook, titanium, reviews, execute.

    And a smart voice powered system should bring up Netscape, at the google site, with the proper search results.

    But that's my interpretation.

    Other interpretations exist, I'm sure.

    Geek dating! [bunnyhop.com]
  • What, isn't *every* person who buys an Apple a consumer?

    Regardless of whether Apple cares about or caters to developers, it's nothing to laugh at in terms of making development easy or easier. Anything that makes developing on an Apple easier is a win for Apple, because it gets them more developers.

    If it means greating their own CC based on GCC with functional improvements, or an IDE, developer libs, or kits... I dunno.

    But development user interface is as valid a concern as graphical user interface, or any other user interface.

    Because computers are getting more powerful and more capable, they can start acting more intelligently.

    Geek dating! [bunnyhop.com]
  • That's some of the problem, right there.

    Jef is almost talking about doing away with icons, close boxes, and button bars. Try to imagine what the UI to a word processor is without the preconceived notions of toolbars, icons, OS widgets, etc. The thought that the OS itself gets in the way of the app!

    When you use your word processor, you use the keyboard to type; there's most of the input. So why not use that? That almost means a return to the command line, if you haven't noticed.

    So you want to write a letter:

    At a cli, just type:
    write a letter to TWX the Linux Zealot

    The OS should launch the preferred editor, with the appropriate format for a letter.

    If one is still using a mouse, just click on where to start typing. Is it the header?

    TWX the Linux Zealot
    10220 Davenport Dr
    Cupertino, Ca 95145

    That's finished. Where next? The body?

    Dear TWX,

    It is with deepest regret that I inform you...

    Then you finish your letter. Now what? email it? Print it? save it? What UI would you use if you didn't have icons to save you? No floppy disk, no printer, no letter icon?

    How about the CLI again?

    save letter to TWX the Linux Zealot at c:\documents\Louis\private

    Letter interface closes!

    Or, type:

    save

    And a window shows up with all unsaved documents. Click on the one that you want (pictures? the first line? Time and date? command that started it?) and it saves it. Do you care where? No, not really, as long as there is enough space, it's easy for you to find it, and the OS can find it again.

    Printing. Easy. type:

    print

    Or

    print last saved document

    Or print last document

    Or print letter to TWX

    You get the idea. Icons and widgets, while useful at the time, are not the end all and be all of UI!

    Geek dating! [bunnyhop.com]
  • What?

    It's bad to click install and then finish/done?

    If Apple had another revolution up their sleeve, I would guess...

    a button 'compile', then a button 'install', then a button 'done'!

    Command Line Interface is an issue independent and unrelated to the people who have ever or never compiled things before. The OS that makes difficult things easy, and the impossible, possible.

    Geek dating! [bunnyhop.com]
  • Why should the UI be constrained to envelope icons, pencil and paper icons, or a blue crystal E?

    If Apple were to tap into the Terminal app, and create a CLI.app of magnificent proportions...

    Imagine typing into CLI.app the web address you want to visit:

    Goto http://slashdot.org
    Visit http://www.apple.com
    URL http://www.yahoo.com

    Or better yet

    search net Apples OS X beta rebate

    And it automagically loads Google, from IE, with those search constraints? Why not?

    Or for your doc; just start typing:

    New Doc

    And a blank doc starts up

    Open Game Design Draft 2

    And the Game Design Draft 2 document opens.

    Those are relatively easy because both require text input in the first place.

    But by analogy, Apple already has their CD-R interface, their CDDB interface, etc. Why should it be constrained to icons and widgets? That's a holdover from 20 years ago. It doesn't have to be that way, anymore, if a better way can be found, right?

    Geek dating! [bunnyhop.com]
  • What surprises me is that Raskin should criticise OS X's interface while praising Unix. Just when people were saying 'atlast a Unix that's easy to use'!

    If you want a model of ease of use, it sure as hell isn't Unix or Linux. Start a program with a few keystrokes - Like 'cp', 'grep', 'chmod'.. That's really intuitive! Surely the fact that people need to be trained to use Unix defeats what he's trying to achieve.

    But he'd just say "that's not what I meant! read my book!" Great, but don't treat me like an idiot just because I haven't researched your life.

    I am surprised that computers haven't evolved more than they have, but I don't think we'll see a major shift forward until Microsoft or Apple have refined everything they can refine in their OSes, and there's still a lot of little jobs to do.

    I'd like to see a next-generation Shell - with intuitive commands, natural language, maybe a mixture of icons and words if you want. If only text-adventures hadn't lost their popularity, there would be greater awareness of what can be achieved with parsers. I'd like to be able to 'cd' through applications -eg. "cd word", "cd inserts", "insert date & " - " & time".

    I haven't read Raskins' book. I know his work at Apple from reading 'Infinite Loop'. I have no idea what he's talking about. But I think I'm still capable of being a little visionary.

    Chris
  • while there are some nice elements of System/MacOS, I find that using an OS designed for 3rd Graders/Grandmothers a bit annoying. I dealt with too many problems on Apple's Mac OS in the 6.0-7.5 levels to want to think about their old-designed, cooperative multitasking OS, and while this may sound like a dis on it's creator, it is. At school I often use HP Terminals running CDE, and while not perfect, they're not too much harder than the MacOS, I click on the little pictures at the bottom, and the apps launch. I click the close box, and the app goes away. If grannies and 3rd graders want an easier to use OS, fine, but don't expect me to really care about it...

    And who says that UNIX can't be made at least somewhat usable to Joe Schmoe?

    </rant>

    "Titanic was 3hr and 17min long. They could have lost 3hr and 17min from that."
  • You neglect one very important point... CDE can be configured for use to be very, very easy to use. In fact, the configs that I mentioned are from something like 1994, and have not really been changed since, and they do work. I don't have to modify anything to get the system to work, and once I show a Mac user or a Windows user which icons are which, they're generally very pleased. They go back, because of what they have at home, but it's not all that big of a brainer...

    "Titanic was 3hr and 17min long. They could have lost 3hr and 17min from that."
  • they were big in schools because they had the pretty pictures before the PCs did, and because the GOVERNMENT subsidized their purchase... those "register receipts for apples" programs were not paid for by Apple, but by taxpayers...

    "Titanic was 3hr and 17min long. They could have lost 3hr and 17min from that."
  • Raskin may have been one of the first people to argue that computers don't need character generators. So what? That's not what's at issue here.

    At issue is the clunky way in which most current GUIs, including the Macintosh and Window burden the user with concepts like "files" and "applications".

    As a criticism of OS X, Raskin is right, and what he says applies equally to Windows, Gnome, and KDE. But Apple didn't have a choice: they were trapped in their own legacy. That was a problem Raskin himself could have helped avoid in the 1980s.

    Meanwhile, devices and services like Apple's own Newton, Palm Pilots, broadband phones, web-based applications, and game consoles are already redefining what "operating system" and "application" mean.

  • I have sent Jef Raskin's email below,
    after a few framing paragraphs.

    A few days ago, I sent an email to Mr. Malda asking to please
    consider starting a new topic heading, with its own icon:
    User Interface Design.
    Continuously improving UI design seems critical,
    to me, for the spread of free\open software.
    After not receiving any reply
    (and expecting nobody has time to answer email soon)
    I decided, with Jef's okay, to post it here.

    Today's follow-up of last week's thread heartens me.
    I see two early questionable comments rated 4.
    One seems, to me, to suggest that Raskin shouldn't
    expect people to read his writing carefully,
    let alone to check their references, before replying.
    The other seems, to me, to misinterpret,
    and quote out of context, from his
    article in the Dec. 1993 issue of Wired.
    Both have corrections rated 5.
    Way to go!

    The Slashdot self organizing community does self correct.

    Many respondents (both today and last week)
    seem truly unable or unwilling to understand basics of interface design,
    or even the importance of it, let alone more complex ideas
    (like those addressed in "The Humane Interface" and books like it).
    All the more reason for a new topic heading.

    I had emailed Jef Raskin about
    Slashdot's linking to the osOpinion.com
    editorial regarding his OSX concerns.
    He replied with a lengthy response to the Slashdot thread.
    Regardless of the decision about the new heading,
    here is what he wrote in response to last week's thread.

    Jef Raskin wrote:

    Thanks to Slashdot for notifying its readers of Michael Burg's article about
    my opinions on operating systems; unlike a lot of such articles, his was
    quite accurate. I have only a few comments on the article itself, which I
    will get to in a minute.

    The reader commentary, mostly anonymous, ranged from helpful and insightful to stupid and spiteful. Often, when the writer did not know how to do something I proposed, they labeled it impossible. In other cases, they assumed an inept solution of their own and then criticized their own solution, assuming that it would be how I'd do it.

    I do not imply malice to the contributions. This kind of stuff happens even with well-meaning writers. For example, my friend Simson Garfinkel reviewed my book "The Humane Interface" in Wired. He critiqued only one point in the book (about how to ease password-protected sign-ons) by assuming a particular implementation, one which I'd never propose. I sent Simson an email showing how it could be done quite nicely, and he agreed that he'd goofed (perhaps I should have given an explicit method). Unfortunately, the tens of thousands of readers of Wired will never know this.

    Fortunately, I can communicate directly with the Slashdot audience.

    My main critique of OSX has not to do with the underpinnings, which represent a long-needed set of improvements, but with its user interface. Burg clearly stated this. Many readers seem to have missed his point. I have always thought, and still think, that UNIX is a work of genius. I was a UNIX hacker for years, and its elegance, power, and flexibility continue to be an inspiration. It is based on a deep understanding of operating systems. The designers of Unix, and the majority of programmers who have gone on to expand it and to develop Linux, OSX, and other derivative systems, have not had a correspondingly deep understanding of user interfaces. As a result, the interfaces are difficult to learn and much harder than necessary to use.

    This is admitted by the UNIX world, which builds shells to protect us from its interface ugliness. Unfortunately, the various GUIs and other front ends have been created by people who do not have a as masterful understanding of interface design as they do of computers. Thus, the interfaces suck deeply.

    The failings of the OSX interface are less excusable because it is a contemporary product, and the designers did not take full advantage of what is known about interaction.

    A primary audience for my book "The Humane Interface" is my fellow nerds and hackers who have as little patience for psychological mumbo-jumbo as I do. We needed, and the existing literature did not supply, a sourcebook of straightforward and first principles with real, quantitative, measures that a programmer or software architect could use (not just a bunch of examples that peter out when you try to apply them to other problems). I assume that my readers are not scared off by a few logarithms.

    Burg quotes me correctly as saying that an operating system "is the program you have to hassle with before you get to hassle with the application." But the rest of the quote, "It does nothing for you, wastes your time, is unnecessary" applies only to having to deal with the operating system's interface. The stuff an OS does underneath, such as handling disk I/O, is, of course, essential. It is as essential as the circuit that comprises the disk motor clock, and the application-level user has as little need for knowing how one works as a programmer needs to know the other. Today's GUI interfaces require the user to understand far too much. They are primitive, crude, and surprisingly counterproductive. Burg put it well: "In short: the omnipresence of the OS is obsolete."

    But he also said, "The ability to be transparent, such as on the Palm handhelds... is far more important." Unfortunately, this is only a relative transparency. You still have to launch applications on the Palm. The OS is still obnoxiously there. But that's a whole other article that I won't repeat here (surf to my article "Down with GUIs" in Wired).

    Another example where Burg is still trapped in the old paradigm, while struggling mightily to escape, is where he said, "The idea of walking up to a PC in sleep mode and hitting a button, which would instantly activate a specific app, is compelling." He is still thinking in terms of having to hit a button and having applications. This tells me that he has gotten, maybe, a third of the way to understanding the kinds of interfaces that would really be easy to use. But, to quote his dead-on concluding paragraph "Raskin
    wasn't criticizing OSX for its qualities as an OS, but for the fundamental principal that it represents: something standing between you and whatever you want to do."

    Now let's get down and dirty and deal with some of the threads that followed the article.

    As TheJohn pointed out, "No, he's not really saying that at all. Raskin goes into quite a bit of detail about his vision in his book, The Humane Interface , and it doesn't involve most of the things people are attributing to him in this thread. It's not about locking people into one application provider, or even eliminating menus, or not having what I would call an OS (controlling devices, managing resources, etc.) It just doesn't look like what we often think of as an OS."

    An example of what TheJohn was talking about is a statement like this, from another poster "is that what the world really wants? a simple pad to activate your apps, a disk (cartridge) for a simple install and no real flexibility? I can see my grandma or mother using it, but me, my father, my sisters, or really anyone whos not "afraid" of that off white box would disregard it as another applience." I never suggested anything of the sort. Never have, never would.

    My Canon Cat design has been dumped on by the ignorant for its supposed limitation to its own built-in tasks. In fact, you could write code, even assembly code, in the word processor and execute it without leaving the letter you were typing. The user had access to all system resources. True, we didn't want users to execute code by accident. You had to type in a one-time password (it was in the manual) to gain access. That was it.

    To get at the Mac's power, you have to buy software, like compilers. My kind of designing gives far more flexibility than what we now have. So much for that rant. bear@spdcc.com pointed out that the piece was poorly titled. I had the same thought when I read it.

    Here's another case of someone being misled by their own preconceptions: Burg's article said, "Raskin goes on to illustrate that a computer should be as easy to use as to start typing on a keyboard to open a word processor --
    with no lost keystrokes, or to put a stylus to a tablet and start drawing in a graphics app." To which someone replied, "This is all very nice and good,but what if you wanted to use a spreadsheet instead?" I've explained this many times in print, so I won't go into it again as this reply is getting longish.

    Dancin Santa said, "It's one thing to make an OS as non-intrusive as possible, but it's a whole different proposition to remove any semblance of an OS altogether." Yup, it's a whole different proposition. And it can be done.

    osgeek@my-deja.com says (I have a lot of respect for those who sign their opinions), "A palm pilot is a very specific device that is normally only used for several simple applications: taking notes, scheduling, and keeping contact info. As soon as you start adding tons of varied applications to your Palm Pilot, you begin to find that its specialized interface & transparent OS are a hindrance." osgeek is so correct! But he or she went on, "You begin to wish that you had a better way to organize your files or add hardware. If you could keep adding all of that functionality back in there, guess what you'd have... a PC." osgeek is so wrong! Just because osgeek can't see anyway around the problem s no proof that it can't be solved. This is another case of a reader tripping on his own boxed-in-ness.

    Unfortunately, osgeek then becomes less civil, "Everyone is always looking to topple the PC with bullshit articles and arguments like Raskin's. They think that just complaining about it is going to inspire the industry to create something new and different that will change everything. For once, I'd like to see one of these pundits put forth a legitimate idea for the future of computing that might obsolete the PC - and no, web phones, PDA's, and Internet-saavy refrigerators don't count." I heard stuff like that about my "complaints" about usability before I created the Mac at Apple. Why does osgeek assume that I'm not developing something right now? Does osgeek know about the systems I've designed for various companies over the last decade?

    Another commenter thought my machine would work like this:
    "Turn on keyboard.
    Type letter.
    Print letter.
    You don't have to tell me I'm right, I know I'm right!"

    Wrong. You shouldn't have to turn on the keyboard.

    Sourav.mandal@nospam.ikaran.com wrote, "The 'computer as appliance' vision is stultifying. There's a reason a computer has totally general input (keyboard, mouse) and output (pixel-based monitor, sound) devices -- people want their workspace to be totally abstracted from the hardware in which it resides."

    Sourav, thinking to disagree with me, actually agrees. Read my book, which says that our present hardware, with its general text and graphic input devices and its graphic and aural output devices, is a sound basis for the future.

    Who is it that these people are arguing against?

    Enough examples. There was also a personal attack that requires an answer. I've run into this many times and usually just point people to the original sources (you can find many at <http://library.stanford.edu/mac/>) or to careful book-length accounts of the history such as Linzmayer's well-researched "Apple Confidential" or Malone's "Infinite Loop".

    Sabat said, "Jef Raskin did not have any hand in designing the Macintosh as we know it. He had the original idea of an "appliance" computer (sound like his current rant?) and started the project, misspelling its name as "Macintosh" (instead of McIntosh). Shortly after, Steve Jobs kicked Jef off the project and changed it completely, basing it on the idea of a low-cost version of the Lisa (which was not selling well at $10k a pop). Guys like Andy Hertzfeld and Bill Atkinson are to thank for the Mac GUI, not Jef."

    First, what does it mean to invent something? Edison invented the light bulb, and the Wright brothers invented the airplane. However, Edison certainly had no hand in designing the fluorescents that illuminate my office, and neither Wilbur nor Orville contributed to designing the Boeing 747. In 1978 I saw that Apple had no product that would take the company into the future. I also believed that getting personal computers to the world in quantity would require making them far easier to use. This was a vision very different from anything at Apple at the time. I called my invention "Macintosh" after my favorite apple, the McIntosh. I changed the spelling to avoid (I hoped) conflict with the McIntosh hi-fi company.

    I did coin the term "information appliance," so I knew that the Macintosh was not one. I designed it with provisions for bus expansion because I knew that people would want to add hardware. I wanted to have a programming language built in. Jobs took the bus expansion off and deleted the programming. The bus came back with the Mac SE, but you still(!) have to buy programming languages for it.

    Thousands of people have contributed, some brilliantly, to the Mac and its applications. I didn't write a single line of code or design a single circuit. I did hire great people and give them a direction which did not then exist in the personal computer world. I did create a number of new interface methods now taken for granted in every GUI-based machine.

    Sabat is unaware that the Lisa was originally a character-generator machine, and it *gained* its graphics-based screen from the Mac project. (I went over and convinced the Lisa team to change their architecture.) Jobs did not kick me off the project; that never happened, and is insulting to Jobs as well. (The facts are to the contrary: When I resigned, he tried to convince me to stay).

    I don't want to belabor this, or detail here the influence of Xerox PARC's outstanding contributions to interface design. Bottom line is that I created the Mac and ran the project for nearly four years. A zillion details were changed (some for the better, some for the worse) both while I was project leader and afterward, but my basic vision of a computer-based-on-an-interface, instead of the prevailing build-it-and-then-add-random-software, did change the face (and interface) of computing. Jobs, Hertzfeld, and Atkinson were major factors in making my invention a commercial reality, along with many others (Brian Howard was especially important, but rarely mentioned). My dream became their dream.

    Get the facts and turn down the flames. I am always happy to discuss real issues.

    -- Jef

    </ Jef Raskin>

    Sorry; I hit "submit" when I intended to "preview" before. This reads better.

    If Bobby Fischer told me how to play chess, or Bill Gates told me how to make money, I'd listen, I'd thank them, and I'd think long and hard before I contradicted them.

    Please forgive my mistakes, & thanks for reading,
    J. Daniels


  • by Phroggy ( 441 ) <slashdot3@ p h roggy.com> on Thursday February 08, 2001 @05:21PM (#445483) Homepage
    I think the biggest problem with this is that, in your example, I feel like I'm asking the computer to do something for me. The computer is ultimately in control, not me. In bash, when I type "mv ~/*.tmp /var/tmp/" I feel like I am actually doing it - not asking for it to be done. This is even more true in Mac OS - when I drag an icon, it feels like I'm actually moving a file myself. It goes where I put it, not where the OS thinks it should be. I like this control, and I don't want to give it up.

    --

  • by maggard ( 5579 ) <michael@michaelmaggard.com> on Thursday February 08, 2001 @03:05PM (#445484) Homepage Journal
    It all depends on how you want it.

    Gimp already runs under MacOS X. However, Gimp depends on X Windows for its display, and this doesn't ship with MacOS X. Instead Apple developed their own Display-PDF based "Quartz" graphics engine and then built their "Aqua" GUI on top of this.

    XFree86 has been ported to Apple's Darwin & MacOS X, but it doesn't run under Quartz/Aqua. Thus under MacOS X one must shut Quartz/Aqua down and run XFree86 on its own; not most Mac users' first choice, since they then can't use any native GUI applications.

    Tenon does have a commercial X Windows server for MacOS X that runs under Quartz/Aqua. Indeed, it already runs Gimp just fine. "Xtools" is still in extended beta, but it's expected to be final when MacOS X finally ships. This is the sort of thing most Mac users are likely to be most interested in - X Windows as a peer, not a separate environment.

  • by phaktor ( 39283 ) on Thursday February 08, 2001 @02:20PM (#445485) Journal
    "The user must be in command, and the computer, the obedient servant. "

    I don't know about you, but I work only to support my computer habit. :)
  • by UnknownSoldier ( 67820 ) on Thursday February 08, 2001 @04:22PM (#445486)
    ... is a *TIGHTER* integration

    *Why* can't I select files in the gui, and have a shell "smart enough" to know what I selected!?

    Or,

    *Why* can't I select files in the shell, i.e. select *.txt *.doc, and have those files selected in the gui explorer!?

    Here is how I have a partial compromise on my Win2K boxes:

    I press Windows-E to bring up the explorer, with drives on the left pane and the current contents of the selected folder on the right pane. I can right-click on a folder/directory and get a menu choice, "4NT Prompt Here". A shell opens up, already in the selected directory.

    If I navigate around in the shell, changing directories, I have a command called "explore" (which I usually alias to "x") that brings up the 2-pane explorer view with the current directory already selected!

    It is REAL handy being able to go back and forth between the shell and the gui explorer.

    Here is how you can do this under Windows...

    Regedit:

    HKEY_CLASSES_ROOT\Directory\shell\Prompt
        (Default) = "&Prompt there"
    HKEY_CLASSES_ROOT\Directory\shell\Prompt\command
        (Default) = REG_SZ: 4NT /k *cdd %1

    And the "explore"/"x" alias runs:

        explorer /e,%_cwd


    4NT has the special command "cdd" (change drive & directory) and the variable %_cwd (the current working directory), since the default cmd.exe that ships with Windows is, uhm, under-powered ;-)
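
    For the curious, here is a minimal sketch of the same trick for the stock cmd.exe, written as Python against the standard winreg module. The "PromptHere" key name and the cmd.exe command line are my stand-ins, not part of the 4NT recipe above, and writing under HKEY_CLASSES_ROOT needs admin rights:

        # Hypothetical sketch: "Prompt Here" folder context menu for cmd.exe.
        # Key name and command line are stand-ins, not from the 4NT recipe.
        import winreg

        BASE = r"Directory\shell\PromptHere"

        # Menu label shown when right-clicking a folder in Explorer.
        winreg.SetValue(winreg.HKEY_CLASSES_ROOT, BASE,
                        winreg.REG_SZ, "&Prompt Here")

        # Command to run: open cmd.exe already cd'd into the clicked folder
        # ("cd /d" also changes drives, like 4NT's "cdd").
        winreg.SetValue(winreg.HKEY_CLASSES_ROOT, BASE + r"\command",
                        winreg.REG_SZ, r'cmd.exe /k cd /d "%1"')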

  • by Ukab the Great ( 87152 ) on Thursday February 08, 2001 @11:07PM (#445487)

    Not to say that it couldn't be done, but if you want Gimp to have an interface consistent with OS X (I don't mean Aqua, I mean menu selections, keyboard shortcuts, dialogs, etc.), that will necessitate some major changes. If you care about appearances, this will especially be an issue. For starters, there's the matter of menus. The menu selections in Gimp (as in pretty much all other GNOME programs that use libgnomeui macros) are copied from M$. This means getting into the source code and changing menu selections such as "Exit" to "Quit" (and dealing gracefully with the unused underline accelerators). It sounds petty, but Mac people are UI aficionados, and we don't tolerate windoze-lookin' stuff. You also have to keep in mind that the Gimp UI was designed for a UI system that doesn't have the menubar at the top. In order to have multiple top-level windows without having a menu for each one (which might be weird in a graphics program), Gimp on X Windows uses (and IMAO, abuses) the hell out of contextual menus (aka right-clicking). To get Gimp to work, look, and feel right on a UI that implements a global menubar (such as OS X), you're going to have to move or duplicate a lot of the stuff from the contextual menus into pull-down menus. And some stuff you find in a right-click menu might not translate very well to a global menubar menu if it's blindly copied. Yet more code writing.

    There are other UI matters, such as dialogs. For example, Mac users are used to seeing dialogs that have the "Cancel" button on the left and the "OK" button on the right (UI experts such as Bruce Tognazzini say this is the correct way to do it, but that's another argument for another time). Gnome does it the other way (the Windows way), with "OK" on the left and "Cancel" on the right. Until the Gnome project decides to implement some sort of platform look-and-feel code (or libgnomeui for OS X is seriously recoded), the dialogs in Gimp under OS X wouldn't look like real Mac dialogs.
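
    To make that concrete, here's a rough Python sketch of the sort of platform-conditional UI table a port would need; every name and label in it is invented for illustration, not taken from the Gimp or GNOME source:

        # Hypothetical platform-conditional UI conventions; all names here
        # are made up for the example, not from Gimp/GNOME code.
        PLATFORM = "macos"  # or "windows" / "x11"

        MENU_LABELS = {
            "windows": {"quit": "E&xit", "prefs": "&Options..."},
            "macos":   {"quit": "Quit",  "prefs": "Preferences..."},
        }

        def menu_label(action):
            # Unknown platforms fall back to the Windows-style labels.
            return MENU_LABELS.get(PLATFORM, MENU_LABELS["windows"])[action]

        def dialog_buttons(ok="OK", cancel="Cancel"):
            # Mac convention: Cancel on the left, OK on the right;
            # Windows/Gnome put OK first.
            return [cancel, ok] if PLATFORM == "macos" else [ok, cancel]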

    Finally, the other thing you have to keep in mind is that Gtk/Gnome recompiled straight across, without serious work, probably wouldn't take advantage of all those tasty OS X/G4 graphics features like AltiVec (someone correct me if the PowerPC Linux gdk, libart, etc. work well with AltiVec), the PDF-based graphics system, and color correction. I'm not saying a port of Gimp to OS X can't be done, but keeping UI consistency with OS X and using all its special features and optimizations to the fullest would probably be a lot of work.

    I'd like to see it happen, tho'.
  • by Sir_Winston ( 107378 ) on Thursday February 08, 2001 @10:03PM (#445488)
    The ATI video cards since the Rage 128 based units have had one thing sorely lacking in any other video cards even today: an integrated idct (inverse discrete cosine transform, if I recall) unit in the video hardware. Basically, the idct is the most intensive part of MPEG-2/DVD decoding, so doing it in hardware takes most of the work off the CPU. Granted, a full hardware MPEG-2 card takes almost all the burden off the processor, but there's something you obviously don't know.
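
    (For reference: the 8x8 inverse DCT the ATI chip offloads is the standard MPEG/JPEG formula. Below is a direct, deliberately unoptimized Python rendering of that math; real decoders use fast factored versions, not this quadruple loop.)

        # Textbook 8x8 inverse DCT -- O(N^4) per block, for clarity only.
        import math

        def idct_8x8(F):
            """F: 8x8 grid of DCT coefficients -> 8x8 block of samples."""
            def c(k):
                return 1.0 / math.sqrt(2.0) if k == 0 else 1.0
            out = [[0.0] * 8 for _ in range(8)]
            for x in range(8):
                for y in range(8):
                    s = 0.0
                    for u in range(8):
                        for v in range(8):
                            s += (c(u) * c(v) * F[u][v]
                                  * math.cos((2 * x + 1) * u * math.pi / 16)
                                  * math.cos((2 * y + 1) * v * math.pi / 16))
                    out[x][y] = s / 4.0
            return out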

    Hardware isn't necessarily better than software, if the hardware takes shortcuts that the software doesn't and you have enough processing power to run the software. I myself have a Hollywood+ card which I have been very happy with--I used to laugh at those fools using PowerDVD or other software-based DVD players, when I had dedicated hardware that had higher image quality.

    However, for analogue video capture as well as its TV tuner features--the best on the market, bar none--I got an ATI All-in-Wonder 128. On my old K6-2 400 machine it couldn't play DVDs well at all, which was fine since I got the Hollywood+ for that. Well, when I got my new KT7-RAID not too long ago, with a processor that'll o/c to 1GHz, I reformatted and reinstalled everything. I tried the ATI's DVD software, just to see if it worked with the faster machine and all--and it did, surprisingly so. It has much better image quality than the Hollywood+ does. I hate to say it, since I championed the REALmagic card for so long, so smug that it was better than any other DVD solution. But the fact is that images don't lie, and after comparing the output time and again from both cards--the Hollywood+ with its complete hardware MPEG-2 decoding, and the ATI with its hardware idct unit and the rest in software--I came to the reluctant realization that the ATI unit had a much clearer, more detailed image.

    The key is that I think the Hollywood+ must be trying to do some edge enhancement or something, because when I examine the two streams on my 20 inch 1600x1200 display, the ATI looks extremely lifelike and the Hollywood+ seems to look duller, less sharp but with more prominent edges. To try to eliminate resolution as a factor, I dropped down to 800x600 and 1024x768 to see if it made a difference--but it didn't. The ATI was always clearer, crisper, than the H+. This was on a new install with the latest drivers and the latest VIA 4-in-1's and the latest BIOS, with a Pioneer 10x DVD drive, and everything seemed to be functioning perfectly.

    Basically, I think the ATI's DVD software, based around the Cinemaster decoding engine, does a reference-quality job of decoding DVDs. The H+, on the other hand, seems to use some edge-enhancement trick, or just doesn't decode as well. I think it's the former, because the H+ does in fact look better than the ATI when viewed on a standard television via the on-card TV Out. I think the H+'s decoding engine was designed around the idea of decoding DVDs for display on a standard TV, which doesn't benefit from a clear full-res picture but does benefit from a little bit of modest edge enhancement. Now, I could be totally off base with this edge-enhancement theory, maybe the H+ doesn't do that, but the fact remains that its picture is not nearly as clear and pristine as that of the ATI at high resolutions or the native DVD res, though the H+ does look better than the ATI on a regular TV. The other area in which the H+ is superior is in its color: it has more vivid, rich colors and saturation than the ATI, but this is a function ATI and Cinemaster could easily improve in future software revisions--as it is, the ATI software offers little in the way of color/saturation/hue/brightness tweaking, while the H+ gives you total control.

    It almost goes without saying that the ATI approach needs more CPU time, but even with my little Duron cranked down to a paltry 700MHz it still only eats ~30 to ~50 percent of the CPU, with a few other processes in the background to boot. The H+ uses much less, but the tradeoff is in image quality. Disagree all you want, but as an owner of both I have compared performance and decided to use the ATI when viewing DVDs on my PC; when playing them on my TV for other people, I use the H+. At high res, the ATI wins hands-down.

  • by update() ( 217397 ) on Thursday February 08, 2001 @02:24PM (#445489) Homepage
    Note that the difference between Jef Raskin and the jackasses who post to every Gnome or KDE story saying, "Why are developers wasting their time imitating existing interfaces? They should be doing something much better. No, I don't have any ideas as to what 'something much better' is, but that's what they should be doing." is:

    1) Virtually all of today's GUIs are derived directly or indirectly from his work on and before the Mac.
    2) He's written a book [amazon.com] that explains what 'something much better' might look like.

  • by q000921 ( 235076 ) on Thursday February 08, 2001 @06:13PM (#445490)
    Several systems prior to the Macintosh, among them Smalltalk, had a much more integrated approach to applications, documents, and data. You didn't have to "start up" applications, exchange data via files, or all the other clunkiness found in "modern" desktop systems. And while none of those systems realized it fully, the writing for an easy-to-use, persistent object-based desktop system without mainframe holdovers like "files" and "applications" was on the wall.

    Then along came Apple with their underpowered Macintosh, programmed in assembly language and Pascal. They produced something that looked nice, but its model of applications and data was not much different from your average DOS machine's. And that metaphor has held the desktop in a tight grip ever since, copied over and over again by Windows and now Gnome and KDE.

    I think what Raskin is complaining about is ultimately due to Apple and their initial success with what was already then a broken paradigm. It seems like adding insult to injury for an Apple employee to come back now, 15 years later, and say that everybody is doing things wrong. Well, of course we are doing things wrong. That's because the market and users expect things to be done "wrong". Undoing the damage now will be much harder because everybody now expects things to be done that way.

  • by Maldivian ( 264175 ) on Thursday February 08, 2001 @05:30PM (#445491)
    Jef,

    Regards, and my apologies for the backlash from a few here who felt your comments hit too close to home :)

    I'm working on a new UI written in Objective-C and built around OpenGL, IBM ViaVoice and touch input. I've been a fan of you and the Mac OS for quite some time. Our software is now mature enough to be opened to the public (it's an open-source project, but due to the nature of the project, we thought it best to work on it silently until it matured toward version 1.0). Right now we have been able to build the server/client application on Linux, SGI and Digital Unix archs.

    We have 2 developers and 5 artists/designers. The team came from various backgrounds (including game development (and a very dead famous game company), SGI and, yes, Microsoft).

    Our solution to the problem of UI was to forget everything that we saw in other UIs out there. We even forgot the futuristic UIs often seen in SciFi and so on.

    A UI should be intuitive enough for a child of, say, age 3 to be able to play around and navigate with ease. Our design revolves around the way humans see things in general. The whole interface is a 3D window into a virtual space (which we call home, for lack of a better term), where you interact with objects such as pens and paper to write.

    Input to the UI can come via touch (touch screen), mouse, keyboard and voice. One of us worked really hard on the ViaVoice engine and got it working well enough, with a unique grammatical markup language that lets us navigate the UI with ease.

    For example, one of the applications we have is something called Achilles. Achilles is our human interface to a number of transports (e.g. e-mail, News, Jabber (yes, it works very well with Jabber), IRC and even Slashdot). The Achilles interface is represented virtually by an avatar (by default called Vishnu), a realistic human-type avatar. Say you get a mail: Vishnu calls out for you, and when you approach him, he opens his hand and shows you the scroll. That's how e-mail goes, and a number of things are also sent through the voice transcoder to be read out through the speakers instead of being shown.

    That's how some of our applications work. But we seem to have stumbled on a problem that we can't get away from. Vishnu is very bright and nice, but every time we present it to a test user, they almost always revert Vishnu to an alternative text-input interface and read mail as they did with their other applications. But when we presented the same applications to a child, they immediately used the highest form of non-abstraction (i.e. voice + a realistically animated character).

    Now our question to you: is there a problem with the way we visualize UIs? Is it derived from how we've seen UIs in the past? What would it take to break people of their usual Windows/Mac UI habits?

    Thank you.

  • by Kujako ( 313468 ) on Thursday February 08, 2001 @02:44PM (#445492)
    Like all computers, it will read your mind. The problem is that, for now, all the computers can do with the information they pick up off your brain waves is store it and secretly plot against you. Do you think it's a coincidence that your computer only crashes when you're trying to get important work done? Or that only the important files get corrupted? It's all a plot, I tell you; the computers are out to get us. When I see ads for new computer hardware, all I can think is "dear God, they've learned how to reproduce using a human as the gestation host." But then, I'm a bit mental at times.
  • Hate to break it to you, but the "killer-app" growth of Linux comes from Apache and old-line Unix stuff like NFS, and other server-side tools like Samba, not from KDE or Gnome. Yes, those can have GUI configurators, but nobody really uses them on a regular basis yet. At any rate, it's servers that you set up and CAN TRUST to leave alone to do their thing, without having to check on them every hour on the hour like with some other... proprietary... OS out there...

    Others have Linux because of either the free/open-source model as a philosophical thing, or because they're in education and linux (w/ the source codes) is a great way to learn OS design and implementation.

    I couldn't name one person out there who says "yeah, I just HAD to go get Linux 'cause there's this great desktop called GNOME out there..."

  • by drivers ( 45076 ) on Thursday February 08, 2001 @02:28PM (#445494)
    I'm sure he has some great ideas (it's giving me a few ideas), but I don't think he's helping himself much. The whole piece was "I didn't say that. If you would read my book you would know better." Well, let's see. Here's a (printer-friendly) article by him, from Wired magazine.

    http://www.wired.com/wired/archive/1.06/1.6_guis_pr.html [wired.com]

    What does he say? The same stuff he says he didn't say. Start typing to make a document. Start drawing with a pen tablet to make a drawing. "One big mistake is the idea of an operating system." And, "An operating system, even the saccharine Mac or Windows desktop, is the program you have to hassle with before you get to hassle with the application. It does nothing for you, wastes your time, is unnecessary."

    How can he blame his critics for saying such things?
  • by 1010011010 ( 53039 ) on Thursday February 08, 2001 @05:25PM (#445495) Homepage
    AI shell> get all files ending in tmp in my home place
    OK, I've found 10 files for your request
    AI shell> go to the place where my temporary files are stored
    OK
    AI shell> drop the files there
    10 files dropped.
    You have been eaten by a Grue.

    - - - - -
  • by Fross ( 83754 ) on Thursday February 08, 2001 @05:00PM (#445496)
    AI Shell> Go North
    Ok.
    You see a troll.
    AI Shell> Kill Troll
    You hit the troll.
    Troll says "p0uR h0t gr1tZ d0Wn mY pAnTz!"
    Troll died.
    You found one suspicious box of kleenex (used).
  • by MrBogus ( 173033 ) on Thursday February 08, 2001 @04:11PM (#445497)
    And for the bloat-haters out there, such an "AI Shell" would actually be very similar to the natural-language interpreter in Zork and other Infocom games. And that ran fine on 8-bit 48K machines.

    Apple has something similar with HyperTalk/AppleScript, but the filesystem bindings are really weird, and furthermore, it doesn't really run interactively.
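
    (As a hedged illustration of how small such a parser can be, here's a toy Infocom-style verb/object loop over the filesystem, in Python; the vocabulary is invented for the example, not from Zork or AppleScript.)

        # Toy two-word parser over files, Infocom style; the vocabulary
        # here is made up for illustration.
        import os

        VERBS = {"list": "list", "show": "list", "count": "count"}

        def ai_shell(line):
            words = line.lower().split()
            if not words:
                return ""
            verb = VERBS.get(words[0])
            suffix = words[-1]           # crude: last word is the object
            hits = [f for f in os.listdir(".") if f.endswith(suffix)]
            if verb == "list":
                return "\n".join(hits) or "Nothing here ends in " + suffix
            if verb == "count":
                return "I see %d such files." % len(hits)
            return 'I don\'t know the word "%s".' % words[0]

        # e.g. ai_shell("show tmp") or ai_shell("count log")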
  • by Fervent ( 178271 ) on Thursday February 08, 2001 @02:23PM (#445498)
    Many people missed, and Burg did not make clear, that I was talking about the *interface* to UNIX, not to UNIX itself, which I think is a work of genius and a masterpiece of elegant design.

    Definitely. The command line for the average user is absolute garbage. Why doesn't the Linux/FreeBSD community recognize its explosive growth for what it is: the proliferation of decent GUIs like KDE2, Gnome and Eazel is what's driving the growth.

    I have a professor who's been using Unix for some 20 years (he is the only one in our college with a Sparc on his desktop), and he prefers the GUI to the command line for most tasks. I installed Red Hat on his laptop with just the command line (he originally said this was fine), and he came back in a month asking for X Windows.

    Point is, neither the command line nor most GUIs are terribly intuitive. But GUIs, for the end user, make a hell of a lot more sense. Unix's underpinnings are great. Its current interface is absolute garbage.

  • by iso9k ( 185654 ) on Thursday February 08, 2001 @02:17PM (#445499)
    He just hates to be wrong...
    He referred to us as "making a mistake" and said he "is disappointed so few of them (us) took the time to understand the context of his remark(s)".
    Not once does he say "well, I guess I should have said..." or "what I meant was...". He seemed to blame us for not getting it, as if he made no mistakes and it was the reader who was mistaken, 100%.
    I guess I just don't like that he did not put his ideas out correctly and then goes on to make it the reader's problem - I almost get the feeling of "if you did not read it right, you are dumb."
    Quite cocky if you ask me...
  • Jef is right.

    When you're writing a document in your favorite OS, be it OS X, Win2k, or Linux, it should be the interface for writing the document, and not the interface of the OS, that you deal with. The constraint, put forward by him and his crew in the first iteration of the Mac OS, was a consistent UI, so that all apps looked alike and felt alike. It was supposed to lessen the learning curve.

    What he is saying isn't wrong. If the OS is an interface you have to learn first, before you can use your app or do your work, it is a waste of time; it is unnecessary. Hardware should be powerful enough today that OS intrusion can be minimal. When you're using something like Netscape, a web browser, it should be a world of URLs, links, images, files, and content. You shouldn't have to worry about fonts, except perhaps as a preference, or printer setup, except when you want to choose specific printers, or about security settings, except when you want warnings and such. Compare that with Linux, and compare that with Windows. Printers and fonts and stuff should just work behind the scenes: Netscape does its part and gets what it needs from the OS, without you having to fiddle with configuring printers for Netscape, configuring fonts or font servers for Netscape, etc.

    Or something similar with CD burning, under OS X and under Windows. If the drive is connected, all you have to do is drag files to it to burn them! No interface windows, no volume information, no format or filename or filesystem fiddling. Just treat it as another device to write to!

    Treat ripping music, making MP3s, and burning them as one set of functions. That's iTunes. The OS doesn't get in the way. In fact, if the OS really didn't get in the way, the CD would automatically connect with CDDB, so that when you popped up Explorer or the Finder, the CD would have all the names, titles, album info, etc. Drag one of these items into an MP3 folder, or just drag the whole CD into the MP3 folder, and MP3 files, or even a whole MP3 album, get created. The UI, in this case drag and drop, doesn't get in the way; it's the seamless, transparent means by which one operates. The OS merges functionality with the apps involved, but it's the app you're using that gets the focus.

    His much-maligned word-processing example: start typing, and the OS should figure out you're writing an email, or a letter, or drafting a document. Does the system do it for you now? No, you need to find the right icon or the right folder first. Why should this be? Why shouldn't the system be smart enough to figure out what we need? If you want to start browsing, just typing http://slashdot.org into a commandline-like interface should be enough to bring up Netscape. If you want to send an email, typing louisjr@nospam.com should bring up the right email program. Want to play music? How about 'play sad_songs'. Or pop a CD into the drive. Want to copy it? 'copy CD to c:\scratch\music'

    Of course, my own guesses and implementation of Jef's idea may be broken too. But I think there's merit.
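
    (To be concrete about the guesswork: a dispatcher like the one just described could be as small as the Python sketch below. All the program names are stand-ins for whatever is installed; the point is routing on the shape of the input.)

        # Hypothetical intent dispatcher: route raw typed input to an app.
        # Every program name here is a stand-in, not a real binding.
        import re, subprocess

        def dispatch(text):
            if re.match(r"https?://", text):                   # browse
                return subprocess.Popen(["netscape", text])
            if re.fullmatch(r"[^@\s]+@[^@\s]+\.\w+", text):    # e-mail
                return subprocess.Popen(["mailer", "mailto:" + text])
            m = re.fullmatch(r"play (.+)", text)               # music
            if m:
                return subprocess.Popen(["player", m.group(1)])
            m = re.fullmatch(r"copy CD to (.+)", text)         # rip/copy
            if m:
                return subprocess.Popen(["cdcopy", m.group(1)])
            # Anything else: start a new document seeded with the text.
            return subprocess.Popen(["editor", "--new", text])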

    Geek dating! [bunnyhop.com]
  • by bnenning ( 58349 ) on Thursday February 08, 2001 @03:18PM (#445501)
    Just how compatible will Mac OSX be with Linux? There are some programs I like on Linux, like the GIMP, that I would like to use on the Mac too.

    Mac OS X is basically BSD under the hood, so source compatibility should be good. I was able to compile and run most of the Obfuscated C Contest entries without a hitch. XFree86 has already been ported to OS X in full-screen mode; a hot key toggles between it and the normal OS X interface. Tenon is working on a (commercial) rootless X server for OS X, they have a beta available here [tenon.com].

    I really tend to judge OS's by looks and not substance I suppose, which is why I like gnome and Macs and not MS so much.

    I hope you're not implying that MS wins on substance :)

  • by nehril ( 115874 ) on Thursday February 08, 2001 @03:48PM (#445502)
    What does he say? The same stuff he says he didn't say. Start typing to make a document. Start drawing with a pen tablet to make a drawing.

    He is describing one system he designed that operates in that fashion. He doesn't say that all computers should operate in that way, just that he once designed one like that, and it worked. He used that as an example of how designers should break away from conventional thinking. For all we know, the system he referred to was a simple experimental prototype. Hardly contradictory stuff for a researcher.

    "An operating system, even the saccharine Mac or Windows desktop, is the program you have to hassle with before you get to hassle with the application. It does nothing for you, wastes your time, is unnecessary."

    I see no contradictions here. He is describing the "operating system" concept as it has been sold to us. What is the "Windows Operating System" to most users? It's the Start Menu, the nested menus, the dancing paperclip. In short, the cruft you have to slog through before you start typing your paper or drawing your next masterpiece. He is purposely describing what an OS is from a user's perspective, not from a computer scientist's.

    How can he blame his critics for saying such things?

    A great many of the derogatory comments I read here came from people who failed to see that when he says "an operating system" he is usually referring to the user interface of that operating system (average user perspective), not the collection of system calls and programs that provide access to hardware (computer scientist perspective). If you can keep straight in your head that he is a UI researcher, most of what he says makes sense, or at least makes you think.

  • by mauddib~ ( 126018 ) on Thursday February 08, 2001 @03:11PM (#445503) Homepage
    Point is, neither the command line nor most GUIs are terribly intuitive. But GUIs, for the end user, make a hell of a lot more sense. Unix's underpinnings are great. Its current interface is absolute garbage.

    Well, I understand your points; UNIX interface design was initially a bit poor. But the idea of pipes (and pipelines), shell substitution, input and output redirection, etc. was introduced with thought behind it.

    That thought is called flexibility, and I can't stress the term enough. One of the key reasons I use UNIX to its full extent, and learned to love it, is flexibility. Small applications like sed, awk, find, grep, ls, cp and the others only contribute to this. Good editors like vi or emacs extend the idea even further.

    But there is a drawback to this idea, and it is called "user-friendliness". The term was introduced mainly for new users, and the need for it is obvious in two ways.

    First of all, not everybody is as techy as the average Slashdot reader. It is unrealistic to think that a new computer user will easily pick up the idea behind UNIX and shells.

    The second drawback of this flexibility is that it leaves open too many ways for a user to interact with the OS. Again: most techies will like this open-mindedness and are always willing to learn (myself included). But it also introduces doubt about how to act on certain problems. In 10 seconds I can think of 10 different ways of finding a file on a given operating system. This might be ideal for flexibility, but it leaves the user with the problem of choosing his/her best bet.

    The idea of using GUIs comes to mind. The use of a mouse comes to mind. But as we can see now, it doesn't really solve the problems involved in making things less complex. Instead of reading manual pages, people are now browsing through all the menus, different windows and still more help pages. Its biggest drawback is that it seems to lose a lot of flexibility: GUI programs tend to be bigger, capable of doing more and more things, yet less than the sum of all the small command-line utilities.

    Of course, the need for graphical applications is very real. We just *need* them, no doubt about it, but as noted above, they also limit a lot of things. My answer: introduce a shell which is understandable to normal users. A shell which understands lines like:

    AI shell> get all files ending in tmp in my home place
    OK, I've found 10 files for your request
    AI shell> go to the place where my temporary files are stored
    OK
    AI shell> drop the files there
    10 files dropped
    AI shell> no, I made an error there, put them back
    OK, 10 files put back to your home place
    AI shell> edit the document I was working on yesterday
    2 files found:
    foo.doc
    bar.doc
    AI shell> edit the last document
    OK, editor started
    user gets a word editor, opening the file bar.doc

    This might seem a bit strange, and really difficult to implement, but if something like this were even nearly possible, it would be a huge leap in helping new users overcome UNIX anxiety. (A rough sketch of a first cut follows.)
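
    Such a shell could start life as little more than phrase patterns mapped onto file operations, with an undo log for the "put them back" case. A minimal hypothetical sketch in Python (the phrasings are hard-coded; a real version would need actual language handling):

        # Hypothetical phrase-driven shell with undo, per the transcript
        # above; phrasing and behavior are invented, not a real tool.
        import os, re, shutil

        HOME = os.path.expanduser("~")
        TMP = "/var/tmp"
        undo_log = []   # filenames moved by the last "drop", for undo

        def ai_shell(line):
            m = re.fullmatch(r"drop all files ending in (\w+) there", line)
            if m:
                hits = [f for f in os.listdir(HOME)
                        if f.endswith(m.group(1))]
                for f in hits:
                    shutil.move(os.path.join(HOME, f), os.path.join(TMP, f))
                undo_log[:] = hits
                return "OK, %d files dropped" % len(hits)
            if line == "no, I made an error there, put them back":
                for f in undo_log:
                    shutil.move(os.path.join(TMP, f), os.path.join(HOME, f))
                n = len(undo_log)
                undo_log.clear()
                return "OK, %d files put back to your home place" % n
            return "Sorry, I don't understand that yet"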

  • by schwanerhill ( 135840 ) on Thursday February 08, 2001 @03:19PM (#445504)
    while there are some nice elements of System/MacOS, I find that using an OS designed for 3rd Graders/Grandmothers a bit annoying. I dealt with too many problems on Apple's Mac OS in the 6.0-7.5 levels to want to think about their old-designed, cooperative multitasking OS, and while this may sound like a dis on it's creator, it is. At school I often use HP Terminals running CDE, and while not perfect, they're not too much harder than the MacOS, I click on the little pictures at the bottom, and the apps launch. I click the close box, and the app goes away. If grannies and 3rd graders want an easier to use OS, fine, but don't expect me to really care about it...


    And who says that UNIX can't be made at least somewhat usable to Joe Schmoe?


    Mac OS X has very little to do with Mac OS 6.0-7.5; the relationship between them is only skin-deep. (Hell, not one machine that can run Mac OS 7.5 will run OS X.) Mac OS X is not an "old-designed, cooperative multitasking OS;" it is "UNIX... made at least somewhat usable to Joe Schmoe."

    The Mac OS's strength has always been its powerful but easy to use (the two are not mutually exclusive) interface. It was never designed for novices; it was designed so that the computer does not get in the way of the user's work (as Raskin said). The user could be a third grader or any power user who could stand the OS's admittedly weak underpinnings. The lack of a command line does not make Mac OS < 10 a toy for third graders and grandmothers; it makes it a tool that a relatively large audience can use relatively efficiently, whether they be third graders, grandmothers, or people who know computers very well and have real work to get done.

    At the risk of pointing at the blatantly obvious, Mac OS X has a GUI that seems like it will be at least decent (it may not be as mature as Mac OS 9 until version X.1 or X.2) coupled with a command line (for those who want it) all built on top of a buzzword compliant core.

    Therefore, Mac OS X is an OS that third graders and 'power users' can both use as they see fit. I've been running the Public Beta for 4 months now, and this is definitely not your grandmother's OS (although mine will be using it :) ).

  • by nickfarr ( 161419 ) on Thursday February 08, 2001 @02:55PM (#445505) Homepage

    Apple has "been dying" for the past 10-12 years or so. Just like I wouldn't recommend Linux for people who have problems running winblows, I wouldn't recommend an AVID or an SGI to someone who just wants to edit their public-access TV show.

    Having worked on SGIs and Apples (both Mac-powered AVIDs and standalone DV-equipped Macs) in both professional and commercial-grade applications, I'd say Apple is *far* better for most TV-quality work that needs to get done.

    Unless you're doing music-video editing or special effects, or are producing the next 3-hour movie, an Apple with Final Cut Pro (or even iMovie) will do what you want, when you want it to, without having to resort to more costly options that produce only marginally higher-quality results.

    P.S. G4 video output was made for TV production and watching DVDs. Most PC video cards are made for playing Quake. Which tastes better: apples or oranges?
