Jef Raskin On OS X: "It's UNIX, It's backwards." 420

drfalken writes "Interesting piece here about OS X from Jef Raskin's point of view (he was one of the wizards behind the original Mac GUI). He thinks that even the concept of an OS is a holdover from an older era, and that work should be done to get the user closer to the app. I dunno if I agree. "
  • He is not saying get rid of the OS, just that it should be transparent. Yes, it provides services and does its job, but at this time it still gets in the way. People don't sit down at a computer to access devices and multi-task; that should not be a concern of theirs. It should happen, but not visibly. People sit down at computers to process orders, or to communicate with someone, or to write that nasty fax to a problem supplier. They don't sit down at the computer to try to find the right button (where was it, the start menu, the system tray or the taskbar? I just saw it). They don't sit down to deal with wrong DLL versions or incompatible hardware. The process of maintaining a computer should not enter the equation. Right now these are facts of life and not easily curable, but I think he is trying to say that this is the important stuff that needs to be focused on. (I know this is Microsoft-based, but it happens with most OSes in one form or another.) I think he is just saying that OS manufacturers, instead of continually pushing "new integrated features", should focus on making themselves as invisible to the user as possible.
  • No, the idiots are the ones that he's aiming for.

    This guy is talking about computers in a similar way to what the following statements talk about their subjects.

    Imagine if you get in your car, push a button, and it drives to your destination without you even having to tell it where you want to go.

    Imagine walking into your kitchen, turning on the light, and the kitchen automatically makes whatever meal you are in the mood for, without you even having to choose what you want.

    Imagine walking into your shop, pushing a button and your shop creates whatever you were just thinking about without you having to even lift a finger to accomplish it.

    Imagine wanting to read a book, pushing a button, and someone else reads the book and gives you a two minute synopsis of it so you can avoid the work involved in reading it yourself.

    I'm sorry, but I just don't see the point of making computers into push button "you will do what the computer wants" type of machines where humans become slaves to the machine. Drool, push button, drool, is it done yet? Especially funny is the idea that this guy says he would be doing this to "bring the person closer to the app" as if this is some beneficial thing.

    Imagine waking up, pushing a button and your entire day is lived out for you as you stuff your face and fall back to sleep to avoid doing anything at all!

    Sorry, but this whole idea of simplifying things to the point of absurdity is just plain stupid. Sure, single use devices could be OK for some things, but they will not remove the need for more complicated devices that others (like me) would need for more complicated tasks. Programming and listening to MP3s each may be "simple" (well, maybe not the programming), but together would present too much difficulty for a single use device. It just seems stupid to move computers back to the one-thing-at-a-time type of scenario. It frustrated me to no end when I had to run one program at a time. I would hate to see computers go that route again.

    Single use devices are just that. They won't replace the PC. Perhaps for single-use people they will be great. But for normal computer using people, they will not replace a standard PC with an operating system. It just won't happen.

    The OS may change, but the day it does my work for me, to the point where all I need to do is sit down and push one button, is the day I remove all computers from my house. It just wouldn't be worthwhile to me.

  • by Ian Wolf ( 171633 ) on Thursday February 01, 2001 @12:51PM (#463299) Homepage
    You fail to comprehend the magnitude of this new paradigm in computing.

    The servers you speak of are actually "slightly thicker clients" connected to "almost fat clients" connected to "so close to servers that you can't really tell the difference clients" connected to Bill G's personal desktop. Think Amway and you're almost there.

  • by Anonymous Coward
    I have a hard time understanding what Jef is complaining about. Mac OS X is doing exactly what he wants. In fact, others have written about it.

    What's up with Steve's drawers? [macedition.com]
    By Michael Gemar, 30 January 2001

    ...

    "It's like, reality, man..."

    And this seems to be the point of drawers, and image wells, and banning group boxes - moving the interface closer to a physical analogue. In a recent Ars Technica article, [arstechnica.com] John Siracusa pointed out that Aqua moves away from the explicit "desktop" metaphor to something more abstract. That's true, but as the author also notes, iMovie and iDVD (along with QT Player) seem to involve a much more physical metaphor. Siracusa argued that these brushed-aluminum, single-pane interfaces were rather independent of the principles of Aqua, but I think it should now be clear that, on the contrary, Aqua very much moves the Mac OS toward an appliance model of applications - one "console"; a pane in which any documents appear (as opposed to separate windows for documents); documents from which trays slide out for frequently-used controls; and with little required use of any control elements that aren't physically attached to the main console (remember, "Favorites" and "Bookmarks" can just as easily go into a fixed-location menu).

    It's not just that the interface appears more realistic - sure, Aqua has some powerful graphics processing behind it, can do all sorts of fancy tricks and can present more detailed UI elements. But that could have been done with the traditional elements of the Mac OS. What is striking here, and what we have seen with QT Player, iMovie and iDVD, is that Apple is moving toward a vision of appliance-like applications that are perceived and used as physical objects, with little contact with the rest of the OS UI. These applications don't spawn window after window across your screen but instead appear to contain almost all the interaction in one physical object. (Also consider that this single-pane mode was, until recently, how the OS X Finder was to operate as well.)

    Components everywhere, OS nowhere

    Now in a sense, this conclusion is not news, as many folks had similar speculations following the release of QT Player and the similar-looking Sherlock. But with Steve proclaiming that the "Digital Hub" is the strategic direction of the Mac, these interface changes make much more sense. While Mac pros may be comfortable with a gazillion windows and palettes sprayed across their screen, regular folks who just want to edit their home movies, rip some CDs or send some email don't want or need such complexity. What they require instead is an interface that is simple, intuitive and much like the kind of things they interact with in the real world. In the Digital Hub model, Apple is not appealing to people who are necessarily computer savvy or who want to learn a new metaphor, even one as seemingly intuitive as a "desktop." Heck, Steve pretty much spilled the beans at the MWSF keynote when he said that the Finder itself will in principle be replaceable by other, simpler ways of interacting with the OS, ways that didn't involve the necessity of full-blown file access - and an e-mail application was explicitly mentioned. (Now which OS X app from Apple uses drawers and a single-pane?) In other words, a user wouldn't interact with the OS, but would interact with various stand-alone apps. The OS would become invisible, and the apps would be all the user sees.


    ...

    If this doesn't make sense, reread those last two sentences.
  • > Where's the step where you entered it into a text document

    The > is an output redirector. It redirects the text from the display to a file or other device.

    > I can do a listing like you did simply by double-clicking the mouse. No keyboard needed.

    In a race between a command-based OS and a GUI for tasks like a directory listing, a reasonably fast typist at the command line will win. How about copying a file (blah.txt, let's call it) from a 5-layer deep directory to another 6-layer deep directory? First you have to open 5 directories, hunting for the one you need each time, open the other six, and drag the file to the other directory. That vs.:

    cp /usr/docs/misc/mine/stuff/blah.txt /home/me/blah/misc/docs/bio

    Simple. Quick. No messy windows.
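    For the curious, both points above (the `>` redirector and the one-line deep-directory copy) can be tried end-to-end in any Unix shell; the paths below are throwaway stand-ins for the directories in the example:

```shell
# Build a scratch copy of the deep directory trees from the example above.
mkdir -p /tmp/clidemo/usr/docs/misc/mine/stuff
mkdir -p /tmp/clidemo/home/me/blah/misc/docs/bio
echo "some notes" > /tmp/clidemo/usr/docs/misc/mine/stuff/blah.txt

# The '>' redirector: send the listing to a file instead of the screen.
ls /tmp/clidemo > /tmp/clidemo/dirlist.txt

# One command copies the file across both deep trees -- no window-hunting.
cp /tmp/clidemo/usr/docs/misc/mine/stuff/blah.txt /tmp/clidemo/home/me/blah/misc/docs/bio/
```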
  • so you're saying it's something like:

    you buy a wacom tablet and photoshop and illustrator. Then whenever you press the pen to the wacom's setup field, it asks which graphic program? you pick, and off you go. You don't have to worry about photoshop files overwriting other files either since they're all secretly .psd or something.

    you buy a card reader with some card reading software and it just works when you want to log in.
    I do think this is great; but it's VERY similar to the idea of the start menu (which has no real innovations, though). Basically, I know users who aren't aware that you can browse for files outside of the app you made them in, because they really don't know what the OS can do for them. -Daniel

  • And under OS 9 the function keys can be mapped to launch applications. Ta-freaking-da. And, since a compiled AppleScript is an application, then you can... oh nevermind. Raskin's target audience thinks script is what they read out over the phone while at work :-)
  • A major point as well... if you consider command line completion I can move that file *TONS* faster than someone clicking through folders. That's why enabling Command-Line Completion is a necessity for Win2k (1 minute registry hack, very cool). Bash still does a better job, but command line completion in the W2k command shell makes it that much closer to the Real Thing and I can Get Stuff Done so much faster.
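    The one-minute registry hack referred to above is, as best I recall, a single value; verify the exact path on your own system before relying on it:

```shell
# Windows 2000: make Tab (ASCII 9) the filename-completion key in cmd.exe.
# reg.exe shipped in the Win2k Resource Kit (built into later Windows versions).
reg add "HKCU\Software\Microsoft\Command Processor" /v CompletionChar /t REG_DWORD /d 9 /f
```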
  • IMO He's VERY confused because he associates the operating system/device drivers with the behaviour of the UI. The two are so completely at a different level that it makes my nose bleed just thinking about it. Still there is a connection there, and he has a point, but I'll see if I can clarify it.

    The point is that of configuration control over an operating system and installed applications- i.e. interdependency checking both between applications, but also between applications and device drivers and applications and devices and drivers, even between kernels and applications.

    e.g. I would like to install say, a wordprocessor. Let's say that the wordprocessor is only supported on a 2.4.2 kernel with a particular 3D accelerator device driver. In principle I should just be able to click on a web page and it should download and run without ANY further intervention. However if this would introduce an incompatibility between ANOTHER app and the kernel or some device driver, my system should tell me and let me decide whether I really want to do this.

    Another example how dependency checking can be made better: suppose I am running, say, bind, and it turns out that bind has a security issue. The security issue may not be publicised, but a new patched copy is available- the system should automatically find it and ask if I wish to install it. (Yes, I know that Microsoft already does that kind of thing, Linux needs it too! Only we'll do it better ;-)
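    A toy sketch of the interdependency check being described, with every name and version made up for illustration, might look like:

```shell
# Pretend we are about to install a word processor that demands a
# specific kernel; warn instead of silently breaking the system.
required_kernel="2.4.2"
running_kernel="2.4.2"          # in real life: running_kernel=$(uname -r)

if [ "$required_kernel" = "$running_kernel" ]; then
    echo "OK to install"
else
    echo "WARNING: app wants kernel $required_kernel but you are running $running_kernel"
fi
```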
  • There is a reason computers don't do this. I might want to write an email, a web page, a weblog entry, a shell script, a letter, a resume, a book, a technical article, a C++ program, a recipe, a garage sale flyer, a get-well card, or I might want to type a phrase to "logoify" in an image editor. Does WordPerfect, or MS Word, or emacs, or notepad, or LyX open up? What if I'm posting on Slashdot, and I suddenly get an idea for something to put into a letter I'm writing, and I start typing. Do my words go in this "comment" field or does my wordprocessor magically open up?

    There's a reason computers don't think for us, they can't. Even if they could, it's still our thoughts that they're there to filter. I wouldn't trust a secretary to figure out what to do with my words if I just started talking without any context. How the hell is a computer supposed to do any better?

  • Fine. Where's the step where you entered it into a text document? That's what the original poster used as an example.

    It's already done. The "> dirlist.txt" portion of that command line created the text file with the needed info. No muss, no fuss.

    Folders are confusing? Do you find drawers and closets confusing also?

    (ignoring the flamebait portion of that comment) Folder structures rarely confuse me, but you'd be amazed at how often Joe User gets a deer-in-headlights look when viewing directory trees. Ever view a directory tree in Windows Explorer? How about one that's about 10 levels deep? Go ahead, expand a few directory trees (make sure you're in 1280x1024 resolution first) and select a folder. Now, click somewhere on the background in the left hand side of that window. Notice that the folder you are in is no longer highlighted. Now, grab a friend and have him glance at the screen for just a second or two and see if he can tell you the name of the folder/directory you are viewing the contents of.

    Ever drag and drop a folder into another folder by accident? CAD operators at my work do it all the time. Then they come whining to us that some job number "isn't on the server anymore." It's there, but one simple misplaced click/drag has now rendered it unfindable by that user, and all the other CAD people here. So we (IS people) type a simple command line to find the job in question, and type another to put it back where it goes. Total time spent: 10-20 seconds. Now, as a test, I go to an NT box and click Start|Find|Files or Folders, enter a known good folder name several layers deep, and click find. Now I move that folder to where it goes by dragging and dropping it. Total time spent: 45-60 seconds after 5 tries, according to my trusty Bulova. And I'm purposely doing this as fast as possible. That makes this common task quicker by a factor of about 3-4 when I'm telnetted in to our Linux server, or sitting in front of it. Same goes for *BSD, Solaris, AIX, etc. Sure, a time savings of 20-50 seconds doesn't sound like much. But multiply that by the number of times things like this have to be done (to repair the damage done by people misusing click/drag/drop interfaces, I might add), and the company I work for saves several hundred to several thousand dollars a year because I, a lowly technician, know a few simple command lines, which take maybe 60 seconds to learn.

    Now, ask yourself if you ever type "mv ./001076 010064" by accident. No? Seems to me that making user errors less likely and less destructive also has value...
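    The ten-second fix described above is roughly this (a sketch on a throwaway directory tree; real server paths and job numbers would differ):

```shell
# Simulate a job folder (001076) accidentally dragged inside another job (010064).
mkdir -p /tmp/server/jobs/010064/001076

# Step 1: one command finds where the "missing" folder actually went.
find /tmp/server/jobs -type d -name 001076

# Step 2: one more command puts it back where it belongs.
mv /tmp/server/jobs/010064/001076 /tmp/server/jobs/001076
```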
  • I have an awesome Atari 800 sitting here on my desk, and that thing of course has an OS, but it has something else too, cartridges!!! We should bring this awesome stuff back. We need these because they are awesome looking. Forget about beige computers or even bondi blue! I want a brown computer like the Atari 400 that has a membrane keyboard that sounds like the top of a snapple cap when you type. Just stick in the cartridge of the app you want to use. And you nay-sayers out there will ask "How are you supposed to multitask?" Well, the Atari 800 has 2 cartridge slots! blade I want a MP G4 Atari 8000
  • by TheInternet ( 35082 ) on Thursday February 01, 2001 @01:00PM (#463320) Homepage Journal
    I think Jef's actually correct -- the OS does tend to get in the way too much for the average user. But I don't think we're near the point yet where we can just ditch the OS metaphor on PCs. This stuff is still evolving too rapidly. Attempting to box it in before it's had a chance to mature will stunt its growth.

    Criticizing OSX because it is an OS is rather pointless, in my opinion. OSX is what Mac users (and arguably, the industry as a whole) need today. In ten years, the world may look more like Jef's view of it, but there's still a lot more work to do. Appliances will probably become more like PCs, and PCs more like appliances, until we find some sort of happy medium that works for most people.

    - Scott
    --
    Scott Stevenson
    WildTofu [wildtofu.com]
  • IMO having an OS facilitates ease of use by providing all that "common interface" jazz we like so much. If my computer's look and feel changed completely as I changed apps, I don't think I'd consider it nearly so friendly. Without a base OS, vendors' ideas might differ so much from one to another that switching from one app to another could be as drastic as moving b/t my Linux/Winders machines/partitions. I don't like the idea of that.

  • Or writing code, or an email. How do you start a CAD program, hit the special T-square button on the keyboard?

    The concept seems pretty ludicrous to me. I agree that an OS should be as simple as possible. It should mask most of what it is doing, yet it should have the ability to allow the user to get under the hood, and provide a _basic_ loosely defined common interface for application developers.
  • He's almost home. The part he's missing is that a "general purpose computer" has to do too many different things to easily do what he's preaching (be totally invisible.)

    OTOH appliances, being tailored to one task, do exactly that. Can any consumer name the OS in a PlayStation? No. Why not? Because it's unimportant to them in the same way the part number of the graphics processor inside of the box is. They just put their game in and play.

    Similarly, a word processing appliance would word process just by turning it on and maybe putting a 'word processor disk' in it.

    What has held such appliances back up til now has been the expense of the screen. You could not afford to offer a device with a large enough, high enough resolution screen to be a serious word processing tool. With the drop in prices of both SVGA monitors and flat screens, though, this is fast becoming a non-issue. The only remaining issue is finding a model that encourages developers to write as full-featured apps for such a box as they do now for a PC.

  • Many of you might not like Raskin's ideas. But many of you have been using such a beast for years: it's called Palm OS. Many of the principles that Raskin describes in his book "The Humane Interface" already exist in Palm OS, and a lot of the guys posting negatives about Jef's remarks certainly have no problem using their Palm. You press the power button, it instantly turns on. You press the phone button, you instantly get addresses. Press the memo button, you instantly get memos. The storage of the user's data is transparent, so the only navigation a user really has to do is tap an application icon. Okay, maybe there's a few drawbacks to Palm OS (like being limited to 2048 bytes of stack space), but the Palm OS interface allows its users to use applications in a highly efficient and intuitive manner. Jef sounds crazy when he talks about his humane interface, but then again, people thought he was crazy 20 years ago when he started the Macintosh project.
  • Quark is more difficult to use than notepad because it can do more!

    Maya is more difficult to use than Bryce because it can do more!

    Myth is more difficult to use than minesweeper because it can do more!

    An HP scientific calculator is more difficult to use than a throwaway calculator because it can do more!

    A helicopter is more difficult to use than a car because it can do more!

    Linux is more difficult to use than Windows because it can do more!


  • If that sounds complicated to you, then stay far away from all electronics. You are against GUIs? I suppose you use Lynx as your only browser? I guess when you do any graphic work, you just do it from the command line? ASCII art does rock, I'll admit. And if you do any sort of system management work, you just look at a bunch of scrolling text all day, instead of a simple map with some green/red lights? The CLI has its place, but as a window in a GUI, not as a primary interface.
  • I can't really say I agree or disagree with Raskin. I do think he makes a good point about how a normal user shouldn't have to deal with the irrelevant details of the OS; however, I'm not quite sure if I follow or agree with his method to address the issue.

    I was always under the assumption that a well designed OS should be intuitively layered in a way that allows different users and developers to take advantage of the services it provides in different ways.

    This approach has always allowed a third party to address an issue with the kernel or operating system and design a solution without having to break the underlying operating system, which usually results in an operating system that is designed around components which are the best of their breed (a Darwinian approach).

    I think as long as we layer the underlying technologies (kernel, filesystems, drivers, etc.) elegantly, the rest should eventually fall into place.

    Thanks Darwin!
  • by pjrc ( 134994 ) <paul@pjrc.com> on Thursday February 01, 2001 @03:04PM (#463337) Homepage Journal
    If RAM Storage was cheaper ... and all apps could be in RAM all the time, and we could do things like ... instantly-on in the Word Processor, or instantly-on in the Web Browser. But RAM is still WAY too costly, compared to Disk, so it ain't gonna happen.

    <stepping onto soapbox>

    At the rate software "technology" is going, it will never happen, as word processors and browsers keep growing in their memory consumption, at about the same rate as the prices decrease.

    Consider, if you will, running Netscape 1.1 and MS Word 4.0 (admittedly only on the Mac). Netscape 1.1 ran on PCs with 8 megs of RAM, perhaps better than today's 4.x and 6.0 versions, and MS Word 4.0 worked quite well with about 1.5 megs of RAM allocated to it. These apps were about as responsive, perhaps better in many ways (on 486/68040 CPUs), as today's versions. It's amazing that today's word processors and browsers aren't any faster (often slower) and exceed the computer's memory capacity, despite a 20 to 40 fold increase in CPU speed and 6 to 10 fold increase in available memory.

    <stepping off soapbox now...>

  • Chroma is right.

    We all take the drudgery of 'file, window, and application management' as an acceptable practice.

    I'm old enough to remember line based editing. At the time it didn't bother us because we knew of nothing else.
    -----
  • You're not giving anyone more usability through this. You're giving people something close to PalmOS on a computer, which a few might like, but many would disapprove of. What happens when I want to have two spreadsheets open? Do I have two of my keyboard buttons allocated now, or is this even possible? Multitasking on a user level gets thrown out the window with a system like this, and that's a loss in functionality.

    And at a deeper level, you're really throwing away the idea of user expertise. This is the thing that bothers me about a lot of the ideas presented by MacOS developers. They seem to think that it's worthwhile to spend a huge amount of effort to develop systems to make computers easy to use, when a few hours of training will do the job. It's not as though learning to push the start button, going to programs, MS Office, Excel is really so complex that somebody can't pick it up with a bit of training. Most people who use a computer enough to want something more than a single-purpose appliance are going to wind up spending a lot of time using it. It isn't unreasonable to expect them to spend a few minutes learning the basics.

    Besides, I've got an even more radical idea. Instead of pushing a specific button to get the program you want, we'll have a special area of the screen. When you select it, you can just type the name of the program you want to run, and the OS will pop that program right up. I think I'll call it a command line interface.

  • There are two sorts of systems that this discussion identifies:
    • Embedded special purpose systems

      Such as an "address book," an "email system," a "web browser," a "word processor" and such.

      There do exist appliances of these various sorts.

    • General purpose computer systems

      Unix is the "granddaddy" of this sort of thing; with the "duct tape" of scripting languages, you can readily hook together a Unix box to do a vast assortment of different sorts of stuff.

    There is a place for both approaches to computing, and the growth of PDAs may be suggestive of a way of "melding" them in a useful way.

    PDAs like the PalmComputing platform provide somewhat "dumbed down" interfaces, and are nevertheless useful. Due to limited screen, memory, and storage, they are largely kept to more "appliance-like" applications.

    The long term might well move towards having homes that use a paradigm somewhat reminiscent of this, with a "server" in a back room that provides Internet access, storage of documents, and a repository for "scheduled stuff." It would be entirely sensible for this to be something like a Unix box.

    There would then be "satellite" systems around the house, including:

    • PDAs to provide access to the data you want to carry around. Calendars, address books, perhaps MP3s, portable books, ...
    • There might be several "workstations" around, with larger screens, keyboards, mice, and the likes.

      These would be well-suited to "document processing."

    • "Family" room (or "bachelor's den" :-)) would logically have a large screen, speakers, and controllers, with "appliances" suited to displaying/playing audio and video recordings and supporting interaction [e.g. - video games].
    • The "Killer App" in the kitchen would be a touch-sensitive PDA stuck to the refrigerator that would primarily display a combination of ToDo lists [groceries, anyone?] and calendar information.
    • A telephone with integrated address book and calendar would doubtless be a good thing.

    Virtually all of these could be implemented as "general purpose computers," but you're then left with the job of having to manage all the computer systems.

    It would be rather more attractive for most of these to instead be "appliances" that connect to a central "home server."

    Various of them make more sense if you integrate a PDA into the appliance, and have a wireless local connection so that they can get at the data on a local server.

    I would think it a slick idea to have a PDA running Linux, but that doesn't mean that I want to use a stylus to write in cd ~/addressbook; grep -i browne * | grep -i david | phonehome to dial my brother's phone number. The merit of running Linux would be that of having a well-understood robust portable platform for the developer. Given those things, to make life easy for developers, I'd be more than happy to have hardware where I press a couple of buttons to search for names in an address book.

    I would think it a suboptimal thing to just use a bunch of completely independent appliances, as with "MailStations" and such; the step forward is to have the appliances, and have a way for them to interoperate usefully with a "home server."

    1. Open the command line term program that comes with OSX
    2. Do whatever you'd do at the command line

    Apple is doing everyone a service by giving us the best of both worlds. I just wish the new Finder was more of a shell; I wish I could choose to boot into BSD.

    MyopicProwls

  • by tswinzig ( 210999 ) on Thursday February 01, 2001 @01:55PM (#463348) Journal
    As for those who say that Internet-distributed apps via Mozilla-XUL or MS-.NET are the future, you are omitting an important human element: Territory. My workstation is my territory; I want to control its config to suit my tastes, I want to determine its design tradeoffs (e.g. speed v. portability), etc. I would not be comfortable with getting all my apps via the Net no matter the speed, for it would be just as weird as living in barracks and getting my toiletries by ration every morning.

    It's ironic that you accuse Raskin of having "A Limited Vision" when yours is just as limited!

    Why not wait and see what it's like using these distributed types of applications before slamming them? To me, being able to have my desktop and all programs available from ANY WEB BROWSING DEVICE is unbelievably cool. It will probably take 2-3 years for the speed of the net and the quality of these types of applications to become really satisfactory, but have some patience, and a little "Vision," why don't you?

    -thomas
  • The race analogy only holds true if you *remember* the directory paths. Luckily the auto-completion makes that easier, but for the most part you'd better have few directories or a great memory. Especially in dealing with files you haven't touched in a while. I sure find myself typing 'ls' a lot...

    FWIW, the "folder view" (for lack of a better description) on MacOS and Win usually makes for a lot fewer "messy" windows and a much faster file manipulation.
  • In the *real world* companies and even Open Source projects are going to create applications that use their own metaphors for movement, action, and so on. Currently, the OS is the only thing keeping interfaces even remotely consistent.

    Yes, but that's just by default. The OS currently provides consistency, but that doesn't mean consistency has to disappear with the OS crutch gone. Developers just have to get organized. Besides, you can still have a platform owner's settings. Their ability to do that is not determined by whether they create a file system manager app or not.

    - Scott

    --
    Scott Stevenson
    WildTofu [wildtofu.com]
  • All my apps are already available from any web browser in the world. It's called VNC, and the VNC java applet viewer. Without the general-purpose operating system, this wouldn't be possible.

    VNC performance is slow and response is sluggish, even on an ADSL or cable connection. This is because the processing is all done on the server end. Distributed technology like .NET puts the program on the client side for fast response; it's just that the program is lightweight and stored on a remote server (along with the data, presumably).

    VNC also requires that you have a VNC client, which, although ported to many OSes, is not as ubiquitous as a web browser.

    Oh, and I've been doing this since 1998 on backwards Unix systems.

    Big whoop, I've been doing the same thing on Windows computers with PCAnywhere since 1996 or 97... that's not the point.


  • Improved a bit, but hey, consumers are already used to this. CD-players, VCR's, DVD-players, Sega, Nintendo, etc. etc. Put what you want to use in, and power up.

    All he's suggesting is refining it a bit. Updating it.

    Neat idea, and there's probably a market for it. The fancier "console" computers would probably have several slots, and allow you to switch between the applications. The starter model, or portable, takes one cartridge. The fancier ones have 8, with some sort of mega-external-chained cartridge handler. Think of an external unit with 32 cartridges in it.

    The guy's got a point. We want applications, not operating systems.
  • Remember the first generation of slightly graphical OS-less apps? For me that would be a throwback to my old Apple IIgs. Sure, it was cool to have the graphical apps. But my music app was drastically different from my paint app, which was also completely different from my word-processing app. Each application did things completely differently because there was no concept of a baseline similarity. There was no reason for a baseline similarity, because each app was in effect its own OS. That's the part that truly baffles me when someone starts babbling about getting the user "closer to the app" and eliminating the "need" for an OS. An OS becomes important when you start to talk multi-tasking, or similarity between apps. If a user wants to use one thing at a time, then go back to the one-app-per-disk type of thing (can you imagine an application fitting on a single floppy now? No, me neither.). I have a feeling that would create quite the ruckus. No one wants to step back that far.

    No, I don't think that OSes are obsolete. OSes are a necessity in today's "I want to do four-million things all at the same time" computing world. I can't imagine trying to train people to use apps that all looked and acted completely different again. The people using computers today have a hard enough time with the apps that are all identical looking (What do you mean the little floppy icon does the same thing here as it did there? This isn't the same program?!?)

    It's amazing how stupid someone "in the know" can become when they truly remove themselves from the common user long enough. Stick him on a help desk for a while, that should wake him back up. ;-)

  • Raskin is talking about a system that would be preconfigured to do exactly what the user wants to do, but he fails to mention, and possibly fails to consider, that such a system is nearly impossible to produce, simply because there are too many different kinds of user with too many different preferred modes of work.

    I could be wrong, but I don't think Raskin is suggesting that the OS disappear entirely, just that it's not the appropriate method for 75% of the populace to be dealing with their computers. The more technically-minded will always want to have more control of their machine.

    - Scott
    --
    Scott Stevenson
    WildTofu [wildtofu.com]
  • The original poster picked a stupid example of something that's easier through the command line than in Finder-like GUIs. I've got two examples for you that aren't quite as easy but should be.

    Example 1
    You have a directory that contains two types of files, for this example, I'll say HTML files with a .html extension and GIF files with a .GIF extension. You want to select all the .GIF files and move them into a folder called "pictures". How do you do that on all the Finder-clones I've seen? Select each file individually holding down CTRL (I think, maybe it's shift) to select multiple files and then drag the selected group into the pictures folder. If the folder contains hundreds of files, that's no small task!

    Finder-like apps should have a way to select files based on patterns AND based on MIME types. Current ones really don't. The closest I've ever seen are various "Find File" utilities, which do the job, but are often logically separate from the App. I should probably mention the last version of MacOS I've ever used was somewhere around 6 or 7, so that feature may have been added. (Depending on the cost of G4 Cubes when MacOS X comes out though, I may pick up a Mac...)
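    For what it's worth, the very shell the article calls backwards makes Example 1 a one-liner. A minimal sketch, using a throwaway directory and made-up file names for illustration:

```shell
# Set up a scratch directory with mixed file types.
mkdir -p demo/pictures
touch demo/index.html demo/about.html demo/logo.gif demo/photo.gif

# Select every .gif by pattern and move the lot in one step --
# no per-file clicking required.
mv demo/*.gif demo/pictures/
```

    A Finder-style "Filter Files" box could expose exactly this glob-style selection without making the user learn a shell.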

    Example 2
    This one is a much better example, since I know for a fact that it cannot be done easily under Windows Explorer. (The above is weak since "Find File" is an option under Windows and probably MacOS. It's just not quite as obvious a choice to make as "Filter Files" might be...)

    You have a directory of files with a ".mpa" extension. The extension should be ".mp3". You want to change all the files in one operation. As far as I know, impossible under not only Windows Explorer, but ALSO under UNIX shells! (Scripting as always does not count, only basic shell commands.)

    This is downright trivial in DOS, though: ren *.mpa *.mp3. This type of task should be easily accomplished elsewhere too - but it's not - in most cases, it involves individually changing each of the file names!
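    To be fair to UNIX, the rename is short there too, though it does lean on the shell's loop syntax (which arguably counts as the "scripting" the poster rules out). A sketch, with made-up file names:

```shell
# Make some sample files with the wrong extension.
touch song1.mpa song2.mpa

# Rename every *.mpa to *.mp3 in one loop:
# ${f%.mpa} strips the old suffix so the new one can be appended.
for f in *.mpa; do
    mv "$f" "${f%.mpa}.mp3"
done
```

    The point stands either way: a GUI file manager could offer this as a single "rename by pattern" operation, and most don't.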

    Those are just two examples that I've thought of (or encountered... Example 2 is something I've actually encountered, luckily on a DOS machine...) that demonstrate lack of functionality in all the Finder-clones I've seen. (Not sure about Nautilus, it STILL won't run on my machine.)

  • Gee, I was going to leave the toiletries rationing analogy alone until I saw you worshipping it like that...

    I would not be comfortable with getting all my apps via the Net no matter the speed, for it would be just as weird as living in barracks and getting my toiletries by ration every morning.

    How exactly does this make sense? Living in barracks versus being able to access your programs and data from ANYWHERE that has a net-connected browser. Are they even close? Nope.

    And getting toiletry rations every morning versus being able to access your programs, any time of day, anywhere in the world, for as long as you like.

    Just because Microsoft has one business model for distributed server-based applications doesn't mean it's the only model available.
  • (Breaking own rule to ignore Anonymous Coward vermin)

    Actually I was there. Just as importantly I worked in a computer museum and did this stuff all day.

    What I wrote about refers primarily to PC OS evolution (& thus primarily Win & lesser extent Mac) but it's accurate as far as it's relevant.

    Got a point or just a snipe?

  • by MadAhab ( 40080 ) <slasher@nospam.ahab.com> on Thursday February 01, 2001 @01:13PM (#463382) Homepage Journal
    Your point is well made, but the dichotomy of expectations between you and those users illustrates perfectly why this article has a point - and where it falls short.

    The difference is that you are using linguistic constructs to effect an action; UNIX command shells have a syntax approaching that of a simple yet powerful language. On the other hand, most people deal with computers through mechanical constructs, e.g. "press this button", and the GUI/OS edifice merely serves to confuse them, especially those who can't map operating system constructs (files, directories) onto internal representations of physical objects. They simply press a "download this" button and don't understand what happened to the file. They expect a concrete, finite set of knobs and tools that produce a finite, limited set of results.

    So for those users, the idea of a magic box with magic buttons that just do what they want it to do is in fact what they really want. I fully expect that for this reason, we will see a simplified PC where applications are no more complex (or piratable) than a springboard module for the Visor. Such a device will be hugely popular; it also avoids the mistakes of the "network appliance" or .NET models. People expect and want concreteness and physical availability.

    These devices are more likely to run linux in some embedded sense than anything else. It's a perfect toolkit to produce these machines, eventually.

    But for those of us who use these devices for any reason outside the 90% that most people do - cutting-edge gamers, programmers, people who work with databases, etc. - the flexibility of a multi-purpose device is paramount. It must do magic, and we will learn the proper incantations. The PC will not die, but it will become ossified and optimized into such devices to such a degree that most will no longer need to be aware of its inner workings - and in my experience, they close their eyes to it already.

    Boss of nothin. Big deal.
    Son, go get daddy's hard plastic eyes.

  • It sounds like he says there should be an OS to control switching around the apps, but it shouldn't get in the way.

    So, he wants you to be able to power-on right to an app, like a Palm does - ie faster bootup, and he wants the illusion you are using one app - so maximize the window and remove the title bar.

    Problem solved.
  • I think the article really doesn't track well at all. In my mind, on the Palm you most definitely have an OS - after all, you have to launch apps from somewhere and you have only so many buttons!

    What the article seems to miss is that as screen size decreases, to make things usable the OS should give way to the app, to present as much working space as possible to the user. Even there the PalmOS does not give up everything, as some screens, like the app launcher and other system-configuration screens, let the OS have the whole screen.

    As screen size grows, you gain the ability to have multiple apps working at once and with that, enough screen size so that the computer can help manage the use of multiple apps.

    What I think he's really arguing for but doesn't realize are the following:

    * Interface that provides for sub-second loading times of ANY application.

    * Easier ways to focus more on one app but still make others accessible.

    For the second item, what I think I'd like to see, is a way to keep a focused app full screen. Then with a chording action or possibly a toggle key be able to see all my other apps displayed on top of that.

    Imagine writing a letter, then pressing a special "Shift key", seeing a translucent spreadsheet pop to the foreground, copy a bit of text and when you let go have it pop right into where you were in the document. That seems like it would be more efficient to me than current window management interfaces, and is sort of what virtual rooms try and do for you.

    In fact, I think I'd like to be able to define groups of apps as belonging together and be able to cycle amongst them in that manner.

  • Why do you get to decide what kind of interface I have to work with? Most of the user interfaces out there suck.

    However, I don't agree with Raskin, either. An OS *can* let users change their interface and still give applications a good API abstraction that works with most user interfaces. Too many (e.g. Windows) don't do the right thing. The Mac isn't very great for that, either.

  • Raskin has a clear vision, and it has decisively failed [landsnail.com] in the marketplace. Several times. The Canon Cat was a flop. It's the ultimate typewriter. But that's all it is. And that's the problem.

    Another try in that direction was the Xerox Star. Imagine a machine with a great mouse-oriented user interface that runs a word processing/spreadsheet/mail/database app and nothing else. That's the Star, Xerox PARC's attempt to build the ultimate office computer. Clobbered by the far-dumber but more flexible IBM PC.

    That said, the UNIX interface is a holdover from the teletype era. Command-line interfaces have been done far better. X is a horrible approach for a local GUI. And the text-file approach to system administration needs to be trashed, not papered over.

    Ease of use probably peaked around the end of the 68K Mac era. By then, the Mac had been perfected. It finally had enough engine behind it to be useful, but the apps hadn't yet been run over by Microsoft. If Apple had stayed with the 68K, and pushed Motorola into improving that product line, it might have been a win. But they got sucked into the IBM PowerPC deal, which basically cost them two years and their lead over Microsoft. (There's also the fact that Apple botched about five new OS projects in a row. The jury is still out on the latest try.)

  • by chancycat ( 104884 ) on Thursday February 01, 2001 @12:13PM (#463398) Journal
    I like it that Apple "went backwards" just a bit. Think about it: For years they have had the GUI part down. It's slick, it's great. But the muck underneath has been getting worse over the years. The amount of legacy [read: crufty non-elegant stuff] code in today's OS 9x is still far too much. OS X is exactly the mix of GUI and hard-core "from a main-frame" heritage foundation we need.

    I look forward to watching the arguments between folks who think OS X is better because of its ease of use vs. those who love it because it is BSD underneath.

  • Just how often does a painter immersed in the creative act stop to think about minutiae of the paintbrush? Or worse still, get interrupted by the paintbrush


    Well, since I happen to be a painter, the answer would have to be: every time I work on anything of value. I demand the most out of my tools -- I use them intimately. I stretch their possibilities as far as I can. And for that purpose, I must be completely aware of every physical aspect of their functioning.

    Example: A brush that is 1/2" thick is better at drawing a 1mm line than any tiny brush. How? You just use one bristle. If I got handed a 1mm brush, even if that was the tool I was *supposed* to use for the job, I would be quite upset.

    The Mac fails because it tries to mask its nature from you. For certain problems it is very user-unfriendly. You have trouble with the OS, and you have no information and no hope of isolating it (without the aid of serious Kung Fu). This makes it completely unsuitable for those who demand the most out of their computers. (Ever tried to code on a Mac? No wonder they are dependent on MS for application support....)

  • Subjective- but it's no worse than pcanywhere, IMNSHO

    Who cares? I wasn't comparing it to pcanywhere, I was comparing it to the ".NET"-style distributed computing idea!

    .NET distributed programs wouldn't allow you to run a unix program on windows, or a macos one in linux...

    Yes, and your point is?

    VNC does not require you to have a VNC client, you can use a web browser, if the server is set up to let you do this.

    True, I forgot about the Java client until after I posted (same with pcanywhere). However, with VNC you need a server (maintained by you) to run your programs. So yes, you can do something similar to distributed applications, with the other caveats I mentioned (namely, speed and response time problems).

    PCanywhere costs money, vnc is free. don't you know that /.ers lynch anyone advocating the use of software you have to pay for?

    Christ almighty I wasn't trying to compare pcanywhere with vnc! The original poster tried to make it sound like "Yeehaw, we've been doing this on UNIX for years." I merely countered Windows has had an equivalent for a long time, as well.

    I don't know which sponge-for-brains modded you up, maybe one of your "friends"?

    Thanks for accusing me of cheating the system, but actually my posts start at "2" because my karma is above 30 (or whatever the cutoff point is)... sheesh...

    -thomas
  • You need the OS to provide a solid base. If all applications had near-total control, think of how often you would crash. I don't see how this could be secure. Does this mean the meat of a program would have to be rewritten from program to program instead of using an API-type interface to the OS?
  • Here's how I did it: 1. Type "ls > dirlist.txt" 2. Press Enter. I now have a file called dirlist.txt containing a list of files in the current directory.
    Fine. Where's the step where you entered it into a text document? That's what the original poster used as an example.

    I didn't have to find some hotkey combination, didn't have to move my hand from mouse to keyboard and back, didn't have to maneuver through often-confusing "folders".
    I can do a listing like you did simply by double-clicking the mouse. No keyboard needed.

    Folders are confusing? Do you find drawers and closets confusing also?
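    For completeness, the step the reply asks about is just one more redirection: append the listing straight into the document being written. A sketch, with a made-up document name:

```shell
# An existing text document plus some source files.
printf 'Files in this project:\n' > notes.txt
touch alpha.c beta.c

# ">>" appends the directory listing into the document itself --
# no hotkey hunting, no copy and paste.
ls >> notes.txt
```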


    --

  • Excuse me. But isn't that the POINT of *nix operating systems? To give you as much control as possible over the application you're using?

    I fail to see how adding layer after idiot layer of abstraction brings a user "closer to his app".

    Then again. Maybe I'm the idiot.


    Chas - The one, the only.
    THANK GOD!!!

  • It seems to me that this guy has a point. We are approaching the final endgame in computing, that of 'One User, One App, One Internet', if you will. Thin Client terminals running an Internet browser will be all we need in 10 years, what with the increase in bandwidth that the internet is going through.

    This means that we will no longer require an OS. To use an OS on such a simple system is just an additional layer of complexity and a security risk. It's best just to run the browser on the metal, and eliminate these difficulties.

    All OS's will be redundant. No one will win the OS war. Whether you see this future as good or bad is up to you, but the corporations are thrusting it upon us already. I have seen the future, and it is .NET.

    You know exactly what to do-
    Your kiss, your fingers on my thigh-

  • I have; it's called "The Humane Interface". It turns out everything he's saying makes sense. Not only that, he's actually built a system that does some of these things (in a limited way) called the Canon Cat.

    He's not advocating an appliance or convergence device. In fact, he's advocating something this community should love: the GUI equivalent of a command-line interface. The basic idea is that there is only one interface, like a document, and in order to do things like spreadsheets or images or whatever, you use a simple command. These commands would be plug-ins, so the "application" isn't static and inflexible.

    His book fleshes it all out in better detail, and he deals with the problems of games, multiple documents, etc. It's all quite visionary. It's all probably a ways in the future too. Making a system so generic and transparent is contrary to the branding strategies of most companies. So is the idea of selling small "commands" instead of monolithic applications.

    P.S. The whole "extra keys" deal only means a few extra keys (load, save, undo, command, maybe one or two more), not 20 extra keys for each application. Plus he advocates getting rid of a lot of useless keys (function keys, ScrollLock, CapsLock, etc).

  • It seems like Apple *is* following Jef's basic philosophy. The user doesn't even have to know that the underlying OS in MacOS X is BSD, do they? The only thing that can get in the way of the user is the GUI environment, Aqua, which certainly isn't a throwback to the 70's.

    Cameron
  • I agree with the tenet of what he is saying; the problem is that he doesn't realize how big a statement he is making. An operating system should facilitate some method of interaction. However, it is important to remember that not everyone interacts with the computer via a GUI. And as creating a general-purpose GUI is very difficult, the only correct thing to do is to have a general-purpose command-prompt variant of the OS (e.g. Darwin for OS X) and a GUI portion (e.g. Aqua for OS X). He says the user needs an interface (GUI) to do what the user needs - yet to ONLY provide a GUI is a flawed concept, because not all users are alike. Some (if not all) of the most powerful command-line interfaces are exclusively UNIX based. I believe that Aqua is going to evolve into a very powerful general-purpose GUI. Both will try to give the user as much control over the system as he needs - it's not backwards - it's correct for the market they are going for.
  • I don't think the idea is to totally lose the OS, but just to get rid of the OS existing as an application. I think the use of the term OS in this article is misleading to *nix folks, tho. It sounds like they are really talking about the desktop/window manager.

    As I read it, you would still have the OS providing common services, including basic GUI elements. It sounds to me like the article is proposing the equivalent of KDE or Gnome, but without a desktop and panel. Instead you would start an app and it basically takes over and is the only thing you see until you switch to another app.

    Personally I would still take my PC as it is now, but I think the _vast_ majority of computer users (the ones who don't care about computers, but like to use Word and IE) would find this a better solution.
  • Weren't the early Macs like this as well? Sure, there was an OS of sorts, but it was just on one diskette along with a single program. MacWrite, MacDraw and MacOS 4 or something, all on 1.4MB.

    It wasn't even until MacOS 6 (someone catch this if it's wrong, my Mac Experience starts with 7.5.2 (which makes one wonder why I like them as much as I do (but I digress))) that the OS would even run more than one program via the MultiFinder.

    Perhaps, then, this explains the context of the argument: one of the framers of the original MacOS has seen his pure, one user, one task system turn into multi-user and multi-tasking (though not yet pre-emptive), along with all the clutter and complexity that comes with it.

  • All you need to use VNC is a browser with a Java VM and network access.

    No, that's not true. You also need access to a machine maintained by you or someone you know, with the programs you like to use installed.

    Look I use pcanywhere (just like VNC) all the time, I'm not dissing that type of computing. It rules! However, it's not really comparable to .NET-style distributed computing.

    -thomas
  • by q000921 ( 235076 ) on Thursday February 01, 2001 @06:17PM (#463428)
    In fact, computers "without an operating system" existed long before Raskin or Apple even appeared on the scene.

    One of the pioneers in this area was probably Smalltalk, which provided a tightly integrated set of applications and let applications easily share data. Data in Smalltalk is, in a sense, "self-describing", so it can be exchanged easily between different parts of the system. And because Smalltalk is a safe language, errors in one application would usually not kill another application. That made it possible to build fairly large systems of closely interacting parts.

    UNIX, of course, came more out of a mainframe tradition. For its applications, it made sense to isolate processes well from one another. And that's why programming languages for UNIX and mainframe systems do not have to be particularly safe (the operating system will prevent the worst disasters from happening), and because they can't easily exchange data, data doesn't have to be self-describing.

    Apple copied the look of the Smalltalk interfaces but tried to build them out of what amounts to mainframe languages without even the benefit of mainframe process protection. The result was a system that was quite unreliable, leading eventually to the adoption of memory protection. But I guess some people at Apple, like Raskin, eventually figured out their mistake.

    The yearning for a Smalltalk-like system is also expressed by de Icaza and Microsoft, who come up with all sorts of complex COM-like systems.

    The fact is that the industry is still in a state of confusion. Many programmers are too conservative to give up their mainframe-style tools, but they still want to build Smalltalk-like desktop systems. The result of using the wrong tool for the job is desktop software that ends up being both bloated and complex, and still lacking in integration and extensibility. This is, sadly, true for MacOS X as much as for Windows, Gnome, and KDE.

  • Don't you like fscking Google and Slashdot better than stupid Flash sites that get you lost and take away your back button? What many UI poo-pooers don't understand is that what UI dorks (like Nielsen, but yeah, he's WAY over the top) like Raskin and, hell, me are into is the idea that computers are here to ENABLE TASKS to be done MORE productively.

    Sure, we of /. dig linux. But if someone offered me a box and a RedHat CD versus an iMac and i have to write a 50-page, formatted report by tomorrow, guess what? I'm gonna take the iMac!

    The old engineering adage of "Solution for problem" is still true, truer than ever in fact in the case of UI. Raskin simply advocates that the OS should NEVER be the focus of a user's attention. And that is undeniably true. the IDEAL computer (possible currently or not) is one that requires no thought to be put into the bullsh*t of a task---that's what the computer's job is!

    I agree with an earlier post that said the author of this article is clearly uninformed and has a major axe to grind. Raskin is not a bad guy, and, while many of his ideas haven't been applicable, his overarching concerns are exactly what user-oriented software development should be about: making the TASK the focus of the attention, not the DOING of the task.


    --
  • The typical computer user does not want to be a system administrator, programmer or engineer. They want a reliable and easy to use tool.

    They don't need to know about:

    • file systems
    • exe and dll files
    • disk partitions and drive letters
    • RAM, ROM, gigabytes and megahertz
    • BIOS setup
    • device drivers
    • IP addresses and DNS servers
    • config.sys, autoexec.bat or the registry.

    Lest you think that I am prejudiced, UNIX/Linux systems are also severely deficient in these areas.

    IBM had a good idea with their document centered user interface that was introduced in the OS/2 workplace shell. You didn't "run" the word processor, you manipulated documents and new documents were created by tearing a sheet off of a template pad.

    The user should never be forced to understand or deal with the details of how the system's abstractions are implemented in hardware and software.

  • Essentially, the article says that Raskin doesn't like MacOS X, MS Windows, or any other general-purpose operating system for that matter, because he thinks that computers should be pure appliances, relieving the user of having to worry about mundanities like file storage or program launching, rather than infinitely mutable environments. Raskin is a visionary, which is a good thing, but it means that he is concentrating on the future possibilities of ideal computer interfaces, while missing the more prosaic uses of technology today.

    But the real problem is that it's not clear how many of these difficulties can really be solved with an optimal human interface. As long as people want a machine that is capable of performing multiple functions (and the trend certainly seems to be more in the direction of increased, rather than decreased functionality) the choice of available capabilities is an essential difficulty. You can't get around it. You must present the user with an interface that allows him to choose what he wants to do, and if there are 100 choices you have to have a system that lets him pick from those choices efficiently.

    The same thing is true of documents, resources, web sites, etc. If I'm going to use a word processor, I have to be able to pick the document I'm going to work on. If that means choosing among thousands of documents, the computer has to have an efficient system for letting me sort through thousands of documents. You just aren't going to get around a need for some kind of filing system (even if it isn't hierarchical like most existing today) and a way to access it.

  • by msuzio ( 3104 ) on Thursday February 01, 2001 @12:15PM (#463441) Homepage
    I'm sorry, but unless the piece is grossly misrepresenting the point of view being put forth here, I have to disagree totally.

    An OS is *not* something that gets between a user and what they want to do. Instead, it's the tool that provides consistent services to both the user and the applications running on it.

    An OS provides:
    - device access
    - task management (multitasking)
    - one or more interfaces for the user (yes, I think interfaces are becoming a part of the OS. Live with it.)

    How would Raskin's ideas be implemented if *not* for an OS? How would a system be consistent and user-friendly without an OS+interface?

    I just can't see it.
  • It seems to me that this guy has a point. We are approaching the final endgame in computing, that of 'One User, One App, One Internet', if you will. Thin Client terminals running an Internet browser will be all we need in 10 years, what with the increase in bandwidth that the internet is going through. This means that we will no longer require an OS. To use an OS on such a simple system is just an additional layer of complexity and a security risk. It's best just to run the browser on the metal, and eliminate these difficulties.

    Maybe, maybe not. We've already gone through a couple of iterations of this. First, there were text terminals connected to the server (good ol' VT-100s). Then these got replaced with PCs or even SparcStations (as they did at my university). Then the PCs were swept away by X-terms. Then the X-terms were swept away by newer Sparc/Linux boxes... and so on.

    I think having a real computer (as opposed to your WebTV or browser-only system) is going to be here to stay. It just gives you a lot more flexibility over what amounts to be a dumb terminal.

  • by 2nd Post! ( 213333 ) <gundbear@pacbe l l .net> on Thursday February 01, 2001 @02:23PM (#463445) Homepage
    I think it's one of the goals of Apple, as per their digital lifestyle.

    Walk up to an Apple Cube+SE with Bluetooth and wireless firewire and, miraculously...

    It detects your PDA and starts synching
    It detects your MP3 player and starts up background processes to configure and transfer music
    It detects your cell phone/pager unit and starts updating information

    Then when you sit down to the OS, and start on a document, that application gains central focus. They tried this in OS X with the one application mode, but that sorta lost out to general opinion.

    Their view that the Finder is just an application into browsing and viewing the PC and network, and not the PC or network itself, is one step I think. It's a very strong bias into the shaping of what the user thinks the PC or network is, but it can be swapped out into an email program, so that the network appears to be email lists, users, websites, emails, notes, attachments, and local storage. Or switch it into a web browser, and the device starts to look like web pages, music, movies, external sites, local storage, and information.

    Does that sound right?

    Geek dating! [bunnyhop.com]
  • by Bistromat ( 209985 ) on Thursday February 01, 2001 @12:15PM (#463446)
    There are several good reasons to keep an OS in an 'application' style around:

    -Portability: not everyone makes the same hardware, and you want your app to run on as many systems as possible.

    -Security: without an OS, there is no security whatsoever, except that built into the application; though Windows is behind the curve in this case - it has no real security model and very little protection against a runaway app determined to trash the system. Most (all?) other operating systems provide protection, in the form of permissions, against poorly-written or exploitable apps.

    -Ease of programming: without an OS to provide an additional abstraction layer, programming must interface directly to the hardware, a nightmare of excess code that should only need to be written once.

    Tiny little embedded systems designed to serve only one purpose might be better off without a true OS. A complex piece of hardware will never operate without a full OS. Think of the complexity that goes into the linux kernel, and think about the fact that only one application could run at any given time without the OS to run them in separate virtual machines.

    --nick
  • Hm. After reading that article (and my post will still probably be #20 -- tells you how deep it was), it's not really about OSX at all. It quotes one very small phrase that Raskin said about OSX, and goes on to explain that he wasn't talking specifically about OSX very much at all.

    That said, this article seems like another piece of flamebait in the "UNIX is still cool/no it's not" war.

  • Installing programs under Windoze is a total fuckup because of all the DLLs and inevitable scores of data files that have to be installed along with the application itself.

    The fundamental problem with installing programs under windoze is the registry. Windows will never solve this problem as long as the registry exists. Sadly, the registry is so ingrained that it probably can't be gotten rid of.

    What's wrong with the registry?
    The registry tries to do too many things in one place. It contains system preferences, application preferences, system information, and only Satan and Bill Gates know what else. The problem is that it separates information critical to applications from the application and maintains the only copy of that information. DLLs have this problem, too, because they all get dumped into a common system directory. How does windows try to solve the problems created by storing DLLs in a central location? By maintaining reference counts of the DLLs inside the centralized registry! I want to cry.

    So, when you move or delete an application, the registry and DLL information, being separate, often do not get updated properly. Especially the shared DLL reference counts. I fear uninstalling on my windows machine more than I fear installing.
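    The refcount problem described above is easy to see in a toy model. This is purely an illustration (not real Windows code, and the class and method names are made up): the count of who uses a shared DLL lives in one central store, apart from the apps themselves, so anything that removes an app without going through the uninstaller leaves the count permanently wrong.

    ```python
    # Toy model of central shared-DLL reference counting going stale.
    # Names (CentralRegistry, install, uninstall) are invented for this sketch.

    class CentralRegistry:
        def __init__(self):
            self.dll_refcounts = {}

        def install(self, app, dlls):
            # The installer bumps the count for every shared DLL it ships.
            for dll in dlls:
                self.dll_refcounts[dll] = self.dll_refcounts.get(dll, 0) + 1

        def uninstall(self, app, dlls):
            # Only a well-behaved uninstaller ever decrements the count.
            for dll in dlls:
                self.dll_refcounts[dll] -= 1

    registry = CentralRegistry()
    registry.install("AppA", ["common.dll"])
    registry.install("AppB", ["common.dll"])

    # AppB's folder gets deleted by hand -- nothing tells the registry,
    # so common.dll still appears to have two users and can never be
    # safely cleaned up, even though only AppA remains.
    print(registry.dll_refcounts["common.dll"])
    ```

    The staleness is the whole point: the authoritative copy of the information is kept away from the thing it describes.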

    Mac OS X has a better solution
    The Mac OS breaks the kinds of information that go into the windows registry up into different places. File information is automatically read from the applications and consolidated into the desktop file. Delete an application? OS X detects that through the filesystem and updates the desktop file. Desktop file gets corrupt? Delete it; OS X will create a new one. (Just try deleting your registry file.)

    The power of this is that the OS maintains this information as an index to the real information that resides with the application, rather than trying to be the authority on that information by having the only copy itself.

    OS X also handles preferences better. Each application's preferences get put into a separate file, so you always know where to find an application's preferences. In addition, each application is required to be able to run even if its preferences are missing. So you can always delete the preference files; the application will simply make another one.

    Mac OS X also has an elegant solution to the DLL problem. Again, they use the same solution. You do NOT put your DLLs into a central spot. (try counting the DLLs in your Windows\System directory.) Instead, it builds an index of them and leaves them with the application. So, when you remove an application directory, you also remove its DLLs.

    The de-centralized indexing approach that Mac OS X uses is much better than the centralized authority approach that windows attempts. So, OS X tries not to be the program you have to hassle with before you get to hassle with the application and does try to put the emphasis on the applications as much as it can.
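    The per-app preferences pattern described above (one file per application, safe to delete, regenerated from defaults) is simple enough to sketch. This is a generic illustration in Python, not Apple's actual implementation; the file format and names here are assumptions:

    ```python
    import json
    from pathlib import Path

    # Hypothetical sketch of the per-app preference pattern: the app owns
    # exactly one prefs file, and a missing or corrupt file is never fatal --
    # the app just rewrites its defaults and carries on.

    DEFAULTS = {"font": "Geneva", "size": 12}

    def load_prefs(prefs_path: Path) -> dict:
        """Return the app's preferences, regenerating the file on any problem."""
        try:
            return json.loads(prefs_path.read_text())
        except (FileNotFoundError, json.JSONDecodeError):
            # Deleted or corrupt prefs: write defaults and continue running,
            # exactly the "just delete the preference file" recovery story.
            prefs_path.write_text(json.dumps(DEFAULTS))
            return dict(DEFAULTS)
    ```

    Contrast with the registry: here the recovery path is local to one app and one file, so no central store can ever get out of sync with it.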


  • He said a Powerbook RUNNING Platinum, which is another way of saying "the current MacOS." Platinum is actually an appearance setting, one of many you can apply to today's MacOS.

    And the GUI IS that important. It's the interface you use for everything that isn't done on the CLI. The GUI is what lets you find and open your files, or change your system's settings. Sure, you can do a lot of that on the command line in OSX. But most Mac users won't be working that way, and they shouldn't be forced to.

    I am very leery of Aqua too, after having used OSX PB. It's a leap backwards from the current MacOS GUI. At least they made some concessions to the users, like putting the Apple menu back.
  • Raskin has a clear vision - and the first iteration he tried to realize failed in the marketplace.

    Nothing too surprising about that - those with original ideas are seldom the same people that realize the commercial implementation of the idea.

    Jef's vision is with us today in several successful forms, including the Palm Pilot.

    The problem with the Xerox Star was that it was hideously expensive ($10,000+) and very poorly marketed.

    You are also wrong about the use of text files for system administration. Implemented as XML they are the best method. Using binary files to control critical system operations puts your ass in a crack if something goes wrong when an application screws up the writing of said files.

  • Tog's comment on Aqua is everything but unbiased. This guy just can't break with his own past.

    As for your

    The last thing Aqua is is Usable.

    Have you ever used it? I have, now for 4 months, and I find it quite usable.

    Karma karma karma karma karmeleon: it comes and goes, it comes and goes.
  • I agree. His comments are kind of puzzling, especially given his heritage. Maybe it's because his experience with UI implementation came at a time when the line between the OS and the GUI implementation had to be pretty thin -- modularity really did cost memory and speed.

    But in a modern OS, the "desktop" or whatever interface of choice you'd like to implement is really just another application, with perhaps two added features:

    • a contract for the desktop application to fulfill (it must be able to open documents, launch programs, make links, and so on) so that it can be reimplemented or replaced
    • a class library or framework rich enough to enable desktop/activity/data nugget/component communication, whether it be SOM Workplace Shell classes, the COM-based Windows shell, desktop and OLE interfaces, CDE specs, or the set of MacOS desktop interfaces that have been built up over the years. Note that at this level, windowing systems are really just a boring implementation detail. Really, for the most part, what we would strictly call an "OS" is not the interesting thing here.
    Although there are some ways an OS could make details like object storage more like whatever the metaphor of the day is. For example, MacOS supports file types and creators, which really frees users from having to respect naming conventions. The Palm OS uses little databases and doesn't even have a Finder feature. The web extends Unix-style paths across the Internet (although I would argue that we've wimped out on real locators for resources that are spec'ed by content identity, not just location).

    But I digress. As for MacOS X, what's really interesting is Cocoa, and that's already a powerful but unobtrusive framework. Too bad Mr. Raskin didn't address the relation of such frameworks to UI philosophy, where I think he might have actually had a point.

  • Saving state on exit is a good idea, but that can already be done. You may have already seen it - it's the 'document changed; save?' dialog box.

    I respectfully disagree completely. :-)

    Seriously, truly persistent document storage with automatic infinite undo, the ability to play with any version you want, automatic indexing and a built-in relational view of all your other documents...I'd be in heaven. The true value of persistent state isn't just saving a few keystrokes here and there. The big win isn't in the data; it's in the metadata, the linkage, the automatic indexing and all the stuff that is truly tedious to do yourself.

    In the olden days, this didn't seem practical, but next year I can probably get a 200 gig hard disk for under $300. Which probably only means more empty disk space unless somebody figures out that it's okay to blow a meg of storage on every text file I ever create if that's what it takes to provide all the real services people want in a file system (or document system). The capacity of that 200 gig disk is convincingly larger than the entire unindexed capacity (in bits) of the human memory that will serve you so conveniently for decades, and may (finally) be big enough to hold the index, too.

    Now, lest any of you believe I'm condoning some kind of device for idiots, keep in mind that I've said nothing about the beautiful query language I have in mind. The use of that is what will separate the experts from the novices...
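    The storage scheme this comment imagines can be sketched in a few lines. This is a toy append-only version store, nothing Raskin or the article specified; the class and method names are invented for illustration:

    ```python
    class VersionedDocument:
        """Toy append-only document store: every save keeps the old version,
        so 'infinite undo' is just reading an earlier entry in the history."""

        def __init__(self, text=""):
            self._history = [text]          # version 0 is the initial state

        def save(self, text):
            self._history.append(text)
            return len(self._history) - 1   # version number of this save

        def current(self):
            return self._history[-1]

        def version(self, n):
            return self._history[n]         # "play with any version you want"

        def undo_to(self, n):
            # Reverting is itself recorded as a new version, so even the
            # undo can be undone -- nothing is ever destroyed.
            return self.save(self._history[n])
    ```

    Storing every version of every text file really is the "blow a meg per file" tradeoff the comment describes; the index and metadata layers would sit on top of a history like this.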

  • by Shotgun ( 30919 ) on Thursday February 01, 2001 @12:18PM (#463475)
    This has been tried several times before. Basically he is saying that we need console-type systems that come pre-configured and are controlled by the company that sold you the thing. IBM tried it with the PCjr. Radio Shack had a PC out in the early days that popped up its own little shell when you turned it on and tried to rein the user into their own little arena.

    They all fail for the same reason. Joe Blow gets the thing home and uses it for a week just like IBM et al. intended. Then he heads over to CompUSA and sees how the $10 calendar program lets him put his own pictures on a calendar. "Why can't my computer do that?" he asks. Then he gets mad at whoever it was that sold him the computer in the first place, and starts looking to buy a real computer.

    Computers are complex and get in the way because people want to do complex things that go in so many different directions that no matter where the OS is, it is bound to be in the way eventually.

  • Was it just me, or did everybody else find the line between Raskin's opinions and those of the author to be blurred beyond any hope of parsing it out?

    It looks to me like a one-man rant, which uses a few quotes and paraphrases from one of the Old Prophets of the community to lend extra credibility.

  • The apple is spelled McIntosh [dictionary.com]. A Mackintosh is either a rain coat [ed.ac.uk] or a producer of musical theatre [musicbooksplus.com] depending on whom you ask. I'm sure Apple Computer used neither spelling in the effort to make something trademarkably misspelled, after the manner of the Prevue Channel, E-Z whatever, and other such tripe.

  • Actually, if you read The Humane Interface, Raskin talks about a system using his zooming interface concept that was actually built for a hospital database. The ZUI is a strange way of doing things (at least for those of us used to conventional GUIs, let alone CLIs) and I'm still not convinced it's definitively the best approach--but his points are certainly worth thinking about.

    The idea that the interface should be "data-centric" rather than "application-centric" strikes me as pretty sound. The browser-as-OS is often derided (including by me), but it's a tentative approach toward that metaphor. This isn't as far-fetched as it sounds, when you consider plug-ins: they're effectively transparent applications for handling new types of data. Right now they're just for displaying that data, but it's not too difficult to imagine an interface which allows "creator plugins" as well. If this was developed, you could arrive at a "smart appliance" which was actually useful--it'd be as easy to use as current IAs, but would be nearly as adaptable to new tasks as a current PC.

    At the very least, I'd like to see the operating system "get out of the way" more than it does. One of the things that often torques me off about the current MacOS UI is that the "Finder" is presented, effectively, as another application which is always there. When I switch applications with Cmd-Tab I don't want to have all the windows go in the background while it makes the desktop active, and the GUI shouldn't need its own menu bar. (The "one menu bar to rule them all" schtick also torques me off--even though it can mathematically be proven to be a "better target" than a menu bar attached to the top of each application window, the visual reinforcement of what commands go with what window strikes me as an adequate tradeoff. For sick fun, watch an utterly computer-ignorant user floundering around with MacOS when the finder menu bar has taken over even though their AppleWorks document window is the only thing that appears to be on screen.)

    Of course, the MacOS kernel torques me off much more than the MacOS UI, so I'm looking forward to MacOS X despite my feelings about Aqua being a step backward in usability engineering. Sigh. :)

  • The article says The idea of walking up to a PC in sleep mode and hitting a button, which would instantly activate a specific app, is compelling. The OS would manage all the applications in the background. If you wanted to switch apps, you hit another hot key. Work files could be stored in yet another "button."

    Though you still boot to the 'desktop', OS 9.x has a hotkey setup which will open a specific app, file, folder (e.g.- the "Documents" folder), control panel, etc. when you press a predefined F-key. It is set up in the Keyboard control panel. No, this isn't 'obvious' to the new user, but it is in the Help Center, which is the first item in the Help menu when you start up. If you search under 'one key' or 'f key', you can find information and instructions on how to open the Keyboard control panel and set the F keys (all with mouse clicks, BTW).

    Once apps are running, hitting their F-key again brings them to the front. There are also 2 other ways of switching between running apps: command+tab cycles through them (just as alt+tab does in Windoze); and the 'Finder Menu' in the upper right corner of the menu bar shows all open applications - clicking on one there brings it to the front. OS 9.1 adds a new menu to the Finder's menu bar: Window. It lists all open folders on the desktop when you are in the Finder - clicking on one there brings it to the front.

    Does OS X support all of these as well? I would guess so, but haven't used it enough to know....

  • True.

    I look forward to OS X, since I'm considering working in the print pre-production world. Perhaps I can use bash to control QuarkXPress? ;-)
  • I find it annoying that both the /. headline and the original article's headline focus on MacOS X when the article is clearly about OSes & interfaces in general (though brought up in the context of MacOS X). It would have been more honestly headlined "Former MacOS developer wishes OSes would fade into the background".
  • Hell, install your programs on bootable DOS 3.3 floppies, and have them autoexec to launch your own damn app, Jef..

    Raskin... shut up.

    How in the hell do you choose what you want to do on your computer? Hitting the keyboard? That sounds like how a 4-year-old child uses a computer.. it just starts batting the keyboard. That is not a useful way to get things accomplished, and I pray we can ask more of the user than caveman-like antics to operate their machine.

    And I know that this doesn't track with Raskin, but many people want to do more than word processing or drawing on their Koala Pads. But in Raskin's mind.. that's all anyone does. And never more than one at a time.

    And however you go between the two apps, there's going to have to be some means by which to separate the two applications.. else your 3D models would get all mixed up with your emails.. and that means will ALWAYS be an Operating System...

    For Godzakes... even Palm has an OS.. which has a UI, which you have to go to choose what app you want to run.

    Unless he wants every keyboard to have a button for every possible app you're going to install, you're going to have to have a way to choose what to do via software.. ... and that way - be it an "application" - whatever the hell that means - is the Operating System, stupid.

    And while I'm talking about it.. a buddy of mine at Apple got to see Raskin's post-Apple project - and it was basically a piece of shit: a useless, overpriced Casio pocket organizer.

    His big chance to poke his finger in Apple's eye, and not 1 in 1000 people (okay, 1 in 10 /.ers) could tell you the name of it because it was so useless.

    There will always be OSes.. they will just get better. That pursuit is being taken on by Apple, Microsoft, the Open Source community, Be, and others... and either you're trying to be part of the solution..

    or just a has-been hack whiner who's not doing anything to put up...

    so I'd ask him to shut up.
  • by glowingspleen ( 180814 ) on Thursday February 01, 2001 @12:19PM (#463500) Homepage
    Brilliant! I agree, we should move to direct apps.

    Hmm...but I want to run more than one...hey wait a minute, I have a great idea! Let's get rid of the OS and just make an app. We'll have the app hold a bunch of shared files, and then we can fiddle with it so it allows multiple instances of one program. No wait, let's make it so we can run a bunch of different apps at once and change between them. And let's make our app "special" so that if one of the mini-apps breaks, the big app can just kill it without the mini-app taking out the whole system. Man, this is going to be GREAT!

    Oh yeah, that app would be an, uh, OPERATING SYSTEM. Oops.

  • by General_Corto ( 152906 ) on Thursday February 01, 2001 @12:20PM (#463503)
    I respect Raskin, he's a very clever man that got a short deal many years ago. Some of his ideas are very clever, but not all of them are truly applicable.

    • "Raskin goes on to illustrate that a computer should be as easy to use as to start typing on a keyboard to open a word processor -- with no lost keystrokes, or to put a stylus to a tablet and start drawing in a graphics app."
    This is all very nice and good, but what if you wanted to use a spreadsheet instead? Not everyone wants to only use a word processor. You have to decide what you're going to do mentally, then tell the computer "I'd like to do this now." Just because I start typing numbers doesn't mean I want to create a spreadsheet, but then again typing words doesn't mean that I'm continuing with my novel - I could be typing the headings for my spreadsheet.

    • "The idea of walking up to a PC in sleep mode and hitting a button, which would instantly activate a specific app, is compelling. The OS would manage all the applications in the background. If you wanted to switch apps, you hit another hot key. Work files could be stored in yet another "button." Interactivity between the apps could be facilitated the same way they are now, with a GUI shell, but without the preponderance of icons, start menus and switchers, and without the tedious effort of installing apps via the GUI or customizing your environment."
    Okay, so now I need a keyboard which has an extra 20 buttons for the apps that I want to be able to access. Great. Saving state on exit is a good idea, but that can already be done. You may have already seen it - it's the 'document changed; save?' dialog box.

    You're not giving anyone more usability through this. You're giving people something close to PalmOS on a computer, which a few might like, but many would disapprove of. What happens when I want to have two spreadsheets open? Do I have two of my keyboard buttons allocated now, or is this even possible? Multitasking at the user level gets thrown out the window with a system like this, and that's a loss of functionality.

    • "'One big mistake is the idea of an operating system... It does nothing for you, wastes your time, is unnecessary'"
    This is where I laughed the most. The OS doesn't "get in the way"; it provides basic services that all applications need. The whole reason that Windows or Linux or the BSDs (even PalmOS is big when you consider the total amount of storage available to the devices) are big is that they don't just act as a system kernel: they come bundled with tons of standardised libraries that make your life as an app writer easier. Probably the dumbest thing I've ever seen someone in the industry say.

    I wouldn't be following these guidelines too closely if I were a system designer.
  • Ummm, why does it have to be all or nothing?

    Well, I wouldn't want it to be all or nothing, lest PDAs be unusable! I am talking about the desktop, and using the appliance paradigm as an argumentative device. I do this because that's what the Raskin article is about -- transforming general computer use. As you yourself have eloquently pointed out, an exclusive distributed-app paradigm is as unsuitable for general desktop/workstation use as total localization is for a PDA.


    *** Proven iconoclast, aspiring epicurean ***

  • by Bistromat ( 209985 ) on Thursday February 01, 2001 @12:20PM (#463509)
    "It's UNIX, it's backwards." Does that mean it should be XINU, and we've been wrong all along?
  • by myster0n ( 216276 ) on Thursday February 01, 2001 @12:22PM (#463511)
    Raskin goes on to illustrate that a computer should be as easy to use as to start typing on a keyboard to open a word processor -- with no lost keystrokes, or to put a stylus to a tablet and start drawing in a graphics app.

    And with a few simple presses on the arrow keys, you can start tetris and already have one block in the lower left corner. Or will it start sokoban? And God forbid if you even dare to touch your mouse, because suddenly you're in the middle of a quake deathmatch, no matter if the boss is looking at your screen at the time or not.

  • All my apps are already available from any web browser in the world. It's called VNC, and the VNC java applet viewer. Without the general-purpose operating system, this wouldn't be possible.

    Oh, and I've been doing this since 1998 on backwards Unix systems.

  • by Dancin_Santa ( 265275 ) <DancinSanta@gmail.com> on Thursday February 01, 2001 @12:23PM (#463515) Journal
    If he wants a device that does everything from word processing to emailing to gaming, he's going to have to settle on an OS to handle managing these tasks. His idea of a transparent OS has no merit when applied to the current PC paradigm, the paradigm to which he seems to subscribe.

    We've had devices where one could sit down and start typing with no loss of keystrokes, they are called typewriters. We've had drafting devices that allowed one to sit down and draft without an OS getting in the way, they are called drafting boards and pencils.

    The device that comes closest to an all-purpose device that Raskin is intimating is a game console. However, to switch between games (or theoretically applications) we still need to pop open the machine to swap media. Essentially the OS has been moved out of the machine into the user's brain. However, the device ceases to be an all-purpose device once an application is selected. How would I be able to check email while playing Tekken Tag? Without an OS to handle multiple programs simultaneously, to handle peripheral control, and to handle booting, I am SOL.

    If he is interested in devices that do one job really well (toaster, lightswitch) then he'll have to settle for a plethora of devices tailored for a specific task. If he wants a transparent OS that allows him to run multiple programs on his PC, he'll have to sell his snakeoil somewhere else.

    It's one thing to make an OS as non-intrusive as possible, but it's a whole different proposition to remove any semblance of an OS altogether.

    Dancin Santa
  • Tell me that I need to produce a 50-page formatted report by tomorrow, and I'll take the RedHat CD so I can have a decent LaTeX platform. :)
  • by Fatal0E ( 230910 ) on Thursday February 01, 2001 @12:24PM (#463529)
    it's not without some good points. I'm not gonna pull quotes out, but the part where he says people want a computer that can turn on and off like a lamp (my words, not his) and do very specific things like word processing, internet browsing and video gaming is right on. They need to be done well enough that people use it instead of learning it.

    People like my mom would love computers if the operating system and by extension the apps didn't have a high learning curve. A good example is my TiVo. Once I gave her the DVD menu metaphor she started "getting it". I don't know if it qualifies as an app, an OS or both but it works really well for what it does.

    One thing I feel he left out is that (IMO) computers need to come out of the computer room and make their way into the living room. I'm talkin connected to your TV (+cable) and stereo and then distributed to the rest of the house. A terminal in each room. Then you can use biometrics instead of keys and voice command instead of remotes. Just picture it, you could say "Clap Off!" instead of actually clapping off! wow


    "Me Ted"
  • by firewort ( 180062 ) on Thursday February 01, 2001 @07:44PM (#463539)
    Jef Raskin's point was NOT:
    it's UNIX, it's backward.

    his point IS:

    It's an operating system, the paradigm for which is backwards.

    Computers, according to Raskin, should operate more like appliances. Reliably and simply.
    Start typing at the keyboard, and it's a document.

    start doodling at the tablet, and it's a graphics file. Make the computer as simple as a consumer television.

    Linguists have talked about this for some time-
    The computer interface consists of a mouse and keyboard.

    The mouse knows one word, with modifier states. That word is "click." It can be modified with "right-", "double-", "middle-", or even "scroll-wheel".

    The keyboard is great, but slow, and the computer command line understands words, but usually requires two- and three-letter commands that need to be learnt, like a new language.

    The concept of an OS (cli or gui) is backwards and outdated for most things. It's very powerful, very functional, and even pretty when skinned with jelly-beans.

    But let's get forward thinking-- voice-controlled (and I mean, good voice controlled, not viavoice or dragon from two years ago) and gesture oriented.

    The command line was pioneering in 1968 or '69.
    The mouse was incredible when Engelbart thought of it.
    Click and drag was cool at PARC and Apple.
    Microsoft was innovative when they figured out how to market the masses to death, club OEMs into submission and buy up any product that was halfway decent.

    All of these things are old news. Yes they're being improved upon, but the improvements are EVOLUTIONARY.

    Raskin is interested in the REVOLUTIONARY.

    So am I.

    I'd like to ditch my keyboard and mouse, put on two gauntlets and a headset mic, and gesture and speak to my computer. Oh, and make the gauntlets and headset mic use bluetooth-- I like to walk around my office when I'm dictating.

    This comment is copyright of ME. using this comment without my permission is violating my ownership rights. :}


    A host is a host from coast to coast, but no one uses a host that's close
  • by gagganator ( 223646 ) on Thursday February 01, 2001 @12:26PM (#463547)

    imho apple is already experimenting with this. the new itunes software [apple.com] contains a single window that does everything, with connections to mp3 players occurring transparently in the background. idvd and quicktime are the same. it seems apple is moving its consumer apps to one gigantic window that requires no interaction with the os or other apps

    pro apps continue to add multiple windows and palettes, and require interaction with other apps

    i think there is room for both, depending on skill level and use. the computer is general enough that interactions with other apps will continue to be useful, though for simpler use it can simulate a single device

    scroll this [macedition.com] article down to: the plot thickens

  • by Maserati ( 8679 ) on Thursday February 01, 2001 @12:26PM (#463548) Homepage Journal
    I have never wanted moderator points more than I do right now.

    Maybe Raskin will eventually figure out something for those people who "just want it to work" and have to deal with an Enterprise-level directory structure filled with documents of widely varying types.

    Until he shows me at least a prototype, heck a screen mockup, Jef Raskin can just shut up.

    On second thought, Bruce Tognazzini [asktog.com] already prototyped one of those for Sun. See The Starfire Project [asktog.com] for more details on a really powerful but very usable system.

  • by Mononoke ( 88668 ) on Thursday February 01, 2001 @12:30PM (#463592) Homepage Journal
    They can't do it. They can do it with AppleScript (or whatever they use) but not through the GUI.

    Wanna bet?

    Here's the process I used:

    1. Double-click folder (ie: directory) icon on desktop
    2. Press command-a (select all)
    3. Press command-c (copy)
    4. Click on text-entry app.
    5. Press command-v (paste)

    Here, I'll press command-v for ya here:

    gallery images printed already sf20010101.gif sf20010102.gif sf20010103.gif sf20010104.gif sf20010105.gif sf20010106.gif sf20010107.gif sf20010108.gif sf20010109.gif sf20010110.gif sf20010111.gif sf20010112.gif sf20010113.gif sf20010114.gif sf20010115.gif sf20010116.gif sf20010117.gif sf20010118.gif sf20010119.gif sf20010120.gif sf20010121.gif sf20010122.gif sf20010123.gif sf20010124.gif sf20010125.gif sf20010126.gif sf20010127.gif sf20010128.gif sf20010129.gif sf20010130.gif sf20010131.gif sf20010201.gif t-shirt images

    (Of course, HTML doesn't know what to do with the linefeeds, but they are there.)

    That's a directory listing of my Sinfest [sinfest.net] archive.

    Nothing in that procedure that would be unknown to any Mac user.


    --

  • by Somnus ( 46089 ) on Thursday February 01, 2001 @12:30PM (#463599)
    The "computer as appliance" vision is stultifying. There's a reason a computer has totally general input (keyboard, mouse) and output (pixel-based monitor, sound) devices -- people want their workspace to be totally abstracted from the hardware in which it resides. In this sense, the modern OS totally accomplishes its task in that the creation, installation and usage of applications are usually only limited by dev time and performance. Thereby, we humans can let our imaginations run wild.

    Handhelds and kitchen-counter-top Internet appliances have a totally different engineering goal: "What the hell is Bob's phone number?" or "Mommy, can I check my email before dinner?" Just because a user wants to have total convenience in one context does not mean he or she desires the trade-off in flexibility in another. The workstation paradigm still has its place.

    As for those who say that Internet-distributed apps via Mozilla-XUL or MS-.NET are the future, you are omitting an important human element: Territory. My workstation is my territory; I want to control its config to suit my tastes, I want to determine its design tradeoffs (e.g. speed vs. portability), etc. I would not be comfortable with getting all my apps via the Net no matter the speed, for it would be just as weird as living in barracks and getting my toiletries by ration every morning.


    *** Proven iconoclast, aspiring epicurean ***

  • by John Whitley ( 6067 ) on Thursday February 01, 2001 @05:30PM (#463605) Homepage
    An OS is *not* something that gets between a user and what they want to do. Instead, it's the tool that provides consistent services to both the user and the applications running on it.
    BOOT TO THE HEAD, to you and everyone else in this thread that failed to read the article. It explicitly puts "OS" in context with the phrase: the concept of the OS as an application. As Raskin says:
    "One big mistake is the idea of an operating system ... [which] is the program you have to hassle with before you get to hassle with the application. It does nothing for you, wastes your time, is unnecessary,"
    Read Jef's book The Humane Interface and Don Norman's The Invisible Computer to get some vision into this movement. And read the article. The essence is that a class of tools new and distinct from the PC will emerge, in which (among other things) the concept of OS as application will be dead.

    Just how often does a painter immersed in the creative act stop to think about minutiae of the paintbrush? Or worse still, get interrupted by the paintbrush? Not often, and that's a hallmark of a good tool -- that it be subsumed as completely as possible beneath the user's attention to the task. The PC as we know it can undergo vast improvement towards being a really great tool for a particular task -- and this will likely involve some specialization. Again, read the above books and get a leg up on the next wave...

  • by bnenning ( 58349 ) on Thursday February 01, 2001 @12:40PM (#463617)
    Whether or not the author's view of the OS as an impediment to the user is correct (and I don't believe it is, at least not when done properly), his criticisms seem to apply less to OS X than most other OSes. For example:

    The idea of walking up to a PC in sleep mode and hitting a button, which would instantly activate a specific app, is compelling. The OS would manage all the applications in the background. If you wanted to switch apps, you hit another hot key. Work files could be stored in yet another "button."

    Sounds very similar to the Dock in OS X. With a good VM and inter-app communications (also in OS X), for the most part it doesn't matter if an app is currently running or not, as soon as you need it it will be.

    Interactivity between the apps could be facilitated the same way they are now, with a GUI shell, but without the preponderance of icons, start menus and switchers, and without the tedious effort of installing apps via the GUI or customizing your environment.

    Unless somebody has a telepathic user interface, you're going to need some way of telling the computer what you want to do, and I fail to see why clicking on an icon to do this is unreasonable. Regarding installers, the author appears to be unaware that it is possible and recommended in OS X to build your app so that the "install" process consists of copying a single file, ditto for uninstalling.

    I disagree with the fundamental attitude of this article, which is that because some people find current OSes too hard to use, they must be dumbed down for everyone. Certainly OSes can and should be more accessible to novices, but that does not have to take away power and flexibility for advanced users. OS X is a perfect example of this; with a few improvements to the public beta UI (many of which have apparently already happened), it can be both more approachable for new users and more powerful for experts than the classic Mac OS, Windows, or (flame retardant activated) Linux.

  • by b1t r0t ( 216468 ) on Thursday February 01, 2001 @12:34PM (#463637)
    Once I saw what Eazel [eazel.com] was developing was nothing more than Yet Another Browser, I wasn't too impressed.

    Jef Raskin refers to the OS as a "program you have to hassle with before you get to hassle with the application." To me, Nautilus seems like just another program you have to hassle with before you get to hassle with the application. I don't see how it makes things enough easier for me that it would make me run Linux as a desktop environment instead of MacOS.

    As to the OS itself, I don't really see what it does that gets in your way, aside from maybe requiring you to save your data files in its directory hierarchy. Certainly you can use OS X without having to care that it's running Unix underneath the hood. Much more noticeable is that "classic" apps have to run in their own little sandbox, because the OS is different, not because it's there.

    Even the Palm OS, which is specifically mentioned as one that doesn't get in your way, is still an OS, and still there. You could run a Palm-like interface on top of Unix and be none the wiser. It seems to me what he has a problem with is the user interface environment, not the OS.

    Would such an appliance -- a home browser, word processor, spreadsheet, and game console -- be a popular item that would replace the PC in the household? Wildly so, especially if installing new programs was made simple, such as inserting a disk, selecting its activator key, ejecting the disk and running it, installed on your system until you remove it.

    Installing programs under Windoze is a total fuckup because of all the DLLs and inevitable scores of data files that have to be installed along with the application itself. I'm sure InstallShield is making a lot of money off of this. Under MacOS, it is possible to install (properly written) software by simply dragging its icon out of a CD-ROM's Finder window. Such software doesn't even have to be installed; it can usually be run right from the CD-ROM. This used to be common, but nowadays big apps want to be run from an installer because they have so much baggage that goes along with them. OS X will make this easier to do by allowing an application and its files to be packaged in a folder that appears to be a single object.
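    The drag-install model described above can be sketched in a few lines. This is only an illustration: the paths and bundle layout below are hypothetical stand-ins, not a real installer, but the point stands that "installing" a bundle is nothing more than one recursive copy of a folder that the Finder treats as a single object.

    ```python
    import pathlib
    import shutil

    # Stand-in for an application bundle on a mounted CD (hypothetical paths).
    src = pathlib.Path("/tmp/AppCD/MyApp.app")
    (src / "Contents/MacOS").mkdir(parents=True, exist_ok=True)
    (src / "Contents/MacOS/MyApp").write_text("binary")

    dst = pathlib.Path("/tmp/Applications/MyApp.app")
    if dst.exists():
        shutil.rmtree(dst)  # fresh run; a real Finder drag would just replace it

    # The entire "install" is one recursive copy of the bundle folder.
    shutil.copytree(src, dst)

    # And the entire "uninstall" would be the reverse:
    # shutil.rmtree(dst)
    ```

    Compare that to an InstallShield-style install, which scatters DLLs and registry entries that no single delete can cleanly undo.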

    Sure, the Unix basis of OS X can be considered a step backwards when compared to something like BeOS or even the Xerox Smalltalk environment, but the reason to go with it is because it's a solid solution, and it's much better than the ad-hoc design of MacOS, which was never intended to do multitasking. Multitasking the MacOS was an amazing hack.

  • by jafac ( 1449 ) on Thursday February 01, 2001 @12:34PM (#463640) Homepage
    I think he has a point, but he's thinking about what computers *could* be from the standpoint of being in an ideal world.

    In the real world, we have limits on hardware performance, some subsystems are far more limited than others, and then price comes into the equation for the various subsystems: video, RAM storage, disk storage, network I/O, etc.

    Right now, network I/O is prohibitively expensive, and the state of the technology is way behind that of disk storage, which is currently cheaper and more convenient (offers a better price/performance ratio). This is the ultimate factor in why .NET will fail: net access is too expensive and too flaky for consumers to rely on it as their primary means of accessing apps.

    For what this guy is talking about, today's computers can't possibly do these things. For one thing, we still need disk storage. If RAM storage were cheaper, and didn't have the volatility issues, then we wouldn't need disk storage, and all apps could be in RAM all the time, and we could do things like sleep a machine and press a button to be instantly-on in the word processor, or instantly-on in the web browser. But RAM is still WAY too costly, compared to disk, so it ain't gonna happen.

    Computers and their OSes have been the way they are from day one, because the balances in cost and performance on the hardware side have always been pretty much what they are now. In the early days, of course, disk storage was highly cost prohibitive, so those machines were diskless (I'm talking TRS-80). Network connections were unheard of in your standard consumer machines until about 7-15 years ago; this came on gradually, then full-force as the technology evolved into something people could afford. We're experiencing another shift in network availability, speed, and cost, with DSL/Cable, and that's what Microsoft is betting on with .NET. But most people don't have DSL or Cable yet, and won't for some time. And even me, on a corporate 100Base-T network, T1-connected to the Internet, I'm not willing to bet my productivity on the notion that Microsoft's .NET server serving Word will always be up, and fully responsive when I need it (and that the service bills won't get me down).

    So, the kinds of paradigm shifts that this guy's talking about require the hardware to change, either in performance or cost. If that happened, you can bet the software guys would jump on that damn fast - lots of money to be made during those kinds of periods.

    Flatscreen monitors don't appreciably change things. We all thought that super-duper 3D cards would change our user experience into a 3D one (but just because the video card can display lots of 3D information quickly doesn't mean that the rest of the computer can get at that information as quickly, so the 3D interfaces we've seen have been slow, jerky, useless eye-candy).

    My guess is that the next paradigm shift will be a result of an increase in bus speeds. CPU speeds may continue to ramp, or they may stall, network speed will increase per dollar, but I doubt we're going to see an increase in user trust and reliability. So internal bus speeds are going to change things, and we're going to see computers doing things that they can't currently do, because bus and memory speeds are way too slow. Of course, the technology for this is not even on the horizon yet, so this is all pulled straight out of my ass - but the only other possibility is if RAM gets really cheap. I mean really, really cheap. Cheap enough to make disks look as unattractive as tape currently does. Either of those would surely change the model by which we compute, and OSes run.

    And Unix will still be Unix.
  • by dutky ( 20510 ) on Thursday February 01, 2001 @12:34PM (#463643) Homepage Journal

    Essentially, the article says that Raskin doesn't like MacOS X, MS Windows, or any other general purpose operating system for that matter, because he thinks that computers should be pure appliances, relieving the user of having to worry about mundanities like file storage or program launching, rather than infinitely mutable environments. Raskin is a visionary, which is a good thing, but it means that he is concentrating on the future possibilities of ideal computer interfaces, while missing the more prosaic uses of technology today.

    Personally, I agree with Raskin on what I would like my computing experience to be like, but I also recognize that we are a long way from making that experience happen in a ubiquitous manner. For the moment, I get more mileage out of an OS-centric system that provides me with the primitives that can be combined into a tailored work environment (e.g. Linux running X and Fvwm2 with a small collection of application programs and shell scripts) than I would out of a more turn-key system that wasn't designed by me for my own uses (e.g. MacOS, Windows, PalmOS, and even Gnome and KDE).

    Raskin is talking about a system that would be preconfigured to do exactly what the user wants to do, but he fails to mention, and possibly fails to consider, that such a system is nearly impossible to produce, simply because there are too many different kinds of users with too many different preferred modes of work. It is much easier to produce a clumsy generic environment that can be shoehorned into many different task niches than to custom engineer a system and user interface for each prospective user.

    The users that really care about a streamlined work environment (sometimes referred to as Power Users) will take the time and effort to tailor their system to their tastes. The users that don't care, and such users do exist, will either suffer (silently or otherwise) or pay someone else to produce a more tailored configuration for them. (while I am no Libertarian, or even much of a Capitalist, and as much as I hate to point this out, the dominance of generic, operating system centered, computing environments looks like a perfect example of the free market at work)

  • by TheJohn ( 109384 ) on Thursday February 01, 2001 @12:42PM (#463650)
    Basically he is saying that we need console type systems that come pre-configured and are controlled by the company that sold you the thing.

    No, he's not really saying that at all. Raskin goes into quite a bit of detail about his vision in his book, The Humane Interface [jefraskin.com] , and it doesn't involve most of the things people are attributing to him in this thread. It's not about locking people into one application provider, or even eliminating menus, or not having what I would call an OS (controlling devices, managing resources, etc.) It just doesn't look like what we often think of as an OS. There's a summary [jefraskin.com] of the book on the site. Read it, then shoot your mouth off.

    I'm not sure I agree with him entirely, but the book is interesting reading and does bear some thought, and it's clear he's no "bozo".

  • by jayhawk88 ( 160512 ) <jayhawk88@gmail.com> on Thursday February 01, 2001 @12:43PM (#463653)
    I think the point the author was trying to make is not that we don't need OS's, we just need them to be more transparent in certain situations.

    Certainly neither I, nor a large portion of the general computing public, would ever accept such a PC. My computer can be anything from a game console to a web server: I want and need an OS I can work with as an application. But what I want and need isn't necessarily what my uncle or grandmother wants and needs. Yes, anyone can be taught how to operate a computer to make it useable (how to install apps, how to run a program, etc), but why should navigating an OS be a requirement for using a computer, be it Windows, Linux, Be, or whatever?

    The idea of being able to walk up to a machine and just start typing a document, or drawing a picture seems interesting to me. Of course, it would take a very powerful OS to give this level of functionality while still remaining transparent, without degrading itself to little more than a toy. At the very least, it's an idea worth exploring at the research level.
  • by maggard ( 5579 ) <michael@michaelmaggard.com> on Thursday February 01, 2001 @12:36PM (#463670) Homepage Journal
    Long ago an OS only supplied the basic functions for running the computer. The interface was usually a command line. To use a device attached to the computer, often each program had to supply its own device drivers.

    This was why in the early PC world WordPerfect was such a hit: The program came on 1 or 2 floppies & the device-drivers (mostly printer) came on another 7 or 8.

    Eventually MacOS & Windows came out with the idea of universal drivers in the OS. No longer would each program need to supply its own video or printer drivers; rather, the OS would be installed with a driver for the device and everything would go through it. This was as much a reason MacOS & Windows succeeded so well as their GUIs.
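    The universal-driver idea described above boils down to a single interface that applications code against, with the OS supplying the device-specific half. A minimal sketch (the driver classes here are hypothetical, purely for illustration):

    ```python
    # Every application prints through one uniform interface; only the OS
    # knows (or cares) which concrete driver is actually attached.

    class PrinterDriver:
        """Uniform printing interface supplied by the OS."""
        def print_page(self, text: str) -> str:
            raise NotImplementedError

    class LaserDriver(PrinterDriver):
        def print_page(self, text: str) -> str:
            return f"[laser] {text}"

    class InkjetDriver(PrinterDriver):
        def print_page(self, text: str) -> str:
            return f"[inkjet] {text}"

    def app_print(driver: PrinterDriver, document: str) -> str:
        # The application is written once, against the interface;
        # swapping printers requires no change to the app.
        return driver.print_page(document)

    print(app_print(LaserDriver(), "hello"))
    ```

    Contrast this with the WordPerfect era mentioned above, where each application shipped its own stack of device-specific drivers on separate floppies.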

    Later this expanded to typefaces and cross-application clipboards and inter-application communications and built-in scripting and system-supplied text-edit boxes and graphics widgets and a host of other services. Indeed today's OS's are about half of the application.

    The dividing line between application and OS has grown very fuzzy indeed.

    Starting in the mid-80's there were a series of projects to help further break down this distinction. NeXT had their object-oriented operating system, Apple/IBM/Novell had their OpenDoc component architecture, Apple even did something of the like in their Newton OS, and now in Linux there's Bonobo and its cousins.

    Lots of users I know consider their computers to be Email/Word Processors/Web Browsers - they don't use or care about anything else. It could be green cheese for all their overt interaction with the OS.

    So this leaves us with the question: When does the OS's GUI begin to dissolve into the applications? Will it? Will it completely? Is this a "good thing"? Or will there always be a clear distinction?

  • by 2nd Post! ( 213333 ) <gundbear@pacbe l l .net> on Thursday February 01, 2001 @12:36PM (#463677) Homepage
    I don't think what this guy is proposing works against Apple's design goals.

    As per the OS as an interface between applications and the computer, that is *always* necessary even if it's nothing more than an abstraction layer that allows applications and devices to communicate with a uniform series of APIs. In which case OS X is bundles, Quartz, Cocoa, XML configuration files, Quicktime, a filesystem, the Finder, and a few other things.

    Aqua, as a GUI, is an interface through which a human user can interact with the network, the applications, documents, data, and other tools. It is, as the name implies, just a Graphical User Interface into which all the other components plug in. Apple is espousing the digital lifestyle, in which you work with PDAs, mp3 players, camcorders, cameras, VCRs, TVs, radios, what have you, as these little tools Jef may be talking about, but using OS X, Aqua, and all the other little things as a glue to network them all together.

    Nothing is conflicting or contradictory, except perhaps in the analysis of whether OS X gets in the way of, or enhances, one's 'digital lifestyle'. Steve thinks it's a multiplier. I have to agree, in that having iMovie, which sits on top of OS X, using the Aqua interface, allows us to do non-linear editing and connects our camcorders, our imaginations, our CD-RW and DVD-R devices together in ways that cannot happen without an OS and without a UI, especially a GUI.

    The same can be said with MP3s, mp3 players, CDs, and iTunes. Or Final Cut Pro, DVD-R, camcorders, digital cameras, CDs, MP3s, and DVD players. Aqua is the interface between all the software, the software is enabled with Quicktime, Quartz, and firewire, and all of the above sits on OS X.

    It's like arguing language is an impediment to understanding; it is, because its constructs and semantics can create misunderstanding, but one also needs to see that without language there doesn't exist a medium through which communication happens (yet).

    When devices all talk to each other wirelessly with XML packets and have AI to the point of 'grokking' each other, then OSes and such will not be needed. Until then, OSes and GUIs will allow such devices to interface with each other and with us.

    Geek dating! [bunnyhop.com]
  • Raskin's comments (as interpreted by Berg) are very interesting in that he states: "they keep lumbering forward with the idea that people prefer innovation and flexibility to predictability and stability."

    This reminds me of the constant wrangling in the web interface community about consistency of interface between sites. How do you create a site that does what you need it to do and conveys whatever aesthetic you're after, without making the site difficult to use? To put it in application terms, how do you build an app that people will appreciate for its innovation, and be able to use the first time around?

    Raskin's idea of a disappearing OS seems counter to the quote above about consistency and stability. In the *real world* companies and even Open Source projects are going to create applications that use their own metaphors for movement, action, and so on. Currently, the OS is the only thing keeping interfaces even remotely consistent.

    One of the reasons the Mac has such a well-loved interface (how many PC interface zealots do you know?) is that it's consistent from app to app. Basically, you buy a new Mac app, you launch it, and you figure it out on the first try.

    I just don't see how an OS-less computer would somehow make things easier for users, when every app would be allowed to have whatever interface it wanted.
