Steve Jobs Interview with Time Magazine

broohaha wrote to us with the online version of Time's interview with Steve Jobs. It's the cover of this week's edition, and gives an interesting perspective into the labyrinth of his mind. The most interesting part is the Pixar stuff, IMHO. Just waiting for Toy Story II right now.
  • and i don't mean jobs.

    when i read the article, all the time subscription ads on the right hand side, about halfway down the page, featured a time cover with bill gates' mug on the front...

    did somebody plan that?
  • Please point out a stupid comment...

    Not something you disagree with, but something
    that is STUPID. INCORRECT. FACTUALLY ERRANT.

    You can't do it.

    -WW
  • Gutting programmer effectiveness and routing new programmers into BASIC by a factor of at least 10 while maintaining, and even slightly improving the GUI is a great example of "not getting it". You can say OOP would become important in a few years and I can say the windowing GUI would become important in a few years with or without Jobs. But the revolution had already occurred at PARC (and if you're focused on the mouse environment -- even a decade earlier at SRI which is where PARC, and indeed PLATO with its touch panel [thinkofit.com], got their inspiration -- I remember sitting in meetings at CERL/PLATO viewing the films of SRI's research in 1974 as part of PLATO's computer-based conferencing project).

    DOS applications were starting to pick up on it despite the horrid CGA they had to work with initially -- and it wasn't because Jobs did the Mac. The Windowing GUI was inevitable and obvious to people with money as well as most personal computer programmers, especially once Tesler had already popularized it with his 1981 Byte magazine article [byte.com].

    Dynamic, late-binding programming environments that highly leverage the sparse nerd matrix out there -- like Smalltalk, Python, etc. -- are, however _still_ struggling to make it past the concrete barriers Jobs poured into the OO culture with the Mac.

    When Jobs passed up Smalltalk for Object Pascal, and then again, with NeXT, passed up Smalltalk for Objective C, he set a pattern that continues to this day: Sun passed up that sun-of-Smalltalk, Self [sunlabs.com], and went with that son-of-Objective-C, Java.

    Gutting the superstructure of technology while maintaining appearances isn't leadership.

  • Anyone else suspicious about the picture of Bill Gates on the cover of Time on the pages of the article?
    I am... watch yourself, Bill.
  • No argument. Ours cost us less, but that's because my old man was an electrical engineer and we designed and built the S-100 cards ourselves. We even did photo-etching ourselves on some of our later cards. Fun. Educational.

    No, my nitpick consisted simply of pointing out that the Apple was not that innovative in a technical sense. It was hardly the first computer with swappable cards. That's all. I'm not saying that it wasn't the market breakthrough. It certainly was. Not everyone was ready to be their own engineer and software developer. The Apple ][ was a consumer item. The CP/M based S-100 bus machines were computers for computer people who couldn't afford 370's at home (and didn't have the raised floor ;-)
  • Their TCP/IP stack can't handle ftping at more than 10KB/s on a 10BaseT connection to the server that is 20 feet away...

    Wow, I could get 250K/s w/ my 5200 w/ 10BaseT, and I've got over 600K/sec sustained w/ my G3 and 100BaseT, both with my cable connection (the first w/ a net server, the second w/ my ISP, but I've got > 500K/s w/ net servers)

    Your network must be a piece of shit.
  • You are describing the Macintosh Script Editor, except for the fact that it's not always on. You turn it on or off when you want to record a script for a particular repetitive task. When you're done recording, you can edit, modify and save the script as an application. I use it all the time.

    Excellent! I did not know that, it sounds cool.. Is it as powerful as say Perl or something? The always-on part may not be a good idea anywho.. it makes life easier, but ONLY if the AI is good enough. Now, it sounds like maybe AppleScript could be designed to nudge the average user into using the scripting language, i.e. it should be mildly in your face so that people actually use it, just not so much that it interferes with life, but that is sorta just cosmetics. I'll look at this language next time i'm around a Mac.

    Now, there is something to be said for having data abstractions available.. Does AppleScript just simulate clicks and stuff or do applications provide a higher level interface to themselves? One of the things we may see eventually is derived user interfaces, i.e. the application provides ONLY the higher level access and ``themes programs'' or something would actually derive the interface from it. A system like this, if possible, would be especially amenable to GUI scripting languages since once you understood kinda how the interface derivation worked you would understand something about all your applications.

    Have you seen the Plan9 interface? It is kinda interesting.. All the menus are built on cut and paste. The idea being it is better to just make cut and paste efficient enough to simulate menus than to actually create separate menuing stuff.

    Jeff
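
    The "derived user interfaces" idea can be sketched in a toy example (all names here are hypothetical, and this is no claim about how AppleScript actually works): the application publishes only a table of high-level commands, and a generic front end derives its menu from that table, so scripts and the GUI share one interface.

```python
# Toy "derived interface": the application exposes only a table of
# high-level commands; a generic front end builds its menu from that
# table instead of the application hand-coding its own menus.
class Player:
    """A hypothetical scriptable application."""
    commands = {"Play": "play", "Stop": "stop"}  # the published interface

    def play(self):
        return "playing"

    def stop(self):
        return "stopped"

def derive_menu(app):
    """Derive a label -> callable menu purely from the published table."""
    return {label: getattr(app, method) for label, method in app.commands.items()}

menu = derive_menu(Player())
print(sorted(menu))    # menu items were derived, not hand-built
print(menu["Play"]())  # a script could call the same interface directly
```

    Once interfaces are derived like this, a scripting language gets exactly the same high-level access the menus do, which is the point of the comment above.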
  • >anyone involved in the real world can tell you that linux/xeon is the only way to go for rendering and 3d animation/modeling.

    Except for that little problem with almost none of the "real world" modeling/animation/rendering software having been ported to linux yet... (e.g. Pixar's) or that niggling detail of minimal support for 3d hardware acceleration in a window.

    Other than that, yeah, Linux ROCKS for 3D content development...
  • ...but his direction is a poor one for Apple... being a niche player is not a good place to be...

    How does creating and marketing a computer to the General Public make Apple a niche player? Certainly they are targeting the G4 systems to those who need the power the most. But I don't believe that building a computer for the 95% of the population that doesn't understand computers (but just wants one that works) makes them a 'niche player.'

    Stockholders aren't stupid. If they didn't like the direction Apple was taking, AAPL would still be selling at 13.



  • Umm...the two million people who bought iMacs in the last year, the 200,000 people who have preordered iBooks, and the however many people who bought the blue and white G3's seem to trust Apple just fine.

    Motorola found a serious bug in one of their processors. I think it's a Good Thing that they halted production, rather than let them go out to the public. Other companies *cough*Intel*cough* would have shipped the chips, and spent a pile of money on marketing convincing people that there wasn't really a problem. While I'd like to see Apple handle the situation better (like, how about an extra 128MB RAM in exchange for the 50MHz in clock speed?), the fact that they're doing ANYTHING is, to me, a positive sign.
  • Well almost, besides that one guy Linus :) If Apple got its act together, and worked really hard at compatibility and other issues/standards that Microsoft and the PC industry have moved towards, they could win. =) --Brian
  • Jobs' point is not that the user SHOULDN'T know what's in the box, it's that they NEEDN'T know what's inside the box. Anybody who's picked up a telephone with a neophyte user on the other end should agree.
  • They still use co-operative multi-tasking, instead of pre-emptive, because they made an ideological decision before, and the developer community is not aware enough to yell about it

    This statement is laughable. Actually, most all of your statements are laughable, but the rest have been pretty well addressed by other people.

    They did not use cooperative multitasking because of an ideological decision. I challenge you: compare a well-done cooperative system versus a well-done preemptive system on an 8MHz 68000 machine with 128k of memory which is generally running only one or two programs at once. (Remember, the original Mac did have those funky Desk Accessories which could run concurrently with the current program, and can be considered to be a separate process.) Guess which one is faster? I'll let you figure it out. (HINT: It's the one Apple actually went with.) Once the decision was made, it was pretty much stuck. Retrofitting a preemptive tasking system onto the OS once the hardware had picked up enough to handle it would have been an absolute nightmare.

    You ought to hang around the Mac developer community more before you say that we aren't aware enough to complain. Among the ones I hang out with, about half of all our complaining is directed toward either the lack of preemptive multitasking or memory protection in MacOS. It's a big issue, and Apple knows it, which is why they're releasing MacOS X. Again, retrofitting these features onto the old system would be far too painful. If you don't believe me, write some down 'n' dirty Mac programs and see the tricks the MacOS goes through to ensure you can still run an application from 1983 on a modern system. It's ugly.
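
    For the curious, the trade-off being argued here can be sketched in a few lines. Below is a toy Python round-robin scheduler (an illustration only, not how the Mac Toolbox actually works): each task yields voluntarily, so a "context switch" costs no more than a function return, which is why cooperative scheduling won on an 8MHz 68000.

```python
# Toy cooperative round-robin scheduler. Each "task" is a generator
# that voluntarily yields control, loosely analogous to a Mac app
# handing back the CPU in its event loop. A context switch is just a
# function return: no timer interrupt, no forced register/stack save.
from collections import deque

def task(name, steps):
    for i in range(steps):
        # do one slice of work, then voluntarily give up the CPU
        yield f"{name} step {i}"

def run(tasks):
    ready = deque(tasks)
    log = []
    while ready:
        t = ready.popleft()
        try:
            log.append(next(t))  # resume the task until its next yield
            ready.append(t)      # still runnable: back of the queue
        except StopIteration:
            pass                 # task finished; drop it
    return log

print(run([task("A", 2), task("B", 2)]))
# -> ['A step 0', 'B step 0', 'A step 1', 'B step 1']
```

    The flip side is that a task which never yields starves everything else, which is exactly the failure mode preemption exists to fix.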
  • When Jobs brought technology in from Xerox PARC, and Adobe, he had the keys to the kingdom handed to him on a silver platter:

    1) A tokenized Forth graphics engine.

    2) Smalltalk.

    The Forth graphics engine was originally intended to grow from a programmable replacement of the NAPLPS videotex graphics protocol, into a silicon implementation of a stack machine upon which byte codes, compiled from Smalltalk would be executed. At least that's the direction in which I had hoped to see the Viewtron videotex terminal evolve when I originated the dynamically downloaded tokenized Forth graphics protocol as a replacement for NAPLPS in 1981 and discussed these ideas with the folks at Xerox PARC prior to the genesis of Postscript and Lisa.

    If Charles Moore could produce an economical 10MIPS 16 bit Forth engine on a 10K ECL gate array on virtually zero bucks back then, why couldn't Jobs with all his resources produce a silicon Postscript engine with power enough to execute Smalltalk?

    Somehow a Forth interpreter made it into the first Mac, as did Postscript, but Smalltalk just didn't.

    The Motorola 68000 family just didn't have the power. It may have been better than the Intel 86 family, but that really isn't saying much, now is it?
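
    As a rough illustration of the tokenized stack engine described above (a toy sketch with made-up opcodes, not NAPLPS, Postscript, or any real Forth): each token indexes a primitive that operates on a data stack.

```python
# Toy token-threaded stack machine: each token selects a primitive
# that manipulates a data stack, the way a Forth engine dispatches
# byte codes. The opcodes here are invented for illustration.
def execute(tokens):
    stack = []
    ops = {
        "lit": lambda s, arg: s.append(arg),             # push a literal
        "add": lambda s, _: s.append(s.pop() + s.pop()),
        "mul": lambda s, _: s.append(s.pop() * s.pop()),
        "dup": lambda s, _: s.append(s[-1]),             # duplicate top
    }
    for op, arg in tokens:
        ops[op](stack, arg)
    return stack

# (3 + 4) * 2, compiled down to postfix token form
print(execute([("lit", 3), ("lit", 4), ("add", None), ("lit", 2), ("mul", None)]))
# -> [14]
```

    A silicon version would dispatch these tokens in hardware rather than through a dictionary lookup, which is the economy the post is pointing at.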

  • So, not only do you want to be able to "muck around" in the box, you want Apple to bail you out if you break it? Sounds pretty damn contradictory to me. I've never understood why people think Macs are less hackable than PC's. Jesus, I work on HP PC's daily that have a sticker on the back that voids your warranty if you crack the case. It's not like Steve Jobs is going to break your fingers if you fire up ResEdit and do horrible things to the memory space...but don't expect his company to help you pick up the pieces!

    My mommy told me when I was very young...if you mess with stuff, you'll probably break it.
  • ummm... you can specify a startup disk which it boots off automatically. Why would you _have_ to press the Option key just to start up? That is just stupid, and no, Apple is not that dim.
  • If someone _must_ hold the Option key down to open the Startup Manager, this is unacceptable for any server (unless the user can specify a default boot partition and write it to firmware NVRAM).

    Yes, a default boot partition can be saved in PRAM.

    The 'Option key' option is, well, an option. You would use this to bypass the default.



  • I fly off the handle a lot, too, but I wouldn't
    have made a comment like that based on ONE
    experience, especially if I wasn't POSITIVE that
    the network was setup correctly, and not saturated.

    -WW
  • I really liked his use of the royal 'we' throughout the interview. It's very 'Jobsian'. I'm going to start doing that in meetings. It'll be fun.

    example:
    them: What are your priorities for this week?
    Me: This week 'we' will be optimizing the epl templates.

    They will think I'm a god -oh yeah, that'll rock...
  • The original poster conceded the fact (as does Jobs himself in the Time article re: USB) that they weren't Jobs' original ideas. It's just that Jobs successfully brought the innovations to market. And AFAIK, Jobs did so with the blessings of the original developers, in most cases.

    Were it not for Jobs, those innovations we now take for granted may have died a slow death in a lab somewhere in Palo Alto... or at least they'd have been embraced by the market much later, and our current state of the art would still be a few years off (Windows 1.0, anyone?).

    Ah, but history never reveals its alternatives.


  • Actually, it was the AUDIO people who were really put off by the lack of expansion slots - the photo-video people didn't really give much of a rat's ass; three was pretty much sufficient for a pair of SCSI cards and a high-end video card.

    But with the FireWire and USB, this situation is near to being remedied (as soon as all those nifty mainstream FireWire devices come out!)

    "The number of suckers born each minute doubles every 18 months."
  • by Anonymous Coward
    There is a big difference between making it unnecessary to "look inside the black box" (as Jobs was arguing) and impossible to look inside the box. That was the beauty of NEXTSTEP, and will be the beauty of MacOS X -- a superb, simple, and easy graphical interface, but if you want to get into the dirt of things you can drop down to a Unix shell and bang away.

    Steve Jobs fully understands why the current MacOS is crap. He wasn't responsible for it -- they booted him well before Apple's long period of stagnation. He came to Apple to fix those problems. He was off trying to make something good (NEXTSTEP), and the BSD Unix foundation that Apple is now using came from the fruits of his labor at NeXT.

    As for Apple using NEXTSTEP's BSD/Mach as the foundation for MacOS instead of trying to roll their own.. well, wouldn't you? I think it's a more intelligent decision. Why are you criticizing Apple for not developing it all in-house? Part of running an intelligent business is knowing when it's cost-effective to go outside rather than home-grow. And further, since Apple acquired NeXT, one could argue that Apple is developing their OS fully in-house now.

  • >Their TCP/IP stack can't handle ftping at more than 10KB/s on a 10BaseT connection to the server that is 20 feet away...

    Huh? There must be something very wrong with your setup. My machine easily saturates our 10BaseT. Plus, I remember Jobs nearly saturating a 100BaseT when he demoed OS8.6 and was comparing file transfer to NT.

    Anyway, YMMV, but your claim is clearly suspect.

    Yo
  • Emily? Emily Litella? Is that you?

    C'mon, girl. You can stop posting AC now :)

    (If I had a moderator widget, I'dve upped this one)


  • I'm sorry but Apple won't win any points from me for using a unix kernel. Real-time is the way to go.
  • Yup.. But that was Amelio's plan.. not Steve's.

    Big difference.
    --------------------------------
  • Probably.

    I have a nice deal with my place of work, namely that I'm employed there. Nevertheless, I stand up to them and fight them on a nearly daily basis. The two are not mutually exclusive.
  • If you're in the mood for a really good Japanese anime, try some of these: Wings of Honneamise, Grave of the Fireflies, Vision of Escaflowne, Patlabor Movie 2, Neon Genesis Evangelion, Macross Plus. Those are good starting points.

    You're right. I've seen these, plus a few hundred more. My collection runs from Marmalade Boy to Urotsukidoji, including all of Ghibli's releases.

    You'll do better to watch them subtitled, btw. Dubs are usually pretty bad.

    Amen to that.

    Boy, has this drifted, or what. Oh well, email wasn't an option.


    --

  • >As for Beos, only the kernel is proprietary,
    >every thing else is completely open.

    Really?

    Show me the source for:

    1. The 'twitcher'
    2. The tracker
    3. NetPositive
    4. Their TCP/IP implementation
    5. The underlying code behind replicants

    ...while you're at it, show me where Be says I could integrate this into my own operating system. I'd love to see Apple take on some of the special benefits behind the BeOS.

    Not so open, is it?

    As much as I like Be, the whole "Apple is stopping us" argument is tired and dead. First Apple shouldn't be obligated to support a competing OS - even though I personally think it'd be in their best interest in this case. Apple's supposed closed nature hasn't stopped the Linux community (LinuxPPC, Yellow Dog, even MkLinux). Even weirder still, such closed companies rarely release large chunks of their own source code to the public (Darwin).

    I have the utmost respect for Be, but their argument is pretty much defunct. I'd much rather they simply outright state that their PowerPC users don't hold the same importance that they used to, and quit shifting the blame. Then again, people might realize that Intel - Be's biggest investor - is being anticompetitive.

    - Darchmare
    - Axis Mutatis, http://www.axismutatis.net
  • "MessagePad *2100s* for $600-800"

    "my MP*130* can outdo"

    quick note: MP2000s have a 160MHz chip in them, as well as LOTS of other funky stuff over the 20MHz or so chips that were in the MP130. Note that these are ARM chips and not 68k chips, so they run faster than Palm's MHz for MHz anyway.

    I bought my eMate from a department store for $400AU. I think that is less than the $600 you asked for.

    The issue is that there are very few left thanks to their steveing.
  • Can't wait to get my hands on one of those things. Too bad Apple has not been releasing the specs. The way I see it is those things are a nice cheap platform to run RISC Linux on.
    As far as Steve Jobs goes, I don't care much for him. IBM has been doing far more to help Linux.
  • If someone _must_ hold the Option key down to open the Startup Manager, this is unacceptable for any server (unless the user can specify a default boot partition and write it to firmware NVRAM).

    The Option key method is used when you want to boot from a non-default volume. You are supposed to set the default boot volume (HD partition, CD, etc) using the Startup Device control panel, although I believe this still works only for MacOS. In the meantime, you must use special utilities to modify the NVRAM, or use a method similar to BootX, to boot non-Apple OS's.

    What happens when the power is restored after an outage (and a graceful UPS shutdown)? Must I go to work at 3AM to hold down the Option key?

    No, in that case the computer uses the default partition dictated by the NVRAM parameters.

  • Ok, I guess I have not made myself clear enough.. My point is not that every computer should be used for everything, but that people get so much more out of having the power of a programming language to tie together the things that they use the computer for (like using perl -e now).. and that the development of pseudo-programming languages with: (a) power (like perl), (b) connection to the applications and data the person uses (like scripting languages in general), and (c) a smooth learning curve (so that even people who cannot just go out and learn to program will eventually see how to make their lives better by getting the computer to do some grunt work for them) is ultimately more important than any amount of push-button prettiness. The reason for this is that the Church-Turing Thesis says that push-button prettiness cannot make people's lives better in the way that a programming language can.. while there is no known limit on the smoothness of the learning curve associated with a language, i.e. we should be spending more time making the programming easier.. not the initial usage.

    As for your palm pilot example: more efficient data manipulation tools would probably be a welcome extension.

    Here is an example of something that could make life easier for everyone.. an X-Windows based AI script writer. It just sits in the corner of your screen making scripts to do everything you are repetitively doing (like clicking on mp3s to DL them); then when you find yourself doing something repetitive, you just click over to the script writer and modify its half-written script to meet your specific criteria. What would happen is the script writer would not understand why you were doing what you were doing (like how you picked the files to DL), but it would have pre-written some of the grunt parts of the algorithm.

    Jeff
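
    The "AI script writer" idea above can be sketched crudely. Here is a toy Python pass (all names hypothetical; this is nothing anyone has shipped) that scans an event log for the most common repeated run of actions, the raw material such a recorder would offer back as a half-written script:

```python
# Toy repetition detector for a hypothetical script recorder: scan an
# event log for the most frequent contiguous run of n user actions,
# the kind of pattern the recorder would turn into a draft script.
from collections import Counter

def most_repeated(events, n=2):
    """Return the most common length-n action sequence and its count."""
    grams = Counter(tuple(events[i:i + n]) for i in range(len(events) - n + 1))
    if not grams:
        return None, 0
    return grams.most_common(1)[0]

log = ["click_link", "save_file", "click_link", "save_file",
       "click_link", "save_file", "open_folder"]
print(most_repeated(log))
# -> (('click_link', 'save_file'), 3)
```

    A real recorder would also have to segment the log and parameterize the differences between repeats (which file, which link), and that is where the hard AI part lives.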
  • The Pixar mini-movie of the chess dudes is from when? 1996? Several years old and many generations old in terms of animation technology.

    It won the animation academy award in what, '97?
    I'm not sure that Pixar's "animation technology" has really changed by "generations" since "Geri's Game". They've been working on refining stuff like cloth and hair, but their renderer and animation system are still pretty much the same as two years ago... and they STILL use Alias PA for modelling... SOOO mid-nineties :)

    You are comparing the best that Square has to offer against Pixar's first pieces of work.

    "Geri's Game" was Pixar's most recent of what, something like a dozen or more short films they've made over the past dozen years... If Pixar really wanted to have a few dozen artists make a couple photoreal faces like Square's, I'm pretty sure they could swing it.

  • by Suydam ( 881 ) on Wednesday October 13, 1999 @08:47AM (#1615966) Homepage
    While this is a fun article to read there really isn't anything all that earth-shattering here.

    Some of the things he attributes to Apple as reasons why it's great are pretty right on the mark though.

    Example: Apple WAS able to add USB to their boxes without worrying about anything. They just did it. But I don't know if I wholly agree with his reasoning why. It seems to me that Apple could make a jump like this precisely because they'd become a small niche player in the industry. The smaller warrior always moves faster.

  • BeOS was never part of Apple's OS plan. There were rumors in 1996 or '97 of a buyout, but that was pre-Steve. And if the Be people wanted to run on the G3's they could have asked the LinuxPPC people for the specs. I'm running this on an iMac running LinuxPPC and it runs fine. How is any of this the fault of either Apple in general or Steve Jobs in particular?
  • (or Apple does not care, after all, their focus is the consumer, not the developer, as Steve Jobs would say)

    Yes, which makes sense, considering their intended user is the Average Person and not the Unix Guru.

    Their TCP/IP stack can't handle ftping at more than 10KB/s on a 10BaseT connection to the server that is 20 feet away...

    Really! Maybe your server is running NT or something. On my home network with a Mac client talking to a Linux server I have no such problems..

  • But even if they do, do you really think anyone will trust them again?
  • by Anonymous Coward
    Why is it that whenever an Apple topic comes up, some bozo feels the need to make some completely random comment with absolutely no foundation in truth? MacOS, despite its limitations, is quite capable of saturating a 10Mb link and does almost as well with 100BaseT. Please, get a life, check your 'facts' and stop spouting rubbish.
  • I hope that I wasn't the only one un-nerved by Bill looking over my shoulder as I read the article!
    (in the time ad http://image.pathfinder.com/time/images/timelink.gif)

    seibed
  • in America where if it says animated no one over 13 could possibly like it.

    Are you sure about that? Seems like a lot of age>13 folks are sitting in the theatre when I see them.

  • ...Linux ain't fer sale!

  • One thing that really jumped out at me in the interview was his mellow attitude. If I recall correctly, he's been described as a controlling megalomaniac... I don't know if anybody's heard the "Employee Number One" anecdote, so here goes:
    Apple decided to issue all the employees name badges and ID numbers, since the company had grown beyond the point where everybody knew everybody. Wozniak was assigned number 1, Jobs number 2, etc. Jobs couldn't stand being number 2, despite the fact that Woz really was Apple at that point... So Jobs went ahead and assigned himself Employee Number Zero, since it hadn't been taken yet, and it placed him above everyone else.
  • In the end I much prefer the ability to configure my own look and feel from modern WM's to anything apple has produced. Trust me, there is a major difference in philosophy from Unix users and macintosh users. Unix users are not going to flock to Mac OS X because of the look and feel.

    What makes you think you CAN'T do this on a Mac?

    http://www.kaleidoscope.net/ [kaleidoscope.net]

    Probably MORE schemes for Kaleidoscope than there are themes for any Unix WMs. Sure the source is closed, and the authors expect $25 of your hard-earned cash... Open-source has yet to make inroads on the Mac. Still, the main scheme editor is none other than ResEdit, which is free (as in beer).

    Apple's own implementation of this feature fell somewhat short, so they dropped it. There were few complaints.

    --B

  • Uhm.... pardon me, but it's not the TCP/IP stack that cares about files... it just juggles packets.

    You should be looking at your FTP software if it crashed when saving large files.
    The render-farm only runs Renderman and other Pixar-internal software. No interactive software there, so no concern about the modeling programs. There is no reason they couldn't use Linux if they wanted to, but there's not much reason for them to use Linux, either, because once they have a Unix-like system supporting their batch-processing software, what Unix it is doesn't matter. Sun made that deal very attractive for them; I doubt they paid software license fees. And Steve wants to sell Macs, not Linux. In the several discussions we had about Linux before I left Pixar, Steve hadn't "gotten it": he still thought it took a mega-corporation to make a good desktop and that Linux had no hope of ever getting a good GUI.

    I find it amusing to watch SGI go for Linux and Debian, because it might end up that my old software group at Pixar will use my hobby project as their interactive operating system. Some of them were quite resistant to Linux when I left.

    Bruce

  • They've already said publicly that, if they really wanted to, they could've continued on the Mac platform. The point is that it was easier to do it on the Intel platform because WAY more specs were available, so that's what they went with. They didn't blame it on Apple, so much as say "Hey, our plans just don't work out together".

    -----------

    "You can't shake the Devil's hand and say you're only kidding."

  • I'm afraid you're confused. Film production, for the most part, doesn't involve tasks that would need real-time services from the operating system. The only ones I can think of are data-collection from rotoscoping devices (rotoscoping was anathema at Pixar, though), film recording, and video recording. At the NYIT CGL (Pixar's predecessor) we had some software that would look at the horizontal sync and timecode and tell VCRs when to record - that was before you could get a single-frame recorder from the factory, so we had to hack the hardware. We used PDP-11 V6 Unix for that job, and did the real-time programming at interrupt level in the device driver. If you do that, you can get real-time service from Unix or Linux, but it's probably not as easy to program as QNX. I did that VCR control software for years, but the original hack was by Bruce Laskin and Karre Christian, I think.

    Bruce

  • Pixar's specialty is not the absolute leading-edge of computer graphics, although they certainly do innovate and try to be on the leading edge. Their specialty is story. They could probably have done Toy Story II with Toy Story I technology and people would still be watching it 20 years after FF8 is forgotten, for reasons that have nothing to do with computer graphics and everything to do with the story. That doesn't mean that you might not like FF8 much better than TS2. FF8 is an "adrenaline flick". There's a time in young men's lives when nothing can beat an adrenaline flick.

    But this is not to say anything, positive or negative, about the graphics in FF8 vs. Pixar. I really haven't looked.

    Bruce

  • "Steve" is this Samoan guy with a journalism degree from Pago Pago Technical College. His real name is "Ubiquitous Bullshitus", but he says "Call me Steve!" -so we all do. Anyway, he was the recently announced winner (you must have heard of this) of the 1999 Nobel Prize for Relentless Smack-Talking. It's a great honor, and Time Magazine is fortunate to be interviewing him.
  • off topic... but i laughed out loud

    First posts rule..
    Yeah.. yeah... first posts kick ass.
    ___
    "I know kung-fu."
  • by G-Man ( 79561 ) on Wednesday October 13, 1999 @10:41AM (#1615993)
    Interesting rhetorical gymnastics. Too bad they're either misleading or flat out wrong.

    I think it is telling that Apple views its mission to make sure that the common user does not understand "the black box"

    The very quote you use says "don't need", as opposed to "make sure..does not". Quite a different logical meaning there. Perhaps you would be happier with a car or television that *forced* the user to be intimate with its underlying technology? Maybe you should have to manually set the fuel-air ratio in your Honda? Screw channels, you should have to manually tune your TV. The entire computing world is built upon the concept of functional abstraction, otherwise we'd be trying to send web pages using assembly language. Apple is trying to 'abstract' up to the user level by integrating software and hardware. Many users may not know how a computer works. So what? Perhaps, God forbid, they actually want to do other things with their lives.

    Their TCP/IP stack can't handle ftping at more than 10KB/s on a 10BaseT connection to the server that is 20 feet away...

    Gee, that's funny, my cable modem downloaded a file the other night at 180KB/s. That pipe seems pretty full to me. Perhaps my Macs are running NT without my knowledge...

    Apple finally realized that to get consumers you need to get their workplace

    Hardly. I don't see Sony products anywhere in my workplace, and they seem to do okay. People bought two million iMacs because they were easy to use and looked cool (at least to their eyes), while the computers at work were neither. Frankly, I'd be less likely to buy the same product I see at work (phone, VCR, company car) because I know PHBs only care about buying what the herd mentality tells them they should buy, and what fits with the corporate culture. They're gonna buy the white Ford Taurus GL, not the SHO, and surely not a Beetle/Audi TT/Ferrari/anything mildly interesting. As people see how clueless some IT departments are, they'll come to the same conclusion about computers.

    Not only can their product not work at that level, but they have no interest in developing one that can (MS at least used the OS/2 code they had written for IBM to make NT)

    So the world needs another kernel? Avie Tevanian did a lot of work on what became the Mach kernel. Avie worked for Steve at NeXT. Apple bought NeXT. A lot of the other technology (e.g., QuickTime) was "homegrown" at Apple. People used to complain that Apple had NIH-syndrome. Now you criticize them because they didn't reinvent the wheel? So what are we to make of companies that now support Linux? How about IBM? Is this an indictment against OS/2 and AIX, or is it just good business? How about SGI and IRIX?

    So what if Jobs has his own ideology about technology? Since Apple is a vertical integrator, they will never dominate the overall market. You can take Steve's vision or leave it. If I don't like Saab's vision for the automobile, I don't buy one, but I'm not frightened by them. I'm more frightened by a company that wants to have a piece of everything. Now who could that be?
  • by gsfprez ( 27403 ) on Wednesday October 13, 1999 @10:46AM (#1615996)
    His products sell, his products have inspired all of consumer electronics, over 90% of iMac owners are on the internet, and Apple's stock has travelled from $14 to over $70.

    I'm still looking for why people are "scared" of him, why they don't "get" him, and feel compelled to bag on his goals.

    Everything he's done at Apple has helped Apple and made better products and made everyone involved money.

    Last time I checked, even people that use Linux would like to accomplish goals like that.
    ___
    "I know kung-fu."
  • When someone's telling you why the Mac sux and they say "co-operative multitasking" you know that the almost worthless debate has become completely worthless. It's equivalent to saying you use NT instead of Linux because NT has a microkernel and Linux doesn't. It's way too much of an over simplification.
  • The real breakthrough in user interface is not some push button GUI.. it is a Programming Language which has a really good learning curve.. and that has access to all the information the user wants to manipulate. Lightweight programming needs to be something a user cannot help but do in their everyday life ... one day programming should be thought of like driving a car is today.

    Mmmmm ... yummy AppleScript. Easy to learn, natural language syntax, object-oriented, controls the whole computer and many of the apps. Built into every sweet, sweet Apple pie. Tasty.


  • (As a former A/UX user, I just have to chip in.)

    For those who might not have experienced A/UX, it was a full-blown System V UNIX that ran on some Mac II and Quadra hardware.

    The cool thing about it was that MacOS 7 booted on top of A/UX as a UNIX process, giving you full access to Mac programs alongside your Unix command shell. Note that there was no GUI for the Unix programs - you couldn't run X Windows with the MacOS running. This meant either Unix command programs or un-modern MacOS programs, but no nice Mac GUI for Unix programs.

    The uncool thing about A/UX was that GNU boycotted it, so there was no gcc, meaning lots of software was not accessible.

    Anyways, as you said, Apple *could* have ported A/UX to PowerMacs, but that would mean big licence fees to UNIX, inc. Much better to salvage the cool part (MacOS-on-UNIX, aka "Blue Box" or "Classic"), and port that to an open platform like Mach/BSD.
  • The Apple ][, besides being first-to-market with a complete solution, was also much cheaper than your average CP/M machine in those days.

    (Z80/S-100/CP/M machines were primarily kit machines or put together by local retailers. I think the Apple ][ debuted at $2500, while CP/M machines were double that price once you got all the parts.

    The CP/M machines were also very fractured from a hardware standpoint, with a number of competing video and disk drive standards. This problem wasn't really solved until IBM introduced its own CP/M 'clone', the PC. Before then, the #1 selling CP/M computer was actually the Apple ][ with an optional Z80 card made by Microsoft of all people.)
  • Here is an example of something that could make life easier for everyone.. an X-Windows based AI script writer. It just sits in the corner of your screen making scripts to do everything you are repetitively doing (like clicking on mp3s to DL them); then when you find yourself doing something repetitive, you just click over to the script writer and modify its half-written script to meet your specific criteria.

    You are describing the Macintosh Script Editor, except for the fact that it's not always on. You turn it on or off when you want to record a script for a particular repetitive task. When you're done recording, you can edit, modify and save the script as an application. I use it all the time.

    The Mac even has a script menu that sits in the upper right all the time (if you choose to use it) so you can access OS-level scripts and application-level scripts that you've recorded or written within applications. Mac users trade scripts with each other. "Hey, here's a script that keeps two folders synchronized, no matter where they are." ... "Here's a script that sorts files into separate folders depending on file type." ... "Here's a script that launches your mail client at a certain time and checks mail and then quits."

    I thought it was funny that you described AppleScript in your first message, where you said the Mac was too limited and for stupid people, and what they really need is easy programming, but it got beyond funny when you described the Script Editor, too.

    The beauty of the Mac is that the complexity is not in your face, so it's easy to learn. Unfortunately, people who want to sneer at the Mac use one for a minute or two and think they've seen the whole thing. All of the things you'd do with a GUI/CLI can also be done with a Mac, just within the one interface, which happens to be a GUI. Our command lines are flexible scripts that can be treated as applications or attached to other objects to make the objects smarter.
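    For readers without a Mac handy, the "sorts files into separate folders depending on file type" script traded around above is easy to sketch in any scripting language; here's a minimal Python equivalent (the folder names and the extension-to-folder mapping are invented for illustration, not taken from any actual AppleScript):

```python
import shutil
import tempfile
from pathlib import Path

# Hypothetical mapping of file extensions to destination folder names
FOLDERS = {".jpg": "Images", ".gif": "Images", ".txt": "Documents", ".mp3": "Audio"}

def sort_by_type(directory):
    """Move each file in `directory` into a subfolder named for its type."""
    root = Path(directory)
    for item in list(root.iterdir()):  # snapshot first; we create dirs as we go
        if item.is_file():
            dest = root / FOLDERS.get(item.suffix.lower(), "Other")
            dest.mkdir(exist_ok=True)  # create the subfolder on demand
            shutil.move(str(item), str(dest / item.name))

# Demo in a scratch directory so nothing real gets moved
demo = Path(tempfile.mkdtemp())
for name in ("photo.jpg", "notes.txt", "song.mp3", "mystery.xyz"):
    (demo / name).touch()
sort_by_type(demo)
print(sorted(p.relative_to(demo).as_posix() for p in demo.rglob("*") if p.is_file()))
# → ['Audio/song.mp3', 'Documents/notes.txt', 'Images/photo.jpg', 'Other/mystery.xyz']
```

    The AppleScript version the poster describes would be recorded rather than typed, but the logic is the same either way.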

  • Jobs can have his ideology (or to be nicer about it, vision of the future of computing)

    Others can like it.

    I was stating my objections to it.

    I hardly think he is stupid, but I think he is running in the wrong direction...
    We are all in the gutter, but some of us are looking at the stars --Oscar Wilde
  • it was talking to a Sun Ultra10
    We are all in the gutter, but some of us are looking at the stars --Oscar Wilde
  • by qwerjkl ( 97170 ) on Wednesday October 13, 1999 @10:49AM (#1616009) Homepage
    My opinion of Jobs is finally starting to form a bit. The way he phrases his sentences is very familiar, much like the way very smart people I know phrase things, with confidence. Probably my two favorite things he said in the interview were the comment about normal vs. talented people, and Woz. He said that a small group of very talented people can do much greater things than ANY number of normal people. And I think that is definitely correct. He also said that the 'talented' label usually means only 30% better than normal, with twice as good being VERY good. Then he says Woz was 25 times or more better than average. Wow. That's a compliment. Anyhoo... I dunno bout you guys, but I like reading these kind of stories, a bit of the history of computing, a bit of its future.
  • by methuseleh ( 29812 ) on Wednesday October 13, 1999 @10:51AM (#1616010)
    Ok, son. Time to put down your Pokemon cards and learn a little bit about computer history. You see, before he sold all of that colored plastic, he actually sold quite a bit of beige plastic. In fact, way back in the mid 80s, when you were just a gleam in your parents' eyes, Mr. Jobs and his partner Steve Wozniak introduced a funky looking little box called the Macintosh computer. And, surprisingly, that underpowered, tiny-screened, expensive little chunk of plastic did change the world. See, there are these other old dudes named John Warnock and Chuck Geschke who developed a language called PostScript that could put pretty pictures and cute letters on paper using another overpriced, underpowered, slow, expensive beige plastic box called a laser printer. And yet another old dude named Paul Brainerd created a slow, underpowered, clunky program called PageMaker, which ran on Steve Jobs' little beige box and printed out pretty pages on that PostScript-driven beige box. The result was something called "desktop publishing" which really did change the world by making quality printed communication much more accessible to common folk who didn't have big DEC PDP-11s or phototypesetting machines in their garages. Sure, he didn't bring world peace or end world hunger, but he did change the world in his own small way, and with a lot of help from others. And he's done much more than just sell a lot of colored plastic. Now go on back to your Pokemons.

    --

  • by smileyy ( 11535 ) <smileyy@gmail.com> on Wednesday October 13, 1999 @12:43PM (#1616013)

    Large numbers of USB peripherals did not start to appear (and, correspondingly, appear cheaply) until Apple forced the issue with the iMac.

    Since the PCs still had traditional serial ports, companies saw no compelling reason to start producing USB peripherals, despite the technology's superiority.

    Of course, spotty USB support in the various Windowsen also aggravated the problem.

    Well, it was very reproducible, and that collision domain was not saturated at all (I tried this transfer several times as the OS crashed 100MB in each time... and in between, transferring from one Sun to another on that domain, I had no such problems -- near 1MB/s, about 100x faster).

    Not to say that that is the structural limitation of Macs, but they have severe issues... the crashing due to large transfers, for example.. it could be an older version, but it was a G3.. what MacOS version that is I do not know, but as a fairly recent computer it is hard to believe that it is THAT outdated...
    We are all in the gutter, but some of us are looking at the stars --Oscar Wilde
  • Why would I want a Unix with a 1984 look and feel (Mac)? Honestly, I never could figure out why Macintosh users were so amazed at the Macintosh look and feel; to me it was never anything special even after I used it for some time. In the end I much prefer the ability to configure my own look and feel from modern WMs to anything Apple has produced. Trust me, there is a major difference in philosophy between Unix users and Macintosh users. Unix users are not going to flock to Mac OS X because of the look and feel.
  • You shouldn't need to muck around, resolving driver conflicts, trying to get Linux to run with your video card, etc.

    As long as the reality is "You shouldn't have to muck around...but you can if you want to", and not "You shouldn't have to muck around, so we're going to keep you from doing so", then I'm happy.

    Given Mac OS X's BSD underpinnings, I'm (fairly) confident that muck-around ability will exist in future Mac OS's. How supported it will be is another issue. =/

  • Too bad Gil killed Newton.. Then we would have two good PDA devices. I wish it never died.

    I believe the Newton was "Steved," not Gil-ed. I bought an eMate about a month before Jobs came in and shut down the Newtons, but luckily, I haven't had a single problem with it.

    In fact, I use it every day to take notes in my big lecture classes at Penn State. It works like a charm, and I'd highly recommend it to any other college students. The fact that its battery life is between 12 and 15 hours is the greatest thing about it.

  • I have BeOS installed on a spare drive on my home Mac, and on a spare partition on my work PC. I've used it since the Preview Release (i.e. 1997). I love the snappy feel of the OS, and particularly the strong underpinnings the OS has for real-time applications like audio. The fact is, though, that there's been precious little (detectable) forward motion in the OS (from a user point of view) since the preview release 18 or so months ago. It seems like so much of their effort is being split trying to support umpteen million video card, sound card, and motherboard combinations while the stuff that drew me to the OS back in '97 has slowed to a crawl. Anyway, I wish Be luck, but for me that boat's already sailed.
  • I did not claim they try to make them hard to understand per se, they occlude the internal machinery, meaning "understanding" macOS is nothing more than playing with ResEdit...

    I may not have used Macs much, but some things are universal, and this was more of a comment on Jobs than Macs...

    they have some saving grace, but his direction is a poor one for Apple... being a niche player is not a good place to be, and I think he is doing his shareholders a disservice even if they do not recognize it as such

    Of course these are my opinions and I do not own Apple stock. You obviously disagree. Viva la difference
    We are all in the gutter, but some of us are looking at the stars --Oscar Wilde
  • There is a lot of talk about how important it is to make things easy to use, but unfortunately most of the people talking (like Jobs) do not really understand what a computer is.. (Now that I have your attention, I can make my point)

    A computer is not a typewriter.. it is not a video game.. it is not a spell checker.. and it is not all of those things. It is a machine which represents a model of computation equivalent to a Turing machine.. meaning that it can run any algorithm, i.e. do anything you can figure out how to tell it to do. The important part of this is the ANYTHING part. Jobs may make a few simple jobs (pun) easier for a few stupid people.. and this is a good thing.. but it is not what we CAN do with a computer.

    The truth is NO ONE can make a ``super easy to use'' model of computation equivalent to a Turing machine.. Not Jobs.. not Bill Gates.. Not Anybody. Note: ``super easy to use'' is DEFINED to mean no need to understand logic or basic flow control of some sort in this statement. This is just the Church-Turing Thesis applied to user interface design.

    The real breakthrough in user interface is not some push button GUI.. it is a Programming Language which has a really good learning curve.. and that has access to all the information the user wants to manipulate. Lightweight programming needs to be something a user cannot help but do in their everyday life..

    I'd say the closest thing we have to this today is shell scripting and maybe Excel type things, but one day programming should be thought of like driving a car is today.

    Jeff
  • Holy shit!!!

    Ok, some people may disagree with what I said, but check out the moderations:
    Moderation Totals:Troll=1, Insightful=1, Interesting=3, Overrated=4, Total=9.

    !!!!
    We are all in the gutter, but some of us are looking at the stars --Oscar Wilde
  • *cough* it was actually Windoze 98.
  • In comparing auto mechanics with "computer savvy people," you are starting with your assumption and deriving the rest from there.
    The real "computer savvy" people are usually intelligent people who have taken the time and effort to educate and discipline themselves (either formally or not so formally) in computers. There is a reason why the word "engineer" is added to so many computer-related jobs. Network engineer, Computer engineer, Software Engineer: each of these jobs heavily requires complex thinking, reading, and most of all, analysis. All of these skills carry over into other parts of life, including politics, religion, etc. It's not that these skills actually make our opinions right, it just means that we can support our decisions with rational arguments, either to convince others, or just to make ourselves more confident in our own opinions.
    This is the difference between a computer tech and a mechanic. Mechanics are looked down on because most of them are not really mechanics, but wrench monkeys: high-school dropouts who aren't afraid to get their hands dirty.
    In the end, your argument can be applied to almost any profession:
    Physicians are just like mechanics, except they know a lot about fixing people.
    Engineers are just like mechanics, except they just know a lot about building things.
    God is just like a mechanic, except he just knows a lot about making Universes.
    Do any of the above really make sense? Not really, unless you do some fancy word and mind tricks.

    And one last point, about computer professionals looking down on users for stupid mistakes: it's the same in any specialized field:
    Dumbass! You didn't change the oil in 50,000 miles!
    Moron! You can't build a house on a floodplain and not get wet!
    Freak! Of course weight doesn't affect acceleration (in the first order).


    I could go on rambling, but I can sense your eyes drooping already....
  • Obviously you never used an Amiga.

    The first Amiga was based on an 8MHz (actually 7.14) 68000 with 256k ram. Yes, that's right, 256k, not 128, but that was about a year later than the Mac, and the extra memory was partially offset by the fact that it was *color* and higher resolution than the early Macintoshes, hence requiring more memory to display the graphics.

    The Amiga had fully pre-emptive multitasking from day 1, using a well written and extremely efficient microkernel architecture, with dynamically loaded libraries and fully asynchronous IO, among other things.

    Part of the reason it was so small and fast was that the entire microkernel was written in hand-coded assembly.

    If you're interested in specifics, there is a *very* good interview with Carl Sassenrath, the designer of the Amiga kernel, at
    http://technetcast.ddj.com/tnc_981120.html

    I challenge you to say that the 8MHz early Macintoshes (Mac 128 to 512, or Mac Classic) were faster than the early Amigas for real world applications. No way.

  • Certainly it's annoying as hell to have anything obscuring the animation. BUT, I read ultra-fast. This is a blessing and a curse. I don't lose time reading subtitles on anime, b/c I glance at them and comprehension goes right to my brain. It makes me really susceptible to advertising though, that's the downside.

    My problem is that with very rare exceptions, dubs are bad.

    The fx are bad.

    Songs either remain in Japanese or are translated to English, usually with dismal results (the Bubblegum Crisis dubs had to be buried in a salt mine for 10,000 years).

    Voice actors don't usually convey the subtleties of the script, although I attribute this to there not being much demand for that domestically. Japanese production companies scrimp on animation, which is why there is such an emphasis on the cheaper audio side of any given anime. Certain episodes of Escaflowne and Evangelion have absolutely ZERO movement for a couple minutes at times. But it's carried by the vocal performances, and unless you have an eye for it, it doesn't bother you. Additionally, and this might seem kind of pretentious here, but I really do feel that the Japanese language uses a lot of nuance to get things across, conveying an additional bonus that we relatively brash Americans don't have.

    Badly done dubs are the worst, no question. But I'd rather have a sub to fall back on than ignore the anime altogether. DVDs are turning out to be great for anime because they can carry multiple soundtracks and run the subs as closed captioning (which can thus be turned off if you so desire).

    Ghost in the Shell, to me, had a crappy dub. Motoko had no emotion in her voice at all, and was very flat. I have not seen the subbed version, but based on the manga, I have trouble believing that she'd speak like that. Batou was also usually pretty boring. One of these days I'll hear it in Japanese though, and be able to make a better comparison.

    Haven't seen the Kiki dub at all. (Besides, Porco Rosso was the balls ;)

    Basically, I just think that, barring my learning Japanese, the most faithful version of an anime will be the subbed one. I want to hear every little thing, and if I want to see the full animation, I'll watch it again w/o the subs but with the Japanese soundtrack. That's more important to me - the quality that I know is in the original - than losing a little convenience.

    But, to each his own.

    (Ever hear some of the voices on Sailor Moon? It makes me shudder even more than watching the original Japanese Sailor Moon ;)

    Boy have we diverged from the original subject or what?
  • by Wah ( 30840 ) on Wednesday October 13, 1999 @11:01AM (#1616034) Homepage Journal
    Both times? :)

    I'm just saying no one (read: major movie studios) seems to think there is a market for really cool animated movies (I'm thinking like HBO's Spawn, i.e. rated R). Because of the Disney factor everyone here thinks animation means cute, fuzzy, and happy. Animation techniques have gotten so amazing I just wish someone would make a Star Wars/Hobbit type epic. That's what my hope for the FF movie is.
    And I know Pixar ain't gonna make it.
  • As he acknowledged, he works alongside a lot of "talented" people. Many of the things he does, he does merely as team leader. For all of those people who like to harp on Jobs' inflated ego (and I'm not saying it isn't), he seems to give a lot of credit to his "team". Thus, in the context in which he was speaking, the use of "we" seemed appropriate.

    You'll notice that when he described his daily routine (waking up, logging on, eating breakfast with his kids, etc...) he didn't use the "royal we", and appropriately so.

    If you or I were answering the same questions, I doubt we'd use language that was much different.

    --

  • I think a few points should be made about what Jobs didn't say. First off, he never mentioned NextStep - except for once as an aside.

    Second, he didn't say a thing about Apple Enterprise.

    I think it's very telling. Jobs is planning on selling consumer black-boxes, he's made that fairly clear. I don't think he can make it any clearer than he doesn't want to sell to corporate America. Odd, that -- his return to Apple was on the coat-tails of NEXT Software, which is now Apple Enterprise, which he didn't mention once in his interview...
  • by um... Lucas ( 13147 ) on Wednesday October 13, 1999 @11:09AM (#1616037) Journal
  • I remember a TV special a while back where Pixar was explaining their choice in computers. It had nothing to do with price/performance, but rather performance/cubic foot... They liked the Suns because they were so small they could stack tons and tons of them in a relatively small area and get much more processing power than they could by trying to fit Onyxes or anything else into that area.

    I think that shows that there are so many other factors to consider when you're in need of processing power... Yeah, Alphas are cheap, but who's going to sell you 400 quad Alpha systems and ship them standard in low-profile enclosures?

    anyone involved in the real world can tell you that linux/xeon is the only way to go for rendering and 3d animation/modeling.

    You're just crazy! For one, so many studios and/or software companies would need to port their solutions to Linux. For two, Xeon is a bum when it comes to the high end... Yeah, it's 25% faster than PIII's, but compared to Alpha, SPARC, MIPS, PA-RISC, PowerPC, it's the bottom of the barrel as far as floating point performance goes... and that's what you need for rendering... lots of it...

  • How correct is his statement about USB, though? I don't remember exactly when the iMac came out, but USB ports have been on PII motherboards since the very beginning, haven't they?

    If anything, it was Microsoft who were slow (not that we should talk).

    -
    /. is like a steer's horns, a point here, a point there and a lot of bull in between.
  • Uhm, clearly you don't know the meaning of Real Time in this context (OS), or you would have avoided to post.
  • The first computer I ever used was an Apple ][+, but jeez -- if they'd wasted time trying to make the early Macs backwards compatible with Apple ][s they would have never gotten them out the door. One of the reasons so many computers are such a complete pain in the ass today is the degree of reverse compatibility they maintain with the IBM PC Model 5150 from 1981.
  • I think it is telling that Apple views its mission to make sure that the common user does not understand "the black box"

    I think you misunderstood. Apple views its mission to make sure that the common user does not have to understand "the black box"

    This is similar to the way we no longer have to manually adjust the choke and ignition timing to enable us to start our cars in the morning. Of course those of us who know how still can, if we wish. But we don't have to start our grandmother's cars for them.


    --

  • Well, look at his old chap, BillG:

    His products sell, over 90% of computer owners are his customers, and Microsoft's stock has travelled from $10 to over $170 (before the split).

    So why are people 'scared' of him?

    Steve Jobs is a great packager and salesman. And he doesn't claim to be anything else, besides some jihad overtones... keep in mind that most of the technical work on the earlier Apple machines and the Mac was done by Woz, not Steve.
  • I'm just saying no one (read: major movie studios) seems to think there is a market for really cool animated movies

    Miramax (a Disney subsidiary) is hoping to change that. Check out Princess Mononoke [princess-mononoke.com]. No, she's not a princess by the strict Disney definition.

    Then again, I think the best animation is story-driven. It doesn't matter how well rendered it is if there's no story or plot. Toy Story worked because it was a good story well-told.
    --

  • As soon as.... I've only been waiting forever

    If you can't find a Firewire device you like at Firepower [firepower.com] then you've got a problem. That's just one company. Dealers are selling these things for about 15-20% less than the list prices on this site, too (like all computer gear). Firewire is well into the useful stage.

    such as photo, video and others. All of which relies on many many expansion slots.

    Yeah, photo and video people just HATE Firewire, right? There hasn't been a Mac produced with more than three empty PCI slots for a year now, and they can barely make enough of them to meet the demand. AND, if you really need more slots, there's an expansion chassis from a third-party that gives you six or eight more.

  • While every computer may be a universal computer (Turing Machine) in the sense that it can run any algorithm, few (actually, none) are used in this manner. Every computer may have infinite *potential* uses, but the actual uses are bounded by the applications it runs. It may be replication of a physical product like a CD or DVD player, it may be something which has no physical analog like a Web browser, or perhaps something in-between like a word processor.

    AFAIK, my Palm Pilot is a Turing Machine, but it is simple to use because it is limited to a certain set of apps. Could someone port The Gimp to it? Well, yes, but obviously the form factor of the Palm does not lend itself to that use. This particular *physical* instantiation of a Turing Machine is limited in functionality.

    To a lesser degree, the same is still true of the home PC. To the extent that one can identify the uses for a particular computer -- web, e-mail, word processing, games -- one can make it "super easy" to use. Obviously, if you then load Mathematica or AutoCAD onto the machine, it may no longer be easy to use, but that is driven by the complexity of the application.
  • While it's not going to win any awards for bangs-for-bucks, Pixar are using a bunch of nice Sun hardware for their render-farm - 120 E4500s and 4.5 terabytes of disc storage, totaling 1680 CPUs. List price has lots of zeros.... Sun press release [sun.com].
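    Those press-release numbers divide out evenly, which suggests every box is fully loaded (14 CPUs is, as far as I recall, the E4500's maximum configuration); a trivial sanity check:

```python
# Back-of-the-envelope check on the Sun press-release numbers quoted above.
servers = 120        # Enterprise 4500s in Pixar's render farm
total_cpus = 1680    # total CPU count from the press release
cpus_per_server = total_cpus / servers
print(cpus_per_server)  # → 14.0
```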
  • Why can't you praise them both? I don't know anything about Foundation Imaging, but just because their CGI may be better than Pixar doesn't mean Pixar doesn't deserve all the kudos that's coming to them.

    Plus, CGI sophistication is only a part of good moviemaking... "Roughnecks" may have phenomenal CGI but if the story sucks, the movie sucks IMHO.

    The reason "Toy Story" and "A Bug's Life" were both such excellent movies is because they had excellently-written storylines, exceptional voice talent, etc. The great accomplishment of the CGI is that you forgot that it was CGI.

    --

  • Let's put this in historical perspective, shall we? Jobs et al. get a look at all the goodies at Xerox PARC. Yes, they saw the Forth graphics engine. Yes, they saw Smalltalk (and in the general sense OOP), and lots of other things that were going to be important in a few years that they didn't capitalize on. And what treasure did Jobs take out of this little raid? What idea would epitomize Jobs "just not getting it"? The GUI.

    One revolutionized the way people used computers, and the other, well, is kind of neat if you like that sort of stuff.
  • by evilpenguin ( 18720 ) on Wednesday October 13, 1999 @12:05PM (#1616070)
    Because of Steve-o many killer products (developed or ripped off) have been brought to market: -the mouse -the networked laser printer -expansion slots (Apple II)

    WARNING! NITPICK AHEAD!

    The Apple was hardly the first computer (even personal computer) with expansion slots. There were two major camps in the 8-bit computing world. Those who centered their designs around the 6502, and those who centered their designs around the 8080/Z80. Most of the early 8080/Z80 designs used something called the S-100 bus. It was a 100-pin bus and most of the designs had the CPU as just another card. You could swap everything including the processor. Not only that, but it was a broadcast bus so you didn't have this "slot address" crap you had with the Apple ][ bus.

    The Apple did a lot, and I still think Visicalc was one of the finest pieces of software writing of all time (all that functionality squeezed out of an inferior processor running in some tight memory limits, and to this day Excel doesn't give you that much more functionality), but there were much more sophisticated architectures out there.

    They didn't win the marketing war, though.

    As I said, a nitpick. BTW, I was moving a really old couch out of my parents' basement and I found a computer hobbyist catalog from 1976 in there. How would you like to buy an S-100 bus 32k (that's "k") static memory card for $835?

    That's what these things cost assembled. No wonder my Dad and I wire-wrapped our first computer...
  • This is just the usual clueless Apple bashing. He even admits himself to almost never having used macs. But of course he considers himself an expert anyway!

    The only technical claim, about Mac TCP/IP speed, has been shown to be wrong by a factor of 100 or so by several other posters.

    The claim that MacOS uses co-operative multitasking because of some ideological decision is absurd. What ideology, exactly, would that be? In reality, preemptive multitasking just cannot be retrofitted into MacOS, and Apple has been working hard (if not always successfully...) for almost 10 years to replace it.

    The claim that Apple tries to make their products hard to understand is too silly to comment on.

    If the BSD comment is meant to mean anything, I can't decipher what it is (but I'm sure it is wrong :-).

    The only thing sadder than techno macho posturing from people who don't actually know what they're talking about is probably those who moderate it up as "Insightful".

    Bah!
  • > Imagine when Apple releases MacOS X client, AND IT WORKS. Unix with the MacOS look and feel

    run, do not walk, to your software archive, pull that old 68k Mac out from mothballs, load up A/UX 3.0 ..... you mean you want it like that? but you HAVE it like that already (ok so it's 10 years old and only runs on 68k macs - porting's easy these days - if they wanted A/UX on PPCs it would have happened in 6 months) .... it's just that Apple threw it away and now seem to want to do it again from scratch .... oh well

  • Well, I'm looking forward to it, and I think it'll be pretty good. But what I really want to see right now is Disney's Fantasia 2000. DL'ed the trailer yesterday and even though the stupid QT movie is the size of a wheat thin, it looks amazing. Now I really have to hope there's no Y2K crisis - release date is 1/1/2000

    obApple: They're actually doing really well, despite the G4 not shipping in the quantities they need it to. (Personally I couldn't care less about bugs that only appear over the rated speed. There's nothing new about chips running into problems over the rated speed...) The new iMacs are going to sell amazingly well, and iBooks and G4s are getting into the channels. Jobs is not an especially great guy, in fact he's a creep. The difference between Steve and Bill is merely that Steve is an egomaniac and Bill's a megalomaniac. I wonder if they've ever been seen together irl... hm...

    But it's undeniable that Steve has been doing good stuff for Apple for the past couple years. This doesn't necessarily translate as good stuff for the consumers, esp those of us who had clones and/or used Linux or BeOS. But even then there's still a need for a strong Apple as a foundation. So I'm cool with Steve running the show. He just can't slack off, with his helicopter and handicapped parking spot.

  • by Anonymous Coward
    No contest. Steve hands down. One of the major problems with Linux is that it does not have a salesperson. Linus will put the audience to sleep, and Eric is worse than Mussolini crossing his arms on the pulpit. Imagine when Apple releases MacOS X client, AND IT WORKS. Unix with the MacOS look and feel? Yikes! Oh boy, seems to make Linux look pretty ugly, but hey, it's open. Question is, who cares about that. Yeah, yeah, I know. Linux this... Linux that. Ouch boys, I just bought an iMac, DV special edition. Perfect for my undersized Manhattan studio.
  • nice to hear Apple is still innovating - for a while there I thought Microsoft® was going to run out of new ideas to copy.

    Chuck
  • Their TCP/IP stack can't handle ftping at more than 10KB/s on a 10BaseT connection to the server that is 20 feet away...

    This is just silly. I used to routinely approach 500 k/s with Anarchie as far back as System 7.5. At home now Anarchie can completely saturate my cable-modem connection in both directions. Your exaggerations weaken your argument.

  • Um, I'm a big fan of Apple and the Mac, but I have to step in and correct you here.

    The mouse was developed a long, long time ago. Jef Raskin, IIRC, brought it and most of the GUI stuff to Apple (wish I had my copy of Infinite Loop handy) and managed to get Steve happy with it. After Steve's baby, the Apple III, died. And then after Steve's next baby, the Lisa (computer), died. Then Steve kicked Raskin out. He's not a pleasant guy. A couple of IBM PC programs used mice before the Mac came along as well, but were never popular.

    Steve and most of Apple fought the laser printer tooth and nail. It _barely_ got approved, along with support for Aldus PageMaker and Adobe PostScript, which turned out to save the Mac from certain death in the 85-88 timeframe. DTP is still the strongest Mac market. I know. I do DTP.

    Expansion slots were used on a bunch of different computers before the Apple II. (see someone else's post for details)

    He had nothing to do with 24-bit video on Macs (it first started appearing after the Mac II came along, and first on the motherboard when the Quadras appeared, both of which happened after he was gone). Steve liked black and white, and the NeXT didn't go color for a really long time because of this.

    Steve is not really a visionary at all. All of this stuff was either pushed on him, approved by other people in spite of him, or had absolutely nothing to do with Steve.

    (in fact, there's a really funny story about the 3.5" floppy drive in the early Mac days)

    What Steve's good at is taking credit and marketing well. He is dangerous to a company in other capacities. Read "The NeXT Big Thing" and be amazed that NeXT did not spontaneously combust under his leadership, and lack thereof. He's been good for Apple this time around, but I'd be wary of relying on him too much.
  • I still find it very interesting that Apple is supporting the porting of Linux to the Mac platform at all. They hold all the cards on the Macintosh platform, yet they have chosen to give up the OS monopoly. I know they have problems with Be, and I don't know what that is about so I can't formulate an opinion, but for a company that has been the only source of an OS to support bringing in another OS is pretty impressive to me.
    They open-sourced their QuickTime server, which is also fairly impressive. Of course IBM is doing more to help Linux, though; I mean, 90% of Linux computers are running on Intel machines, thus IBM and Intel have more to gain right now. If MkLinux or LinuxPPC gets more popular and more mainstream, I think that there will be more support from Apple.
    The important thing with Apple's Linux support, though, is that maybe one day we will see off-the-shelf iMac/Linux boxes. Sure, IBM is supporting it in the background, but why can't I go to CompUSA and get an IBM with Linux on it? That would do more for the popularity of Linux than probably anything before. I think the recent M$ Linux Myths page solidifies this: they name several large companies that use NT, but Intel didn't step up and say, "Hey, when we debuted our 64-bit Merced chip it was running Linux, NOT NT." Sony didn't step up and say, "Hey, our development environment, a computer three times more powerful than the Pentium III and immensely more powerful graphically than anything out now, is running Linux, NOT NT." TiVo didn't say "We chose Linux over CE", and Nokia didn't either. So the support is there, but it is not mainstream support. I don't have a Mac, so I don't know the difference between MkLinux and LinuxPPC (are they even different projects?), but for Apple to help open up a whole new platform to Linux is not to go unnoticed.
    Of course you also have to consider Darwin, Apple's new open source OS based on Mac OS X. Does anybody know anything about this? I'd say that could be a major contributor to the open source movement. Even if it isn't any good, I don't see Microsoft releasing any open source operating systems. And although I'm not a lawyer and have a hard time reading licensing agreements, the Apple Public License seems to be fairly open, unlike the Sun Public License. If I am wrong please correct me, but it seems as though they are saying if you make changes for internal use only then you don't have to release your code, but if you "deploy" the software then you have to release your code. Sounds kinda GPLish to me. So maybe they aren't helping Linux greatly per se, but they are helping the open source movement become more mainstream. And regardless of what we think, for a company that is in the business of making money that is pretty impressive.

  • I think it is telling that Apple views it as its mission to make sure that the common user does not understand "the black box"

    What's more important in the consumer market? That the consumer can use the product to reach an end, or that the consumer knows exactly what internal processes achieve the end? Very few people could tell you how their CD player or VCR works--but they can easily use them to reach their ends. You shouldn't need to muck around, resolving driver conflicts, trying to get Linux to run with your video card, etc.

