
New G4s Coming Our Way

Posted by Hemos
from the more-power-to-shake-a-stick-at dept.
MasterOfDisaster writes "According to c|net, and this article on maccentral.com, Apple will release "four new, single-processor Power Mac G4 models, all using a 133MHz system bus, and ranging in speed from 466MHz to 733MHz" as well as MacOS 9.1 and several other things, next Tuesday at MacWorld Expo in San Francisco."
This discussion has been archived. No new comments can be posted.

  • i don't think this is nearly as bad as you make it out to be. here's the thing: computers are getting too fast these days. there are very few people who need 1GHz computers. most people just need a pretty average machine, and even an "average" machine these days is pretty quick!

    with processor speeds increasing the way they have, i predict that computers will start to sell based on criteria other than just "speed." this is where you're going to see Apple really take off. it'll be similar to why people buy cars: they don't just buy the car with the fastest engine, they buy the one with the features and style they want.

    you say the "average consumer" is going to pick the bigger number of MHz. i say the "average consumer" doesn't even care! have you talked to "normal" people about getting a new computer? this is what they say: "I want to buy a computer." they don't say, "i want to buy a 1GHz Athlon." most computer-illiterate people i've met just think of a computer as a computer. as long as it's not "old" (that is, used), it's just a "computer" the same as a car is a car. they'll go out and buy the one they like the most after "test driving" it in the store.

    it's mostly computer-savvy or at least somewhat-computer-interested people who even look at "specs." it's the people who have a passing interest, but no really solid knowledge of computers, that buy based on the bigger number of MHz. when you start selling to people who really don't give a shit as long as the computer does the job they want it to, then pretty Apple computers, with easy FireWire and USB ports and the slick interface of Aqua, are going to sell.

    at any rate, i'm very much looking forward to the future of Apple. i love running Linux, but i still get all my "real work" done on a Mac, and i don't think that's going to change with Mac OS X (except that it may actually cause me to use my linux box considerably less)

    - j

  • It is true it does. Anyway, I am still setting up a Linux box on an old IBM Aptiva that I got for free: Linux Mandrake on a 75MHz Pentium. Should be a real screamer. Personally, chip speed means squat to me. I want OS stability, and so do many others. OS X will spur more Mac sales than chip speed alone.
  • Yes, it is.

    I told a Mac friend I would help him troubleshoot his Mac internet server (WebSTAR, icky, but stable/secure). Anyway, long story short, the server process was not getting any CPU time because a pop-up window was preventing it from switching tasks. Yes, that's right: a little 'Click Accept' dialog was shutting down his site. That's how "not that bad" it is. An open window will DoS a Mac box. It's actually pathetic. I had pre-emptive multitasking on my Amiga, 12 years ago.

  • I remember having a conversation in 1994 about the future of Windows. I remember two code names -- Cairo and Chicago. I think one of them was Win95 and the other was what became NT 5.0. The projected release for NT 5.0 was late 1995/early 1996.

    All hearsay, of course.

    --
  • I'm betting kiam to kittycats that Steve will have an update to the Public Beta and a lot more info on OS X, I expect a release for MacWorld Tokyo in February, or at WWDC in May.
  • I don't think we can judge what C|Net is saying until after Expo starts and Steve goes one way or the other...these early releases are subject to fact-distortion since they're just from "sources" and not from Apple themselves.

  • 1.)Have you ever worked anywhere that required working with colors and shapes? What if those colors and shapes needed to look the same on every monitor in the shop? Well, that new Apple Display Connector should help.

    Do you have any idea what you're talking about? ADC is just an interface combining power, DVI, and USB in one cable/port. Its only purpose is to eliminate cable clutter.

    Macs have been known for color consistency for years because of ColorSync. This has nothing to do with ADC, which is based on a three-year-old IBM technology and was introduced only six months ago with the CP machines.
  • A NON-ERGONOMIC optical mouse with ONE FREAKING BUTTON!

    Make that the most ergonomic mouse around with no buttons. You obviously have this mixed up with the old Apple mouse from a year ago.

    There are no ergonomic 3-button mice, because they all force you to keep three fingers poised over their respective buttons with either your palm pushing the mouse, or else your thumb and pinky clamped onto its sides... very un-ergonomic. The buttonless Apple mouse is a dream to use... especially since the OS does not really require multiple mouse buttons. I like my MS IntelliMouse, particularly the spiffy scroll wheel in the middle, but the new Apple mouse is much more pleasant to use. (For the record, I still use the MS mouse on Win and Linux boxes, and I think MS makes some of the best mice on the market.)

  • Let's not forget that MOSR was the same group that SWORE we were *this close* to having a Mac-branded Palm at MWSF 2000. Grain of salt, dude -- these guys have ZERO reliability.
  • by Anonymous Coward
    Has anyone read the Wired [wired.com] article yet?
  • MacOS X may not be coming to x86 machines, but the core (Darwin) works well according to various places...which means it's not a TOTAL mistruth. But you're right, they won't port to x86 fully because then they lose their profits, and a company with no profits fails.

    Let me get this straight. Apple won't port OS X to x86 because everyone will stop buying Apple hardware and instead run OS X on x86, ergo Apple won't make any money.

    There are two somewhat contradictory implications to that:
    • Mac users think Mac hardware is way overpriced and are just waiting for an opportunity to jump ship to commodity x86 hardware so long as they can run OS X.
    • Apple can't sell enough copies of OS X on x86 to make money. This either means you think that most PC users would rather use Windows or that OS X just isn't going to be good enough.
    The upshot of all this is that Mac OS X isn't compelling, and PC users won't buy in numbers necessary to generate the profits necessary to offset the loss of some Mac hardware sales.

    I'd argue that Apple wouldn't lose any real market share in hardware; the Mac users are pretty much sold on Mac hardware. There is a risk that the performance claims made by Apple would be shown to be largely subjective when people ran the same OS side by side on different hardware. Whether they would gain a lot of x86 users depends on Win32 developers embracing this new platform.
  • 1. Firewire.
    This is probably the best thing I can ever remember them using... It's the best external connection to rival external SCSI I've seen... course if they could just get the cost (for devices) a bit lower, then it would be real sweet...
    FireWire = Digital Video. Hellllooooo Non-Linear Editing! Woo hoo!

    So? The optical mouse may be nice at some things, but I know lots of desktop publishing people & artists that hate that part about the newest Macs.

    I don't think that's entirely fair. As someone who spends a great deal of time over his mouse (NLE work, constantly), I'm a big fan of the new optical mouse. I didn't like the puck, which is where you could be getting confused.

  • Roll on MacOS X, I say. I have a dual-CPU 500MHz G4 with 512MB of RAM, and MacOS 9 makes it run like a dog. Crash protection and multi-threading capabilities are pathetic, and the UI looks very arcane compared to other operating systems.
  • MacOS X was originally due in fall '99. Jobs then changed this when Aqua was introduced, pushing it back a year (IIRC). Then it was delayed again to Jan 2001, and then "early" 2001.

    Anyway, Rhapsody was originally due out in '98, and while MacOS X Server did eventually come out in '99, it really didn't fulfill the promise of Rhapsody: a stable consumer OS. So you could say that MacOS X is really about 3 years late now.

    Also, Rhapsody was originally going to run on x86 machines as well as PPC, which was completely dropped after Apple realized that if it did, no one would buy Apple's overpriced hardware.

    -this is from a long time Apple/Mac user: Apple IIe, Classic, Classic II, 5200, Blue G3, G4/400
  • My wishlist for this next year includes a lithium polymer battery for my 1999-series PowerBook. I would REALLY like to get rid of this 1.something-pound Li battery. I don't really give a shit if I get more time to play games or type things up in AppleWorks, I just want a damned lighter piece of equipment to carry around. Even halving the weight of the battery would be fine by me.

    With that out of the way, I just really want to say yahoo! (in a non-proper sort of way that doesn't infringe on copyrights). I've been waiting oh so long for Apple to release systems with higher clock speeds. Motorola has demonstrated 1GHz processors while Apple is trudging along with their line of 500MHz G4s. The addition of a higher memory bus speed is also a plus. Now if they would only crank out boxes with 4x AGP enabled, they'd be in a great position for games as well as hardware acceleration for Maya and other apps.
  • by crayz (1056)
    Even I, who really don't care much for Aqua, am really looking forward to MacOS X.

    Having a machine that doesn't crash and has real dynamic memory allocation will be heaven for most Mac users. All Apple really needs to do is take out that friggin debug code so the thing doesn't run slow as shit.
  • When Apple first came up with the Mac, there was a reason to use them: the Mac was an all-new type of computer, and it is what led to what Windows is today.

    Unfortunately, from day one, Apple kept its doors shut tight and would not let anyone except Apple in. In addition, Apple targeted its market at a narrow and small segment, such as "graphic artists and desktop publishers." On top of that, Apple marketed itself as a "cool" company that produces "cool" products.

    It is my belief that those events, along with other errors directly caused by Apple, are what made Apple what it is today -- corporations don't take its products seriously enough to do anything with them.

    So while Mac OS X is "cool," that is all it is -- "cool." If there is no business strategy to deliver it to consumers and corporations, it will just be used by Mac fans and no one else.

    Finally, 10 years ago there used to be a reason to buy a Mac: publishing and graphics. Today, you can get those applications on a PC: Photoshop and the rest of the Adobe suite, etc.

    And for those Mac users who keep touting its UI as being easy to use -- would you please stop it and get a life!! Just use Windows, KDE, GNOME, OpenLook, etc. (any other UI) for a few weeks and you will see that the Mac UI is not the magic you think it is.

    So tell me, why do I need a Mac?
  • ethernet, usb, firewire built-in.

    The entry-level iMac for $799 doesn't come with FireWire ports. Only the iMac DV and iMac DV+ have those ports. Those new G4s look pretty cool, though. They come with 10/100/1000Base-T gigabit ethernet cards.

  • In what way would that prevent it from being preemptive? Pre-emptive multitasking is when the scheduler can interrupt (and suspend) a running task to run another task. The Amiga had this. How the scheduler decides which task to run next is beside the point.
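The distinction the parent draws can be sketched in a few lines. This is a toy model only -- no real OS scheduler works this way, and the task names and tick counts are made up -- but it shows why a task that never yields starves everything else under cooperative multitasking and cannot under pre-emption:

```python
# Toy model of cooperative vs. pre-emptive scheduling. A task either
# voluntarily yields the CPU each tick or it doesn't; a cooperative
# scheduler only switches when the running task yields, while a
# pre-emptive scheduler switches on every timer tick regardless.

def run(tasks, ticks, preemptive):
    """tasks maps name -> True if that task voluntarily yields each tick."""
    progress = {name: 0 for name in tasks}
    order = list(tasks)
    current = 0
    for _ in range(ticks):
        name = order[current]
        progress[name] += 1
        if preemptive or tasks[name]:
            current = (current + 1) % len(order)  # switch to the next task
        # otherwise the same task keeps the CPU on the next tick
    return progress

# A stuck modal dialog never yields; the web server task always does.
tasks = {"dialog": False, "webserver": True}

coop = run(tasks, ticks=100, preemptive=False)
pre = run(tasks, ticks=100, preemptive=True)

print(coop)  # {'dialog': 100, 'webserver': 0}
print(pre)   # {'dialog': 50, 'webserver': 50}
```

Under the cooperative model the dialog monopolizes every tick and the web server gets none -- exactly the "an open window will DoS the box" failure described above -- while the pre-emptive run splits the ticks evenly. How the scheduler picks the next task is, as the parent says, beside the point; what matters is that it can take the CPU away at all.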
  • Please. In probably less than two months, all you command line freaks can sit there in the terminal in OS X and do whatever the heck it is you do. Pick up a copy of OS X PB and you can do it today.
  • By the fact that while Mac prices are dropping *almost* to levels I can afford, it seems that Apple no longer offers the option of configuring a custom G4.

    Not long ago you could go into the Store section of their site and choose to beef up a machine with more memory, larger hard drives, better monitors - but now you only get choices for software and peripherals like digital cams. What gives?

    I guess the only way this relates to the topic at hand is that the only time I look at Apple is when something new is about to come out and I can afford the old stuff. I am really in the mood to run a LinuxPPC/MacOS machine.

  • If Steve Jobs doesn't give some serious, concrete info on OS X at MWSF (i.e. ship date, improvements since PB, carbonized apps), he will be killed by the crowd.

    I am not joking.
  • Please point me to where this "giant war" is exactly. I must have missed it.


    --------
  • The 450/500 MHz G4s they put in the dual systems had been out for almost a year, and are probably pretty cheap.

    But these freshly baked 733MHz wonders will be much more expensive at first. And it would add a lot to the price of the system to add an extra processor.
  • by crayz (1056)
    I never said I like Aqua. In fact I don't. It looks pretty, but it really isn't as functional as the OS 9 GUI.

    But still, the prospect of having a stable, modern underpinning to the OS is very appealing.

    I'm hoping that Apple will change Aqua a bit before the final (we should see the results at MWSF) and also that they will release tools to allow 3rd parties to create themes that can drastically change the interface. I will almost certainly not be using the default Aqua scheme (even if they give me an Apple menu and trash the Dock, it's way too bright).
  • You can't compare different CPU architectures (and surrounding hardware) and proclaim which is better just by comparing megahertz. Obviously non-technical people might make such comparisons so Apple would be better off to hide all mention of the processor speed or use some other scale.

    The biggest problem with the Mac is not the megahertz the machine runs at but the perceived speed it takes to do stuff. I have an outrageously specced Mac sitting on my desk, and the UI feels as slow as the Mac I used to use at university nearly a decade ago. The single mouse button and single menu strip are just as painful to use as they were then, and Apple hasn't picked up on any of the UI advances that other operating systems have made in that time. I don't think the MacOS X UI will be much better, but at least it will be a real OS under the hood and much more power-user friendly, with access to shell prompts etc.

  • Chicago was the code name for Windows 4.0, aka Windows 95.

    Some documents [pcc.edu](note its date) claim Cairo was the code name for Windows NT 4.0 (the first release of NT with the Win95 interface).

    Others [microsoft.com] claim that Cairo is/was the code name of NT 5.0 (aka Windows 2000)

    Perhaps Microsoft used the Cairo codename for NT 4, and then reused it for NT 5. That's just a guess, though. However, the claim that NT 5 was due out in late '95 doesn't seem to have any basis in fact. Rather, it seems like the result of a confused combination of the version number of Cairo #2 with the release date of Cairo #1.

  • The technical reason is that they're too busy trying to get OS X ready to take on a project as massive as a processor-family switch again. The jump to PowerPC was a pretty amazing thing, and they pulled it off pretty well. It was also made possible by the fact that the PPC line offered so much more power than the older chips that emulation for backwards compatibility ran at a reasonable pace. I'm not sure that a switch to, say, x86 architecture would provide the power to emulate PPC software and run it acceptably. And that would be a necessity.
  • Motorola has demonstrated 1GHz chips, perhaps, but as for being able to produce any significant number of G4s at a clock speed higher than 500MHz, they haven't been doing so well. It's Motorola's issue here, not directly Apple's. Now, it's completely Apple's fault that they're so entirely dependent on Motorola, and as such so dramatically affected by Motorola's mistakes. But why would you ever think that Apple is deliberately holding back on clock speed? The only reason Macs have been stuck at 500MHz for so long is that there aren't faster G4s in production. Sure, they probably realize the foolishness of the MHz speed race that Intel and AMD are having, but that's what's happening, and if Apple could sell more computers by keeping up better, I'm fairly certain they would.
  • by sheckard (91376) on Wednesday January 03, 2001 @08:05PM (#532421) Homepage
    Face it, speed sells. If the average consumer were to pick between a (top-of-the-line) 733MHz G4 and even a middle-of-the-road 1GHz Athlon, guess which one they're going to pick. Now, don't give me the crap about how Macs aren't for the average consumer or whatever; face it, this is a problem for Apple. It's a shame that they're being held back by Motorola when their Mac OS X is so wonderful. But boy does it need its CPU cycles.
  • umm, read, "low end"

    A "low end" machine is generally the cheapest available: an eMachines box for Intel arch, an iMac for Apple.

    And a "low end" PC generally doesn't have a 17" display -- if so, damn, I'm still using a 15" at home (and I cry when I leave my 19" at work)

  • With all due respect, Robert Morgan is really not a reliable source; his accuracy is very questionable and he is often driven by overly emotional causes. Remember the famed Apple set-top box that he threatened to reveal to the world if Apple didn't release it? The thing was nothing of interest at all. It was not a set-top computer that did all the cool QuickTime stuff he thought it did; it was a limited-run prototype for a video-on-demand (or near-demand) pilot being run by a British cable company. If I recall correctly, the pilot was abandoned before he even started talking about it.

    Frankly, I just can't take his word as absolute confirmation.

    On the other hand, that doesn't mean the story has to be bullshit. But Moto has many reasons not to focus on Apple as a customer: they have many problems of their own in their business, and from what I have been hearing they are suffering from quite a brain drain in their microprocessor department. This alone is enough reason to focus on more profitable areas of their business (including selling PPC chips as signal processors).

    Anyway, take it all with a grain of salt.
  • MOSR is extremely unreliable. Try AppleInsider [appleinsider.com] instead. Plus, you misread the blurb. It says OS X will be released at Mac World Tokyo.
  • by GauteL (29207) on Thursday January 04, 2001 @05:34AM (#532425)
    I'm sorry, but clock rating is not everything when it comes to CPUs. The G4s are very fast clock-for-clock compared to Intel CPUs, and the P4 is the opposite, sacrificing performance per clock for a high clock speed.
    It doesn't mean the P4 is bad compared to the G4; it just means that you can't compare them by looking at the MHz/GHz rating.

    They have taken different routes to high performance, but people seem to automatically assume that higher MHz == higher speed. It is often speculated that _this_ is the reason for Intel's sacrifice on the Pentium 4 (something I find rather believable).
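The point above reduces to simple arithmetic: effective throughput is roughly clock rate times instructions retired per clock (IPC), so a lower-MHz chip with better IPC can come out ahead. The IPC figures below are made-up placeholders chosen only to illustrate the shape of the trade-off, not measurements of any real G4 or P4:

```python
# Back-of-the-envelope throughput: clock (MHz) x instructions per clock.
# Both IPC values are hypothetical -- the point is the shape of the
# trade-off, not the specific chips.

def throughput(clock_mhz, ipc):
    """Rough millions of instructions per second."""
    return clock_mhz * ipc

high_ipc_chip = throughput(733, ipc=1.6)     # lower clock, more work per cycle
high_clock_chip = throughput(1500, ipc=0.7)  # higher clock, less work per cycle

print(high_ipc_chip, high_clock_chip)  # roughly 1172.8 vs 1050.0
```

With these (hypothetical) numbers, the 733MHz part does more work per second than the 1.5GHz part, which is exactly why a raw MHz comparison across architectures tells you very little.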
  • Look at it this way: since the dualies came out, the best-selling G4 tower has been the low-end single-CPU G4 400. That's not good. Since the dualies came out, G4 tower sales have dropped precipitously. That's even worse. Consider also that Apple's big inventory problems all the pundits are flapping about right now are confined almost entirely to the dualie G4s (and low-end Cubes and low-end PBs). Everyone knows the dualies only offer a performance edge for MP-enhanced apps, and there aren't too many of them yet, or much use for that second CPU besides the pure Photoshop box. Until OSX is out, that is.

    Then consider that 733MHz (or whatever) G4s are going to be in mighty short supply for a while, unlike all those 450s and 500s that MOT has been cranking out since forever and Apple could afford to almost give away. Also consider that these new G4 733s (or whatever) have DUAL AltiVec units, so there are already two vector brains on the new single G4s for those Photoshop-vs-Pentia bake-offs Steve Jobs just llloooves to show off to the faithful.

    Whoever said that dualies will be kept on for OSX servers doesn't have a clue. Apple's current OSX Server (v1.2) does not support SMP at all! That's why all the server configs Apple sells today are SINGLE-CPU G4 500s. There will be no multiprocessing servers of any sort from Apple until after OSX is released and the Server bundle is upgraded to v2, or however they repackage/brand it. Real MP boxes will be back then in a big way. REAL MP boxes, like quad 800s (or whatever), that will need much more serious bandwidth on the bus than 133MHz. So look for new MP OSX Server boxes with higher-speed buses (so those extra processors are used efficiently), and a case design to make the guys in the back room drool, all sometime after MacWorld Tokyo (Feb 24), when OSX is expected to ship.

    The chips running on them might also come from a surprising source. What good are multiple vector units in a quad-box server? The rumors are already flying of new high-speed IBM PPC 801s on the horizon that are G4s (in that they are good for MP) only without Motorola's AltiVec parts. Maybe Apple will call them G3.5s...
  • It seems all ya trollers on /. have 1 main problem with macs: They're insignificant and don't matter. Then stop yer damn bitching and talk about something you're interested in, cocklick. People like you make my brain sad.
  • I mean, has Apple given the okay for us to discuss new products? I would hate to talk about an Apple product before it was released. We all know what happened last time someone did that.
  • I think everyone can agree that Apple priced the G4 Cube wrong to begin with. My wife's Mac was finally getting old enough that she wanted a new one last year. She really didn't want an iMac, as she wants a bigger monitor. So our choices from Apple were either a Power Mac G4 or a G4 Cube, and I couldn't justify the extra $300 for less expandability. Now, if the G4 Cube had been $300 less than a Power Mac G4, we would have gotten one. The only thing that's been put in the PCI slots is a second video card for the 2nd monitor. So honestly, she could have lived without the expandability, but not for more money.

    Jobs/Apple can sell some nice plastic, but Apple needs to evaluate the price points of their hardware better. The G4 Cube was better than an iMac and a bit less than the Power Mac G4, so Apple should have priced it between those two products from the start. If they had done that, then I think they would not have had the inventory problems they had with the Cube. Also, they should not try to sell hardware to a niche market inside an existing niche market. Or, if they really wanted to do that, they should have done a better job of forecasting based on this and the initial price they planned.

    Apple's been on the verge of going out of business for the last 20 years and will probably do so for at least the next 10 years. ;-)
  • MacOS X may not be coming to x86 machines, but the core (Darwin) works well according to various places...which means it's not a TOTAL mistruth. But you're right, they won't port to x86 fully because then they lose their profits, and a company with no profits fails.
  • by Kevin T. (25654)
    I am sure that they will have BTO back soon.

    That's right...you ain't seen nothing yet!

    Sorry. I couldn't hold back.

  • Maybe I have my info wrong, but the ADC is a little more than just power, DVI, and USB; the monitors hooked up to it (LCD and the like) actually use the USB port to transmit calibration data, IIRC, though you'll have to scroll down to the bottom of the PDF linked [akamai.net] in order to get an inkling of some of this capability...



  • Apple should just stick a divide-by-two flip-flop on the CPU's clock pin, then jack up the oscillator frequency until they're MHz-competitive with the x86 world. It wouldn't hurt the performance much, and it's no dirtier than some of the tricks that Intel's played over the years (487SX Coprocessor upgrade [pcguide.com], anyone?).
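The arithmetic behind the joke, for anyone keeping score (numbers hypothetical): a divide-by-two flip-flop on the clock pin exactly cancels a doubled oscillator, so the effective speed never changes while the quotable number doubles.

```python
# The trick in numbers: the core sees oscillator / divider. Double the
# oscillator and add a divide-by-two flip-flop, and the effective clock
# is unchanged -- only the marketable oscillator frequency grows.

def effective_mhz(oscillator_mhz, divider):
    return oscillator_mhz / divider

before = effective_mhz(733, divider=1)   # honest 733MHz part
after = effective_mhz(1466, divider=2)   # "1466MHz" on the spec sheet

print(before, after)  # 733.0 733.0 -- same chip, bigger number
```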
  • if there is vocal opposition then surely a majority have heard it?
  • Does OS X even support SMP? Although there is a FreeBSD SMP project, they expect support only by mid-2001, and they even say:

    Due to FreeBSD's history, this is much like trying to fit a square peg into a round hole, and as such, the intermediate results aren't pretty in many ways. We are specifically not attempting to rewrite the kernel from scratch, nor are we on a crusade to fix all the architectural nits currently present in the kernel. In fact, we expect to leave a trail of architectural nits that will still be evident in many ways when FreeBSD 5.0 is released. This is a pragmatic project rather than a theoretical one; we need to have the kernel working and stable in under a year, so time restraints require that we be realistic about what to do when.

  • MacOS has supported multiple processors for years
    Well, sort of... MacOS prior to OS X only partly supported dual processors. The second processor was more like a co-processor, and could only be used by applications that were specially written for it -- like Photoshop. Contrast that with Windows NT, Linux 2.x, and OS X, which can run any program, including the operating system itself, on any available processor -- a much more effective and useful solution.

    Torrey Hoffman (Azog)
  • It's hard to get excited about specs that were met by AMD/Intel over a year ago... Granted, we're comparing apples to oranges, but does the average consumer understand the difference between RISC and CISC? (NOPE)...

    They also don't understand the difference between closed-source and GPL. I guess all those Linux proponents should just go home.

    --

  • computers still have a long long way to go speed-wise. it's as if you're in 1904 saying "why would a car ever need to go faster than 25 miles per hour?"

    besides, people will always be drawn to the faster machine, both by internal competitive drive and by marketing pressure.

    Let's try applying the automotive analogy to that last sentence of yours: "People will always be drawn to the faster car". Er, no actually: People base their car buying decisions on many factors, and speed is pretty far down the list for most people, because any car you'll buy will be more than capable of going as fast as you actually want to go in 99% of situations.

    Sure, cars had a lot of room for increases in speed in 1904, but eventually those increases leveled off. Who's to say that the same thing can't happen to computers? How can you say with confidence that it isn't happening already?

  • Actually, the Athlon is probably going to be pretty OK. What'll kill you is a 1.5GHz P4. Ugly design; go with the AMD for general-purpose performance if you're on x86.
  • Here's the problem, and it's a big problem:

    For nearly 3 decades now, the computer consumer has been accustomed to ever-increasing speeds at stable or declining prices. Anyone remember spending five grand on a 4MHz 8086 with 4 megs of RAM?
    Then, 6 months later, the machine would be obsolete, as a machine twice as fast was out for probably four and a half.
    Maddening. 3 years later, it was compelling to get a new machine, maybe still 5 grand, but we were talking about significant gains: 66MHz.

    The problem with Apple is, nobody's buying new machines. I'm not buying a new machine: my beige G3 at 300MHz, with 192 megs of RAM on a 66MHz bus, could stand to be faster and more responsive, but I'm not willing to blow $3500 on a machine that's barely twice as fast. I spent $1500 on this G3 two years ago. Twice as fast for twice the money? After 2 years? Blow me.

    I would pay that kind of money for a dual 600 with a 200MHz bus. But this 133MHz bus ride is bullcrap. Apple's hardware technology is behind the curve. Don't tell me I don't need a faster machine. When it comes down to it, I don't need ANY machine. I need food, air, and shelter. What I WANT is a machine that's faster. One that can run the latest bloated eye-candy at least as quickly as the 2-year-old machine ran its OS.

    Apple has to either significantly lower its prices or improve its hardware advances. That's all.

    Personally, I think this announcement has only one purpose: to generate sales of the older discount hardware to fix Apple's inventory problems. Frankly, the older discounted machines are far more attractive than the vapor they're announcing today -- and I believe that's by design. As soon as the inventory of the older machines is eliminated, Apple will announce upgraded models (this is EXACTLY the Yikes plan, rehashed), with 200MHz buses, perhaps faster CPUs, perhaps not, but they'll stress MP more than single CPUs. My guess is that Apple would really rather sell single-processor machines, as the profit margin is higher -- but in order to appeal with single-processor machines they need higher MHz-age.
  • Seriously, they haven't even mentioned or hinted at the so "Apple" prices these things are going to cost. Will OSX even be loaded on these machines?

    Of course not. Mac OS X isn't ready to ship yet. Did you see the public beta? The user interface was a disaster. Hopefully they've fixed the design flaws, but there's still some debugging and polishing left to do. When they do release it, it needs to be perfect.

    --

  • LinuxPPC AltiVec-enabled GCC?? O_O

    That just woke me up _real_ fast. (Wish I had a G4 instead of a G3, too.) The thing is, it's Linux -- it doesn't have to be just a distribution; you can maintain things yourself. The important thing is the compiler, because if you are a good little Linux user and know how to compile all your stuff with ./configure, make, make install (or whatever the RTFDirections says), you get all the software set up for your processor -- given certain conditions.

    AltiVec can be used for block moves and for a wide variety of big-data-handling operations. It can be _general_ _purpose_. Does this GCC simply allow software to be written using AltiVec (as if it were some sort of very specialised MMX), or does it dynamically take advantage of the 128-bit registers wherever possible? Whether or not it _does_, it _could_ in future do that: particularly if the C libs are written to be AltiVec-optimised where possible (again, such as using the registers to move large chunks of data).

    Very cool; can't wait for it to become more generally useful. I sort of doubt that all of GCC can make use of AltiVec (in the way that QuickTime and QuickDraw were rewritten to make use of it, and that OS X's rendering layer does), but it's just a matter of time, because we _are_ talking about a current-generation, powerful, consumer-level architecture with special characteristics. Linux has a way of adapting itself to these. Eventually, not only will PPC look like a very sensible choice for Linux deployment, but Linux will look like a very sensible option for Mac alternate-OS choice.
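The "block move" idea mentioned above can be sketched in pure Python. This is a conceptual model only -- real AltiVec moves each 16-byte (128-bit) chunk in a single instruction, which no interpreter can imitate -- but it shows the transformation a vectorizing compiler or AltiVec-aware memcpy applies:

```python
# Model of scalar vs. vector block copy: the scalar loop touches one byte
# per step, while the "vector" loop moves 16 bytes (one 128-bit AltiVec
# register's worth) per step, with a scalar tail for any leftover bytes.

def scalar_copy(src):
    dst = bytearray(len(src))
    for i in range(len(src)):          # one byte per "instruction"
        dst[i] = src[i]
    return bytes(dst)

def vector_copy(src, vec=16):
    dst = bytearray(len(src))
    i = 0
    while i + vec <= len(src):         # one register-width chunk per step
        dst[i:i + vec] = src[i:i + vec]
        i += vec
    dst[i:] = src[i:]                  # scalar tail for the leftover bytes
    return bytes(dst)

data = bytes(range(256)) * 10 + b"tail"   # deliberately not a multiple of 16
assert scalar_copy(data) == vector_copy(data) == data
```

The vector version does roughly 1/16th the loop iterations for the same result; that same chunking, done in hardware on the 128-bit registers, is what makes AltiVec useful even for "general purpose" work like large memory moves.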

  • by Rombuu (22914) on Wednesday January 03, 2001 @08:12PM (#532495)
    It's a shame that they're being held back by Motorola when their Mac OS X is so wonderful.

    Apple, as usual, is being held back by Apple. They've switched processor families before, and there is no technical reason they couldn't again. For some reason, everyone else in the world knows that selling PC hardware is a low-margin game and that Apple's forte is its OS and some of its applications, but they keep stumbling around trying to convince themselves that making cool-looking boxes is going to recapture their past and short-lived glory years.
  • Dunno where you get _that_. I've seen a G4 just once: it was in the house of a guy doing prepress work. Nothing was launched, then he doubleclicked on a Quark document and BAM it was there. Maybe half a second or less to launch Quark off the hard disk and open and display the file. Are you perhaps using Microsoft applications? Microsoft has been known to put in delay timing loops to make sure the Mac versions aren't quicker than the Windows versions.
    • Although there is a FreeBSD smp project, they expect support only by mid-2001...
    Um, NO... FreeBSD has supported SMP since version 3.0-STABLE [freebsd.org]. Since 4.0, the SMP code has been quite good. What you're quoting is a blurb about incorporating BSDi's SMP code, which is great, and won't be done until 5.0.

    OS X [apple.com] supports SMP fully [apple.com], as it's based on NextStep [xappeal.org]. (OS X is nothing more than NextStep with its out of date userland programs updated with FreeBSD's [ispworld.com].)

    Do some research [salon.com] next time.
  • Read it. It was quite stupid, and ill-researched:

    There are even calls for the return of Steve Wozniak, Apple's vice-president of research and development from 1976-1985, a time when Macs held a strong position in the marketplace.

    Does the writer even know that the Mac wasn't introduced until 1984? Or that Woz had nothing to do with the Mac?

  • I'm not saying Apple is deliberately holding back high-clocked chips or anything, merely expressing genuine joy that they are releasing higher-clocked models. I wasn't meaning the 1GHz demonstrations to be something Apple needs in their products, just saying the technology is around for them to hit marks higher than 500MHz.
  • Yes, OS X does support SMP,

    But I believe it will only be supported for apps written to the BSD or Cocoa subsystems. I may be wrong about Carbon, but I think Carbon apps will be funneled to one CPU, and I'm pretty certain Classic apps (the majority) will be single-CPU only.

    So, not only do we have to wait for OS X to come out, but we have to wait for the major vendors to release native ports of their apps. If I'm right about Carbon, that will be quite a while. I don't think Adobe, for one, has ANY plans to rewrite Photoshop in Cocoa (although Apple could make that somewhat attractive by resurrecting the OpenStep for Windows thingie - then Adobe could port to Cocoa, and recompiled binaries would run on OS X and NT, and rumor has/had it that there was an OpenStep runtime for SPARC/Solaris as well - ah, fantasyland. . . )
  • The Australian National University Considered using the G4 in the Beowulf cluster, but decided against it. Can't remember why.
  • by jafac (1449)
    If SJ is truly paying attention to the power users now, "more stuff, less fluff" then why in hell did they do that stupid fucking punk-ass dock?
  • True, Carbon threads actually live within one BSD process (AFAICT) and so a Carbon app sticks to one processor... but running multiple Carbon apps will be spread across both processors. Classic is likewise only one process, but running two Classics would use both processors.
  • you may say "yahoo!"

    You cannot say (capital "Y") "Yahoo!".

    consider yourself warned.

    -Yahoo! corp. legal copyright enforcement team.
  • by cowscows (103644) on Wednesday January 03, 2001 @08:16PM (#532519) Journal
    Maybe with all the negative press that Intel has been getting over its P4, with the empty clock speed, there will be at least a little more consumer awareness of the fact that clock speed is just one of many numbers determining computer speed. Apple certainly is in a sucky spot with this whole Motorola thing. I wonder if/when the computer world is going to end up more like the car world, where most any machine you buy will have plenty of power/speed, and other things can become a deciding factor in purchases. Apple would certainly like it that way.
  • by hawk (1151) <hawk@eyry.org> on Thursday January 04, 2001 @07:16AM (#532521) Journal
    err, not quite.

    Macos had 24 bit addressing from the start, although I think the early systems or hardware decoded anything with the high bit high as the roms (but it's been a while, and my little brother has my copies of inside mac).

    At System 6.0.something (I don't think it was .0), Apple started going "32 bit clean"

    This comes from the nature of the early 68xxx processors. The original design had a 16-bit data path, a 16-bit ALU (wait, it was 32, wasn't it? It could do 32-bit operations, but did it do that by using the same ALU on each half? It's been too long . . .), and 32-bit registers (usable as high and low 16-bit registers). Motorola clearly labeled which registers/paths/what have you would grow to 32 bits in future expansion.

    Given that a 32 bit register was addressing a 24 bit address space (there were only 24 pins for addresses; this was still DIP packaging for the processor), it left 8 bits which were tempting to use.

    Apple told developers not to use those bits, as they were reserved. Programs that followed the directive were generally executable on later machines, while those that weren't needed to be rewritten. The two biggest violators, in order? Apple and Microsoft . . .

    Sometime around the IIx and SE/30, the ROMs became "32 bit clean" and other software was similarly designated. Such machines could generally (but not always, IIRC) go past 16M of memory. ROMs could be retrofitted to some models to allow such software.

    I want to say that it was System 7 that required 32-bit-clean ROMs, but it's been a while, and I'm not certain. There were certainly significant differences between Systems 1-6 and 7, but it really wasn't a 16/32 transition. The original 68k was a 16-bit chip in the same sense that the 8088 was an 8-bit chip - data path, and not much more. For most intents and purposes, the MacOS was a 32-bit OS with a bit of 24-bit crippling from the start.

    hawk, dusting off old memory cells.
  • No. I've had these machines, and it's different.

    The monitor's base had a bunch of connectors (ADB, sound out, and mic in). But they did not connect using a single cable, rather a bunch of them.

    My PMac 8600AV/200 also has that setup, onto my Apple MultiSync 17" display. The whole thing requires a thread of cables on the back of the monitor, which is totally different from the ADC connector, and ultimately, from the NeXT cable.

    Besides, the NeXT cable also predates the AV systems.

    Karma karma karma karma karmeleon: it comes and goes, it comes and goes.
  • >Apple reserved the first 8 MB of memory address space for Rom, I/O
    >stuff, leaving the top 8 MB for programs and the MacOS to run in.

    Ahh. I had them backwards :)

    >It's interesting that Apple had the foresight at the time (1982?) to
    >reserve the bottom of memory for what they thought they needed for
    >hardware address space, leaving the sky the limit for adding memory
    >above the 16MB barrier when Motorolla overcame that limitation of
    >their processors.

    It's not so much foresight, I think, as failing to do something
    extremely stupid :) As I recall, there's nothing special about
    any of the addresses, so they can all be put anywhere you
    want at boot time.

    Remember the Switcher (pre-multifinder)? On a 512k or 1M machine,
    you had multiple programs loaded by having multiple copies of
    the system loaded at varying addresses (only one of which could
    be at the "normal" load space)

    >This is in stark contrast with Intel/IBM/MS that decided to reserve
    >memory at 640 MB of memory in the x86, setting an ultimate upper limit
    >to never be overcome in real mode.

    That's not quite how it happened, though. IBM only claimed 256kb
    of address space, anyway. We quickly figured out that 512kb was
    workable, and it seems to me that there was a year or two before
    someone figured out you could add another 128.

    There wasn't really anything hardwired to that space, although the
    color and monochrome cards had fixed addresses. These should have
    been movable, except that the bios drivers were *so* slow and poor
    that everyone had to write to the hardware (If memory serves,
    keeping up with a 1200 baud serial port was beyond the bios's
    ability, but it may have been a faster [but still slow] speed
    where it couldn't hack it.)

    Some early mac programs did the same direct to hardware thing, but
    a) these got broken hard early on by competitors that didn't, and
    b) the toolbox was well enough done that it generally gave better
    performance than custom code anyway.

    >Trying to install NetBSD on old 68K based Macs helps you sort all of
    >this stuff out. :)

    Trying? MacBSD on a IIci was my primary machine for a few months--which
    is when the serious 1-bit display problems on LyX went away (no, I
    didn't fix them; I just kept reporting what I couldn't see . . .). However,
    the limited display size soon had me using primarily the Linux
    box at its side, as I could drive the 17" display at 1024x768 . . .

    hawk
  • That sounds right.

    /me brushes more dust off brain

    wait a minute, wasn't that a third party utility that let you do that? and eventually apple bought it and included it?

    I never really followed it that much, because my 030 Macs were all 32-bit clean, while it just didn't matter on my 68k models . . .

    hawk
  • I, like an idiot, suggested to my dad to get a G4, since he did a lot of digital photography. Normally, I am a Win2k/Linux advocate, but I "thought," from all that I heard, perhaps Apple had a better product for what my pop wanted to use it for.

    Big, big, big mistake. I feel like a complete ass. My father has had nothing but complete trouble with the piece of crap. The mouse locks up every hour... no, the whole damn machine locks up every hour. The SCSI card already had to be replaced, and same thing with the HD... at least that is what CompUSA's shitty support said and did.

    Of course, it still locks up every bloody hour or so for no particular reason. My father has tried and tried and tried and tried to get Apple support and sales to pay to have a complete diagnostic run on it. (NOPE, they said "HE" would have to pay the 100+ bucks for CompUSA to run this diagnostic crap on the motherboard, and only after that would they consider replacing the motherboard.)

    He also tried to get them to replace the whole machine... again, the only thing they would offer is the damn diagnostic test, which he would have to pay for.

    And now here is the kicker: although there is a 90-day return window, because he took it into CompUSA to get the SCSI card replaced (took 2 weeks), then back again to get the HD replaced (took 4 fucking weeks), he was pushed beyond that 90-day window... so now he cannot even get his money back, and Apple will not... NO, they REFUSE to remedy the situation.

    To give a comparison, when my dad's 1 1/2 year old Dell laptop went kaput, they [Dell] flew in a technician to replace the motherboard, no questions asked. Now that is unbelievable customer service. Something Apple severely lacks.

    We are still trying to get Apple to do something, but every time we call and try to move up the management ladder we always get "they will call you back," which they never EVER do. So frustrating.

    I feel so bad recommending this to my father, who pretty much has a $5,000 paperweight on his desk. I will never ever recommend Apple again after this fiasco. If anybody has any pull at Apple, please let me know. I would love to bring some closure to this.
  • Strange that you should choose that example . . .

    I think it was the 1903 Sears catalog that offered a car capable of all speeds from 0 to 25, noting in the ad that they didn't think the average man had any use for going 45 or 50 as more expensive cars did . . .

    While I'm at it, in law school we read a case about "reckless entrustment," in which the owner of the car was being sued for lending it to the driver when he should have known better. Part of the claim was that the driver had a reputation for "driving as fast as 50 miles per hour" . . .
  • This has nothing to do with ADC, which is based on a 3 year old IBM technology and was introduced only 6 months ago

    Close, but not entirely true.

    Actually, ADC is simply Apple's use of prior "technology" (as much as cables can be considered technology) borrowed from NeXT Computer, which we all know has been absorbed by Apple (and Apple by Steve, but that's another story).

    My 040 color slab (aka "NeXTstation Color") has that kind of cable (different pinouts etc., but the end result is the same) that goes from the machine to the sound box (external speaker) where the keyboard, monitor, etc. are connected.

    If I had a NeXT Mono monitor (the cool-looking one), then that cable would connect to the monitor, and the keyboard, sound box, etc. would connect to the monitor, like the current ADC connector.

    My black 040 NeXT Cube at home also has the same kind of connector, but for my color (Fimi) monitor to work, it has to be connected to the NeXT Dimension board. So, one cable goes to my monitor, the other to the sound box where the keyboard is connected.

    Get black hardware info at this address [channelu.com].

    Karma karma karma karma karmeleon: it comes and goes, it comes and goes.
  • by ericdano (113424) on Wednesday January 03, 2001 @08:23PM (#532547) Homepage
    What a STUPID proposal. Dropping Dual G4s! In the face of Mac OS X being around the corner, you'd think they'd WANT to show off the fact that a dual G4 running OS X would kick ass......but no!

    Being an owner of a couple of Macs, including a 9600 (old multiprocessor 604 computer), and a PC owner (1 dual Pentium 166, 1 dual Pentium Pro, 2 dual Pentium II 333s, a single-processor Athlon, and a partridge in a pear tree ;-) ), I'd say that my experience with multiprocessor computers is very favorable. Running Linux/FreeBSD or Windows 2000/NT, it really makes the machine more usable. For example, if I encode an MP3 on my single-processor computer, it will chew up all the processor time and make other running programs deadly slow (on my Windows 2000 machine), but the dual-processor machine (Windows 2000 or FreeBSD/Linux) can easily encode an MP3 while only chewing up 50% of its resources.

    I think Apple jumped the gun with dual G4s, but NOW IS NOT THE TIME to stop making them. OS X will take advantage of the extra CPU and make the thing fly!
    --

  • Somewhere, someone speculated that Jobs might announce a G3 Cube (can't recall where I read this).

    I think it has some potential. Granted, G4 Cube sales have been a disappointment. But iMac sales are starting to drop off. High-end iMac DV sales apparently did pretty well, because there is little inventory left on these. Given that the high-end iMac DV SE sells for $1500, maybe a G3 Cube would be a good product to replace the high-end iMac.

    How about a bundle: G3 Cube + RAGE 128 + 15 inch flat screen? By bundling the screen with the G3 Cube, Apple might be able to sell the whole package for under $2000. Consider that Compaq and Acer are marketing flat-screen PC bundles for about that price. Such a product would address one complaint about the iMac, its all-in-one design.

    There are reasons why Apple might not do this. For one, it might hurt sales of the G4 Cube. But my sense is that anyone who might stretch a bit to reach $2K for a G3 Cube would not go for the G4 Cube anyway. Since G4 sales are poor, it does not appear that the cachet of the trendy design is really moving the product anyway. So, why not market the design to another segment to try to recoup the investment?

  • > I had pre-emptive multitasking on my Amiga, 12 years ago.


    No you didn't. The Amiga used fixed priority scheduling.

    There is no incompatibility or inconsistency between pre-emptive multitasking and fixed-priority scheduling.

    In fact, the fixed-priority scheduling is what made (and still makes) the Amiga such a dream to work on, compared to most other platforms. The computer can be doing 20 different things, but as long as you have the priorities set right, the task that you're working with runs at 100% full speed. I wish OS/2 or NT or Unix could do that. I hate so-called "modern" schedulers.


    ---
  • Mode32 [macinsearch.com] is the extension yer looking for.
  • I have a Mac with 2 CPUs sitting on my desk here and I've yet to see the evidence that MacOS 9 even remotely puts that power to use. Apps still crawl when they're not in the foreground and it's all too easy to lock the whole machine when an errant app crashes. The user interface looks little different from the one used a decade ago and feels remarkably clumsy compared to KDE or W2K.

    MacOS X may be a different story, but until that appears, the Mac is stuck with an arcane OS and a pretty but stuck-in-time user interface. Neither of these things would make me compare the Mac to a Ferrari, except for the exorbitant price markup both logos entail.

  • Does OS X even support SMP?

    Yes. Here's output from our iMac running the OS X beta:

    % hostinfo

    Mach kernel version:
    Darwin Kernel Version 1.2:
    Wed Aug 30 23:32:53 PDT 2000;
    root:xnu/xnu-103.obj~1/RELEASE_PPC

    Kernel configured for up to 2 processors.
    1 processor is physically available.
    Processor type: ppc750 (PowerPC 750)
    Processor active: 0
    Primary memory available: 192.00 megabytes.
    Default processor set: 67 tasks, 131 threads, 1 processors
    Load average: 1.03, Mach factor: 0.49

    It says "up to 2 processors" but as far as I know there's no reason why it couldn't do 4 or more, and I expect it will when Apple releases quad or higher systems.

  • It's hard to get excited about specs that were met by AMD/Intel over a year ago... Granted, we're comparing apples to oranges, but does the average consumer understand the difference between RISC and CISC? (NOPE.)

    --

  • I submitted this in Oct. but was DENIED. hehe. No animosity.

    Motorola has hit 1GHz with the G4 processor. Here's the story from CNET [cnet.com]

    I'm sure Apple's pricing might scare people away from a G4 too, unless they sell a kid :/


    aztek: the ultimate man
  • by RealTypeR (75674) on Wednesday January 03, 2001 @08:41PM (#532586) Homepage
    "First of all, Apple is falling farther and farther behind on the performance race. "

    Have you compared the speeds of, say, a G4/500 dual-processor system and one using a high-end AMD or Intel chip? The systems are very comparable. The Mac will easily hold its own, and in certain tasks, like Photoshop, it is much, much faster. They are not "falling farther and farther behind."

    "Second, software: I'm sure I won't have too much trouble convincing the die-hard command line users that MacOS is inefficient and hard to use, but even in terms of GUI, the once-proud Apple has been overtaken by BeOS and Windows ME, and has GNOME and KDE hot on its heels. Much like hardware, Apple is handicapped by its users' insistence that changes be minor and easy to adapt to. "

    MacOS is inefficient? Hard to use? I believe most people will acknowledge that MacOS is one of the easiest OSes to use. It is criticized sometimes for not being "sophisticated" enough for the power user. This does not make it inefficient. Though it lacks features like protected memory, etc., it is a very efficient OS, in the sense that Mac users are very, very productive. Ask a graphic artist or desktop publisher. The Mac OS is not hard to use, nor is it inefficient. Compared to Windows ME and the various Linux GUIs available, the average new computer user will find the Mac OS the easiest to use.

    You also comment on Apple's lack of "innovation." Let's see, I'll name a few. These are not necessarily all Apple inventions, but Apple was the first to actually bring these to the masses:
    1. FireWire.
    2. USB as the main I/O interface.
    3. Getting rid of legacy ports.
    4. iMovie - video editing for the masses.
    5. iMac - an easy-to-set-up, all-in-one unit that appeals to the "average joe" who doesn't always care about technical specs.
    6. Optical mouse standard on all systems.
    7. OS X.
    8. Innovative industrial design.
    9. ColorSync technology.
  • by DLG (14172) on Wednesday January 03, 2001 @08:47PM (#532590)
    Been watching these boxes for a while, and I think there are a few things to note.

    1. The dual processors... Apple can go back to dual processors again when OS X is on them mainstream. Right now, with 9.0.4, multiprocessing is barely useful for most users (Photoshop users being the perennial exception). Meanwhile, a 733MHz G4 on a 133MHz bus is pretty big news, since what it will do is make everything faster in the short term.

    2. MacOS X is not gonna be truly ready until September (a year late, but hey, Win95 was supposed to come in '93, and we know NT 5 was supposed to come out in '95 :)). At that point I hope to see dual 733s on a 133MHz bus. What will the Win world have? WinME running on Pentium IIIs?

    3. It would be great if MacOS ran on more boxes than just Apple's, but they didn't do so well with that. Asking them to move to cheap commodity hardware is not really rational. The real deal here is that folks don't recognize the true cost of ownership with computers until they have owned a few. The real shame is that Apple HAS reduced costs by using crappier equipment, and it bit them.
    4. The biggest problem Apple had was that no one wants to buy a new machine until OS X comes out. Apple was ready with a whole new set of boxes that would have looked really perty with the perty new OS, but instead they are running the same old OS 9. If Apple really wanted to get new models sold and empty its inventory, finish the OS in the 1Q...

    I am a longtime Apple user and Linux user, and I hope to use both for a long time to come. As long as Apple makes machines that last me 5+ years, I am not gonna bitch much. Since I am still using a 7600 with a G3 upgrade card, I am definitely waiting. I like the idea of a dual-processor 733MHz, but in truth there is a sweet spot right now with the dual 450... 1999... No matter what anyone says about comparing $300 PCs with this, the G4 is a better chip than anything Intel makes. Athlon might manage to screw that up if they keep raising the MHz, but sheerly for media-related stuff, the G4 rocks. Just rip a few CDs...

    dlg
  • If you read the article, they point out the issue that these faster chips may not be available for a while....

    On the other hand, if Mot really can cough up a 733 G4, I would much rather be running Photoshop on that than on a 1GHz Athlon (or After Effects, or ...)

    The real downside to the story is the comment about how most of the systems are likely to be single-processor. This is going in the wrong direction. A lot of potential buyers are going to be quite disappointed. Frankly, I was hoping for a base single-processor system, a mid-range dual-processor, and a high-end quad-processor system. If you've had to sit for an hour while AE renders 3 freaking seconds of footage, you'll know why I was hoping for quad-processor towers....

    But for what most of the Hertz whiners out there do with their systems, no, quad processors won't quadruple the frame rate of Doom.

  • Aren't G4 RISC-type chips of some sort?

    PowerPC is a RISC architecture. Same family lineage as IBM's POWER chips.

    - Scott
    ------
    Scott Stevenson
    MacOS X is not gonna be truly ready until September

    What in the world does this mean? I use OS X every day as my primary OS. Except for incomplete 24-bit color support, it works great. Since I started using it in September, the OS has never crashed on me (though Classic can get a bit unruly at times).

    - Scott

    ------
    Scott Stevenson
  • It's about damn time.

    As for Apple (or more specifically Motorola) lagging behind AMD and Intel in terms of speed: this will keep most current Mac users with the platform, but Apple is going to need Mot to kick out 1GHz chips real soon.
  • they've sued the few websites that support them

    This is garbage. Most rumor sites publish rumors for personal gain -- whether it be for fame or money. They are taking advantage of 6-12 months worth of hard work on the part of Apple and blowing it all in one day. I don't see how this is "supporting" Apple. It's not as if Apple is going to sell more boxes because of the rumor sites.

    - Scott
    ------
    Scott Stevenson
  • All i have to say is i better be able to make my toast in this one.
  • Great, GCC with Altivec code. When I was running LinuxPPC 2000 on a G4, I was told that Linux couldn't use the Altivec unit because the kernel didn't understand how to save and restore the registers properly. Is this fixed, if so in what kernel revision, and if not, how does an Altivec-aware GCC help Linux?
  • Where the fuck are you getting this? MacOS has supported multiple processors for years. You used to be able to get 9600 MPs that had dual 604e processors. That was back in the days of OS 7.x.

    Besides that, Apple sells its hardware at a much higher margin than PC manufacturers like Dell or Gateway. They have much higher volume production contracts than most PC makers, as well as exclusive deals with people like Motorola. You don't see G4 (MPC7400) chips or motherboards in anything else, do you? Since the chips don't go through any intermediaries before they get to Apple's assembly facilities, they get them at about cost. I don't know if you know, but Athlons and Pentium chips cost only a fraction of their retail (or wholesale, for that matter) price coming out of the factory. Prices like this vastly decrease the production cost for Macs, and that money goes into Apple's coffers. PC manufacturers are often forced to buy their hardware at wholesale prices, which greatly reduces the profit from selling hardware.

    Oh yeah, Unix is a 31-year-old idea, DOS almost as old; MacOS is a baby compared to the two. No OS is perfect; Unix still has lots of areas where it could use some work.
  • What is really significant about these new machines is the faster bus speed. While PCs have been humming along with 133MHz+ buses, the G4s have been hindered by (100MHz?) buses. But even more so, the dual G4s have been hindered. Apple has shown that plunking two G4s into a box instead of one is easy, so future machines (spring? summer?) may even feature two 733MHz (1GHz?) G4s in them.

    Maybe the tortoise is catching the rabbit?
  • Good point. The only thing that makes me sicker than the Megahertz race in PCs is the Megapixel race in DCs. Yes, our camera has 2 megapixels. All the images are recorded as 4 by 500,000 JPEGs with a strong skew towards pink. :)

    Anyway, this is not unexpected news from Apple. Many expected that the price cuts on older models were signs of newer stuff coming out. It doesn't sound like anything revolutionary here; just improvements on existing designs. On one hand, it's good for them to be cautious after the Cube debacle. On the other, it won't rejuvenate them like the iMac did. With the still somewhat cloudy PC market, it's hard to fault them for being conservative.

  • Quad processors won't necessarily increase the speed of AE renders either. Adobe really needs to G4-optimize their products, more than just cranking out a few AltiVec-ready filters. Building a line of SMP boxes is the wrong move for Apple at this point. With a bunch of single-processor boxes (merely with more RAM and a faster CPU) you can just reuse the same motherboards, which means you can buy them en masse and not be at a loss. With a small number of multiprocessor boxes you aren't moving the mobos out in volume, which means you can't order large numbers of them. This drives up a cost that smart companies won't pass onto their customers. This leaves Apple with sagging profits. So they are deep-sixing their multiprocessor systems except in server models, which don't sell heavily anyways.
  • I don't play computer games, so I might be out of the loop on this one, but why does speed matter so much? What can a 1GHz computer do for the average user that a 500MHz computer can't? I'm sure there are lots of answers like "faster SETI," "faster compiles," "faster ray-tracing," "Quake @ 1600x1200," etc. But most people don't use their computer for that.

    Most people just want to interact with the internet and create different kinds of documents. Their "power app" is playing DVDs. The only thing that keeps their CPU below 99.5% idle is "Clippy" dancing at the bottom of the screen.

    What people really need, more than CPU power, is good, easy-to-use software. Apple tries to provide that. If you don't believe me about the importance of good software, look at the success of the Palm Pilot vs. the failure of Windows CE.

  • Still deciding what to get; the iBook, all cute and cuddly like, the PowerBook, all serious and stoic, or the G4 Cube, suave and classy.

    In terms of performance, PCs seem to be fast enough that faster just doesn't matter. Why would I need a 1.5GHz system? I'm running on a 500MHz system, and plan to be running it for another few years yet. Heck, even 800MHz would seem to last for at least 5 years, given my track record with my last computer.

    Still, I'll probably think a 500MHz Apple sucks, right? I dunno, I don't have enough experience with the G3/G4 to say; do they age particularly better than an x86?

    On the other hand, I am enamored with Apple's drive for innovation.

    The USB IO adoption
    The Firewire IO adoption
    The use of Airport and wireless networking
    Mac OSX (in the near future), and Unix stability, without the ugliness of Linux!

    Well, Linux isn't quite ugly, it's damn functional, but sorta a pain to set up. Win2k is such a breeze to use.

    Then there's the quiet fanless iMacs and G4 cubes.
    There's the firewireness of the iBooks and Powerbooks.
    Optical Mice. Everywhere
    *Really* nice LCD screens.

    Other hardware coolness I'm looking forward to; More snazzy designs!
    A Newton2!
    Wireless PCs; at least, as much as possible...
    OS X!
    Pervasive computing!
    Inclusion of mic and USB cam with *all* computers!
    Instant Messaging type usability in the OS

    Other random cool stuff...
    Still, they aren't dead yet, and they're still doing okay...

    Maybe I'll regret writing this post in a few months, when I have my Apple. I'll post and let everyone know!

    Geek dating! [bunnyhop.com]
  • This is extremely depressing to me. I've greatly enjoyed the fact that the 500MHz PowerBook I bought about a year ago is still the fastest clock speed you can buy in a Macintosh. None of that silly next-door neighbor buying the newest, faster chip every two weeks for me. Way to make your computers appear to become obsolete a little less quickly, Apple!
  • by Cinematique (167333) on Wednesday January 03, 2001 @09:28PM (#532634)
    Why is it that every time a /. article mentions Apple Computer, a giant war starts as to which is more powerful, a PC or a Mac? False facts fly, like "Apple's lowest priced computer is still over $1000," when in fact they sell an iMac for $799.

    Obviously I'm going to be taking a little shit for the fact that my email is from mac.com... so I must *clearly* be Apple biased :p BAH. My very first computer was a 286 laptop, followed by a 386 desktop, and a Pentium 120. It wasn't until I left for college that I got my own Mac. Why? Because it fits my computing needs and desires.

    Now you are probably wondering... "Gee, that's great, get to the fucking point." My point is that regardless of what you like, what you know, and whom you support, a little research is clearly in order. I'm really growing tired of watching people spew misinformed posts onto the boards and positioning them as fact.

    funkdat.

  • 1.) Have you ever worked anywhere that required working with colors and shapes? What if those colors and shapes needed to look the same on every monitor in the shop? Well, that new Apple Display Connector should help.

    Yeah, you can buy those too. If you really need that kind of accuracy, you probably bought a monitor/video card that already includes the feature (GO SGI FLAT PANEL, WOOO!). Anyway, ensuring color consistency across platforms was solved a while ago.

    2.) Want to add hardware? While you'll have fewer options than a Wintel user, your purchase is almost guaranteed not to conflict with any common configuration. And when you want to put it in, you open the door (no screws).

    You know, most PCs have screwless maintenance too (well, for the case, anyway. I like Sun's hard drive carriers. In/out, in/out... WWWEEEEEEEE!!!). It's all in the case design, and if you don't like it, you can always get a different case (same with an Apple machine too, I guess).

    Also, I have had very few problems with hardware conflicts, especially nowadays. So much in Windows is handled almost transparently by the OS now. While it is still slightly buggier than the Mac version, it also has to deal with more hardware. It's not as bad as you make it sound.

    3.) Your purchase will last. I own a Power Mac 8600. I do all kinds of demanding work on it. To be fair, video is not among them. But guess what? It's still really fast. Sure, I notice the difference during some Photoshop filters and during sound-file manipulations, but my machine was bought after the G3 came out. Let's see how those Celeron boxes are doing in 4 years.

    I believe the speed increase on both platforms has been very similar; what you argue is just a point of view. I used the same computer for about 5 years until recently (Windows, not a single reformat), doing basic 3D animation/modeling. If you could do it five years ago, you can still do it today on the same machine. Some people don't seem to understand that at all.

    4.) .DLL? What's that?

    Dynamically Linked Library. Though I have never programmed on the MacOS, I'm pretty sure you have something similar. Anyhow, I don't really see the point of your argument. If there is a problem with DLLs, it is simply a bug in the program (or, in some cases, the DLL), not in the concept of DLLs.
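    For what it's worth, the concept is the same on any platform: a shared library gets loaded at runtime and symbols are resolved out of it. A minimal modern sketch using Python's ctypes (the `libm` fallback name is an assumption about the host system, not anything from this thread):

    ```python
    import ctypes
    import ctypes.util

    # Find and load the C math library at runtime -- the moral
    # equivalent of a Windows program pulling in a .DLL, or classic
    # MacOS loading a shared library.
    libm_name = ctypes.util.find_library("m") or "libm.so.6"
    libm = ctypes.CDLL(libm_name)

    # Resolve the 'cos' symbol and declare its C signature.
    libm.cos.restype = ctypes.c_double
    libm.cos.argtypes = [ctypes.c_double]

    print(libm.cos(0.0))  # 1.0 -- same function, but linked at runtime
    ```

    A bug here (say, a wrong argtypes declaration) would be a bug in the calling code, not in dynamic linking itself, which is the point the comment above is making.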
  • Well, I stand corrected about the motherboards. Although the point is still valid concerning the extra cost of a dual system: they're sticking in the daughtercard and a second processor for the same price as the single-processor systems cost when they first came out.
  • by BWJones (18351) on Wednesday January 03, 2001 @10:10PM (#532637) Homepage Journal
    What is with people equating speed with clock cycles? There is more than clock speed at work here, folks (as the latest Pentium 4 debacle demonstrates). I am sure the enlightened ones here will agree with me when I say that there is more than one way to get performance out of a chip, just as there is more than one way to get a car to go fast.

    If you assume that more MHz means a faster chip, then you might be misled into saying that the 400 MHz SGI Octane is slower than the 500 MHz Macintosh, or than a Pentium system running at 750 MHz. The reality is that the SGI will easily outpace both systems at most tasks, just as a Porsche 911 will outrun a Dodge Viper that has a much larger engine than the Porsche. It's all about balance, and code optimizations, and memory tasking, and wait states, etc. etc. etc...

    Please, let's not let Intel brainwash us all into thinking that CPU cycles are all that. There is more to chip design than making pipes deeper and cranking up the clock crystals. For instance, the R10k MIPS chip in my SGI will never work in a laptop design the way the G4 chip can. The MIPS chip would start a fireball in anything without a heat sink the size of a VHS cassette and big fans, whereas I expect to be working with the G4 in a PowerBook some time next month, without the clock-pacing tricks Intel has had to implement in its portable Pentiums (a trick, by the way, that Apple implemented back in 1991 for its PowerBooks of the time). The chips are obviously designed for different purposes, but it is pretty cool that the G4 has the legs to run in a workstation while at the same time having low enough power consumption and heat production to be used in a portable.

    Companies like Transmeta, Motorola, IBM, and ARM will show the way to more elegant chip designs, and somehow they will have to compete with Intel's marketing juggernaut. (I know, I know, Intel now owns a part of ARM. Perhaps this is a good thing?)

    My point is simply that we should not buy into Intel's marketing, making it harder for better, more efficient chip designs to come to market. So let's not let this misconception last much longer, OK?

  • The Mac will easily hold its own, and at certain tasks, like Photoshop, it is much, much faster.

    The reason for this is that the G4 has a 1 MB L2 cache, which the Athlons and PIIIs have shrunk in order to push up the MHz. Why does this matter?

    The L2 cache has a bandwidth of ~10 GB/s, whereas accessing main memory is 10 times slower (PC133 has a bandwidth of 1.08 GB/s). When you're applying effects in Photoshop, a large L2 cache makes a huge difference, simply because the processor can load 1 MB chunks of the picture into cache and perform the effect on them, while the Athlons/PIIIs only have room for a quarter of that. For the very specialised problem that Photoshop is, a huge L2 cache matters a lot more than MHz. (Most other apps benefit little from an L2 larger than 256 KB.)

    It would be interesting to see a benchmark comparing Intel's Xeons (which also have a big L2) and the G4. Photoshop optimized for the P4, which thanks to Rambus has high memory bandwidth (but small caches), would also be interesting.

    (As for the other apple "innovations", they're mostly interesting from a design perspective, not technical, so i'll leave them alone :) )

    -henrik
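    The cache-blocking idea behind henrik's argument is easy to sketch: instead of streaming the whole image through memory once per filter pass, do every pass on one cache-sized tile before moving on. A toy illustration in Python (the 64 KB tile size and the two filters are made up for the example; real Photoshop tiling is more involved):

    ```python
    # Toy illustration of cache blocking: two filter passes over an image.
    # The "naive" version sweeps the whole image twice; the "blocked"
    # version does both passes on one cache-sized tile at a time, so the
    # tile is still hot in L2 when the second pass needs it.

    TILE = 1 << 16          # pretend 64 KB of pixels fits in cache

    def brighten(pixels):
        return [min(p + 10, 255) for p in pixels]

    def invert(pixels):
        return [255 - p for p in pixels]

    def naive(image):
        return invert(brighten(image))        # two full sweeps over memory

    def blocked(image):
        out = []
        for i in range(0, len(image), TILE):  # one sweep, per-tile passes
            out.extend(invert(brighten(image[i:i + TILE])))
        return out

    image = [(i * 37) % 256 for i in range(200_000)]
    assert naive(image) == blocked(image)     # same result, better locality
    ```

    Pure Python won't show the actual speedup, of course; the point is the access pattern. The blocked version is why a chip with a bigger L2 can beat one with more MHz on this kind of workload.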

  • (A year late, but hey, Win95 was supposed to come out in '93, and we know NT 5 was supposed to come out in '95. :)

    That would have been quite a trick, considering WinNT 4.0 came out in 1996! :-)

    I think the original target date for NT5 was late 1998/early 1999.

  • I have an April 1997 PCWorld that talks about Windows NT 5 coming out in June of 1997, even though it was already a year late.
