New G4s Coming Our Way 259
MasterOfDisaster writes "According to c|net, and this article on maccentral.com, Apple will release "four new, single-processor Power Mac G4 models, all using a 133MHz system bus, and ranging in speed from 466MHz to 733MHz" as well as MacOS 9.1 and several other things, next Tuesday at MacWorld Expo in San Francisco."
Re:Still losing the speed race (Score:2)
i don't think this is nearly as bad as you make it out to be. here's the thing: computers are getting too fast these days. there are very few people who need 1GHz computers. most people just need a pretty average machine, and even an "average" machine these days is pretty quick!
with processor speeds increasing the way they have, i predict that computers will start to sell based on other criteria, rather than just "speed." this is where you're going to see Apple really take off. it'll be similar to why people buy cars: they don't buy just the car with the fastest engine, they buy the one with the features and style they want.
you say the "average consumer" is going to pick the bigger number of MHz. i say the "average consumer" doesn't even care! have you talked to "normal" people about getting a new computer? this is what they say: "I want to buy a computer." they don't say, "i want to buy a 1GHz Athlon." most computer-illiterate people i've met just equate a computer as a computer. as long as it's not "old" (that is, used), it's just a "computer" the same as a car is a car. they'll go out and buy the one they like the most after "test driving" it in the store.
it's mostly computer-savvy or at least somewhat-computer-interested people who even look at "specs." it's the people who have a passing interest, but no really solid knowledge in computers, that buy based on the bigger number of MHz. when you start selling to people who really don't give a shit as long as the computer does the job they want it to, then pretty Apple computers, with easy FireWire and USB ports and the slick interface of Aqua, are going to sell.
at any rate, i'm very much looking forward to the future of Apple. i love running Linux, but i still get all my "real work" done on a Mac, and i don't think that's going to change with Mac OS X (except that it may actually cause me to use my linux box considerably less)
- j
Re:Still losing the speed race (Score:1)
Re: (Score:1)
NT Shipping Dates, Cairo, Chicago, etc... (Score:2)
All hearsay, of course.
--
Re:Finally... (Score:2)
Re:Damn! We want dual processor G4s! (Score:2)
Re:Still losing the speed race (Score:1)
Do you have any idea what you're talking about? ADC is just an interface combining power, DVI, and USB in one cable/port. Its only purpose is to eliminate cable clutter.
Macs have been known for color consistency for years because of ColorSync. This has nothing to do with ADC, which is based on a 3 year old IBM technology and was introduced only 6 months ago with the CP machines.
Re:Confessions of a former Mac User (Score:1)
Make that the most ergonomic mouse around with no buttons. You obviously have this mixed up with the old Apple mouse from a year ago.
There are no ergonomic 3-button mice, because they all force you to keep three fingers poised over their respective buttons with either your palm pushing the mouse, or else your thumb and pinky clamped onto its sides... very un-ergonomic. The buttonless Apple mouse is a dream to use... especially since the OS does not really require multiple mouse buttons. I like my MS IntelliMouse, particularly the spiffy scroll wheel in the middle, but the new Apple mouse is much more pleasant to use. (For the record, I still use the MS mouse on Win and Linux boxes, and I think MS makes some of the best mice on the market.)
Re:9.1 to be released at MacWorld Tokyo in *Februa (Score:2)
Wired (Score:2)
Mac advocate argues Apple hardware overpriced? (Score:1)
Let me get this straight. Apple won't port OS X to x86 because everyone will stop buying Apple hardware and instead run OS X on x86, ergo Apple won't make any money.
There are two somewhat contradictory implications to that:
I'd argue that Apple wouldn't lose any real market share in hardware; the Mac users are pretty much sold on Mac hardware. There is a risk that the performance claims made by Apple would be shown to be largely subjective when people ran the same OS side by side on different hardware. Whether they would gain a lot of x86 users depends on Win32 developers embracing this new platform.
Re:Confessions of a former Mac User (Score:2)
This is probably the best thing I can ever remember them using... It's the best external connection to rival external SCSI I've seen... course if they could just get the cost (for devices) a bit lower, then it would be real sweet... FireWire = Digital Video. Hellllooooo Non-Linear Editing! Woo hoo!
"So? Optical mouse may be nice at some things, but I know lots of desktop publishing people & artists that hate that part about the newest Macs."

I don't think that's entirely fair. As someone who spends a great deal of time over his mouse (NLE work, constantly), I'm a big fan of the new optical mouse. I didn't like the puck, which is where you could be getting confused.
MacOS, a waste of time (Score:2)
get your dates right (Score:1)
Anyway, Rhapsody was originally due out in '98, and while MacOS X Server did eventually come out in '99, it really didn't fulfill the promise of Rhapsody: a stable consumer OS. So you could say that MacOS X is really about 3 years late now.
Also, Rhapsody was originally going to run on x86 machines as well as PPC, which was completely dropped after Apple realized that if it did, no one would buy Apple's overpriced hardware.
-this is from a long time Apple/Mac user: Apple IIe, Classic, Classic II, 5200, Blue G3, G4/400
If we could only send a Jackass to the moon... (Score:2)
umm, no (Score:1)
Having a machine that doesn't crash and has real dynamic memory allocation will be heaven for most Mac users. All Apple really needs to do is take out that friggin debug code so the thing doesn't run slow as shit.
Apple's problems (Score:1)
Unfortunately, from day one, Apple kept its doors tightly closed and would not let anyone except Apple in. In addition, Apple targeted a narrow, small market segment such as "graphic artists and desktop publishers", and marketed itself as a "cool" company that produces "cool" products.
It is my belief that those events, along with other mistakes made by Apple, are what made Apple what it is today -- corporations don't take its products seriously enough to do anything with them.
So while Mac OS X is "cool", that is all it is -- "cool". If there is no business strategy to deliver it to consumers and corporations, it will just be used by those Mac fans and no one else.
Finally, 10 years ago there used to be a reason to buy a Mac: publishing and graphics. Today, you can get those applications on a PC: Adobe Photoshop, etc.
And for those Mac users who keep touting its UI as being easy to use -- would you please stop it and get a life!! Just use Windows, KDE, GNOME, OpenLook, etc. (any other UI) for a few weeks and you will see that the Mac UI is not the magic you think it is.
So tell me, why do I need a Mac?
Re:Still losing the speed race (Score:1)
The entry level iMac for $799 doesn't come with the FireWire ports. Only the iMac-DV and iMac-DV+ have those ports. Those new G4s look pretty cool. They come with 10/100/1000base-T gigabit ethernet cards.
Re:Still losing the speed race (Score:2)
Re:Confessions of a former Mac User (Score:1)
I'm bothered... (Score:2)
Not long ago you could go into the Store section of their site and choose to beef up a machine with more memory, larger hard drives, better monitors - but now you only get choices for software and peripherals like digital cams. What gives?
I guess the only way this relates to the topic at hand is that the only time I look at Apple is when something new is about to come out and I can afford the old stuff. I am really in the mood to run a LinuxPPC/MacOS machine.
Re:Finally... (Score:1)
I am not joking.
Re:Apples, Oranges, Grapes, Pears.... (Score:1)
--------
Price plays a part (Score:2)
But these freshly baked 733MHz wonders will be much more expensive at first. And it would add a lot to the price of the system to add an extra processor.
I agree (Score:1)
But still, the prospect of having a stable, modern underpinning to the OS is very appealing.
I'm hoping that Apple will change Aqua a bit before the final (we should see the results at MWSF) and also that they will release tools to allow 3rd parties to create themes that can drastically change the interface. I will almost certainly not be using the default Aqua scheme (even if they give me an Apple menu and trash the dock, it's way too bright).
Re:Still losing the speed race (Score:2)
The biggest problem with the Mac is not the megahertz the machine runs at but the perceived speed that it takes to do stuff. I have an outrageously specced Mac sitting on my desk and the UI acts as slow and retarded as the Mac I used to use at university nearly a decade ago. The single-mouse-button, single menu strip is just as painful to use as it was then, and Apple haven't picked up on any of the UI advances that other operating systems have made in that time. I don't think the MacOS X UI will be much better, but at least it will be a real OS under the hood and much more power-user friendly, with access to shell prompts etc.
Re:NT Shipping Dates, Cairo, Chicago, etc... (Score:1)
Chicago was the code name for Windows 4.0, aka Windows 95.
Some documents [pcc.edu] (note its date) claim Cairo was the code name for Windows NT 4.0 (the first release of NT with the Win95 interface).
Others [microsoft.com] claim that Cairo is/was the code name of NT 5.0 (aka Windows 2000)
Perhaps Microsoft used the Cairo codename for NT 4, and then reused it for NT 5. That's just a guess, though. However, the claim that NT 5 was due out in late '95 doesn't seem to have any basis in fact. Rather, it seems like the result of a confused combination of the version number of Cairo #2 with the release date of Cairo #1.
Re:Still losing the speed race (Score:2)
Re:If we could only send a Jackass to the moon... (Score:2)
Still losing the speed race (Score:3)
Re:Still losing the speed race (Score:1)
A "low end" machine is generally the cheapest available. An E-Machine box for Intel arch, an iMac for Apples.
And a "low end" PC generally doesn't have a 17" display - if so, damn, I'm still using a 15" at home (and I cry when I leave my 19" at work)
Re:Is it Motorola's fault? (Score:1)
frankly I just can't take his word as absolute confirmation.
On the other hand, that doesn't mean that the story has to be bullshit. But Moto has many reasons not to focus on Apple as a customer; they have many problems of their own in their business, and from what I have been hearing they are suffering from quite a brain drain in their microprocessor department. this alone is enough reason to focus on more profitable areas of their business (including selling PPC chips for signal processors).
anyway, take it all with a grain of salt
Re:9.1 to be released at MacWorld Tokyo in *Februa (Score:1)
Oh.. and MHz is everything? (Score:3)
It doesn't mean that the P4 is bad compared to the G4, it just means that you can't compare them by looking at the MHz/GHz-rating.
They have taken different routes to high performance, but people seem to automatically assume that higher MHz == higher speed. It is often speculated that _this_ is the reason for Intel's sacrifice of per-clock performance on the Pentium 4 (something I find rather believable).
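The clock-versus-throughput point comes down to one line of arithmetic: work per second is clock rate times instructions per cycle (IPC). The IPC figures in this sketch are purely hypothetical, chosen only to illustrate how a lower-clocked chip could keep pace with a higher-clocked one; they are not measured numbers for either processor.

```c
/* Effective throughput is clock rate times instructions per cycle,
 * not clock rate alone. Returns millions of instructions per second. */
double mips(double mhz, double ipc) {
    return mhz * ipc;
}

/* With these (hypothetical) IPC values, a 733MHz part lands in the same
 * ballpark as a 1500MHz part:
 *   mips(733.0, 2.0)  -> 1466 MIPS
 *   mips(1500.0, 1.0) -> 1500 MIPS
 */
```

Of course real workloads add memory bandwidth, cache, and instruction-mix effects on top of this, which is exactly why a single MHz number tells you so little.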
Re:Damn! We want dual processor G4s! (Score:1)
Re:Wow! (Score:1)
Are you sure we can talk about this? (Score:2)
Re:Apple Hardware prices (Score:1)
Jobs/Apple can sell some nice plastic, but Apple needs to evaluate the price points of their hardware better. The G4 Cube was better than an iMac and a bit less than the Power Mac G4. So Apple should have priced it between those two products from the start. If they had done that, then I think they would not have had the inventory problems they had with the Cube. Also they should not try to sell hardware to a niche market inside an existing niche market. Or if they really wanted to do that, they should have done a better job at forecasting based on this and the initial price they planned.
Apple's been on the verge of going out of business for the last 20 years and will probably do so for at least the next 10 years.
Re:get your dates right (Score:1)
BTO (Score:1)
That's right...you ain't seen nothing yet!
Sorry. I couldn't hold back.
Shooting off here... (Score:2)
Geek dating! [bunnyhop.com]
Re:Boring.... (Score:2)
Re:Hopefully? (Score:2)
Does OS X support SMP?? (Score:2)
Re:St. Steve is the loser... (Score:2)
Torrey Hoffman (Azog)
Re:Boring.... (Score:2)
They also don't understand the difference between closed-source and GPL. I guess all those Linux proponents should just go home.
--
Re:nope (Score:2)
computers still have a long long way to go speed-wise. it's as if you're in 1904 saying "why would a car ever need to go faster than 25 miles per hour?"
besides, people will always be drawn to the faster machine, both by internal competitive drive and by marketing pressure.
Let's try applying the automotive analogy to that last sentence of yours: "People will always be drawn to the faster car". Er, no actually: People base their car buying decisions on many factors, and speed is pretty far down the list for most people, because any car you'll buy will be more than capable of going as fast as you actually want to go in 99% of situations.
Sure, cars had a lot of room for increases in speed in 1904, but eventually those increases leveled off. Who's to say that the same thing can't happen to computers? How can you say with confidence that it isn't happening already?
Re:For better and for worse... (Score:2)
Re:Still losing the speed race (Score:2)
For nearly 3 decades now, the computer consumer has been accustomed to ever increasing speeds, for stable or declining prices. Anyone remember spending five grand on a 4MHz 8086 with 4 megs of RAM?
Then, 6 months later, the machine would be obsolete, as a machine twice as fast was out for probably four and a half.
Maddening. 3 years later, it was compelling to get a new machine, maybe still 5 grand, but we were talking about significant gains; 66MHz.
The problem with Apple is, nobody's buying new machines. I'm not buying a new machine because of my Beige G3 at 300 MHz, with 192 megs of RAM on a 66MHz bus: though I'd like it to be faster and more responsive, I'm not willing to blow $3500 on a machine that's barely twice as fast. I spent $1500 on this G3 two years ago; twice as fast for twice the money? After 2 years? Blow me.
I would pay that kind of money for a dual 600 with a 200 MHz bus. But this 133MHz bus ride is bullcrap. Apple's hardware technology is behind the curve. Don't tell me I don't need a faster machine. When it comes down to it, I don't need ANY machine. I need food, air, and shelter. What I WANT is a machine that's faster. One that can run the latest bloated eye-candy at least as quickly as the 2 year old machine ran its OS.
Apple has to either significantly lower its prices, or significantly advance its hardware. That's all.
Personally, I think this announcement has only one purpose. It is to generate sales of the older discount hardware to fix Apple's inventory problems. Frankly, the older discounted machines are far more attractive than the vapor they're announcing today - and I believe that's by design. As soon as the inventory of the older machines is eliminated, Apple will announce upgraded models (this is EXACTLY the Yikes plan, rehashed), with the 200 MHz buses, perhaps faster CPUs, perhaps not, but they'll stress MP more than single CPU. My guess is that Apple would really rather sell single processor machines, as the profit margin is higher. - but in order to appeal with single processor machines they need higher MHz-age.
Re:Finally... (Score:2)
Of course not. Mac OS X isn't ready to ship yet. Did you see the public beta? The user interface was a disaster. Hopefully they've fixed the design flaws, but there's still some debugging and polishing left to do. When they do release it, it needs to be perfect.
--
Yikes! (Score:2)
That just woke me up _real_ fast. (wish I had a G4 instead of G3, too). The thing is, it's Linux- it doesn't have to be just a distribution, you can maintain things yourself. The important thing is the compiler because if you are a good little linux user and know how to compile all stuff with ./configure, make, make install (or whatever the RTFDirections says), you get all the software set up for your processor- given certain conditions.
Altivec can be used for block moves, for a wide variety of big-data-handling operations. It can be _general_ _purpose_. Does this GCC simply allow for software to be written using Altivec (as if it was some sort of very specialised MMX) or does it dynamically take advantage of the 128-bit registers wherever possible? Whether or not it _does_, it _could_ in future do that: particularly if the C libs are written to be Altivec optimised where possible (again, such as using the registers to move large chunks of data).
Very cool, can't wait for it to become more generally useful- I sort of doubt that all of GCC can make use of Altivec (in the way that Quicktime and Quickdraw were rewritten to make use of it, and that OSX's rendering layer does) but it's just a matter of time because we _are_ talking about a current-generation powerful consumer-level architecture with special characteristics. Linux has a way of adapting itself to these. Eventually, not only will PPC look like a very sensible choice for Linux deployment, but Linux will look like a very sensible option for Mac alternate OS choice.
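For what "taking advantage of the 128-bit registers" means in practice, here is a minimal, portable C sketch. Real AltiVec code would use the vec_add() intrinsic from <altivec.h> on actual 128-bit vector registers; this version merely unrolls the loop four 32-bit lanes wide to show the shape of the transformation a vectorizing compiler or an AltiVec-tuned C library would perform.

```c
/* Scalar add: one float per iteration. */
void add_scalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* Conceptually what AltiVec does: process four 32-bit floats per step,
 * as one 128-bit register would. This is plain C unrolled by four, not
 * real vector code -- the intrinsics version would replace the body
 * with a single vec_add() on vector float operands. */
void add_vec4(const float *a, const float *b, float *out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {   /* "vector" body: 4 lanes at once */
        out[i]     = a[i]     + b[i];
        out[i + 1] = a[i + 1] + b[i + 1];
        out[i + 2] = a[i + 2] + b[i + 2];
        out[i + 3] = a[i + 3] + b[i + 3];
    }
    for (; i < n; i++)             /* scalar tail for leftover elements */
        out[i] = a[i] + b[i];
}
```

The tail loop is the part people forget: array lengths are rarely multiples of the vector width, so every vectorized routine needs a scalar cleanup path like this one.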
Re:Still losing the speed race (Score:3)
Apple, as usual, is being held back by Apple. They've switched processor families before and there is no technical reason they couldn't again. For some reason, everyone else in the world knows that selling PC hardware is a low margin game and that Apple's forte is their OS and some of their applications, but they keep stumbling around trying to convince themselves that making cool looking boxes is going to recapture their past and short lived glory years.
Re:Still losing the speed race (Score:2)
Re:Does OS X support SMP?? (Score:2)
OS X [apple.com] supports SMP fully [apple.com], as it's based on NextStep [xappeal.org]. (OS X is nothing more than NextStep with its out of date userland programs updated with FreeBSD's [ispworld.com].)
Do some research [salon.com] next time.
Re:Wired (Score:2)
"There are even calls for the return of Steve Wozniak, Apple's vice-president of research and development from 1976-1985, a time when Macs held a strong position in the marketplace."
Does the writer even know that the Mac wasn't introduced until 1984? Or that Woz had nothing to do with the mac?
Re:If we could only send a Jackass to the moon... (Score:2)
Re:Does OS X support SMP?? (Score:2)
But I believe it will only be supported for apps written to the BSD or Cocoa subsystems. I may be wrong about Carbon, but I think Carbon apps will be funneled to one CPU, and I'm pretty certain Classic (the majority) apps will be single CPU only.
So, not only do we have to wait for OS X to come out, but we have to wait for the major vendors to release native ports of their apps. If I'm right about Carbon, that will be quite a while. I don't think Adobe, for one, has ANY plans to rewrite Photoshop in Cocoa (although Apple could make that somewhat attractive by resurrecting the OpenStep for Windows thingie - then Adobe could port to Cocoa, and recompiled binaries would run on OS X and NT, and rumor has (or had) it that there was an OpenStep runtime for SPARC/Solaris as well - ah, fantasyland. . . )
Re:Can you imagine... (Score:2)
Re:Wired (Score:2)
Re:Does OS X support SMP?? (Score:2)
Re:If we could only send a Jackass to the moon... (Score:2)
You cannot say (capital "Y") "Yahoo!".
consider yourself warned.
-Yahoo! corp. legal copyright enforcement team.
Re:Still losing the speed race (Score:4)
counting macos bits (Score:3)
MacOS had 24 bit addressing from the start, although I think the early systems or hardware decoded anything with the high bit high as the ROMs (but it's been a while, and my little brother has my copies of Inside Mac).
At system 6.0.something (i don't think it was
This comes from the nature of the early 68xxx processors. The original design had a 16 bit data path, 16 bit ALU (wait, it was 32, wasn't it? it could do 32 bit operations, but did it do that by using the same ALU on each half? it's been too long . .)
Given that a 32 bit register was addressing a 24 bit address space (there were only 24 pins for addresses; this was still DIP packaging for the processor), it left 8 bits which were tempting to use.
Apple told developers not to use those bits, as they were reserved. Programs that followed the directive were generally executable on later machines, while those that weren't needed to be rewritten. The two biggest violators, in order? Apple and Microsoft . . .
Sometime around the IIx and SE/30, the ROMs became "32 bit clean" and other software was similarly designated. Such machines could generally (but not always, iirc) go past 16M of memory. ROMs could be retrofitted to some models to allow such software.
I want to say that it was System 7 that required 32 bit clean ROMs, but it's been a while, and I'm not certain. There were certainly significant differences between Systems 1-6 and 7, but it really wasn't a 16/32 transition. The original 68k was a 16 bit chip in the same sense that the 8088 was an 8 bit chip--data path, and not much more. For most intents & purposes, the MacOS was a 32 bit OS with a bit of 24 bit crippling from the start.
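The flag-bits trick described above can be sketched in a few lines of C. This is an illustration of the general technique, not Apple's actual Memory Manager code; the real Toolbox call for discarding the high byte was StripAddress().

```c
#include <stdint.h>

/* The low 24 bits were the real address on a stock 68000 (24 address
 * pins); the high byte of a 32-bit pointer was "free", and early
 * software -- including Apple's own Memory Manager -- stashed flags
 * there, e.g. the handle-lock bit. */
#define ADDR_BITS 0x00FFFFFFu

/* Pack flag bits into the unused high byte of a 24-bit address. */
uint32_t tag_high_byte(uint32_t addr, uint8_t flags) {
    return (addr & ADDR_BITS) | ((uint32_t)flags << 24);
}

/* What "32 bit clean" code had to do before dereferencing: drop the
 * flag byte. On a machine with real memory above 16M, skipping this
 * step turned the flag byte into part of the address -- a wild pointer. */
uint32_t strip_address(uint32_t tagged) {
    return tagged & ADDR_BITS;
}
```

For example, tagging address 0x00345678 with flag 0x80 yields 0x80345678, which only resolves back to the right memory if it is stripped first -- exactly the bug that bit programs which ignored Apple's "reserved" directive.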
hawk, dusting off old memory cells.
Re:Still losing the speed race (Score:2)
The monitor's base had a bunch of connectors (ADB, sound out, and mic in), but they did not connect using a single cable, but rather a bunch of them.
My PMac 8600AV/200 also has that setup, onto my Apple MultiSync 17" display. The whole thing requires a tangle of cables on the back of the monitor, which is totally different from the ADC connector, and ultimately, from the NeXT cable.
Besides, the NeXT cable also predates the AV systems.
Karma karma karma karma karmeleon: it comes and goes, it comes and goes.
Re:counting macos bits (Score:2)
>stuff, leaving the top 8 MB for programs and the MacOS to run in.
Ahh. I had them backwards
>It's interesting that Apple had the foresight at the time (1982?) to
>reserve the bottom of memory for what they thought they needed for
>hardware address space, leaving the sky the limit for adding memory
>above the 16MB barrier when Motorolla overcame that limitation of
>their processors.
It's not so much foresight, I think, as failing to do something extremely stupid: nothing was hardwired to any of the addresses, so they can all be put anywhere you want at boot time.
Remember the Switcher (pre-MultiFinder)? On a 512k or 1M machine, you had multiple programs loaded by having multiple copies of the system loaded at varying addresses (only one of which could be at the "normal" load space).
>This is in stark contrast with Intel/IBM/MS that decided to reserve memory at 640 KB in the x86, setting an ultimate upper limit never to be overcome in real mode.
That's not quite how it happened, though. IBM only claimed 256kb of address space, anyway. We quickly figured out that 512kb was workable, and it seems to me that there was a year or two before someone figured out you could add another 128.

There wasn't really anything hardwired to that space, although the color and monochrome cards had fixed addresses. These should have been movable, except that the BIOS drivers were *so* slow and poor that everyone had to write to the hardware. (If memory serves, keeping up with a 1200 baud serial port was beyond the BIOS's ability, but it may have been a faster [but still slow] speed where it couldn't hack it.)
Some early Mac programs did the same direct-to-hardware thing, but a) these got broken hard early on by competitors that didn't, and b) the toolbox was well enough done that it generally gave better performance than custom code anyway.
>Trying to install NetBSD on old 68K based Macs helps you sort all of
>this stuff out.
Trying? MacBSD on a IIci was my primary machine for a few months--which is when the serious 1-bit display problems on LyX went away (no, I didn't fix them; I just kept reporting what I couldn't see . . .). The limited display size soon had me using primarily the Linux box at its side, as I could drive the 17" display at 1024x768 . . .
hawk
Re:counting macos bits (Score:2)
/me brushes more dust off brain
wait a minute, wasn't that a third party utility that let you do that? and eventually apple bought it and included it?
I never really followed it that much, because my 030 macs were all 32 bit clean, while it just didn't matter on my 68k models . . .
hawk
Apple's Service completely sucks (Score:2)
Big, Big, Big mistake. I feel like a complete ass. My father has had nothing but complete trouble with the piece of crap. The mouse locks up every hour... no, the whole damn machine locks up every hour. The SCSI card already had to be replaced, and same thing with the HD... at least that is what CompUSA's shitty support said and did.
of course, it still locks up every bloody hour or so for no particular reason. My father has tried and tried and tried and tried to get Apple support and sales to pay for a complete diagnostic on it. (NOPE, they said "HE" would have to pay the 100+ bucks for CompUSA to run this diagnostic crap on the motherboard, and only after that would they consider replacing the motherboard.)
he also tried to get them to replace the whole machine... again, the only thing they would offer is the damn diagnostic test, which he would have to pay for.
and now here is the kicker: although there is a 90-day return window, because he took it into CompUSA to get the SCSI card replaced (took 2 weeks), then back again to get the HD replaced (took 4 fucking weeks), he was pushed beyond that 90-day window... so now he cannot even get his money back, and Apple will not... NO, they REFUSE to remedy the situation
To give a comparison, when my dad's 1 1/2 year old Dell laptop went kaput, they [DELL] flew in a technician to replace the motherboard, no questions asked. Now that is unbelievable customer service. Something APPLE severely lacks
We are still trying to get Apple to do something, but every time we call and try to move up the management ladder we always get "they will call you back", which they never EVER do. So frustrating
I feel so bad recommending this to my father, who pretty much has a 5g paperweight on his desk. I will never ever recommend Apple again after this fiasco. If anybody has any pull at Apple, please let me know. I would love to bring some closure to this.
1904 cars and 25 mph. (Score:2)
I think it was the 1903 Sears catalog that offered a car capable of all speeds from 0 to 25, noting in the ad that they didn't think the average man had any use for going 45 or 50 as more expensive cars did . . .
While I'm at it, in law school we read a case about "reckless entrustment," in which the owner of the car was being sued for lending it to the driver when he should have known better. Part of the claim was that the driver had a reputation for "driving as fast as 50 miles per hour" . . .
Re:Still losing the speed race (Score:2)
Close, but not entirely true.
Actually, ADC is simply Apple's use of prior "technology" (as much as cables can be considered technology) borrowed from NeXT Computer, which we all know has been absorbed by Apple (and Apple by Steve, but that's another story).
My 040 color slab (aka "NeXTstation Color") has that kind of cable (different pinouts etc, but the end result is the same) that goes from the machine to the sound box (external speaker) where the keyboard, monitor etc are connected.
If I had a NeXT Mono monitor (the cool-looking one), then that cable would connect to the monitor, and the keyboard, sound box etc would connect to the monitor, like the current ADC connector.
My black 040 NeXT Cube at home also has the same kind of connector, but for my color (Fimi) monitor to work, it has to be connected to the NeXT Dimension board. So, one cable goes to my monitor, the other to the sound box where the keyboard is connected.
Get black hardware info at this address [channelu.com].
Karma karma karma karma karmeleon: it comes and goes, it comes and goes.
Damn! We want dual processor G4s! (Score:3)
Being an owner of a couple of macs, including a 9600 (old multiprocessor 604 computer), and a pc owner (1 dual pentium 166, 1 dual pentium pro, 2 dual pentium II 333, a single processor athlon and a partridge in a pear tree ;-) ), I'd say that my experience with multiprocessor computers is very favorable. Running Linux/FreeBSD or Windows 2000/NT, it really makes the machine more usable. If I encode an MP3 on my single processor computer, it will chew up all the processor time and make other programs run deadly slow (on my windows 2000 machine), but on the dual processor machine (windows 2000 or freebsd/linux) the machine can easily encode an MP3 and it will only chew up 50% of resources.
I think Apple jumped the gun with dual G4s, but NOW IS NOT THE TIME to stop making them. OS X will take advantage of the extra CPU and make the thing fly!
--
iCube? (Score:2)
I think it has some potential. Granted, G4 Cube sales have been a disappointment. But iMac sales are starting to drop off. High-end iMac DV sales apparently did pretty well, because there is little inventory left on these. Given that the high-end iMac DV SE sells for $1500, maybe a G3 Cube would be a good product to replace the high-end iMac.
How about a bundle: G3 Cube + RAGE 128 + 15 inch flat screen? By bundling the screen with the G3 Cube, Apple might be able to sell the whole package for under $2000. Consider that Compaq and Acer are marketing flat-screen PC bundles for about that price. Such a product would address one complaint about the iMac, its all-in-one design.
There are reasons why Apple might not do this. For one, it might hurt sales of the G4 Cube. But my sense is that anyone who might stretch a bit to reach $2K for a G3 Cube would not go for the G4 Cube anyway. Since G4 sales are poor, it does not appear that the cachet of the trendy design is really moving the product anyway. So, why not market the design to another segment to try to recoup the investment?
Re:Still losing the speed race (Score:2)
There is no incompatibility or inconsistency between pre-emptive multitasking and fixed priority scheduling.
In fact, the fixed priority scheduling is what made (and still makes) the Amiga such a dream to work on, compared to most other platforms. The computer can be doing 20 different things, but as long as you have the priorities set right, the task that you're working with runs at 100% full speed. I wish OS/2 or NT or Unix could do that. I hate so-called "modern" schedulers.
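The scheduling policy being praised here is tiny to sketch: the ready task with the highest fixed priority always runs, full stop -- no aging or dynamic boosts as in "fair" Unix-style schedulers. The task structure and field names below are invented for illustration; they are not the Amiga Exec's actual data structures.

```c
#include <stddef.h>

/* Hypothetical task record for a fixed-priority scheduler sketch. */
struct task {
    const char *name;
    int priority;   /* fixed at creation; higher number wins */
    int ready;      /* nonzero if runnable */
};

/* Pick the runnable task with the highest priority. The chosen task
 * keeps the CPU until it blocks or a higher-priority task becomes
 * ready -- which is why a well-prioritized foreground task runs at
 * full speed no matter how much background work exists. */
const struct task *pick_next(const struct task *tasks, int n) {
    const struct task *best = NULL;
    for (int i = 0; i < n; i++) {
        if (tasks[i].ready &&
            (best == NULL || tasks[i].priority > best->priority))
            best = &tasks[i];
    }
    return best;
}
```

The flip side, which the post leaves out, is that a misbehaving high-priority task can starve everything below it forever; "modern" schedulers trade the guaranteed foreground speed for protection against exactly that.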
---
Re:counting macos bits (Score:2)
Re:Apple's problems (Score:2)
MacOS X may be a different story, but until that appears, the Mac is stuck with an arcane OS and a pretty but stuck-in-time user interface. Neither of these things would make me compare the Mac to a Ferrari, except for the exorbitant price markup both logos entail.
Re:Does OS X support SMP?? (Score:2)
Yes. Here's output from our iMac running the OS X beta:
It says "up to 2 processors" but as far as I know there's no reason why it couldn't do 4 or more, and I expect it will when Apple releases quad or higher systems.
Boring.... (Score:2)
--
Apple isn't being held back by anyone but Apple. (Score:2)
Motorola has hit 1 Ghz with the G4 Processor. Here's the story from CNET [cnet.com]
I'm sure Apple's pricing might scare people away from a G4 too, unless they sell a kid
aztek: the ultimate man
Re:Confessions of a former Mac User (Score:5)
Have you compared the speeds of, say, a G4/500 dual-processor system and one using a high-end AMD or Intel chip? The systems are very comparable. The Mac will easily hold its own, and in certain tasks, like Photoshop etc., it is much, much faster. They are not "falling farther and farther behind."
"Second, software: I'm sure I won't have too much trouble convincing the die-hard command line users that MacOS is inefficient and hard to use, but even in terms of GUI, the once-proud Apple has been overtaken by BeOS and Windows ME, and has GNOME and KDE hot on its heels. Much like hardware, Apple is handicapped by its users' insistence that changes be minor and easy to adapt to. "
MacOS is inefficient? Hard to use? I believe most people will acknowledge that MacOS is one of the easiest OSes to use. It is criticized sometimes for not being "sophisticated" enough for the power user. This does not make it inefficient. Though it lacks features like protected memory, etc., it is a very efficient OS, in the sense that Mac users are very, very productive. Ask a graphic artist or desktop publisher. The Mac OS is not hard to use, nor is it inefficient. Compared to Windows ME and the various Linux GUIs available, the average new computer user will find the Mac OS the easiest to use.
You also comment on Apple's lack of "innovation." Let's see, I'll name a few. These are not necessarily all Apple inventions, but Apple was the first to actually bring these to the masses:
1. Firewire.
2. USB as the main I/O interface.
3. Getting rid of legacy ports
4. iMovie - video editing for the masses
5. iMac - an easy to set up, all-in-one unit that appeals to the "average joe" who doesn't always care about technical specs
6. Optical mouse standard on all systems
7. OS X
8. Innovative Industrial design
9. Colorsync technology
Hopefully? (Score:4)
1. The dual processors... Apple can go back to dual processors again when OS X is mainstream on them. Right now with 9.04, multiprocessing is barely useful for most users (Photoshop users being the perennial exception). Meanwhile a 733MHz G4 on a 133MHz bus is pretty big news, since what it will do is make everything faster in the short term.
2. MacOS X is not gonna be truly ready until September (a year late, but hey, Win95 was supposed to come in '93 and we know NT 5 was supposed to come out in '95 :)). At that point I hope to see dual 733s on a 133MHz bus. What will the Win world have? WinME running on Pentium IIIs?
3. It would be great if MacOS ran on more boxes than just Apple's, but they didn't do so well with that. Asking them to move to cheap commodity hardware is not really rational. The real deal here is that folks don't recognize the true cost of ownership with computers until they have owned a few. The real shame is that Apple HAS reduced costs by using crappier equipment, and it bit them.
4. The biggest problem Apple had was that no one wants to buy a new machine until OS X comes out. Apple was ready with a whole new set of boxes that would have looked really perty with the perty new OS, but instead they are running the same old OS 9. If Apple really wanted to get new models sold and empty its inventory, finish the OS in Q1...
I am a longtime Apple user and Linux user and I hope to use both for a long time to come. As long as Apple makes machines that last me 5+ years I am not gonna bitch much. Since I am still using a 7600 with a G3 upgrade card I am definitely waiting. I like the idea of a dual-processor 733MHz, but in truth there is a sweet spot right now with the dual 450....1999...No matter what anyone says about comparing $300 PCs with this, the G4 is a better chip than anything Intel makes. Athlon might manage to screw that up if they keep raising the MHz, but purely for media-related stuff, the G4 rocks. Just rip a few CDs...
dlg
For better and for worse... (Score:2)
If you read the article, they point out the issue that these faster chips may not be available for a while....
On the other hand, if Mot really can cough up a 733 G4, I would much rather be running Photoshop on that than on a 1GHz Athlon (or After Effects, or ...)
The real downside to the story is the comment that most of the systems are likely to be single-processor. This is going in the wrong direction. A lot of potential buyers are going to be quite disappointed. Frankly, I was hoping for a base single-processor system, a mid-range dual-processor, and a high-end quad-processor system. If you've had to sit for an hour while AE renders 3 freaking seconds of footage, you'll know why I was hoping for quad-processor towers....
But for what most of the Hertz whiners out there do with their systems, no, quad processors won't quadruple the frame rate of Doom.
Yes (Score:2)
PowerPC is a RISC architecture. Same family lineage as IBM's POWER chips.
- Scott
------
Scott Stevenson
September? (Score:2)
What in the world does this mean? I use OS X every day as my primary OS. Except for incomplete 24-bit color support, it works great. Since I started using it in September, the OS has never crashed on me (though Classic can get a bit unruly at times).
- Scott
------
Scott Stevenson
All I can say is... (Score:2)
As for Apple (or more specifically Motorola) lagging behind AMD and Intel in terms of speed: most current Mac users will stay with the platform anyway, but Apple is going to need Mot to kick out 1GHz chips real soon.
Completely wrong (Score:2)
This is garbage. Most rumor sites publish rumors for personal gain -- whether it be for fame or money. They are taking advantage of 6-12 months worth of hard work on the part of Apple and blowing it all in one day. I don't see how this is "supporting" Apple. It's not as if Apple is going to sell more boxes because of the rumor sites.
- Scott
------
Scott Stevenson
Expectations (Score:2)
Re:Sure mHz matters, but what about the Altivec! (Score:2)
Re:St. Steve is the loser... (Score:2)
Hooray for bus speed! (Score:2)
Maybe the tortoise is catching the rabbit?
Re:Sure mHz matters, but what about the Altivec! (Score:2)
Good point. The only thing that makes me sicker than the megahertz race in PCs is the megapixel race in digital cameras. Yes, our camera has 2 megapixels. All the images are recorded as 4 by 500,000 JPEGs with a strong skew towards pink. :)
Anyway, this is not unexpected news from Apple. Many expected that the price cuts on older models were signs of newer stuff coming out. It doesn't sound like anything revolutionary here; just improvements on existing designs. On one hand, it's good for them to be cautious after the Cube debacle. On the other, it won't rejuvenate them like the iMac did. With the still somewhat cloudy PC market, it's hard to fault them for being conservative.
Re:For better and for worse... (Score:2)
Re:Still losing the speed race (Score:2)
Most people just want to interact with the internet and create different kinds of documents. Their "power app" is playing DVDs. The only thing that keeps their CPU below 99.5% idle is "Clippy" dancing at the bottom of the screen.
What people really need, more than CPU power, is good, easy-to-use software. Apple tries to provide that. If you don't believe me about the importance of good software, look at the success of the Palm Pilot vs. the failure of Windows CE.
Funny, I'm a new fan of Apple, myself. (Score:3)
In terms of performance, PCs seem to be fast enough that faster just doesn't matter. Why would I need a 1.5GHz system? I'm running on a 500MHz system, and plan to be running it for another few years yet. Heck, even 800MHz would seem to last for at least 5 years, given my track record with my last computer.
Still, I'll probably think a 500MHz Apple sucks, right? I dunno, I don't have enough experience with the G3/G4 to say; do they age particularly better than an x86?
On the other hand, I am enamored with Apple's drive for innovation.
The USB IO adoption
The Firewire IO adoption
The use of Airport and wireless networking
Mac OSX (in the near future), and Unix stability, without the ugliness of Linux!
Well, Linux isn't quite ugly, it's damn functional, but sorta a pain to set up. Win2k is such a breeze to use.
Then there's the quiet fanless iMacs and G4 cubes.
There's the firewireness of the iBooks and Powerbooks.
Optical Mice. Everywhere
*Really* nice LCD screens.
Other hardware coolness I'm looking forward to; More snazzy designs!
A Newton2!
Wireless PCs; at least, as much as possible...
OS X!
Pervasive computing!
Inclusion of mic and USB cam with *all* computers!
Instant Messaging type usability in the OS
Other random cool stuff...
Still, they aren't dead yet, and they're still doing okay...
Maybe I'll regret writing this post in a few months, when I have my Apple. I'll post and let everyone know!
Geek dating! [bunnyhop.com]
This is Horrible News! (Score:2)
Apples, Oranges, Grapes, Pears.... (Score:3)
Obviously I'm going to be taking a little shit for the fact that my email is from mac.com... so I must *clearly* be Apple biased :p BAH. My very first computer was a 286 laptop, followed by a 386 desktop, and a Pentium 120. It wasn't until I left for college that I got my own Mac. Why? Because it fits my computing needs and desires.
Now you are probably wondering... "Gee, that's great, get to the fucking point." My point is that regardless of what you like, what you know, and whom you support, a little research is clearly in order. I'm really growing tired of watching people spew misinformed posts onto the boards and positioning them as fact.
funkdat.
Re:Still losing the speed race (Score:2)
Yeah, you can buy those too. If you really need that kind of accuracy, you probably bought a monitor/video card that already has this feature included (GO SGI FLAT PANEL, WOOO!). Anyway, ensuring color consistency on all platforms was solved a while ago.
2.)Want to add hardware? While you'll have fewer options than a Wintel user, your purchase is almost guaranteed not to conflict with any common configuration. And when you want to put it in, you open the door (no screws).
You know, most PCs have screwless maintenance (well, with the case anyway. I like Sun's hard drive holders. In/out, in/out... WWWEEEEEEEE!!!). It's all in the case design, and if you don't like it, you can always get a different case (same with an Apple machine too, I guess).
Also, I have had very few problems dealing with hardware conflicts, especially nowadays. So much in Windows is handled almost transparently by the OS now. While it is still slightly buggier than on the Mac, it also has to deal with more hardware. It's not as bad as you make it sound.
3.)Your purchase will last. I own a Power Mac 8600. I do all kinds of demanding work on it. To be fair, video is not one of them. But guess what? It's still really fast. Sure, I notice the difference during some Photoshop filters, and during sound file manipulations, but my machine was bought after the G3 came out. Let's see how those celeron boxes are doing in 4 years.
The speed increase on both platforms, I believe, has been very similar. What you argue is just a point of view. I used the same computer until recently for about 5 years (Windows, not a single reformat). I did basic 3D animation/modeling. If you could do it five years ago, you can still do it today on the same machine. Some people don't seem to understand that at all.
4.) .DLL? What's that?
Dynamically Linked Library. Though I have never programmed on the MacOS, I'm pretty sure you have something similar. Anyhow, I don't really see the point of your argument. If there is a problem with DLLs, it is simply a bug in the program (or in some cases the DLL), not in the concept of DLLs.
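To make the concept concrete: a DLL (or shared library on Unix) is just code the OS loads and links at run time instead of at compile time. A minimal sketch using Python's `ctypes` — assuming a Unix-like system where the math library can be found (on Linux this is typically `libm.so.6`); the fallback to `CDLL(None)` relies on the interpreter itself being linked against libm:

```python
# Load a shared library at run time and call a function resolved by name.
import ctypes
import ctypes.util
import math

libm_path = ctypes.util.find_library("m")   # e.g. "libm.so.6" on Linux
# If the library can't be located, fall back to the symbols already
# loaded into the running process (Python links against libm).
libm = ctypes.CDLL(libm_path) if libm_path else ctypes.CDLL(None)

# Declare the C signature: double cos(double)
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))   # 1.0 -- same function as math.cos, linked at run time
```

The "DLL hell" complaint is exactly what the parent says: nothing above is fragile in concept — problems only appear when a program ships, or depends on, the wrong version of the library it loads.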
Re:For better and for worse... (Score:2)
Re:Still losing the speed race (Score:4)
If you assume that more MHz means faster chips, then you might be mistaken enough to say that the 400MHz SGI Octane is slower than the 500MHz Macintosh, or than a Pentium system running at 750MHz. The reality is that the SGI will easily outpace both systems at most tasks, just as a Porsche 911 will outrun a Dodge Viper that has a much larger engine than the Porsche. It's all about balance, and code optimization, and memory tasking, and wait states, etc. etc....
Please, let's not let Intel brainwash us all into thinking that CPU cycles are all that. There is more to chip design than making pipes deeper and cranking up the clock crystals. For instance, the R10k MIPS chip in my SGI will never be able to work in a laptop design as the G4 chip can. The MIPS chip would start a fireball in anything without a heat sink the size of a VCR cassette and big fans, whereas I expect to be working with the G4 in a PowerBook some time next month, without the clock-pacing tricks Intel has had to implement in the Pentium portables (a trick, by the way, that Apple implemented back in 1991 for its PowerBooks at the time). The chips are obviously designed for different purposes, but it is pretty cool that the G4 chip has the legs to run in a workstation while at the same time having low enough power consumption/heat production to be used in a portable configuration.
Companies like Transmeta, Motorola, IBM, and ARM will show the way to more elegant chip designs, and somehow they will have to compete with Intel's marketing juggernaut. (I know, I know, Intel now owns a part of ARM. Perhaps this is a good thing?)
My point is simply that we should not buy into Intel's marketing, thus making it harder for better/more efficient chip designs to come to market. So let's not let this misconception last much longer, OK?
Re:Confessions of a former Mac User (Score:2)
The reason for this is that the G4 has 1 MB of L2 cache, which the Athlons and PIIIs have reduced in size to push the MHz. Why does this matter?
The L2 cache has a bandwidth of ~10GB/s, whereas accessing main memory is 10 times slower (PC133 has a bandwidth of 1.08GB/s). When you're doing effects in Photoshop, a large L2 cache makes a huge difference, simply because the processor can load 1 MB chunks of the picture into the cache and perform the effect on them, while the Athlons/PIIIs only have room for a quarter of that. In the very specialised problem that Photoshop is, a huge L2 cache matters a lot more than MHz. (Most other apps benefit little from an L2 larger than 256kb.)
It would be interesting to see a benchmark comparing Intel's Xeons (which also have a big L2) and the G4. Also, Photoshop optimized for the P4, which thanks to RAMBUS has high memory bandwidth (but small caches), would be interesting.
(As for the other apple "innovations", they're mostly interesting from a design perspective, not technical, so i'll leave them alone :) )
-henrik
Re:Hopefully? (Score:2)
That would have been quite a trick, considering WinNT 4.0 came out in 1996!
I think the original target date for NT5 was late 1998/early 1999.
Re:Hopefully? (Score:2)