Dual 1GHz G4 PowerMac With Extra Yummy

A huge number of readers submitted news of the new dual-1GHz Power Mac that Apple has announced. It includes a GeForce 4 and assorted other bells and whistles that will ring and blow for the Mac junkie. They start at $3k and seriously make me want a Mac.
This discussion has been archived. No new comments can be posted.
  • by qurob ( 543434 ) on Monday January 28, 2002 @11:51AM (#2913704) Homepage
    For some reason they hold resale like a fucking BMW.

    It can be a few years old and almost cost what it did, fucking new.

    There are 604s going on eBay for $800+.

    Intel hardware retains value about as well as lunch meat.
  • I'm converted (Score:3, Interesting)

    by Troed ( 102527 ) on Monday January 28, 2002 @11:58AM (#2913751) Homepage Journal
    I've had PCs since '93 (Atari STs before that) - I liked building my own computer, I liked (!) resolving problems in DOS/Windows configurations, etc.


    No more. I've got friends with Macs, and knowing a thing or two about operating systems, I'd pick Mac OS X over Windows any day - and thus I'm now also going to convert from PC/Windows to Apple computers. I seriously hope more and more people will do this, not just those with a techie background who can see through the MS commercials and understand that, for what they use their computer for, they really should go Apple.


    Price? Umm. Let's not go there. I'm going for the iMac instead... I'm not a gamer and can live without the GeForce 4.

  • GeForce 4 MX! (Score:2, Interesting)

    by Everybody ( 59419 ) on Monday January 28, 2002 @12:12PM (#2913849)
    While in name this is a GeForce 4, technically the real GeForce 4 (non-MX) is based on the NV25 core (dual vertex shaders and an improved pixel shader).

    The GeForce 4 MX used by Apple uses the NV17 core (one vertex shader and no pixel shader). It may still be a nice chipset, but it is nowhere near Xbox or real GeForce 4 performance.
  • by Pfhor ( 40220 ) on Monday January 28, 2002 @12:19PM (#2913899) Homepage
    Video editing groups. There is a serious following of Apple in the multimedia area. Final Cut Pro has dragged in tons more video-editing people. Guess what? Apple still has the attention of tons of desktop publishers, a lot of smaller independent editing houses, and graphics departments inside larger corporations. They are Apple's market. My college would be buying them for a digital editing lab; a high school doing video editing may get a bunch of iMacs and a few G4s to do the high-end rendering for the kids who want to do work that iMovie can't. And the machines are also possible servers for all of the above people.

    And these machines are just something to keep the iMac from undermining Power Mac G4 sales; supposedly the G5s will be out soon.
  • by Pengo ( 28814 ) on Monday January 28, 2002 @12:24PM (#2913925) Journal

    I use it mostly for development and as a unix admin workstation. I hack around with Python and Objective-C and even play Return to Castle Wolfenstein on it. I imagine that I will be using it for another 8-12 months before it gets retired as a server or NAT box (which would replace my wife's old nappy iBook (think toilet seat)). The cool thing about the iBook is, with the exception of a huge hard disk, it does everything I need just fine for a unix box. I set up DNS and Apache for serving MP3s to friends, and now I can actually turn off my Linux machine when I am not using it. That means a nice quiet little server that makes almost NO noise, runs unix, runs the BSD (ipfw) firewall, and handles my DSL NAT just fine for the rest of the machines at the house.

    I find myself upgrading my PC about once every 12-14 months; I expect to get at least 2-3 years out of my G4 (as I almost have done with my iBook).

    Cheers
  • by z7209 ( 305927 ) on Monday January 28, 2002 @12:59PM (#2914142)
    Just for fun, try to build a comparable brand name PC for $3000.

    I tried with Dell and ended up with a $5,071 quote. I'm sure my specs can be debated, but I got:

    --Dual Xeon 2.2GHz (hard to tell if this is a good comparison)
    --512 MB RAM
    --80GB HD
    --ATI Fire GL2, 64MB,VGA/DVI (Best I could find on their site, besides high-end)
    --Sound Blaster Live! Value
    --Windows XP Pro

    Anyone have any idea whether the Xeon 2.2GHz is fair to compare with at all?
  • by toupsie ( 88295 ) on Monday January 28, 2002 @01:02PM (#2914169) Homepage
    No, I got a "Dual G4/500" for $3k, not $6k. Maybe you need to trade in that Athlon for a PowerMac, because it appears to have an addition bug in it. And no, your $1.5k Athlon cannot outmuscle both Mac systems. Your Athlon cannot do what a Mac can do because it is crippled by poor operating systems. Windows and Linux are no match for the BSD-induced power of Mac OS X. Where is the stock DVD authoring and burning software on your machine?
  • by Steve Cowan ( 525271 ) on Monday January 28, 2002 @01:03PM (#2914176) Journal
    I work in audio. I want raw performance power, and I want style - the equipment I use in my studio has to impress my clients.

    The G4/DP 1 gig is a very appealing option, except:

    1. I don't need a SuperDrive. I don't want a SuperDrive. Apple won't give you a 933 or 1GHz DP machine without a SuperDrive. Sorry but I'd rather save hundreds of dollars by simply not buying one!

    2. ADC (Apple Display Connector) still really bugs me, and now they've really made it ugly. For those of you who aren't aware of Apple's hardware decision here I'm going to sum it up:
      • Apple created a proprietary connector, "ADC", for displays.
      • This connector carries power, DVI and USB along the same cable, reducing cable clutter.
      • The video card is a special one, with an extra set of pins at one end which connect to a separate power socket on the motherboard.
      • Without this power socket there is not enough juice fed to the card to power an entire display.

        THEREFORE your system can only work with one Apple display, because only one card slot has this power connection.

      • If you wish to power an Apple Display using a system with no ADC port, you can, but you need to buy an external solution [drbott.com] worth hundreds of dollars, which plugs into a video card's DVI output, a USB port, and into mains via a line-lump style power supply; and combines all these signals into an ADC connection.
      • Such adaptors require a DVI output from your video card.
      • The new video cards available on these Macs have one ADC output and one VGA output. There is absolutely no way to connect any current Apple display to that second monitor port.
      • There is no less-expensive, single-port card available for your Power Mac G4.
      • If you want a second Apple display you would have to purchase a video card with a DVI output to go into an un-accelerated PCI slot, and the special multi-hundred dollar adaptor described above to connect to the second Apple display's ADC connector.
      • If you want to use a non-Apple display on the ADC port you must buy a sub-$100 adaptor which breaks the DVI video signal out of the ADC connector for a 'standard' DVI flat display.
      • To my knowledge there is no adaptor that will give you a VGA output from the ADC port.

      What I'm getting at here is that Apple boasts that all the new Power Macs have support for dual monitors built in, but for a company that puts so much work into beautiful designs, they expect me to use two different, cosmetically mismatched displays! I don't believe that a VGA connector belongs on a flat panel due to inherent flickering issues, so that means a flat display on the ADC and a CRT on the VGA port. Ugly!

      If I want two displays that look the same, I have to enter into an imposing combination of needlessly wasted PCI slots, buying redundant cable adaptors, and spending a lot of money!

      I would love to have a DP 1 GHz with dual Apple 17" Studio displays. I really would. But the premium is too high.

      Apple should bury ADC now and issue an admission of stupidity.

      Apple did a great job of embracing standards with USB, and is arguably responsible for its success. Why they chose to suddenly abandon the DVI connector on the Yosemite and original Sawtooth computers is a mystery to me. DVI was just catching on as a standard way of connecting flat-panel displays. If Apple hadn't moved to ADC, we would have seen more Wintel video cards with DVI connectors on them by now, because there would be more DVI-connected monitors on the market.

      Apologies for the rambling post... ADC has bothered me right from the start and now these new dual cards seem like the ultimate inconvenience.

  • Re:3k or 3 PCs? (Score:1, Interesting)

    by Anonymous Coward on Monday January 28, 2002 @01:11PM (#2914221)
    This past summer, I sold my 2.5 year old "Blue & White" G3 400 tower for $1100. That is astounding. (No monitor, 512MB RAM, 15GB storage.) How much do you think a P II 450 would've gone for 2.5 years later?

    blakespot
  • Comment removed (Score:1, Interesting)

    by account_deleted ( 4530225 ) on Monday January 28, 2002 @01:14PM (#2914252)
    Comment removed based on user account deletion
  • by bdowne01 ( 30824 ) on Monday January 28, 2002 @01:17PM (#2914281) Homepage Journal
    Wrong.

    I have a $1500 iMac G3 500 and an Athlon 1.4GHz. When ripping MP3s under Slackware 8.0 on the Athlon, the best rate I could get was 3x. The little iMac upstart clocked almost 5x.

    Just remember two things:
    1) It's not the clock speed that matters
    2) RISC rules
  • by billvinson ( 135790 ) <billvinson@gmail.com> on Monday January 28, 2002 @01:24PM (#2914328) Homepage
    And Compaqs are not known for being the highest-quality computers, or at least that is true from what I have seen. The PowerPC is a nice chip, the Athlon/Duron is a nice chip, the P4 is an acceptable chip. They all do what they are supposed to... Hell, once you pass 300 MHz most improvements become largely irrelevant. I have a near-1GHz tower sitting under my desk (Linux) and I have an iBook 500MHz sitting on my desk. Now guess where I get more done :)

    That is right, the lowly 500MHz iBook. It is built with quality in mind, it is quite fast, and it runs Mac OS X, which is absolutely amazing... Not that I don't use Linux and the BSDs too (they definitely kick the Mac's ass as servers).

    I say you get what you pay for, as Windows machines just can't handle multimedia anywhere near as well as the Macs...

    Bill
  • by djohnsto ( 133220 ) <dan.e.johnston@g ... inus threevowels> on Monday January 28, 2002 @01:40PM (#2914428) Homepage
    So when is Apple going to let developers use features that aren't available on the Rage 128? The latest OpenGL SDK from Apple only lists extensions that run on virtually ALL of their products. Doing this is nice in that developers don't have to worry about compatibility issues when using certain features. However, aside from Carmack and his DOOM demo, no one has been able to use any of the advanced features that the GF3/GF4 support!

    Explicitly, Apple's OpenGL doesn't include support for:

    • GL_NV_vertex_program
    • GL_NV_texture_shader
    • GL_NV_vertex_array_range - Yes this is in the glext.h that ships with the SDK, but there is no way to allocate video memory to use with it!
    • GL_NV_register_combiners2
    • Any AGL extensions. These would allow the use of offscreen render surfaces, anti-aliasing, allocating memory for vertex array range, etc.

    Obviously, Carmack was able to get the programming info needed to make these things work; why not the rest of us? Is it that game developers now need to beg Apple to work with cutting-edge technology on their machines? In my opinion this is killing any reason to use OpenGL over DX8/DX9 for future game development. Even if OpenGL itself supports advanced features that rival DX, I can't use them to build a cross-platform game. If that's true, what's the point of using OpenGL? (I actually like the DX8 programming model better.)

  • Comment removed (Score:2, Interesting)

    by account_deleted ( 4530225 ) on Monday January 28, 2002 @01:45PM (#2914468)
    Comment removed based on user account deletion
  • Seriously Seriously (Score:3, Interesting)

    by lukej ( 252598 ) on Monday January 28, 2002 @02:08PM (#2914613)
    > seriously make me want a Mac.

    I'll assume Taco doesn't have a mac from this comment...

    Why? You see these posts all the time:

    /. : Apple introduces new hardware/software X !
    Poster: Wow, now that Apple has X I really/finally want one!

    Why do people do this?

    Do you yap all morning about how you want a cup of hot coffee, and never get one? Then repeat the process tomorrow when there is a fresh pot?

    I wanted my Apple (now outdated), so I invested my $3500 4-5 years ago, and it was/is awesome. Now, with some of the new stuff they are coming out with, I'm PLANNING on getting another... not just talking about it...

    If you think Apple's stuff is worthy, buy it.
    Just my gripe...
  • by Have Blue ( 616 ) on Monday January 28, 2002 @02:38PM (#2914794) Homepage
    > If you are doing much multimedia work using tools like Photoshop,

    Exactly the program Steve always drags out to push the Mac over Windows at expos, because it really does beat the crap out of the PC.

    The Mac also has every program in your list except 3D Studio Max, and it has Maya to make up for that.
  • by Dominic_Mazzoni ( 125164 ) on Monday January 28, 2002 @03:31PM (#2915124) Homepage
    > However, the chipset hasn't been updated yet (ergo no ATA100 or DDR support yet)

    Someone correct me if I'm wrong, but won't the 2 MB of L3 Cache with DDR RAM make a BIG difference?
    Note: that's 2 MB of L3 cache per processor!

    My guess is that the vast majority of apps would not see any performance gain if you used DDR for main memory, all else being equal. So I think that the DDR L3 Cache was a good move.
  • Why not DDR? (Score:2, Interesting)

    by piotrr ( 101798 ) <piotrrNO@SPAMswipnet.se> on Monday January 28, 2002 @03:45PM (#2915219) Homepage
    I'm guessing cost versus performance here. It's been shown in tests that run actual application-like benchmarks rather than theoretical ones that low memory-fetch latency matters more to overall performance than maximum burst transfer rate. That's why RDRAM suffers in many cases: it still has higher latency than 100MHz SDRAM. Most applications favor short-latency fetches over high-speed large-block transfers. But it is odd. There's no real latency difference between PC133 and DDR2100, and there IS a difference in sustained transfer rate, so I suppose they just didn't think the sustained-transfer-rate increase was worth the extra cost (as well as design time).
  • by Hadlock ( 143607 ) on Monday January 28, 2002 @06:53PM (#2916683) Homepage Journal
    The newly fabbed processors use server-chip technology, as described in this CNET article:

    http://news.com.com/2100-1040-824621.html [com.com]

    They say (in short) that it uses silicon insulation to help prevent "silicon drift" even more, so less power is used. What immediately caught my attention was that this new fabrication uses only 10 watts, 15 watts at peak power consumption. I have a lava lamp with a 40-watt light bulb... I'm curious: does a 15-watt processor (using 100% of its computing power, all the time) produce as much heat as a 15-watt light bulb?
  • by Refrag ( 145266 ) on Monday January 28, 2002 @10:09PM (#2917496) Homepage
    Don't try to play the game with a winmodem piece of shit.

    Also, drop another 2GHz Pentium 4 in there and that computer might be up to competing with the dual 1GHz G4.

    Where the hell is your FireWire and Gigabit Ethernet?
  • by Graymalkin ( 13732 ) on Tuesday January 29, 2002 @09:00PM (#2922686)
    The NBC crew in Afghanistan kept a PowerBook as part of their equipment because they could do in-field editing of footage shot with the Sony VX2000 they were carrying. More interesting than that (which I find damn cool) is that a lot of the editing was done with iMovie. I think it is now NBC policy that PBs are given to crews using DV cameras. How many companies offer a full video-editing suite that can run off a battery?
