iMac Apple Technology

The Most Powerful iMac Pro Now Costs $15,927 (vice.com) 201

Apple recently updated the upgrade options for the iMac Pro, and getting the very best will cost you. A baseline model will cost you just under $5,000, and maxing out the hardware to absurd heights runs a whopping $15,927. An anonymous reader writes: The most expensive possible upgrade is a $5,200 charge for upgrading the RAM from 32GB to a startling 256GB. Other add-ons include an additional $700 for a 16GB Radeon video card and $2,400 for a 2.3GHz Intel processor with 18 cores. Almost $16,000 is a lot of money for a computer, especially one so overpowered that there are very few reasonable applications of its hardware. Most people will never need more than 16GB of RAM to play video games, and 32-64GB will take care of most video editing and 3D modeling tasks. With 256GB of RAM, you could run advanced AI processes or lease computing power to other people.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • News for Appleheads [binged.it], stuff that costs a shit ton of money.

  • by enriquevagu ( 1026480 ) on Tuesday March 19, 2019 @02:17PM (#58299356)

    Most people will never need more than 16GB of RAM to play video games

    Sounds familiar to me. No comment, other than that the famous "640K ought to be enough for anybody" is often erroneously attributed [wikiquote.org] to Bill Gates.

    With 256GB of RAM, you could run advanced AI processes or lease computing power to other people.

    Of course, because both tasks are memory-bound, and not compute-bound. /sarcasm

    • 640k *was* enough for anyone, at the time the arbitrary limit was created. The problem was that it was a difficult-to-circumvent limitation in an operating system that migrated across various platforms for almost two decades, and couldn't be removed without breaking backwards compatibility. And Moore's law had already been in full force for more than a decade when the first version of DOS was released, so there was little excuse for such an assumption. And while there's very little evidence that

      • And while there's very little evidence that Gates ever made such a statement, the fact that Windows XP was similarly handicapped at the ~3.5GB boundary suggests a recurring theme of disregard for the rate of hardware advancement.

        The ~3.5GB boundary was because of a technical limitation. 32-bit machines can address up to 4GB of memory. XP couldn't address physical memory beyond that (initially at least, maybe after one of the service packs), and some of that 4GB space was reserved for drivers.

        They could have handled

        • by dryeo ( 100693 )

          That's true for the original IBM PC as well. The processor could only address 1MB and drivers, video memory etc needed some of that address space.
          These limits all seemed fine at the time as maxing out memory was expensive.
          Perhaps in a decade we'll be bitching about the 16PB limit or whatever it is.

        • I used to do desktop field service, and had a client that actually paid us to handle warranty work on their Dell workstations. One such machine had 64MB RAM (oh yeah!), back in the Windows 98 days, and the client being a research outfit, they wanted to do serious statistical analysis, so the RAM was critical. The software vendor made it mandatory: 64MB. No less.

          They had all kinds of problems with that workstation from the beginning, and called us in to figure out why only 48MB of RAM was shown as available

          • Comment removed based on user account deletion
            • The Atari 800/400 had similar problems.

              It was designed for an SS-50 bus, and actually had the connectors for the edge connector on the motherboard.

              However, by the time it was near market, the newer FCC regs meant that it just wouldn't be possible for it to pass.

              The result is that the board was wrapped in a thick (1/4"? It's been a while . . ) RF case, with limited connections.

              And *that* in turn mandated those idiotic serial diskette drives.

              At least they eventually figured out (Rev B ROM on them, iirc) that

            • I recall vividly when WordPerfect was updated and perceived as terribly slow. The official responses referred to the dev team having 64MB of RAM, 'ran fine'... Most secretaries had machines with 16MB, not at all uncommon for 486 machines. 64MB was never common for my small business customers.

              The Dell bug started with missing address lines, though by itself that's not the problem, clearly. This would have been around 1992, so yes, W4WG... Which I loved.

              I never saw anything back then with GB RAM, modules typi

            • by AmiMoJo ( 196126 )

              The address line thing may have been that it didn't support 128MB total address space, so stuff like the chipset and PCI devices got mapped into the 64MB address space.

              Same thing happened when machines started hitting 4GB of RAM, but only showed up 3.5GB due to 32bit address limits.

        • 1995 called, they want their PAE back

        • With 32-bits you can address up to 4GB of RAM. However, operating systems like to provide features such as shared memory and memory-mapped I/O. For these to work seamlessly and provide the intended convenience for application developers, the OS has to make them *look* like regular memory access. Microsoft did something relatively easy with Windows and made the top half of the address space reserved for these purposes. When machines with more than 2GB started appearing, Windows added features to take adva
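        The arithmetic behind that ceiling can be sketched in a few lines. (The 0.5GB figure for device mappings is illustrative; the actual reservation varied by machine and chipset.)

```python
GIB = 2 ** 30

# A 32-bit CPU can address 2^32 bytes in total.
total_address_space = 2 ** 32
print(total_address_space // GIB)   # 4 (GiB)

# Part of that space is claimed by PCI devices, firmware, and video
# memory rather than RAM (0.5 GiB here is an illustrative figure).
device_mappings = 512 * 2 ** 20
visible_ram = total_address_space - device_mappings
print(visible_ram / GIB)            # 3.5 (GiB) -- the familiar "missing RAM"
```

        Which is why a machine with 4GB of RAM installed would report roughly 3.5GB usable.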
      • Re:Poor article... (Score:5, Interesting)

        by nojayuk ( 567177 ) on Tuesday March 19, 2019 @03:31PM (#58299804)

        I saw a comment on a hardware hacking blog a few years ago about a musician who used a repurposed server as his composing workstation. He wrote music for films, TV shows, entertainment and promotional work for a well-paid living.

        His workstation/server had four 8-core Xeons so he could composite multiple channels of music in real-time and 512GB of RAM so he could keep several hundred GB of music samples in RAM as he worked. He reckoned the server paid for itself in time saved and delivery-to-customer scheduling with the first two projects. He had used high-end Apple kit before he moved to this solution but nothing out of Cupertino could match what he had built himself.

        • by mjwx ( 966435 )

          I saw a comment on a hardware hacking blog a few years ago about a musician who used a repurposed server as his composing workstation. He wrote music for films, TV shows, entertainment and promotional work for a well-paid living.

          His workstation/server had four 8-core Xeons so he could composite multiple channels of music in real-time and 512GB of RAM so he could keep several hundred GB of music samples in RAM as he worked. He reckoned the server paid for itself in time saved and delivery-to-customer scheduling with the first two projects. He had used high-end Apple kit before he moved to this solution but nothing out of Cupertino could match what he had built himself.

          This.

          I used to work for Geoscience and Geospatial companies. This is where you're trying to manipulate images in the 10's of gigabytes range (back in 2009). Only the HP and Sun workstations could even hope to match what you could custom build for the GIS and Remote Sensing analysts, and the Sun ones only ran Solaris or Linux IIRC; both options were hideously expensive and I think they topped out at $8000 odd. It was worth it to custom build as if a part broke, you'd just go buy a new one off the shelf (t

      • ...but I'm going to go out on a limb here and guess that very few (if any) such projects are going to be running on a grossly overpriced all-in-one iMac.

        Well, depends on what you're working on.

        If you are working say, on 4K RAW video, in Davinci Resolve, and have some serious color grades on there, doing some sound work, AND...using Fusion in there too doing basically VFX....well, that can bog a machine down pretty badly if you don't have something pretty beefy.

        Hell, even with just HD video, RAW or not,

      • ~3.5GB limitation had nothing to do with disregard to hardware advancement, it was purely a limitation of what could be addressed in 32 bits.
    • Also, I want to point out how silly it is to highlight the maximum cost of a product when you go and select the top of everything in the customize bucket.
      Most people won't need the max spec, but the system is designed to handle the max spec, for that small handful of people who need it.
      For most professionals, that $15k investment is probably better served by getting a $3k system every 5 years for 25 years.

      • That small handful of people who need it would get way more value from a Threadripper box, which absolutely demolishes Intel's 18-core part for less money, and has a stupidly large number of PCIe lanes.

    • by EvilSS ( 557649 )

      Most people will never need more than 16GB of RAM to play video games

      Not to mention no one* is buying these for playing games in the first place. Plus the upgrades are available for people who think they have workloads that require it. They are OPTIONAL. I can spec out a $90,000 server blade as well. Doesn't mean I spend $90K every time I order a new blade.

      *Yes, I know some idiot will, but overall, no one is buying them for games!

    • by dcw3 ( 649211 )

      Came here to say the same...mod parent up.

    • by mjwx ( 966435 )

      Most people will never need more than 16GB of RAM to play video games

      Sounds familiar to me. No comment, other than that the famous "640K ought to be enough for anybody" is often erroneously attributed [wikiquote.org] to Bill Gates.

      With 256GB of RAM, you could run advanced AI processes or lease computing power to other people.

      Of course, because both tasks are memory-bound, and not compute-bound. /sarcasm

      The average user will not need 16 GB in the near future. Even gamers aren't RAM limited, my 12 GB is enough to make my graphics card the bottleneck.

      I suspect they won't require 16 GB as a recommended amount for some time yet. We've been on 4GB for a while and the average user still isn't utilising all of that.

      The only people who need a lot of RAM are people who are running very RAM intensive programs like databases, image processing, virtualisation, et al. where you need to keep huge volumes of data in memo

  • When adjusted for inflation, the Lisa cost more. 256GB of RAM will be in Celeron laptops a few decades from now.
  • Just putting this here for posterity.

    • by lowen ( 10529 )
      Hmm, and a couple of years ago I priced out a Dell Precision workstation with 1TB of RAM. The 1TB RAM alone listed at around $38,000.
    • I already use 32GB on my laptop; for my servers I am happy with 128GB.

      Now why so much, when we could do nearly the same type of work 20 years ago with 1/1000 of the resources?

      While there is some bloat, there is the annoying factor of security being a big concern. Back 20 years ago, shared memory was common, no sandboxes; if you kept moving your pointer values you would finally run into some other line of RAM for another system. Today we have sandboxes and virtualization, for added levels of protectio

      • Back 20 years ago, shared memory was common, no sandboxes; if you kept moving your pointer values you would finally run into some other line of RAM for another system. Today we have sandboxes and virtualization, for added levels of protection, to make sure App 1 will not overwrite App 2.
        On Windows or Mac OS ... not on Unix or anything else with an MMU.

  • by alvinrod ( 889928 ) on Tuesday March 19, 2019 @02:18PM (#58299372)
    If your time is worth hundreds of dollars per hour, then this purchase becomes justifiable if it can save you a sufficient number of hours. I don't think that there are that many people who will see significant improvements from maxing this thing out. About half of the cost is maxing out the RAM and using the largest internal SSD possible. You can save considerably by avoiding the Apple tax and installing your own RAM upgrade and you've probably already got an external RAID setup for storage if you're in the market for this kind of machine. The $2,400 for the extra 10 cores is probably the only thing that most people would want/need to touch and I expect that over a few years of use, it's likely to justify its cost.
    • I work with a lot of photo and video stuff; it's really nice to have internal storage be as large as possible to hold large projects, then when I'm done I can save them off to traditional larger external spinning disks.

      Every now and then I look into faster external RAID arrays but that itself is a very expensive option and can be kind of fragile.

      Having a lot of internal storage also saves you time in that you don't have to be as picky in cleaning out your system from time to time. I fought for way too long

        I work with a lot of photo and video stuff; it's really nice to have internal storage be as large as possible to hold large projects, then when I'm done I can save them off to traditional larger external spinning disks.

        Every now and then I look into faster external RAID arrays but that itself is a very expensive option and can be kind of fragile.

        Having a lot of internal storage also saves you time in that you don't have to be as picky in cleaning out your system from time to time. I fought for way too long with a laptop that was always too close to the edge of available hard drive space, which was really annoying.

        You're looking for a NAS. If an article regarding a $15,000 workstation is at all appealing whatsoever, then having a dedicated storage array is entirely practical for 1/4 of the price.

        "But Voyager, they're expensive!"
        Let's assume you're a DIY tinkerer. A quick Newegg build on a Ryzen3 with 32GB of ECC RAM, a case, and 5x4TB drives is about $1,350 soup to nuts; in RAIDz2 (RAID6), that's still 12TB of storage with two drives fault tolerance, and I only limited it to 5 because that's the maximum number of drives I could buy at a clip (the build supports three more on the case and the mobo). Do two drive orders and you can hit 24TB before you hit physical limits.
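        The capacity figures above can be checked with a one-liner: RAID-Z simply sets aside a parity level's worth of drives, so usable space is (drives - parity) x drive size. (Real pools lose a little more to metadata; this sketch ignores that.)

```python
def raidz_usable_tb(n_drives: int, drive_tb: int, parity: int) -> int:
    """Raw usable capacity of a RAID-Z vdev: `parity` drives' worth of
    space goes to parity (metadata overhead ignored)."""
    return (n_drives - parity) * drive_tb

print(raidz_usable_tb(5, 4, parity=2))   # 12 -> the 5x4TB RAIDz2 build above
print(raidz_usable_tb(8, 4, parity=2))   # 24 -> the same pool grown to 8 drives
```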

        Let's assume you're not a tinkerer and basically want a thing in a box. $1,500 will get you the aforementioned 5 drives and an 8-bay QNAP.

        "But network connectivity is slow!"
        Add about $400 to the QNAP and $600 to the DIY build and you've got 10-gigabit connectivity, possibly a bit more if you're on a Mac and need a thunderbolt-to-10GbE adapter.

        "But then I can't access my data when I'm not home!"
        The $15,000 Mac won't let you do that. However, all of these systems have some form of remote access, be it the more arcane SFTP on the DIY build, or the shiny WebUIs and Dropbox-like mobile apps of QNAP and Synology.

        "But Thunderbolt has lower latency!"
        Possibly, but 10GbE over Fiber is pretty damn quick, especially if you do a direct connect to your machine. An 8-bay TB enclosure will cost you $2,000 before you put drives in it, and you get zero options for multi-user or remote access.

        "But it's ugly!"
        Both Cat6 and fiber cables support long enough runs to put the storage appliance wherever you'd like to hide it. Thunderbolt doesn't. If you're willing to go a bit higher on the DIY front, Lian Li makes some beautiful cases with a price tag to reflect them.

        There are countless combinations out there; if storage is your only concern and you've got somewhere to put an 8U rack, QNAP has a rack mounted NAS with a companion storage expander that you could fill with 4TB drives, landing you with 80TB of storage (assuming 4 disk fault tolerance) and *still* spend less than this $15,000 Mac.

        • by mjwx ( 966435 )
          The problem with using NAS/SAN drives is pure economics. A lot of people who require high end workstations will be doing work on consumer OS's. With GIS, it's Windows (ArcGIS). So this tends to throw a spanner in the works. The setup for an iSCSI over Ethernet connection requires a separate network (well it should if you're doing it properly) and If for any reason the drives are disconnected which is a problem on Windows desktop operating systems it costs a lot of money. So it becomes trivial to just say "
          • The problem with using NAS/SAN drives is pure economics.
            A lot of people who require high end workstations will be doing work on consumer OS's. With GIS, it's Windows (ArcGIS).

            The GP was talking about photos and videos, not GIS datasets, so the goalposts just got shifted. To address it though: at $16,000 for the Mac in TFA, a PowerEdge with plenty of storage and an Optiplex or two to access it are entirely practical alternatives.

            The setup for an iSCSI over Ethernet connection requires a separate network (well it should if you're doing it properly) and If for any reason the drives are disconnected which is a problem on Windows desktop operating systems it costs a lot of money.

            We're already quite far away from 'storing lots of photos and videos', and iSCSI seems like a weird protocol to implement in this context, and I really don't understand what you're getting at with respect to Windows losing access to network storage vs. an

    • When I have a project large enough to support tech support staff, I buy Linux machines because they can be cheaper than Macs.

      But when I have to do my own support, or I'm using the interface, I buy Macs. Keeping a Linux box patched with all the ports closed takes expertise to be confident it was done right. Getting hacked one time on a Linux box was so expensive for me that it killed a multi-year project.
      The premium to get a powerful Mac is pretty cheap compared to an employee recruitment, retention

    • Even if the speed merely lets you continue your workflow without distracting delay, keeping the creative process going unabated...

    • For most people, maxing everything out is going to be a waste of money.
      You may need to max out on RAM, or get a top video card, but maxing everything is just luxury and bragging rights; nearly every professional will not be fully utilizing everything, as every professional has their specialty and uses the computer differently.

  • by Miles_O'Toole ( 5152533 ) on Tuesday March 19, 2019 @02:19PM (#58299376)

    Sixteen grand for a machine like this is still dirt cheap for a high end animation studio like Pixar or Ghibli.

    • Ten years ago it would cost $20K for a fully stocked Mac Pro and two 30" Apple Cinema monitors.
      • by Anonymous Coward

        Ten years ago it would cost $20K for a fully stocked Mac Pro and two 30" Apple Cinema monitors.

        Or $599.99 for an equivalent windows system.

      • by Megane ( 129182 )
        But you could upgrade them two years later without throwing away your whole computer and a large LCD monitor.
    • by Anonymous Coward

      A high end animation studio involves rooms and rooms full of racks and racks of high performance CPUs (not GPUs, mind you).

      I took a tour of one of Pixar's render farms and it was almost 15000 square feet of AMD servers.

      You're not getting that for 16 grand, even if it is AMD stuff.

      • by Anonymous Coward

        Well, Pixar would be using these as workstations, not rendering nodes, right?

        When my friend worked at a pre-press facility for a few years, their standard practice was to buy everyone doing graphics work machines with the RAM maxxed out, or nearly so. It was just too much hassle to go around and upgrade them later, and nobody complained about having too little but nearly everyone complained about not having enough.

    • But I know for a fact that high-end studios like Lucasfilm and the ones who did Aquaman, among others, use Linux as the operating system. Big production studios use Linux. All that money is used for computing power instead of DRM. Now for more modest production studios, it used to be Mac and Windows. However, now the likes of DaVinci Resolve and Lightworks run just fine on Linux whether you have an AMD graphics card or Nvidia. If you are into 'Free as in Freedom' and wish to Stallmanize, you can always run
  • by guruevi ( 827432 ) on Tuesday March 19, 2019 @02:20PM (#58299380)

    I wouldn't say 32-64GB is too much for some of those tasks; CAD and the like could easily spike to 128GB on modern systems. The RAM is just an option because the Intel processor is designed for servers/workstations and simply allows it. It's also useful if you have a rig of GPUs; this iMac is capable of powering a number of eGPU systems, so in some rare circumstances I can see it being useful.

    In comparison, a Dell workstation can run you a lot higher, the CPU and RAM being the primary cost drivers, one of those Xeons by itself can cost upwards of $10k on the street.

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Tuesday March 19, 2019 @02:27PM (#58299422)
      Comment removed based on user account deletion
      • With BIM we easily max out 64GB when working with point clouds.

      • You call it browser bloat, but really it's JavaScript library bloat and image bloat. Nothing else on the web can waste CPU and RAM as fast as those two things.

      • You can browse just fine using 8GB or even 4GB of RAM.
        Granted, it will begin to slow down at some point, much sooner than with more, but it will work just fine.

        What you are seeing is the system doing what it's supposed to do, using what is available. What's the point of having all that ram if it's just sitting there idle? It will release it if needed, but it will take what it can because that's what makes it most efficient. This is especially important on a laptop where you can trade ram for cpu and drive
    • This is precisely the point.

      If your professional workload needs a lot of RAM, Apple will sell you a system for it. They're not going to question why or say "Gee, that's a lot." They'll just put in the higher-end components and send it with a bill. What it's used for is up to the customer.

      You want to edit 4K video with a huge RAM-backed cache? This will do it.

      You want to run 50 VMs to test your shiny new software? This will do it.

      You want to take advantage of your newly-minted CTO's "upgrade everything!" dri

      • by Pascoea ( 968200 )

        and you don't see any more upgrade funding until the Lions win the Super Bowl

        Wait, all my Lions fan buddies are CONVINCED that is going to happen any year now. Any. Year. Now.

        Go Pack!

    • I don't see RAM being as big of a factor as CPU and/or GPU capacity. A full-on CG render (still-frame or animation sequence) is among the most taxing (depending on settings and resolution, natch), and can swallow your CPU (or GPU) whole for hours on end if you let it. Maxing cores and going 4-way(or higher) SLI/Crossfire on the CPU and GPU fronts (respectively) will give you more love for your buck in the CG world, so long as your software and OS (and modules/drivers) can keep up with the extra horsepower.

      N

  • . . . that Chrome won't eat it all.

  • by SuperKendall ( 25149 ) on Tuesday March 19, 2019 @02:26PM (#58299414)

    You don't have to pay Apple's prices to upgrade RAM, you can buy the modules yourself. The process to get to the RAM slots is somewhat involved [youtube.com], but you can also just have Apple install the RAM you bring them.

    The thing is, the RAM the iMac Pro uses is not cheap (2666MHz DDR4 ECC / PC4-21300), so you'll be paying a lot regardless of the path you take. For instance, an iFixit RAM upgrade kit to 128GB is $2,000.00 [ifixit.com]. To reach 256GB you'll need four 64GB modules... and probably best not to use the cheapest ones. Crucial does not even list modules that will work with the iMac Pro...

    • Comment removed based on user account deletion
      • You can't clean dust from them easily or at all

        Do you need to though? With a wholly vertical design there's really not much of a way for dust to build up. It gets flushed out of the system by the fans and doesn't really have anywhere to collect the way it would with a flat motherboard and/or case design that has a lot of area at the bottom to collect dust.

      • But holy shit, I'd rather see you donate your money to a non-profit cause than to Apple's Pocket Book.

        Oh, did Apple release a new laptop?

    • Yeah, upgrade your Mac yourself and see what happens. On most iMacs these days the RAM and SSD are soldered on, and even cracking the case to clean it voids Apple service plans. Tons of YouTube videos on it.
  • 48 Gigs of RAM and room for eight hard drives plus SSD raid card internally et cetera makes these a great option still and sadly the only option. The future will be what it is. Likely Ubuntu servers for the drives if Apple can't get their heads out of the sand and with no value added for the rest of the system, well, the main workstations / laptops can be anything.

    Hey, Apple, the point of a walled garden is to make the garden nice. You're at the point where you've stopped even maintaining the walls. I'm not

  • As in the statistics system. I deal with data scientists who spend $16k+ per week on data modeling and forecasting at AWS. With that expense, it should be easy to justify that desktop. But they'd complain about only 18 cores.

  • If you are maxing out an iMac Pro at $16K, I doubt you'd be spending it on playing video games. Sadly this is really the only Apple machine that professional video editors or animators can use right now.
  • Why do I hate Apple so much? Because they are literally raping their fanboy customers. Let's do a comparative breakdown of this so-called "$16k Build" based on NewEgg's prices:
    -256GB DDR4 RAM: Around $1,000
    -16GB GPU (Radeon VII): $700
    -18-core Intel CPU (Intel i9 9980XE): $2,000
    -1TB SSD and 8TB HDD: Around $300
    -Other components (case, motherboard, fans, etc.): Around $1,000
    -28" 4K monitor (good brand): Around $500
    .... At most, that's $5,500 in hardware (and that's WITHOUT the enormous discount Apple i
    • A physically larger comparable PC that is comparable in speed to the iMac will start around $10K, not $5K.

      256GB 2666 DDR4 (64x4) is over $2,000 (not $1,000), and for ECC sodimms this goes up to over $6,000.

      The Intel i9 9980XE doesn't support 256GB of RAM. An 18-core 2.3 GHz Intel Xeon W is $3,000.

      28" 5k monitor (good brand): Around $1200 (not $500)
    • Why do I hate Apple so much?

      Did you ever have a Mac?
      Or an iPad?
      Or an iPhone?

      See ... so why do you hate them? If you have mental problems please seek counseling instead of going postal.

      At Your Service

    • Why do I hate Apple so much? Because they are literally raping their fanboy customers. Let's do a comparative breakdown of this so-called "$16k Build" based on NewEgg's prices:
      -256GB DDR4 RAM: Around $1,000
      -16GB GPU (Radeon VII): $700
      -18-core Intel CPU (Intel i9 9980XE): $2,000

      Intel's ark says [intel.com]

      Memory Specifications
      Max Memory Size (dependent on memory type) 128 GB
      Memory Types DDR4-2666
      Max # of Memory Channels 4
      ECC Memory Supported No

      Half the memory and no ECC support.

      Frankly, I find it difficult to imagine why someone would need a graphics workstation with more than 128 GB RAM (as opposed to offloading the work to a server, or a HPC cluster) So I can't say that ECC is an absolute must...

  • > Most people will never need more than 16GB of RAM to play video games, and 32-64GB will take care of most video editing and 3D modeling tasks. With 256GB of RAM, you could run advanced AI processes

    AI, games and 3D modelling may be popular things, but they don't come close to the memory- and compute-bound problems that you come across in engineering and physics.

    In my case, an arbitrary amount of compute power and memory can be thrown at randomness distinguishability testing and en

    • by jythie ( 914043 )
      I had a similar thought. I've specced out number crunchers for our sims that go well into that price range. There are some tasks that can pretty much take any resource you can afford to throw at them. And since our sims can actually run in OSX, we could actually make use of a monster like this.
      • I would use a multi socket server motherboard with a couple of high core count Xeons and gobbets of memory. That would happily run Linux and could come in at less than $15K.

        My life got easier since I put multiprocessing and multithreading support into my analysis code. Most of my compute bound problems scale linearly with core count.

        • by jythie ( 914043 )
          Oh yeah. If I was given a budget to spec out a new rig for our sims, I would not go with something like this. On the other hand if such a machine fell in my lap I could make very good use of it.

          Though sadly, our sims will probably never support multithreading and scale.. ahm.. oddly.... we tend to want a smaller number of higher clocked cores and as much memory/disk write bandwidth as we can get.
  • I'd rather buy myself a used car and build my own Linux workstation ;-) https://www.youtube.com/watch?... [youtube.com]
  • HP and Dell offer servers with multiple 16 core processors, 256+ GB of RAM, and TB of SSD for under $10,000...
  • by King_TJ ( 85913 ) on Tuesday March 19, 2019 @06:18PM (#58301046) Journal

    I mean, in all the years I've used Apple products, that's always been a complaint about them from detractors: they don't give you enough flexibility or choice!

    Well, here's a system from Apple that you can configure in all sorts of insane, over the top ways, IF you actually want to -- and people are complaining because it's too much?

    I actually own one of these iMac Pros, but I purchased it in the standard "base" configuration. I was also able to buy it for $1,000 off the regular price on a sale that Micro Center stores ran on it, shortly after it was released. They ran various sales on it for months after that, varying between about $500 off and that $1000 discount -- but there were definitely some opportunities to get one for less than Apple's advertised pricing.

    It's been a great computer and I have no regrets purchasing it.... The 5K display in it is excellent and partially justifies the base cost of the computer when you see how much equivalent monitors sell for separately. I certainly don't see the need to buy the upgraded configurations for many thousands more? But I'm glad those were available, in case people needed them. I can see someone running a lot of virtual machines in test environments, as a developer, possibly needing a lot more RAM. Maybe not 256GB but 128GB? Yeah .... could happen.

  • by Jeremy Erwin ( 2054 ) on Tuesday March 19, 2019 @06:20PM (#58301062) Journal

    What will make my shiny new imac pro obsolete? (bearing in mind that nothing on the imac pro can be upgraded without surgery)

    Will it be new graphics hardware?
    Will it be the widespread availability of cpus with more than 18 cores?
    Will it be higher resolution displays?

    No. It will be the emergence of bloatware the likes of which even god has never seen.

  • He doesn't know much about computers. But he wants a better one than everyone else to read email on.

  • Still waiting for the Mac Pro tower. Having gone through half a dozen iMac all-in-ones, they each had their lives shortened drastically by the screen going bad. A Xeon will make those fans run loud; not G5 RISC chip fast, but still over 150 degrees F.
  • Frink's prediction might come true after all. [youtube.com] "I predict that within 100 years, computers will be twice as powerful, ten thousand times larger, and so expensive that only the 5 richest kings of Europe will own them "

  • Underutilizing computers has apparently become so commonplace that the general public doesn't even know what computers are used for anymore.

    No, computers are not devices just to browse Facebook or play video games.
    Some people actually use them to run real programs on them.

    256GB is also pretty mundane, pretty much any half-decent machine has that. RAM is cheap.
    As a developer, I can easily use up more than 16GB just by starting an IDE or a compilation. And I'm not even doing hardware synthesis.

  • Slashdot confirms that Macs are too damn upgradeable.
