The End of the PC Era and Apple's Plan To Survive 549

Posted by Soulskill
from the moore-money-moore-problems dept.
Hugh Pickens writes "Charlie Stross has written a very interesting essay, ostensibly about the 'real reason why Steve Jobs hates Flash,' but really about how Jobs is betting Apple's future on an all-or-nothing push into a new market as Moore's law tapers off and the personal computer industry craters and turns into a profitability wasteland. Stross says that Apple is trying desperately to force the growth of a new ecosystem — one that rivals the 26-year-old Macintosh environment — to maturity in five years flat — the time scale in which they expect the cloud computing revolution to flatten the existing PC industry and turn PC manufacturers into suppliers of commodity equipment assembled on a shoestring budget with negligible profit. 'Any threat to the growth of the app store software platform is going to be resisted, vigorously, at this stage,' writes Stross. 'And he really does not want cross-platform apps that might divert attention and energy away from his application ecosystem.' The long-term goal is to support the long-term migration of Apple from being a hardware company with a software arm into being a cloud computing company with a hardware subsidiary. 'This is why there's a stench of panic hanging over Silicon Valley. This is why Apple have turned into paranoid security Nazis, why HP have just ditched Microsoft from a forthcoming major platform and splurged a billion-plus on buying up a near-failure; it's why everyone is terrified of Google,' writes Stross. 'The PC revolution is almost coming to an end, and everyone's trying to work out a strategy for surviving the aftermath.'"
  • It's not ending... (Score:5, Insightful)

    by Pojut (1027544) on Friday April 30, 2010 @03:28PM (#32048150) Homepage

    ...just changing. People seem to be exclusively using mobile devices more and more (whether it be phones, tablets, or laptops/netbooks/etc). That being said, tower PCs will ALWAYS have a place in the enthusiast and hobbyist markets. Even with my phone, laptop, and whatever else, I still love having a full-blown setup at home that I can chill out in front of.

    Hard to beat a multi-screen setup with a full-size keyboard and a Kensington Expert trackball :-)

  • Wow... (Score:2, Insightful)

    by Scyth3 (988321) on Friday April 30, 2010 @03:30PM (#32048186)
    That's some over-the-top fear mongering.
  • by Anonymous Coward on Friday April 30, 2010 @03:31PM (#32048196)

    now we're going back to the thin-client model in a spectacularly fucked-up way? What a sack of shit!

    Time to give up my nerd hobbies and look for something else to get interested in. As a non-IT user now there's no more point in GNU, Linux. Everything's going to be a fucked-up locked-down black box bunch of HORSE SHIT.

    yay.

    Fuckers.

  • by BiggoronSword (1135013) on Friday April 30, 2010 @03:31PM (#32048208) Homepage Journal
    Don't forget PC gaming. Even with all the consoles out there PC gaming will always persist.
  • by DarkSabreLord (1067044) on Friday April 30, 2010 @03:32PM (#32048212)
    I don't think the PC is going to meet its demise anytime in the foreseeable future. Microsoft dominates the business sector right now because it caters to businesses in a way Apple doesn't. Apple may take over the home user market, but until they convince businesses to adopt their ideologies, PCs won't be dying anytime soon.
  • IT Tech POV (Score:2, Insightful)

    by DWRECK18 (1796294) on Friday April 30, 2010 @03:32PM (#32048220) Homepage
    I have to say that just by reading the article, and from the way things seem to be going in the IT field on the support side, I can see where he is coming from. I myself have put Google Docs to use as a way of storing my files so that I can access them anywhere. Cloud computing is definitely penetrating the IT industry in its entirety. Apple's stance on this and their fear of everything is understandable, as is everyone's fear of the change. Many companies will change with the times, but can we honestly say that PCs are going to go away and the revolution is over? There are still many flaws in making things available over the cloud, and a lot of companies would rather maintain their own information than put it on the cloud and lose control over the hardware and software that maintain it. Most will not trust the security of the cloud over the ability to run NIDS and other such devices to secure their own networks and files. So is it a valid fear? Yes. Unsubstantiated? No. But is it truly going to take over and make PCs secondary any time soon? Doubtful.
  • by Anonymous Coward on Friday April 30, 2010 @03:34PM (#32048238)
    The description sounds like the business model for consoles.
  • by hotrodman (472382) on Friday April 30, 2010 @03:34PM (#32048242)

        Half of my users have trouble getting VPN protocols to work reliably over their ISP links. ALL of my users complain loudly when things aren't fast and snappy. I would NEVER put any of these people 'on the cloud', considering one lost packet is enough to get them riled up. It's bad enough that they will complain about new emails not coming in....it would be worse if they can't get to ANY of them when their connection is down.

        You can get a lot of power into very small notebooks now.....why go backwards back to a dumb terminal that is dependent upon overloaded Starbucks wifi in order to get ANY program to work?

        Desktops may be dying out....but we're not switching the entire world to the cloud anytime soon.

      - Eric

  • Privacy (Score:5, Insightful)

    by fluffernutter (1411889) on Friday April 30, 2010 @03:35PM (#32048262)
    So far most of these new devices seem to have a huge tradeoff: privacy. There are very few apps on my iPod touch that allow me to keep my stuff within the confines of my home, especially if I am on the road and not on my own network. Until these privacy concerns are addressed, I would hope PCs survive; otherwise the tech industry has done a monumental disservice to everyone. This all sums up my main dislike for Apple.
  • by denis-The-menace (471988) on Friday April 30, 2010 @03:36PM (#32048270)

    Not quite.
    Many people still don't feel like having a "cloud" service on the Internet hold the only copy of their documents. Those services can and will hold the files hostage if you stop paying, if they go belly up, or if the government says so. Unlike money, documents don't lose value in a mattress.

  • Who is this idiot? (Score:2, Insightful)

    by Montezumaa (1674080) on Friday April 30, 2010 @03:38PM (#32048300)

    This guy sounds like a desperate market speculator that has no clue how the market works. The "personal computer" market is having just as rough a time as other markets, but that does not mean we should just throw our arms in the air and give up. While I have not purchased new PC hardware in four or five years (for economic reasons), it does not mean that I do not want new hardware. Whoever this fucktard is, he needs to keep his stupid opinions to himself.

    Yeah, perhaps Apple and HP are looking to switch their platforms, but it does not mean that this will seal the end of the PC market. Only an idiot would buy into this horseshit.

  • by denzacar (181829) on Friday April 30, 2010 @03:38PM (#32048304) Journal

    ...the Y2K bug.

    I tend to take any prediction anyone in the computer industry makes with a rather large grain of salt since then.
    Particularly the ones relating to "the end of the world as we know it" and similar predictions of global occurrences.

    Seeing "END OF THE WORLD!!!11eleven!" not happen before your eyes does that to you.

  • by butterflysrage (1066514) on Friday April 30, 2010 @03:40PM (#32048332)

    wake me up when consoles have the same control options as a PC. While an analog stick may be miles above a D-pad, it still has a long way to go before I will swap one in to replace a 7-button mouse + keyboard with a half dozen macros.

  • by FooAtWFU (699187) on Friday April 30, 2010 @03:41PM (#32048338) Homepage
    It's not really PCs that they're predicting will die, per se, I think. It's the ability of companies like Dell and HP (and Apple, for that matter) to make tons of cash selling PCs. People who use the PCs will have it great, though, since everything will be ever-so-cheap!
  • by binarylarry (1338699) on Friday April 30, 2010 @03:41PM (#32048340)

    While I'm a huge fan of apocalyptic prophecies, I tend to agree.

    The reason being, business is going to use the cloud but it's going to augment existing practices, not replace them. No sane business is going to trust all of their valuable IP with a 3rd party, there isn't a third party out there you can really trust. Not Google, Not Apple, Not Microsoft (LOL)... they've all had very serious and public security failings in their recent history.

    This may be less true for consumers at home, but that's nothing new as "the cloud" for them is just a fancy new term for "the world wide web."

  • by Anonymous Coward on Friday April 30, 2010 @03:42PM (#32048350)

    You understand that PC does not stand for Microsoft, right?

    PC = Personal Computer

    That would be the hardware, not the software it runs.

    Apple is a hardware company, and as more people realize the hardware they 'sell' is at a 300% markup, and it's the exact same hardware that HP sells at a 50% markup.. well... yeah, Apple needs a new market.

  • Re:Moore's law (Score:1, Insightful)

    by Anonymous Coward on Friday April 30, 2010 @03:42PM (#32048354)

    It is tapering off. Any smaller and quantum mechanics comes into play, which is something we're not quite done studying yet, much less implementing. Chips as we know them are very near their physical limits.

    But TFA is a sack of shit anyway. Throw some heat pipes on the die and slap another die on top of it. Repeat as needed*. You can call them eSandwiches.

    * If only it were that easy...

  • by alexborges (313924) on Friday April 30, 2010 @03:43PM (#32048358)

    Yup. And I think this article is not at all wrong, except maybe in the timeframe. Sooner or later networks will be reliable and very, very wide. The timeframe for the substitution of local computing by remote "clouded" computing is directly proportional to the value of "sooner or later". The longer networks take to get decent, the more time the PC has.

    Now there is an interesting gridlock: network providers are idiot money whores that still want to get dough out of an investment that has already returned many times over. They do not want to move to IPv6, and PC software makers like MS have no incentive to push for it because, yes, this would cheapen networks and make them more reliable, thus making their own products obsolete.

    It is interesting that, yes, GNU, Linux, and FOSS platforms in general will kill Microsoft by being the dominant OS infrastructure of the new cloud, which will subsequently be used to lock us in for the "service" of content providers and of just about anything else (applications and games)....

    Now, in the future, if this happens, my young padawan, an Open Net movement with the GNU ideal on its mind will make its own cloud, and we (yes, you and me) will compete with the other fuckers on services combined with FOSS platforms, unlocked phones, and "freePads" or LiberPads. You see, if what I foresee is coming, and IPv6 is implemented despite the gridlock, net neutrality more or less comes by default, and killing it loses any justification from the net providers, who should anyway compete on price per Mbps and that only.

    And on and on....

  • by betterunixthanunix (980855) on Friday April 30, 2010 @03:44PM (#32048392)
    "That being said, tower PCs will ALWAYS have a place in the enthusiast and hobbyist markets."

    Or in professional markets, business markets, and so forth. People who need high performance systems and who are willing to sacrifice mobility will continue to buy tower PCs and workstations. Even mainframes remain in use by the very customers they were originally intended for: large institutions with large computing needs.

    Now, consumers may abandon tower PCs, which is another story entirely.
  • by casings (257363) on Friday April 30, 2010 @03:47PM (#32048430)

    Yeah, when you have a console that runs the latest games at 1920x1200 with FPS over 100, then you can say consoles have caught up.

    The only reason consoles seem to have caught up to PCs is that they hinder innovation due to their limited hardware.

  • There have been plenty of phones for a while now that have a USB port. The most popular form factor is micro-USB, but it's still USB. It's up to the manufacturers to put compelling software on the phones and for the wireless companies (I'm looking at you, Verizon) to not ruin the experience.

  • by alen (225700) on Friday April 30, 2010 @03:48PM (#32048458)

    Dell and HP lose money selling PCs. They make money on the services and warranties and crapware people end up buying, just like Best Buy doesn't make any money on the stuff they sell.

  • by L3370 (1421413) on Friday April 30, 2010 @03:49PM (#32048462)
    The PC will stick around for some time, but the profitability of retailing them could disappear. If technology is taking the route described in the article, companies will be making computers just so the customer has access to the content.

    I imagine it being similar to Sony and MS selling their game consoles at a loss, just so they can get the customer to buy the content that runs on those platforms.
  • by coolmoose25 (1057210) on Friday April 30, 2010 @03:52PM (#32048492)
    Sorry, but I've been around too long to buy it. I remember seeing Larry Ellison predict the end of the PC era just as it got going. Literally, I was in the audience as he described how the NC (Network Computer [wikipedia.org], for those that don't remember) would replace the PC. Conveniently, it was all driven by Oracle. No need for Apple, or Microsoft, or any of their nonsense anymore! And that was in 1998, I think... Remember 1998, folks? You were still using those clunky NetWare networks (might have even been on Token Ring still), and you were excited by that new Windows 98 that was coming out, which was FINALLY going to fix the problems with Windows 95... Me, I was excited about that newfangled phone operating system... Palm OS.

    Sorry... Saying that PCs are going to bite it because of the "cloud" is like saying that we have bullet trains now, so you no longer need your car.

    (There's your car analogy for those looking for one)
  • No kidding (Score:5, Insightful)

    by Sycraft-fu (314770) on Friday April 30, 2010 @03:56PM (#32048538)

    I see no end in sight for PCs. I see only changes. The biggest change is that hardware has gotten good to the point that you no longer need the best for many things. I mean, time was, computers were slow even for simple stuff. I remember in high school I'd send a document to print and go off to the kitchen to snack while I waited the 10+ minutes it took. The system was just slow. Booting took forever, launching an app could take 30 seconds, etc. Media playback was limited to tiny, postage-stamp-sized video. Even if you had good hardware, it wasn't good enough.

    That's not the case these days. For basic stuff a low end system works fine. Also because lithography technology has progressed so much, basic can be quite small. Hence a small, cheap thing like a netbook is feasible to make and sell, and quite popular for various things. Still a computer though, and it hasn't killed off other computer markets.

    We just don't have a "one size fits all" market, or perhaps more accurately we are now able to make technology good enough to make different kinds of systems for different uses.

    The iPad is not the future. The iPhone is not the future. A combination of devices, including ones not yet created, are the future. We do not appear to be heading towards a "death" of normal computers.

  • by HermMunster (972336) on Friday April 30, 2010 @03:56PM (#32048544)

    For the past 25 years we've seen these types of predictions. What's being said is nothing new. Just a new surface on an old polygon.

    The industry has a long way to go before it is going to die. There's nothing Apple nor anyone else can do that will change things. The industry, in a way, is at fault for any problems being perceived. The constant niggling of customers by tiny incremental changes leads them to believe that nothing is happening, and thus their unwillingness to pay the price for the technology. Make big changes, some radical, such as from the command line to the GUI, and we'll see another 50 years of growth in the PC.

    This is more feldercarb by some industry exhaust spewing waste into the ecosystem. They are just blowhards seeking to get you to think that this Apple product is the direction we'll be going. We do not run our computers for gaming, as gaming is secondary. We expect significantly more from our computers than a gaming console provides. We do not do serious productivity work on an iPad or gaming console.

    And Moore's law has nothing to do with this. Everytime someone says Moore's law has come to an end we have another go at it.

    I think what I'm reading is the younger generation that didn't see the world as it was back then, before computing was involved in every aspect of our lives. These people have a problem with their imagination, and hence their minds are out of focus when it comes to innovation and technology. I'm certain this isn't quite like the music business, where a friend said that the only reason music sucks today is because all the good music has already been made. It's really a lack of vision that drives one to conclude that these cobbled devices are technology's future. They are just a crutch to innovation.

  • by Hamsterdan (815291) on Friday April 30, 2010 @03:57PM (#32048560)
    A good game is not defined by its resolution, graphics and explosions. (Same goes for movies BTW).

    A good game is defined by good *gameplay*.
  • For your situation, I'd recommend CSIP (Chainsaw to Idiot People). Seriously, if they're that damned picky, and you haven't snapped yet, kudos to you.

    Em's law: Shit happens, and it happens on a regular basis. Prepare for it.

  • by frank_adrian314159 (469671) on Friday April 30, 2010 @03:58PM (#32048582) Homepage

    No sane business is going to trust all of their valuable IP with a 3rd party, there isn't a third party out there you can really trust.

    No sane [aircraft] business is going to trust their [engines] with a 3rd party, there isn't a third party out there you can really trust.

    No sane [mainframe computer] business is going to trust [printers or disk drives] with a 3rd party, there isn't a third party out there you can really trust.

    No sane [personal computer] company is going to trust [motherboard manufacture] with a 3rd party, there isn't a 3rd party out there you can really trust.

    Get back to me in ten years and tell me, if you still have a job as an organization's "cloud information management" person, how things are going...

  • by binarylarry (1338699) on Friday April 30, 2010 @04:00PM (#32048618)

    This isn't about outsourcing some kind of widget that can be duplicated and mass produced, it's about the data that drives the business itself.

    What you suggest is like Paul McCartney outsourcing a new Beatles album.

  • Close. (Score:1, Insightful)

    by Anonymous Coward on Friday April 30, 2010 @04:02PM (#32048640)

    "The Cloud" is just the fancy new name for "Utility Computing", which was a fancy new name for "The Grid", which was a shorter version of "The Network Is The Computer", which was just a fancy new name for what used to be called "Mainframe Computing".

    When the world does actually switch over to it, it's not still going to be called "The Cloud", and there are a few spectacular failures and legal changes that will happen before we get to the shiny new name that sticks to a successful implementation.

  • by pastafazou (648001) on Friday April 30, 2010 @04:02PM (#32048642)
    the PC industry already IS a profitability wasteland. PC manufacturers have been suppliers of commodity equipment assembled on a shoestring budget with negligible profit for over 5 years now. That's why IBM liquidated their PC division to Lenovo. It's also why Dell's market capitalization continues to dwindle despite their efforts to diversify. And why Acer gobbled up Gateway and eMachines. Companies either have to continually grow their volume to maintain the same profits, or get into something different with more margin. Apple has been doing that for a while now, as has IBM. HP's PC division doesn't make them much money at all (relative to volume), but with all their other lines (printers, servers, etc) it's worth their effort because they can be the sole supplier for some huge corporations, thus making their profits on the specialty equipment.
  • by hotrodman (472382) on Friday April 30, 2010 @04:03PM (#32048654)

        I believe I have very picky users. But then again, a lot of people paid a lot of money to buy a lot of equipment so a lot of people can do sales calls and do the road-warrior thingy and work from remote offices. It's how our company makes its money. So I expect to be able to buy equipment that gets me as close to the ideal as possible. Cloud computing makes no sense in our environment, and probably wouldn't for a very, very long time. I have seen these articles a lot over the years.... and it's just same ole, same ole.
        Yes, I am an old, grumpy Unix admin. It is totally normal to keep a shotgun behind your desk, right?

      - Eric

  • by Sycraft-fu (314770) on Friday April 30, 2010 @04:07PM (#32048692)

    Let's assume we hit the absolute limit. We develop a lithography technique that is as small as possible, and there is no way to do anything on the quantum level. I'm not saying that is remotely likely, just assume. So what? Does that now mean there's no use for anything but an iPad? Hardly. While there's a wide variety of uses for computers these days that require little power, there are plenty of other uses that require more power. Media creation would be a big one. People love to shoot, edit, and distribute video. Wonderful, but you need an OK system to do SD video, and you need a reasonably high-end system to do HD. Video games would be another area. Those modern consoles, including the Wii, have some heavy-hitting graphics hardware in them. Not the kind of thing you pack in an ultra-mobile device.

    In fact, if we hit the absolute limit of transistor size scaling, we'd then be at a point where the only way to get higher performance is larger chips, more processors, more power usage. It would in fact be a hindrance to portable devices. The mobile market we have today is possible only because we've been able to scale things down so well. The potential technologies that people talk about for the future in the mobile market will only be possible with more scaling. If we can't build smaller, more efficient chips, well then we'll just have to live with larger devices.

    Also, just because a market becomes saturated doesn't mean there isn't money to be made in it. Sure, everyone who wants a PC owns one these days, more or less. It is even getting that way with laptops. So what? There's still a market. As an example, look at TVs. In America we hit TV saturation long, long ago. EVERYONE has a TV; even extremely poor families have a TV. What's more, you can now replace a TV with a tiny device. In theory, a smart phone could replace a TV. Doesn't matter: people don't want to watch TV on their smart phone, they want a 65" big-screen TV. It doesn't matter that they could have it more mobile or in another device. They want a bigass TV, so they'll buy one.

  • by grumpyman (849537) on Friday April 30, 2010 @04:19PM (#32048908)
    For all those who f the cloud: have you ever thought about how some of your most precious resources are already in other people's hands? E.g., your bank?
  • by Grishnakh (216268) on Friday April 30, 2010 @04:21PM (#32048956)

    Yep, that's going to work great. Instead of buying a white-box PC for $400-600 and using it for 3-5 years before upgrading, we'll just use cloud-based computing for $100-150 per month. So much more economical.

    Also, we already have glasses that can display full-resolution screens. I tried on a pair at a trade show in 2000, and it worked great: full-color, 1024x768 resolution (as that was 10 years ago, I'm sure they could do better now). So where are they?

  • by outlander78 (527836) on Friday April 30, 2010 @04:31PM (#32049148)

    A couple of points:

    1. These next-generation devices lack storage, and it is far cheaper to put a drive on your local network than it is to rent space online, in which case you pay monthly fees not only for the storage but for the bandwidth to access it. A desktop in the basement is a good solution for this requirement.
    2. A terminal which can be used to access virtual OSes over a network usually costs about the same as a desktop. If you can have the desktop for the same price, why not keep it?
    3. When a product becomes a commodity, people don't stop buying it - in fact, quite the opposite. Just because Apple can't charge $2000 for a computer anymore doesn't mean low-margin suppliers won't continue to sell them.
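    The arithmetic behind point 1 can be sketched quickly. The drive price, cloud rate, and bandwidth fee below are illustrative assumptions, not figures from the thread:

```python
# Hypothetical break-even: buying one local 1 TB drive vs. renting the
# same space online. All prices are illustrative assumptions.

LOCAL_DRIVE_COST = 80.0   # one-time price of a 1 TB drive (USD, assumed)
CLOUD_RATE = 10.0         # monthly rent for ~1 TB of online storage (USD, assumed)
BANDWIDTH_FEE = 2.0       # monthly transfer fees to access it (USD, assumed)

def months_to_break_even(drive_cost: float, monthly_cloud_cost: float) -> int:
    """Months of cloud fees needed to meet or exceed the drive's one-time price."""
    months = 0
    spent = 0.0
    while spent < drive_cost:
        spent += monthly_cloud_cost
        months += 1
    return months

print(months_to_break_even(LOCAL_DRIVE_COST, CLOUD_RATE + BANDWIDTH_FEE))
# With these assumed numbers the rental passes the drive's price in
# about seven months, after which the local disk is pure savings.
```

    Whatever the exact prices, the recurring-fee model only wins if the drive would be replaced before the fees catch up.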
  • ATTN: MAC USERS (Score:3, Insightful)

    by BJ_Covert_Action (1499847) on Friday April 30, 2010 @04:49PM (#32049400) Homepage Journal
    If you don't know what Ctrl-Alt-F1 and Ctrl-Alt-Backspace are for, GTFO.
    If you think a pretty web browser is more important than a properly secured one, GTFO.
    If you don't know how to listen to music with any player other than iTunes, GTFO.
    If you think the App store counts as a software repository, GTFO.
    If you think you know how your computer actually works, GTFO.
    If text that is not encompassed by a pretty bubble widget scares you, GTFO.


    Most importantly:

    If you think personal computers are no longer necessary, interesting, or are part of a dying industry, turn in your geek card at the door as you GTFO.

    ;)
  • by Above (100351) on Friday April 30, 2010 @05:00PM (#32049594)

    It appears many of the responders have interpreted the "end of the PC era" to mean that in 5/10/15 years there will be no more PCs. This interpretation is amazingly stupid, and misses the entire point Steve is trying to make.

    Steve's point is that particular applications and use cases are moving away from the PC. We watched Netflix and YouTube on a PC in the past because supporting them required pushing new software out to a general-purpose platform. But that's not how most users want to watch them. My new TV streams both on its own. I'll never watch Netflix on my PC again.

    A couple of years ago if I wanted to find a nearby restaurant I would have loaded Google Maps, searched, and clicked around on my PC. Today I take my iPhone off my belt, load UrbanSpoon or Yelp, and get more useful information plus a map I can take with me. I don't search for restaurants on my PC anymore.

    People aren't going to get rid of their PCs, and the PC will always be the platform for really new innovation because of its general-purpose nature and the ability to run new software. But PCs have effectively saturated the market. Maybe people need a desktop and a laptop, but no consumer needs 10, 20, or 50 PCs per person. There is no growth.

    But TVs, game consoles, smart phones, tablets, and other form factors are just starting to do interesting things. They are doing them in a more convenient way much of the time, and in a way consumers are more likely to use. I can start a Netflix movie on my TV with 3-4 remote presses. Compare that to 5 years ago, when you had to build a media center PC, hook it up to your TV, deal with all sorts of programs to get content, etc.

    Steve's point is that while PCs are 95% of the way people access information today, they will be 50% in 10 years. Not because PCs have gone away, but because there is an explosion in other devices. So if you keep building for the PC, you'll be building for 50% of the market in 10 years. We'll still be doing word processing on a PC with a mouse and keyboard then, but other things will be done elsewhere.

  • by Mistlefoot (636417) on Friday April 30, 2010 @05:01PM (#32049612)
    There is much truth to what you say.
    Even though you are modded +5 Funny.

    Much of what makes Apple cool is the Elitist attitude.

    They have traditionally been the BMW of computers.

    If a BMW were the same price as a Hyundai, I am sure some of that desire to have a BMW would wane.
    This is not to say the product isn't good. But the fact that it is a product your poor uncool friends can't afford, whilst you can, makes the purchase even better.
  • Re:Moore's law (Score:3, Insightful)

    by mweather (1089505) on Friday April 30, 2010 @05:13PM (#32049814)
    Moore's law has nothing to do with the size of the transistors. It has to do with the number of transistors, and their cost. If you fit the exact same number of transistors for half the cost, Moore's law holds just as true as if you doubled the number of transistors for the same cost.
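    The distinction drawn above can be checked with a toy calculation. The starting transistor count and cost are arbitrary assumptions, used only to show that the two paths are equivalent:

```python
# Moore's law as transistors per dollar: doubling the transistor count
# at the same cost, and halving the cost at the same count, are the
# same improvement. Starting figures are arbitrary.

transistors = 1_000_000   # transistors on today's chip (assumed)
cost = 100.0              # today's chip cost in dollars (assumed)

baseline = transistors / cost              # transistors per dollar today
double_count = (2 * transistors) / cost    # twice the transistors, same cost
half_cost = transistors / (cost / 2)       # same transistors, half the cost

# Both paths double transistors per dollar, i.e. Moore's law holds either way.
assert double_count == half_cost == 2 * baseline
print(baseline, double_count, half_cost)   # 10000.0 20000.0 20000.0
```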
  • by steveha (103154) on Friday April 30, 2010 @05:14PM (#32049826) Homepage

    This guy sounds like a desperate market speculator that has no clue how the market works... etc etc etc

    Whereas you sound like you didn't even read TFA. Or if you did, you don't understand it. Let me break it down for you.

    TFA says that there will be relentless downward pressure on computer prices from now on. This point is unassailable.

    I can buy a Compaq laptop with a dual-core AMD chip, a great 15.6" display, big hard disk, a DVD drive, and lots of RAM, all for about $400, quantity 1 retail. (Or $370 on sale at Fry's.) I can put Ubuntu on it, and the result is nearly as nice as an Apple laptop. Checking apple.com, I see that I can buy a 13" MacBook for $1000, or a 15" MacBook Pro for $1800. No question, the Apple notebooks are nicer: they have that nifty magnetic power cord, they have slot-loading optical drives, they have the great unibody aluminum chassis, etc. But I have to tell you, if I'm spending my own money, it's going to be that $400 computer, or even a $250 netbook with a 10.1" screen. Does a 13" MacBook really offer me four times the value of a $250 netbook?

    TFA says that in the future, Apple is worried that it will be forced to cut their prices and sell at low margins, because the entire PC industry will be forced to cut prices and sell at low margins. I don't see much to debate here either. Here is a quote from TFA:

    PCs are becoming commodity items. The price of PCs and laptops is falling by about 50% per decade in real terms, despite performance simultaneously rising in real terms. The profit margin on a typical netbook or desktop PC is under 10%. Apple has so far survived this collapse in profitability by aiming at the premium end of the market -- if they were an auto manufacturer, they'd be Mercedes, BMW, Porsche and Jaguar rolled into one. But nevertheless, the underlying prices are dropping. Moreover, the PC revolution has saturated the market at any accessible price point. That is, anyone who needs and can afford a PC has now got one. Elsewhere, in the developing world, the market is still growing -- but it's at the bottom end of the price pyramid, with margins squeezed down to nothing.

    Is that clear enough for you? PCs aren't going away, but the traditional PC profit margins are going away, and this will cause a shakeup in the PC manufacturing industry. Apple has, so far, managed to make higher margins than the typical 10%, but how long can they continue this?

    And what do you know, Apple has successfully set up a whole ecosystem where consumers must go through the Apple App Store to get applications, and Apple collects a 30% cut. TFA says that Apple would do almost anything, maybe even give the hardware away, to get all their customers locked into such an ecosystem.

    In short, TFA doesn't say that PCs are going away. It says that PCs are going to be cheap, fast, and ubiquitous, and that companies selling PCs will be forced to accept slim margins. And Apple really doesn't want to play that game. Remember how Steve Jobs dissed netbooks? Apple doesn't want to sell a netbook, or even an iPad, for $250; and the market won't let them get away with selling a netbook for $500. The actual problem Steve Jobs has with netbooks is the razor-thin margins. So far, the market will allow Apple to charge $500 and up for an iPad (although I don't think that can last forever either, once great iPad competitors [slashdot.org] arrive).

    TFA isn't the only place I have seen this theory. See also: http://industry.bnet.com/technology/10006035/why-apple-will-eventually-dump-the-mac/ [bnet.com]

    Maybe the article is far-fetched. But if Steve Jobs thinks he has any chance at all of locking all of Apple's customers into an App Store ecosystem where Apple skims 30% of all the action, you better believe he will go for it.

    steveha

  • by dnahelicase (1594971) on Friday April 30, 2010 @05:20PM (#32049916)
    Plus, people are getting better at computers and spending more time with them. A younger generation is more adept at video editing, gaming, and other advanced computer functions that simply weren't accessible a few years ago. Nobody is going to keep a workstation at home just to check email; my phone is not replacing my need for a computer, it's expanding it. Now I get relevant email almost all hours of the day. I don't need a workstation to read it, but it's nice to have a desktop to organize it every once in a while.

    I realize my phone can take a video and upload it to youtube, but it's a ways away before I can create a mashup of different movie scenes, edit myself into them, create a lightsaber duel, and upload it on my phone.

    That day will probably come, but I imagine by then we'll have thought of new cool things to do that need a machine of decent size.

  • Who's the idiot? (Score:4, Insightful)

    by Nicolas MONNET (4727) <nicoaltiva@gmaiCOLAl.com minus caffeine> on Friday April 30, 2010 @05:54PM (#32050362) Journal

    Yeah, what does he know about computing and the future? After all, he's just a long-time Linux user, former sysadmin, Perl hacker, and currently a very successful science fiction author. And a very good one at that. IMO the best SF writer working today.

  • by Anonymous Coward on Friday April 30, 2010 @06:11PM (#32050546)

    Let's not lose sight of the fact that the world will be a better place the sooner it's rid of Flash. Please, buy all non-Apple products if you wish, but let's all dump Flash ASAP and help promote open web standards.

  • by dangitman (862676) on Friday April 30, 2010 @06:28PM (#32050730)

    But the fact that it is a product that your poor uncool friends can't afford, whilst you can, makes the purchase even better.

    Who the fuck can't afford a Mac? Depending on the vintage, even a homeless person might obtain one for free from a dumpster. Perfectly usable modern Macs are available on eBay for $100 - $200. People don't buy Macs because they're "exclusive." We're not exactly talking Chanel or Louis Vuitton here.

  • by Sycraft-fu (314770) on Friday April 30, 2010 @06:33PM (#32050792)

    When you look at it, it turns out the number of mainframes in use hasn't gone down. It didn't peak and then decrease. It has in fact grown a bit. It is simply that other kinds of computers have grown more. The microcomputer didn't kill the mainframe; it just expanded the computer business to markets the mainframe was never going to reach. I would never own a mainframe of my own, no matter how much I might want to; however, I do own a microcomputer. In fact, I own 3 of them.

    However, mainframes are still in use in many places. IBM still makes new ones (the IBM zSeries). The market is still there, though small. It was never very big, and was never going to be.

    We have probably reached saturation for desktop computers already, and probably did so some time ago. We are likely reaching saturation for laptops too. Doesn't mean they are going away, doesn't mean new ones aren't going to be sold all the time. Just means that the total number in use isn't going to grow a whole lot.

  • by Blakey Rat (99501) on Friday April 30, 2010 @06:35PM (#32050806)

    I'm not complaining about the pre-emptive multitasking or protected memory. What I'm complaining about is mostly:

    * Completely half-assed backwards compatibility. The "Classic" environment never worked worth crap, and Apple didn't even pretend to care about improving it after 10.2 came out.

    * Removing features that were in Classic. Suddenly, Finder isn't spatial anymore, it doesn't have labels, you can't tab folders against the bottom of the monitor.

    * Dismissing any sense of consistency. Suddenly, Macs have two completely different window styles, both in appearance and behavior, for absolutely no reason whatsoever. Since that wasn't screwing with their users enough, they decided every new app should have its own completely different window style.

    * Pissing all over previous usability research. Remember when the destructive window control (Close) was widely separated from the non-destructive ones (Zoom, WindowShade)? We don't need that anymore-- in fact while we're at it, let's make it look like a stoplight (of all things!) instead of using the old icons that at least somewhat attempted to explain the button's behavior.

    * Making new UIs that were... well, a complete mess. (To be generous.) Remember when the live search feature was added into Finder? What a disaster. Did Apple care? Nope, not even slightly. (I'm not saying the Windows one is better, but, again, Apple *used* to raise the bar for usability.)

    Despite all this stuff, they've sold tons of machines, which goes to show that maybe usability doesn't matter at all. Which is a depressing thought.

  • Re:ATTN: MAC USERS (Score:3, Insightful)

    by Gerald (9696) on Friday April 30, 2010 @06:55PM (#32050992) Homepage

    If your computer doesn't run UNIX and Word natively, GTFO.

  • by Gr8Apes (679165) on Friday April 30, 2010 @09:56PM (#32052702)

    You sound like someone stuck in the past, 15 years in the past no less.

    I, and probably 99% of the rest of the current Mac users, couldn't care less about pre-OSX Mac OS anything. They sucked. (Yes, I did use them, and I have as much fondness for them as for Win 3.11 or NT 3.51, or perhaps OS/2 2.0.)

    Seriously, move on. You complain about a relatively congruent system and compare it to the Ribbon... wait, menu... wait, icon system of Windows and the completely nonsensical and inconsistent GUI it comes with? If you doubt me, just do a plain install of Win7 or 2008 R2 and check out the default administrative apps and their modal dialogs. There are at least 3 completely different types of windows.

    Are you trolling or what?

  • by carlmenezes (204187) on Friday April 30, 2010 @10:14PM (#32052814) Homepage
    I have to disagree. Tower PCs are currently only useful because our mobile PCs don't have the horsepower. Mobile PCs will keep getting faster and smaller. So will tower PCs. There was a time they used to be a lot bigger and heavier. However, let us look at what made them that size:

    (1) Hard drives - used to be huge and heavy. Seen SSDs of late?
    (2) CD-ROM drives - who needs them now, when for half the space you get a memory card reader that takes media with more capacity?
    (3) Power supplies - needed to be big, and are probably what is keeping tower PCs at their size, but there is now less need for large supplies with performance per watt going up.
    (4) Graphics cards & CPUs - going to come full circle soon; these two will merge into one processor that uses less power than your average desktop CPU.
    (5) Motherboards - these are already really small.

    So, if you take these main components, the need for a full-tower-sized case is diminishing really rapidly. If you ask me, with tech like wireless HDMI, your tower PC is probably going to be confined to the attic or some unseen space very soon. We're very quickly reaching the point where smaller devices have enough computing power for most of our needs, and as far as heavy lifting goes, I figure it is only a matter of time before every little computing device at home is able to "lend a hand" and "help out" with all that computing. The very fact that PS3s are dominating the SETI distributed computing stats should say something. The PS3's Cell processor is quite the beast. Are you trying to say that the PS4 is not going to be smaller and faster? What about the PS8? Do you think you will even be able to see it?
  • You're lost (Score:2, Insightful)

    by symbolset (646467) on Saturday May 01, 2010 @03:11AM (#32054412) Journal
    Moore's law matters very little in the way that you think of it. We turned a corner here and you don't see it. When performance is good enough, we don't need more performance - we need more performance per Watt.
  • by tgibbs (83782) on Saturday May 01, 2010 @09:54AM (#32055774)

    Well, yes, but Stross didn't say that computers are going to go away. He said that they will no longer be profitable enough to sustain a company such as Apple. Sure, the hobbyists will remain, but Apple doesn't want to turn into Heathkit. And there will continue to be uses for computers in the business and scientific worlds. But it will become less and less possible to sell a premium computer at a high profit margin. The consumer uses that have driven the growth of the computer market--web browsing, video, music, games--as well as the basic education/business functions--word processing, presentation, database, simple calculation--are migrating onto other devices, supported by data storage out on the web. Apple will likely continue to sell computers indefinitely, but they will be more like the low-end PowerBook and Mini, and command less and less of a premium price.

  • by WheezyJoe (1168567) <fegg@NoSpAm.excite.com> on Saturday May 01, 2010 @11:37AM (#32056364)

    Oh... it's ending. I agree with TFA that personal computers (PCs running Windows, Linux, MacOS) are gonna die out.

    Slashdotters are bound to disagree with this for the same reason real geeks like me resisted mice back in the 80's. The command-line was the only way, because it was powerful and we had climbed the learning curve. X10 or X11 only had one purpose... more xterms on a bigger screen. We called Macs "MacinToys" because of their substandard hardware, no multi-tasking, and no command-line to get done what you really wanted it to do.

    But all the time during early Windows and Macs, there was a feeling that faster hardware would make the GUI more palatable. And our art-school friends used Windows and Macs regardless, in spite of all the drawbacks we command-line geeks were so well aware of.

    Fast-forward to today. Just about every Linux distro boots straight to a mouse-based desktop, and all the admin tools have a GUI. The GUI has won. We are happy to spin 90% of our CPU cycles just to paint the screen, because CPU cycles (and RAM! and storage!) are so damned abundant. Macs, Linux, and now even Windows come with a command-line shell, but how often do we actually use it? Really?

    But all the other stuff we invested our time in learning and mastering (partitioning, directory structure, networking, defragging, anti-malware, plug-ins, superior 3rd-party apps, maintenance upon maintenance, everything we have to do to keep our grandma's PC working OK) is still around. Let's face it, ladies: we spend (waste) a lot of time keeping our computers healthy and up-to-date. And we're smug about it.

    The future is a computing platform free of all that shit, where all the skills we are so smug about are as obsolete as the command line. That's where Jobs and the iPad are going, and the market for problem-free, geek-free computing is hungry enough to pay a premium for it, even as PC hardware gets cheaper and cheaper, even as we complain about handing control over to Some Corporation. This market is sick and tired of always running to (or paying) people like us for help.

    And that's the last frontier, the last bit of value-added left to the computer industry. Intel and the market flourished because MacOS and Windows never ran quite well enough with the CPU and memory available. Now, 3GHz 8-core CPUs with 4 GB of RAM really are quite good enough (compare that to your... VAX). But to people who just want to get online or do word processing, there's still a lot of cruft to deal with.

    Let's face it... we LIKE that cruft. We LOVE it. But it's also time-consuming: time spent downloading this and configuring that, or installing just the right liquid-cooled heat exchanger, and on and on until our dream PC is "just right". Jobs and Apple are out to sell a machine that's "just right" out of the box. And they damn well don't want third-party plug-ins like Flash (i) requiring an extra step before you can fully use the Internet, and (ii) putting the platform at risk in case Adobe screws something up.

    Perfect the turn-key computing device, and Jobs has good reason to believe people will hand over their money for years and years to come.

    Because it's the maintenance-free, worry-free, geek-free, turn-the-key experience that Jobs thinks is where the money is. And he's right, just like he was right about the GUI. Geeks like us may want (and pay for) premium hardware, but we'll buy it from Newegg at the cheapest margins possible, and even then, our girlfriends will look up from their iPads with THAT look in their eyes and ask how much longer we're going to need to get our little do-it-yourself project to the level Apple is selling out of the box.

    "But mine will be better, once I'm done...", we start to explain, thinking how "closed" and "restricted" that iPad is.

    Talk to the hand. While she's Facebooking to all her iPad friends about how immature we are, we're all hell-bound to end up like that grumpy old COBOL developer: "In my day, we wrote code in ed, one line at a time, 'cause we only had 1024K in the whole damned mainframe for 85 VT-100's across the whole campus... and we LOVED IT!"
