
The End of the PC Era and Apple's Plan To Survive 549

Posted by Soulskill
from the moore-money-moore-problems dept.
Hugh Pickens writes "Charlie Stross has written a very interesting essay, ostensibly about the 'real reason why Steve Jobs hates Flash,' but really about how Jobs is betting Apple's future on an all-or-nothing push into a new market as Moore's law tapers off and the personal computer industry craters and turns into a profitability wasteland. Stross says that Apple is trying desperately to force the growth of a new ecosystem — one that rivals the 26-year-old Macintosh environment — to maturity in five years flat — the time scale in which they expect the cloud computing revolution to flatten the existing PC industry and turn PC manufacturers into suppliers of commodity equipment assembled on a shoestring budget with negligible profit. 'Any threat to the growth of the app store software platform is going to be resisted, vigorously, at this stage,' writes Stross. 'And he really does not want cross-platform apps that might divert attention and energy away from his application ecosystem.' The long-term goal is to support the long-term migration of Apple from being a hardware company with a software arm into being a cloud computing company with a hardware subsidiary. 'This is why there's a stench of panic hanging over Silicon Valley. This is why Apple have turned into paranoid security Nazis, why HP have just ditched Microsoft from a forthcoming major platform and splurged a billion-plus on buying up a near-failure; it's why everyone is terrified of Google,' writes Stross. 'The PC revolution is almost coming to an end, and everyone's trying to work out a strategy for surviving the aftermath.'"

  • by Pojut (1027544) on Friday April 30, 2010 @03:44PM (#32048378) Homepage

    With the PS3 you can use [playstation.com] a keyboard and mouse, and there are options [xoxide.com] for the 360.

  • good dept! (Score:3, Interesting)

    by opencity (582224) on Friday April 30, 2010 @03:44PM (#32048382) Homepage

    " moore-money-moore-problems"
    is a very good gag.
    My personal "recent" favorite is "weapons-of-map-reduction" about BigTable, IIRC, but I laugh out loud periodically. Good work /. editors!
    I think someone (else besides me) should put together a list of the best depts and hack some voting software together.

  • by istartedi (132515) on Friday April 30, 2010 @03:52PM (#32048494) Journal

    LOL. End of PC era. Can I have what they're smoking? In a Smithsonian exhibit, I saw a graph of TV ownership in the US. It was a saturation curve, flattening out in the 1970s, IIRC. By then, most people had TVs, and it was just gap filling. I saw the PC ownership curve saturating in the late 90s. By PC, I mean Personal Computer, including Macs.

    The point? Companies like Zenith and Sony made money long after the "TV revolution" was over. Better models, ergonomic features, add-ons, incremental refinements, solid state vs. tube, etc.

    It's shortsighted to think that we aren't going to continue to have refinements in the PC other than Moore's law related speedups. No, companies like Intel won't be driving huge speculative bubbles anymore; but they won't be going bankrupt either. Just like TV makers, the differentiator will be how well they run their business. It'll be things like customer service, cashflow, etc. It'll be boring business stuff, sorry; but not the end of the world.

    Oh, and f*** the cloud. You can have my hard drive when you pry it from my cold dead fingers. Actually, make that my affordable solid state drive. See? Plenty of refinements left in the pipeline.

  • Re:Moore's law (Score:4, Interesting)

    by Dunbal (464142) * on Friday April 30, 2010 @03:53PM (#32048514)

    It is tapering off. Any smaller and quantum mechanics comes into play.

          So obviously this whole concept of parallel computing and multi-core processors has just whizzed right past you, huh? Especially when Intel and AMD are planning 128- and 256-core CPUs for HOME use, and current supercomputers use tens of thousands (or more) of CPUs, thus busting the "we can't get smaller" myth. Yeah, OK, maybe you can't fit more transistors on a 5mm x 5mm chip, but you can fit a LOT of chips in a mini tower case...
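The multi-core counterpoint above comes with a standard caveat worth quantifying: Amdahl's law says overall speedup is capped by the serial fraction of a workload, no matter how many cores you add. A minimal sketch (the 95%-parallel figure is purely illustrative, not something claimed in the thread):

```python
# Amdahl's law: overall speedup is limited by the serial fraction
# of the workload, regardless of core count.
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A 95%-parallel workload tops out far below the core count:
for n in (2, 128, 256):
    print(n, round(amdahl_speedup(0.95, n), 1))
# 2 cores -> 1.9x, 128 cores -> 17.4x, 256 cores -> 18.6x
```

Going from 128 to 256 cores buys barely 7% more throughput in this toy case, which is why "more chips in a tower" is not a free substitute for faster transistors.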

  • by alen (225700) on Friday April 30, 2010 @03:54PM (#32048516)

    I read an almost identically worded article about 10 years ago, when everyone was afraid of Microsoft.

    I like my iPhone and will probably buy a few iPads, but we're reliving the 1970s with this stuff. People will get tired of their dumb devices in a few years, the way they grew to hate rent-some-time dumb terminals, and in 5-10 years someone will sell a mobile device that runs a real OS, not the dumbed-down iPhone OS that's locked to Apple or Google. We are in a flash memory revolution, and in 5-10 years we will be back to having more mobile power than we know what to do with, like we hit with PCs around 2000.

    People are looking for a good mobile computing experience today, and Windows 7 isn't it. The Slate looked OK but too slow.

  • Re:Moore's law (Score:5, Interesting)

    by mveloso (325617) on Friday April 30, 2010 @04:09PM (#32048740)

    It's not that Moore's law is tapering off. It's just that machines are so fast that 99% of the population doesn't need more.

    Why an i7 when a Core 2 Duo would be just as good? Or an Atom? You don't need a lot of processing power to log onto Facebook, watch a YouTube video, or create and edit a Word doc.

    The world is changing, and many people in the field find it hard to believe. Just the other day I heard some tech make a crack about an underpowered Core 2 Duo box... for general office use. A P4 would be perfectly fine for that; a C2D is overkill.

  • by gtbritishskull (1435843) on Friday April 30, 2010 @04:14PM (#32048824)
    The point is not that all computing will go to the cloud. The problem is that computers are becoming a commodity. Innovation is slowing down because, for most consumers (including businesses), they don't need their computers to get any faster or gain better features. There is no way for computer sellers to stay "ahead" of their competitors, so profit margins shrink. You can still make a profit in commodity markets, but not the profits an innovative company like Apple has come to expect. So they are looking ahead to other areas where they can be innovative and make large profits.
  • by RockoTDF (1042780) on Friday April 30, 2010 @04:15PM (#32048832) Homepage
    Actually, I have to wonder about this. I got into computers because of gaming (i.e., what parts do I need, how to troubleshoot problems, etc.) and never had the latest console until my brother got a PS2 when I was 17. I can't help but wonder whether, if my parents had bought me an N64 instead of a PC for the family that Christmas, I would be where I am now.
  • Not really (Score:2, Interesting)

    by AnonymousClown (1788472) on Friday April 30, 2010 @04:18PM (#32048886)

    The point? Companies like Zenith and Sony made money long after the "TV revolution" was over. Better models, ergonomic features, add-ons, incremental refinements, solid state vs. tube, etc.

    Those guys (I think Sony too), along with GE and RCA, licensed their names to cheap Asian electronics makers, meaning that a name-brand TV is really a cheap Asian set using the name only. The margins became so thin that those big US companies didn't think manufacturing was worth it, and they got a better return by licensing their names. The Asian manufacturers got instant brand recognition.

    I was really surprised when IBM cut its ties with Lenovo. I was expecting IBM to license its name to Lenovo, letting Lenovo keep selling PCs and laptops under the IBM name, with IBM having nothing to do with it.

    Many other industries operate this way.

  • by jeffmeden (135043) on Friday April 30, 2010 @04:21PM (#32048948) Homepage Journal

    The big problem is that there is probably not going to be another "killer app" for the desktop PC, like the WWW was for most people over the past 10 years or so. You went from PCs as an enthusiast market (not selling very many, relatively high cost per unit) to a market where everyone realized they needed one, and suppliers sprang up at lightning speed to fill the demand. Look at how fast Dell Inc. grew up and ate into the big traditional companies like HP and IBM. The same thing is about to happen again; the PC market is at the end of its "killer" phase where everyone needs to have one. For the past ten years, millions upon millions of PCs were sold each year. There is no way we are ever going to see those kinds of numbers again; even a $300 PC does a passable job at internet, email, and productivity.

    People will probably keep their current PC for five or ten years, until it completely breaks, and if they get a new one it will be a bargain PC sold to them with little margin (most of which will go to the big box store that sells it.) For big tech companies, it's time to get behind the "next big thing" or plan on becoming the next Compaq, relegated to the recycle bin of computer history for lack of innovation. The PC market is going to dry up like the Sahara; all the enthusiasts in the world couldn't keep Dell busy the way they were when every last person in the modern world was in the market for a computer. Soon enough every last person in the developed world will be in the market for this next thing, and some people know it and are furiously trying to make sure they are in on it.

  • Re:Privacy (Score:2, Interesting)

    by jpcarter (1098791) on Friday April 30, 2010 @04:28PM (#32049078)

    So far most of these new devices seem to have a huge tradeoff: privacy. There are very few apps on my iPod touch that allow me to keep my stuff within the confines of my home, especially if I am on the road and not on my own network. Until these privacy concerns are addressed I hope PCs survive; otherwise the tech industry has done a monumental disservice to everyone. This sums up my main dislike for Apple.

    You are in the minority. I'm sorry.

    We need a major privacy catastrofuck to educate the masses or sway public opinion. Even then, you're fighting cheap convenience.

  • by jollyreaper (513215) on Friday April 30, 2010 @04:46PM (#32049352)

    We hear this "everything's already there, been there, done that." But in reality we have a lack of innovation in products and markets because of a rather large monopoly that has stifled competition at every turn, even after being convicted. It isn't that having one OS to rule them all has helped us get to where we are; it is in spite of that that we are where we are. We have continued to penetrate new markets, to educate people, to bring out products such as tablet computers and smartphones in spite of being smothered from the top.

    There's an old saying that goes "You can give a monkey a computer and he'll use it but probably just to crack open walnuts." The IT failures I've seen come from a lack of vision, a lack of understanding, and a lack of follow-through. It's like watching someone turning an electric screwdriver by hand because they don't realize there's a power switch.

    It's a false line of reasoning to say, "Just because I can't think of a better way, nobody else can either." But it's really hard to improve on what we've got. Look at the mouse. I can make a lot of complaints about it, but have we yet found an input tool that makes the mouse a thing of the past? No. Likewise, we haven't really found a good replacement for the keyboard. People keep trying, but I think it's safe to say the computers of the next decade will come with mice and keyboards.

    We're going to be going through a system upgrade at my job. The old system is pretty crappy, no argument there, but we're still not even using it properly. Back to what I said above, failures in vision and understanding. I'll do my best to see that we can make a change of it this time but we're likely to be back to using the system to a fraction of its full ability.

  • Re:Moore's law (Score:4, Interesting)

    by geekoid (135745) <{moc.oohay} {ta} {dnaltropnidad}> on Friday April 30, 2010 @04:48PM (#32049374) Homepage Journal

    I won't reply to AC, but I will reply to you.

    While you are technically correct that we could do parallel development with any chips and that it has nothing to do with Moore's law, in actuality the poster was correct, and here is why: the market.

    A byproduct of Moore's law was speed. So when AMD became a player, Intel shifted much of its focus from parallel development to clock speed. Increasing clock speed drove the market, not individual clock-cycle usage.

    Now the number of transistors per square millimeter of chip has gotten damn hard to double. There are many reasons for this.

    Running out of transistor space finally forced companies to focus on parallel development. I think we would be a decade ahead if AMD hadn't come along... or had marketed parallelism as real performance, rather than speed.
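The doubling the comment refers to is just compound growth; a toy calculation (hypothetical starting count, not real die data) shows why the curve was so dramatic while it held:

```python
# Moore's law as compound doubling: transistor counts double roughly
# every two years while the trend holds.
def transistors(start_count, years, doubling_period=2.0):
    return start_count * 2 ** (years / doubling_period)

# A hypothetical 100-million-transistor chip, ten years on:
print(int(transistors(100e6, 10)))  # -> 3200000000 (32x)
```

Five doublings in a decade is a 32x jump; once density stops doubling, that growth has to come from somewhere else, which is the commenter's point about parallelism.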

  • Re:ATTN: SWITCHEURS (Score:1, Interesting)

    by Blakey Rat (99501) on Friday April 30, 2010 @04:52PM (#32049454)

    I'm an old school Mac user. I switched to Windows when it became clear (after trying the first 4 versions of OS X) that Apple no longer gave half-a-shit about usability.

    If the choice is between two unusable systems (as it seems to be), I'll pick the one with the most apps.

    I still like Apple's hardware though.

  • by Anonymous Coward on Friday April 30, 2010 @07:23PM (#32051342)

    There's a lot that Stross got right (I do hope he doesn't delay the launch of his next Laundry novel because he spends his time writing essays). There's going to be a major change in the PC/network world soon, and Jobs is reacting to that knowledge. What's going to happen is that people who do not know how to use a computer or secure their own wi-fi network will be buying appliances that require no technical knowledge. There's also going to be a big market of people who do have tech know-how, but just want to do a lot of things you can do with the new devices, in addition to owning a personal computer. The "cloud" is not going to replace corporate computing and databases, because—as many other people have remarked—no sane corporation would trust their family jewels to anything as amorphous as "the cloud". Sure, corporations use the services of other corporations to manage their data—but the owner knows where the data is, and how it's being secured, and all kinds of conditions are locked up tight in a contract that invokes huge penalties for things like, for example, security leaks.

    The "home computer" for the masses is going away, that's right. Its place will be taken by phones and (because there are obvious limitations to such a small screen) larger tablet-like devices. However, there's one problem with this scenario: it isn't going to happen as long as every big player wants users to live only in their corporate cloud.

    Two major innovations brought us where we are today: the standard PC operating system and the World Wide Web. Neither was planned by any major corporation. DOS (later to become Windows) became the universal OS because IBM didn't understand the importance of their PC OS, and gave control of it to one of the luckiest opportunists of all time—Bill Gates. Sure, Gates founded a hugely successful corporation as a result of recognizing this opportunity. But that corporation came about as a result of the success of his idea—the insight that the real money was in software, and the way to make money on an OS was to license it to anybody who wanted to pay. Once consumers realized that they could buy the cheap "IBM clones" and run the same programs as would run on the IBM PC, the hardware ceased to matter, and software was everything. This was not planned by one of the big players in the computer industry; it created the biggest player.

    The same was true of the WWW. It just grew on its own. No corporation started it deliberately (sure DARPA started the internet, which provided the necessary infrastructure, but the internet is not the Web). Nobody made big bucks on the Web itself (which is not to say that the Web can't be used to make big bucks).

    Now we're at another cusp. What's going to have to happen for mass acceptance of the "cloud" is that the cloud be both free and open . But none of the big players of today really want that. They want to lock people into their proprietary cloud-jails. That is not going to work. Somebody is going to have to come along and think of something new, something that will leave Apple, Google, HP, and all the other players in the dust. That will be the new Player.

  • Re:ATTN: SWITCHEURS (Score:3, Interesting)

    by Blakey Rat (99501) on Friday April 30, 2010 @10:42PM (#32052994)

    You sound like someone stuck in the past, 15 years ago past no less.

    Eh, ok. At least I'm not worshipping technologies stuck in the 70s like some Slashdotters are. :)

    I, and probably 99% of the rest of the current mac users, couldn't care less about pre-OSX Mac OS anything. They sucked.

    The technology of it sucked, mostly. Nobody's going to make the claim that cooperative multitasking or lack of memory protection is a good thing in OSes.

    What I loved was the attention to detail, the useful features (many of which disappeared forever), the focus on consistency and usability. The entire OS was contained in a single folder, for all practical purposes-- nobody had to reformat a disk to upgrade. In fact, I went from System 7.1 to 9.2 on a single machine without ever formatting. And the entire OS ran on (what amounted to) a plug-in architecture.

    And look, I know that the later versions of Mac OS were just a quick hack to distract us all from the fact that Apple's OS development had gone off into la-la land. That doesn't make them awful products.

    And yes, it crashed. It crashed a decent amount, although I'd say that if the system was well-maintained (removing buggy Extensions) it certainly didn't crash any more than Windows 95 or 98.

    You complain about a relatively congruent system and compare it to the Ribbon...

    First of all, I like the Ribbon. I'm not going to apologize for that.

    But more relevant to this conversation, at least Microsoft is *trying*. Even if you hate the Ribbon, you have to acknowledge that Microsoft was taking a huge risk by implementing it-- but they did the research, they surveyed the users, they truly believed it was a step forward, and they took that risk. Successful or not, I respect that.

    Those four things I just mentioned? Apple can do that with hardware, but when's the last time they took *any* sort of risk in their OS? Hell, they basically rewrote a windowing system and file browser *from scratch* and we ended up with something nearly identical to what was there before-- Apple doesn't have any guts at all anymore, and they certainly aren't doing anything to move the state of the art forward. Completely stagnant.

    If you doubt me - just do a plain install of Win7 or 2008 R2 and check out the default administrative apps and their modal dialogs. There are at least 3 completely different types of windows.

    I never said Windows had a better UI. Re-read my post.

    I did exaggerate to call both Windows and OS X "unusable", that's clearly not the case-- both are a dozen times better than products of 10 years ago. I just momentarily forgot how literal-minded the average Slashdotter was.

    The thing is, Apple products used to be head and shoulders over the Microsoft equivalents. Now they're pretty much exactly the same.

    (Or as a more petty and small-minded reply: 3 window types, whee. What's OS X up to now, 12?)

    Are you trolling or what?

    No, I just happen to have an opinion that's different than yours. That's still legal, right?

  • Re:ATTN: SWITCHEURS (Score:3, Interesting)

    by Blakey Rat (99501) on Saturday May 01, 2010 @01:23AM (#32053990)

    OS X is actually the first time I've ever experienced version X of a product having *fewer* features than version X-1. The number of features subtracted from the OS 9 Finder alone was enormous. That's what bothers me more than the backwards-compatibility thing.

    There were features Apple put in that I loved, and relied on every day, and... *riiippp* gone now! Tough shit! It's never coming back!

    It's all about moving on to new things.

    Yah, now we just need to get them to move on to *better* things. ;)

  • by RichM (754883) on Saturday May 01, 2010 @08:52AM (#32055462) Homepage
    I've talked with my coworkers about this a few times.
    We agree that the future will involve something much like a Nokia N900 [nokia.com] with a couple of USB ports on it.
    The basic idea is that you get to the office, plug your 24" LCD into the mini-HDMI port on the device, plug your keyboard and mouse into the USB ports and away you go.
    Network access would be provided either by wireless or VPN via HSDPA.
