The End of the PC Era and Apple's Plan To Survive
Hugh Pickens writes "Charlie Stross has written a very interesting essay, ostensibly about the 'real reason why Steve Jobs hates Flash,' but really about how Jobs is betting Apple's future on an all-or-nothing push into a new market as Moore's law tapers off and the personal computer industry craters and turns into a profitability wasteland. Stross says that Apple is trying desperately to force the growth of a new ecosystem — one that rivals the 26-year-old Macintosh environment — to maturity in five years flat — the time scale in which they expect the cloud computing revolution to flatten the existing PC industry and turn PC manufacturers into suppliers of commodity equipment assembled on a shoestring budget with negligible profit. 'Any threat to the growth of the app store software platform is going to be resisted, vigorously, at this stage,' writes Stross. 'And he really does not want cross-platform apps that might divert attention and energy away from his application ecosystem.' The long-term goal is to support the long-term migration of Apple from being a hardware company with a software arm into being a cloud computing company with a hardware subsidiary. 'This is why there's a stench of panic hanging over Silicon Valley. This is why Apple have turned into paranoid security Nazis, why HP have just ditched Microsoft from a forthcoming major platform and splurged a billion-plus on buying up a near-failure; it's why everyone is terrified of Google,' writes Stross. 'The PC revolution is almost coming to an end, and everyone's trying to work out a strategy for surviving the aftermath.'"
ATTN: SWITCHEURS (Score:4, Funny)
If you don't know what Cmd-Shift-1 and Cmd-Shift-2 are for, GTFO.
If you think Firefox is a decent Mac application, GTFO.
If you're still looking for the "maximize" button, GTFO.
If the name "Clarus" means nothing to you, GTFO.
Bandwagon jumpers are not welcome among real Mac users [atspace.com]. Keep your filthy PC fingers to yourself.
Re: (Score:3, Funny)
Re: (Score:2)
It's the Open-Apple key, but I will allow Apple-Key.
"Shift-Apple-3 to take a screen shot"
I'm partial to Shift-Apple-Ctrl-4 (I generally like my screenshots to go to clipboard), especially since a subsequent tap of the spacebar makes it basically behave like a time-delayed Shift-Apple-Ctrl-3. Very handy.
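For anyone who would rather script those captures than memorize the chords, here is a minimal sketch that shells out to macOS's screencapture command-line tool (assumes you are on macOS; see man screencapture for the authoritative flag list):

    # Minimal sketch: driving macOS's screencapture tool from Python.
    # Assumes macOS; flags used: -c (clipboard), -i (interactive selection), -T (delay in seconds).
    import subprocess

    # Cmd-Shift-3 equivalent: capture the whole screen to a file
    subprocess.run(["screencapture", "fullscreen.png"], check=True)

    # Cmd-Shift-Ctrl-4 equivalent: interactively select a region, send it to the clipboard
    subprocess.run(["screencapture", "-i", "-c"], check=True)

    # Roughly the "time-delayed" full-screen-to-clipboard trick described above (5-second delay)
    subprocess.run(["screencapture", "-T", "5", "-c"], check=True)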
ATTN: MAC USERS (Score:3, Insightful)
If you think a pretty web browser is more important than a properly secured one, GTFO.
If you don't know how to listen to music with any player other than iTunes, GTFO.
If you think the App store counts as a software repository, GTFO.
If you think you know how your computer actually works, GTFO.
If text that is not encompassed by a pretty bubble widget scares you, GTFO.
Most importantly:
If you think personal computers are no lon
Re: (Score:3, Insightful)
If your computer doesn't run UNIX and Word natively, GTFO.
This troll makes me all nostalgic (Score:5, Funny)
Oh, those adorable bisexual Mac ravers. I'd forgotten all about their deliciously ambiguous sexuality and rebellious fashion sense. Here, have some glow sticks and pacifiers, Mac rave kids! Ah, the good old days, when trolls asked us to think of our breathing, to picture mare sex, and the GNAA was more than just a funny name. Not like trolls these days, with their 'nigger' this and 'Obama's got a bigger dick than me which makes me feel inferior' that. Boring! Open Source developers sodomizing innocent coworkers in an orgy of shit and puke, THAT was a troll. But try telling that to kids these days...
Damn kids, git offa mah lawn.
Re:ATTN: SWITCHEURS (Score:5, Funny)
If you think Clarus is a misspelling of Claris, you REALLY, REALLY need to get the FUCK off my platform. RIGHT. NOW.
Re: (Score:3, Informative)
Huh, interesting, OS X is the only thing that I think Apple has done extraordinarily right.
Re:ATTN: SWITCHEURS (Score:4, Insightful)
I'm not complaining about the pre-emptive multitasking or protected memory. What I'm complaining about is mostly:
* Completely half-assed backwards compatibility. The "Classic" environment never worked worth crap, and Apple didn't even pretend to care about improving it after 10.2 came out.
* Removing features that were in Classic. Suddenly, Finder isn't spatial anymore, it doesn't have labels, you can't tab folders against the bottom of the monitor.
* Dismissing any sense of consistency. Suddenly, Macs have two completely different window styles, both in appearance and behavior, for absolutely no reason whatsoever. Since that wasn't screwing with their users enough, they decided every new app should have its own completely different window style.
* Pissing all over previous usability research. Remember when the destructive window control (Close) was widely separated from the non-destructive ones (Zoom, WindowShade)? We don't need that anymore-- in fact while we're at it, let's make it look like a stoplight (of all things!) instead of using the old icons that at least somewhat attempted to explain the button's behavior.
* Making new UIs that were... well, a complete mess. (To be generous.) Remember when the live search feature was added into Finder? What a disaster. Did Apple care? Nope, not even slightly. (I'm not saying the Windows one is better, but, again, Apple *used* to raise the bar for usability.)
Despite all this stuff, they've sold tons of machines, which goes to show that maybe usability doesn't matter at all. Which is a depressing thought.
Re: (Score:3, Insightful)
You sound like someone stuck in the past, 15 years in the past no less.
I, and probably 99% of the rest of the current mac users, couldn't care less about pre-OSX Mac OS anything. They sucked. (Yes, I did use it, and I have as much fondness for it as for Win 3.11 or NT 3.51, or perhaps OS/2 2.0)
Seriously, move on. You complain about a relatively congruent system and compare it to the Ribbon... wait, menu... wait, icon system of Windows and the completely nonsensical and inconsistent GUI it comes with? If you doubt
Re: (Score:3, Interesting)
You sound like someone stuck in the past, 15 years in the past no less.
Eh, ok. At least I'm not worshipping technologies stuck in the 70s like some Slashdotters are. :)
I, and probably 99% of the rest of the current mac users, couldn't care less about pre-OSX Mac OS anything. They sucked.
The technology of it sucked, mostly. Nobody's going to make the claim that cooperative multitasking or lack of memory protection is a good thing in OSes.
What I loved is the attention to detail, the useful features (many of whic
Re: (Score:3, Interesting)
OS X is actually the first time I've ever experienced version X of a product having *fewer* features than version X-1. The number of features subtracted from the OS 9 Finder alone was enormous. That's what bothers me more than the backwards compatibility thing.
There were features Apple put in that I loved, and relied on every day, and... *riiippp* gone now! Tough shit! It's never coming back!
It's all about moving on to new things.
Yah, now we just need to get them to move on to *better* things. ;)
Re: (Score:3, Informative)
I have never owned a Mac. I rarely end up having a chance to mess around with them but last year two people with 12" Aluminum Powerbooks had hardware failures. I do a lot of side jobs fixing broken laptops so I've taken apart my fair share of various PC laptops. I told them I could probably help them. Both just needed the Power-In board replaced.
I was stunned by how well put together the Mac laptops felt compared to the average HP/Toshiba/Dell. Even the higher end Tecra line and the like seem like toys
Re: (Score:3, Informative)
This is interesting. Most Mac owners I know are poor uni students, or were poor uni students. Most of them simply went without things that other people take for granted - cars, nice apartments, holidays, etc, and often made a small income from their machine - dj's, print designers, sound designers, etc.
Most of them see their machines as a hard working tool that suits their needs, not a fashion accessory. They sacrifice a few things to make that purchase, because that tool is more important to them than a ho
Re: (Score:3, Insightful)
But because it is a product that your poor uncool friends can't afford, whilst you can, makes the purchase even better.
Who the fuck can't afford a Mac? Depending on the vintage, even a homeless person might obtain one for free from a dumpster. Perfectly usable modern Macs are available on eBay for $100 - $200. People don't buy Macs because they're "exclusive." We're not exactly talking Chanel or Louis Vuitton here.
It's not ending... (Score:5, Insightful)
...just changing. People seem to be exclusively using mobile devices more and more (whether it be phones, tablets, or laptops/netbooks/etc). That being said, tower PCs will ALWAYS have a place in the enthusiast and hobbyist markets. Even with my phone, laptop, and whatever else, I still love having a full-blown setup at home that I can chill out in front of.
Hard to beat a multi-screen setup with a full-size keyboard and a Kensington Expert trackball :-)
Re: (Score:2, Insightful)
Re: (Score:2, Funny)
Re:It's not ending... (Score:4, Insightful)
Wake me up when consoles have the same control options as a PC. While an analog stick may be miles above a D-pad, it still has a long way to go before I will swap one in to replace a 7-button mouse + keyboard with a half-dozen macros.
Re: (Score:3, Interesting)
With the PS3 you can use [playstation.com] a keyboard and mouse, and there are options [xoxide.com] for the 360.
Re: (Score:2)
Do all games support them, however?
Re: (Score:3, Insightful)
Yeah, when you have a console that runs the latest games at 1920x1200 with an FPS over 100, then you can say consoles have caught up.
The only reason consoles have caught up to PCs is because they hinder innovation due to their limited hardware.
Re: (Score:2, Insightful)
A good game is defined by good *gameplay*.
Re: (Score:2)
100% agree. Certain genres just plain work better on the PC, and some games also belong on PC even if they can be kinda ported to consoles (I'm looking at you, Dragon Age.)
Re:It's not ending... (Score:4, Interesting)
Re: (Score:2)
I didn't say there wasn't still a gap, I said the gap has been significantly reduced.
a new tide, higher than the last tide (Score:3, Informative)
I didn't say there wasn't still a gap, I said the gap has been significantly reduced.
I haven't looked at the numbers closely, but I suspect the gap hasn't been reduced at all. ATI's new Evergreen is kicking out a trillion SP FLOPS and now has full IEEE double precision as well. After ignoring graphics for years, OpenCL has caught my fancy. I once hoped that IBM would kick out an upgraded version of Cell with fast IEEE double precision, but their unholy alliance with Sony proved to be quite the fiasco.
What has greatly changed is the relevance of the gap. When you're a student living in a
Re: (Score:3, Informative)
Last I checked, every MMORPG runs on the PC, with a few on Macs also, but I can't think of a single MMORPG that works on consoles.
The MMORPG genre is pretty damn big at this point. Since you blog this stuff, why do you think that MMORPGs have not moved to include consoles?
Uh...just to name a few:
Phantasy Star Online for Dreamcast [wikipedia.org]
EverQuest Online Adventures on PS2 [wikipedia.org]
Final Fantasy XI on PS2 [wikipedia.org]
Phantasy Star Universe on PS2 and 360 [wikipedia.org]
Age of Conan coming to 360 [wikipedia.org]
Re:It's not ending... (Score:5, Insightful)
While I'm a huge fan of apocalyptic prophesies, I tend to agree.
The reason being, business is going to use the cloud but it's going to augment existing practices, not replace them. No sane business is going to trust all of their valuable IP with a 3rd party, there isn't a third party out there you can really trust. Not Google, Not Apple, Not Microsoft (LOL)... they've all had very serious and public security failings in their recent history.
This may be less true for consumers at home, but that's nothing new as "the cloud" for them is just a fancy new term for "the world wide web."
Re: (Score:3, Insightful)
No sane business is going to trust all of their valuable IP with a 3rd party, there isn't a third party out there you can really trust.
No sane [aircraft] business is going to trust their [engines] with a 3rd party, there isn't a third party out there you can really trust.
No sane [mainframe computer] business is going to trust [printers or disk drives] with a 3rd party, there isn't a third party out there you can really trust.
No sane [personal computer] company is going to trust [motherboard manufacture] wit
Re:It's not ending... (Score:5, Insightful)
This isn't about outsourcing some kind of widget that can be duplicated and mass produced, it's about the data that drives the business itself.
What you suggest is like Paul McCartney outsourcing a new Beatles album.
Re: (Score:3, Funny)
What you suggest is like Paul McCartney outsourcing a new Beatles album.
Well...he did let George write SOME songs...
Re: (Score:3, Interesting)
Re: (Score:2)
And if cloud computing takes off (which is an even bigger "if" in my opinion) it won't matter where your "main" computer is, or perhaps not eve
Re: (Score:3, Insightful)
Yep, that's going to work great. Instead of buying a white-box PC for $400-600 and using it for 3-5 years before upgrading, we'll just use cloud-based computing for $100-150 per month. So much more economical.
Also, we already have glasses that can display full-resolution screens. I tried on a pair at a trade show in 2000, and it worked great: full-color, 1024x768 resolution (as that was 10 years ago, I'm sure they could do better now). So where are they?
Re:It's not ending... (Score:5, Insightful)
Or in professional markets, business markets, and so forth. People who need high performance systems and who are willing to sacrifice mobility will continue to buy tower PCs and workstations. Even mainframes remain in use by the very customers they were originally intended for: large institutions with large computing needs.
Now, consumers may abandon tower PCs, which is another story entirely.
Mainframes are a very good example (Score:5, Insightful)
When you look at it, it turns out the number of mainframes in use hasn't gone down. It didn't peak and then decrease. It has in fact grown a bit. It is simply that other kinds of computers have grown more. The microcomputer didn't kill the mainframe, it just expanded the computer business to markets the mainframe was never going to reach. I would never own a mainframe of my own, no matter how much I might want to, however I do own a microcomputer. In fact, I own 3 of them.
However mainframes are still in use in many places. IBM still makes new ones (the IBM zSeries). The market is still there, though small. It was never very big, and was never going to be very big.
We have probably reached saturation for desktop computers already, and probably did so some time ago. We are likely reaching saturation for laptops too. Doesn't mean they are going away, doesn't mean new ones aren't going to be sold all the time. Just means that the total number in use isn't going to grow a whole lot.
Re: (Score:3, Insightful)
Re:It's not ending... (Score:5, Funny)
Don't be ridiculous. Tower PCs are going to die, and we're all going to be using mobile devices in the future. When you go to work in an office, you won't be using a Dell in a tower case with a 24" monitor any more; you'll be answering your email, working on spreadsheets and documents, and doing CAD design or programming on a netbook with a 7" screen, or even a smartphone with an on-screen keyboard, or perhaps one of those virtual keyboards that are projected onto a desk. I predict full-size keyboards and monitors are going to be obsolete within 5 years.
Offices in the near future will be completely revolutionized by this mobile technology. Gone will be walled offices and cubicles, and instead people will come to work at offices which are just very large rooms which look much like cafeteria seating areas, where everyone can sit together at long tables, and do all their work on their smartphones, while being able to easily collaborate with each other, and anyone in the entire office. It's going to be amazing how much more productive everyone is in such an environment.
Re: (Score:3, Interesting)
The big problem is that there is probably not going to be another "killer app" for a new desktop PC, like the WWW was for most people for the past 10 years or so. You went from PCs as an enthusiast market (not selling very many, relatively high cost per unit) to a market where everyone realized they needed to get one, and suppliers sprang up at lightning speed to fill the demand. Look at how fast Dell Inc. grew up and ate into the big traditional companies like HP and IBM. The same thing is about to
Re: (Score:3, Insightful)
Well, yes, but Stross didn't say that computers are going to go away. He said that they will no longer be profitable enough to sustain a company such as Apple. Sure, the hobbyists will remain, but Apple doesn't want to turn into Heathkit. And there will continue to be uses for computers in the business and scientific worlds. But it will become less and less possible to sell a premium computer at a high profit margin. The consumer uses that have driven the growth of the computer market--web browsing, video,
Re:It's not ending... (Score:5, Insightful)
Oh... it's ending. I agree with the FA that personal computers (PCs running Windows, Linux, Mac OS) are gonna die out.
Slashdotters are bound to disagree with this for the same reason real geeks like me resisted mice back in the 80's. The command-line was the only way, because it was powerful and we had climbed the learning curve. X10 or X11 only had one purpose... more xterms on a bigger screen. We called Macs "MacinToys" because of their substandard hardware, no multi-tasking, and no command-line to get done what you really wanted it to do.
But all the time during early Windows and Macs, there was a feeling that faster hardware would make the GUI more palatable. And our art-school friends used Windows and Macs regardless, in spite of all the drawbacks we command-line geeks were so well aware of.
Fast-forward to today. Just about every Linux distro boots straight to a mouse-based desktop, and all the admin tools have a GUI. The GUI has won. We are happy to spin 90% of our CPU cycles just to paint the screen, because CPU cycles (and RAM! and storage!) are so damned abundant. Macs, Linux, now even Windows comes with a command-line shell, but how often do we actually use it? Really?
But all the other stuff we invested our time learning and mastering, like partitioning, directory structure, networking, defragging, anti-malware, plug-ins, superior 3rd-party apps, maintenance, maintenance, all the other stuff we have to do for our grandma to keep her PC working OK, is still around. Let's face it ladies, we spend (waste) a lot of time keeping our computers healthy and up-to-date. And we're smug about it.
The future is a computing platform free of all that shit, where all the skills we are so smug about are as obsolete as the command line. That's where Jobs and the iPad are going, and the market for problem-free, geek-free computing is hungry enough to pay a premium for it, even as PC hardware gets cheaper and cheaper, even as we complain about handing control over to Some Corporation. This market is sick and tired of always running to (or paying) people like us for help.
And that's the last frontier, the last bit of value-added left to the computer industry. Intel and the market flourished because MacOS and Windows never ran quite well enough with the CPU and memory available. Now, 3GHz 8-core CPUs with 4 GB RAM are really quite good enough (compare that to your... VAX). But to people who just want to get online or do word processing, there's still a lot of cruft to deal with.
Let's face it... we LIKE that cruft. We LOVE it. But it's also time-consuming, time spent downloading this and configuring that or installing just the right liquid-cooled heat exchanger and on and on until our dream PC is "just right". Jobs and Apple are out to hand out a machine that's "just right" out of the box. And they damn-well don't want third-party plug-ins like Flash i) requiring an extra step before you fully use the Internet, and ii) putting the platform at risk in case Adobe screws something up.
Perfect the turn-key computing device, and Jobs has good reason to believe people will hand over their money for years and years to come.
Because it's the maintenance-free, worry-free, geek-free, turn-the-key experience that Jobs thinks is where the money is. And he's right, just like he was right about the GUI. Geeks like us may want (and pay for) premium hardware, but we'll buy it from Newegg at the cheapest margins possible, and even still, our girlfriends will look up from their iPads with THAT look in their eyes and ask how much longer we're going to need to get our little do-it-yourself project to the level Apple is selling out of the box.
"But mine will be better, once I'm done...", we start to explain, thinking how "closed" and "restricted" that iPad is.
Talk to the hand. While she's Facebooking how immature we are to all her iPad friends, we're all hell-bound to end up like that grumpy old COBOL developer: "In my day, we wrote code in ed, one line at a time, 'cause we only had 1024K in the whole damned mainframe for 85 VT-100's across the whole campus... and we LOVED IT!"
What is that smell? (Score:5, Funny)
Ah, the smell of hyperbole in the morning....
Re: (Score:2)
Re: (Score:2)
I'm not drinking the Koolaid (I don't have an iPhone or an iPad and don't plan to buy any of them in the future), but I can see where the technology is going.
Still, writing your essay on a notebook on a train while going to the Uni seems more comfortable... at least until someone invents a foldable iPad or trains start sporting docking stations embedded in the passenger seats.
Re:What is that smell? (Score:4, Interesting)
I read an almost identically worded article almost 10 years ago when everyone was afraid of Microsoft.
I like my iPhone and will probably buy a few iPads, but we're reliving the 1970's with this stuff. People will get tired of their dumb devices in a few years, just like they hated the rent-some-time dumb terminals, and someone will sell a mobile device in 5-10 years that will run a real OS and not the dumbed-down iPhone OS that's locked to Apple or Google. We are in a flash memory revolution, and in 5-10 years we will be back to more mobile power than we know what to do with, like we hit with PCs around 2000.
People are looking for a good mobile computing experience today and Windows 7 isn't it. Slate looked OK but too slow.
No kidding (Score:5, Insightful)
I see no end in sight for PCs. I see only changes. The biggest change is that hardware has gotten good to the point that you no longer need the best for many things. I mean, time was, computers were slow even for simple stuff. I remember in high school I'd send a document to print and go off to the kitchen to snack while I waited the 10+ minutes it took. The system was just slow. Booting took forever, launching an app could take 30 seconds, etc. Media playback was limited to tiny, postage stamp sized video. Even if you had good hardware, it wasn't good enough.
That's not the case these days. For basic stuff a low-end system works fine. Also, because lithography technology has progressed so much, a basic system can be quite small. Hence a small, cheap thing like a netbook is feasible to make and sell, and quite popular for various things. Still a computer though, and it hasn't killed off other computer markets.
We just don't have a "one size fits all" market, or perhaps more accurately we are now able to make technology good enough to make different kinds of systems for different uses.
The iPad is not the future. The iPhone is not the future. A combination of devices, including ones not yet created, are the future. We do not appear to be heading towards a "death" of normal computers.
26 year old legacy (Score:5, Funny)
Re:26 year old legacy (Score:5, Funny)
Wow... (Score:2, Insightful)
Death of the PC? I don't think so... (Score:5, Insightful)
Re:Death of the PC? I don't think so... (Score:5, Insightful)
Re:Death of the PC? I don't think so... (Score:4, Insightful)
Dell and HP lose money selling PCs. They make money on the services and warranties and crapware people end up buying. Just like Best Buy doesn't make any money on the stuff they sell.
Re: (Score:2)
Compare a business-class graphics card to a gaming card. I have no idea what the differences are (more pipelines, less memory, I have no clue, but I need it for 3D modeling, apparently). 4-year-old cards still go for $300+. Try to pull that off in gamer-land.
Basically what I'm trying to say is that businesses have always been willing to pay a premium for a more reliable box, and I don't se
Re: (Score:2, Insightful)
You understand that PC does not stand for Microsoft, right?
PC = Personal Computer
That would be the hardware, not the software it runs.
Apple is a hardware company, and as more people realize the hardware they 'sell' is at a 300% markup, and it's the exact same hardware that HP sells at a 50% markup... well, yeah, Apple needs a new market to mark up.
Re: (Score:3, Insightful)
I imagine it being similar to Sony and MS selling their game consoles at a loss, just so they can get the customer to buy the content that runs on those platforms.
How To Put Apple Out Of Business (Score:5, Funny)
Offer a phone with a USB port.
I hope this helps the bankruptcy of Apple.
Cheers.
Re: (Score:3, Insightful)
There have been plenty of phones for a while now that have a USB port. The most popular form factor is micro-USB, but it's still USB. It's up to the manufacturers to put compelling software on the phones and for the wireless companies (I'm looking at you, Verizon) to not ruin the experience.
IT Tech POV (Score:2, Insightful)
will we finally get beyond http, then? (Score:5, Insightful)
Half of my users have trouble getting VPN protocols to work reliably over their ISP links. ALL of my users complain loudly when things aren't fast and snappy. I would NEVER put any of these people 'on the cloud', considering one lost packet is enough to get them riled up. It's bad enough that they will complain about new emails not coming in....it would be worse if they can't get to ANY of them when their connection is down.
You can get a lot of power into very small notebooks now.....why go back to a dumb terminal that is dependent upon overloaded Starbucks wifi in order to get ANY program to work?
Desktops may be dying out....but we're not switching the entire world to the cloud anytime soon.
- Eric
Re: (Score:2, Insightful)
For your situation, I'd recommend CSIP (Chainsaw to Idiot People). Seriously, if they're that damned picky, and you haven't snapped yet, kudos to you.
Em's law: Shit happens, and it happens on a regular basis. Prepare for it.
Moore's law (Score:5, Informative)
Moore's law is tapering off? I've heard about the impending end of Moore's law for at least the past 10 years, but it keeps on going. What evidence is there that Moore's law is tapering off? Wikipedia cites Intel in 2008 as predicting Moore's law to continue until 2029. Not an unbiased source, but I think we'd see the end coming if it were to come in the next 10 years.
Re:Moore's law (Score:4, Informative)
What evidence is there that Moore's law is tapering off?
None. It's called Fear-Uncertainty-Doubt (FUD [wikipedia.org]) and is a standard marketing strategy, albeit an unethical one.
And what would it matter if it did? (Score:5, Insightful)
Let's assume we hit the absolute limit. We develop a lithography technique that is as small as possible, and there is no way to do anything on the quantum level. I'm not saying that is remotely likely, just assume. So what? That now means there's no use for anything but an iPad? Hardly. While there's a wide variety of uses for computers these days that require little power, there are plenty of other uses that require more power. Media creation would be a big one. People love to shoot, edit, and distribute video. Wonderful, but you need an OK system to do SD video, and you need a reasonably high-end system to do HD. Video games would be another area. Those modern consoles, including the Wii, have some heavy-hitting graphics hardware in them. Not the kind of thing you pack into an ultra-mobile device.
In fact, if we hit the absolute limit of transistor size scaling, we'd then be at a point where the only way to get higher performance is larger chips, more processors, more power usage. It would in fact be a hindrance to portable devices. The mobile market we have today is possible only because we've been able to scale things down so well. The potential technologies that people talk about for the future in the mobile market will only be possible with more scaling. If we can't build smaller, more efficient chips, well then we'll just have to live with larger devices.
Also just because a market becomes saturated, doesn't mean there isn't money to be made in it. Sure, everyone who wants a PC owns one these days, more or less. It is even getting that way with laptops. So what? There's still a market. As an example, look at TVs. In America we hit TV saturation long, long ago. EVERYONE has a TV, even extremely poor families have a TV. What's more, you can now replace a TV with a tiny device. In theory, a smart phone could replace a TV. Doesn't matter, people don't want to watch TV on their smart phone, they want a 65" big screen TV. Doesn't matter that they could have it more mobile or in another device. They want a bigass TV, so they'll buy one.
Re:Moore's law (Score:5, Interesting)
It's not that Moore's law is tapering off. It's just that machines are so fast that 99% of the population doesn't need it.
Why an i7 when a Core Duo would be just as good? Or an Atom? You don't need a lot of processing power to log onto Facebook, watch a YouTube video, or create and edit a Word doc.
The world is changing, and many people in the field find it hard to believe. Just the other day I heard some tech make a crack about an underpowered Core 2 Duo box... for general office use. A P4 would be perfectly fine for that; a C2D is overkill.
Re:Moore's law (Score:5, Funny)
Don't forget Gates' Law where software speed halves every 18 months.
Re:Moore's law (Score:4, Interesting)
It is tapering off. Any smaller and quantum mechanics comes into play
So obviously this whole concept of parallel computing and multi-core processors has just whizzed right past you, huh? Especially when Intel and AMD are planning 128- and 256-core CPUs for HOME use, and current supercomputers use tens of thousands (or more) of CPUs, thus busting the "we can't get smaller" myth. Yeah, OK, maybe you can't fit more transistors on a 5mm x 5mm chip, but you can fit a LOT of chips in a mini tower case...
Re:Moore's law (Score:4, Interesting)
I won't reply to AC, but I will reply to you.
While you are technically correct that we could do parallel development with any chips and it has nothing to do with Moore's law, in actuality the poster was correct, and here is why: the market.
A byproduct of Moore's law was speed. So when AMD became a player, Intel shifted a lot of its focus from parallel development to clock speed. Increasing clock speed drove the market, not individual clock cycle usage.
Now the number of transistors on a square millimeter of chip has gotten damn hard to double. There are many reasons for this.
Running out of transistor space has finally forced companies to focus on parallel development. I think we would be a decade ahead if AMD hadn't come along... or had marketed parallelism as real performance, and not speed.
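To make the parallel-throughput point concrete, here is a toy sketch (Python standard library only; the burn() workload and task sizes are made-up placeholders, not a real benchmark): even if per-core speed has stopped doubling, independent CPU-bound tasks still finish faster when spread across cores.

    # Toy illustration of throughput from extra cores when single-core speed has stalled.
    # Standard library only; numbers are illustrative, not a benchmark.
    import time
    from multiprocessing import Pool

    def burn(n: int) -> int:
        # CPU-bound busywork standing in for a real per-task workload
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        tasks = [2_000_000] * 8

        start = time.perf_counter()
        serial = [burn(n) for n in tasks]
        t_serial = time.perf_counter() - start

        start = time.perf_counter()
        with Pool() as pool:          # one worker process per core by default
            parallel = pool.map(burn, tasks)
        t_parallel = time.perf_counter() - start

        assert serial == parallel
        print(f"serial: {t_serial:.2f}s  parallel: {t_parallel:.2f}s")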
Re: (Score:3, Insightful)
Privacy (Score:5, Insightful)
Who is this idiot? (Score:2, Insightful)
This guy sounds like a desperate market speculator that has no clue how the market works. The "personal computer" market is just having as rough a time as other markets, but it does not mean that we should just throw our arms in the air and give up. While I have not purchased new PC hardware in four or five years (for economic reasons), it does not mean that I do not want new hardware. Whoever this fucktard is, he needs to keep his stupid opinions to himself.
Yeah, perhaps Apple and HP are looking to switch
Re: (Score:2)
I don't know why they modded you down. What you said is quite practical and on point.
Re:Who is this idiot? (Score:5, Informative)
Whoever this fucktard is
He is a top-tier science fiction author with Hugo and Locus awards, including a nomination this year. Before that, he was a programmer and tech journalist with a monthly column on Linux.
Re:Who is this idiot? (Score:5, Insightful)
This guy sounds like a desperate market speculator that has no clue how the market works... etc etc etc
Whereas you sound like you didn't even read TFA. Or if you did, you don't understand it. Let me break it down for you.
TFA says that there will be relentless downward pressure on computer prices from now on. This point is unassailable.
I can buy a Compaq laptop with a dual-core AMD chip, a great 15.6" display, big hard disk, a DVD drive, and lots of RAM, all for about $400, quantity 1 retail. (Or $370 on sale at Fry's.) I can put Ubuntu on it, and the result is nearly as nice as an Apple laptop. Checking apple.com, I see that I can buy a 13" MacBook for $1000, or a 15" MacBook Pro for $1800. No question, the Apple notebooks are nicer: they have that nifty magnetic power cord, they have slot-loading optical drives, they have the great unibody aluminum chassis, etc. But I have to tell you, if I'm spending my own money, it's going to be that $400 computer, or even a $250 netbook with a 10.1" screen. Does a 13" MacBook really offer me four times the value of a $250 netbook?
TFA says that in the future, Apple is worried that it will be forced to cut its prices and sell at low margins, because the entire PC industry will be forced to cut prices and sell at low margins. I don't see much to debate here either. Here is a quote from TFA:
Is that clear enough for you? PCs aren't going away, but the traditional PC profit margins are going away, and this will cause a shakeup in the PC manufacturing industry. Apple has, so far, managed to make higher margins than the typical 10%, but how long can they continue this?
And what do you know, Apple has successfully set up a whole ecosystem where consumers must go through the Apple App Store to get applications, and Apple collects a 30% cut. TFA says that Apple would do almost anything, maybe even give the hardware away, to get all their customers locked into such an ecosystem.
In short, TFA doesn't say that PCs are going away. It says that PCs are going to be cheap, fast, and ubiquitous, and that companies selling PCs will be forced to accept slim margins. And Apple really doesn't want to play that game. Remember how Steve Jobs dissed netbooks? Apple doesn't want to sell a netbook, or even an iPad, for $250; and the market won't let them get away with selling a netbook for $500. The actual problem Steve Jobs has with netbooks is the razor-thin margins. So far, the market will allow Apple to charge $500 and up for an iPad (although I don't think that can last forever either; see great iPad competitors [slashdot.org]).
TFA isn't the only place I have seen this theory. See also: http://industry.bnet.com/technology/10006035/why-apple-will-eventually-dump-the-mac/ [bnet.com]
Maybe the article is far-fetched. But if Steve Jobs thinks he has any chance at all of locking all of Apple's customers into an App Store ecosystem where Apple skims 30% of all the action, you better believe he will go for it.
steveha
Who's the idiot? (Score:4, Insightful)
Yeah, what does he know about computing and the future? After all, he's just a long-time Linux user, former sysadmin, Perl hacker, and currently a very successful science fiction author. And a very good one at that. IMO, the best current SF writer that I know of.
Remember, remember... (Score:3, Insightful)
...the Y2K bug.
I tend to take any prediction anyone in the computer industry makes with a rather large grain of salt since then.
Particularly the ones relating to "the end of the world as we know it" and similar predictions of global occurrences.
Seeing "END OF THE WORLD!!!11eleven!" not happen before your eyes does that to you.
I remember it (Score:5, Informative)
I remember how much time and money was spent updating software and hardware to deal with it. I remember that despite that there were still glitches.
Re:Remember, remember... (Score:4, Informative)
The Y2K bug was a real issue, and it was fixed due to hard work and money.
No, airplanes weren't going to fall out of the sky. I did witness the spectacular failure of several financial systems I was involved with in 1997.
At the time it was estimated that if it happened in production on New Year's Eve, the system would need to be shut down, the data fixed by comparing it to previous back-ups, and the code fixed, then tested and brought online, and then some people would still be out of luck because they might have deposited money between the last back-up and the failure. So 1 month to 3 YEARS before people could get their money. What do you think would have happened?
My Strategy for surviving is (Score:5, Funny)
PC revolution is almost coming to an end, and everyone's trying to work out a strategy for surviving the aftermath
a sawed off shotgun, lots of ammo, and a Ford Falcon XB Interceptor
Seriously? (Score:2, Funny)
If this guy was any more pro-Apple / elitist, he'd be Steve Jobs' sex slave.
Cloud ? Apple ? Trust ? (Score:2)
oh yea.
http://www.thedailyshow.com/full-episodes/wed-april-28-2010-ken-blackwell [thedailyshow.com]
There's another Daily Show skit on an unrelated subject, actually. I would like to link that too, but it's too long a watch for the punchline at the end, so I will summarize it: "go fsck yourself"
good dept! (Score:3, Interesting)
" moore-money-moore-problems" /. editors!
is a very good gag.
My personal "recent" favorite is "weapons-of-map-reduction" about big table IIRC, but I laugh out loud periodically. Good work
I think someone (else besides me) should put together a list of the best depts and hack some voting software together.
Wrong, but right. (Score:2)
Beyond the hyperbole and the buzzword dropping, he's right.
People are on the move more, and are more connected, than ever. People picking up and moving across the country numerous times is commonplace. Going halfway around the world for whatever reason, even more so. People want their information at their fingertips. The coming cloud, Android, the iPad insanity, Palm, and all. Mobile is the future. Myself, my current desktop is probably the last one I will ever own, save for use as a server. I picked up a 5 y
The PC era is ending? Again? (Score:5, Insightful)
Sorry... Saying that PC's are going to bite it because of the "cloud" is like saying that we have bullet trains now, so you no longer need your car.
(There's your car analogy for those looking for one)
The end of the TV era (Score:5, Interesting)
LOL. End of PC era. Can I have what they're smoking? In a Smithsonian exhibit, I saw a graph of TV ownership in the US. It was a saturation curve, flattening out in the 1970s, IIRC. By then, most people had TVs, and it was just gap filling. I saw the PC ownership curve saturating in the late 90s. By PC, I mean Personal Computer, including Macs.
The point? Companies like Zenith and Sony made money long after the "TV revolution" was over. Better models, ergonomic features, add-ons, incremental refinements, solid state vs. tube, etc.
It's shortsighted to think that we aren't going to continue to have refinements in the PC other than Moore's law related speedups. No, companies like Intel won't be driving huge speculative bubbles anymore; but they won't be going bankrupt either. Just like TV makers, the differentiator will be how well they run their business. It'll be things like customer service, cashflow, etc. It'll be boring business stuff, sorry; but not the end of the world.
Oh, and f*** the cloud. You can have my hard drive when you pry it from my cold dead fingers. Actually, make that my affordable solid state drive. See? Plenty of refinements left in the pipeline.
Re: (Score:3, Insightful)
Apple's grudge with Flash (Score:5, Informative)
No Flash, no cool little applications on your Phone for free... your only source for a quick fix is the App Store.
See this every generation (Score:5, Insightful)
For the past 25 years we've seen these types of predictions. What's being said is nothing new. Just a new surface on an old polygon.
The industry has a long way to go before it is going to die. There's nothing Apple nor anyone else can do that will change things. The industry, in a way, is at fault for any problems being perceived. The constant niggling of customers with tiny incremental changes leads customers to believe that there's nothing happening, and thus to their unwillingness to pay the price for the technology. Make big changes, some radical, such as from the command line to the GUI, and we'll see another 50 years of growth in the PC.
This is more feldercarb by some industry exhaust spewing waste into the ecosystem. They are just blowhards seeking to get you to think that this Apple product is the direction we'll be going. We do not run our computers for gaming, as gaming is secondary. We expect significantly more from our computers than a gaming console provides. We do not do serious productivity work on an iPad or gaming console.
And Moore's law has nothing to do with this. Every time someone says Moore's law has come to an end, we have another go at it.
I think what I'm reading is the younger generation that didn't see the world as it was back then, before computing was involved in every aspect of our lives. These people have a problem with their imagination, and hence their mind is out of focus when it comes to innovation and technology. I'm certain this isn't quite like the music business, where a friend said that the only reason music sucks today is because all the good music has already been made. It's really a lack of vision that drives one to conclude that these cobbled devices are technology's future. They are just a crutch to innovation.
The iPad is a consumer-oriented device (Score:2)
Its entire purpose is to fit into the producer/consumer model, and provide yet another carbon-based audience member to Big Media. Why else would Rupert Murdoch love it? The PC will remain as a more populist, creative device.
FU Marketing (Score:2)
You know, even if one buys into the premise, one does have to wonder about the efficacy of saying things like "If you want pornography, buy a Google phone" or some such. Seriously. How is saying FU to the Flash community, forget Adobe, in any way going to endear people to your position? This is beyond the Field of Dreams "build it and they will come" failed approach so many try. This is religious dogma. As Matt Damon spouts in the movie Dogma, "Do this or I'll spank you." Yeah, right. For a company that f
Attention Charlie Stross (Score:3, Insightful)
Death Of PC Greatly Exaggerated (Score:5, Insightful)
A couple of points:
/. users are way too literal. (Score:3, Insightful)
It appears many of the responders have interpreted the "end of the PC era" to mean that in 5/10/15 years there will be no more PCs. This interpretation is amazingly stupid, and misses the entire point Steve is trying to make.
Steve's point is that particular applications and use cases are moving away from the PC. We watched Netflix and YouTube on a PC in the past because we needed to push out new software to a general purpose platform to support it. But that's not how most users want to watch it. My new TV streams both inside the TV. I'll never watch Netflix on my PC again.
A couple of years ago if I wanted to find a nearby restaurant I would have loaded Google Maps, searched, and clicked around on my PC. Today I take my iPhone off my belt, load UrbanSpoon or Yelp, and get more useful information plus a map I can take with me. I don't search for restaurants on my PC anymore.
People aren't going to get rid of their PCs, and the PC will always be the platform for really new innovation because of its general purpose nature and the ability to run new software. But PCs have effectively saturated the market. Maybe people need a desktop and a laptop, but no consumers need 10, 20, or 50 PCs per person. There is no growth.
But TVs, game consoles, smart phones, tablets, and other form factors are just starting to do interesting things. They are doing them in a more convenient way much of the time, and in a way consumers are more likely to use. I can start a Netflix movie on my TV with 3-4 remote presses. Compare that to 5 years ago, when you had to build a media center PC, hook it up to your TV, deal with all sorts of programs to get content, etc.
Steve's point is that while PCs are 95% of the way people access information today, they will be 50% in 10 years. Not because PCs have gone away, but because there is an explosion in other devices. So if you keep building for the PC, you'll be building for 50% of the market in 10 years. We'll still be doing word processing on a PC with a mouse and keyboard then, but other things will be done elsewhere.
The future is mobile computing (Score:3, Interesting)
We agree that the future will involve something much like a Nokia N900 [nokia.com] with a couple of USB ports on it.
The basic idea is that you get to the office, plug your 24" LCD into the mini-HDMI port on the device, plug your keyboard and mouse into the USB ports and away you go.
Network access would be provided either by wireless or VPN via HSDPA.
Re: (Score:3, Insightful)
Not quite.
Many people still don't feel like having a "cloud" service on the Internet hold the only copy of their documents. The service can and will hold the files hostage if you stop paying, if it goes belly up, or if the government says so. Unlike money, documents don't lose value in a mattress.
Re: (Score:2)
If MobileMe is what they think the future looks like, I'd start shorting Apple stock.
Re: (Score:2, Insightful)
Yup. And I think this article is not at all wrong, except maybe in the timeframe. Sooner or later networks will be reliable and very, very wide. The timeframe for the replacement of local computing by remote "clouded" computing is directly proportional to the value of "sooner or later". The more time networks take to get decent, the more time the PC has.
Now there is an interesting gridlock: network providers are idiot money whores that still want to get dough out of an investment that has already returned them m
smoking (Score:5, Funny)
you (Score:4, Funny)
Re:I can be persuaded by both sides (Score:4, Interesting)
We hear this "everything's already there, been there, done that." But in reality we have a lack of innovation in products and markets because of a rather large monopoly that has stifled competition at every turn, even after being convicted. It isn't that having one OS to rule them all has helped us get to where we are; it is in spite of that that we are where we are. We have continued to penetrate new markets, to educate people, to bring out products such as tablet computing and smart phones in spite of being smothered from the top.
There's an old saying that goes "You can give a monkey a computer and he'll use it but probably just to crack open walnuts." The IT failures I've seen come from a lack of vision, a lack of understanding, and a lack of follow-through. It's like watching someone turning an electric screwdriver by hand because they don't realize there's a power switch.
It's a false line of reasoning to say "Just because I can't think of a better way nobody else can, either." But it's really hard to improve on what we've got. Look at the mouse. I can make a lot of complaints about it but have we yet found an input tool to make the mouse completely a thing of the past? No. It's just like we haven't really found a good replacement for the keyboard. People keep trying but I think it's safe to say the computers of the next decade will come with mice and keyboards.
We're going to be going through a system upgrade at my job. The old system is pretty crappy, no argument there, but we're still not even using it properly. Back to what I said above, failures in vision and understanding. I'll do my best to see that we can make a change of it this time but we're likely to be back to using the system to a fraction of its full ability.