Is Apple Pushing Away Professionals?
Barence writes "Is Apple turning its back on professional users to focus on consumers? That's the argument in this article, which claims Apple is alienating the creative professionals who have supported the company for 20 years or more. Fury over the dumbing down of Final Cut Pro, Apple's refusal to sell non-glossy screens and poor value hardware is fueling anger from professional Mac users. 'People will get hacked off. I'm only Apple because I want the OS, but if I could come up with a 'Hackintosh' with OS X, I'd be so happy,' claims one audio professional."
Define professionals? (Score:2, Insightful)
But if you mean image/video field workers as professionals, then you're probably right.
Apple's product lines are just following the industry trend of consumerization and becoming more targeted at home users rather than enterprises (which they were never targeting to begin with).
Re:Define professionals? (Score:3, Insightful)
I have never understood (Score:1, Insightful)
Why "professionals" love this moving target Apple presents. It's nothing new, and it seems like every time Apple farts you have to reinvest in all-new software and sometimes hardware. It just doesn't seem "professional" to me ...
Research (Score:4, Insightful)
Looks like the author has done only superficial research on some aspects.
For example, 3ds Max is a Windows-only application, but it's far from the only major application in this sector. LightWave licenses are less expensive, there's a Mac version as well, and right now its feature set runs circles around Max's. And that's coming from a long-time Max user.
It's one of the major applications in the business, but far from dominating.
CAD is mostly done on Windows and *nix, but that's partly for historical reasons (code bases which have grown over decades in some cases).
Part of the problem is also the specialized hardware support on the Mac platform. You just can't expect an overpriced two-year-old entertainment graphics card to beat the results professional graphics software will achieve on a Quadro or FirePro with optimized drivers and certified compatibility. That's like expecting an AMC Gremlin to beat a well-tuned Formula 1 racer.
worse than microsoft (Score:3, Insightful)
At least Microsoft targets business users as well.
However, if this trend continues, and other companies follow Apple in targeting the average Joe, then I foresee a sad future, where devices are locked down, professionals pay big bucks to get the tools they need, and universities and open source developers can't get hardware they can freely develop on.
This is a software thing (Score:5, Insightful)
First off, Apple still offers anti-glare displays as an option on ALL their MacBook Pros, so the rant about not offering matte displays is completely off base. In fact, I'm writing this post on a late-model MacBook Pro with an anti-glare screen, and a quick glance at the store shows the option is still available.
The real ire is over the SOFTWARE, namely the utter fiasco that is Final Cut Pro X. But this is a well-known issue, and Apple has tried to smooth things over a bit by letting people DOWNGRADE to the last version. So Apple is well aware of how badly it messed things up, and given that Final Cut has been a huge success until now, it stands to reason that Apple will not make the same mistake twice and will release a new version that addresses its users' concerns. While that is mere speculation, seeing how much money FCP has brought in and how much hardware it has ended up selling for Apple, it stands to reason that they will not stand idly by while their egg-laying goose dies a painful death at the hands of an angered user base.
Also, Apple is more reliant upon developers now than ever. Those trendy consumer gadgets such as the iPhone and iPad require a strong developer base, and they require those developers to work within OS X and with Apple's tools; even Flash Builder and Titanium require Xcode to do the compiling. So driving away your development community would make no sense, since that would only boost rivals creating apps for other products such as Android phones and tablets.
Apple is trying to normalize the look and feel of its two operating systems, iOS and OS X, to make them not only easier for consumers to use but easier for developers to build for. OS X Lion, while causing ire for its sweeping UI changes, now features a lot of the same elements as iOS -- which from a UI development standpoint simplifies the development process.
So in the end, time will heal these wounds. Give it a few more months and see what the upcoming release of FCP has to offer its core user base, as well as how iCloud and iOS 5 reshape how users and developers interoperate with OS X and iOS-based devices. I think then a lot of these changes will make sense, and some of the shock at these changes and the handful of missteps will die off.
Re:Don't get it (Score:5, Insightful)
The high end of the "Pro" market is touchy because they tend to depend on fairly large tangles of interconnected products: If asked "what do you use?" they might say "Final Cut"; but they actually mean "Final cut, two dozen specialized plugins, one or more boutique hardware components for capture or output, some sort of storage backend, possibly some in-house custom tools...".
One of Apple's strengths, particularly of late, has been their ability (and willingness) to just pick up and say "fuck everybody who thinks some legacy feature/interface/API is good enough. As of today, it is the new shiny or nothing!" (see ADB, Adobe/64-bit Carbon, Final Cut Pro, etc.). Combined with some good taste, this has worked very well in the consumer and low-end "prosumer" markets. By largely ignoring legacy issues and expecting people to keep up or suck it up, they've been able to maintain a pretty aggressive release schedule for new and interesting features with a comparatively small engineering team. However, that is absolutely incompatible with the requirements of more esoteric professional environments (along with institutional IT, their less colorful but considerably larger counterparts). You just can't keep a spaghetti ecosystem of critical 3rd-party hardware and software moving that fast, at least not at a price anybody is willing to pay. (Even fairly basic things, like supporting pro-level video cards, can be pretty dire, despite the fact that the Mac Pro is more PC-like than it has ever been. The default options suck to an almost comical degree, and driver support for anything else is atrocious.)
For consumer and prosumer requirements, where it is much more likely that the integrated hardware and a small number of common software packages are enough, Apple's approach works just fine. It seems unlikely, though, that they can reconcile that with the requirements of the more specialized users. And, now that they have a big, lucrative, consumer market, their incentive to try isn't what it once might have been.
Their stupidity. (Score:1, Insightful)
The only viable solution to the problem is to buy a screen from a third-party manufacturer. “I want to see only the images and applications I’m using, not reflections of the room around me, and I often look at the screen for up to 16 hours a day,” says photographer Bill Wisser. “Recently, I bought $7,000 of computer equipment, including a new eight-core Mac Pro and a new 30in monitor – a Dell.”
He could have just bought much better PC equipment and an even bigger monitor with a whoop-ass budget like $7,000. He chose to buy Macs, and he suffers for it.
Stupidity.
Re:Define professionals? (Score:3, Insightful)
Ever since the iLine, and Steve Jobs turning from a benevolent genius into a narcissistic, goose-stepping lunatic, the scene has changed to Apple being creative, and you can be too, just as long as you're creative in the "Apple"-sanctioned way.
When it comes to the iSheep (read: the great unwashed, non-technological masses), that's exactly what they want. They want to be "creative" by proxy. They get to taste genius, and all they have to do is spend daddy's money.
Unfortunately, the creative types don't need restrictions. Yeah, sure, being creative with IBM used to be like trudging through the mud, and at the time, using MacOS felt like ice skating in comparison. But now, you MUST do what Apple says, or you're toast. Dumbing down interfaces to conform to the masses is merely one facet of it. The uber-ban on Flash is another. And by the way, think what you like about Flash, but the bottom line is that you're having your technology choices dictated by a company.
For all these reasons, I have left apple. I refuse to buy an Apple product anymore because I am smart enough to make my own choices, and unfortunately, the solid brick that is Apple is *almost* what I want, but since I can't customise it at all, it is *entirely* not what I need.
Good luck with the mass market, Apple, but I am not the only professional who's sick and tired of being corralled into your line. The bitten apple used to be a sign of the rebels; an homage to the greatest rebel computer scientist in history, Alan Turing, who cracked the Enigma codes through the sheer might of his intellect and was then crushed by the same English government he had saved, because, unfortunately, he was gay. Faced with the choice of imprisonment or chemical castration, he chose the third route of suicide. As he adored the fable of Snow White, he dipped an apple in poison and took a bite.
THIS was once the spirit of Apple Inc. Shame on you for losing your way.
Re:Some "Professionals" Aren't (Score:2, Insightful)
Re:Define professionals? (Score:5, Insightful)
THIS was once the spirit of Apple Inc. Shame on you for losing your way.
Reminds me what Jason Newsted said, when asked for his response to people saying Metallica had sold out: "Yeah we sold out. We sold out every arena we played for the last five years."
Until the general public stops eating up every single thing they produce, it will never change. They make far too much money to give half a crap about the loyal customers that kept them viable before the iCrap era. They'd rather you just shut up and keep buying those iPhones/iPads. It's sad, but true.
Re:This is a software thing (Score:4, Insightful)
As someone who makes their living doing Mac and iOS development, I'd like to point out the flip side to the story of rejections and the like. For every very vocal person complaining that their app was rejected, or that they can't figure out how to install an app on their phone (as a registered developer, I have no problem re-signing an .ipa with my own keys and installing it on my devices, without the App Store or jailbreaking; enterprises have even less of an issue), there are probably a hundred of us who are making our livings at this and not running into any of these issues.
That's not to say there's no truth to it either. But on a day-to-day basis, the irritations I'm having with Apple as a developer are not any of these things.
What the hell are you talking about? (Score:4, Insightful)
Name a single thing you used to be able to do on Mac OS X that you can't do anymore. They fumbled the new Final Cut Pro release -- and they're trying to recover from that now. There is absolutely nothing else you can point to. You can still run Flash on OS X.
The 'iLine' is a new line of products specifically targeted at the handheld/mobile market. It has different constraints and calls for a different solution. In case you haven't noticed, they're doing pretty well. Millions of people who otherwise wouldn't be using smart devices now are, and it hasn't prevented anyone from doing anything they could do before on Macs or any other kind of computer. If you think there is something bad about a type of technology just because it is aimed at non-technical users, then you just flat out do not understand the point of technology. Like many other so-called nerds on this forum, you think the point of technology is to create some sort of exclusive club with a sign out front that says "you must know *this* much about tech to enter".
BTW: if you are naive enough to think that the absence of web standards leads to a better, more democratic internet, then you are a lost cause.
Nobody cares that you are having some sort of one-sided feud with Apple. And what the hell is your deal with Turing, anyway? Did you just watch some documentary?
Re:Define professionals? (Score:4, Insightful)
The fact that the X button sometimes closes the application, and sometimes leaves the application running without a UI, is also bad. The green + button shrinking the window is a poor UI choice. The list goes on and on.
It is just strange that the UI gets held up as Apple's triumph, when the UI is subpar and the good parts of OS X are under the hood.
Re:silly (Score:3, Insightful)
What dumb ass "creative professional" does all their work on a laptop screen?
What dumb ass 'technical know-it-all' doesn't understand the value of having a portable workstation?
Re:What the hell are you talking about? (Score:5, Insightful)
Re:Define professionals? (Score:4, Insightful)
With Windows or common Linux desktop environments when you maximize a window the window takes up the full screen, regardless of whether there's enough content in the window to fill the whole screen. This often leaves vast areas of white space on the sides and bottom of the window.
On the Mac, the green button zooms the window to be big enough to see everything that's inside the window, and if you click it again it just returns to the size it was before. The maximize button to exit full screen in Windows behaves *identically* to the green button in OS X when exiting full screen. It returns the window to a user-determined size that doesn't necessarily show the full content of the window. Your lack of understanding of it doesn't make it bad design.
In the same way, having the icons on the right side makes more sense, because normally the windows cover up the space on the left. When I hit the green button, I can see 1. All of the content of the window, 2. the icons on my desktop, and 3. the windows behind my front window. How exactly is having vast areas of white space within the front window better than being able to still see the full content of the front window but also being able to use your much-valued screen real estate to see other things in the unused space around and behind it?
The X button closes the application if the application is only capable of having one window (like utility programs), and closes just the window if the application is capable of having multiple windows. This makes it so you don't have to wait for a whole application to relaunch if you accidentally close the last window. But most Mac users know that you can hit Command-Q to completely close a program (which is the functionality you're claiming OS X doesn't have) or Command-W to close just the window. It's interesting that you'd deride OS X for the fact that Windows lacks that granularity of function.
Re:Define professionals? (Score:3, Insightful)
Fanboys justifying fanboys. Nice.
That would cost about $5,000 without the picture of the fruit on the side.
"lol look we're professional! it has FIBER NETWORKING!" Hahahaha, Jesus, you seriously need to check into Apple rehab.
Re:Define professionals? (Score:4, Insightful)
You seem to have gotten the whole point of the menu-on-top thing wrong. It's not a matter of maximizing usable space in a low-res environment; it's a consequence of Fitts's law: the time to acquire an on-screen target grows roughly like log(distance/size). If the menus were on the windows, the distance would be smaller, but by putting the menus against an edge of the screen, you can't overshoot toward the edge, so the target's effective size is infinite. There's nothing bad about that UI.
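The trade-off can be sketched numerically with the Shannon formulation of Fitts's law, MT = a + b * log2(D/W + 1). The constants and pixel figures below are made-up illustrative values, not measurements; the point is only that a huge effective target depth at the screen edge outweighs the longer travel distance.

```python
import math

def fitts_mt(distance, width, a=0.0, b=0.1):
    """Movement time in seconds per the Shannon formulation of Fitts's law.
    a and b are device/user constants; the defaults here are illustrative."""
    return a + b * math.log2(distance / width + 1)

# Menu attached to the window: close by, but a small (say 20 px deep) target.
in_window = fitts_mt(distance=100, width=20)

# Menu at the screen edge: farther away, but the cursor pins against the
# edge, so the effective target depth is effectively very large.
at_edge = fitts_mt(distance=400, width=2000)

print(f"in-window menu: {in_window:.3f}s, edge menu: {at_edge:.3f}s")
```

Even with a fourfold travel distance, the edge menu wins under these assumptions, which is the Fitts's-law argument for the Mac menu bar.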
Also, leaving an application running without a UI is a perfectly reasonable idea once you dissociate applications and windows -- in fact, there are many, many programs that practically demand it (mail, BitTorrent, IM in general). The distinction also exists in the Windows world, except that Windows handled it in a completely hackish way until Windows 7: the system tray icon. Windows 7 finally supports windowless applications properly, though that's a recent advance and many applications still don't take advantage of it. I honestly can't think of any non-X11 applications that actually close completely when you close the last window.
All in all, though, I agree: the best part of modern Macs is what's underneath the hood. They're effectively the only consumer-grade machines in the market that are purpose-made to run Unix.
Re:Audio Pros are so silly. You dont need a Mac! (Score:4, Insightful)
Audio Pros are all snobs.
Well, that's a problem with elitism in general, and is hardly limited to Apple users, but they're guilty of it. In the real world, a dispassionate evaluation of one's own requirements generally results in better purchasing decisions, a close match between work requirements and the equipment meant to service them. That's one complaint I have with the Apple-using community: they tend to see all problem domains as having the only solution in terms of Apple. When your only tool is a hammer ... well. The world of computing is vast, the needs of users varied, and the products of one single company cannot reasonably be expected to serve the needs of everyone.
The other aspect to that mindset is the ability to rationalize away faults and missing capabilities. Blows my mind. I've had more than a few conversations with Apple users that usually run along these lines:
"How come your nav is still talking? You're playing an MP3 and browsing the Web."
"Multitasking."
"Huh. Well, mine doesn't do that ... but why would you want to?"
"???"
Yes yes, I know I'm talking about an early iPhone, that's not the point. I'm talking about attitudes here, not the hardware.
Re:Define professionals? (Score:4, Insightful)
If change is good, then what about the ability to change a battery, or a software configuration? If I weary of SuSE 11.4, I can change distros. How does one do this on an iPad?
I dislike such appliances not because they represent change, but because they prevent it.
Re:What the hell are you talking about? (Score:5, Insightful)
Because FCX won't ever catch up. It's a question of scale.
The old versions of FCP are designed to allow teams to work on projects. The new software is designed to be used by a single user. If only one person at a time is editing, the new version may well be better than the old version. That workflow matches how a huge number of people work, so it makes sense for Apple to focus on that market. From amateur home user to professionals working on smaller projects, Apple is moving in the right direction.
For the broadcast market, it's the wrong direction. If your work scales beyond one user per project, it's time to move on. Apple makes high margins on consumer electronics, lower but OK margins on home computers, and not much at all from businesses or government sales. Apple is going to focus on the market segment where they make higher profits, not the niche market with high sales and support costs.
At one time, if it had an engine, Ford and GM made it. Ford sold tractors and airplanes. GM sold buses, locomotives, and heavy trucks. Those markets are willing to pay a higher initial price for products which last a long time and can be repaired and rebuilt over and over. The market for cars is different. People will junk cars after 10 years if they get a lower price up front. Consumers don't see cars as an investment used to make money; cars are an expense. Make it as cheap as possible and sell me a new one every couple of years; driving the latest model impresses people. Ford and GM still sell light trucks, and probably always will. But they got out of those other markets. Some of the technology may be the same, but each market demands a different set of trade-offs, a different way of doing business. It's easier to structure your business around one large market than to try to do everything.
Apple sells to consumers. They're good at it. If they sold vehicles, they'd sell cars. If you need the equivalent of a van or pickup, Apple is still in that market. But they won't, can't, scale up a pickup to a tractor trailer.
Re:Define professionals? (Score:3, Insightful)
The fact that the X button sometimes closes the application, and sometimes leaves the application running without a UI is also bad.
Why is it bad? It's a developer's choice to do whichever is more appropriate for the app. On Windows an app MUST close when its last window closes, unless the developer puts it into the system tray.
The reasoning for leaving the app open when a multiple-document app has its last window closed is straightforward. It's a common usage pattern to finish working on one document and then start working on another. If apps quit when the last window closes, then this happens: the user closes the first document, and the UI to open the next document (File/Open) disappears. They then have to restart the app, which involves waiting, before they can open their next document.
But for apps which are not document based, that argument doesn't apply. Closing the window on a single window app really does mean you've finished working with that app for the time being.
Then there are other reasons for choosing one behaviour or another. If an app does useful work even when there are no windows, then of course it makes sense to keep it open. iTunes is an obvious example.
There's a reason why Mac developers have this choice and Windows developers don't get it (apart from the system tray utility option): with Windows, the disappearance of the last window means that access to the menu has also disappeared. That's not the case with the Mac.
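The relaunch cost in that edit-one-document-then-the-next workflow can be sketched with a toy model. All of the names below are hypothetical; this is not any real windowing API, just a counter of how often the user waits for a relaunch under each close policy.

```python
# Toy model of the two window-close policies described above.
# Hypothetical names -- not a real windowing API.

class App:
    def __init__(self, quit_on_last_close, launch_cost=1):
        self.quit_on_last_close = quit_on_last_close
        self.launch_cost = launch_cost  # arbitrary "waiting" units per launch
        self.running = False
        self.windows = 0
        self.waits = 0                  # accumulated relaunch waits

    def open_document(self):
        if not self.running:
            self.waits += self.launch_cost  # user waits for the app to launch
            self.running = True
        self.windows += 1

    def close_window(self):
        self.windows -= 1
        if self.windows == 0 and self.quit_on_last_close:
            self.running = False            # Windows-style: app exits

def edit_documents(app, n):
    """Open, work on, and close n documents one after another."""
    for _ in range(n):
        app.open_document()
        app.close_window()
    return app.waits

mac_style = edit_documents(App(quit_on_last_close=False), 5)  # waits once
win_style = edit_documents(App(quit_on_last_close=True), 5)   # waits 5 times

print(mac_style, win_style)
```

Under these assumptions the keep-running policy pays the launch cost once across five documents, while the quit-on-last-close policy pays it every time, which is the usage pattern the argument above describes.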
Mac applications act this way due to legacy decisions made for the original circa-1984 Mac, not because it's the right way to do things. At the time, it took the Mac a long time to start applications, and Apple chose this behavior to make the computer more responsive when opening new documents. Nowadays, documents open much more quickly and this behavior is no longer required. Personally, the behavior drives me nuts, because when I click on a running app in the Dock that has no open windows, the program doesn't do anything. It should, at that point, actually respond: open a new project, give me a file-open dialog box, anything but sit there looking pretty. Programs that do something useful in the background with no open files are few and far between. If a program has something useful to do in the background, then it should be implemented as a lightweight daemon rather than a full-blown app like iTunes.
The other issue with this behavior is that it is not easy to tell at a glance what programs are running. The strength of the Windows taskbar is that it clearly separates running programs from application launch icons. Certainly, this is a matter of what people are accustomed to, but for myself, and I think for many people who are used to Windows, this is infuriating.