
OS X 10.9 Mavericks Review

An anonymous reader writes "John Siracusa at Ars Technica has put together a comprehensive review of Apple's OS X 10.9 Mavericks. This is the first time a major OS X update has been free, and it works on any device that supports Mountain Lion. This suggests Apple is trying to boost adoption rates as high as possible. Siracusa says the following about Apple's move away from skeuomorphic design: 'Mavericks says enough is enough. The leather's gone, the fake pages are gone, the three panes are independently resizable (more or less), even the title bar is bone-stock, and it's boring?' On the other hand, he was a big fan of all the internal optimizations Apple has done, since the energy savings over Mountain Lion are significant. He found a 24% increase in his old MacBook Pro's battery life, and a 30% increase for his new MacBook Air. He also praised the long-needed improvements to multi-monitor support: 'Each attached display is now treated as a separate domain for full-screen windows. Mission Control gestures and keyboard shortcuts will now switch between the desktop and full-screen windows on the display that contains the cursor only, leaving all other displays untouched.' The 24-page review dives deeply into all the other changes in Mavericks, and is worth reading if you're deciding whether or not to upgrade."
  • by Anonymous Coward on Tuesday October 22, 2013 @06:37PM (#45207651)

    Apple has really fucked up big time on 10.9.

    Basically, the sRGB spec is no longer sRGB, and colour-managed applications that use ColorSync are completely hosed. Almost everything is more saturated than it should be. Towers of bug reports have been filed on this alone and absolutely nobody has received a response from Apple, which makes me think it's some retarded "stylistic choice" of theirs to literally try and make the OS "look better" (it doesn't).

    So, basically, if you rely on OS X for colour accurate work, you're totally fucked.

    • If they don't fix the color issues and piss off the graphics/Hollywood crowd, they'll lose the constant free advertising, and that's not going to help the bottom line. They'll need even more "Apple's CEO just sneezed, is that a hint at iTissue" journalism, and I don't know that it's actually possible.

    • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Tuesday October 22, 2013 @07:23PM (#45207951) Homepage Journal

      Have a link? I'm not readily finding anything but I'd be interested in reading more.

      • by Anonymous Coward on Tuesday October 22, 2013 @08:15PM (#45208249)

        http://forums.macrumors.com/showthread.php?t=1651041
        http://forums.macrumors.com/showthread.php?t=1649988&highlight=saturation

        You can actually see the difference in the Ars Technica article just from the screenshots (which likely means it's intentional, since you can screenshot the issue and clearly see it in the pixel colours). Look at the icons closely, and you'll notice that the majority of them seem darker and more saturated than normal. I'd link you to the ADF forum discussion about this exact same issue, but that's kinda pointless since you'll need an ADC account to view it.

        We've got a whole bunch of ultra high end Eizo monitors in the office that do self calibration and colour correction inside the monitor itself. These units are all configured to accept a straight sRGB IEC-61966-2.1 colour space, and nothing else. Since the monitor ASIC handles the calibration & correction for the panel, there's no need to use ICC profiles if you don't want. We've found this to be an insane boon when you're targeting the sRGB colour space for mobile app development and graphics design (where sRGB is basically the safest space to target if you want it to look decent on any handheld).

        Anyways, under 10.7 and 10.8, setting up OS X to use the sRGB IEC-61966-2.1 colour space resulted in a pretty perfect image on the monitor (which was configured for the sRGB colour space "mode" and self-calibrated). No problems there, or with any of the Cocoa APIs, or OpenGL stuff.

        Under 10.9, everything is basically "fucking whacked" (according to our IT guy). About 60% of the Mac OS X UI doesn't adhere to the sRGB spec anymore, in that if you have an ICNS file that was generated from sRGB source material, it is no longer displayed as straight sRGB in the Aqua UI - it's being tinkered around with by Apple's bug and/or design decision. A lot of stuff being displayed through NSImageView is totally hit and miss as far as the colours go, even with an sRGB monitor profile (this is even worse on Apple's own computers that use LCD panels which are somewhere in-between a wide gamut and sRGB... The colour variances I've seen on our office laptops running 10.8 and 10.9 side by side are unbelievable). Even OpenGL is hit and miss now - before, everything seemed to be uncorrected (which was fine; applications could implement colour management themselves if they wanted), but in 10.9 it seems like some stuff is completely whack and other things almost look partially colour corrected depending on the monitor profile. We think this is due to the GPU drivers and brand, but nobody knows for sure.

        In a nutshell, things are NOT as they should be.

        1) Their Aqua UI should assume that input images are in the sRGB colour space, and display them as accurately as possible according to the monitor profile
        2) NSImageView & friends should do the same thing for data sources that have no associated colour space (see the sketch after this list)
        3) OpenGL should preferably be totally uncorrected, since anything else would be totally ambiguous and up to the manufacturer
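
        To make point 2 concrete, here is a minimal sketch of the behaviour being asked for, using the plain Core Graphics C API (tag_as_srgb and its "untagged" argument are placeholder names of mine; the CG* calls are standard Core Graphics): explicitly tag an untagged bitmap as sRGB so ColorSync matches it to the monitor profile instead of guessing.

        #include <ApplicationServices/ApplicationServices.h>

        /* Re-wrap an untagged CGImage with an explicit sRGB colour space.
         * The pixel data is not converted, only tagged, so ColorSync will
         * treat the bytes as sRGB and match them to the monitor profile. */
        CGImageRef tag_as_srgb(CGImageRef untagged)
        {
            CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
            CGImageRef tagged = CGImageCreateCopyWithColorSpace(untagged, srgb);
            CGColorSpaceRelease(srgb);
            return tagged; /* caller releases */
        }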

        Our six 10.9 pilot systems were recently reverted to 10.8, which still has horribly broken colour management... BUT, at least on 10.8, if you tell it to output sRGB then that's precisely what it does (and this works well with our Eizo monitors). 10.9 seems to take this all one step further in that they fuck around with anything and everything at will, and it's just a complete nightmare to deal with as a user.

        TL;DR: it is very evident Apple has no clue what they're doing with regard to colour management. This is becoming more and more apparent with each release of OS X.

        • What is so wrong with UI widgets not adhering to sRGB? So long as content is displayed pursuant to the specification, does it really matter?

        • We've found this to be an insane boon when you're targeting the sRGB colour space for mobile app development and graphics design (where sRGB is basically the safest space to target if you want it to look decent on any handheld).

          I hope this isn't a silly question, but why on earth do you care about accurate colour matching on mobile devices? Given that they have screens of very variable quality and no decent colour accuracy themselves it seems that putting much effort in will be wasted.

          • by tlhIngan ( 30335 ) <slashdot&worf,net> on Wednesday October 23, 2013 @10:36AM (#45212863)

            I hope this isn't a silly question, but why on earth do you care about accurate colour matching on mobile devices? Given that they have screens of very variable quality and no decent colour accuracy themselves it seems that putting much effort in will be wasted.

            Just because you use Android doesn't mean people don't care.

            The iPhone 4s was about 90% sRGB (mostly due to a faulty blue filter that lets in a little green), while the iPhone 5 (and the 5s, 5c, and associated iPods) is actually a little over 99% sRGB. And Apple calibrates every display as it comes off the line. Tests done on the displays have shown excellent calibration with very little variability between devices.

            While some Androids have better screens, the AMOLED ones, especially Samsung's PenTile variants, tend to be far worse - the OLED display is nice but oversaturates for the most part. LCD Androids may or may not be calibrated as well - some devices exhibit such wide variations in color accuracy and error that they're effectively uncalibrated screens, while others are calibrated to an extent during manufacturing (usually the flagships).

            Modern smartphone and tablet displays are a far cry from early mobile LCDs - they're often very good (especially Apple displays - if you need color accuracy on a portable, you're pretty much limited to Apple) and people do expect their photos to be somewhat like reality.

            If you want to see what crap looks like, check out a cheap digital photo frame, then look at a modern smartphone or tablet display and you'll find they're much nicer.

      • by Alan Shutko ( 5101 ) on Tuesday October 22, 2013 @08:26PM (#45208313) Homepage

        All I can find is this in the Apple Dev Forums (login required) [apple.com]. It seems that certain people in a workflow without a monitor color profile are seeing that images without embedded profiles look different. This does not appear to be a problem in a workflow where you regularly profile your monitor (and in fact, I don't see a problem).

        So, if you depend on OS X for color accurate work, and if you are working exclusively with untagged images that are to be assumed to be sRGB, and if you have a monitor which does its own sRGB calibration and you're depending on the bits from the image being sent directly to the monitor without adjustment, then you might see problems. I don't know how big of a community that is.

    • That's one of the first things I noticed. The strange thing is I noticed the same process in reverse when I switched to Macs back in like 2003. The Mac's color balance had a whiter look and Windows was more contrasty.

      After I upgraded, at first I assumed it had deleted the calibration profile, and I ended up going through the whole monitor calibration process only to end up with something close to, but not exactly like, what I started with - and not quite how it looked under Mountain Lion either. It doesn't really bug
    • Well, after updating, I had three applications that no longer work. I'm not sure what they deprecated in the new OS, but it's getting annoying to lose applications after every OS X update.
      • by jbolden ( 176878 )

        10.6 -> 10.7 was much worse. Get used to it; Apple is picking up the pace for developers, not slowing it down.

  • Enough already! (Score:2, Interesting)

    by mothlos ( 832302 )

    Here we have Soulskill yet [slashdot.org] again [slashdot.org] trying to act like skeuomorphic artistic design is some sort of big, bad thing which we should be concerned about. This is not an important issue in human interface design. This seems to be some sort of pet peeve lens which Soulskill keeps bringing up. Skeuomorphism may bother designers who don't want to be tied down to designs based on mid-twentieth-century conventions of office life and people who demand every last pixel of their screen be useful for them. Well, it may even

    • Re: (Score:2, Interesting)

      by drinkypoo ( 153816 )

      It is an important issue. It's not the end of the world, but it's dumb to waste screen real estate on gewgaws to make the interface look like something from yesteryear to which it is superior. And notably, the world already rejected these ideas back in the classic MacOS days.

    • Re: (Score:3, Interesting)

      by phantomfive ( 622387 )
      From what I've seen, the anti-skeuomorphic hatred started with WinPhone 7 users desperate to find a way their phone was superior to the iPhone. They tied it to the idea that WP7 was a unique UI (and it was nice, but not as nice as the Zune HD, and not amazingly original). After that some Android users jumped on the bandwagon, also wanting to feel superior. Some iPhone users started to feel bad about it.

      Skeuomorphism is just a thing: if done right it is great; if done poorly, it is bad.
      • Re:Enough already! (Score:5, Insightful)

        by UnknownSoldier ( 67820 ) on Tuesday October 22, 2013 @07:28PM (#45207987)

        > Skeuomorphism is just a thing: if done right it is great; if done poorly, it is bad.

        As a 3D, UI, & UX expert I concur 100%.

        Skeuomorphism is like spice. A little kicks it up a notch. Not having any is TOO plain; having too much is worse than not having any.

        IMO the BIGGER problem is OS X 10.9 and iOS 7 completely desaturating and removing all 3D shading -- THAT is the hideous UI crime. The UI designers should be forced to use Windows 1.x for their stupidity.

        • Re: (Score:3, Interesting)

          You would not pass any computer art class today with that attitude.

          Every professor out there has been teaching that this is the way to go and flunking those who do these outdated 20th-century things. Unfortunately, this trend is like post-impressionism, which once became popular because it was heresy to do art any other way. These new students are landing jobs at companies like Apple and Microsoft. Simple color is it.

          • Re:Enough already! (Score:5, Insightful)

            by fyngyrz ( 762201 ) on Tuesday October 22, 2013 @10:55PM (#45209039) Homepage Journal

            Any "computer art professor" that teaches which style is "superior", as opposed to "how to do" any style you are tasked to implement, isn't worth the time spent with them.

            The issue of replicating physical interfaces is not, and never will be, cut and dried. Some physical interfaces are highly refined and functional, and abandoning them leads to problems (look at a modern audio system as compared to, for instance, a late 1970s Marantz. Now try to turn up the midrange, or route one recording input to a recording output, assuming your modern hardware even has them.)

            There are some excellent UI design guidelines out there. Like: don't constantly show and hide interface elements; it fouls up muscle memory. But "bury everything in menus" is a total newbie suck move, and "remove all familiarity" (which is what the rabid anti-skeuomorphism folk are saying, really) is also a suck move.

            Change and so forth in moderation, see?

            • Re: (Score:3, Insightful)

              by AmiMoJo ( 196126 ) *

              and "remove all familiarity" (which is what the rabid anti sku folk are saying, really) is also a suck move.

              The problem with skeuomorphism is that the familiarity is often misleading or at best limiting. People experience something like this when they go to a foreign country. Things look similar superficially, but are subtly different and disorienting.

              For example a skeuomorphic address book would look like an actual book, but not really work like one. You can fold the corners of real pages down to act as bookmarks, then turn the book sideways to find them. You can't search a real book by entering search terms, so

              • by fyngyrz ( 762201 )

                The problem with skeuomorphism is that the familiarity is often misleading or at best limiting

                "often" is not "always", and that destroys the argument against skeuomorphism without even requiring you to prove your assertion of "often." Sorry, but it's BS and it's been BS all along. Familiarity can be a great thing, a significant assist into the how and why of something. Radio dials. The play, pause, rewind, record, FF, and dub interface of tape machines. The phase display of a radio-teletype scope. The hand

                • by AmiMoJo ( 196126 ) *

                  Radio dials. The play, pause, rewind, record, FF, and dub interface of tape machines.

                  Interesting examples. The play, rewind and FF buttons are of course simple arrows and a double bar symbol whose meaning can only be learned. Dials make sense on machines with linear tapes, but on computers it is usually possible to seek directly to where you want to go. Such interfaces were chosen largely because of the limits of technology and low cost manufacturing, rather than because they were good.

                  Hands on a clock are another excellent example of how skeuomorphism fails. They were developed because cl

              • Off topic ...

                Holy crap, I thought your .sig was a joke ... "const int one = 65536; (Silvermoon, Texture.cs)" but sure enough it is real.

                https://silvermoon.svn.codeplex.com/svn/Silvermoon/Silvermoon/OpenGL/Texture.cs [codeplex.com]

                const int one = 65536;

                Sad that the noob programmer couldn't even use a descriptive name for texture coordinates in 16.16 fixed point format!
                i.e.

                private int[] textCoords = new int[] { one, 0, one, -one, 0, 0, 0, -one };

                instead of using whitespace and columns for alignment to make it m
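
                Something along these lines, say, in plain C (the names are made up for illustration, not taken from the Silvermoon source):

                #include <stdint.h>

                /* 1.0 in 16.16 fixed point: 16 integer bits, 16 fractional bits. */
                #define FIXED_16_16_ONE (1 << 16)

                /* Quad texture coordinates in 16.16 fixed point, one (s, t) pair per row. */
                static const int32_t texCoords[8] = {
                    FIXED_16_16_ONE,  0,
                    FIXED_16_16_ONE, -FIXED_16_16_ONE,
                    0,                0,
                    0,               -FIXED_16_16_ONE,
                };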

    • It IS a big deal (Score:5, Interesting)

      by bussdriver ( 620565 ) on Tuesday October 22, 2013 @07:25PM (#45207963)

      We use computers and mice, maybe a track pad. It is one thing to theme something with fluff and quite another to try to simulate historical metaphors while ignoring known methods of user input and popular conventions.

      Making something look like a book is a nice touch that is a matter of opinion but making you do the motions of the real world to interact with a computer program using a mouse... that is just idiotic and should be a cause for concern.

      Skeuomorphism is great if you are making something for a target demo that understands some real world item well and would instantly "get it" while you could slowly migrate them to something better suited to the newer technology that is replacing it.

      You might want to use VHS tape or film reels as metaphors when introducing video editing in the 90s... But as soon as people can adapt, those metaphors can be chucked for more modern or abstract ones; as Apple and others have done with digital video editing. Some terms like film and reels still remain despite this generation never using or even seeing actual film.

      • You might want to use VHS tape or film reels as metaphors when introducing video editing in the 90s...

        But even back then, yes, even with technophobes, if you'd forced your users to rewind those tapes in real time you would have had a serious problem.

        • I remember a couple apps that had rewind buttons in them! I don't remember their names... Obviously they didn't rewind in real time because then there would be zero benefit to bothering to buy and learn the computer. I fooled around with most everything in the area as it came out... the early stuff actually DID make you wait because it was hooked into actual tape decks-- The benefit of recording and replaying all your edits was only worth it for a professional -- the COST was totally unjustifiable for mos

    • Re:Enough already! (Score:4, Insightful)

      by steelfood ( 895457 ) on Tuesday October 22, 2013 @07:47PM (#45208091)

      I guess it depends on what your standpoint is. From a user standpoint, transitioning to a new technology via a familiar UI is better than doing it via an unfamiliar one. Once there, however, the real test is how unintrusive and easy to use the UI actually is.

      From a designer standpoint, again, when in transition, a familiar UI is easier to work with. However, once the transition period is over, it can be a limiting factor for improvements to the interface or to the functionality of the device.

      Take the keyboard, for example. We still use the same QWERTY layout as its predecessor, the typewriter. This was the natural course of evolution for typing as people transitioned away from typewriters to keyboards. But it is limiting, in that the key layout is not ideal for the typist, and the flat keyboard layout itself is not friendly to the hand at all.

      On the other hand, look at the Segway. It has such a revolutionary interface that nobody really knows what to do with it. It probably would've gained far more traction had it looked closer to a bicycle. It could have eventually replaced all those motorized bikes with the 80cc engines, and been legitimately the next revolution in transportation. Instead, it's now associated in my mind with being a fat slob, since the only people I've ever actually seen use one are mall security guards and the occasional beat cop.

      • It probably would've gained far more traction had it looked closer to a bicycle.

        Or if it cost less than $8000USD

    • Here we have Soulskill yet again trying to act like skeuomorphic artistic design is some sort of big, bad thing which we should be concerned about.

      I think whaling on skeuomorphic design completely misses the point.

      Good skeuomorphic design gives the user cues about how things work, what you can click, what you can slide etc.

      Bad design (skeuomorphic or otherwise) paints a pretty picture on the screen for the hell of it. The form doesn't suggest function and well-established conventions from other software are ignored.

      At worst, bad design creates false cues that misdirect users.

      Unfortunately, recent versions of iOS and OS X have included several glar

      • by mothlos ( 832302 )

        bad design creates false cues that misdirect users

        I don't think I could agree with this any more. I didn't intend to take sides in the pro/anti skeuomorphism debate; I'm simply annoyed to see /. consistently framing skeuomorphism as fundamentally flawed instead of something which newbs and the artistically inept (e.g. suits) will rely on too heavily and apply when inappropriate.

  • The App Store should not need its own password/login for free stuff.

    Also, Software Update seems better for OS stuff.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      App Store _is_ Software Update, now.

    • by fermion ( 181285 )
      I have no issue with the Mac keeping the password. I don't plan to do auto-update because the machine is simply more critical than my iPad.

      Just a data point. All seems to be going well on my machine. About an hour to update, rebooting fine.

      About the only thing I would complain about was the need to register my iCloud account. I wish they had kept the online password manager.

    • by antdude ( 79039 )

      Apple wants control. I found out that Apple uses your Apple ID account to inject your data into each downloaded app as DRM. Read https://discussions.apple.com/docs/DOC-5261 [apple.com] ... I was wondering why my downloaded 10.9 copy did not match others' in file sizes, CRC checksums, etc. :(

  • by narcc ( 412956 )

    Finally! An OS suitable for Sarah Palin.

    She's a real Maverick.

  • by 93 Escort Wagon ( 326346 ) on Tuesday October 22, 2013 @07:05PM (#45207837)

    So when can we expect the Review of Ars Technica's Review of OS X Mavericks?

  • With the skeuomorphism gone, the stock Calendar app finally became usable.

  • Overall, Siracusa's review of OS X 10.9 is excellent, but I got a chuckle out of this statement about the Sprite Kit: "All of this functionality is provided through a pleasantly abstracted Objective-C API that's a far cry from the typical low-level C/C++ game engine code." I understand the distinction he's trying to make between a pleasantly abstracted API and a typical low-level API, but Objective-C is a fright pig of immense proportions, not to mention overt vendor lock-in bullshit.
    • by smash ( 1351 )
      Objective-C is available for anything clang runs on.
      • A fright pig of immense proportions is available for anything clang runs on.

      • by AmiMoJo ( 196126 ) *

        Sure, but have you actually tried to use it for anything serious on another platform? There is a reason you don't see many Linux or Windows apps written in Objective-C.

        • by smash ( 1351 )
          Windows: because they went down the path of C++ and then C#. Linux: because the desktop environment guys are too busy trying to reimplement Windows, rather than finish GNUstep.
    • Objective-C is a fright pig of immense proportions

      No, it isn't. I developed in C++ for 12 years before (initially reluctantly) moving to Objective-C about 10 years ago. After some orientation, I realised it was actually a breath of fresh air. The most productive language I've ever used, bar none.
      • by Bogtha ( 906264 )

        I wouldn't say the most productive language, but it's certainly the most productive language at that level. Higher-level languages like Python will always beat lower-level languages like Objective-C for productivity.

        I find that practically everybody who talks about how awful Objective-C is has turned their nose up at it without trying to use it for any substantial period of time. Yes, it can look weird and verbose when you first start using it, but once you catch on to the patterns, it's a very pleasan

      • C++ has changed a lot since 10 years ago - more so than Obj-C, I dare say.

        Frankly, in 2013, a programming language that has only recently got any form of automated memory management, and still doesn't have any namespacing facilities, is outright embarrassing.

        And if you found Obj-C to be "the most productive language you've ever used", especially 10 years ago, then I think that you didn't really explore many other options.

    • The Objective-C spec (absent the Apple APIs) is much, much smaller than the C++ spec, and it's a proper superset of standard C. Any ANSI C program will compile as an Objective-C program.
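
      A trivial way to see that superset claim in action (hello.c is just a hypothetical file name) -- the same ANSI C source builds unchanged under both language modes:

      /* hello.c -- plain ANSI C, which is also valid Objective-C.
       *   clang -x c hello.c            builds it as C
       *   clang -x objective-c hello.c  builds the same file as Objective-C
       */
      #include <stdio.h>

      int main(void)
      {
          printf("still just C\n");
          return 0;
      }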

      C++ has a massive spec and even when you know what you're doing, you're pretty likely to shoot yourself in the foot at some point. A friend of mine recently joked that the motto of C++ should be, "Yes, well, don't." As in: "I can do this amazing thing in C++ and it's totally legal!" "Yes, well, don't." Pretty much every C

  • Are others confirming a 25-30% battery life increase? That is a stunning increase (if performance, screen brightness, etc. are maintained). Surely that was not achieved just by trimming eye candy. I am really curious what power optimizations were done?
    • by Anonymous Coward on Tuesday October 22, 2013 @10:39PM (#45208975)

      I am really curious what power optimizations were done?

      You are in luck. An article about that is the topic under discussion.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      In case you don't want to read the 28-page article:

      Timers of all programs are synchronised so they fire right after each other, so that there are longer periods of processing and longer periods of idle. This means that frequency throttling up and down happens a lot less often.

      Also, for invisible and inaudible applications (obscured, or minimised, and not producing or recording audio) they reduce the rate of the timers, so fewer screen redraws and other things are done.

      When showing the battery menu it will sh

      • by nmb3000 ( 741169 )

        Timers of all programs are synchronised so they fire right after each other, so that there are longer periods of processing and longer periods of idle. This means that frequency throttling up and down happens a lot less often.

        That sounds a lot like the timer coalescing [microsoft.com] added in Windows 7, and it did have notable improvements in power usage over XP. So while the idea isn't new or innovative on the part of Apple, it does help them maintain their lead over Windows when it comes to lower power consumption.
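
        For what it's worth, here's a minimal sketch (mine, not from the article) of how an app on OS X can cooperate with this kind of coalescing by giving its timers explicit slack through the GCD C API; the one-second interval and 100 ms leeway are arbitrary example values.

        #include <dispatch/dispatch.h>
        #include <stdio.h>

        static void on_tick(void *context)
        {
            (void)context; /* unused */
            printf("tick\n");
        }

        int main(void)
        {
            dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
            dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);

            /* Fire roughly once a second, but allow up to 100 ms of leeway so the
             * kernel can line this wakeup up with other pending timers. */
            dispatch_source_set_timer(timer,
                                      dispatch_time(DISPATCH_TIME_NOW, 0),
                                      1 * NSEC_PER_SEC,
                                      100 * NSEC_PER_MSEC);
            dispatch_source_set_event_handler_f(timer, on_tick);
            dispatch_resume(timer);

            dispatch_main(); /* parks the main thread; timers keep firing */
        }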

  • Let me share the tips I've found:

    No credit card required to create an Apple ID if you don't have one: tip 1 [apple.com]

    No Snow Leopard upgrade from Leopard (however you should have a Snow L. licence for this Mac): tip 2 [macworld.com]

    One still needs at least Snow Leopard to use the new App Store and download the Mavericks files.
    Maybe you can go to a friend's and use your new ID to download your Mavericks copy... or wait for a tip 3 someone may post here!
