Apple Freezes Snow Leopard APIs
DJRumpy writes in to alert us that Apple's new OS, Snow Leopard, is apparently nearing completion. "Apple this past weekend distributed a new beta of Mac OS X 10.6 Snow Leopard that altered the programming methods used to optimize code for multi-core Macs, telling developers they were the last programming-oriented changes planned ahead of the software's release. ... Apple is said to have informed recipients of Mac OS X 10.6 Snow Leopard build 10A354 that it has simplified the ... APIs for working with Grand Central, a new architecture that makes it easier for developers to take advantage of Macs with multiple processing cores. This technology works by breaking complex tasks into smaller blocks, which are then ... dispatched efficiently to a Mac's available cores for faster processing."
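For readers wondering what "breaking complex tasks into smaller blocks" might look like to a developer, here is a rough sketch in C. The details of the beta's API are under NDA, so this only uses the libdispatch calls (dispatch_group_async and friends, plus Apple's blocks extension to C) that Grand Central Dispatch eventually shipped with; the tile-processing function is a made-up placeholder, not anything from the actual build.

    /* Sketch only: split a job into independent blocks and let the runtime
     * spread them over whatever cores the Mac has. Requires Apple's blocks
     * extension (clang -fblocks) and libdispatch. */
    #include <dispatch/dispatch.h>
    #include <stdio.h>

    static void process_tile(int tile) {
        printf("processing tile %d\n", tile);   /* stand-in for real work */
    }

    int main(void) {
        dispatch_queue_t queue =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        /* Break the task into small, independent blocks... */
        for (int tile = 0; tile < 16; tile++) {
            dispatch_group_async(group, queue, ^{
                /* ...and let the system dispatch each one to an available core. */
                process_tile(tile);
            });
        }

        /* Wait until every block has finished before moving on. */
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        dispatch_release(group);
        return 0;
    }

The point is that the programmer describes units of work and a completion point; deciding which core runs which block is left to the OS.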
Why is multicore programming so hard? (Score:2, Insightful)
Haven't video game programmers been doing it forever, doing some things on the CPU, some on the graphics card?
And I've heard that functional languages like Lisp/Haskell are good at these multi-core tasks; is that true?
Re:Why is multicore programming so hard? (Score:5, Informative)
The problem is shared memory, not multiple processors or cores as such. Graphics cards have dedicated memory or reserve a chunk of the main memory.
It is true, because they privilege immutable data structures which are safe to access concurrently.
Re:Why is multicore programming so hard? (Score:5, Insightful)
Only partly true. Even in pure functional languages like Haskell, the functional-programming dream of automatic parallelization is nowhere near reality yet; in theory the compiler could just run a bunch of thunks of code in parallel, or speculatively, or whatever it wants, but in practice the overhead of figuring out which are worth splitting up has doomed all the efforts so far. It does make some kinds of programmer-specified parallelism easier; probably the most interesting experiment in that direction, IMO, is Clojure [clojure.org]'s set of concurrency primitives (Clojure's a Lisp-derived language with immutable data types, targeting the JVM).
Lisp, FWIW, doesn't necessarily privilege immutable data structures, and isn't even necessarily used in a functional-programming style; "Lisp" without qualifiers often means Common Lisp, in which it's very common to use mutable data structures and imperative code.
Re:Why is multicore programming so hard? (Score:5, Interesting)
Note that I was not talking about automatic parallelization, which is indeed possible only with pure languages (and GHC is experimenting with it), but simply about the fact that it is easier to parallelize an application with immutable data structures, since you need to care a lot less about synchronization. For instance, the Erlang actor model (also found in other languages like Scala on the JVM) still requires the developer to define the tasks to be parallelized, yet immutable data structures make the developer's life a lot easier with respect to concurrent access and usually provide better performance.
My "It is true" was referring to "functional languages" which do usually privilege immutable data structures, not to Haskell or Lisp specifically (which as you said has many variants with mutable structures focused libraries). As you said, Clojure is itself a Lisp-1 and it does privilege immutable data structures and secure concurrent access with Refs/STM or agents. What is more interesting in the Clojure model (compared to Scala's, since they are often compared even though their differences, as functional languages and Java challengers on the JVM) is that it doesn't allow unsafe practices (all must be immutable except in variables local to a thread, etc).
Interesting times on the JVM indeed.
Re:Why is multicore programming so hard? (Score:5, Interesting)
Yeah that's fair; I kind of quickly read your post (despite it being only one sentence; hey this is Slashdot!) so mistook it for the generic "FP means you get parallelization for free!" pipe dream. :)
Yeah, I agree that even if the programmer has to specify parallelism, having immutable data structures makes a lot of things easier to think about. The main trend that still seems to be in the process of being shaken out is to what extent STM will be the magic bullet some people are proposing it to be, and to what extent it can be "good enough" as a concurrency model even in non-functional languages (e.g. a lot of people are pushing STM in C/C++).
Re: (Score:2)
Forgive my profound ignorance on the topic, but am I to understand that the determination of which can run in parallel is done a
Re: (Score:3, Insightful)
Only the CRAPPY video cards use any of the main memory. Honestly, with how cheap real video cards are, I can't believe anyone would intentionally use a memory-sharing video card.
It's like the junk winmodems of yore. DON'T BUY THEM.
Re:Why is multicore programming so hard? (Score:4, Informative)
Re: (Score:2, Informative)
What do you think the GPU-driven supercomputer fuss is all about?
Re: (Score:2)
Re: (Score:2)
No, they haven't, with few exceptions. Doing multiple things at the same time isn't really the issue here, we're trying to figure out how to effectively split one task between multiple 'workers'. Video games are one of the harder places to try to apply this technique to, because they run in real time and are also constantly responding to user input. Video encoding is the opposite. One of the big problems with multicore is coordinating the various worker threads.
You could learn a lot by taking the time to re
Re: (Score:2)
By that I mea
Re:Why is multicore programming so hard? (Score:4, Informative)
I'm by no means a multiprocessing expert, but I suspect the problem with your approach is in the overhead. Remember that the hardest part of multiprocessing, as far as the computer is concerned, is making sure that all the right bits of code get run in time to provide their information to the other bits of code that need it. The current model of multi-CPU code (as I understand it) is to have the programmer mark the pieces that are capable of running independently (either because they don't require outside information, or they never run at the same time as other pieces that need the information they access/provide), and tell the program when to spin off these pieces as separate threads and where it will have to wait for them to return information.
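A minimal sketch of that model, assuming POSIX threads (the prime-counting task is just an illustrative stand-in): the programmer marks a piece of work as independent, spins it off, keeps going, and waits only at the point where the result is actually needed.

    /* Sketch: spin off one independent piece of work, join where its result is used. */
    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>

    static void *count_primes_below(void *arg) {
        long limit = *(long *)arg;
        long count = 0;
        for (long n = 2; n < limit; n++) {
            int prime = 1;
            for (long d = 2; d * d <= n; d++)
                if (n % d == 0) { prime = 0; break; }
            count += prime;
        }
        long *result = malloc(sizeof *result);
        *result = count;
        return result;                           /* handed back through pthread_join */
    }

    int main(void) {
        long limit = 200000;
        pthread_t helper;
        pthread_create(&helper, NULL, count_primes_below, &limit);

        /* ...the main thread does other, unrelated work here... */

        void *out;
        pthread_join(helper, &out);              /* wait only where the result is needed */
        printf("primes below %ld: %ld\n", limit, *(long *)out);
        free(out);
        return 0;
    }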
What you're talking about would require the program to break out small chunks of itself, more or less as it sees fit, whenever it sees an opportunity to save some time by running in parallel. This first requires the program to have some level of analytical capability for its own code (say we have two if statements, one right after the other: can they run concurrently, or does the result of the first influence the second? What about two function calls in a row?). The program would also have to erect mutex locks around each piece of data it uses, just to be sure it doesn't corrupt anything if it misjudges whether two particular pieces of code can in fact run simultaneously.
It also seems to me (again, I'm not an expert) that you'd spend a lot of time moving data between CPUs. As I understand it, one of the things you want to avoid in parallel programming is having a thread "move" to a different CPU, because all of the data for the thread has to be moved from the cache of the first CPU to the cache of the second, a relatively time-consuming task. Multicore CPUs share level 2 cache, I think, which might alleviate this, but the stuff in level 1 still has to be moved around, and if the move is off-die, to another CPU entirely, then it doesn't help. In your solution I see a lot of these moves being forced. I also see a lot of "Chunk A and Chunk B provided data to Chunk C. Chunk A ran on CPU1, Chunk B on CPU2, and Chunk C has to run on CPU3, so it has to get the data out of the caches of the other two CPUs".
Remember that data access isn't a flat speed: L1 is faster than L2, which is much faster than RAM, which is MUCH faster than I/O buses. Any time data has to pass through RAM to get to a CPU, you lose time. With lots of little chunks running around getting processed, the chances of having to move data between CPUs go up a lot. I think you'd lose more time on that than you'd gain by letting the bits all run on the same CPU.
Re: (Score:2)
A = 1
A = A + 1
If you ran the first line on Core 1 and the second on Core 2, how would the machine know that the second line needs to be processed after the first (other than its place in the code itself)?
I wonder if they are using this parallel processing only for isolated threads, then? I thought any modern OS already did this. Does anyone know exactly how they are tweaking the OS to better multitask among cores (in semi-technical layman's terms)?
Re: (Score:2)
On top of the fact that Core 2 needs to somehow know not to run the instruction until after Core 1 runs its preceding instruction, you're also moving the value of A from Core 1 to Core 2. Normally, when one instruction follows another and the same variable is used, that variable's value is cached in CPU level 1 cache. It's almost instantly accessed. In your example you have to move it between Cores; that means it has to go out to CPU level 2 cache from Core 1's L1, and back into Core 2's L1 so it can be a
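To make the contrast concrete, a tiny plain-C sketch (illustrative only): a dependent chain like A = A + 1 cannot be split, because each step needs the previous result in hand. Work can only be spread across cores once it is restructured so the pieces are independent and meet just once at the end, and that restructuring is exactly what the programmer usually has to do by hand.

    /* Sketch: no iteration of this loop depends on another, so the two halves
     * could run on two cores; the single dependent step (adding the halves)
     * happens once at the end instead of at every iteration. */
    #include <stdio.h>
    #include <stddef.h>

    static long sum_range(const long *a, size_t lo, size_t hi) {
        long s = 0;
        for (size_t i = lo; i < hi; i++)
            s += a[i];
        return s;
    }

    int main(void) {
        long a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        long total = sum_range(a, 0, 4) + sum_range(a, 4, 8);   /* halves are independent */
        printf("total = %ld\n", total);
        return 0;
    }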
Re: (Score:2)
Video games are one of the harder places to try to apply this technique to, because they run in real time and are also constantly responding to user input. Video encoding is the opposite.
Since when does encoding of live video not need to run in real time? An encoder chain needs to take the (lightly compressed) signal from the camera, add graphics such as the station name and the name of the speaker (if news) or the score (if sports), and compress it with MPEG-2. And it all has to come out of the pipe to viewers at 60 fields per second without heavy latency.
Re: (Score:2)
I didn't specify live video encoding. That sentence does not make sense if interpreted to be referring to live video encoding. I would be remarkably misinformed to have used live video encoding as an example of something that does not run in real time. Live video encoding is not often encountered in a desktop PC environment, and I would go so far as to say that the majority of video broadcasts are not live.
I am somewhat confused as to why you're talking about live video encoding. Does this relate to multico
Heard of a webcam? (Score:2)
I didn't specify live video encoding.
Your wording gave off the subtext that you thought live video encoding was commercially unimportant. I was just trying to warn you against being so dismissive.
Live video encoding is not often encountered in a desktop PC environment
Citation needed [wikipedia.org].
I would go so far as to say that the majority of video broadcasts are not live.
And you'd be right, but tell that to my sports fan grandfather or my MSNBC-loving grandmother.
Most PCs have VGA or DVI-I output abilities, and the conversion to the RCA connectors requires no special electronics.
Most PCs won't go lower than 480p[1] at 31 kHz horizontal scan rate, and they output RGB component video. SDTVs need the video downsampled to 240p or 480i at 15.7 kHz, and most also need red, green, and blue signals to be multiplexed into co
Re: (Score:2)
Haven't video game programmers been doing it forever, doing some things on the CPU, some on the graphics card?
Yeah, and they sync an unknown (but often quite large) number of "cores" (i.e., the shaders, etc., in the GPU) quite easily too.
Of course, the only reason it's so easy for video game programmers is that raster graphics are one of the easiest things ever to parallelize (since pixels rarely depend on other pixels), and APIs like OpenGL and Direct3D make the parallelism completely transparent. If they had to program each individual pixel pipeline by hand, we'd still be stuck with CPU rendering. The purpose of Gra
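A rough sketch of why that's so easy, using the dispatch_apply call from the libdispatch C API (the shade() function is a made-up placeholder): every row of the frame is independent, so the runtime can hand rows to however many cores exist without the programmer managing any of them.

    /* Sketch: per-row parallelism over a frame buffer. dispatch_apply runs the
     * block once per row, spread across available cores, and returns when all
     * rows are done. Requires Apple's blocks extension and libdispatch. */
    #include <dispatch/dispatch.h>
    #include <stdint.h>
    #include <stdlib.h>

    enum { WIDTH = 640, HEIGHT = 480 };

    static uint32_t shade(int x, int y) {
        return (uint32_t)(x ^ y);                /* stand-in for a real per-pixel shader */
    }

    int main(void) {
        uint32_t *frame = malloc(sizeof(uint32_t) * WIDTH * HEIGHT);
        if (!frame) return 1;

        dispatch_queue_t q =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        dispatch_apply(HEIGHT, q, ^(size_t y) {
            for (int x = 0; x < WIDTH; x++)      /* rows never touch each other's pixels */
                frame[y * WIDTH + x] = shade(x, (int)y);
        });

        free(frame);
        return 0;
    }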
Re: (Score:2, Informative)
Haven't video game programmers been doing it forever, doing some things on the CPU, some on the graphics card?
Not really - although it's easy to use both the CPU and the GPU, normally this would not be at the same time. What's been going on "forever" is that the CPU would do some stuff, then tell the GPU to do some stuff, then the CPU would carry on.
What you can do is have a thread for the CPU stuff updating the game world, and then a thread for the renderer, but that's more tricky (as in, at least as diffi
G5? (Score:5, Interesting)
what is the status of 10.6 on the PowerPC G5?
Re: (Score:2)
Thanks for Playing (Score:4, Informative)
Re: (Score:2)
Re:G5? (Score:5, Informative)
Snow Leopard is going to be the first version of Mac OS X that only runs on Intel Macs, so I'm afraid you're going to be stuck on plain old Leopard.
Re:G5? (Score:4, Interesting)
The main feature of Snow Leopard is its 64-bit kernel and an upgrade across the board to 64-bit apps.
The problem with porting this to PowerPC is that the move to 64 bits only makes things slower on PPC: because it is a modern 64-bit architecture with plenty of registers, it already gained most of the benefits of 64 bits even when running 32-bit apps. Moving to 64-bit apps just means it has to move around more memory.
On the other hand, 32-bit Intel CPUs are register starved, so the additional memory overhead of the move to 64-bits is far outweighed by the improvement in moving to the 64-bit "Intel" architecture (developed by AMD).
So, faced with spending twice the effort to optimize SL for PPC machines that Mac users have known to be marked for death since 2006, and getting a product that only runs 64-bit versions of PPC apps slower than Leopard does, Apple decided to target its modern 2009 operating system at its modern hardware platform.
There are probably some G5 owners who might like the idea of being able to upgrade to SL, but they probably don't realize that it would only result in some new trim and slower overall performance. And if you compare the number of G5 machines Apple was selling in 2005-2006 with the number of Intel machines it has sold since, you'll see another reason why Apple is supporting Intel exclusively.
FYI:
Apple sold 0.8 to 1 million PPC Macs per quarter in 2005-2006.
Apple sold 2.3 to 2.6 million Intel Macs per quarter in the last year.
Why Windows 7 is Microsoft's next Zune [roughlydrafted.com]
Re: (Score:2, Interesting)
Re: (Score:2)
I just hope the Optical Drive goes the same way on notebooks. Most people use it very few times a year (not more than 4 or 5 in my case)
From your comment I guess that either you don't watch a lot of DVD movies, or you are given DVDs as a gift 4 or 5 times a year and live in a country that lacks a DMCA counterpart so that you can rip the DVDs to your computer.
Re: (Score:3, Insightful)
Just tell me this: how is an average user without a DVD/CD drive going to install an OS? Even I have problems with this, and I am pretty experienced.
(Booting from a USB stick never quite worked. Also, I already need the one that I have, as keyfile storage.)
What PC can't boot from USB MSC? (Score:2)
How is an average user without a DVD/CD drive going to install an OS? (Booting from a USB stick never quite worked.)
On which hardware did booting from USB mass storage fail? I used UNetbootin on a desktop computer to turn an Ubuntu 8.04 ISO into a bootable copy on an SD card. I booted from the SD card on my Eee PC and replaced the included Xandros on the internal SSD with Ubuntu. Everything worked fine once I applied the published fixes for Hardy on Eee PC 900 (except for sound after resume). Or are you talking about PCs made before USB 2.0 was common?
Also I already need the one that I have, as a keyfile storage.
My Eee PC has three USB ports and one SD card slot.
Re: (Score:2)
Re: (Score:2)
That's probably why Apple tells you that you'll need to stream the install disk from another computer, or buy an external USB DVD drive.
Re:Living in the past (Score:4, Insightful)
No you're thinking of the Zune.
The MacBook Air is very popular, even though it costs a lot. People pay extra for "cool, sexy" Mac products (those are Microsoft's words used in its advertising about how cheap low end generic PCs are).
Actually I'm surprised by how many starving student types I see with an Air. I decided against buying one (which would have come in handy while traveling), but apparently the cool kids buy what they like, not what they "can afford."
They also buy expensive skinny jeans and $400 iPod touches and other stuff that Microsoft billionaires don't seem to think that they will. Of course, there are people who like to "save money," who go out and buy $700 PCs and then spend thousands of dollars putting GPU cards in them every six months to play the latest PC game.
And then there are those guys who saved money buying the Xbox 360 because it was so much cheaper than the PS3, except that it was only cheaper because it left off a lot of things like wireless and a hard drive. Plus they got a great deal on HD-DVD! And they ended up saving 80% on the Zune after it tanked and Microsoft dumped the extras on the market in a fire sale.
Microsoft is all about saving money! Except for the whole thing about Vista costing more than XP, and introducing a whole bunch of new licensing tiers that force generic PC users to pay hundreds of dollars for software upgrades that "unlock" features.
But yeah, your joke about there being two Zunes was funny stuff man, we should get together and play Halo in your mom's basement.
Re: (Score:3, Informative)
Apple's solution was to enable Remote DVD sharing, so that the "BIOS" (EFI) of the disc-less MacBook Air can install its OS from scratch via the DVD drive of another computer on the local network.
But yes, a generic PC would have a problem installing Windows without a local DVD drive, because generic PCs have a completely retarded, ancient BIOS firmware that rarely offers any functional network boot support, and Windows makes '70s-era assumptions about what CP/M-style drive letters it is installing on.
How to break the 0.05 Mbps barrier in rural areas? (Score:4, Insightful)
The same thing that happened to audio CDs is going to happen to DVDs. They will become obsolete as long as bandwidth keeps increasing.
A lot of people still can't get more than 0.05 Mbps dial-up. What, apart from a government-sponsored program analogous to rural electrification [wikipedia.org] (started 1936 in the United States), is going to increase bandwidth to bufftuck nowhere?
Comment removed (Score:5, Insightful)
Re: (Score:3, Insightful)
The Rewritable CD drive is not what killed off the floppy. The USB stick did.
Re: (Score:2)
Re: (Score:2)
I believe the Xbox 360 has a PPC CPU.
Re: (Score:2)
Re: (Score:2)
Close enough...the thing is essentially a PC.
Re: (Score:3, Informative)
I would argue that floppy drives were discontinued when USB flash drives became cheap. By the way, many motherboard manufacturers and PC makers still include PS/2 connections even though most keyboards and mice are USB these days. Backwards compatibility is hard to break even when the technology is obsolete.
Re: (Score:2)
Back in the day, you couldn't even take for granted that you could boot off of a CD.
You mean, you couldn't take for granted that you could boot your PC off a CD. EVERYONE else had it well figured out. These days you can't take for granted that you can boot from USB, still! Some motherboards still bone it.
Re: (Score:2)
Back in the day, you couldn't even take for granted that you could boot off of a CD.
It was easy for Macs (hold down the c key), but unreliable (at best) for Windows. Thus it was much easier for Apple to drop floppy drives, and it took years for PCs to adopt the same (still a lot of new models with them though).
Re: (Score:2)
Alas, as when Apple stopped putting floppy drives in Macs, others followed
A common myth, but not supported by evidence. Some PC manufacturers were still shipping PCs with floppies years later. What actually happened was that over the years, various computer manufacturers dropped floppy drives, and it's impossible to claim that one caused the others to do so (and this seems an unlikely claim anyway - when CD/DVD writers and USB drives were commonplace, it was obvious there was no need for floppies - we don't
Re: (Score:2, Interesting)
Re: (Score:2)
They refused to provide custom chips without large binding orders. They were willing to provide Apple with chips, just not à la carte.
Re: (Score:2)
Excuse #527, and it doesn't change the fact that IBM promised 3 GHz chips within a year and never delivered. And it's not like Apple was small potatoes: they were selling four million Macs a year at the time, with higher margins than the Cell.
Re: (Score:2)
If Apple had placed a binding order for 4 million 3 GHz chips, IBM would have either provided them or paid large penalties.
Re: (Score:2)
They promised Apple 3 GHz chips within a year and never delivered.
Fixed that for you.
Re: (Score:3, Insightful)
I'm sure it won't.
I tried upgrading to Leopard on my G4 iBook. Tried it for a couple months, then downgraded back to Tiger.
Some of the UI decisions they made in Leopard, like folders in the Dock that display as all of their contents stacked in a pile instead of a folder icon, were completely brain-dead. There was enough public outcry (and third-party workarounds) that Apple added options to fix the behavior in newer versions, but they still go with the stupid options by default. Did they forget to do usa
Snow leopard is such an apt codename (Score:3, Funny)
Spread your tiny wings and fly away,
And take the snow back with you
Where it came from on that day.
The one I love forever is untrue,
And if I could you know that I would
Fly away with you.
In a world of good and bad, light and dark, black and white, it remains very hopeful that Apple still sees itself as a beacon of purity. It pushes them to do good things to reinforce their own self-image.
I can't wait to try this latest OS!
Re:Snow leopard is such an apt codename (Score:5, Funny)
I almost threw up.
Why rush to use all the cores? (Score:5, Interesting)
I realize I can throttle the video encoding to a single core, but I'm just using that as an example... if all apps start using all cores, aren't we right back where we started, just going a little faster? I love being able to do so much at once...
Doom is a GBA game (Score:2)
One thing I love about my quad-core Q6600 is the fact that I can be doing so many things at once. I can be streaming HD video to my TV while simultaneously playing DOOM, for example.
Doom can run on a Game Boy Advance [idsoftware.com], rendering in software on a 16.8 MHz ARM7 CPU. You could emulate the game and your quad-core wouldn't break a sweat.
if all apps start using all cores, aren't we right back where we started, just going a little faster?
That's what developers want: the ability to use all the cores for a task where the user either isn't going to be doing something else (like on a server appliance) or has another device to pass the time (like a GBA to run Doom).
Just nice it. (Score:2)
The point I'm trying to make is that I don't want everything to be multi-threaded.
Then use your operating system's process manager to "nice" (deprioritize) the apps that you don't want to be multithreaded.
Re: (Score:2)
Remember, video encoding requires tremendous amounts of CPU power, far more so than audio encoding. That's why, when Pixar renders the images for their movies, they use thousands of Apple Xserve servers running in massively parallel fashion to do the rendering at reasonable speeds.
We can make all the Beowulf cluster jokes on this forum, :-) but one reason why Beowulf was developed was the ability to synchronize hundreds to thousands of machines in a massively parallel fashion to speed
Nice (Score:2)
On a UNIX system (like Mac OS X) you should be able to "nice" the low-priority processes to give them less attention. If I'm running a twelve-hour, max-the-CPU simulation and I want to play a game while I'm waiting, I nice the simulation to a low priority. That way it yields most of the CPU to the game while I'm playing, yet runs at full dual-core speed when I'm not.
I'm not sure this is actually working in Mac OS X 10.5, though. Since I got my dual-core system, the activity monitors don't seem to show that
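For reference, a minimal sketch of doing the same thing from inside the program on a UNIX-like system (the equivalent of launching it with "nice -n 19"; the simulation itself is elided):

    /* Sketch: a long-running job lowering its own priority so interactive
     * programs get the cores whenever they want them. */
    #include <sys/resource.h>
    #include <stdio.h>

    int main(void) {
        /* 19 is the lowest (nicest) priority on most UNIX systems. */
        if (setpriority(PRIO_PROCESS, 0, 19) != 0)
            perror("setpriority");

        /* ...twelve-hour, max-the-CPU simulation goes here... */
        return 0;
    }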
Re: (Score:2, Informative)
These days, relatively slow memory and I/O speeds are often the real performance bottlenecks for ordinary applications. So improved I/O scheduling might do more than multiple cores for the perceived performance of a specific system or
Re:Why rush to use all the cores? (Score:5, Interesting)
Yup. If applications start getting too good at being able to "use the whole machine" again, then that's exactly what they will try to do. The fact that they really can't is a really nice "release valve" at this point. As an end user managing load on my box, I actually like it better when an app is limited to only 100% CPU (IOW, one core).
Re: (Score:3, Insightful)
That only works because you have few cores.
Once we get to the point where a consumer desktop has 32 cores, you're not going to be able to use even half of that CPU by running independent tasks simultaneously. You'll need apps that can take advantage of many cores. The more cores you have, the more power a single-core application fails to take advantage of.
Re: (Score:2)
Agreed.
There are a few applications I prefer to keep either single-core or with limited memory access. After Effects, for example, is so incredibly poorly behaved on the Mac that I'd rather use version 6, which can only see around 1.5 gigs of RAM, than 7 or higher, which can see much more. In my experience it's made almost no difference in rendering time; where it DOES make a difference is in whether or not I'm actually able to use my machine for anything else while the render is grinding.
Of course, you can te
Re:Why rush to use all the cores? (Score:5, Interesting)
You ought to be able to set your program to run only on certain processors. I know Windows has this feature (Set Affinity in Task Manager), so I assume Linux and the Mac do as well.
I'd recommend putting heavy tasks on your last core or two, and anything you particularly care about on the second core - leave the first for the kernel/etc.
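For what it's worth, the Windows feature doesn't map across exactly: Linux exposes hard pinning through sched_setaffinity (or the taskset command), while Mac OS X only offers affinity hints rather than strict core assignment. A minimal Linux-specific sketch (the core numbers are arbitrary):

    /* Sketch: confine the current process to cores 2 and 3 on Linux. */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(2, &set);                        /* allow core 2 */
        CPU_SET(3, &set);                        /* allow core 3 */

        if (sched_setaffinity(0, sizeof(set), &set) != 0)   /* 0 = this process */
            perror("sched_setaffinity");

        /* ...heavy task runs here, confined to the chosen cores... */
        return 0;
    }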
Re: (Score:3, Informative)
One area: graphics rendering. And I'm not talking about games, but Lightwave et al., especially when one is rendering a single very large image (say, a billboard). Currently most renderers allow splitting that frame across several machines/cores, where each one renders a smaller block and the full image is then reassembled. However, not all the rendering engines out there allow the splitting of a single frame. Also, if the render farm is currently being tasked for another project (animation) and you need to
Re: (Score:2)
I start doing something heavy (...)
You just answered your own question.
Re: (Score:3, Interesting)
Applications shouldn't be concerned with limiting themselves so that they cannot under any circumstances slow down other applications. It's the job of the OS to provide the tools to prioritize applications according to the desires of the user.
OS X, by virtue of its Unix underpinnings, should support nice/renice to alter the priorities of processes. One would hope that with additional support for developers to make use of multiple cores, Apple would also provide users with increased ability to easily alter t
i don't know you but... (Score:5, Funny)
I always read it as "Slow Leopard"
Why would my Mom upgrade to Snow Leopard? (Score:3, Interesting)
My biggest problem with this upgrade is that it seems more like a Windows Service Pack than a true Mac OS X upgrade. Are we going to have to pay for "new APIs" and "multi-core processing"?
How does all this help the average user (i.e. my Mom)? WooHoo! They are building a YouTube app and you can record directly off the screen! Big whoop. You can do that today without too much trouble with third party applications. Is the Mac OS X user interface and built-in apps already so perfect that they can't find things to improve?
I'm usually a pretty big Mac fan-boy, but I just can't seem to get excited about this one. Hell, I'm even thinking (seriously) about ditching my iPhone and getting a Palm Pre. Sigh... how the world is changing. Has Apple lost its mojo?
Re: (Score:2)
I doubt she will be motivated to...
I think some of the changes affect the corporate user more than they do the home user.
From what I've read, the mail, calendar, and contacts apps now communicate with MS Exchange (using the ActiveSync technology Apple licensed from MS for use in the iPhone).
While I'm sure there are other changes, I think those are some of the "bigger" ones that a lot of people have been waiting for, myself included...
Re: (Score:2)
>> Is the Mac OS X user interface and built-in apps already so perfect that they can't find things to improve?
I thought that concentrating on performance optimizations and stability was an improvement to the current version.
-dZ.
Re: (Score:2)
Re: (Score:2, Interesting)
My biggest problem with this upgrade is that it seems more like a Windows Service Pack than a true Mac OS X upgrade.
I much prefer frequent, incremental updates. The $100 that Apple charges for the OS is peanuts compared to the amount of use it gets.
Maybe you like the MS upgrade cycle, but look at all the bad press they get for it... you can hardly blame Apple for wishing to avoid that.
Re: (Score:3, Informative)
Erm, so what is this Windows XP installation that I have been using since XP Service Pack 1 that I have *incrementally upgraded* through to Service Pack 3 with all the additional Microsoft updates then?
Apple does "updates" as well.
You can't deny that the move from XP to Vista is a big one. SP1 to SP2 may have affected some business users or home power users, but most users didn't really notice a difference... the overall XP experience was mostly unchanged. SP3 was even more slight. Apple updates tend to add marketable features. For instance, Leopard added Time Machine and Spaces along with the service-pack style under-the-hood stuff.
but I wish you Apple fanbois would occasionally go read a technical book or something so that you can at least have some degree of intelligent conversation with those of us who do.
Considering that I run XP, Ubuntu, and Apple stuff I think you might be b
Re: (Score:2)
Re: (Score:2)
Erm, so what is this Windows XP installation that I have been using since XP Service Pack 1 that I have *incrementally upgraded* through to Service Pack 3 with all the additional Microsoft updates then?
That Windows XP installation you've been running is Windows NT 5.1 [wikipedia.org], and the updates you've been downloading for it are free...just as Apple's updates for Leopard are free, just as Tiger before it, and Panther before that, and so on.
I'm no MS fanboi by any means, I use mostly (incrementally upgradeable) Gentoo
Re:Why would my Mom upgrade to Snow Leopard? (Score:5, Insightful)
My impression of most Apple users is that they want not to use Microsoft products, and to hide inside an elitist little club where there is no need for most of them to be concerned about technical issues. That's fine if that is what they want, but those same people should not try to argue with people who do know what they are talking about when it comes to OSes - at least, in my case, when it comes to UNIX, Linux or Windows.
Most Apple people I know are very knowledgeable about other operating systems and make informed choices to use Macs. What does drive us nuts are those who criticize our choices but also freely admit...
I don't use Apple.
and
I know nothing about OSX.
Re: (Score:2)
Re: (Score:2)
Well, I think it does! ;p
Re: (Score:2)
I think every major version is a service pack, except Apple charges $150 for it and changes the APIs enough that you can't run new software. I wanted to run Xcode on my 10.4 laptop, so I had to go buy a 10.5 upgrade, even though it didn't have any new features I actually cared about. I still think it should have been a $30 minor feature pack, not a whole OS.
I think it's the most annoying part about Apple. They definitely seem to nickel and dime you, especially by not shipping with a full-screen media pl
Re: (Score:2)
Well, with past major versions of Mac OS X we at least got some newfangled toys to play with (the Dock, Spotlight, Spaces, etc.). But with SL, we get APIs and back-end stuff. That may be neat and all, but it doesn't do much for me, immediately.
Now granted, it will be faster and more stable, which is a good reason to upgrade, but I'm not sure it's a good enough reason to pay $100. Even the "enterprise" features won't do much for the average person. I guess Apple is just using SL to get a foot into the corpora
Re: (Score:2)
To be pedantic, the PPC iMac was discontinued in January of 2006. If the machine is really two years old it will run Snow Leopard fine.
Re: (Score:2)
Of course, a decent copy of XP Pro costs as much as two of those Mac OS X upgrades combined, and a copy of Vista Ultimate would pay for pretty much every OS X update that has been released.
Not to mention the fact that in terms of features, the jump from XP to SP3 has been smaller than any of the OS X upgrades.
Xcode never stopped working on an upgrade; you just can't necessarily run the new Xcode because of the new APIs that are part of the new operating system... sounds kind of normal to me. Stuff that the
Re: (Score:2)
$117.49 (http://www.newegg.com/Product/Product.aspx?Item=N82E16832116515) for XP Pro SP3 is less than the $129 (http://store.apple.com/us/product/MC094) for the Leopard upgrade.
And what is the price of rice in China this morning?
Blind adoration won't get you anywhere.
As opposed to non sequiturs, which get you anywhere fast.
Re: (Score:3, Informative)
Oh, and it looks like PRO is gone with Snow Leopard - more for you to whine about.
Cleanup (Score:5, Interesting)
From what I've read, they are cleaning up the code and optimizing it for the Intel platform. Supposedly it will take up less hard drive space and memory, but I'll believe that when I see it. Even if they fail, I'm glad they attempted this cleanup, even if it just inspires Microsoft to do some similar scrubbing with Windows 8. It's about time someone stopped and said, "Hey, instead of shiny feature 837, can we make sure that our web browser isn't leaking memory like a paper boat?"
It's not really for your mom - it's so she doesn't call you as often.
I'm usually a pretty big Mac fan-boy, but I just can't seem to get excited about this one. Hell, I'm even thinking (seriously) about ditching my iPhone and getting a Palm Pre. Sigh... how the world is changing. Has Apple lost its mojo?
I had the same thought. Apple is getting too greedy with their hardware prices, and they continue to screw customers over with their overpriced parts for repair. Plus, the computer world is changing, and they don't seem to understand what's happening.
Try remotely controlling a Mac with VNC over a cellular broadband connection. It's like sucking a watermelon through a straw. Try creating a virtual network of virtual machines for testing before deployment, which is illegal under Apple's TOS except for their server software. You'll be dragging your toaster into the bathtub by the end of the day.
Netbooks are evidence that people want computers for convenient access to information, usually located on the internet, and to have something to sync their iPod to. I'm not sure how much longer Apple can charge twice what their competitors are charging and get away with it. And they still have no chance of entering the enterprise market with their hardware costs and licensing restrictions.
I'm due for a laptop upgrade, and given the choice of a Dell Precision, RGBLED screen, and a dock that supports legacy ports and dual 30" displays, or a slower MacBook Pro with a crappier display for the same price, they're really making the decision for me. I'll continue recommending Macs for friends and family that may call me with technical questions, but if Windows 7 offers the same kind of robustness for half the price, what's the point?
Re: (Score:3, Informative)
Supposedly it will take up less hard drive space and memory, but I'll believe that when I see it.
I think it's safe to believe the part about less hard drive space, because Apple will save a lot of space with a very simple method. According to AppleInsider [appleinsider.com], Snow Leopard will trim the standard install size by "several gigabytes" (4GB according to Ars Technica [arstechnica.com]) by only installing printer drivers for currently connected printers. Drivers for newly attached printers will be fetched over the network via Software Update, so this works best with an always-on connection.
Personally, I'm blown away by the fact tha
Re:Why would my Mom upgrade to Snow Leopard? (Score:4, Insightful)
Re:What's up with the punctuation (Score:5, Funny)
Perhaps the editor doesn't know how to edit?
Oh wait, kdawson, never mind.
Re:What's up with the punctuation (Score:4, Informative)
Why... is there... there so much... punctionations in the summary?
Because the summary is directly quoting the article and using ellipses [wikipedia.org] to indicate that certain parts of the quotation have been omitted. Usually there would be a space on either side of the ellipsis when this is done, but this is /. so I'll let this one slide.
Re:What's up with the punctuation (Score:5, Informative)
No, an ellipsis is not a change to the text but a deletion from the text.
Re: (Score:2, Interesting)
There has been a slight shift in the adding of ellipses to passages to indicate omission. In a text that has ellipses in the text itself (for example, Pynchon's Gravity's Rainbow), some scholars use square-bracketed ellipses to indicate omission. In general, the use of bracketed ellipses redundantly and unambiguously signals editorialization.
That is, until some clever writer begins including square-bracketed ellipses in his or her text [. . .].
Re:What's up with the punctuation (Score:4, Funny)
That is, until some clever writer begins including square-bracketed ellipses in his or her text \[. . .\].
We just need escape codes :)
Re: (Score:2)
Good solution. Might make it in.
Re: (Score:2)
Interesting; I hadn't thought about how to quote text that itself contains ellipses.
Re:What's up with the punctuation (Score:5, Informative)
Turn to side B for the next lesson.
Re: (Score:2, Offtopic)
Slashdot is kind of an in-between case, though; when the editors post a story by submitters, it's sort of formatted as if they're "quoting" the submitters, but it's not quite like quoting a book or speech or something. It's expected that when submitting a piece to a site with editors (assume for the sake of argument we can call Slashdot editors that), that your text might be, well, edited before publication. The Economist, for example, edits letters to the editor before publication for style and brevity, wi
Re: (Score:2, Funny)
kdawson is... doing... his Captain... Kirk... impression. Mr. Taco, Warp Factor... 10.
Re: (Score:2, Funny)
it works automagically
this is Apple after all
MAGIC!
Re: (Score:3, Insightful)
Double bind here. Those who speak do not know; those who know do not speak - they're under NDA.
Re: (Score:2)
I don't know what Grand Central is exactly, or how it does what it does, but I do know a bit about parallel programming.
It ain't easy.
There are all sorts of pitfalls, and doing some sort of QA to make sure there are no race conditions is a real pain.
So, if Apple has come up with some way to make parallel processing easier, in some useful cases, this is a Good Thing, and will help developers write better applications for Mac OS X, and therefore sell more Macs. Making difficult things easier is usually l