100 Years of Macintosh
Zero seconds on the Mac OS system clock is January 1, 1904. The Mac OS epoch hits 100 years ... now. That's assuming you live in the Pacific time zone, anyway: the Mac OS epoch is unique in that it is time zone-specific. Of course, none of this applies unless you are running Mac OS, and all you Mac users are using Mac OS X, right? (Geek note: the Mac OS second counter is unsigned, which is why it can count more than 100 years forward from 0 seconds and 32-bit Unix can't, though Unix, being signed, can count backward to 1901.)
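For the curious, a minimal sketch of how the two epochs relate; the 2,082,844,800-second offset (24,107 days between 1904-01-01 and 1970-01-01) is the standard figure, the sample clock value is hypothetical, and the classic Mac's local-time quirk is ignored:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    /* Seconds between the Mac OS epoch (1904-01-01) and the Unix epoch
     * (1970-01-01): 24,107 days * 86,400 s/day. Classic Mac OS stamps are
     * in local time, so a faithful conversion would also need the
     * machine's UTC offset; this sketch skips that. */
    #define MAC_TO_UNIX_OFFSET 2082844800UL

    int main(void) {
        /* Hypothetical Mac clock value: 100 years (25 leap days) after 1904. */
        uint32_t mac_seconds = 3155760000UL;
        time_t unix_time = (time_t)(mac_seconds - MAC_TO_UNIX_OFFSET);
        printf("Unix time %ld is %s", (long)unix_time, ctime(&unix_time));
        return 0;
    }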
Mac OS 9.2.2 seems to be OK (Score:3, Insightful)
Re:Mac OS 9.2.2 seems to be OK (Score:5, Funny)
Check that 'Speech Recognition'! (Score:4, Funny)
Re:Mac OS 9.2.2 seems to be OK (Score:3, Funny)
Re:Mac OS 9.2.2 seems to be OK (Score:2)
Fixed long ago (Score:2, Informative)
Re:Fixed long ago (Score:3, Funny)
As with many things, the answer should be obvious: time travelers. While the mainstream press seems to have, once again, missed a great Apple story, it can no longer be kept secret: the Macintosh is the preferred computer of time travelers everywhere. Or everywhen. Or at least everywhen across a span of fifty millennia.
Re:Fixed long ago (Score:4, Informative)
This article and all comments seem to be a little twisted.
What's an epoch in this context? An epoch for dates is usually the year after which the entered year is assumed to be in the next century rather than the previous one.
For Macs, this has varied over the years with different software releases.
The other way to look at it might be the date it "rolls over." But second 4,294,967,295 isn't reached for something like 35 years. I think it's in 2040, but I'm not entirely sure. I haven't had to deal with it in a while. :)
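A quick sanity check, assuming an unsigned 32-bit count of seconds from 1904 and the standard 2,082,844,800-second offset to the Unix epoch (needs a 64-bit time_t, since the answer lies past 2038):

    #include <stdio.h>
    #include <time.h>

    /* When does Mac second 0xFFFFFFFF land? Convert it to Unix time via
     * the 1904->1970 offset and let the C library render the date. */
    int main(void) {
        long long unix_t = 4294967295LL - 2082844800LL;  /* = 2,212,122,495 */
        time_t t = (time_t)unix_t;                       /* needs 64-bit time_t */
        printf("%s", ctime(&t));                         /* early February 2040 */
        return 0;
    }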
The only significance of today's date is that it's 100 years after time 0.
(And, of course, there are other APIs available on the Macintosh that won't break even then.)
...from 20,000 BC to 30,000 AD (Score:2)
All's well with 7.5.5, too (Score:2)
(Given that my Apple IIGS got through Y2K without a hiccup, I'm not particularly surprised that there were no issues with newer hardware either.)
Well, uh.. happy.. epoch.. then. (Score:3, Funny)
Apple and the Future (Score:2, Interesting)
Re:Apple and the Future (Score:5, Informative)
1) Who says they model themselves as a hardware company? Companies that do both hardware and the software that runs on it are common in enterprise computing (Sun, IBM, SGI, etc). Would you say those companies have little software experience because they are hardware companies? Apple is very much the consumer equivalent: they make hardware and software woven tightly together. The idea behind a Mac is not that you get superior hardware or superior software, but that you get a package, and that in being a cohesive package it is almost inherently superior to a hodgepodge of off-the-shelf components (much like Sun's claim that Solaris is the best OS for Sparc, or SGI and IBM with IRIX and AIX (both of which are perhaps on the way out, in favor of custom Linux distros)).
2) Yes, Apple patches are offered about as promptly as Microsoft's (which is to say, perhaps not as promptly as they should be). I've seen plenty of reports on Bugtraq of Apple being unresponsive to reported bugs, but then I've seen the same with MS. Presumably they simply didn't take the issue seriously, or deemed it unworthy of addressing for some other reason (which leads us back to just how trustworthy your computing really is, if you can't trust the company that designed it).
3) What ``BSD patching system''? I'm pretty well experienced with administering OpenBSD and FreeBSD, and I am totally unaware of any patching system inherent to all BSD-derived OSes (say, Solaris?). OpenBSD and FreeBSD have similar pkg and ports systems, but that's more because Open liked the way Free did it, not because they are both BSDs (that is, BSD refers to the underlying OS components--as opposed to, say, GNU--not anything else (certainly not the kernel, which, on OS X, is Mach-based, not FreeBSD-based)). I think you are confused.
Re:Apple and the Future (Score:3, Informative)
Re:Apple and the Future (Score:3)
The oft-neglected third option is that there's a long list of things to do ahead of a given defect. There are only so many programming monkeys at Microsoft or Apple working on code. In other words: a neglected defect is not automatically an indication that a company is evil.
Re:Apple and the Future (Score:3, Insightful)
People take it for granted that cars work reliably, just as they take it for granted that computers don't. Back when I started using PCs around the time of Windows 3.1, I took it for granted that errors occurred (actually, I remember, though perhaps ina
Re:Apple and the Future (Score:5, Interesting)
I'm sorry, but I simply don't agree with this point of view. Your heart is in the right place, but this is not the answer.
First, the hacker *is* guilty. Software is designed for a specific purpose (even general-purpose software), and the creator of that software cannot and should not be held liable for its misuse. Problem #1 is that software is written by humans, who are by nature error-prone. Problem #2 is that finding defects and using them maliciously requires creativity. Because of this, there's no practical way for a software company to know that their software is 'liability free'. Problem #3 is that there are far too many products on the marketplace today that can be misused in ways a simple modification would prevent. Why single out software? Problem #4 is that in cyberspace, monetary damage is very difficult to measure. Problem #5 is that the environments the software runs in are far too diverse to guarantee any sort of working order. As such, anybody 'relying' on a computer system would be incredibly ignorant not to have ways of minimizing damage due to loss of functionality or data. (I should pause here a sec to let you know that I'm quite fatigued, and I apologize if what I'm posting is difficult to read.)
Secondly, legislation that says you are liable for an attack somebody else carries out, simply because you didn't cover all your bases, is going to do more harm than good. The Open Source community will be hit the hardest. Who would want to contribute spare time to a project only to open the door to being sued because somebody decides to be a git? I mentioned earlier that there's no real scientific way to certify the 'safety' of software. The only real way to approach that would be heavy testing on a very diverse range of platforms and configurations. I can see Microsoft, with their $25+ billion in the bank, doing this; I can't see a startup company doing that. Nor can I see that startup surviving its first lawsuit over this. The only way to minimize this negative effect on the industry would be to tightly define very specific rules about very specific exploits, such as the one you mentioned with Apple. Well, what good is this legislation going to do if it only covers a limited scope? Okay, I'm drifting a bit here. Sorry. I just don't see this doing anything but making software development less accessible and making megacorps like Microsoft stronger. Software could become 'less exploitable', but the cost of that is growth. Even then, defects will not disappear. BS like the Blaster worm will still happen; it just might take a little longer.
Third, how does one even begin to define effective legislation here? In order to prevent a defect from being exploitable, one has to know every single way that defect can be used. I remember back in the Windows 95 days, you could rename your Windows folder. Doing so meant instantly breaking your system. A shortcut or batch file could be made to do this. If somebody sends out an email tricking people into running a shortcut that does this, how do you define Microsoft's guilt for the damage done? The rename feature works perfectly. Using it to rename your Windows folder is like cruising down the highway at 70mph and shifting into reverse. Sure, the car could be made to prevent that, but why would somebody do that in the first place? Should Honda be partly responsible for deaths caused by somebody saying "
Re:Apple and the Future (Score:2)
In such an instance, Microsoft would not be liable for simply making a mistake as a ``reasonable person'' is apt to do (or, as you said, humans are error-prone). But they would be liable for spending millions on advertising Trusted Computing without actually doing anything in the way of R&D (I don't actually know if they've done anything).
Re:Apple and the Future (Score:5, Informative)
Closer to Microsoft than anything else. Apple's patches generally come in the form of installer applications that can be downloaded and installed automatically via the bundled "Software Update" application (GUI and command line) or can be downloaded and installed manually from the support section of their website.
Apple does not publish the source of any of their GUI applications or the GUI framework itself. It does however release the source to the rest of the OS under the name "Darwin". Patches and other updates to Mac OS X generally find their way into Darwin and can be browsed at http://developer.apple.com/darwin.
The typical artist/writer/mom-or-dad user can click a couple of buttons and have OS X update itself (or even set it to always keep itself updated). More technical users can browse the Darwin website for more details. (This was recently done by several folks wanting to know more about how Panther, Mac OS X 10.3, does its automatic defragmentation and optimizing. They dug around in the Darwin source until they found that particular part of the HFS+ code, examined it, and made a few posts explaining the process to everyone else.)
Re:Apple and the Future (Score:2)
Correction: the typical artist/writer/mom-or-dad user leaves the default settings, so his/her OS X updates itself every week. You don't need to "set it", it's set by default; you have to "click a couple of buttons" to disable it.
Actually, I'm not so sure it's a good idea to have it on by default - I wonder what will happen if a "typical artist/writer/mom-or-
Re:Apple and the Future (Score:2, Insightful)
Comment removed (Score:4, Funny)
Re:That's one bad apple. (Score:2, Funny)
There is no article... (Score:4, Interesting)
Re:There is no article... (Score:4, Funny)
According to this article... [slashdot.org]
epoch == start of time, not duration (Score:5, Informative)
Re:epoch == start of time, not duration (Score:2)
Re:epoch == start of time, not duration (Score:2)
Re:epoch == start of time, not duration (Score:2)
Don't even get
Re:epoch == start of time, not duration (Score:3, Insightful)
Think XML: it is not even close to efficient, but its purpose is not speed but portability and flexibility (roughly).
But freeform text parsing is much messier than that, too.
Nobody said anything about freeform text parsing. If you define a fixed text pattern to represent date/time, then parsing is extremely easy. I.e., all you need to do is look for space separators and separate into tokens.
Re:epoch == start of time, not duration (Score:2, Insightful)
So, assuming you put that in big-endian format (most significant field first, to be able to sort), prefix zeros (to be able to sort), and specify the offset in army time (not all time zones work out that nicely):
2003 01 01 07 26 00 00 -800
This only sorts easily (alphabetically) if comparing times from the same timezone. Otherwise, some different sorting algorithm is required.
Additionally, this requires 27 bytes (versus 4), and...
The times are not
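For what it's worth, normalizing the strings to a single zone (UTC) before storing restores plain lexicographic sorting; a minimal sketch with made-up timestamps:

    #include <stdio.h>
    #include <string.h>

    /* Text timestamps normalized to UTC ("Z") sort correctly with plain
     * strcmp(); mixed-offset strings like "... -800" do not. The values
     * here are hypothetical. */
    int main(void) {
        const char *a = "2003 01 01 15 26 00 00 Z";  /* 07:26 at -800, as UTC */
        const char *b = "2003 01 01 09 00 00 00 Z";
        printf("%s %s %s\n", a, strcmp(a, b) < 0 ? "sorts before" : "sorts after", b);
        return 0;
    }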
Re:epoch == start of time, not duration (Score:2)
Re:epoch == start of time, not duration (Score:2)
Re:epoch == start of time, not duration (Score:2)
This has nothing to do with database normalization.
If you count the seconds since the epoch, you are limiting yourself to those date/times which occur *after* the epoch date/time. With a text format, you have no such limitation. I'
Re:epoch == start of time, not duration (Score:2)
Oh Yeah?! (Score:4, Funny)
Nya, nya!
For sale: original 1904 Mac (Score:5, Funny)
Re:For sale: original 1904 Mac (Score:5, Funny)
You, sir... (Score:2)
(tig)
Ugh. (Score:5, Insightful)
It is unique, in the sense that it is crappy.
On Unix, the epoch is an extremely well-defined moment in time, so any point in time measured in epoch-seconds is also extremely well-defined.
On the Mac, the epoch-seconds depends on the time zone, meaning that in order for a measurement of time in macos-epoch-seconds to be meaningful, you also need to know the time zone. To me, that kind of ruins the whole point...
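A toy sketch of the complaint, with a hypothetical instant and zone offsets (classic Mac OS counts seconds from local midnight, 1904-01-01):

    #include <stdio.h>
    #include <stdint.h>

    /* The same instant as two classic-Mac clocks in different zones would
     * count it. Because the count starts at *local* midnight 1904-01-01,
     * the raw values differ by the zone offset. */
    int main(void) {
        uint32_t utc_secs = 3155760000UL;           /* the instant, counted as if UTC */
        int32_t  pst = -8 * 3600, est = -5 * 3600;  /* zone offsets in seconds */
        printf("PST Mac clock: %u\n", (unsigned)(utc_secs + pst));
        printf("EST Mac clock: %u\n", (unsigned)(utc_secs + est));
        /* Comparing the raw values across machines is off by three hours. */
        return 0;
    }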
Re:Ugh. (Score:5, Interesting)
Bruce
Re:Ugh. (Score:2)
Re:Ugh. (Score:5, Funny)
Reference [usatoday.com]
Re:Ugh. (Score:2)
This battle is about science time vs. calendar time, really.
Re:Ugh. (Score:2, Funny)
Re:Ugh. (Score:2)
It's a big deal for several reasons. First of all, since more people would be driving in the dark to work, half awake, there would be even more accidents than usual. You think that morning commute is bad now?
Secondly, humans are biologically tuned to having a certain amount of daylight each day. It has been shown that people will generally be more sick and less productive.
Re:Ugh. (Score:3, Interesting)
Sheesh.
You kids now-a-days.
(Note: UNIX does not use true UTC, since UTC incorporates leap seconds, which UNIX & POSIX do not honor.)
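To see the note in action: POSIX time_t arithmetic treats every day as exactly 86,400 seconds, so leap seconds never show up in a difference. A minimal sketch (dates arbitrary; both January 1, so the DST state matches):

    #include <stdio.h>
    #include <time.h>

    /* difftime() over POSIX time_t yields an exact whole number of
     * 86,400-second days, leap seconds notwithstanding. */
    int main(void) {
        struct tm a = {0}, b = {0};
        a.tm_year = 100; a.tm_mon = 0; a.tm_mday = 1; a.tm_isdst = -1; /* 2000-01-01 */
        b.tm_year = 104; b.tm_mon = 0; b.tm_mday = 1; b.tm_isdst = -1; /* 2004-01-01 */
        double diff = difftime(mktime(&b), mktime(&a));
        printf("days: %.4f\n", diff / 86400.0);       /* exactly 1461.0000 */
        return 0;
    }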
Huh, I have older files from that (Score:4, Interesting)
Dec. 31, 1903, 6:00 PM
Which may be the default for the Central time zone.
Do I really need those files anymore? Well, sure! Some of them are old entries for the Bulwer Lytton Contest [sjsu.edu], and you never know when I'll have enough to collect for a section of a short story collection. Plus, you know that as soon as I throw away a file, I'll need it the next day. That's just how things work.
This is one of the many, many reasons why I've gone from a 60 Meg to a 60 Gig hard drive. ;-)
Re:Huh, I have older files from that (Score:2)
Re:Huh, I have older files from that (Score:2)
The real question is: can you still open those files? They were created with whatever word processor was on the Mac in '91 (Word? WordPerfect? SimpleText?) but would have to be opened with what's available for OS X.
um.. OK.. (Score:5, Interesting)
A technically interesting length of time (such as 2^32 seconds) from the epoch would be noteworthy, but that's a few decades off.
A non-technically interesting length of time (such as 20 years) from the date the Macintosh was first introduced would also be noteworthy, and that's later this month I believe.
I'm a bit tired; did anyone grok that?
Re:um.. OK.. (Score:3, Interesting)
A non-technically interesting length of time (such as 20 years) from the date the Macintosh was first introduced would also be noteworthy, and that's later this month I believe.
That is indeed later this month, dated from the 1984 Super Bowl, when the Apple commercial aired. And there are some rumors that Apple will air it again during the 2004 Super Bowl, to get some of that old-time feeling back.
Re:um.. OK.. (Score:3, Insightful)
Hardware clock (Score:3, Informative)
Little-known fact (or widely known): almost all Macs will reset to January 1, 1969 if the batter is removed.
Re:Hardware clock (Score:5, Funny)
Removing the batter from most apples will completely ruin the pie. So a reset would seem appropriate.
Re:Hardware clock (Score:2)
Ha! (Score:5, Funny)
Picking Epochs (Score:2, Interesting)
Good question! (Score:5, Funny)
Re:Good question! (Score:2)
(The universe is somewhere between 2^58 and 2^59 seconds old.)
If time were constant everywhere in the universe, you could assign 295,147,905,179,352,825,856 IPv6 addresses to every second. Since it ain't, I'm not sure it's useful to count from the moment the quantum sock that is our universe turned inside out.
Re:Good question! (Score:2)
First, I thought we were talking about computer clocks, not IP address space problems.
Second, what's wrong with assigning IPv6 addresses to every second *even* when time is not constant everywhere?
Third, sunrise comes at a different moment at each point on Earth, because it is rotating. However, we have so-called Universal Time, which is the zero point for all other time zones. In the same way the ag
Re:Homework question for geeks (Score:4, Funny)
Assume 13.9 billion years as the worst case.
Assume 365 days, 5 hours, 48 minutes, and 46.5 seconds for a 'year' -- the time it takes the Earth to go around the Sun once -- and ignore the 'slowdown' (which hasn't happened in 5 years anyway) alluded to in an earlier article. (No real need to be so exact, though -- 31,560,000 seconds would work fine.)
((365*24+5)*60+48)*60+46.5 = 31556926.5 seconds in the 'year'
13.9*10^9 * 3.156*10^7 = 4.38684*10^17 seconds
2^32=4,294,967,296
2^64=18,446,744,073,709,551,616
If you really need to skimp on the bit length, you could make do with 59 bits, which would give us:
2^59=576,460,752,303,423,488 ~= 5.76*10^17 -- at least 100 quadrillion seconds to spare before the Y-2^59 bug rears its ugly head.
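Redoing that arithmetic in a couple of lines, under the same assumed year length:

    #include <stdio.h>
    #include <math.h>

    /* How many bits does a second counter anchored at the Big Bang need? */
    int main(void) {
        double year_s = ((365.0 * 24 + 5) * 60 + 48) * 60 + 46.5; /* 31,556,926.5 s */
        double age_s  = 13.9e9 * year_s;                          /* ~4.39e17 s */
        printf("age: %.5e s, bits needed: %.1f\n", age_s, log2(age_s));
        return 0;
    }

It prints about 58.6, so 59 bits suffice, as claimed.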
Re:Picking Epochs (Score:2, Informative)
Wow, so many mistakes in one post (Score:3, Informative)
Re:Picking Epochs (Score:4, Informative)
The previous version had 1969-01-01 as the epoch.
Palm OS too (Score:4, Interesting)
Re:Palm OS too (Score:5, Interesting)
Re:Palm OS too (Score:2)
It makes sense that Palm would have at least (Score:2)
I don't know how Palm could have gotten away with it without paying money to Apple. Maybe they are both based on a standard Motorola programming model?
Re:It makes sense that Palm would have at least (Score:2)
Re:Palm OS too (Score:3, Informative)
Apple's greatest contribution may not be its spin on the UI or 3.5" floppies or mice or whatever, but the degree to which former Apple employees have taken lessons learned at Apple and applied them to so many new products and technologies over the last 20 years or so. So many successful startups were founded by former Apple employees.
There you go, then. (Score:2)
Explanation as to what this is about (Score:5, Interesting)
Contrary to what the article says, GetDateTime() is still available under the Carbon framework in MacOS X. However, there are now other ways of dealing with date/time in the MacOS. Ironically, the preferred method, CFDate, is also available under MacOS 9. So I don't really get the point of the write-up saying that this works only in MacOS 9.
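For the curious, a Mac-only sketch of the two calls mentioned (assuming, per the parent, that GetDateTime() survives in Carbon; note the two scales have different epochs and time bases):

    #include <CoreFoundation/CoreFoundation.h>
    #include <CoreServices/CoreServices.h>
    #include <stdio.h>

    /* GetDateTime() returns local seconds since 1904-01-01; CFAbsoluteTime
     * is UTC seconds since 2001-01-01. Side by side for comparison. */
    int main(void) {
        UInt32 macSecs;
        GetDateTime(&macSecs);
        CFAbsoluteTime cfNow = CFAbsoluteTimeGetCurrent();
        printf("Mac seconds since 1904 (local): %lu\n", (unsigned long)macSecs);
        printf("CF seconds since 2001 (UTC):    %.0f\n", cfNow);
        return 0;
    }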
Frankly, this is of little interest to anyone who is not a Macintosh programmer - and of only mild interest to those of us who are.
It is interesting to note that the Apple Newton also measures time from this reference point. However, its default date-handling routines measure minutes since 1904 instead of seconds. On the Newton they had no real reason for picking that reference date other than that the Mac already used it.
On the original Mac, they did have a good reason for picking it. Apparently 1904 is the first leap year of the 20th century, and starting there simplified the algorithm for factoring in leap years. Since they were trying to shoehorn a graphical OS onto a 128 KB machine with no hard disk (but they did have some ROMs), you can't really fault them for taking a few shortcuts.
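A sketch of why the shortcut is safe for the clock's whole range (my own comparison, not Apple's actual code): since 2000 is divisible by 400, "every fourth year" agrees with the full Gregorian rule from 1904 through 2040.

    #include <stdio.h>

    /* Compare the simple every-fourth-year rule against the full
     * Gregorian rule over the range the 32-bit clock covers. */
    int is_leap_simple(int y)    { return y % 4 == 0; }
    int is_leap_gregorian(int y) { return (y % 4 == 0 && y % 100 != 0) || y % 400 == 0; }

    int main(void) {
        int mismatches = 0;
        for (int y = 1904; y <= 2040; y++)
            if (is_leap_simple(y) != is_leap_gregorian(y))
                mismatches++;
        printf("mismatches 1904-2040: %d\n", mismatches);  /* prints 0 */
        return 0;
    }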
Re:Explanation as to what this is about (Score:2)
IIRC, they were trying to shoehorn a graphical OS onto a 64 KB machine. At the very last minute, the RAM was doubled. But Andy and the gang had already pulled off a miracle.
Re:Explanation as to what this is about (Score:3, Interesting)
Not entirely. Users of Microsoft Excel across Mac and Windows platforms at least used to have to compensate for the 1904 (Mac) or 1900 (Windows) date systems when copying data. It was a major pain to always have to add or subtract 1,462 days to get the dates to work properly.
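The workaround the parent describes, as a trivial sketch (the serial number is made up):

    #include <stdio.h>

    /* Excel's 1900 (Windows) and 1904 (Mac) date systems number the same
     * calendar day 1,462 apart, so cross-platform copies need the
     * constant added or subtracted. */
    int main(void) {
        long serial_1900 = 38000;               /* hypothetical Windows-side serial */
        long serial_1904 = serial_1900 - 1462;  /* same day in the Mac system */
        printf("1900-system %ld == 1904-system %ld\n", serial_1900, serial_1904);
        return 0;
    }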
20 years of Macintosh (Score:5, Informative)
Re:20 years of Macintosh (Score:2, Funny)
Yes, I know it was the twentieth anniversary of Apple, the company, but isn't the name a little ambiguous?
Re:20 years of Macintosh (Score:2)
However, there were Apple computers made before the "Macintosh" line was released (nearly 20 years ago to the day). There were both Lisa computers (with a GUI) and plain Apple computers (not Apple Macintosh computers).
Related Comic (Score:4, Funny)
One of the weirdest stories ever... (Score:5, Funny)
OS X: Like Linux, but a whole lot slower... (Score:2, Insightful)
and all you Mac users are using Mac OS X, right?
No, actually. You forgot that OS X is optimised for the G4 architecture and newer. Even a fast G3 box is often brought to its knees by Jaguar due to its lack of specific hardware features. OS 9 is not dead: that is Apple marketing hype. Sure, it's becoming more of a niche platform, and eventually the market will drive it to being a "retro platform" or whatever, but that's another couple of years at least. But it's preferred if you don't have a particular need for a
Re:OS X: Like Linux, but a whole lot slower... (Score:2, Offtopic)
Personally, I could never bring myself to use Macs before OS X, simply because they were so different from everything else on the market at the time. OS X bridges the divide and still lets me get my work done, with the ease of use of the Mac and the fantastic development environment brought about by Unix and Cocoa.
Epoch, Tick, Wall Time & Wrap Around (Score:3, Informative)
Epoch: the date corresponding to 0 in an operating system's clock and timestamp values. Under most Unix versions the epoch is 00:00:00 GMT, January 1, 1970; under VMS, it's 00:00:00 of November 17, 1858 (base date of the US Naval Observatory's ephemerides); on a Macintosh, it's the midnight beginning January 1, 1904. System time is measured in seconds or ticks past the epoch. Weird problems may ensue when the clock wraps around, which is not necessarily a rare event; on systems counting 10 ticks per second, a signed 32-bit count of ticks is good only for 6.8 years. The 1-tick-per-second clock of Unix is good only until January 18, 2038, assuming at least some software continues to consider it signed and that word lengths don't increase by then.
Wall Time: the `real world' time (what the clock on the wall shows), as opposed to the system clock's idea of time. Also the real running time of a program, as opposed to the number of ticks required to execute it (on a timesharing system these always differ, as no one program gets all the ticks, and on multiprocessor systems with good thread support one may get more processor time than real time).
Wrap Around: what a counter does when it starts over at zero or at `minus infinity' (see infinity) after its maximum value has been reached and it continues incrementing, either because it is programmed to do so or because of an overflow (as when a car's odometer starts over at 0).
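Wrap around in miniature, unsigned flavor:

    #include <stdio.h>
    #include <stdint.h>

    /* An unsigned counter at its maximum rolls over to zero on the next
     * increment, like a car's odometer. (Signed overflow, by contrast,
     * is undefined behavior in C.) */
    int main(void) {
        uint32_t ticks = UINT32_MAX;
        printf("before: %u\n", (unsigned)ticks);
        ticks++;                                 /* well-defined for unsigned */
        printf("after:  %u\n", (unsigned)ticks); /* prints 0 */
        return 0;
    }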
Re:Epoch, Tick, Wall Time & Wrap Around (Score:2)
Weird problems may ensue when the clock wraps around, which is not necessarily a rare event; on systems counting 10 ticks per second, a signed 32-bit count of ticks is good only for 6.8 years. The 1-tick-per-second clock of Unix is good only until January 18, 2038, assuming at least some software continues to consider it signed and that word lengths don't increase by then.
You're confusing the epoch datetime and "ticks," at least on the Macintosh.
The classic Mac OS had the normal epoch seconds time (which is wh
Hmmm... (Score:2)
Re: (Score:2)
Poor Mac Users (Score:2)
What a shame. Mac users obviously weren't able to participate in the net prior to 1904. Well, at least there are archives like Google Groups where they can read what they missed.
BTW, the Apple II has the same calendar scheme as the Mac. My GS's calendar is good through 2038.
Re:huh? (Score:2, Insightful)
Re:Uhhh no it's not (Score:5, Informative)
1904, 1956, 1976, 1984, 2001, depending on the machine.
This was a "Stump the Experts" question at the 2003 Worldwide Developers Conference.
more than that... (Score:5, Funny)
Nothing compared to that guy who came up with the internationalisation bug/easter egg that took three minutes just to describe...
I thought WWDC was full of nerds, but then Stump the Experts was like concentrated nerd juice...
Mac Geek Trivia (Score:5, Informative)
As for January 1, 1904, this date was selected because the original Mac's clock (which counts in seconds) can encompass a period of about 136 years. Selecting 1904 as the start date means that the 136-year period covered by the clock (1904-2040) includes the birthdate of nearly every Mac user and extends well past the expected lifetime of the Mac OS. It also means that the simplest rule for leap years can be used (every fourth year has an extra day), which simplifies day and date calculations. They didn't choose the year 1900 because 1900 is not a leap year.
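Checking the "about 136 years" figure takes one division (average Gregorian year assumed):

    #include <stdio.h>

    /* An unsigned 32-bit second counter spans 2^32 seconds; divide by an
     * average year to get the reach of a clock started in 1904. */
    int main(void) {
        double years = 4294967296.0 / 31556952.0;  /* 2^32 s / avg Gregorian year */
        printf("2^32 seconds = %.1f years; 1904 + 136 lands in 2040\n", years);
        return 0;
    }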
~Philly
Re:Uhhh no it's not (Score:2)
Re:Uhhh no it's not (Score:3, Insightful)
Re:Uhhh no it's not (Score:2)
Re:Uhhh no it's not (Score:2)
Re:Dear Apple, (Score:5, Funny)
Okay, the whole 'Mac users are gay' troll is very stale now. Here's something a little fresher:
"I heard that OSX is based on eunichs!"
(man I hope the mod dudes are in good humor today.)
Re:Dear Apple, (Score:2)
Re:Dear Apple, (Score:2)
Oh man, heh. I hope that gets modded up. It'll light a fire under the Slashdotters here to come up with something a little more edgy than 'Windoze'.
Re:Dear Apple, (Score:2)
Re:Count backwards to 1901? (Score:2)