Apple II DOS Source Code Released

gbooch writes "The Computer History Museum in Mountain View, California, is not just a museum of hardware, but also of software. The Museum has made public such gems as the source code for MacPaint, Photoshop, and APL, and now code from the Apple II. As their site reports: 'With thanks to Paul Laughton, in collaboration with Dr. Bruce Damer, founder and curator of the Digibarn Computer Museum, and with the permission of Apple Inc., we are pleased to make available the 1978 source code of Apple II DOS for non-commercial use. This material is Copyright © 1978 Apple Inc., and may not be reproduced without permission from Apple.'"

  • by i kan reed ( 749298 ) on Tuesday November 12, 2013 @04:35PM (#45405655) Homepage Journal

    Whatever your complaints about your job, at least debugging your code doesn't involve stepping through assembly on a pencil and paper virtual machine.

    • by adisakp ( 705706 ) on Tuesday November 12, 2013 @04:38PM (#45405701) Journal

      Whatever your complaints about your job, at least debugging your code doesn't involve stepping through assembly on a pencil and paper virtual machine.

      That was how I wrote my first published game back in the 80's. I have no complaints. Everything was new back then and even though the "wheel hadn't yet been invented", programming was still exciting and it was some of the most fun coding I have ever done.

      • by i kan reed ( 749298 ) on Tuesday November 12, 2013 @04:42PM (#45405759) Homepage Journal

        I like to imagine every new programmer has that amazing sense of euphoria as they begin to uncover all the major algorithms for themselves, and begin developing a sense of just how much is possible with programming.

        Then it's your job. To give the end-user some uninteresting but necessary layer of data connectivity.

        • Re: (Score:3, Insightful)

          As I've said numerous times: kids shouldn't be learning mathematics without the most powerful way to directly apply it: programming. Seriously, the #1 complaint when teaching a kid math more advanced than long division is "I'll never use this in the real world" -- change that. Teach algebra via computer programs and kids could actually DO STUFF by applying their knowledge immediately. That's how I'm able to turn any kid flunking out of math into the head of the class.

          I learned BASIC on an Ap

      • Whatever your complaints about your job, at least debugging your code doesn't involve stepping through assembly on a pencil and paper virtual machine.

        That was how I wrote my first published game back in the 80's. I have no complaints.

        Do you think the pencil and paper mechanics made any qualitative difference, good or bad, to the overall learning process?

        • Whatever your complaints about your job, at least debugging your code doesn't involve stepping through assembly on a pencil and paper virtual machine.

          That was how I wrote my first published game back in the 80's. I have no complaints.

          Do you think the pencil and paper mechanics made any qualitative difference, good or bad, to the overall learning process?

          Don't bother waiting for a response. It's only been about thirty years, so I imagine he won't be simulating the first input interrupt for another ten years or so.

          Damn, I bet he *hates* when people write spinlocks. Must be infuriating...

      • by Dimwit ( 36756 )

        What game, just out of curiosity? I remember reading about how Ant Attack was developed that way.

    • by _merlin ( 160982 ) on Tuesday November 12, 2013 @04:43PM (#45405769) Homepage Journal

      Actually it does. That's how we track down compiler bugs, and also how we backtrack from the crash location to look for the cause when we have a core from an optimised build.

    • by perpenso ( 1613749 ) on Tuesday November 12, 2013 @04:58PM (#45405975)

      Whatever your complaints about your job, at least debugging your code doesn't involve stepping through assembly on a pencil and paper virtual machine.

      Back then it was actually easier to read through large amounts of code, flipping between different sections, etc., when it was on paper.

      The listings weren't used for pencil-and-paper emulation; we had quite nice integrated editors and debuggers to see what was going on (e.g. the LISA 6502 assembler). The listings were for reading and understanding. They were used somewhat like tablets today: you could take a listing anywhere, flop down on the couch and start reading, ...

      • by fermion ( 181285 )
        And how. I recall once when I was refactoring some code written in a relatively low-level language: I printed it off, physically cut it up, and played with the pieces until I could imagine what I wanted to do. Seriously, there has been more than one occasion when creating a physical space for the code helped me solve a complex problem.
    • by kylemonger ( 686302 ) on Tuesday November 12, 2013 @05:09PM (#45406087)
      It didn't involve pencil and paper for long on the Apple II. I remember reading about a step-trace 6502 debugger for the Apple II back then. I didn't have any money to buy it so I wrote my own (in assembler of course) to ease debugging of a video game I was writing. It wasn't a hard job; the 6502 instruction set is small and straightforward and the CPU only has three registers.
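
      For flavor, here's a minimal sketch in C (a modern toy, not period code) of what a single-step trace loop boils down to; the three opcodes and the register set are real 6502, everything else is illustrative:

      #include <stdint.h>
      #include <stdio.h>

      /* Toy single-step tracer: decode one opcode, print the registers, advance PC.
         Only three opcodes are handled and the flags are ignored -- just enough to
         show how little CPU state there is to keep track of. */
      typedef struct { uint8_t a, x, y; uint16_t pc; uint8_t mem[65536]; } cpu6502;

      static void step(cpu6502 *c) {
          uint8_t op = c->mem[c->pc];
          printf("%04X: %02X  A=%02X X=%02X Y=%02X\n", c->pc, op, c->a, c->x, c->y);
          switch (op) {
          case 0xA9: c->a = c->mem[c->pc + 1]; c->pc += 2; break; /* LDA #imm */
          case 0xE8: c->x += 1;                c->pc += 1; break; /* INX      */
          case 0xEA:                           c->pc += 1; break; /* NOP      */
          default:                             c->pc += 1; break; /* not decoded here */
          }
      }

      int main(void) {
          static cpu6502 c = { .pc = 0x0300 };
          c.mem[0x0300] = 0xA9; c.mem[0x0301] = 0x42;  /* LDA #$42 */
          c.mem[0x0302] = 0xE8;                        /* INX      */
          for (int i = 0; i < 2; i++) step(&c);
          return 0;
      }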
    • by oldhack ( 1037484 ) on Tuesday November 12, 2013 @05:27PM (#45406297)

      Reading 6502 assembly is easier than reading some of today's bloated and convoluted Java/Perl/FP/what-have-you code. It's nothing like the assembly of modern CPUs, with out-of-order execution, branch prediction, and all the other complexities.

      Also, from a technical perspective, publishing the source for 6502 machine code wasn't that big a deal. You could recreate a reasonable assembly source from the machine code by spending some time with a reverse assembler (unless the code does goofy things like overwriting its own code). In fact, Apple II monitor code had a nifty reverse assembler built in.
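
      To make the "recreate the source" point concrete, here's a toy disassembler loop in C; the three opcodes shown ($A9 LDA #imm, $4C JMP abs, $60 RTS) are real 6502, but a real table needs on the order of 150 entries:

      #include <stdint.h>
      #include <stdio.h>

      /* Toy 6502 disassembler: a handful of opcodes, monitor-style output lines.
         The point is how mechanical the job is, not completeness. */
      static int disasm(const uint8_t *m, uint16_t pc) {
          switch (m[pc]) {
          case 0xA9: printf("%04X-   A9 %02X      LDA   #$%02X\n", pc, m[pc+1], m[pc+1]); return 2;
          case 0x4C: printf("%04X-   4C %02X %02X   JMP   $%02X%02X\n", pc, m[pc+1], m[pc+2], m[pc+2], m[pc+1]); return 3;
          case 0x60: printf("%04X-   60         RTS\n", pc); return 1;
          default:   printf("%04X-   %02X         ???\n", pc, m[pc]); return 1;
          }
      }

      int main(void) {
          const uint8_t code[] = { 0xA9, 0x00, 0x4C, 0x00, 0x03, 0x60 };  /* LDA #$00 / JMP $0300 / RTS */
          uint16_t pc = 0;
          while (pc < sizeof code) pc += (uint16_t)disasm(code, pc);
          return 0;
      }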

      • by NormalVisual ( 565491 ) on Tuesday November 12, 2013 @06:10PM (#45406691)
        In fact, Apple II monitor code had a nifty reverse assembler built in.

        I'm sure there are a lot of us that remember "CALL -151"... :-)
        • by ArcherB ( 796902 )

          In fact, Apple II monitor code had a nifty reverse assembler built in.

          I'm sure there are a lot of us that remember "CALL -151"... :-)

          I remember that. For whatever reason, 3D0G would get me back out of it. I was just a kid and had no idea what to do with the gibberish that the monitor would spit out at me; I just knew how to get out and back to my prompt.

    • That is true, but growing up on the Apple ][ I didn't have a good paint program, so I made my graphics by filling in squares on graph paper, making a list of the coordinates on lined paper, and then typing them in.

      Audio was worse, because you had to translate the tones into frequencies, and (attempt to) account for the execution time of your own code when deciding on the note timing.
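
      As a back-of-the-envelope illustration of that translation, assuming a ~1.023 MHz 6502 and roughly 5 cycles per pass through a software delay loop (both numbers are ballpark assumptions, not measurements):

      #include <stdio.h>

      /* Rough math for building a note table: how many delay-loop passes between
         speaker toggles for a given pitch. The speaker toggles twice per period. */
      int main(void) {
          const double cpu_hz = 1023000.0;      /* assumed ~1.023 MHz 6502 clock */
          const double cycles_per_loop = 5.0;   /* assumed cost of one loop pass */
          const char  *names[] = { "A4", "C5", "E5" };
          const double freqs[] = { 440.00, 523.25, 659.25 };
          for (int i = 0; i < 3; i++) {
              double half_period_cycles = cpu_hz / (2.0 * freqs[i]);
              printf("%s  %7.2f Hz  ~%4.0f cycles per toggle  ~%3.0f loop passes\n",
                     names[i], freqs[i], half_period_cycles, half_period_cycles / cycles_per_loop);
          }
          return 0;
      }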

  • Legacy Support (Score:2, Interesting)

    by pubwvj ( 1045960 )

    I wish that Apple, and other companies, would create deep legacy support all the way back. Software from the Apple II should be able to run on Mac OS X and iOS. The computational power is there to do the necessary emulation.

    • Re:Legacy Support (Score:5, Informative)

      by stewsters ( 1406737 ) on Tuesday November 12, 2013 @04:42PM (#45405763)
      Ask and ye shall receive?
      http://www.virtualapple.org/ [virtualapple.org]
    • by Greyfox ( 87712 )
      Ooh, then I could dig up the old 5 1/4" diskettes (made double-sided with a hole punch) with my pirated (yes, I was 13 for a year) copy of Karateka on it, and take that for a spin again.

      Actually I'm pretty sure MAME or one of its associated projects will emulate the ol' Apple II, and I think also the C64 and maybe even the TI-99/4A. So while Apple doesn't support it directly, you probably could get that, at least on your OS X machine. Now the Amiga was a sexy little box but I haven't seen an emulator project

      • Yaaaa! And don't forget that Karateka was a bootable image, not requiring any DOS to get into.

      • Ooh, then I could dig up the old 5 1/4" diskettes (made double-sided with a hole punch) with my pirated (yes, I was 13 for a year) copy of Karateka on it, and take that for a spin again.

        Back then I thought it was so cool that Karateka played with an inverted screen if you flipped the floppy over.
    • Meh... While I can see the value, this is exactly the problem that Windows is stuck in. Although they aren't completely backwards compatible, they try to be backwards compatible for a lot of stuff, which means they have to hold on to libraries which are poorly designed, and in some cases incorrect implementations because so much software depends on the incorrect implementation. MacOS is much cleaner because it has maintained less backwards compatibility. If you want to run old software, do it in a virtual machine, and allow the OS itself to evolve and drop the baggage of keeping the compatibility. Not to say that everything should be changed every OS iteration, but there needs to be a process for getting rid of the cruft.
      • Re:Legacy Support (Score:5, Insightful)

        by sjames ( 1099 ) on Tuesday November 12, 2013 @05:25PM (#45406267) Homepage Journal

        Some sort of virtual machine is the correct way to do legacy support. In some cases full virtualization is the answer, in others, a thinner layer that looks like the old OS to the application and like a modern app to the outer OS might be more appropriate.

        The MS approach of keeping the severely broken APIs around forever is NOT the answer.

        • That's part of why I liked OS/2 (2.0 and later) - if something needed DOS 2.1 to run properly, you just booted a 2.1 image in another window and you were good to go.
        • One problem with virtualization (e.g. XP Mode) or paravirtualization (e.g. WOW64) is that it's likely to support only those applications that use peripherals supported by the operating system's bundled class drivers. It's far less likely to support applications that use a custom driver, such as an EPROM programmer.
          • by sjames ( 1099 )

            That is a limit of the particular implementation rather than the concept. There is no reason they couldn't let the VM handle the device transparently.

            • by tepples ( 727027 )

              There is no reason they couldn't let the VM handle the device transparently.

              Other than that a lot of programs (ab)using the LPT port as a GPIO are fairly timing-sensitive. And other than that Microsoft wants to control who has the right to market Windows-compatible hardware through the Windows Logo program, and it's likely to make VM I/O passthrough difficult for this reason, especially for a freely licensed VM such as VirtualBox.

              • by sjames ( 1099 )

                These days, virtualization is fast enough that using the LPT as a GPIO should work just fine.

                The rest are reasons MS WILL not let the VM handle the devices, not reasons they CAN not.

        • by smash ( 1351 )
          In theory, sure. In practice, when you want applications to talk to each other and share data, virtualization doesn't really work very well. Also, Microsoft don't keep around broken APIs *forever*. A long time, yes (2-3 business upgrade cycles, so say 10-15 years, sometimes more, sometimes less) - but not forever.
        • That's pretty much how Linux does it as well, for libraries that do backwards compatibility at all. You provide a file that tells the linker which version of an exposed API symbol binds to which internal implementation. The linker embeds the library version linked against into the executable and voila, your program can run against a newer version of the library with no expensive, bloated VM infrastructure required.
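
          A minimal sketch of that mechanism for the curious (GNU toolchain on ELF; the library name, version names, and functions are made up for illustration):

          /* libfoo.c -- sketch of GNU symbol versioning. Build roughly like:
               gcc -shared -fPIC libfoo.c -Wl,--version-script=libfoo.map -o libfoo.so
             with libfoo.map containing:
               LIBFOO_1.0 { global: foo; local: *; };
               LIBFOO_2.0 { global: foo; } LIBFOO_1.0;
             Binaries linked against the old library keep resolving foo@LIBFOO_1.0,
             while new links bind to foo@@LIBFOO_2.0. */

          int foo_v1(int x) { return x; }        /* old behaviour, frozen for old binaries */
          int foo_v2(int x) { return 2 * x; }    /* current behaviour                      */

          __asm__(".symver foo_v1, foo@LIBFOO_1.0");
          __asm__(".symver foo_v2, foo@@LIBFOO_2.0");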
      • Re:Legacy Support (Score:5, Informative)

        by tlhIngan ( 30335 ) <slashdot@worf.net> on Tuesday November 12, 2013 @06:15PM (#45406743)

        Meh... While I can see the value, this is exactly the problem that Windows is stuck in. Although they aren't completely backwards compatible, they try to be backwards compatible for a lot of stuff, which means they have to hold on to libraries which are poorly designed, and in some cases incorrect implementations because so much software depends on the incorrect implementation. MacOS is much cleaner because it has maintained less backwards compatibility. If you want to run old software, do it in a virtual machine, and allow the OS itself to evolve and drop the baggage of keeping the compatibility. Not to say that everything should be changed every OS iteration, but there needs to be a process for getting rid of the cruft.

        No, what happens is that Windows has to work around everyone else's bugs - a lot of nasty developers don't do things the proper way and Windows suffers. It's why "C:\Documents and Settings" still exists on Windows Vista/7/8 - so many developers hard-code that string (including the "C:\" part!) that removing the compatibility junction that points it at the real location would break their programs.
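
        For reference, a small sketch of doing it the proper way via the shell API instead of hard-coding the path (Vista and later, error handling trimmed; link against shell32/ole32):

        #include <windows.h>
        #include <shlobj.h>
        #include <stdio.h>

        /* Ask Windows where per-user application data lives instead of assuming
           "C:\Documents and Settings\<user>\Application Data". */
        int main(void) {
            PWSTR path = NULL;
            if (SUCCEEDED(SHGetKnownFolderPath(&FOLDERID_RoamingAppData, 0, NULL, &path))) {
                wprintf(L"Per-user application data lives in: %ls\n", path);
                CoTaskMemFree(path);   /* the caller frees the string the shell allocated */
            }
            return 0;
        }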

        Apple decided to take the other approach - basically dictating that if you rely on anything beyond the published APIs, your programs will probably break. Yes, you can use private APIs, but as per the warning, Apple reserves the right to change the private APIs as they see fit.

        Which is better? There's no consensus - Microsoft's means your programs keep working, crappy coding and all, but you have to live with the fact that you still have a window named "Program Manager", and that on a localized version of Windows (where the folder name really is translated) an English "Program Files" folder will eventually show up anyway because some program hard-coded it, etc.

        Apple's means a leaner system, because all these hacks don't need to exist - private APIs are not fixed in stone but can change, be updated as time goes on, and be deleted when necessary, rather than having to hang around because some app uses them.

    • One of Apple's cheap Macs (the "LC"?) had a single nubus slot that took a special card that allowed you to hook up an Apple 5.25 floppy -- and had a hardware/software Apple // emulator.

      These days, no mac supports any kind of floppy, even via USB -- you'd have to download disk image files to get to the software.

      • by dgatwood ( 11270 )

        USB floppy drives should work just fine, assuming they comply with the UFI spec. Those drives won't read Apple 3.5" disks, though, because AFAIK none of the USB floppy drives support GCR.

        • by dgatwood ( 11270 )

          And by Apple 3.5" disks, I mean 400k or 800k. The 1.44 MB format was the same as it is on PCs.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        One of Apple's cheap Macs (the "LC"?) had a single nubus slot that took a special card that allowed you to hook up an Apple 5.25 floppy -- and had a hardware/software Apple // emulator.

        These days, no mac supports any kind of floppy, even via USB -- you'd have to download disk image files to get to the software.

        It was a Processor Direct Slot (PDS), and most of the LC line supported it IIRC (up to at least the 68040-based LC 575 -- the original LC was built around the 68020). Had an Apple IIe SoC. http://apple2online.com/web_documents/apple_iie_card_owner__s_guide.pdf [apple2online.com]

        You can still use USB floppy drives with modern Macs; I recently retrieved some files off of 3.5" disks on my Core i5 MacBook Air (2013 edition, the one right before the current Haswell version)...

    • Kind of like every version of Windows was an unstable POS until they dropped all of the DOS underpinnings from it? Or should companies who have realized their products have outlived their utility be forced to continue to support them with their current products even decades later?

  • Contradictory? (Score:3, Interesting)

    by innocent_white_lamb ( 151825 ) on Tuesday November 12, 2013 @04:36PM (#45405673)

    It's being "made available" but it "may not be reproduced."

    How does that work, again?

  • by eclectro ( 227083 ) on Tuesday November 12, 2013 @04:39PM (#45405729)

    Because it could be a competent competitor to current Apple products?

    I know I want an Apple II smartphone that I could play Oregon Trail on and make phone calls back to the '70s with!

  • by crow ( 16139 ) on Tuesday November 12, 2013 @04:39PM (#45405733) Homepage Journal

    Back in the day, the source code for Atari DOS was included in a published book that explained exactly how it worked. That's one of the things that was great about that platform--so much information was readily available.

    It was all written in 6502 assembly. Anyone who cared would disassemble it themselves, so it's not like there were any big proprietary secrets to protect. I'm surprised that this wasn't published 30 years ago.

    • by TheCarp ( 96830 )

      Actually when did that stop as a general practice?

      At 35, I feel like I'm just on the cusp: I remember when many, if not most, of the consumer electronics my parents bought when I was a kid came with schematics. I mean, my father was no electrical engineer; he was one of those guys who knew just enough to avoid the capacitors in the back of the TV, how to identify fuses, and how to resolder a bad connection... but not enough to analyze logic or signals and really fix a non-trivially broken TV or

      • by LocalH ( 28506 )

        When people generally stopped repairing technology and instead chose to start throwing it away and replacing it with new.

      • Re: (Score:3, Insightful)

        IIRC, the change occurred in the mid to late 90's, as software and hardware got complex enough that a lot of it started being subcontracted, and storage got large enough that you could store the entire set of plans digitally, making both the plans and the documentation much more mobile. However, the shift really began in the mid 80's, when the increasingly complex manuals started being "available" instead of provided by default.

        Some examples include the Apple IIGS being the first Apple-based PC (as opposed

    • by etash ( 1907284 )
      I guess that's why Atari does not exist today, while Apple sits on $100 billion :P (I'm not an Apple fanboy, BTW)
      • by iroll ( 717924 )

        You would guess wrong, because Apple computers of the same vintage also came with schematics and source code.

    • by Dogtanian ( 588974 ) on Tuesday November 12, 2013 @07:27PM (#45407505) Homepage

      Back in the day, the source code for Atari DOS was included in a published book that explained exactly how it worked. That's one of the things that was great about that platform--so much information was readily available.

      Yes, but possibly in spite of, rather than because of, Atari themselves. According to the book "Hackers" by Steven Levy, the Atari 800 was treated as a closed platform in the early days, and Atari wouldn't divulge documentation on its inner workings:

      Transferring his new assembly-language skills to the Atari was difficult. The Atari was a "closed machine". This meant that Atari sequestered the information concerning the specific results you got by using microprocessor assembly-language commands. It was as if Atari did not want you to be able to write on it. It was the antithesis of the Hacker Ethic. John would write Atari's people and even call them on the telephone with questions; the voices on the phone would be cold, bearing no help. John figured Atari was acting that way to suppress any competition to its own software division. This was not a good reason at all to close your machine. (Say what you would about Apple, the machine was "open", its secrets available to all and sundry). So John was left to ponder the Atari's mysteries, wondering why Atari technicians told him that the 800 gave you only four colors in the graphics mode, while on the software they released for it, games like "basketball" and "Super Breakout", there were clearly more than eight colors.

      Of course, it's true that all this stuff was *later* very well-documented, but how much Atari helped in that is open to question (*). It's certainly well-known that Atari were assholes in general in their late-70s/early-80s heyday, and they definitely tried to suppress third-party development of VCS games. So though I've heard enough people disputing aspects of "Hackers" not to take it as gospel, it does seem to tie in with what I've heard about Atari at the time.

      The Atari DOS [atariarchives.org] book doesn't appear to have been published by Atari themselves, and whether it was with their blessing, I don't know. "Mapping the Atari" wasn't an official publication either.

      While Atari released documentation, I suspect it was at the level *they* wanted people to be using the machine at. And for all their plus points, the 400 and 800 were clearly intended as more closed, consumer-oriented machines. The 800 did have some good expansion capabilities, but this was clearly meant to be done via its official ports and interfaces designed for that use. The lower-end version, the Atari 400, had far less official expansion capability; e.g. it was never originally designed to support RAM expansion - it was possible, but apparently required far less friendly hardware modifications, installed directly onto the motherboard.

      The 1200XL was notoriously even more closed (and flopped massively). FWIW, the BASIC "manual" that came with my 800XL was a paltry pamphlet, and the official DOS 3 manual was nicely-presented, but certainly not deep.

      Of course, it all worked out in the end, but I guess what I'm saying is: let's not romanticise the original intentions of companies like Atari back then, who'd have been happy to sit on those secrets and not release them to their users (whom they viewed as potential competition).

      (*) Those early days (1979 onwards) were before my time- I got my 800XL in 1986, so I can't speak from personal experience.

  • by tekrat ( 242117 ) on Tuesday November 12, 2013 @04:45PM (#45405803) Homepage Journal

    Seriously, this is cool and all, but, why wasn't this done over a decade ago? In fact, Apple should have done it themselves *before* ending the manufacturing of the Apple //, to inspire people to find new ways to hack this machine and utilize it in ways never intended by Woz.

    It's sad that one of the best hacking platforms out there is the Raspberry Pi, and not the much simpler to figure out Apple // -- although to be fair, people are doing amazing things with the Pi, I just wish there was a popular 8-bit machine out there for the young'ns to get them started.

  • by JoeyRox ( 2711699 ) on Tuesday November 12, 2013 @04:49PM (#45405853)
    Can someone please transcribe this into 6502 binary instructions and place it onto punch cards for easier reading?
  • Now to find copies of all the old Beagle Bros software titles
  • for non-commercial use. This material is Copyright © 1978 Apple Inc., and may not be reproduced without permission from Apple.

    What the hell? There is no commercial use. Stop being dicks and release it to the public domain.

  • Maybe we can fix a few bugs as a community, eh?

  • There was Beneath Apple DOS [apple2history.org], a fabulous book from the time which was invaluable for figuring out what was going on. My understanding was that Don Worth and Peter Lechner disassembled the shipped code and sorted out how things worked, with great explanations. It was a great guide and helpful for writing all kinds of software. I suspect that a similar effort these days would not be resolved without legal intervention- I have no idea if they even asked permission or if it would have occurred to people t

    • There was Beneath Apple DOS, a fabulous book from the time which was invaluable for figuring out what was going on. My understanding was that Don Worth and Peter Lechner disassembled the shipped code and sorted out how things worked, with great explanations.

      One thing that made their task easier was a program supplied with Apple DOS called FID (File Developer). That program hooked into a mid-level part of DOS called the File Manager. FID spent a lot of time populating a data structure called the "File Manager Parameter List" and then calling various lower-level routines.
      Worth and Lechner, however, did a wonderful job of explaining Apple DOS at all levels, from how the disk hardware works all the way up to the command processor.
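
      Purely to illustrate that call pattern -- fill in a parameter block, call one entry point, read the results back out of the block -- here is a C-flavored sketch; the field names are invented for illustration and are NOT the real File Manager Parameter List layout from the book:

      #include <stdint.h>
      #include <stdio.h>

      /* Invented stand-in for a File-Manager-style parameter block. */
      typedef struct {
          uint8_t  call_type;       /* open / close / read / write / ... */
          uint8_t  sub_code;        /* variant of the call               */
          uint16_t record_number;
          uint8_t  slot, drive, volume;
          uint8_t  return_code;     /* filled in by the callee           */
      } fm_parm_list;

      /* Stub standing in for the DOS File Manager entry point. */
      static void file_manager(fm_parm_list *p) {
          p->return_code = 0;       /* pretend the call succeeded */
      }

      int main(void) {
          fm_parm_list p = {0};
          p.call_type     = 3;      /* e.g. "read one record"   */
          p.record_number = 42;
          p.slot = 6; p.drive = 1;  /* the classic S6,D1 floppy */
          file_manager(&p);         /* one call; results come back in the block */
          printf("return code = %u\n", p.return_code);
          return 0;
      }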

  • I haven't programmed assembly in decades and can still read it and follow along. Crazy how some things just get burned into one's gray matter.
  • Aren't you glad you're a coder now and not in 1978? I know there are still ASM programmers, but seriously:

            ORG     $B800
            OBJ     $B800
    PRENIBL LDX     #$32        INDEX FOR (51) 5-BYTE PASSES.
            LDY     #$0         USER BUF INDEX.
    PNIB1   LDA     (BUF),Y     FIRST OF 5 USER BYTES.
            STA     T0          (ONLY 3 LSB'S USED)
            LSR
            LSR                 ;5 MSB'S TO LOW BITS.
            LSR
            STA     NBUF1,X     FIRST OF 8 5-BIT NIBLS.
            INY
            LDA     (BUF),Y     SECOND OF 5 USER BYTES.
            STA     T1          (ONLY 3 LSB'S USED)
            LSR
            LSR                 ;5 MSB'S TO LOW BITS.
            LSR
            STA     NBUF2,X     SECOND OF 8 5-BIT NIBLS.
            INY
            LDA     (BUF),Y     THIRD
