
Why Was HyperCard Killed?

theodp writes "Steve Jobs took the secret to his grave, but Stanislav Datskovskiy offers some interesting and illustrated speculation on why HyperCard had to die. 'Jobs was almost certainly familiar with HyperCard and its capabilities,' writes Datskovskiy. 'And he killed it anyway. Wouldn't you love to know why? Here's a clue: Apple never again brought to market anything resembling HyperCard. Despite frequent calls to do so. Despite a more-or-less guaranteed and lively market. And I will cautiously predict that it never will again. The reason for this is that HyperCard is an echo of a different world. One where the distinction between the "use" and "programming" of a computer has been weakened and awaits near-total erasure. A world where the personal computer is a mind-amplifier, and not merely an expensive video telephone. A world in which Apple's walled garden aesthetic has no place.' Slashdotters have bemoaned the loss of HyperCard over the past decade, but Datskovskiy ends his post on a keep-hope-alive note, saying: 'Contemplate the fact that what has been built once could probably be built again.' Where have you gone, Bill Atkinson, a nation of potential programmers turns its lonely eyes to you."
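For readers who never touched it, the model being mourned is easy to sketch. Below is a toy Python approximation (all names invented; this is not HyperCard's actual object model or HyperTalk): a stack of cards, where any button a user drops onto a card carries its own little script, so "using" and "programming" blur into one activity.

    # A minimal sketch of the stack/card/script idea, NOT HyperCard itself.
    class Button:
        def __init__(self, label, script):
            self.label = label
            self.script = script  # plain callable standing in for a HyperTalk "on mouseUp" handler

        def click(self, card):
            self.script(card)

    class Card:
        def __init__(self, name):
            self.name = name
            self.fields = {}   # free-form text fields the user edits directly
            self.buttons = []

    class Stack:
        def __init__(self, cards):
            self.cards = cards
            self.index = 0

        def go_next(self):
            self.index = (self.index + 1) % len(self.cards)  # wraps, like "go next card"

    # A two-card toy: the user "programs" by attaching a script to a button.
    home, entry = Card("home"), Card("entry")
    entry.fields["name"] = "Ada Lovelace"
    stack = Stack([home, entry])
    home.buttons.append(Button("Next", lambda card: stack.go_next()))

    home.buttons[0].click(home)
    print(stack.cards[stack.index].fields)  # -> {'name': 'Ada Lovelace'}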
  • by Anonymous Coward on Wednesday November 30, 2011 @02:42PM (#38217080)

    SuperCard didn't flourish. The market was just too tiny. In many ways, FileMaker and similar apps filled the niche.

    If people REALLY wanted a Hypercard-like program, there were alternatives.

  • by mr100percent ( 57156 ) on Wednesday November 30, 2011 @02:45PM (#38217112) Homepage Journal

    So, you believe Apple is a bunch of fascists, and that's why they killed one of their programming languages? Baloney. Steve Jobs was the one on stage at NeXT showing how even a child could write GUI apps. He made the developer tools (later Xcode) free and bundled them with every boxed copy of OS X back in 2001, when Microsoft required a paid dev account.

  • Occam's Razor (Score:5, Interesting)

    by ink ( 4325 ) on Wednesday November 30, 2011 @02:47PM (#38217142) Homepage

    Or, it could be that all those fond memories of HyperCard are exaggerated. I can't recall even one such application that was useful apart from simple educational games. The challenge of creating a GUI-based development system has been tackled many times [wikipedia.org]. The most recent one I have used is the default Mindstorms programming environment, LabVIEW [wikipedia.org], which I quickly discarded for a gcc-based environment.

    The one killing blow that keeps me from really using these environments is that they are fundamentally incompatible with version control. That means they cannot grow into large projects or support much collaboration -- relegating them to trivial systems, which are all I remember HyperCard being.
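    That incompatibility is concrete: a visual environment that saves its project as an opaque binary blob gives diff and merge nothing to work with. One hedged sketch of a fix, assuming a hypothetical card/button project like the toy above, is to serialize the whole project to canonical text so ordinary line-based tools can track it:

        import json

        # Hypothetical in-memory project state, as a visual tool might hold it.
        project = {
            "stack": "address_book",
            "cards": [
                {"name": "home",
                 "buttons": [{"label": "Next", "script": "go next card"}]},
            ],
        }

        # Canonical text output (sorted keys, fixed indentation) yields stable,
        # line-oriented diffs; an opaque binary save format would not.
        with open("stack.json", "w") as f:
            json.dump(project, f, indent=2, sort_keys=True)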

  • by mr100percent ( 57156 ) on Wednesday November 30, 2011 @02:49PM (#38217150) Homepage Journal

    Bad caricature of Steve. Doesn't match reality.
    "You watch television to turn your brain off and you work on your computer when you want to turn your brain on."
    -- Steve Jobs, Macworld Magazine, February 2004

  • by Anonymous Coward on Wednesday November 30, 2011 @04:12PM (#38218166)

    Windows (and, to some extent, school) taught them not to be curious and inquisitive...

  • by wed128 ( 722152 ) on Wednesday November 30, 2011 @04:46PM (#38218546)

    Well, a lot of us actually have contempt for the "average user". I personally have contempt for anyone who equates 'different' with 'hard'. I have contempt for anyone who is unwilling to learn to use a tool in order to benefit from it. I have contempt for people unwilling to explore. I have contempt for people who expect new tools to be handed to them on a silver platter. Quite frankly, it's greedy and insulting.

    If you want to do something, learn to do it. Don't bitch that it's too hard, don't whine until someone makes it easier. Don't call up your boyfriend/son/coworker/roommate/neighborhood nerd to solve something that can be looked up with a three-second Google search.

    For Christ's sake, it's the 21st century. We need a license to drive a car. We need a license to use a ham radio. There should be a licensing system for the Internet.

    dammit.

  • by bberens ( 965711 ) on Wednesday November 30, 2011 @05:10PM (#38218840)
    I think what you're seeing is that the basic fundamentals of computing (disk IO, networking, sorting algorithms, etc.) are all largely "solved" problems. We'll continue to see evolutionary improvements in these areas, but it was very exciting when we went from zero through the first few generations of PCs.

    With modern languages, libraries, and frameworks we're actually doing much harder work. The low-level stuff is all "internal" to the computing system; it's pure math, something the machines are quite good at. What's really difficult is taking business concepts, converting them into "computer concepts", and then turning that information into something meaningful to a human on the other end. It's a mixture that includes not just math and logic but also psychology and sociology if you want your software to be useful. IMHO, this new cross-domain work (the domains being math, psychology, and sociology) is much harder than what we did previously, which was much more pure math.

    The more "pure" experiences you're looking for do exist, but they're getting more and more niche: large-scale simulation, massive databases on the Google/Amazon scale, etc. Admittedly I'm in the relatively young generation of programmers, so most of my "low level" work was academic, but I personally found that work much easier than the stuff I currently get paid for. Writing software with mass appeal and good usability is quite difficult.
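    To make the "solved problems" point concrete: each of the fundamentals named above is now a library call or two in any mainstream language, e.g. in Python:

        import json
        from pathlib import Path
        from urllib.request import urlopen

        # Sorting: a tuned, stable O(n log n) sort is a single built-in call.
        records = sorted([("carol", 3), ("alice", 1), ("bob", 2)], key=lambda r: r[1])

        # Disk IO: serialize and persist without touching buffers or file handles.
        Path("records.json").write_text(json.dumps(records))

        # Networking: an HTTP GET is one call (commented out so the snippet runs offline).
        # body = urlopen("https://example.com/").read()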
  • Re:Occam's Razor (Score:4, Interesting)

    by ink ( 4325 ) on Wednesday November 30, 2011 @05:29PM (#38219144) Homepage

    Maybe my point wasn't clear enough, then; I don't believe there was any kind of conspiracy to "kill" HyperCard -- I believe the developers simply moved on to more capable development platforms. The person likely to build something in HyperCard then is probably reaching for Dreamweaver, Excel, or FileMaker Pro now. The actual developers have moved on to professional tool sets.

    HyperCard killed itself.

  • by lennier ( 44736 ) on Thursday December 01, 2011 @12:58AM (#38223246) Homepage

    When I design a GUI I want limits around how the end-user can customize it...unless it's really easy to reset it to the default values.

    Actually, I think this is the biggest problem with GUIs: that the developer can lock down the end-user from customising it. You're not me, you don't know how I like my desktop to look, it's really not your business telling me what my GUI should look like unless you're paying for my computer.

    See, as a user, what I really want isn't a whole pile of non-interacting "applications", each of which thinks it's the best thing since sliced Marmite, loosely joined by a filesystem and OS in which they savagely compete for my attention. What I want is to build a personalised workflow of "data I really care about" and "stuff I want to do to that data", and your application-developer mindset about what you want your application to look and feel like doesn't really appear on my radar at all. I want something a bit like a giant spreadsheet where I can plug in every possible data source and transformation as functions out of a toolbox (roughly the pipeline idea sketched after this comment). I don't want applications, and I especially don't want "apps", as in super-dumbed-down applications which don't even believe in using a shared filesystem.

    But the way we've built things at the moment, we've privileged this rather out-of-date concept of "application" and left the idea of "data" in the dust. And the GUI model has somehow lent itself to that -- I think mostly because the GUIs we've built have been excessively cranky and explosive contraptions which melt down at the slightest touch of a pixel out of place. I'd like to think that doesn't have to be the way of the future. Shouldn't a GUI just be something like a skin over the data which is already there? But we've never made a way to expose the raw data without doing so in shiny chunks of non-user-accessible pixels. Would be nice to change that.
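    What the parent describes -- data sources and transformations composed like spreadsheet functions instead of siloed applications -- is essentially a dataflow pipeline. A minimal Python sketch of that toolbox idea (invented names, one possible design among many):

        from functools import reduce

        def pipeline(*steps):
            """Compose transformations left to right, like piping data through tools."""
            return lambda data: reduce(lambda acc, step: step(acc), steps, data)

        # "Data I really care about": any iterable source would do (file, feed, database).
        source = ["alice,3", "bob,1", "carol,2"]

        # "Stuff I want to do to that data": small reusable functions out of a toolbox.
        parse = lambda rows: [tuple(r.split(",")) for r in rows]
        rank  = lambda rows: sorted(rows, key=lambda r: int(r[1]), reverse=True)
        names = lambda rows: [name for name, _ in rows]

        workflow = pipeline(parse, rank, names)
        print(workflow(source))  # -> ['alice', 'carol', 'bob']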
