Why Can't We Put a BASIC On the Phone?

theodp writes "In the Sixties, we could put a man on the moon. Nowadays, laments jocastette, America's tech giants can't even put a BASIC on the phone. Woz managed to crank out a BASIC interpreter for the 6502 in the '70s. As did Bill Gates and Paul Allen. So, why — at a time when development has never been easier — can't Google, Apple, and Microsoft manage to support a free BASIC or other programming-for-the-masses development environment on desktops, laptops, tablets and phones?" My limited experience with Android development showed Java to be obtuse, and doing anything downright obnoxious (at least without Eclipse; even with it, doing anything non-standard required digging through horrendous ant buildfiles). And, of course, without a REPL things were even more obnoxious. There is the android-scripting project, but it doesn't provide particularly exhaustive access to the platform.
  • by Anonymous Coward on Monday December 26, 2011 @08:19PM (#38498070)

    Programming isn't a matter of a few swooshes on a capacitive touchscreen. Also, who could muster enough attention between two instant messages?

  • TouchDevelop (Score:2, Interesting)

    by Anonymous Coward on Monday December 26, 2011 @08:26PM (#38498150)

    Microsoft Research has a TouchDevelop app for Windows Phone; maybe you can try it:
    https://research.microsoft.com/en-us/projects/touchdevelop/

  • by InterestingFella ( 2537066 ) on Monday December 26, 2011 @08:35PM (#38498244)
    Funnily enough, Microsoft is the only one providing stuff like TouchDevelop [microsoft.com] for their phones. It even says on their page "bringing the excitement of the first programmable personal computers to the phone", so it's particularly well suited for what this whole story is about. It's better than Python too, as it's specifically targeted at touch phones (and Python is horrible with its indents for code blocks like if and for - seriously?). Yet the Slashdot crowd likes to discredit MS everywhere they can and hopes that Windows Phone 7 never catches on, while MS is the only one providing what that same crowd wants on phones.
  • by InterestingFella ( 2537066 ) on Monday December 26, 2011 @08:51PM (#38498404)
    That's the whole point of BASIC and TouchDevelop. You don't have to know about the APIs, the libraries, and how to wire all that stuff together before you can actually get something done. The whole purpose is to be easy to use: something you can use to quickly throw a program together without worrying about the details. This approach does come with limitations, but for this audience that doesn't matter.
  • Re:Windows Phone 7 (Score:4, Interesting)

    by elabs ( 2539572 ) on Monday December 26, 2011 @08:52PM (#38498418)
    Yeah, you can write apps for WP7 in VB.Net or C#. It's actually pretty amazing how simple and intuitive the free tools are. You can download them and have an app running in the emulator in just minutes. Adding controls is drag-n-drop from the WYSIWYG editor. The control libraries are impressive, especially when you consider the freely downloadable WP7 "Toolkit". The main thing that is different from BASIC is the powerful language capabilities: OOP, LINQ, properties, threading, concurrent collections, generics, closures, events, delegates, a LINQ-to-SQL embedded database, easy encryption libraries, and the list goes on and on.
  • by pinkeen ( 1804300 ) on Monday December 26, 2011 @09:04PM (#38498548) Homepage
    Indentation style is a thing of preference. You didn't provide any justification.

    I love C and C++, and Python's indentation seemed weird at first. But once you get used to it, it appears very clean, elegant and efficient.

    C is great but really - those curly braces may seem like a sexy thing in a geeky way but they seriously decrease the legibility of the code. They may have been a poor design decision.
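
    To make it concrete, here's the same toy function in Python (a contrived example of my own, nothing more):

        # Python: the indentation *is* the block structure
        def count_evens(numbers):
            total = 0
            for n in numbers:
                if n % 2 == 0:
                    total += 1
            return total

        print(count_evens([1, 2, 3, 4]))  # 2

        # The C version carries the same structure twice: once in the
        # indentation you write for humans, once in the braces you write
        # for the compiler. Python keeps only the first.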
  • by caseih ( 160668 ) on Monday December 26, 2011 @09:11PM (#38498612)

    If you say so. Sounds like you haven't done too terribly much with Python. I dislike a lot of things about a lot of languages, but I can't say that any of them (well, maybe except PHP) are "horrible." I despise curly braces, but I can't make the claim that Java is a "horrible" language. Some are more awkward than others, sure. There are many things you can complain about in Python, but whitespace formatting falls pretty far down the list. Having a 1:1 correspondence between my pseudo-code on paper and Python code is extremely nice and productive too. Broken web sites and e-mail clients do make cutting and pasting Python code problematic, I'll grant you that. But my experience with Python has been much the same as ESR's. (And he had the same initial reservation as you.)

    I am bitter that Epiphany chose to tear out python and replace it with Javascript of all things. Sort of makes sense given that Javascript is an integral part of browsers. But still makes me sad. Python is such a good language for writing extensions in.

  • by FairAndHateful ( 2522378 ) on Monday December 26, 2011 @09:42PM (#38498878)

    Yeah, and getting a paycheck is the fun side of working a job.

    I dunno, the paycheck is necessary, but it's not the only fun part. The paycheck just keeps me clothed and fed. My job gives me problems to solve and I like that. I get to demand a wage because there's a lot of people out there who are not able to do what I do. On the main thread, people who are capable of writing tools for the simps never bother because they don't need those tools. Like BASIC. The people who need BASIC can't write it. If you think Python or something else will fill that void, advertise it.

  • by Ethanol-fueled ( 1125189 ) on Monday December 26, 2011 @10:11PM (#38499094) Homepage Journal
    It's easy to "magically" do things in Python without being forced to actually come up with real algorithms, though that's a strength as well as a weakness. It's good as a time-saving abstraction for all those academics with no background in programming who now have to program, especially in bioinformatics. SciPy and NumPy come to mind.

    I would not recommend it as the first programming language of a noob seriously interested in computing, because of the automatic type handling and all the other stuff that prevents people from learning the necessary-but-tedious little things like the pitfalls of integer division and type mismatch problems. There are also few dev shops that use Python primarily, so a noob would be better off starting with a more widely used language like C/C++ or Java. People complain that everything in Java must be a class, but that is a good message to pound into the mind of the noob - we don't need any more aspies banging out unreadable C monoliths.
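
    A concrete example of the automatic handling I mean (Python 2 semantics, the current interpreter as I write this):

        # Python happily coerces mixed types, so the noob never learns the rules:
        print 3 * 2.5        # 7.5 -- the int is silently promoted to float
        print 2 ** 100       # no overflow, ever: ints grow without bound
        # ...but integer division still bites, exactly as in C:
        print 7 / 2          # 3, not 3.5 -- the classic pitfall, hidden until it isn't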
  • by anubi ( 640541 ) on Monday December 26, 2011 @10:15PM (#38499122) Journal

    Coming up with the clever algorithm to solve a problem is what is fun

    You nailed that one as far as I am concerned.

    This week's project for me... I have this old DOS-based SPICE analyzer I really like. It's short, simple, and generates great plots - on an old EPSON dot matrix printer.

    Now, I really want to get rid of that printer. That old SPICE analyzer is the only thing I have that requires it. What I really want is a bitmap image file that will go into anything. So, it's time to dust off the ole Borland Turbo Assembler.

    I plan to hook the printer interrupt and divert the printer data to my program just like a printer capture program, but instead of just capturing the data to a file, I will take it byte by byte and convert it to bitmap format. Lots of rotate-through-carry instructions to rearrange the data intended for the printhead into bitmap format. It's a state machine, so there is a 6-way switch for the incoming byte: tested for "esc", tested for "L", tested for "2", store hi-byte, store lo-byte, and append into bitmap. This is easily done in assembler with an index into an array of pointers to subroutines.

    (Easy in C++ too, but I was having trouble inserting the data, delivered to the printhead 8 bits at a time, in a vertical format, into the bitmap, whereas the assembler lets me use the "carry" bit to transfer the incoming byte bit by bit into the most significant bit of eight bytes in the bitmap. There is a helluva lot of looped busywork to rearrange all the bits.)
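
    In a high-level language the same dispatch idea would look something like this Python sketch (the state names and the exact escape sequence are just illustrative; the assembler version is a jump table, but the shape is identical):

        ESC = 0x1B

        # One handler per state; each consumes a byte and returns the next state.
        def want_esc(b, ctx):  return 1 if b == ESC else 0
        def want_L(b, ctx):    return 2 if b == ord('L') else 0
        def want_2(b, ctx):    return 3 if b == ord('2') else 0
        def store_hi(b, ctx):  ctx['count'] = b << 8; return 4
        def store_lo(b, ctx):
            ctx['count'] |= b
            return 5 if ctx['count'] else 0
        def append(b, ctx):
            ctx['bitmap'].append(b)          # one byte = 8 vertical printhead pins
            ctx['count'] -= 1
            return 5 if ctx['count'] else 0

        STATES = [want_esc, want_L, want_2, store_hi, store_lo, append]

        def feed(data):
            ctx, state = {'count': 0, 'bitmap': []}, 0
            for b in data:                   # indexed dispatch: the 6-way switch
                state = STATES[state](b, ctx)
            return ctx['bitmap']

        print(feed([ESC, ord('L'), ord('2'), 0, 2, 0xAA, 0x55]))   # [170, 85]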

    Since all the plots are generated by the same program, they all have the same size and use the same control-code sequences, so I do not have to reconstruct the entire ESC-sequence interpreter of the Epson printers.

    I haven't had this much fun since reading Jeremy Bentham's "TCP/IP Lean" where he implemented state machines in C++ to make TCP/IP stacks, and I wanted to modify it so I could get bidirectional file transfer through the "FORM=FILE" method.

    That's the fun of this. Doing things that are not on the menu.

    As far as BASIC goes, I actually still use my GWBASIC interpreter at times to verify some little math loop. It's like using a hand calculator in a way - a short, sweet way of running a snippet - but I would not want to develop "serious" code on it any more than I would want to design my bitmap generator in DEBUG.

    I do enjoy doing things the old way on my old machine, where I understand exactly what I am doing and why, and know exactly what every byte in the code does. It's something I do not know on the new machines, where I can easily end up using hundreds of kilobytes of code along with megabytes of required libraries to execute some little algorithm I could code in assembler in 4K bytes or so. It's almost like trying to buy a house: signing off on reams of legal documents I do not understand, just to say "I agree to buy this house and I will pay for it in monthly payments of whatever. If I do not pay, you have the right to take the house back".

  • by OeLeWaPpErKe ( 412765 ) on Monday December 26, 2011 @10:32PM (#38499228) Homepage

    While I am very reluctant to mention this problem: speed. Now I am very, very tolerant of speed issues in scripting languages, as I know it doesn't really matter. But Python... Python is an absolute disaster. Python claims execution speed doesn't matter because it's easy to use, then encases your legs in concrete. The argument justifying that one? "It's very easy to move 0.01 cm".

    So here's the issue: you always get the impression Python can be used to do calculations; Python claims to be capable of them, and indeed its syntax seems to allow for it. So let's compare:
    1) C, C++, hell even Java: c = a + b (1 assembly instruction)
    2) CPython: c = a + b (> 2500 assembly instructions)

    This means that a C program running on an 8086 will actually calculate faster than a python program running on a current pc.

    The speed of CPython is so bad that it regularly becomes a problem, and requires all sorts of complex solutions, from massively parallelizing using numpy to rewriting half the project in C/C++. That's my main gripe with Python. And numpy may approach the performance of normal C/C++ loops (from a large distance, but at least not a factor of 200 anymore), but if you use a numpy-like vector processing library in C/C++ (there are many), numpy is back to a factor of 200 or more behind.
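
    Anyone who doubts the magnitude can time it for themselves; a crude measurement along these lines shows the gap (the modules are standard, but exact ratios are machine- and version-dependent):

        import timeit

        setup = 'import numpy as np; a = list(range(100000)); v = np.arange(100000)'
        py = timeit.timeit('s = 0\nfor x in a: s = s + x', setup=setup, number=100)
        vec = timeit.timeit('v.sum()', setup=setup, number=100)
        print('interpreted loop: %.3fs  numpy (C loop): %.3fs  ratio: %.0fx'
              % (py, vec, py / vec))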

    Second is memory usage. I made the mistake of loading in a year's worth of monitoring data using the obvious method: a class representing a datapoint. Result: 12 GIGS (doing the same thing to the exact same data in C++ resulted in 120 megabytes)... Yes, again numpy can solve this, but only by greatly increasing complexity and destroying the utility of 90% of Python's language features.
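
    The per-object overhead is easy to demonstrate (a sketch; exact figures vary by interpreter build, and sys.getsizeof counts the instance dict separately):

        import sys
        import numpy as np

        class Point(object):
            def __init__(self, t, v):
                self.t, self.v = t, v

        p = Point(0, 0.0)
        per_object = sys.getsizeof(p) + sys.getsizeof(p.__dict__)
        print('one Point object: ~%d bytes, plus the two boxed values' % per_object)

        # The same two fields as a numpy record: a flat 16 bytes per row.
        arr = np.zeros(1000000, dtype=[('t', 'i8'), ('v', 'f8')])
        print('numpy per row: %d bytes' % arr.itemsize)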

    Third is the fact that Python is very much a typed language, but the only available variable type is a void*, and it actually allows changing the type of a variable, which is a horrible, horrible mistake (and why? Out of some sort of obligation to the idea of dynamic languages supporting this monstrosity). Same for adding members to class instances after creation time. Horrible.
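
    Both of which the language happily permits, for example:

        x = 42
        x = "forty-two"          # same name rebound to a new type, no warning

        class Sensor(object):
            pass

        s = Sensor()
        s.calibration = 0.97     # member invented at runtime, after creation
        s.callibration = 1.02    # a typo silently creates a *second* attribute
                                 # instead of failing; the bug surfaces far away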

    Fourth, the massive complexity and time-dependencies that result from actually using the dynamic aspects of the language, introducing tons of non-obvious dependencies on the exact execution order of the program. If these features are used, they directly lead to classes that only become valid, useful objects after 3-4 method calls, dependent on all sorts of stuff succeeding. Our style guide outlaws actually using the features that make Python dynamic, and I doubt we're the only ones.

    Fifth, the fact that the dynamic nature of Python makes it very hard to document libraries that make use of it.

    Sixth, you can't use static analysis on Python programs. So IPython is just about the best possible autocomplete you can get (i.e. autocomplete can only work by executing the program you're trying to develop).

    *sigh*

    Program a bit in haskell. Now, haskell's pedantic too, no question there, but you will find 10 places python could be improved before you even get through the tutorial.

  • by amRadioHed ( 463061 ) on Monday December 26, 2011 @11:13PM (#38499460)

    That said, the reason we don't have a nice "visual basic" for phones is because Apple and Google does not WANT that on the platform. They do not want people writing their own apps easily. It's not profitable to allow everyone to write their own custom apps.

    Why did Google create this [appinventorbeta.com] then? It sure looks like an attempt to enable app making for everyone.

  • by nessus42 ( 230320 ) <doug@alum.mit.UMLAUTedu minus punct> on Monday December 26, 2011 @11:50PM (#38499648) Homepage Journal

    While I am very reluctant to mention this problem : speed. Now I am very, very tolerant of speed issues in scripting languages, as I know it doesn't really matter. But python ... python is an absolute disaster.

    You apparently weren't all that reluctant, since you went on for quite a few paragraphs, and yet little of what you said had any merit. First of all, what does anything you have to say have to do with whether or not Python makes a decent introductory programming language? Python is the introductory programming language at MIT. Are you saying that you know better than the entire MIT Electrical Engineering and Computer Science faculty? Furthermore, you seem completely oblivious to the tradeoffs between dynamically typed and statically typed languages. As a result of these tradeoffs, dynamically typed and statically typed languages excel at different sorts of things. For large projects, statically typed languages often fare better, due to catching errors earlier. For smaller projects, however, the flexibility of dynamically typed languages is often a huge boon. For instance, I've completed programming projects in a day in Python that would have taken me a month in C++. On the other hand, I'm not so sure I would want to work on a giant Python program with a couple of dozen other programmers.

    Regarding speed, Python is not particularly slower than other scripting languages. It is about the same speed as Perl. It is much faster than either Ruby or Tcl. The kind of rant you have made, which demonstrates absolutely no understanding of the tradeoffs involved or of the advantages of the technology that you criticize, is almost always uninformed flamebait that speaks more about its author than about the topic under discussion.

    |>ouglas

  • by symbolset ( 646467 ) * on Tuesday December 27, 2011 @12:12AM (#38499736) Journal

    Finding the right algorithm is definitely fun. It helps young minds develop, and despite the great mass of established art, a young mind can find a new path that solves a classical problem exactly because they don't know it's difficult, nor have they been distracted by the ways others have tried. I've been there on both sides. IBM and Microsoft both lend their engineers to high schools and colleges to gather this IP that the students don't know might be great wealth beyond their imagining - and have since the early '80s at least. There's a lot of dross to wade through, but the effort is worth the gems.

    It's also fun to know the lay of the land, to be skilled in the art of Wirth and Turing and Venn and up-to-date with the Journals of the ACM, to stand on the shoulders of giants and lift the bar just a bit higher in one little corner of the field, in the hope that one day somebody might deem your work worthy to stand on in turn.

    At some point the young minds must transition from the former to the latter, or you're just exploiting them. You owe some of them the bridge. BASIC isn't required (and is, perhaps, prohibitive) for any of this. It's better to teach the machine in the abstract. The effort is probably best moved to elementary schools now. Kids today are pretty far advanced relative to kids from my day. In my day, access to actual computers was a special privilege reserved for folks who'd had at least a year's high school instruction; now kids take to the Internet at 1 or 2 years of age. Finding kids who don't know that proving P!=NP is difficult, and yet are capable of exploring the question without that bias, is going to be a challenge.

  • by shutdown -p now ( 807394 ) on Tuesday December 27, 2011 @03:06AM (#38500432) Journal

    Anyone who has the skills to write a BASIC interpreter will also be someone who thinks BASIC is a POS, and won't have any interest at all in doing such a thing without a handsome paycheck to compensate them for that time lost.

    That's not quite true. Back in the day, I was part of the user community of a nice little BASIC interpreter called Rapid-Q. The "nice" part was that it was written in C++Builder, and it exposed huge chunks of VCL under a thin API wrapper, so you could make very neat-looking GUI apps with it. It also let you bundle the interpreter .exe with compiled bytecode, as a single binary, which gave you executables ~300KB in size that did what VB users didn't dream of (and their most basic hello world was over a megabyte).

    Anyway, after the one and only developer of Rapid-Q was hired by RealBASIC and stopped working on the project, there was a bit of a commotion in the community, and a few people - myself included - set off to write our own replacements - because we actually wanted a tool like that (and the original was not open source). Some of them actually succeeded - one group ended up with another bytecode interpreter, another actually made a bona fide compiler (to assembly code, which they then ran through FASM, if I remember correctly).

    Mine was a BASIC-to-C++ translator. Definitely taught me a lot, like how to write a recursive descent parser by hand. And yes, after a few months of maintaining it, I concluded that BASIC as a language doesn't cut it anymore, and moved on to greener pastures. But I still liked it when I started working on it, and it certainly proved I was capable of writing it. Ditto for the other guys who succeeded - most projects ended up being abandoned eventually, but only after several successful, working releases. Which, I think, disproves your original point.
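
    For anyone curious, hand-written recursive descent fits on a page. Mine read BASIC and emitted C++, but the skeleton is the same in any language; here's a toy arithmetic version sketched in Python:

        import re

        def tokenize(src):
            return re.findall(r'\d+|[-+*/()]', src)

        def parse(tokens):
            pos = [0]
            def peek():
                return tokens[pos[0]] if pos[0] < len(tokens) else None
            def eat():
                t = tokens[pos[0]]; pos[0] += 1
                return t
            # One function per grammar rule -- that's all recursive descent is.
            def factor():                      # factor := NUMBER | '(' expr ')'
                if peek() == '(':
                    eat(); v = expr(); eat()
                    return v
                return int(eat())
            def term():                        # term := factor (('*'|'/') factor)*
                v = factor()
                while peek() in ('*', '/'):
                    v = v * factor() if eat() == '*' else v // factor()
                return v
            def expr():                        # expr := term (('+'|'-') term)*
                v = term()
                while peek() in ('+', '-'):
                    v = v + term() if eat() == '+' else v - term()
                return v
            return expr()

        print(parse(tokenize('2 + 3 * (4 - 1)')))   # 11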

  • by martin-boundary ( 547041 ) on Tuesday December 27, 2011 @04:36AM (#38500750)
    That's going a bit overboard, I think.

    BASIC was invented by a mathematician, John Kemeny [wikipedia.org], and a computer scientist, Tom Kurtz [wikipedia.org]. They did this as part of a revolutionary change in how students were taught mathematics, and succeeded admirably.

    Here's an online version of Kemeny's book Introduction to Finite Mathematics [dartmouth.edu] with Laurie Snell and Gerald Thompson. What you have to understand is that this book looked nothing like the books on applied math of the day. It was truly revolutionary, and a lot of modern books have copied its ideas.

    You'll also find that BASIC is very well suited to solving the kind of problems that are in that book. It's even arguable that BASIC's suitability for solving simple number crunching problems is what made the microcomputer revolution possible (remember, the killer app for the early PCs wasn't games, it was the spreadsheet - people bought micros so they could program compound interest...).

    It's of course not so well suited for programming consumer software, but then again the language was invented 15 years before consumer software took off.
