
Why Apple, Google, and FB Have Their Own Programming Languages 161

An anonymous reader writes: Scott Rosenberg, author of Dreaming in Code dissects Apple's Swift, Google's Go, and other new languages — why they were created, what makes them different, and what they bring (or not) to programmers. "In very specific ways, both Go and Swift exemplify and embody the essences of the companies that built them: the server farm vs. the personal device; the open Web vs. the App Store; a cross-platform world vs. a company town. Of all the divides that distinguish programming languages—compiled or interpreted? static vs. dynamic variable typing? memory-managed/garbage-collected or not?—these might be the ones that matter most today."
This discussion has been archived. No new comments can be posted.


  • Naming (Score:3, Interesting)

    by Anonymous Coward on Friday December 05, 2014 @02:00PM (#48532785)

    So Apple named a language "Swift", Google named a language "Go", and Facebook named a language "Hack"?
    Obviously it's not just the dirty FLOSS hippies who can't come up with decent software names.

    • Re:Naming (Score:5, Funny)

      by VGPowerlord ( 621254 ) on Friday December 05, 2014 @02:09PM (#48532875)

      Well, I heard* that the name Go was decided upon when an executive at Google needed to think up a name and was looking around his office.

      On his computer at the time was a Chrome browser with the Google homepage open, but a Solitaire window was open in front of it, obscuring the right half of the page.

      Now, if the Solitaire window was on the other side, we might have the ogle programming language instead!

      * and by "heard" I mean "just made up"

    • Hack is like PHP but with improvements.

      There is no name that could be more accurate.
    • Re: (Score:3, Insightful)

      by DulcetTone ( 601692 )

      Some of these are actually good names, e.g. Swift

  • New programming languages prop up the publishing industry by producing dead-tree door-stoppers that detail the new language. I stopped buying dead-tree door-stoppers years ago, cleared off the bookshelves, and get the ebook version instead.
    • All of Go and Swift's documentation is available as free PDFs.

    • by sconeu ( 64226 ) on Friday December 05, 2014 @02:36PM (#48533193) Homepage Journal

      I don't think it's really that. I think it's more the divide specified.

      Some of us do NOT like using ebooks for reference manuals. We like having dog-eared tomes with tons of bookmarks or post-it tabs. The ability to flip back and forth between multiple pages in an ad-hoc manner is also useful.

      • I bought many programming door-stoppers in the 1990s and 2000s. Most were tossed as the technical information became obsolete over time. This is especially true for new programming languages that are still maturing at a rapid pace. The only reference books I still have are for C/C++, Python 2.6 and data structures from ten years ago.
        • by hjf ( 703092 )

          I have Programming Perl. Bought 15 years ago, and I used to consult it a lot.

          Now I just google the information. Easier to find than in the book...

          Books are nice and have a romantic feeling about them. But e-docs are oh god so much more convenient.

    • Safari Books Online.
      www.safaribooksonline.com
      IMHO certainly worth the money...
    • I read e-books exclusively now, EXCEPT for coding manuals. These require more back and forth lookup than works of any other genre.

  • Comment removed based on user account deletion
  • Algorithms (Score:5, Interesting)

    by Anonymous Coward on Friday December 05, 2014 @02:10PM (#48532893)

    If I give you an algorithm and you throw a dart at a page of programming languages to select one, and you cannot implement that algorithm in that language, then you are nothing but a code monkey.

    A computer scientist can implement any algorithm in any language.

    Why are these companies using their own languages?

    Coder lockin. That is the only reason to have your own language.

    Work a few years at XYZ company working on their proprietary algorithms in their ABC programming language?

    Good luck getting another job.

    See, they learned the hard way with their stuff in Javascript - common language and coders - uh, I mean Javascript engineers - left for greener pastures because so many other companies were using that language.

    • Re: (Score:2, Interesting)

      by SQLGuru ( 980662 )

      I disagree with your coder lock-in statement. But I agree with your "throw a dart" metaphor.

      Just because you CAN code an algorithm in a language doesn't mean it's the best option. Just because I can drive a screw into a 2x4 with the heel of my shoe doesn't mean I should.

      Languages are developed to make certain problem domains easier. If they are flexible enough, people will adopt them for other problem domains as well. If they aren't flexible enough, they might stick around in their problem domain, but t

    • > A computer scientist can implement any algorithm in any language.

      You CAN pound a nail with a screwdriver. You can even pound a nail with a saw. A hammer is a much better, more efficient tool for that job. If you need to install hundreds of nails, a nail gun is a much better tool.

      I COULD use VB to convert one type of XML to another, but I use xslt (true xslt, not loops) because it's a better tool for the job. I use several languages each day, selecting the one best suited for the task at hand.

      > if

      • by narcc ( 412956 )

        generally there is one condition that decides whether JavaScript is the best choice. JS is the best choice if and only if that's the only possible option.

        This opinion sounds uninformed.

        • I said it is the best choice for client-side processing on web pages, because it's the only plausible option. Where other options exist, the others are probably better suited to the task.

          If you disagree, can you come up with a counterexample, any scenario where you'd consider using something other than JavaScript, but decide JavaScript is better than the alternative? I'm curious what solutions could be worse than JavaScript. As stated in what you quoted, I do mean solutions - things that would work, but

          • by narcc ( 412956 )

            Where other options exist, the others are probably better suited to the task.

            I can't imagine what you'd think is better. Other languages have adopted features like first-class functions and closures as a direct result of influence from JavaScript. What does that indicate to you?

            Taking it further, the prototypal approach to OO that JS uses is, without question, superior to the classical approach. As there are vanishingly few examples of other languages that use prototypes instead of classes, just about any language you can offer as a substitute would be, necessarily, inferior. (A

            • by Jack9 ( 11421 )

              > Taking it further, the prototypal approach to OO that JS uses is, without question, superior to the classical approach

              Please point to the study that demonstrates this. I would argue the opposite.
              Runtime definition of types (modifications to a prototype have the same effect) has never been shown to be more productive than static typing, so I have to question assertions that it's obviously true.

              > Python would be examples of popular languages that would clearly be worse than JS on the web

              Java on a brows

            • You could have said the same thing with far fewer words had you phrased it as:

              No, I can't think of even one application where you'd consider two languages and decide JavaScript was better for that application.

              I didn't say JavaScript doesn't have a lot of features. It does have a large mishmash of features. I said it'll almost always be the worst choice, if you have any other option.

              > its intended purpose, making it exceptionally well-fit for the web.

              It's the ONLY choice for client-side web. A

              • by narcc ( 412956 )

                It's the ONLY choice for client-side web. As I said twice before, that's the one place nothing is worse or better - because you have no other choice.

                You seem to forget that, for many years, it was not the only choice. JS handily beat the competition. You may be too young to remember those early days, so I won't hold it against you.

                Since neither iOS nor most Android devices run Java applets, that means MOST users today won't run them. A "solution" that won't run at all for most users isn't a solution. You can't say "Java and JavaScript would both work, but JavaScript would be better".

                Again, you forget your history. Java in the browser was effectively dead long before iOS and Android hit the scene. It lost out for a reason, after all. Java had its chance, there was more than a little excitement surrounding it, and it still failed miserably.

                If you're advocating JavaScript as a server-side language, well that's just silly.

                I'm not advocating anything, just calling out your opinion as u

                • You've made some good points, and ones directly responsive to my statement this time. That is true, once upon a time Java was a serious option on the client side and it did have the hype. So much so that LiveScript was renamed JavaScript to take advantage of the Java hype. JavaScript won, against actual competition. PS: I WAS around during that time, and I've written ActiveX controls for use on public web pages. JavaScript beat both ActiveX and Java in the browser.

                  The PayPal link is interesting as well.

                  • by narcc ( 412956 )

                    Otherwise, where you have a choice, JavaScript is NORMALLY not the best suited for any role other than client side web page code. Exceptions may exist.

                    That's a bit more reasonable. Though I wonder why you limit its utility like that? Is there something intrinsic to the language that makes you think it's less suitable than, for example, Python in situations where that language is well-suited? For clarity: JS can't replace PHP where it works well for reasons independent of the languages themselves (that's in the differences between node.js and mod_php), yet JS obviously can't compete with C where C shines, for obvious reasons directly related to the lang

                    • I'm not well-versed enough in Python to do an in-depth analysis, but I can say that Python appears to be ideally suited for roles that were once done by shell scripts. The Red Hat installer Anaconda seems like a perfect role for Python, with a lot of interaction with external binaries and very little real computation. The focus of JavaScript, the purpose for which it was created, is of course different.

                      Further, I would say that unlike Perl or C++, a key constraint on the development of JavaScript was time.

    • A computer scientist can implement any algorithm in any language.

      Sure, but that doesn't mean you can use any language to write large scale, reliable, and maintainable software. To do that, you need encapsulation, strong typing, static and dynamic analysis tools, etc. Many large teams have written projects with ten million lines of code, using languages with these attributes, such as C++ or Java. Good luck trying to do that with PHP or JavaScript.

    • Re:Algorithms (Score:5, Interesting)

      by tlambert ( 566799 ) on Friday December 05, 2014 @02:59PM (#48533459)

      If I give you an algorithm and you throw a dart at a page of programming languages to select one, and you cannot implement that algorithm in that language, then you are nothing but a code monkey.

      A computer scientist can implement any algorithm in any language.

      The "D" language used in writing DTrace scripts does not have loop constructs or recursion, and is not Turing complete. While I can do some pretty astonishing things in "D" that would make your jaw drop, even without looping constructs and recursion, it's pretty easy to come up with things which are impossible to implement in "D".

      So I would say your page of programming languages would, at a minimum, need to be Turing complete programming languages.

    • by HiThere ( 15173 )

      So you claim you could implement a B+Tree in Whitespace? BrainFuck?

      Sorry, but different languages have different strengths, otherwise we'd all be programming in TML (Turing Machine Language). I could have said assembler, or machine language, but both of those are easier to use.

      OTOH, I'll admit that I tend to flit from language to language more than is necessary, and I've still skipped some, like Haskell and CaML. Sometimes a language doesn't look like it would make anything I'm doing easier.

      OTOH, I once

    • Coder lockin. That is the only reason to have your own language.

      Perhaps that was the case with Apple, but Google people just wanted to have something more reasonable for writing servers, and C++ proved itself to be a major PITA.

    • Hi,

      I'm Australian, so about as far as you can possibly get from technology and innovation.

      I can understand the need for a specific language from a technology giant. When you build the hardware platform as complex as these guys probably have, with the type, and volume (in space and time), of data they have from customers hitting various services, it makes sense to have an internal language that understands how the data is stored and when wanting to run queries you want them to be ru

    • by Bogtha ( 906264 )

      A computer scientist can implement any algorithm in any language.

      Just because it's possible, it doesn't mean it's effective. Developers could write applications with Brainfuck or Whitespace, but they'd take far longer, have a lot more bugs, and be incredibly unhappy.

      There's a lot of variation between programming languages, and it makes a big difference in how productive programmers are. Better programming languages are valuable.

      Why are these companies using their own languages?

      Because they saw

    • Not only that, but I think it is more relevant to developer lock in to a particular platform. Just like C# and Apple, and say porting video games from Xbox to Playstation. They want exclusivity on applications developed for their particular platform. This is nothing new. It is just a way to exclude competition to their particular market, and to prevent or at least make it more difficult to get the same functionality from a competing service.

      As probably many people mentioned, any coder worth their salt can u

  • real men have their own programming language, that's why!

    Like how they used to say in the chip business, real men have their own fab.

    • You would also want your language to work best with the services you offer.
      Why was VB so popular for Windows Development? Well it was designed to make Windows Apps. Other languages could do this as well, but they were often a bit more cumbersome to achieve similar tasks.

      • Why was VB so popular for Windows Development? Well it was designed to make Windows Apps. Other languages could do this as well, but they were often a bit more cumbersome to achieve similar tasks.

        Actually, VB was popular because it lowered the bar on *who* could make Windows programs. It wasn't so much that it was better at making those programs, it was that vastly more people were capable of building working programs using it as a tool than using any other of the available tools. In terms of computer language learning curves, BASIC is still pretty hard to beat. In fact, I'd argue that no one has beat it yet.

        • by narcc ( 412956 )

          Careful. On this site, that's flamebait. So is this:

          The reason we use programming languages is to make it easier to write programs. A good programming language, then, can be judged on how much easier it is to use than other languages. What does that tell us about BASIC?

          • by caseih ( 160668 )

            The only problem with BASIC is that each compiler is its own non-standard dialect these days, many of which are proprietary, old-school non-FOSS institutions. FreeBASIC is very good, though, and open source. Modern dialects of BASIC (dunno about Visual Basic) are very structured and support a wide variety of programming paradigms from object-oriented to event-driven to procedural. Some dialects do enforce strong typing. So while you or I might not have reason to use BASIC as we have other languages we a

    • Yep, this. It all boils down to a really smart guy who's still trying to get respect by proving how big his academic dick is.

    • Re:Why? (Score:4, Insightful)

      by Actually, I do RTFA ( 1058596 ) on Friday December 05, 2014 @04:33PM (#48534293)

      real men have their own programming language

      If I were in charge of a huge budget and had the ability to foist my language on the public, I would invent my own language too. Heck, every programmer wishes they could design the language everyone uses.

  • Missing the Point (Score:3, Informative)

    by Capt.Albatross ( 1301561 ) on Friday December 05, 2014 @02:14PM (#48532929)

    While Mr Rosenberg claims that Go is distinguished by its approach to concurrency, his section 'The Essence of Go' is almost entirely devoted to the trivia of braces and semicolons. You won't learn anything about Go's approach to concurrency here.

    • his section 'The Essence of Go' is almost entirely devoted to the trivia of braces and semicolons.

      Good point. You have succinctly captured why I feel like I learned nothing from that article.

    • The whole article is sort of an exercise in textual essentialism.

      What does style tell us about what these things mean? It's a literary-crit technique that might be applicable here, but he clearly either doesn't know what he's talking about or he's a dilettante who has absorbed the surface features of computer languages without grokking the underlying concepts.

  • by hawguy ( 1600213 ) on Friday December 05, 2014 @02:16PM (#48532955)

    I assumed it was a case of Not Invented Here Syndrome [wikipedia.org].

    • The Google vs. Oracle lawsuit made a business case for not-invented-here syndrome. I think every major platform vendor will have their own programming languages in the future. Custom APIs and programming languages stop entire classes of patent/copyright lawsuits dead. They stop developers from moving between ecosystems. They even prevent your employees from stealing top-secret software and moving to a competitor. (And if they do steal the software, it becomes really obvious when law-enforcement shows

      • by hawguy ( 1600213 )

        The Google vs. Oracle lawsuit made a business case for not-invented-here syndrome. I think every major platform vendor will have their own programming languages in the future. Custom APIs and programming languages stop entire classes of patent/copyright lawsuits dead. They stop developers from moving between ecosystems. They even prevent your employees from stealing top-secret software and moving to a competitor. (And if they do steal the software, it becomes really obvious when law-enforcement shows up.)

        I do agree that from a portability/programmer perspective, NIH programming sucks. From the legal perspective, however, it's great!

        Also, the funny thing with lawsuits - even if you win, you still lose.

        Given the permissive BSD style license that both Google and Facebook use for their respective languages, I don't think that they created these languages for any of these reasons.

        It seems that detecting stolen software would be easier if the code was stolen and used as-is. If someone steals secret Go language code from Google and moves to Facebook and rewrites it in Hack (after all, the actual coding is the easy part of any software project, so rewriting it is much easier than creating the project from s

      • This is a good thing.

        It means that schools will start teaching actual programming again instead of 'coaching to a language'. Colleges are cranking out Python/Javascript coders like they used to turn out Java coders. If every company is different maybe they'll teach the logic so that people can learn any language.

  • Outside of their respective organizations, I'm not sure these things are really catching on. Adoption of Go seems to have come to a standstill. Uptake of Swift has been kinda slow. And Hack seems to have been ignored even by dedicated underground computer hobbyists. As well as lumberjacks.

    • by Anonymous Coward

      except for Rocket, Docker, CoreOS, Rackspace, and others that are rolling out more tools and services written in Go every day.

    • by Karlt1 ( 231423 )

      Uptake of Swift has been kinda slow.

      It's been out less than a year. Objective-C is the third most popular language. Why wouldn't you believe that Objective-C developers would move over to Swift?

      http://www.tiobe.com/index.php... [tiobe.com]

      • Darn, I'd put the thing about "lumberjacks" in, hoping that everybody who hadn't gotten it yet would finally C my Objective. ;-)

    • Go isn't supposed to be widely used, just well supported on Google's cloud.

  • by ArcadeMan ( 2766669 ) on Friday December 05, 2014 @02:23PM (#48533031)

    Then Red Forman would be the mascot for Swift.

    As in, a swift kick in the ass. Dumbass.

  • by sideslash ( 1865434 ) on Friday December 05, 2014 @02:35PM (#48533181)
    Swift needed to be created because Objective C stinks, and no other modern language would have fit smoothly into the Smalltalkish legacy of the Cocoa framework. I'm just glad that the Apple fanboys who constitute most of my fellow iOS developers are finally allowed to believe bad things about Objective C, at least now that there's a nice alternative. Made me a little sick before to hear people praising Obj-C while writing reams of ridiculously verbose code that nobody will want to maintain 5 years from now.

    Go is a fantastic language for server side development with concurrency that's not painful to wrap your head around, and is perfect for cloud development in Google's world.
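    For readers who haven't tried it, the "concurrency that's not painful" point is easy to illustrate with a minimal worker-pool sketch - the job/worker names and the squaring workload are made up for the example:

    ```go
    package main

    import (
    	"fmt"
    	"sync"
    )

    // worker drains jobs from a shared channel; "fan out" is just
    // starting more goroutines on the same channel.
    func worker(id int, jobs <-chan int, results chan<- int, wg *sync.WaitGroup) {
    	defer wg.Done()
    	for j := range jobs {
    		results <- j * j // stand-in for real request handling
    	}
    }

    func main() {
    	jobs := make(chan int, 5)
    	results := make(chan int, 5)

    	var wg sync.WaitGroup
    	for w := 1; w <= 3; w++ {
    		wg.Add(1)
    		go worker(w, jobs, results, &wg)
    	}

    	for j := 1; j <= 5; j++ {
    		jobs <- j
    	}
    	close(jobs)

    	wg.Wait()
    	close(results)

    	sum := 0
    	for r := range results {
    		sum += r
    	}
    	fmt.Println(sum) // 1+4+9+16+25 = 55
    }
    ```

    The same shape - goroutines communicating over channels instead of sharing locks - scales from toy examples like this up to real server request handling.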

    Won't comment on Facebook Hack, since it's not clear to me why Facebook itself needs to exist. But to each their own...
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Swift needed to be created because Objective C stinks, and no other modern language would have fit smoothly into the Smalltalkish legacy of the Cocoa framework. I'm just glad that the Apple fanboys who constitute most of my fellow iOS developers are finally allowed to believe bad things about Objective C, at least now that there's a nice alternative. Made me a little sick before to hear people praising Obj-C while writing reams of ridiculously verbose code that nobody will want to maintain 5 years from now.

      Objective-C is fine. The square-bracket syntax just becomes second nature and disappears into the background after a couple of days. And personally I have no objection to methods with long names - it helps me understand what has been written when I return to a program after months (or a year) away. The long names actually make the code more readable and maintainable. I like Objective-C and have no bad things to say about it or the concepts behind it.

      What Swift does bring is enhancements that I would like to

      • by HiThere ( 15173 )

        The problem with Objective-C is its libraries. Nobody has put much work into the cross-platform ones for a decade, and it shows. And since I don't use Apple, Cocoa is of no interest to me, but all the documentation refers to it. So I ignore Objective-C.

        I find it a very interesting language that's hobbled by lack of usable documentation. (Even for cross-platform stuff I got redirected to the Apple site.)

      • And personally I have no objection to methods with long names - it helps me understand what has been written when I return to a program after months (or a year) away. The long names actually make the code more readable and maintainable.

        No, in many cases the extra length is just ridiculous boilerplate. And even in cases where the extra length clarifies what's going on, you can do the same thing in other languages, i.e. every language supports use of meaningful names.

        Can you seriously argue that concatenating a string in Objective C is elegant?

        Be careful! You're repeating yesterday's Dogma of the Faithful. Apple fanboys now have corporate blessing to move to Swift, and you may find yourself left behind. /joke, joke

        • by Bogtha ( 906264 )

          even in cases where the extra length clarifies what's going on, you can do the same thing in other languages, i.e. every language supports use of meaningful names.

          But Objective-C is very unusual in that it interleaves method parameters with the method name. The best alternative to that is using named parameters, and hardly anybody uses those all the time, so developers end up having to memorise the arguments and their order for every method if they want to be able to read code quickly.

          Can you serious

          • The only substantial way of improving on string concatenation in Objective-C would be to introduce custom operators, and that brings its own set of issues. The other alternatives sacrifice consistency.

            I think it's telling that the ultimate way Apple found to improve on Objective-C is to put it on a retirement path by introducing a replacement language. That's mostly all I'm saying here.

          • by jeremyp ( 130771 )

            The only substantial way of improving on string concatenation in Objective-C would be to introduce custom operators, and that brings its own set of issues. The other alternatives sacrifice consistency.

            Actually, you could quite easily bring custom operators to Objective-C by adopting the Smalltalk approach. Simply allow symbols to be messages e.g.

            [@"foo" stringByAppendingString: @"bar"];

            could be written as

            [@"foo" +: @"bar"];

            Smalltalk allows you to drop the colon with binary operators so you could even have

            [@"foo" + @"bar"];

    • Won't comment on Facebook Hack, since it's not clear to me why Facebook itself needs to exist. But to each their own...

      My understanding is that Facebook needed a more statically-typed language (while still preserving the familiar syntax of PHP) in order to exploit more performance advantages when compiling their code to the HHVM, which started off as a PHP compiler.

    • by Bogtha ( 906264 )

      You don't have to be a fanboy to like Objective-C. It's a great language for its age and use cases. Yes, it's verbose, but a lot of that verbosity actually aids readability and maintainability.

  • by jtara ( 133429 ) on Friday December 05, 2014 @03:05PM (#48533515)

    These all strike me as iffy use cases. What is more compelling is creating a language for some more-specific need. These are generally referred to as Domain Specific Languages, or DSLs (not to be confused with trying to push high-speed internet over a twisted pair...)

    I designed one and implemented a compiler and interpreter for it in the early 1980's. It's not all that hard. I had had one compiler construction course in college. I used classic tools Yacc/Lex/Prep and wrote it in C.

    The language is (was? haven't followed) called VSL, or Variation Simulation Language.

    The problem was this: in the early 80's auto companies were experimenting with variation simulation. It's simulating the build of complex mechanical assemblies so that the effects of dimensional variations can be analyzed. The technique was developed at Willow Run Labs during WWII, as part of the solution to the awful-quality airplanes they were building for the war. They gathered experts to fix the problem, and they used this technique. At the time, it was done by a room full of women working Friden mechanical calculators...

    So, in the early 80's there was some Fortran code written by a university professor that ran on a mainframe. I worked for a company that set out to commercialize it. My first task was to port it from the mainframe to IBM PC.

    Two problems: Models were written in Fortran, and then linked against a library. Fortran is painful, for anything. It's especially painful for manipulating representations of 3D objects. And compiling and linking Fortran on a PC was slow! Half-hour builds! And that's just to find you had a syntax error and then rinse and repeat.

    My boss wanted to build a "menu system" that engineers could design in. Keep in mind, we are talking 80's and this was just to be a scrolling text menu. Yes, there were graphics workstations, but this was a new untested product, and nobody was going to pop the $20,000 that they did for, say, finite element workstations. They wanted it to work on a PC so that we could more easily convince the auto companies to try it - make it an easier decision to give it a go.

    He wrote up the menu system and presented it to us in the conference room. He rolled out a roll of paper the length of the conference table, and it hung over both ends! I convinced him that the time for this approach had not yet come... Sure, point and click on graphics - but he couldn't afford either the time or money for that development. But not that silly long-ass text menu!

    The alternative was VSL. It was specifically tailored to the task, and it had "objects" of a sort - by this I mean "3D objects". You could just pass a fender around in a function call, for example.

    It didn't compile to machine code, but generated bytecode. I wrote an interpreter in Fortran, and so eliminated the costly link step. The Fortran program just read the bytecode into an array and interpreted it. Was it slow? No, it was fast as heck! That's because almost all the work was done in well-optimized library functions written in Fortran or even assembly in some cases. (I also talked my boss into hiring an actual mathematician who fixed our broken edge cases, and knew the right heuristics to speed things up.)
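    The dispatch loop described above - bytecode loaded into an array, interpreted one instruction at a time, with the heavy lifting done elsewhere - is easy to sketch. Here is a toy stack-machine version in Go; the opcodes are made up for illustration, and the real VSL instruction set is of course not shown:

    ```go
    package main

    import "fmt"

    // Toy stack-machine opcodes; hypothetical, not the real VSL bytecode.
    const (
    	opPush = iota // push the next word in the code stream onto the stack
    	opAdd         // pop two values, push their sum
    	opMul         // pop two values, push their product
    	opHalt        // stop and return the top of the stack
    )

    // run interprets bytecode from a slice, much as the Fortran host
    // interpreted bytecode loaded into an array.
    func run(code []int) int {
    	var stack []int
    	for pc := 0; ; pc++ {
    		switch code[pc] {
    		case opPush:
    			pc++
    			stack = append(stack, code[pc])
    		case opAdd:
    			a, b := stack[len(stack)-2], stack[len(stack)-1]
    			stack = append(stack[:len(stack)-2], a+b)
    		case opMul:
    			a, b := stack[len(stack)-2], stack[len(stack)-1]
    			stack = append(stack[:len(stack)-2], a*b)
    		case opHalt:
    			return stack[len(stack)-1]
    		}
    	}
    }

    func main() {
    	// (2 + 3) * 4
    	program := []int{opPush, 2, opPush, 3, opAdd, opPush, 4, opMul, opHalt}
    	fmt.Println(run(program)) // 20
    }
    ```

    The speed observation holds for the same reason in the sketch: the interpreter loop itself is cheap, so performance is dominated by whatever the individual operations actually do.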

    This made it much easier for engineers to create and use models. Now they wrote them in VSL, much more expressive to the task than Fortran. And in a minute they either knew they had a syntax error or were testing their model.

    In a couple of years, we went from a couple of pilot projects to like 50. Every auto company took it up. Boeing used it to help re-engineer the FA-18. Today probably every car, airplane, and hard drive was analyzed using VSL. (Siemens wound up with the product eventually, after a few acquisitions.) I don't know if VSA is still under the hood, or if it really has any practical use today: the models are now written using point/click/drag/popup stuff on drawings. What my boss knew we had to eventually get to, but couldn't at the time.

    Of the languages mention

  • by tlambert ( 566799 ) on Friday December 05, 2014 @03:06PM (#48533519)

    Go does not see significant use, even at Google. It's one of the allowed implementation languages, along with Python, JavaScript, and C/C++, but it doesn't see a lot of uptake internally at Google.

    • by HiThere ( 15173 )

      Have you read the bit about "Concurrency is not MultiProcessing" (or something that means the same thing)? Go is a single-threaded language, which is concurrent but not multiprocessing. So there's basically no payoff in many cases from using it, and you've got to run it through an interpreter (unless you use the gcc version).

      So why bother?
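      For what it's worth, the standard Go toolchain compiles to native code (no interpreter), and the runtime multiplexes goroutines onto multiple OS threads - GOMAXPROCS has defaulted to the number of logical CPUs since Go 1.5 - so goroutines can in fact execute in parallel on multicore machines. A minimal sketch (the sum is just a stand-in workload):

      ```go
      package main

      import (
      	"fmt"
      	"runtime"
      	"sync"
      )

      func main() {
      	// GOMAXPROCS(0) reports, without changing, the number of OS threads
      	// available for running goroutines; it defaults to the CPU count.
      	fmt.Println("threads available:", runtime.GOMAXPROCS(0))

      	var (
      		mu    sync.Mutex
      		total int
      		wg    sync.WaitGroup
      	)
      	for i := 1; i <= 4; i++ {
      		wg.Add(1)
      		go func(n int) { // these goroutines may run on different OS threads
      			defer wg.Done()
      			mu.Lock()
      			total += n
      			mu.Unlock()
      		}(i)
      	}
      	wg.Wait()
      	fmt.Println("total:", total) // 1+2+3+4 = 10
      }
      ```

      Whether that parallelism pays off for a given workload is a separate question, but "single-threaded" undersells what the runtime does.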

  • Proud tradition (Score:4, Interesting)

    by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Friday December 05, 2014 @03:10PM (#48533551) Homepage Journal

    Like when Bell Labs developed C to write Unix? There's a long tradition of major companies coming up with new languages to scratch an itch. Thank God it hasn't died. How boring it would be to live in a time when we'd decided there was nothing left to innovate.

    • by drjzzz ( 150299 )

      Bell Labs didn't develop C; in fact I think Bell Labs hardly knew what to do with it. Two (brilliant) people -- Kernighan and Ritchie -- working in Bell Labs wrote C and developed Unix so that they could do what they wanted, better and quicker, on the minicomputers around their labs. Their slim volume "The C Programming Language" is amazingly engaging, concise, and deeply instructive. Modern IDEs are great for many things but they also constitute a significant hurdle to actually coding, which K&R had yo
