Tim Cook: Coding Languages Were 'Too Geeky' For Students Until We Invented Swift (thestar.com) 335

theodp writes: The Toronto Star reports that, speaking to a class of Grade 7 students taking coding lessons at the Apple Store in Eaton Centre, Apple CEO Tim Cook told the kids that most students would shun programming because coding languages were 'too geeky' until Apple introduced Swift. "Swift came out of the fundamental recognition that coding languages were too geeky. Most students would look at them and say, 'that's not for me,'" Cook said as the preteens participated in an Apple-designed 'Everyone Can Code' workshop. "That's not our view. Our view is that coding is a horizontal skill like your native languages or mathematics, so we wanted to design a programming language that is as easy to learn as our products are to use."
This discussion has been archived. No new comments can be posted.

  • by xxxJonBoyxxx ( 565205 ) on Thursday January 25, 2018 @09:02AM (#55998793)
    >> we wanted to design a programming language that is as easy to learn as our products are to use

    Congratulations you invented LOGO!
    https://en.wikipedia.org/wiki/Logo_(programming_language)
    • by iapetus ( 24050 ) on Thursday January 25, 2018 @09:10AM (#55998859) Homepage

      I was thinking COBOL...

      • I was thinking COBOL...

        * COBOL IS THE FIRST THING I THOUGHT OF AS WELL - THE LANGUAGE FOR NORMAL SANE PEOPLE

        (Bah, why is it telling me that I'm yelling? Obviously, I'm speaking perfectly normally!)

      • I thought the same thing, everyone can understand:

        ADD A TO B GIVING C
        MULTIPLY C BY D GIVING E

        etc

    • by red_dragon ( 1761 ) on Thursday January 25, 2018 @09:11AM (#55998863) Homepage

      Congratulations you invented LOGO!

      Or, they could've dug through their own software catalogue:

      https://en.wikipedia.org/wiki/HyperCard [wikipedia.org]

      • by 0100010001010011 ( 652467 ) on Thursday January 25, 2018 @09:36AM (#55999057)

        I wish they did. I learned to program on HyperCard. It took care of one of the biggest 'problems' with most languages now: a GUI. Python's GUI tools are still a mess that don't always work cross-platform.

        It was easy enough and came with enough built-in documentation that 13-year-old me could figure it out before Stack Exchange.

        • Hypercard's logical successor is Filemaker Pro. I don't know whatever happened to that, but it let you construct interfaces by dragging and dropping. Back when web interfaces were a relatively young thing, I participated in creating a tool to let users select VLANs on Catalyst switches so that they could connect ports on their desks with ports in the testing lab using Filemaker Pro on Windows as the server, and perl on Linux for the backend (using Expect to actually talk to CatOS.) That took about three days, including making it acceptably attractive and usable. And web interfaces aren't (or weren't, no idea what's up with Filemaker now) even the program's primary goal.

          • Filemaker is still around [filemaker.com]. I had to learn it a couple of years back because a research project I was working on used it. I think the project itself is indicative of why Filemaker still exists. When the project started, they went looking for someone who could put together a database for them on the cheap. The guy they hired had been using Filemaker for years and, yeah, he could totally do that for them. It was a mess. The poor folks on the research project didn't even realize that a web-based front end was a
            • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday January 25, 2018 @10:26AM (#55999381) Homepage Journal

              I suspect he's still out there, somewhere, causing people to be stuck with Filemaker.

              Doubtless an open-source, platform-agnostic solution works better. I use Drupal these days when I want to do a job like that. It provides all the usual functionality you expect from a CMS, and a fairly small set of modules will let you create views on arbitrary database tables so that you can use it as a glorified database reporting tool without ever writing a line of code, CSS aside. You just create views. And the CSS is only needed to make it look pretty, not to make any sense of the information at all. And there are export tools.

              • I suspect he's still out there, somewhere, causing people to be stuck with Filemaker.

                Doubtless an open-source, platform-agnostic solution works better. I use Drupal these days when I want to do a job like that. It provides all the usual functionality you expect from a CMS, and a fairly small set of modules will let you create views on arbitrary database tables so that you can use it as a glorified database reporting tool without ever writing a line of code, CSS aside. You just create views. And the CSS is only needed to make it look pretty, not to make any sense of the information at all. And there are export tools.

                Maybe, if you want to re-invent every aspect of every wheel.

                But FileMaker simply can't be beat for being able to bang-out a really decent-looking, and acting, application in almost zero time.

            • I think one of the most painful projects I ever had was trying to interface Filemaker, a campus Oracle DB, and Mailman. Take student records from the Oracle DB, transfer them to Filemaker for, God knows what reason, then push them later to Mailman for mass mail. During the first demo the Mailman instance somehow broke and turned into a kind of loop that spammed about 9000 students with nearly 100 mails each, mostly just repeated "out of office" messages.

              I never drank so hard as I did that night.

              • I think one of the most painful projects I ever had was trying to interface Filemaker, a campus Oracle DB, and Mailman. Take student records from the Oracle DB, transfer them to Filemaker for, God knows what reason, then push them later to Mailman for mass mail. During the first demo the Mailman instance somehow broke and turned into a kind of loop that spammed about 9000 students with nearly 100 mails each, mostly just repeated "out of office" messages.

                I never drank so hard as I did that night.

                So, because YOU didn't take time to learn FileMaker properly, it is FILEMAKER's Fault?!?

                Got it!

            • It is easy to get this opinion, and get on this bandwagon, but there has to be a balance between time spent learning a language and the value of the solutions provided. Not everything needs a jackhammer; some things need a regular hammer, and some things just a thumb. Use the tool for the job, and then replace it when it is cost-effective to do so, such as when the long-term cost has a high expected return on investment.

              There is something to be said for easy to deploy but comparatively crappy solutions. T
          • Hypercard's logical successor is Filemaker Pro. I don't know whatever happened to that, but it let you construct interfaces by dragging and dropping. Back when web interfaces were a relatively young thing, I participated in creating a tool to let users select VLANs on Catalyst switches so that they could connect ports on their desks with ports in the testing lab using Filemaker Pro on Windows as the server, and perl on Linux for the backend (using Expect to actually talk to CatOS.) That took about three days, including making it acceptably attractive and usable. And web interfaces aren't (or weren't, no idea what's up with Filemaker now) even the program's primary goal.

            FileMaker lives on!

            Still an Apple Subsidiary. Still Cross-Platform (Mac and Windows + iOS + Web). Standalone, on-Prem Server, and Cloud-Based Server Options.

            Still the hands-down Easiest GUI Form and Report Editors on the Planet!

            Only Multi-User DBMS I know of with LIVE record-updating to ALL users sitting on the same Record!

            Still the most misunderstood and ignored Multi-User, Multi-Platform DBMS ever!

            http://www.filemaker.com/ [filemaker.com]

        • pfft - back in my day we had these things called Lie-Brer-EEES where you could check out books that had actual programs written in them that you could TYPE IN by yourself (with some hours of effort) to play Star Trek (You were the E, Klingons were the K...). In truth, I actually bought that book with my own birthday money at an incredible sum at the time.
          There was no Ent-ree-net, hell boy modems weren't even readily available for consumers yet!
          I just got tasked with setting up a new project in a framework
          • PC Magazine used to have printed hexadecimal assembly code that you could type into Microsoft's debug program to create small utilities in the .com format.
      • Congratulations you invented LOGO!

        Or, they could've dug through their own software catalogue:

        https://en.wikipedia.org/wiki/HyperCard [wikipedia.org]

        I love HyperCard and HyperTalk; but it does NOT have nearly the "Natural Language" properties that Bill Atkinson and Dan Winkler thought it did. And as far as "wordiness", COBOL has NOTHING on HyperTalk!!!

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        Proof: One has to look no further than AppleScript, which is the syntactic kissing-cousin of HyperTalk.

    • Funny, I have always thought of the Swift language as Apple's crippled lurch to be just like Python.
  • by ArtemaOne ( 1300025 ) on Thursday January 25, 2018 @09:06AM (#55998825)

    I can see exposure and accessibility being a factor in getting people interested in computer programming. Kind of like Carl Sagan's and Bill Nye's attempts to get simplified science to the masses. That reach sparked a passion in people who may never have had a reason to get into the field and expand their horizons.

    • I think the idea Cook is advocating is that Swift isn't just a "toy" language to be used for kids' classroom projects.

      I've taken an interest in human factors lately and realized in my reading that human limitations such as working memory [wikipedia.org] are a big part of why programming is difficult, and an abundant source of programming errors. Those limitations can't be wished away. Neither is it necessarily a wise allocation of labor to demand exceptional working memory as an entry condition to the career of programming

      • by Anonymous Coward

        I think Tim Cook is just publicizing Apple.

      • by TheRaven64 ( 641858 ) on Thursday January 25, 2018 @12:19PM (#56000309) Journal

        Back when I was a PhD student, I came across a study showing that about 10% of the population naturally uses hierarchies in their mental model of structure. This came up in the context of HCI research, where you find things like filesystem hierarchies that make complete sense to some people and are largely incomprehensible to others. This was one of the reasons for iTunes' early success (before version 5, when they completely screwed up the UI): music was in a flat library, with arbitrary filters. You could filter by album, artist, or genre independently, there was no hierarchical structure. Geeks said 'why would I need this, I already have my music in a music/{genre}/{artist}/{album}/ hierarchy, people too stupid to understand that shouldn't use computers'.

        Why am I talking about this? Because almost all mainstream programming languages implicitly adopt hierarchical structures. We have namespaces containing classes containing instance variables and methods. We have nested scopes. We have call stacks of subroutines (though coroutines are starting to come back into fashion).

        So what makes Swift different? Absolutely nothing. It has a load of marketing behind it, but structurally it is no different from any other Algol family language with some Smalltalk influence. It requires thinking in precisely the same way as Objective-C or Java, it just spells some of the things differently. And it is both more verbose and slower than Objective-C++ for pretty much every task.
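
        To make that concrete, here's a rough sketch (names made up, not anything from Apple's curriculum) of the nesting any Algol-family language, Swift included, asks you to hold in your head: a namespace-like enum containing a type, which contains stored properties and methods, called from a nested scope.

        enum MusicLibrary {                    // namespace level
            struct Track {                     // class/struct level
                let artist: String             // instance variable level
                let album: String

                func describe() -> String {    // method level
                    return "\(artist) - \(album)"
                }
            }

            static func demo() {
                let tracks = [Track(artist: "A", album: "X"),
                              Track(artist: "B", album: "Y")]
                for track in tracks {          // nested scope, calls stacking up
                    print(track.describe())
                }
            }
        }

        MusicLibrary.demo()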

        • by dgatwood ( 11270 )

          You could filter by album, artist, or genre independently, there was no hierarchical structure. Geeks said 'why would I need this, I already have my music in a music/{genre}/{artist}/{album}/ hierarchy, people too stupid to understand that shouldn't use computers'.

          Those geeks were nuts. Manually creating a hierarchy where you can look things up by genre, artist, album, and title means creating at *least* three symlinks for every single file you add, and if it involves more than one artist, potentially more than

  • by Austerity Empowers ( 669817 ) on Thursday January 25, 2018 @09:07AM (#55998835)

    Code like a beast Bro! Bro that code into shape! Be awesome! Beer at noon. Pointers? What are you a nerd? Memory management? That's like for the CPU to deal with, bro, be bro! Efficient code? BRO! They keep making faster CPUs! Mutilate that code!

    Bro, it's got what your body craves.

    • However, for a lot of the most common programming you will always be writing the same code over and over again to deal with these internals.
      Making sure you don't miss pointers, having to measure every value... all for the normal CRUD application we need to produce over and over again.

      Because of these higher-level languages the stability of our code has noticeably increased. Think back to using a computer in the 1990s. Even if you had a solid OS such as a Unix, Linux, even NT which wouldn't cause the

    • Bros can code, and them getting into coding means more competition for scarce jobs and wages. Nerd stuff is the one thing us nerds had. And it's been taken away from us.
      • No not really, I still code in C. I don't see it going away any time soon. I'm not releasing UI layer applications, and not terribly concerned with first to market.

        It frustrates me that C#/Swift/etc. are being pushed so hard at the application layer and forcing C coders to do a lot of undocumented and probably shadier stuff than they were doing before, just to use OS API calls to functions that we all know were coded in C to begin with.

    • by Anonymous Coward on Thursday January 25, 2018 @09:34AM (#55999047)

      Because the world needs more "programmers" that think like this.....

      Do us all a favor Cook, get rid of it. If they need a "cool" language to be programmers, then we DON'T need them to be programmers. We WANT the geeks who will try to make their code as efficient as possible, or implement a RFC with immaculate detail. We WANT them to know the limits and pitfalls of their chosen language forwards and backwards so that they make better more secure code. We DON'T want people who will only code if a monkey can do it. (Chances are you could automate that anyway.)

      I'll take the geek's code over the quarterback's code any day, and if we had our way Cook, you wouldn't have a choice about it either. (We want some laws to forbid the company ideal of profit over responsible software development.)

      • We WANT the geeks who will try to make their code as efficient as possible

        We do, but we also want people who will spend 10 minutes throwing together a simple program that will automate something in their workflow and save them half an hour a week for the rest of the year. We don't really care if it takes 30 seconds of CPU time to run, when a good programmer could optimise it to run in under a millisecond, because it's still far cheaper to spend 30 seconds of CPU time than 30 minutes of human time each day.
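
        For instance, a hypothetical ten-minute throwaway in Swift (the file name and log format here are made up): tally how often each status code appears in a log so nobody has to count by hand. Nothing about it is optimised, and it doesn't need to be.

        import Foundation

        let path = "/tmp/requests.log"                       // made-up input file
        guard let text = try? String(contentsOfFile: path) else {
            print("could not read \(path)")
            exit(1)
        }

        var counts: [String: Int] = [:]
        for line in text.split(separator: "\n") {
            // assume the status code is the last whitespace-separated field
            if let status = line.split(separator: " ").last {
                counts[String(status), default: 0] += 1
            }
        }

        for (status, count) in counts.sorted(by: { $0.value > $1.value }) {
            print("\(status): \(count)")
        }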

        We WANT them to know the limits and pitfalls of their chosen language forwards and backwards so that they make better more secure code

        If they're writing reusable code, or code that's taking untrusted data and

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday January 25, 2018 @09:07AM (#55998839)
    Comment removed based on user account deletion
    • Re: Wait, what? (Score:2, Insightful)

      by Anonymous Coward

      Not trolling, but clever marketing. Look at his audience. He's trying to hook the kids before they get comfortable with other tools and environments.

    • Re:Wait, what? (Score:5, Interesting)

      by AHuxley ( 892839 ) on Thursday January 25, 2018 @09:45AM (#55999121) Journal
      It's virtue signalling and educational political correctness.
      So much money has been added to gov education over the decades from the gov and private sector in the USA.
      The amount per student in some cities and states should have produced amazing results if a lack of spending in the past was the only problem.
      After decades of testing the results are not looking as good as expected. Average students given support, funding, new computers, GUI robots, and computers in the home still fail to study, won't learn, can't pass tests, can't pass exams.

      The politically correct educators can't admit they got it so wrong for many decades and that all that new funding was wasted on below-average students.

      So it has to be whatever was being used to educate the very average and below-average students. Change the computer education and the results for below-average students will improve, for some reason.
      More spending and a new way to look at computers has to work in ways that past funding and new computers did not.
      Everyone just wants to keep the funding going and see the next gen of computers sold and supported.
      The sales pitch is the new language. In the past it was robot kits, tablets, laptops, GUIs, desktops, and new calculators with the needed new textbooks.
      The test results stay the same every generation because the problem is not a lack of funding, a lack of computers, or the wrong computer language.

      The students just won't study, can't study, and have no interest in studying. Every other aspect of education has been improved: books, GUIs, buildings, food, more teachers, better teachers, more gov money, private-sector money, computers.
      • Re:Wait, what? (Score:5, Informative)

        by werepants ( 1912634 ) on Thursday January 25, 2018 @10:17AM (#55999325)

        So much money has been added to gov education over the decades from the gov and private sector in the USA.

        The amount per student in some cities and states should have produced amazing results if a lack of spending in the past was the only problem.

        What the hell are you talking about? In my state, total per-pupil funding is about $7000-$8000/yr, and has barely been keeping pace with inflation. For reference, daycare for one kid costs about $2000 PER MONTH here - what the schools get is a pittance by comparison. And keep in mind that daycare can be done by college students and stay-at-home moms, while teachers must have a bachelor's degree, minimum, and often have an advanced degree. Many of those are STEM degrees, worth quite a bit in industry.

        The schools haven't been adequately funded for decades, and things are only getting worse.

        • by AHuxley ( 892839 )
          The US private sector has added its funding too in some cities and states. Nothing really produced the exam and test results expected, given the amount of funding. It's not a lack of funding per student.
        • by ZiakII ( 829432 )
          Depends on the state; we spend from $20,000 up to $35,000 in NJ.
        • So much money has been added to gov education over the decades from the gov and private sector in the USA.

          The amount per student in some cities and states should have produced amazing results if a lack of spending in the past was the only problem.

          What the hell are you talking about? In my state, total per-pupil funding is about $7000-$8000/yr, and has barely been keeping pace with inflation. For reference, daycare for one kid costs about $2000 PER MONTH here - what the schools get is a pittance by comparison. And keep in mind that daycare can be done by college students and stay-at-home moms, while teachers must have a bachelor's degree, minimum, and often have an advanced degree. Many of those are STEM degrees, worth quite a bit in industry.

          The schools haven't been adequately funded for decades, and things are only getting worse.

          You must be in a low cost of living state: see this [governing.com].

          The comparison to daycare is also bizarre. Daycare, you may be surprised to learn, lasts all day, and doesn't take random days, weeks, and MONTHS off.

        • Re:Wait, what? (Score:5, Informative)

          by Solandri ( 704621 ) on Thursday January 25, 2018 @12:15PM (#56000259)
          Schools are vastly over-funded. The U.S. spends more on education per student [oecd.org] than any country except Switzerland. While a few states dip into the $7k/yr per student range you give [governing.com], the national average is over $12k/yr per student [ed.gov].

          Total expenditures for public elementary and secondary schools in the United States in 2013–14 amounted to $634 billion, or $12,509 per public school student enrolled in the fall (in constant 2015–16 dollars).

          (Discrepancy with the OECD stats is due to being from different years, and the OECD stats including post-secondary non-tertiary education, while the NCES stats are for only K-12).

          Spending per student has about doubled in inflation-adjusted dollars [justfacts.com] over the last 40 years, and tripled since the 1960s. It peaked around 2007, and the people trying to get even more money put into education have been abusing that by using 2007 as the start of their spending graphs.

          Where is all the money going? I don't have time to find it again, but the Education Department's own stats are contradictory. If you take the amount of spending it lists in teacher non-salary benefits, and divide it by the number of teachers they give, it ends up something like $50k/yr per teacher. What's going on is the number of non-teaching administrators has exploded [heritage.org] since 1970, far outpacing the growth in number of students. These administrators have been hiding it by shifting some of their salary expenses into those of teachers in the stats. Every time education receives a spending increase, the administrators sop up most of it and let only a trickle get through to teachers. Every time education receives a spending cut, these administrators pass it all on directly to the teachers and students, while protecting their own jobs and salaries. As a result, the teachers are constantly complaining of not having enough money despite the huge increases in education spending over the decades.

      • but more specifically it's _unequal_ funding. In America we fund schools with local property taxes. We do this so that rich people don't have to pay for the poor to be educated. To the point where if you try to send your kid to one of the nicer schools you'll get prosecuted for theft.

        The result is that if you look at per-capita spending on schools it's very, very high but if you look at the results across entire districts they're very, very poor. It doesn't help that America has a massive underclass of
        • by AHuxley ( 892839 )
          That was corrected for by political funding and the private sector wanting to be seen supporting poor areas. Even with much more spending per student, the tests did not get passed and the exams did not get passed, despite the costs of supporting new computers, GUI robot kits, and laptops for each average student.
          If a simple lack of funds had been the problem, the extra funding would have shown up as much better results over the generations.
      • by Rande ( 255599 )

        The achievement of students is highly correlated with the success of their parents, much higher than the quality of the school and the teachers and materials.

        So instead of mucking around spending money on schools, how about we get better parents?

      • I never coded in Swift so just searched for an example and found this, literally the first Swift code block I ever saw:

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            // Create a button which when tapped blah blah
            let button = UIButton(type: UIButtonType.system) as UIButton
            let xPostion: CGFloat = 50

        and so on (along with the misspelling of xPosition). This doesn't look to me any different nor a

    • Re:Wait, what? (Score:4, Insightful)

      by Hal_Porter ( 817932 ) on Thursday January 25, 2018 @09:57AM (#55999211)

      I dunno, I think he's smarter than he looks.

      Consider.

      Back when people wrote code in Objective C it was easy to have some Objective C for the iOS UI, some Java for the Android UI and a big gob of portable C/C++.

      Now if they write the whole app in Swift it will be easier to get it running on iOS. And it seems like there are various projects to get Swift running on Android too.

      E.g.

      https://medium.com/@ephemer/ho... [medium.com]

      I.e. Apple have something which is a competitor to writing everything in C# and using Xamarin to target both platforms.

      Xamarin has always seemed a bit horrid to me, frankly. And doing the 'big gob of portable C/C++' with two sets of UI code is also horrid.

      If Apple can build a platform that people use for iOS apps knowing they can run well on Android, they've got a pretty compelling platform. And if it turns out not to work very well on Android, they've got more iOS-exclusive applications.
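
      Roughly, I'd expect the structure to look something like this sketch (names made up; whether a given toolchain actually accepts the os(Android) condition is an assumption on my part): portable core logic in plain Swift, with thin per-platform shims picked at compile time.

      struct CartTotal {
          // Portable business logic: nothing platform-specific in here.
          static func total(prices: [Double], taxRate: Double) -> Double {
              let subtotal = prices.reduce(0, +)
              return subtotal * (1 + taxRate)
          }
      }

      func formattedTotal(prices: [Double]) -> String {
          let value = CartTotal.total(prices: prices, taxRate: 0.08)
          #if os(iOS)
          return "iOS UI shows: \(value)"      // would hand off to UIKit in a real app
          #elseif os(Android)
          return "Android UI shows: \(value)"  // would hand off to the port's UI layer
          #else
          return "Total: \(value)"
          #endif
      }

      print(formattedTotal(prices: [9.99, 4.50]))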

      • Grow up.

        Programming is *hard*. It's hard for a variety of reasons, but one of which is the mutation of your end user's problem into something that can be automated. The language is really the last thing that's a barrier to new entrants. A failure to teach problem decomposition is where I'd address this, first.

        Swift. Spare us, who make a living doing this, the next Silver Bullet.

        --#

      • How is Xamarin horrid? It's not even a language.
  • by klubar ( 591384 ) on Thursday January 25, 2018 @09:10AM (#55998857) Homepage

    If you want easy to use languages that teach the concept of programming (as opposed to ones for developing professional applications) there are better choices.

    I think there needs to be a distinction between intro languages and ones used for developing complex, large applications. It's great to give beginners (whatever their age) an intro to programming and maybe Swift is the language for this.

    This is sort of the same as the woodshop class for 7th graders that doesn't use power tools. Great intro to woodworking, but not the approach you'd use if you were building a house. The class might inspire kids to learn more about the field -- which is all you are looking for.

  • To me it looks like JavaScript on top of ObjC with a little bit of Rust syntax here and there. Not really all that revolutionary. The thing it does for newcomers (young or old) is that it takes the C feel out of iPhone development.
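
    A made-up snippet to show what I mean: let and trailing closures read like JavaScript, optional chaining echoes ObjC's tolerance of nil, and guard with pattern matching has a Rust-ish flavour.

    let names = ["Ada", "Grace", nil, "Barbara"]

    // JavaScript-ish: a trailing closure passed to a higher-order function.
    let greetings = names.compactMap { name in
        name.map { "Hello, \($0)" }
    }

    // ObjC-ish: optional chaining quietly yields nil instead of crashing.
    let firstLength = names.compactMap { $0 }.first?.count

    // Rust-ish: guard plus pattern matching to unwrap or bail out.
    func describe(_ value: Int?) -> String {
        guard case let .some(n) = value else { return "no value" }
        return "value is \(n)"
    }

    print(greetings, firstLength ?? -1, describe(firstLength))
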
  • by Gravis Zero ( 934156 ) on Thursday January 25, 2018 @09:15AM (#55998907)

    Just because the CEO of Apple says something doesn't mean he's not totally full of shit. Does anyone honestly think removing the 3.5mm jack from the iPhone was about courage?

    • Just because the CEO of Apple says something doesn't mean he's not totally full of shit.

      Of course not. If the CEO of Apple says something, that DOES mean he's full of shit.

    • Tim Cook is out within a year anyway, 1.5 max. And he knows it.
    • Does anyone honestly think removing the 3.5mm jack from the iPhone was about courage?

      Yes, it was about having the courage to see if you could get your customers to give you more money by having to buy new earbuds from you. Apparently they will.

      The next bit of courage was removing the function keys and all but one of the ports while raising the price significantly and using one year old GPUs and CPUs...and apparently customers still bought it.

      At this point, Apple is just trolling its customers to see how bad it has to get before they refuse to buy things. If you have any doubts about

  • Less geeky? (Score:5, Funny)

    by Daetrin ( 576516 ) on Thursday January 25, 2018 @09:19AM (#55998935)
    I've never used Swift, let's see what the code samples on wikipedia look like [wikipedia.org]...

    var str = "hello,"
    str += " world"
    ---
    let myValue = anOptionalInstance?.someMethod()
    ---
    let leaseStart = aBuilding.TenantList[5].leaseDetails?.startDate
    ---
    guard let leaseStart = aBuilding.TenantList[5]?.leaseDetails?.startDate else {
        // handle the error case where anything in the chain is nil
        // else scope must exit the current method or loop
    }
    ---
    protocol SupportsToString {
        func toString() -> String
    }
    extension String: SupportsToString {
        func toString() -> String {
            return self
        }
    }
    ---
    func !=<T: Equatable>(lhs: T, rhs: T) -> Bool

    Ahh yes, it's very clear how Swift is so much less "geeky" than other languages like C# or Java! I'm sure a student looking at it for the first time would instantly realize how much better it is instead of saying "that's not for me"!

  • by aaarrrgggh ( 9205 ) on Thursday January 25, 2018 @09:22AM (#55998965)

    I am sure there is an intelligent comment in there somewhere; does anybody have any idea what he means? I can see how someone could apply "too geeky" to certain languages which makes them a poor choice for your first useful program... but talking to 7th graders it doesn't really make sense to talk about C, at least to me.

    • by mark-t ( 151149 )

      I am sure there is an intelligent comment in there somewhere

      When you begin with a flawed premise, your chance of arriving at the correct conclusion drops significantly.

    • by Jeremi ( 14640 )

      I am sure there is an intelligent comment in there somewhere; does anybody have any idea what he means?

      I think he's referring not to the language's syntax (which isn't all that different from many other computer languages) but rather to Apple's approach to teaching the language, which involves interactive playgrounds [apple.com] rather than the traditional "type in a few paragraphs of mysterious text into a blank IDE and hope something happens" approach.

    • by AHuxley ( 892839 )
      Follow the money.
      City and state govs have another generation that can't pass a test after all the tablets, new teachers, GUI robots, laptops, internet, desktop computers, and new buildings.
      Nobody wants to admit it's the many below-average students who were given the education funding over the past decades in parts of the USA.
      Funding that was expected to ensure everyone could get a good job or get into university.
      It was only a lack of money that held back poor areas; add the funds and it would all be equal. Exam resul
  • Good move from Apple, serving its own interests!

    Programmers are the bottleneck of the digital economy. Scarce and valuable resources. So the master plan looks good!

    1. Design a fun and easy language to develop powerful applications
    2. Get the kids hooked on it (meaning Dad and Mom have to pay to buy a Mac, AND the school too)
    3. Say: hey! you can put your first app on the AppStore and maybe earn monies
    4. Get more adult programmers trained on Apple only dev env.
    5. $$$
    6. Create another new and fun and easier languag

    • by AHuxley ( 892839 )
      It's like selling the calculator and the needed math book to schools together with a way of educating that needs a brand of calculator.
      Once the gov accepted the method, the books, calculator, and support had to be accepted.
      But now it's new code and a GUI that have to be "upgraded" every year due to advancements.
      Teaching only works well with the brand. The brand sets the computer and math standards.
  • by volodymyrbiryuk ( 4780959 ) on Thursday January 25, 2018 @09:29AM (#55999009)
    Because you can use emoji characters? "Oh look how cute I can name this variable *pile of poo emoji* now I want to be a programmer more than anything in the world".
    • Because you can use emoji characters? "Oh look how cute I can name this variable *pile of poo emoji* now I want to be a programmer more than anything in the world".

      OK, so my son would think that was funny and cool, and likely be more interested because of it.

      But he'll grow out of it, I swear ...

  • I think it is more likely, if there is even a correlation such as Cook observed, that Apple simply happened to invent Swift around the same time that students were starting to stick with programming more than they had been. This is Cook giving Apple credit for something there is precisely zero evidence they had any influence on at all.

    It's roughly on par with thinking that pressing a street crossing button multiple times is going to make the light change any faster than just

  • by geekmux ( 1040042 ) on Thursday January 25, 2018 @09:29AM (#55999019)

    Complaining that programming code is "too geeky" is like complaining that a steamroller is "too flatty".

    We have enough issues and vulnerabilities being generated today by the "geeks" who have the mental capacity and intelligence to code.

    The last thing software security and integrity needs is coding dumbed down to the point where Cletus T. Dipshit is at the programming helm of next-gen solutions.

  • I can't stand that word...

  • by Oswald McWeany ( 2428506 ) on Thursday January 25, 2018 @09:31AM (#55999029)

    Not sure if Speedware is still around. I briefly did a little work with that. They had one command that was ahead of its time. Would have been a hit with millennials.

    Do Nothing;

    Yes, that was the command. Don't remember the rest of the Speedware syntax, so below is wrong, but the idea was:

    if (x)
    {
        Do Nothing;
    }
    else
    {
        CallMyProcedure();
    }

    For some reason they preferred "Do Nothing" over saying "Not X" in the If statement. I used to pepper my code with "Do Nothing" just to be silly. (I was young and liked having a laugh back then.)

    • by chefren ( 17219 )
      Hey if you get paid by or evaluated based on the number of lines of code you write, this probably seems like a great idea! If every other line of code you write is "Do Nothing;" the amount of bugs per line in your code is halved. Great success!
    • by tepples ( 727027 )

      I guess it's like Python pass, which allows filling a syntax slot that requires a compound statement (such as catching an exception or defining a dummy function or class) but otherwise is like writing the constant None on a line by itself.
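
      A rough Swift-side analogue of Python's pass (made-up names): empty braces fill the same required slot.

      enum Placeholder: Error { case notYetImplemented }   // made-up error type

      func doNothing() { }    // empty body, like `def f(): pass`

      struct Stub { }         // empty type, like a Python class whose body is just `pass`

      func tryIt() {
          do {
              throw Placeholder.notYetImplemented
          } catch {
              // deliberately ignore the error; the empty block itself is the no-op
          }
      }

      tryIt()
      doNothing()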

      • Most languages have a no-op statement. C allows just a plain ";" as a do nothing statement. Even going back to Fortran, there was CONTINUE [oracle.com]. It's a very old concept.

        • Most languages have a no-op statement. C allows just a plain ";" as a do nothing statement. Even going back to Fortran, there was CONTINUE [oracle.com]. It's a very old concept.

          Very true, although ";" is decidedly less amusing than "Do Nothing"... at least to me.

    • if (!x) { CallMyProcedure(); }

      There fixed that for you.
  • it still doesn't feel good to me to use

  • by Anonymous Coward on Thursday January 25, 2018 @09:45AM (#55999119)

    Probably the students had the (usual) comparison between Swift and LOLCODE. Here is a Wikipedia example of Swift:

    guard let leaseStart = aBuilding.TenantList[5]?.leaseDetails?.startDate else {
        // handle the error case where anything in the chain is nil
        // else scope must exit the current method or loop
    }

    Here is an example of LOLCODE:

    HAI 1.0
    CAN HAS STDIO?
    I HAS A VAR
    IM IN YR LOOP
          UP VAR!!1
          VISIBLE VAR
          IZ VAR BIGGER THAN 10? KTHX
    IM OUTTA YR LOOP
    KTHXBYE

  • Tim Cook? Really? (Score:5, Insightful)

    by Opportunist ( 166417 ) on Thursday January 25, 2018 @09:47AM (#55999131)

    That's the man who thought removing the headphone jack from a cellphone was a good idea and that non-replaceable batteries are what customers want.

    Who in their sane mind listens to an imbecile like that?

    • That's the man who thought removing the headphone jack from a cellphone was a good idea and that non-replaceable batteries are what customers want.

      Who in their sane mind listens to an imbecile like that?

      Ever seen the unit sales of iPhones, even those without a headphone jack?

      And virtually NO phone has a removable battery. But in iPhones (and many others) they ARE replaceable.

  • It's still an Apple-only language and as such should be avoided by beginners at all costs.

    And no, crappy Swift ports for other platforms, with zero real-world use, don't count.

    • crappy Swift ports for other platforms, with zero real-world use, don't count.

      These ports would have to keep up with the huge language syntax changes. Not easy!

  • Hypercard [arstechnica.com] was far ahead of its time. Unfortunately, most of these friendly languages are not very effective at hardcore tasks.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Thursday January 25, 2018 @10:15AM (#55999311)

    Yes, PLs need to be consistent, easy to learn, and easy to use. All true. But PLs also need to offer easy solutions to tougher everyday problems: cross-platform portability, the ability to easily abstract away the hard stuff like networking, GUI, graphics and such, and an easy, integrated way to switch from OOP to functional to sequential, from event-driven to imperative and back.

    The PL squaring that circle best right now is Python. And it shows, as Python is the only PL used professionally in every field you can think of while at the same time being known as a very n00b-friendly PL. If Apple wants Swift to compete with or beat Python in that field, they have to offer all that Python offers + a free cross-platform IDE + a binary cross-compiler for all major platforms including mobile. You know, like Python freezing, only better. That would be something new and would get opinion leaders on board. Until then I'm not holding my breath.

    My 2 cents.

  • coding languages were too geeky

    coding is a horizontal skill like ... mathematics

    So which one is it? Can programming languages be "less geeky" or not? And what the hell is a horizontal skill? I'd think that would be swimming or bench pressing or something.

  • I mean, come on, BASIC isn't geeky, and neither are C, C++, etc. And now Python, sure.
  • almost by definition. Film at 11.
  • I always thought Swift took many concepts from C#. Programming is geeky, what is the big deal?
