
Four Years On, Developers Ponder The Real Purpose of Apple's Swift Programming Language (monkeydom.de) 261

Programming languages such as Lua, Objective-C, Erlang, and Ruby (on Rails) offer distinct features, but they are also riddled with certain well-documented drawbacks. However, writes respected critic Dominik Wagner, their origination and continued existence serve a purpose. In 2014, Apple introduced the Swift programming language. Four years on, Wagner and the many developers who shared his blog post over the weekend wonder what exactly Swift is trying to solve, capturing the struggle that at least a portion of developers writing in Swift face today. Writes Wagner: Swift just wanted to be better, more modern, the future -- the one language to rule them all. A first red flag for anyone who ever tried to do a 2.0 rewrite of anything.

On top of that it chose to be opinionated about features of Objective-C that many longtime developers consider virtues, not problems: adding compile-time static dispatch, making dynamic dispatch and message passing a second-class citizen and introspection a non-feature. Defining the convenience and elegance of nil-message passing only as a source of problems. Classifying the implicit optionality of objects purely as a source of bugs. [...] It keeps deferring the big wins to the future while it only offered a very labour-intensive upgrade path. Without a steady revenue stream, many apps that would have just compiled fine if done in Objective-C either can't take advantage of new features of the devices easily, or had to be taken out of the App Store altogether, because upgrading would be too costly. If you are working in the indie dev scene, you probably know one of those stories as well. And while this is supposed to be over now, the damage has been done and is real.
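The nil-message-passing contrast Wagner draws can be sketched in a few lines of Swift (the `name` variable is hypothetical):

```swift
// In Objective-C, messaging nil quietly answers zero/empty:
//   NSString *name = nil;
//   NSUInteger len = [name length];   // len == 0, no crash, no warning
// Swift instead encodes the possibility of nil in the type:
let name: String? = nil

// 'name.count' alone would not compile; the caller must say what nil means:
let len = name?.count ?? 0   // optional chaining with an explicit default
print(len)                   // prints "0"
```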

On top of all of this, there is that great tension with the existing Apple framework ecosystem. While Apple did a great job of exposing Cocoa/Foundation to Swift as graspably as they could, there is still great tension between the way Swift wants to see the world and the design paradigms that created the existing frameworks. That tension is not resolved yet, and since it is a design conflict, it essentially can't be resolved. Just mitigated. It runs from old foundational design patterns of Cocoa, like delegation, data sources, and flat class hierarchies, through to the way the collection classes work and how forgiving the API in general should be. If you work in that world you are constantly torn between doing things the Swift/standard-library way, or the Cocoa way and bridging in-between. To make matters worse there are a lot of concepts that don't even have a good equivalent. This, for me at least, generates an almost unbearable mental load.


Comments Filter:
  • Walled garden (Score:4, Insightful)

    by LynnwoodRooster ( 966895 ) on Monday June 11, 2018 @10:48AM (#56764964) Journal
    They already have one for consumers, this just makes it easier to put one up for developers.
    • Re: (Score:2, Troll)

      by mwvdlee ( 775178 )

      This.

      The purpose of Swift is to make it more expensive to support multiple platforms.

      • Near as I can tell this is one of two purposes. The other purpose is that many "app" developers are not professional programmers.

        Anyway, the lack of support for C/C++ is going to hurt them in the long run, probably not even that long run. For example if they want to improve Metal adoption, they probably need to get C support out there soonest.

        There are plenty of languages out there that address one concern or another that various types of programmers have or don't want to deal with, and C/C++ support ensure

        • For example if they want to improve Metal adoption, they probably need to get C support out there soonest.
          I doubt you'll find many macOS or iOS developers that are fluent in C - or willing to use C. Why anyone would even think about C for a desktop OS or for writing Apps is beyond me ... why not write straight in assembler?

          • Re:Walled garden (Score:4, Insightful)

            by Austerity Empowers ( 669817 ) on Monday June 11, 2018 @12:21PM (#56765646)

            Most programmers aren't going to write directly against a graphics API either, but those APIs are typically consumed by big engines that are written in C/C++, for good reasons, and it is hard to write Swift that calls C/C++ and vice versa.

              C++ objects and Swift objects don't interact seamlessly, however writing the relevant glue code is easy to google, e.g. http://www.swiftprogrammer.inf... [swiftprogrammer.info]

              For C-APIs you don't need glue code, only Swift function definitions: https://theswiftdev.com/2018/0... [theswiftdev.com]
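              As a minimal sketch of the no-glue-code claim for C APIs (the module names vary by platform, as noted in the comments):

              ```swift
              #if canImport(Darwin)
              import Darwin   // macOS: exposes the C standard library
              #else
              import Glibc    // Linux
              #endif

              // C's strtod is imported as an ordinary Swift function; a Swift
              // String bridges implicitly to its UnsafePointer<CChar> parameter.
              let value = strtod("2.5", nil)
              print(value)   // prints "2.5"
              ```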

              So what exactly is your point?

              • This is the opposite of what is needed. If you have your favorite high level graphics engine (not necessarily Unity or Unreal, if you're a small indy), it's probably in C++, sometimes plain old C. The good ones have abstracted the underlying OpenGL/DirectX interaction, so if you want to support Metal, which you really might wish to do, you have to figure out how to call it. It's designed to be called from Swift. You're not going to rewrite your entire engine in Swift just to support OS X, which isn't the wo

            • by Dog-Cow ( 21281 )

              Calling C is trivial in Swift. Grouping C and C++ together in this respect just makes you look like an idiot.

        • by Dog-Cow ( 21281 )

          What lack of C/C++ support? Swift can't bridge to C++, but it does bridge nicely to C. And all platforms which Swift supports also support C and C++. Do you have an actual argument, or are you just spouting stuff that looks technical, but makes no sense?

  • by b0s0z0ku ( 752509 ) on Monday June 11, 2018 @10:48AM (#56764966)
    Apps written in different languages are hard to compile across platforms.
    • The problems are not the languages but the UI Frameworks (and Filesystem and Network Access).

      • Mac OS is based on one of the BSD's... it should just be POSIX-compliant. That's WHY we have standards.

  • Like the many languages that have unsuccessfully tried to do that over the decades. Jack of all trades, master of none.
  • by QuietLagoon ( 813062 ) on Monday June 11, 2018 @10:53AM (#56765008)
    Don't look for reasons why Swift may be technically superior. Look for reasons why Apple wants Swift to keep developers locked inside the Apple world. Every minute that a developer spends learning Swift is a minute not spent learning a non-Apple technology.
    • > Every minute that a developer uses to learn Swift, is a minute not spent on learning a non-Apple technology

      And that was true of Obj-C for the last 30 years, so... yeah, solid argument there.

      • ...And that was true of Obj-C for the last 30 years, so... yeah, solid argument there....

        Obj-C was not developed by Apple. It was used by Apple (and, btw, NeXT), but it was developed separately from Apple by StepStone.

        • by Dog-Cow ( 21281 )

          Can you name a platform that has an Objective-C compiler today, and not a Swift compiler? An original NeXTstation, I guess.

          • Re: (Score:2, Informative)

            by Anonymous Coward

            gcc has an Objective-C compiler. It’s just as standard as the C and C++ compilers, though your package manager almost certainly has it as a separate package from the C and C++ compilers (which are, themselves, also usually separate).

            It’s not particularly difficult to get your hands on an Objective C compiler.

          • by angel'o'sphere ( 80593 ) on Monday June 11, 2018 @12:15PM (#56765616) Journal

            Every Linux distro that installs the gcc suite has an Objective-C compiler ...
            And if you really want one: a Swift compiler too, directly from swift.org: https://swift.org/download/ [swift.org]

          • Clang supports all of the features of Objective-C on all *NIX platforms and Windows. The GNUstep Objective-C runtime, in combination with clang, supports a superset of the language features of Apple's implementation and is used by Microsoft in their Windows bridge for iOS. You might be surprised at some of the places Objective-C ends up - there are quite a few larger (mostly non-Web, though there are also things like SOGo) server systems that use Objective-C - I've consulted for a couple of companies that

          • by guruevi ( 827432 )

            Every single platform has an Objective-C compiler. Just because it's not in VisualStudio doesn't mean it doesn't exist.

        • by TechyImmigrant ( 175943 ) on Monday June 11, 2018 @12:05PM (#56765526) Homepage Journal

          ...And that was true of Obj-C for the last 30 years, so... yeah, solid argument there....

          Obj-C was not developed by Apple. It was used by Apple (and, btw, NeXT), but it was developed separately from Apple by StepStone.

          Because Apple used Pascal, then extended it to object pascal. Then people wanted C, but they needed object semantics to port the object pascal libraries and C++ wasn't ready and was ugly as sin, so they went with Objective C, which is uglier. They were swept along with the stream like everyone else.

          The real problem with Swift is not the language, it's that when you try to use Swift to program a Mac or iPhone, you spend most of your programming time interacting with the libraries in arbitrary ways invented by Apple, and every example on the internet seems to be written in Objective-C. You're left guessing at what the Swift equivalent would look like.

          • by EMB Numbers ( 934125 ) on Monday June 11, 2018 @01:15PM (#56766032)

            You have your history completely wrong.

            Objective-C was created by Dr. Brad Cox in the Mid 1980s - https://en.wikipedia.org/wiki/... [wikipedia.org]
            Objective-C is a mashup of ANSI C and Smalltalk. Smalltalk is the original object-oriented programming language, created by Alan Kay in the 1970s. Alan Kay coined the term "object oriented" to describe Smalltalk. He later said, "I coined the term Object Oriented, and I promise, C++ is not what I had in mind." He has also said he wished he had called Smalltalk "Message Oriented," because messages are the key, not objects.

            Steve Jobs' NeXT Computer Corp. used Objective-C to create the NeXTstep graphical frameworks currently known as Cocoa (Foundation Kit, App Kit, etc.) and Cocoa Touch (UIKit etc.). NeXTstep shipped commercially in October 1988.

            Apple purchased NeXT in December 1996, and NeXTstep eventually became Mac OS X and iOS.

            • >You have your history completely wrong.

              Not at all.

              I'm talking about Apple's adoption of languages. If you are of a certain age, you will remember a time before NeXT, before SJ left Apple when there were Apple ][s, Apple //es, Lisas, Early Macintoshes. At no point did I claim that Apple invented these languages. Why are you implying I did? Why is the name of the creator of Objective C relevant to my statements about Apple's adoption of Objective C?

              You may be 12 years old and so were not around when these

        • Yeah, but Apple is the only entity that still controls platforms based upon it. The only other entity I'm even aware of that uses Objective C is the GNUstep project, which is trying to replicate Apple's APIs.

          So the GP is right. Nobody claimed Apple developed it; the claim is that it's essentially a technology that, if you use it, locks you into Apple's ecosystem.

          That said... I don't know that this is the purpose of Swift. The language itself is fairly open, and it came about at about the same time

          • by Dog-Cow ( 21281 )

            Just because no one else is doing serious development with ObjC does not mean you are locked in. The clang compiler is part of the LLVM suite, and is available on every platform that the rest of LLVM is available on. Nothing is stopping anyone from creating ObjC libraries on non-Apple platforms.

        • ...And that was true of Obj-C for the last 30 years, so... yeah, solid argument there....

          Obj-C was not developed by Apple. It was used by Apple (and, btw, NeXT), but it was developed separately from Apple by StepStone.

          Yeah, for a while, then it was licensed by NeXT in 1988, who extended the language a bunch, and then it was transferred to Apple when Apple acquired NeXT.

          It is currently listed as being owned by Apple.

    • Why do people write such nonsense?

      How can a developer be locked in? If I'm paid to write software for Apple OSes ... why would I care that "I'm locked to Swift"? Next year I write software for Linux and use C++ and Qt ... wow, that was easy.

        ...why would I care that "I'm locked to Swift"?...

        You don't need to care, Apple does. If you are writing for Swift, that is time you are not honing your skills in, or learning, other languages. Unless, of course, you've figured out how to tap into your other selves in the multi-verse...

          Apple does not care about that.

          If anyone cares, it is the programmer ... because it is his future job prospects that are at stake. Apple does not care if you switch from Mac/iOS development to Android/Linux/Windows ... aka C/C++/Java. It is completely irrelevant to them what a single developer or a group of developers does.

          If you are (1)writing for Swift, that is time you are not (2)honing your skills in, or (3)learning, other languages.
          Get a clue. What are you doing? Writing a program (1) for profit?, honing your skills in

      • by StormReaver ( 59959 ) on Monday June 11, 2018 @12:48PM (#56765824)

        How can a developer be locked in?

        It will become self-evident to you after you have written your first major application in a platform-specific language and/or environment that you want to make available on some other platform.

        Until then, I will provide you with the most common scenario:

        You are a company or independent developer who has unwisely decided to write thousands, tens of thousands, hundreds of thousands, or millions of lines of code which uses a vendor-specific language. You are happy with your product, and you are happy with your vendor. All is good in your world. You have spent a huge amount of time and money on making your products, and you can't afford to do it all again from scratch. But you're okay with that, because your vendor is AWESOME, and you just can't foresee a reason why you would ever leave such an incredible experience.

        You become disillusioned and/or pissed off with your language vendor (for whatever reasons; they don't matter for this discussion. Just know that it happens a lot), so you decide, "screw them! I'm out of there!" But, oops! You have written yourself into an intractable dependency on that vendor, because you didn't have the experience to understand how vendor lock-in works. You don't have the resources to rewrite from scratch, so you wind up having to surrender yourself to the whims of your vendor.

        You plead with your vendor to be reasonable, but your vendor DOES have the experience to know how vendor lock-in works. So not only does your vendor ignore your pleas for mercy, but they raise their prices and/or make more unreasonable demands of you that you have absolutely no power to fight. You find yourself with no choice but to either cave to your vendor's demands, or go out of business.

        You realize too late that had you used cross-platform technologies from either the very start, or at least switched to cross-platform technologies early on, you could have told your scumbag vendor to piss off; and then you could have found a different vendor to support you.

        • You do realize you are contradicting yourself?

          Either I'm happy with my product and it sells well, and hence I'm not (yet) affected by vendor lock-in, or I'm not, in which case I switch immediately.

          If I'm happy, I make so much money that I can rewrite the software anyway, so the "I don't have the resources" scenario can't actually happen.

          Then again: it is the customer who decides. If the customer runs only Macs, I have to write for Macs, plain and simple.

          Regarding the topic: you can easily write software for Macs in

    • Don't look for reasons why Swift may be technically superior. Look for reasons why Apple wants Swift to keep developers locked inside the Apple world. Every minute that a developer uses to learn Swift, is a minute not spent on learning a non-Apple technology.

      Um, Swift is Open Source.

      How does that lock ANYONE in to an "Apple World"?

      You stupid Haters REALLY need to think before posting your stupid, imbecilic drivel.

  • I'm out (Score:2, Insightful)

    by ReneR ( 1057034 )
    After writing an entire application stack in Obj-C / Cocoa (ExactScan, OCRKit, ...) we will not continue using Apple-only technology. Too much vendor lock-in, too much extra work porting and sharing code with other platforms. Yes, Swift may be partially vendor neutral, however all the Cocoa / AppKit / UIKit et al. APIs do not help, and Swift is otherwise not too native on Linux and Windows.
    • Re:I'm out (Score:4, Interesting)

      by lhunath ( 1280798 ) <lhunath AT lyndir DOT com> on Monday June 11, 2018 @11:56AM (#56765458) Homepage

      It is pretty hard to avoid vendor lock-in due to APIs nowadays.

      And of all the languages, I daresay Swift is one of the few that's clearly heading away from vendor specificity and toward open access.

    • by Dog-Cow ( 21281 )

      How is it not "native" on Linux? The toolchain creates native binaries, and Swift can interop with C, including the clib, just fine. There are no pre-existing UI toolkits, but that's hardly Apple's fault. It's not as if Apple wrote UIKit/AppKit just for Swift, after all.

    • by DarkOx ( 621550 )

      Swift is otherwise not too native on Linux and Windows.

      What exactly is native? Being native on Windows / Linux means being C/C++, or .NET on Windows thanks to a pretty herculean effort by MS to provide very complete platform integration (and even then you still have to call into first-party C/C++ libraries sometimes to get some things done effectively).

      As to Python / Ruby neither can do anything at all natively other than file and socket operations. Both do offer good web stacks probably because socket operations were easier than doing any kind of native inte

    • by Arkham ( 10779 )
      <quote>After writing an entire application stack in Obj-C / Cocoa (ExactScan, OCRKit, ...) we will not continue using Apple only technology. To much vendor lock in, too much extra work porting and sharing code with other platforms. Yes, Swift may be partially vendor neutral, however all the Cocoa / AppKit / UIKit et al. APIs do not help, and Swift is otherwise not too native on Linux and Windows.</quote>
      After writing an entire application stack in C#/.NET we will not continue using Micr
  • by ElitistWhiner ( 79961 ) on Monday June 11, 2018 @10:57AM (#56765038) Journal

    The only criterion that's relevant, if you're already supporting a profitable application on either the iOS or OS X platform, is maintainability. How fast does a language enable the rev of your code base, does it abstract your code base above platforms, and can its libraries and APIs bridge across manufacturer hardware swaps and recompiles without the cost of a total rewrite?

    Those were lessons learned during NeXT's transitions from little to big endian and the Mac OS X revs that Apple made 3X/yr during Steve Jobs' reign.

  • by Anonymous Coward

    Working for a well-known chip company here. I'm one of the DB guys. I know we dumped our in-house iOS team when the whole "port to Swift" BS started. Management took one look at the cost and outsourced the lot to eastern Europe.

      Working for a well-known chip company here. I'm one of the DB guys. I know we dumped our in-house iOS team when the whole "port to Swift" BS started. Management took one look at the cost and outsourced the lot to eastern Europe.

      Mmmm. Bet that is going just peachy.

      Stupid PHBs.

    • by Altus ( 1034 )

      Why on earth would you rush to port your entire existing app to swift? The languages interoperate pretty well.

  • Oh please. (Score:5, Insightful)

    by Anonymous Coward on Monday June 11, 2018 @11:11AM (#56765128)

    Firstly, it's not lock-in any more than Obj-C is. No other (serious) platform uses Obj-C. So stop whining about lock-in. Programmers on the Apple platforms don't give a crap about that anyway. We're there because we want to be. And most of us learned Obj-C after learning other languages, because the iOS SDK hasn't been around forever, so we all came from other languages. We aren't so stupid we can't learn something else, kthx.

    Secondly, it addresses some real pain points from Obj-C. One is verbosity. Another is the fact that nil objects don't crash the app, which makes bugs hard to find. Inconsistent message vs. function-call syntax. Inconsistent property vs. method syntax. I could go on and on.

    Thirdly, it addresses type safety. Obj-C will let you have an array of id, which is like having an array of java.lang.Object and trusting the programmer to use it appropriately. Anyone who argues that strong type safety isn't better has never worked outside of some niche application. This is enough of a pain point with Obj-C that it deserves its own paragraph.
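    A two-line sketch of that point: a Swift array is generic over its element type, so the NSArray-of-id failure mode becomes a compile error (the `scores` data is made up):

    ```swift
    var scores: [Int] = [90, 85, 72]   // element type is checked statically

    // scores.append("ninety")         // compile error: a String is not an Int;
                                       // an NSArray of id would accept it and
                                       // fail only when someone reads it back

    scores.append(47)
    let total = scores.reduce(0, +)
    print(total)                       // prints "294"
    ```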

    Fourthly, it brings functional programming to the table better than Obj-C. Yes, you can pass blocks around in Obj-C, but it's syntactically painful, whereas Swift makes closures easy (aka a first-class citizen).
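    The syntactic difference is easy to show side by side (the `addTax` transform is a made-up example):

    ```swift
    // The Objective-C spelling of a block taking and returning an int:
    //   int (^addTax)(int) = ^(int price) { return price + price / 10; };
    // The Swift closure equivalent, with an inferred return type:
    let addTax = { (price: Int) in price + price / 10 }
    print(addTax(200))            // prints "220"

    // Closures compose directly with library functions:
    let taxed = [100, 200].map(addTax)
    print(taxed)                  // prints "[110, 220]"
    ```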

    I could continue but I'm tired of typing on my phone. The point is there actually are plenty of real-world gains in Swift.

    Outside of /., where everyone's mind is already made up and Apple is the devil, I don't know anyone that has used Swift for real who doesn't think it's massively superior to Obj-C.

    Is it finalized? No. It's a language in flux and they're taking community feedback too. So everyone is an early adopter by definition and they expect some upgrade pain. It's really not that bad in practice and Xcode handles 90% of it. It's no worse than switching to a new iOS version and dealing with deprecations every year.

    Again, we've all chosen the platform and the language because we think it's superior. We aren't stupid. We aren't locked in. We all know other languages. Thanks.

    Cue the haters.

    • by slickwillie ( 34689 ) on Monday June 11, 2018 @11:41AM (#56765360)
      "I could continue but I'm tired of typing on my phone."

      Thâ(TM)at woâ(TM)rks ouâ(TM)t siâ(TM)nce Iâ(TM)m tiâ(TM)red oâ(TM)f râ(TM)eading iâ(TM)t.â(TM)
    • Itâ(TM)s hard tâ(TM)o take youâ(TM)re argument serioâ(TM)usly â(TM)with all reminâ(TM)ders that Apple doesnâ(TM)t give two shits about â(TM)â(TM)interoperabilityâ(TM)â(TM).
      • You're really blaming the wrong party here. Sure, Apple is replacing normal quotes with "smart" quotes, which not everyone prefers for various reasons. By itself, however, that would be a minor issue, and at least they are following the Unicode standard. The way those smart-quotes are mangled is not Apple's fault; that's entirely due to Slashdot. Between the ongoing failure to handle Unicode properly and the persistent lack of IPv6 support, Slashdot is falling remarkably behind the times for a tech-focused

  • Swift is not alone (Score:5, Insightful)

    by what about ( 730877 ) on Monday June 11, 2018 @11:13AM (#56765152) Homepage

    There is a pattern, and it goes like this:
    - Schools no longer teach why things are done the way they are, the reasons behind them
    - Students are not interested, have low attention spans, and generally consider the teaching "old stuff"
    - Computer science is populated by freshers; you are old at 40

    The result:
    - Reinventing the same solution, worse

    How do we stop this total waste of time and money?
    - Keep older developers around, and when they say that the "new shiny idea" has been done before, listen to them.

    A list of already-done things:
    1) AI (in its current form): it is neural nets, learned 30 years ago. Yes, you now have a supercomputer on your desk, but it is not new tech and it has all the same issues it had before.
    2) Blockchain: can anybody think of a revision system? Git, SVN?
    3) Languages, lots of them. Really, are we so dumb that to save a few keystrokes we produce something that is obscure after the third line?
    Can we agree that all possible logic and consistency tests should be done at "compile time"? (No, unit testing is not the same thing.)

    Finally, a question: how do we get rid of the pointy-haired boss? (See Dilbert.)
    He knows nothing, makes random decisions (on a good day), and sucks up half the budget in bonuses...

    • by Tablizer ( 95088 )

      How do we get rid of the pointy haired boss

      In the White House?

    • Blockchain, can anybody think of a revision system ? GIT, SVN ?

      Blockchain is very different from Git and SVN; both of those trust all users, which can't be done with financial transactions.

      Students are not interested

      Looks like the old people are not very interested either; what I just wrote is really the basic idea of blockchain: being able to handle hostiles as well as friendlies (assuming that there are not too many hostiles).

  • Nil is a feature (Score:4, Interesting)

    by Maury Markowitz ( 452832 ) on Monday June 11, 2018 @11:14AM (#56765160) Homepage

    > Classify the implicit optionality of objects purely as a source of bugs

    Among other issues, this remains my biggest complaint.

    Obj-C generally "did the right thing" with nil in most contexts. Sure, nil-pointer errors are a pain, but declaring them away and just forcing everyone to type ! everywhere does not eliminate them. It does, however, eliminate the simplicity of binding a nil to a text field and just getting a zero; that is something that now has to happen explicitly in code.
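    What nil-messaging used to give for free now has to be written out; a sketch of the text-field case (the `input` value is hypothetical):

    ```swift
    // Obj-C: an empty field bound through nil simply displayed as 0.
    // Swift makes each step that can fail visible:
    let input: String? = nil                   // e.g. nothing typed yet
    let number = input.flatMap(Int.init) ?? 0  // both nil and "abc" become 0
    print(number)                              // prints "0"
    ```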

    • by Kjella ( 173770 )

      Obj-C generally "did the right thing" with nil in most contexts. Sure, nil-pointer errors are a pain, but declaring them away and just forcing everyone to type ! everywhere does not eliminate them. It does, however, eliminate the simplicity of binding a nil to a text field and just getting a zero, precisely what has to happen in code now.

      So... you read an object that has no description because the field is nil, show it in a dialog, and afterwards the description is "0"? Why not the empty string ""? When doing debug messaging I usually replace it with "(null)". And you sure don't want to convert nil to zero when it comes to numbers, because for a field like "number_of_kids" nil would typically mean unknown / didn't want to answer / not relevant / not applicable. That said, I wish they'd turn it around so you'd get a nil assignment error

    • Classify the implicit optionality of objects purely as a source of bugs.

      Among other issues, this remains my biggest complaint.

      This is what you choose to complain about, fixing the "billion-dollar mistake" [infoq.com]? You actually want the language to implicitly accept all messages sent to nil as no-ops with a default return value, regardless of the intended interface, and to allow nil to be passed for any reference parameter even when it makes no sense for the parameter to be omitted?

      I would be among the first to promote language-agnostic APIs and allowing the developer to choose the language best suited to the problem domain. H

  • The next language (Score:2, Interesting)

    by Anonymous Coward

    Needs to find a way to provide intuitive usage of all those cores. One of the reasons C was so successful was it abstracted the hardware into a human understandable model. One of the problems with all the new languages is they try to fit a square peg in a round hole and ignore how the hardware works. Yes hardware has gotten much much much faster, but so much of that speed has been consumed with layers and layers and layers of abstraction. If someone figures out how to do what C did for uni-processor compute

  • by 110010001000 ( 697113 ) on Monday June 11, 2018 @11:23AM (#56765232) Homepage Journal
    All us cool kids have switched to Go and Rust. It is better to use the latest languages produced and controlled by a single corporate entity to keep your skills "sharp". By the way, can someone find me a job? I don't know C or C++ but I know how to use the latest frameworks!
  • I don't know jack about Swift, but does it prevent people from assuming things about the CPU?

    It could very well be that Apple has been planning a transition to ARM CPUs for a long time, and Swift is one of the steps required to have everyone coding in a language where re-compiling for a new CPU is as effortless as possible, to make sure all programs can make the jump to the next generation of their computers.

  • Douglas Crockford said it best:

    Anytime we change a software standard, it's an act of violence. It is disruptive. It will cause stuff to fail. It will cause cost and harm to people. So we need to be really careful when we revise the standards because there is that cost. We have to make sure that we're adding so much value to offset the cost.

  • by Arkham ( 10779 ) on Monday June 11, 2018 @11:46AM (#56765396)
    I was a huge fan of Objective-C. I still think it's an elegant language. I love the delegate patterns, I love the quirky stuff like method swizzling, I love the runtime message passing.

    When Swift first came out, it was really rough. The Obj-C bridge APIs were all using forced-unwrapped optionals and the like, and Apple didn't do a great job explaining why all of that was in place. Only with Swift 2, 3, and 4 did it become clear that you should never force-unwrap unless using some crufty API that required it (which it turns out is almost never these days).

    I do miss some things like nested message passing, but I also don't miss a lot of things. I love the map, reduce, and filter capabilities, I like the more nuanced closures vs completion blocks. There are still things that frustrate me, but they generally get better every year. Swift is one of my favorite languages to code in these days.
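    A small illustration of the map, reduce, and filter capabilities mentioned above (the scores are made up):

    ```swift
    let scores = [40, 85, 72, 91]
    let passing = scores.filter { $0 >= 60 }   // [85, 72, 91]
    let curved  = passing.map { $0 + 5 }       // [90, 77, 96]
    let total   = curved.reduce(0, +)          // sum the curved scores
    print(total)                               // prints "263"
    ```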
    • by Tony ( 765 )

      This.

      HUGE fan of Objective-C, back from the NeXT days. Been writing iOS apps since ARC was optional and I never used it, because *I* knew how to alloc and free memory. Yeah, that meant it was a pain in the ass when Apple made ARC a requirement.

      I miss introspection, sure. But optionals, tuples, native unicode support, default parameters, conditional generics, and a consistent calling syntax just makes my life easier. Add to that first-class immutable structs, better functional programming support, and stron

  • And ironically solves NIH by copying elements from nearly every other programming language in existence.
  • by Tim12s ( 209786 ) on Monday June 11, 2018 @12:00PM (#56765496)

    Reduce bugs + Retain performance... Quite often those two are not aligned.

    Apple lives in a world where the real value of their product is dependent on 3rd-party developers.

    (1) iOS apps crashed more than Android apps. Apple would have noticed the common causes of application crashes on iOS, and that the majority of crashes are based on poor coding due to legacy syntax, which can be corrected. Shown below are some extracts from 2016.

    (2) Typical of any virtual machine is the initialization cost of the VM. This means that you need to take a fully compiled approach, otherwise you lose perceived performance. JVM code is often more performant than typical C code written at the same skill level once the VM is warm/hot, which is great for server workloads, but the initialization costs are unavoidable.

    Everyone forgets history.

    FYI - iOS apps crash more than Android apps: https://www.techspot.com/news/... [techspot.com]

    FYI - some infoq: 47% of apps crash more than 1% of the time: https://www.infoq.com/news/201... [infoq.com]

    Android 2.3 'Gingerbread' had a crash rate of 1.7%, for example, while iOS 6 apps crashed 2.5% of the time.

    FYI - AppCoda: https://www.appcoda.com/apteli... [appcoda.com]

    3 Most Frequent iOS Crashes, 23rd May 2016

    SIGSEGV (50%) - This signal occurs when your app attempts to access memory that has not been allocated by the program.
    NSInvalidArgumentException - This crash occurs when a method is being called on an object that can’t respond to it.
    SIGABRT - You’ll see this in your debugger when there is an unhandled exception (see #2). However, in a deployed app SIGABRT appears when there is an assertion failure or abort method called within the app or the operating system. Often raised during asynchronous system method calls (CoreData, accessing files, NSUserDefaults, and other multithreaded system functions).
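    The first two crash classes above are exactly what Swift's type system pushes to compile time. A hedged sketch of the idea (the function below is illustrative, not from any Apple API):

    ```swift
    // In Objective-C, calling a method an object does not implement raises
    // NSInvalidArgumentException at runtime; in Swift the equivalent mistake
    // is a compile error, so it never ships:
    //
    //   let s: String = "hi"
    //   s.removeAllObjects()   // does not compile: no such member on String
    //
    // Dereferencing missing data (the SIGSEGV class) is fenced off by
    // optionals: the compiler forces the nil case to be handled.
    func firstCharacter(of text: String?) -> Character? {
        guard let text = text, !text.isEmpty else { return nil }
        return text.first
    }
    ```

    That is the "reduce bugs" half of the trade-off the parent comment describes: whole categories of the crashes listed above become type errors instead of field reports.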

    • by Tim12s ( 209786 )

      Again, iOS apps crashed 47% more than Android apps - That is a relatively HUGE difference in experience.

      ---

      Android 2.3 'Gingerbread' had a crash rate of 1.7%, for example, while iOS 6 apps crashed 2.5% of the time.

      (1) iOS apps crashed more than Android apps. Apple would have noticed the common causes of application crashes on iOS and that the majority of crashes are based on poor coding due to legacy syntax, which can be corrected. Shown below are some extracts from 2016.

  • by darkain ( 749283 )

    Microsoft has Visual Studio (C/C++, C#, Basic, etc). Google has GO. Apple has a pissing contest, I mean, Swift.

  • I paid the bills as a Carbon (who knew?) developer. Considering I've always been able to keep current and learn new skills every few years, I was blindsided by Cocoa/Objective-C and the change to the Apple Developer tools. Inside Macintosh was a great resource and when Cocoa was born Inside Macintosh was left by the wayside. The small independent/inhouse developer was left to flounder. For all the greatness attributed to Steve Jobs, he seemed to have abandoned the small developers who couldn't go to "bo
    • by shess ( 31691 )

      I paid the bills as a Carbon (who knew?) developer. Considering I've always been able to keep current and learn new skills every few years, I was blindsided by Cocoa/Objective-C and the change to the Apple Developer tools. Inside Macintosh was a great resource and when Cocoa was born Inside Macintosh was left by the wayside. The small independent/inhouse developer was left to flounder. For all the greatness attributed to Steve Jobs, he seemed to have abandoned the small developers who couldn't go to "boot camps" or wherever else folks went to get on board the new platform. OS X is nice, but Carbon was a well documented and easy to navigate environment. Carbon made the Mac what it was and Apple and Steve Jobs decided to push NeXT OS instead. I am not alone in having fond memories of Carbon while using Microsoft's tools to ply my trade.

      Eh, I spent the 90's working on NeXTSTEP stuff and abandoned Apple for a decade or so after the merger, but my opinion is exactly the opposite - after using NeXTSTEP, I looked at MacOS and could hardly believe that that was what Mac was running in the late 90's. Simply put, they were mired down by technical constraints which were making it increasingly hard to make broad changes and were looking for a way out. They didn't just realize it when they acquired NeXT, they'd already failed to build a replacement.

  • Apple touted it as a more human-readable, easy-to-grok language, but it's only marginally easier to deal with than the unwieldy Obj-C it's meant to supplant. Frankly, I don't think Apple should be focusing on languages; they clearly do not understand how to support developers or produce a language that streamlines development. Perhaps they should copy MS and produce a language that rips off the Java API with Obj-C or Swift as its underpinnings. As much as I can't stand MS, I opted for C# for some Unity
  • by jimbo ( 1370 )

    "..an unbearable mental load" - a language is a tool, ffs, it has issues; so learn them and use it if you have to or become a florist.

    • by Jeremi ( 14640 )

      "..an unbearable mental load" - a language is a tool, ffs, it has issues; so learn them and use it if you have to or become a florist.

      Part of being an effective developer is choosing the right tool(s) for the job. If you choose poorly, you end up spending too much of your time fighting the mismatched tools rather than getting your program working, and you end up with a poor-performing/buggy/hard-to-maintain program that took a long time to develop and may have to be thrown out anyway.

      Even a florist would know better than to use a jackhammer to trim bouquets.

  • .. i.e. less scary to most people.

    However, if you want to establish a PL it has to be cross-platform. That's a given. And better at cross-platform than any other PL. I mean GUI builder, binary compiler, FOSS all the way, hardware trickery, etc. Fall short on that and you'll have a hard time getting traction. .Net anyone? We have enough flaky half-assed shit in the PL space already - i.e. every freakin' PL out there. Apple has fallen a little short of this.

    The only other possible option for them is offeri

  • I decided a little under 2 years ago to move into the iOS development world. Had a project I could sell ;) I did not have any Apple devices or computers and had not used an Apple product since an Apple II.
    So I picked up a Mac Mini and installed Xcode and I was off. While I was more comfortable with Obj-C, I made the choice early to go with Swift. I think it was Swift 2-2.5 or something.
    The hardest part was moving to the Apple/Swift way. And the Apple Developer/VPP/DEP/Apple Enterprise Developer/MDM Manage
  • You using it wrong, peepaw!
