Steve Jobs thinks Objective C is Perfect?

josht writes "Nitesh Dhanjani has posted an e-mail thread between Steve Jobs and himself. Dhanjani argues "I'd like to see Apple developers gain more choice. With every iteration of OSX, there seems to be so much effort put into innovation of desktop components, but the development environment is age old." I agree with Dhanjani. What has Apple done recently to wow the developers, and make it fun to code Cocoa components? I've looked into the Objective C and Xcode environment but compared to Microsoft's .NET solutions, it doesn't impress me at all. I think Apple can do a lot better. Steve Jobs disagrees. What do the readers think about Objective C and Xcode?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • namespaces (Score:4, Informative)

    by Anonymous Coward on Monday December 26, 2005 @01:04PM (#14339897)
    Anyone claiming a language lacking proper namespace support is "perfect" is nothing short of delusional.
  • by CapnRob ( 137862 ) on Monday December 26, 2005 @01:05PM (#14339900)
    He's already taken down the emails in question, apparently having had second thoughts about the appropriateness of posting private emails.
  • objective-c is cool (Score:3, Informative)

    by matt4077 ( 581118 ) on Monday December 26, 2005 @01:05PM (#14339902) Homepage
    I develop with both objective-c and c# and while I like the c# syntax and gc better, Interface Builder is the most elegant way of user interface programming out there.
  • Dharma (Score:1, Informative)

    by Anonymous Coward on Monday December 26, 2005 @01:05PM (#14339904)
    Jobs must be hoping that other developers see the supposed benefits of Obj C as well with the rumored Dharma (Cocoa for Windows) project, if it does in fact exist.
  • Love it (Score:5, Informative)

    by Richard_at_work ( 517087 ) on Monday December 26, 2005 @01:06PM (#14339905)
    Sure, Xcode could do with a little bit of work to add missing features, but I truly find Cocoa a dream to work with. One year ago I only developed for the web; then I bought a Mac and was introduced to Cocoa by a friend. I haven't looked back since, and have produced several 'scratch the itch' applications that otherwise wouldn't have been made.
  • by Anonymous Coward on Monday December 26, 2005 @01:09PM (#14339915)
    The development environment is hardly static. Key-value-observing and bindings, Core Data; we get more toys for every system version and they are working on adding garbage collection to Objective C.
  • Network Mirror (Score:4, Informative)

    by BushCheney08 ( 917605 ) on Monday December 26, 2005 @01:25PM (#14339982)
    Network mirror still has the original blog post up. [networkmirror.com]
  • Qt? (Score:1, Informative)

    by Anonymous Coward on Monday December 26, 2005 @01:26PM (#14339990)
    Use Qt [trolltech.com] and STFU. Qt is available natively on all major platforms, including the Mac. It makes C++ development completely pain-free, and if you hate C++, there are Python bindings available, with supported Java bindings coming soon (Q1 2006).

    Plus, Java has long been an alternative option, and the one platform where it doesn't suck is the Mac. The UI looks native, and the development tools are good (Xcode, Eclipse, NetBeans ...). What else do you want - .NET? If so, there is Mono for ya on OSX.

    So what else does Jobs need to do and believe in?
  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Monday December 26, 2005 @01:38PM (#14340041)
    Comment removed based on user account deletion
  • by am 2k ( 217885 ) on Monday December 26, 2005 @01:39PM (#14340045) Homepage

    Agreed. I just started using Core Data, and it's a pretty amazing technology. For instance, take a look at this tutorial [cocoadevcentral.com]. It's a whole working database-backed app without writing a single line of code! If you want custom behavior, extending it is very easy, too.

    Key-Value Observing has revolutionized Cocoa development; most developers just didn't notice (because it takes some time to get used to it).

  • Python (Score:3, Informative)

    by truthsearch ( 249536 ) on Monday December 26, 2005 @01:40PM (#14340050) Homepage Journal
    There's also a Python bridge for Obj-C. So for those that prefer a very different language, with its interpreter already distributed with the OS, Python's a great option. You get the native objects exposed by OS X available to Python.

    And let's not forget OS X is built on top of BSD. So effectively anything which can be written for BSD can be written for OS X. There are, of course, limited GUI tools, but options are available. Qt libraries, for example, will display native GUI elements when possible.
  • by mrsbrisby ( 60242 ) on Monday December 26, 2005 @01:44PM (#14340065) Homepage
    Actually, Safari is largely written in C++ (the KDE rendering engine)

    Actually, Safari IS written in Objective-C. So is WebKit. WebCore on the other hand is written in Objective-C++ (KWQ), and yes, includes a significant amount of C++ code (KHTML).

    I think this adequately demonstrates the flexibility of Objective-C: to be able to interoperate with C and C++ code on their terms, while C++ code can only interact with Objective-C the same way C can.
  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Monday December 26, 2005 @01:58PM (#14340142)
    Comment removed based on user account deletion
  • by Anonymous Coward on Monday December 26, 2005 @02:02PM (#14340161)
    "Besides, it doesn't hurt that Xcode and it's related dev tools are free on OSX, whereas it's a $600 investement on Windows for the equivalent software."

    Nope. Have you tried the new Visual Studio 2005 Express Editions? They are almost the full IDE, minus some enterprise stuff, for free. They're perfect for small projects and pretty much any sort of desktop application. Visual C++ Express Edition is probably the best free (as in beer) C++ IDE on Windows and although I haven't tried alternative C# IDE's like SharpDevelop, Visual C# 2005 is very powerful. If using free (as in beer) Microsoft products irks you for some unknown reason, there are very good OSS alternatives such as SharpDevelop.
  • by coolgeek ( 140561 ) on Monday December 26, 2005 @02:17PM (#14340251) Homepage
    Memory management in Obj-C is really simple, and making an issue of it is an extreme exaggeration. You merely have to follow the rule of "if you allocate it, you're responsible for it", and make sure to either send it [obj autorelease] upon allocation or [obj release] in the [parent dealloc] routine. It really is that simple. Maybe that's too much to ask of the sissy programmers coming out of school these days.
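    The ownership rule the parent describes can be sketched like this (a minimal pre-ARC retain/release example; the `Widget` class and its names are hypothetical):

    ```objc
    #import <Foundation/Foundation.h>

    // Hypothetical class illustrating "if you allocate it, you're
    // responsible for it" under classic retain/release Cocoa.
    @interface Widget : NSObject {
        NSString *name;
    }
    - (id)initWithName:(NSString *)aName;
    @end

    @implementation Widget
    - (id)initWithName:(NSString *)aName {
        if ((self = [super init])) {
            name = [aName retain];   // we retain it, so we must release it
        }
        return self;
    }
    - (void)dealloc {
        [name release];              // balance the retain from init
        [super dealloc];
    }
    @end

    int main(void) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        Widget *w = [[Widget alloc] initWithName:@"gizmo"];  // we own it...
        [w release];                                         // ...so we release it

        // Or hand ownership to the autorelease pool upon allocation:
        Widget *tmp = [[[Widget alloc] initWithName:@"temp"] autorelease];
        (void)tmp;    // freed when the pool drains

        [pool release];
        return 0;
    }
    ```
    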
  • Re:namespaces (Score:5, Informative)

    by kongjie ( 639414 ) <(moc.cam) (ta) (eijgnok)> on Monday December 26, 2005 @02:20PM (#14340271)
    All you have to do is look at Jobs' history to see that being a "manager at Sears or Circuit City or something..." would not have satisfied his ambitions. If you're seriously suggesting that, you've misjudged the man.

    You may be right about his programming talent (I'm not saying you are) but clearly you don't know a single thing about human nature.

  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Monday December 26, 2005 @02:33PM (#14340337)
    Comment removed based on user account deletion
  • by ian stevens ( 5465 ) on Monday December 26, 2005 @02:33PM (#14340340) Homepage
    Having recently introduced myself to Cocoa through Cocoa Programming for Mac OS X [amazon.ca] by Aaron Hillegass, I would have to say that Cocoa is quite fun to program for. Specifically, Apple's Interface Builder allows you to quickly build up a GUI without writing a single piece of code. A lot of common tasks require next to no code at all. For instance, adding tabular data requires only that you create your model in XCode and perform all other tasks in Interface Builder. Within seconds your application can have a table with movable, sortable, editable columns. The only code you have to worry about is your model. Of course, should you want to do something more complicated with tables you can.

    Tabular data is just one example, but there are many other ways in which programming for Cocoa is quite easy. Copy and paste using multiple types is a snap, and drag and drop is just a slight extension on top of that, accomplished in minutes. Can Windows' Visual environment say the same? Friends of mine who have implemented drag and drop on Windows spent days doing so, and it still didn't work quite right. The broken nature of drag and drop in many Windows apps is the result.

    Since Mac OS X uses PDF as its native format, creating PDF versions of your data requires only a few lines of code. Similarly, Cocoa provides support for many data formats such as RTF, PNG and TIFF so saving and reading images is a no-brainer.
  • by hkb ( 777908 ) on Monday December 26, 2005 @02:40PM (#14340372)
    I code various C#/.NET things at work, and code Cocoa stuff at home for fun. I'm well-versed in both environments.

    - The environments are apples and oranges (no pun intended). The languages, the workflow, everything is much different.

    - Moving away from ObjC would require some significant reworkings of Cocoa, as its workflow is based on the "ObjC way". Take a look at the mess that is the Cocoa/Java bridge, or Cocoa#.

    - Objective C is WAY more descriptive than other languages (take a look at how you pass arguments in functions, for example).

    - Objective C is easy to learn. Yeah, it's a lot different than the usual paradigms, but when you learn it, you'll enjoy its simplicity.

    Things I hate about Cocoa:

    - It's not managed code. Why should application developers in this day and age have to worry about memory management? (autorelease doesn't count)

    - Having to keep two different programming paradigms in my head. I never even learned C#, I learned Java and jumped right into C#, because they were so similar.

    - Practically no one else in the world uses Objective C, so it's not a very valuable (salary-wise) skill to have.

    - The X-Code/Interface Builder dance is quite clunky. It was cool back in the day, but Microsoft has a much better system developed.

    - VS.NET 2005 > Xcode
  • by Anonymous Coward on Monday December 26, 2005 @02:40PM (#14340378)
    python

    garbage collection - no
    pointers - yes
    ide - no
    type '.' and get function names - no

    i strongly suggest you try out python.
    a perfect language for brilliant folks
    such as yourself
  • by Savantissimo ( 893682 ) * on Monday December 26, 2005 @02:42PM (#14340390) Journal
    Jef Raskin, the creator of the Mac, wrote a piece, Holes In The Histories [raskincenter.org] in which he gives the inside story on Steve Jobs:

    Another cause for inaccuracy is the deliberate misleading of reporters, coupled with some reporters' tendency to believe an apparently sincere and/or famous source. Levy's book gives prominent thanks to Apple's PR department, which learned the history of the Mac from Steve Jobs, whose well-deserved sobriquet at Apple (and later at NeXT) was "reality distortion field." Many times I had seen him baldly tell a lie to suppliers, reporters, employees, investors, and to me; Stross's book provides many examples of this. When caught, Jobs's tactic was to apologize profusely and appear contrite; then he'd do it again. His charm and apparent sincerity took in nearly everybody he dealt with, even after they'd been burnt a few times. For those who didn't know him he seemed utterly credible. In his defense it should be pointed out that some reality distortion is necessary when you are pioneering: when I am conveying my vision of the future I create a non-existent world in the minds of listeners and try to convince them that it is desirable and even inevitable. I'm pretty good at this, but Jobs is a master, unconstrained by "maybe" and "probably." His attractive creation-myth--swallowed whole by susceptible reporters--wherein Apple's computers were invented exclusively by college drop-outs and intuitive engineers flying by the seats of their pants became legend. To hear him tell it, the Macintosh had practically been born, homespun, in Abe Lincoln's log cabin. That it had been spawned by an ex-professor and computer-center director with an advanced degree in computer science would have blown the myth away. A good story will often beat out the dull facts into print.
  • by Lord Crc ( 151920 ) on Monday December 26, 2005 @02:45PM (#14340410)
    Just because they're "bytecodes" doesn't tell us whether they're interpreted or compiled.

    True, however in this interview [oreilly.com] with Anders (Chief C# Language Architect), he states that "I think one of the key differences between our IL design and Java byte code specifically, is that we made the decision up-front to not have interpreters". A bit further down he says "When you make the decision up-front to favor execution of native code over interpretation, you are making a decision that strongly influences design of the IL".

    Certainly you CAN interpret it, but it was designed to be JITed.
  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Monday December 26, 2005 @02:54PM (#14340469)
    Comment removed based on user account deletion
  • better solution (Score:4, Informative)

    by penguin-collective ( 932038 ) on Monday December 26, 2005 @03:03PM (#14340523)
    The next generation Objective C and Xcode already exist: Smalltalk and Smalltalk programming environments.

    Smalltalk is a language with Objective C's object model, but runtime safety, garbage collection, and reflection. Objective C was an attempt to create a very low overhead version of Smalltalk that would interoperate more easily with C code, but most of the technical reasons for making the compromises that were made in the design of Objective C are gone.

    The only thing that would need to be done would be to extend Smalltalk with a notion of "native" or "unsafe" methods; that has been done multiple times before, and it can be done either by permitting C code to be embedded in Smalltalk (reversing the Smalltalk/C situation from Objective C) or by defining a Smalltalk subset that's close to the machine (as Squeak has done).
  • by Colonel Panic ( 15235 ) on Monday December 26, 2005 @03:08PM (#14340554)
    So does C++: they're called virtual function tables.

    true. However, this sort of dynamic dispatch is limited to objects which are members of a certain inheritance hierarchy. In Obj-C, when a message is sent to an object the determination of whether or not that receiver can respond to the message is determined at runtime. It doesn't matter that the receiver is a member of a particular class which inherits from some class which defined some virtual functions.

    With virtual functions in C++ it's sort of a cross between compile time and runtime: the receiving object must be a member of a certain class hierarchy with the virtual functions defined in a parent somewhere. If you try to pass a class pointer of the wrong class (meaning it's not part of an exclusive hierarchy) then you'll get a compile-time error. In Obj-C, on the other hand, we don't care what the class pedigree of an object is as long as it can respond to the message being sent (or to put it another way, as long as its interface matches the expectations of the user).

    Perhaps it could also be said that virtual functions are C++'s hack to allow limited dynamic dispatch. But it's not as dynamic as what is possible in Obj-C (or other very dynamic OO languages like Smalltalk and Ruby).
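    A minimal sketch of the difference being described (the `Dog`/`Robot` classes and `makeItSpeak` are hypothetical names for illustration):

    ```objc
    #import <Foundation/Foundation.h>

    // Two classes with no common ancestor besides NSObject and no
    // shared interface of any kind.
    @interface Dog : NSObject
    - (void)speak;
    @end
    @implementation Dog
    - (void)speak { NSLog(@"Woof"); }
    @end

    @interface Robot : NSObject
    - (void)speak;
    @end
    @implementation Robot
    - (void)speak { NSLog(@"Beep"); }
    @end

    // Dispatch is decided entirely at runtime: any object that responds
    // to -speak will do. A C++ virtual call, by contrast, requires the
    // receiver to belong to one inheritance hierarchy known at compile time.
    void makeItSpeak(id anything) {
        if ([anything respondsToSelector:@selector(speak)]) {
            [anything speak];
        }
    }
    ```
    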
  • Re:namespaces (Score:1, Informative)

    by Overly Critical Guy ( 663429 ) on Monday December 26, 2005 @03:08PM (#14340555)
    Not to mention the fact that Steve Jobs do

    I guess you missed that old NeXT demo Slashdot posted earlier in the year where Steve Jobs demonstrates NeXT object-oriented programming and interface builder.
  • by theAtomicFireball ( 532233 ) on Monday December 26, 2005 @03:13PM (#14340571)
    Jef Raskin, the creator of the Mac,

    Jef Raskin was NOT the creator of the Mac. He was the originator (IIRC) of the Macintosh project, and its first manager, but his vision of the Macintosh was so at odds with Jobs' that to give Jef any credit for what the Macintosh became is unfair and incorrect. If you want to see Jef's vision, go look at the Canon Cat, which he designed after being asked to leave the Macintosh project.

    Raskin was a smart guy, but he wanted to design interfaces for smart people; interfaces that had a learning curve associated with them due to all sorts of key combinations to remember. Though he backed away from this a little later in his life, when he saw how successful GUIs were (and, perhaps, wanting to claim an unfair amount of credit for that), all his interfaces were designed to be incredibly efficient for the intelligent geek who wanted to take the time to learn how to use them.

    That doesn't really go along with the Mac's tagline "The computer for the rest of us."

    Remember that history and fact are not the same thing. Jef & Steve both have (well, in Jef's case, had) their versions of what happened; neither is fact, and the real truth, if there is one, probably lies somewhere in the middle, probably a touch closer to Jobs' version, that is, if you know how to interpret the Jobsian language and make sure to read it outside of the RDF.
  • by uwmurray ( 516566 ) on Monday December 26, 2005 @03:18PM (#14340601) Journal
    Dude you're nuts. Have you even *looked* at gcc's objective C support and the runtime or are you just pulling this out of your ass? Obj-C messages are highly optimized and incur about 2x-3x the overhead of C function calls.

    Objective-C / Cocoa has its warts; speed is not one of them.

    As slow as javascript my ass. I doubt you've ever coded in obj-c. Please study a bit before you spread this kind of FUD.
  • by ceoyoyo ( 59147 ) on Monday December 26, 2005 @03:36PM (#14340691)
    Python. There you go. Have fun! Oh, don't forget to install PyObjC or you may be disappointed.

    Note: Apple didn't come up with it. Neither did MS. It's open source. Apple employees have a lot to do with PyObjC though, which is also open source.
  • by theAtomicFireball ( 532233 ) on Monday December 26, 2005 @03:57PM (#14340804)
    Well, Copeland and OpenDoc are pre-Jobs Apple, so go yell at Amelio or Sculley about those two.

    On the rest of it, Apple never made a blanket "use Carbon", or "use Cocoa" claim. They said, consistently, to use Carbon if you have a lot of legacy toolbox code, and to use Cocoa if you were starting a project from scratch or were bringing things over from NextStep. OpenStep is just the old name for Cocoa and Rhapsody is just the old name for OS X, so you're kinda overstating your point just to make it look more schizophrenic than it really was.

    Metrowerks was in it for the money just as much as anyone else; they weren't "there" for anybody but themselves (and later, their shareholders). Once Metrowerks released a Windows version, they stopped giving the Mac priority and the tools stagnated. Apple inherited a perfectly good IDE from NeXT, and Metrowerks gave no indication that they were champing at the bit to upgrade their tools in a hurry so that developers could be ready when OS X came out. Metrowerks wanted to play it cautious and didn't want to gamble on Apple's transition to OS X, so what else was Apple to do? Apart from a few cranky old Toolbox guys who didn't want to make the transition to OS X, I've met relatively few people who aren't happy with Apple developer support compared with what it was historically. The Inside Macintosh books used to cost an arm and a leg and weren't available in soft editions, MPW was a nightmare, as you stated, and the only other way to create applications was to buy a third-party IDE.

    Life has been pretty good (not perfect, but pretty good) since Apple bought NeXT. It's been tumultuous at times, but things have steadily been heading in the right direction, and as a matter of fact, developers have not been leaving the platform in droves; there has been a well-documented and steady increase in the number of developers using OS X as their primary platform.

    I'd hardly say they're "suffering".
  • by Anonymous Coward on Monday December 26, 2005 @04:13PM (#14340882)
    From http://forums.microsoft.com/MSDN/ShowPost.aspx?Pos tID=126606&SiteID=1 [microsoft.com]:

    "Until November 7, 2006, we are promotionally discounting the downloadable versions of Express to free. This doesn't mean that the product turns off after a year, but rather that as long as you download the product before November 7, 2006, you can get it for free and you can use it forever."
  • by Anonymous Coward on Monday December 26, 2005 @04:44PM (#14341039)
    There is not a single modern architecture where one instruction equals one cycle. This is particularly wrong when discussing Java or C#, where it will depend heavily upon your VM, JIT compiler, and how much optimizing it has decided to do. In any case, your estimates are highly optimistic to say the least.
  • Re:Resources (Score:3, Informative)

    by bani ( 467531 ) on Monday December 26, 2005 @05:01PM (#14341106)
    i hate m$ as much as the next /.'er, but you're wrong on both counts.

    MS gives theirs away for free, too [microsoft.com].

    MSDN membership is comparable to the price of ADC [apple.com], when you compare the same ADC and MSDN levels. (msdn level 1 costs $500, level 2 costs around $1500).

    one really annoying thing about apple -- I can use the latest versions of microsoft visual c on my clunky old W2K installation just fine (no way in hell am i "upgrading" to xp!). however apple's latest xcode requires me to upgrade osx -- it won't install on osx 10.3.9. it will only install on osx 10.4, and i can't see any good reason for it. this means for apple, i have to shell out $130 just to be able to upgrade xcode.
  • by IamTheRealMike ( 537420 ) on Monday December 26, 2005 @05:17PM (#14341170)
    The single selling point with Objective-C / Cocoa is the NSAutoreleasePool mechanism. This mechanism is like a garbage collector finally done right.

    I must strongly disagree. In no sense is the auto-release pool equivalent to garbage collection. For one, you still have to think hard about memory management in any complex application - for temporary objects that are just part of the internal works of a function, they work OK, but then stack allocation works better. For actually passing objects around inside a program they don't work at all and you must still manage refcounting and ensure there are no refcount cycles.

    For those who have not encountered this particular construct (which is not unique to Cocoa), an NSAutoreleasePool basically keeps memory around until the main loop is reached. So you can allocate objects inside one and not worry about freeing them, as long as they don't have to survive beyond this particular event. It's a bit more involved than that: there are stacks of them, and you can create and flush them manually outside the context of a GUI thread. But it's a bit of a kludge and not a substitute for full automatic memory management (though I would agree that a language which forces you to use GC for everything is not suitable for implementing desktop applications).
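    The "create and flush them manually outside a GUI thread" case might look like this sketch (the `processBatch` function is a hypothetical example):

    ```objc
    #import <Foundation/Foundation.h>

    // Manually stacked pools in a tight loop, away from the event loop:
    // temporaries created in each iteration are freed per-iteration
    // instead of piling up until the main loop drains its pool.
    void processBatch(NSArray *items) {
        NSUInteger i;
        for (i = 0; i < [items count]; i++) {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            // stringWithFormat: returns an autoreleased object owned
            // by the innermost pool, so we never release it ourselves.
            NSString *desc = [NSString stringWithFormat:@"item %lu",
                                       (unsigned long)i];
            NSLog(@"%@", desc);

            [pool release];   // drains desc and anything else autoreleased above
        }
    }
    ```
    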

  • by Knytefall ( 7348 ) on Monday December 26, 2005 @05:40PM (#14341270)
    Here are the emails:

    From: Nitesh Dhanjani
    Subject: Re: Will XCode+ObjC ever suck less?
    Date: December 25, 2005 5:27:02 PM CST
    To: sjobs@apple.com

    I look forward to the improvements! Thanks,

    Nitesh.

    On Dec 25, 2005, at 5:10 PM, Steve Jobs wrote:

    I guess we disagree. First of all, .NET with CLI and managed code runs SLOW, so most serious developers can't use it because of performance. Second, the libraries in C# are FAR less mature and elegant than those in Cocoa. We are working on a better implementation for garbage collection than we've seen out there so far, but in the end its a performance hit and an unpredictable time that is not good for some kinds of apps.

    Steve

    On Dec 25, 2005, at 2:36 PM, Nitesh Dhanjani wrote:

    Objective C is old and clunky. Its almost 2006, and I _still_ have to look out for yucky pointers? I'd love to be able to write native apps with Ruby (or even C#!.) There are open community projects in progress that are trying to bind ruby and C# (mono) with Cocoa, but I'd love for Apple to step in and make this happen faster. Today, Microsoft seems to be _way_ ahead of the development curve - with their .NET implementation, you are allowed to code using a plethora of languages (C#, Python, VB, etc), as long as the interpreter/compiler follows the IL specification - pointers don't matter, garbage collection is done for you - ah the beautiful world of managed code.

    Having said that, most native OSX apps are still beautiful and well designed. Imagine how much better we could do if the developers had a more flexible choice of languages? I can _bet_ you a lot of OSX app developers use Objective C because they have no other choice.

    Nitesh.

    On Dec 25, 2005, at 3:11 PM, Steve Jobs wrote:

    Actually, Objective C is pretty great. Its far nicer than most other ways of writing apps. What don't you like about it? What do you like better?

    Steve

    On Dec 25, 2005, at 11:59 AM, Nitesh Dhanjani wrote:

    Hi Steve

    Will it ever be easy to write native OSX GUI apps? Objective C sucks.

    Thanks,
    Nitesh.
  • by mcc ( 14761 ) <amcclure@purdue.edu> on Monday December 26, 2005 @07:42PM (#14341790) Homepage
    Unless you mean that in ObjC the possible methods for an object are not available at link time,

    Correct.

    in which case type safety is not available. I don't know enough about ObjC; perhaps you can explain it to me succinctly?

    You are actually almost right. The way to put it should be that type safety is available but not required. Method type safety is a compile-time warning, not a compile-time error, because while a program which passes the type safety checks is guaranteed to function without type errors at runtime, a program which fails the type safety checks is not guaranteed to encounter type errors at runtime.

    If this sounds "bad", consider it in the context of what happens when a type safety violation does occur at runtime -- the object is given a chance to deal with the "method not found" error itself, in the form of a forwardInvocation: method call basically saying "hey, I tried to execute the method named 'blah' on you and it didn't work"; if that fails, an exception occurs. The penalty for a type error is not all that bad, especially compared to what happens at the same point in C++ (you get a messy crash). Also consider that while compile-time type safety is not totally accurate, there is also run-time type safety available which is much more accurate. All objects accept a respondsToSelector: (methodSelector) method which basically asks, "do you accept this method?", and this method can account for the effects of dynamic dispatch that the compiler could not.

    These are somewhat advanced techniques within Objective C, and you should not run programs which emit type safety warnings unless you really know what you're doing. However, when used correctly these things are quite powerful. Performing type safety checks at runtime instead of compile time allows Objective C libraries to leverage the Delegate pattern in a way most languages can only dream of; an Objective C object can accept any other object as a delegate, and then simply say "do you accept this method? if so, run it. if not, never mind". In Java, the analogous construct would require a potentially very messy use of interfaces and probably a lot of blank methods to satisfy those interfaces. forwardInvocation: allows even stranger and more interesting constructs, for example "proxy objects" -- Objective C offers a concept called "distributed objects" which are much like Java RMI, except that distributed objects lack any of the stub hassle and are in fact entirely transparent to any code interacting with the distributed object in question.

    (Full disclosure: Absolutely everything I describe above as an advantage in Objective C can be fully implemented in Java by use of the reflection classes. However, people rarely take advantage of this, perhaps partly because the reflection classes are not very fun to use, and perhaps partly because the Java reflection functionality is quite slow.)
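    The "do you accept this method? if so, run it" delegate pattern described above can be sketched as follows (the `Downloader` class and the `downloadDidFinish:` callback are hypothetical names, not a real Cocoa API):

    ```objc
    #import <Foundation/Foundation.h>

    // An "optional delegate" sketch: ask the delegate at runtime whether
    // it implements a callback, and only then invoke it. No interface or
    // protocol conformance is required of the delegate object.
    @interface Downloader : NSObject {
        id delegate;          // any object whatsoever
    }
    - (void)setDelegate:(id)d;
    - (void)finish;
    @end

    @implementation Downloader
    - (void)setDelegate:(id)d { delegate = d; }
    - (void)finish {
        if ([delegate respondsToSelector:@selector(downloadDidFinish:)]) {
            [delegate performSelector:@selector(downloadDidFinish:)
                           withObject:self];
        }
        // If the delegate doesn't implement it: never mind, carry on.
    }
    @end
    ```
    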
  • by Weedlekin ( 836313 ) on Monday December 26, 2005 @08:22PM (#14341957)
    The interface can exist as separate code, though. All the stuff that IB serialises is available through the Cocoa API (check out the docs on NSView and NSControl, for example), and can be instantiated directly with programming statements if you wish. Using IB to keep the UI code separate from the stuff that interacts with it is however a better way to work, as it allows modifications to be made to the UI without having to recompile the application (separation of concerns).

    MS recognise the above, and will themselves be following a similar route in the future with XAML, which is set to replace WinForms as the UI-building methodology of choice once Vista is launched (XAML will be one of the Vista technologies back-ported to XP). WinForms is thus in life-support mode at the moment: they will fix bugs, but not add more features to it because it is considered to be a deprecated technology.

    NB: I've adopted a mixed-mode approach to Mac programming that seems to work very well from a productivity viewpoint. I do a lot of the main stuff in AppleScript or F-Script, and "drop down" to Objective-C for performance-critical stuff, custom Cocoa sub-classes, Darwin-related tasks, and other things that AppleScript or F-Script either isn't good at, or does too slowly. One could of course do this equally well using (for example) Python with the PyObjC bridge, and I believe that there is something similar for Ruby (don't quote me on that, though!), so the scripting languages I use are just one of the options available to Mac developers. And XCode happily manages all the different language files from a single project, ensuring that Obj-C code is compiled before running the interpreted stuff, managing CVS repositories, and generally making the experience pretty holistic.
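    Instantiating in code what IB would otherwise serialise might look like this sketch (the function name, `controller` object, and `doThing:` action are hypothetical stand-ins):

    ```objc
    #import <Cocoa/Cocoa.h>

    // Building programmatically the same objects IB would serialise
    // into a nib: a button with a frame, title, and target/action.
    void addButtonToWindow(NSWindow *window, id controller) {
        NSButton *button =
            [[NSButton alloc] initWithFrame:NSMakeRect(20.0, 20.0, 120.0, 32.0)];
        [button setTitle:@"Click me"];
        [button setTarget:controller];            // hypothetical controller
        [button setAction:@selector(doThing:)];   // hypothetical action method
        [[window contentView] addSubview:button];
        [button release];   // the view hierarchy retained it
    }
    ```
    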
  • Re:For the record (Score:1, Informative)

    by Anonymous Coward on Monday December 26, 2005 @10:40PM (#14342450)
    The Java bridge was sacked. It's no longer supported, and there won't be any future additions to it.
  • Re:The Real Irony (Score:3, Informative)

    by oudzeeman ( 684485 ) on Tuesday December 27, 2005 @09:47AM (#14344340)
    xnu isn't a microkernel.
  • by stripes ( 3681 ) on Thursday December 29, 2005 @03:01PM (#14359771) Homepage Journal
    I don't see how this is any better than extending a class and overriding a method or providing a new one as you would in Java

    If you use a category to add a method to a class in ObjC, then when someone passes you an object of that class (or, I assume, a subclass) it has that method and you can use it. If you extend the class in Java and someone passes you an object of the original class, you can't use the new method. E.g. even if Java's String class were not final, adding an "encode to UCS-2 and frobnicate any combining marks" method to a subclass wouldn't let you call that method on strings people pass you.

    In C++ the "way around" that is to use an overloaded non-member function. I don't recall being able to do anything like that in Java though.
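    A category along the lines the parent describes might look like this sketch (the category and method names are hypothetical, and the body is a stand-in):

    ```objc
    #import <Foundation/Foundation.h>

    // A category grafts the method onto NSString itself, so every
    // NSString instance gains it -- including strings that other
    // people's code creates and passes to you.
    @interface NSString (Frobnication)
    - (NSString *)frobnicatedString;
    @end

    @implementation NSString (Frobnication)
    - (NSString *)frobnicatedString {
        // Stand-in body; a real version would do the UCS-2 work.
        return [self uppercaseString];
    }
    @end

    // Now any string anyone hands you responds:
    //   [someStringYouWerePassed frobnicatedString];
    ```
    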
