LabView App Abandons the Mac After Four Decades (appleinsider.com)

An anonymous reader quotes a report from AppleInsider: LabView was created on a Mac in the 1980s, and its maker has now announced that the latest macOS update will be the final release for the platform. LabView is a visual programming language tool that lets users connect virtual measurement equipment together to input and process data. AppleInsider staffers have seen it used across a variety of industries and applications to help design complex monitoring systems or automate test sequences.

It's been 40 years since Dr. James Truchard and Jeff Kodosky began work on it and founded their firm, National Instruments. The first release of the software came in October 1986, when it was a Mac exclusive. In a 2019 interview, Jeff Kodosky said this was because "it was the only computer that had a 32-bit operating system, and it had the graphics we needed." Now National Instruments has told all current users that it has released an updated Mac version -- but it will be the last.

National Instruments says it will cease selling licenses for the Mac version in March 2024, and will also stop support. LabView has also been sold as a subscription, and National Instruments says it will switch users to a "perpetual license for your continued use," though seemingly only if specifically requested. As yet, there have been few reactions on the NI.com forums. However, one post says "This came as a shocker to us as the roadmap still indicates support."
National Instruments says LabVIEW "will continue to be available on Windows and Linux OSes."
  • LabView sucks (Score:4, Interesting)

    by RightwingNutjob ( 1302813 ) on Friday October 13, 2023 @08:07AM (#63922487)

    If your logic doesn't fit neatly into one screen, it's worse than a sea of gotos and meatballs in even the worst text-based language.

    But the front-end GUIs are pretty, so yeah. A product designed for managers and not the people using it.

    • There is a conceptual similarity between dataflow programming, as LabVIEW supports, and functional programming. While it's understandable that the developers may have been unfamiliar with functional programming in the 1980s, there is no good reason why LabVIEW hasn't adopted it yet. See https://forums.ni.com/t5/LabVIEW-Idea-Exchange/For-Each-Element-in-Map/idi-p/4219633 [ni.com].

      Instead, LabVIEW is stuck with clunky while-loop frames with a stop-sign node to wire up in the middle, broadcast events symbolized by state...

    • by dbialac ( 320955 )
      You're talking about years of change between then and now on the Mac (680x0 to PPC to Intel to Apple Silicon), plus the probable overhead of maintaining a completely separate codebase in Objective-C. Windows and Linux likely share a C++ codebase where changes are somewhat similar. Objective-C is the result of what turned out to be a wrong guess in the '80s, when it wasn't clear whether C++ or Objective-C would become the primary C-based object-oriented language.
      • by dbialac ( 320955 )
        ...and just to be clear, I'm referring to the underlying language the product is developed in, not the front-end language the user interacts with.
    • If your logic doesn't fit neatly into one screen, it's worse than a sea of gotos and meatballs in even the worst text-based language.

      I completely agree, which is why I write all of my code on a 130" screen using an 8K projector.

      • by chihowa ( 366380 )

        You joke, but when I had to use LabView for a while I just kept requesting bigger and more monitors to make it tolerable. You can push things into sub-VIs to a certain extent, but you always end up with the screen-too-small problem as the complexity increases. Ugh... nightmares...

        • You joke, but when I had to use LabView for a while I just kept requesting bigger and more monitors to make it tolerable. You can push things into sub-VIs to a certain extent, but you always end up with the screen-too-small problem as the complexity increases. Ugh... nightmares...

          And here I always thought it was just me, not breaking my stuff down into small-enough sub-VIs.

          "Glad" to see it's actually a design problem with G itself!

  • Good riddance (Score:5, Insightful)

    by necro81 ( 917438 ) on Friday October 13, 2023 @08:18AM (#63922511) Journal
    I have used LabView over the years. It's handy for simple use cases: hook up one of their USB-connected ADCs, build a virtual instrument, and BAM! you've got a great visual readout on your computer and a way to stream data to disk.

    But at a certain level of complexity and sophistication, the whole thing becomes a house of cards. I've had to work on complex LabView projects - embedded automation on cRIO chassis and the like - and it quickly becomes very hard to navigate and trace through what's happening. When the in-house LabView guru left the company, they became unmaintainable. Encapsulating projects for archiving or portability, build procedures to generate the embedded image, figuring out dependencies, integrating with source/revision control, and just documentation in general are all really hard beyond the simple use case. Oh, and the licensing costs a shitload - many $k per seat.

    This is in contrast to just about any other programming language that allows for realtime execution (C++ comes to mind, but there are others): you can find lots of devs out there who can jump right in and figure things out, and there's probably enough in-house talent to at least take a look. Going back to older versions of this-or-that dependency or compiler is a cinch. And you don't need a $10k license just to open the damn project.
  • by TechyImmigrant ( 175943 ) on Friday October 13, 2023 @08:32AM (#63922545) Homepage Journal

    I have encountered LabView as an option for lab automation over the decades, and it has always been a pain in the arse.
    You have to buy/license/steal it, which in a company always means a lot of bureaucracy to wade through.

    Meanwhile, I would crack open my language and library of the day (C, python, various GPIB libs) and write a script that did the job, with no need to write a justification in triplicate to the purchasing department and wait three weeks.
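
    For the curious, here's a minimal sketch of that kind of throwaway script, using pyvisa to talk to an instrument over GPIB. The GPIB address, the SCPI commands, and the output file name are illustrative assumptions, not anything from a specific setup:

```python
# Minimal sketch: a quick "script that does the job" with pyvisa over GPIB.
# The address (GPIB0::22::INSTR), the SCPI commands, and the file name are
# hypothetical; substitute whatever your instrument's manual specifies.
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")  # e.g. a bench multimeter
print(dmm.query("*IDN?"))                   # standard SCPI identification

# Take ten DC-voltage readings, one per line, into a CSV file.
with open("readings.csv", "w") as log:
    for i in range(10):
        volts = float(dmm.query("MEAS:VOLT:DC?"))
        log.write(f"{i},{volts}\n")

dmm.close()
```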

    • Meanwhile, I would crack open my language and library of the day (C, python, various GPIB libs)

      Congrats on not being the target market. LabView doesn't exist to try to be better than C, python, or any other programming language; it exists for the people who can't code in C, python, or anything else, and need to slap together a quick interface in a minute or two.

      • It feels like a fork in the road that one takes. Having programmed a few rudimentary instruments, I found it damn near impossible to navigate LabView because the logic behind the GUI seemed foreign to me. Nevertheless, I've seen some very impressive experimental setups--e.g., a complete femtosecond laser system with ALL controls, DAQ and data analysis--done in LabView, and my impression was always that it might be easier to learn IGOR Pro or Matlab instead because these two offer more flexibility down the line.
      • by guruevi ( 827432 )

        It takes a minute or two just to open LabView, even on modern PCs. There is nothing quick about LabView; the only reason people use it is that someone doesn't want to admit they were fooled by the marketing, and the rest are just stuck on the thing.

        They literally sell the equivalent of an Arduino in a case for $700+, which virtually requires their software to use it. Everybody in a lab that wants to remain sane should be replacing their LabView blocks with a singular Python block until they can just completely...

      • Meanwhile, I would crack open my language and library of the day (C, python, various GPIB libs)

        Congrats on not being the target market. LabView doesn't exist to try to be better than C, python, or any other programming language; it exists for the people who can't code in C, python, or anything else, and need to slap together a quick interface in a minute or two.

        Actually, I have always felt that knowing how to code in conventional languages is more of a hindrance than a help when coding in G. You have to constantly re-think how to do things sort of "inside-out".

    • by dfghjk ( 711126 )

      National Instruments stole their marketing strategy directly from Steve Jobs and never updated it. Donate products to schools, get students using it for free, offer those students discounts, then rely on the students continuing to use LabView after they graduate. If it weren't for LabView there would be no reason for NI to exist, and LabView is an abomination.

  • What do people use in place of LabView, whether commercial or open source?

    • Well, you could always use a SCADA system with industrial instruments, like GE iFix, Ignition, or Wonderware. Those allow you to build GUIs that control things, but not exactly like LabView, which is more for benchtop devices.
    • Re: Alternatives (Score:5, Informative)

      by doragasu ( 2717547 ) on Friday October 13, 2023 @08:43AM (#63922573)

      We use Python with pyvisa to access instruments, and you can use the GUI library of your choice (Qt and imgui being popular choices).
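
      To make that concrete, here is a bare-bones live readout along those lines. It pairs pyvisa with PySide6 (any Qt binding would do); the resource string and the SCPI query are placeholder assumptions:

```python
# Sketch: a minimal live voltage readout, pyvisa plus a Qt label that
# refreshes once a second. Resource string and query are hypothetical.
import sys
import pyvisa
from PySide6.QtWidgets import QApplication, QLabel
from PySide6.QtCore import QTimer

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")  # placeholder instrument address

app = QApplication(sys.argv)
label = QLabel("--")
label.setWindowTitle("DMM readout")
label.show()

def poll():
    # One DC-voltage reading per tick; the query is device-specific.
    label.setText(f"{float(dmm.query('MEAS:VOLT:DC?')):.4f} V")

timer = QTimer()
timer.timeout.connect(poll)
timer.start(1000)  # refresh every 1000 ms

sys.exit(app.exec())
```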

      • Programming is not an alternative to a simple click-and-drag interface. I'm sure plenty of people here can come up with something. But you're here on Slashdot, not sitting in a lab wondering why someone is talking about snakes while everyone else is talking about a computer program.

  • by Rosco P. Coltrane ( 209368 ) on Friday October 13, 2023 @08:59AM (#63922635)

    National Instruments devices are totally overpriced. LabView is a terrible "language" that is easily replaced by Python nowadays.

    And if all that wasn't enough, a few years ago, they dropped support for Linux for their libraries: the last OS you can get packages for is CentOS 7. And they left us in deep doodoo with all the test equipment we had that relied on those Linux libraries.

    It's so bad that my company, which was a devoted National Instruments shop for almost 35 years, was very easily convinced by yours truly to start buying devices that are 1/4 the price of the equivalent NI devices and officially support Linux and Python.

    We now buy stuff mostly from LabJack [labjack.com] and I have nothing but good things to say about their hardware, software and support. It took a while, but we finally managed to rip out all our National Instruments acquisition devices that weren't supported anymore and replaced them with LabJack devices, recoded the samplers that relied on the NI libraries, upgraded the CentOS 7 machines to something current at last, and now we're finally free from headaches.
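
    For a flavor of what the recoded samplers look like, here is a minimal sketch of a single analog read using LabJack's LJM Python bindings (pip install labjack-ljm). The channel name "AIN0" follows the T-series register map; treat the details as illustrative rather than a drop-in replacement:

```python
# Minimal sketch: read one analog input from a LabJack device via the LJM
# Python bindings. "AIN0" is the first analog input in the T-series
# register map; adjust the name for your wiring.
from labjack import ljm

handle = ljm.openS("ANY", "ANY", "ANY")  # open the first device LJM finds
voltage = ljm.eReadName(handle, "AIN0")  # single-ended read of AIN0
print(f"AIN0 = {voltage:.4f} V")
ljm.close(handle)
```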

    I know we're not the only company vowing never to buy National Instruments ever again: they sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better upstarts are eating their lunch.

    • [National Instruments] sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better upstarts are eating their lunch.

      That'll change soon. Emerson Electric just finalized their acquisition of National Instruments [nasdaq.com], so half the company including upper management will probably be laid off in the next year or two as that seems to be Emerson's typical pattern of behavior when buying up companies. All those people who sat on their laurels will be working hard to jump ship starting next week.

      • [National Instruments] sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better upstarts are eating their lunch.

        That'll change soon. Emerson Electric just finalized their acquisition of National Instruments [nasdaq.com], so half the company including upper management will probably be laid off in the next year or two as that seems to be Emerson's typical pattern of behavior when buying up companies. All those people who sat on their laurels will be working hard to jump ship starting next week.

        Oh, so this is all Emerson's (un)doing!

        Now it makes sense. Stupid MBA-driven bastards!

    • by timholman ( 71886 ) on Friday October 13, 2023 @10:37AM (#63922815)

      National Instruments devices are totally overpriced. LabView is a terrible "language" that is easily replaced by Python nowadays.

      The main problem with LabView is that NI always treated it as a hammer for every possible nail. Years ago we bought NI ELVIS boards, because they were much cheaper than buying HP and Tektronix equipment for our teaching labs.

      But the students and instructors hated using the ELVIS boards. You had to load and maintain LabView on a computer just to emulate a voltmeter in software. It was absolute overkill, and a software maintenance nightmare to boot. But it was cheaper than professional HP and Tektronix hardware, and so we stuck with it.

      What ultimately killed National Instruments in the educational space was the advent of inexpensive Chinese test equipment that was perfectly suitable for a student lab. The ELVIS board was much cheaper than an oscilloscope, power supply, signal generator, and multimeter from HP, but in turn the Chinese equipment was far cheaper than the ELVIS. NI made a vain attempt to fight back with an "all-in-one" multi-instrument, but (again) it ran LabView under the hood, and was an absolute pig in terms of performance, while being twice as expensive as what we already had.

      I'm sure that LabView will be around for a very long time. I know a guy I went to school with who has made his entire career as a LabView consultant, so there's money to be made at it. But as an instructor, there has never been a piece of software that I was so happy to abandon as LabView.

      • The main problem with LabView is that NI always treated it as a hammer for every possible nail

        The problem with LabView is that, as a programming language, it's about as efficient as telling a story by drawing a comic strip vs. writing it down in English: unless you're illiterate, it's a terrible option no matter what you have to do with it.

        I first encountered LabView in the early 90s at university, and the first thought that went through my mind was "It's cute for a demo, but what a really, really inefficient way of doing anything". And lo, 30 years later, the damn thing is still around. That al...

    • National Instruments devices are totally overpriced. LabView is a terrible "language" that is easily replaced by Python nowadays.

      And if all that wasn't enough, a few years ago, they dropped support for Linux for their libraries: the last OS you can get packages for is CentOS 7. And they left us in deep doodoo with all the test equipment we had that relied on those Linux libraries.

      It's so bad that my company, which was a devoted National Instruments shop for almost 35 years, was very easily convinced by yours truly to start buying devices that are 1/4 the price of the equivalent NI devices and officially support Linux and Python.

      We now buy stuff mostly from LabJack [labjack.com] and I have nothing but good things to say about their hardware, software and support. It took a while, but we finally managed to rip out all our National Instruments acquisition devices that weren't supported anymore and replaced them with LabJack devices, recoded the samplers that relied on the NI libraries, upgraded the CentOS 7 machines to something current at last, and now we're finally free from headaches.

      I know we're not the only company vowing never to buy National Instruments ever again: they sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better upstarts are eating their lunch.

      Do those alternatives support Macs?

      • by guruevi ( 827432 )

        Yes, Macs have USB, and Python.

        NI wants you to buy into their expensive ecosystem of overpriced Arduino boards.

        • Yes, Macs have USB, and Python.

          NI wants you to buy into their expensive ecosystem of overpriced Arduino boards.

          Ok; so there is an alternative to NI's bullshit hardware (which I was never much impressed with, anyway). Great!

  • In practice, the simplest instruments become tangled sub-systems with clunky, non-intuitive icons. In production, the compiled system took 30 seconds to load before starting. I regret specifying it for a test system.
    • by labnet ( 457441 )

      Yep,
      I dropped 30k on an NI PXI ATE system as I thought it was the best. But what a nightmare the whole NI software ecosystem is.

  • by Petersko ( 564140 ) on Friday October 13, 2023 @11:33AM (#63922939)

    Turn subscriptions into perpetual licenses? That's the right thing to do if you're going to abandon something. Well done.

  • We looked into using LabView for our AI research in the 90s. It was grossly overpriced and it was hard to find out what it actually could and couldn't do. That shopping trip was short.

  • It is obvious that NI didn't want to, or couldn't figure out how to, hand-translate piles of carefully-designed, time-critical libraries of low-level x86 assembly code into Apple Silicon code, and they either weren't getting the performance they needed out of Rosetta 2 translation, or, more likely, they were using a bunch of sloppy hacks that Rosetta 2 wouldn't put up with.

    But that's not surprising; NI has been itching to drop Mac support since at least LabView 6.0.

  • The Mac was chosen partially because of the 68000. Yes, other computers (Atari, Amiga) also used the 68000.

    But they were not the computers used in labs.

    When the Mac was introduced, Apple had agreements with universities to provide Macs to students for their class/lab work. Drexel was just one member of the Apple University Consortium.

    Those students went on to engineering jobs and labs and brought their experience with Macs and LabView with them.

    Windows 3.1 sucked as it wasn't preemptive multitasking and, face it, the Mac OS with QuickDraw was just a better choice.

    • by jaa101 ( 627731 )

      Windows 3.1 sucked as it wasn't preemptive multitasking and, face it, the Mac OS with QuickDraw was just a better choice.

      MacOS didn't have pre-emptive multitasking until OS X, which had its public release in 2001. Microsoft got there first with Windows NT in 1993.

  • This is the second report in recent days of Mac support being dropped while Linux support remains. After years of seeing Linux support missing when Mac support is present, I find it surprising to now see cases where a software producer thinks that Linux is worth the effort and the Mac is not.

    I guess two cases can't really be called a trend, but I do wonder what is behind this view. Is Apple doing something that leads to this refocus, or is it simply that Linux is now more popular than...
  • Not surprised at all. Emerson bought National Instruments on Wednesday. Of course they are going to start chopping the elements that don't fit Emerson's business plans/use for the tech they acquired. https://www.emerson.com/en-us/... [emerson.com]
