LabView App Abandons the Mac After Four Decades (appleinsider.com)
An anonymous reader quotes a report from AppleInsider: Created on a Mac in the 1980s, LabView has now reached the end of the line on Apple's platform: its latest macOS update has been announced as the final release. LabView is a visual programming language tool that lets users wire virtual measurement instruments together to input and process data. AppleInsider staffers have seen it used across a variety of industries and applications, from designing complex monitoring systems to automating test sequences.
It's been 40 years since Dr. James Truchard and Jeff Kodosky began work on it and founded their firm, National Instruments. The first release of the software came in October 1986, when it was a Mac exclusive. In a 2019 interview, Jeff Kodosky said this was because "it was the only computer that had a 32-bit operating system, and it had the graphics we needed." Now National Instruments has told all current users that it has released an updated Mac version -- but it will be the last.
National Instruments says it will cease selling licenses for the Mac version in March 2024 and will also stop support. LabView has also been sold as a subscription, and National Instruments says it will switch users to a "perpetual license for your continued use," though seemingly only if specifically requested. As yet, there have been few reactions on the NI.com forums. However, one post says, "This came as a shocker to us as the roadmap still indicates support." National Instruments says LabVIEW "will continue to be available on Windows and Linux OSes."
LabView sucks (Score:4, Interesting)
If your logic doesn't fit neatly into one screen, it's worse than a sea of gotos and meatballs in even the worst text-based language.
But the front-end GUIs are pretty, so yeah. A product designed for managers and not the people using it.
Re: (Score:2)
There is a conceptual similarity between dataflow programming, as LabVIEW supports, and functional programming. While it's understandable that the developers may have been unfamiliar with functional programming in the 1980s, there is no good reason why LabVIEW hasn't adopted it yet. See https://forums.ni.com/t5/LabVIEW-Idea-Exchange/For-Each-Element-in-Map/idi-p/4219633 [ni.com].
Instead, LabVIEW is stuck with clunky while loop frames with a stop sign node to wire up in the middle, broadcast events symbolized by sate
Re: (Score:3)
I completely agree, which is why I write all of my code on a 130" screen using an 8K projector.
Re: (Score:2)
You joke, but when I had to use LabView for a while I just kept requesting bigger and more monitors to make it tolerable. You can push things into sub-VIs to a certain extent, but you always end up with the screen-too-small problem as the complexity increases. Ugh... nightmares...
Re: (Score:2)
You joke, but when I had to use LabView for a while I just kept requesting bigger and more monitors to make it tolerable. You can push things into sub-VIs to a certain extent, but you always end up with the screen-too-small problem as the complexity increases. Ugh... nightmares...
And here I always thought it was just me, not breaking my stuff down into small-enough sub-VIs.
"Glad" to see it's actually a design problem with G itself!
Good riddance (Score:5, Insightful)
But at a certain level of complexity and sophistication, the whole thing becomes a house of cards. I've had to work on complex LabView projects - embedded automation on cRIO chassis and the like - and it quickly becomes very hard to navigate and trace through what's happening. When the in-house LabView guru left the company, they became unmaintainable. Encapsulating projects for archiving or portability, build procedures to generate the embedded image, figuring out dependencies, integrating with source/revision control, and just documentation in general are all really hard beyond the simple use case. Oh, and the licensing costs a shitload - many $k per seat.
This in contrast to just about any other programming language that allows for realtime execution (C++ comes to mind, but there are others): you can find lots of devs out there who can jump right in and figure things out, and there's probably enough in-house talent to at least take a look. Going back to older versions of this-or-that dependency or compiler is a cinch. And you don't need a $10k license just to open the damn project.
Re:In 2019 32 bit computers were "the Mac"??? LOL (Score:5, Informative)
Are you saying that in 1986 there was a 32-bit version of Windows? The interview was in 2019, talking about releasing the product in 1986.
The original Macintosh used the Motorola 68000 CPU, which had a 32-bit internal architecture but used a 16-bit data bus and a 24-bit address bus. In essence, it was a bit of a hybrid: it could perform some 32-bit operations but was limited in other ways that you'd typically associate with a full 32-bit architecture. It was pretty advanced for its time. I guess they could have targeted the Amiga as well, but they were probably targeting what they felt was the more popular OS. We don't see Linux as a solid option until well into the '90s. I would guess some SPARC stuff from Sun could have done it too. But ultimately the guy was mostly on target.
Re: (Score:2)
That's the deceit, though. "32 bit" is the native width of general purpose registers, it's only one aspect. Macs were 32 bit, but they lacked a lot of things.
There wasn't a 32 bit Windows in 1986, but there was in 1987 and that version of Windows was more capable than MacOS. Furthermore, Windows, and DOS, could use 32 bit integers before that.
I ported TeX to OS/2 1.0 when OS/2 was new. Worked fine, you know why? Because you DON'T REQUIRE A 32 BIT OS TO DO 32 BIT OPERATIONS.
"I guess they could have target
Re:In 2019 32 bit computers were "the Mac"??? LOL (Score:5, Informative)
There wasn't a 32 bit Windows in 1986, but there was in 1987
That's kind of completely irrelevant for a product released in 1986. Should they have waited a year and then developed it?
Furthermore, Windows, and DOS, could use 32 bit integers before that.
This is a somewhat odd reading of history. LabView was released in October 1986. The i386 CPU was released only a year prior to that, with the first 386 PC (the Compaq Deskpro 386) released one month before.
Yes, technically DOS could in theory run code supporting 32-bit operations before LabView was released, but we're talking one month. One scant month. And the first 32-bit DOS extender wasn't released until a month AFTER LabView hit the market.
Their claim is fair. For all practical purposes, 32bit on PCs was not available when LabView was first released.
Re: (Score:3)
Windows 3.1 did not ship until 1991 and remained hugely inferior to Mac OS in every respect until Windows NT in 1995. Windows 3.1 was a graphical shell on DOS, and DOS cannot even be called an operating system; it was a program loader.
Re: (Score:2)
1987 and that version of Windows was more capable than MacOS. Furthermore, Windows, and DOS, could use 32 bit integers before that.
Just lolz.
You either never used a Mac back then, or a Windows machine, or both.
Re: (Score:1)
When the guys in question decided to go for a Mac, there was no google.
And I did not talk about when the first or second or competing 32bit processor existed.
From their point of view there was only one 32bit system available, and they went for it.
Now we are 40 years later, and we can google around.
(Arguments about which OS was more capable are mostly just opinion.)
Not really, as that machine most certainly had no compiler to make any use of the address space, if it even had more than 640 KB of memory. So, if
Re: (Score:3)
That's just not true. Those early versions of Windows were terrible in comparison to the early versions of the Mac. It took another decade for Windows to even begin to approach the usability level of Macs. If you think otherwise it's because you didn't use both in that time period.
Re: (Score:2)
The Windows of 1986 was substandard by any reasonable measure, because its design was heavily 8-bit influenced. It was a small GUI on top of a 16-bit DOS that was a trivial evolution of an 8-bit microcomputer style (CP/M, etc.). It ran 16-bit applications because it ran on, at most, a 286. When Windows first ran on the 386 it was still running 16-bit applications for the most part; 32-bit apps were treated as a special case and not the standard for some time (at least on the home versions, the Windows NT di
Re: (Score:3)
"In 1986 Mac was the only consumer grade 32bit OS out there."
False. Amiga existed, Atari ST existed. AT&T sold a 68K-based desktop Unix product. There were a dozen or more Unix workstations around.
Sure, you could make up some reason why some of those were not "consumer grade" (but not all), but NI didn't say "consumer grade", it only said 32-bit OS. There were quite a few, just none that ran on the PC. You know, because the PC itself was not 32-bit. Once it was, 32-bit OSes for it appeared almost immediately.
Re: (Score:3)
Calling pre-System 7 Classic Mac OS "32-bit" seems bizarre. It didn't support 32-bit addressing; that was the big push for "32-bit clean" software later on in the '90s, to get rid of the flag usage in the 24-bit address space. As for the 68000, the data bus, both internal and external, was 16-bit. Compare the 8088, which was 8-bit externally but 16-bit internally, or the 386SX, the same thing but 32-bit internally and 16-bit externally. The only thing 32-bit about the 68000 was the registers.
Re: (Score:2)
No criticism of you intended. Sorry. The whole 32-bit thing at the time was just a little weird as being of importance. By the time the real benefits of 32 bit were realized, you'd have rewritten your software a couple times for sure.
Re: (Score:1)
The only thing 32-bit about the 68000 was the registers.
And that is the only thing that matters.
At the relevant time frames there were already 68020s shipped in Macs and soon 68030s.
Yes, addresses were only 24 bits, which gave a whopping 16 MB of address space. That was three quarters of the size of my hard drive at the time ... and would have been an unobtainium amount of RAM.
Re: (Score:2)
Interestingly, the Mac system software at the time was only capable of 8 MB, which wasn't even the full 24 bits, even with a PMMU. I actually remember people maxing their RAM out... in the ~1990 range. Yes, expensive. I couldn't afford it. Impossible? No.
Also, bus size matters a lot. Every 32-bit operation ended up being two 16-bit ops on the bus.
Re: (Score:3)
At the relevant time frames there were already 68020s shipped in Macs and soon 68030s.
Not at all.
In 1986, Apple released the Mac Plus, which came with 1 MB RAM, and with a lot of work and snipping a lead on a resistor, could be upgraded to 4 MB of RAM. I did a few of those once 1 MB SIMMs came down in price.
In 1987, Apple released the Mac II with a 68020, and the Mac SE, with a 68000.
In September 1988, Apple released the Mac IIx with a 68030.
The first release of LabView was October 1986, so in between the Mac Plus and the Mac II.
Re: (Score:2)
At the relevant time frames there were already 68020s shipped in Macs and soon 68030s.
Not at all.
In 1986, Apple released the Mac Plus, which came with 1 MB RAM, and with a lot of work and snipping a lead on a resistor, could be upgraded to 4 MB of RAM. I did a few of those once 1 MB SIMMs came down in price.
In 1987, Apple released the Mac II with a 68020, and the Mac SE, with a 68000.
In September 1988, Apple released the Mac IIx with a 68030.
The first release of LabView was October 1986, so in between the Mac Plus and the Mac II.
I would not be surprised to learn that NI had access to some pre-production Mac IIx systems.
Re: (Score:1)
So "two years" is not the relevant time frame?
For some reason I have the impression you wanted somehow to correct me.
But actually you supported my point.
No idea if that was your point.
Re: (Score:1)
In 1986, when they were deciding which platform to target, the MacOS was the only 32-bit GUI solution in common use.
Your snark is A-, but your reading comprehension is a generous D+.
It's a common mistake that was made in the 1990s (Score:2)
The marketing term "32-bit" often stood for not having 8086-style segmented memory. That way you could allocate memory regions larger than 64k without tricks.
This was normal for 68k CPUs like those in the Mac, the Amiga or the Atari ST, but not for the x86 world, where it took until the 386 for it to become more or less normal... and of course on Windows you still had to deal with segmented memory until the end. That is, BTW, the reason why input fields and Notepad can only hold 64k on Windows. (It might be diffe
Re: (Score:2)
What?
Segmented memory had nothing to do with memory allocation. 8088/8086-based machines had a 20-bit address bus but only 16-bit registers. In order to access all of the possible 1MB
Re: (Score:2)
Yeah, but if you do a malloc on a real-mode machine you can only allocate 64k in any usable way. After all, you don't really have universal pointers, but "near" (16-bit) and "far" (16-bit plus magic to get the right segment) pointers instead. Incrementing a pointer past a 64k boundary was hard, therefore compilers wouldn't do it by default.
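For anyone who never had the pleasure: the address math being described is just (segment << 4) + offset. A tiny Python illustration of the scheme (purely illustrative, nothing to do with LabView):

```python
# 8086 real-mode addressing: a 20-bit linear address is formed from two
# 16-bit values, so each segment covers only 64 KB of the 1 MB space.
def linear(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MB on an 8086

print(hex(linear(0x1234, 0x0010)))  # 0x12350
print(hex(linear(0x1235, 0x0000)))  # 0x12350 -- different seg:off, same byte

# A "near" pointer carries only the 16-bit offset, so it can never step
# past a 64 KB boundary; a "far" pointer carries the segment too, which
# is the "magic" mentioned above.
```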
LabView, Making Lab Work Hard (Score:4, Insightful)
I have encountered LabView as an option for lab automation over the decades, and it has always been a pain in the arse.
You have to buy/license/steal it, which in a company always means a lot of bureaucracy to wade through.
Meanwhile, I would crack open my language and library of the day (C, python, various GPIB libs) and write a script that did the job, with no need to write a justification in triplicate to the purchasing department and wait three weeks.
Re: (Score:2)
Meanwhile, I would crack open my language and library of the day (C, python, various GPIB libs)
Congrats on not being the target market. LabView doesn't exist to try to be better than C, Python or any other programming language; it exists for the people who can't code in C, Python, or anything else, and need to slap together a quick interface in a minute or two.
Re: (Score:2)
It takes a minute or two just to open LabView, even on modern PCs. There is nothing quick about LabView; the only reason people use it is that someone doesn't want to admit they were fooled by the marketing, and the rest are just stuck with the thing.
They literally sell the equivalent of an Arduino in a case for $700+, which virtually requires their software to use. Everybody in a lab who wants to remain sane should be replacing their LabView blocks with a single Python block until they can just completely
Re: (Score:3)
Meanwhile, I would crack open my language and library of the day (C, python, various GPIB libs)
Congrats on not being the target market. LabView doesn't exist to try to be better than C, Python or any other programming language; it exists for the people who can't code in C, Python, or anything else, and need to slap together a quick interface in a minute or two.
Actually, I have always felt that knowing how to code in conventional languages is more of a hindrance than a help when coding in G. You have to constantly re-think how to do things, sort of "inside-out".
Re: (Score:2)
National Instruments stole their marketing strategy directly from Steve Jobs and never updated it. Donate products to schools, get students using it for free, offer those students discounts, then rely on the students continuing to use LabView after they graduate. If it weren't for LabView there would be no reason for NI to exist, and LabView is an abomination.
Alternatives (Score:2)
What do people use in place of LabView, whether commercial or open source?
Re: Alternatives (Score:5, Informative)
We use Python with pyvisa to access instruments. And you can use the GUI library of your choice (Qt and imgui being popular choices).
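For anyone wondering what that looks like, here is a minimal, untested sketch; the GPIB address and the measurement command are made up for illustration and depend on your instrument:

```python
# Query an instrument over VISA. Assumes a VISA backend (NI-VISA or the
# pure-Python pyvisa-py) is installed and a device answers at the address.
import pyvisa

rm = pyvisa.ResourceManager()
meter = rm.open_resource("GPIB0::12::INSTR")   # hypothetical GPIB address
print(meter.query("*IDN?"))                    # standard identification query
reading = float(meter.query("MEAS:VOLT:DC?"))  # instrument-specific SCPI command
print(f"DC voltage: {reading} V")
meter.close()
rm.close()
```

The GUI part is whatever you like on top of that: a Qt window, a notebook, or just a CSV file and a plot.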
Re: (Score:2)
Programming is not an alternative for a simple click and drag interface. I'm sure plenty of people here can come up with something. But you're here on Slashdot, not sitting in a lab wondering why someone is talking about snakes while everyone else is talking about a computer program.
Re: (Score:2)
Anything in NodeRed?
National Instruments stuff is completely terrible (Score:5, Insightful)
National Instruments devices are totally overpriced. LabView is a terrible "language" that is easily replaced by Python nowadays.
And if all that wasn't enough, a few years ago, they dropped support for Linux for their libraries: the last OS you can get packages for is CentOS 7. And they left us in deep doodoo with all the test equipment we had that relied on those Linux libraries.
It's so bad that my company, which was a devoted National Instruments shop for almost 35 years, was very easily convinced by yours truly to start buying devices that are 1/4 the price of the equivalent NI devices and officially support Linux and Python.
We now buy stuff mostly from LabJack [labjack.com] and I have nothing but good things to say about their hardware, software and support. It took a while, but we finally managed to rip out all our National Instruments acquisition devices that weren't supported anymore and replaced them with LabJack devices, recoded the samplers that relied on the NI libraries, upgraded the CentOS 7 machines to something current at last, and now we're finally free from headaches.
I know we're not the only company vowing never to buy National Instruments ever again: they sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better competitors are eating their lunch.
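For the curious, reading a channel off a LabJack from Python is only a few lines. This is a rough sketch from memory of their LJM bindings, so double-check the call names against LabJack's documentation before copying it:

```python
# Read one analog input with LabJack's LJM Python bindings (labjack-ljm).
# Function names are from memory -- treat them as an assumption and verify
# against the official LJM documentation.
from labjack import ljm

handle = ljm.openS("ANY", "ANY", "ANY")  # open the first LabJack found
value = ljm.eReadName(handle, "AIN0")    # sample analog input channel 0
print(f"AIN0 = {value:.4f} V")
ljm.close(handle)
```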
Re: (Score:3)
[National Instruments] sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better competitors are eating their lunch.
That'll change soon. Emerson Electric just finalized their acquisition of National Instruments [nasdaq.com], so half the company including upper management will probably be laid off in the next year or two as that seems to be Emerson's typical pattern of behavior when buying up companies. All those people who sat on their laurels will be working hard to jump ship starting next week.
Re: (Score:2)
[National Instruments] sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better competitors are eating their lunch.
That'll change soon. Emerson Electric just finalized their acquisition of National Instruments [nasdaq.com], so half the company including upper management will probably be laid off in the next year or two as that seems to be Emerson's typical pattern of behavior when buying up companies. All those people who sat on their laurels will be working hard to jump ship starting next week.
Oh, so this is all Emerson's (un)doing!
Now it makes sense. Stupid MBA-driven bastards!
Re:National Instruments stuff is completely terrib (Score:4, Insightful)
The main problem with LabView is that NI always treated it as a hammer for every possible nail. Years ago we bought NI ELVIS boards, because they were much cheaper than buying HP and Tektronix equipment for our teaching labs.
But the students and instructors hated using the ELVIS boards. You had to load and maintain LabView on a computer just to emulate a voltmeter in software. It was absolute overkill, and a software maintenance nightmare to boot. But it was cheaper than professional HP and Tektronix hardware, and so we stuck with it.
What ultimately killed National Instruments in the educational space was the advent of inexpensive Chinese test equipment that was perfectly suitable for a student lab. The ELVIS board was much cheaper than an oscilloscope, power supply, signal generator, and multimeter from HP, but in turn the Chinese equipment was far cheaper than the ELVIS. NI made a vain attempt to fight back with an "all-in-one" multi-instrument, but (again) it ran LabView under the hood, and was an absolute pig in terms of performance, while being twice as expensive as what we already had.
I'm sure that LabView will be around for a very long time. I know one guy who I went to school with who has made his entire career as a LabView consultant, so there's money to be made at it. But as an instructor, there has never been a piece of software that I was so happy to abandon as LabView.
Re: (Score:2)
The main problem with LabView is that NI always treated it as a hammer for every possible nail
The problem with LabView is that, as a programming language, it's just about as efficient as telling a story by drawing a comic strip vs writing it down in English: unless you're illiterate, it's a terrible option no matter what you have to do with it.
I first encountered LabView in the early 90s at university, and the first thought that went through my mind was "It's cute for a demo, but what a really, really inefficient way of doing anything". And lo, 30 years later, the damn thing is still around. That al
Re: (Score:2)
National Instruments devices are totally overpriced. LabView is a terrible "language" that is easily replaced by Python nowadays.
And if all that wasn't enough, a few years ago, they dropped support for Linux for their libraries: the last OS you can get packages for is CentOS 7. And they left us in deep doodoo with all the test equipment we had that relied on those Linux libraries.
It's so bad that my company, which was a devoted National Instruments shop for almost 35 years, was very easily convinced by yours truly to start buying devices that are 1/4 the price of the equivalent NI devices and officially support Linux and Python.
We now buy stuff mostly from LabJack [labjack.com] and I have nothing but good things to say about their hardware, software and support. It took a while, but we finally managed to rip out all our National Instruments acquisition devices that weren't supported anymore and replaced them with LabJack devices, recoded the samplers that relied on the NI libraries, upgraded the CentOS 7 machines to something current at last, and now we're finally free from headaches.
I know we're not the only company vowing never to buy National Instruments ever again: they sat on their laurels for too long, thinking they were an unavoidable industry standard, and now smaller and better competitors are eating their lunch.
Do those alternatives support Macs?
Re: (Score:2)
Yes, Macs have USB, and Python.
NI wants you to buy into their expensive ecosystem of overpriced Arduino boards.
Re: (Score:2)
Yes, Macs have USB, and Python.
NI wants you to buy into their expensive ecosystem of overpriced Arduino boards.
Ok; so there is an alternative to NI's bullshit hardware (which I was never much impressed with, anyway). Great!
Re: LOL (Score:3)
Show me one post in this thread featuring a crying fanboi. LabView seems to be nearly universally disliked on all platforms.
Re: (Score:3)
Exactly. If I were a Mac fanboi, knowing LabView, my reactions would be something like:
"MacOS must be a very good platform: National Instruments decided to leave it."
"Finally, this horrid stain has left my beautiful computer".
"I don't have to suffer with it at work on a Mac anymore and spoil my experience of Macs, as Macs are meant for doing only pleasurable things."
Re: (Score:2)
Your post is an excellent example. Keep crying.
Go away.
We adults are actually having a relatively discussion here (for once!).
FOAD.
Re: (Score:2)
Relatively sane discussion.
Slashdot needs a fucking Edit button!
Re: (Score:2)
NI's decision itself is weird. MacOS isn't a small platform for them, and they just spent a ton of programming effort porting to ARM. They have been moving towards Windows-only support anyway.
Thankfully this just helps accelerate our move to Python: we only have MacOS and Linux, their Linux support is bad, and now there's no more MacOS, so it seems like no more NI.
Oh, they did this after making LabView Apple Silicon Native?!?
Then it has to be Emerson's stupidity!
Idiots.
Goodbye LabView!
Looks like a good idea but.. (Score:1)
Re: (Score:2)
Yep,
I dropped 30k on an NI PXI ATE system because I thought it was the best. But what a nightmare the whole NI software ecosystem is.
Nicely Handled. (Score:3)
Turning subscriptions into perpetual licenses? That's the right thing to do if you're going to abandon something. Well done.
ripoff (Score:2)
We looked into using LabView for our AI research in the 90s. It was grossly overpriced and it was hard to find out what it actually could and couldn't do. That shopping trip was short.
Lazy, Lazy, Lazy (Score:2)
It is obvious that NI didn't want to/couldn't figure out how to hand-translate piles of carefully-designed time-critical libraries of low-level x86 Assembly code into Apple Silicon code, and they either weren't getting the performance they needed out of Rosetta2 Translations; or, more likely, they were using a bunch of sloppy hacks that Rosetta2 wouldn't put up with.
But that's not surprising; NI has been itching to drop Mac support since at least LabView 6.0.
Processor isn't really the issue (Score:2)
The Mac was chosen partially because of the 68000. Yes, other computers (Atari, Amiga) also used the 68000.
But, they were not the computers used in labs.
When the Mac was introduced, Apple had agreements with universities to provide Macs to students for their class and lab work. Drexel was just one member of the Apple Consortium.
Those students also went on to engineering jobs and labs and brought their experience with Macs and LabView with them.
Windows 3.1 sucked as it wasn't preemptive multitasking and, face it, the Mac OS with QuickDraw was just a better choice.
Re: (Score:2)
Windows 3.1 sucked as it wasn't preemptive multitasking and, face it, the Mac OS with QuickDraw was just a better choice.
MacOS didn't have pre-emptive multitasking until OS X which had a public release in 2001. Microsoft got there first with Windows NT in 1993.
Start of a trend? (Score:2)
I guess two cases can't really be called a trend, but I do wonder what is behind this move. Is Apple doing something that leads to this refocus, or is it simply that Linux is more popular than
Not surprised (Score:2)