why I hate, despise, detest, and loathe LabView

  1. Inability to write descriptive comments!
  2. Inability to name variables!!!
  3. Nonlinear, graphical programming interface:
    1. Messy, horribly hard-to-follow programs!  Wires everywhere!
    2. Extreme difficulty to insert new commands into an established program without ruining the organization structure!!
    3. Frakking impossible to debug!!!!!
    4. Computer processors operate linearly anyway–LABVIEW IS LYING!!!
  4. Sequence structures–the most cumbersome way possible for the LabView creators to have tried to rectify the problem that sometimes YOU JUST NEED TO EXECUTE COMMANDS IN ORDER JUST LIKE A CONVENTIONAL PROGRAM, DAMMIT!!!
  5. Mouse sensitivity!  As in, my programming ability should not have to rely on my skill to accurately position the mouse over some of those frakking tiny terminals!
  6. Timing structures–THEY DO NO SUCH THING!
  7. The fact that it has to rebuild all its data acquisition sub-VIs every time I want to make a tiny change to the sampling mode!
  8. Shift registers and sequence instances!  The saddest excuses for variables on the planet–and they contribute to the messy wiring problem!!
  9. It handles arrays in an extraordinarily clunky manner–and when you’re taking data, the role LabView is best suited for, MOST OF THE TIME YOU CAN’T POSSIBLY AVOID USING ARRAYS!

411 Responses to why I hate, despise, detest, and loathe LabView

  1. Craig says:

    I am totally in agreement with you on this; however, you left one really important thing out: it’s proprietary and costs thousands of dollars per seat. I hate how NI suckers people into buying their expensive equipment because “it will work optimally with LabView”. Oh yeah, and code and settings are hidden from plain sight in all sorts of stupid menus and behind other widgets!
    I have been “programming” LabView for 2 years now and I hate it more and more every day. I have also been programming in Python for about the same time and I have very few complaints with that language. Now I’m converting some automated test equipment from a Windows/LabView (doubly loathsome) platform to an entirely open-source Linux/Python platform – and I’m loving every minute of it despite the typical lack of documentation.

  2. nomel says:

    I agree. We had some part screening programs that were made by some former engineers. We had to add features…it took HOURS UPON HOURS to move all the damn little wires around. Towards the end, after I showed some higher-ups what it took to maintain, I was pretty much told to screw neatness and get the feature added. Between that and some run-ins with version compatibility problems…we’ve now moved to an easy-to-use, text-based language and never looked back. Things that took screens and screens in Labview took a page of infinitely editable lines…with comments on every one if you want!

    My conclusion: Upside: Labview is interesting because it’s graphical. Downside: Labview is graphical, so it’s impossible to add features in a timely manner when specifications change. And seriously…who was in charge of the mouse UI!? Each attempt to make a long wire or scroll the screen was one step closer to dumping Labview completely…which we did.

    I wouldn’t mind it if they changed arrays, changed navigation and zoom, and added scripting support…so you could make blocks of scripted code for those times when something graphical is just wayyyyy too needlessly complicated.

  3. Say No to LabVIEW! says:

    I hate how you have to search through endlessly nested pop-up menus of icons that sometimes don’t even look like what they’re supposed to represent (that’s right, type cast, you look like a satellite). And the structure of the menus seems completely arbitrary!

    And why are there like 50 different ways to perform one simple task?!? I JUST WANT TO READ AN ANALOG VOLTAGE FOR CHRIST’S SAKE!

  4. jshoer says:

    I think this post is kind of hilarious, since it’s old and I keep getting comments on it. Makes me wonder just how many people out there type “I HATE LABVIEW!” into Google searches!

  5. LabVIEW sucks says:

    I am a student at Stevens and am required to take a Design Lab. Last semester I had to program in C++ and I absolutely loved it. Now this semester I’ve been introduced to LabVIEW and I hate every minute of it. It’s impossible to debug and figure out where problems lie. I have no idea how to follow someone else’s program because I don’t know where the program “starts”. It’s messy and there are wires everywhere. It’s frustrating to even get a case structure set up, and even more so to get nested loops set up without the program freezing. Most of all, it completely violates the philosophy of programming because it takes something that is simple and makes it ten times more inefficient.

    And the reason my professor says people like it: it’s graphical, and so non-computer engineers can understand it. **** YOU. If I have to work 3 times as hard to do my job so some non-programmer can understand my work by “looking at the pretty pictures”, then I want doctors to dumb down their work so I can understand how they do their jobs.

    If I ever get offered a job where I have to work with LabVIEW I will turn it down immediately. I abhor this program. Give me C++ back.

  6. nicola says:

    Man, is there some logic behind this crap? I guess this G language is made to be seen, not to be used.

  7. Kyle says:

    Remember when you went to McDonald’s some 10 years ago or so and discovered that as you ordered a Big Mac, fries, and a Coke, the cashier pushed enormous buttons on the oversized machine with pictures of food and drinks on it!? Are we now dumbing down even at the university level!? A Ph.D. should mean you can spend a few hours learning how to interface a device with serial commands and a big boy’s (girl’s) programming language.

  8. fluffums says:

    I wondered why my superiors insisted on Labview. When I asked, they said “It’s an industry standard”. Okay, but why? “G” is just a cool-looking GUI interpreter that’s based on C, C++ and maybe C#. When it comes to interfaces (supposedly its strength, right? Virtual Instruments?) it’s horribly slow, and its hardware and accessories are as overpriced as its software package. As an engineer — even as a technician — it’s understood you would know how to program already, so I still have to ask “Why is Labview an industry standard”? Anyone?

  9. pookie says:

    If programming gas pumps or ATMs is considered the “industry”, isn’t it a bit overrated?

  10. jshoer says:

    The “industry standard” point is kind of interesting, especially since C/C++ and Matlab are also industry standards, both of which require some nontrivial programming knowledge. Of course, Matlab also includes Simulink for some additional graphical programming hassles…

  11. mtron says:

    “I think this post is kind of hilarious, since it’s old and I keep getting comments on it. Makes me wonder just how many people out there type “I HATE LABVIEW!” into Google searches!”

    Actually, I typed “Labview sucks”, and this is what I found =)

    I’ve tried many times to get into Labview, as there are so many people around where I work who use it and swear by it, and in fact we have a site license so it’s free if I want it. We have a few NI-DAQ cards around too, with great-looking features on them. However, the only times I’ve managed to get them to do exactly what I want is when I’ve programmed them in C (not really impressed with the NI-DAQmx C API either, but I can work it out), or using COMEDI.
    My issue, I think, is that when I want to do something in LabVIEW, I can see in my mind exactly what I want done, but I get stuck again and again when working out how to implement it, as the LabVIEW way of doing things seems completely unintuitive to me, and I can’t see the way to solve my problems because I can’t see where to look. I have to resort to looking at examples and, when that fails, randomly attaching wires till it works.

    I will try it again, because there’s too much code around here already using it to ignore. But it will hurt.

  12. Pierre says:

    I typed diplomatically “alternative to labview” :-)

    Fortunately, NI provides a C library to access their boards, which I intend to use with Python, in plain text code!

  13. Crypt says:

    My boss is requiring me to use this POS software. I downloaded some VIs for this equipment I have to use, and they make spaghetti code seem like the best-written code on the planet. I can never tell where a program starts. What’s sad is that what took like 20 VIs to read a meter took me a page of code in VB6. I hate Labview. What a horribly convoluted way to program. Assembler code is more readable.

  14. Frustrated Enginerr 9658 says:

    I agree. I use Labview to automate tests and to control lab equipment. It’s impossible! Labview is the opposite of programming. More loops than I thought humanly possible. Timing issues. Code not doing what it should…. It’s crazy.

  15. ali65 says:

    Graphical programming is horrible. C is too low level for certain tasks and not very portable.
    A scripting (interpreted or compiled on the fly) language is preferable. A language with strong OOP support.

    What options do we have? Python, Ruby or Groovy, maybe Perl. Right?

    I wonder when instrument manufacturers will start providing support for any of these.

    I started creating wrappers around the NI and other manufacturers’ C libraries using the Groovy language via JNA, and it works great.

    Or sucks less. :)

    • joe montana says:

      Every instrument takes commands over serial, GPIB, or Ethernet; the only support you need is documentation on the commands. It’s trivial to make all the drivers you need in any normal language before lunch.
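
      As a rough sketch of that claim (all class and command names here are invented for illustration, and a fake transport stands in for real hardware), a thin SCPI-style wrapper over any byte transport – serial, GPIB, or a socket – can look like this in Python:

```python
# Sketch of a minimal SCPI-style instrument driver. The wrapper only needs
# a transport exposing write(bytes) and readline() -> bytes, so the same
# code can sit on top of a serial port, a GPIB adapter, or a TCP socket.

class SCPIInstrument:
    def __init__(self, transport):
        self.transport = transport

    def command(self, cmd):
        # State-changing commands get no reply in plain SCPI.
        self.transport.write((cmd + "\n").encode("ascii"))

    def query(self, cmd):
        # Queries end in '?' and return one line of text.
        self.command(cmd)
        return self.transport.readline().decode("ascii").strip()


class FakeTransport:
    """In-memory stand-in for a serial port, so the sketch runs without hardware."""

    def __init__(self, replies):
        self.replies = replies  # maps a query string to a canned reply
        self.sent = []
        self._pending = b""

    def write(self, data):
        cmd = data.decode("ascii").strip()
        self.sent.append(cmd)
        if cmd in self.replies:
            self._pending = self.replies[cmd].encode("ascii") + b"\n"

    def readline(self):
        line, self._pending = self._pending, b""
        return line


fake = FakeTransport({"MEAS:VOLT:DC?": "+1.2345E+00"})
dmm = SCPIInstrument(fake)
dmm.command("CONF:VOLT:DC 10,0.001")        # configure: no reply expected
reading = float(dmm.query("MEAS:VOLT:DC?"))
print(reading)  # 1.2345
```

      The same wrapper would work against a real port by passing in, say, a pyserial object (which also exposes write and readline) instead of the fake.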

      The only language that needs support is Labview cause it’s so brain damaged. My skin crawls when I hear anyone spout about their ability to “program” in it.

      Labview is actually a good thing in industry because it keeps the stupid idiots that use it as stupid idiots, and it’s so easy to work magic and excel away from them. In fact, your productivity is so great they can’t even comprehend it, and they think you’re cheating or something.

      Another benefit is that it is a surefire way to determine if someone is a worthless idiot. For this reason alone, I give thanks and praise to National Instruments, it is the greatest gift to engineering.

      • chiraldude says:

        Trivial to create an instrument driver? All before lunch? Most instruments are fussy about configs and full of undocumented “features”. Much testing and debugging is required to create robust instrument control code. Whatever language you use, I suspect your code is crap. Generates errors constantly, locks up the instrument, is cryptically formatted, and has no comments. Anyone inheriting your code will likely give up trying to understand it and end up rewriting it in LabVIEW.
        Also, your attitude sucks. Insulting everyone who doesn’t write code like you do? You wouldn’t last a week where I work.

      • joe montana says:

        >>>chiraldude says: Trivial to create an instrument driver? All before lunch? Most instruments are fussy about configs and full of undocumented “features”. Much testing and debug is required to create robust instrument control code.

        All the more reason to keep the Labview “programmers” away from it. Labview lacks any sane way to do version control, for Pete’s sake. Do you even understand what version control is?

        >>>chiraldude: I suspect your code is crap. Generates errors constantly, locks up the instrument, is cryptically formatted, and has no comments.

        This describes most Labview programs I’ve seen, except you missed slow, bloated, behind schedule, over budget, and overpromised.

        >>>chiraldude: You wouldn’t last a week where I work.

        Well, if your shop uses Labview by choice, and has people such as yourself trying to defend it. Then I wouldn’t last a week, nor would anyone else with a shred of ability.

        I should clarify that not all Labview programmers are crap; some are forced to use it, and they get the job done. These guys understand its problems but use it anyway for political or compatibility reasons.

        I only have issues with people who lack the intelligence to understand its limitations and promote its use as the right choice, forcing other people to wade into the cesspool created by National Instruments.

        Labview’s only reason for existence is to enable non-programmers to program. This says it all. Ponder this statement a little while and understand its far-reaching implications.

  16. philly says:

    I totally agree that LabView sucks..
    Who decided that graphical programming is suitable for designing complex measurement/control programs? It’s a stupid decision.

    What if the number keys on our keyboard had pictures of one ball, two cows, three cats, etc. instead of “1”, “2”, “3”? Or what if English were written in “graphic icons” instead of an alphabet? (I mean, a picture of a cow instead of writing “cow”.) Or what if math were taught without mathematical symbols? Especially without parentheses. LabView feels exactly like that..

  17. ali65 says:

    It’s not just graphical programming that sucks – graphical editing in general does.

    Why can’t I define a flow chart in plain text? It would be so much easier to leave the hard work of drawing to some intelligent software.

    Have you ever opened a CAD file which was impossible to edit because the previous author used a non-standard mesh size?

  18. James says:

    Just to give a different perspective, I’m someone with a lot of experience in LabVIEW but very little in text languages. I have the reverse problem to the other posters: things that are easy in LabVIEW are hard for me in a text language. I would imagine that someone with considerable experience in one text language could leverage a lot of that knowledge in adopting another, but would be a complete beginner in trying to program in LabVIEW, not knowing how to do the simplest things.

    I’ve also had the reverse experience of having to deal with someone else’s badly written C code, and I often find it easier to rewrite it in LabVIEW. But much of that is just bad code versus good code, and there is a LOT of poor LabVIEW code out there. I have to deal with that too.

    — James

    • Terrell Jackson says:

      “I often find it easier to rewrite it in LabVIEW.” Don’t you mean draw?

      • James Powell says:

        I usually just call it writing. LabVIEW is more similar to text coding than it is to actual drawing, since it is specifying relationships among symbols.

      • Ben Mead says:

        Ok, James, do you have a simple answer to “Where does a LabView program start execution?” Is there a good rule for that?

        For highly reliable and tightly managed programs, where all changes should have documentation to back them up, I think that the lack of a good way to represent changes in text between version 1 and version 2 should be a non-starter – yet here we are: such a language is well accepted in industry.

      • Josh White says:

        Ben Mead: Where does a program start? You are thinking control flow instead of data flow. Functions execute when data is available, so execution flow follows the wires. As for revision control, it is there as well: there is a visual compare feature, and each subVI stores a revision number every time you save. But just like with text languages, you need configuration management.
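
        That firing rule – a node executes as soon as all of its inputs have values, regardless of any listed order – can be modeled in a few lines of Python (a toy illustration of the idea, not how LabVIEW is actually implemented; all names here are invented):

```python
# Toy model of dataflow execution: a node fires as soon as every one of its
# inputs has a value; listing order is irrelevant.

def run_dataflow(nodes, inputs):
    """nodes: list of (output_name, function, input_names) tuples."""
    values = dict(inputs)
    pending = list(nodes)
    while pending:
        fired = False
        for node in list(pending):
            out, func, ins = node
            if all(name in values for name in ins):   # all inputs "wired in"?
                values[out] = func(*(values[n] for n in ins))
                pending.remove(node)
                fired = True
        if not fired:
            raise RuntimeError("deadlock: some inputs never became available")
    return values

# Wiring: x --[+1]--> a,  x --[*2]--> b,  (a, b) --[+]--> total.
# The list is deliberately "out of order"; data availability, not listing
# order, decides when each node runs.
graph = [
    ("total", lambda a, b: a + b, ["a", "b"]),
    ("a", lambda x: x + 1, ["x"]),
    ("b", lambda x: x * 2, ["x"]),
]
print(run_dataflow(graph, {"x": 3})["total"])  # (3+1) + (3*2) = 10
```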

  19. Todd says:

    I’m Todd from National Instruments. Thanks for the feedback, and please know that we’re aware of many of the perceived problems you mentioned on this thread. In LabVIEW’s defense, just as with spoken languages, it’s common to identify with your first language and perceive only the problems of others while trying to learn them. Additionally, there are hundreds of thousands of LabVIEW programmers around the world, and that number continues to grow.

    Maybe the real problem is that LabVIEW is labelled as “easy” (yes, we’re certainly guilty of that), especially for those without a rigorous programming background. In fact, no language is easy when it comes to complex problems and each requires a large amount of training and experience before mastery/proficiency is achieved. Want to create a basic UI that displays the acquired data from a desktop instrument? That’s a sweet spot for LabVIEW and anyone can do it! Want to control CERN’s Large Hadron Collider (http://sine.ni.com/cs/app/doc/p/id/cs-10795)? No matter what language you choose, it’s going to take some time to do it right.

    National Instruments offers a ton of online resources, all free of charge, including tutorials, webcasts and example code (ni.com/zone). Over 130,000 registered users on the NI Discussion Forums can help answer your questions (ni.com/forums). And there are training courses offered around the world. If you are using LabVIEW and want to see our R&D make changes, you can submit your ideas and vote on those of others at ni.com/ideas. Many of the top rated ideas are currently being implemented in our next major release.

    I totally understand your frustrations, but if anyone on this thread is still using LabVIEW and feels upset, email me directly and I’ll do what I can to help get you the help you need to be successful with LabVIEW.

    Todd Sierer
    LabVIEW Community Manager, National Instruments

    • jshoer says:

      Hi Todd,

      I’m glad someone from NI has discovered this; and I understand that there are reasonable arguments for structuring LabView the way it is. That said, I still think the graphical programming language is incredibly frustrating. The top search strings that hit my blog are all variations on “LabView sucks” and “I hate LabView,” so I know I’m not alone.

      I don’t think the problem is a lack of tutorials; rather, it’s that the graphical programming editor does not allow users to put into practice many of the programming principles we learn when we pick up other languages. In a technical profession, users will almost certainly pick up something like C, C++, Matlab scripting, Python, or any number of common programming languages, and when we do, we learn to do things like give variables descriptive names, write comments, use whitespace to set off code for clarity, allocate and manipulate arrays with speed of execution in mind, vectorize for loops as array operations where possible, etc. Many of these principles cannot be implemented in LabView. I almost always have code readability issues, as I have to trail wires all over the place, and when I refer to a VI I haven’t opened up in a while, I easily forget what those wires are doing and why they go where they do. I have to spend a significant amount of time tracing them backwards until I find some input or control, rather than just being able to read off a descriptive variable name.

      I’d like to suggest a solution: some kind of option within LabView to switch to a text-editing mode. Given the way computers work, I know there’s some text-based interpreter buried somewhere under LabView’s bubbly exterior, so I wouldn’t think this would require a whole overhaul. I do a lot of coding in Matlab, and if I could just port my scripts and my scripting language experience right over into LabView when I head into my lab, that would be amazing and a lot less frustrating.

      • Todd says:

        Thanks for the feedback. Interestingly enough, most (if not all) of the programming principles you listed can be done easily in LabVIEW, but what strikes me is the fact that you are used to programming in a certain way and don’t want to have to change. That’s our problem to address: we need to continually adapt LabVIEW to meet the needs of new potential users, and your comments drive that point home.

        We have now built several nodes in the development environment to allow text programming in LabVIEW. For example, we introduced MathScript several years ago to let people bring their own .m files into LabVIEW and interface their algorithms with hardware. After four years it runs as fast as or faster than other math languages, and it can be deployed to RT systems. We also have a VHDL node that lets FPGA developers bring in their own VHDL and run it side by side with LabVIEW FPGA code.

        As far as a C node in LabVIEW, I’m not sure what our strategy/roadmap is there but I’ll be sure to post anything I find to this thread.

      • James says:

        Like Todd, I’m confused by most of your criticisms. Why can’t you give your variables descriptive names or write comments, for example? That seems to be trivially easy.

        — James

      • David says:

        Are you aware of LabWINDOWS, also known as CVI? It is in reality the crown jewel of National Instruments software. It is as good as LabVIEW is bad.

        If you’re a C programmer you’ll be productive in CVI in one afternoon with nothing more than the introductory document that comes with it. You won’t need stacks of books, training courses and materials, tutorials, examples, or forums. You won’t even find books available for CVI. They’re simply not needed.

        For some reason NI’s advertising of CVI is very subdued. I guess it’s the LabVIEW lock-in and the income NI gets from things like LabVIEW training and books that keeps it at the forefront. Also, there’s a mysterious coolness concept associated with LabVIEW.

        Many of the people that support LabVIEW are managers who can’t use it themselves. But they have the impression that they could use LabVIEW if they wanted to. That’s the deceiving part of LabVIEW: it’s not nearly as easy as it seems.

        Give CVI a try. You’ll be glad you did.

      • Mats says:

    The biggest mistake you NI people have made is to promote LabVIEW much more than LabWindows/CVI. I have heard C/C++ people complaining about LabVIEW since the 1990s. CVI is a better solution for those who enjoy text-based programming. Today, electronic engineers replace LabVIEW with Python.

    • Say No to LabVIEW! says:

      Responding to Todd on 10/6/09:

      First, thanks for stepping into the lion’s den.

      Regarding your stance: I have reluctantly learned and adapted to many different programming languages in the 15 years I’ve been programming (I started in Pascal). Yes, it is frustrating to learn a new language, but I have found no language as frustrating as LabVIEW. And it is not because I am a novice who entered with the expectation of an “easy” interface. It is because the foundation behind LabVIEW is fundamentally flawed. My biggest problem with LabVIEW:

      It is a proprietary language. This fact means that its development is not governed by an unbiased standards committee. In essence, it’s very self-serving and, I have found, caters to anyone “on the inside” – something I have observed when sitting down with developers at HQ in Austin. I feel that those inside the LabVIEW universe get caught up in the rhetoric and are unable to view their product without bias, something that comes inherently with non-proprietary software.

      This fact of course extends to the development environment as well. Since the language is proprietary, the compiler is also proprietary – and hell, why not make the hardware proprietary too. Without a doubt, if you buy into the NI universe and purchase their language, IDE, hardware and technical support, then everything is more or less roses. But as soon as you talk third-party anything, it all goes down the drain.

      Take the red pill.

    • MarcoG says:

      Maybe there are thousands of Labview “developers” all around the world just because NI proposes this stuff as an industrial standard that is really simple to use because it’s graphical. Maybe this proposal is made to incompetent management who believe every word you say. It’s just my theory, but it’s based on my experience. I really would like to see how much time a skilled Labview developer needs in order to create a simple application that calculates a number in the Fibonacci sequence (with shift registers and all this stuff), and then compare the time spent with the time needed in any programming language you want.
      In my opinion, Labview is a horrible way to try to do everything without any knowledge: the final result is you do everything in an amateurish way, with a lot of time spent and low efficiency, and the final product is not maintainable. Meanwhile, how much money did you pay for the software? And what is your final efficiency? And do you use all the hardware resources?

      • chriggy says:

        MarcoG, I will comment on this as an independent programmer. I have never worked for NI nor have any relationship to them other than using the program.

        Disclaimer: I have about 7 years experience with Labview and about 5 years experience with C/C++.

        To answer your question, I could implement your Fibonacci sequence problem in Labview in about a minute (including the GUI). Once you get used to shift registers, it becomes second nature.
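
        For comparison, the text-language equivalent of a shift register is just loop-carried state – a value reassigned on each pass through the loop. A minimal Python sketch of the same Fibonacci problem:

```python
# A shift register carries a value from one loop iteration into the next.
# In a text language that is simply a pair of variables reassigned in the loop.

def fibonacci(n):
    """Return the n-th Fibonacci number, with fibonacci(0) == 0."""
    prev, curr = 0, 1                    # the two "shift registers"
    for _ in range(n):
        prev, curr = curr, prev + curr   # shift values into the next iteration
    return prev

print([fibonacci(i) for i in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```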

        My previous employer’s stance was to prototype in Labview, and then port to C/C++ to avoid licensing fees.

        Having done both, I’d say that insofar as Labview is the proper tool for the job (and it is not the proper tool for every job), the time and productivity savings are immense. On average, I can write a Labview program in about 1/4 to 1/3 the time it takes in C/C++.

        I’ve also had .NET developers tell me that they could not have developed a similar app as I’ve developed in Labview in nearly the same amount of time.

        So, food for thought. Labview is not the proper tool for every job, but when it is, it’s very efficient.

      • abhinandan says:

        I agree with Chriggy….

      • David B says:

        It’s painfully obvious most people trashing LabVIEW on this thread have very little experience with it or have had to read code that was written by a hack instead of a well-experienced LabVIEW programmer. I remember back in the early ’90s, when LabVIEW first came out, they would have programming competitions between the LabVIEW users and CVI users during NIWEEK. It was funny to watch. Every single time I watched, the first-place CVI programmer was way behind the last-place LabVIEW programmer. When I entered the challenge, I had time to finish off two beers between the time I finished the challenge and the time the first “C” programmer finished.

        Like any language, you can make a real mess of code in LabVIEW. It’s advertised as “easy”, so a lot of non-programmers create some really nasty code, but that’s not the fault of LabVIEW. NIWEEK no longer invites the CVI programmers to enter the programming race because it’s too embarrassing for them. They still have the challenge between LabVIEW programmers, and it’s amazing how fast a good programmer can crank out code.

        I was a Basic programmer, then a “C” programmer and later became a “C++” programmer. Since I switched to LabVIEW, I never want to go back.

        NI offers really good training for people who want to learn how to use LabVIEW. It doesn’t take long before your brain starts thinking in terms of LabVIEW diagrams instead of text code. It’s also much easier to visualize and program parallel tasks on dual processors in LabVIEW. I could go on and on.

      • rwalle says:

        Reply to chriggy: Sorry, but nobody uses LabVIEW for Fibonacci. It is a meaningless example. (BTW, I can do that with Python within 30 seconds, so 1 minute is still too long.) You might try these things that people actually do with LabVIEW and see how cumbersome it is: (1) basic file I/O, (2) string manipulation, (3) error handling (God, I miss the simple try-catch clauses in normal programming languages), (4) HTTP requests (GET, POST, etc.), (5) basic arithmetic, like calculating (a/2 + b) * c + 3.
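
        For reference, items (1), (2), (3), and (5) on that list really are one- or two-liners in Python; the HTTP case is skipped here only to keep the sketch offline, and the file path and strings below are made up for illustration:

```python
import os
import tempfile

# (5) basic arithmetic, e.g. (a/2 + b) * c + 3
a, b, c = 4, 3, 2
result = (a / 2 + b) * c + 3   # 13.0

# (3) error handling: a plain try/except clause
try:
    value = int("not a number")
except ValueError:
    value = 0                  # fall back on bad input

# (2) string manipulation: split, strip, and uppercase a channel list
channels = "ai0, ai1, ai2"
names = [s.strip().upper() for s in channels.split(",")]

# (1) basic file I/O (a temp file keeps the example self-contained)
path = os.path.join(tempfile.gettempdir(), "readings_example.txt")
with open(path, "w") as f:
    f.write("\n".join(names))
with open(path) as f:
    lines = f.read().splitlines()
os.remove(path)

print(result, value, names == lines)  # 13.0 0 True
```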

        These things (and many others) that take only one or a few minutes to write with, say, Python or Java, can take tens of minutes on LabVIEW, which includes searching documentation, getting confused, looking for another example, still confused, searching documentation about something it uses, trying a few times and maybe eventually getting it correct.

        And what makes this worse is LabVIEW’s HORRIBLE documentation and community. What a nightmare it is. A lot of the documentation on their website is just explanations of terms, and you still have no idea how to actually use it after reading it. So you search again and find some official or unofficial examples on the forum. But they only deal with the most basic cases, and you still have no clue about the correct way to deal with your case where you need to add an additional parameter. Maybe you eventually figure it out, or maybe you don’t.

    • funan says:

      Please for god’s sake, just add the zoom in and out function.
      I think this has been requested since the dawn of the dinosaurs.

  20. ali65 says:

    Dear Todd,

    I tried your link, but

    We are sorry, but an error has occurred with our application.

    Please try again, or if the same problem occurs, come back to our site later.

    If you would like to help us fix the problem, please Contact Us.

    I don’t want to fix your problems, but please do not try to use failed projects as a good example, especially not until you have proved that your product did not cause the failure.

  21. dr jimmy says:

    Labview really just does not work. The people at NI will keep offering suggestions on the problems, which lead to new problems. The projects just never work in the end. These types of problems seem to get worse as they make more and more hardware and software.

    I’m not sure how NI has made it so long, but it’s a credit to their salesmanship and arrogance. They will keep offering solutions and new hardware to buy, and try to make you think you don’t know what you’re doing. But after 20 years of dealing with these clowns, I can tell you: there is never an end to it.

    I sincerely believe that NI is one of the biggest problems with American engineering. People get suckered in and never learn to be real engineers and programmers. NI is a disease.

  22. Ali65 says:

    Dr Jimmy,

    Big corps (monopolies) often produce bad products selling for high price. But their sales people are not clowns, and these corps are not the disease. The problem is with the corrupt or stupid suckers who buy it.

    And NI also developed a few good and competitively priced products. (unlike MSFT)

  23. D Shrestha says:

    I am getting tired of LabView’s compatibility with newer devices. I am trying to move away from it for all my new projects. If you are not already sucked into it, it is probably a smart thing to stay away from it and from NI cards. I think once upon a time it was good. But now it is worthless unless you upgrade to their latest version even if you don’t need it, and pay for their phone support. Stay away from it.

  24. JC says:

    STUPIDLY HARD TO USE!!!!!!!!!!

  25. JC says:

    oh yeah, i left something out.

    my teammates want to comment:

    Connor says: labview is frustrating.

    Amos says: It sucks crap.

    yep. basically.

  26. dr jimmy says:

    The funnist part of NI is meeting their “engineers”. Most of them are just clowns that studies liberal arts in college. Its no wonder this product has such problems, its just total clowns that work there..

    Eventually, Matlab and other programming languages will phase out NI as they start to incorporate driver capability. America needs to get rid of this disease. This is the type of product you want to have people using over in India or China.

    • David says:

      I agree with everyone that the LabVIEW graphical programming language is a sales gimmick to say the least. NI’s “ease of use” claim is misleading at best and dishonest at worst. But I’ve had no problem with NI personnel. NI support is second to none in my experience. The people are capable, friendly, and try hard but the product is weak. LabVIEW can make the best engineer look like an idiot. There’s only so much they can do to help.

    • shru says:

      Do you think Indians are supposed to be using what's scrap to you? You bugger, idiot, fuck yourself… you guys are the bigger scrap in this world.

  27. jhanus says:

    Well, NI, you certainly know how to complicate some brutally simple things, so thank you for that.

    Programming in LabVIEW is completely unintuitive; in the end I'm connecting tiny lines with no logic at all. When I program I feel like a complete idiot. It's very irritating; I would rather program in frickin' assembler.

    I simply despise this program, long live C/C++.

    And one question for NI: where the hell does a program start? Totally useless.

  28. Janssens says:

    Have you read what they wrote here?
    Some of their motivations are really stupid.
    They changed names/concepts just to be "easier", and as a result they only partially implemented OOP.
    Every accessor method is a file on your hard disk… how stupid is this?
    C# .NET is way easier than labview (not to mention more powerful).
    Who cares if I don't have the drivers for NI instruments?

    • Dan Kopelove says:

      Couldn’t agree more. Furthermore if you change the name of the data that labview is accessing, it breaks everywhere in your project. I’m going postal over this.

  29. Me says:

    Hi Todd.

    The list also needs:
    Constantly dropping support for old functions. They put tons of examples on their website, but when I go to use one with the current version of LabVIEW I get, "NI no longer supports this function."

    I asked a question on the message board about a VI example on their website. I was told the 2006 example I was referring to was "old". WTF!?

  30. kroverstreet says:

    I totally agree. LabView is only the industry standard because NI makes the most diverse set of data acquisition hardware. NI still provides C libraries, so there is really no need for LabView. I HATE LabView, with a violent passion. It is feel-good programming at its worst. The biggest problem I have with it, though, is when you buy a $100k laser system and you get some half-assed LabView program to run it with. Then the LabView code craps out and the laser shuts off, leaving me with a $100k paperweight! LabView facilitates replacing robust electronics with unstable computer code that slows my experiments to a halt. I HATE LABVIEW!!!!

    • pockelscell says:

      kroverstreet, I agree with you completely. Actually, I work at a company that makes solid-state lasers, and my stance is that lasers, or even most machinery, can be run with C or just with simpler serial communication; if you want, you could slap a Ruby/Python/Groovy/Java/.NET interface over it, if that's necessary of course. Labview is like the idiot's guide to programming. The NI salesman can have his point of view and so can I, and I definitely believe that I don't need to be spending money on bloated corporate hardware, especially when you or even I can run something over RS-232 or ethernet through HyperTerminal.

  31. nomme_de_guerre says:

    I have to say, my spirits have been lifted immensely reading this post and associated comments. I am a hardcore C/C++ programmer and have been resisting the encroachment of Labview fascism where I work. I've felt increasingly isolated and alone, wondering if there is a future for an old-school programmer like myself. Through the darkness I've been comforting myself with the works of K&R, Stroustrup, and Knuth. This blog post and comments have given me hope. Vive la résistance!

  32. Joe says:

    Try using TestPoint. You can still buy it. I’ve been using it for years. No wires, nice programming environment, sequential programming gets put behind the objects. I wish someone would redevelop it.

  33. M. L. says:

    On the one hand I would love to be able to interface my instrument with Python and rapidly build a GUI with Glade. But on the other hand, I would also say that most of the problems described here come from bad code. I am not associated with the author whatsoever, but I would still recommend Blume's style book to anyone starting with labview (better to read it before starting), and also some general book about programming like the excellent and free Think Python http://www.greenteapress.com/thinkpython/thinkpython.pdf

    My own personal frustration, which cannot go away by improving programming style, comes from the impossibility of opening programs written with different versions of labview, combined with the price per seat. I have a license for Labview 8.6 on my personal computer, but in the lab there is 7.1, and 8.6 will simply not allow me to save my code for 7.1. All properly licensed software from NI, and no compatibility within NI's own software. I do not use any fancy features either. This is just insane.

    So there: no real hate for the language, no love either. But plenty of disdain for the people at NI.

  34. LackView says:

    I need to add a field to a quite complex class cluster, and labview will smartly shuffle every reference with a similar name… Any volunteers? Maybe the ones repeating the same "bad developers write bad code" adage for years?

  35. RON says:

    Labview is not a programming language, it's a service. C and C++ are programming languages because you can use them to create LabView; you cannot use LabView to create LabView. Think about it. Labview is a service: the NI software team does all the work for you so that you can just drag an icon and make things happen.

  36. Dale M says:

    Having programmed with LabView and other scripts/languages for the last decade, I can at least describe my experiences with the benefits of each method for automation and programming. Every company I have worked for has had a preference of some kind (and an excuse, for that matter) for using one programming format over another.

    As far as developing a GUI, or having built-in functions that can call instruments and interfaces (like a serial or USB port), a program like LabView comes in very handy. Using Visual C++ or Visual Basic (though C# has improved in this area) to develop a GUI takes time to define your controls, and you often have to incorporate outside libraries (like VISA and a secondary USB/serial driver) to communicate with other devices. Python and Tcl/Tk are great scripting languages with reasonable speed, but their graphical development leaves much to be desired and often does not offer all the controls you would like.

    The one major, and I mean MAJOR, problem I have with LabView is the editing and reading of long programs… Working with those wires and trying to follow the program path is a nightmare without a really comprehensive guide. And if you have too many sorting mechanisms in your LabView code, it can slow the execution to a crawl.

    On the other hand, I would call LabView a programming type of language, as you do have to define the types of data you use, you have data structures and data flow, and it compiles to a stand-alone program.

  37. Thad says:

    This site was at the top of the Google results when I searched for "labview sucks".

    Another engineer and I were recently given the task of bringing our old machines up to date with data logging and possibly full machine control in the future. Our boss liked Labview, so we purchased a copy, and let me tell you, this software is in no way simple to use, nor is it intuitive. Labview's endless arrays of little boxes with silly "wires" make little sense to an engineer with a degree. The problem lies in the fact that as engineers we are taught text-based programming languages in college, such as C/C++, Java, Visual Basic, etc. But Labview goes against everything you learned in your programming classes. I just want to grab data from an RS-232 gauge and have it display neatly on screen, with the ability to log that data to a database or even a text file. It took us two days of searching through one outdated tutorial after another to finally get the gauge working. And how am I supposed to debug this crap? We wound up downloading an RS-232 library and wrote an application in C# in about 2-3 hours and about 150 lines of code. That's what we and just about every engineering student on the planet want to do.

    And $4300 USD for the professional edition. Is that per seat? Are you f*****g kidding me? Even the bloody base version costs $1250 USD. Student copies cost $100, but you have to be a student (read: valid student ID) and have $100 to spend. What about hobbyist types? We have to pay $1250 for the base package? Then, to top it all off, it's Windows-only, locking out Mac and Linux users. Guess what, NI: we would rather use the C libraries and interface with hardware from C, or even use C inside Python. Then we can choose the OS of our liking and be done with it. The only redeeming quality of your software is the ability to rapidly develop custom control UIs.

    NI, the bottom line is this: your software goes against everything real engineering students are taught in school. And don't you dare think that the simple answer is for us to change our "ways". Our ways, which have been taught for decades, have served the world quite well. And if you think schools should teach labview to students, you are barking up the wrong tree. It's not a true standard, it's not cross-platform, and the cost is exorbitant.

    Perhaps in the future I would like to evaluate the possibility of developing an open-source alternative based on Python, C, MatLAB, and COMEDI (or perhaps other languages like Ruby), then develop a GUI tool to rapidly prototype a UI for control/display. The back end stays text-based and sane, and works on any OS and IDE/compiler. That would be ideal for my needs, and I am sure many others would agree.
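
    (A minimal sketch of the text-based approach described above, in Python. Everything instrument-specific here is hypothetical: the reply format `PRESSURE,7.5E-03,TORR`, the port name, and the query string would all need to be adapted to the real gauge. Only the stdlib parsing/logging half is shown; the serial I/O itself would typically use a third-party library such as pyserial.)

```python
import csv
import time
from pathlib import Path


def parse_gauge_line(line: str) -> float:
    """Parse a reply like 'PRESSURE,7.5E-03,TORR' into a float (torr).

    The reply format is hypothetical; adapt the field layout to your gauge.
    """
    fields = line.strip().split(",")
    if len(fields) != 3 or fields[0] != "PRESSURE":
        raise ValueError(f"unexpected gauge reply: {line!r}")
    return float(fields[1])


def log_reading(path: Path, pressure_torr: float) -> None:
    """Append one timestamped reading to a CSV log file."""
    with path.open("a", newline="") as f:
        csv.writer(f).writerow(
            [time.strftime("%Y-%m-%d %H:%M:%S"), pressure_torr]
        )


# With real hardware, `line` would come from the serial port, e.g. using
# the third-party pyserial package (names here are illustrative):
#   import serial
#   with serial.Serial("COM1", 9600, timeout=1) as port:
#       port.write(b"PRS?\r")
#       line = port.readline().decode("ascii")
#       log_reading(Path("gauge_log.csv"), parse_gauge_line(line))
```

    The point is not the specific library: the parse/log logic stays a few small, testable functions in any text language.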

    • Thad says:

      I retract my statement about Labview’s inability to run on Mac or Linux, it was a mistake on my part. But it still stands that the software is grossly overpriced for hobbyist types.

    • Theo says:

      Hmmm. Old post, but I would like to put my two cents in here. I have been developing a somewhat advanced control program for a lab (currently 1 x 8 control loops, scalable to multiples of 8) with many instruments. I certainly understand the frustrations you guys have, especially when it comes to debugging or looking at the code. However, I get the feeling that a lot of your criticism is because you are comfortable with other environments. I am a mechanical engineer with minimal programming outside of matlab and labView. However, the post above says, "We wound up downloading an RS-232 library and wrote an application in C# in about 2-3 hours in about 150 lines of code. That's what we and just about every engineering student on the planet wants to do." Really? I haven't programmed in C#, but I can tell you that as an intermediate-level user of LabVIEW I could write that program in 10 minutes. LabVIEW is a difficult program for a lot of programmers because it is difficult to look at.

      • gs says:

        From reading Thad's post, I don't agree with you that most of his frustrations came from being unfamiliar. It is from LabView being so expensive, and I totally agree. We moved away from LabWindows (another NI product) also because of its cost. At the time we were using it, NI was charging $1000 for every computer the executable was distributed to that did not have an NI IEEE card. Well, when I was pushing out an app to 30 computers that did not have IEEE cards, the cost became outrageous. We quickly switched to MS Visual Studio, which was a fraction of NI's price with no license fee attached to the distributable. The other problem I had with NI software is that it was bloated. The amount of resources it consumed was crippling our systems.

  38. Jamie says:

    Yet another result from Googling 'I hate LabVIEW'.

    I have 4-5 years of embedded ANSI C/ASM experience, along with (lately) 4-5 years of LabVIEW experience, so a bit of a dual perspective. During my time as an embedded developer, I had to interface with folks using LabVIEW for the front-end interface. I *hated* LabVIEW then. I couldn't understand it, it was non-intuitive, and I really despised having to work with it in any way. I just wrote the serial packet handling, and they took it from there.

    The learning curve of LabVIEW is *not* as easy as many make it out to be. It took me a good 1-2 years of direct side-by-side experience with a fellow LabVIEW developer to really get proficient at it. Once that happened, I could more quickly develop a simple front panel to display results from an external instrument (for example).

    However, what I've found is that the more GUI-rich you want or need to make your program, the less appropriate LabVIEW becomes. Once the acquisition is taken care of, the level of customization with a GUI quickly tapers off; you find yourself performing more 'hacks' and 'workarounds' than you normally would with, say, Visual Studio. In short, LabVIEW is perfect for an R&D or T&M environment, not so much for production or end-user applications, in my opinion.

    The biggest thing that I've come to realize (and I agree with many posts above) is the insane cost of ownership that comes along with NI hardware and software. They are *constantly* shoving new products down your throat, with little concern for backwards compatibility. Once you're 'locked in', as one poster put it, there's little turning back. You *have* to stick with development in LabVIEW or face, basically, ground-up redevelopment.

  39. Thad says:

    Since writing my last rant, we have ditched the idea of using LabVIEW completely. A costly loss, but within three days of my fellow engineer hacking away at C# code and me putting the hardware together, we now have an application that monitors multiple RS-232 gauges along with basic data logging. Tying into the analog stuff and machine control will be done with Opto22 hardware. Sorry, NI, your clumsy development environment is just not practical for our needs.

  40. Tom Cruise says:

    It’s like programming in Egyptian hieroglyphics. http://imgur.com/NSasa

  41. Charlie says:

    This freaking structure is like an over-protective parent, not trusting you to get at the data directly.

    Or maybe it's a necessity of this crazy commitment to graphical programming. The data is naturally represented in a text structure; to choose to work with it via lines and symbols is masochism.

  42. Minty says:

    Just to enter the lion's den and try to remove a couple of splinters from its paw. Thought I'd put in a few remarks.

    I've been writing Labview for 10+ years; Delphi 10+, PHP 5+, C 10+, and C++ 8+ (amongst other languages). So I have a bit of experience in this arena. I would also point out that I started with Labview and moved to other languages, rather than the other way round.

    Some of the concerns (adding comments, naming variables, etc.) are just inane. If you cannot write text, then you shouldn't be programming in any language!

    But for the more constructive comments on issues, here is my 2 cents worth.

    The biggest problem for migration from other languages is that Labview is a change of "PARADIGM". Being "data flow", many of the things that you would normally have to think about or concern yourself with simply don't exist, nor do they need to (state information, memory management, pointers, etc.). The change in thought process is equivalent to going from imperative to object-oriented. If you have never learnt a language, then this is not an issue. However, if you have several "imperative" languages under your belt, then it becomes a lot harder to change your mind's-eye picture of the program.

    I'm lucky: Labview represents a very close analogy to my mind's representation of a piece of software (in any language). And whilst at the lower levels you may argue that the "VIs" are synonymous with "functions", "procedures", etc., at the mid levels the VIs become more like a UML model of the system. If you can picture a UML chart with multiple layers, you have a VI hierarchy. This is the real power of Labview.

    But it's the implementation specifics that seem to cause the most problems. And this boils down to the emphasis on the data rather than the "container". In Labview, we don't care where the data is, only what it is. We don't have to worry about array over-indexing, explicit casting, or memory allocation/freeing (much like PHP in that sense). So many of the features of other languages are just irrelevant.

    I could prattle on for a lot longer, but I'll conclude with a few "pros" and "cons": a few things that labview is superior at, and those it is inferior at, that haven't already been mentioned.

    No programmatic memory management (very rare to see a GPF in LV)

    Real world control. Huge number of pre-packaged libraries for instrumentation control.

    Enormous choice of high level functions built in (graph fitting, regression, integration/ differentiation etc, etc)

    Very appropriate for “top-down” or “bottom-up” design.
    Data flow is particularly effective for these topologies.

    Cross Platform.
    Well. Aren’t they all?

    Rapid Development Prototyping.
    Very quickly get functioning modules which can be checked in situ in the development environment.

    Inherent parallelism.
    Executing parallel code is the default. Sequential operation has to be designed (by connecting VIs in "series").

    Poor OOP implementation.
    Very cumbersome, generates a huge number of files, and is "buggy".

    On a function-by-function comparison basis, generally, labview is a lot slower than other languages such as C or C++.

    Requires a “HUGE” run-time environment (a bit like .NET on steroids)

    Cumbersome for UI intensive Apps.
    Some applications actually require more code to try and control the UI than to do the hard stuff.

  43. Russell says:

    I arrived here Googling “LabVIEW alternative”.

    I need to build a reasonably complex automated battery test system.

    My electronics experience is considerable; my programming experience is limited to assembly (various), ANSI C, and Visual Basic, oh, and LabVIEW (forced upon me by a previous employer). What a CROCK of CRAP.

    The idea of a 'graphical' language appeals to me, but it CERTAINLY WON'T BE LABVIEW!

    Any suggestions for an alternative? Otherwise it’ll be back to VB (or I may give C++ a go).

    • Thad says:

      How about giving C# a try? Using Visual Studio 2010, it is pretty damn powerful and simple. It took a bit of getting used to coming from C++; I kept writing functions to do things that C# and .NET have built in. I wrote a fairly robust application to control a Galil X-Y table in about a week.

      Give it a try. A friend wrote an application to monitor three vacuum gauges in two weeks, including a web reporting interface that connected to an SQL database. My brother works in the gaming industry as a programmer, and C# is the preferred language for rapid tool development. You can't go wrong with C# if you are looking to quickly develop applications.

      C# has a C++-like syntax, a Java-like structure, and Visual Basic-like rapid GUI application development. It's the best of the three. And yes, I have done C/C++ development before under both Windows and Linux, so it's nice to just make something work with minimal effort.
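
      (The gauges-to-SQL-database piece that keeps coming up in this thread is small in any language. Below is a rough stdlib-only sketch, in Python rather than C# for brevity; the table and column names are made up for illustration.)

```python
import sqlite3
import time


def open_log(db_path: str) -> sqlite3.Connection:
    """Open (or create) the readings database. Schema is illustrative."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS readings (
               logged_at TEXT NOT NULL,
               gauge_id  TEXT NOT NULL,
               value     REAL NOT NULL
           )"""
    )
    return conn


def record(conn: sqlite3.Connection, gauge_id: str, value: float) -> None:
    """Insert one timestamped gauge reading."""
    conn.execute(
        "INSERT INTO readings VALUES (?, ?, ?)",
        (time.strftime("%Y-%m-%d %H:%M:%S"), gauge_id, value),
    )
    conn.commit()


def latest(conn: sqlite3.Connection, gauge_id: str) -> float:
    """Return the most recent reading for one gauge (rowid breaks ties)."""
    row = conn.execute(
        "SELECT value FROM readings WHERE gauge_id = ? "
        "ORDER BY logged_at DESC, rowid DESC LIMIT 1",
        (gauge_id,),
    ).fetchone()
    return row[0]
```

      A web reporting layer would then just query the same table; the acquisition loop and the reporting code never need to know about each other.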

      • chriggy says:

        Having worked with both LabView (8 years) and C/C++ (5 years), I will say that the tool to use depends on the application.

        Labview is better for some, C/C++ for others.

        I don't know the complexity of the web interface your friend designed, but any halfway decent LabView programmer would be able to implement the monitoring of the gauges and shoving the data into a SQL database in less than a day (more like a couple of hours).

        That leaves almost the rest of the two weeks for the web interface.

        And some applications work best using a dual approach(C/C++/C# for the web reporting, Labview for the back end, perhaps?)

    • yonose says:

      Hello Russell

      Have you considered Python (IronPython if you wish) and the D programming language? Python is slow, but it is very good for developing applications fast; I must say, slower but easier than C#! And D, I don't know why it is not popular, because I find it more adaptable than C#, albeit not as easy.
      Don't get me wrong about C#; it's just that I'm aware of some of its limitations.

      Mono nowadays is really good, but it looks like a legally walled garden… The D programming language takes into account the advantages of Java and the disadvantages of C/C++. D's runtime is pretty light, and there's a community of developers on the digitalmars site.

  44. jtstand says:

    Sorry guys, I know this is a hate discussion, but I can't keep to myself why a developer should love LabView:

    1. LabView is so damn difficult to develop in that every task takes 10 times as long compared to some non-graphical programming language. This is great, because more developers can make a living! The more difficult LabView becomes, the better!

    2. If you charge a lot for a very small change in a LabView vi, nobody will question it. If you did the same with a software product based on a non-graphical language, some customers might run a diff on the sources and ask why it costs thousands of dollars to change two lines. Don't you love LabView?

  45. Dave says:

    Finally! Someone who is not afraid to speak the truth about LabVIEW.
    The company I used to work for shoved LV down our throats because one engineer convinced the higher-ups to adopt it. Expensive move. We had to bring in a $100/hour LV programmer to re-develop a device OS that was already done in VB.
    The LV OS was buggy and no one understood it. Back comes the $100/hour guy to “fix” it. Oh, and he offered to teach LV to the engineers. For two weeks. More $$$ down the tubes.
    A change in the device mandated a change in the OS. Bring in that guy again.
    Then we needed two devices to run at the same time… “The OS needs to be redesigned!”, he said. Talk about job security!
    Now, the company is in trouble. They need some work done that is similar to the system they had before and they say that LV can’t do it. It probably can do the task, they just don’t have the expertise. Call the $100/hour guy again?

  46. Donald Murray says:

    Labview does suck :-) At last, people who agree with me.
    "Labview is meant for people who don't know programming"
    is such crap. It's meant for engineering. If you are an engineer and you don't know C, you are in serious trouble.
    I was supposed to program a project years ago in labview, but I just coded it in C. I did it faster, it ran faster, and I was able to get it to do exactly what I wanted, which I wasn't able to do using labview.

    If you must program visually, look up Visual Programming Language (VPL) on Wikipedia; you'll find about 50 of them, many open source.

    If labview were free, I'll bet it might be made useful by open-source developers, but as it is, it's kinda useless.

    What would be better is something that lets you program visually, like the RAD GUI in wxDev-C++ for wxWidgets: you lay it out visually, then generate code and compile that. That "might" be useful… but purely to get a product out faster.

  47. SolidSnake says:

    Here’s how I see it.

    LabView is a great tool for learning programming. It removes the hurdle of envisioning things like "objects" as round bubbles and "classes" as rectangles (the stuff professors try to teach you in college) and relating those images to often-confusing syntax.

    I have two experiences with LabView. The first when I knew nothing about programming and the second when I learned about programming.

    Though LabView can do OOP (and GOOP, to closely match text-based language structure and rules), it's often not what NI "sells" to you as a capability. Moreover, it's kind of confusing, especially implementing OOP, because LV uses data flow rather than performing operations in sequence.

    I am currently (1/7/11) working on a project that uses OOP and am learning a lot about this other side of LV, but as some have said here already, if you "know" the key concepts of programming, why would a seemingly sane institution even bother with LabView?

    I look at it this way: LV is the Apple of the data acquisition world. Expensive and all wrapped in a neat package.

    The free alternatives often don't have documentation, and one has to do extra searching to find compatible hardware, whereas NI provides instruments that can already communicate with many, many third-party peripherals.

    NOW I will say that most of the problem with LV is the fact that it follows a data-flow paradigm, which, as was said earlier in these posts, is slow, more in terms of what you already know you want to implement versus what LV must chug through to "maybe" get the response you expect. And that's with proper wiring!

    I can see through first-hand experience under my mentor that LV, when using OOP, can be very powerful in terms of quickly organizing an OOP program.

    In terms of wiring, that's really a matter of personal discipline and of an agreement that your superiors understand the importance of proper wiring (but of course they only expect results).

    You can potentially (with good wiring and instantiation practice) show more of what a program is doing in one screen than by reading through lines and lines of running syntax.

    The things that really do suck about LabView are the prices for their software and equipment, and the fact that data flow (if you're not fully aware of what it means and how it impacts your work) can bring you to your knees.

    I would say if you are an independent developer and have the money to spend, get LabView, because you can code without a boss hurrying your "artwork".

    If you work for a business that does not appreciate the importance of good code and documentation, then forget it.

    If you work for a mindful employer who understands the importance of doing things in a clean, centralized way from the start, mention LabView as a choice.

    My honest opinion is that if you are a great programmer/developer…build some Android apps instead, F**** test engineering.

  48. Richard says:

    SolidSnake: LabView is a great tool to learn programming.
    I cannot disagree more!
    It is a terrible way to start if you want to learn programming concepts.
    Almost every other language can easily demonstrate simple programming concepts with a few lines of code.
    The concept of a "for" loop or an "if" statement makes absolutely NO {expletive deleted} sense in LabView.
    It is possible to look at a "for" loop in nearly any other language (except possibly Forth or Ada or Lisp) and immediately see the parts of the loop… but in LabView it is a drawing of a loop with an arrow and numbers. There is NO indication of the meaning of the numbers.
    One can program explicit concepts in BASIC or C or FORTRAN, etc., etc. that are not readily apparent when done in LabView.
    Also, it is very difficult to describe a Labview program over the phone, while one can read a BASIC program over the phone with little difficulty.
    Same with listings in a book or magazine. If the LabView program is not constructed with printing in mind, it can be a nightmare. (Actually, some LabView constructs deliberately hide components of the statements making a simple printout impossible.)

    • SolidSnake says:

      “One can program explicit concepts in BASIC or C or FORTRAN, etc., etc. that are not readily apparent when done in LabView.”

      Such as? In LV, a For loop is simply a box with a limit you set for the N times you want it to loop. Done.

      A beginner (as I stressed earlier) can have some difficulty getting it to run, and not so much because of the loop itself. A For loop in a text language must come with all the little surrounding structures, such as a main, a constructor, a call function (optional for this simple example), and some import file. Again, to a beginner (as I once was) there were seemingly a lot of moving parts in the way of even understanding the concept of the day, in this example a "simple" For loop.

      In labview you can draw your For loop box and immediately learn how the function works without the additional overhead.

      Now again, Labview is great for jumping into the many basic functions shared by most if not all languages, but without studying computer science one can easily just start wiring away without a care for how that affects system performance. This is true for any language, but LV can make things too easy to achieve for its own good.

      A real complaint about labview, however, is its tendency to sometimes crash for no reason.

      • ali65 says:

        This is a joke:
        “Labview is great for jumping into the many basic functions shared by most if not all languages but with out studying computer science one can easily just start wiring”

        Do you know how much typing and mousing you need to do in LV to get your "for" box on the screen? Compare that to the simple procedure of typing "for(i=0;i<n;i++){}" in your favorite editor. Certainly the LV result (the box) will look simpler, but there is no question which one is easier to create and maintain: the one based on a non-graphical computer language.

      • Alex says:


        I find labview frustrating as well a lot of the time, even though it's the only thing I'm really any good at, but you're twisted if you think typing that stuff out is easier than simply right-clicking, selecting a for loop, and dragging it out. Piece of pixx

  49. Richard says:

    Thanks Ali65
    I find it most difficult keeping my wits about me in these discussions.
    It always seems as though the LabView fans have an explanation for everything. No matter how convoluted the answer, they consider the matter closed once they have spoken.
    LabView is similar: it is a VERY closed language. If it were open at all, other language developers would be creating LabView clones to compete with the original.
    Look at C compilers. There are many, many compilers. Don’t like that implementation? Get another that suits you better. No can do with LabView.

    Seasoned programmers look at LabView code and emit question marks over their heads.
    SolidSnake says: "in LV a For loop is simply a box with a limit you set for N times you want it to loop. Done." … "draw your For loop box and immediately learn how the function works".
    When I was learning LV, the instructor did this on a screen in front of a classroom of C programmers. The groans from the ‘students’ grew to a din. People had difficulty understanding the symbolism in the ‘for’ loop box.

    LV programmers cannot take their experience to other languages. Text based programmers can ‘port’ their knowledge to other languages rather easily (as long as they are not using LabView).

    • Diddleydoo says:

      Well if you don’t UNDERSTAND what you’re doing in a programming language as opposed to learning off the dictionary like a stupid monkey then, yeah, I suppose you’ll have trouble porting knowledge between languages.

      Me, I have no problem because no matter which language I’m using, I UNDERSTAND what’s going on (LabVIEW included). The basics of programming and computer science apply across all programming languages.

      So maybe LabVIEW’s just too HARD for some people to grok. But that’s fine with me. It’ll secure my job for several decades to come!

  50. SolidSnake says:

    I guess my experience is very different from most. In school we actually had an LV course which was basically an introduction to it, but (in my experience only) I learned more about basic programming concepts from using it than from whole semesters of computer engineering courses.

    Ali65 I am very aware of the advantages of being able to port text code from editor to editor. I do it all the time.

    Since no one actually reads comments, I’ll restate that in my experience LV could be a good tool for absolute beginners: a total beginner being someone who hasn’t seen a line of code in their life.

    I actually do agree with both your points

    My previous comment: “My honest opinion is that if you are a great programmer/developer… build some Android apps instead, f*** test engineering.”

  51. Val Brown says:

    Interesting thread and amazingly inaccurate in many regards. I’ve been using LV for 12 years and wouldn’t go back to text-based code. That’s right — I wouldn’t go back. In the old days I programmed in C, Pascal, C++, Unix, Idris, Co-Idris and can still wrap my head around text code, but why do it? Programming in text in a graphical environment like the computer makes about as much sense as still using the QWERTY keyboard instead of the Dvorak arrangement. Text-based programming is like that: those who were “brought up” on it use it and find it easy, simple, direct. Those who were “brought up” on LV find it amazingly easy to use, simple and direct, and the debugging capability is IME unparalleled. Now I’m sure there will be others who will be inflamed by these comments but there really isn’t any reason to be so upset. I’m not interested in engaging in debate and I’m certainly not interested in “diving back into” text-based languages. I’ve been spending the majority of my time over the last ten years removing all of the text-based modules that I could from my overall project so that it is far, far easier to maintain and extend — yes, in LV. That is where I have done virtually all of my programming for the last ten years.

  52. BobbyO says:

    Like Val Brown, I too have been programming in LabVIEW for about the last 12 years. I have a total of almost 40 years programming under my belt, starting as a kid in BASIC and then progressing through all of the standard (and, along the way, a few not-so-standard) text-based languages that you could imagine. FYI, I’m an advanced-degreed EE who has a significant CS education also and no, I don’t work for NI. My experience learning LabVIEW was like getting shot off the deck of an aircraft carrier in an F-18: that’s how quickly I got up to speed and how exhilarating it felt. Text-based programming was productive, yes, but also completely routine and a tad mundane. LabVIEW is a different paradigm, plain and simple, and it takes a bit of time and effort to get your mind around, but the rewards are incredible. Because LV is graphical, when programming you are using both sides of your brain — the visual, creative hemisphere along with the logical, analytical side — in tandem. I estimate that my programming speed and effectiveness is somewhere between 4 and 10 times greater in LV than working in any text language. LV re-ignited in me a passion for software that had slowly eroded through my career, and that fire is still hot after a dozen years. All of the CS concepts that I had ever learned or used are applicable in LV, but their implementation is completely different in new and exciting ways. I had absolutely no problem recreating in LV the various structures, algorithms, and approaches that I was used to previously. LV is a general-purpose programming language with tremendous capabilities, and I vastly prefer to program applications in LV than in any text language.

    I hate to throw around insults or epithets, but really, most of the prior complaints sound like lazy whining from people who are more interested in digging in their heels and fighting than they are in learning something that might challenge them. As someone who is well-versed in both programming camps — graphical and text — I can say that 99% of the so-called criticisms of LV in this thread are nothing more than ignorance on the part of the posters of how to properly program LV to take advantage of its enormous capabilities.

  53. Richard says:

    VERY well said! Too bad you don’t know what you are talking about.
    If you “hate to throw around insults or epithets” then just shut up.

    If the LabView developers (NI) had any balls at all they would open it up and go Open Source. Then the people complaining about the difficulties would be able to fix them.

    NI has a great thing going here… They manufacture outstanding hardware but nearly force users to use a crappy language.

  54. BobbyO says:


    There’s two big reasons why NI doesn’t go open source with LabVIEW:

    1. It’s their flagship product. The last time I asked anybody at NI about it, the number was something like 40% of their sales were from LV and related tool kits. It’s a serious moneymaker for them. Commercial users gladly pay for LV because they find that they can program their applications much more quickly and effectively than they could previously in text languages. See my comments earlier.

    2. For a large fraction of the commercial/industrial market, open source software is strictly and expressly forbidden. For a variety of reasons (mostly related to government regulatory requirements), you won’t find open source content in the nuclear, aerospace, utility, or medical fields. Where you do find things that you’d think are open source (for example, Linux is used pretty widely in aircraft memory systems), it ends up that this content comprises commercial versions which have gone through extensive quality control and are, in fact, locked down tighter than a nun’s twat.

    BTW, I just loved your comment, “Too bad you don’t know what you are talking about.” My colleagues and customers have called me many things over the years, but uninformed is not one of them. My Bachelor’s degree is from MIT and my Master’s from RPI, I have taught graduate-level EE classes, and I’ve got nearly 30 years professional engineering experience. I make a comfortable 6-figure income solving heavy duty control and instrumentation problems, primarily using LabVIEW, and my skills are in strong demand. Tell me your qualifications for questioning my judgment and I might condescend to think about taking you seriously. Keep in mind that you don’t do anything to enhance your credibility when you make disparaging remarks about people whom you don’t know.

  55. Arjen says:


    Clearly you haven’t got a clue about where open source software is and is not being used. I am not going to give examples as they would be too easy targets to pick on.

    But that’s not my main point. My point is that you seem to relate someone’s judgement, amongst other things, to one’s income? So that means, e.g., that George W Bush would be more qualified to judge LabView than, say, a programmer from India? BTW, you shouldn’t count the numbers on the right-hand side of the decimal comma/point when bragging about your income.

    Oh yeah, I really enjoyed reading this whole thread. I hadn’t even heard of LabView until three days or so ago, when I started working for a counterpart in an EU exchange program. My new employer had some problems he wanted solved in LabView. Actually I already fixed the problems in three days of coding using a Python wrapper around the NI DAQmx drivers. One of the issues was that the data should be written to a database and not a text file. Luckily LV has a solution for that. Just order the 1000 Euro database toolkit (or whatever it’s called) and the whole shining world of ODBC driver goodness is waiting for you! Can’t wait until my leprous kid has panhandled the money on the streets of Dystopolis. One view of the ‘really easy schematic example’ made my guts almost fall out. After I recollected my guts, I found this excellent post. Thanks!
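    For concreteness, the route Arjen describes (driver wrapper in, database out) fits in a few lines of Python. The DAQ read below is simulated so the sketch stays self-contained; the channel name and helper are illustrative, and a real setup would go through the NI DAQmx driver bindings. The stdlib sqlite3 module stands in for the paid database toolkit:

```python
import sqlite3
import time

def read_samples(channel, n):
    """Stand-in for a DAQmx read; a real setup would call the
    NI DAQmx driver bindings and actual hardware here."""
    return [0.1 * i for i in range(n)]  # simulated voltages

# Log each sample with its channel and a timestamp -- the part the
# LabVIEW route wanted a separate database toolkit for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (ts REAL, channel TEXT, value REAL)")
now = time.time()
for value in read_samples("Dev1/ai0", 4):
    conn.execute("INSERT INTO samples VALUES (?, ?, ?)",
                 (now, "Dev1/ai0", value))
conn.commit()

count, = conn.execute("SELECT COUNT(*) FROM samples").fetchone()
print(count)  # -> 4
```

    Swapping ":memory:" for a file path, or sqlite3 for any other DB-API driver, changes nothing structurally.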

  56. FRS says:

    Wow not a lot of love in this room. I have to agree which is why I have switched to FlowStone by DSPRobotics!

  57. ali65 says:

    You don’t shoot yourself in the foot using LabVIEW.
    You hang yourself on wires.

  58. BobbyO says:


    I will certainly admit that I clearly have no idea where open source is and isn’t used in Europe, where it would appear you currently are. I live and work in the USA, and my customers are American. Perhaps things are very different on your continent than they are here. I specifically mentioned the nuclear, aerospace, utility, and medical fields because I have performed multiple projects in each of those sectors. Here in the USA, various regulatory agencies (NRC, FAA, FDA, etc.) have sets of complex rules relating what can and cannot be done with software, especially in the context of critical life safety systems. Software has to pass rigorous validation and verification (V&V) and quality assurance testing in order to be allowed to be released. In the kickoff process of every project that I have worked on in these fields, my customers set down rules to work by, and one of them is always — without exception — no open-source content.

    I really want to hear your examples of commercial use of open source software; I would guess that they are not in strongly-regulated industries. If they are, as you say, “too easy targets to pick on” then maybe your argument isn’t a very strong one in the first place. As it is right now, you have no argument at all, as you are simply making statements without justifying them.

    You misunderstood my comments challenging a previous writer’s judgments. I don’t equate wisdom with income, and in fact I quite enjoyed your quip about W. What I was trying to illustrate is that I make my living using LabVIEW as my primary tool. Customers are eager to get my services and pay me well for them. In a free-market economy the value of goods and services is determined chiefly by supply and demand. There is a high demand for experienced and quality LabVIEW programmers, and a relative shortage of qualified professionals. All of these posters trying to deny the value of LabVIEW need to open their eyes to the reality of the marketplace. If LabVIEW sucks so much, then how come it continues to sell strongly year after year, with a steadily increasing user base? If it’s so impossible to program and debug in, then why do companies keep using it? If the posters on this thread were even close to correct, then the industrial, test, automation, and instrumentation marketplaces would have abandoned LabVIEW a long time ago. Just the opposite is true.

    • Arjen says:


      Thanks for your clarification, that puts things a bit more in perspective.

      About use of open source software being used by commercial industry: I’m not familiar with the engineering industry but if you ever used the internet you might have noticed that half of it runs on Apache and MySQL. Companies like Google, Twitter and Facebook heavily rely on open source software and they do have the kind of V&V QA that you mention. These are just a few obvious examples (and they are American companies).

      Your argument about LabView being used widely as proof of its quality could easily be turned around as:
      – proof of its proprietary lock-in; once you’re in, you can’t get out
      – proof of its commercial and marketing departments

      Moreover, it’s the same argument as “If God does not exist, why do so many people believe in him then?”.

      But good for you that you are making an income with LabView. I am also happy that there are people willing to become lawyers, dentists, firemen, politicians, gravediggers or porn actors. They are in wild demand, but it’s not for me, and being in demand doesn’t make their business better or worse than anyone else’s.

      As a finalising thought: there are hardly any really bad programming languages or paradigms. There are, however, tons of bad programmers. In the hands of those, any program in any language becomes a piece of junk.

      • Chriggy says:

        It really depends on what you’re trying to do. If you’re trying to do a fancy user interface, Labview is not the way to go. If you’re trying to do data acquisition and digital signal processing, Labview is definitely the way to go. And if that is what you’re trying to do, then you should be familiar with electrical diagrams, which is what programming in Labview is akin to.

    • Thad says:

      “I really want to hear your examples of commercial use of open source software;”

      Wind River Linux. I find it difficult to believe that you have worked in the industry for so long and have not heard of it.

      Opto22’s Snap PAC controller runs Linux.

      Have you not heard of eCos or RTEMS either? eCos has a commercial vendor (eCosCentric) that offers eCosPro with commercial support. RTEMS is currently used on board the Mars Reconnaissance Orbiter running the Electra software radio. It also runs on various data acquisition and control systems on the National Synchrotron Light Source at Brookhaven National Laboratory and the Advanced Photon Source at Argonne National Laboratory.

      Open source is not limited to serving webpages or hobbyists. It is used in serious industrial scale systems all over the planet and in space.

    • R says:

      Was surprised to read your claim that open-source software is expressly forbidden. To give you one example, R is used within various aspects of the pharmaceutical industry, from R&D to clinical trials.

  59. Larry says:

    First of all, it was a challenge to find the actual .exe on their FTP server… then the fun started.
    1) Downloading it took 12 hours from their servers at 20k/sec (we’re on 12ms ping and 9.8M down, so we’re not short of bandwidth)
    2) Installed it, downloaded the Agilent driver, but no instructions on how to install the driver
    3) Finally found how to install the driver, opened an example, ran it, got an obscure -1244xxxxxx error
    4) Watched a tutorial, went to Measurement and Automation Explorer, but no “Devices and interfaces” listed
    5) Google search again; turns out I need something called VISA
    6) Now waiting another 4 hours to download 479MB of VISA bloatware from their shitty servers
    Is it really 2011? Feels like 1995, man.
    Hate it already and haven’t even started using it.

    • David says:

      You need to look at your internet access. I can use a hotel’s ‘shitty’ internet connection and download NI software quicker than you did.

  60. elset says:

    I graduated with a BS in Computer Science, where I studied C++ for four years. I found a job where I program LabVIEW and have been using it almost exclusively since then, for the past 3 years. I taught myself how to program using LabVIEW in <2 weeks. Granted, my first programs were garbage, but unlike all of you, I actually spent some time learning how to use the environment. Now, I’m a Certified LabVIEW Developer. As has already been said, 99% of the complaints in this thread (which I also had when I started) are strictly the posters’ ignorance of how to program LabVIEW. Take a day or two and actually try to learn how to use the environment (rather than spending the time bitching and moaning and googling “LabVIEW sucks”) and all of these complaints go away.

    • yonose says:

      Then so be it.

      I agree every one of us needs to put in some more effort to understand it, but if you learned C++ and then use LV almost exclusively, that tells a lot (as if you care; sure, I don’t that much) about your fanatic inclinations…

      LabView is good for quick, low-to-mid scale applications, not actually for development, IMHO… and wrapping other programming languages within LV’s backend can be a pain sometimes. I hope your C++ skills don’t go down the drain, either…

      • elset says:

        Actually, my almost exclusive use of LabVIEW tells a lot about my job, not my “fanatic inclinations”. In fact, I’m starting to do some hobbying on my personal time using C.

    • CertifiedWhat? says:

      LabVIEW sucks, even if you don’t agree.

  61. ms says:

    I am glad to see that in the more recent years of this thread’s life, opinions have been posted by users that have put in the time required to learn and understand LabVIEW (as they did with EVERY language) and now like it. Because it looks easy, people think they can fire up LabVIEW and get going, and when things don’t work or get “messy” they blame the language! So, hold on, are you telling me you never opened a book on C? Never attended a training course, or a module at university? And you just started coding straight out of the box?? How would you expect to do this with LabVIEW then?

    I have to agree that different languages are good for different applications, and LabVIEW might not be the best for what you need, however, I have to point out that I would personally always choose LabVIEW over any other language, and I have programmed in many.

    Saying that, I also have to point out to Larry that he made his own point moot, saying you “hate it already and haven’t even started yet”. If you haven’t used it, you can’t say you hate it. The fact that it is taking you long to download stuff has nothing to do with LabVIEW. And I am sure you don’t expect us to see a couple of searches in Google as hard work.

    By the way, if you don’t know how to do something in a textual language, how many Google searches do you have to go through till you find a working solution? Is there an award-winning support line you can call for help? There is one for LabVIEW! Is there an active R&D team working continuously on adopting and incorporating the newest and latest technologies into your language? Or do you have to work out on your own how to achieve parallelism and multi-threaded or multi-core execution? LabVIEW can do that!
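    Worth noting on the parallelism point: in a modern text language, fanning work out over a pool of workers is also close to free. A minimal Python sketch using only the standard library; the per-sample function is a placeholder for real work:

```python
from concurrent.futures import ThreadPoolExecutor

def process(sample):
    # stand-in for real per-sample work (filtering, an FFT, etc.)
    return sample * sample

samples = list(range(8))

# The executor handles worker creation, scheduling, and collecting
# the results back in input order -- no hand-rolled threading needed.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process, samples))

print(results)  # -> [0, 1, 4, 9, 16, 25, 36, 49]
```

    Swapping ThreadPoolExecutor for ProcessPoolExecutor moves the same code onto multiple cores for CPU-bound work.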

    These are only two reasons why it IS worth paying for the software… it is a language that is alive and breathing and progressing and evolving with the times, and there is a huge organisation behind it to support you with all of your needs! And I am sure more experienced LabVIEW developers can post more and more reasons.

    And if the above don’t help strengthen my view, then have a look at the application linked below, and answer me this one little question:
    – How many years would it take to develop this in a textual language? (cause it WOULD be years…)

    Please note that you would need a language (of your choice) for the user interface, a different language (VHDL) for the FPGAs, and yet another language for the Real-Time target. Oh! And of course you have to find vendors for all of these and people to support all languages and equipment…

    As you will see, NI provided one language for all platforms, all of the hardware, and of course all of the support. Are you telling me that this is of less value to you than all of the above negative posts? Would you truly, honestly prefer to do all of this with different vendors and custom circuitry and 3 or 4 different languages and thousands of man-hours??? I think not.

    To repeat my earliest comment, I still agree that for some things you could prefer other programming languages…


    • yonose says:

      Because it is really practical to do with LabView, but at the price of flexibility and scalability… there are many ways to get the same results.

      RT applications with comedi + embedded Linux and an RT kernel have worked pretty well for me… one programming language can be good enough for real-time applications and front-ends. LV is not the only choice: e.g. MontaVista Linux, Debian, Wind River Linux…

      One of the drawbacks of using LabView is when you need to add more and more features to some projects you’ve actually done… in simpler words, reusing code in LV is still messy, even with the improvements made over the years.

      LabView is not a panacea. I honestly think it’s just optimal for very specific applications, and DAQ will always work best with its back-end only… text-based languages are still better for developing visualization apps to communicate with it.

      That’s the time/money and work/flexibility trade-off: if you don’t want to deeply understand the internals and just want to make stuff work, LV may be your best choice; otherwise do embedded Linux.

      P.S. Please advertise the product with discretion; remember it’s still more expensive for starters and individuals than some feasible text-language alternatives…

  62. Richard Hamm says:

    I have used LabView since 1987 and stopped using it seriously in 1996. I have tried to pick up the thread since then in several projects at various companies, but LabView has become so dense and complex it has ceased to be the productivity tool that caused engineers like myself to fall in love with it. I look at LabView code written by others in my current R&D position and find it almost impossible to interpret, let alone modify. Like others, I have found that the once very straightforward process of reading a voltage has turned into a monstrously complex one, which has made working with the software very difficult. The biggest difficulty I have had is using the help menu with search; try to find something using a simple, very descriptive short phrase or word and a torrent of items are listed in no particular order. It is like trying to drink from a fully charged fire-hose. Once NI got rid of technical manuals, the utility of self-learning LabView went out the window. I will not pay, or ask my company to pay, thousands of dollars to attend a course which will only reveal some of the tricks of the trade, then have to pay more for the next tranche of information. LabView books are typically not very helpful; I have purchased many of them and find that one needs as many as one can afford to have a resource library of limited utility… no one book addresses most of the issues a user encounters doing even simple programs… usually NI issues a new version of LabView which supersedes what has been learned earlier. I do mourn the loss of the old productive LabView versions… 9.5 and older.

  63. JohnDoe says:

    Labview sucks because it’s basically TurboPascal-level capability with drawing in place of writing text.
    I’ve been developing with LV for 7 years, from 6.1 to 2010sp1.
    I’ve started moving to .NET and Visual Studio 2k10.

    It sucks because:
    1. it’s binary (= throw away 30+ years of text-based R&D like Subversion, make, etc…)
    2. the debugger is too trivial; one cannot really debug a program in the normal way
    3. the sequence structure is nonsense by definition
    4. OOP is badly implemented
    5. it’s too low level: you cannot define abstract concepts or interfaces, and cannot implement most design patterns (because its OOP implementation is not complete)
    6. it’s OK for simple quick demos, but as soon as you develop more, it actually becomes even more complicated than developing in .NET
    7. drivers and APIs for NI instruments are available for .NET too. So are graphical controls, if you miss them.
    8. you cannot define variables! A variable is not a graphical control. When I define an int, I don’t want an “int control”, OK? This is pure madness.
    9. two windows for every source file; cannot inherit (because it’s file-based, so you need to put files into a file-container; cannot have 2 VIs with the same name in memory -> cannot overload); cannot make more than one function per file!

    and I can go on forever
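    Point 1 is about tooling leverage: text source is line-oriented, so decades of diff, merge, and version-control infrastructure apply to it for free, while a binary VI compares as an opaque blob. A small sketch with Python’s stdlib difflib; the file names and the read_voltage function are made up for illustration:

```python
import difflib

# Two revisions of the same source file -- plain text, so any
# diff/merge/version-control tool can compare them line by line.
old = """def read_voltage(channel):
    return daq.read(channel)
""".splitlines(keepends=True)

new = """def read_voltage(channel, samples=1):
    return daq.read(channel, samples)
""".splitlines(keepends=True)

diff = list(difflib.unified_diff(old, new,
                                 fromfile="a/acquire.py",
                                 tofile="b/acquire.py"))
print("".join(diff), end="")
```

    The output is the familiar unified-diff format that Subversion, git, and every code-review tool understand.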

  64. AndyS says:

    WPF might be the answer. XAML for a UI of any complexity. Any .NET language for logic. The .NET framework for math, network/serial comms (talk to hardware directly) and parallel multi-core processing. Enjoy the easy life and unlimited possibilities.

    • yonoe says:

      Totally Agree.

      Even if I dislike C# a little more than a bit, every programming language used for parallel programming is good enough to do some solid work without needing LabView as an essential tool to do the job. Yes, you name it:

      SystemC, BitC, Ada, Erlang, C#, and even Python and Lua are going ahead!!!


  65. Test Engr says:

    Yes, I also googled “LabView sucks”. I just spent a solid month studying C# and can do pretty much what is needed for a professional-level test engineering job in that environment. Two weeks into my 1-month LabView study and I can see that this spaghetti-code nightmare of visual nonsense is a nonstarter. As for the contention above that LabView is a good environment for FPGA development, I can say after using both that the Xilinx tools are head and shoulders above LabView. With 29 years of test engineering experience and 10 years of CVI development, I say LabView sucks. If you never want to be a professional programmer, study LabView.

  66. yonose=yonoe says:

    And I 1000% agree with some of the fair criticisms made here:

    LabView SHOULD be made simpler, or at least, while adding new features, NI should make them just as simple to use!!!


  67. BrunoC says:

    Basically I’m not a programmer, but an R&D engineer for automotive engine components. My colleague from the lab currently uses DasyLab, and will switch to LabView.

    Although I agree with all the criticisms (even if I don’t program, I know what “good programming” means), I must recognize that LabView has an intrinsic advantage over programming languages (since, according to me, LabView IS NOT a language, but rather a GUI that lets a machine do the programming for you…): lab engineers who are just interested in building test beds and measurement apparatuses, but have NO passion or even the slightest interest in SW development (in other words: non-geek engineers ;)) are able to build controlling programs for their devices.

    Their programs are shit? Yes they are. Their “code” is super shit? Yes it is.

    But at least, a non-programmer can control a test bed and perform the measurements required for his needs. Don’t forget that his boss doesn’t care about the code; he only cares whether he can see the diagrams he wants during a 1600-hour vibration test of his DUT.

    That’s all. As long as the SW and automation people don’t understand that, LabView will remain the standard. Because a programmer is expensive and can only do programming, not test bed development.

  68. James says:

    I think some of the criticisms are due to a lack of understanding of how to use Labview. You can bundle all the wires together, for example, and unbundle only the ones you need for each event case. You can create mini programs for functions you need, and drop them into your main programs, removing a lot of the mess. For debugging, just make a mini program of the function you were trying to use, and try different configurations to remove the bug.
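    James’s bundle/unbundle tactic has a direct text-language analogue: pack related values into one record and pull out only the field a given handler needs. A minimal Python sketch, with all names and thresholds purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class RigState:              # the "bundled" cluster of wires
    voltage: float
    current: float
    temperature_c: float

def check_overheat(state):
    # "unbundle" only the one field this handler cares about
    return state.temperature_c > 85.0

state = RigState(voltage=5.0, current=0.2, temperature_c=91.5)
print(check_overheat(state))  # -> True
```

    One record travels through the program; each consumer reads only what it needs, which is exactly what keeps the wiring (or the parameter lists) tidy.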

    Admittedly, there are many situations when Labview is not the best ‘language’ to use. However, I managed to make a full featured measurement program in Labview in 1 month, with only 2 months experience of Labview. It would have taken years for me to learn all that was required to do the same thing in C, for example. Labview is like a giant library of code that you can plug together and get results fast.

    The criticisms I have for Labview are: it costs too much, you have to pay a license fee to distribute your software, and I’ll admit not being able to name variables is a problem, and when your program gets huge like mine was, you can no longer use the “light bulb” to show where the problems are. Wiring stuff off-screen is a pain – there should be a zoom function. I’ll also admit that nobody but me would be able to follow how my program works in order to edit it. Two problems that are tedious: you can’t copy and paste property nodes (and others); you have to get them from a nested right-click menu from a specific VI each time. Creating a good-looking interface is harder than it should be, and mostly involves trying to make it not look like Labview, but like a regular program.

  69. Ex NI-er says:

    I used to work in LabVIEW R&D, and for that I hope you guys forgive my sins :). As for Todd — you’ve got to appreciate that guy; he is “marketing”, which is not very objective.

    My thoughts on this: at first, fresh out of college, it’s kind of cool to see graphical programming. I did C++ and Java all through college, and have a job developing Java-based applications. LabVIEW is sort of good for building a quick UI for demos with graphs, dials and the like. It’s really bad at displaying tables and arrays.

    However there is a jarring problem with LabVIEW. My cases:
    1. “NI supporters” will say that if you keep your diagram nice and clean then you can get away from the spaghetti problem. Well, I say to them: get back to me once you have a library that spans 5000+ VIs, which is what you get when you are developing a large system. Should I mention that getting 5000+ VIs to load in the LabVIEW environment, migrating them through versions, and building an .exe is a pain in the neck? I have to say spending countless days just linking VIs and trying to trigger a build is a waste of productivity.

    2. LabVIEW is really bad in terms of writing good code that handles errors. I do not mean the “insane object because some 0xFFFFEEE in the diagram” issue (P.S. don’t bother calling NI about this error, they do not even know what to do with it; all they do is give it to someone in R&D to look into the VI to find the one freaking wire bend that is not “kosher” — believe me, kids, not even 50% of LabVIEW R&D knows how to do this). I mean error propagation from your logic. There is no throw/exception concept; you can potentially still be running your program and might kill something if you ignore an error. Fascinating, right? Don’t bother asking them about this, since R&D does not know how to address this either.

    3. LabVIEW + C code. I think they have this one integrated very well. Of course, that’s no surprise, since LabVIEW is essentially built on top of C and C++.

    4. LabVIEW and the .NET framework. Oh boy, here comes the big gorilla. Let me just say this: be careful with memory leaks. But no worries, who would want to use the .NET framework in this day and age, right?

    5. LabVIEW and Java. Good luck there.
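    The error-propagation complaint in point 2 is essentially about exceptions: in most text languages an unhandled failure stops the program by default, whereas an error wire only stops things if every call checks it. A small Python sketch of the exception side; all names and messages are illustrative:

```python
def read_sensor_ok():
    return 3.3

def read_sensor_broken():
    raise IOError("sensor unplugged")

def run(reader):
    # The failure propagates up automatically: there is no per-call
    # error wire to check, and the program cannot silently keep
    # actuating on bad data -- any handling is an explicit try/except.
    try:
        value = reader()
        return f"actuated at {value} V"
    except IOError as exc:
        return f"aborted: {exc}"

print(run(read_sensor_ok))      # -> actuated at 3.3 V
print(run(read_sensor_broken))  # -> aborted: sensor unplugged
```

    Delete the try/except and the broken read raises all the way up and halts the run, which is the default LabVIEW’s error clusters don’t give you.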

    Last and foremost. LabVIEW is NOT a programming language. It is a tool marketed by a company that insist it is the only way of programming.

    By definition a programming language will allow its user to apply programming patterns and create new programming technique that makes programming itself as an art. In LabVIEW all you get is a vis with wires and wires, without elegance in itself.

    But why is it useful especially in engineering labs ? One answer to that is does well with communicating with its hardware product which is expensive but those have a nice concept. I have no problem in that.

    So my conclusion is: if you use NI hardware, then by all means use LabVIEW. But please do not teach college students LabVIEW as their first programming language; we will make this nation dumber if we do.

    • ms says:

      I am sorry, Mr Ex NI-er, but I sincerely doubt you ever worked for NI, because if you did, like I once did myself, you would know that some of the stuff you claim is complete and utter rubbish! LabVIEW is not built on top of anything! The environment is built on C++, but the code builds straight through with no interpreter or anything, so this has nothing to do with how C or C++ comms are achieved… In any case, I believe it is quite clear by your very own definition of programming that LabVIEW is in fact a programming language. You’re just a bad artist. My VIs are very elegant, I’ll have you know, and they work fine, and they link with no problems, and the executables build with no problems at all. And I’ve worked with large applications, and I’ve helped confused users, and I’ve shown them the light!

      And btw, the hardware is not expensive, it’s an investment. As is LabVIEW. If you want to spend hours and days and weeks trying to integrate data acquisition with control, and data management with test executives and so on, then clearly you will waste your money and engineering time while others will simply invest in NI software and hardware that makes it so easy to integrate all of the above and more, it’s mind-blowing.

      I would say, if you want to make a strong point, don’t claim you worked for NI, just put a good case forward with accurate facts and figures. I no longer work for NI, or work with NI products, but I still have the utmost respect for everything they stand for.

      • Richard says:

        You, my dear man, are absolutely incorrect.
        Your facts are jumbled.
        LabVIEW is no more a programming language than Lego is a building material.
        (Which is probably why they are used together.)

        BTW: NI hardware is excellent, their software is questionable at best.

  70. Ex NI-er says:

    For ms: I am sorry, sir, if I pushed your buttons :) but here are the facts:

    From NI website itself:


    Several things from the site:
    “What Does an “Insane Object” or “fpsane.cpp” Error Mean and What Should I Do?”
    Possible Fixes

    The following are some possible fixes:

    The first step is to delete and recreate the most recently created objects on either the front panel or block diagram …

    If the VI is small, try selecting the entire diagram and copying it to a new VI. After saving the new VI, there is a good chance the insane object error will no longer appear. …

    If you find that a consistent sequence of editing operations results in an insane object error, or if you are unable to resolve the error on your own, please contact National Instruments technical support (see the Related Link below) for assistance.

    In short: “Have a nice day; go recreate your work, since we do not know how it got that way.”

    • Richard says:


      Thank you for that.
      I, for one, cannot make convincing statements about my intense dislike for LabVIEW because I do not have extensive experience with it. Unlike you, I can only mutter disparaging remarks. The small extent of my contact with the foul beast was enough to make me learn to stay away.
      You, sir, are a master. You know the evil that lurks within.
      Bless you.

    • ms says:

      This proves absolutely nothing! It only shows how little argument you have over the subject matter. All this link points to is how you can deal with a corrupt file. What is Microsoft’s solution to corrupt files? Oh wait! NOTHING! You lose it! What this article offers is a way out of rewriting the code! You simply copy and paste the block diagram, no rework whatsoever! Or are you that lazy?

      I am perfectly happy to argue any one of your points, with perfectly valid counter-arguments, but it simply angers me when all you do is throw mud at what you don’t understand, or simply don’t want to learn to understand. It’s very simple: you learn something through proper training courses, and you become good at it. The same as at University, the same as with anything else in life. If you “teach yourself”, chances are you’ll never be that good at it; and in LabVIEW – your code will look a mess, your debugging will be impossible, and you’ll moan all day about what your manager told you to do, instead of realising he has perfectly valid reasons when choosing this path. And instead of wasting time googling “I hate LabVIEW” or posting on here, maybe you should invest in learning how to program in LabVIEW. And yes, it’s not the same as other languages, but it never tried to be. If it ever did, it was only to satisfy laggards like some people posting here. LabVIEW is what progress is all about. Even Bill Gates stated “the future lies with Graphical Programming”! Argue with THAT!

      • Richard says:

        Cool yo’ jets, bubby.
        Youse tha one what drunk da kool-aid.

        Got a little button pushing going on yourself, huh?

        By the way, Google states: “No standard web pages containing all your search terms were found. Your search – Bill Gates stated “the future lies with Graphical Programming” – did not match any documents.”

        He did not say that. He said: “The future lies with graphical windowing interface.” BIG DIFFERENCE!!

        You might even want to learn what that difference is all about.

        You said something about laggards.
        Those that learn things by themselves learn by experimentation. You might have heard of the “School of Hard Knocks”. You get to learn the hard way… that stuff really sticks.

        Now sit down, and shut up.

      • ms says:

        I wasn’t referring to that quote, it was specific to programming, and I possibly misquoted slightly, but I’ll find the keynote and post it.

        As for the “school of hard knocks”, trust me, I’ve been through it plenty of times, I’m a real engineer! But I never blamed the product for whatever was going wrong due to my own lack of expertise, which seems like the case here.

        And I will never sit down, or shut up, you can bet on that. Especially when there’s forum fascists like you… Either argue, or leave the rest of us to argue. If you have no point to make then maybe it’s you that should sit down and shut up. But I would never really ask you to do that, I’m just returning the suggestion.

      • Richard says:

        “forum fascists like you”?
        Excuse me, it was the ‘drugs’ talking.
        When you have something constructive to say, please be my guest.
        Until then, please leave the LabVIEW fascist comments at home.

      • ms says:

        It was you that told me to sit down and shut up. That my dear friend, is fascism. And I have loads of constructive things to say, I’m just waiting for someone to put up a good argument!

      • Richard says:

        LOL… The argument you are looking for is way up there. Read all about it.
        This thread is titled “why I hate, despise, detest, and loathe LabView” so, what are you still doing here anyway? Like I said, you drank the KoolAid, you love LabVIEW, and no one can open your mind enough so that you can understand otherwise.

        And the last guy that said “I’m a real engineer!” in front of me was last seen banging a thumbtack into a cement wall with an effing aerosol spray can of some cleaner stuff.

        I’m sorry, I really need to calm down here. It is just too funny for me.

        I seriously apologize for the “sit down and shut up” remark. I know I hurt your feelings, and that’s just not fair on the Internet. Please take your love and devotion for LabVIEW somewhere where it will be appreciated.

        If you want an argument. Start it with a basis in fact.
        LabVIEW SUCKS. There is a fact. Now, go ahead and argue.

  71. Ex NI-er says:

    For ms cool down :) I sort of smell some http://www.lava.org follower odor from you.

    Let me clarify one thing about LabVIEW programming:
    1. It is not interpreted, as you point out. That is 100% correct.
    2. The LabVIEW environment was built on C back in the early 90s and moved to C++ in later days. However, your statement that the code builds straight through is rather incorrect, for these reasons:
    – Most native Block Diagram components (the yellow ones) are written to emit assembly instructions directly. I am talking about most of the simple operators.
    – The hardware driver VIs you see do not actually emit direct assembly calls. They all in turn call C builds generated by the hardware team, so they are not built straight from LabVIEW.
    – In the 2000s came the notion of XNodes, which are the building blocks for almost all of the FPGA module and the newer primitives that have shown up lately. These XNodes use LabVIEW VIs underneath and also call C DLLs to function.

    Here is the question: if LabVIEW builds straight through, then why does a LabVIEW-built exe, say a simple VI that adds two numbers (a + b = c), show a 30+ MB memory footprint? I do not think a C .exe needs that much to run a simple program.

    I’ll let you ponder that, but for others, the reason is this: LabVIEW does not in fact interpret VI code the way a Java interpreter does. What it ships is a scheduler that loads the directly built assembly code generated from LabVIEW VIs and decides when to run parts of it. That is why, when you run even a simple exe, there is no way to avoid loading a whole scheduler built by National Instruments before it starts executing that “directly built” code.

    So yes, I know LabVIEW.

    I am saying, for people still in college: keep in mind that LabVIEW might be used in your labs (partly because NI hardware demands LabVIEW), but if you are looking for jobs out there, getting to know a real programming language like .NET, Java, C/C++, or a web-related language is a must.

    Not a lot of LabVIEW outside university/test and measurement out there.

    And BTW, Windows and Mac do not corrupt my source code; it’s just text files, for crying out loud! Have you ever done text programming? How do you corrupt a text file?

    • ms says:

      “Not a lot of LabVIEW outside university/test and measurement out there”? I think you are very mistaken there… Search the web for testimonies from the largest manufacturers in the world, check the Fortune 500 (btw, don’t worry, all of them use LabVIEW); if you could, I would ask you to check on the biggest military/aerospace projects, but they’re obviously held under wraps… LabVIEW used to be just a tool; nowadays it’s a fully blown environment, and most scientists will never want to learn to program in text. I have gone through all flavours of programming (I mentioned this in an earlier post) and I hated every single second of it! But I love LabVIEW! For me, it’s great that I can’t get deep into the insides of my code sometimes; however, I do not consider the depth in LabVIEW lacking at all. I can use ready-made functions by simply utilising an add-on palette, and if I get stuck I have a world-class award-winning team to support me.

      And maybe in the R&D world you were only seeing the ugly side of LabVIEW, the side that involves dealing with the biggest bugs in its history, but in support you see the beauty of it! I helped customers out of serious trouble, and helped them in no time! How long will it take you to figure a problem out in C? Who do you call when you’re stuck? I guess Google is a great search engine, but do you always find what you need? I have never let a user down: whenever they called with an issue, I solved it, and my turnaround time was always under 48 hours, as defined by my superiors. For everyone that has had problems, have you tried calling the technical support team? And don’t tell me about costs for the support, it’s a measly couple of hundred bucks/pounds/lira… That, in the scheme of all things, is nothing! Your company would rather spend that and have you up and running in no time, rather than have you spend days and days trawling through endless pages on the web trying to figure out what you’ve done wrong.

      And this is why managers pick LabVIEW! The time of an engineer (sorry Richard, I’m still very proud to be one) is worth much, much more to a company than simply buying the perfect solution.

      It’s all well and dandy sitting here trying to convince you that I have not drunk the KoolAid; I just used plenty of languages to see that this one would save me money if I were to own a company. It would save me time, resources, money, everything. And if my company were to go bust, then my end users would always be able to ask NI for help! What would a company do if their C code programmer retired? Would you ever touch someone else’s code? I know I wouldn’t dare! In LabVIEW though? Send me your VIs and I WILL solve your problem. If you’ve coded well, and do not have a bunch of wires everywhere, then I’ll do it in a jiffy! If your code is messy, then I’ll still solve your problem, it just might take a couple of days. But the problem would arise only because you taught yourself, and didn’t realise that (as Todd said) LabVIEW is NOT easy. It’s easy to get a quick measurement, but a complex application is just a complex application, no matter what language you use. After a year of C I could do very little compared to what I could do after a year of LabVIEW, never mind taking up development of large applications.

      Take two programmers, one for C/C++/VB/etc, and one for LabVIEW; put them through a week of certified training; then give them a task… if your programmer finishes the task faster, I’ll sit down, and shut up – no need to be told. Until then, I hope one day you realise, your managers are right – you will be more efficient, and less costly if you learn LabVIEW and work with NI on your projects. And again, as I mentioned in my first post, there will always be other languages that are slightly better at some tasks (like PLCs for example), so your C expertise will not go to waste, but overall, your LabVIEW programmer will be more than capable to tackle a LOT more projects without any further google searches or training than any other programmer of the same level of knowledge. It’s just that everything is so well integrated…

      I don’t use LabVIEW anymore, I’m back in textual programming, but I can’t wait to switch back again at some point in the future… My google-searching skills are far inferior than those beautiful ready-made functions I used to have at my fingertips…

      ms, the engineer that never wanted to program.

      • Richard says:

        Yeah, that took a while to think out and compose. Nice references, though. I realize now that “ms” must be for misguided soul.

        You have answered your own question (more than once, I might add) in your missive.

        You do not like to program, “ms, the engineer that never wanted to program.” This is why you like LV so much. It is not for programming.

        I have worked in production testing environments that did an “experiment” similar to what you describe. They took an existing system coded with Microsoft development environments, using custom developed and produced hardware and replaced the whole megillah with NI components and programmed it with LV.
        The hardware was superb! Precise and fault tolerant (until that a$$hole with the spray can touched THOSE two wires “just to see what would happen”).
        The software was an utter failure. The software (mind you, timed against MS’ older runtime; it was not .NET) was SO slow that they needed to build a second machine to keep up with the humans on the assembly line. Then the two machines could not run on the network in parallel because the LV could not run on the network with different identifiers, so the LV “core” had to be rewritten.

        Before you start screaming that it was created by amateurs, let me tell you that this company contracted one of the most experienced LV programmers around. (This guy even participated in the creation of the “LabVIEW for Everyone” book. The LV BIBLE, as it is known. It comes in at just under 1000 pages.) He spent months creating the “ultimate” operating system that just couldn’t cut it.

        Let’s also mention that this “operating system” had to be distributed to the plant (which was overseas) by disk. It was too big to just send to the plant as an update. Even FTP took hours. Oh, and the plant could not be updated with just a new VI; they had to get the whole thing because of the linking (or something), and the expense of a development system for plant use was out of the budget. Not to mention that there would be no way to teach them how to compile it (or whatever it does). The previous system was updated by sending the program as an email attachment.

        While you are on the subject of searching Google, try Google Trends (http://www.google.com/trends) and put LabVIEW against almost any other programming language. It is difficult to use “C” as a search term because the letter “C” is nearly ubiquitous. If you try “C++” you will be able to see what I mean. The references to LV pale in comparison. It compares well with “COBOL”, a mostly dead language.

        I believe that Ex NI-er was quite right. I smell that lava.org funk. If you return there everyone will be happy. We will continue to ‘despise, detest, and loathe LabView’ and you will be able to cuddle with all your little LV friends.

      • ms says:

        Thank you for a great post Richard, now this is a testimony (albeit biased naturally) that we can discuss! Although I can see how hard it is for you not to insult me; I’ll take sarcasm or irony over insults though, I can live with that for now…

        It didn’t take a while to compose or think, I write like a maniac! I just have a job and a life too… and I have to admit, I am starting to feel like I want to give up on you all, but I can’t! I truly believe LabVIEW can help you. In most cases anyway…

        So let’s talk about this project then if you like. I don’t care much about the size of the application and how extremely difficult it was to send over (btw, have you heard of wetransfer.com?), I only see all of this as moaning on top of the rest of the moaning. I want to understand more about the development… Do you honestly believe that all of this development in a different language would have taken less time? Or would the plant operators have found it easier to learn a textual language??? I bet you that in a week of NI training tops they would have been able to debug the code to keep everything smooth, but obviously any major problems would have to involve the developer – as with any other language of course. And at the end of the day there is only one thing that matters… It worked! You mention that a second PC was needed, and the main code had to be redone, and all of these simply point to an extremely complex application! I am certain that there is a very high chance that all of these problems could have surfaced with any other language as well… I can see no proof of the opposite.

        As for the trending, I think it just shows how many users prefer the NI Tech Support team to Google! And those guys at the plant will have a similar team in their part of the world to help them too btw… Would they have a team like that for a C system? I didn’t think so.

        And finally, I’ve never even seen the Lava website… Like I said, I used to work for NI, admittedly a very funky company! ;)

      • Richard says:

        I see that the time stamps here mean shit.
        I was laid off, have been out for the last six months, and will be out for (at least) the next three months on medical. Big surgery and recovery. Whatever.
        I have had hours to sit and ruminate on this repartee.

        You will NEVER convince me to love LV. Forget it. It has a pox.

        Your argument is useless because I have a brain and an opinion.
        I see that you do too. Too bad that we cannot agree on this one subject.

        But, you see, there really are more important things.

        You worked for NI, love LabVIEW, and NEVER SAW LAVA.ORG????
        Get over there! You’ll love it.

        P.S. That system? It was ALREADY running using the old VB operating system. It took 10 minutes to create a routine that allowed each station to self identify to the network and determine the ‘next’ address.

        There is no need for an “answer team” (A single “team” that “knows all”) about C. There are hundreds of forums. And the Google search engines have no trouble finding answers. (Funny how all of that came about without the use of LabVIEW.)

        Most people don’t know about LabView; I aim to keep it that way.

  72. David Lanteigne says:

    What a great conversation! I am old, and claim no particular expertise in computers, device control, or data acquisition. I have fond memories of writing mixed Forth and 8085 assembler code to run the experiment for my Ph.D. dissertation.

    My current assignment requires Labview, so I attended a day long seminar just today, and I took their kindergarten class. It is all new to me, so I expect it to be confusing, but I am a bit befuddled at all these many nested levels of function icons that all look the same. NI seems to be very proud of their graphic interface, but the graphic ambience leads me to expect to see a gorilla throwing barrels. The “space-bandwidth” product of the modern computer screen (and that’s just fancy talk for the number of pixels) still seems inadequate for the kind of design envisioned here, without a lot of squinting and scrolling.

    Most interesting of all was to talk to NI engineers. I found myself accidentally alone in a room with the product manager from Austin; I had gone back to a terminal to try to finish the morning’s exercises. He asked me how I found Labview, and I believe I answered diplomatically, “From where I sit, the learning curve looks very steep.” He asked me my previous experience, and I said, “Zero.” He was not impolite, but my answers seemed to get under his skin. Especially when I indicated that I know a little about writing data acquisition and digital output, but was only learning Labview now because it’s unavoidable. He suggested training, something about Core1, Core2? I asked another who had been a morning instructor, and to my surprise, he could recommend no good books. I am still puzzled about how one is supposed to learn this language, or GUI, or whatever it is.

    Mostly, I got the impression the NI guys are all talking to each other, and they assume everyone knows what they know. Even in a non-LabVIEW session on some RF test equipment, the speaker went on at some length about a particular piece of equipment that goes into the PXIe chassis. The whole talk was more about selling PXIe than anything application-oriented. When it was done, I asked, does this device produce an analog output or a digital output? The answer I got was, it’s a vector analyzer, so of course it produces a digital output. “Vector analyzer” wasn’t in the talk, wasn’t on the slide, but “of course”.

  73. Gary Wolford says:

    I started programming by learning Pascal (too many) years ago. I have taken classes in C, C++ and UML. I have written a little Visual Basic, TCL, C, C embedded and maintained some Fortran code. So that said, I have some experience with textual programming.

    From what I have read, I see someone who is obviously better at C programming disparaging the graphical programming environment because of poor programming style. When starting a module in C, what is at the top? Why, the documentation for the code: what it does, who wrote it, version information, etc. Everyone that programs in a textual language KNOWS proper style.

    What do you do in LabVIEW? Put the documentation right with the code using text blocks!!

    LabVIEW does and can have variables that are named. You have to know how to name them. The (long) wires can and should have names.

    If you have too many wires on a diagram, you have too many wires on a diagram and, hmmm, you would need to put them in a subroutine in C, so you would put them in a subVI!!!

    If you cannot follow the flow of the wires (Left to Right, generally Up to Down) then you are not using good GRAPHICAL programming style.

    If you don’t know where the program starts, someone wrote it wrong, documented it poorly or used a very complex framework.

    If you cannot debug the application, turn on the light bulb. It is the debugger and you can actually see the flow.

    Debugging LabVIEW is no harder than using a proper C debugger, just… GRAPHICAL!! Never used a watch window in a C debugger? Well, learn how. Never used a probe in LabVIEW? Well, learn how.

    Don’t forget that C (or whatever language you started with) was very, very non-intuitive when you started. Well, what is the difference with an expert toolset like LabVIEW?

    LabVIEW makes simple problems easier to solve than any environment I know, once you know how. Nothing makes the really complex problems easy, as complex problems often lead to complex solutions.

    The big thing is, if you see being FORCED to use a tool as a chore, then you will not like it. If you see it as an opportunity, then it will be a joy.

    There are a hell of a lot fewer LabVIEW Developers in the world than C Developers. Check LinkedIn to see how many jobs for qualified LabVIEW Developers are UNFILLED.

  76. jdstanhope says:

    I worked at NI for 14 years, 10 of which were on LabVIEW. I joined NI years ago because I thought graphical programming was the future. I was wrong. I tried and tried to use LV to write programs, but to no avail. In the end I could not figure out a way to make graphical programming viable for anything but toy programs. This is of course my opinion, and there are many people inside and outside of NI who disagree.

    My problems (in no particular order) with any graphical programming languages are:

    – Wire management. Even with automatic layout you can spend a lot of time fiddling with wires. The auto layout can radically rearrange your diagrams, so you have to spend time figuring out where your code moved to on screen.
    – It is much easier to reference data through a name than by drawing a line between two small points on screen. The concept of variable scope is not difficult.
    – Mathematics and complex algorithms are very hard to draw and even harder to read in boxes and lines. Which is why LV has the equation box, mathscript box, and a C box.
    – The density of information is very low, and it is hard to fit a single algorithm on a visible chunk of screen (unless you have a 30″ monitor).
    – The existing tools for searching, indexing, diffing, merging, and reviewing code do not work well if at all. They have to be re-created for graphical languages.
    – Unless your graphical language is stored in a text format then a version control system has to store an entire copy of each version.
    – The size of a graphical program on disk is much larger than the corresponding program written in a text language (unless the language is COBOL).
    – You are locked into a single set of tools since there is no standard for graphical languages. In some cases there is no way to translate your graphical program into a text program since there is a limited API to the in-memory graphical model.
    – Generating code in a graphical programming language requires a complex API, whereas generating code in a text language requires a print statement.
    – You generally have to create an icon and write a name, thereby making twice as much work for yourself.
    – Constantly switching between one hand on the mouse and two on the keyboard is awkward and slows my productivity.
    – There is very little benefit to remembering the names of types or functions since you usually have to hunt them down with a mouse.
    – Each graphical language requires a special editor. The editors are far more complex than text editors and therefore more likely to behave badly (I have not looked at the source code of Emacs in a while, but I would bet it is far smaller than the LabVIEW editor source code). The graphical nature also requires more memory and CPU to run the editor, thereby increasing the time it takes to load code and reducing the amount of code you can edit at one time.
    – If the editor doesn’t support a gesture then you will have to work within the existing gestures. For instance try to change a condition to a loop in LabVIEW. In any text language you replace the word “if” with “while”. In LV you have to disconnect all the code from inside the conditional and then remove the conditional and drop a for loop.
    – The relationships between functions and data structures are not restricted to 2 or 3 dimensions. This is why you very quickly get spaghetti code with wires overlapping each other. Graphical models are useful when the model has real physical characteristics like houses and circuit layouts.
    – It is possible to obscure part of your program with another part (at least in LV). Imagine if a text editor let you put a for loop on top of a function call.
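    The point above about editing gestures is concrete in any text language: turning a conditional into a loop is a one-token change, with no rewiring. A tiny illustration (the helper names are hypothetical):

```python
def drain_once(queue):
    # Conditional: pops at most one element.
    if queue:
        queue.pop()

def drain_all(queue):
    # Identical body; the only edit was replacing "if" with "while".
    while queue:
        queue.pop()

q = [1, 2, 3]
drain_once(q)   # q is now [1, 2]
drain_all(q)    # q is now []
```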

    • James says:

      “- It is much easier to reference data through a name then drawing a line between two small points on screen. The concept of variable scope is not difficult.”

      Interesting. To me it is the exact opposite: it is vastly easier to see the data flow with wires than by reading and correlating all the variable names.

      I suspect the issue is you are much better at text (you understand variable scope) than at LabVIEW (you need a 30” monitor and “spend a lot of time fiddling wires”), while I’m the opposite. Well-written code is better than spaghetti code, regardless of language.

      — James

      BTW, to replace a conditional (“case structure”) with a loop, try placing the loop first, then removing the conditional; no disconnecting or rewiring required.

      • Richard says:

        “BTW, to replace a conditional (“case structure”) with a loop, try placing the loop first, then removing the conditional; no disconnecting or rewiring required.”

        This is one of the hidden gotchas of LV. If you know that this is the ‘easy’ way to edit the LV, then it is wonderful and fast. If you don’t know about it, you are in for a frustrating ride. These editing ‘tricks’ are usually not well documented. The user is often left to fend for themselves. If not able or willing to experiment, then they don’t find these easier ways. Again, text based languages can usually be edited with any text editor or IDE. Programmers can usually take their favorite editor to change code. They don’t have to adapt to the programming language’s editor.

        I won’t continue adding insult to injury with the complaint that the NI team seems to rearrange the function icons as soon as you learn which palette they are usually found on. It is easy to open a browser to Google and type a function name or purpose to find out how a function works; finding a function and palette in LV if you don’t know what you are looking for is difficult at best.

      • ms says:

        Have you tried the “search” button at the top of the palettes? Or hitting Ctrl+Spacebar in the latest versions of LabVIEW? You’re in for a treat.

      • James says:

        “If not able or willing to experiment, then they don’t find these easier ways”
        Not being willing to experiment is a problem. What I suggested was not a “trick”, just what came into my head when you were having problems doing what I know isn’t hard. By actually experimenting with doing it I found that it is even easier: just Right-Click>>Remove Case Structure and then place your loop. Difficult to make it any easier, but one does need to right click on things and look at the menu options.

        Again, I think you must have much better abilities or experience with text than graphics (such that memorizing obscure function names to look up with Google is much easier than searching through a menu tree). You should definitely go with your personal strengths.

        — James


  78. Byrsa says:

    I have used Labview for lab testing. I like the ease with which you create the GUI, but I think the block diagram needed revising. For example: let's say I want to have nested functions such as functionA(functionB(functionC(Input1, Input2), Input3)). This is kind of easy to do in Matlab or any language, but to implement it in Labview you need a lot of blocks and wiring.
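A minimal sketch of the same nesting in a text language (Python here; the function names mirror the comment above, but the bodies are invented purely for illustration):

```python
# Hypothetical stand-ins for the functionA/functionB/functionC above;
# the arithmetic is made up just to give each function a body.
def function_c(input1, input2):
    return input1 + input2      # innermost computation

def function_b(x, input3):
    return x * input3           # wraps function_c's result

def function_a(y):
    return y - 1                # outermost wrapper

# The whole nesting is a single expression -- no blocks, no wires:
result = function_a(function_b(function_c(2, 3), 4))
print(result)  # (2 + 3) * 4 - 1 = 19
```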

    I gave up Labview and now I am using LabScript. It is a brand new platform that is very easy to use.

  79. AlexanderTheGrape says:

    I googled “alternatives to Labview” and this came up as link #2.
    I’ve read this entire thread from start to finish. I’ve gotten a HUGE laugh from a lot of this thread. My opinion is that LabView is hands down the most frustrating “language” I have ever had the (dis)pleasure of working with, of which that set is, in order [Pascal, C++, FORTRAN, 8051/assembly, Java, MATLAB, LabView, Python (which rocks)]. I actually find the comments of jdstanhope as one of the more useful replies to this thread, and I think it epitomizes one of the major problems tried and true text based programmers (including myself) have with LabView.

    Fundamentally what frustrates me about LabView is that I simply find the interface difficult, “sklunky” (like a Model T), and time-consuming to use, compared to a prototypical development interface like Eclipse, Visual Studio, MATLAB, Spyder, or hell, just a text editor on the command line. The whole idea of graphical programming just intrinsically seems like a flawed concept to me. I don’t need jumbled wires (THAT NEVER GO WHERE YOU WANT THEM TO) connected to arbitrarily/tersely labelled terminals separated by a couple pixels, that require me to look up the part in Help, to show me where variables go/are. I don’t like *being forced* to use mouse input, which is far far far slower than pure typing. I don’t like not being able to see where a wire leads to/what it does at the end of its long trek across my screen, which can be confused with other wires (thank god for triple click I guess, but for the love of god that requires the mouse!!!!). I don’t need pictures of loops on squares that all look the same anyway (except to the trained eye) to tell me what a for loop is.

    The argument a lot of the pro LabView people are making, that we haters simply must put more time into learning good programming style, or simply don’t know what we’re doing well enough, is moot, because LabView is a proprietary product. In anything other than LabView, none of the “programming style” you learn there matters. Realistically I can look at *any* text based language’s for loop and know what’s going on, and even use it after about a week of syntax adjustment. That is what bugs me about LabView/graphical programming, it simply is so fundamentally different from the majority of other, (more) robust programming languages, that it just leaves me and I imagine the other people in this thread going, “Why would anyone make it this way?!?!?” I can look at almost any other text-based language and have a good general idea of what’s going on, I think. I really don’t feel the same way about LabView, and thus, the frustration.

    Almost everyone has admitted that anything that can be done in Labview can be done in a text-based language, and vice versa. With that in mind, I think I would much rather learn a language like Python or Ruby, because they can be used for a broader range of things and do not require a proprietary, expensive “compiler”. Right now it seems like object-oriented programming that doesn’t require you to manage memory is the jam. And naturally so, right? In the same capacity that I find connecting wires cumbersome, I used to find managing memory in C++ cumbersome. I propose that text-based programming was around before, and will be around long after, graphical programming, NI, and LabView have been buried. In the end I just don’t find the process of programming graphically as efficient as programming in text (someone made a similar point about information/space; I’m saying low usable information per screen limits your programming per unit time), and my point is that only people who have really dedicated themselves to, or learned first on, the former could or would think otherwise (of course there are *always* outliers and anomalies).

  80. joe says:

    I really started to use Labview when version 4 came out.

    I first became aware of NI around the time TurboBasic was big and they came out with LabWindows for DOS. Early on, Labview was a real time saver.

    Mixing C with Labview provided more than enough performance. So what if it had no UNDO feature back then!!

    As the years went on, Labview changed normally for the better. It got more reliable and they even added an UNDO for us!!! Life was good in the test and rapid prototype world…..

    I have been burned trying to use their hardware in the past, and after I could not speak with someone who actually knew anything about the design, I never considered their hardware for anything beyond basic lab use. They did eventually pass on my report and repeated my tests. In this case they changed the specs on the board to solve the problem.

    When it comes to Tech support, I view this as a way for me to train their new employees is all. I don’t have that sort of time…. I digress…

    Then something changed. It was almost as if NI had changed management and their entire development team overnight. Really, someone thought moving Make Current Values Default from the Operate menu, where it had always been, to the Edit menu was going to save developers time… Really?? I am not a child trying to automate my plastic building blocks. I feel my blood pressure going up….

    I am still current but have not opened the last three updates. Every new release, just when I think it could not be any worse, they surprise me. I have given up on them.

    My solution has been to stay with the older versions. 6.1 when I can. We may be on 2009 for a very long time…

  81. David says:

    Hi. My name is David Fuller and I am the VP of App & Embedded SW R&D at NI. I have worked my entire professional career at NI and I started writing instrument drivers with LabVIEW and CVI in 1992. I developed using C/C++ for six years and then moved into management. I still program for fun at home and mostly dabble with .NET just to see what Microsoft is up to. Someone told me about this thread and a post from JDStanhope who I worked with for most of the time he was at NI. I was truly sad to see him leave. I want to take the opportunity to address some of the points in his post and others. The following comments represent my personal opinions of topics on this thread.

    One of my colleagues, Todd, found this discussion in October 2009 and I stand by his original comments. It’s impressive to see such a vibrant conversation going on despite this blog being officially retired in June 2009. Thus, for those with constructive commentary on LabVIEW, like Joe and others above, I challenge you to submit it on our open Idea Exchange – ni.com/ideas. Your feedback will be available to the public and voted on in order to be directly incorporated into our product development cycle.

    With that, David enters the Lion’s Den. ;)

    Leading with LabVIEW
    We clearly lead with LabVIEW as our recommended programming language. But we do have a big-tent view of languages, which is why we also provide CVI, a C environment, and rich HW APIs/libraries and UI controls for C/C++ and .NET. Further, within LabVIEW itself, we have embraced the idea of Multiple Models of Computation defined by The University of California at Berkeley. That is, different problem domains are more optimally solved by certain domain-specific languages. For example, I personally believe that math algorithms are nicely expressed using the textual notation we all learn in school. That’s why LabVIEW provides structures that take textual math as input and supports state chart and continuous-time domain Models of Computation. For programming, it really is about the right tool for the right job. If you guys simply detest LabVIEW at a “religious” level, then by all means use the language of your choice. At worst, we could be guilty of “LabVIEW’s the answer, what was the question?” (I stole this quote, which was originally about Sun and Java.) We do lead with LabVIEW because, independent of this community that includes some passionate self-labeled haters, with LabVIEW we have unarguably enabled thousands of scientists and engineers to be more productive and successful in doing their jobs.

    LabVIEW as a Language
    I hope we can move quickly beyond the debate as to whether LabVIEW is a programming language or not. LabVIEW is a fully-compiled fully-expressive programming language. You develop code. It runs. If you guys want to get into pedantic technical debates about language esoterica regarding “the true nature of compilers”, email me and we can go deep if that makes you feel better.

    Open v. Proprietary
    LabVIEW is by design VERY open. LabVIEW effectively interoperates with .NET, C/C++, VHDL, and communication standards like web services. LabVIEW is a de facto standard in Test & Measurement, and Jeff Kodosky, co-founder of NI, along with his team, did create it. As Oracle owns Java, Microsoft owns .NET, and Apple owns Objective-C, so too NI owns LabVIEW. I do hope (though this is just my hope) that one day we take the same path as Microsoft with .NET and we standardize aspects of our language. We are working towards making this technically feasible.

    Productivity and Communication
    Productivity as a measure is highly situational, with variables including at least:
    • Developer(s) skill
    • Language intrinsics (syntax, type system, explicit/implicit memory management)
    • Libraries for IO and analysis
    • Develop-deploy-debug cycle time and quality of execution insight
    • Editor mechanics (create, edit, diff/merge)
    • Run-time performance
    • Scale and complexity of the problem

    Most of the constructive commentary in this thread relates to editor mechanics. LabVIEW enables many nonprogrammers to succeed in automated measurement, control, or design applications. Many would fail without LabVIEW, so I doubt they measure productivity in mouse clicks versus keys pressed. With regard to editor mechanics for professional programmers, clearly there are different steps needed to create a diagram, and the visuals matter whereas in text they do not. We mitigate some of these editor concerns with what I consider world-class layout. Further, John is right that we currently only have binary file formats, which means we must provide visual merge/diff. However, visual diff/merge has its own advantages, and we will eventually offer both a textual diff and a visual diff.

    Overall, I believe for NI’s target domains, the benefits of our productivity strengths in almost every category above, FAR outweigh the small advantages of text-based editor mechanics. Further, I am really looking forward to our team proving to the world the crystal clear superiority of diagram-based programming for touch environments.

    Diagrams versus Text
    I claim that structured dataflow clearly conveys the following better than text: coarse-grained parallelism, data-dependencies, order v. chaos, and overall program structure. I believe this is why so many structural modeling languages like UML, choose diagrams to convey information. I completely disagree that visual images are less information dense than text. I think this is where the debate devolves into preference, but I will leave this section with the comment that, “a single picture is worth a thousand words…”

    So, if you made it this far, thanks for listening. We certainly want to know what our greatest critics see as real problems with LabVIEW. We are investing heavily in LabVIEW and it is a personal goal of mine to find and address core issues related to graphical development – some of the feedback on this thread is sound and actionable – some is passionate vitriol. Thanks for the former. Hopefully, we can reduce the latter, but this is the Internets after all…

    • Richard says:

      Nice weasel-words PR work there…

      I have mostly ‘passionate vitriol’ because this is such a HOT topic in my mind that I cannot think clearly whenever exposed to LabView. I will not go into specifics because you will just dazzle me with more NI bullshit.

      All I have is sour grapes because I cannot give you the grimy details of the situation.

      LabView still sucks.

  82. James says:

    What I like about computer programming (at least text-based) is that it is the lovechild of English and math. You read code sort of like how you would read a book, and you create functions and solve problems like you do in math.
    What I’m trying to say is that while Labview may be a good concept, it takes the reading portion out of the equation and deals more with visual-spatial reasoning (like in circuitry). Also, typing is much easier than using the GUI. I can type 50 words a minute, but I am less good at moving and clicking.

    To expand on my lovechild argument, a simple 15-LOC program would be like a short answer on a test. A 200-LOC program would be like a 5-paragraph essay. A 1000-LOC program would be like a 5-page essay. A 15000-LOC program would be like a book. I don’t get the same sort of English crossover in Labview that I get in other languages (no matter how bad they are).

    • James (a different James) says:

      A good comment James. Personally, I’m better with a mouse than at typing and have better visual-spatial reasoning than reading ability. So it is unsurprising that I like, and am good at, LabVIEW. Given that one can do most things with either a graphical or text language, it is best for each person to go with their own strengths.

      — James

  83. Tom says:

    In my case, I found this thread while searching for “labview object oriented interpreter”, so nothing about “hate labview / labview sucks”.
    Like many labview programmers, I started with C/C++, assembly, Java, etc., until I had to do a project with Matlab and Labview 7.1, about 10 years ago.

    Since then, I’ve never stopped programming in Labview and don’t even want to bother learning another language.

    Sure, it can be frustrating sometimes, and the NI website / search function are so damn useless most of the time; but it is just amazing the amount of work that can be done so fast with Labview: be it in vision, motion, communication, acoustics, automation, with PC, real-time or FPGA hardware, 3rd-party equipment or drivers, with databases, etc. Whatever you want, there is most likely already a solution for it in the Labview examples, NI website, LAVA website, and many more resources.

    I started as a biomedical engineer and conducted projects in all those different fields, with many different products, from many different manufacturers, and always with labview.. it’s just that easy (for me?).

    I am pleased to learn more and more with every project and with every new labview version. However, to avoid bugs, I always wait for the “SP1” version before upgrading, so I am currently playing with LV 2011 SP1, released a month ago .. and loving it :)

  84. Rolf says:

    I’ve gone through many design languages from assembly to basic, c, c#, .net, php, iOS etc. I’ve also worked with LV for about 2 years.

    I will never return to LV just because you’ll always run into structural programming problems and end up with a bunch of spaghetti.

    The only nice thing about labview is that you think that you can program.


  85. RF dude says:

    I have waited 6 months for a test engineer to develop an ATE to test RF performance using Labview.
    After 6 months it is still not working; I see the guy with 50 windows open, with all these ridiculous wires everywhere on the screen. He is always pulling his hair out, as is his colleague next to him.
    This Labview is so overcomplicated that he cannot debug the problem.
    After 6 months I have given up waiting for him to deliver the solution.
    Having used a trial version of Labview, after about 30 minutes I saw immediately that doing a very, very simple task was so overly complicated that you run the risk of never ever getting your ATE to work. Life is too short.
    I promptly removed Labview from my PC.
    I talked to a colleague who was automating with Python.
    It seemed so obviously easy to use.
    It took me a couple of days to do what I wanted, and I have no SW or ATE expertise at all!!
    Labview is ridiculously overcomplicated and is a full-time job to maintain.
    Labview can cost you dearly on your development verification.
    You will spend all your project time getting Labview to work instead of testing your product.
    It could eventually get you the sack, from what I have seen.


  87. That Axel Guy says:

    With four months of experience I’m rather new to the LV world. Having a strong background in CS and text-based programming, I’m certainly a guy who will rather fight with this graphical style, as I am used to algorithms noted down in pseudo-code with a top-to-bottom sequence where (if you have a reasonable programming style) each line does exactly one thing. I’ve had some nice moments with LV, but overall I am looking forward to the day the cool project I am now working on, which is based on LV for DAQ, is finished and I can return to a job with an “ordinary” language. LV is great when you want to do a quick measurement or build a prototype to throw away, but in our project it is used for a distributed system comprising a large number of nodes to control and measure stuff, and we want to go down to the some-dozen-ns range. A reasonable number of developers is involved, and for this you need _good_ software engineering. And this is exactly where LV fails completely:

    1. User-defined named constants that are entered at _central places_: It is really a 40-year-old no-brainer that constants with a good name ease understanding of the code and allow for changes to be applied at only one place. E.g.: the minimum and maximum range for an analog input. I found workaround solutions like using Globals (but will they induce overhead?), or creating a VI that implements an enum with a case structure to encapsulate the value at one place (but what about different types? and it takes a lot of space on the block diagram in the end), or encapsulating each constant in one VI (which would lead to 100s of constant VIs for a larger project), but these are no replacement for e.g. VMAX_HW_REVISION_2 (without knowing the application you can assume what is meant; in our block diagram there is a double constant 5).
    2. As already mentioned, traditional SW version control with e.g. SVN is impossible. SVN has saved my ass in my old company several times when I was able to go back easily (with a graphical client) the last three months through all revisions and find out, when exactly this strange change was made by me, then compare it to my emails and suddenly remember the whole story. Not with LV.
    3. As mentioned on another “why I hate LV” site: During the build you can do nothing. Ok, sometimes it’s nice to have an excuse to get yourself a coffee but most of the time I want to finish the stuff and then it’s just a pain to wait several minutes without even being able to look through the code. This is just insane, it would already be an improvement to lock the files for changes but allow for browsing them. All possible productivity gains go away for large-scale application.
    4. It’s trying to be clever about types but it isn’t. If you have a state machine implementation with enum and you want to reuse it but change e.g. the number of states or some state names and you create another control with the new enum and change the case-structure to evaluate the new state machine type you still have the old types everywhere in the wires. Have fun replacing each of them! And, yes, you get this little red dot to tell you there is a cast. So you go through each instance watching out for little red dots because who knows what might happen if you miss a cast and your state machine goes nuts. I’d rather take 2 pages of g++ error message output to find my type error, fix it and be confident when it compiles that my state machine does not enter illegal states.
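For comparison, a minimal Python sketch of what points 1 and 4 look like in a text language. The constant name echoes the VMAX_HW_REVISION_2 example above; the values, state names, and helper functions are invented for illustration:

```python
from enum import Enum

# Point 1: named constants defined once, at one central place.
VMAX_HW_REVISION_2 = 5.0   # max analog-input voltage (illustrative value)
VMIN_HW_REVISION_2 = -5.0  # min analog-input voltage (illustrative value)

def in_range(voltage):
    """Check a reading against the centrally defined limits."""
    return VMIN_HW_REVISION_2 <= voltage <= VMAX_HW_REVISION_2

# Point 4: an enum-typed state machine; passing anything that is not a
# State is an outright error rather than a silent coercion flagged only
# by a "little red dot".
class State(Enum):
    IDLE = 0
    MEASURE = 1
    DONE = 2

def step(state):
    # Advance the state machine one transition.
    if state is State.IDLE:
        return State.MEASURE
    if state is State.MEASURE:
        return State.DONE
    return State.DONE

print(in_range(4.2))     # True
print(step(State.IDLE))  # State.MEASURE
```

Renaming a state or changing the limits happens in exactly one place, and every use site follows automatically.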

    There is surely a lot more to say about this but I’ll stop it now. I could rant at least as much about the NI software being crappy: MAX just freezes on a regular basis for anywhere from a few seconds to nearly 15 minutes, LV itself is dead slow when the project is large, and if our shared variables are not updated in Distributed System Manager, chances are even that either our component has died or the DSM just is not in the mood to update them. For the money they charge, the behavior of the software is a joke.

  88. Sergey says:

    I’m surprised this discussion is so active 1.5 years after I pointed to it at lavag.org. People who keep following the topic are not indifferent to LabVIEW; they hate it or love it.
    Two of my colleagues (programmers) told me once that they tried to use LabVIEW but could not get used to it because of the completely different logic and workstyle. They do not hate it; they just admitted that this language is not for them. After reading the discussion I see that some people are more comfortable with text while some are more comfortable with graphics. These are personal peculiarities, not related to the languages themselves.
    The thing with LabVIEW is, either you “get it” or you don’t.

    • Mr. Tea says:

      It’s not so much a problem of “getting it” as it is a problem of being comfortable with it. Most engineers take a programming course or two in university, be it Java, C/C++, Fortran, Ada, etc. Computer science/programming majors are almost always taught Java, Python or some other introductory OOP language. They then move on to more powerful yet difficult languages such as C/C++.

      The problem comes in when you now sit those engineers or software developers in front of a LabView IDE and say: make X software operate Y hardware. They now have to adjust not only to a new “language” (if you can even call it that) but to a completely new programming paradigm. People who are proficient in a programming language often do not want to be forced to learn a new one unless they are willing. And therein lies the problem: project managers not properly polling their staff for input and doing some basic research. If some manager type decides that NI hardware + labview is the way to go, then they had better factor in the price for costly training or hire an expensive and experienced development team. Sitting a traditional written-language developer in front of a labview IDE and forcing them to use it will only have the opposite effect of saving money. They may even feel insulted, as they will think “what was wrong with my previous language and the code I developed?”. Now you just pissed off your dev team and friction develops between management and staff. More time and money go down the drain.

      If you’re inexperienced with programming then LV may be what you need; the pretty boxes and wires might make more sense than “return((*coreTable->regRead)(NULL, RegNum, RegVal));”. So it boils down to this: experienced software engineers DO NOT need LV to be more productive. Let them continue using their tools and language of choice, be it C compiled with GCC using hand-written make files on Unix/Linux, or C# in Visual Studio on Windows. They already know the best methods to design a solution. If a team or individual is not experienced with any programming language, then maybe LV is an alternative. That, or hire some competent programmers or a consultant.

      Bottom line is anyone can *get* LV, the question is do they need to?

  89. srfbe4 says:

    Pure OOP is obsolete, dataflow serves modern systems better.
    You can go as low level as you want, with LabVIEW.
    Script nodes do exist, so you can still code.
    If you prefer text shit, you have no imagination.
    And you ARE much slower, too.
    You are also linear in structure, and in your thinking.
    You will never design anything other than calculations.
    The best engineers want it the most.
    I guess spotty faced little “programmers” don’t like it.
    But again, undereducated IT laymen have no imagination.
    They are usually quite thick, too.
    Extremely easy to debug, when you seek code, I map the flow.
    Ridiculously expensive though, that’s true.

    “Competent programmers” unable to debug based on visual structures? What a joke, it’s also shit for business, I’d rather fire the programmers slowing the business down, and let engineers do all the prototyping.

    Are really all the IT monkeys such pathetic, complaining, narrow-minded, undereducated, untalented little shits?

    • Mr. Tea says:

      Wow. That is the worst troll I have ever seen. Thanks for the laugh. Mentally challenged internet folks such as yourself are quite amusing to say the least.

      You also need to brush up on your English, either English is not your first language or you are an unemployed butt-hurt engineer who lacks English and social skills, among other things.


  91. Will Hugget says:

    Why I don’t like labview, even though I have purchased it in the past (versions 5 & 6):
    1. It is memory hungry, and I can’t seem to manage that from within labview; i.e. it lacks memory management features, e.g. stack, heap.
    2. Pointers and references (which I like to use in C++/C), and which give a very strong programming advantage, do not seem to be programmable. Maybe someone could advise differently.
    3. We are using a limited set of toolboxes geared toward what NI think we should program.
    4. Students should not be encouraged to use this application programming method, as it will not help them understand what programming a microprocessor system is about, which is important for later on in their professional career.
    5. DATAFLOW? If you really think it through, the CPU is a data-flow device: it receives input data, either as code or “data”, from input devices (e.g. hard disk, ports), may do some work on it, stores it in memory or sends it to some output device, displays it either on a printer or on a screen, and so on. So all programming languages are about data flow and processing. To say LV has this feature which others don’t have is utter rubbish.

  92. asdgfe5 says:

    Will: never mind, the post was intended to demonstrate the validity of equivalent counter-arguments, on the same level of course, which the original poster committed against him/her/itself. Imbeciles such as Mr Tea did not really get it. By the way, no, no one said that a CPU is a dataflow device, although it certainly can be modelled like that. However, I am sure you have discovered that the controllable existence extends far beyond the CPU, and most of its manifestations can be very well modelled as dataflow devices, including the process of thought itself. Since LV is one of the languages with a direct representation of this kind of modelling, the argument is valid. Some other things are also taken from thin air; who said it’s the only one providing this feature, apart from yourself, of course?

    Mr Tea: no, it’s not my “first” language. I also speak further three fluently, two which are again not my “first”, which makes at least 3 languages that I have a much better command of than yourself, including your native, which is presumably English, and the only one you every tried to speak. By the way, native is the educated expression for a “first” language. Social skills? Where is the society that you have measured my social skills on? The sole fact that I can afford to fuck your dead mother in any sense your level of imagination will ever allow for, or your inability to comprehend sentences consisting of more than one, well, should I say dataflow streams, will certainly lead you to your next well justified inferiority complex soaked empty rant without recognising that I have mirrored the mentality of the original post to show your attitude. As it turns out, you didn’t even see the mirror, and although I speak a far better English than you, this is something that cannot be remedied by all the languages ever spoken on this planet.

    Unfortunately, our brief discussion here is well representative of differences between programmers and engineers, considering the vast majority of cases, although not all. Programmers do recognise a vague content referred they call “rubbish”, but fail to see the process behind, the purpose ahead, or their own role in either. The good thing is that most of you don’t have the imagination to discover how sad this is. Now, go on, say something collectively clever, and conformingly empty, as usual.

    • Richard says:

      2/10 Troll level: meh.

    • Mr. Tea says:

      Its funny that you actually replied, thanks for another great laugh! :-)

    • jtstand says:

      “I have a much better command of” and
      “I can afford to f…”
      “I speak a far better English”
      Both the programmers and the engineers may agree that none of your qualities can provide proof to your theories. In fact, the opposite is true: truth of your theories may provide proof of your genuity. (excuse my English, it is not my first)

    • Tim Tom says:

      Hurling childish insults at others is beyond pathetic. You have zero proof that anything you have written is genuine beyond the fact that you have the inability of carrying on a conversation devoid of derision. The ability to speak more than one language is also irrelevant, I know of one man who is fluent in English, Italian and French yet is an under-educated ignoramus.

      “Unfortunately, our brief discussion here is well representative of differences between programmers and engineers”

      Discussion? It took you one month to craft this response and it still contains nothing more than vitriol and ignorance. Your attempt to sound intelligent is nothing more than a sad, deplorable outburst.

  93. forking hated every minute of using it too. As many have said, the ‘wiring’ gets very messy. Plus what person later on is going to look at your ‘code’ and go… “oh ya, that’s what they meant to do. I’ll just make the changes here..” If I were to go back to what I had worked on in some physics lab and try to figure out what I was working on, I’d be lost. I think it is an interesting idea to have the top level flow, but it should really only be used as a design flow outline, not actual running code.. redonk.

    As I recall, it took me a while to figure out that there is indeed some C code running in the background of all the modules available, so getting my program working was only made possible by making changes to the underlying code running in the background that the UI is supposed to represent.

    I would never recommend anyone use LV for anything useful… Possibly a ‘programming 101’ course for very, very fresh faces to understand how code flow works. But even then, I think it is just confusing to wrap your head around the whole ‘a while loop is a fucking window that you must set to a stop sign button…wtf!?!?’

    And many more examples of horrid coding styles.
    Anyways, just wanted to rant more than anything, so happy there are others out there who also loathe, hate and despise LV. HA!

  94. fuse117 says:

    Okay, I want to have a kick at the now rotten horse.

    My problems with LV are the following: documentation and online knowledge base are a pain to search through; I can only develop for LV using LV; wires, nodes, wires nodes, … spaghetti; menu, menu, menu, … maybe find what I need; takes minutes to code mathematical operations and algorithms when it should only take seconds; hard to document; and a nightmare to maintain.

    Also, collaborating on work can be a huge pain too. In fact, I dislike sharing VIs with other people, because if they edit them and give them back to me, I have to spend a lot of time figuring out what I’m looking at.

  95. EngineeringStudent says:

    Wow, it’s great to know there are so many of you out there! I’m another “Labview sucks” Googler.

    I just cannot stand it when I know exactly how to code up what I want to do, but then it takes me another 2 hours to figure out the Shift Registers, arrays, find arrays, etc. to make it happen in Labview. And maybe this will come with programming experience, but it is difficult and time-consuming to create a stable program that is intuitive and not susceptible to randomly breaking down at the most inopportune time!

  96. john says:

    Don’t hate labview but am looking for an alternative. Issue is that it’s very clumsy to transport standalone programs to many of my customers (a small program is in the region of 100-200 MB). Used it initially, as I am not a programmer, for data acquisition; communication via 232, 485, DeviceNet, Profibus, IEEE was relatively easy. Thoughts?

  97. Marc says:


    I was initially forced to use Labview and hated it for the first 2-3 years. I had to stick with it because we had very limited budget and the initial medical R&D data collection system made in Labview before I arrived had no chance of being completely re-written in something else. At that time, I would have used Delphi.

    Part of the issue was that the original code which served as example to me was VERY badly written. The whole application was written in just a few humongous VIs that needed scrolling in an area about 5 x 5 screens.

    Now, I have used Labview for about 10 years and strangely enough, I feel like all other development environments take days to do what takes minutes in Labview once you know it.

    However I agree that…
    1- Maintenance is difficult, especially when doing it in someone else’s code.
    2- The runtime is huge so I agree that this is an issue for distributing LV code
    3- You have to be careful with unexpected multi-threading

    Initially, I had a tendency to write some of the functions in Delphi, and call DLLs from Labview but as time went by, I realized that about anything could be coded in LV faster and cleaner.

    With a bit of experience, you no longer make spaghetti code!
    Also, LV has much better crash resistance. For instance, overflows saturate rather than crashing or generating random numbers by truncating. For situations where you need to collect data even if something goes wrong, this saves the day.

  98. Matt Reaves says:

    Wow! I’ve never seen so many people pissed off about such a great thing. Every programming language has its strengths and its weaknesses. For instance, MATLAB always assumes you’re dealing with a multi-dimensional array. I rarely deal with multidimensional arrays! Try plotting in C++. Labview is great at making GUIs. Also, once you get used to it, it’s just about as easy as any other language to write most scripts, but it makes it much easier to change variables on the fly and see the effects. You can also interface with just about any other language: MATLAB, C++, Python, etc. As with every problem, you should choose your programming language based on your problem. For simple scripting that’s free, I prefer Python. But there are so many problems that Labview excels at. The point here is: Don’t think of Labview as an alternative to other languages; it should supplement them. This is just another tool in your bag! Embrace it!

    • Jim Fowler says:

      All the guy’s points allude to poor practices and lack of knowledge.

      I have been a Certified LabVIEW Architect since 2007.

      >Inability to write descriptive comments!
      Not true. I comment all day long. Literally. I use VI documentation on EVERY VI.

      >Inability to name variables!!!

      Which variables do you speak (shout) of? Locals? Globals? (don’t use them) Functional globals? Shared single process variables? Wires? Clusters? Shift registers? All of the above can be named except shift registers, and I comment the latter anyway.

      >Nonlinear, graphical programming interface:
      It’s very much linear – completely linear, actually… unless you don’t know how to use it and throw massive amounts of global variables around.

      > Messy, horribly hard-to-follow programs!
      From complete novices who have no idea what they’re doing… I’d bet my job that you can read my code with little to no experience.

      > Wires everywhere!
      Darned textual languages have those pesky operators and tokens everywhere. I hate those!

      > Extreme difficulty to insert new commands into an established program without ruining the organization structure!!

      Take a class. Hire someone who knows what they’re doing. Hence the term, “LabVIEW architect.” Write modular code and you will never have this problem. If you write one huge function in C++ you will also have great difficulty in making it scalable.

      > Frakking impossible to debug!!!!!
      If you have no idea what you’re doing or how to use proper coding practices… If I name all your variables “thing” and don’t document them, you’ll have a heck of a time debugging, too.

      > Computer processors operate linearly anyway–LABVIEW IS LYING!!!

      So what’s “fork” and “join” in C, then? Were they lying, too?

      >Sequence structures–the most cumbersome way possible for the LabView creators to have tried to rectify the problem that sometimes YOU JUST NEED TO EXECUTE COMMANDS IN ORDER JUST LIKE A CONVENTIONAL PROGRAM, DAMMIT!!!

      Never, ever use sequence structures. I agree with you on this one. I’ve used them maybe five times in my whole career.

      > Mouse sensitivity! As in, my programming ability should not have to rely on my skill to accurately position the mouse over some of those frakking tiny terminals!

      So don’t write LabVIEW. If you can’t use a mouse, then don’t get into numerous other industries, either, for that matter. (Architecture, CAD, etc.) Mouse sensitivity is adjustable in your OS, you know…

      > Timing structures–THEY DO NO SUCH THING!
      Uh, yeah they do. They’re meant for LabVIEW Real-Time, on an RTOS like VXWorks. You will have jitter on Windows, and even then they still work reasonably well.

      > The fact that it has to rebuild all its data acquisition sub-VIs every time I want to make a tiny change to the sampling mode!

      Don’t use Express VIs. They’re for complete novices who have no idea what they’re doing. Use the proper API (DAQmx) and it won’t rebuild every time. I never use Express VIs.

      > Shift registers and sequence instances! The saddest excuses for variables on the planet–and they contribute to the messy wiring problem!!

      No they don’t… not if you follow an architecture and use software engineering instead of a QBasic programming mentality. …and what’s a “sequence instance,” anyway? At least learn the proper terminology.

      > It handles arrays in an extraordinarily clunky manner–and when you’re taking data, the role LabView is best suited for, MOST OF THE TIME YOU CAN’T POSSIBLY AVOID USING ARRAYS!

      There’s this thing called a typedef’ed cluster. It is your friend. It’s analogous to a struct. I don’t use that many arrays, to tell you the truth.
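      The cluster-as-struct analogy above carries over to most languages. A hedged sketch in Python (the `Reading` bundle and `scale` helper are invented for illustration): a dataclass plays the role a typedef’ed cluster does in LabVIEW, bundling related acquisition values so you pass one handle around instead of parallel wires or arrays.

```python
from dataclasses import dataclass

# Hypothetical bundle of related acquisition values -- the role a
# typedef'ed cluster plays in LabVIEW, or a struct in C.
@dataclass
class Reading:
    channel: str
    voltage: float
    timestamp: float

def scale(r: Reading, gain: float) -> Reading:
    # Passing one bundle around instead of separate values keeps the
    # "wiring" (here: argument lists) tidy.
    return Reading(r.channel, r.voltage * gain, r.timestamp)

r = scale(Reading("ai0", 1.5, 0.0), gain=2.0)
print(r.voltage)  # 3.0
```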

      Learn what you’re talking about before you flame something out of ignorance.

  99. Spiriguda says:

    You don’t understand a fucking thing.

  100. David says:

    Labview will never be as fast as C. In my testing I measured a 20 millisecond overhead throughout the test. Now, if a complete program needs that extra 20 ms, then I suggest you use C. Normally the speed of the code is directly dependent on the programmer’s knowledge. However, if I am your competitor, I will beat you to market by 3 months using Labview; it’s a fact that everyone can draw a line faster than typing text code. I hate engineers that only think of one dimension of building code. And by the way, if you are using C or Labview to code manufacturing test of any type, then pray your competitor doesn’t have Prestige: it is lightning quick to develop on and test times can improve by 50%. If interested, contact me: mfgtestsystems@yahoo.com

  101. Bryan says:

    I use Labview, C, C++, SQL, C# .NET and work in analogue and digital electronics, and I can say that the lot of you need to relax and get a hobby that does not involve computers; life is too short to argue like this.

  102. Dude says:

    This dude is completely wrong! That is all.

  103. jh says:

    I will stick to Simulink…

  104. Mad Damon says:

    I too shit on the whole concept of this kiddo programming-by-drawing-wires. I grew up programming with the usual text-based languages; I can’t get used to this silly, limited, inflexible, hard-to-expand shit.

    Imagine programming with multiple blocks of text code next to each other on one huge sheet. The horror! Even just from an editing perspective.

    Labview sucks ass in my view.

    Thanks for creating this blog entry!! :-)

  105. Patrick Raphael says:

    1) Control references (such as to GUI objects) aren’t ‘first-class’, meaning they follow special rules and can’t be treated like other data types. Say you want to break up your code so that plotting the data is handled by a subVI. You can’t just have the main VI do something like ‘get the reference to this control, pass it to the subVI’. You have to *explicitly* link control references to subVIs, for every single control ref you want to pass. And this has to be done by a drag-and-drop through the GUI. I have never encountered this in any other modern language.

    2) You cannot dynamically create controls, and many properties cannot be changed at runtime.

  106. Larry Koss says:

    I have not completely scanned this blog to find if anyone has recommended the best alternative to LabView, which is Opto22.

    I have been working with Opto22 for 12 years and it just gets better every year. I have been using Opto22 at Kennedy Space Center supporting science research experiments for human support systems. I monitor and control environment chambers, bioreactors, and other custom instruments and equipment.

    Opto22 is very, very, very easy to use, modify, and configure, both the hardware and the software. Less expensive than LabView, with free training available as an introduction to their hardware and software. Their professional version of the software is only $1000/license, but their basic free version is very powerful in its own right.

    Please go to http://www.opto22.com and check it out! Larry

    • Mr. Tea says:

      I have used the 5300 Blue Fusion from CTC. Their hardware is pretty good, though they don’t support high-voltage AC or DC I/O, only 5V-24V DC. They have a smaller 5200, but it only runs their older Quick Step language, which is very limited, not based on any familiar programming language, a pain to organize, and gives you no visual way to see the flow. I had to build a GCC tool chain for a Hitachi SH2 under Linux from scratch, because that is the only way to build C code on their controller to do any decent math (they offer a Cygwin-based tool chain but it did not run on Win 7 64-bit and had a screwy configuration).

      The thing that sold me was their Quick Step 4 (QS4) programming language. It’s an excellent mix of a C-like language and a visual diagram design. They have some good programmers there, and they get it when it comes to writing code for a controller. You lay out your tasks and their associated steps. You also get to create functions so different tasks can call a function simultaneously. In each of those steps is the C-like language, QS4. An even better addition is a C step, where you put actual C code and the step is treated like a function call. So you can do some real fancy stuff, and it’s both visual and typed. This lets you visualize the flow of your code without the diagrams getting out of hand, because you can squeeze as much QS4 code into a step as you like. But of course it makes sense to spread out your code over a number of steps.

      The editor, Quick Builder, is simple: a single click of one button compiles all the code at once, and another click publishes the code to the controller and runs it.

      It’s not perfect; I wish the editor managed variables better. Register numbers assigned to variables are not searchable; you have to view a printable list of assignments, which is organized alphabetically by variable name. So if you want to know what variable register 300 is assigned to, you have to do some digging. I also think they need better decision blocks that work like switch statements, so the flow can split more than two ways instead of just on a true/false statement.

      FYI, I am not a paid shill. I have spent a lot of time writing code in C/C++/C# and having to program PLCs/PACs in crap ladder logic (kill it already!). Then CTC comes along with a perfect mix of C-like syntax and visual flow diagrams, plus the addition of real C code.

  107. Nate says:

    Love how variables can have the same name. Greate feature.
    Relative paths DO NOT work, which is awesome when working on a team of more than 1. Structures randomly change positions, so you get to waste time every month finding bugs that should not exist.

    • Nate says:

      * Great
      Also, no backwards compatibility. Upgrading causes everything to break. LV examples online tell you what a frequency, voltage, etc. is and NOT how to use the HW/SW. LV is a great idea but realistically a big waste of time and money.

  108. AK says:

    It is beyond me why someone would use this shitty fucking tool when you have so many powerful awesome programming languages out there. Labview is for total tools who want everything fed to them in a shitty GUI.

  109. BC says:

    Replace LabView with Python. It’s free and better in every way.

  110. Stan Pisarski says:

    I am a college professor that teaches Labview as a technical elective for electrical, mechanical, and computer engineering technology students. Labview is a great programming environment. When done properly, a Labview program can easily be followed if it is commented and written in a way that is easy to understand. The students that I teach love it…much better than a text-based language like C. They take a course in C before using Labview! I also use Labview in my other job as an engineer. I have used Labview since 1995 and learned it on my own without taking any courses, by doing it the hard way: using it for an application that is required to keep the business going. Most of the Labview programs that I have created are used in the electronics production area to run semi-automated and automated test sets to calibrate, test, and burn in electronic products made by the company. The test sets work flawlessly. Other Labview programs that have been created are used to run R&D tests on electronic circuits to collect and analyze data. One program controls the GPIB-connected equipment to perform frequency response testing on electronic circuits, collect the data, place it in a file and graph the results…all done automatically. Was this easy to do? No, not really, but it was fun to do and satisfying to see it run correctly. Grow up…take a course or spend the time that it requires to do it right!

    • Richard says:

      I am a college professor that teaches Labview as a technical elective for electrical, mechanical, and computer engineering technology students.

      – I taught C programming for three and a half years.

      Labview is a great programming environment. When done properly, a Labview program can easily be followed if it is commented and written in a way that is easy to understand.

      – Labview is a crutch that teaches you nothing about how computers work. It teaches you in a completely abstract world, which is OK if that is the only language you will ever use. In addition, it is available only from one source. You can get C compilers for free. Many of them. There is one LabView, with a high price.

      The students that I teach love it…much better than a text based language like C.

      – The students love LabView because it is easy to learn and easy to run in an enclosed environment.
      – The language to start on may or may not be C, but it certainly is not LabView. The concept of a textual language is mostly portable. The constructs “for”, “while”, and “if” are all the same. LabView uses a graphical context which is not visually similar to anything.
      – The graphical nature of LabView means that you cannot print a listing easily, and things can be hidden from view.
      – The data driven logic of LabView is not easily translatable to any procedural language.

      Grow up…take a course or spend the time that it requires to do it right!

      – Do it while you are young! Learn a language that will broaden your view, not one that constricts your vision.

  111. bublina says:


    I am a programmer/engineer. I have 5 years of C/C++ experience and 8 years of LabVIEW experience. I am teaching myself C# now.

    Frankly, all of the negative comments are 100% valid!
    I went through the same sh*t. LabVIEW is such a monster. Sometimes it felt like working with a spiked elephant!

    After 2-3 years of agony, LabVIEW became somewhat usable, and by now, I have to say it is a perfect servant.

    Probably no language gives you the possibility to write (draw) sh*tcode for so long before you learn how to do it properly. NI’s effort at teaching LabVIEW s*cks balls. I never ran into a single example, solution or whatever that was coded well, and anytime I look for help on discussion boards, 99% of the code is a total mess and holy garbage. Even the projects their engineers show at those clown meetings boil my blood.



    Pros:
    -lowest development time for industry applications
    write a complete concrete plant control program in a week
    -multiple HW architectures, one environment ~ FPGA, controller with real-time OS, PC with general-purpose OS
    -cheap, huge amount of libraries
    yes, no jokes, if you are certified, they give you almost everything (SW) for $500 a year

    Cons:
    -huge, without concept
    it feels like the company is run by managers, not engineers
    NI keeps spawning more and more tight-purpose SW and functions
    the LabVIEW run-time is a fat pig
    -horizontal and flat learning curve
    it really takes some goddamned time to bend it
    -weak performance
    without C/C++ DLLs, you are done for high-performance apps
    this is probably what kills it most

    Both lists can easily expand a lot.

    Would I do LabVIEW again ? Maybe :)

    • chiraldude says:

      So I have read a good chunk of this Labview bashing fest.
      I just have to insert a contrary point here.

      I spent a number of years trying to work with text-based programs. Had to write BASIC in school and then a little bit of C for work projects. I really sucked at writing textual code! Damn difficult with all that cryptic syntax. It turns out that I have just enough dyslexia to make writing text-based code nearly impossible.

      One day, I was introduced to HP VEE (now Agilent VEE) and a short time later, Labview.
      OMG! I thought I was in heaven because it was now very easy to write code!
      No more compile errors because I missed a semicolon or had extra parentheses! No more variable names to keep track of!
      Now I can sit in front of my dual 20″ displays and write Labview code all day long without getting tired!

      It turns out that you can write Labview code that does 90% of what C++ can do and, if you are writing code to grab data from test instruments, it will be 10x faster to write in Labview.
      If you are a real code warrior and need full Class/Object support with virtual functions, Labview can do that too!

      As far as spaghetti-code complexity, it is simply a matter of discipline. If you divide your Labview code into logical blocks that are called via sub-VIs, you can avoid spaghetti code.
      This is the same as with text code. If you don’t split your code into logical functions that are called by “main”, you have the same sort of mess.

      • feta says:

        I think you had a typo: you wrote labview has “full” Class/Object support.

      • klessm1 says:

        BTW you had a typo in your post as well…it is spelled LabVIEW. Not labview, Labview, or LabView.

        I know of only a couple of OOP concepts not covered in LabVIEW (that should exist in an OOP language). I wrote a fairly large app in LabVIEW using around 150 classes and I didn’t run into a situation where I needed them. If you can list some of those deficiencies I will give you some credit. If you can describe a time when you were required to use LabVIEW and you needed to use those concepts I would be floored.

        As to the original post and most of the garbage in this thread, most of it boils down to poor programming practices. Slow speed, hard to read, hard to update…I don’t think it is LabVIEW sucking. I would hate to see your text code.

        LabVIEW haters like to point to the performance issue quite a bit. Most of the problems are due to poor software development, but if by some miracle it isn’t, then it is due to it being a high-level programming language. Like any high-level programming language you will lose some performance in speed and memory footprint, but you usually gain it all back in development time. A great tradeoff, as CPU speeds and memory size keep increasing all the time and my deadlines keep getting shorter. Throw in multi-core CPUs (which LabVIEW inherently takes advantage of) and you have a great case for a high-level language like LabVIEW for “most” tasks. And if I need something really fast in the hardware layer (like a communication protocol or real-time computations and feedback) then I would use an FPGA…and write the software in some abstract high-level language (LabVIEW) because I don’t want to take a year to write it. I say “most” though, because you have to use the right tool for the job. I wouldn’t write a web page, mobile app (for now), or low-level device driver (like a USB driver) with LabVIEW, because it is the wrong tool for the job. However, if I needed to write a large GUI-intensive application with a hardware abstraction layer and database interaction, I wouldn’t choose C, C++, Python, or MATLAB. I would choose either C# or LabVIEW. And since drawing is so much more fun than writing text, I would choose LabVIEW every time.

      • feta says:

        You can also use machine code if you like it. It lacks only a couple of OOP concepts too, so you’ll feel right at home.

        I use lABview because I’m forced to, and although my programs do not have 150 classes, I can assure you that the class implementation I found in version 8.6.1 is (mildly put) laughable. And I don’t care if it is better now; I’ve been burnt enough by NI updates.

        Addendum: lABview sucks beyond imagination. It is the only thing that makes BASIC look good.

  112. forced to use it undergrad says:

    I hate LabVIEW.

  113. Fred says:

    I think the best way to use LabVIEW is for GUI only, with a switch on it which calls a normal program in the background that does everything related to engineering.

  114. Hitesh Dhola says:

    Programmers might be hating LabVIEW because it looks easy (actually it’s not).

    It defies the purpose of being an expert at some programming language with tons of years of experience. Imagine someone new at the lab comes in and does in a few days the thing you had been doing for years.

    You hate it because you know it seems like everything can be done in LabVIEW in seconds (but actually that’s not true, even if everyone else thinks otherwise).

    Why do I hate Labview?
    I have been using it for the last 8 years; there is no other option. There should be competition.

    • EnzoKosmos says:

      http://www.opto22.com is your option. Opto22 has been in the I/O business since the 1970s; they came up with a production optical solid-state relay before anyone else. It’s a shame they haven’t given as much attention to academia as NI has, by giving away (or selling at a low price) hardware and LabView software to universities, but maybe they have good reasons.

  115. Intaris says:

    I have over 20 years experience in C, C++, Pascal and LabVIEW.

    LabVIEW has a fundamentally different operation model to most other languages out there. People learn to take variables and pointers for granted. They’re actually not needed for the vast majority of tasks. People who have only just managed to learn text languages generally don’t have the 1) energy, 2) computer science and general information theory background, or 3) time to learn HOW LabVIEW works before learning how to program in LabVIEW. The automatic assumption that you can port all of the idioms you learn in Language X into LabVIEW is the reason for well over 90% of the haters here.

    Sure, LabVIEW has its weak points (as does every single programming language out there) but as an experienced programmer I find reading the majority of the comments here to be hilarious.

  116. Diddleydoo says:

    The other funny thing is that those of us who “get” LabVIEW sometimes have trouble understanding why everyone isn’t moving over in that direction. Microsoft DID venture in that direction but as with HP’s VEE, it was simply inferior.

    I earn good money programming LabVIEW for a living. We have a team of 4 programmers implementing and maintaining scientific software for control of sophisticated scientific experiments (Scanning Probe Microscopy) involving Windows GUI, deterministic Real-Time devices and also custom FPGA development. I never need to leave the LabVIEW IDE to achieve all of this. Code re-use is generally pretty good across device boundaries, and if anyone wants to get even close to our productivity using a different language, good luck. You’ll have to employ three times as many programmers to even be able to keep up.

    And yes, we know what we’re doing in both a scientific and computing sense.

  117. Newt says:

    I’ve been using LabVIEW a long time, but originally programmed in a handful of text based languages. I like both. And yes, CVI is pretty great.
    LabVIEW is a bloated, slow, buggy pig: period. The drivers are crap. “LV OOP” is a joke. The toolkits and utilities are written by interns with crayons in their noses. But, it IS easier to get up and running quickly once you ‘get it’. I was fortunate enough to have written in VHDL in my earlier years which got my brain wrapped around the idea of dataflow programming early on. This is what’s hanging the rest of you guys up. You can’t think of the code sequentially because it isn’t sequential. If you can’t escape that mode of thinking you will never write decent LabVIEW code. Many of the complaints I’m reading about have more to do with poor practice than legitimate beefs with LV, and yes, there are many legitimate beefs with LabVIEW.

  118. Jan de Boer. says:

    31-07-2013 it still sucks!!!

    I’m starting with it and hate every minute. Fuck this shit. Can someone do my assignments for me? Then I’ll delete this shit off my computer.

  119. LVUser says:

    I’ve been using LabVIEW for the past 5 years and yes, my first programs looked very bad; after a while I understood how it worked and everything is fine now. I would say 90% of the LabVIEW haters don’t understand the concept of “Dataflow”, and because they don’t understand it, it is easier to say “I hate it”. I’ve seen tons of applications where everything is modularized and fits in one screen, no need for zoom or extra screen space. Discipline is the key.

    • Multi-Prog TTy says:

      This is a “tool for the job” argument. If you want to work out how to multi-thread in Visual Studio, then that’s fine, but in LabVIEW it’s just a case of writing simple code and it works. No UI lockups, no timing issues. A delay outside of a loop within-a-loop defines the time for the whole loop. In .NET you’re still stuck with a lot of thread mayhem for even the simplest of progress bar type applications, and it’s troublesome to switch from one type of coding to another (BackgroundWorker, Invoke etc.). But… pure event driven code is best left to C# or VB.NET. It’s much, much more transparent in a text environment. LabVIEW is awesome for hardware development jobs, and knowing how to program it properly is so, so different to classes and forms that most people get bogged down with wire hell, misuse of Local Variables etc. that they miss the wonder of how pure LabVIEW is as a programming language and dev environment.
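      The background-worker-with-progress pattern being contrasted here can be sketched in plain Python (a minimal, GUI-free model invented for illustration): a queue stands in for the UI thread’s message pump, so the worker never touches the “UI” directly — the same separation that Invoke/BackgroundWorker plumbing exists to enforce in .NET.

```python
import threading
import queue
import time

progress = queue.Queue()

def worker(steps: int) -> None:
    # Simulated long-running job that posts progress percentages to a
    # queue instead of updating a UI control directly.
    for i in range(1, steps + 1):
        time.sleep(0.01)
        progress.put(i * 100 // steps)

t = threading.Thread(target=worker, args=(5,))
t.start()

# The "UI thread": drain progress messages until the job reports 100%.
seen = []
while True:
    pct = progress.get()
    seen.append(pct)
    if pct == 100:
        break
t.join()
print(seen)  # [20, 40, 60, 80, 100]
```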

      I work as a senior developer in a science/research environment programming a huge variety of hardware automation projects.

  120. Jon says:

    What the farmer doesn’t know, he doesn’t eat!!

  121. Peter Meier says:

    I used LabView for years; it’s made for measurement-data analysis, and for this purpose it’s really good, as long as the problem doesn’t go beyond a university exercise.

    1. Labview libraries change too often and functionalities get removed by NI, which is not acceptable at all. Updating becomes a hazard for production environments. Sometimes an engineer has to update the OS, and with the OS, LabView.

    2. Controlling resources like serial ports or a CAN bus is a real problem. Take a look at the Python “with” statement …

    3. Real-life tasks like inserting data into a database should take minimal effort. Instead, it becomes hell.

    4. The documentation is bad. The help function isn’t a replacement for docs like Python’s. The LabView docs drove me crazy: the first time I tried to use the CAN bus, there was no example. Even for a TCP/IP client, there was no usable and easy-to-understand example. When I had to control a camera for a vision system, I had to insert pause times by trial and error.

    5. The price could be acceptable but isn’t because of 4. and 1.

    6. Programming with LabView can be easy as long as the data-flow paradigm isn’t violated. As soon as one tries to do something with strings, the readability goes bad.

    7. Accessing resources like instruments and RS232 is expensive when the program opens the devices. This leads to designs with one page opening devices and routing handles through all states of a state machine, which makes the program very ugly.
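    The Python “with” statement mentioned in point 2 is what makes resource control painless: the port is released even if the code inside the block throws. A minimal sketch of the pattern (the `FakePort` class is a stand-in invented for illustration; with real hardware you could wrap something like pyserial’s `serial.Serial` the same way):

```python
from contextlib import contextmanager

class FakePort:
    """Stand-in for a serial-port handle (illustrative only)."""
    def __init__(self):
        self.open = True
    def close(self):
        self.open = False

@contextmanager
def managed_port():
    port = FakePort()          # acquire the resource
    try:
        yield port
    finally:
        port.close()           # always released, even on exceptions

with managed_port() as p:
    assert p.open              # port is usable inside the block
# Outside the block the port is closed, success or failure.
print(p.open)  # False
```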

    Some tips:

    a. Modularize as much as possible.
    b. Because of a., first write small modules controlling the hardware and other software like databases.
    c. Then merge the modules together in a state machine.
    d. Do a state machine for resource control and control of sequences at the top.

    • drjdpowell says:

      Re (7): I actually like the fact that poorly written LabVIEW looks terrible. I sometimes have to upgrade old code, in LabVIEW or C, and I can often assess a LabVIEW program for quality in moments, while with text code it can take a lot of study before I can appreciate the full horror. LabVIEW spaghetti actually looks like spaghetti.

      — James

  122. Peter Meier says:

    And to mention Python:

    1. Resource management is much easier
    2. String handling and database operations are simple
    3. With scipy and numpy scientific operations are easy
    4. It’s not at all a problem to program with multithreading or with multiple processes
    5. Interprocess communication is easy
    6. With django or web2py presenting webpages with data is easy
    7. Logging is simple as hell
    8. Real objects

    Python is just missing the interface to NI hardware. I really have to work on that. When you have to design a test station for a functional test, Python or a .NET language is my favorite.

  123. Eldrich says:

    I’m another one of those who typed ‘I hate LABVIEW’ into Google and this came up. It’s God-awful to read someone else’s program, and I can’t even begin to make sense of it.

    LabView is #fail

    • chiraldude says:

      Funny, when I try to read most C++ code I say the same thing. Lots of cryptic symbols, code branches that jump all over the place, header files with cryptic names and comments few and far between. Readability must be part of the code specifications from the start no matter what the language.
      I work at a company that uses lots of LabVIEW code. We have formal coding standards that are enforced during code reviews. When LabVIEW code is written by a disciplined, experienced programmer it is much easier to read than the same code developed in a text based language.

  124. marco says:

    Just some questions: have you ever used LabVIEW as more than a painting tool on your computer?
    You don’t want to, or you aren’t able to?

  125. Maciej says:

    LabVIEW is good for some things. It is really quick to prototype with, especially in conjunction with NI hardware. It’s great for test systems (to test hardware, FPGAs, embedded code).
    It does lack in object-oriented design (there is OO, but it is awkward; the Actor Framework is cool, though).

    LabVIEW integrates with .NET very well, so for product development it’s best to use LabVIEW for the data-acquisition part and use .NET for application logic and UIs.

    LabVIEW is a tool; when used properly, it is very powerful.

  126. wesramm says:

    It is clear that, for most of the negative comments here, the poster hadn’t spent more than a minute or two (literally) with LabVIEW before deciding it was terrible.
    For instance, the rhetorical question about “where does the program start” can be answered in seconds.

    The end result of the application is all that you can “certify” for quality, whatever the programming language. LabVIEW is just one tool amongst many for Engineers, but the assertion that you cannot be an Engineer unless you know how to program C is ridiculous. I am an engineer, not a computer scientist. LabVIEW helps me do that, it works well when you know how to use it, and it is well supported.


  128. Marcus says:

    Wesramm has it right. LabVIEW is just a tool; if you can’t, or refuse to, learn new tools, then you have to question your abilities as an engineer.

    The list of original complaints is laughable to anyone who has done just a few small projects. This is a list made by a computer-science guy who can’t accept that his line-coding skills are not needed to run an automated test bench, create modulation files, do digital filtering, demodulate SSI data, or do thousands of other things that are more quickly and easily done with LabVIEW.
    I started out twenty years ago setting up my bench and collecting data with line code. Used it for years. LabVIEW came along and I pitched the line code immediately. No contest.

    • EnzoKosmos says:

      If LabView works for you, then great, keep using it. However, if you have a choice and you don’t like LabView, then use Opto22 PAC Project Basic (free) or Pro ($995/license). Opto22 software is now compatible with Allen-Bradley hardware. Opto22 hardware is less expensive than NI’s, and the software is less expensive as well. I have been using Opto22 hardware and software for 14 years at Kennedy Space Center supporting life-science research and find it easy to use and reliable. Opto22 now offers remote access to your system using groov web-server hardware and software. Go to http://www.opto22.com to find out more. I am not a sales rep; I am a satisfied customer, engineer, technician, etc.

  129. EnzoKosmos says:

    Opto 22 Provides National Geographic with Control System for Its Deepsea Challenge Expedition


  130. poopytowncat says:

    Best thread ever on LabView. Excuse me if I’m only up to 2011 and missed something. I’ll resume reading tomorrow.

    Anyway: I didn’t see any mention of LabView magic. I was told, ‘When you get tired of drawing the G**d*** m***** f****** wires, you can get out of any corner you’ve painted yourself into by saying the magic word and using global variables.’

    In response to someone in 2011 who said you get to use both sides of your brain: I was told by someone who had just copied and pasted about a thousand case-statement blocks that you don’t really have to think about programming – LabView code just ‘flows out of your finger tips’. Stuff that ‘flows’ out of my body ain’t code.

    Oh, and another thing. LabView folks say LabView makes parallel programming easy. Just draw those while- blocks – don’t worry – as many as you want! They’re all doing some wonderful thing at the same time! LabView takes care of stuff for you! In contrast – anyone using some old fashioned text language is cautioned to be very careful – use multiple threads with caution. Pay attention to sync… you know, all that geek gobbledygook. Who cares!

  131. Edward Vogel says:

    I just tried to add a text box to an existing VI and cannot figure out how (remember? I wrote some of this piece of crap some months ago…). Looking forward to replacing this application with Python. All it does is parse ASCII serial data into strip charts…
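    For what it’s worth, the job described here (parsing ASCII serial data into per-channel series for strip charts) is a few lines in Python. A sketch assuming comma-separated “channel,value” lines; the line format and function name are invented for illustration:

```python
def parse_lines(lines):
    """Group ASCII 'channel,value' lines into per-channel value lists."""
    channels = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines between readings
        name, value = line.split(",")
        channels.setdefault(name, []).append(float(value))
    return channels

data = parse_lines(["CH1,1.25", "CH2,0.50", "CH1,1.30"])
print(data)  # {'CH1': [1.25, 1.3], 'CH2': [0.5]}
```

    Each per-channel list can then be handed straight to a plotting widget as a strip-chart trace.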

  132. […] few results, given the venom I have encountered personally, but complaints fall along these lines (one example and […]

  133. mrpibb64 says:

    I solved production problems at a shop that was primarily LabVIEW-dependent. First I updated/corrected existing C# .NET test code, then the electronic test fixtures and LabVIEW code. After this, the company let me go because they wanted someone with ‘advanced LabVIEW certification’ going forward. This occurred at the height of the stupid economic recession, and from that point I was basically out of work for 3 years!!! That made me hate LabVIEW that much more!

      • mrpibb64 says:

        It is what REALLY happened to me, friend. Nothing I could do to change the situation. I could code LV as well as the next guy. There is no nonsense about it!

        I have played the ‘skill-set match game’ for jobs, and I relocate for those jobs as required. I used C#, and now I use Python, because this specific job specialty demands it. So once again, THERE IS NO NONSENSE ABOUT IT!!!!

        If you have not had to retrain yourself, then you are clearly nearing the end of your career, as things continue to change.
        LV jobs will continue to shrink and disappear in favor of Python-based implementations. Don’t believe me? Check out job descriptions yourself.

  134. abhinandan says:

    I agree with the extreme cost involved with anything associated with NI, but only that.

    I am not a guy with 20 years of experience; I explored LabVIEW recently for a couple of projects. But I think every poster on this page needs some saving.

    ROFL, people. It’s frickin’ NI.

    How would a company so huge be built around something (supposedly) so flawed?

    I have used C, C++, Java and C#.
    I have used LabVIEW TOO…

    LabVIEW is better.
    Stop with the shitting on it. My god. Educate yourselves.

    Many of the problems posted here, or versions of them, I went through myself during my getting-used-to phase of LabVIEW.

    I happened to have discovered the solutions to frickin’
    EACH AND EVERY PROBLEM in the forums or on YOUTUBE. Have you heard of it?
    Yeah, YOUTUBE. Use it, people.

    People here claim to have years of experience with so many different programming languages and hardware stuff…

    I guess that’s the problem. Please frickin’ re-learn LabVIEW with an open mind.

    All of you will save yourselves a lot of time and be able to spend more time with your families at home, rather than shitting all over a frickin’ company and their product
    (yeah, those people have made it, they have a company, a brand to be proud of, and you don’t!!!!! Deal with it).

    Or if you’re so sure LabVIEW is shitty,
    make something better and prove your point.

    Now don’t start with “hey new guy, you too are posting here, you too don’t have a life.”

    I do. I felt so sorry for the people in this thread that I felt God would punish me if I didn’t say something.

    And on the first point:
    they cost because they’re good.

    NI has had a major role with the LHC.

    Frickin’ beat that.

    Go get yourselves saved. Re-learn LabVIEW.

  135. HD says:

    This is awesome that seven years later, replies are still being left in this thread.

    I hate, despise, detest, and loathe LabVIEW also. And Matlab. They’re horrible: hard to use, poorly documented, and ill-supported. I just fixed a bug by hand-copying all my “code” into another project, because somehow this project became “corrupted.” No one knows how, there’s no tool to detect this, and there was no guarantee it would work, but it was the only thing NI could come up with. This was a non-trivial, days-long task. Sigh.

    That said… I’m not a programmer, though I do know how to program, have done some serious programming in my youth, and have written books on it. My day job is to be an experimental scientist. And while I am at this very instant supposed to be tearing my remaining hair out debugging a stupid LV FPGA bug instead of writing this post, everything being equal, LV (and Matlab), for all their faults, still work better than most everything else.

    In fact, in a building not far away, there are N FTEs striving to reproduce a set of LV and Matlab algorithms, which work, and more importantly, can be verified to work, on an NI real-time chassis and in an NI FPGA, in a regular FPGA on a custom board. We can’t send the NI chassis into production, so we have to reproduce the functionality in a custom board.

    These guys aren’t bozos – they are the cream-of-the-crop from MIT and similar institutions. They regularly get poached for higher pay elsewhere (cough, Wall Street).

    These guys however are not scientists – they’re engineers. They largely don’t understand the constraints under which the algorithms were designed, nor should they.

    If engineers had been involved from the beginning, the project would have cost twice as much, because instead of just changing something, every change would have to be explained to the engineers, then implemented and compiled, then tested, then changed again. At best one turnaround per day. This isn’t speculation – debugging their systems, I’ve seen it in process.

    If I had had to learn VHDL to build the system, it would have never been built – because I don’t want to be a programmer. And none of the other scientists would have been able to help. Because they don’t want to be programmers either. It’s hard to make people do what they don’t want to. It’s not like I don’t have other things to work on.

    Instead, using LV (and ML) allowed a bunch of non-programmer scientists to iterate the design until it worked, on a hardware system, from simple proofs-of-concept, to a fairly sophisticated measurement machine that we can use daily and adapt to other purposes fairly simply.

    Now we just have to explain it to the engineers once, and show them the performance required, and they go off and try to emulate this design in VHDL, knowing it can be done – here’s a working system to compare to. We also have a suite of test equipment which can be used to validate the system.

    This isn’t just idle speculation about the costs. I know of a team who tried to do a competing item, whose very competent scientists did learn VHDL – they failed, because someone who does it a couple hours a day is never going to get to the required skill level. I know of another team who tried to implement this system using engineering help – it took them twice as long to arrive at a working system which succeeds by brute force (we can implement the algorithm on a Virtex II while they require a Virtex V – and still don’t do as well).

    This was possible because any of a group of scientists could look at the LV, make changes, and see the results inside of two hours. Turnaround of 3x/day, instead of 1x/day – if lucky.

    In addition, we were able to make useful, intuitive GUIs that displayed information however anyone in the group desired. We have yet to convince the engineers that displaying the data properly for analysis is just as crucial as implementing the algorithms. And again, they’re the only ones who can change the non-LV displays, so any time a change is requested, it falls into the priority queue, and displays are never part of anyone’s critical design review priorities. So they don’t get updated.

    And don’t even get me started on NI’s repair policies (it’s $2500 to fix a processor, no matter what, even if it turns out to be a bad solder connection on their part). And with FlexRIO, while everyone else’s hardware prices drop with Moore’s Law, NI prices have basically doubled.

    So yeah, I hate, despise, detest, and loathe LV, and will probably for at least another six hours today. But NI LV and its hardware allowed us to succeed at a very difficult task that I’ve watched other equally competent organizations fail at for years. In the absence of information, these groups made different, and equally defensible, design choices.

    Horses for courses. LV, and ML, work for some things. Much as I detest going back to that screen.


  138. Yoni says:

    Naaaa… LabVIEW programming is pretty cool. You just don’t know how to do it. :)
    Almost all the problems you mentioned can be solved with good programming practices.

    And matlab is even better.

  139. iTm says:

    I agree with your title but disagree with most of your reasoning. I find it a fantastic language to write code in. It is programming for draftsmen, and it takes a leaf out of PLC code as well (ladder logic). Sure, it can get a bit clunky when you are trying to mimic certain text-based languages, but most of the time, if I need those functions, I am writing rubbish code.
    I hate it because it is so buggy! Compile, deploy – fail! Re-open project, deploy – pass! Clear compiled object cache – pass! No consistency. We have a list of 25+ workarounds for problems NI denies exist or doesn’t care enough about to fix. That’s what I get for my $2500 per year. Boot time is appalling as well: I have an application that takes 8 minutes to load from an .RTEXE, because it decompresses all the data from the file into RAM before running it.

  140. Mark Spitz says:

    I’ve used a few NI products with their associated drivers in Visual Studio applications. These are real-world applications that run 24/7 and need to be supported. The result? Constant licensing and version-control issues, and a support staff that is incapable of helping to solve them. In one case we had to ask a customer to send their PC back to us to work out the driver issues, and this was for a simple USB I/O product. I can’t recommend NI to my customers anymore because of these bad experiences.

    I can see how the fully integrated LabView package, with NI boards and drivers with GUI appeals to some folks, for some applications that NI shines with as mentioned above.


  143. Martin Kunze says:

    To whom it may concern

    Since 1981 I have programmed and developed hardware.
    In 1996 I started programming LabVIEW. LabVIEW 4 was some kind of horror: no undo, no events, no references …

    I totally agree that many people use LabVIEW as a tool made for people who aren’t familiar with programming.
    This always leads to chaotic code
    that nobody in the world is capable of handling and debugging.
    Using local and global variables leads to race conditions, using occurrences leads to hang-ups, and so on.

    There is no way around using:
    – Events
    – Producer/consumer constructs
    – Queues
    – Classes
    – Type-def clusters

    and never using historical things like:
    – Local variables
    – Global variables
    – Occurrences
    – Units
    – Wait
    – Wait Until

    Then you can create even very large, high-performing applications with minimal CPU load. These applications need documentation too, but they are very maintainable, easy to read, easy to understand, and easy to enhance.
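    The producer/consumer construct recommended above has a direct analogue in text languages. A minimal Python sketch using the standard-library queue module (the producer stands in for an acquisition loop, the consumer for a processing loop; names are illustrative):

```python
import queue
import threading

q = queue.Queue()

def producer():
    # Stands in for a data-acquisition loop pushing samples.
    for sample in range(5):
        q.put(sample)
    q.put(None)  # sentinel: tells the consumer to stop

def consumer(results):
    # Stands in for a processing/logging loop.
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)

results = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 2, 4, 6, 8]
```

    The queue decouples the two loops exactly the way a LabVIEW queue decouples two parallel while-loops, with no shared variables and no race conditions.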

    From reading many of the previous comments I can see that many of you have faced bad LabVIEW code, and some of you don’t have the required skills to program in LabVIEW effectively and well.

    Sorry to say it, but in LabVIEW I’m always 10 times faster than in C or C++. My programs have several hundred VIs, but they are readable, maintainable and reliable while running 24/7, even in security applications.

    One thing I will confirm is that NI technical support isn’t capable of handling specific questions. Anything that needs more detailed knowledge than normal is unknown to them. This is caused by the grown complexity and breadth of LabVIEW.

    Kind regards

  144. Bruno says:

    Very good words, Martin Kunze.

    If you really want to know whether it is GOOD or BAD, compare it to a gun.
    You can say it is bad, because it can kill people around you.
    You can say it is good, because it can kill a bear trying to catch you.

    Programming environments only do what programmers ask them to do.
    If you do things the wrong way, you will have a bad experience.

    No matter what tool you are using, training and good references are required to use it correctly. NI, for instance, offers very good training, hands-on sessions and certifications in almost every country.

    Best wishes.

  145. Vandwo says:

    Started a job with that program recently.
    I would say it’s totally fucked up. Yes, you can make a quick test program to run some machine, but you can never, and I underline NEVER, use it for programming end devices! It’s a program for quickly making graphical software: logging strings from other units, combining them, controlling some units, making user interfaces for the operators to work with, but only in the test lab. This software should not ever get out of the test labs. And I totally agree, it really is a pain in the ass with the mouse clicks and tiny spaces. It took me a week to master how to add buffers in my case structures (finding those little freaking spots at the sides of the case structure…).

    All in all, I will quit my current job because I’m unhappy with this graphical language and would need to invest a few years to completely master it. I prefer good old C and Python instead: more logical, sequential languages.

    • Martin Kunze says:


      I have built large alarm and safety systems with several hundred VIs.
      They are running on vessels 24 hours a day, 7 days a week.
      They are maintainable and reliable.
      So, from my point of view,
      both of you should read the manuals and attend some courses.



    • chiraldude says:

      Sorry to hear of your unpleasant experience. However, it was your employer that F***d up. Why did they hire someone with no experience?
      You say you need to invest a few years to master it, how many years have you spent mastering C and Python? Now you have another skill to add to your resume.

  146. Jeff Holt says:

    Good code is a universal concept (and bad code is everywhere). All languages require education, experience, and discipline. LabVIEW is a great tool, but like all tools it is not right for every problem. Here is an example regarding SpaceX, which I found here http://www.businessinsider.com/how-to-get-a-job-at-spacex-2013-2 but which is really a recap of a Reddit AMA:

    There are four separate software teams – flight software, enterprise information systems, launch engineering, and the avionics test team. And yes, they’re hiring software folks.

    They explained what you need to know for each of them:

    * For Flight Software, C++ and algorithm/data structure knowledge are very important.
    * For Enterprise Information Systems, C# or Front End experience + great algorithm/data structure knowledge.
    * For Launch Engineering (the team that uses LabVIEW), awesome LabVIEW + great algorithm/data structure knowledge.

    For myself, I happen to hang my hat on LabVIEW and am very comfortable in the knowledge that I will always have gainful employment. I get paid and enjoy what I do – who could ask for more?

  147. Martin Kunze says:

    Dear Jeff,

    you are right!

    As I said before, LV is a great tool.
    I really like it and use it every day, but it’s not the Swiss Army knife that solves every kind of challenge in this world.
    I have done some frame-grabbing applications.
    First I used NI tools: processor load nearly 80% on a 1.6 GHz P4.
    Then I wrote a DLL and just displayed the result: processor load 27%.

    This is not because of bad code from NI.
    It is just that NI always tries to build the “eierlegende Wollmilchsau” (the egg-laying wool-milk-sow that does everything) to make things as easy as possible.
    That sometimes causes a big overhead.

    As you mentioned, a good and satisfying job is what we all need to be happy.

  148. This_makes_me_giggle says:

    1. Inability to write descriptive comments!
    Which comments are you looking for? You’ve always been able to double-click to drop free text. You can add a tip strip to any item on your front panel or go into the VI properties to comment on the VI as a whole.

    2. Inability to name variables!!!
    Just rename the label and you’ve named your variable. If you’re using subVIs, the label determines what the connection is called.

    2. Nonlinear, graphical programming interface:
    Left to right is just as linear as top to bottom. If you’re losing linearity, odds are you should take a step back and look at the code. Nonlinear LV code is bad LV code.

    3. Messy, horribly hard-to-follow programs! Wires everywhere!
    Clusters and subVIs minimize wires. Good code uses these.

    4. Extreme difficulty to insert new commands into an established program without ruining the organization structure!!
    You’d have to explain this one more for me to understand your issue. Was your original code not scalable? This happens in all coding languages. Use typedefs to make things easier to expand. Group similar items together into clusters/arrays. If you want to add another function, simply add another subVI.

    5. Frakking impossible to debug!!!!!
    Has this increased since you wrote this blog? Highlight execution, probes, step into/over, and breakpoints are the same tools you’ll experience elsewhere.

    6. Computer processors operate linearly anyway–LABVIEW IS LYING!!!
    You don’t understand data flow. If you write a text-based program that executes two functions to determine variable values before calling a third function using both of those variables, you’re doing the same linear task lv would do for you.

    If you’re focusing on the idea of multi-threading, it’s a concept that also exists in text-based. Personally, I prefer the parallel loops to the text-based code to handle multiple threads and cores.

    7. Sequence structures–the most cumbersome way possible for the LabView creators to have tried to rectify the problem that sometimes YOU JUST NEED TO EXECUTE COMMANDS IN ORDER JUST LIKE A CONVENTIONAL PROGRAM, DAMMIT!!!
    Sequence structures are generally evidence of bad code. If you need to execute things in order, just include an error wire. This enforces data flow and eliminates the need for sequence structures. It also allows you to easily control error handling.

    8. Mouse sensitivity! As in, my programming ability should not have to rely on my skill to accurately position the mouse over some of those frakking tiny terminals!
    I’ve got nothing here. This annoys me sometimes too. When I get really annoyed, I just drop the tool menu and choose the precise tool I want.

    9. Timing structures–THEY DO NO SUCH THING!
    Are you talking about the real-time module add-on or using a while loop with a delay? I’ve never bothered to buy the real-time module. Everything else relies on Windows to control the timing; Windows shares resources, so your timing will have some inherent jitter.

    10. The fact that it has to rebuild all its data acquisition sub-VIs every time I want to make a tiny change to the sampling mode!
    You’re using Express VIs rather than actually coding the task. Using the actual calls will change this.

    11. Shift registers and sequence instances! The saddest excuses for variables on the planet–and they contribute to the messy wiring problem!!
    I’m not sure why shift registers upset you. I use them rather frequently without issues.

    12. It handles arrays in an extraordinarily clunky manner–and when you’re taking data, the role LabView is best suited for, MOST OF THE TIME YOU CAN’T POSSIBLY AVOID USING ARRAYS!
    Clunky on the front panel? On the block diagram you can choose to view them as an icon and they’re rather small.

    I can’t help you there.

    Out of the 12 things you’ve complained about, 10 are related to your experience level. There are plenty of valid complaints, as there are with any language. Most of what you said is like me saying “I can’t use spaces in variable names in C, so I’m unable to make clear variable names.”

    • Martin Kunze says:

      !!! You hit the Point !!!

      People without skills talk rubbish.
      Nothing else to say.

    • abhinandan j says:

      My opinion…

      LabVIEW = elegance.

      The only thing I agree with on this page is that NI is costly to deal with.

      I am okay with that, because LabVIEW and their products work like a knife cutting butter.

      LV is like the James Bond of coding environments…

      And I don’t know about the customer-support issue, because I’m just a graduate looking for a job.

      I used LabVIEW in 2 projects and made all the mistakes mentioned during my first project. This was because I didn’t have time to look into “the right way”: I was new to LabVIEW and had to get the damn thing working. (I did.)

      During my second project, however, the time I had enabled me to make all the (ALL THE), yes,

      ALL THE changes this poster has indicated as solutions, and I just fell in love with LabVIEW (I also know C, C++, C#, Java, etc.).


      You people are missing out on something very beautiful.

  149. Casa says:

    LabVIEW has only one “real” problem and that is lack of competition.

    This has resulted in:

    1) LabVIEW is expensive (not so much LabVIEW itself, but every time you move on, there’s another toolkit to buy – try asking for the price of the LabVIEW C Generator!).

    2) You can only program on a desktop computer or dedicated National Instruments hardware.

    3) You cannot program mobile devices or microcontrollers.

    4) You are dependent on one company for upgrades, support and innovation.

    5) You are only allowed to do what National Instruments allows you to do.

    6) Small user base. Less than 1% of all software development happens using LabVIEW.

    The language itself is easy to use and powerful (on a desktop computer). Technically it’s great at instrumentation control and good at general tasks such as file processing.

    A competitor in the graphical programming space would be a welcome addition for customers.

    • lvbutthurt says:

      1. I agree, but it is a company trying to make money and at least break even on software-development costs. I have never seen another language vendor that gives away toolkits as extensive as NI’s. Sure, there are some free libraries elsewhere, but there are a ton of free libraries for LabVIEW as well.
      2. I can program just fine on a laptop. For very large projects I like to program on a nice desktop, but that’s for >5000-VI projects; smaller projects load just fine on a laptop.
      3. You can program FPGAs and some microcontrollers. You can build web services and real-time apps. You can’t write a mobile app (client side), but you can expose web services to your mobile app.
      4. A little competition never hurts, but they do continuously innovate and improve their products. They take input from their user base and implement some of the better suggestions.
      5. Not really… directly in LabVIEW, maybe, but what they “allow you” is a lot. I don’t know of many languages that let you program FPGAs, build EXEs and DLLs, create web services, and develop real-time applications.
      6. Yes, it is small, as it is a 4th-generation language. Plus it costs money. I wish it was free too.

      • Casa says:

        Hi lvbutthurt,

        Great to hear from you.

        When I was referring to the desktop, I was trying to differentiate between desktop (and laptop) versus microcontrollers and mobile devices. Ten years ago, the desktop (and laptop) was king, but embedded is becoming increasingly more prevalent.

        You can only programme FPGAs from National Instruments. You cannot programme microcontrollers from anyone. You may think that LabVIEW can programme microcontrollers, but this is not possible.

        Not being able to target tablets, phones and microcontrollers is a severe limitation. Being limited to only NI FPGAs is also a limitation.

        I would never ask or want LabVIEW to be free. It could be reduced from $4500 to $1500, which I think will keep it revenue neutral, as the number of users should more than triple.

        Just because something is a 4th-generation language is no reason/excuse for it to have a very small user base. Perl, Python and Ruby are also high-level languages, and they have many more users. In any case, regardless of generation, the number of users/popularity is a good indication of customer rating.

      • lvbutthurt says:

        I used to think that NI missed the boat on web clients and mobile (I wanted to write web apps and mobile apps with it), but they need to stick to their bread and butter, and that is software for hardware. Not much hardware can (or should) be connected to mobiles and tablets (besides simple sensors). Through the cloud, yes: connected to distributed real-time systems, RIO boards, or servers. This is where NI can help. Even though I would like LabVIEW to run on everything, I know it isn’t really feasible, and even if it did, would it be the best tool for the job? Maybe not.

        Yes, only NI’s FPGAs, but you are complaining about that? Give me another 4th generation language that can do what LV can do and program FPGAs (even if it is their own hardware). I can’t think of one. The hardware is so cheap compared to the resource costs of VHDL programming (and project time costs as well).
        They do support some ARM models: http://www.ni.com/white-paper/6207/en/. It’s expensive, and I’ve never used it, but you can do it.

        I would like it free, and awesome toolkits (like rf toolkit, or FPGA module) for purchase. Even 1500 is too much. It is a gateway for their hardware…it makes using their hardware easy so I’m willing to buy the hardware.

        It is a 4th gen language (which has a targeted audience) AND (big and) it costs quite a bit of money. If it were free, the user base would most likely be quite large (an assumption based on no data, but I think an accurate one). The other languages you mentioned are free.

      • Martin Kunze says:


        I would just mention that NI has supported / supports Win-CE / Win-Embedded. I used it about 10 years ago on an ARM7.
        I agree that many toolkits are expensive, but a guy from NI told me that the effort for a single device driver is 50 man-years of development time.
        So everyone can calculate on their own how many devices must be sold to reach the break-even point.

  150. Casa says:

    LabVIEW Embedded for ARM is no longer available. If you go to the white paper you referenced and click on the ni.com/arm link, it will give you “Page Not Found”.

    NI abandoned LabVIEW Embedded for ARM years ago, much to my disappointment.

    • lvbutthurt says:

      I wasn’t aware they stopped selling it (again, never used it, so I didn’t pay too much attention). I think the reason they bailed on it was that it was too large an investment and they were not gaining market share. I remember it coming out and thinking it was a neat idea, but again, I never really had the need to use it. I think most customers (not all) would want something to go with that processor, like say an FPGA and some measurement modules. Enter cRIO (which I have used): a much larger head start for the user. If you already have a design, or need to develop a low-cost product for mass production, then it won’t work too well. In that case you need to bite the bullet and start writing some low-level code.

      • Casa says:

        The reason they stopped selling it is because they want you to buy their hardware and the more expensive cRIO. As you stated, cRIO is a good solution if you’re only making 1 to 10 units. If you want to have something that is required in quantity then you need to look at something like a microcontroller and programming in C.

        This is precisely what I did. I bought the LabVIEW Embedded for ARM evaluation kit and it was great until I hit some problems that I won’t go into. NI said, what you need is our RIO solution. I said, what I will do is use C and the very nice Texas Instruments TM4C1294 microcontroller (built-in Ethernet, including PHY, USB OTG, etc). Rather than use an FPGA plus a microcontroller, I use 2 microcontrollers – one for the hard real-time deterministic stuff and another for the main program, user interface and data comms. Very nice solution and very inexpensive.

        I like LabVIEW for the desktop (and laptop ;-) ), but when it comes to the embedded world, nothing beats a microcontroller (or two) and that means C or similar rather than LabVIEW.

  151. mymoon says:

    I’ve been working on a LabVIEW project for the last 2 months. I have written C, C++, C#, assembly, BASIC, Pascal, Delphi and Java programs since 1985 (well, it was mostly assembly and BASIC in the 80’s, maybe Pascal and COBOL too). But I’ve never ever used such a bad language. Wires everywhere. The data types are absurd and using arrays is a pain.

    • chiraldude says:

      You put the wires “everywhere”, not LabVIEW.
      Put down the wires carefully and neatly or you end up with a mess!
      Data types are pretty straightforward, so no idea what you mean by “absurd”.
      I do agree that array representation is cumbersome compared to text languages. It takes some getting used to.

  154. Hey You says:

    People seeing LabVIEW and how easy it is to “write” simple code assume that LabVIEW is easy – easy to learn, easy to program. And no one ever told them that it is like every other language – you need to LEARN to program it and you have to follow good practices. Just like in C++, where if you do not use indentation your code is a mess.

    So if you have a mess in your code it is your mess and not a LabVIEW mess. When I look at my programs, every one of them is readable and scalable. Programs fit the dimensions of my monitor, the dataflow is easy to see and everything is modularized, so when I want to make a change I always know where to make it.

    I have heard that there was a time when National Instruments talked about LabVIEW only as something easy and fast to program. And it is true, but you still need to care about good programming practices. National Instruments says so, at least nowadays, and they offer many LabVIEW courses (Basic, Advanced…) that I think everyone who wants to program in LabVIEW should take, at least LabVIEW Basic.

    I keep my code clean. My wires are straight and I create subVIs. I use clusters so I do not have a bunch of wires on my block diagram; they are all held in one common wire. If I want to read a value from one variable, I just unbundle the cluster (use ‘unbundle by name’ initialized with a type-def cluster, not just ‘unbundle’ -> learn how to program in LabVIEW). All my clusters, enums and the like are type defs, so when I make a change, these elements are updated in all other locations for me. I use a state machine instead of a sequence structure. I do not use local variables for anything other than initialization. I have forgotten how to create a global variable, because I used one once, 2 years ago, and have never needed one since. I write comments in my code, but I remember that code should be self-documenting, so I name my constants, variables and subVIs properly.
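
    For readers coming from text languages, the state-machine pattern described above (in LabVIEW, a while loop plus a case structure with the state carried in a shift register) might be sketched in Python roughly like this; the state names and measurement steps are invented for illustration:

```python
from enum import Enum, auto

class State(Enum):
    INIT = auto()
    ACQUIRE = auto()
    ANALYZE = auto()
    DONE = auto()

def run_state_machine(samples):
    """Rough text-language analogue of a LabVIEW state machine:
    each pass through the loop handles one state, and the 'shift
    register' (here, plain local variables) carries the next state
    and the data forward."""
    state, data, result = State.INIT, [], None
    while state is not State.DONE:
        if state is State.INIT:
            data = []                       # initialise once; no globals
            state = State.ACQUIRE
        elif state is State.ACQUIRE:
            data = list(samples)            # stand-in for a DAQ read
            state = State.ANALYZE
        elif state is State.ANALYZE:
            result = sum(data) / len(data)  # stand-in for analysis
            state = State.DONE
    return result
```

    Unlike a flat sequence structure, the order of operations can be changed, or new states added, without rewiring the whole diagram.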

    So finally, when I need to create a program with some kind of generation, acquisition, calibration, data analysis, GUI and so on, and everyone says “2 weeks to get it done”, then as a LabVIEW programmer I say “2 days”.

    Of course there are differences between languages. No one can say “C++ is better than Java”, because there is no comparison. In LabVIEW you can create a serious and reliable measurement and control system 5 times faster than in other languages, but you will not create a game, because it is not the language for that.

    I know cutting-edge projects that are built with LabVIEW. But you need a LabVIEW programmer to build them, just like you need a C++ programmer to write C++. The thing is that you do not need to be a programmer to write SOME code in LabVIEW, so everyone can make small, basic programs – read data from a file, run an analysis, show it on a graph.

    So yes, LabVIEW is fast and easy, and yes, LabVIEW is for everyone. But when you want to create a real program, not just display data on a graph, you need to take a course and learn how to program. And then LabVIEW is easy and fast.

    Greetings from Cracow,
    Hope that one day you will take some courses and will finally have fun programming in LabVIEW. Every day I see that LabVIEW is worth its price, but you need to know how to use it properly.

    • Casa says:

      Hey You.

      I would like to echo your comments regarding the need for LabVIEW learning. Just like any programming language, LabVIEW requires training/learning to harness its full potential and produce a well-written application. No magic bullet here.

      One of the problems is that LabVIEW makes it so easy to get a simple prototype application going that people assume it’s OK to proceed straight to a serious application. There is just as much learning required to fully utilise LabVIEW as there is to fully utilise any other programming language.

      So, to those thinking about using LabVIEW: sure, play and get a simple prototype application going. If you then want to go on and write reliable and serious applications, make sure you spend a few weeks just learning (NI training or a textbook). Warning: do not just dive in and write something that your organisation will rely on. You should be comfortable with shift registers, queues and events before embarking on a significant project.
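
      A minimal sketch, in Python terms, of the queue-based producer/consumer pattern mentioned above (in LabVIEW, two parallel loops joined by a queue); the sample values and the doubling step are invented for illustration:

```python
import queue
import threading

def producer(q, n):
    """Acquisition loop: push readings onto the queue as they arrive."""
    for i in range(n):
        q.put(i * 0.5)        # stand-in for reading a DAQ sample
    q.put(None)               # sentinel: tell the consumer to stop

def consumer(q, out):
    """Processing loop: drain the queue at its own pace."""
    while True:
        item = q.get()
        if item is None:
            break
        out.append(item * 2)  # stand-in for analysis/logging

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q, 5))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
# results is now [0.0, 1.0, 2.0, 3.0, 4.0]
```

      The queue decouples acquisition from processing, so a slow analysis step never forces the acquisition loop to drop samples.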

  155. mainly test says:

    LV is a dead end; just no one knows when.

    • chiraldude says:

      C++, C#, JAVA, etc… will be dead someday too, just no one knows when.
      LabVIEW could become more mainstream if NI would not charge so freaking much for it!
      My suggestion to NI: Create a free version for high school students.

    • Casa says:

      It certainly looks like LabVIEW is declining in popularity. It’s a pity since technically it’s a very nice programming language, intrinsically suited to not only conventional processors, but also multi-core processors and FPGAs. In order for LabVIEW to thrive, some changes need to occur:

      1) Reduce the price from $4500 to $1500. This should be revenue neutral since, with virtually zero production cost, the reduced profit-per-licence will be more than made up with more licences as LabVIEW hits “critical mass”.

      2) Reintroduce LabVIEW Embedded for ARM, which was effectively abandoned in 2006, onto a modern, capable microcontroller such as the Arduino Due ($50) or Texas Instruments TM4C129 LaunchPad ($20). These compile directly into C and do not require the LabVIEW run-time engine.

      3) Introduce LabVIEW targeted for some modern single board computers such as the Raspberry Pi ($35) or BeagleBone ($89).

      4) Sell National Instruments myRIO boards in single-unit quantities at four times cost (an estimated $240 sale price). myRIO uses the exciting Xilinx Zynq chip, which combines two ARM Cortex-A9 cores and FPGA fabric. This is one area where LabVIEW technically excels – the same language can be used to programme microcontrollers and FPGAs.

      5) Allow user interface using any device (PC, tablet, mobile phone, etc) using just a standard web browser (no plug-ins or client side applications). This requires a relatively small amount of programming effort for LabVIEW.

      The main thrust is to reduce the price to achieve “critical mass”, have more execution targets and allow any device with a web browser to be used as a user interface front end. Technically very easy.

      Time for a paradigm shift for LabVIEW!

  156. Xin says:

    Good luck searching for your problem on Google or the LabVIEW forum. I feel like I’ve won a scratch card every time I find anything useful on the web.

      • chiraldude says:

        I looked at the trend line for C# and there was a big peak in 2012 but now it’s way down. The trend for VB shows an even sharper decline. Same for C++. There seems to be an overall decline in interest in programming in general. This just means more job stability for people like us that are good programmers, whatever the name of the compiler.

      • Richard says:

        OK, yeah, the line of “interest” in C++ has gone down.
        How about looking at THIS:
        The trends for VB and C++ are significantly higher than LabView’s.
        Face reality.

      • Martin Kunze says:

        Dear Richard

        From the point of view of a mass product you are right.
        LabVIEW is by far not a mass product.
        Looking at the chart also reveals that the common interest in programming languages is going down.
        Comparing VB and C with LabVIEW shows a nearly constant interest in LabVIEW while the others are going down.

        This discussion leads nowhere.

        The only interesting point is that the common interest in programming seems to be going down. This is caused by the complexity of today’s programming languages.

        20 years ago one might have been happy to write a ‘hello world’ in C, and one could buy a book explaining the whole Windows API.
        Today one might buy a library for the same task.

        LabVIEW isn’t very different, except that you can create a GUI very quickly. Programming in LabVIEW has been enhanced over the decades, so much so that LabVIEW is becoming more and more a general purpose language.
        That has side effects. LabVIEW today is far from being a language for scientists who cannot program.
        The fact is that such a person should not just dive in; they must learn to program LabVIEW like any other computer language first.

        Kind regards

  157. Richard says:

    My Dear Martin,

    Your comment tells me everything I need to know.
    The stilted and polite-sounding language you use is in contrast with your inability to spell (“seem” -> “seam”), use punctuation, or manipulate the shift key. If English were your second language I would understand, but you made sure that your comment was so carefully worded that I was inclined to believe you were proficient in English. In addition, it means that you wrote without even making an attempt at proofreading.

    If you had taken the slightest opportunity to truly review and comprehend the charts, you would have noticed that the LabVIEW curve is in approximately the same downward trend as the languages in the second chart. Yes, this indicates that all computer languages are losing interest in Internet searches.
    In addition, the second chart indicates that while the “mainstream” languages are continually decreasing in interest, the LabVIEW line shows so little interest that it cannot get off the lowest boundary.

    Perhaps you should continue the experiment I started: I charted LabVIEW against a number of languages. The result is nearly always the same: LabVIEW is at the bottom of the chart. As a matter of fact, the ONLY language I found that went below the interest level of LabVIEW was COBOL. Few of today’s college graduates are even aware of COBOL. The computer industries are having difficulty finding COBOL programmers to maintain their legacy systems.

    I realize that the debate is futile. It is a never-ending story. People have debated the subject of the “best” computer language since computers had languages.

    In my opinion, every programming language has had a position in the limelight. They all have advantages and disadvantages in the ease and comfort level of the programmers that use them. Languages evolve and continue to become specialized for their respective disciplines.

    Languages such as PHP belong in web applications. Java is widely used in platform-independent computer solutions. C and assembler have a stronghold in micro and embedded applications. LabVIEW belongs in the toilet.

    In conclusion: Bite me.

    • Martin says:

      Dear Richard

      Beg my pardon, but I disagree.
      LabVIEW is made for fast and reliable programming and prototyping.
      Furthermore, I have developed several big applications, for up to 1000 sensors, that must work 24/7.
      All customers are happy with them.
      They are reliable, easy to maintain and easily expandable.

      What I tried to say is: the common interest in computer programming is going down. This is caused by people being sated by today’s ready-made software.

      Btw, it’s always impolite to write things like you did about a non-native speaker. Shame on you!


      • Richard says:

        Did you not read what I said?

        I said that if you were not a native speaker of the English language, your mistakes would be acceptable.

        But you tried to impress me with your “expertise”.

        You tried to fake your way through this.

  160. Labview enthusiast says:

    You are all idiots if you can’t do the things the blogger complains he can’t. Poorly written programs can be produced in any language.

  162. Sean G. says:

    I have to agree to some extent. I attended a LabVIEW “Open Day” in the UK, in Berkshire – possibly THE BEST DAY I’ve ever had. The NI engineers were VERY clued up and showed us how to do everything I wanted to know about; in all they were GREAT, BUT…within 5 days of being able to understand just about everything I needed with LV, it had all fallen out of my head. It just won’t stick unless you are using it 24/7.

    Now I am also learning VB, PIC C and CAPL, and they seem to stick (mostly), but LV just keeps eluding me and then slides out of my brain (me, brain, LMAO) after a few days. Why is that, I wonder?

    • chiraldude says:

      This really comes down to specific personalities and abilities. For me, I have moderate dyslexia. I have done a fair amount of coding with text based languages but would spend 90% of my time debugging and only 10% actually thinking about the logic and flow of the code. With LabVIEW, debugging is now about 5%. On the other hand, I know someone who thinks that Assembly is the only “real” language and everything else is for lazy/stupid people.

    • mrpibb64 says:

      I have used LV in the past on projects, but it is not the only thing I do on a given job assignment. I recently completed a contract job, and a new contract asks for LV as the main area of expertise, with a test to take before any job interview can take place. I am hoping that it teeter-totters in favor of C# instead of LV, or at least the Measurement Studio add-in to VS in place of LV.

      For advanced roles, any test that the recruiter doles out is not a good indicator of skill background. Instead, I show demos of work I have done at other jobs and projects of my own. These projects represent my own efforts to become conversant in today’s most popular programming languages: C#, C, C++, Python, Ruby, Tcl/Tk, to name a few.

      In place of LV, applying OOP and the ability to work with existing systems (what is actually out there in the world that needs to be upgraded by folks like us) is what is needed. VS provides these sorts of services, with data marshaling and platform invoke to access older code bases and interface to new ones. That way, down the road, it is maintainable, and provided that it is modular, it can be moved around and upgraded itself as well.

      I used the NI-DAQ drivers with the NI I/O cards and custom test fixtures using QBASIC procedural coding for years, as this was the in-house legacy code base at the time.

      So, what does NI have in mind for the Internet of Things? I am working with BBB, Raspberry Pi, and Netduino P2 platforms with RESTful APIs.

      Management still sees LV as a golden tool, as it represents a single-tool solution to them. They bought the sales pitch, but did they do any research or ask the opinion of staff or online discussion blogs from independent sources as to whether LV is a relevant platform?

      If you get the same answer from several independent sources on this question, then you can rest assured that LV is not a good way to go anymore.

    • hummer stimpson says:

      Stay away from Visual Basic. If you want to learn an easy-to-use Microsoft language, then learn C#. If you care about cross-platform, then use Python. And one more thing: Microsoft once killed a whole language that was heavily used, with no easy migration path – Visual Basic. The Visual Basic of today is .NET-based. I still maintain VB6 code because rewriting would be too costly. Microsoft really screwed up when they made that decision.

      • Martin Kunze says:

        Well, well.
        I wish you all a happy Xmas and a happy new year.
        Let me leave some comments.
        IoT is native to LabVIEW, using the VI Server or web tools.
        OOP works well. Have a look into the sensor/actor framework.
        Learning C, C++ or C# is always a good idea.
        VB is not a programming language. It’s a kind of problem for everyone.

      • abhinandan says:

        I’ve been following this thread for quite some time now.

        It has become kind of an online family to me…

        Wish you all a happy new year…

  163. Just found this thread while looking for something else. Boy, it sounds like somebody is off his meds… and/or out of touch with reality. I find rants like this thoroughly comical. Some kid who is still wet behind the ears pontificating on all the things everybody else is doing wrong. Hysterical…

    Hopefully in the past 7 years the OP has grown up a bit.

  164. Casa says:

    LabVIEW is a good programming language that is fun, capable and easy to debug. There is, however, one aspect that is the death knell of LabVIEW, and that is that it basically only runs on a desktop computer.

    Learn C, Java, etc and you can program desktop computers, microcontrollers, tablets, mobile phones, anything. Learn LabVIEW and you are stuck on the desktop (or expensive National Instruments hardware that has FPGA capability). The Internet of Things is an exciting arena, but LabVIEW is not a player.

    This, in a nutshell, is LabVIEW.

    • mrpibb64 says:

      @Casa, I have used LV by itself and together with C# in complex systems. What has stood the test of time on the jobs I have taken since cross-training into MS .NET back in 2004 are C# jobs to program, configure, and test new devices, modules, and product prototypes. Database connectivity is easier as well using .NET, along with using SCPI/IVI-COM for OEM test gear remote control,
      so test results can be easily recorded into a DB. I have used SQL, PostgreSQL and MySQL for this purpose.
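
      As a rough illustration of the "record test results into a DB" flow described above, here is a Python sketch; the instrument reply is simulated (real code would query the instrument over a VISA/IVI session), and sqlite3 stands in for PostgreSQL/MySQL:

```python
import sqlite3

def parse_scpi_measurement(response):
    """Parse a numeric SCPI reply such as '+1.2345E+00'."""
    return float(response.strip())

# Simulated instrument reply to a query like 'MEAS:VOLT:DC?'.
raw = "+1.2345E+00"
volts = parse_scpi_measurement(raw)

# Record the result with a parameterized INSERT.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (test_name TEXT, value REAL)")
conn.execute("INSERT INTO results VALUES (?, ?)", ("dc_volts", volts))
row = conn.execute("SELECT value FROM results").fetchone()
# row[0] == 1.2345
```

      Swapping the in-memory sqlite3 connection for a server database changes only the connection line; the parameterized-query pattern stays the same.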

      There are other things out there that employers want these days: the various scripting languages, i.e. Python, Tcl/Tk, Ruby, Lua. I am using 3 out of 4 of these well-known scripting languages. In my resume, I have a section called scripting. Most of my scripting background involves AT modem command scripting with embedded devices like cell phones, bash or plain DOS command-line scripting to other families of devices, and OEM test gear remote control with SCPI/IVI-COM scripting.

      As far as tools go, I use a couple of different versions of MS Visual Studio Professional as my main toolset. It is created and maintained by thousands of people worldwide. Further, third-party applications that look and act like MS Visual Studio are available as well.

      As far as data acquisition goes, specifications drive the design of the electronic test fixture and which controller/module you intend to use. I use various terminal devices which enable me to quickly create C# WinForms applications, so there is no need for the expensive proprietary LV-based NI cards or a single service house for NI gear.

      For communication with microcontroller and FPGA/CPLD device targets, standard protocols are used most of the time.

      If a system does have LV, I do emphasize to the client that I use a C#-based DLL, which NI LV does accept, to provide an interface to new hardware.

      So there is a lot of stuff available out there for you to use that provides much of the same functionality that LV can provide.


  165. I use LV at work, sometimes extensively. I like it a lot. I write programs that are neat, clean, and very modular. The OP was complaining about his personal failings (rather than the language) as other commenters have already pointed out. I also don’t think that LV has a place in the IoT due to the immense weight/size of its run-time engine.
    IMHO, the way LV would gain traction is if NI open-sourced the thing. Just give it away; maybe keep control of development by continuing to fund it, but let the code be free. Lots of other software is free this way, and the company that leads/owns it just needs to find the appropriate revenue stream under the new model (consulting, training, corporate support, customizations, etc). I think if that happened, it would remove a lot of the mistrust and animosity toward LV by letting the code-base be examined.

    • chiraldude says:

      NI will never make LabVIEW open source, you can count on that. However, it is possible to get it at an extreme discount if you are a student.

      Maybe someday, if LabVIEW gets enough mainstream exposure, there will be an effort to create an open-source language similar to LabVIEW, but that is a long way off.

  166. Eric K says:

    One thing that gets on my nerves is these paid NI trolls that go around to forums where Labview is mentioned and post reasons why it’s good and argue in its favor. To me it’s obvious: no one else would have such a stake in the “LabVIEW sucks” fight. I bet you won’t see any comments rolling in outside of typical work hours.


    I hate Labview from the depths of my soul. It’s such a bad language. It’s fine at a low level, where a person is writing code for him or herself, graphing some numbers or viewing a simple test measurement, but anything beyond that…labVIEW is just NOT the tool for it.

    Once you get to a point where you’re sharing projects and need to use it at a more integrated systems level, it’s awful. I mean, there’s nothing worse. A VI can work when you save it, and the next time you open it, it has to be tinkered with again. Seriously, the shit never ‘just works’.

    There’s always one reason or another that you can’t get it to run. And these cult-like goons will never admit that Labview sucks for any purpose. They have an incessant need to defend this piece of shit to the ends of the earth. Each time I open a VI designed to work “across the board”, there’s always some issue:
    “Invoke Node Error 11702” or “Can’t find file”.

    It literally gets confused and asks you to find files for it that are right in front of its nose. No other programming language does that, it’s ridiculous. It’s like if Windows starts and opens a prompt asking you to find the “My Computer” Icon on the Desktop for it.

    I’m here to work goddamnit, not play “where’s waldo”!

    Stupid managers are NI’s life blood. They set up sales-pitch meetings to convince them that they can turn their teams into competent programmers at the low cost of $1000/license (sarcasm). Then they try to drive it home with tens of thousands of dollars of bullshit NI (optimized) hardware.

    I gave Labview a fair chance. I went through core 3 of the self-paced training. I wanted to like it. But with my OOP background, I couldn’t ignore the stupid programming gimmicks (dumbass shift registers, sub VIs, etc…) along with how stupid the “Data flow” design is.

    Another thing: good luck trying to make sense out of someone else’s spaghetti nest of code. Comments can be added, but it’s useless; lines of text explaining the program just look like alphabet soup mixed with spaghetti. It’s a stupid fucking mess. NI is in denial about its shittiness.

    It’s pathetic….like their dumb bird logo.

    If I won the Powerball jackpot, I would buy National Instruments just so I could fire everyone there and shut it down for good. I would be doing the world a huge favor.

    This thread is proof that labview sucks. Complaints and rants have been rolling in since 2007. That speaks volumes about the shit that is labview and NI.

    Guess what NI? Labview is getting the red, squiggly line under it because my browser doesn’t recognize the word. That doesn’t seem like an “industry standard” type program to me.

    I’m so glad that I was able to get that off my chest. I feel much better now.

    • lvbutthurt says:

      There are trolls and there are morons. You sir, qualify for both. ;-)

      • Richard says:

        And you must be a tried and true NI fanboi.

        Name calling will get you nowhere when you attack the TRUTH.

        Truly, as the original poster said: LabView SUCKS.
        The only reason you see favorable postings is because NI is paying fanbois. They suck the money away from legitimate businesses and promote their crap.

      • chiraldude says:

        I have known lots of “code warriors” and every language has its die-hard fans. I have a family member who thinks the only “real” programmers work in assembly. To him, people who work with C++ are stupid and/or lazy. I still use LabVIEW even though he thinks I am a lazy moron. Do I care what he thinks? I use LabVIEW cause I get paid to use it. I also get regular calls from head hunters looking for LabVIEW experts. If it were useless, nobody would use it.

      • lvbutthurt says:

        Sorry for the wait. I was busy making a ridiculous amount of money writing G code to save my company a ridiculous-squared amount of money, instead of complaining about how I can’t do my job because someone else sucks. Ironically, I was writing G code to generate a large amount of C++ code, because it would take our ATE guys 6 months to generate it, and a year (or never) to figure out how to auto-generate it (which is what they need). In other words, 2 days of coding in LabVIEW = 6 months to a year of C++ coding. Nothing against C++; I’m not “one of those” morons who thinks a language doesn’t have its own strengths.
        However, I must say that a tool is only as good as the person using it. So if you contend that LabVIEW sucks, then maybe it does…in your hands.
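
        The comment above doesn’t say what the generated C++ looked like, but as a generic illustration of the table-driven code-generation approach (in Python rather than G), a hypothetical generator might emit repetitive accessor code from a short field list like this:

```python
# Hypothetical field spec; in practice this might come from a config file.
FIELDS = [("voltage", "double"), ("channel", "int")]

def generate_cpp(fields):
    """Emit a C++ struct with a getter/setter pair per field,
    instead of hand-writing the boilerplate."""
    lines = ["struct TestPoint {"]
    for name, ctype in fields:
        lines.append(f"    {ctype} {name}_;")
        lines.append(f"    {ctype} get_{name}() const {{ return {name}_; }}")
        lines.append(f"    void set_{name}({ctype} v) {{ {name}_ = v; }}")
    lines.append("};")
    return "\n".join(lines)

cpp = generate_cpp(FIELDS)
# cpp contains e.g. "double get_voltage() const { return voltage_; }"
```

        Scaling the field list to hundreds of entries is what turns days of hand-written boilerplate into a single regenerate step.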

      • Adam says:

        2 days of Labview equal to 6 months to a year of C++ code? That’s bullshit. I have experience in both languages and much prefer C++ over Labview. My manager and I spent weeks trying to figure out an issue in some Labview code that was actually Labview rearranging a bundle automatically. Just the fact alone that Labview can’t handle a bundle/structure/whatever you want to call it is a serious issue to me. (Not to mention variables with the same name – a “feature” according to the NI rep – typedefs don’t work, etc.) Hence, I stay away from Labview as much as possible. And the fact that every year when an upgrade comes out, up to 50% of the already-written Labview code breaks and the run-time engine isn’t backwards compatible…give me a break. That doesn’t save any time at all. My team’s Matlab code, and some C, C++, and other code that’s 10 years old, still runs perfectly fine. Try that with Labview and see how much time you save, i.e. none. We had some test code that was finished over 4 years ago and we still have Labview-specific issues, and none of the HW has changed. Not exactly saving any time at all.

        As an unrelated issue, Labview code is such a memory hog and has memory leaks all over the place; why use it knowing your system will eventually crash, especially during a critical test?

        And one more thing. Labview SUCKS!

      • Abhi Nandan says:

        Adam, and whoever else is going berserk on LabVIEW: please feel free to mail me at abinandan.j@gmail.com with your VIs. I am happy to have a look at them personally and comment on where and how you can optimize things, and even fix them if it takes no more than a couple of hours… (I’m on vacation.)

        (If you do not have authorization to share your office work, send a similar program which you could create, or even a problem statement.)

        Both of us might learn something… :) It’s just an email… go ahead…

  169. Fred says:

    There is a very easy answer to the question “why does LabView suck?”: because you (the “programmer” who asks the question) do not want to, or cannot, understand the way things are done in LabView. LabView is not meant to be the best programming language, and it is definitely not flawless, but it has its strong sides, like OOP, automation and control. Setting up an automatic testing machine is ridiculously easy in LabView compared to other languages like Delphi, C++, C#, Basic or VEE. We use all of those at work but definitely prefer LabView. Sure, when I started to learn LabView in college I was not too happy about it; learning C# in parallel, I totally preferred C#. Yet with time (around 6 years later, with a couple of languages added) I like both. Sometimes I wish I could do some part of the work in C or Delphi instead of LabView, but I have the same feeling the other way around too. And since we also use TestStand together with LabView to manage sequential execution, work goes really easily: fast to set up, easy to debug and easier to maintain. But to work with such ease takes quite a while of mastering LabView. I’d call it “easy to learn, hard to master”, but if you do master it, you’ll really be able to recognize when it’s best to use which language, and you will be very satisfied with LabView. Of course there is also the NI hardware, which is supposed to be easy to set up with LabView. And it is! It works great and has very good quality. But that doesn’t mean you cannot use third-party hardware. Of course you can, and in most cases it is almost as easy to set up as NI hardware. I do understand that everybody may sometimes experience difficulties, but let’s be honest: that happens with any existing programming language.
    Maybe you won’t have problems creating an online app in Java but will with LabView, or working with databases will be much easier in Delphi while with LabView you’ll go through hell; but try to use any other language in automation or control! And last but not least, my opinion about the previous posters who declared how much they hate LabView: being upset about your own inability to master a programming language and blaming exactly that language, instead of realizing that you just had to put in a little more effort and try a little harder. The product LabView is not the problem; it is the bigotry of the people who try to use it but only stay trying.
    PS: for those who don’t want to read it all: stop whining and learn to work with the language!
    PPS: I just remembered reading that somebody wanted to use the same principles from text-based languages in LabView. It is possible but not efficient. LabView has its own tools to manage these things. Learn to use them and stop whining!

    • Ron says:

      I too have used LV in the past. Of course, in any real system you end up using more than one language somewhere (microcontroller, background debug mode, I2C, JTAG, SPI, etc.). I have also used LV together with C#, and gone from native C++, to C#, to a .NET DLL in LV. If you work on somebody else’s work and have to dig into VIs 5, 6 or more levels deep, with many parallel and multi-layer dependencies, it quickly becomes like spaghetti code! As for databases, the easiest yet is EF (Entity Framework): you use classes as data tables, and property and attribute assignments as rows of a data table. How does this work? Most databases follow a pattern; around 95% of them. Just google it, get the free tools and follow the step-by-step to make a small relational database; it will take you 15 minutes tops. If you need to comply with MySQL, many like it because it has fewer commands than SQL Server. Many times a database is overkill and a ‘flat file’ is appropriate: a comma- or otherwise-delimited text file with one or more items. It is refreshing that NI is actually listening: in order for people to access the tools, free versions with longer than a month of access need to be out there, and they now have some downgraded versions for students. But there is still the requirement to have various licenses and to renew those licenses over time. One very large reason I don’t like LV is that if a problem exists, there is only one place to go to fix it: the makers. You could look for others having the same or similar problems on the NI site or the LAVA site. With MS Visual Studio, on the other hand, these tools have been developed worldwide by THOUSANDS of people, and documentation on just about everything is available (except for the new stuff; i.e. Universal mode with UWP is lacking on WFD). I never need a license for MS VS (any of my versions from 2003, 2005, 2010, 2013, or now 2015). Also there are third-party tools much like MS VS, like SharpDevelop, available for free (I used it on the job for free; most of the time it worked just fine).

      Real-world situation: at Microsoft Xbox, I was a Mfg Test Engineer, where I first used C# to solve their production-line problems, then worked on their NI PXI test stations: reverse engineering, electronic test fixtures, test software upgrades. What was tough was that MS was only allowed a certain number of LV ‘license seats’, so I was stuck for about 2 weeks, unable to do my job, until an open license seat came available (NI is pretty non-negotiable on that). After I fixed their production-line problems I was let go because MS wanted someone with ‘advanced LV certification’. I attempted to get this at various companies; no one was willing to foot the bill (and MS was not willing to at that time either). This was really bad timing too, as it was at the height of the economic depression; they said ‘nothing personal’ and let me go.

      So you can understand my further dislike of LV, for these and other reasons as stated above.

      My two cents (my two dollars; same thing).

  170. Tony A. says:

    So this conversation is 7 years old, by the looks of it. Well, today I used LabVIEW for the first time, trying to interface a DLL that I have easily used in several languages, including some I have near zero familiarity with. What a pain! I just want to say that LabVIEW hasn’t stopped sucking.

    • Brian says:

      Yep! This is the one part about LabVIEW that really sucks! I have been using LabVIEW for a long time and have definitely mastered it, but interfacing with existing DLLs can be a real pain! If a DLL function uses complex data types, you must write a new DLL “wrapper” to decompose structures and variants into simple stuff that LabVIEW can handle.
      NI has so far not created LabVIEW tools for this. Probably because they want us to purchase TestStand?

  173. Grumpy RA says:

    I think Labview is for brain-dead or code-illiterate people. Let’s not forget the extremely slow run time, and Labview arbitrarily trying to run in parallel everything that has not been sequenced using a case structure and some dummy variable.

    • Sounds like you hate LabVIEW because you don’t know how to program using LabVIEW. The sequencing you are referring to is not accomplished the way you describe; that is how dummies who don’t know how to use the tool do it.
      Here’s an analogy:
      I hate hammers because that claw end is really bad at driving in nails and the blunt end really hurts when I hit my head with it.

      • chiraldude says:

        Labview allows inexperienced programmers to create simple data collection programs without much effort. If you are a moron, you will write moronic code whether it is C++ or Labview.
        I bet Grumpy RA only writes code in assembly because that is the only pure computer language…

      • Grumpy RA says:

        I program mostly in C. I hate LabVIEW because:
        1. Not backwards compatible, suckering people into dropping hundreds of dollars to stay compatible with new VIs.
        2. Extremely slow processing speed: program yourself a prime number generator in C, Python and LabVIEW, and the results will speak for themselves.
        3. It’s like having to write Chinese or Ancient Egyptian to talk to the computer. If you don’t already know the language inside and out, it’s more drawing than writing. The computer then needs to translate it to real code and then compile it to run.
        4. The bloody wires! It’s a program, not a circuit.

        Here’s a better analogy.
        I want my friend to pick me up from work.
        My friend C asks me to direct him logically, step by step, to where I am. He will say “what?” every time I use the wrong syntax.
        My other friend LabVIEW asks me to draw a picture of a car, then a picture of me, and the topography of the city, and wants me to connect them with wires and be sure to specify that he MUST get from A to B to C then D. Oh, I should mention that he’s riding a turtle, and wants me to pay $500

        5. The key word is that LV allows “INEXPERIENCED” programmers to create “SIMPLE” data collection programs without much effort.

        Often, people with a university education who end up working in fields that require data collection are less experienced at LV than at actual programming. “SIMPLE” data collection is called “R”, and in the modern age, why should one make a “DATA COLLECTOR” program and then a “DATA ANALYSIS” program, if one can spend a little extra time and do all of that in one macro?

        The real question is that why should anyone use LabVIEW, when they have good fingers and a keyboard?
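The prime-number-generator benchmark suggested above is easy to reproduce. Here is a minimal sketch in Python (the Sieve of Eratosthenes is one common choice of generator; the C and LabVIEW counterparts, and the actual timing comparison, are left to the reader):

```python
import time

def primes_up_to(n):
    """Sieve of Eratosthenes: return all primes <= n."""
    if n < 2:
        return []
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # Cross out every multiple of p starting at p*p.
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

start = time.perf_counter()
result = primes_up_to(100_000)
elapsed = time.perf_counter() - start
print(f"{len(result)} primes below 100000 in {elapsed:.3f}s")
```

Port the same loop to each language under test and compare the elapsed times on identical inputs.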

      • Grumpy RA says:

        I mean, I’d LOVE labview if it were more of:
        1. Front panel left as is, with references to its variables/data types given to you, the references commented out at the top of the “back panel”.
        2. A back panel where I can code (I mean code, not draw) using those references. Yes, I have seen how one can write C and have that embedded into LabVIEW; the speed is least appealing. Maybe even have a tab for multi-threading, so you can allocate certain things to run in parallel.

        3. LabVIEW is essentially a library of headers and drivers for UI and NI apparatus.

        4. I’d actually be happy to buy a LabVIEW license and tell future employers that it is the most awesome programming environment. Instead of me threatening to quit if they choose to use it for the next project.

        5. World would be a better place.

      • Lvbutthurt says:

        Grumpy, the 80’s are calling. They want your keyboard back. Seriously if you were on my team and you threatened to quit if you “had” to use LabVIEW for a project I wouldn’t let you. I’d fire you instead. Mark of a bad programmer and bad employee: closed minded, slow, and filled with excuses of why they “can’t”.

        The main problem with your argument is that you don’t realize LabVIEW and C are two different tools that can both be used successfully to save time and money for an organization. A good programmer considers both execution speed AND development time as items that affect the project’s bottom line. If it were all about execution speed, we would all still be writing in assembly. If you need to write a test framework with hardware abstraction, DUT communication abstraction, and GUIs for helping test developers write test programs, then using C would be a bad choice. If you are writing a random number generator, then C would be a good choice. It’s like having a fan brush and a roller brush: one is good for painting houses and one is good for painting pictures.

        And just because you can’t understand a language doesn’t mean the language is bad (you obviously don’t understand the G language from your comments). Wanting to change it to be more like the languages you are used to is closed minded. Why not learn why it is the way it is and how to get the most out of the tool…you may learn something new (which by the way, is a mark of a good programmer/engineer).

        On a final note: nothing you’ve said yet makes me think the original hammer analogy is not spot on.

      • Grumpy RA says:

        Well, that’s just it: LabVIEW is not the tool I’d use for certain tasks. It has its ups and downs. The projects that I’m asked to build are inefficient in LabView. I’m not saying LabVIEW can’t do certain things; I just think whoever uses labview for certain things is stupid.

        Go ahead: prime number generator. Also, try making 100 by 100 interactive switches. You can do that in 10 lines of C with 2 for loops. In LabVIEW, you’ll be connecting those wires for a century.
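The two-nested-loops claim above can be sketched as a plain data model. The names below (`ROWS`, `COLS`, `toggle`) are illustrative, and the GUI layer (e.g. a grid of checkbox widgets) is deliberately omitted so the sketch stays runnable anywhere; wiring it to an actual toolkit would be toolkit-specific:

```python
# Data model for a 100x100 grid of toggle switches, built with two loops.
ROWS, COLS = 100, 100

# Two nested loops create every switch, each starting in the "off" state.
grid = []
for r in range(ROWS):
    row = []
    for c in range(COLS):
        row.append(False)
    grid.append(row)

def toggle(r, c):
    """Flip one switch and return its new state."""
    grid[r][c] = not grid[r][c]
    return grid[r][c]

toggle(3, 7)
print(sum(cell for row in grid for cell in row))  # count of switches turned on
```

The same structure scales to any grid size by changing two constants, which is the point being made about loops versus hand-placed wiring.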

        I’m not trying to convince you, or anyone. I am stating facts, and if you somehow feel obligated to defend labview on a link that contains “why i hate” and “labview” together, you’re either:
        1. An NI employee who wishes to defend LabView anonymously.
        2. A LabView enthusiast who hates real programming.

        You both talk as if it’s so important to discredit my posts; why would you need to discredit me if they weren’t true? If I weren’t stating facts, Labview would prove me wrong itself.

        Trust me, if you are from NI and wish to defend your software, at least represent it honestly. If you are just an enthusiast, shouldn’t you be paid to do something else?

      • Grumpy RA says:

        That is, if you’d define a good programmer as one who says, “let’s start spending a lot of money and force people to learn something new so they can perform the tasks they are already proficient at, but slower”.

        If you’re referring to the first few years of C++: eventually they got better, but it set things back for quite some time. LabView is setting things back, and without proper criticism it’ll keep sending programming back to the stone age.

      • Grumpy RA says:

        Say the premise is “let’s use the elevator in this building”. Sure, if I’m moving heavy materials, or going 100 floors up, the elevator would be nice. Labview is the elevator and C is the staircase.

        Now, say we have 10,000 people who all need to go from the first floor to the second; it’s much faster and overall better for them to take the stairs, unless all 10,000 of them cannot walk up stairs.

        Using Labview for the tasks I’m asked to perform is essentially exactly that.

  174. Shaman says:

    Labview is “dataflow”, but then they introduce things like “Event Structures” or the “DVR” (data value reference, that is, a POINTER), which VIOLATE their own paradigm.
    You can’t delete your profile from their website, which is a clear violation of your right to privacy (in Europe).

  175. Shaman says:

    If you criticize them in their forum, you will get fooled or trolled by their lackeys; they unleash monkey-coders who post insults like “you don’t know how to code”, or similar.

  176. Shaman says:

    You just can’t have a system that lasts 30 years; it’s impossible. Labview was born in 1984, and still today in 2016 it’s basically the same old crap. It’s like saying “MS-DOS has good foundations and we are still based on it”. That is why they abuse their own structures (they abused the datalog refnum type); they invent things like the DVR (pointers) and event structures that violate dataflow completely, but it’s ok; they have a variant type (from the time of ActiveX and VBasic, I guess) but also classes (“we ain’t cool, dude, let’s throw in classes”)…. They pass by value and say “this is the way”, but they also have “refnum” (pointers) and a few years later introduce the DVR (pointers… again).
    The UI is stuck in the 90s: you can’t easily resize elements or panels automatically, can’t create dynamic stuff or moving parts, no ribbon support, no toolbar support (you have to implement toolbars manually, lol).
    I could go on; I have 11 years of experience with labview.
    The VISA drivers are still monolithic, and the runtime is huge and monolithic. This means 200 MB to install for a 1 MB exe to run. 20 years later, they haven’t solved this stupid thing.

    • Fred says:

      Yeah, but why not? Why shouldn’t we keep the good parts of the things we developed once and evolve them until there is no more potential in them? MS-DOS is being replaced by EFI. So will LabView’s core one day be replaced by another, but as long as it does its job, and it does it well, there is no necessity to replace it. And it still has potential to grow and evolve. And your last argument is completely invalid: LabView exes need the runtime to work, just as most other executables need something else too (.Net, Java, etc.). All other content you might use or watch on any PC or Mac needs some kind of toolbox or runtime system or other program to run, and these are in most cases large and bulky. For example, an exe written in any .Net language will run on most PCs without any problems, but during the setup process of the PC the .Net runtime has already been installed. And if I remember correctly, the .Net runtime is a lot more than 200 MB.

  177. scottrod says:

    I’ve used LabWindows/CVI for years now and love it. I could never use a graphical programming environment, ugh. I’ve heard LabView is kind of sucking wind at NI anyway.

  178. MarcoFabian says:

    All these comments boil down to one thing: “I never took professional training”. I hated LabVIEW before, because following the wires was bullshit, but after taking the training everything was clarified. I learned how to use the debugging tools and I love it. I never went back to text-based coding, where one “.” crashes the code and is hard to find. Good programmers use good programming practices; it does not depend on the programming language.

    • Grumpy RA says:

      So, spend thousands of dollars to buy a program that requires thousands more dollars of training to use properly.

      Meanwhile, you could spend that time and money on learning to program properly in a real programming language. Also, it has memory leaks everywhere and is very slow. I thought MatLAB was slow, but LabVIEW is slower.

      Do you guys want to spend $10,000 in total to buy a pair of shoes (training on how to walk included)?

      • Grumpy RA says:

        Not to mention that this special pair of shoes will make you walk slower and fall more easily?

      • chiraldude says:

        $10,000??? Sure, you could buy a full development system plus a couple of CompactRIO chassis with modules and pay for training direct from NI. But you can also get a home-use license for less than $200: http://sine.ni.com/nips/cds/view/p/lang/en/nid/213095
        You can get online training for around $300, or just search YouTube for tutorials.

      • In a C or C++ showdown, I will run circles around you any day of the week in the domain LabVIEW is for: GUI-oriented programs that interface with users. They may or may not control and interface with hardware; if they do interface with hardware, you’ll be slowed down further, at which point I’ll be drifting circles around you. :-) My GUIs will be nicer, my code will be more maintainable, and I will be able to explain it to my kids, who will understand how it works and why.

      • Casa says:


        LabVIEW certainly is quick to get things going. However, if you want your diagram to look good and be maintainable, you’ll spend time cleaning it up. Also, some things take longer in LabVIEW.

        It would be good to have a LabVIEW versus C/C++/C# showdown. I suspect the LabVIEW programmer will be faster out of the gate by having the GUI and basic operation done quickly. But as the application gets more complete and is cleaned up and commented, the difference will be gone.

        Carlos, you mentioned that “My GUI will be nicer”. This is one of my biggest gripes with LabVIEW: I don’t like the GUI. The toggle and slide switches and the graph navigation in particular look awful.


      • Casa,
        My diagrams look good from the start and yes it takes longer to make clean diagrams and it takes forethought and experience to know what it’s going to take to accomplish a task, like say, controlling the LHC. If you invest the time up-front, you don’t end up with a cobbled-together mess just like in any language.

        I would put my money on an experienced LabVIEW programmer over a C/C++/C# one. The productivity difference between the languages is significant. I don’t see why the C/C++/C# developers would ever catch up. If we’re all commenting our code, then the comments should take roughly the same time across languages. Arguably LabVIEW is easier to comment, so even there it might have an advantage.

        If you don’t like the look of the GUI, you can customize it with your own decorations. There is a whole subset of the community that focuses on this aspect of modifications. Once you’ve created your own libraries of modified GUI elements, you’re free to continue to reuse and share (or sell) them as you prefer.

  179. Lis Xeem says:

    I love visual programming; it is a lot easier for me to program when I can see it visually, like in Labview. I’m too poor to afford an education in Labview, but from my love for this kind of stuff and going to sleep with the problems, I figured them out and was able to program very complex Labview programs, finishing my application within a week. I do not like text programming.

    • Rich says:

      So, here’s what you should do…

      Join the rest of the NI fanbois and pay the big bucks for all that NI software and leave the rest of us professionals alone.

      • chiraldude says:

        Well Rich, no one forced you to read these posts.
        I am quite a fan of visual programming as well. With LabVIEW I never had to spend hours hunting for a misplaced comma that subtly changed the way a function processed data.
        With text code, stuff like that happens all the time.

      • rwalle says:

        To chiraldude:

        “Spending hours hunting for a misplaced comma” means you are a really bad programmer. LabVIEW does not really help there.

      • chiraldude says:

        Ok, how about C++ needing dozens of .h files to compile even the simplest instrument-control app? Having to spend 90% of your time figuring out includes, then tracking down obscure compiler options to debug undocumented compiler quirks, really sucks.
        I’m not a pure-software pseudo-engineer. I work in the real world, where I have to build and debug test hardware. For me, code is a necessary evil. Coding skill doesn’t help much when you are trying to debug parasitic capacitance/inductance issues on a PCB.

  180. Ben Mead says:

    This thread repeats itself quite a lot. A whine fest is self-sustaining, but it would be great if this evolved into known solutions to common problems, or desired solutions.

    To the “where does all this start?” problem: ONE button that made all the originations blink and get highlighted would be amazing. Even a keyboard shortcut to cycle through the starting points.

    “I hate Labview” is not productive. Creating a list of “if I had X, understanding unfamiliar code would be so much easier” could actually make it all easier.

    For instance, is there any program out there that takes a LabVIEW program and represents it as function calls and variables? It might not be as easy to read, since every block is more or less a function call with inputs and outputs, and wires represent information traveling around, but at least you could track changes from one version of the program to another in a succinct way.

    • Ben Mead says:

      By the way, to try to answer some of the complaints in this thread: there is another framework that is data-driven, but nobody thinks twice about it because it plays nice with the rest of the world: desktop GUI programming.

      Event-and-handler is the standard for multiple-data-stream sorts of things, so it’s not that data-driven programming is automatically problematic. It’s that the tool is not up to the task of making the program easily understandable from the outside.

      Debugging could be the strength of the platform, for instance. This would make debugging trivial:

      Take the offending input and allow someone to feed it into the data flow. Highlight and bring focus to the path the data takes through the program. Allow the user to change focus via keyboard shortcut from one block to the next as the decision-making progresses, and let the user look at all the inputs and outputs for a block. At some point, the inputs or outputs of a block will not be what you expected, and the program should then allow the user to shift focus upstream, to the blocks that created that data.

      At that point, people would be begging for graphical debuggers, because all the iterative execution and breakpoints and difficult determinations of how to stop execution at just the right place would become a simple case of following the flow upstream or downstream to find the faulty decision-making or data manipulation.

      • drjdpowell says:

        If you ask a LabVIEW expert about the advantages of LabVIEW, they will definitely list easy debugging, including tools that sound similar to what you’re describing. Using “Retain Wire Values” to trace backwards and forwards through a calculation is my primary debug tool.

      • Musaka says:

        We have all that.

        The stepping debugger is animated. It greys out everything then un-greys the wires, blocks and structures as the data travels through the program so you can see what things are being executed and when. It even shows the values as animated labels moving along the wires so you can see the data.

        We can also right-click on any wire and get a probe window that shows us the current values as the program executes, as a live watch.

  181. Yaknow says:

    Oh yes, Motherf**cking Labview, a.k.a. the most stupid program I’ve ever had to work with. Congratulations to whoever made this fucker. It doesn’t even work well on my Mac: it shuts down while I’m working on my VI.
    Seriously, I don’t understand why any school nowadays would recommend this to anyone when it’s not even properly compatible with the Mac.

    This program should not be used anymore, guys, please.
    It’s 2016! We’ve got to move forward… this program is making us go backwards :( and many people obviously feel very annoyed by that. So please, world, do something about it :( :( :( :( ………..

  182. Kaos says:

    You cannot say that one programming language is better than another, because it depends on the situation and the requirements of the moment. I have programmed PLCs, Java, C, C++, Labview, LabWindows and Visual Basic, and all of them have advantages and disadvantages.

    • feta says:

      Here comes the genius. Go program in machine code… I’m pretty sure you’ll find situations where that is the best choice.

      • Casa says:

        There are indeed times when machine code (assembler language) is the best choice.

      • chiraldude says:

        I know someone who writes everything in assembly, even the GUI. Says everything else is inferior. He also told me his company would replace Microsoft, now he says he has a search engine that can outperform Google. I wish I were half as good as he is at assembly but I still would use LabVIEW to run test instruments every time.

      • Casa says:

        I guess there are times (that is, situations) where assembly would be better, but then there are also times when assembler and certain people are better.

        I remember when I was programming in assembler: it would take quite a while to write things that are readily available in a higher-level language, but you build a foundation which, if it’s good enough, lets you then proceed like everyone else. You proceed with code that requires fewer programming resources (particularly important with microcontrollers) and is faster. The penalty is that it takes some time to get rudimentary functions established (a one-off cost across all projects), and the code may be more difficult for others to maintain.

  183. PRITAM SAHA says:

    It’s funny how people here are complaining about LabVIEW and its inefficiency. Try to control and automate complex lab equipment using Python or C or Matlab: lines and lines of code or DLLs will take up half a day, whereas 10 minutes of graphical programming using GPIB commands and built-in string manipulation functions gets all the automation you need done. There are things LabVIEW is good at and things it’s not. I am definitely not going to try writing complex data-parsing and manipulation code in LabVIEW; I will use Python. I am definitely not going to write complex data-analysis or mathematically heavy code in LabVIEW; I will use MATLAB. But for test automation, and for creating graphical interfaces to run characterization/automation tests in labs, LabVIEW beats every other tool/language out there. I wrote an entire pseudo-SPI bit-banging state machine in the NI LabVIEW FPGA language and had the job done in less than a week. Downside: I could only synthesize the code on an NI FPGA chassis (made by Xilinx, of course). But, upside: it took me about 3 days to get that complex code written, compiled, synthesized, tested and deployed. If I had used VHDL or Verilog, it would have taken me over a week. So, if you are dumb enough to use LabVIEW in the wrong applications, then you are the fool, not NI or LabVIEW!!! Also, two words: STATE MACHINES. Every LabVIEW program can be written as a simple state-machine architecture, which will use no more space than your monitor screen (you won’t have to scroll). You will have the fewest wires running around, and you can add comments to every case for any other developer to understand. You need to learn how to code in LabVIEW rather than complaining. I could show you hundreds of poorly written Python or C programs; that does not mean the language is poor. It just means whoever wrote that specific code is not smart enough… kinda like you folks complaining here!!!
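For comparison, the string-manipulation side of instrument automation is also short in a text language. A hedged sketch in Python: the reply string and the hypothetical `MEAS:VOLT?`-style query it might answer are illustrative, not tied to any specific instrument, and the transport layer (GPIB/VISA) is omitted:

```python
def parse_scpi_reply(reply):
    """Parse a comma-separated SCPI-style measurement reply into floats.

    Many instruments return multiple readings as one comma-separated
    string; splitting and converting it is a one-liner.
    """
    return [float(field) for field in reply.strip().split(",")]

# Example reply string, as an instrument might return it over GPIB:
raw = "+1.2345E+00,-3.1000E-02,+9.9900E-01\n"
readings = parse_scpi_reply(raw)
print(readings)
```

Sending the query itself would go through a VISA library (e.g. pyvisa) in practice; only the parsing is shown here.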

    • feta says:

      I just stopped reading your wall of text at “DLL”: that says a lot about your deep knowledge of serious programming. And yes, C is better than labview anywhere, anytime, for anything.

      Graphical interfaces with labview are good? Be serious… what is your reference? Visual Basic? Delphi?

    • pam5 says:

      I agree with you, even though I don’t like Labview and prefer to program data acquisition in C# or other languages. LV is good for quick prototyping and for applications limited to data acquisition and display; the frustration arises when LV is used for tasks it is not designed for. I have an example in my lab, where most people use LV: one of the NMR spectrometers runs under LV and is hardly usable. In NMR, the data acquisition is a minor component of the code; the most important part deals with the design of the experiment, i.e. coding the RF timing sequences and the scripts that combine these sequences (there is a virtually infinite number of possible experiment sequences in NMR). When users have to write a new VI for each new experiment, I can tell you it is really a pain; this is an example where LV is definitely not recommended.
      The general problem is that most people choose a tool, language, etc. not because it is best suited to their problem but because it is the only one they master well enough. There are really few people with broad enough expertise to choose the best programming tool for a given problem. How many people on this forum are experts in both C/C++ and LV? Probably not many. Discussing with the others is like trying to convert someone from one religion to another.

    • rwalle says:

      I wonder if this guy is talking about the same Python, C, and MATLAB that everybody else is using. I have never spent more than half an hour in MATLAB to get an instrument working. What is this person talking about?
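
The quick GPIB automation the first commenter describes mostly comes down to sending SCPI command strings and parsing the comma-separated replies. A minimal, hardware-free Python sketch of that string handling (the instrument replies below are invented for illustration; a real setup would obtain them over GPIB, e.g. via PyVISA’s `query`):

```python
def parse_idn(reply: str) -> dict:
    """Split a SCPI *IDN? reply (vendor,model,serial,firmware) into fields."""
    vendor, model, serial, firmware = (f.strip() for f in reply.split(","))
    return {"vendor": vendor, "model": model, "serial": serial, "firmware": firmware}


def parse_readings(reply: str) -> list[float]:
    """Convert a comma-separated measurement reply into a list of floats."""
    return [float(x) for x in reply.split(",")]


# Invented example replies, standing in for instrument.query("*IDN?") etc.
idn = parse_idn("ACME Instruments,Model 1234,SN-042,1.07")
volts = parse_readings("+1.0250E-01,+9.9800E-02,+1.0110E-01")
print(idn["model"], volts)
```

Whether ten minutes of wiring or ten lines of Python is faster is exactly the judgment call the thread is arguing about.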

  196. Abbey Berry says:

    I’m not being funny,
    but most of these points are completely invalid.

    Maybe you need to get better at it? Then you will be able to use it effectively.

    “Inability to write descriptive comments!”
    -LabVIEW offers many ways to correctly and neatly comment your code. If you do it right and keep it consistent throughout you can have very well documented software that is VERY easy to follow.

    “Inability to name variables!!!”
    -Well, I don’t know what you mean by this, but you can name all the variables you want?

    “Frakking impossible to debug!!!!!”
    – This is just bollocks. I debug software daily, and I think LabVIEW is great for tracking down and finding bugs.
    Again though… maybe you just need to get better?


    If you need expert advice email me at:


  197. Real programmers are language agnostic, and LabView is a programming language. Ergo, none of you are real programmers.

    Sure, graphical editing can be a PITA, and personally I feel most at home in some type of embedded C, though C++ is great, too. And Python is probably the fastest (to write) language for straightforward tasks, so I love it, too.

    But 99% of the complaints here are basically generic “programming is hard” whining. If you are already thinking algorithmically, then LabView proficiency should take about 2 days tops, at which point you can add it under your “language agnosticism” umbrella.

    I recommend going back to school.

  198. Stefan K says:

    Well, I attended a presentation some years ago, and one of the slides just showed the sentence: “A fool with a tool is still a fool.”

    You should know the IDE you’re using for your daily work. And as Marshall said, a programmer designs the application, not the language. The IDE is just the tool for bringing your smart thoughts, your creativity and your knowledge into the bit world of PCs.

    Some comments on your intro post:

    1. Inability to write descriptive comments!
    –> WRONG. There are multiple ways to create comments. The easiest one: just double-click wherever you want in the Front Panel or Block Diagram and write your comment.
    Btw. there is a nice option to create complete documentation of your project with all VIs (or selectable content). And in current versions you can use nice tools like bookmarks, which again make commenting much easier.

    2. Inability to name variables!!!
    –> You can name every variable with the descriptive name you want. Which type of variable are you talking about?

    3. Nonlinear, graphical programming interface:

    a) Messy, horribly hard-to-follow programs! Wires everywhere!
    –> RIGHT when you do bad planning and bad structuring of your application; it is easy to create “spaghetti code”.
    –> WRONG if you use good structure: think about using sub-VIs and separating tasks into the right chunks.
    No one would create just ONE C file that includes everything, without reusable function calls behind a nice API…
    I know apps with 3500 VIs which are readable, much better than some text code I’ve seen that was structured messily and commented badly…

    b) Extreme difficulty to insert new commands into an established program without ruining the organization structure!!
    –> RIGHT if you use the wrong structure.
    –> WRONG if you use a proper “base design”, which is completely programming-language independent… for example the state-machine pattern, producer-consumer… and again, this is language independent.

    c) Frakking impossible to debug!!!!!
    Well, sometimes yes, but proper system design and using the right architecture and modularity will improve this a lot.

    d)Computer processors operate linearly anyway–LABVIEW IS LYING!!!
    –> Why lying? Can you explain what you mean here? We’re living in a multicore world, and code should be written in a way that uses this performance. So LabVIEW, if you use the right architectures, makes it very easy to do this.

    4.) Sequence structures–the most cumbersome way possible for the LabView creators to have tried to rectify the problem that sometimes YOU JUST NEED TO EXECUTE COMMANDS IN ORDER JUST LIKE A CONVENTIONAL PROGRAM, DAMMIT!!!
    –> RIGHT, the worst possible option for serialization.
    –> WRONG, because there are much better options (state machines, etc.).

    5.) Mouse sensitivity! As in, my programming ability should not have to rely on my skill to accurately position the mouse over some of those frakking tiny terminals!
    –> I agree; slowing down the mouse speed in the OS helps, but it is still bad if you have a high-resolution monitor and lots of input/output terminals on a small sub-VI.
    But the same is true for “text-based” programmers: there are the ones who know ten-finger touch typing, and the ones with the two-finger, eagle-eye, circle-search-and-hit style.

    6.) Timing structures–THEY DO NO SUCH THING!
    –> I don’t understand that one. It depends on the platform where you use them and on the way you use them (Windows vs. real-time, processor assignment of structures). In addition, it influences how LabVIEW will compile the code.

    7.) The fact that it has to rebuild all its data acquisition sub-VIs every time I want to make a tiny change to the sampling mode!
    –> Ehm, this only happens if you use the DAQ Assistant, which is a tool designed for beginners, for easy entry into the DAQ world. Did you try to generate DAQmx VIs out of the assistant? Just right-click the blue VI… Or did you try to use the DAQmx API directly, as you would use DAQmx calls in C# or your favourite language?

    8.) Shift registers and sequence instances! The saddest excuses for variables on the planet–and they contribute to the messy wiring problem!!
    –> Again, this depends highly on your architecture and your programming style.

    9.) It handles arrays in an extraordinarily clunky manner–and when you’re taking data, the role LabView is best suited for, MOST OF THE TIME YOU CAN’T POSSIBLY AVOID USING ARRAYS!
    –> WRONG, You can handle arrays like in any other language…

    Congrats, a lot of labs are running on it… and also applications in medical, infrastructure, time-critical control, and huge data mining / analysis…

    I agree that LabVIEW is different from other languages, which are text-based. The dataflow is something you have to learn anew, but once you understand what it means, you design with the same speed (maybe faster), the same quality and the same resource needs as with any other tool.
    It’s not the IDE that creates bad code, it’s the programmer…
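
The state-machine pattern recommended above is indeed language independent. Here is a minimal dict-dispatch sketch in Python, with invented idle/measure states just to illustrate the shape; in LabVIEW the equivalent would be a case structure inside a while loop, with a shift register holding the state:

```python
def run_state_machine(inputs):
    """Tiny idle/measure state machine: dispatch on the current state name;
    each handler returns (next_state, optional_output)."""

    def idle(cmd):
        # Only "start" moves us into the measuring state.
        return ("measure", None) if cmd == "start" else ("idle", None)

    def measure(cmd):
        # "stop" returns to idle; anything else is treated as a channel to sample.
        if cmd == "stop":
            return ("idle", None)
        return ("measure", f"sampled:{cmd}")

    handlers = {"idle": idle, "measure": measure}
    state, log = "idle", []
    for cmd in inputs:
        state, out = handlers[state](cmd)
        if out is not None:
            log.append(out)
    return state, log


# e.g. run_state_machine(["start", "ch1", "ch2", "stop"])
# → ("idle", ["sampled:ch1", "sampled:ch2"])
```

Adding a new command means adding one handler (one new case in LabVIEW terms), which is exactly why the pattern avoids the “impossible to insert new commands” complaint in the original post.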


  200. Sunny says:

    This article is rather old, and it appears that the OP has changed their mind a bit regarding Labview. I feel that anyone who has worked with Labview over an extended period of time can understand that it is a surprisingly powerful language. OP, is there any chance you could make an addendum to your post to reflect your change in views? This blog tends to pop up at the top of my search results regarding Labview searches, and it really sends an outdated message.

  202. tolga says:

    Please give Core1-2 courses to this guy…

  205. Lily Zhang says:

    True, most of the issues have workarounds, especially given NI’s continual efforts. However, LabVIEW is actually not for professional software developers but for engineers. It is very simple to verify this fact: LabVIEW is written in C++, not in LabVIEW. Why not use LabVIEW to write the compiler/IDE for LabVIEW? And if a programming language were really good, in terms of software-engineering principles, for complicated applications, then there would be many vendors.

  206. eduardo_mero01 says:

    Sounds like the problems of a guy with just 2 days of LabVIEW “experience”. tolga is right: take some core courses and the points listed above become obsolete. Take some software-developer courses and you will see: LabVIEW, C#, C++, etc. are tools, each with its own advantages and disadvantages. Your arguments are like saying a screwdriver doesn’t do my hammer’s task, without knowing what a screwdriver is good for and how, or for what, you can use it. Very poor!

  209. Bill says:

    The real problem with LV is that it empowers non-programmers (yes, those clueless liberal-arts types) to create code. Have you ever seen code created by these clowns? I was involved in a relatively large LV project that was a real nightmare since it was built by hackers with no clue about software engineering.

    • Martin says:

      Yes, I did.
      My LiDAR app controls:
      – user events
      – visualization
      – pump laser diode control
      – USB interface
      – Ethernet
      – TDMS file handling
      And so on…
      The system load was 7% on a standard Windows Embedded system…
      This is the art of LabVIEW programming that I like.

  212. Paul Gaier says:

    I’ve been writing code in LabVIEW for over 20 years, and have never wanted to do anything else. 99% of your complaints are made by programmers who have no idea what they are doing. LabVIEW is a real programming language and requires a certain level of intelligence and effort to learn, just like any other language. I learned BASIC, C, C++ and dabbled in some Python and HTML. I had no desire to spend my life writing text files. The visualization of LabVIEW is key, much like drawing and reading schematics or architectural blueprints. Some of us are just as capable of understanding pictures as we are understanding text-based instructions. It’s always better to not criticize (or hate, despise, detest, loathe) something you simply (very simply) don’t understand.

    • chiraldude says:

      Kind of funny how out of date the initial comment is. I started using LabVIEW around 2000 and back then it really was just for running instruments in a lab. Version 2017 is so far from version 6. Too bad NI couldn’t have found a way to rename/rebrand it. It really should be called “G” for Graphical or “V” for Visual and the VI thing should have been changed to .vc or something similar.

    I feel your pain. National Instruments has promoted an over-complicated way of programming LabVIEW. It is possible to write tidy, simple code in LabVIEW, but it is years since I have seen anyone do it. If anyone has a budget and needs help starting a new project in a tidy way, or rewriting a project tidily, contact me at nally dot io. It will not be written in the NI way and you will never earn your CLD, but you and everyone else will understand the code and be able to maintain it.
