Feeding Large Arrays Into Variant Attributes in LabVIEW

Updated Oct 8, 2024

Reported In

Software

  • LabVIEW

Issue Details

I am storing several instances of large 2D arrays as Variant Attribute values. When using 32-bit LabVIEW I was getting multiple memory errors, and once I switched to 64-bit LabVIEW the memory errors went away.

Is there any risk in using large 2D arrays as Variant Attribute values?

Solution

Providing large 2D arrays as Variant Attribute values is a risk because it can lead to memory issues. Each copy of such an array can be hundreds of MB, and copying a variant that holds several of them multiplies that footprint. In addition, each array must occupy a single contiguous allocation. As arrays are allocated and deallocated, memory becomes fragmented: even when there is enough total memory free, the free space may be scattered in small blocks, leaving no single region large enough for the array. This explanation is supported by the problem not appearing in 64-bit LabVIEW, where the address space is much larger.
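Since LabVIEW code is graphical and cannot be shown as text, the arithmetic behind the risk can be sketched instead. The dimensions below are hypothetical (not from the original report); the point is that a single 2D array of DBLs at this size approaches 200 MB, and a handful of copies consumes a large fraction of a 32-bit process's usable address space (roughly 2 GB), before fragmentation is even considered:

```python
# Rough memory-footprint math for a large 2D array of LabVIEW DBLs.
# The dimensions and copy count are hypothetical illustrations.

rows, cols = 5000, 5000        # hypothetical array dimensions
bytes_per_elem = 8             # a LabVIEW DBL is 8 bytes

one_array_mb = rows * cols * bytes_per_elem / 1024**2
print(f"One array: {one_array_mb:.1f} MB")

copies = 6                     # e.g. original data plus variant copies
total_gb = copies * rows * cols * bytes_per_elem / 1024**3
print(f"{copies} copies: {total_gb:.2f} GB")

# A 32-bit process typically has ~2 GB of usable address space, and
# each array must also be one *contiguous* block of that space.
```

Note that the contiguity requirement is what makes fragmentation fatal: 1.1 GB of total free space does no good if no single free region is 190 MB wide.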


Additional Information

If this application must stay in 32-bit, it will need some focus on reducing memory usage. Start with LabVIEW's profiling tools (Tools>>Profile...) to identify areas for improvement. If that does not free up enough memory, some strategic use of Data Value References (DVRs) may be appropriate: the variants themselves could be placed in DVRs, or DVRs could be used as the variant attribute values, so that only a reference rather than the array data is copied.
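The by-reference idea behind DVRs can be sketched with a Python analogy (an analogy only, not LabVIEW behavior; the names and sizes are invented for illustration). Storing a copy of the array per attribute means one allocation per attribute, while storing a shared reference keeps a single allocation no matter how many attributes point at it:

```python
# Analogy for by-value vs. by-reference (DVR-like) attribute storage.
# Hypothetical names/sizes; LabVIEW's actual DVRs are graphical nodes.
import numpy as np

big = np.zeros((1000, 1000))   # one ~7.6 MB allocation (kept small here)

# By-value style: each "attribute" holds its own copy -> N allocations,
# analogous to variants copying the array data on every assignment.
by_value = {name: big.copy() for name in ("a", "b", "c")}

# By-reference style: each "attribute" shares the one buffer,
# analogous to storing a DVR in the variant attribute instead of data.
by_ref = {name: big for name in ("a", "b", "c")}

print(all(arr is big for arr in by_ref.values()))      # shared, no copies
print(any(arr is big for arr in by_value.values()))    # independent copies
```

The trade-off is the usual one for references: the data is no longer copied on access, so concurrent readers and writers must coordinate, which LabVIEW's DVR In Place Element structure enforces.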