To acquire images from a GigE Vision camera, you first need to make sure you have the correct hardware and software. The requirements are listed below.
Once you have installed the hardware and software correctly, you must also configure the network. As discussed in Part I, a GigE Vision camera can obtain an IP address from a DHCP server or select one for itself using Link Local Addressing (LLA). If you connect the camera to a Gigabit Ethernet network with a DHCP server, the camera is detected automatically. If the camera is connected directly to the computer (using either a regular or crossover cable), you will need to wait about a minute for the camera to time out on the DHCP request and fall back to LLA. The Windows operating system may display a warning that the network card has only limited operation; you can ignore this warning. Note: This delay applies only to Windows XP and 2000, not to Windows Vista.
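If you are unsure whether the camera fell back to LLA, you can check which range its address came from: LLA addresses always fall in 169.254.0.0/16. A minimal Python sketch (the helper name is our own) using the standard `ipaddress` module:

```python
import ipaddress

def is_link_local(ip_string):
    """Return True if the address is in the 169.254.0.0/16 LLA range."""
    return ipaddress.ip_address(ip_string).is_link_local

# A camera that timed out on DHCP picks an address like this:
print(is_link_local("169.254.12.34"))   # True
# A DHCP-assigned address on a typical LAN:
print(is_link_local("192.168.1.50"))    # False
```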
Figure 1. Windows displays a warning when camera is directly connected
Typically, network drivers split any data larger than 1500 bytes into multiple packets. However, the GigE Vision standard allows packet sizes of up to 9014 bytes. These large packets, also known as Jumbo packets, allow the camera to transfer data across the network more efficiently. On many network cards, you can enable Jumbo packets from the Windows Device Manager by right-clicking the network card and selecting Properties.
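The efficiency gain comes from paying the per-packet protocol overhead fewer times. The sketch below illustrates this with rough numbers; the 36-byte header figure is our simplifying assumption for the combined per-packet overhead, and the exact value depends on the network stack:

```python
def packets_per_image(image_bytes, packet_size, header_bytes=36):
    # header_bytes approximates the per-packet protocol overhead
    # (illustrative assumption, not an exact on-wire figure).
    payload = packet_size - header_bytes
    return -(-image_bytes // payload)  # ceiling division

image = 640 * 480  # one 8-bit VGA frame
print(packets_per_image(image, 1500))  # 210 standard packets
print(packets_per_image(image, 9000))  # 35 jumbo packets
```

With jumbo frames, the same image crosses the wire in roughly one sixth as many packets, which also means far fewer interrupts for the host to service.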
Figure 2. Example of setting Jumbo packets on the Intel PRO/1000 Adapter
When a camera acquires an image, it immediately streams the data packets to the host. However, network firewalls typically block uninitiated incoming traffic, so these packets never reach their destination. Therefore, you will need to disable your firewall in order to acquire images from a GigE Vision camera. You can disable the Windows Firewall from the Control Panel (Start»Control Panel). However, if you have a network card with an Intel PRO/1000 chipset and you are using the High Performance driver, you do not need to disable the firewall. Because the High Performance driver redirects incoming GigE Vision packets to the NI-IMAQdx kernel driver before they reach the firewall, your firewall settings will not affect image acquisition.
Use Measurement and Automation Explorer (MAX) to verify that the camera has been discovered and that you can acquire images. Since the NI-IMAQdx driver supports Plug and Play (PnP), any GigE Vision camera on the same subnet as the host should automatically appear in the Devices and Interfaces subtree. GigE Vision cameras are enumerated under the NI-IMAQdx subtree and are identified by a special icon. If you are using NI-IMAQdx 4.3.5 or later, GigE Vision cameras appear in the Network Devices subtree instead.
Figure 3. MAX automatically detects any GigE Vision camera on the same subnet
MAX displays any GigE Vision camera on the same subnet as the host. However, NI-IMAQdx also allows you to acquire images from cameras on remote subnets. You can discover cameras on remote subnets by calling the appropriate function in the NI-IMAQdx API; for example, the C function IMAQdxDiscoverEthernetCameras() takes a parameter that specifies the subnet on which to discover cameras.

Once you are able to discover the camera in MAX, the next step is to acquire images from it. Select the camera from the subtree to open it in the main window. The various parts of the Acquisition tab and their descriptions are shown below.
Figure 4. The Acquisition Attributes page
Once you have set the acquisition parameters correctly, click the Snap button to acquire a single image or the Grab button to acquire images continuously.
NI-IMAQdx provides a unified API for acquiring images from IEEE 1394, USB3 Vision, and GigE Vision cameras. While some functionality is specific to one type of bus, most functions and VIs can be used with any of these cameras. This enables bus-agnostic development for image acquisition: you can replace your IEEE 1394 camera with a GigE Vision camera, or vice versa, with little or no change to your code.

The NI-IMAQdx LabVIEW API is divided into high-level and low-level VIs. Using the high-level VIs, you can program a simple snap, grab, or sequence operation. The low-level VIs allow you to perform the same tasks as the high-level VIs but give you greater control over execution details. Look at the examples that ship with LabVIEW to understand how to program image acquisition using NI-IMAQdx.
Figure 5. A simple grab example
The above example illustrates a simple Grab acquisition in LabVIEW. The acquired images are displayed in the Image indicator, and the buffer number is shown in the Buffer Number indicator. You may lose buffers if the while loop iterates more slowly than the camera's frame rate. In that case, an image copied into a buffer in memory is overwritten by another image before the original image can be processed. In most machine vision applications, it is important to notify the user if any frames have been missed.
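One common way to detect missed frames is to compare consecutive buffer numbers returned by the grab loop: if they are not consecutive, frames were skipped. A minimal Python sketch of that bookkeeping (the function name is our own; in LabVIEW you would do the equivalent with the Buffer Number output and a shift register):

```python
def missed_frames(buffer_numbers):
    """Count frames skipped in a sequence of buffer numbers from a grab loop.

    Buffer numbers increase by exactly 1 when no frame is missed.
    """
    missed = 0
    for prev, cur in zip(buffer_numbers, buffer_numbers[1:]):
        missed += cur - prev - 1
    return missed

# If the loop keeps up, buffer numbers are consecutive:
print(missed_frames([0, 1, 2, 3]))      # 0
# If processing is slow, the driver skips ahead:
print(missed_frames([0, 1, 4, 5, 9]))   # 5
```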
Cameras typically support several settable attributes that enable the camera to be flexible enough to work in different environments with varied constraints. While most machine vision cameras support some typical attributes, such as gain, shutter speed, or bit depth, many cameras have a unique attribute subset that is specific only to that camera or family of cameras.
Figure 6. The Camera Attributes tab in MAX displays all attributes
The GigE Vision standard defines a minimal set of attributes that are required to capture an image. These attributes, such as image width, height, pixel format, etc., must be supported by every GigE Vision camera. However, additional attributes supported by a camera can be exposed using the GenICam standard.
The GigE Vision specification relies on GenICam, a standard of the European Machine Vision Association (EMVA), to describe the features (attributes) supported by a camera. Every GigE Vision camera must provide an XML device description file conforming to the GenICam syntax. When a camera is connected and first selected in MAX, this XML file is retrieved and interpreted by NI-IMAQdx to enumerate the attributes the camera supports. Since each camera vendor provides an XML file specific to each camera, NI-IMAQdx can automatically populate the attributes specific to that camera.
Figure 7. Snippet of an XML file describing the gain attribute
Figure 7 shows a very simple example of the gain attribute being described in an XML file. Upon parsing this snippet of XML code, NI-IMAQdx determines the following:
Every such attribute supported by the camera will have a section of code in the XML file that defines the attribute parameters. You can examine the XML file manually by opening it from the <Program Files>\National Instruments\NI-IMAQdx\Data\XML directory. Note: The above example is a very simplistic representation of an attribute and is provided as an academic exercise. Typical XML files are far more complicated and involve many cross references.
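To make the idea concrete, here is a sketch of parsing a simplified, hypothetical attribute description in the spirit of Figure 7. The XML below is our own illustrative stand-in; real GenICam files use a much richer schema with cross-references between nodes:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified attribute description (not real GenICam).
snippet = """
<Integer Name="Gain">
  <Min>0</Min>
  <Max>255</Max>
  <Value>128</Value>
</Integer>
"""

node = ET.fromstring(snippet)
attribute = {
    "name": node.get("Name"),          # attribute name
    "type": node.tag,                  # representation (here, an integer)
    "min": int(node.findtext("Min")),  # smallest allowed value
    "max": int(node.findtext("Max")),  # largest allowed value
    "value": int(node.findtext("Value")),
}
print(attribute)
```

A driver performing this kind of parsing for every node in the file is, in essence, how the Camera Attributes tab can be populated for a camera it has never seen before.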
Camera settings can be controlled by using MAX (see Figure 6) to set the values of attributes exposed in the XML file. However, many applications need to change camera attributes programmatically. The NI-IMAQdx API provides methods to change the value of any attribute exposed in the XML file. While camera attributes can be set in any supported API, we shall discuss the implementation in LabVIEW.

Every attribute supported by a camera is defined by these (non-exhaustive) properties:
The camera manufacturer can provide you with documentation detailing the properties for each attribute. If the documentation is not available, you can use MAX to determine the properties of a certain attribute. To do so, simply select the desired attribute from the Camera Attributes tab. For example, we shall examine the ExposureTimeAbs attribute of a Basler Scout scA640-70gm camera. From Figure 8, we can determine that the attribute ExposureTimeAbs is a floating point number with microseconds as its unit.
Figure 8. ExposureTimeAbs Attribute in MAX
In LabVIEW you can set the attribute values using a Property Node. However, at development time, LabVIEW cannot know the attribute's name or representation. Therefore, you will need to provide the attribute name and call the appropriate function depending on the attribute representation (integer, string, boolean, etc.).
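The pattern of choosing a setter based on the attribute's representation can be sketched in a few lines of Python. Everything here is a stand-in: the `set_attribute` helper and the dictionary "camera" are ours, illustrating the dispatch rather than any real driver call:

```python
# Map each representation to a conversion applied before the value is stored.
# (Illustrative only; a real driver exposes one setter per value type.)
SETTERS = {
    "float": float,
    "integer": int,
    "boolean": lambda v: str(v).lower() in ("true", "1"),
    "string": str,
}

def set_attribute(camera, name, representation, value):
    """Coerce value to the attribute's representation, then store it."""
    camera[name] = SETTERS[representation](value)

camera = {}
set_attribute(camera, "ExposureTimeAbs", "float", "10000")
print(camera["ExposureTimeAbs"])  # 10000.0
```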
Figure 9. Setting the ExposureTimeAbs attribute of a Basler scA640-70gm
While GenICam provides a flexible method to control cameras, the standard alone is not enough to guarantee interoperability. Interoperability is the ability to switch between different cameras and still maintain the functionality of the application software. For example, suppose we replace the Basler camera in the above example with a camera from another manufacturer. This camera might represent the ExposureTimeAbs attribute in nanoseconds or as an integer, or may even refer to the attribute by a different name. Clearly, each camera would produce different acquisition results for the same inputs.

To improve interoperability, EMVA, in partnership with camera manufacturers, created the GenICam Standard Features Naming Convention. The goal of this document is to standardize the name, representation, access, unit, and function of many of the attributes common to most cameras. By using this naming convention together with the GenICam standard, camera manufacturers can promote interoperability with other cameras for standard features while still giving users access to the unique features of their cameras.
In most machine vision applications, the camera needs to take images based on real-world events. For example, a bottle inspection system must capture each image when the bottle is in exactly the same position on the conveyor belt with respect to the camera. This makes each bottle appear in the same location in the image and thus simplifies the image processing. You can achieve such control using hardware triggers.

In typical hardware-triggered systems, a proximity sensor or an encoder sends pulses to trigger an acquisition. In many cases, the trigger is connected to a frame grabber, which initiates an acquisition. However, due to the potentially long cable runs possible with GigE Vision cameras (up to 100 meters), triggering the frame grabber is not feasible. Therefore, all trigger signals must be connected directly to the camera.

In GenICam, selecting trigger modes works just like setting camera attributes; in fact, trigger modes are attributes in GenICam. You can use the same API discussed in the previous section to set trigger modes. The GenICam Standard Features Naming Convention defines several trigger control features that let you customize the behavior of your triggered action. While camera manufacturers are not required to implement all triggering modes, the most commonly used modes are described below.
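Because trigger modes are just attributes, configuring a hardware trigger reduces to a handful of attribute writes. A sketch under that view, using feature names from the Standard Features Naming Convention; the `configure_trigger` helper and the dictionary "camera" are our own illustrative stand-ins, not a driver API:

```python
def configure_trigger(camera, source="Line1", activation="RisingEdge"):
    """Enable hardware triggering by writing trigger-control attributes.

    Attribute names follow the GenICam Standard Features Naming Convention;
    the camera dict merely simulates attribute storage.
    """
    camera["TriggerMode"] = "On"            # arm the trigger
    camera["TriggerSource"] = source        # physical input line
    camera["TriggerActivation"] = activation  # which edge fires the trigger
    return camera

cam = configure_trigger({})
print(cam["TriggerMode"], cam["TriggerSource"])  # On Line1
```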
Jumbo Packets: If your NIC, or any intermediate network hardware (switch, router, etc.), does not support Jumbo packets, you will be limited to a packet size of less than 1500 bytes. The GigE Vision packet size cannot be greater than the maximum packet size allowed by the NIC.

Firewalls: Many corporate networks employ firewalls for network security. However, you cannot acquire from GigE Vision cameras with the firewall enabled unless you use the High Performance driver. If your company's network policy does not allow you to disable the firewall or use a different network driver, you will need a system dedicated to image acquisition that is not part of the corporate network.

Corrupt XML files: As with any new standard, camera manufacturers routinely release new revisions of their firmware. If you get an error stating that the XML file is corrupt, contact the camera manufacturer for the latest revision of the firmware.

Interoperability: While GenICam gives camera manufacturers the flexibility to create a custom attribute set, it makes it difficult to switch between cameras without modifying your code. The GenICam Standard Features Naming Convention alleviates this problem to a certain extent, but most of its conventions are recommendations rather than requirements. A camera manufacturer may therefore deviate from the convention, in which case the application software will need to be modified to remain interoperable with other cameras.