## Solution

The SGL datatype uses the IEEE 754 binary32 format. Although it is a 32-bit datatype, only **23 bits** are used to store the significand in memory. As a result, the total precision of the SGL datatype is about **6 1/2** decimal digits. Thus, if a decimal string with more than **6 significant digits** is converted to SGL representation, it may suffer a **round-off error** produced by quantization of the real number in binary.
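As a quick illustration of this quantization, the sketch below (in Python, using only the standard library) round-trips a value through 32-bit IEEE 754 storage; the helper name `to_sgl` is chosen here for illustration and is not part of any particular API:

```python
import struct

def to_sgl(x):
    # Round-trip a Python float (binary64) through IEEE 754 binary32
    # storage, i.e. the same representation as the SGL datatype.
    return struct.unpack('f', struct.pack('f', x))[0]

value = 3.14159265358979   # more than 6 significant decimal digits
stored = to_sgl(value)

print(stored)              # the nearest binary32 value, not the original
print(abs(stored - value)) # round-off error introduced by quantization
```

The stored value agrees with the original only through roughly the first 7 significant digits; everything beyond that is lost to the 23-bit significand.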

The following alternatives may be useful when precision greater than 6 decimal digits is required.