Ferret v6.6.4 is required for LAS v7.2Beta. It contains several new items and a few bug fixes.
Version 6.6 was the first version of Ferret linked with the netCDF-4 library for netCDF and OPeNDAP dataset access. It was linked in such a way that it used many shared-object libraries, and a number of users had trouble finding matching versions of those libraries on their systems. v6.6.4 is linked with as many static libraries as possible, which should reduce these difficulties.
NOTE: A serious bug has been found in this version of Ferret which affects computation of definite integrals. Zonal integrals return results that are too small by a factor of pi/180 (0.01745). This bug first appeared in Ferret v6.42 and was fixed in version 6.67. Installations of LAS which run these versions of Ferret are unaffected by this bug.
New items:

1) We have encountered a number of datasets containing variables whose missing data are filled with NaN, but where NaN is not listed in the variable's missing_value or _FillValue attribute. Even though this does not conform to the CF standard, Ferret now accommodates these datasets by sweeping through data read from netCDF files and replacing any NaNs with the variable's bad-flag. Testing has shown that this does not noticeably affect performance.
2) If a variable in a netCDF file has no missing_value or _FillValue attribute, Ferret treats its missing data as if they were NaN. When a dataset is opened, a NOTE is issued if any variables lack these attributes, for instance,
yes? use test0
*** NOTE: If no missing_value or _FillValue attribute on variables, will use NaN
3) A new qualifier, SET AXIS/REGULAR, replaces the coordinates of an existing axis with regularly-spaced coordinates having the same start, end, number of points, and units. This is useful if an axis was created in such a way that its coordinates appear to be irregularly spaced but should in fact be treated as a regularly-spaced axis.
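A sketch of its use, assuming an existing axis named xlon (a hypothetical name) whose coordinates were read as very slightly irregular:
yes? SET AXIS/REGULAR xlon
The axis keeps its start, end, number of points, and units, but is thereafter treated as regularly spaced.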
4) New smoothing transformations, @SMX and @SMN, smooth the data by replacing each value with the maximum (@SMX) or minimum (@SMN) of the data in the smoothing window. These are analogous to @MED, which uses the median of the data.
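A sketch of their use; sst is a hypothetical gridded variable, and the 5-point window width is arbitrary:
yes? LIST sst[X=@SMX:5]   ! running maximum over a 5-point window
yes? LIST sst[X=@SMN:5]   ! running minimum over a 5-point window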
5) Symbols are automatically created when a vector plot is drawn:
- PPL$VECLEN containing the vector length scale. This may have been computed by Ferret or specified by the user with the VECTOR/LENGTH qualifier
- PPL$VEC_XSKIP, PPL$VEC_YSKIP containing the skip values. These may be computed by Ferret, or specified by the user with VECTOR/XSKIP=/YSKIP=
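For example, after drawing a vector plot one might inspect these symbols (u and v here are hypothetical velocity components):
yes? VECTOR u,v
yes? SHOW SYMBOL PPL$VECLEN
yes? SHOW SYMBOL PPL$VEC_XSKIP
yes? SHOW SYMBOL PPL$VEC_YSKIP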
6) A new qualifier VECTOR/KEY forces a vector key to be made, even if /NOLAB is present. Without /KEY, the VECTOR/NOLAB qualifier suppresses the key.
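For instance, to draw a vector plot with no labels other than the vector key (u and v again being hypothetical components):
yes? VECTOR/NOLAB/KEY u,v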
7) A new function MINMAX returns the minimum and maximum values of its argument. It is less compute-intensive than running the STAT command.
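A sketch of its use, with sst standing in for any variable:
yes? LIST MINMAX(sst)
The result holds two values: the minimum and the maximum of sst.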
Bug fixes:

1) Previously, when the STAT command was run on a variable with large values, overflow in the computation of the standard deviation resulted in a note: Standard deviation: **too big**
In v6.6.4 the algorithm for computing variances has been replaced with a more robust algorithm, so that the STAT command can return the correct result. Very large data will still return **too big** when the result would overflow single-precision representation.
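For example, running STAT on a hypothetical large-valued variable big_var:
yes? STAT big_var
now reports a numeric standard deviation rather than **too big**, except in the extreme overflow case noted above.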
2) The variance-based color levels calculation has also incorporated the improved algorithm for variance, so that data fields with large variances can be plotted with /LEVELS=v. Previously, attempts to do such a plot resulted in an error: "value out of legal range; data too large to compute Std Dev."
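For instance, a shade plot with variance-based color levels of a hypothetical large-valued field big_var:
yes? SHADE/LEVELS=v big_var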
3) The LIST command for writing ASCII data previously truncated string variables longer than 22 characters. This limit has been increased so that strings may be as long as the maximum record length, 1024 characters.