
Speeding up linear gridding of irregular points with multiple values (GRIDDATA)

Anonym

Gridding or interpolating large amounts of data is a common task for some IDL and ENVI users. Here I am showing a trick that can speed up bilinear interpolation over a triangulated collection of irregularly gridded points in 2-D. The assumption is that there are multiple values (several hundred, say) at each distinct point (x,y); instead of calling GRIDDATA repeatedly for those values at the same locations, the code pre-computes the interpolation weights for the triangle corners. This saves computation in the final step and thus achieves a nice speed improvement.
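
To see why the weights can be shared, recall that linear interpolation on a triangulation is just a weighted sum of the three corner values. Writing z_b for the measurement in band b (my notation, not from the code), the interpolated value at an output cell p that falls inside a triangle with corners v1, v2, and v3 is

    z_b(p) = w1(p)*z_b(v1) + w2(p)*z_b(v2) + w3(p)*z_b(v3),   with w1 + w2 + w3 = 1

where the barycentric weights w1, w2, w3 depend only on the point locations, not on the measured values. Because all of the bands share the same (x,y) locations, the enclosing triangle and its weights only need to be found once; every additional band then costs just three multiply-adds per output cell.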

On my computer, the run time dropped from 31.4 seconds to 8.4 seconds. Here is the output produced by the example code:

IDL> grid_speed
% Compiled module: GRID_SPEED.
% Time elapsed: 31.422000 seconds.
% Time elapsed: 8.4380002 seconds.
Mean, Variance, Skewness, Kurtosis
    0.426667    0.0376666      1.26416     0.492909
    0.426667    0.0376666      1.26416     0.492909

min, mean, max difference
-1.19209e-007 1.24474e-011 1.19209e-007

Here is the example code:


pro grid_speed
  compile_opt idl2, logical_predicate

  ; Set up random data points:
  ; 200,000 spatial (X,Y) points with 400 measurements each
  npts = 200000
  nbands = 400
  im = randomu(seed, npts, nbands)
  x = randomu(seed, npts)
  y = randomu(seed, npts)

  ; Set up an output gridded space for the desired locations
  nx = 768
  ny = 768
  start = [0, 0]
  delta = 1d / [nx, ny]
  dim = [nx, ny]
  gridIm1 = fltarr(nx, ny, nbands)
  gridIm2 = fltarr(nx, ny, nbands)

  ; Traditional approach for bilinear gridding:
  ; call GRIDDATA once per band
  tic
  triangulate, x, y, tr
  for i=0, nbands-1 do begin
    gridIm1[0,0,i] = griddata(x, y, im[*,i], triangles=tr, /linear, $
      start=start, delta=delta, dimension=dim)
  endfor
  toc

  tic
  triangulate, x, y, tr

  ; Compute the triangle number for each output grid cell.
  ; Multiply by 3 so that triangles are numbered by their
  ; starting index in TR: 0, 3, 6, 9, ...
  index = lindgen(n_elements(tr))/3*3
  xt = x[tr[*]]
  yt = y[tr[*]]
  linTr = lindgen(size(tr, /dimensions))
  tr_num = round( $
    griddata(xt, yt, float(index), triangles=linTr, /linear, $
      start=start, delta=delta, dimension=dim))

  ; Compute weights for each of the 3 points in the enclosing triangle
  wts = ptrarr(3)
  for i=0, 2 do begin
    w = griddata(xt, yt, lindgen(n_elements(xt)) mod 3 eq i, $
      triangles=linTr, /linear, $
      start=start, delta=delta, dimension=dim)
    wts[i] = ptr_new(w, /no_copy)
  endfor

  ; Compute the interpolation for all bands using the precomputed
  ; weights instead of GRIDDATA
  for i=0, nbands-1 do begin
    gridIm2[0,0,i] = im[tr[tr_num] + i*n_elements(x)] * (*wts[0])
    gridIm2[*,*,i] += im[tr[tr_num+1] + i*n_elements(x)] * (*wts[1])
    gridIm2[*,*,i] += im[tr[tr_num+2] + i*n_elements(x)] * (*wts[2])
  endfor
  toc

  ; Verify that the results are the same for both methods
  print, 'Mean, Variance, Skewness, Kurtosis'
  print, moment(gridIm1)
  print, moment(gridIm2)
  print
  diff = gridIm2 - gridIm1
  print, 'min, mean, max difference'
  print, min(diff, max=maxDiff), mean(diff), maxDiff
end
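
One optional bit of housekeeping, not part of the original example: the weight planes are stored through pointers and therefore live on the IDL heap, so in a long-running session you may want to release them once gridIm2 has been built:

  ptr_free, wts   ; free the heap variables holding the three weight planes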
