After data sets have been screened, they may still require conversion to a common time-series interval before they can be used together in computations. To calculate hourly reservoir inflow, the lake-level and outflow data sets must be normalized to the same hourly interval.

Task 1. Convert to hourly intervals

  1. Clear any time window that may be set, then select the data set named
    /FOX RIVER/LUTZ PARK/FLOW-RES OUT//15MIN/USGS-CST-FIXED/
  2. Launch the Math Functions module.
  3. Set the ‘Selected Data Set’ field to the outflow pathname and tabulate the data.
    1. Note that the outflows occur every 15 minutes, at an offset of 3 minutes. 
    2. Subsequent calculations require hourly flows averaged at the top of the hour.
  4. Select the ‘Min/Max/Avg/…Over Period’ operator under “Time Functions”.
  5. Select a "Function Type" of "Average over Period" (a conceptual sketch of this operation follows these steps).
  6. Set the time interval to 1HOUR.
  7. Leave the offset box unchecked - the default is to use the end-of-period (i.e., “top of the hour”). 
  8. Click ‘Compute’.
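
The sketch below (ordinary Python, not HEC-DSSVue code) illustrates conceptually what "Average over Period" computes: every 15-minute value falling within an hour is averaged, and the result is labeled at the end of the period. The flow values and the exact handling of the 3-minute offset are assumptions for illustration only.

    import math
    from datetime import datetime, timedelta

    def average_over_period(samples, interval=timedelta(hours=1),
                            epoch=datetime(2000, 1, 1)):
        """Average every sample in each interval (end - interval, end] and
        label the result at the end of the period (end-of-period labeling)."""
        buckets = {}
        for t, v in samples:
            n = math.ceil((t - epoch) / interval)   # index of the period containing t
            end = epoch + n * interval
            buckets.setdefault(end, []).append(v)
        return [(end, sum(vs) / len(vs)) for end, vs in sorted(buckets.items())]

    # Hypothetical 15-minute flows (cfs) at a 3-minute offset, like the Lutz Park record
    flows = [
        (datetime(2000, 9, 22, 17, 3),  5300.0),
        (datetime(2000, 9, 22, 17, 18), 5280.0),
        (datetime(2000, 9, 22, 17, 33), 5260.0),
        (datetime(2000, 9, 22, 17, 48), 5240.0),
    ]
    print(average_over_period(flows))   # one PER-AVER value labeled 22Sep2000, 18:00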


Question 1. Tabulate the new data set. What is the “Type” of the resulting data?  How does this affect the values computed for the new time interval? 

The data generated is of type "PER-AVER" (period average). This affects the computed values because each hourly value is the average of the data over that period; since the values are averages, the hourly peaks will not be as high as the peaks in the 15-minute data.

Extra Credit: What would be the hourly flow for 1800 on 22Sep2000 if the 15-minute flows were instantaneous values instead of PER-AVER?

If the data were instantaneous values, we would expect the 18:00 value to be close to the 5,223 cfs reported at 18:03. Using the function type "Interpolate at End of Period" (which generates an INST-VAL data type) gives a value of 5,238 cfs at 22Sep2000, 18:00.
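
As a check on that extra-credit answer, the short sketch below shows the linear interpolation involved. The 18:03 flow of 5,223 cfs comes from the record; the 17:48 flow is a hypothetical placeholder chosen only so the arithmetic lands near the 5,238 cfs quoted above.

    def interpolate(t0, v0, t1, v1, t):
        """Linearly interpolate between (t0, v0) and (t1, v1) at time t (minutes)."""
        frac = (t - t0) / (t1 - t0)
        return v0 + frac * (v1 - v0)

    v_1800 = interpolate(17 * 60 + 48, 5298.0,   # 17:48 reading (hypothetical)
                         18 * 60 + 3,  5223.0,   # 18:03 reading (from the record)
                         18 * 60)                # interpolate at 18:00
    print(round(v_1800))   # 5238 -> an INST-VAL estimate rather than a PER-AVER value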


  1. Save your new data set using the suggested E-part of 1HOUR (this is generated automatically from the new time step).

Task 2. Interpolation

The two elevation data sets will later be averaged into a mean reservoir level, but they must first be normalized to hourly data. The irregular Fond Du Lac data set consists of readings taken at the top of each hour, plus one taken every 4 hours at the satellite ‘shoot’ time.

  1. Clear any time window that may be set, then select the data set named
    /LAKE WINNEBAGO/FOND DU LAC/ELEV//IR-MONTH/COE-CST/
  2. Launch the Math Functions module.
  3. Set the ‘Selected Data Set’ field to the Fond Du Lac pathname and select the ‘Irregular to Regular’ Time Functions operator.
  4. Select the “Interpolate” option (see the sketch after these steps).
  5. Set the time interval to 1HOUR.
  6. Leave the offset box unchecked - the default is to use the top of the hour. 
  7. Click ‘Compute’.
  8. Tabulate the results to verify that only the top-of-hour values remain.
  9. From the Math Functions module’s File menu, save the data set using the suggested E-part of 1HOUR.
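
As a conceptual aid (ordinary Python, not HEC-DSSVue code), the sketch below shows what the ‘Irregular to Regular’ Interpolate option does: the irregular record is linearly interpolated at each regular hourly time, so readings that already fall on the hour pass through unchanged and the extra ‘shoot’ readings drop out. The times and elevations are hypothetical.

    from datetime import datetime, timedelta

    def irregular_to_regular_interpolate(samples, start, end, step=timedelta(hours=1)):
        """samples: (datetime, value) pairs sorted by time; start is assumed
        to be at or after the first reading."""
        out, i, t = [], 0, start
        while t <= end:
            while i + 1 < len(samples) and samples[i + 1][0] <= t:
                i += 1
            t0, v0 = samples[i]
            if t0 == t or i + 1 == len(samples):
                out.append((t, v0))                      # exact hit (or hold last value)
            else:
                t1, v1 = samples[i + 1]
                out.append((t, v0 + (t - t0) / (t1 - t0) * (v1 - v0)))
            t += step
        return out

    readings = [
        (datetime(2000, 9, 22, 14, 0),  746.10),   # top-of-hour reading
        (datetime(2000, 9, 22, 15, 0),  746.12),
        (datetime(2000, 9, 22, 16, 0),  746.11),
        (datetime(2000, 9, 22, 16, 17), 746.11),   # extra satellite 'shoot' reading
        (datetime(2000, 9, 22, 17, 0),  746.13),
    ]
    hourly = irregular_to_regular_interpolate(
        readings, datetime(2000, 9, 22, 14, 0), datetime(2000, 9, 22, 17, 0))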

Task 3. Snapping

The 30-minute Jefferson Park data set consists of instantaneous readings taken at 2 and 32 minutes past each hour. If simply converted to an hourly interval, the level at the top of each hour would be linearly interpolated between the readings at 32 minutes past the previous hour and 2 minutes past the hour. This has the unfortunate effect of attenuating the high and low points. It is preferable to avoid this attenuation by extracting the 2-minute-offset readings and “snapping” them back to the top of the hour.
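
A minimal sketch of the Snap idea follows (ordinary Python, not HEC-DSSVue code, with hypothetical readings): only values falling within the forward tolerance of a regular hourly time are kept, and they are re-labeled to that time; everything else is dropped, so no interpolation (and no attenuation) occurs. The real ‘Irregular to Regular’ Snap operator may handle tolerances and offsets differently.

    from datetime import datetime, timedelta

    def snap_to_hourly(samples, tol_forward=timedelta(minutes=2)):
        """Keep readings within tol_forward after a top-of-hour time and
        re-label them to that time (written for a 1-hour step for clarity)."""
        out = []
        for t, v in samples:
            top = t.replace(minute=0, second=0, microsecond=0)   # hour containing t
            if t - top <= tol_forward:
                out.append((top, v))       # e.g. an 18:02 reading becomes 18:00
        return out

    readings = [
        (datetime(2000, 9, 22, 17, 32), 746.20),   # dropped (outside tolerance)
        (datetime(2000, 9, 22, 18, 2),  746.25),   # kept, snapped to 18:00
        (datetime(2000, 9, 22, 18, 32), 746.22),   # dropped
    ]
    print(snap_to_hourly(readings))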

  1. Clear any time window that may be set, then select the data set named
    /LAKE WINNEBAGO/JEFFERSON PARK/ELEV//30MIN/COE-CST/
  2. Launch the Math Functions module.
  3. Set the ‘Selected Data Set’ field to the Jefferson Park pathname and select the ‘Irregular to Regular’ Time Functions operator.
  4. Select the “Snap” option.
  5. Set the time interval to 1HOUR. 
  6. Check the ‘Tolerance Forward’ offset box and specify ‘2’ and ‘Minute’.
  7. Click ‘Compute’.
  8. Tabulate the results to verify that only the values originally at a 2-minute offset remain, and that they are now shown with an offset of 0 minutes.
  9. Save the data set using the suggested E-part of 1HOUR. 
  10. Exit the Math Functions screen.
  11. From the HEC-DSSVue Main Window, tabulate the new hourly pathnames for Fond Du Lac, Jefferson Park, and Lutz Park to verify that all the data occurs at the top of the hour.
    1. (NOTE: the Jefferson Park times might still appear offset by 2 minutes due to a bug in some versions of DSSVue 3.0 during the save to DSS. If this is the case, proceed using the pathname “/LAKE WINNEBAGO/JEFFERSON PARK/ELEV//1Hour/COE-CST-FIXED/”)