With the lake levels and outflows screened and normalized to the same time scale, hourly reservoir inflow can be computed using the standard equation: 

\sum I = \frac{S_2 - S_1}{\Delta t} + \sum O

where

S_2 and S_1 are the storages at the end and the beginning of the time period

\Delta t is the length of the time period

\sum O is the total outflow rate

\sum I is the net inflow rate

For this workshop, \Delta t equals 1 hour, or 3600 seconds.  The hourly average Lutz Park data provides the total outflow rate in cfs.  The only terms not already in hand are the top-of-the-hour storages.  These will be calculated from hourly instantaneous reservoir levels using a storage-elevation relationship.  The reservoir levels are smoothed averages of the hourly levels at Fond Du Lac and Jefferson Park.
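
As a point of reference, the mass balance can be checked outside DSSVue with a few lines of Python.  The sketch below is illustrative only: the storage and outflow numbers are made up and the helper name is hypothetical.  It simply evaluates \sum I = (S_2 - S_1)/\Delta t + \sum O in consistent units (cubic feet and seconds), so the result comes out in cfs.

    # Illustrative sketch of the hourly mass balance (not a DSSVue script).
    # S1 and S2 are storages in cubic feet, dt is in seconds, outflow is in cfs.

    def net_inflow_cfs(s1_ft3, s2_ft3, outflow_cfs, dt_s=3600.0):
        """I = (S2 - S1)/dt + O; with ft^3 and seconds the result is in cfs."""
        return (s2_ft3 - s1_ft3) / dt_s + outflow_cfs

    # Made-up example: storage rises by 2.178e7 ft^3 (500 acre-ft) during the hour
    # while 4,200 cfs is released downstream.
    print(net_inflow_cfs(4.356e10, 4.356e10 + 2.178e7, 4200.0))  # 6050 + 4200 = 10250 cfs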

Task 1. Average Elevations

  1. Clear any time window that may be set in HEC-DSSVue, then select the data sets named
    /LAKE WINNEBAGO/FOND DU LAC/ELEV//1HOUR/COE-CST/
    /LAKE WINNEBAGO/JEFFERSON PARK /ELEV//1HOUR/COE-CST/

  2. Launch the Math Functions module.

    If your Lake Winnebago/Jefferson Park data were not shifted from 2 minutes after the hour and 2 minutes after the half hour to fall directly on the hour and half hour, use the COE-CST-FIXED data set instead of the COE-CST data set.

  3. Set the ‘Selected Data Set’ field to the Fond Du Lac pathname, then open the Display menu.
  4. Select the “All Data Sets” option and plot all the data.
  5. Modify the plot settings until the data sets can be easily distinguished.
    1. Note that the lake levels at the north and south ends fluctuate in opposite directions.  This ‘sloshing’ effect (seiche) is caused by wind setup and by atmospheric pressure differences from passing fronts.  The average of these two gages provides the best approximation of the still-water reservoir level needed for determining the storages.
  6. Close the plot.
  7. Average the two elevations by adding the data sets together, then dividing the result by 2.  Be very careful with the mouse when stacking consecutive computations, and start over if a mistake occurs.  (The arithmetic is sketched after this task's steps.)

    1. Plot your results.

      Enable the Display menu options for “Original data with computed” and “All Data Sets” to see the results and both lake levels together.


  8. Save your results using a B-part of ‘MEAN LAKE’
  9. Exit the Math Functions screen.
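
For reference, the averaging in step 7 is just a point-by-point mean of the two gage records.  A tiny Python sketch is shown below; the level values (in feet) and list names are hypothetical.

    # Illustrative only: the same arithmetic DSSVue performs when the two
    # data sets are added and the result is divided by 2.
    fond_du_lac    = [746.12, 746.18, 746.25, 746.21]   # hypothetical levels, ft
    jefferson_park = [746.30, 746.24, 746.17, 746.20]   # hypothetical levels, ft

    mean_lake = [(f + j) / 2.0 for f, j in zip(fond_du_lac, jefferson_park)]
    print(mean_lake)  # opposite-direction seiche swings largely cancel in the mean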

Task 2. Smooth Elevations

Even though the mean levels provide a much closer approximation of the still-water reservoir level, wind-driven fluctuations still render them unsuitable for directly determining true reservoir storages.  Smoothing the levels with a moving-average technique damps out the short-term fluctuations and provides a more realistic measure of storage changes.

  1. Clear any time window that may be set in HEC-DSSVue, then select the data set named /LAKE WINNEBAGO/MEAN LAKE/ELEV//1HOUR/COE-CST/
  2. Launch the Math Functions module.
  3. Smooth the elevation data using the ‘Forward Moving Average’ smoothing technique.  Use 7 values to compute the average.
  4. Plot the results. 

  5. Modify the plot settings in order to distinguish the lines.  The smoothed data set should generally track through the middle of the range of fluctuations.      

    Extra Credit: Why would provisional real-time water management calculations favor smoothing with a forward moving average, while smoothing of historical data often uses a centered moving average?  What should real-time water managers keep in mind when interpreting data smoothed with a forward moving average?

    The forward moving average provides results up to the latest value of the original unsmoothed data, and having the latest data is important for real-time operations.  With a centered moving average, the computed values stop when the averaging period runs out of actual data.

    The forward moving average ends up "lagging" the original data by half the averaging period.  For example, in the plot below note how the forward moving average shows the level on 7Oct continuing to decline for 3 hours after the actual data starts to rise.  The subsequent peak and decline exhibit the same lag.  (A simple sketch contrasting the two averaging methods follows this task's steps.)

    Finally, note that DSSVue offers a "Reduced Number of Values" option for the centered moving average, which will compute results up to the latest original value despite missing values at the end of the averaging period.  This confines the lag effect to just the most recent few values, but those last values are also less smoothed.

  6. Save your results using a B-part of ‘SMOOTH MEAN LAKE’
  7. Exit the Math Functions screen.
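
The sketch below (plain Python, not DSSVue code, with made-up numbers and hypothetical function names) contrasts a 7-value forward moving average with a 7-value centered moving average, illustrating the lag and end-of-record behavior discussed in the Extra Credit note above.

    def forward_moving_average(values, window=7):
        """Average of the current value and the (window - 1) values before it."""
        out = []
        for i in range(len(values)):
            if i + 1 < window:
                out.append(None)                  # not enough history yet
            else:
                out.append(sum(values[i - window + 1:i + 1]) / window)
        return out

    def centered_moving_average(values, window=7):
        """Average of the current value and (window - 1)/2 values on each side."""
        half = window // 2
        out = []
        for i in range(len(values)):
            if i < half or i + half >= len(values):
                out.append(None)                  # averaging period runs past the data
            else:
                out.append(sum(values[i - half:i + half + 1]) / window)
        return out

    levels = [746.0, 746.1, 746.3, 746.2, 746.4, 746.5, 746.3, 746.6, 746.8, 746.7]
    print(forward_moving_average(levels))   # defined through the latest value, but lags
    print(centered_moving_average(levels))  # no lag, but undefined for the last 3 (and first 3) values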

Task 3. Compute Storage

  1. Clear any time window that may be set in HEC-DSSVue, then select the data sets named
    /LAKE WINNEBAGO/SMOOTH MEAN LAKE/ELEV//1HOUR/COE-CST/
    /LAKE WINNEBAGO/MEAN LAKE/ELEV-STOR//1973/OSKKOSH DATUM/
  2. Launch the Math Functions module.
  3. Use the smoothed elevations and the elevation-storage curve to compute reservoir storages with the ‘Rating Table’ Hydrologic function.  (A simple interpolation sketch follows this task's steps.)
  4. Tabulate and plot the results.

    Question 1. What are the units of the computed storages?

  5. Save the storage results
    1. B-part of ‘SMOOTH MEAN LAKE’
    2. C-part of ‘STOR’.
  6. Exit the Math Functions screen.
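
Conceptually, the ‘Rating Table’ function is a table lookup with interpolation between the curve points.  The sketch below uses a hypothetical elevation-storage table (not the actual 1973 curve) and simple linear interpolation to show the idea.

    # Illustrative elevation-to-storage lookup with linear interpolation.
    # The elevation-storage pairs are hypothetical placeholders.
    from bisect import bisect_left

    elev_ft   = [744.0, 745.0, 746.0, 747.0, 748.0]
    stor_acft = [850000.0, 980000.0, 1115000.0, 1255000.0, 1400000.0]

    def storage_from_elevation(e):
        """Linearly interpolate storage for an elevation within the table range."""
        if e <= elev_ft[0]:
            return stor_acft[0]
        if e >= elev_ft[-1]:
            return stor_acft[-1]
        i = bisect_left(elev_ft, e)
        frac = (e - elev_ft[i - 1]) / (elev_ft[i] - elev_ft[i - 1])
        return stor_acft[i - 1] + frac * (stor_acft[i] - stor_acft[i - 1])

    print(storage_from_elevation(746.25))  # 1150000.0 for this made-up table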

Task 4. Compute Net Inflow

The DSS file now contains data sets for each of the terms needed to solve the equation for hourly net inflow.

  1. Clear any time window that may be set in HEC-DSSVue, then select the data sets named
    /LAKE WINNEBAGO/SMOOTH MEAN LAKE/STOR//1HOUR/COE-CST/
    /FOX RIVER/LUTZ PARK /FLOW-RES-OUT//1HOUR/USGS-CST-REV/
    Then launch the Math Functions module.
  2. Compute the change in reservoir storage for each hour. 

    Question 2. What function did you use?

  3. Tabulate the results to see that the current data set consists of the storage change at the top of each hour.
  4. Now convert the storage changes from acre-ft/hour to ft³/s by using the ‘Set Units’ operator under the General tab. Enter the new units and a conversion multiplier of 12.1. (Multiply acre-ft by 43560 to convert to ft³/hour, then divide by 3600 to convert ft³/hour to ft³/s.  This conversion is also sketched after this task's steps.)
  5. You can also convert the units using multiply and divide under the Arithmetic tab.

    1. Note that the units are still listed as ‘ACRE-FT’ when this method is used. 


      Question 3. Why?

      We did not tell DSSVue to change the units, as we did with the ‘Set Units’ operator.

  6. Tabulate the results to see that the current data set consists of the storage change expressed as an hourly average flow rate. Save the result with a C-part of ‘STOR-CHANGE’.
  7. Add the outflow to your storage changes.  If you used only Arithmetic functions to perform the calculations, a message box will warn you of the apparent unit mismatch.  Click ‘Yes’ to proceed with the calculation.
  8. Tabulate the results. 
    1. The values should be positive and negative, with magnitudes in the tens of thousands.
    2. Save this data set with a B-part of ‘LAKE WINNEBAGO’
    3. C-part of ‘FLOW-RES IN’.
  9. Exit the Math Functions screen.
  10. Refresh the catalog, and tabulate the new inflow record. 
  11. Enable the ‘Allow Editing’ option under the tabulation’s Edit menu. 
  12. Change the type from ‘INST-VAL’ to ‘PER-AVER’, and the units from ‘ACRE-FT’ to ‘CFS’ if the arithmetic functions were used earlier.
  13. Tabulate and plot the net inflow data to verify the numbers and units.
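
For a final check, the conversion factor from step 4 and the addition from step 7 can be reproduced by hand.  The sketch below uses made-up hourly storage changes and outflows; since 1 acre is 43,560 square feet, 1 acre-ft/hour equals 43,560 ft³ per 3,600 s, or 12.1 cfs.

    # Illustrative check of the acre-ft/hour to cfs conversion and the net-inflow sum.
    ACRE_FT_PER_HR_TO_CFS = 43560.0 / 3600.0    # = 12.1 exactly
    print(ACRE_FT_PER_HR_TO_CFS)

    storage_change_acft_per_hr = [-2000.0, 1500.0, 3000.0]   # made-up dS values
    lutz_park_outflow_cfs      = [4200.0, 4150.0, 4100.0]    # made-up outflows

    net_inflow_cfs = [ds * ACRE_FT_PER_HR_TO_CFS + q
                      for ds, q in zip(storage_change_acft_per_hr, lutz_park_outflow_cfs)]
    print(net_inflow_cfs)  # [-20000.0, 22300.0, 40400.0] -- magnitudes in the tens of thousands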