
Negative Volume for Tabular Storage Curves #245

Open
kmmacro opened this issue Jun 25, 2019 · 10 comments
kmmacro commented Jun 25, 2019

When the depth at a storage node exceeds the maximum depth on its tabular storage curve, a negative volume is calculated because of these lines in table_getArea() (the case where y2 = 0 and y1 > 0):
https://github.com/OpenWaterAnalytics/Stormwater-Management-Model/blob/3b6d7a3ccdf79de72f2f5abce8a5cef0863dd34f/src/table.c#L617

https://github.com/OpenWaterAnalytics/Stormwater-Management-Model/blob/3b6d7a3ccdf79de72f2f5abce8a5cef0863dd34f/src/table.c#L629-L633

This results in storage curves that don't make physical sense (max storage depth = 8):
[Plot: computed storage volume vs. depth goes negative once depth exceeds the curve's maximum of 8]

The model used to generate this plot is attached.
SWMMstoragebug.zip

@dickinsonre

This might be accounted for elsewhere in the SWMM C code by the use of FullDepth. That is, perhaps the equation generates a negative value but it is never used. Thanks for the sample file.

@bemcdonnell

@dickinsonre, these results find their way into the binary output file. I think it's fair to say that, whether or not the engine is routing correctly, the output presented to the user is incorrect. Does that make sense?

@DJHostetler

@kmmacro, you bring up a couple of good points about the storage curve code. Yes, the negative-area extrapolation is an issue, but using any kind of extrapolation at all seems like an unexpected approach to most modelers.

In my experience, modeling of storage nodes is one of the biggest sources of errors for modelers who do not have a thorough understanding of the SWMM model. Often there is a mismatch between the storage node's max depth and the depth-area curve's max depth, which can introduce many unseen errors. Another issue I often see is storage nodes that are not defined to a high enough depth and so reach max depth during large storm events. That arbitrarily dampens the HGL and does not accurately reflect site conditions.

My non-expert suggestion would be to either use the last tabular area defined in the table and not extrapolate at all, or throw an error or warning if the storage node's table depth is exceeded. HEC-HMS throws an error message if you exceed the maximum defined stage-area depth. I would also remove the storage node's max depth property, as it creates two sources of truth between the stage-area table and the node itself, but that is probably asking for too big a change.

@dickinsonre

The storage node maximum depth is needed for the functional form of the storage equation. SWMM 5.1.013 also has a storage node surcharge depth that makes the node behave more like a default junction or manhole. If a storage node floods, you can have a "hat" on top of the node to simulate flooding; the hat area approximates the working of the ponded area in SWMM 5.

Many users (especially wet-well users) do not define a deep enough table, so an error or warning message about the curve not reaching the user-input rim elevation is important.

@LRossman

The negative volume for extrapolated storage curves will only occur for storage unit shapes that are tapered at the top, like the closed storage unit in the example provided. A simple fix in the table.c code in the table_getArea() function is:

    // --- extrapolate area if table limit exceeded
    if ( dx > 0.0 ) s = dy/dx;
    else s = 0.0;
    dx = x - x1;
    if (s < 0.0 && dx > -y1 / s) dx = -y1 / s;   // NEW LINE HERE //
    return a + y1*dx + s*dx*dx/2.0;

After this fix the storage unit behaves as follows:
[Plot: storage volume now levels off at the curve's maximum depth instead of going negative]

Regarding which max. depth should govern, the one for the curve or the one for the storage node: since closed shapes are allowed and it makes no physical sense to extrapolate them, it seems that the curve's max. depth should override the node's max. depth. That change would require more work to implement.

@dickinsonre

I agree with this, but do not forget there have been almost 1 1/2 decades of SWMM5 models (thanks Lew!), so any change should be optional for backward compatibility; or a ML SWMM5.

"it seems that since closed shapes are allowed and it makes no physical sense to extrapolate them, that the curve's max. depth should override the node's max. depth. "

@bemcdonnell

@dickinsonre, I have an opinion about backward compatibility. I think we should be considerate of what is actually backward compatible instead of allowing complete backward compatibility of everything. Maintaining old features and old (erroneous in this case) hydraulic behavior takes a huge effort and gives the user so many options they don’t know what to do. Instead of backward compatibility, what are your thoughts on “porting guides”? A simple document that says how to make your model behave the old way when using the new engine. No extra code required. I like the simplicity of SWMM versus other software.

@dickinsonre

Maybe. However, we have had many issues in the past with changes in the functioning of storage nodes across various versions of SWMM5. It is hard to explain to a customer that changes in the code made their previously calibrated model work differently. To address this, I think I should build a test file suite that exercises all versions of SWMM 5.1 for the various storage node options. Since I used to work at CDM, this comparison should go back to SWMM 5.0.013, as they still use it for Alcosan.

Simplicity is important, so this matters: "A simple document that says how to make your model behave the old way when using the new engine", if it is possible.

@bemcdonnell

@dickinsonre, I hear what you are saying, and I too have experienced customers frustrated on this subject. When I deliver a model to a client, I typically deliver it with specific instructions on the SWMM version. Furthermore, if a new version of SWMM is released and they call to tell me the results don't match, I typically say that their model was calibrated to a specific version of SWMM.

I think it would be valuable to use this particular issue as an example of how we might address backward compatibility. I'm not suggesting we discuss in detail how the code could be implemented; rather, I'm interested in hearing how we manage the user experience. This could involve adding switches to the Options section. In the eyes of a modeler using the software, how would they see the option to use the "new" vs. the "old" behavior?

@bemcdonnell

@dickinsonre, interestingly, we are discussing whether this issue topic is a bug-fix or a new feature! :-)
