mismatching array sizes when reading element_solid_history_variables
#63
Comments
What is a ptf file?
It's similar to
I created a pull request for this issue (I actually noticed this issue after creating the pull request, since I ran into the same problem with my d3plot files). You can check the solution here: The problem is in the `_read_states_solid` function in `d3plot.py`, where `i_solid_vars` already accounts for the `element_solid_history_variables`, so it is wrong to increment it again. The solid true strain data (`element_solid_plastic_strain_tensor`, `element_solid_thermal_strain_tensor` and `element_solid_strain`) is contained in `element_solid_history_variables`, if present. When `element_solid_strain` is requested, it sits at the end of the solid element data, and therefore also at the end of `element_solid_history_variables`. The history variables contain the strain data in the following order: `[element_solid_plastic_strain_tensor]`, `[element_solid_thermal_strain_tensor]`, `[element_solid_strain]`.
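The layout described above can be sketched with NumPy. The sizes here are made up for illustration (they are not from the reporter's file): assume 18 history variables per integration point, with the three 6-component strain tensors stored at the end in the stated order.

```python
import numpy as np

# Hypothetical sizes: (states, solids, integration points, history vars).
n_states, n_solids, n_ip, n_hist = 2, 4, 1, 18

history = np.arange(n_states * n_solids * n_ip * n_hist, dtype=float)
history = history.reshape(n_states, n_solids, n_ip, n_hist)

# Slice the strain tensors off the end of the history variables,
# in the order plastic, thermal, total.
plastic_strain = history[..., -18:-12]  # element_solid_plastic_strain_tensor
thermal_strain = history[..., -12:-6]   # element_solid_thermal_strain_tensor
total_strain = history[..., -6:]        # element_solid_strain

print(plastic_strain.shape, thermal_strain.shape, total_strain.shape)
```

Since `element_solid_strain` occupies the last six slots, a reader that advances its variable offset past the history variables a second time ends up pointing beyond this data.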
🐛 Describe the bug

I am trying to read `element_solid_history_variables` from my *.ptf files, but when I'm calling `D3plot()` with `state_array_filter` or `state_filter`, I get an error.

🔢 To Reproduce

```
ValueError: could not broadcast input array from shape (522,5913,1,11) into shape (522,5913,1,5)
```
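The error is NumPy refusing to assign an array into a buffer whose last axis is too small. A minimal sketch with small stand-in shapes (the report has `(522, 5913, 1, 11)` vs `(522, 5913, 1, 5)`; the sizes below are chosen only to keep the example small):

```python
import numpy as np

# Stand-in shapes: (states, solids, integration points, variables).
# The preallocated buffer has 5 variables on the last axis, while the
# data read from the file has 11 - the situation that arises when the
# variable offset is advanced past the history variables twice.
buffer = np.empty((2, 3, 1, 5))
data = np.zeros((2, 3, 1, 11))

try:
    buffer[:] = data
except ValueError as err:
    print(err)  # could not broadcast input array from shape (2,3,1,11) ...
```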
💘 Expected behavior

The function call completes without raising an error.
🖥️ Setup
ℹ️ Additional context
My current workaround is to call `D3plot()` without arguments. I can then read out the array as expected, but the issue is that this doesn't work with the rest of my infrastructure, where I can only read a set of timesteps at a time.