VCS and template object clean up to aid iteration speed and memory footprint #242

Closed
durack1 opened this issue Aug 30, 2017 · 45 comments

@durack1 (Member) commented Aug 30, 2017

There have been a number of issues describing code slowdowns and considerable memory growth when generating PNGs in loops. Some of the problems are highlighted in these issues:
#241
#237
#236
#235
PCMDI/amipbcs#10
CDAT/cdat#1424
CDAT/cdat#1397

It appears the primary problem is that vcs and EzTemplate objects are being created and never deleted; over ~8000 iterations this can lead to a huge memory footprint (~1 -> 100 GB) and a very marked slowdown (~3 to 2000 s) in the time taken to generate a single 3-panel PNG output file.

There are a number of ways to resolve this, and I believe the vcs.removeobject function should be augmented to give the user the ability to purge all vcs (and EzTemplate, or all used graphic) objects in the current session, via a command like:

vcs.removeobject('all',force=True)

This example function would purge ALL objects, even if there are current handles to them (the force=True option).

It would be useful to think about other use cases and incorporate these into a single function, rather than having multiple functions that do variants of the same thing. Oh yes, and documentation of this functionality will be key.
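
To make the idea concrete, here is a rough sketch of what such a purge could do under the hood (hypothetical only: removeobject currently takes a single object, and the helper name and default-handling below are invented):

import vcs

def remove_all_session_objects(force=False):
    # hypothetical helper: purge every non-default object registered in the
    # current vcs session (graphic methods, templates, text objects, ...)
    for obj_type, registry in vcs.elements.items():
        for name in list(registry.keys()):
            if name.startswith("default"):
                continue  # never touch the shipped defaults
            # a real implementation would skip objects still in use by a
            # canvas unless force=True
            del registry[name]

# the call proposed above would then map onto something like:
# vcs.removeobject('all', force=True)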

@lee1043 @doutriaux1 @danlipsa pinging you all here

@durack1 durack1 added this to the 3.0 milestone Aug 30, 2017
@doutriaux1 (Contributor) commented:

@durack1 this is misleading,

  • All objects created by vcs are now being cleaned up. User-generated objects are NOT and SHOULD NOT be deleted by anyone but the user. Poor memory management by the user cannot and should not be blamed on the software. There is currently only one known case of memory growth, which occurs when using vtk built against mesa. According to @danlipsa, calling canvas.close() after a canvas.clear() fixes the issue; switching to OpenGL2 (expected for the 3.0 release) also solves it.
  • Your proposed solution will NOT work, as vcs.removeobject removes ONE object. Agreed that we need to add code to make sure an object is not in use by any canvas before deleting it.
    remove("all") is not a viable solution as it's confusing: are you trying to remove a graphic method named "all"? What about "all_boxfill", etc.? The proper way to handle what you suggested is the new function proposed in #241 (Adding reset function for @durack1 WITH inline documentation), which we named vcs.reset(). Probably vcs.clear() would be more appropriate; we can discuss the best possible name at length.
  • Finally, once we fix removeobject to NOT delete objects in use, we should not implement a force=True option, but rather have removeobject return a clear error indicating all displays and objects preventing the deletion of the requested object, so that the user can act accordingly (a minimal sketch of this follows below).
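
A minimal sketch of that last point, using plain data structures rather than the real canvas internals (the display bookkeeping here is invented, purely to illustrate the proposed behaviour):

class RemovalBlockedError(Exception):
    # raised instead of force-deleting; reports what still holds the object
    def __init__(self, name, holders):
        super().__init__("Cannot remove '%s': still in use by %s" % (name, sorted(holders)))
        self.holders = holders

def remove_object(registry, in_use_by, name):
    # registry: name -> object; in_use_by: name -> set of display names
    holders = in_use_by.get(name, set())
    if holders:
        raise RemovalBlockedError(name, holders)  # tell the user exactly what blocks the removal
    del registry[name]

# example: a template still referenced by a display plot cannot be removed
registry = {"my_template": object()}
in_use_by = {"my_template": {"dpy_plot_1"}}
try:
    remove_object(registry, in_use_by, "my_template")
except RemovalBlockedError as err:
    print(err)  # the user can now act accordingly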

@durack1 (Member Author) commented Aug 30, 2017

@doutriaux1 feel free to edit the top #242 (comment) above; I was just trying to get down succinctly what the problem is and what I thought was a reasonable solution. My aim is to reduce the repetitive object-deletion tweaking (see below) and the other kludges, to make the software more user friendly.

FYI, my code includes the following:

vcs.removeobject(iso1)
vcs.removeobject(iso2)
vcs.removeobject(title)
vcs.removeobject(t1)
vcs.removeobject(t2)
vcs.removeobject(t3)
vcs.removeobject(tmpl)
...
gc.collect() ; # Attempt to force a memory flush
x.backend.renWin = None ; # @danlipsa fix UV-CDAT/vcs#237
x.close()

I am issuing an x.close() at each loop iteration, but the objects associated with the vcs canvas x (x = vcs.init()) were not being purged, so the kludges above are currently required with the 2.10-19-gba336f2 nightly dated 22nd August.

I would also note that a user cannot be expected to know to issue an x.backend.renWin = None command, so there is still some cleaning up left to do.

@doutriaux1 doutriaux1 modified the milestones: 3.0, Next Release Mar 29, 2018
@doutriaux1 doutriaux1 modified the milestones: 8.1, 8.2 Mar 27, 2019
@durack1 (Member Author) commented Nov 25, 2019

@downiec @jasonb5 @gabdulla this is the bug that I was talking about this morning

@durack1 (Member Author) commented Nov 25, 2019

@downiec sounds good, I am also onsite tomorrow and this would certainly benefit from a screen showing. The script diagnoses the issue really well (plot timing starts at 3 s and blows out to ~900+ s after ~1000 loop steps), so it should be relatively easy to ascertain what the issue is and squash this long-standing bug once and for all.

@durack1 (Member Author) commented Nov 26, 2019

@downiec that demo file for testing is at https://github.com/PCMDI/amipbcs/pull/22/files

@durack1 (Member Author) commented Dec 3, 2019

@forsyth2 @muryanto1 this was the VCS issue that I was having problems with

@downiec ping

@danlipsa (Contributor) commented Dec 6, 2019

@durack1 Did you try your test with the vcs nightly?

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa the test was with cdat/label/v82 which pulled down

vcs-8.2-py_2.json
vcsaddons-8.2-py37h516909a_1.json
vtk-cdat-8.2.0.8.2-py37_mesalibhead77ed_0.json

Are there other versions that have some changes I should be looking for?

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa I just left my code running for weeks and on the 1858th loop, it's now taking 2027 seconds to plot and save a 500Kb PNG.

@danlipsa (Contributor) commented Dec 6, 2019

Wow! How long does it take for the first loop? Does the memory increase as well? How much?

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa 3 secs; the memory footprint goes from 1.3 -> 2.3 GB, so that is a bug that is solved, but there is still a MAJOR issue. If you take a peek at #242 (comment) and the issues linked in this issue, you'll see some of the logs.

The latest run looks like this (I only have a couple of timesteps; I didn't pipe the terminal output to a file):

...
01809 processing: 1872-09 sic    bc Time: 2084.169 secs; Max mem: 2.377 GB PyObj#: 0534798;
01810 processing: 1872-10 sic    bc Time: 2033.407 secs; Max mem: 2.377 GB PyObj#: 0534955;
01811 processing: 1872-11 sic    bc Time: 2027.456 secs; Max mem: 2.377 GB PyObj#: 0535192;
01812 processing: 1872-12 sic    bc Time: 2043.189 secs; Max mem: 2.377 GB PyObj#: 0535389;
01813 processing: 1873-01 sic    bc Time: 2082.316 secs; Max mem: 2.377 GB PyObj#: 0535586;
01814 processing: 1873-02 sic    bc Time: 1993.879 secs; Max mem: 2.377 GB PyObj#: 0535783;
01815 processing: 1873-03 sic    bc Time: 1980.209 secs; Max mem: 2.377 GB PyObj#: 0535980;
01816 processing: 1873-04 sic    bc Time: 1993.473 secs; Max mem: 2.377 GB PyObj#: 0536177;
01817 processing: 1873-05 sic    bc Time: 2057.221 secs; Max mem: 2.377 GB PyObj#: 0536374;
01818 processing: 1873-06 sic    bc Time: 2165.160 secs; Max mem: 2.377 GB PyObj#: 0536571;
01819 processing: 1873-07 sic    bc Time: 2012.586 secs; Max mem: 2.377 GB PyObj#: 0536728;
01820 processing: 1873-08 sic    bc Time: 2027.061 secs; Max mem: 2.377 GB PyObj#: 0536965;
01821 processing: 1873-09 sic    bc Time: 2048.570 secs; Max mem: 2.377 GB PyObj#: 0537122;
01822 processing: 1873-10 sic    bc Time: 2007.304 secs; Max mem: 2.377 GB PyObj#: 0537319;
01823 processing: 1873-11 sic    bc Time: 2047.055 secs; Max mem: 2.377 GB PyObj#: 0537516;
01824 processing: 1873-12 sic    bc Time: 2035.291 secs; Max mem: 2.377 GB PyObj#: 0537753;
01825 processing: 1874-01 sic    bc Time: 2049.639 secs; Max mem: 2.377 GB PyObj#: 0537950;
01826 processing: 1874-02 sic    bc Time: 2011.265 secs; Max mem: 2.377 GB PyObj#: 0538147;
01827 processing: 1874-03 sic    bc Time: 2002.291 secs; Max mem: 2.377 GB PyObj#: 0538304;
01828 processing: 1874-04 sic    bc Time: 2031.438 secs; Max mem: 2.377 GB PyObj#: 0538541;
01829 processing: 1874-05 sic    bc Time: 1998.815 secs; Max mem: 2.377 GB PyObj#: 0538738;
01830 processing: 1874-06 sic    bc Time: 1997.119 secs; Max mem: 2.377 GB PyObj#: 0538935;
01831 processing: 1874-07 sic    bc Time: 2045.043 secs; Max mem: 2.377 GB PyObj#: 0539132;
01832 processing: 1874-08 sic    bc Time: 2083.603 secs; Max mem: 2.377 GB PyObj#: 0539329;
01833 processing: 1874-09 sic    bc Time: 2040.905 secs; Max mem: 2.377 GB PyObj#: 0539526;
01834 processing: 1874-10 sic    bc Time: 2014.244 secs; Max mem: 2.377 GB PyObj#: 0539723;
01835 processing: 1874-11 sic    bc Time: 2017.730 secs; Max mem: 2.377 GB PyObj#: 0539920;
01836 processing: 1874-12 sic    bc Time: 2004.599 secs; Max mem: 2.377 GB PyObj#: 0540117;
01837 processing: 1875-01 sic    bc Time: 2014.951 secs; Max mem: 2.377 GB PyObj#: 0540314;
01838 processing: 1875-02 sic    bc Time: 1999.243 secs; Max mem: 2.377 GB PyObj#: 0540511;
01839 processing: 1875-03 sic    bc Time: 2095.909 secs; Max mem: 2.377 GB PyObj#: 0540668;
01840 processing: 1875-04 sic    bc Time: 2083.434 secs; Max mem: 2.377 GB PyObj#: 0540905;
01841 processing: 1875-05 sic    bc Time: 2037.010 secs; Max mem: 2.377 GB PyObj#: 0541102;
01842 processing: 1875-06 sic    bc Time: 2042.832 secs; Max mem: 2.377 GB PyObj#: 0541259;
01843 processing: 1875-07 sic    bc Time: 2024.613 secs; Max mem: 2.377 GB PyObj#: 0541496;
01844 processing: 1875-08 sic    bc Time: 2029.772 secs; Max mem: 2.377 GB PyObj#: 0541693;
01845 processing: 1875-09 sic    bc Time: 2028.806 secs; Max mem: 2.377 GB PyObj#: 0541890;
01846 processing: 1875-10 sic    bc Time: 2069.256 secs; Max mem: 2.377 GB PyObj#: 0542087;
01847 processing: 1875-11 sic    bc Time: 2020.442 secs; Max mem: 2.377 GB PyObj#: 0542284;
01848 processing: 1875-12 sic    bc Time: 2044.194 secs; Max mem: 2.377 GB PyObj#: 0542481;
01849 processing: 1876-01 sic    bc Time: 2045.896 secs; Max mem: 2.377 GB PyObj#: 0542678;
01850 processing: 1876-02 sic    bc Time: 2048.841 secs; Max mem: 2.377 GB PyObj#: 0542875;
01851 processing: 1876-03 sic    bc Time: 2035.946 secs; Max mem: 2.377 GB PyObj#: 0543072;
01852 processing: 1876-04 sic    bc Time: 2050.273 secs; Max mem: 2.377 GB PyObj#: 0543269;
01853 processing: 1876-05 sic    bc Time: 2028.228 secs; Max mem: 2.377 GB PyObj#: 0543466;
01854 processing: 1876-06 sic    bc Time: 2012.156 secs; Max mem: 2.377 GB PyObj#: 0543663;
01855 processing: 1876-07 sic    bc Time: 2029.790 secs; Max mem: 2.377 GB PyObj#: 0543860;
01856 processing: 1876-08 sic    bc Time: 2011.127 secs; Max mem: 2.377 GB PyObj#: 0544057;
01857 processing: 1876-09 sic    bc Time: 2022.354 secs; Max mem: 2.377 GB PyObj#: 0544254;
01858 processing: 1876-10 sic    bc Time: 2027.942 secs; Max mem: 2.377 GB PyObj#: 0544451;
01859 processing: 1876-11 sic    bc Time: 2058.447 secs; Max mem: 2.377 GB PyObj#: 0544648;
01860 processing: 1876-12 sic    bc Time: 2222.158 secs; Max mem: 2.377 GB PyObj#: 0544805;
01861 processing: 1877-01 sic    bc Time: 2042.727 secs; Max mem: 2.377 GB PyObj#: 0545002;
01862 processing: 1877-02 sic    bc Time: 2038.007 secs; Max mem: 2.377 GB PyObj#: 0545239;
01863 processing: 1877-03 sic    bc Time: 2043.589 secs; Max mem: 2.377 GB PyObj#: 0545436;
01864 processing: 1877-04 sic    bc Time: 2045.696 secs; Max mem: 2.377 GB PyObj#: 0545633;
01865 processing: 1877-05 sic    bc Time: 2055.333 secs; Max mem: 2.377 GB PyObj#: 0545830;
01866 processing: 1877-06 sic    bc Time: 2038.993 secs; Max mem: 2.377 GB PyObj#: 0546027;
01867 processing: 1877-07 sic    bc Time: 2047.401 secs; Max mem: 2.377 GB PyObj#: 0546224;
01868 processing: 1877-08 sic    bc Time: 2052.641 secs; Max mem: 2.377 GB PyObj#: 0546421;
01869 processing: 1877-09 sic    bc Time: 2040.124 secs; Max mem: 2.377 GB PyObj#: 0546618;
01870 processing: 1877-10 sic    bc Time: 2070.017 secs; Max mem: 2.377 GB PyObj#: 0546815;
01871 processing: 1877-11 sic    bc Time: 2068.008 secs; Max mem: 2.377 GB PyObj#: 0547012;
01872 processing: 1877-12 sic    bc Time: 2049.180 secs; Max mem: 2.377 GB PyObj#: 0547209;
01873 processing: 1878-01 sic    bc Time: 2060.046 secs; Max mem: 2.377 GB PyObj#: 0547406;
01874 processing: 1878-02 sic    bc Time: 2050.727 secs; Max mem: 2.377 GB PyObj#: 0547603;
01875 processing: 1878-03 sic    bc Time: 2073.048 secs; Max mem: 2.377 GB PyObj#: 0547800;
01876 processing: 1878-04 sic    bc Time: 2068.090 secs; Max mem: 2.377 GB PyObj#: 0547997;
01877 processing: 1878-05 sic    bc Time: 2073.349 secs; Max mem: 2.377 GB PyObj#: 0548194;

The file being used here is found at https://github.com/PCMDI/amipbcs/pull/22/files#diff-db7ce91f4ea32772ae1ab4ad8df7e998; it depends on data available via the LLNL /p/ mounts. If you have access to an LLNL machine, you should be able to reproduce this easily.
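
For reference, the per-iteration numbers in the log above come from instrumentation along these lines (a sketch; the real script's counters may differ slightly):

import gc
import resource
import time

def loop_stats(start_time):
    # elapsed wall time, peak RSS and live Python object count for one iteration
    elapsed = time.time() - start_time
    max_rss_gb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024. ** 2  # Linux reports KB
    n_objects = len(gc.get_objects())
    return "Time: %08.3f secs; Max mem: %5.3f GB PyObj#: %07d;" % (elapsed, max_rss_gb, n_objects)

# inside the plotting loop:
# t0 = time.time()
# ... create templates/graphic methods, plot, write the PNG ...
# gc.collect()  # attempt to force a flush; it does not help while handles persist
# print(counter, loop_stats(t0))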

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa if you don't have access, I can hang the two (or four) input netCDF files off the LLNL FTP server.

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa my guesswork here is that the VCS-VTK implementation needs a tweak: Python objects just continue to accumulate and their handles remain intact, as gc.collect() doesn't flush these, and the continuing growth of objects leads to a larger and larger overhead and a HUGE slowdown.

@danlipsa (Contributor) commented Dec 6, 2019

@durack1 If you could put the netcdf files on an ftp server that would be great. Thanks.

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa there should be a bz2 archive at ftp://ftp.llnl.gov/incoming/191206_VCS82Bug/ that contains everything you need to reproduce the bug. Let me know if you have any trouble accessing this.

Just extract the archive, and run make_newVsOldDiffs_test.py in a valid CDAT/VCS environment and you should see time/object/memory growth within a couple of iterations.

ping @downiec @muryanto1 @gabdulla @forsyth2

@danlipsa (Contributor) commented Dec 6, 2019

@durack1 I get this:

This site can’t be reached
The webpage at ftp://ftp.llnl.gov/incoming/191206_VCS82Bug/ might be temporarily down or it may have moved permanently to a new web address.
ERR_FTP_FAILED

@durack1 (Member Author) commented Dec 6, 2019

It's a bit fiddly; rather, try browsing to ftp://ftp.llnl.gov/incoming and then see where you can go next. It works for me in Firefox.

[screenshots: Firefox directory listings of ftp://ftp.llnl.gov/incoming and the 191206_VCS82Bug subdirectory]

@danlipsa (Contributor) commented Dec 6, 2019

I don't think it's that; maybe I cannot access this from outside.

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa I am outside; try connecting to ftp.llnl.gov first (user: anonymous; password: email address), then cd incoming, then cd 191206_VCS82Bug, and then get 191206_durack1_VCS82Bug.bz2.

The configuration of the system is pretty weird.

@durack1 (Member Author) commented Dec 6, 2019

@danlipsa any more luck?

@danlipsa (Contributor) commented Dec 7, 2019

OK, I am getting it right now. I am using an FTP client (FileZilla) to be able to get this. Maybe the issue is that the setup does not allow listing a directory, only cd'ing into it and getting a file.

@durack1 (Member Author) commented Dec 7, 2019

@danlipsa yeah, I have raised queries about the config a number of times; they claim it's set up for heightened security, but in reality it rather limits the utility of having such a service when standard commands etc. are blocked.

Let me know if you have any problem running the example script. I think the demo should work as simply as extracting the archive to an empty directory, loading up a CDAT conda env and running the script.

@durack1 (Member Author) commented Dec 7, 2019

@danlipsa any luck?

And just to be complete, this is what the script is currently up to:

...
01911 processing: 1881-03 sic    bc Time: 2156.521 secs; Max mem: 2.377 GB PyObj#: 0554892;
01912 processing: 1881-04 sic    bc Time: 2156.445 secs; Max mem: 2.377 GB PyObj#: 0555089;
01913 processing: 1881-05 sic    bc Time: 2183.604 secs; Max mem: 2.377 GB PyObj#: 0555286;
01914 processing: 1881-06 sic    bc Time: 2182.729 secs; Max mem: 2.377 GB PyObj#: 0555483;
01915 processing: 1881-07 sic    bc Time: 2177.963 secs; Max mem: 2.377 GB PyObj#: 0555680;
01916 processing: 1881-08 sic    bc Time: 2155.600 secs; Max mem: 2.377 GB PyObj#: 0555877;
01917 processing: 1881-09 sic    bc Time: 2197.829 secs; Max mem: 2.377 GB PyObj#: 0556074;
01918 processing: 1881-10 sic    bc Time: 2234.277 secs; Max mem: 2.377 GB PyObj#: 0556271;
01919 processing: 1881-11 sic    bc Time: 2204.871 secs; Max mem: 2.377 GB PyObj#: 0556468;
01920 processing: 1881-12 sic    bc Time: 2213.772 secs; Max mem: 2.377 GB PyObj#: 0556665;
01921 processing: 1882-01 sic    bc Time: 2200.148 secs; Max mem: 2.377 GB PyObj#: 0556862;
01922 processing: 1882-02 sic    bc Time: 2185.507 secs; Max mem: 2.377 GB PyObj#: 0557059;
01923 processing: 1882-03 sic    bc Time: 2188.674 secs; Max mem: 2.377 GB PyObj#: 0557256;

@danlipsa (Contributor) commented Dec 7, 2019

@durack1 Yes, I got the data. Thanks! I'll look at this as soon as I can. It may not be before the end of the year though as I am busy with two other projects.

@durack1 (Member Author) commented Dec 7, 2019

@danlipsa no problem. Just to make sure that you can reproduce this issue (as in a couple of months I will likely have forgotten the details): can you reproduce my experience over the first 100 steps of the loop, where the time increases from ~1.3 to 7.4 s and the Python object count increases from 178534 to 197982?

(base) bash-4.2$ conda activate cdat82MesaPy3
(cdat82MesaPy3) bash-4.2$ date
Sat Dec  7 15:33:39 PST 2019
(cdat82MesaPy3) bash-4.2$ python make_newVsOldDiffs_test.py
Allow anonymous logging usage to help improve CDAT(you can also set the environment variable CDAT_ANONYMOUS_LOG to yes or no)? [yes]/no: no
~/anaconda3/envs/cdat82MesaPy3/lib/python3.7/site-packages/unidata/__init__.py:2: UserWarning: unidata package is deprecated please use genutil.udunits instead of unidata.udunits
  warnings.warn("unidata package is deprecated please use genutil.udunits instead of unidata.udunits")
~/tmp make_newVsOldDiffs_test.py
~/tmp
tosList: tos_input4MIPs_SSTsAndSeaIce_CMIP_PCMDI-AMIP-1-1-6_gn_187001-201812.nc
sicbcList2: siconcbcs_input4MIPs_SSTsAndSeaIce_CMIP_PCMDI-AMIP-1-1-5_gn_187001-201806.nc
newDir: ./pngs/pngs_v1.1.6/sic
UV-CDAT version:      8.1
UV-CDAT prefix:      ~/anaconda3/envs/cdat82MesaPy3
delFudge:             True
Background graphics:  True
donotstoredisplay:    True
00001 processing: 1870-01 sic     Time: 01.326 secs; Max mem: 1.764 GB PyObj#: 0178534;
00002 processing: 1870-02 sic     Time: 01.137 secs; Max mem: 1.764 GB PyObj#: 0178728;
00003 processing: 1870-03 sic     Time: 01.151 secs; Max mem: 1.764 GB PyObj#: 0178925;
00004 processing: 1870-04 sic     Time: 01.158 secs; Max mem: 1.764 GB PyObj#: 0179122;
00005 processing: 1870-05 sic     Time: 01.170 secs; Max mem: 1.764 GB PyObj#: 0179311;
00006 processing: 1870-06 sic     Time: 01.221 secs; Max mem: 1.764 GB PyObj#: 0179508;
00007 processing: 1870-07 sic     Time: 01.200 secs; Max mem: 1.764 GB PyObj#: 0179705;
00008 processing: 1870-08 sic     Time: 01.217 secs; Max mem: 1.764 GB PyObj#: 0179902;
00009 processing: 1870-09 sic     Time: 01.230 secs; Max mem: 1.764 GB PyObj#: 0180099;
00010 processing: 1870-10 sic     Time: 01.249 secs; Max mem: 1.764 GB PyObj#: 0180296;
00011 processing: 1870-11 sic     Time: 01.271 secs; Max mem: 1.764 GB PyObj#: 0180493;
00012 processing: 1870-12 sic     Time: 01.296 secs; Max mem: 1.764 GB PyObj#: 0180650;
00013 processing: 1871-01 sic     Time: 01.319 secs; Max mem: 1.764 GB PyObj#: 0180883;
00014 processing: 1871-02 sic     Time: 01.381 secs; Max mem: 1.764 GB PyObj#: 0181080;
00015 processing: 1871-03 sic     Time: 01.377 secs; Max mem: 1.764 GB PyObj#: 0181277;
00016 processing: 1871-04 sic     Time: 01.385 secs; Max mem: 1.764 GB PyObj#: 0181474;
00017 processing: 1871-05 sic     Time: 01.417 secs; Max mem: 1.764 GB PyObj#: 0181671;
00018 processing: 1871-06 sic     Time: 01.435 secs; Max mem: 1.764 GB PyObj#: 0181868;
00019 processing: 1871-07 sic     Time: 01.467 secs; Max mem: 1.764 GB PyObj#: 0182065;
00020 processing: 1871-08 sic     Time: 01.497 secs; Max mem: 1.764 GB PyObj#: 0182262;
00021 processing: 1871-09 sic     Time: 01.537 secs; Max mem: 1.764 GB PyObj#: 0182459;
00022 processing: 1871-10 sic     Time: 01.556 secs; Max mem: 1.764 GB PyObj#: 0182656;
00023 processing: 1871-11 sic     Time: 01.584 secs; Max mem: 1.764 GB PyObj#: 0182853;
00024 processing: 1871-12 sic     Time: 01.637 secs; Max mem: 1.764 GB PyObj#: 0183050;
00025 processing: 1872-01 sic     Time: 01.663 secs; Max mem: 1.764 GB PyObj#: 0183247;
00026 processing: 1872-02 sic     Time: 01.699 secs; Max mem: 1.764 GB PyObj#: 0183444;
00027 processing: 1872-03 sic     Time: 01.731 secs; Max mem: 1.764 GB PyObj#: 0183641;
00028 processing: 1872-04 sic     Time: 01.759 secs; Max mem: 1.764 GB PyObj#: 0183838;
00029 processing: 1872-05 sic     Time: 01.819 secs; Max mem: 1.764 GB PyObj#: 0184035;
00030 processing: 1872-06 sic     Time: 01.840 secs; Max mem: 1.764 GB PyObj#: 0184232;
00031 processing: 1872-07 sic     Time: 01.876 secs; Max mem: 1.764 GB PyObj#: 0184429;
00032 processing: 1872-08 sic     Time: 01.914 secs; Max mem: 1.764 GB PyObj#: 0184626;
00033 processing: 1872-09 sic     Time: 01.956 secs; Max mem: 1.764 GB PyObj#: 0184823;
00034 processing: 1872-10 sic     Time: 01.999 secs; Max mem: 1.764 GB PyObj#: 0185020;
00035 processing: 1872-11 sic     Time: 02.045 secs; Max mem: 1.764 GB PyObj#: 0185217;
00036 processing: 1872-12 sic     Time: 02.093 secs; Max mem: 1.764 GB PyObj#: 0185414;
00037 processing: 1873-01 sic     Time: 02.158 secs; Max mem: 1.764 GB PyObj#: 0185611;
00038 processing: 1873-02 sic     Time: 02.199 secs; Max mem: 1.764 GB PyObj#: 0185808;
00039 processing: 1873-03 sic     Time: 02.245 secs; Max mem: 1.764 GB PyObj#: 0186005;
00040 processing: 1873-04 sic     Time: 02.282 secs; Max mem: 1.764 GB PyObj#: 0186202;
00041 processing: 1873-05 sic     Time: 02.341 secs; Max mem: 1.764 GB PyObj#: 0186399;
00042 processing: 1873-06 sic     Time: 02.389 secs; Max mem: 1.764 GB PyObj#: 0186596;
00043 processing: 1873-07 sic     Time: 02.438 secs; Max mem: 1.764 GB PyObj#: 0186753;
00044 processing: 1873-08 sic     Time: 02.483 secs; Max mem: 1.764 GB PyObj#: 0186950;
00045 processing: 1873-09 sic     Time: 02.543 secs; Max mem: 1.764 GB PyObj#: 0187187;
00046 processing: 1873-10 sic     Time: 02.594 secs; Max mem: 1.764 GB PyObj#: 0187344;
00047 processing: 1873-11 sic     Time: 02.659 secs; Max mem: 1.764 GB PyObj#: 0187581;
00048 processing: 1873-12 sic     Time: 02.726 secs; Max mem: 1.764 GB PyObj#: 0187778;
00049 processing: 1874-01 sic     Time: 02.795 secs; Max mem: 1.764 GB PyObj#: 0187975;
00050 processing: 1874-02 sic     Time: 02.860 secs; Max mem: 1.764 GB PyObj#: 0188172;
00051 processing: 1874-03 sic     Time: 02.922 secs; Max mem: 1.764 GB PyObj#: 0188329;
00052 processing: 1874-04 sic     Time: 02.989 secs; Max mem: 1.764 GB PyObj#: 0188566;
00053 processing: 1874-05 sic     Time: 03.037 secs; Max mem: 1.764 GB PyObj#: 0188723;
00054 processing: 1874-06 sic     Time: 03.092 secs; Max mem: 1.764 GB PyObj#: 0188960;
00055 processing: 1874-07 sic     Time: 03.158 secs; Max mem: 1.764 GB PyObj#: 0189157;
00056 processing: 1874-08 sic     Time: 03.229 secs; Max mem: 1.764 GB PyObj#: 0189354;
00057 processing: 1874-09 sic     Time: 03.297 secs; Max mem: 1.764 GB PyObj#: 0189551;
00058 processing: 1874-10 sic     Time: 03.372 secs; Max mem: 1.764 GB PyObj#: 0189748;
00059 processing: 1874-11 sic     Time: 03.448 secs; Max mem: 1.764 GB PyObj#: 0189945;
00060 processing: 1874-12 sic     Time: 03.515 secs; Max mem: 1.764 GB PyObj#: 0190142;
00061 processing: 1875-01 sic     Time: 03.588 secs; Max mem: 1.764 GB PyObj#: 0190339;
00062 processing: 1875-02 sic     Time: 03.655 secs; Max mem: 1.764 GB PyObj#: 0190536;
00063 processing: 1875-03 sic     Time: 03.759 secs; Max mem: 1.764 GB PyObj#: 0190733;
00064 processing: 1875-04 sic     Time: 03.824 secs; Max mem: 1.764 GB PyObj#: 0190890;
00065 processing: 1875-05 sic     Time: 03.876 secs; Max mem: 1.764 GB PyObj#: 0191127;
00066 processing: 1875-06 sic     Time: 03.972 secs; Max mem: 1.764 GB PyObj#: 0191324;
00067 processing: 1875-07 sic     Time: 04.060 secs; Max mem: 1.764 GB PyObj#: 0191521;
00068 processing: 1875-08 sic     Time: 04.136 secs; Max mem: 1.764 GB PyObj#: 0191718;
00069 processing: 1875-09 sic     Time: 04.222 secs; Max mem: 1.764 GB PyObj#: 0191915;
00070 processing: 1875-10 sic     Time: 04.299 secs; Max mem: 1.764 GB PyObj#: 0192112;
00071 processing: 1875-11 sic     Time: 04.381 secs; Max mem: 1.764 GB PyObj#: 0192309;
00072 processing: 1875-12 sic     Time: 04.469 secs; Max mem: 1.764 GB PyObj#: 0192506;
00073 processing: 1876-01 sic     Time: 04.566 secs; Max mem: 1.764 GB PyObj#: 0192703;
00074 processing: 1876-02 sic     Time: 04.645 secs; Max mem: 1.764 GB PyObj#: 0192900;
00075 processing: 1876-03 sic     Time: 04.733 secs; Max mem: 1.764 GB PyObj#: 0193097;
00076 processing: 1876-04 sic     Time: 04.824 secs; Max mem: 1.764 GB PyObj#: 0193294;
00077 processing: 1876-05 sic     Time: 04.921 secs; Max mem: 1.764 GB PyObj#: 0193491;
00078 processing: 1876-06 sic     Time: 05.041 secs; Max mem: 1.764 GB PyObj#: 0193688;
00079 processing: 1876-07 sic     Time: 05.126 secs; Max mem: 1.764 GB PyObj#: 0193885;
00080 processing: 1876-08 sic     Time: 05.217 secs; Max mem: 1.764 GB PyObj#: 0194082;
00081 processing: 1876-09 sic     Time: 05.316 secs; Max mem: 1.764 GB PyObj#: 0194279;
00082 processing: 1876-10 sic     Time: 05.427 secs; Max mem: 1.764 GB PyObj#: 0194476;
00083 processing: 1876-11 sic     Time: 05.523 secs; Max mem: 1.764 GB PyObj#: 0194673;
00084 processing: 1876-12 sic     Time: 05.630 secs; Max mem: 1.764 GB PyObj#: 0194870;
00085 processing: 1877-01 sic     Time: 05.744 secs; Max mem: 1.764 GB PyObj#: 0195067;
00086 processing: 1877-02 sic     Time: 05.842 secs; Max mem: 1.764 GB PyObj#: 0195264;
00087 processing: 1877-03 sic     Time: 05.945 secs; Max mem: 1.764 GB PyObj#: 0195461;
00088 processing: 1877-04 sic     Time: 06.051 secs; Max mem: 1.764 GB PyObj#: 0195658;
00089 processing: 1877-05 sic     Time: 06.211 secs; Max mem: 1.764 GB PyObj#: 0195855;
00090 processing: 1877-06 sic     Time: 06.260 secs; Max mem: 1.764 GB PyObj#: 0196052;
00091 processing: 1877-07 sic     Time: 06.357 secs; Max mem: 1.764 GB PyObj#: 0196249;
00092 processing: 1877-08 sic     Time: 06.471 secs; Max mem: 1.764 GB PyObj#: 0196446;
00093 processing: 1877-09 sic     Time: 06.575 secs; Max mem: 1.764 GB PyObj#: 0196643;
00094 processing: 1877-10 sic     Time: 06.688 secs; Max mem: 1.764 GB PyObj#: 0196840;
00095 processing: 1877-11 sic     Time: 06.794 secs; Max mem: 1.764 GB PyObj#: 0197037;
00096 processing: 1877-12 sic     Time: 06.966 secs; Max mem: 1.764 GB PyObj#: 0197234;
00097 processing: 1878-01 sic     Time: 07.081 secs; Max mem: 1.764 GB PyObj#: 0197431;
00098 processing: 1878-02 sic     Time: 07.219 secs; Max mem: 1.764 GB PyObj#: 0197628;
00099 processing: 1878-03 sic     Time: 07.326 secs; Max mem: 1.764 GB PyObj#: 0197825;
00100 processing: 1878-04 sic     Time: 07.431 secs; Max mem: 1.764 GB PyObj#: 0197982;

@durack1 (Member Author) commented Dec 16, 2019

@danlipsa this is great, I've just managed to break the 3000 s limit; next stop 1 hr to generate a ~500 KB PNG:

...
02213 processing: 1906-05 sic    bc Time: 3050.964 secs; Max mem: 2.377 GB PyObj#: 0614386;
02214 processing: 1906-06 sic    bc Time: 3026.980 secs; Max mem: 2.377 GB PyObj#: 0614583;
02215 processing: 1906-07 sic    bc Time: 3015.333 secs; Max mem: 2.377 GB PyObj#: 0614780;
02216 processing: 1906-08 sic    bc Time: 2999.414 secs; Max mem: 2.377 GB PyObj#: 0614977;
02217 processing: 1906-09 sic    bc Time: 3021.320 secs; Max mem: 2.377 GB PyObj#: 0615174;
02218 processing: 1906-10 sic    bc Time: 3024.196 secs; Max mem: 2.377 GB PyObj#: 0615371;
02219 processing: 1906-11 sic    bc Time: 3029.702 secs; Max mem: 2.377 GB PyObj#: 0615568;
02220 processing: 1906-12 sic    bc Time: 2993.783 secs; Max mem: 2.377 GB PyObj#: 0615765;
02221 processing: 1907-01 sic    bc Time: 3015.676 secs; Max mem: 2.377 GB PyObj#: 0615962;
02222 processing: 1907-02 sic    bc Time: 3091.512 secs; Max mem: 2.377 GB PyObj#: 0616159;

@danlipsa (Contributor) commented:

@durack1 Attached is your script, modified so that it does not seem to leak memory and the time to save a PNG stays constant. I only tried the first 100 steps. The rule I followed is: every time there is an x.createXXX there should be a corresponding removeobject. EzTemplate is different: you have to remove all the templates it created, as well as the template passed as a parameter. You were already doing all that, but the code was commented out and you were doing something similar by accessing vcs.elements - I think you missed some of the objects that way. I removed that, and also removed all the del commands as they should not be needed. Take a look and let me know how this works for you.
script.zip
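
In outline, the rule looks like this (a sketch only, not the attached script; the object names just mirror the snippets earlier in this thread):

import vcs

x = vcs.init()

for step in range(12):  # stand-in for the real time loop
    # every x.createXXX() below gets a matching removeobject() before the next iteration
    iso1 = x.createisofill()
    iso2 = x.createisofill()
    tmpl = x.createtemplate()

    # ... x.plot(...) calls and x.png(...) would go here ...

    for obj in (iso1, iso2, tmpl):
        vcs.removeobject(obj)
    # for EzTemplate, the templates it generates -- and the template passed to
    # it -- need the same treatment
    x.clear()

x.close()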

@doutriaux1 (Contributor) commented:

user error 🤣

@durack1 (Member Author) commented Dec 17, 2019

@doutriaux1 I'd suggest that if such issues cause a reasonably advanced user problems, then it's a problem with the software and documentation rather than user error. Defaults should be far cleaner, and self-clearing, rather than accumulating at the rate they do.

@durack1 (Member Author) commented Dec 17, 2019

@danlipsa thanks for this, I’ll test it out today and get back to you.

Is there a way we can clean up the default behavior of EzTemplate so such issues don't recur, i.e. so it cleans up after itself?

@danlipsa (Contributor) commented:

@durack1 I think the issue is that createXXX, as well as EzTemplate, stores the created graphics methods and templates in the global map vcs.elements. I never fully understood why that is the case. Is it only to access those graphics methods later on by their name? That does not seem that compelling to me. @doutriaux1? So I think we can change this behavior, but it is not a backward-compatible change and also a bigger change. The way it is implemented now is consistent, but may be surprising to Python users not accustomed to deleting objects. I do think we have to improve the documentation on this.
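
For anyone following along, the registry behaviour being described looks roughly like this (assuming vcs.elements is the per-type map of named objects referred to above):

import vcs

x = vcs.init()
gm = x.createboxfill('my_boxfill')  # registered globally under its name

# the object is now reachable by name, without holding the Python handle
print('my_boxfill' in vcs.elements['boxfill'])  # True
same_gm = x.getboxfill('my_boxfill')

# ...and this is why it survives until explicitly removed
vcs.removeobject(gm)
print('my_boxfill' in vcs.elements['boxfill'])  # False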

@doutriaux1 (Contributor) commented:

@danlipsa yes, it is so that you can refer to these objects by name (and save them in the init attribute JSON file).

@doutriaux1 (Contributor) commented:

@danlipsa if my fading memory doesn't betray me, I think there's a vcs function to remove all objects not used on any canvas (search for something like removeallobjects, or maybe cleanall, not quite sure).

@doutriaux1 (Contributor) commented:

@danlipsa one should probably implement __del__ on objects to clean the vcs register when users delete objects. I don't think my cleanall function dared to remove objects created by the user; you can't assume they will not be re-used by the user later.
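
A minimal sketch of that idea (illustrative only - vcs objects do not currently do this, and the registry handling is simplified):

import vcs

class AutoCleaningIsofill:
    # hypothetical wrapper: drop the vcs.elements entry when the last Python
    # handle to the graphics method goes away
    def __init__(self, canvas, name):
        self._gm = canvas.createisofill(name)
        self.name = name

    def __getattr__(self, attr):
        return getattr(self._gm, attr)  # delegate everything else to the wrapped isofill

    def __del__(self):
        vcs.elements.get('isofill', {}).pop(self.name, None)

# usage: once the last handle is dropped, the registry entry follows
x = vcs.init()
gm = AutoCleaningIsofill(x, 'my_isofill')
del gm
print('my_isofill' in vcs.elements['isofill'])  # False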

@danlipsa (Contributor) commented:

@doutriaux1 Yes, that is a good idea.

@durack1 (Member Author) commented Dec 17, 2019

@danlipsa @doutriaux1 I would vote for the more logical behavior rather than maintaining backward compatibility; it seems to me that if there are default behaviors that trip a user up, then these should be changed for ease of use and logic, rather than preserved.

On this subject, I was curious about the object use; 180k objects seems extreme to me. The example below shows how many (redundant?) objects are created when vcs is loaded - going from 6k to 183k seems a little over the top:

(cdat82MesaPy3) bash-4.2$ which python
/export/durack1/anaconda3/envs/cdat82MesaPy3/bin/python
(cdat82MesaPy3) bash-4.2$ python
Python 3.7.3 | packaged by conda-forge | (default, Jul  1 2019, 21:52:21) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import gc
>>> print('Python object count:',len(gc.get_objects()))
Python object count: 6040
>>> import vcs
/export/durack1/anaconda3/envs/cdat82MesaPy3/lib/python3.7/site-packages/unidata/__init__.py:2: UserWarning: unidata package is deprecated please use genutil.udunits instead of unidata.udunits
  warnings.warn("unidata package is deprecated please use genutil.udunits instead of unidata.udunits")
>>> print('Python object count:',len(gc.get_objects()))
Python object count: 183097

@durack1 (Member Author) commented Dec 17, 2019

For comparison, I was curious about cdms2 and cdutil (which likely imports cdms2 as one of its defaults); while 59k/65k seems a little large, it's far better than the 183k of vcs:

(cdat82MesaPy3) bash-4.2$ python
Python 3.7.3 | packaged by conda-forge | (default, Jul  1 2019, 21:52:21) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import gc
>>> print('Python object count:',len(gc.get_objects()))
Python object count: 6040
>>> import cdms2
>>> print('Python object count:',len(gc.get_objects()))
Python object count: 59126
>>> 
(cdat82MesaPy3) bash-4.2$ python
Python 3.7.3 | packaged by conda-forge | (default, Jul  1 2019, 21:52:21) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import gc
>>> print('Python object count:',len(gc.get_objects()))
Python object count: 6040
>>> import cdutil
>>> print('Python object count:',len(gc.get_objects()))
Python object count: 65136

@durack1 (Member Author) commented Dec 17, 2019

@danlipsa I also note that there is still the active line
x.backend.renWin = None ; # @danlipsa fix UV-CDAT/vcs#237
and I wonder whether this behavior is also something that should be considered; I haven't tested what the object/resource counts do when this line is commented out.

@durack1 (Member Author) commented Dec 17, 2019

@danlipsa I just tried your script and can report that it solves the problem; here are the first 10 and last 10 timesteps of the first loop:

~/tmp make_newVsOldDiffs_test_dan.py
~/tmp
tosList: tos_input4MIPs_SSTsAndSeaIce_CMIP_PCMDI-AMIP-1-1-6_gn_187001-201812.nc
sicbcList2: siconcbcs_input4MIPs_SSTsAndSeaIce_CMIP_PCMDI-AMIP-1-1-5_gn_187001-201806.nc
UV-CDAT version:      8.1
UV-CDAT prefix:       ~anaconda3/envs/cdat82MesaPy3
Background graphics:  True
donotstoredisplay:    True
00001 processing: 1870-01 sic     Time: 01.408 secs; Max mem: 1.766 GB PyObj#: 0183591;
00002 processing: 1870-02 sic     Time: 01.266 secs; Max mem: 1.766 GB PyObj#: 0183550;
00003 processing: 1870-03 sic     Time: 01.228 secs; Max mem: 1.766 GB PyObj#: 0183590;
00004 processing: 1870-04 sic     Time: 01.269 secs; Max mem: 1.766 GB PyObj#: 0183590;
00005 processing: 1870-05 sic     Time: 01.153 secs; Max mem: 1.766 GB PyObj#: 0183590;
00006 processing: 1870-06 sic     Time: 01.141 secs; Max mem: 1.766 GB PyObj#: 0183590;
00007 processing: 1870-07 sic     Time: 01.149 secs; Max mem: 1.766 GB PyObj#: 0183590;
00008 processing: 1870-08 sic     Time: 01.184 secs; Max mem: 1.766 GB PyObj#: 0183590;
00009 processing: 1870-09 sic     Time: 01.203 secs; Max mem: 1.766 GB PyObj#: 0183590;
00010 processing: 1870-10 sic     Time: 01.165 secs; Max mem: 1.766 GB PyObj#: 0183590;
...
03543 processing: 2017-03 sic    bc Time: 01.157 secs; Max mem: 2.351 GB PyObj#: 0178489;
03544 processing: 2017-04 sic    bc Time: 01.153 secs; Max mem: 2.351 GB PyObj#: 0178489;
03545 processing: 2017-05 sic    bc Time: 01.146 secs; Max mem: 2.351 GB PyObj#: 0178489;
03546 processing: 2017-06 sic    bc Time: 01.154 secs; Max mem: 2.351 GB PyObj#: 0178489;
03547 processing: 2017-07 sic    bc Time: 01.169 secs; Max mem: 2.351 GB PyObj#: 0178489;
03548 processing: 2017-08 sic    bc Time: 01.249 secs; Max mem: 2.351 GB PyObj#: 0178489;
03549 processing: 2017-09 sic    bc Time: 01.155 secs; Max mem: 2.351 GB PyObj#: 0178489;
03550 processing: 2017-10 sic    bc Time: 01.155 secs; Max mem: 2.351 GB PyObj#: 0178489;
03551 processing: 2017-11 sic    bc Time: 01.144 secs; Max mem: 2.351 GB PyObj#: 0178489;
03552 processing: 2017-12 sic    bc Time: 01.153 secs; Max mem: 2.351 GB PyObj#: 0178489;

The step jumps in object counts that occur now and then weird me out a little.

durack1 added a commit to PCMDI/amipbcs that referenced this issue Dec 17, 2019
@durack1 (Member Author) commented Dec 17, 2019

@danlipsa @doutriaux1 to be a little clearer: I personally find it confusing that we have vcs.clean_auto_generated_objects(), vcs.clear(), vcs.close(), vcs.destroy(), vcs.flush(), vcs.onclosing(), vcs.removeP() and vcs.removeobject(); it's really not intuitive to me what each of these functions does and how they overlap. It may be obvious to someone else, but it certainly isn't to me without considerable guidance.

@danlipsa (Contributor) commented:

Some of them should be removed (clean_auto_generated_objects); some of them should be private (removeP, onclosing).
Not sure if flush() is needed; it seems like a very low-level operation, so it should probably be private as well.
Not sure what the difference between destroy and close is, even after briefly looking at the code.
I agree, those should be cleaned up. I filed a separate issue for this: #429

@danlipsa (Contributor) commented:

@durack1 It seems that x.backend.renWin = None is not needed. I think the issue and workaround were needed only for the old OpenGL1 backend; now we have ported everything to the new OpenGL 3.2 backend.

@danlipsa (Contributor) commented:

On this subject, I was curious about the object use, it seems 180k objects is extreme to me, the below example shows how many (redundant?) objects are created when vcs is loaded, from 6k to 183k seems a little over the top to me.

I think the reason for this is the default graphics methods and templates that are loaded from a JSON file at initialization.
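
A quick way to see where those objects live (a sketch; run in a fresh interpreter):

import vcs  # importing vcs populates vcs.elements with the shipped defaults

# count how many default graphics methods/templates each element type carries
for etype, registry in sorted(vcs.elements.items(), key=lambda kv: -len(kv[1])):
    print('%-15s %5d' % (etype, len(registry)))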

@danlipsa (Contributor) commented:

@durack1 Can you close this issue then?

@durack1 (Member Author) commented Dec 18, 2019

@danlipsa yeah thanks, this issue can be closed, and the code improvement can be focused in #429

@durack1 durack1 closed this as completed Dec 18, 2019
@downiec downiec modified the milestones: 8.2, 8.2.1 Jul 27, 2020