Three somewhat linked problems:

1.) When setting `'ism_red'` to less than 1.0 in a model parameter dictionary for a `model_box`, the wavelength range of the model becomes very restricted (up to ca. 0.8 µm, though I am not sure how universal this is), which seems buggy and then leads to problems when computing residuals, for instance. MnWE 😉
```python
from species import SpeciesInit
from species.data.database import Database
from species.read.read_model import ReadModel

SpeciesInit()

database = Database()

wavel_sampling = 3000  # or: None -- same problem
lmbdBer = (0.1, 39.0)  # or: None -- same problem

# the choice of atmospheric model does not matter here
# (beyond needing to adapt lmbdBer)
database.add_model(model='drift-phoenix', teff_range=(1600, 1800),
                   unpack_tar=False, wavel_range=lmbdBer,
                   wavel_sampling=wavel_sampling)

read_model = ReadModel(model='drift-phoenix')

best = {'teff': 1700., 'logg': 4.1, 'feh': 0.0,
        'ism_red': 0.6,  # problem if less than 1!
        'ism_ext': 1.0,
        }

model_box = read_model.get_model(best)
model_box.open_box()
```
This will print the model box contents; see how lambda stops at 0.76 µm! If `best` instead contains `'ism_red': 1.0` (or a higher number), the wavelengths cover the full requested range (correct behaviour).
According to the example in the documentation ("for example bounds={'ism_ext': (0., 10.), 'ism_red': (0., 20.)}") and from a quick look at the formula, R_V < 1 should not be a problem mathematically, so I do not know what is happening. If you actually do need to restrict R_V to >1 for some reason, the user should be warned at a higher level, of course.
2.) It might be good to warn the user or throw an error if they set `ism_red` (R_V) but not `ism_ext` (A_V). Logically, A_V = 0 is the default, so in practice it does not matter, but it might indicate confusion on the user's side, which can easily occur given the number of parameters. While mini-debugging to prepare this issue, I was not seeing the problem from 1.) when `ism_ext` was left unset, which confused me at first.
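A check of the kind meant here could look like the following minimal sketch; the function name and plain-dict interface are just assumptions for illustration, not the actual species API:

```python
import warnings

def check_extinction_params(params):
    """Warn when R_V is given without A_V (hypothetical helper)."""
    # A_V ('ism_ext') defaults to 0, so setting only R_V ('ism_red')
    # has no effect on the spectrum and likely indicates confusion.
    if 'ism_red' in params and 'ism_ext' not in params:
        warnings.warn(
            "'ism_red' (R_V) is set but 'ism_ext' (A_V) is not; "
            "without 'ism_ext' the reddening parameter has no effect."
        )

check_extinction_params({'teff': 1700., 'ism_red': 0.6})  # triggers the warning
```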
3.) Maybe you noticed that I set the upper wavelength to 39.0 µm. The reason is that the spectrum files go up not to 40 µm, as claimed in `model_data.json`, but to 3.999999999999999289e+01 🙂. I see two problems here beyond the fact that the model does not actually reach 40 µm. Firstly, since we are working with doubles, the last three digits are meaningless anyway. Secondly, there are far too many significant digits, leading to files that are a factor of several larger than necessary — and we are talking hundreds of MBs to GBs, not the inelegance of e.g. 7.1 kB vs. 1.7 kB 🤓. I know we were discussing this a bit somewhere else, but surely five or six significant digits would be enough. Many people have big hard drives, but many are a bit limited, and there is really no gain in having huge files for nothing. There are grids that I could maybe run on my laptop, which would be much more convenient, if they were smaller. (The RAM is a separate issue, of course.)
Also, some wavelength minima are affected, by the way, though not for all grids. Maybe a hot fix for now would be to limit the ranges in `model_data.json` (e.g. starting at 0.601 µm or ending at 39.9 µm) until you remake all the affected grids…
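To put a rough number on the file-size point, here is a quick sketch comparing a text column written with full double precision against six significant digits (pure NumPy with a mock grid, nothing species-specific):

```python
import numpy as np

# mock wavelength grid in micron (illustrative, not an actual species grid)
wavel = np.linspace(0.1, 39.0, 3000)

# full double precision, ~17 significant digits as in the current files
full = "\n".join(f"{w:.17e}" for w in wavel)
# six significant digits, plenty for a wavelength axis
short = "\n".join(f"{w:.5e}" for w in wavel)

print(len(full), len(short))  # the six-digit version is roughly half the size
```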
Thanks a lot!
I am not sure if the following is really a bug given the above, but just in case: I encountered issue 1.) when plotting the residuals with `plot_spectrum()` after a fit in which the best `ism_red` value was below unity. The output contained:
```
[…]/species/phot/syn_phot.py:227: UserWarning: Calculation of the mean flux for 2MASS/2MASS.H is not possible because the wavelength array is empty. Returning a NaN for the flux.
The filter profile of 2MASS/2MASS.H (1.4180-1.8710) lies outside the wavelength range of the spectrum. The returned synthetic flux is therefore set to NaN
[…]
Residuals (sigma):
   - 2MASS/2MASS.H = nan
   - 2MASS/2MASS.J = nan
   - 2MASS/2MASS.Ks = nan
   - SLOAN/SDSS.g = 26.48
   - SLOAN/SDSS.i = 14.70
   - SLOAN/SDSS.r = 41.68
   - SLOAN/SDSS.u = 13.56
   - SLOAN/SDSS.z = 14.67
[…]/species/plot/plot_spectrum.py:1299
[…]
-> 1299 max_tmp = np.max(np.abs(residuals.photometry[item][1][finite]))
[…]
ValueError: zero-size array to reduction operation maximum which has no identity
```
Now the "out of range" warning occurred because the model was being stored only up to ~0.87 µm (by the way, printing the spectrum's wavelength range in the "lies outside the wavelength range of the spectrum" message would be helpful for future debugging). But just from reading that line, the residuals array does contain at least one finite value (the SDSS ones), so `residuals.photometry[item][1][finite]` should not be zero-sized, no? I might be misinterpreting the error message, though. Mentioning it just in case this is an issue too.
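For what it's worth, the ValueError itself is easy to reproduce with NumPy alone: `np.max` has no identity element, so it raises exactly this when a finite-mask selects nothing, which is what happens for a filter whose synthetic flux is NaN:

```python
import numpy as np

residual = np.array([np.nan])   # e.g. the 2MASS/2MASS.H residual
finite = np.isfinite(residual)  # -> [False], selects nothing

try:
    np.max(np.abs(residual[finite]))  # zero-size array -> ValueError
except ValueError as err:
    print(err)  # "zero-size array to reduction operation maximum which has no identity"
```

Judging from the quoted line 1299, the maximum is taken per photometry item (`residuals.photometry[item]`), so the all-NaN 2MASS entries would give an empty masked array even though the SDSS residuals are finite.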
Sorry, I should have guessed that a polynomial could become negative… For R_V < 1, the Cardelli et al. (1989) extinction indeed becomes negative at some wavelengths.
Therefore this was not buggy behaviour but effectively bad input on my part! My apologies.
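The infrared branch of the Cardelli et al. (1989) curve makes the sign flip easy to see: there A(λ)/A_V = (0.574 − 0.527/R_V)·x^1.61 with x = 1/λ in µm⁻¹, which is negative for any R_V below 0.527/0.574 ≈ 0.92. A sketch of the published relation (not species code):

```python
def ccm89_ir(wavel_um, r_v):
    """CCM89 infrared extinction A(lambda)/A_V, valid for 0.3 <= 1/lambda <= 1.1 micron^-1."""
    x = 1.0 / wavel_um  # inverse wavelength in micron^-1
    a = 0.574 * x**1.61
    b = -0.527 * x**1.61
    return a + b / r_v

print(ccm89_ir(1.25, 3.1))  # J band, standard R_V = 3.1: positive (~0.28)
print(ccm89_ir(1.25, 0.6))  # J band, R_V = 0.6: negative "extinction"
```

In the optical branch the b(x) polynomial similarly overwhelms a(x) once it is divided by a small R_V, which pushes the zero crossing down to shorter wavelengths, consistent with the cutoff near 0.76 µm seen above.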
Maybe R_V should be enforced to be ≥ 1 and the documentation updated to reflect this.
Points 2.) and 3.) and the second comment still remain, though.