
Add SDM-PSI Estimator #591

Draft · wants to merge 61 commits into base: main

Changes from 1 commit (of 61 commits)
52b0ef8
Start working on SDM anisotropic kernel.
tsalo Nov 18, 2020
02c5595
Merge branch 'neurostuff:main' into sdm
tsalo Oct 27, 2021
721ebcf
Run black and improve docstring.
tsalo Oct 29, 2021
0a9b40a
Merge branch 'main' into sdm
tsalo Dec 19, 2021
26cb5e8
Start looking at the Albajes-Eizagirre et al. (2019) supplement.
tsalo Dec 19, 2021
907ac54
Improve documentation a bit.
tsalo Dec 20, 2021
61e8d7d
Use PyMARE instead of metafor's rma.
tsalo Dec 21, 2021
ca6ddc0
Get the permutation functions working.
tsalo Dec 21, 2021
8c530f6
Draft hedges g function.
tsalo Dec 21, 2021
b5be408
Use formulae from wikipedia for hedges' g.
tsalo Dec 21, 2021
f0eee9c
Add empty functions for calculating Hedges g to transforms.
tsalo Dec 21, 2021
b3cf2a7
Try improving Hedges g functions.
tsalo Dec 21, 2021
2067196
Add simulation code (thanks @JulioAPeraza!).
tsalo Dec 22, 2021
57f5d89
Get the simulations working.
tsalo Jan 14, 2022
910a44a
Support different sample sizes per study.
tsalo Jan 14, 2022
0e9207d
I think the single-voxel imputation makes sense now.
tsalo Jan 15, 2022
05260fc
Lots of progress on the simulations.
tsalo Jan 15, 2022
0f0f49a
Update sdm.py
tsalo Jan 15, 2022
3e0a1a2
Document some stuff.
tsalo Jan 16, 2022
1575455
Import stuff.
tsalo Jan 16, 2022
1996c6f
Update setup.cfg
tsalo Jan 16, 2022
e87eea7
Update sdm.py
tsalo Jan 16, 2022
c1c61ed
Some notes.
tsalo Jan 16, 2022
f8b4dd6
Fix the simulated data.
tsalo Jan 16, 2022
61a2769
Documentation improvements.
tsalo Jan 16, 2022
f600bd3
Update sdm.py
tsalo Jan 16, 2022
0d3feb3
Just use the masker to simulate the subject data.
tsalo Jan 18, 2022
fe22b9c
Merge branch 'main' into sdm
tsalo Mar 3, 2022
cb38ec0
Remove supplement R values.
tsalo Mar 3, 2022
679f051
Update sdm.py
tsalo Mar 3, 2022
afa59ca
Run black.
tsalo Mar 3, 2022
4c315fb
Merge branch 'main' into sdm
tsalo Mar 8, 2022
e74be9c
Update sdm.py
tsalo Mar 8, 2022
1855caf
Test out the simulation method.
tsalo Mar 30, 2022
c6d9371
Keep working on the simulation code.
tsalo Mar 30, 2022
e575b0e
Add some tests.
tsalo Mar 30, 2022
34547dc
Update sdm.py
tsalo Mar 30, 2022
a24104f
docstrings.
tsalo Mar 30, 2022
1e0c9cc
Try writing _calculate_hedges_maps.
tsalo May 11, 2022
6df11b3
Run black.
tsalo May 11, 2022
59777cf
Merge remote-tracking branch 'upstream/main' into sdm
tsalo May 11, 2022
c350dd9
Update.
tsalo May 11, 2022
58d9bb4
Update.
tsalo May 11, 2022
63e878e
Move some functions.
tsalo May 11, 2022
d4c2c65
More.
tsalo May 11, 2022
103f5b9
Update sdm.py
tsalo May 13, 2022
267ceaf
Update setup.cfg
tsalo May 14, 2022
7e7a6ac
Update sdm.py
tsalo May 16, 2022
eb52eb9
Update sdm.py
tsalo May 16, 2022
3410c6b
Update test_sdm.py
tsalo May 17, 2022
177c3cc
Update sdm.py
tsalo May 17, 2022
6119e7c
Update sdm.py
tsalo May 17, 2022
ff933c7
Update test_sdm.py
tsalo May 17, 2022
6f7993c
Minor stuff from last night.
tsalo May 18, 2022
54bc79c
More stuff.
tsalo May 18, 2022
ef2e5ab
Update.
tsalo May 19, 2022
ec769ba
Just use PyMARE for the heterogeneity stats.
tsalo May 23, 2022
57d0ebb
Update sdm.py
tsalo May 24, 2022
8bcb8e3
Update sdm.py
tsalo May 24, 2022
2861950
Merge remote-tracking branch 'upstream/main' into sdm
tsalo Aug 12, 2022
ee28f73
[skip ci] Update sdm.py.
tsalo Aug 12, 2022
Update sdm.py
tsalo committed May 17, 2022
commit 177c3cce216c288aff8ee10df7df1488f2a71228
137 changes: 84 additions & 53 deletions nimare/meta/cbma/sdm.py
@@ -60,6 +60,8 @@ def simulate_voxel_with_no_neighbors(n_subjects):
     - Y = R
     """
     y = np.random.normal(loc=0, scale=1, size=n_subjects)
+    # Enforce null mean and unit variance
+    y = stats.zscore(y)
     return y
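For context on this hunk: raw standard-normal draws have zero mean and unit variance only in expectation, and the added `stats.zscore` call makes the sample moments exact. A self-contained sketch of the updated function, runnable outside NiMARE (names mirror the diff):

```python
import numpy as np
from scipy import stats

def simulate_voxel_with_no_neighbors(n_subjects):
    """Simulate subject values for a voxel with no imputed neighbors.

    Standard-normal draws only have mean 0 and variance 1 in expectation;
    z-scoring the sample enforces those moments exactly.
    """
    y = np.random.normal(loc=0, scale=1, size=n_subjects)
    # Enforce null mean and unit variance (the two lines added in this commit)
    y = stats.zscore(y)
    return y

y = simulate_voxel_with_no_neighbors(25)
print(np.isclose(y.mean(), 0.0), np.isclose(y.var(), 1.0))  # True True
```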


@@ -377,63 +379,94 @@ def simulate_subject_maps(n_subjects, masker, correlation_maps):
 def impute_studywise_imgs(
     mle_coeff_img,
     mle_tau2_img,
-    lower_bound_imgs,
-    upper_bound_imgs,
+    lower_bound_img,
+    upper_bound_img,
     seed=0,
 ):
-    """Impute study-wise images.
+    """Impute study's image.
 
     .. todo::
 
         Implement.
 
+    Use the MLE map and each study's upper- and lower-bound effect size maps to impute
+    study-wise effect size and variance images that meet specific requirements.
+
     Parameters
     ----------
     mle_coeff_img : :obj:`~nibabel.nifti1.Nifti1Image`
         Meta-analytic coefficients map.
     mle_tau2_img : :obj:`~nibabel.nifti1.Nifti1Image`
         Meta-analytic variance map.
-    lower_bound_imgs : S-length :obj:`list` of :obj:`~nibabel.nifti1.Nifti1Image`
-        Study-wise lower-bound effect size maps.
-    upper_bound_imgs : S-length :obj:`list` of :obj:`~nibabel.nifti1.Nifti1Image`
-        Study-wise upper-bound effect size maps.
+    lower_bound_img : :obj:`~nibabel.nifti1.Nifti1Image`
+        Study's lower-bound effect size map.
+    upper_bound_img : :obj:`~nibabel.nifti1.Nifti1Image`
+        Study's upper-bound effect size map.
     seed : :obj:`int`, optional
         Random seed. Default is 0.
 
     Returns
     -------
-    study_effect_size_imgs : S-length :obj:`list` of :obj:`~nibabel.nifti1.Nifti1Image`
-        Study-wise effect size maps.
-    study_var_imgs : S-length :obj:`list` of :obj:`~nibabel.nifti1.Nifti1Image`
-        Study-wise effect variance maps.
+    effect_size_img : :obj:`~nibabel.nifti1.Nifti1Image`
+        Study's effect size map.
+    var_img : :obj:`~nibabel.nifti1.Nifti1Image`
+        Study's effect variance map.
+
+    Notes
+    -----
+    - This step, conducted separately for each study, consists of imputing, many times,
+      study images that meet the general PSI conditions adapted to SDM:
+
+      1. the effect sizes imputed for a voxel must follow a truncated normal distribution
+         with the MLE estimates and the effect-size bounds as parameters; and
+      2. the effect sizes of adjacent voxels must show positive correlations.
+
+    - First, it assigns each voxel a uniformly distributed value between zero and one.
+    - Second, and separately for each voxel, it applies a threshold, spatial smoothing, and
+      scaling that together ensure that the voxel has the expected value and variance of the
+      truncated normal distribution and, simultaneously, has strong correlations with the
+      neighboring voxels.
+    - To ensure that the voxel has the expected value of the truncated normal distribution,
+      the threshold applied to the voxels lying within the smoothing kernel is the expected
+      value of the truncated normal distribution scaled to 0-1, and the number (between 0 and 1)
+      resulting from the smoothing is rescaled to the bounds of the truncated normal
+      distribution.
+      To ensure that the voxel has the expected variance of the truncated normal distribution,
+      SDM-PSI selects an anisotropic smoothing kernel that follows the spatial covariance of the
+      voxel and makes the variance of the resulting value in the voxel coincide with the
+      variance of the truncated normal distribution.
+      Please note that each voxel must follow a different truncated normal distribution,
+      and thus this thresholding/smoothing/rescaling process is different for each voxel.
     """
     # Nonsense for now
-    study_effect_size_imgs = lower_bound_imgs[:]
-    study_var_imgs = lower_bound_imgs[:]
-    return study_effect_size_imgs, study_var_imgs
+    effect_size_img = lower_bound_img.copy()
+    var_img = lower_bound_img.copy()
+    return effect_size_img, var_img


 def scale_subject_maps(
-    studylevel_effect_size_maps,
-    studylevel_variance_maps,
+    studylevel_effect_size_map,
+    studylevel_variance_map,
     prelim_subjectlevel_maps,
 ):
     """Scale the "preliminary" set of subject-level maps for each dataset for a given imputation.
 
     This is redundant for studies for which the original statistical image is available.
 
     Parameters
     ----------
-    studylevel_effect_size_maps : :obj:`~numpy.ndarray` of shape (S, V)
-        S is study, V is voxel.
-    studylevel_variance_maps : :obj:`~numpy.ndarray` of shape (S, V)
-    prelim_subjectlevel_maps : S-length :obj:`list` of :obj:`~numpy.ndarray` of shape (N, V)
-        List with one entry per study (S), where each entry is an array that is subjects (N) by
-        voxels (V).
+    studylevel_effect_size_map : :obj:`~numpy.ndarray` of shape (V,)
+        Imputed study-level effect size map.
+        V is voxel.
+    studylevel_variance_map : :obj:`~numpy.ndarray` of shape (V,)
+    prelim_subjectlevel_maps : :obj:`~numpy.ndarray` of shape (N, V)
+        Preliminary (unscaled) subject-level maps.
+        An array that is subjects (N) by voxels (V).
 
     Returns
     -------
-    scaled_subjectlevel_maps : S-length :obj:`list` of :obj:`~numpy.ndarray` of shape (N, V)
-        List with one entry per study (S), where each entry is an array that is subjects (N) by
-        voxels (V).
+    scaled_subjectlevel_maps : :obj:`~numpy.ndarray` of shape (N, V)
+        An array that is subjects (N) by voxels (V).
 
     Notes
     -----
@@ -447,22 +480,15 @@ def scale_subject_maps(
         subject-level maps for each dataset (across imputations), and scaling it across
         imputations.
     """
-    assert len(prelim_subjectlevel_maps) == studylevel_effect_size_maps.shape[0]
-
-    scaled_subjectlevel_maps = []
-    for i_study in range(studylevel_effect_size_maps.shape[0]):
-        study_mean = studylevel_effect_size_maps[i_study, :]
-        study_var = studylevel_variance_maps[i_study, :]
-
-        study_subject_level_maps = prelim_subjectlevel_maps[i_study]
-        study_subject_level_maps_scaled = (study_subject_level_maps * study_var) + study_mean
-        scaled_subjectlevel_maps.append(study_subject_level_maps_scaled)
+    scaled_subjectlevel_maps = (
+        prelim_subjectlevel_maps * studylevel_variance_map
+    ) + studylevel_effect_size_map
 
     return scaled_subjectlevel_maps
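The rewritten body replaces the per-study loop with NumPy broadcasting: multiplying an (N, V) subject array by a (V,) variance map and adding a (V,) effect size map applies the affine scaling voxel-wise to every subject row at once. A small check with toy shapes (the values are illustrative):

```python
import numpy as np

n_subjects, n_voxels = 3, 4
prelim = np.random.default_rng(0).normal(size=(n_subjects, n_voxels))  # (N, V)
var_map = np.full(n_voxels, 2.0)   # (V,) imputed study-level variance map
mean_map = np.full(n_voxels, 0.5)  # (V,) imputed study-level effect size map

# Broadcasting applies each (V,) map row-wise across all N subjects
scaled = (prelim * var_map) + mean_map
print(scaled.shape)  # (3, 4)
print(np.allclose(scaled, prelim * 2.0 + 0.5))  # True
```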


def calculate_hedges_maps(subject_effect_size_imgs):
-    """Run group-level analyses and calculate study-level Hedges' g maps.
+    """Run study-wise group-level analyses and calculate study-level Hedges' g maps.

.. todo::

@@ -789,25 +815,30 @@ def run_sdm(
     all_subject_effect_size_imgs = []
     imp_meta_effect_size_imgs, imp_meta_tau2_imgs = [], []
     for i_imp in range(n_imputations):
-        # Step 3: Impute study-wise effect size and variance maps.
-        study_effect_size_imgs, study_var_imgs = impute_studywise_imgs(
-            mle_coeff_img,
-            mle_tau2_img,
-            lower_bound_imgs,
-            upper_bound_imgs,
-            seed=i_imp,
-        )
 
-        # Step 4: Simulate subject-wise effect size(?) maps.
-        imp_subject_effect_size_imgs = scale_subject_maps(
-            study_effect_size_imgs,
-            study_var_imgs,
-            raw_subject_effect_size_imgs,
-        )
-        all_subject_effect_size_imgs.append(imp_subject_effect_size_imgs)
+        imp_subject_effect_size_imgs = []
+        for j_study in range(n_studies):
+            # Step 3: Impute study-wise effect size and variance maps.
+            imp_study_effect_size_img, imp_study_var_img = impute_studywise_imgs(
+                mle_coeff_img,
+                mle_tau2_img,
+                lower_bound_imgs[j_study],
+                upper_bound_imgs[j_study],
+                seed=i_imp,
+            )
 
-        # Step ???: Generate imputed study-wise effect size and variance maps from subject maps.
-        imp_hedges_imgs, imp_hedges_var_imgs = calculate_hedges_maps(all_subject_effect_size_imgs)
+            # Step 4: Simulate subject-wise effect size(?) maps.
+            study_imp_subject_effect_size_imgs = scale_subject_maps(
+                imp_study_effect_size_img,
+                imp_study_var_img,
+                raw_subject_effect_size_imgs[j_study],
+            )
+            imp_subject_effect_size_imgs.append(study_imp_subject_effect_size_imgs)
+
+        # Step ???: Create imputed study-wise effect size and variance maps from subject maps.
+        imp_hedges_imgs, imp_hedges_var_imgs = calculate_hedges_maps(
+            imp_subject_effect_size_imgs,
+        )
 
         # Step ???: Estimate imputation-wise meta-analytic maps.
         imp_meta_effect_size_img, imp_meta_tau2_img = run_variance_meta(
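`calculate_hedges_maps` is still marked as a todo in this diff. For reviewers, a hedged voxel-wise sketch of the textbook one-sample Hedges' g computation (with the small-sample correction J = 1 - 3/(4·df - 1), df = N - 1); this is the standard formula, not necessarily the PR's final implementation:

```python
import numpy as np

def hedges_g_map(subject_maps):
    """Voxel-wise one-sample Hedges' g from an (N, V) subject array.

    g = J * mean / sd, where J = 1 - 3 / (4 * df - 1) is Hedges'
    small-sample correction and df = N - 1.
    """
    n = subject_maps.shape[0]
    mean = subject_maps.mean(axis=0)
    sd = subject_maps.std(axis=0, ddof=1)  # sample standard deviation
    j = 1.0 - 3.0 / (4.0 * (n - 1) - 1.0)  # small-sample correction factor
    return j * mean / sd

maps = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # N=3 subjects, V=2 voxels
g = hedges_g_map(maps)
print(g.shape)  # (2,)
```

One of the commits above notes the formulae were taken "from wikipedia", so the exact correction factor used in the PR may differ slightly from this sketch.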