Description
I was talking to @theorashid, who linked me to this case study of CAR priors. It seems like they're just MvNormals, but with degenerate covariance matrices. We can now sample from such distributions using the new method="eig" or method="svd" argument. So a potential rng_fn would just build an appropriate MvNormal with the method argument set, then return its rng_fn.
Looking at the logp method for these distributions, it seems to use the eig method already, so we might be able to simplify them to wrappers around MvNormalRV that just construct the mean/covariance and set the appropriate method, but that's a step beyond what this PR is asking for.
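To make the idea concrete, here is a minimal sketch of what such an rng_fn could look like for ICAR, assuming the precision matrix is the graph Laplacian Q = D - W. The function name `icar_rng_fn` and the direct eigendecomposition are illustrative only, not the PyMC implementation: it draws in the span of the eigenvectors with nonzero eigenvalue, which is exactly what sampling a degenerate MvNormal via method="eig" amounts to.

```python
import numpy as np

def icar_rng_fn(rng, W, size=None):
    """Hypothetical sketch: draw from an ICAR prior via the eigendecomposition
    of its singular precision matrix Q = D - W (D = diag of row sums of W).
    Draws live in the subspace orthogonal to the null space of Q."""
    W = np.asarray(W)
    Q = np.diag(W.sum(axis=1)) - W          # graph Laplacian: singular precision
    eigvals, eigvecs = np.linalg.eigh(Q)
    keep = eigvals > 1e-10                  # drop the (near-)zero eigenvalues
    std = 1.0 / np.sqrt(eigvals[keep])      # component variances are 1 / eigenvalue
    shape = (() if size is None else tuple(np.atleast_1d(size))) + (int(keep.sum()),)
    z = rng.standard_normal(shape)
    return (z * std) @ eigvecs[:, keep].T   # map back to the full space

# 4-node cycle graph adjacency
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
draws = icar_rng_fn(np.random.default_rng(0), W, size=1000)
# for a connected graph the null space is the constant vector,
# so every draw sums to (numerically) zero
print(np.abs(draws.sum(axis=-1)).max())
```

The zero-sum property of the draws is the visible consequence of the degenerate covariance: the distribution has no mass along the constant direction.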
ricardoV94 commented on Mar 9, 2025
Plus some special logic for sparse covariances that would be nice to support in MvNormal as well?
jessegrabowski commented on Mar 9, 2025
A hidden internal SparseMvNormal, in the vein of PrecisionMvNormal, would be nice. We could rewrite to it when we see the covariance is sparse.
One issue I foresee is that we don't have sparse implementations of the relevant algorithms (Cholesky, Eig, SVD, and Solve). I know a sparse Cholesky exists, because @bwengals was telling me it's a nice one for GP stuff. For the others I have no idea.
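As a rough illustration of the precision-parameterized sampling this would build on, here is a hedged sketch. The function name `precision_mvnormal_draw` is hypothetical; it uses a dense Cholesky factorization as a stand-in, since SciPy itself has no sparse Cholesky (CHOLMOD via scikit-sparse is the usual external option the comment above alludes to):

```python
import numpy as np
from scipy import linalg, sparse

def precision_mvnormal_draw(rng, Q, size=1):
    """Sketch: draw x ~ N(0, Q^{-1}) given a (possibly sparse) precision Q.
    Densifies and uses a dense Cholesky; a true sparse Cholesky (e.g.
    CHOLMOD via scikit-sparse) would replace these two steps for large Q."""
    Qd = Q.toarray() if sparse.issparse(Q) else np.asarray(Q)
    L = linalg.cholesky(Qd, lower=True)          # Q = L @ L.T
    z = rng.standard_normal((size, Qd.shape[0]))
    # solve L.T @ x = z, so cov(x) = L^{-T} L^{-1} = Q^{-1}
    return linalg.solve_triangular(L, z.T, lower=True, trans='T').T

# sparse tridiagonal precision (a proper CAR-like structure)
n = 5
Q = sparse.diags([[-0.4] * (n - 1), [1.0] * n, [-0.4] * (n - 1)],
                 offsets=[-1, 0, 1])
x = precision_mvnormal_draw(np.random.default_rng(1), Q, size=200_000)
emp_cov = np.cov(x, rowvar=False)
print(np.abs(emp_cov - np.linalg.inv(Q.toarray())).max())  # small sampling error
```

The point of the precision parameterization is that one never forms the (dense) covariance Q^{-1} explicitly; only triangular solves against the factor of Q are needed.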
But also beyond the scope of this issue
Muhammad-Rebaal commented on Mar 11, 2025
Hi @jessegrabowski!
Hope you are well!
If no one is working on this issue, could you assign it to me? I'd like to solve it.
Thank you!
ricardoV94 commented on Mar 11, 2025
We don't assign issues; you can just open a PR.
Muhammad-Rebaal commented on Mar 11, 2025
OK, sure thing, I'll open a draft one.
Linked pull requests:
- rng_fn to CAR/ICAR #7723
- feat pymc-devs#7713: implement ICARRV as SymbolicRV
asifzubair commented on Jul 30, 2025
Hi @jessegrabowski, @ricardoV94, I started this PR (#7879) to address this issue. Please note I've only implemented ICARRV (and a corresponding test) for now. I was hoping to get your reviews, after which the implementation of CARRV would be easier. Please let me know your thoughts. Thank you 🙏
ricardoV94 commented on Aug 11, 2025
Just because we can, should we? Are these draws valid in any meaningful sense? Are they useful?
jessegrabowski commented on Aug 11, 2025
It would enable prior predictive checks for ICAR models, which would be nice.
ricardoV94 commented on Aug 11, 2025
But are they valid/meaningful? This is an improper prior, IIUC.
ricardoV94 commented on Aug 11, 2025
Is it something like that? https://stats.stackexchange.com/a/159322
And if valid, should we use svd specifically, instead of eigh?
jessegrabowski commented on Aug 11, 2025
Yeah I might have missed some details.
To merge the PR, I would want to see that the prior and the no-data MCMC give the same answer.
Yes, that was my understanding when I wrote the issue
ricardoV94 commented on Aug 11, 2025
I'm even more skeptical you can use MCMC as any source of truth: https://stats.stackexchange.com/questions/211917/sampling-from-an-improper-distribution-using-mcmc-and-otherwise
MCMC with an improper flat prior will not give you a flat distribution; it can't without infinite bits of precision (and infinite runtime, I guess).
ricardoV94 commented on Aug 11, 2025
We would also need to check that calling numpy with those other methods achieves that goal. The implementation/explanation seems to suggest padding with zeros specifically; I didn't read the formula closely enough to see whether ignoring this detail would change anything.
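One way to probe that question empirically is to hand numpy's Generator.multivariate_normal an exactly singular covariance and check which subspace the draws land in. This is only a sketch of such a check (the sum-to-zero covariance below is a stand-in for an ICAR-like constraint, not PyMC's actual matrix):

```python
import numpy as np

# Exactly singular covariance: projection onto the zero-sum subspace,
# i.e. the covariance of a sum-to-zero Gaussian (an ICAR-like constraint).
n = 4
cov = np.eye(n) - np.ones((n, n)) / n      # rank n-1, eigenvalues {0, 1, 1, 1}

rng = np.random.default_rng(42)
for method in ("svd", "eigh"):             # "cholesky" would fail: cov is singular
    x = rng.multivariate_normal(np.zeros(n), cov, size=1000, method=method)
    # draws should stay in the rank-deficient support: components sum to zero
    print(method, np.abs(x.sum(axis=1)).max())
```

If either method let draws escape the support, the padding-with-zeros detail would indeed matter; with draws confined to the subspace, the degenerate-MvNormal reading of the prior holds up numerically.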