Subset using a list of barcodes #273
Hello @yoavhadas , sure, you can do it in a way similar to AnnData's slicing:

```python
import pegasus as pg

pdata = pg.read_input("<your-file-name>.h5ad")
pdata_subset = pdata[barcode_list, :].copy()
```

where `barcode_list` is the list of cell barcodes you want to keep.
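If the barcodes live in a plain-text file (one barcode per line, a common export format), `barcode_list` can be built with standard Python before slicing; the filename and barcode values below are illustrative only:

```python
from pathlib import Path

# Illustrative only: create a demo barcode file (one barcode per line)
path = Path("barcodes.txt")
path.write_text("AAACCTGAGAAACCAT-1\nAAACCTGTCAGGCGAA-1\n")

# Read it back, stripping whitespace and skipping blank lines
barcode_list = [line.strip() for line in path.read_text().splitlines() if line.strip()]
print(barcode_list)
```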
Hi @yihming , I was able to subset the object using the approach above. Now I need to reanalyze it (repeat the PCA, Harmony, clustering, and UMAP generation) to get an updated object. When I run a script similar to the one I used for the original dataset, I get an error:

Would you happen to have any suggestions on how to overcome this issue? Thanks
Hi @yoavhadas , sorry for getting back to you late. Have you tried setting …
Hi, I have a question about working with MultimodalData. For example, I have used three matrices containing the same cells to generate one MultimodalData object, as follows:

When I filter the MultimodalData by barcode, I get a UnimodalData object that contains only one matrix, like this:

I was wondering whether there is a method to subset the entire MultimodalData, so that I obtain a MultimodalData with all three matrices restricted to the subset of cells?

Additionally, will you be updating the pegasusio tutorials? I feel the current content is somewhat limited. For instance, it took me a long time to figure out on my own how to create a MultimodalData object from a matrix (analogous to Seurat's CreateSeuratObject). Regardless, Pegasus is a single-cell analysis tool that I really like; I have already used it in my own work and cited it, and I hope it will continue to improve.
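One workaround while this is unanswered: apply the same boolean mask to every modality's matrix so the subsets stay aligned across modalities. This is a minimal NumPy sketch of the idea, not the pegasusio API; the modality names (`rna`, `adt`, `hto`), barcodes, and shapes are made up for illustration:

```python
import numpy as np

# Assumption: all three matrices share the same row order of cell barcodes
barcodes = np.array(["AAAC-1", "AAAG-1", "AACT-1", "AAGC-1"])
rna = np.arange(4 * 3).reshape(4, 3)  # 4 cells x 3 genes (toy data)
adt = np.arange(4 * 2).reshape(4, 2)  # 4 cells x 2 proteins
hto = np.arange(4 * 1).reshape(4, 1)  # 4 cells x 1 hashtag

# Cells to keep, expressed as a boolean mask over the shared barcode axis
keep = np.array(["AAAG-1", "AAGC-1"])
mask = np.isin(barcodes, keep)

# Apply the identical mask to every modality so they remain cell-aligned
rna_sub, adt_sub, hto_sub = rna[mask], adt[mask], hto[mask]
barcodes_sub = barcodes[mask]
```

Because a single mask drives every subset, the three matrices cannot drift out of sync, which is the main risk when subsetting modalities one at a time.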
Hello,
Is it possible to subset a pegasus h5ad file using a list of barcodes?
Thanks,
Yoav.