Repository to hold the code for studying the Vela Pulsar and its Glitch of July 2021 by the Argentine Institute of Radioastronomy


Single Pulse Analyses of the Vela Pulsar
using Machine Learning Techniques

Vela-Pulsar (MNRAS, arXiv)
Vela-Glitch (MNRAS, arXiv)

This repository holds the experiments and models explored in the works "Vela pulsar: single pulses analysis with machine learning techniques" and "First results of the glitching pulsars monitoring program at the Argentine Institute of Radioastronomy", both published in MNRAS. We provide training and visualization scripts. Data generated by our calculations or observations are available from the corresponding authors upon reasonable request.

Overview

We leverage Variational Autoencoders (VAEs) and Self-Organizing Maps (SOMs) to enable individual pulse reconstruction and clustering for the Vela Pulsar (PSR B0833-45 / J0835-4510), given daily observations ranging from 1-3 hours and 50-100k pulses. The first work presents the initial application of the technique to pulsar data and highlights results on 4 days of observation from January and March of 2021. Using these techniques, we were able to effectively isolate 'mini-giant' pulses into clusters and highlight the trend of increasing signal peak amplitude with earlier pulse arrival times, supporting earlier pulsar models that suggest emitting regions of different heights in the pulsar magnetosphere.

framework schematic

Fig 1. Schematic of the proposed VAE-SOM model used in the analysis.

Citation

If you found the information helpful for your work or use portions of this repo in research development, please consider citing:

@article{lousto2022vela,
  title={Vela pulsar: single pulses analysis with machine learning techniques},
  author={Lousto, Carlos O and Missel, Ryan and Prajapati, Harshkumar and Sosa Fiscella, Valentina and Armengol, Federico G L{\'o}pez and Gyawali, Prashnna Kumar and Wang, Linwei and Cahill, Nathan D and Combi, Luciano and Palacio, Santiago del and others},
  journal={Monthly Notices of the Royal Astronomical Society},
  volume={509},
  number={4},
  pages={5790--5808},
  year={2022},
  publisher={Oxford University Press}
}

or

@article{10.1093/mnras/stad723,
  author = {Zubieta, Ezequiel and Missel, Ryan and Fiscella, Valentina Sosa and Lousto, Carlos O and del Palacio, Santiago and Armengol, Federico G López and García, Federico and Combi, Jorge A and Wang, Linwei and Combi, Luciano and Gancio, Guillermo and Negrelli, Carolina and Gutiérrez, Eduardo M},
  title = "{First results of the glitching pulsars monitoring program at the Argentine Institute of Radioastronomy}",
  journal = {Monthly Notices of the Royal Astronomical Society},
  year = {2023},
  month = {03},
  issn = {0035-8711},
  doi = {10.1093/mnras/stad723},
  url = {https://doi.org/10.1093/mnras/stad723}
}

Repository structure

Here we detail the folder structure of the repository, for convenience. We include a requirements.txt file listing the packages used.

  vela-glitch-analysis/
  │
  ├── README.md                                    # What you're reading right now :^)
  ├── requirements.txt                             # Pip requirements file to enable easy setup
  │
  ├── raw_som_reconstruct.py                       # SOM analysis of the raw signals
  ├── vae_model.py                                 # VAE model implementation
  ├── vae_reconstruct.py                           # SOM analysis of the VAE reconstructions
  ├── vae_train.py                                 # VAE training script
  ├── temporal_cluster_histogram.py                # Plot functions related to signals over time
  │
  ├── data/                                        # Files of the month and day captured
  │   └── <monthDay>/
  ├── graphs/                                      # Graphical plots for the given day and antenna
  │   └── <monthDayAntenna>/
  ├── models/                                      # Trained PyTorch checkpoints
  │   └── <monthDayAntenna_modelType>.torch
  ├── reconstructed/                               # VAE models' reconstructions of datasets
  │   └── <monthDayAntenna_modelType>.torch
  ├── scripts/                                     # Automated training .bat scripts
  │   └── complete_reconstruction_script.bat
  └── utils/                                       # Various functions for plotting and model utility
      ├── plot_functions.py
      └── util_functions.py

Data

Data-specific processing details are best read in the acknowledged papers; here we focus on how the data are loaded for the VAE and SOM models. Figure 2 includes the neural architecture schematic. To get the relevant peak window for a dataset, we use a windowing technique: the index of the maximum signal value is averaged across the dataset, and a window of 100 timesteps before and after that index is taken. If the average peak occurs near a boundary (which is simply a pre-processing data-split consideration), checks truncate one side and extend the other so that the window still spans 200 timesteps.
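
The windowing step described above can be sketched in a few lines of NumPy (an illustrative re-implementation; the function name and half-width argument are our own, not identifiers from the repo):

```python
import numpy as np

def peak_window(pulses, half_width=100):
    """Extract a fixed window around the dataset-average peak index.

    pulses: 2D array of shape [n_pulses, n_timesteps]. The window spans
    half_width timesteps on each side of the mean argmax; if the mean
    peak sits near a boundary, one side is truncated and the other
    extended so 2 * half_width timesteps are still returned.
    """
    n_timesteps = pulses.shape[1]
    center = int(np.round(pulses.argmax(axis=1).mean()))
    start, end = center - half_width, center + half_width
    # Boundary checks: shift the window instead of shrinking it.
    if start < 0:
        start, end = 0, 2 * half_width
    elif end > n_timesteps:
        start, end = n_timesteps - 2 * half_width, n_timesteps
    return pulses[:, start:end]
```

The result is a fixed-length [n_pulses, 200] array regardless of where the average peak falls, which keeps the VAE input size constant across days.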

Code + Models

We include two modes of analysis: raw-signal analysis and analysis using the reconstructions from the VAE models. The argparse arguments are specific to the month/day/antenna data structure, but this can easily be swapped out in the np.load() call for whatever data you have.

Raw: We include raw_som_reconstruct.py to handle loading, clustering, and computing statistics for the raw signals with a SOM. VAE: The general workflow goes from VAE training (vae_train.py) to application of the SOM to get clusters and plots out (vae_reconstruct.py). The trained models are saved under models/ and their per-day reconstruction files under reconstructed/.
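
To illustrate the clustering step, here is a minimal NumPy SOM that maps each signal to a best-matching unit on a small grid. This is a toy sketch, not the repository's implementation; the grid size, learning rate, and decay schedule are illustrative assumptions:

```python
import numpy as np

def fit_som(data, grid=(2, 3), iters=500, lr=0.5, sigma=1.0, seed=0):
    """Train a tiny rectangular SOM on data of shape [n_samples, dim].

    Returns the weight grid [gx, gy, dim] and a function mapping a
    sample to its best-matching unit (BMU) coordinates, i.e. its cluster.
    """
    rng = np.random.default_rng(seed)
    gx, gy = grid
    w = rng.normal(size=(gx, gy, data.shape[1]))
    # Grid coordinates of each unit, used for the neighborhood function.
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy),
                                  indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        frac = 1.0 - t / iters  # linear decay of lr and neighborhood
        bmu = np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), (gx, gy))
        dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
        h = np.exp(-dist2 / (2 * (sigma * frac + 1e-3) ** 2))
        w += (lr * frac) * h[..., None] * (x - w)

    def bmu_of(x):
        return np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), (gx, gy))

    return w, bmu_of
```

In the actual pipeline the SOM is fit either on the raw windowed signals or on the VAE reconstructions, and the per-unit memberships give the pulse clusters whose statistics are then plotted.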

The VAE architecture and training script are loosely based on the project available here; however, they were adapted with adaptive pooling and parameter counts relevant to the problem. An in-depth hyperparameter tuning analysis was not performed; however, the current setup proved fairly robust across days and variations.

neural architecture schematic

Fig 2. Schematic of the neural network data-input and layer-specific architectures.

Visualizations

We provide many visualizations with respect to the VAE reconstruction and the SOM clustering analysis, as well as text statistics. An example analysis is provided within graphs/. It is broken down first by dataset, then by whether raw or VAE signals were used, and finally by the SOM shape considered for analysis. Figure 3 shows an example of one of the visualizations provided.

clustering example

Fig 3. Example of a 6-cluster SOM on the July 19th dataset.
