
Commit

Merge pull request #3 from ornlneutronimaging/qa
advance release candidate to new stable release
KedoKudo authored Feb 1, 2024
2 parents 584194f + 3c693ab commit 39a7822
Showing 83 changed files with 2,369 additions and 2,876 deletions.
46 changes: 46 additions & 0 deletions .github/workflows/package.yaml
@@ -0,0 +1,46 @@
name: conda packaging and deployment

on:
  workflow_dispatch:
  push:
    branches: [qa, main]
    tags: ['v*']

jobs:
  linux:
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}
    steps:
      - uses: actions/checkout@v3
      - uses: conda-incubator/setup-miniconda@v2
        with:
          auto-update-conda: true
          mamba-version: "*"
          environment-file: environment.yml
          cache-environment-key: ${{ runner.os }}-env-${{ hashFiles('**/environment.yml') }}
          cache-downloads-key: ${{ runner.os }}-downloads-${{ hashFiles('**/environment.yml') }}
      - name: install additional dependencies
        run: |
          echo "installing additional dependencies from environment_development.yml"
      - name: build conda package
        run: |
          # set up environment
          cd conda-recipe
          echo "versioningit $(versioningit ../)"
          # build the package
          VERSION=$(versioningit ../) conda mambabuild --output-folder . .
          conda verify noarch/neunorm*.tar.bz2
      - name: upload conda package to anaconda
        shell: bash -l {0}
        if: startsWith(github.ref, 'refs/tags/v')
        env:
          ANACONDA_API_TOKEN: ${{ secrets.ANACONDA_TOKEN }}
          IS_RC: ${{ contains(github.ref, 'rc') }}
        run: |
          # label is main or rc depending on the tag-name
          CONDA_LABEL="main"
          if [ "${IS_RC}" = "true" ]; then CONDA_LABEL="rc"; fi
          echo pushing ${{ github.ref }} with label $CONDA_LABEL
          anaconda upload --label $CONDA_LABEL conda.recipe/noarch/neunorm*.tar.bz2
133 changes: 132 additions & 1 deletion .gitignore
@@ -1,6 +1,5 @@
__pycache__/*
*.pyc
*~
.ipynb*
.coverage
/cover/*
@@ -10,5 +9,137 @@ __pycache__/*
.cache/*
.idea/*
NeuNorm.egg-info*/
tmp/*

.pytest_cache/


# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

.pdm.toml

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.envrc
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/
22 changes: 22 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,22 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: check-added-large-files
        args: [--maxkb=8192]
      - id: check-merge-conflict
      - id: check-yaml
        args: [--allow-multiple-documents]
        exclude: "conda.recipe/meta.yaml"
      - id: end-of-file-fixer
        exclude: "tests/cis_tests/.*"
      - id: trailing-whitespace
        exclude: "tests/cis_tests/.*"
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.1.14
    hooks:
      - id: ruff
        args: [--fix, --exit-non-zero-on-fix]
        exclude: "tests/cis_tests/.*"
      - id: ruff-format
        exclude: "tests/cis_tests/.*"
File renamed without changes.
3 changes: 1 addition & 2 deletions CONTRIBUTING.md
@@ -13,6 +13,5 @@ Thanks for your interest in contributing code to NeuNorm!
We are currently working on new features such as:
- use of a mask to block pixels and not allow normalization of those pixels
- speed up loading of data by working in parallel

For any contribution you would like to add, please fork the NeuNorm project and use a pull request to bring your changes back to master.
16 changes: 0 additions & 16 deletions ImagingReso.egg-info/PKG-INFO

This file was deleted.

15 changes: 0 additions & 15 deletions ImagingReso.egg-info/SOURCES.txt

This file was deleted.

1 change: 0 additions & 1 deletion ImagingReso.egg-info/dependency_links.txt

This file was deleted.

6 changes: 0 additions & 6 deletions ImagingReso.egg-info/requires.txt

This file was deleted.

1 change: 0 additions & 1 deletion ImagingReso.egg-info/top_level.txt

This file was deleted.

File renamed without changes.
5 changes: 0 additions & 5 deletions MANIFEST.in

This file was deleted.

8 changes: 0 additions & 8 deletions NeuNorm/__init__.py

This file was deleted.

23 changes: 0 additions & 23 deletions NeuNorm/_utilities.py

This file was deleted.

10 changes: 6 additions & 4 deletions README.md
@@ -7,19 +7,21 @@
Abstract
--------

NeuNorm is an open-source Python library that normalizes neutron imaging measurements.

In order to cancel detector electronic noise, source beam fluctuations, and other spurious signals from nearby beam lines, every acquired data set needs to be normalized. To perform the normalization, one must acquire, in addition to the sample data set, one or two extra data sets: a set of open beam (OB) images, taken with the sample removed and the beam on, and an optional set of dark field (DF) images, taken with the beam off and the sample removed. The dark field makes it possible to remove the electronic noise from the images. The principle of the normalization can be summarized by the following figure.

![](documentation/source/_static/normalization_principle.png)

which is defined by the following equation

![](documentation/source/_static/normalization_equation.png)
$$
I_{n}(i, j) = \frac{I(i,j) - DF(i,j)}{OB(i,j) - DF(i,j)}
$$

where $I_n$ is the normalized image, $I$ the raw image, $DF$ the dark field, $OB$ the open beam, and $i$ and $j$ the pixel indices along the x and y axes of the images.

To improve the normalization, the program also allows the user to select a region of interest (ROI) in the sample images in order to match the background of the raw data with the background of the open beam. This is necessary for some beam lines where the fluctuations of the beam are too large to be neglected. For each raw image, the program then calculates the average counts of this ROI divided by the average counts of the same ROI in the open beams, and applies this ratio to the normalized data.
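
As an illustration of the normalization equation and the ROI rescaling described above, here is a minimal NumPy sketch; it is not the NeuNorm API, and the function name, arguments, and the `roi` slice are hypothetical:

```python
import numpy as np

def normalize(sample, ob, df=None, roi=None):
    """Normalize one sample image with an open beam (OB) and an optional dark field (DF)."""
    sample = np.asarray(sample, dtype=float)
    ob = np.asarray(ob, dtype=float)
    if df is not None:
        df = np.asarray(df, dtype=float)
        sample = sample - df
        ob = ob - df
    normalized = sample / ob
    if roi is not None:
        # average counts of the ROI in the raw data divided by the same ROI in the open beam,
        # used to match the two backgrounds
        ratio = sample[roi].mean() / ob[roi].mean()
        normalized = normalized / ratio
    return normalized

# hypothetical usage with 2D arrays and an ROI given as a NumPy slice
# roi = np.s_[10:50, 10:50]
# norm = normalize(raw_image, open_beam, dark_field, roi)
```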

Input data often contain very high counts coming from gamma rays. The program also corrects for those by applying a median filter around the "gamma" pixels, which are identified as the pixels whose counts reach the highest value allowed by the input file format.
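
A minimal sketch of this kind of gamma correction, assuming SciPy's `median_filter`; the helper name, the saturation-value heuristic, and the 3×3 kernel size are assumptions, not NeuNorm's actual implementation:

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_gamma(image, saturation_value=None, size=3):
    """Replace saturated "gamma" pixels by the median of their neighbourhood."""
    image = np.asarray(image, dtype=float)
    if saturation_value is None:
        # assume the gamma pixels sit at the highest count present in the data
        saturation_value = image.max()
    gamma_pixels = image >= saturation_value
    # median-filtered copy of the image, used only where gamma pixels were found
    filtered = median_filter(image, size=size)
    return np.where(gamma_pixels, filtered, image)
```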

9 changes: 9 additions & 0 deletions codecov.yaml
@@ -0,0 +1,9 @@
# Configuration file for codecov reporting code coverage

codecov:
  token: 6cba80e7-6201-4ee5-baf2-b44b290ad103
  status:
    project:
      default:
        # base on last build, but allow drop of up to this percent
        threshold: 0.5%
2 changes: 0 additions & 2 deletions codecov.yml

This file was deleted.

