Merge pull request #122 from dnvgl/update-fatigue-calc
Fatigue calculation updates
eneelo authored Nov 29, 2024
2 parents ef64035 + bd1bc35 commit 929a655
Showing 8 changed files with 150 additions and 21 deletions.
8 changes: 4 additions & 4 deletions .github/workflows/codeql.yml
@@ -24,18 +24,18 @@ jobs:

steps:
- name: Checkout
-uses: actions/checkout@v3
+uses: actions/checkout@v4

- name: Initialize CodeQL
-uses: github/codeql-action/init@v2
+uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
queries: +security-and-quality

- name: Autobuild
-uses: github/codeql-action/autobuild@v2
+uses: github/codeql-action/autobuild@v3

- name: Perform CodeQL Analysis
-uses: github/codeql-action/analyze@v2
+uses: github/codeql-action/analyze@v3
with:
category: "/language:${{ matrix.language }}"
12 changes: 6 additions & 6 deletions .github/workflows/publish.yml
@@ -21,12 +21,12 @@ jobs:
uses: actions/checkout@v4
- name: Set up python
id: setup-python
-uses: actions/setup-python@v4
+uses: actions/setup-python@v5
with:
python-version: "3.11"
- name: Load cached Poetry installation
id: cached-poetry
-uses: actions/cache@v3
+uses: actions/cache@v4
with:
path: ~/.local # the path depends on the OS
key: poetry-publish-0 # increment to reset cache
@@ -41,7 +41,7 @@ jobs:
run: poetry self add "poetry-dynamic-versioning[plugin]"
- name: Load cached venv
id: cached-poetry-dependencies
-uses: actions/cache@v3
+uses: actions/cache@v4
with:
path: .venv
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}
@@ -57,16 +57,16 @@ jobs:
env:
POETRY_PYPI_TOKEN_PYPI : ${{ secrets.PYPI_API_TOKEN }}
- name: Upload artifacts
-# https://github.com/actions/upload-artifact#upload-artifact-v3
-uses: actions/upload-artifact@v3
+# https://github.com/actions/upload-artifact
+uses: actions/upload-artifact@v4
with:
name: builds
path: |
dist/qats-*.tar.gz
dist/qats-*.whl
- name: Upload builds to release
# https://github.com/softprops/action-gh-release#%EF%B8%8F-uploading-release-assets
-uses: softprops/action-gh-release@v1
+uses: softprops/action-gh-release@v2
if: startsWith(github.ref, 'refs/tags/')
with:
files: |
14 changes: 7 additions & 7 deletions .github/workflows/test.yml
@@ -19,11 +19,11 @@ jobs:
- name: Check out repository
uses: actions/checkout@v4
- name: Setup Python
-uses: actions/setup-python@v4
+uses: actions/setup-python@v5
with:
-python-version: '3.11'
+python-version: "3.11"
- name: Load pip cache if cache exists
-uses: actions/cache@v3
+uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip
@@ -54,12 +54,12 @@ jobs:
uses: actions/checkout@v4
- name: Set up python ${{ matrix.python-version }}
id: setup-python
-uses: actions/setup-python@v4
+uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Load cached Poetry installation
id: cached-poetry
-uses: actions/cache@v3
+uses: actions/cache@v4
with:
path: ~/.local # the path depends on the OS
key: poetry-0 # increment to reset cache
@@ -75,7 +75,7 @@ jobs:
run: poetry self add "poetry-dynamic-versioning[plugin]"
- name: Load cached venv
id: cached-poetry-dependencies
-uses: actions/cache@v3
+uses: actions/cache@v4
with:
path: .venv
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}
@@ -109,7 +109,7 @@ jobs:
sphinx-build -b html docs/source docs/_build
- name: Upload artifacts
if: ${{ matrix.python-version == '3.11' }}
-uses: actions/upload-artifact@v3
+uses: actions/upload-artifact@v4
with:
name: test-builds
path: |
3 changes: 3 additions & 0 deletions data/example.tension.key
@@ -0,0 +1,3 @@
time
Tension
END
Binary file added data/example.tension.ts
4 changes: 4 additions & 0 deletions qats/fatigue/rainflow.py
@@ -35,6 +35,10 @@ def reversals(series, endpoints=False):
the first and the last points in the series, unless `endpoints` is set to True (in which case they are always
included).
See Also
--------
qats.signal.find_reversals
"""
series = iter(series)

77 changes: 73 additions & 4 deletions qats/signal.py
@@ -463,7 +463,7 @@ def average_frequency(t: np.ndarray, x: np.ndarray, up: bool = True) -> float:

crossings = np.diff(crossings) # array with value=1 at position of each up-crossing and -1 at each down-crossing
crossings[crossings != indicator] = 0 # remove crossings with opposite direction
-ind = np.where(crossings == indicator)[0] + 1 # indices for crossings
+ind = np.nonzero(crossings == indicator)[0] + 1 # indices for crossings

if ind.size > 1:
# more than one crossing -> calculate frequency
@@ -553,8 +553,8 @@ def find_maxima(x, local: bool = False, threshold: float = None, up: bool = True
crossings = np.diff(crossings) # array with 1 at position of each up-crossing and -1 at each down-crossing

# get array indices for up-/down-crossings
-crossing_indices_up = np.where(crossings == 1)[0] + 1 # up-crossings
-crossing_indices_do = np.where(crossings == -1)[0] + 1 # down-crossings
+crossing_indices_up = np.nonzero(crossings == 1)[0] + 1 # up-crossings
+crossing_indices_do = np.nonzero(crossings == -1)[0] + 1 # down-crossings

# number of up-/downcrossings
n_up = crossing_indices_up.size
@@ -604,7 +604,7 @@ def find_maxima(x, local: bool = False, threshold: float = None, up: bool = True
d2s = np.diff(ds) # equal to +/-1 at each turning point, +1 indicates maxima
d2s = np.insert(d2s, 0, [0]) # lost data points when differentiating, close cycles by adding 0 at start

-maxima_indices = np.where(d2s == 1)[0] # unpack tuple returned from np.where
+maxima_indices = np.nonzero(d2s == 1)[0] # unpack tuple returned from np.nonzero
maxima = x[maxima_indices]

n_peaks = maxima.size
@@ -626,6 +626,75 @@ def find_maxima(x, local: bool = False, threshold: float = None, up: bool = True
return maxima, maxima_indices


def find_reversals(x) -> Tuple[np.ndarray, np.ndarray]:
"""
Return reversals (signal turning points).
Parameters
----------
x : array
Signal.
Returns
-------
array
Signal reversals.
array
Indices of reversals.
Notes
-----
.. versionadded :: 5.2.0
This function provides quick identification of signal reversals (turning points), as an alternative
to `qats.fatigue.rainflow.reversals()`, which is slower for large signal arrays. Note that if the
signal contains oscillations of high frequency relative to its dominating oscillations (e.g. due
to noise in the signal), the present function may in some cases include some very local
turning points that are not identified by `qats.fatigue.rainflow.reversals()`. However, when the
turning points obtained from `find_reversals()` are passed through `reversals()`
(with `endpoints=True`), the resulting array will normally be the same as if the signal itself was
passed through `reversals()`.
Specifically, the following two code lines will **not** necessarily produce identical arrays:
>>> from qats.fatigue.rainflow import reversals
>>> rev1 = np.array(list(reversals(x)))
>>> rev2, _ = find_reversals(x)
... but the following code line will normally produce `rev3` identical to `rev1` above:
>>> rev3 = np.array(list(reversals(rev2, endpoints=True)))
Examples
--------
Extract reversals (turning points) from the time series signal `x`:
>>> rev, _ = find_reversals(x)
Extract reversals and corresponding indices:
>>> rev, indices = find_reversals(x)
Use `find_reversals()` to speed up cycle counting:
>>> from qats.fatigue.rainflow import count_cycles
>>> rev, _ = find_reversals(x)
>>> cycles = count_cycles(rev, endpoints=True)
For large arrays, the latter example is practically equivalent to (but faster than)
the following code:
>>> cycles = count_cycles(x)
"""
# local maxima and minima (all peaks, both positive and negative)
ds = 1 * (np.diff(x) < 0) # zero while ascending (positive derivative) and 1 while descending
ds = np.append(ds, [0]) # lost data points when differentiating, close cycles by adding 0 at end
d2s = np.diff(ds) # equal to +/-1 at each turning point, +1 indicates maxima
d2s = np.insert(d2s, 0, [0]) # lost data points when differentiating, close cycles by adding 0 at start

# identify turning points (both maxima and minima)
rev_indices = np.nonzero(np.abs(d2s) == 1)[0] # unpack tuple returned from np.nonzero
rev = x[rev_indices]

return rev, rev_indices


def psd(x: np.ndarray, dt: float, **kwargs) -> Tuple[np.ndarray, np.ndarray]:
"""
Estimate power spectral density of discrete time signal X using Welch’s method.
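
A quick aside on the `np.where` → `np.nonzero` changes above: when `np.where` is called with only a condition, it is equivalent to `np.nonzero`, so the swap does not change behaviour. The short sketch below (illustrative values only, not part of the commit) demonstrates the equivalence and the diff-based turning-point logic introduced in `find_reversals()`:

import numpy as np

# a short illustrative signal (not taken from the qats test data)
x = np.array([0.0, 1.0, 3.0, 2.0, 2.5, 1.0, -1.0, 0.5])

# np.where called with only a condition is equivalent to np.nonzero
cond = np.diff(x) < 0
assert np.array_equal(np.where(cond)[0], np.nonzero(cond)[0])

# turning-point detection as done in find_reversals()
ds = 1 * cond                 # 0 while ascending, 1 while descending
ds = np.append(ds, [0])       # pad at the end to keep alignment with x
d2s = np.diff(ds)             # +/-1 at each turning point, +1 indicates maxima
d2s = np.insert(d2s, 0, [0])  # pad at the start as well
rev_indices = np.nonzero(np.abs(d2s) == 1)[0]
print(x[rev_indices])         # turning points of x: 3.0, 2.0, 2.5, -1.0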
53 changes: 53 additions & 0 deletions test/test_reversals.py
@@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
"""
Module for testing signal processing functions
"""
import os
import unittest

import numpy as np

from qats import TsDB
from qats.signal import find_reversals
from qats.fatigue.rainflow import reversals


class TestReversals(unittest.TestCase):
def setUp(self):
self.data_directory = os.path.join(os.path.dirname(__file__), '..', 'data')
self.peaks_file = "example.tension.ts"
self.peaks_path = os.path.join(self.data_directory, self.peaks_file)

def test_find_reversals(self):
"""
The function `qats.signal.find_reversals()` identifies all turning points of
the specified signal. This is similar to what `qats.fatigue.rainflow.reversals()`
does, but in some cases they produce slightly different arrays.
However, when the turning points from `find_reversals()` are passed through
`reversals()` (with `endpoints=True`), the outcome should be the same as if
the signal itself was passed through `reversals()`.
"""
db = TsDB.fromfile(self.peaks_path)
ts = db.get("Tension")

# find reversals by fatigue.rainflow.reversals()
rev_rainflow = np.fromiter(reversals(ts.x), dtype=float)
# find reversals by signal.find_reversals()
rev_signal, _ = find_reversals(ts.x)

# for this time series, these two methods do not produce the same turning points
# note: this is not a requirement, but we verify that it holds here so that
# the next check actually demonstrates something
self.assertNotEqual(rev_signal.size, rev_rainflow.size)

# however, passing the turning points from `find_reversals()` through
# `reversals()` (with `endpoints=True`) should result in identical arrays
rev_combined = np.fromiter(reversals(rev_signal, endpoints=True), dtype=float)

# now check that this is actually the case
self.assertEqual(rev_combined.size, rev_rainflow.size)
np.testing.assert_array_equal(rev_combined, rev_rainflow)


if __name__ == '__main__':
unittest.main()
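
For reference, a minimal sketch of the usage pattern targeted by the new function, using a synthetic signal in place of `data/example.tension.ts` (the signal definition is illustrative only, not part of the commit or the test suite):

import numpy as np

from qats.signal import find_reversals
from qats.fatigue.rainflow import count_cycles

# synthetic two-component signal standing in for the tension time series (illustrative only)
t = np.linspace(0.0, 1000.0, 100_000)
x = np.sin(2.0 * np.pi * 0.2 * t) + 0.5 * np.sin(2.0 * np.pi * 0.031 * t)

# counting cycles directly on the full signal ...
cycles_direct = count_cycles(x)

# ... is practically equivalent to, but slower than, first reducing the signal
# to its turning points with find_reversals() and counting cycles on those
rev, _ = find_reversals(x)
cycles_fast = count_cycles(rev, endpoints=True)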
