
Attention Map-Guided Two-stage Anomaly Detection using Hard Augmentation (PyTorch)

Anomaly Detection Network

Implemented from the paper Attention Map-Guided Two-stage Anomaly Detection using Hard Augmentation (https://arxiv.org/pdf/1805.08318.pdf); only the Attention Network is implemented here.

The Attention Network generates attention maps that show which regions of an image are anomalous and which are normal.

It can also flag anomalous images by loss value: if an image's loss is greater than a chosen threshold, the image is regarded as anomalous.

However, its accuracy is lower than Skip-Attention GANomaly, so ADGAN is needed to further detect anomalous images.
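A minimal sketch of this threshold rule; `reconstruction_scores`, `is_anomaly`, and the threshold value are illustrative assumptions rather than this repo's exact code:

```python
import torch

def reconstruction_scores(model, images):
    """Per-image L1 reconstruction error from the generator
    (hypothetical helper; the repo's loss may combine more terms)."""
    with torch.no_grad():
        fakes = model(images)                              # reconstructed images
        return (images - fakes).abs().mean(dim=(1, 2, 3))  # one score per image

def is_anomaly(loss_value, threshold=0.5):
    """An image whose loss exceeds the threshold is regarded as anomalous.
    The threshold value here is an assumption; in practice it is chosen
    from the loss distribution of the normal set."""
    return loss_value > threshold
```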

The figure below is reproduced from the paper (https://arxiv.org/pdf/1805.08318.pdf).

(figure from the paper)

Implementation Issues

- Generated fake images sometimes come out black.

  [Solution] Add an anomaly-image loss term to G_loss; note that the AUC drops when this loss criterion is added.

(figure: generated samples)

- Training the attention network fails when input_attn = 1, but succeeds when input_attn is left empty.

- The cutout augmentation is not the real hard augmentation; the paper's method still needs to be implemented.

- It may not be easy to implement in TensorFlow 2.

- The loss function is not the same as in the paper, which may lead to incorrect results.

Hard Augmentation

Only cutout augmentation is implemented so far.
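A minimal sketch of that cutout, assuming a single square patch zeroed at a random center (the patch size and the single-patch choice are assumptions):

```python
import torch

def cutout(img: torch.Tensor, size: int = 8) -> torch.Tensor:
    """Zero out one random square patch of a CHW image tensor."""
    _, h, w = img.shape
    cy = torch.randint(h, (1,)).item()   # random patch center (y)
    cx = torch.randint(w, (1,)).item()   # random patch center (x)
    y1, y2 = max(0, cy - size // 2), min(h, cy + size // 2)
    x1, x2 = max(0, cx - size // 2), min(w, cx + size // 2)
    out = img.clone()
    out[:, y1:y2, x1:x2] = 0.0           # mask the patch to black
    return out
```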


(figure: Generator + Discriminator model)

Table of contents

- Anomaly Detection Network
- Implementation Issues
- Hard Augmentation
- Requirement
- Implement
- GANomaly
- SkipCBAM-AutoEncoder
- Self-Attention
- Attention Network
- Train on custom dataset
- Train
- Test
- Loss value distribution
- Reference

Requirement

pip install -r requirements.txt

Implement

GANomaly

The base framework is GANomaly, with a modified Encoder-Decoder architecture.

Encoder-Decoder network modifications

  1. Skip-GANomaly: add skip connections.
  2. Skip-CBAM feature: encoder features pass through a CBAM attention module and are then skipped to the decoder (see the sketch after this list).
  3. Decoder self-attention: decoder features go through self-attention, and the result is passed to the next layer as its input feature.
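A minimal sketch of a CBAM block that an encoder feature map could pass through before being skipped to the decoder; the reduction ratio of 16 and the 7x7 spatial kernel are the CBAM paper's defaults, not values taken from this repo:

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Channel then spatial attention (Woo et al., CBAM)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(                 # shared MLP for channel attention
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention from global average- and max-pooled descriptors.
        ca = torch.sigmoid(self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3))))
        x = x * ca.view(b, c, 1, 1)
        # Spatial attention from channel-wise mean and max maps.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))
```

A decoder layer would then consume something like torch.cat([dec_feat, cbam(enc_feat)], dim=1) in place of a plain skip connection (the names here are hypothetical).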

(figure: GANomaly architecture)

SkipCBAM-AutoEncoder

  1. Implement skip-CBAM-GANomaly first, then start adding self-attention into the skip-CBAM-GANomaly network.

(figure: original CBAM module)

Self-Attention

The image below shows the self-attention network, reproduced from the paper (https://arxiv.org/pdf/1805.08318.pdf).

(figure: self-attention network from the paper)
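A minimal sketch of the SAGAN-style self-attention block described in that paper (the 1x1 convolutions, the channel reduction by 8, and the learned residual weight gamma follow the paper's formulation; this repo's exact module may differ):

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """SAGAN-style self-attention over spatial positions (arXiv:1805.08318)."""
    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (b, hw, c//8)
        k = self.key(x).flatten(2)                    # (b, c//8, hw)
        attn = torch.softmax(q @ k, dim=-1)           # (b, hw, hw) attention map
        v = self.value(x).flatten(2)                  # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                   # residual connection
```

Since gamma starts at zero, the block initially acts as an identity and gradually learns how much attention to mix in.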

Attention Network

(figure: CBAM)

Train on custom dataset

Custom Dataset
├── test
│   ├── 0.normal
│   │   └── normal_tst_img_0.png
│   │   └── normal_tst_img_1.png
│   │   ...
│   │   └── normal_tst_img_n.png
│   ├── 1.abnormal
│   │   └── abnormal_tst_img_0.png
│   │   └── abnormal_tst_img_1.png
│   │   ...
│   │   └── abnormal_tst_img_m.png
├── train
│   ├── 0.normal
│   │   └── normal_tst_img_0.png
│   │   └── normal_tst_img_1.png
│   │   ...
│   │   └── normal_tst_img_t.png
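A minimal sketch of loading this layout with torchvision's ImageFolder (the root path and transforms are assumptions; the repo's own data loader may differ). Because class folders are sorted by name, 0.normal maps to label 0 and 1.abnormal to label 1:

```python
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

tfm = transforms.Compose([
    transforms.Resize((32, 32)),   # matches --img-size 32 used below
    transforms.ToTensor(),
])

# Path is an assumption based on the tree above.
train_set = datasets.ImageFolder("Custom Dataset/train", transform=tfm)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
```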


Train


python train.py --img-dir "[train dataset dir]" --batch-size 64 --img-size 32 --epoch 20

(screenshot: training output)

Test


python test.py --nomal-dir "[test normal dataset dir]" --abnormal-dir "[test abnormal dataset dir]" --view-img --img-size 32

Loss value distribution

Blue: normal dataset

Orange: abnormal dataset

(figures: loss distributions for normal vs. abnormal datasets)
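A hypothetical helper, not part of the repo, showing how plots like these can be drawn from per-image loss values:

```python
import matplotlib.pyplot as plt

def plot_loss_distributions(normal_losses, abnormal_losses):
    """Overlay per-image loss histograms: blue = normal, orange = abnormal."""
    plt.hist(normal_losses, bins=50, alpha=0.6, label="normal", color="tab:blue")
    plt.hist(abnormal_losses, bins=50, alpha=0.6, label="abnormal", color="tab:orange")
    plt.xlabel("loss value")
    plt.ylabel("image count")
    plt.legend()
    plt.show()
```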

Reference


GANomaly: Semi-Supervised Anomaly Detection via Adversarial Training

https://arxiv.org/abs/1805.06725

Skip-GANomaly: Skip Connected and Adversarially Trained Encoder-Decoder Anomaly Detection

https://arxiv.org/pdf/1901.08954.pdf

Attention Map-Guided Two-stage Anomaly Detection using Hard Augmentation

https://arxiv.org/pdf/1805.08318.pdf
