What In The Zap Is That? AI categorization of spectra from LIBS/XRF analyzers. https://spacecruft.org/spacecruft/witzit/
 
 


witzit - What In The Zap Is That?

witzit --- What In The Zap Is That? AI categorization of spectra from LIBS/XRF analyzers.

Example Jupyter plot from SciAps X-555 XRF:

[Image: xrf-spectrum-sciaps]

Example plot from Olympus Vanta XRF:

[Image: xrf-spectrum-vanta]

Install

Install Dependencies

Install the system dependencies and upgrade Python's pip. You can run something like the following, or set up a Python virtual environment instead.

sudo apt update
sudo apt install git python3-pip
pip install --user --upgrade pip
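If you prefer a virtual environment over `--user` installs, a minimal sketch (the `.venv` directory name is an arbitrary choice, not part of witzit):

```shell
# Create and activate a virtual environment (the .venv name is arbitrary)
python3 -m venv .venv
. .venv/bin/activate
# pip installs now stay inside .venv; upgrade pip there too, e.g.:
# pip install --upgrade pip
```

With the environment active, the `pip install` commands below can be run without the `--user` flag.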

pysalx

The pysalx repo contains scripts for interacting with the device. Install that too.

git clone https://spacecruft.org/spacecruft/pysalx.git
cd pysalx/
# Set the date
./scripts/pysalx-date-set
# whatever else...
..

Clone Git Repo

Get source code with git.

The default requirements.txt installs TensorFlow without GPU support. Edit requirements.txt to change which build is installed; the "generic" version supports both CPU and GPU.

git clone https://spacecruft.org/spacecruft/witzit
cd witzit/
pip install --user --upgrade -r requirements.txt
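The edit is a one-line swap in requirements.txt. The exact package names below are assumptions based on the builds published on PyPI (`tensorflow` is the generic build; `tensorflow-cpu` is CPU-only), not a listing of witzit's actual file:

```text
# requirements.txt -- pick one TensorFlow build:
tensorflow        # "generic": supports both CPU and GPU
# tensorflow-cpu  # CPU-only, smaller download
```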

witzit scripts

  • witzit-load --- Load data from a SciAps or Olympus XRF and display it as text.
  • witzit-plot --- Plot a sample from a SciAps X-555 or Olympus Vanta-M.
  • witzit-plot2png --- Plot a sample from a SciAps X-555 or Olympus Vanta-M and save to PNG.

Development is most easily done under Jupyter with Tensorboard for training models. These files are in the notebooks/ directory.

  • witzit-plot.ipynb --- witzit Jupyter notebook, plotting application for SciAps X-555 or Olympus Vanta-M.
  • witzit-predict.ipynb --- witzit Jupyter notebook, prediction application.
  • witzit-train.ipynb --- witzit Jupyter notebook, training application.

Data

Note: Files in the data/ directory may be deleted and/or manipulated by scripts in this application.

Note well, the data/ directory is ignored by git, and is a temporary directory where data to be processed is stored. For example, if you have a main original archive of 10,000 samples and you want to process just 1,000 of them, they would be copied to the data/ directory.
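The subset-copying step described above could be scripted along these lines. This is a hypothetical helper, not a script shipped with witzit, and the archive path is an assumption:

```python
import random
import shutil
from pathlib import Path


def copy_subset(src, dst, n, seed=None):
    """Copy a random subset of n samples from an archive into a working dir.

    Hypothetical helper: witzit itself does not ship this script, and the
    src/dst paths are up to you (e.g. src=/srv/witzit/archive, dst=data/).
    """
    rng = random.Random(seed)
    dst = Path(dst)
    dst.mkdir(parents=True, exist_ok=True)
    samples = sorted(Path(src).iterdir())
    for p in rng.sample(samples, n):
        if p.is_dir():
            shutil.copytree(p, dst / p.name)
        else:
            shutil.copy2(p, dst / p.name)
```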

Data is also stored here, which can also be deleted/moved by scripts:

/srv/witzit/

Element samples are stored under:

/srv/witzit/data/element/

Element models are stored under:

/srv/witzit/data/models/
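A small sketch of creating that layout, assuming the default /srv/witzit base path (the helper name is hypothetical):

```python
import os


def ensure_witzit_dirs(base="/srv/witzit"):
    """Create the data layout described above (hypothetical helper).

    base/data/element/  - one subdirectory per element sample
    base/data/models/   - trained models
    """
    for sub in ("data/element", "data/models"):
        os.makedirs(os.path.join(base, sub), exist_ok=True)
    return base
```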

Temporary logs during training may be written to the gitignored logs/ directory.

Usage

Usage documentation is still a work in progress. An example run:

# Example:
debian@workstation:~/spacecruft/witzit$ ./witzit-load-x555
       energy (eV)   2048
0        20.590676      0
1        45.021816      0
2        69.452957      0
3        93.884097      0
...
1023  25013.647367    175
1024  25038.078508    173
1025  25062.509648    155
1026  25086.940789    193
...
2047  50031.135199      1
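The output above maps 2048 detector channels to energies. The mapping appears to be linear; the offset and slope below are inferred from the example output (channel 0 → 20.590676 eV, channel 2047 → 50031.135 eV), not taken from any SciAps documentation:

```python
def channel_to_ev(channel, offset=20.590676, slope=24.431140):
    """Linear energy calibration inferred from the example output above.

    The offset/slope values are fitted to the printed table, not official
    calibration constants.
    """
    return offset + slope * channel
```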

Jupyter Notebooks

Run JupyterLab:

cd witzit/notebooks
jupyter-lab

Hardware

  • SciAps LIBS Analyzer

  • SciAps XRF Analyzer

  • Olympus XRF Analyzer

Deep Learning Algorithm

Much of the deep learning code and approach can be reused from wut.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense


def uncompiled_model():
    # Three convolution/pooling stages followed by dense layers.
    # IMG_HEIGHT and IMG_WIDTH are defined elsewhere in the notebook.
    model = Sequential([
        Conv2D(16, 3, padding='same', activation='relu',
               input_shape=(IMG_HEIGHT, IMG_WIDTH, 3)),
        MaxPooling2D(),
        Conv2D(32, 3, padding='same', activation='relu'),
        MaxPooling2D(),
        Conv2D(64, 3, padding='same', activation='relu'),
        MaxPooling2D(),
        Flatten(),
        Dense(512, activation='relu'),
        Dense(1, activation='sigmoid'),  # binary classification output
    ])
    return model

Amazingly (to me), the paper Classification of radioxenon spectra with deep learning algorithm (2021) by Azimi et al. uses a nearly identical CNN Sequential() model to the one wut uses, suggesting it may be a very good base to start from.

The paper is non-gratis science (paywalled).

The Sequential() diagram is taken from the Azimi paper, but it matches wut's model, so it makes a good reference.

[Diagram: Deep Learning Sequential() model]

Articles:

See Also

  • pysalx --- Unofficial scripts for interacting with the SciAps LIBS and XRF analyzers.

https://spacecruft.org/spacecruft/pysalx/

  • wut? --- What U Think? SatNOGS Observation AI.

https://spacecruft.org/spacecruft/satnogs-wut/

Status

Alpha software under development. Need to check:

Unofficial

Unofficial, unaffiliated with SciAps or Olympus.

License

License: GPLv3 or any later version.

Copyright (C) 2019-2022, Jeff Moe