# satnogs-wut
The goal of satnogs-wut is to have a script that takes an
observation ID and returns whether the observation is
"good", "bad", or "failed".
## Good Observation
![Good Observation](pics/waterfall-good.png)
## Bad Observation
![Bad Observation](pics/waterfall-bad.png)
## Failed Observation
![Failed Observation](pics/waterfall-failed.png)
# Machine Learning
The system at present is built upon the following:
* Debian
* Tensorflow
* Keras

It is still in a learning/testing phase, and results are inaccurate.
# wut?
The following scripts are in the repo (a usage example follows the list):
* `wut` --- Feed it an observation ID and it returns whether the observation is "good", "bad", or "failed".
* `wut-obs` --- Download the JSON for an observation ID.
* `wut-water` --- Download waterfall for an observation ID to `download/[ID]`.
* `wut-water-range` --- Download waterfalls for a range of observation IDs to `download/[ID]`.
* `wut-ml` --- Main machine learning Python script using Tensorflow and Keras.
* `wut-review-staging` --- Review all images in `data/staging`.
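For example, checking a single observation might look like this. The observation ID below is made up, and this assumes the scripts are run from the repository root and take the ID as their only argument:
```
# Download the waterfall for a hypothetical observation ID
./wut-water 1292461
# Ask wut whether that observation looks good, bad, or failed
./wut 1292461
```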
# Installation
Most of the scripts are simple shell scripts with few dependencies.
## Setup
The scripts use directories that are ignored in the git repo,
so you need to create them first:
```
mkdir -p download
mkdir -p data/train/good
mkdir -p data/train/bad
mkdir -p data/train/failed
mkdir -p data/validation/good
mkdir -p data/validation/bad
mkdir -p data/validation/failed
mkdir -p data/staging
mkdir -p data/test/unvetted
```
## Debian Packages
You'll need `curl` and `jq`, both in Debian's repos.
```
apt update
apt install curl jq
```
## Machine Learning
For the machine learning scripts, like `wut-ml`, both Tensorflow
and Keras need to be installed. The versions of those in Debian
didn't work for me. IIRC, for Tensorflow I built a `pip` of
version 2.0.0 from git and installed that. I installed Keras
from `pip`. Something like:
```
# XXX These aren't the exact commands, need to check...
# Install bazel or whatever their build system is
# Install Tensorflow
git clone tensorflow...
cd tensorflow
./configure
# run some bazel command
dpkg -i /tmp/pkg_foo/*.deb
apt update
apt -f install
# Install Keras
pip3 install --user keras
# A million other commands....
```
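If building from source turns out to be unnecessary, a simpler route that may work (untested here, and dependent on your Python version and platform) is installing prebuilt wheels from PyPI:
```
# Prebuilt wheels from PyPI; the Tensorflow version is pinned to
# match the 2.0.0 build described above
pip3 install --user tensorflow==2.0.0
pip3 install --user keras
```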
# Usage
The main purpose of the script is to evaluate an observation,
but to do that, it needs to build a corpus of observations to
learn from. Many of the scripts in this repo are therefore just for
downloading and managing observations.
The following steps need to be performed (a worked example follows the list):
1. Download waterfalls and JSON descriptions with `wut-water-range`.
These get put in the `download/[ID]/` directories.
1. Organize the downloaded waterfalls into categories (e.g. "good", "bad", "failed").
Note: a script for this still needs to be written.
Put them into their respective directories under:
* `data/train/good/`
* `data/train/bad/`
* `data/train/failed/`
* `data/validation/good/`
* `data/validation/bad/`
* `data/validation/failed/`
1. Use the machine learning script `wut-ml` to build a model based on
the files in the `data/train` and `data/validation` directories.
1. Rate an observation using the `wut` script.
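Put together, a run might look roughly like the following. The observation IDs are made up, the arguments to `wut-water-range` are assumed to be a start and end ID, the waterfalls are assumed to be saved as PNGs under `download/[ID]/`, and the sorting step is done by hand until a script exists for it:
```
# 1. Download waterfalls for a (hypothetical) range of observation IDs
./wut-water-range 1292461 1292470
# 2. Sort each downloaded waterfall into its category by hand,
#    e.g. after deciding observation 1292461 is good:
mv download/1292461/*.png data/train/good/
# 3. Train a model from data/train and data/validation
./wut-ml
# 4. Rate a new observation with the trained model
./wut 1292475
```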
# Caveats
This is the first machine learning script I've done,
I know little about satellites and less about radio,
and I'm not a programmer.
# Source License / Copying
The main repository is available here:
* https://spacecruft.org/spacecruft/satnogs-wut
License: CC BY-SA 4.0 International and/or GPLv3+ at your discretion. Other code is licensed under its own respective license.
Copyright (C) 2019, 2020, Jeff Moe