script notes, etc.

Branch: master
Date: 2020-02-13 13:29:26 -07:00
Parent: 98f355fb05
Commit: a2360bd109
1 changed file with 46 additions and 13 deletions
<img src="satnogs-wut/media/branch/master/pics/waterfall-failed.png" width="300"/>
</div>
## wut Web
Main site:
* https://wut.spacecruft.org/
Source code:
* https://spacecruft.org/spacecruft/satnogs-wut
Beta (test) site:
* https://wut-beta.spacecruft.org/
Alpha (development) site:
* https://wut-alpha.spacecruft.org/
## Observations
See also:
* https://wiki.satnogs.org/Operation
* https://wiki.satnogs.org/Observe
* https://wiki.satnogs.org/Observations
* https://wiki.satnogs.org/Category:RF_Modes
* Sample observation: https://network.satnogs.org/observations/1456893/ (used in the example below)
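For context on what the data looks like, observation metadata (including the waterfall image URL and its human vetting) can be pulled from the SatNOGS Network API. Below is a minimal sketch using the public `/api/observations/` endpoint against the sample observation above; the query parameter and the `waterfall`/`vetted_status` field names are assumptions and may not match exactly what the `wut` scripts do internally.

```python
# Minimal sketch: fetch metadata for one SatNOGS observation.
# Assumed: the /api/observations/ endpoint accepts an "id" query parameter
# and returns a list of observation records with "waterfall" and
# "vetted_status" fields. Adjust if the API differs.
import requests

OBS_ID = 1456893  # the sample observation linked above
resp = requests.get(
    "https://network.satnogs.org/api/observations/",
    params={"id": OBS_ID},
    timeout=30,
)
resp.raise_for_status()
obs = resp.json()[0]  # the API returns a list of matching observations

print("Observation:", obs["id"])
print("Vetting:", obs.get("vetted_status"))    # e.g. "good", "bad", "failed"
print("Waterfall URL:", obs.get("waterfall"))  # PNG waterfall used for training
```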
# Machine Learning
The system at present is built upon the following:
* Debian Buster.
* Tensorflow 2 with Keras.
* Jupyter Lab.
* Voila.
Learning/testing, results are good.
The main AI/ML development is being done in Jupyter.
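To make the stack concrete, here is a minimal, illustrative sketch of a Keras convolutional classifier for waterfall images sorted into good/bad/failed folders. The directory layout, image size, and network shape are assumptions for the example, not the project's actual model; it only shows the general pattern (train on labeled waterfall PNGs, save the model to `data/wut.h5`).

```python
# Illustrative sketch only: a small Keras CNN that classifies waterfall PNGs.
# Assumed layout: data/train/good/, data/train/bad/, data/train/failed/.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/train",
    target_size=(256, 256),
    batch_size=32,
    class_mode="categorical",
)

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(256, 256, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(3, activation="softmax"),  # good / bad / failed
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train, epochs=5)
model.save("data/wut.h5")  # the same model path the wut-ml-save script refers to
```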
# Jupyter
There are Jupyter Lab Notebook files in the `notebooks/` subdirectory.
These are producing usable results.
* `wut.ipynb` --- Machine learning Python script using Tensorflow and Keras in a Jupyter Notebook.
* `wut-predict.ipynb` --- Make a prediction (rating) of an observation from a pre-existing model (see the sketch after this list).
* `wut-train.ipynb` --- Train models to be used by the prediction engine.
* `wut-web.ipynb` --- Website: https://wut.spacecruft.org/
* `wut-web-beta.ipynb` --- Website: https://wut-beta.spacecruft.org/
* `wut-web-alpha.ipynb` --- Website: https://wut-alpha.spacecruft.org/
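As a rough illustration of the prediction side (the role `wut-predict.ipynb` plays, per the list above), the sketch below loads a saved model and rates a single waterfall image. The file path, image size, and class ordering are assumptions and would need to match how the model was actually trained.

```python
# Illustrative sketch only: rate one waterfall image with a saved model.
# Assumed: a model at data/wut.h5 trained on 256x256 images, and classes in
# alphabetical order ("bad", "failed", "good") as Keras directory iterators
# produce by default.
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing import image

model = tf.keras.models.load_model("data/wut.h5")

img = image.load_img("download/1456893/waterfall.png",  # hypothetical path
                     target_size=(256, 256))
x = np.expand_dims(image.img_to_array(img) / 255.0, axis=0)

classes = ["bad", "failed", "good"]
scores = model.predict(x)[0]
print("Rating:", classes[int(np.argmax(scores))], scores)
```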
# wut scripts
The following scripts are in the repo.
* `wut` --- Feed it an observation ID and it returns whether it is a "good", "bad", or "failed" observation.
* `wut-audio-archive` --- Downloads audio files from archive.org.
* `wut-audio-sha1` --- Verifies sha1 checksums of files downloaded from archive.org.
* `wut-compare` --- Compare an observation's current (presumably human) vetting with a `wut` vetting.
* `wut-compare-all` --- Compare all the observations in `download/` with `wut` vettings.
* `wut-compare-tx` --- Compare all the observations in `download/` with `wut` vettings using a selected transmitter UUID.
* `wut-dl-sort` --- Populate `data/` dir with waterfalls from `download/`.
* `wut-dl-sort-tx` --- Populate `data/` dir with waterfalls from `download/` using a selected transmitter UUID.
* `wut-dl-sort-txmode` --- Populate `data/` dir with waterfalls from `download/` using a selected encoding.
* `wut-dl-sort-txmode-all` --- Populate `data/` dir with waterfalls from `download/` using all encodings.
* `wut-files` --- Tells you about what files you have in `downloads/` and `data/`.
* `wut-files-data` --- Tells you about what files you have in `data/`.
* `wut-img-ck.py` --- Validate that image files are not corrupt, using PIL (see the image-check sketch after this list).
* `wut-ml` --- Main machine learning Python script using Tensorflow and Keras.
* `wut-ml-auto` --- Machine learning Python script using Tensorflow and Keras, auto.
* `wut-ml-load` --- Machine learning Python script using Tensorflow and Keras, load `data/wut.h5`.
* `wut-ml-save` --- Machine learning Python script using Tensorflow and Keras, save `data/wut.h5`.
* `wut-obs` --- Download the JSON for an observation ID.
* `wut-ogg2wav` --- Convert `.ogg` files in `downloads/` to `.wav` files.
* `wut-rm-random` --- Randomly deletes stuff. Very bad.
* `wut-review-staging` --- Review all images in `data/staging`.
* `wut-tf` --- Shell script to set variables when launching `wut-tf.py`.
* `wut-tf.py` --- Distributed learning script to be run on multiple nodes.
* `wut-water` --- Download waterfall for an observation ID to `download/[ID]`.
* `wut-water-range` --- Download waterfalls for a range of observation IDs to `download/[ID]`.
* `wut-worker` --- Shell script to set variables when launching `wut-worker.py`.
* `wut-worker.py` --- Distributed training script to run on multiple nodes (see the distributed-training sketch after this list).
* `wut-worker-mas` --- Shell script to set variables when launching `wut-worker-mas.py`.
* `wut-worker-mas.py` --- Distributed training script to run on multiple nodes, alt version.
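To illustrate one of the simpler utilities, here is a minimal sketch of the kind of image validation `wut-img-ck.py` does with PIL (flagging waterfall PNGs that cannot be opened or verified). The directory walked and the `.png` filter are assumptions for the example.

```python
# Illustrative sketch of a PIL-based corrupt-image check, in the spirit of
# wut-img-ck.py. Walks a directory tree and reports PNGs PIL cannot verify.
import os
from PIL import Image

def check_images(root="data"):  # "data" is an assumed default
    bad = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.lower().endswith(".png"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with Image.open(path) as img:
                    img.verify()  # raises if the file is truncated or corrupt
            except Exception as exc:
                bad.append((path, exc))
    return bad

if __name__ == "__main__":
    for path, exc in check_images():
        print("Corrupt:", path, exc)
```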
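The `wut-worker*` and `wut-tf*` entries above point at multi-node training, where a shell wrapper sets environment variables before launching the Python script. A common way to do that with TensorFlow 2 is `TF_CONFIG` plus `MultiWorkerMirroredStrategy`; the sketch below shows that pattern with made-up host names, and is an assumption about the approach rather than a copy of the project's scripts.

```python
# Illustrative sketch of multi-worker training with TensorFlow 2.
# Assumed: each node gets a TF_CONFIG describing the cluster (normally set by
# a wrapper script) before this Python script starts.
import json
import os
import tensorflow as tf

os.environ.setdefault("TF_CONFIG", json.dumps({
    "cluster": {"worker": ["node1:2222", "node2:2222"]},  # hypothetical hosts
    "task": {"type": "worker", "index": 0},               # this node's role
}))

strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy()
with strategy.scope():
    # The real model would be the waterfall CNN; a tiny stand-in keeps the
    # sketch short.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(256, 256, 3)),
        tf.keras.layers.Dense(3, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
# model.fit(...) would then shard each batch across the workers.
```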
# Installation
Installation notes...
## Setup
The scripts use files that are ignored in the git repo.
Alpha and Beta development and test servers are here:
* https://wut-alpha.spacecruft.org
* https://wut-beta.spacecruft.org
# Caveats
This is the first artificial intelligence script I've done,
I know little about radio and less about satellites,
and I'm not a programmer.