The main AI/ML development is being done in Jupyter.

# Jupyter
There are Jupyter Lab Notebook files in the `notebooks/` subdirectory.
These are producing usable results. Voila is used to convert
Jupyter notebooks into websites.

* `wut.ipynb` --- Machine learning Python script using Tensorflow and Keras in a Jupyter Notebook.
* `wut-predict.ipynb` --- Make a prediction (rating) of an observation from a pre-existing model.
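As a rough illustration of the kind of Tensorflow/Keras pipeline `wut.ipynb` sets up, a minimal binary classifier might look like the sketch below. The layer choices, the 128x128 input size, and all names here are assumptions for illustration, not the notebook's actual architecture.

```python
# Illustrative only: a tiny Keras binary classifier of the general
# kind wut.ipynb builds. Layer sizes, the 128x128 RGB input, and all
# names are assumptions, not the notebook's actual architecture.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(128, 128, 3)),      # RGB waterfall image
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),  # probability of "good"
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# One random fake image, just to show the prediction shape.
p = model.predict(np.random.rand(1, 128, 128, 3), verbose=0)
print(p.shape)  # (1, 1)
```

A real run would train on labeled waterfall images first; this only shows the shape of the inputs and outputs.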

The following scripts are in the repo.

# Installation
Installation notes...
There are more docs on a few different setups in the `docs/` subdir.

## Setup
The scripts use files that are ignored in the git repo,
so you need to create those directories:

```
mkdir download
rsync -ultav rsync://ml.spacecruft.org/download/ download/
```

# TODO / Brainstorms

This is a first draft of how to do this. The actual machine learning
process hasn't been examined at all, except to get it to generate
an answer. It has a long way to go. There are also many ways to do
this besides using Tensorflow and Keras. Originally, I considered
using OpenCV. Ideas are below, in no particular order.

## General

General considerations.

* Use OpenCV.
* Use something other than Tensorflow / Keras.
* Do a mirror of `network.satnogs.org` and make API calls to it for data.
* Issues are now available here:
  * https://spacecruft.org/spacecruft/satnogs-wut/issues

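The `network.satnogs.org` mirroring idea above could start from the public SatNOGS Network API. A minimal sketch follows; the endpoint is real, but the parameter names (`satellite__norad_cat_id`, `vetted_status`) are taken from the API as observed and may change, so treat them as assumptions.

```python
# Sketch: query the SatNOGS Network API for vetted observations.
# Parameter names here are assumptions based on the public API and
# may change; check the live API docs before relying on them.
import json
import urllib.parse
import urllib.request

API_BASE = "https://network.satnogs.org/api/observations/"

def observations_url(norad_id, status="good", page=1):
    """Build an API query URL for one satellite's observations."""
    params = {
        "satellite__norad_cat_id": norad_id,
        "vetted_status": status,  # good / bad / failed / unknown
        "page": page,
    }
    return API_BASE + "?" + urllib.parse.urlencode(params)

def fetch_observations(url):
    """Fetch one page of observations (needs network access)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # NORAD ID 25544 (the ISS) used purely as an example.
    print(observations_url(25544))
```

Paging through results like this would let a local mirror accumulate labeled observations for training.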
## Tensorflow / Keras

At present Tensorflow and Keras are used.

* Learn Keras / Tensorflow...
* What part of the image is being evaluated?
* Re-evaluate each step.
* Right now the prediction output is just "good" or "bad"; it needs
  "failed" too.
* Give a confidence score with each prediction.
* Visualize what the ML is looking at.
* Separate out good/bad/failed by satellite, transmitter, or encoding.
  That way a "good" rating isn't learned from a totally different
  encoding. Right now, it is rating as good some observations
  that should be bad...
* If it has a low confidence, return "unknown" instead of "good" or "bad".

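The last two ideas above (a confidence score, and "unknown" for low-confidence predictions) can be combined in a small post-processing step on the model's output. A minimal sketch, where the function name and the threshold value are illustrative, not tuned:

```python
# Sketch of the "low confidence -> unknown" idea: map a model's
# good-probability to a rating, refusing to guess near 0.5.
# The margin value is illustrative, not tuned.

def rate(p_good, margin=0.2):
    """Return ('good'|'bad'|'unknown', confidence).

    p_good: model output in [0, 1], probability the observation is good.
    margin: half-width of the "unknown" band around 0.5.
    """
    confidence = abs(p_good - 0.5) * 2.0  # 0.0 at a coin flip, 1.0 at certain
    if 0.5 - margin < p_good < 0.5 + margin:
        return "unknown", confidence
    return ("good" if p_good >= 0.5 else "bad"), confidence

print(rate(0.97))  # confident good
print(rate(0.55))  # too close to call -> unknown
print(rate(0.10))  # confident bad
```

Adding "failed" as a third class would mean moving from this binary sigmoid output to a softmax over good/bad/failed, with the same low-confidence fallback applied to the winning class.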
# Caveats
This is the first artificial intelligence script I've done,
I know little about radio and less about satellites,