Merge branch 'master' of github.com:cbassa/strf

pull/30/head
Cees Bassa 2022-02-03 23:32:27 +01:00
commit 0afac75bad
13 changed files with 743 additions and 0 deletions

3
.gitignore vendored

@@ -25,3 +25,6 @@
/rfinfo
/rfplot
/rfpng
/rfinfo
.env

73
GUIDE.md 100644

@@ -0,0 +1,73 @@
# Guide for RF doppler analysis from SatNOGS Observation Waterfall Images
Forum post: https://community.libre.space/t/new-software-satnogs-waterfall-tabulation-helper/4380
This tool evolved from the need to tabulate Doppler data from the lunar
Chang'e-4 probe,
see <https://gitlab.com/kerel-fs/jupyter-notebooks/-/tree/master/change4/data#data-tabulation-method-new>.
## Installation
Install the dependencies:
```
pip install -r contrib/requirements.txt
```
## First Setup
Choose a folder where TLEs are stored and
a folder where RF Doppler observations (`.dat` files) are stored.
For now the paths are configured via environment variables,
so make sure to set and export them before each use.
Example:
```
# Filename convention: {TLE_DIR}/{observation_id}.txt
export SATNOGS_TLE_DIR="./data/tles"
# Absolute frequency measurement storage
# Filename convention: {DOPPLER_OBS_DIR}/{observation_id}.dat
export SATNOGS_DOPPLER_OBS_DIR="./data/obs"
# SATTOOLS/STRF/STVID sites.txt file
export SATNOGS_SITES_TXT="./data/sites.txt"
mkdir -p $SATNOGS_TLE_DIR
mkdir -p $SATNOGS_DOPPLER_OBS_DIR
```
## Usage
0. Make sure the three environment variables above are set.
1. Choose a SatNOGS Observation ID from SatNOGS Network and run the tabulation helper:
```
./contrib/satnogs_waterfall_tabulation_helper.py 1102230
```
An interactive plot will show up.
Clicking inside the plot adds a signal marker.
When you are finished adding signal markers,
save them using the keyboard shortcut `f`.
Custom keyboard shortcuts:
- `u` - undo the last signal marker
- `f` - save the signal markers in an strf-compatible file
Useful Matplotlib navigation keyboard shortcuts (see the Matplotlib documentation):
- `p` - toggle 'Pan/Zoom' mode
- `o` - toggle 'Zoom-to-rect' mode
- `h` - Home/Reset (view)
- `c` - Back (view)
2. Run rffit for orbit fitting, e.g.
```
./rffit -d $SATNOGS_DOPPLER_OBS_DIR/1102230.dat -i 44356 -c $SATNOGS_TLE_DIR/1102230.txt -s 7669
```
## Known issues
- A site id of the form `7{station_id}` is automatically assigned and written to
the `sites.txt` (e.g. station 669 gets `7669` assigned).
Only SatNOGS stations with IDs up to 999 are supported, as the strf sites.txt parser only allows
4-digit site ids. In case of problems, choose a free site id and manually correct the
Doppler obs (`.dat` files, last column); see the example below.
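For reference, each line of a `.dat` file contains the MJD epoch, the frequency in Hz, a constant `1.0` and the site id, separated by tabs. A hypothetical line for station 669 would look like:
```
59613.978912	437500123.45	1.0	7669
```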

1
config.sh 120000

@@ -0,0 +1 @@
../sattools/config.sh

65
contrib/README.md 100644

@@ -0,0 +1,65 @@
# STRF contrib Scripts
## Installation
- Create a Python virtual environment and install the dependencies via pip:
```
mkvirtualenv strf
pip install -r contrib/requirements.txt
```
```
- Create initial configuration from `env-dist`
```
cp env-dist .env
```
- Configure the data paths (they must already exist!) and add your SatNOGS DB API token to `.env`; a minimal example is shown below.
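A minimal `.env`, based on the variables defined in `env-dist` (all paths are placeholders, adjust them to your setup):
```
SATNOGS_TLE_DIR=/home/pi/satnogs_data/tles/satnogs
SATNOGS_ARTIFACTS_DIR=/home/pi/satnogs_data/artifacts
SATNOGS_DOPPLER_OBS_DIR=/home/pi/satnogs_data/doppler_obs
SATNOGS_SITES_TXT=/home/pi/satnogs_data/sites.txt
SATNOGS_NETWORK_API_URL=https://network.satnogs.org/api/
SATNOGS_DB_API_URL=https://db.satnogs.org/api/
SATNOGS_DB_API_TOKEN=your-db-api-token
```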
## Find good observations with SatNOGS artifacts
```
$ ./contrib/find_good_satnogs_artifacts.py
```
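The script prints the SatNOGS Network URL of each matching observation, one per line, e.g. (hypothetical observation id):
```
https://network.satnogs.org/observations/4950356/
```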
## Download SatNOGS Artifacts
```
$ ./contrib/download_satnogs_artifact.py 4950356
Artifact Metadata for Observation #4950356 found.
Artifact saved in /home/pi/data/artifacts/4950356.h5
```
## Download SatNOGS Observation TLEs
To add the TLE used in a specific SatNOGS Observation to your catalog, the following command can be used:
```
$ ./contrib/download_satnogs_tle.py 4950356
TLE saved in /home/pi/data/tles/satnogs/4950356.txt
```
## Plot SatNOGS Artifacts
This can be done using [spectranalysis](https://github.com/kerel-fs/spectranalysis).
## Ultra-Short Analysis Guide
```
source .env
OBSERVATION_ID=4950356
./contrib/download_satnogs_tle.py $OBSERVATION_ID
./contrib/download_satnogs_artifact.py $OBSERVATION_ID
./contrib/analyze_artifact.py --observation_id $OBSERVATION_ID --site_id 977
rffit -d "$SATNOGS_DOPPLER_OBS_DIR/$OBSERVATION_ID.dat" -c "$SATNOGS_TLE_DIR/$OBSERVATION_ID.txt" -i 27844 -s 977
```
Example session:
```
$ ./contrib/download_satnogs_tle.py 4950356
TLE saved in /mnt/old_home/kerel/c4/satnogs/data/tles/satnogs/4950356.txt
$ ./contrib/download_satnogs_artifact.py 4950356
Artifact Metadata for Observation #4950356 found.
Artifact saved in /mnt/old_home/kerel/c4/satnogs/data/artifacts/4950356.h5
$ ./contrib/analyze_artifact.py --observation_id 4950356 --site_id 977
Load /mnt/old_home/kerel/c4/satnogs/data/artifacts/4950356.h5
Extract measurements...
Data written in /mnt/old_home/kerel/c4/satnogs/data/doppler_obs/4950356.dat
$ rffit -d ${SATNOGS_DOPPLER_OBS_DIR}/4950356.dat -c ${SATNOGS_TLE_DIR}/4950356.txt -i 27844 -s 977
```

170
contrib/analyze_artifact.py

@@ -0,0 +1,170 @@
#!/usr/bin/env python3
from astropy.time import Time
from datetime import datetime, timedelta
import argparse
import ephem
import h5py
import json
import os
import sys
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
def load_artifact(filename):
hdf5_file = h5py.File(filename, 'r')
if hdf5_file.attrs['artifact_version'] != 2:
print("unsupported artifact version {}".format(hdf5_file.attrs['artifact_version']))
# return
wf = hdf5_file.get('waterfall')
data = (np.array(wf['data']) * np.array(wf['scale']) + np.array(wf['offset']))
metadata = json.loads(hdf5_file.attrs['metadata'])
return hdf5_file, wf, data, metadata
def extract_peaks(data):
"""
Returns
-------
points: (time_index, freq_index)
measurements: measurement tuples of relative time in seconds and relative frequency in hertz
"""
snr = (np.max(data, axis=1) - np.mean(data, axis=1)) / np.std(data, axis=1)
    # Integrate each channel along the whole observation.
    # This filter helps under the assumption of minimal Doppler deviation.
snr_integrated = np.zeros(data[0].shape)
for row in data:
snr_integrated += (row - np.mean(row)) / np.std(row)
snr_integrated = pd.Series(snr_integrated/len(snr_integrated))
SNR_CUTOFF = 4
CHANNEL_WINDOW_SIZE = 4
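    # Mask channels whose integrated SNR changes sharply (signal edges), widened by a rolling maximum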
channel_mask = (snr_integrated.diff().abs() / snr_integrated.diff().std() > SNR_CUTOFF).rolling(CHANNEL_WINDOW_SIZE, min_periods=1).max()
rows=[]
for i, row in enumerate(channel_mask):
if not row:
continue
rows.append(i)
points_all = []
points_filtered = []
for i,x in enumerate(np.argmax(data, axis=1)):
points_all.append([i, x])
if x in rows:
points_filtered.append([i, x])
points_all = np.array(points_all)
points_filtered = np.array(points_filtered)
if len(points_filtered) == 0:
print("WARNING: No measurement passed filter, loosening requirements now.")
points_filtered = points_all
measurements = np.vstack((wf['relative_time'][points_filtered[:,0]],
[wf['frequency'][x] for x in points_filtered[:,1]]))
return snr, measurements, points_filtered
def plot_measurements_all(wf, data):
plt.plot(wf['relative_time'][:],
[wf['frequency'][x] for x in np.argmax(data[:], axis=1)],
'.',
label='all')
def plot_measurements_selected(measurements):
plt.plot(measurements[0,:],
measurements[1,:],
'.',
label='filtered')
def plot_legend(observation_id):
plt.legend()
plt.title("Observation #{} - Maximum Values".format(observation_id))
plt.grid()
plt.xlabel('Elapsed Time / s')
plt.ylabel('rel. Frequency / kHz')
plt.show()
def dedoppler(measurements, metadata):
tle = metadata['tle'].split('\n')
start_time = datetime.strptime(wf.attrs['start_time'].decode('ascii'),
'%Y-%m-%dT%H:%M:%S.%fZ')
f_center = float(metadata['frequency'])
# Initialize SGP4 propagator / pyephem
satellite = ephem.readtle('sat', tle[1], tle[2])
observer = ephem.Observer()
observer.lat = str(metadata['location']['latitude'])
observer.lon = str(metadata['location']['longitude'])
observer.elevation = metadata['location']['altitude']
def remove_doppler_correction(t, freq):
"""
Arguments
---------
        t - datetime: time of the measurement
        freq - float: measured frequency in hertz
"""
observer.date = t
satellite.compute(observer)
v = satellite.range_velocity
df = f_center * v / ephem.c
return f_center + freq - df*2
output = []
for dt,df in measurements.T:
t = start_time + timedelta(seconds=dt)
f = f_center + df
freq_recv = remove_doppler_correction(t, f)
output.append((t, freq_recv))
return output
def save_rffit_data(filename, measurements, site_id):
with open(filename, 'w') as file_out:
for time, freq in measurements:
line = '{:.6f}\t{:.2f}\t1.0\t{}\n'.format(Time(time).mjd, freq, site_id)
file_out.write(line)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Analyze SatNOGS Artifact.')
parser.add_argument('--observation_id', type=str,
help='SatNOGS Observation ID')
parser.add_argument('--filename', type=str,
help='SatNOGS Artifacts File')
parser.add_argument('--output_file', type=str,
help='STRF-compatible output file')
parser.add_argument('--site_id', type=int, required=True,
help='STRF Site ID')
args = parser.parse_args()
if args.observation_id:
        # Use canonical file paths, ignoring `--filename` and `--output_file`
filename = '{}/{}.h5'.format(os.getenv('SATNOGS_ARTIFACTS_DIR'), args.observation_id)
output_file = '{}/{}.dat'.format(os.getenv('SATNOGS_DOPPLER_OBS_DIR'), args.observation_id)
else:
        if not all([args.filename, args.output_file]):
            print('ERROR: Missing arguments --filename and --output_file')
            sys.exit(1)
filename = args.filename
output_file = args.output_file
print('Load {}'.format(filename))
hdf5_file, wf, data, metadata = load_artifact(filename)
print('Extract measurements...')
snr, measurements, points = extract_peaks(data)
plot_measurements_all(wf, data)
plot_measurements_selected(measurements)
plot_legend(args.observation_id)
m2 = dedoppler(measurements, metadata)
save_rffit_data(output_file, m2, site_id=args.site_id)
print('Data written in {}'.format(output_file))

85
contrib/download_satnogs_artifact.py

@@ -0,0 +1,85 @@
#!/usr/bin/env python3
import argparse
import logging
import requests
import settings
import shutil
import sys
import tempfile
import os
from urllib.parse import urljoin
from pprint import pprint
logger = logging.getLogger(__name__)
def fetch_artifact_metadata(network_obs_id):
url = urljoin(settings.SATNOGS_DB_API_URL, 'artifacts/',)
params = {'network_obs_id': network_obs_id}
headers = {'Authorization': 'Token {0}'.format(settings.SATNOGS_DB_API_TOKEN)}
response = requests.get(url,
params=params,
headers=headers,
timeout=10)
response.raise_for_status()
return response.json()
def fetch_artifact(url, artifact_filename):
headers = {'Authorization': 'Token {0}'.format(settings.SATNOGS_DB_API_TOKEN)}
response = requests.get(url,
headers=headers,
stream=True,
timeout=10)
response.raise_for_status()
with open(artifact_filename, 'wb') as fname:
for chunk in response.iter_content(chunk_size=1024):
fname.write(chunk)
def download_artifact(observation_id, filename):
try:
artifact_metadata = fetch_artifact_metadata(network_obs_id=observation_id)
except requests.HTTPError:
print('An error occurred trying to GET artifact metadata from db')
return
if not len(artifact_metadata):
        print('No artifact found in db for network_obs_id {}'.format(observation_id))
return
print("Artifact Metadata for Observation #{} found.".format(observation_id))
try:
artifact_file_url = artifact_metadata[0]['artifact_file']
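        # Download to a temporary file first, then copy it to the final location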
artifact_file = tempfile.NamedTemporaryFile(delete=False)
fetch_artifact(artifact_file_url, artifact_file.name)
artifact_file.close()
shutil.copy(artifact_file.name, filename)
os.remove(artifact_file.name)
print("Artifact saved in {}".format(filename))
except requests.HTTPError:
print('Download failed for {}'.format(artifact_file_url))
return
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Download SatNOGS Artifacts from SatNOGS DB.')
parser.add_argument('observation_ids', metavar='ID', type=int, nargs='+',
help='SatNOGS Observation ID')
args = parser.parse_args()
logging.basicConfig(level=logging.INFO)
for observation_id in args.observation_ids:
download_artifact(observation_id,
'{}/{}.h5'.format(os.getenv('SATNOGS_ARTIFACTS_DIR'), observation_id))

26
contrib/download_satnogs_tle.py

@@ -0,0 +1,26 @@
#!/usr/bin/env python3
import argparse
import os
from satnogs_api_client import fetch_observation_data
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Download TLE from a specific Observation in SatNOGS Network.')
parser.add_argument('observation_id', type=int,
help='SatNOGS Observation ID')
args = parser.parse_args()
obs = fetch_observation_data([args.observation_id])[0]
filename = '{}/{}.txt'.format(os.getenv('SATNOGS_TLE_DIR'), args.observation_id)
with open(filename, 'w') as f:
f.write(obs['tle0'])
f.write('\n')
f.write(obs['tle1'])
f.write('\n')
f.write(obs['tle2'])
f.write('\n')
print("TLE saved in {}".format(filename))

54
contrib/find_good_satnogs_artifacts.py

@@ -0,0 +1,54 @@
#!/usr/bin/env python3
import sys
import logging
import requests
import settings
from urllib.parse import urljoin
from pprint import pprint
from satnogs_api_client import fetch_observation_data, fetch_tle_of_observation
logger = logging.getLogger(__name__)
def fetch_latest_artifacts_metadata():
url = urljoin(settings.SATNOGS_DB_API_URL, 'artifacts/',)
params = {}
headers = {'Authorization': 'Token {0}'.format(settings.SATNOGS_DB_API_TOKEN)}
try:
response = requests.get(url,
params=params,
headers=headers,
timeout=10)
response.raise_for_status()
except (requests.ConnectionError, requests.Timeout, requests.TooManyRedirects):
        logger.exception('An error occurred trying to GET artifact metadata from db')
        sys.exit(1)
artifacts_metadata = response.json()
if not len(artifacts_metadata):
logger.info('No artifacts found in db')
sys.exit(-1)
return artifacts_metadata
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
# Fetch list of artifacts
artifacts_metadata = fetch_latest_artifacts_metadata()
observation_ids = [artifact['network_obs_id'] for artifact in artifacts_metadata]
# Load corresponding obs from network
observations = fetch_observation_data(sorted(observation_ids, reverse=True))
# Filter by good status
for observation in observations:
        if observation['vetted_status'] != 'good':
            continue
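        # Strip the trailing '/api/' from SATNOGS_NETWORK_API_URL to build the web UI link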
print("{}/observations/{}/".format(settings.SATNOGS_NETWORK_API_URL[:-5], observation['id']))

10
contrib/requirements.txt

@@ -0,0 +1,10 @@
astropy
matplotlib
numpy
Pillow
python-decouple
requests
git+https://gitlab.com/librespacefoundation/satnogs/python-satnogs-api.git@e20a7d3c
h5py~=3.6.0
pandas~=1.3.4
ephem~=4.3.1

213
contrib/satnogs_waterfall_tabulation_helper.py

@@ -0,0 +1,213 @@
#!/usr/bin/env python3
import argparse
import requests
import ephem
import csv
from astropy.time import Time
from astropy import constants as const
from datetime import datetime, timedelta
from io import BytesIO
import matplotlib.pyplot as plt
import numpy as np
from PIL import Image
import settings
from satnogs_api_client import fetch_observation_data, fetch_tle_of_observation
def tabulation_helper_dialog(lower_left, upper_right,
bandwidth, duration,
f_center, filename_out,
waterfall_matrix,
epoch_start, site_id, correction_method=None):
    # Assumptions: x-axis: frequency, y-axis: time; time advances upward, frequency advances to the right
# Derive conversion parameters
x_range = upper_right[0] - lower_left[0]
y_range = lower_left[1] - upper_right[1]
x_offset = int(lower_left[0] + 0.5 * x_range)
y_offset = lower_left[1]
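    # x_offset: pixel column of the center frequency; y_offset: pixel row mapped to epoch_start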
t_step = duration / y_range
f_step = bandwidth / x_range
# Highlight center and calibration lines
waterfall_matrix[:,x_offset] = (255,0,0,255)
waterfall_matrix[y_offset,:] = (255,0,0,255)
waterfall_matrix[:,upper_right[0]] = (255,0,0,255)
waterfall_matrix[upper_right[1],:] = (255,0,0,255)
markers = {'x': [], 'y': [], 't': [], 'f': []}
fig, ax = plt.subplots()
ax.imshow(waterfall_matrix)
marker_line, = ax.plot([0], [0], marker='.', c='k', zorder=100)
def on_mouse_press(event):
tb = plt.get_current_fig_manager().toolbar
if not (event.button==1 and event.inaxes and tb.mode == ''):
return
markers['x'].append(event.xdata)
markers['y'].append(event.ydata)
markers['t'].append(epoch_start - (event.ydata - y_offset) * t_step)
markers['f'].append(f_step * (event.xdata - x_offset))
print('x:{:.2f} y:{:.2f} --> {:7.0f} --> {} {:7.0f}'.format(markers['x'][-1],
markers['y'][-1],
markers['f'][-1] - f_center,
markers['t'][-1],
markers['f'][-1]))
update_plot_markers()
def update_plot_markers():
marker_line.set_xdata(markers['x'])
marker_line.set_ydata(markers['y'])
# ax.scatter(x=[int(event.xdata)], y=[int(event.ydata)], marker='.', c='k', zorder=100)
ax.figure.canvas.draw()
def undo_track_point_selection():
markers['x'] = markers['x'][:-1]
markers['y'] = markers['y'][:-1]
markers['t'] = markers['t'][:-1]
markers['f'] = markers['f'][:-1]
update_plot_markers()
def write_strf_spectrum(filename, site_id, markers):
with open(filename, 'w') as f:
for t,freq in list(sorted(zip(markers['t'],markers['f']), key=lambda x: x[0])):
if correction_method:
freq_recv = correction_method(t, freq)
else:
freq_recv = freq
line = '{:.6f}\t{:.2f}\t1.0\t{}\n'.format(Time(t).mjd, freq_recv, site_id)
print(line, end='')
f.write(line)
def save_selected_track_points():
write_strf_spectrum(filename_out, site_id, markers)
print('Stored {} selected track points in {}.'.format(len(markers['t']), filename_out))
def on_key(event):
if event.key == 'u':
undo_track_point_selection()
elif event.key == 'f':
save_selected_track_points()
fig.canvas.mpl_connect('button_press_event', on_mouse_press)
fig.canvas.mpl_connect('key_press_event', on_key)
plt.tight_layout()
plt.show()
def read_sitestxt(path):
sites = {}
with open(path) as fp:
d = csv.DictReader([row for row in fp if not row.startswith('#')],
delimiter=' ',
fieldnames=["no", "id", "lat", "lon", "alt"],
restkey="observer",
skipinitialspace=True)
for line in d:
print(line)
sites[line["no"]] = {'lat': line["lat"],
'lon': line["lon"],
'sn': line["id"],
'alt': line["alt"],
'name': ' '.join(line["observer"])}
return sites
def add_station_to_sitestxt(station):
# Read the sites.txt
sites = read_sitestxt(settings.SITES_TXT)
if '7{}'.format(station['id']) in sites:
# Station is already available
return
else:
with open(settings.SITES_TXT, 'a') as f:
f.write('7{} {} {} {} {} {}\n'.format(station['id'],
'SN',
station['lat'],
station['lng'],
station['alt'],
station['name']))
def tabulation_helper(observation_id):
# Fetch waterfall image and observation data
observation = fetch_observation_data([observation_id])[0]
tle = fetch_tle_of_observation(observation_id)
result = requests.get(observation['waterfall'])
image = Image.open(BytesIO(result.content))
waterfall_matrix = np.array(image)
epoch_start = datetime.strptime(observation['start'], '%Y-%m-%dT%H:%M:%SZ')
epoch_end = datetime.strptime(observation['end'], '%Y-%m-%dT%H:%M:%SZ')
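    # STRF site id convention: SatNOGS station id prefixed with a 7 (see GUIDE.md, Known issues)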
site_id = int("7" + str(observation['ground_station']))
station = {'lat': observation['station_lat'],
'lng': observation['station_lng'],
'alt': observation['station_alt'],
'name': "SatNOGS No.{}".format(observation['ground_station']),
'id': observation['ground_station']}
# Store TLE to file
with open('{}/{}.txt'.format(settings.TLE_DIR, observation_id), 'w') as f:
f.write('0 OBJECT\n')
for line in tle:
f.write(f'{line}\n')
# Add SatNOGS station to sites.txt
add_station_to_sitestxt(station)
    # Set image parameters: pixel coordinates of the waterfall data area in the SatNOGS waterfall image
lower_left = (66, 1553)
upper_right = (686, 13)
# Set data parameters
bandwidth = 48e3 # Hz
duration = epoch_end - epoch_start
f = observation['transmitter_downlink_low']
drift = observation['transmitter_downlink_drift']
if drift:
f_center = f * (1 + drift / 1e9)
else:
f_center = f
filename_out = '{}/{}.dat'.format(settings.OBS_DIR, observation_id)
# Initialize SGP4 propagator / pyephem
satellite = ephem.readtle('sat', tle[0], tle[1])
observer = ephem.Observer()
observer.lat = str(station['lat'])
observer.lon = str(station['lng'])
observer.elevation = station['alt']
def remove_doppler_correction(t, freq):
# Remove the doppler correction
observer.date = t
satellite.compute(observer)
v = satellite.range_velocity
df = f_center * v / const.c.value
return f_center + freq - df
tabulation_helper_dialog(lower_left, upper_right,
bandwidth, duration,
f_center, filename_out,
waterfall_matrix,
epoch_start, site_id, correction_method=remove_doppler_correction)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Interactive helper to tabulate signals from SatNOGS waterfalls.')
parser.add_argument('observation_ids', metavar='ID', type=int, nargs='+',
help='SatNOGS Observation ID')
args = parser.parse_args()
for observation_id in args.observation_ids:
tabulation_helper(observation_id)

19
contrib/settings.py

@@ -0,0 +1,19 @@
from decouple import config
# Path to TLE
# Filename convention: {TLE_DIR}/{observation_id}.txt
TLE_DIR = config('SATNOGS_TLE_DIR')
# Absolute frequency measurement storage
# Filename convention: {DOPPLER_OBS_DIR}/{observation_id}.dat
DOPPLER_OBS_DIR = config('SATNOGS_DOPPLER_OBS_DIR')
# legacy name:
OBS_DIR = DOPPLER_OBS_DIR
# SATTOOLS/STRF/STVID sites.txt file
SITES_TXT = config('SATNOGS_SITES_TXT')
SATNOGS_NETWORK_API_URL = config('SATNOGS_NETWORK_API_URL')
SATNOGS_DB_API_URL = config('SATNOGS_DB_API_URL')
SATNOGS_DB_API_TOKEN = config('SATNOGS_DB_API_TOKEN')


@@ -0,0 +1,7 @@
#!/usr/bin/bash
DB_API_TOKEN="4f20a493a3f5fd85074d61db944365bf94987613"
URL="https://db-satnogs.freetls.fastly.net/media/artifacts/b4975058-04eb-4ab7-9c40-9bcce76d94db.h5"
echo "Authorization: Token $DB_API_TOKEN"
curl -H "Authorization: Token $DB_API_TOKEN" "$URL"

17
env-dist 100644

@@ -0,0 +1,17 @@
# "New" Python STRF environment variables
# SATNOGS_DIR is used by config.sh for STRF configuration
SATNOGS_DIR=/home/pi/satnogs_data
# SatNOGS Waterfall Tabulation Helper
export SATNOGS_TLE_DIR=/home/pi/satnogs_data/tles/satnogs
export SATNOGS_ARTIFACTS_DIR=/home/pi/satnogs_data/artifacts
export SATNOGS_DOPPLER_OBS_DIR=/home/pi/satnogs_data/doppler_obs
SATNOGS_SITES_TXT=$SATNOGS_DIR/sites.txt
# SatNOGS Artifacts Helpers
SATNOGS_NETWORK_API_URL=https://network.satnogs.org/api/
SATNOGS_DB_API_URL=https://db.satnogs.org/api/
SATNOGS_DB_API_TOKEN=your-db-api-token