
Updated data, Add rough average # of sats, add support for N threads and N degrees with wrapper script #22

Open · wants to merge 6 commits into base: master
4 changes: 4 additions & 0 deletions .gitignore
@@ -0,0 +1,4 @@
h3_4_*.txt
__pycache__
tle_cache
archive
11 changes: 11 additions & 0 deletions Developing.md
@@ -3,6 +3,17 @@ This pile of scripts grew pretty organically as I was testing and messing around
organization of a lot of the code. If you want to generate the data yourself I will try and explain what is necessary here.

## Dependencies
You currently need a few system dependencies:
```
apt install -y parallel libgeos-c1v5 libgeos-3.8.0 libgeos-dev proj-data proj-bin libproj15 libproj-dev
```

## Running

```
make install-deps
make generate-coverage
```

### Python
These are the minimal dependencies; any code that uses cartopy can be removed or ignored if you're not debugging like I was
9 changes: 9 additions & 0 deletions Makefile
@@ -0,0 +1,9 @@
CURRENT_DATE = $(shell date -u +%F)

.PHONY: install-deps generate-coverage

install-deps:
pip3 install -r requirements.txt

generate-coverage:
./gen_cov.sh
192 changes: 96 additions & 96 deletions README.md
@@ -1,96 +1,96 @@
# Goals:

1. % of day covered
2. Average coverage - factoring in the number of satellites covering a location, which is more useful for approximating the average bandwidth available to an area

## Original Implementation

1. TLE to simulate satellites
2. Altitude,x,y to conical section on sphere
1. Adjustable minimum elevation
2. Swap out for more accurate model of earth (WGS-84 ellipsoid)
3. Map conical section to equi-rectangular area
4. Translate to pixel coords with resolution of about 100sq mi
5. Normalize? Overlay

For the core there is this naive implementation:

```
for each time step:
for each pixel:
convert to lat/long
for each sat:
if visible
increase counter for pixel
```

This seems like a pretty gross way to do it. We touch every pixel \* numSats times, when we could reduce that to
numSats \* visible area. However, the math to calculate the visible area for a satellite doesn't seem to
be part of any library I've found. This is even more relevant with VLEO satellites since they cover
so much less area. I'm also not sure how efficient it is to go from a sphere to lat/long
on an equirectangular projection.
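
To make the cost concrete, here is a runnable sketch of the naive loop on a toy grid (the grid size, the cap-style visibility test, and the example satellite are illustrative, not the real implementation):

```python
import math

def visible(sat, lat, lon):
    # A satellite "covers" a point if the great-circle angle between the
    # sub-satellite point and the ground point is within the footprint radius.
    slat, slon, radius_deg = sat
    a, b = math.radians(lat), math.radians(slat)
    dlon = math.radians(lon - slon)
    cos_angle = math.sin(a) * math.sin(b) + math.cos(a) * math.cos(b) * math.cos(dlon)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle)))) <= radius_deg

def naive_step(counts, sats, width, height):
    # The naive core: touch every pixel for every satellite,
    # i.e. O(width * height * numSats) per time step.
    for py in range(height):
        for px in range(width):
            lat = 90.0 - (py + 0.5) * 180.0 / height
            lon = -180.0 + (px + 0.5) * 360.0 / width
            for sat in sats:
                if visible(sat, lat, lon):
                    counts[py][px] += 1

# Toy run: a 10-degree grid and one satellite over (0, 0) with an
# 8-degree footprint radius; only the 4 nearest pixel centers fall inside.
counts = [[0] * 36 for _ in range(18)]
naive_step(counts, [(0.0, 0.0, 8.0)], 36, 18)
```

Even on this tiny grid every pixel is tested against every satellite, which is the waste the cell-based approaches below avoid.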

## Resolution

Originally I had 2 options for spatial resolution in mind:
- 2048 x 1024 = 2,097,152
- 4096 x 2048 = 8,388,608

Instead of trying to use a 2d array to represent the footprints and dealing with the spherical nature
of the Earth, I am instead going to try using S2 level 9 cells to track coverage. According to the
statistics page of the S2 library, that is about 1,537,000 cells, so I would need about 12MB of data to
track the whole Earth. Each cell covers 324.29km^2 on average.
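
A quick sanity check on the 12MB figure (assuming one 8-byte counter per cell, which is a guess about the storage layout):

```python
cells = 1_537_000        # S2 level-9 cell count from the S2 statistics page
counter_bytes = 8        # assume one 64-bit counter per cell
total_mb = cells * counter_bytes / 1e6
# about 12.3 MB -- consistent with the ~12MB estimate above
```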

For temporal resolution: these satellites move across a location quite quickly. Using a website like https://findstarlink.com
we can find that a whole "train" of Starlink satellites crosses a location's view in about 4 or 5 minutes,
so our temporal resolution needs to be a minute, if not faster, if we want accurate results.
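
To put a number on that: at one-minute resolution a simulated day is 1,440 time steps, which is why the per-step cost discussed in the S2 section matters so much:

```python
steps_per_day = 24 * 60       # one-minute temporal resolution
s2_seconds_per_step = 90      # single-core S2 covering time from the S2 section
total_hours = steps_per_day * s2_seconds_per_step / 3600
# 1,440 steps at ~90 s each is 36 hours of single-core compute per simulated day
```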

## S2 based approach

The original implementation has lots of gross properties and requires dealing with many projections.
So I tried a new approach using roughly equal-area tessellated grids. The one I knew about was S2, so I tried that first.
```
for each time step:
for each sat:
calculate footprint
get cells to cover footprint
for each cell:
increment counter
```

This means we only have to iterate over ~2000 (+/-600?) cells per satellite instead of ~2 million
points each. However, this approach seems too slow: computing all the covering cells of each satellite
takes about 90s on a single core in Python. While this could be parallelized, I think there may
be even smarter approaches to consider. Could we just store S2Cap objects and test points
against them to count the amount of coverage? What I really like about S2 is that it handles
latitude and longitude meaning different distances at the equator vs the poles.

In the end for plotting, if we used the cell vertices as plotting locations we would cover the whole
globe without oversampling at the poles or undersampling at the equator.

## S2, more like 2Slow... An H3 approach

So S2 was just way too slow at returning cells for a given covering (90s to simulate one timestep).
I have to make some guesses about projections for H3, but it is ~4.5 times faster to simulate
a timestep than S2 (now 20 seconds on my machine). I'm going to proceed with the H3 approach for now.

## Used constants

- Earth Mean Equatorial Radius = 6,378.1km
- Minimum angle for user terminals = 35deg

(It has recently come to my attention that there may be newer data from SpaceX that the minimum user terminal angle would be 25 degrees, so my data may be more conservative. The data I used is from the FCC document linked below.)

## Starlink details

From the second reference below I found that Starlink is targeting a minimum user terminal angle of 35deg.
This means that each satellite covers around 600,000km^2 at an altitude of about 340km. However, most of
the Starlink satellites already in space are raising their orbits to, or are already at, 550 km.
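
The ~600,000km^2 figure checks out with basic spherical-cap geometry, using the constants above:

```python
import math

R = 6378.1                    # Earth mean equatorial radius, km (constant above)
h = 340.0                     # satellite altitude, km
elev = math.radians(35.0)     # minimum user-terminal elevation

# Earth-central angle from the sub-satellite point to the edge of coverage:
# lam = acos(R/(R+h) * cos(elev)) - elev
lam = math.acos(R / (R + h) * math.cos(elev)) - elev

# Area of the spherical cap the satellite can serve.
area = 2 * math.pi * R**2 * (1 - math.cos(lam))
# roughly 6.1e5 km^2, matching the "around 600,000km^2" figure above
```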

Starlink TLEs:
https://celestrak.com/NORAD/elements/starlink.txt

### References

- https://arxiv.org/pdf/1906.12318.pdf
- https://licensing.fcc.gov/myibfs/download.do?attachment_key=1190019
32 changes: 28 additions & 4 deletions gen_cov.sh
100644 → 100755
@@ -1,5 +1,29 @@
date
seq 0 3 | parallel -j4 "./main.py {}"
#!/bin/bash

NUMTHREADS=16

echo "Starting at `date -Iminutes`"
#cleanup and prep
rm -f h3_4_*.txt
mkdir -p archive

#generate index
./gen_h3_index.py

for i in 25 30 35; do
echo "Starting $i degrees at `date -Iminutes`"
seq 0 $((NUMTHREADS-1)) | parallel -j$NUMTHREADS "./main.py {} $NUMTHREADS $i"
./merge_cover.py $NUMTHREADS
mv h3_4_cov_full.txt archive/h3_4_cov_op_`date -u +%F`_$i.txt
mv h3_4_cov_op.bin h3_4_cov_op_$i.bin
cp h3_4_cov_op_$i.bin archive/h3_4_cov_op_`date -u +%F`_$i.bin
rm h3_4_cov_*.txt
echo "Finished $i degrees at `date -Iminutes`"
done

#archive tle for today
mv tle_cache/starlink.txt tle_cache/starlink-`date -u +%F`.txt

#cleanup
rm h3_4_*.txt
echo "Done at `date -Iminutes`"
Empty file modified gen_h3_index.py
100644 → 100755