A proof-of-concept that uses gopro-telemetry to automatically adjust roll, pitch, and yaw in processed equirectangular videos.
Read this post for a bit more information about our thought processes in building this: https://www.trekview.org/blog/2022/calculating-heading-of-gopro-video-using-gpmf-part-1/
- First extract the telemetry file (as json) from your GoPro video using gopro-telemetry. Detailed instructions about how to do this can be found in this post.
- Install the required packages:

```shell
pip3 install -r requirements.txt
```
- Run the script with the following arguments:

```shell
python3 main.py [.json telemetry file] [--plot] [--video_input] [--mode]
```

- `--plot`: optional; only pass a value when plots are required
- `--video_input`: optional; the input video, only needed if you want to modify a video
- `--mode`: either `unworldlock`, `level_roll`, or `level_pitch` (see examples)
This script will not work if the camera is upside down, i.e. with roll between 90 and 180 degrees or between -180 and -90 degrees.
The sample videos used in this readme are available here.
1. To update the .json file with new data:

```shell
python3 main.py docs/GS010013-worldlock.json
```

2. To update the .json file with new data and create plots of the roll, pitch, yaw, and magnetic heading:

```shell
python3 main.py docs/GS010013-worldlock.json --plot true
```

3. To remove World Lock from a video:

```shell
python3 main.py docs/GS010013-worldlock.json --video_input docs/GS010013-worldlock.mp4 --mode unworldlock
```

4. To level the roll in a video (and create plots):

```shell
python3 main.py docs/GS010011-roll.json --plot true --video_input docs/GS010011.mp4 --mode level_roll
```

5. To level the pitch in a video:

```shell
python3 main.py docs/GS010010-pitch.json --video_input docs/GS010010.mp4 --mode level_pitch
```
This script has been tested and confirmed working for:
- GoPro MAX running firmware: H19.03.02.00.00 (shows on camera LCD as 02.00)
Camera orientation (`CORI`) is a relative measurement (the orientation relative to the orientation the sensor had when the acquisition started). It is reported in quaternions in the order w, x, y, z in gopro-telemetry.
"CORI":{
"samples":[{
"value":[0.9989318521683401,-0.024964140751365705,0.02621539963988159,0.029206213568529312],
"cts":176.62,
"date":"2022-05-26T08:35:42.485Z",
"sticky":{
"VPTS":1261037}
},
To calculate yaw, pitch, and roll values from this data, we take the four quaternion values (Euler parameters) and convert them into Euler angles for each axis.
The equations to do this are somewhat complex, as you'll see from a cursory scan of the Wikipedia article Conversion between quaternions and Euler angles, or by examining the code in `main.py`.
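As a sketch of that conversion (not necessarily the exact implementation in `main.py`), the standard w, x, y, z quaternion-to-Euler equations look like this in Python. Note this repo maps roll, pitch, yaw to the (y, x, z) axes, so the axis naming in `main.py` may differ:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a w,x,y,z quaternion into Euler angles (radians)."""
    # rotation about the x axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # rotation about the y axis; clamp to avoid domain errors from rounding
    sinp = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # rotation about the z axis
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw

# the CORI sample shown above (near the identity, so all angles are small)
roll, pitch, yaw = quaternion_to_euler(
    0.9989318521683401, -0.024964140751365705,
    0.02621539963988159, 0.029206213568529312)
```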
Values from the magnetometer are reported in the axis order z, x, y, in microteslas, in gopro-telemetry.
"MAGN":{
"samples":[{
"value":[-4,88,27],
"cts":163.461,
"date":"2022-05-26T08:35:42.485Z"
},
To calculate heading values we first sync `MAGN` samples with their closest `CORI` sample.
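A minimal way to do this pairing by `cts` timestamp (a sketch, not necessarily the repo's exact matching logic, using made-up sample values):

```python
def nearest_cori(magn_samples, cori_samples):
    """Pair each MAGN sample with the CORI sample whose cts is closest."""
    pairs = []
    for m in magn_samples:
        closest = min(cori_samples, key=lambda c: abs(c["cts"] - m["cts"]))
        pairs.append((m, closest))
    return pairs

# illustrative fragments of the two streams
magn = [{"cts": 163.461, "value": [-4, 88, 27]}]
cori = [{"cts": 176.62, "value": [0.999, -0.025, 0.026, 0.029]},
        {"cts": 231.9, "value": [0.998, -0.030, 0.030, 0.030]}]
pairs = nearest_cori(magn, cori)
```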
Once the times have been synced and each `MAGN` sample has a corresponding `CORI` sample, we can calculate the magnetic heading using the formula:
```
Mx = mx * cos(p) + my * sin(p)
My = mx * cos(r) * sin(p) + my * cos(r) + mz * sin(r) * cos(p)
M_yaw = atan2(My, Mx)
```
Where:

- `mx` = magnetometer x reading
- `my` = magnetometer y reading
- `mz` = magnetometer z reading
- `r` = roll angle
- `p` = pitch angle
Be careful not to confuse `My` with `my`, or `Mx` with `mx` (they are different variables). For clarity: `my` is the magnetic field component in the y direction, while `My` is the output of the second equation, the approximately tilt-corrected y component of the magnetic field. The same explanation applies to `Mx` and `mx`.
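A direct transcription of the two equations above into Python (variable names are illustrative):

```python
import math

def magnetic_heading(mx, my, mz, r, p):
    """Magnetic heading from the formulas above; r and p in radians.
    Returns the heading in radians, -pi..pi, 0 = North."""
    Mx = mx * math.cos(p) + my * math.sin(p)
    My = (mx * math.cos(r) * math.sin(p)
          + my * math.cos(r)
          + mz * math.sin(r) * math.cos(p))
    return math.atan2(My, Mx)
```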
This script then writes out a new telemetry file (`INPUT-calculated.json`) with the following values:

`RPYR`

- name: roll, pitch, yaw (y,x,z)
- units: radians
- cts: milliseconds since video start
- date: YYYY-MM-DDTHH:MM:SS.SSSZ

`RPYD`

- name: roll, pitch, yaw (y,x,z)
- units: degrees
- cts: milliseconds since video start
- date: YYYY-MM-DDTHH:MM:SS.SSSZ

`HEAR`

- name: magnetic heading
- units: radians
- cts: milliseconds since video start
- date: YYYY-MM-DDTHH:MM:SS.SSSZ

`HEAD`

- name: magnetic heading
- units: degrees
- cts: milliseconds since video start
- date: YYYY-MM-DDTHH:MM:SS.SSSZ
For reference, here's a sample of the first and last `HEAD` entries in a telemetry file to demonstrate the structure of the object:
"HEAD": {
"samples": [
{
"value": 2.4325704405863235,
"cts": 189.758,
"date": "2022-06-08T11:46:54.225Z"
},
...
{
"value": -1.9311572331906321,
"cts": 21121.995333333347,
"date": "2022-06-08T11:47:15.591Z"
}
],
"name": "magnetic heading",
"units": "degrees"
}
},
You can see the `-calculated.json` files, with all fields listed, in the `/docs` directory of this repository.
`HEAR`: values between -π and π (0 is North), in radians.

`HEAD`: values between 0 and 360 (0 is North, 90 is East, etc.), in degrees.
Graphs are shown below for the example roll, pitch, and yaw videos.

Roll: values between -180 and 180 (degrees).
Video input:
Command:
```shell
python3 main.py docs/GS010011-roll.json --plot true --video_input GS010011.mp4 --mode level_roll
```
Output:
Adjusted video:
Pitch: values between -90 and 90 (degrees).
Video input:
Command:
```shell
python3 main.py docs/GS010010-pitch.json --plot true --video_input docs/GS010010.mp4 --mode level_pitch
```
Output:
Adjusted video:
Yaw: values between -180 and 180 (degrees).
Command:
```shell
python3 main.py docs/GS010012-yaw.json --plot true
```
Output:
This proof of concept was developed with three use cases in mind.
World Lock fixes the heading of the video (so the video always faces the same compass heading).
One aim was to reverse the World Lock setting and show the video using the true heading of the camera's front lens.
To do this, we assume the first `HEAD` value to be the World Lock heading (i.e. the heading all the frames are fixed to).
Then all that's required is to subtract the World Lock heading from the true compass heading (reported in the telemetry) to get the yaw offset for the frame, and use OpenCV to modify the frame appropriately (although executed differently, the logic to do this is described in detail here).
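On an equirectangular frame, a yaw offset amounts to a horizontal pixel shift, since the full frame width spans 360 degrees. A rough NumPy sketch of that idea (hypothetical heading values; not the repo's implementation):

```python
import numpy as np

def apply_yaw_offset(frame, yaw_offset_deg):
    """Rotate an equirectangular frame about yaw by shifting pixels
    horizontally; 360 degrees corresponds to the full frame width."""
    h, w = frame.shape[:2]
    shift = int(round(yaw_offset_deg / 360.0 * w))
    return np.roll(frame, shift, axis=1)

# hypothetical values: World Lock heading = first HEAD value,
# true_heading = the HEAD value for the current frame
worldlock_heading = 139.4
true_heading = 145.0
frame = np.zeros((10, 360, 3), dtype=np.uint8)
adjusted = apply_yaw_offset(frame, true_heading - worldlock_heading)
```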
Video input:
```shell
python3 main.py docs/GS010013-worldlock.json --plot true --video_input docs/GS010013-worldlock.mp4 --mode unworldlock
```
Output:
Let's say your camera is mounted to a monopod and is pointing a few degrees in the wrong direction (perhaps your helmet mount isn't perfectly straight). In this case you can apply a fixed offset to the frames using ffmpeg's `v360` filter (no need for this script). Here is an example adjusting yaw by 3 degrees:

```shell
ffmpeg -i INPUT.mp4 -vf v360=e:e:yaw=3 OUTPUT.mp4
```
Roll in video can cause the horizon to sway from side to side. By leveling roll, you can keep the horizon level.
To do this, we assume roll (y) = 0 to be level. Any frame where roll does not equal 0 means the camera is rolling.
All that's then needed is to take the difference between the roll reported in the telemetry and 0 to get the roll offset for the frame and use ffmpeg to adjust accordingly.
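As a sketch of that offset calculation (made-up `RPYD`-style samples, not output from a real file):

```python
# hypothetical RPYD-style samples (degrees); per this repo's ordering,
# the "value" list is (roll, pitch, yaw)
rpyd_samples = [
    {"value": [2.5, 0.1, 90.0], "cts": 0.0},
    {"value": [-1.2, 0.3, 91.0], "cts": 33.3},
]

# level each frame by counter-rotating: offset = 0 - reported roll
roll_offsets = [0.0 - s["value"][0] for s in rpyd_samples]
```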
This one was more for fun. We didn't really have a true use-case for it, but wanted to include it anyway.
Similar to roll, we assume pitch (x) = 0 to be level. Any frame where pitch does not equal 0 means the camera is pitching.
All that's then needed is to take the difference between the pitch reported in the telemetry and 0 to get the pitch offset for the frame and use ffmpeg to adjust accordingly.
Community support available on Discord: https://discord.gg/ZVk7h9hCfw
The code of this site is licensed under an MIT License.