Local Processing of SnapperGPS #32
Replies: 10 comments 1 reply
How complex would it be to modify SnapperGPS-backend to read from a local directory instead of a database? This specific step should not be too complex; the original implementation did exactly that. Modifications would likely be limited to a single file, but within that file you would have to change a lot.
Additionally, what kind of processing power would be required for a laptop to handle this workload effectively? Originally, I targeted snappergps-backend at Linux servers with slow processors (1.x GHz) and lots of RAM (tens of GB). Processing a full dataset can take hours, but you can happily parallelise. However, a single instance runs fine on a desktop machine with 8 GB RAM total, too. The key is to reduce the batch size command line parameter. For anything with fewer resources, you would have to revisit the internals of the processing code. I am also not sure how the use of Windows vs. Linux would affect memory requirements.
1. Download .json data from the device. While it is possible to move this to serial-over-USB communication, I would suggest sticking to the existing WebUSB implementation. You can download JSON data using the Transfer data button here in the web app (which runs offline, too). If needed, you can find the underlying implementation here.
2. Convert .json files into .npy format (if necessary). I would suggest sticking to JSON.
3. Run processing algorithms pointing at a local directory. As written before, the original implementation reads data from a local directory. To replicate that for snappergps-backend, you can simply move the JSON files from step one to a local directory and point the backend there. In addition, you would need to keep track of which files you have already processed successfully, and you would need a routine that detects when new files are added. In general, the change would require quite a few modifications to process_queue. Basically, you would need to remove everything Postgres-related and replace it with reading from JSON. You can find some example code on how to read SnapperGPS' JSON format here, but it is written in JS.
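The bookkeeping described above (skip files that were already processed, pick up new ones) can be sketched in Python. Note that the JSON field names used below (`snapshots`, `timestamp`, `data`) are my assumptions about the export format, not a verified schema; compare them against an actual file downloaded from the web app before relying on this.

```python
import base64
import json
from pathlib import Path

PROCESSED_LOG = Path("processed.txt")  # hypothetical bookkeeping file


def already_processed() -> set:
    """Return the set of filenames we have finished with."""
    if PROCESSED_LOG.exists():
        return set(PROCESSED_LOG.read_text().splitlines())
    return set()


def mark_processed(name: str) -> None:
    """Append a filename to the bookkeeping file."""
    with PROCESSED_LOG.open("a") as f:
        f.write(name + "\n")


def pending_uploads(directory: str):
    """Yield (filename, timestamp, raw bytes) for unprocessed uploads.

    Field names below are assumptions about the SnapperGPS JSON export,
    not the actual schema.
    """
    done = already_processed()
    for path in sorted(Path(directory).glob("*.json")):
        if path.name in done:
            continue
        with path.open() as f:
            upload = json.load(f)
        for snapshot in upload.get("snapshots", []):
            raw = base64.b64decode(snapshot["data"])  # raw GNSS samples
            yield path.name, snapshot["timestamp"], raw
        mark_processed(path.name)
```

Running this function periodically (e.g., from a simple loop with a sleep, or a cron job) would cover the "detect when new files are added" requirement without needing filesystem watchers.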
3. (cont.) ... including downloading the appropriate ephemeris data. Downloading navigation data is handled by maintain_navigation_data, which runs independently from process_queue. You can keep it like this, but then make sure that it either runs persistently or is invoked with the -d flag to fill gaps from previously missing data. Alternatively, you can add this functionality to process_queue. E.g., if this check fails, you could invoke download_brdc from maintain_navigation_data. Either way, you would need to put your NASA credentials here, as described in the readme.
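The "download on demand" alternative could look roughly like this. The daily filename convention (`brdcDDD0.YYn`) is the standard one for GPS broadcast ephemeris in NASA's CDDIS archive, but the local cache layout and the signature of `download_brdc` are assumptions here; check maintain_navigation_data for the real interface.

```python
from datetime import date
from pathlib import Path

NAV_DIR = Path("navigation_data")  # assumed local cache directory


def brdc_path(day: date) -> Path:
    """Expected local path of the daily broadcast ephemeris file.

    Uses the standard naming scheme brdcDDD0.YYn (day-of-year, 2-digit year).
    """
    doy = day.timetuple().tm_yday
    return NAV_DIR / f"brdc{doy:03d}0.{day.year % 100:02d}n"


def ensure_navigation_data(day: date) -> Path:
    """Return the ephemeris file for `day`, downloading it if missing."""
    path = brdc_path(day)
    if not path.exists():
        # Hypothetical call; see download_brdc in maintain_navigation_data
        # for the real signature. Requires NASA Earthdata credentials.
        from maintain_navigation_data import download_brdc
        download_brdc(day)
    return path
```

Calling `ensure_navigation_data` once per snapshot date before processing would replace the persistent maintain_navigation_data service in a laptop workflow.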
4. Output a .csv file to a local directory. Basically, you want to modify the code in this section and write to CSV instead of a Postgres DB.
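A minimal sketch of the CSV side, assuming the processing step yields one estimated position per snapshot. The column names are illustrative, not the backend's actual schema:

```python
import csv


def write_positions(path, positions):
    """Write estimated fixes to a CSV file instead of a Postgres table.

    `positions` is an iterable of dicts; the keys below are illustrative,
    not the backend's actual output fields.
    """
    fields = ["datetime", "latitude", "longitude", "confidence"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for fix in positions:
            writer.writerow(fix)
```

Since CSV has no transactions, writing to a temporary file and renaming it at the end is a cheap way to avoid half-written tracks if the process is interrupted.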
Finally, you may want to have a look at the config file to see if there are any hyperparameters you want to change.
P.S.: I just pushed a small change to the backend that seems to improve results for very fast-moving receivers (e.g., birds flying with the wind): SnapperGPS/snappergps-backend@242d26a
I have been processing tracks using local processing. Initially things were going very well, but lately I have been running into issues: lots of low-confidence fixes, and tracks that are incomplete and appear to be missing sections of trips, in particular the return leg of the journey. I have tried changing the speed filter settings, but it doesn't seem to make a difference. The files were processed on the day of return; as I understand from the FAQ page on snappergps.info, this might result in reduced accuracy of the GNSS information. Do you have any advice for troubleshooting this? The devices show really good tracks on the outbound journey and have shown good tracks with single spot tests after deployment, suggesting this isn't a hardware issue.
Moved here from email.
Our goal is to run a fully local processing workflow on field laptops, allowing each field station to upload and process tracks independently. Ideally, we would continue to use a cloud service like NASA’s Earthdata for ephemeris data but avoid using PostgreSQL entirely. How complex would it be to modify SnapperGPS-backend to read from a local directory instead of a database?
Additionally, what kind of processing power would be required for a laptop to handle this workload effectively?
Our ideal workflow:
1. Download .json data from the device.
2. Convert .json files into .npy format (if necessary).
3. Run processing algorithms pointing at a local directory, including downloading the appropriate ephemeris data.
4. Output a .csv file to a local directory.
Would this be feasible with some modifications?