NPI-3446 Fix SP3 parse and write of data section flags, plus address CLK and POS nodata causing misalignment #45
Conversation
treefern
commented
Aug 13, 2024
- Fixed incorrect width of the unused columns between data flag values (2 instead of 1). This allows the second prediction flag to be read in.
- Consolidated definitions of columns for maintainability.
- Updated column/index definitions to clearly define a FLAGS index, and specify names of individual flag columns.
- Fixed SP3 output generator to access and output FLAGS data.
- Addressed misalignment bug caused by the combination of CLK nodata value width, and Pandas adding a space between columns. Also applied this fix to POS nodata value, as it appears the same issue would apply there.
- Added handling for missing flag values (conforming them to empty strings to avoid outputting NaNs).
- Added placeholder TODOs for throwing exceptions if record types are encountered that we currently do not implement support for (EP, EV).
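As a rough illustration of the corrected flag layout, here is a minimal sketch in plain Python (column positions are taken from the SP3-c specification; the record line below is hypothetical, not from a real file, and this is not the PR's actual parsing code):

```python
# Hypothetical SP3 position record, built field-by-field so the
# 1-based column layout (per the SP3-c spec) is visible:
rec = (
    "P"               # col 1: record symbol
    "G01"             # cols 2-4: satellite ID
    "  10000.000000"  # cols 5-18: X (km)
    "  20000.000000"  # cols 19-32: Y (km)
    "  20000.000000"  # cols 33-46: Z (km)
    " 999999.999999"  # cols 47-60: clock (microsec), nodata value here
    " 18 18 18 219"   # cols 61-73: sdev fields
    " "               # col 74: unused, width 1 (not 2)
    "E"               # col 75: clock event flag
    "P"               # col 76: clock prediction flag
    "  "              # cols 77-78: unused
    "M"               # col 79: maneuver flag
    "P"               # col 80: orbit prediction flag
)

# 0-based slicing of the four flag columns; a blank flag is conformed
# to an empty string so it never round-trips as "nan".
flags = {name: rec[i].strip() for name, i in
         [("clock_event", 74), ("clock_pred", 75),
          ("maneuver", 78), ("orbit_pred", 79)]}
```

If the unused column at position 74 were read with width 2, every flag column would shift one character right and the orbit prediction flag at column 80 would fall off the end of the record.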
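A minimal sketch of the CLK nodata misalignment mechanism (field widths assumed from the SP3 format; this only models the single-space joining behaviour of pandas' `DataFrame.to_string()`, not the actual writer code):

```python
# DataFrame.to_string() joins columns with a single separator space.
# If the CLK nodata value is formatted at the full 14-char field width
# (with its own leading space), the separator adds a 15th character
# and every later column shifts right by one.
x_field = f"{10000.0:13.6f}"       # 13 chars; separator space makes 14
clk_nodata_bad = " 999999.999999"  # 14 chars including leading space
clk_nodata_fix = "999999.999999"   # 13 chars; separator restores the space

row_bad = " ".join([x_field, clk_nodata_bad])  # 28 chars: misaligned
row_fix = " ".join([x_field, clk_nodata_fix])  # 27 chars: aligned
```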
Running this branch in Python 3.12 via `from gnssanalysis.gn_io.sp3 import read_sp3, write_sp3` gives:
/data/review-repos/gnssanalysis/gnssanalysis/gn_io/blq.py:12: SyntaxWarning: invalid escape sequence '\/'
_RE_LLH = _re.compile(b"lon\/lat\:\s+([\d\.]+)+\s+([\d\.\-]+)\s+([\d\.\-]+)")
/data/review-repos/gnssanalysis/gnssanalysis/gn_io/trop.py:76: SyntaxWarning: invalid escape sequence '\s'
sep='\s+',
Not sure if we need to resolve these here, but something to take note of.
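If we do want to silence these warnings, the usual fix is a raw literal, so the backslashes reach the regex engine instead of being treated as (invalid) string escapes. A self-contained sketch for the blq.py pattern quoted above (using a plain `import re` rather than the module's `_re` alias):

```python
import re

# rb"..." (raw bytes literal): Python 3.12 no longer warns about
# "\/" and "\:", and the pattern the regex engine sees is unchanged.
_RE_LLH = re.compile(rb"lon\/lat\:\s+([\d\.]+)+\s+([\d\.\-]+)\s+([\d\.\-]+)")

# Hypothetical input line, for illustration only:
m = _RE_LLH.search(b"lon/lat: 151.2093 -33.8688 45.0")
```

The same applies to the trop.py case: `sep=r'\s+'` behaves identically to `sep='\s+'` but without the warning.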
Thanks for putting this PR together. It hits a lot of the issues we've been seeing recently in the Ginan-Ops space.
I found some errors when doing read/write cycles with example SP3 files. I think once these are resolved, this should be good to go in.
…avoid DataFrame.to_string() inserting spaces.
…d has same effect as default.
Tested with various SP3 files from different ACs (COD, GFZ, WHU, GA) and different types (ULT, RAP, FIN). Seems to work great now!
Thanks for putting through those changes yesterday.
I think this is now good to go in 👍
…ppy. (Note that it seems to still work without this)
Resolved with use of
…e caused a crash)
Additional commits to address the comment I made around the regex and to leave notes. These are minor changes that don't affect the overall SP3 changes, so happy for this to go in.