@Petterhg as @ion-elgreco mentioned, createdTime is optional, but unfortunately some Delta implementations do not treat optional fields as actually optional 😭
#2926 does ensure that createdTime gets set whenever a new metadata action is created, such as on schema evolution, so perhaps once I release 0.20.2 you could modify the schema of that table, or recreate it, to make it readable by BigQuery?
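Until a release with that fix lands, one possible stopgap (my own assumption, not an official recommendation from the maintainers) is to rewrite the offending commit file with a createdTime filled in. The JSON surgery itself is simple; the risk is entirely in touching `_delta_log` by hand, so only ever try this on a copy of the table:

```python
import json
import time


def patch_created_time(commit_text, created_time_ms=None):
    """Add createdTime to any metaData action in a commit that lacks it.

    Hypothetical stopgap only: editing _delta_log files manually is risky
    and should be done on a copy of the table, never on live data.
    """
    if created_time_ms is None:
        created_time_ms = int(time.time() * 1000)
    patched = []
    for line in commit_text.splitlines():
        action = json.loads(line)
        meta = action.get("metaData")
        if meta is not None and "createdTime" not in meta:
            meta["createdTime"] = created_time_ms
        patched.append(json.dumps(action, separators=(",", ":")))
    return "\n".join(patched)


# Illustrative commit line (not the actual log from this report):
raw = ('{"metaData":{"id":"aaaa","format":{"provider":"parquet","options":{}},'
       '"schemaString":"{}","partitionColumns":[],"configuration":{}}}')
print(patch_created_time(raw, created_time_ms=1700000000000))
```

The patched line keeps every existing key and appends a `createdTime` in epoch milliseconds, which is the representation the Delta protocol uses for that field.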
Environment
Delta-rs version: 0.20.0
Binding: python
Environment: AWS (S3 storage, DynamoDB lock provider)
Bug
What happened:
I'm running the Python client, with DynamoDB as the locking provider and S3 as storage. To catch schema changes, I have logic like the following, which then propagates the change to BigQuery (which consumes the tables):
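The original snippet was not captured in this report. As a minimal sketch of the kind of check described, here is a field-by-field comparison of two Delta schema JSON documents (the shape stored in the metaData action's schemaString); in the real pipeline the current schema would come from something like `DeltaTable("s3://bucket/table").schema()`, and the helper name here is hypothetical:

```python
import json


def schema_changed(previous_schema_json, current_schema_json):
    """Compare two Delta schema JSON documents field by field.

    The JSON shape ({"type": "struct", "fields": [...]}) is the one Delta
    stores in the metaData action's schemaString.
    """
    prev = json.loads(previous_schema_json)
    curr = json.loads(current_schema_json)
    prev_fields = {f["name"]: f["type"] for f in prev["fields"]}
    curr_fields = {f["name"]: f["type"] for f in curr["fields"]}
    return prev_fields != curr_fields


# Two inline sample schemas stand in for the stored and current schema.
old = ('{"type":"struct","fields":['
       '{"name":"id","type":"long","nullable":true,"metadata":{}}]}')
new = ('{"type":"struct","fields":['
       '{"name":"id","type":"long","nullable":true,"metadata":{}},'
       '{"name":"added_col","type":"string","nullable":true,"metadata":{}}]}')

print(schema_changed(old, old))  # False: nothing to propagate
print(schema_changed(old, new))  # True: trigger the BigQuery schema update
```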
I THINK the error happens when the engine fails to write the items: it then creates a log entry in S3 with no createdTime, like the one below, which BigQuery is unable to parse, and then the whole table fails to load. I don't know whether this is expected behaviour from the library and it's just BigQuery that is not reading the metadata correctly, or whether this is actually a bug in the library.
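The offending log entry itself was not captured above. For illustration, a commit whose metaData action lacks createdTime can be detected with a stdlib-only check like this (the commit contents here are a hypothetical example, not the actual log from this table):

```python
import json

# Hypothetical commit contents: one metaData action without createdTime,
# which is the shape BigQuery reportedly fails to parse.
commit_lines = [
    '{"metaData":{"id":"aaaa-bbbb","format":{"provider":"parquet","options":{}},'
    '"schemaString":"{\\"type\\":\\"struct\\",\\"fields\\":[]}",'
    '"partitionColumns":[],"configuration":{}}}',
]


def metadata_missing_created_time(lines):
    """Return True if any metaData action in a commit lacks createdTime."""
    for line in lines:
        action = json.loads(line)
        meta = action.get("metaData")
        if meta is not None and "createdTime" not in meta:
            return True
    return False


print(metadata_missing_created_time(commit_lines))  # True
```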
I CAN read the table when using the Python client, just not with BigQuery. Any advice here would be highly appreciated.
What you expected to happen:
How to reproduce it:
More details: