Replies: 2 comments
-
It's an optional field, so that's something Google needs to address in BigQuery.
-
@Petterhg, as @ion-elgreco mentioned, #2926 does ensure that the …
-
Environment
Delta-rs version:
delta-rs 0.20.0
Binding: Python
Environment: AWS (S3 storage, DynamoDB locking provider)
Bug
What happened:
I'm running the Python client, with DynamoDB as the locking provider and S3 as storage. To catch schema changes, I have logic like the following, which then propagates the schema change to BigQuery (which consumes the tables):
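The original snippet was not included in the report; the following is a minimal, hypothetical sketch of that kind of check. The helper name and the dict-based schema representation are assumptions, not taken from the report (in the real code the existing schema would come from `DeltaTable.schema()` and the incoming one from the DataFrame being written).

```python
# Hypothetical stand-in for the schema-change check described above.
# Both schemas are represented here as {column_name: dtype_string} dicts;
# the real code would derive them from the Delta table and the incoming data.

def schema_changed(existing, incoming):
    """Return True when the incoming schema differs from the stored one."""
    return existing != incoming

# Example: a new column appears in the incoming data.
old = {"id": "long", "name": "string"}
new = {"id": "long", "name": "string", "created_at": "timestamp"}

if schema_changed(old, new):
    # At this point the reported logic would push the updated schema to
    # BigQuery before writing the new data.
    pass
```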
I THINK the error is happening when the engine fails to write the items: it then creates a log entry in S3 with no creationTime (like the one below), which BigQuery is unable to parse, and then the complete table fails to load. I don't know whether this is expected behaviour from the library and it's just BigQuery not reading the metadata correctly, or whether this actually is a bug in the library.
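The actual log entry was not reproduced in the scrape; the JSON below is a hypothetical illustration of a Delta log `metaData` action in which the optional `createdTime` field (the field the report calls creationTime) is simply absent, which matches the reply above noting that the field is optional in the protocol. The `id` and `schemaString` values are placeholders.

```json
{
  "metaData": {
    "id": "00000000-0000-0000-0000-000000000000",
    "format": { "provider": "parquet", "options": {} },
    "schemaString": "{\"type\":\"struct\",\"fields\":[]}",
    "partitionColumns": [],
    "configuration": {}
  }
}
```

Note that `createdTime` does not appear at all, rather than being null.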
I CAN read the table when using the Python client, just not with BigQuery. Any advice here would be highly appreciated.
What you expected to happen:
How to reproduce it:
More details: