Default values not initialized when job loading data with the Python API #1495
Comments
I attempted to run this with google-cloud-bigquery 3.4.0 and 3.10.0.
Encountered the issue too, and this goes against what the documentation says here. Tried to insert a few rows with an INSERT INTO statement and the default value was correctly used. On the other hand, trying the same thing with a LOAD DATA SQL statement does not work either: it leaves the field missing from the JSON file as NULL instead of setting it to the default value.
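For concreteness, a minimal sketch of the two paths being compared; the table and bucket names here are placeholders, not from the original report:

```python
from google.cloud import bigquery

client = bigquery.Client()

# DML path: the omitted column picks up its declared default, as documented.
client.query(
    "INSERT INTO `my_project.my_dataset.my_table` (store_id) VALUES (1)"
).result()

# LOAD DATA path: the field missing from the JSON file stays NULL
# instead of receiving the default.
client.query(
    """
    LOAD DATA INTO `my_project.my_dataset.my_table`
    FROM FILES (
      format = 'JSON',  -- newline-delimited JSON
      uris = ['gs://my_bucket/rows.json']
    )
    """
).result()
```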
Facing the same issue using Parquet files from Cloud Storage and the `load_table_from_uri` function. Can provide a code sample if needed.

EDIT: code snippet:

```python
table_schema = [
    bigquery.SchemaField("store_id", "INTEGER"),
    bigquery.SchemaField("seller_id", "INTEGER"),
    bigquery.SchemaField(
        "rec_cre_tms",
        "TIMESTAMP",
        default_value_expression="CURRENT_TIMESTAMP()",
    ),
]
table_id = f"{project_id}.{dataset}.{table_name}"
table = bigquery.Table(table_id, schema=table_schema)
bigquery_client.create_table(table)
...
# The load job's schema deliberately omits rec_cre_tms,
# so BigQuery should fill it with its default.
job_schema = [
    bigquery.SchemaField("store_id", "INTEGER"),
    bigquery.SchemaField("seller_id", "INTEGER"),
]
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    schema=job_schema,
)
```
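For completeness, the load call this snippet leads up to might look like the following sketch; the URI is a placeholder:

```python
# Hypothetical continuation of the snippet above.
load_job = bigquery_client.load_table_from_uri(
    "gs://my_bucket/sales/*.parquet",  # placeholder URI
    table_id,
    job_config=job_config,
)
load_job.result()  # rec_cre_tms still comes back NULL, not CURRENT_TIMESTAMP()
```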
same issue here!
Same issue here; default values only get inserted correctly with an INSERT SQL statement.
same issue here |
When issuing a job with `load_table_from_json`, fields in the schema with a default value are not populated. I use the following table for testing purposes (schema shown in the code example below).

Note that I have tried with `mode="REQUIRED"`, but then the load simply fails. I have also tried creating the table before launching the job as well as letting the Python load job handle it: neither worked.
I don't have this issue when streaming instead of using jobs.
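For comparison, a minimal sketch of the streaming path, which per the report above does populate the defaults; the table reference and payload are illustrative only:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Streaming insert: the defaulted fields are omitted from the payload,
# and on this path the defaults are applied server-side.
errors = client.insert_rows_json(
    "my_project.my_dataset.default_test",  # placeholder table
    [{"name": "test"}],  # create_date and number omitted on purpose
)
assert not errors  # an empty list means the rows were accepted
```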
Environment details

google-cloud-bigquery version: 3.5.0

Steps to reproduce

Run the code example below: `create_date` and `number` are empty after the load job completes.

Code example
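The original snippet is not shown above, so here is a minimal sketch that reproduces the behaviour; the project/dataset names and the `name` column are assumptions, while `create_date` and `number` come from the description above:

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my_project.my_dataset.default_test"  # placeholder

# Table schema with default value expressions on two fields.
schema = [
    bigquery.SchemaField("name", "STRING"),
    bigquery.SchemaField(
        "create_date", "TIMESTAMP", default_value_expression="CURRENT_TIMESTAMP()"
    ),
    bigquery.SchemaField("number", "INTEGER", default_value_expression="1"),
]
client.create_table(bigquery.Table(table_id, schema=schema))

# The load job's schema omits the defaulted fields, so they should be
# filled server-side; instead, both columns end up NULL.
job_config = bigquery.LoadJobConfig(
    schema=[bigquery.SchemaField("name", "STRING")],
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
job = client.load_table_from_json(
    [{"name": "test"}], table_id, job_config=job_config
)
job.result()
```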
Stack trace
None. The insertion happens, but the default columns are not populated.