What happened:
I'm trying to delete some data from a delta table in Rust, using with_predicate.
let (table,_) = DeltaOps(table).delete().with_predicate(col("fechabloqueo").eq(cast(lit("2022-11-06"),Date32))).await.unwrap();
but I'm getting "Execution error: Failed to map column projection for field fechacreacion. Incompatible data types Timestamp(Nanosecond, None) and Timestamp(Microsecond, None)"
What you expected to happen:
The delete to succeed with that condition.
How to reproduce it:
More details:
The table was created using default INT96 outputTimestampType option.
If I rewrite the table using spark.conf.set("spark.sql.parquet.outputTimestampType", "TIMESTAMP_MICROS"), it still fails: "Execution error: Failed to map column projection for field fechacreacion. Incompatible data types Timestamp(Microsecond, Some(\"UTC\")) and Timestamp(Microsecond, None)"
This is caused by ambiguity in how timestamps are stored in the serialization section of the protocol. The older version of the protocol specified that timestamp columns did not specify a timezone, but this change has modified them to be adjusted to UTC.
I did some local testing. Updating the code here to include the UTC timezone, and performing casts in the user-supplied query (e.g. cast(lit("2022-11-06"), arrow_schema::DataType::Timestamp(arrow_schema::TimeUnit::Microsecond, Some("UTC".into())))), will fix this issue. I'm just not certain whether this change is a breaking change.
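A minimal sketch of the cast workaround described above, assuming delta-rs 0.12 with its datafusion and arrow_schema dependencies in scope; the function name, table path, and error handling here are hypothetical scaffolding, while the column name fechabloqueo comes from the report:

```rust
// Sketch only: crate paths and re-exports may differ between delta-rs versions.
use arrow_schema::{DataType, TimeUnit};
use datafusion::prelude::{cast, col, lit};
use deltalake::{open_table, DeltaOps, DeltaTableError};

async fn delete_by_date(path: &str) -> Result<(), DeltaTableError> {
    let table = open_table(path).await?;
    // Cast the date literal to a UTC-adjusted microsecond timestamp so it
    // matches the column type derived from the (UTC-adjusted) protocol,
    // instead of Date32, which triggers the projection-mapping error.
    let predicate = col("fechabloqueo").eq(cast(
        lit("2022-11-06"),
        DataType::Timestamp(TimeUnit::Microsecond, Some("UTC".into())),
    ));
    let (_table, _metrics) = DeltaOps(table).delete().with_predicate(predicate).await?;
    Ok(())
}
```

Whether this is sufficient on its own, or the library-side change to attach the UTC timezone is also needed, is what the breaking-change question above is about.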
Environment
Delta-rs version: 0.12.0
Binding: Rust
Environment: dev