What is the problem the feature request solves?
In org.apache.spark.sql.comet.CometNativeScanExec#isAdditionallySupported we currently return false for Array types, so we fall back to Spark's scan if the Parquet file contains arrays.

I tried modifying this method to return true for Arrays as long as the element type is supported; the change was roughly along the lines of the sketch below.
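This is only an illustrative sketch, not the actual Comet code: the real method in CometNativeScanExec may have a different signature, and isElementTypeSupported is a hypothetical stand-in for whatever per-type check Comet already applies to scalar fields.

```scala
import org.apache.spark.sql.types._

// Hypothetical helper: stands in for Comet's existing supported-type check.
def isElementTypeSupported(dt: DataType): Boolean = dt match {
  case BooleanType | ByteType | ShortType | IntegerType | LongType |
       FloatType | DoubleType | StringType | BinaryType => true
  case _ => false
}

// Sketch of the attempted change: instead of unconditionally returning false
// for ArrayType, accept arrays whose element type is itself supported.
def isAdditionallySupported(dt: DataType): Boolean = dt match {
  case ArrayType(elementType, _) => isElementTypeSupported(elementType)
  case _ => false
}
```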
With that change in place, I saw this error:

Cannot cast file schema field c13 of type List(Field { name: "element", data_type: Boolean, nullable: true, dict_id: 0, dict_is_ordered: false, metadata: {} }) to required schema field of type List(Field { name: "item", data_type: Boolean, nullable: true, dict_id: 0, dict_is_ordered: false, metadata: {} })

For readability, the from and to types are:

from: List(Field { name: "element", data_type: Boolean, nullable: true, dict_id: 0, dict_is_ordered: false, metadata: {} })
to: List(Field { name: "item", data_type: Boolean, nullable: true, dict_id: 0, dict_is_ordered: false, metadata: {} })

The field name is different but the type is the same, so the cast should be supported (and be a no-op).
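A Parquet file that should exercise this path can be produced from Spark itself; Spark writes list elements under the name "element" (following the Parquet LIST convention), which matches the file schema in the error above. The following is a hypothetical repro sketch for spark-shell: the column name c13 just mirrors the error message, and the config keys are the usual ones for enabling Comet, included as assumptions rather than taken from this issue.

```scala
// Run in spark-shell with the Comet plugin on the classpath; `spark` is the
// session provided by the shell.
spark.conf.set("spark.comet.enabled", "true")
spark.conf.set("spark.comet.exec.enabled", "true")

// Write a Parquet file containing a boolean array column.
spark.range(10)
  .selectExpr("array(id % 2 = 0, id % 3 = 0) AS c13")
  .write.mode("overwrite").parquet("/tmp/bool_array_repro")

// Reading it back currently falls back to Spark's scan; with a change like the
// sketch above, it would be expected to hit the cast error instead.
spark.read.parquet("/tmp/bool_array_repro").show()
```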
Describe the potential solution
No response
Additional context
No response