Expand tests for Spark external tables #52

Open
2 tasks
Tracked by #260
jtcohen6 opened this issue Nov 24, 2020 · 0 comments
Labels: enhancement, spark, tech_debt

Comments

jtcohen6 (Collaborator) commented Nov 24, 2020

(split off from #10)

Not currently tested:

  • ndjson (s3://dbt-external-tables-testing/json/); see the sketch after this list
  • Hive-formatted external tables, since the SQL Endpoint returns this error:
('42000', '[42000] [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server: Error running query: org.apache.spark.sql.AnalysisException: Cannot use this data source. Only csv,json,avro,delta,parquet,orc,text data sources are supported on SQL Gateway.;;\nSubqueryAlias spark_catalog.dbt_jcohen.people_csv_unpartitioned_hive_format\n+- HiveTableRelation `dbt_jcohen`.`people_csv_unpartitioned_hive_format`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [id#2612, f\x00 (80) (SQLExecDirectW)')
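For context, here is a minimal sketch of what source definitions for these two untested cases might look like in the integration-test fixtures. This assumes the package's Spark external config accepts `location`, `using`, `row_format`, and `file_format` keys; the source name, column lists, and the csv path are placeholders I've invented for illustration, while the json S3 path and the `people_csv_unpartitioned_hive_format` table name come from this issue.

```yaml
version: 2

sources:
  - name: external_source  # placeholder name, not from this issue
    tables:
      # ndjson case: Spark's `json` data source reads newline-delimited JSON
      - name: people_ndjson_unpartitioned
        external:
          location: "s3://dbt-external-tables-testing/json/"  # path quoted in this issue
          using: json
        columns:
          - name: id
            data_type: int
          - name: first_name
            data_type: string

      # Hive-format case: the table named in the AnalysisException above,
      # with an assumed location and an assumed delimited row format
      - name: people_csv_unpartitioned_hive_format
        external:
          location: "s3://dbt-external-tables-testing/csv/"  # assumed, not confirmed here
          row_format: "delimited fields terminated by ','"
          file_format: textfile
        columns:
          - name: id
            data_type: int
```

If the restriction in the error above still holds ("Only csv,json,avro,delta,parquet,orc,text data sources are supported on SQL Gateway"), the Hive-format case would presumably need to run against a cluster endpoint rather than the SQL Endpoint, or be skipped there.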
jtcohen6 added the enhancement and spark labels on Nov 24, 2020
dataders mentioned this issue on Mar 8, 2024
Projects: None yet
Development: No branches or pull requests
Participants: 2