
[Bug] Dbeaver cannot see the catalog and table information in the Flink engine #2901

Open
waywtdcc opened this issue Jun 17, 2022 · 10 comments
Labels
kind:bug, priority:major

Comments

@waywtdcc
Contributor

Code of Conduct

Search before asking

  • I have searched in the issues and found no similar issues.

Describe the bug

DBeaver cannot see the catalog and table information in the Flink engine.
[screenshots attached]

Affects Version(s)

1.5.1

Kyuubi Server Log Output

13:21:26.199 INFO org.apache.kyuubi.operation.ExecuteStatement: Query[e3fc1c1d-a95d-48a3-af82-aa18a245cf64] in ERROR_STATE
13:21:26.199 INFO org.apache.kyuubi.operation.ExecuteStatement: Processing hdpu's query[e3fc1c1d-a95d-48a3-af82-aa18a245cf64]: RUNNING_STATE -> ERROR_STATE, statement: SELECT version(), time taken: 0.015 seconds
13:21:26.210 INFO org.apache.kyuubi.operation.ExecuteStatement: Processing hdpu's query[e3fc1c1d-a95d-48a3-af82-aa18a245cf64]: ERROR_STATE -> CLOSED_STATE, statement: SELECT version()
13:21:26.212 INFO org.apache.kyuubi.client.KyuubiSyncThriftClient: TCloseOperationReq(operationHandle:TOperationHandle(operationId:THandleIdentifier(guid:E6 39 36 60 9B 3D 40 47 91 EE 9A B6 FC 7D 19 42, secret:B9 7D 7F 01 C8 79 4B C6 B9 7C A2 F4 F4 66 09 14), operationType:EXECUTE_STATEMENT, hasResultSet:true)) succeed on engine side
13:21:26.218 INFO org.apache.kyuubi.operation.GetCatalogs: Processing hdpu's query[9ee1eed4-2776-413c-9b86-dbb2fb461468]: INITIALIZED_STATE -> RUNNING_STATE, statement: GET_CATALOGS
13:21:26.221 INFO org.apache.kyuubi.operation.GetCatalogs: Processing hdpu's query[9ee1eed4-2776-413c-9b86-dbb2fb461468]: RUNNING_STATE -> FINISHED_STATE, statement: GET_CATALOGS, time taken: 0.003 seconds
13:21:26.246 INFO org.apache.kyuubi.operation.GetCatalogs: Processing hdpu's query[9ee1eed4-2776-413c-9b86-dbb2fb461468]: FINISHED_STATE -> CLOSED_STATE, statement: GET_CATALOGS
13:21:26.247 INFO org.apache.kyuubi.client.KyuubiSyncThriftClient: TCloseOperationReq(operationHandle:TOperationHandle(operationId:THandleIdentifier(guid:35 CF ED 61 81 66 47 A7 AF 4A 09 8E F1 5E A1 4D, secret:55 9F 18 2C D4 1F 4F 02 82 3F 90 AC 7E E4 BA B6), operationType:GET_CATALOGS, hasResultSet:true)) succeed on engine side
13:21:26.251 INFO org.apache.kyuubi.operation.GetSchemas: Processing hdpu's query[f9040630-a758-4044-9183-d0ab467a86a6]: INITIALIZED_STATE -> RUNNING_STATE, statement: GET_SCHEMAS
13:21:26.255 INFO org.apache.kyuubi.operation.GetSchemas: Processing hdpu's query[f9040630-a758-4044-9183-d0ab467a86a6]: RUNNING_STATE -> FINISHED_STATE, statement: GET_SCHEMAS, time taken: 0.004 seconds
13:21:26.276 INFO org.apache.kyuubi.operation.GetSchemas: Processing hdpu's query[f9040630-a758-4044-9183-d0ab467a86a6]: FINISHED_STATE -> CLOSED_STATE, statement: GET_SCHEMAS
13:21:26.277 INFO org.apache.kyuubi.client.KyuubiSyncThriftClient: TCloseOperationReq(operationHandle:TOperationHandle(operationId:THandleIdentifier(guid:E0 22 97 4E 55 17 41 76 AE C6 88 27 24 EB 3B E9, secret:2F 4A 80 0A E2 4F 43 E5 9D 49 BD CA C3 03 7E F4), operationType:GET_SCHEMAS, hasResultSet:true)) succeed on engine side
13:21:26.281 INFO org.apache.kyuubi.operation.GetTables: Processing hdpu's query[4009764f-e00f-4b64-8577-6c5f388bd1ce]: INITIALIZED_STATE -> RUNNING_STATE, statement: GET_TABLES
13:21:26.285 INFO org.apache.kyuubi.operation.GetTables: Processing hdpu's query[4009764f-e00f-4b64-8577-6c5f388bd1ce]: RUNNING_STATE -> FINISHED_STATE, statement: GET_TABLES, time taken: 0.004 seconds
13:21:26.297 INFO org.apache.kyuubi.operation.GetTables: Processing hdpu's query[4009764f-e00f-4b64-8577-6c5f388bd1ce]: FINISHED_STATE -> CLOSED_STATE, statement: GET_TABLES
13:21:26.298 INFO org.apache.kyuubi.client.KyuubiSyncThriftClient: TCloseOperationReq(operationHandle:TOperationHandle(operationId:THandleIdentifier(guid:FE 75 50 81 BE CC 40 33 B6 25 16 A6 E3 BA DB CF, secret:12 44 23 FD 83 9F 47 21 A7 DF AC B8 94 DD 48 FE), operationType:GET_TABLES, hasResultSet:true)) succeed on engine side

Kyuubi Engine Log Output

Caused by: org.apache.calcite.sql.validate.SqlValidatorException: No match found for function signature version()
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_221]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_221]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_221]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_221]
	at org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:467) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:560) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:883) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.SqlUtil.newContextException(SqlUtil.java:868) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.newValidationError(SqlValidatorImpl.java:4861) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.handleUnresolvedFunction(SqlValidatorImpl.java:1814) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.SqlFunction.deriveType(SqlFunction.java:321) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.SqlFunction.deriveType(SqlFunction.java:226) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5710) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5697) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.SqlCall.accept(SqlCall.java:139) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.deriveTypeImpl(SqlValidatorImpl.java:1736) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.deriveType(SqlValidatorImpl.java:1727) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.expandSelectItem(SqlValidatorImpl.java:421) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelectList(SqlValidatorImpl.java:4061) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:3347) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:84) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:997) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:975) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.SqlSelect.validate(SqlSelect.java:232) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:952) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:704) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:159) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.validate(FlinkPlannerImpl.scala:107) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:215) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:101) ~[flink-table_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$parseStatement$1(LocalExecutor.java:172) ~[flink-sql-client_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:88) ~[flink-sql-client_2.12-1.14.4.jar:1.14.4]
	at org.apache.flink.table.client.gateway.local.LocalExecutor.parseStatement(LocalExecutor.java:172) ~[flink-sql-client_2.12-1.14.4.jar:1.14.4]
	... 7 more
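For context: the failing statement is DBeaver's connection probe SELECT version(), and Flink SQL 1.14 has no built-in version() function, so Calcite validation rejects it. The sketch below is only an illustrative workaround, not the fix that later landed in Kyuubi: it assumes you can register a temporary system function in the engine's TableEnvironment, and the class name VersionFunction and the returned string are made up for the example.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;

// Hypothetical workaround sketch: register a scalar UDF named "version" so that
// DBeaver's "SELECT version()" probe passes Calcite validation. This is NOT the
// fix shipped by Kyuubi; it only illustrates how such a function could be
// provided on the Flink side.
public class VersionFunction extends ScalarFunction {

    public String eval() {
        // Return any identifying string; "1.14.4" matches the Flink version in the logs.
        return "1.14.4";
    }

    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());
        // Register under the name DBeaver expects.
        tEnv.createTemporarySystemFunction("version", VersionFunction.class);
        // The probe statement now validates and returns one row.
        tEnv.executeSql("SELECT version()").print();
    }
}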

Kyuubi Server Configurations

kyuubi.engine.type FLINK_SQL

Kyuubi Engine Configurations

No response

Additional context

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
waywtdcc added the kind:bug and priority:major labels on Jun 17, 2022
@github-actions

Hello @waywtdcc,
Thanks for finding the time to report the issue!
We really appreciate the community's efforts to improve Apache Kyuubi (Incubating).

@pan3793
Member

pan3793 commented Jun 17, 2022

Which JDBC driver are you using? The Hive JDBC driver does not support multiple catalogs. The latest DBeaver has built-in Kyuubi support; could you please try it?
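(Hedged note for reproducing the symptom outside DBeaver: the metadata operations in the server log map to standard JDBC DatabaseMetaData calls. The sketch below assumes the Kyuubi Hive JDBC driver is on the classpath and uses a jdbc:kyuubi:// URL; the host kyuubi-server, port 10009, and user hdpu are placeholders for your deployment. With the plain Hive JDBC driver the scheme would instead be jdbc:hive2://, which is where the multi-catalog limitation mentioned above comes in.)

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

// Minimal sketch of what a JDBC client such as DBeaver does to discover catalogs
// and tables through the standard metadata API.
public class ListCatalogs {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:kyuubi://kyuubi-server:10009/";
        try (Connection conn = DriverManager.getConnection(url, "hdpu", "")) {
            DatabaseMetaData meta = conn.getMetaData();
            // Corresponds to the GET_CATALOGS operation in the server log.
            try (ResultSet catalogs = meta.getCatalogs()) {
                while (catalogs.next()) {
                    String catalog = catalogs.getString("TABLE_CAT");
                    System.out.println("catalog: " + catalog);
                    // Corresponds to GET_TABLES: list tables in every schema of the catalog.
                    try (ResultSet tables = meta.getTables(catalog, null, "%", null)) {
                        while (tables.next()) {
                            System.out.println("  " + tables.getString("TABLE_SCHEM")
                                    + "." + tables.getString("TABLE_NAME"));
                        }
                    }
                }
            }
        }
    }
}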

@pan3793 pan3793 changed the title [Bug] Dbleaver cannot see the catalog and table information in the Flink engine [Bug] Dbeaver cannot see the catalog and table information in the Flink engine Jun 17, 2022
@waywtdcc
Contributor Author

OK, is it included in the community version?

@pan3793
Member

pan3793 commented Jun 17, 2022

Yes, Kyuubi support is included in the community version.

@waywtdcc
Contributor Author

waywtdcc commented Jun 17, 2022

I downloaded the latest version, but I still can't see the catalogs.
[screenshot attached]

@pan3793
Member

pan3793 commented Jun 17, 2022

#2728 may be related. Could you please build the master branch and try it? It requires updating both the JDBC driver and Kyuubi.

@waywtdcc
Contributor Author

OK, I'll try. Can I create some global catalogs through configuration at submission time? Is there a WeChat group? @pan3793

@pan3793
Member

pan3793 commented Jun 17, 2022

You may achieve it by using a Beeline init file.
Please subscribe to apachekyuubi and follow the guidance to join our WeChat group.

@waywtdcc
Contributor Author

waywtdcc commented Jul 4, 2022

With version 1.5.2, DBeaver still cannot see the catalog and table information in the Flink engine. @pan3793

@cxzl25
Contributor

cxzl25 commented Sep 27, 2022

These two PRs can solve your problem.

#3498

#3519
