
[google-cloud-datacatalog]: DataCatalogClient.create throws IllegalAccessError - LazyStringArrayList is in unnamed module of loader 'app' #9625

Closed
dbjnbnrj opened this issue Jul 8, 2023 · 5 comments

Comments

@dbjnbnrj

dbjnbnrj commented Jul 8, 2023


Environment details

  1. API: google-cloud-datacatalog
  2. OS type and version: Mac OS X
  3. Java version: 11
  4. Version(s):
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-datacatalog</artifactId>
    <version>1.26.0</version>
  </dependency>

  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-datacatalog</artifactId>
    <version>1.25.0</version>
  </dependency>

  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>libraries-bom</artifactId>
    <version>26.18.0</version>
  </dependency>

Steps to reproduce

  1. Build with version 1.26.0 or 1.25.0 of google-cloud-datacatalog, or with the 26.18.0 libraries-bom
  2. Initialize the Data Catalog client
  3. Create a jar and run it via a Dataproc batch job

Code example

  @Provides
  DataCatalogClient dataCatalogClient(DataCatalogEndpointHolder dataCatalogEndpointHolder)
      throws IOException {
    DataCatalogSettings.Builder settingBuilder = DataCatalogSettings.newBuilder();
    if (dataCatalogEndpointHolder.value != null) {
      logger.atInfo().log(
          "Using custom Data Catalog endpoint: '%s'", dataCatalogEndpointHolder.value);
      settingBuilder.setEndpoint(dataCatalogEndpointHolder.value);
    }
    return DataCatalogClient.create(settingBuilder.build());
  }

Stack trace

1) [Guice/ErrorInCustomProvider]: IllegalAccessError: class SearchCatalogResponse tried to access method 
'LazyStringArrayList LazyStringArrayList.emptyList()' (SearchCatalogResponse is in unnamed module of loader MutableURLClassLoader @3db64bd4; LazyStringArrayList is in unnamed module of loader 'app')
  at DataCatalogModule.dataCatalogClient(DataCatalogModule.java:25)
  at DataCatalogModule.dataCatalogFacade(DataCatalogModule.java:40)
      \_ for 1st parameter
  at DataCatalogModule.dataCatalogFacade(DataCatalogModule.java:40)


@suztomo
Member

suztomo commented Jul 8, 2023

This seems to be a duplicate of #9558, where the reporter has not been able to provide a reproducer project yet.

@dbjnbnrj Are you able to create an example GitHub repository that reproduces the error?

@dbjnbnrj
Author

dbjnbnrj commented Jul 9, 2023

I think I discovered the root cause of the issue. I am running my jar on Dataproc Serverless (version 1.0.29). The runtime classpath has a dependency on '/usr/lib/spark/jars/protobuf-java-3.19.6.jar'.

However, version 1.26.0 of the Data Catalog API relies on 'protobuf-java-3.21.12.jar'.

This runtime conflict is what causes the error (similar to quarkusio/quarkus#31240).
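
For reference, a minimal sketch of how one could confirm which protobuf-java jar the runtime actually loads (the helper class here is purely illustrative, not part of the project):

  import com.google.protobuf.LazyStringArrayList;

  // Illustrative check: prints where the conflicting protobuf class is loaded from,
  // e.g. /usr/lib/spark/jars/protobuf-java-3.19.6.jar vs. the application jar.
  public class ProtobufVersionCheck {
    public static void main(String[] args) {
      Class<?> clazz = LazyStringArrayList.class;
      java.security.CodeSource source = clazz.getProtectionDomain().getCodeSource();
      System.out.println("LazyStringArrayList loaded from: "
          + (source != null ? source.getLocation() : "unknown"));
      System.out.println("ClassLoader: " + clazz.getClassLoader());
    }
  }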

Is there a way (in Maven) to use the Data Catalog dependency instead of the runtime one? I attempted creating a shaded uber jar as suggested here (https://cloud.google.com/dataproc/docs/guides/manage-spark-dependencies#creating_a_shaded_uber_jar_with_maven), but that did not work.

I wasn't able to create an example GitHub project yet. Let me know if you have any suggestions in the meantime.

@suztomo
Member

suztomo commented Jul 10, 2023

The runtime classpath has a dependency on '/usr/lib/spark/jars/protobuf-java-3.19.6.jar'. However, the 1.26.0 version of Datacatalog api relies on 'protobuf-java-3.21.12.jar'.

Thank you for the great finding!

I attempted creating a shaded uber jar as suggested here (https://cloud.google.com/dataproc/docs/guides/manage-spark-dependencies#creating_a_shaded_uber_jar_with_maven) but that did not work.

A shaded uber JAR is a way to address this kind of problem. Would you share the stack trace and the pom.xml you used? The stack trace tells which class is still referencing com.google.protobuf (then you would need to tweak <relocations> accordingly).

@dbjnbnrj
Author

dbjnbnrj commented Jul 10, 2023

Thanks for the encouragement, @suztomo.

I was able to solve this! I can't share the repo, but this is the shade-related change I needed:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <createDependencyReducedPom>false</createDependencyReducedPom>
          <filters>
            <filter>
              <artifact>*:*</artifact>
              <excludes>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.RSA</exclude>
              </excludes>
            </filter>
          </filters>
          <transformers>
            <transformer
                implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            <transformer
                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
              <mainClass>Path of Main Class</mainClass>
            </transformer>
          </transformers>
          <relocations>
            <relocation>
              <pattern>org.apache.commons.cli</pattern>
              <shadedPattern>repackaged.org.apache.commons.cli</shadedPattern>
            </relocation>
            <relocation>
              <pattern>com</pattern>
              <shadedPattern>repackaged.com</shadedPattern>
              <includes>
                <include>com.google.protobuf.**</include>
                <include>com.google.common.**</include>
              </includes>
            </relocation>
          </relocations>
        </configuration>
      </execution>
    </executions>
  </plugin>
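
For anyone hitting the same problem: the relocation of com.google.protobuf.** and com.google.common.** to repackaged.com.* is what avoids the clash with the protobuf-java 3.19.6 jar that Dataproc puts on the classpath. Listing the shaded jar contents (for example, jar tf <your-jar>.jar | grep repackaged/com/google/protobuf) is a quick way to confirm the relocation took effect.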

@suztomo
Member

suztomo commented Jul 11, 2023

Thank you for sharing your solution.
