Facing an issue while running a batch ingestion job. The error started after upgrading to the latest 0.10 nightly build.
The same ingestion works fine with the 0.9.2 build.
Command to Run:
/pinot/bin/pinot-admin.sh LaunchDataIngestionJob -jobSpecFile job_config.yaml
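For context, the job spec roughly follows the standalone segment-tar-push layout implied by the log output below (HDFS output, LocalPinotFS + HadoopPinotFS, push to the controller at http://d9-max-insert-2.srv.net:9000). This is only a minimal sketch, not the actual job_config.yaml; the input directory and record reader are placeholders:

```yaml
# Minimal sketch of a standalone tar-push job spec consistent with the log.
# inputDirURI and recordReaderSpec are placeholders/assumptions.
executionFrameworkSpec:
  name: 'standalone'
  segmentGenerationJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner'
  segmentTarPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner'
jobType: SegmentCreationAndTarPush
inputDirURI: 'hdfs://nameservice1/data/max/poc/pinot-ingestion/input/'   # placeholder
outputDirURI: 'hdfs://nameservice1/data/max/poc/pinot-ingestion/dimension_segments/dim_testtable/'
overwriteOutput: true
pinotFSSpecs:
  - scheme: file
    className: org.apache.pinot.spi.filesystem.LocalPinotFS
  - scheme: hdfs
    className: org.apache.pinot.plugin.filesystem.HadoopPinotFS
recordReaderSpec:
  dataFormat: 'csv'   # placeholder; actual input format unknown
  className: 'org.apache.pinot.plugin.inputformat.csv.CSVRecordReader'
tableSpec:
  tableName: 'dim_testtable'
pinotClusterSpecs:
  - controllerURI: 'http://d9-max-insert-2.srv.net:9000'
pushJobSpec:
  pushAttempts: 2
  pushRetryIntervalMillis: 1000
```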
Log:
2022/02/03 00:11:06.064 INFO [CrcUtils] [pool-6-thread-1] Computed crc = 1828318080, based on files [/tmp/pinot-b263f2fa-8bad-4a49-9511-508fc14c50e2/output/dim_testtable_OFFLINE_0/v3/columns.psf, /tmp/pinot-b263f2fa-8bad-4a49-9511-508fc14c50e2/output/dim_testtable_OFFLINE_0/v3/index_map, /tmp/pinot-b263f2fa-8bad-4a49-9511-508fc14c50e2/output/dim_testtable_OFFLINE_0/v3/metadata.properties]
2022/02/03 00:11:06.065 INFO [SegmentIndexCreationDriverImpl] [pool-6-thread-1] Driver, record read time : 13
2022/02/03 00:11:06.065 INFO [SegmentIndexCreationDriverImpl] [pool-6-thread-1] Driver, stats collector time : 0
2022/02/03 00:11:06.065 INFO [SegmentIndexCreationDriverImpl] [pool-6-thread-1] Driver, indexing time : 8
2022/02/03 00:11:06.065 INFO [SegmentGenerationJobRunner] [pool-6-thread-1] Tarring segment from: /tmp/pinot-b263f2fa-8bad-4a49-9511-508fc14c50e2/output/dim_testtable_OFFLINE_0 to: /tmp/pinot-b263f2fa-8bad-4a49-9511-508fc14c50e2/output/dim_testtable_OFFLINE_0.tar.gz
2022/02/03 00:11:06.090 INFO [SegmentGenerationJobRunner] [pool-6-thread-1] Size for segment: dim_testtable_OFFLINE_0, uncompressed: 217.24K, compressed: 70.93K
2022/02/03 00:11:06.618 INFO [IngestionJobLauncher] [main] Trying to create instance for class org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner
2022/02/03 00:11:06.619 INFO [PinotFSFactory] [main] Initializing PinotFS for scheme file, classname org.apache.pinot.spi.filesystem.LocalPinotFS
2022/02/03 00:11:06.620 INFO [PinotFSFactory] [main] Initializing PinotFS for scheme hdfs, classname org.apache.pinot.plugin.filesystem.HadoopPinotFS
2022/02/03 00:11:06.632 INFO [HadoopPinotFS] [main] successfully initialized HadoopPinotFS
2022/02/03 00:11:06.819 INFO [SegmentPushUtils] [main] Start pushing segments: [hdfs://nameservice1/data/max/poc/pinot-ingestion/dimension_segments/dim_testtable/dim_testtable_OFFLINE_0.tar.gz]... to locations: [org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec@51827393] for table dim_testtable
2022/02/03 00:11:06.819 INFO [SegmentPushUtils] [main] Pushing segment: dim_testtable_OFFLINE_0 to location: http://d9-max-insert-2.srv.net:9000/ for table dim_testtable
2022/02/03 00:11:07.164 INFO [FileUploadDownloadClient] [main] Sending request: http://d9-max-insert-2.srv.net:9000/v2/segments?tableName=dim_testtable&tableName=dim_testtable&tableType=OFFLINE to controller: d9-max-insert-2.srv.net, version: Unknown
2022/02/03 00:11:07.168 WARN [SegmentPushUtils] [main] Caught temporary exception while pushing table: dim_testtable segment: dim_testtable_OFFLINE_0 to http://d9-max-insert-2.srv.net:9000/, will retry
org.apache.pinot.common.exception.HttpErrorStatusException: Got error status code: 500 (Internal Server Error) with reason: "Exception while uploading segment: null" while sending request: http://d9-max-insert-2.srv.net:9000/v2/segments?tableName=dim_testtable&tableName=dim_testtable&tableType=OFFLINE to controller: d9-max-insert-2.srv.net, version: Unknown
at org.apache.pinot.common.utils.FileUploadDownloadClient.sendRequest(FileUploadDownloadClient.java:531) ~[pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.common.utils.FileUploadDownloadClient.uploadSegment(FileUploadDownloadClient.java:838) ~[pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.segment.local.utils.SegmentPushUtils.lambda$pushSegments$0(SegmentPushUtils.java:122) ~[pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.spi.utils.retry.BaseRetryPolicy.attempt(BaseRetryPolicy.java:50) [pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.segment.local.utils.SegmentPushUtils.pushSegments(SegmentPushUtils.java:119) [pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner.run(SegmentTarPushJobRunner.java:88) [pinot-batch-ingestion-standalone-0.10.0-SNAPSHOT-shaded.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.kickoffIngestionJob(IngestionJobLauncher.java:146) [pinot-all-0.10.0-SNAPSHOT-jar-with-dependencies.jar:0.10.0-SNAPSHOT-ea2f0aa641e17301293662c8e79dfd94d8568438]
at org.apache.pinot.spi.ingestion.batch.IngestionJobLauncher.runIngestionJob(IngestionJobLauncher.java:118)
Thanks for reporting the issue. Can you share the last commit hash for the build? @vvivekiyer Can you please take a look and see if #8110 could cause the issue?