executor: add log for commit work #13965
Conversation
Codecov Report
@@            Coverage Diff             @@
##            master     #13965   +/-  ##
===========================================
  Coverage    80.1928%   80.1928%
===========================================
  Files            480        480
  Lines         120416     120416
===========================================
  Hits           96565      96565
  Misses         16158      16158
  Partials        7693       7693
LGTM
LGTM
Does this PR need to be cherry-picked to release-3.0? @cfzjywxk
/run-all-tests
/merge
/run-all-tests
What problem does this PR solve?
The original AddRecordLd function, used by the load data statement, does not report errors to its caller.
Problem: when loading 300 million rows of the tpch lineitem table into TiDB, once the loaded row count passed 50,800,000 the load data commit goroutine kept writing zero rows for each commit task, because AddRecord returned the error "[autoid:1467]Failed to read auto-increment value from storage engine" (related to #13648, which needs more investigation). Since this error was not reported, the loading process appeared to be stuck.
What is changed and how it works?
Log the error in the AddRecordLd function when AddRecord fails, as sketched below.
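A minimal sketch of the intended change, using simplified stand-in types and signatures (the real addRecordLD/AddRecord live in TiDB's executor and table packages and use TiDB's own logging, not the standard log package): the error returned by AddRecord is now logged instead of being silently dropped, so a stuck LOAD DATA job leaves a trace in the log.

```go
package main

import (
	"errors"
	"fmt"
	"log"
)

// record and table are simplified stand-ins for TiDB's row and table types;
// the real AddRecord is a method on table.Table with a different signature.
type record []interface{}

type table struct{}

// AddRecord always fails here, mirroring the "[autoid:1467]" error described
// in the problem statement above.
func (t *table) AddRecord(r record) (int64, error) {
	return 0, errors.New("[autoid:1467]Failed to read auto-increment value from storage engine")
}

// addRecordLD is a hypothetical, simplified version of the load-data path.
// Before this PR the error from AddRecord was dropped without a trace; the
// change adds a log line so the failure is visible while loading continues.
func addRecordLD(t *table, r record) int64 {
	h, err := t.AddRecord(r)
	if err != nil {
		// The added log: report the failure instead of swallowing it silently.
		log.Printf("load data: AddRecord failed: %v", err)
		return 0
	}
	return h
}

func main() {
	tbl := &table{}
	handle := addRecordLD(tbl, record{1, "a"})
	fmt.Println("handle:", handle) // prints 0, with the error logged above
}
```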
Check List
Tests
tpch.lineitem data with 300 million rows
Code changes
Side effects
Related changes
Release note