[BUG]: Error running "migrate-tables" - failed for getting token from AzureAD - UBER PRINCIPAL #1629
Comments
@gregorymiguel this is similar to #1625 - could you also update the SPN details in the SQL warehouse configuration in the meantime?
Hi @nkvuong, I added the SPN config to the SQL warehouse before opening this issue. It didn't work for me.
Hi @gregorymiguel, thank you for being thorough in your GitHub issues :) it's much appreciated. How are you doing on this bug? I was thinking: could it be something simple, like single curly braces in the secret reference instead of double curly braces? (See the sketch below.)
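Databricks resolves secret references in Spark configuration only when they are wrapped in double curly braces; a single-brace value is passed through as a literal string, so the cluster never receives the client secret and the AAD token request fails. A minimal sketch of the two forms as policy entries, with placeholder storage account, secret scope, and key names (illustrative assumptions, not values from this environment):

```json
{
  "spark_conf.fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net": {
    "type": "fixed",
    "value": "{secrets/<scope>/<secret-key>}"
  }
}
```

should be

```json
{
  "spark_conf.fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net": {
    "type": "fixed",
    "value": "{{secrets/<scope>/<secret-key>}}"
  }
}
```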
@gregorymiguel: Could you confirm that the proposed solution solves your issue? Otherwise, we need to investigate further.
Hi @JCZuurmond, (screenshot)
Is there an existing issue for this?
Current Behavior
I found an issue when trying to execute the migrate-tables workflow.
The uber principal has been created and the policy was updated correctly with the uber principal credentials.
All four tasks fail with the same error: the cluster cannot get a token from Azure AD for the uber principal.
Both clusters (main and migrate_tables) use the policy created by the create-uber-principal task.
Policy:

Expected Behavior
The cluster should be able to obtain an AAD token using the uber principal credentials.
Steps To Reproduce
No response
Cloud
Azure
Operating System
macOS
Version
latest via Databricks CLI
Relevant log output
No response