Azure Data Factory: Add swagger definition for specifying Spark environment variables in AzureDatabricksLinkedService #3825
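For context, this PR adds Spark environment variable support to the AzureDatabricksLinkedService definition in LinkedService.json. The snippet below is a minimal, illustrative sketch of such a swagger property; the property name newClusterSparkEnvVars is taken from the commit titles in this conversation, while the description text and additionalProperties schema shown here are assumptions and may differ from the merged definition.

```json
{
  "newClusterSparkEnvVars": {
    "description": "A set of optional, user-specified Spark environment variables as key-value pairs.",
    "type": "object",
    "additionalProperties": {
      "type": "object",
      "description": "Type: string (or Expression with resultType string)."
    }
  }
}
```

Dictionary-valued type properties in the Data Factory swaggers are commonly modeled as an object with open-ended additionalProperties so that each value can be either a literal string or an expression; the sketch above follows that convention.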
Conversation
Add swagger definition for specifying Spark environment variables in AzureDatabricksLinkedService.
Can one of the admins verify this patch?
Automation for azure-sdk-for-ruby: Nothing to generate for azure-sdk-for-ruby.
Automation for azure-sdk-for-python: The initial PR has been merged into your service PR.
@hvermis Please take a look.
Review comment thread on ...tory/resource-manager/Microsoft.DataFactory/stable/2018-06-01/entityTypes/LinkedService.json (outdated, resolved).
Automation for azure-sdk-for-go: The initial PR has been merged into your service PR.
Automation for azure-sdk-for-java: The initial PR has been merged into your service PR.
…and newClusterSparkEnvVars)
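The commit above references newClusterSparkEnvVars. As a hedged usage sketch, a linked service payload exercising the new property might look like the following; the surrounding property names follow the 2018-06-01 AzureDatabricks linked service schema, but the concrete values are illustrative placeholders, not taken from this PR.

```json
{
  "name": "AzureDatabricksLinkedService",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://westus.azuredatabricks.net",
      "accessToken": {
        "type": "SecureString",
        "value": "<databricks-access-token>"
      },
      "newClusterVersion": "5.5.x-scala2.11",
      "newClusterNodeType": "Standard_D3_v2",
      "newClusterNumOfWorker": "2",
      "newClusterSparkEnvVars": {
        "PYSPARK_PYTHON": "/databricks/python3/bin/python3"
      }
    }
  }
}
```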
Automation for azure-sdk-for-node: The initial PR has been merged into your service PR.
Review comment thread on ...tory/resource-manager/Microsoft.DataFactory/stable/2018-06-01/entityTypes/LinkedService.json (resolved).
@anuchandy taking this over due to service team urgency.