Databricks: Automate Jobs and Clusters
This integration provides end-to-end orchestration and automation of jobs and clusters in a Databricks environment on either AWS or Azure.
- Uses the Python `requests` module to make REST API calls to the Databricks environment.
- Connects to the Databricks environment using the Databricks URL and a user bearer token.
- For Databricks jobs, this integration can perform the following operations:
- Create and list jobs.
- Get job details.
- Run new jobs.
- Submit one-time runs.
- Cancel job runs.
- For Databricks clusters, this integration can perform the following operations:
- Create, start, and restart a cluster.
- Terminate a cluster.
- Get cluster information.
- List clusters.
- For Databricks DBFS, this integration also provides a feature to upload large files.
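The connection details above (Databricks URL plus user bearer token, called through `requests`) suggest a pattern like the following sketch for the job operations. The helper names, the `host`/`token` parameters, and the choice of Jobs API version 2.1 are illustrative assumptions, not part of the integration itself.

```python
import requests


def databricks_headers(token):
    # All Databricks REST calls authenticate with a bearer token header
    return {"Authorization": f"Bearer {token}"}


def list_jobs(host, token):
    # GET /api/2.1/jobs/list returns the job definitions in the workspace
    resp = requests.get(f"{host}/api/2.1/jobs/list",
                        headers=databricks_headers(token), timeout=30)
    resp.raise_for_status()
    return resp.json().get("jobs", [])


def run_now(host, token, job_id):
    # POST /api/2.1/jobs/run-now triggers an existing job; returns its run_id
    resp = requests.post(f"{host}/api/2.1/jobs/run-now",
                         headers=databricks_headers(token),
                         json={"job_id": job_id}, timeout=30)
    resp.raise_for_status()
    return resp.json()["run_id"]
```

For example, `run_now("https://<workspace-url>", token, 123)` would trigger job 123 and return the run ID that later status checks and cancellations reference.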
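The cluster operations map onto the Clusters API in a similar way. A minimal sketch, assuming the Clusters API 2.0 endpoints (`start`, `restart`, `delete` for terminate, and `get` for cluster information); function names are illustrative:

```python
import requests


def cluster_endpoint(host, action):
    # Clusters API 2.0 uses one endpoint per action, e.g. .../clusters/start
    return f"{host}/api/2.0/clusters/{action}"


def cluster_action(host, token, action, cluster_id):
    # action is one of "start", "restart", or "delete" (terminate)
    resp = requests.post(cluster_endpoint(host, action),
                         headers={"Authorization": f"Bearer {token}"},
                         json={"cluster_id": cluster_id}, timeout=30)
    resp.raise_for_status()


def cluster_info(host, token, cluster_id):
    # GET /api/2.0/clusters/get reports state, node type, Spark version, etc.
    resp = requests.get(cluster_endpoint(host, "get"),
                        headers={"Authorization": f"Bearer {token}"},
                        params={"cluster_id": cluster_id}, timeout=30)
    resp.raise_for_status()
    return resp.json()
```

Note that the Clusters API spells "terminate" as the `delete` endpoint; a terminated cluster can be started again, which is why start/restart and terminate appear as separate operations above.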
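Uploading large files to DBFS cannot be done in one request, because the DBFS `add-block` endpoint accepts a limited amount of base64-encoded data per call (1 MB). A sketch of the create/add-block/close streaming pattern follows; `upload_file` and `b64_chunks` are illustrative names, and the error handling is kept minimal:

```python
import base64

import requests

CHUNK_SIZE = 1024 * 1024  # add-block accepts at most 1 MB of data per call


def b64_chunks(data, chunk_size=CHUNK_SIZE):
    # Split raw bytes into base64-encoded blocks small enough for add-block
    for i in range(0, len(data), chunk_size):
        yield base64.b64encode(data[i:i + chunk_size]).decode("ascii")


def upload_file(host, token, local_path, dbfs_path):
    headers = {"Authorization": f"Bearer {token}"}
    # 1. Open a streaming upload handle on the target DBFS path
    resp = requests.post(f"{host}/api/2.0/dbfs/create", headers=headers,
                         json={"path": dbfs_path, "overwrite": True},
                         timeout=30)
    resp.raise_for_status()
    handle = resp.json()["handle"]
    # 2. Append the file contents one base64 block at a time
    with open(local_path, "rb") as f:
        data = f.read()
    for block in b64_chunks(data):
        requests.post(f"{host}/api/2.0/dbfs/add-block", headers=headers,
                      json={"handle": handle, "data": block},
                      timeout=30).raise_for_status()
    # 3. Close the handle to finalize the file on DBFS
    requests.post(f"{host}/api/2.0/dbfs/close", headers=headers,
                  json={"handle": handle}, timeout=30).raise_for_status()
```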
What's New in V1.3.3
The latest release adds enhanced error handling while monitoring the status of running jobs, and enables recovery options when errors occur during the monitoring process.
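The source does not describe the recovery mechanism in detail, but resilient monitoring of a run typically means polling the run-status endpoint and tolerating transient failures rather than aborting on the first error. A hypothetical sketch, assuming the Jobs API 2.1 `runs/get` endpoint; the retry policy, helper names, and state list are assumptions:

```python
import time

import requests

# Life-cycle states in which a Databricks run has finished
TERMINAL_STATES = ("TERMINATED", "SKIPPED", "INTERNAL_ERROR")


def is_terminal(life_cycle_state):
    return life_cycle_state in TERMINAL_STATES


def wait_for_run(host, token, run_id, poll_seconds=30, max_retries=3):
    # Poll /api/2.1/jobs/runs/get until the run reaches a terminal state,
    # retrying transient HTTP errors up to max_retries consecutive times
    failures = 0
    while True:
        try:
            resp = requests.get(f"{host}/api/2.1/jobs/runs/get",
                                headers={"Authorization": f"Bearer {token}"},
                                params={"run_id": run_id}, timeout=30)
            resp.raise_for_status()
            failures = 0  # a successful poll resets the failure count
        except requests.RequestException:
            failures += 1
            if failures > max_retries:
                raise  # give up after repeated consecutive errors
            time.sleep(poll_seconds)
            continue
        state = resp.json()["state"]
        if is_terminal(state.get("life_cycle_state", "")):
            return state
        time.sleep(poll_seconds)
```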
|Product Component:||Universal Agent, Universal Controller|
|Universal Template Name:||DataBricks|
|Compatibility:||UC/UA 7.1 and above|
Please visit this link to find key features, prerequisites, installation instructions, configuration instructions, and examples of how to use this integration.