Amazon S3: Cloud Storage Bucket File Transfer
The Amazon S3 Cloud Storage Bucket File Transfer integration allows you to securely automate file transfers from, to, and between Amazon S3 cloud storage buckets and third-party application folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With Universal Automation Center (UAC), you can securely automate your AWS tasks and integrate them into existing scheduling workflows.
Key Features:
Automate file transfers in real time.
Drag-and-drop as a task into any existing scheduling workflow within the UAC.
File transfers can be triggered by a third-party application using the UAC RESTful web services API.
The following file transfer commands are supported:
Upload file(s) to an S3 bucket.
Download file(s) from an S3 bucket.
Transfer files between S3 buckets.
List objects in an S3 bucket.
Delete object(s) in an S3 bucket.
List S3 bucket names.
Create an S3 bucket.
Additional Information:
Support for AWS S3 prefixes to simulate a folder structure and to improve performance while copying files.
Security is ensured by using the HTTPS protocol with support for an optional proxy server.
Supports AWS IAM Role-Based Access Control (RBAC).
No Universal Agent needs to be installed on the AWS cloud – the communication goes via HTTPS.
AWS canned ACLs are supported, e.g., to grant full access to the bucket owner.
What's New in V1.6.0
Enhancements: This release adds support for AWS S3 prefixes to simulate a folder structure and to improve performance while copying files.
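For illustration, here is a minimal sketch of the kind of transfer the integration automates, using the AWS boto3 SDK (the SDK choice, bucket, and key names are assumptions; the integration itself runs as a task inside UAC):

```python
import boto3

# Credentials are resolved from the environment or an IAM role (assumption).
s3 = boto3.client("s3")

# Upload a local file under a prefix that simulates a folder structure.
s3.upload_file("report.csv", "my-bucket", "daily/report.csv")

# List objects under that prefix.
for obj in s3.list_objects_v2(Bucket="my-bucket", Prefix="daily/").get("Contents", []):
    print(obj["Key"])
```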
Amazon SQS: Message
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This integration provides the capability to send an AWS SQS message to an existing queue.
Key Features:
This Universal Extension provides the following main features:
Send a message to a standard or a FIFO queue.
Capability to control the transport of the messages by configuring the message delay seconds (for standard queues) and the message group ID and message deduplication ID (for FIFO queues).
Capability to dynamically fetch the list of queue names from SQS for selection during task creation.
Capability to authorize via an IAM Role-Based Access Control (RBAC) strategy.
Capability for proxy communication via the HTTP/HTTPS protocol.
What's New in V1.1.1
Introduction of fixes to improve the robustness of this integration.
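As a rough sketch of what sending to a FIFO queue involves (boto3 is an assumption; queue and message contents are placeholders):

```python
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="orders.fifo")["QueueUrl"]

# FIFO queues take a message group ID and a deduplication ID; for standard
# queues, DelaySeconds would control delayed delivery instead.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody='{"order_id": 42}',
    MessageGroupId="orders",
    MessageDeduplicationId="order-42",
)
```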
Amazon SQS: Monitor
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This Universal Extension provides the capability to monitor AWS SQS messages from an existing queue and run Universal Tasks and/or workflows accordingly.
Key Features:
This Universal Extension provides the following main features:
Actions
Monitor AWS SQS messages from a standard or a FIFO queue.
Launch a task in Universal Controller with variables holding the ID, body, attributes, message attributes, and receipt handle for each fetched message.
Authentication
AWS credentials.
IAM Role-Based Access Control (RBAC) strategy.
Other
Communication through a proxy with use of HTTP or HTTPS.
What's New in V1.1.1
Introduction of fixes to improve the robustness of this integration.
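A minimal sketch of the monitoring pattern, assuming boto3 (the queue name is a placeholder):

```python
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="orders")["QueueUrl"]

# Long-poll for messages, requesting attributes and message attributes.
resp = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=10,
    AttributeNames=["All"],
    MessageAttributeNames=["All"],
)
for msg in resp.get("Messages", []):
    print(msg["MessageId"], msg["Body"])  # id and body, as passed to the launched task
    # Acknowledge the message by deleting it via its receipt handle.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```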
Apache Airflow
Apache Airflow is an open-source platform created to programmatically author, schedule, and monitor workflows. This Universal Extension provides the capability to integrate with an Apache Airflow standalone server or Google Cloud Composer Airflow and use it as part of your end-to-end UC workflow, allowing high-level visibility and orchestration of data-oriented jobs or pipelines.
You gain many enterprise-level benefits by using the Universal Automation Center to control Airflow. Notable benefits include:
Run your Airflow data pipelines in real time by using event-based triggers from the UC.
Add Airflow tasks to broader business process workflows in the UC.
Gain observability of your Airflow workflows, with proactive alerts from the UC.
Read this article to learn more about the benefits of integrating Airflow with the Universal Automation Center.
Key Features:
This Universal Extension provides the following key features:
Actions
Trigger a DAG run and optionally wait until the DAG reaches "success" or "failure".
Information retrieval of a specific DAG run.
Information retrieval for a task that is part of a specific DAG run.
Authentication
Basic authentication for a standalone Airflow server.
Service account private key for Google Cloud Composer.
Other
Capability to use an HTTP or HTTPS proxy instead of direct communication to a standalone Airflow server.
What's New in V2.0.0
The latest release provides the ability to integrate with a Google Cloud Composer Airflow server. Furthermore, it provides the ability to configure the "Trigger DAG Run" action by passing JSON configuration parameters.
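A minimal sketch of the underlying calls against the stable Airflow 2.x REST API (server URL, DAG id, and credentials are placeholders; basic auth as for a standalone server):

```python
import requests

BASE = "https://airflow.example.com/api/v1"  # placeholder server
AUTH = ("user", "password")                  # basic auth (standalone server)

# Trigger a DAG run, passing JSON configuration parameters.
run = requests.post(f"{BASE}/dags/etl_daily/dagRuns",
                    auth=AUTH, json={"conf": {"run_date": "2024-01-01"}}).json()

# Retrieve the DAG run and check whether it has reached "success" or "failed".
state = requests.get(f"{BASE}/dags/etl_daily/dagRuns/{run['dag_run_id']}",
                     auth=AUTH).json()["state"]
print(state)
```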
Apache Kafka: Event Monitor
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. A Kafka Event Monitor is a Universal Extension responsible for monitoring events (messages) from topics in Kafka.
Key Features:
This Universal Extension provides the following main features:
Monitor messages and consume a Kafka message by consumer group topic subscription.
Authenticate against Kafka using the PLAINTEXT, SASL_SSL, or SSL security protocol.
Capability to filter Kafka messages based on the value of the message; number, string, and JSON filter patterns are supported.
Capability to dynamically fetch Kafka topics during task creation.
Capability to control the partition assignment strategy, as well as session-related timeout values.
Typically, this extension can be used to monitor events from Kafka and, upon successful execution, to trigger workflows or other tasks, or simply to pass information related to the Kafka event within UAC.
What's New in V1.1.0
With this new release, client authentication over SSL is supported.
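A minimal consumer sketch, assuming the kafka-python client library (broker, topic, and group names are placeholders):

```python
from kafka import KafkaConsumer  # kafka-python, an assumed client library

# Subscribe to a topic as part of a consumer group; security_protocol could
# equally be "SASL_SSL" or "SSL" to match the authentication options above.
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="kafka.example.com:9092",
    group_id="uac-monitor",
    security_protocol="PLAINTEXT",
    session_timeout_ms=10000,          # a session-related timeout
    auto_offset_reset="earliest",
)
for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)
```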
Apache Kafka: Publish Event
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. This integration is responsible for publishing events (messages) to topics in Kafka.
Key Features:
This Universal Extension supports the following main features:
Perform authentication against Kafka using the PLAINTEXT or SASL_SSL security protocol.
Send a message to Kafka with the capability to select the topic, the partition, the message key, the message value, and message metadata.
Capability to control the transport of the messages by configuring the message acknowledgment strategy and the request timeout.
Capability to dynamically fetch topics and partitions from Kafka for selection during task creation.
Capability to automatically select the serialization method depending on the key/value message data types.
What's New in V1.2.0
This new release improves the robustness of this integration and adds a number of useful features:
Support for client authentication over SSL against the Kafka bootstrap server.
Extension output is now provided upon successful execution.
Debug mode execution is enhanced to properly close extension execution.
The partition field is enhanced to show and use all its dependent fields.
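A minimal producer sketch, again assuming the kafka-python client library (broker and topic names are placeholders):

```python
from kafka import KafkaProducer  # kafka-python, an assumed client library

producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9092",
    acks="all",                 # the message acknowledgment strategy
    request_timeout_ms=30000,   # the request timeout
)
# Topic, partition, key, value, and headers (metadata) are all selectable.
future = producer.send(
    "events",
    key=b"order-42",
    value=b'{"status": "shipped"}',
    partition=0,
    headers=[("source", b"uac")],
)
print(future.get(timeout=30))  # block until the broker acknowledges
```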
AWS Batch
AWS Batch is a set of batch management capabilities that enables developers, scientists, and engineers to quickly and efficiently run hundreds of thousands of batch computing jobs on AWS. The AWS Batch integration provides the ability to submit new AWS Batch jobs and read the status of an existing AWS Batch job.
Key Features:
This Universal Extension provides the following key features:
Support to submit a new Batch job, with the option to terminate the job after a timeout period.
Support to submit a new Batch job and wait until it reaches the state "success" or "failed".
Support to read the Batch job status for an existing job ID.
Support for authorization via an IAM Role-Based Access Control (RBAC) strategy.
Support for proxy communication via the HTTP/HTTPS protocol.
What's New in V1.3.1
Revert `certifi` as a requirement.
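A minimal sketch of the submit-and-poll pattern, assuming boto3 (job, queue, and definition names are placeholders):

```python
import boto3

batch = boto3.client("batch")

# Submit a job, then read its status by job ID.
job = batch.submit_job(
    jobName="nightly-etl",
    jobQueue="default-queue",
    jobDefinition="etl-job-def:1",
)
status = batch.describe_jobs(jobs=[job["jobId"]])["jobs"][0]["status"]
print(status)  # e.g., SUBMITTED, RUNNING, SUCCEEDED, FAILED
```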
AWS EMR
Amazon EMR (previously called Amazon Elastic MapReduce) is a managed cluster platform that simplifies running big data frameworks, such as Apache Hadoop and Apache Spark, on AWS to process and analyze vast amounts of data. This integration provides the ability to start and optionally monitor (by polling at specified intervals) notebook executions as supported by AWS EMR. Additionally, the user can optionally specify various configuration options relating to the notebook execution.
Key Features:
Actions:
Start Notebook Execution:
This action is used to start an AWS EMR notebook execution. Authentication is done via access and secret keys, optionally with role-based access.
This action can be configured to either simply trigger the execution or trigger and wait until it completes successfully or fails.
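A minimal sketch of the start-and-poll pattern, assuming boto3 (all identifiers are placeholders):

```python
import boto3

emr = boto3.client("emr")

# Start a notebook execution.
execution = emr.start_notebook_execution(
    EditorId="e-ABC123",                     # the EMR notebook id
    RelativePath="analysis.ipynb",
    ExecutionEngine={"Id": "j-CLUSTER123"},  # the EMR cluster id
    ServiceRole="EMR_Notebooks_DefaultRole",
)

# Poll the execution status at an interval, in the "trigger and wait" style.
status = emr.describe_notebook_execution(
    NotebookExecutionId=execution["NotebookExecutionId"]
)["NotebookExecution"]["Status"]
print(status)  # e.g., STARTING, RUNNING, FINISHED, FAILED
```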
AWS Glue
AWS Glue is a serverless data-preparation service for extract, transform, and load (ETL) operations. It makes it easy for data engineers, data analysts, data scientists, and ETL developers to extract, clean, enrich, normalize, and load data.
This Universal Extension provides the capability to submit a new AWS Glue Job.
Key Features
This Universal Extension provides the following key features:
Actions
Start a Glue job.
Start a Glue job and wait until it reaches the state "success" or "failed."
Authentication
Authentication through HTTPS.
Authentication through IAM Role-Based Access Control (RBAC) strategy.
Input/Output
Option to pass input arguments as a UAC script, supporting UAC environment variables and UAC functions.
Option to choose Worker Type.
Other
Support for Proxy communication via HTTP/HTTPS protocol.
What's New in V2.1.0
Added a new field: Endpoint URL.
The Proxy Type field has been removed (hidden), as it no longer needs to be filled in by users in the task definition.
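A minimal sketch of the start-and-poll pattern, assuming boto3 (job name, arguments, and worker settings are placeholders):

```python
import boto3

glue = boto3.client("glue")

# Start a Glue job with input arguments and a worker type.
run = glue.start_job_run(
    JobName="normalize-orders",
    Arguments={"--input_path": "s3://my-bucket/raw/"},
    WorkerType="G.1X",
    NumberOfWorkers=2,
)

# Poll until the run reaches SUCCEEDED or FAILED.
state = glue.get_job_run(JobName="normalize-orders", RunId=run["JobRunId"])
print(state["JobRun"]["JobRunState"])
```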
AWS Lambda
AWS Lambda is a serverless compute service that runs your code in response to events and automatically manages the underlying compute resources. You can use AWS Lambda to extend other AWS services with custom logic or create your own back-end services that operate at AWS scale, performance, and security. AWS Lambda can automatically run code in response to multiple events, such as HTTP requests via Amazon API Gateway, modifications to objects in Amazon S3 buckets, table updates in Amazon DynamoDB, and state transitions in AWS Step Functions.
Key Features:
This Universal Extension provides the following key features:
Trigger a Lambda function synchronously or asynchronously.
Support for authorization via an IAM Role-Based Access Control (RBAC) strategy.
Support for a default or on-demand AWS region.
Support for proxy communication via the HTTP/HTTPS protocol.
What's New in V1.2.0
Provides the capability to use a custom Lambda service endpoint.
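A minimal invocation sketch, assuming boto3 (region, endpoint, function name, and payload are placeholders):

```python
import boto3

# region_name and endpoint_url illustrate the on-demand region and custom
# service endpoint options.
lam = boto3.client(
    "lambda",
    region_name="eu-west-1",
    endpoint_url="https://lambda.eu-west-1.amazonaws.com",
)

# "RequestResponse" invokes synchronously; "Event" invokes asynchronously.
resp = lam.invoke(
    FunctionName="my-function",
    InvocationType="RequestResponse",
    Payload=b'{"key": "value"}',
)
print(resp["Payload"].read())
```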
Azure AZ CLI
The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation. The Universal Task for Azure AZ CLI allows calling a single Azure CLI command or a set of them.
Key Features:
This Universal Task provides the following key features:
Schedule and invoke Azure CLI commands; either a single command or a list of commands can be invoked.
No Azure CLI needs to be installed. This Universal Task uses the Microsoft-maintained Python module azure-cli.
Authenticate with Azure using your Azure user credentials or a service principal.
Invoke a single Azure CLI command.
Invoke a list of Azure CLI commands provided via a Universal Controller script file.
Choose different log levels.
The task can run on any Windows or Linux agent, without the need to install the Azure CLI.
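A minimal sketch of invoking CLI commands in-process via the azure-cli Python package (credentials are placeholders):

```python
from azure.cli.core import get_default_cli

cli = get_default_cli()

# Authenticate with a service principal (placeholder credentials).
cli.invoke(["login", "--service-principal",
            "-u", "<app-id>", "-p", "<secret>", "--tenant", "<tenant-id>"])

# Invoke a single Azure CLI command.
cli.invoke(["group", "list", "--output", "json"])
```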
Azure Batch
Azure Batch is a set of batch management capabilities that enables developers, scientists, and engineers to easily and efficiently create Azure Batch jobs and tasks. This Universal Extension provides the capability to submit Azure jobs, add Azure Batch tasks to jobs, and monitor tasks for completion.
Key Features:
Actions
Add an Azure Batch Job to a specific pool of nodes by providing the Azure Batch Job configuration in JSON format.
Add a task as part of a specific job and optionally monitor the task for completion.
Authentication
Ability to connect to Azure using Client Credentials Grant Type.
Ability to connect to Azure using User Credentials Grant Type.
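A rough sketch with the azure-batch SDK (an assumption about the client library; shared-key auth keeps the sketch short, whereas the extension itself uses the Azure AD credential grant types listed above):

```python
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
from azure.batch.models import JobAddParameter, PoolInformation, TaskAddParameter

client = BatchServiceClient(
    SharedKeyCredentials("<account>", "<key>"),
    batch_url="https://<account>.<region>.batch.azure.com",
)

# Add a job to a specific pool, add a task to it, then check the task state.
client.job.add(JobAddParameter(id="nightly", pool_info=PoolInformation(pool_id="pool-1")))
client.task.add("nightly", TaskAddParameter(id="task-1", command_line="echo hello"))
print(client.task.get("nightly", "task-1").state)
```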
Azure Blob: Manage File Transfers
The integration for Azure Blob Storage allows secure transfer of files from Azure Blob Storage containers and folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With the Stonebranch Universal Automation Center, you can securely automate your AWS, Azure, Google, and MinIO file transfers and integrate them into your existing scheduling flows.
Key Features:
The following file transfer commands are supported:
Upload file(s) to an Azure Blob Storage container.
Download file(s) from an Azure Blob Storage container.
Transfer files between Azure Blob Storage containers.
List objects in an Azure Blob Storage container.
Delete object(s) in an Azure Blob Storage container.
List Azure Blob Storage container names.
Create an Azure Blob Storage container.
File transfers can be triggered by a third-party application using the Universal Automation Center RESTful web services API.
The integration for Azure Blob Storage can be integrated into any existing scheduling workflow in the same way as any standard Linux or Windows task type.
Security is ensured by using the HTTPS protocol with support for an optional proxy server.
Supports Azure token-based Shared Access Signature (SAS).
No Universal Agent needs to be installed on the Azure cloud – the communication goes via HTTPS.
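A minimal sketch of the kind of operation the integration performs, assuming the azure-storage-blob SDK and SAS-token authentication (account, token, and names are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# SAS-token authentication, matching the Shared Access Signature support above.
service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential="<sas-token>",
)
container = service.get_container_client("reports")

# Upload a file, then list objects in the container.
with open("report.csv", "rb") as data:
    container.upload_blob(name="daily/report.csv", data=data, overwrite=True)
for blob in container.list_blobs(name_starts_with="daily/"):
    print(blob.name)
```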
Azure Data Factory: Schedule, Trigger, and Monitor
This integration allows users to schedule, trigger, and monitor the Azure Data Factory pipeline process directly from the Universal Controller.
Key Features:
Uses Python modules azure-mgmt-resource and azure-mgmt-datafactory to make REST API calls to Azure Data Factory.
Uses the Azure tenant ID, subscription ID, client ID, client secret, resource group, and location to authenticate the REST API calls to Azure Data Factory.
Perform the following Azure Data Factory operations:
Run a pipeline.
Get pipeline info.
List all pipelines.
Cancel a pipeline run.
List factories by resource group.
For Azure Data Factory triggers, users can perform the following operations from UAC:
Start a trigger.
Stop a trigger.
List triggers by factory.
UAC can also restart a failed pipeline, either from the failed step or from any activity name in the failed pipeline.
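A minimal run-and-poll sketch with the azure-mgmt-datafactory module named above (tenant, subscription, and resource names are placeholders):

```python
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
adf = DataFactoryManagementClient(credential, "<subscription-id>")

# Run a pipeline, then poll the run status.
run = adf.pipelines.create_run("<resource-group>", "<factory>", "copy_pipeline")
status = adf.pipeline_runs.get("<resource-group>", "<factory>", run.run_id).status
print(status)  # e.g., InProgress, Succeeded, Failed
```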
Azure Synapse
Azure Synapse is a cloud-based analytics service that combines big data and data warehousing capabilities to enable organizations to ingest, prepare, manage, and analyze large volumes of data for business insights.
This Universal Task provides the capability to run, monitor, and restart Azure Synapse pipelines from Universal Automation Center.
Key Features:
This Universal Task provides the following key features:
Run a pipeline.
Run a pipeline with parameters.
List all pipelines in a workspace.
Monitor the started Synapse pipeline.
Cancel a pipeline run.
Cancel a pipeline run recursively.
Rerun a pipeline from a specified activity or from the beginning.
Service principal-based authentication to Azure Synapse.
Certificate-based TLS connection.
What's New in V1.2.0
Enhancements: JSON output is provided for further processing of the output in, e.g., workflows.
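A hedged sketch against the documented Synapse REST endpoint, with service principal authentication (workspace, pipeline, and credential values are placeholders):

```python
import requests
from azure.identity import ClientSecretCredential

cred = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
token = cred.get_token("https://dev.azuresynapse.net/.default").token

# Run a pipeline with parameters.
resp = requests.post(
    "https://<workspace>.dev.azuresynapse.net/pipelines/my_pipeline/createRun",
    params={"api-version": "2020-12-01"},
    headers={"Authorization": f"Bearer {token}"},
    json={"run_date": "2024-01-01"},  # pipeline parameters
)
print(resp.json()["runId"])
```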
Databricks: Automate Jobs and Clusters
This integration allows users to perform end-to-end orchestration and automation of jobs and clusters in a Databricks environment, either in AWS or Azure.
Key Features:
Uses Python module requests to make REST API calls to the Databricks environment.
Uses the Databricks URL and the user bearer token to connect with the Databricks environment.
With respect to Databricks jobs, this integration can perform the following operations:
Create and list jobs.
Get job details.
Run new jobs.
Run submit jobs.
Cancel run jobs.
With respect to Databricks clusters, this integration can perform the following operations:
Create, start, and restart a cluster.
Terminate a cluster.
Get cluster info.
List clusters.
With respect to Databricks DBFS, this integration also provides a feature to upload larger files.
What's New in V1.3.3
The latest release provides enhanced error-handling capabilities while monitoring the status of running jobs, and enables recovery options in the event of errors occurring during the monitoring process.
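A minimal sketch of the REST calls involved, using the requests module and bearer token mentioned above (workspace URL, token, and job id are placeholders):

```python
import requests

BASE = "https://<workspace>.cloud.databricks.com"  # the Databricks URL
HEADERS = {"Authorization": "Bearer <token>"}      # the user bearer token

# Run an existing job, then read the run's life-cycle state.
run = requests.post(f"{BASE}/api/2.1/jobs/run-now",
                    headers=HEADERS, json={"job_id": 123}).json()
state = requests.get(f"{BASE}/api/2.1/jobs/runs/get",
                     headers=HEADERS, params={"run_id": run["run_id"]}).json()
print(state["state"]["life_cycle_state"])  # e.g., RUNNING, TERMINATED
```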
DBT: Core CLI
DBT (Data Build Tool) Core is an open-source tool designed to help data engineers and analysts manage and transform data in a reliable and scalable manner. Some key features of DBT Core include version control integration, automated testing, documentation generation, and dependency management. These features enable teams to collaborate effectively, ensure the quality of their data transformations, and maintain documentation for their data pipelines. DBT Core simplifies the process of transforming data in a data warehouse environment, providing a structured and scalable approach that helps teams manage their data transformation workflows effectively.
This integration provides an interface to call the DBT Core CLI and enables users to execute DBT Core CLI tasks in Universal Controller. This is particularly useful when UC acts as a data pipeline orchestrator and DBT is a step within a data pipeline Universal Controller workflow.
Key Features:
Show the DBT version and DBT environment variables as task instance STDOUT, enabling users to retrieve information on the DBT Core installation.
Execute a given DBT command. DBT output is displayed as task instance STDOUT.
Artifacts can be collected, and related information can be published within Universal Controller for subsequent workflow tasks to use.
Ability to safely store and pass UC credentials as DBT environment variables.
What's New in V1.0.1
Include the integration icon on the Universal Template.
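A minimal sketch of invoking the DBT Core CLI as a subprocess, with a credential passed as a DBT environment variable (the variable name and project path are placeholders):

```python
import os
import subprocess

# Pass a UC credential to DBT via an environment variable.
env = dict(os.environ, DBT_ENV_SECRET_PASSWORD="<from-UC-credentials>")
result = subprocess.run(
    ["dbt", "run", "--project-dir", "/opt/dbt/my_project"],
    env=env, capture_output=True, text=True,
)
print(result.stdout)  # surfaced as task instance STDOUT
```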
Fivetran
Fivetran is a cloud-based data integration platform that helps businesses automate the process of extracting data from various sources, transforming it, and loading it into a data warehouse for analysis. It offers a wide range of pre-built connectors to popular data sources such as databases, marketing platforms, CRMs, and more. Fivetran's connectors are designed to be easy to set up and use, allowing businesses to focus on data analysis rather than data integration. The platform also offers features such as data transformation, scheduling, monitoring, and access control to help organizations manage their data pipelines more efficiently. Fivetran supports a variety of data warehouses, including Amazon Redshift, Google BigQuery, Snowflake, and Microsoft Azure.
This integration allows users to create and execute Fivetran tasks in Universal Controller.
Key Features:
Trigger a Sync action on a Connector. Optionally, wait until Sync is completed.
Trigger a Re-sync (Historical Sync) action on a Connector. Optionally, wait until Re-sync is completed.
What's New in V1.0.1
A fix is provided to execute the integration with a Python installation other than the one bundled with the Universal Agent.
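A minimal sketch against the Fivetran REST API, assuming API key/secret basic auth (the connector id is a placeholder):

```python
import requests

AUTH = ("<api-key>", "<api-secret>")
BASE = "https://api.fivetran.com/v1/connectors/<connector-id>"

requests.post(f"{BASE}/sync", auth=AUTH)      # trigger a Sync
# requests.post(f"{BASE}/resync", auth=AUTH)  # trigger a historical Re-sync

# Poll the connector until the sync completes.
state = requests.get(BASE, auth=AUTH).json()["data"]["status"]["sync_state"]
print(state)  # e.g., "syncing" or "scheduled"
```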
Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations
Google BigQuery is a serverless, highly scalable data warehouse designed to make SQL-based queries on large datasets fast and easy. It is part of the Google Cloud Platform and enables users to perform real-time data analysis with its powerful processing capabilities and simple SQL interface.
This Universal Task allows Stonebranch users to schedule, trigger, monitor, and orchestrate the Google BigQuery process directly from the Universal Controller.
This task uses the Python modules google-cloud-bigquery and google-auth to make REST API calls to Google BigQuery.
This task uses the GCP project ID, BigQuery SQL or schema, dataset ID, job ID, location, table ID, Cloud Storage URI, and source file format as parameters of the BigQuery functions, and the GCP key file (API key) of the service account to authenticate the REST API calls to Google BigQuery.
Key Features
This Universal Task provides the following main features:
BigQuery SQL
List dataset
List tables in the dataset
View job information
Create a dataset
Load local file to a table
Load cloud storage data to a table
Export table data
What's New in V1.1.4
This new release involves a bug fix.
Fix: usage of double quotes for the _scriptPath.
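A minimal sketch with the google-cloud-bigquery and google-auth modules named above (the key file path, project, dataset, and table names are placeholders):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Authenticate with the service-account key file.
creds = service_account.Credentials.from_service_account_file("keyfile.json")
client = bigquery.Client(project="<project-id>", credentials=creds)

# Run a SQL query job and read the result.
job = client.query("SELECT COUNT(*) AS n FROM `my_dataset.my_table`")
for row in job.result():
    print(row["n"])

# Load Cloud Storage data into a table.
client.load_table_from_uri("gs://my-bucket/data.csv", "my_dataset.my_table").result()
```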
Hitachi Vantara: Pentaho Data Integration
Pentaho Data Integration provides powerful ETL (Extract, Transform, and Load) capabilities; this integration makes them available in Universal Controller via its Universal Extension capabilities.
Key Features:
Run a Pentaho Job from its Carte-configured repository
Run a Pentaho Job from its Repository
Run a Pentaho Job from a remote job definition file
Define & Run a Pentaho Job from Universal Controller Script Library
Execute a Pentaho Transformation from a Repository
Execute a Pentaho Transformation from a remote transformation definition file (XML)
Define & Run a Pentaho Transformation from Universal Controller Script Library
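An illustrative sketch of launching a repository job via Pentaho's kitchen CLI (an assumption; the integration can also call a Carte server or use inline definitions, and all names are placeholders):

```python
import subprocess

result = subprocess.run(
    ["kitchen.sh", "-rep=my_repo", "-user=<user>", "-pass=<password>",
     "-dir=/jobs", "-job=load_orders", "-level=Basic"],
    capture_output=True, text=True,
)
print(result.stdout)
```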
IBM InfoSphere: DataStage Jobs
IBM DataStage is a powerful data integration and ETL (Extract, Transform, Load) tool by IBM. It helps organizations extract data from various sources, transform it to meet specific business requirements, and load it into target systems such as data warehouses, data marts, and operational data stores. This Universal Task allows Stonebranch users to schedule, trigger, monitor, and orchestrate InfoSphere DataStage jobs directly from Universal Controller via the DataStage command-line interface (CLI). UAC workflows can be used to set up dependencies between DataStage jobs and other applications or platforms, and to trigger DataStage jobs from UAC — either time-based or event-based.
Key Features:
This Universal Extension provides the following key features:
Ability to schedule, trigger & orchestrate the execution of Datastage jobs
Monitor the DataStage job execution
Resume a restartable DataStage job from UAC (by adding “-mode RESTART” to the parameters)
Print DataStage job logs post-execution
The benefits of using this integration include the ability to centralize control of DataStage jobs alongside other automated tasks within a data pipeline. Additionally, end-users can quickly root-cause errors, easily re-run jobs, and gain observability by managing DataStage jobs from within UAC. Check out this DataStage Job Scheduler article.
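A minimal sketch of the dsjob CLI calls the task wraps (project and job names are placeholders):

```python
import subprocess

# Run a job and wait for completion; "-mode RESTART" resumes a restartable job.
subprocess.run(["dsjob", "-run", "-mode", "RESTART", "-jobstatus",
                "my_project", "load_orders"], check=True)

# Print the job log summary after execution.
subprocess.run(["dsjob", "-logsum", "my_project", "load_orders"])
```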
Informatica Cloud: Schedule, Control, and Manage
This integration allows users to schedule any data integration task, linear taskflow, or taskflow in the Informatica Cloud.
Key Features:
Schedule Data Integration Tasks, Linear Taskflows, Taskflows, and Processes in the Informatica Cloud.
Support for Parameters in Processes and Taskflows.
Support for Customer Names in Taskflows.
Resume a suspended taskflow instance from a faulted step.
Skip a faulted step and resume a suspended taskflow instance from the next step.
All communication is web-service based, using the latest Informatica REST API (versions 2 and 3) with support for folders.
Log files, including the activity, session, and error logs, are available from the Universal Controller web UI in the same way as in the Informatica monitoring console.
What's New in V1.3.0
Resume a suspended taskflow instance from a faulted step.
Skip a faulted step and resume a suspended taskflow instance from the next step.
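A heavily hedged sketch of the Informatica Cloud v2 REST pattern (the login host varies by pod, and the task id and type values are placeholders): log in for a session id, then start a task by id and type.

```python
import requests

# Log in to obtain a session id and the pod-specific server URL (v2 API).
login = requests.post(
    "https://dm-us.informaticacloud.com/ma/api/v2/user/login",
    json={"@type": "login", "username": "<user>", "password": "<password>"},
).json()

# Start a data integration task by id; "DTEMPLATE" denotes a mapping task.
resp = requests.post(
    f"{login['serverUrl']}/api/v2/job",
    headers={"icSessionId": login["icSessionId"]},
    json={"@type": "job", "taskId": "<task-id>", "taskType": "DTEMPLATE"},
)
print(resp.status_code)
```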
Informatica PowerCenter: Schedule, Control, and Manage
This integration allows users to schedule Informatica PowerCenter workflows and tasks, including retrieving the workflow and session log. It's also possible to start a workflow from a certain task onwards.
Key Features:
Schedules Informatica PowerCenter via its web services hub; therefore, no installation on any Informatica system is required.
Based on the standard PowerCenter web services hub using the SOAP protocol.
The PowerCenter web services hub interface is called by a Universal Agent running on a Linux server or Windows server.
The following actions are supported:
Start a task in an Informatica PowerCenter workflow.
Start an Informatica PowerCenter workflow.
Start an Informatica PowerCenter workflow from a given task onwards.
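An illustrative sketch only: posting a SOAP request to the PowerCenter Web Services Hub with the requests module. The endpoint path, operation name, and envelope are simplified placeholders, not the exact WSDL contract.

```python
import requests

# The envelope below is a simplified placeholder.
ENVELOPE = """<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <startWorkflow><!-- workflow identification goes here --></startWorkflow>
  </soapenv:Body>
</soapenv:Envelope>"""

resp = requests.post(
    "http://<wsh-host>:7333/wsh/services/BatchServices/DataIntegration",
    data=ENVELOPE,
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
print(resp.status_code)
```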
Microsoft Power BI
This integration allows users to connect to and combine data from multiple data sources through a secure interface, creating Power BI datasets. Additionally, users may combine and transform collections of tables into a data pipeline, creating Power BI dataflows. This integration provides the capability to perform refresh actions on Microsoft Power BI datasets and dataflows.
A typical use case is when UAC is used as a data pipeline orchestrator, where the last step of a workflow could be the refresh of a Microsoft Power BI dataset or dataflow.
Key Features:
Capability to execute refresh commands for datasets or dataflows and monitor their status.
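A minimal sketch against the Power BI REST API (the dataset id and the Azure AD token acquisition are placeholders):

```python
import requests

HEADERS = {"Authorization": "Bearer <aad-token>"}
BASE = "https://api.powerbi.com/v1.0/myorg/datasets/<dataset-id>"

# Trigger a dataset refresh, then check the latest refresh status.
requests.post(f"{BASE}/refreshes", headers=HEADERS)
history = requests.get(f"{BASE}/refreshes?$top=1", headers=HEADERS).json()
print(history["value"][0]["status"])  # e.g., "Completed" or "Failed"
```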