Free
Video
Amazon S3: Cloud Storage Bucket File Transfer
The Amazon S3 Cloud Storage Bucket File Transfer integration allows you to securely automate file transfers to, from, and between Amazon S3 cloud storage buckets and third-party application folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With Universal Automation Center (UAC), you can securely automate your AWS tasks and integrate them into existing scheduling workflows.
Key Features:
Automate file transfers in real time.
Drag-and-drop as a task into any existing scheduling workflow within the UAC.
File transfers can be triggered by a third-party application using the UAC RESTful web service API (REST API).
The following file transfer commands are supported:
Upload file(s) to an S3 bucket.
Download file(s) from an S3 bucket.
Transfer files between S3 buckets.
List objects in an S3 bucket.
Delete object(s) in an S3 bucket.
List S3 bucket names.
Create an S3 bucket.
Additional Info:
Security is ensured by using the HTTPS protocol with support for an optional proxy server.
Supports AWS IAM Role-Based Access Control (RBAC).
No Universal Agent needs to be installed on the AWS Cloud – the communication goes via HTTPS.
AWS canned ACLs are supported, e.g., to grant full access to the bucket owner.
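For illustration only, a minimal sketch of the equivalent bucket operations using boto3, the standard AWS SDK for Python; boto3 is not a component of this integration, and all bucket, key, and file names below are placeholders:

```python
import boto3

# Credentials and region come from the standard AWS credential chain
# (environment variables, profile, or an IAM role).
s3 = boto3.client("s3")

# Upload a file, granting full control to the bucket owner via a canned ACL.
s3.upload_file("report.csv", "my-bucket", "inbound/report.csv",
               ExtraArgs={"ACL": "bucket-owner-full-control"})

# Download an object to a local file.
s3.download_file("my-bucket", "inbound/report.csv", "report_copy.csv")

# Copy an object between buckets.
s3.copy({"Bucket": "my-bucket", "Key": "inbound/report.csv"},
        "my-other-bucket", "archive/report.csv")

# List objects, delete an object, list bucket names, create a bucket.
print(s3.list_objects_v2(Bucket="my-bucket", Prefix="inbound/").get("Contents", []))
s3.delete_object(Bucket="my-bucket", Key="inbound/report.csv")
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
s3.create_bucket(Bucket="my-new-bucket")
```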
Free
Amazon SQS: Message
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This integration provides the capability to send an AWS SQS message to an existing queue.
Key Features:
This Universal Extension provides the following main features:
Send a message to a standard or a FIFO queue.
Capability to control the transport of the messages by configuring the message Delay Seconds (for standard queues) and the message group ID and message deduplication ID (for FIFO queues).
Capability to fetch the list of queue names dynamically from SQS for selection during task creation.
Capability to be authorized via the IAM Role-Based Access Control (RBAC) strategy.
Capability for proxy communication via the HTTP/HTTPS protocol.
What's New in v1.1.0
This new release gives users the capability to rely on the AWS credentials set up in the environment where the extension is running, so they no longer have to be passed as an input field on the task definition. The same applies to the AWS Region.
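As an illustration of what such a task does under the hood, a minimal boto3 sketch of sending a message to a FIFO queue; boto3 is not part of the integration, and the queue name and message values are placeholders:

```python
import boto3

sqs = boto3.client("sqs", region_name="eu-central-1")  # region is a placeholder

# Resolve the queue URL from its name (queue names can also be listed via list_queues()).
queue_url = sqs.get_queue_url(QueueName="orders-queue.fifo")["QueueUrl"]

# Send a message to a FIFO queue; the group ID and deduplication ID apply to FIFO
# queues only, while DelaySeconds would be used for standard queues instead.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody='{"order_id": 4711}',
    MessageGroupId="orders",
    MessageDeduplicationId="order-4711",
)
```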
Free
Amazon SQS: Monitor
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This Universal Extension provides the capability to monitor AWS SQS messages from an existing queue and run Universal Tasks and/or workflows accordingly.
Key Features:
This Universal Extension provides the following main features:
Actions
Monitor AWS SQS messages from a standard or a FIFO queue.
Launch a task in Universal Controller with variables holding the ID, body, attributes, message attributes, and receipt handle for each fetched message.
Authentication
AWS credentials.
IAM Role-Based Access Control (RBAC) strategy.
Other
Communication through a proxy using HTTP or HTTPS.
What's New in v1.1.0
This new release gives users the capability to rely on the AWS credentials set up in the environment where the extension is running, so they no longer have to be passed as an input field on the task definition. The same applies to the AWS Region.
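For illustration, a minimal boto3 sketch of the kind of polling such a monitor performs; boto3 is not part of the extension, and the queue name is a placeholder:

```python
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="orders-queue")["QueueUrl"]

# Long-poll for messages, including attributes and message attributes -
# roughly the fields the monitor hands to the launched task as variables.
response = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=20,
    AttributeNames=["All"],
    MessageAttributeNames=["All"],
)

for message in response.get("Messages", []):
    print(message["MessageId"], message["Body"], message["ReceiptHandle"])
    # Delete the message once it has been handled.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```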
Free
Apache Airflow
Apache Airflow is an open-source platform created to programmatically author, schedule, and monitor workflows. This Universal Extension provides the capability to integrate with an Apache Airflow standalone server or Google Cloud Composer Airflow and use it as part of your end-to-end UC workflow, allowing high-level visibility and orchestration of data-oriented jobs or pipelines. You gain many enterprise-level benefits by using the Universal Automation Center to control Airflow. Notable benefits include:
Run your Airflow data pipelines in real time by using event-based triggers from the UC.
Add Airflow tasks to broader business process workflows in the UC.
Gain observability of your Airflow workflows, with proactive alerts from the UC.
Read this article to learn more about the benefits of integrating Airflow with the Universal Automation Center.
Key Features:
This Universal Extension provides the following key features:
Actions
Trigger a DAG run and optionally wait until the DAG reaches "success" or "failure".
Retrieve information about a specific DAG run.
Retrieve information about a task that is part of a specific DAG run.
Authentication
Basic authentication for a standalone Airflow server.
Service account private key for Google Cloud Composer.
Other
Capability to use an HTTP or HTTPS proxy instead of direct communication to a standalone Airflow server.
What's New in v2.0.0
The latest release provides the ability to integrate with a Google Cloud Composer Airflow server. Furthermore, it provides the ability to configure the "Trigger DAG Run" action by passing JSON configuration parameters.
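For illustration, a minimal sketch of the "trigger and wait" pattern against the Airflow stable REST API using basic authentication; the server URL, credentials, DAG ID, and configuration values are placeholders, and this is not the extension's own code:

```python
import time
import requests

BASE = "https://airflow.example.com/api/v1"      # standalone Airflow server (placeholder)
AUTH = ("airflow_user", "airflow_password")      # basic authentication (placeholder)

# Trigger a DAG run, optionally passing JSON configuration parameters.
run = requests.post(f"{BASE}/dags/my_dag/dagRuns",
                    auth=AUTH,
                    json={"conf": {"load_date": "2024-01-31"}}).json()
run_id = run["dag_run_id"]

# Poll the DAG run until it reaches "success" or "failed".
while True:
    state = requests.get(f"{BASE}/dags/my_dag/dagRuns/{run_id}", auth=AUTH).json()["state"]
    if state in ("success", "failed"):
        break
    time.sleep(30)
print(state)
```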
New
Free
Apache Kafka: Event Monitor
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. A Kafka Event Monitor is a Universal Extension responsible for monitoring events (messages) from topics in Kafka.
Key Features:
This Universal Extension provides the following main features:
Monitor messages and consume a Kafka message by consumer group topic subscription.
Authenticate against Kafka using the PLAINTEXT, SASL_SSL, or SSL security protocol.
Capability to filter Kafka messages based on the value of the message; Number, String, and JSON filter patterns are supported.
Capability to fetch Kafka topics dynamically during task creation.
Capability to control the partition assignment strategy, as well as session-related timeout values.
Typically, this extension is used to monitor events from Kafka and, upon successful execution, to trigger workflows or other tasks, or simply to pass information related to the Kafka event within UAC.
What's New in v1.1.0
With this new release, client authentication over SSL is supported.
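For illustration, a minimal consumer sketch using the kafka-python package (an assumption for this example, not necessarily the library the extension uses); broker, credentials, topic, and filter values are placeholders:

```python
import json
from kafka import KafkaConsumer

# Subscribe to a topic as part of a consumer group over SASL_SSL (values are placeholders).
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="broker.example.com:9093",
    group_id="uac-monitor",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="user",
    sasl_plain_password="secret",
    session_timeout_ms=30000,
    auto_offset_reset="latest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Consume messages and apply a simple value-based filter before acting on them.
for message in consumer:
    if message.value.get("status") == "NEW":
        print(message.topic, message.partition, message.offset, message.value)
```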
Free
Apache Kafka: Publish Event
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. This Integration is responsible for publishing events (messages) to topics in Kafka.
Key Features:
This Universal Extension supports the following main features:
Perform authentication against Kafka using the PLAINTEXT or SASL_SSL security protocol
Send a message to Kafka with the capability to select the topic, the partition, the message key, the message value, and message metadata
Capability to control the transport of the messages by configuring the message acknowledgment strategy and the request timeout
Capability to fetch topics and partitions dynamically from Kafka for selection during task creation
Capability to automatically select the serialization method depending on the key/value message data types
What's New in v1.2.0
This new release improves the robustness of this integration and adds a number of useful features:
Support for client authentication over SSL against the Kafka bootstrap server.
Extension output is now provided upon successful execution.
Debug-mode execution is enhanced to properly close the extension execution.
The Partition field is enhanced to show and use all its dependent fields.
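For illustration, a minimal producer sketch using the kafka-python package (an assumption for this example); broker, credentials, topic, and message values are placeholders:

```python
import json
from kafka import KafkaProducer

# acks and request_timeout_ms control the acknowledgment strategy and request timeout.
producer = KafkaProducer(
    bootstrap_servers="broker.example.com:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="user",
    sasl_plain_password="secret",
    acks="all",
    request_timeout_ms=30000,
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Send a message with an explicit topic, key, value, partition, and header metadata.
future = producer.send("orders", key="order-4711",
                       value={"order_id": 4711, "status": "NEW"},
                       partition=0, headers=[("source", b"uac")])
print(future.get(timeout=30))   # RecordMetadata: topic, partition, offset
producer.flush()
```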
Free
AWS Batch
AWS Batch is a set of batch management capabilities that enables developers, scientists, and engineers to quickly and efficiently run hundreds of thousands of batch computing jobs on AWS. The AWS Batch integration provides the ability to submit new AWS Batch Jobs and read the status of an existing AWS Batch Job.
Key Features:
This Universal Extension provides the following key features:
Support to submit a new Batch Job, with the option to Terminate Job after a timeout period.
Support to read Batch Job status for an existing Job ID.
Support for authorization via IAM Role-Based Access Control (RBAC) strategy.
Support for Proxy communication via HTTP/HTTPS protocol.
What's New in v1.2.0
This new release gives users the capability to submit AWS Batch Jobs and wait until the Jobs reach "Success" or "Failure". Key parameters such as the job status, ID, name, and ARN are updated live during execution.
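For illustration, a minimal boto3 sketch of the "submit and wait" pattern described above; boto3 is not part of the integration, and the job, queue, and definition names are placeholders:

```python
import time
import boto3

batch = boto3.client("batch")   # credentials/region from the standard AWS chain

# Submit a job against an existing queue and job definition, with an execution
# timeout after which AWS Batch terminates the attempt.
job = batch.submit_job(
    jobName="nightly-load",
    jobQueue="default-queue",
    jobDefinition="nightly-load-def:3",
    timeout={"attemptDurationSeconds": 3600},
)
job_id = job["jobId"]

# Poll the job until it reaches SUCCEEDED or FAILED.
while True:
    detail = batch.describe_jobs(jobs=[job_id])["jobs"][0]
    if detail["status"] in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(30)
print(detail["jobName"], detail["status"], detail.get("jobArn"))
```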
Free
AWS Glue
AWS Glue is a serverless data-preparation service for extract, transform, and load (ETL) operations. It makes it easy for data engineers, data analysts, data scientists, and ETL developers to extract, clean, enrich, normalize, and load data. This integration provides the capability to submit a new AWS Glue Job.
Key Features:
This Universal Extension provides the following key features:
Actions
Start a Glue job.
Start a Glue job and wait until it reaches the state "success" or "failed".
Authentication
Authentication through HTTPS
Authentication through IAM Role-Based Access Control (RBAC) strategy.
Input/Output
Option to pass input arguments as a UAC script supporting UAC environment variables and UAC Functions.
Other
Support for Proxy communication via HTTP/HTTPS protocol.
What's New in v1.2.0
This new release gives users the capability to start a Glue job and wait until the job reaches "Success" or "Failure". Key parameters such as the job run status and ID are updated live during execution. Furthermore, the option to pass input arguments as a UAC script supporting UAC environment variables and UAC Functions was added.
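For illustration, a minimal boto3 sketch of starting a Glue job with input arguments and waiting for the result; boto3 is not part of the integration, and the job name and arguments are placeholders:

```python
import time
import boto3

glue = boto3.client("glue")   # credentials/region from the standard AWS chain

# Start a Glue job with input arguments.
run = glue.start_job_run(
    JobName="customer-etl",
    Arguments={"--load_date": "2024-01-31", "--target_table": "dwh.customers"},
)
run_id = run["JobRunId"]

# Poll the job run until it reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="customer-etl", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
print(run_id, state)
```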
Free
AWS Lambda
AWS Lambda is a serverless compute service that runs your code in response to events and automatically manages the underlying compute resources. You can use AWS Lambda to extend other AWS services with custom logic or create your own back-end services that operate at AWS scale, performance, and security. AWS Lambda can automatically run code in response to multiple events, such as HTTP requests via Amazon API Gateway, modifications to objects in Amazon S3 buckets, table updates in Amazon DynamoDB, and state transitions in AWS Step Functions.
Key Features:
This Universal Extension provides the following key features:
Trigger a Lambda function synchronously or asynchronously.
Support authorization via IAM Role-Based Access Control (RBAC) strategy.
Support for a default or on-demand AWS Region.
Support Proxy communication via HTTP/HTTPS protocol.
What's New in v1.1.1
This new release provides a fix for Lambda functions that take a long time to complete. The fix allows task authors to control how long the Universal Task will wait for a Lambda function to send a response.
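For illustration, a minimal boto3 sketch of synchronous and asynchronous invocation, including a longer read timeout for long-running functions; boto3 is not part of the integration, and the function name and payload are placeholders:

```python
import json
import boto3
from botocore.config import Config

# A longer read timeout keeps the synchronous call from timing out while the
# Lambda function is still running.
lambda_client = boto3.client(
    "lambda", config=Config(read_timeout=900, retries={"max_attempts": 0}))

# Synchronous invocation: waits for the function result.
response = lambda_client.invoke(
    FunctionName="my-function",
    InvocationType="RequestResponse",
    Payload=json.dumps({"key": "value"}),
)
print(response["StatusCode"], response["Payload"].read())

# Asynchronous invocation: Lambda queues the event and returns immediately.
lambda_client.invoke(FunctionName="my-function", InvocationType="Event",
                     Payload=json.dumps({"key": "value"}))
```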
Free
Azure AZ CLI
The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation. The Universal Task for Azure AZ CLI allows calling a single Azure CLI command or a set of them.
Key Features:
This Universal Task provides the following key features:
The Universal Task for Azure AZ CLI allows you to schedule and invoke Azure CLI commands.
Either a single command or a list of commands can be invoked.
No Azure CLI needs to be installed; this Universal Task uses the Microsoft-maintained Python module azure-cli.
The following functionalities can be performed:
Authenticate with Azure using your Azure user credentials.
Authenticate with Azure using a service principal.
Invoke a single Azure CLI command.
Invoke a list of Azure CLI commands provided via a Universal Controller script file.
Choose different log levels.
The Task can run on any Windows or Linux Agent, without the need to install the Azure CLI.
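For illustration, a minimal sketch of driving the Azure CLI from Python through the azure-cli package's programmatic entry point (an assumption about how such calls can be made, not necessarily this task's implementation); the service principal values are placeholders:

```python
from azure.cli.core import get_default_cli

cli = get_default_cli()

# Authenticate with a service principal, then invoke a CLI command.
cli.invoke(["login", "--service-principal",
            "-u", "<app-id>", "-p", "<client-secret>", "--tenant", "<tenant-id>"])
exit_code = cli.invoke(["vm", "list", "--output", "table"])
print("az vm list exit code:", exit_code)
```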
Free
Azure Batch
Azure Batch is a set of batch management capabilities that enables developers, scientists, and engineers to easily and efficiently create Azure Batch jobs and tasks. This Universal Extension provides the capability to submit Azure Jobs, add Azure Batch tasks to jobs, as well as monitor tasks for completion.
Key Features:
Actions
Add an Azure Batch Job to a specific pool of nodes by providing the Azure Batch Job configuration in JSON format.
Add a task as part of a specific job and optionally monitor the task for completion.
Authentication
Ability to connect to Azure using Client Credentials Grant Type.
Ability to connect to Azure using User Credentials Grant Type.
Free
Azure Blob: Manage File Transfers
The integration for Azure Blob Storage allows secure transfer of files to, from, and between Azure Blob Storage containers and folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With the Stonebranch Universal Automation Center, you can securely automate your AWS, Azure, Google, and MinIO file transfers and integrate them into your existing scheduling flows.
Key Features:
The following file transfer commands are supported:
Upload file(s) to an Azure Blob Storage container.
Download file(s) from an Azure Blob Storage container.
Transfer files between Azure Blob Storage containers.
List objects in an Azure Blob Storage container.
Delete object(s) in an Azure Blob Storage container.
List Azure Blob Storage container names.
Create an Azure Blob Storage container.
File transfers can be triggered by a third-party application using the Universal Automation Center RESTful web service API (REST API).
The integration for Azure Blob Storage can be added to any existing scheduling workflow in the same way as any standard Linux or Windows task type.
Security is ensured by using the HTTPS protocol with support for an optional proxy server.
Supports Azure token-based Shared Access Signature (SAS).
No Universal Agent needs to be installed on the Azure cloud – the communication goes via HTTPS.
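For illustration, a minimal sketch of the container operations using the azure-storage-blob package with a SAS token; the package is not part of the integration, and the account, container, and blob names are placeholders:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder account URL and SAS token; all calls go over HTTPS.
service = BlobServiceClient(account_url="https://myaccount.blob.core.windows.net",
                            credential="<sas-token>")
container = service.get_container_client("inbound")

# Upload, download, list, and delete blobs in a container.
with open("report.csv", "rb") as data:
    container.upload_blob("report.csv", data, overwrite=True)
print(container.download_blob("report.csv").readall()[:100])
print([b.name for b in container.list_blobs()])

# Copy a blob to another container (a private source URL may need a SAS appended),
# then remove the source blob.
target = service.get_blob_client("archive", "report.csv")
target.start_copy_from_url(container.get_blob_client("report.csv").url + "?<sas-token>")
container.delete_blob("report.csv")

# List and create containers.
print([c.name for c in service.list_containers()])
service.create_container("new-container")
```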
Free
Azure Data Factory: Schedule, Trigger, and Monitor
This integration allows users to schedule, trigger, and monitor the Azure Data Factory pipeline process directly from the Universal Controller.
Key Features:
Uses Python modules azure-mgmt-resource and azure-mgmt-datafactory to make REST API calls to Azure Data Factory.
Uses the Azure tenant ID, subscription ID, client ID, client secret, resource group, and location for authenticating the REST API calls to Azure Data Factory.
Perform the following Azure Data Factory operations:
Run a pipeline.
Get pipeline info.
List all pipelines.
Cancel a pipeline run.
List factory by resource group.
For Azure Data Factory triggers, users can perform the following operations from the UAC:
Start trigger.
Stop trigger.
List triggers by factory.
The UAC can also restart a failed pipeline, either from the failed step or from any activity name in the failed pipeline.
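For illustration, a minimal sketch of triggering and monitoring a pipeline run with the azure-identity and azure-mgmt-datafactory packages mentioned above; the tenant, subscription, factory, pipeline, and trigger names are placeholders, and the method names assume current versions of those packages:

```python
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Authenticate with the tenant ID, client ID, and client secret.
credential = ClientSecretCredential(tenant_id="<tenant-id>",
                                    client_id="<client-id>",
                                    client_secret="<client-secret>")
adf = DataFactoryManagementClient(credential, "<subscription-id>")

# Run a pipeline and read its status back.
run = adf.pipelines.create_run("my-resource-group", "my-factory", "my-pipeline",
                               parameters={"load_date": "2024-01-31"})
status = adf.pipeline_runs.get("my-resource-group", "my-factory", run.run_id).status
print(run.run_id, status)

# List pipelines in the factory, then start and stop a trigger.
print([p.name for p in adf.pipelines.list_by_factory("my-resource-group", "my-factory")])
adf.triggers.begin_start("my-resource-group", "my-factory", "my-trigger").result()
adf.triggers.begin_stop("my-resource-group", "my-factory", "my-trigger").result()
```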
Free
Databricks: Automate Jobs and Clusters
This integration allows users to perform end-to-end orchestration and automation of jobs and clusters in a Databricks environment, either in AWS or Azure.
Key Features:
Uses Python module requests to make REST API calls to the Databricks environment.
Uses the Databricks URL and the user bearer token to connect with the Databricks environment.
With respect to Databricks jobs, this integration can perform the below operations:
Create and list jobs.
Get job details.
Run new jobs.
Run submit jobs.
Cancel run jobs.
With respect to the Databricks cluster, this integration can perform the below operations:
Create, start, and restart a cluster.
Terminate a cluster.
Get cluster info.
List clusters.
With respect to Databricks DBFS, this integration also provides a feature to upload larger files.
What's New in v1.3.3
The latest release provides enhanced error handling capabilities while monitoring the status of running jobs, and enables recovery options in the event of errors occurring during the monitoring process.
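For illustration, a minimal sketch of running and monitoring an existing job through the Databricks Jobs REST API with the requests module and a bearer token; the workspace URL, token, and job ID are placeholders:

```python
import time
import requests

HOST = "https://my-workspace.cloud.databricks.com"   # placeholder workspace URL
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

# Run an existing job by ID (Jobs API 2.1).
run = requests.post(f"{HOST}/api/2.1/jobs/run-now",
                    headers=HEADERS, json={"job_id": 123}).json()
run_id = run["run_id"]

# Poll the run until it finishes, then report the result state.
while True:
    state = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                         headers=HEADERS, params={"run_id": run_id}).json()["state"]
    if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        break
    time.sleep(30)
print(state.get("result_state"), state.get("state_message"))
```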
Free
Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations
This integration allows users to schedule, trigger, monitor, and orchestrate the Google BigQuery process directly from the Universal Controller.
Key Features:
Users can perform the below Google BigQuery operations:
BigQuery SQL.
List dataset.
List tables in a dataset.
View job information.
Create a dataset.
Load local file to a table.
Load cloud storage data to a table.
Export table data.
Additional Info:
This task uses the Python google-cloud-bigquery and google-auth modules to make REST API calls to Google BigQuery.
This task uses the GCP project ID, BigQuery SQL or schema, dataset ID, job ID, location, table ID, cloud storage URI, and source file format as parameters of the BigQuery functions, and the GCP KeyFile (API key) of a service account for authenticating the REST API calls to Google BigQuery.
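For illustration, a minimal sketch of a few of the operations above using the google-cloud-bigquery and google-auth modules with a service-account key file; the project, dataset, table, and bucket names are placeholders:

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Authenticate with the service-account key file (the GCP KeyFile referred to above).
credentials = service_account.Credentials.from_service_account_file("keyfile.json")
client = bigquery.Client(project="my-gcp-project", credentials=credentials)

# Run a SQL statement and wait for the job to finish.
job = client.query("SELECT COUNT(*) AS cnt FROM `my_dataset.my_table`")
for row in job.result():
    print(row["cnt"])

# List datasets and the tables in a dataset.
print([d.dataset_id for d in client.list_datasets()])
print([t.table_id for t in client.list_tables("my_dataset")])

# Load Cloud Storage data into a table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/data.csv", "my_dataset.my_table",
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.CSV,
                                      skip_leading_rows=1, autodetect=True),
)
load_job.result()
```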
Free
Hitachi Vantara: Pentaho Data Integration
Pentaho Data Integration provides powerful ETL (Extract, Transform, and Load) capabilities, and the Universal Controller integrates with it using its Universal Extension capabilities.
Key Features:
Run a Pentaho Job from its Carte configured repository
Run a Pentaho Job from its Repository
Run a Pentaho Job from a remote job definition file
Define & Run a Pentaho Job from Universal Controller Script Library
Execute a Pentaho Transformation from a Repository
Execute a Pentaho Transformation from a remote transformation definition file (XML)
Define & Run a Pentaho Transformation from Universal Controller Script Library
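As an illustrative sketch only: one way to run a Pentaho job from a job definition file is the Kitchen command-line tool, called here from Python; the integration itself may instead use Carte's HTTP services or repository connections, and the paths and options below are placeholders:

```python
import subprocess

# Run a Pentaho job definition file (.kjb) with Kitchen; paths are placeholders.
result = subprocess.run(
    ["/opt/pentaho/data-integration/kitchen.sh",
     "-file", "/jobs/load_dwh.kjb",     # job definition file
     "-level", "Basic"],                # log level
    capture_output=True, text=True,
)
print(result.returncode)
print(result.stdout[-2000:])            # tail of the Kitchen log
```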
Free
Video
Informatica Cloud: Schedule, Control, and Manage
This integration allows users to schedule any data integration task, linear taskflow, or taskflow in the Informatica Cloud.
Key Features:
Schedule data integration tasks, including linear taskflows, in the Informatica Cloud.
All communication is web-service based, using the latest Informatica REST API (versions 2 and 3) with support for folders.
Log files, including the activity, session, and error logs, are available from the Universal Controller web UI in the same way as in the Informatica monitoring console.
Free
Video
Informatica PowerCenter: Schedule, Control, and Manage
This integration allows users to schedule Informatica PowerCenter workflows and tasks, including retrieving the workflow and session log. It's also possible to start a workflow from a certain task onwards.
Key Features:
Schedules Informatica PowerCenter via its web services hub; therefore, no installation on any Informatica system is required.
Based on the standard PowerCenter web services hub using the SOAP protocol.
The PowerCenter web services hub interface is called by a Universal Agent running on a Linux server or Windows server.
The following actions are supported:
Start a task in an Informatica PowerCenter workflow.
Start an Informatica PowerCenter workflow.
Start an Informatica PowerCenter workflow from a given task onwards.
Free
Video
Microsoft Power BI: Refresh Business Intelligence
This integration allows users to refresh datasets and dataflows in the Microsoft Power BI business analytics service.
Key Features:
Refresh a dataset in a group-workspace or in my workspace.
Refresh a dataflow in a group-workspace.
Lookup datasets in a selected group.
Lookup dataflows in a selected group.
Connection to the Power BI REST API is done via the Python MSAL library.
Supports Windows and Linux Universal Agents in order to connect to the Power BI REST API.
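For illustration, a minimal sketch of acquiring a token with the Python MSAL library and refreshing a dataset in a group workspace via the Power BI REST API; the Azure AD app registration values and IDs are placeholders:

```python
import msal
import requests

# Client-credentials flow with MSAL (placeholder app registration values).
app = msal.ConfidentialClientApplication(
    client_id="<client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# Look up datasets in a group workspace, then trigger a refresh of the first one.
group_id = "<group-id>"
datasets = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets",
    headers=headers).json()["value"]
dataset_id = datasets[0]["id"]
requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes",
    headers=headers)
```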
Free
Microsoft SQL: Schedule SSRS
This integration can complete various administrative tasks, including publishing reports and moving reports from one server to another. It's based on the SQL Server Reporting Services 'rs.exe' command-line utility, which can perform many scripted operations related to SQL Server Reporting Services (SSRS). The rs.exe utility requires an input file that tells it what to do. The tasks that can be performed include, among others:
Deploying / Publishing reports
Moving reports
Exporting reports to a file
Adjusting security
Canceling a running job
Configuring SSRS system properties
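As an illustrative sketch only: rs.exe is driven by an input script file, and a scheduler can call it like any other command line, for example from Python; the server URL, script path, and variable below are placeholders:

```python
import subprocess

# Run an rs.exe input script against a report server (placeholder paths and URL).
result = subprocess.run(
    ["rs.exe",
     "-i", r"C:\scripts\deploy_reports.rss",      # input script telling rs.exe what to do
     "-s", "http://reportserver/ReportServer",    # target report server URL
     "-v", "targetFolder=/Finance"],              # optional variable passed to the script
    capture_output=True, text=True,
)
print(result.returncode, result.stdout)
```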
New
Free
Microsoft SQL: SSIS Package Execution
SQL Server Integration Services (SSIS) is a platform for building data integration and data transformation solutions. This Universal Extension task interactively allows users to list and select the SSIS folder, project, environment reference, and SSIS package while creating the job. Furthermore, it can trigger the SSIS package execution in the Microsoft SQL Server, monitor the SSIS package execution, and fetch SSIS logs to the Universal Controller when the SSIS package execution has completed.
Key Features:
This Universal Extension provides the following key features:
Dynamic choice fields to select the SSIS Folder/Project/Package/Environment Reference ID.
Launch SSIS package execution.
Monitor SSIS package execution.
Fetch SSIS package execution logs.
The SSIS execution ID and execution status are captured for every execution in the task instance.
Connection to the MS SQL Server is done via the Python pymssql module.
Supports Windows and Linux Universal Agents in order to connect to the MSSQL server.
What's New in v1.0.4
This new release involves a minor bug fix.
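As an illustrative sketch only, assuming the SSIS catalog (SSISDB): starting a package execution through pymssql and the catalog stored procedures, then reading the execution status; the server, folder, project, and package names are placeholders:

```python
import pymssql

conn = pymssql.connect(server="sqlserver.example.com", user="ssis_user",
                       password="secret", database="SSISDB", autocommit=True)
cursor = conn.cursor()

# Create and start an execution for a package in the SSIS catalog.
cursor.execute("""
    DECLARE @execution_id BIGINT;
    EXEC catalog.create_execution
         @folder_name = %s, @project_name = %s, @package_name = %s,
         @use32bitruntime = 0, @execution_id = @execution_id OUTPUT;
    EXEC catalog.start_execution @execution_id;
    SELECT @execution_id AS execution_id;
""", ("MyFolder", "MyProject", "LoadCustomers.dtsx"))
execution_id = cursor.fetchone()[0]

# Read the execution status (2 = running, 7 = succeeded, 4 = failed).
cursor.execute("SELECT status FROM catalog.executions WHERE execution_id = %s",
               (execution_id,))
print(execution_id, cursor.fetchone()[0])
```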
Free
Video
SAP IBP: Integrated Business Planning
SAP Integrated Business Planning (IBP) is a cloud-based planning software for supply chain management, and this Universal Extension helps to integrate with SAP IBP to schedule and launch an SAP IBP job template or process chain from the Universal Controller.
Key Features:
This Universal Extension provides the following main features:
The ability to schedule, automate, and execute SAP IBP jobs running in the SAP Cloud via the Stonebranch Universal Controller.
Monitor SAP IBP jobs from the Stonebranch Universal Controller.
Start/rerun/cancel batch processes automatically or manually from the Stonebranch Universal Controller.
Retrieve the results of the SAP IBP jobs that are executed.
Cancel an IBP job execution via a dynamic command or by a separate task using the cancel function choice.
Display real-time IBP execution status in the Universal Controller.
Free
SAP: Batch Input Map
This integration for SAP batch input allows users to schedule and execute batch input sessions in SAP. Batch input sessions enter data non-interactively into an SAP system. They are typically used to transfer data from non-SAP systems to SAP systems or to transfer data between SAP systems.
Key Features:
Runs a batch input session.
You only need to provide the batch input session name in the task variable.
It's possible to use wildcards (*) to run multiple batch input sessions.
There's no need to create a variant manually for the batch input session in SAP.
The SAP task uses the inline variants feature of USAP to create a temporary variant for the ABAP program RSBDCSUB with the batch input session name.
We are an SAP silver partner. Our product has certified integration with SAP S/4HANA.
Free
SAP: BusinessObjects Data Services
Using analytics tools to collect massive amounts of Big Data from your organization is one thing. Extracting meaning from that data and using it to drive real growth is another. BusinessObjects analytics from SAP can help you unleash the power of collective insight by delivering enterprise business intelligence, agile visualizations, and advanced predictive analytics to all users. Leverage the capabilities of SAP® BusinessObjects and schedule any SAP BusinessObjects Data Services ETL job in Stonebranch's Universal Automation Center by using the “AL_RWJobLauncher.exe” utility, which comes with the SAP Data Services installation. This Universal Task allows you to execute an SAP Data Services “ETL” job using “AL_RWJobLauncher.exe”.
Key Features:
It is based on “AL_RWJobLauncher.exe”, which is part of the Data Services installation.
The Task runs on Data Services for Windows and Linux.
The Universal Task provides the same error and trace information as the SAP Data Services Management Console.
You can select different log levels, for example, Info and Debug.
You can configure all connection parameters via the Universal Task.
Exception handling has been implemented for all parameters.
Configurable exit-code processing based on any information in the error.log file.
We are an SAP silver partner. Our product has certified integration with SAP S/4HANA.