Free
Video
Amazon S3: Cloud Storage Bucket File Transfer
The Amazon S3 Cloud Storage Bucket File Transfer integration allows you to securely automate file transfers to, from, and between Amazon S3 cloud storage buckets and third-party application folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With Universal Automation Center (UAC), you can securely automate your AWS tasks and integrate them into existing scheduling workflows.
Key Features:
Automate file transfers in real time.
Drag-and-drop as a task into any existing scheduling workflow within the UAC.
File transfers can be triggered by a third-party application using the UAC RESTful web service API (REST API).
The following file transfer commands are supported:
Upload file(s) to an S3 bucket.
Download file(s) from an S3 bucket.
Transfer files between S3 buckets.
List objects in an S3 bucket.
Delete object(s) in an S3 bucket.
List S3 bucket names.
Create an S3 bucket.
Additional Info:
Security is ensured by using the HTTPS protocol with support for an optional proxy server.
Supports AWS IAM Role-Based Access Control (RBAC).
No Universal Agent needs to be installed on the AWS Cloud – the communication goes via HTTPS.
AWS canned ACLs are supported, e.g., to grant full access to the bucket owner.
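The REST-triggered file transfers described above can be sketched as a helper that assembles the HTTP request a third-party application would send to the UAC REST API. The endpoint path and payload shape below are illustrative assumptions, not the authoritative contract; consult the Universal Controller REST API documentation for the exact fields.

```python
import json
from urllib.parse import urljoin

def build_launch_request(uac_base_url, task_name, variables=None):
    """Assemble the HTTP pieces for launching a UAC task via the REST API.

    The endpoint path and payload keys are assumptions for illustration.
    """
    payload = {"name": task_name}
    if variables:
        # Task variables are commonly passed as a list of name/value pairs.
        payload["variables"] = [
            {"name": k, "value": v} for k, v in sorted(variables.items())
        ]
    return {
        "method": "POST",
        "url": urljoin(uac_base_url, "/uc/resources/taskinstance/ops-task-launch"),
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }

req = build_launch_request(
    "https://uac.example.com",
    "s3-upload-task",
    variables={"bucket": "my-bucket", "prefix": "inbound/"},
)
print(req["url"])
```

The returned dict can be handed to any HTTP client; only the task name (and optional variables) change per transfer.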
Free
Amazon SQS: Message
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This integration provides the capability to send an AWS SQS message to an existing queue.
Key Features:
This Universal Extension provides the following main features:
Send a message to a standard or a FIFO queue.
Capability to control the transport of the messages by configuring the message delay seconds (for standard queues) and the message group ID and message deduplication ID (for FIFO queues).
Capability to dynamically fetch the queue names list from SQS for selection during task creation.
Capability to be authorized via the IAM Role-Based Access Control (RBAC) strategy.
Capability for proxy communication via the HTTP/HTTPS protocol.
What's New in v1.1.0:
This release allows users to rely on AWS credentials set up in the environment where the extension is running, so they no longer need to be passed as input fields on the task definition. The same applies to the AWS Region.
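The standard-vs-FIFO rule above can be sketched as a builder for SQS SendMessage parameters: DelaySeconds applies only to standard queues, while MessageGroupId/MessageDeduplicationId apply only to FIFO queues (queue URLs ending in ".fifo"). This is a minimal sketch, not the extension's implementation.

```python
def build_send_message_params(queue_url, body, delay_seconds=None,
                              group_id=None, dedup_id=None):
    """Assemble keyword arguments for an SQS SendMessage call.

    DelaySeconds is rejected for FIFO queues; FIFO queues require a
    MessageGroupId and may carry a MessageDeduplicationId.
    """
    is_fifo = queue_url.endswith(".fifo")
    params = {"QueueUrl": queue_url, "MessageBody": body}
    if is_fifo:
        if delay_seconds is not None:
            raise ValueError("DelaySeconds is not supported on FIFO queues")
        if group_id is None:
            raise ValueError("FIFO queues require a MessageGroupId")
        params["MessageGroupId"] = group_id
        if dedup_id is not None:
            params["MessageDeduplicationId"] = dedup_id
    elif delay_seconds is not None:
        params["DelaySeconds"] = delay_seconds
    return params
```

The resulting dict matches the boto3 `sqs.send_message(**params)` signature, so it can be passed straight to an SQS client.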
Free
Amazon SQS: Monitor
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This Universal Extension provides the capability to monitor AWS SQS messages from an existing queue and run Universal Tasks and/or workflows accordingly.
Key Features:
This Universal Extension provides the following main features:
Monitor AWS SQS messages from a standard or a FIFO queue.
Launch a task in Universal Controller with variables holding the ID, body, attributes, message attributes, and receipt handle for each fetched message.
Authentication via AWS credentials or the IAM Role-Based Access Control (RBAC) strategy.
Communication through a proxy using HTTP or HTTPS.
What's New in v1.1.0:
This release allows users to rely on AWS credentials set up in the environment where the extension is running, so they no longer need to be passed as input fields on the task definition. The same applies to the AWS Region.
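The per-message variables described above can be sketched as a flat mapping from one fetched SQS message onto task variables. The variable names here are illustrative assumptions; the extension's documentation defines the exact names it publishes.

```python
def message_to_task_variables(message, prefix="sqs_"):
    """Map one fetched SQS message onto flat task variables.

    The keys mirror the SQS ReceiveMessage response fields; the
    "sqs_" prefix is a hypothetical naming convention.
    """
    return {
        prefix + "id": message.get("MessageId", ""),
        prefix + "body": message.get("Body", ""),
        prefix + "attributes": message.get("Attributes", {}),
        prefix + "message_attributes": message.get("MessageAttributes", {}),
        prefix + "receipt_handle": message.get("ReceiptHandle", ""),
    }

vars_ = message_to_task_variables(
    {"MessageId": "m-1", "Body": "order shipped", "ReceiptHandle": "rh-1"}
)
print(vars_["sqs_id"])
```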
Free
Apache Airflow
Apache Airflow is an open-source platform created to programmatically author, schedule, and monitor workflows.
This integration provides the capability to integrate with Apache Airflow and use it as part of your end-to-end Universal Controller workflow, allowing high-level visibility and orchestration of data-oriented jobs or pipelines.
Key Features:
This Universal Extension provides the following main features:
Triggering a new DAG run.
Information retrieval of a specific DAG run.
Information retrieval for a task that is part of a specific DAG run.
Basic authentication (username/password) and SSL protocol.
Using a proxy between Universal Controller and Apache Airflow server.
What's New in v1.1.0:
The action "Trigger DAG Run" has been enhanced. The latest release provides the ability to configure the task to wait for the execution of the DAG Run until it reaches the state of "success" or "failed". Depending on the final DAG Run state, the task finishes with a corresponding status ("success" or "failed").
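The trigger-and-wait behavior above maps onto Airflow's stable REST API: a POST creates the DAG run, and polling its state ends when a terminal state is reached. This is a minimal sketch of the request construction and the terminal-state check, not the extension's implementation.

```python
def build_trigger_request(base_url, dag_id, conf=None):
    """Request pieces for triggering a DAG run via Airflow's stable REST API (v1)."""
    return {
        "method": "POST",
        "url": f"{base_url}/api/v1/dags/{dag_id}/dagRuns",
        "json": {"conf": conf or {}},
    }

def is_terminal(state):
    """A DAG run is finished once it reaches "success" or "failed"."""
    return state in ("success", "failed")

req = build_trigger_request("http://airflow.example.com:8080", "etl_daily",
                            conf={"run_date": "2024-01-01"})
print(req["url"])
```

A waiting task would issue the POST, then repeatedly GET the created DAG run and stop once `is_terminal(state)` is true, reporting the matching task status.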
Free
Apache Kafka: Event Monitor
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. A Kafka Event Monitor is a Universal Extension responsible for monitoring events (messages) from topics in Kafka.
Key Features:
This Universal Extension provides the following main features:
Support for consuming messages by consumer group subscription, from a specific topic, until a specific condition is met. Filtering is based on the value of the message. When a matching Kafka message is detected, the Universal Task finishes, publishing information related to the matched message in the extension output. Number, String, and JSON filter patterns are supported.
Support for authenticating to Kafka through PLAINTEXT or SASL_SSL SCRAM security protocol.
Typically this extension can be used to monitor events from Kafka and upon successful execution to trigger workflows or other tasks, or just to pass information related to the Kafka event within UAC.
Free
Apache Kafka: Publish Event
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. This integration is responsible for publishing events (messages) to topics in Kafka.
Key Features:
This Universal Extension supports the following main features:
Perform authentication towards Kafka, using the PLAINTEXT or SASL_SSL security protocol.
Send a message to Kafka with the capability to select the topic, the partition, the message key, the message value, and the message metadata.
Capability to control the transport of the messages by configuring the message acknowledgment strategy and the request timeout.
Capability to dynamically fetch topics and partitions from Kafka for selection during task creation.
Capability to automatically select the serialization method depending on the key/value message data types.
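The automatic serializer selection above can be sketched as a type dispatch. The rule shown (bytes pass through, strings are UTF-8 encoded, structured values are JSON-encoded) is one plausible reading, assumed for illustration.

```python
import json

def pick_serializer(value):
    """Serialize a message key or value based on its Python data type.

    bytes -> unchanged, str -> UTF-8, dict/list/number -> JSON bytes.
    The dispatch rule is an illustrative assumption.
    """
    if isinstance(value, bytes):
        return value
    if isinstance(value, str):
        return value.encode("utf-8")
    return json.dumps(value).encode("utf-8")
```

A producer would apply `pick_serializer` to both the key and the value before handing the record to the Kafka client.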
Free
AWS Batch
AWS Batch is a set of batch management capabilities that enables developers, scientists, and engineers to quickly and efficiently run hundreds of thousands of batch computing jobs on AWS. The AWS Batch integration provides the ability to submit new AWS Batch Jobs and read the status of an existing AWS Batch Job.
Key Features:
This Universal Extension provides the following key features:
Support to submit a new Batch Job, with the option to terminate the Job after a timeout period.
Support to read Batch Job status for an existing Job ID.
Support for authorization via IAM Role-Based Access Control (RBAC) strategy.
Support for Proxy communication via the HTTP/HTTPS protocol.
What's New in v1.2.0:
This release gives users the capability to submit AWS Batch Jobs and wait until the Jobs reach "Success" or "Failure". Key parameters such as the job status, ID, name, and ARN are updated live during execution.
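The submit-with-timeout option above maps onto the Batch SubmitJob API, whose timeout is expressed as `attemptDurationSeconds`. This is a minimal sketch of the parameter assembly, not the extension's implementation.

```python
def build_submit_job_params(job_name, job_queue, job_definition,
                            timeout_seconds=None):
    """Assemble keyword arguments for an AWS Batch SubmitJob call.

    The timeout shape follows the Batch API's attemptDurationSeconds
    field (minimum 60 seconds), matching the terminate-after-timeout
    option described above.
    """
    params = {
        "jobName": job_name,
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
    }
    if timeout_seconds is not None:
        params["timeout"] = {"attemptDurationSeconds": timeout_seconds}
    return params
```

The dict matches the boto3 `batch.submit_job(**params)` signature; the returned job ID would then be polled for "Success" or "Failure".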
Free
AWS Glue
AWS Glue is a serverless data-preparation service for extract, transform, and load (ETL) operations. It makes it easy for data engineers, data analysts, data scientists, and ETL developers to extract, clean, enrich, normalize, and load data. This integration provides the capability to submit a new AWS Glue Job.
Key Features:
This Universal Extension provides the following key features:
Actions
Start a Glue job.
Start a Glue job and wait until it reaches the state "success" or "failed".
Authentication
Authentication through HTTPS
Authentication through IAM Role-Based Access Control (RBAC) strategy.
Input/Output
Option to pass Input Arguments as a UAC script supporting UAC environment variables and UAC Functions.
Other
Support for Proxy communication via HTTP/HTTPS protocol.
What's New in v1.2.0
This release gives users the capability to start a Glue job and wait until the Job reaches "Success" or "Failure". Key parameters such as the job run status and ID are updated live during execution. Furthermore, the option to pass Input Arguments as a UAC script supporting UAC environment variables and UAC Functions was added.
Free
AWS Lambda
AWS Lambda is a serverless compute service that runs your code in response to events and automatically manages the underlying compute resources. You can use AWS Lambda to extend other AWS services with custom logic or create your own back-end services that operate at AWS scale, performance, and security. AWS Lambda can automatically run code in response to multiple events, such as HTTP requests via Amazon API Gateway, modifications to objects in Amazon S3 buckets, table updates in Amazon DynamoDB, and state transitions in AWS Step Functions.
Key Features:
This Universal Extension provides the following key features:
Trigger a Lambda function synchronously or asynchronously.
Support for authorization via the IAM Role-Based Access Control (RBAC) strategy.
Support for a default or on-demand AWS Region.
Support for proxy communication via the HTTP/HTTPS protocol.
What's New in v1.1.0:
This release allows users to rely on AWS credentials set up in the environment where the extension is running, so they no longer need to be passed as input fields on the task definition. The same applies to the AWS Region.
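The synchronous/asynchronous choice above maps onto the Lambda Invoke API's `InvocationType` field: "RequestResponse" waits for the function's result, while "Event" queues the invocation and returns immediately. A minimal sketch of the parameter assembly:

```python
import json

def build_invoke_params(function_name, payload, synchronous=True):
    """Assemble keyword arguments for a Lambda Invoke call (boto3-style names).

    "RequestResponse" runs synchronously and returns the result;
    "Event" runs asynchronously and returns immediately.
    """
    return {
        "FunctionName": function_name,
        "InvocationType": "RequestResponse" if synchronous else "Event",
        "Payload": json.dumps(payload).encode("utf-8"),
    }
```

The dict matches the boto3 `lambda_client.invoke(**params)` signature.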
Free
Azure Blob: Manage File Transfers
The integration for Azure Blob Storage allows secure transfer of files to, from, and between Azure Blob Storage containers and folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With the Stonebranch Universal Automation Center, you can securely automate your AWS, Azure, Google, and MinIO file transfers and integrate them into your existing scheduling flows.
Key Features:
The following file transfer commands are supported:
Upload file(s) to an Azure Blob Storage container.
Download file(s) from an Azure Blob Storage container.
Transfer files between Azure Blob Storage containers.
List objects in an Azure Blob Storage container.
Delete object(s) in an Azure Blob Storage container.
List Azure Blob Storage container names.
Create an Azure Blob Storage container.
File transfers can be triggered by a third-party application using the Universal Automation Center RESTful web service API (REST API).
The integration for Azure Blob Storage can be integrated into any existing scheduling workflow in the same way as any standard Linux or Windows task type.
Security is ensured by using the HTTPS protocol with support for an optional proxy server.
Supports Azure token-based Shared Access Signature (SAS).
No Universal Agent needs to be installed on the Azure cloud – the communication goes via HTTPS.
Free
Azure Data Factory: Schedule, Trigger, and Monitor
This integration allows users to schedule, trigger, and monitor the Azure Data Factory pipeline process directly from the Universal Controller.
Key Features:
Uses Python modules azure-mgmt-resource and azure-mgmt-datafactory to make REST API calls to Azure Data Factory.
Use the Azure tenant ID, subscription ID, client ID, client secret, resource group, and location for authenticating the REST API calls to Azure Data Factory.
Perform the following Azure Data Factory operations:
Run a pipeline.
Get pipeline info.
List all pipelines.
Cancel a pipeline run.
List factories by resource group.
For Azure Data Factory triggers, users can perform the following operations from UAC:
Start a trigger.
Stop a trigger.
List triggers by factory.
UAC can also restart a failed pipeline, either from the failed step or from any activity name in the failed pipeline.
Free
Databricks: Automate Jobs and Clusters
This integration allows users to perform end-to-end orchestration and automation of jobs and clusters in a Databricks environment, either in AWS or Azure.
Key Features:
Uses Python module requests to make REST API calls to the Databricks environment.
Uses the Databricks URL and the user bearer token to connect with the Databricks environment.
For Databricks jobs, this integration can perform the following operations:
Create and list jobs.
Get job details.
Run new jobs.
Run submit jobs.
Cancel run jobs.
For Databricks clusters, this integration can perform the following operations:
Create, start, and restart a cluster.
Terminate a cluster.
Get cluster info.
List clusters.
For Databricks DBFS, this integration also provides the ability to upload large files.
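Large-file uploads to DBFS are chunked because the DBFS add-block API accepts at most 1 MB of base64-encoded data per call: an upload is a create call, one add-block call per chunk, then a close call. A minimal sketch of the chunking step (the surrounding REST calls are omitted):

```python
import base64

CHUNK_SIZE = 1024 * 1024  # DBFS add-block accepts at most 1 MB per block

def chunk_for_dbfs(data, chunk_size=CHUNK_SIZE):
    """Split file bytes into base64-encoded blocks for the DBFS API.

    Each returned string is the "data" field of one add-block request.
    """
    return [
        base64.b64encode(data[i:i + chunk_size]).decode("ascii")
        for i in range(0, len(data), chunk_size)
    ]
```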
Free
Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations
This integration allows users to schedule, trigger, monitor, and orchestrate the Google BigQuery process directly from the Universal Controller.
Key Features:
Users can perform the following Google BigQuery operations:
BigQuery SQL.
List dataset.
List tables in a dataset.
View job information.
Create a dataset.
Load local file to a table.
Load cloud storage data to a table.
Export table data.
Additional Info:
This task uses Python google-cloud-bigquery and google-auth modules to make REST API calls to Google BigQuery.
This task uses the GCP project ID, BigQuery SQL or schema, dataset ID, job ID, location, table ID, cloud storage URI, and source file format as parameters of the BigQuery functions, and the GCP KeyFile (API key) of a service account for authenticating the REST API calls to Google BigQuery.
Free
Hitachi Vantara: Pentaho Data Integration
Pentaho Data Integration provides powerful ETL (Extract, Transform, and Load) capabilities. This integration makes them available to Universal Controller using its Universal Extension capabilities.
Key Features:
Run a Pentaho Job from its Carte configured repository
Run a Pentaho Job from its Repository
Run a Pentaho Job from a remote job definition file
Define & Run a Pentaho Job from Universal Controller Script Library
Execute a Pentaho Transformation from a Repository
Execute a Pentaho Transformation from a remote transformation definition file (XML)
Define & Run a Pentaho Transformation from Universal Controller Script Library
Free
Video
Informatica Cloud: Schedule, Control, and Manage
This integration allows users to schedule any data integration task, linear taskflow, or taskflow in the Informatica Cloud.
Key Features:
Schedule data integration tasks, including linear taskflows, in the Informatica Cloud.
All communication is web-service based, using the latest Informatica REST API (versions 2 and 3) with support for folders.
Log files, including the activity, session, and error logs, are available from the Universal Controller web UI in the same way as in the Informatica monitoring console.
Free
Video
Informatica PowerCenter: Schedule, Control, and Manage
This integration allows users to schedule Informatica PowerCenter workflows and tasks, including retrieving the workflow and session log. It's also possible to start a workflow from a certain task onwards.
Key Features:
Schedules Informatica PowerCenter via its web services hub; therefore, no installation on any Informatica system is required.
Based on the standard PowerCenter web services hub using the SOAP protocol.
The PowerCenter web services hub interface is called by a Universal Agent running on a Linux server or Windows server.
The following actions are supported:
Start a task in an Informatica PowerCenter workflow.
Start an Informatica PowerCenter workflow.
Start an Informatica PowerCenter workflow from a given task onwards.
Free
Video
Microsoft Power BI: Refresh Business Intelligence
This integration allows users to refresh datasets and dataflows in the Microsoft Power BI business analytics service.
Key Features:
Refresh a dataset in a group-workspace or in my workspace.
Refresh a dataflow in a group-workspace.
Lookup datasets in a selected group.
Lookup dataflows in a selected group.
Connection to the Power BI REST API is done via the Python MSAL library.
Supports Windows and Linux Universal Agents in order to connect to the Power BI REST API.
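The group-workspace dataset refresh above maps onto a single Power BI REST endpoint ("Refresh Dataset In Group"). A minimal sketch of the URL construction; the MSAL-acquired bearer token and the POST itself are omitted:

```python
def refresh_url(group_id, dataset_id):
    """Power BI REST endpoint for refreshing a dataset in a group workspace.

    A refresh is triggered by POSTing to this URL with an MSAL-acquired
    bearer token in the Authorization header.
    """
    return (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{group_id}/datasets/{dataset_id}/refreshes"
    )

url = refresh_url("g-123", "ds-456")
print(url)
```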
Free
Microsoft SQL: Schedule SSRS
This integration can complete various administrative tasks, including publishing reports and moving reports from one server to another. It's based on the SQL Server Reporting Services 'rs.exe' command-line utility, which can perform many scripted operations related to SQL Server Reporting Services (SSRS). The rs.exe utility requires an input file to tell it what to do.
The tasks that can be performed include, among others:
Deploying / Publishing reports
Moving reports
Exporting reports to a file
Adjusting security
Canceling a running job
Configuring SSRS system properties
Free
Microsoft SQL: SSIS Package Execution
SQL Server Integration Services (SSIS) is a platform for building data integration and data transformation solutions. This integration allows users to interactively list and select the SSIS folder, project, environment reference, and SSIS package while creating the job. Further, it can trigger the SSIS package execution in Microsoft SQL Server, monitor the execution, and fetch SSIS logs to the Universal Controller once the SSIS package execution is completed.
New
Free
Video
SAP IBP: Integrated Business Planning
SAP Integrated Business Planning (IBP) is cloud-based planning software for supply chain management. This Universal Extension integrates with SAP IBP to schedule and launch an SAP IBP job template or process chain from Universal Controller.
Key Features:
This Universal Extension provides the following main features:
Schedule, automate, and execute SAP IBP jobs running in the SAP Cloud via the Stonebranch Universal Controller.
Monitor SAP IBP jobs from the Stonebranch Universal Controller.
Start/rerun/cancel batch processes automatically or manually from the Stonebranch Universal Controller.
Retrieve the results of executed SAP IBP jobs.
Cancel an IBP job execution via a dynamic command or by a separate task using the cancel function choice.
Display real-time IBP execution status in Universal Controller.
Free
SAP: Batch Input Map
This integration for SAP batch input allows users to schedule and execute batch input sessions in SAP. Batch input sessions enter data non-interactively into an SAP system. It's typically used to transfer data from non-SAP systems to SAP systems or to transfer data between SAP systems.
Key Features:
Runs a batch input session.
You only need to provide the batch input session name in the task variable.
It's possible to use wildcards (*) to run multiple batch input sessions.
There's no need to create a variant manually for the batch input session in SAP.
The SAP task uses the inline variants feature of USAP to create a temporary variant for the ABAP program RSBDCSUB with the batch input session name.
We are an SAP silver partner. Our product has certified integration with SAP S/4HANA.
Free
SAP: BusinessObjects Scheduling Web Intelligence Documents and Crystal Reports
SAP BusinessObjects is a centralized suite for data reporting, visualization, and sharing.
The integration for SAP BusinessObjects allows scheduling Crystal Reports and Web Intelligence documents. It supports multiple prompts and different output formats like PDF, EXCEL, and Webi.
Key Features
Schedule SAP Webi reports
Schedule Crystal Reports
Support multiple prompts as input parameter
Support different output formats like MS EXCEL, PDF, Webi
Based on the latest RESTful web service SDK - no Agent needs to be installed on the SAP BO Server
Exit code processing and error handling
If a report fails (for example, because a wrong BusinessObjects ID was provided), you can restart the job with the correct ID.
In case of a connection error (for example, a wrong IP address or port for the SAP BO host), the task will fail.
If a wrong password has been entered, the instance will fail.
We are an SAP silver partner. Our product has certified integration with SAP S/4HANA.
Free
Video
Snowflake: Schedule, Trigger, Monitor, and Orchestrate Operations
This integration allows Stonebranch users to orchestrate, schedule, trigger, and monitor the Snowflake load and unload processes from different data sources (including cloud storage or local virtual machines), directly from the Stonebranch Universal Automation Center.
Key Features:
Users can orchestrate the following Snowflake functionalities:
Snowflake loading processes:
Load data from AWS S3 to Snowflake.
Load data from Azure storage to Snowflake.
Load data from Google storage to Snowflake.
Load internal stage file to Snowflake table.
Copy from local server to internal staging.
Snowflake unloading processes:
Unload Snowflake data to AWS S3.
Unload Snowflake data to Azure storage.
Unload Snowflake data to Google storage.
Unload Snowflake data to internal staging.
Unload from internal stage to local server.
Snowflake execute commands:
Execute a Snowflake command.
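Behind the stage-to-table loads listed above sits Snowflake's COPY INTO statement. A minimal sketch of composing one (real loads typically add credentials or storage integrations, file pattern matching, and ON_ERROR handling, all omitted here):

```python
def build_copy_into(table, stage_path, file_format="CSV"):
    """Compose a simplified COPY INTO statement for a Snowflake load.

    stage_path may be an internal stage ("@my_stage/path/") or an
    external-stage reference; options are deliberately minimal.
    """
    return (
        f"COPY INTO {table} FROM {stage_path} "
        f"FILE_FORMAT = (TYPE = {file_format})"
    )

sql = build_copy_into("mydb.public.sales", "@my_stage/sales/")
print(sql)
```

The same shape with the table and stage swapped (`COPY INTO @stage FROM table`) covers the unload direction.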
Free
Video
SQL: Execute Scripts and Functions
This integration allows users to execute SQL scripts and functions against MySQL, PostgreSQL, Microsoft SQL Server, Oracle, and SAP HANA databases. It uses agentless connections via ODBC for SQL Server, MySQL, and PostgreSQL, and the Oracle basic instant client to connect to an Oracle database.
Key Features:
Supports execution of SQL scripts for Oracle, MySQL, PostgreSQL, Microsoft SQL Server, and SAP HANA:
For SQL Server, MySQL, and PostgreSQL: all connections are agentless via ODBC.
For SQL Server: Windows Authentication and SQL Server Authentication are supported.
For SAP HANA: database connections are performed agentless using the SAP HANA client for Python.
For Oracle: the execution of SQL scripts and Oracle PLSQL blocks are supported. Oracle connections are performed agentless using the Oracle basic instant client.
Supports Universal Agent for both Linux/Unix and Windows.
Select different log levels, e.g., info and debug.
Decide whether the SQL output is written to standard out.
All passwords are encrypted using Stonebranch Universal Controller credentials.
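The agentless ODBC connections above are driven by a connection string. A minimal sketch of composing one, covering the Windows Authentication vs. SQL Server Authentication choice (exact keyword support varies by ODBC driver, and the credential values would come from Universal Controller credentials rather than literals):

```python
def build_odbc_connection_string(driver, server, database,
                                 user=None, password=None):
    """Compose an ODBC connection string for an agentless database connection.

    With user omitted, Windows Authentication (Trusted_Connection) is
    assumed; otherwise UID/PWD credentials are used.
    """
    parts = [f"DRIVER={{{driver}}}", f"SERVER={server}", f"DATABASE={database}"]
    if user is None:
        parts.append("Trusted_Connection=yes")
    else:
        parts.append(f"UID={user}")
        parts.append(f"PWD={password}")
    return ";".join(parts)
```

The resulting string can be handed to `pyodbc.connect(...)` or any other ODBC client.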