Amazon S3: Cloud Storage Bucket File Transfer
The Amazon S3 Cloud Storage Bucket File Transfer integration allows you to securely automate file transfers to, from, and between Amazon S3 cloud storage buckets and third-party application folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With Universal Automation Center (UAC), you can securely automate your AWS tasks and integrate them into existing scheduling workflows.

Key Features:
- Automate file transfers in real time.
- Drag and drop as a task into any existing scheduling workflow within the UAC.
- File transfers can be triggered by a third-party application using the UAC RESTful web service API (REST API).
- The following file transfer commands are supported: upload file(s) to an S3 bucket; download file(s) from an S3 bucket; transfer files between S3 buckets; list objects in an S3 bucket; delete object(s) in an S3 bucket; list S3 bucket names; create an S3 bucket.

Additional Info:
- Security is ensured by using the HTTPS protocol, with support for an optional proxy server.
- Supports AWS IAM Role-Based Access Control (RBAC).
- No Universal Agent needs to be installed on the AWS cloud – the communication goes via HTTPS.
- AWS canned ACLs are supported, e.g., to grant full access to the bucket owner.
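The canned-ACL upload mentioned above can be pictured with a small Python sketch. The helper below is hypothetical and only assembles the ExtraArgs dict accepted by boto3's S3 upload_file call; the bucket and key names in the usage comment are placeholders.

```python
# Canned ACLs that S3 accepts for the ACL request parameter.
CANNED_ACLS = {
    "private", "public-read", "public-read-write", "authenticated-read",
    "aws-exec-read", "bucket-owner-read", "bucket-owner-full-control",
}

def build_upload_extra_args(acl=None, content_type=None):
    """Assemble the ExtraArgs dict for an S3 upload (hypothetical helper)."""
    extra = {}
    if acl is not None:
        if acl not in CANNED_ACLS:
            raise ValueError(f"unknown canned ACL: {acl}")
        extra["ACL"] = acl
    if content_type is not None:
        extra["ContentType"] = content_type
    return extra

# With boto3 this would be used roughly as:
#   s3 = boto3.client("s3")  # credentials resolved from the environment
#   s3.upload_file("report.csv", "my-bucket", "inbound/report.csv",
#                  ExtraArgs=build_upload_extra_args("bucket-owner-full-control"))
```
Granting "bucket-owner-full-control" is the canned ACL that matches the bucket-owner example in the text.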
Amazon SQS: Message
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This integration provides the capability to send an AWS SQS message to an existing queue.

Key Features: This Universal Extension provides the following main features:
- Send a message to a standard or a FIFO queue.
- Control the transport of messages by configuring the message delay seconds (for standard queues) and the message group ID and message deduplication ID (for FIFO queues).
- Dynamically fetch the list of queue names from SQS for selection during task creation.
- Authorization via an IAM Role-Based Access Control (RBAC) strategy.
- Proxy communication via the HTTP/HTTPS protocol.

What's New in v1.1.0: This release gives users the capability to rely on AWS credentials set up in the environment where the extension is running, so they no longer have to be passed in the task definition as input fields. The same applies to the AWS Region.
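The standard-versus-FIFO parameter rules described above (delay seconds for standard queues; message group ID and deduplication ID for FIFO queues) can be sketched in Python. The helper below is hypothetical; it only assembles the keyword arguments that boto3's SQS send_message call accepts.

```python
def build_send_message(queue_url, body, group_id=None, dedup_id=None, delay=None):
    """Assemble kwargs for an SQS send_message call (hypothetical helper)."""
    params = {"QueueUrl": queue_url, "MessageBody": body}
    if queue_url.endswith(".fifo"):
        # FIFO queues require a MessageGroupId; per-message DelaySeconds
        # is not supported on FIFO queues.
        params["MessageGroupId"] = group_id or "default"
        if dedup_id is not None:
            params["MessageDeduplicationId"] = dedup_id
    elif delay is not None:
        # Standard queues only: 0-900 seconds.
        params["DelaySeconds"] = int(delay)
    return params

# With boto3: sqs = boto3.client("sqs"); sqs.send_message(**build_send_message(...))
```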
Amazon SQS: Monitor
Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. This Universal Extension provides the capability to monitor AWS SQS messages from an existing queue and run Universal Tasks and/or workflows accordingly.

Key Features: This Universal Extension provides the following main features:
- Action: Monitor AWS SQS messages from a standard or a FIFO queue. Launch a task in Universal Controller with variables holding the ID, body, attributes, message attributes, and receipt handle for each fetched message.
- Authentication: AWS credentials; IAM Role-Based Access Control (RBAC) strategy.
- Other: Communication through a proxy with use of HTTP or HTTPS.

What's New in v1.1.0: This release gives users the capability to rely on AWS credentials set up in the environment where the extension is running, so they no longer have to be passed in the task definition as input fields. The same applies to the AWS Region.
AWS Batch is a set of batch management capabilities that enables developers, scientists, and engineers to quickly and efficiently run hundreds of thousands of batch computing jobs on AWS. The AWS Batch integration provides the ability to submit new AWS Batch jobs and read the status of an existing AWS Batch job.

Key Features: This Universal Extension provides the following key features:
- Submit a new Batch job, with the option to terminate the job after a timeout period.
- Read the Batch job status for an existing job ID.
- Authorization via an IAM Role-Based Access Control (RBAC) strategy.
- Proxy communication via the HTTP/HTTPS protocol.

What's New in v1.2.0: This release gives users the capability to submit AWS Batch jobs and wait until the jobs reach "Success" or "Failure". Key parameters such as job status, ID, name, and ARN are updated live during execution.
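The "submit with a timeout" option above maps naturally onto the timeout parameter of the Batch SubmitJob API. The sketch below is a hypothetical helper that only assembles the kwargs a boto3 batch.submit_job call accepts; job, queue, and definition names are placeholders.

```python
def build_submit_job(job_name, job_queue, job_definition, timeout_seconds=None):
    """Assemble kwargs for a Batch submit_job call (hypothetical helper)."""
    params = {
        "jobName": job_name,
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
    }
    if timeout_seconds is not None:
        # AWS Batch enforces a minimum attempt duration of 60 seconds;
        # jobs still running after this are terminated by the service.
        params["timeout"] = {"attemptDurationSeconds": max(60, int(timeout_seconds))}
    return params

# With boto3: batch = boto3.client("batch"); batch.submit_job(**build_submit_job(...))
```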
AWS EC2: Create Instances
This integration allows users to create an AWS EC2 instance with parameters, either in the task form or simply by creating an EC2 instance from an existing AWS launch template. The task also offers the option to install a Linux/Unix Universal Agent on the newly provisioned EC2 instance.

Key Features:
- The task interacts with the AWS platform via the Python Boto3 module. All AWS credentials remain encrypted.
- Users can install/configure a Linux Universal Agent for each EC2 instance, enabling the Universal Controller to communicate with the newly created instance instantly.
- Users can create multiple EC2 instances with the same configuration. New instances can also be tagged.
- Customers can create a new key pair or use an existing one for the new EC2 instance.
- Options for an additional EBS volume and encryption, as well as detailed monitoring.

Additional Info: Only the Linux Universal Agent is supported at the moment.
AWS EC2: Start, Stop, and Terminate Instances
This integration allows users to spin up, terminate, and manage AWS EC2 instances on demand simply by providing one or more instance IDs as input.

Key Features:
- Uses Python Boto3 to interact with the AWS platform, using the credentials supplied within the task.
- Supports multiple EC2 instances at once.
- The task does not transition to the success state in Universal Controller until the EC2 instance is completely spun up or terminated.
- Scheduling this task in a Universal Controller workflow spins up and tears down EC2 instances based on business needs, complete with the correct setup and dependencies.
- Dynamically manages EC2 operations, offering the potential to reduce EC2 operating costs in the cloud.
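The start/stop/terminate operations above correspond one-to-one to Boto3 EC2 client methods, each with a terminal instance state that a task can wait on before reporting success. The dispatcher below is a hypothetical sketch of that mapping, not the extension's actual implementation.

```python
EC2_ACTIONS = {
    # action -> (boto3 EC2 client method, state the instance should reach)
    "start": ("start_instances", "running"),
    "stop": ("stop_instances", "stopped"),
    "terminate": ("terminate_instances", "terminated"),
}

def plan_ec2_action(action, instance_ids):
    """Map a requested action to the boto3 call and target state (sketch)."""
    if action not in EC2_ACTIONS:
        raise ValueError(f"unsupported action: {action}")
    method, target_state = EC2_ACTIONS[action]
    return {
        "method": method,
        "kwargs": {"InstanceIds": list(instance_ids)},
        "wait_for": target_state,  # poll describe_instances until this state
    }

# With boto3: ec2 = boto3.client("ec2")
# plan = plan_ec2_action("stop", ["i-0abc123"])
# getattr(ec2, plan["method"])(**plan["kwargs"])
```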
AWS Glue is a serverless data-preparation service for extract, transform, and load (ETL) operations. It makes it easy for data engineers, data analysts, data scientists, and ETL developers to extract, clean, enrich, normalize, and load data. This integration provides the capability to submit a new AWS Glue job.

Key Features: This Universal Extension provides the following key features:
- Actions: Start a Glue job. Start a Glue job and wait until it reaches the state "success" or "failed".
- Authentication: Authentication through HTTPS. Authentication through an IAM Role-Based Access Control (RBAC) strategy.
- Input/Output: Option to pass input arguments as a UAC script supporting UAC environment variables and UAC functions.
- Other: Support for proxy communication via the HTTP/HTTPS protocol.

What's New in v1.2.0: This release gives users the capability to start a Glue job and wait until the job reaches "Success" or "Failure". Key parameters such as job run status and ID are updated live during execution. Furthermore, the option to pass input arguments as a UAC script supporting UAC environment variables and UAC functions was added.
AWS Lambda is a serverless compute service that runs your code in response to events and automatically manages the underlying compute resources. You can use AWS Lambda to extend other AWS services with custom logic, or create your own back-end services that operate at AWS scale, performance, and security. AWS Lambda can automatically run code in response to multiple events, such as HTTP requests via Amazon API Gateway, modifications to objects in Amazon S3 buckets, table updates in Amazon DynamoDB, and state transitions in AWS Step Functions.

Key Features: This Universal Extension provides the following key features:
- Trigger a Lambda function synchronously or asynchronously.
- Authorization via an IAM Role-Based Access Control (RBAC) strategy.
- Support for the default or an on-demand AWS Region.
- Proxy communication via the HTTP/HTTPS protocol.

What's New in v1.1.1: This release provides a fix for Lambda functions that take a long time to complete. The fix allows task authors to control how long the Universal Task will wait for a Lambda function to send a response.
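The synchronous/asynchronous distinction above is expressed in the Lambda Invoke API through the InvocationType parameter: "RequestResponse" waits for the function's result, while "Event" queues the invocation and returns immediately. The helper below is a hypothetical sketch that only assembles the kwargs for boto3's lambda invoke call.

```python
import json

def build_invoke_kwargs(function_name, payload, synchronous=True):
    """Assemble kwargs for a Lambda invoke call (hypothetical helper)."""
    return {
        "FunctionName": function_name,
        # "RequestResponse": invoke synchronously, wait for the result.
        # "Event": invoke asynchronously, return immediately.
        "InvocationType": "RequestResponse" if synchronous else "Event",
        "Payload": json.dumps(payload).encode("utf-8"),
    }

# With boto3: lam = boto3.client("lambda"); lam.invoke(**build_invoke_kwargs(...))
```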
AWS Mainframe Modernization
AWS Mainframe Modernization service is an AWS cloud-native platform to migrate, modernize, execute, and operate mainframe applications within a fully managed runtime. Stonebranch Universal Automation Center (UAC) works with AWS Mainframe Modernization to offer a high-availability, template-driven approach to shift existing mainframe batch schedules and event-based automation to re-platformed or refactored mainframe applications that run on AWS. Using the Stonebranch AWS Mainframe Modernization extension, enterprises may run automation on both the mainframe and AWS simultaneously.

Key Features:
- Schedule, automate, and execute mainframe batch jobs running in the AWS Mainframe Modernization service via the Stonebranch Universal Controller.
- Monitor an application's batch processes from the Stonebranch Universal Controller.
- Start/restart/rerun/cancel batch processes automatically or manually from the Stonebranch Universal Controller.
- Synchronous and asynchronous batch execution from the Stonebranch Universal Controller.
- Retrieve the results of AWS Mainframe Modernization batch processes.
- Capture the AWS CloudWatch logs of the executed jobs in the Stonebranch Universal Controller.

More information:
- Article: Mainframe Modernization - How Automation Makes it Possible
- Technical Brief: Mainframe Modernization: Convert Mainframe-Centric Schedulers to a Modern Service Orchestration and Automation Platform

What's New in 1.4.0:
- BluAge support added
- Log fetching methods updated
AWS Step Functions
Step Functions is a serverless orchestration service that lets you combine AWS Lambda functions and other AWS services to build business-critical applications. Through the Step Functions graphical console, you see your application's workflow as a series of event-driven steps. This integration allows customers to execute AWS Step Functions from Universal Controller.

Key Features: This Universal Extension provides the following key features:
- Actions: Execute an AWS Step Function and wait until it reaches status "Success" or "Failed". Execute an AWS Step Function asynchronously, without waiting for the execution to finish.
- Authentication: Authentication using AWS credentials. Authorization via an IAM Role-Based Access Control (RBAC) strategy.
- Other: Communication through a proxy with use of HTTP or HTTPS.

What's New in v1.0.1: Fix: Corrected the handling of escaped characters within the input field payload, which made it impossible to execute the Step Function.
Azure AZ CLI
The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation. The Universal Task for Azure AZ CLI allows calling a single Azure CLI command or a set of them.

Key Features: This Universal Task provides the following key features:
- Schedule and invoke Azure CLI commands; either a single command or a list of commands can be invoked.
- No Azure CLI needs to be installed. This Universal Task uses the Microsoft-maintained Python module azure-cli.
- The following functionalities can be performed:
  - Authenticate with Azure using your Azure user credentials.
  - Authenticate with Azure using a service principal.
  - Invoke a single Azure CLI command.
  - Invoke a list of Azure CLI commands provided via a Universal Controller script file.
  - Choose different log levels.
- The task can run on any Windows or Linux agent, without the need to install the Azure CLI.
Azure Batch is a set of batch management capabilities that enables developers, scientists, and engineers to easily and efficiently create Azure Batch jobs and tasks. This Universal Extension provides the capability to submit Azure Batch jobs, add Azure Batch tasks to jobs, and monitor tasks for completion.

Key Features:
- Actions: Add an Azure Batch job to a specific pool of nodes by providing the Azure Batch job configuration in JSON format. Add a task as part of a specific job and optionally monitor the task for completion.
- Authentication: Connect to Azure using the client credentials grant type. Connect to Azure using the user credentials grant type.
Azure Blob: Manage File Transfers
The integration for Azure Blob Storage allows secure transfer of files to, from, and between Azure Blob Storage containers and folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With the Stonebranch Universal Automation Center, you can securely automate your AWS, Azure, Google, and MinIO file transfers and integrate them into your existing scheduling flows.

Key Features:
- The following file transfer commands are supported: upload file(s) to an Azure Blob Storage container; download file(s) from an Azure Blob Storage container; transfer files between Azure Blob Storage containers; list objects in an Azure Blob Storage container; delete object(s) in an Azure Blob Storage container; list Azure Blob Storage container names; create an Azure Blob Storage container.
- File transfers can be triggered by a third-party application using the Universal Automation Center RESTful web service API (REST API).
- The integration for Azure Blob Storage can be integrated into any existing scheduling workflow in the same way as any standard Linux or Windows task type.
- Security is ensured by using the HTTPS protocol, with support for an optional proxy server.
- Supports Azure token-based Shared Access Signatures (SAS).
- No Universal Agent needs to be installed on the Azure cloud – the communication goes via HTTPS.
Azure Data Factory: Schedule, Trigger, and Monitor
This integration allows users to schedule, trigger, and monitor the Azure Data Factory pipeline process directly from the Universal Controller.

Key Features:
- Uses the Python modules azure-mgmt-resource and azure-mgmt-datafactory to make REST API calls to Azure Data Factory.
- Uses the Azure tenant ID, subscription ID, client ID, client secret, resource group, and location to authenticate the REST API calls to Azure Data Factory.
- Performs the following Azure Data Factory operations: run a pipeline; get pipeline info; list all pipelines; cancel a pipeline run; list factories by resource group.
- For Azure Data Factory triggers, users can perform the following operations from UAC: start a trigger; stop a trigger; list triggers by factory.
- UAC can also restart a failed pipeline, either from the failed step or from any activity name in the failed pipeline.
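Under the hood, running a Data Factory pipeline is a single authenticated POST against the Azure Resource Manager REST API. The helper below is a hypothetical sketch that only composes that endpoint URL (the resource names in the test are placeholders); the azure-mgmt-datafactory module named above wraps this same call.

```python
def pipeline_run_url(subscription_id, resource_group, factory, pipeline,
                     api_version="2018-06-01"):
    """Compose the ARM createRun endpoint for a Data Factory pipeline (sketch)."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}"
        f"/pipelines/{pipeline}/createRun"
        f"?api-version={api_version}"
    )

# A POST to this URL with a Bearer token (from the tenant/client credentials
# listed above) and an optional JSON body of pipeline parameters starts a run.
```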
Azure Logic Apps: Schedule, Trigger, and Monitor Workflows
This integration can trigger and monitor the execution of Azure Logic Apps workflows and retrieve the workflow execution output. The Stonebranch Universal Controller (UC) integrates with Logic Apps through REST APIs, secured by the Azure OAuth 2.0 authentication mechanism.

Key Features:
- Passes dynamic input parameters (JSON format) to each Azure Logic Apps workflow.
- Triggers a workflow, monitors it until the process is completed, and then delivers the results to UC.
- Customers can manage and control Logic Apps workflow execution from UC, with the capability to employ other dependencies such as time triggers or event-based jobs/workflows.
- Offers ITSM integration capability, enabling the auto-creation of incidents on Logic Apps workflow execution failure.
Azure Virtual Machines: Start, Stop, and Terminate Instances
This integration allows users to utilize the Azure Virtual Machine (VM) name, resource group, subscription ID, and access token as inputs to start, stop, terminate, list, and check the status of Azure VMs.

Key Features:
- Uses the Python requests module to interact with the Azure cloud platform.
- Expands users' ability to start/stop/terminate/check/list Azure VMs that belong to a subscription and resource group.
- In the Stonebranch Universal Controller (UC), the task does not reach the success state until the Azure instance is completely started, stopped, or terminated.
- Scheduling this task in a UC workflow with the right dependencies set up starts and stops Azure VMs based on business needs.
- Helps to dynamically manage VM operations, potentially reducing Azure VM running costs in the cloud.

Important: This integration uses an Azure OAuth 2.0 access token for Azure API authentication. Users may need to use the UC web services task to refresh the access token periodically.
Databricks: Automate Jobs and Clusters
This integration allows users to perform end-to-end orchestration and automation of jobs and clusters in a Databricks environment, in either AWS or Azure.

Key Features:
- Uses the Python requests module to make REST API calls to the Databricks environment.
- Uses the Databricks URL and a user bearer token to connect to the Databricks environment.
- For Databricks jobs, this integration can perform the following operations: create and list jobs; get job details; run new jobs; run submit jobs; cancel run jobs.
- For Databricks clusters, this integration can perform the following operations: create, start, and restart a cluster; terminate a cluster; get cluster info; list clusters.
- For Databricks DBFS, this integration also provides a feature to upload larger files.

What's New in v1.3.3: The latest release provides enhanced error handling while monitoring the status of running jobs, and enables recovery options in the event of errors occurring during the monitoring process.
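Since the integration drives Databricks through its REST API with a bearer token, a job trigger reduces to one authenticated POST. The helper below is a hypothetical sketch that assembles the pieces of a Jobs API run-now request; the host, token, and parameter names in the test are placeholders.

```python
def build_run_now(host, token, job_id, notebook_params=None):
    """Assemble URL, headers, and body for a Databricks run-now call (sketch)."""
    payload = {"job_id": job_id}
    if notebook_params:
        # Optional widget values passed to a notebook task.
        payload["notebook_params"] = notebook_params
    return {
        "url": f"{host.rstrip('/')}/api/2.1/jobs/run-now",
        "headers": {"Authorization": f"Bearer {token}"},
        "json": payload,
    }

# With requests: requests.post(**build_run_now("https://adb-....azuredatabricks.net",
#                                              token, 42)).json()["run_id"]
```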
Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations
This integration allows users to schedule, trigger, monitor, and orchestrate the Google BigQuery process directly from the Universal Controller.

Key Features: Users can perform the following Google BigQuery operations:
- Run BigQuery SQL.
- List datasets.
- List tables in a dataset.
- View job information.
- Create a dataset.
- Load a local file into a table.
- Load Cloud Storage data into a table.
- Export table data.

Additional Info: This task uses the Python google-cloud-bigquery and google-auth modules to make REST API calls to Google BigQuery. The task uses the GCP project ID, BigQuery SQL or schema, dataset ID, job ID, location, table ID, Cloud Storage URI, and source file format as parameters of the BigQuery function, and the GCP key file (API key) of a service account to authenticate the REST API calls to Google BigQuery.
Terraform is an infrastructure-as-code tool that lets you define cloud and on-prem resources in human-readable configuration files that you can version, reuse, and share. You can then use a consistent workflow to provision and manage your infrastructure throughout its lifecycle. This integration allows users to create tasks that execute Terraform commands. Typically, it can be used for use cases where UAC acts as an orchestrator for resource provisioning, and Terraform is used to provision those resources.

Key Features: This Universal Extension provides the following key features:
- Init Terraform (supports the upgrade option)
- Plan Terraform (supports the refresh-only planning mode)
- Apply Terraform
- Destroy Terraform

What's New in v1.0.1: This release gives the capability to pass UAC credentials from the template into the Terraform variable file, to be used in the Terraform execution.
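The four actions above correspond to Terraform CLI invocations, with the listed options mapping to the real `-upgrade` and `-refresh-only` flags. The builder below is a hypothetical sketch of how a task could assemble such a command line (it does not claim to be the extension's implementation); in practice the result would be handed to a subprocess call.

```python
def terraform_command(action, upgrade=False, refresh_only=False,
                      auto_approve=False):
    """Build a Terraform CLI argument list for one of the four actions (sketch)."""
    if action not in ("init", "plan", "apply", "destroy"):
        raise ValueError(f"unsupported action: {action}")
    cmd = ["terraform", action]
    if action == "init" and upgrade:
        cmd.append("-upgrade")        # upgrade providers/modules on init
    if action == "plan" and refresh_only:
        cmd.append("-refresh-only")   # refresh-only planning mode
    if action in ("apply", "destroy") and auto_approve:
        cmd.append("-auto-approve")   # skip the interactive prompt
    return cmd

# e.g. subprocess.run(terraform_command("apply", auto_approve=True), check=True)
```
Non-interactive `-auto-approve` is the usual choice when a scheduler, rather than a human, drives apply/destroy.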
Inter-Cloud Data Monitor
This Universal Extension is an interface to Rclone that provides the capability to monitor files/directories across different cloud storage services, as well as local or distributed file systems. Additionally, upon a successful monitor, this extension publishes local Universal Events. To properly handle these events and take action on them, this Universal Extension can optionally be attached as a publisher to a Universal Monitor task (see more details in the section Cloud File Monitor Events). Rclone is the open-source command-line program that is utilized to accomplish all the actions supported by this extension. Download with: Inter-Cloud Data Transfer.

Key Features: This Universal Extension supports the following key features:
- Actions: Monitor object creation. Monitor object change (based on modification time). Monitor object deletion.
- Options: Trigger On Existence (available for the action Monitor On Create). Advanced filtering capability for objects to be monitored. Support for providing additional Rclone options according to user needs.
- Universal Events: An event published upon single object creation, holding information about the new object. An event published upon single object change, holding information about the updated file, including the latest object size and modification time. An event published upon single object deletion, holding information about the deleted object. An event published upon single object existence, holding information about the object.
Inter-Cloud Data Transfer
This Universal Extension provides the capability to perform data transfers between cloud-based storage services as well as local or distributed file systems. Transfers are fast and secure, since data is streamed from one storage to another with no intermediate storage taking place. Multiple storage systems are supported (an overview can be found here). Integrations within this solution package include: AWS S3; Google Cloud Storage; Microsoft OneDrive Business, including SharePoint; Microsoft Azure Blob Storage; Hadoop Distributed File System (HDFS); local file system (Linux, Windows); HTTP(S) URL. Download with: Inter-Cloud Data Monitor.

Key Features: This Universal Extension supports the following key features:
- Actions: List objects, list directory. Copy, move, or synchronize data between two storages. Copy a URL's content to a cloud or local destination without saving it in temporary storage. List data on a storage, including listing with details or in JSON format for machine parsing. Create objects on a storage. Delete objects from a storage.
- Features: Fast transfers for objects stored in the same region. Always preserves timestamps and verifies checksums. Supports encryption, caching, compression, and chunking. Dynamic token updates for OneDrive Business cloud storage, observing the OneDrive Business refresh token flow. Support for dry runs, allowing users to execute a Universal Task without making any permanent changes on the target storage. Advanced filtering capability for files or objects to be listed or transferred. Option to mark the Universal Task as failed when no files have been transferred. A list of overwrite options for existing data. Additional customized options.
- Output: The progress of the selected action is visible during Universal Task instance execution. Text- or JSON-formatted output.
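Since this extension (like the monitor above) is an interface to Rclone, the copy, dry-run, and filtering features map onto real Rclone flags. The builder below is a hypothetical sketch of how such a command line could be assembled; the remote names in the test are placeholders.

```python
def rclone_copy(source, dest, dry_run=False, include=None):
    """Build an rclone copy command with optional dry-run and filters (sketch)."""
    cmd = ["rclone", "copy", source, dest]
    if dry_run:
        cmd.append("--dry-run")       # report what would happen, change nothing
    for pattern in include or []:
        cmd += ["--include", pattern]  # rclone filter pattern, e.g. "*.csv"
    return cmd

# e.g. subprocess.run(rclone_copy("s3:my-bucket/in", "azblob:container/in",
#                                 include=["*.csv"]), check=True)
```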
Qlik Sense is a Business Intelligence (BI) tool. Qlik Sense users can connect and combine data from hundreds of data sources by defining data pipelines as applications. Data can then be visualized via custom dashboards in the Qlik Sense Cloud or Desktop application.

Key Features: This Universal Extension provides the following key features:
- Reload a Qlik Sense Cloud application.
- Read the status of an already reloaded Qlik Sense Cloud application.

What's New in v1.1.0: This release introduces an option that gives users the capability to trigger a reload and wait until the Qlik Sense app reaches status "Success" or "Failed".
SAP IBP: Integrated Business Planning
SAP Integrated Business Planning (IBP) is cloud-based planning software for supply chain management. This Universal Extension integrates with SAP IBP to schedule and launch an SAP IBP job template or process chain from Universal Controller.

Key Features: This Universal Extension provides the following main features:
- Schedule, automate, and execute SAP IBP jobs running in the SAP Cloud via the Stonebranch Universal Controller.
- Monitor SAP IBP jobs from the Stonebranch Universal Controller.
- Start/rerun/cancel batch processes automatically or manually from the Stonebranch Universal Controller.
- Retrieve the results of the SAP IBP jobs that are executed.
- Cancel an IBP job execution via a dynamic command, or by a separate task using the cancel function choice.
- Display real-time IBP execution status in Universal Controller.
SAP: BusinessObjects Data Services
Using analytics tools to collect massive amounts of Big Data from your organization is one thing. Extracting meaning from that data and using it to drive real growth is another. BusinessObjects analytics from SAP can help you unleash the power of collective insight by delivering enterprise business intelligence, agile visualizations, and advanced predictive analytics to all users. Leverage the capabilities of SAP® BusinessObjects and schedule any SAP BusinessObjects Data Services ETL job in Stonebranch's Universal Automation Center by using the "AL_RWJobLauncher.exe" utility, which comes with the SAP Data Services installation. This Universal Task allows you to execute an SAP Data Services ETL job using "AL_RWJobLauncher.exe".

Key Features:
- Based on "AL_RWJobLauncher.exe", which is part of the Data Services installation.
- The task runs on Data Services for Windows and Linux.
- The Universal Task provides the same error and trace information as the SAP Data Services Management Console.
- You can select different log levels; for example, Info and Debug.
- You can configure all connection parameters via the Universal Task.
- Exception handling has been implemented for all parameters.
- Configurable exit code processing based on any information in the error.log file.

We are an SAP silver partner. Our product has certified integration with SAP S/4HANA.