Amazon S3: Cloud Storage Bucket File Transfer
The Amazon S3 Cloud Storage Bucket File Transfer integration allows you to securely automate file transfers to, from, and between Amazon S3 cloud storage buckets and third-party application folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With Universal Automation Center (UAC), you can securely automate your AWS tasks and integrate them into existing scheduling workflows. Key Features: Automate file transfers in real-time. Drag-and-drop as a task into any existing scheduling workflow within the UAC. File transfers can be triggered by a third-party application using the UAC RESTful web service API: REST API. The following file transfer commands are supported: Upload file(s) to an S3 bucket. Download file(s) from an S3 bucket. Transfer files between S3 buckets. List objects in an S3 bucket. Delete object(s) in an S3 bucket. List S3 bucket names. Create an S3 bucket. Additional Info: Security is ensured by using the HTTPS protocol with support for an optional proxy server. Supports AWS IAM Role-Based Access Control (RBAC). No Universal Agent needs to be installed on the AWS Cloud – the communication goes via HTTPS. AWS canned ACLs are supported, e.g., to grant full access to the bucket owner.
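The transfer commands listed above can be sketched with boto3 (the AWS SDK for Python). This is a minimal illustration, not the integration's actual implementation; bucket names, keys, and the credential source (environment) are assumptions.

```python
"""Sketch of the S3 transfer commands above, using boto3.

Assumptions: AWS credentials come from the environment; bucket and
key names passed by the caller are placeholders, not values taken
from the integration itself.
"""

def canned_acl_args(acl="bucket-owner-full-control"):
    # AWS canned ACLs (e.g., granting full access to the bucket owner)
    # are passed to upload_file via ExtraArgs.
    return {"ACL": acl}

def transfer(command, bucket, key=None, filename=None, dest_bucket=None):
    import boto3  # deferred so the helper above works without boto3 installed
    s3 = boto3.client("s3")  # HTTPS by default; proxy settings come from the env
    if command == "upload":
        s3.upload_file(filename, bucket, key, ExtraArgs=canned_acl_args())
    elif command == "download":
        s3.download_file(bucket, key, filename)
    elif command == "copy":  # transfer between buckets, no local staging
        s3.copy({"Bucket": bucket, "Key": key}, dest_bucket, key)
    elif command == "list":
        resp = s3.list_objects_v2(Bucket=bucket)
        return [o["Key"] for o in resp.get("Contents", [])]
    elif command == "delete":
        s3.delete_object(Bucket=bucket, Key=key)
    elif command == "create-bucket":
        s3.create_bucket(Bucket=bucket)
```

Each branch maps one-to-one onto a supported file transfer command; the "copy" branch performs the bucket-to-bucket transfer server-side.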
Amazon SQS: Create, Monitor, and Send Messages
The Amazon SQS integration allows you to create, send, and monitor Amazon SQS messages and automatically trigger a task or workflow in Universal Controller each time a message has been received. Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. Using SQS, enterprises can send, store, and receive messages between software components. Key Features: Allows you to monitor for, create, and send Amazon SQS messages. Trigger a task in Universal Controller upon the arrival of a new SQS message. Amazon SQS tasks can be integrated into any existing or new automation workflow. Create and send an SQS message out of any modern third-party application by calling the Universal Controller remote web service API. Set different log-levels for the Amazon SQS task to provide additional information when root-causing potential issues. Additional Information: This integration uses the Python Boto3 module. This enables new Amazon AWS services and the ability to update the current SQS task when new requirements occur. Credentials for AWS are stored in an encrypted format in the database. IAM Role-Based Access Control (RBAC) is supported. Communication to Amazon AWS is done via the HTTPS protocol. A proxy server connection to Amazon AWS with basic authentication is supported.
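A minimal sketch of the send-and-monitor pattern with the Boto3 module named above. The queue name and payload are placeholders, and credentials are assumed to come from the environment; the real integration's internals may differ.

```python
"""Sketch: send a message to an SQS queue and long-poll for arrivals
with boto3. Queue name and payload below are illustrative only."""

import json

def build_message(body, attrs=None):
    # SQS message attributes require an explicit DataType per attribute.
    msg = {"MessageBody": json.dumps(body)}
    if attrs:
        msg["MessageAttributes"] = {
            k: {"StringValue": str(v), "DataType": "String"}
            for k, v in attrs.items()
        }
    return msg

def send_and_poll(queue_name, body):
    import boto3  # deferred import; credentials/region come from the environment
    sqs = boto3.client("sqs")
    url = sqs.get_queue_url(QueueName=queue_name)["QueueUrl"]
    sqs.send_message(QueueUrl=url, **build_message(body))
    # Long-poll for new messages -- arrival of a message is the event the
    # integration uses to trigger a Controller task or workflow.
    resp = sqs.receive_message(QueueUrl=url, WaitTimeSeconds=10)
    return resp.get("Messages", [])
```

`build_message` keeps the payload construction separate from the network call, which makes the attribute encoding easy to verify in isolation.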
Ansible: Execute and Manage Playbooks
This Ansible integration allows users to execute Ansible playbooks and run other Ansible modules or commands directly from the Stonebranch Universal Automation Center (UAC). Key Features: Manage Ansible task execution through the intuitive Universal Controller user interface. Ansible playbooks can either be centrally stored and maintained in the Universal Controller script library, or Universal Controller can call the relevant playbook residing in the Ansible host. This integration also enables the execution of other Ansible commands.
AWS EC2: Create Instances
This integration allows users to create an AWS EC2 instance with parameters, either in task form or by simply creating an EC2 instance from an existing AWS launch template. This task also offers the option to install a Linux/Unix Universal Agent in the newly provisioned EC2 instance. Key Features: The task interacts with the AWS platform via the Python Boto3 module. All AWS credentials remain encrypted. Users can also install/configure a Linux Universal Agent for each EC2 instance, enabling the Universal Controller to communicate with the newly created instance instantly. This task also lets users create multiple EC2 instances with the same configuration. New instances can also be tagged. It allows customers to create a new key pair or use an existing one for the new EC2 instance. This task also enables options for additional EBS volume and encryption, as well as detailed monitoring. Additional Info: Only the Linux Universal Agent is supported at the moment.
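The launch-template path can be sketched with Boto3 as below. The template name, tag keys, and instance count are hypothetical; this is an illustration of the API shape, not the task's actual code.

```python
"""Sketch: create EC2 instances from a launch template with boto3,
mirroring the options above (launch template, multiple identical
instances, tagging). All names and counts are placeholders."""

def run_instances_params(template_name, count=1, tags=None):
    # Build the run_instances keyword arguments separately so they can
    # be inspected without touching AWS.
    params = {
        "LaunchTemplate": {"LaunchTemplateName": template_name},
        "MinCount": count,
        "MaxCount": count,
    }
    if tags:
        params["TagSpecifications"] = [{
            "ResourceType": "instance",
            "Tags": [{"Key": k, "Value": v} for k, v in tags.items()],
        }]
    return params

def create_instances(template_name, count=1, tags=None):
    import boto3  # deferred import; credentials stay outside the code
    ec2 = boto3.client("ec2")
    resp = ec2.run_instances(**run_instances_params(template_name, count, tags))
    return [i["InstanceId"] for i in resp["Instances"]]
```

Setting `MinCount == MaxCount` is what gives the "multiple instances with the same configuration" behavior in a single call.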
AWS EC2: Start, Stop, and Terminate Instances
This integration allows users to spin up, terminate, and manage AWS EC2 instances on demand simply by providing one or more instance IDs as input. Key Features: This task uses Python Boto3 to interact with the AWS platform using the credentials supplied within the task. It supports multiple EC2 instances at once. This task does not go to the success state in Universal Controller until the EC2 instance is completely spun up or terminated. Scheduling this task using a Universal Controller workflow spins up and tears down EC2 instances based on the business needs, complete with the correct setup and dependencies. It dynamically manages EC2 operations, offering the potential to reduce EC2 operations costs in the cloud.
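The "no success until the instance has finished starting or terminating" behavior maps naturally onto Boto3's EC2 waiters. A hedged sketch, with placeholder instance IDs:

```python
"""Sketch: start/stop/terminate one or more EC2 instances and block
until the final state is reached, matching the completion semantics
described above. Instance IDs are placeholders."""

# Map each action to its boto3 call and the waiter that confirms completion.
ACTIONS = {
    "start": ("start_instances", "instance_running"),
    "stop": ("stop_instances", "instance_stopped"),
    "terminate": ("terminate_instances", "instance_terminated"),
}

def manage_instances(action, instance_ids):
    import boto3  # deferred import; credentials supplied by the caller's env
    ec2 = boto3.client("ec2")
    method, waiter_name = ACTIONS[action]
    # Both the API call and the waiter accept multiple instance IDs at once.
    getattr(ec2, method)(InstanceIds=instance_ids)
    ec2.get_waiter(waiter_name).wait(InstanceIds=instance_ids)
```

The waiter polls `describe_instances` internally, so the function only returns once every listed instance has reached the target state.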
Azure Blob: Manage File Transfers
The integration for Azure Blob Storage allows the secure transfer of files to, from, and between Azure Blob Storage containers and folders. Storing data in the cloud has become an integral part of most modern IT landscapes. With the Stonebranch Universal Automation Center, you can securely automate your AWS, Azure, Google, and MinIO file transfers and integrate them into your existing scheduling flows. Key Features: The following file transfer commands are supported: Upload file(s) to an Azure Blob Storage container. Download file(s) from an Azure Blob Storage container. Transfer files between Azure Blob Storage containers. List objects in an Azure Blob Storage container. Delete object(s) in an Azure Blob Storage container. List Azure Blob Storage container names. Create an Azure Blob Storage container. File transfers can be triggered by a third-party application using the Universal Automation Center RESTful web service API: REST API. The integration for Azure Blob Storage can be integrated into any existing scheduling workflow in the same way as any standard Linux or Windows task type. Security is ensured by using the HTTPS protocol with support for an optional proxy server. Supports Azure token-based Shared Access Signatures (SAS). No Universal Agent needs to be installed on the Azure cloud – the communication goes via HTTPS.
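The container commands above can be sketched with the azure-storage-blob SDK, authenticating with a SAS token as the description notes. Account, container, and blob names are placeholders, and this pairing of operations is illustrative only.

```python
"""Sketch of upload/list against an Azure Blob Storage container using
the azure-storage-blob SDK with SAS-token authentication. All names
are placeholders."""

def account_url(account_name):
    # Blob endpoints always use HTTPS, matching the security note above.
    return f"https://{account_name}.blob.core.windows.net"

def blob_ops(account_name, sas_token, container, blob=None, data=None):
    from azure.storage.blob import BlobServiceClient  # deferred import
    svc = BlobServiceClient(account_url(account_name), credential=sas_token)
    cc = svc.get_container_client(container)
    if data is not None:                      # upload file content
        cc.upload_blob(blob, data, overwrite=True)
    return [b.name for b in cc.list_blobs()]  # list objects in the container
```

A download would use `cc.download_blob(blob).readall()`, and a container-to-container transfer can be done server-side via `start_copy_from_url` on the destination blob client.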
Azure Blob: Upload Local Directory
This integration allows users to upload a local Windows or Linux directory to an Azure Blob Storage container. As a result, you can integrate uploads of an entire local directory into your existing or new scheduling workflows, providing a true hybrid cloud (on-prem and cloud computing) file transfer solution. This integration makes it possible to automate your uploads in a way that's not available in the standard Azure SDK. Storing data in the cloud has become an integral part of most modern IT landscapes. With the Stonebranch Universal Automation Center, you can securely automate your AWS, Azure, or any other cloud file transfer and integrate them into your existing scheduling flows. Key Features: Calls the Python blobxfer module, which is invoked by a Universal Agent running on a Linux or Windows server. The server running the Universal Agent needs to have Python 2.7.x or 3.6.x installed. All credentials for Azure are stored in an encrypted form in the database. Select different log-levels, e.g., info and debug. A proxy connection towards Azure is currently not implemented for this integration (however, it's possible with minor adjustments).
Azure Data Factory: Schedule, Trigger, and Monitor
This integration allows users to schedule, trigger, and monitor the Azure Data Factory pipeline process directly from the Universal Controller. Key Features: Uses the Python modules azure-mgmt-resource and azure-mgmt-datafactory to make REST API calls to Azure Data Factory. Uses the Azure tenant ID, subscription ID, client ID, client secret, resource group, and location for authenticating the REST API calls to Azure Data Factory. Perform the following Azure Data Factory operations: Run a pipeline. Get pipeline info. List all pipelines. Cancel a pipeline run. List factories by resource group. For Azure Data Factory triggers, users can perform the following operations from UAC: Start a trigger. Stop a trigger. List triggers by factory. UAC can also restart a failed pipeline, either from the failed step or from any activity name in the failed pipeline.
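The run-and-monitor cycle can be sketched with the azure-mgmt-datafactory module named above. The credential object, resource names, and polling interval are assumptions; the integration's own polling logic may differ.

```python
"""Sketch: trigger an Azure Data Factory pipeline run and poll it to a
terminal state using azure-mgmt-datafactory. Names and the 15-second
poll interval are placeholders."""

import time

TERMINAL = {"Succeeded", "Failed", "Cancelled"}

def is_terminal(status):
    # ADF pipeline runs report e.g. InProgress / Queued until done.
    return status in TERMINAL

def run_pipeline(cred, subscription_id, resource_group, factory, pipeline,
                 params=None):
    from azure.mgmt.datafactory import DataFactoryManagementClient  # deferred
    adf = DataFactoryManagementClient(cred, subscription_id)
    run_id = adf.pipelines.create_run(resource_group, factory, pipeline,
                                      parameters=params or {}).run_id
    while True:  # monitor the run until it reaches a terminal state
        run = adf.pipeline_runs.get(resource_group, factory, run_id)
        if is_terminal(run.status):
            return run.status
        time.sleep(15)
```

`cred` would typically be a `ClientSecretCredential` built from the tenant ID, client ID, and client secret listed above.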
Azure Logic Apps: Schedule, Trigger, and Monitor Workflows
This integration can trigger and monitor the execution of Azure Logic Apps workflows and retrieve the workflow execution output. The Stonebranch Universal Controller (UC) integrates with Logic Apps through REST APIs, secured via the Azure OAuth 2.0 authentication mechanism. Key Features: Passes dynamic input parameters (JSON format) to each Azure Logic Apps workflow. Triggers a workflow, monitors it until the process is completed, and then delivers the results to UC. Customers can manage and control Logic Apps workflow execution from UC, with the capability to employ other dependencies like time triggers or event-based jobs/workflows. This task offers ITSM integration capability, enabling the auto-creation of incidents upon Logic Apps workflow execution failure.
Azure Virtual Machines: Start, Stop, and Terminate Instances
This integration allows users to utilize the Azure Virtual Machine (VM) name, resource group, subscription ID, and access token as inputs to start, stop, terminate, list, and check the status of Azure VMs. Key Features: Uses the Python requests module to interact with the Azure cloud platform. Expands user ability to start/stop/terminate/check/list Azure VMs that belong to a subscription and resource group. In the Stonebranch Universal Controller (UC), this task does not reach the success state until the Azure instance is completely started, stopped, or terminated. Scheduling this task in UC with the right dependencies set up would start and stop Azure VMs based on business needs using a UC workflow. This task helps to dynamically manage VM operations and could potentially reduce the Azure VM running cost in the cloud. Important: This integration uses an Azure OAuth 2.0 access token for Azure API authentication. Users may need to use the UC web services task to refresh the access token periodically.
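The inputs listed above map directly onto Azure's Compute REST endpoints. A hedged sketch follows; the `api-version` value is an assumption, and the token is expected to be supplied (and refreshed) by the caller.

```python
"""Sketch: the Azure Compute REST URL and call behind a VM 'start',
built from the inputs above (VM name, resource group, subscription ID,
OAuth 2.0 access token). The api-version is an assumption."""

API_VERSION = "2023-03-01"  # assumed; pin to a version your tenant supports

def vm_url(subscription_id, resource_group, vm_name, action=None):
    base = (f"https://management.azure.com/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}/providers/Microsoft.Compute"
            f"/virtualMachines/{vm_name}")
    if action:  # e.g. "start", "powerOff" (stop), "deallocate"
        base += f"/{action}"
    return f"{base}?api-version={API_VERSION}"

def start_vm(token, subscription_id, resource_group, vm_name):
    import requests  # the Python module named in the description above
    # The OAuth 2.0 access token goes in the Authorization header; as the
    # description notes, it must be refreshed periodically.
    r = requests.post(
        vm_url(subscription_id, resource_group, vm_name, "start"),
        headers={"Authorization": f"Bearer {token}"})
    r.raise_for_status()
```

Listing VMs is a GET on the same URL without an action segment; terminating a VM is a DELETE on the VM resource itself.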
Databricks: Automate Jobs and Clusters
This integration allows users to perform end-to-end orchestration and automation of jobs and clusters in a Databricks environment, either in AWS or Azure. Key Features: Uses the Python requests module to make REST API calls to the Databricks environment. Uses the Databricks URL and the user bearer token to connect with the Databricks environment. With respect to Databricks jobs, this integration can perform the below operations: Create and list jobs. Get job details. Run new jobs. Run submit jobs. Cancel run jobs. With respect to Databricks clusters, this integration can perform the below operations: Create, start, and restart a cluster. Terminate a cluster. Get cluster info. List clusters. With respect to Databricks DBFS, this integration also provides a feature to upload larger files.
Docker: Support to Run a Command in a New Container
This integration allows users to implement the Docker 'run' container CLI command to run a command in a new container. The Stonebranch Universal Automation Center seamlessly integrates your legacy system into your container-based DevOps process without the need to redesign your business process logic, resulting in a shorter time-to-market, improved customer satisfaction, better product quality, more reliable releases, and improved productivity and efficiency. As a result, you can make use of all the benefits provided by containers, like portability of applications, simplified integration, optimized development, increased scalability and performance, and minimized risk while introducing new technology.
Docker: Support to Run a Command in a Running Container
This integration allows users to implement the Docker 'exec' container CLI command to run a command in a running container. The Stonebranch Universal Automation Center seamlessly integrates your legacy system into your container-based DevOps process without the need to redesign your business process logic, resulting in a shorter time-to-market, improved customer satisfaction, better product quality, more reliable releases, and improved productivity and efficiency. As a result, you can make use of all the benefits provided by containers, like portability of applications, simplified integration, optimized development, increased scalability and performance, and minimized risk while introducing new technology.
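The two CLI commands behind these entries ('run' for a new container, 'exec' for a running one) can be sketched as follows; image and container names are placeholders, and driving the CLI via Python subprocess is an illustration, not the integration's actual mechanism.

```python
"""Sketch: building and executing the Docker 'run' and 'exec' CLI
commands from Python. Image/container names are placeholders."""

import subprocess

def docker_run_cmd(image, command, remove=True):
    # docker run [--rm] IMAGE CMD... -> run a command in a NEW container
    return ["docker", "run"] + (["--rm"] if remove else []) + [image] + command

def docker_exec_cmd(container, command):
    # docker exec CONTAINER CMD... -> run a command in a RUNNING container
    return ["docker", "exec", container] + command

def run(cmd):
    # check=True surfaces a non-zero exit code as an exception, which a
    # scheduler can map to a failed task state.
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
```

Keeping the command construction separate from execution makes the argument ordering easy to verify without a Docker daemon present.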
E-Mail: SMTP and IMAP Integration
This integration is for any email provider that uses SMTP and IMAP, including email vendors like Outlook.com, Microsoft Exchange, and Gmail. It allows you to send and retrieve E-Mails and E-Mail attachments, and it also provides the functionality to download mail attachments to a mail folder. This integration is beneficial for Stonebranch SaaS customers who access the Universal Controller in the Stonebranch AWS Cloud and have their Universal Agents deployed in their own datacenter. As the integration is triggered from the Universal Agent, no additional firewall ports need to be opened. Key Features: Send an E-Mail with or without attachments. Use Universal Controller variables. Retrieve an E-Mail based on filter criteria like From, To, sender, subject, or body content. Move an E-Mail to a mail folder after downloading it to a configured folder. Delete an E-Mail after downloading it to a configured folder.
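The send and retrieve paths can be sketched entirely with Python's standard smtplib and imaplib modules. Host names, accounts, ports, and folder names below are placeholders.

```python
"""Sketch: sending an E-Mail (SMTP) and retrieving by filter criteria
(IMAP) with the standard library. Hosts, accounts, and the STARTTLS
port are placeholder assumptions."""

import imaplib
import smtplib
from email.message import EmailMessage

def build_mail(sender, to, subject, body, attachment=None):
    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, to, subject
    msg.set_content(body)
    if attachment:  # (filename, bytes) pair
        name, data = attachment
        msg.add_attachment(data, maintype="application",
                           subtype="octet-stream", filename=name)
    return msg

def send(host, user, password, msg):
    with smtplib.SMTP(host, 587) as s:  # STARTTLS on the submission port
        s.starttls()
        s.login(user, password)
        s.send_message(msg)

def fetch_from(host, user, password, sender):
    with imaplib.IMAP4_SSL(host) as m:  # filter criterion: From
        m.login(user, password)
        m.select("INBOX")
        _, ids = m.search(None, "FROM", f'"{sender}"')
        return ids[0].split()  # message IDs matching the filter
```

Moving or deleting a retrieved message would use `m.copy` plus the `\Deleted` flag and `m.expunge` in the same IMAP session.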
GitHub: Automated Import/Export
This integration empowers developers to use GitHub as their version control system. Users can automate the transfer of any workload object, such as tasks, calendars, scripts, and triggers, to and from GitHub. Often, this integration is used to support the development process, including propagating changes to the next environment, such as QA and Development. Support the automation of the DevOps process by integrating the Stonebranch Universal Automation Center (UAC) with GitHub. Key Features: Import any Universal Controller (UC) object from GitHub into UC; for example, import a new template from the GitHub Marketplace into UC. Import any UC object from a script file into UC; for example, to support setups with no internet connection from UC to GitHub. Export any UC object from UC to GitHub; for example, export a developed Universal Template to a GitHub repository. Export any UC object to a script object; the content of the script can later be used to import it on a Controller without needing the UAC import functionality. Support UC customers, whether SaaS or on-premises. Additional Information: Objects integrated with GitHub include Linux/Unix, Windows, and web service tasks. File operations are performed using the PyGithub module. This task supports Universal Agents for both Linux and Windows operating systems. Users can select different log-levels, e.g., info or debug. All passwords are encrypted using UC credentials.
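An export to a repository can be sketched with the PyGithub module named above. The repository name, file path, and token are placeholders; the commit-message format is an invented convention, not the integration's.

```python
"""Sketch: pushing a UC object definition into a GitHub repository with
PyGithub. Repo, path, token, and the commit-message convention are
placeholder assumptions."""

def commit_message(obj_name, action="export"):
    # Hypothetical message convention for traceability of UAC exports.
    return f"UAC {action}: {obj_name}"

def export_to_github(token, repo_name, path, content, obj_name):
    from github import Github, UnknownObjectException  # deferred import
    repo = Github(token).get_repo(repo_name)
    try:  # update the file if it already exists in the repo ...
        existing = repo.get_contents(path)
        repo.update_file(path, commit_message(obj_name), content, existing.sha)
    except UnknownObjectException:  # ... otherwise create it
        repo.create_file(path, commit_message(obj_name, "create"), content)
```

The import direction is the mirror image: `repo.get_contents(path).decoded_content` yields the object definition to feed into the Controller.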
Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations
This integration allows users to schedule, trigger, monitor, and orchestrate the Google BigQuery process directly from the Universal Controller. Key Features: Users can perform the below Google BigQuery operations: BigQuery SQL. List datasets. List tables in a dataset. View job information. Create a dataset. Load a local file to a table. Load cloud storage data to a table. Export table data. Additional Info: This task uses the Python google-cloud-bigquery and google-auth modules to make REST API calls to Google BigQuery. This task uses the GCP project ID, BigQuery SQL or schema, dataset ID, job ID, location, table ID, cloud storage URI, and source file format as parameters of the BigQuery functions, and the GCP KeyFile (API key) of a service account for authenticating the REST API calls to Google BigQuery.
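The SQL operation and the job-information view can be sketched with the google-cloud-bigquery module named above. The key-file path and query are placeholders.

```python
"""Sketch: run BigQuery SQL with service-account key-file auth and
surface the job information, using google-cloud-bigquery. Key-file
path and SQL are placeholders."""

def job_summary(job_id, state, location):
    # Shape of the 'view job information' data surfaced back to the task.
    return {"job_id": job_id, "state": state, "location": location}

def run_sql(keyfile, sql):
    from google.cloud import bigquery  # deferred import
    # The service-account key file authenticates the underlying REST calls.
    client = bigquery.Client.from_service_account_json(keyfile)
    job = client.query(sql)
    rows = list(job.result())  # blocks until the query job completes
    return job_summary(job.job_id, job.state, job.location), rows
```

Loading a local file or Cloud Storage data would use `client.load_table_from_file` or `client.load_table_from_uri` with the same client.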
Hadoop: HDFS File Monitoring and Triggering
This integration allows users to monitor a file on the Hadoop Distributed File System (HDFS), then have the option to trigger an action. If the file is found, the file monitor can either go to success or launch a task in the Stonebranch Universal Automation Center. It's also possible to run the task monitor in auto-restart mode, which means it can restart itself automatically after finding a file. Key Features: Uses the Python HDFS module, which calls the Hadoop WebHDFS REST API. Supports Universal Agents for both Linux/Unix and Windows; however, it has currently only been tested against a Linux agent. Select different log-levels, e.g., info and debug. All passwords are encrypted using Universal Controller credentials. Currently, only the InsecureClient (the default) is implemented; the TokenClient can be implemented upon request. Currently, only direct file matches are implemented, as this is the standard functionality supported by the Python HDFS module. A scan for files using wildcards (*) or regular expressions can be implemented upon request.
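The monitor loop can be sketched with the Python HDFS module's InsecureClient (the client noted as implemented). The NameNode URL, user, path, and polling interval are placeholders.

```python
"""Sketch: poll HDFS for a direct (exact-path) file match with the
Python hdfs module's InsecureClient, which wraps the WebHDFS REST API.
URL, user, path, and intervals are placeholders."""

import time

def found(status):
    # status() with strict=False returns None while the path is absent,
    # so a non-None result is the 'file found' event described above.
    return status is not None

def monitor(url, user, path, interval=30, timeout=600):
    from hdfs import InsecureClient  # deferred import
    client = InsecureClient(url, user=user)
    deadline = time.time() + timeout
    while time.time() < deadline:
        if found(client.status(path, strict=False)):
            return True  # go to success / launch the follow-on task
        time.sleep(interval)
    return False
```

Auto-restart mode corresponds to calling `monitor` again after a successful find instead of finishing.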
Hitachi Vantara: Pentaho Data Integration
Pentaho Data Integration provides powerful ETL (Extract, Transform, and Load) capabilities. Universal Controller, using its Universal Extension capabilities, integrates with the Pentaho Data Integration tool to orchestrate the functionalities below. Key Features: Run a Pentaho Job from its Carte-configured repository. Run a Pentaho Job from its repository. Run a Pentaho Job from a remote job definition file. Define and run a Pentaho Job from the Universal Controller script library. Execute a Pentaho Transformation from a repository. Execute a Pentaho Transformation from a remote transformation definition file (XML). Define and run a Pentaho Transformation from the Universal Controller script library.
Informatica Cloud: Schedule, Control, and Manage
This integration allows users to schedule any data integration task, linear taskflow, or taskflow in the Informatica Cloud. Key Features: Schedule data integration tasks, including linear taskflows, in the Informatica Cloud. All communication is web-service based, using the latest Informatica REST API (versions 2 and 3) with support for folders. Log files, including the activity, session, and error logs, are available from the Universal Controller web UI in the same way as in the Informatica monitoring console.
Informatica PowerCenter: Schedule, Control, and Manage
This integration allows users to schedule Informatica PowerCenter workflows and tasks, including retrieving the workflow and session log. It's also possible to start a workflow from a certain task onwards. Key Features: Schedules Informatica PowerCenter via its web services hub; therefore, no installation on any Informatica system is required. Based on the standard PowerCenter web services hub using the SOAP protocol. The PowerCenter web services hub interface is called by a Universal Agent running on a Linux server or Windows server. The following actions are supported: Start a task in an Informatica PowerCenter workflow. Start an Informatica PowerCenter workflow. Start an Informatica PowerCenter workflow from a given task onwards.
Inter-Cloud Data Transfer
Stream data from one object store to another without intermediate storage. The Inter-Cloud Data Transfer integration allows users to transfer data to, from, and between any of the major private and public cloud providers like AWS, Google Cloud, and Microsoft Azure. It also supports the transfer of data to and from a Hadoop Distributed File System (HDFS) and to major cloud applications like OneDrive and SharePoint. Integrations within this solution package include: AWS S3. Google Cloud. SharePoint. Dropbox. OneDrive. Hadoop HDFS. Key Features: Transfer data to, from, and between any cloud provider. Transfer between any major storage applications like SharePoint or Dropbox. Transfer data to and from a Hadoop HDFS. Stream data from one object store to another – no intermediate storage. Executes quickly if the object stores are in the same region. Always preserves timestamps and verifies checksums. Supports encryption, caching, compression, and chunking. Supports regular expressions based on include/exclude filter rules. Supported actions are: List objects, list directory. Copy/move. Remove object/object store. Perform dry-runs. Monitor object. Copy URL.
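The "stream with no intermediate storage" idea can be illustrated between two S3-compatible stores: the source object's body is consumed as a stream and fed straight into the destination upload. This pairing and the chunk size are illustrative assumptions, not the integration's implementation.

```python
"""Sketch of a streaming, chunked object copy between two
S3-compatible stores: the object never touches local disk. Clients,
buckets, keys, and the chunk size are placeholders."""

def chunked(stream, chunk_size=8 * 1024 * 1024):
    # Chunking keeps memory usage flat regardless of object size.
    while True:
        block = stream.read(chunk_size)
        if not block:
            return
        yield block

def stream_copy(src_client, src_bucket, key, dst_client, dst_bucket):
    # src_client / dst_client are boto3-style S3 clients created by the
    # caller, possibly pointing at different providers or regions.
    body = src_client.get_object(Bucket=src_bucket, Key=key)["Body"]
    # upload_fileobj consumes the source stream directly, so no
    # intermediate file or full in-memory copy is ever created.
    dst_client.upload_fileobj(body, dst_bucket, key)
```

When both stores live in the same region, this pattern also explains the "executes quickly" note: the transfer is bounded by network throughput, not local disk.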
Jenkins: Start and Trigger Workflows
This integration improves the functionality of Jenkins when orchestrated from the Stonebranch Universal Automation Center. It encourages collaboration by enabling the well-controlled and automated deployment of applications over to the operations side. Key Features: UAC communicates with Jenkins through the Python Jenkins module. Jenkins can make REST API calls to the Stonebranch Universal Controller (UC) to trigger any task or workflow. This task can trigger or start an existing build job in Jenkins. UC will monitor the build execution in Jenkins until completion and then retrieve the build results. With this task, users can create a build job in Jenkins from UC. Any Jenkins build job definitions in XML will be stored centrally in UC. It offers the functionality to fetch the Jenkins job build information and list running build info in Jenkins from UC. Enable/disable Jenkins jobs and nodes and delete/copy/rename Jenkins jobs from UC. Users can list the installed plugins in Jenkins, and a plugin install can be triggered from UC. Set the next build number for Jenkins build jobs.
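The trigger-and-monitor flow can be sketched with the python-jenkins module named above. The server URL, job name, and polling interval are placeholders.

```python
"""Sketch: trigger a Jenkins build and monitor it to completion with
the python-jenkins module. URL, job name, credentials, and the poll
interval are placeholders."""

import time

def build_finished(info):
    # A Jenkins build reports result=None while it is still running.
    return info.get("result") is not None

def trigger_and_wait(url, user, token, job, params=None, poll=10):
    import jenkins  # deferred import of the python-jenkins module
    server = jenkins.Jenkins(url, username=user, password=token)
    next_no = server.get_job_info(job)["nextBuildNumber"]
    server.build_job(job, parameters=params)
    while True:  # monitor the build until completion, then return the result
        try:
            info = server.get_build_info(job, next_no)
            if build_finished(info):
                return info["result"]  # e.g. SUCCESS / FAILURE / ABORTED
        except jenkins.NotFoundException:
            pass  # the queued build has not started yet
        time.sleep(poll)
```

Creating a job from an XML definition stored centrally would use `server.create_job(name, xml)` with the same client.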
JScape: Managed File Transfer
This integration provides UAC customers the ability to manage and integrate their JSCAPE Managed File Transfer Server processes within their UAC automation processes and workflows. Key Features: This integration implements tasks to perform the following JSCAPE Managed File Transfer functions, which can be integrated into your UAC workflows: PGP Encrypt. PGP Decrypt. Run a UDM Gateway Trigger. SFTP File Upload. SFTP File Download. Trading Partner File Upload. Trading Partner File Download. Trading Partner File Upload using a Regex or Generic Filename Pattern. Trading Partner File Download using a Regex or Generic Filename Pattern.
Kubernetes: Automate Container Operations
This integration allows users to interact with Kubernetes and perform the list (get), create, delete, and replace functions. Key Features: This task is delivered as a built-in Universal Template from the Stonebranch Universal Controller (UC) release 6.9 and higher. Documentation is included in the UC 6.9 product documentation. It provides the ability to automate Kubernetes container operations by providing tasks for the following Kubernetes functions: list, create, delete, and replace for Kubernetes Pods, Deployments, and Namespaces.
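The four Pod functions can be sketched with the official Kubernetes Python client. Using this client is an assumption (the template's internals are not documented here), and the namespace and resource names are placeholders.

```python
"""Sketch: list/create/delete/replace for Pods with the official
Kubernetes Python client. Client choice, namespace, and names are
placeholder assumptions."""

def pod_names(pod_list):
    # Extract names from a V1PodList for the 'list (get)' function.
    return [p.metadata.name for p in pod_list.items]

def manage_pods(namespace, action, body=None, name=None):
    from kubernetes import client, config  # deferred imports
    config.load_kube_config()  # or load_incluster_config() inside a cluster
    v1 = client.CoreV1Api()
    if action == "list":
        return pod_names(v1.list_namespaced_pod(namespace))
    if action == "create":
        return v1.create_namespaced_pod(namespace, body)
    if action == "delete":
        return v1.delete_namespaced_pod(name, namespace)
    if action == "replace":
        return v1.replace_namespaced_pod(name, namespace, body)
```

Deployments follow the same pattern through `client.AppsV1Api()` (e.g. `list_namespaced_deployment`), and Namespaces through the un-namespaced `CoreV1Api` calls.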