Exploring Data Pipeline Automation Integrations for the UAC
Integration is key to centralizing control of all automated processes along your data pipeline. Learn to orchestrate the secure flow of data throughout your organization.
Data is everywhere… on-prem and in the cloud. In foundational business platforms and function-specific SaaS apps. Captured in IoT devices and streamed into data warehouses or lakes.
While 99% of companies report investments in data initiatives, there is certainly no guarantee of success. Less than a quarter of those companies have managed to create a data-driven organization. Why? One of the reasons is that cobbling together one-off data pipelines with a slew of custom automation scripts and built-in job schedulers creates an environment dependent on tribal knowledge — which is ultimately impossible to scale.
There's a better way. A service orchestration and automation platform (SOAP) revolutionizes data delivery and supports an entire organization's diverse data management needs. Using proven agent technology and modern API integrations, SOAPs securely connect to and control all the systems, platforms, and apps in today's hybrid IT environments.
In this article, we'll explore the role of integrations in data pipelines and look at some of the data-specific integrations available for the Stonebranch Universal Automation Center (UAC).
Centralize Automation for Your Data Pipeline
It's the ability to easily integrate with third-party technologies — whether on-prem or in the cloud — that sets SOAPs apart from their workload automation (WLA) predecessors. These integrations allow data architects, data engineers, data scientists, and citizen automators alike to reach into and control your existing data tool stack from one central platform. SOAPs unify automation workflows across otherwise siloed applications, including:
- Data sources like Salesforce, SAP, SQL Server, and Oracle
- ETL/ELT data integration tools like AWS Glue, Azure Data Factory, Informatica, and Kafka
- Data lakes and data warehouses like Databricks, Google BigQuery, Hadoop, Redshift, and Snowflake
- Data analysis tools like Dataiku, RapidMiner, SAS, and Terraform
- Data delivery and presentation tools like Microsoft Power BI, Qlik, and Tableau
Stonebranch Puts the Ops in DataOps
The Stonebranch Universal Automation Center is a modern SOAP that empowers data, development, and IT teams to streamline complex hybrid IT workflows, monitor automated IT processes, and move quickly with proactive alerts to keep pipelines intact and data flowing.
Simply put, the pre-built extensions available in the Stonebranch Integration Hub help your big data initiatives succeed — and offer future-proof flexibility to adapt to a rapidly evolving landscape of data tools. Here's a sampling of what's currently available in the data pipeline category:
Amazon: S3 & SQS
Automate real-time file transfers into and out of Amazon S3 cloud storage buckets.
Centrally send, store, and receive messages between software components.
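To make the transfer-plus-notify pattern concrete, here is a minimal Python sketch that builds an SQS-style message body announcing a new S3 object. The bucket, key, and simplified event shape are illustrative assumptions; the UAC extension handles this wiring for you, and with boto3 such a body would be handed to `sqs.send_message` after `s3.upload_file` completes.

```python
import json

def s3_event_message(bucket: str, key: str, size: int) -> str:
    """Build an SQS message body in the style of an S3 event
    notification, announcing that a file landed in a bucket.
    (Simplified sketch; a real S3 notification carries more fields.)"""
    event = {
        "Records": [
            {
                "eventName": "ObjectCreated:Put",
                "s3": {
                    "bucket": {"name": bucket},
                    "object": {"key": key, "size": size},
                },
            }
        ]
    }
    return json.dumps(event)

# Hypothetical bucket and key names, for illustration only.
body = s3_event_message("analytics-landing", "raw/orders.csv", 10_240)
```

A downstream workflow can then parse this body and decide whether the arriving file should trigger the next pipeline step.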
Databricks
Perform end-to-end orchestration and automation of jobs and clusters in Databricks environments, whether on AWS or Azure.
Google Cloud Platform
Automate your Google BigQuery operations: create datasets, list datasets and tables, load local and cloud data to a table, export table data, and view job information.
Informatica: Cloud & PowerCenter
Schedule any data integration task, linear taskflow, or taskflow in the Informatica Cloud.
Schedule Informatica PowerCenter workflows and tasks, including retrieving the workflow and session log.
Apache Kafka
Monitor Kafka events (messages) until a condition is met, then trigger workflows or pass information related to the event.
Publish events (messages) to topics in Kafka.
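The monitor-until-condition pattern behind such a Kafka trigger can be sketched in plain Python. The event stream below is a simple list standing in for a KafkaConsumer, and the event fields are hypothetical; the point is the control flow: consume, test a condition, fire a downstream action.

```python
from typing import Callable, Iterable

def monitor_events(events: Iterable[dict],
                   condition: Callable[[dict], bool],
                   trigger: Callable[[dict], None]) -> None:
    """Consume events until one satisfies the condition,
    then fire the trigger with that event and stop."""
    for event in events:
        if condition(event):
            trigger(event)
            return

fired = []
# Stand-in for a Kafka topic: with a real consumer this would be a
# KafkaConsumer loop; a plain list keeps the sketch self-contained.
stream = [{"type": "heartbeat"}, {"type": "file_ready", "path": "/tmp/x"}]
monitor_events(stream,
               condition=lambda e: e["type"] == "file_ready",
               trigger=fired.append)
```

In the UAC, the "trigger" side is a workflow launch rather than a callback, but the consume-test-fire shape is the same.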
Microsoft Azure
Move data to and from Azure Blob Storage containers.
Manage data pipeline processes in Azure Data Factory.
Microsoft Power BI
Refresh datasets and dataflows in the Microsoft Power BI business analytics service.
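Behind the scenes, dataset refreshes are triggered through the Power BI REST API. As a sketch, the helper below composes the group-scoped refresh endpoint; a real call would POST to this URL with an OAuth 2.0 bearer token, and the IDs shown are placeholders.

```python
def refresh_url(group_id: str, dataset_id: str) -> str:
    """Compose the Power BI REST endpoint used to trigger a
    dataset refresh (invoked with an authenticated POST)."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

# Placeholder workspace and dataset IDs, for illustration only.
url = refresh_url("my-workspace-id", "my-dataset-id")
```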
Microsoft SQL Server
Execute a SQL Server Integration Services (SSIS) package using the dtexec utility.
Complete various administrative tasks with the SQL Server Reporting Services (SSRS) rs.exe utility, including publishing reports and moving reports from one server to another.
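To make the SSIS bullet concrete, here is a small Python helper that composes a dtexec command line, overriding package variables via /SET. The package path and variable names are hypothetical; in practice you would hand the list to `subprocess.run` on a host where dtexec is installed.

```python
def dtexec_command(package: str, variables: dict[str, str]) -> list[str]:
    """Build a dtexec invocation: /F names the .dtsx package file,
    and each /SET overrides one package variable's value."""
    cmd = ["dtexec", "/F", package]
    for name, value in variables.items():
        cmd += ["/SET",
                f"\\Package.Variables[User::{name}].Properties[Value];{value}"]
    return cmd

# Hypothetical package path and variable, for illustration only.
cmd = dtexec_command(r"C:\etl\LoadOrders.dtsx", {"Environment": "prod"})
```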
Pentaho
Power up your ETL capabilities by centrally orchestrating Pentaho jobs and transformations.
SAP
Schedule and execute batch input sessions in SAP to transfer data from non-SAP systems to SAP systems, or between SAP systems.
Schedule any SAP BusinessObjects schedulable resource, including Crystal or Web Intelligence reports.
Schedule any SAP BusinessObjects Data Services ETL job, using the AL_RWJobLauncher.exe utility that comes with SAP Data Services.
Snowflake
Orchestrate, schedule, trigger, and monitor Snowflake load and unload processes from different data sources (including cloud storage or local virtual machines).
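Under the hood, a Snowflake load step boils down to a COPY INTO statement that pulls staged files into a table. The helper below sketches that statement; the table, stage, and file-format names are hypothetical, and a real run would execute the string through the Snowflake connector.

```python
def copy_into_sql(table: str, stage: str, file_format: str) -> str:
    """Compose Snowflake's bulk-load statement: COPY INTO reads
    files from a named stage into the target table."""
    return (f"COPY INTO {table} FROM @{stage} "
            f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')")

# Hypothetical table, stage, and format names, for illustration only.
sql = copy_into_sql("orders", "landing_stage", "csv_format")
```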
SQL Databases
Execute SQL scripts and functions against a MySQL, PostgreSQL, Microsoft SQL Server, Oracle, or SAP HANA database.
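The execute-a-script pattern is the same across all of those engines: open a connection, run the script, inspect the result. The self-contained sketch below uses Python's built-in sqlite3 as a stand-in, with a made-up table, purely to show the shape of the operation.

```python
import sqlite3

# SQLite stands in here for the listed engines (MySQL, PostgreSQL,
# SQL Server, Oracle, SAP HANA); the pattern -- open a connection,
# run a multi-statement script, read the result -- is the same.
script = """
CREATE TABLE pipeline_runs (name TEXT, status TEXT);
INSERT INTO pipeline_runs VALUES ('nightly_load', 'success');
"""

conn = sqlite3.connect(":memory:")
conn.executescript(script)
rows = conn.execute("SELECT status FROM pipeline_runs").fetchall()
conn.close()
```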
Stonebranch has already done the heavy lifting for you! The Stonebranch Integration Hub offers pre-built, downloadable extensions that allow you to centrally orchestrate your data pipeline tools, cloud service providers, DevOps platforms, business applications, and more.
If you prefer to exercise your own extraordinary development skills, go for it! Visit our documentation center to learn how to create a new integration. Once it's complete, you can even share your creation with other UAC users by contributing it to the Integration Hub.
Start Your Automation Initiative Now
Schedule a Live Demo with a Stonebranch Solution Expert