6 products found for "google"
Free
Video
Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations
Google BigQuery is a serverless, highly scalable data warehouse designed to make SQL-based queries on large datasets fast and easy. It is part of the Google Cloud Platform and enables users to perform real-time data analysis with its powerful processing capabilities and simple SQL interface.
This Universal Task allows Stonebranch users to schedule, trigger, monitor, and orchestrate the Google BigQuery process directly from the Universal Controller.
This task uses the Python modules google-cloud-bigquery and google-auth to make REST API calls to Google BigQuery.
This task takes the GCP Project ID, BigQuery SQL or Schema, Dataset ID, Job ID, Location, Table ID, Cloud Storage URI, and Source File Format as parameters of the BigQuery functions, and uses the GCP KeyFile (API key) of a Service Account to authenticate the REST API calls to Google BigQuery.
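As an illustration of this approach (a minimal sketch, not the Universal Task's actual code; the project ID, keyfile path, and query below are placeholders):

```python
# Hypothetical sketch: authenticate with a service-account keyfile and run a
# BigQuery SQL query. Project ID, keyfile path, and query are placeholders.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "keyfile.json"  # GCP KeyFile of the service account
)
client = bigquery.Client(project="my-gcp-project", credentials=credentials)

query_job = client.query("SELECT COUNT(*) AS n FROM `my_dataset.my_table`")
for row in query_job.result():  # result() waits for the job to complete
    print(row.n)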
Key Features
This Universal Task provides the following main features:
BigQuery SQL
List dataset
List tables in the dataset
View job information
Create a dataset
Load local file to a table
Load cloud storage data to a table (see the sketch after this list)
Export table data
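For the cloud-storage load feature specifically, a comparable sketch (again with placeholder names, authenticated as in the example above) might use a load job:

```python
# Hypothetical sketch: load a CSV from a Cloud Storage URI into a BigQuery
# table. URI and Table ID are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # auth as in the sketch above

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/data.csv",             # Cloud Storage URI (placeholder)
    "my-gcp-project.my_dataset.my_table",  # destination Table ID (placeholder)
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```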
What's New (v1.1.4)
This release contains a bug fix.
Fix: Handling of double quotes in the _scriptPath.
Free
Google Kubernetes Engine Jobs
Google Kubernetes Engine (GKE) is a service for deploying, managing, and scaling Kubernetes containers on Google Cloud. Among other capabilities, GKE supports Jobs and CronJobs for task execution. Jobs are used for one-time tasks that run to completion, such as batch processing, while CronJobs enable scheduled, recurring task execution. To deploy a Job or CronJob, a Kubernetes resource definition is used. This definition is typically written in a YAML or JSON file and specifies the task details, container image, and execution configuration. Once applied to a GKE cluster, the Job runs to completion and terminates automatically, while CronJobs create new Job instances according to their defined schedule.
This integration provides the capability to deploy a GKE job on a specified cluster or delete a job from a namespace.
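As a rough sketch of what such a deployment involves (this uses the official kubernetes Python client, not the integration's own code; the manifest path and namespace are placeholders):

```python
# Hypothetical sketch: deploy a Job to a GKE cluster from a local YAML
# manifest using the official kubernetes Python client.
from kubernetes import client, config, utils

config.load_kube_config()  # assumes kubeconfig already points at the GKE cluster
k8s_client = client.ApiClient()

# apply the Job manifest; the file name is illustrative
utils.create_from_yaml(k8s_client, "batch-job.yaml", namespace="default")
```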
Key Features
Deploy Job and CronJob resources, either declaratively or imperatively, from local YAML or JSON files, remote URLs, or UAC scripts.
Create one-off Job resources based on existing CronJob definitions in the cluster.
Delete Job and CronJob resources via the delete action or manually through a Dynamic Command. Job resources can also be set to be deleted automatically after completion.
Live-stream real-time updates about Kubernetes pods as they are created, modified, or terminated.
Option to automatically or manually retrieve information, including logs, for all containers pertaining to a specific job.
What's New v2.0.0
Enhancements
Support for deploying CronJobs and Jobs from CronJobs.
Support for deleting CronJobs.
Breaking Changes
Dropped support for the Application Default Credentials authentication method.
Free
Video
Inter-Cloud Data Transfer
This Universal Extension provides the capability to perform data transfers between cloud-based storage services, as well as local or distributed file systems. Transfers are fast and secure, since data is streamed from one storage to another with no intermediate storage taking place (a minimal sketch follows the list below). Multiple storage systems are supported (an overview can be found here). Integrations within this solution package include:
Amazon S3
Google Cloud Storage
Microsoft OneDrive Business, including Sharepoint
Microsoft Azure Blob Storage
Hadoop Distributed File Storage (HDFS)
Local file system (Linux, Windows)
HTTP(S) URL
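The sketch below illustrates the direct-streaming idea with two of the supported storages; it is not the extension's implementation, and the bucket and object names are placeholders:

```python
# Hypothetical sketch: stream an object from Amazon S3 to Google Cloud Storage
# without intermediate storage. Bucket and object names are placeholders.
import boto3
from google.cloud import storage

s3 = boto3.client("s3")
gcs = storage.Client()

# the S3 response body is a file-like stream; upload_from_file reads it
# directly, so the object never lands on local disk
resp = s3.get_object(Bucket="source-bucket", Key="data.csv")
blob = gcs.bucket("destination-bucket").blob("data.csv")
blob.upload_from_file(resp["Body"], size=resp["ContentLength"])
```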
Key Features
This Universal Extension supports the following key features:
Actions
Copy, move, synchronize data between two storages.
Copy a URL's content to a cloud or local destination without saving it in temporary storage.
List data on a storage, including listing with details or in JSON format for machine parsing.
Create objects on a storage.
Delete objects from a storage.
Features:
Fast transfers for objects stored in the same region.
Always preserves timestamps and verifies checksums.
Supports encryption, caching, compression, chunking.
Dynamic token updates for OneDrive Business cloud storage, observing the OneDrive business refresh token flow.
Support for dry runs. Allows users to execute a Universal Task without making any permanent changes on the target storage.
Advanced filtering capability for files or objects to be listed or transferred.
Option to mark the Universal Task as Failed when no files have been transferred.
List of overwrite options for existing data.
Additional customized options.
Observability capabilities, providing users with detailed statistical insights for data transfers across all task instances.
Output
Progress of the selected Action is visible during Universal Task Instance execution.
Text or JSON formatted output.
What's New V 4.5.0
Enhancements
Updated bundled binaries, bringing various improvements and fixes.
Enabled the use of FTP with TLS/SSL certificates.
Fixes
Time parsing issues on systems using different date formats.
Free
Video
Fivetran
Fivetran is a cloud-based data integration platform that helps businesses automate the process of extracting data from various sources, transforming it, and loading it into a data warehouse for analysis. It offers a wide range of pre-built connectors to popular data sources such as databases, marketing platforms, CRMs, and more. Fivetran's connectors are designed to be easy to set up and use, allowing businesses to focus on data analysis rather than data integration. The platform also offers features such as data transformation, scheduling, monitoring, and access control to help organizations manage their data pipelines more efficiently. Fivetran supports a variety of data warehouses, including Amazon Redshift, Google BigQuery, Snowflake, and Microsoft Azure.
This integration allows users to create and execute Fivetran Tasks in the Universal Controller.
Key Features:
Trigger a Sync action on a Connector. Optionally, wait until Sync is completed.
Trigger a Re-sync (Historical Sync) action on a Connector. Optionally, wait until Re-sync is completed.
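For illustration, a Sync can be triggered through Fivetran's REST API roughly as follows (a minimal sketch, not the integration's code; the connector ID and API credentials are placeholders):

```python
# Hypothetical sketch: trigger a Fivetran connector sync via the REST API.
import requests

FIVETRAN_API = "https://api.fivetran.com/v1"
connector_id = "my_connector_id"  # placeholder

resp = requests.post(
    f"{FIVETRAN_API}/connectors/{connector_id}/sync",
    auth=("API_KEY", "API_SECRET"),  # Fivetran uses basic auth with key/secret
)
resp.raise_for_status()
print(resp.json())
```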
What's New in V1.0.1:
A fix is provided to execute the integration with a Python installation other than the one bundled with the Universal Agent.
Free
Inter-Cloud Data Monitor
This Universal Extension provides the capability to monitor objects (files/directories) on various major cloud providers, including AWS, Google Cloud, and Microsoft Azure, and to trigger Universal Events, which can be used for the execution of tasks and workflows. It can be executed as a standalone UAC task; however, to benefit from the full functionality and scalability of UAC, it is recommended to use it in conjunction with Universal Monitor Triggers and Universal Monitor Tasks.
Multiple storage systems are supported (an overview can be found here). Integrations within this solution package include:
Amazon S3
Google Cloud Storage
Microsoft OneDrive Business, including Sharepoint
Microsoft Azure Blob Storage
Hadoop Distributed File Storage (HDFS)
Local file system (Linux, Windows)
HTTP(S) URL
Key Features:
This Universal Extension supports the following key features:
Features
Monitor object creation, change (based on modification time), and deletion, using advanced filtering capabilities (see the polling sketch after this list).
Generates Universal Events for each entity the monitor has detected, which can be used to trigger other tasks and workflows.
Large set of parameterization options and broad coverage of remote storages.
Plug & Play, easy installation without the need to install external dependencies.
Dynamic token updates for OneDrive Business cloud storage, observing the OneDrive business refresh token flow.
Enhanced observability with event metrics, providing users with detailed statistical insights into events processed across all task instances.
Shared configuration with Cloud Data Transfer.
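The sketch below shows the general polling idea behind such a monitor for one storage type; it is not the extension's implementation, and the bucket name and interval are placeholders:

```python
# Hypothetical sketch: detect created, changed, and deleted objects in an S3
# bucket by comparing LastModified timestamps between polls.
import time
import boto3

s3 = boto3.client("s3")
seen = {}  # object key -> last LastModified timestamp observed

while True:
    objects = s3.list_objects_v2(Bucket="watched-bucket").get("Contents", [])
    current = {o["Key"]: o["LastModified"] for o in objects}
    for key, mtime in current.items():
        if key not in seen:
            print(f"created: {key}")   # a Universal Event would be raised here
        elif seen[key] != mtime:
            print(f"changed: {key}")
    for key in set(seen) - set(current):
        print(f"deleted: {key}")
    seen = current
    time.sleep(30)  # polling interval; the extension's mechanism may differ
```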
What's New v 2.4.0
Enhancements
Updated bundled binaries, bringing various improvements and fixes.
Enabled the use of FTP with TLS/SSL certificates.
Free
Video
Snowflake: Schedule, Trigger, Monitor, and Orchestrate Operations
Snowflake is a cloud-based data warehousing platform that allows users to store, manage, and analyze large amounts of data. It offers high performance, scalability, and ease of use by separating storage and compute resources.
This Universal Task allows Stonebranch users to orchestrate, schedule, trigger, and monitor the Snowflake load and unload processes from different data sources (cloud storage or local VMs) directly from the Universal Controller. It uses Python libraries to perform all functions listed in the following sections. Alternatively, you can perform all these operations using the Snowflake JDBC driver, which you can add to the Universal Controller libraries, and use a SQL Task to perform any operations with Snowflake (Downloading / integrating the JDBC Driver | Snowflake Documentation).
Key Features:
Users can orchestrate the following Snowflake functionalities:
Snowflake loading processes:
Load data from AWS S3 to Snowflake.
Load data from Azure storage to Snowflake.
Load data from Google storage to Snowflake.
Load an internal stage file to a Snowflake table.
Copy from a local server to internal staging.
Snowflake unloading processes:
Unload Snowflake data to AWS S3.
Unload Snowflake data to Azure storage.
Unload Snowflake data to Google storage.
Unload Snowflake data to internal staging.
Unload from internal stage to a local server.
Snowflake execute commands:
Execute a Snowflake command.
What's New (v1.2.0)
Addition: Key Pair Authentication
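For illustration, a single load step can be driven from Python with the snowflake-connector-python package roughly as follows (a minimal sketch, not the task's code; all connection parameters, stage, and table names are placeholders):

```python
# Hypothetical sketch: load a staged CSV file into a Snowflake table with
# COPY INTO. All names and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="PUBLIC",
)
try:
    with conn.cursor() as cur:
        # stage and table names are illustrative
        cur.execute(
            "COPY INTO sales "
            "FROM @my_stage/sales.csv "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
finally:
    conn.close()
```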