Destinations
A destination stage represents the target for a pipeline. A pipeline can include one or more destinations.
The destinations you can use depend on the execution mode of the pipeline: standalone, cluster, or edge. To help create or test pipelines, you can use a development destination.
Standalone Pipelines Only
In standalone pipelines, you can use the following destinations:
- Azure Synapse SQL - Loads data into one or more tables in Microsoft Azure Synapse.
- Google BigQuery (Enterprise) - Loads new data or change data capture (CDC) data to Google BigQuery, compensating for data drift to support loading to new or existing datasets, tables, and columns.
- Hive Metastore - Creates and updates Hive tables as needed.
- RabbitMQ Producer - Writes data to RabbitMQ.
- Send Response to Origin - Sends records with the specified response to the microservice origin in the pipeline. Use only in a microservice pipeline.
Standalone or Cluster Pipelines
In standalone or cluster pipelines, you can use the following destinations:
- Aerospike (deprecated) - Writes data to Aerospike.
- Amazon S3 - Writes data to Amazon S3.
- Azure Data Lake Storage (Legacy) - Writes data to Azure Data Lake Storage Gen1.
- Azure Data Lake Storage Gen1 (deprecated) - Writes data to Azure Data Lake Storage Gen1.
- Azure Data Lake Storage Gen2 - Writes data to Azure Data Lake Storage Gen2.
- Azure Event Hub Producer - Writes data to Azure Event Hub.
- Azure IoT Hub Producer - Writes data to Microsoft Azure IoT Hub.
- Cassandra - Writes data to a Cassandra cluster.
- CoAP Client - Writes data to a CoAP endpoint.
- Couchbase - Writes data to a Couchbase database.
- Databricks Delta Lake - Writes data to one or more Delta Lake tables on Databricks.
- Elasticsearch - Writes data to an Elasticsearch cluster.
- Flume (deprecated) - Writes data to a Flume source.
- Google BigQuery - Streams data into existing datasets and tables in Google BigQuery.
- Google Bigtable - Writes data to Google Cloud Bigtable.
- Google Cloud Storage - Writes data to Google Cloud Storage.
- Google Pub/Sub Publisher - Publishes messages to Google Pub/Sub.
- GPSS Producer (deprecated) - Writes data to Greenplum Database through Greenplum Stream Server (GPSS).
- Hadoop FS - Writes data to HDFS or Azure Blob storage.
- HBase - Writes data to an HBase cluster.
- Hive Streaming (deprecated) - Writes data to Hive.
- HTTP Client - Writes data to an HTTP endpoint. Can send responses to a microservice origin in a microservice pipeline.
- InfluxDB - Writes data to an InfluxDB 0.9 to 1.x database.
- InfluxDB 2.x - Writes data to an InfluxDB 2.x database.
- JDBC Producer - Writes data to a database table through a JDBC connection.
- JMS Producer - Writes data to JMS.
- Kafka Producer - Writes data to a Kafka cluster. Can send responses to a microservice origin in a microservice pipeline.
- Kinesis Firehose - Writes data to a Kinesis Firehose delivery stream.
- Kinesis Producer - Writes data to Kinesis Streams. Can send responses to a microservice origin in a microservice pipeline.
- KineticaDB (deprecated) - Writes data to a table in a Kinetica cluster.
- Kudu - Writes data to Kudu.
- Local FS - Writes data to a local file system.
- MapR DB - Writes data as text, binary data, or JSON strings to MapR DB binary tables.
- MapR DB JSON - Writes data as JSON documents to MapR DB JSON tables.
- MapR FS - Writes data to MapR FS.
- MapR Streams Producer - Writes data to MapR Streams.
- MemSQL Fast Loader (deprecated) - Writes data to MemSQL or MySQL.
- MongoDB - Writes data to MongoDB.
- MQTT Publisher - Publishes messages to a topic on an MQTT broker.
- Named Pipe - Writes data to a named pipe.
- Pulsar Producer - Writes data to Apache Pulsar topics.
- Redis - Writes data to Redis.
- Salesforce - Writes data to Salesforce.
- SDC RPC (deprecated) - Passes data to an SDC RPC origin in an SDC RPC pipeline.
- SFTP/FTP/FTPS Client - Sends data to a URL using SFTP, FTP, or FTPS.
- Snowflake - Writes data to tables in a Snowflake database.
- Snowflake File Uploader - Writes whole files to an internal Snowflake stage. Use with the Snowflake executor.
- Solr - Writes data to a Solr node or cluster.
- Splunk - Writes data to Splunk.
- SQL Server 2019 BDC Bulk Loader - Writes data to Microsoft SQL Server 2019 Big Data Cluster (BDC) using a bulk insert.
- Syslog - Writes data to a Syslog server.
- Tableau CRM - Writes data to Salesforce Tableau CRM.
- To Error - Passes records to the pipeline for error handling.
- Trash - Removes records from the pipeline.
- WebSocket Client - Writes data to a WebSocket endpoint.
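Destinations are configured as stages in the pipeline definition, and attaching several destinations to the same output lane is how a pipeline writes to more than one target. As a rough sketch of what the stage list in an exported pipeline can look like with two destinations fed by one origin (the instance names, stage identifiers, and lane names below are illustrative assumptions, not taken from this document; check an actual pipeline export for the exact values):

```json
{
  "stages": [
    {
      "instanceName": "DevDataGenerator_01",
      "stageName": "com_streamsets_pipeline_stage_devtest_RandomDataGeneratorSource",
      "inputLanes": [],
      "outputLanes": ["lane1"]
    },
    {
      "instanceName": "LocalFS_01",
      "stageName": "com_streamsets_pipeline_stage_destination_localfilesystem_LocalFileSystemDTarget",
      "inputLanes": ["lane1"],
      "outputLanes": []
    },
    {
      "instanceName": "Trash_01",
      "stageName": "com_streamsets_pipeline_stage_destination_devnull_NullDTarget",
      "inputLanes": ["lane1"],
      "outputLanes": []
    }
  ]
}
```

Because both destination stages read the same input lane, every record the origin emits is delivered to both the Local FS and Trash destinations.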
Edge Pipelines
In edge pipelines, you can use the following destinations:
- Amazon S3 - Writes data to Amazon S3.
- CoAP Client - Writes data to a CoAP endpoint.
- HTTP Client - Writes data to an HTTP endpoint.
- InfluxDB - Writes data to InfluxDB.
- Kafka Producer - Writes data to a Kafka cluster.
- Kinesis Firehose - Writes data to a Kinesis Firehose delivery stream.
- Kinesis Producer - Writes data to Kinesis Streams.
- MQTT Publisher - Publishes messages to a topic on an MQTT broker.
- To Error - Passes records to the pipeline for error handling.
- WebSocket Client - Writes data to a WebSocket endpoint.
Development Destination
To help create or test pipelines, you can use the following development destination:
- To Event - Generates events for testing event handling functionality.
For more information, see Development Stages.