Overview
A destination stage represents the target for a pipeline. You can use one or more destinations in a pipeline.
You can use the following destinations in a Transformer pipeline:
- ADLS Gen2 - Writes data to Azure Data Lake Storage Gen2.
- Amazon Redshift - Writes data to an Amazon Redshift table.
- Amazon S3 - Writes data to Amazon S3 objects.
- Azure Event Hubs - Writes data to Azure Event Hubs.
- Azure SQL - Writes data to an Azure SQL database table.
- Delta Lake - Writes data to a Delta Lake table.
- Elasticsearch - Writes data to an Elasticsearch cluster.
- File - Writes data to files in HDFS or local file systems.
- Google BigQuery - Writes data to a Google BigQuery table.
- Hive - Writes data to a Hive table.
- JDBC - Writes data to a database table using a JDBC driver.
- Kafka - Writes data to a Kafka cluster.
- Kudu - Writes data to a Kudu table.
- MapR Event Store - Writes data to MapR Event Store (formerly MapR Streams).
- MapR Hive - Writes data to a MapR Hive table.
- Snowflake - Writes data to a Snowflake table.
- Unity Catalog - Writes data to a Databricks Unity Catalog table.
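Conceptually, every destination receives the batches of records produced by upstream pipeline stages and writes them to the target system in the expected format. As a rough illustration of that role, the sketch below mimics what a File destination might do when writing records as JSON lines to a local directory. The function and file-naming scheme are hypothetical, for illustration only; this is not the Transformer API.

```python
import json
import os
import tempfile

def write_to_file_destination(records, directory, prefix="out"):
    """Write one batch of records to a JSON-lines file, loosely
    mimicking a File destination writing to a local file system.
    (Illustrative sketch only -- not the Transformer API.)"""
    os.makedirs(directory, exist_ok=True)
    # A hypothetical naming scheme: prefix plus a batch counter.
    path = os.path.join(directory, f"{prefix}-000000.json")
    with open(path, "w") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")
    return path

# Usage: write a small batch produced by upstream stages.
out_dir = tempfile.mkdtemp()
path = write_to_file_destination(
    [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}], out_dir)
with open(path) as f:
    lines = f.read().splitlines()
print(len(lines))  # one output line per record
```

Real destinations additionally handle concerns this sketch omits, such as partitioned output, write modes (overwrite versus append), and credentials for the target system.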