Deprecated Functionality
Deprecated functionality is functionality that is marked for removal in a future release. A stage or feature can be deprecated because it is not commonly used, has been replaced, or connects to a system with an end-of-service date. In the pipeline canvas, a sunset icon indicates that a non-Enterprise stage is deprecated.
For example, the Kafka Consumer origin is deprecated because you can use the more powerful Kafka Multitopic Consumer origin. In the stage library panel, the sunset icon indicates that the Kafka Consumer origin is deprecated.
Deprecated Feature | Suggested Alternatives |
---|---|
Data Collector user interface | In a future release, you will use the Control Hub user interface to design and run Data Collector pipelines. |
Cluster pipelines and cluster-only origins | StreamSets recommends using StreamSets Transformer instead. For more information, see the Transformer documentation. There may be some cases where Transformer does not currently achieve full feature parity with cluster pipelines. StreamSets will work with customers to achieve feature parity on a case-by-case basis, as needed. |
SDC RPC pipelines, including SDC RPC stages | This feature is deprecated because it does not support queueing, buffering, or failure handling as required by current StreamSets standards. |
Deprecated Origin | Suggested Alternatives |
---|---|
Azure Data Lake Storage Gen1 | Azure Data Lake Storage Gen1 has been retired by Microsoft. Try switching to Azure Data Lake Storage Gen2 as recommended by Microsoft. Then, you can use the Azure Data Lake Storage Gen2 origin, available in Data Collector and Transformer. |
Hadoop FS | Use the Transformer File origin for cluster workloads. |
Kafka Consumer | This stage is deprecated in favor of the Kafka Multitopic Consumer origin, which supports later versions of the Kafka API. |
MapR FS | Use the Transformer File origin for cluster workloads. |
NiFi HTTP Server | This origin is not widely used and is thus being deprecated. There is no specific alternative. |
Omniture | This origin is not widely used and is thus being deprecated. There is no specific alternative. |
SDC RPC | See SDC RPC pipelines. |
Start Pipelines | This origin only orchestrates pipelines that are not tied to Control Hub. Register your Data Collector with Control Hub and use the Start Jobs origin instead. |
Teradata Consumer | This origin is not widely used and is thus being deprecated. There is no specific alternative. |
Deprecated Processor | Suggested Alternatives |
---|---|
Databricks ML Evaluator | Use Transformer. |
Spark Evaluator | Use Transformer. |
Start Pipelines | This processor only orchestrates pipelines that are not tied to Control Hub. Register your Data Collector with Control Hub and use the Start Jobs processor instead. |
Value Replacer | Use the Field Replacer processor. |
Wait for Pipelines | This processor only orchestrates pipelines that are not tied to Control Hub. Register your Data Collector with Control Hub and use the Wait for Jobs processor instead. |
Deprecated Destination | Suggested Alternatives |
---|---|
Aerospike | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
Azure Data Lake Storage Legacy and Gen1 | Azure Data Lake Storage Gen1 has been retired by Microsoft. Try switching to Azure Data Lake Storage Gen2 as recommended by Microsoft. Then, you can use the Azure Data Lake Storage Gen2 destination, available in Data Collector and Transformer. |
Flume | Cloudera has removed Flume from CDP 7.0. Earlier versions that included Flume are now end-of-life and no longer supported. Thus, we are deprecating this destination and have no specific alternative. You might switch to alternative technologies such as Kafka or Pulsar, and then use the Kafka Producer or Pulsar Producer destinations. |
GPSS Producer | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
Hive Streaming | Use the Transformer Hive destination. |
KineticaDB | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
MemSQL Fast Loader | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
SDC RPC | See SDC RPC pipelines. |
Deprecated Executor | Suggested Alternatives |
---|---|
ADLS Gen1 File Metadata | Azure Data Lake Storage Gen1 has been retired by Microsoft. Try switching to Azure Data Lake Storage Gen2 as recommended by Microsoft. Then, you can use the ADLS Gen2 File Metadata executor. |
Deprecated Stage Library | Suggested Alternatives |
---|---|
Apache Kudu stage libraries | Use the Kudu stages available in CDH and CDP stage libraries. |
Cloudera CDH 6.3 stage libraries and earlier | Cloudera has specified an end-of-life timeline for Cloudera Enterprise products. Upgrade to Cloudera Data Platform Private Cloud, and then use the CDP stage libraries. |
Hortonworks HDP stage libraries | Cloudera has specified an end-of-life timeline for HDP. Upgrade to Cloudera Data Platform Private Cloud, and then use the CDP stage libraries. |
MapR 6.0.x stage libraries and earlier | Upgrade to MapR 6.1 or later. |
Deprecated Minor Functionality | Suggested Alternatives |
---|---|
Writing out metadata through the File Tail origin | This feature has been replaced by a more robust alternative. The record generated by the metadata output is the same as the File Tail event record. Best practice is to connect the metadata output to the Trash destination. Then, configure the origin to generate events. For more information about File Tail events, see Event Generation. |
Processing Microsoft SQL Server CDC data with the JDBC Query Consumer | This feature has been replaced by a more robust alternative. To process data from Microsoft SQL Server CDC tables, use the SQL Server CDC Client origin. To process data from Microsoft SQL Server change tracking tables, use the SQL Server Change Tracking origin. |
Using the sdcFunctions scripting object with the Groovy, JavaScript, and Jython processors | This feature has been replaced by a more robust alternative. To evaluate and modify data, use the methods in the sdc scripting object (see the sketch after this table). |
Using Tableau CRM dataflow with the Tableau CRM destination | This feature has been replaced by a more robust alternative. Use the append operation in the destination to combine data into a single dataset. |
The jks-cs Java keystore credential store function | This feature has been replaced by a more robust alternative. Use credential functions with the Java keystore options (see the expression example after this table). |
The vault:read and vault:readWithDelay Hashicorp Vault credential store functions | This feature has been replaced by a more robust alternative. Use credential functions with the Hashicorp Vault options (see the expression example after this table). |
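As a migration aid for the sdcFunctions deprecation above, here is a minimal sketch of a Jython Evaluator script written against the sdc scripting object. The status field and the per-record error handling are illustrative assumptions, not a documented pattern for your pipeline; consult the scripting processor documentation for the complete set of sdc methods.

```python
# Minimal Jython Evaluator sketch using the sdc scripting object
# in place of the deprecated sdcFunctions object.
# The 'status' field is a hypothetical example field.
for record in sdc.records:
    try:
        # Modify a field on the record.
        record.value['status'] = 'processed'
        # Pass the record downstream.
        sdc.output.write(record)
    except Exception as e:
        # Route problem records to the stage's error handling.
        sdc.error.write(record, str(e))
```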
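Similarly, for the deprecated credential store functions above, the replacement credential functions are expression language calls of the form credential:get(&lt;store ID&gt;, &lt;user group&gt;, &lt;secret name&gt;). A hedged sketch, where the "jks" and "vault" store IDs and the secret names are placeholders for values defined in your Data Collector configuration:

```
${credential:get("jks", "all", "oraclePassword")}
${credential:get("vault", "all", "secret/hello&value")}
```

Note that the Vault secret name is assumed here to use a path&key form; verify the exact store IDs and option names against the credential stores documentation.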