Deprecated Functionality
Deprecated functionality is functionality that is marked for removal in a future release. A stage or feature can be deprecated because it is not commonly used, has been replaced, or connects to a system with an end-of-service date. In the pipeline canvas, a sunset icon indicates when a stage is deprecated.
For example, the Kafka Consumer origin is deprecated because you can use the more powerful Kafka Multitopic Consumer origin. In the stage library panel, the sunset icon indicates that the Kafka Consumer origin is deprecated.
Deprecated Feature | Suggested Alternatives |
---|---|
Data Collector user interface | In a future release, you will use the Control Hub user interface to design and run Data Collector pipelines. |
Cluster pipelines and cluster-only origins | StreamSets recommends using StreamSets Transformer instead. For more information, see the Transformer documentation. In some cases, Transformer does not currently achieve full feature parity with cluster pipelines. StreamSets will work with customers to achieve feature parity on a case-by-case basis, as needed. |
SDC RPC pipelines, including SDC RPC stages | This feature is deprecated because it does not support the queueing, buffering, or failure handling required by current StreamSets standards. |
Deprecated Origin | Suggested Alternatives |
---|---|
Hadoop FS | Use the Transformer File origin for cluster workloads. |
Kafka Consumer | This stage was deprecated in favor of the Kafka Multitopic Consumer, which supports later versions of the Kafka API. |
MapR FS | Use the Transformer File origin for cluster workloads. |
NiFi HTTP Server | This origin is not widely used and is thus being deprecated. There is no specific alternative. |
Omniture | This origin is not widely used and is thus being deprecated. There is no specific alternative. |
SDC RPC | See SDC RPC pipelines. |
SQL Server 2019 BDC Multitable Consumer | SQL Server 2019 BDC has been retired by Microsoft. There is no specific alternative. |
Start Pipelines | This origin only orchestrates pipelines that are not tied to Control Hub. Register your Data Collector with Control Hub and use the Start Jobs origin instead. |
Teradata Consumer | This origin is not widely used and is thus being deprecated. There is no specific alternative. |
Deprecated Processor | Suggested Alternatives |
---|---|
Spark Evaluator | Use Transformer. |
Start Pipelines | This processor only orchestrates pipelines that are not tied to Control Hub. Register your Data Collector with Control Hub and use the Start Jobs processor instead. |
Value Replacer | Use the Field Replacer processor. |
Wait for Pipelines | This processor only orchestrates pipelines that are not tied to Control Hub. Register your Data Collector with Control Hub and use the Wait for Jobs processor instead. |
Deprecated Destination | Suggested Alternatives |
---|---|
Aerospike | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
Flume | Cloudera has removed Flume from CDP 7.0. Earlier versions that included Flume are now end-of-life and no longer supported. Thus, we are deprecating this destination and have no specific alternative. You might switch to alternative technologies such as Kafka or Pulsar, and then use the Kafka Producer or Pulsar Producer destinations. |
Google BigQuery (Legacy) | This stage was deprecated in favor of the Google BigQuery destination, which can process CDC data and handle data drift. |
GPSS Producer | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
Hive Streaming | Use the Transformer Hive destination. |
KineticaDB | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
MemSQL Fast Loader | This destination is not widely used and is thus being deprecated. There is no specific alternative. |
SDC RPC | See SDC RPC pipelines. |
SQL Server 2019 BDC Bulk Loader | SQL Server 2019 BDC has been retired by Microsoft. Try switching to Azure Synapse, as recommended by Microsoft. Then, you can use the Azure Synapse SQL destination, available in Data Collector. |
Deprecated Stage Library | Suggested Alternatives |
---|---|
Apache Kudu stage libraries | Use the Kudu stages available in CDH and CDP stage libraries. |
Cloudera CDH 6.3 stage libraries and earlier | Cloudera has specified an end-of-life timeline for Cloudera Enterprise products. Upgrade to Cloudera Data Platform Private Cloud, and then use the CDP stage libraries. |
Hortonworks HDP stage libraries | Cloudera has specified an end-of-life timeline for HDP. Upgrade to Cloudera Data Platform Private Cloud, and then use the CDP stage libraries. |
MapR 6.0.x stage libraries and earlier | Upgrade to MapR 6.1 or later. |
Additional Deprecated Functionality | Suggested Alternatives |
---|---|
Writing out metadata through the File Tail origin | This feature has been replaced by a more robust alternative. The record generated by the metadata output is the same as the File Tail event record. Best practice is to connect the metadata output to the Trash destination, and then configure the origin to generate events. For more information about File Tail events, see Event Generation. |
Processing Microsoft SQL Server CDC data with the JDBC Query Consumer | This feature has been replaced by a more robust alternative. To process data from Microsoft SQL Server CDC tables, use the SQL Server CDC Client origin. To process data from Microsoft SQL Server change tracking tables, use the SQL Server Change Tracking origin. |
Using the sdcFunctions scripting object with the Groovy, JavaScript, and Jython processors | This feature has been replaced by a more robust alternative. To evaluate and modify data, use the methods in the replacement scripting object. |
Using Tableau CRM dataflow with the Tableau CRM destination | This feature has been replaced by a more robust alternative. Use the append operation in the destination to combine data into a single dataset. |
The java.security.networkaddress.cache.ttl property in the Data Collector configuration file | If needed, you can configure the networkaddress.cache.ttl property in the $SDC_DIST/etc/sdc-java-security.properties file to cache Domain Name Service (DNS) lookups. |
The jks-cs Java keystore credential store function | This feature has been replaced by a more robust alternative. Use credential functions with the Java keystore credential store. |
The vault:read and vault:readWithDelay Hashicorp Vault credential store functions | This feature has been replaced by a more robust alternative. Use credential functions with the Hashicorp Vault options. |
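The networkaddress.cache.ttl row above can be illustrated with a short configuration sketch. The file path and property name come from that row; the TTL value of 30 seconds is an arbitrary example, not a recommendation:

```properties
# $SDC_DIST/etc/sdc-java-security.properties
#
# Number of seconds to cache successful DNS lookups.
# 0 disables caching; -1 caches entries forever.
# The value 30 is an illustrative example only.
networkaddress.cache.ttl=30
```

A short, positive TTL lets Data Collector pick up DNS changes for external systems without performing a fresh lookup on every connection.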