Design in Control Hub

You can design pipelines and pipeline fragments in Control Hub using the Control Hub Pipeline Designer. You can use Pipeline Designer to develop pipelines and fragments for Data Collector or Transformer.

Pipeline Designer enables you to configure pipelines, preview data, and publish pipelines. You can also design and publish pipeline fragments.

You can create new pipelines or edit previously published pipelines. When you create a pipeline in Pipeline Designer, you can start with a blank canvas or with a sample pipeline. Pipeline Designer provides several system sample pipelines. You can also create user-defined sample pipelines. Use the Pipelines view to create a new pipeline or to access existing pipelines in the pipeline repository.

You can also create or edit previously published fragments. When you create a fragment in Pipeline Designer, you start with a blank canvas. Use the Pipeline Fragments view to create a new fragment or to access existing fragments.

When you configure a pipeline or pipeline fragment in Pipeline Designer, you specify the authoring engine to use - Data Collector or Transformer. Pipeline Designer displays stages, stage libraries, and functionality based on the selected authoring engine.

For more information about using Pipeline Designer, see Pipeline Designer UI and Pipeline Designer Tips.

Authoring Engine

When you create or edit a pipeline or pipeline fragment, you select the authoring engine to use - Data Collector or Transformer. You can select an accessible authoring engine that is registered with your Control Hub organization and that meets all of the requirements.

Choose an authoring engine that is the same version as the engines that you intend to use to run the pipeline. Using a different version can result in pipelines that are invalid when they are run.

For example, if the authoring Data Collector is a more recent version than the execution Data Collector, pipelines might include a stage, stage library, or stage functionality that does not exist in the execution Data Collector.
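If you manage Control Hub from scripts, you can check registered engine versions with the StreamSets SDK for Python before choosing an authoring engine. The following is a minimal sketch; the URL and credentials are placeholders, and it assumes a 3.x-style SDK where ControlHub accepts a server URL with a username and password, and where registered Data Collectors expose url and version attributes:

```python
from streamsets.sdk import ControlHub

# Connect to Control Hub (placeholder URL and credentials).
sch = ControlHub(server_url='https://cloud.streamsets.com',
                 username='user@example.org', password='password')

# List the registered Data Collectors so you can choose an authoring
# engine whose version matches your intended execution engines.
for sdc in sch.data_collectors:
    print(sdc.url, sdc.version)
```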

An authoring engine must meet all of the following requirements:
  • The Data Collector or Transformer meets the minimum supported version. StreamSets recommends using the latest version of Data Collector or Transformer.

    The minimum supported Data Collector version is 3.0.0.0. To design pipeline fragments, the minimum supported Data Collector version is 3.2.0.0. To create and use connections, the minimum supported Data Collector version is 3.19.0.

  • The Data Collector or Transformer uses the HTTPS protocol because Control Hub also uses the HTTPS protocol.
    Note: StreamSets recommends using a certificate signed by a certifying authority for a Data Collector or Transformer that uses the HTTPS protocol. If you use a self-signed certificate, you must first use a browser to access the Data Collector or Transformer URL and accept the web browser warning message about the self-signed certificate before users can select the component as the authoring engine.
  • The Data Collector or Transformer URL is reachable from the Control Hub web browser.

For more information about how a registered Data Collector works as an authoring Data Collector, see Registered Data Collector.

Note: Control Hub includes a system Data Collector for exploration and light development. Administrators can enable or disable the system Data Collector for use as the default authoring Data Collector in Control Hub. For more information about how the system Data Collector works as an authoring Data Collector, see System Data Collector.

For example, the following Select an Authoring Data Collector window displays the system Data Collector and two registered Data Collectors as choices for the authoring Data Collector. Notice that the second registered Data Collector listed in the image is not accessible and cannot be selected because it uses the HTTP protocol:

When you edit a pipeline or fragment in the pipeline canvas, you can change the authoring engine as long as you select the same engine type. For example, when editing a Data Collector pipeline, you can select another authoring Data Collector. You cannot select an authoring Transformer.

Click the Authoring icon () in the top right corner of the canvas to view which authoring engine is being used and to optionally change the selection.

For example, the following image shows the currently selected authoring Data Collector:

Data Collector Stage Libraries

The selected authoring Data Collector determines the stage libraries that are installed and available for use as you design pipelines and pipeline fragments.

The stage library panel in the pipeline canvas displays all stages. Stages that are not installed on the selected authoring engine appear disabled, or greyed out. For example, the stage library panel shown below indicates that the Elasticsearch and Google BigQuery origins are not installed:

When the selected authoring Data Collector is a tarball installation, you can install additional stage libraries, including enterprise stage libraries, from the pipeline canvas. To install an additional stage library, click a disabled stage, confirm that you want to install the library, and then restart the engine for the changes to take effect.

Important: Installing an additional stage library from the pipeline canvas installs the library only on the selected authoring engine. You must install the additional library on any other authoring engine used to design the pipeline and on all execution engines where you run the pipeline.

When the selected authoring Data Collector is a core RPM installation, you must install additional RPM stage libraries using the Data Collector command line program, as described in Install Additional Stage Libraries in the Data Collector documentation.

When the selected authoring Data Collector is an RPM or Cloudera Manager installation, you must install enterprise stage libraries as custom stage libraries as described in Enterprise Stage Libraries in the Data Collector documentation.

External Libraries

The selected authoring Data Collector or Transformer determines the external libraries available to stages as you design pipelines and pipeline fragments. For example, some stages, such as most JDBC stages, require installing a JDBC driver as an external library on Data Collector or Transformer.

As you design pipelines, each stage requiring an external library displays the currently installed libraries in the External Libraries tab in the stage properties panel.

For example, the following image shows that a MySQL JDBC driver is installed for the JDBC stage library on the selected authoring Data Collector. As a result, this external library is available to the JDBC Query Consumer origin during pipeline design:

When needed, you can install an additional external library for a stage from the pipeline canvas by clicking Upload External Library from the External Libraries tab. Control Hub installs the external library to one of the following locations:
  • $SDC_DIST/streamsets-libs-extras directory on the selected authoring Data Collector
  • $TRANSFORMER_DIST/streamsets-libs-extras directory on the selected authoring Transformer
Important: Installing an external library from the pipeline canvas installs the library only on the selected authoring engine. You must install the external library on any other authoring engine that you use to design the pipeline and on all execution engines where you run the pipeline.

For more information about installing external libraries on Data Collector, see Install External Libraries in the Data Collector documentation.

For more information about installing external libraries on Transformer, see External Libraries in the Transformer documentation.

Pipeline Designer UI

The following image shows the Pipeline Designer UI when you configure a pipeline:

The numbered areas and the toolbar icons are described below:

1. Pipeline Canvas - Displays the pipeline. Use to configure the pipeline data flow.
2. Pipeline Creation Help Bar - Offers lists of stages to help complete the pipeline. You can use the help bar to connect a stage to an open node. You can also add a stage between linked stages by clicking the link.
3. Properties panel - Displays the properties of the pipeline or selected stage when you configure a pipeline.
4. Selected stage pop-up menu - Displays the icons for commands that you can apply to the selected stages.
5. Stage library panel - Lists the available stages. Use to add stages to the pipeline. You can drag a stage to a location on the canvas or click a stage to add it to the end of the pipeline. You can view all stages, stages by type, or stages by library. You can also search for a stage by name. Stages that are not installed appear disabled, or greyed out. Click a disabled stage to install the stage library that includes the stage.

Pipeline name display - Displays the name of the pipeline in the canvas.
Pipeline version display and selection - Displays the version of the pipeline in the canvas. To select a different version, click the icon and select the version to view.
Check In icon - Publishes the pipeline or fragment in the canvas. Displays for pipelines only when the pipeline passes implicit validation. When the pipeline has already been published, the Edit icon displays in the same location. Publish a pipeline to enable creating a job for the pipeline. Publish a fragment to enable using the fragment in a pipeline.
Edit icon - Enables editing the pipeline or fragment. Displays when the pipeline or fragment has already been published and is being viewed in read-only mode. When the pipeline or fragment is in edit mode, the Check In icon displays in the same location.
Compare with Previous Version icon - Compares the pipeline or fragment in the canvas with a previous version.
History icon - Displays the history of the pipeline or fragment in the canvas.
Undo icon - Reverts recent changes. On Mac, you can also use Command+Z. On Windows, you can use Ctrl+Z.
Redo icon - Restores changes that were reverted. On Mac, you can also use Command+Shift+Z. On Windows, you can use Ctrl+Y.
Delete Draft or Delete Pipeline icon - Deletes the draft version or published version of a pipeline.
Validation Errors icon - Displays the number of validation errors found by implicit validation. Click to view the error messages.
More icon - Provides additional actions. Use to delete or export a pipeline, or to update the stage libraries used in the pipeline. Use to import or export a pipeline fragment.
Auto Arrange icon - Automatically arranges the stages on the canvas.
Authoring icon - Displays the authoring engine associated with the pipeline. You can click the icon and select a different Data Collector or Transformer to use.
Validate icon - Validates the pipeline by performing explicit validation. Displays when the pipeline is in edit mode and passes implicit validation.
Preview icon - Starts data preview. Available for valid pipelines and fragments. Not available when using the system Data Collector for authoring.
Share icon - Shares the pipeline or fragment with users and groups. Use to configure permissions for the pipeline or fragment.
Create Job icon - Creates a job based on the pipeline. Available only for published pipelines.
Stage Library icon - Toggles the display of the stage library panel.
Duplicate Stage icon - Duplicates the selected stage.
Create Pipeline Fragment icon - Creates a pipeline fragment from the selected stages.
Delete icon - Deletes the selected item in the canvas.
Expand Fragments icon - Expands all or the selected fragment stages in the pipeline, displaying all the stages in the fragments.
Collapse Fragments icon - Collapses expanded pipeline fragments, displaying a single fragment stage for each fragment.
Stream link icon - Indicates the flow of data through the pipeline or fragment. Select to configure data rules and alerts. Darker icons indicate that a data rule is configured for the link.
Error icon - Indicates that one or more required properties are not defined. Can display on a stage for stage properties or in the canvas for pipeline properties. Related error messages display when you hover over the icon. You can also view the messages in the Validation Errors list. The icon can also display on tabs in the properties panel to indicate the location of missing properties.

Note: Some icons and options might not display. The items that display are based on the task that you are performing and the roles assigned to your user account.
For example, the Create Job icon displays only for published pipelines when you log in with the Job Operator role. Or, if you log in with only the Pipeline User role, configuration-related icons are not available.

Pipeline Designer Tips

The Control Hub Pipeline Designer is closely based on the Data Collector and Transformer pipeline configuration canvas. Some functional differences are described below:
Authoring engine
When configuring a Data Collector or Transformer pipeline or pipeline fragment, you must select the authoring engine to use. You can select any accessible Data Collector, either the system Data Collector if available or a registered Data Collector. However, to perform explicit validation or data preview, you must select a registered Data Collector as the authoring Data Collector.
Always select an authoring engine that is the same version as the execution engine that you intend to use to run the pipeline in production. Using a different engine version can result in developing a pipeline that is invalid for the production engine.
For example, if you use a more recent version of Data Collector for development than for production, you might include a stage, stage library, or stage functionality that does not exist in the production Data Collector.
The registered Data Collector must meet certain requirements to be used as the authoring Data Collector. For more information, see Authoring Data Collectors.
Create a pipeline or fragment
When you create a pipeline or pipeline fragment, you specify whether the pipeline will run on Data Collector or Transformer. For pipelines, you can start with a blank canvas or with a sample pipeline.
Control Hub provides several system sample pipelines. You can review them to familiarize yourself with pipeline designs, or use them as a basis for pipeline development.
Edit a published pipeline or fragment
When viewing a published pipeline or pipeline fragment, Control Hub displays the pipeline or fragment in read-only mode. The mode appears above the canvas as shown:
To edit a published pipeline or fragment, click the Edit icon: . The pipeline or fragment then enters edit mode.
Select multiple stages
When editing a pipeline or pipeline fragment, you can select multiple stages in the pipeline canvas and then move, delete, or copy the selected stages. To select multiple stages in the canvas, press the Shift key and then click each stage.
Copy and paste multiple stages and fragments
After selecting multiple stages or fragments in the pipeline canvas, you can copy them to the clipboard by clicking Copy to Clipboard in the properties panel below the canvas. On Mac, you can also use Command+C; on Windows, Ctrl+C.
Then, within the same pipeline or in another pipeline, paste the copied stages and fragments to the canvas with Command+V on Mac or Ctrl+V on Windows.
Update stage libraries for the pipeline
When editing a pipeline or pipeline fragment, you can use the Update Stage Libraries dialog box to update the stage libraries for multiple stages in the pipeline or fragment.
This allows you to update all necessary stage libraries at one time when you change the authoring engine for the pipeline or fragment. If you prefer, you can also change stage libraries individually by editing each stage.
The stage libraries that display for each stage depend on the authoring engine selected for the pipeline or fragment. For example, if the authoring Data Collector has the MapR 6.0 and 6.1 stage libraries installed, then these are the stage libraries that display for a MapR FS destination or MapR FS File Metadata executor.
To update multiple stage libraries at one time, click the More icon (), then select Update Stage Libraries.
The Update Stage Libraries dialog box displays the stage name and type for each stage in the pipeline or fragment. On the right is the corresponding list of stage libraries for the stage that are available on the authoring engine.
Update the stage libraries as needed, then click Update to save your changes.
Work with versions
When you have multiple versions of a pipeline or pipeline fragment, Control Hub indicates the version of the pipeline or fragment that you are viewing. You can click the pipeline or fragment version to select a different version to view, as follows:
For more information about working with versions, see Version History.
Run a test of a draft pipeline
When editing a pipeline, you can perform a test run of the draft pipeline to quickly test the pipeline logic. A test run requires a fully configured draft pipeline.
Publish a pipeline or fragment
When you have completed work on a pipeline or fragment, you publish or check in the pipeline or fragment. Publish a pipeline to create and run jobs based on the pipeline. Publish a fragment to make it available for testing or use in pipelines.
Use the Check In icon to publish a valid pipeline or fragment: . Enter a commit message stating what changed in this version so that you can track the commit history of the pipeline or fragment.
After you publish a pipeline, it enters read-only mode and can be used to create a job. After you publish a fragment, it enters read-only mode and can be included in pipelines.
Create a job
After you publish a pipeline, you can create a job.
You can create a job using the Create Job icon () in the toolbar above the pipeline canvas. Or, you can create a job from the Jobs view.
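If you automate this step, the StreamSets SDK for Python offers an equivalent. A minimal sketch, assuming an existing ControlHub connection named sch (as in the earlier sketch) and a published pipeline named 'My First Pipeline', both placeholders:

```python
# Assumes `sch` is an existing streamsets.sdk.ControlHub connection.
# Look up the published pipeline by name (placeholder name).
pipeline = sch.pipelines.get(name='My First Pipeline')

# Build a job from the published pipeline, then register and start it.
job_builder = sch.get_job_builder()
job = job_builder.build('Job for My First Pipeline', pipeline=pipeline)
sch.add_job(job)
sch.start_job(job)
```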
Data preview requirement
You can preview data when the pipeline uses a registered Data Collector or Transformer as the authoring engine.
If the pipeline uses the system Data Collector or a selected registered engine that is not accessible, the Preview Pipeline icon () is disabled.
Validation requirement
You can perform explicit validation when the pipeline uses a registered Data Collector or Transformer as the authoring engine.
Use the Validate icon to perform explicit validation: .

Creating a Pipeline

Create a pipeline to define how data flows from origin to destination systems and how the data is processed along the way.

You can create pipelines using a blank canvas or using a sample pipeline. Control Hub provides system sample pipelines. You can also create user-defined sample pipelines. You can review sample pipelines to learn how you might develop a similar pipeline, or you can use the samples as a starting point for pipeline development.

When you create a pipeline, you specify the type to create - Data Collector or Transformer - whether to start from a blank canvas or from a sample pipeline, and the authoring engine to use. You can change the authoring engine during development.

  1. In the Navigation panel, click Pipeline Repository > Pipelines.
  2. Click the Add icon.
  3. Enter the name and optional description.
  4. Select the type of pipeline to create: Data Collector or Transformer.
  5. Select how you want to create the pipeline, and then click Next.
    • Blank Pipeline - Use a blank canvas for pipeline development.
    • Sample Pipeline - Use an existing sample pipeline as the basis for pipeline development.
  6. If you selected Sample Pipeline, in the Select a Sample Pipeline dialog box, filter by the sample type, select the sample to use, and then click Next.
  7. Select the authoring Data Collector or Transformer to use, then click Create.
    Control Hub opens a blank canvas or the selected sample pipeline.
If needed, you can change the authoring engine for the pipeline using the Authoring icon: .
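If you script pipeline creation instead of using the canvas, the StreamSets SDK for Python follows the same pattern: select an authoring engine, build the flow, and publish. A minimal sketch, assuming a 3.x-style SDK; the credentials, URLs, and pipeline name are placeholders, and Dev Raw Data Source and Trash are development stages used here only to form a complete flow:

```python
from streamsets.sdk import ControlHub

# Connect to Control Hub (placeholder URL and credentials).
sch = ControlHub(server_url='https://cloud.streamsets.com',
                 username='user@example.org', password='password')

# Select a registered Data Collector as the authoring engine
# (placeholder URL).
sdc = sch.data_collectors.get(url='https://sdc.example.com:18630')

# Start from a blank canvas: a simple origin-to-destination flow.
builder = sch.get_pipeline_builder(data_collector=sdc)
origin = builder.add_stage('Dev Raw Data Source')
destination = builder.add_stage('Trash')
origin >> destination

# Build the pipeline, then check it in with a commit message.
pipeline = builder.build('My First Pipeline')
sch.publish_pipeline(pipeline, commit_message='Initial version')
```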

Creating a Fragment

When you create a pipeline fragment, you specify the type to create - Data Collector or Transformer, and select the authoring engine to use.
  1. In the Navigation panel, click Pipeline Repository > Pipeline Fragments.
  2. Click the Create New Pipeline Fragment icon: .
  3. In the New Pipeline Fragment dialog box, specify a name, optional description, and the execution engine to use.
  4. Click Next.
  5. Select an available authoring Data Collector or Transformer, and then click Create.

    If you use the system Data Collector, you cannot use data preview to help develop the fragment.

    A blank canvas displays.
If needed, you can change the authoring engine for the fragment using the Authoring icon: .
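For scripted fragment creation, the StreamSets SDK for Python extends the same pipeline builder. The following sketch assumes the sch connection and sdc engine from the earlier sketches, and that your SDK version supports the fragment=True parameter on get_pipeline_builder, which is an assumption to verify against your SDK documentation:

```python
# Assumes `sch` and `sdc` from the earlier sketches, and SDK support
# for fragment building via fragment=True (verify for your version).
builder = sch.get_pipeline_builder(data_collector=sdc, fragment=True)

# A processor-only fragment; the open input and output ends are
# connected when the fragment is used in a pipeline.
builder.add_stage('Expression Evaluator')

fragment = builder.build('My Fragment')
sch.publish_pipeline(fragment, commit_message='Initial fragment version')
```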

Running a Test of a Draft Pipeline

As you design a pipeline, you can perform a test run of the draft pipeline in the pipeline canvas to quickly test the pipeline logic.

You can perform a test run of a draft version of a fully configured pipeline. The Test Run menu becomes active when a draft pipeline is complete.

You cannot perform a test run of a published pipeline version. To run a published pipeline version, you must first create a job for the published pipeline version and then start the job.

Note: As you monitor a test run of a Data Collector pipeline, you can capture and review a snapshot of the data being processed.
  1. While viewing a draft version of a completed pipeline in the pipeline canvas, click Test Run in the top right corner of the toolbar, and then select one of the following options:
    • Start Pipeline - Start a test run of the pipeline.
    • Reset Origin and Start - Reset the origin and then start a test run of the pipeline.
    • Start with Parameters - Specify the parameter values to use and then start a test run of the pipeline.

    When the test run starts, statistics for the test run display in the Monitor panel.

  2. Monitor the test run of the pipeline, including viewing real-time statistics and error information.

    To access the history of previous test runs for the pipeline, click the Test Run History tab. The test run history includes the start and end time of previous test runs and also the input, output, and error record count for each test run.

    Click View Summary for a specific run to view a summary of the metrics for that test run.

  3. When you've finished monitoring the test run, click the Stop Test Run icon: .
    You can continue designing the pipeline and performing additional test runs until you decide that the pipeline is complete, and then publish the pipeline.

Publishing a Fragment or Pipeline

You can publish fragments and pipelines that are designed in the pipeline canvas.

Publish a fragment to use the fragment in a pipeline. Pipelines can only use published fragments.

Publish a pipeline to create a job that runs the pipeline, to use the pipeline as a sample pipeline, or to retain the published pipeline version for historical reference. You can only use published pipelines in jobs or as sample pipelines.

You can only publish valid pipelines. Control Hub performs explicit validation before publishing a pipeline.

  1. While viewing a fragment or pipeline in edit mode, click the Check In icon: .

    The Check In window appears.

  2. Enter a commit message.

    As a best practice, state what changed in this version so that you can track the commit history of the fragment or pipeline.

  3. Click one of the following buttons:
    • Cancel - Cancels publishing the fragment or pipeline.
    • Publish and Update - Publishes the fragment or pipeline, and then displays the pipelines or jobs that currently use the published fragment or pipeline.

      Select the pipelines or jobs that you want to update to use the latest published version and click Update. Review the pipelines or jobs to be updated, and then click Close. When publishing a pipeline, you can also choose to create a new job for this pipeline version.

      Or, choose to skip the update and close the window. When publishing a pipeline, you can also choose to skip the update and create a new job for this pipeline version.
    • Publish and Close - Publishes the fragment or pipeline.
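If you publish from a script rather than from the canvas, the StreamSets SDK for Python performs the same check-in with a commit message. A minimal sketch, assuming an existing ControlHub connection named sch and a draft pipeline named 'My First Pipeline' (both placeholders):

```python
# Assumes `sch` is an existing streamsets.sdk.ControlHub connection.
pipeline = sch.pipelines.get(name='My First Pipeline')

# Publishing checks in a new version; the commit message records what
# changed, supporting the commit history of the pipeline.
sch.publish_pipeline(pipeline,
                     commit_message='Updated origin configuration')
```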