Standard Mode Tools

View a list of all Standard mode tools in Designer Cloud. Tools are grouped by tool category.

Workflow In-Out Tools

Input Data Tool

Use Input Data to connect to a table to pull data into your workflow.

Output Data Tool

Use Output Data to write results of a workflow to supported file types or data sources.

Text Input Tool

Use Text Input to manually enter text to create small data files for input. This can be useful in testing and creating Lookup tables while you build your workflow.

Workflow Data Preparation Tools

Auto Column Tool

Use Auto Column to automatically change the column type and size for efficient storage of string data.

Create Samples Tool

Use Create Samples to split the input records into 2 or 3 random samples.

Data Cleansing Tool

Use Data Cleansing to fix common data quality issues. You can replace null values, remove punctuation, modify capitalization, and more.

Filter Tool

Use Filter to select data using a condition.
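
In Designer Cloud the condition is configured in the tool rather than written as code, but the underlying operation is a row-level test. A rough pandas sketch of an analogous filter (the region and sales columns are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({"region": ["East", "West", "East"], "sales": [120, 80, 200]})

# Keep rows that satisfy the condition; the inverse mask gives the rows that fail it.
high_sales = df[df["sales"] > 100]
low_sales = df[df["sales"] <= 100]
```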

Formula Tool

Use Formula to create new columns, update columns, and use 1 or more expressions to perform a variety of calculations and operations.
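
As a loose analogy, a formula expression derives or updates a column from other values in the same row. A minimal pandas sketch with hypothetical price and quantity columns:

```python
import pandas as pd

df = pd.DataFrame({"price": [10.0, 24.5], "quantity": [3, 2]})

# Create a new column from an expression over existing columns.
df["total"] = df["price"] * df["quantity"]
# Update an existing column in place.
df["quantity"] = df["quantity"] + 1
```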

Generate Rows Tool

Use Generate Rows to create new rows of data with an expression.

Imputation Tool

Use Imputation to clean up missing values in your data.

Multi-Column Binning Tool

Use Multi-Column Binning to tile or bin on multiple columns.

Multi-Column Formula Tool

Use Multi-Column Formula to create or update multiple columns using a single expression.

Multi-Row Formula Tool

Use Multi-Row Formula to create or update columns with expressions that can reference values in other rows.
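
What sets a multi-row expression apart is that it can reference values from neighboring rows. In pandas terms that is roughly a shifted column; a small sketch with a hypothetical value column:

```python
import pandas as pd

df = pd.DataFrame({"value": [10, 15, 12, 20]})

# Reference the previous row's value when computing the current row.
# The first row has no previous row, so it comes out null.
df["change_from_prev"] = df["value"] - df["value"].shift(1)
```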

Oversample Column Tool

Use Oversample Column to automatically create balanced samples from imbalanced data for use in statistical modeling.

Random % Sample Tool

Use Random % Sample to return an expected number of rows that result in a random sample of the incoming data stream.

Row ID Tool

Use Row ID to create a new column in the data and assign a unique identifier, which increments sequentially for each row in the data.

Sample Tool

Use Sample to limit the data stream to a specified number, percentage, or random set of rows. In addition, the Sample tool applies the selected configuration to the columns you want to group by.

Select Tool

Use Select to include, exclude, and reorder the columns of data that pass through your workflow.

Select Rows Tool

Use Select Rows to return rows and ranges of rows that are specified, including discontinuous ranges of rows. This tool is useful for troubleshooting and sampling.

Sort Tool

Use Sort to arrange the rows in a table in alphanumeric order based on the values of the specified data fields.

Tile Tool

Use Tile to assign a value (tile) based on ranges in the data, using 1 of 3 methods that you specify.

Unique Tool

Use Unique to distinguish whether a row is unique or a duplicate by grouping on 1 or more specified columns, then sorting on those columns.

Workflow Join Tools

Append Columns Tool

Use Append Columns to append every row from a source dataset to every row of a target dataset. This operation is also known as a cross join.
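
A cross join pairs every source row with every target row, so the output row count is the product of the two input row counts. A pandas sketch of the same idea with made-up store and quarter columns:

```python
import pandas as pd

targets = pd.DataFrame({"store": ["A", "B"]})
sources = pd.DataFrame({"quarter": ["Q1", "Q2", "Q3"]})

# Every target row paired with every source row: 2 x 3 = 6 output rows.
crossed = targets.merge(sources, how="cross")
```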

Find Replace Tool

Use Find Replace to find a string in 1 column of a dataset and look up and replace it with another string. You can also use Find Replace to append columns to a row.

Join Tool

Use Join to combine 2 inputs based on common columns between the 2 tables. You can also join 2 data streams based on row position.
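
For intuition, an inner join on a shared key column looks like this in pandas (orders, customers, and customer_id are hypothetical names):

```python
import pandas as pd

orders = pd.DataFrame({"customer_id": [1, 2, 3], "amount": [50, 75, 20]})
customers = pd.DataFrame({"customer_id": [1, 2], "name": ["Ana", "Bo"]})

# Inner join on the common column keeps only rows with a match in both inputs.
joined = orders.merge(customers, on="customer_id", how="inner")
```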

Join Multiple Tool

Use Join Multiple to combine 2 or more inputs based on a commonality between the input tables. By default, the tool outputs a full outer join.

Make Group Tool

Use Make Group to take data relationships and assemble the data into groups based on those relationships.

Union Tool

Use Union to combine 2 or more datasets on column names or positions.
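
A union stacks datasets vertically, aligning them by column name or position. The pandas analogue, with hypothetical monthly datasets:

```python
import pandas as pd

jan = pd.DataFrame({"region": ["East"], "sales": [120]})
feb = pd.DataFrame({"region": ["West"], "sales": [95]})

# Stack the datasets by matching column names; unmatched columns become nulls.
combined = pd.concat([jan, feb], ignore_index=True)
```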

Workflow Parse Tools

DateTime Tool

Use DateTime to transform date-time data to and from a variety of formats, including both expression-friendly and human-readable formats.
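
The same kind of round trip, from a formatted string to a date-time value and back to a different display format, sketched in pandas with a hypothetical order_date column:

```python
import pandas as pd

df = pd.DataFrame({"order_date": ["03/14/2024", "11/02/2024"]})

# Parse a formatted string into a date-time, then render it in another format.
df["parsed"] = pd.to_datetime(df["order_date"], format="%m/%d/%Y")
df["iso_date"] = df["parsed"].dt.strftime("%Y-%m-%d")
```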

RegEx Tool

Use RegEx to use regular-expression syntax to parse, match, or replace data.
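
The parse and replace modes correspond roughly to capture-group extraction and regex substitution. A pandas sketch with a made-up raw column:

```python
import pandas as pd

df = pd.DataFrame({"raw": ["ID-0042 New York", "ID-0107 Boston"]})

# Parse: each capture group becomes a new column.
df[["id", "city"]] = df["raw"].str.extract(r"ID-(\d+)\s+(.+)")
# Replace: strip the ID prefix with a regex substitution.
df["city_only"] = df["raw"].str.replace(r"ID-\d+\s+", "", regex=True)
```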

Text to Columns Tool

Use Text To Columns to take the text in 1 column and split the string value into multiple separate columns or rows, based on 1 or more delimiters.
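
Splitting to columns versus splitting to rows is the main configuration choice; both are sketched below in pandas with a hypothetical comma-delimited path column:

```python
import pandas as pd

df = pd.DataFrame({"path": ["us,east,ny", "us,west,ca"]})

# Split to columns: one new column per delimited piece.
df[["country", "region", "state"]] = df["path"].str.split(",", expand=True)
# Split to rows: one output row per delimited piece.
rows = df.assign(part=df["path"].str.split(",")).explode("part")
```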

XML Parse Tool

Use XML Parse to parse XML into columns.

Workflow Transform Tools

Arrange Tool

Use Arrange to manually transpose and rearrange your columns for presentation purposes. Data is transformed so that each row is turned into multiple rows, and you can create new columns using column description data.

Count Rows Tool

Use Count Rows to return a count of the number of rows passing through the tool. Use this tool when you want to report on the resulting row count of a process. It even returns a count of zero, which the Summarize tool does not.

Cross Tab Tool

Use Cross Tab to pivot the orientation of data in a table by moving vertical columns onto a horizontal axis and summarizing data where specified.
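
Conceptually this is a pivot: one column supplies the new headers and another column is aggregated into the cells. A pandas sketch with hypothetical region, quarter, and sales columns:

```python
import pandas as pd

df = pd.DataFrame({
    "region": ["East", "East", "West"],
    "quarter": ["Q1", "Q2", "Q1"],
    "sales": [100, 150, 90],
})

# Quarter values become column headers; sales values are summed into the cells.
wide = df.pivot_table(index="region", columns="quarter", values="sales", aggfunc="sum")
```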

Running Total Tool

Use Running Total to calculate a cumulative sum per row in a dataset.
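
A running total is a cumulative sum down the rows; the sketch below, in pandas with hypothetical columns, also restarts the total per group:

```python
import pandas as pd

df = pd.DataFrame({"region": ["East", "East", "West"], "sales": [100, 150, 90]})

# Cumulative sum per group: each region keeps its own running total.
df["running_total"] = df.groupby("region")["sales"].cumsum()
```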

Summarize Tool

Use Summarize to perform various actions (functions and calculations) on your data.

Transpose Tool

Use Transpose to pivot the orientation of a data table.

Weighted Average Tool

Use Weighted Average to calculate the weighted average of an incoming data column.
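
The calculation is the sum of each value times its weight, divided by the sum of the weights. A small pandas sketch with hypothetical score and credits columns:

```python
import pandas as pd

df = pd.DataFrame({"score": [80, 90, 70], "credits": [3, 4, 2]})

# Weighted average: sum(value * weight) / sum(weight).
weighted_avg = (df["score"] * df["credits"]).sum() / df["credits"].sum()
print(weighted_avg)  # 82.22...
```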

Workflow Developer Tools

Dynamic Rename Tool

Use Dynamic Rename to rename columns in upstream data. Use this tool to rename a pattern in the column headers, like removing a prefix or replacing underscores with spaces.
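
As an analogy, a pattern-based rename applies one rule to every header. A pandas sketch that drops a hypothetical tmp_ prefix and swaps underscores for spaces:

```python
import pandas as pd

df = pd.DataFrame({"tmp_sales_total": [100], "tmp_order_count": [3]})

# Apply one renaming rule across all column headers.
df = df.rename(columns=lambda c: c.removeprefix("tmp_").replace("_", " "))
```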

Dynamic Replace Tool

Use Dynamic Replace to quickly replace data column values, based on a condition.

Dynamic Select Tool

Use Dynamic Select to select columns either by field type or via a formula.

JSON Build Tool

Use JSON Build to build formatted JavaScript Object Notation (JSON) text from the table schema output of the JSON Parse tool.

JSON Parse Tool

Use JSON Parse to separate JSON text into a table schema for the purpose of downstream processing.
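
To make the JSON Build and JSON Parse pairing concrete, the sketch below parses JSON text into a flat table and then builds JSON text back from it, using pandas and the standard json module (the field names are made up):

```python
import json
import pandas as pd

raw = '{"user": {"id": 7, "name": "Ana"}, "active": true}'

# Parse: flatten JSON text into a tabular schema for downstream steps.
table = pd.json_normalize(json.loads(raw))

# Build: turn the table back into formatted JSON text.
rebuilt = table.to_json(orient="records", indent=2)
```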