Creating Workflows in Astera

Overview

A workflow enables automated, repeatable execution of a sequence of tasks, such as running programs, sending emails, uploading files, running a transfer setting or batch, executing SQL code, and many others. The tasks run along a predefined path, following custom logic that controls the circumstances under which specific paths are activated.

You can add any number of tasks to a single visual workflow diagram and specify what should happen at each step depending on whether a task completes successfully or with an error.

You can also route workflows one way or another by using Boolean conditions that are suitable for your scenario. You can even call other workflows or dataflows to run within your main workflow.

A workflow can be created quickly and easily in Astera’s intuitive drag-and-drop environment. The Astera workflow editor allows you to copy or move workflow tasks, source objects, or workflow parameters, change their properties, and perform a number of other actions, with unlimited undo and redo of any previous action.

A workflow makes it easy to visualize and implement complex sequences of tasks because pieces of a workflow can be used like ‘building blocks’. These building blocks can be pasted into different workflows, or even into the same workflow, which allows you to quickly replicate tasks, or even sequences of tasks, with similar properties.

Objects can be added to a workflow in several ways, such as direct drag-and-drop of files from any Explorer window, drag-and-drop of tables or views from the built-in Data Source Browser, or by adding an object from the Toolbox.

A workflow that you create in Astera can run on local or remote servers. The workflow can also be ported to run in any number of target environments. This is achieved by using run-time Parameters which change values depending on the specific context in which the workflow will run.

Finally, to make sure that your workflow is continuously up to date with your changing environment, you can easily update data connections, such as server names and login credentials, throughout the entire workflow using a single, easy-to-use interface. This means that your workflow will stay in sync with the current requirements, even if the data connections have changed since the workflow was created.

Creating a New Workflow

To create a new workflow, go to File > New > Workflow on the main menu. Alternatively, you can expand the Create a New Dataflow dropdown icon on the main toolbar and select Workflow as shown below.

Using Workflow Designer

Adding Objects

A workflow always has at least one workflow task and may have zero, one, or more sources. Sources, workflow tasks, and resources are represented as objects on the workflow.

Depending on its type, an object can be added to the workflow in one of the following ways:

For Flat File Sources

  1. Using drag-and-drop: You can drag a file of any of the types listed below from an Explorer window and drop it onto an open workflow tab in Astera.

  • Excel

  • Delimited

  • Fixed-Length File

The advantage of using the drag-and-drop method, compared to other methods, is that many of the object’s properties are pre-populated for you based on the file’s content. For example, the field layout is automatically filled out so that there is no need to create it manually.

Note: XML files cannot be added to a workflow using the drag-and-drop method since they require a schema to be defined for the layout first.

  2. Using the Toolbox: You can add a source object by selecting it from the appropriate category in the Toolbox.

For example, to add a source comma-delimited file object, expand the Sources group in the workflow Toolbox, and drag-and-drop the Delimited File Source object onto the workflow.

Note that an object added this way initially does not have any properties defined. To define its properties, double-click on the object’s header, or right-click on its header and select Properties from the context menu.

In the Properties screen that opens, select the File Path of the file that will be associated with the object. Field layout and other properties can then be populated based on the file’s content.

  3. Copying and pasting an existing object from the same or different workflow or dataflow: If your source is already defined in the same or different workflow (or dataflow), you can copy the existing object and paste it into your workflow. The object being copied retains the properties of the original object and is assigned a unique new name to distinguish it from the original object.

For XML Sources

  1. Using the Toolbox: To add an XML source to the workflow, use the XML/JSON File Source object in the Sources section of the Toolbox.

Note that the XML File object initially will not have any properties defined. To define its properties, double-click on the object’s header, or right-click on it and select Properties from the context menu.

In the Properties window that opens, select the File Path of the XML file that will be associated with the object. Additionally, you need to provide the path to the XSD schema that defines the layout of your XML file.

  2. As with flat files, you can copy and paste an existing XML object from the same or different workflow or dataflow. The object being copied retains the properties of the original object, and is assigned a unique new name to distinguish it from the original object.

For Database Sources

  1. Using drag-and-drop: You can drag a database table or view from the Data Source Browser and drop it on an open workflow tab.

To open the Data Source Browser, go to View > Data Source Browser. Connect to the server, then expand the Database tree and the Tables (or Views) node to select your table (or view). Drag and drop the selected table or view onto the workflow.

By default, the database table is added as a Database Table Source object.

To add a data model source, press and hold the ‘Ctrl’ key while dragging and dropping a table (or view) from the Data Source Browser.

  2. As with files, you can copy and paste an existing database table object from the same or different workflow or dataflow. The object being copied retains the properties of the original object, and is assigned a unique new name to distinguish it from the original object.

For All Other Types of Objects (for example, workflow tasks or resource objects)

  1. Using the Toolbox: You can add an object by selecting it from the appropriate category in the Toolbox.

Note that an object added this way initially does not have any properties defined. To define its properties, double-click on the object’s header, or right-click on its header and select Properties from the context menu.

  2. Copying and pasting an existing object from the same or different workflow or dataflow: If your object is already defined in the same or different workflow (or dataflow), you can copy the existing object and paste it into your workflow. The object being copied retains the properties of the original object, and is assigned a unique new name to distinguish it from the original object.

Unlimited Undo/Redo

The workflow editor supports unlimited undo and redo. You can quickly undo or redo the last action, or undo/redo several actions at once.

To undo the last action, open the Edit menu and select Undo, click the Undo icon on the Workflow toolbar, or use the CTRL+Z shortcut. To undo several actions at once, select the first action you wish to undo from the Undo dropdown; the actions following the selected action will also be undone.

To redo the last action, open the Edit menu and select Redo, click the Redo icon on the Workflow toolbar, or use the CTRL+Y shortcut. To redo several actions at once, select the first action you wish to redo from the Redo dropdown; the actions following the selected action will also be redone.

Copying Objects

Using the copy and paste feature, you can replicate an object on your workflow; the copy is given a different name to distinguish it from the original object.

You can paste this object into the same workflow, or a different workflow.

You can also copy several objects at once. To do so, click the objects you wish to copy while pressing the ‘Ctrl’ key. Or, you can draw a rectangle with your mouse while holding down the LEFT mouse button. The objects inside the rectangle will be selected. Right-click on a selected object and select Copy from the context menu. Then right-click on white space in the same or different workflow, and select Paste from the context menu.

To move an object, or a set of objects, use a Cut and Paste sequence similar to the one described above.

Note: When you move objects, they keep their original names.

Note: You can use the CTRL+C shortcut to copy the selected object to the clipboard. CTRL+V will paste it from the clipboard. CTRL+X will cut it to the clipboard.

Managing Workflow Layout

Auto Layout

The Auto Layout feature arranges objects on the workflow, improving its visual representation.

To invoke the Auto Layout feature, click the Auto Layout Diagram icon on the Workflow toolbar, or open the Workflow menu and select Auto Layout.

Note: You can manually move an object around the workflow by holding the LEFT mouse button down over the object’s title and dragging it to a new location.

Zoom In/Zoom Out/Fit to Screen

The following tools are available in the Workflow menu to help you adjust the display size of the workflow:

  • Zoom In

  • Zoom Out

  • Fit To Screen

Additionally, you can select a custom zoom percentage using the Zoom % input on the Workflow toolbar.

Linking Objects

Ports

Any workflow task objects you add to the workflow show Input and Output ports. Ports make it possible to connect an object to another object via links, creating the required workflow path.

The Input port on an object allows you to connect it to an upstream object on the workflow. Conversely, the Output port allows you to connect your object to a downstream object on the workflow. The downstream object representing a task will be processed after the upstream object it is connected to has finished running (with some exceptions, such as the Run Program task, which may not require waiting for the program to exit).

Source objects and Resource objects have no ports; instead of being linked to a specific object, they provide values that can be used by any task throughout the workflow.

All source objects added to the workflow are used in Singleton mode only. A Singleton mode object returns only the first record of data (excluding the header) from the source. This makes it possible for you to get the values of the first record anywhere on the workflow, using the following notation: ObjectName.FieldName.

Similar to source objects, Resource objects, such as Parameters or Context Info, provide values used throughout the workflow. Parameters are referenced with a similar notation: ObjectName.ParameterName.
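
For instance, assuming the workflow contains a Delimited File Source named Customers and a Parameters object named Params (both names are hypothetical), a downstream task property, such as an email body or a file path, could reference their values as follows:

  Customers.Email – the Email value from the first record of the Customers source
  Params.OutputDirectory – the value of the OutputDirectory parameter defined in the Params object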

Links

To create links between two objects, drag the Output port of the upstream object and drop it on the Input port of the downstream object.

To remove a link between two objects, right-click on the link and select Delete, or left-click on the link and press the DEL key on the keyboard.

Note: By default, a link of the Normal type is created between two objects. A Normal link means that the downstream object will be processed upon successful completion of the upstream object to which it is connected. Normal links are displayed in green.

In contrast, an Error link can be created between two objects, meaning that the downstream object will be processed only in the event of a failed (error) status of the upstream object to which it is connected. Error links are displayed in orange.
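
As an illustration, the same upstream task can drive both outcomes, for example a Run Dataflow task wired to two Send Mail tasks (task names here are hypothetical):

  RunDataflow_LoadOrders --(Normal link, green)--> SendMail_SuccessNotification
  RunDataflow_LoadOrders --(Error link, orange)--> SendMail_FailureAlert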

To change the link type, right-click on the link and select Change Link Type.

An example of linked workflow tasks is shown below.

Double-lined Links

Double-lined links from a source object to a workflow task object only appear when the source is being used as a loop.

Right-clicking the double-lined link and selecting Loop Options allows you to choose the Degree of Parallelism.

The Degree of Parallelism indicates how many files are supplied concurrently. For example, if you set it to two, two files will go through the Run Dataflow task simultaneously.

Setting Object Properties

To open an object’s properties, double-click on the object’s header. Alternatively, you can right-click on the object’s header and select Properties from the context menu.

Note: On the Properties screen, you can switch to the properties of another object on the workflow. To do so, select an object from the Editing dropdown.

The following functions, available in the context menu, are common to many types of objects on the workflow:

  • Rename – Renames the object.

  • Edit File – Only applies to files. It opens the file for editing in a new tab.

  • Edit Schema – Only applies to XML files. It opens the XSD file for editing in a new tab.

  • View Table Data – Only applies to database tables. It opens the table for viewing in a new tab.

  • View Table Schema – Only applies to database tables. It opens the database schema for viewing in a new tab.

  • Delete – Deletes the object from the workflow. This will also remove any links to and from the object.

  • Cut – Removes the object from the workflow and places it into the clipboard. The object can then be ‘pasted’.

  • Copy – Copies the object into the clipboard leaving the original object as it is. A copy of the object can then be ‘pasted’.

  • Paste - Pastes the object from the clipboard.

Context Menu - Options for Workflow Tasks

To open the context menu, right-click on the workflow task’s header. This menu contains four options that are specific to workflow tasks.

These options are as follows:

Skip Task: Skips the selected task when running a Workflow.

Start Workflow at This Action: Starts the Workflow from the selected task.

Run Workflow up to This Action: Runs a Workflow only until the selected task.

Run This Action: Only runs the selected task in the Workflow.

Verifying Workflow

Verifying a workflow will list any errors or warnings present in the workflow design. Correct any such errors or warnings, and verify your workflow again to ensure there are no errors.

Running Workflow

To run your workflow, click the Start Workflow icon on the main toolbar.

To stop a workflow that is currently running, click the Stop Job icon in the Job Progress window toolbar.

Setting up Sources

Each source on the workflow is represented as a source object. You can have any number of sources in a workflow, but they can only be used in Singleton mode. A Singleton mode source only returns the first record of data (excluding the header) from the source. This makes it possible for you to get the values from the first record anywhere on the workflow, using the following notation:

ObjectName.FieldName.
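
For example, assuming a Database Table Source named JobConfig whose first record contains a SourceFolder field (both names are hypothetical), any task on the workflow, such as a File System or Run Dataflow task, could reference that value as:

  JobConfig.SourceFolder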

The following source types are supported by the workflow engine:

Flat File Sources

  • Delimited File

  • Excel File

  • Fixed Length File

Tree File Sources

  • COBOL

  • XML File

Database Sources

  • Data Model

  • Database Table

  • SQL Query

All sources can be added to the workflow by picking a source type from the Toolbox and dropping it onto the workflow. File sources can also be added by dragging and dropping a file from the Explorer window. Database sources can be dragged and dropped from the Data Source Browser.

To verify a workflow, click the Start Verification icon on the main toolbar. Verification results will be displayed in the Verify window.