Astera Data Stack
Version 8

© Copyright 2025, Astera Software


Sources as Transformations


Last updated 9 months ago

Astera Data Stack provides an array of source options to read and extract data from. Different source objects can be found in Toolbox > Sources.

Read: For a detailed overview of different source objects in Astera, see Setting Up Sources and Supported Data Types and File Formats.

Transformations in Astera are used to perform a variety of operations on data as it moves through the dataflow pipeline. The Astera Data Stack provides an extensive library of built-in transformations enabling you to cleanse, convert, and transform data as per your business needs. Transformations can be found in Toolbox > Transformations.

For a detailed review of transformations, see Introducing Transformations.

In this article, we will discuss:

  1. How various sources in Astera can be used as transformations.

  2. Some common scenarios where you could use a source as a transformation.

While the basic function of source objects in Astera is to extract data and bring it to the designer for further integration, a source object can also be used as a transformation function.

Changing a Source object to a Transformation object

  1. Select the relevant source object from Toolbox > Sources and drag and drop it onto the designer.

  2. Right-click on the source header and select Transformation from the context menu.

As soon as the Transformation option is selected from the context menu, the header color of the source object will change from green to purple. This is because, by default, Source objects in Astera are indicated by a green header, while Transformation objects are indicated by a purple header.

Listed below are the source objects that can be used as a transformation:

  • Delimited File Source

  • Excel Workbook Source

  • XML/JSON File Source

  • Report Source

Note: Some sources in Astera cannot be used as transformations. These sources are ADO.Net Metadata Collections, COBOL Source, SQL Query Source, Multi-table Query Source, and FTP List Directory Contents.

Generally, source objects are used as transformations when the source file path is dynamic.

Using Delimited File Source as a Transformation

A Delimited File Source object can be used as a transformation when it takes a dynamic file path, allowing multiple files with the same layout to be processed in a single dataflow or workflow.

  1. Drag and drop the Delimited File Source object onto the designer.

  2. Go to the object’s properties and provide the File Path for the delimited source file.

  3. Once you have provided the File Path and configured the properties of the source object, click OK. Then, right-click on the header and select Transformation from the context menu to change it to a Transformation object.

The header of the Delimited Source object will change to purple, indicating that the source object has been converted into a Transformation object.

The transformed DelimitedSourceTrans object will now have two nodes:

  • Input node: To map the file path of the folder that contains delimited files that are to be processed.

  • Output node: On expanding the Output node, you will see the data fields in the delimited source file. Map these fields to other objects in a dataflow through the output mapping ports.

  4. Use the File System Item Source to pass the file path information in the input node of the Delimited source-transformation object. Drag and drop it from the Toolbox > Sources section.

  5. In the File System Source Properties, point the path to the directory and folder where the delimited files are located.

  6. Map the FullPath field from FileSystem to the DelimitedSource object’s Input node (FilePath).

Now our Delimited Source Transformation object is ready. To preview the data, right-click on the DelimitedSourceTrans object and select Preview Output.

Once you select Preview Output, you will be able to view the data in the Data Preview pane. The data first appears in its condensed preview format. Click on the expand icon right next to the root node of the DelimitedSourceTrans object to expand the node and preview your data.

You now have an expanded version of your data:

  • Root Node: Object Path – DelimitedSourceTrans

  • Sub Node:

    • Input: Displays the path of the file that is being used as the input for this data.

    • Output: Displays the fields in the source data.

This is how you use a Delimited File Source as a transformation.
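Outside of Astera, the same pattern — a file-system listing feeding paths into one parameterized reader — can be sketched in a few lines. The following Python sketch is illustrative only and not Astera functionality; the `*.csv` pattern and the header-driven layout are assumptions:

```python
import csv
import glob
import os

def read_delimited(file_path):
    """Read one delimited file into a list of row dictionaries (layout comes from the header row)."""
    with open(file_path, newline="") as f:
        return list(csv.DictReader(f))

def process_csv_folder(folder):
    """Feed every matching file path into the same reader, mirroring the
    FileSystem source's FullPath field mapped to the FilePath input node."""
    records = []
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        # Every file is assumed to share the same layout.
        records.extend(read_delimited(path))
    return records
```

Each file contributes rows to a single combined output, which is what the source-turned-transformation achieves when it is driven by a File System Item Source.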

Using Excel File Source as a Transformation

The Excel Workbook Source can be used as a transformation when you have multiple Excel files with the same layout and want to process them together in a dataflow or workflow.

  1. Drag and drop the Excel Workbook Source object onto the designer.

  2. Go to the object’s properties and provide the File Path for the Excel source file.

  3. Once you have provided the file path and configured the properties of the object, click OK. Then, right-click on the header and select Transformation from the context menu to change it into a Transformation object.

The header of the ExcelSource object will change to purple, indicating that the ExcelSource object has been converted into a Transformation object.

The transformed ExcelSource object will now have two nodes:

  • Input node:

    • FilePath: To map the path of the folder that contains the Excel files that are to be processed.

    • Worksheet: To specify the worksheet to be used. This option is useful when an Excel source file has more than one worksheet and you want to use a particular one in the dataflow or workflow. The worksheet name can be supplied through a Constant Value object, which you can find in Toolbox > Transformations > Constant Value.

  • Output node: On expanding this node, you will be able to see the data fields in the Excel source file. Map these fields to other objects in the dataflow through the output mapping ports.

  4. Use the File System Item Source to pass the file path information in the input node of the Excel source-transformation object. Drag and drop it from the Toolbox > Sources section.

  5. In the File System Source Properties, provide the path of the directory and folder where the Excel files are located.

  6. Map the FullPath field from FileSystem to the ExcelSource object’s Input node (FilePath).

  7. Map the Value field from the ConstantValue object to the ExcelSource object’s Input node (Worksheet).

Now our Excel Source Transformation object is ready. To preview the data, right-click on the ExcelSourceTrans object and select Preview Output.

On selecting Preview Output, you will be able to view the data in the Data Preview pane. The data first appears in its condensed preview format. Click on the expand icon right next to the root node ExcelSourceTrans to expand the node and preview your data.

You will see the following nodes:

  • Root Node: Object Path – ExcelSourceTrans

  • Sub Node:

    • Input: Gives the file path of the file that is being used as the input for this data.

    • Output: Displays the fields in the source data.

This is how you use an Excel Workbook Source as a transformation.
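As an aside on where a Worksheet value can come from: an .xlsx workbook is a zip archive whose xl/workbook.xml part lists its sheets, so the available names can be enumerated with the Python standard library alone. This sketch is illustrative and independent of Astera:

```python
import xml.etree.ElementTree as ET
import zipfile

# SpreadsheetML namespace used by xl/workbook.xml in .xlsx files.
SHEET_NS = {"main": "http://schemas.openxmlformats.org/spreadsheetml/2006/main"}

def list_worksheets(xlsx_path):
    """Return the worksheet names declared in an .xlsx workbook."""
    with zipfile.ZipFile(xlsx_path) as workbook:
        # xl/workbook.xml holds one <sheet> element per worksheet.
        root = ET.fromstring(workbook.read("xl/workbook.xml"))
    return [sheet.attrib["name"] for sheet in root.findall("main:sheets/main:sheet", SHEET_NS)]
```

A name returned here is the kind of value you would supply through the Constant Value object mapped to the Worksheet input node.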

XML/JSON File Source as a Transformation

The XML/JSON File Source object can be used as a transformation when you have multiple XML or JSON files with the same layout and want to process them in a dataflow or a workflow.

  1. Drag and drop the XML/JSON File Source object onto the designer.

  2. Go to the object’s properties and provide the File Path for the XML source file and its schema.

  3. Once you’ve provided both paths and configured the object, click OK. Then, right-click on the header and select Transformation from the context menu to change it into a transformation object.

The header of the XmlJsonSource object will change to purple, indicating the conversion from a source object to a transformation object.

The transformed XmlJsonSource object will now have two nodes:

  • Input node: To map the file path of the folder that contains XmlJson files that are to be processed.

  • Output node: Once expanded, you will be able to see the data fields that are in the XmlJson source file. You can map these fields to other objects in a dataflow through the output mapping ports.

  4. Use the File System Item Source to pass the file path information in the input node of the XmlJson source-transformation object. Drag and drop it from the Toolbox > Sources section.

  5. In the File System Source Properties, provide the Path of the directory and folder where the XML/JSON files are located.

  6. Map the FullPath field from FileSystem to the XmlJsonSource object’s Input node (FilePath).

Now our XmlJson source transformation object is ready. To preview the data, right-click on the XmlJsonSourceTrans object and select Preview Output.

On selecting Preview Output, you will be able to view the data in the Data Preview pane. The data first appears in its condensed form. To expand the data and preview your output, click on the expand icon right next to the root node – XmlJsonSourceTrans.

You now have an expanded version of your data:

  • Root Node: Object Path – XmlJsonSourceTrans

  • Sub Node:

    • Input: Gives the file path of the file that is used as the input for this data.

    • Output: Displays the fields in the source data.

This is how you use an XmlJson File Source as a transformation.
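The same dynamic-path idea carries over to JSON: one parser, many same-schema files. A minimal Python sketch follows; the top-level "records" key is an assumed schema for illustration, not something from this article:

```python
import glob
import json
import os

def read_json_records(file_path):
    """Parse one JSON file that follows the shared schema."""
    with open(file_path) as f:
        document = json.load(f)
    return document["records"]  # "records" is an assumed root element

def process_json_folder(folder):
    """Drive the same parser with each file path, as the FileSystem source's
    FullPath -> FilePath mapping does in the dataflow."""
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, "*.json"))):
        rows.extend(read_json_records(path))
    return rows
```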

Report Source as a Transformation

The Report Source object can be used as a transformation when you have multiple report files with the same layout and want to process them in a dataflow or a workflow.

  1. Drag and drop the Report Source object onto the designer.

  2. Go to the object’s properties and provide the File Path for the report source file and its Report Model.

  3. Once you’ve provided both the paths and configured the properties of the object, click OK. Then, right-click on the header and select Transformation from the context menu to change it to a transformation object.

The header of the ReportSource object will change to purple, indicating the conversion from a source object to a transformation object.

The transformed ReportSource object will now have two nodes:

  • Input node: Map the file path of the folder that contains report files that are to be processed.

  • Output node: When expanded, you will be able to see the data fields that are in the report source file. You can map these fields to other objects in the dataflow through the output mapping ports.

  4. Use the File System Item Source to pass the file path information in the input node of the Report source-transformation object. Drag and drop it from the Toolbox > Sources section.

  5. In the File System Source Properties, provide the path of the directory and folder where the report files are located.

  6. Map the FullPath field from FileSystem to the ReportModel object’s Input node (FilePath).

Now our Report Source Transformation object is ready. To preview the data, right-click on the report source object and select Preview Output.

On selecting Preview Output, you will be able to view the data in the Data Preview pane. The data first appears in its condensed form.

To expand the data and preview your output, click on the expand icon right next to the root node – ReportModelTrans. Then, to further expand the data, click on the icon right next to the subnode – Output.

You now have an expanded version of your data:

  • Root Node: Object Path – ReportModelTrans

  • Sub Node:

    • Input: Gives the file path of the file that is being used as the input for this data.

    • Output: On further expansion, it will show the fields and data present in the report model.

This is how you use the Report Source object as a transformation object.

