Astera Data Stack
Version 10

© Copyright 2025, Astera Software



Accessing Astera’s Server APIs Through a Third-Party Tool

1. Introduction

This article is a brief guide for Astera users on how to use the Astera Server APIs to perform commonly used actions without the Astera Client.

The document covers the following operations:

  1. Configure Astera Server properties

  2. Configure Astera Server license

  3. Deploy user projects on Astera Server

  4. Schedule and execute jobs

2. Environment setup and tools used

For the use cases below, the Astera Integration Server is installed on a virtual machine, and no Astera Client is used to perform the operations. However, some images of the Astera Client are included for illustration. We will use the Postman client as a third-party tool to send API requests to the Astera Server.

3. Use Cases

3.1. Login or Get authorization token API

3.1.1. Login API example (/api/account/login)

‘POST /api/account/login’

This API returns the bearer token that can be used to make calls to the Astera Server, along with additional information about the user.

3.1.2. Login API parameters description

Endpoint: https://localhost:9261/api/account/login

Example request body:

{
  "User": "admin",
  "Password": "Admin123",
  "RememberMe": 1
}

Parameter descriptions:

  • "User": Username of the user trying to log in (default is "admin")

  • "Password": Password of the user trying to log in (default is "Admin123")

  • "RememberMe": Takes 1 or 0, indicating True or False
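As a minimal sketch of building this request outside Postman, the login call can be assembled with Python's standard library. The body shape follows the table above; the helper name `build_login_request` is ours, and nothing is assumed about the server's response beyond it containing a bearer token.

```python
import json
import urllib.request

BASE_URL = "https://localhost:9261"  # default Astera server address and port

def build_login_request(user: str, password: str, remember_me: bool = True):
    """Build the POST /api/account/login request as (url, headers, body_bytes)."""
    body = {
        "User": user,
        "Password": password,
        "RememberMe": 1 if remember_me else 0,  # the API expects 1/0, not true/false
    }
    headers = {"Content-Type": "application/json"}
    return BASE_URL + "/api/account/login", headers, json.dumps(body).encode()

# Sending the request (requires a running Astera server):
# url, headers, data = build_login_request("admin", "Admin123")
# req = urllib.request.Request(url, data=data, headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     login_info = json.load(resp)  # contains the bearer token and user details
```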

3.2. Configure Server Properties

3.2.1. Configure Server properties API example (/api/Server/Config)

In this section, we are going to use the ‘POST /api/Server/Config’ API to change the Server Profile of the Astera Server.

Here, we can see that a Server Profile named DEFAULT has been selected in our Server Properties.

To change the profile shown in the image above, we must send a POST request whose JSON body contains the Repository Database information along with the name of the desired Server Profile.

Note: Server Profile 2 was created in advance to be used in this request.

Once we have provided the relevant information, we can send the request.

We can see that a 200 OK success response is received in the image above.

We can verify using the Astera Client that we have successfully configured our server with the desired profile ServerProfile2.

Note: To see how the JSON request body is structured or what fields are required for a successful POST request, send a request to the GET /api/Server/Config API. This API will return the configured server’s properties in the response.

3.2.2. Server properties configuration API Parameters description

Example request body:

{
  "configParameters": {
    "instrumentationOn": false,
    "purgeChunkSize": 10000,
    "purgeCommandTimeoutSeconds": 600,
    "purgeWindowStartHour": 0,
    "purgeWindowEndHour": 24
  },
  "serverDbInfo": {
    "port": 1433,
    "protocol": "http",
    "serviceName": "RepositoryDWB",
    "authenticationType": "SqlServerAuth",
    "connectionTimeOut": 15,
    "commandTimeOut": 90,
    "dataProvider": "SqlServer",
    "server": "localhost",
    "database": "RepositoryDWB",
    "isRepository": true,
    "schema": "",
    "user": "sa",
    "password": "Astera123"
  },
  "port": 9261,
  "serverProfile": "DEFAULT"
}

Parameter descriptions:

  • "configParameters": It is recommended to leave this section at its defaults.

    • "instrumentationOn": Instrumentation slows down a server's processing capacity but adds more logging for visibility. Set this to true only when debugging an issue with the server or a job running on it.

    • "purgeChunkSize": The number of records the server will try to delete at once when purging old job and event history from the cluster database.

    • "purgeCommandTimeoutSeconds": How long the server should wait for the SQL command to complete before giving up on a purge operation.

    • "purgeWindowStartHour": The beginning of the 24-hour window in which the server can attempt to purge old job and event history from the cluster database.

    • "purgeWindowEndHour": The end of the 24-hour window in which the server can attempt to purge old job and event history from the cluster database.

  • "serverDbInfo": Connection details for the repository database.

    • "port": The SQL Server database port on which the instance is running.

    • "protocol": The protocol used for the connection.

    • "serviceName": The name of the database.

    • "authenticationType": The type of authentication, either SQL Server Authentication or Windows logon.

    • "connectionTimeOut": The timeout when connecting to the instance.

    • "commandTimeOut": The timeout when executing a command on the SQL Server instance.

    • "dataProvider": The database provider name, either SqlServer or PostgreSQL.

    • "server": The database instance's host server.

    • "database": The name of the database.

    • "isRepository": Tells the server that this connection points to an Astera repository. Always set this to true.

    • "schema": The schema of the database; by default it is 'dbo'.

    • "user": The username for logging in to the SQL Server.

    • "password": The password for logging in to the SQL Server.

  • "port": The port the Astera Integration Server is running on. By default it is 9261, unless specified during installation.

  • "serverProfile": The Astera Integration Server profile configuration. Users can create different profiles for different servers; in a profile, the user can set the maximum job count and other administrative properties.
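Putting the note in 3.2.1 into practice, a minimal Python sketch of the GET-modify-POST cycle might look as follows. The helper names (`with_server_profile`, `config_request`) are illustrative, and the Authorization header assumes the standard bearer-token scheme using the token returned by the login API.

```python
import json
import urllib.request

def with_server_profile(config: dict, profile: str) -> dict:
    """Return a copy of a GET /api/Server/Config payload with a new profile name."""
    updated = dict(config)
    updated["serverProfile"] = profile
    return updated

def config_request(base_url: str, token: str, config: dict):
    """Build the POST /api/Server/Config request as (url, headers, body_bytes)."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + token,  # bearer token from /api/account/login
    }
    return base_url + "/api/Server/Config", headers, json.dumps(config).encode()

# Typical flow (requires a running server):
# 1. GET /api/Server/Config and parse the current payload: current = json.load(resp)
# 2. updated = with_server_profile(current, "ServerProfile2")
# 3. url, headers, data = config_request("https://localhost:9261", token, updated)
# 4. POST it back; a 200 OK response confirms the change.
```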

3.3. Manage Server License

3.3.1. Change license key API example (/api/License/Key)

In this section, we are using the POST /api/License/Key API that allows the user to change the server license. We will also use the POST /api/License/Activate API to activate the license.

Go to Server > Configure > Enter License Key. Here, we can see a user ‘Nisha’ of the organization ‘G’ is registered with an active license.

To change the license key and register the user, we must provide the User, Organization, and License Key in the JSON request body. Refer to image below.

Note: The license key was taken in advance for this demo.

Once we have provided the relevant information, we can send the request.

In response, we can see that a 200 OK success status is received indicating that the license key has been changed.

Now, go to Server > Configure > Enter License Key and notice the license properties. We can see that the user Nisha of the organization Astera has been registered with a new license key.

3.3.2. Change license API Parameter description

Example request body:

{
  "LicenseKeyRegistrationModel": {
    "user": "TEST",
    "organization": "TEST",
    "key": "TEST"
  }
}

Parameter descriptions:

  • "user": The username you want to register the product with.

  • "organization": The organization name you want to register the product with.

  • "key": The license key provided by Astera Software.

3.3.3. Activate license API Example (/api/License/Activate)

However, the license is not activated yet. To activate the license, we can simply send a request to the /api/License/Activate endpoint. After sending the request, we can see that a 200 OK response is received.

Note: To receive a 200 OK response, we must send an empty body; otherwise, the request will result in an error. Also, this API only activates the license online.

Go to Server > Configure > Enter License Key again. Here, we can see the status of the license is activated now.

3.3.4. Activate license Parameter description

No parameters are required; send an empty body.
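To summarize sections 3.3.1 through 3.3.4, the two request bodies can be sketched in Python as below. The helper names are ours, and the "TEST" values in the article's table are placeholders for real registration details.

```python
import json

def license_key_body(user: str, organization: str, key: str) -> bytes:
    """JSON body for POST /api/License/Key, using the wrapper object shown above."""
    return json.dumps({
        "LicenseKeyRegistrationModel": {
            "user": user,
            "organization": organization,
            "key": key,
        }
    }).encode()

def activate_body() -> bytes:
    """POST /api/License/Activate must be sent with an empty body."""
    return b""
```

Sending `license_key_body(...)` first registers the new key; a follow-up POST to /api/License/Activate with `activate_body()` then activates it.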

3.4. Configuring Cluster settings

Cluster settings must be configured before proceeding with project deployment.

3.4.1. Cluster settings API parameters description ‘/api/Cluster’

{
  "id": 1,
  "name": "DEFAULT",
  "sendErrorInfoToAstera": true,
  "purgeJobInfoAfter": 7,
  "purgeEventInfoAfter": 7,
  "stagingDirectory": {
    "path": "C:\\Staging"
  },
  "deploymentDirectory": {
    "path": "C:\\Deployment"
  },
  "pauseAllServers": false,
  "pauseServersFrom": null,
  "pauseServersTo": null,
  "clientAndServersShareTheSameFileSystem": false
}

  • id: The identifier for the settings object in the repository database. Should always be 1.

  • name: The name we want to give to our cluster (server group); by default it is set to 'Default'.

  • sendErrorInfoToAstera: Allows sending anonymous usage and error data to Astera.

  • purgeJobInfoAfter: The number of days after which job information is purged (removed from the repository database). It is no longer available in the Job Monitor after it is purged.

  • purgeEventInfoAfter: The number of days after which server event information is purged (removed from the repository database). It is no longer available in the Server Monitor after it is purged.

  • stagingDirectory.path: The staging directory path. This is where Astera Integration Server keeps files related to a deployment.

  • deploymentDirectory.path: The deployment directory path. This is where Astera Integration Server keeps a local copy of deployment archive files (.car) and their configuration files (.cfg).

  • pauseAllServers: A Boolean value that turns pausing of Astera Integration Servers on or off.

  • pauseServersFrom: The time at which servers start pausing. Only applies when pauseAllServers is true.

  • pauseServersTo: The time at which servers resume. Only applies when pauseAllServers is true.

  • clientAndServersShareTheSameFileSystem: False if the client and the server do not share the same file system. This applies to any scenario where the client and server are not on the same machine or network.
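The cluster settings body can be assembled and sanity-checked before sending. This is only a sketch; the directory paths are the example values from above, and the POST itself is omitted.

```python
import json

# Cluster settings body for /api/Cluster, using the example values above.
cluster_settings = {
    "id": 1,  # must always be 1 per the description above
    "name": "DEFAULT",
    "sendErrorInfoToAstera": True,
    "purgeJobInfoAfter": 7,
    "purgeEventInfoAfter": 7,
    "stagingDirectory": {"path": "C:\\Staging"},
    "deploymentDirectory": {"path": "C:\\Deployment"},
    "pauseAllServers": False,
    "pauseServersFrom": None,
    "pauseServersTo": None,
    "clientAndServersShareTheSameFileSystem": False,
}

# Basic sanity checks before sending the request.
assert cluster_settings["id"] == 1, "id should always be 1"
assert cluster_settings["purgeJobInfoAfter"] > 0

print(json.dumps(cluster_settings))
```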

3.5. Archive File Deployment (User project deployment)

To successfully deploy an archive file (User project .car file) using the APIs, a user must perform the following prerequisites:

  • Upload the archive file (User project .car file) to the deployment directory.

  • Upload the config file to the deployment directory. (optional)

3.5.1. Uploading the Archive File/Config File to the Deployment Directory

There are two methods to do this; let's see each in action.

a. Example of using ‘POST /api/UploadFile’

The user can upload the config and .car files to the deployment directory using APIs. There is a possibility that a user might delete or move the config or .car file from their local machine. To avoid any issues, it is recommended to first upload these files to the deployment directory.

To upload the file, use the ‘POST /api/UploadFile’ API. In this API, we must provide two query parameters:

  • FileTypes: Extension of the target file e.g., cfg for the config file, car for Archive files.

  • TargetFileName: Here, we define the target file’s name e.g., Testing.

Next, we need to configure the Request Body for this API. For the request body, select the form-data content type, set the Key type to File, and provide the desired archive file (.car file) in the Value. Click Send.

Note: The archive project file (.car file) was created in advance for this demo.

Here, we can see that a 200 OK response, containing the file path of the archive file in the deployment directory, has been received. Copy this path, as we will need it when creating the deployment.

Similarly, we can upload the config file to the deployment directory using this api/UploadFile endpoint.

Parameter description of /api/UploadFile

Query Parameters

  • FileTypes (e.g., Cfg): The extension of the file, e.g., Car or Cfg. Two files relate to a deployment: the .car file, an archive containing a snapshot of the project at the time it was generated, and the .cfg file, a configuration file containing the values of the parameters used in the project.

  • TargetFileName (e.g., Testing): The name under which the uploaded file will be saved.

Request Body (content-type form-data)

  • ArchiveFile: In this parameter, attach the archive or configuration file you wish to upload to the deployment directory.
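The two query parameters can be encoded onto the endpoint URL as shown below. The base URL is a hypothetical placeholder, and the snippet stops short of the multipart file upload itself, since that depends on your HTTP client.

```python
from urllib.parse import urlencode

BASE_URL = "http://localhost:8080"  # hypothetical server address

# FileTypes and TargetFileName are the two query parameters described above.
params = {"FileTypes": "Car", "TargetFileName": "Testing"}
upload_url = BASE_URL + "/api/UploadFile?" + urlencode(params)

print(upload_url)
```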

b. Using the ‘POST api/UploadCarFile’

An archive file (.car file) can also be uploaded to the deployment directory via another API i.e., ‘POST api/UploadCarFile’.

In this API, we do not need to specify the query parameters. Simply select the archive file (.car file) in the body and click Send. Here, we receive the archive file path in the response.

The uploaded files can also be seen in the deployment directory.

Parameter description of api/UploadCarFile

Request Body (content-type form-data)

  • Archive: In this parameter, attach the archive or configuration file you wish to upload to the deployment directory.

3.5.2. Creating the Project Deployment

3.5.2.1. Using POST /api/Deployment example

Now, let’s proceed to the deployment creation. To create a deployment, we must use the ‘POST /api/Deployment’ API.

In this API’s Request Body, we must provide information such as:

  1. Relevant archive (.car) and config (optional) file paths (both local and deployment directory file paths)

  2. The deployment’s name, its ID, and its activation state, etc.

After defining the body, we can click Send.

Here, we can see a 200 OK response has been received indicating that a deployment has been created.

Open the client and go to Server > Deployment Settings > Deployment. In the deployment window, we can see the archive file has successfully been deployed.

Please note the following:

  • To create a new deployment, we must provide the field “Id” as 0. We should also provide a unique deployment “Name” i.e., not the same name as an already existing deployment. Otherwise, the request will result in a 400 Bad Request error.

  • If we provide a non-zero “Id” field e.g., Id = 7, the server will consider this request as an update request, and if a deployment with ID 7 already exists on the server it will be modified/updated.

3.5.2.2. Parameter description of /api/Deployment

{
  "userArchiveFilePath": "C:\\Project_ArchiveFile.car",
  "clusterArchiveFilePath": "C:\\Testing.Car",
  "userConfigFilePath": "",
  "clusterConfigFilePath": "",
  "encryptFiles": true,
  "comment": "Modifying by reducing the body",
  "id": 0,
  "name": "DeploymentTesting"
}

  • userArchiveFilePath: The path of the source .car file used for the deployment. It is not used anywhere at runtime.

  • clusterArchiveFilePath: The archive file's (.car file's) deployment directory path, i.e., the path to the archive file uploaded to the deployment directory.

  • userConfigFilePath: The path of the source .cfg file used for the deployment. It is not used anywhere at runtime.

  • clusterConfigFilePath: The config file's (.cfg file's) deployment directory path, i.e., the path to the configuration file uploaded to the deployment directory. The config file is optional.

  • encryptFiles: Set to true or false to turn encryption of the configuration (.cfg) file on or off.

  • comment: A user comment (i.e., description) attached to the deployment.

  • id: The deployment ID of the deployment in case of modification. Use zero for new deployments.

  • name: The name of the deployment; this should be unique.
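A sketch of building the deployment body, including the id = 0 rule for new deployments noted above. The paths are the example values; in practice you would use the path returned by the upload step, and the helper name here is illustrative, not part of the product.

```python
import json

def new_deployment_body(name, user_car_path, cluster_car_path):
    """Build a /api/Deployment body for a *new* deployment (id must be 0)."""
    return {
        "userArchiveFilePath": user_car_path,
        "clusterArchiveFilePath": cluster_car_path,
        "userConfigFilePath": "",   # config file is optional
        "clusterConfigFilePath": "",
        "encryptFiles": True,
        "comment": "Created via API",
        "id": 0,     # 0 means "create new"; a non-zero id means "update existing"
        "name": name,  # must be unique, otherwise 400 Bad Request
    }

body = new_deployment_body("DeploymentTesting",
                           "C:\\Project_ArchiveFile.car",
                           "C:\\Testing.Car")
print(json.dumps(body))
```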

3.5.3. Post Deployment Modification

The ‘POST /api/Deployment’ API can also be used to modify an existing deployment. In this API’s Request Body, details of an existing deployment are required.

In this scenario, we want to update the name of the above-created DeploymentTesting. However, we do not have its details available.

So, to gather the details, we first use the ‘GET api/Deployments’ API to fetch the information of the existing deployment. Then, we copy the deployment Id, UpdateDtTm, and CreatedDtTm fields from the response.

Note: This GET API returns information for all the deployments on the server. Since we want to modify only one deployment, DeploymentTesting, we copied these highlighted fields only.

Now, in the POST Request’s Body, let’s change the deployment Name to DeploymentTesting_Modified and replace the values of Id, UpdateDtTm, and CreatedDtTm fields with the values copied from the GET response.

Let’s send the request.

A 200 OK response is received. Now, go to Server > Deployment Settings > Deployment, in the deployment window, we can see the modified deployment name.

Note: Each time we update an existing deployment, the UpdateDtTm field is modified as well. Therefore, we must always send a GET api/Deployments request first to fetch the details of the deployment, and then use the received details as the body for the POST request to successfully modify the deployment. Using an invalid (past) UpdateDtTm value will give a 400 Bad Request error.
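The GET-then-POST flow above can be sketched as a small helper. The fetched dictionary here is simulated with placeholder values; the real one would come from the GET api/Deployments response.

```python
# Simulated entry from a GET api/Deployments response (placeholder values;
# real Id, UpdateDtTm, and CreatedDtTm come from the server).
fetched = {
    "Id": 7,
    "Name": "DeploymentTesting",
    "UpdateDtTm": "2022-09-06T04:53:46Z",
    "CreatedDtTm": "2022-09-01T10:00:00Z",
}

def rename_deployment(deployment, new_name):
    """Copy the fetched deployment and change only its name.

    Carrying over Id, UpdateDtTm, and CreatedDtTm from the GET response
    is what avoids the 400 Bad Request described above.
    """
    body = dict(deployment)  # keep Id, UpdateDtTm, CreatedDtTm intact
    body["Name"] = new_name
    return body

body = rename_deployment(fetched, "DeploymentTesting_Modified")
print(body["Id"], body["Name"])
```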

Let’s proceed to learn how we can schedule jobs on the server using APIs.

3.6. Scheduling Jobs

3.6.1. Scheduling Jobs using ‘POST api/Schedule’ API example

In this section, we are scheduling the previously created deployment using the ‘POST api/Schedule’ API.

This API’s Request Body requires the schedule configuration information, i.e., Schedule Name, Schedule Type, Frequency, Activate State, Server Info, etc.

Let’s create the schedule called Schedule_Testing, which runs daily, with schedule type deployment, an active state as True, etc.

Sending the request shows a 200 OK status response.

If we go to Server > Job Schedules, in the scheduler window we can see that a schedule called Schedule_Testing has been created, with Schedule Job Id 4, Schedule Type Deployment, and Frequency daily.

Note: Like the deployment POST API, this POST api/Schedule endpoint can also be used for modification of existing schedules.

Each time we update an existing schedule, the UpdateDtTm field is modified as well. Therefore, we must always send a GET api/Schedules or a GET /api/Schedule/:scheduleId request first to fetch the details, and then use the received details as the body for the POST request to successfully modify a schedule.

Note: Using an invalid (past time) UpdateDtTm value will give a 400 Bad Request error.

3.6.2. Schedule job API Parameters description

{
  "archiveStartItem": "C:\\DataflowSample.Df",
  "schedule": {
    "dailyScheduleBase": {
      "startTime": "2022-09-06T04:53:46.2573662-07:00",
      "typeName": "Astera.Core.DailyScheduleEveryDay"
    },
    "typeName": "Astera.Core.DailySchedule"
  },
  "traceLevel": "Job",
  "skipIfRunning": false,
  "isActive": true,
  "jobOptions": {
    "usePushdownOptimization": false
  },
  "filePathResolved": {
    "path": "C:\\SampleConfigFile\\test.car"
  },
  "deploymentId": 1,
  "isFile": false,
  "name": "test"
}

  • archiveStartItem: The initial artifact to run from the deployment. Omit if not using an archive.

  • schedule.dailyScheduleBase.startTime: The date and time when this scheduled job should first start. Since this is a daily schedule, the job repeats every day at the time given here.

  • schedule.dailyScheduleBase.typeName and schedule.typeName: The type of the schedule, i.e., daily, weekly, etc.

  • traceLevel: Set this parameter to 'Job' if you want to track the job in the Job Monitor; otherwise, its progress will not be tracked.

  • skipIfRunning: Set this to true if you want to skip the run when the same schedule's last run is still queued or running.

  • isActive: Activates or deactivates the schedule.

  • jobOptions.usePushdownOptimization: Set this to true if you want to run the scheduled job in pushdown mode.

  • filePathResolved.path: The path to the .car, .df, or .wf file, depending on the 'isFile' parameter.

  • deploymentId: The deployment ID when using an archive deployment, i.e., when 'isFile' is set to false. This can be found under Server > Deployment.

  • isFile: Set this to true if pointing to a dataflow or workflow file directly; if using a deployment, set it to false.

  • name: The name of the scheduled job.
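Finally, a sketch of assembling the schedule body for a deployment-based daily schedule, with a sanity check on the isFile/deploymentId pairing described above. The archiveStartItem and filePathResolved fields are omitted here for brevity, since this example runs from a deployment.

```python
import json

# Schedule body for POST api/Schedule, using the values from this section.
schedule_body = {
    "schedule": {
        "dailyScheduleBase": {
            "startTime": "2022-09-06T04:53:46.2573662-07:00",
            "typeName": "Astera.Core.DailyScheduleEveryDay",
        },
        "typeName": "Astera.Core.DailySchedule",
    },
    "traceLevel": "Job",   # 'Job' makes the run visible in the Job Monitor
    "skipIfRunning": False,
    "isActive": True,
    "jobOptions": {"usePushdownOptimization": False},
    "deploymentId": 1,     # required because isFile is false
    "isFile": False,       # false = run from a deployment, not a flow file
    "name": "Schedule_Testing",
}

# When scheduling a deployment (isFile false), a deploymentId must be supplied.
if not schedule_body["isFile"]:
    assert schedule_body["deploymentId"] > 0

print(json.dumps(schedule_body))
```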
