New release: TimeXtender Data Integrator

In addition to the TimeXtender Classic update (see also this post), a new version of TimeXtender Data Integrator was also released this week! Below, we share the most important new features and improvements.

 

Metadata and logs

From now on, you can gain insight into the underlying metadata and logs per instance via "Meta collections". This provides valuable information, such as reload times, failures, usage of fields and tables, and much more!

Meta collections are refreshed once a day and are easy to set up per instance through the portal. The first time you set this up, an additional data source becomes available within 24 hours to link to your Ingest instance, after which you can start working with this data right away. See below for a brief explanation of how to set this up in the portal.

Step 1: Add metadata collections and data source

 

Step 2: Map the "TX-Meta_collection" source to your ODX

 
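Once the "TX-Meta_collection" data source has been loaded into your Ingest instance, you can query it like any other source, for example to report on reload times and failures. Below is a minimal sketch of what such a query could look like in Python; the connection string and the schema, table and column names (meta.Executions, StartTime, EndTime, Status) are hypothetical placeholders, not the actual meta collection schema, so check the structure in your own environment first.

```python
# Minimal sketch: report average reload times and failure counts from the
# meta collection data after it has been loaded into Ingest storage.
# NOTE: the connection string and the schema/table/column names below are
# hypothetical placeholders; verify the actual meta collection schema in
# your own environment before using this.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server.database.windows.net;"
    "DATABASE=YourIngestDb;"
    "UID=your-user;PWD=your-password;"
)

QUERY = """
SELECT TableName,
       AVG(DATEDIFF(SECOND, StartTime, EndTime)) AS AvgReloadSeconds,
       SUM(CASE WHEN Status = 'Failed' THEN 1 ELSE 0 END) AS Failures
FROM meta.Executions
GROUP BY TableName
ORDER BY AvgReloadSeconds DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for table_name, avg_seconds, failures in conn.cursor().execute(QUERY):
        print(f"{table_name}: avg reload {avg_seconds}s, {failures} failure(s)")
```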

Three new TimeXtender connectors

The latest version of TimeXtender Data Integrator adds three powerful new connectors:

  • XML & JSON - In addition to CSV, Parquet and Excel, XML and JSON files can now also be loaded directly
  • Azure Data Factory in combination with SAP
  • Infor SunSystems - TimeXtender now supports SunSystems version 5 and higher

Ingest DW_TimeStamp

Good news! The well-known DW_TimeStamp from the Business Unit setup is now also available within the TDI Ingest layer. This field is often used in dashboards to provide insight into when the data was loaded from the source(s).

How do you implement this? Synchronize the source and select odx_timestamp for the desired tables. Next, synchronize the Ingest instance. Finally, add the field to the table in your Prepare instance (and, if needed, your Deliver instance).
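To give an idea of how DW_TimeStamp is typically used, here is a small sketch of a data freshness check like the ones often shown in dashboards. Only the DW_TimeStamp field itself comes from TimeXtender; the sample data, table name and 24-hour threshold are assumptions for illustration.

```python
# Minimal sketch: a "data freshness" check based on the DW_TimeStamp field.
# The sample data stands in for a table in your Prepare/Deliver layer; all
# names other than DW_TimeStamp are illustrative only.
from datetime import datetime, timedelta, timezone

import pandas as pd

# Pretend this came from something like "SELECT CustomerID, DW_TimeStamp FROM Customer"
customer = pd.DataFrame({
    "CustomerID": [1, 2, 3],
    "DW_TimeStamp": pd.to_datetime(["2024-06-01 04:05:00"] * 3, utc=True),
})

latest_load = customer["DW_TimeStamp"].max()
age = datetime.now(timezone.utc) - latest_load
print(f"Customer last loaded at {latest_load} ({age} ago)")

# Flag the table as stale if it has not been refreshed in the last 24 hours
if age > timedelta(hours=24):
    print("Warning: Customer data is more than a day old")
```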

Variables in Deliver Endpoints

From now on, variables can be applied to Deliver endpoints. Here you have the following options: Fixed, Machine Name, User Name, User Domain Name and Destination Scope (name or type).

This opens up many possibilities, such as dynamically applying filters to different endpoints. As a concrete example, suppose you have a dataset with customer data for multiple stores. You can now create one model and dynamically configure endpoints that provide filtered data by store.

Below is an example of how to apply this.

First, I set an instance variable on the Deliver instance that contains the name of the endpoint.

Then I use this variable as a condition on each filter in the Customer table. For example, SalesStoreOne automatically gets only the data where Customer.StoreID = 1 and SalesStoreTwo gets only the data where Customer.StoreID = 2. This makes it easy to work with endpoints in a flexible and scalable way.
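To make the pattern concrete outside of the TimeXtender interface, the sketch below mimics the same idea in plain Python: the endpoint name acts as the variable that decides which store's rows each endpoint receives. The data and the endpoint-to-store mapping are illustrative only; in TimeXtender itself this is configured through the instance variable and the filter conditions described above.

```python
# Conceptual sketch of endpoint-name-driven filtering: each Deliver endpoint
# only receives the customer rows that match its store. The data and the
# mapping below are illustrative; in TimeXtender this is configured via an
# instance variable and filter conditions on the Customer table.
from typing import Dict, List

customers: List[dict] = [
    {"CustomerID": 1, "Name": "Jansen", "StoreID": 1},
    {"CustomerID": 2, "Name": "De Vries", "StoreID": 2},
    {"CustomerID": 3, "Name": "Bakker", "StoreID": 1},
]

# One model, many endpoints: the endpoint name determines the filter value.
ENDPOINT_TO_STORE: Dict[str, int] = {
    "SalesStoreOne": 1,
    "SalesStoreTwo": 2,
}

def rows_for_endpoint(endpoint_name: str) -> List[dict]:
    """Return only the customer rows that belong to the endpoint's store."""
    store_id = ENDPOINT_TO_STORE[endpoint_name]
    return [row for row in customers if row["StoreID"] == store_id]

for endpoint in ENDPOINT_TO_STORE:
    print(endpoint, rows_for_endpoint(endpoint))
```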

Bug fixes

In addition to these updates, several bug fixes have been implemented. Check out the full list of improvements here: TimeXtender Data Integration 6926.1-3177

Questions about this update or curious how this can help your organization? Feel free to contact us!

Bas Hopstaken

TimeXtender Xpert
Tech Lead Data Management

Devin Tiemens

TimeXtender Xpert
Tech Lead Data Management
