Connecting your systems
Takeaway
We offer different integration strategies to connect your systems with Product-Live, each addressing specific needs:
- If you intend to programmatically create or update content synchronously, use the Content Management API.
- If you intend to programmatically create or update content in asynchronous bulk/batch operations, use the Data Factory API.
- If the system you intend to connect to Product-Live offers SFTP or Cloud Storage capabilities, use the Data Factory & SFTP or Cloud Storage strategy.
- If your system offers a public API, use the Data Factory & Event Driven strategy.
- Use the Image API to perform complex image manipulations and transformations.
- If your intention is to retrieve content and display it to your users within a high-traffic application or website, refer to the Content Delivery Strategy section.
If you're not sure which strategy to use, our teams are available to answer any questions.
Overview
Product-Live offers different ways to connect your systems. Each method has its own advantages. In this document we will explain each method and when to use it.
1. Content Management API
The Content Management API lets you interact with the Product-Live platform to perform various operations on your data. Our API is a classic REST API that uses JSON as its data format. We provide Swagger documentation, which you can find here, and an OpenAPI specification, which you can find here. We also provide a TypeScript SDK (more languages to come) to speed up the development of your integration.
When to use?
While there is no absolute rule, from our experience, below are some use cases where the Content Management API is the best choice. If you need any assistance choosing the best integration method, please reach out to us.
- You already have experience integrating your systems with third-party services via an HTTP API. For example, you already have integrations with a payment provider, a shipping provider, a CRM, etc.
- You will be able to host your own code and manage its execution in a production environment.
- The data flows you wish to set up require the rapid updating of small volumes of data. For example, within an orchestration involving our services, a property on a product (its stock, say) may need to be updated within a few milliseconds so that this information can then be processed by another of your services. The opposite case would be updating large volumes of data in batches, once a day, without processing them instantaneously; there, the API approach would not be the best choice.
Advantages
An integration with the Content Management API offers the following advantages:
- Interoperability: you can use any programming language to interact with our API. Although we provide a TypeScript SDK, you can use any language that supports HTTP requests, and even generate your own SDK using the OpenAPI specification.
- Real-time integration: you can send a request and get a response in real time. This is useful, for example, when you need to send an item to Product-Live and quickly get an acknowledgement of the operation.
- Granular control: other integration methods may require you to send all your data at once. With the Content Management API, you can send only the data that you need to update.
- Scalability: our API is designed to handle a large number of requests to support your growth.
When may another approach be recommended?
- If you don't have any experience integrating your systems with third-party services via an HTTP API, you may want to consider another approach. For example, the Data Factory platform allows you to design any integration flow without writing a single line of code and without the need to deploy and manage your code in a production environment.
- For data-intensive applications, such as feeding an ecommerce frontend, we don't recommend using the Content Management API directly. Instead, we recommend setting up a cache, itself fed by our Content Management APIs or the Product-Live Data Factory platform. Product-Live doesn't offer a managed cache solution, but our teams will be happy to advise you on which solution to deploy.
Usage examples
Below is a basic example of how to send a product to Product-Live using the Content Management API. For more examples, please refer to this section of our online documentation.
```ts
import { createConfiguration, ServerConfiguration, ItemApi } from '@product-live/api-sdk-beta';

export async function main(): Promise<void> {
  const configuration = createConfiguration({
    baseServer: new ServerConfiguration(process.env.API_BASE_PATH || '', {}),
    authMethods: {
      ApiKeyAuthHeader: process.env.API_ACCESS_TOKEN
    }
  });
  const itemApi = new ItemApi(configuration);
  // Each field references a field id of the target table (see the response below).
  await itemApi.createItem({
    fields: [
      { id: '1', value: '5449000267412' },                      // ean
      { id: '2', value: 'Original Taste - Coca Cola - 1.25l' }, // title
      { id: '3', value: '<url>' },                              // main_picture
      { id: '4', value: 1.5 },                                  // quantity_in_liters
      { id: '5', value: 'coca_cola' }                           // brand
    ]
  });
}
```
The same operation using cURL:

```bash
curl -X 'POST' \
  'https://api.product-live.com/v1/items' \
  -H 'accept: application/json' \
  -H 'X-Api-Key: [REDACTED]' \
  -H 'Content-Type: application/json' \
  -d '{
    "object": "item",
    "fields": [
      { "id": "1", "value": "5449000267412" },
      { "id": "2", "value": "Original Taste - Coca Cola - 1.25l" },
      { "id": "3", "value": "<url>" },
      { "id": "4", "value": 1.5 },
      { "id": "5", "value": "coca_cola" }
    ]
  }'
```
The created item, as returned by the API, looks like this:

```json
{
  "object": "item",
  "id": "1",
  "fields": [
    { "id": "1", "key": "ean", "type": "IDENTIFIER", "value": "5449000267412" },
    { "id": "2", "key": "title", "type": "SINGLE_LINE_TEXT", "value": "Original Taste - Coca Cola - 1.25l" },
    { "id": "3", "key": "main_picture", "type": "IMAGE", "value": "<url>" },
    { "id": "4", "key": "quantity_in_liters", "type": "NUMBER", "value": 1.5 },
    { "id": "5", "key": "brand", "type": "SINGLE_SELECT", "value": "coca_cola" }
  ],
  "metadata": {
    "createdAt": "2023-01-01T00:00:00.000Z",
    "updatedAt": "2023-01-02T00:00:00.000Z",
    "tableId": "1",
    "tableKey": "products",
    "tableOwnerAccountId": "1",
    "levelId": "1",
    "levelKey": "main"
  }
}
```
2. Data Factory API
The Data Factory API is technically very similar to the Content Management API, but serves different purposes. The aim here is to offer an API that interacts with the Data Factory platform, and thus takes advantage of all the power the orchestration platform has to offer. Like the Content Management API, the Data Factory API is a classic REST API that uses JSON as its data format. We provide online documentation and examples of use. An OpenAPI definition is available, enabling SDKs to be generated in different languages. Product-Live offers a TypeScript version of this SDK, enabling you to get started more quickly.
First, what is Product-Live Data Factory?
At its core, Product-Live Data Factory is an orchestration platform that allows you to build complex workflows without the hassle of maintaining technical infrastructure. Product-Live Data Factory may be used to solve various types of problems, such as:
- Process Orchestration and Automation: you can model complex business processes and automate them according to your needs.
- Data integration: you can use Product-Live Data Factory to connect your systems together, for example to send data from Product-Live to your CRM or your ERP, or the other way around.
- Data Transformation: you can use Product-Live Data Factory to transform your data, for example to convert a CSV file to a JSON file, or optimize a set of images to fit the requirements of your ecommerce frontend.
- Data Distribution: you can use Product-Live Data Factory to distribute your data to various destinations, for example to send an activity report to your team via email, or to send your product data to a third-party service via an API.
For more information about Product-Live Data Factory, please refer to this section of our online documentation.
When to use?
From our experience, below are some use cases where the Data Factory API is the best choice:
- You already have experience integrating your systems with third-party services via an HTTP API. For example, you already have integrations with a payment provider, a shipping provider, a CRM, etc.
- You won't be able (or rather, don't want) to host your own code and manage its execution in a production environment; the Product-Live Data Factory platform will handle this for you.
- The data flows you wish to set up require updating large amounts of data, or running for a long time. The Product-Live Data Factory platform is designed to handle this kind of use case.
- You need to build complex workflows, with multiple steps, and you want to be able to monitor the execution of those long-running workflows.
Advantages
An integration with the Product-Live Data Factory API offers the following advantages:
- Interoperability: as with the Content Management API, you can use any programming language to interact with our API. Although we provide a TypeScript SDK (the same one provided for the Content Management API), you can use any language that supports HTTP requests, and even generate your own SDK using the OpenAPI specification.
- High scalability: the Product-Live Data Factory platform, along with the Data Factory API, is tailored to handle large amounts of data. You can use it to import or export millions of items, inside jobs that can run for hours or even days.
- High resilience and reliability: the Product-Live Data Factory platform is designed to be highly resilient and reliable. It is built on top of the open source service orchestrator maintained by Netflix (Conductor), all of it hosted on Microsoft Azure cloud infrastructure. It is also designed to be highly available, with a 99.99% SLA.
- Benefit from the richness of the Data Factory platform: to date, several dozen Data Factory tasks are natively available. These can connect to a large number of systems (several hundred to date) and generate any type of document (PDF, Excel, CSV, JSON, XML, Zip...). Note that you can also develop your own tasks if the available ones don't directly meet your needs; please contact our teams for more information.
When may another approach be recommended?
- Although the platform has a wide range of functions, enabling it to connect to numerous systems, it is possible that your needs are not yet covered by the platform. In this case, don't hesitate to contact our teams, who will do their utmost to meet your needs.
- If you already have an orchestration platform such as Camunda or Pega, you may want to consider using it to orchestrate your data flows and use the Content Management API to interact with Product-Live.
Use case example
You are seeking to establish a connection between an Enterprise Resource Planning (ERP) system and the Product-Live platform while considering the following constraints:
- The ERP system currently possesses a comprehensive product export in CSV format.
- This CSV export encompasses all available products within the ERP.
- Previous experience has indicated that the CSV file may occasionally exhibit inconsistencies, either due to improper formatting or occasional export failures.
- You possess the capability to initiate REST API calls (using one of your existing systems, or directly from an in-house application).
This Data Workflow may be implemented inside the Product-Live Data Factory platform as follows:
Once this workflow has been created inside your Product-Live account, you will be able to execute it using the Data Factory API.
- First, upload the CSV file to Product-Live Data Factory:
```ts
import { createConfiguration, ServerConfiguration, DataFactoryFileApi, JobExecutionApi } from '@product-live/api-sdk';
import { promises as fs } from 'fs';
import path from 'path';

export async function main(): Promise<void> {
  const filePath = 'file path to upload';
  const configuration = createConfiguration({
    baseServer: new ServerConfiguration(process.env.API_BASE_PATH || '', {}),
    authMethods: {
      ApiKeyAuthHeader: process.env.API_ACCESS_TOKEN
    }
  });
  const dataFactoryFileApi = new DataFactoryFileApi(configuration);
  const uploadedFile = await dataFactoryFileApi.uploadFile({
    data: await fs.readFile(filePath), // read the file as a Buffer
    name: path.basename(filePath)
  });
}
```
The same upload using cURL:

```bash
curl -X 'POST' \
  'https://api.product-live.com/v1/data_factory/files' \
  -H 'accept: application/json' \
  -H 'X-Api-Key: [REDACTED]' \
  -H 'Content-Type: multipart/form-data' \
  -F 'file=@file.csv;type=text/csv'
```
- Then, execute the previously created Data Factory job, which will start the Data Workflow using the previously uploaded CSV file:
```ts
// Continues the previous snippet: configuration, filePath and uploadedFile are in scope.
const jobId = 'job id to execute';
const jobExecutionApi = new JobExecutionApi(configuration);
await jobExecutionApi.createJobExecution({
  jobId: jobId,
  info: {
    title: `Import ${path.basename(filePath)}`
  },
  input: {
    file: uploadedFile
  }
});
```
The same execution using cURL:

```bash
curl -X 'POST' \
  'https://api.product-live.com/v1/data_factory/job_executions' \
  -H 'accept: application/json' \
  -H 'X-Api-Key: [REDACTED]' \
  -H 'Content-Type: application/json' \
  -d '{
    "jobId": "<job id>",
    "input": {
      "file": {
        "object": "data_factory_file",
        "id": "<file id>"
      }
    }
  }'
```
3. SFTP
An SFTP server (Secure File Transfer Protocol server) is a software application that facilitates secure file transfer. It can easily be accessed from a desktop computer using an SFTP client (like FileZilla). It's a popular choice for securely transferring files between remote systems and is supported by most solutions.
If you do not possess an SFTP server
Product-Live can provide you with one. Please contact our teams for more information.
When to use?
If you already have an SFTP server or a Managed File Transfer (MFT) solution, such as those offered by Axway or IBM, to exchange files with your partners, or if you are looking for a simple and quick solution to set up, this is the integration you should choose.
Advantages
An integration using a remote SFTP server and the Data Factory platform offers the following advantages:
- Easy to set up: The SFTP protocol is supported by a wide range of tools and is quick and easy to set up. If you don't have your own SFTP server, Product-Live can provide one for you. Once your SFTP server is set up, you can use the Data Factory platform to connect to it and exchange files.
- High scalability: the Product-Live Data Factory platform is tailored to handle large amounts of data. You can use it to import or export millions of items, inside jobs that can run for hours or even days.
- High resilience and reliability: the Product-Live Data Factory platform is designed to be highly resilient and reliable. It is built on top of the open source service orchestrator maintained by Netflix (Conductor), all of it hosted on Microsoft Azure cloud infrastructure. It is also designed to be highly available, with a 99.99% SLA.
- Benefit from the richness of the Data Factory platform: to date, several dozen Data Factory tasks are natively available. These can connect to a large number of systems (several hundred to date) and generate any type of document (PDF, Excel, CSV, JSON, XML, Zip...). Note that you can also develop your own tasks if the available ones don't directly meet your needs; please contact our teams for more information.
When may another approach be recommended?
If the Content Management API or the Data Factory API integration approaches are not a good fit for your needs, and you have an account with one of the popular Cloud Providers (AWS, Azure, Google Cloud...), we recommend that you first consider the Cloud Storage Provider integration approach, which is more flexible and scalable.
Use case example
A connection through SFTP leverages the same tasks as the previous case, except that the CSV file to be handled is dropped on an SFTP server instead of being uploaded through the Product-Live Data Factory API.
This Data Workflow may be implemented inside the Product-Live Data Factory platform as follows:
You will then simply set your job to be executed every day at a given time.
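For illustration, below is a minimal sketch of the producing side: a Node.js script that drops the CSV export on the SFTP server watched by the job. The `ssh2-sftp-client` library, the connection details and the remote path are all assumptions here; any SFTP client or MFT solution can do the same.

```ts
// Minimal sketch: push a CSV export to the SFTP server read by the scheduled
// Data Factory job. Library choice, credentials and paths are assumptions.
import SftpClient from 'ssh2-sftp-client';

export async function main(): Promise<void> {
  const sftp = new SftpClient();
  await sftp.connect({
    host: process.env.SFTP_HOST || '',
    port: 22,
    username: process.env.SFTP_USERNAME || '',
    password: process.env.SFTP_PASSWORD || ''
  });
  try {
    // Drop the file in the folder that the scheduled job picks up.
    await sftp.put('./products.csv', '/inbound/products.csv');
  } finally {
    await sftp.end();
  }
}
```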
4. Cloud storage providers
Cloud Providers such as Microsoft Azure, Google Cloud Platform or Amazon Web Services are very popular solutions for storing documents. Their storage solutions are very scalable and reliable, and represent a more modern approach than SFTP.
When to use?
If the Content Management API or the Data Factory API integration approaches are not a good fit for your needs, and you already have an account with one of the popular Cloud Providers (AWS, Azure, Google Cloud...), we recommend that you consider this integration approach.
Advantages
An integration using a Cloud Storage Provider and the Data Factory platform offers the following advantages:
- Easy to set up: setting up a storage solution within a Cloud Provider account is quick and easy. Once your storage solution is set up, you can use the Data Factory platform to connect to it and exchange files.
- Handles high input data volumes: storage solutions provided by Cloud Providers are designed to handle large amounts of data. With Microsoft Azure Blob Storage, for example, you can store up to 5 PiB of data in a single storage account.
- High scalability: the Product-Live Data Factory platform is tailored to handle large amounts of data. You can use it to import or export millions of items, inside jobs that can run for hours or even days.
- High resilience and reliability: the Product-Live Data Factory platform is designed to be highly resilient and reliable. It is built on top of the open source service orchestrator maintained by Netflix (Conductor), all of it hosted on Microsoft Azure cloud infrastructure. It is also designed to be highly available, with a 99.99% SLA.
- Benefit from the richness of the Data Factory platform: to date, several dozen Data Factory tasks are natively available. These can connect to a large number of systems (several hundred to date) and generate any type of document (PDF, Excel, CSV, JSON, XML, Zip...). Note that you can also develop your own tasks if the available ones don't directly meet your needs; please contact our teams for more information.
When may another approach be recommended?
If you do not have an account with one of the popular Cloud Providers (AWS, Azure, Google Cloud...), we recommend that you consider the SFTP integration approach, which is easier to set up.
Use case example
As with the SFTP integration approach, a connection through a Cloud Storage Provider leverages the same tasks as the previous case, except that the CSV file to be handled is dropped on a Cloud Storage Provider instead of being uploaded through the Product-Live Data Factory API.
This Data Workflow may be implemented inside the Product-Live Data Factory platform as follows:
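On the producing side, below is a minimal sketch of uploading the CSV export to the storage location read by the job, assuming Azure Blob Storage and the official `@azure/storage-blob` package; the container and blob names are assumptions, and the AWS S3 or Google Cloud Storage SDKs can be used in the same way.

```ts
// Minimal sketch: upload a CSV export to the Azure Blob Storage container read
// by the Data Factory job. Container and blob names are assumptions.
import { BlobServiceClient } from '@azure/storage-blob';

export async function main(): Promise<void> {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING || ''
  );
  const container = service.getContainerClient('product-live-inbound');
  const blob = container.getBlockBlobClient('products.csv');
  // uploadFile is available in the Node.js runtime of @azure/storage-blob.
  await blob.uploadFile('./products.csv');
}
```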
5. Event Driven
You can leverage the Product-Live Data Factory platform to directly push information from Product-Live to your systems, using, for example, an HTTP API provided by one of your systems.
The main applications are as follows:
- Acknowledgement of delivery: for example, when all items have been successfully imported into the Product-Live system. Instead of querying the API at regular intervals to check the completion status of a job, it is recommended to incorporate a task at the end of the job that emits an event or invokes an API within your systems. This approach improves the efficiency of connecting systems. However, it requires the ability to expose a public API that can be called from the Data Factory platform.
- Integration of software systems with exposed APIs: This scenario involves establishing connections between software systems that provide their own APIs. To facilitate this integration, a range of pre-configured tasks are available, specially designed for the most common software applications.
Use case example
In the use case below, a job is triggered (by a user or automatically), and during its execution, a call is made to a public HTTP API exposed by a third-party application, so that this application can take into account changes made in Product-Live (or simply be notified that a certain action has been performed).
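As a sketch of the receiving side, the endpoint below could accept such a notification call. It uses Node's built-in http module; the route, port and payload shape are illustrative assumptions, as the actual payload depends on how the task is configured in your job.

```ts
// Minimal sketch of a public endpoint that a Data Factory task could call when
// a job completes. Route, port and payload shape are assumptions.
import { createServer } from 'http';

const server = createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/product-live/job-completed') {
    let body = '';
    req.on('data', (chunk) => (body += chunk));
    req.on('end', () => {
      const notification = JSON.parse(body);
      // React to the event, e.g. refresh a cache or trigger a downstream sync.
      console.log('Job completion notification received:', notification);
      res.writeHead(204).end();
    });
  } else {
    res.writeHead(404).end();
  }
});

server.listen(8080);
```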
6. Content Delivery Strategy
A Content Delivery Strategy is a plan for how you will deliver content to your users. It is a combination of the content delivery network (CDN) and the caching service that you choose to use.
A CDN is a network of servers that are distributed around the world. These servers are called edge servers. When a user requests content, the CDN determines which edge server is closest to the user and delivers the content from that edge server. This reduces the time it takes for the user to receive the content.

A caching service is a service that stores content in a cache. When a user requests content, the caching service checks the cache to see if the content is already there. If it is, the caching service delivers the content from the cache. If it is not, the caching service retrieves the content from the origin server, stores it in the cache, and then delivers it to the user. This reduces the time it takes for the user to receive the content.
As mentioned above, the Content Management API is designed to handle large amounts of data. However, in some cases, it is not possible to use this API directly in a frontend application. To meet this need, the Data Factory platform is equipped with tools for exporting large volumes of data into systems dedicated to caching your data, so that it can be consumed by data-intensive applications. If you have any questions about the architecture to implement, our teams are here to help you.
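As an illustration of the read-through caching pattern described above, here is a minimal sketch using Redis via the `ioredis` package. The key scheme, the one-hour TTL and the `fetchItemFromProductLive` helper are assumptions; in practice the cache would be fed by the Content Management API or by a Data Factory export.

```ts
// Minimal read-through cache sketch. Key scheme, TTL and the helper that loads
// an item from Product-Live are assumptions for illustration only.
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL || 'redis://localhost:6379');

// Hypothetical helper that loads an item (e.g. through the Content Management
// API); its implementation is out of scope here.
declare function fetchItemFromProductLive(itemId: string): Promise<unknown>;

export async function getItem(itemId: string): Promise<unknown> {
  const key = `item:${itemId}`;
  const cached = await redis.get(key);
  if (cached) {
    // Cache hit: serve without touching the origin.
    return JSON.parse(cached);
  }
  // Cache miss: fetch from the origin, then store with a one-hour TTL.
  const item = await fetchItemFromProductLive(itemId);
  await redis.set(key, JSON.stringify(item), 'EX', 3600);
  return item;
}
```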
Use case example
You'll find below a list of use cases based on our experience:
- Caching and optimizing image content for web use
- Making one of the product data sheets available in JSON format for rapid consumption by a mobile application
- Optimizing and caching your product data to make it available offline within one of your applications
TIP: For more information, please refer to the Content Delivery Strategy section.
7. Specialized processing
For some very specific use cases, when a high level of customization is required, or for security or performance reasons, the Product-Live Data Factory platform is able to offload specific task processing to a third-party infrastructure. The use cases we have encountered so far are as follows:
- Connecting to a specific system that is not publicly accessible
- Processing highly sensitive information that cannot be exposed to the outside world
- Running very intensive processing that requires very specific hardware
- Handling specific tasks that are not natively available in the Data Factory platform
For any questions regarding this approach, please contact our team, who will be able to guide you in your decision.
A detailed example of this approach is available in the Specialized Processing section.