Data Factory Pipelines
Introduction
Pipelines in Product-Live Data Factory are execution queues in which jobs run. They let you isolate and control the execution of your jobs.
Every job must be associated with one and only one pipeline. Pipelines are configured by our Professional Services team. Contact your Product-Live representative to set up or add pipelines on your account.
How pipelines work
A pipeline follows these core rules:
- Only one job can run on a pipeline at a time
- The scheduling strategy is First In, First Out (FIFO): the first job submitted is the first job executed
- Scheduling is cooperative (no preemption): a running job must complete before the next one can start, no matter how long it takes
- If a running job is cancelled, it releases the pipeline immediately, allowing the next queued job to start
TIP
If you need more than one simultaneous execution on a pipeline, please contact our support team.
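The rules above can be modeled as a single running slot plus a FIFO waiting queue. The sketch below is an illustration of those semantics only, not Product-Live's actual implementation, and the job names are hypothetical:

```python
from collections import deque

class Pipeline:
    """Toy model of the rules above: one running job at a time,
    a FIFO waiting queue, and no preemption."""

    def __init__(self):
        self.running = None   # the single job allowed to run
        self.queue = deque()  # FIFO waiting queue

    def launch(self, job):
        # Free pipeline: the job starts immediately.
        if self.running is None:
            self.running = job
            return "running"
        # Busy pipeline: the job joins the back of the queue.
        self.queue.append(job)
        return "queued"

    def finish(self):
        # A job ended (completed, failed, or cancelled): the oldest
        # queued job, if any, is launched next.
        self.running = self.queue.popleft() if self.queue else None
        return self.running

p = Pipeline()
p.launch("nightly-export")   # -> "running"
p.launch("manual-import")    # -> "queued"
p.launch("test-export")      # -> "queued"
p.finish()                   # -> "manual-import" starts next (FIFO)
```

Note that `finish` covers every way an execution can end, including cancellation, which is why cancelling a running job immediately frees the pipeline for the next queued job.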
When a job is launched
When you launch a job, the pipeline checks whether it is available. If the pipeline is free, the job starts immediately. If the pipeline is busy, the job is placed in a waiting queue and will be executed when the pipeline becomes available.
```mermaid
flowchart TD
launch((Job is launched))
available{Is the pipeline<br/>available?}
execute[Execute job immediately]
queue[Place job in the<br/>waiting queue]
done(End)
launch --> available
available -- Yes --> execute
available -- No --> queue
execute --> done
queue --> done
classDef green fill:#9f6,stroke:#333,stroke-width:2px;
class launch green
```

When a job finishes
When a job execution ends (whether it completed, failed, or was cancelled), the pipeline checks its waiting queue. If there are jobs waiting, the oldest one is selected and launched next.
```mermaid
flowchart TD
finished((Job execution ends))
empty{Is the waiting<br/>queue empty?}
select[Select the oldest<br/>job in the queue]
start[Launch the selected job]
done(End)
finished --> empty
empty -- Yes --> done
empty -- No --> select
select --> start
start --> done
classDef green fill:#9f6,stroke:#333,stroke-width:2px;
class finished green
```

Queue position visibility
When a job is waiting in the queue, Product-Live displays the number of jobs ahead of it in the queue, so you can estimate when your job will start.
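Conceptually, the displayed figure is just the count of jobs queued before yours. A minimal sketch (hypothetical job names, not product code):

```python
from collections import deque

def jobs_ahead(queue, job):
    """Number of jobs queued before `job` in a FIFO waiting queue --
    the position figure shown for a waiting job."""
    return list(queue).index(job)

waiting = deque(["scheduled-import", "test-export", "manual-import"])
jobs_ahead(waiting, "manual-import")  # -> 2: two jobs will run first
```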
When to use multiple pipelines
Consider requesting additional pipelines if:
- You want to protect critical jobs: for example, you have time-sensitive jobs (scheduled nightly exports, automated imports) that must not be delayed. Assign them to a dedicated pipeline so that less critical jobs cannot block them.
- You want to separate non-critical jobs: reporting, testing, or ad-hoc jobs where waiting time is acceptable can share a separate pipeline without impacting your critical workflows.
The diagram below illustrates how two pipelines can isolate workloads:
```mermaid
flowchart LR
subgraph pipeline_critical ["Pipeline: Critical jobs"]
direction TB
pj1["Nightly export — Running"]
pj2["Scheduled import — Waiting (1st)"]
pj1 ~~~ pj2
end
subgraph pipeline_other ["Pipeline: Non-critical jobs"]
direction TB
sj1["Ad-hoc report — Running"]
sj2["Test export — Waiting (1st)"]
sj3["Manual import — Waiting (2nd)"]
sj1 ~~~ sj2 ~~~ sj3
end
```

In this example, critical jobs are never delayed by non-critical jobs; the longer waiting times stay on the non-critical pipeline, where your teams can tolerate them.
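The isolation can be pictured as two fully independent queues. This conceptual sketch reuses the job names from the diagram and is not product code:

```python
from collections import deque

# Each pipeline owns its running slot and its FIFO waiting queue;
# they share nothing, so load on one never delays the other.
critical = {"running": "nightly-export",
            "queue": deque(["scheduled-import"])}
non_critical = {"running": "ad-hoc-report",
                "queue": deque(["test-export", "manual-import"])}

# Wait time on the critical pipeline depends only on its own queue:
print(len(critical["queue"]))      # 1 job waiting
print(len(non_critical["queue"]))  # 2 jobs waiting, with no effect next door
```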
For technical details on scheduling rules and job execution lifecycle, see the Pipelines reference.
FAQ
Is it possible to have multiple pipelines on my account?
Yes. For more information on how to purchase additional pipelines, please contact our support team.
Can I run multiple jobs at the same time on one pipeline?
No. A pipeline allows only one job to run at a time. If you need concurrent executions, you can purchase additional pipelines or contact our support team.
What happens if I launch a job while the pipeline is busy?
The job is placed in a FIFO waiting queue. It will be automatically executed when all previously queued jobs have completed. You can see your position in the queue from the application.
Can I cancel a job that is waiting in the queue?
Yes. Cancelling a queued job removes it from the waiting queue without affecting other jobs.
Can I cancel a running job to free the pipeline?
Yes. Cancelling a running job releases the pipeline immediately and the next queued job will start.
How should I organize my pipelines?
A common pattern is to separate highly critical jobs from less critical or experimental jobs, so that test runs never delay critical production workflows. If you have time-sensitive scheduled jobs, consider giving them a dedicated pipeline.