Data Factory reference documentation
This section collects the core reference documentation for the Product-Live Data Factory platform. It explains how jobs are defined, how they run, which tasks are available, and how to configure cross‑cutting concerns such as variables and error handling.
Use the following pages as building blocks when designing and operating your Data Factory workflows:
- Jobs: Define what runs. A job describes a set of tasks executed in a specific order to implement a business process (import, transformation, export, etc.).
- Job executions: Inspect how jobs run. A job execution is a concrete run of a job, with its status, inputs, outputs, and task history.
- Tasks: Use available building blocks. Each task is a single operation (data transformation, import, export, HTTP call, notification, etc.) that can be orchestrated inside a job.
- Variables: Share and reuse values. Variables (global and local) let you store configuration and intermediate results across tasks and jobs.
- Handling failures: Control timeouts and errors. Learn how to configure job‑level timeouts and react to task failures within a job.
- Best practices: Apply recommended patterns. Naming conventions and design guidance to keep your Data Factory setup maintainable and consistent.
- Specialized processing: Extend the platform. Delegate specific task processing to your own infrastructure when you need advanced customization, security, or performance.
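To make these building blocks concrete, the sketch below shows what a job definition tying them together might look like: an ordered list of tasks, a variable shared between them, and a job-level timeout. This is an illustrative example only; the field names, task types, and overall schema here are assumptions, not the platform's actual format. Refer to the Jobs, Tasks, and Variables pages for the authoritative syntax.

```json
{
  "name": "daily-product-import",
  "timeout": "PT30M",
  "tasks": [
    {
      "key": "import-file",
      "type": "file-import",
      "comment": "Hypothetical task: load the source file and store its id in a variable"
    },
    {
      "key": "transform",
      "type": "data-transformation",
      "comment": "Hypothetical task: read the variable set by the previous task and transform the data"
    },
    {
      "key": "notify-on-failure",
      "type": "notification",
      "comment": "Hypothetical task: react to an upstream task failure, as covered in Handling failures"
    }
  ]
}
```

The ordering of the `tasks` array reflects the execution order described in the Jobs page; each run of this definition would appear as a separate job execution with its own status and task history.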