Data Pipelines with Apache Airflow
Data Pipelines with Apache Airflow teaches you how to build and maintain effective data pipelines.
Summary
A successful pipeline moves data efficiently, minimizing pauses and blockages between tasks, keeping every process along the way operational. Apache Airflow provides a single customizable environment for building and managing data pipelines, eliminating the need for a hodgepodge collection of tools, snowflake code, and homegrown processes. Using real-world scenarios and examples, Data Pipelines with Apache Airflow teaches you how to simplify and automate data pipelines, reduce operational overhead, and smoothly integrate all the technologies in your stack.
Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.
About the technology
Data pipelines manage the flow of data from initial collection through consolidation, cleaning, analysis, visualization, and more. Apache Airflow provides a single platform you can use to design, implement, monitor, and maintain your pipelines. Its easy-to-use UI, plug-and-play options, and flexible Python scripting make Airflow perfect for any data management task.
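To give a rough taste of that Python scripting, the sketch below defines a minimal two-task pipeline (assuming Airflow 2.4 or later; the DAG id, task names, and callables are made up for illustration, not taken from the book):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def _fetch_data():
    # Placeholder for an extraction step (e.g. pulling from an API).
    print("fetching data...")


def _clean_data():
    # Placeholder for a transformation/cleaning step.
    print("cleaning data...")


with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # run once per day
    catchup=False,       # don't automatically run past intervals
) as dag:
    fetch = PythonOperator(task_id="fetch_data", python_callable=_fetch_data)
    clean = PythonOperator(task_id="clean_data", python_callable=_clean_data)

    fetch >> clean  # clean_data runs only after fetch_data succeeds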
About the book
Data Pipelines with Apache Airflow teaches you how to build and maintain effective data pipelines. You’ll explore the most common usage patterns, including aggregating multiple data sources, connecting to and from data lakes, and cloud deployment. Part reference and part tutorial, this practical guide covers every aspect of the directed acyclic graphs (DAGs) that power Airflow, and how to customize them for your pipeline’s needs.
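As a flavor of that customization, and of the custom components the book covers in chapter 8, here is a minimal sketch of a custom operator; the class name and greeting logic are hypothetical, not from the book:

from airflow.models.baseoperator import BaseOperator


class GreetOperator(BaseOperator):
    """Toy operator that logs a greeting for a configurable name."""

    def __init__(self, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        # execute() is called when the task instance runs.
        self.log.info("Hello, %s!", self.name)
        return self.name  # returned value is pushed to XCom for downstream tasks

Inside a DAG it is used like any built-in operator, for example GreetOperator(task_id="greet", name="Airflow").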
What's inside
Build, test, and deploy Airflow pipelines as DAGs
Automate moving and transforming data
Analyze historical datasets using backfilling
Develop custom components
Set up Airflow in production environments
About the reader
For DevOps, data engineers, machine learning engineers, and sysadmins with intermediate Python skills.
About the author
Bas Harenslak and Julian de Ruiter are data engineers with extensive experience using Airflow to develop pipelines for major companies. Bas is also an Airflow committer.
Table of Contents
PART 1 - GETTING STARTED
1 Meet Apache Airflow
2 Anatomy of an Airflow DAG
3 Scheduling in Airflow
4 Templating tasks using the Airflow context
5 Defining dependencies between tasks
PART 2 - BEYOND THE BASICS
6 Triggering workflows
7 Communicating with external systems
8 Building custom components
9 Testing
10 Running tasks in containers
PART 3 - AIRFLOW IN PRACTICE
11 Best practices
12 Operating Airflow in production
13 Securing Airflow
14 Project: Finding the fastest way to get around NYC
PART 4 - IN THE CLOUDS
15 Airflow in the clouds
16 Airflow on AWS
17 Airflow on Azure
18 Airflow in GCP