Getting Started with Airflow: Setup Simplified

By otgiqpmy

Welcome to the third installment of our series on Apache Airflow. If the previous posts have piqued your interest, you’re probably wondering about the practical side of getting Airflow up and running. This post walks you through the setup process, highlighting its simplicity and integration capabilities.

Ease of Setup

One of Airflow’s biggest advantages is its straightforward setup process. Even if you’re not a seasoned developer, you can get Airflow operational with minimal fuss. This ease of setup matters for businesses looking to adopt a workflow management solution quickly.

Requirements and Installation

Before we dive into the installation process, make sure Python 3 and pip are installed on your system, since Airflow itself is distributed as a Python package.
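
For example, on Linux or macOS you might first check your Python version and create an isolated virtual environment so Airflow’s dependencies stay separate from the rest of your system (the environment name airflow-env below is just an example):

python3 --version
python3 -m venv airflow-env
source airflow-env/bin/activate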

Here’s a step-by-step guide to installing Apache Airflow:

1. Install Airflow: Open your terminal and run:

pip install apache-airflow

2. Initialize the Database: Airflow uses a database to store its operational data (DAG runs, task states, users, and so on). By default it uses SQLite, which is created alongside Airflow’s configuration file in the AIRFLOW_HOME directory (~/airflow unless you set it otherwise).

airflow db init

3. Create a User: To log in to the web interface, create an admin user (you will be prompted for a password unless you pass one with --password):

airflow users create \
    --username [your_username] \
    --firstname [Your_Firstname] \
    --lastname [Your_Lastname] \
    --role Admin \
    --email [your_email]

4. Start the Web Server: By default, the web server listens on port 8080; once it is up, open http://localhost:8080 in your browser and log in with the user you just created.

airflow webserver --port 8080

5. Start the Scheduler: The scheduler is the component that actually triggers your tasks. Open another terminal (with the same environment activated) and run:

airflow scheduler

That’s it! You now have Airflow running on your system.
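
If you are just experimenting locally and are on Airflow 2.2 or later, there is also a shortcut worth knowing about: the standalone command initializes the database, creates an admin login for you, and starts both the web server and the scheduler in one step:

airflow standalone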

First DAG

Now, let’s create your first DAG to illustrate the basics:

1. Create a DAG File: In your AIRFLOW_HOME directory (~/airflow by default), create a dags folder if it does not already exist, then add a Python file named hello_world_dag.py inside it.

2. Write the DAG: Paste the following code in your hello_world_dag.py file:

from airflow import DAG
from airflow.operators.python import PythonOperator  # in Airflow 2.x, PythonOperator lives in airflow.operators.python
from datetime import datetime

def hello_world():
    print("Hello, world!")

# Run every day at 12:00; catchup=False prevents Airflow from back-filling
# runs for the dates between start_date and today.
dag = DAG('hello_world_dag',
          description='Simple tutorial DAG',
          schedule_interval='0 12 * * *',
          start_date=datetime(2023, 1, 1),
          catchup=False)

hello_operator = PythonOperator(task_id='hello_task',
                                python_callable=hello_world,
                                dag=dag)

3. Activate the DAG: Go to the Airflow web interface, toggle on hello_world_dag, and it will start running according to its schedule. (You can also exercise it from the command line, as shown below.)
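
Before (or instead of) waiting for the scheduled run, you can confirm that Airflow has picked up the file and run the task once in isolation. The date below is just an example logical date:

airflow dags list
airflow tasks test hello_world_dag hello_task 2023-01-01

The tasks test command runs a single task without recording any state in the database, which makes it a convenient way to debug a new DAG.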

Non-Technical Perspective

For non-technical users, the key takeaway is that Airflow’s setup and integration are intuitive. It doesn’t require deep technical expertise to get a basic workflow up and running, and it integrates seamlessly with various data sources and processing tools, making it a versatile addition to any data infrastructure.

Technical Perspective

This simple example of installing Airflow and creating a basic DAG demonstrates the platform’s accessibility. Airflow’s power lies in its ability to handle much more complex workflows, but the ease with which you can set up and start experimenting is a significant advantage.
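
As a taste of what a slightly larger workflow looks like, here is a minimal sketch of a DAG with three tasks and explicit dependencies. The task names, commands, and daily schedule are purely illustrative:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def transform():
    # Placeholder for real transformation logic.
    print("Transforming data...")


# Defining tasks inside the `with DAG(...)` block attaches them to the DAG
# without having to pass dag=... to every operator.
with DAG('example_etl_dag',
         description='Illustrative DAG with task dependencies',
         schedule_interval='@daily',
         start_date=datetime(2023, 1, 1),
         catchup=False) as dag:

    extract = BashOperator(task_id='extract',
                           bash_command='echo "extracting data"')

    transform_task = PythonOperator(task_id='transform',
                                    python_callable=transform)

    load = BashOperator(task_id='load',
                        bash_command='echo "loading data"')

    # Run the tasks in order: extract, then transform, then load.
    extract >> transform_task >> load

The >> operator declares task ordering (it is shorthand for set_downstream), and the web UI renders this chain as a three-node graph.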


Stay tuned for our next blog post, where we’ll explore how Airflow connects and integrates with database infrastructures. As always, if you have any questions or need assistance in setting up Airflow for your business, feel free to reach out to our team. Happy data processing!
