
Data Pipelines: OpenWeatherMap-Airflow [A How-To Guide]

by ashish ghimireFebruary 18th, 2020

Too Long; Didn't Read

In this article, we will learn how to develop an ETL (Extract, Transform, Load) pipeline using Apache Airflow. We will call the OpenWeatherMap API, save the daily data it returns to a local directory, set up a database, and define an Airflow DAG to orchestrate the steps. The complete code can be found in the article's GitHub repository. This article builds on Michael Harmon's publication.


In this article, we will learn how to develop an ETL (Extract, Transform, Load) pipeline using Apache Airflow. Here is a list of the things we will do:

  • Call an API
  • Set up a database
  • Set up Airflow

Call an API

We will create a module getWeather.py, and inside it we will create a get_weather() function which will call the API.
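The article's original code block is not reproduced here, so below is a minimal sketch of what get_weather() might look like. The endpoint is OpenWeatherMap's real current-weather URL, but the API key, city name, and output file layout are assumptions you should adapt to your own setup.

```python
import json
import os
from datetime import date
from urllib.request import urlopen

# Hypothetical placeholders -- substitute your own API key and city.
API_KEY = "YOUR_OPENWEATHERMAP_API_KEY"
CITY = "brooklyn"


def build_url(city, api_key):
    """Build the OpenWeatherMap current-weather endpoint URL for a city."""
    return ("https://api.openweathermap.org/data/2.5/weather"
            f"?q={city}&appid={api_key}")


def get_weather():
    """Fetch today's weather for CITY and save it as a dated JSON file."""
    url = build_url(CITY, API_KEY)
    with urlopen(url) as response:
        payload = json.load(response)
    # One file per day, e.g. data/2020-02-18.json
    out_path = os.path.join("data", f"{date.today()}.json")
    with open(out_path, "w") as f:
        json.dump(payload, f)
    return out_path
```

Writing one dated JSON file per day keeps each Airflow run idempotent: re-running a day's task simply overwrites that day's file.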

We will then create a data/ directory where we will save the daily data obtained from the API. We do this in the createDirectory() function.
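The article does not include the body of createDirectory(), so here is a minimal sketch of what it could be, assuming only that it creates the data/ directory if it does not already exist:

```python
import os


def createDirectory():
    """Create the data/ directory for the daily weather files, if missing."""
    os.makedirs("data", exist_ok=True)
```

Using exist_ok=True makes the function safe to call on every run, so the DAG does not fail when the directory is already there.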

Setup Database

We will create a module createTable.py, and inside it we will create a make_database() function which will create the database.
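The article does not show make_database(), so the sketch below uses SQLite from Python's standard library for illustration; the original may well use a different engine, and the table name and columns here are assumptions, not the article's schema.

```python
import sqlite3


def make_database(db_path="weather.db"):
    """Create the weather database and its table if they do not exist."""
    conn = sqlite3.connect(db_path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS weather (
            city        TEXT,
            date        TEXT,
            temperature REAL,
            description TEXT,
            PRIMARY KEY (city, date)
        )
    """)
    conn.commit()
    conn.close()
```

The CREATE TABLE IF NOT EXISTS guard, like the directory helper above, makes the setup step safe to re-run on every scheduled execution.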

Setup Airflow

In order to use Airflow, you will have to set it up first; the Airflow installation documentation explains how.

Once Airflow has been set up, we will define our DAG.
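The DAG definition is not reproduced in the article, so here is a sketch of what it might look like, using the helper functions introduced above. The DAG id, schedule, default arguments, and task ordering are illustrative assumptions; the import paths follow Airflow 1.x, which was current when this article was written.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Helpers from the modules created earlier in this article.
from getWeather import get_weather, createDirectory
from createTable import make_database

default_args = {
    "owner": "airflow",
    "start_date": datetime(2020, 1, 1),
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    "openweathermap_etl",          # hypothetical DAG id
    default_args=default_args,
    schedule_interval="@daily",    # run once per day
)

task_create_directory = PythonOperator(
    task_id="create_directory",
    python_callable=createDirectory,
    dag=dag,
)

task_make_database = PythonOperator(
    task_id="make_database",
    python_callable=make_database,
    dag=dag,
)

task_get_weather = PythonOperator(
    task_id="get_weather",
    python_callable=get_weather,
    dag=dag,
)

# Ensure the directory and database exist before fetching data.
[task_create_directory, task_make_database] >> task_get_weather
```

With this file in Airflow's dags/ folder, you could trigger a run manually with the Airflow 1.x CLI, e.g. `airflow trigger_dag openweathermap_etl`, or let the scheduler run it daily.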

Now we can run our DAG from Apache Airflow.

The complete code for this article can be found in this GitHub repository.

Special thanks to Michael Harmon; this article builds on his publication, which you can find here.