A worker to log stats from the YouTube API

Alchemist Camp has been growing much more quickly in its third week than in its first two, so let's make an app to fetch stats from YouTube and log them every 6 hours!

We'll use YouTube's API, HTTPoison to fetch from it, Poison to decode the JSON, and make an OTP app to supervise the scheduler.

(Source code available for premium members)

The YouTube API

The newest version of the YouTube API as of this writing is version 3. For now, the only stats we're interested in are a channel's subscriber and view counts. We can get those from the Channels: list endpoint of the API. We'll request a single channel by ID and then read views and subscribers from the statistics portion of the response.

Since we're not setting up OAuth or requesting any non-public information, a plain API key is all we need; the whole request is just a GET to a URL with a few query-string parameters.

Once we've put together the URL, with the channel ID and API key as query-string parameters, we can paste it into the browser to make sure it's right before storing it, and see what the response data looks like.
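Assuming the standard v3 Channels: list endpoint, the URL might be assembled like this (the channel ID and API key below are placeholders, not real values):

```elixir
# Hypothetical placeholders; substitute your own channel ID and API key.
channel_id = "UCYOUR_CHANNEL_ID"
api_key = "YOUR_API_KEY"

url =
  "https://www.googleapis.com/youtube/v3/channels" <>
    "?part=statistics&id=#{channel_id}&key=#{api_key}"
```

Pasting the resulting URL into a browser should return a JSON document with an "items" list, where each item carries a "statistics" map.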

Creating the StatWatch module (5:47)

First, we'll need to add the following dependencies to our mix.exs:

  • Poison (for encoding and decoding JSON)
  • HTTPoison (for making HTTP requests)
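In mix.exs, the deps function might look like this (the version requirements are illustrative; check Hex for current releases):

```elixir
# In mix.exs. Run `mix deps.get` after adding these.
defp deps do
  [
    {:httpoison, "~> 1.0"},  # HTTP client
    {:poison, "~> 3.1"}      # JSON encoder/decoder
  ]
end
```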

The module will also need the following functions:

  • stat_url: Returns the URL we built above. It's hard-coded.
  • column_names: Returns the first line of a CSV file with our column headers
  • fetch_stats: Uses HTTPoison.get! to make a request to the stat_url and decodes the JSON response with Poison.decode!
  • save_csv: Appends a row of stats to a CSV file
  • run: Calls fetch_stats and pipes the response into save_csv
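Putting those pieces together, a sketch of the module might look like the following. The channel ID, API key, CSV file name, and exact column set are assumptions for illustration:

```elixir
defmodule StatWatch do
  @moduledoc "Fetches YouTube channel stats and appends them to a CSV file."

  # Hypothetical placeholders; substitute your own channel ID and API key.
  @channel_id "UCYOUR_CHANNEL_ID"
  @api_key "YOUR_API_KEY"
  @file_name "stats.csv"

  def run do
    fetch_stats() |> save_csv()
  end

  def stat_url do
    "https://www.googleapis.com/youtube/v3/channels" <>
      "?part=statistics&id=#{@channel_id}&key=#{@api_key}"
  end

  def column_names, do: "Time,Subscribers,Views\n"

  def fetch_stats do
    # Pattern-match the statistics map out of the decoded JSON response.
    %{"items" => [%{"statistics" => stats}]} =
      HTTPoison.get!(stat_url()).body |> Poison.decode!()

    [
      DateTime.utc_now() |> DateTime.to_iso8601(),
      stats["subscriberCount"],
      stats["viewCount"]
    ]
    |> Enum.join(",")
  end

  def save_csv(row) do
    # Write the header row first if the file doesn't exist yet.
    unless File.exists?(@file_name), do: File.write!(@file_name, column_names())
    File.write!(@file_name, row <> "\n", [:append])
  end
end
```

Running StatWatch.run in iex should append one timestamped row of stats to stats.csv each time it's called.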

Creating an OTP application to call StatWatch (19:02)

Since we're using Elixir, we won't need a cron job or any similar external tool to regularly call our StatWatch module. OTP applications are ideal for the task.

First we'll have to add the following to the application function in our mix.exs:

mod: {StatWatch.Application, []},

Then we'll create a StatWatch.Application module. The only change from the default Application mix creates (when --sup is passed) is one worker we'll add inside the children list:

worker(StatWatch.Scheduler, [])
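For reference, a minimal StatWatch.Application along those lines, using the Supervisor.Spec helpers that mix generates at the time of writing (the supervisor name is an assumption):

```elixir
defmodule StatWatch.Application do
  use Application

  def start(_type, _args) do
    import Supervisor.Spec

    # The one change from the generated template: supervise the scheduler.
    children = [
      worker(StatWatch.Scheduler, [])
    ]

    opts = [strategy: :one_for_one, name: StatWatch.Supervisor]
    Supervisor.start_link(children, opts)
  end
end
```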

Finally, we need to create the StatWatch.Scheduler file. It will use GenServer and is very basic. Other than init we only need two functions:

  • handle_info(:work, state): Calls StatWatch.run and then schedule_work.
  • schedule_work: Uses Process.send_after to send the message :work to the current process after 6 hours.
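A sketch of that scheduler, assuming a hard-coded six-hour interval:

```elixir
defmodule StatWatch.Scheduler do
  use GenServer

  @six_hours 6 * 60 * 60 * 1000

  def start_link do
    GenServer.start_link(__MODULE__, %{})
  end

  def init(state) do
    # Do the first run immediately by calling handle_info directly.
    {:noreply, new_state} = handle_info(:work, state)
    {:ok, new_state}
  end

  def handle_info(:work, state) do
    StatWatch.run()
    schedule_work()
    {:noreply, state}
  end

  defp schedule_work do
    # Send :work back to this process after six hours.
    Process.send_after(self(), :work, @six_hours)
  end
end
```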

Upon starting the application, init calls handle_info with the message :work and the initial state. Then, handle_info calls run on the StatWatch module, which fetches and saves our stats, and finally calls schedule_work. Six hours later, schedule_work will send the :work message, which handle_info will catch, fetching more stats and calling schedule_work again.

On and on it will go every six hours.


Take this project as a starting point and add APIs for it to call. Maybe log the followers on a Twitter profile or a site's Alexa Rank.

