
Making a Blog Using Notion, Gatsby, and Github Pages

Cover image: https://images.unsplash.com/photo-1432821596592-e2c18b78144f?ixlib=rb-1.2.1&q=85&fm=jpg&crop=entropy&cs=srgb&ixid=eyJhcHBfaWQiOjYzOTIxfQ

Note: This is deprecated. Notion released the official API and I didn't update this article accordingly. This current blog is not using Notion anymore.

Lately, I decided to create this static blog using the great GatsbyJs. Of course, a question quickly arose: what data source am I going to use to populate it?

I really like Notion, I use it for note-taking, internal business documentation and I'll probably be using it to write my blog posts, so why not use it directly as a content manager?

After searching the web for this idea, I came across a blog post by Tony Faieta and decided to iterate on what he did.

Specs of the Project

  • I would like to have a root Notion note with a list of sub-notes that correspond to all my blog posts. The root note acts as the index of my blog.
  • I would like my blog to import the notes from Notion locally.
  • I would like my blog to download the images from the notes locally.
  • I would like any video link to be displayed properly.
  • I would like my blog to regenerate the static website:
    • each time I change the code
    • every day
    • when I manually trigger it

Importing Notes From Notion

To get the data from Notion, the most flexible tool I found is the notion-py Python library. It retrieves a Notion note modeled as an array of blocks, each block being a text line, a title, an image, a list item, etc.

I found markdown to be the best format for importing Notion notes because the Gatsby plugin gatsby-transformer-remark handles it very well and has its own plugin to display all kinds of videos. Using markdown also makes it easy to back up the posts.
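To give an idea of the conversion, here is a minimal sketch (not the actual script): it maps a few simplified block dicts to markdown lines. The dict shape is a hypothetical simplification of notion-py's block objects, which expose richer attributes.

```python
# Minimal sketch: convert simplified Notion-like blocks to markdown.
# The block dicts here are a hypothetical stand-in for notion-py's
# block objects; the real script handles many more block types.

def block_to_markdown(block):
    kind = block["type"]
    text = block.get("title", "")
    if kind == "header":
        return f"# {text}"
    if kind == "sub_header":
        return f"## {text}"
    if kind == "bulleted_list":
        return f"- {text}"
    if kind == "image":
        return f"![]({block['source']})"
    return text  # plain text line

def blocks_to_markdown(blocks):
    # Separate blocks with blank lines, as markdown paragraphs expect.
    return "\n\n".join(block_to_markdown(b) for b in blocks)

blocks = [
    {"type": "header", "title": "My Post"},
    {"type": "text", "title": "Hello from Notion."},
    {"type": "bulleted_list", "title": "first item"},
]
print(blocks_to_markdown(blocks))
```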

After a bit of hacking, I've made the following script: get-blog-posts.py

It uses two environment variables: NOTION_TOKEN, your Notion token, and NOTION_ROOT_PAGE_ID, the id of the root page.

The Notion token lets the library access the private notes of your Notion account; here are the instructions to find it.

Getting the page id of your root page is easy: if the link is https://www.notion.so/Blog-83f4047341534d6bb846b1f561a13173, the id is 83f4047341534d6bb846b1f561a13173.
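A small helper can pull the id out of the URL; this is a sketch assuming the id is always the last dash-separated segment of the slug:

```python
def page_id_from_url(url):
    # The page id is the last dash-separated segment of the slug.
    return url.rstrip("/").rsplit("-", 1)[-1]

print(page_id_from_url("https://www.notion.so/Blog-83f4047341534d6bb846b1f561a13173"))
# 83f4047341534d6bb846b1f561a13173
```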

The script will download any image URL. It will also convert any text line starting with == into a markdown frontmatter entry.

Let's say you have this in your Notion note:

== date: 2020-09-17
== description: How to use notion as a content manager to generate a static blog.

In markdown it will become:

---
date: 2020-09-17
description: How to use Notion as a content manager to generate a static blog.
---
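The conversion itself is straightforward. Here is a hedged sketch of the idea (the actual script handles more cases, such as body lines mixed between metadata lines):

```python
def to_frontmatter(lines):
    """Turn `== key: value` lines into a markdown frontmatter block."""
    # Lines prefixed with `==` become frontmatter entries...
    meta = [line[2:].strip() for line in lines if line.startswith("==")]
    # ...and everything else stays in the post body.
    body = [line for line in lines if not line.startswith("==")]
    return "\n".join(["---", *meta, "---", *body])

note = [
    "== date: 2020-09-17",
    "== description: How to use Notion as a content manager.",
    "The post body starts here.",
]
print(to_frontmatter(note))
```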

The cover image of the page will also be downloaded and its path added to the frontmatter:

---
...
featured: '0ca01e654f193b6555f25ba69613ee1ad67d8e71.jpeg'
---
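One way to get stable local filenames like the one above is to derive the name from a hash of the image URL, so re-running the import does not duplicate files. This naming scheme is an assumption for illustration, not necessarily what the script does:

```python
import hashlib
import os
from urllib.parse import urlparse

def image_filename(url):
    # Assumption for illustration: name the file after the SHA-1 of its URL,
    # so the same image always maps to the same local filename.
    ext = os.path.splitext(urlparse(url).path)[1] or ".jpeg"
    return hashlib.sha1(url.encode()).hexdigest() + ext

# The download itself could then be a simple urllib call (sketch):
# from urllib.request import urlretrieve
# urlretrieve(url, os.path.join("content/blog/my-post", image_filename(url)))

print(image_filename("https://example.com/cover.jpeg"))
```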

The script stores the imported blog posts in the content/blog/ directory. Then I tell the Gatsby remark plugin to generate the pages using the markdown files. At this point, the blog is running.

Continuous Deployment

I want to deploy on GitHub Pages. To do this manually, we can use the npm package gh-pages like this: gh-pages -d public. This is nice, but I prefer to automate this part.

I use a "workflow" made with GitHub Actions. I create a deploy.yml file in the .github/workflows directory of my repository:

name: Build and Deploy
on:
  schedule:
    - cron: "0 0 * * *" # Every day at midnight.
  workflow_dispatch:
  push:
    branches:
      - master
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Set up Python 3.8
        uses: actions/setup-python@v2
        with:
          python-version: 3.8

      - name: Set up pipenv
        uses: dschep/install-pipenv-action@v1
        with:
          version: 2020.8.13

      - name: Checkout 🛎️
        uses: actions/checkout@v2.3.2
        with:
          persist-credentials: false

      - name: Install and Build 🔧
        env:
          NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}
          NOTION_ROOT_PAGE_ID: ${{ secrets.NOTION_ROOT_PAGE_ID }}
        run: |
          pipenv install
          pipenv run python ./bin/get-blog-posts.py
          yarn install
          yarn build

      - name: Deploy 🚀
        uses: JamesIves/github-pages-deploy-action@3.5.9
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BRANCH: gh-pages
          FOLDER: public
          CLEAN: true

In the on: section:

  • schedule runs the workflow at a given interval, here every day at midnight UTC.
  • workflow_dispatch gives you a button in the GitHub interface to trigger the deploy manually.
  • push triggers the deploy when you push to a specific branch.

Then the workflow performs the following deploy steps:

  • it creates a virtual machine running the latest Ubuntu
  • it sets up Python 3.8
  • it installs pipenv to get the same versions of the Python packages as in development
  • it checks out the code of the repository
  • it sets the environment variables and runs the commands to import the blog posts and build the static website
  • it deploys to GitHub Pages

And we're all set. We can now write our blog posts inside Notion. When we are done editing a post, we move the note into the root note and either wait for the next daily build or trigger a rebuild from the GitHub interface, and the blog gets updated. I really enjoy this workflow because there is very little friction to go from an idea to a blog post.

The source code of this blog is available here https://github.com/ArnaudValensi/ArnaudValensi.github.io

Thank you for reading, and tell me in the comments what your preferred way of editing a blog is.