Local DB Setup Guide

For anyone working on or branching off FREE-90, here are the steps to get the database running locally so you can use our new data. Hopefully this won’t be needed for long, since the new data will be deployed soon.

Install Docker Desktop

Set Up Hasura and Postgres

In the root directory of nss-scraper, run:

docker compose up -d
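If the repo doesn’t already ship a compose file, a minimal sketch of one for Hasura plus Postgres might look like the following. The service names, image tags, and port mappings here are assumptions rather than the project’s actual configuration; the password matches the Database URL used below.

```yaml
# Sketch of a docker-compose.yml for a local Hasura + Postgres stack.
# Image tags and port mappings are illustrative assumptions.
services:
  postgres:
    image: postgres:15
    restart: always
    environment:
      POSTGRES_PASSWORD: postgrespassword
    ports:
      - "5432:5432"   # exposed so insert_data.py can connect from the host
  graphql-engine:
    image: hasura/graphql-engine:v2.36.0
    restart: always
    depends_on:
      - postgres
    ports:
      - "8080:8080"   # Hasura console
    environment:
      HASURA_GRAPHQL_METADATA_DATABASE_URL: postgres://postgres:postgrespassword@postgres:5432/postgres
      HASURA_GRAPHQL_ENABLE_CONSOLE: "true"
```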

Create the Database

  • Open the Hasura console (by default at http://localhost:8080) and go to the “Data” tab

  • Press the “Connect Database” button and select “Postgres”

  • Name it whatever you want, and connect using the following Database URL:

postgres://postgres:postgrespassword@postgres:5432/postgres
  • Go to the “public” schema (folder icon) on the left and click the “Track All” button next to the “Untracked foreign-key relationships” section
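For reference, here is a rough sketch of the schema that the insert script further down assumes. The real tables are presumably created elsewhere (e.g. by the backend), so the column types and foreign keys below are guesses inferred from the JSON fields, not the project’s actual DDL:

```python
# Hypothetical DDL matching the columns insert_data.py writes to.
# Only the table and column names come from the insert script;
# the types and REFERENCES clauses are assumptions.
BUILDINGS_DDL = """
CREATE TABLE IF NOT EXISTS Buildings(
    "id"   text PRIMARY KEY,
    "name" text,
    "lat"  double precision,
    "long" double precision
)"""

ROOMS_DDL = """
CREATE TABLE IF NOT EXISTS Rooms(
    "id"         text PRIMARY KEY,
    "name"       text,
    "abbr"       text,
    "usage"      text,
    "capacity"   integer,
    "school"     text,
    "buildingId" text REFERENCES Buildings("id")
)"""

BOOKINGS_DDL = """
CREATE TABLE IF NOT EXISTS Bookings(
    "bookingType" text,
    "name"        text,
    "roomId"      text REFERENCES Rooms("id"),
    "start"       timestamptz,
    "end"         timestamptz
)"""

if __name__ == "__main__":
    # Print the statements so they can be pasted into Hasura's SQL tab if needed.
    for ddl in (BUILDINGS_DDL, ROOMS_DDL, BOOKINGS_DDL):
        print(ddl.strip() + ";\n")
```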

Run the Scraper

  • In the root directory of nss-scraper, run:

npm install && npm run scrape

  • There should now be a bunch of JSON files in the output directory.
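As an optional sanity check, the following sketch counts the records in the files the insert script expects. The file names are taken from that script, and "output" is assumed to be the scraper’s default directory:

```python
# Quick check that the scraper produced the files insert_data.py reads.
import json
from pathlib import Path

EXPECTED = ["buildings.json", "rooms.json", "bookings.json"]

def check_output(directory="output"):
    """Map each expected file to its record count, or None if missing/invalid."""
    counts = {}
    for name in EXPECTED:
        path = Path(directory) / name
        try:
            counts[name] = len(json.loads(path.read_text()))
        except (OSError, json.JSONDecodeError):
            counts[name] = None
    return counts

if __name__ == "__main__":
    for name, count in check_output().items():
        print(f"{name}: {count if count is not None else 'missing or invalid'}")
```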

Install psycopg2

  • Install Python 3 if you haven’t already

  • Install the Python package psycopg2, for example with:

pip3 install psycopg2

  • If the install fails (psycopg2 compiles against libpq), installing psycopg2-binary instead is a common workaround

Insert Data into Postgres

  • Create a file in the root directory of nss-scraper called insert_data.py with the following contents:

import json

import psycopg2
from psycopg2 import Error


def insert_buildings(cursor):
    cursor.execute("DELETE FROM Buildings")
    with open('output/buildings.json') as f:
        data = json.load(f)
    values = [(bldg['id'], bldg['name'], bldg['lat'], bldg['long']) for bldg in data]
    cmd = 'INSERT INTO Buildings("id", "name", "lat", "long") VALUES (%s, %s, %s, %s)'
    cursor.executemany(cmd, values)


def insert_rooms(cursor):
    cursor.execute("DELETE FROM Rooms")
    with open('output/rooms.json') as f:
        data = json.load(f)
    values = [(
        room['id'],
        room['name'],
        room['abbr'],
        room['usage'],
        room['capacity'],
        room['school'],
        '-'.join(room['id'].split('-')[:2])  # buildingId is the first two segments of the room id
    ) for room in data]
    cmd = ('INSERT INTO Rooms("id", "name", "abbr", "usage", "capacity", "school", "buildingId") '
           'VALUES (%s, %s, %s, %s, %s, %s, %s)')
    cursor.executemany(cmd, values)


def insert_bookings(cursor):
    cursor.execute("DELETE FROM Bookings")
    with open('output/bookings.json') as f:
        data = json.load(f)
    values = [(
        booking['bookingType'],
        booking['name'],
        booking['roomId'],
        booking['start'],
        booking['end']
    ) for booking in data]
    cmd = ('INSERT INTO Bookings("bookingType", "name", "roomId", "start", "end") '
           'VALUES (%s, %s, %s, %s, %s)')
    cursor.executemany(cmd, values)


if __name__ == '__main__':
    connection = None
    cursor = None
    try:
        connection = psycopg2.connect(
            user="postgres",
            password="postgrespassword",
            host="127.0.0.1",
            port="5432",
            database="postgres"
        )
        cursor = connection.cursor()
        insert_buildings(cursor)
        connection.commit()
        insert_rooms(cursor)
        connection.commit()
        insert_bookings(cursor)
        connection.commit()
    except (Exception, Error) as error:
        print("Error while connecting to PostgreSQL", error)
    finally:
        if connection:
            if cursor:
                cursor.close()
            connection.close()
            print("PostgreSQL connection is closed")
  • In the root directory of nss-scraper run

python3 insert_data.py
  • When it finishes, the tables on the Hasura console should be populated with data.

Ready to Go

  • You should now be able to run the backend and frontend as normal

  • To run the database again in the future, just open Docker Desktop and start the hasuragres container from the dashboard
