A Minimalist framework for exploring syntactic derivations, with built-in support for code-switching data.
The project is structured as a multi-package repository via Lerna (https://lernajs.io/) and Yarn (https://yarnpkg.com/) workspaces.
- The `django-backend` package manages the Django/Channels/Django REST Framework backend.
- The `react-frontend` package manages the React/Redux frontend.
- The `docs` package manages the project documentation.
- Frontend routing and server-side Channels operations are handled by a Django app named `frontend`.
- The actual frontend views and bundles are built via `create-react-app` and end up in the `build` folder of the `react-frontend` package. The Django backend will either collect them as static files before serving them (in production mode) or proxy browser requests to the `create-react-app` dev server (in dev mode).
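The static-vs-proxy split described above can be pictured with a short sketch. This is illustrative only: the function name, the bundle path, and the dev-server address are assumptions for the example, not taken from the project code.

```python
# Illustrative sketch of the dev/prod frontend split: in production the
# built bundle is collected and served as static files; in dev mode
# browser requests are proxied to the create-react-app dev server.
def frontend_source(debug: bool) -> str:
    """Where the Django backend gets frontend assets from (hypothetical)."""
    if debug:
        # Dev mode: proxy to the CRA dev server (address is an assumption)
        return "http://localhost:3000/"
    # Production mode: serve the collected static bundle
    return "packages/react-frontend/build/"

print(frontend_source(True))
print(frontend_source(False))
```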
- Python dependencies are managed by `poetry` (via `pyproject.toml` in the project root).
- JS dependencies are managed by `yarn` (via the various `package.json` files).
- Tasks are defined as `npm` scripts and can also be managed by `yarn` (via the various `package.json` files).
Start by cloning this repository and ensuring that Poetry (https://poetry.eustace.io/docs/), Node.js (https://nodejs.org/) and Yarn (https://yarnpkg.com/) are installed.
From the project root, install the Python dependencies:
```
poetry install
```

If you are running the server on a Windows machine, specify the `windows` extra option to pull in Windows-specific dependencies:

```
poetry install --extras "windows"
```

Note: As of April 2020, some of the project dependencies do not support Python 3.8, so Python 3.7 is the latest compatible version.
In order for the Channels portion of the backend to work properly, the project expects a Redis server to be accessible (on 127.0.0.1:6379 by default). An easy way to do this is to install Docker (https://www.docker.com/), then run:
```
docker run -p 6379:6379 --name redis -d redis
```

If the Docker container is stopped for any reason (e.g., if the system is restarted), it can be restarted with:

```
docker container start redis
```

In addition, the backend uses a Postgres server for data storage (on 127.0.0.1:5432 by default). If you have Docker installed, you can spin up a local Postgres instance with:

```
docker run -p 5432:5432 --name postgres -e POSTGRES_PASSWORD=<password here> -d postgres
```

(The POSTGRES_PASSWORD variable must be set when creating a new container, or it will not start.)
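To sanity-check that the Redis and Postgres servers are actually accepting connections on the default ports documented above, a quick TCP probe can help. This is a minimal sketch using only the Python standard library; it only verifies that something is listening, not that the service is healthy.

```python
import socket

def service_up(host: str, port: int, timeout: float = 1.0) -> bool:
    """Best-effort check that a TCP server is accepting connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default addresses documented above: Redis on 6379, Postgres on 5432
for name, port in [("redis", 6379), ("postgres", 5432)]:
    status = "up" if service_up("127.0.0.1", port) else "DOWN"
    print(f"{name}: {status}")
```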
You will also need to create a login role and database for the system. By default, the database name, username, and password are all assumed to be cs_toolkit. To specify different database connection parameters, use the DB_HOST, DB_PORT, DB_NAME, DB_USER, and DB_PASSWORD environment variables.
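The defaults-plus-overrides behaviour can be sketched as follows. This is a hypothetical illustration of the documented defaults; the backend's actual settings code may assemble its configuration differently.

```python
import os

# Documented defaults: database name, user, and password are all "cs_toolkit".
DB_DEFAULTS = {
    "DB_HOST": "127.0.0.1",
    "DB_PORT": "5432",
    "DB_NAME": "cs_toolkit",
    "DB_USER": "cs_toolkit",
    "DB_PASSWORD": "cs_toolkit",
}

def db_settings(env=None):
    """Merge environment-variable overrides over the documented defaults."""
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DB_DEFAULTS.items()}

print(db_settings({}))                   # all defaults
print(db_settings({"DB_PORT": "6432"}))  # e.g. routed through a connection pooler
```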
(Note: As of April 2020, PgBouncer is no longer used by default, to prevent instability from keeping too many database connections open.)

If you do want to run PgBouncer as a connection pooler for the Postgres database, the backend expects it on port 6432. Using Docker, a PgBouncer instance with the required settings can be started with:

```
docker run -d --name=pgbouncer -e DB_HOST=postgres -e DB_USER=cs_toolkit -e DB_PASSWORD=cs_toolkit -p 6432:6432 --link postgres:postgres brainsam/pgbouncer:latest
```

Once the Postgres server is up and running, use the following command to initialise the required tables and load the basic lexicon/grammar data into the database:
```
yarn workspace django-backend run load-basic
```

The system uses a distributed task queue (https://dramatiq.io/) to process multiple syntactic computations in parallel, and the queue needs to have at least one worker thread up and running.
Use the `start-workers` shell script to spin up the workers. By default, one worker process is spawned for each CPU core on the machine, with 8 worker threads per process. You can use the NUM_PROCESSES and NUM_THREADS environment variables to change these if necessary.
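The default calculation can be sketched like this. It is illustrative only; the actual shell script may implement the logic differently.

```python
import os

def worker_config(env=None):
    """Documented defaults: one process per CPU core, 8 threads per process.

    NUM_PROCESSES and NUM_THREADS override the defaults when set.
    """
    env = os.environ if env is None else env
    processes = int(env.get("NUM_PROCESSES", os.cpu_count() or 1))
    threads = int(env.get("NUM_THREADS", 8))
    return processes, threads

print(worker_config({"NUM_PROCESSES": "8", "NUM_THREADS": "4"}))  # (8, 4)
```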
For example, to spawn only 8 worker processes with 4 threads each:
```
NUM_PROCESSES=8 NUM_THREADS=4 ./start-workers
```

First, install the frontend production dependencies:
```
yarn --prod
```

Next, run the `build-prod` shell script to build the production frontend bundles and collect all production assets in the Django static files directory (packages/django-backend/static):
```
./build-prod
```

Finally, use the `serve-prod` shell script to start the Django production server (Daphne) on port 8000. You can set the DJANGO_PORT environment variable to use a non-default port:

```
DJANGO_PORT=8080 ./serve-prod
```

Then point your browser to http://localhost:8000 (replacing 8000 with your chosen port, if you set one) to view the main interface.
From the project root, install the frontend development dependencies:
```
yarn
```

N.B.: If you are using WSL and getting timeout errors, try `yarn --network-timeout 100000`. (yarnpkg/yarn#5259)
The shell script `start-dev` will spin up the Django server in debug mode and the React dev server at the same time, automatically monitoring both the frontend and backend sources for changes.
Set the DJANGO_PORT environment variable to use a non-default port for the main interface, or REACT_HOST/REACT_PORT to run the background React dev server on a different address/port:
```
DJANGO_PORT=8080 REACT_HOST=127.0.0.1 REACT_PORT=3000 ./start-dev
```

Note that the interface will then be accessible via the port in DJANGO_PORT, not REACT_PORT.
From the project root, use the following command to rebuild and profile the contents/size of the main frontend bundle:
```
yarn workspace react-frontend run profile
```

Open packages/react-frontend/build/bundle-stats.html to view the results.
Documentation for the project is built using Sphinx (http://www.sphinx-doc.org/) and Sphinx-js (https://github.com/erikrose/sphinx-js) from the sources in the docs package.
To build the documentation, the `jsdoc` executable will need to be available on your PATH. An easy way to do this is to install JSDoc globally:
```
yarn global add jsdoc
```

The documentation can then be built from the project root with the following command:

```
yarn workspace docs run build
```

Open packages/docs/_build/index.html to view the generated documentation.
Alternatively, the `docs-dev` npm script will watch for changes to the documentation sources and rebuild them automatically:
```
yarn workspace docs run docs-dev
```