- There are 2 directories in this project, `frontend` and `backend`.
- The backend requires the `ollama` CLI tool to serve llama3 7B.
- This application can be entirely self-hosted, meaning you can use it on your clinic's WiFi.
- `frontend/`: React frontend.
- `backend/`: FastAPI backend.
  - It communicates with ollama and the React frontend.
  - It uses the `openbmb/MiniCPM-Llama3-V-2_5` vision model for handwritten text recognition.
- `ollama`: A CLI tool to serve ML models.
  - This is used to summarize and format the medical chart text (see the sketch below).
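
The summarization step is an HTTP call to ollama's REST API. Here is a minimal, hedged sketch of that call, assuming ollama's `/api/generate` endpoint on the default port 11434; the actual prompt and helper name in `backend/` may differ.

```python
# Hedged sketch: how a backend helper might ask ollama to summarize chart text.
import requests

OLLAMA_API_ENDPOINT = "http://localhost:11434"  # matches the .env default port


def summarize_chart_text(raw_text: str) -> str:
    """Send raw chart text to llama3 via ollama and return the formatted summary."""
    resp = requests.post(
        f"{OLLAMA_API_ENDPOINT}/api/generate",
        json={
            "model": "llama3",
            "prompt": f"Summarize and format this medical chart text:\n\n{raw_text}",
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```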
- Handwritten Text Recognition:
  - `githubharald/HTRPipeline`: Not good.
  - `llava:34b`: Doesn't work at all. I suspect this is more because ollama cannot access the image file than a problem with the model itself.
  - `harshit543/Handwritten-Text-Recognition`: It's okay. It works, but can misinterpret roughly ~10% of the text, which is not good enough for production.
  - `MiniCPM-Llama3-V-2_5`: Really good. Better than `harshit543/Handwritten-Text-Recognition`. Works well on Gradio, but is unable to read the image when used with ollama.
    - This has been integrated into the FastAPI backend.
- Model Inference:
  - `transformers`: Slow. Takes ~10 seconds to process a single page of text (see the MiniCPM sketch below).
  - `ollama`: Fast. Takes ~1 second to process a single page of text.
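
For reference, this is a rough, hedged sketch of the `transformers` experiment with MiniCPM-Llama3-V-2_5 (the slower ~10 s/page path). It follows the usage pattern from the model card; the image path and prompt are illustrative, and the model card documents the full `chat()` argument list.

```python
# Hedged sketch of the MiniCPM-Llama3-V-2_5 transformers experiment.
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "openbmb/MiniCPM-Llama3-V-2_5"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = (
    AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True, torch_dtype=torch.float16)
    .to("cuda")
    .eval()
)

image = Image.open("intake_form.jpg").convert("RGB")  # illustrative file name
msgs = [{"role": "user", "content": "Transcribe all handwritten text in this image."}]

# chat() is provided by the model's remote code; see the model card for details.
text = model.chat(image=image, msgs=msgs, tokenizer=tokenizer, sampling=True, temperature=0.7)
print(text)
```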
- Install ollama

```
curl -fsSL https://ollama.com/install.sh | sh
```
- Run ollama in the background on the default port 11434

```
tmux new -s ollama
ollama:~$ chmod +x ollama.sh
ollama:~$ ./ollama.sh
```
- The first time you run ollama, you need to download llama3 7B. This is done in a temporary tmux session. After the download is complete, you can exit the session. You only need to do this once.

```
tmux new -s llama3
llama3:~$ ollama run llama3
llama3:~$ exit
```
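
If you want to confirm llama3 was pulled without starting another download, ollama exposes a `GET /api/tags` endpoint that lists the locally available models. A small check, assuming the default port:

```python
# Optional check: confirm llama3 is already pulled by listing ollama's local models.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print("llama3 pulled:", any(name.startswith("llama3") for name in models))
```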
- Set up the FastAPI environment variables. The `.env.example` file has `OLLAMA_API_ENDPOINT` set to the default port 11434.

```
cd backend
cp .env.example .env
perl -pi -e 'chomp if eof' .env  # Remove the trailing newline in .env
openssl rand -hex 32 >> .env     # Use this value as the ACCESS_TOKEN
```
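
For context, this is roughly how these values are consumed on the backend side; a hedged sketch assuming `python-dotenv`, since the real settings code lives in `backend/`:

```python
# Hedged sketch: reading the backend .env values.
import os

from dotenv import load_dotenv

load_dotenv()  # loads backend/.env into the process environment

OLLAMA_API_ENDPOINT = os.environ["OLLAMA_API_ENDPOINT"]  # e.g. http://localhost:11434
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]  # the openssl-generated value above

# The frontend's VITE_FASTAPI_SERVER_ACCESS_TOKEN is set to this same value
# later in the setup.
```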
- Install dependencies

```
cd backend
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
- Allow `run.sh` to be executable

```
chmod +x run.sh
```
- Run the FastAPI backend

```
tmux new -s fastapi
fastapi:~$ cd backend
fastapi:~$ ./run.sh
```
- Run Cloudflare tunnel for FastAPI backend

```
tmux new -s fastapi_cloudflare
fastapi_cloudflare:~$ ./backend_cloudflare.sh
```
- Set up the React environment variables manually.

```
cd frontend
cp .env.example .env
vim .env
```

- Set `VITE_FASTAPI_SERVER_API_BASE_URL` to `https://api.<domain-name>.com`.
- Set `VITE_FASTAPI_SERVER_ACCESS_TOKEN` to the `ACCESS_TOKEN` from `backend/.env`.
- Set `VITE_TIPTAP_JWT` to the "Authentication" value in the Tiptap dashboard settings.
  - Find this at Tiptap Collaboration Settings.
  - Then look under "Authentication".
  - Copy the value from "JWT (auto generated for testing)".
- Set `VITE_TIPTAP_COLLAB_APP_ID` to the "App ID" in the Tiptap dashboard.
  - This is the Tiptap Collaboration Dashboard App ID.
  - It's NOT the Tiptap Convert Document App ID!
- Install dependencies.

```
cd frontend
pnpm install
```
- Run React frontend on port 4173.

```
tmux new -s react
react:~$ cd frontend
react:~$ pnpm run build
react:~$ pnpm run preview --port 4173
```
- Run Cloudflare tunnel for React frontend

```
tmux new -s react_cloudflare
react_cloudflare:~$ ./frontend_cloudflare.sh
```
- Go to `dashboard.<domain-name>.com`. The backend is hosted at `api.<domain-name>.com`.
- Allow 30-60 seconds for the application to load the first time you use it or after restarting. This delay happens because the LLM must be loaded into memory.
- You can also load the LLM into memory prior to using the application.
  - To do this, go to `api.<domain-name>.com/docs#/default/healthcheck_healthcheck_get`, click "Try it out", then click "Execute". When you see "ok" in the "Response body", the LLM has been loaded into memory.
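
If you prefer warming the model up from a script instead of the Swagger UI, here is a minimal sketch against the same route. The `/healthcheck` path is inferred from the docs URL; pass the `ACCESS_TOKEN` header if the backend requires it for this route.

```python
# Hedged sketch: warm the LLM up by hitting the backend healthcheck endpoint.
import requests

resp = requests.get("https://api.<domain-name>.com/healthcheck", timeout=120)
print(resp.status_code, resp.text)  # expect "ok" once the LLM is loaded into memory
```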
- Go to `dashboard.<domain-name>.com` on your phone and log in.
- Click "Choose Files". This opens the default camera on your phone.
- Take photos of the patient intake forms.
- On your work computer, go to `dashboard.<domain-name>.com` and log in.
- The application will write a chart note using the patient intake forms. You can copy this note into your own EMR.
- This section requires you to have already run through Getting Started at least once.
- Start all application processes.

```
./run.sh
```

- Gracefully stop the application.

```
tmux kill-server
```