- [ ] introduce [Flowise](https://flowiseai.com/) into the loop
- [x] use a local LLM (Ollama + a compatible model)
- [ ] add vector storage to store the retrieved POIs as embeddings, then use it as a retriever to augment the prompt instead of passing the raw context-broker results... (see the sketch after this list)
- [ ] add an additional agent that can search the web and add some of its results to the response
- [x] format the output better (display the returned markdown converted to HTML)
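
A minimal sketch of the vector-storage/retriever idea combined with the local LLM, assuming the `chromadb` and `ollama` Python packages and a locally pulled model; the collection name, model name, and sample POI strings are illustrative placeholders, not the project's actual data:

```python
import chromadb
import ollama

# In-memory vector store; Chroma embeds the documents with its default model.
client = chromadb.Client()
pois = client.create_collection(name="pois")

# Hypothetical POIs standing in for results that would come from the context broker.
pois.add(
    ids=["poi-1", "poi-2"],
    documents=[
        "Museo del Prado: art museum, open 10:00-20:00, near Atocha station.",
        "Retiro Park: large public park with a boating lake and rose garden.",
    ],
)

def answer(question: str) -> str:
    # Retrieve only the most relevant POIs instead of passing every context-broker result.
    hits = pois.query(query_texts=[question], n_results=2)
    context = "\n".join(hits["documents"][0])
    # Augment the prompt with the retrieved snippets and ask the local model via Ollama.
    response = ollama.chat(
        model="llama3",  # any locally available, compatible model
        messages=[{
            "role": "user",
            "content": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
        }],
    )
    return response["message"]["content"]

print(answer("Where can I see paintings in Madrid?"))
```

The same shape would work with persistent storage (`chromadb.PersistentClient`) or a different embedding model; the point is that the prompt only carries the top-k retrieved POIs rather than the full broker payload.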