The docker-geoflow image provides a way to run geoflow as a Docker container from the command line. The image exposes the geoflow::executeWorkflow main R function, used to execute geoflow workflows from a configuration file (JSON or YAML).
docker pull ghcr.io/r-geoflow/geoflow:latest
# Get an example configuration file
wget -O config.json https://raw.githubusercontent.com/r-geoflow/geoflow/refs/heads/master/inst/extdata/workflows/config_metadata_gsheets.json
# Run geoflow as a container
docker run --rm -v "$(pwd):/data" ghcr.io/r-geoflow/geoflow /data/config.json
The geoflow Docker service runs as a container that is removed once the execution is finished. To access the jobs directory afterwards, it is necessary to establish a link between the container directory and the host machine where Docker is executed. To do so, mount an additional volume by adding -v "$(pwd)/geoflow-jobs:/srv/geoflow/jobs" to the above docker run instruction, as follows:
docker run --rm -v "$(pwd):/data" -v "$(pwd)/geoflow-jobs:/srv/geoflow/jobs" ghcr.io/r-geoflow/geoflow /data/config.json
With this command:
- a local geoflow-jobs directory will be created (if it does not already exist) in the current working directory ($(pwd));
- the geoflow job sub-directory created within the Docker container will be copied to the geoflow-jobs directory.
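A quick way to confirm that outputs were persisted is to list the mounted directory on the host after the run (this assumes the workflow completed and created at least one job sub-directory):

```shell
# List persisted job directories on the host, newest first;
# prints a notice if the directory was not created
ls -lt geoflow-jobs/ 2>/dev/null || echo "geoflow-jobs/ not created yet"
```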
In this way you can run geoflow as a Docker command-line service (as a "black box") while persisting the outputs on your machine.
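If you run workflows regularly, the two volume mounts can be wrapped in a small helper. The sketch below is a hypothetical convenience function (not shipped with geoflow): it builds the docker run command for a given configuration file and prints it, so you can review it before executing (e.g. by piping the output to sh).

```shell
# Hypothetical helper: compose the docker run command for a geoflow config.
# Arguments: <config-file> [jobs-dir]; jobs-dir defaults to ./geoflow-jobs.
run_geoflow() {
  config_file="$1"
  jobs_dir="${2:-$(pwd)/geoflow-jobs}"
  # Print the composed command instead of running it (dry run);
  # pipe to `sh` to actually execute it.
  printf 'docker run --rm -v "%s:/data" -v "%s:/srv/geoflow/jobs" ghcr.io/r-geoflow/geoflow /data/%s\n' \
    "$(pwd)" "$jobs_dir" "$(basename "$config_file")"
}

run_geoflow config.json
```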