Save run records in postgresql#180

Draft
corrodis wants to merge 7 commits into develop from scorrodi/RunRecordsToDb

Conversation

@corrodis
Contributor

Added a new module and functionality to save the run records not only on disk but also in PostgreSQL. The intention is compatibility with OTSDAQ environment variables. The feature needs to be enabled in the settings with enable_run_record_database. If we go along with this, we will need to modify the spack packages to include a variant that includes py-postgresql.
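For illustration, a minimal sketch of what the database write path might look like; the table name `artdaq_run_records` and its columns are assumptions for this sketch, not the actual schema added by this PR. Building the statement and parameters separately keeps the query parameterized instead of interpolating values into SQL:

```python
# Hypothetical sketch only: table and column names are assumptions,
# not the schema from this PR.
def build_run_record_insert(run_number, config_name, components):
    """Return (sql, params) suitable for cursor.execute(); values stay
    parameterized rather than being formatted into the SQL string."""
    sql = (
        "INSERT INTO artdaq_run_records (run_number, config_name, components) "
        "VALUES (%s, %s, %s)"
    )
    return sql, (run_number, config_name, ",".join(components))

# Example usage with placeholder values:
sql, params = build_run_record_insert(101, "demo_config", ["eb01", "dl01"])
```

The `(sql, params)` pair would then be passed to a DB-API cursor from whichever PostgreSQL driver the spack variant provides.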

@corrodis
Contributor Author

Tagging @rrivera747, @eflumerf, and @normanajn for discussion.

@rrivera747
Contributor

Seems like a good feature add to me. To complete the feature set, I think we should also return the same run record info to an artdaq Supervisor request. Is it easy for you to add that?

@eflumerf
Contributor

eflumerf commented Dec 18, 2025

I would probably go more towards a generic solution than adding multiple specific ones. If we had DAQInterface write a summary.json file into the run record directory, other scripts (or the OTSDAQ ARTDAQSupervisor) could easily read it and send it onward.
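A minimal sketch of that summary.json idea, assuming the file sits at the top of the run record directory; the field names shown are placeholders, not a proposed schema:

```python
import json
import os
import tempfile

# Sketch: DAQInterface writes summary.json into the run record directory;
# any other script (or the OTSDAQ ARTDAQSupervisor) just reads it back.
def write_summary(run_record_dir, summary):
    path = os.path.join(run_record_dir, "summary.json")
    with open(path, "w") as f:
        json.dump(summary, f, indent=2)
    return path

def read_summary(run_record_dir):
    with open(os.path.join(run_record_dir, "summary.json")) as f:
        return json.load(f)

# Example round trip with placeholder fields:
run_dir = tempfile.mkdtemp()
write_summary(run_dir, {"run_number": 101, "components": ["eb01", "dl01"]})
summary = read_summary(run_dir)
```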

If we say we like Postgres enough to keep the specific implementation, I'd suggest configuring it by pointing DAQInterface to a .env-style file with the database connection info and credentials, rather than spelling them all out in insecure configs or environment variables.
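A sketch of the .env-style credentials file, assuming simple `KEY=value` lines; the `PG*` key names here are an assumption (they mirror libpq's conventional environment variable names), not something this PR defines:

```python
import os
import tempfile

# Sketch: point DAQInterface at a .env-style file instead of putting DB
# credentials into configs or environment variables.
def load_env_file(path):
    creds = {}
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines, comments, malformed entries
            key, _, value = line.partition("=")
            creds[key.strip()] = value.strip()
    return creds

# Example with placeholder credentials:
fd, env_path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("# db credentials\nPGHOST=localhost\nPGUSER=daq\nPGPASSWORD=secret\n")
creds = load_env_file(env_path)
```

The file itself can then be protected with filesystem permissions, which is the point of keeping credentials out of the main configs.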

@corrodis
Contributor Author

@rrivera747 I'm not sure I fully understand. Doesn't ARTDAQSupervisor already have access to all that information? What we store in the sql _components table and in the file ranks.txt is already returned by xmlrpc, if I understand right? And the fcl files that we write to the _fcl database and run_record files/folders are based on the flattened fcl that comes from the ARTDAQSupervisor?

@corrodis
Contributor Author

@eflumerf is my understanding right that your comment was not about what is part of this proposal but about a run-end summary? From my point of view that is the next step. I think we should collect statistics from all active loggers and a) write them to a file like we do for the other run_records and b) also add them to a table artdaq_run_summary (if PostgreSQL use is activated). How does that sound?

@rrivera747
Contributor

> run_record

I thought you were collecting "run records." So far this is just configuration, you mean?

@rrivera747
Contributor

My main point is that all useful info that DAQInterface collects should be accessible through a Python Get accessor function call (not xmlrpc).
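The accessor style being asked for could look something like this; the class and method names are hypothetical placeholders, not DAQInterface's actual API:

```python
# Hypothetical sketch: run info exposed through a plain Python getter
# rather than an xmlrpc endpoint. Names are placeholders.
class DAQInterfaceSketch:
    def __init__(self):
        self._run_info = {}

    def record_run_info(self, run_number, info):
        """Store whatever DAQInterface collected for this run."""
        self._run_info[run_number] = info

    def get_run_record(self, run_number):
        """Python Get accessor: return stored info for a run, or None."""
        return self._run_info.get(run_number)

# Example usage with placeholder data:
daq = DAQInterfaceSketch()
daq.record_run_info(101, {"components": ["eb01"], "config": "demo"})
```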

@eflumerf eflumerf moved this from 📋 Triage to 👍 PR Created in art-daq Work Tracker Jan 26, 2026

"""

import os
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
