Conversation
Add FragPipe as an additional supported dataset format that can be loaded from PRIDE and processed by the pmultiqc service. Changes:
- Add FragPipe file detection (psm.tsv, ion.tsv) in detect_input_type()
- Add FragPipe files to filter_search_files() for PRIDE dataset filtering
- Add 'fragpipe' to allowed_input_types and --fragpipe-plugin handling
- Add PXD066146 as a FragPipe example in the PRIDE examples modal
- …bpwS1: Refactor pmultiqc service to support FragPipe
- Tus: transfer improvement using the Tus framework
📝 Walkthrough

Adds TUS resumable uploads (server router, on_upload_complete, client tus-js integration), background archive extraction and FragPipe detection, Redis filename→job_id mapping and polling, MultiQC fragpipe wiring, dynamic Docker tagging/OCI labels in CI, a FastAPI dependency bump, and config/UI tweaks.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor Client
    participant Frontend as Frontend (index.html)
    participant TUS as TUS Router
    participant Backend as FastAPI
    participant Redis
    participant Worker as Background Worker
    participant MultiQC
    Client->>Frontend: select file & start upload
    Frontend->>TUS: POST /files (initiate TUS upload)
    TUS-->>Frontend: Location (upload URL)
    loop upload chunks
        Frontend->>TUS: PATCH upload (resume-capable chunks)
        TUS-->>Frontend: Upload-Offset
    end
    TUS->>Backend: on_upload_complete callback
    Backend->>Redis: set filename → job_id (with TTL)
    Backend->>Worker: enqueue background processing
    Worker->>Worker: extract archive & detect input type
    Worker->>Redis: update job progress/status
    Worker->>MultiQC: run pmultiqc with plugins/args
    Worker->>Backend: persist results / mark complete
    Frontend->>Backend: GET /tus-job/{filename}
    Backend->>Redis: lookup job_id
    Backend-->>Frontend: job_id
    loop status polling
        Frontend->>Backend: GET /job-status/{job_id}
        Backend->>Redis: return updates
        Backend-->>Frontend: progress/status
    end
    Backend-->>Frontend: final status (complete/failure)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 3 passed
Actionable comments posted: 5
🤖 Fix all issues with AI agents
In `@pmultiqc_service/app.py`:
- Around line 499-530: Remove the redundant Redis write that uses
TUS_JOB_FILENAME_PREFIX and TUS_JOB_FILENAME_TTL_SECONDS: delete the first block
that calls get_redis_client(), constructs key =
f"{TUS_JOB_FILENAME_PREFIX}{filename}", and calls redis_client.setex(...
job_id.encode("utf-8")). Keep the later Redis mapping that uses key =
f"pmultiqc:tus_job:{filename}" (the endpoint relies on this). Also remove the
now-unused constants TUS_JOB_FILENAME_PREFIX and TUS_JOB_FILENAME_TTL_SECONDS
from the top-level constants so there are no dead variables.
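For reference, a minimal sketch of the single mapping the endpoint relies on (the key pattern, `setex` call, and encoding are taken from the review above; the function name and TTL value are assumptions):

```python
import redis

def record_tus_job_mapping(
    client: redis.Redis, filename: str, job_id: str, ttl_seconds: int = 3600
) -> None:
    # Single source of truth: the /tus-job/{filename} endpoint reads this key,
    # so this should be the only write performed on upload completion.
    key = f"pmultiqc:tus_job:{filename}"
    client.setex(key, ttl_seconds, job_id.encode("utf-8"))
```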
In `@pmultiqc_service/requirements.txt`:
- Line 1: Update the FastAPI dependency line in requirements.txt from
"fastapi>=0.110.0,<0.111.0" to a version that allows Starlette >=0.40.0 (for
example "fastapi>=0.128.0,<0.129.0" or a newer patch range); then reinstall or
regenerate your lockfile (pip-compile, pipenv lock, or poetry lock as
applicable) and run the test suite to ensure compatibility (check for any
import/API changes after upgrading FastAPI in your codebase).
In `@pmultiqc_service/templates/index.html`:
- Around line 541-554: The online event listener (onlineHandler) is added but
never removed, causing a memory leak; assign the handler to a property on the
upload object (e.g., upload._onlineHandler = onlineHandler) right after creating
it and adding it with window.addEventListener('online', onlineHandler), and then
remove it (window.removeEventListener('online', upload._onlineHandler)) from the
upload cleanup paths such as the upload.onSuccess, upload.onError
(non-resumable), and any cancel/finish handlers so the listener is always
detached when the upload lifecycle ends.
- Around line 800-821: The showError function is inserting the message into
resultDiv.innerHTML without escaping, allowing XSS when message contains
user-controlled content; update showError to sanitize/escape HTML in the message
before building errorHtml (or use textContent for the message node) so any
filenames or user data are rendered as plain text, then append the Resume and
Return buttons via DOM methods (createElement, setAttribute, addEventListener)
or insert their HTML safely after the escaped text; ensure you still clear
currentUploadXhr and toggle progressContainer/cancelUploadBtn/resumeUploadBtn as
before.
- Around line 9-10: Replace the floating CDN reference to tus-js-client with a
pinned release to avoid unexpected upgrades: update the <script> tag that loads
"tus-js-client" so its src uses a specific version (e.g., change `@latest` to
`@4.3.1` so the URL becomes .../tus-js-client@4.3.1/dist/tus.min.js); ensure the
script tag that references the library is the only place changed and consider
adding SRI/hash later if required for additional security.
🧹 Nitpick comments (6)
.github/workflows/build-and-push.yml (2)
32-44: Dynamic tagging logic looks good, but consider branch name sanitization.

The logic correctly tags `main`/`master` as `latest` and uses branch names for feature branches. However, branch names with characters invalid for Docker tags (e.g., `/`, `#`) could cause build failures. Consider sanitizing the branch name:

♻️ Optional: Sanitize branch name for Docker tag compatibility

```diff
  - name: Determine Docker tag
    id: tag
    shell: bash
    run: |
      if [[ "${{ env.BRANCH_NAME }}" == "main" ]] || [[ "${{ env.BRANCH_NAME }}" == "master" ]]; then
        echo "DOCKER_TAG=latest" >> $GITHUB_ENV
      else
-       echo "DOCKER_TAG=${{ env.BRANCH_NAME }}" >> $GITHUB_ENV
+       # Sanitize branch name for Docker tag (replace invalid chars with -)
+       SANITIZED=$(echo "${{ env.BRANCH_NAME }}" | sed 's/[^a-zA-Z0-9._-]/-/g')
+       echo "DOCKER_TAG=${SANITIZED}" >> $GITHUB_ENV
      fi
```
5-5: Temporary `tus` branch trigger should be removed after merge.

The `tus` branch trigger is useful for testing during development but should be removed once the feature is merged to `main` to avoid publishing unintended images.

pmultiqc_service/app.py (4)
444-446: Unused variable `filetype`.

The variable `filetype` is assigned but never used. As flagged by static analysis, this should be removed or used.

♻️ Proposed fix

```diff
  # Extract metadata
  filename = metadata.get("filename", "upload.zip")
- filetype = metadata.get("filetype", "application/zip")
```
478-486: Unused variable `upload_dir` and cleanup logic issue.

The variable `upload_dir` is assigned but never used. Additionally, the cleanup logic attempts to remove the `.info` file, but `file_path` points to the uploaded file itself (which was already moved), not the TUS upload directory structure.

♻️ Proposed fix: Remove unused variable or fix cleanup logic

```diff
  # Clean up TUS upload directory to prevent re-upload
  # The file_path is like /tmp/pmultiqc_uploads/{upload_id}
  # We need to delete the .info file and directory
  try:
-     upload_dir = os.path.dirname(file_path)
      info_file = file_path + ".info"
      if os.path.exists(info_file):
          os.remove(info_file)
          logger.info(f"Removed TUS metadata file: {info_file}")
      # Note: The upload file itself is already moved, so no need to delete it
  except Exception as e:
      logger.warning(f"Failed to clean up TUS metadata: {e}")
```
2851-2877: Exception handling should use `raise ... from` for proper chaining.

Per static analysis (B904), when re-raising exceptions within an `except` block, use `raise ... from err` or `raise ... from None` to preserve the exception chain.

♻️ Proposed fix

```diff
  except HTTPException:
      raise
  except Exception as e:
      logger.error(f"Error retrieving TUS job mapping: {e}")
-     raise HTTPException(status_code=500, detail="Error retrieving job information")
+     raise HTTPException(status_code=500, detail="Error retrieving job information") from e
```
388-431: Background processing function handles errors appropriately.

The `process_upload_in_background` function:
- Creates extraction directory
- Validates and extracts ZIP
- Detects input type
- Updates job status on failure
- Spawns processing thread

However, per static analysis, using `logging.exception` instead of `logging.error` would include the stack trace automatically.

♻️ Optional improvement for better error logging

```diff
  except Exception as e:
-     logger.error(f"Background processing failed for job {job_id}: {e}")
+     logger.exception(f"Background processing failed for job {job_id}")
      update_job_progress(
```
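For orientation, here is a minimal sketch of the flow those bullet points describe, with the logging fix applied. This is a hypothetical reconstruction, not the actual app.py code; the three stub functions stand in for the real helpers:

```python
import logging
import zipfile
from pathlib import Path

logger = logging.getLogger(__name__)

# Placeholder stubs standing in for the real helpers in app.py:
def detect_input_type(extract_dir: Path) -> str: ...
def start_processing_thread(job_id: str, extract_dir: Path, input_type: str) -> None: ...
def update_job_progress(job_id: str, status: str) -> None: ...

def process_upload_in_background(job_id: str, file_path: str, extract_root: str) -> None:
    try:
        # 1. Create the extraction directory
        extract_dir = Path(extract_root) / job_id
        extract_dir.mkdir(parents=True, exist_ok=True)
        # 2. Validate and extract the ZIP (zipfile.BadZipFile propagates to the handler)
        with zipfile.ZipFile(file_path) as zf:
            zf.extractall(extract_dir)
        # 3. Detect input type (e.g. "fragpipe") and spawn the processing thread
        input_type = detect_input_type(extract_dir)
        start_processing_thread(job_id, extract_dir, input_type)
    except Exception:
        # logging.exception records the stack trace automatically (Ruff TRY400)
        logger.exception(f"Background processing failed for job {job_id}")
        update_job_progress(job_id, status="failed")
```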
📜 Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- .github/workflows/build-and-push.yml
- pmultiqc_service/app.py
- pmultiqc_service/k8s/configmap.yaml
- pmultiqc_service/requirements.txt
- pmultiqc_service/templates/index.html
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-10-31T10:21:32.605Z
Learnt from: cbielow
Repo: bigbio/pmultiqc PR: 414
File: pmultiqc_service/app.py:1992-1992
Timestamp: 2025-10-31T10:21:32.605Z
Learning: In pmultiqc_service/app.py, when rendering templates (especially index.html), all endpoints must pass the same config structure to maintain consistency. The config dict should include both BASE_URL and PRIDE_BUTTON_VISIBLE: {"request": request, "config": {"BASE_URL": BASE_URL, "PRIDE_BUTTON_VISIBLE": PRIDE_BUTTON_VISIBLE}}. This applies to the index() endpoint and all paths in submit_pride() that render index.html.
Applied to files:
pmultiqc_service/k8s/configmap.yaml
🪛 OSV Scanner (2.3.1)
pmultiqc_service/requirements.txt
[HIGH] 1-1: starlette 0.37.2: Starlette has possible denial-of-service vector when parsing large files in multipart forms
[HIGH] 1-1: starlette 0.37.2: Starlette Denial of service (DoS) via multipart/form-data
🪛 Ruff (0.14.11)
pmultiqc_service/app.py
424-424: Do not catch blind exception: Exception
(BLE001)
425-425: Use logging.exception instead of logging.error
Replace with exception
(TRY400)
445-445: Local variable filetype is assigned to but never used
Remove assignment to unused variable filetype
(F841)
455-455: Abstract raise to an inner function
(TRY301)
455-455: Avoid specifying long messages outside the exception class
(TRY003)
479-479: Local variable upload_dir is assigned to but never used
Remove assignment to unused variable upload_dir
(F841)
485-485: Do not catch blind exception: Exception
(BLE001)
506-506: Do not catch blind exception: Exception
(BLE001)
529-529: Do not catch blind exception: Exception
(BLE001)
533-533: Use logging.exception instead of logging.error
Replace with exception
(TRY400)
534-534: Use logging.exception instead of logging.error
Replace with exception
(TRY400)
2860-2860: Abstract raise to an inner function
(TRY301)
2872-2872: Abstract raise to an inner function
(TRY301)
2875-2875: Do not catch blind exception: Exception
(BLE001)
2876-2876: Use logging.exception instead of logging.error
Replace with exception
(TRY400)
2877-2877: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Codacy Static Code Analysis
🔇 Additional comments (11)
pmultiqc_service/k8s/configmap.yaml (1)
14-18: Configuration change looks appropriate.

Enabling `PRIDE_BUTTON_VISIBLE` aligns with the new FragPipe workflow support and PRIDE examples added in this PR. The configuration structure follows the established pattern per the learnings.

Minor: Line 18 has trailing whitespace after the `"0"` value.

pmultiqc_service/templates/index.html (3)
371-390: TUS configuration looks well-designed for resumable uploads.

The configuration includes:
- Extensive retry delays (up to 60s) for resilience
- 50MB chunk size appropriate for large files
- Fingerprint-based resume capability
- Proper metadata passing
The implementation correctly handles the TUS protocol requirements.
560-594: Job ID polling implementation is reasonable.

The retry logic (10 attempts, 1s intervals) gives adequate time for the backend to process the TUS upload completion callback before failing. The logging at first and last attempts helps with debugging without flooding the console.
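For clarity, the same polling contract expressed as a minimal Python client. This is a sketch only: the endpoints, attempt count, and interval come from the review and the sequence diagram, while the base URL and JSON response shape are assumptions.

```python
import time
from urllib.parse import quote

import requests

BASE_URL = "http://localhost:8000"  # assumed local service URL

def wait_for_job_id(filename: str, attempts: int = 10, interval: float = 1.0) -> str:
    # Poll /tus-job/{filename} until the backend has stored the
    # filename -> job_id mapping written by the TUS completion hook.
    for _ in range(attempts):
        resp = requests.get(f"{BASE_URL}/tus-job/{quote(filename, safe='')}")
        if resp.ok:
            return resp.json()["job_id"]  # assumed response shape
        time.sleep(interval)
    raise TimeoutError(f"no job_id for {filename!r} after {attempts} attempts")
```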
1002-1027: FragPipe example addition looks good.

The new FragPipe example (PXD066146) is consistent with the other example entries in styling and structure, and correctly describes the expected file types (psm.tsv, ion.tsv).
pmultiqc_service/app.py (6)
538-576: TUS Location header middleware is well-implemented.

The middleware correctly handles:
- Relative paths (`/files/...`)
- Absolute URLs with different hosts
- Proper BASE_URL prepending for ingress scenarios
- Logging for debugging

This is essential for TUS to work correctly behind a reverse proxy/ingress.
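A rough sketch of the relative-path case (illustrative only; the real middleware also handles absolute URLs, and the `BASE_URL` value here is a placeholder):

```python
from fastapi import FastAPI, Request

app = FastAPI()
BASE_URL = "https://example.org/pmultiqc"  # assumed ingress-facing base URL

@app.middleware("http")
async def rewrite_tus_location(request: Request, call_next):
    # TUS creation responses return a Location header that the client PATCHes;
    # behind an ingress, a relative /files/... path must be prefixed with BASE_URL.
    response = await call_next(request)
    location = response.headers.get("Location")
    if location and location.startswith("/files/"):
        response.headers["Location"] = BASE_URL.rstrip("/") + location
    return response
```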
1684-1690: FragPipe detection logic looks correct.

The detection checks for `psm.tsv` and `ion.tsv` files, which are distinctive FragPipe outputs. The placement after MaxQuant and before DIANN is appropriate since these are more specific file patterns.
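A simplified stand-in for that check (hypothetical code, not the actual `detect_input_type` body; whether one marker file suffices or both are required is an assumption here):

```python
from pathlib import Path

FRAGPIPE_MARKERS = {"psm.tsv", "ion.tsv"}

def looks_like_fragpipe(extract_dir: Path) -> bool:
    # psm.tsv and ion.tsv are standard FragPipe output names; finding one
    # anywhere under the extracted archive marks the dataset as FragPipe.
    names = {p.name.lower() for p in extract_dir.rglob("*") if p.is_file()}
    return bool(FRAGPIPE_MARKERS & names)
```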
698-709: FragPipe file detection in PRIDE file filtering.

The addition of `psm.tsv` and `ion.tsv` detection in `filter_search_files` allows FragPipe datasets from PRIDE to be properly identified and downloaded. The exact filename matching (`filename_lower == "psm.tsv"`) is appropriate since these are standard FragPipe output names.
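The exact-match predicate amounts to something like this (an illustrative sketch, not the real `filter_search_files` implementation):

```python
FRAGPIPE_OUTPUT_NAMES = {"psm.tsv", "ion.tsv"}

def is_fragpipe_search_file(filename: str) -> bool:
    # Exact names, not substrings: FragPipe writes fixed output filenames,
    # so equality avoids over-matching files such as "my_psm.tsv.bak".
    return filename.lower() in FRAGPIPE_OUTPUT_NAMES
```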
579-590: TUS router mounting configuration looks appropriate.

The TUS router is mounted at `/files` with:
- `max_size` matching `MAX_FILE_SIZE` (10GB default)
- `on_upload_complete` callback for post-upload processing
- 7-day auto-cleanup for incomplete uploads

The configuration aligns with the frontend TUS client setup.
382-385: CORS configuration correctly updated for TUS protocol.

TUS requires:
- `PATCH` and `HEAD` methods (added)
- Custom headers like `Upload-Offset`, `Tus-Resumable` (covered by `*`)
- Exposed headers for the client to read response headers

The wildcard headers are acceptable here given the TUS protocol requirements, though in production you may want to explicitly list only the required TUS headers for defense-in-depth.
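A sketch of such a setup using FastAPI's `CORSMiddleware` (illustrative values; the actual origins and header lists in app.py may differ):

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # tighten in production
    allow_methods=["GET", "POST", "PATCH", "HEAD", "OPTIONS"],
    allow_headers=["*"],  # covers Upload-Offset, Tus-Resumable, etc.
    # Exposed so the browser client can read TUS response headers:
    expose_headers=["Location", "Upload-Offset", "Upload-Length", "Tus-Resumable"],
)
```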
1794-1796: FragPipe plugin integration in MultiQC command is valid.

The `--fragpipe-plugin` argument is a supported pmultiqc command-line flag defined in `pmultiqc/cli.py` and properly integrated with the FragPipeModule implementation. The argument is documented in usage examples and has comprehensive test coverage.
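For context, invoking the flag from the service might look roughly like this (a hedged sketch: only `--fragpipe-plugin` is confirmed by the review; the remaining arguments are illustrative):

```python
import subprocess

def run_pmultiqc_fragpipe(input_dir: str, output_dir: str) -> None:
    # pmultiqc runs as a MultiQC plugin; --fragpipe-plugin selects the
    # FragPipe module for the detected input type.
    cmd = ["multiqc", "--fragpipe-plugin", "-o", output_dir, input_dir]
    subprocess.run(cmd, check=True)
```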
pmultiqc_service/requirements.txt (1)

15-15: Confirmed: tuspyserver>=4.2.0 is stable and API-compatible.

The `create_tus_router` API and all parameters used in `app.py` (prefix, files_dir, max_size, on_upload_complete, days_to_keep) are consistent with the documented tuspyserver API. Version 4.2.0 is a stable release with the latest patch 4.2.3 available (Nov 2025). The callback signature `handle_upload_complete(file_path: str, metadata: dict)` matches the expected interface. No compatibility issues.
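Putting those parameters together, the mounting might look like this (a sketch: the parameter names and callback signature are the ones verified above, while the import path and upload directory are assumptions):

```python
from fastapi import FastAPI
from tuspyserver import create_tus_router  # assumed import path

app = FastAPI()
MAX_FILE_SIZE = 10 * 1024**3  # 10 GB default, per the review

def handle_upload_complete(file_path: str, metadata: dict) -> None:
    # Post-upload hook: map filename -> job_id in Redis, then enqueue processing.
    ...

app.include_router(
    create_tus_router(
        prefix="/files",
        files_dir="/tmp/pmultiqc_uploads",
        max_size=MAX_FILE_SIZE,
        on_upload_complete=handle_upload_complete,
        days_to_keep=7,
    )
)
```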
pmultiqc_service/requirements.txt (outdated suggestion)

```diff
@@ -1,4 +1,4 @@
-fastapi>=0.104.0,<0.105.0
+fastapi>=0.110.0,<0.111.0
```
🧩 Analysis chain
🌐 Web query:
FastAPI 0.110.0 starlette dependency version
💡 Result:
FastAPI 0.110.0 pins Starlette to ">=0.37.2,<0.38.0". [1][2]
Sources:
- FastAPI release notes. [1]
- FastAPI GitHub / pyproject dependency listing (discussion). [2]
🌐 Web query:
GHSA-2c2j-9gv5-cj73 starlette vulnerability
💡 Result:
Summary: GHSA-2c2j-9gv5-cj73 is a moderate-severity denial-of-service issue in Starlette (CVE-2025-54121). Parsing multipart forms with large files (over the default spool size) can block the main/event thread when rolling an upload to disk, preventing new connections. Affected versions: < 0.47.2 (notably 0.47.0 and 0.47.1). Fixed in 0.47.2 — upgrade to 0.47.2 or later. (github.com)
🌐 Web query:
GHSA-f96h-pmfr-66vw starlette vulnerability multipart
💡 Result:
Summary — what GHSA-f96h-pmfr-66vw is
- A high-severity DoS in Starlette (published Oct 15, 2024): multipart/form-data parts that lack a filename are treated as text fields and buffered with no per-part size limit, allowing attackers to exhaust memory. [1][2]
Affected / fixed
- Affected: starlette versions < 0.40.0.
- Fixed in: starlette 0.40.0 (adds max_part_size to MultiPartParser). [1][3]
Immediate action
- Upgrade any Starlette (or FastAPI) services to >= 0.40.0. [1][3]
- As defense-in-depth, enforce request and part size limits at the reverse proxy (e.g., nginx, cloud WAF) and validate multipart input server-side. [2][3]
References
[1] GitHub Advisory GHSA-f96h-pmfr-66vw (Starlette) — published Oct 15, 2024.
[2] Tenable / CVE-2024-47874 summary.
[3] Starlette commit / changelog fixing multipart (adds max_part_size).
🌐 Web query:
latest stable FastAPI version 2025
💡 Result:
The latest stable FastAPI release in 2025 was 0.128.0, published December 27, 2025. [1][2]
Sources:
- PyPI release page for fastapi. [1]
- Safety DB listing (notes latest v0.128.0). [2]
🌐 Web query:
FastAPI 0.128.0 starlette dependency version
💡 Result:
FastAPI 0.128.0 requires Starlette in the 0.xx series and (per packaging updates) allows Starlette 0.51 — i.e., compatible with Starlette <0.52. [1][2]
Sources:
- PyPI release metadata for fastapi 0.128.0. [1]
- Fedora package changelog (notes: "Allow Starlette 0.51"). [2]
Security: Starlette 0.37.2 has a HIGH-severity memory exhaustion vulnerability (GHSA-f96h-pmfr-66vw).
FastAPI 0.110.x pins Starlette to 0.37.2, which is vulnerable to CVE-2024-47874 (GHSA-f96h-pmfr-66vw): multipart form fields without filenames are buffered with no per-part size limit, allowing attackers to exhaust memory. Since this service handles file uploads, this is a critical concern.
Upgrade FastAPI to a much newer version that allows Starlette >= 0.40.0 (e.g., FastAPI 0.128.0+), which includes the patch.
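If useful, a small regression test can pin the floor directly (a sketch; assumes the `packaging` library is available in the test environment):

```python
import starlette
from packaging.version import Version

def test_starlette_not_vulnerable_to_multipart_dos() -> None:
    # GHSA-f96h-pmfr-66vw (CVE-2024-47874) is fixed in Starlette 0.40.0.
    assert Version(starlette.__version__) >= Version("0.40.0")
```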
🧰 Tools
🪛 OSV Scanner (2.3.1)
[HIGH] 1-1: starlette 0.37.2: Starlette has possible denial-of-service vector when parsing large files in multipart forms
[HIGH] 1-1: starlette 0.37.2: Starlette Denial of service (DoS) via multipart/form-data
🤖 Prompt for AI Agents
In `@pmultiqc_service/requirements.txt` at line 1, Update the FastAPI dependency
line in requirements.txt from "fastapi>=0.110.0,<0.111.0" to a version that
allows Starlette >=0.40.0 (for example "fastapi>=0.128.0,<0.129.0" or a newer
patch range); then reinstall or regenerate your lockfile (pip-compile, pipenv
lock, or poetry lock as applicable) and run the test suite to ensure
compatibility (check for any import/API changes after upgrading FastAPI in your
codebase).
Remove tus branch build action in GitHub Actions
Codecov Report

✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

```
@@            Coverage Diff            @@
##             main     #540     +/-   ##
=========================================
+ Coverage    6.83%    8.63%    +1.80%
=========================================
  Files          44       44
  Lines        5680     5793      +113
=========================================
+ Hits          388      500      +112
- Misses       5292     5293        +1
```

☔ View full report in Codecov by Sentry.
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In @.github/workflows/build-and-push.yml:
- Around line 36-44: The "Determine Docker tag" step contains an unreachable
else branch and an unused step id; either remove the else block and delete "id:
tag" (since you only trigger on main/master and you write DOCKER_TAG to
GITHUB_ENV), or if you intend to support other branches, expand the workflow
trigger to include branch patterns and replace the branch-to-tag logic with a
sanitized tag (e.g., set DOCKER_TAG by taking env.BRANCH_NAME and replacing '/'
with '-' before writing to GITHUB_ENV) and keep or use an id only if you need
step outputs.
📜 Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
.github/workflows/build-and-push.yml
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (11)
- GitHub Check: test_maxquant_dia
- GitHub Check: test_dia
- GitHub Check: test_diann
- GitHub Check: test_tmt
- GitHub Check: test_lfq
- GitHub Check: test_maxquant
- GitHub Check: test_proteobench
- GitHub Check: test_mzid_mzML
- GitHub Check: test_fragpipe
- GitHub Check: test_mzid_mgf
- GitHub Check: Codacy Static Code Analysis
🔇 Additional comments (2)
.github/workflows/build-and-push.yml (2)
32-34: LGTM!

Standard and correct approach for extracting the branch name from `GITHUB_REF`.
46-59: LGTM!

Good additions:
- Dynamic tagging via environment variable is correctly wired.
- OCI labels (`org.opencontainers.image.*`) follow best practices and provide useful traceability metadata (source repo, commit SHA, version).
```yaml
- name: Determine Docker tag
  id: tag
  shell: bash
  run: |
    if [[ "${{ env.BRANCH_NAME }}" == "main" ]] || [[ "${{ env.BRANCH_NAME }}" == "master" ]]; then
      echo "DOCKER_TAG=latest" >> $GITHUB_ENV
    else
      echo "DOCKER_TAG=${{ env.BRANCH_NAME }}" >> $GITHUB_ENV
    fi
```
Unreachable else branch and unused step ID.

Two observations:

1. The workflow only triggers on `main` and `master` branches (line 5), so the `else` branch (lines 42-44) will never execute. If this is intentional, consider removing the dead code; otherwise, expand the trigger to include other branches.
2. The `id: tag` is set but never referenced. Since outputs go to `GITHUB_ENV`, the id can be removed.
🔧 Suggested simplification (if only main/master is intended)

```diff
  - name: Determine Docker tag
-   id: tag
    shell: bash
    run: |
-     if [[ "${{ env.BRANCH_NAME }}" == "main" ]] || [[ "${{ env.BRANCH_NAME }}" == "master" ]]; then
-       echo "DOCKER_TAG=latest" >> $GITHUB_ENV
-     else
-       echo "DOCKER_TAG=${{ env.BRANCH_NAME }}" >> $GITHUB_ENV
-     fi
+     echo "DOCKER_TAG=latest" >> $GITHUB_ENV
```

Alternatively, if you intend to support feature branches in the future, consider sanitizing branch names (replacing `/` with `-`) since Docker tags cannot contain `/`.
🤖 Prompt for AI Agents
In @.github/workflows/build-and-push.yml around lines 36 - 44, The "Determine
Docker tag" step contains an unreachable else branch and an unused step id;
either remove the else block and delete "id: tag" (since you only trigger on
main/master and you write DOCKER_TAG to GITHUB_ENV), or if you intend to support
other branches, expand the workflow trigger to include branch patterns and
replace the branch-to-tag logic with a sanitized tag (e.g., set DOCKER_TAG by
taking env.BRANCH_NAME and replacing '/' with '-' before writing to GITHUB_ENV)
and keep or use an id only if you need step outputs.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@pmultiqc_service/templates/index.html`:
- Around line 547-560: The onlineHandler uses a non-existent upload.isUploading
property so the condition can be true incorrectly and trigger duplicate resume
calls; replace this check by tracking upload status yourself (e.g., introduce a
boolean flag like isPausedByUser or isUploadRunning) and update that flag when
you call upload.start() or upload.abort(), then change the onlineHandler
condition (which references currentUploadXhr and upload) to consult your flag
instead of upload.isUploading so resume is only attempted when the upload is
actually paused by the user.
- Around line 9-10: Update the embedded tus-js-client from v4.2.0 to v4.3.1 by
changing the script tag that references the library (the <script
src="...tus-js-client@4.2.0..."> element) to point to the 4.3.1 CDN URL and
replace the current integrity attribute value with the correct SRI hash for
v4.3.1; ensure the crossorigin attribute remains present if needed and test
uploads to confirm IETF draft and Node Buffer behavior improvements work as
expected.
🧹 Nitpick comments (1)
pmultiqc_service/templates/index.html (1)
1038-1038: Minor: Redundant CSS shorthand value.

Static analysis flagged `"0 0 5px 0"` as having a redundant fourth value. When the left margin equals the right margin (both 0), the shorthand can be simplified to `"0 0 5px"`.

Note: This same pattern appears in other example blocks as well (lines 930, 957, 984, 1011), so you may want to address them all together for consistency.

♻️ Suggested fix

```diff
- <h4 style="margin: 0 0 5px 0; color: #333;">PXD066146 - FragPipe Analysis</h4>
+ <h4 style="margin: 0 0 5px; color: #333;">PXD066146 - FragPipe Analysis</h4>
```
📜 Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
pmultiqc_service/templates/index.html
🧰 Additional context used
🪛 GitHub Check: CodeFactor
pmultiqc_service/templates/index.html
[warning] 1038-1038: pmultiqc_service/templates/index.html#L1038
Expected "0 0 5px 0" to be "0 0 5px". (shorthand-property-no-redundant-values)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
- GitHub Check: test_mzid_mgf
- GitHub Check: test_lfq
- GitHub Check: test_proteobench
- GitHub Check: test_mzid_mzML
- GitHub Check: test_tmt
- GitHub Check: Codacy Static Code Analysis
🔇 Additional comments (6)
pmultiqc_service/templates/index.html (6)
111-114: LGTM!

The Resume and Cancel buttons are properly structured with appropriate initial hidden state and onclick handlers.
267-299: LGTM!

The `resumeUpload` and `cancelUpload` functions correctly handle the TUS upload lifecycle. Good to see `cleanupUploadListeners()` being called in `cancelUpload` to prevent memory leaks.
393-434: Well-designed error classification.

The error handler properly distinguishes between network errors (resumable) and permanent failures. Network errors preserve the upload instance for manual resume, while permanent failures clean up resources appropriately.
566-601: LGTM!

The polling implementation is well-designed with proper URL encoding, bounded retry attempts, and reduced logging noise. The error handling gracefully retries on "not ready" responses.
806-824: LGTM!

Both utility functions are well-implemented. `escapeHtml` provides comprehensive XSS protection by escaping all necessary HTML entities, and `cleanupUploadListeners` properly removes event listeners to prevent memory leaks. These address concerns from previous reviews.
826-856: XSS vulnerability addressed.

The `escapeHtml(message)` call properly sanitizes user-controlled content before inserting into innerHTML. The conditional logic for showing/hiding the Resume button based on error type is well-implemented.