
Change the scans to be pyansys scans #375


Draft
margalva wants to merge 37 commits into main from maint/scans
37 commits
60ecd85
Change the scans to be pyansys scans
margalva Aug 6, 2025
8d49092
Merge branch 'main' into maint/scans
margalva Aug 6, 2025
802dcc8
Replace sbom generation action
margalva Aug 6, 2025
660ef38
Merge from main
margalva Aug 6, 2025
81a3176
Fix the name
margalva Aug 6, 2025
e4c1ce4
Stop upload at a previous step
margalva Aug 6, 2025
0d3379f
Add vulnerability scan on dev version
margalva Aug 6, 2025
0aac09c
Fix workflow
margalva Aug 6, 2025
3fd0f37
Add logs for vulnerability
margalva Aug 6, 2025
2bc481e
Address vulnerability scan results
margalva Aug 6, 2025
77f5aac
Address vulnerability scan results
margalva Aug 6, 2025
df67a27
Add logs for vulnerability
margalva Aug 6, 2025
cfe1ed4
Fix syntax
margalva Aug 6, 2025
5ab5996
Fix a couple issues
margalva Aug 6, 2025
2d94873
Syntax fixes
margalva Aug 7, 2025
c767280
automatic upload
margalva Aug 7, 2025
ca97dea
Exclude bandit from security scanning
margalva Aug 7, 2025
6d4bebc
Add bandit config file
margalva Aug 7, 2025
6c7f60b
config for bandit
margalva Aug 7, 2025
756043d
Fix path to yaml
margalva Aug 7, 2025
fa7a2ae
Try to deactivate
margalva Aug 7, 2025
db51442
Try to create our own safety scan
margalva Aug 7, 2025
d67fe83
first the new pipeline
margalva Aug 7, 2025
792ccac
Add safety package in pipeline
margalva Aug 7, 2025
593285c
fix the workflow - check out and build code
margalva Aug 7, 2025
4582f3c
Remove ansys workflow in favor of public workflows
margalva Aug 7, 2025
96d3dbb
Check that it can find security issues
margalva Aug 7, 2025
fc6f642
Make everything in a single workflow...
margalva Aug 7, 2025
5a79505
go back to pyansys workflows
margalva Aug 7, 2025
69e5f8a
Add scan for dev
margalva Aug 7, 2025
689dc53
Show logs
margalva Aug 7, 2025
3668a4c
add exclude from bandit
margalva Aug 7, 2025
84e8749
Add nosec
margalva Aug 7, 2025
e9dbaa5
Add more exception
margalva Aug 7, 2025
e6cdc24
Remove errors on pickle
margalva Aug 7, 2025
d140a50
more bandit fixes
margalva Aug 7, 2025
1d5a7a0
more bandit fixes
margalva Aug 7, 2025
16 changes: 0 additions & 16 deletions .github/workflows/nightly_scan.yml

This file was deleted.

79 changes: 79 additions & 0 deletions .github/workflows/scan_sbom.yml
@@ -0,0 +1,79 @@
name: Security Scan

env:
  MAIN_PYTHON_VERSION: '3.13'
  PACKAGE_NAME: 'ansys-dynamicreporting-core'

on:
  push:
    branches:
      - main
      - maint/*
      - release/*

jobs:

  sbom:
    name: Generate SBOM
    runs-on: ubuntu-latest

    steps:

      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}

      - name: Build wheelhouse
        uses: ansys/actions/build-wheelhouse@v10
        with:
          library-name: ${{ env.PACKAGE_NAME }}
          operating-system: ubuntu-latest
          python-version: ${{ env.MAIN_PYTHON_VERSION }}

      - name: Install from wheelhouse
        run: python -m pip install --no-index --find-links=wheelhouse ${{ env.PACKAGE_NAME }}

      - name: Generate SBOM with Syft
        uses: anchore/[email protected]
        with:
          format: cyclonedx-json
          output-file: sbom.cyclonedx.json
          upload-artifact: false

      - name: Upload SBOM as artifact
        uses: actions/upload-artifact@v4
        with:
          name: ${{ env.PACKAGE_NAME }}-sbom
          path: sbom.cyclonedx.json


  vulnerabilities:
    name: Vulnerabilities
    runs-on: ubuntu-latest

    steps:

      - name: PyAnsys Vulnerability check (on main)
        if: github.ref == 'refs/heads/main'
        uses: ansys/actions/[email protected]
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
          python-package-name: ${{ env.PACKAGE_NAME }}
          token: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
          run-bandit: false
          hide-log: false

      - name: PyAnsys Vulnerability check (on dev)
        if: github.ref != 'refs/heads/main'
        uses: ansys/actions/[email protected]
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
          python-package-name: ${{ env.PACKAGE_NAME }}
          token: ${{ secrets.PYANSYS_CI_BOT_TOKEN }}
          run-bandit: false
          dev-mode: true
          hide-log: false
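
The sbom job above uploads a CycloneDX JSON document as a build artifact. As a minimal sketch (not part of this PR; it assumes the artifact has been downloaded locally as sbom.cyclonedx.json), the file can be inspected with nothing but the standard library:

import json
from pathlib import Path

# Load the CycloneDX SBOM produced by the Syft step and list its components.
sbom = json.loads(Path("sbom.cyclonedx.json").read_text())
for component in sbom.get("components", []):
    # Every CycloneDX component carries at least a name and a version.
    print(f"{component.get('name')}=={component.get('version')}")
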
2 changes: 1 addition & 1 deletion codegen/adr_utils.txt
@@ -41,7 +41,7 @@ def in_ipynb():
return True
if "terminal" in ipy_str:
return False
except Exception: # todo: please specify the possible exceptions here.
except Exception as e: # todo: please specify the possible exceptions here.
return False


2 changes: 1 addition & 1 deletion pyproject.toml
@@ -41,7 +41,7 @@ dependencies = [
"django-guardian~=2.4",
"tzlocal~=5.0",
"numpy>=1.23.5,<3",
"python-pptx==0.6.19",
"python-pptx==0.6.23",
Collaborator:
@margalva Any version beyond 0.6.19 will break powerpoint export. I have to do an upgrade in TFS before it can be upgraded here

Collaborator Author:

@viseshrp thanks, I'm still working on this PR, but we do need to upgrade the version due to security issues on the current version - see https://data.safetycli.com/vulnerabilities/CVE-2023-4863/62149/
Let me know if this is more work than a simple update on both TFS and here

Collaborator:

yes, it requires some refactoring/rewrite of small portions of the code.

"pandas>=2.0",
"statsmodels>=0.14",
"scipy<=1.15.3", # breaks ADR if not included. Remove when statsmodels is updated
21 changes: 11 additions & 10 deletions src/ansys/dynamicreporting/core/adr_service.py
@@ -241,7 +241,7 @@ def connect(
username: str = "nexus",
password: str = "cei",
session: str | None = "",
) -> None:
) -> None: # nosec B107
"""
Connect to a running service.

@@ -282,8 +282,8 @@ def connect(
)
try:
self.serverobj.validate()
except Exception:
self.logger.error("Can not validate dynamic reporting server.\n")
except Exception as e:
self.logger.error(f"Can not validate dynamic reporting server.\nError: {str(e)}")
raise NotValidServer
# set url after connection succeeds
self._url = url
@@ -301,7 +301,7 @@ def start(
error_if_create_db_exists: bool = False,
exit_on_close: bool = False,
delete_db: bool = False,
) -> str:
) -> str: # nosec B107
"""
Start a new service.

@@ -392,11 +392,11 @@ def start(
if self._docker_launcher:
try:
create_output = self._docker_launcher.create_nexus_db()
except Exception: # pragma: no cover
except Exception as e: # pragma: no cover
self._docker_launcher.cleanup()
self.logger.error(
f"Error creating the database at the path {self._db_directory} in the "
"Docker container.\n"
"Error creating the database at the path {self._db_directory} in the "
f"Docker container.\nError: {str(e)}"
)
raise CannotCreateDatabaseError
for f in ["db.sqlite3", "view_report.nexdb"]:
@@ -511,10 +511,11 @@ def stop(self) -> None:
v = False
try:
v = self.serverobj.validate()
except Exception:
except Exception as e:
self.logger.error(f"Error: {str(e)}")
pass
if v is False:
self.logger.error("Error validating the connected service. Can't shut it down.\n")
self.logger.error("Error validating the connected service. Can't shut it down.")
else:
# If coming from a docker image, clean that up
try:
@@ -814,7 +815,7 @@ def delete(self, items: list) -> None:
try:
_ = self.serverobj.del_objects(items_to_delete)
except Exception as e:
self.logger.warning(f"Error in deleting items: {e}")
self.logger.warning(f"Error in deleting items: {str(e)}")

def get_report(self, report_name: str) -> Report:
"""
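
The # nosec B107 markers on connect() and start() above acknowledge Bandit's hardcoded-password-default check for the "nexus"/"cei" defaults. For context, a hypothetical alternative sketch (not what this PR does; ADR_DEFAULT_PASSWORD is an assumed variable name used only for illustration):

import os

# Hypothetical alternative (not part of this PR): avoid a literal default
# credential by falling back to an environment variable instead.
def connect(username: str = "nexus", password: str | None = None) -> None:
    password = password or os.environ.get("ADR_DEFAULT_PASSWORD", "")
    ...
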
12 changes: 6 additions & 6 deletions src/ansys/dynamicreporting/core/docker_support.py
@@ -64,8 +64,8 @@ def __init__(self, image_url: str | None = None, use_dev: bool = False) -> None:
# Load up Docker from the user's environment
try:
self._client: docker.client.DockerClient = docker.from_env()
except Exception: # pragma: no cover
raise RuntimeError("Can't initialize Docker")
except Exception as e: # pragma: no cover
raise RuntimeError(f"Can't initialize Docker: {str(e)}")
self._container: docker.models.containers.Container = None
self._image: docker.models.images.Image = None
# the Ansys / EnSight version we found in the container
@@ -92,8 +92,8 @@ def pull_image(self) -> docker.models.images.Image:
"""
try:
self._image = self._client.images.pull(self._image_url)
except Exception:
raise RuntimeError(f"Can't pull Docker image: {self._image_url}")
except Exception as e:
raise RuntimeError(f"Can't pull Docker image: {self._image_url}\n\n{str(e)}")
return self._image

def create_container(self) -> docker.models.containers.Container:
@@ -119,7 +119,7 @@ def copy_to_host(self, src: str, *, dest: str = ".") -> None:
tar_file.write(chunk)
# Extract the tar archive
with tarfile.open(tar_file_path) as tar:
tar.extractall(path=output_path)
tar.extractall(path=output_path) # nosec B202
# Remove the tar archive
tar_file_path.unlink()
except Exception as e:
@@ -176,7 +176,7 @@ def start(self, host_directory: str, db_directory: str, port: int, ansys_version
existing_names = [x.name for x in self._client.from_env().containers.list()]
container_name = "nexus"
while container_name in existing_names:
container_name += random.choice(string.ascii_letters)
container_name += random.choice(string.ascii_letters) # nosec B311
if len(container_name) > 500:
raise RuntimeError("Can't determine a unique Docker container name.")

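
The # nosec B311 marker above silences Bandit's warning about the random module; the container-name suffix only needs to be unique, not unpredictable. For completeness, a hypothetical sketch (not part of this PR) of the CSPRNG-backed alternative Bandit has in mind:

import secrets
import string

# Hypothetical sketch: secrets.choice draws from the OS CSPRNG, so Bandit's
# B311 check would not fire; here uniqueness is all that actually matters.
suffix = "".join(secrets.choice(string.ascii_letters) for _ in range(8))
container_name = f"nexus-{suffix}"
print(container_name)
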
7 changes: 4 additions & 3 deletions src/ansys/dynamicreporting/core/examples/downloads.py
@@ -42,9 +42,10 @@ def check_url_exists(url: str) -> bool:
logging.debug(f"Passed url is invalid: {url}\n")
return False
try:
with request.urlopen(url) as response:
with request.urlopen(url) as response: # nosec B310
return response.status == 200
except Exception:
except Exception as e:
logging.debug(f"Check url error: {str(e)}\n")
return False


@@ -61,7 +62,7 @@ def get_url_content(url: str) -> str:
str
content of the URL
"""
with request.urlopen(url) as response:
with request.urlopen(url) as response: # nosec B310
return response.read()


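
Bandit's B310 flags urllib.request.urlopen because it also accepts file:// and other schemes; the # nosec B310 markers above assert that the URLs are validated before use. A minimal hypothetical sketch (illustration only, not code from this PR) of the scheme allowlist that makes such a call safe without a suppression:

from urllib import request
from urllib.parse import urlparse

def open_http_url(url: str):
    # Restrict to http/https before opening; rejects file:// and custom schemes.
    if urlparse(url).scheme not in ("http", "https"):
        raise ValueError(f"Unsupported URL scheme: {url}")
    return request.urlopen(url)
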
2 changes: 1 addition & 1 deletion src/ansys/dynamicreporting/core/serverless/item.py
@@ -3,7 +3,7 @@
from html.parser import HTMLParser as BaseHTMLParser
import io
from pathlib import Path
import pickle
import pickle # nosec B403
import platform
import uuid

5 changes: 4 additions & 1 deletion src/ansys/dynamicreporting/core/utils/encoders.py
@@ -34,7 +34,10 @@ def default(self, obj):
cls = list if isinstance(obj, (list, tuple)) else dict
try:
return cls(obj)
except Exception:
except Exception as e: # nosec
error_str = f"Object of type {type(obj).__name__} is not JSON serializable: "
error_str += str(e)
raise TypeError(error_str)
pass
elif hasattr(obj, "__iter__"):
return tuple(item for item in obj)
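
With this change, the encoder's default() hook raises a descriptive TypeError instead of silently swallowing the failure. A short hypothetical usage sketch (illustration only, not this module's encoder) of how json.dumps surfaces such a hook:

import json

class ExampleEncoder(json.JSONEncoder):
    # Hypothetical encoder: sets become sorted lists, anything else raises a
    # descriptive TypeError, mirroring the behavior introduced above.
    def default(self, obj):
        if isinstance(obj, set):
            return sorted(obj)
        raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

print(json.dumps({"tags": {"b", "a"}}, cls=ExampleEncoder))  # {"tags": ["a", "b"]}
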
12 changes: 6 additions & 6 deletions src/ansys/dynamicreporting/core/utils/extremely_ugly_hacks.py
@@ -1,6 +1,6 @@
# All Python3 migration-related ugly hacks go here.
import base64
import pickle
import pickle # nosec B403
from uuid import UUID

from .report_utils import text_type
@@ -54,18 +54,18 @@ def safe_unpickle(input_data, item_type=None):
try:
# be default, we follow python3's way of loading: default encoding is ascii
# this will work if the data was dumped using python3's pickle. Just do the usual.
data = pickle.loads(bytes_data)
except Exception:
data = pickle.loads(bytes_data) # nosec B301 B502
except Exception: # nosec
try:
data = pickle.loads(bytes_data, encoding="utf-8")
data = pickle.loads(bytes_data, encoding="utf-8") # nosec B301 B502
except Exception:
# if it fails, which it will if the data was dumped using python2's pickle, then:
# As per https://docs.python.org/3/library/pickle.html#pickle.loads,
# "Using encoding='latin1' is required for unpickling NumPy arrays and instances of datetime,
# date and time pickled by Python 2."
# The data does contain a numpy array. So:
try:
data = pickle.loads(bytes_data, encoding="latin-1")
data = pickle.loads(bytes_data, encoding="latin-1") # nosec B301 B502

# if the stream contains international characters which were 'loaded' with latin-1,
# we get garbage text. We have to detect that and then use a workaround.
@@ -80,7 +80,7 @@ def safe_unpickle(input_data, item_type=None):
# this is a tree item ONLY case that has a pickled datetime obj,
# we use bytes as the encoding to workaround this issue, because
# other encodings will not work.
data = pickle.loads(bytes_data, encoding="bytes")
data = pickle.loads(bytes_data, encoding="bytes") # nosec B301 B502

# check again, just in case
if item_type == "tree":
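
The # nosec B301 B502 markers assert that these pickled payloads come from the application's own database rather than untrusted input. A small self-contained illustration (not from this PR) of why Bandit flags pickle.loads in the first place:

import pickle

class Exploit:
    # __reduce__ lets a pickle payload name any callable to run at load time,
    # which is why unpickling untrusted data is dangerous.
    def __reduce__(self):
        return (print, ("arbitrary code ran during pickle.loads",))

payload = pickle.dumps(Exploit())
pickle.loads(payload)  # prints the message: the callable executed on load
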
3 changes: 2 additions & 1 deletion src/ansys/dynamicreporting/core/utils/filelock.py
@@ -251,8 +251,9 @@ def acquire(self, timeout=None, poll_intervall=0.05):
poll_intervall,
)
time.sleep(poll_intervall)
except Exception:
except Exception as e:
# Something did go wrong, so decrement the counter.
logger.error(f"Exception: {str(e)}")
with self._thread_lock:
self._lock_counter = max(0, self._lock_counter - 1)

4 changes: 2 additions & 2 deletions src/ansys/dynamicreporting/core/utils/geofile_processing.py
@@ -9,7 +9,7 @@
import io
import os
import platform
import subprocess
import subprocess # nosec B78 B603 B404
import typing
import zipfile

@@ -213,7 +213,7 @@ def rebuild_3d_geometry(csf_file: str, unique_id: str = "", exec_basis: str = No
stderr=subprocess.DEVNULL,
close_fds=True,
creationflags=create_flags,
)
) # nosec B78 B603
except Exception as e:
print(f"Warning: unable to convert '{csf_file}' into AVZ format: {str(e)}")
# At this point, if we have an original AVZ file or a converted udrw file, we
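
The subprocess call above passes an argument list and never sets shell=True, so the suppressed Bandit findings (B404/B603) are about auditing the call rather than shell injection. A tiny portable sketch (illustration only, not this module's converter invocation) of the list-argument form:

import subprocess
import sys

# Illustration only: list arguments with shell=False cannot be re-interpreted
# by a shell, which is what the suppressed Bandit checks are auditing for.
subprocess.run([sys.executable, "-c", "print('ok')"], check=True, shell=False)
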