---
title: Jupyter Notebooks
category: Instructor > Autograding
---

### Example Configuration

Example Jupyter Notebook autograding configurations can be found on the [Submitty GitHub](https://github.com/Submitty/Submitty/tree/main/more_autograding_examples).

### Required Fields

For autograding Jupyter Notebooks, the following fields must be included in your autograding configuration.

```json
{
    "autograding" : {
        "submission_to_runner" : [ ... ],
        "work_to_details" : [ ... ]
    },
    "autograding_method": "docker",
    "container_options": {
        "container_image": "..."
    },
    "allow_system_calls": [
        ...
    ]
}
```

The method for allowing certain system calls is described in [System Call Filtering](/instructor/autograding/system_call_filtering).
You may also need to pass in resource limit values or a maximum submission size.

```json
"resource_limits" : {
    "RLIMIT_NPROC" : 32,
    "RLIMIT_FSIZE" : 20971520    // 20 MB
},
"max_submission_size": 10485760, // 10 MB
```

`RLIMIT_NPROC` allows the necessary resources (process and thread count) for the script to execute a submitted Jupyter Notebook.

`RLIMIT_FSIZE` allows the script to save the executed notebook in Submitty. In this case, the largest file that can be created and saved is 20 MB.
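
These limit values are plain byte counts. As a quick sketch (using only the Python standard library; illustrative, not part of Submitty), this is how the configured numbers relate to megabytes and how such a limit can be inspected on a Unix system:

```python
import resource  # Unix-only stdlib module for process resource limits

# The configuration values above are plain byte counts:
rlimit_fsize = 20 * 1024 * 1024    # 20971520 bytes = 20 MB
max_submission = 10 * 1024 * 1024  # 10485760 bytes = 10 MB

# On Linux, RLIMIT_FSIZE caps the largest file the grading process may
# create; the current (soft, hard) pair can be inspected like this:
soft, hard = resource.getrlimit(resource.RLIMIT_FSIZE)
print(rlimit_fsize, max_submission)
```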

### Precommands

Because of the saved outputs, you will need to apply precommands to each testcase that requires files generated by the validation method.

```json
{
    "pre_commands" : [
        {
            "command" : "cp",
            // Assuming the first testcase is the one where the validation method was run
            "testcase" : "test01",
            "source" : "cell*.*",
            "destination" : "./"
        }
    ],
    ...
}
```

### Validation Method

We offer a validation method that parses the cells of a Jupyter Notebook for autograding. The script requires a single specified
input notebook (the student's submission) and outputs an executed version of the notebook. If there are no restrictions on
the naming convention for the submitted notebook, you can use a wildcard to accept any input.

```json
"command": "jupyter_notebook_grader -i *.ipynb -o executed.ipynb"
```

The possible saved files, based on parsed output, are listed below.

| Cell Type | File Pattern         | Description                                                |
| --------- | -------------------- | ---------------------------------------------------------- |
| Markdown  | cell{i}.txt          | Contains the raw markdown source of the cell               |
| Code      | cell{i}_source.txt   | Contains the Python source code of the cell                |
|           | cell{i}_err.txt      | Contains the traceback if an error occurs, otherwise empty |
|           | cell{i}_stdout.txt   | Captures standard output (e.g., print() statements)        |
|           | cell{i}_stderr.txt   | Captures the standard error stream                         |
|           | cell{i}_result.txt   | Text representation of the cell's result                   |
|           | cell{i}.png          | Generated image outputs                                    |

By default, the filenames are generated based on the index of the cell in the notebook, starting from 1.
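
Conceptually, these per-cell files can be derived from the notebook's JSON, since an `.ipynb` file is plain JSON and each cell records its type and source. The sketch below (the `split_cells` helper is hypothetical, not part of Submitty, and skips executed outputs) illustrates the index-based naming scheme:

```python
import json
from pathlib import Path

def split_cells(notebook_path, out_dir="."):
    """Hypothetical illustration: write each cell's source to a file
    following the default cell-index naming scheme described above."""
    nb = json.loads(Path(notebook_path).read_text())
    for i, cell in enumerate(nb["cells"], start=1):  # indices start at 1
        source = "".join(cell["source"])
        if cell["cell_type"] == "markdown":
            Path(out_dir, f"cell{i}.txt").write_text(source)
        elif cell["cell_type"] == "code":
            Path(out_dir, f"cell{i}_source.txt").write_text(source)
```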

### Submitty IDs

Instructors can add Submitty IDs to mark specific cells of a Jupyter Notebook to grade. These specified IDs replace the
default filename of the saved output (e.g. cell1.txt --> {submitty_id}.txt). If students reorder the cells of their provided
Jupyter Notebook, this method will still find the correct cell to grade. Note that your autograding configuration will also
need to match the name of the saved file.

```json
{
    "pre_commands" : [
        {
            "command" : "cp",
            "testcase" : "test01",
            "source" : "{submitty_id}*.*",
            "destination" : "./"
        }
    ],
    "title": "STDOUT",
    "points": 1,
    "validation": [
        {
            "method": "diff",
            "actual_file": "{submitty_id}_stdout.txt",
            "expected_string" : "hello world!"
        }
    ]
}
```

Jupyter Notebook provides an option to easily edit cell metadata, introduced in [7.1.0](https://jupyter-notebook.readthedocs.io/en/stable/changelog.html#id116). In the `Edit` dropdown, you can find
`Edit Notebook Metadata`. An editor panel will open on the right side of your screen where you can type in the metadata.



An alternative option is to manually edit the Jupyter Notebook JSON itself to include the `submitty_id` property in a cell's metadata.
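
As a sketch of that manual route (the `tag_cell` helper below is hypothetical, not a Submitty tool), the notebook file is plain JSON, so the property can also be set programmatically:

```python
import json
from pathlib import Path

def tag_cell(notebook_path, cell_index, submitty_id):
    """Hypothetical helper: stamp one cell's metadata with a submitty_id."""
    path = Path(notebook_path)
    nb = json.loads(path.read_text())
    nb["cells"][cell_index]["metadata"]["submitty_id"] = submitty_id
    path.write_text(json.dumps(nb, indent=1))
```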