Skip to content

Fix TypeError when simpletuner.__file__ is None during training launch #2585

Merged
bghira merged 3 commits into main from copilot/fix-typeerror-training-start
Feb 8, 2026
Conversation

Contributor

Copilot AI commented Feb 8, 2026

simpletuner.__file__ can be None under certain install configurations (namespace packages, some editable installs), causing Path(None) to raise TypeError at training start.

train_py = Path(simpletuner.__file__).parent / "train.py"
           ~~~~^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument should be a str or an os.PathLike object where __fspath__ returns a str, not 'NoneType'
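For context, a namespace package (one importable without an `__init__.py` on the path) reports `__file__` as None while its spec still exposes `submodule_search_locations`. A quick way to observe this from a Python shell, assuming `simpletuner` resolves that way in the affected environment:

```python
# Illustration only: inspect how the import system sees the package.
import importlib.util

spec = importlib.util.find_spec("simpletuner")
if spec is not None:
    print(spec.origin)                                  # None for a namespace package
    print(list(spec.submodule_search_locations or []))  # directories holding the submodules
```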

Changes

  • simpletuner/__init__.py — Add _get_package_dir() helper that resolves the package directory via __file__, falling back to importlib.util.find_spec (checking both spec.origin and spec.submodule_search_locations); a sketch of this fallback follows the list
  • trainer.py, cli/train.py, test_e2e_examples.py — Replace all bare Path(simpletuner.__file__).parent calls with simpletuner._get_package_dir()
  • paths.py — Remove duplicated __file__ is None guard in get_simpletuner_root(), delegate to the shared helper
  • tests/test_get_package_dir.py — Tests covering normal path, __file__ = None with importlib fallback, and unresolvable package cases
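
A minimal sketch of the fallback logic described above, as it might look in simpletuner/__init__.py; the merged implementation may differ in naming and error handling:

```python
# Sketch only: resolve the package directory without assuming __file__ is set.
import importlib.util
from pathlib import Path


def _get_package_dir() -> Path:
    """Return the simpletuner package directory even when __file__ is None."""
    # Normal case: a regular package sets __file__ to .../simpletuner/__init__.py
    module_file = globals().get("__file__")
    if module_file:
        return Path(module_file).resolve().parent

    # Fallback: ask the import system directly.
    spec = importlib.util.find_spec("simpletuner")
    if spec is not None:
        if spec.origin:  # regular package seen through its spec
            return Path(spec.origin).resolve().parent
        if spec.submodule_search_locations:  # namespace package: take the first search path
            return Path(next(iter(spec.submodule_search_locations))).resolve()

    raise RuntimeError("Unable to determine the simpletuner package directory")
```

Per the change list, call sites such as the one in the traceback then read something like `train_py = simpletuner._get_package_dir() / "train.py"` instead of `Path(simpletuner.__file__).parent / "train.py"`.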
Original prompt

This section details the original issue you should resolve

<issue_title>Training start immediately fails with TypeError</issue_title>
<issue_description>Hi there, thanks for your work on this project. I have been able to run the Web UI in a venv on Linux + ROCm with success, but I always get this error a few seconds after I try to start training:

Multiple distributions found for package optimum. Picked distribution: optimum-quanto
[RANK 0] 2026-02-08 11:58:31,628 [INFO] Patched Attention with flexible fusion (permanent=True)
[RANK 0] 2026-02-08 11:58:31,704 [ERROR] Function error: argument should be a str or an os.PathLike object where __fspath__ returns a str, not 'NoneType'
[RANK 0] 2026-02-08 11:58:31,705 [ERROR] Traceback (most recent call last):
  File "<string>", line 322, in <module>
    result = target_func(wrapped_config)
  File "/workspace/SimpleTuner/simpletuner/helpers/training/trainer.py", line 6780, in run_trainer_job
    accelerate_result = _launch_with_accelerate()
  File "/workspace/SimpleTuner/simpletuner/helpers/training/trainer.py", line 6606, in _launch_with_accelerate
    train_py = Path(simpletuner.__file__).parent / "train.py"
               ~~~~^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.13/pathlib/_local.py", line 503, in __init__
    super().__init__(*args)
    ~~~~~~~~~~~~~~~~^^^^^^^
  File "/usr/lib64/python3.13/pathlib/_local.py", line 132, in __init__
    raise TypeError(
    ...<2 lines>...
        f"not {type(path).__name__!r}")
TypeError: argument should be a str or an os.PathLike object where __fspath__ returns a str, not 'NoneType'

And just in case you need the info:

(venv) [root@517da84a37fa workspace]# rocminfo
ROCk module is loaded
=====================    
HSA System Attributes    
=====================    
Runtime Version:         1.18
Runtime Ext Version:     1.16
System Timestamp Freq.:  1000.000000MHz
Sig. Max Wait Duration:  18446744073709551615 (0xFFFFFFFFFFFFFFFF) (timestamp count)
Machine Model:           LARGE                              
System Endianness:       LITTLE                             
Mwaitx:                  DISABLED
XNACK enabled:           NO
DMAbuf Support:          YES
VMM Support:             YES

==========               
HSA Agents               
==========               
*******                  
Agent 1                  
*******                  
  Name:                    AMD RYZEN AI MAX+ 395 w/ Radeon 8060S
  Uuid:                    CPU-XX                             
  Marketing Name:          AMD RYZEN AI MAX+ 395 w/ Radeon 8060S
  Vendor Name:             CPU                                
  Feature:                 None specified                     
  Profile:                 FULL_PROFILE                       
  Float Round Mode:        NEAR                               
  Max Queue Number:        0(0x0)                             
  Queue Min Size:          0(0x0)                             
  Queue Max Size:          0(0x0)                             
  Queue Type:              MULTI                              
  Node:                    0                                  
  Device Type:             CPU                                
  Cache Info:              
    L1:                      49152(0xc000) KB                   
  Chip ID:                 0(0x0)                             
  ASIC Revision:           0(0x0)                             
  Cacheline Size:          64(0x40)                           
  Max Clock Freq. (MHz):   5187                               
  BDFID:                   0                                  
  Internal Node ID:        0                                  
  Compute Unit:            32                                 
  SIMDs per CU:            0                                  
  Shader Engines:          0                                  
  Shader Arrs. per Eng.:   0                                  
  WatchPts on Addr. Ranges:1                                  
  Memory Properties:       
  Features:                None
  Pool Info:               
    Pool 1                   
      Segment:                 GLOBAL; FLAGS: FINE GRAINED        
      Size:                    131155396(0x7d145c4) KB            
      Allocatable:             TRUE                               
      Alloc Granule:           4KB                                
      Alloc Recommended Granule:4KB                                
      Alloc Alignment:         4KB                                
      Accessible by all:       TRUE                               
    Pool 2                   
      Segment:                 GLOBAL; FLAGS: EXTENDED FINE GRAINED
      Size:                    131155396(0x7d145c4) KB            
      Allocatable:             TRUE                               
      Alloc Granule:           4KB                                
      Alloc Recommended Granule:4KB                                
      Alloc Alignment:         4KB       ...





- Fixes bghira/SimpleTuner#2581


Copilot AI and others added 2 commits February 8, 2026 23:09
Add _get_package_dir() helper to simpletuner/__init__.py that falls back
to importlib.util.find_spec when __file__ is None (namespace packages,
certain editable installs). Replace all bare simpletuner.__file__ usages
with this helper across trainer.py, cli/train.py, paths.py, and tests.

Co-authored-by: bghira <59658056+bghira@users.noreply.github.com>
…a test case

Co-authored-by: bghira <59658056+bghira@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Fix TypeError occurring when starting training" to "Fix TypeError when simpletuner.__file__ is None during training launch" on Feb 8, 2026
Copilot AI requested a review from bghira on February 8, 2026 23:11
bghira marked this pull request as ready for review on February 8, 2026 23:19
bghira linked an issue on Feb 8, 2026 that may be closed by this pull request
bghira merged commit 50f5279 into main on Feb 8, 2026
2 checks passed
bghira deleted the copilot/fix-typeerror-training-start branch on February 8, 2026 23:40
