Conversation
    ),
    'wd14-convnextv2.v1': WaifuDiffusionInterrogator(
        'WD14 ConvNeXTV2 v1',
        repo_id='SmilingWolf/wd-v1-4-convnextv2-tagger-v2'
This is odd: the key says v1 but the repository name says v2. Was the repository named incorrectly? If so, I'm adding a comment.
        repo_id='SmilingWolf/wd-v1-4-convnextv2-tagger-v2'
    ),
    'wd14-swinv2.v2': WaifuDiffusionInterrogator(
        'WD14 SwinV2 v1',
Same question here, but then I'd say it should be

    'wd14-swinv2.v1': WaifuDiffusionInterrogator(
    -----------------^^
        'WD14 SwinV2 v1',
        repo_id='SmilingWolf/wd-v1-4-swinv2-tagger-v2'
    ),

    target_size = (
        (target_size[0] // 4) * 4,
        (target_size[1] // 4) * 4,
    )
This is the same as

    target_size = (target_size[0] & ~3, target_size[1] & ~3)

right?
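For what it's worth, the two forms are indeed equivalent: both round down to the nearest multiple of 4 (in Python, where `//` floors and integers behave as two's complement, they even agree for negative values). A quick sanity-check sketch, with function names of my own choosing:

```python
def floor4_div(n: int) -> int:
    # the original form: integer-divide by 4, then multiply back
    return (n // 4) * 4

def floor4_mask(n: int) -> int:
    # the proposed form: clear the two lowest bits
    return n & ~3

# both round down to the nearest multiple of 4
assert all(floor4_div(n) == floor4_mask(n) for n in range(4096))
```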
        'onnxruntime-gpu'
    )

    run_pip(f'install {package}', 'onnxruntime')
I can just add onnxruntime to the requirements.txt, right?

    onnxruntime-silicon; sys_platform == 'darwin'
    onnxruntime-gpu; sys_platform != 'darwin'

or people can adapt this.
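Those two lines use PEP 508 environment markers: pip evaluates the condition after the semicolon and installs the package only when it holds. A minimal sketch of the same selection logic in plain Python (the function name is my own, not from the extension):

```python
import sys

def onnxruntime_package() -> str:
    # Mirrors the requirements.txt markers above:
    # onnxruntime-silicon on macOS, onnxruntime-gpu everywhere else.
    if sys.platform == 'darwin':
        return 'onnxruntime-silicon'
    return 'onnxruntime-gpu'
```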
    def __init__(
        self,
        name: str,
        repo_id: str,
I have some interrogators without a repo_id, so I've changed this to repo_id=None; that also requires some changes elsewhere.
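A minimal sketch of what that signature change could look like (the class body and attribute handling here are my assumptions, not the actual diff):

```python
from typing import Optional

class Interrogator:
    def __init__(self, name: str, repo_id: Optional[str] = None) -> None:
        self.name = name
        # None signals a locally stored model with no Hugging Face repository
        self.repo_id = repo_id
```

Call sites that previously had to pass a repo_id can then simply omit it for local models.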
        unload_model_after_running: bool
    ):
        if interrogator not in utils.interrogators:
            interrogator: Interrogator = next((i for i in utils.interrogators.values() if interrogator_name == i.name), None)
    ', '.join(processed_tags),
    ratings,
    tags,
    dict(list(tags.items())[:200]),
In my branch there is already a solution for this (the slider in Settings -> Tagger), so I can omit this.
    interrogators[path.name] = DeepDanbooruInterrogator(path.name, path)

    return sorted(interrogators.keys())

    interrogators["deepdanbooru"] = DeepDanbooruInterrogator(path.name, path)
path.name seems to work as well, why change it?
    if self.model is not None:
        self.model = None
        unloaded = True
        gc.collect()
what exactly does gr.collect() do?
You may have already noticed that this line is gc.collect() rather than gr.collect(). I guess the PR author's original purpose was to call the garbage collector (gc) to run a full collection. But as we already know, they abandoned this approach due to bugs in the underlying library.
Ah yes, I noticed and forgot. I also found this, so numba is a possible solution for CUDA users.
The new solution looks cool, please give it a try.
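For context: gc.collect() forces an immediate full garbage-collection pass, so dropping the last reference to the model and collecting right away returns memory sooner than waiting for CPython's automatic collector. A self-contained sketch of the pattern quoted above (the holder class is hypothetical, not from the extension):

```python
import gc

class ModelHolder:
    """Hypothetical holder mirroring the unload logic discussed above."""

    def __init__(self) -> None:
        self.model = object()  # stand-in for a loaded inference model

    def unload(self) -> bool:
        unloaded = False
        if self.model is not None:
            self.model = None  # drop the last strong reference
            unloaded = True
            gc.collect()       # reclaim cyclic garbage now rather than later
        return unloaded
```

Note that gc.collect() only helps CPython's own heap; as the thread notes, it did not solve the VRAM issue here because the underlying library holds its own allocations.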
    except ImportError:
        # only one of these packages should be installed at a time in any one environment
        # https://onnxruntime.ai/docs/get-started/with-python.html#install-onnx-runtime
        # TODO: remove old package when the environment changes?
In what scenario do you expect the environment to change? Installed on an external drive? I don't think we have to support this.
        repo_id: str,
        model_path='model.onnx',
        tags_path='selected_tags.csv',
        **kwargs
The **kwargs are now obsolete.
    tags_path = Path(hf_hub_download(
        **self.kwargs, filename=self.tags_path))
    model_path = hf_hub_download(self.repo_id, self.model_path)
    tags_path = hf_hub_download(self.repo_id, self.model_path)
    tags_path = hf_hub_download(self.repo_id, self.model_path)
    ^^^^^^^^^--------------------------------------^^^^^^^^^^

This is a bug, I've fixed it.
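The copy-paste slip is easiest to see side by side. Here is a self-contained sketch of the fix with hf_hub_download stubbed out (the real function lives in huggingface_hub and takes repo_id and filename; the stub below only builds a path string for illustration):

```python
def hf_hub_download(repo_id: str, filename: str) -> str:
    # stub standing in for huggingface_hub.hf_hub_download
    return f"{repo_id}/{filename}"

class WaifuDiffusionInterrogator:
    repo_id = 'SmilingWolf/wd-v1-4-convnextv2-tagger-v2'
    model_path = 'model.onnx'
    tags_path = 'selected_tags.csv'

    def download(self):
        model_path = hf_hub_download(self.repo_id, self.model_path)
        # the buggy line passed self.model_path here a second time;
        # the fix downloads self.tags_path instead
        tags_path = hf_hub_download(self.repo_id, self.tags_path)
        return model_path, tags_path
```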
I could not simply merge it anymore, so I manually applied the changes and fixed the issues mentioned here. Thanks for the work; I'll mention you in the changelog and commit message. I've posed a lot of questions, sorry about that. Only the ones where I've added eyes are still open questions to me.
see the discussion here