Two issues: 999-chunk limit (?) and it/s falling off with long generations #42

Description

@esdinev

I tested with a single txt file, and I guess the book was a bit long, because it was split into around 3500 chunks.
That caused two issues:

  1. At 999 chunks the app crashed, though I managed to resume. This one might be a coincidence.
  2. After the whole generation finished, it couldn't concatenate the whole book: it only concatenated the first 999 chunks.
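A guess at what might be going on (this is an assumption; I don't know how the app actually names or collects chunk files): if chunks are written with zero-padded three-digit numbers starting at 1 and then collected with a fixed-width pattern, exactly the first 999 survive, which matches the symptom:

```python
import fnmatch

# Hypothetical file naming: chunk_001.wav ... chunk_3500.wav
# (the real naming scheme in the app is not confirmed).
filenames = [f"chunk_{i:03d}.wav" for i in range(1, 3501)]

# A fixed three-character wildcard only matches chunk_001 ... chunk_999;
# chunk_1000 and up have four digits and are silently dropped.
matched = fnmatch.filter(filenames, "chunk_???.wav")
print(len(matched))  # 999
```

If the concatenation step uses any fixed-width pattern or a hard-coded three-digit index like this, that would explain both the 999 cutoff and the crash around chunk 999.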

And there's another issue: if the model runs for a long time, it starts losing speed. I get around 65 it/s at the beginning, and after an hour or two it drops to around 20. Not sure whether that's an issue on your end or with the model itself.

Metadata

Assignees
No one assigned

Labels
No labels

Projects
No projects

Milestone
No milestone

Relationships
None yet

Development
No branches or pull requests