Slow inference #265

Open
@Echolink50

Description

I couldn't find a solid answer on inference speed, so I just gave it a try. I'm running an RTX 2060 12GB. My own examples didn't run (a "tensors not found" error or something similar). I ran demo 1, and it took 10 minutes for the 10-second clip. Is this normal? I have to say, for 10 GB worth of models and pickle files, that seems disappointing. Thanks for the program, though.
