I have a 3080 with 10GB of VRAM. It's faster in the sense that it's less work (I can just point it at a folder and leave). I also like that it's all handled locally instead of uploading stuff. I used your guide to learn everything, but tweaked the Colab script to run as a plain Python script locally.

It takes me about 20-30 minutes per 2-hour movie with the medium model, and closer to a full 2 hours per movie with the large model. On Linux I didn't have enough VRAM to finish with the large model, but the same docker run on Windows works. Since large model + 3080 IS working for me on Windows now, I've kind of restarted everything (I had 200+ movies' SRTs done on medium; it's going to take WAY longer to get that many done on large).
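For anyone who wants the same point-it-at-a-folder workflow, a stripped-down sketch of that kind of script looks something like this. It assumes the openai-whisper package with ffmpeg on PATH, and the folder path and extension list are just placeholders, not what I actually use:

```python
# Minimal batch-transcription sketch: load the model once, walk a folder of
# movies, and write an .srt next to each file. Assumptions: openai-whisper
# installed, ffmpeg on PATH, placeholder folder path and extensions.
import pathlib
import whisper

MOVIE_DIR = pathlib.Path("D:/movies")   # hypothetical folder to point it at
MODEL_SIZE = "medium"                    # switch to "large" if it fits in VRAM
EXTS = {".mkv", ".mp4", ".avi"}          # guess at common video extensions


def to_srt_time(seconds: float) -> str:
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"


def write_srt(segments, srt_path: pathlib.Path) -> None:
    """Write Whisper segments out as a .srt file next to the movie."""
    with srt_path.open("w", encoding="utf-8") as f:
        for i, seg in enumerate(segments, start=1):
            f.write(f"{i}\n{to_srt_time(seg['start'])} --> {to_srt_time(seg['end'])}\n")
            f.write(seg["text"].strip() + "\n\n")


def main() -> None:
    model = whisper.load_model(MODEL_SIZE)   # downloads weights on first run
    for movie in sorted(MOVIE_DIR.rglob("*")):
        if movie.suffix.lower() not in EXTS:
            continue
        srt_path = movie.with_suffix(".srt")
        if srt_path.exists():                # skip movies already finished
            continue
        print(f"Transcribing {movie.name} ...")
        result = model.transcribe(str(movie))
        write_srt(result["segments"], srt_path)


if __name__ == "__main__":
    main()
```

The skip-if-the-.srt-already-exists check is what makes it safe to stop and restart the batch partway through a big library.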