For what it's worth, I will eventually try to "remaster" a small clip to test this myself. I used to try upscaling things back before I got into AI, and I read advice at the time suggesting not to bother, because it would only introduce additional noise into the final image anyway, so you should just let your hardware do the upscaling. I use a Plex server with my 4K TV, and even now, streaming something taken from a DVD at 480p looks decent enough. So I'm wondering: if I used AI to at least enhance (remaster) the image, would that yield a cleaner-looking picture (i.e., smoothed-out features, mostly a curse), so that when viewed on hardware that does the upscaling for me, it would look just fine?