I recently started building a movie/show collection again on my home NAS.

I know that H.265 can generally get away with 25-50% less bitrate than H.264 at the same or better quality. But what’s the golden zone for each codec? 10 Mbps for a 1080p H.264 movie? And would something like 5 Mbps in H.265 be on par with that at 1080p? What about 4K?

For file size: would roughly 25 GB for a 2-hour 1080p movie be near or at original Blu-ray/digital quality?
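
For a sanity check on the math, size is just bitrate × runtime. Here’s a quick back-of-the-envelope sketch (video stream only, decimal GB, ignoring audio/subs/container overhead):

```python
# Rough size/bitrate math for a video stream. Audio tracks and container
# overhead would add a bit on top of these numbers.

def size_gb(bitrate_mbps: float, runtime_min: float) -> float:
    """Approximate file size in GB for a given average video bitrate."""
    bits = bitrate_mbps * 1_000_000 * runtime_min * 60
    return bits / 8 / 1_000_000_000  # bits -> bytes -> GB

def needed_bitrate_mbps(target_gb: float, runtime_min: float) -> float:
    """Approximate average bitrate needed to land on a target file size."""
    bits = target_gb * 1_000_000_000 * 8
    return bits / (runtime_min * 60) / 1_000_000

print(round(size_gb(10, 120), 1))              # ~9.0 GB at 10 Mbps for 2 hours
print(round(size_gb(5, 120), 1))               # ~4.5 GB at 5 Mbps for 2 hours
print(round(needed_bitrate_mbps(25, 120), 1))  # ~27.8 Mbps average to fill 25 GB
```

So a 25 GB target for a 2-hour movie works out to roughly 28 Mbps average, which is in the same ballpark as typical 1080p Blu-ray video bitrates.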

  • AshleyUnciaB · 1 year ago
    I’m not asserting this isn’t the case, and I haven’t noticed it myself, but I can’t see why it would matter for the actual encoding. For decoding I’ve seen it make a difference, but that’s mostly the pre-Skylake iGPUs using a poor implementation of QuickSync.

    No, it’s totally a fact. Software encoding yields better results in terms of ‘quality per megabyte’ than hardware encoding, unless your software encoding settings are really sloppy. If size efficiency matters more than anything, you use software encoding, or you’re basically leaving money on the table. Of course, the downside is that hardware encoding is a whoooooooooooooole heck of a lot faster.
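
    If you want to see the gap for yourself, here’s a rough sketch of the two paths, using Python to drive ffmpeg. It assumes a build with libx265 and NVIDIA’s NVENC; the filenames and quality numbers are just placeholders, the exact hardware-encoder flags vary by ffmpeg version and GPU, and equal CRF/CQ values don’t mean equal quality across encoders.

```python
# Illustrative software vs. hardware H.265 encodes of the same source.
# Assumes an ffmpeg build with libx265 and NVENC; filenames are placeholders.
import subprocess

SRC = "movie_remux.mkv"  # hypothetical source file

# Software encode (x265): slow, but best quality-per-megabyte.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "libx265", "-preset", "slow", "-crf", "20",
    "-c:a", "copy",
    "movie_x265_sw.mkv",
], check=True)

# Hardware encode (NVIDIA NVENC): much faster, but bigger files at
# comparable quality (or worse quality at the same size).
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "hevc_nvenc", "-preset", "p7", "-rc", "vbr", "-cq", "20", "-b:v", "0",
    "-c:a", "copy",
    "movie_hevc_hw.mkv",
], check=True)
```

    Compare the output sizes at a quality level you’re happy with; the software encode will generally come in noticeably smaller for comparable quality, it’ll just take far longer to produce.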