Really, the fact that the files are already compressed is not the crucial problem. It's this: compression in general can only work if the data has some kind of redundancy in it. That's practically always the case for uncompressed files – however, it's not necessarily obvious what the redundancy is.

General-purpose compression algorithms mostly target the kind of redundancy obvious in text files: many words turn up not just once but plenty of times in identical form, phrases of words can be combined, etc. The algorithms are quite good at generalising this to anything from ASCII-encoded phone number lists over Chinese poetry to binary machine code, but they can't possibly work for every kind of data.

In particular, media files are conceptually analogue data in a noisy digital representation. That means there's not really any of that textfile-style redundancy at all: some motifs might be recurring, but always with a slightly different configuration of sensor noise. That's why all compressed image/AV formats use some cleverly chosen transformation as their first encoding step, normally based on DCT or wavelets. These transformations, roughly speaking, move the picture portions and the noise portions into different locations, so they can be well separated; with lossy compression you retain only the information you think is most "important", which does not include the noise, while the "good" information has lots of redundancy. (That's not really how it works, but sort of.) If general-purpose compressors used these transformations, the effect would be the opposite: most digital information would actually be misclassified as some kind of noise, because it lacks the "smooth" structure you find in analogue signals. And after lossy video compression, obviously neither analogue smoothness nor digital recurrence can be found anymore (if they could, the codecs would run another bzip stage or something themselves!).

Without knowing how long these files are and what format and content type they have, it's hard to tell whether they have room to be shrunk without much visible quality loss. Blu-rays with 1080p video tend to be upwards of 25 GB, so it's not unlikely you're already at an optimal quality-to-size ratio for H.264. If you want to compress these files further, you will have to reduce the quality.

You can try using ffmpeg or avconv to convert the files. You could start with:

ffmpeg -i input_file.mp4 -preset slower -crf 20 -c:a copy output_file.mp4

Increase the -crf value to decrease file size and quality; I don't recommend going any higher than 25. You can change the preset to slow or medium to increase speed, but your file size will suffer compared to slower or even veryslow (if you're very patient!).
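As a quick sanity check of the redundancy point, you can run a general-purpose compressor on redundant versus noise-like data yourself. This is a minimal sketch, not part of the original answer: the file names are arbitrary, and it assumes a POSIX shell with gzip 1.6+ (for -k) and /dev/urandom available.

```shell
# Redundant input: the same phrase repeated over and over (~1 MB)
yes "the quick brown fox jumps over the lazy dog" | head -c 1000000 > redundant.txt

# Noise-like input: random bytes, standing in for sensor noise or
# already-compressed data (~1 MB)
head -c 1000000 /dev/urandom > noise.bin

# Compress both with a general-purpose compressor, keeping the originals
gzip -kf redundant.txt noise.bin

# Compare sizes: the repeated text shrinks to a tiny fraction of its size,
# while the noise-like data barely shrinks at all (it may even grow slightly)
wc -c redundant.txt.gz noise.bin.gz
```

The same effect shows up if you gzip an MP4 or JPEG: the output is about as large as the input, because the codec has already squeezed out the redundancy a general-purpose compressor could find.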