I did try it on my x86 machine and it did not change the result per ffprobe.
This may be misleading, since ffmpeg-4.2 does not seem to even look for le.
That is the disturbing feature: the probes and the header data carry no be/le
information. This means all it can do is create bad data that is not decodable
by most machines. If it works on your machine, that is an outlier, not the
normal condition. You should test the data your machine generates with:
[root@keystone Downloads]# ls -ltr | tail -1
-rw-r--r-- 1 root root 179184 Mar 24 12:09 sixteen_bits_test2.png
[root@keystone Downloads]# cd /path/cinelerra-5.1/thirdparty/ffmpeg-4.2
[root@keystone Downloads]# ./ffprobe /path/sixteen_bits_test2.png
Input #0, png_pipe, from 'sixteen_bits_test2.png':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: png, rgba64be(pc), 720x576, 25 fps, 25 tbr, 25 tbn, 25 tbc
[root@keystone Downloads]# xv sixteen_bits_test2.png
[root@keystone Downloads]# gimp sixteen_bits_test2.png
I can't tell whether the png sixteen_bits_test2.png is be or le.
On my machine it displays dim and see-through (probably incorrect).
Is it supposed to look right or wrong? One way to check the raw bytes is
sketched after the questions below.
This is what I am using:
What is your OS? fc31?
What is your arch? x86_64?
What version of png is in use? libpng16?
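
To see exactly which byte order libpng hands back, here is a minimal sketch of a
check (the file name, the build line, and the assumption that the image really is
16-bit rgba are mine); it reads the png with no swap transform and dumps the raw
bytes of the first pixel:

/* check_png_bytes.c -- dump the first pixel of a 16-bit png with no byte swap.
 * build: gcc check_png_bytes.c -lpng -o check_png_bytes   (names are mine) */
#include <png.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    const char *path = argc > 1 ? argv[1] : "sixteen_bits_test2.png";
    FILE *fp = fopen(path, "rb");
    if( !fp ) { perror(path); return 1; }
    png_structp png = png_create_read_struct(PNG_LIBPNG_VER_STRING, 0, 0, 0);
    png_infop info = png ? png_create_info_struct(png) : 0;
    if( !png || !info ) { fclose(fp); return 1; }
    if( setjmp(png_jmpbuf(png)) ) { fclose(fp); return 1; }
    png_init_io(png, fp);
    /* PNG_TRANSFORM_IDENTITY = no swapping, samples arrive exactly as stored */
    png_read_png(png, info, PNG_TRANSFORM_IDENTITY, 0);
    int depth = png_get_bit_depth(png, info);
    int chans = png_get_channels(png, info);
    printf("bit depth %d, channels %d, first pixel bytes:\n", depth, chans);
    png_bytepp rows = png_get_rows(png, info);
    int n = chans * depth / 8;  /* 8 bytes for rgba64 */
    for( int i=0; i<n; ++i ) printf("%02x ", rows[0][i]);
    printf("\n");
    png_destroy_read_struct(&png, &info, 0);
    fclose(fp);
    return 0;
}

With no transform applied, a 16-bit value such as 0x8000 prints as "80 00" when
the pairs are big-endian as stored, and as "00 80" if something already swapped
them.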
Since most decoders have a header that records the format (png_info in this
case), it would be good to know how the "endianness" is encoded. I do not see
any definitions that could support it. That means it always fails unless
user-supplied transformation data is included at decoder open; a sketch of
what that transform might look like follows.
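
For reference, this is roughly what that user-supplied transform looks like with
libpng; a sketch only, the helper name and the byte-order macro test are mine.
The 16-bit samples in the png stream are always stored big-endian and png_info
has no endianness field, so a little-endian consumer has to ask for the swap
itself at open time:

/* sketch of a "user supplied transformation" at decoder open (names are mine) */
#include <png.h>

static void want_host_order_16(png_structp png, png_infop info)
{
    /* call after png_read_info(png, info) and before reading image data */
    if( png_get_bit_depth(png, info) == 16 ) {
#if defined(__BYTE_ORDER__) && __BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__
        png_set_swap(png);  /* deliver low byte first, i.e. rgba64le order */
#endif
    }
    png_read_update_info(png, info);
    /* png_read_row()/png_read_image() now return host-order 16-bit samples */
}

The same swap can also be requested in one call with
png_read_png(png, info, PNG_TRANSFORM_SWAP_ENDIAN, 0).
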
gg