NeroAacEnc: Linux and Windows versions

As db1989 already pointed out, it's perfectly normal to get slightly different output from binaries built with different compilers. It's also normal to get slightly different output from the same binary depending on the system architecture it runs on.

Demonstration:

I have three GNU/Linux PCs to hand. One runs 32-bit Debian on AMD Athlon64 hardware, another runs 32-bit Ubuntu on a Core Duo (old 32-bit hardware), and another runs 64-bit Debian on a Core 2 Duo. Using the identical neroAacEnc Linux 1.5.4.0 binary on each, I encoded the same wav with default settings. Each encoding produced a file with a different md5sum, and the AMD-based machine produced a file whose bitrate differs by 1 kbit/s from the two Intel machines.


NeroAacEnc does a lot of floating-point maths, and small round-off differences can cause slightly different results. The difference between the Intel and AMD machines is because it uses the Intel IPP library for some of its signal processing, and IPP dispatches to differently optimized code paths depending on the hardware it runs on.

It's also possible there is a bug in the encoder where it is at some point relying on undefined behavior.

Nobody (not even with the source) can "prove" there are *no* bugs in the encoder. That's infeasible for any non-trivial program; formal verification on that scale has only ever been attempted for a handful of projects.

Basically, the behavior you see is normal and expected, and doesn't warrant further speculation.