For whatever fucking reason, glibc's number-parsing functions (strtod(), sscanf() with %f, and friends) apparently only accept the decimal separator of the current locale: either a comma or a dot, but never both. Meanwhile, both Windows and macOS have no apparent issues accepting both.
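A minimal sketch of what I mean, assuming glibc's locale-dependent strtod() behaviour and that a comma-radix locale such as de_DE.UTF-8 happens to be installed (the locale name is just an example, substitute whatever comma-using locale your system has):

```
#include <locale.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* In the default "C" locale, the radix character is '.'. */
    setlocale(LC_NUMERIC, "C");
    printf("C locale:     strtod(\"3.14\") = %g\n", strtod("3.14", NULL));
    printf("C locale:     strtod(\"3,14\") = %g\n", strtod("3,14", NULL)); /* glibc stops at the comma, result is 3 */

    /* Switch to a comma-radix locale (assumes de_DE.UTF-8 is installed). */
    if (setlocale(LC_NUMERIC, "de_DE.UTF-8")) {
        printf("de_DE locale: strtod(\"3,14\") = %g\n", strtod("3,14", NULL));
        printf("de_DE locale: strtod(\"3.14\") = %g\n", strtod("3.14", NULL)); /* glibc stops at the dot, result is 3 */
    }
    return 0;
}
```

With glibc, the "wrong" separator simply ends the parse at the first character that isn't the locale's radix character, so "3,14" comes back as 3 in the C locale and "3.14" comes back as 3 under de_DE.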
I will never understand why they decided such behaviour was acceptable, especially since these functions ARE what many programs use to parse decimal numbers, but I guess it's their own version of Not Invented Here syndrome that they (or anyone else) can't be bothered to deal with. This is not how good C standard libraries are written, at all.