Speaking of encoding and Firefox, for some reason this piece of crap guesses ISO-8859-1 instead of UTF-8 on your site, @abucci@anthony.buc.ci, even though UTF-8 is specified in the HTML:
@lyse@lyse.isobeef.org Interestingly, the W3C validator complains, too:
https://validator.w3.org/nu/?doc=https%3A%2F%2Fbucci.onl%2Fabout.html
The character encoding was not declared. Proceeding using windows-1252.
That comment at the beginning of the file might confuse things? Browsers only scan the first 1024 bytes for the meta charset declaration, so a long comment up top can push it out of that window.
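Something like this keeps it inside the window (made-up page, just to illustrate the ordering):

    <!DOCTYPE html>
    <html lang="en">
    <head>
    <meta charset="UTF-8"> <!-- must sit within the first 1024 bytes -->
    <!-- a long banner comment is fine here, after the declaration -->
    <title>about</title>
    </head>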
@abucci@anthony.buc.ci @movq@www.uninformativ.de Oh, interesting. Then I take back my critique this time. I wasn't aware of that 1024 byte limit either. It works now. I always send the encoding in the Content-Type header and sometimes even omit it from the HTML altogether. But when I do include it, I use the shorter and more reasonable looking HTML5 style <meta charset="UTF-8">, just like @eaplmx@twtxt.net showed. The advantage of the HTTP response header is that I just tell nginx to do it for me, so I cannot forget it in the HTML by accident. And even if I did forget it there, it wouldn't be an issue.
But specifying it in the HTML as well helps everybody who happens to download the page: opened locally, it obviously cannot use the nonexistent HTTP response header. Not that I think a lot of people out there are downloading it, but just in case. :-)
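The nginx side of it is just one directive, by the way. A minimal sketch (placeholder names, not my actual vhost config):

    server {
        listen 80;
        server_name example.org;   # placeholder, not the real vhost
        root /var/www/example;     # placeholder document root
        # ngx_http_charset_module: appends "; charset=utf-8" to the
        # Content-Type header for text/html and the other default charset_types
        charset utf-8;
    }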
Do you happen to have all your browsers set to fall back to UTF-8 if they can’t detect the encoding, @abucci@anthony.buc.ci?
I was gonna say the correct thing to do here in most cases is to put the character encoding in the HTTP response headers.
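So roughly this in the response (a sketch; the status line and other headers will of course vary):

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=utf-8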