"transcript": "Typed letter: Attached please find my r\u00c3\u00a9s\u00c3\u00bcme[MANY STRANGE MARKINGS OVER AND BENEATH THIS LAST LETTER]\n\nI usually leave out diacritics when I type, so I make up for it by occasionally adding a whole bunch at once.\n\n{{Title text: Using diacritics correctly is not my fort\u00c3\u0083\u00c2\u00a9.}}"

Like... "r\u00c3\u00a9s\u00c3\u00bcme" is just the usual single layer of mojibake (the UTF-8 bytes for "é" and "ü" read back as Latin-1), but "fort\u00c3\u0083\u00c2\u00a9" has somehow been double fucked: the already-mangled "Ã©" got UTF-8-encoded and misread as Latin-1 a second time. It's not even consistent within the same string.

And the JSONP goes one step further: it encodes these Unicode characters again as raw UTF-8 bytes (not \u escapes), and then serves the file with no charset parameter in the Content-Type header, so the browser has to guess it's UTF-8 just to avoid adding yet another layer of misencoding to the data. From my testing, the guess seems to depend on whether the HTML page that includes the JS file is itself UTF-8 encoded, which is madness, but that's the browser world for you. (You can get around this last hiccup by using <script charset="utf-8">, or by adding scriptCharset: 'utf-8' to the $.ajax call if you're using jQuery.)
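Concretely, those two workarounds look something like this (the URL is a placeholder, not the real feed):

```javascript
// Option 1, plain HTML: declare the external script's encoding explicitly,
// overriding the missing charset in the Content-Type header:
//   <script src="https://example.com/comic.js" charset="utf-8"></script>

// Option 2, jQuery: scriptCharset sets charset="utf-8" on the <script>
// tag that jQuery injects for the cross-domain JSONP request.
$.ajax({
  url: "https://example.com/comic.js", // placeholder URL
  dataType: "jsonp",
  scriptCharset: "utf-8",
  success: function (data) {
    console.log(data.transcript);
  }
});
```

Note that scriptCharset only has an effect for cross-domain script/JSONP requests, which is exactly this case.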