The answer to a lot of questions like "why does Angband still do X?" is because no-one made it not do that yet. Plus, I think most of the Angband maintainers have enjoyed hacking on the game and weren't/aren't particularly interested in learning how to write e.g. a new Windows frontend, or learning how to use SDL. The platform-specific bits of Angband are for sure the least modified and least loved bits of the game. Pending people who like frontend work showing up and doing some stuff, the frontends mostly stay unmodified for years at a time.

EDIT: Also, Angband is very different on different platforms. Windows and SDL use the .fon files, but Mac doesn't, and neither does the X11 port, or curses.

hmm... @takkaria, while I can understand that, I do see the following in the debian angband package

They're used by the SDL port, I think.

Quote:

Could anybody share what are the different fonts that are used or just one font with different sizes ?

They are all different fonts, and the user chooses which ones they want for which window in the SDL and Windows ports. Personally I find the bitmap fonts more legible for text mode than outline/vector fonts but YMMV.

.fon files are really just tilesets. IMO, it would be easier to use a bmp or png file with glyphs + a text file with coordinates (exactly like the tiles in lib/tiles).
In fact, for my project, I've written a tool that makes such tilesets out of vector font files. If someone wants it, I can send it to you (the tool is written in D, so you'll also need a D compiler from https://dlang.org/).

t4nk-
It turns out to be a performance penalty to use tiles instead of glyphs.

Do you have any proof of that? Glyphs, as displayed on monitors, are pictures. If they are pictures to begin with (like in .fon files), they're tilesets. If they're something else (say, Bezier curves, like in .ttf files), then transforming them into pictures (rasterizing) requires additional work*.

Quote:

You can see the difference when running and sometimes on redraws.

I can, except in the opposite direction: the difference is that pre-rasterizing glyphs of vector fonts and making an in-memory tileset improves performance noticeably (e.g., main-sdl2.c:make_font_cache()).

* which is, btw, not trivial and can involve such things as running bytecode on a virtual machine (part of so-called "hinting")...

It can be surprisingly difficult to render a large grid of anything without getting performance penalties. I discovered that when writing the Pyrel UI. The naive approach (just tell your GUI widget library to render a text symbol for every tile in view on every update) is horrifically slow. Getting clever about remembering which portions of the screen have changed and only rerendering those helps, but then scrolling the view is also horrifically slow (because every single tile "changes"). So then you start saying "well, the scrolled view re-uses a lot of tiles from the previous view, so I'll keep the previous view around, redraw it at an offset, and then draw the new/updated tiles on top of that" and you have fairly complex drawing logic that's still really not all that fast.

As for tilesets vs. software-rendered fonts, I can absolutely believe that it's faster to blit a texture to a tile than it is to draw an "@". After all, drawing the "@" involves first converting the vectors to a raster image and then...converting that raster to a texture and blitting it. A pre-rendered tileset of the kind t4nk describes is super common in gaming. You can load the entire tileset into your GPU's memory and then just send a payload to the GPU saying "render this tile here, render that tile there", and it can draw the entire screen without having to go back to the CPU for more information.