What do you guys think of the points explained here:
https://www.youtube.com/watch?v=gWv_vUgbmug
It seems like the language shares a lot of features with the
D programming language. However, several features caught my
interest:
1) The compile times seem very fast in comparison with other
modern programming languages; I'm wondering how he managed
to do that.
2) Compile-time execution is not limited, and interestingly
enough, the build system is built into the language.

I was at that talk, and spoke to him quite a bit there. He also
attended my talk. And yes, there is quite a bit of overlap in
terms of features. He's well into design by introspection, for
example.
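
(For anyone unfamiliar with the term, here's a minimal D sketch of
design by introspection - code that inspects its type parameters and
adapts. The container type and its reserve() method are hypothetical,
purely for illustration:)

    import std.range : hasLength, isInputRange;

    // Appends all elements of a range to a container, pre-allocating
    // when the container happens to expose a reserve() method.
    void appendAll(C, R)(ref C container, R items)
        if (isInputRange!R)
    {
        static if (__traits(hasMember, C, "reserve")
                && __traits(hasMember, C, "length") && hasLength!R)
            container.reserve(container.length + items.length);
        foreach (item; items)
            container ~= item;
    }
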
I can answer #1. I know a few things there, but that's more
something he should talk about, as I don't know how public he's
made that knowledge.
I also put forward to him a case with regards to compile-time
execution and code generation. Say you've got a global variable
that you write to, and reading from that changes the kind of code
you will generate. Thus, your outputted code can be entirely
different according to when the compiler decides to schedule
that function for execution and compilation. His response was,
"Just don't do that."
That's essentially the philosophical difference there. Jonathan
wants a language with no restrictions, and to leave it up to the
programmer to solve problems like the above themselves. Whether
you agree with that or not, well, that's an entirely different
matter.
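
(To make the hazard concrete, here is a minimal D sketch of that exact
pattern; the names are made up. D happens to reject it outright - CTFE
cannot touch mutable globals - which is one way to sidestep the
ordering question entirely:)

    int buildCounter;  // mutable global - runtime state

    int nextId() { return ++buildCounter; }

    // Each of these would depend on how many times nextId() ran
    // before it, i.e. on evaluation order. D refuses to compile it,
    // with words to the effect of: "static variable buildCounter
    // cannot be read at compile time".
    enum idA = nextId();
    enum idB = nextId();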

> I can answer #1. I know a few things there, but that's more
> something he should talk about, as I don't know how public he's
> made that knowledge.

Well, I know that DMD in particular made a trade-off not to
collect garbage during compilation to improve speed, so it is
really interesting to look at the compiler sources to find out
what they did to make it compile so quickly.
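
(The strategy amounts to "allocate fast, never free, and let process
exit clean up". A minimal bump-allocator sketch of the idea - not
DMD's actual code:)

    // Bump allocator: grab one big block up front, hand out slices,
    // never free; the whole pool dies with the process.
    struct Arena
    {
        ubyte[] pool;
        size_t used;

        void[] alloc(size_t n)
        {
            n = (n + 15) & ~cast(size_t) 15; // keep 16-byte alignment
            assert(used + n <= pool.length, "arena exhausted");
            auto p = pool[used .. used + n];
            used += n;
            return p;
        }
    }

    unittest
    {
        auto arena = Arena(new ubyte[1 << 20]); // 1 MiB pool
        auto node = arena.alloc(64);            // O(1), and no free, ever
        assert(node.length == 64);
    }
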
On Monday, 8 May 2017 at 14:47:36 UTC, Ethan Watson wrote:

> I also put forward to him a case with regards to compile-time
> execution and code generation. Say you've got a global variable
> that you write to, and reading from that changes the kind of
> code you will generate. Thus, your outputted code can be
> entirely different according to when the compiler decides
> to schedule that function for execution and compilation. His
> response was, "Just don't do that."
> That's essentially the philosophical difference there. Jonathan
> wants a language with no restrictions, and to leave it up to the
> programmer to solve problems like the above themselves. Whether
> you agree with that or not, well, that's an entirely different
> matter.

At the very least it is an interesting feature to have; I don't
know if I will ever need it in my own code. For the game
development use case it may be useful, for example to package
all of the game assets at compile time. I've seen a similar
thing being very popular in various Haxe-based game frameworks,
though Haxe seems to be a bit more restrictive in this regard.

> I don't know if I will ever need it in my own code. For the
> game development use case it may be useful, for example to
> package all of the game assets at compile time.

It's only useful for very select cases where hardcoded assets are
required. You know, unless you want to try making a 45 gigabyte
executable for current Playstation/Xbox games. A talk I watched
the other year made the point that, as far as textures go in
video games, literally all but 10 you'll ever use are read only,
so stop trying to program for that exception as if it's a normal
thing. Hardcoding a select few assets is also a vast-minority
exception. There are ways to do it on each platform, and it's not
really worth thinking about too much until those rare times you
need one.
Embedding inside the executable is also already a thing you can
do in D with the import keyword.
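
(For anyone curious, it looks like this; the file names here are made
up, and the directory containing them must be passed to the compiler
with -J:)

    // Compile with: dmd -J./assets app.d
    // import("...") pastes the file's bytes into the binary at
    // compile time.
    immutable ubyte[] splash = cast(immutable ubyte[]) import("splash.png");
    immutable string shader  = import("basic.vert"); // text assets work too

    void main()
    {
        import std.stdio : writefln;
        writefln("embedded %s bytes of splash image", splash.length);
    }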

> You know, unless you want to try making a 45 gigabyte
> executable for current Playstation/Xbox games.

Is this why most console games that get ported to PC are
massive? GTA V on PC, for example, was 100GB, while Skyrim was
around 8GB.

Skyrim was that size on release because the console version had
to fit on a DVD for the Xbox 360 version, plus they made almost
no changes to the PC version of the game. GTA V, however, was
released several months after the console release and had larger
textures and uncompressed audio.

Ok, fair point. Let's look at Final Fantasy XIII (linear,
non-open world console RPG released in 2009 on X360 and PS3,
recently ported to PC) and The Witcher 3 (huge open world PC RPG
released in 2015). FFXIII's size on disk is 60(!) GB, while The
Witcher 3 is 40 GB. This isn't true all the time, but a lot of
console games ported to PC take a surprisingly large amount of
space. It's like they just unpacked the disk image, did an x86
build, then uploaded the whole thing to Steam with uncompressed
assets and called it good enough.

I don't know anything about Witcher, but FF13 *does* have a fair amount
of pre-rendered video, FWIW. And maybe Witcher uses better compression
than FF13?
Also, just a side nitpick, but open-world vs non-open-world alone
shouldn't have any impact on data size - the real factors in a game
world's data size are the overall size and detail of the game world.
Whether it's open world is just a matter of how all the data in the
game world is laid out, not how much data there is.

> Skyrim was that size on release because the console version had to fit
> on a DVD for the Xbox 360 version, plus they made almost no changes to
> the PC version of the game. GTA V, however, was released several months
> after the console release and had larger textures and uncompressed
> audio.

Yea. The crazy thing is though, the huge sizes don't even buy as much as
the numbers suggest. Major case of diminishing returns: Look at PS3 vs
PS4 GTA5: Something like 25GB on PS3 and double that on PS4, and yea you
*can* tell a difference, but it's *very* slight, and usually you have to
really look for it. (And then there's other games like CoD: Ghosts and
Destiny, where I honestly couldn't tell any difference whatsoever
between the systems no matter how closely I looked, aside from a few
extra particles in the particle systems...although I can't say what the
size difference is on those games; maybe they just used the same assets
for both systems.)

> uncompressed audio.

Uncompressed? Seriously? I assume that really means FLAC or something
rather than truly uncompressed, but even still...sounds more like a
bullet-list pandering^H^H^H^H^H^H^H^H^Hselling point to the same
suckers^H^H^H^H^H^H^H"classy folk" who buy Monster-brand cables for
digital signals than a legit quality enhancement. Take a top-of-the-line
$$$$ audio system, sit down a room full of audiophiles, and compare
lossless vs 320kbps Vorbis...in a true double-blind, no WAY they'd be
able to consistently spot the difference even if they tried. Let alone
while being distracted by all the fun of causing mass mayhem and
carnage. Unless maybe you just happen to stumble upon some kind of
audio savant.

You don't need to go that high. c't did a double-blind study some
years ago with the help of its sister magazine for audio
equipment, so they had a very good setup. What they discovered
is that MP3 at 160 kbit/s CBR was already indistinguishable
from CD for 99% of people for almost all kinds of music. MP3 is
much better than its reputation. Due to the really bad encoders
at the beginning (Xing was awful and the most widely used at the
start; Fraunhofer was excellent but not free; LAME took years
before it was any good), people thought that the crap they heard
was inherent to the MP3 format, but very often it was bad
ripping, over-eager low-pass filtering, and crappy
psycho-acoustic models (Xing). So you make a good point that
uncompressed audio for a game is completely nuts.

> You don't need to go that high. c't did a double-blind study some
> years ago with the help of its sister magazine for audio equipment,
> so they had a very good setup. What they discovered is that MP3 at
> 160 kbit/s CBR was already indistinguishable from CD for 99% of
> people for almost all kinds of music. MP3 is much better than its
> reputation. Due to the really bad

Interesting. Any links? Not familiar with what "c't" is.
Although, even 1% is still a *LOT* of people. I'd be more curious to see
what encoding it would take to get more like 99.99% or so.

> encoders at the beginning (Xing was awful and the most widely used at
> the start; Fraunhofer was excellent but not free; LAME took years
> before it was any good), people thought that the crap they heard was
> inherent to the MP3 format, but very often it was bad ripping,
> over-eager low-pass filtering, and crappy psycho-acoustic models
> (Xing). So you make a good point that uncompressed audio for a game
> is completely nuts.

Fair point. Also, I've heard that the big quality improvements that
aac/vorbis/etc have over mp3 are mainly just at lower bitrates.

> Interesting. Any links? Not familiar with what "c't" is.

https://www.heise.de/ct/artikel/Kreuzverhoertest-287592.html
So, I got some details wrong in my recollection from memory. They
compared 128 kbit/s, 256 kbit/s, and CD. To remove bias, they
burnt the MP3s after decompression onto CD so that the testers
couldn't distinguish between the three formats, and played them
in the high-quality audio setup in their studios. The result was
surprising in that there was no difference between CD and 256K
MP3, and only a slightly lower score for 128K MP3. They were also
surprised that for some kinds of music (classical), the 128K MP3
was even favored by some testers over the other formats; they
speculate that the encoding somehow rounds out some roughness in
the music.
They also had one tester who was 100% accurate at recognizing MP3
vs CD, but the guy had had a hearing accident in his youth in
which he lost part of his hearing spectrum (around 8 kHz), which
breaks the psycho-acoustic model and allows him to hear noise
that is suppressed for the non-hearing-impaired.
I don't know where I got the 160 kbit/s part of my message.

On 05/09/2017 02:10 AM, Patrick Schluter wrote:
>> Interesting. Any links? Not familiar with what "c't" is.

> https://www.heise.de/ct/artikel/Kreuzverhoertest-287592.html
> So, I got some details wrong in my recollection from memory. They
> compared 128 kbit/s, 256 kbit/s, and CD. To remove bias, they burnt
> the MP3s after decompression onto CD so that the testers couldn't
> distinguish between the three formats, and played them in the
> high-quality audio setup in their studios. The result was surprising
> in that there was no difference between CD and 256K MP3, and only a
> slightly lower score for 128K MP3.

Not surprised the 128k MP3 was noticeable. Even I've been able to notice
that when I was listening for it (although, in retrospect, it was likely
a bad encoder, now that I think about it...)

> They were also surprised that for some kinds of music (classical),
> the 128K MP3 was even favored by some testers over the other formats;
> they speculate that the encoding somehow rounds out some roughness in
> the music.
> They also had one tester who was 100% accurate at recognizing MP3 vs
> CD, but the guy had had a hearing accident in his youth in which he
> lost part of his hearing spectrum (around 8 kHz), which breaks the
> psycho-acoustic model and allows him to hear noise that is suppressed
> for the non-hearing-impaired.

Fascinating.
The 128k being sometimes favored for classical kinda reminds me of how
some people prefer vinyl over CD/etc. Both are cases of audio data being
lost, but in a way that is liked.

> Uncompressed? Seriously? I assume that really means FLAC or
> something rather than truly uncompressed, but even still...

Nope, uncompressed. It seems for some games they decided the
small amount of time spent decompressing audio and textures was
too high, which is why some of the games are 50 GB in size.
Apparently it's more important to push larger HD textures and 4K
stuff than to actually have hardware that can handle it, since
the console hardware is seriously behind PC hardware.

> It seems for some games they decided the small amount of time
> spent decompressing audio and textures was too high <snip> since
> the console hardware is seriously behind PC hardware.

Found some appropriate articles regarding Titanfall (from a few
years ago). Those are about the PC version, where the stated
reason was to give a boost to 'underpowered PCs', although I
could have sworn they did it for consoles more. Still ridiculous
in my mind.
http://news.softpedia.com/news/Titanfall-Needs-50GB-of-Space-on-PC-Due-to-Uncompressed-Audio-Files-431586.shtml
http://www.pcworld.com/article/3128214/software-games/why-pc-game-downloads-are-so-damned-big.html
http://www.escapistmagazine.com/news/view/132922-Titanfall-Dev-Explains-The-Games-35-GB-of-Uncompressed-Audio

Side nitpick: Console hardware is behind *gaming-PC* hardware. Important
distinction.
Heck, my most powerful PC is a little behind PS3 (in terms of game
performance, anyway) and does everything I need it to do and then some
(including the vast majority of indie games, which I usually prefer anyway).
And ok, yea, that's just my own stuff but still, take a look at the
laptop market: For anything that *doesn't* have Intel graphics, you're
looking at easily around double the price. You can get quite a good
laptop for $400 or less. But want one with an ATI or NVIDIA (and not
their budget "comparable with Intel Graphics" lines)? Then you're
looking at around $1,000. But what good does that ATI or NVIDIA chipset
do (over an Intel) for anything other than 3D modeling/animation and
non-indie gaming? Nada (but maybe suck the battery dry).
You could say "yea, well, that's for laptops, any serious gamer's gonna
want a desktop". But then again you'd simply be talking "gaming PC"
again. And these days, how much point is there really in a desktop (as
opposed to laptop) for non-gaming, non-3dsMax/Maya purposes? Minimal.
Point being: There's a big difference between "PC" and "gaming PC". In
the context of AAA gaming, it tends to get falsely assumed that all PCs
are gaming-PC spec. Not so. Console hardware is only behind "high-end"
PC hardware (what I mean by "high-end" in that sentence isn't so much
"top of the line" but simply "costs more than the highest-end console
available").

> Nope, uncompressed. It seems for some games they decided the
> small amount of time spent decompressing audio and textures was
> too high, which is why some of the games are 50 GB in size.
> Apparently it's more important to push larger HD textures and
> 4K stuff than to actually have hardware that can handle it,
> since the console hardware is seriously behind PC hardware.

On Tuesday, 9 May 2017 at 23:58:13 UTC, Era Scarecrow wrote:

> Found some appropriate articles regarding Titanfall (from a few
> years ago). Those are about the PC version, where the stated
> reason was to give a boost to 'underpowered PCs', although I
> could have sworn they did it for consoles more. Still
> ridiculous in my mind.

Yeah, you might want to actually read the entire thread before
stating this stuff again.

> Is this why most console games that get ported to PC are
> massive? GTA V on PC, for example, was 100GB, while Skyrim was
> around 8GB.

Consoles have a fixed hardware level that will give you
essentially deterministic performance. The quality of assets they
can handle is generally 1/4 to 1/2 as detailed as what the
current top-line but reasonably-priced PC hardware can handle.
And PC gamers *love* getting the higher-detailed assets. So we
ship PC games with the option to scale the quality of the assets
used at runtime, and ship with higher-quality assets than is
required for a console game.
See, as an alternative example, the Shadow of Mordor ultra-HD
texture pack, which requires a 6GB video card and an additional
download. Another example I like using is Rage, which is
essentially 20GB of unique texture data. If they wanted to
re-release it on Xbox One and PS4 without being accused of just
dumping a port across, they'd want to ship with 80GB of texture
data.
There's also been grumbling about whether those HD packs are
worth it, but now that 4K displays are coming in, that grumbling
stops as soon as people see the results.
On Tuesday, 9 May 2017 at 02:21:19 UTC, Nick Sabalausky (Abscissa) wrote:

> I don't know anything about Witcher, but FF13 *does* have a
> fair amount of pre-rendered video, FWIW. And maybe Witcher uses
> better compression than FF13?

Correct about the video. The Final Fantasy games are notorious
for their pre-renders and their lengthy cutscenes, all of which
require massive amounts of video and audio data.
Better compression, though? Unlikely. Texture formats are fairly
standardised these days. Mesh formats are custom, but not as much
of a space hog as textures. Other assets like audio and video are
more where the compression formats come into play. But gaming
hardware has a few tricks for that. For example:
On Tuesday, 9 May 2017 at 02:13:19 UTC, Nick Sabalausky (Abscissa) wrote:

> Uncompressed? Seriously? I assume that really means FLAC or
> something rather than truly uncompressed, but even
> still...sounds more like a bullet-list
> pandering^H^H^H^H^H^H^H^H^Hselling point to the same
> suckers^H^H^H^H^H^H^H"classy folk" who buy Monster-brand cables
> for digital signals than a legit quality enhancement.

Well, no. Gaming consoles - and even mobile devices - have
dedicated hardware for decompressing some common audio and video
formats. PC hardware does not; decompression needs to happen on
the CPU.
Take Titanfall as a use case, which copped quite a bit of flak
for shipping the PC version with uncompressed audio. The Xbox One
version shipped on a machine that guaranteed six hardware threads
(at one per core) with dedicated hardware for audio
decompression. Their PC minspec, though? A dual-core machine (at
one thread per core) with less RAM and only general-purpose
hardware.
The PC scene had a cry, but it was yet another case of PC gamers
not actually understanding hardware fully. The PC market isn't
all high-end users; the majority of players aren't running
bleeding-edge hardware. They made the right business decision to
target hardware that low, but it meant some compromises had to be
made. In this case, the cost of decompressing audio on the CPU
was either unfeasible in real time or increased load times
dramatically. Loading uncompressed audio off the disk was
legitimately an optimisation in both cases.
On Tuesday, 9 May 2017 at 06:50:18 UTC, Ola Fosheim Grøstad wrote:

> It isn't all that hard to distinguish if you know what to
> listen for. I hear a big difference in music I have mixed
> down/mastered on a good headset.

So, as Walter would say, "It's trivially obvious to the casual
observer."
That's the point of the blind test. It isn't trivially obvious to
the casual observer. You might think it is, but you're not a
casual observer. That's essentially why LAME started up - a bunch
of audiophiles decided to encode for perception of quality rather
than strictly objective quality.

> In this case, the cost of decompressing audio on the CPU was either
> unfeasible in real time or increased load times dramatically. Loading
> uncompressed audio off the disk was legitimately an optimisation in
> both cases.

I'm surprised it would've made that much of a difference; I'd grown
accustomed to seeing audio decoding as computationally cheap on even
low-end hardware.
But then again, I suppose the average level of a modern AAA game may
involve a bit more audio data than the average MP3 song (not to mention
a lot more audio streams playing simultaneously), and is already maxing
out the hardware as much as it can.

> That's the point of the blind test. It isn't trivially obvious
> to the casual observer. You might think it is, but you're not a
> casual observer.

Well, the point of a blind test is to establish that something
*does* have a perceptible effect; it can't establish that there
is no difference. I.e., false vs unknown - in the latter case the
result is simply inconclusive.
These two statements are very different:
1. we have not been able to establish that there was any
perceived difference
2. we have established that there was no perceived difference
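
(To put a number on that distinction, a small D sketch of the usual
ABX arithmetic - the trial counts below are just illustrative:)

    // With n blind trials and k correct, how unlikely is that outcome
    // under pure guessing (p = 0.5)? A small tail probability supports
    // "a difference was perceived"; a large one is merely inconclusive.
    double guessTail(int n, int k)
    {
        double total = 0, comb = 1; // comb = C(n, i), built incrementally
        foreach (i; 0 .. n + 1)
        {
            if (i >= k) total += comb;
            comb = comb * (n - i) / (i + 1);
        }
        return total / (2.0 ^^ n);
    }

    void main()
    {
        import std.stdio : writefln;
        writefln("12/16 correct: p = %.3f", guessTail(16, 12)); // ~0.038
        writefln(" 9/16 correct: p = %.3f", guessTail(16, 9));  // ~0.402
    }
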
How would they research this? By asking if one is better than the
other? Well, that is highly subjective, because "better" has to
do with expectations. Anyway, cognitive analysis of difference
happens at a rather high level, and for many people something
sounds the same if they interpret the signal the same way,
whereas immersion is much more subtle and depends on your state
of mind as well, not only on what you perceive. So it's not easy
to measure! Our perceptual machinery is not a fixed machine; our
expectations and mood feed back into the system.
Some things like phasing/smearing in high-frequency content and
imaging do affect the experience, although the effect is very
subtle, and you need good headphones and to have heard the
original many times to pinpoint where the differences are at
higher bitrates. (At 300 kbit/s it probably isn't all that easy.)

> Some things like phasing/smearing in high-frequency content and
> imaging do affect the experience, although the effect is very
> subtle [...]

I want to add that, of course, modern commercial music is already
ruined by too much compression and dynamic-range abuse, so it is
distorted from the beginning... just to get a loud signal. Trash
in -> trash out. Same with speakers: regular speakers are poor.
Use a good headset (e.g. Sennheiser HD600 or better), and
preferably the same headset the audio engineer used...
Loudspeaker in room -> not the same signal as on the CD.
Anyway, it is a complicated topic. I went to a two-hour lecture
on it a week ago. We were told to use this book: Applied Signal
Processing: A MATLAB-Based Proof of Concept by Dutoit and
Marqués. It comes with MATLAB code so you can modify the MP3
algorithms and explore the effects yourself. :)

> Use a good headset (e.g. Sennheiser HD600 or better), and
> preferably the same headset the audio engineer used...
> Loudspeaker in room -> not the same signal as on the CD.

You seem to know a thing or two about audio hardware (more than me,
anyway) - any idea offhand where to find a good set of clip-style
headphones? That's something I've been dying to find for years (I don't
like "earbuds" - I find sticking things inside my ears terribly
uncomfortable, even compared to the occasional pinch of clip-style, and
traditional ones are always just falling off in casual use).
I used to use Koss's clip-style (and loved the one with in-line volume
control) since, despite being affordable, they were the only ones I'd
ever found that didn't sound horrible (all the Sony ones of remotely
comparable price sounded like complete trash no matter what the box
claimed about the specs...and the Sonys are downright ugly to boot.
Other brands didn't fare any better.)
Unfortunately, after a few years, both my Koss pairs crapped out (i.e.,
no sound at all out of one or both speakers), and the non-free
"warranty" replacements consistently crapped out the same way after
about two months max (they sounded good until then, though).
At this point, I don't care about cost; I would just like to find a
reliable, good-sounding (i.e., at least comparable to Koss's sound
quality) clip-style pair. Any leads? Is there even such a thing as
high-end, or even mid-range, clip headphones?

> Any idea offhand where to find a good set of clip-style
> headphones?

Unfortunately not. I try to avoid headphones these days; it's too
easy to crank up the volume without noticing, especially if you
have good ones which can go up to 120 dB without distortion...

> That's something I've been dying to find for years (I don't
> like "earbuds" [...])

Yeah, don't use consumer earbuds; they can easily damage your hearing.

> At this point, I don't care about cost; I would just like to
> find a reliable, good-sounding (i.e., at least comparable to
> Koss's sound quality) clip-style pair. Any leads? Is there even
> such a thing as high-end, or even mid-range, clip headphones?

I've kinda stopped consuming music when travelling, but I
personally prefer large headsets that enclose the ear completely.
Closed ones in non-silent environments, so you don't crank the
volume up too much. A bit clunky even if foldable, of course... I
used some from Ultrasone; not the best quality, but decent. I
think personal taste and musical style kinda mean that you have
to test them in a store to make up your mind.
I guess you could ask at the https://www.head-fi.org/ forums, but
I'm not sure if that site is good anymore. It was a useful
resource a decade ago, but seems to be rammed down with awful ads
now? Unbearable.
Btw, people say that one should keep a new headset playing some
heavy bass for 10(?) hours after purchasing, before evaluating
it; something about the coils needing to be run in. Well, they
are mechanical, so I guess that makes sense (tightening, friction
or something). Maybe different for cans without coils...

> I've kinda stopped consuming music when travelling, but I personally
> prefer large headsets that enclose the ear completely. Closed ones in
> non-silent environments, so you don't crank the volume up too much. A
> bit clunky even if foldable, of course... I used some from Ultrasone;
> not the best quality, but decent. I think personal taste and musical
> style kinda mean that you have to test them in a store to make up
> your mind.

Hmm, regarding large over-the-ear ones: about a year or two ago I made
the mistake of shelling out for one of Sony's "PlayStation Gold"
headsets. I was initially thrilled with it (except they look hideous
while being worn - like any Sony headset, really), but then, as with
most other people who got them, it didn't take too long before I hit
the infamous problem of its cheap plastic (on a $100 pair??? That's
Sony, I guess) cracking and breaking. :/

> I guess you could ask at the https://www.head-fi.org/ forums, but I'm
> not sure if that site is good anymore. It was a useful resource a
> decade ago, but seems to be rammed down with awful ads now?
> Unbearable.

Thanks, I'll take a look.

> So, as Walter would say, "It's trivially obvious to the casual
> observer."

I was surprised to learn at DConf that "trivially obvious to the most
casual observer" was an unknown expression. Googling it shows only
around 10 hits, none predating 2005.
I learned it at Caltech in the 1970s, where it was applied to concepts
and proofs that were exceptionally difficult to follow.
I suppose that from now on, if you hear the phrase, you can conclude
that the source is from the set of:
1. Techers
2. D programmers
:-)

Watched the first 15-20 min of it. Definitely want to watch the rest.
Buuuuuutttt.....so far it's a good example of the *one* little thing
that kinda bugs me about Jonathan Blow:
I keep hearing him say the same things I've been saying for years, but
because he wrote Braid, he can sometimes get people to actually listen
instead of blindly dismissing everything. :/ (Granted, that's not a
fault of Blow's. But it still bugs me!)

> 1) The compile times seem very fast in comparison with other
> modern programming languages; I'm wondering how he managed
> to do that.

By being a game (and engine) developer and knowing the basics of writing
efficient code, unlike the majority of the software industry.
(Seriously, if other circles of dev would pull their ***** out of their
***** long enough to recognize all of what game programming obviously
involves (ex: it's more than the glorified calculators that the uppity
banking software is, and don't get me started on "enterprise" in
general), then they could finally start learning how to write grown-up
code, and software today wouldn't suck so f****** badly.)
Also, not using header files.

> 2) Compile-time execution is not limited, and interestingly
> enough, the build system is built into the language.

Nemerle had that years ago (although I'm guessing/hoping that, unlike
Nemerle, Blow's implementation doesn't require manually compiling to a
DLL before being able to use given code at compile time).
My inclination is that it's the right approach, and it's one thing that
makes D look clunky and awkward by comparison. I never bought D's
argument that compiling source shouldn't be allowed to do arbitrary code
execution or I/O because, come on, "build scripts" and "build systems".
That "no arbitrary code execution" ship sailed ages ago: who in hell
compiles software from source without using the provided buildscript or
buildsystem configuration (all of which, by necessity, allow arbitrary
code execution and I/O)? Nobody who isn't searching for their own little
corner of hell, that's who.
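
(For contrast, what D *does* sanction: arbitrary pure computation at
compile time via CTFE, with file access funneled through string
imports. A minimal sketch; the file name is made up, and as usual the
-J flag is required:)

    // CTFE: ordinary functions are run by the compiler whenever a
    // result is needed at compile time - but no I/O, no mutable
    // globals, no syscalls.
    int fib(int n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }

    enum f20 = fib(20);               // computed during compilation
    static assert(f20 == 6765);       // proof it happened at compile time

    enum config = import("app.conf"); // the one sanctioned compile-time "read"
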
The *one* thing that does give me a little pause though is the
possibility that order of compilation could change the results of
generated code. I think it'll be interesting to see how that plays out
in practice. "Don't do that" sounds nice, but the question remains: "Is
it something that will happen without the author knowing he's doing it?
If so, will it be a problem?"