This is my last resort after two days of devouring information from everywhere. My math skills are relatively poor, and I have a feeling what I'm trying to do simply can't be done.

I am summing random variables that have a uniform distribution; once they are summed, the result is approximately normally distributed. With only n = 3, the distribution is already far from uniform. How can I, and is it even possible to, make the distribution uniform again?
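A quick sketch (using Python's `random` module as a stand-in for whatever RNG is in play) shows the effect being described: the sum of three uniform draws piles up around its mean instead of staying flat. This is the Irwin-Hall distribution.

```python
import random

# Compare a flat uniform draw with the sum of three uniform draws.
# The sum concentrates around its mean of 1.5 rather than spreading
# evenly over [0, 3].
random.seed(0)
N = 100_000
sums = [sum(random.random() for _ in range(3)) for _ in range(N)]

# Fraction of samples landing in the middle third of the range [0, 3].
# A uniform distribution would put 1/3 of the mass there; the sum of
# three uniforms puts about 2/3 of it there.
middle = sum(1 for s in sums if 1.0 <= s < 2.0) / N
print(middle)
```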

If you want a random variable uniformly distributed from 0 to N, and you have one uniformly distributed from 0 to 1, then you just scale it by N.
I don't see the point in creating a random variable with some other distribution and then wanting to transform it back to uniform, if you actually have a uniform distribution to begin with.

I have a feeling you might be confusing the concept of random variable and probability distribution. Could you tell us exactly in detail what you are trying to do?

The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

However, this is most likely not the route you want to take, unless you somehow have a very special setup where your RNG can only supply individual random bits. That sounds rare; instead, most RNG sources are built to produce random integers in a range [a, b] and random floats in the range [0, 1] (which amounts to roughly 23 bits of randomness for a single-precision float).

These special setups are not rare at all, though I agree you are more likely to find them in cryptographic settings (entropy pools) than in conventional PRNG's. Also, a small terminology nitpick: a random bit generator is called an RBG (and its pseudorandom counterpart is a PRBG).

What I would find rare is PRNGs directly producing floating-point numbers. As far as I can tell, this is usually provided by a helper function which performs a floating-point division along the lines of random(max) / max. I recall a few purely floating-point generators, but in my opinion that's not a very smart way to go about it: integer arithmetic is very well-defined, unlike floating-point arithmetic.
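A minimal sketch of that helper-function approach, assuming an integer bit source (here `random.getrandbits` stands in for the underlying generator): draw an integer and divide once to get a float in [0, 1).

```python
import random

def uniform_float(bits=53):
    """Derive a float in [0, 1) from an integer source via one division.

    53 bits is used because that's the precision of a double's mantissa;
    a single-precision float carries only about 24 significant bits,
    which is where the "roughly 23 bits of randomness" figure comes from.
    """
    return random.getrandbits(bits) / (1 << bits)

x = uniform_float()
assert 0.0 <= x < 1.0
```

The division is exact here because the divisor is a power of two, so no floating-point rounding surprises are introduced.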

I asked a statistician and got a final answer: I can't do this, because of the central limit theorem.
When I sum independent random samples, I'm destined to get a bell-shaped, non-uniform distribution.
Nevertheless, interesting conversations above.

Well, not if your uniform random variables are not independent. The central limit theorem then does not apply, and you can build a U(0,7) distribution as the sum of seven U(0,1) variables that happen to be identical: the sum is just 7X, which is still uniform.

But I still don't quite understand what you were trying to do, so this conversation feels kind of irrelevant.

I was using the randoms to create octaved frequencies for biomes. I didn't want gradient noise, since gradient noise has an extremely uneven distribution (think of the derivatives at the extremities of a sphere). I basically wanted to make saw-like linear frequencies.

You can transform a uniform distribution into something usable for, say, Minecraft-style biome generation, but it's much more involved than just summing up distributions. Look into fractional Brownian motion, Perlin noise, etc.

The graphs on the page you link are probability density functions, which map any output value to its relative likelihood of being generated (well, actually, since we're looking at a continuous range of values, only the probability of the output landing between some A and B is meaningful). They don't mean anything else, and a normal distribution will still have a large variance.

Fractional Brownian motion seems to be the closest thing to what you are looking for. Basically, it takes a sample of uniformly generated random numbers and outputs "smooth" noise which, when mapped to a heightmap, gives somewhat realistic terrain instead of a completely random mess. It involves summing octaves of the same base noise at different frequencies and amplitudes. Is this what you were looking for?

The problem isn't really related to determining the biomes, it's determining the terrain... in real life the terrain determines the biome to some degree. But I'm about to cross that bridge now, so I'm not sure yet.