ibxtoycat's comparison of Java vs. Bedrock editions of Minecraft

Here is a video by popular YouTuber ibxtoycat about Minecraft Bedrock Edition vs. Java Edition, covering the advantages and disadvantages of each.

Personally, this is why I've recommended Bedrock Edition over Java to friends of mine:

it runs so much smoother, and I've also noticed it doesn't have as much of a toxic following who accuse people of having a bad PC, despite verifiable proof that even on decent gaming PCs Bedrock Edition still runs better and is faster at chunk handling because of how it is programmed.

And by default Bedrock Edition supports better render distances; you'd need Optifine to expand beyond 32 chunks in the Java version, and even then it's not an ideal situation. People shouldn't have to install mods just to improve something as basic as this - that ought to be the developer's responsibility, or at least their choice.

I understand the point about the microtransactions, but that alone isn't a good enough reason to say Bedrock Edition is bad. Also, for every update that comes out for the Java version, Bedrock Edition will eventually get those features: it already has end cities/elytra, the combat and aquatic updates, and the pillager patrols/raids, etc. It's only a matter of time before the Nether update comes.

Although the F3 debug screen, the amplified world type, spectator mode, and hardcore mode do need to be implemented in Bedrock Edition, IMHO.

The Java version is the one people are most familiar with, but that's mainly down to nostalgia.

The Java virtual machine certainly wasn't a good choice, as it is objectively inferior to C++, which is why most games don't use Java.

Most of the issues with modern versions of Java Edition are down to Mojang's horrific coding practices, not the fact that it runs on Java - my own mod is proof enough, as is a simple comparison of the system requirements for 1.6 and the latest version:

A Mega Forest biome with trees reaching the clouds from sea level (I am standing at y=131); note that Optifine is not installed and this is without any of the optimizations I made, which I only started adding later (I'd already been writing my own code to be as efficient as possible):

1.8 with Optifine, which is totally unplayable, not just due to the low FPS but also the lag spikes:

Maybe it is just jungles? It's not much better even underground though (with modded cave generation, but occlusion culling hides chunks that can't be seen). This also shows just how terrible the framerate is, with a huge lag spike every 10 frames regardless of settings, in-game, GPU control panel, Java version, etc. (this made it look more like 1/10 the number displayed, so these looked more like 2 FPS; 1.7 also suffered from the same issue but had a similar baseline frame rate to 1.6):

For comparison, this is how well TMCWv5 runs on my current computer, without most of the rendering and tick-related optimizations that I've been making as part of a separate mod (which will be integrated with TMCW when complete); in contrast to the previous screenshots, this is also on Fancy (while not as tall as Mega Forest trees, TMCW's Mega Taiga trees have just as many leaves):

Why does 1.8 run so much worse despite everything in these comparisons favoring it (less demanding situations, Optifine)?

Also, one modder claims to have surpassed Bedrock in performance with render distances up to 256 chunks (64 times the area of 32 chunks, the highest vanilla allows, and 16 times the area of 64, the highest that Optifine allows) and none of the "cheats" that Bedrock uses to render such large distances:

In my own tests FC2 could easily outperform everything else, including the W10 edition (MCPE/Bedrock).

The config allows increasing the view distance limit up to 256. Performance will likely tank once the game runs out of VRAM.

All rendering happens with full detail and for a view distance setting of d FC2 will render (d*2+1)^2 chunks, some bugs around post-initial area and very large distances aside. The whole view distance is being simulated (ticked). MCPE doesn't do either.
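
To put that formula into perspective, here is a quick back-of-the-envelope calculation (my own, purely illustrative) of how many chunks each view distance covers - note that 256 works out to 263169 chunks, a figure that comes up again below:

public class ViewDistanceMath {
    public static void main(String[] args) {
        // For a view distance of d, FC2 renders a (d*2+1) x (d*2+1) square of chunks
        for (int d : new int[] {8, 16, 32, 64, 256}) {
            long side = 2L * d + 1;
            System.out.println("d = " + d + " -> " + (side * side) + " chunks");
        }
    }
}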

The creator of Optifine has said many times how bad the code in newer versions is:

Minecraft 1.8 has so many performance problems that I just don't know where to start with.

Maybe the biggest and the ugliest problem is the memory allocation. Currently the game allocates (and throws away immediately) 50 MB/sec when standing still and up to 200 MB/sec when moving. That is just crazy.

The previous Minecraft releases were much less memory hungry. The original Notch code (pre 1.3 [I'm pretty sure they meant pre-1.8]) was allocating about 10-20 MB/sec which was much more easy to control and optimize. The rendering itself needed only 1-2 MB/sec and was designed to minimize memory waste (reusing buffers, etc).

The old Notch code was straightforward and relatively easy to follow. The new rendering system is an over-engineered monster full of factories, builders, bakeries, baked items, managers, dispatchers, states, enums and layers. Object allocation is rampant, small objects are allocated like there is no tomorrow. No wonder that the garbage collector has to work so hard.

The multithreaded chunk loading is crude and it will need a lot of optimizations in order to behave properly. Currently it works best with multi-core systems, quad-core is optimal, dual-core suffers a bit and single-core CPUs are practically doomed with vanilla. Lag spikes are present with all types of CPU.

Doubt they're gonna improve it any time soon; if anything they're making Minecraft worse in 1.13. sp614x showed us how much worse their object allocations are in 1.13 compared to 1.12 :/ [I did not find a direct source; I believe it was on Discord, which is not public - which is why I dislike it so much]

How in the world did they manage to make it over 35 times slower?! For comparison, TMCWv5's biome generator is about 180(!) times faster despite being more complex - even the figure for 18w05a is about 4-5 times slower - in fact, even interpreted code was still faster (the first few runs are slower because the JVM doesn't compile to native code until a method has been called enough times, so it was still interpreting bytecode; the times around 200000000 ns or 200 ms are the compiled times):

Took 686453700 nanoseconds to generate biomes
Took 429975800 nanoseconds to generate biomes
Took 286419100 nanoseconds to generate biomes
Took 334331700 nanoseconds to generate biomes
Took 192021800 nanoseconds to generate biomes
Took 194679600 nanoseconds to generate biomes
Took 199751700 nanoseconds to generate biomes
Took 201494600 nanoseconds to generate biomes
Took 200042600 nanoseconds to generate biomes
Took 198393600 nanoseconds to generate biomes
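
For reference, the timing loop behind these numbers is trivial to set up - a minimal sketch, where generateBiomes() is a hypothetical stand-in for whatever code is being benchmarked (the first few runs measure interpreted bytecode, the later ones the JIT-compiled code):

public class BiomeBenchmark {
    public static void main(String[] args) {
        // Run the same workload repeatedly; the JVM compiles it to native
        // code once it becomes "hot", so only the later times are meaningful
        for (int run = 0; run < 10; run++) {
            long start = System.nanoTime();
            generateBiomes();
            System.out.println("Took " + (System.nanoTime() - start) + " nanoseconds to generate biomes");
        }
    }

    private static long sink;

    // Hypothetical stand-in workload so this sketch runs on its own
    private static void generateBiomes() {
        long sum = 0;
        for (int i = 0; i < 50_000_000; i++) sum += i * 31L;
        sink = sum; // keep the result alive so the loop isn't optimized away
    }
}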

It also uses significantly less memory than even vanilla 1.6.4, which allocates upwards of 60 arrays, one for each "layer" (they are, however, stored in a cache), while TMCWv5 only needs 16, as I use a circular cache (in theory as few as 2 would be needed, but rivers and biomes are generated in two separate "chains" which partly overlap):

Vanilla also caches arrays, but it uses a quite convoluted implementation involving multiple ArrayLists, including separate storage for "small" and "large" arrays (the overhead of fetching arrays from the cache is negligible compared to the calculations when generating biomes; my implementation does require checking for biome corruption if changes are made - specifically, adding new layers - but that is very easy to do thanks to a simple standalone biome mapping utility, much like AMIDST):

(The thread safety part refers to the fact that in vanilla the client and server threads can access the biome generator at the same time, causing biome corruption or crashes; the IntCache class uses synchronized methods, but all that does is prevent two threads from accessing its methods at the same time, while the biome generator itself is not synchronized. Mojang fixed this by completely removing IntCache in 1.13, significantly increasing memory allocation, while I fixed it by only allowing the client to read biomes from chunks, defaulting to ocean in unloaded chunks.)
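
For anyone curious, the core of a circular cache is only a few lines - a bare-bones sketch of the general idea, not my actual TMCW code:

public class CircularIntCache {
    private static final int CACHE_SIZE = 16;
    private final int[][] cache = new int[CACHE_SIZE][];
    private int next = 0;

    // Hands out buffers from a fixed ring instead of allocating new ones;
    // after CACHE_SIZE requests the oldest buffer is silently reused, which
    // is why adding new layers requires re-checking for biome corruption
    public int[] get(int minSize) {
        int slot = next;
        next = (next + 1) % CACHE_SIZE;
        if (cache[slot] == null || cache[slot].length < minSize) {
            cache[slot] = new int[minSize]; // grow the slot on demand
        }
        return cache[slot];
    }
}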

Likewise, others have significantly improved performance by fixing/patching Mojang's terrible code (better yet would be to completely rewrite the code not to use BlockPos or BlockState objects at all; ironically, in 1.8 Mojang actually removed object pools that older versions used for AABBs and Vec3s):

I noticed in 1.12.x that getBlockState (in World, Chunk, and ChunkCache) accounted for substantial amount of CPU overhead. I developed a block state cache (write-through direct-mapped cache using a specially tuned hash to map from coordinates to cache entries), which made a HUGE difference. That plus a BlockPos neighbor cache literally doubled Minecraft performance for the test cases we tried.
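
Out of curiosity, here is what the general shape of such a cache might look like - a generic sketch of a direct-mapped, write-through design, not the actual mod's code (its "specially tuned" hash isn't given in the quote, so the coordinate packing and mixing here are placeholders):

import java.util.function.LongFunction;

public class BlockStateCache {
    private static final int SIZE = 1 << 12; // must be a power of two
    private static final int MASK = SIZE - 1;
    private final long[] keys = new long[SIZE];
    private final Object[] states = new Object[SIZE]; // stand-in for IBlockState

    // Each packed (x, y, z) maps to exactly one slot (direct-mapped); a hit
    // requires the stored key to match, and a miss overwrites the slot with
    // the result of the real lookup (write-through)
    public Object getBlockState(int x, int y, int z, LongFunction<Object> slowLookup) {
        long key = pack(x, y, z);
        int slot = hash(key) & MASK;
        if (keys[slot] == key && states[slot] != null) {
            return states[slot]; // cache hit, no call into World/Chunk
        }
        Object state = slowLookup.apply(key); // miss: fall through to the slow path
        keys[slot] = key;
        states[slot] = state;
        return state;
    }

    private static long pack(int x, int y, int z) {
        // 26 bits each for x and z, 12 for y: enough for vanilla world bounds
        return ((long) x & 0x3FFFFFFL) << 38 | ((long) z & 0x3FFFFFFL) << 12 | (y & 0xFFFL);
    }

    private static int hash(long key) {
        // A generic 64-bit mixer, standing in for the "specially tuned" hash
        long h = key * 0x9E3779B97F4A7C15L;
        return (int) (h >>> 48);
    }
}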

Also, some of my own code even vastly exceeds the performance of built-in Java libraries - you should never use its Random class, which is not only limited to 48 bits but also very slow due to using a thread-safe AtomicLong object to store its internal state, even when there is no need for multithreading:

Random.nextInt(n) took 9.9637 nanoseconds
Random64.nextInt(n) took 1.2762 nanoseconds; was 7.8073187 times faster than Random

Random.nextFloat() took 9.9535 nanoseconds
Random64.nextFloat() took 1.2763 nanoseconds; was 7.798715 times faster than Random

Random.nextGaussian() took 71.8699 nanoseconds
Random64.nextGaussian() took 5.033 nanoseconds; was 14.27973 times faster than Random

In particular, simply switching to a functionally identical replica of Random, with no other code changes, halved the time spent on cave generation, one of the slowest parts of world generation - in other words, the majority of the time was spent on calls to Random (somebody once thought it was ridiculous that I used my own RNG just to save a "few" nanoseconds - those nanoseconds per call add up, though). I also "cheat" with nextGaussian, which is an approximation, but it is more than good enough for the code that uses it (of course, I still use an LCG, which is a big no-no for statistical purposes but more than good enough for a game, and using all 64 bits means the higher bits, which are used for the output, have better statistical properties - the issues the game has with poor randomness are due to the way it calculates a seed for each chunk).
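
To illustrate the idea, the core of such a generator is only a few lines - a simplified sketch of the general approach, not my actual Random64 (the constants are Knuth's well-known 64-bit LCG parameters):

public class Random64 {
    private long seed; // a plain field, no AtomicLong, so no synchronization cost

    public Random64(long seed) {
        this.seed = seed;
    }

    private int next(int bits) {
        // Knuth's 64-bit LCG; the high bits have far better statistical
        // properties than the low bits, so output is taken from the top
        seed = seed * 6364136223846793005L + 1442695040888963407L;
        return (int) (seed >>> (64 - bits));
    }

    public int nextInt(int n) {
        // Multiply-shift range reduction: slightly biased, but more than
        // good enough for a game, and it avoids Random's rejection loop
        return (int) (((long) next(31) * n) >>> 31);
    }

    public float nextFloat() {
        return next(24) / (float) (1 << 24);
    }
}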

No, I'm not claiming that the theoretical performance of Java can outperform Bedrock - just that the poor performance is almost entirely down to Mojang's coding practices, not the language, with more than enough evidence to support this.

I still like Java Edition more; it has more options. I have never had issues running it on vanilla, and Optifine took my fps from 100-110 to 120-135. I believe the only unobjectionable positive Bedrock has is the best cross-platform play. Besides that, I think Java is better in every way unless you have a really bad computer. And on the comment that you found it less toxic, I would disagree: Bedrock Edition is connected to PS4 and Xbox, which have some of the most toxic stuff ever. I don't think this counts against Bedrock Edition; I actually wish we had as many snot-nosed squeakers to mess with on Java Edition. My point is, it's a matter of whether you want to play with people without a PC, or want almost limitless customization and modification without a scummy paywall behind it. To end this off, I don't think nearly as many people would dislike Bedrock Edition as much as they do if it weren't for the microtransactions that make you pay for stuff that was originally free. I know I wouldn't.

When a person's PC exceeds the recommended system requirements of the game in question, you're wrong: it should work as expected, and if it doesn't while most other games with similar requirements work fine, then that is the developer's fault.

The Bedrock version has some cool gimmicks - although I've never played it because I only got Windows 10 recently - but I will agree Java seems more updated, though this is more because of nostalgia than genuinely good coding practice, as has been pointed out by TMC. Most MC players play on Java, so Java gets more attention - this also keeps Windows 7 alive longer.

@Toxicity: Considering you joined the website simply to tell people they're wrong here, as you admitted on my deleted thread, you seem to be acting toxic yourself. I understand you want to correct people, and that's all well and good, but you still seem rather pushy about things. Different people have had different experiences, and yours may not be representative of everyone else's.

And the point of this thread was that regardless of how high your specs are or how well built your PC is, if a game isn't properly coded then it isn't going to run very well. Glitches exist; patches are needed for them and come in the form of updates. TheMasterCaver admitted that most of why the Java version runs worse comes down to Mojang's inefficient coding practices.

There are very good reasons why the C++ version of the game came about: if they could have gotten the Java version working fine on portables and game consoles, they would have, or at least there is a valid reason to believe this.

People must understand that smartphones and game consoles don't generally have very fast CPUs, so applications have to be coded accordingly. One of the main criticisms I often see of the Xbox One and PS4 is how weak their Jaguar-based chips are, and arguably this is why some games like Destiny 1 and 2 were capped at 30fps; if the CPU is that much of a bottleneck, then a locked 60fps is impossible in some games.

Minecraft runs at a locked 60fps on game consoles, obviously, but this is because it isn't as demanding - certainly not with the render distances they are limited to.

Since then the differences have only grown - out of a total of 180 million copies sold and 480 million registered accounts (most for the free-to-play Chinese version), just 34.8 million are for Java. Undoubtedly, Java does get a lot more attention because of mods and resource packs, as well as being easier to record on the systems it runs on (nearly all the videos that I've seen are of Java).

I personally really do not like Bedrock at all. I'm not a fan of the way the camera moves, and I REALLY don't like the lighting engine. I feel like the Bedrock edition is just Microsoft's way of saying "hey, we bought this company so we want to restructure the game completely". Java gets a lot more attention as well and has many more features. Here are some more reasons why I don't like Bedrock: 1. Java gets updates first, even if Bedrock isn't that far behind. 2. Bedrock is filled to the brim with microtransactions for things you can get for free on Java, like skin packs and custom maps. 3. Java is way more open to modding and changing up the game, and the only form of mods Bedrock has is addons, which are just data packs. Sorry for that long rant, I just wanted to prove my point. I respect your opinion though!

What are your PC's specs anyway? Also, I respect your opinion but still disagree. A recommendation isn't set in stone; as a game is updated and more is added, performance will suffer if the coding is poor, and I agree with you on that. But I don't usually have FPS issues. I am not wrong about the FPS issues; that doesn't even make sense. Since I have little to no FPS problems, I like Java more. Having a bad PC isn't the devs' fault. Yes, the coding is poorer and the fps suffers, but even when I had an RX 470 GPU I still got 60 fps. Idk how you are getting lower than that unless you don't touch your graphical settings and run on defaults expecting it to run well. Again, this is my opinion; I'm not entitled enough to tell you your opinion is wrong. If your computer isn't good enough to run Java Edition, that's fine. All I was saying is that I like Java Edition more, and then I named the reasons why.

I didn't say I came to tell people they were wrong; I said that some of the things people were saying were dumb and I wanted to join. An opinion can't be correct or incorrect, so I can't "correct" them and never said I could. I just wanted to state my opinion. If that is toxic in your opinion, you may have some thin skin. Also, my point about toxicity was about the game's community, not me; you entirely ignored that just to call me toxic.

What is your average fps? Your PC is better than mine was when I had the RX 470 a few years ago. The coding has always been bad; this isn't new. I doubt the coding has gotten bad enough that my i3 CPU and RX 470 would outperform your PC. My PC has improved since then, but my point still stands. I like Java more since I don't have issues with fps. If your PC for any reason can't handle the Java edition and so you like Bedrock Edition more, that is fine. It just seems very obstinate of you to state as fact that Bedrock Edition is better.

In some areas Bedrock Edition is better; in others it's not. I think I got 120fps when I tested the Windows 10 edition the last time, but then my refresh rate was capped at 120Hz with adaptive vsync on in the Nvidia control panel. I got this frame rate in the Java version as well, but it kept fluctuating, especially when generating new terrain. Anyway, there are strengths the Java version has; modding is one of them, as is the ability to play older versions of the game. It's just that without Optifine you aren't exactly spoiled with render distances, not that you could benefit from increased render distances on Realms anyway; last I remember checking, Realms is limited to 10 chunks. lol

Likewise, somebody who has a better computer than I do gets around 10 times worse performance in 1.15.2 with similar settings, in terms of FPS, server tick time, and memory usage (they have one of the most powerful GPUs currently available, ranked second on videocardbenchmark):

It is quite amazing that Minecraft can, or used to be able to, run with just 256 MB of memory allocated (displayed as 250) even at maximum settings (it did take a bit longer to load the world):

Of course, this isn't representative of vanilla 1.6.4, which gets a similar framerate on Normal/8 chunk render distance; Far can't be compared at all, since it is limited to 10 chunks by the internal server. However, the game didn't change much in terms of requirements until 1.8 (though some change in 1.7 caused severe frametime stuttering on my old computer; the actual FPS would have been a lot higher if it weren't for half-second-long pauses every 10 frames):

In reality Minecraft needs no more than 256 MB to run, mostly using about 100-150 MB. This is for vanilla Minecraft running in 32 bit Java with no mods installed and using the default texture pack.

Some people may accept even 10-15 FPS but for me anything below 60 is unacceptable, and even then lag spikes would spoil it.

This shows just how low the system requirements used to be - even what I had back then was still better than the recommended requirements:

For comparison, the current recommended requirements are an entire order of magnitude higher; even back in the 1.8 days the recommended requirements for 1.6 were already at or below the minimum:

Fun fact: there is actually a "law" that describes what is happening with Minecraft:

Bedrock servers can actually go up to 32 chunks if on the alpha server software, but this is demanding, and I suspect the vast majority of them run at around 8 to 10 chunks for this reason. Regardless of which version we use, our hardware is still limited.

Hopefully technology evolves to the point where most Minecraft servers will run the game beyond a 10 chunk render distance, which is equal to 160 blocks (10 x 16 = 160), because let's be honest, being able to see far-away biomes is beneficial for survival gameplay, especially when you're out at sea in a boat. While coordinates will help you navigate the world without getting lost, they will not tell you what biomes are ahead of you; for that you need to be able to see what has generated in front of you.

Unfortunately, higher render distance and tick radius settings are very taxing on both the CPU and the RAM, and users are warned about this in the document that comes with the Bedrock server application.

A locked 60fps is tolerable for the game, but a locked 120 or beyond is preferred if you have a gaming monitor. I wouldn't complain about 60fps, though, if there were no lag spikes involved; that's still enough to enjoy the game without affecting the gameplay or responsiveness too much.

You're saying things people have said are dumb, which is your opinion. You claim they are objectively dumb because you see it as fact that modding your game is easy and something everyone would want to do, whereas I see that as an opinion.

I can give other examples, but you see the point. If you press your opinions so hard and brush off other opinions so hard, you sound like you are trying to make your opinions facts and other people's opinions completely wrong. That's not a very genuine thing to do, even if your opinions are factually right or popular. It makes you seem like you're looking for trouble or to cause upset, and hence I mistook you for someone else from here on a burner account rather than someone new - because when people say such controversial things with such conviction, knowing the argument it will lead to, they get downdooted to oblivion, so they use fake accounts to absorb the negative impact and shelter their real account's identity and reputation.

Moreover, it is not an opinion that if a computer's specs exceed the game's requirements, then the game should run half-decently. That's just logic, and forgive me if I'm too dumb to see how it's more complicated than that.

Also, when you say:

I still like Java Edition more; it has more options. I have never had issues running it on vanilla, and Optifine took my fps from 100-110 to 120-135. I believe the only unobjectionable positive Bedrock has is the best cross-platform play. Besides that, I think Java is better in every way unless you have a really bad computer. And on the comment that you found it less toxic, I would disagree: Bedrock Edition is connected to PS4 and Xbox, which have some of the most toxic stuff ever. I don't think this counts against Bedrock Edition; I actually wish we had as many snot-nosed squeakers to mess with on Java Edition. My point is, it's a matter of whether you want to play with people without a PC, or want almost limitless customization and modification without a scummy paywall behind it. To end this off, I don't think nearly as many people would dislike Bedrock Edition as much as they do if it weren't for the microtransactions that make you pay for stuff that was originally free. I know I wouldn't.

you are being toxic in passing judgment on all those people, in what you call them, and in your wish that they be on Java, which implies you want to get a rise or a laugh out of them - else why would you want apparently toxic people here? Just because someone is toxic doesn't mean you should yam on them; that's just low-level vigilante justice/karma stuff you're doing there, and if anything, treating bad people badly only makes you bad and toxic as well.

There is a mantra that you should treat people you hate with the same respect and benefit of the doubt as people you love - that to be unconditionally loving is the highest form of being. The reason is that when you engage with your enemy, or with the jokes and annoyances of the crowd, you are stooping to their level and are as bad if not worse than them: you know they are wrong, and mocking or imitating them instead of helping or ignoring them shows character issues, a choice to be mean when you know you are capable of better.

Somebody claims to have optimized the game to the point where it is possible to go up to 256 chunks - 64 times the loaded area of 32; I have no idea what has become of their mod, and I wouldn't be able to test it myself either way (using the in-memory chunk storage format of 1.6.4, you can expect upwards of 32 GB of memory usage just to store chunk data - after all, that is 263169 loaded chunks, more than twice the size of my largest world!):

In my own tests FC2 could easily outperform everything else, including the W10 edition (MCPE/Bedrock).

The config allows increasing the view distance limit up to 256. Performance will likely tank once the game runs out of VRAM.

All rendering happens with full detail and for a view distance setting of d FC2 will render (d*2+1)^2 chunks, some bugs around post-initial area and very large distances aside. The whole view distance is being simulated (ticked). MCPE doesn't do either.

My test environments are only a Nvidia GTX 780 and an Intel HD 2000, both with Intel quad core CPUs, your experience on other platforms may vary and is of elevated interest.

Also, I still use only 8 chunks during normal gameplay because I don't see the need to see further; only the largest caves/ravines in TMCW require a larger render distance to see across, and practically never in vanilla (the longest ravines are 7 chunks long). In-game maps only map out to 8 chunks, and generating anything beyond this seems wasteful to me (as an example, if I fully explore a level 4 map, 128x128 chunks, a render distance of 8 will generate a total of 144x144 chunks, or 26.5% more chunks than have actually been explored; with a render distance of 16 the area increases to 160x160, or 1.56 times the mapped area, and 32 would be 2.25 times. For my first world I've explored about 80% of all generated chunks across a roughly 6600x6600 area). There are also diminishing returns as you increase render distance, especially in hilly/forested terrain, and the perceived view distance also depends on the field of view (Optifine's "zoom" function simply lowers the FOV; conversely, Quake Pro makes everything look much further away, to the point that 8 chunks looks as far as 16 on normal FOV).
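
Those figures are easy to verify with a quick throwaway calculation (my own, assuming the render distance simply pads each edge of the 128x128 mapped area):

public class MapOverhead {
    public static void main(String[] args) {
        int mapSide = 128; // a fully explored level 4 map covers 128x128 chunks
        for (int rd : new int[] {8, 16, 32}) {
            int generated = mapSide + 2 * rd; // render distance pads every edge
            double ratio = (double) (generated * generated) / (mapSide * mapSide);
            System.out.printf("render distance %d: %dx%d chunks generated, %.2fx the mapped area%n",
                    rd, generated, generated, ratio);
        }
    }
}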

Also, the game only ticks blocks within an 8 chunk radius (15x15 chunks in 1.6.4), so it doesn't matter how high you set the render distance (as long as it isn't less than 8); likewise, mobs only spawn within an 8 chunk (128 block) radius, and past 32 blocks they only perform limited AI functions (random wandering is disabled, while they can still target other mobs, eat grass, grow up, etc., likely because pathfinding is much more expensive). Plus, they don't render past 80 blocks in most cases, much less for smaller mobs (the server only sends most entities to the client when they are within 80 blocks; this explains why the "E" number in this screenshot is only 109 despite there being 440 mobs loaded - the individual mob counts are server-side). Vanilla does require a minimum view distance of 10 for mob (de)spawning to work properly, due to entities only being ticked if a 5x5 chunk area around them is loaded, but I reduced the (de)spawn radius to 6 to make mobs more concentrated around the player.

Most of the increase in server CPU usage at higher render distances is due to mobs, and this is likely why passive mobs (de)spawn like hostile mobs on Bedrock, so they only exist within a limited distance of a player and are much less common overall; by contrast, Java spawns an average of one pack of 4 passive mobs every 10 chunks during world generation - a worst case (all biomes in the area spawning passive mobs) of more than 100000 mobs loaded at a render distance of 256, so I assume FastCraft doesn't load them at all that far out. This is also visible in these examples (example1, example2): both are set to 16 chunks and have a client-side tick of 0.7 ms, but the server-side tick is higher in the one with a higher mob count, partly also because vines were still growing in a nearby jungle (lag in jungles could be significantly improved if vines were fully grown during world generation; either way, as mentioned above, blocks are only ticked within an 8 chunk radius, so this does not scale up).

You're saying things people have said are dumb which is your opinion. You claim they are objectively dumb because your opinion that modding your game is easy and something everyone would want to do is something you see as fact whereas I see it as an opinion.

I can give other examples, you see the point. If you press so hard on your opinions and rub off other opinions so hard, you sound like you are trying to make your opinions facts and other people's opinions completely wrong. That's not a very genuine thing to do even if your opinions are factually right or if they are popular. It makes you seem like you're looking for trouble or to cause upset, and hence I mistook you for being someone else from here on a burner account rather than someone new - because when people say such controversial things with such conviction knowing the argument this will lead to, they get downdooted to oblivion and so they use fake accounts to absorb the negative impact and shelter their real accounts' identity and reputation.

More on it is not an opinion that if a computer's specs > game requirements, then game should run half-decently. That's just logic and forgive me if I'm too dumb to see how it's more complicated than that.

Also when you say:

you are being toxic in passing judgment on all those people and what you call them and your wish that they be on Java implies you want to get a rise or laugh out of them, else why would you want apparently toxic people here? Just because someone is toxic doesn't mean you should yam on them, that's just low level vigilante justice/karma stuff you're doing there and if anything treating bad people badly only makes you bad and toxic as well.

There is a mantra that you should treat people you hate with the same respect and benefit of the doubt as people you love, that to be unconditionally loving is the highest form of being. The reason for that is that when you engage with your enemy, or the jokes/annoyances of the crowd, you are stooping to their level and are as bad if not worse than them, since you know they are wrong and you mocking them or imitating them instead of helping or ignoring shows character issues and choosing to be mean when you know and are capable of being better than that.

Players shouldn't have to worry about lag or wasted time. But this guy accused me of being lazy, misinformed, or, as he assumed, "stubborn".

Judging by the date JoltyBoiii joined, it seems he came on just to defend Tails1 when we had a disagreement, to put it mildly.

I understand the game is about mining, but adding unnecessary time for the sake of making some players feel good about themselves at the expense of others isn't exactly fair. The ore generation and mining mechanics should remain as they are - what's wrong with them? If you keep changing the rules of the game, it is inevitable that some people will complain about it.

You, me, the people I play with, and other Bedrock Edition users shouldn't have to use mods or play the Java version - which doesn't have crossplay on Xbox, by the way - just to give somebody else something to smile about; that's selfish.

They can argue equally validly that NOT making the changes to keep US happy at THEIR expense is selfish too, you know. :\