The build doesn’t use any tricks. The images converge to exactly the same results.

Regarding AO: Evermotion does use it, if I remember correctly, and nearly all interior renders you see in the forum gallery do too (look in threads like Archviz - Interior - Emily), as do the Blender Foundation films (the Agent movie has it, you can see it in the bench files). I’m of the opinion that what counts is the mood. Non-biased rendering doesn’t exist anyway; reality is far too complex, even for today’s computers. So what counts is that your image is liked by its audience.

But again, this build doesn’t use any tricks. The images converge to the same results.

About Eevee: I haven’t started to look at it as a coder yet. For me it still crashes too much, and I don’t find it that good looking in my use cases. It also needs manual tricks and a lot of tweaking to compensate for its nature, like the Internal renderer did in its time. Cycles, on the other hand, can not only be extremely fast at rendering when configured wisely, but can also save huge amounts of user time, needing only 1 to 3 parameters to be tweaked per scene. GPUs are extremely cheap today after the mining boom. If you want to earn money with Blender, GPUs are way cheaper than human time, even more so with the optimizations in this build. So even if Eevee rendered 100x faster than Cycles, as long as it needs manual work, it just doesn’t make sense economically.
If you need real time, I find there are much better alternatives for real-time rendering in the architecture field for now.
But it’s improving fast, so I keep an eye on it.

To all: you can just post here if you want to participate for fun, but to win a year, you have to send an email address (to get the coupon) and a name (so I can check that people don’t post 20 answers).

Hello there,
it looks very promising, but I am a bit confused about what you are actually selling here. Is it some kind of tweaked Cycles build that you can install into Blender, or is it a course in which you would learn how to tweak a build on your own? What will the pricing be? Or will we get more info on Saturday (29.12.2018)? Any Cycles speed-up is more than welcome, and for such a feature I wouldn’t mind paying.

One year of ready-to-use builds with the Cycles optimizations integrated, so you just have to download and run. I do the code maintenance of my patches and the compiling for you. (this thread)

I can’t name an exact number yet, but it will be under 9€ per month for the builds. So even if you only own a GTX 1060, you get your investment back instantly (in many cases it’s equivalent to buying another one). On top of that, you also save on the electricity bill by halving the render-related cost.
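To make the break-even argument concrete, here is a rough sketch in Python. Every number in it (GPU price, render load, electricity price) is an assumption for illustration, not an actual quote:

```python
# Rough break-even sketch; every number below is an assumption for
# illustration, not an actual price quote.
subscription_eur_per_month = 9.0       # upper bound mentioned above
second_gpu_eur = 250.0                 # assumed price of an extra GTX 1060

# Months of subscription that cost as much as buying a second GPU
# to double throughput instead:
months_to_match_gpu = second_gpu_eur / subscription_eur_per_month
print(round(months_to_match_gpu, 1))   # -> 27.8

# If render time halves, so does render-related electricity:
kwh_per_month_rendering = 100.0        # assumed monthly GPU render load
eur_per_kwh = 0.25                     # assumed electricity price
monthly_power_saving = kwh_per_month_rendering * eur_per_kwh * 0.5
print(monthly_power_saving)            # -> 12.5
```

Under these made-up numbers, the subscription pays for a large part of itself in power savings alone.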

Eevee is going to be important for Archviz, where accuracy is not as important as time, especially for animation/Archviz walkthroughs. For many non-commercial Blender users like myself, buying more GPUs/machines is not an option. Look at the gallery: there is some nice Archviz done in Eevee already.

I agree there are some tricks to learn to work with Eevee and its limitations, but once learned they are easy to reuse. Even with Cycles there are things to learn, especially for interior scenes with small windows/lights, to reduce noise.

Eevee is improving rapidly, and it is easy to switch from Cycles to Eevee. See this Evermotion article on migrating from Cycles to Eevee. I would definitely be interested in learning about Eevee code-wise.

I can understand. Blender is huge: there are physical simulations, image post-processing, motion analysis, etc. Although the course concentrates on Cycles and the modifiers, it also teaches you how to learn and to find information quickly. Even long-time devs are regularly confronted with new things, as IT is a very fast-moving world. It’s more important to learn how to adapt easily by acquiring a way of thinking. This course will also give you my ways of finding and learning what I don’t know.

Maybe I’ll make advanced/specific courses for other parts later if there is enough demand. But I think Eevee is the hardest and most boring part of the code, because it depends directly on all the GPUs from Intel, AMD and Nvidia from the last 8 years, and on the dozens of driver versions per hardware generation that have to be supported (laptop users, for example, can’t always update properly and want it working too), multiplied by the number of OSs Blender supports. Believe me, in this part of the code it’s not just about having good ideas; it’s about working around problems and piling up exceptions. You get hard locks and blue screens because of bad drivers, etc. To start, it’s better to touch other parts.

Compute precision per bounce is always finite; it can only be good enough, never exact.
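To illustrate, here is a toy sketch (plain Python, not Cycles code): GPU renderers typically accumulate the path throughput in single precision, and every bounce rounds the result a little:

```python
import struct

def f32(x: float) -> float:
    """Round a Python float to the nearest IEEE-754 single (float32)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Toy sketch (not Cycles code): accumulate a path throughput over many
# bounces in single precision, as a GPU renderer typically would, and
# compare against a double-precision reference. Each multiply rounds.
throughput32 = 1.0
throughput64 = 1.0
for _ in range(64):
    throughput32 = f32(throughput32 * f32(0.8))  # rounds every bounce
    throughput64 *= 0.8

error = abs(throughput32 - throughput64)
print(error > 0.0)  # small but nonzero accumulated rounding error
```

The error per bounce is tiny, but it is never zero, which is the point: per-bounce precision is finite by construction.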

The number of bounces will always be limited. Look at SSS or volumes: try to render a scene with volume bounces above 2 and try to make it converge. Even without volumes, look at the Blender Foundation demo files. Even Gooseberry, which tried to be as realistic looking as possible, is limited to 2 bounces to render in a realistic amount of time, and they already have pretty good render farms.
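To put a rough number on what a bounce cap throws away, here is a toy model with a single uniform albedo (assumed 0.8 here; real scenes are far messier):

```python
# Toy model: with one uniform surface albedo a, the light carried by
# bounce k scales like a**k, so total transport is the geometric series
# sum(a**k) = 1/(1-a). Capping paths at N bounces discards the tail,
# i.e. a fraction a**(N+1) of the total energy.
def energy_lost_fraction(albedo: float, max_bounces: int) -> float:
    return albedo ** (max_bounces + 1)

for n in (2, 4, 8):
    print(n, round(energy_lost_fraction(0.8, n), 3))
# -> 2 0.512
#    4 0.328
#    8 0.134
```

Even at 8 bounces, this toy scene is still missing over 10% of its energy, which is why every practical path tracer accepts some cutoff.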

The data you use for the textures is a very rough approximation. It has a white balance applied that depended on the lighting you used to capture it, and the capture hardware also has its own response. For your rendering to be non-biased, you would need spectral textures captured with a light source covering the whole visible range of the human eye’s response. You would also need a laser scan with a resolution good enough to capture the real roughness, not a fake one computed from the albedo. And you would of course need a screen capable of displaying this full range of light. Even then, it would at best reproduce what you can see, so it would be non-biased relative to you (at one frame per year or something like that), but it would still be an approximation based on your way of seeing.

I won’t even mention geometry. No material in the real world is 100% diffuse and perfectly flat, so you always capture with self-shadowing, reflections, etc. There are tricks to compensate for those so that only the ones computed at render time remain, but that is also tricked and biased. You never get the real microsurface structure. Substances/procedural materials can avoid some of this, but they also use tricks to compute fast.

I could continue the list; the point in the end is: make pictures that at least you and your audience like, and you are right.

Non-biased only means you don’t introduce bias in the random sampling, nothing more. And I believe you can get Cycles very close: turn off MIS, use lots of bounces (which is enough in normal cases), and obviously no clamping or light sampling threshold.

Spectral rendering and more accurate material definitions can make the renderer more accurate, but as far as I understand, they have nothing to do with sampling bias.
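A small sketch of that distinction (plain Python with made-up numbers, nothing Cycles-specific): a plain Monte Carlo average is an unbiased estimator of the mean, while clamping samples, as firefly suppression does, keeps noise down but systematically underestimates, no matter how many samples you take:

```python
import random

random.seed(1)
N = 200_000
# Fake "radiance" samples from an exponential distribution with true
# mean 2.0 (a stand-in for a heavy-tailed light contribution).
samples = [random.expovariate(0.5) for _ in range(N)]

unbiased = sum(samples) / N                       # plain Monte Carlo average
clamped = sum(min(s, 4.0) for s in samples) / N   # clamp bright samples at 4.0

print(round(unbiased, 2))  # close to the true mean 2.0
print(round(clamped, 2))   # consistently below it: clamping introduced bias
```

The clamped estimate has less variance (fewer fireflies) but converges to the wrong value; that trade-off is exactly what "biased" means in the sampling sense.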

So the answer is 12min13, as I rendered at 4 times the original resolution of the Evermotion scene (2800x2100). At the original resolution (1400x1050), it indeed renders in 2min55.
So the winners are @Lumpengnom, @Dragon.Studio and @Komposthaufen, and they each get one year for free via email.

At the beginning it’s the same, but the builds here will be updated weekly. In the course, there is one build offered for quick testing and comparison with the builds you make, but you update the builds yourself, as there will be as many versions of Blender as there are students. Some will add more modifiers, some will add new features to Cycles, some will build upon 2.7, others upon the fracture modifier, etc.
So it adds some convenience to stay up to date with fast rendering.

bliblubli, your claim that unbiased rendering does not exist is actually quite disturbing, given that you are coding and selling a path tracer. I believe you and your software do a good job. Just please do not disregard statistics theory.