>>548376
Nvidia, just so you don't get locked out of being able to use CUDA-only renderers if need be. Both brands can do OpenCL. The main thing that matters is the number of cores, so in some cases two slower cards may be better than one fast card, for example two 1070s vs one 1080. Rendering doesn't need SLI, so anything goes as far as the number of GPUs. Rendering scaling is fairly linear.
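That near-linear scaling claim is easy to sketch with made-up numbers. The samples-per-second figures below are purely illustrative, not real benchmarks for these cards:

```python
# Sketch of the "rendering scales roughly linearly with GPUs" claim.
# The samples/sec values are hypothetical, not measured numbers.
def combined_throughput(cards_samples_per_sec):
    """Path-traced tiles can be split across cards with little overhead,
    so total throughput is roughly the sum of each card's throughput."""
    return sum(cards_samples_per_sec)

gtx_1070 = 100.0  # hypothetical samples/sec for the slower card
gtx_1080 = 130.0  # hypothetical samples/sec for the faster card

two_1070s = combined_throughput([gtx_1070, gtx_1070])
one_1080 = combined_throughput([gtx_1080])
print(two_1070s > one_1080)  # two slower cards can out-render one fast card
```

With these assumed numbers, two 1070s give roughly 200 samples/sec against 130 for a single 1080, which is the point being made above.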

>>548405
With OpenCL it does support AMD, but they also say that CUDA is faster than OpenCL.

>Currently Nvidia with CUDA is rendering faster. There is no fundamental reason why this should be so, because we do not use any CUDA specific features, but the compiler appears to be more mature, and can better support big kernels. OpenCL support is still in an early stage and has not been optimized as much.

>>548517
You can whine all you want about it, but the bitter reality is that CUDA has a high market share, OpenCL does not and is not as fast or stable, and Nvidia cards even outperform AMD cards at OpenCL. Efficiency is all that counts, not ideology or brand loyalty. I don't like it and it might change soon, but right now that's how it is.

I'm building a PC, and right now I'm working off a laptop. To give a ballpark for rendering speed: a decently sized scene at 1920x1080 and 500 samples can take anywhere from 2 to 4 hours a frame. I mean, it sounds like the laptop is terrible, but it's not that bad. I can run games from 2010 well enough (although that's not much of a benchmark either).

In either case, Nvidia or AMD, I'll still notice a good improvement in rendering speed, right? Like, I'll be able to wait a few minutes instead of all night to render a scene? Will CrossFire or whatever improve rendering speeds if I hook up two cards in parallel? In case I decide to get a second one down the line.

I'm only really looking to spend $700 at most, which I'm still trying to save up. Right now I'm looking at a Radeon R9 380 for the card. Honestly I don't know much about how graphics cards compare to each other, but it seems decent enough to run modern games, especially at that price point.

Sorry for all the questions. I've built PCs before, but I was never the one to buy the parts.

>>548626
>I mean, it sounds like the laptop is terrible, but it's not that bad. I can run games from 2010 well enough
You can run games from 7 years ago "well enough". Wow, that is actually quite terrible.

>Like I'll be able to wait a few minutes instead of all night to render a scene?
That depends on what you're rendering; how could I answer that? In any case, if your hardware is that bad, of course you're going to notice a big improvement. It took me 17h to render 300 frames of a walking animation.
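For scale, the per-frame time behind that 17-hour figure works out like this (simple arithmetic, nothing renderer-specific):

```python
# Back-of-envelope check of the render time quoted above:
# 17 hours for 300 frames is about 3.4 minutes per frame.
total_hours = 17
frames = 300
minutes_per_frame = total_hours * 60 / frames
print(round(minutes_per_frame, 1))  # 3.4
```

So even a render that feels fast per frame adds up quickly over a full animation.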

I have an R9 Fury and it seems fast for still renders. Animations take forever, though, but that's just how it is. If you want fast renders, just pay a render farm instead. Even if you get 4 Titans it's still gonna take a long time, depending on what you want to render. I mean, it's not like I can give you numbers, since they're completely meaningless unless I try rendering the same thing you do.

>>548627
>I forgot to mention that I use Blender. Don't really know if that'll make a difference.
As has been stated in this thread, Blender is faster with Nvidia. AMD is still trying to fix that.

>>548668
I can play Oblivion fine, but only on lower settings at 800x600 resolution. If I try to play anything newer it'll lag to hell; I'll get 5-10 fps at most, with frequent crashes.

It doesn't help that the laptop itself seems to be built like shit. It's entirely plastic and fucking bends and wobbles. You can bend the whole thing and give it a solid 10-15 degree curvature no problem and it'll come back to its original shape.

One of my classmates threw it on the floor once and it didn't even get damaged, only made a really loud "DONK" sound and kinda bounced off. It worked fine after that.

>>548626
Most renderers are CPU-based! That means the graphics card will do jack shit. If you want to use the graphics card for rendering you need to look into GPU rendering.

It sounds like Blender supports some form of that. If it's CUDA-only you'll want to get an Nvidia card. You'll want an Nvidia card for 3D anyhow; they've always been more solid for this industry compared to ATI/AMD.

>Animations take forever though but that's just how it is. If you want fast render just pay a render farm instead.
>Even if you get 4 titans it's still gonna take a long time depending on what you want to render.

Things have changed. With something like Redshift and a good GPU you're rendering 5-20 times faster than a CPU. A machine with 4 Titans would blow away the shitty 10-node renderfarm we have at work (which we still manage to render projects with).

This dude rendered his short on a single workstation: https://www.fxguide.com/featured/how-one-vfx-artist-made-these-3-minutes-of-madness/

>This way, I was able to do the whole project on a single workstation with render times ranging from 2min to 15min in 1440p with full brute force GI and Motionblur / DOF. Deadline was also used to stack up jobs so my workstation would be busy around the clock.

>Lovvold created the piece on a single workstation from his home in Norway. “Nothing too special,” he admits, “except four GTX780TIs for the GPU rendering. You could say the last-generation top-end cards - I would go with GTX980TIs or TitanXs if I were to build a new system nowadays. My workstation also doubled as an oven in my tiny cellar, with each GPU card running at 83C!”

>>548376
Nvidia GeForce is for gamers - you can't see what the card is doing unless you buy exactly the same card relabeled as "Quadro". So for serious work you don't want Nvidia unless you have truckloads of money.