Games tend to have some form of 'particle editor' which allows you to set up various parameters (which might change linearly or along a curve of some sort) for emitters, and then those emitters would be combined together into an 'effect' of some sort.

So a fire + smoke effect might be made up of 3 emitters (fire, smoke, embers), all with varying parameters.

Usually it will be mostly billboards, with animated rotations, scaling, opacity, tex-coords (to use a different part of an atlas over time), etc...
When I worked as an effects programmer, we had a particle editor GUI tool where you could do all this work, rather than hard-coding things. E.g. ranges for initial parameters (size, velocity, opacity, etc.), curves describing how parameters change over time, what textures to use and atlas layouts, how often different emitters spawn particles (e.g. smoke and fire would be two different sub-emitters), how particles should be sorted, the shaders to use, the geometry rendering mode (billboards, constrained billboards, ribbons, 3D meshes, etc.), external force generators (cylinders/boxes of wind, gravity, etc.), and so on. All this would be saved into a 'particle system' file, which the programmers could load/trigger. These files could also be reloaded at runtime to iterate on an effect.
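To make the "ranges plus curves" idea concrete, here's a minimal sketch of what such a saved emitter description and its curve evaluation might look like. The field names (`spawn_rate`, `initial_size`, `opacity_over_life`) are made up for illustration, not from any real tool:

```python
import random

def eval_curve(keys, t):
    """Piecewise-linear curve: keys is a sorted list of (time, value) pairs,
    with time normalised to the particle's lifetime (0..1)."""
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t <= t1:
            u = (t - t0) / (t1 - t0)
            return v0 + (v1 - v0) * u
    return keys[-1][1]

# One sub-emitter's description, as it might be stored in a 'particle system' file.
smoke_emitter = {
    "spawn_rate": 20.0,                                          # particles/second
    "initial_size": (0.5, 1.0),                                  # min/max, sampled at spawn
    "opacity_over_life": [(0.0, 0.0), (0.1, 0.8), (1.0, 0.0)],   # fade in, then fade out
}

def spawn_particle(desc, rng=random):
    """Sample the initial-parameter ranges to create one particle."""
    lo, hi = desc["initial_size"]
    return {"size": rng.uniform(lo, hi), "age": 0.0}
```

At runtime the engine would tick each particle's age and look up the curves each frame, e.g. `eval_curve(smoke_emitter["opacity_over_life"], age / lifetime)`.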

For this kind of effect, yes, I'd just use (soft) billboards. Probably at least 2 different emitters - one with black smoke obscuring the background, and one with orange flames being added over the background. You could also experiment with pre-multiplied alpha blending, which lets you put both additive and alpha blended images into one texture.
You'd probably have more than 2, so that some particles shoot outwards quickly, while others rise slowly, or mushroom out, etc.
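The pre-multiplied alpha trick works because with the blend factors ONE / ONE_MINUS_SRC_ALPHA, a texel's alpha controls how much background is kept: alpha 0 gives pure additive (flames), alpha 1 gives normal "over" blending (smoke). A minimal sketch of that blend equation for one pixel:

```python
def blend_premultiplied(src_rgb, src_a, dst_rgb):
    """One pixel of pre-multiplied alpha blending:
    out = src + dst * (1 - src_alpha), i.e. GL_ONE, GL_ONE_MINUS_SRC_ALPHA."""
    return tuple(s + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb))

# Flame texel authored with alpha = 0: purely additive, background shows through.
flame = blend_premultiplied((0.5, 0.25, 0.0), 0.0, (0.25, 0.25, 0.25))
# → (0.75, 0.5, 0.25)

# Smoke texel with alpha = 1: fully replaces the background.
smoke = blend_premultiplied((0.1, 0.1, 0.1), 1.0, (0.25, 0.25, 0.25))
# → (0.1, 0.1, 0.1)
```

Intermediate alphas give a mix of the two, so flame and smoke sprites can live in the same atlas and be drawn with one blend state.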

I did a fairly realistic 2D simulation of an explosion for a math course, using vector fields to simulate the air pressure and its velocity. You need to make the hot gas rise on the inside while the edges cool down and form the mushroom cap. To do it in 3D using voxels, you would need to run the simulation on the GPU and turn it off in sections that are inactive. You would need a volume rendering method for displaying the result (implementations for medical use run at about 20 frames per second on a regular GPU). This would be at a very low resolution (maybe 256 x 256 x 16 as the active area), but the flames would flow around obstacles in a small closed environment.
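The "hot gas rises" part is typically a buoyancy term: each cell's vertical velocity gets a push proportional to how much hotter it is than ambient (a crude Boussinesq approximation). A minimal sketch of one such step on a 2D grid, with made-up parameter names:

```python
def buoyancy_step(temp, vel_y, dt, kappa=1.0, ambient=0.0):
    """Add an upward push to each cell's vertical velocity, proportional
    to how much hotter the cell is than the ambient temperature.
    temp and vel_y are row-major 2D grids (lists of lists)."""
    for y in range(len(temp)):
        for x in range(len(temp[0])):
            vel_y[y][x] += dt * kappa * (temp[y][x] - ambient)
    return vel_y
```

A full simulation would follow this with advection of temperature/velocity through the field and a pressure projection to keep the flow divergence-free; this is just the term that makes the interior rise while cooled edge cells stop accelerating.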

You would need to make a volume rendering method for displaying the result (Implementations for medical use have about 20 frames per second on a regular GPU).

That all depends on how you do the rendering. Volumetric rendering can be pretty fast if you aren't doing transparent materials. If the OP is interested, there is an implementation in Hieroglyph 3 in the 'VolumeActor' class that is pretty efficient and looks nice... Granted, it isn't making fire, but it still intersects a 3D volumetric texture, and is pretty quick.
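For reference, the usual way to display such a density volume is ray marching: step a ray through the 3D texture and accumulate opacity front to back. A minimal CPU sketch (nearest-neighbour sampling of a `volume[z][y][x]` grid; a real implementation would do this per-pixel in a shader with trilinear filtering and early ray termination):

```python
import math

def raymarch_opacity(volume, origin, direction, step, n_steps, absorb=1.0):
    """March a ray through a 3D density grid (indexed volume[z][y][x]) and
    return the accumulated opacity seen by the camera, using the usual
    exponential absorption model along the ray."""
    transmittance = 1.0
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(n_steps):
        ix, iy, iz = int(x), int(y), int(z)
        inside = (0 <= iz < len(volume) and
                  0 <= iy < len(volume[0]) and
                  0 <= ix < len(volume[0][0]))
        if inside:
            density = volume[iz][iy][ix]
            transmittance *= math.exp(-absorb * density * step)
        x += dx * step
        y += dy * step
        z += dz * step
    return 1.0 - transmittance
```

For opaque isosurfaces you can instead stop at the first sample above a threshold, which is why volumetric rendering without transparency can be much cheaper.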