Abstract

> A scaling theory of long-wavelength electrostatic turbulence in a
magnetised, weakly collisional plasma (e.g. drift-wave turbulence driven
by ion temperature gradients) is proposed, with account taken both of
the nonlinear advection of the perturbed particle distribution by
fluctuating flows and of its phase mixing, which is caused by the
streaming of the particles along the mean magnetic field and, in a
linear problem, would lead to Landau damping. It is found that a
consistent theory can be constructed in which very little free energy
leaks into the high-order velocity moments of the distribution function,
rendering the turbulent cascade in the energetically relevant part of
the wavenumber space essentially fluid-like. The velocity-space spectra
of free energy expressed in terms of Hermite-moment orders are steep
power laws and so the free-energy content of the phase space does not
diverge at infinitesimal collisionality (while it does for a linear
problem); collisional heating due to long-wavelength perturbations
vanishes in this limit (also in contrast with the linear problem, in
which it occurs at a finite rate equal to the Landau damping rate).
The ability of the free energy to stay in the low-order velocity moments
of the distribution function is facilitated by the 'anti-phase-mixing'
effect, whose presence in the nonlinear system is due to the stochastic
version of the plasma echo (the advecting velocity couples the
phase-mixing and anti-phase-mixing perturbations). The partitioning of
the wavenumber space between the (energetically dominant) region where
this suppression of phase mixing holds and the region where linear phase
mixing wins its competition with nonlinear advection is governed by the
'critical balance' between linear and nonlinear time scales (which for
high Hermite moments splits into two thresholds, one demarcating the
wavenumber region where phase mixing predominates, the other the region
where the plasma echo does).
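
As a schematic illustration (not part of the abstract itself), the two scaling
statements above can be written compactly: critical balance compares the linear
phase-mixing (parallel-streaming) rate with the nonlinear advection rate, and a
steep Hermite spectrum implies a finite free-energy content of phase space. The
exponent $\alpha$, the thermal speed $v_{\mathrm{th}}$ and the nonlinear
decorrelation time $\tau_{\mathrm{nl}}$ below are illustrative notation, not
quantities quoted from the abstract.

```latex
% Illustrative sketch only: alpha, v_th and tau_nl are placeholder notation,
% not results quoted from the abstract.
\begin{gather}
  % Critical balance: the linear phase-mixing (parallel-streaming) rate is
  % comparable to the nonlinear advection rate; this balance partitions
  % wavenumber space between phase-mixing- and advection-dominated regions.
  k_\parallel v_{\mathrm{th}} \;\sim\; \tau_{\mathrm{nl}}^{-1}, \\
  % Steep Hermite spectrum: if the free energy held in Hermite moment m falls
  % off as a power law with exponent greater than unity, the total free-energy
  % content of phase space remains finite as the collisionality tends to zero.
  C_m \;\propto\; m^{-\alpha}, \quad \alpha > 1
  \;\Longrightarrow\;
  W \;\sim\; \sum_{m} C_m \;<\; \infty .
\end{gather}
```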