linear perturbation theory
negare-ye partureš-e xatti
Fr.: théorie de perturbation linéaire
The assumption that the variations in the plasma parameters caused by the presence of waves are small, to first order, compared with the undisturbed (equilibrium) parameters. This makes it possible to linearize the governing equations by dropping second-order (and higher) nonlinear terms.
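As a generic illustration of the procedure described above (the symbols here are not specific to any one plasma equation), each parameter is split into an equilibrium part and a small perturbation, and products of perturbations are discarded:

```latex
% Write each plasma parameter as equilibrium value plus small perturbation:
%   n = n_0 + n_1,   v = v_0 + v_1,   with |n_1| << n_0, |v_1| << |v_0|.
% A product of two such quantities expands as
%   n v = n_0 v_0 + n_0 v_1 + n_1 v_0 + n_1 v_1 .
% Since n_1 v_1 is second order in the perturbations, it is dropped,
% leaving the linearized expression
%   n v \simeq n_0 v_0 + n_0 v_1 + n_1 v_0 ,
% which contains the perturbed quantities only to first power, so the
% resulting equations are linear in n_1 and v_1.
\begin{align}
  n &= n_0 + n_1, & v &= v_0 + v_1,\\
  nv &= n_0 v_0 + n_0 v_1 + n_1 v_0 + \underbrace{n_1 v_1}_{\text{2nd order, dropped}}.
\end{align}
```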