We investigate the evolution of optical beams in nonlocal nonlinear media with loss and gain, using both a variational approach and numerical simulation. When the loss gradually changes into gain, the optical beams can return to their initial states, a phenomenon we call "adiabatic propagation". We prove that, as long as the rate of change of the loss and gain is small enough, the gain exactly compensates the loss and adiabatic propagation occurs for beams of arbitrary profile. In contrast, under lumped amplification, as in the case of optical pulses in fibers, the beams shed part of their energy as dispersive waves. The numerical simulations agree well with the variational results.
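The kind of numerical simulation referred to above can be illustrated with a minimal split-step sketch. The model, equation form, parameters, and the linear loss-to-gain ramp below are all illustrative assumptions, not the authors' actual setup: a 1D nonlocal nonlinear Schrödinger equation, i ψ_z + (1/2) ψ_xx + n ψ = i α(z) ψ, with n obtained by convolving the intensity with a Gaussian response, and α(z) ramping slowly from loss to gain so that the net integrated gain vanishes.

```python
import numpy as np

# Hypothetical demonstration of adiabatic propagation under a slow
# loss-to-gain ramp; all parameter values are assumed for illustration.
N, L = 512, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

# Gaussian nonlocal response R(x), normalized; its Fourier transform
# is used to evaluate the convolution n = R * |psi|^2.
sigma = 2.0                      # nonlocality width (assumed)
R = np.exp(-x**2 / (2 * sigma**2))
R /= R.sum() * dx
Rk = np.fft.fft(np.fft.ifftshift(R)) * dx

def alpha(z, z_max):
    # Slow linear ramp from loss (alpha < 0) to gain (alpha > 0);
    # its integral over [0, z_max] is ~0, so gain compensates loss.
    return -0.02 * (1 - 2 * z / z_max)

psi = np.exp(-x**2)              # initial Gaussian beam (assumed)
dz, z_max = 0.01, 20.0
for i in range(int(z_max / dz)):
    z = i * dz
    # half step of linear diffraction: psi_z = (i/2) psi_xx
    psi = np.fft.ifft(np.exp(-0.25j * k**2 * dz) * np.fft.fft(psi))
    # full step of nonlinearity plus z-dependent loss/gain
    n = np.real(np.fft.ifft(Rk * np.fft.fft(np.abs(psi)**2)))
    psi *= np.exp(1j * n * dz + alpha(z, z_max) * dz)
    # second half step of linear diffraction
    psi = np.fft.ifft(np.exp(-0.25j * k**2 * dz) * np.fft.fft(psi))

# Because the integrated gain cancels the integrated loss, the final
# beam power is close to the initial power, sqrt(pi/2) ~ 1.2533.
power = np.sum(np.abs(psi)**2) * dx
print(power)
```

The slow ramp is the key ingredient: making `alpha` change abruptly (or applying all the gain at one point, mimicking lumped amplification) would excite dispersive radiation instead of returning the beam to its initial state.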