To a good approximation, the flux of cosmic rays (relativistic particles, including protons, electrons, and atomic nuclei) hitting Earth drops off smoothly with energy, but it is the kinks and bends in this spectrum that alert astrophysicists to new or exotic mechanisms by which such particles are accelerated to their relativistic speeds. In Physical Review Letters, Pasquale Blasi at the National Institute for Astrophysics in Arcetri, Italy, and colleagues argue, based on their calculations, that unexpected features in the cosmic-ray spectrum can be explained by the way cosmic rays scatter in the interstellar medium en route to Earth.
Shock waves created in supernova explosions are generally believed to give the necessary boost to particles with energies from giga-electron-volts (GeV) to tera-electron-volts (TeV). The particles then travel diffusively, scattering off magnetic fields in the galaxy. This picture explains why the cosmic-ray flux falls off as a simple power law over five decades of energy, but several experiments, including the Earth-orbiting satellite PAMELA, have observed that the flux falls off slightly faster with energy below a few hundred GeV than above this energy.
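To make the kink concrete, the observed flux can be written as a broken power law. This is a generic parametrization, a sketch rather than the fitted PAMELA spectrum; the break energy $E_{\mathrm{br}}$ and the indices $\gamma_1$, $\gamma_2$ are left symbolic:

\[
\Phi(E) \;\propto\;
\begin{cases}
E^{-\gamma_1} & \text{for } E < E_{\mathrm{br}}, \\
E^{-\gamma_2} & \text{for } E \geq E_{\mathrm{br}},
\end{cases}
\qquad \gamma_1 > \gamma_2 ,
\]

where $\gamma_1 > \gamma_2$ encodes the observation that the flux falls off faster below the break than above it.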
To explain this subtle kink, Blasi et al. calculated the diffusion coefficient of cosmic rays that had been accelerated by supernova shocks. The authors showed that self-induced turbulence, consisting of magnetic waves generated by the cosmic rays themselves, dominates the energy dependence of the diffusion coefficient below the break energy, while turbulence in preexisting magnetic fields dominates it above.
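Why a change in scattering shows up as a kink follows from standard diffusive-propagation scaling. As a minimal sketch, assuming a simple leaky-box picture (the symbols $Q$ for the source spectrum, $D$ for the diffusion coefficient, and $\delta$ for its energy dependence are generic, not values quoted from the paper):

\[
N(E) \;\propto\; \frac{Q(E)}{D(E)}, \qquad Q(E) \propto E^{-\gamma_{\mathrm{inj}}}, \quad D(E) \propto E^{\delta}
\;\;\Longrightarrow\;\; N(E) \propto E^{-(\gamma_{\mathrm{inj}} + \delta)} .
\]

If self-generated waves set a larger $\delta$ below the break and preexisting turbulence sets a smaller $\delta$ above it, the observed spectral index $\gamma_{\mathrm{inj}} + \delta$ changes at the break energy, reproducing the kink without invoking any change in the sources themselves.

–Jessica Thomas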