Dust is everywhere in the Universe. In fact, cosmic dust plays a crucial role in star formation, allowing material in stellar nurseries such as the Orion Nebula to cool enough for gravitational collapse to take hold; once stars have formed, the leftover dust provides the raw material for planets.
On a larger scale, no attempt to observe or understand a distant galaxy – or our own Milky Way – can fail to take into account the effect of dust on what we see.
A single dust grain may be tiny, no more than a tenth of the size of a sand grain, but the effects of dust as a whole are mighty.
Much attention has been paid to how this cosmic dust forms, with the atmospheres of giant, cool stars likely to be responsible for most of it.
The dramatic fading of Betelgeuse in the constellation of Orion, for example, seems to have been caused by the appearance of new stardust in the star’s tenuous outer atmosphere.
Yet relatively little attention has been paid to the other end of cosmic dust’s life cycle. An interesting recent study tackles exactly that question, working out how dust is destroyed.
It’s an important problem because we know, from looking at the dust content of galaxies, that the amount of dust in the Universe has been dropping for at least the last 8 billion years or so.
This is surprising as the amount of raw material for dust, in the form of carbon, silicon and the other heavy elements astronomers call ‘metals’, is increasing.
Production is continuing apace and so something not accounted for in most models must be destroying the dust.
Andrea Ferrara from Pisa and Céline Péroux from ESO in Garching, Germany, think they have the answer – or rather answers.
In their theory, about 40% of cosmic dust is destroyed by supernovae. It makes sense that powerful explosions which send shock waves through the cosmos might destroy delicate dust grains in the supernova’s neighbourhood.
But what’s impressive about Ferrara and Péroux’s idea is that they find significant dust loss overall, even while noting that each individual supernova may be less effective at destroying dust than expected.
The trouble is that massive stars that go on to produce supernovae tend to have powerful stellar winds, and these winds can sculpt a ‘bubble’ of material around them.
Dust caught up in such a bubble forms a dense barrier and is therefore somewhat protected from the shock waves produced by the supernova.
As a result, a supernova destroys on average about half a solar mass of dust – roughly a sixth of what less detailed modelling suggests.
Given the supernova rate over the period studied, that’s enough to account for 40% of the dust loss we see.
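To get a feel for the numbers, here is a minimal back-of-envelope sketch in Python. Only the half-a-solar-mass-per-supernova figure comes from the article; the supernova rate, timespan and dust density are rough illustrative assumptions, not values from Ferrara and Péroux’s paper, so the resulting fraction is indicative at best.

```python
# Order-of-magnitude sketch: how much dust can supernovae destroy?
# Only M_DUST_PER_SN comes from the article; the other numbers are
# illustrative assumptions, not values from the paper.

M_DUST_PER_SN = 0.5   # solar masses of dust destroyed per supernova (article)
SN_RATE = 0.7e-4      # assumed core-collapse supernova rate, per year per Mpc^3
PERIOD = 8e9          # years over which the cosmic dust budget has been falling
DUST_BUDGET = 1e6     # assumed dust density, solar masses per Mpc^3

# Total dust destroyed per unit volume = mass per event x event rate x time
destroyed = M_DUST_PER_SN * SN_RATE * PERIOD  # solar masses per Mpc^3

print(f"Dust destroyed by supernovae: {destroyed:.1e} Msun per Mpc^3")
print(f"Fraction of the assumed dust budget: {destroyed / DUST_BUDGET:.0%}")
```

With these assumed inputs, the sketch gives a few times 10^5 solar masses of dust destroyed per cubic megaparsec – a substantial minority of the assumed budget, in the same spirit as the paper’s conclusion that supernovae account for around 40% of the observed loss.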
What happens to the rest? The answer is rooted in the fact that star formation requires dust.
Dusty clouds can cool, and as they do so they collapse under their own gravity.
The dust is not exempt from this process, becoming heated and incorporated into the star – a process rather wonderfully known as ‘astration’.
This process is efficient enough to account for the majority of lost dust in the last few billion years and explains why our Universe seems to be undergoing a long, slow spring clean.
Chris Lintott was reading Late-time cosmic evolution of dust: solving the puzzle by A Ferrara and C Péroux. Read it online at arxiv.org.
This article originally appeared in the May 2021 issue of BBC Sky at Night Magazine.