At the end of his life, the English naturalist Charles Darwin became intrigued by the musicality of worms. In the last book he ever wrote, in 1881, he described a series of experiments on his vermicular subjects. Worms, Darwin discovered, are sensitive to vibrations transmitted through a solid surface, but tone-deaf and unresponsive to the shriek of a whistle or the bellow of a bassoon.
Earlier, in the 1760s, the French natural philosopher Comte de Buffon heated up balls of iron and other minerals until they were white-hot. Then, by sense of touch alone, he recorded how long it took them to cool to room temperature.
A hundred years before that, Isaac Newton wrote about the time he slid a bodkin – a kind of thick tailor’s needle – between his skull and his eye, and rubbed the needle so as to distort the shape of his own eyeball.
These experiments are all pretty wacky, but they still bear the mark of the scientific. Each one involves the careful recording and assessment of data. Darwin was ruling out the hypothesis that hearing explained earthworm behaviour; Buffon extrapolated the age of the Earth from the cooling rates of a wide range of geological materials (his estimate: 75,000 years); and Newton’s unpleasant self-surgery helped to develop his theory of optics, by clarifying the relationship between the eye’s geometry and the resulting visual effects. Their methods might have been unorthodox, but they were following their intellectual instincts about what the enquiry demanded. They had licence to be scientific mavericks.