  All this went a step further in 1876 when Alexander Graham Bell obtained patents in Britain and America for ‘talking by telegraph’, as he referred to the telephone. In March that year, Bell uttered the first sentence over the phone to his technical assistant Thomas Watson, after he had just upset a chemical-filled glass jar: ‘Mr Watson, come here, I want you!’ In 1878 a telephone call was successfully made across 115 miles of wire from London to Norwich, and in that same year a telephone directory for New Haven was published. Two years later the Telephone Company produced the first London telephone directory, consisting of about 250 subscribers. Telephone numbers were introduced shortly afterwards, as the number of subscribers grew unmanageable for operators. By 1886 telephone kiosks had been constructed, charging 2d for a three-minute call. By 1900 there were 17.6 telephones for every 1,000 people in the United States.8 Communications had undergone a second revolution: first the telegraph had allowed one-way messages to be sent over long distances instantly; now the telephone made two-way exchanges possible. By 1900 Guglielmo Marconi was sending radio messages across the Channel from England to France, and by the end of 1901 he had transmitted a signal across the Atlantic. The first maritime wireless distress message was received in 1899 by the East Goodwin lightship, and resulted in the crew of the German ship Elbe being rescued by the Ramsgate lifeboat. It was all a very far cry from Lieutenant Lapenotière’s 37-hour dash from Falmouth to London to deliver the news of the victory at Trafalgar.

  Public health and sanitation

  There are three things common to cities in all ages: stench, overcrowding and beggars. If a city lacks one of these, you can be sure it hasn’t grown organically but has been laid out according to some philanthropist’s or dictator’s whim. At the start of the nineteenth century there were no such whimsically clean cities in the West. Every city was smelly and visibly populated by huge numbers of the needy poor. City streets were foul but the smells emanating from the cesspits beneath the houses in the slums were even worse. Neighbourhoods were rife with disease. Poverty bred illness and illness exacerbated poverty, dragging the urban poor down into a whirlpool of misery. The average age at death among labourers in Bethnal Green in 1842 was just 16; among the better-off in London it was 45.9 But when it came to doing something about this situation, society was apparently unconcerned. The best prophylactic in most educated people’s minds was to separate yourself and your household from the smells and the slums by buying a house in a leafy suburb. How and why the poor should have their way of life changed for them was not a matter of public interest. Finding out why they suffered more diseases was similarly not a medical priority. A French surgeon who wondered why the poor in one Parisian street had a death rate 50 per cent higher than that of the bourgeoisie in a neighbouring street concluded that immorality was to blame. If the poor could be taught to behave in a less dissolute fashion, they would live longer and so would their children.10 Those who were sceptical about the efficacy of such moral improvements tended to fall back on the old idea that a miasma composed of gases from rotting matter created fetid air, which entered the body and caused the person to become sick. In their eyes, the poor were slovenly, hence they fell ill. Alternative explanations, such as that of Girolamo Fracastoro, who had suggested in his De Contagione (1546) that illnesses were spread by ‘seeds of disease’, had been forgotten.11 There was a paralysis of prophylaxis.

  The great exception to this paralysis was vaccination. In the 1790s the physician Edward Jenner had noted that dairymaids who had suffered a bout of cowpox developed immunity to smallpox. On 14 May 1796 Jenner deliberately inoculated an eight-year-old boy, James Phipps, with cowpox from a dairymaid who was then suffering from the disease. Six weeks later, he inoculated the lad with the dreaded variola virus. Phipps did not develop smallpox. Jenner was exultant, and tried to persuade the Royal Society to publish his findings; they were reluctant to do so on the basis of so little evidence. Jenner accordingly carried out further experiments in 1798 and published the results in An Inquiry into the Causes and Effects of the Variolæ Vaccinæ that same year. His work immediately attracted a lot of attention. By 1803 there were editions in Latin, French, German, Italian, Dutch, Spanish and Portuguese. Charles IV of Spain had his own children vaccinated, and sent the royal physician Francisco de Balmis and 20 children from an orphanage to Colombia, where there had been a smallpox outbreak. The cowpox virus was kept alive during the Atlantic crossing by passing the infection from one child to another. In this way de Balmis was able to vaccinate over 100,000 people in the Caribbean and Latin America. Gradually countries began to make smallpox vaccination compulsory soon after birth, although the United Kingdom itself did not do so until 1853. Other diseases continued to kill as they had always done, however, often finding an overflowing reservoir of victims in the increasingly crowded cities of Europe and the New World.

  On to this murky medical stage entered the Hungarian physician Ignaz Semmelweis. Working in one of two free obstetric clinics in Vienna in 1846, he met pregnant women who would plead with him to be allowed to enter the other clinic. The reason was that his own clinic had death rates that fluctuated around 10 per cent, whereas the likelihood of dying in the second clinic was much lower, about 2.5 per cent. No one could work out why this was. Everything about the two clinics was the same apart from the staff: Semmelweis’s clinic was run by highly qualified physicians, whereas the other was staffed only by midwives. Then, in March 1847, one of Semmelweis’s colleagues died after being accidentally stabbed with a scalpel by a student during an autopsy. Semmelweis noticed that his late colleague’s corpse showed symptoms similar to those exhibited by the women who were dying in his clinic. He came up with the theory that ‘cadaverous particles’ had been transferred from the dead body to the student’s scalpel in the course of the autopsy, and that these particles had killed his colleague. Semmelweis realised that this might explain why women were dying in greater numbers under the care of qualified medical practitioners, who conducted autopsies, than under the care of midwives, who did not. He therefore instigated among the medical staff of his clinic a regime of washing hands with chlorinated lime solution. Almost immediately, the death rate dropped to the same level as that of the other clinic.

  Semmelweis’s students were impressed and promptly set about spreading the news. Most initial reactions to his regime were negative, however. Physicians declared that it was reminiscent of outdated theories of contagion, like Fracastoro’s ‘seeds of disease’. Some feared that the theory of ‘cadaverous particles’ was too close to a magical understanding of the body, which they saw as the enemy of science. After Semmelweis’s term of appointment came to an end, his contract was not renewed and the clinic returned to its old, unclean and deadly ways. In his next position, Semmelweis brought about a similar reduction in women’s deaths through hand-washing and instrument disinfection, but still the medical establishment rejected his advice. Eventually, in 1861, he published his own account of his work, but poor reviews of his book only increased his frustration at not being heard. He became fixated on the rejection of his discovery by physicians more concerned with their own reputations than with helping their frightened patients. In the end he had a nervous breakdown and was committed to a lunatic asylum, where he was beaten by the guards and died of blood poisoning in 1865, at the age of 47. It was two decades before a paper by Louis Pasteur proved that Semmelweis’s theory of ‘cadaverous particles’ was essentially correct.

  While Semmelweis was trying to save the lives of pregnant Viennese paupers, the social reformer Edwin Chadwick committed himself to improving the lives of the London poor. He was instrumental in bringing about the Public Health Act of 1848, which encouraged towns to set up local Boards of Health to improve their slum dwellings, sewers, urban slaughterhouses and water supplies. Not everyone thought this was a good idea. One article in The Times declared that ‘cholera is the best of all sanitary reformers’.12 But Chadwick’s cause received some unexpected support during the cholera outbreak of 1854, when John Snow, a London physician, charted the new cases of the disease and found that all the victims drew their water from a pump in Broad Street. Snow had the pump handle removed to prevent any further infection, the cause of which turned out to be a cholera-infected cesspit a few feet from the well that supplied the pump. Most importantly, he gave evidence to a House of Commons Select Committee that cholera was not directly contagious, nor caused by a miasma of fetid air, but transmitted by water. The solution, he declared, was to improve the drainage and the sewerage of the city. When the Great Stink descended on London in 1858 – a stench arising from untreated effluent so pungent that it forced Parliament to adjourn – the government commissioned Joseph Bazalgette to rebuild the entire sewerage system of the capital. The task occupied him until 1875. That same year, the second Public Health Act was passed, enforcing the provision of running water and sewerage to every new house and the compulsory employment of medical officers by every Board of Health. At the same time, Georges-Eugène Haussmann was rebuilding Paris, complete with a new sewerage system under its boulevards. Modern cities were no longer left to decay in a mass of human detritus.

  Despite his empirical research showing that cholera was waterborne, John Snow could not work out how the Broad Street well actually conveyed the disease. In 1861 Louis Pasteur stumbled on the path that would eventually lead to the answer. Experimenting with broths in glass flasks, Pasteur showed that those left open to the air became mouldy in a short while, whereas broths that were closed to the air did not. He also noticed that broths remained clear of mould and fermentation when they were left open to the air but were protected by very fine dust filters. It followed that the broths were being contaminated by particles in the air, rather than the mould arising from some property of the broth itself. Pasteur’s paper on the subject inspired another Frenchman, Casimir Davaine, who had previously found rod-shaped bacilli in the blood of anthrax-infected sheep. In 1863, Davaine published a paper showing that anthrax was connected to the microorganism he had observed. In Scotland, the surgeon Joseph Lister also realised the potential in Pasteur’s work, suspecting that particles in the air might be the cause of infection in his patients’ wounds. In 1865 he started to apply carbolic acid to dressings and incisions in order to kill the microorganisms that caused gangrene, with positive results. In Germany, Robert Koch worked on the aetiology of anthrax, experimenting with the bacilli that Davaine had identified. In 1876 he established that the microbes formed spores that were inhaled by the animal or otherwise entered the bloodstream, multiplying in the blood and eventually killing the host. In 1878 Koch went on to identify the germ that caused blood poisoning, and in 1882 he found the microbe responsible for tuberculosis, one of the greatest killers in history. Pasteur himself experimented with inoculations against anthrax, chicken cholera and rabies. In a famous incident on 6 July 1885 he vaccinated a nine-year-old boy, Joseph Meister, who had been bitten by a rabid dog two days earlier. Joseph lived, as did a second boy three months later, who had been bitten trying to protect other children from a rabid dog. Germ theory, as Pasteur called it, had arrived.

  All medical discoveries are, in one way or another, matters of public health, and many more advances could be discussed here. One great advance was the introduction of anaesthetics in the 1840s. Another was the successful use of Caesarean section. At the start of the century, this operation was a last resort, as it almost always resulted in traumatic loss of blood and the death of the mother. For the most part, early-nineteenth-century physicians preferred foetal craniotomy – the crushing of the unborn child’s skull and the removal of the foetus in pieces through the vagina – to preserve the life of the mother. One of the very few known early cases in which both mother and baby survived was carried out around 1820 in South Africa by a British military physician, Dr James Barry, who was discovered at death to have been a woman who had masqueraded as a man throughout her career. From the 1880s, however, the operation was undertaken more regularly, and with increasingly positive outcomes for both mother and child. Over the course of the century, life expectancy at birth rose all across Europe, from approximately 30 years of age to roughly 50, and on that basis, the above changes amount to more than feathers in a few medical caps. The nineteenth century was when the West discovered what caused most illnesses, and worked out, in many cases, how to prevent them, how to cure them, and how to limit contagion.

  Photography

  Not long ago, I was interviewed about a medieval subject for a television programme. Shortly afterwards, a picture researcher phoned me to ask where she might find images of a certain character I had mentioned. When I replied that none exist, she said that in that case they would drop all references to him. This little episode starkly reveals how our collective awareness of the past – and our knowledge generally – is shaped by visual sources.

  There is a whole hierarchy to historical imagery: our ability to understand the past is closely related to the number and variety of the images that survive. We find it much easier to imagine sixteenth-century individuals than medieval people because we can see their faces in portraits. The eighteenth century is even more recognisable, for we have street scenes and interiors as well as portraits. But of all the historical images available, it is photographs that have the most impact. One of the key reasons why the First World War is so much more meaningful to today’s public than the Napoleonic Wars is that we can see photographs of the mud and the trenches, and the smiling troops on their way to the Front. We are familiar with the images of the corpse-strewn battlefields to which they were heading. When we see a colour photograph from the First World War, such as Paul Castelnau’s autochrome of a French soldier in his blue uniform cautiously peeping across the top of a trench at the enemy lines, its realism strikes us with far greater force than the painted and engraved depictions of similar scenes in earlier battles.

  Having said this, it is not on account of its future value to historians that nineteenth-century photography is discussed here as a significant change. It is rather because photography did more than any other form of illustration to reform society’s image of itself. You could say it did for society what the mirror did for individuals in the fifteenth century. Think of it this way: if you had seen a painting of a casualty from the Napoleonic Wars, you would have registered straight away that the artist had chosen to portray that particular man for a particular reason. The subject would almost certainly be an officer, and you could safely assume that the reason for the painting would have been to mark a conspicuous act of gallantry on his part. The painting itself would have taken considerable time to produce, and thus would only have been created after ‘the moment’ it depicts had passed: after our hero had ceased grimacing in pain and had had his wound dressed. Thus you would have understood that the artist had deliberately composed the picture, deciding how much of the wound to reveal and how much to conceal. In marked contrast, some of the photographs taken a hundred years later, during the First World War, show true horrors: the dismembered corpses of ordinary men and women among the wreckage of their houses, the earth spat up into the air in terrifying fury as a shell exploded, or a semi-naked woman blown in half with her baby when a mortar shell fell on a maternity hospital.13 They show images of the instant of death as well as its immediate aftermath. They reveal whatever happened to be in front of the lens, including many details that the photographer would not have noticed when he took the shot. Although there was of course still a great deal of intention in photographs, and many pictures were posed for propaganda and publicity purposes, people had come to believe that the camera captured the actual scene. Once the camera shutter had flashed open, the object itself told the story. It was no longer filtered through a genteel artist’s imagination or memory.

  Photography began with a series of pioneers working independently in the 1820s and 1830s: Joseph Nicéphore Niépce and Louis Daguerre in France, Henry Fox Talbot in England, and Hercules Florence in Brazil. From 1839, when the French government bought the rights to Daguerre’s work as a gift to the world, daguerreotypes became the pre-eminent form of photography. They were hardly the sort of thing you’d slip in your wallet, being silver-coated copper plates preserved beneath glass. Exposure times were long, too, so they were awkward to produce. Nevertheless, they proved immensely popular. For the first time, people who would never have dreamed of having their portrait painted could sit for a photograph. The daguerreotype was followed in the early 1850s by the glass-fronted ambrotype, and that in turn was succeeded from about 1860 by the tintype, on a metal backing, and the carte de visite, which allowed multiple copies of the same image to be produced on a card backing. This last image you could put in your wallet. Many middle-class men and women would hand out copies to family and friends in the same way that wealthy people in previous centuries had sat for miniature portraits to give to their loved ones.

  The invention of photography would have amounted to little more than the puffing-up of middle-class pride had it not been for the ability to publish photographic images. In this respect, the key invention was not Daguerre’s method of photographing people in studios but Henry Fox Talbot’s technique of creating negative images from which albumen prints could be made. The first photographic book, Fox Talbot’s The Pencil of Nature (published in six parts, 1844–6), reproduced images of his family home, Lacock Abbey, as well as still-life images and landmarks such as the boulevards of Paris and the bridge at Orléans. Although it was not yet possible to mass-produce high-quality photographs, limited-edition technical books on botanical subjects could include albumen prints of photographs of specimens, not just engravings of drawings. In due course, the technology did allow for printed plates to be inserted into books, at which point the social sciences also benefited. John Thomson, who travelled extensively in the Far East in 1862–71, published views of China and Cambodia that showed English readers for the first time what a Chinese street scene looked like, or how the ruins of Angkor Wat, the great Cambodian temple, loomed out of the jungle. No amount of text could bring these things home to the reader so vividly. Although some travellers had previously produced incredibly detailed and beautiful steel engravings of the landscapes they had visited – the name of William Henry Bartlett leaps to mind – their images were still the result of artistic processes, not the action of light from the object itself falling on a piece of silver-coated metal. They were not ‘true’ in the way that a photograph was true. The public still had a huge appetite for steel-engraved contemporary images, as the massive print runs of the Illustrated London News from the 1840s demonstrate, but this only indicates how much people wanted to see any image that related to a news story: if they could see a photograph, so much the better. Publications such as the Illustrated London News began to incorporate photographs – first in an engraved form, with the engraving trying to mimic the verisimilitude of the original image, and later, when the technology of the 1890s made it possible, in the form of a half-tone reproduction of the photograph itself. In the travel publications of the 1880s and 1890s, photographic plates depicting jungles, ruined temples and exotic peoples in faraway places became de rigueur. It was not enough to travel and tell the story of the expedition; you had to show the reader the marvels you had seen. Through such publications, European armchair travellers began to visualise the rest of the world.