Inside Story

Current affairs & culture from Australia and beyond

What Ada Lovelace can teach us about digital technology

9 September 2019

Extract | How collaborative work can be liberating and effective

The mercurial Ada Lovelace: detail from a watercolour portrait, possibly by A.E. Chalon (1780–1860), c. 1840. Science Museum Group/Wikimedia


From the moment she was born, Augusta Ada Gordon was discouraged from writing poetry. It was a struggle against her genetic predisposition. Her father had led by example in the worst possible way, cavorting around the Mediterranean, leaving whispered tales of deviant eroticism and madness wherever he went. He penned epic stanzas full of thundering drama and licentiousness. Lord Byron understood the dangers of poetry. “Above all, I hope she is not poetical,” he declared upon his daughter’s birth; “the price paid for such advantages, if advantages they be, is such as to make me pray that my child may escape them.” Ada, as she was known, failed to make this escape and barely enjoyed the advantages. The poetry she went on to write was beyond even her father’s imaginings.

Likewise hoping that her daughter might avoid the fate of a father who was “mad, bad and dangerous to know,” Ada’s mother, Lady Anne Isabella Milbanke, ensured that she was schooled with precision and discipline in mathematics from her earliest days, and closely watched her for any signs of the troubles that had plagued her father. Lord Byron, abandoning them weeks after Ada’s birth, died when she was eight; his legacy cast a ghostly shadow over her life.

Ada’s schooling marched ever forward, toward an understanding of the world based on numbers. “We desire certainty not uncertainty, science not art,” she was insistently told by one of her tutors, William Frend. Another tutor was the mathematician and logician Augustus De Morgan, who cautioned Ada’s mother on the perils of teaching mathematics to women: “All women who have published mathematics hitherto have shown knowledge, and the power of getting, but… [none] has wrestled with difficulties and shown a man’s strength in getting over them,” he wrote. “The reason is obvious: the very great tension of mind which they require is beyond the strength of a woman’s physical power of application.”

But Ada was never going to be denied the opportunity to learn about mathematics. Lady Anne was a talent herself, dubbed “Princess of Parallelograms” by Lord Byron. Having managed to outlive him, she let her desire to expunge in her daughter the slightest genetic tendency for mad genius and kinky sex take precedence over any concerns about Ada’s feminine delicacy.

Married at nineteen years of age, Ada, now Countess Lovelace, demonstrated a curiosity and agility of mind that would prove to be of great service to the world. Two years before her marriage, in 1833, she had met Charles Babbage, a notable mathematician with a crankish disposition (he could not stand music, apparently, and started a campaign against street musicians). Together they worked on plans for the Analytical Engine, the world’s first mechanical computer. It was designed to be a mechanical calculator, with punch cards for inputting data and a printer for transcribing solutions to a range of mathematical functions. Babbage was a grand intellect, with a penchant for snobbery and indifference to many of the practicalities of getting things done. Lovelace was his intellectual equal but arguably better adapted to social life.

Like the proverbial genius, Babbage struggled with deadlines and formalities. When one of his speeches, transcribed for publication in Italian, was neglected by Babbage, Lovelace picked it up and translated it. She redrafted parts of it to provide explanations to the reader, and her work ended up accounting for about two-thirds of the total text. This became her signal contribution to the advancement of computing: she turned the transcription into the first-ever paper on computer science, a treatise on the work she and Babbage did together.

There remains some controversy about the extent of Lovelace’s participation in this project, but ample historical evidence exists to dismiss the detractors, not least the direct praise bestowed on her work and intellect by Babbage. Lovelace applied her mathematical imagination to the plans for the Analytical Engine and Babbage’s vision of its potential. She sketched out the possibility of using the machine to perform all sorts of tasks beyond number crunching. In her inspired graphic history of Babbage and Lovelace, Sydney Padua describes Lovelace’s original contribution as one that is foundational to the field of computer science: “By manipulating symbols according to rules, any kind of information, not only numbers, can be operated on by automatic processes.” Lovelace had made the leap from calculation to computation.

Padua describes the relationship between Babbage and Lovelace as complementary in computational terms. “The stubborn, rigid Babbage and mercurial, airy Lovelace embody the division between hardware and software.” Babbage built the mechanics and tinkered endlessly with the physical design; Lovelace was more interested in manipulating the machine’s basic functions using algorithmic formulas. They were, in essence, the first computer geeks.

The kind of thinking needed to build computers is precisely this combination of artistry and engineering, of practical mechanics and abstract mathematics, coupled with an endless curiosity and desire for improvement. The pioneering pair’s work blurred the division between science and art and navigated the spectrum between certainty and uncertainty. Without Babbage, none of it would have happened. But it was Lovelace’s predilection for imaginative thinking, joined to her education in mathematics, that produced the alignment of intellects from which computer science emerged. Lovelace and Babbage’s achievements were impressive because they challenged what was possible while at the same time remaining grounded in human knowledge.

And beyond all this, Lovelace was a woman. (A woman!) In direct contradiction to her tutors’ warnings decades earlier, Babbage wrote, Lovelace was an “enchantress who has thrown her magical spell around the most abstract of Sciences and has grasped it with a force which few masculine intellects (in our country at least) could have exerted over it.” Lovelace showed it was possible to transcend not only the bounds of orthodox mathematics but also her socially prescribed gender role.

No doubt all this caused Lovelace’s mother considerable worry. The madness seemed to be catching up, much to her consternation. In the years after her visionary publication, Lovelace poignantly beseeched Lady Anne: “You will not concede me philosophical poetry. Invert the order! Will you give me poetical philosophy, poetical science?”

For Babbage, the perfect was the enemy of the good, and he never did manage to build a full model of his designs. In 1843, knowing that he struggled with such matters, Lovelace offered, in a lengthy and thoughtful letter, to take over management of the practical and public aspects of his work. He rejected her overtures outright, yet seemed incapable of doing himself what was required to bring his ideas to fruition.

Lovelace’s work in dispelling myths and transforming philosophy was cut short when she died of cancer aged just thirty-six. Babbage died, a bitter and disappointed old man, just shy of eighty. The first computers were not built until a century later.


Technological advances are a product of social context as much as of an individual inventor. The extent to which innovations are possible will depend on a number of factors external to the individuals who make them, including the education available to them, the resources they have to explore their ideas, and the cultural tolerance for the kind of experimentation necessary to develop those ideas.

Melvin Kranzberg, the great historian of technology, observed that technology is a “very human activity — and so is the history of technology.” Humans are responsible for technological development but do not labour in conditions of their own choosing. Had Babbage been a bit more of a practical person, in social as well as technological matters, the world might not have needed to wait an extra century for his ideas to catch on. Had Lovelace lived in a time when women’s involvement in science and technology was encouraged, she might have advanced the field of computer science to a considerably greater degree.

Technological developments more generally, then, can only really be understood by looking at the historical context in which they occur. The industrial revolution saw great advances in production, for example, allowing an economic output that would scarcely have been thought possible in the agrarian society that had prevailed a few generations earlier. These breakthroughs in technology, from the loom to the steam engine, seemed to herald a new age of humanity in which dominance over nature was within reach. The reliance on mysticism and the idea that spiritual devotion would be rewarded with human advancement were losing relevance. The development of technology transformed humanity’s relationship with the natural world, a process that escalated dramatically in the nineteenth century. Humans created a world where we could increasingly determine our own destiny.

But such advances were also a method by which workers were robbed of their agency and relegated to meaningless, repetitive labour without craftsmanship. As machines were built to do work traditionally done by humans, humans themselves started to feel more like machines. It is not difficult to empathise with the Luddites in the early nineteenth century, smashing the machines that had reduced their labour to automated work. In resisting technological progress, workers were resisting the separation of their work from themselves. This separation stripped them of what they understood to be their human essence.

Whatever the horrors of feudalism, it had allowed those who laboured to see what they themselves produced, to understand their value in terms of output directly. Such work was defined, at least to a certain extent, by the human creativity and commitment around it. With industrialisation and the atomisation of craftsmanship, all this began to evaporate, absorbed into steam and fused into steel. Human bodies became a vehicle for energy transfer, a mere input into the machinery of production. It gave poetic significance to the term Karl Marx coined for capital: dead labour.

Though the Luddites are often only glibly referenced in modern debates, the truth is that they were directly concerned with conditions of labour rather than mindless machine-breaking or some reactionary desire to turn back time. They sought to redefine their relationship with technology in a way that resisted dehumanisation.

“Luddites opposed the use of machines whose purpose was to reduce production costs,” writes historian Kevin Binfield, “whether the cost reductions were achieved by decreasing wages or the number of hours worked.” They objected to machinery that made poor-quality products, and they wanted workers to be properly trained and paid. Their chosen tactic was industrial sabotage, and when their frame-breaking became the focus of proposed criminal law reform, it was, of all people, Lord Byron who leaped to their defence in his maiden speech to the House of Lords. Byron pleaded that these instances of violence “have arisen from circumstances of the most unparalleled distress.” “Nothing but absolute want,” he fulminated, “could have driven a large and once honest and industrious body of the people into the commission of excesses so hazardous to themselves, their families, and the community.”

The historical effect of this strategy has been to associate Luddites forever with nostalgia and a doomed wish to unwind the advances of humanity. But to see them as backward-looking would be an interpretive mistake. In their writings, the Luddites appear more like a nineteenth-century equivalent of Anonymous: “The Remedy for you is Shor Destruction Without Detection,” the Luddites wrote in a letter to the home secretary in 1812. “Prepaire for thy Departure and Recommend the same to thy friends.”

There is something very modern about the Luddites. They serve as a reminder of how many of our current dilemmas about technology raise themes that have consistently cropped up throughout history. Another one of Kranzberg’s six laws of technology is that technology is neither inherently good nor bad, nor is it neutral. How technology is developed and in whose interests it is deployed is a function of politics.

The call to arms of the Luddites can be heard a full two centuries later, demanding that we think carefully about the relationship between technology and labour. Is it possible to resist technological advancement without becoming regressive? How can the advances of technology be directed to the service of humanity? Is work an expression of our human essence or a measure of our productivity — and can it be both?

Central to understanding these conundrums is the idea of alienation. Humans, through their labour, materially transform the surrounding world. The capacity to labour beyond the bare necessities for survival gives work a distinct and profound meaning for human beings. “Man produces himself not only intellectually, in his consciousness, but actively and actually,” Marx wrote, “and he can therefore contemplate himself in a world he himself has created.” Our impact on the world can be seen in the product of our labour, a deeply personal experience. How this is organised in society has consequences for our understanding of our own humanity.

What happens to this excess of production — or surplus value — is one of the ultimate political and moral questions facing humanity. Marx’s critique of capitalism was in essence that this surplus value unfairly flows to the owners of capital, or bourgeoisie, not to the workers who actually produce it. The owning class deserve no such privilege; their rapacious, insatiable quest for profit has turned them into monstrous rulers. Production becomes entirely oriented to their need for power and luxury, rather than the needs of human society.

Unsurprisingly, Marx reserved some of his sharpest polemical passages for the bourgeoisie. In his view, the bourgeoisie “resolved personal worth into exchange value, and in place of the numberless indefeasible chartered freedoms, has set up that single, unconscionable freedom — Free Trade. In one word, for exploitation, veiled by religious and political illusions, it has substituted naked, shameless, direct, brutal exploitation.”

This experience of exploitation gives rise to a separation or distancing of the worker from the product of her labour. Labour power becomes something to be sold in the market for sustenance, confined to dull and repetitive tasks, distant from an authentic sense of self. It renders a human being as little more than an input, a cog, a calculable resource in the machinery of production.

For those observing the development of the industrial revolution, this sense of alienation is often bound up with Marx’s analysis of technology. The development of technology facilitated the separation between human essence in the form of productive labour and the outputs of that labour. Instead workers received a wage, a crass substitute for their blood, sweat and tears, a cheap exchange for craftsmanship and care. Wages represented the commodification of time rather than payment for the ingenuity put into the work. The transactional nature of this relationship had consequences. “In tearing away from man the object of his production,” Marx wrote, “estranged labour tears from him his species-life, his real objectivity as a member of the species, and transforms his advantage over animals into the disadvantage that his inorganic body, nature, is taken from him.”

As Amy Wendling notes, it is unsurprising that Marx studied science. He sought to understand the world as it is, rather than pursue enlightenment in the form of spirituality or philosophy alone. He understood capitalism as unleashing misery on the working class in a way that was reprehensible but also, as Wendling put it, “a step, if treacherous, towards liberation.” There was no going back to an agrarian society that valued artisan labour. Nor should there be; in some specific ways, the industrial revolution represented a form of productive progress.

But how things were then was not how they could or should be forever. Marx’s thinking was a product of a desire to learn about the world in material terms while maintaining a vision of how this experience could be transcended. Navigating how to go forward in a way that valued fairness and dignity became a pressing concern of many working people and political radicals in his time, a tradition that continues today. •

This is an edited extract from Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Can Teach Us about Digital Technology, by Lizzie O’Shea, published last month by Verso.
