
The beliefs that made Jane Austen and her world

A ‘fashionable goodness’ lay at the heart of the author and her writing.

Beatrice writes on literature, religion, the arts, and the family.

Anne Hathaway as Jane Austen in Becoming Jane.

‘There just wasn’t a comprehensive book on Jane Austen’s faith,’ Brenda Cox told me when I chatted to her recently. ‘That’s why I decided to write one.’ She’s right. There are a handful of books that treat Austen’s Anglican faith seriously, even extensively. Irene Collins’ two books on Austen, Jane Austen and the Clergy (1994) and Jane Austen: The Parson’s Daughter (1998), as well as Laura Mooneyham White’s Jane Austen’s Anglicanism (2011), are excellent examples. But they’re also very academic. Cox’s new book Fashionable Goodness: Christianity in Jane Austen’s England (2022), on the other hand, achieves something truly remarkable: it’s both highly accessible – assuming no prior knowledge of Austen’s life, of theology, or of Regency history – and highly insightful. It’s a true labour of love (Cox told me she spent years on reading and research), and it shows. Before I say anything else about Fashionable Goodness, let me urge you to read it. If you want to understand the way Austen and her characters saw the world around them, this is the book to pick up.

I’ve spent the last ten-odd years reading, thinking, and writing about Austen, and yet Cox has made me see her novels in a new light. What she does best is to help us immerse ourselves in the daily life of Regency people, detailing in the first part of her book how the Church of England functioned in Austen’s times. She explains the difference between a vicar, a rector, and a curate; what tithes were; what exams a young man had to pass to become an ordained priest. As I was reading Cox’s book, Austen’s characters gradually came alive in my imagination like never before. Learning more about how they lived their faith day to day helped me to better grasp their motivations and their behaviour. For example, how many readers (myself included!) have been left confused by the passage in Persuasion where Anne judges her cousin Mr. Elliot for his habit of ‘Sunday traveling’? It only starts to make sense once we know that traveling on a Sunday would have likely meant missing church, of which Austen disapproved. Similarly, in Mansfield Park Mary Crawford’s scoffing remark that Edmund Bertram will become ‘a celebrated preacher in some great society of Methodists’ will mean little to us unless we know that in the early 19th century Methodists were often regarded with suspicion and looked down upon as overly emotional and ‘enthusiastic’. To my surprise, even my opinion of Austen’s most notoriously silly clergyman, Pride and Prejudice’s Mr. Collins, improved. Cox points out that Mr. Collins writes at least some of his own sermons, at a time when many clergymen would simply pick ready-written sermons out of a sermon book; he is also resident in his own parish of Hunsford after marrying Charlotte Lucas, when non-residence – the practice of a priest delegating all duties to a curate and living away from the parish – was common. Mr. Collins may be irritating and obsequious to a fault, but if we judge him by the standards of his own time, not of ours, he emerges as quite a respectable man after all.


And that is something else that Cox does brilliantly: she shows us that the past is indeed a foreign country, with different moral standards. Instead of trying to find ways in which we’re similar to the people of Austen’s England, Cox helps us to realise that their values and assumptions are radically different from ours. Even our language is different. Focusing on what she identifies as key ‘faith words’, Cox shows us that we cannot understand just how deeply English society was steeped in the Christian faith unless we recognise the religious significance that many words had in Austen’s times. For example, when Elinor Dashwood cries to her sister Marianne, ‘Exert yourself’, in Sense and Sensibility, she doesn’t simply mean that Marianne should be less emotionally affected by Willoughby’s betrayal. Rather, she’s reminding Marianne of her religious duty of ‘exertion’, meaning not giving in to despair. Or, when Anne Elliot engages in ‘An interval of meditation, serious and grateful’ after her engagement to Captain Wentworth in Persuasion, we should not understand Anne’s ‘serious meditation’ as mere reflection; Austen would have expected her readers to know that, in this passage, Anne is examining her conscience and praying. Even the word ‘manners’, often mentioned in Mansfield Park, had a deeper meaning than simply social graces, pointing to a person’s religious principles. Cox encourages us to notice these differences, and to let the past, in its alienness, change our way of seeing the world.

Lastly, Cox also presents an England whose religious sensibilities were changing fast. The Church of England faced pressure to address its problems. Pluralism, the practice of one clergyman holding several parishes, meant that some members of the clergy were very well off while others struggled to make a living. In turn, this encouraged non-residence – especially if the parishes were far from each other – and left the non-resident parishes neglected. But at the same time, the Church of England was also being infused with new-found religious fervour. The Evangelical and Methodist movements, still officially part of the Anglican Church at this point, were spreading at a rapid pace thanks to figures like George Whitefield and the Wesley brothers, championing many worthy causes in the name of the Christian faith. The abolitionist movement heralded by Wilberforce, Clarkson, and Hannah More was gaining momentum just as Austen was beginning to write novels. By the time Sense and Sensibility, her first, was published, the slave trade had been abolished in Britain. Sunday schools were opening which would educate thousands upon thousands of children in the 19th century; the prison reform movement was gaining popularity, as were efforts to combat animal cruelty and to ensure better conditions for factory workers. Goodness, as Cox puts it, was becoming fashionable in England.

What about Austen herself? Cox tells us that in her letters Austen mentions reading the works of abolitionists with pleasure, and remarks on the newly emerging Evangelical movement with something like cautious admiration. Far from being in ignorance of these changes in religious sensibility, Austen observed them, and they gave her hope. As Cox quotes in the final chapter of Fashionable Goodness, in an 1814 letter to her friend Martha Lloyd, Austen describes England as ‘a Nation in spite of much Evil improving in Religion’. Austen was confident that faithful Christians could rise to the challenges placed before them, and this confidence is reflected in her heroines and heroes, whose storylines trace their growth in virtue. It’s perhaps not a coincidence that 1814 is also the year Mansfield Park was published, the novel whose heroine, Fanny Price, is famously the most ardent in her moral principles. Fanny’s ‘goodness’, however – which the narrator often explicitly mentions – is no longer fashionable. Contemporary readers of Austen tend to dislike her seriousness and her outspoken religiosity. But perhaps, if we join Brenda Cox in immersing ourselves in the alien country that is Regency England, we can learn to judge the ‘goodness’ of Austen’s characters by standards different from our own. Perhaps we can free ourselves of our prejudices and appreciate earnest characters like Fanny as well as witty ones like Emma Woodhouse or Elizabeth Bennet. Perhaps we too, like Austen herself, will gain hope that ‘goodness’ can be made fashionable once more in our time.


Ethics needs to catch up with genetic innovation

Are we morally obliged to genetically edit?

John is Professor Emeritus of Cell and Molecular Biology at the University of Exeter.

Artist Nidia Dias visualises how AI could assist genomic studies.
Google Deepmind via Unsplash.

It makes me feel very old when I realise that Louise Brown, the first baby to be born via in vitro fertilisation (IVF), will be 47 years old on July 25th this year. Since her birth in 1978, over 10 million IVF-conceived babies have been born worldwide, of whom about 400,000 have been in the UK. Over that period, success rates have increased such that in some clinics, about 50 per cent of IVF cycles lead to a live birth. At the same time, there have also been significant advances in genetics, genomics and stem cell biology all of which, in relation to human embryos, raise interesting and sometimes challenging ethical issues. 

I start with a question: what is the ‘moral status’ of the early human embryo? Whether the embryo arises by normal fertilisation after sexual intercourse or by IVF, there is a phase of a few days during which the embryo is undergoing the earliest stages of development but has not yet implanted into the wall of the uterus; the prospective mother is not yet pregnant. In UK law, based on the Human Fertilisation and Embryology Act (1990), these early embryos are not regarded as human persons but should nevertheless be treated with some respect. However, there are some who oppose this view and believe that from the ‘moment of conception’ (there actually isn’t such a thing – fertilisation takes several hours) embryos should be treated as persons. In ‘conventional’ IVF this debate is especially relevant to the spare embryos that are generated during each IVF cycle and which are stored, deep-frozen, in increasing numbers for possible use in the future.

A further dimension was added to this area of debate when it became possible to test IVF embryos for the presence of genetic mutations that cause disease. This process, called pre-implantation genetic diagnosis, enables prospective parents who are at known risk of passing on a deleterious mutation to avoid having a child who carries that mutation. But what about the embryos that are rejected? They are usually discarded or destroyed, but some are used in research. Those who hold a very conservative view of the status of the early embryo will ask what right we have to discard or destroy an embryo because it has the ‘wrong genes’. And even for the many who hold a less conservative view, several questions remain: which genetic variants should we be allowed to select against? Should we allow positive selection for genes known to promote health in some way? Should we allow selection for non-therapeutic reasons, for example sporting prowess? These questions will not go away, and there are already indications that non-therapeutic selection is being offered in a small number of countries.

Genetic modification 

This leads us on to think about altering human genes. Initially, the issue was genetic modification (GM), which in general involves adding genes. GM techniques have been used very successfully to cure several conditions, including congenital severe immune deficiency, and as part of treatment programmes for certain very difficult childhood cancers. One key feature of these examples is that the genetic change is not passed on to the next generation – it involves only the body of someone who has already been born. Thus we call them somatic genetic changes (from the Greek sōmatikos, meaning ‘of the body’).

Genetic modification which is passed on to the next generation is called germline GM, which means that the genetic change must get into the ‘germ cells’, i.e., the sperm or egg. Currently, the only feasible way of doing this is to carry out the genetic modification on the very early embryo. At present, however, with just one very specific exception, GM of human embryos is forbidden in all the countries where it would be possible to do it. There is firstly the question of deciding whether it is right to change the genetic makeup of a future human being in such a way that the change is passed to succeeding generations. Secondly, there are concerns about the long-term safety of the procedure. Although it would involve adding specific genes with known effects, the complexity of genetic regulation and gene interactions during human development means that scientists are concerned about the risks of unforeseen effects. And thirdly, germline GM raises dramatically the possibility of using GM for enhancement rather than for medical reasons.

Genome editing 

This leads us to think about genome editing. In 2011, it was shown that a bacterial system which edits the genomes of invading viruses could also work in other organisms. This opened up a large array of applications in research, agriculture and medicine. However, the ethical issues raised by genome editing are, in essence, the same as those raised by GM, and so there is still a universal prohibition on using the technique with human embryos: germline genome editing is forbidden. Despite this, a Chinese medical scientist, He Jiankui, announced in 2018 that he had edited the genomes of several embryos, making them resistant to HIV; two babies with edited genomes had already been born, while several more were on the way. The announcement caused outrage across the world, including in China itself. He Jiankui was removed from his job and then, after a trial, was imprisoned for three years; his two colleagues who collaborated in this work received shorter sentences.

At present the universal prohibition of human germline genome editing remains in place. However, the discussion has been re-opened in a paper by an Anglo-Australian group. They suggest that we need to develop heritable (i.e., germline) polygenic genome editing in order to reduce significantly an individual’s risk of developing degenerative diseases, including coronary artery disease, Alzheimer’s disease, major depressive disorder, diabetes and schizophrenia. I note in passing that one of the authors is Julian Savulescu at Oxford, who is already well known for his view that parents who are able to do so are ‘morally obliged’ to seek to have genetically enhanced children, whether by PGD, GM or genome editing. The use of polygenic editing, which would in all likelihood be available only to the (wealthy) few, fits in well with his overall ethical position. Needless to say, the paper, published in the prestigious journal Nature, attracted a lot of attention in the world of medical genetics. It was not, however, universally welcomed – far from it. Another international group of medical scientists and ethicists has stated that ‘Human embryo editing against disease is unsafe and unproven …’ and has even gone as far as to suggest that the technology is ‘… going to be taken up by people who are pushing a eugenics agenda …’.

Harder still and harder 

I have no doubt that amongst different readers there will be a range of opinions about the topics discussed so far. For anyone who is Christian (or indeed an adherent of almost any religious faith), one of the difficulties is that modern science, technology and medicine have thrown up ethical questions that could not even have been dreamed of by the writers of the Bible (or of other religious texts). We just have to use our wisdom, knowledge and general moral compass (and for some, prayer) to try to reach a decision. And if what I have already written makes that difficult, some recent developments multiply that difficulty still more.

In the early years of this century, scientists developed methods of transforming a range of human cells into ‘pluripotent’ stem cells, i.e., cells capable of growing into a wide range of cell types. It also became possible to get both induced stem cells and natural stem cells to develop into functional differentiated cells corresponding to specific body tissues. This has huge potential for repairing damaged organs. However, other applications are potentially much more controversial. In 2023, Cambridge scientists reported that they had used stem cells to create synthetic mouse embryos which progressed at least as far as brain and heart formation within the normal pattern of mouse embryo development. 

At about the same time, the Cambridge group used individual human embryonic stem cells (from the blastocyst stage of embryonic development) to ‘grow’ early human embryos in the lab. There is no intention to use these embryos to start a pregnancy – indeed, it would be illegal to do so – but rather to study a period of embryo development which cannot be studied with ‘real’ human embryos (research must not continue past 14 days of development). But how should we regard synthetic embryos? What is their moral status? For those who hold a conservative view of the normal human embryo (see earlier), should we regard these synthetic embryos as persons? Nor does the law help us: the legal frameworks covering in vitro fertilisation and early embryos (the Human Fertilisation and Embryology Acts of 1990 and 2008) do not cover artificial embryos, which were unknown when the legislation was drawn up. Indeed, synthetic embryos/embryo models are, in law, not actually embryos, however much they look and behave like early embryos. Earlier this month, the Human Fertilisation and Embryology Authority (HFEA) discussed these developments with a view to recommending new legislation, but this will not dispel the unease felt by some people, including the science correspondent of The Daily Telegraph, who wrote that this research is irresponsible.

But there is more. In addition to synthetic embryos, the HFEA also discussed the possible use of gametes – eggs and sperm – grown in the lab from somatic stem cells (e.g., from skin). Some authors have suggested that the production of gametes in vitro is the ‘Holy Grail’ of fertility research. I am not so sure about that, but it is clear that a lot of effort is going into this research. Success so far is limited to the birth of several baby mice, ‘conceived’ via lab-grown eggs and normal sperm. Nevertheless, it is predicted that lab-grown human eggs and sperm will be available within a decade. Indeed, several clinicians have suggested that these ‘IVGs’ (in vitro gametes) seem destined to become ‘a routine part of clinical practice’.

The lab-grown gametes would be used in otherwise normal IVF procedures, the only novelty being the ‘history’ of the eggs and/or sperm. Clinicians have suggested that this could help couples in which one or both partners are unable to produce the relevant gamete but who still want to have children. In this application, the use of IVGs poses no new ethical questions, although we may be concerned about the possibility of the gametes carrying new genetic mutations. However, some of the more wide-ranging scenarios do, at the very least, make us stop and think. For example, it would be possible for a same-sex couple to have a child with both of them being a genetic parent (obviously for males, this would also involve a surrogate mother). More extremely, a person could have a child of which he or she was, in strictly genetic terms, both the ‘father’ and the ‘mother’. What are we to make of this? Where are our limits?

Dr Christopher Wild, former director of the International Agency for Research on Cancer, explores in depth many of the developments and issues I have outlined above. His article on why a theology of embryos is needed is clear, well written, helpful and thought-provoking.

 

This article is based on a longer blog post with full footnotes.  
