A digital cage is still a cage

By Neil Crowther and Lorna McGregor | June 24, 2022

How can new and emerging digital technologies advance, rather than put at risk, the human rights of older people who draw on social care?

On the face of it, technology presents exciting new possibilities for people in care. However, there is a danger that it could in fact intrude on people's private lives and infringe their human rights. Neil Crowther and Lorna McGregor stand at the fork in the road and discuss the opposing paths ahead.

As we become older, we may need to make adjustments to where or how we live our lives and to draw on new sources of support. For many of us, this will involve support from family, friends or neighbours. Some of us will reach out to voluntary or community groups. We may draw on formal care and support in our own homes or we might move to live somewhere else, such as in a residential home.

In common with all areas of life, digital technologies are playing an increasing role in the context of care and support. Sometimes people draw on mainstream technologies, such as smart home devices, virtual assistants, sensors, security cameras, smartphones and smartwatches, with the aim of living more safely and conveniently at home. At other times, technology has been specifically designed for use by care providers, for example to detect movement, falls or changes in people's daily living patterns, to recognise pain, or to collate and analyse data about people's health and wellbeing. Technology is also being used to manage care delivery.

How new technology could pose a risk to our human rights

New and emerging digital technologies, including technologies enabled by artificial intelligence (AI), may offer many opportunities to protect and advance our health and wellbeing as we get older and need more support to live our lives. They hold the potential to overcome some of the challenges that have historically linked care and support to a loss of control over day-to-day living and where and with whom we live.

But, without careful attention, technology – or rather the reasons for its employment and the way it is used – could also pose risks to our human rights, replicating some of these historic challenges while throwing up new ones.

Sensors, cameras and wearable devices may permit people to live more safely at home, reducing the need for constant human supervision and monitoring. But they could also involve significant intrusion into people’s innermost private lives, observing and collecting data about people’s every movement and activity, their mental and physical state, and that of people they invite into their home.

Smart homes could help liberate people from the walls of a multi-bedded institution, but, if programmed to determine when a person gets up or goes to bed, whether they can leave the property, who is able to visit, or when and what they eat, they could easily replicate the features of institutionalisation, depriving people of liberty and autonomy.

Virtual assistants, videoconferencing and telecare may helpfully allow more people to be supported at home by overcoming distributional challenges of providing support, especially in non-urban areas, but without corresponding action to support people to be with other people in person, this could leave more people isolated and lonely, living in – but disconnected from – the wider community.

Some people may be in a position to make informed decisions about the use of technology in the context of their support, but others may find themselves increasingly subject to the use of new technologies without having any say in it at all. In summary, all of these technologies can simultaneously support us to be in the place we call home, yet violate all the dimensions that make our home, home – and, in turn, our human rights.

Standing at the fork in the road

Two futures seem possible: one of increasingly remote and automated services, where new and emerging technologies instrumentalise us to contain costs and 'maintain' us, and another in which these technologies become instruments for achieving the best conditions for later life, based on rights, autonomy, dignity and social connection. These futures may unfold concurrently for different people, reflecting and potentially deepening existing inequalities in the power people are able to exert over matters of care or support.

Whether these risks and opportunities manifest themselves depends on a range of factors, including:

  • Who decides whether particular technologies are used and for what purpose.
  • The design and function of the technology.
  • Who decides who can access and use the data for what purposes.
  • Whether we can meaningfully consent to, and control, whether and how data is collected, stored, deleted, shared or sold, and the data analytics applied to it.
  • Whether it replaces or complements services or decisions typically made by humans.
  • The safeguards in place to protect our dignity, autonomy and human rights.

While there is a growing literature on the ethics and human rights implications of the use of such technologies generally, and by specific actors, such as the police, there is comparatively little on the ethical and human rights implications of the application of these technologies in the context of social care. Moreover, ethical and human rights considerations appear to play little part in the commissioning, design or application of technologies in this area of life.

Ultimately, the future direction of technology in this field will depend on how social care is conceived. If we choose a future in which care amounts to little more than maintenance, then it is more likely that technology will be designed and employed to help keep people alive, but not living a life. If we choose a future in which people are supported to live a life they have reason to value, and in which they fully participate and are included in the wider community, then technology will be developed to support that goal. It isn’t a done deal and, with human rights as our guide, it is within our power to choose the right path.

Taking the right path

How can people who draw on social care and their advocates, social care providers and commissioners be confident that new and emerging technology advances care and avoids violating the human rights of older people?

Path 1: The potential for new and emerging technologies to support older people’s human rights increases if:

  • The technology is used in a context in which social care is based on equality and human rights, and focuses on supporting people to live independently and be included in the community, with control over decisions about their lives.
  • People who draw on social care, with support where appropriate, decide that a specific technology may help them meet their own clearly defined goals.
  • People are not placed in a ‘take it or leave it’ position over care or support if they decide they do not want technology in their lives, with viable and accessible non-technological options being made available.
  • People have the opportunity to test and trial individual pieces of technology and smart homes before deciding to introduce them into their own home and can change their minds about using them at a later stage without detriment to their care and support.
  • The technology is accessible, both in terms of cost and design, and through digital literacy initiatives.
  • Tech providers and state and independent care providers are fully transparent about how the technology captures, processes (including through the use of data analytics), stores, shares and sells data, with whom, and in what form.
  • There are clear and accessible ways to make complaints about the technology itself or how different actors, including state agencies, independent care providers and health workers, access the technology or make decisions based on the data it collects.

Path 2: The potential for new and emerging technologies to put the human rights of older people who draw on care and support at risk increases if: 

  • The technology is introduced in a context in which data and new and emerging technologies are poorly regulated, and social care is about ‘maintaining’ people and is transactional in nature rather than about supporting people to live their best lives.
  • The purpose of introducing the technology is not primarily to meet people’s own goals but to meet the objectives of someone else, such as for monitoring or to cut costs in care, through reduced staffing.
  • People are not able to exercise meaningful consent about the introduction of the technology, or are not given the opportunity to opt out of or reject its use in their lives.
  • Non-technological alternatives are not provided, leaving people in a 'take it or leave it' position about the role of technology in their lives and denying them choice about how to arrange their own care or support.
  • Technology companies and state and independent care providers are not transparent about the invasiveness of the technology and how data is captured, processed, stored, shared or used.
  • The use of technology takes away or restricts choice and control about how people live their lives, for example, about whether people can leave their home or whom they allow in, and when they eat, get up in the morning or go to bed at night.
  • There are no routes – or inadequate, inaccessible or ineffective routes – to challenge the introduction of the technology.

Neil Crowther is Convener of #socialcarefuture. Twitter: @neilmcrowther

Lorna McGregor is Director of the Economic and Social Research Council Human Rights, Big Data and Technology Project. Email: lmcgreg@essex.ac.uk Twitter: @HRBDTNews

Download and read the full report, A digital cage is still a cage – How can new and emerging digital technologies advance, rather than put at risk, the human rights of older people who draw on social care?


About Neil Crowther

Neil Crowther is a co-convener of the #socialcarefuture movement. He is also an independent researcher and social change consultant. Neil’s father, who died last September, had Alzheimer’s disease and through this Neil has experience of trying to navigate the current social care ‘system’. His parents were fortunate to benefit from the creation of a ‘community circle’ to support them, an idea that is part of #socialcarefuture’s vision for the future.
