Do Digital Health Sensors Limit Our Freedom of Choice?
How do you live healthier with data? How do you get used to sensors and wearables? I receive plenty of questions after my keynotes about digital health: how it changes my life and how it could transform society in the future. A while ago, I was on stage in Lisbon when someone asked me whether I think the use of health sensors might limit our freedom of choice. As it generated a discussion within The Medical Futurist team too, I decided to outline my position and the counter-arguments. Needless to say, I argued that technology does not curb our freedom in any way.
Would sleep tracking limit your freedom of choice?
For me, data is a source of self-understanding and a boost for better performance. In previous articles, I have described how I track stress, how I changed my life with a simple Excel spreadsheet, and how you could live healthier with the help of technology.
I have been tracking my sleep for years in order to optimize it, so I have quite a lot of experience with apps, sensors, and methods. I even ran a six-month-long sleep tracking experiment to get the most out of my sleep. I have tried many sensors, from the Fitbit One and Surge to the Viatom Checkme and Withings Pulse, but for me, the ultimate solution turned out to be the duo of the Pebble Time sensor and the Sleep as Android app. This jackpot combination has been my companion for two years already. The app supplies the algorithms and measures detailed sleep quality, while the sensor tracks my movements and wakes me up at the best time in the morning.
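Sleep tracking apps do not publish their exact algorithms, but the core idea of a smart alarm can be sketched simply: instead of a fixed alarm time, you set a wake-up window, and the app fires the alarm at the moment within that window when your movement suggests you are sleeping most lightly. The following is a hypothetical minimal sketch of that logic (the function name and the movement-intensity scale are my own assumptions, not any app's actual API):

```python
from datetime import datetime, timedelta

def pick_wake_time(movement, window_start, window_end):
    """Pick the minute of highest movement (lightest sleep) inside the wake window.

    movement: dict mapping a minute-resolution datetime to a movement
    intensity between 0.0 (still) and 1.0 (restless) -- a made-up scale.
    """
    # Keep only the samples that fall inside the allowed wake-up window.
    candidates = {t: m for t, m in movement.items()
                  if window_start <= t <= window_end}
    if not candidates:
        # No movement data: fall back to the hard deadline.
        return window_end
    # The restless moment is the best guess for light sleep.
    return max(candidates, key=candidates.get)

# Example: a 30-minute window before a 7:00 deadline.
start = datetime(2024, 1, 15, 6, 30)
samples = {start + timedelta(minutes=i): m
           for i, m in enumerate([0.1, 0.1, 0.2, 0.1, 0.6, 0.3] + [0.1] * 24)}
wake = pick_wake_time(samples, start, start + timedelta(minutes=30))
print(wake.strftime("%H:%M"))  # 06:34 -- the restless minute in the window
```

In this toy run the burst of movement at 6:34 wins over the quiet minutes around it, so the alarm rings 26 minutes before the deadline rather than at 7:00 sharp.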
As I often talk about how I track my sleep, I was asked after my keynote in Lisbon what I think about freedom of choice when it comes to sleeping and waking up with a tracker. The questioner said that if he used the devices I use, he would give up his freedom, since technology would have control over his life.
I strongly disagree with the notion of technology taking control
I have the choice every day to put aside the sleep sensor and the sleep app and go to bed without any gadgets. Sometimes I do, when I don’t feel like monitoring my bedtime. I am the ultimate decision-making authority here. I decide when I use it in order to get up at the best time, well-rested and efficient – and when I don’t use anything at all.
One of the key elements here is that I’m clearly aware that technology serves my purposes and my purposes only. It is a means of making my life better, and I do not attach any other feeling or afterthought to it. A piece of technology for measuring data. Objective, impersonal and detachable.
The alarm clock and the choice to wake up
However, some members of The Medical Futurist team did not share my view – my editor, for example. She has been using the sleep app Sleep Cycle for several months now and loves the idea of waking up at the best possible time. In her view, the app does a good job of waking her; however, most of the time she still cannot get out of bed in time, as she sleeps just five more minutes after the smart alarm goes off. They have a complicated relationship. I told her it’s inefficient.
Anyway, she believes that personalized technology has the power to curb people’s freedom of choice. Here is her line of argument:
“It all started with the alarm clock itself. The concept of a personal timekeeping device policing when people should or should not wake up captures in itself how society changed and how technology became a means of power controlling people’s ways of life. According to an article in Atlas Obscura, alarm clocks originated in Germany in the 15th century, but most people didn’t own such clocks and relied upon the sun, servants or prayer-chiming bells until the 18th-19th century. It was the industrial revolution, with its rigid work hours and tighter rules in Western societies, that made the alarm clock ubiquitous.
Alarm clocks wake people up by intruding on their natural processes, and smart sleep alarms do the same. They are intruders, but at least they intrude in a less brutal way, trying to pay attention to the individual’s body functions. So according to my first argument, digital health technology continues a line of development that kicked off in the 18th-19th century, when the policing of people through alarm clocks and other technological means began. Yet at least it intervenes in a much softer way.
Control comes with rewards and punishments
The second issue with health sensors, trackers and wearables is the effect of all these objective measurements on people. Bertalan often mentions that his data affects his performance. When he goes out for a run, he sometimes runs more because he can see his data peak or his performance improving. Data gives him a rewarding feeling of energy. But was it his own choice to stay out and run more? Without the data, he would definitely not have decided to do one more round on the track. Was that his free choice? The data certainly had some coercive power, but he chose the device in the first place precisely to get that rewarding feeling through the data. He knew his own nature and what he needs, and uses the technology accordingly – as a means. I believe that’s the most important element of my argument.
However, data and technology can also have a dark side: the psychological effect on someone who cannot live up to the expectations set by data and technology according to various standards in society. What does it mean to lead a healthy life? How much exercise is enough? What food is advisable to eat? How much sleep is healthy? These questions come down to the tiniest details of our lives. So what if someone believes that healthy sleep can only be achieved and maintained through sleep tracking? Could this person opt out freely, knowing that abandoning sleep tracking comes with the risk of not taking proper care of his or her health? I would say no. Opting out will come with discomfort, distress – and in the worst-case scenario, some early signs of hypochondria.
Do we have too many choices?
Several companies already openly attempt to harness our desire to achieve the standards of a healthy way of life – or any other goal – through rewards or punishments. Some weeks ago, The Medical Futurist reviewed the Lumo Lift, which vibrates when you slouch, motivating you to sit up straight. There is a start-up called Beeminder, which punishes you financially if you fail to meet your goals. I would not be surprised to see many more on the skyrocketing wearables market.
I believe that Google Maps’ failed attempt to show calories on its digitized maps also stems from this phenomenon. Why did it fail, you ask? First of all, the tiny image of mini cupcakes representing the calories is loaded with outdated stereotypes about femininity, masculinity and body image – a sensitivity missing from most fitness trackers and wearables. Secondly, counting calories sends a very different message from counting your steps: the latter means being fit, able and actively doing something for your health, while counting calories is about making notes of goods already consumed – a rather passive activity, subject to the disapproval of Western society if done in excess.
From the topic of rewards and punishments, another question springs up: if we acknowledge that technology and wearables exert some sort of control over our lives, where does that wish come from? Why do we want them to take over some decisions from us? It might have something to do with responsibility. It’s certainly easier to say that the smart alarm did not wake me loudly enough, so I was late, than to admit I went to bed too late last night.
Yet it might also have to do with the abundance of options and choices all around us. Amy Lunt from the University of Bath argues that in a world where you are constantly urged to express who your true self is, maintaining a unique identity can become stressful. In response, people derive great pleasure from handing over control to others, even if this means freedom is constrained and choices are limited: ‘For consumers burdened by organizations’ ceaseless command to self-express, technologies that promise to limit, constrain, and dominate appear to relieve that burden.’”
My editor’s long line of argument definitely reveals some theoretical and ethical points to consider when we think about the effects of personalized technology on society and the person of the future. I am curious what you think about our debate, and I encourage everyone to share their opinions on my Twitter feed, Facebook or LinkedIn page! Let the discussion begin!