In the digital age, the adage stands: if you are not paying for the product, you are the product. In an era where business marketing, sales and influence are increasingly personalised and driven by insights into consumers, there is no product of more value than your personal data.
The quantified self movement and the growing focus on health, fitness and beauty in recent years have brought about a proliferation of apps and devices focused on improving these areas of our lives. Increasingly, it seems as if our smartphones and wearable devices know us better than we know ourselves.
The pandemic has accelerated advances in health devices, all while revealing holes in our healthcare systems. While COVID testing has proved complex, recent research conducted by scientists at RMIT University may provide an entirely new form of COVID detection: an AI-powered algorithm that could soon be turned into a smartphone app able to detect COVID by monitoring users’ coughs. Reliable, contactless and accessible, such an app would offer a viable alternative for COVID testing, allowing efficient and early detection that is personally available to the majority of the population.
While we have not yet seen this kind of data-driven detection used for COVID, AI-powered apps designed to enhance healthcare have developed significantly in recent years. Devices like Fitbits and Apple Watches, along with health apps, have surged in popularity as more and more individuals seize the opportunity to monitor their own health and fitness. New updates to Apple’s Health app, which can already track heart rate, daily steps and sleeping patterns, will soon see the app monitor users’ walking steadiness, alert them to trends and patterns in their health and enable easy sharing of data with contacts and health professionals.
More and more, the public is being forced to reckon with the uses of its personal data. While there are numerous benefits that come with health apps especially, we would be naïve to ignore the dangers and potential privacy violations that inherently come with their use.
There is one highly popular branch of health-monitoring apps that leads the way in its knowledge of users’ intimate lives: period-tracking apps. Known as ‘femtech’, this branch of data-driven healthcare, designed to monitor and assist women, is estimated to be worth over $50 billion by 2025. Period-tracking apps such as Flo, Clue and Glow allow users to log their daily symptoms, offering in return detailed and highly personalised insights, predictions and advice regarding the user’s menstrual cycle, fertility and pregnancy. To make these predictions, the apps ask users to log information about their mood, flow, physical symptoms and even libido.
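The prediction mechanics underlying such apps can start from something surprisingly simple. As an illustrative sketch only (no particular app’s actual algorithm, which would also weigh symptoms and population statistics), a baseline predictor might average a user’s recent cycle lengths to forecast the next period start:

```python
from datetime import date, timedelta

def predict_next_period(period_starts: list[date]) -> date:
    """Forecast the next period start by averaging recent cycle lengths.

    `period_starts` is a chronological list of logged start dates.
    This is a hypothetical baseline, not any real app's model.
    """
    if len(period_starts) < 2:
        raise ValueError("need at least two logged cycles")
    # Length of each cycle = gap between consecutive logged start dates
    lengths = [(b - a).days for a, b in zip(period_starts, period_starts[1:])]
    avg_cycle = round(sum(lengths) / len(lengths))
    return period_starts[-1] + timedelta(days=avg_cycle)

starts = [date(2021, 3, 1), date(2021, 3, 29), date(2021, 4, 27)]
print(predict_next_period(starts))  # 2021-05-25
```

Each additional logged symptom gives a commercial app more signal to refine such estimates, which is precisely why the data these apps accumulate is so detailed and so valuable.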
A recent innovation has extended these apps’ capabilities with a wearable ring that measures heart rate, body temperature and other key bodily signals to assist with birth control. The ring tracks menstruation, predicts ovulation and recommends precautions the user should take to avoid pregnancy.
Wherever there are personalised insights, advertising or recommendations, we can be sure that there are vast amounts of information about us that an algorithm has processed behind the scenes.
While companies simply having this information may not seem overly alarming, it is what they may do with the data that poses the real threat.
Insurance companies, advertisers and pharmaceutical corporations are all among the potential purchasers of your data, which offers precisely the information they need to position you as a high-paying customer.
In a telling example, a United States Senate Commerce Committee investigation revealed in 2013 that an American company named Medbase200 had sold lists of families with specific illnesses, including AIDS and gonorrhoea, to pharmaceutical companies. Even more despicably, the company also advertised lists of rape victims at a price of $79 per 1,000 names, along with similar databases of domestic violence victims.
For an alarming violation of data privacy that predates personalised online advertising, look no further than Target. When Target’s data scientists discovered a strong correlation between early-stage pregnancy and the purchase of a range of 25 health and cosmetics products, they put these insights to use. When a customer’s transaction history showed purchases of enough of these products, the company’s marketing machine swung into action, bombarding the customer with pregnancy-related advertising.
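Target’s actual model and product list have never been made public, but the mechanic described above — score a shopper by how many indicator products appear in their purchase history, then trigger marketing once a threshold is crossed — can be sketched as follows. The product names and threshold here are purely hypothetical:

```python
# Hypothetical indicator products; Target's real list of 25 is not public.
PREGNANCY_INDICATORS = {
    "unscented lotion", "cotton balls", "zinc supplement",
    "magnesium supplement", "large tote bag",
}
SCORE_THRESHOLD = 3  # hypothetical trigger point

def should_send_pregnancy_ads(purchases: list[str]) -> bool:
    """Return True once a shopper's history contains enough
    indicator products to cross the (hypothetical) threshold."""
    score = len(PREGNANCY_INDICATORS & set(purchases))
    return score >= SCORE_THRESHOLD

history = ["unscented lotion", "cotton balls", "bread", "zinc supplement"]
print(should_send_pregnancy_ads(history))  # True
```

The unsettling part is not the sophistication of such logic but its simplicity: a handful of ordinary purchases is enough to flag a life event the shopper may not have disclosed to anyone.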
Beyond the risk of being insulting or presumptuous, Target’s strategy proved disastrous for one teenage customer who suddenly began receiving pregnancy-themed advertising mailed to her home. When her father contacted store management, irate at the inappropriateness of sending such material to his teenage daughter, it was left to his daughter to break the awkward news. In reality, a national retailer’s data department had known she was pregnant long before anyone close to her had a clue. We can only speculate as to how this would have played out had the teenager been a regular user of a period-tracking app.
Apple, the company so often involved in the apps, advertising and data collection described above, recently announced that it hopes to make healthcare its greatest contribution to society. The apps it offers and enables evidently play a part in this goal, but the vision far exceeds the systems the company publicly has in place: part of it involves Apple operating its own health clinics, staffed by doctors it employs. These ambitions build on the foundation of Apple’s existing apps, devices and data, and offer an alternative to the established healthcare system. Identifying healthcare as a sizeable and largely untapped business opportunity is becoming increasingly common among big corporations like Apple and Amazon.
The relationship between technology, data and health is still unfolding, but it holds great potential for the future of our lifestyles and the effectiveness of our healthcare. However, members of the public have every right to be concerned about the data-driven ways in which this health monitoring is enabled, especially given questions of privacy, the uses of data and the limited legislation surrounding them. There are evidently many existing and potential health benefits that emerge from these apps and devices, but the question remains for each individual: at what cost?
Quin, M 2021, ‘Smart diagnostics: AI tech can hear COVID in a cough’, RMIT University, 17 June.
Comstock, J 2021, ‘Apple adds walking stability, family and provider health sharing, and more at WWDC 2021’, Mobi Health News, 7 June.
Tiffany, K 2018, ‘Period-tracking apps are not for women’, Vox, 16 November.
Golub, M 2021, ‘Natural Cycles app gets FDA clearance for first wearable birth control’, Trend Watching, 19 July.
Ross, A 2016, Industries of the Future, Simon & Schuster, New York, p. 177.
Ford, M 2015, Rise of the Robots, Basic Books, New York, p. 88.
Winkler, R 2021, ‘Apple Struggles in Push to Make Healthcare Its Greatest Legacy’, The Wall Street Journal, 16 June.
Article supplied with thanks to Michael McQueen. Michael is a trends forecaster, business strategist and award-winning conference speaker.