It’s time for AI to get ethical
Tech companies have been granted access to an almost unlimited amount of data, an issue that raises serious ethical questions when it comes to Healthcare.
by Bart de Witte
In 1998, Yahoo! shared valuable data with two students who were working together on a research paper titled “The Anatomy of a Large-Scale Hypertextual Web Search Engine”. At the time, Yahoo! could not have known that these students would go on to found Google, which today controls 78% of the search market.
It stands to reason that if the company had known what Larry Page and Sergey Brin were capable of and what they would use the data for, it would certainly have been more cautious about sharing it, and the way we use the internet today might look very different.
Since then, similar “giveaways” have occurred multiple times across different industries; the difference today is that we are more aware of the relevance and importance of data and of how it can be exploited as a full-fledged asset. I’m not only talking about Facebook and Cambridge Analytica, but also about something far less widely reported: public organizations which, in order to make up for their structural lack of funds, are providing startups with the valuable information that fuels them.
The digitization of businesses today is fueling a massive acquisition of the public sector by the private one: we are constantly witnessing markets consumed by giant, powerful private tech companies which have been granted access to an unlimited amount of data. This raises serious ethical questions in Healthcare, one of the most sensitive sectors of our society: when this happens, who retains ownership of the data? And how will its use and sharing be regulated?
On the one hand, Big and Smart Data, made possible by AI, are allowing Healthcare Diagnostic Services to advance their research at an exponential pace: the higher the quality of the data we collect, the faster and more promising the research becomes. Ultimately, this translates into better quality for the final user, in our case the patient, and benefits the Healthcare system as a whole.
However, if, as the latest trends seem to confirm, Healthcare Diagnostic Services are consolidated into a handful of global-scale private providers, the data risks being unevenly shared and ending up in private hands. This phenomenon is therefore more likely to amplify inequalities than to reduce them: since Smart Data is essential to Smart Healthcare, what would the consequences be for the public systems of countries with less access to data?
Healthcare delivery is reaching an inflection point. Five disruptive trends, several of which have already transformed other major industries, are triggering changes that will profoundly affect healthcare for years to come. When and how these disruptions will strike will vary across the healthcare ecosystem, and forward-looking funds are already reviewing their investment strategies to determine how best to capitalize on the opportunities and mitigate the risks that disruption will bring.
Another major underlying topic is the thorny problem of data ownership: health data is the most sensitive data one can think of. Who owns it? Patients may believe they are in control of their data, but that is nothing more than a myth; this market has none of the self-regulating forces we see in other industries. I hear a lot of discussion about how to solve this issue, with some pointing to innovative technology, especially Artificial Intelligence and Blockchain, as the possible problem solver. I am not so enthusiastic: there is still a long road ahead before we can actually deploy these technologies in the market.
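To make that proposal a little more tangible: below is a minimal, purely illustrative sketch of what a ledger-based record of data-access consent could look like. The names here (ConsentLedger, record_consent) are hypothetical, this is not based on any real system, and a toy hash chain of this kind says nothing about whether the approach would survive contact with real clinical data, real adversaries or real regulation.

```python
import hashlib
import json
import time

# Illustrative sketch only: an append-only, hash-chained log of patient
# data-access consents. All names are hypothetical, not a real product.

class ConsentLedger:
    def __init__(self):
        self.entries = []  # each entry links to the hash of the previous one

    def _hash(self, payload: dict) -> str:
        # Deterministic hash over the serialized payload
        return hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()

    def record_consent(self, patient_id: str, recipient: str, purpose: str) -> dict:
        entry = {
            "patient_id": patient_id,
            "recipient": recipient,   # who may use the data
            "purpose": purpose,       # what they may use it for
            "timestamp": time.time(),
            "prev_hash": self.entries[-1]["hash"] if self.entries else None,
        }
        entry["hash"] = self._hash(entry)  # hash computed before the field is added
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        # Any retroactive edit breaks the hash chain and becomes detectable
        for i, entry in enumerate(self.entries):
            payload = {k: v for k, v in entry.items() if k != "hash"}
            if entry["hash"] != self._hash(payload):
                return False
            if i > 0 and entry["prev_hash"] != self.entries[i - 1]["hash"]:
                return False
        return True

ledger = ConsentLedger()
ledger.record_consent("patient-001", "university-hospital-lab", "diagnostic research")
print(ledger.verify())  # True unless an entry has been tampered with
```

Even in this toy form, the hard questions remain exactly the ones raised above: who runs the ledger, who decides what counts as valid consent, and who is accountable when it fails.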
According to a recent study, people tend to place a great deal of trust in university hospitals. This is why I advocate turning these research powerhouses into the experimental platforms of our future Healthcare system. By involving academia in the issue, we could finally spark a deep, constructive debate about the use (and misuse) of data. Every other industry is already having this discussion: it’s time we started building our own antibodies and thinking more about AI Ethics, both in terms of research and in the quality of the service we bring to our communities.
We might end up trusting AI to perform surgery on us, but can we trust the humans behind the Smart Data this technology collects?