Artificial intelligence (AI) has started to play an important role in healthcare. For example, one AI system is being used to improve early breast cancer detection.
And AI is steadily getting more powerful. An AI algorithm can analyze millions of publicly available Google Street View images and determine the political leanings of a neighborhood just by looking at the cars on its streets, and AI systems are now outperforming human doctors at diagnosing brain tumors.
Scientists are even trying to give AI systems common-sense knowledge, and AI is helping computers perceive human emotions.
All these advancements in artificial intelligence have created new threats to the privacy of health data.
A new study led by UC Berkeley professor Anil Aswani suggests that current laws and regulations are nowhere near sufficient to keep an individual’s health status private in the face of AI development.
In the work, Aswani shows that by using artificial intelligence, it is possible to identify individuals by learning daily patterns in step data (like that collected by activity trackers, smartwatches and smartphones) and correlating it to demographic data. The mining of two years’ worth of data covering more than 15,000 Americans led to the conclusion that the privacy standards associated with 1996’s HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked.
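The general idea behind this kind of re-identification, linking supposedly anonymous activity patterns to a second, named dataset, can be illustrated with a toy linkage attack. The sketch below is not the study's actual method or data; it uses synthetic weekly step averages, invented names, and a naive nearest-neighbour match purely to show why stripping identifiers from behavioral data does not by itself make it anonymous.

```python
# Illustrative sketch only -- not the study's actual method or data.
# Shows how "anonymized" step patterns might be linked back to named
# demographic records via simple nearest-neighbour matching.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical activity data: one row per person, 7 columns = average
# steps per weekday (Mon..Sun).
n_people = 1000
true_patterns = rng.normal(loc=8000, scale=2500, size=(n_people, 7)).clip(min=0)

# "De-identified" release: record IDs removed, measurement noise added.
anonymized = true_patterns + rng.normal(scale=300, size=true_patterns.shape)

# A second dataset a company might buy: names tied to step patterns
# collected from a phone app (same people, noisy again).
named_records = true_patterns + rng.normal(scale=300, size=true_patterns.shape)
names = [f"person_{i}" for i in range(n_people)]  # hypothetical identifiers

def reidentify(anon_row, reference, labels):
    """Return the label whose step pattern is closest to the anonymous row."""
    distances = np.linalg.norm(reference - anon_row, axis=1)
    return labels[int(np.argmin(distances))]

# How often does this naive linkage attack recover the right identity?
hits = sum(
    reidentify(anonymized[i], named_records, names) == f"person_{i}"
    for i in range(n_people)
)
print(f"re-identification rate: {hits / n_people:.1%}")
```

Even a matcher this simple recovers most identities when behavioral patterns are distinctive, and the study's point is that modern machine learning makes such linkage far more effective at scale.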
“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” Aswani explains. “Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”
Aswani makes it clear that the problem isn’t with the devices themselves, but with how the information they capture can be misused.
Though the study specifically looked at step data, Aswani says the results suggest a broader threat to the privacy of health data. “HIPAA regulations make your health care private, but they don’t cover as much as you think,” he says. “Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It’s supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it.”
Aswani says he is worried that as advances in AI make it easier for companies to gain access to health data, the temptation for companies to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance.
News Source: UC Berkeley