
Maneuvering the complicated intersection of data privacy, health and technology

By Andie Burjek

Apr. 8, 2020

People share their experiences with depression on Twitter to show support for the mental health community. They join private Facebook groups to discuss similar health issues, without realizing that a “private” online group does not actually offer privacy protections. Companies encourage employees to be open about their health in an effort to create a “culture of health.” And employees join “HIPAA-compliant” wellness programs without realizing that the health data they log in various apps may not be protected by any law if the program is voluntary. 

When the Health Insurance Portability and Accountability Act was enacted in 1996, today’s vast digital space didn’t exist. Even if organizations comply with HIPAA, the Genetic Information Nondiscrimination Act and other laws that protect health-related data, that doesn’t necessarily mean the data is protected in many contexts. There are gaps that have yet to be legally addressed. Meanwhile, employees increasingly share health information on digital health apps or online.


A vast amount of employee data is not legally protected. As collectors of employee data, employers should be aware of the health data privacy landscape and the concerns employees may have.

“As much as it pains me to say, [data privacy] is probably nobody’s top priority,” said data privacy attorney Joseph Jerome. “It only becomes their priority when something goes wrong or they get concerned or they hear something in the news.”

Employers in the U.S. and internationally have a growing number of data privacy regulations to pay attention to, as laws like the General Data Protection Regulation in the European Union and the California Consumer Privacy Act and Illinois Biometric Information Privacy Act in the U.S. push the legal environment forward. In this constantly changing landscape, there’s information that can help organizations navigate this complicated intersection more intelligently.


Privacy Law Limitations

There is a lack of understanding of what HIPAA protections apply where, when and to what data, Jerome said. At its core, HIPAA was enacted to facilitate the portability and interoperability of health care records, not for any greater data privacy reason. “We act like this is a health data privacy law, but no. It’s designed to govern data in hospital systems,” he said.

Employers increasingly want to collect data about their employees, he said. They have the opportunity to do so through commercial apps that capture wellness and fitness data. “These are things that people perceive as health data, but they’re not covered by HIPAA, and they were never designed to be covered by HIPAA,” he said.

HIPAA — and therefore what data is considered health information — is limited to covered entities like hospital systems and doctors’ offices. For example, within a health system, a patient’s email address is considered health information under HIPAA, but outside the health system, an email address is not considered health information and does not get HIPAA protection.

HIPAA also doesn’t apply to anonymized data — data sets that have been stripped of personally identifiable information so that the people the data describes remain anonymous.

Further, anonymous data is fair game, legally. “There is no regulation of ‘anonymized’ data. It can be sold to anyone and used for any purpose. The theory is that once the data has been scrubbed, it cannot be used to identify an individual and is therefore safe for sale, analysis and use,” noted “Re-Identification of ‘Anonymized’ Data,” a 2017 Georgetown Law Technology Review article.

A concern here is that anonymous data can be easily re-identified, and it’s tough to hold bad actors accountable for doing so, Jerome said. Further, it’s hard to do anything about it once the data has been re-identified and made public. Unfortunately, there are realistically not enough enforcement resources, he added.

“That’s a real problem right now, not just in health care or employment context, but you’ve got this giant ecosystem where a lot of companies are sharing information and they’re all saying they’re good actors, they’re all saying they’re not re-identifying information, they’re all saying they’re not even using personal information,” he said. “But there’s data leakage all over the place. People are recombining profiles, and it’s very hard to attribute where the information originally came from.”
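To make the linkage mechanics concrete, here is a minimal, hypothetical sketch in Python. The records, field names and matching logic are invented for illustration; real re-identification attacks apply the same join on quasi-identifiers to large public data sets such as voter rolls.

```python
# Hypothetical sketch: re-identifying "anonymized" records by joining
# on quasi-identifiers. All data and field names are invented.

# "Anonymized" health records: names removed, but quasi-identifiers
# (ZIP code, date of birth, sex) remain.
anonymized_health = [
    {"zip": "52242", "dob": "1981-07-04", "sex": "F", "diagnosis": "type 2 diabetes"},
    {"zip": "60601", "dob": "1975-01-15", "sex": "M", "diagnosis": "depression"},
]

# A separate public data set (e.g., a voter roll) that lists names
# alongside the same quasi-identifiers.
public_roll = [
    {"name": "Jane Doe", "zip": "52242", "dob": "1981-07-04", "sex": "F"},
    {"name": "John Roe", "zip": "60601", "dob": "1975-01-15", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def link(record, roll):
    """Return the names in `roll` whose quasi-identifiers match `record`."""
    key = tuple(record[q] for q in QUASI_IDENTIFIERS)
    return [p["name"] for p in roll
            if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]

for rec in anonymized_health:
    matches = link(rec, public_roll)
    if len(matches) == 1:  # a unique match re-identifies the record
        print(f"{matches[0]} -> {rec['diagnosis']}")
```

Scrubbing direct identifiers leaves these quasi-identifier combinations intact, and a unique combination is enough to tie a diagnosis back to a name.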

According to the Georgetown Law Technology Review article, the re-identification of anonymous data can lead to sensitive or embarrassing health information being linked to one’s employer, spouse or community. “Without regulation of re-identified anonymized data, employers, neighbors, and blackmailers have an unprecedented window into an individual’s most private information,” the article said. One of the privacy concerns some people have about their health data is that it could eventually be used against them and that they could suffer real-world implications like the loss of job opportunities, the denial of insurance or higher premiums for insurance. 


Wellness Program Gaps 

The idea behind employee wellness programs is that they’re a win-win, said Anya Prince, associate professor of law and member of the University of Iowa Genetics Cluster. Employees get healthier, and employers get lower health care costs and a more productive workforce.

But wellness programs are often not effective at changing employee health, she said. 

“If the premise is we’re doing this to benefit employees [but] there’s not actually evidence that it’s benefiting employees, the question then becomes why are [wellness programs] continuing to happen?” she said. “The evidence shows that what they’re doing is shifting health care costs back on to employees in various ways. That’s where the concern comes in.” 

Digital health apps on employees’ phones play a part in many workplace wellness programs. But even though these third-party apps are commonplace, the privacy landscape behind them is murky at best.

Prince cited Lori Andrews, professor of law at Chicago-Kent College of Law and director of Illinois Tech’s Institute for Science, Law and Technology. Andrews has studied the types of data that medical apps collect from users, including employees in workplace wellness programs.

“Some of the medical apps are just completely bogus and don’t give you anything helpful back,” Prince said about the general health data privacy environment. “But they are collecting data on you, not just health information but geolocation and other data that’s worth money.” 

Another trend in wellness programs is employers offering employees consumer-directed genetic tests to help them understand what medical issues they may be predisposed to and what preventive measures they can take. According to the Society for Human Resource Management, 18 percent of employers provided a health-related genetic testing benefit in 2018, up from 12 percent in 2016.

Many studies have shown that people are not aware of the Genetic Information Nondiscrimination Act or what privacy protections they have through the law, Prince said. “GINA is quite protective in employment in the sense that employers are not allowed to use genetic information to discriminate, so they can’t make hiring, firing, promotion, wage, any decisions based on genetic information,” she said, adding that genetic information includes family medical history, genetic test results and more.

Still, she said, there are some exceptions with GINA, including private employers with fewer than 15 employees and any employee in a voluntary wellness program. 

There is currently a legal debate over whether wellness programs are truly voluntary or whether employees feel coerced to join them, Prince said. Some wellness programs are participatory — meaning that employees don’t need to hit a certain health outcome target to earn the incentive — but others are health contingent: employees need to lose a certain amount of weight or hit another target measurement to get the program’s financial benefits.

Most of these programs are currently participatory, she said. But if programs that collect genetic information become health contingent, that could raise ethical issues and become more invasive.

“If you think of [BRCA] testing, which [tests for] a predisposition to breast and ovarian cancer, one of the preventive measures right now is to prophylactically remove your breast and ovaries. My dystopian future is the employer saying, ‘Have you finished having kids yet? Get on that, so that you can remove your ovaries,’” she said.

This discussion raises the question of who is ultimately the best actor to push people toward better behaviors and health outcomes, she said. Society has to ask whether employment is the best place to do this.

“In a way the answer is yes because we’ve created a system where health insurance and employment are so intertwined, but maybe employment isn’t the right space to be encouraging people to make the right health choices,” she said. “Maybe that should be a public health system or your primary care physician or researchers.” 

The Pentagon has advised service members not to take consumer genetic tests like 23andMe, said Glenn Cohen, professor of law at Harvard Law School and faculty director of the Petrie-Flom Center for Health Law Policy, Biotechnology and Bioethics.

There’s a major national security reason for this, he said, but part of the reasoning also has to do with protecting service members’ privacy: the military is exempted from GINA, the law that prohibits genetic discrimination by employers.


Consent, Transparency and Communication 

Employers could communicate with employees better, Jerome said. Privacy is more than just legal compliance, which often amounts to a disclaimer in the company handbook or on employees’ computers informing them that everything they do can be tracked and monitored. Such disclaimers set the expectation that employees should expect no privacy in anything they do at work.

While most employers have done their legal duty, they’ve yet to have a conversation with employees about what they’re actually doing with this data, Jerome said. 


“I get that those conversations can be difficult and uncomfortable and frankly might get employees riled up, but I think that’s probably a good thing in the end,” he said. 

Employers — who sit on large troves of employee health data — may have the legal right to share data, but that doesn’t mean employees and other parties won’t criticize them, said Cohen. “They have to be worried a little bit about how it’s going to play as a PR matter and, in an industry where they’re competing for talent, how employees feel about [it],” he said. 

When Ascension Health partnered with Google for the “Project Nightingale” initiative late last year — allowing the tech company access to the detailed personal health information of millions of Americans — it received a lot of backlash. It could be dangerous for an organization like Google, which already has so much of people’s personal data, to get access to people’s health records as well, critics argued. Supporters said it was perfectly legal.

“My recommendation in general is even if you legally have the right to share the data, you may want to think about creating some internal governance mechanisms that have employees involved in trying to decide what gets shared or not,” Cohen said. 

Practically, this could mean that the organization charters a committee that includes employers, employees and subject matter experts who can explain both the uses and the risks of adopting a certain solution, he said.

This could be valuable for employers because better decisions get made and the company’s reputation benefits, he said. When people find out a company has sold its employees’ data, it looks bad if employees had no input in the decision.

For most organizations dealing with health data and other personal data, their reputation rests on how they treat that data, said Ed Oleksiak, senior vice president at insurance brokerage Holmes Murphy. A data breach or misuse of data would be bad press, so the company is incentivized to protect that data and ensure it’s used properly.

When there is a health data mishap, there are a couple of ways organizations can address the breach of trust, he said. They can provide impacted employees some kind of identity theft protection to help them mitigate any harm. Further, the company is required to address whatever caused the breach and do whatever it can to make sure it can’t happen again.

“Whether it’s the employer’s health plan, a hospital system, or a technology provider, everybody’s reputation is contingent on successfully mitigating that,” Oleksiak said. “You just have to start over again, and try to fill that cup of trust back up.” 

Oleksiak also suggested that employers follow a key tenet of collecting and storing only the minimum necessary data. Even if the people involved with an employee health plan want to use patient data for the right reasons, anyone who hacks into these systems can access everything that was stored, including data that never needed to be collected.
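As a rough sketch of what that “minimum necessary” tenet can look like in practice, the Python snippet below keeps an explicit allowlist of fields and drops everything else before a record is stored. The field names and allowlist are hypothetical, not drawn from any particular wellness platform.

```python
# Hypothetical sketch of data minimization: store only an allowlisted
# subset of what a wellness app collects. Field names are invented.

ALLOWED_FIELDS = {"employee_id", "steps_per_day", "program_enrolled"}

def minimize(record: dict) -> dict:
    """Strip a record down to the allowlisted fields before storage."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

raw = {
    "employee_id": "E1042",
    "steps_per_day": 7400,
    "program_enrolled": True,
    "geolocation": "41.66,-91.53",   # collected by the app, never needed
    "heart_rate_history": [72, 80],  # collected by the app, never needed
}

print(minimize(raw))  # only the three allowlisted fields survive
```

The design point is that a breach can only expose what was retained; an allowlist makes “minimum necessary” the default rather than an afterthought.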

Ultimately, this is an issue of balance. According to the aforementioned Georgetown Law Technology Review article, “data utility and individual privacy are on opposite ends of the spectrum. The more scrubbed the data is, the less useful it is.”

Still, there are positive things companies can do with this data, Oleksiak said. No matter what privacy rules and regulations are put in place, a bad actor is going to find a way to do something that’s for their own benefit.

“Hopefully we write rules that go after people that abused their position or access to data, but still allow everybody else that’s doing it for the right reasons to get the job done,” he said.

Andie Burjek is an associate editor at Workforce.com.
