
Ask a Data Ethicist: What Should Be the Limits of Biometric Data Collection in the Workplace?

By Katrina Ingram

There’s a movement underway to capture an increasing amount of data about employees – from facial recognition or fingerprint systems used for tracking time and attendance, to systems that monitor your every keystroke. All of these invasive data collection technologies raise this question: What should be the limits of biometric data collection in the workplace?

Biometric data is a particularly sensitive type of data because it’s literally attached to us. It’s a category of data that represents a range of unique body measurements or behaviors, such as retinal scans, facial recognition, fingerprints, voiceprints, and even our gait. This data is often being used as a means of identification in the workplace, but the stakes are high for workers. You can always change your password, but you can’t (easily) change your face or fingerprints.

Fingerprints – the New Punch Clock

Apple popularized Touch ID, which allows users to confirm their identity with their fingerprints. This technology provides a convenient way to unlock devices and was heralded as a boon for e-commerce when it was first introduced in 2013. Since then, this kind of technology has become cheaper and easier to implement beyond the confines of a smartphone.

For less than $50, anyone can purchase a biometric fingerprint scanner on Amazon. There are also more elaborate systems that can integrate directly with HR software. The old-school punch clock – a staple in workplaces for employees paid by the hour – is increasingly being replaced by fingerprint readers.

Even small-scale employers can use this technology. For example, a recent CBC news story about biometrics in the workplace mentioned charbar – a bespoke restaurant – as one business using it. The 21-year-old employee interviewed for the article had “no concerns” about her data being collected this way, but as someone who works in privacy and data ethics, I found the story raised alarm bells. The stakes of identity theft become much higher when the stolen data includes biometric information. The decisions we make in a cavalier way about our data while we are young can come back to haunt us in the future.

Is This Even Legal? It Depends

Forestry company Canfor implemented fingerprint-scanning systems to upgrade its HR processes. Even though a hundred employees signed a petition speaking out against the system and their union took the case to arbitration, Canfor was legally allowed to proceed. The arbitrator concluded that the information collection was “reasonable and permitted” and that the policy could be applied unilaterally. This case sets a dangerous precedent for Canadian workers who are concerned about biometrics in the workplace.

In Illinois, the Biometric Information Privacy Act (BIPA) provides specific protections for biometric data, recognizing its highly sensitive nature. Recently, a class action suit in Illinois involving 45,000 truck drivers and BNSF Railway, over the collection of biometric handprint data, resulted in a $228 million award in favor of the workers.

Sweat – Another Biometric Frontier

New ways of measuring core body temperature and sweat have made it possible to collect this information in the name of worker safety. Wearable devices are being used to determine whether workers are overheating in their work environment, a condition that can lead to heat exhaustion and “loss of productivity.” This technological solution appears to address the lack of regulation around overly hot working conditions while opening up new concerns about privacy and invasive data collection. The instruments could also detect a range of medical data, including possible heart conditions – data that could be taken into other contexts and used in harmful ways.

A Model Employee

In the dystopian short film A Model Employee, an aspiring DJ – whose day job is working at a local restaurant – agrees to wear a wristband that constantly collects biometric data. This lifestyle data is used to help assess her job performance, monitoring her location and other metrics that count toward her score. In an effort to game the system, she gives the device to her sister, with disastrous consequences. Real-world workplaces now have the same dystopian tracking abilities, but workers have far fewer ways to take the device off or hand it over, because our unique physical traits are necessary components of the system.

Who Is Most at Risk?

It’s not management that is subjected to these types of technologies. As all of these cases illustrate, it is typically hourly paid employees, often in hospitality, manufacturing, and other service sectors. In other words – people with less power.

This isn’t surprising, given that the aim of most of these data collection systems is to exert control. This holds true not only for systems that track time and attendance but also for systems that monitor how work is performed. Even systems implemented for safety reasons primarily affect workers who are most exposed to high-risk work environments – workers who are already vulnerable to harm on the job.

What Should We Do?

Ethically speaking, what should responsible management practices look like in a world that enables the collection and use of biometric data? We can start by considering core ethical principles such as upholding human well-being (beneficence), not creating harm (non-maleficence), upholding justice, and ensuring autonomy. Does the use of biometric systems in the workplace adhere to these principles or run counter to them? 

  • Workers should be afforded the opportunity to say no and have their autonomy respected. Some employees may not be comfortable with biometric data collection, and alternative mechanisms must be in place to give these people options. Some of these systems, such as facial recognition, might not work well for all people, creating instances of bias and discrimination.
  • Employers should think through the implications of collecting and storing biometric data. Training for both employers and employees about the privacy risks surrounding biometrics is needed. Vendors will always communicate the benefits of these systems, and many employers might see only the upside and not the risks. Without a proper understanding of the risks, employees cannot provide meaningful consent because they can’t fully weigh the consequences. Some experts say that, given the imbalance of power in the workplace, employees can never really give meaningful consent.
  • Insurers might also treat businesses that collect, process, and store biometric data as higher risk, perhaps charging those companies higher premiums. This could be part of the cybersecurity questions that businesses need to answer. There is not only the risk of cyberattacks and data breaches involving this highly sensitive data, but also the risk that these businesses will face legal challenges down the road.

Given the high stakes of this type of data and the many other means of accomplishing tasks like time and attendance tracking, we should take a precautionary approach. Certainly efficiency – saving time or money – does not seem like a high enough bar on its own to warrant this type of data collection. In the case of safety applications such as heat-monitoring wearables, we might consider the least invasive way to accomplish the task. This could be done by setting limits on the temperature or type of environment in which workers are asked to perform their tasks, as well as respecting autonomy and human dignity by allowing workers to say when they need a break. These types of solutions don’t need to involve biometric data. 

Send Me Your Questions!

I would love to hear about your data dilemmas or AI ethics questions and quandaries. You can send me a note at hello@ethicallyalignedai.com or connect with me on LinkedIn. I will keep all inquiries confidential and remove any potentially sensitive information – so please feel free to keep things high level and anonymous as well. 

This column is not legal advice. The information provided is strictly for educational purposes. AI and data regulation is an evolving area and anyone with specific questions should seek advice from a legal professional.