Facebook whistleblower Frances Haugen’s testimony to Congress about abusive algorithms and AI referenced Big Tobacco’s enormous influence and fall in the latter half of the 20th century. What’s challenging about the comparison, however, is that smoking is unequivocally bad for you. AI, by contrast, offers great promise to enhance our lives, which is exactly why the software industry needs to avoid the catastrophic mistakes of tobacco. Companies should never knowingly ignore or try to spin the risks and harms of these technological advances. Now is the time to expect greater organizational transparency, governance, and accountability.
The Past Story of Tobacco Regulation
Remember when everything changed for tobacco? The harm of tobacco wasn’t some sudden revelation – for decades, we knew the risks of being a smoker. Yet we allowed manufacturers to market, promote, and dramatize the coolness and social acceptance of smoking. The Marlboro Man was one of the most recognizable icons of American culture and life anywhere in the world.
Even as billions of dollars in profit were being made, an enormous body of evidence was being amassed showing conclusively that smokers and nonsmokers alike were increasingly suffering health problems caused by cigarette smoke.
Throughout the first half of the 20th century, the rate of lung cancer among white men in the United States was on the rise, but there was no indication as to what the root causes were. In 1950, five epidemiological studies were published linking lung cancer to cigarette smoking. Later that year, the first filter for cigarettes was introduced to curb the rate of lung cancer across the country. In 1954, the first lawsuit against the tobacco industry was filed by a lung cancer patient on the basis of negligence. However, the suit was dropped 13 years later due to mounting pressure from the tobacco industry.
In 1972, the Surgeon General found that secondhand smoke – or environmental tobacco smoke – increased nonsmokers’ risk of lung cancer. By the mid-1970s, the federal government began regulating tobacco administratively, outlawing tobacco use in all federal government facilities in 1979. However, it was not until 1988 that there was legislative action on tobacco use at the federal level: Smoking was banned on domestic flights of two hours or less, a prohibition extended to virtually all domestic flights in 1990.
In November 1998, the tobacco industry agreed to a Master Settlement Agreement with 46 states, paying out $206 billion through 2025. The agreement included several public health provisions regulating tobacco manufacturing and marketing. It was not until 2009, however, that federal oversight was codified in the Family Smoking Prevention and Tobacco Control Act, which gave the FDA the authority to regulate the marketing, manufacturing, and sale of all tobacco products.
That progress came 60 years too late for millions of Americans, and we may never fully grasp the costs of the public health toll.
This Didn’t Start Yesterday
Are we at the beginning – or smack in the middle – of a watershed moment that fundamentally transforms the U.S. position on regulating the internet, software, and algorithms, which today operate without comprehensive consumer protections?
I’m 40, and as both a consumer and a business operator I have had a front-row seat while the internet became the connective tissue of our entire society. At Yahoo, I was part of the team that launched one of the first-ever behavioral-targeting advertising platforms. At HotJobs, we built an advertising platform that leveraged knowledge about consumers’ behavior on the internet to promote career opportunities to the individuals we believed would be interested in them.
If you were a nurse, or a truck driver, or maybe a recent college graduate searching for jobs or spending time reading articles about career opportunities and best practices for interviewing, we turned around and proactively brought you opportunities related to your interests. These behavioral ads quickly became the primary driver of revenue and growth in the business. Our mission and purpose were noble – make it easier for people to find rewarding work – but we absolutely had huge commercial ambitions and reaped big rewards too.
For more than 20 years, all of us have increasingly built our lives around the internet and social media platforms. We’ve done this because, for the most part, the internet has created greater connectivity between humans spread across the globe. By and large, we have benefited from the tremendous advancements of internet access, computing power, and software. We will certainly continue to benefit from more advancements; however, we’re reaching a critical mass of consumer awareness and concern about the other side of the coin.
Senator Ed Markey is a sponsor of the Algorithmic Justice and Online Platform Transparency Act, which at its root pushes for transparency and accountability in the use of algorithms that impact our lives. Europe is materially ahead of the U.S. in developing regulation of AI and algorithms, as seen in recent regulatory advancements from the European Commission, the U.K., Germany, and the Netherlands, to name but a few.
Frances Haugen’s whistleblower evidence and testimony doesn’t reveal any proprietary knowledge about how the algorithms are built. She is offering transparency into a lack of organizational oversight and governance at Facebook. Her testimony shows a company fully aware of the potential risks and harmful impacts of algorithmic targeting, yet with a systemic lack of action and accountability. It is now impossible for any corporation using algorithms to plead ignorance of the risks presented by using increasingly advanced models, algorithms, and AI to make decisions.
Some Basic and Reasonable Expectations
- We should ask companies using algorithms to establish intentional governance and oversight, and we should expect to see the evidence of this governance.
- We should expect companies to define governance programs and to make available the evidence and results of their ongoing management of those programs.
- We should expect every “consequential” algorithm – i.e., one that has tangible consequences on people’s lives – to have the ability for its decisions to be verified by independent and objective parties.
- We should discard the idea that AI is somehow accountable for itself – people build these systems and should be accountable for their successes and their failures.
- As Haugen described, we can provide oversight without asking companies to divulge their intellectual property and without reducing the economic incentives for competition and innovation.
- We don’t need to reinvent the wheel when approaching principles for algorithmic governance. We should leverage decades of experience and learnings about methods of regulating consequential industries and technologies that impact our lives.
- We should not depend on or expect any single federal agency or regulator to have the technical capacity to comprehensively oversee or understand the technology we have today. Nor do we want regulation to stifle innovation by overprescribing how software is built.
While I started this piece contrasting tobacco’s regulatory watershed moment with Facebook and the internet, let me be clear. Software, algorithms, and the internet can be and generally are a hugely positive part of our lives. We should all want to support brilliant minds’ ability to build software and algorithms that improve our lives. I want to leverage big data and algorithms to fight and hopefully bring an end to cancer. The potential to build financial systems and products for consumers that target and remove centuries of human bias and discriminatory practices is inspiring. I love discovering new products, great TV shows, and fantastic professionals in my network thanks to algorithmic recommendations.
Facebook just might be the watershed regulatory moment for algorithms and software, but I hope regulators move forward balancing our protection with an equal or greater appreciation for how much good will come from these technologies.