Internet Privacy Roadmap Developed at Shasha Seminar

Mike Mavredakis | April 4, 2023
Assistant Professor of Computer Science Sebastian Zimmeck

Who collects your data? Which data? Why? Where does it go? Who is buying it? What are they doing with it? Can we protect our data or choose who gets to use and sell it? What laws are in place to protect your data? What’s the path forward for data privacy?

These were all questions tackled by data privacy experts from Wesleyan, New York University, Google, Harvard, Boston University, Carnegie Mellon, and others who spoke at the annual Shasha Seminar for Human Concerns on March 31 and April 1.

“The importance of freedom from unauthorized intrusion is on the top of mind for many,” Provost and Senior Vice President for Academic Affairs Nicole Stanton said in her introduction to the seminar. “Privacy is a human right, yet privacy is threatened by new technological developments. And when privacy is under threat, so are civil rights and our democratic processes.”

Event organizer Assistant Professor of Computer Science Sebastian Zimmeck created a roadmap toward better universal data privacy, synthesized from the presentations of all speakers. Zimmeck clarified that the roadmap does not represent a consensus among the speakers; some even disagree with parts of it. Still, he said, it was important to get these core points out to the public.

The roadmap calls for a comprehensive federal internet privacy law; privacy mechanisms built into the structure of the internet itself, moving away from individual notice-and-consent models; increased incentives for businesses to focus on user privacy; and better privacy education in schools. At the same time, it does not call for doing away with advertising entirely, so long as advertising does not endanger people’s privacy.

“What has become clear is that we must evolve the structure of the Internet towards a privacy-preserving system. Implementing privacy into technical standards would go a long way towards that goal,” Zimmeck said. He added that privacy is a multi-dimensional problem. “That is why it is so important to have experts here to tackle the privacy problem from all its different angles. It is not enough to just look at the technological challenges, but also consider sociological and business aspects, for example.”

Where your data goes and who has access to it may not seem important at first glance, but it can have dramatic impacts when in the wrong hands.

Assistant Professor in the Science in Society Program Mitali Thakor speaks at the Shasha Seminar for Human Concerns.

For example, there are concerns that law enforcement could subpoena search engine records to see whether someone searched for how to get an abortion in a state where it is now illegal following the Dobbs v. Jackson Women’s Health Organization (2022) Supreme Court decision. Mitali Thakor, assistant professor in the Science in Society Program, raised a concern shared by many activists: could data collected by period-tracking applications be used to harm individuals in states where abortion and privacy protections have been diminished post-Dobbs?

“The tension between privacy and surveillance, tech immunity versus responsibility, is a thorny issue for feminists and other activists,” Thakor said. “There’s a profound opportunity here for reconfiguring our values around privacy.”

There have also been cases of abusers using malware programs that monitor their victims’ device information—like location and messages—to aid and abet stalking and violent behavior, according to Damon McCoy, associate professor of computer science and engineering at NYU.

“We need to design for the experiences and threats faced by at-risk populations—not just a mythical ‘average’ user,” McCoy said.

Ada Lerner, assistant professor of computer science at Northeastern, pointed to the way companies in the fanfiction space, whose communities often include people with vulnerable identities requiring careful discretion, have built in measures to protect those communities.

“It’s very valuable to look at communities and to look at privacy as how can we make it safe and trustworthy for people to share content?” Lerner said. “Which might be sort of counterintuitive as a notion of privacy. We often think of privacy as being how can I hide information, but a lot of scholars recently have been looking instead at ‘how can I make it safe?’”

The current model relies heavily on self-regulation by large technology companies like Facebook and Google. To move away from this model, the roadmap calls for a comprehensive federal internet privacy law that establishes a minimum level of protection at or above that of existing state laws.

“We need to provide transparency to people of what happens with their data and give them adjustable and easy-to-use privacy controls,” Zimmeck’s roadmap said.

The other side of the privacy argument rests on the internet’s economic system. Much of the content on the internet is funded through advertising.

Targeted advertising is arguably more profitable for advertisers because it leads to higher conversion rates, since consumers see products tied to their individual needs and interests. Because advertisers fund much of the content on the internet, outlets could struggle to survive on a donation- or subscription-only model, according to Wendy Seltzer, who spent 10 years as Strategy Lead and Counsel to the World Wide Web Consortium.

However, there are issues with how those targeted advertisements come about. Advertising companies must get preference data from somewhere, and they often collect it without people’s true knowledge or consent.

For much of the last two decades, digital advertisers have relied on user data acquired through tracking cookies—bits of information placed on your device by websites to track your activity and preferences.
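To illustrate the mechanics, here is a minimal Python sketch using only the standard library; the cookie name tracker_id and the one-year lifetime are hypothetical choices, not a description of any particular advertiser’s system.

```python
from http.cookies import SimpleCookie
import uuid

# First visit: the site's response assigns the browser a random
# identifier via a Set-Cookie header (the cookie name is hypothetical).
response_cookie = SimpleCookie()
response_cookie["tracker_id"] = uuid.uuid4().hex
response_cookie["tracker_id"]["max-age"] = 60 * 60 * 24 * 365  # persist for a year
print(response_cookie.output())  # -> Set-Cookie: tracker_id=...; Max-Age=31536000

# Later visits: the browser echoes the cookie back with each request,
# letting the site (or an embedded third-party tracker) link the visits.
request_cookie = SimpleCookie()
request_cookie.load(f"tracker_id={response_cookie['tracker_id'].value}")
print("recognized returning visitor:", request_cookie["tracker_id"].value)
```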

Lorrie Cranor, director and Bosch Distinguished Professor at the CyLab Security and Privacy Institute at Carnegie Mellon University, said that companies are becoming more reliant on cookie-consent interfaces—banners and buttons warning of cookie collection—to stay compliant with existing regulations.

Often these interfaces do not fully or clearly describe what data is being mined from the user. They are also often difficult to use and do not explain the consequences of ignoring them. The roadmap calls for fewer cookie-consent interfaces, which can become repetitive and put the onus of privacy protection on individual people. Instead, it says, technology should be developed to learn user preferences across websites and apply privacy protections accordingly, as sketched below.
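One concrete form such technology can take is a single browser-wide signal that every site reads, rather than a banner on each site. As a rough sketch, here is a toy Python server that honors the Sec-GPC: 1 request header sent by browsers with the Global Privacy Control signal enabled; the handler and response strings are illustrative, not a production design.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PrivacyAwareHandler(BaseHTTPRequestHandler):
    """Illustrative handler honoring a browser-wide opt-out signal;
    Sec-GPC is the header sent when Global Privacy Control is enabled."""

    def do_GET(self):
        opted_out = self.headers.get("Sec-GPC") == "1"
        body = (b"opt-out honored: tracking disabled" if opted_out
                else b"no signal received: default behavior")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PrivacyAwareHandler).serve_forever()
```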

In his talk on Saturday, Garrett Johnson, marketing professor at Boston University, suggested several ways to deliver targeted advertising without compromising user privacy.

Zimmeck presented on cryptography and its uses in securing user data on the internet. Cryptography is a method of protecting information through the use of mathematical functions.

“The good news is that cryptography works,” Zimmeck said. “For example, the https connection that we use to make a website secure, we can rely on that and it is provably secure.”
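As a small illustration of that reliance, the Python sketch below opens the same kind of TLS connection that an https:// request uses; the handshake fails unless the server presents a certificate that verifies cryptographically. The host name example.com is only an example.

```python
import socket
import ssl

hostname = "example.com"  # any https site; chosen only for illustration
context = ssl.create_default_context()  # verifies certificates against system CAs

# Open a TLS connection the way an https:// request does; wrap_socket
# raises an error if the certificate or handshake cannot be verified.
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("negotiated protocol:", tls.version())  # e.g., TLSv1.3
        print("cipher suite:", tls.cipher()[0])
        cert = tls.getpeercert()
        print("certificate subject:", dict(item[0] for item in cert["subject"]))
```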

Amir Houmansadr and Gary King each discussed ways that technology could be used to add privacy protections.

Houmansadr, associate professor of computer science at UMass Amherst, spoke about the tension between machine learning and privacy. Because machine learning models require data to train, their use can have substantial privacy implications: people’s sensitive data might be included in training datasets and later revealed when the models are used. Houmansadr suggested federated learning as one way to avoid these risks. In federated learning, training data stays on people’s individual devices, which share only model updates rather than the raw data itself.
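A toy sketch makes the idea concrete. The Python example below fits a trivial “model” (just a mean) on each device and averages only the resulting parameters on a central server; real federated learning exchanges model updates such as gradients, but the privacy structure is the same: the raw data never leaves the device.

```python
import random

def local_update(private_data):
    """Runs on the device; only the fitted parameter is returned."""
    return sum(private_data) / len(private_data)

# Hypothetical per-device datasets; in a real system these never
# leave the devices that generated them.
devices = [[random.gauss(5.0, 1.0) for _ in range(100)] for _ in range(10)]

# The server sees only ten local parameters, not the 1,000 raw points.
local_params = [local_update(data) for data in devices]
global_param = sum(local_params) / len(local_params)
print(f"global model parameter: {global_param:.3f}")
```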

King, director of the Institute for Quantitative Social Science at Harvard University, promoted the idea of differential privacy—adding statistical noise to datasets or query results to protect the individuals they describe. Differential privacy should be used much more widely, Zimmeck said, because so much useful data exists that could support research aimed at improving people’s lives.
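For a flavor of how this works, here is a minimal Python sketch of the standard Laplace mechanism for a counting query; the dataset and the privacy budget ε are invented for illustration. Because a count changes by at most 1 when any one person’s record changes, adding noise drawn from a Laplace distribution with scale 1/ε yields ε-differential privacy.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon):
    """Count matching records, then add noise calibrated to the count's
    sensitivity (1) and the privacy budget epsilon."""
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical dataset: ages of survey respondents.
ages = [23, 35, 41, 29, 52, 61, 19, 44, 38, 27]
noisy = private_count(ages, lambda age: age >= 40, epsilon=0.5)
print(f"noisy count of respondents 40 or older: {noisy:.2f}")  # true count: 4
```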

Endowed by James Shasha ’50, P’82, the Shasha Seminar for Human Concerns supports lifelong learning and encourages participants to expand their knowledge and perspectives on significant issues.