Platform Accountability in Technology Policy (2024-2025)

Background

Platform accountability, or the evaluation of how a platform (e.g., Facebook) demonstrates secure data management practices, is central to assessing a platform's impact on security and society. Legislation and policy must be carefully constructed to monitor and regulate responsible industry behavior; however, experts are divided on the best approach.

Key themes such as transparency, liability, responsiveness and enforcement are common across different platform accountability approaches. Yet while some recommendations have become and remained influential (e.g., the Fair Information Practice Principles developed by the Department of Health, Education, and Welfare advisory committee in 1973), there is no clear standard for accountability.

There are a few key areas where research is vital to understanding platform accountability more holistically, including understanding who is monitoring and enforcing compliance (e.g., government or third parties), the strength of whistleblower protections, and avenues through which responsible behavior is rewarded and irresponsible behavior is discouraged. Examining these areas will help provide more comprehensive, transparent and enforceable frameworks for platform accountability across a variety of contexts.

Project Description

This project team will develop a platform accountability framework for how companies can demonstrate that they are behaving responsibly and for how governments can measure that behavior more effectively. Team members will examine the intersections between business, law and policy to understand platform accountability and the processes for evaluating companies’ compliance with standards.

First, team members will perform a program evaluation of models for supervision and monitoring through a comparative case study. Certain frameworks with supervision and monitoring provisions, such as the European Union’s General Data Protection Regulation (GDPR), have been in force for over five years. The team will examine whether and how companies have changed internal policies, practices or mechanisms to comply with these regulations’ provisions. The same can be done for other regulations, such as those in Singapore or Canada. While performing this study, team members will also examine how other industries handle the question of supervision and monitoring, including when they intervene.

Team members will also explore the role that whistleblowers can play in platform accountability, including the law and policy protections that allow whistleblowers to be effective. One key question will be whether third-party oversight opportunities and mechanisms for dispute resolution exist. Where such mechanisms do exist, team members will consider how they can be improved.

Anticipated Outputs

Case study; research report; publications; opinion pieces; blog posts; social media awareness training; materials for a conference at Duke and in Washington, D.C.

Student Opportunities

Ideally, this project team will include 6 graduate students and 6 undergraduates with interests in public policy, political science, computer science, economics, law, business, statistics and engineering. Skills in legal analysis, data analysis, private equity research and/or policy analysis are beneficial but not required.

Team members will break into two subgroups: Accountability and AI; and Addressing Social Media Harms. Subgroups and the larger team will meet weekly.

All team members will have the opportunity to learn how law, policy and technology interact around platform accountability issues. They will gain an understanding of how organizations that develop and/or operate tech platforms can demonstrate that they are behaving responsibly. Team members will learn to recognize frameworks that governments can use to measure whether tech platforms are behaving responsibly; undertake research on legal and policy issues; analyze quantitative and qualitative data; and produce written reports. Students will also have the opportunity to network with experts in this field and possibly travel to Washington, D.C. in Spring 2025.

Timing

Fall 2024 – Spring 2025

  • Fall 2024: Submit Institutional Review Board (IRB) protocol; conduct literature review and background research; develop research materials and methodologies
  • Spring 2025: Continue developing research materials; produce written reports, blog posts and opinion pieces; develop conference presentation or session at Duke and/or Washington, D.C.

Crediting

Academic credit available for fall and spring semesters

Team Leaders

  • David Hoffman, Sanford School of Public Policy
  • Kenneth Rogerson, Sanford School of Public Policy

Faculty/Staff Team Members

  • Robyn Caplan, Sanford School of Public Policy
  • Aaron Chatterji, Fuqua School of Business
  • Aria Chernik, Social Science Research Institute|Innovation & Entrepreneurship
  • Jolynn Dellinger, Kenan Institute for Ethics|Duke Law
  • Pardis Emami Naeini, Arts & Sciences-Computer Science
  • Nita Farahany, Duke Law|Arts & Sciences-Philosophy
  • Philip Napoli, Sanford School of Public Policy-DeWitt Wallace Center for Media and Democracy
  • Arti Rai, Duke Law
  • Spencer Reeves, Sanford School of Public Policy-DeWitt Wallace Center for Media and Democracy