Accountability Framework

Supporting good digital governance in everyone’s interest.

The Internet Commission engages with technology companies, research institutions, political institutions, and NGOs to advance digital responsibility through confidential disclosure and independent evaluation. Each step in our process is voluntary: participating organisations retain complete control over what information is shared, and with whom.

The information shared with the Internet Commission enables independent analysis and the development of deep insights. These can be used to drive improvements within companies and to demonstrate accountability to external stakeholders. The process identifies best practice, establishes industry benchmarks and supports leaders in both the public and private sectors as they address complex digital policy challenges.

Recent events have troubled the climate for internet platforms: advertisers face increased brand risk, public authorities seek to extend statutory regulation, and companies are scrambling to protect themselves against “regulation by outrage”. The Internet Commission is neither an industry association nor a pressure group. Its unique position is as an independent, fair broker of insights that drive better internal governance and smarter regulation.

We plan to publish our first accountability framework (the questions) by summer 2019 and are actively seeking more companies to help shape it. Our first report (the evaluation) will follow later in the year.

Get involved

By providing detailed, confidential feedback, organisations help to ensure that our questions are practical, relevant, reasonable and proportionate. Contact: research@inetco.org


We are consulting on qualitative and quantitative questions to establish a rounded picture of how companies are tackling illegal content, hate speech, cyberbullying, self-harm and fake news on their platforms. Our draft includes 45 questions in six sections:

Reporting: how is the platform alerted to potential breaches of its rules?
Moderation: how are decisions made to take action about content?
Notice: how are flaggers and content creators notified?
Process of appeal: how can decisions be challenged and what happens when they are?
Resources: what human and other resources are applied to managing content?
Governance: how are content management processes, policies and strategies overseen?

Our questions inform, and are informed by, growing research collaborations with leading academic institutions. We have integrated the 2018 Santa Clara Principles on Transparency and Accountability in Content Moderation and have followed their development with great interest.