OK, thank you, members of the Commerce Committee for the opportunity to speak with the American people about Twitter and Section 230.
My remarks will be brief, so we can get to the questions.
Section 230 is the most important law protecting Internet speech, and removing Section 230 will remove speech from the Internet.
Section 230 gave Internet services two important tools.
The first provides immunity from liability for users' content.
The second provides Good Samaritan protections for content moderation and removal, even of constitutionally protected speech, as long as it's done in good faith.
The concept of good faith is what’s being challenged by many of you today.
Some of you don't trust we're acting in good faith. That's the problem I want to focus on solving.
How do services like Twitter earn your trust?
How do we ensure more choice in the market if we don’t?
There are three solutions we'd like to propose to address the concerns raised, all focused on services that decide to moderate or remove content.
They could be expansions to Section 230, new legislative frameworks, or a commitment to industry-wide self-regulation best practices.
The first is requiring a service's moderation process to be published.
How are cases reported and reviewed?
How are decisions made?
What tools are used to enforce?
Publishing answers to questions like these will make our process more robust and accountable to the people we serve.
The second is requiring a straightforward process to appeal decisions made by humans or by algorithms.
This ensures people can let us know when we don’t get it right.
So we can fix any mistakes and make our processes better in the future.
And finally, much of the content people see today is determined by algorithms, with very little visibility into how they choose what to show.
We took a first step in making this more transparent by building a button to turn off our home timeline algorithms.
It's a good start, but we're inspired by the market approach suggested by Dr. Stephen Wolfram before this committee in June 2019.
Enabling people to choose algorithms created by third parties to rank and filter their content is an incredibly energizing idea that's within reach.
Requiring, one, moderation process and practices to be published; two, a straightforward process to appeal decisions; and three, best efforts around algorithmic choice: these are suggestions to address the concerns we all have going forward.
And they’re all achievable in short order.
It’s critical as we consider these solutions we optimize for new start-ups and independent developers.
Doing so ensures a level playing field that increases the probability of competing ideas to help solve problems.
We must not entrench the largest companies any further.
Thank you for the time and I look forward to a productive discussion to dig into these and other ideas.