How do services like Twitter earn your trust? How do we ensure more choice in the market if we don't?
There are three solutions we'd like to propose to address the concerns raised, all focused on services that decide to moderate or remove content.
These could be expansions to Section 230, new legislative frameworks, or a commitment to industry-wide self-regulation best practices.
The first is requiring a service's moderation process to be published. How are cases reported and reviewed? How are decisions made? What tools are used to enforce them?
Publishing answers to questions like these will make our process more robust and accountable to the people we serve.
The second is requiring a straightforward process to appeal decisions made by humans or by algorithms.
This ensures people can let us know when we don't get it right, so we can fix any mistakes and make our processes better in the future.
And finally, much of the content people see today is determined by algorithms, with very little visibility into how they choose what they show.
We took a first step in making this more transparent by building a button to turn off our home timeline algorithms.
It's a good start, but we're inspired by the market approach suggested by Dr. Stephen Wolfram before this committee in June 2019.
Enabling people to choose algorithms created by third parties to rank and filter their content is an incredibly energizing idea that's within reach.
Requiring, one, moderation processes and practices to be published; two, a straightforward process to appeal decisions; and three, best efforts around algorithmic choice: these are our suggestions to address the concerns we all have going forward, and they're all achievable in short order.
It's critical that, as we consider these solutions, we optimize for new startups and independent developers. Doing so ensures a level playing field that increases the probability of competing ideas helping to solve problems.
We must not entrench the largest companies any further.
Thank you for your time, and I look forward to a productive discussion digging into these and other ideas.
Let me be clear: we approach our work without political bias, full stop.
To do otherwise would be contrary to both our business interests and our mission, which compels us to make information accessible to every type of person, no matter where they live or what they believe.
At the end of the day, we all share the same goals: free access to information for everyone, and responsible protections for people and their data.
We support legal frameworks that achieve these goals.
I look forward to engaging with you today about these important issues, and answering your questions.
Section 230 helped create the Internet as we know it.
It has helped new ideas get built and our companies spread American values around the world, and we should maintain this advantage.
But the Internet has also evolved, and I think Congress should update the law to make sure it's working as intended.
One important place to start would be making content moderation systems more transparent.
Another would be to separate good actors from bad actors by making sure that companies can't hide behind Section 230 to avoid responsibility for intentionally facilitating illegal activity on their platforms.
We’re open to working with Congress on these ideas and more.
I hope the changes that you make will ring true to the spirit and intent of 230.
There are consequential choices to make here, and it’s important that we don’t prevent the next generation of ideas from being built.