WEF 2022: Making the internet world safer by tackling harmful content

From health misinformation to child sexual abuse and exploitation to violent extremist and terrorist content, parts of the web can be dangerous.

Delivering safe and secure online experiences is essential for global businesses, civil society groups and individuals alike. Stakeholders from multiple industries and geographies agree there is an urgent need for deliberate global coordination to improve digital safety.

The Forum’s Global Coalition for Digital Safety is accelerating public-private cooperation to tackle harmful content online by bringing together a diverse group of leaders who are well placed to exchange best practices for new online safety regulation and help millions of connected citizens improve digital media literacy.

Engaged partners, including Google, Microsoft, AWS, Meta, the Oxford Internet Institute, UNICEF, INTERPOL and several government ministers, are developing principles and practices that address all aspects of digital safety in a proportionate manner.

Since the coalition launched in June 2021, companies and public institutions have collaborated on clarifying the actions businesses need to take to improve the safety of their platforms, while regulators across jurisdictions have shared key learnings from forthcoming legislation, such as the UK’s Online Safety Bill. Coalition members are also looking ahead and considering a range of mechanisms to embed safety across the metaverse.

The challenge with digital safety

More than 4.7 billion people use the internet and McKinsey projects that roughly $2.5 trillion will be transacted online by 2025. Global estimates suggest that one in three internet users is a child under 18 years of age and one in three children is exposed to sexual content online.

Many digital platforms are faced with a complex challenge when it comes to regulating the internet. How do they keep users safe from harmful content without stifling free expression?

Private companies have the right to moderate content on their platforms according to their own policies, but there is an ongoing tension between too little and too much content being actioned by platforms that operate globally. Striking the right balance requires significant collaboration and continual monitoring of safeguards.

Our approach to keeping connected citizens safe

The Forum is supporting collaboration on regulation which addresses the safety of many vulnerable groups, especially children.

The Global Coalition for Digital Safety is considering both preventive and proactive tactics, such as safety by design and content moderation, as well as regulation that addresses online safety risks according to a company’s size, its role in the digital ecosystem, the users it serves and many other factors.

While there is no one-size-fits-all solution, the Forum is forging a way forward by creating action-oriented reports and a follow-through agenda for stakeholders who are working on digital safety from different perspectives. The Coalition has launched an expert advisory group that will be consulted to ensure the best multistakeholder insights on digital safety are brought to the forefront for implementation by members, as they drive public-private cooperation on three questions:

How do international human rights principles translate in a digital context?

What tech, policy, process and design interventions can improve online safety?

How should platforms assess digital safety risks and measure impact of interventions?

Members contributing to the coalition are also evaluating the range of existing efforts to improve online safety, and many have already implemented the following principles and practices to tackle harmful and illegal activities, behaviours, and content on the web:

Five Eyes Voluntary Principles to Counter Online Sexual Exploitation and Abuse

The Christchurch Call to Action to Eliminate Terrorist and Violent Extremist Content Online

The Santa Clara Principles on Transparency and Accountability in Content Moderation

Safety by Design

Republished from the World Economic Forum


