Online Safety (Miscellaneous Amendments) Bill Passes Parliament - Social Media - Singapore

December 06, 2022

By Rajesh Sreenivasan, Steve Tan, Benjamin Cheong, Lionel Tan and Tang

Rajah & Tann


Introduction

On 9 November 2022, Parliament passed the Online Safety (Miscellaneous Amendments) Bill ("Bill"). The Bill aims to improve the safety of online spaces for users in Singapore, particularly children.

The Bill introduces new regulations and obligations for providers of internet access services and online communication services, and empowers the Infocomm Media Development Authority ("IMDA") to issue directions to block harmful content. In her speech at the second reading of the Bill in Parliament (available here), the Minister for Communications and Information, Mrs Josephine Teo, noted that the new regulations for online communication services are initially intended to apply to social media services, which are considered a priority area given the higher incidence of harmful online content on social media platforms.

The Bill introduces a new Part 10A of the Broadcasting Act 1994, empowering IMDA to better regulate the online communication services accessible to end-users in Singapore through the following measures: (i) issuing online codes of practice for providers of regulated online communication services; and (ii) issuing blocking directions to internet access service providers and online communication service providers to deal with "egregious content". In relation to codes of practice, IMDA has also issued a draft Code of Practice for Online Safety ("Draft Code").

We previously published a Legal Update on the first reading of the Bill in Parliament, available here, which covers the Bill in more detail. This Update provides a summary of the key provisions of the Bill and the Draft Code.

Codes of Practice

Part 10A empowers IMDA to issue online codes of practice applicable to providers of regulated online communication services. A code of practice may provide for:

  1. Requirements to put in place and apply appropriate systems or processes to minimise risk;
  2. Practical guidance on the content to be covered;
  3. The procedures to be followed to comply with the duty to comply with an online code of practice; and
  4. Requirements for the provider to participate or cooperate in any investigation of its regulated online communication service by a suitably qualified person.

Draft Code

In line with the focus on social media services noted above, IMDA has issued a draft Code of Practice for Online Safety ("Draft Code"), which sets out the obligations of designated social media services (each, a "Service"). The Draft Code is intended to provide an initial indication of how IMDA intends to implement the code of practice provisions of the Bill, and will continue to evolve.

The Draft Code contains the following key provisions:

  1. User safety – The Service must take measures to minimise users' exposure to harmful content, enable users to manage their own safety on the Service, and mitigate the impact that the propagation of harmful content may have on users, particularly children. These include the following measures:
    • Guidelines and standards – The Service must put in place a set of community guidelines and standards, as well as content moderation measures.
    • User empowerment – Users must have access to tools that allow them to manage their own safety, and that effectively minimise their exposure to, and mitigate the impact of, harmful content and unwanted interactions on the Service. Users must be able to easily access information relevant to online safety on the Service, which should include localised information such as safety resources or support centres in Singapore, where available.
    • Proactive detection and removal – Technology and processes must be put in place to proactively detect and promptly remove child sexual exploitation and abuse material and terrorism content, where technically feasible.
    • Measures for children – Children's exposure to inappropriate content should be minimised through appropriate and proportionate measures, including a set of community guidelines and standards, as well as content moderation measures suitable for children. These community guidelines must be published and must address, at a minimum, sexual content, violent content, suicide and self-harm content, and cyberbullying content.

      Children or their parents/guardians should have access to tools that enable them to manage children's safety, effectively minimise children's exposure to harmful and/or inappropriate content, and mitigate unwanted interactions on the Service. Unless the Service restricts children's access, children should be provided with differentiated accounts whose settings for such tools are set to more restrictive levels by default.

  2. User reporting – Any person must be able to report unwanted content or interactions to the Service. The reporting and resolution mechanism made available to users must be effective, transparent, easy to access and easy to use.
    • User reports must be assessed, and appropriate action taken by the Service, in a timely and diligent manner proportionate to the severity of the potential harm.
    • If the Service receives a report that is not frivolous or vexatious, the reporting user must be promptly informed of the Service's decision and any action taken in relation to the report. If the Service decides to take action against reported content, the account holder concerned must likewise be promptly informed of the Service's decision and action.
    • Users must be able to ask the Service to review the decision and the action taken.
  3. Accountability – The Service must submit annual reports to IMDA on the measures it has taken to combat harmful and inappropriate content, for publication on the IMDA website. The report must include: (i) how much and what types of harmful or inappropriate content users encounter on the Service; (ii) what steps the Service has taken to reduce Singapore users' exposure to harmful or inappropriate content; and (iii) what action the Service has taken on user reports.

The Draft Code is accompanied by guidelines that provide illustrative, non-exhaustive examples of harmful or inappropriate content, both for all users and for children, in the following content categories: sexual content, violent content, suicide and self-harm content, cyberbullying content, content endangering public health, and content facilitating vice and organised crime.


The finalised Code of Practice for Online Safety is expected to be implemented in the second half of 2023, following a final round of consultations with social media companies.

Blocking Directions

Under the new Part 10A, IMDA may also issue the following directions to deal with egregious content (which includes content advocating suicide or self-harm, physical or sexual violence, or terrorism; content depicting child sexual exploitation; and content likely to cause racial or religious disharmony):

  1. Online communication service providers – A direction to an online communication service provider (which currently includes social media services such as Facebook, Instagram, YouTube, TikTok and Twitter) to: (i) disable access by Singapore end-users to the egregious content on its service; or (ii) stop the delivery or communication of content to the accounts of all Singapore end-users, in order to stop or reduce the delivery of the egregious content.

    Failure to comply may result in a fine not exceeding S$1,000,000, and a further fine not exceeding S$100,000 for every day the offence continues after conviction.

  2. Internet access service providers – A blocking direction to an internet access service provider (such as Singtel, StarHub or M1) to block access by Singapore end-users to the online communication service.

    Failure to comply may result in a fine not exceeding S$20,000 for each day of non-compliance, up to a total of S$500,000.

Concluding Words

With the passage of the Bill in Parliament, the legislative changes are expected to come into force in 2023. Once they are in force, online communication service providers and internet access service providers should take note of IMDA's power to issue directions to block egregious content, and of their obligation to take all reasonably practicable steps to comply with such directions.

Social media services in particular appear to be the focus of the new framework at present. Although the Draft Code has not been finalised, social media platforms should consider the obligations proposed in the Draft Code, and the measures and policies that may need to be put in place to comply with them. While many social media service providers may already have community standards and policies, as well as content moderation measures, in place, they should ensure that these are reviewed and adapted to meet local requirements (e.g. information such as safety resources or support centres in Singapore, where available).

The content of this article is intended to provide a general guide on the subject. Depending on your specific circumstances, professional advice should be sought.
