Sunday, June 15, 2025

Social Media’s Final Push Against Illegal Posts


Online platforms have until 16 March 2025 to assess whether they show harmful content to users. This deadline is part of the Online Safety Act, and platforms that fail to comply could face fines.

Ofcom, the regulator in charge of UK internet safety, has published codes of practice setting out how companies should deal with harmful content online. Companies have three months to assess the risk of harmful content on their sites; if they do not, they could be fined up to 10% of their global revenue. Ofcom's chief executive, Dame Melanie Dawes, said companies need to make changes now or face further measures, such as banning children from social media. Some people think the rules do not go far enough to protect children.

Tech companies must find and stop harmful content, including material about abuse, violence, and self-harm. They must also warn users about dangers such as sharing personal information. Platforms like Facebook and Instagram have already put safety features for teens in place. The new rules still need approval from parliament, but companies are expected to start following them now.


Vocabulary List:

  1. Platforms /ˈplætfɔːrmz/ (noun): Digital services or websites that allow users to communicate and share content.
  2. Harmful /ˈhɑːrmfəl/ (adjective): Causing or likely to cause damage or injury.
  3. Compliance /kəmˈplaɪəns/ (noun): The act of adhering to rules, regulations, or laws.
  4. Dangers /ˈdeɪndʒərz/ (noun): Possibilities of harm or injury.
  5. Features /ˈfiːtʃərz/ (noun): Distinctive attributes or elements of a product or service.
  6. Approval /əˈpruːvəl/ (noun): The action of officially agreeing to something.

How much do you know?

What is the deadline for online platforms to check if they show harmful content to people?
16 March 2025
1 January 2021
30 June 2023
5 September 2024
Who made the rules for how companies should handle harmful content online?
Ofcom
Parliament
Tech companies
Internet users
What percentage of their revenue could companies be fined if they do not check for harmful content?
10%
5%
20%
15%
Which social media platforms have already implemented safety features for teens?
Facebook and Instagram
Twitter and Snapchat
LinkedIn and Pinterest
TikTok and YouTube
What do tech companies need to find and stop on their platforms?
Harmful content such as abuse, violence, and self-harm
Advertisements
Positive content only
Educational content
Who said that companies need to make changes now or face further measures, such as banning children from social media?
Dame Melanie Dawes
Mark Zuckerberg
Jeff Bezos
Tim Cook
The new rules under the Online Safety Act have already been approved by parliament.
The Online Safety Act requires online platforms to check for harmful content by 16 March 2025.
Some people believe that the rules do not provide enough protection for children.
Tech companies are only required to warn users about the dangers of sharing personal information.
Ofcom is responsible for overseeing UK internet safety.
Companies might face fines if they fail to implement safety features for teens on their platforms.
Ofcom, the regulator in charge of UK internet safety, requires companies to check for harmful content within ___ months.
Companies that do not check for harmful content could be fined up to ___% of their revenue.
Platforms like Facebook and Instagram have already implemented safety features for ___.
The regulator responsible for UK internet safety is ___.
Some people believe that the rules under the Online Safety Act are not sufficient to protect ___.
Tech companies need to find and stop harmful content such as ___, ___, and ___ on their platforms.
