Online platforms must assess whether their services expose users to illegal and harmful content by 16 March 2025, under the Online Safety Act. Ofcom, the UK's internet safety regulator, has published codes of practice setting out how companies should deal with harmful material online. Platforms have three months to check their sites for harmful content, and failure to comply could lead to fines of up to 10% of their global revenue. Ofcom chief executive Dame Melanie Dawes said companies must make changes now or face tougher measures, such as banning children from social media. Some campaigners argue the rules do not go far enough to protect children.

Tech companies are required to find and stop harmful content, including material about abuse, violence, and self-harm. They must also warn users about dangers such as sharing personal information. Platforms like Facebook and Instagram have already introduced safety features for teenagers. The new rules still need approval from parliament, but companies are expected to start working towards compliance now.
Vocabulary List:
- Platforms /ˈplætfɔːrmz/ (noun): Digital services or websites that allow users to communicate and share content.
- Harmful /ˈhɑːrmfəl/ (adjective): Causing or likely to cause damage or injury.
- Compliance /kəmˈplaɪəns/ (noun): The act of adhering to rules, regulations, or laws.
- Dangers /ˈdeɪndʒərz/ (noun): Possibilities of harm or injury.
- Features /ˈfiːtʃərz/ (noun): Distinctive attributes or elements of a product or service.
- Approval /əˈpruːvəl/ (noun): The action of officially agreeing to something.