LONDON (AP) — Starting Friday, Europeans will see their online life change.
People in the 27-nation European Union can alter some of what shows up when they search, scroll and share on the biggest social media platforms, such as TikTok, Instagram and Facebook, as well as on services from other tech giants like Google and Amazon.
That’s because Big Tech companies, most headquartered in the U.S., are now subject to a pioneering new set of EU digital regulations. The Digital Services Act aims to protect European users when it comes to privacy, transparency and removal of harmful or illegal content.
Here are five things that will change when you sign on:
RECOMMENDATIONS CAN BE TURNED OFF
Automated recommendation systems decide, based on people’s profiles, what they see in their feeds. Those systems can now be switched off.
Meta, owner of Facebook and Instagram, said users can opt out of its artificial intelligence ranking and recommendation systems that determine which Instagram Reels, Facebook Stories and search results to show. Instead, people can choose to view content only from people they follow, starting with the newest posts.
Search results will be based only on the words users type, not personalized based on their previous activity and interests, Meta President of Global Affairs Nick Clegg said in a blog post.
On TikTok, instead of being shown videos based on what users previously viewed, the “For You” feed will serve up popular videos from their area and around the world.
Turning off recommender systems also means the video-sharing platform’s “Following” and “Friends” feeds will show posts from accounts users follow in chronological order.
Snapchat users “can opt out of a personalised content experience,” the company said.
Algorithmic recommendation systems based on user profiles have been blamed for creating so-called filter bubbles and pushing social media users to increasingly extreme posts. The European Commission wants users to have at least one other option for content recommendations that’s not based on profiling.
IT’S EASIER TO FLAG HARMFUL CONTENT
Users should find it easier to report a post, video or comment that breaks the law or violates a platform’s rules so that it can be reviewed and taken down if required.
TikTok has started giving users an “additional reporting option” for content, including advertising, that they believe is illegal. To pinpoint the problem, people can choose from categories such as hate speech and harassment, suicide and self-harm, misinformation or frauds and scams.
The app by Chinese parent company ByteDance has added a new team of moderators and legal specialists to review videos flagged by users, alongside automated systems and existing moderation teams that already work to identify such material.
Facebook and Instagram’s existing tools for reporting content are “easier for people to access,” said Meta’s Clegg, without providing more details.
YOU’LL KNOW WHY YOUR POST WAS TAKEN DOWN
The EU wants platforms to be more transparent about how they operate.
So, TikTok says European users will get more information “about a broader range of content moderation decisions.”
“For example, if we decide a video is ineligible for recommendation because it contains unverified claims about an election that is still unfolding, we will let users know,” TikTok said. “We will also share more detail about these decisions, including whether the action was taken by automated technology, and we will explain how both content creators and those who file a report can appeal a decision.”
Google said it’s “expanding the scope” of its transparency reports by giving more information about how it handles content moderation for more of its services, including Search, Maps, Shopping and Play Store, without providing more details.
SHOPPING WILL BE SAFER
Amazon, the online retail giant, said it invests “significantly in protecting our store from bad actors, illegal content and in creating a trustworthy shopping experience. We have built on this strong foundation for DSA compliance.”
Online fashion marketplace Zalando is setting up flagging systems, though it downplays the threat posed by its highly curated collection of designer clothes, bags and shoes.
“Customers only see content produced or screened by Zalando,” the German company said. “As a result, we have close to zero risk of illegal content and are therefore in a better position than many other companies when it comes to implementing the DSA changes.”
NO MORE TARGETED ADS FOR TEENS
TikTok said in July that it was restricting the types of data used to show ads to teens. Users who are 13 to 17 in the EU, plus Britain, Switzerland, Iceland, Norway and Liechtenstein, no longer see ads “based on their activities on or off TikTok.”
Snapchat is restricting personalized and targeted advertising to users under 18.
Meta in February stopped showing Facebook and Instagram users who are 13 to 17 ads based on their activity, such as following certain Instagram posts or Facebook pages. Now, age and location are the only data points advertisers can use to show ads to teens.