Terrorist content will be removed from the internet within 1 hour in the EU


In the European Union (EU), a regulation requiring the removal of terrorist content from the internet within one hour has entered into force.

Terrorist content published online, for example on social media platforms or websites, must be removed within one hour of a request from the competent authorities of EU countries.

Beyond the removal of terrorist content, the regulation aims to prevent the spread of propaganda by terrorist organizations, the direction of terrorist activities, radicalization and recruitment by such organizations.

The European Commission began preparing the legislation after terrorist attacks in several EU countries in recent years. Platforms offering internet services in the EU and the member states have one year to comply with the new rules.

Under the new regulation, any online material such as images, videos, text or audio that the competent authority of an EU country deems terrorist content must be deleted within one hour, or access to it must be blocked.


Live broadcasts of terrorist crimes, or of activities that may lead to such crimes, can also count as terrorist content.

Terrorist content is defined as “committing or contributing to terrorist crimes, participating in the activities of a terrorist group, glorifying terrorist activities, advocating or inciting terrorist crimes”.

Removal requests from the “competent authorities” of EU countries must clearly state their grounds and explain in detail why the material in question is considered terrorist content.

Internet platforms will be required to act very promptly when terrorist content is shared, with measures in place to protect fundamental rights, including freedom of expression.

EU countries and internet platforms will publish transparent annual reports on removed content. Any content deleted by mistake will be restored immediately.

There will be no obligation to use automated detection technologies to identify and remove such content. Where such technologies and software are used, they will have to be “controlled by humans”.


Content providers and internet platforms will be able to request a review of a removal order.

Platforms or internet service providers that fail to comply with removal requests may be penalized. The size of the platform will be taken into account, and penalties are to be proportionate. Financial penalties can reach up to 4% of the platform’s turnover.
