A controversial decision from Italian Administrative Judges on the appointment of Data Protection Officers

Lawyer Vincenzo Colarocco

Among the factors that most undermine the relevance of the DPO’s role are the methods used to select the professional. On this subject, the Regional Administrative Court of Puglia intervened on 13 September 2019 with judgment no. 1468/2019, from which important considerations can be drawn regarding the procedure for designating the Data Protection Officer.

The decision annulled the award of a two-year DPO assignment to a limited liability company which had appointed an external consultant to perform the DPO role. According to the judges, on the basis of an authentic interpretation of the Guidelines on Data Protection Officers (WP243), the company had designated the DPO by appointing a professional external to the company, without specifying and proving that the latter belonged to that same company. Consequently, under this interpretation, where the assignment is entrusted to a legal person, the individual who actually performs the functions of Data Protection Officer must be an employee of the company offering the DPO service, it not being possible to rely on an external professional.

The ruling, although it concerns the public sector, expresses a principle with expansive potential in the private sector as well: if endorsed, it would render illegitimate a considerable number of DPO appointments made in favour of legal persons, since the latter routinely rely on external professionals, who are expected to guarantee specialist knowledge of data protection law and practice as well as the ability to perform the tasks required by art. 37, par. 5 of the GDPR.

Transparency: EDPB Guidelines on video surveillance

Lawyer Vincenzo Colarocco

On 10 June 2019, the European Data Protection Board (EDPB) adopted Guidelines no. 3/2019 on the processing of personal data through video devices, which clarify how the General Data Protection Regulation applies to the processing of personal data by means of video devices and aim to ensure the consistent application of the GDPR on the subject.

Briefly, with reference to the information to be given to data subjects, the two-layer approach is confirmed: first-layer summary information (the warning sign) and second-layer complete information (available, for example, on a website). The sign must be placed near the camera and must contain the essential information required by the GDPR; the guidelines themselves provide an example of the new version of the information sign.

What do the first sanctions reveal, and what choices are the Supervisory Authorities making?

Lawyer Vincenzo Colarocco

The supervisory authorities – the EU privacy regulators – have so far taken a reasonable and considered approach to sanctions for non-compliance with the GDPR, in line with the regulation itself, which states that sanctions must in any case be effective, proportionate and dissuasive. For example:

 

French Data Protection Authority (CNIL)
Fine (€): 50,000,000 (Google LLC)
Articles cited: Art. 5, Art. 6, Art. 13, Art. 14 and Art. 4 no. 11 GDPR
Summary: lack of transparency (Art. 5 GDPR), insufficient information (Art. 13/14 GDPR) and lack of a legal basis (Art. 6 GDPR); the consents obtained were neither “specific” nor “unambiguous” (Art. 4 no. 11 GDPR).

Italian Data Protection Authority (Garante)
Fine (€): 50,000 (platform of the Italian political party Movimento 5 Stelle)
Articles cited: Art. 32 GDPR
Summary: insufficient technical and organisational measures to ensure information security.

Information Commissioner’s Office (ICO)
Fine (€): 204,600,000 (British Airways)
Articles cited: Art. 32 GDPR
Summary: insufficient technical and organisational measures to ensure information security.

See the detailed list of fines for a fuller overview.

The state of health of data protection: Communication from the European Commission to the European Parliament and the Council of 24 July 2019

Lawyer Vincenzo Colarocco

The European Commission published on 24.7.2019 a “Communication to the European Parliament and the Council” entitled “Data protection rules as a trust-enabler in the EU and beyond – taking stock” outlining the current state of data protection in the EU, with a particular focus on the impact of the GDPR (the framework will need to be complemented by the e-Privacy Regulation, currently under preparation).

Most Member States have put in place the necessary legal framework and the new system strengthening the enforcement of data protection rules is coming into operation. The Commission has noted increased awareness among citizens who increasingly exercise their rights. The EU data protection legislative framework has become the cornerstone of the European civil society innovation project.

The European project covers health and research, artificial intelligence, transport, energy, electoral policies, competition and law enforcement.

The Cofemel decision C-683/17 and the CJEU’s clarification on the relationship between protection granted by copyright law and design law

Lawyer Alessandro La Rosa

The Court of Justice of the European Union (CJEU), on 12 September 2019, issued its much-awaited judgment in Cofemel, C-683/17. It ruled that, as far as designs are concerned, copyright protection may not be granted on the sole ground that they produce a specific aesthetic effect.

In its Cofemel judgment, the CJEU proceeded along the lines set by earlier rulings such as Flos, C-168/09, in which the judges held that, if a design is eligible for protection under the InfoSoc Directive because it is original in the sense clarified by the CJEU, then Member States cannot deny such protection.

Facts

The case concerned two companies active in the design, production and sale of clothing: G-Star Raw CV (“G-Star”) and Cofemel – Sociedade de Vestuario SA (“Cofemel”). G-Star accused Cofemel of producing and selling jeans, sweatshirts and T-shirts copying some of its own designs, arguing that these clothing models constituted original intellectual creations and should therefore be qualified as “works” protected by copyright.

Cofemel defended itself by arguing that the said clothing models could not be qualified as “works” benefiting from such protection.

Due to persistent differences of interpretation, the Supremo Tribunal de Justiça (Portuguese Supreme Court) asked the Court of Justice whether EU law, and specifically Article 2(a) of the InfoSoc Directive, precludes Member States from granting copyright protection to designs on the sole ground that, beyond their practical purpose, they produce a specific aesthetic effect.

Judgment

In order to answer the question referred, the CJEU started from the notion of ‘work’ in Article 2(a) mentioned above, and recalled that this – like all concepts that do not refer to the laws of individual EU Member States – is an autonomous notion of EU law, which requires uniform interpretation across the EU and presupposes the fulfilment of two cumulative elements. On the one hand, according to the CJEU, this notion implies that there is an original object, in the sense that it is its author’s own intellectual creation. On the other hand, the qualification as a “work” is reserved for elements that are the expression of such a creation, which implies that the subject matter must be expressed in a manner which makes it identifiable with sufficient precision and objectivity, even though that expression is not necessarily in permanent form. In particular, the Court notes that the aesthetic effect that can be produced by a design is the result of the intrinsically subjective sensation of beauty felt by each person who looks at it. Consequently, this subjective effect does not in itself make it possible to characterise the existence of an object identifiable with sufficient precision and objectivity.

Therefore, the circumstance that designs produce, over and above their practical purpose, a specific aesthetic effect, does not, in itself, entail that such designs can be classified as ‘works’.

In conclusion, the ECJ held that Article 2(a) of Directive 2001/29 must be interpreted as precluding national legislation from conferring copyright protection on models such as the clothing models at issue, on the ground that, beyond their utilitarian purpose, they generate a visual effect of their own which is significant from an aesthetic point of view.

The Cofemel judgment is yet another landmark ruling of the CJEU with far-reaching consequences, both for designs and other objects protected by copyright.

Health data: new Council of Europe guidelines

Lawyer Vincenzo Colarocco

The Council of Europe, with a recommendation adopted on 27 March 2019 (hereinafter the “Recommendation”), has provided a set of guidelines for the Member States with the aim of guiding them in the proper processing of health data.

The clear intention of the above European body is to ensure, in law and in practice, that the processing of such special categories of data under Article 9 of EU Regulation 679/2016 (“GDPR” or “Regulation”) is carried out in full respect of human rights, at a historical moment in which the use of new technologies is increasing rapidly[1]. This implies the need to structure the processing around the cornerstones of the Regulation: privacy by design and privacy by default, as regulated by art. 25 of the GDPR. Therefore, the technical and organisational measures to be implemented should be incorporated from the design phase of any system that processes health data. In addition, in order to further implement these principles, the Recommendation specifies that compliance with these provisions should be reviewed regularly throughout the entire life cycle of the processing and that the Controller must carry out, before starting the processing and at regular intervals, an assessment of the potential impact in terms of data protection and respect for privacy, including an evaluation of the measures needed to mitigate the risk.

The Recommendation also sets out some interesting clarifications on the legal bases that may legitimise the processing of health data. Having established that the primary basis for such processing is the informed consent of the data subject (in accordance with Art. 9 of the GDPR), the Recommendation also provides, alternatively, two further circumstances which would appear to dispense with the prior collection of consent:

  1. when the processing is necessary for the performance of a contract concluded by the data subject with a healthcare professional subject to conditions defined by law, including the obligation of secrecy;
  2. when such data have been made public by the data subject himself.

With reference to the storage periods (“retention”) applicable to health data, the Recommendation provides that, if adequate security measures are in place, retention may be extended[2] when the processing is envisaged for archiving purposes in the public interest, for scientific or historical purposes or, again, for research and statistics. In the latter case, the data should, in principle, be rendered anonymous as soon as the research, archiving or statistical studies allow; if this is not possible, pseudonymisation could be used to safeguard the fundamental rights and freedoms of the data subjects.
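Purely by way of illustration, the following is a minimal sketch of how pseudonymisation of a health record might be carried out in practice: the direct identifier is replaced by a keyed hash (HMAC), so that records remain linkable for research or statistical purposes while re-identification requires a secret key held separately from the data. All names and values are hypothetical assumptions and do not come from the Recommendation.

```python
import hashlib
import hmac

# Hypothetical secret key; in practice it must be stored separately from the data.
SECRET_KEY = b"store-me-separately-from-the-health-data"

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for a direct identifier (e.g. a patient ID)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example health record (hypothetical): the direct identifier is replaced by the
# pseudonym before the record is used for research or statistical purposes.
record = {"patient_id": "IT-1234567", "diagnosis_code": "E11.9"}
research_record = {
    "pseudonym": pseudonymise(record["patient_id"]),
    "diagnosis_code": record["diagnosis_code"],
}
print(research_record)
```

Kept under these assumptions, the data are not anonymous (the key holder can re-link them), which is precisely why the Recommendation treats pseudonymisation as a safeguard rather than as anonymisation.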

In conclusion, it is clear that the Recommendation closely mirrors the requirements of the GDPR; in any case, the importance and relevance of these provisions should be stressed in light of increasing digitisation, which necessarily also involves the processing of personal data. This phenomenon evidently leads to improvements in medical and patient care, but inevitably causes an exponential increase in the amount of health data subject to processing operations and, as a result, makes it necessary to apply legal and technical measures that allow effective protection of each individual.

 

 

[1] The Recommendation also gives indications regarding the processing of health data collected through mobile devices which, whether or not implanted in the individual, may reveal information about his or her physical or mental state, or which concern health care and social care benefits.

[2] That is, exceeding the storage periods strictly necessary for the purpose of patient care.

CJEU Case C-18/18: the stay-down duties of Facebook according to the Opinion of Advocate General M. Szpunar

Lawyer Alessandro La Rosa

On 4 June 2019 Advocate General (AG) Szpunar delivered his Opinion in the context of the preliminary ruling request in Case C-18/18, in which the Austrian Supreme Court referred the following questions to the CJEU:

  1. Does Article 15(1) of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) generally preclude any of the obligations listed below of a host provider which has not expeditiously removed illegal information, specifically not just this illegal information within the meaning of Article 14(1)(a) of the Directive, but also other identically worded items of information:
     a. worldwide?
     b. in the relevant Member State?
     c. of the relevant user worldwide?
     d. of the relevant user in the relevant Member State?
  2. In so far as Question 1 is answered in the negative: does this also apply in each case for information with an equivalent meaning?
  3. Does this also apply for information with an equivalent meaning as soon as the operator has become aware of this circumstance?

The facts

The complaint was made by Ms Eva Glawischnig-Piesczek, member of the Nationalrat (National Council, Austria), chair of the parliamentary party die Grünen (‘the Greens’) and federal spokesperson of that party. On 3 April 2016 a user shared on Facebook an article from the Austrian online news magazine oe24.at entitled ‘Greens: Minimum income for refugees should stay’ and published in connection with the article disparaging comments about the applicant, accusing her of being a ‘lousy traitor of the people’, a ‘corrupt oaf’ and a member of a ‘fascist party’. The content placed online by that user could be consulted by any other Facebook user.

AG Opinion

The AG recalled that the “service provider’s conduct must be limited to that of an ‘intermediary service provider’ within the meaning intended by the legislator in the context of Section 4 of that directive” (par. 29) and that, according to Recital 42 of Directive 2000/31/EC (the E-Commerce Directive, ECD), the liability limitation is subject to the condition that the service provider’s conduct is merely technical, automatic and passive, which implies that it has neither knowledge of nor control over the data it stores and that the role it plays must therefore be neutral (par. 29).

Considering Article 15, paragraph 1 ECD, the AG clarifies that this provision prevents the intermediary service provider from being forced to monitor all of its users’ data in order to prevent future violations (as this would go against what the CJEU established in the SABAM Case C-360/10). That said, this article does not cover monitoring obligations applicable in specific cases, as provided in Article 14, paragraph 3 ECD, on the basis of which a provider may be required to prevent a violation, which “logically implies a certain form of monitoring in the future” (par. 41), and in Article 18 ECD, which requires Member States to ensure that court actions available under national law concerning information society services’ activities allow for the rapid adoption of measures to prevent any further impairment of the interests involved.

The AG continues by adding that “a host provider may be ordered, in the context of an injunction, to remove illegal information which has not yet been disseminated at the time when that injunction is adopted, without the dissemination of that information being brought, again and separately from the original removal request, to its knowledge” (par.44).

However, in accordance with the prohibition of a general monitoring obligation laid down in the ECD, such an order must be limited to violations of the same kind, by the same recipient of the service and in respect of the same rights (as already established in Case C-324/09, L’Oréal v eBay). It follows that obligations of active monitoring may be imposed on a social network operator, provided that they relate to a specific infringement, that the monitoring period is specified and that clear indications are given about the nature of the violations, their author and their object.

Therefore, an injunction may require the provider to search for and identify, among the information shared by the users of the platform, information identical to that qualified as illegal by the judge, since the reproduction of the same content by other users of the platform can usually be detected with technological tools without the need to actively filter, in a non-automatic way, all of the information present on the platform. The injunction may also require a service provider to search for and identify information equivalent to the illegal one, but only among the information shared by the same user who shared the illegal content.
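By way of a hedged illustration only, the sketch below shows the kind of automatic detection of identical copies referred to above: new uploads are fingerprinted and compared against the fingerprints of content already found illegal, without a human review of every item on the platform. The hashing approach and all values are assumptions made for the example (real platforms typically use more robust, perceptual fingerprinting for audiovisual content).

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Cryptographic fingerprint of an uploaded item."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical set of fingerprints of content already qualified as illegal by a court.
blocked_fingerprints = {fingerprint(b"text of the post found defamatory ...")}

def should_block(upload: bytes) -> bool:
    """True only if the upload is byte-for-byte identical to known illegal content."""
    return fingerprint(upload) in blocked_fingerprints

print(should_block(b"text of the post found defamatory ..."))  # True: identical copy
print(should_block(b"a reworded, merely equivalent post"))      # False: needs separate assessment
```

As the second call shows, merely “equivalent” wording escapes this kind of automated matching, which is why the AG treats equivalent information differently from identical information.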

This is due to the fact that, in this specific case, the illegal material relates to offensive remarks and not to IP infringements. As stated by the AG “it is important not to lose sight of the factual context in which the relevant case-law was developed, namely the context of infringements of intellectual property law. Generally, such infringements consist in the dissemination of the same content as that protected or, at least, a content resembling the protected content, any changes in that content, which are sometimes difficult to make, requiring specific intervention. On the other hand, it is unusual for a defamatory act to use the precise terms of an act of the same type. That is, in part, the result of the personalised nature of the way in which ideas are expressed.[…]For that reason, in connection with defamation, a mere reference to acts of the same nature could not play the same role as in connection with infringements of intellectual property law.” (par. 69-70).

Therefore, the injunction of a judge that orders the removal of such equivalent information must ensure clarity, predictability and must balance the different fundamental rights involved, taking into account the proportionality principle.

It follows that Article 15, paragraph 1 ECD does not preclude a hosting provider from being ordered to remove information equivalent to the information identified as illegal, where the obligation to remove such information does not imply general monitoring but derives from knowledge resulting from the notice provided by the person harmed by the illegal content, and not from another source. In such a case, no violation of Article 15, paragraph 1 ECD arises.

The AG Opinion is in line with the recently published Communications of the European Commission on these matters. In Communication 555/2017 of 28 September 2017 the Commission clarified that “Online platforms may become aware of the existence of illegal content in a number of different ways, through different channels. Such channels for notifications include (i) court orders or administrative decisions; (ii) notices from competent authorities (e.g. law enforcement bodies), specialised ‘trusted flaggers’, intellectual property rights holders or ordinary users, or (iii) through the platforms’ own investigations or knowledge”, and that such operators, “in addition to legal obligations derived from EU and national law”, are subject to a “duty of care” and “should ensure a safe online environment for users, hostile to criminal and other illegal exploitation, and which deters as well as prevents criminal and other infringing activities online”. The Communication also clarifies that, “in light of their central role and capabilities and their associated responsibilities”, platforms should “adopt effective proactive measures to detect and remove illegal content online and not only limit themselves to reacting to notices which they receive”.

In relation to the admissibility of dynamic injunctions, the European Commission clarified in Communication 708/2017 of 29 November 2017 that “some Member States provide for the possibility of issuing forward-looking, catalogue-wide and dynamic injunctions”. These injunctions require intermediaries to prevent further infringements of the rights held by a rightholder, or of rights that are part of a catalogue or repository of a licence holder, following the established infringement of a sample of those rights.

At national level, the Italian Supreme Court, in Case no. 7708/19 (RTI v Yahoo!) of 19 March 2019, recently set aside the ruling of the Milan Court of Appeal (no. 29/2015 of 7 January 2015), which had held that there could be no obligation to prevent the publication of the same illegal content that had already violated others’ rights.

In conclusion, specific removal obligations for illicit information that is identical or equivalent are confirmed, in relation to copyright law, by the explicit stay-down obligations provided in Article 17, paragraph 4, letter c of the Directive on Copyright in the Digital Single Market, published in the Official Journal on 17 May 2019, which reads that “If no authorisation is granted, online content-sharing service providers shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter, unless the service providers demonstrate that they have […] acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads”.

The Google Case C-507/17: the right to be forgotten is limited to the EU

Lawyer Vincenzo Colarocco

By decision of 21 May 2015, the President of the French Commission nationale de l’informatique et des libertés (CNIL) served formal notice on Google that, when acceding to a request from a natural person for the removal of links to web pages from the list of results displayed following a search performed on the basis of that person’s name, it must apply that removal to all of its search engine’s domain name extensions. By adjudication of 10 March 2016, the CNIL, after finding that Google had failed to comply with that formal notice within the prescribed time limit, imposed on it a penalty of €100,000, which was made public. By an application lodged before the Conseil d’État (Council of State, France), Google seeks to have that adjudication annulled. The Conseil d’État decided to refer several questions to the Court of Justice for a preliminary ruling.

Advocate General Maciej Szpunar, in his Opinion, noting that the EU provisions on the subject do not expressly regulate the territoriality of de-referencing and that a differentiation is necessary according to the place from which the search is carried out, proposes that the Court should declare that the “search engine operator is not required, when acceding to a request for de-referencing, to carry out that de-referencing on all the domain names of its search engine in such a way that the links in question no longer appear, irrespective of the location from which the search on the basis of the requesting party’s name is performed”. However, the Advocate General underlines that “once a right to de-referencing within the EU has been established, the search engine operator must take every measure available to it to ensure full and effective de-referencing within the EU, including by use of the ‘geo-blocking’ technique, in respect of an IP address deemed to be located in one of the Member States, irrespective of the domain name used by the internet user who performs the search.”

In conclusion, dealing with the specific case, Advocate General Szpunar had the opportunity to point out that the fundamental right to be forgotten must be balanced against other fundamental rights, such as the right to data protection and the right to privacy, as well as the legitimate public interest in accessing the information sought, stating that, if worldwide de-referencing were permitted, the EU authorities would not be able to define and determine a right to receive information, let alone balance it against the other fundamental rights to data protection and to privacy.
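As a purely illustrative sketch of the “geo-blocking” technique the Advocate General refers to, the snippet below suppresses de-referenced links whenever a request originates from an IP address geolocated in an EU Member State, regardless of the domain used. The geolocation table, the URLs and the (truncated) country list are hypothetical assumptions and are not taken from the Opinion.

```python
# Hypothetical data: a real system would rely on an IP-geolocation service and a
# complete list of EU/EEA country codes.
EU_COUNTRY_CODES = {"AT", "DE", "ES", "FR", "IT", "NL", "PL"}  # truncated for brevity
IP_TO_COUNTRY = {"198.51.100.7": "FR", "203.0.113.9": "US"}

def filter_results(results, de_referenced_urls, client_ip):
    """Drop de-referenced links for searches geolocated inside the EU."""
    country = IP_TO_COUNTRY.get(client_ip, "UNKNOWN")
    if country in EU_COUNTRY_CODES:
        return [url for url in results if url not in de_referenced_urls]
    return results  # outside the EU the links remain visible

results = ["https://example.org/article", "https://example.net/other-page"]
de_referenced = {"https://example.org/article"}
print(filter_results(results, de_referenced, "198.51.100.7"))  # EU request: link suppressed
print(filter_results(results, de_referenced, "203.0.113.9"))   # non-EU request: unchanged
```

The point of the sketch is simply that the filter is keyed to the searcher’s location rather than to the search engine’s domain extension, which is what "full and effective de-referencing within the EU" would require under the AG’s approach.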

Ruling of the Italian Supreme Court in the Mediaset vs Yahoo! Case

Lawyer Alessandro La Rosa

Mediaset sued Yahoo! Inc. and Yahoo! Italy for the illegal broadcasting of more than 200 videos extracted from television programmes broadcast by its television networks. At first instance, the Civil Court of Milan recognised the liability of Yahoo! Italy, as operator of the Yahoo! Video service, for not having prevented the publication of Mediaset’s videos despite having been warned by means of a notice containing the titles of the television programmes in question and, by way of example, some URLs. For the first time in Italy, the Civil Court of Milan applied the case law of the EU Court of Justice on the concept of the “active role” of providers. Subsequently, the Milan Court of Appeal overturned the first-instance decision, denying that Yahoo! Italy could be characterised as a hosting provider “with an active role” and excluding its liability. With its decision of 19 March 2019, the Court of Cassation endorsed the reasoning of the Civil Court of Milan and referred the case back to the Milan Court of Appeal to reconsider its decision on the basis of the following principles.

With ruling no. 7708 of 19 March 2019, the Italian Supreme Court upheld the action filed by RTI (Mediaset Group) and clarified in detail the liability of hosting providers (i.e. service providers that store information provided by users, as provided by art. 14 of Directive 2000/31/EC) towards the affected rightholders.

The Court has provided fundamental guidance that will have to be followed by national jurisprudence. The principles established by the Court can be summarized as follows:

a) the hosting provider must be defined as “active” when his conduct has the effect of completing and enriching the consumption of illegal content published online by third parties. In this case there is direct (notably, active) support to the illegal act carried out by those third parties, with full civil liability towards the rightholders and the inapplicability of Legislative Decree 70/2003 (the Italian transposition of Directive 2000/31/EC), as the general rules apply: the ruling also identifies in detail a number of useful indicators of an active hosting provider, such as the activity of filtering, indexing, organising and so on, or the adoption of techniques for evaluating users’ behaviour in order to increase their loyalty to the service.

b) when such active conduct is not found, the service provider must be qualified as “passive” and Article 16 of Legislative Decree 70/2003 applies. This, however, does not mean that such an entity enjoys an absolute liability exemption: in fact, according to these rules, the provider is still liable towards the rightholders of content illegally published online by third parties when he is aware of the illegal nature of the content (illegality that arises when, as in this case, copyright-related rights of others are violated) and nevertheless neither removes it nor blocks access to it.

c) especially in actions for damages, liability arises when the content is manifestly illegal and, although the hosting provider is aware of it, he does not prevent its availability, omitting to remove it or to disable access to it. Such knowledge arises not only when the rightholder gives notice of the illicit act and the related damage, which can also be done orally (without the need for a written warning), but also when the provider has acquired such information by other means or has become aware of it following its own checks. In fact, the ruling specifies that, when the communication comes from the affected rightholder, it does not necessarily have to be in written form (writing is advisable for practical evidentiary needs, but is not required by legislation). Practically speaking, the communication only has to allow the hosting provider to understand and identify the illegal content (the Court of Cassation adds that, where videos extracted from TV programmes are concerned, it must be assessed on a case-by-case basis whether, for this purpose, the title of the programme alone is sufficient or whether other descriptive elements are necessary, with the possibility for the judge to order a technical expert’s assessment).

d) although the passive hosting provider is not required to carry out general, anticipated and constant monitoring, and therefore is not liable where he omitted to monitor in a preventive and continuous manner the content uploaded by the users of the service, it is likewise true that he is liable for damages when he acquires knowledge of the content and does not act for its immediate removal. There is no obligation to actively search for illegal material uploaded and distributed online; this obligation arises only after the above knowledge, as the legislation has not entirely removed, but only limited, the provider’s control over uploaded content that may form part of an unlawful online act.

e) upon receiving communication of the illegal act from the rightholder, the hosting service provider cannot, in any case, remain inactive. On the contrary, he is required to evaluate, in accordance with criteria of common experience and professional diligence, the notice received (and, if made by the damaged party, the reasonable grounds of the communication) and, where this evaluation is positive, to act expeditiously to remove the identified content. The Court of Cassation adds that, in actions for damages, the illegal act must appear “evident” to the operator, meaning that it should be identifiable without major difficulty, as experience, knowledge and professional diligence allow. In such cases the hosting provider is liable in the event of intent or gross negligence (without prejudice to the fact that, even where the illegal act is not manifest, the provider cannot remain inactive, as he should at least inform the competent authorities).

f) the passive hosting provider, when aware of manifestly illegal content that has been uploaded, becomes liable if he does not act. This applies unless he proves, once his knowledge of the illegal act has been ascertained and the elements that make the illegality evident have been established in the action for damages, that he never had the chance to take effective action (a possibility that always exists when the hosting provider has the technical and legal instruments to prevent the violations, as the Court of Cassation explains in the ruling).

g) ultimately, pursuant to art. 16 of Legislative Decree 70/2003 (corresponding to art. 14 of Directive 2000/31/EC), even when the hosting provider is “passive” and not “active”, he is liable for damages towards the affected rightholders when he does not provide for the immediate removal of the illegal content or continues to make it available, if the following three conditions occur: i) the hosting provider becomes legally aware of the illegal act carried out by the user of the service, having learned of it from the affected rightholder or by other means; ii) the unlawfulness of the third party’s conduct is reasonably evident, so that the operator is grossly negligent in failing to notice it, such diligence being reasonably expected of a professional online operator at a given moment in history; iii) the hosting service provider is in a position to take useful action, having been made aware, in a sufficiently specific manner, of the illegal content to be removed.

h) in any case, apart from claims for damages for established violations, once the passive hosting provider becomes aware of illegal content of a given type, the reasons that exclude an obligation of constant and preventive monitoring no longer apply. Therefore, the judge may impose injunctive measures on the provider for the future, not only in order to put an end to ongoing violations but also to prevent future ones (the ruling also specifies that nothing prevents a court from requiring the hosting provider to block access not only to content illegally published online, but also to any other illegal content of the same type of which the provider may become aware in the future, even where this implies a significant cost).

Media: any information that can identify a victim of sexual violence, even indirectly, is prohibited

Lawyer Vincenzo Colarocco

The Italian Data Protection Authority has reiterated, in some recent decisions (see, inter alia, decisions no. 9065807, 9065782 and 9065800, in Italian), the principle that prohibits the media from publishing information that may make a victim of sexual violence identifiable, even indirectly.

Article 137 of the Privacy Code provided (and still provides, in the text amended by Article 12, paragraph 1(c) of Legislative Decree 101/2018) that, in the event of disclosure or communication of personal data for journalistic purposes, the limitations imposed on freedom of the press to protect the rights and freedoms of persons remain unaffected, in particular the limit of the materiality of the information with regard to facts of public interest.

The Authority stated that this limit must be interpreted with particular strictness where data suitable for identifying victims of crimes are concerned, and even more so with reference to news concerning episodes of sexual violence, given the special protection afforded by the law to the confidentiality of the persons injured by such crimes.

The dissemination within an article of information suitable to make the victim identifiable, even indirectly, conflicts with the requirement to protect the victim’s dignity, also under Article 8, paragraph 1, of the code of practice concerning the processing of personal data in the exercise of journalistic activities.

The Authority recalled that, in the event of non-compliance with the prohibition, the data controller, in this case the publisher, may also incur the new administrative sanctions introduced by the GDPR in Article 83, paragraph 5(e), which can reach up to 20 million euro or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher.
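For an undertaking, the ceiling set by Article 83(5) GDPR is the higher of the two amounts; the following minimal sketch of the arithmetic uses a hypothetical turnover figure purely for illustration.

```python
def max_fine_art_83_5(annual_turnover_eur: float) -> float:
    """Ceiling under Article 83(5) GDPR for an undertaking: the higher of
    EUR 20 million or 4% of the total worldwide annual turnover of the
    preceding financial year."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical publisher with EUR 800 million turnover: the 4% limb (EUR 32 million) applies.
print(max_fine_art_83_5(800_000_000))  # 32000000.0
```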