The AI Act is coming – Why Israeli businesses should care and start preparing for it

 

The lengthy and fierce discussions about the European Artificial Intelligence Act (AI Act) appear to have come to an end: the Council of the European Union recently approved the AI Act, it has passed the European Parliament’s Committee on the Internal Market and Consumer Protection, and it is expected to be approved by the European Parliament in April 2024. The AI Act is the world’s first comprehensive AI regulation, setting out harmonised rules for the placing on the market, putting into service and use of artificial intelligence systems (AI Systems) in the EU.

 

  1. Why is the AI Act relevant for Israeli businesses?

There are a few reasons why the AI Act may be relevant for businesses in Israel:

  • Compliance with the AI Act may be required for companies that either operate an AI system in the EU or that provide AI systems to customers in the EU. The law further applies to providers or deployers of AI systems outside the EU where the output produced by the AI system is used within the EU – a provision which broadens the possible scope of the AI Act significantly, but which is expressly intended to prevent circumvention of the law.
  • In addition, the AI Act may influence the development of AI regulations in other jurisdictions. Other lawmakers may look to the EU’s regulatory approach as a model for their own AI regulations, or might align their regulations with the EU’s standards in order to enable cross-border trade and cooperation.

 

  2. Striking a Balance Between Risk and Innovation

The AI Act follows a risk-based approach and classifies AI systems into different categories:

  • Prohibited AI: Guarding against Manipulation and Privacy Invasion

The legislation takes a firm stance against malicious practices by prohibiting AI practices such as the use of purposefully manipulative or deceptive techniques, biometric categorisation systems that categorise individuals based on sensitive characteristics, social scoring, and real-time remote biometric identification in publicly accessible spaces for law enforcement purposes.

  • High-Risk AI Systems: Balancing Power and Responsibility

A large part of the AI Act is dedicated to strict and extensive regulations for high-risk AI systems. Companies involved in AI must determine whether their AI system is “high-risk” in order to comply with the law. The AI Act recognizes two types of high-risk AI systems:

1) AI that is a product, or a safety component of a product, covered by specific EU legislation in sectors such as civil aviation, vehicle safety and personal protective equipment, and

2) AI listed in Annex III, which includes remote biometric identification, AI used in education, employment, law enforcement, migration, and more.

  • General-Purpose AI Models: Illuminating the Algorithms

General-purpose AI (GPAI) models, being the building blocks of AI systems, play a pivotal role in shaping our technological future. They are defined as AI models that display “significant generality”, are “capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market” and “can be integrated into a variety of downstream systems or applications”. Recognizing their significance, the AI Act sets out specific requirements for the development and deployment of such models, ensuring that users understand their underlying algorithms and functionality.

 

  3. Requirements for High-Risk AI Systems

Providers of high-risk AI systems must meet strict requirements to ensure that their AI systems are trustworthy, transparent and accountable. This includes, among other things, conducting risk assessments, using reliable data, documenting technical and ethical choices, maintaining performance records, informing users about the nature and purpose of their systems, enabling human oversight, ensuring accuracy and resilience, addressing cybersecurity concerns, testing for compliance, and registering systems in a publicly accessible EU database.

In addition, the AI Act imposes strict obligations across the value chain of a high-risk AI system. Not only the ‘provider’ of a high-risk AI system needs to be compliant, but also the ‘importer’, ‘distributor’ and ‘deployer’ of such a system. Broadly speaking, the importer needs to verify the system’s conformity by reviewing various documentation, whereas the distributor is required to verify that the system bears the CE (conformité européenne) marking.

The deployer (in previous drafts also called the user of the AI system) also has various obligations when it utilizes a high-risk AI system, one of them being the obligation to use the system in accordance with the provider’s instructions for use. This will be important in any potential liability discussion with the provider.

 

  4. Transparency Obligations for AI Systems and GPAIs

The AI Act puts transparency in the foreground. If a person interacts with an AI system, they need to be informed that they are interacting with an AI system rather than with another person. Exceptions apply where this is obvious from the circumstances or where the AI system is used, as authorised by law, for criminal law enforcement purposes.

Similarly, outputs of generative AI systems, including general-purpose AI models (e.g. audio, image, video or text content), must be marked as artificially generated or manipulated.

Where AI systems are used for emotion recognition or biometric categorisation, the persons exposed to such systems must be informed that the system is in operation.

In the case of deep fakes, the content must be labelled as having been artificially created or manipulated.

 

  5. Sanctions

The penalties under the AI Act can be very high. Engaging in a prohibited AI practice can lead to a fine of up to EUR 35 million or 7% of the company’s total worldwide annual turnover, whichever is higher. For infringements of the obligations relating to high-risk AI systems, the fine may be as high as EUR 15 million or 3% of the total worldwide annual turnover, whichever is higher.

 

  6. Next steps

We expect the AI Act to be published in its final form in mid-2024. It will enter into force 20 days after its publication in the Official Journal of the EU. Most of its provisions will apply after 24 months; the rules on prohibited AI practices will apply after 6 months, the provisions on GPAI models after 12 months, and the provisions regulating certain high-risk AI systems after 36 months.

____________________________________________________________________________________________________________________

 

The review was written by Rotem Perelman-Farhi, Partner and Head of the firm’s Technology, IP & Data Department, and Dr. Laura Jelinek, Associate in the firm’s Technology, IP & Data Department.

_____________________________________________________________________________________________________________________

* This newsletter is provided for informational purposes only, is general in nature, does not constitute a legal opinion or legal advice and should not be relied on as such. If you are seeking legal advice, it is essential to review the specific facts of each case in detail with a qualified lawyer.

Starting February 17, 2024, the Digital Services Act (“DSA”) will apply to providers of digital intermediary services that have a substantial connection to the EU (see in more detail here).

This bulletin focuses on providers of hosting services, which will face a large array of new obligations. “Hosting services” under the DSA include all services that store content provided by users, regardless of whether that content is disseminated to the public or only to individual third parties. A service qualifies as a hosting service whenever information provided by users is stored on their behalf.

The following is a brief compliance checklist outlining key tasks that hosting service providers may need to implement:

  • Updating the contact section on the website with the designated single points of contact for authorities and recipients of the service;
  • Reviewing and, where necessary, revising terms and conditions: these should explain how the organisation handles illegal content and content that is incompatible with its terms and conditions, what content is prohibited or unwanted in the service, and how prohibited or unwanted content will be dealt with (blocking, deletion, suspension of accounts, etc.);
  • Publishing annual transparency reports on their website covering “any content moderation that they engaged in during the relevant period”;
  • Putting mechanisms in place to allow any person or entity to notify the hosting service provider of illegal content in an easily accessible and user-friendly manner, exclusively by electronic means;
  • Establishing a process for proactively notifying law enforcement or judicial authorities of suspected criminal offences involving a threat to the life or safety of a person or persons;
  • Establishing a “Statement of Reasons” process, for example by creating templates that enable the provider to give clear and specific reasons for any restrictions imposed on service recipients, or for suspensions or terminations of a user account, due to illegal content or a violation of the terms and conditions.

It should be noted that the specific obligations for each hosting service provider can vary slightly depending on the circumstances.

For more information and assistance related to compliance with the Digital Services Act, please reach out to us at ERM.

_____________________________________________________________________________________________________________________

The review was written by Rotem Perelman-Farhi, Partner and Head of the firm’s Technology, IP & Data Department, and Dr. Laura Jelinek, Associate in the firm’s Technology, IP & Data Department.

_____________________________________________________________________________________________________________________

* This newsletter is provided for informational purposes only, is general in nature, does not constitute a legal opinion or legal advice and should not be relied on as such. If you are seeking legal advice, it is essential to review the specific facts of each case in detail with a qualified lawyer.

 

The EU Digital Services Act (“DSA”) brings some of the most important changes to the regulatory framework for offering online content, services and products to consumers in the European Union. It will apply from February 17, 2024 to a diverse range of intermediary services offered in the EU, including online marketplaces, web-hosting services, cloud services, search engines, and social media platforms.

What is new?

The DSA addresses a wide range of issues related to online platforms, including illegal content, hate speech, counterfeit products, and unfair competition. Its key points are:

  1. Enhanced Accountability: The DSA introduces new obligations for online platforms, making them more accountable for the content shared on their platforms. Platforms must provide predefined notice-and-action mechanisms for reporting alleged illegal content and follow up on such notices, including taking the necessary measures. Whether or not content qualifies as illegal is not determined by the DSA itself, but by the applicable law of the affected EU Member State.
  2. Transparency and Fairness: The DSA aims to promote transparency in online platforms’ policies and algorithms that influence the visibility and ranking of content. There will also be stricter rules for online advertising, for example a ban on targeted advertisements to minors on online platforms.
  3. Safeguarding User Rights: The DSA prioritizes user rights and empowerment. It requires online platforms to provide effective means for users to exercise their rights and introduces measures to tackle harassment and abusive behavior online.
  4. Strengthened Market Oversight: The DSA grants new powers to regulators to monitor and enforce compliance with the regulations. It establishes a single point of contact for cross-border issues, facilitates cooperation between EU member states, and enhances coordination with law enforcement authorities.

Who is affected?

The DSA applies to both B2B and B2C providers of digital intermediary services (intermediaries), who provide recipients with access to goods, services and content via the internet. This includes providers of:

  • mere conduit services (e.g. internet exchange points or wireless access points)
  • caching services (e.g. content delivery networks)
  • hosting services (e.g. cloud computing and web hosting)
  • online platforms (e.g. social networks and online marketplaces)
  • online search engines

The DSA applies to intermediary services that have a substantial connection to the EU, regardless of whether the intermediary service in question has an establishment in the EU. Such a substantial connection can exist where an intermediary service provides its services to a significant number of recipients in the EU or targets its activities towards one or more EU Member States.

It should be noted that providers of intermediary services that do not have an establishment in the EU but fall under the scope of the DSA must appoint a legal representative in one of the affected EU Member States, a principle familiar from the EU GDPR. The legal representative must have sufficient power of representation and resources, and has to act as a contact for authorities and service recipients, among other responsibilities.

Penalties for Non-Compliance

Organisations that fail to comply with the requirements of the DSA may face a fine of up to 6% of the worldwide annual turnover, or periodic penalties of up to 5% of the average daily worldwide turnover for each day of delay in complying with certain remedies, interim measures or commitments. As a last resort, the EU Commission can request the temporary suspension of the service.

For more information and assistance related to compliance with the Digital Services Act, please reach out to us at ERM.

__________________________________________________________________

The review was written by Rotem Perelman-Farhi, Partner and Head of the firm’s Technology, IP & Data Department, and Dr. Laura Jelinek, Associate in the firm’s Technology, IP & Data Department.

__________________________________________________________________

* This newsletter is provided for informational purposes only, is general in nature, does not constitute a legal opinion or legal advice and should not be relied on as such. If you are seeking legal advice, it is essential to review the specific facts of each case in detail with a qualified lawyer.

We are excited to announce that our client, The Phoenix Insurance Company, will invest up to NIS 350 million in preferred shares and an additional amount of up to NIS 150 million in ordinary shares of Powergen Ltd., which is controlled by Generation Capital Fund.

We are honored to have had the opportunity to represent the Phoenix once again with respect to a complex M&A transaction.

Our team included partners Ron Abelski, Itamar Lev Eldar, and Amnon Epstein, associate Adi Rafaeli, and intern Roni Ghouila.

In its decision from January 15th, 2024 (“Decision”), the European Commission (“Commission”) re-affirmed Israel’s adequacy status, meaning that transferring personal data related to European data subjects to Israel will continue to be seamless.

The free flow of data from the European Economic Area (“EEA”) to Israel will remain in place, and as a result, business between Israel and the EEA can continue to be conducted in an easier, more convenient manner compared to other non-EEA countries.

Background

Adequacy status can be granted to a country that is not a member of the European Union after an in-depth examination by the Commission to ensure that the country offers adequate safeguards for the personal data of European data subjects. The Commission has granted adequacy status to only a limited number of countries.

The renewal of Israel’s adequacy status is a relief for Israel-based companies, following the re-examination of the adequacy status (originally granted in 2011) that began after Regulation (EU) 2016/679 (the General Data Protection Regulation – GDPR) entered into effect in 2018.

The Israeli government took several steps to ensure that the adequacy status would be renewed. The Commission mentioned some of them in the Decision, including “specific safeguards to reinforce the protection of personal data transferred from the European Economic Area by adopting Privacy Protection Regulations (Instructions for Data that was Transferred to Israel from the European Economic Area), 5783-2023. Israel also strengthened the requirements for data security by adopting Privacy Protection (Data Security) Regulations, 5777-2017 and consolidated the independence of its data protection supervisory authority in a binding government resolution.”

The Acknowledgment

In the Decision, the Commission acknowledges that European data subjects’ personal data transferred to Israel enjoys adequate data protection safeguards. Therefore, the adequacy decision from 2011 remains in force, and personal data can continue to flow freely from the EEA to Israel without any special data transfer regime or arrangement (such as Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs)).

In practice, the renewal of the adequacy status means that the transfer of personal data from Europe to Israel will remain materially the same as the transfer of personal data within the EEA.

Why is it important?

The adequacy status is important for the Israeli economy, especially with regard to trade and research relations with Europe. The convenient and simple flow of data to Israel makes it easier, from a legal and regulatory standpoint, for entities in Israel (including Israeli companies, businesses, hospitals, research institutions and public authorities) that receive personal data to do business in the EEA or with EEA-based entities.

The adequacy status prevents the need for individual and resource-intensive mechanisms, for example detailed contractual arrangements, thereby reducing costs for businesses and organizations in Israel, reducing legal risks, and creating a competitive advantage for Israeli entities.

The Decision can be read here. The publication of the Privacy Protection Authority in the Ministry of Justice can be read here (in Hebrew).

The authors are Adv. Rotem Perelman-Farhi, head of the Technology, IP and Data Protection Department, and Adv. Einat Goldstein, associate in the Department.

* This newsletter is provided for informational purposes only, is general in nature, does not constitute a legal opinion or legal advice and should not be relied on as such. If you are seeking legal advice, it is essential to review the specific facts of each case in detail with a qualified lawyer.

The French Data Protection Authority (CNIL) has published a draft guide on Transfer Impact Assessment (TIA). The draft guide outlines the procedures and considerations for conducting a TIA. This draft serves as a guideline for organisations that transfer personal data outside of the European Economic Area (EEA) and therefore must assess the level of data protection in the countries of destination.

Background

Under Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data and repealing Directive 95/46/EC (GDPR), personal data transferred outside of the EEA (for example to a cloud service provider, or when shared with a parent company or subsidiary) must receive the same protection as within the EEA. This level of protection is ensured when personal data is transferred to countries benefiting from an adequacy decision (for example, Israel). In the absence of an adequacy decision, data exporters (acting as controllers or processors) must adopt measures such as Binding Corporate Rules (BCRs) or Standard Contractual Clauses (SCCs) to compensate for deficiencies in the data protection laws of the third country to which the personal data is transferred.

In its “Schrems II” ruling from July 2020, the Court of Justice of the European Union (CJEU) emphasized that both data exporters and data importers are responsible for guaranteeing that personal data, when transferred outside of the EEA, is granted the same level of protection as set by the GDPR.

Requirement for Transfer Impact Assessment

Consequently, data exporters must verify whether the third-country legislation is essentially equivalent to EU protection levels and implement additional measures where needed. If the transfer relies on a transfer tool under Art. 46 GDPR (such as BCRs or SCCs), a TIA is required, conducted by the data exporter together with the data importer. Until now, organizations have mainly relied on the recommendations of the European Data Protection Board (EDPB) on measures supplementing transfer tools, published in June 2021, to carry out their TIAs.

In this context, the CNIL decided to draft a practical guide to “help data exporters carry out their TIAs.” The CNIL has released a draft of this guide for public consultation until February 12, 2024, with the final version expected to be published later in 2024.

The CNIL Guidelines

The CNIL guide constitutes a methodology which identifies the various elements to be considered when carrying out a TIA. The CNIL points out that the use of this guide is not obligatory; other elements can be considered, and other methodologies can be applied.

The guide provides a TIA template based on the six steps recommended by the EDPB for carrying out a TIA, which are as follows:

  1. Know your transfer;
  2. Document the transfer tool used;
  3. Evaluate the legislation and practices in the country of destination of the personal data and the effectiveness of the transfer tool;
  4. Identify and adopt supplementary measures;
  5. Implement the supplementary measures and the necessary procedural steps;
  6. Re-evaluate the level of data protection at appropriate intervals and monitor potential developments that may affect it.

It is worth noting that the CNIL specifies that in the case of onward transfers, a separate TIA should be carried out for each type of onward transfer.

Compared to the EDPB’s recommendations, the CNIL also increases the responsibilities of the data importer. The CNIL finds the data importer’s cooperation “essential for the TIA to be carried out” and goes on to state that if the data importer is a data processor, its cooperation obligation is part of its obligations under Art. 28 of the GDPR. Essentially, while the main burden of conducting the TIA is on the data exporter, in the CNIL’s opinion the data importer has significant information obligations.

Differences to the ICO’s Transfer Risk Assessment

The CNIL’s guide follows the EDPB’s recommendations and as such differs from the transfer risk assessment (TRA) tool of the United Kingdom’s Information Commissioner’s Office (ICO), which may be used for transfers of personal data outside of the UK. The purpose of such a TRA is to assess whether a transfer increases privacy and rights risks compared to keeping the data in the UK. If no significant extra risk is found, the transfer is permitted. The ICO’s TRA tool offers a more risk-based (and possibly more business-friendly) approach than the EDPB’s. It focuses mainly on general human rights risks in the destination country, including (i) risks associated with third-party access to the data, especially by government and public bodies, and (ii) risks stemming from challenges in enforcing the Article 46 transfer mechanism.

For more information related to the transfer of personal data and on any other matters relating to privacy and data protection laws, please reach out to us at ERM.

Our partner, Adv. Rotem Perelman-Farhi, head of the Technology, IP and Data Protection Department, and Adv. Einat Goldstein, LL.M., associate in the Department, share the essential information.

* This newsletter is provided for informational purposes only, is general in nature, does not constitute a legal opinion or legal advice and should not be relied on as such. If you are seeking legal advice, it is essential to review the specific facts of each case in detail with a qualified lawyer.

ERM advised our client, Azorim, on the signing of a significant agreement with the majority of the property rights holders of a property in the city of Ramat Gan.

The deal is another example of the growing recognition of the importance of new and speedier construction starts, which are needed to meet market demand and support safe building development.

Partners Aharon Shimon and Yoav Zahavi from ERM’s Real Estate and Urban Renewal Department advised Azorim.

For the full article in Hebrew click here.

ERM advised Bank Leumi on the financial close of the NIS 510 million facility agreement with Menorah Synergy, a subsidiary of TeraLight and Senergy Renewable Energy, financing a portfolio of photovoltaic power plants on roofs and reservoirs totalling approximately 124 megawatts.

Thank you to our partners Amnon Epstein and Chen Weiss, and associates Nir Shaked and Ori Nehmad, from our Energy Department.

To read the full article in Hebrew click here.

 

ERM advised Jerusalem Homes Group Ltd. on the Complex 06 project in the Katamonim neighborhood in Jerusalem. The project includes 700 housing units, 3,800 square meters of commercial space, and 2.8 dunams of public space.

Partners Aharon Shimon and Yoav Zahavi, and associates Or Tzur, Golan Laihtam and Shoval Yaacov, all from the Real Estate and Urban Renewal Department, advised Jerusalem Homes Group Ltd.

For the Hebrew article click here.

ERM advised Paz Oil Company (TLV: PZOL), the largest Israeli fuels company, and Hagai Miller on reaching financial closing of its first renewable energy project.

Epstein Rosenblum Maoz’s (ERM) Energy Infrastructure and Climate practice group was proud to advise Paz once again, following its acquisition, together with Global Sun Israel, of the 272MW PV project in Texas, US, in 2021.

The financial closing included non-recourse project finance from Nomura and Poalim, Tax Equity finance from a leading US investment bank and equity finance from Menora Mivtachim Group.

Partner Asaf Rimon led the ERM team, together with Amnon Epstein, Galit Heller Farkash, Tal Ishay and Lin Nanikashvili.

For the full article in Hebrew click here.