- In a nutshell
- I. General
- II. Content
- III. Violation of the obligation to conduct a DPIA
- IV. Challenges and practical relevance
- V. Comparison with EU law
- Bibliography
- Materials
In a nutshell
Art. 22 FADP regulates the requirements for conducting a data protection impact assessment (DPIA). The article is modelled on Art. 35 f. GDPR. For planned data processing that is likely to entail a high risk to the personality or fundamental rights of the data subjects, this data protection risk analysis is intended to assess those risks in advance, so that appropriate risk-mitigating measures can be defined and implemented before the processing begins. According to the law, a high risk exists in particular in the case of (i) the extensive processing of personal data requiring special protection (e.g. health data) or (ii) the systematic, extensive monitoring of public areas (e.g. the installation of surveillance cameras in a park). The list of high risks mentioned in Art. 22 is, however, not exhaustive. Typically, high risks will arise from the use of new technologies and processes, such as algorithmic systems ("artificial intelligence"), DLT systems or big data analytics. A planned processing activity may, however, also pose a high risk to the personality or fundamental rights of the data subjects for other reasons, e.g. because children are involved, automated individual decisions are made or personal data is passed on to a complex network of processors. A violation of the obligation under Art. 22 is not directly punishable.
I. General
A. Overview
1 In contrast to the GDPR, the revised FADP does not provide for a general accountability obligation. However, it introduces the obligation to conduct a data protection impact assessment (DPIA) for high-risk data processing operations, which effectively amounts to an accountability obligation for certain processing operations. The DPIA is enshrined in law not only at the federal level but also in numerous cantons. In Zurich, for example, the corresponding obligation can be found in § 10 para. 1 IDG, in connection with prior checking (cf. on the prior consultation procedure at the federal level the commentary on Art. 23).
2 With the revision, the DPIA is anchored in the Swiss Data Protection Act for all controllers, private controllers as well as federal bodies. For this reason, the wording of the provision covers both high risks to the personality (for data processing by private controllers) and to the fundamental rights (for data processing by federal bodies) of the data subjects. The DPIA serves to identify the potential risks of a planned data processing operation at an early stage, to evaluate them and to define and implement risk-mitigating measures. These steps must be taken before the planned data processing is started (see further on the purpose n. 8 ff.). Art. 22 is an expression of the risk-based approach of the FADP, which becomes particularly apparent in the implementation of Art. 22 and 23.
3 The DPIA is considered to be of increasing importance. This is mainly attributed to the growing spread of advanced algorithmic systems and comparable new technologies in all areas of life, such as distributed ledger technologies or the Internet of Things. Many of these applications involve personal data, and it cannot be ruled out that their use exposes the personality or fundamental rights of the data subjects to high risks. In these cases, a DPIA is therefore required, and the controller is urged to analyze the planned data processing in advance. At the same time, it should be noted that, particularly in the case of algorithmic systems, certain processes are hardly traceable and their potential cannot be conclusively defined (the so-called "black box" problem). Accordingly, it may not always be possible to clearly define and delimit the personal data processed and the processing purposes, especially since many processors and other third parties are often involved and the big data analyses underlying the systems offer a potential that cannot yet be foreseen. A DPIA may therefore well be an "ambitious undertaking" in these cases, but its performance must be taken all the more seriously: the better this is done, the more likely a trustworthy and transparent use of such technologies can be ensured.
B. Legislative history
4 Prior to the total revision, the Swiss FADP did not explicitly require private persons processing data to conduct a DPIA. Federal bodies, however, were already required under the old law to notify the internal data protection officer or the FDPIC of projects involving the automated processing of personal data (Art. 20 para. 2 aVDSG). This notification was linked to the requirement that the respective federal authority draw up an information security and data protection concept ("ISDS concept") for the planned data processing. According to the dispatch, this procedure was comparable to the DPIA. For private controllers, a DPIA could in individual cases possibly be based on Art. 7 of the old FADP. In addition, the data processing principles have always required that the consequences of data processing be taken into account. Moreover, the FDPIC has in the past required private data processors to submit a DPIA in certain cases as part of its advisory activities, especially in the case of high-risk data processing. The introduction of the obligation to conduct a DPIA was not least due to European requirements pursuant to Art. 8bis E-SEV 108 and Art. 27 et seq. of Directive (EU) 2016/680. Art. 35 et seq. GDPR contain analogous provisions for controllers that fall within the scope of the GDPR, but do not directly bind Switzerland as a legislator.
5 Internationally, the instrument builds on a history dating back to the 1990s. During this period, the first privacy impact assessments were developed, both under the EU Data Protection Directive (Art. 20 Directive 95/46/EC), which was later superseded by the GDPR, and in the Anglo-Saxon world. EU member states such as France and Germany issued corresponding recommendations based on Directive 95/46/EC. These non-binding guidelines aimed at minimizing or, if possible, eliminating potential risks from the outset, especially since such a preventive approach generally proved simpler and more cost-effective than adapting processes after the fact, e.g. following an investigation by the competent supervisory authority. These goals were to be achieved, among other things, by describing the "information flows", involving and consulting "stakeholders", and preparing reports. The provisions of the former German Federal Data Protection Act (aBDSG), which provided increased protection for special categories of data in Section 3 para. 9 aBDSG, can also be cited as an example. These special data included data on political opinions or health. Before such data could be processed by organizations, the responsible data protection officer had to conduct a so-called "prior check" to ensure proper handling (Section 4d para. 5 aBDSG).
6 The Swiss legislative process followed these developments by providing for a DPIA in all preliminary versions of the totally revised FADP. In the preliminary draft, Art. 22 and the related Art. 23 FADP were combined into a single provision, Art. 16 VE-FADP. Art. 16 para. 3 VE-FADP was, however, stricter than the provision now in force: it required that both the result of any DPIA and the measures taken be reported to the FDPIC. In the draft, the impact assessment was then split into two articles and aligned with the structure of Art. 35 GDPR. The draft's list of high-risk processing operations, which mandatorily require a DPIA, also included profiling (Art. 20 para. 2 lit. b of the draft). This processing operation was, however, removed from the list of high-risk data processing operations in the final text of the law in the course of the parliamentary deliberations and in keeping with the system of the law. This step is to be welcomed, especially since in many cases profiling will not be classified as particularly sensitive and a DPIA would therefore not be justified.
7 The DPIA is now codified in Art. 22 FADP and applies to both private controllers and federal bodies. The provision thus goes beyond the obligation that previously existed only for federal bodies under Art. 20 para. 2 aVDSG and expands it. Apart from Art. 14 of the Data Protection Ordinance (DPO) on the obligation to retain the DPIA documentation for two years, the new norm introduced with the total revision lacks further implementing provisions that could simplify its application in practice.
C. Purpose of the Norm
8 Compliance with the requirements of Art. 22 serves to ensure that a planned data processing operation does not lead to a violation of the FADP. The DPIA is intended to raise the controller's awareness of high-risk data processing, such as the use of algorithmic systems ("artificial intelligence") or big data analytics, the use of facial recognition software, the introduction of a patient database or high-risk profiling. It is thus a "data protection self-assessment" for particularly high-risk data processing. Art. 22 is an expression of the data protection principles as well as of the principle of data protection by design and by default (Art. 7). The aim is to identify risks in a first step and to evaluate them analytically in a second step. In this process, the controller should make a forecast of the possible consequences of a planned data processing operation for the data subjects; what matters is how and to what extent the planned processing will affect them. Finally, the norm aims at developing and implementing appropriate measures, based on the results of the analysis, in order to (i) mitigate the identified risks of the planned processing, (ii) draw long-term conclusions from them and (iii) behave in a data protection-compliant manner overall. If this is successful, the DPIA also offers the possibility of demonstrating compliance with data protection law to the FDPIC and of saving costs in the long run. In this sense, Art. 22 requires the controller to independently think ahead and examine the risks arising from the planned processing: Does the planned processing comply with the requirements of data protection legislation? Are the general data protection principles complied with? Is the processing proportionate? Is data security endangered by the planned processing? Do other risks arise for the personality or fundamental rights of the data subjects? These questions must be addressed regardless of the level of risk, with the legislator relying to a large extent on a "hands-on mentality". This should serve to reduce the risks associated with data processing to a level that is perceived as appropriate.
9 The documentation and retention obligations under Art. 14 DPO supplement the obligation to conduct a DPIA for high-risk data processing. The controller is thus obliged to document a DPIA that has been carried out and to retain it for at least two years after the end of the data processing. In this way, in the event of an investigation by the FDPIC, the controller is in a position to justify the high-risk data processing and to show how the high risks are addressed. This provision makes clear the character of the DPIA as a complementary accountability obligation both towards the FDPIC and towards the data subjects.
II. Content
A. Addressees
10 Art. 22 addresses two actors in particular: federal bodies as well as private controllers, since the provision covers risks both to the personality and to the fundamental rights of the data subjects. Federal bodies already had a similar obligation under the old law. Private controllers as well as federal bodies are subject to this obligation if they fall within the scope of the FADP pursuant to Art. 3 para. 1. This means that private controllers established abroad are also covered, provided that their data processing activities have an effect in Switzerland in accordance with the effects principle and the other requirements of Art. 22 are met (see the commentary on Art. 3 for details on the effects principle).
11 Processors are rightly not covered by the obligation to conduct a DPIA, especially since they process the data on behalf of a controller and it is, in principle, the controller who decides on the purpose and means of the processing. The risk assessment is therefore fundamentally not within their sphere. Unlike Art. 28 para. 3 lit. f GDPR, the FADP does not impose a duty on processors to support controllers in carrying out a DPIA; such support can, however, also be in the processors' interest, and in practice a corresponding obligation will often be part of the contractual arrangements between the controller and the processor. Moreover, an obligation can be derived from the data protection principles in Art. 6 for processors to ensure that the data processing entrusted to them is carried out in accordance with those principles. If this is not the case, the controller must be informed that the processing activity should be adapted; if necessary, it should even be suggested that a DPIA be performed.
B. The high risk
12 Art. 22 para. 1 provides that a DPIA must be carried out in any case in which a high risk to the personality or fundamental rights of the data subjects can be assumed. Accordingly, a DPIA does not have to be performed for every processing operation. This corresponds to the risk-based approach of the FADP, which focuses specifically on the risks for the data subjects and not on risks arising for the controllers.
13 The high risk results from the nature, scope and circumstances as well as the purpose of the data processing. These criteria are comparable to the risks that must be taken into account when determining data security measures pursuant to Art. 1 para. 3 DPO. A high risk exists in particular, but not exclusively, when new technologies are used. The more extensive and sensitive the processing operations and data sets are, the more likely a high risk can be assumed. The planned processing must be examined, among other things, with regard to its impact on the identity, self-determination or dignity of the data subjects. The high risk does not have to materialize: what matters is identifying and assessing potential risks that could arise in the course of the processing.
14 Pursuant to lit. a and lit. b, a high risk is deemed to exist, by way of statutory fiction, if (i) personal data requiring special protection is processed on a large scale (e.g. when criminal record information is published in a public register, when health applications are used that evaluate the data and make diagnoses, or in clinical trials) or (ii) public areas are monitored extensively and systematically (e.g. the monitoring of a public playground or park, or the systematic monitoring of employees' internet use). The law does not list any other processing operations for which a DPIA must necessarily be performed. The GDPR, by contrast, provides for example that supervisory authorities must publish lists of processing operations for which a DPIA is mandatory.
15 According to the wording of Art. 22 para. 2, neither high-risk profiling nor automated individual decisions are listed among the processing activities that mandatorily require a DPIA. At first glance this seems surprising, especially since high-risk profiling by definition entails a high risk for the data subject and automated individual decisions lead to legal consequences or other consequences that significantly affect the data subject. Against this background, a DPIA must always be performed for high-risk profiling. In the case of automated individual decisions, however, the situation is different and more complex, since a DPIA evaluates the data processing itself, which is not necessarily sensitive in the case of such decisions. What is sensitive here is primarily the result of the decision, which is regulated in Art. 21. A DPIA must nevertheless be carried out if the processing leading to the automated individual decision could in itself entail a high risk to the personality or fundamental rights of the data subjects.
16 The dispatch lists further processing operations that could lead to a high risk, namely the processing of large amounts of data, the transfer of data to third countries (in this context, reference should be made to the so-called data transfer impact assessment; see the commentary on Art. 16 f.) or situations in which a large number of persons can access the data. In individual cases, however, it must always be examined whether there is actually a high risk.
17 Neither the FADP nor the DPO contains more precise requirements for determining a high risk. It must therefore always be assessed on the basis of the specific planned processing whether a DPIA has to be carried out. The EU guidelines on Art. 35 et seq. GDPR set out nine criteria according to which a high risk can be determined. According to the guidelines, as soon as a planned processing operation meets two of these nine criteria, a DPIA must as a rule be conducted. The nine criteria are as follows:
Evaluation or classification of persons or personality aspects with the help of data;
Automated decision making with legal effect on natural persons or similarly significant effect;
Systematic monitoring;
Processing of confidential or highly personal data;
Large-scale data processing;
Matching or merging of different data sets, unless this was to be expected;
Data processing of vulnerable persons (e.g. children, employees or patients);
Innovative use or application of new technological or organizational solutions (e.g. artificial intelligence, big data analytics or blockchain-based technologies), as their data protection risks have often not yet been adequately assessed; and
Cases where data processing prevents data subjects from exercising a right or using a service or performing a contract.
While these EU guidelines are not directly applicable in Switzerland, they can be consulted when conducting a DPIA given the comparable legal requirements under Swiss and EU data protection law. Furthermore, it cannot be ruled out that the FDPIC will publish similar guidelines or apply, by analogy, the lists of processing operations published by the various EU supervisory authorities pursuant to Art. 35 para. 4 GDPR that mandatorily require a DPIA. Practice will provide more legal certainty with respect to the assessment of high risk.
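For readers who wish to operationalize the two-of-nine heuristic from the guidelines summarised above, the following Python sketch counts how many criteria a planned processing activity meets and flags whether a DPIA appears indicated. The criteria labels and the threshold of two follow the text above; the function and variable names are illustrative assumptions, not part of any official tool.

```python
# Hypothetical pre-screening helper based on the nine criteria from the EU guidelines.
EU_CRITERIA = {
    "evaluation_or_scoring",              # evaluation or classification of persons
    "automated_decision_with_effect",     # automated decisions with legal or similarly significant effect
    "systematic_monitoring",
    "confidential_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_merging_datasets",
    "vulnerable_data_subjects",           # e.g. children, employees, patients
    "innovative_technology",              # e.g. AI, big data analytics, blockchain
    "blocks_right_service_or_contract",   # processing prevents exercising a right or using a service
}

def dpia_indicated(criteria_met: set[str], threshold: int = 2) -> bool:
    """Rough screening: True if at least `threshold` of the nine criteria are met."""
    unknown = criteria_met - EU_CRITERIA
    if unknown:
        raise ValueError(f"unknown criteria: {unknown}")
    return len(criteria_met) >= threshold

# Example: a health app processing sensitive data on a large scale
print(dpia_indicated({"large_scale_processing", "confidential_or_highly_personal_data"}))  # True
```

Such a screening can of course only support, not replace, the case-by-case assessment described above.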
C. Conducting the DPIA
1. Introduction
18 Art. 22 provides clear guidance on the cases in which the controller must conduct a DPIA and on what must be observed when conducting it. This includes that the risk assessment must be carried out and documented in a structured manner. The documentation must also stand up to a consultation with the FDPIC if, despite the measures implemented, the processing still results in a high risk to the personality or fundamental rights of the data subjects (Art. 23 para. 1; cf. in detail the commentary on Art. 23).
2. Procedure
19 The DPIA must be carried out by the controller before a new or modified data processing activity is started. This serves to identify the risks of the planned processing in advance and to implement appropriate measures before legal violations occur. The rule here is: the sooner, the better. A single, initial performance of the DPIA is, however, not sufficient. Rather, further iterations are necessary during the respective project in order to check whether adjustments made to the processing lead to a different assessment, so that other measures appear more appropriate or existing ones become obsolete.
20 In terms of content, the DPIA requires three steps, which are defined in para. 3: first, the preparation phase, in which the processing of the data is described as precisely as possible; second, the assessment phase, in which the risks are evaluated; and, based on this, a third step in which measures to address and manage these risks are defined.
21 Before carrying out these three steps, however, private controllers should check whether they are legally obliged to process the data. If there is a legal obligation to process, they are exempt from the obligation to perform a DPIA (Art. 22 para. 4). Furthermore, the federal body or private controller should first conduct a preliminary assessment. This is a general, rather superficial risk assessment to check whether the planned processing could potentially entail a high risk (cf. on high risk n. 12 ff.). In this way, it can be clarified in advance whether a DPIA is necessary at all. If the controller concludes that the planned processing is not likely to lead to a high risk, a DPIA is not required; such a decision should, however, be documented and retained for evidentiary purposes. If the controller concludes on the basis of this preliminary assessment that the processing in question could lead to a high risk, the DPIA must generally be performed. An exception exists only in the cases of Art. 22 para. 5.
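The decision flow described in this paragraph can be pictured as in the following minimal sketch. It is a hypothetical illustration only: it checks the legal-obligation exemption of Art. 22 para. 4 for private controllers first and otherwise lets the documented preliminary assessment decide; the exceptions of Art. 22 para. 5 are omitted for brevity, and all names and fields are invented.

```python
from dataclasses import dataclass

@dataclass
class PreliminaryAssessment:
    project: str
    private_controller: bool
    processing_required_by_law: bool   # Art. 22 para. 4 exemption for private controllers
    high_risk_likely: bool             # outcome of the rough, superficial risk screening
    rationale: str                     # documented and retained for evidentiary purposes

def dpia_required(pa: PreliminaryAssessment) -> bool:
    # The para. 5 exceptions (certification, approved code of conduct) are not modelled here.
    if pa.private_controller and pa.processing_required_by_law:
        return False                   # exempt; the decision should still be documented
    return pa.high_risk_likely

assessment = PreliminaryAssessment(
    project="customer analytics platform",
    private_controller=True,
    processing_required_by_law=False,
    high_risk_likely=True,
    rationale="New scoring model applied to a large customer base.",
)
print(dpia_required(assessment))  # True -> conduct the DPIA before processing starts
```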
22 The preparatory phase includes a detailed description of the (planned) data processing activity. The law specifies neither what this description must include nor how detailed it must be. Depending on the complexity of the planned processing activity, the DPIA may therefore be longer or shorter. The description should, however, at least include the types of data processed and the categories of data subjects, an explanation of the purpose pursued, the retention period, the processing operations, the recipients, and the place of processing and retention; otherwise a complete risk assessment is not possible. It should indicate why and how which personal data are to be processed. The description must also address the underlying technologies and systems as well as the relevant legal foundations, so that a possible need for legislative action can be identified and addressed at an early stage.
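One possible way to structure the description produced in the preparatory phase is a simple record covering the minimum elements listed above. The field names in the sketch below are illustrative assumptions rather than statutory terms.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingDescription:
    data_categories: list[str]            # types of personal data processed
    data_subject_categories: list[str]    # e.g. customers, employees, patients
    purposes: list[str]                   # purposes pursued with the processing
    retention_period: str                 # e.g. "10 years after end of contract"
    processing_operations: list[str]      # e.g. collection, scoring, disclosure
    recipients: list[str]                 # including processors and other third parties
    processing_locations: list[str]       # place of processing and retention
    technologies: list[str] = field(default_factory=list)  # underlying systems and technologies
    legal_bases: list[str] = field(default_factory=list)   # relevant legal foundations, if any
```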
23 This is followed by the assessment phase, in which the necessity and proportionality of the processing are weighed against the possible risks. The negative consequences of the processing for the personality or fundamental rights of the data subjects are assessed. High risks or negative effects of data processing can manifest themselves in different ways: processing can, for example, damage a person's reputation, or the processing of incorrect data can result in wrong prognoses or diagnoses in a medical context. Situations are also conceivable in which a person is unjustifiably dismissed, or in which trust in the authorities, in the private controller or in friends and family is lost. In addition, data processing can also have financial consequences. These and comparable risks must be analyzed in terms of their probability and severity during the assessment phase.
24 In assessing the risks, the data protection principles, above all data minimization, data protection by design and data protection-friendly default settings, must be observed (see n. 4). The informational self-determination of the data subjects is also of particular importance, but not the interests of the controller. A balancing of interests with regard to the measures to be taken, between the interests of the data subject (data protection interest, interest in the freedom to dispose of one's own data) and those of the controller (data processing interest), may, however, take place at a later point in time. Although the data relevant for the DPIA does not necessarily have to be "particularly worthy of protection", the sensitive nature of the data must be taken into account in the respective DPIA and the measures to be taken. Accordingly, both the amount and the sensitivity of the data are to be considered in this phase. The assessment phase comprises the assessment of the severity of the risks and of the likelihood of their occurrence. Although this phase is intended to identify a potentially high risk, risks may also arise after the DPIA has been performed; in this sense, the three phases mentioned above cannot be strictly demarcated from each other.
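As an illustration of the assessment phase, the sketch below rates each identified risk by likelihood and severity and derives a rough overall level. Neither the FADP nor the DPO prescribes a particular methodology, so the three-step scales and the mapping are purely assumed for the example.

```python
from dataclasses import dataclass

LEVELS = ("low", "medium", "high")

@dataclass
class Risk:
    description: str   # e.g. "wrong diagnosis based on inaccurate health data"
    likelihood: str    # one of LEVELS
    severity: str      # one of LEVELS

def risk_level(risk: Risk) -> str:
    """Combine likelihood and severity into a rough overall risk level."""
    score = LEVELS.index(risk.likelihood) + LEVELS.index(risk.severity)
    return "high" if score >= 3 else "medium" if score >= 2 else "low"

risks = [
    Risk("wrong diagnosis made on the basis of inaccurate data", "medium", "high"),
    Risk("loss of trust after unauthorised disclosure to third parties", "low", "medium"),
]
for r in risks:
    print(f"{r.description}: {risk_level(r)}")
```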
25 Once the assessment phase has established whether the processing entails a high risk to the personality or fundamental rights of the data subjects, measures to reduce the risk are defined in the third phase. In this phase, the controller must therefore not only be aware of the existing risks but must also decide how to counter them. The measures can be organizational, technical, legal or contractual. The controller must analyze how the planned processing activity can be adapted so that the high risk is reduced to a reasonable level. Such measures may include, for example, limiting the categories of data, restricting access to the data, encrypting data, replacing a processor involved, changing the place of processing, or including specific data processing obligations in contracts with third parties. The measures must, however, be suitable to mitigate the specifically identified risks; they cannot be applied by default, i.e. generally with respect to all risks of processing activities that require a DPIA, but must be defined on a case-by-case basis.
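Building on the previous sketch, the measures phase can be recorded per identified risk, together with the residual risk that remains after the measures have been applied; a residual risk that remains high despite the measures would point towards the prior consultation under Art. 23. The structure and example entries below are again illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Mitigation:
    risk: str             # the specifically identified risk
    measures: list[str]   # organizational, technical and legal/contractual measures
    residual_risk: str    # "low" / "medium" / "high" after the measures are applied

plan = [
    Mitigation(
        risk="wrong diagnosis made on the basis of inaccurate data",
        measures=[
            "limit the categories of data processed",
            "plausibility checks before data is entered",
            "contractual quality obligations for the processor involved",
        ],
        residual_risk="low",
    ),
]

for m in plan:
    print(f"{m.risk} -> residual risk: {m.residual_risk}")
```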
26 The FADP does not specify whether a DPIA must be performed only once or on a regular basis. This depends on the processing activity: if it is a one-off, a single DPIA is sufficient. If, however, the processing is carried out over a longer period of time and new or higher risks arise due to changed circumstances, the DPIA should be repeated. As a rule of thumb, the DPIA should be repeated approximately every three years. Whether this recommendation will prove adequate in the long term in view of rapidly advancing and increasingly comprehensive technological innovation remains to be seen, but from the current perspective it seems rather unlikely.
27 In terms of timing, the retention period must also be taken into account. According to Art. 14 DPO, the required documentation of the DPIA must be retained for two years. The period begins to run at the end of the data processing.
3. Conducting a joint DPIA
28 The controller has the right to conduct a joint DPIA for several comparable processing operations. This only applies, however, if both the risks and the countermeasures are similar. This option serves to reduce the controller's effort and costs. It is nevertheless doubtful whether it will actually lead to a reduction in the burden on controllers, unless a particular processing platform or application is used in different sectors or projects.
4. Organization
29 The performance of a DPIA is not the responsibility of only one particular person, body or department at the controller. Like any other risk analysis, it is a process in which several persons, bodies and departments must be involved across the organization. Within an organization, the DPIA typically involves project management, the IT department and/or chief information security officer, the data protection office or data protection advisor, the legal department, the risk and compliance function, and other key individuals. Each person involved is essential to certain steps of the DPIA. The project management, for example, is particularly important for the preparation phase because it can best describe the (planned) data processing. The data protection and legal functions are responsible for assessing the risks under data protection and other applicable law. Other departments are of particular relevance for determining the risk-mitigating measures. Processors who will be used in the planned processing operation should also be involved in the DPIA. As with any project, it is particularly important that one person, office or department takes the lead in the DPIA and maintains an overview of the entire process. This lead is usually taken by the data protection advisor or the project manager at the operational level.
30 From the controller's point of view, it therefore makes sense to designate an office or department that is familiar with the legal requirements for a DPIA and has the necessary expertise to perform one. This is usually the office or department responsible for data protection. Depending on the focus or business model of the controller, it may be helpful to create a template, including instructions, for performing a DPIA that can be used in individual cases. In this way, the legal requirements are met without certain aspects being overlooked.
D. Exceptions
31 In addition to the general obligation to conduct a DPIA in cases where the data processing leads to a high risk for the data subjects, paras. 4 and 5 define some exceptions.
32 Art. 22 para. 4 provides that private controllers may be exempt from the obligation to perform a DPIA in individual cases if they are required by law to process the data. In the legislative process, this exemption was justified on the ground that the possible risks in these cases have already been weighed when the legislation was enacted, so that there is no further need to conduct a DPIA. It is, however, rightly pointed out that the exception relates solely to the question of the permissibility of the data processing, not to the manner of the respective processing. The GDPR is stricter in this respect and requires that the risk analysis carried out in the context of the legislation relate to the specific processing activity in order for the exception to apply (cf. n. 40). In addition, those cases in which the processing does not take place exclusively to fulfill the respective legal obligation must also be taken into account. Legal obligations for private parties to process data can be found, for example, in the Federal Act on Combating Money Laundering and Terrorist Financing (Art. 30 AMLA) or in the disclosure of employee data by an employer to fulfill social security obligations (Art. 84 KVG). It goes without saying that federal bodies are not covered by this exemption: they may always process data only on the basis of a legal foundation.
33 Art. 22 para. 5 provides for a further exception: private controllers may dispense with a DPIA if certain conditions relating to the certification of a data processing operation or to compliance with approved codes of conduct are met. These exemptions do not apply to federal bodies. On the one hand, this is the case if the controller uses a system, product or service that is certified for the intended use under Art. 13 (see the commentary on Art. 13 FADP for more details). On the other hand, the obligation does not apply if the controller complies with a code of conduct within the meaning of Art. 11 that (i) is based on a DPIA, (ii) provides for measures to protect the personality and fundamental rights of the data subjects, and (iii) has been submitted to the FDPIC. Currently, no such codes of conduct exist in Switzerland. Such codes are, however, conceivable for industries in which the participants work closely together and jointly develop standardized processes, such as banks and insurance companies, technology or media companies, or the advertising industry. In the EU, the development of comparable codes of conduct is governed by Art. 40 GDPR; one currently known example is the EU Cloud Code of Conduct, which serves as a harmonized code of conduct for the cloud industry within the EU.
III. Violation of the obligation to conduct a DPIA
34 If a controller does not perform a DPIA although it is obliged to do so under Art. 22, Art. 51 para. 3 lit. d must be observed. Under that provision, the FDPIC may order the controller to perform a DPIA, under penalty of a fine in the event of non-compliance. Data subjects, by contrast, have no right to take legal action to enforce the performance of a DPIA.
35 If the FDPIC determines in the course of an investigation that a processing operation leads to a high risk, it may order that the processing be adapted, suspended or stopped altogether until a DPIA has been carried out (Art. 51 para. 1 in conjunction with Art. 49 f.). This can lead to considerable delays for the controller and, in turn, to high costs and additional effort. Moreover, since the results of an investigation can be published by the FDPIC (Art. 57 para. 2), there are significant reputational risks for the controller.
36 The violation of the obligation to conduct a DPIA is not independently subject to a fine. This appears reasonable in light of the fact that the obligation is primarily a due diligence obligation (consisting of a documentation and accountability obligation). In the EU, this is handled differently (cf. n. 40 ff.). It can nevertheless be assumed that controllers will comply with their obligations under the FADP: on the one hand, because authorities already had to observe a comparable obligation under the old law; on the other hand, because private controllers, in particular corporate groups, already carried out DPIAs before the entry into force of the totally revised FADP, e.g. because the data processing in question fell within the scope of the GDPR or because the FDPIC had required it.
IV. Challenges and practical relevance
37 With the entry into force of the revised law, it is apparent that the normative requirements of Art. 22 for a DPIA involve a high degree of complexity. This is particularly true for implementation by small and medium-sized enterprises (SMEs), whose economic importance in Switzerland should not be underestimated. For them, carrying out the individual steps in accordance with Art. 22 is likely to prove challenging. This applies in particular to the preparation and the measures phase (see n. 18 ff.), which are time-consuming simply because of the analytical procedure required and which are likely to generate considerable costs in individual cases. Accordingly, it can be assumed that the need for templates, advice and assistance from the FDPIC will increase significantly in the near future. In this respect, the obligation of the European supervisory authorities to publish lists defining when a DPIA is mandatory can be seen as a step in this direction (Art. 35 para. 4 GDPR; cf. n. 43).
38 With the increasing use of algorithmic systems and other applications (see n. 2), the DPIA will also gain in importance. Controllers must carefully consider in each individual case which risks appear acceptable and whether the use of such systems is worthwhile. This analytical approach prior to the actual start of a project should not be new, particularly for large companies, especially since corresponding privacy impact assessments were already carried out before the total revision (see n. 4 f. and 35) and such companies should be familiar with comparable obligations at EU level under the GDPR.
39 The DPIA also makes clear that in the area of new disruptive technologies, different areas of law overlap and legal questions increasingly blend with questions concerning the technology used. One example of this development is the discussion surrounding the Artificial Intelligence Act ("AI Act"). Comparable to the DPIA under Art. 22 FADP, the most recently discussed text follows a strongly risk-based approach that differentiates between limited, high and unacceptable risk. In particular, for high-risk applications as defined in Annex III of the AI Act, a conformity assessment procedure must be carried out in accordance with Art. 43 AI Act, which is based, among other things, on Annex VI. Like the DPIA under the FADP, these requirements are to be understood as a preventive, ongoing process that encompasses both legal and technical aspects and serves to further transparency and thus user trust.
V. Comparison with EU law
40 In general, the implementation of the DPIA under the FADP is comparable to the EU-wide requirements of the GDPR. At the same time, Art. 22 f. FADP are less strict than Art. 35 f. GDPR. In the EU, a DPIA must likewise be carried out if data processing is likely to lead to a high risk to the rights and freedoms of natural persons. In contrast to Art. 22 FADP, however, Art. 35 para. 3 lit. a GDPR also explicitly provides for a DPIA for automated individual decisions.
41 Lit. a to d of Art. 35 para. 7 GDPR set out steps comparable to Art. 22 FADP that must be taken into account when a DPIA is performed: the preparatory phase corresponds to lit. a, the assessment phase to lit. b and c, and the implementation of the measures to lit. d. While in Switzerland it can likewise be assumed that a DPIA may have to be carried out again in the course of the processing due to changed circumstances, Art. 35 para. 11 GDPR even explicitly provides for this in certain cases.
42 While Art. 35 para. 2 GDPR contains an obligation to consult the data protection officer, at least if one has been appointed, Art. 35 para. 9 GDPR even provides that the controller may seek the views of the data subjects when carrying out the DPIA. The FADP contains no such provisions. The consultation of the data protection officer has no influence on the content of the DPIA, especially since the role of the data protection officer in this context is not specified. Obtaining the views of the data subjects, on the other hand, promotes transparency on the part of the controller as well as self-determination on the part of the data subjects. These consultations reinforce the purpose of the DPIA by demonstrating once again that the risks to be assessed are those which the data processing poses for the data subjects, and thus not primarily risks for the controller.
43 Unlike in Switzerland, Art. 35 para. 4 GDPR requires supervisory authorities to maintain a list of processing operations for which a DPIA must be carried out in any case (so-called positive lists). The European legislator thereby in effect establishes an obligation to conduct a DPIA in the cases defined by the supervisory authorities. In addition, supervisory authorities may optionally maintain lists of processing activities for which a DPIA is not required (so-called negative lists; Art. 35 para. 5 GDPR). These lists are subject to the consistency mechanism if they include "processing activities which are related to the offering of goods or services to data subjects or to the monitoring of their behaviour in several Member States, or may substantially affect the free movement of personal data within the Union" (Art. 35 para. 6 GDPR). The FADP does not provide for an obligation to maintain positive or negative lists, although this would be conducive to legal certainty.
44 While Art. 22 para. 5 FADP explicitly states that a DPIA need not be performed if a code of conduct pursuant to Art. 11 FADP is complied with and the additional requirements of para. 5 are met, under the GDPR such codes of conduct are merely to be taken into account in the DPIA as part of the assessment of the impact of the processing in question. They do not, however, release the controller from the obligation to conduct a DPIA where it is required by law.
45 Finally, similar to Art. 22 para. 4 FADP, Art. 35 para. 10 GDPR provides that a DPIA does not have to be performed if the processing is based on a legal basis. The exception in the GDPR is, however, more narrowly defined: it only applies if a DPIA was already carried out as part of the adoption of that legal basis and the legal basis regulates the specific processing operation or set of operations. Its practical significance is therefore likely to be limited. In addition, the exception under Art. 35 para. 10 GDPR also covers processing carried out in the performance of a task in the public interest or in the exercise of official authority.
46 For violations of the obligation to conduct a DPIA, the GDPR provides for fines of up to EUR 10 million or up to 2% of the total worldwide annual turnover of the preceding business year (Art. 83 para. 4 lit. a GDPR). This contrasts with the FADP, which does not provide for a fine for such a violation (see n. 35).
Bibliography
Baeriswyl Bruno, Der «grosse Bruder» DSGVO und das revDSG: Ein vergleichender Überblick, S. 8–15.
Blonski Dominika, Kommentierung zu Art. 22 DSG, in: Baeriswyl Bruno/Pärli Kurt/Blonski Dominika (Hrsg.), Stämpflis Handkommentar zum DSG, 2. Aufl., Zürich/Basel, 2023
Blonski Dominika, DSFA und Vorabkonsultation, digma 2020, S. 28–32.
Früh Alfred/Haux Dario, Foundations of Artificial Intelligence and Machine Learning, Weizenbaum Series 29, DOI: 10.34669/WI.WS/29.
Klaus Samuel, KI trifft Datenschutz – Risiken und Lösungsansätze, in: Epiney Astrid/Rovelli Sophia, Künstliche Intelligenz und Datenschutz, Zürich 2021, S. 81–95.
Lobsiger Adrian, Hohes Risiko – kein Killerargument gegen Vorhaben der digitalen Transformation, SJZ 2023, S. 311–319.
Pfaffinger Monika, Das Recht auf informationellen Systemschutz. Plädoyer für einen Paradigmenwechsel im Datenschutzrecht, Habil., Baden-Baden 2022.
Rosenthal David, Das neue Datenschutzgesetz, Jusletter 16. November 2020.
Schönbächler Matthias R., Zum neuen Schweizer Datenschutzrecht, ZBJV 2023, S. 171–195.
Schürmann Kathrin, Datenschutz-Folgenabschätzung beim Einsatz Künstlicher Intelligenz: Bewertung und Minimierung der Risiken, ZD 2022, S. 316–321.
Vasella David, Widersprüche im Datenschutzrecht, digma 2020, S. 174–179.
Vasella David/Sievers Jacqueline, Der «Swiss Finish» im Vorentwurf des DSG, digma 2017, S. 44–49.
Widmer Michael, Datenschutz-Folgenabschätzung, digma 2017, S. 224–231.
Kühling Jürgen/Buchner Benedikt, Datenschutz-Grundverordnung BDSG, Kommentar, 3. Aufl., München 2020.
Rosenthal David/Gubler Seraina, Die Strafbestimmungen des neuen DSG, SZW/RSDA 2021, S. 52–64.
Materials
Botschaft vom 15. September 2017 zum Bundesgesetz über die Totalrevision des Bundesgesetzes über den Datenschutz und die Änderung weiterer Erlasse zum Datenschutz, BBl 2017, S. 6941 ff., abrufbar unter: https://fedlex.data.admin.ch/eli/fga/2017/2057, besucht am 5.6.2023.