Federal Court of Appeal clarifies rules around safeguarding data and meaningful consent
Legal Development | 11 September 2024 | North America | Regulatory risk
On September 9, 2024, the Federal Court of Appeal overturned a lower court decision and found that Facebook Inc. (now Meta Platforms Inc.) (“Facebook”) failed to obtain meaningful consent and to safeguard user data as required under the Personal Information Protection and Electronic Documents Act, S.C. 2000, c. 5 (PIPEDA).
This is an important decision, outlining organizations’ responsibilities when collecting and disclosing personal information and the potential regulatory response to shortcomings.
Background
In March 2018, the Privacy Commissioner of Canada (the “OPC”) commenced its investigation of Facebook following receipt of a PIPEDA complaint that a third-party application, “thisisyourdigitallife” (the “TYDL App”), had obtained data of over 600,000 Canadians through the Facebook Platform and disclosed it to Cambridge Analytica.[1] The TYDL App obtained data beyond what it needed to function (including users’ friends’ data) and transferred and sold user data to a third party. The global fallout from this scandal resulted in significant fines against Facebook, including from regulators in the United Kingdom and the United States.
The OPC subsequently brought an application before the Federal Court alleging that Facebook breached PIPEDA through its practice of sharing users’ personal information with third-party applications (“Third Party Apps”) hosted on the Facebook Platform.
The lower court decision (Federal Court)[2]
At the Federal Court, there were two primary issues to be decided:
- Whether Facebook obtained meaningful consent from users and Facebook friends of users when sharing their personal information with third-party apps; and,
- Whether Facebook adequately safeguarded user information.
As to the first issue, the OPC argued that Facebook could not obtain meaningful consent through third-party app developers like the TYDL App. The Court found that the OPC failed to prove that Facebook’s data collection lacked meaningful consent. Instead, it held that “an organization may rely on third-party consent” but “must take reasonable measures to ensure that the third party obtains meaningful consent”.[3]
With respect to safeguarding user data, the Court held that Facebook adequately safeguarded information, despite its disclosure to third-party apps that in turn disclosed that data to Cambridge Analytica. The Court held that Facebook’s safeguarding obligations ended once information was disclosed to third-party applications and that PIPEDA applies to the “internal handling” of information in an organization’s “possession”.[4]
In sum, the Federal Court found that the OPC failed to discharge its evidentiary burden to prove that Facebook breached PIPEDA.
The appeal decision (Federal Court of Appeal)[5]
The OPC appealed the Federal Court Decision to the Federal Court of Appeal (“FCA”). The FCA allowed the appeal, finding that the lower court erred by relying on subjective evidence and by failing to consider whether each user had provided Facebook with meaningful consent. As a result, Facebook was found to be in breach of PIPEDA.
In coming to its decision, the FCA reiterated that “subjective evidence does not play a role in an analysis of” the reasonable person; instead, it is up to the Court to define an objective, reasonable expectation of meaningful consent. Under PIPEDA, both the organization’s efforts to reasonably inform an individual of the purposes for which information will be used and the form in which consent is sought must be reasonable.[6] Put simply, “if the reasonable person would not have understood what they consented to, no amount of reasonable efforts on the part of the corporation can change that conclusion”.[7]
(a) Meaningful Consent
With respect to the first issue of meaningful consent, the FCA found that Facebook failed to obtain meaningful consent from friends of users to disclose their data – a breach of PIPEDA. Friends of users were unable to review third-party apps’ data policies before disclosure.[8] Even where some information was provided, it was too broad to be effective. Meaningful consent requires that a reasonable person understand the nature, use and consequences of the disclosure, and that understanding was not obtained.
For users, meaningful consent could not be provided because the terms were not clear to a reasonable person. To the FCA, the terms and conditions, which were “at the length of an Alice Munro short story”, were too complex and unlikely to allow a reasonable person to conclude that by downloading a personality quiz they were “consenting to the risk that the app would scrape their data and the data of their friends”.[9] Facebook’s CEO, Mark Zuckerberg, had previously speculated before a US Senate committee that users likely did not read the whole policy and that app developers did not understand their obligations under Facebook’s policies.
As a result, the FCA concluded that no user could have provided meaningful consent during the relevant period.
(b) Safeguarding Obligation
On the safeguarding obligation, the FCA found that Facebook failed to safeguard user data, resulting in a breach of PIPEDA. Facebook failed to review the content of third-party apps’ privacy policies, despite these apps having access to downloading users’ data and the data of their friends.[10] To the FCA, this supported the finding that Facebook “did not take sufficient care to ensure the data in its possession prior to disclosure was safeguarded”.[11] While Facebook argued it would have been practically impossible to review all third-party apps’ privacy policies, this could not limit the scope of its responsibilities under PIPEDA.
Disposition
The FCA held that Facebook’s practices between 2013 and 2015 breached Principle 3, Principle 7, and section 6.1 of PIPEDA, and that a declaration should issue to that effect.
Despite the breaches, the decision comes after several lengthy legal and regulatory proceedings in other jurisdictions, including the US Federal Trade Commission imposing a USD 5 billion fine and a requirement that Facebook update its privacy policies, and the UK’s Information Commissioner’s Office (ICO) issuing a £500,000 fine against Facebook in 2018. Facebook argued that its privacy policies have undergone numerous changes since the events, which transpired a decade ago, leading the FCA to ask that Facebook and the OPC agree on the terms of a consent remedial order.
Conclusion
Decisions of this nature offer a rare glimpse into how courts will interpret privacy legislation in the context of large-scale data collection and breaches. Organizations are reminded that meaningful consent requires not only meaningful efforts to obtain consent, but also that the reasonable person properly understand what they are consenting to. Practically speaking, this means that terms and conditions or privacy policies should be clear, concise (likely not the length of an Alice Munro short story) and adequately convey the risk. In addition, organizations should review the privacy policies of third parties with which they contract and to which they provide data – the obligation under PIPEDA is not relieved even where it is “practically impossible” to comply. Standard terms or other forms of agreement with such third parties are likely worth considering.
[1] Facebook says more than 600,000 Canadians may have had data shared with Cambridge Analytica (CBC News)
[2] Canada (Privacy Commissioner) v. Facebook, Inc., 2023 FC 533 (the “Federal Court Decision”)
[3] At para 65 of the Federal Court Decision
[4] At para 86 of the Federal Court Decision
[5] Privacy Commissioner of Canada v. Facebook Inc., 2024 FCA 140 (the “FCA Decision”)
[6] At para 71 of the FCA Decision
[7] At para 72 of the FCA Decision
[8] A violation of clause 4.3.2 of PIPEDA, which requires that organizations “make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used”.
[9] At para 82 of the FCA Decision
[10] At para 110 of the FCA Decision
[11] At para 113 of the FCA Decision