Privacy Law Update
Jan 2022 Charity & NFP Law Update
Published on January 27, 2022

By Esther Shainblum and Martin U. Wissmath

BCSC certifies class action against US company for ‘scraping’ Instagram user data

A privacy law case in British Columbia provides an example of what courts in Canada may look for when certifying a privacy class action. Severs v. Hyp3R Inc. is a November 22, 2021 judgment of the Supreme Court of British Columbia (the “BCSC”) involving breaches by the defendant, Hyp3R Inc., a U.S.-based company (“Hyp3R”), of the Privacy Act statutes of four provinces: B.C., Saskatchewan, Manitoba, and Newfoundland & Labrador (the “Four Provinces”). The court also found that Hyp3R had committed the tort of intrusion upon seclusion in Ontario, Alberta, New Brunswick, Nova Scotia, Prince Edward Island and all three territories.

Hyp3R, founded in 2015, describes itself as a “location-based marketing platform” that collects data from social media posts that include real-world locations and permits third-party advertisers to target users in connection with those locations and events. From April 2018 to August 2019, in contravention of Instagram’s privacy policies, Hyp3R exploited a security flaw to collect extensive personal data from users, including photos, in a process called “scraping.” Instagram removed Hyp3R from its platform and revoked its access in August 2019. The representative plaintiff, Catherine Severs, was an Instagram user at that time whose privacy settings were set to “public”. Severs served Hyp3R with a notice of civil claim in June 2020; Hyp3R did not respond, and Severs obtained a default judgment for damages to be assessed. Notwithstanding the defendant’s default, Justice Veenstra (“Veenstra J”) concluded that it was appropriate to proceed with certification and determination of the issues.

In making his findings, Veenstra J considered whether Hyp3R had breached the respective Privacy Acts of the Four Provinces and whether it had committed the tort of intrusion upon seclusion in the remaining common law provinces. Veenstra J was satisfied that Hyp3R had breached the respective Privacy Acts of the Four Provinces because it had, intentionally and without consent, violated the privacy of class members in each of the Four Provinces. He also concluded that, in the common law provinces without a privacy statute, Hyp3R’s conduct met the test for the common law tort of intrusion upon seclusion, as set out in Jones v Tsige: its conduct was intentional, it invaded class members’ privacy without lawful justification, and a reasonable person would regard the invasion as highly offensive and would suffer distress, humiliation or anguish as a result. The court awarded nominal damages of $24,921,378, which worked out to $10 per Instagram user in Canada.

Cyber Centre ‘Ransomware Playbook’ recommended by federal government

The Canadian Centre for Cyber Security (“Cyber Centre”) has published best practice guidelines and a “Ransomware Playbook” that the federal government recommends organizations, including charities and not-for-profits, read and follow. The best practice guidelines include the “Top 10 IT Security Actions to Protect Internet-Connected Networks and Information” and baseline cyber security controls. A December 6, 2021 letter signed by four federal cabinet ministers (National Defence; Public Safety; Emergency Preparedness; and International Trade, Export Promotion, Small Business and Economic Development) discusses the growing threat of ransomware attacks and urges Canadian organizations “to take stock of your organization’s online operations, protect your important information and technologies with the latest cyber security measures, build a response plan, and ensure that your designated IT security personnel are well-prepared to respond to incidents.” The letter recommends that an organization that is threatened by or falls victim to ransomware implement its recovery plan, seek professional cyber security assistance, and “immediately report the incident to the Cyber Centre’s online portal as well as local police.”

Annual OPC Report warns that AI could pose significant privacy risk in the future

The Office of the Privacy Commissioner of Canada (OPC) published its 2020–21 Annual Report with a look back at recommendations for legislative reform and warnings about the potential threats of artificial intelligence technology (“AI”). Among other issues, the Annual Report, published on December 9, 2021, discusses the OPC’s concerns regarding the previous Bill C-11, the Digital Charter Implementation Act, 2020, which died on the order paper when a federal election was called in August 2021. The OPC was critical of Bill C-11, which would have enacted the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act, and the Annual Report refers back to a submission released by the OPC in May 2021, which included 60 recommendations to improve the proposed legislation.

Among the concerns expressed by the OPC in the Annual Report was that Bill C-11 was misaligned with, and in many ways less protective than, the laws of other jurisdictions, that it lacked the privacy protective measures that exist in other countries, and that it was a “step backward”. Of particular concern was the OPC’s view that Bill C-11 “would have given consumers less control and organizations more flexibility in monetizing personal data, without increasing their accountability” and that the “proposed penalty scheme was unjustifiably narrow and protracted.”

The Annual Report lists six key issues to consider when designing a modern privacy law that would be “fit for purpose” in light of the fact that digital technologies that rely on the collection and analysis of personal information are central to our society and economy but can threaten fundamental rights. One of the key issues identified is the need for a “rights-based framework”, which would create a legal framework around the use of personal information that would entrench privacy as a human right.

The Annual Report also reviews the OPC’s public consultation to examine AI as it relates to the private sector, which led to the OPC’s March 2021 recommendations for the regulation of AI technology. While AI can increase efficiency, productivity and competitiveness, which are key factors in economic recovery, the Annual Report points out that it can also have serious consequences for privacy and, when AI is used to make automated decisions about people, it can seriously impact people’s lives and can heighten inequality, discrimination and societal divisions. The OPC concludes that AI “presents fundamental challenges to all of PIPEDA’s foundational privacy principles.” Key recommendations to address these legal challenges include:

  • Amending PIPEDA to allow personal information to be used for new purposes towards responsible AI innovation and for societal benefits;
  • Creating the right to meaningful explanation for automated decisions and the right to contest those decisions to ensure they are made fairly and accurately; and
  • Requiring organizations to design AI systems from their conception in a way that protects privacy and human rights.

The Annual Report states that the OPC believes that it is possible to improve PIPEDA within its existing structure without having to start over from scratch.
