What we deserve

Warren Urquhart discusses two important digital rights for Canadians

January 6, 2022

We need to regulate big tech: on that there is clear bipartisan agreement. The question now is how. The Trudeau government proposed Bill C-11 as one answer to the big tech question. The bill has prompted discussion: are we trading in private sector intrusion for public sector overreach? Or could C-11 be another chance for a public-private partnership that only enhances both parties’ ability to surveil and own our data?

Twenty years after 9/11, America’s Patriot Act and Canada’s Anti-Terrorism Act have supercharged online surveillance. Paired with concerns about how companies like Facebook recklessly mishandle user data, the discussion around regulation has largely been: how do we limit government’s or Big Tech’s online influence? Perhaps, though, we should switch the focus away from limiting what others can do and toward what we deserve. What should we be guaranteed as online citizens? What are our Digital Rights?

This idea of Digital Rights was explored by the last minority Trudeau government in the now-dormant Bill C-11. While an imperfect bill, C-11 laid the foundation for a framework of what online users ought to be entitled to. We can build on what C-11 started to develop this framework for Digital Rights further, identifying the key components of its success. To protect our interests, any Digital Rights guidelines should include the Right to Be Forgotten, the Right to a Private Legal Action for Data and more.

The Right to Be Forgotten (RTBF) is powerful because it lets us acknowledge our data for what it is: an extension of ourselves. The RTBF, roughly, means that an individual has the right to have their personal information removed from search engines. The RTBF has its legal origins in Europe, through the 2014 Google Spain v AEPD and Mario Costeja Gonzalez decision, and is embedded in Article 17 of the EU’s General Data Protection Regulation (GDPR). It allows, aside from some considerations regarding free speech, an individual to “request any data controller, at any time, to eliminate from their databases any piece of information regarding that data subject, regardless of the source of the information, and regardless of whether that information produces harm.”1

There’s something intuitive about the RTBF: if I consent (and whether “consent” means clicking a checkbox after thousands of words of legalese is another issue) to Facebook having my data, and Facebook can use that data to profit from advertisers, shouldn’t I also be entitled to have that data removed when I leave the platform? The RTBF restores a sense of autonomy for users and restores ownership over personal digital identity because it prioritizes consent. Beyond that intuition, the RTBF is effective because it can protect users from outside forces that use their data without consent.

The Clearview AI fiasco is a perfect example of how the public and private sectors can work together to exploit user data, and of how the RTBF has proactive utility. In 2020, the RCMP was found to have used facial recognition technology from Clearview AI, which harvested “billions of personal photos from social media”.2 In June 2021, an investigation concluded that the RCMP’s actions were illegal and violated the Privacy Act. By using this data, which was harvested without users’ consent, the RCMP could “match photographs of people against the photographs in the databank” and created “massive repositories of Canadians who are innocent of any suspicion of crime.” Canada’s Privacy Commissioner starkly ruled that “a government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.”3 The investigation’s findings directly contradicted the RCMP’s statement a year earlier that it was “not using Clearview AI on members of the public.” That statement came after the RCMP had originally denied using Clearview AI at all, only to later correct the record.4

This is where an RTBF would come in: with a codified right, users could remove their information from social media websites before applications like Clearview’s extract their data. Not only does the RTBF limit what a website or digital service can do with users’ data, it also preemptively protects users from extraction, because third parties can’t harvest data that is no longer available. With an RTBF, users protect themselves not only from anything the first party can do with their data but from third parties as well. RTBF legislation would create proactive personal data protection.

Now, what happens when personal data has already been illegally extracted or mishandled? The RTBF offers a proactive solution, but what about a reactive protection that allows users to seek compensation? That’s where a Right to a Private Action, or the Right to Sue for Data, comes in. Of course, Canadians can already sue over privacy violations. Canada and its provinces, Ontario in particular, have recognized privacy torts. The idea of a privacy tort, at least in a modern context, is recent, with 2012’s Jones v Tsige case creating the privacy tort of intrusion upon seclusion.5

The problem with intrusion upon seclusion and other privacy torts is that they are focused on individual lawsuits, whereas most data issues would be best served by class actions, since data leaks usually involve a large number of people. Class actions help bring justice by (1) saving time by letting one legal proceeding work for all plaintiffs, when litigating each of their claims separately would take years upon years, and (2) making bad actors pay a large collective sum that aggregates the many smaller losses experienced by everyone in a “class” (a $20 loss per user may not mean much to each individual, but a company that has to pay out $20 to 100,000 people, a $2-million bill in total, will feel the ramifications of its misbehaviour). Canada has taken some steps to address privacy on an individual level, but on the class action level, particularly with Premier Ford’s class action reforms making class actions harder to win,6 our country has been lacking. With a plaintiff-unfriendly class actions regime for data and privacy, not only are regular people left without compensation, but Big Tech and others who violate our online privacy easily avoid meaningful mechanisms of accountability.

The problem with privacy class actions, in turn, is that our legal regime is not sure how to value data. Of course, we know that leaked sensitive information like banking details, financial passwords and social insurance numbers is valuable and can have downstream effects up to and including identity theft. However, our legal system views proving an individual’s harm in widespread privacy invasions as “a difficult and expensive task.”7 Because of the way class actions work, proving that damage occurred, and that everyone suffered it through the commonality of the breach, is essential to winning a case. Perhaps some members of the class can prove a financial loss because a data leak enabled hackers to steal $1,000 from each of them, but other members, at the time of the lawsuit, have experienced no loss. We know that hackers could take that $1,000 at any time with the information they hold, but because those members have not yet experienced a loss by the time of the lawsuit, their chances of winning in court dip considerably, even though the courts acknowledge the future danger of the leaked data.

What is needed, then, is an expanded Right to a Private Action for data issues. The good news is that there are many ways to achieve it. One approach, which Kadri and Cofone have written about, is to add a cy près settlement function for data and privacy cases specifically.8 Cy près allows courts to distribute settlement funds to charities when it is too expensive or difficult to assess individual claims of loss (such as claims from people who are at risk of a future loss but have not yet experienced one). This way, there is still accountability for the actions of an actor that mishandled data. That serves the disciplinary function of class actions. For compensation, the relevant Canadian law, the Personal Information Protection and Electronic Documents Act (PIPEDA), could, as Cofone has also suggested,9 create new penalties for privacy violations, separate from class actions. The money from those penalties could then be distributed to people who were unable to prove loss (in class or individual actions), compensating the victims of breaches. Both methods allow our legal system, while it is still learning how to value and protect data, to compensate victims and to punish privacy violations.

The public should not stop at demanding an RTBF and a more refined Right to a Private Action for Data. This is just the start. We deserve the Right to Not Be Tracked, the Right to Free Internet and the Right to Net Neutrality, much of which is in the spirit of the Magna Carta for the Web.10 To get there, however, we need to reorient the conversation around what we deserve. The RTBF and the Right to a Private Action are important tools to begin that reorientation. With those rights and more, Canadians can fight digital overreach by both public and private actors.

Notes

1 Article 17 GDPR: https://gdpr-info.eu/art-17-gdpr/

2 CBC, RCMP denied using facial recognition technology, then said it had been using it for months: https://www.cbc.ca/news/politics/clearview-ai-rcmp-facial-recognition-1.5482266

3 Office of the Privacy Commissioner of Canada, RCMP’s use of Clearview AI’s facial recognition technology violated Privacy Act, investigation concludes: https://www.priv.gc.ca/en/opc-news/news-and-announcements/2021/nr-c_210610/

4 CBC, RCMP denied using facial recognition technology, then said it had been using it for months: https://www.cbc.ca/news/politics/clearview-ai-rcmp-facial-recognition-1.5482266

5 Dentons, Four of a kind: Ontario Recognizes the Fourth Privacy Tort – False Light: http://www.dentonsdata.com/four-of-a-kind-ontario-recognizes-the-fourth-privacy-tort-false-light/?utm_source=Mondaq&utm_medium=syndication&utm_campaign=LinkedIn-integration

6 Globe and Mail, Proposed changes to Ontario’s class-action laws would make it harder to sue corporations and government, experts say: https://www.theglobeandmail.com/business/article-proposed-changes-to-ontarios-class-action-laws-would-make-it-harder/

7 Kadri and Cofone, Cy Près Settlements in Privacy Class Actions, page 2: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3602294

8 Kadri, Thomas and Cofone, Ignacio, Cy Près Settlements in Privacy Class Actions (May 15, 2020), in Ignacio N. Cofone (ed.), Class Actions in Privacy Law, Routledge, 2020, page 2. Available at SSRN: https://ssrn.com/abstract=3602294 or http://dx.doi.org/10.2139/ssrn.3602294

9 Ignacio N. Cofone, Policy Proposals for PIPEDA Reform to Address Artificial Intelligence Report, page 8: https://poseidon01.ssrn.com/delivery.php?ID=269126002097094112100105078103094068122032049015054052117085119094067070121119112101033022034012040098112080065107127126112124111029057079021094030067072071114090127095033045004068020079022085025113064025104088117090009115077020113118107124115100017005&EXT=pdf&INDEX=TRUE

10 Jake Johnson, ‘Magna Carta for the Web’: Read Tim Berners-Lee’s Global Declaration for a Humane and Democratic Internet: https://www.commondreams.org/news/2018/11/06/magna-carta-web-read-tim-berners-lees-global-declaration-humane-and-democratic
