

  • 12 Dec 2022 9:32 AM | CAN-TECH Law (Administrator)

    Revelations of the use of facial recognition and spyware by the RCMP results in a long list of recommendations and a call for accountability

    The House of Commons’ Standing Committee on Access to Information, Privacy and Ethics (also known as “ETHI”) has had a busy few months examining how Canadian police have been using, or have considered using, particularly intrusive technologies and techniques to advance their investigations. In two separate studies and reports, the Committee examined the use of facial recognition and artificial intelligence technology (report) and the use of so-called on-device investigative tools (report), principally by the Royal Canadian Mounted Police. 

    The review of the use of facial recognition by the Committee followed media reports and a Privacy Commissioner investigation into the practices of Clearview AI. The company was actively crawling social media websites and ingesting billions of photos into its databases, analyzing them biometrically, and then providing a service, mainly to police agencies, that it touted could identify a person or suspect in any image. Initially the RCMP denied that it had used the company’s facial recognition services, but ultimately admitted that it had trialed them. The Commissioner concluded that the images would have been harvested in contravention of Canadian law and that the RCMP should only use services where the underlying data had been lawfully compiled. 

    Among its 19 recommendations, the Committee called for tighter regulation of the use of the technology in both the public and private sectors, a moratorium on the use of facial recognition by the police until a framework for review has been approved, and a much more transparent approach to the use of facial recognition and artificial intelligence in the public sector. Scrutiny by the Committee and the Privacy Commissioner is credited with prompting the RCMP to establish a “National Technology Onboarding Program” to review police use and adoption of new technology and investigative tools. 

    The same Committee carried out a study of the police use of spyware as an investigative tool after documents tabled in Parliament disclosed that the RCMP had been using “on-device investigation tools” (or “ODITs”), akin to spyware, for some years. This coincided with media reporting on an Israeli cybersurveillance company, NSO Group, and its software called “Pegasus”, which has reportedly been widely used against journalists, lawyers and politicians. 

    In testimony before the Committee, the RCMP stated that ODITs provide law enforcement agencies with the capability to secretly collect private communications and other data that can no longer be obtained through conventional wiretap activities or other less intrusive investigation techniques. 

    A range of witnesses commented on the fact that the use of ODITs relies on vulnerabilities in devices and operating systems of which manufacturers are likely unaware. If the vulnerabilities exist on suspects’ devices, they exist on the devices of many others. As a result, they can be exploited by a range of actors, both foreign and domestic. The Privacy Commissioner confirmed that his office had not been consulted at any time regarding the use of this invasive technology. 

    The recommendations of this study closely parallel, thematically, the recommendations of the facial recognition study. They focus on increased accountability, increased scrutiny and increased transparency about the use of these tools. The Committee recommended a review of the provisions of the Criminal Code related to the interception of private communications, and the creation of an independent advisory body composed of relevant stakeholders from the legal community, government, police and national security, civil society, and relevant regulatory bodies to review new technologies used by law enforcement and to establish national standards for their use.

  • 12 Dec 2022 9:04 AM | CAN-TECH Law (Administrator)

    “Intrusion” privacy tort does not apply to third party hacking claims

    Ontario CA determines that the defendant must be the party who did the intruding

    The Ontario Court of Appeal, in considering a trilogy of cases together, has definitively determined that the privacy tort of “intrusion upon seclusion” does not apply to a defendant whose information systems were intruded upon by a malicious third party. The three cases were heard together with three sets of reasons issued: Winder v Marriott International, Inc., Obodo v Trans Union of Canada, Inc. and Owsianik v Equifax Canada Co.

    In the landmark case of Jones v Tsige, the Ontario Court of Appeal had determined that the “Prosser privacy torts” exist in Ontario common law, including the tort of intrusion upon seclusion. Since then, numerous privacy class actions have been brought, many of which have pled this privacy tort. The question of whether this tort can be the basis of liability for a company that is itself a victim of a third party’s act has rested on the meaning of the word “reckless” in the articulation of the elements of the cause of action from Jones:

    [71] The key features of this cause of action are, first, that the defendant's conduct must be intentional, within which I would include reckless; second, that the defendant must have invaded, without lawful justification, the plaintiff's private affairs or concerns; and third, that a reasonable person would regard the invasion as highly offensive causing distress, humiliation or anguish. However, proof of harm to a recognized economic interest is not an element of the cause of action. I return below to the question of damages, but state here that I believe it important to emphasize that given the intangible nature of the interest protected, damages for intrusion upon seclusion will ordinarily be measured by a modest conventional sum. [emphasis added]

    Plaintiffs in such data breach class actions have argued that the breaches are the result of the defendant’s recklessness, usually with respect to the handling or safeguarding of personal information. 

    The most extensive reasons in the trilogy of cases were given by Justice Doherty in Owsianik. In all three cases, the question before the courts below was whether to certify the proposed class actions, which requires that there be a legally viable claim. The plaintiffs had experienced varied success in the courts below. 

    In its analysis of the intrusion tort, the Court summarized the elements and explicitly categorized the conduct, state of mind and consequences requirements:

    [54] The elements of the tort of intrusion upon seclusion are laid down in Jones, at para. 71. I would describe them as follows:

    • the defendant must have invaded or intruded upon the plaintiff’s private affairs or concerns, without lawful excuse [the conduct requirement];
    • the conduct which constitutes the intrusion or invasion must have been done intentionally or recklessly [the state of mind requirement]; and
    • a reasonable person would regard the invasion of privacy as highly offensive, causing distress, humiliation or anguish [the consequence requirement].

    The plaintiff argued that the state of mind requirement was applicable to the defendant, Equifax in this case. The Court disagreed: The state of mind requirement applies to the “intruder”. 

    [59] Ms. Owsianik’s submission misunderstands the relationship between the two elements of the tort. The first element, the conduct requirement, requires an act by the defendant which amounts to a deliberate intrusion upon, or invasion into, the plaintiffs’ privacy. The prohibited state of mind, whether intention or recklessness, must exist when the defendant engages in the prohibited conduct. The state of mind must relate to the doing of the prohibited conduct. The defendant must either intend that the conduct which constitutes the intrusion will intrude upon the plaintiffs’ privacy, or the defendant must be reckless that the conduct will have that effect. If the defendant does not engage in conduct that amounts to an invasion of privacy, the defendant’s recklessness with respect to the consequences of some other conduct, for example the storage of the information, cannot fix the defendant with liability for invading the plaintiffs’ privacy.

    The Court noted that Equifax may be liable to the plaintiff on some other basis, but not as an intruder of the plaintiff’s privacy. 

    [61] …. Equifax’s negligent storage of the information cannot in law amount to an invasion of, or an intrusion upon, the plaintiffs’ privacy interests in the information. Equifax’s recklessness as to the consequences of its negligent storage cannot make Equifax liable for the intentional invasion of the plaintiffs’ privacy committed by the independent third-party hacker. Equifax’s liability, if any, lies in its breach of a duty owed to the plaintiffs, or its breach of contractual or statutory obligations.

    The plaintiffs argued that the tort of intrusion upon seclusion should be extended to clearly be applicable to the “Database Defendants”, otherwise the plaintiffs would be without a remedy in these circumstances. This was dismissed by the Court of Appeal:

    [79] The plaintiffs’ “no remedy” argument really comes down to the assertion that because the remedies available in contract and negligence require proof of pecuniary loss, the plaintiffs who cannot prove pecuniary loss are left with no remedy. With respect, this is not what the court meant in Jones when it described the plaintiff as being without remedy. The plaintiffs here are in the same position as anyone else who advances the kind of claim the plaintiffs have advanced here. Because the claim sounds in negligence and contract, the plaintiffs must prove pecuniary loss. The plaintiffs’ position is miles away from the predicament faced by the plaintiff in Jones.

    [80] While it cannot be said the plaintiffs are left without a remedy, it is true that the inability to claim moral damages may have a negative impact on the plaintiffs’ ability to certify the claim as a class proceeding. In my view, that procedural consequence does not constitute the absence of a remedy. Procedural advantages are not remedies.

    The Court finally noted, before dismissing the appeal, that if Parliament or the provincial legislatures wanted to extend the law so far as to provide moral damages in cases like this, they are able to do so. 

  • 8 Nov 2022 5:03 PM | CAN-TECH Law (Administrator)

    We had a wonderful turnout at the 2022 CAN-TECH Law Fall conference November 2-3, 2022 at the Sheraton Centre Toronto Hotel.


  • 16 Sep 2022 6:00 PM | CAN-TECH Law (Administrator)

    Recommendations are limited to improving processes outside of the courts

    The province of Nova Scotia has completed its legislated review of the Intimate Images and Cyber-protection Act, with feedback resulting in a dozen recommendations. The Act replaced the province’s Cyber-safety Act, which was struck down in its entirety in 2015 by the Nova Scotia Supreme Court in Crouch v Snell, which found the former law violated sections 2(b) and 7 of the Charter of Rights and Freedoms. 

    The original law created a CyberScan unit within the Department of Public Safety to assist victims of cyberbullying, which was continued in the new legislation. Individual recourse to the courts was continued, but through a much more arduous formal application process to the Nova Scotia Supreme Court (the ex parte procedures in the previous law were found to violate the principles of fundamental justice). All of the recommendations put forward by the review group focus on augmenting and improving the CyberScan unit, including:

    • improving legal, mental health and crisis supports for victims;

    • creating a centralized, trauma-informed referral process for victims seeking advice and support; and

    • improving training for CyberScan staff, who help victims understand their options and navigate the justice system.

    None of the recommendations address access to the courts for legal relief on behalf of victims of cyber-bullying and the non-consensual distribution of intimate images.

    Nova Scotia remains the only province with a specific law to address cyberbullying, while other provinces have enacted statutes that are restricted to providing legal relief for victims for the non-consensual distribution of intimate images.

  • 16 Sep 2022 5:59 PM | CAN-TECH Law (Administrator)

    Breach notification and reporting obligations come into effect on September 22, 2022

    As part of its significant overhaul of the Act respecting the protection of personal information in the private sector in Bill 64 (now also known as Law 25), the province has introduced mandatory reporting and notification related to data breaches. The provisions in section 3.5 of the Bill will come into effect on September 22, 2022. The provisions are similar to those found in the Personal Information Protection and Electronic Documents Act (Canada) and the Personal Information Protection Act (Alberta), but not surprisingly use different terminology. 

    Regulated businesses will be required to promptly notify the Commission d’accès à l’information (“CAI”), as well as the affected individuals, whenever such businesses experience a “confidentiality incident” that poses a “risk of serious injury” to an individual. This is similar to a “breach of security safeguards” that results in a “real risk of significant harm” under PIPEDA. Again, similar to PIPEDA, businesses will be required to keep a register of all confidentiality incidents in the manner prescribed by regulation, regardless of the risk of injury. 

    On June 29th, 2022, a draft regulation regarding confidentiality incidents was published in the Gazette officielle du Québec. The Draft Bill 64 Regulation provides businesses with details related to the content of the new notification and record-keeping requirements. Interestingly, the new regulation also applies to public sector organizations. 

    Reports to the regulator must include:

    (1) the name of the body affected by the confidentiality incident and any Québec business number assigned to such body under the Act respecting the legal publicity of enterprises (chapter P-44.1);

    (2) the name and contact information of the person to be contacted in that body with regard to the incident;

    (3) a description of the personal information covered by the incident or, if that information is not known, the reasons why it is impossible to provide such a description;

    (4) a brief description of the circumstances of the incident and what caused it, if known;

    (5) the date or time period when the incident occurred or, if that is not known, the approximate time period;

    (6) the date or time period when the body became aware of the incident;

    (7) the number of persons concerned by the incident and the number of those who reside in Québec or, if that is not known, the approximate numbers;

    (8) a description of the elements that led the body to conclude that there is a risk of serious injury to the persons concerned, such as the sensitivity of the personal information concerned, any possible ill-intentioned uses of such information, the anticipated consequences of its use and the likelihood that such information will be used for injurious purposes;

    (9) the measures the body has taken or intends to take to notify the persons whose personal information is concerned by the incident, pursuant to the second paragraph of section 63.8 of the Act respecting Access to documents held by public bodies and the Protection of personal information or the second paragraph of section 3.5 of the Act respecting the protection of personal information in the private sector, and the date on which such persons were notified, or the expected time limit for the notification;

    (10) the measures the body has taken or intends to take after the incident occurred, including those aimed at reducing the risk of injury or mitigating any such injury and those aimed at preventing new incidents of the same nature, and the date on which the measures were taken or the expected time limit for taking the measures; and

    (11) if applicable, an indication that a person or body outside Québec that exercises similar functions to those of the Commission d’accès à l’information with respect to overseeing the protection of personal information has been notified of the incident.

    Notably, and unlike PIPEDA and PIPA, the regulations create a requirement to keep the CAI updated as more information relevant to (1) through (11) becomes known. 
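    To make the eleven required elements easier to scan, they can be modelled, purely as an illustration, as a single record. The field names below are our own shorthand and are not terminology prescribed by the draft regulation:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ConfidentialityIncidentReport:
    """Illustrative summary of the report contents required by items
    (1) to (11) of the draft Québec regulation; field names are invented."""
    body_name: str                          # (1) name of the affected body
    quebec_business_number: Optional[str]   # (1) NEQ, if one is assigned
    contact_name: str                       # (2) person to contact about the incident
    contact_info: str                       # (2) that person's contact information
    information_description: str            # (3) personal information covered, or why unknown
    circumstances: str                      # (4) circumstances and cause, if known
    incident_date: str                      # (5) date or (approximate) period of the incident
    awareness_date: str                     # (6) when the body became aware of it
    persons_affected: Optional[int]         # (7) number affected (approximate if unknown)
    persons_in_quebec: Optional[int]        # (7) number residing in Québec
    serious_injury_factors: List[str]       # (8) elements supporting a risk of serious injury
    notification_measures: str              # (9) how and when individuals are notified
    mitigation_measures: str                # (10) measures taken or planned after the incident
    other_regulators_notified: bool = False  # (11) a regulator outside Québec was notified
```

    Because the regulations also require the CAI to be kept updated as more information becomes known, such a record would in practice be revised and resubmitted as facts emerge.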

  • 16 Sep 2022 5:58 PM | CAN-TECH Law (Administrator)

    CRA security failure resulted in hackers’ access to thousands of accounts for CERB fraud

    The Federal Court of Canada in Sweet v Canada has certified a negligence, breach of confidence and intrusion upon seclusion class action against the Canada Revenue Agency in connection with widespread “My CRA” account takeovers during the pandemic. Of particular interest is that the class was defined to exclude individuals who had provided their personal information to a BC law firm that first filed its claim and then was itself subject to a cybersecurity incident that may have exposed class member information. 

    During the summer of 2020, a large number of “My CRA” accounts were compromised and accessed by unknown third parties. The compromised accounts had their banking and direct deposit information changed, and many accounts were enrolled in benefits programs, such as the Canada Emergency Response Benefit. The threat actors were also able to access sensitive personal information contained in the accounts, such as addresses, birthdates, employment details and social insurance numbers. 

    A BC law firm quickly filed a putative class action in the Federal Court. In April 2021, that firm was itself the victim of a data breach that potentially exposed the personal information of potential class members. The government filed a motion to have the action stayed because it was proposing a third party claim against the law firm for contribution and indemnity for any persons whose information was exposed by both the government and the law firm. That third party claim would not be within the jurisdiction of the Federal Court. As a result, the first law firm withdrew and a second law firm began carriage of the case, and amended the pleadings to narrow the class of plaintiffs to exclude those whose information may have been exposed in the law firm data breach. A new representative plaintiff was substituted. 

    The Court described the incidents which resulted from an apparent failure on the part of CRA:

    [66] In the summer of 2020, GCKey and CRA’s My Account were the subject of what the cybersecurity industry describes as a “credential stuffing attack” by a threat actor, predominantly targeting CRA and ESDC as a means of fraudulently applying for COVID relief benefits (CERB and the Canada Emergency Student Benefit [CESB]) that had been introduced by the Government in the spring of 2020). Credential stuffing is a form of cyber attack that relies on the use of stolen credentials (username and password) from one system to attack another system and gain unauthorized access to an account. This type of attack relies on the reuse of the same username and password combinations by people over several services. Threat actors sell lists of credentials on the Dark Web. Credential stuffing usually refers to the attempt to gain access to many accounts through a web portal using an automated bot system rather than manually entering the credentials. On dates in July 2020, CRA’s My Account experienced large numbers of failed logins, which have since been identified as a precursor to, or otherwise part of, a credential stuffing attack against that service.

    [67] A threat actor attempting to access a particular My Account through credential stuffing would typically have encountered the requirement to successfully answer one of the five security questions selected by the user. However, during the attack that occurred in the summer of 2020, the threat actor(s) were able to bypass the security questions, and access My Account, because of a misconfiguration in CRA’s credential management software. CRA learned of this method to bypass the security questions on August 6, 2020, when it received a tip from a law enforcement partner that such a method was being sold on the Dark Web. Among other steps taken to respond to the data breach, CRA subsequently identified the relevant misconfiguration in its software, which it remedied on or about August 10, 2020.

    [68] In the meantime, at least 48,110 My Accounts were impacted by the unauthorized use of credentials, meaning that the threat actor was able to enter a valid CRA user ID and password. Of those 48,110 My Accounts, 21,860 involved no progress by the threat actor beyond entering the ID and password, such that the threat actor did not access the accounts. This is potentially understood as a stage of the attack in which the threat actor was ensuring that the credentials worked. The threat actor(s) actually logged in to 26,250 My Accounts. In 13,550 of the My Accounts, although the security question bypass was used, the threat actor only viewed the homepage, meaning that some personal information was accessed, but no application was submitted for CERB. In 12,700 of the My Accounts, the threat actor changed the relevant taxpayer’s direct deposit banking information and fraudulently applied for CERB.
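    The mechanics of credential stuffing described in paragraph 66 can be sketched in a few lines. All usernames, passwords and data below are invented for illustration; a real attack replays credentials against a live login endpoint using automated bots, not a local dictionary:

```python
# Credentials stolen from one service (e.g. purchased on the Dark Web).
stolen_credentials = [
    ("alice@example.com", "hunter2"),
    ("bob@example.com", "correct-horse"),
    ("carol@example.com", "p4ssw0rd"),
]

# Accounts on the targeted service. Alice reused her password across
# services -- that reuse is the vulnerability credential stuffing exploits.
target_accounts = {
    "alice@example.com": "hunter2",          # reused password -> compromised
    "bob@example.com": "unique-passphrase",  # different password -> attempt fails
}

def stuff(credentials, accounts):
    """Replay each stolen username/password pair against the target
    service and return the accounts where the login would succeed."""
    compromised = []
    for user, password in credentials:
        if accounts.get(user) == password:
            compromised.append(user)
    return compromised

print(stuff(stolen_credentials, target_accounts))  # ['alice@example.com']
```

    The large volume of failed logins the CRA observed in July 2020 corresponds to the many pairs in such a list that do not match, which is why failed-login spikes are treated as a precursor to this kind of attack.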

    The plaintiffs sought to certify the class action on the basis of systemic negligence, breach of confidence and intrusion upon seclusion. The Court noted that there were differing authorities on whether these causes of action could apply in circumstances such as these, but overall found that this area of law continues to develop and that the plaintiffs’ claims were not bound to fail, based on the pleadings. 

  • 16 Sep 2022 5:57 PM | CAN-TECH Law (Administrator)

    Federal Privacy Commissioner weighs in on draft regulations regarding search of electronic devices by Canadian Border Services Agency

    Currently wending its way through Parliament is Bill S-7, which will inter alia amend the Customs Act to deal specifically with the search of electronic devices. The Canadian Border Services Agency (CBSA) in June 2022 proposed a set of regulations that would govern the examination of data stored on electronic devices. In July 2022 the Office of the Privacy Commissioner of Canada (OPC) published its submissions made as part of the public consultation on the draft regulations. It dealt with four topics:

    Note-Taking Requirements

    The OPC recommended:

    • the requirement for noting the basis of the examination to also include noting if the rationale changes as the examination progresses, for instance, if new evidence or facts emerge;

    • the proposed requirement for noting the type of document that was examined to also include the reasons why a particular document was examined;

    • adding a requirement to note any communication with the traveller that may be relevant to the circumstances of the examination; and

    • adding a requirement to note whether the search was resultant or not, and the steps taken following that determination.

    Disabling Network Connectivity

    The OPC reiterated earlier submissions that there be more specific procedures imposed regarding the disabling of network connectivity on devices so as to confine searches to data stored on the device; “we would emphasize that certain technical steps and procedures should be specified by the Regulations as necessary to ensure there is no connection to a network, including, but not necessarily limited to: activating “airplane mode”, deactivating connection to a WiFi network and, ensuring a device is not sharing a connection with another device via Bluetooth or otherwise.”

    Password Collection and Retention

    The OPC recommended that the Regulations “include specific provisions directing the methods and circumstances for password and passcode collection, including specifying that an officer must not retain a password or passcode in instances where the examination of a digital device is non-resultant.”

    Solicitor-Client Privileged Information

    In line with an amendment to the Bill that had been made by the Senate, the OPC recommended that CBSA “include its current policy requirements for dealing with solicitor-client privileged information, and other types of sensitive information of this nature, within the proposed Regulations.”

  • 16 Sep 2022 5:56 PM | CAN-TECH Law (Administrator)

    Electronic data from vehicle ruled properly removed, but not properly admitted

    In R. v. Major, the accused had been convicted at trial of dangerous driving and criminal negligence, stemming from a motor vehicle accident in which he was the driver of a pickup truck that collided with another truck. Two RCMP forensic collision reconstructionists attended at the scene of the horrific accident, removed the factory-installed airbag control module (ACM) and accessed data from the event data recorder (EDR), which is a component of the ACM. They created an image of the data on an RCMP computer and analyzed it, which revealed that the accused’s truck was “moving at 137 km/hr 5 seconds prior to the impact, the brakes had been applied 1.2 seconds before the collision, and the vehicle was still going 118 km/hr immediately before it struck the semi-truck.” Over the accused’s objections, the EDR data and the reconstructionists’ testimony about it were admitted at trial and he was convicted. On appeal he renewed these objections.
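    For readers unfamiliar with this type of evidence, the pre-crash snapshot described above can be modelled, purely for illustration, as a handful of time-stamped values using the figures reported in the judgment. The structure and field names here are invented and bear no relation to the actual Bosch CDR output format, which the Court later notes is proprietary and far more detailed:

```python
# Invented, simplified model of an EDR pre-crash snapshot.
edr_snapshot = {
    # Vehicle speed in km/h, keyed by seconds relative to impact (t = 0).
    "speed_kmh": {-5.0: 137, 0.0: 118},
    # Seconds before impact at which the brakes were first applied.
    "brake_applied_s_before_impact": 1.2,
}

# Average deceleration over the recorded 5-second window, in km/h per second.
t0, t1 = -5.0, 0.0
decel = (edr_snapshot["speed_kmh"][t0] - edr_snapshot["speed_kmh"][t1]) / (t1 - t0)
print(round(decel, 1))  # 3.8
```

    Even this toy model shows why the data mattered at trial: it is the only record of speed in the seconds before impact, which is precisely why its reliability and admissibility became the central issues on appeal.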

    The Court of Appeal first turned to the question of whether the removal of the ACM and accessing of the EDR had breached the accused’s rights under s. 8 of the Charter. After reviewing the essential s. 8 law on search and seizure, Tholl J.A. (for a unanimous court) canvassed two competing lines of authority specifically on the issue of s. 8 breaches regarding ACM and EDR seizure. The first stemmed from the case of R v Hamilton, 2014 ONSC 447, and in these cases the courts tended to find that seizing the data without consent or a warrant amounted to a s. 8 breach, on the basis that the accused had a reasonable expectation of privacy in the vehicle which covered the data (despite its being relatively obscure or unknown to most drivers). The second line of authority stemmed from the case of R v Fedan, 2016 BCCA 26, and in these cases the courts tended to find that it could be presumed that an accused had a subjective expectation of privacy in the data due to its presence in the vehicle, but that such an expectation of privacy was not objectively reasonable “since no information could be gleaned from [the ACM and EDR] that revealed any intimate details about the accused or any information about who was driving the vehicle.”

    Turning to this case, the Court held that the accused had a limited interest in the data, insofar as it was embedded in his vehicle, and that in accordance with the Fedan line of cases it could be presumed without proof (i.e., the accused had not testified on the topic) that he had a subjective expectation of privacy in everything to do with his personal vehicle. However, the court was not convinced that the expectation of privacy was objectively reasonable:

    [69]           The closest an EDR comes to saving any personal information is its storage of the manner in which the vehicle was driven immediately pre-collision. The EDR data provides but a brief snapshot of the status of various mechanical and electrical components of the vehicle before a triggering event. According to the evidence in this matter, the data located in an EDR is not such that it provides longer-term information about the driving habits of the owner or operator of a vehicle. While an ACM and its EDR is, at least partially, an electronic device, it is comparatively rudimentary in nature. Moreover, unlike a mobile device or computer, its owner has no control in relation to any of the information that is stored on it. They cannot input data onto it, retrieve information from it or actively use it for any purpose. It does not record non-vehicle related information nor does it acquire or give location data. In the case at hand, there was no evidence demonstrating that Mr. Major even knew it existed.

    [70]           After considering the two lines of cases regarding EDR data, I find myself in substantial agreement with the reasoning from Fedan for the characterization of the data stored in the EDR. As in Fedan, the data here “contained no intimate details of the driver’s biographical core, lifestyle or personal choices, or information that could be said to directly compromise his ‘dignity, integrity and autonomy’” (at para 82, quoting Plant at 293). It revealed no personal identifiers or details at all. It was not invasive of Mr. Major’s personal life. The anonymous driving data disclosed virtually nothing about the lifestyle or private decisions of the operator of the Dodge Ram pickup. It is hard to conceive that Mr. Major intended to keep his manner of driving private, given that the other occupants of the vehicle – which included an adult employee – and complete strangers, who were contemporaneously using the public roadways or adjacent to it, could readily observe him. His highly regulated driving behaviour was “exposed to the public” (Tessling at para 47), although not to the precise degree with which the limited EDR data, as interpreted by the Bosch CDR software, purports to do. While it is only a small point, I further observe that a police officer on traffic patrol would have been entitled to capture Mr. Major’s precise speed on their speed detection equipment without raising any privacy concerns.

    Accordingly, there had been no section 8 breach.

    Justice Tholl then turned to whether the data had been properly admitted. While the Crown argued that it was admissible as an electronic record under ss. 31.1-31.8 of the Canada Evidence Act, Tholl J.A. pointed out [astutely, in our view: eds.] that these provisions simply provided for authentication and the baseline integrity of the data, and that a proper evidentiary foundation still had to be laid for admissibility. The Crown further argued that technology-generated data, which was commonly relied upon for the safety of the operation of mechanical devices, was prima facie reliable enough to be admissible. However, Tholl J.A. also took issue with this argument:

    [92]           It must be remembered that the data being accessed in the matter at hand was contained in a component of a motor vehicle’s safety system that is unfamiliar to the average person. It is not comparable to a vehicle speedometer, a GPS system, an iPhone, or an Excel spreadsheet. For these four latter pieces of technology, while the average user probably has little knowledge of how they actually work from an internal perspective, they are so ubiquitous that they are easily understood by lay people who feel comfortable with their reliability, based on their commonplace and routine usage. The EDR data and the CDR output do not fall into this same category. By way of illustration, I query what a trier of fact could make of the ten pages of hexadecimal data dump contained at pages 25 to 34 of the output from the Bosch CDR software that was filed as an exhibit.

    [93]           The EDR data is created using a process unknown to the average person, is not accessible by an owner of the vehicle or their mechanic, and can only be extracted with highly specialized third-party software. In this case, there was no evidence as to how the data was actually gathered, what margin of error might exist and what circumstances could influence its accuracy. There was no evidence as to whether the EDR component recorded information accurately. There was also no explanation of any anomalies found in the EDR data.


    In this case, while the reconstructionist who testified for the Crown had been qualified as an expert in accident reconstruction, the trial judge had refused to qualify him as an expert on the EDR data in particular, and indeed he had been unable to answer some questions about the analysis of the data that he and his colleague had produced. Without such expert opinion (and in spite of some judicial opinion to the contrary), the judge had erred in admitting it:

    [101]      While at some point in the future the use and understanding of these data recorders may become so commonplace that expert testimony is not required to establish their reliability, it is sufficient for me to find that this is not the case now. I am left to conclude that the bare fact that data has been recorded in an EDR and extracted by software does not establish its reliability. Thus, without evidence from a properly qualified expert as to how the system accurately records data and creates accurate output, the CDR output did not meet the threshold required for admission [….]

    [102]      I conclude that a properly qualified expert witness was required in order for the CDR output to be admitted at trial. The technology has not yet progressed to the point where it falls into the same category as a speedometer, video recorder, wrist-watch, spreadsheet program, or similar item of everyday use. Absent consent by an accused, threshold reliability must be established by the Crown before such evidence may be admitted. The trial judge specifically ruled that Cpl. Green could not provide the required expert opinion evidence in relation to this aspect of the evidence. As such, there was insufficient evidence to establish the threshold reliability of the EDR data or the CDR output.

    Given that the data had been the primary evidence of the speed of the vehicle at the time of the accident – a central fact – the error had rendered the verdict unsafe and the case had to be returned for a new trial.

  • 20 Jul 2022 9:29 AM | CAN-TECH Law (Administrator)

    The proposed new privacy law includes order-making powers, penalties and a new tribunal

    On June 16, 2022, Industry Minister François-Philippe Champagne finally tabled in the House of Commons Bill C-27, called the “Digital Charter Implementation Act, 2022”. This is the long-awaited privacy bill that is slated to replace the Personal Information Protection and Electronic Documents Act, which has regulated the collection, use and disclosure of personal information in the course of commercial activity in Canada since 2001.

    The bill is very similar to Bill C-11, which was tabled in November 2020 as the Digital Charter Implementation Act, 2020, and which languished in Parliament until the federal government called the last election.

    The Bill creates three new laws. The first is the Consumer Privacy Protection Act (“CPPA”), which is the main privacy law. The second is the Personal Information and Data Protection Tribunal Act and the third is the Artificial Intelligence and Data Act.

    The CPPA has a completely different structure from PIPEDA. PIPEDA included a schedule taken from the Canadian Standards Association Model Code for the Protection of Personal Information and generally required regulated organizations to follow that Code. Similar to the Personal Information Protection Acts of British Columbia and Alberta, the substance of the Code has now largely been translated into statutory language in the Bill itself.

    The most significant difference is what many privacy advocates have been calling for: the Privacy Commissioner is no longer an ombudsman. The law includes order-making powers and significant penalties. The Bill also creates a new tribunal, the Personal Information and Data Protection Tribunal, which takes over the current role of the Federal Court under PIPEDA with greater powers.

    PIPEDA applies to the collection, use and disclosure of personal information in the course of commercial activity and to federally-regulated workplaces. That will not change in the CPPA, but a new section 6(2) says that the new Act specifically applies to personal information that is collected, used or disclosed interprovincially or internationally. This provision is not expressly limited to commercial activity, so an argument could be made that it applies to non-commercial or employee personal information that crosses borders.

    The CPPA has an interesting approach to anonymous and de-identified data. It officially creates these two categories. It defines anonymize as:

    “to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”

    Anonymous information is carved out of the CPPA’s purview, but de-identified data remains in scope. To “de-identify” means “to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.”
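    The statutory distinction can be illustrated with a toy sketch. Everything below — the helper names, the record format, and the choice of a keyed hash as the de-identification step — is our own illustration, not anything prescribed by the CPPA:

    ```python
    import hashlib

    def de_identify(record: dict, secret_key: str) -> dict:
        """De-identified (still in scope of the CPPA): the direct identifier is
        replaced with a keyed pseudonym. The individual cannot be *directly*
        identified, but a re-identification risk remains (e.g. if the key leaks
        or the pseudonym is linked across datasets)."""
        out = dict(record)
        out["name"] = hashlib.sha256(
            (secret_key + record["name"]).encode()
        ).hexdigest()[:16]
        return out

    def anonymize(record: dict) -> dict:
        """Anonymized (carved out of the CPPA): identifying fields are dropped
        entirely, irreversibly, so no individual can be identified directly or
        indirectly from what remains."""
        return {k: v for k, v in record.items() if k not in {"name", "email"}}

    customer = {"name": "Jane Doe", "email": "jane@example.com", "region": "ON"}
    pseudonymous = de_identify(customer, secret_key="k3y")  # reversible risk remains
    anonymous = anonymize(customer)                         # {"region": "ON"}
    ```

    The point of the sketch is the asymmetry: the de-identified record can in principle be linked back to the individual, while the anonymized record cannot, which is why only the former stays within the Act's reach.
    
    
    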

    In a number of areas, the CPPA provides more detail about what is required to comply with general principles that are already in PIPEDA. For example, additional detail needs to be applied with respect to a company’s “privacy management program”, and all of the supporting documentation for an organization’s privacy management program must be provided to the Privacy Commissioner on request.

    With respect to consent, organizations expressly have to record and document the purposes for which any personal information is collected, used or disclosed. This was implied in the CSA Model Code, but is now expressly spelled out in the Act. Section 15 of the CPPA lays out in detail what is required for consent to be valid. It requires not only identifying the purposes but also communicating in plain language how information will be collected, the reasonably foreseeable consequences of its use, what types of information are involved and to whom the information may be disclosed.

    One significant change compared to PIPEDA is the circumstances under which an organization can collect and use personal information without consent. Section 18 of the CPPA allows collection and use without consent for certain business activities, where it would reasonably be expected in order to provide a service, for security purposes, for safety, or for other prescribed activities. Notably, this exception cannot be used where the personal information is to be collected or used to influence the individual’s behaviour or decisions. There is also a “legitimate interest” exception, which requires an organization to document any possible adverse effects on the individual, mitigate them and finally weigh whether the legitimate interest outweighs any adverse effects. It is unclear how “adverse effects” would be measured.

    As under PIPEDA, an individual can withdraw consent, subject to similar limitations. What has changed is that an individual can now require that their information be disposed of; notably, “disposal” includes both deletion and rendering the information anonymous.

    The most notable changes are with respect to the role of the Privacy Commissioner. The Commissioner is no longer an ombudsman with a focus on nudging companies to compliance and solving problems for individuals. The CPPA and Tribunal Act veer strongly towards enforcement.

    As with PIPEDA, enforcement starts with a complaint by an individual, or the Commissioner can initiate one on his own initiative. After the investigation, the matter can be referred to an inquiry.

    Inquiries have more procedural protections for fairness and due process than under the existing ad hoc investigation system. For example, each party is guaranteed a right to be heard and to be represented by counsel. At the end of the inquiry, the Commissioner can issue orders for measures to comply with the Act or to stop doing something that is in contravention of the Act. The Commissioner can continue to name and shame violators, but penalties are left to the new Personal Information and Data Protection Tribunal.

    The legislation creates a new specialized tribunal, which hears cases under the CPPA. Compared to C-11, the new bill requires that at least three of the tribunal members have expertise in privacy.

    Its role is to determine whether any penalties recommended by the Privacy Commissioner are appropriate. It also hears appeals of the Commissioner’s findings, of interim or final orders of the Commissioner, and of decisions by the Commissioner not to recommend that any penalties be levied.

    Currently, under PIPEDA, complainants and the Commissioner can seek a hearing in the Federal Court after the Commissioner has issued a finding. That hearing is “de novo”, so the court makes its own findings of fact and determinations of law based on the submissions of the parties. The Tribunal, in contrast, applies a standard of review of correctness for questions of law and “palpable and overriding error” for questions of fact or of mixed law and fact. Its decisions are subject only to limited judicial review before the Federal Court.

    The possible penalties are huge. The maximum administrative monetary penalty that the Tribunal can impose in a single case is the higher of $10,000,000 and 3% of the organization’s gross global revenue in the financial year before the one in which the penalty is imposed. The Act also provides for quasi-criminal prosecutions, which can result in even higher fines.

    The Crown prosecutor can decide whether to proceed by indictment, with a fine not exceeding the higher of $25,000,000 and 5% of the organization’s gross global revenue, or as a summary offence, with a fine not exceeding the higher of $20,000,000 and 4% of the organization’s gross global revenue. If the matter proceeds as a prosecution, the usual rules of criminal procedure and fairness apply, such as the presumption of innocence and proof beyond a reasonable doubt.
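    The “higher of” caps lend themselves to a simple numerical illustration. The sketch below is ours, not anything in the Bill, and the revenue figure is entirely hypothetical:

    ```python
    def amp_cap(gross_global_revenue: int) -> int:
        """Maximum administrative monetary penalty: the higher of
        $10,000,000 and 3% of gross global revenue (integer dollars)."""
        return max(10_000_000, gross_global_revenue * 3 // 100)

    def indictable_fine_cap(gross_global_revenue: int) -> int:
        """Maximum fine on indictment: higher of $25,000,000 and 5% of revenue."""
        return max(25_000_000, gross_global_revenue * 5 // 100)

    def summary_fine_cap(gross_global_revenue: int) -> int:
        """Maximum fine on summary conviction: higher of $20,000,000 and 4%."""
        return max(20_000_000, gross_global_revenue * 4 // 100)

    # Hypothetical organization with $1 billion in gross global revenue:
    revenue = 1_000_000_000
    print(amp_cap(revenue))              # 30000000 (3% exceeds the $10M floor)
    print(indictable_fine_cap(revenue))  # 50000000
    print(summary_fine_cap(revenue))     # 40000000
    ```

    For smaller organizations the fixed floor dominates: at $100 million in revenue, 3% is only $3 million, so the administrative penalty cap stays at $10 million.
    
    
    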

    Within a week of the Bill being tabled, the House rose for the summer break. When Parliament resumes in September, it is impossible to predict whether the Bill will be fast-tracked or whether it will languish as Bill C-11 did. It is also hard to predict whether the government will be amenable to suggested amendments at the committee stage.

  • 20 Jul 2022 9:22 AM | CAN-TECH Law (Administrator)

    Consortium of broadcasters sought to block live streams of NHL games in Canada 

    The last three years have seen an opening of the floodgates for Federal Court orders directed at internet service providers to block their customers’ access to third party websites, in response to allegations of copyright infringement by the operators of those third party websites. In a significant extension of this precedent, the Federal Court of Canada has issued the first order requiring internet service providers to block IP addresses in real time, upon request by a consortium of Canadian broadcasters to prevent live access to streams of National Hockey League games. 

    In Rogers Media Inc. v. John Doe 1, the Court considered an application made by Canadian broadcasters who have the exclusive right to broadcast NHL games to prevent Canadians from accessing pirated streams of those live games. 

    The door was opened to website blocking in Canada in the 2019 case of Bell Media Inc. v. GoldTV.Biz. In that case, the Court ordered that ISPs block certain IP addresses listed in the Court’s order, and the list could only be added to by further court order. The plaintiffs argued that this was not feasible for NHL games and said the court should issue an order for “dynamic” site blocking. From the Court:

    [6] In this case, the Plaintiffs have requested a “dynamic” site blocking Order, which involves trying to follow and block the unlawful streaming as it moves. The Plaintiffs say that the type of order issued in GoldTV FC would not work here because the pirates have adopted new measures to avoid detection and defeat site blocking, including moving their infringing content from site to site on a regular basis. Court approval would be impossible prior to each new blocking step because these efforts need to happen in real time in order to be effective.

    [7] The Plaintiffs say this is of particular relevance here, because most fans watch hockey games live, rather than recording them to watch later. This combination of factors means that blocking of unlawful streaming of live NHL broadcasts must happen while the broadcast is underway. Based on the evidence they have gathered, and experience in other countries where similar site-blocking orders have been issued, the Plaintiffs say that a dynamic site blocking Order is needed to keep up with the evolution in how online copyright piracy operates. For example, in this case the sites to be blocked could shift during the course of a single hockey broadcast. This type of dynamic blocking order has never been granted in Canada or in the United States. However, similar orders have been granted in the United Kingdom and Ireland, as well as in some European countries.

    Some of the ISPs were prepared to consent to the order, while others resisted it on various grounds summarized by the Court at paragraph 8 of the decision: 

    Although the objecting Third Party Respondents do not adopt identical positions, they advance broadly similar arguments. They say the process the Plaintiffs followed has been inappropriate and unfair. They contend that the Plaintiffs have failed to prove their case. They argue that the Order sought would impose undue risks, practical difficulties and costs on them, noting that they are not accused of any wrongdoing in this matter. Finally, they submit that if any Order is to be imposed, the Plaintiffs must be required to indemnify them completely for the costs associated with compliance, including (for those that would be required to do so) any cost of upgrading their network infrastructure.

    The Court followed the reasoning from the GoldTV cases, which included the general test for injunctive relief against uninvolved third parties from RJR-MacDonald Inc. v. Canada (Attorney General) most recently interpreted by the Supreme Court of Canada in Google Inc. v. Equustek Solutions Inc. In this case, as in GoldTV, this analysis was supplemented with additional factors drawn from the England and Wales Court of Appeal case of Cartier International AG v British Sky Broadcasting Ltd.:

    [113] The following summary of the Cartier factors borrows from GoldTV FC at paragraph 52 and GoldTV FCA at para 74. The factors to be considered are:

    A. Necessity – the extent to which the relief is necessary to protect the plaintiff’s rights. The relief need not be indispensable but the court may consider whether alternative and less onerous measures are available;

    B. Effectiveness – whether the relief sought will make infringing activities more difficult to achieve and discourage Internet users from accessing the infringing service;

    C. Dissuasiveness – whether others not currently accessing the infringing service will be dissuaded from doing so;

    D. Complexity and Cost – the complexity and cost of implementing the relief sought;

    E. Barriers to legitimate use or trade – whether the relief will create barriers to legitimate use by unduly affecting the ability of users of ISP services to access information lawfully;

    F. Fairness – whether the relief strikes a fair balance between fundamental rights of the parties, the third parties and the general public;

    G. Substitution – the extent to which blocked websites may be replaced or substituted and whether a blocked website may be substituted for another infringing website; and

    H. Safeguards – whether the relief sought includes measures that safeguard against abuse.

    The Court ultimately was satisfied that the RJR-MacDonald and Cartier factors supported granting the injunction, which was issued on the following terms:

    • The plaintiffs must collectively appoint a third-party agent to identify impugned IP addresses, to notify the ISPs of the IP addresses to be blocked and when to unblock them, to compile complaints and to assess the effectiveness of the measures. The third-party agent is to report to the Court and issue a public report. 

    • ISPs must block or attempt to block access to IP addresses streaming infringing content as identified by the plaintiffs’ agent, and must inform the plaintiffs of the processes they propose to use. 

    • Disabling access is to be done “as soon as practical,” and within 30 minutes of the beginning of the game or of the notice, and the IP addresses must be unblocked at the conclusion of the game. 

    • The plaintiffs and ISPs must provide a form of notice to affected customers.

    • Compliance with the order is generally required immediately, where possible, but an ISP that cannot implement measures to comply within 15 days must notify the plaintiffs. 

    • The plaintiffs are responsible, to a maximum of $50,000 per ISP, for the costs of implementing the order.

    • The order will only be in effect until the end of the 2021–2022 NHL season (including playoffs). 
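    The block-then-unblock cycle the order contemplates can be sketched as a toy model. All of the class and method names below are our own invention; the real order imposes legal obligations on the parties, not a data format or software design:

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    @dataclass
    class BlockingAgent:
        """Stands in for the plaintiffs' third-party agent, which identifies
        infringing IP addresses and tells ISPs when to block and unblock them."""
        blocked: dict = field(default_factory=dict)  # ip -> unblock time (game end)

        def notify_block(self, ip: str, game_end: datetime) -> None:
            # On notice, ISPs must block "as soon as practical", and within
            # 30 minutes of the start of the game or of the notice itself.
            self.blocked[ip] = game_end

        def sweep(self, now: datetime) -> list:
            # IP addresses are unblocked at the conclusion of the game.
            expired = [ip for ip, end in self.blocked.items() if now >= end]
            for ip in expired:
                del self.blocked[ip]
            return expired

    agent = BlockingAgent()
    game_end = datetime(2022, 4, 9, 22, 0)
    agent.notify_block("203.0.113.7", game_end)   # stream detected mid-game
    agent.sweep(game_end + timedelta(minutes=1))  # -> ["203.0.113.7"] unblocked
    ```

    The sketch highlights what makes the order “dynamic”: blocking decisions are made by the agent in real time during the broadcast, with the Court supervising through reporting after the fact rather than approving each address in advance.
    
    
    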

    While GoldTV did open the floodgates for rightsholders to seek blocking orders, the “dynamic” blocking here was largely ordered because of the nature of sports broadcasting, where viewers tend to watch games live. The door has clearly been opened for similar orders to be sought for other live sporting events, such as the soccer World Cup and the Olympics. 


Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

Copyright © 2024 The Canadian Technology Law Association, All rights reserved.