

News

  • 9 Nov 2021 2:26 PM | CAN-TECH Law (Administrator)


    November 24, 2021, 12 to 1 PM ET

    Please join us for our Women Telling Stories event. Our excellent panel of female lawyers from different practice areas and levels of experience will share stories from their professional journeys, providing insight on seizing opportunities and navigating challenges in the legal world. Our panellists will impart their most impactful professional advice, followed by a fireside chat with questions. You will no doubt find inspiration from these amazing mentors and learn about career and profile management for legal professionals, mentoring best practices, and work/life balance and wellness principles for lawyers and paralegals. 

    Agenda: 

    • Our panel of experts share their professional stories
    • Fireside chat with questions

    Moderated by:

    • Lisa Danay Wallace, WeirFoulds LLP
    • Maya Medeiros, Norton Rose Fulbright

    Our panel of experts: 

    • Nancy Cleman – Lapointe Rosenstein Marchand Melançon 
    • Catherine Lovrics – Marks & Clerk 
    • Nadine Letson – Microsoft 

    REGISTER HERE 

  • 20 Sep 2021 2:29 PM | CAN-TECH Law (Administrator)

    Unhappy plastic surgery patient held to have defamed surgeon in web reviews

    In Peterson v. Deck, Justice G.P. Weatherill of the British Columbia Supreme Court presided over a summary trial in a defamation case brought by a plastic surgeon against a patient who was dissatisfied with the result of her breast augmentation surgery. The defendant had posted comments about the surgeon’s work both on her own website and in Google Reviews, suggesting that he had made fundamental errors in his consultation with her and in the surgical work itself, and generally imputing that he was incompetent. She defended on the basis of justification, fair comment and qualified privilege.

    Justice Weatherill first dealt with the defendant’s argument that the claim was barred by the B.C. Protection of Public Participation Act, the province’s “anti-SLAPP” legislation, which is designed to ensure that valid commentary on issues of public interest is not suppressed by well-resourced plaintiffs bringing legal proceedings. He accepted the defendant’s argument that “a consumer review of a plastic surgeon’s skills is within the ambit of public interest,” citing earlier authority for the proposition that:

    Online reviews of goods or services offered to members of the public… are commonplace on Google or other web sites. While much of the general public may not be interested in [such] reviews…, it is enough that some segment of the community would have a genuine interest in receiving information on the subject.

    However, the claim was not sufficiently in the public interest to meet the requirements under the Act, because the claim had merit and the harm to the doctor’s interests outweighed the value of protecting the expression:

    This action was not brought to stifle or frustrate the defendant’s freedom of expression or prevent her from making reviews or participating in matters of public debate. Consumer reviews, as a general principle, ought to be encouraged and there is a very real danger of a chilling effect if they are curtailed. However, such reviews should not be left unbridled. Online review platforms are not a carte blanche to say whatever one wishes without potential consequences. This case was brought to vindicate the plaintiff’s reputation as a plastic surgeon in light of the Posts.

    On the main claim, the court easily found that the posts were defamatory, given that they would tend to lower the reputation of the plaintiff in the eyes of the public. Justice Weatherill found that the posts had been “published” in spite of little evidence being led on whether anyone had read the reviews; however, they had been read by the surgeon’s administrative assistant, and the defendant herself had received responses to them and stated publicly that they had “gone viral,” which the court held to be sufficient.

    Moving on to defences, Weatherill J. held that justification/truth was not available to the defendant, since the evidence indicated that she had made factual statements that were demonstrably false, regarding what had been discussed with her by the surgeon and what had taken place during the surgery. Nor was fair comment available, since the subjective opinions the defendant expressed in the posts were based on untrue facts. The court held for the plaintiff on liability.

    As to damages, Weatherill J. noted that “the defendant published the Posts using the internet, a medium with tremendous power to harm a person’s reputation by spreading falsehoods far and wide,” and imposed damages of $30,000. He also imposed a mandatory injunction requiring the defendant to take down the posts and enjoining her from posting them elsewhere.

  • 20 Sep 2021 2:28 PM | CAN-TECH Law (Administrator)

    Methodology problems result in exclusion of survey results in trademark dispute

    At issue in Tokai of Canada Ltd. v. Kingsford Products Company, LLC was the admissibility of evidence generated by a consumer survey designed to test consumer reaction to the use of the word “KING.” The plaintiff was trying to register this as a trademark for barbeques and butane lighters, but the defendant objected based on its numerous trademarks using the word “KINGSFORD” with similar products. As fresh evidence on judicial review, the plaintiff had sought to lead expert evidence of an internet survey comprising 707 interviews with consumers who had purchased, or planned to purchase, a butane lighter. Justice Fuhrer noted that, because survey evidence is a species of expert opinion evidence, it must meet the usual requirements for admissibility, and:

    Further, to be considered relevant, the survey must be both reliable (in that if it were repeated it would produce the same results) and valid (in that the right questions were put to the right pool of survey participants in the right way and in the right circumstances to produce the evidence sought)[.]

    Here, there were both validity and reliability problems with the proffered survey evidence, which “highlights the challenges in attempting to simulate a consumer’s imperfect recollection at the time when they encounter the products and trademark in issue in the marketplace.” There were four particular deficiencies. First, the survey referred only to “butane lighters,” but it might not have been clear to the average consumer, who was in a hurry, whether this referred to cigarette lighters or utility lighters, which could have skewed results. Second, some completed surveys were excluded from the results evaluated because the market researcher conducting the survey judged that they had been “completed too quickly,” yet quick completion was likely to be a feature of a survey meant to capture the first impressions of the average, hurried consumer.

    Third, some survey participants were permitted to take many hours to complete the survey, which again was incorrect methodology for a survey meant to obtain first impressions. Finally, there were contextual and other gaps in some survey questions. For example, “[t]he manner in which the survey participant was shown the brand name KING online [was] not reflective of the manner in which the trademark would be encountered in the marketplace in the applicable circumstances (i.e. on packaging or the goods themselves, potentially along side other similar products, such as on a store shelf).” In particular, an online survey, Justice Fuhrer felt, was not a good means by which to emulate how consumers would encounter the goods in a store, as opposed to a commercial website. Due to all of these flaws, the evidence was not sufficiently reliable or valid, and thus was excluded.

  • 20 Sep 2021 2:27 PM | CAN-TECH Law (Administrator)

    Ontario College of Physicians and Surgeons issues guidelines for physician use of social media

    The College of Physicians and Surgeons of Ontario recently released a document entitled Social Media—Appropriate Use by Physicians, the stated goal of which was to provide “guidance to physicians about how to engage in social media while continuing to meet relevant legal and professional obligations.” The document notes that it is not itself a policy or formal means of establishing rules for conduct, but rather is intended to help physicians comply with existing professional expectations “in the social media sphere.”

    The guidelines recommend that physicians:

    1. Assume that all content on the Internet is public and accessible to all.
    2. Exercise caution when posting information online that relates to an actual patient, in order to ensure compliance with legal and professional obligations to maintain privacy and confidentiality. Bear in mind that an unnamed patient may still be identified through a range of other information, such as a description of their clinical condition, or area of residence.
    3. Refrain from providing clinical advice to specific patients through social media. It is acceptable, however, to use social media to disseminate generic medical or health information for educational or information sharing purposes.
    4. Protect their own reputation, the reputation of the profession, and the public trust by not posting content that could be viewed as unprofessional.
    5. Be mindful of their Internet presence, and be proactive in removing content posted by themselves or others which may be viewed as unprofessional.
    6. Refrain from establishing personal connections with patients or persons closely associated with them online, as this may not allow physicians to maintain appropriate professional boundaries and may compromise physicians’ objectivity. It is acceptable to create an online connection with patients for professional purposes only.
    7. Refrain from seeking out patient information that may be available online without prior consent.
    8. Read, understand, and apply the strictest privacy settings necessary to maintain control over access to their personal information, and social media presence undertaken for personal purposes only.
    9. Remember that social media platforms are constantly evolving, and be proactive in considering how professional expectations apply in any given set of circumstances.

  • 20 Sep 2021 2:26 PM | CAN-TECH Law (Administrator)

    New statutory tort for intimate images held not to cover all the harms alleged by the plaintiff

    Madam Justice Inglis of the Alberta Court of Queen’s Bench has recognized, for the first time, the tort of “public disclosure of private facts” in the province of Alberta. In ES v Shillington, the plaintiff had previously obtained default judgment against the defendant on a number of recognized claims, including assault, battery, sexual assault and intentional infliction of emotional distress. A separate hearing was held to assess damages and to determine whether judgment would be granted on the novel tort and for breach of confidence.

    The case flowed from a very unpleasant marital break-up, in which the plaintiff fled her husband, moving from New Brunswick, where he was posted with the military, to her home in Alberta. In addition to a number of assaults, the plaintiff alleged that he had posted images of her online:

    [11] While he was deployed, near the end of their relationship, the Defendant confessed to the Plaintiff that he had posted her images online. Through accessing the Defendant’s social media accounts the Plaintiff was able to track some of these postings and was disturbed to find many of those private, explicit images available on the internet at pornography sites. At no time did the Defendant have the Plaintiff’s consent to publish these images. The Defendant admitted that he had posted photos as early as 2006, and the Plaintiff has located images posted as late as 2018. As recently as early 2021 the Plaintiff was able to find some of these images online.

    [12] The availability of these photos, including the fact that the Plaintiff is identifiable in some images, resulted in the Plaintiff being recognized in them by a neighbour that spoke to her sexually, having seen her likeness on a website. She has experienced significant mental distress and embarrassment as a result of the postings. She suffers nervous shock, psychological and emotional suffering, depression, anxiety, sleep disturbances, embarrassment, humiliation, and other impacts to her wellbeing.

    The Court carried out a thorough review of whether the novel tort should be recognized in Alberta, considering the criteria set out by the Supreme Court of Canada in Nevsun Resources Ltd v Araya:

    Three clear rules for when the courts will not recognize a new nominate tort have emerged: (1) The courts will not recognize a new tort where there are adequate alternative remedies (see, for example, Scalera); (2) the courts will not recognize a new tort that does not reflect and address a wrong visited by one person upon another (Saskatchewan Wheat Pool, at pp 224-25); and (3) the courts will not recognize a new tort where the change wrought upon the legal system would be indeterminate or substantial (Wallace v United Grain Growers Ltd, [1997] 3 SCR 701 (SCC), at paras 76-77). Put another way, for a proposed nominate tort to be recognized by the courts, at a minimum it must reflect a wrong, be necessary to address that wrong, and be an appropriate subject of judicial consideration.

    Since the publication of the images, Alberta has created a statutory tort of non-consensual distribution of intimate images (the Protecting Victims of Non-Consensual Distribution of Intimate Images Act), but it could not be applied retrospectively to provide the plaintiff with a remedy. Even if it could, the statutory tort is narrowly circumscribed and does not address all the harms she experienced. In the circumstances, without recognition of the new tort, the plaintiff would have no alternative remedy.

    The Court found that this tort was necessary to address an act of deliberate wrongdoing carried out by the defendant, and was one that is appropriate for adjudication:

    [62] The conduct complained of by the Plaintiff clearly meets this third test; it is appropriate for judicial adjudication. The change sought of this court is a determinate and substantial change that recognizes the inherent harm done by dissemination of private content. When conduct attracts legislative and parliamentary attention, its wrongfulness is apparent. From Jane Doe #2 at para 88: “…Failing to develop the legal tools to guard against the intentional, unauthorized distribution of intimate images and recordings on the internet would have a profound negative significance for public order as well as the personal wellbeing and freedom of individuals.”

    Conclusion re cause of action for Public Disclosure of Private Facts

    [63] The existence of a right of action for Public Disclosure of Private Facts is thus confirmed in Alberta. To do so recognizes these particular facts where a wrong exists for which there are no other adequate remedies. The tort reflects wrongdoing that the court should address. Finally, declaring the existence of this tort in Alberta is a determinate incremental change that identifies action that is appropriate for judicial adjudication.

    Following a review of the relevant caselaw from outside of Alberta, the Court stated the elements of the tort in the province:

    [68] Therefore, in Alberta, to establish liability for the tort of Public Disclosure of Private Facts, the Plaintiff must prove that:

    (a) the defendant publicized an aspect of the plaintiff’s private life;

    (b) the plaintiff did not consent to the publication;

    (c) the matter publicized or its publication would be highly offensive to a reasonable person in the position of the plaintiff; and,

    (d) the publication was not of legitimate concern to the public.

    Given the overlap of damages among the intentional torts claimed, the damage award was not divided among them. Under those heads, the plaintiff sought, and the court awarded, general damages of $80,000, punitive damages of $50,000 and aggravated damages of $25,000.

  • 20 Sep 2021 2:24 PM | CAN-TECH Law (Administrator)

    Government is seeking input on framework to make internet companies responsible for policing content available in Canada

    Almost immediately before the 2021 general election was called, the federal Department of Heritage launched a consultation on a proposed framework to address “online harms” through the imposition of obligations on online communications service providers, the creation of a new regulator in Ottawa, and enormous penalties. The announcement was accompanied by a discussion guide and a technical paper, the latter of which describes the proposed framework in detail. In the subsequent campaign, the Liberal Party indicated its intention to pass “online harms” legislation within its first 100 days if re-elected.

    The framework is intended to address five categories of content that are largely already illegal, though the proposed definitions go beyond what the courts have determined to be unlawful: (a) terrorist content; (b) content that incites violence; (c) hate speech; (d) non-consensual sharing of intimate images; and (e) child sexual exploitation content.

    Highlights of the proposed framework include requiring online communications service providers to have a flagging mechanism for harmful content, with review and removal within 24 hours. It specifically calls for making the content inaccessible in Canada. The new regulator, the Digital Safety Commissioner, would have very broad powers of supervision and significant transparency obligations would be placed on the providers. An appeal mechanism would rest in a new Digital Recourse Council of Canada, with further appeals going to a new Personal Information and Data Protection Tribunal.

    The proposal anticipates massive penalties for non-compliance in an amount up to 3% of a provider’s gross global revenue or up to ten million dollars ($10,000,000), whichever is higher.

    While the framework is detailed, it anticipates significant provisions would be left to regulation either by the Digital Safety Commissioner or the Governor-in-Council, or both.

    A number of experts, including Professors Michael Geist and Emily Laidlaw, have been critical of the approach taken by the Department, and Daphne Keller of Stanford has said that the proposal “disregard(s) international experience with past laws and similar proposals around the world, as well as recommendations from legal and human rights experts inside and outside of Canada.”

    The consultation is open for comments until September 25, 2021.

  • 23 Jul 2021 2:41 PM | CAN-TECH Law (Administrator)

    Demand for passwords does not amount to self-incrimination; Alberta Court of Appeal finds child porn admissible despite Charter breach

    In R. v. Al-Askari, the Alberta Court of Appeal rendered a decision on the continually controversial issue of searches of electronic devices at Canada’s borders. The accused, a Palestinian national, made a refugee claim at the Canada-US border in Coutts, Alberta. As part of routine screening under the Immigration and Refugee Protection Act (IRPA), a CBSA official asked for and received the passwords for the accused’s two cell phones, on one of which she saw child pornography. She then stopped the search and arrested him. Subsequent warranted searches of the phones and several other electronic devices revealed hundreds of child pornography images and videos. At trial, the accused made an unsuccessful Charter challenge to the initial search of the two phones and was convicted of importing and possessing child pornography. He appealed the finding on the search and also (with leave from the Court of Appeal) contested the search of the other devices subsequent to the search of the phones.

    While most “border search” cases have concerned searches conducted under the Customs Act, the Court of Appeal carefully explored the legislative context of this search, which arose under the IRPA. It identified two possible provisions under which a search of a refugee claimant could be authorized. The first was section 139, under which a search may be conducted if the officer believes on reasonable grounds that the person has not revealed their identity, has undisclosed information about their identity, or has committed or has evidence about particular offences (human smuggling/trafficking, etc.). The officer had testified that she had been looking for evidence of inadmissibility on the basis of security, criminality or identity. Thus s. 139 had not authorized the search.

    The second was section 16 of the IRPA. The Court held that s. 16(1), which requires the person to produce all information relevant to their refugee claim, did not apply. Section 16(3) allows officers to obtain “any evidence that may be used to establish their identity or compliance with this Act.” The Court held that the Crown’s argument that this created a broad and general search power was not in keeping with the need for a constitutionally protected core of privacy in electronic devices, even at the border, where the expectation of privacy is somewhat attenuated. The Court’s findings are worth setting out in some detail, in part because it reviewed its earlier finding in R. v. Canfield, which dealt with e-device searches under the Customs Act (but was released while this newsletter was on a COVID-required hiatus):

    [48] The Crown reads s 16(3) more broadly as providing an unqualified search power. As Mr Al Askari emphasizes, this would create one of the broadest search powers in Canadian law. Without any restraint or need to provide an articulated basis, the officer could require a strip search, cavity search, DNA seizure, and encryption passwords, as long as the search was directed toward establishing the applicant’s identity or to ensure compliance with IRPA.

    [49] The more limited approach suggested by Mr Al Askari is supported by R v LE, 2019 ONCA 961, paras 66-67, 382 CCC (3d) 202, leave to appeal dism’d 2020 CanLII 33846 (SCC). The Court of Appeal for Ontario concluded that s 16(3) creates a qualified statutory search power.

    [50] LE involved a search of a cell phone of the accused who was in Canada illegally and subject to a removal order. The officer had a reasonably grounded belief that the accused was attempting to contravene her removal order and had wrongly made phone contact with her husband. The officer expressly relied upon s 16(3) as the source of authority for the search: para 44.

    [51] The court held that the search was lawful and the scope of the search was restricted to establishing the person’s identity or determining compliance with IRPA. Although there are procedural and substantive limits on this search process, there is no limit on the subject matter of the search since the officer is permitted to obtain “any evidence” as long as that evidence is to establish the person’s identity or determine compliance with IRPA: paras 68-69.

    [52] The court suggested that the search power under s 16(3) requires a “reasonable grounds belief”, para 70:

    In my view, s. 16(3) authorized the CBSA officer’s search of the appellant’s cell phone. The appellant was a foreign national; she had been arrested and detained and was subject to a removal order. The CBSA officers sought evidence that the appellant was attempting to contravene her removal order. They sought evidence from the LG Nexus cell phone in the appellant’s possession on arrest, to determine the appellant’s compliance (or lack thereof) with the IRPA, having information that could support a reasonable ground belief the appellant was obstructing her removal from Canada. [emphasis added]

    [53] This approach is consistent with R c Patel, 2018 QCCQ 7262, paras 64-66. It held that a cell phone search of a refugee claimant was authorized under ss 16(1), 16(3), 139 and 140 of IRPA upon which the officer explicitly relied. The search was necessary to determine the accused’s true identity because of bona fide concerns about his identification documents and the answers he provided when questioned.

    [54] Both LE and Patel are examples of what is contemplated by the text of s 16(3). Patel was concerned with further evidence of identity. LE addressed evidence of “compliance with the Act” as the accused was subject to a removal order under the IRPA. Neither case involved a broad suspicionless search for criminality.

    [55] A finding that s 16(3) does not authorize suspicionless searches is consistent with this Court’s decision in Canfield.

    [56] At issue in Canfield was the constitutionality of the Customs Act provision that permits the routine inspection of “goods”: s 99(1)(a). Earlier jurisprudence treated the search of electronic devices as coming within the definition of “goods” under s 99(1)(a) and falling within the first category of Simmons: routine searches that could be undertaken without any individualized grounds. This Court held that s 99(1)(a) of the Customs Act was unconstitutional to the extent that it imposed no limits on searches of electronic devices at the border. The definition of “goods” in s 2 of the Customs Act was deemed of no force or effect insofar as the definition included the contents of personal electronic devices for the purpose of s 99(1)(a): para 7. Notably, this Court said that not all searches of phones are the same; some will be more invasive than others: para 34. But routine, suspicionless searches of these devices are not constitutional under s 99(1)(a).

    [57] This Court went on to say, paras 75-76, that a justified search of a personal electronic device needs a threshold requirement of suspicion, but was reluctant to define the boundaries of that threshold, preferring to leave that question to Parliament:

    …To be reasonable, such a search must have a threshold requirement…[I]n our view the threshold for the search of electronic devices may be something less than the reasonable grounds to suspect required for a strip search under the Customs Act … [but] … we decline to set a threshold requirement for the search of electronic devices at this time. Whether the appropriate threshold is reasonable suspicion, or something less than that having regard to the unique nature of the border, will have to be decided by Parliament and fleshed out in further cases. However, to the extent that s 99(1)(a) permits the unlimited search of personal electronic devices without any threshold requirement at all, it violates the protection against unreasonable search in s 8 of the Charter.

    We hasten to add that not all searches of personal electronic devices are equal …

    [58] This Court was alive to the reality that travellers often store relevant documents for customs purposes on their electronic devices. Although an unlimited and suspicionless search of a device would breach the Charter, some documents stored on devices must be made available to border agents as part of the routine screening process. For example, receipts and other information relating to the value of imported goods and travel-related documents, would be essential to routine screening. “The review of such items on a personal electronic device during a routine screening would not constitute an unreasonable search under s 8”: para 79.

    [59] Routine and suspicionless searches of personal electronic devices under IRPA must be limited to the purposes provided in the text: identification and admissibility. Persons have a higher privacy interest in their devices even at the border. Not all searches of devices are overly intrusive, and relevant documents are often stored on these devices. It follows that, under s 16, officers may review documents on personal electronic devices where necessary for identification and admissibility purposes. For example, an officer could ask a refugee claimant to locate the relevant documents on their device instead of independently searching for them. In this situation, a search would only occur if the person could not meet the request.

    [60] In addition, Canfield followed the guidance from R v Fearon, 2014 SCC 77, paras 74-83, [2014] 3 SCR 621, regarding tailored and precise search protocols. The court warned against open-ended searches, even if done for statutorily prescribed purposes. Thus, a justifiable search of a personal electronic device for the purposes of identification and admissibility must limit the invasion of privacy by conducting the search in a manner that is tailored, and only where the officer is unable to otherwise satisfy themselves of identity and admissibility.

    Here, the CBSA officer had not had any indicators of inadmissibility or criminality when she conducted the search, and explicitly stated that suspicion had arisen only when she viewed the child porn images on the phone. The Court concluded its analysis by finding that a search under s. 16(3) must be grounded in “a reasonable suspicion with respect to the claimant’s identity, admissibility, or other compliance with the IRPA.” Accordingly, both the initial search of the phones and the subsequent search of the other devices had breached s. 8 of the Charter as unreasonable searches and seizures.

    The accused also argued that by demanding his phone passwords the officials had breached his right against self-incrimination under s. 7 of the Charter. Relying on the 2006 decision by the Ontario Court of Appeal in R. v. Jones, as well as its own decision in Canfield, the Court noted that there is no ab initio right to remain silent during inspection at the border, due to its unique nature. While questioning grounded in some “strongly particularized suspicion” that created a detention of the person might engage the protection against self-incrimination, routine border screening did not. Here, the screening had been routine and no detention had arisen.

    Having found the searches unconstitutional, the Court of Appeal nonetheless refused to exclude the evidence under s. 24(2) of the Charter. The state of the law at the time of the search—six years earlier—had been in flux and the officer’s belief that her search was permissible had been reasonable. The examination of photos was intrusive upon the accused’s privacy, particularly as he was religiously concerned about people viewing photos of female members of his family. However, the photos were highly reliable real evidence and society’s interest in trial on the merits had been high.

  • 23 Jul 2021 2:38 PM | CAN-TECH Law (Administrator)

    Yay! Another consultation

    Innovation, Science and Economic Development Canada has launched a consultation on copyright, artificial intelligence and the internet of things, which is open for comment until September 17, 2021. The launch was accompanied by a consultation paper, which sets out its goals:

    With this consultation, the Government invites both evidence of a technical nature and views on potential policy directions described in more detail in the paper. AI and IoT are fast evolving technologies, uses of these technologies are changing, and consumers and businesses are facing new copyright-related challenges when using these complex technologies.

    The types of technical evidence sought in this consultation include technical information about how an AI model integrates data from copyright-protected works as it "learns" from that data, the roles of various human players involved in the creation of works using AI, the extent to which copyright-protected works are integrated in AI applications after they are trained and commercialised, and the uses of AI-assisted and AI-generated works by businesses and consumers. With respect to IoT, evidence sought includes technical information about TPMs, how stakeholders interact with TPMs in their respective industries, and the necessary steps, third party assistance, and devices required to circumvent a TPM and perform associated tasks, such as repair or achieving interoperability of two products. Relaying experiences in other markets or jurisdictions that have enacted new measures related to AI and copyright would also be of interest.

    In considering possible copyright measures relating to AI and IoT, the Government will be guided by the extent to which measures would help achieve the following objectives:

    1. Support innovation and investment in AI and other digital and emerging technologies in all sectors in Canada. AI has tremendous potential for society if used ethically and responsibly, and could also drive productivity growth across the economy.
    2. Support Canada's cultural industries and preserve the incentive to create and invest provided by the economic rights set out in the Act. Creators, innovators and rights holders should be adequately remunerated for their works or other copyright subject matter.
    3. Support competition and marketplace needs regarding IoT devices and other software-enabled products. Consumers want to be able to maintain and more easily repair the products they own, while innovators want flexibility and certainty to develop software-enabled products that are interoperable with those of other manufacturers.

    Specific topics covered include authorship and ownership of works created by artificial intelligence, and the right to repair in the context of the "internet of things".

  • 23 Jul 2021 2:37 PM | CAN-TECH Law (Administrator)

    Ontario Divisional Court strikes intrusion upon seclusion claim based on recklessness

    In a two-to-one split decision in Owsianik v. Equifax Canada Co., the majority of a three-judge panel of the Ontario Divisional Court struck a class action claim against Equifax Canada that was based on the tort of "intrusion upon seclusion". The tort, as first established in Canada in Jones v. Tsige, can apply where the defendant's intrusion was intentional or reckless. In the case before the Court, Equifax Canada appealed a certification decision that would have allowed the case to proceed based on the allegation that Equifax had been reckless.

    The plaintiffs argued on appeal that the contours of the privacy tort are evolving and it should be up to the trial judge to determine whether Equifax had been reckless and, if so, whether it triggered the intrusion tort. Equifax, on the other hand, argued that the certification judge’s decision went beyond the “incremental development principle” and that novel claims such as these should be vetted at the certification stage.

    The majority of the Divisional Court allowed Equifax's appeal, reasoning:

    [54] The tort of intrusion upon seclusion was defined authoritatively only nine years ago. It has nothing to do with a database defendant. It need not even involve databases. It has to do with humiliation and emotional harm suffered by a personal intrusion into private affairs, for which there is no other remedy because the loss cannot be readily quantified in monetary terms. I agree that Sharpe J.A.’s definition of the tort is not necessarily the last word, but to extend liability to a person who does not intrude, but who fails to prevent the intrusion of another, in the face of Sharpe J.A.’s advertence to the danger of opening the floodgates, would, in my view, be more than an incremental change in the common law.

    [55] I agree with my colleague (paragraph 43) that Equifax’s actions, if proven, amount to conduct that a reasonable person could find to be highly offensive. But no one says that Equifax intruded, and that is the central element of the tort. The intrusion need not be intentional; it can be reckless. But it still has to be an intrusion. It is the intrusion that has to be intentional or reckless and the intrusion that has to be highly offensive. Otherwise the tort assigns liability for a completely different category of conduct, a category that is adequately controlled by the tort of negligence.

    The court also concluded that where a defendant has not taken adequate steps to secure its databases, plaintiffs have a remedy in the tort of negligence, which "protects them adequately and has the advantage that it does not require them to prove recklessness."

  • 23 Jul 2021 2:34 PM | CAN-TECH Law (Administrator)

    Federal regulator calls it “a step back overall” for privacy

    The federal Privacy Commissioner, who oversees the Personal Information Protection and Electronic Documents Act and who would be the lead regulator if Bill C-11 ever becomes law, has slammed the bill as a "step back overall" for privacy. In a lengthy submission to the House of Commons Standing Committee on Access to Information, Privacy and Ethics, the Commissioner says the bill does not strike the right balance between privacy and commercial interests and is out of step with legislation in other jurisdictions. The main concerns are summarized in a press release issued at the same time by the Commissioner:

    Control

    Instead of giving consumers greater control over the collection, use and disclosure of their personal information, Bill C-11 offers less control. It omits the requirement under existing law that individuals understand the consequences of what they are consenting to for it to be considered meaningful, and it allows the purposes for which organizations seek consent to be expressed in vague, if not obscure, language.

    New flexibility without increased accountability

    In the digital economy, organizations need some degree of flexibility to use personal information, sometimes without consent, in order to maximize the potential of the digital revolution for socio-economic development. But with greater flexibility for companies should come greater accountability.

    Unfortunately, Bill C-11 weakens existing accountability provisions in the law by defining accountability in a manner akin to self-regulation.

    Organizations should be required to apply the principles of Privacy by Design and undertake privacy impact assessments for new higher risk activities. The law should also subject organizations to proactive audits by the OPC to ensure they are acting responsibly.

    Responsible innovation

    Bill C-11 seeks to provide greater flexibility to organizations through new exceptions to consent. However, certain exceptions are too broad or ill-defined to promote responsible innovation. The preferred approach would be to adopt an exception to consent based on legitimate business interests, within a rights-based approach.

    A rights-based foundation

    Bill C-11 prioritizes commercial interests over the privacy rights of individuals. While it is possible to protect privacy while giving businesses greater flexibility to innovate responsibly, when there is a conflict, privacy rights should prevail.

    To that end, the Bill should be amended to adopt a rights-based framework that would entrench privacy as a human right and as an essential element for the exercise of other fundamental rights. The OPC submission recommends doing this in a way that would strengthen the constitutional foundation of the law as properly within the jurisdiction of Parliament.

    Access to quick and effective remedies

    Bill C-11 gives the OPC order-making power and the ability to recommend very high monetary penalties. However, both are subject to severe limitations and conditions, including the addition of an administrative appeal between the OPC and the courts that would deny consumers quick and effective remedies.

    Only a narrow list of violations could lead to the imposition of administrative penalties. The list does not include obligations related to the form or validity of consent or the numerous exceptions to consent. It also does not include violations of the accountability provisions.

    In the case of failure to comply with these obligations, only criminal sanctions would apply and only after a process that could take approximately seven years. A process that would take a maximum of two years is recommended.

  

Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

contact@cantechlaw.ca

Copyright © 2022 The Canadian Technology Law Association, All rights reserved.