News

  • 20 Sep 2021 2:24 PM

    Government is seeking input on framework to make internet companies responsible for policing content available in Canada

    Almost immediately before the 2021 general election was called, the federal Department of Canadian Heritage launched a consultation on a proposed framework to address “online harms” through the imposition of obligations on online communications service providers, the creation of a new regulator in Ottawa and enormous penalties. The announcement was accompanied by a discussion guide and a technical paper, the latter of which describes the proposed framework in detail. During the subsequent campaign, the Liberal Party indicated its intention to pass “online harms” legislation in its first 100 days if re-elected.

    The framework is intended to address five categories of content that are already largely illegal, though the proposed definitions go beyond what the courts have determined to be unlawful: (a) terrorist content; (b) content that incites violence; (c) hate speech; (d) non-consensual sharing of intimate images; and (e) child sexual exploitation content.

    Highlights of the proposed framework include requiring online communications service providers to have a flagging mechanism for harmful content, with review and removal within 24 hours. It specifically calls for making the content inaccessible in Canada. The new regulator, the Digital Safety Commissioner, would have very broad powers of supervision and significant transparency obligations would be placed on the providers. An appeal mechanism would rest in a new Digital Recourse Council of Canada, with further appeals going to a new Personal Information and Data Protection Tribunal.

    The proposal contemplates massive penalties for non-compliance of up to 3% of a provider’s gross global revenue or ten million dollars ($10,000,000), whichever is higher.
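
    For illustration only, and as an assumption drawn from the newsletter’s description rather than from the technical paper itself, the proposed ceiling is simply the greater of two figures. A minimal Python sketch:

    # Hypothetical sketch of the proposed penalty ceiling described above:
    # the greater of 3% of a provider's gross global revenue or $10,000,000.
    def proposed_penalty_ceiling(gross_global_revenue: float) -> float:
        """Return the higher of 3% of gross global revenue or $10 million."""
        return max(0.03 * gross_global_revenue, 10_000_000)

    # For example, a provider with $1 billion in gross global revenue would face
    # a ceiling of $30 million, since 3% of revenue exceeds the $10 million floor.
    print(proposed_penalty_ceiling(1_000_000_000))  # 30000000.0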

    While the framework is detailed, it anticipates significant provisions would be left to regulation either by the Digital Safety Commissioner or the Governor-in-Council, or both.

    A number of experts, including Professors Michael Geist and Emily Laidlaw, have been critical of the approach taken by the Department, and Daphne Keller of Stanford has said the proposal “disregard(s) international experience with past laws and similar proposals around the world, as well as recommendations from legal and human rights experts inside and outside of Canada.”

    The consultation is open for comments until September 25, 2021.

  • 23 Jul 2021 2:41 PM

    Demand for passwords does not amount to self-incrimination; Alberta Court of Appeal finds child porn admissible despite Charter breach

    In R. v. Al-Askari, the Alberta Court of Appeal rendered a decision on the persistently controversial issue of searches of electronic devices at Canada’s borders. The accused was a refugee claimant of Palestinian nationality who made a refugee claim at the Canada-US border in Coutts, Alberta. As part of routine screening under the Immigration and Refugee Protection Act (IRPA), a CBSA official asked for and received the passwords for the accused’s two cell phones, on one of which she saw child pornography. She then stopped the search and arrested him. Subsequent warranted searches of the phones and several other electronic devices revealed hundreds of child pornography images and videos. At trial, the accused made an unsuccessful Charter challenge to the initial search of the two phones and was convicted of importing and possessing child pornography. He appealed the finding on that search and, with leave from the Court of Appeal, also contested the searches of the other devices that followed.

    While most “border search” cases have concerned searches conducted under the Customs Act, the Court of Appeal carefully explored the legislative context of this search, which arose under the IRPA. It identified two possible provisions under which a search of a refugee claimant could be authorized. The first, section 139, permits a search if the officer believes on reasonable grounds that the person has not revealed their identity, has undisclosed information about their identity, or has committed or has evidence about particular offences (human smuggling/trafficking, etc.). The officer had testified that she had been looking for evidence of inadmissibility on the basis of security, criminality or identity. Thus s. 139 had not authorized the search.

    The second was section 16 of IRPA. The Court held that s. 16(1), which requires the person to produce all information relevant to their refugee claim, did not apply. Section 16(3) allows officers to obtain “any evidence that may be used to establish their identity or compliance with this Act.” The Court held that the Crown’s argument that this created a broad and general search power was not in keeping with the need for a constitutionally protected core of privacy in electronic devices, even at the border where the expectation of privacy is somewhat attenuated. The Court’s findings are worth setting out in some detail, in part because it reviewed its earlier finding in R. v. Canfield, which dealt with e-device searches under the Customs Act (but was released while this newsletter was on a COVID-required hiatus):

    [48] The Crown reads s 16(3) more broadly as providing an unqualified search power. As Mr Al Askari emphasizes, this would create one of the broadest search powers in Canadian law. Without any restraint or need to provide an articulated basis, the officer could require a strip search, cavity search, DNA seizure, and encryption passwords, as long as the search was directed toward establishing the applicant’s identity or to ensure compliance with IRPA.

    [49] The more limited approach suggested by Mr Al Askari is supported by R v LE, 2019 ONCA 961, paras 66-67, 382 CCC (3d) 202, leave to appeal dism’d 2020 CanLII 33846 (SCC). The Court of Appeal for Ontario concluded that s 16(3) creates a qualified statutory search power.

    [50] LE involved a search of a cell phone of the accused who was in Canada illegally and subject to a removal order. The officer had a reasonably grounded belief that the accused was attempting to contravene her removal order and had wrongly made phone contact with her husband. The officer expressly relied upon s 16(3) as the source of authority for the search: para 44.

    [51] The court held that the search was lawful and the scope of the search was restricted to establishing the person’s identity or determining compliance with IRPA. Although there are procedural and substantive limits on this search process, there is no limit on the subject matter of the search since the officer is permitted to obtain “any evidence” as long as that evidence is to establish the person’s identity or determine compliance with IRPA: paras 68-69.

    [52] The court suggested that the search power under s 16(3) requires a “reasonable grounds belief”, para 70:

    In my view, s. 16(3) authorized the CBSA officer’s search of the appellant’s cell phone. The appellant was a foreign national; she had been arrested and detained and was subject to a removal order. The CBSA officers sought evidence that the appellant was attempting to contravene her removal order. They sought evidence from the LG Nexus cell phone in the appellant’s possession on arrest, to determine the appellant’s compliance (or lack thereof) with the IRPA, having information that could support a reasonable ground belief the appellant was obstructing her removal from Canada. [emphasis added]

    [53] This approach is consistent with R c Patel, 2018 QCCQ 7262, paras 64-66. It held that a cell phone search of a refugee claimant was authorized under ss 16(1), 16(3), 139 and 140 of IRPA upon which the officer explicitly relied. The search was necessary to determine the accused’s true identity because of bona fide concerns about his identification documents and the answers he provided when questioned.

    [54] Both LE and Patel are examples of what is contemplated by the text of s 16(3). Patel was concerned with further evidence of identity. LE addressed evidence of “compliance with the Act” as the accused was subject to a removal order under the IRPA. Neither case involved a broad suspicionless search for criminality.

    [55] A finding that s 16(3) does not authorize suspicionless searches is consistent with this Court’s decision in Canfield.

    [56] At issue in Canfield was the constitutionality of the Customs Act provision that permits the routine inspection of “goods”: s 99(1)(a). Earlier jurisprudence treated the search of electronic devices as coming within the definition of “goods” under s 99(1)(a) and falling within the first category of Simmons: routine searches that could be undertaken without any individualized grounds. This Court held that s 99(1)(a) of the Customs Act was unconstitutional to the extent that it imposed no limits on searches of electronic devices at the border. The definition of “goods” in s 2 of the Customs Act was deemed of no force or effect insofar as the definition included the contents of personal electronic devices for the purpose of s 99(1)(a): para 7. Notably, this Court said that not all searches of phones are the same; some will be more invasive than others: para 34. But routine, suspicionless searches of these devices are not constitutional under s 99(1)(a).

    [57] This Court went on to say, paras 75-76, that a justified search of a personal electronic device needs a threshold requirement of suspicion, but was reluctant to define the boundaries of that threshold, preferring to leave that question to Parliament:

    …To be reasonable, such a search must have a threshold requirement…[I]n our view the threshold for the search of electronic devices may be something less than the reasonable grounds to suspect required for a strip search under the Customs Act … [but] … we decline to set a threshold requirement for the search of electronic devices at this time. Whether the appropriate threshold is reasonable suspicion, or something less than that having regard to the unique nature of the border, will have to be decided by Parliament and fleshed out in further cases. However, to the extent that s 99(1)(a) permits the unlimited search of personal electronic devices without any threshold requirement at all, it violates the protection against unreasonable search in s 8 of the Charter.

    We hasten to add that not all searches of personal electronic devices are equal …

    [58] This Court was alive to the reality that travellers often store relevant documents for customs purposes on their electronic devices. Although an unlimited and suspicionless search of a device would breach the Charter, some documents stored on devices must be made available to border agents as part of the routine screening process. For example, receipts and other information relating to the value of imported goods and travel-related documents, would be essential to routine screening. “The review of such items on a personal electronic device during a routine screening would not constitute an unreasonable search under s 8”: para 79.

    [59] Routine and suspicionless searches of personal electronic devices under IRPA must be limited to the purposes provided in the text: identification and admissibility. Persons have a higher privacy interest in their devices even at the border. Not all searches of devices are overly intrusive, and relevant documents are often stored on these devices. It follows that, under s 16, officers may review documents on personal electronic devices where necessary for identification and admissibility purposes. For example, an officer could ask a refugee claimant to locate the relevant documents on their device instead of independently searching for them. In this situation, a search would only occur if the person could not meet the request.

    [60] In addition, Canfield followed the guidance from R v Fearon, 2014 SCC 77, paras 74-83, [2014] 3 SCR 621, regarding tailored and precise search protocols. The court warned against open-ended searches, even if done for statutorily prescribed purposes. Thus, a justifiable search of a personal electronic device for the purposes of identification and admissibility must limit the invasion of privacy by conducting the search in a manner that is tailored, and only where the officer is unable to otherwise satisfy themselves of identity and admissibility.

    Here, the CBSA officer had no indicators of inadmissibility or criminality when she conducted the search, and explicitly stated that suspicion had arisen only when she viewed the child pornography images on the phone. The Court concluded its analysis by finding that a search under s. 16(3) must be grounded in “a reasonable suspicion with respect to the claimant’s identity, admissibility, or other compliance with the IRPA.” Accordingly, both the initial search of the phones and the subsequent search of the other devices had breached s. 8 of the Charter as unreasonable searches and seizures.

    The accused also argued that by demanding his phone passwords the officials had breached his right against self-incrimination under s. 7 of the Charter. Relying on the 2006 decision by the Ontario Court of Appeal in R. v. Jones, as well as its own decision in Canfield, the Court noted that there is no ab initio right to remain silent during inspection at the border, due to its unique nature. While questioning grounded in some “strongly particularized suspicion” that created a detention of the person might engage the protection against self-incrimination, routine border screening did not. Here, the screening had been routine and no detention had arisen.

    Having found the searches unconstitutional, the Court of Appeal nonetheless refused to exclude the evidence under s. 24(2) of the Charter. The state of the law at the time of the search—six years earlier—had been in flux and the officer’s belief that her search was permissible had been reasonable. The examination of photos was intrusive upon the accused’s privacy, particularly as he was religiously concerned about people viewing photos of female members of his family. However, the photos were highly reliable real evidence and society’s interest in trial on the merits had been high.

  • 23 Jul 2021 2:38 PM

    Yay! Another consultation

    Innovation, Science and Economic Development Canada has launched a consultation on copyright, artificial intelligence and the internet of things, which is open for comment until September 17, 2021. The launch was accompanied by a consultation paper, which sets out its goals:

    With this consultation, the Government invites both evidence of a technical nature and views on potential policy directions described in more detail in the paper. AI and IoT are fast evolving technologies, uses of these technologies are changing, and consumers and businesses are facing new copyright-related challenges when using these complex technologies.

    The types of technical evidence sought in this consultation include technical information about how an AI model integrates data from copyright-protected works as it "learns" from that data, the roles of various human players involved in the creation of works using AI, the extent to which copyright-protected works are integrated in AI applications after they are trained and commercialised, and the uses of AI-assisted and AI-generated works by businesses and consumers. With respect to IoT, evidence sought includes technical information about TPMs, how stakeholders interact with TPMs in their respective industries, and the necessary steps, third party assistance, and devices required to circumvent a TPM and perform associated tasks, such as repair or achieving interoperability of two products. Relaying experiences in other markets or jurisdictions that have enacted new measures related to AI and copyright would also be of interest.

    In considering possible copyright measures relating to AI and IoT, the Government will be guided by the extent to which measures would help achieve the following objectives:

    1. Support innovation and investment in AI and other digital and emerging technologies in all sectors in Canada. AI has tremendous potential for society if used ethically and responsibly, and could also drive productivity growth across the economy.
    2. Support Canada's cultural industries and preserve the incentive to create and invest provided by the economic rights set out in the Act. Creators, innovators and rights holders should be adequately remunerated for their works or other copyright subject matter.
    3. Support competition and marketplace needs regarding IoT devices and other software-enabled products. Consumers want to be able to maintain and more easily repair the products they own, while innovators want flexibility and certainty to develop software-enabled products that are interoperable with those of other manufacturers.

    Specific topics covered include the questions of authorship and ownership of works created by artificial intelligence and the right to repair in the context of the “internet of things”.

  • 23 Jul 2021 2:37 PM

    Ontario Divisional Court strikes intrusion upon seclusion claim based on recklessness

    In a two-to-one split decision in Owsianik v. Equifax Canada Co., the majority of a three-judge panel of the Ontario Divisional Court struck a class action claim against Equifax Canada that was based on the tort of “intrusion upon seclusion”. The tort, as first established in Canada in Jones v. Tsige, can apply where the defendant was intentional or reckless in the intrusion. In the case before the Court, Equifax Canada appealed a certification decision that would have allowed the case to proceed based on the allegation that Equifax had been reckless.

    The plaintiffs argued on appeal that the contours of the privacy tort are evolving and it should be up to the trial judge to determine whether Equifax had been reckless and, if so, whether it triggered the intrusion tort. Equifax, on the other hand, argued that the certification judge’s decision went beyond the “incremental development principle” and that novel claims such as these should be vetted at the certification stage.

    The majority allowed Equifax’s appeal, reasoning:

    [54] The tort of intrusion upon seclusion was defined authoritatively only nine years ago. It has nothing to do with a database defendant. It need not even involve databases. It has to do with humiliation and emotional harm suffered by a personal intrusion into private affairs, for which there is no other remedy because the loss cannot be readily quantified in monetary terms. I agree that Sharpe J.A.’s definition of the tort is not necessarily the last word, but to extend liability to a person who does not intrude, but who fails to prevent the intrusion of another, in the face of Sharpe J.A.’s advertence to the danger of opening the floodgates, would, in my view, be more than an incremental change in the common law.

    [55] I agree with my colleague (paragraph 43) that Equifax’s actions, if proven, amount to conduct that a reasonable person could find to be highly offensive. But no one says that Equifax intruded, and that is the central element of the tort. The intrusion need not be intentional; it can be reckless. But it still has to be an intrusion. It is the intrusion that has to be intentional or reckless and the intrusion that has to be highly offensive. Otherwise the tort assigns liability for a completely different category of conduct, a category that is adequately controlled by the tort of negligence.

    The court also concluded that where a defendant has failed to take adequate steps to secure its databases, plaintiffs are not left without a remedy: the tort of negligence “protects them adequately and has the advantage that it does not require them to prove recklessness.”

  • 23 Jul 2021 2:34 PM

    Federal regulator calls Bill C-11 “a step back overall” for privacy

    The federal Privacy Commissioner, who oversees the Personal Information Protection and Electronic Documents Act and who would be the lead regulator if Bill C-11 ever becomes law, has slammed the bill as a “step back overall” for privacy. In a lengthy submission to the House of Commons Standing Committee on Access to Information, Privacy and Ethics, the Commissioner says the bill does not get the balance between privacy and commercial interests right and is out of step with legislation in other jurisdictions. The main concerns are summarized in a press release issued at the same time by the Commissioner:

    Control

    Instead of giving consumers greater control over the collection, use and disclosure of their personal information, Bill C-11 offers less control. It omits the requirement under existing law that individuals understand the consequences of what they are consenting to for it to be considered meaningful, and it allows the purposes for which organizations seek consent to be expressed in vague, if not obscure, language.

    New flexibility without increased accountability

    In the digital economy, organizations need some degree of flexibility to use personal information, sometimes without consent, in order to maximize the potential of the digital revolution for socio-economic development. But with greater flexibility for companies should come greater accountability.

    Unfortunately, Bill C-11 weakens existing accountability provisions in the law by defining accountability in a manner akin to self-regulation.

    Organizations should be required to apply the principles of Privacy by Design and undertake privacy impact assessments for new higher risk activities. The law should also subject organizations to proactive audits by the OPC to ensure they are acting responsibly.

    Responsible innovation

    Bill C-11 seeks to provide greater flexibility to organizations through new exceptions to consent. However, certain exceptions are too broad or ill-defined to promote responsible innovation. The preferred approach would be to adopt an exception to consent based on legitimate business interests, within a rights-based approach.

    A rights-based foundation

    Bill C-11 prioritizes commercial interests over the privacy rights of individuals. While it is possible to protect privacy while giving businesses greater flexibility to innovate responsibly, when there is a conflict, privacy rights should prevail.

    To that end, the Bill should be amended to adopt a rights-based framework that would entrench privacy as a human right and as an essential element for the exercise of other fundamental rights. The OPC submission recommends doing this in a way that would strengthen the constitutional foundation of the law as properly within the jurisdiction of Parliament.

    Access to quick and effective remedies

    Bill C-11 gives the OPC order-making power and the ability to recommend very high monetary penalties. However, both are subject to severe limitations and conditions, including the addition of an administrative appeal between the OPC and the courts that would deny consumers quick and effective remedies.

    Only a narrow list of violations could lead to the imposition of administrative penalties. The list does not include obligations related to the form or validity of consent or the numerous exceptions to consent. It also does not include violations of the accountability provisions.

    In the case of failure to comply with these obligations, only criminal sanctions would apply and only after a process that could take approximately seven years. A process that would take a maximum of two years is recommended.

  • 20 May 2021 2:47 PM

    Various recent court decisions show judges and parties wrestling—mostly successfully—with faked and misused electronic evidence

    As far back as the Uniform Law Conference of Canada’s 1998 Uniform Electronic Evidence Act—which formed the basis for the provisions in the Canada Evidence Act and various provincial acts dealing with the admissibility of electronic data—courts have been concerned about the provenance of electronic data when it is led as evidence. Specifically, there has always been concern that due to the inherent manipulability of digital data, electronic evidence could be fabricated by dishonest litigating parties and used to undermine the truth-seeking function of the trial process. Anecdotally this is something that happens often but, in our experience, rulings about it seldom show up in reported decisions. However, very recent case law indicates that, where parties and judges are properly attuned to these kinds of problems, they can be prevented and exposed, and the dishonest parties will reap the consequences.

    The trial judge in Lenihan v. Shankar was adjudicating a hotly contested custody, access and mobility dispute in a family law case, which (the judge held) featured remarkable amounts of subterfuge by the mother. Justice McGee decided in favour of the father, and provided written reasons both for potential appellate review and “to draw attention to the evidentiary challenges of spoofed communications and postings created to damage a parent’s credibility and tendered to gain litigation advantage.” Among the various evidentiary issues were arguments by the mother both that emails and texts adduced in evidence by the father were fake, and that emails and other messages that the mother adduced in evidence were genuine.

    Justice McGee first reviewed the provisions of the Ontario Evidence Act relating to the admissibility of “electronic records,” noting that s. 34.1(4) simply codified the “low threshold test at common law” for authentication at the admissibility stage, namely that the adducing party simply provide “some evidence” that the record is what it purports to be; final determination of authenticity is left for fact-finding. The primary focus at the admissibility stage, she held, is the integrity of the electronic record itself, which under s. 34.1(5.1) can be established by “evidence of the integrity of the electronic records system by or in which the data was recorded or stored, or by evidence that reliable encryption techniques were used to support the integrity of the electronic record.” She noted that these provisions, intended to satisfy “best evidence” concerns,

    …work to ensure that an electronic document accurately reflects the original information that was inputted or recorded on the device. With electronic documents, the focus shifts to the information contained in the document, rather than the document itself. The threshold for admissibility is low and at this stage, concerns are generally limited to the completeness and accuracy of the record.

    Here, the father had adduced in evidence a series of text messages between the parties over a 5-month period, which he “had exported…from his phone into a printable format using an application called ‘GIIApps SMSShare 2’.” The mother argued that the texts were all fake and created to “make her look bad,” but the judge rejected this on a number of bases. The mother had refused to produce her own version of the text exchanges; the texts contained pictures that she had taken of herself which she claimed were downloaded from her Facebook page, but there was no evidence that these pictures were ever on her Facebook page; in an earlier Notice to Admit she had admitted to sending a number of the texts she now claimed were fake; and while she claimed that the phone number identified in the texts as hers was also falsified, it appeared as her number in her own Exhibit Brief. The mother also argued that some emails from her to the father which had been adduced had also been faked “to make her look bad,” but the judge noted that the mother only ever used the originating email account, and that despite her claims that someone (probably the father) had been accessing her account to send the emails, she had made no effort to “change the account, change her password or set up a new account, any one of which would be the natural next step were her email to have been “hacked” or used inappropriately.”

    The mother also adduced a particular set of emails purportedly from the father, but Justice McGee held that these were fake on the basis that: the father testified that the sending email address was not his; the emails reflected the mother’s writing style and not the father’s, “inclusive of content, word choice and spelling”; the emails repeated false claims that had been earlier made by the mother and would have made no sense if attributed to the father; the emails indicated knowledge in the sender which the father would not have had at the relevant time; and the emails had markers indicating they had been printed from the mother’s known account. Although they were not authentic, the judge admitted them as evidence of the mother’s “extensive efforts to damage [the father]’s character, particularly in the eyes of their daughter’s service providers and the Court.” Other emails were similarly held to be bogus, in one case due to their clearly having been “copied and then pasted into a Word or other word processing document.”

    The mother also adduced in evidence communication logs from a co-parenting and custody app called “Our Family Wizard” which contained messages purportedly between the parties; however, the communications had numerous inconsistencies and ordering flaws, and the judge concluded that the mother had generated two accounts (one in the father’s name) and simply generated “messages that could be used as evidence.”

    Having found for the father on all matters, McGee J. concluded the judgment with some “final thoughts” on the subject of electronic evidence:

    247. As our court transitions to a fully digital platform, this trial was a stark reminder of the potential for the manipulation and misuse of electronic evidence.

    248. The most common internet definition of a spoofed email is when the email address in the “From” field is not that of the sender. It is easy to spoof an email, and not always so easy to detect. For sophisticated senders – such as actors who are “phishing” for information of commercial value – the origins of a spoofed email may never be detected.

    249. Spoofing originates from the idea of a hoax or a parody, and in the early days of the internet, it was a legitimate tool for managing communications so that a user believed that an email came from one source, when it actually came from another.

    250. Spoofing first arose as a term in family law (more commonly referred to in the U.S.A. as divorce law) to describe cell phone users hiding their identity and/or location for nefarious purposes. As a result of advances in mobile apps, websites, forwarding services and other technologies, callers are now able to change how their voice sounds, to evade a blocked number or to pretend to be a person or institution with whom their target was familiar. Targets can be tricked into disclosing sensitive information, harassed, stalked and frightened.

    251. Any electronic medium can be spoofed: texts, emails, postings to social media, and even messaging through a reputable software program specifically designed to provide secure communications between sparring parents.

    252. What stood out in this case was the purpose of the spoofed communications. Instead of tricking or scaring the target, electronic communications were spoofed to deliberately damage the other parent’s credibility and to gain litigation advantage.

    253. In R. v. C. B., the Ontario Court of Appeal foreshadowed the relevance of inauthentic electronic evidence. “[T]endered as bogus” is a critical catch that is not always apparent. A party’s lament that “it wasn’t me” may appear credible at one stage of the proceeding but may no longer be credible at a later stage. An email or text that on first reading appears authentic might later be found to be inauthentic when examined within the evidence as a whole.

    254. Fake electronic evidence has the potential to open up a whole new battleground in high conflict family law litigation, and it poses specific challenges for Courts. Generally, email and social media protocols have no internal mechanism for authentication, and the low threshold in the Evidence Act that requires only some evidence: direct and/or circumstantial that the thing “is what it appears to be;” can make determinations highly contextual.

    255. In a digital landscape, spoofing is the new “catch-me-if-you-can” game of credibility.

    256. I urge lawyers, family service providers and institutions to be on guard, and to be part of a better way forward. Courts cannot do this work alone, and the work must be done well. High conflict litigation not only damages kids and diminishes parents; it weakens society as a whole, for generations to come.
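
    As a purely illustrative aside, and not part of the judgment, the ease of spoofing that Justice McGee describes at paragraph 248 stems from the fact that an email’s displayed “From” header and the account that actually transmits the message are set independently. A minimal Python sketch using placeholder addresses:

    # Minimal sketch: the "From" header is ordinary text chosen by the sender
    # and need not match the account that actually transmits the message.
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "someone.else@example.com"   # what the recipient sees
    msg["To"] = "recipient@example.com"
    msg["Subject"] = "Header and sender can differ"
    msg.set_content("The From header above is independent of the real sender.")

    print(msg)  # shows the composed headers, including the chosen From line

    # Actually sending (left commented out) would go through whatever account
    # the sender controls; the envelope sender can differ from msg["From"].
    # import smtplib
    # with smtplib.SMTP("smtp.example.com") as server:
    #     server.send_message(msg, from_addr="actual.sender@example.net")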

    In R. v. Aslami, the Ontario Court of Appeal reversed the appellant’s conviction on multiple charges related to the firebombing of a home. The crux of the Crown’s case at trial had been various messages sent via Facebook, SMS text and a texting app called “TextNow.” The Court of Appeal accepted the appellant’s argument that the trial judge had failed to take into account issues with the authenticity and integrity of the messages, and that these issues tended to support the defence theory that the appellant’s ex-wife and her new boyfriend had been attempting to frame the appellant for the firebombing. Each group of messages, the Court wrote, “has its own particular frailties.”

    The SMS messages originated from a sender whom the ex-wife had identified in her own phone as the appellant, under the name “Sumal Jan” which she claimed was a name he was called by some in their home country. However, “[a] police detective gave evidence that there were several entries on the appellant’s ex-wife’s cellphone for the name ‘Sumal Jan’ that had different phone numbers associated to them.” The TextNow messages had been retrieved via screenshots of the ex-wife’s phone, but the timestamps had been uncertain, and the Crown had not led “any expert evidence regarding the functioning of the TextNow app, or its reliability, or any ability to manipulate the date, number, name of the sender, or any other details as to the operation of the app.” The trial judge had engaged in a speculative evaluation of this evidence, comparing spelling, phrasing and substantive content between the SMS texts and the TextNow messages, which had the effect of assuming that the appellant had sent them without there being reliable evidence to this effect. The Facebook messages had been exchanged between the ex-wife’s boyfriend and someone using the name “Trustnoone Mob,” and the only evidence linking the latter identity to the accused was the boyfriend’s testimony that he believed he was communicating with the accused.

    In allowing the appeal, Nordheimer J.A. commented:

    [30] As I said at the outset, trial judges need to be very careful in how they deal with electronic evidence of this type. There are entirely too many ways for an individual, who is of a mind to do so, to make electronic evidence appear to be something other than what it is. Trial judges need to be rigorous in their evaluation of such evidence, when it is presented, both in terms of its reliability and its probative value. The trial judge did not engage in that rigorous analysis in this case. In fairness, the trial judge was not assisted by the prosecution in this task. The prosecution ought to have called expert evidence to address the issues that the evidence posed, but they did not.

    Another case of interest is R. v. H.S.S., where Judge Chen of the British Columbia Provincial Court presided over the prosecution of a young person for the alleged sexual assault of another young person (both were 16 years old at the time of the incident). The complainant alleged that during a school day, her friends were using her phone to exchange Instagram messages with the accused, and told her that the accused wished to meet with her in the school’s “handicapped bathroom to talk.” She said she went to the meeting without reviewing any of the messages her friends had exchanged with the accused, and that he assaulted her by touching her sexually. It transpired, she said, that her friends and her sister had been exchanging Instagram messages with the accused for several days; she selected the “relevant” ones to give to the police upon reporting the alleged assault, and deleted the rest.

    In his defence the accused adduced all of the Instagram messages between himself and the complainant, which amounted to hundreds sent between the two over several days, and which disclosed that they had agreed to have a sexual encounter in the bathroom. He testified that she was a willing participant in the kissing and heavy petting that constituted the alleged assault. In cross-examination the complainant tried to explain away the many inconsistencies in her evidence by saying that her sister and friends (none of whom were called as witnesses) must have been using her phone to communicate with the accused, but the judge found her explanations unconvincing and implausible in light of her proven conduct and the numerous credibility problems her various stories presented.

    In the end Judge Chen held that “even the evidence of the Complainant leads to the inescapable conclusion that the Accused was indeed ‘set up’.” The judge ruled that the complainant and her sister had carried out a campaign of “cruel and callous” sexualized teasing of the accused, who was infatuated with the complainant, and that the Crown could not prove lack of consent to the activity in the bathroom. The accused was acquitted.

  • 20 May 2021 2:45 PM

    Information, misinformation and disinformation about COVID-19 have been ebbing and flowing along with the pandemic in 2020 and 2021. Many have been engaged in sharing their opinions about the virus, possible treatments and vaccinations, but the regulators of healthcare professionals have taken a keen interest in social media posts that may, in their view, harm public health.

    The College of Physicians and Surgeons of Ontario’s Inquiries, Complaints and Reports Committee (ICRC) has in three cases found that public social media commentary it considered to be inaccurate and misleading presented a potential risk to public health, and imposed discipline on the offending physician.

    One physician in particular was the subject of multiple complaints from the public related to her activities on Twitter, which included posts such as “There is absolutely no medical or scientific reason for this prolonged, harmful and illogical lockdown”, “If you have not yet figured out that we don’t need a vaccine, you are not paying attention” and “Contact tracing, testing and isolation.. is ineffective, naïve & counter-productive against COVID-19.. and by definition, against any pandemic.” Other posts gave rise to further complaints, including a tweet that strongly implied that hydroxychloroquine (HCQ) could “prevent, cure and treat early COVID-19” but that the federal government was withholding this treatment from the Canadian public for vague but sinister reasons.

    In three separate decisions, all of which are currently under judicial review, the College cautioned the physician for her lack of professionalism and failure to exercise caution in her postings, which were considered to be irresponsible and a possible risk to public health. The ICRC concluded:

    The Committee did not accept the Respondent’s position that her tweets come from a personal Twitter account that has no affiliation with her practice. The Respondent’s Twitter biography makes it very clear that she is a physician and also identifies her as the leader of a group of physicians, Concerned Ontario Doctors. The Respondent’s tweets are accessible by the public. Moreover, members of the public who are not healthcare professionals are likely to attribute significant weight and authority to the Respondent’s tweets, given her profession. Non-medically trained members of the public would likely have difficulty determining the scientific and medical validity of the Respondent’s tweets.

    On the basis of the above, the Committee decided that it would be appropriate to caution the Respondent in this matter.

    The ICRC stated that this was particularly problematic, and “irresponsible and careless in the current context and climate”.

    Though some may perceive otherwise, regulated professionals – including healthcare professionals – continue to have professionalism obligations when using social media, particularly when their comments are connected to their professional credentials and concern a controversial matter related to their profession.

  • 20 May 2021 2:42 PM

    The Ontario Superior Court of Justice asks: “In the age of Zoom, is any forum more non conveniens than another?”

    In a motion brought to stay proceedings in the Ontario court in favour of an arbitration in Chicago, the Ontario Superior Court of Justice had to grapple with time-worn arbitration and conflict of laws questions, but through a completely novel lens. Justice Morgan commenced the reasons in Kore Meals LLC v. Freshii Development LLC:

    [1] In the age of Zoom, is any forum more non conveniens than another? Has a venerable doctrine now gone the way of the VCR player or the action in assumpsit?

    The plaintiff, Kore Meals LLC, and defendant, Freshii Development LLC, were parties to a Development Agent Agreement that contained an arbitration clause that called for arbitration in the city of the defendant's head office, which was named as Chicago. The plaintiff wanted to litigate in the courts in Ontario, named as a co-defendant the parent company of Freshii Development that was based in Ontario, and pointed to the fact that the defendant’s only presence in Chicago was a mailbox. The defendant had no office or personnel in Chicago.

    In the usual course, an arbitrator would be called upon to determine whether the case was arbitrable. Because that arbitration would necessarily “sit” in Chicago, the plaintiff said that would be unfair and impractical, as it was that very venue that was being challenged. When asked where the American Arbitration Association is located, counsel for both parties indicated they were unsure, as all submissions would be made online. When asked if the proceeding would similarly be online, counsel advised the court that would likely be the case in light of the pandemic. With this information, the court wrote:

    [29] All of which undermines the majority of forum non conveniens factors. If hearings are held by videoconference, documents filed in digital form, and witnesses examined from remote locations, what is left of any challenge based on the unfairness or impracticality of any given forum? To ask the question is to answer it. Freshii Developments may have a miniature post office box or an entire office tower in Chicago, and witnesses or documents may be located in Canada’s Northwest Territories or in the deep south of the United States, and no location would be any more or less convenient than another.

    Following consideration of the fact that there would likely be no issue of enforcement of any arbitral award issued by an American arbitrator, the court concluded that location is currently largely irrelevant:

    [31] It is by now an obvious point, but it bears repeating that a digital-based adjudicative system with a videoconference hearing is as distant and as nearby as the World Wide Web. With this in mind, the considerable legal learning that has gone into contests of competing forums over the years is now all but obsolete. Judges cannot say forum non conveniens we hardly knew you, but they can now say farewell to what was until recently a familiar doctrinal presence in the courthouse.

    [32] And what is true for forum non conveniens is equally true for the access to justice approach to the arbitration question. Chicago and Toronto are all on the same cyber street. They are accessed in the identical way with a voice command or the click of a finger. No one venue is more or less unfair or impractical than another.

    The defendant’s motion for a stay was granted in favour of arbitration as contemplated in the agreement.

  • 16 Apr 2021 2:51 PM

    The CAN-TECH LAW Association offers a variety of ways for our members to get involved via our various committees. Volunteers are at the heart of what we do, and it is only because of the time and knowledge our volunteers contribute that we are able to fulfill our mission.

    Our committees in particular help shape key organizational decisions, provide strategic direction, and move the field of legal technology forward.

    Committee volunteer benefits include building a strong professional network, learning about key legal technology issues, and participating in professional development opportunities. Committee members are expected to actively participate in their committee; however, members will have the opportunity to work with committee co-chairs to discuss how much or how little time they can contribute to various initiatives.

    Below are current CAN-TECH LAW committee opportunities for 2021. However, we are open to creating new committees based on membership interest.

    • Women in Tech Law - This is a standing committee which supports female-identifying members of the legal bar by providing tools for success, including a variety of leadership development, education, networking and mentoring opportunities for women at all levels of their careers. Committee participation and events are open to all members and all are encouraged to participate.
    • Diversity - This is a standing committee whose principal project for 2021 includes implementing the suggestions outlined in the organization’s diversity and inclusion report from 2020. The committee also supports the planning of D&I events as appropriate.
    • Mentorship - The Mentorship Program will be launched in 2021 with a goal of creating meaningful mentorship relationships between junior, intermediate and senior Can-Tech members to help junior members achieve success in their career. This program will be supported by both WIT and D&I. 
    • Spring Conference Planning Committee - This is an annually constituted committee which is currently actively planning our spring series conference which will occur in late May/June. 
    • Fall Conference Planning Committee - This is an annually constituted committee which is responsible for planning our fall conference. Volunteers are currently welcomed.
    • Privacy, Artificial Intelligence, Digital Identity, Cybersecurity, FinTech - These are ad hoc committees whose goals will be to discuss and comment on developments in the applicable fields and organize round-tables for members on key topics as appropriate. 

    *Co-chair position(s) open; please indicate to us if interested.

    While you can volunteer at any time, we invite you to volunteer by Thursday, April 22, 2021.

    Committee participation is open to all CAN-TECH LAW members.

    If you have any questions or would like to participate, kindly send a note to mohammad.ali@cantechlaw.ca

  • 12 Jan 2021 2:52 PM

    The ADR Institute of Canada (ADRIC) has adopted Med-Arb Rules and announced a new Chartered Mediator-Arbitrator (C.Med-Arb) professional designation in recognition that med-arb is a distinct process that is different from either mediation or arbitration on their own.

    Med-arb is widely used in areas such as employment and family disputes and is now gaining acceptance in commercial disputes. It promises both flexibility and finality, saving time and money by having a single mediator-arbitrator conduct the entire process. This makes it an attractive option for resolving many technology disputes.

    But there are traps for the unwary. There must be clear procedures to ensure fairness and an enforceable agreement or award at the end of the day. The Med-Arb Rules, which incorporate ADRIC’s existing Mediation Rules and Arbitration Rules, provide a complete procedure for both the mediation and arbitration phases of the med-arb process.

    The Rules can be incorporated in a contract or stand-alone dispute resolution agreement. They can be modified by agreement of the parties, providing a high degree of flexibility.

    The Rules require that the mediator-arbitrator remain independent and impartial at all times. There must be full initial and ongoing disclosure of any potential conflicts. But the Rules also make it clear that merely acting as a mediator, meeting separately with parties or questioning the merits of a party’s position during the mediation phase, will not amount to procedural unfairness.

    Those seeking the Chartered Mediator-Arbitrator designation must have training and practical experience to avoid the potential traps that can lead to unfairness and bias claims. The goal is to ensure that there is a valid and enforceable agreement or award at the end of the med-arb.

    The transition from the mediation phase – when everyone is at least trying to get along and come up with a settlement – to the arbitration phase – when everyone suits up to fight over the remaining issues – is the most difficult part of any med-arb.

    The Rules deal with these crucial transition issues:

    • The mediation phase ends when an agreed time limit expires, the parties have settled all issues in dispute, the parties agree in writing, or the mediator decides to end it.
    • When the mediation phase ends, the parties must confirm which issues have been resolved (to be documented in a settlement or consent award).
    • The parties must also identify the unresolved issues to go to arbitration. If they can’t agree, the mediator-arbitrator will identify those issues.
    • At the beginning of the arbitration phase, the mediator-arbitrator will decide any challenge arising from the mediation before continuing with the arbitration. Any party that does not object is deemed to have waived any such challenge.
    • Any other objection to the mediator-arbitrator, such as impartiality or qualifications, must be resolved under the ADRIC Arbitration Rules.
    • During the arbitration phase the mediator-arbitrator must not use information from the mediation phase unless it becomes evidence in the arbitration or the parties consent to its use.

    These points expressly address many common concerns about the med-arb process.

    With technology or project disputes, when time is critical, there may be a temptation to move quickly to a final settlement or award, but the Rules recognize that there are dangers in moving too fast.

    There must be a clear, bright-line transition from mediation to arbitration.

    The parties and the mediator-arbitrator must document the issues that have been resolved and those that have not. This may be tricky in some cases. For example, agreement on one issue may be dependent on resolution of another.

    If a party has an objection to the mediator continuing as arbitrator for any reason, they must raise it right away. They can’t wait to see how things go and object later, if the award goes against them.

    The rule against using information from the mediation unless it becomes evidence in the arbitration puts a responsibility firmly on the arbitrator, and on each of the parties, to be very clear about what information is in evidence and what is not.

    As noted in McClintock v. Karam, 2015 ONSC 1024 (CanLII), an often-cited family med-arb case, the mediator-arbitrator “cannot be expected [to] entirely cleanse the mind of everything learned during the mediation phase, and of every tentative conclusion considered, or even reached, during the mediation phase. However, at a bare minimum the parties are entitled to expect that the mediator/arbitrator will be open to persuasion, and will not have reached firm views or conclusions.”

    The ADRIC Med-Arb Rules and evolving best practices should help meet that goal.

    See Michael Erdle’s Slaw.ca column for more commentary on med-arb and the ADRIC Med-Arb Rules.

  

Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

contact@cantechlaw.ca

Copyright © 2025 The Canadian Technology Law Association, All rights reserved.