News

  • 7 Feb 2020 3:37 PM | Deleted user

    Manitoba Law Reform Commission releases discussion paper examining electronic witnessing for the taking of affidavits and other documents

    The Manitoba Law Reform Commission has released a discussion paper entitled Bridging the Gap for Remote Communities: Electronic Witnessing of Affidavit Evidence. The Commission’s project originated in 2017, when members of the Manitoba bar brought to its attention the fact that people living in the province’s more remote communities often have trouble accessing a person who is qualified to take the swearing or affirmation of affidavits. While the Manitoba Evidence Act provides for a fairly broad variety of “authorized individuals” who may take affidavits, the preliminary evidence gathered by the Commission indicates that people living in some communities nonetheless are not able to find an “authorized individual” locally. Such people must incur the time and expense of travelling to a more populated urban area, which both creates delays in all manner of legal proceedings and transactions and presents a potentially significant access-to-justice issue.

    The main issue is that the relevant statutory language has been interpreted in Manitoba, as in most provinces and other common law jurisdictions, to require that affidavits be sworn in the presence of the “authorized individual,” and thus the use of electronic means is not permitted. In the only relevant Canadian case to date, First Canadian Title Co v. Law Society of British Columbia, the court decided against a motion to allow the execution of a land title instrument via video link. As the Commission’s discussion paper explains:

    …the court acknowledged concerns raised by the Ethics Committee of the Law Society of British Columbia about an overly broad interpretation of the presence requirement allowing for witnessing of documents remotely such as, how to ensure the affiant understands the content of the affidavit, ensuring the signature is genuine, proper identification of the affiant and concerns about changes to the document between the signature of the affiant and of the witness.

    In a survey of the relevant law and practice in various Canadian and foreign jurisdictions, the Commission took note of Bill 161, currently before the Ontario legislature, which does two things: 1) makes the requirement of physical presence before the authorized individual explicit; but 2) creates the potential for exceptions to this requirement, to be added in regulations after the bill is passed. Remarks by the Attorney General of Ontario indicate that the government’s intention is to authorize electronic commissioning and notarizing. Further, the Law Society of Alberta has created a pilot project called the “TreeFort Platform” which would allow for secure online meetings between lawyers and clients at which documents could be executed and certified. Also, the US state of Virginia has brought in legislation allowing for electronic notarization via video- and tele-conferencing and digital signature technology.

    The Commission has identified a number of issues for discussion, on which it is seeking feedback by February 20, 2020, including:

    1. Should The Manitoba Evidence Act be amended to remove the physical presence requirement in certain circumstances as is proposed by Ontario’s Bill 161?
    2. If the Act is amended to enable electronic notarizing or commissioning of affidavit evidence, should standards be set to regulate those providing such services similar to the “standards of notarization” enacted in Virginia?
    3. What safety measures should be required to ensure the privacy and security of documents being witnessed electronically and to respond to the concerns in First Canadian Title Company Ltd. such as the integrity of the document and ability to verify the signatory’s identity?
    4. What criteria should be used to determine what software is allowable? Should such criteria be established by regulation?
    5. What other issues relating to privacy and security may arise?
    6. If the Manitoba Evidence Act should be amended to allow for affidavits to be taken using video-conferencing technology, should it be restricted to certain populations or certain situations?
    7. Should a person witnessing the signing of an affidavit using video-conferencing technology be required to be physically present in Manitoba?
    8. Should the affiant be required to be physically present in Manitoba when appearing before a witness using video-conferencing technology?
  • 23 Jan 2020 3:46 PM | Deleted user

    Pseudonymous posters given adequate notice of the claim via email and via website messages

    The Ontario Superior Court of Justice has granted default judgment in a defamation case against a number of unnamed, pseudonymous authors of internet postings. In Theralase Technologies Inc. v. Lanter, the plaintiffs were a pharmaceutical company and two of its senior employees. They alleged that a number of postings made on an online discussion website, Stockhouse.com, were defamatory of them. The judge summarized the postings:

    [31] Generalizing for introductory purposes, the postings assert that Theralase management are untruthful and unprofessional, the corporation is operating unlawfully and improperly from the investors’ perspectives, and the personal plaintiffs are unprofessional, incompetent managers who have committed criminal acts. Ms. Hachey is also the subject of at least one misogynistic post that is particularly disgusting.

    The postings were made by ten different accounts. Prior to commencing the action, the plaintiffs were able to obtain an order requiring Stockhouse to provide information about the individuals behind the pseudonyms. Stockhouse was able to provide email addresses for all but one of them, but said technical problems prevented it from providing further information. The plaintiffs then sent libel notices and requests for identification to each of the email addresses and obtained an order permitting service of the plaintiffs’ claim by email and private message on the Stockhouse platform. A number of the email addresses generated error messages, suggesting the accounts were no longer in operation, and one of the defendants did respond and was identified. The plaintiffs then brought a motion for default judgment against the still unidentified defendants.

    The court was then required to consider whether it could grant default judgment against a currently unidentified person. The caselaw on the point is scant; only one Ontario precedent could be found:

    [13] In Manson v John Doe, 2013 ONSC 628, 114 OR (3d) 592, the defendant was an anonymous blogger on a website owned by Google. Google advised the plaintiff that it had sent the plaintiff’s motion seeking the identity of the defendant to the defendant by email and that the defendant had responded indicating that he was seeking legal counsel. Ultimately the plaintiff was provided with the defendant’s email address although it could not determine his name.

    [14] Goldstein J. wrote:

    [20] There are few things more cowardly and insidious than an anonymous blogger who posts spiteful and defamatory comments about a reputable member of the public and then hides behind the electronic curtain provided by the Internet. The Defendant confuses freedom of speech with freedom of defamation. There are, undoubtedly, legitimate anonymous Internet posts: persons critical of autocratic or repressive regimes, for example, or legitimate whistleblowers. The Defendant is not one of those people. The law will afford his posts all the protection that they deserve, which is to say none.

    [15] In the result, Goldstein J. granted judgment against the defendant who was identified only by a user name or pseudonym. There is no discussion in the case report as to whether the lack of the defendant’s actual name was considered to be an impediment to the court’s jurisdiction.

    The court then reviewed caselaw from the United Kingdom, another jurisdiction in which judgments generally operate in personam. The principal authority for permitting a default judgment against an unnamed defendant was found in Cameron v. Liverpool Victoria Insurance Co Ltd., [2019] UKSC 6, [2019] RTR 15, a UK Supreme Court decision which held that such an order can issue in certain cases:

    [22] In Cameron, the Supreme Court found that it is not enough to refer to a defendant by reference to a past act, such as a hit-and-run accident, because the prior act provided no basis to identify the particular person who is the defendant. However, the court concluded that where a form of service is utilized that can reasonably be expected to bring the proceedings to the attention of the defendant, there was no reason in principle to limit the court’s ability to grant judgment against the unidentified defendant.

    [23] I agree with the reasoning in Cameron and adopt the Supreme Court’s framework. Provided that the form of service utilized can reasonably be expected to bring the proceedings to the attention of a specific, identifiable defendant, the court has jurisdiction over that person however he or she may be identified. The test of reasonableness will be influenced by the circumstances of the case. Where, for example, people are hiding behind internet anonymity to make allegedly defamatory comments on a website, service through the website using the coordinates and the identifiers that the users themselves provided to the website operator strikes me as both reasonable and just. If notice does not reach the users, it is because they choose not to access the accounts from which they made their comments or the email addresses that they provided to the website operator. Where there is evidence that a person is actively evading service, such as by shutting down a previously active email address or website account after learning that an action exists, correspondingly less certainty of service may be required as long as it remains conceptually possible. See also: Cameron at para. 25.

    The court reviewed the manner in which the plaintiffs’ claim was communicated to the defendants, both through email and through the messaging function of Stockhouse.com. It concluded that this communication was likely to bring the claim to the attention of the defendants, who nonetheless failed to respond or file a defence.

    The judge acknowledged that there would likely be significant challenges in enforcing the default judgment, but held that this did not affect the determination of whether default judgment could be obtained. In the result, the court entered default judgment against the unnamed defendants, assessed damages against each of them and issued an order for costs on a substantial indemnity basis.

  • 23 Jan 2020 3:42 PM | Deleted user

    Hard-to-find terms of use unenforceable and arbitration clause was of no effect

    A US appeals court has dismissed an appeal concerning the browsewrap terms of use agreement in a mobile gambling app. In Wilson v. Huuuge, Inc., the app developer was appealing a decision of a district court that refused to enforce an arbitration clause in the terms of use for the app.

    The plaintiff brought the suit as an intended class action, alleging that the app and its developer violated Washington state gambling and consumer protection laws. The defendant brought a motion seeking to stay the action and to compel the plaintiff to arbitrate any dispute. The app's terms of use contained an arbitration clause, but the district court found that these terms (or the terms of use generally) were not brought to the user's attention either in fact or constructively. The developer did not require users to affirmatively acknowledge or agree to the terms of use before downloading, installing or using the app.

    The United States Court of Appeals for the Ninth Circuit colourfully described the positioning of the terms of use:

    Once a user has downloaded the app, the user can play games immediately. During gameplay, a user can view the Terms by accessing the settings menu. The settings menu can be accessed by clicking on a three dot “kebob” menu button in the upper right-hand corner of the home page (Figure D).

    If a user clicks on the button, a pop-up menu of seven options appears (Figure E). The fifth option is titled “Terms & Policy” and reveals the Terms, including the arbitration agreement.

    To enforce an arbitration agreement under US federal law, the person asserting the agreement must prove the existence of a valid agreement under ordinary contract law principles. The court wrote:

    As we have acknowledged many times, although online commerce has presented courts with new challenges, traditional principles of contract still apply. A contract is formed when mutual assent exists, which generally consists of offer and acceptance. Like many states, Washington does not allow parties to shirk contract obligations if they had actual or constructive notice of the provisions. In the context of online agreements, the existence of mutual assent turns on whether the consumer had reasonable notice of the terms of service agreement. [references omitted]

    The court found that it was a "browsewrap" agreement and amusingly described the adventure that may be required to encounter the terms of use in the app:

    … When downloading the app, the Terms are not just submerged—they are buried twenty thousand leagues under the sea. Nowhere in the opening profile page is there a reference to the Terms. To find a reference, a user would need to click on an ambiguous button to see the app’s full profile page and scroll through multiple screen-lengths of similar-looking paragraphs. Once the user unearths the paragraph referencing the Terms, the page does not even inform the user that he will be bound by those terms. There is no box for the user to click to assent to the Terms. Instead, the user is urged to read the Terms—a plea undercut by Huuuge’s failure to hyperlink the Terms. This is the equivalent to admonishing a child to “please eat your peas” only to then hide the peas. A reasonably prudent user cannot be expected to scrutinize the app’s profile page with a fine-tooth comb for the Terms.

    Accessing the terms during gameplay is similarly a hide-the-ball exercise. A user can view the Terms through the “Terms & Policy” tab of the settings menu. Again, the user is required to take multiple steps. He must first find and click on the three white dots representing the settings menu, tucked away in the corner and obscured amongst the brightly colored casino games. The “Terms & Policy” tab within the settings is buried among many other links, like FAQs, notifications, and sound and volume. The tab is not bolded, highlighted, or otherwise set apart.

    Huuuge argues Wilson’s repeated use of the app places him on constructive notice since it was likely he would stumble upon the Terms during that time period. However, just as “there is no reason to assume that [users] will scroll down to subsequent screens simply because screens are there,” there is no reason to assume the users will click on the settings menu simply because it exists. The user can play the game unencumbered by any of the settings. Nothing points the user to the settings tab and nowhere does the user encounter a click box or other notification before proceeding. Only curiosity or dumb luck might bring a user to discover the Terms.

    At the end of the day, Huuuge took a risk and lost:

    Instead of requiring a user to affirmatively assent, Huuuge chose to gamble on whether its users would have notice of its Terms. The odds are not in its favor. Wilson did not have constructive notice of the Terms, and thus is not bound by Huuuge’s arbitration clause in the Terms. We affirm the district court’s denial of Huuuge’s motion to compel arbitration.

    The appeal was dismissed.

  • 23 Jan 2020 3:37 PM | Deleted user

    A privacy violation can be “highly offensive” and actionable even if it is fleeting and causes no harm

    The dangers inherent in electronic medical records were made apparent in Stewart v. Demme, on the one hand an application for certification of a class action, and on the other an application for summary judgment dismissing the claim. The two defendants in the case were Demme, a nurse, and the hospital at which she had formerly been employed. Over a period of ten years, she stole 23,932 Percocet pills. The method by which she did so was at the heart of the certification issue.

    In order to acquire the drugs, Demme accessed the individual health records of over 11,000 patients of the hospital. In some cases, she was able to make use of the patient’s paper file, but in many others she used the Meditech database, which digitally accessed patients’ records and displayed them on a screen. In either case, she used the information to access the hospital’s Automated Dispensing Unit (“ADU”) and have a Percocet pill dispensed. Demme testified that in the early years of her thefts she would look to see if the patient was pre-prescribed Percocet and, if not, would move on to another patient. Eventually, however, she began to click on random patients whose names appeared on the ADU screen list as a method of dispensing the pill.

    Once these thefts were discovered, the Hospital sent a letter to every patient whose file or digital record was accessed by Demme to provide herself with Percocet, leading to the proposed class action lawsuit. The class sought to be certified to bring actions for intrusion upon seclusion and for negligence, while the defendants resisted both claims. In the end the class was certified to pursue the intrusion upon seclusion claim, but the application judge concluded that a negligence action could not succeed and granted summary judgment in that regard.

    In each patient’s file, Demme accessed their information for less than a minute from the same ADU machine (as recorded by the ADU logs). “In effect, Ms. Demme scrolled down the patient list, stopped at any given patient’s name, and clicked on the box designated for the medication that she desired.” Her only motivation for improperly accessing any patient’s records, whether a paper file or a digital one through the ADU, was to obtain drugs (para 16): that is, although she might incidentally see private medical information, that was not her goal, nor indeed was it likely to occur, given that she would not want to keep a record open very long. In addition, there was no evidence that any patient’s medication was impacted by Demme’s use of their health records in this way. The purpose of ADU recording was to track the medicine stocks at the Hospital, and these records were not associated with any particular patient; when medication is dispensed through the ADU, it is not automatically recorded in the patient’s medical file. Further, the fact that the ADU had dispensed medication did not mean that it would be administered, so there was no evidence of any patient receiving Percocet who ought not to have. By the same token, there was no clear evidence that any patient had ever failed to receive Percocet when they ought to have.

    It was largely for those reasons that the negligence action was dismissed: no damage could be shown, other than the purely symbolic harm of the privacy breach, which was not sufficient. However, the application judge did certify the class for a claim based on intrusion upon seclusion. That tort requires in part that there be intentional or reckless conduct by the defendant and that the defendant invaded, without lawful justification, the plaintiff’s private affairs or concerns. Those requirements were clearly met by Demme’s misconduct.

    However, in Jones v Tsige the Ontario Court of Appeal determined that “one who intentionally intrudes, physically or otherwise, upon the seclusion of another or his private affairs or concerns, is subject to liability to the other for invasion of his privacy, if the invasion would be highly offensive to a reasonable person”. Both defendants argued that the violation of the class members’ health records was de minimis and not highly offensive, and so did not rise to the required level: as counsel for the hospital put it, there was “a very large narcotics theft but a very small privacy invasion” (para 57).

    The application judge acknowledged that Demme’s access to any individual file was fleeting, but held that that point should not be overemphasized: “interference with freedom of movement, just like invasion of privacy, must not be trivialized” (para 67). The nature and quality of the information at issue was also relevant: “other hospital procedures – surgery, chemotherapy, psychopharmacological treatments, etc. – are bound to be rather less shared by patients with the world at large. The Hospital is a uniquely private and confidential institution” (para 66).

    The judge did note that “While any intrusion – even a very small one – into a realm as protected as private health information may be considered highly offensive and therefore actionable, the facts do not exactly ‘cry out for a remedy’” (para 72). Nonetheless,

    [79]…the Jones reasoning supports the proposition that an infringement of privacy can be “highly offensive” without being otherwise harmful in the sense of leading to substantial damages. The offensiveness is based on the nature of the privacy interest infringed, and not on the magnitude of the infringement.

    Accordingly the class was certified.

  • 23 Jan 2020 3:36 PM | Deleted user

    Internal flight not feasible where privacy can be readily breached

    The state of technology, and its implications for being located against one’s will, were at issue in X (Re), a decision of the Refugee Appeal Division of the Immigration and Refugee Board of Canada. The case was an appeal from the Refugee Protection Division (“RPD”), which determined that the applicant was neither a “convention refugee” nor a “person in need of protection.” The Appellant was from a district in Punjab, India and had been a supporter and member of the SAD (Amritsar) political party since 2009. His father was also a long-time supporter of the party. Because of that political activity the applicant had been physically attacked by police, and the Appeal Division concluded that Congress party members could be a threat to him in Punjab, and especially in his home district. One issue in the decision, however, was whether he could be safe elsewhere in India: that is, whether he had an Internal Flight Alternative (“IFA”). The test for an IFA is that 1) there must be no serious possibility of the Appellant being persecuted in the part of the country identified as an IFA; and 2) the conditions in that part of the country must be such that it would not be objectively unreasonable in all the circumstances, including those particular to the Appellant, for him to seek refuge there (para 42).

    On the first prong, the appellant essentially argued that the high degree of information technology advancements in India, combined with deficient privacy and personal information protection, meant that third parties would be able to locate him. The Appellant referred to the Aadhaar number and card, a twelve-digit unique identity number assigned to residents of India based on their biometric and demographic data. The Appellant provided evidence that the card was increasingly being required for services, that the data was being misused, and that the cardholder’s personal information was not kept private or protected. As a result,

    [60]…just about anyone could access this personal information through corrupt means and for a small amount of money, and the Appellant would be located because he will need to use his Aadhaar card wherever he relocates to.

    In addition:

    [62] The Appellant also submits that a person can be located in India even more simply through social media such as Facebook or electronic surveillance, without having to go through the police. He further argues that the state has its own system of electronic surveillance, called the CMS and which is described at Tab 10.6 of the latest NDP. Therefore the state could locate the Appellant should they want to by intercepting his electronic communications. Tab 10.6 is also cited to refer to the Crime and Criminal Tracking Network and Systems (CCTNS) which is used as a network of information for the police, and the fact that the tenant verification system has been made even easier by providing online forms and applications for free for landlords who are required to register their tenants.

    [63] In essence, the Appellant is saying that through the high degree of information technology advancements in India, coupled with deficient privacy and personal information protection, he could be located by third parties, not necessarily the police, making an IFA impossible.

    As a result, the Refugee Appeal Division determined that an IFA was not available to the Appellant, because there remained a serious possibility of persecution if he relocated to Delhi or Mumbai, or elsewhere in India. He was therefore found to be a Convention refugee.

  • 9 Jan 2020 3:29 PM | Deleted user

    Customer disputes interest the public

    With its decision in Raymond J. Pilon Enterprises Ltd. v. Village Media Inc., the Ontario Court of Appeal decided a point concerning the “Prevention of Proceedings that Limit Freedom of Expression on Matters of Public Interest (Gag Proceedings)” portion of the Ontario Courts of Justice Act. Those particular rules – otherwise known as the Anti-SLAPP provisions – state their purpose as:

    (a) to encourage individuals to express themselves on matters of public interest;

    (b) to promote broad participation in debates on matters of public interest;

    (c) to discourage the use of litigation as a means of unduly limiting expression on matters of public interest; and

    (d) to reduce the risk that participation by the public in debates on matters of public interest will be hampered by fear of legal action.

    The respondents had made a Facebook post about their experience with a Canadian Tire store operated by the appellant. The respondents had successfully brought a motion to have the appellant’s legal action against them over the post dismissed, but the appellant argued on appeal that the Anti-SLAPP provisions did not apply: the issue was, they argued, simply a private dispute between a customer and a store, which was not a “matter of public interest”. The motions judge had disagreed, holding that it related “to the issues of customer service and shopping experience at a major retail store”, raised “the question of the appropriateness of a store manager involving the police in such a matter”, and was “cautioning potential customers of the Canadian Tire in Timmins about the treatment they may receive at that store” (para 4). The Ontario Court of Appeal found no error in these conclusions, and therefore upheld the decision, rejecting the appeal.

  • 9 Jan 2020 3:23 PM | Deleted user

    Greater rather than less security provided by electronic cards

    On December 24, 2019, for the first time, the Ontario Labour Relations Board in Toronto and York Region Labour Council, permitted the use of electronic membership evidence for a representation vote. The displacement application for certification was brought by United Steel, Paper and Forestry, Rubber, Manufacturing, Energy, Allied Industrial and Service Workers International Union (United Steelworkers) under the Labour Relations Act, 1995, S.O. 1995, c.1, as amended (the "Act"). Before the votes could be counted, the Board had required submissions as to whether the proof of membership requirement – normally satisfied by the presentation of physical membership cards – could in this instance be satisfied by electronic evidence of membership instead. Given the method used to obtain that evidence in this case, the Board concluded that the evidence could be used.

    The Board noted that its Rules of Procedure were silent on this particular issue, merely requiring that proof of membership be submitted without specifying its form. The Rules provided that “‘membership evidence’ includes written and signed evidence that an employee is a member of a trade union or has applied to become a member”, but that did not preclude the possibility of such evidence being electronic.

    The Board relied heavily on the security features of the electronic membership evidence in this case as part of its reason for allowing it. These features, the decision notes at para 13, included that:

    1. The United Steelworkers’ (“USW”) electronic membership cards were created using Adobe Sign software. The electronic cards are identical to the USW’s physical membership cards and contain the same fields to be completed by an applicant for membership.
    2. The USW’s Organizing Coordinator, Darlene Jalbert (the “Organizer”), provided each applicant for membership with a hyperlink to a blank membership card.
    3. The applicant for membership opened the hyperlink, sent to them by the Organizer, which directed them to the blank membership card webpage.
    4. The applicant for membership filled in the mandatory fields (i.e. company name, date, email address, and signature).
    5. The applicant for membership signed the electronic membership card using the Adobe “draw” function using either a mouse on non-touch screen devices or their finger or stylus on touch-screen devices.
    6. Once the mandatory fields were filled in and the electronic membership card had been signed, the applicant for membership received an automatically generated email with a request to confirm his or her identity. The applicant for membership verified his or her identity by clicking on the hyperlink contained therein.
    7. After the applicant for membership’s identity was verified, the Organizer received an automatically generated email with the signed electronic membership card. The email contained a hyperlink for the Organizer to counter-sign the electronic membership card. The Organizer counter-signed the electronic membership cards using the same process described in paragraph 5 above.
    8. Once the electronic membership card was signed by the Organizer, both the Organizer and the applicant for membership received an email with the fully completed and signed electronic membership card.
    9. Signed electronic membership cards are encrypted and cannot be modified. The Adobe Sign system generates a unique transaction ID for each electronic membership card that provides for a digital certification of authenticity. This certificate of authenticity can be viewed by opening a copy of the signed PDF in Adobe Reader or Adobe Acrobat.

    The Board noted that this arguably provided stronger protection than physical cards: the electronic cards contain the same information (e.g. name of individual, employer name, date and contact details), but unlike a paper membership card they are encrypted and cannot be modified, and come with a certificate of authenticity and an “audit trail”.

    The Board also noted that the use of electronic membership evidence was not opposed in this case, and that it might reach a different decision in a future case where there was such opposition. However, it concluded by observing that:

    21 The acceptance of electronic membership evidence should come as no surprise to the labour relations community as this Board continues to take steps that embrace technology in furtherance of the purposes of the Act… While each technological advancement carries its own risks, it has been the Board’s experience that the enhanced accessibility and efficiencies outweigh these risks.

  • 9 Jan 2020 3:13 PM | Deleted user

    Federal Privacy Commissioner upholds traveller complaints about searches of electronic devices

    In his most recent annual report to Parliament, federal Privacy Commissioner Daniel Therrien detailed an investigation by his office into six different complaints by travellers regarding searches of their electronic devices and data by Border Services Officers (BSOs) employed by the Canada Border Services Agency (CBSA). The investigation report indicated that all six complaints were “well-founded” under the federal Privacy Act, finding breaches of sections 4 and 6(1) of that Act, which limit the collection and retention of individuals’ personal information by federal government institutions. The report makes a number of recommendations for improvement of policies and their application by the CBSA, as well as for legislative change, and indicates a somewhat recalcitrant attitude on the part of the CBSA towards limitations on its entitlement to search and retain data from devices.

    CBSA takes the position that its power to inspect and search “goods” which are “imported” into Canada under s. 99(1)(a) of the Customs Act extends to electronic documents stored on devices. The Commissioner noted CBSA’s position that this does not include electronic documents which might be accessible from the device through an internet connection, but not stored on the device itself—a position with which the Commissioner agreed. In 2015 the CBSA created its Examination of Digital Devices and Media at the Port of Entry – Guidelines, which require inter alia that the connectivity of devices be disabled before they are searched (usually by switching to airplane mode). The Commissioner also agrees with this limitation, which the report states is “essential” in order for CBSA to be able to comply with both the Customs Act and the Privacy Act.

    Here, the investigation into the complaints revealed a number of instances where this part of the Guidelines, in particular, was not complied with. In one case the BSO not only failed to disable the connectivity of the traveller’s phone but used it to access their social media and banking information. In three other cases searches were done without turning on airplane mode, and in the remaining two there were no notes indicating whether connectivity had been disabled. Other problems included: a BSO photographing documents on a phone for purposes other than enforcing the Customs Act (which would amount to a breach of s. 8 of the Charter); records relating to searches being destroyed, rather than being retained for two years as is required; and, in several cases, a failure to make any notes regarding the circumstances or indicators that led the BSO to conduct a search (as required by additional guidelines put in place by the CBSA in 2017), rendering the CBSA unable to legally justify the search after the fact. There were also two instances in which BSOs, rather than following the Guidelines by asking the travellers to input their own passwords to unlock the devices, asked for the passwords and wrote them down themselves; however, the Commissioner noted that this is currently a legally grey area and declined to make any findings about it.

    The Commissioner’s ultimate findings reflected concern about what appeared to be poor training and inconsistent application of the Guidelines by BSOs. The report outlines a number of recommendations with which the CBSA has agreed:

    1. mandatory training for BSOs and supervisors, with documentation of participation, in order to ensure consistent compliance with the Guidelines;

    2. the creation of oversight and review mechanisms to ensure compliance with policies and practices relating to devices;

    3. an independent audit with respect to the overall operational framework and its compliance with the Privacy Act (though CBSA would only agree to an internal audit, the Commissioner found that this would accomplish the purpose sought);

    4. updating the CBSA’s Enforcement Manual to include comprehensive treatment of the requirements around device searches;

    5. transparency and accountability by way of information regarding policies and practices around searching devices to be published on the CBSA website;

    6. tracking, compiling and reporting statistics on device searches through all of CBSA’s operations.

    The Commissioner also made a number of recommendations for legislative change, but noted explicitly that he was “surprised and disappointed” that CBSA disagreed with all of them—particularly in that they were entirely consistent with proposals that were made by a Parliamentary committee in 2017. These were:

    7. that the definition of “goods” in the Customs Act be updated to reflect the unique nature of electronic devices, distinguishing them from other goods or receptacles which do not hold massive amounts of individual private data;

    8. that the Customs Act be amended to provide “a clear legal framework for the examination of digital devices, and specific rules that impose a higher threshold for the examination of such devices, in line with the requirements of the CBSA’s Policy.” To this the CBSA responded that due to the pace of technological change, fluidity was necessary and simply using policies was sufficient to keep up and ensure Charter compliance. The Commissioner responded that the very cases under review suggested that policies were insufficient.

    9. that the current threshold for engaging in a search set out in CBSA policy, “multiplicity of indicators”, be replaced with a “reasonable grounds to suspect” standard, which was explicitly recommended by the Parliamentary committee and which has been argued to be the only way to ensure Charter compliance of searches. This has long been a battleground between the CBSA and privacy advocates of various stripes.

    On this latter point the CBSA had responded:

    It is in the very nature of the border environment that there is a lack of prior knowledge or control over goods before they reach the border. With no prior knowledge or information, it can be impossible to formulate reasonable suspicion in relation to goods….

    …more and more documentation is going digital and it is as necessary for CBSA officers to view such documentation as it is for those officers to be able to search the traveller’s baggage. Imposing an inspection threshold only on digital documents makes it more likely that travellers seeking to circumvent Canadian prohibitions on imported goods or evade duties and taxes or, indeed, conceal their true identities will be able to do so.

    The Commissioner countered with skepticism:

    We note that the CBSA’s rationale for disagreeing with our Recommendations 7 and 9 draws comparisons between digital documents and traditional paper documents, suggesting that the changes to legislation would create a higher threshold for the inspection of digital documents. However, CBSA itself has recognized that a higher threshold is appropriate for digital devices given the wealth of information that can be stored on them. Simply put, searching a digital device is not the same as, for instance, consulting a paper receipt and this is already reflected in CBSA policy.

    We also fail to see how our recommendations would prevent the CBSA from viewing travel documents stored electronically as part of its normal operations. The requirement to produce a travel document – in electronic form or otherwise – for inspection does not entail handing over a device and all its contents to be searched and would not be affected by our recommendations. In our view, CBSA is wrongly asserting operational barriers to legislative reform; barriers that are belied by the fact that its Policy already distinguishes and applies a higher threshold for digital devices.

    The Commissioner concluded that he would be recommending the proposed legislative changes to the Minister of Public Safety, and the Minister of Border Security and Organized Crime Reduction, “in order to adequately protect the privacy and Charter rights of Canadians as they return home from travels abroad.”

  • 19 Dec 2019 3:54 PM | Deleted user

    Private communications producing a record not “records”

    In two recent decisions, the Ontario Superior Court of Justice has been required to consider whether communications sent as Facebook messages constituted “records” within the meaning of section 278.92 of the Criminal Code. The Criminal Code has for some years contained a statutory scheme which requires an accused to bring an application in order to obtain third party records relating to a complainant in a sexual assault prosecution. Those provisions did not apply to records which were already in the hands of the accused, however, and did not deal with the ultimate admissibility of such records at trial. In December of 2018, the provisions were amended so that they now do apply to records already in the hands of the accused. In addition, section 278.92 was added to the Code, requiring an accused to bring an application to the trial judge before being allowed to use the records in any way at trial. The result of these changes is that, when the new provision applies, an accused is effectively required to disclose a part of the defence strategy, and cannot impeach a complainant’s testimony through the use of evidence the complainant had not expected. (This change in the law was a response to the acquittal in the high-profile R v Ghomeshi case, where exactly that had happened.)

    The first recent case where that issue arose was R v WM. The accused was charged with several offences, including sexual assault with a weapon, sexual assault causing bodily harm, and assault causing bodily harm, alleged to have occurred on March 25, 2017 between the accused Mr. M and the complainant Ms. M-A. The accused had a number of Facebook messages, sent to him by the complainant during the relevant time period, which he wished to use at trial. The complainant had deleted the messages, and so neither she nor the Crown had copies of them. The accused preferred not to bring a section 278.92 application if he was not required to, since doing so would disclose those messages to the Crown, and so he brought a motion for directions as to whether the section applied or not. That question turned on whether the Facebook messages were “records” within the meaning of the section, and that issue in turn depended on whether the complainant had a reasonable expectation of privacy over the content of the messages.

    The trial judge concluded that the complainant did not have a reasonable expectation of privacy, and therefore that the accused was not required to bring a section 278.92 application in order to use the Facebook messages.

    The Superior Court judge noted a number of things about the “reasonable expectation of privacy” in this context. It is not the same issue as under section 8 of the Charter, which concerns unreasonable search and seizure, because the issue is not the state obtaining information, but a private citizen using information he already has to defend himself against criminal charges. Further, the judge noted that there was a temporal aspect to the question of reasonable expectation of privacy. A “record” is defined as anything containing information over which the complainant has a reasonable expectation of privacy, not information over which the complainant at some point in the past had a reasonable expectation of privacy. Equally, though, the fact that the accused was already in possession of the messages was not determinative in extinguishing the complainant’s expectation of privacy over their content, because privacy is not an all-or-nothing concept.

    The judge considered several cases, including the Supreme Court decisions in R v Reeves and R v Jarvis. The Court also referred to R v Mills, where the Supreme Court held that the accused did not have a reasonable expectation of privacy over sexually explicit Facebook messages he thought he was sending to a 14-year-old girl. The judge noted that the Court in Mills held that “on the normative standard of privacy described by this court, adults cannot reasonably expect privacy online with children they do not know. That the communication occurs online does not add a layer of privacy, but rather a layer of unpredictability” (para 36). It might have been worth noting as well that three of the seven judges in Mills also held the view that “an individual cannot reasonably expect their words to be kept private from the person with whom they are communicating” (para 42 of Mills).

    The judge relied primarily on four factors to conclude that the complainant did not have, at the relevant time, a reasonable expectation of privacy: the content of the messages, the manner in which they were sent, the nature of the relationship, and the policy implications. Of most interest are the judge’s comments on the second factor, the manner in which the messages were sent:

    [44] In Marakah, the Supreme Court held that the sender of an electronic communication has a reasonable expectation that the police, or the state, will not seize that communication from the recipient. The issue here is not whether the state is entitled to seize Ms. M.-A.’s Facebook messages from W.M. (or W.M.’s electronic communications from Ms. M.-A.). The issue is whether Ms. M.-A. has a reasonable expectation that W.M., as the intended recipient of the messages, will keep them private.

    [45] The fact that W.M. was the intended recipient of Facebook messages is a significant factor in deciding whether Ms. M.-A. can reasonably expect that they will be kept private and will not be used by the intended recipient. To the extent that the messages contain personal information about Ms. M.-A., she chose to share that information with W.M. She also chose to do so in writing, knowing that she was creating an electronic record that W.M. could save and share with others.

    [47] I recognize that this factor imports a risk analysis into the decision of whether Ms. M.-A. has a reasonable expectation of privacy over information she shared with W.M. As the courts have repeatedly said, risk of further dissemination is not determinative. It is nonetheless relevant that Ms. M.-A. chose to give W.M. the information he now wishes to use and she did so in a manner that she knew would create a permanent record that he could save. The kind of risk at issue on the facts of this case is quite different from the risk at issue in Duarte or Marakah, namely that the state might intercept or make a permanent record of the communication.

    The same issue about whether an electronic communication was a “record” arose in R v Mai, though in that case relating to messages sent via WhatsApp. The trial judge in Mai equally reached the conclusion that the communications in question were not captured by the statutory scheme, though noting that this would always be determined on a case-by-case basis.

    One additional factor which arose in Mai, with regard to some of the WhatsApp communications, was that they were not exclusively between the complainant and the accused. The judge there noted that “the fact that there is a third party privy to this conversation, in real time, significantly diminishes any expectation of privacy that the complainant could have in the conversation” (para 27). In general, the approach taken in Mai was similar to that in WM, and indeed the judge in Mai also commented on the relevance of “risk analysis”, which is to be avoided in the section 8 context:

    [23] This contextual assessment is essential because I believe that a "risk analysis" forms an important part of assessing whether there is a reasonable expectation of privacy in the totality of circumstances. I recognize that the Supreme Court in R. v. Duarte, 1990 CanLII 150 (SCC), [1990] 1 SCR 30, emphatically rejected a risk analysis as a legitimate consideration in the context of s.8, noting, among other things, that the risk that the listener will "tattle" on the speaker, is of a different order of magnitude than the risk that the state is listening in and making a permanent recording. While the speaker may contemplate the risk of the former, it cannot reasonably be concluded that he contemplated the risk of the latter. However, outside the s.8 context, that is, where it is not the state that obtained the record, I believe that the risk analysis has an important role to play in assessing whether or not a complainant has a reasonable expectation of privacy in a record…

    [24] More recently, in Jarvis, in the context of interpreting the voyeurism provision in s.162(1) of the Criminal Code, the Supreme Court appears to apply a risk analysis in assessing whether the particular circumstances of a case give rise to a reasonable expectation of privacy. While the majority notes that a risk analysis is not determinative of whether there is a reasonable expectation of privacy in a particular situation (para.68), it appears to be an important consideration…

     [25] I appreciate that the fact that an accused possesses the potential "record" in question is not determinative of the analysis, as s.278.92 is explicitly intended to apply to materials in the possession of the accused. But I believe the fact that a complainant chose to share the information found in the record with the accused is a relevant circumstance. In doing so, the complainant can usually be reasonably expected to contemplate a risk that the accused would seek to use that information to defend himself against a subsequent allegation by the complainant. While the nature of that expectation will depend on the particular circumstances, I believe it does bear on a complainant’s expectation of privacy in the record.

  • 19 Dec 2019 3:14 PM | Deleted user

    Nova Scotia court holds for Plaintiff in action under NS Intimate Images and Cyber-Protection Act

    In Candelora v. Feser, Justice Joshua Arnold of the Supreme Court of Nova Scotia presided over the first case brought under Nova Scotia’s Intimate Images and Cyber-Protection Act, S.N.S. 2017, c. 7. The Act creates a statutory tort under which individuals can bring civil actions for the distribution of intimate images or cyber-bullying, the latter of which was at issue in the case. The Act defines “cyber-bullying” as follows:

    3(c) "cyber-bullying" means an electronic communication, direct or indirect, that causes or is likely to cause harm to another individual's health or well-being where the person responsible for the communication maliciously intended to cause harm to another individual's health or well-being or was reckless with regard to the risk of harm to another individual's health or well-being, and may include:

    (i) creating a web page, blog or profile in which the creator assumes the identity of another person,

    (ii) impersonating another person as the author of content or a message,

    (iii) disclosure of sensitive personal facts or breach of confidence,

    (iv) threats, intimidation or menacing conduct,

    (v) communications that are grossly offensive, indecent, or obscene,

    (vi) communications that are harassment,

    (vii) making a false allegation,

    (viii) communications that incite or encourage another person to commit suicide,

    (ix) communications that denigrate another person because of any prohibited ground of discrimination listed in Section 5 of the Human Rights Act, or

    (x) communications that incite or encourage another person to do any of the foregoing…

    In this case, the Plaintiff, Candelora, was involved in fairly contentious family/custody proceedings with the defendant Feser. During a pickup of the child of the former marriage, Candelora verbally referred to Feser’s new spouse, Dadas, in uncomplimentary terms. Dialogue about this resulted in both Feser and Dadas unleashing a torrent of abusive, insulting Facebook posts about both Candelora and her lawyer in the family proceedings (long excerpts of which, along with related testimony, can be found in the decision). Candelora eventually brought an action against the two under the Act.

    As this was the first case under the Act, Arnold J. traced his way through various parts of the statute to underpin his findings. The Facebook posts were clearly “electronic communications,” defined in the Act as “any form of electronic communication, including any text message, writing, photograph, picture recording or other matter that is communicated electronically.” On the issue of whether the postings were “direct or indirect,” the defendants argued that the posts were outside this scope because they were “private,” on the basis that Candelora was blocked from both Facebook accounts. Noting that Facebook posts have been held to constitute “publication” for defamation purposes, Justice Arnold observed that Dadas, in particular, had 4900 Facebook “friends” and many of her posts would receive 200-300 “likes.” He remarked:

    The Facebook postings about Ms. Candelora are not private, whether or not she is blocked as a friend of the respondents. It would obviously defeat the entire purpose of this legislation if a respondent could avoid a claim based on Facebook postings simply by blocking the applicant.

    Also, many of the postings were explicitly directed at, and even addressed to, Candelora.

    The postings in question had not only caused physical, mental and emotional harm to Candelora, but had been maliciously intended to do so, or in some cases reckless as to whether this would occur. The defendants characterized their posts as being some sort of retaliation for letters that Candelora’s counsel had sent as part of the family proceedings, all of which were proper but which they nonetheless found objectionable. Malice and/or recklessness was clear, given that the defendants’ purpose was “to try to intimidate Ms. Candelora into changing the course of the custody and child support proceedings with Mr. Feser,” and “to bully Ms. Candelora so that she would feel psychologically pressured into reversing her legal position.”

    As to other actions that constituted cyber-bullying, Justice Arnold held that the defendants had posted sensitive personal facts and information (including tax returns) about Candelora; were threatening and intimidating; and made many obscene and offensive comments. On the issue of whether there had been harassment, Justice Arnold noted that there was no definition of harassment in the Act but analogized to the offence in s. 264 of the Criminal Code; he held that the defendants were trying to dissuade Candelora from pursuing legitimate litigation goals, and had made her feel “continuously and chronically” worried, which made out harassment.

    Holding that cyber-bullying had clearly been made out, Justice Arnold proceeded to consider a list of considerations (under s. 6 of the Act) for crafting an appropriate order. Among these were the content of the cyber-bullying (“offensive and designed to intimidate and humiliate”), its frequency (“prolific”), and the extent of the distribution (“significant”). He held that the Act should be interpreted consistently with the protection for freedom of expression in s. 2 of the Charter.

    Arnold J. then considered the defences under s. 7 of the Act:

    7 (1) In an application for an order respecting the distribution of an intimate image without consent or cyber-bullying under this Act, it is a defence for the respondent to show that the distribution of an intimate image without consent or communication is in the public interest and that the distribution or communication did not extend beyond what is in the public interest.

    (2) In an application for an order respecting cyber-bullying under this Act, it is a defence for the respondent to show that

    (a) the victim of the cyber-bullying expressly or by implication consented to the making of the communication;

    (b) the publication of a communication was, in accordance with the rules of law relating to defamation,

    (i) fair comment on a matter of public interest,

    (ii) done in a manner consistent with principles of responsible journalism, or

    (iii) privileged…

    The defendants argued that “fair comment on a matter of public interest” applied on the basis that Candelora was a realtor and therefore a “public figure.” Interpreting this in line with the “fair comment” defence in defamation law, Arnold J. held that it was not made out: “Just because [Candelora] has a job whereby she advertises her services publicly does not allow the respondents, or anyone else, to maliciously tee-off on her online for the world to see.”

    In the result, the defendants were ordered to cease cyber-bullying Candelora and to take down any cyber-bullying content, and were prohibited from communicating with Candelora or her counsel except regarding custody matters. The parties were ordered to file submissions on damages and costs.

  
