

  • 9 Jan 2020 3:13 PM | CAN-TECH Law (Administrator)

    Federal Privacy Commissioner upholds traveller complaints about searches of electronic devices

    In his most recent annual report to Parliament, federal Privacy Commissioner Daniel Therrien detailed an investigation by his office into six complaints by travellers regarding searches of their electronic devices and data by Border Service Officers (BSOs) employed by the Canadian Border Services Agency (CBSA). The investigation report concluded that all six complaints were “well-founded” under the federal Privacy Act, the searches having breached sections 4 and 6(1) of that Act, which proscribe and limit the collection and retention of individuals’ personal data by federal government entities. The report makes a number of recommendations for improvement of policies and their application by the CBSA, as well as for legislative change, and indicates a somewhat recalcitrant attitude on the part of the CBSA towards limitations on its entitlements to search and retain data from devices.

    CBSA takes the position that its power to inspect and search “goods” which are “imported” into Canada under s. 99(1)(a) of the Customs Act extends to electronic documents stored on devices. The Commissioner noted CBSA’s position that this does not include electronic documents which might be accessible from the device through an internet connection, but not stored on the device itself—a position with which the Commissioner agreed. In 2015 the CBSA created its Examination of Digital Devices and Media at the Port of Entry – Guidelines, which require inter alia that the connectivity of devices be disabled before they are searched (usually by switching to airplane mode). The Commissioner also agrees with this limitation, which the report states is “essential” in order for CBSA to be able to comply with both the Customs Act and the Privacy Act.

    Here, the investigation into the complaints revealed a number of instances where this part of the Guidelines, in particular, was not complied with. In one case the BSO not only failed to disable connectivity on the traveller’s phone but used it to access their social media and banking information. In three other cases searches were done without turning on airplane mode, and in the remaining two there were no notes indicating whether connectivity had been disabled. Other problems included: a BSO photographing documents on a phone for purposes other than enforcing the Customs Act (which would amount to a breach of s. 8 of the Charter); records relating to searches being destroyed, rather than being retained for two years as required; and, in several cases, a failure to make any notes regarding the circumstances or indicators that led the BSO to conduct a search (which is required by additional guidelines put in place by the CBSA in 2017), rendering the CBSA unable to legally justify the search after the fact. In two instances the BSOs departed from the Guidelines by having travellers input their own passwords to unlock the devices, rather than asking for the passwords and writing them down themselves; however, the Commissioner noted that this is currently a legally grey area and declined to make any findings about it.

    The Commissioner’s ultimate findings reflected concern over what appeared to be poor training and inconsistent application of the Guidelines by BSOs. The report outlines a number of recommendations with which the CBSA has agreed:

    1. mandatory training for BSOs and supervisors, with documentation of participation, in order to ensure consistent compliance with the Guidelines;

    2. the creation of oversight and review mechanisms to ensure compliance with policies and practices relating to devices;

    3. an independent audit with respect to the overall operational framework and its compliance with the Privacy Act (though CBSA would only agree to an internal audit, the Commissioner found that this would accomplish the purpose sought);

    4. updating the CBSA’s Enforcement Manual to include comprehensive treatment of the requirements around device searches;

    5. transparency and accountability by way of information regarding policies and practices around searching devices to be published on the CBSA website;

    6. tracking, compiling and reporting statistics on device searches through all of CBSA’s operations.

    The Commissioner also made a number of recommendations for legislative change, but noted explicitly that he was “surprised and disappointed” that CBSA disagreed with all of them—particularly in that they were entirely consistent with proposals that were made by a Parliamentary committee in 2017. These were:

    7. that the definition of “goods” in the Customs Act be updated to reflect the unique nature of electronic devices, distinguishing them from other goods or receptacles which do not hold massive amounts of individual private data;

    8. that the Customs Act be amended to provide “a clear legal framework for the examination of digital devices, and specific rules that impose a higher threshold for the examination of such devices, in line with the requirements of the CBSA’s Policy.” To this the CBSA responded that due to the pace of technological change, fluidity was necessary and simply using policies was sufficient to keep up and ensure Charter compliance. The Commissioner responded that the very cases under review suggested that policies were insufficient.

    9. that the current threshold for engaging in a search set out in CBSA policy, “multiplicity of indicators,” be replaced with a “reasonable grounds to suspect” standard, which was explicitly recommended by the Parliamentary committee and which has been argued to be the only way to ensure the Charter compliance of such searches. This has long been a battleground between the CBSA and privacy advocates of various stripes.

    On this latter point the CBSA had responded:

    It is in the very nature of the border environment that there is a lack of prior knowledge or control over goods before they reach the border. With no prior knowledge or information, it can be impossible to formulate reasonable suspicion in relation to goods….

    …more and more documentation is going digital and it is as necessary for CBSA officers to view such documentation as it is for those officers to be able to search the traveller’s baggage. Imposing an inspection threshold only on digital documents makes it more likely that travellers seeking to circumvent Canadian prohibitions on imported goods or evade duties and taxes or, indeed, conceal their true identities will be able to do so.

    The Commissioner countered with skepticism:

    We note that the CBSA’s rationale for disagreeing with our Recommendations 7 and 9 draws comparisons between digital documents and traditional paper documents, suggesting that the changes to legislation would create a higher threshold for the inspection of digital documents. However, CBSA itself has recognized that a higher threshold is appropriate for digital devices given the wealth of information that can be stored on them. Simply put, searching a digital device is not the same as, for instance, consulting a paper receipt and this is already reflected in CBSA policy.

    We also fail to see how our recommendations would prevent the CBSA from viewing travel documents stored electronically as part of its normal operations. The requirement to produce a travel document – in electronic form or otherwise – for inspection does not entail handing over a device and all its contents to be searched and would not be affected by our recommendations. In our view, CBSA is wrongly asserting operational barriers to legislative reform; barriers that are belied by the fact that its Policy already distinguishes and applies a higher threshold for digital devices.

    The Commissioner concluded that he would be recommending the proposed legislative changes to the Minister of Public Safety and the Minister of Border Security and Organized Crime Reduction, “in order to adequately protect the privacy and Charter rights of Canadians as they return home from travels abroad.”

  • 19 Dec 2019 3:54 PM | CAN-TECH Law (Administrator)

    Private communications producing a record not “records”

    In two recent decisions, the Ontario Superior Court of Justice was required to consider whether communications sent as Facebook messages constituted “records” within the meaning of section 278.92 of the Criminal Code. The Criminal Code has for some years contained a statutory scheme requiring an accused to bring an application in order to obtain third party records relating to a complainant in a sexual assault prosecution. Those provisions did not apply to records already in the hands of the accused, however, and did not deal with the ultimate admissibility of such records at trial. In December of 2018, the provisions were amended so that they now do apply to records already in the hands of the accused. In addition, section 278.92 was added to the Code, requiring an accused to bring an application to the trial judge before being allowed to use such records in any way at trial. The result of these changes is that, when the new provision applies, an accused is effectively required to disclose part of the defence strategy, and cannot impeach a complainant’s testimony with evidence the complainant had not anticipated. (This change in the law was a response to the acquittal in the high-profile R v Ghomeshi case, where exactly that had happened.)

    The first recent case where that issue arose was R v WM. The accused was charged with several offences, including sexual assault with a weapon, sexual assault causing bodily harm, and assault causing bodily harm that occurred on March 25, 2017 between the accused Mr. M and the complainant Ms. M-A. The accused had a number of Facebook messages which the complainant had sent to him during the relevant time period which he wished to use at trial. The complainant had deleted the messages, and so neither she nor the Crown had copies of them. The accused preferred not to bring a section 278.92 application if he was not required to, since doing so would disclose those messages to the Crown, and so he brought a motion for directions as to whether the section applied or not. That question turned on whether the Facebook messages were “records” within the meaning of the section, and that issue in turn depended on whether the complainant had a reasonable expectation of privacy over the content of the messages.

    The trial judge concluded that the complainant did not have a reasonable expectation of privacy, and therefore that the accused was not required to bring a section 278.92 application in order to use the Facebook messages.

    The Superior Court judge noted a number of things about “reasonable expectation of privacy” in this context. It is not the same issue as arises under section 8 of the Charter, concerning unreasonable search and seizure, because the issue is not the state obtaining information but a private citizen using information he already had to defend himself against criminal charges. Further, the judge noted that there is a temporal aspect to the question of a reasonable expectation of privacy: a “record” is defined as anything containing information over which the complainant has a reasonable expectation of privacy, not anything over which they had such an expectation at some point in the past. Equally, though, the fact that the accused was already in possession of the messages was not determinative in extinguishing the complainant’s expectation of privacy over their content, because privacy is not an all-or-nothing concept.

    The judge considered several cases, including the Supreme Court decisions in R v Reeves and R v Jarvis. The Court also referred to R v Mills, where the Supreme Court held that the accused did not have a reasonable expectation of privacy over sexually explicit Facebook messages he thought he was sending to a 14-year-old girl. The judge noted that the Court in Mills held that “on the normative standard of privacy described by this court, adults cannot reasonably expect privacy online with children they do not know. That the communication occurs online does not add a layer of privacy, but rather a layer of unpredictability” (para 36). It might have been worth noting as well that three of the seven judges in Mills also held the view that “an individual cannot reasonably expect their words to be kept private from the person with whom they are communicating” (para 42 of Mills).

    The judge relied primarily on four factors to conclude that the complainant did not have, at the relevant time, a reasonable expectation of privacy: the content of the messages, the manner in which they were sent, the nature of the relationship, and the policy implications. Of most interest are the judge’s comments on the second factor, the manner in which the messages were sent:

    [44] In Marakah, the Supreme Court held that the sender of an electronic communication has a reasonable expectation that the police, or the state, will not seize that communication from the recipient. The issue here is not whether the state is entitled to seize Ms. M.-A.’s Facebook messages from W.M. (or W.M.’s electronic communications from Ms. M.-A.). The issue is whether Ms. M.-A. has a reasonable expectation that W.M., as the intended recipient of the messages, will keep them private.

    [45] The fact that W.M. was the intended recipient of Facebook messages is a significant factor in deciding whether Ms. M.-A. can reasonably expect that they will be kept private and will not be used by the intended recipient. To the extent that the messages contain personal information about Ms. M.-A., she chose to share that information with W.M. She also chose to do so in writing, knowing that she was creating an electronic record that W.M. could save and share with others.

    [47] I recognize that this factor imports a risk analysis into the decision of whether Ms. M.-A. has a reasonable expectation of privacy over information she shared with W.M. As the courts have repeatedly said, risk of further dissemination is not determinative. It is nonetheless relevant that Ms. M.-A. chose to give W.M. the information he now wishes to use and she did so in a manner that she knew would create a permanent record that he could save. The kind of risk at issue on the facts of this case is quite different from the risk at issue in Duarte or Marakah, namely that the state might intercept or make a permanent record of the communication.

    The same issue about whether an electronic communication was a “record” arose in R v Mai, though in that case relating to messages sent via WhatsApp. The trial judge in Mai equally reached the conclusion that the communications in question were not captured by the statutory scheme, though noting that this would always be determined on a case-by-case basis.

    One additional factor which arose in Mai, with regard to some of the WhatsApp communications, was that they were not exclusively between the complainant and the accused. The judge there noted that “the fact that there is a third party privy to this conversation, in real time, significantly diminishes any expectation of privacy that the complainant could have in the conversation” (para 27). In general, the approach taken in Mai was similar to that in WM, and indeed the judge in Mai also commented on the relevance of “risk analysis”, which is to be avoided in the section 8 context:

    [23] This contextual assessment is essential because I believe that a "risk analysis" forms an important part of assessing whether there is a reasonable expectation of privacy in the totality of circumstances. I recognize that the Supreme Court in R. v. Duarte, 1990 CanLII 150 (SCC), [1990] 1 SCR 30, emphatically rejected a risk analysis as a legitimate consideration in the context of s.8, noting, among other things, that the risk that the listener will "tattle" on the speaker, is of a different order of magnitude than the risk that the state is listening in and making a permanent recording. While the speaker may contemplate the risk of the former, it cannot reasonably be concluded that he contemplated the risk of the latter. However, outside the s.8 context, that is, where it is not the state that obtained the record, I believe that the risk analysis has an important role to play in assessing whether or not a complainant has a reasonable expectation of privacy in a record…

    [24] More recently, in Jarvis, in the context of interpreting the voyeurism provision in s.162(1) of the Criminal Code, the Supreme Court appears to apply a risk analysis in assessing whether the particular circumstances of a case give rise to a reasonable expectation of privacy. While the majority notes that a risk analysis is not determinative of whether there is a reasonable expectation of privacy in a particular situation (para.68), it appears to be an important consideration…

     [25] I appreciate that the fact that an accused possesses the potential "record" in question is not determinative of the analysis, as s.278.92 is explicitly intended to apply to materials in the possession of the accused. But I believe the fact that a complainant chose to share the information found in the record with the accused is a relevant circumstance. In doing so, the complainant can usually be reasonably expected to contemplate a risk that the accused would seek to use that information to defend himself against a subsequent allegation by the complainant. While the nature of that expectation will depend on the particular circumstances, I believe it does bear on a complainant’s expectation of privacy in the record.

  • 19 Dec 2019 3:14 PM | CAN-TECH Law (Administrator)

    Nova Scotia court holds for Plaintiff in action under NS Cyber-Safety Act

    In Candelora v. Feser, Justice Joshua Arnold of the Supreme Court of Nova Scotia presided over the first case brought under Nova Scotia’s Intimate Images and Cyber-Protection Act, S.N.S. 2017, c. 7. The Act creates a statutory tort under which individuals can bring civil actions for the distribution of intimate images or cyber-bullying, the latter of which was at issue in the case. The Act defines “cyber-bullying” as follows:

    3(c) "cyber-bullying" means an electronic communication, direct or indirect, that causes or is likely to cause harm to another individual's health or well-being where the person responsible for the communication maliciously intended to cause harm to another individual's health or well-being or was reckless with regard to the risk of harm to another individual's health or well-being, and may include:

    (i) creating a web page, blog or profile in which the creator assumes the identity of another person,

    (ii) impersonating another person as the author of content or a message,

    (iii) disclosure of sensitive personal facts or breach of confidence,

    (iv) threats, intimidation or menacing conduct,

    (v) communications that are grossly offensive, indecent, or obscene,

    (vi) communications that are harassment,

    (vii) making a false allegation,

    (viii) communications that incite or encourage another person to commit suicide,

    (ix) communications that denigrate another person because of any prohibited ground of discrimination listed in Section 5 of the Human Rights Act, or

    (x) communications that incite or encourage another person to do any of the foregoing…

    In this case, the Plaintiff, Candelora, was involved in fairly contentious family/custody proceedings with the defendant Feser. During a pickup of the child of the former marriage, Candelora verbally referred to Feser’s new spouse, Dadas, in uncomplimentary terms. Dialogue about this resulted in both Feser and Dadas unleashing a torrent of abusive, insulting Facebook posts about both Candelora and her lawyer in the family proceedings (long excerpts of which, along with related testimony, can be found in the decision). Candelora eventually brought an action against the two under the Act.

    As this was a case of first instance, Arnold J. traced his way through various parts of the statute to underpin his findings. The Facebook posts were clearly “electronic communications,” defined in the Act as “any form of electronic communication, including any text message, writing, photograph, picture, recording or other matter that is communicated electronically.” On the issue of whether the postings were “direct or indirect,” the defendants argued that the posts fell outside this scope because they were “private,” on the basis that Candelora was blocked from both Facebook accounts. Noting that Facebook posts have been held to constitute “publication” for defamation purposes, Justice Arnold observed that Dadas, in particular, had 4,900 Facebook “friends” and that many of her posts would receive 200-300 “likes.” He remarked:

    The Facebook postings about Ms. Candelora are not private, whether or not she is blocked as a friend of the respondents. It would obviously defeat the entire purpose of this legislation if a respondent could avoid a claim based on Facebook postings simply by blocking the applicant.

    Moreover, many of the postings were explicitly directed at, and even addressed to, Candelora.

    The postings in question had not only caused physical, mental and emotional harm to Candelora but had been maliciously intended to do so, or had in some cases been reckless as to whether such harm would occur. The defendants characterized their posts as retaliation for letters that Candelora’s counsel had sent as part of the family proceedings, all of which were proper but which they nonetheless found objectionable. Malice and/or recklessness was clear, given that the defendants’ purpose was “to try to intimidate Ms. Candelora into changing the course of the custody and child support proceedings with Mr. Feser,” and “to bully Ms. Candelora so that she would feel psychologically pressured into reversing her legal position.”

    As to other actions that constituted cyber-bullying, Justice Arnold held that the defendants had posted sensitive personal facts and information (including tax returns) about Candelora; were threatening and intimidating; and made many obscene and offensive comments. On the issue of whether there had been harassment, Justice Arnold noted that there was no definition of harassment in the Act but analogized to the offence in s. 264 of the Criminal Code; he held that the defendants were trying to dissuade Candelora from pursuing legitimate litigation goals, and had made her feel “continuously and chronically” worried, which made out harassment.

    Holding that cyber-bullying had clearly been made out, Justice Arnold proceeded to consider a list of considerations (under s. 6 of the Act) for crafting an appropriate order. Among these were the content of the cyber-bullying (“offensive and designed to intimidate and humiliate”), its frequency (“prolific”), and the extent of the distribution (“significant”). He held that the Act should be interpreted consistently with the protection for freedom of expression in s. 2 of the Charter.

    Arnold J. then considered the defences under s. 7 of the Act:

    7 (1) In an application for an order respecting the distribution of an intimate image without consent or cyber-bullying under this Act, it is a defence for the respondent to show that the distribution of an intimate image without consent or communication is in the public interest and that the distribution or communication did not extend beyond what is in the public interest.

    (2) In an application for an order respecting cyber-bullying under this Act, it is a defence for the respondent to show that

    (a) the victim of the cyber-bullying expressly or by implication consented to the making of the communication;

    (b) the publication of a communication was, in accordance with the rules of law relating to defamation,

    (i) fair comment on a matter of public interest,

    (ii) done in a manner consistent with principles of responsible journalism, or

    (iii) privileged…

    The defendants argued that “fair comment on a matter of public interest” applied on the basis that Candelora was a realtor and therefore a “public figure.” Interpreting this in line with the “fair comment” defence in defamation law, Arnold J. held that it was not made out: “Just because [Candelora] has a job whereby she advertises her services publicly does not allow the respondents, or anyone else, to maliciously tee-off on her online for the world to see.”

    In the result, the defendants were ordered to cease cyber-bullying Candelora and to take down any cyber-bullying content, and were prohibited from communicating with Candelora or her counsel except regarding custody matters. The parties were ordered to file submissions on damages and costs.

  • 5 Dec 2019 3:57 PM | CAN-TECH Law (Administrator)

    Order permits defendant to continue to service existing customers who are reliant on the software, with a monthly accounting of resulting revenues to be provided to the plaintiff

    Knowmadics Inc., a US-based software developer, sought and obtained an interlocutory injunction against a former employee in the Ontario Superior Court of Justice for marketing a similar and competing product. 

    The employee had been the principal developer of Knowmadics’ software product, known as SilverEye. SilverEye was principally marketed, in conjunction with the plaintiff’s CASES mobile application, to the military and law enforcement for the collection and analysis of data from mobile devices. At the beginning of her employment with Knowmadics, she signed an employment agreement that contained “obligations pertaining to ownership of intellectual property, confidentiality, conflict of interest and client servicing.” The defendant, Ms. Cinnamon, left her employment with the plaintiff in 2017 and agreed to continue to provide support to the company. The two also signed a non-disclosure agreement.

    According to the decision of Justice Hackland, Knowmadics v. Cinnamon, it came to the plaintiff’s attention that Ms. Cinnamon, shortly after leaving her employment, was selling through her company LDX Inc. two software products, FireCat and GhostCat, that had features similar to SilverEye and CASES. The plaintiff commenced a lawsuit against Ms. Cinnamon and her company, alleging that the FireCat software infringed Knowmadics’ copyright and that she had breached her employment agreement and the post-employment agreements.

    The plaintiff and defendant jointly retained a consultant to review the code of the software at issue. The consultant reached the following conclusions:

    i) certain FireCat source code is identical or substantially similar to the Knowmadics source code;

    ii) “a significant proportion of the LDX database structure and source code has been copied from the Knowmadics BlueBird database”;

    iii) “approximately 10 percent of the Knowmadics BlueBird database schema matches 40 percent of the database tables and 30 percent of the table-column pairs in the LDX database schema. Several of these similarities include clear indicia of copying from the Knowmadics database schema”;

    iv) similarities in the tables contained in the Knowmadics database and the LDX database “include a number of identical matches indicative of copying”;

    v) “some of the LDX stored procedures still contain isolated references that are indicia of copying from the Knowmadics code”;

    vi) certain columns that appear in both the Knowmadics and LDX databases “contain identical misspellings” and that it “is highly unlikely that these identical mistakes in each database are coincidental, as there is no legitimate reason for a developer to intentionally incorporate such errors into a database schema”; and

    vii) “comparison of the Knowmadics and LDX database schemas identified significant structural and functional overlap indicative of copying by LDX. Furthermore, both databases also contain a significant amount of source code that is either identical or substantially similar, providing further evidence of copying”
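    The kind of schema-overlap comparison the consultant describes can be illustrated with a brief sketch. The decision does not describe the consultant’s actual methodology, so everything below is an assumption for illustration only: the schemas, table and column names, and the list of misspelled tokens are all invented, and real forensic comparison tools are far more sophisticated.

    ```python
    # Hypothetical sketch of a schema-overlap comparison in the spirit of the
    # consultant's findings quoted above. All names here are invented.

    def table_column_pairs(schema):
        """Flatten a {table: [columns]} schema into a set of (table, column) pairs."""
        return {(t, c) for t, cols in schema.items() for c in cols}

    def overlap_report(schema_a, schema_b):
        """Report tables and table-column pairs shared between two schemas."""
        pairs_a, pairs_b = table_column_pairs(schema_a), table_column_pairs(schema_b)
        shared = pairs_a & pairs_b
        return {
            "shared_tables": sorted(set(schema_a) & set(schema_b)),
            "shared_pairs": sorted(shared),
            # Fraction of schema B's pairs also present in schema A --
            # analogous to the percentage-style findings in the decision.
            "pair_overlap_in_b": len(shared) / len(pairs_b) if pairs_b else 0.0,
        }

    def shared_misspellings(schema_a, schema_b, typo_tokens=("lattitude", "adress")):
        """Column names containing known-misspelled tokens that appear in BOTH
        schemas -- identical misspellings being strong indicia of copying."""
        cols_a = {c for _, c in table_column_pairs(schema_a)}
        cols_b = {c for _, c in table_column_pairs(schema_b)}
        return sorted(c for c in cols_a & cols_b
                      if any(t in c for t in typo_tokens))

    if __name__ == "__main__":
        original = {"devices": ["device_id", "lattitude", "longitude"],
                    "events":  ["event_id", "device_id", "timestamp"]}
        suspect  = {"devices": ["device_id", "lattitude", "longitude"],
                    "users":   ["user_id", "name"]}
        print(overlap_report(original, suspect))
        print(shared_misspellings(original, suspect))
    ```

    On these toy inputs, three of the suspect schema’s five table-column pairs match the original, and the shared misspelled column (`lattitude`) is exactly the sort of coincidence-defying artifact the consultant pointed to.
    
    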

    The defendant also retained a consultant to determine what portion of the FireCat code was derived from third party code and how much was derived from software she had written before working for Knowmadics. That consultant concluded that much of the code was available elsewhere. The relevance of this would be left to the trial judge:

    [15] It is admitted by Ms. Cinnamon that she used what she claimed was her own prior code in developing SilverEye for the plaintiff during the tenure of her employment with the plaintiff. She did so without identifying this to the plaintiff. The position of the defendant Ms. Cinnamon is that she incorporated the Technocality database (which she wrote for her earlier client Technocality) into SilverEye software she developed for the plaintiff and subsequently incorporated it into the FireCat software she wrote for her company, the defendant LDX.

    [16] In these factual circumstances, I would respectfully agree with and adopt the plaintiff’s statement (from paragraph 76, plaintiff’s factum).

    The relevant issue as far as the database goes is a legal one for the Court to determine at trial: once Ms. Cinnamon delivered SilverEye to Knowmadics incorporating that prior database without identifying that she was doing so, and Knowmadics copyrighted SilverEye with that database code and schema, was Ms. Cinnamon then permitted under copyright law or her agreements with Knowmadics to take a shortcut and use the same code and schema to create a competing software with the same functionalities? This is a serious issue to be determined at trial…

    The defendant also argued that an injunction would be catastrophic for her clients, who have come to rely on the FireCat software and all of whom are based in Canada. Many of them are former clients of the plaintiff, but the defendant has not acquired any of the plaintiff’s US-based clients. 

    A particularly tailored injunction was ordered, described by the Court:

    [27] In the court’s opinion an interim interlocutory injunction which permits the defendants to continue to service only their existing Canadian based clients with their FireCat software until the trial of this action achieves an adequate balancing of the interests of both parties and avoids irreparable harm to either one. The object of such an order would be to maintain the status quo pending trial.

    [28] As the evidence establishes that the defendants have not obtained any customers for LDX’s FireCat software in the United States, I would exercise the court’s discretion to limit the application of the interlocutory injunction to the Canadian market.

    [29] The plaintiff has submitted that the defendant LDX should be subject to an order that pending further order of this Honourable Court, the defendant LDX Inc. shall provide a monthly accounting of all its gross revenues to counsel for the plaintiff and pay all such revenue into Court to the credit of this action on a monthly basis.

    [30] I will order that the monthly accounting of gross revenues be provided to plaintiff’s counsel until the trial of the action. I decline at this time to order any such revenues to be paid into court but will consider a further application in this regard if a proper case can be put forward to justify doing so.

    Costs were to be costs in the cause, as the order was intended to preserve the status quo and the court did not make any findings on the merits.

  • 5 Dec 2019 3:56 PM | CAN-TECH Law (Administrator)

    BC court finds that holding a phone in a position in which it may be used is distracted driving 

    The strictness of British Columbia’s distracted driving laws was confirmed in R. v. Ahmed, where a judicial justice found that the accused had violated the Motor Vehicle Act by holding a cellular phone while operating a vehicle. Unlike in some provinces, the BC prohibition deems holding a device in a position in which it may be used to be “use” of the device:

    [33] “Use” under Section 214.1, Part 3.1 of the MVA “in relation to an electronic device means one or more of the following actions:

    (a) Holding the device in a position in which it may be used;

    (b) Operating one or more of the device’s functions;

    (c) Communicating orally by means of the device with another person or another device; and

    (d) Taking another action that is set out in the Regulations by means of, with or in relation to an electronic device.”

    The police officer observed the accused holding the phone near the centre of the steering wheel while looking up and down at it. This evidence was uncontradicted, and a conviction was entered.

  • 5 Dec 2019 3:55 PM | CAN-TECH Law (Administrator)

    Lower court had required disclosure of password under “foregone conclusion” exception to the prohibition against mandatory self-incrimination

    In a four to three ruling, the Supreme Court of Pennsylvania in Commonwealth of Pennsylvania v. Davis found that a defendant cannot be compelled to disclose a password to allow the state access to the defendant’s lawfully-seized encrypted computer because a compulsion of that sort would violate the Fifth Amendment of the United States Constitution. This decision overturned a previous order of the Superior Court, which required that the defendant turn over a 64-character password to access his computer.

    The defendant was accused of child pornography offences, and in the course of its investigation the police seized a desktop computer that was encrypted. During questioning, the defendant was asked for the password. He reportedly replied: “It’s 64 characters and why would I give that to you? We both know what’s on there. It’s only going to hurt me. No f*cking way I’m going to give it to you.”

    The prosecution brought a motion to require the defendant to reveal his password. The main issue under consideration in the court below was whether providing the password to defeat the encryption was testimonial in nature, and thus, protected by the Fifth Amendment. From page 4:

    The trial court focused on the question of whether the encryption was testimonial in nature, and, thus, protected by the Fifth Amendment. The trial court opined that “[t]he touchstone of whether an act of production is testimonial is whether the government compels the individual to use ‘the contents of his own mind’ to explicitly or implicitly communicate some statement of fact.” 

    The court below applied the “foregone conclusion” exception to the Fifth Amendment rule against incrimination, described at page 5:

    The court noted the rationale underlying this doctrine is that an act of production does not involve testimonial communication if the facts conveyed are already known to the government, such that the individual “‘adds little or nothing to the sum total of the government’s information.’” The trial court offered that for this exception to apply, the government must establish its knowledge of (1) the existence of the evidence demanded; (2) the possession or control of the evidence by the defendant; and (3) the authenticity of the evidence. [citations omitted]

    Because the trial court had a very high level of confidence about what was on the computer, it determined that compelling disclosure of the password fell within the “foregone conclusion” exception.

    The appeal court disagreed:

    Based upon the United States Supreme Court’s jurisprudence surveyed above, it becomes evident that the foregone conclusion gloss on a Fifth Amendment analysis constitutes an extremely limited exception to the Fifth Amendment privilege against self-incrimination. The Supreme Court has spoken to this exception on few occasions over the 40 years since its recognition in Fisher, and its application has been considered only in the compulsion of specific existing business or financial records. Its circumscribed application is for good reason. First, the Fifth Amendment privilege is foundational. Any exception thereto must be necessarily limited in scope and nature. Moreover, business and financial records are a unique category of material that has been subject to compelled production and inspection by the government for over a century. The high Court has never applied or considered the foregone conclusion exception beyond these types of documents. Indeed, it would be a significant expansion of the foregone conclusion rationale to apply it to a defendant’s compelled oral or written testimony. As stated by the Supreme Court, “[t]he essence of this basic constitutional principle is ‘the requirement that the [s]tate which proposes to convict and punish an individual produce the evidence against him by the independent labor of its officers, not by the simple cruel expedient of forcing it from his own lips.’” (emphasis original). Broadly circumventing this principle would undercut this foundational right.


    Finally, the prohibition of application of the foregone conclusion rationale to areas of compulsion of one’s mental processes would be entirely consistent with the Supreme Court decisions, surveyed above, which uniformly protect information arrived at as a result of using one’s mind. To broadly read the foregone conclusion rationale otherwise would be to undercut these pronouncements by the high Court. When comparing the modest value of this exception to one’s significant Fifth Amendment privilege against self-incrimination, we believe circumscribed application of the privilege is in order. [citations omitted]

    The Supreme Court of Pennsylvania concluded that the provision of the password was testimonial and was not within the “foregone conclusion” exception, and reversed the order of the Superior Court.

  • 5 Dec 2019 3:54 PM | CAN-TECH Law (Administrator)

    Saskatchewan Court of Appeal considers multi-pronged attack on admission and consideration of Facebook Messages 

    The Saskatchewan Court of Appeal recently reviewed the legal tests to be applied for attributing authorship and authenticating Facebook Messenger messages. The accused/appellant in R v Durocher appealed his finding of guilt for sexual assault and sexual interference with a person under 16, arguing in part that the trial judge had erred in the admission and reliance on Facebook messages put forward as being exchanged between the accused and the complainant. 

    As part of the prosecution’s evidence, the complainant testified and referred to a number of Facebook messages. Although the defence alerted the judge that there might be objections and that further evidence touching on the messages might be called, the defence ultimately did not object and did not call for a voir dire regarding admissibility, reliability, or whether the accused was the actual author of the messages attributed to him by the complainant. On appeal, the accused argued that the messages were hearsay and could not be considered unless the judge held a voir dire to determine the question, and that the judge should have done so on his own motion.

    On the question of authorship, the Court of Appeal concluded that it was open to the trial judge to conclude that the accused had authored the messages, based on the complainant’s testimony and the inferences that could be drawn from it.

    [49] There was no suggestion at trial of tampering or that the sender used an alias. Although the prospect of tampering was discussed in the context of threshold authentication, Watt J.A., in R v C.B., 2019 ONCA 380 (CanLII), 146 OR (3d) 1 [C.B.], was of the view that an inference could be drawn about authorship in the absence of any evidence that gives an air of reality to such a claim:

    [72] As a matter of principle, it seems reasonable to infer that the sender has authored a message sent from his or her phone number. This inference is available and should be drawn in the absence of evidence that gives an air of reality to a claim that this may not be so. Rank speculation is not sufficient: R. v. Ambrose, 2015 ONCJ 813 (CanLII), at para. 52.

    [50] In my view, the trial judge properly applied the Evans test to determine threshold admissibility. Examining the circumstantial evidence as a whole, it was open to him to draw an inference that Mr. Durocher was the author of the Facebook messages. L.A. provided viva voce testimony and a statement to the police that Mr. Durocher was the person who had sent the Facebook messages to her. She explained the basis for her belief and briefly discussed the content of each message. L.A.’s evidence on this point went in unchallenged.

    To determine threshold admissibility, the trial judge needed only to be satisfied on the applicable legal standard that the accused had made the statements, and could rely on circumstantial evidence to do so. Furthermore, a voir dire was not required to consider the question.

    The appellant also challenged the reliability of the messages, which the Court of Appeal found to be rooted in the law related to hearsay. The appellant asserted that the trial judge should have held a voir dire to determine the question, though none was requested at the trial. An out-of-court statement made by a non-testifying declarant “tendered for the truth of its contents” is presumptively inadmissible, subject to specific common law exceptions. 

    The Court of Appeal noted that at trial, defence counsel never identified hearsay as a basis for concern or suggested that a voir dire was necessary. Defence counsel did state that he “may object” at some point to the Facebook messages during the examination, but did not challenge the complainant’s testimony attributing the messages to the accused.

    This may have been part of a strategic decision by the defence that ultimately failed: “As commented above, the approach taken by defence counsel at trial was no doubt in furtherance of a strategy that did not succeed, but that alone does not make the trial judge’s approach erroneous.”

    Defence counsel chose not to advance this argument at trial, and the trial judge was not faulted for declining to intervene in Mr. Durocher’s trial strategy. In this case, the Facebook evidence was presumptively admissible without a voir dire, and the trial judge did not err by failing to hold one of his own motion to determine whether the messages were inadmissible as hearsay.

    Further, the Court of Appeal invoked the curative provision in s. 686(1)(b)(iii) of the Criminal Code. Although the trial judge believed that Mr. Durocher had sent the messages, they did not add to the allegation of the specific sexual assault:

    [73] … I say this because even though the trial judge was satisfied Mr. Durocher was the author of the messages, he did not give the messages any weight in deciding guilt or innocence. To repeat, he said, “I believe that Mr. Durocher did send messages, but they don’t add to the allegation of the specific sexual assault” (emphasis added). The trial judge’s decision came down to the credibility of L.A. He explained why he found her credible and, in the end, was satisfied beyond a reasonable doubt that Mr. Durocher had committed the offences with which he stood charged.

    The appellant also challenged the authenticity of the messages, which requires resort to the Canada Evidence Act (“CEA”) and the common law. Authenticity, to put it simply, refers to whether the document is what it purports to be. The CEA is particularly engaged because the messages are electronic documents:

    Authentication of electronic documents

    31.1 Any person seeking to admit an electronic document as evidence has the burden of proving its authenticity by evidence capable of supporting a finding that the electronic document is that which it is purported to be.

    Under the CEA, authentication is merely a threshold question; the ultimate issue must be determined at the end of the day with the benefit of all the relevant evidence:

    [84] That said, authentication does not necessarily mean the document is genuine: “That is a question of weight for the fact-finder which often turns on determinations of credibility” (citations omitted, Ball at para 70). Evidence can be authenticated even where there is a contest over whether it is what it purports to be. As Professor David Paciocco (as he then was) explained in his article cited above, “Proof and Progress: Coping with the Law of Evidence in a Technological Age” (December 2013) 11 Can J L & Tech 181 [“Proof and Progress”], this is not because the law is interested in false documentation (at 197):

    It is simply that the law prefers to see disputes about authenticity resolved at the end of a case, not at the admissibility stage. Disputes over authenticity tend to turn on credibility, and credibility is best judged at the end of the case in the context of all of the evidence. “Authentication” for the purposes of admissibility is therefore nothing more than a threshold test requiring that there be some basis for leaving the evidence to the fact finder for ultimate evaluation. In R. v. Butler, [2009 ABQB 97] 2009 CarswellAlta 1825, [2009] A.J. No. 1242 (Alta. Q.B.), for example, the Court recognized where there was a live issue about whether the accused generated the Facebook entries in question that would be for the jury to decide.

    The Court of Appeal also noted that authentication and authorship are two different, but interconnected, questions and each must be resolved. The Court of Appeal, at paragraph 85, quoted from Graham Underwood and Jonathan Penner, Electronic Evidence in Canada, loose-leaf (Rel 1, 2016) vol 1 (Toronto: Thomson Reuters, 2010):

    It is also important to note that establishing the authenticity of an electronic document is not necessarily synonymous with demonstrating its authorship. The relationship between authenticity and authorship for electronic documents is not reciprocal; demonstrating authorship is sufficient to establish authenticity, but establishing authenticity of an electronic document does not necessarily provide strong evidence of authorship.

    Ultimately, the burden for authentication is relatively light. Though neither party before the court nor the trial judge referred to the CEA, there was sufficient evidence to establish both authenticity and authorship. The Court of Appeal noted that the complainant testified to the following, all of which was largely unchallenged by the defence:

    (a) she was his Facebook friend;

    (b) she described the content of Mr. Durocher’s home page as containing his name in bold and underlined lettering;

    (c) the responses to her messages bore her name;

    (d) she identified the messages she had sent in response and her responses bore her name;

    (e) the messages from the sender contained sexual content with an invitation for sexual activity;

    (f) she was able to access the messages from her smart phone;

    (g) the sender used the word tatanka that, in the Dakota language, means buffalo;

    (h) the messages received by L.A. were contemporaneous with the alleged assaults; and

    (i) some of L.A.’s messages, in the chain of messages, were met by a further message from the sender.

    Though the Court of Appeal would have preferred that the trial judge had held a voir dire on this question, the low bar set by s. 31.1 of the CEA was easily met. At paragraph 96, the Court of Appeal noted:

    However, bearing in mind the low bar attached to s. 31.1, the functional approach adopted by the courts with regard to its application, the presumption of integrity under the CEA and the fact the trial judge ultimately found Mr. Durocher was the author of the Facebook messages, I am satisfied that the evidence adduced by the Crown was capable of authenticating the Facebook messages.

  • 21 Nov 2019 4:39 PM | CAN-TECH Law (Administrator)

    Customers complained they could not access their funds; approximately $16M owed

    On November 4, 2019, the British Columbia Securities Commission announced that it had made an application to the Supreme Court of the province for the appointment of an interim receiver to take custody of the assets, undertaking and business of a group of companies operating a cryptocurrency exchange, called the Einstein Exchange. The receiver entered and secured the premises on November 1, 2019.

    According to the affidavit of the Lead Investigator with the Enforcement Division of the BCSC, the Securities Commission commenced its investigation in May 2019 after hearing complaints from several of the company’s customers indicating they were not able to access their assets on the exchange. Two other individuals, including a shareholder, also complained about possible money laundering. According to documents filed with the court, the company trades nineteen different cryptocurrencies and its customers are currently owed just over $16 million.

  • 21 Nov 2019 3:58 PM | CAN-TECH Law (Administrator)

    US government to strictly enforce KYC rules for cryptocurrency exchanges

    As the New York Times recently reported, at a conference hosted by a blockchain analysis company, the Director of the US Financial Crimes Enforcement Network (FinCEN) announced that the US government would be strictly enforcing the “travel rule” anti-money laundering measure over cryptocurrency exchanges. The travel rule “requires cryptocurrency exchanges to verify their customers' identities, identify the original parties and beneficiaries of transfers $3,000 or higher, and transmit that information to counterparties if they exist.” It is a standard feature of anti-money laundering law and the subject of recent recommendations by the Financial Action Task Force (FATF), which held that the rule should apply to cryptocurrency exchanges due to the large amount of theft, scams and fraud associated with them (estimated to be $4.3 billion this year). The report indicates this was something of a surprise to cryptocurrency firms, who felt that cryptocurrency was not money and that the travel rule therefore didn’t apply to them.

  • 21 Nov 2019 3:54 PM | CAN-TECH Law (Administrator)

    Test case by ACLU and EFF upholds privacy interest in e-devices

    The question of when it is legal for border officials to search the electronic devices of travelers has been a contentious one for some time, both in Canada and in the United States. The dominant argument made by governments has been that privacy interests are low at the border, while national security and law enforcement interests are high, and therefore devices should be subject to search at any time. Civil society organizations and defendants have asserted that due to the intense privacy interest in the large amounts of data carried in computers and smart phones, searches are much more invasive than would be the case for other things a traveller might have with them (such as a suitcase), and have advocated for requiring the state to have reasonable grounds or reasonable suspicion before conducting a search.

    In the recent case of Alasaad v. Nielsen, the U.S. District Court for the District of Massachusetts ruled that searches of the devices carried by 11 travellers (10 US citizens and one permanent resident) by US Customs and Border Protection (CBP) and Immigrations and Customs Enforcement (ICE) violated the US 4th Amendment, because they were done in the absence of reasonable and individualized suspicion that contraband or evidence was present. The action was brought by the American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF), and one could infer that the plaintiffs were carefully chosen due to the circumstances and effects of the searches that were done. Plaintiff Alasaad was a Muslim woman who objected to male officers searching her phone and viewing pictures of her and her daughters without their headscarves, yet was subjected to searches without grounds on two different occasions. Other searches saw border officials viewing privileged solicitor-client material, confidential work data belonging to an employee of NASA, and journalistic work product with lists of contacts. One plaintiff, a writer, was asked about her blog posts after the search, and when her phone was returned the Facebook app was open to her “friends” page, which had not been open when the phone was taken. The majority of the plaintiffs had their devices searched more than once. The court observed that the harm to the plaintiffs’ interests from the searches was ongoing, since the evidence indicated that data from the devices would be retained and “used to inform decisions on future searches.”

    The court noted that CBP and ICE had policies under which “basic” searches (searches of devices by hand) did not require any grounds, while “advanced” searches (involving hooking the device up to another computer) required reasonable suspicion. The court rejected the government’s argument that both of these searches were “routine” and thus subject to the “border search exception” to the 4th Amendment. Even a basic search could turn up extremely personal and private information on a device, including metadata, particularly given that even cell phones have built-in search functions. This being so, the court held that a standard of “reasonable suspicion” should apply to both, the standard being defined as “a showing of specific and articulable facts, considered with reasonable inferences drawn from those facts, that the electronic device contains contraband.” While granting the plaintiffs’ request for declaratory relief, the court declined their request to issue a nation-wide injunction against searches not based on reasonable suspicion.


Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

Copyright © 2024 The Canadian Technology Law Association, All rights reserved.