News

  • 4 Apr 2019 12:32 PM

    Court refuses to compel accused person to unlock phone

    In R. v. Shergill, Justice Philip Downes of the Ontario Court of Justice heard an application by the Crown which raised the thorny issue of whether accused persons can be compelled to “unlock” password-protected electronic devices. The accused was charged with a variety of sexual and child pornography offences and the police seized his cell phone incident to arrest. Realizing they had no technology that would allow them to open the phone without possibly destroying its contents, the police applied for a search warrant along with an “assistance order” under s. 487.02 of the Criminal Code. This section provides that a judge who issues a warrant “may order a person to provide assistance, if the person’s assistance may reasonably be considered to be required to give effect to the authorization or warrant.” Unusually, the application did not proceed ex parte and both the Crown and the accused made submissions.

    The Crown argued that the accused’s Charter rights were not engaged by the issuance of the assistance order, because it was a matter of “mere practicality.” Centrally, the principle against self-incrimination was not engaged because the order “only compels Mr. Shergill to provide access to, and not create, material the police are judicially authorized to examine, and because any self-incrimination concerns are met by the grant of use immunity over Mr. Shergill’s knowledge of the password.” The accused argued that the principle against self-incrimination was, indeed, engaged because the order would compel him to produce information that only existed in his mind, “for the purpose of assisting [the police] in obtaining potentially incriminating evidence against him”—thus violating his right to silence and the protection against self-incrimination under s. 7 of the Charter.

    Justice Downes sided with the accused. First, the principle against self-incrimination was engaged:

    The Crown suggests that Mr. Shergill’s s. 7 interests are “not engaged” or minimally compromised because what is sought to be compelled from him has no incriminatory value or effect. All the assistance order seeks is a password, the content of which is of no evidentiary value. Indeed, the Crown says that the police need not even be aware of the actual password as long as Mr. Shergill somehow unlocks the phone without actually touching it himself.

    In my view, however, the protection against self-incrimination can retain its force even where the content of the compelled communication is of no intrinsic evidentiary value. This is particularly so where, as here, that communication is essential to the state’s ability to access the evidence which they are “really after.” To paraphrase the Court in Reeves, to focus exclusively on the incriminatory potential of the password neglects the significant incriminatory effect that revealing the password has on Mr. Shergill. As the Supreme Court held in White:

    The protection afforded by the principle against self-incrimination does not vary based upon the relative importance of the self-incriminatory information sought to be used. If s. 7 is engaged by the circumstances surrounding the admission into evidence of a compelled statement, the concern with self-incrimination applies in relation to all of the information transmitted in the compelled statement. Section 7 is violated and that is the end of the analysis, subject to issues relating to s. 24(1) of the Charter. [footnotes omitted]

    Even more important, the judge found, was the right to silence under s. 7 of the Charter:

    In my view, the more significant principle of fundamental justice at stake is the right to silence. This right emerged as a component of the protection against self-incrimination in R. v. Hebert, in which McLachlin J. (as she then was) held:

    If the Charter guarantees against self-incrimination at trial are to be given their full effect, an effective right of choice as to whether to make a statement must exist at the pre-trial stage… the right to silence of a detained person under s. 7 of the Charter must be broad enough to accord to the detained person a free choice on the matter of whether to speak to the authorities or to remain silent.

    McLachlin J. also reaffirmed the Court’s prior holding that the right to silence was “a well-settled principle that has for generations been part of the basic tenets of our law.” 

    The “common theme” underlying the right to silence is “the idea that a person in the power of the state in the course of the criminal process has the right to choose whether to speak to the police or remain silent.” In tracing the history of the right, McLachlin J. referred to an “array of distinguished Canadian jurists who recognized the importance of the suspect’s freedom to choose whether to give a statement to the police or not” and described the essence of the right to silence as the “notion that the person whose freedom is placed in question by the judicial process must be given the choice of whether to speak to the authorities or not.” Finally, Hebert held that s. 7 provides “a positive right to make a free choice as to whether to remain silent or speak to the authorities.”

    The pre-trial right to silence is a concept which, as Iacobucci J. held in R. v. S. (R.J.), has been “elevated to the status of a constitutional right.” [footnotes omitted]

    The court also rejected the Crown’s argument that the accused’s rights were sufficiently protected by providing use immunity for his knowledge of the contents of his phone and the password:

    As a practical matter, without the assistance order, the evidence would never come into the hands of the police. In that sense it strikes me as somewhat artificial to say that the data on the Blackberry is evidence which, in the language of D’Amour, “exist[s] prior to, and independent of, any state compulsion.” Rather, it is evidence which, as far as the police are concerned, is only “brought into existence by the exercise of compulsion by the state.”

    [….]

    Fundamentally, realistically and in any practical sense, granting this application would amount to a court order that Mr. Shergill provide information which is potentially crucial to the success of any prosecution against him, and which could not be obtained without the compelled disclosure of what currently exists only in his mind. It strikes at the heart of what the Supreme Court has held to be a foundational tenet of Canadian criminal law, namely, that an accused person cannot be compelled to speak to the police and thereby assist them in gathering evidence against him or herself.

    In my view nothing short of full derivative use immunity could mitigate the s. 7 violation in this case.

    The Court then discussed some of the challenges that law enforcement are facing in light of new technology and encryption in particular. Though there is always a compelling public interest in the investigation and prosecution of crimes, the final balancing came down on the side of the accused's liberty interests under s. 7 of the Charter:

    I accept that the current digital landscape as it relates to effective law enforcement and the protection of privacy presents many challenges. It may be that a different approach to this issue is warranted, whether through legislative initiatives or modifications to what I see as jurisprudence which is binding on me. But on my best application of controlling authority, I am simply not persuaded that the order sought can issue without fundamentally breaching Mr. Shergill’s s. 7 liberty interests, a breach which would not be in accordance with the principle of fundamental justice which says that he has the right to remain silent in the investigative context.

    The search warrant was issued but the assistance order was denied.

  • 20 Mar 2019 1:33 PM

    Detailed questionnaire sent to at least 60 individuals

    According to Forbes Online, the Canada Revenue Agency (CRA) has begun to audit individuals with significant involvement in cryptocurrency holdings or transactions. 

    In 2017, the CRA established a dedicated cryptocurrency unit, said to be intended to build intelligence and conduct audits focussed on cryptocurrency risks as part of its Underground Economy Strategy.

    Forbes reports there are currently over 60 active cryptocurrency audits, each involving a very detailed questionnaire consisting of 54 questions with many sub-questions. Examples include:

    • Do you use any cryptocurrency mixing services and tumblers? (These can be used to intermix accounts and disguise the origin of funds.) If so, which services do you use?
    • Can you please provide us with the tracing history, along with all the cryptocurrency addresses you ‘mixed’? Why do you use these services?

    Further questions address whether the taxpayer has purchased or sold crypto-assets from or to private individuals and, if so, how they became aware of the sale opportunity and how the transaction was facilitated. The questionnaire goes so far as to ask the taxpayer to list all personal crypto-asset addresses that are not associated with their custodial wallet accounts. It further asks whether they have been a victim of crypto-theft, been involved in ICOs (initial coin offerings), or participated in crypto-mining.

    Not surprisingly, CRA has not disclosed what criteria it has used to target individuals with the questionnaires.

  • 20 Mar 2019 1:29 PM

    Calls for the creation of a “Digital Authority”, an increased onus to police user-generated content, and reduced market concentration

    The United Kingdom Select Committee on Communications has released a very interesting report on the regulation of the internet. Entitled Regulating in a Digital World, the report calls for a whole new era and methodology for regulating both online service providers and platforms, and the content that is made available through them. It calls for regulation based upon ten principles enunciated in the introduction:

    1. Parity: the same level of protection must be provided online as offline
    2. Accountability: processes must be in place to ensure individuals and organisations are held to account for their actions and policies
    3. Transparency: powerful businesses and organisations operating in the digital world must be open to scrutiny
    4. Openness: the internet must remain open to innovation and competition
    5. Privacy: to protect the privacy of individuals
    6. Ethical design: services must act in the interests of users and society
    7. Recognition of childhood: to protect the most vulnerable users of the internet
    8. Respect for human rights and equality: to safeguard the freedoms of expression and information online
    9. Education and awareness-raising: to enable people to navigate the digital world safely
    10. Democratic accountability, proportionality and evidence-based approach.

    At its heart, the Report calls for the creation of what it terms a Digital Authority that would advise government and regulators about the online environment:

    238. We recommend that a new body, which we call the Digital Authority, should be established to co-ordinate regulators in the digital world. We recommend that the Digital Authority should have the following functions:

    • to continually assess regulation in the digital world and make recommendations on where additional powers are necessary to fill gaps;
    • to establish an internal centre of expertise on digital trends which helps to scan the horizon for emerging risks and gaps in regulation;
    • to help regulators to implement the law effectively and in the public interest, in line with the 10 principles set out in this report;
    • to inform Parliament, the Government and public bodies of technological developments;
    • to provide a pool of expert investigators to be consulted by regulators for specific investigations;
    • to survey the public to identify how their attitudes to technology change over time, and to ensure that the concerns of the public are taken into account by regulators and policy-makers;
    • to raise awareness of issues connected to the digital world among the public;
    • to engage with the tech sector;
    • to ensure that human rights and children’s rights are upheld in the digital world;
    • to liaise with European and international bodies responsible for internet regulation.

    239. Policy-makers across different sectors have not responded adequately to changes in the digital world. The Digital Authority should be empowered to instruct regulators to address specific problems or areas. In cases where this is not possible because problems are not within the remit of any regulator, the Digital Authority should advise the Government and Parliament that new or strengthened legal powers are needed.

    The Report further critiques what it describes as the dominance of a small number of large companies in the digital space, calls for greater regulation and scrutiny of mergers, and challenges the paradigm of cross-subsidies that result in free services:

    15. Mergers and acquisitions should not allow large companies to become data monopolies. We recommend that in its review of competition law in the context of digital markets the Government should consider implementing a public-interest test for data-driven mergers and acquisitions. The public-interest standard would be the management, in the public interest and through competition law, of the accumulation of data. If necessary, the Competition and Markets Authority (CMA) could therefore intervene as it currently does in cases relevant to media plurality or national security. 

    16. The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. These services have been very popular and network effects have helped them to become dominant. Yet the nature of digital markets challenges traditional competition law. The meticulous ex post analyses that competition regulators use struggle to keep pace with the digital economy. The ability of platforms to cross-subsidise their products and services across markets to deliver them free or discounted to users challenges traditional understanding of the consumer welfare standard.

    With respect to problematic content, the Report proposes removing the safe harbours that currently protect platform providers and replacing them with an obligation to police, and be accountable for, user-generated content: “a duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. This would aim to create a culture of risk management at all stages of the design and delivery of services.”

    The Report also addresses children’s issues, difficult-to-understand terms of use, and privacy by default.

    How the Report will be received and translated into new regulation remains to be seen.

  • 20 Mar 2019 1:28 PM

    Media parties denied standing in Reference on right to be forgotten

    The Prothonotary of the Federal Court dismissed an application by various media parties to be involved in a Reference case in Reference re subsection 18.3(1) of the Federal Courts Act. A complainant brought a complaint to the Privacy Commissioner alleging that Google contravened the Personal Information Protection and Electronic Documents Act [PIPEDA] by continuing to prominently display links to news articles about him in search results corresponding to his name. The complainant alleges the articles in question are outdated and inaccurate and disclose sensitive and private information, and has requested that Google “de-index” him: that is, that it remove the articles from search results using his name, a process colloquially referred to as “the right to be forgotten”. Before investigating the complaint, the Privacy Commissioner sent two Reference questions to be determined by the Federal Court, and the media parties sought to take part in those Reference proceedings. The Prothonotary denied that request, but left open the possibility that the media parties could make the same application at a later point.

    The central issue was, in fact, what the issue was. There was no question that the underlying complaint to the Privacy Commissioner against Google raised “important and ground-breaking issues relating to online reputation, including whether a ‘right to be forgotten’ should be recognized in Canada, and if so, how such a right can be balanced with the Charter protected rights to freedom of expression and freedom of the press” (para 7). That was not, however, what the Reference was about. Rather, the Privacy Commissioner had asked only two questions:

    1. Does Google, in the operation of its search engine service, collect, use or disclose personal information in the course of commercial activities within the meaning of paragraph 4(1)(a) of PIPEDA when it indexes webpages and presents search results in response to searches of an individual’s name?
    2. Is the operation of Google’s search engine service excluded from the application of Part I of PIPEDA by virtue of paragraph 4(2)(c) of PIPEDA because it involves the collection, use or disclosure of personal information for journalistic, artistic or literary purposes and for no other purpose?

    Google, as a party to the Reference, had brought an application to expand the questions to include whether, if PIPEDA applies to the operation of its search engine and requires deindexing, it would contravene s 2(b) of the Charter. However, that application had not yet been heard at the time the media parties sought to be added (in part at their insistence).

    The media parties argued that the true issue underlying the reference was the Privacy Commissioner’s proposed regulation of internet searches and whether that offends the expression and press freedoms in the Charter, and therefore that they should be added as either parties or intervenors. The Prothonotary, however, held that the question which needed to be asked at this time was whether the media parties should be parties or intervenors on the Reference questions which actually existed at the time, and that there was no basis to grant that application.

    The media parties were not, for example, necessary for a full and effectual determination of all issues in the reference:

    [36] …What is at issue here is only whether Google is subject to or exempt from the application of Part 1 of PIPEDA in respect of how it collects, uses or discloses personal information in the operation of its search engine service when it presents search results in response to an individual’s name. 

    [37] The only direct result or effect of the answer to the questions raised in this reference will be to determine whether the OPC may proceed to investigate the complaint made against Google. The media parties are neither intended nor required to be bound by that result. The questions, as framed in the reference, can be effectually and completely settled without the presence of the media parties.

    Even if the scope of the Reference were expanded to include Google’s Charter question, the Prothonotary noted, she would be hesitant to conclude that the media parties were necessary:

    [39] The Court accepts, for the purpose of this argument, that deindexing may significantly affect the ability of content providers to reach their intended audience and for the public to access media content. Even as argued by the media parties, however, that is only the practical effect of the implementation of a recommendation to deindex. If deindexing is recommended or required, its implementation does not require that any action be compelled from or prohibited against the media parties, any other content provider, or any user of the search engine. The only action required would be by Google. Deindexing could and would produce its effect without the need for the other persons “affected” by it to be “bound” by the result of the proposed expanded reference. 

    [40] The impact of a potential deindexing requirement may be significant, but it does not affect the media parties any more directly than it would affect other content providers or those who use Google’s search engine service to gain access to content. To hold that the media parties are, by reason of the practical effect of a decision, necessary to the full and effectual determination of all issues would require that all others that are equally affected also be recognized as necessary parties and be made parties to the reference. 

    Similarly, the media parties were not found to have shown that they would add anything of value if they were allowed to intervene in the Reference:

    [47] It seems to the Court that the media parties have not given much thought to what they would have to contribute to the determination of the reference if it were limited to the questions as currently framed in the Notice of Application. Indeed, given that the issues currently framed in the reference focus on whether Google’s operation of its search engine is a commercial activity and the purpose for which Google collects, uses or discloses personal information, it is not clear what evidence the media parties might be able to contribute that might assist the Court’s determination. Asked at the hearing to state the position they might take in respect of each of the questions as framed in the reference, counsel for the media parties candidly admitted that they could not provide an answer, having not even seen the evidentiary record constituted by the Privacy Commissioner for the purpose of the reference.

    The Prothonotary did allow, however, that the media parties could apply for intervenor status again once Google’s application to expand the Reference had been decided, so long as “the proposed intervener’s contribution is well-defined and the Court is satisfied that this contribution is relevant, important and in the interest of justice” (para 50).

    [Editor’s note: one of the authors of this newsletter was counsel to one of the parties in this case, but was not involved in writing up this summary.]

  • 20 Mar 2019 1:27 PM

    Supreme Court concludes you don’t necessarily believe what people tell you on the Internet

    The Supreme Court of Canada struck down portions of the child-luring provisions with its decision in R v Morrison. The accused was charged with child luring for the purposes of inviting sexual touching of a person under age 16, contrary to ss 172.1(1)(b) and 152 of the Criminal Code. That section, the Court noted, 

    [40] …creates an essentially inchoate offence — that is, a preparatory crime that captures conduct intended to culminate in the commission of a completed offence: see Legare, at para. 25; R. v. Alicandro, 2009 ONCA 133, 95 O.R. (3d) 173, at para. 20, citing A. Ashworth, Principles of Criminal Law, (5th ed. 2006), at pp. 468-70. There is no requirement that the accused meet or even intend to meet with the other person with a view to committing any of the designated offences: see Legare, at para. 25. The offence reflects Parliament’s desire to “close the cyberspace door before the predator gets in to prey”: para. 25.

    The accused had posted an advertisement on Craigslist saying “Daddy looking for his little girl”, which was responded to by a police officer who posed as ‘Mia,’ a 14-year-old. Over the course of more than two months, Morrison invited ‘Mia’ to touch herself sexually and proposed they engage in sexual activity. As a result he was charged with the child luring offence, and defended himself on the basis that he believed he was communicating with an adult female engaged in role play who was determined to stay in character: as he said to the police when arrested, “on the internet, you don’t really know whether you’re speaking to a child or an adult”. However, that statute affected his ability to make that argument. 

    The offence in section 172.1(1)(b) requires proof that the communication took place with a person who is or who the accused believes is under the age of 16 years. Section 172.1(3) creates a presumption around that belief: 

    (3) Evidence that the person referred to in paragraph (1)(a), (b) or (c) was represented to the accused as being under the age of … sixteen years … is, in the absence of evidence to the contrary, proof that the accused believed that the person was under that age.

    In addition, section 172.1(4) imposes a further burden on the accused in that same regard: 

    (4) It is not a defence to a charge under paragraph (1)(a), (b) or (c) that the accused believed that the person referred to in that paragraph was at least … sixteen years …unless the accused took reasonable steps to ascertain the age of the person.

    Taken in combination, these provisions mean that if the other person is represented as being under 16, the accused is presumed to have believed that representation unless there is evidence to the contrary, and that evidence to the contrary must include the taking of reasonable steps to ascertain the other person’s age. As a result:

    [49]…the combined effect of subss. (3) and (4) is to create two pathways to conviction where the other person is represented as being underage to the accused: the Crown must prove that the accused either (1) believed the other person was underage or (2) failed to take reasonable steps to ascertain the other person’s age. In the context of child luring cases involving police sting operations, such as in Levigne, where it can be assumed that the undercover police officer posing as a child will represent that he or she is underage, these two pathways to conviction would have been available to the trier of fact.

    The accused challenged the constitutionality of both sections 172.1(3) and 172.1(4). In addition he challenged the constitutionality of the mandatory minimum sentence in section 172.1(2)(a). 

    The Court concluded that the presumption that a person who was told someone was under sixteen therefore believed that that person was under sixteen violated the Charter, specifically the presumption of innocence in section 11(d). It is well-established that the substitution of proof of one thing (the accused was told she was under sixteen) for proof of another thing (the accused believed she was under sixteen) will violate section 11(d) unless the connection from one to the other is “inexorable”: that is, “one that necessarily holds true in all cases” (para 53). That could not be said of a communication on the internet:

    [58] Deception and deliberate misrepresentations are commonplace on the Internet: see R. v. Pengelley, 2010 ONSC 5488, 261 C.C.C. (3d) 93, at para. 17. As the Court of Appeal in this case aptly put it:

    There is simply no expectation that representations made during internet conversations about sexual matters will be accurate or that a participant will be honest about his or her personal attributes, including age. Indeed, the expectation is quite the opposite, as true personal identities are often concealed in the course of online communication about sexual matters. [para. 60]

    Accordingly, the Court found that section 172.1(3) violated section 11(d), and went on to conclude that it could not be saved by section 1. It held that although the goal of protecting children from Internet predators was sufficiently important, the provision was not minimally impairing, because it would be sufficient to “rely on the prosecution’s ability to secure convictions by inviting the trier of fact to find, based on a logical, common sense inference drawn from the evidence, that the accused believed the other person was underage.”

    The majority did not, however, strike down the “reasonable steps” requirement in section 172.1(4), holding that as it required proof of “belief”, it set a high mens rea standard (which excluded recklessness) and did not violate section 7. (Justice Abella, in a concurring judgment, would also have concluded that section 172.1(4) was unconstitutional.) The majority also declined to decide whether the mandatory minimum sentence was unconstitutional or not, preferring to have that issue argued at the retrial they ordered. Justice Karakatsanis, writing a concurring judgment, would have struck down the mandatory minimum.

  • 7 Mar 2019 1:45 PM

    Settlement against Devumi LLC prohibits it from further sales of online accounts, followers, likes and endorsements 

    In a press release issued on January 30, 2019, the Attorney General of New York announced that following an investigation of Devumi LLC, it had entered into a settlement that puts an end to the company’s sales of fake activity from fake social media accounts. 

    The company was found to have engaged in the sale of fake followers, “likes” and views on Twitter, YouTube, LinkedIn, SoundCloud and Pinterest, using fake activity from false accounts: both bots (computer-operated accounts) and sock puppets (one person pretending to be multiple other people). From the Attorney General’s press release:

    Devumi LLC and related companies owned by German Calas, Jr. – including DisruptX Inc.; Social Bull Inc.; and Bytion Inc. (collectively, “Devumi”) – sold fake followers, “likes,” views and other forms of online endorsement and activity to users of social media platforms. Devumi supplied the fraudulent activity using bot and sock-puppet accounts. These bot and sock-puppet accounts falsely pretended to express the genuine positive opinions of real people. In some instances, Devumi supplied fake accounts that copied real people’s social media profiles without consent, including their name and picture.

    In addition, Devumi sold endorsements from social media influencers without disclosing that the influencers had been paid for their recommendations. This is especially troubling when considering that the opinions of influencers can have particularly strong influence over the reputation and sales for any product, company, service or person they endorse.

    These business practices deceived and attempted to affect the decision-making of social media audiences, including: other platform users’ decisions about what content merits their own attention; consumers’ decisions about what to buy; advertisers’ decisions about whom to sponsor; and the decisions by policymakers, voters, and journalists about which people and policies have public support.

    Devumi’s practices deceived some of the company’s own customers who mistakenly believed they were paying for authentic endorsements, while many other Devumi customers knew they were buying fake activity and endorsements. Devumi also deceived the social media platforms, which have policies prohibiting fake activity.

    Devumi ceased operations in 2018 after the AG launched her investigation, which caused a major decline in sales. The settlement did not address whether the customers did anything illegal. Further commentary can be found here: https://www.manatt.com/Insights/Newsletters/Advertising-Law/Fake-Likes,-Followers-Yield-Real-Legal-Action.

  • 7 Mar 2019 1:45 PM

    Review will examine choice and affordability in Canada’s mobile wireless market

    On February 28, 2019, the CRTC announced a review of the state of the mobile wireless market in Canada to determine whether action is required to improve choice and affordability for Canadians. They are seeking feedback from the public on a variety of matters, including whether mobile virtual network operators should have mandated access to national wireless provider networks until they can establish themselves in the market, and whether regulatory measures are needed to facilitate the deployment of 5G network infrastructure.

    At the same time, the CRTC issued its Notice of Consultation CRTC 2019-57, a notice of hearing for a public hearing on 13 January 2020 looking at competition in the retail wireless service market, the wholesale mobile service regulatory framework (roaming and mobile virtual network operator access), and the future of mobile wireless services in Canada (especially with regard to reducing barriers to infrastructure deployment). The notice poses 16 questions on which Canadians can provide feedback, all of which can be found here: https://crtc.gc.ca/eng/archive/2019/2019-57.htm.

  • 7 Mar 2019 1:44 PM | Deleted user

    German regulator finds that Facebook abused its position of dominance in the German market by improperly combining collected user data

    On February 6, 2019, the German Federal Cartel Office (the “Bundeskartellamt”) issued a decision finding that Facebook had abused its position as the dominant company in the market for social networks by collecting user data outside the Facebook platform (via services such as WhatsApp and Instagram, as well as third-party websites) and assigning this data to its users’ Facebook accounts. This amounted to unlawful collection and use of user data. The President of the Cartel Office commented:

    With regard to Facebook’s future data processing policy, we are carrying out what can be seen as an internal divestiture of Facebook’s data. In future, Facebook will no longer be allowed to force its users to agree to the practically unrestricted collection and assigning of non-Facebook data to their Facebook user accounts. The combination of data sources substantially contributed to the fact that Facebook was able to build a unique database for each individual user and thus to gain market power. In future, consumers can prevent Facebook from unrestrictedly collecting and using their data. The previous practice of combining all data in a Facebook user account, practically without any restriction, will now be subject to the voluntary consent given by the users. Voluntary consent means that the use of Facebook’s services must not be subject to the users’ consent to their data being collected and combined in this way. If users do not consent, Facebook may not exclude them from its services and must refrain from collecting and merging data from different sources.

    A good write-up on the decision and its context can be found here.

  • 7 Mar 2019 1:43 PM | Deleted user

    Law clerk affidavits attaching technical reports as exhibits insufficient to properly balance the interests of rights holders and internet users under the notice-and-notice copyright regime

    The Federal Court continues to settle outstanding questions related to the notice-and-notice regime and Norwich orders as a number of rights holders seek information about the customers of internet service providers. In this case, ME2 Productions, Inc. v. Doe, TekSavvy appealed a decision rendered by a Federal Court Prothonotary sitting as a case management judge (the “CMJ”), in which the CMJ granted the application for a Norwich order requiring TekSavvy to provide customer names and addresses to the applicant. The main issue on the appeal was whether the applicant had brought sufficient evidence – or the best evidence – to support its request for a Norwich order.

    The main evidence brought by the applicant was in the form of affidavits sworn by law clerks from the plaintiff’s lawyers’ law firm. Attached to one of those affidavits was a solemn declaration signed by an employee of a company called Maverickeye, which performs services to detect and log online sharing of content. That process and the resulting report is described by the Court:

    [10] The Plaintiffs are movie production companies that own copyright in several films. They are concerned about illegal distribution of their films, so they hired a company called Maverickeye UG (Maverickeye) to monitor the Internet for illegal sharing of their films. The Plaintiffs received a report from Maverickeye about suspected illegal downloads which included the date and time of the activity together with the Internet Protocol (IP) addresses associated with the downloads. These IP addresses were correlated to numbers that were held by several different Internet Service Providers (ISPs). The IP addresses at issue here had been allocated to TekSavvy – each ISP is allocated a bank of IP addresses, and these are available for search, so it was possible for Maverickeye to link a particular IP address with a specific ISP. 

    [11] It was not possible, however, for Maverickeye to link the particular IP address with the name of the individual customer. That information is held by the ISP and is not otherwise available for search by other parties. Pursuant to the notice and notice regime established by the Copyright Modernization Act (described more fully below), the Plaintiffs sent notices to TekSavvy alleging that its customers infringed their copyright and providing the relevant information as to the date and time of the alleged illegal activity, as well as the associated IP address. Under the regime, TekSavvy had to forward such notices to its subscribers, and to retain certain information about the subscribers and their activity.
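    The lookup described in paragraph 10 – mapping an observed IP address to the ISP holding the allocation block that contains it – can be sketched in a few lines of Python using the standard-library ipaddress module. The ISP names and CIDR ranges below are illustrative placeholders drawn from reserved documentation address space, not real allocations; an actual investigator would query a regional internet registry such as ARIN.

    ```python
    import ipaddress
    from typing import Optional

    # Hypothetical allocation table: ISP -> CIDR blocks it holds.
    # These ranges are reserved-for-documentation examples, not real allocations.
    ISP_BLOCKS = {
        "ExampleISP-A": ["198.51.100.0/24"],
        "ExampleISP-B": ["203.0.113.0/24"],
    }

    def isp_for(ip: str) -> Optional[str]:
        """Return the ISP whose allocated block contains the given IP, if any."""
        addr = ipaddress.ip_address(ip)
        for isp, blocks in ISP_BLOCKS.items():
            if any(addr in ipaddress.ip_network(block) for block in blocks):
                return isp
        return None

    print(isp_for("198.51.100.42"))  # falls within ExampleISP-A's block
    print(isp_for("192.0.2.1"))     # matches no block -> None
    ```

    As paragraph 11 notes, this is where the public trail ends: the mapping from an IP address to an individual subscriber exists only in the ISP’s own records, which is precisely why a Norwich order was sought.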

    On the appeal, TekSavvy attacked the evidence put forward by the plaintiffs, arguing that the law clerks’ affidavits were improper hearsay; that the affidavits did not comply with the requirements of the Federal Courts Rules because they did not set out the basis for the deponents’ belief in their truth; and, finally, that the declaration from the Maverickeye employee was expert evidence which was likewise not in compliance with the requirements set out in the Rules.

    [109] In this case, TekSavvy says that the evidence is inadequate: the key evidence that is found in the Arheidt Declaration is hearsay which cannot be subject to cross-examination since it is simply an exhibit to the affidavits. This is exactly the type of evidence which was rejected in BMG, which remains good law. Further, it contends that the affidavits provide no basis to explain why the law clerks adopt the Arheidt Declaration or believe it to be true and that it is opinion evidence which is not properly submitted in accordance with the Rules or jurisprudence. TekSavvy argues that before the Court can grant extraordinary relief such as a Norwich order, it must demand better evidence from copyright owners.

    The Court acknowledged that this type and configuration of evidence had been routinely accepted in similar proceedings as a basis for the issuance of Norwich orders. Nevertheless, it observed at paragraph 103 that “[t]his is a case of first impression: there is no binding authority directly on point nor any decisions of this Court, so I must step back and assess it from first principles, guided by the decision in Rogers Communications.”

    Ultimately, the Court found that the CMJ had erred in relying on the evidence put forward in this manner. Decision-making in cases such as these needs to balance the interests of the rights holders against the rights of the individual internet users who may be prejudiced by disclosures under Norwich orders.

    [121] First, the affidavits of the law clerks do not meet the requirements of Rule 81. To put it bluntly, there is absolutely no indication of the basis for the statement that the law clerks swearing the affidavits adopt the Arheidt Declaration or believe it to be true. Nor is there any explanation as to why the best evidence is not available. The fact that they work in the “copyright enforcement group” of the law firm representing the Plaintiffs may be relevant, in that they may have gained knowledge and expertise about how Maverickeye and its software function and why their reports should be viewed as accurate, but this is not explained. Nor is there any explanation as to why Mr. Arheidt did not swear an affidavit in these matters.

    [122] Second, the key evidence in support of the granting of the Norwich order is set out in the Arheidt Declaration, but it is simply an exhibit to an affidavit. It is therefore beyond the reach of cross-examination. 

    [123] This evidence is simply not good enough. I find that before granting a Norwich order better evidence must be filed. I have noted earlier the obligation on the copyright owner to provide as accurate information as is possible in the circumstances. To this I would add that the key evidence of alleged copyright infringements must normally be set out in an affidavit, sworn either by a person with direct personal knowledge of how the evidence was gathered, or by someone who can explain why such evidence is not available and why they have reason to believe the truth of the material they are submitting.

    The appeal was allowed and the Norwich order was quashed, but the plaintiffs were granted leave to apply again with better evidence in accord with the Court’s decision.

  • 7 Mar 2019 1:42 PM | Deleted user

    Alleged confidentiality agreement found not to have been completed because email attachment not opened

    In Safety First Contracting (1995) Ltd v. Murphy, Justice Bill Goodridge of the Supreme Court of Newfoundland and Labrador presided over the trial of an action for breach of a confidentiality and non-competition agreement, breach of confidence, and wrongful conversion of trade secrets. The plaintiff company, Safety First, alleged that the defendant, its former operations manager, had breached a confidentiality and non-competition agreement that was part of his terms of employment. It was common ground that the defendant had left his job with Safety First and accepted a virtually identical management position with the main competitor company, Hi-Vis, three weeks later. The defendant denied that there was any confidentiality/non-competition agreement and that he had copied or shared any trade secrets.

    The primary dispute was whether there had, indeed, been a confidentiality/non-competition agreement entered into by Safety First and Murphy at the point at which Murphy’s employment was finalized. Justice Goodridge found that while Safety First had intended there to be such an agreement, it was not completed. One of Safety First’s witnesses testified that there had been a written agreement executed, but that it had been “lost or stolen.” Justice Goodridge was dubious about this claim, particularly given that the company had a policy of duplicating all documents in a DropBox folder for viewing by the head office in Halifax; the company testified that this policy had changed because the St. John’s office had been given more administrative autonomy, but this was a change of which, in any event, Murphy was not aware. Safety First’s account of what matters were discussed with Murphy during and after his job interview was also affected by witness credibility problems and inconsistent evidence about administrative practices.

    Accordingly, the central factual matter was whether Murphy had received and viewed the agreement by email, as it was attached to an email sent to him by a Safety First official on March 5, 2015. That email read, in part, “Please find attached your acceptance and confidentiality letter,” but had two attachments: a letter containing the basic terms of employment and the confidentiality/non-competition agreement. Murphy testified that he saw only the first attachment, which he opened, signed and sent to the employer. He denied seeing the agreement. An account of the evidence followed:

    [27] There was evidence from computer experts, addressing whether the second letter (the confidentiality and non-competition agreement) was received and opened at Mr. Murphy’s inbox. None of the experts who testified could say with certainty whether both attachments on the March 5, 2015 email arrived at Mr. Murphy’s inbox. All experts agreed that anti-virus software does occasionally remove an attachment during transmission, and that the sender would have no way of knowing that this had occurred.

    [28] John Murphy, a technical specialist in IT security, testified that attachments to emails, or even single attachments on an email with multiple attachments, can be intercepted and not arrive at the intended recipient’s email account. Servers and desktop computers are designed to screen malware and spam. The screening technologies are not perfect and items can be quarantined or deleted without the sender or recipient knowing.

    [29] Craig H. Bennett, who manages Safety First’s email server, testified that he reviewed Mr. Murphy’s company email account -- patrick@safetyfirst-sfc.com -- and was of the view that neither attachment had been quarantined or deleted. He could see that a document of the same size, suggesting two separate documents, was received by Mr. Murphy on the Safety First email account. That evidence is not helpful because Mr. Murphy was not able to open attachments on the Safety First email account at that time. On the day the email was sent, Mr. Murphy was snowmobiling in a remote area of Newfoundland. He was using a hand held smart phone. He could reply to the email but he could not open the attachment. He forwarded the email to his two personal email accounts -- paddymurphy1272@gmail.com and paddy@circusorange.com. Mr. Bennett had no way of determining if both attachments were received at Mr. Murphy’s personal email accounts.

    Justice Goodridge found that Murphy was unaware of the second document, and that even if it did arrive in one of his personal email accounts, his failure to open it was “inadvertent but…not unreasonable. The matter referenced in the email from Safety First suggested that there was only a single letter attached, and there was no mention in the body of the email about non-competition. There was nothing that would signal to Mr. Murphy the need to search for a second attachment.” Accordingly, there was no agreement in place.

    Justice Goodridge went on to find that, even in the absence of an agreement, certain common law duties did apply to prevent employees from taking away confidential information. Again, however, the employer’s claim was hamstrung by questionable evidence about administrative practices regarding electronic document access. Corporate documents (including trade secrets) were kept on three different DropBox folders, and the company witnesses were inconsistent about which folders Murphy had access to; Murphy, for his part, testified that he had only ever accessed sub-folders relevant to his actual tasks, and the judge found this account to be more credible.

    The employer’s only direct evidence that Murphy had actually stolen trade secrets came from a witness, Little, who was retained by Murphy to do contract work at Hi-Vis after Murphy began working there. She claimed to have seen him open a Safety First safety manual from a thumb drive on his computer, and to have surreptitiously emailed herself a copy of that manual from Murphy’s computer (the email was produced in evidence). However, Justice Goodridge found Little to be “loose with the truth” and concluded that she had been engaged in some kind of self-appointed sting operation for Safety First, for whom she hoped to do contract work, and rejected her evidence. In the final result, the employer’s claims were dismissed.

  

Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

contact@cantechlaw.ca

Copyright © 2025 The Canadian Technology Law Association, All rights reserved.