

News

  • 10 Oct 2019 11:40 AM | Deleted user

    Interpretation of UK Data Protection Act needs to be grounded in the European Charter and applicable principles

    The United Kingdom Court of Appeal has given the green light to a class action against Google related to the collection of “browser generated information”, or “BGI”. In Richard Lloyd v Google LLC, the Court reversed the decision of the judge below, who had held that the putative representative plaintiff could not serve Google LLC outside of the UK in the proceeding and that Google could not be liable for damages absent proof of pecuniary loss.

    The plaintiff alleges that Google created a “workaround” for functionality built into the Safari browser to block third-party cookies, such as Google’s advertising cookie. This workaround has already been the subject of various lawsuits and investigations in the United States.

    The Court of Appeal disagreed with Google’s argument that the representative plaintiff would need to prove causation and consequential damages under section 13(1) of the Data Protection Act (“DPA”):

    13(1) An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.

    While this is a provision of UK law, the Court of Appeal conducted its analysis on the basis that it is a matter of EU law. If a purely domestic statutory construction were applied, section 13 would require proof of both a contravention of the law and actual consequent damage, whether pecuniary or non-pecuniary. What changed the analysis was the introduction in 2012 of the Charter of Fundamental Rights of the European Union, “addressed to… Member States only when they [were] implementing [EU] law” (para 41). Article 8 of the Charter, titled “Protection of personal data”, confirmed the protection of data rights under EU law:

    1. Everyone has the right to the protection of personal data concerning him or her.
    2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
    3. Compliance with these rules shall be subject to control by an independent authority.

    The Court of Appeal concluded that because the Data Protection Act was enacted to give effect to EU law, the provisions of the Charter undergird the rights under the DPA. From paragraph 41:

    The DPA was enacted to implement EU law, so the provisions of the Charter became applicable to data rights under the DPA after it was introduced.

    Interpretation of the DPA must therefore take the Charter into account, reading the DPA’s terms as giving effect to the rights set out in the Charter. As a result, what harm is compensable by damages is determined by EU law and not domestic UK law. Under EU law, the court considered (a) whether control over data is an asset that has value, and (b) whether there are compensatory “damages” within the legal definition of the word.

    With respect to the first question, the Court concluded that the data at issue was an asset that has value. Though UK law has hesitated to treat data as property, EU law clearly affords it protection, and the fact that the data could be monetized for advertising purposes reinforced its value. As a result, the Court concluded that an aggrieved person can recover damages under section 13 of the DPA and Article 23 of the EU Data Protection Directive.

    For the second question, the Court relied on the case of Gulati v. MGN Limited [2015] EWHC 1482 (Ch) (Mann J), [2015] EWCA Civ 1291 (CA) (“Gulati”), which had held that damages are available without proof of pecuniary loss or distress for the tort of misuse of private information (“MPI”). Though MPI is a different legal basis for a claim, the Court of Appeal determined that Gulati was relevant and applicable by analogy because the MPI tort and the DPA claim both derive from the same core rights to privacy. Further, in Gulati the loss of control over telephone data caused by the defendant newspaper company’s phone hacking was held to be compensable damage, and therefore Google’s acquisition of browser generated information must also be compensable.

    The Court of Appeal also determined that the judge below erred in determining that the members of the class did not have the same interest under the relevant Civil Procedure Rules related to class and representative proceedings, and were not identifiable.

    The plaintiff was seeking to represent claimants whose BGI was taken by Google without their consent, in the same circumstances, and during the same period. This was not dependent on personal circumstances that would vary among individual claimants. The effect of this approach is to reduce the damages that can be claimed to the lowest common denominator.

    The Court of Appeal allowed the appeal, and the case can proceed.

  • 10 Oct 2019 11:39 AM | Deleted user

    Accused’s Facebook posts and texts considered “after the fact conduct,” supporting murder conviction

    In R. v. Café (2019 ONCA 775, no hyperlink available), the accused appealed his conviction at trial for first degree murder. Part of the evidence against him had been text messages that he had sent to friends after the victim’s death, in which he boasted about the murder, as well as Facebook posts containing photos of the victim’s body (taken before the police arrived at the scene) and rap lyrics relating to the death. The trial judge ruled that the texts, photos and rap lyrics could be considered by the jury as “after-the-fact conduct evidence” (which used to be called “evidence of consciousness of guilt”), specifically relating to whether the accused had planned the killing of the victim. Some of the materials were suggestive of a motive to kill the victim because he was a religious man, and suggested that the accused had therefore planned to kill the victim as a way to “mock God.” At trial the accused had testified that he did not plan the killing, but spontaneously did it in order to satisfy voices in his head which were instructing him to murder the victim.

    The accused argued on appeal that the trial judge had erred in this ruling, but the Court of Appeal dismissed this argument. This was not just evidence that implicated the accused in the murder generally, but evidence with specific probative value on the exact issues to which the jury had been directed—whether the accused had an established motive for violence toward the victim, and whether he had planned the killing. It had been open to the jury to conclude that the Facebook posts and texts were consistent with motive and the plan to “insult God,” rather than supporting the story regarding voices in the accused’s head. The trial judge had given specific instructions regarding the rap lyrics, which were full of disturbing images and profanity, and ensured that the jury knew to use them only to assess motive and planning, and not to reflect on the accused’s general character for violence.

  • 10 Oct 2019 11:38 AM | Deleted user

    Court applies Marakah, holds accused has expectation of privacy in wife’s surreptitiously-obtained copies of his data

    In R. v. King, the accused was charged with accessing and possessing child pornography images. The accused shared a house with his wife, from whom he was estranged. At one point she suspected him of marital infidelity and, having earlier observed him putting his passcode into his phone, memorized the code. When she later unlocked the phone and looked for evidence of his infidelity, she found images she thought were child pornography. Some time later she looked through his desktop computer and one of his tablets, and found similar images. She used her phone to take pictures of some of the suspicious images displayed on all three devices, saved these pictures to a USB key, and gave the USB key to the police. An officer viewed the images on the USB key and determined they might be child pornography. The police obtained search warrants for the accused’s devices (in his house and car), and eventually seized 34 devices, 7 of which contained child pornography images. The accused made a pre-trial motion, to Judge Allan Fradsham of the Alberta Provincial Court, for a declaration that his protections against unreasonable search and seizure under s. 8 of the Charter had been violated.

    While the accused’s wife was not a state agent, Judge Fradsham nonetheless analyzed whether the accused had a reasonable expectation of privacy in the images vis-à-vis the state, and whether there had been a breach of s. 8, in accordance with the Supreme Court of Canada’s decision in R. v. Marakah. He held that, “in the case at bar which involves electronic data, when determining the true subject matter of the search, one looks beyond the physical object in which the data is stored and considers the data itself and whether the accused had a reasonable expectation of privacy in that data.” Here, while the object being searched was the USB key (when the police initially viewed it), the subject matter of the search was “the collection of images on the USB drive as taken from the screens of Mr. King’s electronic devices, and what it told them about Mr. King.” That is, the target of the search was personal information about the accused. Moreover, the accused obviously had a subjective expectation of privacy in the material, given that all of his devices were password-protected.

    As to whether the accused’s expectation of privacy was reasonable, Judge Fradsham first noted that the location of the search was of no practical use in the analysis, since the “place” was the USB key, of which the accused had been unaware. The subject matter of the search was clearly private information, since data on an accused’s computer was generally so (R. v. Morelli), and evidence of criminal activity was something that people would have a particular interest in keeping private (R. v. Patrick). The accused had had control over the subject matter of the search, and the fact that he lost that control was not fatal to the reasonableness of his expectation of privacy:

    [130] That said, control is not the exclusive consideration that informs the existence of a reasonable expectation of personal privacy. And there are exceptional cases where control is not necessary. Where a loss of control over the subject matter is involuntary, such as where a person is in police custody or the subject matter is stolen from the person by a third party, then a reasonable expectation of personal privacy may persist: see Stillman, at paras. 61-62 (privacy may persist in a tissue discarded while in police custody); R. v. Law, 2002 SCC 10 (CanLII), [2002] 1 S.C.R. 227, at para. 28 (privacy may persist in a safe stolen by a third party)... [quoting from Marakah, per Moldaver J]

    Turning to the search, the judge noted that since the accused had a reasonable expectation of privacy in the images, he had standing to assert his s. 8 rights. While the wife had consented to the search of the USB key, as a matter of law one person could not waive another’s Charter rights. Therefore, the initial search of the USB key had been warrantless and not authorized by law, and thus unreasonable. The subsequent warrants had depended upon information obtained from the first warrantless search, and thus those searches were also unreasonable. Accordingly, the searches had breached the accused’s rights under s. 8. Arguments regarding the exclusion of the evidence under s. 24(2) of the Charter were to be heard at an unspecified future date.

  • 26 Sep 2019 11:11 AM | Deleted user

    LinkedIn can’t restrict access to users’ publicly available information

    In hiQ Labs, Inc. v. LinkedIn Corporation, the United States Court of Appeals for the Ninth Circuit unanimously upheld the district court’s preliminary injunction, which stopped LinkedIn from restricting hiQ’s access to its users’ publicly available information.

    HiQ is a data analytics company founded in 2012 that uses automated bots to scrape information publicly posted by LinkedIn users to yield “people analytics”, which it then sells to clients. HiQ offers analytics that identify employees at the greatest risk of being recruited away from their current employer, as well as tools that help employers identify skill gaps in their own workforces.
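    To make the mechanics concrete, the following is a minimal, illustrative sketch of the general technique at issue (automated scraping of a publicly visible profile page). The URL, page structure and field names are hypothetical assumptions, not hiQ’s or LinkedIn’s actual endpoints or markup.

    # Illustrative only: a generic scraper for a hypothetical public profile page.
    # Neither the URL nor the CSS selectors correspond to any real LinkedIn resource.
    import requests
    from bs4 import BeautifulSoup

    PUBLIC_PROFILE_URL = "https://example.com/public-profile/jane-doe"  # hypothetical

    html = requests.get(PUBLIC_PROFILE_URL, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Collect the publicly displayed fields into a structured record for analysis.
    profile = {
        "name": soup.select_one("h1.profile-name").get_text(strip=True),
        "headline": soup.select_one("p.profile-headline").get_text(strip=True),
        "skills": [li.get_text(strip=True) for li in soup.select("ul.skills li")],
    }
    print(profile)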

    LinkedIn representatives had been attending conferences organized by hiQ, and indeed one LinkedIn employee received an “Impact Award” and spoke at one such conference. This changed when LinkedIn began exploring ways of its own to use profile data to generate new products.

    In May 2017, LinkedIn sent hiQ a cease-and-desist letter claiming that hiQ’s use of LinkedIn data violated the User Agreement and that, if hiQ persisted in using this data, it would violate the Computer Fraud and Abuse Act (CFAA), the Digital Millennium Copyright Act, California Penal Code §502(c), and the California common law of trespass. One week later, hiQ sought injunctive relief, which was granted by the district court. The court also ordered LinkedIn to remove any technical barriers to hiQ’s access to public profiles.

    The court engaged in an in-depth analysis of the meaning of the relevant CFAA provision cited by LinkedIn and concluded that the CFAA’s prohibition on accessing a computer “without authorization” is only violated when a person circumvents a computer’s generally applicable rules regarding access permissions. When a computer network permits public access to its data, access to that publicly available data does not constitute access “without authorization” under the CFAA. However, the court concluded that a state law trespass to chattels claim might still be applicable to the data-scraping of publicly available data in this case.

    In considering the injunction, the court noted the potential harm to hiQ, which was not just monetary injury but the threat of going out of business. HiQ had been in the middle of a financing round when it received the cease-and-desist letter, and the uncertainty it created caused the financing to stall and several employees to leave the company. HiQ would also have been unable to fulfill existing contracts without access to the LinkedIn data. That had to be balanced against the privacy interest retained by LinkedIn users who had chosen to make their profiles public, but the balance between those two interests favoured hiQ.

    The court also considered the public interest in granting the preliminary injunction, which focused mainly on non-parties’ interests. It determined that, on balance, the public interest favoured hiQ’s position, holding:

    We agree with the district court that giving companies like LinkedIn free rein to decide, on any basis, who can collect and use data – data that the companies do not own, that they otherwise make publicly available to viewers, and that the companies themselves collect and use – risks the possible creation of information monopolies that would disserve the public interest.

  • 26 Sep 2019 11:09 AM | Deleted user

    No immunity for blocking competitor’s programs

    The United States Court of Appeals for the Ninth Circuit has reversed a dismissal of a claim by one software provider against another, allowing it to proceed further in Enigma Software Group USA, LLC v. Malwarebytes, Inc. Both Enigma and Malwarebytes make software allowing users to detect and block content such as viruses, spyware or other unwanted content from their computers. As explained in the decision:

    Providers of computer security software help users identify and block malicious or threatening software, termed malware, from their computers. Each provider generates its own criteria to determine what software might threaten users. Defendant Malwarebytes programs its software to search for what it calls Potentially Unwanted Programs (“PUPs”). PUPs include, for example, what Malwarebytes describes as software that contains “obtrusive, misleading, or deceptive advertisements, branding or search practices.” Once Malwarebytes’s security software is purchased and installed on a user’s computer, it scans for PUPs, and according to Enigma’s complaint, if the user tries to download a program that Malwarebytes has determined to be a PUP, a pop-up alert warns the user of a security risk and advises the user to stop the download and block the potentially threatening content.

    After a revision to its PUP-detection criteria in 2016, Malwarebytes’s software started identifying Enigma’s software as a PUP. As a result, if a user attempted to download one of Enigma’s programs, the user would be alerted to a security risk and the program would be quarantined. Enigma argued that this was a “bad faith campaign of unfair competition” aimed at “deceiving consumers and interfering with [Enigma’s] customer relationships.”
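    As a rough illustration of the criteria-driven flagging the decision describes, the sketch below checks a downloaded program’s traits against a vendor-defined list of “potentially unwanted” characteristics. The criteria, trait names and alert text are invented for illustration and are not Malwarebytes’s actual detection logic.

    # Illustrative only: a toy, criteria-based PUP check, not any vendor's real engine.
    PUP_CRITERIA = {
        "obtrusive advertising",
        "misleading branding",
        "deceptive search practices",
    }  # each vendor defines its own criteria

    def is_pup(program_traits: set) -> bool:
        """Flag a program as a Potentially Unwanted Program if any trait matches the criteria."""
        return bool(program_traits & PUP_CRITERIA)

    if is_pup({"misleading branding"}):
        print("Warning: this download may be a potentially unwanted program. Block or quarantine?")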

    The central legal issue concerned the Communications Decency Act, which immunizes computer-software providers from liability for actions taken to help users block certain types of unwanted online material. The primary motivation behind the Act, which was passed in the 1990s, was to allow for programs which would, for example, allow parents to install software that prevented their children from accessing pornography. However, the relevant provision concerns not only that sort of material, but also includes a catchall phrase giving immunity for software which blocks content which is “otherwise objectionable”. In an earlier decision, Zango Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1173 (9th Cir. 2009), the Ninth Circuit Court of Appeals had decided that this phrase permits providers to block material that either the provider or the user considered objectionable, and Malwarebytes had successfully relied on that holding in the court below. The Court of Appeals, however, concluded that this case was distinguishable from Zango.

    In Zango there was no dispute over the claim that the blocked material was objectionable, and so the meaning of that term had not been fully considered. The key difference here was that a company was relying on this provision in order to block the content of one of its competitors. In this case, the Court of Appeals determined that “otherwise objectionable” could not be taken to include software that the provider finds objectionable for anticompetitive reasons. They did not go so far as to find that it was limited only to material which was sexual or violent in nature, holding that “spam, malware and adware could fairly be placed close enough to harassing materials to at least be called ‘otherwise objectionable’.” However, the legislation was generally meant to protect competition, and so blocking for an anti-competitive purpose would not attract immunity.

    Malwarebytes argued that its reasons were not anticompetitive, and that its program:

    found Enigma’s programs “objectionable” for legitimate reasons based on the programs’ content. Malwarebytes asserts that Enigma’s programs, SpyHunter and RegHunter, use “deceptive tactics” to scare users into believing that they have to download Enigma’s programs to prevent their computers from being infected.

    The Court of Appeals effectively left that claim to be decided at the eventual action, since it was here only overturning the dismissal of Enigma’s suit. They concluded that because Enigma had alleged anticompetitive practices, its claim survived the motion to dismiss.

  • 26 Sep 2019 11:03 AM | Deleted user

    A Facebook ‘Like’ button makes you a joint controller in the EU.

    The European Court of Justice recently addressed the use of a Facebook “Like” button by Fashion ID GmbH & Co. KG (“Fashion ID”), an online clothing retailer, on their website. Facebook Ireland Ltd. (“Facebook”) acted as an intervenor in the case.

    The ECJ ruled that where a website operator embeds a social plugin, such as Facebook’s ‘Like’ button, which causes the browser of a visitor to that website to request content from the provider of the plugin and to transmit the visitor’s personal data to that provider, the operator can be considered a controller within the meaning of Article 2(d) of the Data Protection Directive, and is therefore subject to various obligations.

    Article 2(d) of that directive provides that:

    “controller” shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or [EU] laws or regulations, the controller or the specific criteria for his nomination may be designated by national or [EU] law.

    Any visitor to the Fashion ID website had their personal data transmitted to Facebook because of the ‘Like’ button embedded on the site. This transmission of data occurred without notice to the visitor, and regardless of whether the visitor was a member of Facebook or had clicked on the ‘Like’ button.
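    As an illustration of why the mere embedding of the button matters, the sketch below approximates the kind of request a visitor’s browser makes automatically when a page loads a third-party plugin: the provider receives the address of the page being visited (via the Referer header) and any cookies the provider has previously set, whether or not the visitor ever clicks anything. The endpoint, header values and cookie name are hypothetical assumptions, not Facebook’s actual ones.

    # Illustrative only: simulating the automatic request a browser makes when a page
    # embeds a third-party social plugin. All names and values are hypothetical.
    import requests

    PLUGIN_URL = "https://plugin-provider.example/like-button.js"  # hypothetical endpoint

    response = requests.get(
        PLUGIN_URL,
        headers={
            # The Referer header discloses which page the visitor is viewing.
            "Referer": "https://fashion-retailer.example/product/123",
        },
        # Any cookie previously set by the provider is sent back, identifying the visitor.
        cookies={"provider_id": "previously-set-identifier"},
        timeout=10,
    )
    print(response.status_code)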

    The court found that Fashion ID and Facebook acted as joint controllers in respect of the collection and transmission of the visitors’ data. However, the court held that joint liability does not imply equal responsibility. When operators are involved at different stages of the processing of personal data and to different degrees, the liability to be imposed on the various controllers will be assessed according to the specific circumstances of the case. The court concluded that a further investigation was needed to determine the degree of liability for each of Fashion ID and Facebook.

  • 26 Sep 2019 11:01 AM | Deleted user

    De-referencing requirement limited to Member States of European Union, not worldwide

    In Google LLC v Commission nationale de l’informatique et des libertés (CNIL), the Court of Justice of the European Union has sided with Google rather than the Commission nationale de l'informatique et des libertés (the French Data Protection Authority) over the issue of how far, geographically, Google’s obligation to comply with orders to de-reference links extended. The CNIL had ordered Google to remove results from all versions of the search engine, whatever the domain name extension, and imposed a penalty of €100,000 when Google failed to do so. Google pursued the issue, which therefore came to be decided by the Court of Justice of the European Union: that court concluded that Google was only required to de-reference results from versions of the search engine with domain names corresponding to the Member States of the European Union. However, although Google was not required to de-reference the results from all versions, it was also required to take steps to prevent or seriously discourage an internet user who searches from one of the Member States from gaining access to the de-referenced data via a version of the search engine outside the European Union.

    The Court noted that, due to its decision in Google Spain and Google, a person did have the right in certain circumstances to have their data de-referenced from searches, which has come to be referred to as the “right to be forgotten”. That right permits individuals to assert their right to de-referencing against a search engine operator that has an establishment in the territory of the European Union, whether or not the actual data processing takes place there. They also noted that directives and regulations had been made aimed at guaranteeing a high level of protection of personal data throughout the European Union, and that “a de-referencing carried out on all the versions of a search engine would meet that objective in full” (para 55). Indeed, they observed that the European Union legislature could create a rule saying that de-referencing had to take place worldwide:

    56 The internet is a global network without borders and search engines render the information and links contained in a list of results displayed following a search conducted on the basis of an individual’s name ubiquitous (see, to that effect, judgments of 13 May 2014, Google Spain and Google, C-131/12, EU:C:2014:317, paragraph 80, and of 17 October 2017, Bolagsupplysningen and Ilsjan, C-194/16, EU:C:2017:766, paragraph 48).

    57 In a globalised world, internet users’ access — including those outside the Union — to the referencing of a link referring to information regarding a person whose centre of interests is situated in the Union is thus likely to have immediate and substantial effects on that person within the Union itself.

    58 Such considerations are such as to justify the existence of a competence on the part of the EU legislature to lay down the obligation, for a search engine operator, to carry out, when granting a request for de-referencing made by such a person, a de-referencing on all the versions of its search engine.

    They carried on to find, however, that no such rule had in fact been created. The Union, or indeed individual states, could create such a rule if they chose, but that was not the current state of the law. They acknowledged that “numerous third States do not recognise the right to de-referencing or have a different approach to that right” (para 59), and that

    60…the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality …Furthermore, the balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world.

    Accordingly, a de-referencing order did not apply to all versions of the search engine. It did apply to more than the search engine corresponding to the domain from which the search was made, however, and therefore was in force in all Member States. This was justified on the basis of regulations meant to “ensure a consistent and high level of protection throughout the European Union and to remove the obstacles to flows of personal data within the Union” (para 66).

    In part, it seems that the decision was influenced by technological changes Google had already made to “steer” users to the appropriate national version of the search engine. The Court noted that:

    42 During the proceedings before the Court, Google explained that, following the bringing of the request for a preliminary ruling, it has implemented a new layout for the national versions of its search engine, in which the domain name entered by the internet user no longer determines the national version of the search engine accessed by that user. Thus, the internet user is now automatically directed to the national version of Google’s search engine that corresponds to the place from where he or she is presumed to be conducting the search, and the results of that search are displayed according to that place, which is determined by Google using a geo-location process.
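    A minimal sketch of the “steering” the Court describes might look like the following: the national version served depends on where the request appears to originate, not on the domain name typed. The country codes, domains and lookup are purely illustrative assumptions, not Google’s actual implementation.

    # Illustrative only: route a visitor to a national search version based on an
    # inferred country code (e.g. from a geo-location lookup). Mappings are hypothetical.
    NATIONAL_VERSIONS = {"FR": "google.fr", "DE": "google.de", "IE": "google.ie"}

    def national_version(inferred_country_code: str) -> str:
        """Return the national search domain for the visitor's inferred location."""
        return NATIONAL_VERSIONS.get(inferred_country_code, "google.com")

    # A request geolocated to France is served the French version, and therefore the
    # French (de-referenced) result set, even if the user typed a different domain.
    print(national_version("FR"))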

  • 13 Sep 2019 11:08 AM | Deleted user

    Pilot project found to be compliant with European Convention on Human Rights and Data Protection Acts

    On September 4, 2019, the UK High Court released its decision in R (Bridges) v CCSWP and SSHD, which was a judicial review and test case of sorts to determine the lawfulness of the use of automated facial recognition (AFR) by the South Wales Police (SWP).

    AFR technology is an automated means by which images captured on CCTV cameras are processed to “isolate pictures of individual faces, extract information about facial features from those pictures, compare that information with the watchlist information, and indicate matches between faces captured through the CCTV recording and those held on the watchlist” (para 25). If there is no match, the captured facial images are discarded and flushed from the system within 24 hours, while the CCTV footage is retained for 30 days.
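    For readers unfamiliar with the technology, the sketch below shows, under stated assumptions, the kind of pipeline the judgment describes: detect faces in a captured frame, extract facial-feature data, and compare it against a watchlist. It uses the open-source face_recognition library purely as an illustration and is not the system SWP actually deployed; the file names are hypothetical.

    # Illustrative only: a generic face-matching pipeline, not SWP's AFR system.
    import face_recognition

    # Watchlist: facial-feature encodings extracted from known images of persons of interest.
    watchlist = [
        face_recognition.face_encodings(face_recognition.load_image_file("suspect.jpg"))[0],
    ]

    # A single frame captured from CCTV footage.
    frame = face_recognition.load_image_file("cctv_frame.jpg")

    for encoding in face_recognition.face_encodings(frame):
        matches = face_recognition.compare_faces(watchlist, encoding, tolerance=0.6)
        if any(matches):
            print("Possible watchlist match - flag for human review")
        # Non-matching facial data would simply be discarded (within 24 hours in the pilot).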

    The South Wales Police was carrying out a pilot project of AFR; one arrest resulting from a real-time AFR deployment, in May 2017, was of a wanted domestic violence offender.

    The principal objections against the use of AFR are rooted in the European Convention on Human Rights, which includes at Article 8:

    Article 8

    1. Everyone has the right to respect for his private and family life, his home and his correspondence.

    2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

    The Court specifically noted that the impact of technology must be accounted for in its analysis:

    46. AFR permits a relatively mundane operation of human observation to be carried out much more quickly, efficiently and extensively. It is technology of the sort that must give pause for thought because of its potential to impact upon privacy rights. As the Grand Chamber of the Strasbourg Court said in S v. United Kingdom (2009) 48 EHRR 50 at [112]:

    “[T]he protection afforded by art.8 of the Convention would be unacceptably weakened if the use of modern scientific techniques in the criminal-justice system were allowed at any cost and without carefully balancing the potential benefits of the extensive use of such techniques against important private-life interests … any state claiming a pioneer role in the development of new technologies bears special responsibility for striking the right balance in this regard”.

    The court concluded that there was no violation of Article 8 as the police had a legal basis to use AFR. This is rooted in the common law powers and duties of the police to prevent and detect crime, which “includes the use, retention and disclosure of imagery of individuals for the purposes of preventing and detecting crime”. As a result, police can make reasonable use of such imagery for the purpose of preventing or detecting crime. The extent of the police’s powers is construed broadly, and the court determined that the police use of the images was reasonable. The court also found that no new express statutory powers are necessary for the police to use AFR.

    The court also considered AFR in the context of the Data Protection Act 1998 and Data Protection Act 2018. Following the Article 8 analysis, the court concluded that the processing of personal data inherent in AFR is conducted lawfully and fairly. The court also concluded “[t]he processing is necessary for SWP’s legitimate interests taking account of the common law obligation to prevent and detect crime” (para 127).

    The applicant’s application for judicial review of the AFR program was dismissed.

  • 13 Sep 2019 11:06 AM | Deleted user

    Had emails or text messages included canned “signatures”, limitation period likely would have been extended

    In a case before the British Columbia Civil Resolution Tribunal, Lesko v. Solhjell, the applicant, Daniel Lesko, was seeking to recover four alleged loans made to the respondent, Annette Solhjell. The respondent disputed that the amounts were loaned, instead saying they were gifts.

    The principal issue was whether the respondent had acknowledged the debt in a way that would have extended the limitation period, as the action was commenced outside of that period. Section 24 of the Limitation Act says that a limitation period may be extended if a person acknowledges liability before the expiry of the limitation period. Section 24(6) states that an acknowledgement of liability must be:

    a) in writing;

    b) signed by hand or by electronic signature as defined in the Electronic Transactions Act;

    c) made by the person making the acknowledgement; and,

    d) made to the person with the claim.

    The text message and email evidence was summarized by the tribunal:

    20. The evidence before me is that the applicant sent a text message to the respondent on January 17, 2017 requesting repayment of the money he was owed. On January 19, 2017, the respondent replied by email stating “I know I still owe you money. I have not forgotten”, and later “I can’t pay”. I find the applicant made a demand for payment on January 17, 2017, and the respondent failed to perform. In his additional submissions, the applicant again stated that he emailed the respondent in January 2017 to pay him back, but that he decided to give her more time. Therefore, I find January 17, 2017 was the date on which the applicant’s claim was discovered. According to the Limitation Act, the applicant was required to start his dispute before January 17, 2019.

    The applicant referred to subsequent emails and text messages, up to August 24, 2017, that acknowledged the debt and argued that they triggered the extension provisions in the Limitation Act. The tribunal had no difficulty in determining that these were in writing and made by the respondent to the applicant. However, the issue turned on whether the “acknowledgement” was signed for the purposes of the Electronic Transactions Act.

    The Electronic Transactions Act defines an electronic signature as information in electronic form that a person has created or adopted in order to sign a record, and that is in, attached to, or associated with the record. The tribunal followed Johal v. Nordio, in which the court stated that the Electronic Transactions Act focuses on whether the sender of the electronic message intended to create a signature. In that case, the email in question included a relatively standard email “signature” containing the sender’s name, position and contact information, which the court found satisfied the requirements of section 24(6) of the Limitation Act.

    No such information was included in the email or text messages exchanged in this case, so the tribunal concluded that the limitation period expired on January 17, 2019, roughly two weeks prior to commencement of the action on February 1, 2019. The respondent was not required to repay the amounts.

  • 13 Sep 2019 11:04 AM | Deleted user

    Judge invokes proportionality, efficiency, common sense in wide-ranging electronic discovery motion

    In Natural Trade Ltd. v. MYL Ltd., Justice Marchand of the British Columbia Supreme Court heard a set of discovery motions in a hotly contested action in which a group of companies alleged that confidential information and customer lists had been misappropriated by a former employee, who conspired with others to use this information to compete with the plaintiff companies. The plaintiffs sought disclosure of various records, including electronic communications and metadata, while the defendants sought similar materials in a counter-motion.

    The discovery demands were numerous and the resulting order was so long it had to be appended to the judgment as a separate document. Of interest was Marchand J.’s review of the general principles underpinning discovery, in particular proportionality and efficiency. He also provided a tidy capsule summary of electronic discovery principles, and an application of them to cloud storage facilities:

    [34] The word “document” is defined broadly in Rule 1-1 to include “a photograph, film, recording of sound, any record of a permanent or semi-permanent character and any information recorded or stored by means of any device.” While a computer hard drive is typically considered to be a receptacle for the storage of documents akin to a filing cabinet, in certain circumstances the hard drive itself may be a “document” subject to production: Chadwick v. Canada (Attorney General), 2008 BCSC 851 (CanLII) at paras. 17-22.

    [35] In Sonepar Canada Inc. v. Thompson, 2016 BCSC 1195 (CanLII), Pearlman J. dealt with an application for the defendants to disclose electronic documents, including metadata. The case involved allegations that the defendants, including former employees of the plaintiff, conspired to misappropriate the plaintiff’s confidential pricing information and unlawfully interfere with the plaintiff’s contractual relations. At para. 46, Pearlman J. summarized the principles applicable to the production of electronic documents from a computer hard drive or other electronic devices as follows:

    1. A computer hard drive is the digital equivalent of a filing cabinet or documentary repository. While the court may order the production of relevant documents stored on the hard drive, Rule 71 does not authorize the court to permit the requesting party to embark upon an unrestricted search of the hard drive.
    2. A computer hard drive as a document storage facility is generally not producible in specie. A hard drive will often contain large amounts of information that is irrelevant to the matters in issue in the litigation, including information that is private and confidential and that ought not to be produced.
    3. In exceptional circumstances where there is evidence that a party is intentionally deleting relevant and material information, or is otherwise deliberately thwarting the discovery process, the court may order the production of the entire hard drive for inspection by an expert. There must be strong evidence, rather than mere speculation, that one party is not disclosing or is deleting relevant information in order to justify such an order.
    4. On an application for production of electronic records from a computer hard drive, the court must balance the objective of proper disclosure with the targeted party's privacy rights.
    5. Proportionality is a factor for the court to consider in determining the scope of the search parameters.
    6. Metadata consisting of information stored on the software which shows the use of the computer, such as dates when a file was opened, last accessed, or sent to another device, is information recorded or stored by means of a device and is therefore a document within the meaning of the Rules.
    7. As a general rule, the producing party's counsel should have the first opportunity to vet for relevance and privilege any information produced from the hard drive or from any other source of electronic data containing private information unrelated to the lawsuit.
    8. To that, I would add that there may be circumstances where it will be appropriate to depart from the general rule, for example, where there is evidence that the producing party has deliberately destroyed records or is likely to interfere with or thwart the production of relevant information.

    [Citations omitted.]

    [36] The plaintiffs submit that the same principles applicable to the production of a computer hard drive also apply to a cloud-based document repository. While the plaintiffs have cited no authority related to “the cloud”, I can see no principled reason to disagree. The cloud is just another place where parties may store their documents. The use of the cloud should not enable parties to shelter relevant documents from production.
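    Principle 6 in the passage quoted above treats metadata such as access and modification times as a “document”. As a simple illustration (the file name is a hypothetical stand-in for a document at issue), this is the kind of operating-system metadata in question, distinct from the file’s contents:

    # Illustrative only: reading file-system timestamps with Python's standard library.
    import datetime
    import os

    st = os.stat("customer_list.xlsx")  # hypothetical document at issue
    print("last modified:", datetime.datetime.fromtimestamp(st.st_mtime))
    print("last accessed:", datetime.datetime.fromtimestamp(st.st_atime))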

  
