

News

  • 26 Sep 2019 11:11 AM | Deleted user

    LinkedIn can’t restrict access to users’ publicly available information

    In hiQ Labs, Inc. v. LinkedIn Corporation, the United States Court of Appeals for the Ninth Circuit unanimously upheld the district court’s preliminary injunction, which stopped LinkedIn from restricting hiQ’s access to its users’ publicly available information.

    HiQ is a data analytics company founded in 2012 that uses automated bots to scrape information publicly posted by LinkedIn users to yield “people analytics”, which it then sells to clients. HiQ offers analytics that identify employees at the greatest risk of being recruited away from their current employer, as well as tools that help employers identify skill gaps in their own workforces.
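    The mechanics of this kind of automated collection are straightforward. As a minimal sketch (the markup and field names below are hypothetical, not LinkedIn’s actual page structure), a scraping bot fetches a publicly accessible page and extracts structured fields from its HTML:

```python
from html.parser import HTMLParser

# Hypothetical public-profile markup; real pages are far more complex.
SAMPLE_PROFILE_HTML = """
<div class="profile">
  <span class="name">Jane Doe</span>
  <span class="title">Data Analyst</span>
  <span class="employer">Acme Corp</span>
</div>
"""

class ProfileScraper(HTMLParser):
    """Collects text from spans whose class attributes mark profile fields."""
    FIELDS = {"name", "title", "employer"}

    def __init__(self):
        super().__init__()
        self.record = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in self.FIELDS:
            self._current = cls

    def handle_data(self, data):
        if self._current:
            self.record[self._current] = data.strip()
            self._current = None

scraper = ProfileScraper()
scraper.feed(SAMPLE_PROFILE_HTML)
print(scraper.record)  # → {'name': 'Jane Doe', 'title': 'Data Analyst', 'employer': 'Acme Corp'}
```

    A production scraper would fetch many such pages over HTTP and aggregate the extracted records into the kind of “people analytics” hiQ sold; the parsing step is the same in kind.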

    LinkedIn representatives had been attending conferences organized by HiQ, and indeed one LinkedIn employee received an “Impact Award” and spoke at one of those conferences. This changed when LinkedIn began exploring ways of its own to use the data from the profiles to generate new products.

    In May 2017, LinkedIn sent hiQ a cease-and-desist letter claiming that the company’s use of LinkedIn data violated the User Agreement and that, if hiQ persisted in using this data, it would violate the Computer Fraud and Abuse Act (CFAA), the Digital Millennium Copyright Act, California Penal Code §502(c), and the California common law of trespass. One week later, hiQ sought injunctive relief, which was granted by the district court. The court also ordered LinkedIn to remove any technical barriers to hiQ’s access to public profiles.

    The court engaged in an in-depth analysis of the meaning of the relevant CFAA provision cited by LinkedIn and concluded that the CFAA’s prohibition on accessing a computer “without authorization” is violated only when a person circumvents a computer’s generally applicable rules regarding access permissions. When a computer network permits public access to its data, access to that publicly available data does not constitute access “without authorization” under the CFAA. However, the court concluded that a state law trespass to chattels claim might still be applicable to the data-scraping of publicly available data in this case.

    In considering the injunction, the court noted the potential harm to hiQ, which was not just monetary injury but the threat of going out of business. HiQ had been in the middle of a financing round when it received the cease-and-desist letter, and the resulting uncertainty caused the financing to stall and several employees to leave the company. HiQ would also have been unable to fulfill existing contracts without access to the LinkedIn data. That had to be balanced against the privacy interest retained by LinkedIn users who had chosen to make their profiles public, but the balance between those two interests favoured hiQ.

    The court also considered the public interest in granting the preliminary injunction, which focused mainly on non-parties’ interests. On balance, the court determined that the public interests favoured hiQ’s position. It held:

    We agree with the district court that giving companies like LinkedIn free rein to decide, on any basis, who can collect and use data – data that the companies do not own, that they otherwise make publicly available to viewers, and that the companies themselves collect and use – risks the possible creation of information monopolies that would disserve the public interest.

  • 26 Sep 2019 11:09 AM | Deleted user

    No immunity for blocking competitor’s programs

    The United States Court of Appeals for the Ninth Circuit has reversed a dismissal of a claim by one software provider against another, allowing it to proceed further in Enigma Software Group USA, LLC v. Malwarebytes, Inc. Both Enigma and Malwarebytes make software allowing users to detect and block content such as viruses, spyware or other unwanted content from their computers. As explained in the decision:

    Providers of computer security software help users identify and block malicious or threatening software, termed malware, from their computers. Each provider generates its own criteria to determine what software might threaten users. Defendant Malwarebytes programs its software to search for what it calls Potentially Unwanted Programs (“PUPs”). PUPs include, for example, what Malwarebytes describes as software that contains “obtrusive, misleading, or deceptive advertisements, branding or search practices.” Once Malwarebytes’s security software is purchased and installed on a user’s computer, it scans for PUPs, and according to Enigma’s complaint, if the user tries to download a program that Malwarebytes has determined to be a PUP, a pop-up alert warns the user of a security risk and advises the user to stop the download and block the potentially threatening content.

    After a revision to its PUP-detection criteria in 2016, Malwarebytes’s software started identifying Enigma’s software as a PUP. As a result, if a user attempted to download one of Enigma’s programs, the user would be alerted to a security risk and the program would be quarantined. Enigma argued that this was a “bad faith campaign of unfair competition” aimed at “deceiving consumers and interfering with [Enigma’s] customer relationships.”

    The central legal issue concerned the Communications Decency Act, which immunizes computer-software providers from liability for actions taken to help users block certain types of unwanted online material. The primary motivation behind the Act, which was passed in the 1990s, was to allow for programs which would, for example, allow parents to install software that prevented their children from accessing pornography. However, the relevant provision concerns not only that sort of material, but also includes a catchall phrase giving immunity for software which blocks content which is “otherwise objectionable”. In an earlier decision, Zango Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1173 (9th Cir. 2009), the Ninth Circuit Court of Appeals had decided that this phrase permits providers to block material that either the provider or the user considered objectionable, and Malwarebytes had successfully relied on that at the lower level. The Court of Appeals, however, concluded that this case was distinguishable from Zango.

    In Zango there was no dispute over the claim that the blocked material was objectionable, and so the meaning of that term had not been fully considered. The key difference here was that a company was relying on this provision in order to block the content of one of its competitors. In this case, the Court of Appeals determined that “otherwise objectionable” could not be taken to include software that the provider finds objectionable for anticompetitive reasons. They did not go so far as to find that it was limited only to material which was sexual or violent in nature, holding that “spam, malware and adware could fairly be placed close enough to harassing materials to at least be called ‘otherwise objectionable’.” However, the legislation was generally meant to protect competition, and so blocking driven by an anticompetitive purpose would not qualify.

    Malwarebytes argued that its reasons were not anticompetitive, and that its program:

    found Enigma’s programs “objectionable” for legitimate reasons based on the programs’ content. Malwarebytes asserts that Enigma’s programs, SpyHunter and RegHunter, use “deceptive tactics” to scare users into believing that they have to download Enigma’s programs to prevent their computers from being infected.

    The Court of Appeals effectively left that claim to be decided in the eventual action, since it was here only overturning the dismissal of Enigma’s suit. It concluded that because Enigma had alleged anticompetitive practices, its claim survived the motion to dismiss.

  • 26 Sep 2019 11:03 AM | Deleted user

    A Facebook ‘Like’ button makes you a joint controller in the EU

    The European Court of Justice recently addressed the use of a Facebook “Like” button by Fashion ID GmbH & Co. KG (“Fashion ID”), an online clothing retailer, on their website. Facebook Ireland Ltd. (“Facebook”) acted as an intervenor in the case.

    The ECJ ruled that the operator of a website that embeds a social plugin, such as Facebook’s ‘Like’ button, which causes the browser of a visitor to that website to request content from the provider of the plugin and transmit to that provider the personal data of the visitor, can be considered a controller within the meaning of Article 2(d) of the Data Protection Directive, and therefore subject to various obligations.

    Article 2(d) of that directive provides that:

    “controller” shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or [EU] laws or regulations, the controller or the specific criteria for his nomination may be designated by national or [EU] law.

    Any visitor to the Fashion ID website had their personal data transmitted to Facebook due to the inclusion of the ‘Like’ button on the site. This transmission of data occurred, without notice to the visitor, regardless of whether that visitor was a member of Facebook or had clicked on the ‘Like’ button.
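    Mechanically, the transmission happens because the embedded plugin causes the visitor’s browser to issue a request to the plugin provider’s servers, and that request alone carries identifying information. A simplified model (the domain, header values, and cookie here are hypothetical):

```python
# When a page embeds a third-party plugin, the browser fetches the plugin
# from the provider's servers. That request alone discloses, at minimum,
# the visitor's IP address, the page being visited (Referer header), and
# any cookie the provider previously set -- whether or not the visitor
# ever clicks the button.

def plugin_request(page_url, visitor_ip, provider_cookie=None):
    """Model the HTTP headers a browser sends to load an embedded plugin."""
    headers = {
        "Host": "plugin.example.com",   # hypothetical provider domain
        "Referer": page_url,            # reveals which page was visited
        "X-Forwarded-For": visitor_ip,  # stands in for the IP the server sees
    }
    if provider_cookie:
        # A provider cookie ties this visit to an existing account/profile.
        headers["Cookie"] = provider_cookie
    return headers

# Even a visitor with no Facebook account transmits page and IP:
print(plugin_request("https://shop.example.org/dress/42", "203.0.113.7"))
```

    The point the sketch makes is the one the court relied on: disclosure is a side effect of simply loading the page, not of any action by the visitor.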

    However, the court held that joint liability does not imply equal responsibility. When operators are involved at different stages of the processing of personal data and to different degrees, the liability to be imposed on the various controllers will be assessed according to the specific circumstances of the case. The court concluded that a further investigation was needed to determine the degree of liability for each of Fashion ID and Facebook.

  • 26 Sep 2019 11:01 AM | Deleted user

    De-referencing requirement limited to Member States of European Union, not worldwide

    In Google LLC v Commission nationale de l’informatique et des libertés (CNIL), the Court of Justice of the European Union has sided with Google rather than the Commission nationale de l'informatique et des libertés (the French Data Protection Authority) over the issue of how far, geographically, Google’s obligation to comply with orders to de-reference links extended. The CNIL had ordered Google to remove results from all versions of the search engine, whatever the domain name extension, and imposed a penalty of €100,000 when Google failed to do so. Google pursued the issue, which therefore came to be decided by the Court of Justice of the European Union: that court concluded that Google was only required to de-reference results from versions of the search engine with domain names corresponding to the Member States of the European Union. However, although Google was not required to de-reference the results from all versions, it was also required to take steps to prevent or seriously discourage an internet user who searches from one of the Member States from gaining access to the de-referenced data via a version of the search engine outside the European Union.

    The Court noted that, due to its decision in Google Spain and Google, a person did have the right in certain circumstances to have their data de-referenced from searches, which has come to be referred to as the “right to be forgotten”. That right permits individuals to assert their right to de-referencing against a search engine operator as long as that search engine operator has an establishment in the territory of the European Union, whether or not the actual data processing takes place there. They also noted that directives and regulations had been made aimed at guaranteeing a high level of protection of personal data throughout the European Union, and that “a de-referencing carried out on all the versions of a search engine would meet that objective in full” (para 55). Indeed, they observed that the European Union legislature could create a rule saying that de-referencing had to take place worldwide:

    56 The internet is a global network without borders and search engines render the information and links contained in a list of results displayed following a search conducted on the basis of an individual’s name ubiquitous (see, to that effect, judgments of 13 May 2014, Google Spain and Google, C-131/12, EU:C:2014:317, paragraph 80, and of 17 October 2017, Bolagsupplysningen and Ilsjan, C-194/16, EU:C:2017:766, paragraph 48).

    57 In a globalised world, internet users’ access — including those outside the Union — to the referencing of a link referring to information regarding a person whose centre of interests is situated in the Union is thus likely to have immediate and substantial effects on that person within the Union itself.

    58 Such considerations are such as to justify the existence of a competence on the part of the EU legislature to lay down the obligation, for a search engine operator, to carry out, when granting a request for de-referencing made by such a person, a de-referencing on all the versions of its search engine.

    They carried on to find, however, that no such rule had in fact been created. The Union, or indeed individual states, could create such a rule if they chose, but that was not the current state of the law. They acknowledged that “numerous third States do not recognise the right to de-referencing or have a different approach to that right” (para 59), and that

    60…the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality …Furthermore, the balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world.

    Accordingly, a de-referencing order did not apply to all versions of the search engine. It did apply to more than the search engine corresponding to the domain from which the search was made, however, and therefore was in force in all Member States. This was justified on the basis of regulations meant to “ensure a consistent and high level of protection throughout the European Union and to remove the obstacles to flows of personal data within the Union” (para 66).

    In part it seems that the decision was influenced by technological changes Google had already made to “steer” users to the appropriate national version of the search engine. The Court noted that:

    42 During the proceedings before the Court, Google explained that, following the bringing of the request for a preliminary ruling, it has implemented a new layout for the national versions of its search engine, in which the domain name entered by the internet user no longer determines the national version of the search engine accessed by that user. Thus, the internet user is now automatically directed to the national version of Google’s search engine that corresponds to the place from where he or she is presumed to be conducting the search, and the results of that search are displayed according to that place, which is determined by Google using a geo-location process.
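    The regime the Court arrived at — geolocation determining the national version served, with de-referencing applied across the EU versions but not elsewhere — can be modelled in a few lines. The country codes, URLs, and delisting mechanics below are purely illustrative:

```python
# Illustrative only: route a geolocated user to a national version and
# apply a de-referencing order on versions corresponding to EU Member
# States, leaving non-EU versions unaffected.

EU_MEMBERS = {"FR", "DE", "IE", "ES", "IT"}  # abbreviated, not the full list

def search_results(country, results, delisted_for_eu):
    """Return the result list as served to a user geolocated in `country`."""
    if country in EU_MEMBERS:
        # The de-referencing order reaches all EU national versions.
        return [r for r in results if r not in delisted_for_eu]
    return list(results)  # outside the EU, the order does not apply

results = ["https://example.org/old-article", "https://example.org/profile"]
delisted = {"https://example.org/old-article"}

print(search_results("FR", results, delisted))  # → ['https://example.org/profile']
print(search_results("US", results, delisted))  # full, unfiltered results
```

    The Court’s additional requirement — measures to “seriously discourage” EU users from reaching non-EU versions — is precisely what the geolocation-based steering described at para 42 supplies.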

  • 13 Sep 2019 11:08 AM | Deleted user

    Pilot project found to be compliant with European Convention on Human Rights and Data Protection Acts

    On September 4, 2019, the UK High Court released its decision in R (Bridges) v CCSWP and SSHD, which was a judicial review and test case of sorts to determine the lawfulness of the use of automated facial recognition (AFR) by the South Wales Police (SWP).

    AFR technology is an automated means by which images captured on CCTV cameras are processed to “isolate pictures of individual faces, extract information about facial features from those pictures, compare that information with the watchlist information, and indicate matches between faces captured through the CCTV recording and those held on the watchlist” (para 25). If there is no match, the captured facial images are discarded and flushed from the system within 24 hours, while the CCTV footage is retained for 30 days.

    The South Wales Police was carrying out a pilot project of AFR; one arrest made using a real-time AFR deployment was of a wanted domestic violence offender in May 2017.

    The principal objections against the use of AFR are rooted in the European Convention on Human Rights, which includes at Article 8:

    Article 8

    1. Everyone has the right to respect for his private and family life, his home and his correspondence.

    2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

    The Court specifically noted the impact of technology must be accounted for in its analysis:

    46. AFR permits a relatively mundane operation of human observation to be carried out much more quickly, efficiently and extensively. It is technology of the sort that must give pause for thought because of its potential to impact upon privacy rights. As the Grand Chamber of the Strasbourg Court said in S v. United Kingdom (2009) 48 EHRR 50 at [112]:

    “[T]he protection afforded by art.8 of the Convention would be unacceptably weakened if the use of modern scientific techniques in the criminal-justice system were allowed at any cost and without carefully balancing the potential benefits of the extensive use of such techniques against important private-life interests … any state claiming a pioneer role in the development of new technologies bears special responsibility for striking the right balance in this regard”.

    The court concluded that there was no violation of Article 8 as the police had a legal basis to use AFR. This is rooted in the common law powers and duties of the police to prevent and detect crimes, which “includes the use, retention and disclosure of imagery of individuals for the purposes of preventing and detecting crime”. As a result, police can make reasonable use of such imagery for the purpose of preventing or detecting crime. The extent of the police’s powers is construed broadly, and the court determined that the police use of the images was reasonable. The court also found that no new express statutory powers were necessary for the police to use AFR.

    The court also considered AFR in the context of the Data Protection Act 1998 and Data Protection Act 2018. Following the Article 8 analysis, the court concluded that the processing of personal data inherent in AFR is conducted lawfully and fairly. The court also concluded “[t]he processing is necessary for SWP’s legitimate interests taking account of the common law obligation to prevent and detect crime” (para 127).

    The applicant’s application for judicial review of the AFR program was dismissed.

  • 13 Sep 2019 11:06 AM | Deleted user

    Had emails or text messages included canned “signatures”, limitation period likely would have been extended

    In a case before the British Columbia Civil Resolution Tribunal, Lesko v. Solhjell, the applicant, Daniel Lesko, was seeking to recover four alleged loans made to the respondent, Annette Solhjell. The respondent disputed that the amounts were loaned, instead saying they were gifts.

    Principally what was at issue was whether the respondent had acknowledged the debt in such a way that would have extended the limitation period on collecting the amounts, as the action was commenced outside of the limitation period. Section 24 of the Limitation Act says that a limitation period may be extended if a person acknowledges liability before the expiry of the limitation period. Section 24(6) states that an acknowledgement of liability must be:

    a) in writing;

    b) signed by hand or by electronic signature as defined in the Electronic Transactions Act;

    c) made by the person making the acknowledgement; and,

    d) made to the person with the claim.

    The text message and email evidence was summarized by the tribunal:

    20. The evidence before me is that the applicant sent a text message to the respondent on January 17, 2017 requesting repayment of the money he was owed. On January 19, 2017, the respondent replied by email stating “I know I still owe you money. I have not forgotten”, and later “I can’t pay”. I find the applicant made a demand for payment on January 17, 2017, and the respondent failed to perform. In his additional submissions, the applicant again stated that he emailed the respondent in January 2017 to pay him back, but that he decided to give her more time. Therefore, I find January 17, 2017 was the date on which the applicant’s claim was discovered. According to the Limitation Act, the applicant was required to start his dispute before January 17, 2019.

    The applicant referred to subsequent emails and text messages up to August 24, 2017 that acknowledged the debt and argued that they triggered the extension provisions in the Limitation Act. The tribunal had no difficulty in determining that these were in writing and made by the respondent to the applicant. However, the issue turned on whether the “acknowledgement” was signed for the purposes of the Electronic Transactions Act.

    The Electronic Transactions Act defines an electronic signature as information in electronic form that a person has created or adopted in order to sign a record and that is in, attached to, or associated with the record. The tribunal followed Johal v. Nordio, in which the court stated that the Electronic Transactions Act focuses on whether the sender of the electronic message intended to create a signature. In that case, the email in question included a relatively standard email “signature”: name, position and contact information, which the court found satisfied the requirements of section 24(6) of the Limitation Act.

    No such information was included in the email or text messages exchanged in this case, so the tribunal concluded that the limitation period expired on January 17, 2019, roughly two weeks prior to commencement of the action on February 1, 2019. The respondent was not required to repay the amounts.

  • 13 Sep 2019 11:04 AM | Deleted user

    Judge invokes proportionality, efficiency, common sense in wide-ranging electronic discovery motion

    In Natural Trade Ltd. v. MYL Ltd., Justice Marchand of the British Columbia Supreme Court heard a set of discovery motions in a hotly-contested action in which a group of companies alleged that confidential information and customer lists had been misappropriated by a former employee, who conspired with others to use this information to compete with the plaintiff companies. The plaintiffs sought disclosure of various records, including electronic communications and metadata, while the defendants sought similar materials in a counter-motion.

    The discovery demands were numerous and the resulting order was so long it had to be appended to the judgment as a separate document. Of interest was Marchand J.’s review of the general principles underpinning discovery, in particular proportionality and efficiency. He also provided a tidy capsule summary of electronic discovery principles, and an application of them to cloud storage facilities:

    [34] The word “document” is defined broadly in Rule 1-1 to include “a photograph, film, recording of sound, any record of a permanent or semi-permanent character and any information recorded or stored by means of any device.” While a computer hard drive is typically considered to be a receptacle for the storage of documents akin to a filing cabinet, in certain circumstances the hard drive itself may be a “document” subject to production: Chadwick v. Canada (Attorney General), 2008 BCSC 851 (CanLII) at paras. 17-22.

    [35] In Sonepar Canada Inc. v. Thompson, 2016 BCSC 1195 (CanLII), Pearlman J. dealt with an application for the defendants to disclose electronic documents, including metadata. The case involved allegations that the defendants, including former employees of the plaintiff, conspired to misappropriate the plaintiff’s confidential pricing information and unlawfully interfere with the plaintiff’s contractual relations. At para. 46, Pearlman J. summarized the principles applicable to the production of electronic documents from a computer hard drive or other electronic devices as follows:

    1. A computer hard drive is the digital equivalent of a filing cabinet or documentary repository. While the court may order the production of relevant documents stored on the hard drive, Rule 71 does not authorize the court to permit the requesting party to embark upon an unrestricted search of the hard drive.
    2. A computer hard drive as a document storage facility is generally not producible in specie. A hard drive will often contain large amounts of information that is irrelevant to the matters in issue in the litigation, including information that is private and confidential and that ought not to be produced.
    3. In exceptional circumstances where there is evidence that a party is intentionally deleting relevant and material information, or is otherwise deliberately thwarting the discovery process, the court may order the production of the entire hard drive for inspection by an expert. There must be strong evidence, rather than mere speculation, that one party is not disclosing or is deleting relevant information in order to justify such an order.
    4. On an application for production of electronic records from a computer hard drive, the court must balance the objective of proper disclosure with the targeted party's privacy rights.
    5. Proportionality is a factor for the court to consider in determining the scope of the search parameters.
    6. Metadata consisting of information stored on the software which shows the use of the computer, such as dates when a file was opened, last accessed, or sent to another device, is information recorded or stored by means of a device and is therefore a document within the meaning of the Rules.
    7. As a general rule, the producing party's counsel should have the first opportunity to vet for relevance and privilege any information produced from the hard drive or from any other source of electronic data containing private information unrelated to the lawsuit.
    8. To that, I would add that there may be circumstances where it will be appropriate to depart from the general rule, for example, where there is evidence that the producing party has deliberately destroyed records or is likely to interfere with or thwart the production of relevant information.

    [Citations omitted.]

    [36] The plaintiffs submit that the same principles applicable to the production of a computer hard drive also apply to a cloud-based document repository. While the plaintiffs have cited no authority related to “the cloud”, I can see no principled reason to disagree. The cloud is just another place where parties may store their documents. The use of the cloud should not enable parties to shelter relevant documents from production.
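    Principle 6 above — that dates of creation, access, and modification are themselves “information recorded or stored by means of a device”, and so a document — reflects metadata every operating system keeps for every file. A small illustration using Python’s standard library (the file content is hypothetical):

```python
import os
import tempfile
from datetime import datetime, timezone

# Every file carries system-recorded metadata alongside its content --
# the kind of "information recorded or stored by means of a device"
# that principle 6 treats as producible in its own right.

with tempfile.NamedTemporaryFile(delete=False, suffix=".txt") as f:
    f.write(b"draft customer list")  # hypothetical content
    path = f.name

st = os.stat(path)
metadata = {
    "size_bytes": st.st_size,
    "last_modified": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc),
    "last_accessed": datetime.fromtimestamp(st.st_atime, tz=timezone.utc),
}
print(metadata)
os.unlink(path)  # clean up the temporary file
```

    In litigation over misappropriated customer lists, timestamps of this sort can show when a file was copied or opened, which is why metadata production was in issue.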

  • 13 Sep 2019 11:03 AM | Deleted user

    Court considers use of data extraction tool and warrant requirements regarding searches of cell phones

    In R. v. Sinnappillai, Boswell J of the Ontario Superior Court of Justice presided over the trial of the accused, who was charged with luring a minor for the purposes of prostitution and sexual touching. The charges in fact resulted from a sting operation in which a police officer communicated with the accused via text, indicating that “she” was a 15-year-old girl and setting up a meeting at a hotel room. The “customer” texted the officer several times in the hours and minutes leading up to the meeting time, and the accused arrived at the hotel room at the appointed time. The police obtained a warrant to search the accused’s Samsung phone to see if it contained the matching set of text messages that the officer’s phone did. In order to do the search, a tech crimes officer hooked the phone up to a “universal forensic extraction device” (UFED), which lacked the ability to extract only a portion of the phone’s contents. The officer followed his standard practice, which was to extract all of the phone’s data (essentially creating a mirror image of the phone) and then search the extracted data. The search revealed the text conversation and a matching call record. The tech crimes officer stored the mirror image on a secure police server.

    At a later preliminary hearing, the police surmised that the accused might raise an inability to communicate in English as part of his defence. They obtained a second warrant and did a second search which turned up additional text messages. The mirror image was left on the police server, but the police did not report the results of either search as required by the warrants and the Criminal Code. The accused raised a number of arguments that neither the warrants nor s. 8 of the Charter had been complied with sufficiently, and asked that the data be excluded.

    Early in the judgment, Boswell J observed:

    [12] Almost everyone is by now familiar with the amazing array of functions that modern cell phones are capable of performing. Less people – though I suspect the number is growing – are alive to the fact that, commensurate with those functions, cell phones are repositories of immense amounts of core biographical data. They can reveal, amongst other things, where one has been and when; who one has talked to, when, for how long and sometimes what was said; who one’s associates are; and what websites one frequents. Cell phones are meticulous and reliable record-keepers.

    [13] Law enforcement agencies are well aware that cell phones are frequently rich sources of evidence. Indeed, I would say anecdotally, that cell phone data now features prominently in a significant percentage of criminal cases tried before Ontario courts. It is certainly the central feature of this case.

    The judge dismissed the accused’s argument that the manner in which the searches had proceeded amounted to an unauthorized search of the entire phone. The police already had the phone in their possession after the initial seizure, and “[c]opying the hard drive before searching it gave them nothing new and did not impact on Mr. Sinnappillai’s privacy interests.” The protocol followed, which involved imaging all of the data and then searching that data, was reasonable and Charter-compliant, given that it was impliedly authorized by the justice of the peace who issued the warrant and was necessary to preserve the integrity of the data. The police were also not obliged to destroy the mirror image after the first search, as the recent decision of the Ontario Court of Appeal in R. v. Nurse dictated that the police were permitted to retain and search the phone indefinitely so long as the warrant so permitted:

    To conclude that Mr. MacLean [the tech crime officer] should have created a second mirrored image and searched that, as opposed to searching the image he had already created, would be to ignore common sense and practicality. Moreover, it would do nothing to advance Mr. Sinnappillai’s privacy concerns, since presumably the content of the second mirror image would be identical to the content of the first mirror image. Creating a second duplication would be nothing but a redundant ‘make-work’ task for Mr. MacLean.
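    The image-then-search protocol the court endorsed can be sketched in a few lines: a single forensic copy is made, its fidelity to the original is verified by cryptographic hash, and all searching is done against the copy (the phone contents and search terms below are hypothetical):

```python
import hashlib

def mirror_image(device_data: bytes) -> bytes:
    """Create a bit-for-bit copy of the extracted device contents."""
    return bytes(device_data)

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical extracted phone contents.
phone_data = b"...sms: meet at hotel room 4 pm...call-log: 555-0199..."

image = mirror_image(phone_data)

# Matching hashes prove the copy is faithful -- preserving the integrity
# of the data, which is why searches run on the image, not the original.
assert sha256(image) == sha256(phone_data)

def search_image(image: bytes, term: bytes) -> bool:
    """Search the mirror image for responsive material."""
    return term in image

print(search_image(image, b"hotel room"))  # → True
```

    The hash check is also why a second mirror image would be redundant, as the court observed: a faithful copy of the same source is, by construction, identical to the first.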

    However, Justice Boswell agreed with the accused’s argument that the police had failed to report the results of the searches to the issuing justice as required by the warrants and the Code. The Crown argued that the police’s report to the justice upon having seized the phone was sufficient, but “significantly higher privacy interests are engaged once the police begin to look in the phone and seize data.” As an earlier case had held:

As was subsequently held by the Supreme Court of Canada in Vu, the privacy interest in the data contained on a computer or similar device is subject to a separate level or layer of privacy protection from the seizure of the device itself. Treating supervision of the seized computer as a physical item as comparable to supervision of the data seized from the computers and USB keys is inconsistent with the concerns expressed in cases such as Vu and R. v. Morelli, 2010 SCC 8 (CanLII), [2010] 1 S.C.R. 253. Consequently, I am of the view that failure to make a report to a justice in relation to the execution of the October 18, 2013 warrant constitutes a violation of s. 8 of the Charter.

    Accordingly, there had been a breach of s. 8. However, Justice Boswell declined to exclude the evidence under s. 24(2) of the Charter. While the breaches were serious, there was no evidence of systemic police misconduct, and the law on the obligation to report back the results of cell phone searches was not entirely settled. The impact on the accused’s privacy was minimal, given that the police had obtained warrants for both searches and both the searches and seizures had been lawful. The evidence was reliable and important to the Crown’s case. Accordingly, the motion to exclude was dismissed.

  • 22 Aug 2019 11:05 AM | Deleted user

Court finds jurisdiction over tort claims grounded in defendant company’s e-commerce activities

In Vahle et al v. Global Work and Travel Co., Inc., two Ontario sisters had gone to Thailand on a work/travel excursion brokered by the BC-based defendant company. They were injured in an accident while driving a scooter to their employment, and one sister was killed. The plaintiffs (the surviving sister and her parents) brought a number of claims in Ontario against Global, including negligence, negligent misrepresentation, and breach of contract and fiduciary obligations. All dealings between the sisters and Global had been conducted through Global’s website. Global argued on a motion that the Ontario Superior Court did not have jurisdiction simpliciter and was not the most convenient forum for the actions.

Justice Paul Schabas of the Ontario Superior Court first noted that the Supreme Court of Canada’s jurisprudence on jurisdiction simpliciter includes a number of connecting factors which can presumptively establish jurisdiction (though each presumption is rebuttable). On the first such factor, that the contract was made in Ontario, the plaintiffs argued that the contract between the sisters and Global had been formed there, but Justice Schabas applied the usual rule that a contract is made in the jurisdiction where the offeror receives notice of the offeree’s acceptance; the “postal acceptance” exception to that rule does not apply to faxes or emails. On the second presumptive factor, that a tort was committed in Ontario, the plaintiffs had pleaded that substantial negligent misrepresentations were made to them in Ontario by Global via its website, and that Global’s negligence included its failure to notify the parents after the accident and other steps it should have taken in Ontario. Accordingly, this presumptive factor was made out.

    Justice Schabas then turned to the contentious factor of whether Global was “carrying on a business” in Ontario. It was not sufficient that Global had a website that was accessible in Ontario, or that its online ads and promotions were received in Ontario via Google and Facebook. However:

    [37] Here, the defendant engages in e-commerce in Ontario by contacting and contracting with travellers in Ontario. It does more than simply receive inquiries from clients based in Ontario. It also places foreign vacationers coming to Canada in Ontario through its working holiday program in Canada and works with businesses here who may employ those individuals. Global thus actively works with clients and businesses in Ontario.

[38] Since Van Breda, the Supreme Court has upheld orders of the British Columbia courts in which they exercised jurisdiction over Google even though it did not have servers or offices, or any employees in the province: Google Inc. v. Equustek Solutions Inc., 2017 SCC 34 (CanLII), [2017] 1 SCR 824, affirming 2015 BCCA 265 (CanLII). In that case, Google did, however, gather information and data in British Columbia which led to targeted search results and targeted advertising towards residents of British Columbia.

    [39] Global’s connections to Ontario are at least comparable to Google’s connections with British Columbia. Once contacted by Ontario residents, Global actively solicits their business, as it did here in what the plaintiffs describe as aggressive sales tactics towards them by email and telephone. Global knew that it was contracting with Ontario residents, and assured its clients that the contracts would be governed by “Canadian law” which may be understood by clients to mean the law of the province in which they are located. Accordingly, the plaintiffs have met the burden of demonstrating a good arguable case that Global carries on business in Ontario and there is a presumption of jurisdiction.

    The defendant argued that the internet-based connection was weak and rebutted the presumptive factors. Justice Schabas held:

    In this case, however, Global knew it was dealing with clients in Ontario. It frequently dealt with travellers coming from Ontario, as well as those wishing to have a working holiday in Ontario. Global’s representatives were aware that any representations they made to Nora and Marija were received by them in Ontario. Further, providing that “Canadian law” would apply [via the website] suggests that Global contemplated that it may be subject to Ontario law.

    The connecting factors were made out, and Justice Schabas further found that Ontario was not forum non conveniens. In the result, the motion to dismiss for want of jurisdiction was dismissed.

  • 22 Aug 2019 11:05 AM | Deleted user

    California court invalidates bail condition allowing random searches of devices and social media accounts

In the case of In Re Ricardo P., the appellant was a juvenile offender and ward of the court who had pleaded guilty to two counts of felony burglary and was placed on probation. He had admitted to using marijuana and told a probation officer that he had stopped using it since being apprehended for the burglaries, as it interfered with his ability to think clearly. The juvenile court imposed a condition that he “[s]ubmit . . . electronics including passwords under [his] control to search by Probation Officer or peace office[r] with or without a search warrant at any time of day or night.” The court overruled the appellant’s objection that this condition was unrelated to the offences he had committed, stating that monitoring the appellant’s drug usage was an important part of probation, and that it was not unusual for young people to “brag about their marijuana usage or drug usage, particularly their marijuana usage, by posting on the Internet, showing pictures of themselves with paraphernalia, or smoking marijuana.” This made the condition an appropriate part of the overall probation program, in the court’s view.

The case eventually proceeded to the California Supreme Court, which applied its test for when a probation condition may be held invalid. It noted that the condition had no relation to the crimes committed, given that there was no indication that the burglaries had anything to do with the use of electronic devices. Also, the condition related to conduct which was not itself criminal. The case turned, in the court’s view, on the third prong of its test, which asks whether the condition “requires or forbids conduct which is not reasonably related to future criminality.” Noting that the entire point of the condition had been to monitor whether the youth was “communicating about drugs or with people associated with drugs,” the court held that the condition was invalid because “the burden it imposes on Ricardo’s privacy is substantially disproportionate to the countervailing interests of furthering his rehabilitation and protecting society.” There was no evidence in the record that the youth had actually been using drugs when he committed the burglaries, nor was there any evidence that he had used electronic devices to plan, discuss or commit burglaries. While the condition need not relate to particular past offences by the individual, there had to be a degree of proportionality between the burden imposed by the condition and the overall goal of preventing future criminality. Such proportionality was lacking here, as the condition “significantly burdens privacy interests”:

    If we were to find this record sufficient to sustain the probation condition at issue, it is difficult to conceive of any case in which a comparable condition could not be imposed, especially given the constant and pervasive use of electronic devices and social media by juveniles today. In virtually every case, one could hypothesize that monitoring a probationer’s electronic devices and social media might deter or prevent future criminal conduct. For example, an electronics search condition could be imposed on a defendant convicted of carrying an unregistered concealed weapon on the ground that text messages, e-mails, or online photos could reveal evidence that the defendant possesses contraband or is participating in a gang. … Indeed, whatever crime a juvenile might have committed, it could be said that juveniles may use electronic devices and social media to mention or brag about their illicit activities.

    The court commented that the prosecution’s argument that this ruling would prevent the imposition of commonly-used search conditions, such as those for person, property and residence, was flawed:

    the Attorney General’s argument does not sufficiently take into account the potentially greater breadth of searches of electronic devices compared to traditional property or residence searches. (See Riley, supra, 573 U.S. at pp. 396– 397 [“[A] cell phone search would typically expose to the government far more than the most exhaustive search of a house: A phone not only contains in digital form many sensitive records previously found in the home; it also contains a broad array of private information never found in a home in any form — unless the phone is.”].) As noted, the electronics search condition here is expansive in its scope: It allows probation officers to remotely access Ricardo’s e-mail, text and voicemail messages, photos, and online accounts, including social media like Facebook and Twitter, at any time. It would potentially even allow officers to monitor Ricardo’s text, phone, or video communications in real time. Further, the condition lacks any temporal limitations, permitting officers to access digital information that long predated the imposition of Ricardo’s probation.

    Accordingly, the condition was struck.

  

Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

contact@cantechlaw.ca

Copyright © 2024 The Canadian Technology Law Association, All rights reserved.