

News

  • 24 Oct 2019 10:56 AM | Anonymous

    Section 714.1 application for witness to testify by video conference denied

    With its decision in R v Musseau, the Newfoundland and Labrador Provincial Court provided guidance on the use of section 714.1 of the Criminal Code, which allows a witness to “give evidence by means of technology that permits the witness to testify elsewhere in Canada in the virtual presence of the parties and the court”. Finding that the preconditions for such an order had not been met, the trial judge dismissed the application, but provided guidance about the use of the section nonetheless.

    The trial was to occur in Corner Brook, Newfoundland, but the complainant was in New Brunswick and did not wish to appear personally, and so the Crown made a section 714.1 application concerning her appearance. The judge pointed out that the Crown’s application simply referred to the complainant in the case (the spouse of the accused, who was charged with assaulting her) appearing by “video conference”, without any description of the actual technology to be used. Section 714.1 requires that the technology make it possible for the witness to be in the “virtual presence” of the parties and the court, and without such evidence he was unable to assess the application. The judge noted that:

    [20]…section 714.1 does not, as does section 714.3 of the Criminal Code (evidence of a witness by audio link), specifically require the Court to consider “any potential prejudice to either of the parties caused by the fact that the witness would not be seen by them”. This is a recognition of the significant difference between virtual testimony and audio testimony. Having said this, prejudice to an accused person’s ability to make full answer and defence must always be considered.

    Further, the requirement for “virtual presence” was an important and meaningful one. An accused is entitled to face their accuser: although that was not to be taken literally, appearance by video technology had to accomplish the goals behind that principle. Video link technology was sufficiently sophisticated that that goal could be accomplished through its use. Indeed, the judge acknowledged at para 40 the possibility that “the use of video appearances are so common that the virtual presence requirement can be assumed or judicially acknowledged.” Nonetheless he decided that evidence concerning the nature of the technology actually to be used was necessary, and had to be provided either in writing as part of the application or in some other fashion.

    In addition it was necessary to have evidence about the location from which the witness would be testifying, such as whether it would be a courtroom or something similar: the judge needed to be able to consider “whether the witness will face the same level of solemnity offered by a courtroom and whether he or she will be as free from outside influences while testifying as she or he would be if they were to testify in person before the trial judge” (para 34).

    Summarizing the usefulness of section 714.1 (and quoting his own previous decision on the same issue in another case), the judge noted at para 42:

    Section 714.1 of the Criminal Code is designed, in part, to lessen the financial cost and inconvenience caused by requiring witnesses to travel to testify in person to an area in which they do not reside. In a country as large as Canada, it would be foolhardy to stymie the use of such appearances. The reality is that modern technology has made the requirement for personal presence, in many instances, unnecessary and superfluous. Our courts must not ignore the reality of modern technological developments in assessing a demand for the personal presence of a witness. 

    In this case, because of inadequacies in the form and content of the affidavits, and the failure therefore to establish the requirements of the section, the trial judge denied the application.

  • 10 Oct 2019 11:42 AM | Anonymous

    European Court determines that EU law does not provide for (but does not prohibit) global search result delisting

    The European Court of Justice (Grand Chamber) has ruled that Google does not have to apply the right to be forgotten globally. The proceeding stemmed from an imposition by the French data protection authority (CNIL) of a EUR 100,000 fine against Google for not deindexing search results across all its search properties. Google had implemented geoblocking that prevented access to certain search results within the European Union, but CNIL found this to be insufficient.

    The court proceeding, on the basis of the former Data Protection Directive, found:

    64 It follows that, currently, there is no obligation under EU law, for a search engine operator who grants a request for de-referencing made by a data subject, as the case may be, following an injunction from a supervisory or judicial authority of a Member State, to carry out such a de-referencing on all the versions of its search engine.

    Importantly, the court also noted that while EU law does not require orders with global effect, it does not prohibit them. Member states are free to adopt national laws that require global censorship.

  • 10 Oct 2019 11:41 AM | Anonymous

    Agreement will provide for mutual recognition of court orders and remove existing barriers

    The United States Department of Justice and the United Kingdom Home Office have announced that the two countries have signed a bilateral agreement “On Access to Electronic Data for the Purpose of Countering Serious Crime”. The Agreement is intended to be a bilateral agreement of the type anticipated under the CLOUD Act. Passed in March 2018, partially to address the litigation against Microsoft related to evidence in Ireland, the CLOUD Act authorizes the United States to enter into executive agreements with other countries that meet specific criteria related to rule of law, civil rights and privacy. Once laid before Congress and approved, the result is to lift each party’s legal barriers that prevent one country’s legal processes from being recognized in the other. Many countries have been seeking an alternative to the traditional channels of mutual legal assistance, which are seen as time consuming and cumbersome. In many cases, US service providers are prohibited from providing the content of communications except in response to a US court order, requiring the requesting law enforcement agency to go through MLAT and sometimes putting the service provider between a rock and a hard place.

    On the UK side of the equation, changes were made in UK law to permit this under the Crime (Overseas Production Orders) Act 2019, which received Royal Assent in February 2019. The Agreement will enter into force following a six-month Congressional review period mandated by the CLOUD Act, and the related review by the UK’s Parliament.

    Australia has already announced that it is seeking its own CLOUD Act executive agreement, and Canada is rumoured to be in similar discussions.

    It should be noted that initial reporting conflated this agreement with recent declarations regarding encryption and access to the cleartext of communications, leading some outlets to report that the agreement would require access to unencrypted communications, something that is specifically excluded from the CLOUD Act.

  • 10 Oct 2019 11:40 AM | Anonymous

    Interpretation of UK Data Protection Act needs to be grounded in the European Charter and applicable principles

    The United Kingdom Court of Appeal has given the green light to a class action against Google related to the collection of “browser generated information”, or “BGI”. In Richard Lloyd v Google LLC, the Court reversed the decision of the judge below, who had held that the putative representative plaintiff could not serve Google LLC outside of the UK in the proceeding and that Google could not be liable for damages without proof of pecuniary loss.

    The plaintiff alleged that Google created a “workaround” for functionality built into the Safari browser to block third party cookies, such as Google’s advertising cookie. This has already been the subject of various lawsuits and investigations in the United States.

    The Court of Appeal disagreed with Google’s argument that the representative plaintiff would need to prove causation and consequential damages according to section 13(1) of the Data Protection Act (“DPA”):

    13(1) An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.

    While this is a provision in UK law, the Court of Appeal conducted its analysis on the basis that it is a matter of EU law. If a purely domestic statutory construction were applied, section 13 would require proof of both a contravention of the law and actual consequent damage, whether pecuniary or non-pecuniary. What changed the analysis was the introduction in 2012 of the Charter of Fundamental Rights of the European Union, “addressed to… Member States only when they [were] implementing [EU] law” (para 41). Article 8 of the Charter, titled “Protection of personal data”, confirmed the protection of data rights under EU law:

    1. Everyone has the right to the protection of personal data concerning him or her.
    2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
    3. Compliance with these rules shall be subject to control by an independent authority.

    The Court of Appeal concluded that because the Data Protection Act was enacted to give effect to EU law, the provisions of the Charter undergird the rights under the DPA. From paragraph 41:

    The DPA was enacted to implement EU law, so the provisions of the Charter became applicable to data rights under the DPA after it was introduced.

    Interpretation of the DPA must take into account the Charter and similarly consider those terms as giving effect to the provisions set out in the Charter. As a result, what harm is compensable by damages is determined by EU law and not domestic UK law. And under EU law, the court considered (a) whether control over data is an asset that has value, and (b) whether there are compensatory “damages” within the legal definition of the word.

    With respect to the first question, the Court concluded that the data at issue was an asset that has value. Though UK law has hesitated to see data as property, EU law clearly affords it protection. That it could be monetized for advertising purposes reinforced the value it has. As a result, the Court concluded that an aggrieved person can recover damages under section 13 of the DPA and Article 23 of the EU Data Protection Directive.

    For the second question, the Court relied on Gulati v. MGN Limited [2015] EWHC 1482 (Ch) (Mann J), [2015] EWCA Civ 1291 (CA) (“Gulati”), which had held that damages are available without proof of pecuniary loss or distress for the tort of misuse of private information (“MPI”). Though MPI is a different legal basis for a claim, the Court of Appeal determined that Gulati was relevant and applicable by analogy, because the MPI tort and the DPA claim both derive from the same core rights to privacy. Further, the loss of control over telephone data in Gulati, caused by the defendant newspaper’s phone hacking, was held to be compensable damage, and therefore Google’s acquisition of browser generated information must also be compensable.

    The Court of Appeal also determined that the judge below erred in determining that the members of the class did not have the same interest under the relevant Civil Procedure Rules related to class and representative proceedings, and were not identifiable.

    The plaintiff was seeking to represent claimants whose BGI was taken by Google without their consent, in the same circumstances, and during the same period. This was not dependent on personal circumstances that would vary among individual claimants. The effect of this is to reduce the damages that can be claimed to the lowest common denominator.

    The Court of Appeal allowed the appeal, and the case can proceed.

  • 10 Oct 2019 11:39 AM | Anonymous

    Accused’s Facebook posts and texts considered “after the fact conduct,” supporting murder conviction

    In R. v. Café (2019 ONCA 775, no hyperlink available), the accused appealed his conviction at trial for first degree murder. Part of the evidence against him had been text messages that he had sent to friends after the victim’s death, in which he boasted about the murder, as well as Facebook posts containing photos of the victim’s body (taken before the police arrived at the scene) and rap lyrics relating to the death. The trial judge ruled that the texts, photos and rap lyrics could be considered by the jury as “after-the-fact conduct evidence” (which used to be called “evidence of consciousness of guilt”), specifically relating to whether the accused had planned the killing of the victim. Some of the materials were suggestive of a motive to kill the victim because he was a religious man, and suggested that the accused had therefore planned to kill the victim as a way to “mock God.” At trial the accused had testified that he did not plan the killing, but spontaneously did it in order to satisfy voices in his head which were instructing him to murder the victim.

    The accused argued on appeal that the trial judge had erred in this ruling, but the Court of Appeal dismissed this argument. This was not just evidence that implicated the accused in the murder generally, but evidence with specific probative value on the exact issues to which the jury had been directed: whether the accused had an established motive for violence toward the victim, and whether he had planned the killing. It had been open to the jury to conclude that the Facebook posts and texts were consistent with motive and the plan to “insult God,” rather than supporting the story regarding voices in the accused’s head. The trial judge had given specific instructions regarding the rap lyrics, which were full of disturbing images and profanity, and ensured that the jury knew to use them only to assess motive and planning, and not to reflect on the accused’s general character for violence.

  • 10 Oct 2019 11:38 AM | Anonymous

    Court applies Marakah, holds accused has expectation of privacy in wife’s surreptitiously-obtained copies of his data

    In R. v. King, the accused was charged with accessing and possessing child pornography images. The accused shared a house with his wife, from whom he was estranged. At one point she suspected him of marital infidelity and, having earlier observed him putting his passcode into his phone, memorized the code. When she later unlocked the phone and looked for evidence of his infidelity, she found images she thought were child pornography. Some time later she looked through his desktop computer and one of his tablets, and found similar images. She used her phone to take pictures of some of the suspicious images displayed on all three devices, saved these pictures to a USB key, and gave the USB to the police. An officer viewed the images on the USB key and determined they might be child pornography. The police obtained search warrants for the accused’s devices (in his house and car), and eventually seized 34 devices, 7 of which contained child pornography images. The accused made a pre-trial motion, to Judge Allan Fradsham of the Alberta Provincial Court, for a declaration that his protections against unreasonable search and seizure under s. 8 of the Charter had been violated.

    While the accused’s wife was not a state agent, Judge Fradsham nonetheless analyzed whether the accused had a reasonable expectation of privacy in the images vis-à-vis the state, and whether there had been a breach of s. 8, in accordance with the Supreme Court of Canada’s decision in R. v. Marakah. He held that, “in the case at bar which involves electronic data, when determining the true subject matter of the search, one looks beyond the physical object in which the data is stored and considers the data itself and whether the accused had a reasonable expectation of privacy in that data.” Here, while the object being searched was the USB key (when the police initially viewed it), the subject matter of the search was “the collection of images on the USB drive as taken from the screens of Mr. King’s electronic devices, and what it told them about Mr. King.” That is, the target of the search was personal information about the accused. Moreover, the accused obviously had a subjective expectation of privacy in the material, given that all of his devices were password-protected.

    As to whether the accused’s expectation of privacy was reasonable, Judge Fradsham first noted that the location of the search was of no practical use in the analysis, since the “place” was the USB key, of which the accused had been unaware. The subject matter of the search was clearly private information, since data on an accused’s computer was generally so (R. v. Morelli), and evidence of criminal activity was something that people would have a particular interest in keeping private (R. v. Patrick). The accused had had control over the subject matter of the search, and the fact that he lost that control was not fatal to the reasonableness of his expectation of privacy:

    [130] That said, control is not the exclusive consideration that informs the existence of a reasonable expectation of personal privacy. And there are exceptional cases where control is not necessary. Where a loss of control over the subject matter is involuntary, such as where a person is in police custody or the subject matter is stolen from the person by a third party, then a reasonable expectation of personal privacy may persist: see Stillman[6], at paras. 61-62 (privacy may persist in a tissue discarded while in police custody); R. v. Law, 2002 SCC 10 (CanLII), [2002] 1 S.C.R. 227, at para. 28 (privacy may persist in a safe stolen by a third party)...[quoting from Marakah, per Moldaver J]

    Turning to the search, the judge noted that since the accused had a reasonable expectation of privacy in the images, he had standing to assert his s. 8 rights. While the wife had consented to the search of the USB key, as a matter of law one person could not waive another’s Charter rights. Therefore, the initial search of the USB key had been warrantless and not authorized by law, and thus unreasonable. The subsequent warrants had depended upon information obtained from the first warrantless search, and thus those searches were also unreasonable. Accordingly, the searches had breached the accused’s rights under s. 8. Arguments regarding the exclusion of the evidence under s. 24(2) of the Charter were to be heard at an unspecified future date.

  • 26 Sep 2019 11:11 AM | Anonymous

    LinkedIn can’t restrict access to users’ publicly available information

    In hiQ Labs, Inc. v. LinkedIn Corporation, the United States Court of Appeals for the Ninth Circuit unanimously agreed to uphold the district court’s preliminary injunction, which stopped LinkedIn from restricting hiQ’s access to its users’ publicly available information.

    HiQ is a data analytics company founded in 2012 that uses automated bots to scrape information publicly posted by LinkedIn users to yield “people analytics”, which it then sells to clients. HiQ offers analytics that identify employees at the greatest risk of being recruited away from their current employer, as well as tools that help employers identify skill gaps in their own workforces.
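    The scraping described above can be sketched in a few lines. This is an illustrative toy, not hiQ’s actual pipeline: the markup, class names and fields below are invented for the example, and the parsing runs on a sample string rather than a live page.

    ```python
    from html.parser import HTMLParser

    # Toy parser that pulls "name" and "title" fields out of a public
    # profile page. The structure and class names here are hypothetical;
    # real profile pages are far more complex.
    class ProfileParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self._field = None   # class of the span we are currently inside
            self.profile = {}    # extracted fields

        def handle_starttag(self, tag, attrs):
            if tag == "span":
                cls = dict(attrs).get("class")
                if cls in ("name", "title"):
                    self._field = cls

        def handle_data(self, data):
            if self._field:
                self.profile[self._field] = data.strip()
                self._field = None

    # Stand-in for HTML a scraping bot would fetch from a public profile URL.
    sample_page = ('<html><body><span class="name">Jane Doe</span>'
                   '<span class="title">Data Engineer</span></body></html>')

    parser = ProfileParser()
    parser.feed(sample_page)
    print(parser.profile)  # {'name': 'Jane Doe', 'title': 'Data Engineer'}
    ```

    A production scraper would add fetching, rate limiting and error handling; the point here is only that publicly served HTML can be turned into structured “people analytics” records mechanically.
    
    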

    LinkedIn representatives had been attending conferences organized by HiQ, and indeed one LinkedIn employee received an “Impact Award” and spoke at the conference. This changed when LinkedIn began exploring developing its own ways to use the data from the profiles to generate new products.

    In May 2017, LinkedIn sent hiQ a cease-and-desist letter claiming that the company’s use of LinkedIn data violated the User Agreement and that, if hiQ persisted in using this data, it would violate the Computer Fraud and Abuse Act (CFAA), the Digital Millennium Copyright Act, California Penal Code §502(c), and the California common law of trespass. One week later, hiQ sought injunctive relief, which was granted by the district court. The court also ordered LinkedIn to remove any technical barriers to hiQ’s access to public profiles.

    The court engaged in an in-depth analysis of the meaning of the relevant CFAA provision cited by LinkedIn and concluded that the CFAA’s prohibition of accessing a computer “without authorization” is only violated when a person circumvents a computer’s generally applicable rules regarding access permissions. When a computer network permits public access to its data, then the access of this publicly available data would not constitute access “without authorization” under the CFAA. However, the court concluded that a state law trespass to chattels claim might still be applicable to the data-scraping of publicly available data in this case.

    In considering the injunction, the court noted the potential harm to hiQ, which was not just monetary injury, but the threat of going out of business. HiQ had been in the middle of a financing round when it received the cease and desist letter, and the threat of uncertainty created thereby caused the financing to stall and several employees to leave the company. HiQ would also have been unable to fulfill existing contracts without access to the LinkedIn data. That had to be balanced against the privacy interest retained by LinkedIn users who had chosen to make their profiles public, but the balance between those two interests favoured hiQ.

    The court also considered the public interest in granting the preliminary injunction, which focused mainly on non-parties’ interests. The court determined that, on balance, the public interest favours hiQ’s position. It held:

    We agree with the district court that giving companies like LinkedIn free rein to decide, on any basis, who can collect and use data – data that the companies do not own, that they otherwise make publicly available to viewers, and that the companies themselves collect and use – risks the possible creation of information monopolies that would disserve the public interest.

  • 26 Sep 2019 11:09 AM | Anonymous

    No immunity for blocking competitor’s programs

    The United States Court of Appeals for the Ninth Circuit has reversed a dismissal of a claim by one software provider against another, allowing it to proceed further in Enigma Software Group USA, LLC v. Malwarebytes, Inc. Both Enigma and Malwarebytes make software allowing users to detect and block content such as viruses, spyware or other unwanted content from their computers. As explained in the decision:

    Providers of computer security software help users identify and block malicious or threatening software, termed malware, from their computers. Each provider generates its own criteria to determine what software might threaten users. Defendant Malwarebytes programs its software to search for what it calls Potentially Unwanted Programs (“PUPs”). PUPs include, for example, what Malwarebytes describes as software that contains “obtrusive, misleading, or deceptive advertisements, branding or search practices.” Once Malwarebytes’s security software is purchased and installed on a user’s computer, it scans for PUPs, and according to Enigma’s complaint, if the user tries to download a program that Malwarebytes has determined to be a PUP, a pop-up alert warns the user of a security risk and advises the user to stop the download and block the potentially threatening content.

    After a revision to its PUP-detection criteria in 2016, Malwarebytes’s software started identifying Enigma’s software as a PUP. As a result, if a user attempted to download one of Enigma’s programs, the user would be alerted to a security risk and the program would be quarantined. Enigma argued that this was a “bad faith campaign of unfair competition” aimed at “deceiving consumers and interfering with [Enigma’s] customer relationships.”

    The central legal issue concerned the Communications Decency Act, which immunizes computer-software providers from liability for actions taken to help users block certain types of unwanted online material. The primary motivation behind the Act, which was passed in the 1990s, was to allow for programs which would, for example, allow parents to install software that prevented their children from accessing pornography. However, the relevant provision concerns not only that sort of material, but also includes a catchall phrase giving immunity for software which blocks content which is “otherwise objectionable”. In an earlier decision, Zango Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1173 (9th Cir. 2009), the Ninth Circuit Court of Appeals had decided that this phrase permits providers to block material that either the provider or the user considered objectionable, and Malwarebytes had successfully relied on that at the lower level. The Court of Appeals, however, concluded that this case was distinguishable from Zango.

    In Zango there was no dispute over the claim that the blocked material was objectionable, and so the meaning of that term had not been fully considered. The key difference here was that a company was relying on this provision in order to block the content of one of its competitors. In this case, the Court of Appeals determined that “otherwise objectionable” could not be taken to include software that the provider finds objectionable for anticompetitive reasons. They did not go so far as to find that it was limited only to material which was sexual or violent in nature, holding that “spam, malware and adware could fairly be placed close enough to harassing materials to at least be called ‘otherwise objectionable’.” However, the legislation generally was meant to protect competition, and so blocking for an anticompetitive purpose would not qualify.

    Malwarebytes argued that its reasons were not anticompetitive, and that its program:

    found Enigma’s programs “objectionable” for legitimate reasons based on the programs’ content. Malwarebytes asserts that Enigma’s programs, SpyHunter and RegHunter, use “deceptive tactics” to scare users into believing that they have to download Enigma’s programs to prevent their computers from being infected.

    The Court of Appeals effectively left that claim to be decided in the eventual action, since it was here only overturning the dismissal of Enigma’s suit. They concluded that because Enigma had alleged anticompetitive practices, its claim survived the motion to dismiss.

  • 26 Sep 2019 11:03 AM | Anonymous

    A Facebook ‘Like’ button makes you a joint controller in the EU

    The European Court of Justice recently addressed the use of a Facebook “Like” button by Fashion ID GmbH & Co. KG (“Fashion ID”), an online clothing retailer, on their website. Facebook Ireland Ltd. (“Facebook”) acted as an intervenor in the case.

    The ECJ ruled that the operator of a website that embeds a social plugin, such as Facebook’s ‘Like’ button, which causes the browser of a visitor to that website to request content from the provider of the plugin and transmit to that provider the personal data of the visitor, can be considered a controller within the meaning of Article 2(d) of the Data Protection Directive, and therefore subject to various obligations.

    Article 2(d) of that directive provides that:

    “controller” shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or [EU] laws or regulations, the controller or the specific criteria for his nomination may be designated by national or [EU] law.

    Any visitor to the Fashion ID website had their personal data transmitted to Facebook due to the inclusion of the ‘Like’ button on the site. This transmission of data occurred, without notice to the visitor, regardless of whether that visitor was a member of Facebook or had clicked on the ‘Like’ button.
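    The mechanism at issue can be sketched as follows. Any third-party resource embedded in a page (an iframe, script or image) causes the visitor’s browser to issue a request to that third party, carrying at minimum the visitor’s IP address and, via the Referer header, the page being viewed, before the visitor clicks anything. The domains and markup below are hypothetical stand-ins for the retailer page and the plugin provider.

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    # Finds third-party embeds in a page and sketches the request the
    # visitor's browser would automatically send to each one.
    class EmbedFinder(HTMLParser):
        def __init__(self, page_host):
            super().__init__()
            self.page_host = page_host
            self.third_party_requests = []

        def handle_starttag(self, tag, attrs):
            src = dict(attrs).get("src")
            if tag in ("iframe", "script", "img") and src:
                host = urlparse(src).netloc
                if host and host != self.page_host:
                    # The browser sends this on page load, with no user action;
                    # the visitor's IP address travels with the request itself.
                    self.third_party_requests.append({
                        "url": src,
                        "headers": {"Referer": f"https://{self.page_host}/"},
                    })

    # Hypothetical retailer page embedding a social plugin from another host.
    page = '<html><body><iframe src="https://plugin.example.com/like"></iframe></body></html>'
    finder = EmbedFinder("shop.example.com")
    finder.feed(page)
    print(finder.third_party_requests)
    ```

    This is why the transmission happens whether or not the visitor is a member of the plugin provider’s service: it is a consequence of how browsers load embedded resources, not of any interaction with the button.
    
    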

    However, the court held that joint liability does not imply equal responsibility. When operators are involved at different stages of the processing of personal data and to different degrees, the liability to be imposed on the various controllers will be assessed according to the specific circumstances of the case. The court concluded that a further investigation was needed to determine the degree of liability for each of Fashion ID and Facebook.

  • 26 Sep 2019 11:01 AM | Anonymous

    De-referencing requirement limited to Member States of European Union, not worldwide

    In Google LLC v Commission nationale de l’informatique et des libertés (CNIL), the Court of Justice of the European Union has sided with Google rather than the Commission nationale de l'informatique et des libertés (the French Data Protection Authority) over the issue of how far, geographically, Google’s obligation to comply with orders to de-reference links extended. The CNIL had ordered Google to remove results from all versions of the search engine, whatever the domain name extension, and imposed a penalty of €100,000 when Google failed to do so. Google pursued the issue, which therefore came to be decided by the Court of Justice of the European Union: that court concluded that Google was only required to de-reference results from versions of the search engine with domain names corresponding to the Member States of the European Union. However, although Google was not required to de-reference the results from all versions, it was also required to take steps to prevent or seriously discourage an internet user who searches from one of the Member States from gaining access to the de-referenced data via a version of the search engine outside the European Union.

    The Court noted that, following its decision in Google Spain and Google, a person did have the right in certain circumstances to have their data de-referenced from searches, which has come to be referred to as the “right to be forgotten”. That right permits individuals to assert their right to de-referencing against a search engine operator as long as that search engine operates in the territory of the European Union, whether or not the actual data processing takes place there. They also noted that directives and regulations had been made aimed at guaranteeing a high level of protection of personal data throughout the European Union, and that “a de-referencing carried out on all the versions of a search engine would meet that objective in full” (para 55). Indeed, they observed that the European Union legislature could create a rule saying that de-referencing had to take place worldwide:

    56 The internet is a global network without borders and search engines render the information and links contained in a list of results displayed following a search conducted on the basis of an individual’s name ubiquitous (see, to that effect, judgments of 13 May 2014, Google Spain and Google, C131/12, EU:C:2014:317, paragraph 80, and of 17 October 2017, Bolagsupplysningen and Ilsjan, C194/16, EU:C:2017:766, paragraph 48).

    57 In a globalised world, internet users’ access — including those outside the Union — to the referencing of a link referring to information regarding a person whose centre of interests is situated in the Union is thus likely to have immediate and substantial effects on that person within the Union itself.

    58 Such considerations are such as to justify the existence of a competence on the part of the EU legislature to lay down the obligation, for a search engine operator, to carry out, when granting a request for de-referencing made by such a person, a de-referencing on all the versions of its search engine.

    They went on to find, however, that no such rule had in fact been created. The Union, or indeed individual states, could create such a rule if they chose, but that was not the current state of the law. They acknowledged that “numerous third States do not recognise the right to de-referencing or have a different approach to that right” (para 59), and that

    60…the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality …Furthermore, the balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world.

    Accordingly, a de-referencing order did not apply to all versions of the search engine. It did apply to more than the search engine corresponding to the domain from which the search was made, however, and therefore was in force in all Member States. This was justified on the basis of regulations meant to “ensure a consistent and high level of protection throughout the European Union and to remove the obstacles to flows of personal data within the Union” (para 66).

    In part it seems that the decision was influenced by technological changes Google had already made to “steer” users to the appropriate search engine. The Court noted that:

    42 During the proceedings before the Court, Google explained that, following the bringing of the request for a preliminary ruling, it has implemented a new layout for the national versions of its search engine, in which the domain name entered by the internet user no longer determines the national version of the search engine accessed by that user. Thus, the internet user is now automatically directed to the national version of Google’s search engine that corresponds to the place from where he or she is presumed to be conducting the search, and the results of that search are displayed according to that place, which is determined by Google using a geo-location process.
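    The steering the Court describes in paragraph 42 amounts to a simple rule: the domain the user typed no longer matters, and only the geolocated country determines which national version is served. A minimal sketch, with an invented domain table standing in for Google’s actual geolocation process:

    ```python
    # Hypothetical mapping of geolocated country codes to national versions
    # of a search engine; the real table and geolocation logic are Google's.
    NATIONAL_VERSIONS = {"FR": "google.fr", "DE": "google.de", "CA": "google.ca"}
    DEFAULT_VERSION = "google.com"

    def national_version(country_code: str) -> str:
        """Return the search-engine domain serving the given country."""
        return NATIONAL_VERSIONS.get(country_code, DEFAULT_VERSION)

    def redirect_target(requested_domain: str, geolocated_country: str) -> str:
        # Per para 42: the domain entered by the user no longer determines
        # the version accessed; only the presumed location does.
        return national_version(geolocated_country)

    print(redirect_target("google.com", "FR"))  # google.fr
    ```

    Under this layout, a user in France who types google.com still lands on the French version, which is why de-referencing on the Member State versions, combined with such steering, can effectively keep de-referenced results away from EU users.
    
    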

  

Canadian Technology Law Association

1-189 Queen Street East

Toronto, ON M5A 1S2

contact@cantechlaw.ca

Copyright © 2023 The Canadian Technology Law Association, All rights reserved.