
News

  • 8 Nov 2022 5:03 PM | CAN-TECH Law (Administrator)

    We had a wonderful turnout at the 2022 CAN-TECH Law Fall conference November 2-3, 2022 at the Sheraton Centre Toronto Hotel.

    Click here for the highlights:

    https://cantechlaw.wildapricot.org/2022cantechlawfallconferencehightlights

  • 16 Sep 2022 6:00 PM | CAN-TECH Law (Administrator)

    Recommendations are limited to improving processes outside of the courts

    The province of Nova Scotia has completed its legislated review of the Intimate Images and Cyber-protection Act, with feedback resulting in a dozen recommendations. The Act replaced the province’s Cyber-safety Act, which was struck down in its entirety in 2015 by the Nova Scotia Supreme Court in Crouch v Snell, which found the former law violated sections 2(b) and 7 of the Charter of Rights and Freedoms. 

    The original law created a CyberScan unit within the Department of Public Safety to assist victims of cyberbullying, which was continued in the new legislation. Individual recourse to the courts was continued, but through a much more arduous formal application process to the Nova Scotia Supreme Court (the ex parte procedures in the previous law were found to violate the principles of fundamental justice). All of the recommendations put forward by the review group focus on augmenting and improving the CyberScan unit, including:

    • improving legal, mental health and crisis supports for victims;

    • creating a centralized, trauma-informed referral process for victims seeking advice and support; and

    • improving training for CyberScan staff, who help victims understand their options and navigate the justice system.

    None of the recommendations address access to the courts for legal relief on behalf of victims of cyberbullying and the non-consensual distribution of intimate images.

    Nova Scotia remains the only province with a specific law to address cyberbullying, while other provinces have enacted statutes that are restricted to providing legal relief for victims for the non-consensual distribution of intimate images.

  • 16 Sep 2022 5:59 PM | CAN-TECH Law (Administrator)

    Breach notification and reporting obligations come into effect on September 22, 2022

    As part of its significant overhaul of the Act respecting the protection of personal information in the private sector in Bill 64 (now also known as Law 25), the province has introduced mandatory reporting and notification related to data breaches. The provisions in section 3.5 of the Bill will come into effect on September 22, 2022. The provisions are similar to those found in the Personal Information Protection and Electronic Documents Act (Canada) and the Personal Information Protection Act (Alberta), but not surprisingly use different terminology. 

    Regulated businesses will be required to promptly notify the Commission d’accès à l’information (“CAI”), as well as the affected individuals, whenever such businesses experience a “confidentiality incident” that poses a “risk of serious injury” to an individual. This is similar to a “breach of security safeguards” that results in a “real risk of significant harm” under PIPEDA. Again, similar to PIPEDA, businesses will be required to keep a register of all confidentiality incidents in the manner prescribed by regulation, regardless of the risk of injury. 

    On June 29th, 2022, a draft regulation regarding confidentiality incidents was published in the Gazette officielle du Québec. The Draft Bill 64 Regulation provides businesses with details related to the content of the new notification and record-keeping requirements. Interestingly, the new regulation also applies to public sector organizations. 

    Reports to the regulator must include:

    (1) the name of the body affected by the confidentiality incident and any Québec business number assigned to such body under the Act respecting the legal publicity of enterprises (chapter P-44.1);

    (2) the name and contact information of the person to be contacted in that body with regard to the incident;

    (3) a description of the personal information covered by the incident or, if that information is not known, the reasons why it is impossible to provide such a description;

    (4) a brief description of the circumstances of the incident and what caused it, if known;

    (5) the date or time period when the incident occurred or, if that is not known, the approximate time period;

    (6) the date or time period when the body became aware of the incident;

    (7) the number of persons concerned by the incident and the number of those who reside in Québec or, if that is not known, the approximate numbers;

    (8) a description of the elements that led the body to conclude that there is a risk of serious injury to the persons concerned, such as the sensitivity of the personal information concerned, any possible ill-intentioned uses of such information, the anticipated consequences of its use and the likelihood that such information will be used for injurious purposes;

    (9) the measures the body has taken or intends to take to notify the persons whose personal information is concerned by the incident, pursuant to the second paragraph of section 63.8 of the Act respecting Access to documents held by public bodies and the Protection of personal information or the second paragraph of section 3.5 of the Act respecting the protection of personal information in the private sector, and the date on which such persons were notified, or the expected time limit for the notification;

    (10) the measures the body has taken or intends to take after the incident occurred, including those aimed at reducing the risk of injury or mitigating any such injury and those aimed at preventing new incidents of the same nature, and the date on which the measures were taken or the expected time limit for taking the measures; and

    (11) if applicable, an indication that a person or body outside Québec that exercises similar functions to those of the Commission d’accès à l’information with respect to overseeing the protection of personal information has been notified of the incident.

    Notably, and unlike PIPEDA and PIPA, the regulations create a requirement to keep the CAI updated as more information relevant to (1) through (11) becomes known. 
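    For illustration only, the eleven required elements of a report to the CAI could be modeled as a simple record. This is a sketch, not a form prescribed by the regulation; the class and field names below are our own shorthand for items (1) through (11).

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ConfidentialityIncidentReport:
        """Hypothetical sketch of the eleven elements a report to the
        CAI must contain under the draft regulation (names are ours)."""
        body_name: str                          # (1) name of the affected body
        quebec_business_number: Optional[str]   # (1) Québec business number, if assigned
        contact_name: str                       # (2) contact person for the incident
        contact_info: str                       # (2) that person's contact information
        information_description: str            # (3) personal information covered (or why unknown)
        circumstances: Optional[str]            # (4) circumstances and cause, if known
        incident_date: str                      # (5) date or (approximate) period of the incident
        awareness_date: str                     # (6) when the body became aware of it
        persons_affected: Optional[int]         # (7) number of persons concerned (may be approximate)
        persons_in_quebec: Optional[int]        # (7) number of those residing in Québec
        serious_injury_factors: str             # (8) basis for the risk-of-serious-injury conclusion
        notification_measures: str              # (9) how and when individuals were or will be notified
        mitigation_measures: str                # (10) mitigation and prevention steps, with dates
        other_regulators_notified: bool = False # (11) whether a regulator outside Québec was notified
    ```

    Because the regulations require the CAI to be kept updated as more information becomes known, such a record would be revised and resubmitted as the incident investigation progresses.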

  • 16 Sep 2022 5:58 PM | CAN-TECH Law (Administrator)

    CRA security failure resulted in hackers’ access to thousands of accounts for CERB fraud

    The Federal Court of Canada in Sweet v Canada has certified a negligence, breach of confidence and intrusion upon seclusion class action against the Canada Revenue Agency in connection with widespread “My CRA” account takeovers during the pandemic. Of particular interest is that the class was defined to exclude individuals who had provided their personal information to a BC law firm that first filed its claim and then was itself subject to a cybersecurity incident that may have exposed class member information. 

    During the summer of 2020, a large number of “My CRA” accounts were compromised and accessed by unknown third parties. The compromised accounts had their banking and direct deposit information changed and many accounts were enrolled in benefits programs, such as the Canada Emergency Response Benefit. The threat actors were also able to access sensitive personal information contained in the accounts, such as addresses, birthdates, employment details and social insurance numbers. 

    A BC law firm quickly filed a putative class action in the Federal Court. In April 2021, that firm was itself the victim of a data breach that potentially exposed the personal information of potential class members. The government filed a motion to have the action stayed because it was proposing a third party claim against the law firm for contribution and indemnity for any persons whose information was exposed by both the government and the law firm. That third party claim would not be within the jurisdiction of the Federal Court. As a result, the first law firm withdrew and a second law firm began carriage of the case, and amended the pleadings to narrow the class of plaintiffs to exclude those whose information may have been exposed in the law firm data breach. A new representative plaintiff was substituted. 

    The Court described the incidents which resulted from an apparent failure on the part of CRA:

    [66] In the summer of 2020, GCKey and CRA’s My Account were the subject of what the cybersecurity industry describes as a “credential stuffing attack” by a threat actor, predominantly targeting CRA and ESDC as a means of fraudulently applying for COVID relief benefits (CERB and the Canada Emergency Student Benefit [CESB]) that had been introduced by the Government in the spring of 2020. Credential stuffing is a form of cyber attack that relies on the use of stolen credentials (username and password) from one system to attack another system and gain unauthorized access to an account. This type of attack relies on the reuse of the same username and password combinations by people over several services. Threat actors sell lists of credentials on the Dark Web. Credential stuffing usually refers to the attempt to gain access to many accounts through a web portal using an automated bot system rather than manually entering the credentials. On dates in July 2020, CRA’s My Account experienced large numbers of failed logins, which have since been identified as a precursor to, or otherwise part of, a credential stuffing attack against that service.

    [67] A threat actor attempting to access a particular My Account through credential stuffing would typically have encountered the requirement to successfully answer one of the five security questions selected by the user. However, during the attack that occurred in the summer of 2020, the threat actor(s) were able to bypass the security questions, and access My Account, because of a misconfiguration in CRA’s credential management software. CRA learned of this method to bypass the security questions on August 6, 2020, when it received a tip from a law enforcement partner that such a method was being sold on the Dark Web. Among other steps taken to respond to the data breach, CRA subsequently identified the relevant misconfiguration in its software, which it remedied on or about August 10, 2020.

    [68] In the meantime, at least 48,110 My Accounts were impacted by the unauthorized use of credentials, meaning that the threat actor was able to enter a valid CRA user ID and password. Of those 48,110 My Accounts, 21,860 involved no progress by the threat actor beyond entering the ID and password, such that the threat actor did not access the accounts. This is potentially understood as a stage of the attack in which the threat actor was ensuring that the credentials worked. The threat actor(s) actually logged in to 26,250 My Accounts. In 13,550 of the My Accounts, although the security question bypass was used, the threat actor only viewed the homepage, meaning that some personal information was accessed, but no application was submitted for CERB. In 12,700 of the My Accounts, the threat actor changed the relevant taxpayer’s direct deposit banking information and fraudulently applied for CERB.

    The plaintiffs sought to certify the class action on the basis of systemic negligence, breach of confidence and intrusion upon seclusion. The court noted that there were differing authorities on whether these causes of action could be applicable in circumstances such as these, but overall found that this area of law continues to develop and that the plaintiff’s claims were not bound to fail, based on the pleadings. 

  • 16 Sep 2022 5:57 PM | CAN-TECH Law (Administrator)

    Federal Privacy Commissioner weighs in on draft regulations regarding search of electronic devices by Canadian Border Services Agency

    Currently wending its way through Parliament is Bill S-7, which will inter alia amend the Customs Act to deal specifically with the search of electronic devices. The Canadian Border Services Agency (CBSA) in June 2022 proposed a set of regulations that would govern the examination of data stored on electronic devices. In July 2022 the Office of the Privacy Commissioner of Canada (OPC) published its submissions made as part of the public consultation on the draft regulations. It dealt with four topics:

    Note-Taking Requirements

    The OPC recommended:

      The requirement for noting the basis of the examination to also include noting if the rationale changes as the examination progresses, for instance, if new evidence or facts emerge;

      The proposed requirement for noting the type of document that was examined to also include the reasons why a particular document was examined;

      Adding a requirement to note any communication with the traveller that may be relevant to the circumstances of the examination; and

      Adding a requirement to note whether the search was resultant or not, and the steps taken following that determination.

    Disabling Network Connectivity

    The OPC reiterated earlier submissions that there be more specific procedures imposed regarding the disabling of network connectivity on devices so as to confine searches to data stored on the device; “we would emphasize that certain technical steps and procedures should be specified by the Regulations as necessary to ensure there is no connection to a network, including, but not necessarily limited to: activating “airplane mode”, deactivating connection to a WiFi network and, ensuring a device is not sharing a connection with another device via Bluetooth or otherwise.”

    Password Collection and Retention

    The OPC recommended that the Regulations “include specific provisions directing the methods and circumstances for password and passcode collection, including specifying that an officer must not retain a password or passcode in instances where the examination of a digital device is non-resultant.”

    Solicitor-Client Privileged Information

    In line with an amendment to the Bill that had been made by the Senate, the OPC recommended that CBSA “include its current policy requirements for dealing with solicitor-client privileged information, and other types of sensitive information of this nature, within the proposed Regulations.”

  • 16 Sep 2022 5:56 PM | CAN-TECH Law (Administrator)

    Electronic data from vehicle ruled properly removed, but not properly admitted

    In R. v. Major, the accused had been convicted at trial of dangerous driving and criminal negligence, stemming from a motor vehicle accident in which he was the driver of a pickup truck that collided with another truck. Two RCMP forensic collision reconstructionists attended at the scene of the horrific accident, and removed the factory-installed airbag control module (ACM) and accessed data from the event data recorder (EDR), which is a component of the ACM. They created an image of the data on an RCMP computer and analyzed it, which revealed that the accused’s truck was “moving at 137 km/hr 5 seconds prior to the impact, the brakes had been applied 1.2 seconds before the collision, and the vehicle was still going 118 km/hr immediately before it struck the semi-truck.” Over the accused’s objections, the EDR data and the reconstructionists’ testimony about it were admitted at trial and he was convicted. On appeal he renewed these objections.

    The Court of Appeal first turned to the question of whether the removal of the ACM and accessing of the EDR had breached the accused’s rights under s. 8 of the Charter. After reviewing the essential s. 8 law on search and seizure, Tholl J.A. (for a unanimous court) canvassed two competing lines of authority specifically on the issue of s. 8 breaches regarding ACM and EDR seizure. The first stemmed from the case of R v Hamilton, 2014 ONSC 447, and in these cases the courts tended to find that seizing the data without consent or a warrant amounted to a s. 8 breach, on the basis that the accused had a reasonable expectation of privacy in the vehicle which covered the data (despite its being relatively obscure or unknown to most drivers). The second line of authority stemmed from the case of R v Fedan, 2016 BCCA 26, and in these cases the courts tended to find that it could be presumed that an accused had a subjective expectation of privacy in the data due to its presence in the vehicle, but that such an expectation of privacy was not objectively reasonable “since no information could be gleaned from [the ACM and EDR] that revealed any intimate details about the accused or any information about who was driving the vehicle.”

    Turning to this case, the Court held that the accused had a limited interest in the data, insofar as it was embedded in his vehicle, and that in accordance with the Fedan line of cases it could be presumed without proof (i.e., the accused had not testified on the topic) that he had a subjective expectation of privacy in everything to do with his personal vehicle. However, the court was not convinced that the expectation of privacy was objectively reasonable:

    [69]           The closest an EDR comes to saving any personal information is its storage of the manner in which the vehicle was driven immediately pre-collision. The EDR data provides but a brief snapshot of the status of various mechanical and electrical components of the vehicle before a triggering event. According to the evidence in this matter, the data located in an EDR is not such that it provides longer-term information about the driving habits of the owner or operator of a vehicle. While an ACM and its EDR is, at least partially, an electronic device, it is comparatively rudimentary in nature. Moreover, unlike a mobile device or computer, its owner has no control in relation to any of the information that is stored on it. They cannot input data onto it, retrieve information from it or actively use it for any purpose. It does not record non-vehicle related information nor does it acquire or give location data. In the case at hand, there was no evidence demonstrating that Mr. Major even knew it existed.

    [70]           After considering the two lines of cases regarding EDR data, I find myself in substantial agreement with the reasoning from Fedan for the characterization of the data stored in the EDR. As in Fedan, the data here “contained no intimate details of the driver’s biographical core, lifestyle or personal choices, or information that could be said to directly compromise his ‘dignity, integrity and autonomy’” (at para 82, quoting Plant at 293). It revealed no personal identifiers or details at all. It was not invasive of Mr. Major’s personal life. The anonymous driving data disclosed virtually nothing about the lifestyle or private decisions of the operator of the Dodge Ram pickup. It is hard to conceive that Mr. Major intended to keep his manner of driving private, given that the other occupants of the vehicle – which included an adult employee – and complete strangers, who were contemporaneously using the public roadways or adjacent to it, could readily observe him. His highly regulated driving behaviour was “exposed to the public” (Tessling at para 47), although not to the precise degree with which the limited EDR data, as interpreted by the Bosch CDR software, purports to do. While it is only a small point, I further observe that a police officer on traffic patrol would have been entitled to capture Mr. Major’s precise speed on their speed detection equipment without raising any privacy concerns.

    Accordingly, there had been no section 8 breach.

    Justice Tholl then turned to whether the data had been properly admitted. While the Crown argued that it was admissible as an electronic record under ss. 31.1-31.8 of the Canada Evidence Act, Tholl J.A. pointed out [astutely, in our view: eds.] that these provisions simply provided for authentication and the baseline integrity of the data, and that a proper evidentiary foundation still had to be laid for admissibility. The Crown further argued that technology-generated data, which was commonly relied upon for the safety of the operation of mechanical devices, was prima facie reliable enough to be admissible. However, Tholl J.A. also took issue with this argument:

    [92]           It must be remembered that the data being accessed in the matter at hand was contained in a component of a motor vehicle’s safety system that is unfamiliar to the average person. It is not comparable to a vehicle speedometer, a GPS system, an iPhone, or an Excel spreadsheet. For these four latter pieces of technology, while the average user probably has little knowledge of how they actually work from an internal perspective, they are so ubiquitous that they are easily understood by lay people who feel comfortable with their reliability, based on their commonplace and routine usage. The EDR data and the CDR output do not fall into this same category. By way of illustration, I query what a trier of fact could make of the ten pages of hexadecimal data dump contained at pages 25 to 34 of the output from the Bosch CDR software that was filed as an exhibit.

    [93]           The EDR data is created using a process unknown to the average person, is not accessible by an owner of the vehicle or their mechanic, and can only be extracted with highly specialized third-party software. In this case, there was no evidence as to how the data was actually gathered, what margin of error might exist and what circumstances could influence its accuracy. There was no evidence as to whether the EDR component recorded information accurately. There was also no explanation of any anomalies found in the EDR data.


    In this case, while the reconstructionist who testified for the Crown had been qualified as an expert in accident reconstruction, the trial judge had refused to qualify him as an expert as regarded the EDR data, in particular, and indeed he had been unable to answer some questions on the analysis of the data that he and his colleague had produced. Without such expert opinion (and in spite of some judicial opinion to the contrary), the judge had erred in admitting it:

    [101]      While at some point in the future the use and understanding of these data recorders may become so commonplace that expert testimony is not required to establish their reliability, it is sufficient for me to find that this is not the case now. I am left to conclude that the bare fact that data has been recorded in an EDR and extracted by software does not establish its reliability. Thus, without evidence from a properly qualified expert as to how the system accurately records data and creates accurate output, the CDR output did not meet the threshold required for admission [….]

    [102]      I conclude that a properly qualified expert witness was required in order for the CDR output to be admitted at trial. The technology has not yet progressed to the point where it falls into the same category as a speedometer, video recorder, wrist-watch, spreadsheet program, or similar item of everyday use. Absent consent by an accused, threshold reliability must be established by the Crown before such evidence may be admitted. The trial judge specifically ruled that Cpl. Green could not provide the required expert opinion evidence in relation to this aspect of the evidence. As such, there was insufficient evidence to establish the threshold reliability of the EDR data or the CDR output.

    Given that the data had been the primary evidence of the speed of the vehicle at the time of the accident – a central fact – the error had rendered the verdict unsafe and the case had to be returned for a new trial.

  • 20 Jul 2022 9:29 AM | CAN-TECH Law (Administrator)

    The proposed new privacy law includes order-making powers, penalties and a new tribunal

    On June 26, 2022, Industry Minister François-Philippe Champagne finally tabled in the House of Commons Bill C-27, called the “Digital Charter Implementation Act, 2022”. This is the long-awaited privacy bill that is slated to replace the Personal Information Protection and Electronic Documents Act, which has regulated the collection, use and disclosure of personal information in the course of commercial activity in Canada since 2001.

    The bill is very similar to Bill C-11, which was tabled in 2019 as the Digital Charter Implementation Act, 2019, and which languished in Parliament until the federal government called the last election.

    The Bill creates three new laws. The first is the Consumer Privacy Protection Act (“CPPA”), which is the main privacy law. The second is the Personal Information and Data Protection Tribunal Act and the third is the Artificial Intelligence and Data Act.

    The CPPA has a completely different structure than PIPEDA. PIPEDA included a schedule taken from the Canadian Standards Association Model Code for the Protection of Personal Information and generally required regulated organizations to follow the Code. Similar to the Personal Information Protection Acts of British Columbia and Alberta, the substance of the Code has largely been translated to statutory language in the Bill itself.

    The most significant difference is what many privacy advocates have been calling for: the Privacy Commissioner is no longer an ombudsman. The law includes order-making powers and significant penalties. The Bill also creates a new tribunal called the Personal Information and Data Protection Tribunal, which replaces the current role of the Federal Court under PIPEDA with greater powers.

    PIPEDA applies to the collection, use and disclosure of personal information in the course of commercial activity and to federally-regulated workplaces. That will not change in the CPPA, but a new section 6(2) says that the new Act specifically applies to personal information that is collected, used or disclosed interprovincially or internationally. This provision is not expressly limited to commercial activity, so an argument could be made that it would apply to non-commercial or employee personal information that crosses borders.

    The CPPA has an interesting approach to anonymous and de-identified data. It officially creates these two categories. It defines anonymize as:

    “to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”

    Anonymous information is carved out of the CPPA’s purview. But de-identified data remains in scope. To de-identify data means “to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains.”

    In a number of areas, the CPPA provides more detail about what is required to comply with general principles that are already in PIPEDA. For example, additional detail is provided with respect to a company’s “privacy management program”. And all the supporting documentation for an organization’s privacy management program must be provided to the Privacy Commissioner on request.

    With respect to consent, organizations expressly have to record and document the purposes for which any personal information is collected, used or disclosed. This was implied in the CSA Model Code, but is now expressly spelled out in the Act. Section 15 of the CPPA lays out in detail what is required for consent to be valid. It requires not only identifying the purposes but also communicating in plain language how the information will be collected, the reasonably foreseeable consequences of its use, the types of information involved, and to whom the information may be disclosed.

    One significant change compared to PIPEDA is the circumstances under which an organization can collect and use personal information without consent. Section 18 of the CPPA allows collection and use without consent for certain business activities, such as where it would reasonably be expected in order to provide the service, for security purposes, for safety, or for other prescribed activities. Notably, this exception cannot be used where the personal information is to be collected or used to influence the individual’s behaviour or decisions. There is also a “legitimate interest” exception, which requires an organization to document any possible adverse effects on the individual, mitigate them and finally weigh whether the legitimate interest outweighs any adverse effects. It’s unclear how “adverse effects” would be measured.

    As under PIPEDA, an individual can withdraw consent, subject to similar limitations. But what’s changed is that an individual can require that their information be disposed of. Notably, disposal includes both deletion and rendering the information anonymous.

    The most notable changes are with respect to the role of the Privacy Commissioner. The Commissioner is no longer an ombudsman with a focus on nudging companies to compliance and solving problems for individuals. The CPPA and Tribunal Act veer strongly towards enforcement.

    As with PIPEDA, enforcement starts with a complaint by an individual, or the Commissioner can initiate an investigation on the Commissioner's own initiative. After the investigation, the matter can be referred to an inquiry.

    Inquiries have more procedural protections for fairness and due process than under the existing ad hoc investigation system. For example, each party is guaranteed a right to be heard and to be represented by counsel. At the end of the inquiry, the Commissioner can issue orders for measures to comply with the Act or to stop doing something that is in contravention of the Act. The Commissioner can continue to name and shame violators, but penalties are left to the new Privacy and Data Protection Tribunal.

    The legislation creates a new specialized tribunal which hears cases under the CPPA. Compared to C-11, the new bill requires that at least three of the tribunal members have expertise in privacy.

    Its role is to determine whether any penalties recommended by the Privacy Commissioner are appropriate. It also hears appeals of the Commissioner’s findings, appeals of interim or final orders of the Commissioner and a decision by the Commissioner not to recommend that any penalties be levied.

    Currently, under PIPEDA, complainants and the Commissioner can seek a hearing in the federal court after the commissioner has issued his finding. That hearing is “de novo”, so that the court gets to make its own findings of fact and determinations of law, based on the submissions of the parties. The tribunal, in contrast, has a standard of review that is “correctness” for questions of law and “palpable and overriding error” for questions of fact or questions of mixed law and fact. These decisions are subject to limited judicial review before the Federal Court.

    Possible penalties are huge. The maximum administrative monetary penalty that the tribunal can impose in one case is the higher of $10,000,000 and 3% of the organization’s gross global revenue in its financial year before the one in which the penalty is imposed. The Act also provides for quasi-criminal prosecutions, which can get even higher.

    The Crown prosecutor can decide whether to proceed as an indictable offence with a fine not exceeding the higher of $25,000,000 and 5% of the organization’s gross global revenue or a summary offence with a fine not exceeding the higher of $20,000,000 and 4% of the organization’s gross global revenue. If it’s a prosecution, then the usual rules of criminal procedure and fairness apply, like the presumption of innocence and proof beyond a reasonable doubt.
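    To illustrate the arithmetic of the “higher of” caps described above, here is a minimal sketch. The dollar figures and percentages come from the article; the function names are our own.

    ```python
    # Hypothetical illustration of the penalty caps under Bill C-27 as
    # described above; not legal advice, and names are our own shorthand.

    def amp_cap(gross_global_revenue: float) -> float:
        """Maximum administrative monetary penalty: the higher of
        $10,000,000 and 3% of gross global revenue."""
        return max(10_000_000, 0.03 * gross_global_revenue)

    def indictable_fine_cap(gross_global_revenue: float) -> float:
        """Maximum fine on indictment: the higher of $25,000,000
        and 5% of gross global revenue."""
        return max(25_000_000, 0.05 * gross_global_revenue)

    def summary_fine_cap(gross_global_revenue: float) -> float:
        """Maximum fine on summary conviction: the higher of
        $20,000,000 and 4% of gross global revenue."""
        return max(20_000_000, 0.04 * gross_global_revenue)

    # For a company with $1 billion in gross global revenue, 3% ($30M)
    # exceeds the $10M floor, so the AMP cap is $30M.
    print(amp_cap(1_000_000_000))
    ```

    For smaller organizations the fixed dollar floors dominate; the percentage-of-revenue measure only bites once gross global revenue crosses the corresponding threshold (about $333M for the 3% administrative penalty).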

    Within a week of the Bill being tabled, the House rose for the summer break. When Parliament resumes in September, it is impossible to predict whether the Bill will be fast-tracked or whether it will languish as Bill C-11 did in 2019. It is also hard to predict whether the government will be amenable to suggested amendments at the Committee stage.

  • 20 Jul 2022 9:22 AM | CAN-TECH Law (Administrator)

    Consortium of broadcasters sought to block live streams of NHL games in Canada 

    The last three years have seen an opening of the floodgates for Federal Court orders directed at internet service providers to block their customers’ access to third party websites, in response to allegations of copyright infringement by the operators of those third party websites. In a significant extension of this precedent, the Federal Court of Canada has issued the first order requiring internet service providers to block IP addresses in real time, upon request by a consortium of Canadian broadcasters to prevent live access to streams of National Hockey League games. 

    In Rogers Media Inc. v. John Doe 1, the Court considered an application made by Canadian broadcasters who have the exclusive right to broadcast NHL games to prevent Canadians from accessing pirated streams of those live games. 

    The door was opened to website blocking in Canada in the 2019 case of Bell Media Inc. v. GoldTV.Biz. In that case, the Court ordered that ISPs block certain IP addresses listed in the Court’s order, and the list could be added to only by further court order. The plaintiffs argued that this approach was not feasible for NHL games and urged the court to issue an order for “dynamic” site blocking. From the Court:

    [6] In this case, the Plaintiffs have requested a “dynamic” site blocking Order, which involves trying to follow and block the unlawful streaming as it moves. The Plaintiffs say that the type of order issued in GoldTV FC would not work here because the pirates have adopted new measures to avoid detection and defeat site blocking, including moving their infringing content from site to site on a regular basis. Court approval would be impossible prior to each new blocking step because these efforts need to happen in real time in order to be effective.

    [7] The Plaintiffs say this is of particular relevance here, because most fans watch hockey games live, rather than recording them to watch later. This combination of factors means that blocking of unlawful streaming of live NHL broadcasts must happen while the broadcast is underway. Based on the evidence they have gathered, and experience in other countries where similar site-blocking orders have been issued, the Plaintiffs say that a dynamic site blocking Order is needed to keep up with the evolution in how online copyright piracy operates. For example, in this case the sites to be blocked could shift during the course of a single hockey broadcast. This type of dynamic blocking order has never been granted in Canada or in the United States. However, similar orders have been granted in the United Kingdom and Ireland, as well as in some European countries.

    Some of the ISPs were prepared to consent to the order, while others resisted it on various grounds summarized by the Court at paragraph 8 of the decision: 

    Although the objecting Third Party Respondents do not adopt identical positions, they advance broadly similar arguments. They say the process the Plaintiffs followed has been inappropriate and unfair. They contend that the Plaintiffs have failed to prove their case. They argue that the Order sought would impose undue risks, practical difficulties and costs on them, noting that they are not accused of any wrongdoing in this matter. Finally, they submit that if any Order is to be imposed, the Plaintiffs must be required to indemnify them completely for the costs associated with compliance, including (for those that would be required to do so) any cost of upgrading their network infrastructure.

    The Court followed the reasoning from the GoldTV cases, which included the general test for injunctive relief against uninvolved third parties from RJR-MacDonald Inc. v. Canada (Attorney General) most recently interpreted by the Supreme Court of Canada in Google Inc. v. Equustek Solutions Inc. In this case, as in GoldTV, this analysis was supplemented with additional factors drawn from the England and Wales Court of Appeal case of Cartier International AG v British Sky Broadcasting Ltd.:

    [113] The following summary of the Cartier factors borrows from GoldTV FC at paragraph 52 and GoldTV FCA at para 74. The factors to be considered are:

    A. Necessity – the extent to which the relief is necessary to protect the plaintiff’s rights. The relief need not be indispensable but the court may consider whether alternative and less onerous measures are available;

    B. Effectiveness – whether the relief sought will make infringing activities more difficult to achieve and discourage Internet users from accessing the infringing service;

    C. Dissuasiveness – whether others not currently accessing the infringing service will be dissuaded from doing so;

    D. Complexity and Cost – the complexity and cost of implementing the relief sought;

    E. Barriers to legitimate use or trade – whether the relief will create barriers to legitimate use by unduly affecting the ability of users of ISP services to access information lawfully;

    F. Fairness – whether the relief strikes a fair balance between fundamental rights of the parties, the third parties and the general public;

    G. Substitution – the extent to which blocked websites may be replaced or substituted and whether a blocked website may be substituted for another infringing website; and

    H. Safeguards – whether the relief sought includes measures that safeguard against abuse.

    The Court ultimately was satisfied that the RJR-MacDonald and Cartier factors supported granting the injunction, which was issued on the following terms:

    • The plaintiffs must collectively appoint a third party agent to identify impugned IP addresses, notify the ISPs of the IP addresses to be blocked and when to unblock them, compile complaints, and assess the effectiveness of the measures. The third party agent is to report to the Court and issue a public report. 

    • ISPs must block or attempt to block access to IP addresses streaming infringing content, as identified by the plaintiffs’ agent, and must inform the plaintiffs of the processes they propose to use. 

    • Disabling access is to be done “as soon as practical,” and within 30 minutes of the beginning of the game or of notice; the IP addresses are to be unblocked at the conclusion of the game. 

    • The plaintiffs and ISPs must provide a form of notice to affected customers.

    • Compliance with the order is generally expected immediately where possible, but an ISP that cannot implement measures to comply within 15 days must notify the plaintiffs. 

    • The plaintiffs are responsible, to a maximum of $50,000 per ISP, for the costs of implementing the order.

    • The order will only be in effect until the end of the 2021–2022 NHL season (including playoffs). 
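    The timing terms above can be modelled in a short sketch. This is a hypothetical illustration only: the function name, the scheduling model, and the reading of the 30-minute deadline as running from the later of game start and notice are assumptions for the example, not terms drawn verbatim from the order.

    ```python
    from datetime import datetime, timedelta

    # Hypothetical model of the order's timing terms: an ISP must disable
    # access within 30 minutes of the start of the game or of receiving
    # notice (read here as the later of the two), and must unblock the
    # addresses at the game's conclusion. All names are invented.

    def blocking_window(game_start: datetime, game_end: datetime,
                        notice_received: datetime):
        """Return (deadline to block, time to unblock)."""
        trigger = max(game_start, notice_received)
        block_deadline = trigger + timedelta(minutes=30)
        return block_deadline, game_end

    start = datetime(2022, 4, 9, 19, 0)
    end = datetime(2022, 4, 9, 22, 0)
    notice = datetime(2022, 4, 9, 19, 45)   # agent gives notice mid-game
    deadline, unblock = blocking_window(start, end, notice)
    print(deadline)   # 2022-04-09 20:15:00
    print(unblock)    # 2022-04-09 22:00:00
    ```

    Even this toy model shows why the blocking must be “dynamic”: with a three-hour broadcast and a 30-minute response window, court approval of each new address before blocking would make the remedy useless.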

    While GoldTV did open the floodgates for rightsholders to seek blocking orders, the “dynamic” blocking here was ordered largely because of the nature of sports broadcasting, where viewers tend to watch games live. The door has clearly been opened for similar orders to be sought for other live sporting events, such as the soccer World Cup and the Olympics. 

  • 20 Jul 2022 9:18 AM | CAN-TECH Law (Administrator)

    In a split decision, Alberta Court of Appeal finds that police do not have to seek judicial authorization to obtain IP addresses in order to further an investigation

    In R. v. Bykovets, police were investigating a fraud in which the fraudsters had purchased online liquor gift cards using stolen credit card information. The payments had been processed through a third party service, and when an investigating officer requested information about the purchasers from the third party, that company sent the officer two IP addresses. Using public sources, the police were able to determine that the IP addresses were registered to TELUS, and obtained a production order requiring TELUS to provide subscriber information. The information produced by TELUS linked the IP addresses to the appellant and his father, at their home addresses. The police obtained search warrants for the two residences and eventually seized evidence linking the appellant to the fraud. At trial the appellant had argued that s. 8 of the Charter had been breached because the police officer had not obtained judicial authorization before obtaining the IP addresses initially. The trial judge had rejected this argument, holding that the appellant did not have a reasonable expectation of privacy in the IP addresses.

    This latter question was the main issue confronted by the Court of Appeal: does the target of a police investigation have a reasonable expectation of privacy in an IP address used by them, such that s. 8 of the Charter is engaged and police are required to obtain judicial authorization before obtaining the IP address? Interestingly, this question produced a split decision from the Court of Appeal. Both the majority and dissenting judgments agreed that the analytical starting point was the Supreme Court of Canada’s 2014 decision in R. v. Spencer, in which the Court held that people have a reasonable expectation of privacy in their subscriber information that is attached to IP addresses which have been assigned to them, because this data can reveal to the police “core biographical information.” The disagreement in this case turned on differing views on what the police “were actually trying to obtain” in seeking the IP addresses. Writing in dissent, Justice Veldhuis accepted the appellant’s broader argument that the privacy invasion here was analogous to the one dealt with in Spencer:

    [62] …Police officers asking third parties for IP addresses associated with particular internet activity is, in essence, no different from police asking Internet Service Providers (ISPs) to provide the subscriber information associated with a particular IP address. Both investigative techniques are aimed at gathering information to ascertain the identity of an internet user and allow the police to gather further information to draw inferences about the intimate details of the lifestyle and personal choices of the internet user.

    [….]

    [72]           The trial judge distinguished this case from R v Jennings2018 ABQB 296, and in my view, it was an error to do so. In Jennings, the police deployed a mobile device identifier (MDI) on numerous occasions which gathered third-party cellular phone information numbers (IMEI and IMSI). The Crown alleged, much like it does here, that IMEI and IMSI are just numbers and do not constitute or contain intimate details, core biographical information or reveal personal choices. However, the trial judge accepted the defence position that had the numbers been linked to Jennings, the police would be able to use that information in repeated applications of the MDI to build a profile that would reveal patterns of communication, contacts, and other biographical data about Jennings.

    [73]           The trial judge in Jennings recognized, at paragraph 37, the quintessential problem with the Crown’s position and applying a narrow scope to characterizing information that acts as a trace associated with a particular individual’s electronic activities: at what point in the iterative process of police gathering electronic information do they need to seek a warrant to ensure that there is a warrant at the point the numbers are associated with a particular person? She held that the point is at the beginning, and I agree.

    However, the majority judges held that this view stretched the Spencer analysis too far:

    [17]           In Spencer, police obtained, without judicial authorization, the IP address and its subscriber data. Thus, without a court order, the police believed the following: Matthew Spencer was using the internet to download child pornography at a specifically named address. By contrast, the police here obtained, without judicial authorization, only IP addresses. Based on this abstract information, police believed a person who committed fraud used the IP addresses. They did not know who. They only knew the IP addresses belonged to TELUS and they ascertained this information through a publicly available internet lookup site. To get the name and address of the subscriber, they lawfully served TELUS with a production order. Thus, without a court order, they believed only this: an unknown person using a known IP address was committing fraud from an unknown address.

    [18]           We also note the difference in mobile phone searches. In situations involving mobile phone data, there is clearly a reasonable expectation of privacy in international mobile subscriber identity (“IMSI”) and international mobile equipment identity (“IMEI”) numbers that the police obtained with a mobile device identifier (“MDI”) or cellular-site simulator (“CSS”). In those cases, the subject or identity of the target is generally known when the MDI or CSS technology is used. More importantly, in those situations, over time the police can glean “significant personal information” from the IMSI and IMEI numbers such as drawing inferences about a target’s cell usage and web browsing.

    [19]           The appellant has analogized an IP address to a house address. In our view, the analogy is not appropriate.

    [20]            An IP address does not tell police where the IP address is being used or, for that matter, who is using it. Nor is there a publicly available resource from which the police can learn this or other subscriber data. To get the core biographical information such as an address, name, and phone number of the user, the police must obtain and serve a production order on the ISP in accordance with Spencer. That is what the police did here.

    [21]           An IP address is an abstract number that reveals none of the core biographical information the issuer of that IP address attaches to it. Standing alone, it reveals nothing. An IP address does not reveal intimate details of a person’s lifestyle nor does it, without more, disclose core biographical information, nor communicate confidential information. An IP address can tell the police something about a person only if they get subscriber information. But the police can only get subscriber information if they comply with Spencer and obtain a production order.

    The trial judge had correctly found that the IP addresses, by themselves, revealed no private information, and therefore no reasonable expectation of privacy had arisen.
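    The majority’s characterization of an IP address as an “abstract number” can be made concrete by inspecting one programmatically. Below is a minimal illustration using Python’s standard ipaddress module; the sample address is fictional, drawn from a reserved documentation range, and has no connection to the case.

    ```python
    import ipaddress

    # An IPv4 address is, literally, a 32-bit integer with a dotted notation.
    # Parsing one yields only technical attributes -- never a name or a
    # street address. The address below is from the reserved documentation
    # range (203.0.113.0/24) and identifies no real subscriber.
    addr = ipaddress.ip_address("203.0.113.42")

    print(int(addr))      # 3405803818 -- the address as a bare integer
    print(addr.version)   # 4

    # Linking that number to a person requires the ISP's subscriber records,
    # which, per Spencer, the police must obtain via a production order.
    ```

    The sketch is only a gloss on the majority’s reasoning: everything derivable from the number itself is technical metadata, and the “core biographical information” lives in the ISP’s records, not in the address.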

    While at the time of writing there was no indication of a defence appeal to the Supreme Court of Canada, one is available because of a dissent on a point of law at the Court of Appeal. It will be interesting to see if the Supreme Court wishes to examine this issue, given that (as revealed in the decision) there seems to be some division of opinion on whether an IP address attracts a reasonable expectation of privacy.


    • 18 Mar 2022 1:51 PM | CAN-TECH Law (Administrator)

      As a new feature, from time to time the English portion of this newsletter will note the publication of books or articles that might be of interest to the CAN-TECH membership. Subscribers who themselves publish relevant material are invited to send notice to the editors.

      Michael Shortt, “The Writing Requirement for Canadian Copyright Assignments” (2021) 34:1 Intellectual Property Journal 1

      Abstract: Transfer of copyright in Canada requires a written, signed assignment. While the idea of a writing and signature requirement may seem simple, the Canadian requirement has a complex history, and an unpredictable pattern of application by the courts. This article reviews the history of the Canadian writing requirement, which traces back to an unexpected source in 19th-century British jurisprudence. It then considers the policy objectives that have been advanced for the Canadian writing requirement, concluding that courts regularly apply the writing requirement in ways that are inconsistent with all of its most plausible purposes. Finally, the article reviews how courts have applied the writing requirement to practical litigation scenarios, including retroactive assignments, conditional assignments, and assignments via electronic writings and signatures. It concludes with a series of observations on law reform in this area, both judicial and legislative.

      Special Issue (2021) 19:2 Canadian Journal of Law & Technology

      This special issue of the CJLT contains a collection of articles focused on the problem of technology-facilitated gender-based violence (TFGBV). It emerged from a series of online events hosted by Osgoode Hall Law School and the University of Ottawa.


    Canadian Technology Law Association

    1-189 Queen Street East

    Toronto, ON M5A 1S2

    contact@cantechlaw.ca

    Copyright © 2022 The Canadian Technology Law Association, All rights reserved.