Cybersecurity

Overview

Read our white paper on what to do before, during and after a cyber incident.

Complete our analysis questionnaire on cybersecurity needs.

Now more than ever, companies of all sizes and in every field must pay particular attention to the issue of cybersecurity.

The rise in cyberattacks and the costs associated with data leaks are well documented. The mobility of information in a telecommuting context, the use of cloud storage, process automation and the increased connectivity of organizational systems all increase organizations’ vulnerability to cyberattacks. Data leaks can adversely affect not only an organization’s reputation with the public, but also the management and continuity of its day-to-day business.

In addition, legislative and regulatory requirements for public- and private-sector companies that hold personal data and information are being tightened, as evidenced, in particular, by the National Assembly of Québec’s recent adoption of Bill 64 in the wake of high-profile security incidents.

Our expertise

Our service offer covers all aspects of cybersecurity, including identifying risks, understanding the issues at stake, implementing best practices in cyber vigilance and providing support should a company be sued following a breach of confidentiality.

Lavery’s team has extensive experience and expertise, particularly in crisis management with respect to:

  • Protection of personal and other sensitive data
  • Information technology
  • Technology governance
  • IT risk management
  • Disputes (including class actions)
  • Labour and employment law

Our team keeps abreast of legislative changes regarding personal information, an area currently undergoing rapid change. It also has an understanding of cutting-edge technology, including the Internet of Things, artificial intelligence and quantum computing, all of which will drastically affect cybersecurity practices in the coming years.

Service offer to private and public institutions

Because legal matters represent only a fraction of the issues that need to be addressed with respect to an organization’s cyber vigilance, our service offer combines legal services geared towards IT security management with non-legal services spanning a range of prevention and response measures, providing an effective and operational solution built around four areas:

  • Strategy and transformation: Developing strategies and programs that focus on business needs and risks and support growth and resilience by making cybersecurity and privacy a company-wide priority.
  • Incident and threat management: Preparing for, identifying, responding to, investigating and handling threats with confidence.
  • Consumer privacy and protection: Designing, implementing and running a privacy program that enables your organization to maximize the use of data in accordance with the law, while building consumer trust.
  • Implementation and operations: Designing, implementing, running and improving the use of cybersecurity technologies and continuously monitoring your environment to detect and contain threats to your business.

Service offer to SMEs

Our firm has developed a cybersecurity service offer designed, in particular, to analyze companies’ needs in this area and to identify potential weaknesses that require their attention.

As a first step, your organization must complete a cybersecurity needs analysis questionnaire.

Once the questionnaire is completed, we can establish a diagnosis, propose solutions and an action plan to remedy problematic aspects, and guide you in implementing our recommendations on the following:

  • Cybersecurity governance: A sound decision-making process is important for any business when it comes to cybersecurity.
  • Processes related to employees, suppliers and subcontractors: A business’s decisions and policies respecting cybersecurity must be properly communicated not only within the organization, but also to all stakeholders.
  • Protection of personal information and data, and Canada’s anti-spam legislation: If your organization collects data or personal information as part of its operations, it must do so in accordance with the law.
  • Technical and technological component to increase cybersecurity: Legal and strategic advice associated with implementing the action plan following our cybersecurity needs analysis.

Representative mandates

  • Advised one of the largest professional orders in Quebec regarding a major computer security breach affecting its employees and members.
  • Advised a major Canadian chemical company on the theft of its employees’ and customers’ personal data.
  • Advised a Canadian tax and financial planning association following a cyberattack on its IT service provider.
  • Advised and provided a legal opinion to one of the most prominent public organizations in Quebec on the appropriateness and content of an incident report resulting from a breach of confidentiality following a cyberattack.
  • Advised a multinational tobacco company on the measures to be implemented in the event of a computer security breach and reviewed its policies, guidelines and response plans in this regard.
  • Provided training to executives of a multinational cybersecurity insurance organization.
  • Provided training to a major accounting and tax firm on cybersecurity and privacy.
  • Advised a Crown corporation on applying the General Data Protection Regulation (GDPR) and created a matrix to identify cases where this European legal framework, which includes rules on IT security breaches, should be applied.
  • Participated in data protection IT audits for various companies as part of a partnership with an international consulting firm.
  • Advised a Canadian vehicle parts company that was held to ransom following an unauthorized intrusion into its databases containing all of the technical drawings of its American and European vehicle manufacturer clients.
  • Reviewed the physical and software security rules for the IT and telecommunications systems of two major Canadian financial institutions, then negotiated and drafted the physical and software security obligations incumbent on the service provider to which the operation of these systems was outsourced, so as to ensure adequate contractual protection for the financial institutions against any breach of confidentiality of personal and other sensitive data entrusted to that provider.
  • Assisted a European law firm with a major employee and supplier data breach involving a multinational electronics company and its subsidiaries in several jurisdictions around the world.
  • Advised a publicly traded company in the implementation of IT governance and security measures for the sharing of trade secrets between its various sites in Canada, the United States and Europe.
  • Represented a European company that was the victim of a cyber incident in claiming damages from the parties responsible for the incident, who were located in Canada.
  1. The forgotten aspects of AI: reflections on the laws governing information technology

    While lawmakers in Canada1 and elsewhere2 are endeavouring to regulate the development and use of technologies based on artificial intelligence (AI), it is important to bear in mind that these technologies are also classified within the broader family of information technology (IT). In 2001, Quebec adopted a legal framework aimed at regulating IT. All too often forgotten, this legislation applies directly to the use of certain AI-based technologies.

    The very broad notion of “technology-based documents”

    The technology-based documents referred to in this legislation include any type of information that is “delimited, structured and intelligible”.3 The Act lists a few examples of technology-based documents contemplated by applicable laws, including online forms, reports, photos and diagrams—even electrocardiograms! It is therefore understandable that this notion easily applies to the user interface forms used on various technological platforms.4 Moreover, technology-based documents are not limited to personal information. They may also pertain to company or organization-related information stored on technological platforms. For instance, Quebec’s Superior Court recently cited the Act in recognizing the probative value of medical imaging practice guidelines and technical standards accessible on a website.5 A less recent decision also recognized that the contents of electronic agendas were admissible as evidence.6

    Due to their bulky algorithms, various AI technologies are available as software as a service (SaaS) or as platform as a service (PaaS). In most cases, the information entered by user companies is transmitted to supplier-controlled servers, where it is processed by AI algorithms. This is often the case for advanced client relationship management (CRM) systems and electronic file analysis. It is also the case for a whole host of applications involving voice recognition, document translation and decision-making assistance for users’ employees. In the context of AI, technology-based documents in all likelihood encompass all documents that are transmitted, hosted and processed on remote servers.

    Reciprocal obligations

    The Act sets out specific obligations when information is placed in the custody of service providers, in particular IT platform providers. Section 26 of the Act reads as follows:

        26. Anyone who places a technology-based document in the custody of a service provider is required to inform the service provider beforehand as to the privacy protection required by the document according to the confidentiality of the information it contains, and as to the persons who are authorized to access the document.

        During the period the document is in the custody of the service provider, the service provider is required to see to it that the agreed technological means are in place to ensure its security and maintain its integrity and, if applicable, protect its confidentiality and prevent accessing by unauthorized persons. Similarly, the service provider must ensure compliance with any other obligation provided for by law as regards the retention of the document. (Our emphasis)

    This section of the Act therefore requires the company wishing to use a technological platform and the supplier of the platform to enter into a dialogue. On the one hand, the company using the technological platform must inform the supplier of the privacy protection required for the information stored on the platform. On the other hand, the supplier is required to put in place “technological means” to ensure security, integrity and confidentiality that are in line with the privacy protection requested by the user.

    The Act does not specify what technological means must be put in place. However, they must be reasonable, in line with the sensitivity of the technology-based documents involved, as seen from the perspective of someone with expertise in the field. Would a supplier offering a technological platform with outmoded modules or known security flaws be in compliance with its obligations under the Act? This question must be addressed by considering the information transmitted by the user of the platform concerning the privacy protection required for the technology-based documents. The supplier, however, must not conceal the security risks of its IT platform from the user, since doing so would violate the parties’ disclosure and good faith requirements.

    Are any individuals involved?

    These obligations must also be viewed in light of Quebec’s Charter of Human Rights and Freedoms, which also applies to private companies. Companies that process information on behalf of third parties must do so in accordance with the principles set out in the Charter whenever individuals are involved. For example, if a CRM platform supplier offers features that can be used to classify clients or to help companies respond to requests, the information processing must be free from bias based on race, colour, sex, gender identity or expression, pregnancy, sexual orientation, civil status, age except as provided by law, religion, political convictions, language, ethnic or national origin, social condition, a handicap or the use of any means to palliate a handicap.7 Under no circumstances should an AI algorithm suggest that a merchant should not enter into a contract with any individual on any such discriminatory basis.8

    In addition, anyone who gathers personal information by technological means making it possible to profile certain individuals must notify them beforehand.9

    To recap, although the emerging world of AI is a far cry from the Wild West decried by some observers, AI must be used in accordance with existing legal frameworks. No doubt additional laws specifically pertaining to AI will be enacted in the future. If you have any questions about how these laws apply to your AI systems, please feel free to contact our professionals.

    1  Bill C-27, Digital Charter Implementation Act, 2022.
    2  In particular, the U.S. Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, October 30, 2023.
    3  Act to establish a legal framework for information technology, CQLR c C-1.1, sec. 3.
    4  Ibid., sec. 71.
    5  Tessier v. Charland, 2023 QCCS 3355.
    6  Lefebvre Frères ltée v. Giraldeau, 2009 QCCS 404.
    7  Charter of Human Rights and Freedoms, sec. 10.
    8  Ibid., sec. 12.
    9  Act respecting the protection of personal information in the private sector, CQLR c P-39.1, sec. 8.1.

  2. Artificial intelligence in business: managing the risks and reaping the benefits?

    At a time when some are demanding that artificial intelligence (AI) research and advanced systems development be temporarily suspended and others want to close Pandora’s box, it is appropriate to ask what effect chat technology (ChatGPT, Bard and others) will have on businesses and workplaces. Some companies support its use, others prohibit it, but many have yet to take a stand. We believe that all companies should adopt a clear position and guide their employees in the use of such technology. Before deciding what position to take, a company must be aware of the various legal issues involved in using this type of artificial intelligence. Should a company decide to allow its use, it must be able to provide a clear framework for it and, more importantly, for the ensuing results and applications.

    Clearly, such technological tools have both significant advantages likely to cause a stir—consider, for example, how quickly chatbots can provide information that is both surprising and interesting—and the undeniable risks associated with the advances that may arise from them. This article outlines some of the risks that companies and their clients, employees and partners face in the very short term should they use these tools.

    Potential for error and liability

    The media has extensively reported on the shortcomings and inaccuracies of text-generating chatbots. There is even talk of “hallucinations” in certain cases where the chatbot invents a reality that doesn’t exist. This comes as no surprise. The technology feeds off the Internet, which is full of misinformation and inaccuracies, yet chatbots are expected to “create” new content. They lack, for the time being at least, the necessary parameters to utilize this “creativity” appropriately.

    It is easy to imagine scenarios in which an employee would use such technology to create content that their employer would then use for commercial purposes. This poses a clear risk for the company if appropriate control measures are not implemented. Such content could be inaccurate in a way that misleads the company’s clients. The risk would be particularly significant if the content generated in this way were disseminated by being posted on the company’s website or used in an advertising campaign, for example. In such a case, the company could be liable for the harm caused by its employee, who relied on technology that is known to be faulty. The reliability of these tools, especially when used without proper guidance, is still one of the most troubling issues.

    Defamation

    Suppose that such misinformation concerns a well-known individual or rival company. From a legal standpoint, a company disseminating such content without putting parameters in place to ensure that proper verifications are made could be sued for defamation or misleading advertising. Thus, adopting measures to ensure that any content derived from this technology is thoroughly validated before any commercial use is a must.

    Many authors have suggested that the results generated by such AI tools should be used as aids to facilitate analysis and decision-making rather than to produce final results or output. Companies will likely adopt these tools and benefit from them—for competitive purposes, in particular—faster than good practices and regulations are implemented to govern them.

    Intellectual property issues

    The new chatbots have been developed as extensions to web search engines such as Google and Bing. Content generated by chatbots may be based on existing copyrighted web content, and may even reproduce substantial portions of it. This could lead to copyright infringement. Where users limit their use to internal research, the risk is limited, as the law provides for a fair dealing exception in such cases. Infringement of copyright may occur if the intention is to distribute the content for commercial purposes. The risk is especially real where chatbots generate content on a specific topic for which there are few references online. Another point that remains unclear is who will own the rights to the answers and results of such a tool, especially if such answers and results are adapted or modified in various ways before they are ultimately used.

    Confidentiality and privacy issues

    The terms and conditions of use for most chatbots do not appear to provide for confidential use. As such, trade secrets and confidential information should never be disclosed to such tools. Furthermore, these technologies were not designed to receive or protect personal information in accordance with applicable laws and regulations in the jurisdictions where they may be used. Typically, the owners of these products assume no liability in this regard.

    Other issues

    There are a few other important issues worth considering among those that can now be foreseen. Firstly, the possible discriminatory biases that some attribute to artificial intelligence tools, combined with the lack of regulation of these tools, may have significant consequences for various segments of the population. Secondly, the many ethical issues associated with artificial intelligence applications that will be developed in the medical, legal and political sectors, among others, must not be overlooked. The stakes are even higher when these same applications are used in jurisdictions with different laws, customs and economic, political and social cultures. Lastly, the risk of conflict must also be taken into consideration. Whether the conflict is between groups with different values, between organizations with different goals or even between nations, it is unclear whether (and how) advances in artificial intelligence will help to resolve or mitigate such conflicts, or instead exacerbate them.

    Conclusion

    Chat technologies have great potential, but they also raise serious legal issues. In the short term, it seems unlikely that these tools could actually replace human judgment, which is in and of itself imperfect. That being said, just as the industrial revolution did two centuries ago, the advent of these technologies will lead to significant and rapid changes in businesses. Putting policies in place now to govern the use of this type of technology in your company is key. Moreover, if your company intends to integrate such technology into its business, we recommend a careful study of the terms and conditions of use to ensure that they align with your company’s project and the objectives it seeks to achieve with it.

  3. Cybersecurity and the dangers of the Internet of Things

    While the Canadian government has said it intends to pass legislation dealing with cybersecurity (see Bill C-26 to enact the Critical Cyber Systems Protection Act), many companies have already taken significant steps to protect their IT infrastructure. However, the Internet of Things is too often overlooked in this process, even though many connected devices are tied directly to the IT infrastructure that matters most to businesses. Industrial robots, devices that control production equipment in factories, and devices that help drivers make deliveries are just a few examples of vulnerable equipment.

    Operating systems and a range of applications are installed on these devices, and the basic operations of many businesses and the security of personal information depend on the security of the devices and their software. For example:

    • An attack could target the manufacturing equipment control systems on the factory floor and result in an interruption of the company’s production, significant recovery costs and production delays.
    • By targeting production equipment and industrial robots, an attacker could steal the blueprints and manufacturing parameters for various processes, which could jeopardize a company’s trade secrets.
    • Barcode scanners used for package delivery could be infected and transmit information to hackers, including personal information.

    The non-profit Open Web Application Security Project (OWASP) has released a list of the top ten security risks for the Internet of Things.1 Leaders of companies that use this kind of equipment must be aware of these issues and take measures to manage them. Several of these risks call for appropriate policies and good corporate governance to mitigate them:

    • Weak or unchangeable passwords: Some devices are sold with common or weak initial passwords. It is important to ensure that passwords are changed as soon as devices are set up and to keep tight control over them. Only designated IT personnel should know the passwords for configuring these devices. You should also avoid acquiring equipment that does not allow for password management (for example, a device with an unchangeable password).
    • Lack of updates: The Internet of Things often relies on computers with operating systems that are not updated during their lifetime. As a result, some devices are vulnerable because they use operating systems and software with known vulnerabilities. Good governance includes ensuring that such devices are updated and acquiring only devices that make it easy to perform regular updates.
    • Poor management of the fleet of connected devices: Some companies do not have a clear picture of the Internet of Things deployed in their organization. It is crucial to maintain an inventory of these devices recording their role in the company, the type of information they contain and the parameters that are essential to their security, as illustrated in the sketch after this list.
    • Lack of physical security: Wherever possible, access to these devices should be protected. Too often, devices are left unattended in places where they are accessible to the public. Clear guidelines should be provided to employees to ensure safe practices, especially for equipment that is used on the road.
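    To make the inventory recommendation above more concrete, here is a minimal sketch, in Python, of what a register of connected devices and a basic hygiene check might look like. It is an illustration only: the field names, the 180-day update threshold and the sample device are hypothetical and do not reflect any prescribed standard or methodology.

    # Hypothetical sketch of a connected-device inventory: each record captures the
    # device's role in the business, the type of information it handles and the
    # security parameters that matter (password hygiene, update recency).
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class ConnectedDevice:
        name: str                        # e.g. "Warehouse barcode scanner 01"
        role: str                        # what the device does for the business
        data_types: list[str]            # e.g. ["delivery addresses", "employee IDs"]
        default_password_changed: bool   # was the factory password replaced at setup?
        firmware_version: str
        last_update: date                # date of the most recent firmware/OS update

    def flag_risky_devices(inventory: list[ConnectedDevice], max_age_days: int = 180) -> list[str]:
        """Return warnings for devices that break basic hygiene rules."""
        warnings = []
        today = date.today()
        for device in inventory:
            if not device.default_password_changed:
                warnings.append(f"{device.name}: factory password still in use")
            if (today - device.last_update).days > max_age_days:
                warnings.append(f"{device.name}: no update in more than {max_age_days} days")
        return warnings

    if __name__ == "__main__":
        fleet = [
            ConnectedDevice(
                name="Warehouse barcode scanner 01",
                role="package scanning and delivery confirmation",
                data_types=["delivery addresses"],
                default_password_changed=False,
                firmware_version="2.1.0",
                last_update=date(2023, 1, 15),
            ),
        ]
        for warning in flag_risky_devices(fleet):
            print(warning)

    In practice, such a register would more likely live in an organization’s asset-management tooling; the point is simply that each device’s role, the data it handles and its security parameters are recorded and reviewed on a regular basis.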
    A company’s board of directors plays a key role in cybersecurity. In fact, the failure of directors to monitor risks and to ensure that an adequate system of controls is in place can expose them to liability. Here are some elements of good governance that companies should consider practising:

    • Review the composition of the board of directors and the skills matrix to ensure that the team has the required skills.
    • Provide training to all board members to develop their cyber vigilance and equip them to fulfill their duties as directors.
    • Assess cybersecurity risks, including those associated with connected devices, and establish ways to mitigate those risks.

    The Act to modernize legislative provisions respecting the protection of personal information sets out a number of obligations for the board of directors, including appointing a person in charge of the protection of personal information, having a management plan and maintaining a register of confidentiality incidents. For more information, you can read the following bulletin: Amendments to Privacy Laws: What Businesses Need to Know (lavery.ca).

    Lastly, a company must at all times ensure that the supplier credentials, passwords and authorizations that make it possible for IT staff to respond are not in the hands of a single person or supplier. This would put the company in a vulnerable position if the relationship with that person or supplier were to deteriorate.

    1  See OWASP Top 10.

  1. Lavery assists Agendrix in obtaining two ISO certifications for data security and privacy

    On February 6, 2023, Agendrix, a workforce management software company, announced that it had achieved certification in two globally recognized data security and privacy standards, ISO/IEC 27001:2013 and ISO/IEC 27701:2019. This made it one of the first staff scheduling and time clock software providers in Canada to obtain these certifications. The company is proactively engaging in all matters related to the security and confidentiality of the data processed by its web and mobile applications.

    The ISO/IEC 27001:2013 standard sets out requirements for information security management systems. For Agendrix’s customers, that means its products comply with the highest information security standards. ISO/IEC 27701:2019 provides a framework for the management and handling of personal information and sensitive data. This certification confirms that Agendrix follows best practices and complies with applicable laws.

    A Lavery team composed of Eric Lavallée, Dave Bouchard, Ghiles Helli and Catherine Voyer supported Agendrix in obtaining these two certifications. More specifically, our professionals assisted Agendrix in reviewing its standard customer contract and in implementing the policies and various internal documents essential to the management of personal information and information security.

    Agendrix was founded in 2015, and the Sherbrooke-based company now has over 150,000 users in some 13,000 workplaces. Its personnel management software is a leader in Quebec in the field of work schedule management for small and medium-sized businesses. Agendrix’s mission is to make management more human-centred by developing software that simplifies the lives of front-line employees. Today, the company employs more than 45 people.

  2. Lavery represents ImmunoPrecise Antibodies as it acquires BioStrand

    On March 29, 2022, ImmunoPrecise Antibodies Ltd. (IPA) announced that it had acquired BioStrand BV, BioKey BV and BioClue BV (together, “BioStrand”), a group of Belgian entities that are pioneers in the field of bioinformatics and biotechnology. With this €20 million acquisition, IPA will be able to leverage BioStrand’s revolutionary AI-powered methodology to accelerate the development of therapeutic antibody solutions. In addition to creating synergies with its subsidiaries, IPA expects to develop new markets with this revolutionary technology and strengthen its position as a world leader in biotherapeutics.

    Lavery was privileged to support IPA in this cross-border transaction by providing specialized expertise in cybersecurity, intellectual property, securities, and mergers and acquisitions. The Lavery team was led by Selena Lu (transactional) and included Eric Lavallée (technology and intellectual property), Serge Shahinian (intellectual property), Sébastien Vézina (securities), Catherine Méthot (transactional), Jean-Paul Timothée (securities and transactional), Siddhartha Borissov-Beausoleil (transactional), Mylène Vallières (securities) and Marie-Claude Côté (securities).

    ImmunoPrecise Antibodies Ltd. is a biotherapeutic, innovation-powered company that supports its business partners in their quest to discover and develop novel antibodies against a broad range of target classes and diseases.
