Advokatens Rådgivaransvar vid Användandet av Artificiell Intelligens i Samband med Due Diligence

This is a thesis for a professional degree at the advanced level from Lunds universitet/Juridiska institutionen; Lunds universitet/Juridiska fakulteten.

Abstract: Lawyers have long sought ways to streamline their work and best serve the client's interests. Artificial intelligence has emerged as the most lauded technique for achieving significant efficiencies in a lawyer's duties. Document-review AI tools with greater precision in legal problem solving than most lawyers are now available to some practitioners. These tools categorize legal issues very quickly and are therefore used where the volume of documents is large. One such area is due diligence, hereinafter referred to as DD, in private company acquisitions, where AI tools are used to examine large volumes of documents and flag potential risks in the company that is about to be acquired.

It is conceivable that lawyers using these systems make mistakes in their advice that cause the client to lose money. In such situations, the so-called adviser's liability may come into play. The lawyer cannot then reasonably defend himself by pointing to shortcomings in the tool he has used, but may instead be held responsible for the result of the work performed. The assessment in these situations ultimately turns on whether the lawyer's choice of working method can be said to be negligent. The lawyer's care in using AI must then be assessed, including how the initial data was entered into the AI tool, whether the lawyer fulfilled the pedagogical duty to explain the use of the system to the client to a sufficient extent, and to what extent the system was actually used.

The lawyer's responsibility is based mainly on which method he has used and whether that method can be said to be acceptable. The lawyer cannot, as a general rule, be said to be responsible for achieving a particular result. The fact that the use of AI is not an ordinary working method in legal practice does not, in itself, mean that such use is negligent. Instead, an assessment must be made in each individual case as to whether the use was negligent or not.
