How do intellectual property rights apply to AI?

TLT Solicitors | Daniel Lloyd | Sept 2019


Who is liable when AI goes wrong?

Let us take the example of Tesla, whose vehicles have been involved in two similar fatal crashes since 2016. In both cases the vehicle failed to see a lorry crossing its path and drove into it, shearing off the top of the car and fatally injuring the driver. Should Tesla be liable for the crash? At what point should a driver no longer have any liability for what the car is doing?

At the moment the US Department of Transportation adheres to the automation standards set out by the SAE, which run from “level 0” (no automation) to “level 5” (full automation). It is accepted that Tesla’s Autopilot system is no more than level 2 or 3 on this scale, both of which require the driver to remain in control of the vehicle when driving. So from a public law perspective at least, Tesla is not being held liable for the two crashes if, as appears to be the case, the drivers were not in control of the vehicles at the time they crashed: it is accepted that the drivers should have been in control. However, Tesla is continuing to push the envelope and has promised to produce full driverless automation within the next year. As and when that happens, new liability questions will be posed.


With fully driverless cars operating at level 5 on the SAE scale, there will be an argument that certain notions of personhood or human agency should be ascribed to the driving software, once it is truly driverless, in its capacity as the entity responsible for the vehicle on the road. But what happens if there is a crash and the driverless software is at fault? Back in the legal world, such disputes will have to be resolved by reference to legal persons. As the driverless software will have no legal personhood beyond its ability to drive a car, the law will have to find another way of resolving justiciable disputes, because the software will not by itself be capable of providing a remedy recognisable in law.

So for the time being at least, one has to assume that, from a practical perspective, whoever owns the software will also be liable if it, in its AI capacity, causes legal harms that require a remedy.

Breaking AI down into separate parts is difficult in practice. For the purposes of this article we have used the following split (illustrated in the code sketch after the list):

  • the untrained algorithm is the raw software;
  • data is then used to train the algorithm;
  • machine learning is the ability for the algorithm to learn by itself, or where the machine, rather than a human, writes the 'code'. In its simplest form, again this is just software. It is fiendishly clever software – but it is just software;
  • the trained model is the untrained algorithm with the new 'code' written by the machine. Again this is software.
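To make this split more concrete, the following is a minimal sketch in Python. It uses scikit-learn purely as an illustrative toolkit; the library, the toy data and the variable names are our own assumptions rather than anything drawn from a particular AI product.

```python
# Minimal sketch of the four-part split described above (assumed example,
# using scikit-learn purely for illustration).
import numpy as np
from sklearn.linear_model import LogisticRegression

# 1. The untrained algorithm: raw software, written entirely by humans.
untrained_algorithm = LogisticRegression()

# 2. Data used to train the algorithm (toy, made-up values).
training_data = np.array([[0.1, 1.0], [0.9, 0.2], [0.4, 0.8], [0.8, 0.1]])
training_labels = np.array([0, 1, 0, 1])

# 3. Machine learning: the fitting step, in which the machine itself derives
#    new parameters (the "code" written by the machine rather than a human).
untrained_algorithm.fit(training_data, training_labels)

# 4. The trained model: the original algorithm plus the machine-written
#    parameters it learned from the data. This is still just software.
trained_model = untrained_algorithm
print(trained_model.coef_, trained_model.intercept_)
```

On this view, the copyright question discussed below attaches most sharply to the learned parameters produced at step 3, since that is the part no human wrote directly.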


Data element

Under English law, no one can own data. There are no rights in data, only rights in relation to data. That is, the data in question must fall into a specific class to be protected – such as confidential information, personal data or database rights. There is no general ownership of data. Therefore, to protect data it must be practically and contractually controlled.

Therefore, from a legal point of view, all rights and requirements must be explicitly set out in the contract – including usage, return and deletion requirements. If any areas are not covered, then you will not be protected.

Software elements

This covers the untrained algorithm, the machine learning and the trained model. Copyright is the main form of IP protection for software under English law, and it applies to AI software in the same way it applies to any other form of software.

The key issue here is: who owns the trained model? Although the trained model will be protected by copyright, it is not clear who owns those rights, as the model has been created in part by the machine itself, not by a human.

The same difficulty arises for works created by AI.
