Well, perhaps that’s a slightly exaggerated headline. However, as developments in automation advance rapidly, courts will need to address the consequences of the use of artificial intelligence, from copyright and patent law to privacy, negligence, taxation and administrative law.
Currently, the legal position on automation is a mixed bag. Copyright is unlikely to protect the output of an automated system. Privacy laws are constantly adapting to deal with the immense flow of personal information from large-scale data analysis. Professional services firms, while needing to embrace the possibilities of automated analysis of large volumes of information, also need to be cautious about over-reliance on AI-generated data when providing professional advice.
Two recent Australian cases have provided us with some insight into how the courts may apply existing laws in situations involving automation and AI, in the areas of administrative law and patent law.
Pintarich v Deputy Commissioner of Taxation
Mr Pintarich failed to file tax returns for his income between 2010 and 2013. He received an automated letter in the name of the Deputy Commissioner of the Australian Taxation Office (ATO) stating that if he paid a lump sum by a fixed date, he would not need to pay his general interest charge (GIC) liability of about $335,000.
Not unreasonably, Mr Pintarich relied on the letter: he borrowed money from the bank, paid the lump sum by the due date, and did not pay the GIC.
Subsequently, the ATO notified him that there was an error and he would have to pay the GIC. Mr Pintarich argued that the letter was a binding decision by the Deputy Commissioner.
The ATO argued that the relevant officer had entered information into a program and the letter had been automatically generated. They had no specific explanation for how the wording about the GIC had been included in the letter.
The Court considered that for the letter to be considered a valid decision by the Deputy Commissioner, it required two elements:
1. A mental process of reaching a conclusion; and
2. An objective manifestation of that conclusion.
Given the automated nature of the letter, the majority ruled that there was no mental process involved in reaching the conclusion, and accordingly, the letter was not a valid decision. Unfortunately for Mr Pintarich, the GIC was held to be payable.
The possible consequences of requiring a mental element for an administrative decision are extensive. As automation technologies become more widespread in the public sector, and automated programs begin to replace human mental processes in complex decision-making, to what extent will administrative decision-makers be able to rely on their ‘subjective’ mental processes, rather than their ‘objective’ output, to reverse an automated decision? And how will members of the public know which correspondence they can rely on?
Interestingly, in his dissenting judgment, Justice Kerr highlighted the need for the law to reflect the current technological landscape, saying:
“the expectation that a ‘decision’ will usually involve human mental processes of reaching a conclusion prior to an outcome being expressed by an overt act is being challenged by automated ‘intelligent’ decision making systems that rely on algorithms to process applications and make decisions.”
Rokt Pte Ltd v Commissioner of Patents
In 2013, tech start-up Rokt applied for a patent for a computer-implemented system and method for linking web users to online advertising. The method drew on an understanding of customer psychology, linking engagement information, data-ranking algorithms and real-time data manipulation to present ads to the customers most likely to engage with them. The Commissioner of Patents determined that the invention was not patentable.
Computer-implemented inventions have often failed patentability tests. Among other requirements, a patentable invention must be a ‘manner of manufacture’; an improvement in technology can qualify, but merely using a computer as a tool to implement an existing method cannot. Overturning the Commissioner’s decision, the Federal Court found that Rokt’s development integrated different pre-existing elements in a novel way to solve a technical problem, and so qualified as an improvement in technology.
This decision reflects an understanding that, as technologies advance, the way we apply established legal principles must evolve with them.
While the decision has provided some short-term clarity regarding patentability of computer-implemented inventions, it is now under appeal. By the time we have a decision, I may have been replaced by an article-writing robot …
If you have any questions about legal protection for your AI developments, or privacy requirements for large-scale data handling, contact us before it’s too late!
Author: Blake Motbey, Pararobot