A bill of rights for the age of technology


The Australian Human Rights Commission is currently working on a project to 'explore the challenges and opportunities that technology poses to our human rights'. Its focus includes responsible innovation, decision-making and artificial intelligence, and accessible technology.

Abstract humans on white background (Credit: Dmitrri, iStock / Getty Images Plus)

These questions arise against a backdrop of government incursions into human rights through technology, such as the 'robodebt' saga, the NSW Suspect Target Management Plan, and a raft of Commonwealth legislation expanding Australia's surveillance architecture.

The Human Rights Commission Issues Paper poses a number of questions, opening with: 'What types of technology raise particular human rights concerns? Which human rights are particularly implicated?' These questions in particular are difficult to answer, for three main reasons. First, it is not technology itself that raises human rights concerns. Secondly, technology is everywhere and therefore hard to pin down. Thirdly, the questions presuppose that Australia has a human rights regime.

Technology is just a tool

While particular technologies may affect particular human rights, it is the design or deployment of technology, not the technology per se, that has the impact. Artificial intelligence or robots or big data do not infringe human rights simply because we know how to write code, or design machines in particular ways. It is only once these technologies are applied within a context that there is potential for adverse human rights outcomes.

Technology may of course be designed to meet a specific, ethically dubious outcome. Killer robots, for example, raise very obvious questions. But that does not put robotics itself in breach of human rights. Other technologies, such as machine learning algorithms that sort through job applications, may unintentionally affect human rights if the authors of the code have not considered inbuilt bias. But machine learning is not intrinsically in breach of human rights, nor is an automated job application process. Designers must simply be aware of the potential for bad outcomes.

The answer to the question of what sorts of technology raise human rights concerns is both all technology and none. All may affect human rights, but none does necessarily.

Technology is everywhere

Related to the first point is the fact that we are surrounded by technology. If we were to regulate technology, or its design or application, just what would we be regulating? If we pass a law today, the technology that is the subject of the law will likely change within a year or two. The law would forever be playing catch-up with the latest developments.




On the other hand, attempts to regulate 'technology' in the name of human rights would be meaningless, as the term is simply too broad.

A bill of rights

Talk of 'human rights' in Australia is itself fairly abstract, relying on international standards, many of which do not form part of the Australian legal landscape. Indeed, Australians enjoy very little legal assurance of human rights relative to other countries. We do have anti-discrimination legislation at Commonwealth and state levels, and some human rights exist at common law, but we have no nationally entrenched human rights protection. Discussion about whether technologies (or their application) affect human rights is therefore based only on the relatively slim protections we have under legislation.

The lack of comprehensive human rights protection in Australia — through a bill of rights — is really the foundational problem exposed by the Issues Paper. There may be some types of technological applications properly the subject of legislation in the same way that, say, guns or the sale of motor vehicles are regulated. However, in general, it is the assertion of human rights vested in each person that would provide the measure of whether the application of technology upholds or erodes those human rights.

A robust human rights framework would also hold government to account in its own deployment of technology (such as 'robodebt'). It would provide protections, too, against the government's increasing attempts to control data through legislation, where data is collected and deployed using diverse technologies.

Recently the Parliamentary Joint Committee on Intelligence and Security heard submissions concerning the benignly named Assistance and Access Bill. The bill empowers security agencies to require a 'communications provider' to decrypt encrypted information. The scope and application of the bill are extremely wide: although the government has cited terrorism as the reason for needing enhanced powers, those powers can be used for all sorts of other crimes, and even to protect national economic interests.

In more than 200 written submissions to the Committee, there was clear concern about the effect of the bill on the overall security of online information. But as with other legislation, including the mandatory data retention regime and the biometric data sharing program, the framing of such regulation has implications for human rights.

Increasing government surveillance capability through legislation may not be the 'technology' contemplated by the Human Rights Commission in its Issues Paper. Yet the systems being put in place both build technological capability and are reliant on it. These systems are a potential threat to human rights, deserving of close attention.

Despite the timely and important discussions that will flow from the Human Rights Commission process, Australia sorely needs a bill of rights to contain the exercise of government power concerning technology.


Submissions responding to the Issues Paper were due last month, and we can expect to see a discussion paper in early 2019 followed by further consultations.


Kate Galloway is a legal academic with an interest in social justice.








Existing comments

The new technologies to which you refer are only the latest in a long procession of technologies that can be used for good or ill. The risks associated with these new technologies are not new risks but are the same as those that have always existed, that is, the risk of abuse in the hands of the powerful. The greatest risk is associated with the greatest power, that is, with the state. This is why the rights of citizens need to be enshrined in the Constitution, not just in legislation enacted by Parliament, and that protection needs to be focused on shielding citizens from abuse by the state. This is why a 'Bill of Rights' needs to focus on freedom of thought, movement, expression and association; and freedom from arbitrary detention, eavesdropping, and harassment.

Ginger Meggs | 01 November 2018  

The topic of maintaining human rights with advancing technology is a very important one. There is the issue of technologies designed to kill or injure people, as Kate has mentioned, but already many millions of people around the world have had their human right to maintain good health eroded because of pollution generated in the manufacture of the technologies we use and the inappropriate and ineffective management of toxic wastes. Governments have been very slow to introduce laws to protect people by ensuring there are safe work environments and systems of work, and that there are safe standards for recycling, disposal and storage of unwanted wastes. People should have a basic human right to live in a healthy and safe environment where they will not become sick because of the criminal negligence of manufacturers. People also have the right to know when the soil or ground water where they live has been contaminated due to industrial activity. Such contamination leads to the presence of toxins, nano and plastic particles in our food.

Technology can present not only hazards but also other social problems. For example, since the Industrial Revolution, people have been concerned about machinery that will put huge numbers of people out of work. This has been particularly so since robots have begun replacing workers in large numbers in more recent times.

I think it is very shameful that Australia does not have a declaration of human rights to provide a framework for the protection of human rights for all citizens. Some years ago during a public forum, I suggested to a former commissioner for human rights that we needed to have such a document and that it needed to include reference to worker OH&S and environmental health as well as the other more traditional human rights issues, eg gender, gender preference, ethnicity, life philosophy, etc. Not a lot has happened.

Andrew (Andy) Alcock | 01 November 2018  

