
Computers and Law: Journal for the Australian and New Zealand Societies for Computers and the Law



Barreto, Katherine; Hards, Shaun; Leiman, Tania --- "Automated and algorithmic decision making, data driven inferencing and consumer protection" [2021] ANZCompuLawJl 3; (2021) 93 Computers & Law 5

Automated and algorithmic decision making, data driven inferencing and consumer protection

Katherine Barreto [1], Shaun Hards [2], and Tania Leiman [3]

20 November 2020

Consumer transactions are changing

Technology is changing the nature of consumer transactions. Goods and services are increasingly bought and sold online, altering the interactions and relationships between consumers and manufacturers of goods and services. Businesses are inundated with, and generate, large amounts of data each day. Automated or algorithmic decision making (ADM) and data driven inferencing (DDI) are claimed to improve efficiency, personalise product and service provision, and streamline application, payment and complaints processes. Purported benefits include identifying and predicting trends and demand, enabling agile decisions about price, product development and marketing, and allowing consumers to personalise and streamline choices or purchases, but these technologies also pose challenges.[4] Consumers may be completely unaware that their data is being mined to shape the products and services offered to them.[5] This invisibility, information asymmetry and opacity[6] as to process or outcomes makes consumers vulnerable to exploitation by unfair practices, and presents new challenges for consumer protection.[7] While use of algorithms presents opportunities for consumer organisations to increase awareness of protections,[8] others argue a ‘new consumer protection ecosystem’[9] is required. Australia’s Consumer Data Right,[10] effective February 2020 and applying initially only to the banking sector, is a first step, but broader protections are required.

When ADM or DDI generates suggestions or options, businesses, suppliers or consumers may simply accept these without seeking alternative or contradictory information, or be reluctant to decide against ADM/DDI-suggested outcomes.[11] Consumers may be unaware they have been subjected to ADM/DDI; unable to identify the processes resulting in decisions about them or predicting their future behaviour; and/or unable to access avenues to correct data inputs or seek review or appeal.[12] The identity of the person or entity with legal responsibility for the ADM/DDI may be inaccessible, or the intended purposes of ADM/DDI processes may be opaque. When charges of bias in algorithmic processes have been made, owners or programmers of algorithms may ‘insist no one at their firm intended to discriminate’,[13] and if accountability for ADM/DDI cannot be apportioned to any specific person or entity, intent (if required) will be difficult to prove.

While no national or international consensus for an ethical framework to accommodate emerging technology such as ADM/DDI yet exists,[14] recurring themes emerge, as exemplified by the OECD Principles on AI:[15] ‘inclusive growth, sustainable development and well-being; human-centred values and fairness; transparency and explainability; robustness, security and safety; accountability and responsibility.’[16] These are a useful starting point for assessing the impact of ADM/DDI on consumer welfare and rights, particularly in consumer-facing fora or when providing goods and services to consumers. Consideration must be given to whether, and if so how, participants in the ADM process (entities that commission and use it, creators, manufacturers, operators, and maintenance personnel) should be held accountable for adverse outcomes or effects.[17] Review of existing blanket disclaimers of liability for software[18] will be essential, especially where outputs could be harmful or potentially fatal.[19]

The GDPR Experience

The EU General Data Protection Regulation (GDPR) provides:[20]

• ‘Where personal data are processed for direct marketing purposes, the data subject shall have the right to object at any time to processing of personal data concerning him or her for such marketing, which includes profiling to the extent that it is related to such direct marketing.’[21]

• ‘Where the data subject objects to processing for direct marketing purposes, the personal data shall no longer be processed for such purposes.’[22]

• ‘The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.’[23]

Consumer responses to the GDPR’s introduction indicated most did not strongly object to their data being processed to provide direct marketing more tailored to their interests.[24] For more important decisions (e.g. eligibility for mortgages and student loans), algorithms were accepted ‘as a component of the decision making process,’ but consumers ‘felt that a human should be involved in the final decision.’[25] Further, consumers wanted transparency where ADM was used and considered it important that a right to appeal ADM outcomes was available.[26] A GDPR-inspired legal framework has been advocated in the United States as a ‘model for consumer credit reform,’[27] where it may address the erosion of consumer protections arising from the use of big data and machine learning in consumer access to credit. No protection similar to that set out in the GDPR above is yet available in Australia.

Australian Consumer Law

There is ‘limited legislative or judicial analysis’ to date of any ‘disconnections’ between existing Australian Consumer Law (ACL)[28] protections and ADM/DDI.[29] The ACL’s protections are intended ‘[t]o improve consumer wellbeing through consumer empowerment and protection, to foster effective competition and to enable the confident participation of consumers in markets in which both consumers and suppliers trade fairly.’[30] The role and importance of technology in consumer-manufacturer dealings has increased rapidly since the ACL commenced in 2010, with significant consequences for both consumers and manufacturers. However, applying ACL protections to ADM and DDI can be problematic because these technologies, along with artificial intelligence (AI)[31] generally, are an ‘evolving concept’ and ‘most AI innovation is being led by corporate research and development processes’ within which there may be ‘little regard to societal good or the deeper implications of AI innovation.’[32]

Depending on context and the outcome of its use, ADM/DDI might be a ‘person,’ a ‘good’ or a ‘service,’ ‘conduct,’ a ‘product,’ a ‘thing’, or a ‘manufacturer’. Ensuring existing ACL definitions and provisions adequately respond to increasing use of ADM/DDI is thus crucial to protect consumers.


ACL prohibitions against ‘misleading or deceptive conduct’,[33] ‘unconscionable conduct’,[34] and unfair practices including making false and misleading representations,[35] unsolicited supplies,[36] ‘participation in pyramid schemes’[37] and ‘multiple pricing’[38] all refer to conduct of a ‘person’. ‘Person’ includes ‘a body politic or corporate as well as an individual’.[39] ‘Individual’ means a natural person.[40] How should these definitions apply to outcomes produced by ADM or DDI? In Google Inc v ACCC, although the sponsored advertisements were misleading and deceptive,[41] the High Court unanimously held that ‘Google did not itself engage in misleading or deceptive conduct, or endorse or adopt the representations which it displayed on behalf of advertisers’,[42] acting merely as a ‘conduit’ for those representations.[43] US courts have concluded ‘that algorithmic mediation doesn’t make an intermediary a publisher of other people’s speech’, enabling ‘online intermediaries’ to protect themselves from direct liability to consumers.[44] Deep fakes are created by Generative Adversarial Networks,[45] not humans, and have considerable potential to mislead or deceive.[46] However, unless a ‘person’ has made these false or misleading representations, the ACL may not apply, reflecting the observation that ‘regulation is more likely to target social conduct than technology itself.’[47]

This illustrates the ‘substitution effect’ problem – the effect on society when robots, AI agents, and algorithms act as substitutes for human beings.[48] Proprietary, commercial-in-confidence, ‘black box’ algorithms use undisclosed inputs and data sets,[49] preventing consumers from assessing the accuracy, validity and reliability of data sets and outcomes, and the purpose for which the ADM/DDI is carried out.[50] Is this lack of transparency deceptive in and of itself?

One proposed solution is not to ‘create a set of legal obligations for algorithms or robots,’ but rather to build on ‘our centuries-long experience with regulating persons,’ and to monitor and regulate the creation and coding processes of the owners and programmers of artificial intelligence.[51] Attributing ultimate responsibility for ADM/DDI to a human person, or to an entity which meets the definition of a ‘person’ for the purposes of the Acts Interpretation Act, aligns with support in the literature for the concept that ‘[a] robot must always indicate the identity of its creator, controller, or owner’ as a means of enabling viable regulation.[52] Perhaps not until ADM/DDI has a legal ‘capacity to act’ can it be regulated, because regulation effectively serves as ‘the intentional attempt to influence the behaviour of people (or other entities with a [legal] capacity to act).’[53]

In its current forms (and short of artificial general intelligence[54]), ADM/DDI cannot be said to be truly autonomous, with some arguing ‘the idea of granting legal personhood to or endowing [ADM/DDI] with any rights and/or obligations should be abandoned.’[55] Instead, to avoid doubt regarding accountability and to increase consumer protection, manufacturers, suppliers or others using ADM/DDI in consumer transactions could be required to nominate an ADM Entity responsible for decisions made or actions taken, adopting an approach suggested by the National Transport Commission in relation to automated vehicles.[56]

‘Goods’ or ‘service’

The ACL sets out various statutory guarantees in relation to consumer transactions. Nine relate to the supply of goods,[57] and three relate to the supply of services.[58] The ACL definition of ‘goods’ is not exhaustive and includes ‘electricity’, ‘computer software’ and ‘any component part of, or accessory to, goods.’[59] The ACL definition of ‘services’ is expansive, covering ‘any rights (including rights in relation to, and interests in, real or personal property), benefits, privileges or facilities that are, or are to be, provided, granted or conferred in trade or commerce’, with six further examples included.[60] In an increasingly online consumer environment, these traditional definitional boundaries make it harder for consumers to use the ACL to protect their rights or access remedies.

Smart devices (‘any physical entity capable of connectivity that has a direct interface to the physical world’[61]) might come within the ACL’s definition of ‘goods’,[62] but could also be characterised as a ‘service’[63] as they use data from consumer preferences to tailor outputs to those consumers, with internet connectivity available via subscription models. Devices with embedded 4G or 5G internet connection via a subscription-based service might be a ‘service’, but devices that rely solely on WiFi connectivity to operate online might be characterised solely as ‘goods’. Technologies like Apple’s Siri, Google Home and Amazon’s Alexa and Echo[64] usually comprise a device or other hardware installed by the user in their home, with the system functioning through a subscription service using artificial intelligence, while ‘the “brain” of the assistant is in the “cloud”’.[65] These ‘home assistants’ present risks regarding privacy, security and storage of consumers’ personal data, e.g. detailed personal information including the interests of the user and even their sleeping and waking patterns.[66]

While algorithmic decision-making and data analysis might be ‘computer software’,[67] and thus ‘goods’, this appears to depend on the manner of supply. If, as in Gammasonics,[68] ‘goods’ require a physical transfer, then software would require a physical medium like a DVD or USB, and transfer over the internet would not be sufficient. But Gammasonics was decided before the ubiquity of ‘cloud computing’ and the notion of ‘software as a service’. More recently, ACCC v Valve classified as ‘goods’ the executable data supplied, execution of which triggers the hardware to do something (play music or display a graphic),[69] but not other non-executable data.[70] If a ‘service’ requires the supply of rights, benefits, privileges or facilities that are provided, granted or conferred, presumably this includes the results of analysis of data or of ADM/DDI.


To ensure consumers can be protected, different terminology is required as technology becomes more sophisticated.[71] ‘[L]egal and economic constructs based on the idea of “markets” — whether in goods and services or in speech and ideas — have yet to adapt in response’ to increasing economic activity on platforms.[72] New forms which hold value may not fit within existing legal frameworks. Increasingly, consumer welfare research ‘demonstrates a need to consider recasting the concept of product to reflect the frequent inextricable mixture of hardware, software, data and service.’[73] The public interest is best served where regulations are outcome-focused and technologically neutral.[74]

The ACL must address the impacts of ADM and DDI on consumers and include clear responsibility and accountability mechanisms. This may require expanding existing threshold definitions to ensure they are broad enough to accommodate the diverse range of ADM and DDI functions that exist now and may exist in the future.

[1] B.Com, B.Ec, Juris Doctor student, Flinders University.

[2] B. Com, Juris Doctor student, Flinders University.

[3] LLB, GDLP, GCE(HE), Prof Cert (Innovation), Associate Professor and Dean of Law, Flinders University.

[4] Kayleen Manwaring, ‘Emerging Information Technologies: Challenges for Consumers’ (2017) 17 Oxford University Commonwealth Law Journal 265.

[5] See generally Shoshana Zuboff, The Age of Surveillance Capitalism. The Fight For a Human Future at the New Frontier of Power (Profile Books, 2019); Nick Srnicek, Platform Capitalism (Polity Press, 2017); Julie E. Cohen, Between Truth and Power: The Legal Constructions of Informational Capitalism (Oxford University Press, 2019).

[6] Rebecca A. Williams, ‘Rethinking Deference for Algorithmic Decision-Making’ (Research Paper No 7, Oxford Legal Studies, August 2018).

[7] Manwaring (n 4).

[8] Consumers International, ‘AI for Consumers: Five Things We Learnt at the Euroconsumers Event on Artificial Intelligence’, Blog (Blog Post, 5 July 2018) <>.

[9] J. Nathan Mathias, ‘Algorithmic Consumer Protection’, Medium (Blog Post, 10 October 2017) <>.

[10] ‘Consumer data right (CDR)’, Australian Competition & Consumer Commission (Web Page) <>.

[11] Mary Cummings, ‘Automation Bias in Intelligent Time Critical Decision Support Systems’ (Conference Paper, Collection of Technical Papers Aiaa 1st Intelligent Systems Technical Conference, 1 December 2004).

[12] Ibid.

[13] Frank Pasquale, ‘Toward a Fourth Law of Robotics: Preserving Attribution, Responsibility, and Explainability in an Algorithmic Society’ (2017) 78(5) Ohio State Law Journal 1243, 1247.

[14] Victorian Information Commissioner, ‘Closer to the Machine: Technical, Social and Legal Aspects of AI’ (August 2019) 135.

[15] ‘OECD Principles on AI’, Organisation for Economic Co-operation and Development (Web Page) <>.

[16] Victorian Information Commissioner (n 14) 134.

[17] Ibid.

[18] Informatics Europe and Association for Computing Machinery Europe Policy Committee, ‘When Computers Decide: European Recommendations on Machine-Learned Automated Decision Making’ (2018) 12.

[19] Ibid 13.

[20] General Data Protection Regulation, European Parliament and Council of European Union (2016) Regulation (EU) 2016/679 (GDPR).

[21] Ibid Art 21(2).

[22] Ibid Art 21(3).

[23] Ibid Art 22(1).

[24] Wanda Presthus and Hanna Sorum, ‘Consumer Perspectives on Information Privacy Following the Implementation of the GDPR’ (2019) 7(3) International Journal of Information Systems and Project Management 19, 28.

[25] Ibid 29.

[26] Ibid.

[27] Vlad A. Hertza, ‘Fighting Unfair Classifications in Credit Reporting: Should the United States Adopt GDPR-Inspired Rights in Regulating Consumer Credit’ (2018) 93 New York University Law Review 1707.

[28] Competition and Consumer Act 2010 (Cth) sch 2 (ACL).

[29] Manwaring (n 4) 266.

[30] Council of Australian Governments, Intergovernmental Agreement for the Australian Consumer Law, para C.

[31] Tania Sourdin, ‘Judge v Robot?: Artificial intelligence and judicial decision-making’ [2018] UNSWLawJl 38; (2018) 41(4) The University of New South Wales Law Journal 1114, 1116. Sourdin applies the term ‘AI’ as ‘an umbrella term which encompasses many branches of science and technology and will often involve the creation of complex algorithms to enable outcomes to be determined. AI can include machine learning, natural language processing, expert systems, vision, speech, planning and robotics.’

[32] Ibid.

[33] ACL (n 28) s 18.

[34] Ibid ss 20, 21.

[35] Ibid ss 29-37.

[36] Ibid ss 39-43.

[37] Ibid s 44.

[38] Ibid s 47.

[39] Acts Interpretation Act 1901 (Cth) s 2C(1).

[40] Ibid s 2B.

[41] Google Inc v ACCC (2013) 249 CLR 435.

[42] Google Inc v ACCC [2013] HCA 1, 73.

[43] Ibid 55.

[44] Julie Cohen, ‘Law for the Platform Economy’ (2017) 51 UC Davis Law Review 133, 164.

[45] Jordan Pearson, ‘Watching a Deepfake Being Made Is Boring, And You Must See It’ (Web Page) <>; Anthony Stoks & Tania Leiman, ‘Prohibiting Impersonation of Police in an Era of Deep Fakes?’ (2020) 42(8) Law Society Bulletin 23; Karras et al and Nvidia, ‘Don't Panic. Learn How it Works’ (Web Page) <>.

[46] And potentially breach ACL s 18 and s 29.

[47] Lyria Bennett Moses, ‘How to Think About Law, Regulation and Technology: Problems with 'Technology' as a Regulatory Target’ (2013) 5(1) Law, Innovation and Technology 1, 5.

[48] Jack Balkin, ‘The Three Laws of Robotics in the Age of Big Data’ (2017) 78 Ohio State Law Journal, 14.

[49] Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Harvard University Press, 2015).

[50] See for example ACCC v Trivago NV [2020] FCA 16, 15; State v Loomis, 881 NW2d 749, 762 (Wis, 2016).

[51] Pasquale (n 13) 1252.

[52] Ibid 1253.

[53] Ronald Leenes et al, ‘Regulatory Challenges of Robotics: Some Guidelines for Addressing Legal and Ethical Issues’ (2017) 9(1) Law, Innovation and Technology 1, 6.

[54] “Artificial general intelligence” refers to systems with non-specialised intelligence capabilities. See Pei Wang and Ben Goertzel, Introduction: Aspects of Artificial General Intelligence (IOS Press, 2007).

[55] Eliza Mik, ‘AI as a Legal Person?’ in Reto Hilty and Kung-Chung Liu (eds), Artificial Intelligence and Intellectual Property (Oxford University Press, 2021).

[56] National Transport Commission, ‘Review of Guidelines for Trials of Automated Vehicles in Australia’ (Discussion Paper, May 2020) 13.

[57] ACL (n 28) s 51 Guarantee as to title; s 52 Guarantee as to undisturbed possession; s 53 Guarantee as to undisclosed securities etc; s 54 Guarantee as to acceptable quality; s 55 Guarantee as to fitness for any disclosed purpose etc; s 56 Guarantee relating to the supply of goods by description; s 57 Guarantees relating to the supply of goods by sample or demonstration model; s 58 Guarantee as to repairs and spare parts.

[58] Ibid s 60 Guarantee as to due care and skill; s 61 Guarantee as to fitness for a particular purpose etc; s 62 Guarantee as to reasonable time for supply.

[59] Ibid s 2.

[60] Ibid.

[61] Guido Noto La Diega, ‘Clouds of Things: Data Protection and Consumer Law at the Intersection of Cloud Computing and the Internet of Things in the United Kingdom’ (2016) 9 Journal of Law and Economic Regulation 69, 71.

[62] ACL (n 28) s 2.

[63] Lee Bygrave and Dan Svantesson, ‘Jurisdictional Issues and Consumer Protection in Cyberspace: The View from ‘Down Under’ (Cyberspace Law & Policy Series, 24 May 2001).

[64] Amazon ‘Meet Alexa’ (Web Page) <>.

[65] Hyunji Chung et al, ‘Alexa, Can I Trust You?’ (2017) 50(9) Computer 100.

[66] Hyunji Chung and Sangjin Lee, ‘Intelligent Virtual Assistant Knows Your Life’ (2018) Computing Research Repository <>.

[67] ACL (n 28) s 2.

[68] Gammasonics Institute for Medical Research Pty Ltd v Comrad Medical Systems Pty Ltd [2010] NSWSC 267.

[69] Australian Competition and Consumer Commission (ACCC) v Valve Corp (No 3) [2016] FCA 196; (2016) 337 ALR 647 (Valve Corp).

[70] Ibid.

[71] Victorian Information Commissioner (n 14) 124.

[72] Cohen (n 44) 137.

[73] Guido Noto La Diega and Ian Walden, ‘Contracting for the ‘Internet of Things’: Looking into the Nest’ (Research Paper No 219, Queen Mary School of Law Legal Studies, February 2016) 28.

[74] NSW Treasury, ‘Regulating for NSW’s Future’ (July 2020).
