The consultation represents a significant step in how Australia plans to regulate the rapidly growing use of AI technologies in relation to the supply of goods and services. The outcome of the Australian review could also potentially influence the regulatory response to AI in New Zealand, given the close parallels between Australian and New Zealand consumer laws. New Zealand businesses, especially those involved in AI-enabled provision of goods and services, should take note of the potential changes and consider the implications for their current practices and any AI initiatives in development.
Summary of Australian proposals
The Treasury’s discussion paper seeks submissions on various questions relating to the Australian Consumer Law (ACL), which is technology-neutral, and its suitability for addressing the novel challenges posed by AI. The paper builds on a separate consultation (“Safe and Responsible AI in Australia”) published in June 2023 and complements other recent proposals, announced in September 2024, to introduce certain mandatory guardrails for AI in high-risk settings.
The Treasury’s paper recognises that while AI offers considerable benefits, such as personalised consumer experiences, it also introduces new risks that raise questions about the adequacy of the current regime. In summary, the paper seeks input on the following key issues:
- Managing risks to realise benefits
The paper notes that the opacity of AI systems and the potential difficulty in predicting AI system behaviour may increase the risk of false or misleading representations about AI-enabled goods and services. The paper suggests that many businesses (particularly smaller or less sophisticated businesses integrating off-the-shelf AI systems) may struggle to reliably ensure that representations about the capabilities of those functions are not false or misleading. It also notes that inaccuracies from AI models could result in unwanted bias and misleading or entirely erroneous outputs (or ‘hallucinations’).
This issue will also be relevant to many New Zealand businesses. In some cases, reliance on AI systems could amplify the risk of misleading or deceptive conduct under the Fair Trading Act 1986 (FTA), in particular where AI-enabled products, such as chatbots or recommendation systems, generate inaccurate or biased information. Given that errors in those systems could impair a large number of customer interactions (as opposed to an isolated set of interactions overseen by a single human agent), consumer-facing businesses should consider additional safeguards. Ensuring that they have a clear understanding of the relevant systems and any applicable limitations, and providing accurate and transparent descriptions of AI capabilities to customers, should reduce the risk of inadvertently breaching the FTA.
- Addressing uncertainty
The paper considers whether the ACL adequately covers AI-enabled goods and services under existing rules for product safety, consumer guarantees, and misleading conduct. In particular, the paper raises questions about how the ACL’s statutory guarantees – such as requirements that products are of “acceptable quality” and “fit for purpose” – apply to goods that evolve over time, such as AI systems that can learn and adapt after purchase. The paper notes similar uncertainty as to whether certain AI systems should be classified as “goods” or “services” under the ACL.
This will be relevant to many New Zealand suppliers and manufacturers, given that the Consumer Guarantees Act 1993 (CGA) provides similar implied guarantees. The application of those guarantees to systems that self-train and evolve over time, and the classification of certain AI-enabled systems as either “goods” or “services”, remain unclear under the CGA.
- Accessing remedies
The Treasury’s paper is also seeking feedback on whether the remedies available under the ACL, such as repairs, replacements, and refunds, are fit for purpose when applied to AI-enabled goods. The paper asks whether additional remedies should be developed to address the AI context. The paper gives an example of a defective “AI-enabled vacuum cleaner” controlled by speech recognition, which malfunctions and causes damage. The paper notes the challenge consumers may face of establishing a breach of the statutory guarantees, given the difficulty of accessing or analysing the manufacturer’s training of the relevant AI system. The paper also highlights the challenges of allocating liability between AI developers, manufacturers, and suppliers, especially when AI systems make autonomous decisions that deviate from a product’s intended function.
In New Zealand, consumers have similar rights to repairs, replacements, and refunds under the CGA, the application of which may be complex in the context of AI-enabled products or services. New Zealand law may similarly require clarification or further regulatory guidance to ensure the appropriate allocation of liability between AI developers, manufacturers, and suppliers.
Implications for New Zealand businesses
New Zealand’s FTA and CGA share many principles with the ACL, particularly in relation to issues such as misleading conduct, unfair contract terms, and product safety. As the Australian review progresses, any changes to the ACL could influence the approach taken by regulators in New Zealand, who may view these developments as a benchmark for how AI-related risks should be managed under New Zealand law. The reforms may also be of direct relevance to New Zealand businesses that supply AI-enabled products or services in Australia.
As such, we expect that New Zealand businesses involved in AI-enabled products and services will want to stay closely informed about the ongoing consultation and any resulting regulatory changes. In the meantime, if your business is involved in the development or use of AI in a consumer context, it will be important to establish strong internal governance around AI use to ensure compliance with New Zealand’s current consumer laws.
If you require any support with establishing those measures or have any questions about the matters raised in this article, please get in touch with the contacts listed or your usual Bell Gully adviser.