Key Privacy Reforms – Automated Decision Making and Overseas Data Flows

  • Market Insight 09 October 2024
  • Asia Pacific

  • Regulatory risk

In this second of our deeper dives into key areas of the Privacy and Other Legislation Amendment Bill 2024 (Bill), we examine (a) the notification requirements for automated decision making and (b) the improvements relating to overseas disclosures under APP 8.

Our general overview of the key aspects of the Bill can be found here and our first deeper dive on the statutory right of action for serious invasions of privacy can be found here.

What is it?

Automated Decision Making

Automated decision making occurs when an individual’s personal information (often together with information collected from other sources and ‘inferred’ information) is processed by a computer program (usually including algorithms), sometimes with no human involvement in the process, for the purpose of either making a decision, or assisting a human to make a decision, which impacts that individual (ADM). Once passed, the Bill in its current form will require privacy policies to include information detailing an organisation’s use of ADM, both where ADM is used to (a) make decisions without human involvement and (b) assist a human to make a decision.

As many will be aware, automated decision making has been specifically addressed and regulated in Article 22 of the GDPR since the GDPR became effective on 25 May 2018. While the GDPR gives data subjects a right not to be subject to a decision based solely on automated processing, the Bill includes a more modest provision targeting transparency: it requires notification of the use of ADM in the organisation’s privacy policy, rather than granting individuals a right to elect not to be subject to a decision based solely on automated decision making.

Once the Bill is legislated, privacy policies must state: (a) the kinds of personal information used in the operation of ADM; (b) the kinds of decisions which are made solely by ADM; and (c) the kinds of decisions for which ADM is used to do something substantially and directly related to the making of the decision by a human (collectively, ADM Information). The new APP 1.7, once legislated, requires the ADM Information to be included in all privacy policies where:

  1. an organisation has arranged for a computer program to make a decision or do anything that is substantially and directly related to decision making (i.e. ADM);
     
  2. that decision could reasonably be expected to significantly affect the rights or interests of an individual; and
     
  3. personal information about an individual is used in the operation of the computer program making or assisting this decision (i.e. with the ADM). 
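
By way of illustration only, the three conditions above operate cumulatively: the ADM Information only needs to be included in the privacy policy where all three are satisfied. The short sketch below (a hypothetical Python checklist using illustrative names of our own choosing, not drawn from the Bill’s drafting) simply encodes that ‘all three conditions’ logic for an internal ADM register.

    from dataclasses import dataclass

    @dataclass
    class AdmUseCase:
        # A hypothetical record of one automated or computer-assisted decision process.
        description: str
        program_makes_or_substantially_assists_decision: bool    # condition 1
        decision_significantly_affects_rights_or_interests: bool # condition 2
        personal_information_used_in_program: bool               # condition 3

    def requires_adm_disclosure(use_case: AdmUseCase) -> bool:
        # ADM Information must appear in the privacy policy only where
        # all three conditions listed above are met.
        return (
            use_case.program_makes_or_substantially_assists_decision
            and use_case.decision_significantly_affects_rights_or_interests
            and use_case.personal_information_used_in_program
        )

    # Example: an automated CV-screening tool used to shortlist job applicants.
    screening = AdmUseCase(
        description="Automated CV screening used to shortlist applicants",
        program_makes_or_substantially_assists_decision=True,
        decision_significantly_affects_rights_or_interests=True,
        personal_information_used_in_program=True,
    )
    print(requires_adm_disclosure(screening))  # True: include the ADM Information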

The Explanatory Memorandum notes that ‘substantially’ in “a thing that is substantially and directly related to decision making” for the purpose of APP 1.7(a) means that the computer program is a key factor in facilitating the human’s decision making, and that ‘directly’ refers to where the ADM has a clear connection with the making of the decision. In addition, the effects of the resulting decision must be “more than trivial, and must have the potential to significantly influence the circumstances of the individual concerned”.

This APP 1.7 requirement captures more than a quick read might suggest. ADM covers not only fully automated decision making (i.e. decisions made solely by computer programs) but also potentially all decisions made by humans using or relying on the outputs from the operation of a computer program. Organisations must therefore consider, fully detail and disclose in their privacy policy every instance where a decision which may significantly impact any individual is made by or using ADM (including in part).

Overseas Data Flows

While the OAIC already has the power to effectively ‘whitelist’ jurisdictions through the operation of cross-border enforcement arrangements with regulatory authorities of foreign jurisdictions, this power never actually resulted in any jurisdictions being whitelisted; that is, no ‘whitelist’ of jurisdictions was ever created. That said, in practice many privacy practitioners considered that Europe/GDPR and the UK generally satisfied the requirements under APP 8.

Once passed, the Bill in its current wording will give the Government (presumably on the advice of the Privacy Commissioner) the power to make regulations prescribing that the laws of a jurisdiction, or a specific binding scheme, have the effect of protecting personal information in a way that satisfies APP 8; that is, that they overall provide substantially similar protections to those under the APPs. This means organisations will no longer be required to undertake their own assessment of the suitability of protections in a prescribed jurisdiction in which the overseas recipient is located. This will provide a welcome level of certainty as to equivalent privacy jurisdictions, removing the risks of organisations making this assessment themselves.

The Explanatory Memorandum notes that the purpose of introducing this new power is (once such regulations are issued) to reduce the burden on entities of assessing whether an overseas jurisdiction where a recipient is located has substantially similar protections. This will operate to provide a new exception under APP 8.2 where a jurisdiction or binding scheme has been prescribed in the regulations made under this new power. While this is a welcome development, it will not affect the organisation’s other privacy considerations and obligations under the Privacy Act/APPs.

How will this impact you?

Automated Decision Making

Organisations will need to thoroughly review their processes to determine what automated and assisted (i.e. ADM) decision-making processes are currently operating, whether the ADM in question uses personal information, and whether the resulting decision is likely to have a significant impact on any individual. Having undertaken this review, the organisation must include all of the relevant ADM Information in its privacy policy.

When considering what revisions need to be made to an existing privacy policy, organisations must consider all instances of ADM used business-wide. This includes any ADM occurring in internal functions such as hiring, recruitment and assessment of performance. This type of broad review is necessary for all sectors, not just those with widespread use of automated (or assisted) decision-making. In fact, given the wide definition of ADM, it is those organisations that believe they do not use automated decision-making at all that are the most at risk of missing the ADM they do use, not including the relevant ADM Information in their privacy policy and thus exposing themselves to the new infringement notice penalties.

With the increased use of artificial intelligence (AI) systems across a variety of industries, consideration must also be given to the operation of any AI which is used to process personal information as part of or to assist decision making. When considering what types of decisions are automated, it must be remembered that automated decision making in this case (i.e. ADM) includes the use of the outputs of automated processes in or to assist with human decision making (i.e. not just fully automated decision making). This type of ADM is likely to cause the most confusion and will be the most difficult to track down in the organisation. 

When conducting a review of any ADM which uses personal information, organisations will need to determine whether the use of ADM (in the widest sense) is substantially and directly related to decision making (APP 1.7(a)) and whether the decisions made (or assisted) by ADM could reasonably be expected to significantly affect the rights or interests of an individual (APP 1.7(b)).

It will also be necessary to keep the organisation’s privacy policy updated whenever new ADM functions are introduced to or used by the organisation. Organisations will need to develop a strategy or system to ensure that, when new AI, other technologies or processes are introduced, their use for ADM is considered and the privacy policy is updated accordingly.

Failing to include any of the ADM Information in a privacy policy will be a breach of the new civil penalty (i.e. infringement notice) provisions and may result in a penalty of up to 1,000 penalty units for companies, currently equating to a fine of up to $330,000. The penalty may be issued on an infringement notice basis, meaning the organisation will need to take action if it does not wish to pay the penalty levied.
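
By way of a worked example (derived from the figures above rather than from the Bill’s drafting), the maximum fine is simply the number of penalty units multiplied by the Commonwealth penalty unit value: 1,000 penalty units × $330 per unit = $330,000. The dollar figure will therefore change as the penalty unit value is indexed or otherwise increased over time.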

Overseas Data Flows

For organisations this means disclosures made to prescribed jurisdictions (once prescribed) will be easier in many respects. However, organisations will still have to satisfy all other APP requirements, such as undertaking a security assessment of the third-party provider under APP 11.1.  

The introduction of this new regulation-making power should result in the establishment of a ‘whitelist’ of jurisdictions considered satisfactory under APP 8.1, allowing organisations to disclose information to recipients in these jurisdictions with the reassurance of knowing that doing so will not result in a breach of APP 8. While this will mean that entities will be able to disclose information to jurisdictions prescribed under the regulations (i.e. ‘whitelisted’ jurisdictions) without the need to bind recipients in those whitelisted jurisdictions to comply with APPs 2‑13, care must be taken as regards all of the organisation’s other applicable privacy (and any contractual) obligations; that is, do not assume that sending personal information to a whitelisted jurisdiction means no privacy obligations apply. Organisations are still required to satisfy all of their privacy obligations outside of APP 8, for example the obligation to undertake a privacy and security assessment pursuant to APP 11.1 when engaging third parties to process the organisation’s personal information.

What you can do to prepare

Automated Decision Making

These changes will commence 24 months after the Bill is passed and receives Royal Assent. However, in order to prepare for the changes, organisations should review their processes now to determine what, if any, automated and assisted decisions (i.e. ADM) are currently made, whether that ADM uses personal information, and whether the resulting decisions could reasonably be expected to significantly affect any individual. Once these are identified, the organisation’s existing privacy policy will need to be revised to include all ADM Information, detailing what personal information is used and which decisions are made solely by ADM, or are significantly assisted by ADM, and are likely to significantly impact any individuals.

Clyde & Co’s experienced Cyber Advisory and Digital Law and Privacy teams have established processes to assist organisations to (a) assess the ADM currently used in the organisation and whether it meets the criteria requiring the ADM Information to be included in the privacy policy; (b) uplift the privacy policy as required to include all ADM Information; and (c) establish and implement an internal framework to ensure that the organisation assesses all future AI and other technologies for ADM and uplifts its privacy policy accordingly.

Overseas Data Flows

These changes are set to commence the day after the Bill is passed and receives Royal Assent. When organisations start to see the benefits, however, will depend on how quickly the Government moves to prescribe jurisdictions. In advance of any jurisdictions being prescribed, Clyde & Co is able to assist you to (a) review and advise on your existing contracts with third-party service providers, focusing on APP compliance as regards overseas data transfers, to ensure your current compliance; and (b) establish and implement a process to ensure that all privacy risks are adequately identified and managed, both now and when disclosures are made in the future to recipients in prescribed jurisdictions.

Clyde & Co’s Cyber, Cyber Advisory, Privacy and Technology Team has unparalleled and specialised expertise across the privacy, cyber, financial services information regulatory and broader technology practice areas. It also houses the largest dedicated, market-leading privacy and cyber incident response practice across Australia and New Zealand. All of this ensures your “readiness, response and recovery” is in good hands. We provide end‑to‑end risk management solutions for clients, from advice, strategy, transactions, innovations, cyber and privacy pre‑incident readiness, incident response and post‑incident remediation and recovery through to regulatory investigations, dispute resolution, recovery of damages and third-party claims. We offer market-leading practical solutions, focused assistance and advice.


About the report

Produced

09 October 2024

Location:

Asia Pacific

Written by:

Alec Christie

Partner

Themes

Regulatory risk

Additional authors:

Isobelle Fox, Law Graduate, Sydney
