

The Australian Attorney-General's Department released its Privacy Act Review Report 2022 on 16 February 2023 (the Report). The Report puts forward 116 proposals for reforming Australia’s privacy framework, including the Privacy Act 1988 (the Act) and Australian Privacy Principles (APPs), which, if adopted, will transform how Australian businesses handle data and operate in the digital economy.

Key consultation issues

The Report follows the earlier Issues Paper (2020) and Discussion Paper (2021) prepared by the Attorney-General's Department, with submissions sought and published in relation to each. A number of the proposals now made in the Report have been flagged in one or both of the earlier papers, and many organisations have already expressed their views on these issues in previous submissions.

The proposals we highlight in this section are some of the key ones we feel are more likely to be focus areas in the current round of consultation. This may be for one or more of the following reasons:

  • The proposal has not already been raised as a proposal or question for consultation.
  • The proposal is not a definitive recommendation for reform but is open to further development.
  • The proposal is more likely to be controversial.
  • The government has included a question about the proposal in its online survey for feedback on the Report.

Next steps

The Government is seeking feedback on the Report by 31 March 2023, after which it will formally respond to the Report. Following that, we can expect the Government to introduce an exposure draft of an amendment bill, kicking off the legislative process.

Our team is available to discuss further how the reform may impact you or how you may engage with the consultation process.

In this briefing, we highlight some key areas for consultation and provide a detailed overview of the proposals.

Key issues for consultation  

Employee records exemption

The employee records exemption (relating to private sector employees) has been one of the most controversial aspects of the Privacy Act for the last 20 years. This topic generated many submissions in response to the Discussion Paper, and the Report acknowledges strong views on both sides. The Report does not recommend removing the employee records exemption, as had been expected in some quarters (and as has been predicted many times over the years!). However, the Report does propose enhancing certain protections for private sector employees, with the aim of:

  • greater transparency to employees about what their personal and sensitive information is being collected and used for;
  • ensuring that employees’ personal information is protected from misuse, loss and unauthorised access and is destroyed when no longer required;
  • appropriate individual rights (e.g. access, correction, etc); and
  • notifying employees and the OAIC of any data breach involving employees’ personal information which is likely to result in serious harm,

balanced with ensuring employers have adequate flexibility to collect, use and disclose employees’ information that is reasonably necessary to administer the employment relationship.

The employee records exemption was originally introduced on the basis that handling of those records would be better addressed as part of workplace relations legislation. The Report suggests further consultation on whether the recommended protections should be implemented in privacy or workplace laws, and how those laws and regulators should interact.

Some of the topics likely to be most relevant in this round of consultation are:

  • Should employers be permitted to collect employees’ sensitive information and geolocation tracking data without consent? In what circumstances?
  • If APPs 5 (notice of collection) and 11 (security/destruction), and the notifiable data breach scheme, are extended to employers, should they apply differently in the employment context?
  • Which of the current and proposed individual rights (access, correction, objection, erasure) should be extended to employees? Should employer-specific exceptions apply?
  • Should these reforms be implemented in privacy or workplace legislation?

Read our detailed article on the employee records exemption here.

Small business exemption

The proposed removal of the small business exemption has also been a matter for debate in recent years, with arguments highlighting the need to bridge the gap in privacy protections for Australians compared to other countries, while balancing concerns about the cost of compliance for small businesses. The Report proposes to expand the application of the Act incrementally, but only after an impact assessment and consultation on appropriate support for small businesses.

De-identified information

While the Privacy Act is currently largely concerned with the protection of personal information, the Report describes de-identification as a ‘process’, indicating that risks of re-identification remain. In light of this, the Report proposes extending aspects of the following requirements to de-identified information:

  • APP 8 (cross-border disclosure);
  • APP 11.1 (data security); and
  • proposals regarding targeting of individuals (see below), even if the individual cannot be identified.

The Report also suggests that a criminal offence for malicious re-identification should be the subject of further consultation, including in relation to exceptions.

The Report proposes prohibiting the re-identification of de-identified information obtained from a source other than the individual, with ‘appropriate exceptions’. The Government may be open to submissions on these exceptions. The following examples were given in the Report:

  • research involving cryptology, information security and data analysis, and
  • testing the effectiveness of security safeguards that have been put in place to protect the information.

Tracking data

While it stops short of recommending including geolocation tracking data in the definition of ‘sensitive information’, the Report does propose requiring consent for the collection, use and disclosure of that data. The Government is now seeking feedback on whether other types of tracking data, such as health data, heart rate and sleeping schedules, should have similar protection.

High privacy risk activities and biometric data

The Report reiterates the Discussion Paper proposal to require privacy impact assessments to be conducted for high privacy risk activities, noting that specific high-risk practices could be set out in the Privacy Act.

For facial recognition and other biometric technologies, the Government is specifically seeking feedback on whether additional requirements should apply beyond the broader privacy impact assessment proposal noted above.

Research

The Report proposes to legislate requirements for privacy consents to be voluntary, informed, current, specific, and unambiguous. However, the Attorney-General's Department received a number of submissions from stakeholders involved in scientific research arguing that these requirements may be unduly limiting. In response, the Report proposes allowing broad (rather than specific) consents to be obtained for permitted types of research where it is not practicable to specify details of the collection, use and disclosure of personal information when obtaining the consent. In connection with these issues, the Government is now specifically seeking feedback on:

  • whether the circumstances in which research without consent is permitted should be extended
  • whether to align the exceptions for research without consent between Government agencies and private sector organisations, and if not, what differences should remain, and
  • which body should develop new guidelines in relation to research without consent (e.g. the OAIC, the National Health and Medical Research Council).

Individuals experiencing financial abuse

The Report has called for further consultation in response to a submission from the Australian Banking Association, which sought a ‘good faith’ exception in the Privacy Act to allow disclosure of personal information to law enforcement or adult safeguarding authorities where an individual’s financial safety may be compromised. The Government is seeking feedback on the proposal, the privacy difficulties currently faced by entities in these circumstances and how any reforms should be implemented.

Direct marketing, targeted advertising, and trading in personal information

Over the last two decades, the advertising industry has been transformed through the convergence of innovations in data analytics and developments in the online environment (eg social media), resulting in the emergence of new marketing practices such as profiling and online targeted or personalised advertising.

The Report maintains the earlier proposal for an unqualified right for individuals to opt out of direct marketing, and adds a similar proposal for targeted advertising (including targeting using de-identified information). Proposed fairness requirements would also extend to targeting, and there would be limits to targeting based on sensitive information. Entities would also be required to provide information about targeting, such as use of algorithms and profiling to make recommendations to individuals.

Individuals' prior consent will be required before ‘trading in’ personal information (that is disclosing it for a benefit, service, or advantage). Organisations could also be required to undertake a privacy impact assessment before engaging in profiling and delivering personalised content and advertising to individuals.

Notably, while stricter requirements apply in relation to children, the Report largely resisted calls to move to an opt-in consent model for direct marketing and targeting. Consultation continues, however, with the Government calling for feedback on the impact of the proposals on individuals and businesses.

Automated decision making

The Report extends the earlier proposals on automated decision-making that uses personal information and has a legal or other significant effect for the individual. The Report proposes that privacy policies should set out the types of personal information used in these decisions and give individuals the right to request meaningful information about how the decisions are made.

The Government is seeking feedback on what types of decisions are likely to have legal effects or other significant effects on individuals, and on whether there should be exceptions to the right to obtain meaningful information about how decisions are made.

Data security and destruction

In the context of recent high-profile data breaches in Australia, the Report introduces several new recommendations in respect of data security and retention.

Consistent with the principle-based approach underpinning the Act, the Report does not recommend imposing specific security controls or measures. Instead, it proposes specifying baseline security outcomes without prescribing how those outcomes should be achieved. The Government has called for feedback on what outcomes should be included, with the Report noting that the approach in Article 32 of the GDPR and the ACSC’s Cyber Security Principles could be used as a starting point.

The Report also recommends requiring organisations to take reasonable steps to implement practices, procedures and systems to respond to data breaches, and to notify the OAIC of eligible data breaches within 72 hours, in line with the timeframe under other incident reporting regimes.1 One key question for consultation is the extent to which APP entities should be required to take reasonable steps to prevent or reduce the harm that is likely to arise for individuals as a result of an eligible data breach.

A key challenge for organisations will be to manage the risk of regulatory overlap with other privacy or information security requirements. The Government is inviting submissions on how reporting processes under the notifiable data breach scheme may be streamlined for APP entities with multiple reporting obligations.

With community concern about data retention issues high in the wake of some high-profile data breaches, the Report has added to the previously modest suggestions about reforming APP 11.2 relating to retention and destruction of personal information. This includes requirements to document and review minimum and maximum retention periods for personal information, and to specify retention periods in privacy policies.

The Report also foreshadows a broader review of existing data retention requirements in laws beyond the Privacy Act. The Government is now calling for feedback on the barriers organisations face in minimising the collection and retention of identity credential information (e.g. driver’s licence and passport copies and numbers).

The Report also proposes to extend data security, destruction and data breach notification obligations to information covered by the employee records, journalism and political exemptions, and as noted above, to require the security of de-identified information to be protected. 

Individuals’ rights

As foreshadowed, the Report proposes a range of enhanced GDPR-inspired rights for individuals, including rights, on request, to obtain an explanation of how their information is handled, to object to that handling, and to have their personal information erased where it is no longer needed, as well as an extension of correction rights to generally available publications controlled by an APP entity. All rights would be subject to exceptions in the following categories:

  • competing public interests (e.g. freedom of expression, law enforcement)
  • relationships with a legal character (e.g. contrary to another law or contract with the individual), and
  • technical exceptions (e.g. impossible or unreasonable to comply).

The Government has called for feedback on the impact of these rights on individuals, business and government and on whether any additional exceptions are appropriate.

Controller-processor distinction

The Report now proposes the introduction of a controller-processor distinction in the Act, similar to the GDPR and many other jurisdictions. The intention is to clarify obligations and allocate responsibilities between controllers (entities that determine the purposes and means of collecting and handling information), which would continue to be subject to all the APPs, and processors (those that process personal information on behalf of a controller), which would only be subject to APPs 1 and 11. The Report proposes making small business processors subject to processor obligations under the Act; however, the Government is consulting further on what support small businesses will need to help them comply.

Extraterritorial effect

A recent amendment to the Act removed the requirement that, for the Act to apply to overseas companies carrying on business in Australia, they must collect or hold personal information in Australia. After Parliament referenced our article noting that a (presumably unintentional) consequence of the changes appeared to be that foreign companies carrying on business in Australia would be subject to the Act even in respect of activities that do not relate to their Australian business or to Australian individuals, the Senate Standing Committee on Legal and Constitutional Affairs recommended that this issue be referred back to the Attorney-General's Department for consideration as part of the review of the Privacy Act. The Government is seeking further views on this issue, in particular whether an additional requirement is needed to demonstrate an ‘Australian link’ focused on personal information being connected with Australia.

Overseas data flows

The Report recommends a number of changes to the rules on overseas disclosure, including in relation to approved countries, standard contractual clauses, and changes to the informed consent exception. While a number of these align with previous proposals and are unlikely to generate substantial new views in consultation, the Government is seeking submissions about the introduction of a public interest exception where personal information is published online.

Increased regulatory enforcement tools

Organisations can expect an expanded enforcement toolkit, and new avenues of redress for affected individuals, to drive increased regulatory action. These will complement the increased penalties ($50 million and more), greater regulatory powers and expanded extra-territorial application of the Privacy Act introduced as priority reforms in December 2022 (briefing here).

A number of other recommendations are consistent with what was expected in the earlier discussion papers and would see a clear shift in the role of the OAIC to one which places greater emphasis on proactive enforcement, rather than mainly complaints handling as is the case presently. The clarification of the meaning of ‘serious interferences’ with privacy should give the OAIC greater certainty with which to pursue serious breaches. The recommended tiered civil penalty regime will also give the OAIC latitude to pursue breaches which do not amount to serious interferences.

However, a key practical issue (and historical impediment) remains how the OAIC will be funded to ensure that it can undertake increased enforcement activity. The Government has left this question open, recommending further work to investigate the effectiveness of an industry funding model for the OAIC.

New private avenues of claim

The introduction of new avenues of claim for individuals is a key feature of these proposed reforms with the potential to increase exposure to class action risk for data and privacy breaches.

The adoption of a direct right of action enabling individuals to sue for interferences with their privacy under the Privacy Act will likely make representative complaints more attractive for class actions, due to the range of remedies which would be available to courts, namely uncapped damages, and the fact that any decision would be readily enforceable, which is not currently the case for OAIC determinations as these require separate proceedings to be commenced for enforcement.

A separate statutory tort of serious invasion of privacy is also likely to encourage class action activity, by providing another clear cause of action for breaches of privacy. The fault element of intention or recklessness will likely serve as a bar to claims where companies have appropriate information and data handling practices and policies in place to adequately manage foreseeable cyber risks. However, a separate cause of action in negligence may still be relevant where a duty of care can be established.

Report Key Recommendations Table

Expanded Scope

Definition of personal information

  • Expand the definition of ‘personal information’ by changing the reference from ‘about’ an individual to ‘relates to’ an individual (that is, “information or an opinion that relates to an identified individual”), while confining the definition so the connection between the individual and the information is not too tenuous or remote.

This aligns more with the GDPR position, and moves away from the narrower Privacy Act definition highlighted in the 2017 ‘Grubb case’ (Privacy Commissioner v Telstra Corporation Ltd).

  • Introduce into the Act a non-exhaustive list of information which may be considered ‘personal information’ (for example online identifiers, location data, technical or behavioural data in relation to an individual’s activities or preferences, predictions of behaviour or preferences and profiles generated from aggregated information), and supplement this list through OAIC guidance.
  • Introduce a non-exhaustive list of mandatory factors to assess whether an individual is ‘reasonably identifiable’.

The Report stopped short of adopting the concept of ‘individuation’ in the definition of personal information, i.e. where information relating to an individual reveals their characteristics and can be used to impact them even though the individual is not reasonably identifiable. However, other proposals (see below) deal with the use of de-identified information for targeting.

De-identification and de-identified data

  • Clarify that de-identification must be informed by best available practice and result in the individual not being reasonably identifiable in the current context.
  • Require APP entities to take reasonable steps to protect de-identified information from (a) misuse, interference and loss; and (b) unauthorised access, modification or disclosure. (This proposal recognises that there is a risk that de-identified information can be re-identified.)
  • Require APP entities to take reasonable steps when disclosing de-identified information overseas to ensure that the receiving entity does not re-identify the information or further disclose it in such a way as to undermine the effectiveness of the de-identification.
  • Prohibit APP entities from re-identifying de-identified information indirectly obtained (with certain exceptions, including where the re-identified information was de-identified by the entity itself, or where the re-identification is conducted by a processor with the authority of an APP entity controller).
  • Consult on the introduction of a criminal offence for malicious re-identification.

Australian link

  • Consult on an additional requirement in s 5B(3) of the Act which would provide that an organisation has an ‘Australian link’ if it carries on business in Australia or an external Territory (the current test) and the act done or practice engaged in relates to personal information that is ‘connected to Australia’ (which could involve consideration of whether the personal information is collected or held in Australia, or the personal information is that of an Australian or a person physically located in Australia).

This proposal comes after the recent amendment to the Act which removed the requirement that, for the Act to apply to overseas companies carrying on business in Australia, they must collect or hold personal information in Australia. After Parliament referenced our article noting that a (presumably unintentional) consequence of the changes appeared to be that foreign companies carrying on business in Australia would be subject to the Act even in respect of activities that do not relate to their Australian business or to Australian individuals, the Senate Standing Committee on Legal and Constitutional Affairs recommended that this issue be referred back to the Attorney-General's Department for consideration as part of the review of the Privacy Act.

Small business exemption

  • In the short term, expand the application of the Act to small businesses that collect biometric information for use in facial recognition or trade in personal information.
  • In the medium term, remove the small business exemption entirely, but only after an impact assessment and development of appropriate support for small businesses.

Employee records exemption

  • Enhance privacy protections for private sector employees with the aim of:
    • greater transparency to employees about what their personal and sensitive information is being collected and used for;
    • ensuring that employees’ personal information is protected from misuse, loss and unauthorised access and is destroyed when no longer required; and
    • notifying employees and the OAIC of any data breach involving employees’ personal information which is likely to result in serious harm,

while ensuring employers have adequate flexibility to collect, use and disclose employees’ information that is reasonably necessary to administer the employment relationship.

  • Undertake further consultation with employer and employee representatives on how these protections should be implemented in legislation, including how privacy and workplace relations laws should interact.

Political exemption

  • Introduce conditions for political entities to obtain the benefit of the political exemption, including in relation to privacy policies, fairness, sensitive information, direct marketing and targeting, data security and notifiable data breaches.

Journalism exemption

  • Introduce conditions for media organisations to obtain the benefit of the journalism exemption, including in relation to data security/destruction, notifiable data breaches and accountability for standards compliance.

Clarifying and strengthening notice and consent requirements

Definition of collection

  • Expressly provide that ‘collection’ covers information obtained from any source and by any means, including inferred or generated information (meaning notification and consent requirements may apply to the inference or generation of personal information about a customer).

Privacy collection notices and policies

  • Introduce an express requirement that collection notices be clear, up-to-date, concise and understandable, with appropriate accessibility measures also in place.
  • Require collection notices to include, in addition to all matters currently required under APP 5.2: if the entity collects, uses or discloses personal information for a ‘high privacy risk activity’, the circumstances of that collection, use or disclosure; and the types of personal information that may be disclosed to overseas recipients.
  • Specify the entity’s retention periods in the privacy policy.
  • Introduce standardised templates and layouts for privacy policies and collection notices, as well as standardised terminology and icons, by reference to relevant sectors (either through OAIC guidance or through future APP codes that may apply to particular sectors or handling practices).

Valid consent and default privacy settings

  • Amend the definition of consent in the Act to provide that it must be voluntary, informed, current, specific and unambiguous, and expressly recognise the ability to easily withdraw consent.

Note there is no proposal to change the circumstances in which an APP entity is required to obtain consent, and consent would not need to be express (provided the implied consent is ‘unambiguous’).

  • Require online service providers to ensure that any privacy settings are clear, easily accessible for service users, and reflect the privacy by default framework of the Act.

Indirect collection

  • Require that, if an entity does not collect information directly from an individual, it must take reasonable steps to satisfy itself that the information was originally collected from the individual in accordance with APP 3.

Consent (research)

  • Enable individuals to give broad consent for the purposes of health and medical research, which would be given for ‘research areas’ where it is not practicable to fully identify the purposes of collection, use or disclosure at the point when consent is being obtained (e.g. broad consent could be given to take a bio-sample for the purposes of a study which seeks to identify a particular health risk, and that consent could extend to further studies to identify treatment options for that risk).
  • Consult on broadening the scope of research permitted without consent under the Act.

Fairness

Fairness

  • Introduce a requirement that the collection, use and disclosure of personal information must be fair and reasonable, irrespective of consent, having regard to legislated factors including the kind, sensitivity and amount of personal information and the risk of unjustified adverse impact or harm.

Organisational accountability

Organisational accountability

Require APP entities to:

  • record the primary purposes and secondary purposes for collection, use or disclosure; and
  • appoint or designate a senior employee responsible for privacy.

Individual Rights

Access and explanation

As part of existing obligations to provide individuals with access to their personal information, require APP entities to:

  • identify the source of any personal information they have collected indirectly;
  • provide an explanation or summary of what they have done with the information; and
  • consult with the individual about the format for responding to a request.

An APP entity could charge a nominal fee for providing access and explanation.

Objection

  • Introduce a right to object to the collection, use or disclosure of personal information (based on the requirements of the Act), and an obligation on an APP entity to review its information-handling, consider the objection and provide a written response to an objection with reasons.

Erasure

  • Introduce a right to erasure of an individual’s personal information where the information is required to be destroyed, for example by court order or under APP 11.2 because it is no longer needed.
  • Require an APP entity that has collected the information from a third party or disclosed the information to a third party to inform the individual about the third party and notify the third party of the erasure request unless it is impossible or involves disproportionate effort.

Certain limited information would be quarantined rather than erased on request, to ensure it remains available for law enforcement purposes while still restricting the entity’s own use of the information.

Correction

  • Extend the existing right to correction to generally available online publications over which an APP entity maintains control.

De-indexing

  • Introduce a right for individuals to request that a search engine de-index, in Australia, online search results containing personal information which is sensitive, about a child, excessively detailed (e.g. includes home address and personal number), or inaccurate, out-of-date, incomplete, irrelevant or misleading.

Exceptions

Introduce the following categories of exceptions to all rights of the individual:

  • competing public interests, including freedom of expression and law enforcement;
  • relationships with a legal character, such as where complying with the request would be inconsistent with another law or a contract with the individual; and
  • technical exceptions (e.g. impossible or unreasonable to comply).

High-risk activities and information

Privacy impact assessment

  • Require APP entities to conduct a privacy impact assessment prior to undertaking activities with ‘high privacy risks’, and provide it to the OAIC upon request.

Specific high-risk practices could be set out in the Act or in OAIC guidance, for example the collection, use or disclosure of:

  • sensitive information or children’s personal information on a large scale;
  • biometric templates or biometric information for the purpose of verification or identification collected in a public space (with further consideration of enhanced risk assessment requirements for facial recognition and other uses of biometric information);
  • information for the purposes of online tracking, profiling and the delivery of personalised content and advertising to individuals;
  • ongoing or real-time tracking of an individual’s geolocation;
  • the sale of personal information; or
  • automated decision making with legal or significant effects.

The OAIC would also develop practice-specific guidance for new technologies and emerging privacy risks, including compliance with the fair and reasonable handling test.

Sensitive information

  • Amend the definition of ‘sensitive information’ to include genomic information, and clarify that sensitive information can be inferred from information which is not sensitive information.
  • ‘Geolocation tracking data’ (precise location of an identifiable individual over time) should not be added to the definition of ‘sensitive information’, but should generally require consent if collected or handled.

Children

  • Require that valid consent is to be given with capacity.

This is currently only in OAIC guidance rather than the Act itself. Entities would need to consider whether an individual under 18 has the capacity to consent, although the guidance (endorsed in the Report) suggests that capacity for individuals over 15 can be presumed if individual assessment is not practical.

  • Require that collection notices and privacy policies be clear and understandable in particular for any information addressed specifically to a child.
  • Require entities to have regard to the best interests of the child as part of considering whether collection, use or disclosure is fair and reasonable in the circumstances.
  • Introduce a Children’s Online Privacy Code applying to online services that are likely to be accessed by children, aligning, to the extent possible, with the scope of the UK Age-Appropriate Design Code.

Vulnerable individuals

  • Introduce OAIC guidance setting out a non-exhaustive list of factors that indicate when an individual may be experiencing vulnerability and be at higher risk of harm from interferences with their personal information, as well as guidance on capacity and consent which reflects developments in supported decision-making.

Automated decision making

  • Introduce a mandatory privacy impact assessment requirement (see above) and transparency requirements (disclosure in privacy policies, complemented by a right for individuals to request more specific information) before using personal information in substantially automated decisions which have a legal or similarly significant effect on an individual.

Direct marketing, targeting and trading

Direct marketing

  • Define ‘direct marketing’ as the collection, use or disclosure of personal information to communicate directly with an individual to promote advertising or marketing material (including the promotion of goods, services, aims and ideals).
  • Provide individuals with an unqualified right to opt-out of their personal information being used or disclosed for direct marketing.
  • Prohibit direct marketing to a child unless the personal information was collected directly from the child and the direct marketing is in the child’s best interests.

Targeting

  • Define ‘targeting’ as the collection, use or disclosure of information which relates to an individual, including personal information, de-identified information and unidentified information (internet history/tracking etc.) for tailoring services, content, information, advertisements or offers provided to or withheld from an individual (either on their own, or as a member of a group or class).
  • Provide individuals with an unqualified right to opt-out of receiving targeted advertising.
  • Require targeting to be fair and reasonable.
  • Prohibit targeting individuals based on sensitive information (other than political opinion, membership of political association/trade union) unless it contains socially beneficial content.
  • Require entities to provide information about targeting, including clear information about the use of algorithms and profiling to recommend content to individuals.
  • Prohibit targeting a child, with the exception for targeting in the child’s best interests.

Trading

  • Define ‘trading’ as the disclosure of personal information for a benefit, service or advantage.
  • Require consent to trade in personal information.
  • Prohibit trading in personal information of children.

Data security and breaches

Security

  • Amend APP 11.1 to include a set of baseline privacy outcomes, after consulting with industry and government and having regard to the Government’s 2023-2030 Australian Cyber Security Strategy.

This proposal was an alternative to recommending more specific requirements regarding security controls/measures (as opposed to outcomes).

  • Require organisations to take reasonable steps to protect the security of de-identified information and information covered by the employee records, journalism or political exemptions.

Data breaches

  • Require entities to take reasonable steps to implement practices, procedures and systems to respond to a data breach (including by developing a data breach response plan).
  • Require entities to notify the OAIC of eligible data breaches within 72 hours.
  • Require entities to notify individuals of an eligible data breach as soon as practicable, allowing for the notification to be provided in phases if not all information can be provided at once.
  • Require data breach notifications to state the steps an entity has taken or intends to take in response to the breach, and consider requiring entities to take reasonable steps to prevent or reduce the harm that is likely to arise as a result of a data breach;
  • Enable the Attorney-General to permit the sharing of information with appropriate entities to reduce the risk of harm in the event of an eligible data breach.

These proposals come after the passing of the Privacy Enforcement Act, which provided the OAIC with new powers to obtain information or documents in relation to an actual or suspected eligible data breach (see our briefing here).

The Report also notes that if the concepts of controller and processor are introduced into the Act (see discussion below), only the controller would be responsible for notifying individuals affected by an eligible data breach, but processors would still be required to prepare a statement on the breach and provide a copy of that statement to the OAIC (unless the breach has already been reported by the relevant controller or another processor).

Retention and destruction

  • Require APP entities to establish their own maximum and minimum retention periods in relation to the personal information they hold, taking into account the type, sensitivity and purpose of that information, as well as the entity’s organisational needs and any obligations they may have under other legal frameworks.
  • Require privacy policies to specify the entity’s retention periods.
  • Commonwealth Government to undertake a review of all legal provisions that require retention of personal information to determine if the provisions appropriately balance their intended policy objectives with the privacy and cyber security risks of entities holding significant volumes of personal information.

Controllers and processors

Controllers and processors of personal information

  • Introduce the GDPR-like concepts of APP entity controllers and APP entity processors into the Act. Controllers would have primary responsibility for personal information and be subject to all of the APPs. Processors, who handle personal information on behalf of controllers, would only be subject to the requirements under APP 1 and APP 11 to take reasonable steps to implement practices, procedures and systems to comply with the APPs, protect personal information they hold and destroy it when no longer required.

Both controllers and processors would be subject to the notifiable data breach regime, except that controllers would report to the OAIC and individuals whereas processors would report to the OAIC and controllers.

Small business processors

  • Pending removal of the small business exemption, a non-APP entity that processes information on behalf of an APP entity controller would be brought into the scope of the Act in relation to its handling of personal information for the APP entity controller – although this would be subject to further consultation with small businesses to understand the impact it would have on them.

The Report recognises that while the small business exemption remains in effect, there will be a gap in coverage where a non-APP entity, such as a small business, contracts an APP entity processor. In this circumstance, the non-APP entity would not be subject to the Act while the APP entity processor would only be required to comply with the processor obligations, and neither party would be required to comply with the controller obligations.

Overseas disclosures

Overseas data disclosures

  • Introduce a definition of disclosure in the Act consistent with existing APP guidelines, namely, making information accessible to others outside the entity and releasing the subsequent handling of the information from the entity’s effective control.
  • Introduce a mechanism to recognise countries and certification schemes as providing substantially similar protection to the APPs in order to authorise disclosures in those circumstances.
  • Require entities to, when seeking to rely on the informed consent exception, consider the risks of an overseas disclosure and to inform individuals that privacy protections may not apply to their information if they consent to the disclosure.

Currently the entity must inform the individual that if he or she consents to the overseas disclosure, APP 8.1 will not apply.

  • Require APP entities, when specifying the countries in which recipients are likely to be located if practicable in the APP 5 collection notice, to also specify the types of personal information that may be disclosed to recipients overseas.
  • Make standard contractual clauses available to APP entities for use when transferring personal information overseas.
  • Consider further whether online publications of personal information should be excluded from the requirements of APP 8 where it is in the public interest.

Enforcement and private claim

Enforcement

  • Introduce new penalties for breaches, with a tiered approach to civil penalties and infringement notices.

While the first suite of reforms introduced last year significantly enhanced maximum penalties for serious (or repeated) privacy interferences (see our summary here), additional penalty levels are proposed here to address instances that fall short of that threshold. This includes a mid-tier civil penalty provision to capture privacy interferences without the serious element and a low-tier civil penalty to address administrative breaches through infringement notice powers. Further consideration is to be given to the value of the mid-tier penalty.

  • Provide greater statutory clarity on the meaning of a ‘serious’ interference with privacy under s 13G (and remove the word ‘repeated’ from that section) by defining ‘serious’ to include interferences involving sensitive information, adversely affecting large numbers of individuals, impacting vulnerable people, repeated breaches, wilful misconduct and serious failures to take proper steps to protect personal data.
  • Enhanced coercive investigation powers, with those under Part 3 of the Regulatory Powers (Standard Provisions) Act 2014 (Cth) to apply to investigations of civil penalty provisions. This would confer powers on the OAIC to, for example, search premises and seize evidential material.
  • A new power to undertake public inquiries and reviews (on approval or direction from the Attorney General) to examine systemic issues and facilitate industry or sector change.
  • Permit the OAIC to require a respondent in a complaint to take reasonable steps to mitigate future loss.
  • Empower the Federal Court to make any order it sees fit after a civil penalty provision relating to an interference with privacy has been established. This would not be limited to serious breaches under the proposed tiered model.
  • Consult further on an industry funding model for the OAIC.

These recommendations are broadly consistent with those in the previous Discussion Paper.

A direct right of action

  • Introduce a direct right of action to sue for an interference with privacy by an APP entity where loss or damage has been suffered (within the meaning of the Privacy Act, which includes injury to the person’s feelings or humiliation).

This direct right would only be available in circumstances where a complaint (including representative complaint) had been made to the OAIC and assessed as unsuitable for conciliation, allowing the OAIC to maintain oversight over privacy issues and identify systemic issues which may be addressed through further regulatory or enforcement action, as well as to resolve complaints where it continues to be appropriate given its expertise.

The threshold requirement to demonstrate harm was considered appropriate given there were other more suitable enforcement mechanisms by which remaining types of breaches could be resolved.

Complainants could then seek leave to have the matter heard in Federal Court / Federal Circuit Court, which would have the power to order any remedies it sees fit, including uncapped damages (in line with comparable jurisdictions such as the EU and Singapore which have robust privacy frameworks). The OAIC may appear as amicus curiae or intervene in proceedings instituted under the Act with the Court’s leave.

This will likely make representative complaints more attractive for class actions, due to the range of remedies which would be available to the Court, namely uncapped damages, and the fact that any decision would be readily enforceable, which is not currently the case for OAIC determinations as these require separate proceedings to be commenced for enforcement.

This was previously raised in the Discussion Paper.

A statutory tort for serious invasions of privacy

  • Introduce a statutory tort for serious invasion of privacy to address shortfalls in the current regulatory framework and protect from emerging risks and harms.

The key features of the proposed model are as follows:

  • The invasion of privacy must:
    • involve an intrusion into seclusion or misuse of private information;
    • be intentional or reckless – mere negligence is insufficient; and
    • be ‘serious’ in nature. However, the Review Report does not provide any guidance on the meaning of ‘serious’.
  • There must have been a reasonable expectation of privacy in all of the circumstances. This introduces an objective component to the assessment.
  • A public interest test applies: the public interest in privacy must outweigh any countervailing public interests.

This statutory tort is also likely to encourage class action activity, by providing a clear cause of action for breaches of privacy.

The fault element of intention or recklessness will likely serve as a bar to claims where companies have appropriate information and data handling practices and policies in place to adequately manage foreseeable cyber risks. However, a separate cause of action in negligence may still be relevant where a duty of care can be established.

This statutory tort would also apply to non-APP entities, unlike many other privacy law protections, and be sufficiently broad in scope to include physical privacy, such as bodily (including e.g. recording a private conversation without consent) or territorial privacy, rather than being limited to information-handling related privacy.

Damages would be available for non-economic loss, as well as exemplary damages in exceptional circumstances.

The Review Report goes further than the Discussion Paper, which floated multiple options for a statutory tort of privacy, by recommending the model outlined at Option 1 of the Discussion Paper. This model was previously put forward by the Australian Law Reform Commission in Report 123 following a comparative analysis of the torts in equivalent jurisdictions, in particular the UK and New Zealand, and has been supported by the OAIC.

Other recommendations

Flexibility of the APPs (APP Code making powers)

Expand the OAIC’s powers with respect to the development of APP codes, in particular by empowering it to:

  • make an APP code where the Attorney-General has directed or approved that a code should be made, where it is in the public interest and there is unlikely to be an appropriate industry representative to develop the code; and
  • issue a temporary APP code for a maximum 12-month period on the direction or approval of the Attorney-General, if it is urgently required and in the public interest to do so.

Interactions with other schemes

The Report sets out a number of proposals regarding the interaction between the Privacy Act and other Commonwealth, state and territory schemes which contain privacy protections, namely:

  • the Attorney-General’s Department to develop a privacy law design guide to support Commonwealth agencies when developing new schemes with privacy-related obligations;
  • encouraging regulators to continue to foster regulatory cooperation in enforcing matters involving the mishandling of personal information; and
  • establishing a Commonwealth, state and territory working group to harmonise privacy laws (particularly around security, health information etc).

  1. For example, the Security of Critical Infrastructure Act 2018 (Cth), prudential standard CPS 234 and the European General Data Protection Regulation (GDPR) each provide for a 72-hour notification timeframe.

 

 


Key contacts

  • Kaman Tsoi, Special Counsel, Melbourne
  • Julian Lincoln, Partner, Head of TMT & Digital Australia, Melbourne
  • Peter Jones, Partner, Sydney
  • Christine Wong, Partner, Sydney
  • Cameron Whittfield, Partner, Melbourne
  • Katherine Gregor, Partner, Melbourne
  • Kwok Tang, Partner, Sydney
  • Merryn Quayle, Partner, Melbourne
