University of New South Wales Law Journal Student Series
USING BLOCKCHAIN TO ADDRESS THE LIMITATIONS OF THE AUSTRALIAN MANDATORY DATA BREACH NOTIFICATION LAW
As of 22 February 2018, certain entities have statutory obligations to notify data breaches to the Office of the Australian Information Commissioner and the individuals affected. This thesis considers the extent to which this law is an effective approach to preventing data breaches or protecting individuals from the forms of harm arising. It then considers the extent to which these objectives can be achieved in combination with technology. Blockchain is used as an example of a technology which may be used to facilitate both compliance with the law and the protection of individuals.
The digitisation of most government and other services and the increasing uptake of interconnected personal devices suggest the world is now characterised by ‘“ubiquitous” computing’. Individuals must ‘lead a life within digital networks’ by necessity. The inevitable consequence is the collection and storage of individuals’ data by companies and government entities, ranging from e-commerce retailers to insurers. The real risk that this data will become the subject of a data breach has led to statements like ‘[w]e live in an age of insecurity’ and ‘privacy isn’t real’.
In response to this risk, the Australian government has enacted amendments to the Privacy Act 1988 (Cth) (‘the Act’) which require certain entities to notify the Office of the Australian Information Commissioner (‘OAIC’) of data breaches which meet stipulated criteria. The amendments came into effect on 22 February 2018. The primary objective of this mandatory data breach notification law is to protect individuals from the forms of harm arising from data breaches by requiring that individuals be made aware of data breaches when they occur. This thesis will consider the extent to which this objective of the law has been achieved and analyse the data security and data integrity concerns which it fails to address.
To assess the effectiveness of the law in achieving its objective, it is first necessary to understand the context of data breaches and how they may impact individuals. Part II will define data breaches as encompassing both data security and data integrity breaches and explore the risks to which individuals are exposed therefrom. The effectiveness of the law will be measured, and its limitations identified, by reference to these risks in Part III.
The limitations of the law may be addressed through entities choosing to adopt certain technologies, for example to improve their capacity to detect data breaches efficiently. At a more fundamental level, an entity may opt to change its data storage system and adopt an approach based on ‘privacy by design’ principles. Such an approach involves building data security and data integrity measures into the foundation of data storage systems. In Part IV, blockchain will be considered as a representative example of a technology which may streamline entities’ compliance with the law while also addressing the law’s limitations in certain contexts.
Further, the Australian government may choose to encourage or compel entities to adopt such a technological approach to compliance with the law, as opposed to relying upon their existing data storage systems. Cybersecurity spending by companies globally is expected to exceed US$1 trillion by 2021. The government can influence companies to direct this spending towards particular data security and data integrity approaches such as the adoption of blockchain. The government’s role in this regard will be examined in Part VI.
II RISKS ASSOCIATED WITH DATA BREACHES
As discussed above, the Australian mandatory data breach notification law was enacted to improve the ability of individuals whose information is affected by a data breach to protect themselves from harm in the aftermath. In order to properly understand the law’s objective, the kinds of data breaches which may affect individuals, and the resultant harms to which they may be exposed, must first be outlined. The assessment of the law’s effectiveness will be contingent upon how, and the extent to which, it addresses these kinds of breaches and their associated consequences.
A Defining Data Breaches
Importantly, data breaches of information stored by an entity can occur in two central ways which will have different impacts on the individuals to whom the information relates. First, data may be accessed or disclosed without authorisation. This will be referred to as a ‘data security breach’. Individuals will be most concerned about data security in circumstances where the information in question is sensitive in nature, for example where an entity stores their credit card details or health information.
Second, data may be manipulated and changed. This will be referred to as a ‘data integrity breach’. An individual will be most concerned about data integrity if the accuracy of the data in question has significant implications: most obviously, a credit report or financial transaction records held by a bank.
B Risks to Individuals
Any information collected and stored about an individual may become the subject of one of these classes of data breach, with potentially detrimental effects. Identity theft is one possible consequence. Victims of data breaches are almost ten times more likely to suffer identity theft than those whose information has not been breached, and such theft can lead not only to financial loss but also emotional distress. The Australian Payments Network attributes recent increases in credit card fraud incidents directly to ‘large scale data breaches’ and their ability to capture sensitive information about individuals. There is a correlation between data security breaches in particular and an increased risk of identity theft for impacted individuals.
Further, it is expected that new forms of harm will materialise as data breaches become more complex. For example, many entities now rely upon behavioural targeting techniques and algorithmic decision-making to advertise and provide their products to consumers. These techniques may require not only the collection of objective personal information, such as full names and addresses, but also ‘inferred demographic characteristics’ like political preferences. A data security breach affecting this kind of data may lead to unexpected consequences. If an algorithm operates to infer that an individual is likely to suffer from a sensitive condition like HIV, the exposure of this inference might result in mental distress beyond that associated with a breach of credit card information.
Similarly, mental distress may arise from, and financial costs incurred in rectifying, a data integrity breach involving the manipulation of an individual’s health information or credit report, particularly if this inaccurate information is later published publicly.
C Addressing the Risks
The Australian mandatory data breach notification law purports to protect individuals from these, at times significant, consequences following data breaches. How it might achieve this outcome is of fundamental concern in any assessment of its effectiveness. Importantly, in most contexts, and particularly on the Internet, an individual is not well placed to restrict the extent to which their data are made available to entities, or to assess the risks to which they are vulnerable. The effectiveness of the protective measures which do exist then becomes all the more important.
First, a distinction must be made between the date a system is intercepted and the date a breach is ultimately discovered by the affected entity, as in the interim, individuals remain exposed to harm. The mean time before a data breach is discovered in Australia is 185 days, with a further 75 days required to contain it. Reducing this significant expanse of time is one facet of protecting individuals from the harms arising from both data security and data integrity breaches.
Second, the risk of both data security and data integrity breaches ought to be minimised, including by encouraging entities to, or deterring entities from failing to, strengthen the resilience of their data storage systems to such attacks. The Australian Law Reform Commission specifically recognised the value of general deterrence in the context of data protection in its 2008 report on privacy law.
Third, in the event that a data breach does occur, transparency in communications with individuals to whom the information relates is paramount. The data collection process is relatively opaque and information asymmetry exists between individuals and the entities which collect their data. Whether or not an individual is protected from the harms associated with a data breach will depend upon the extent to which they are made aware of data breaches when they inevitably occur.
III AUSTRALIAN MANDATORY DATA BREACH NOTIFICATION LAW
The effectiveness of the Australian mandatory data breach notification law in addressing data security and data integrity breaches, and in turn protecting individuals from the forms of harm arising, is therefore of critical importance. There is no constitutional or statutory protection of a right to data security or data integrity in Australia. Further, a tort of breach of privacy does not yet exist in Australia. Victims of data breaches can thus only access compensation through the mechanisms created under the Act. If the law’s effectiveness is limited and gaps are left exposed, no alternative statute or common law cause of action exists to address these limitations and in turn ensure the protection of individuals.
A Obligations under the Law
1 Eligible Data Breaches
The Australian mandatory data breach notification law, in pursuit of its objectives, imposes a number of obligations on entities in the event of a data breach. These obligations are however only engaged in respect of breaches which are deemed ‘eligible’. The scope of the law’s conception of eligibility will be fundamental to the assessment of its ability to protect individuals from data breaches.
First, the law applies only to ‘APP entities’, which essentially encompasses federal government, but not state government, departments, as well as companies, whether for profit or not, with an annual turnover of over $3 million. Entities can also opt in voluntarily. Second, the breach must affect ‘personal information’, which refers to ‘information ... about an identified individual, or an individual who is reasonably identifiable’. The question of whether an individual is reasonably identifiable will be considered in detail in the following section. Third, unauthorised access to or disclosure of that personal information must have taken place. Fourth, the breach will only be notifiable if a reasonable person would conclude that it would be likely to result in ‘serious harm’ to any individuals to whom the information relates. While ‘serious harm’ is not defined, a number of matters are to be considered in making this determination, including, for example, the likelihood that any security measures protecting the information will be overcome. Finally, an otherwise eligible breach is exempted where the entity has taken sufficient remedial steps such that serious harm is no longer reasonably expected to occur.
An entity which experiences a data breach must take all reasonable steps to assess the breach’s eligibility under the law within 30 days. The affected entity’s notification obligations will only be enlivened if the breach is ultimately found to surpass the four hurdles described above and is not otherwise exempted.
2 Form and Content of Notifications
The affected entity must provide its notification to the OAIC as soon as practicable, and the statutory requirements regarding the content of that notification delineate the extent to which entities must be transparent in their disclosure of a data breach. The notification must describe the circumstances of the breach and the types of information affected, recommend steps affected individuals ought to take and provide the entity’s contact information.
Further, a notification containing the same information must as soon as practicable be provided to the individuals to whom the affected information relates. The entity must make this notification by contacting those affected on an individual basis where practicable, or otherwise by publishing the statement on its website.
3 Consequences of Contravention
As important as the law’s scope of application and the obligations it creates are the consequences for entities which do not comply. Primarily, these consequences take the form of monetary penalties or compensation. For example, non-compliance with the mandatory data breach notification law amounts to an ‘interference with the privacy of an individual’ under the Act. Serious and repeated interferences are subject to civil penalties, such that if an entity were to repeatedly neglect its notification obligations, it would be subject to a penalty of up to $420 000.
As prefaced earlier, in the absence of a tort of breach of privacy, the enforcement mechanisms of the OAIC following its investigation of a contravention or complaint made to it are the only remedies available to individuals affected by data breaches. Most importantly, the OAIC can make a determination that the entity must undertake ‘any reasonable act ... to redress any loss or damage suffered by the complainant’. In respect of representative complaints, the OAIC can further make a determination that specified payments be calculated and made to the class members.
B Limitations of the Law
The scope of ‘eligibility’ under the law, the substance of the obligations it creates, and the consequences of non-compliance detailed above each expose a number of limitations in respect of the law’s ability to adequately protect individuals.
1 Narrow Scope of the Law
First, the scope of the mandatory data breach notification law is circumscribed by (a) its application only to ‘APP entities’; (b) its application only to data breaches constituting ‘access’ or ‘disclosure’; (c) its application only to ‘personal information’; and (d) the ‘serious harm’ threshold.
The definition of ‘APP entity’ is a relatively low threshold which should extend to most entities which engage in the collection of significant amounts of data about individuals. Similarly, the Act’s use of the terminology of ‘access’ and ‘disclosure’ to describe data breach events appears to encompass most, if not all, data security breaches and data integrity breaches. These features of the law are appropriate to the purpose of protecting individuals from data breaches.
The definition of ‘personal information’, however, only encompasses information about an individual who is ‘reasonably identifiable’. This is concerning in light of the risk of an individual being reidentified from ostensibly de-identified data. ‘De-identified’ information is defined by the Act as information which ‘is no longer about an identifiable individual or an individual who is reasonably identifiable’. It is apparent that the definition of ‘personal information’ is intended to reward those entities which de-identify the data they hold. This reward is unwarranted, given that as data collection becomes exponentially more extensive, anonymisation becomes less reliable as a source of protection.
Importantly, reidentification ‘gains power through accretion’, as unrelated datasets continue to be combined one after another. A hacker only requires one ‘unique data fingerprint’ to begin the task of linking an individual to other data about them, whether publicly available or contained in other databases to which the hacker has gained access. For example, it might appear harmless if an individual’s rating of movies on one online database is linked to another similar database. However, if that single link exposes a common username used by that individual, a rather extensive history of their conduct on the Internet and further personal information might be uncovered.
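The linkage process described above can be illustrated in simplified form. The following Python sketch uses entirely hypothetical datasets and field names; a single shared ‘fingerprint’ (here, a reused username) is all that is required to join two otherwise unrelated datasets:

```python
# Hypothetical illustration of a linkage attack: two unrelated datasets
# are joined on a single shared identifier (a reused username).
movie_ratings = [
    {"username": "filmfan88", "title": "Vertigo", "rating": 5},
    {"username": "quietreader", "title": "Alien", "rating": 4},
]
forum_profiles = [
    {"username": "filmfan88", "email": "j.citizen@example.com", "suburb": "Newtown"},
]

def link_datasets(ratings, profiles):
    """Join the two datasets wherever the username 'fingerprint' matches."""
    by_user = {p["username"]: p for p in profiles}
    return [
        {**r, **by_user[r["username"]]}
        for r in ratings
        if r["username"] in by_user
    ]

linked = link_datasets(movie_ratings, forum_profiles)
# The single match exposes an email address and suburb alongside viewing history.
```

Each additional dataset joined in this way compounds the resulting profile, which is precisely the ‘accretion’ described above.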
Differential privacy techniques have been shown to minimise or eliminate the risk of reidentification by, in simple terms, introducing random ‘noise’ into a database to distort the data. However, there is currently no set standard by which reidentification risk can be measured. The line between de-identified information and personal information under the Act is not fixed, and this erodes the ability of the law to properly protect individuals from breaches of de-identified information.
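By way of illustration only, the ‘noise’ mechanism can be sketched as follows. This simplified Python example (the function name and parameters are illustrative, not drawn from any particular implementation) adds Laplace-distributed noise to the true answer of a counting query, the canonical differential privacy mechanism:

```python
import random

def noisy_count(true_count, epsilon=0.5):
    """Return a differentially private answer to a counting query.

    Laplace noise of scale 1/epsilon is added (a counting query changes by
    at most 1 when one individual is added or removed, so its sensitivity
    is 1). The difference of two exponential variates with rate epsilon is
    Laplace-distributed with scale 1/epsilon.
    """
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# A smaller epsilon produces more noise and hence stronger privacy,
# at the cost of accuracy in the reported statistic.
print(noisy_count(1000, epsilon=0.1))
```

Because each published statistic is distorted, no single record can be confidently linked back to an individual, although the aggregate remains approximately accurate.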
Further, the law relies upon the ‘risk-based trigger’ of serious harm to limit its scope. The considerations to be taken into account in assessing the risk of serious harm importantly do not restrict the kinds of harm which may form the basis of eligibility. Affected individuals may, for example, be likely to suffer mental distress, and provided other considerations similarly point towards seriousness, this will be sufficient. The determination of whether this threshold has been met, however, concerningly rests with the affected entity. It is at this initial ‘perception’ stage of a data breach that directors and other employees, or even independent information technology (‘IT’) consultants and auditors, will be especially susceptible to bias ‘towards the status quo’. It is unsurprising that those responsible for choosing a particular approach to data protection will be biased against questioning the effectiveness of that approach, particularly if that approach was advised by perceived IT ‘experts’.
Confirmation bias may lead to underestimation of the risk of harm posed by a particular breach, or even incomplete investigations at first instance. The cognitive biases which will inevitably affect those investigating a data breach undermine suggestions that entities should ‘operate from pessimistic assumptions about the performance of privacy technologies’. Additionally, once a decision is made not to notify, there is no prescribed way for any party to discover the reasons behind that determination.
2 Sufficiency of the Content and Transparency of Notifications
Second, the ability of the minimum content requirements for notifications to adequately protect individuals from harm is questionable. The law helpfully requires that a number of details be disclosed about when and how a data breach occurred, which together meet Bisogni’s conception of ‘full transparency’ in data breach notifications. Bisogni’s conception requires that at least two of the following details be present: the type of event, a description of its cause, and the dates of the breach itself and its discovery. Importantly, however, there is no obligation under the Australian law to disclose the date on which the breach occurred (as opposed to the date of discovery) or describe the ‘serious harm’ which is purportedly likely to affect the individual. The individual is left to draw their own conclusions from the other information disclosed.
Further, although the law provides clear content guidelines with which entities must abide, no guidance is provided as to tone or style. Whether or not an individual takes the contents of a data breach notification seriously will ultimately depend on both of these features. If a notification ‘downplays the effects of the data breach’ using a ‘reassuring tone’, or makes recommendations in ‘passive terms’, even if it goes on to recommend that an individual take appropriate action, they may well feel this is unnecessary. Such drafting techniques may further encourage ‘notification fatigue’ and again minimise the extent to which individuals are encouraged to take steps to protect themselves. Although notifications must suggest protective steps for individuals to take, this assumes that those steps will be, first, taken seriously, and second, acted upon.
3 Lack of Deterrence
Third, data breach notification laws are inherently designed to provide only ‘ex post protection’ to individuals. The extent to which they protect individuals before a breach occurs is limited to the deterrent function served by monetary penalties. For instance, the Australian law does not necessarily shorten the time gap between interception and detection except to the extent that entities may be encouraged to improve their detection systems to facilitate their ability to comply.
Notification is generally hoped to mitigate the risk of identity theft by encouraging individuals to take steps to address this risk, such as by contacting a credit reporting agency. The individual is better placed to be proactive in protecting themselves if they are made aware of the breach which has taken place and the implications for their information. In the US, a study by Telang and Acquisti found data breach notification laws could reduce instances of identity theft stemming from data breaches by 6.1 per cent.
Despite this benefit, the Australian law relies heavily upon the deterrent effect associated with civil penalties. This ignores the fact that an entity wary of the consequences of publicising a data breach (for example, a plunge in stock value or damage to brand loyalty) may choose to risk punishment in the short term, because the chance of a third party discovering the breach is slim. On average, stock prices have been shown to drop by five per cent on the day a data breach is notified. While it is unclear whether these effects continue in the long term, the media attention associated with a sudden drop may be enough to encourage entities to take the risk of opting not to notify.
Further, failing to notify a data breach on one occasion is unlikely to be classed as a ‘serious or repeated’ contravention, and therefore subject to the only civil penalty applicable to the law. While data breach notifications can ‘make traceable an otherwise untraceable security breach’ through publicity, this is only on the assumption that entities will in fact comply and not merely risk the consequences of non-compliance.
In the consumer protection context, increased penalties are generally assumed to deter companies from breaching the law. For example, on 1 September 2018, the maximum penalty for a contravention of the Australian Consumer Law was increased from just $1.1 million per contravention to the greater of $10 million, triple the benefit received through the conduct or 10 per cent of the previous year’s annual turnover. These changes were directly in response to findings by Consumer Affairs Australia and New Zealand (‘CAANZ’) and Gordon J of the Federal Court to the effect that the existing penalties were insufficient to serve any deterrent function. Further, CAANZ found some companies treated the penalties as a mere ‘cost of doing business’. The mandatory data breach notification law may suffer the same fate, in light of the relatively low civil penalties imposed and the inability of individuals to bring claims directly against entities for breaches of privacy.
Simply put, data breach notification laws uncover what was previously all too easily hidden by entities. For example, Uber failed to disclose a large-scale breach of its users’ accounts in 2016 for over a year. In the absence of notification, a prudent individual would be required to constantly monitor all of their information, most obviously credit cards and bank accounts, in case of an incident. In the first seven months of the mandatory data breach notification law’s operation, 550 breaches were notified. This number is significantly larger than that notified annually under the previous voluntary regime. The law thus offers at least limited transparency benefits for individuals, but these could be enhanced by adopting appropriate technology in conjunction with the law.
IV USING BLOCKCHAIN TO COMPLY WITH THE NOTIFICATION LAW
Entities seeking to efficiently comply with the Australian mandatory data breach notification law will inevitably turn to technology to streamline their processes. One such option is to store individuals’ data on a blockchain. The adoption of technologies like this may also have the effect of addressing the identified limitations of the law, such that individuals are more effectively protected from data breaches. While other technologies also exhibit potential in this context, and will continue to be developed, blockchain is treated as a representative example.
A Defining Blockchain
In order to assess blockchain’s potential to facilitate the achievement of the law’s objectives, the technology must first be defined. The concept of a blockchain ledger originated from Satoshi Nakamoto’s white paper on the cryptocurrency Bitcoin, and it is this history which has led to ongoing confusion about the distinction between Bitcoin and blockchain. Importantly, Bitcoin is merely one application of the underlying blockchain technology, and the two must be considered distinct.
The term ‘blockchain’ itself reveals the key features of its structure: ‘blocks’ linked together in a ‘chain’. Each block individually stores certain pieces of information, and further blocks join the chain as more information is added. Over time, the chain forms a database which continues to grow as information is collected or changes are made. For the purpose of this thesis, blockchains which operate as distributed ledgers, and not centralised ledgers, will be the focal point, as the features associated with decentralisation have particular potential in the context of data integrity breaches. A decentralised ledger consists of a number of servers (referred to as ‘nodes’) which are interconnected and store the contents of the ledger at the same time, rather than relying upon only one central server.
The ‘chain’ structure is necessary for verifying the integrity of the data stored in each block. The accuracy of the contents of a particular block is ensured by linking it successively back down the chain. This is achieved by including in each block a cryptographically generated hash of the preceding block, so that block can be conclusively identified and any alteration to it detected. This system of cryptography ensures the order of the blocks is maintained, thus also guaranteeing the integrity of the chain. The blockchain operates as a code-based mechanism of trust: one can trust the contents of the ledger because its very structure authenticates it. It is this feature which has led to blockchain being described as ‘the truth machine’, though claims with such far-reaching implications should be treated with some caution.
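In simplified terms, this hash-linking can be sketched as follows. The Python example below is a toy illustration only (not a production blockchain): each block records the hash of its predecessor, so any alteration to an earlier block invalidates every later link and can be located precisely:

```python
import hashlib
import json

GENESIS_HASH = "0" * 64  # fixed value the first block links back to

def block_hash(data, prev_hash):
    """Hash a block's contents together with its link to the previous block."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Build a chain in which each block stores its predecessor's hash."""
    chain, prev = [], GENESIS_HASH
    for data in records:
        block = {"data": data, "prev_hash": prev, "hash": block_hash(data, prev)}
        chain.append(block)
        prev = block["hash"]
    return chain

def first_invalid_block(chain):
    """Return the index of the first block whose recorded hashes no longer
    match its contents, or None if the chain is intact."""
    prev = GENESIS_HASH
    for i, block in enumerate(chain):
        recomputed = block_hash(block["data"], block["prev_hash"])
        if block["prev_hash"] != prev or block["hash"] != recomputed:
            return i
        prev = block["hash"]
    return None
```

Altering the data in any block changes the hash that block should carry, so a verification pass walking the chain from the genesis block identifies both that a change has occurred and where.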
As such, the blockchain ledger begins with a single block of information and builds block by block to become an ever-growing succession of records. This ledger is accessible by any computer ‘running the same protocol’; however, the data stored within each block are encrypted and accessible only to those with an access key.
Further, in order for new blocks to be added to the chain or changes to be made, the various nodes in a decentralised ledger must reach a consensus about that addition or change. Different blockchains rely on different requirements for reaching consensus, and it is outside the scope of this thesis to discuss those here.
B Blockchain’s Potential in Data Protection
A blockchain-based ledger as described above is by no means limited in its application to cryptocurrencies, and it may be used both to store data held about individuals and to record when and by whom those data are accessed. For example, the Estonian government has successfully used blockchain technology to secure the data it stores about its citizens. Part I introduced the idea of technology designed using a ‘privacy by design’ approach, and blockchain may prove to be an example of this. The intention behind a privacy by design framework is that ‘security is maintained throughout the life cycle of the data process, and ... [t]he interests of individuals are central in the design of the service’.
Both the data stored about each individual and the transactional information about when and by whom those data have been accessed would be recorded and protected by encryption. Each transaction of data would thus involve two blockchain-based ledgers. The first ledger would contain each individually encrypted piece of information or document. The second ledger would record all transactions of that information, including when each document is accessed. Access rights could be further demarcated by clauses of smart contracts, for example, so information may only be accessed a certain number of times.
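The two-ledger arrangement can be sketched in simplified form. In the Python illustration below (all names and structures are hypothetical, and ordinary in-memory objects stand in for the two ledgers), every read of a stored document is appended to an access ledger, and a smart-contract-style clause caps the number of permitted accesses:

```python
from datetime import datetime, timezone

data_ledger = {}    # first ledger: document id -> encrypted record
access_ledger = []  # second ledger: append-only record of access events
ACCESS_LIMIT = 2    # smart-contract-style clause: each document readable twice

def store(doc_id, ciphertext):
    """Record an encrypted document on the first ledger."""
    data_ledger[doc_id] = ciphertext

def read(doc_id, accessor):
    """Return a document, recording the access and enforcing the limit."""
    prior_accesses = sum(1 for e in access_ledger if e["doc_id"] == doc_id)
    if prior_accesses >= ACCESS_LIMIT:
        raise PermissionError(f"access limit reached for {doc_id}")
    access_ledger.append({
        "doc_id": doc_id,
        "accessor": accessor,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return data_ledger[doc_id]
```

The access ledger thus provides exactly the transactional record described above: what was accessed, by whom, and when.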
C Addressing the Limitations of the Law
These features of blockchain when applied to the context of data protection may operate to facilitate entities’ ability to efficiently comply with the mandatory data breach notification law and, in turn, address the law’s identified limitations.
1 Addressing Limitations of Scope and Transparency through Enhanced Detection
While the adoption of a blockchain-based ledger cannot change the scope of the law, it may operate to streamline an entity’s ability to comply with the law and in turn improve outcomes for individuals, including by enhancing transparency. A number of limitations arise from the circumscribed scope of the law, including the requirement for entities to undertake their own assessment as to whether a data breach is likely to cause ‘serious harm’ and concerns regarding reidentification risk.
The law importantly relies upon an entity’s ability to readily detect data breaches as and when they occur. As discussed in Part II, however, the time gap between interception and detection in Australia is expansive. Using blockchain to store data would improve data breach detection capabilities, whether of data security breaches or data integrity breaches. For example, if one block were altered without authority, this would immediately affect all successive blocks in the chain, and it would be easy to detect not only that such a change has occurred, but where precisely it has occurred. Similarly, the existence of a ledger which specifically records access events would make data security breaches readily traceable. Proof of what specific data were accessed, and when, would be available in real time, with the time-stamping of each block providing particular precision. Blockchain would facilitate compliance with the law by providing real-time knowledge of access to or manipulation of the ledger.
Further, if all data collected by an entity are stored on a blockchain, the entity’s ability to detect a data breach does not at first instance depend upon whether those data constitute ‘personal information’ or are de-identified. The affected entity will be able to efficiently discover and investigate a breach of any data it holds and subsequently determine whether or not the reidentification risk is sufficient to require notification under the law.
The improved detection capabilities associated with data storage on a blockchain would offer ‘granular transparency’ for individuals in the event of a breach. Although this does not alter the minimum content of notifications required under the law, precise information will be available to entities about breaches which affect them. This information may be provided to individuals to address the ‘formidable information gap’ which currently exists. In this way, the adoption of blockchain would not only facilitate companies’ compliance with the mandatory data breach notification law by streamlining detection, but also enhance protective outcomes for individuals.
2 Addressing Lack of Deterrence by Improving Ex Ante Resilience
Blockchain also has the potential to protect individuals from data breaches at first instance by improving the resilience of entities’ data storage systems, without relying on the law’s purported deterrent function. A ‘privacy by design’ framework is fundamentally concerned with providing proactive protection to individuals as opposed to post-hoc remedies.
In respect of maintaining data integrity, a decentralised blockchain ledger is particularly appealing. A decentralised ledger ‘rais[es] the barriers for manipulation of stored data’, as there is no central authority’s server to attack to force consensus to a proposed change. Rather, a hacker would be required to intercept a number of nodes to achieve consensus, in circumstances where they may not know the consensus mechanism used or the total number of participating nodes.
In respect of maintaining data security, blockchain relies upon the complex cryptography which secures the contents of each block. A blockchain’s data security may also be bolstered in a number of ways, although it is outside the scope of this thesis to determine their viability. For example, a blockchain may require multiple keys to authorise transactions, or may operate as a zero-knowledge proof system, allowing claims about data to be verified without the underlying data being disclosed. Most importantly, however, the speed of technological development to date has established that cryptographic techniques remain effective only unless and until they are surpassed by corresponding decryption techniques. Developments such as quantum computing may, in time, break the encryption underlying any blockchain ultimately adopted to store data. It has been reiterated that the code behind blockchain ‘is written by humans, and is always subject to human error’, and blockchain should thus not be treated as a ‘magical cybersecurity silver bullet’. Further, these concerns cannot easily be addressed by, for example, deleting the ledger and recreating it using different encryption techniques. The contents of a blockchain ledger are immutable, and in the context of a decentralised ledger, erasure would require all instances of the ledger to be deleted and replaced accordingly.
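The tamper-evidence on which the integrity argument rests can be shown in a short sketch. The block structure below is illustrative only (real blockchains add timestamps, signatures and consensus data): the point is simply that each block’s hash binds it to its predecessor, so altering any stored record invalidates the chain from that block onward.

```python
import hashlib

def block_hash(index, data, prev_hash):
    """Hash binding a block's contents to its predecessor."""
    return hashlib.sha256(f"{index}|{data}|{prev_hash}".encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64
    for i, data in enumerate(records):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Recompute every hash; an edit to any block breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block_hash(block["index"], block["data"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["record A", "record B", "record C"])
ok_before = verify(chain)
chain[1]["data"] = "record B (altered)"   # a data integrity breach
ok_after = verify(chain)
```

Note that this same property cuts both ways: the immutability which makes tampering evident is also what prevents a ledger encrypted with a superseded technique from simply being rewritten in place.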
Further, the data security of a blockchain may be undermined by the requirement for multiple access keys to be generated, exchanged and stored in order for the stored data to be inspected. These keys must simultaneously be accessible for authorised users while remaining ‘resistant to digital theft’. This is ‘difficult to achieve in practice’, particularly if a large number of keys must be maintained. Hackers may then choose to exploit vulnerabilities in the way access keys are managed in order to intercept stored data, rather than target the blockchain ledger directly.
Blockchain offers attractive benefits in mitigating the risk of data integrity breaches; however, there are a number of concerns associated with its ability to maintain data security, particularly in the long term as encryption and decryption technologies advance. In contexts where the integrity of data is paramount and security is not necessarily a priority, then, a blockchain ledger has considerable potential. An entity concerned about the public relations consequences of a notification may not be deterred from breaching its statutory obligations, but may be encouraged to adopt blockchain to reduce the risk of data integrity breaches taking place at all. This would in turn reduce the risks to which individuals are exposed and improve the extent to which they are protected from such breaches.
3 Scalability Concerns
While the adoption of blockchain for data storage brings benefits regarding detection capabilities, transparency and the mitigation of data integrity breaches, concerns about its scalability have arisen. Centrally, the fundamental structure of a blockchain relies upon its ability to scale as the ledger grows, particularly because its integrity depends upon the continued existence of all previous blocks. Latency issues have become apparent over time in some applications of blockchain, and the ledger may not be scalable enough to withstand the storage of exponentially growing volumes of data. Various solutions to these scalability limitations have been proposed and are under development; however, it is outside the scope of this thesis to consider their viability.
V POTENTIAL ROLE OF THE AUSTRALIAN GOVERNMENT
As foreshadowed in Part I and further explored in Part IV, entities subject to the mandatory data breach notification law may choose to adopt blockchain of their own volition to facilitate their ability to comply with the law. This behaviour may then prompt other entities to adopt these technologies in a piecemeal manner to manage data security and data integrity risks. Further, intra- and inter-industry collaboration may ensue as industries begin to develop blockchain protocols appropriately adapted to their specific data storage needs and practices. However, Part III has detailed the limited deterrent function of the current law, and it is thus unlikely that a substantial number of entities will take these proactive steps to streamline their compliance with the law. Reliance upon the slow uptake of new technologies by entities is not an ideal solution to the question of protecting individuals from the imminent risk of data breaches.
Instead, the Australian government may play an important facilitative role in encouraging companies to take steps to strengthen data security and data integrity, including by adopting particular technologies. Eastman has proposed that governments can regulate private sector approaches to cybersecurity in three ways: (i) mandating minimum standards; (ii) ‘mandating that industries set their own standards’; and (iii) providing ‘incentives to adopt cybersecurity standards’. For example, the OAIC has released non-binding guidelines for entities which develop their own codes of practice for managing data, and it could similarly do so with respect to data security standards. Any of Eastman’s approaches may be taken by the Australian government in order to encourage the adoption of specific technologies or features of technologies to address the limitations of the law. A minimum cybersecurity standard for a particular industry may, for instance, provide that entities ought to adopt decentralised data storage systems, whether blockchain-based or not, to address data integrity concerns. In light of the limitations of blockchain, particularly with respect to maintaining data security and scalability over time, the government should refrain from making strict recommendations in favour of particular technologies except in very narrow contexts.
Government intervention may also extend to direct interaction and collaboration with industry stakeholders in order to ensure the technology solutions adopted are appropriate for the specific needs of that industry. For example, blockchain may be suitable for an entity which is most concerned with preventing data integrity breaches. While the OAIC continues to liaise with industry about ‘recurring or significant issues arising in complaints’, it may also choose to arrange closed industry meetings regarding data security and data integrity concerns and best practice. Further, the OAIC could engage with cybersecurity and software development firms in order to connect the two disciplines essential to a working ‘privacy by design’ methodology: software engineering and law. This professional collaboration could result in improved protective outcomes for individuals, as the identified limitations of the law could be put to software engineering experts and subsequently addressed.
The Australian mandatory data breach notification law does not adequately achieve its objective of protecting individuals from data breaches. First, while the law is designed to deter entities from contravening their obligations, they may be willing to accept the risk of a penalty in circumstances where the breach may never be discovered outside their organisation. This reduces the effectiveness of the post-hoc protection purportedly offered by the law. Second, the definition of ‘personal information’ may not adequately address concerns about the reidentification of individuals from ostensibly de-identified data. Third, the minimum content of notifications is not sufficiently transparent. Companies are required to disclose neither the time gap between interception and detection, nor the specific risks to which the affected individual is exposed. The absence of these details makes it more difficult for individuals to assess whether or not they ought to take the steps recommended in the notification.
The adoption of technologies can operate to simultaneously assist entities to efficiently comply with their notification obligations and address these limitations of the law. Blockchain has been used to exemplify this possibility. In particular, the use of a blockchain to store individuals’ data would enable real-time detection of data breaches, owing to the existence of a ledger recording access events and the ability for data integrity breaches to be precisely pinpointed. Using blockchain would thus reduce companies’ compliance costs by streamlining the data breach detection and investigation process, although an initial investment in the technology would be necessary. However, blockchain’s usefulness in mitigating the risk of data breaches is limited to making data integrity breaches more difficult to carry out. With respect to data security breaches, ongoing advancements in encryption and decryption undermine a blockchain-based ledger’s ability to maintain data security over time, particularly because of the immutability of its contents.
Incentivising entities to improve their data security systems and take their obligations towards protection of individuals’ data seriously cannot be achieved purely through legislation like the Australian mandatory data breach notification law. Instead, cultural change will be required, over time, to develop minimum accepted cybersecurity standards either of general application or specific to certain industries. This can be facilitated by government interventions in the form of imposed standards, incentives to adopt standards or direct collaboration with industry to discuss concerns and develop solutions.
As the ‘historical boundaries for information-collection’ continue to shift, the question of data security and integrity will become all the more critical. In the dynamic context of rapid technological developments, no one technology will provide certain data security or integrity in all circumstances or forever. A comprehensive regulatory approach is required, consisting of both minimum cybersecurity requirements and mandatory post-hoc responses to data breaches. These responses must ensure individuals are not only made aware of what has occurred and what steps they ought to take, but also the seriousness of the risks to which they are exposed. Ultimately, protecting individuals from data breaches, an aim to which the law aspires, requires multiple interdependent approaches ranging from statutory regulation to collaboration between industry and the government.
 Ugo Pagallo, Massimo Durante and Shara Monteleone, ‘What Is New with the Internet of Things in Privacy and Data Protection? Four Legal Challenges on Sharing and Control in IoT’ in Ronald Leenes et al (eds), Data Protection and Privacy: (In)visibilities and Infrastructures (Springer, 2017) 59, 60.
 Murat Karaboga et al, ‘Is There a Right to Offline Alternatives in a Digital World?’ in Ronald Leenes et al (eds), Data Protection and Privacy: (In)visibilities and Infrastructures (Springer, 2017) 31, 39.
 Susan Landau, Listening In: Cybersecurity in an Insecure Age (Yale University Press, 2017) 23.
 Peter Burnett, How to Do Privacy in the 21st Century: The True Story of Hacktivism (Eyewear Publishing, 2017) 13.
 Privacy Amendment (Notifiable Data Breaches) Act 2017 (Cth) s 2.
 Commonwealth, Parliamentary Debates, House of Representatives, 19 October 2016, 2430 (Michael Keenan).
 See, eg, Elisa Orrù, ‘Minimum Harm by Design: Privacy by Design to Mitigate the Risks of Surveillance’ in Ronald Leenes et al (eds), Data Protection and Privacy: (In)visibilities and Infrastructures (Springer, 2017) 107.
 2018 Cybersecurity Market Report (31 May 2017) Cybersecurity Ventures <https://cybersecurityventures.com/cybersecurity-market-report/>.
 Alexander Jenkins, Murugan Anandarajan and Rob D’Ovidio, ‘“All That Glitters Is Not Gold”: The Role of Impression Management in Data Breach Notification’ (2014) 78 Western Journal of Communication 337, 338.
 Australian Government Department of Home Affairs, Identity Crime in Australia, <https://www.homeaffairs.gov.au/crime/Documents/infographic-identity-crime-australia.pdf>; Norton by Symantec, ‘2017 Norton Cyber Security Insights Report: Global Results’ (Report, 2018) 4.
 Australian Payments Network, ‘Australian Payment Card Fraud 2018: January–December 2017 Data’ (Report, 2018) 13–14.
 Dustin D Berger, ‘Balancing Consumer Privacy with Behavioral Targeting’ (2011) 27 Santa Clara Computer and High Technology Law Journal 3, 27.
 See, eg, the 2016 breach of World Anti-Doping Agency records of athletes’ use of banned substances, which records were suspected to have been manipulated prior to their publication: Fancy Bears Doping Data ‘May Have Been Changed’ Says Wada (5 October 2016) BBC <https://www.bbc.com/sport/37570246>.
 Andrew W Bagley and Justin S Brown, ‘Limited Consumer Privacy Protections against the Layers of Big Data’ (2014–15) 31 Santa Clara Computer and High Technology Law Journal 483, 495.
 Ponemon Institute, ‘2018 Cost of a Data Breach Study: Global Overview’ (Report, July 2018) 34.
 Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice, Report No 108 (2008) 250–1.
 Berger, above n 12, 15.
 Victoria Park Racing and Recreation Grounds Co Ltd v Taylor [1937] HCA 45; (1937) 58 CLR 479; Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd (2001) 208 CLR 199.
 Privacy Act 1988 (Cth) s 26WE.
 Ibid ss 6 (definitions of ‘APP entity’ and ‘agency’), 6C(1), 6D(1), (4).
 Ibid s 6EA.
 Ibid s 6 (definition of ‘personal information’).
 Ibid s 26WE(2)(a).
 Ibid s 26WG.
 Ibid s 26WF.
 Ibid s 26WH.
 Ibid s 26WK(1), (2).
 Ibid s 26WK(3).
 Ibid s 26WL(1), (3).
 Ibid s 26WL(2), (4).
 Ibid s 13(4A).
 Ibid s 13G; Crimes Act 1914 (Cth) s 4AA (definition of ‘penalty unit’).
 Privacy Act 1988 (Cth) ss 40(2) (in respect of OAIC investigations) and 36, 38, 40(1) (in respect of complaints made by individuals).
 Ibid s 52.
 Ibid s 52(4), (5).
 Mark Burdon, ‘Contextualizing the Tensions and Weaknesses of Information Privacy and Data Breach Notification Laws’ (2011) 27 Santa Clara Computer and High Technology Law Journal 63, 75.
 Privacy Act 1988 (Cth) s 6 (definition of ‘de-identified’).
 Paul Ohm, ‘Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization’ (2010) 57 UCLA Law Review 1701, 1705.
 Giske Ursin et al, ‘Protecting Privacy in Large Datasets – First We Assess the Risk; Then We Fuzzy the Data’ (2017) 26(8) Cancer Epidemiology, Biomarkers & Prevention 1, 1. See also ibid 1746–7; Arvind Narayanan and Vitaly Shmatikov, ‘Robust De-anonymization of Large Sparse Datasets’ (Paper presented at the IEEE Symposium on Security and Privacy, Oakland, 18–22 May 2008) 9.
 Ohm, above n 39, 1725.
 See generally Narayanan and Shmatikov, above n 40.
 See generally Ursin et al, above n 40; Andrew Chin and Anne Klinefelter, ‘Differential Privacy as a Response to the Reidentification Threat: The Facebook Advertiser Case Study’ (2012) 90 North Carolina Law Review 1417.
 Paul M Schwartz and Daniel J Solove, ‘The PII Problem: Privacy and a New Concept of Personally Identifiable Information’ (2011) 86 New York University Law Review 1814, 1836.
 Burdon, above n 37, 77.
 Privacy Act 1988 (Cth) s 26WG.
 Lilian Mitrou and Tatiana-Eleni Synodinou, ‘Legal Consequences of Cybercrime’ in Ioannis Iglezakis (ed), The Legal Regulation of Cyber Attacks (Wolters Kluwer, 2016) 139, 144–5.
 Mark Burdon, Jason Reid and Rouhshi Low, ‘Encryption Safe Harbours and Data Breach Notification Laws’ (2010) 26 Computer Law & Security Review 520, 523–4.
 Oliver Marnet, ‘Behaviour and Rationality in Corporate Governance’ (2008) 1 International Journal of Behavioural Accounting and Finance 4, 10, 12.
 Travis Laster, ‘Cognitive Bias in Director Decision-Making’ (2012) 20(6) Corporate Governance Advisor 1, 5–6.
 See, eg, Sally Patten, ‘How Confirmation Bias Leads to Bad Corporate Behaviour’, Australian Financial Review (online), 31 May 2018 <https://www.afr.com/brand/boss/how-confirmation-bias-leads-to-bad-corporate-behaviour-20180518-h108ue>.
 Chin and Klinefelter, above n 43, 1429.
 Paul M Schwartz and Edward J Janger, ‘Notification of Data Security Breaches’ (2007) 105 Michigan Law Review 913, 931.
 Fabio Bisogni, ‘Proving Limits of State Data Breach Notification Laws: Is a Federal Law the Most Adequate Solution?’ (2016) 6 Journal of Information Policy 154, 177.
 Ibid 179–81; Schwartz and Janger, above n 53, 952.
 Bisogni, above n 54, 184–6; Sara M Smyth, ‘Does Australia Really Need Mandatory Data Breach Notification Laws – And If So, What Kind?’ (2012–13) 22 Journal of Law, Information and Science 159, 176–7.
 Burdon, above n 37, 79.
 Ibid 75–6.
 Murugan Anandarajan, Rob D’Ovidio and Alexander Jenkins, ‘Safeguarding Consumers against Identity-Related Fraud: Examining Data Breach Notification Legislation through the Lens of Routine Activities Theory’ (2013) 3 International Data Privacy Law 51, 53.
 Bisogni, above n 54, 155.
 See ibid 160; Centrify and Ponemon Institute, ‘The Impact of Data Breaches on Reputation and Share Value’ (Report, May 2017); Kevin Kelleher, Facebook Loses around $13 Billion in Value after Data Breach Affects 50 Million of Its Users (28 September 2018) Fortune <http://fortune.com/2018/09/28/facebook-stock-falls-after-security-breach/>.
 Centrify and Ponemon Institute, above n 62, 2; see further Saim Kashmiri, Cameron Duncan Nicol and Liwu Hsu, ‘Birds of a Feather: Intra-Industry Spillover of the Target Customer Data Breach and the Shielding Role of IT, Marketing and CSR’ (2017) 45 Journal of the Academy of Marketing Science 208.
 See Paul Bischoff, Analysis: How Data Breaches Affect Stock Market Share Prices (11 July 2017) CompariTech <https://www.comparitech.com/blog/information-security/data-breach-share-price/>; cf Elena Kvochko and Rajiv Pant, Why Data Breaches Don’t Hurt Stock Prices (31 March 2015) Harvard Business Review <https://hbr.org/2015/03/why-data-breaches-dont-hurt-stock-prices>.
 Schwartz and Janger, above n 53, 928; Bisogni, above n 54, 165.
 Treasury Laws Amendment (2018 Measures No 3) Bill 2018 (Cth) sch 1.
 Consumer Affairs Australia and New Zealand, ‘Australian Consumer Law Review: Final Report’ (March 2017) 88; Australian Competition and Consumer Commission v Coles Supermarkets Australia Pty Ltd [2014] FCA 1405 (Gordon J).
 Consumer Affairs Australia and New Zealand, above n 67, 87.
 See, eg, Antonio Kung et al, ‘A Privacy Engineering Framework for the Internet of Things’ in Ronald Leenes et al (eds), Data Protection and Privacy: (In)visibilities and Infrastructures (Springer, 2017) 163, 165 (noting that ‘the business model of some companies is based on extracting value from personal data and tolerat[ing] data protection related risks or even financial penalties’).
 Mike Isaac, Katie Benner and Sheera Frenkel, ‘Uber Hid 2016 Breach, Paying Hackers to Delete Stolen Data’, New York Times (online), 21 November 2017 <https://www.nytimes.com/2017/11/21/technology/uber-hack.html>.
 Office of the Australian Information Commissioner, ‘Notifiable Data Breaches Quarterly Statistics Report: 1 July–30 September 2018’ (Report, 30 October 2018) 4.
 Smyth, above n 57, 176.
 Satoshi Nakamoto, ‘Bitcoin: A Peer-to-Peer Electronic Cash System’ (White Paper, 2009).
 See, eg, Angela Walch, ‘The Path of the Blockchain Lexicon (and the Law)’ (2016–17) 36 Review of Banking & Financial Law 713.
 Dirk A Zetzsche, Ross P Buckley and Douglas W Arner, ‘The Distributed Liability of Distributed Ledgers: Legal Risks of Blockchain’ (2018) 2018 University of Illinois Law Review 1361, 1371.
 Sue McLean and Simon Deane-Johns, ‘Demystifying Blockchain and Distributed Ledger Technology – Hype or Hero?’ (2016) 4 CRi 97, 97.
 See, eg, Michael J Casey and Paul Vigna, The Truth Machine: The Blockchain and the Future of Everything (St Martin’s Press, 2018).
 McLean and Deane-Johns, above n 76, 97.
 Zetzsche, Buckley and Arner, above n 75, 1371.
 Josh Hall, How Blockchain Could Help Us Take Back Control of Our Privacy (22 March 2018) The Guardian <https://www.theguardian.com/commentisfree/2018/mar/21/blockchain-privacy-data-protection-cambridge-analytica>.
 Katharine Kemp and Ross P Buckley, ‘Protecting Financial Consumer Data in Developing Countries: An Alternative to the Flawed Consent Model’ (2017) 18(3) Georgetown Journal of International Affairs 35, 41.
 Nir Kshetri, ‘Blockchain’s Roles in Strengthening Cybersecurity and Protecting Privacy’ (2017) 41 Telecommunications Policy 1027, 1030.
 Ponemon Institute, above n 15, 34.
 Bao-Kun Zheng et al, ‘Scalable and Privacy-Preserving Data Sharing Based on Blockchain’ (2018) 33 Journal of Computer Science and Technology 557, 557; Kshetri, above n 82, 1030.
 UK Government Chief Scientific Adviser, ‘Distributed Ledger Technology: Beyond Block Chain’ (Report, 19 January 2016) 6, 28–9.
 Ibid 22.
 Berger, above n 12, 15, 59–60.
 Orrù, above n 7, 108.
 Zetzsche, Buckley and Arner, above n 75, 1371.
 Ibid 1372; Kshetri, above n 82, 1028.
 Kshetri, above n 82, 1027.
 Sherman Lee, Privacy Revolution: How Blockchain Is Reshaping Our Economy (31 July 2018) Forbes <https://www.forbes.com/sites/shermanlee/2018/07/31/privacy-revolution-how-blockchain-is-reshaping-our-economy/#76236beb1086>.
 McLean and Deane-Johns, above n 76, 102.
 Dalmacio V Posadas Jr, ‘The Internet of Things: The GDPR and the Blockchain May Be Incompatible’ (2018) 21(11) Journal of Internet Law 1, 24; Noah Webster and Aaron Charfoos, ‘How the Distributed Public Ledger Affects Blockchain Litigation’ (2018) 37 Banking & Financial Services Policy Report 6, 13.
 Zheng et al, above n 85, 560.
 Victoria Louise Lemieux, ‘Trusting Records: Is Blockchain Technology the Answer?’ (2016) 26 Records Management Journal 110, 129.
 Ibid. See also Shayan Eskandari et al, ‘A First Look at the Usability of Bitcoin Key Management’ (Working Paper, 2018).
 McLean and Deane-Johns, above n 76, 100.
 See, eg, Adam Efe Gencer, On Scalability of Blockchain Technologies (PhD Thesis, Cornell University, 2017); Cypherium Releases Scalable, Enterprise-Ready Blockchain in Beta (4 September 2018) Globe Newswire <https://globenewswire.com/news-release/2018/09/04/1565143/0/en/Cypherium-Releases-Scalable-Enterprise-Ready-Blockchain-in-Beta.html>.
 See, eg, Saket Sinha, Symbiotic Collaboration: A Key to Succeed with Blockchain in Financial Services (25 September 2018) IBM <https://www.ibm.com/blogs/blockchain/2018/09/symbiotic-collaboration-a-key-to-succeed-with-blockchain-in-financial-services/>; Theresa E Miedema, ‘Engaging Consumers in Cyber Security’ (2018) 21(8) Journal of Internet Law 3.
 See Mitrou and Synodinou, above n 47, 139.
 James Eastman, ‘Avoiding Cyber-Pearl Harbor: Evaluating Government Efforts to Encourage Private Sector Critical Infrastructure Cybersecurity Improvements’ (2017) 18 Columbia Science & Technology Law Review 515, 531.
 Guidelines for Developing Codes (27 September 2013) Office of the Australian Information Commissioner <https://www.oaic.gov.au/agencies-and-organisations/advisory-guidelines/guidelines-for-developing-codes#the-privacy-act-and-codes>.
 Office of the Australian Information Commissioner, ‘Annual Report 2017–18’ (Report, 17 September 2018) 57.
 Dag Wiese Schartum, ‘Making Privacy by Design Operative’ (2016) 24 International Journal of Law and Information Technology 151, 162.
 David C Vladeck, ‘Consumer Protection in an Era of Big Data Analytics’ (2016) 42 Ohio Northern University Law Review 493, 500.