Proctor : June 2018
Eyes in the sky, and all around

Computer vision threatens current notions of privacy

Traditional notions of privacy are being challenged by advances in technology, particularly computer vision. In recent years the technology has grown in popularity, with accuracy and performance improving to the point where computer vision surpasses human performance in some instances.1 Computer vision improves on facial recognition technology by combining feature, image and pattern recognition with position, orientation, motion detection and gaze tracking.

This new computer vision technology has been widely applied in government security, such as passport and border control, immigration and crime control. More interestingly, it has also been increasingly applied in the commercial sector. The technology is being developed by some of the world's largest technology companies before regulators have had an opportunity to consider the ramifications.2 The difficulty with these rapid advancements is that the extent of private sector development, and the ability to deploy the technology, are unknown and require a more thorough understanding to regulate effectively.

Privacy in Australia

Computer vision allows companies to identify and track users' movements and social activities. Australia is a signatory to the International Covenant on Civil and Political Rights (ICCPR), article 17 of which provides for a right to privacy. Otherwise, Australia has no constitutional right to privacy, and the common law protections of privacy are generally limited to what is encompassed by tort law.3 The Privacy Act 1988 (Cth) (Privacy Act) was introduced as Australia's response to its international obligations under the ICCPR. It regulates the use of personal information by Commonwealth Government agencies and certain private organisations, and enshrines the Australian Privacy Principles (APPs).
The Privacy Act applies only to private organisations with an annual turnover of more than $3 million; smaller businesses fall within a 'small business' exemption. This creates a concerning gap in the framework, as 94% of businesses do not meet the threshold.4

The predominant issue with the current framework is that, while it imposes a 'transparency' framework and a consent requirement for the collection of sensitive information, computer vision challenges these concepts: it changes the way data can be collected, allowing data to be gathered seamlessly without consent. Computer vision also challenges the notion of notification.

Commercial uses of computer vision

Computer vision is currently being used in a number of commercial applications, but the full extent of its application is not yet known. Companies developing facial recognition software cite four types of functions that can currently benefit from the technology:5
• marketing and customer service
• safety and security
• photograph identification and organisation
• secure access and authentication.

Computer vision can now be used to track a customer walking past a store by identifying them on camera, allowing a company to send a notification to that person's phone about potential sales in the shopping centre.6 From a business perspective, this capability is an invaluable marketing tool that can enhance customer engagement. Computer vision also has applications in billboard customisation, whereby the technology can identify a customer's gender, age and other specific attributes and use this information to create personalised advertisements in real time. A further use of the technology is within the home via smart TVs. With the use of 'gaze tracking', businesses can analyse precisely who is viewing advertisements, as well as their response.7
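For readers curious about the mechanics, the billboard customisation described above reduces to a simple rules layer sitting on top of a vision model's estimates. The short Python sketch below illustrates the idea under stated assumptions: the vision model itself is stubbed out, and the function names, ad inventory and attribute values are all hypothetical.

```python
# Sketch of attribute-based ad selection. `estimate_attributes` is a
# hypothetical stand-in for a computer-vision model's output on a camera
# frame; the ad inventory below is likewise illustrative only.

def estimate_attributes(frame):
    """Stand-in for a vision model estimating a viewer's attributes."""
    return {"age": 34, "gender": "female", "gaze_on_screen": True}

ADS = [
    {"name": "sports car", "min_age": 30, "max_age": 55},
    {"name": "gaming console", "min_age": 13, "max_age": 29},
    {"name": "retirement fund", "min_age": 56, "max_age": 120},
]

def select_ad(attributes):
    """Pick the first ad whose age band matches the estimated viewer age."""
    if not attributes.get("gaze_on_screen"):
        return None  # no one is looking at the screen; show nothing targeted
    for ad in ADS:
        if ad["min_age"] <= attributes["age"] <= ad["max_age"]:
            return ad["name"]
    return None

print(select_ad(estimate_attributes(None)))  # age 34 falls in the 30-55 band
```

The privacy point is visible in the code: the targeting decision needs nothing more than a momentary estimate of age, gender and gaze, collected without any interaction from, or notice to, the person walking past.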
Going beyond the APPs

It is likely that additional privacy principles are required to safeguard against emerging technologies.8 The three most prominent and necessary additional principles are transparency, dynamic consent and privacy by design.9

1. Transparency

There are two important ways that companies can facilitate transparency: developing and publishing privacy policies, and providing notice that computer vision is being used.10 The principle of transparency also requires that any information addressed to the public be easy to understand and easily accessible.11 Further, companies should be candid about their business models and what sort of results data mining is expected to produce.12 Most social media businesses trade personal information for an array of free services. While sophisticated internet users may understand that online search is funded through the monetisation of their personal information, most online services are opaque in their privacy policies and disclosures to customers. This is an example of a lack of transparency.

2. Dynamic consent

The large quantity of personal information collected by vision sensors challenges existing concepts of consent and notification in surveillance and privacy. It is difficult to gain consent when individuals are identified or profiled against other datasets.13 To achieve dynamic consent, businesses should try to disclose all conceivable future uses of the data and, when new ways to use the data arise, should seek fresh permission for those uses.14 The challenges raised by this proposition mean artful user interface design is required to come close to achieving dynamic consent.15

3. Privacy by design

Privacy by design means building reasonable privacy and security controls into all stages of product development. It includes promoting consumer privacy and data security throughout one's organisation.
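The dynamic consent principle described above can be pictured as a small data structure: permission is recorded per purpose, and any new use of the data fails the check until fresh consent is granted. A minimal Python sketch follows; the class and purpose names are hypothetical, and a real system would also need auditing, expiry and secure storage.

```python
# Sketch of 'dynamic consent': each purpose for which data will be used
# requires its own recorded grant of permission, and a new purpose is
# refused until the user consents to it. All names are illustrative.

class ConsentRegistry:
    def __init__(self):
        self._grants = {}  # user_id -> set of purposes consented to

    def grant(self, user_id, purpose):
        """Record the user's consent to a specific purpose."""
        self._grants.setdefault(user_id, set()).add(purpose)

    def revoke(self, user_id, purpose):
        """Withdraw consent to a purpose at any time."""
        self._grants.get(user_id, set()).discard(purpose)

    def may_use(self, user_id, purpose):
        """Data may only be used for a purpose the user has consented to."""
        return purpose in self._grants.get(user_id, set())

registry = ConsentRegistry()
registry.grant("u1", "in-store analytics")
print(registry.may_use("u1", "in-store analytics"))    # True
print(registry.may_use("u1", "targeted advertising"))  # False: a new use needs fresh consent
```

The design choice worth noting is that consent is scoped to purposes rather than given once for all uses, which is precisely the shift dynamic consent demands of the Privacy Act's current one-off collection notices.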
Conclusion

The tension between technological advancement and the right to privacy is a constant battle in law. It is evident that, while Australia maintains a conservative privacy framework, computer vision and other emerging technologies have the ability to undermine the protections it attempts to afford.