EXPERT REACTION: Coronavirus tracing app

Not peer-reviewed: This work has not been scrutinised by independent experts, or the story does not contain research data to review (for example, an opinion piece). If you are reporting on research that has yet to go through peer review (e.g. conference abstracts and preprints), be aware that the findings can change during the peer review process.

Opinion piece/editorial: This work is based on the opinions of the author(s)/institution.

Several politicians have raised concerns about the COVID tracker, the federal government app due to be released in the next couple of weeks to help track down people who may have been in contact with someone who has the coronavirus. Australian experts comment below.

Organisation/s: The University of Western Australia, The University of Adelaide, Swinburne University of Technology, Macquarie University, Cyber Security CRC (CSCRC)

Funder: n/a

Expert Reaction

These comments have been collated by the Science Media Centre to provide a variety of expert perspectives on this issue. Feel free to use these quotes in your stories. Views expressed are the personal opinions of the experts named. They do not represent the views of the SMC or any other organisation unless specifically stated.

Professor Rick Sarre is an Adjunct Professor with UniSA Justice and Society at the University of South Australia

The Coalition does not have a strong record of inspiring confidence in large-scale data collection and retrieval systems. One need only recall the lack of enthusiasm shown by healthcare provider organisations for the My Health Record system, which the national audit office found in 2019 had failed to adequately manage its cybersecurity risks.

So what should they do now? In 2018, the government sought to allay concerns about the metadata retention scheme, another large-scale endeavour to amass private data, by giving the Commonwealth Ombudsman a role in assessing agency compliance with the legislative mandate. In the case of the COVID-19 tracing app, the government has, appropriately, enlisted the support of the Privacy Commissioner, who will conduct a privacy impact assessment.

While that is an admirable gesture, one would hope that there will soon be legislation in place to set out all of the conditions of use regarding the app, such as who will have access to the data, how the data are used, and when (after the crisis is over) the app will no longer be functional. It would make sense for the Commissioner not to be asked for his view until that legislation is in place. Once that happens, I would be happy to sign up, but only after the Commissioner has given the ‘all clear’.

Last updated: 11 May 2020 5:52pm
Declared conflicts of interest:
None declared.
Carsten Rudolph is an Associate Professor of Cybersecurity at Monash University

The app would require Bluetooth to be activated, while the general recommendation from a security point of view is to only activate Bluetooth when it is really required. This would mean that phones would need to be set up to answer Bluetooth beacons, which would potentially also enable other types of tracking. In general, it is always recommended to disable all wireless functionality when it is not needed.

There are ways to build such an app with strong privacy by design in a way that would give people control over their data. However, the Singapore version of this did not follow any of these approaches and my guess is that the Australian version will also not try to do this. 

The processes around the app need to be clarified to understand the risk of potential misuse of the data.

Last updated: 11 May 2020 10:32am
Declared conflicts of interest:
None declared.
Associate Professor Paul Haskell-Dowland is Associate Dean of Computing and Security at Edith Cowan University

The app as we understand it could best be described as an equivalent to sonar. A signal is sent from the phone (using Bluetooth) with a specific code. This is claimed to be unique to the owner and randomly created. At the same time, the app listens for incoming codes from other app users. The app then simply records the codes that it has seen. At present, we are advised that the app will not record your physical location. In effect, this turns your phone into a radar that detects other app users in range of the Bluetooth signal. A record is kept of other users in range at specific times. The app is currently being 'vetted' by ASD and being considered by governments across the globe.
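To make the 'sonar' description concrete, here is a minimal sketch, in Python, of what such an encounter log could look like. It is purely illustrative: the function and field names (on_bluetooth_advertisement, Encounter, rssi) are invented, and the app's actual internals have not been published.

```python
import time
from dataclasses import dataclass


@dataclass
class Encounter:
    """One observation of another app user's broadcast code."""
    code: str         # the (claimed) random code broadcast by the other phone
    timestamp: float  # when the code was heard (Unix time)
    rssi: int         # received signal strength, a rough proxy for distance


# In-memory encounter log; the real app's storage format is not public.
encounter_log = []


def on_bluetooth_advertisement(code, rssi):
    """Called whenever the phone hears another app user's Bluetooth beacon.

    Note: no GPS coordinates are recorded -- only which code was heard,
    when, and how strongly, matching the public description of the app.
    """
    encounter_log.append(Encounter(code=code, timestamp=time.time(), rssi=rssi))


# Example: two nearby phones heard a few seconds apart.
on_bluetooth_advertisement("a3f9c2e1", rssi=-58)
on_bluetooth_advertisement("7b44d0aa", rssi=-72)
print(len(encounter_log), "encounters recorded")
```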

Issues include:

  1. Trust - the population will never fully trust the app or the government to manage the data responsibly.
  2. Coverage - partial population coverage will not provide sufficient contact tracing.  Figures of 40%+ are being proposed by some. NB there are still people without smartphones!
  3. Battery life - Bluetooth is known to have an impact on battery life and many users will leave Bluetooth turned off for this reason.  The app will not function without it.
  4. App security - the app will run on a mobile device. Will the data on the device be secure?  Can it be accessed by other apps?
  5. Data consolidation - data gathered by the app is not itself personally identifiable, but it could be combined with other data. For example, an authority with access to phone records (with location data) could correlate timestamps of mobile phone locations with the app data of other users, and then combine this with CCTV and other sources (a rough sketch of this kind of correlation follows this list).
  6. Scope creep - who will have access to the data, under what circumstances will it be accessible, for how long, will it be deleted (and can we trust that it has been?)
  7. Mandating use - to address (1), and to secure (2), will the government make its use mandatory?
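As a rough illustration of the data-consolidation risk in point 5, the sketch below joins a hypothetical app encounter log to a hypothetical set of telco location records on timestamps alone. All datasets, field names and values are invented; it shows only the general shape of the correlation an authority holding both datasets could attempt.

```python
from datetime import datetime, timedelta

# Hypothetical app encounter log: (anonymous code, time it was observed).
encounters = [
    ("a3f9c2e1", datetime(2020, 4, 20, 9, 15)),
    ("7b44d0aa", datetime(2020, 4, 20, 9, 16)),
]

# Hypothetical telco records: (phone number, time, approximate location).
phone_records = [
    ("04xx xxx 001", datetime(2020, 4, 20, 9, 15), "cell tower 42, CBD"),
    ("04xx xxx 002", datetime(2020, 4, 20, 9, 16), "cell tower 42, CBD"),
]


def correlate(encounters, phone_records, window=timedelta(minutes=2)):
    """Naively link anonymous encounter codes to phone numbers active at the
    same time -- the kind of join an authority holding both datasets could
    attempt, narrowed further by CCTV or other sources."""
    matches = []
    for code, seen_at in encounters:
        for number, located_at, place in phone_records:
            if abs(seen_at - located_at) <= window:
                matches.append((code, number, place))
    return matches


for code, number, place in correlate(encounters, phone_records):
    print(f"code {code} plausibly belongs to {number} near {place}")
```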

There are alternatives (including services by big providers like Apple/Google), but these also raise privacy concerns, especially with Google having already amassed vast data repositories on us all!

Last updated: 07 May 2020 10:40am
Declared conflicts of interest:
None declared.
David Vaile is stream lead for Data Protection and Surveillance at the Allens Hub for Technology, Law and Innovation at UNSW Australia's Law Faculty.

The central question for Australians at present is whether their fellow citizens will continue to trust and cooperate with federal and state governments in pursuit of what remains an ambiguous strategy (eradication is within grasp but, unlike NZ, we will not commit to it, risking 'mitigation' and expanded infections). Trust is only warranted if the other party is trustworthy. The federal government in particular (DHA) has a history of contempt for digital health privacy, with the My Health Record example still vivid: when Australians refused to consent in a trial, the model changed to impose a record of dubious usefulness and safety on everyone without asking or offering an opt-out; the latter only appeared, for a brief period, after popular and parliamentary revolt.

In this case, while there is now a promise not to make the app compulsory, the basis of this promise is unclear, as is the nature of the app, where data is stored, when identification may be possible, and how it might interfere with the operation of the device. All the questions in your list, and more, are legitimately still open. The Privacy Impact Assessment is in-house and not independent, the same approach that led to so many concerns and failures in the 2016 Census.

The source code has today been promised to be released for review, but perhaps not before the code is set, meaning flaws cannot be fixed; and the PIA will be released, but it was not done with consultation, meaning it will probably overlook key risks and flaws. These concessions need to go further if the technical, legal and general community are to have the capacity to understand the app, and to influence its design to avoid the issues of most concern.

Finally, it is critical the app is not oversold, otherwise there is a risk of false reliance on it, either as individuals or as a country, such that extra risks may be taken which the app will not offset. False-negative models indicate it will miss many potentially infectious interactions, and it may report significant numbers of false positives, especially for those exposed to the public. The example of Singapore, rather than demonstrating the app will work as hoped, should now be recognised as a failure, with a second wave of very serious proportions and a much harder lockdown, but with no sign these 'after the horse has bolted' actions can stop the new outbreak. Interpretation is difficult, but at the very least Singapore does not offer evidence that its model is a success, and we need much more clarity about the logic, and limits, of its proposed use in Australia.

Last updated: 29 Apr 2020 3:27pm
Declared conflicts of interest:
None declared.
Professor Richard Buckland is Professor in CyberCrime, Cyberwar and Cyberterror at the School of Computer Science and Engineering, UNSW

We know generally about the operation of the proposed tracking scheme and understand it will be similar to the Singaporean system currently in use; however, as yet we do not know the details. The details are extremely important in systems such as this, which operate to balance the useful collection of data with user privacy. Below is what we currently understand to be the case.

Phones with the app installed continuously emit a faint radio beacon ID signal (using Bluetooth, which is inherently low power and faint). The signals can only be picked up by nearby receivers, which would typically be other phones with the app installed that are also emitting their own ID beacon signals. If your phone can pick up the ID beacon of another phone then presumably you are close to each other, and so your phone logs the ID it received (along with the time and the power level of the received signal).

The government keeps a central database of the identities of everyone who downloaded the app and assigns them a private ID. Your app never broadcasts your private ID but uses it to generate a public ID for you, which it broadcasts as your faint beacon signal. Your public ID is periodically changed, perhaps daily, but your private ID never varies. The app of everyone who is near you logs the public ID you broadcast and keeps that stored on their phone. If one of the people your phone has logged is later diagnosed as having COVID, they give the authorities the list of all the public IDs they have logged, and the government uses secret cryptographic keys (which only they know) to convert those into the corresponding private IDs, and so learns which people were close contacts of the sick person. They then alert all those people.
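A minimal sketch of how such a scheme could work in principle is shown below. It is an illustration of the general design described above, not the actual COVIDSafe or TraceTogether code: the key handling is drastically simplified (a single server-held HMAC key), and all names are invented.

```python
import hashlib
import hmac

# Secret key held only by the central authority (simplified to a single key).
SERVER_KEY = b"server-only-secret"


def public_id(private_id, day):
    """Derive the rotating public ID that the beacon broadcasts.

    The private ID never changes and is never broadcast; the public ID
    changes periodically (daily here), so other users cannot link sightings
    across days -- but the key holder can.
    """
    msg = f"{private_id}:{day}".encode()
    return hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()[:16]


def reidentify(logged_public_ids, registered_private_ids, day):
    """What only the central authority can do: map logged public IDs back to
    the private IDs (and hence identities) it holds in its registry."""
    lookup = {public_id(pid, day): pid for pid in registered_private_ids}
    return [lookup[code] for code in logged_public_ids if code in lookup]


# Example: two registered users; user-002's beacon is logged by a contact,
# and the authority later recovers whose beacon it was.
registry = ["user-001", "user-002"]
logged_by_contact = [public_id("user-002", day=110)]
print(reidentify(logged_by_contact, registry, day=110))  # ['user-002']
```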

There are weaknesses with this approach which might allow privacy breaches, perhaps serious ones.  Here are three examples:

1. If you are diagnosed with COVID then the identity of everyone your phone has logged, however fleetingly, becomes available to the authorities with metadata about for how long, when, and signal strength.  This reveals data about others as much as it reveals data about yourself.  It might, for example, reveal your neighbour having an affair next-door (Bluetooth can pass through walls to some extent).  It might reveal a passer-by outside your house exercising more than once a day.

2. If the central database and keys are hacked, erroneously leaked, or deliberately shared with other government agencies or countries, then whoever holds them could determine the identity of every person from their app's beacon signal. So, for example, by putting a cheap beacon detector outside a brothel you would know the identity of every person who entered. Or you could program a drone to hover directly above and follow a particular person. Or you could couple a network of Bluetooth sensors with surveillance cameras throughout a city and know the identity of the people in the footage.

3. If the data were shared with police or other agencies, or if police or agencies had the power to demand app users hand over the logs they had collected on their phones, then this could potentially be used to force reporters to reveal the identities of their sources, detect whistle-blowers, identify everyone who participates in mass protests, unmask police informants, force people to self-incriminate and reveal breaches of (sometimes unclear and confusing) lockdown or self-isolation rules, identify those in hiding from abusive partners, detect politicians leaking to media or other parties, or indicate that sensitive commercial negotiations are underway and with whom, among other things. The uncertainty and concern ordinary citizens might have about such things may well encourage people to avoid using the app or taking their phone with them, and so undermine the potential benefits of comprehensive contact logging.

There is currently much discussion amongst health experts about the extent to which a technological solution such as an App can contribute to a real-world solution. It is important to understand the data about the costs and the benefits before embarking on an intrusive solution. To what extent do people contract the virus from being near anonymous others, versus contact with those they know, or from touching contaminated surfaces? The Singaporean data could shed light on this.

What privacy safeguards are most important if such an App was used?

  1. Explicit prohibition against subsequent scope creep: All the data should only be used for COVID contact tracing, and this should be explicitly legally clear and enforced, and not able to be undone later by ministerial regulation. It should not be given to police for crime investigations, or to planning departments, or to medical researchers, or used for other purposes proposed later. Australian privacy history has shown that once we have this pool of data, many agencies will want to get hold of it to use for things they regard as being important.
  2. Genuine opt-in: People should not be able to discriminate against others based on whether or not they have the App, and the app data should not be able to be demanded by police or used as evidence in a court of law. People should be able to opt out of the system at any time, and upon doing so all their data, including all centrally held records identifying them, should be securely and irrevocably destroyed.
  3. Time limit and secure final deletion: The COVID App and data should have an end date after which all data will be securely destroyed, with no copies retained. It should only be able to be used for this virus, not retained and extended to use against, say, influenza. There would be no problem with starting a new App for such a purpose, of course.


Societal acceptance: if the App were compulsory, I suspect many Australians would try to avoid it, as that's not the sort of government to which they'd want to be entrusting their private life and data. But if the App were voluntary, with the appropriate safeguards, then I believe we are a great nation of volunteers willing to help others in times of need.

Finally, it is worth thinking about the merits and risks of writing our own software rather than joining in and using a system already developed and tested by a wider group. Cybersecurity history has repeatedly shown there is usually merit in using an existing system (such as the Singaporean one, or one developed by tech giants) rather than 'rolling your own' new, untested system, unless you are doing something quite new and different. Software teething problems are inevitable and the stakes are currently high. It is worth noting that Google and Apple are currently collaborating on developing their own inbuilt phone tracing system, which removes the main weakness in the Singaporean system: the need for a secure central database. In practice, such a central database would be extremely attractive to criminals and other nations, and almost impossible to confidently secure from coding bugs and mistakes, or from insider or cyber attack.

Last updated: 13 May 2020 6:00pm
Declared conflicts of interest:
None declared.
Associate Professor Frank den Hartog is from the School of Engineering and Information Technology at UNSW Canberra, and is affiliated with the UNSW Canberra Cyber centre.

Many questions around this app cannot be answered by independent experts as long as the service specifications and the code are not published. Actually, the service specifications should have been published a long time ago, followed by an open expert consultation for feedback on the service specification and an assessment of the 60+ apps that already exist worldwide. Then choose a promising candidate, take it from there, publish the code, and ask for another open expert consultation.


Part of that consultation would be a privacy and security assessment, the outcome of which would be more broadly accepted by the public than an assessment by the government or parliament itself. Having said that, I think Australians should be prepared to accept some trade-off between privacy and public health here. However, that does not preclude reducing privacy risks to a bare minimum.

Also, everything is hackable. But publishing the code would allow experts to review it for vulnerabilities and make it more secure. How large the uptake needs to be to make it possible to lift restrictions mostly depends on how effectively the institutions react to the data. This is hard to predict. The app itself only generates data.

Last updated: 12 May 2020 9:38am
Declared conflicts of interest:
None declared.
Professor Katina Michael is from the Faculty of Engineering and Information Sciences at the University of Wollongong
  • How does the app work?

"The contact and trace app works using Bluetooth Low Energy (BLE) technology. When two users who have downloaded the App and have Bluetooth enabled come into close physical proximity (a few meters) for a given length of time, their smartphones communicate sending one another a random identifier that is encrypted on each handset and stored for 21 days. The app automates "contact tracing" as done manually by health organisations presently including: contact identification, contact listing, and follow-up. Once someone is confirmed of having COVID, then their handset is interrogated for the stored physical social network, and corresponding records are decrypted, and individuals on the list notified."

  • Will it be a privacy risk?

"The alleged system does not retain personally identifiable information (PII) but a privacy impact assessment is forthcoming mid week. Despite the system being pseudo-announced as 'mandatory', it is now clear that it is voluntary and citizens will have a choice whether to download the app. The usefulness of the app will be determined by the percent of the population that adopts the app, remembering that not everyone has a smartphone.

Definitely, after a confirmed case has been determined, someone's privacy is 'in practice' encroached upon, as is that of their corresponding physical social network, whether family, friends, colleagues, or strangers. But it has to be noted that this is done willingly, with the participants' consent. A user is entrusting the health organisation, or government agency, with their privacy, allowing access to their handset data, but that is deemed proportional given the pandemic crisis. What the government is trying to achieve, in essence, is a COVID-19 response app; but we need to be cognisant that privacy, security, trust and safety must be embedded in the functional design. Sadly, a consultation process has been lacking. It seems very top-down to date."

  • What happens to my data?

"Your data is stored on your handset, it allegedly never leaves the device unless you or someone in your physical social network is determined as having COVID-19. Every 21 days, data is deleted from the handset. But once voluntarily accessed, the data will need to be at least temporarily stored on a third party database, whether that is the health organisation or another government agency, it is not known. It should be noted, that the system will most likely utilise the bluetrace.io application which will at least handle the capability side of things, but not much is known about this beyond the company's whitepaper online."

  • What level of uptake is needed to make it possible to lift restrictions?

"I don't think a participation rate in the app, whether 40%, 60% or 80% can aid in lifting restrictions but it is certainly true that the greater the uptake, the more effective the app will be in meeting its primary objective. At best the app is "after the fact". It will help in reducing transmissions possibly, but certainly not in eradicating COVID. It will also allow for early intervention of suspected carriers of COVID, and preparing the medical supply chain and health infrastructure necessary in a given location to better respond to local outbreaks. But that is if it actually does work, of which evidence is lacking. It does make sense in principle but where are the successful use cases?

We have to be very careful not to confuse the issue as well. Apps don't reduce transmission; they help to identify individuals who have come into contact with people who are confirmed COVID cases. If we relax social distancing measures too quickly, with or without the App, we will most likely find ourselves going through a second, smaller peak of infection. And we are trying to avoid that.

The downside of an app like this may well be that people with COVID feel discriminated against. There has been quite a bit of discussion around the fact that some people have tested positive to COVID-19 but do not have symptoms. In addition, it has been made clear that multiple surveillance capabilities may well be used to enforce quarantine for those who do not adhere to self-isolation, using ankle bracelets or other trackers. Note, this is not a part of the original contact and trace aims. Digital immunity certificates have also been spoken about, and clearly this makes some people uncomfortable."

  • Is it hackable?

"The technologist in me says that any system is hackable. I do not believe that hackers would be interested in disclosing the names of COVID sufferers, or those in their physical social network. But certainly hackers will have a field day with Bluetooth-enabled handsets. When Bluetooth first was introduced into handsets, a lot of people had 'fun' with innocent hacks. Some of these began to turn nasty when an unsuspecting user had their contact list wiped, after pressing on a message pop-up, or a virus that made the handset dysfunctional for use. And that is no laughing matter. No one is talking about electromagnetic interference at the moment, and they seem to have forgotten what enabling Bluetooth can actually do. We know as average consumer users of this technology, that pairing devices, can sometimes end up in some tricky operational scenarios when devices are shared or otherwise."

  • How does the app's tracking ability differ from other apps such as Facebook and Google Maps?

"The contact and trace application that the Australian federal government is introducing is not a tracking app. it is a good question however to ponder on. We constantly are streaming personal information to both the Facebook and Google platforms. What does this mean for our personal privacy? Google and Apple in a new initiative are trying to offer an operating system level contact and tracing feature that will not require a user to download yet another app, which might well be out of date. By doing this, they will just be seeking consent to usage of the App by the end-user making it a simpler process. It seems a good time to talk about Public Privacy partnerships (PPP) as they seem to be garner force in the background but with little public engagement with citizenry.

Last updated: 20 May 2020 7:02pm
Declared conflicts of interest:
None declared.
Rachael Falk is CEO of the Cyber Security Cooperative Research Centre

The Cyber Security CRC has been playing an active role in reviewing the COVID-19 tracing app and has been working with the Australian Government's Digital Transformation Agency and others to do this.

We need more time and information before we can give this a final thumbs up, but the app already passes the early tests of security and privacy.

It is right that Australians take a conservative approach to protecting their online security and privacy.


We have already begun scrutinising the source code to ensure Australians who use it are kept safe.

The Cyber Security Co-operative Research Centre exists to link government, business and universities. Our funding comes partly from the Commonwealth, so it’s appropriate that our cybersecurity experts scrutinise this app.

We will advise the Government whether and how the app can be improved to guard the safety of Australians who use it.

Last updated: 05 May 2020 11:53am
Declared conflicts of interest:
The Cyber Security CRC has been playing an active role in reviewing the COVID-19 tracing app and has been working with the Australian Government's Digital Transformation Agency and others to do this.
Professor Dali Kaafar is the chief scientist of the Optus Macquarie Cyber Security Hub and Group leader of Data61's Information Security and Privacy Group

UPDATED COMMENT:

The following open letter has been signed by 300+ scientists, researchers and experts in Privacy Enhancing Technologies and Cyber Security, from over 25 countries… I, Professor Dali Kaafar, Executive Director of the Optus Macquarie University Cyber Security Hub, am one of the signatories. 

https://t.co/I8X6JcYLAI?amp=1 

The letter lays out four principles for an open, transparent, privacy-by-design approach to contact tracing.
The signatories welcome decentralised approaches as the best way to achieve privacy while enabling the full functionality of contact tracing mobile apps.

PREVIOUS COMMENT:

"In essence, while the technology on which the Australian government COVID-19 tracing app will be built is providing privacy from other users of the app, it does not provide any privacy from the central authority collecting and processing the data. The authority will certainly be able to collect the social graphs (and optionally visited locations) of not only individuals who have been diagnosed with COVID-19 (and who might have given at that time their consent ), but also the central server can know the private data of a user even if they are not infected.

Once a user has tested positive for COVID-19, they need to provide consent for the server to retrieve and decrypt their data log. This enables the server to obtain the identities of the app users who have been in contact with the infected user. These potentially uninfected users are no longer in control of their privacy. This raises some serious issues with regard to consent and the transparency of use of information stored on local devices and pushed to the server.

We made recommendations to the government in the form of simple and easy-to-fix techniques so that the app provides more privacy from the central authority, and in particular to enforce explicit, consent-based collection of data about individuals, as opposed to implicit consent (which is assumed as soon as users install the app).

We also believe that contact tracing apps should be developed without a centrally controlled database that holds private information on individuals.

There are a number of proposals for contact tracing methods which respect users' privacy, many of which are being actively investigated for deployment by different countries."
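For context, the sketch below shows the basic pattern many of these decentralised proposals follow (for example DP-3T and the Google/Apple exposure notification design): users who test positive publish their broadcast codes (in practice, the keys used to derive them), and each phone checks its own local log against the published list, so the server never learns who was near whom. It is a heavy simplification, not a description of any specific protocol.

```python
# Decentralised exposure check, heavily simplified.

# Codes this phone has heard from nearby phones (kept locally only).
local_log = {"c1a2b3", "d4e5f6", "778899"}

# Codes voluntarily published by users who later tested positive. In real
# designs these are derived from published daily keys rather than uploaded raw.
published_positive_codes = {"d4e5f6", "aabbcc"}


def check_exposure(local_log, published):
    """Each phone does the matching itself; the server only hosts the
    published list and never learns who was near whom."""
    return bool(local_log & published)


if check_exposure(local_log, published_positive_codes):
    print("Possible exposure - consider getting tested.")
else:
    print("No matches found.")
```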

Last updated: 23 Apr 2020 12:42pm
Declared conflicts of interest:
None declared.
Dr Belinda Barnet is a Senior Lecturer in Media at Swinburne University of Technology

If the app is implemented in Australia as it was in Singapore it does not require or request your GPS location. It simply logs which devices have been in close proximity to you. Importantly, TraceTogether stores this data on your own device and only “shares” the data with the authority’s server if you test positive and if you consent. So the data doesn’t get shared unless you test positive.  

My main concerns were around data privacy and also whether or not the app itself would be effective as it cannot replace manual contact tracing. I’m pleased that the government has agreed to make the app open source so that researchers can look at the code and at its security and privacy implications. I’m pleased that they will not make it mandatory. I hope that they simply use TraceTogether rather than making their own. 

I think it will be very difficult to get upwards of 40 per cent simply because people don’t trust the government with their data now. As a result of projects like My Health Record and the Census debacle, there is a general perception that this government uses our data without our consent. So it’ll be a hard sell.

Last updated: 21 May 2020 10:33am
Declared conflicts of interest:
None declared.
Dr Lewis Mitchell is a Professor of Data Science at the University of Adelaide
  • How does the app work?

"By saving on your phone a list of IDs corresponding to other phones you’ve come close to. This is measured by Bluetooth, not GPS. The contacts are stored as lists of random letters and numbers, not as names or phone numbers. If you are diagnosed and consent to share your data, you can share this list with state health authorities, who can then call those potential contacts and conduct a contact tracing interview with them as they do currently."

  • What happens to my data?

"Nothing whatsoever, unless you are diagnosed as having COVID-19, and consent to share your data, or are a close contact of someone who has been diagnosed. In that case, and in that case only, your name, age, phone number, and postcode, will be shared with state health authorities, who will call you for a contact tracing interview. Otherwise, your information stays on your phone, and is never shared with anyone."
 

  • What level of uptake is needed to make it possible to lift restrictions?

"Not clear, because this is a complex question for the government to answer, based on a wide range of health and economic advice. The government is aiming for 40 per cent uptake, mathematical modelling suggests that 60% or more is desirable. But any uptake more than zero has the potential to help."

  • Is it hackable?

"No more than your Facebook or Twitter app is hackable. And even if it were, all that could be gotten is a list of ID codes of phones you’d been in close contact with — some random numbers and letters — not names, phone numbers, or locations."

  • How does the app's tracking ability differ from other apps such as Facebook and Google Maps?

"It differs utterly, in that the app collects zero location information about you. All that is stored on your phone is a list of contacts made with other people who have installed the app.

Last updated: 20 Apr 2020 11:49am
Declared conflicts of interest:
None declared.

News for:

Australia
NSW
VIC
QLD
SA
WA
ACT
