"DeepNude" shows need for urgency


3 July 2019

Comment to the New Zealand Herald.

Brainbox research fellow Curtis Barnes, co-author of the recent report Deepfakes and the Synthetic Media of Tomorrow, provides comment to the Herald on the issues and impacts of the recent DeepNude saga.

Key points:

  1. DeepNude demonstrates the immediacy of some synthetic media harms. We should expect worse in the future.

  2. Women remain the primary targets for synthetic media fakes.

  3. Governments need to consider a range of options for protecting people, including coordination of responsibility, education, investment in detection, and enhancing the capacity to enforce existing law.

  4. Copies of DeepNude remain in circulation.

"Don’t believe everything you see, or hear"

Photo: Getty Images

Curtis Barnes
22 May 2019, Newsroom.co.nz

When it comes to the effect deepfakes and synthetic media may have on democracy, we should be bracing for impact. In this article for Newsroom, Brainbox scholars provide insight into the present and future of democracy in the digital age.

Key points:

  1. In all likelihood, it is now a matter of when, not if, the internet becomes saturated with synthesised audiovisual material. Not all of this will be explicitly or deliberately misleading.

  2. Technological solutions may be inadequate to prevent or mitigate harmful effects. There is also a global shortage of digital forensic experts.

  3. Examples of political confusion and misinformation are already occurring.

  4. There are limits to how far law can intervene or provide solutions. Ultimately, changes to the way consumers and citizens consume and share information on the internet will be critical to a healthy democratic future.

Submission to Inquiry on external political interference


The Justice Committee has closed its call for submissions on its Inquiry into the 2017 General Election and 2016 Local Elections. The Inquiry will investigate:

“… on the specific issue of how New Zealand can protect its democracy from inappropriate foreign interference …”

Mr Curtis Barnes and Mr Tom Barraclough, research fellows, have made a submission speaking to the Committee’s second specific concern:

“the risk that political campaigns based through social media can be made to appear as though they are domestic but are in fact created or driven by external entities”

Mr Barnes and Mr Barraclough are lead researchers on the Perception Inception Project, which investigates emerging synthetic audiovisual media technologies. Part of this has included developing an understanding of the deceptive potential of these technologies, and their capacity for use by external entities to influence domestic politics.

The researchers are also authors of a chapter on the growing impact of synthetic media technologies on international security, which is being considered for publication in an edited collection on emerging technologies and international security.

In the course of the Project, the researchers have spoken with industry leaders responsible for both creating and detecting synthetic media, including discussions with researchers involved in DARPA’s “Media Forensics” program.

The submission emphasises the need for awareness of rapidly emerging methods of external interference that will be difficult to detect.

The full submission can be viewed here.

New technologies a threat to civil liberties, but not how you imagine

Singapore lawmakers are pursuing anti-fake news legislation that will grant authorities powers of reply, and impose fines of up to US$750,000 or 10 years’ imprisonment on anyone who publishes “fake” information with “malicious intent”.

Curtis Barnes
5 April 2019, Dominion Post, Stuff.co.nz Politics

Curtis Barnes provides comment for the Dominion Post on how regulatory zeal towards new technologies is being used to pursue repressive legislation that otherwise could not succeed on its own.

Key points:

  1. Concepts like fake news and harmful information must be seen in the context of a much older phenomenon of humans sharing belief and opinion, and the old debate over the reasonable limitations for this in a free society.

  2. New laws responding to this phenomenon emphasise its novelty and unprecedented risks as a justification for repressive regulations. These regulations are invariably enacted for the protection of citizens, and intentionally or incidentally, grant dangerous powers to authorities.

  3. The dialogue over harmful expression through use of the internet and digital technologies is necessary. But it must be placed in context, and it must proceed from a position that understands the extreme risk of new technologies being used as vessels for regulatory overreach.

Manipulated Acosta video is old tech but a new wave


Curtis Barnes, Tom Barraclough
9 Nov 2018, Stuff.co.nz

The Acosta incident compels us to think: how can we trust audio-visual information? Curtis Barnes and Tom Barraclough provide comment for Stuff.

Key points:

  1. Traditional video editing has clearly been used to manipulate the Acosta video for political disinformation purposes.

  2. New audiovisual technologies will make it easier and cheaper to create better “fakes” than ever before.

  3. Policymakers need to consider how to prepare people in the midst of a general crisis over information integrity.