New Bill responding to concerns about violent extremist content online, proposing measures that include takedown powers and Internet filtering http://legislation.govt.nz/bill/government/2020/0268/latest/whole.html
Criminal offence to livestream objectionable content
Censor will be able to make interim decisions
Inspectors will be able to issue takedown notices, and online content hosts will face civil penalties for noncompliance
Current liability limits under HDCA s 24 will not apply to objectionable online content.

(This seems to turn on some specific legal interpretation)
Provisions to empower filtering of objectionable content online
The Regulatory Impact Assessment (RIA) concludes that the Bill "partially meets" the criteria
Here's the policy problem: "the extent to which the FVPCA can apply to digital content is incomplete"

(Though we know that online video including from a livestream can be objectionable under current law)
This is a key point from discussions with stakeholders. The harms people face online are part of broader policy challenges, including the attitudes that drive sharing of extreme content online, and the behaviours that this content encourages both online and offline.
"Objectionable" is a key term. It includes but could go much broader than the violent extremist material identified as a policy problem.
The departmental disclosure statement confirms that advice has been sought on consistency with NZBORA, so a report can be expected at some point http://disclosure.legislation.govt.nz/bill/government/2020/268
The provisions around filtering do require public consultation, and list a range of considerations (though some are only discretionary considerations).
Among those discretionary considerations are:
- side-effects on non-targeted content
- network performance impacts
- likely compliance costs
There are immunities for taking down material that is subject to an interim assessment.

This reflects a trend for regulation to focus on removal of targeted content (as opposed to flagging, transparency reporting, de-prioritising in feeds), in this case with a formal finding.
I have not so far seen measures that would preserve access for research or law enforcement (for example, to track or prosecute extremist groups posting material online).

This might be desirable to balance measures that will encourage or require removal of concerning material.
There is provision for reporting on takedown notices, but only annually as part of the DIA annual report.

This may not be frequent enough for effective monitoring of how the framework is being used, or of how effective it is.
You can follow @nullary.