New Bill responding to concerns with violent extremist content online, proposing measures which include provision for takedowns and Internet filtering: http://legislation.govt.nz/bill/government/2020/0268/latest/whole.html
Inspectors will be able to issue takedown notices, and online content hosts will face civil penalties for noncompliance
Current liability limits under HDCA s 24 will not apply to objectionable online content.
(This seems to turn on some specific legal interpretation)
Regulatory impact assessment: https://www.dia.govt.nz/diawebsite.nsf/Files/Proactive-releases/$file/regulatory-impact-assessment-countering-violent-extremism-online.pdf
Not yet seeing an NZBORA report on the MoJ website: https://www.justice.govt.nz/justice-sector-policy/constitutional-issues-and-human-rights/bill-of-rights-compliance-reports/
Here's the policy problem: "the extent to which the FVPCA can apply to digital content is incomplete"
(Though we know that online video including from a livestream can be objectionable under current law)
This is a key point from discussions with stakeholders. The harms people face online are part of broader policy challenges, including attitudes that drive the sharing of extreme content online, and behaviours that this content encourages both online and offline.
"Objectionable" is a key term. It includes but could go much broader than the violent extremist material identified as a policy problem.
The departmental disclosures confirm that advice has been sought on consistency with NZBORA, so we can expect a report at some point: http://disclosure.legislation.govt.nz/bill/government/2020/268
The provisions around filtering do require public consultation, and list a range of considerations (though some are only discretionary).
Among those discretionary considerations are:
- side-effects on non-targeted content
- network performance impacts
- likely compliance costs
There are immunities for taking down material which is subject to an interim assessment.
This reflects a trend for regulation to focus on removal of targeted content (as opposed to flagging, transparency reporting, de-prioritising in feeds), in this case with a formal finding.
Have not so far seen measures that would preserve access for research or law enforcement (for example to track or prosecute extremist groups posting material online).
This might be desirable to balance measures that will encourage or require removal of concerning material.