@drkierachapman, @andy_inch1 and I are really worried about the government’s White Paper on planning, and particularly its introduction of an algorithmic logic. Together we’ve written a full article for the TCPA (in the latest issue). This thread summarizes our concerns. 1/15
The government’s White Paper on planning is a move away from qualitative, document-based information to quantitative, machine-readable information. This is explicitly to allow both automation and the entry of ‘PropTech entrepreneurs’ into the planning process. 2/15
PropTech is short for ‘Property Technology’, a burgeoning area of technology-focused business that reaches right across the development and planning process. These startups are often supported by specialist venture capital firms. 3/15
There isn’t much research on PropTech in planning, but studies in the smart city literature suggest that the ‘datafication’ of services creates opportunities for the fragmentation and fundamental reorganization of service delivery. 4/15
What tends to happen is that a particular ‘flow’ of quantitative information becomes the target of a private sector company that has found a way to profit from the management or use of that small piece of the system. 5/15
Over time, the way in which a service is delivered is fundamentally altered, as urban processes are carved up into a series of data-driven pieces, each of which is enclosed by a private company in order to extract value. 6/15
This is often sold via an attractive rhetoric of optimization, value-neutrality, innovation, and efficiency. But behind it sits a neoliberal ambition to use data to allow global flows of capital to penetrate local land and development markets, and the planning system itself. 7/15
This raises a number of ethical and political concerns. Firstly, in tandem with non-computational design codes and zoning, it reduces the complexities of place, as local environments are forced to fit predetermined categories and programmed logics of decision-making. 8/15
Secondly, the use of algorithms could begin to divorce spatial decision-making from political accountability. It can be hard for individuals/communities to challenge the calculative logic at work in algorithms... so this type of spatial decision-making is less democratic. 9/15
Thirdly, spatial decisions are inherently controversial. Different people have different views. Algorithms have to assume a univocal logic based on just one set of priorities. These can easily start to encode structural forms of disadvantage, enacting ‘algorithmic violence’. 10/15
Fourthly, this approach will likely be used to target development not where it is most needed, but where it is most profitable, reproducing existing forms of social and spatial inequality rather than addressing the inequalities generated by capitalist urbanization. 11/15
Fifthly, this tech is likely to be damaging to local democracy. PropTech companies propose to run planning consultations, but we know that data-driven approaches tend to reinforce a passive model of citizenship, rather than opening up spaces of deliberation… 12/15
…plus the whole tenor of these reforms removes real power over decisions from local communities. Once an area is zoned for growth, local people will have very little say over any development that happens there. 13/15
Ironically, the uncritical embrace of PropTech is grounded in assumptions every bit as technocratic as those of the 1947 planning system, which Boris Johnson is so keen to present as a ‘relic’ in his introduction to the White Paper… 14/15
What gets stripped out is an awareness of the necessary politics and place-sensitivity of planning, and ultimately the danger of an undemocratic shaping of our towns, cities and countryside by those with power and money. 15/15
(If you want to read more about the dangers that algorithms pose from a spatial perspective, check out these fantastic books!)