Digital Cultural Public Infrastructure

The Foundation for Public Code works at the convergence of policy and software, at a moment when, through the digital transformation of their organizations, most public administrations are beginning to implement large, infrastructure-scale aspects of their function as network-service-supported processes. This shift has huge implications for how society is shaped by technology and – more importantly – how technology is shaped by society.

When humans invent things, for the first 30 years they call those things ‘technology’. Then, when those things are successful, they begin to call them ‘infrastructure’. But the transformation of technology into infrastructure is a cultural moment in which responsibilities shift: different actors become involved with the tools and processes that are necessary to maintain this new infrastructure. The growing public interest in these solidifying pieces of ‘what we assume society provides’ makes it necessary to tend to issues such as inclusiveness, transparency, accountability, and openness.

Now that, during the pandemic, we have gone to holding parliamentary meetings in Zoom (a ‘private’ stream of grids of faces) and passing laws in Microsoft Teams (pretty much a provisioning disaster), the idea of ‘software as public infrastructure’ is no longer really in question. People recognize that this is a problem, because agency over the processes of public administration is no longer held by the public administration itself – it is held in ‘the cloud’ of offshore servers with unclear data policies.

The idea that governance involves software is relatively new, but it is finally taking hold in cities and states around the globe. In Europe, part of what government does is ensure a space for the unfolding of culture; this opens a new trajectory for the production of digital public infrastructure: what does the cultural digital public look like? If you visit almost any city in Northern Europe and go to its inner city, you will see a building that cost in the region of 65 million euros and represents a shard of starchitecture (be it an opera house, a library, or something of the sort): this typically embodies the way in which a modern European city signals being a culturally evolved entity. Beyond the beauty of these structures, it is questionable whether such a display is as culturally valuable to a city as, for instance, something on the scale of Flickr [a large-scale open media storage-and-curation web service] – which may have cost around the same amount of money to develop, but had a very different function. Flickr, like many other early Web 2.0 services such as Delicious and the first iteration of YouTube, was built as a place to safeguard public media history, but these services ultimately had to evolve to meet the expectations of shareholders and acquirers.

Collection of some of the 1715 projects submitted to the Helsinki Guggenheim competition, stage 1. Screengrab from the Internet Archive Wayback Machine, archived 1st April 2016.

As private corporations, they were ultimately constructed to extract value; but what happens if we can scale cultural software development to achieve similarly massive services without the profit motive? What is the equivalent in societal value of the starchitectural library or opera house as a networked digital platform?

There are important flows of resource allocation, support and funding within governmental activities that go into categories like the Arts and Architecture, and you will find software projects lodged within those disciplines as ‘cultural production’. The problem is this: at the moment, public production of culture does not think in terms of ‘frameworks’, ‘platforms’ and ‘sustainable code bases’; instead, almost all cultural production is done in the form of ‘instantiation’. The library and the opera house are the containers for those instances, but buildings belong to a paradigm of hyper-localization, and in some ways their inherent restriction of access can limit transparency and accessibility – they are simply 20th century products, and as such designed around the notion of local accumulation rather than dispersal across a cultural landscape.

Networked services are the opposite of those pieces of architecture: they are ‘outward-bound’, fundamentally focused on delivering services to people. They are more like another set of 20th century institutions – the corporations for public broadcasting, or the institutions that uphold the public press – bodies dealing with the mediation process. But their modality was unidirectional broadcast, which allowed cultural institutions to bring mediation to the masses, though often in close association with the state, thus reflecting the perspective of the broadcast world. This centralizing network topology was instrumental to the birth of modern-day institutions, when nation-states were knit together with a single story. Now we are in a new phase, one about frameworks and platforms: the possibility to hold safe spaces for a multiplicity of expressions, the necessity to understand the provenance of the information you are given, and the ability to adaptively contextualize and culturally digest the challenges that the rapid changes of the 21st century will likely bring.

Especially in Europe we have seen a very advanced discourse framing these concerns as regulatory expression – a set of constraints that provide some assurances, but are often not technologically literate enough to address the core issues of a problem space. An alternative path might be to constructively support an ecosystem of creative tools that have regulatory constraints built into them – similar to the design of a game in which the players have rule-spaces that they play inside of. And if one can provide those tools proactively – as a set of media applications and platforms that allow people to foster creativity and curatorship while enabling cultural remix – then you provide tooling that gives people a safe place to express themselves, as well as a discursive environment that does not make them feel threatened and can provide provenance of trust for their creative and informational outputs.

There are any number of tools that might be useful in this way: some come from exceptionally mundane and functional tooling such as library or performance-venue management systems. Yet there are also some really interesting new forms that I find more compelling. Here in the Netherlands it’s called a broedplaats, but almost every European state has a similar mechanism for funding collaborative cultural sites. These sites are inhabited by a set of cultural practices for a number of years, via subsidies or grants – for example, in Amsterdam we have De Ceuvel, and before that we had the buildings of Trouw and Volkskrant. These are collaborative efforts between the municipality’s cultural department and the local cultural productive class: they come together to build five-to-ten-year experiences, usually involving a music venue, a restaurant and a cafe, to address the condition of a site lying fallow and transform it into an attractive center of cultural production. Despite the very different circumstances in each instance, the processes that organize and manage resources to execute a site transformation unfold in similar ways, and that is something that could be captured in a piece of software for the public good: a directed social network able to create these collaborative cultural sites. By spinning up an instance of this software, hopefully on a public compute backplane similar to AWS [Amazon Web Services], anybody could decide to form a cooperative around cultural practice collaboration spaces. The idea that software can capture a process and allow people to replicate it more easily is one of the core convergences of policy and software. In a lot of cases, our Foundation for Public Code helps public administration processes get digitized and made replicable for other public administrations (like transport network adaptivity, or participatory urban planning processes). In the cultural sector, however, these processes may not necessarily be state-sponsored activities, but the tooling can be produced as a public resource, which other cultural groups can use to add cultural value without receiving subsidies.
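
To make the idea of ‘software capturing a process’ a little more concrete, here is a minimal sketch of how the shared lifecycle of such a site could be modelled so that any group can instantiate it for a new location. The class names, phases and example data are hypothetical illustrations, not drawn from any existing Foundation for Public Code codebase.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: the shared 'broedplaats' lifecycle expressed once,
# so any cooperative can spin up its own instance of the process.

@dataclass
class Partner:
    name: str
    role: str  # e.g. 'municipality', 'cultural producer', 'venue operator'

@dataclass
class Phase:
    title: str
    duration_months: int
    completed: bool = False

@dataclass
class CulturalSite:
    name: str
    city: str
    partners: List[Partner] = field(default_factory=list)
    phases: List[Phase] = field(default_factory=list)

# The recurring process that sites like De Ceuvel went through, captured as a template.
TEMPLATE_PHASES = [
    Phase("Identify fallow site and negotiate temporary use", 6),
    Phase("Form cooperative of cultural producers and municipality", 3),
    Phase("Fit out venue, restaurant and cafe", 12),
    Phase("Operate as a cultural production centre", 60),
]

def start_site(name: str, city: str, partners: List[Partner]) -> CulturalSite:
    """Instantiate the shared process for a new location."""
    phases = [Phase(p.title, p.duration_months) for p in TEMPLATE_PHASES]
    return CulturalSite(name=name, city=city, partners=partners, phases=phases)

site = start_site("Nieuwe Broedplaats", "Amsterdam",
                  [Partner("Gemeente Amsterdam", "municipality"),
                   Partner("Local makers' cooperative", "cultural producer")])
print(site.phases[0].title)
```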

Particularly relevant to the discussions taking place at the Media Architecture Biennale are reactive environments – actual programmable spaces in a digital sense – as platforms that can be publicly produced. This is probably one of the most interesting public cultural ‘spatial computing’ projects that could be made possible: is there a way we can create a containerized service? Maybe a Linux distribution with media production and interaction applications on it, plus a set of drivers for the commodity hardware that a space can be outfitted with?
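
As a thought experiment – not a description of any existing distribution – a publicly maintained codebase for such reactive spaces might define a common driver interface, so that the same media applications can run on whatever commodity hardware a given space is outfitted with. A minimal Python sketch of that abstraction, with all class names invented for illustration:

```python
from abc import ABC, abstractmethod

# Hypothetical 'drivers to commodity hardware': applications talk to a shared
# interface, and each space contributes drivers for the devices it actually has.

class OutputDriver(ABC):
    @abstractmethod
    def render(self, frame: bytes) -> None:
        """Push one frame of media content to the physical space."""

class ProjectorDriver(OutputDriver):
    def render(self, frame: bytes) -> None:
        print(f"projecting {len(frame)} bytes onto the wall")

class LedWallDriver(OutputDriver):
    def render(self, frame: bytes) -> None:
        print(f"driving LED wall with {len(frame)} bytes")

class ReactiveSpace:
    """The programmable space: applications address this, never the hardware directly."""
    def __init__(self, drivers: list[OutputDriver]):
        self.drivers = drivers

    def show(self, frame: bytes) -> None:
        for driver in self.drivers:
            driver.render(frame)

# Any space, whatever its hardware, can run the same publicly produced applications.
space = ReactiveSpace([ProjectorDriver(), LedWallDriver()])
space.show(b"\x00" * 1024)
```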

And indeed, as with most software as public infrastructure, these codebases and communities have to become living projects. Software used to be considered a thing that you would buy in a shrink-wrapped box, install and use. That is no longer the case – software is a process, and it evolves at a very fast pace: any Google or Facebook product, or any of the major cloud services that people use on a regular basis, probably changes every time you log into it. This mindset of continuous integration has not yet hit the public sector in the way it relates to technology, but it will be one of the most transformative aspects of the near future, because the process of democracy transforms the public sector through policy adjustment, and that needs to be reflected in the implementation of the software. Right now, because of this cumbersome process in which software is released years apart, public organizations end up with a crucial lag between the generation of policy and its implementation – a period in which the two actually diverge in an undemocratic way.
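
One way to picture keeping policy and implementation in step: the parameters that democratic processes adjust live in a versioned, human-readable file that the software reads, so a policy amendment ships as an ordinary, continuously integrated change rather than waiting years for a release. The file name and parameters below are invented for illustration, not taken from any real public codebase.

```python
import json
from pathlib import Path

# Hypothetical sketch: council-approved policy parameters are kept in a
# versioned file; the service applies whatever is currently deployed.

POLICY_FILE = Path("policy/parking_permits.json")
DEFAULTS = {"max_permits_per_household": 1, "low_emission_discount_pct": 25}

def load_policy() -> dict:
    """Read the currently approved parameters, falling back to defaults."""
    if POLICY_FILE.exists():
        return {**DEFAULTS, **json.loads(POLICY_FILE.read_text())}
    return dict(DEFAULTS)

def permit_fee(base_fee: float, low_emission: bool, policy: dict) -> float:
    """Apply the currently deployed policy to a single application."""
    if low_emission:
        return base_fee * (1 - policy["low_emission_discount_pct"] / 100)
    return base_fee

policy = load_policy()
print(permit_fee(120.0, low_emission=True, policy=policy))
```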

A piece of software has a sort of continuity, a gradual metabolic growth: I think this will become evident when you start to see it reflected in something as continuously transforming as ‘digital spatial computing’ (like reactive spaces and performative spaces), with the very high refresh rate of the technical implementation they represent. It is very hard to stay on that treadmill, though, and this is often why public administrations have trouble thinking about media architecture: the obsolescence cycle is so fast.

So the idea is to have something like the web server, or Linux – huge pieces of open-source software supported by an enormous, continuous barn-raising activity among private, public, and individual actors. This is the process that needs to happen to media architecture, transforming it into a dynamic discipline with a fast-moving toolset, so that it can continue to represent the avant-garde of engagement with technology, mediation and narrative immersion that private media production companies are currently wrangling with.

Collection of pictures uploaded with the tag architecture on Flickr.com. Screengrab from the Internet Archive Wayback Machine, archived 31st March 2010.

Let’s not mix up expectations; a bus is not a Ferrari. The public implementation of these immersive mediation environments will no doubt lack the massive investments of commercial actors like Disney and Facebook. While the media giants turn you into branded superheroes, the public sector versions will be more about bringing everyone, for instance, into an immersive data visualization of the potential different futures of their city, so they can understand how to participate successfully in certain choices; or a visualization of an archive of all the photos ever taken of their neighborhood, so that they can start to see what is relevant to life in their communities.

In order for citizens to participate and make informed decisions, all of this is going to require a literacy that the average person does not yet possess. For this, we have to take advantage of new methods of communication, so that people perceive this new need, and it is the duty of the public sector to evolve the tools that educate citizens about the complexity of the systems they are living in, including their cities, their environments and their own metabolism.

Understanding the various dynamics of infectious diseases [including transmission and treatment], for instance, is paramount to being able to participate successfully in a collaborative society, and if we want this to be the type of culture we continue to live in, then we need to actually develop the tools and platforms that allow it. Media architecture and open public digital products can provide tooling for exploring highly interrelated complex systems – but imagine if you could do that not only at the library or the university, but in any place running a public platform with a unified set of protocols, file types and drivers-to-hardware, and a publicly maintained, editable code base that powers all of the simulation tools, the offering tools, the projection-mapping tools and the interaction tools. Those types of applications are eventually going to become some of the most fundamental tooling in society.

Right now, thanks to software’s rapid evolution in commercial contexts, we have roles like ‘product designer’ and ‘user experience designer’. These roles are all about getting people through an interaction process that involves a transaction – be it money, data, or attention. One of the questions yet to be fully answered and explored is: what do these roles in computational creativity look like if you are not in a market-driven economy? The question arises because transactional goals are not the goals of computation in the public sector. The really interesting idea about cultural production in the digital domain goes beyond individual instances (like an immersive visualization): you need to make a set of tools that allow people to be creative with digital experiences in ways that are not about creating products.

At the moment urban systems can be represented in a lot of different quantitative ways to support engineering and urban planning activities, but there are not really interpretive tools for any of this yet. Part of the reason why public digital tools rarely challenge the ‘experience attractiveness’ of private tools right now is that they have not fully embraced public cultural expression, one of the most powerful forces in society. All these cultural resources, music and art, have not really reached a moment where they are available for remix culture, in which everything that has been produced in our cultural history can be drawn upon and, through interpretive tooling, produce a new fecundity that speaks to where our society is headed. And this mission may eventually be thought of as being as grand as that same gesture of giant architecture on the inner harbors of cities, because it empowers and expands the horizon beyond the cusp of what digital production currently allows in the market.

Digital tools also need to have representation in the open, transparent, inclusive, accountable stack that is public infrastructure: I think we are headed into a time that will see an explosion of creativity, and I would like this creativity to create a feedback loop, allowing the growth of platforms and tools that empower cultural expression, globally.

Volume 59 – Futures Implied is the result of a collaboration between Volume and the Media Architecture Biennale. This year’s edition, MAB20, will take place from June 24 to July 2 as an online event. For more information about the program, see www.mab20.org
