ADS4: .doc

Tutors: Nicola Koller, Tom Greenall & Matteo Mastrandrea

'At the centre of the world there is a fiction; a fictional piece of land a metre wide by a metre long. It has not been thrown up from the depths; not from the violence of lava bursting up and cooling, though there is a violence in its history. It is called Null Island, and you cannot travel there.'  – Jon K Shaw & Theo Reeves-Evison

Null Island is the busiest place on Earth, but it is impossible to visit.

It is the most photographed place, but impossible to find.

This non-place is only visible to machines. Yet it is the product of an interaction between the planet, our species and technology. A place, measuring one metre long by one metre wide, where a natural order, a constructed order, and – more recently – a digital order coalesce.

Null Island is located where the equator meets the meridian. The equator: the middle of the planet, the line surrounding the earth halfway between its poles; a line established by probes and sensors, by investigation of a scientific kind. A line more found than made. The meridian: a line inscribed on the globe, tethering that globe to the capital of a faded empire whose persistence is still felt. A line that hides a history of cultural violence, that delineates time, that privileges the economies of the west over the east. A line not given but created.

The point where the lines intersect, where nature meets culture, 0° North, 0° East, perplexes the machines. Computers need a piece of land on which to ground their calculations. So we feed them a fiction, throw a nonexistent island out into the ocean. In return they run the numbers for our GPS, guiding us home safely at night, tagging our photos and mapping our memories, aligning our satellites and connecting us across the globe.

For machines, Null Island is a necessary fiction. For us, it is an unnecessary fact – one that embodies the often paradoxical, sometimes dangerous, consequences of digital classification.  

Our human desire to describe a given condition – a point on the equator – led to the biased, constructed order of 0° North, 0° East. This order baffles machines, necessitating the creation of a digital fiction: Null Island. Although this place may not exist physically, the ramifications of its existence are tangible and material – it affects our behaviour and changes the ways we interact. Neither real nor fake, but unreal, Null Island makes manifest a simple truth: the ways we and our machines classify the world produce unintentional, often problematic, new truths, objects and worlds. These will form the starting points for ADS4's research this year into the implications of classification for society and the built environment.

Classification and its Consequences

To classify is human. We spend large parts of our days creating taxonomies, making up a range of ad hoc classifications as we go. We constantly classify objects, animals, places, illnesses, occupations and ideas. We create separations and orderings based on certain ways of categorising the world in its material and social dimensions – ways that are culturally inherited and formalised in manuals, checklists, statistics and bureaucratic procedures. From the simplest forms of personal organisation, such as how files are arranged on a computer, to more complex social and cultural forms, such as how we identify in terms of gender, sexuality, race, ethnicity and nationality, we are immersed in systems of classification.

Yet, for any individual, group or situation, classifications and standards can confer advantage or cause suffering. Jobs are made and lost; some regions benefit at the expense of others. Behind every classification system, however trivial or neutral it may seem, there is a certain intentionality, the consequences of which affect social relationships and, ultimately, the identity of individuals.

The difference between the myriad modes of classification that affect our lives is not that some are biased, while others are not. There is always bias. The only difference is that some systems of classification are more visible than others.

Yet this visibility often only occurs when the injustice of any given system becomes socially untenable. For example, the ‘diagnosis’ of homosexuality as an illness caused untold suffering for multiple generations, but was only widely acknowledged, and subsequently ‘de-medicalised’, in the wake of the LGBT social movements of the 1960s.

Even when their biases become visible, many classification systems remain vehemently, almost innately, conservative, taking a long time to alter their taxonomies to reflect the values of society at large. For example, libraries in more than 138 countries organise their resources according to the Dewey Decimal Classification, a proprietary system that is the most widely used in the world. Given its nineteenth-century origins, many have noted its longstanding and problematic cultural biases, including its Christian religiocentrism, racism, sexism and homophobia. Over the lifetime of the system, homosexuality has appeared under various categories, including 132: Mental Derangements; 159.9: Abnormal Psychology (specifically 159.9734746: Sexual Inversion/Homosexuality); and even 616.8: Neurological Disorders. It took until 1996 for homosexuality to move to 306.7: Sexual Relations.

Ensuring that insidious biases within systems of classification are minimised – and that the taxonomies underpinning everyday life reflect progressive social mores – is at the core of the ethical project of our work this year.

Andreas Gursky, Amazon, 2016

‘Search, don't Sort’

Humankind has conceived and honed classifications for two main reasons: as a way to find or make some order in the world; and, more simply, as a means to satisfy the basic human need to put things in certain places, so we know where they are when we need them.

Yet, however imbricated these organisational structures or taxonomies become in our lives, they are often invisible. We don't see or fully understand them, and this prevents us from asking a series of important questions: what are these categories? Who makes them? Who may change them? How do they spread? And what is the relationship between locally generated categories, tailored to the particular space of the computer desktop, and the macro, commodified, elaborate and expensive categories generated by medical diagnoses, government regulatory bodies and technology firms?

The invisibility of classification has been exacerbated by the rise of contemporary technology. On 1 April 2004, Google rolled out a free, advertising-supported email service: Gmail. When Gmail started, Google was still a one-product company, known primarily for its efficient search algorithm. It is no surprise that Gmail was released touting full-text searchability as its main asset. Its original tagline: 'Search, don't sort'.

We used to think sorting saved time. It once did, but it doesn't anymore. Google's logic suggests that an automated full-text search for words or numbers across a whole corpus of sources, left raw and unsorted, is a more powerful retrieval tool than the traditional manual process of first sorting items by topic, then looking for them in their respective folders. Today taxonomies, at least in their more practical, utilitarian mode – as tools for information retrieval – are useless. A pragmatic chaos reigns.
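To make that logic concrete, the sketch below (in Python, with invented toy messages) contrasts the two modes of retrieval: rather than filing each message into a folder, it leaves the corpus as one raw pile and builds a simple inverted index, so any word retrieves its messages directly.

```python
# A minimal sketch of 'search, don't sort': no folders, just one raw
# pile of messages and an inverted index from each word to the
# messages containing it. (Toy data; a real search engine would add
# ranking, stemming, proper tokenisation, and so on.)
from collections import defaultdict

messages = [
    "invoice for march attached",
    "dinner on friday?",
    "march planning meeting moved",
]

index = defaultdict(set)              # word -> ids of messages containing it
for i, text in enumerate(messages):
    for word in text.split():
        index[word].add(i)

def search(word):
    """Retrieve every message containing the word, wherever it was 'filed'."""
    return [messages[i] for i in sorted(index[word.lower()])]

print(search("march"))   # finds both 'march' messages, no sorting required
```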

Beyond the obvious psychological implications of this new (non-)order, and beyond problems of retrieval (particularly when human forms of classification interact with machines that obey no human logic – think of a worker trying to retrieve an object from an Amazon warehouse without any technological aid), this 'death' of the taxonomy has deeper, more nefarious consequences: attitudes emerge that are the unwitting product of machine forms of (non-)classification. The material force of categories appears, always and instantly.

Tech companies work every day on the design, delegation and choice of classification systems, yet few see these systems as embodying moral and aesthetic values that in turn craft people’s identities, aspirations and dignity. Philosophers and statisticians have produced highly formal discussions of classification theory, but few empirical studies of use or impact. Only a handful of people have tackled the question of how these systems inform social and moral orders via the technological and digital infrastructures that facilitate contemporary society. As a culture we have yet to develop conventions of classification for our digital world that bear much connection to our actual daily practice.

As the world becomes ever more automated by artificially intelligent machines, understanding the way the world is classified has taken on a critical importance. Like us, machines classify in order to understand and order the world. Yet the so-called 'soft' AI that powers Google, Netflix, Amazon and the like is based on a series of classifications that facilitate a crude form of machine learning – and this machine 'learning' goes largely unchecked. Consequently, the biases that exist in society quickly become embedded in the machine.
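A toy sketch of that embedding (in Python, with invented hiring data): a 'model' that does nothing more than take a majority vote over historical decisions will faithfully reproduce whatever skew those decisions contain.

```python
# Minimal sketch: bias in, bias out. The 'training data' below is
# invented; the skew between groups A and B is baked into the labels,
# and the majority-vote 'model' simply learns to repeat it.
from collections import Counter, defaultdict

# past decisions recorded as (group, hired?) pairs
history = ([("A", True)] * 80 + [("A", False)] * 20 +
           [("B", True)] * 20 + [("B", False)] * 80)

votes = defaultdict(Counter)
for group, hired in history:
    votes[group][hired] += 1

def predict(group):
    """Predict by majority vote within the group -- pure pattern-matching."""
    return votes[group].most_common(1)[0][0]

print(predict("A"))   # True  -- group A keeps being hired
print(predict("B"))   # False -- group B keeps being rejected
```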

There is now ample evidence of the insidious biases latent within technology and AI algorithms. Recent research has shown that the 100 top recruitment websites, now powered by algorithms, present high-paying job ads six times more frequently to male candidates than to female ones. Algorithms that dictate temperature regulation in contemporary office buildings are designed around a 155-pound male, meaning that many women, who are typically smaller, feel cold. Furthermore, ProPublica recently revealed that the algorithmically generated risk assessments used by US judges to guide their decisions at bail hearings were unreliable and prejudiced, finding that black defendants were twice as likely as white ones to be falsely flagged as future criminals. Even where these measures are built with the best intentions, their growing usage is beginning to aggravate discrimination and to impact the very minority groups who have historically been subject to unfair judicial treatment.

The manifestation of this machine bias not only reveals something about the society from which the biases came, but also about the hidden classification systems we are all, unwittingly, subjected to every day. The targeted advertising we face, the products we are recommended and the TV programmes we are encouraged to watch are all the result of assumptions made by the machine due to certain classification protocols. While the existence of a class-based society – and its pernicious consequences – may be nothing new, the long-term implications of opaque, automated and algorithmic classification certainly are.

The Identification Game

Understanding what motivates particular classification systems is a useful starting point when interrogating how to augment them for better ends. In 2014 Facebook proudly announced that its UK users were able to select from 71 different gender options. Disguised as a move towards greater inclusivity and recognition of gender diversity, the extended choice was a cunning way to gain more precise data about its users: 'If you don't want to identify as male or female, that's fine, but please define your "custom" gender, and we'll make sure we sell your data to the appropriate companies.' This voluntary act of classification allows algorithms to exploit homophily, the phenomenon whereby people tend to bond with those similar to them. Extending this logic, Facebook believe you are like what you like, and that you will like the things that people who are like you like.
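A hedged sketch of that homophily logic (in Python, with invented users and like-sets): user-based collaborative filtering finds your most similar user by Jaccard overlap and recommends whatever they like that you don't yet.

```python
# Minimal sketch of homophily-driven recommendation: 'you will like
# what people like you like'. All users and likes are invented.

likes = {
    "ana":  {"cycling", "jazz", "sci-fi"},
    "ben":  {"cycling", "jazz", "cooking"},
    "cara": {"opera", "gardening"},
}

def jaccard(a, b):
    """Similarity of two like-sets: shared likes over total likes."""
    return len(a & b) / len(a | b)

def recommend(user):
    # find the most similar other user...
    _, nearest = max((jaccard(likes[user], likes[o]), o)
                     for o in likes if o != user)
    # ...and recommend what they like that this user doesn't yet
    return likes[nearest] - likes[user]

print(recommend("ana"))   # {'cooking'} -- inherited from ben, her nearest lookalike
```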

The details of your digital self – what it consists of, how it is manufactured, who it was inferred from, to whom it is sold and for what purpose – are kept secret from you. Its effects may or may not be reversible; who knows? This brute-force approach to machine learning, and its reliance on correlation rather than cognition, is deeply problematic. By making judgments based on majorities, these algorithms effectively reinforce stereotypes. It is of the utmost importance that we understand the classificatory systems that govern us if we are to retain agency in our digitally mediated world.

OMA, Preservation Is Overtaking Us, 2016

Architectures of Classification

Throughout history, the theory and practice of architecture has been characterised by attempts to classify and reclassify its elements and processes: Vitruvius' Ten Books on Architecture; Semper's Four Elements of Architecture; John Ruskin's Seven Lamps of Architecture; Le Corbusier's Five Points; and, most recently, Rem Koolhaas' 15 Elements of Architecture. The consideration of 'typology' as a principle for the design and classification of buildings is so entrenched in architectural thinking that it seems difficult to challenge. The notion of typology as design methodology can be traced to Jean-Nicolas-Louis Durand's attempts to find a systematic method for classifying genres of buildings, distilling them into their most typical elemental parts. For Durand, any new typology could be created through the adaptation and recombination of these elements in response to specific site constraints. His ambition was to systematise architectural knowledge and to set out a rational method for designing buildings. Forms of classification also provide the framework for contemporary practice: the RIBA's seven work stages (eight if you include Stage 0); the 19 use classes (A1-5, B1-8, C1-4, D1-2) enshrined within the Town and Country Planning (Use Classes) Order 1987; and the three grades for listed buildings determined by Historic England, to name some notable examples.

Like other forms of classification, these designations can guide social interaction, creating financial gain for some and marginalising others. For instance, a private house increases in value by an average of 14.6 per cent when listed, while a commercial building can decrease in price by as much as 35 per cent. In 2017 a one-bed flat in Haringey, North London, dropped in value from £475,000 to £400,000 when the ground-floor shop beneath it was reclassified from A1 use to A3. In the same year, agricultural land around Belfast rose in value from £24,000 to £1.2m per hectare after being reclassified as residential. When permitted development rights were amended to allow the conversion of offices to residential use as a strategy to address the housing crisis, the London Borough of Camden estimated that 5,000 jobs could be lost. Approximately 257,000 sq ft of office space was subsequently lost in less than 12 months in 2014 – at a typical occupancy of around 100 sq ft per employee, the equivalent of relocating 2,570 jobs out of the borough.

The unit will explore such phenomena, along with other examples, as a means to interrogate the historic role classification has played in architectural design. Students will either augment an existing hegemonic taxonomy or invent their own, as a way to begin a process of world-building.

0° North, 0° East

Return to Null Island: 'The Busiest Place on Earth'

Each day countless people seeking digital directions on their computers and smartphones are diverted to an isolated spot in the Atlantic Ocean, 1,600 kilometres off the western coast of Africa, where the Prime Meridian and the equator intersect. It is called Null Island.

This lonely way station in the Gulf of Guinea is, according to its (now defunct) website, a thriving republic with a population of 4,000, a roaring economy, a tourism bureau, a unique native language and the world’s highest per capita use of Segway scooters. In the realm of digital cartography, it is one of the most-visited places in the world. The only problem for its millions of visitors is that there isn’t much to see. Null Island does not exist.

This digital 'island', described by cartographers as the 'default destination for mistakes', exists as a result of programming errors in geographic information systems (GIS). Whenever you enter a location into your computer or smartphone, a program converts that information into coordinates. If there is an error in the information you have entered, or if the code doesn't understand that you have entered 'null' or 'no information', the program is liable to get confused and default to '0,0'. Recent search-engine requests for a bike-sharing location in the Netherlands, a car-rental agency in Portugal and a polling station in Washington DC have all been sidetracked to Null Island as a result of typos or coding errors. On one day in June, GIS cartographers counted 1,708,031 misguided location requests that had landed there from a single piece of mapping software – a fraction of the daily total across all mapping services and applications worldwide.
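A minimal sketch of this failure mode (in Python; the gazetteer and its coordinates are invented for illustration): a careless geocoder that silently returns (0, 0) for null or unrecognised input, instead of reporting the failure, sends every bad query to Null Island.

```python
# Sketch of the geocoding bug described above: null, empty, or
# unrecognised queries silently default to (0.0, 0.0) -- Null Island --
# rather than raising an error the caller could handle.

GAZETTEER = {
    "london": (51.5074, -0.1278),
    "lagos": (6.5244, 3.3792),
}

def geocode(query):
    """Convert a place name to (latitude, longitude)."""
    if not query:                                  # None or '' -> 0,0
        return (0.0, 0.0)
    return GAZETTEER.get(query.strip().lower(), (0.0, 0.0))

print(geocode("London"))   # (51.5074, -0.1278)
print(geocode(None))       # (0.0, 0.0) -- off to Null Island
print(geocode("Lodnon"))   # (0.0, 0.0) -- a typo lands there too
```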

Several years ago, the crime-mapping application for the Los Angeles Police Department made LA City Hall look like the centre of a crime wave when its mapping analysts set it as the default location for hundreds of crime reports with undecipherable addresses. To fix that problem, the analysts instead routed mislabelled crime reports to Null Island.

'Like No Place on Earth'

While the exact origins of 'Null Island' are murky, it reached a wide audience no later than 2011, when it was drawn into Natural Earth, a public-domain map dataset developed by volunteer cartographers and GIS analysts. In creating a one-square-metre plot of land at 0°N 0°E in the digital dataset, Null Island was intended to help analysts flag errors in a process known as 'geocoding'. If it was first enjoyed as a cartographers' in-joke, the previous sections of this brief suggest a darker side to this non-place. With algorithms dutifully classifying the characteristics of our contemporary world through this openly available data, is Null Island's apparently booming economy, thriving tourism industry and disproportionately high crime rate beginning to skew global perceptions of equatorial Africa's circumstances?
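In practice, the flagging works by treating coordinates at (or vanishingly near) 0°N 0°E as suspect. A hedged sketch (in Python; the record fields are invented) of how an analyst might sweep a batch of geocoded records for entries that need re-geocoding:

```python
# Sketch of Null Island as an error flag: any record whose coordinates
# sit at (or within a hair of) 0,0 is almost certainly a failed geocode.

records = [
    {"id": 1, "lat": 51.5074, "lon": -0.1278},   # a real location
    {"id": 2, "lat": 0.0, "lon": 0.0},           # a failed geocode
]

def on_null_island(rec, tolerance=1e-6):
    return abs(rec["lat"]) < tolerance and abs(rec["lon"]) < tolerance

suspect = [r for r in records if on_null_island(r)]
print(f"{len(suspect)} record(s) to re-geocode:", [r["id"] for r in suspect])
```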

Null Island is a remarkable product of the collision of systems of classification. As such, it provides a vehicle for exposing the latent biases, privileges and invisible structures that govern our world. During 2018/19 this contested island will become the testbed for ADS4’s investigations into the wide-ranging implications of emerging technologies. Through this site, we will explore the nature, origin and social consequences of analogue (human) and digital (machine) forms of classification.

By using the ‘unreal’ Null Island as our site, we will aim to provide a counterpoint to the idea of ‘future visioning’ as the primary framing device for imagining new realities. Null Island is intended as a place where new worldviews can be developed and formulated into propositions, questions, hypotheses, ideas and what-ifs.

It is not a place for testing ideas that are intended to be implemented, but rather a site where – in response to the complex fusion of politics and technology shaping today’s social realities – speculative forms of material culture can be used to provoke new ideas and collective imagining about the kinds of worlds people wish to live in.

Lies & Videotape: Borrowing Tools and Techniques from Documentary Filmmaking  

ADS4’s interest in contemporary visual culture and the moving image will continue this year through the prism of documentary film. While the primary conceptual motivation behind the brief is to interrogate classification and its social, political and ethical implications, we are also eager to more broadly interrogate the nature of indexing/indexicality.

Over the past few years contemporary art has witnessed a remarkable trend towards the documentary style. Documentary practice in contemporary art is characterised by objectivity and the search for truth, an acute sense of reality and the desire to remain factual.

The creation of documentaries will be used to explore notions of reality, fakery and unreality in relation to Null Island (a site where what is real and what is nonexistent is almost impossible to meaningfully delineate). As film critic Brandon Harris writes, 'in the best documentary films, artifice isn’t an obstacle to truth, it’s a way in'. In other words, we will use fiction as a vehicle to find truth.

Documentary practices and approaches towards reality will also be used more generally in our design projects. In a culture saturated with irony and bullshit, this somewhat anachronistic approach to the moving image – one that recognises its own indexicality – has the potential to foster a new, more sincere attitude towards the world and its objects.

Live Project

The first-year students will produce an exhibition and accompanying publication/atlas on the subject of Null Island. Null Island will be treated as a nation without an identity – one that needs to be designed, in collaboration with others, and through a series of objects.


Tutors

Tom Greenall has completed award-winning buildings with DSDHA, whose Christ College Secondary School was shortlisted for the Stirling Prize in 2010. He has taught Architectural Design Studio 4 (ADS4) at the RCA since 2011. Tom studied architecture at the University of Sheffield and the Royal College of Art, qualifying as an architect in 2011, and was made an Associate Director of London-based architectural practice DSDHA in 2013. His work has been published in over 20 languages, he has exhibited internationally, and he has written for both Building and Building Design magazines.

Nicola Koller is a designer working for acclaimed fashion designer Sir Paul Smith, designing and commissioning retail spaces worldwide and running design projects of special interest to Paul. Nicola heads an in-house multidisciplinary studio of furniture designers, interior designers, industrial designers and architects. She has overseen and completed over 300 projects from concept to completion with Paul Smith in 24 countries, including major flagship projects in London, Paris, New York, Antwerp, LA, Tokyo, Seoul, Beijing and Shanghai. She graduated from Oxford Brookes University with a BA in architecture in 2000 and continued her education in architecture at the Royal College of Art, graduating with an MA in 2003.

Follow ADS4 on Instagram:

@rca.ads4