Blog
Fiona Johnstone in conversation with Felicity Allen

Fiona Johnstone

5 November 2019

As part of our investigation into personalisation, People Like You is working with artist Felicity Allen (http://felicityallen.co.uk). Up to fifteen people will participate in Allen’s Dialogic Portraits practice, sitting for Allen in her studio in Ramsgate, where she will paint their portrait and invite them to reflect upon the process – and personalisation – with her.  Fiona Johnstone, postdoctoral research fellow with People Like You, sat for Allen and discussed her practice in relation to personalisation.

 

Fiona: Can you tell me a little more about the process of making Dialogic Portraits? The phrase suggests a conversation or dialogue; it reminds me of Linda Nochlin’s famous line about a portrait being ‘the meeting of two subjectivities’. I’m interested in how this relationship can be captured and made manifest in an artwork.

 

Flick: I’ve been working with Dialogic Portraits as a format for around ten years. Each sitting involves both talking and silence; each portrait is a document of the time that the sitter and I spend together. As well as creating a pictorial portrait, I also produce audio and video recordings, and make written observations. These then go towards making, say, an artist’s book or a film. I’m interested in how people respond to the experience of sitting, and in how they relate to the version of themselves that is given back to them in the finished portrait.

 

Fiona: The notion of series is important for dialogic portraiture, is that correct?

 

Flick: Yes, series, but also concept. Each series of Dialogic Portraits (Begin Again [2009-2014], You [2014-2016], As if They Existed [2015-2016], and, currently, People Like You, Refugee Tales, and Interpreting Exchange) is informed by a concept that loosely links all the sitters in some way. For example, for Begin Again, which I started at the end of a decade of not-painting, I invited people who I had been working with [as Head of Interpretation & Education at Tate Britain] during that decade to sit for me. This enabled me to explore the limits of what we understand to constitute labour – intellectual, administrative, affective or domestic – and to think through the significance of this labour in relation to the production of both portraits and persons. For each portrait produced (76 in total), I wrote a diaristic note and recorded an interview with the sitter.

 

Fiona: I’m interested in the presentational format of Begin Again, which takes the shape of a two-volume book with images and texts (pictured), and also an exhibition (in 2015) where the portraits were hung as a wall-sized grid of faces (pictured). This configuration conjures several associations for me: a database, a filing system, or a rogues’ gallery. This reminds me of two textual reference points. The first is Siegfried Kracauer’s ‘The Mass Ornament’; Kracauer argues that in the modern period, people can only be understood as part of a mass, not as self-determining individuals. The second is Allan Sekula’s famous essay on photography, ‘The Body and the Archive’, where he explores the relationship in the early nineteenth century between photographic portraiture, the standardisation of police and penal procedures, and the rise of the pseudo-sciences of physiognomy and phrenology (both comparative taxonomic systems which in turn contributed to the development of the discipline of statistics). Finally, it also made me think of an Instagram wall!

 

Flick: The associations with Instagram wouldn’t have occurred to me. I started working with the grid before I started using social media, and certainly before I was aware of Instagram [which was launched in 2010]. The grid was partly a practical solution to the problem of how to display multiple images within a limited space. For me, the associations of the grid would be minimalist or modernist – as in Rosalind Krauss’ reading of the grid – rather than to do with Instagram.

 

Fiona: That’s interesting. Krauss claims that the grid is a symptom of modern art’s hostility to narrative and to discourse – this seems antithetical to your own work, which connects image and text. She also describes the order of the grid as that of ‘pure relationship’, whereby objects no longer have any particular kind of value or order in themselves, but only in relation to each other. Perhaps this notion of ‘pure relationship’ might offer us a way into thinking about personalisation in relation to your work?

 

Flick: With the Begin Again wall I was certainly thinking about the individual in relation to the mass; the paradoxical effect of working with a group or series of people is that you start thinking about them all as individuals. The format is also, crudely speaking, about taking status away from people by putting them alongside other people. It disrupts the way in which we privilege certain people. It’s vaguely political, challenging hierarchy.

Begin Again nos 1–21 (2014), Felicity Allen, 2-volume limited edition artists book

Fiona: It feels as though you are working with an enduringly humanistic notion of the person. In particular, you work primarily with the face, a part of the person that has longstanding associations with phenomenological presence. Your images are often closely cropped; the focus is solely on the face, rather than on any contextual details, such as background or clothes, that might give the viewer a clue as to the identity of the sitter.

 

Flick: I agree that the face has strong humanistic associations. I’m thinking of Levinas’ idea that the face is basically something that stops you killing people – it makes a demand on you, and that relationship is inherently ethical. In terms of contextual details, I’m now starting to crop my images much less closely, because I’m interested in notions of personal branding and role-playing through the way in which people choose to present themselves – through branded clothes, for example.

 

Fiona: I wanted to ask you about the significance of persona. Many of our conversations on this project have looked at personalisation in relation to digital technologies and data science. Digital personalisation technologies reflect a longer preoccupation with the ‘person’ and the ‘persona’, and it seems to me that your work, which is almost resolutely analogue, might offer us a different way of approaching personalisation. The origins of the terms personalisation, personal, and personalise all stem from the Latin personalis or personale, which means ‘pertaining to a person’. Can we talk about the concept of persona in relation to your work?

 

Flick: The watercolours are resolutely analogue but there’s usually a kind of comprehensive digital work – a book or a film – which brings the series together. I am interested in how my sitters perform certain roles, but I’m also interested in the way in which I perform – or inhabit – the role of the artist. Begin Again was absolutely about getting people to see me differently, I had a strong consciousness of that very quickly. I was undressing as a manager, and dressing up as an artist.

Begin Again (2015), Felicity Allen, detail of exhibition installation during a residency at Turner Contemporary, Margate

Fiona: Do you find painting (someone’s portrait) to be performative?

 

Flick: It’s totally performative for both artist and sitter. We both find it exhausting, sitters as well as me.

 

Fiona: It’s a little like being on a therapist’s couch.

 

Flick: Yes, or at the hairdressers. But I try to manage the relationship to ensure that I’m not turning into the analyst or the hairdresser.

 

Fiona: How do you do that?

 

Flick: By talking back! And by being very conscious of how the sitting is going. But it does mean that I’m constantly retelling – or reperforming – my own stories.

 

Fiona: There’s a kind of labour, a selling-of-self, involved in that process of storytelling; that’s part of your exchange with the sitter. I’m wondering if you have read any of Isabelle Graw’s work on painting? Graw describes painting as ‘a form of production of signs that is experienced as highly personalised’. What she means by this is that painting has a direct indexical link to its maker; there is a close relationship between person and product. She links this to Alfred Gell’s definition of artworks as ‘indexes of agency’. As a ‘record of time spent together’, your work has a strong claim to the indexical.

 

Flick: Do you believe Graw’s argument?

 

Fiona: It’s seductive, but I don’t really buy it – why is this true of painting, but not of drawing or sculpture?

 

Flick: I don’t believe it, but I feel it. There’s something about the flow, the wet, that is very important about painting. I’ve got this board, and I’ve got paper on it. As I’m painting and I’m using my brush it’s like a proxy for stroking the face. There’s a brushstroke going on, and there’s a body that I could be stroking. It’s about touch, and feeling, and all that stuff – if I was to use a camera, I wouldn’t have that.

 

Fiona: So for you there is a strong sense that the (painted) portrait is a proxy for the person, but also that your tools are proxies for your own libidinal body.

 

Flick: Right.

 

Fiona: I’ve been trying to think about whether I have found my experience of sitting for you to be a personalised one. I think that I’d describe it as a personal or inter-personal experience, but not personalised, as such – for me, that term suggests an industrial process driven by big data and an infinite number of calculable relations based on things like likes and preferences. Understood in this way, personalisation seems to bear little relation to the highly individualised experience of a one-to-one portrait sitting. Perhaps we need a vocabulary that can differentiate between an experience that is individualised, and one that is personalised? Throughout our conversation, we’ve often both found it challenging to think about your work in relation to a dominant concept of [algorithmic] personalisation. In a recent essay published in Critical Inquiry, Kris Cohen notes that personalisation, and indeed networked life more generally, ‘disorientates all of our existing vocabularies of personhood and collectivity’. Do you think that this semantic disorientation might explain our difficulty in thinking through your work in relation to personalisation?

 

Flick: Absolutely!

 

Felicity Allen at work on a portrait of Fiona Johnstone, 5 September 2019, for People Like You

Works referenced

 

Linda Nochlin, “Some Women Realists”. Arts Magazine (May 1974), p. 29.

Allan Sekula, “The Body and the Archive”. October 39 (Winter 1986), pp. 3-64.

Siegfried Kracauer, The Mass Ornament: Weimar Essays, trans. Thomas Y Levin. Harvard University Press, Cambridge, Massachusetts and London, England: 1995.

Rosalind Krauss, “Grids”. October 9 (Summer 1979), pp. 50-64.

Isabelle Graw, “The Value of Painting: Notes on Unspecificity, Indexicality, and Highly Valuable Quasi-Persons”, in Isabelle Graw, Daniel Birnbaum and Nikolaus Hirsch (eds.), Thinking Through Painting: Reflexivity and Agency Beyond the Canvas. Sternberg Press, Berlin: 2012.

Kris Cohen, “Literally, Ourselves”. Critical Inquiry 46 (Autumn 2019), pp. 167-192.

 

 

How Do You See Me?

Fiona Johnstone

30 September 2019

Artist Heather Dewey-Hagborg’s new commission for The Photographers’ Gallery, How Do You See Me?, explores the use of algorithms to ‘recognise’ faces. Displayed on a digital media wall in the foyer of the gallery, the work takes the form of a constantly shifting matrix of squares filled with abstract grey contours; within each unit, a small green frame identifies an apparently significant part of the composition. I say ‘apparently’, because the logic of the arrangement is not perceptible to the eye; although the installation purports to represent a human face, there is no trace of anything that visually resembles one. At least not to the human eye.

 

To understand How Do You See Me?, and to consider its significance for personalisation, we need to look into the black box of facial recognition systems. As explained by Dewey-Hagborg, speaking at The Photographers’ Gallery’s symposium What does the Data Set Want? in September 2019, a facial recognition system works in two phases: training and deployment. Training requires data: that data is your face, taken from images posted online by yourself or by others (Facebook, for example, as its name suggests, has a vast facial recognition database).

 

The first step for the algorithm working on this dataset is to detect the outlines of a generic face. This sub-image is passed on to the next phase, where the algorithm identifies significant ‘landmarks’, such as eyes, nose and mouth, and turns them into feature vectors, which can be represented numerically.  For the ‘recognition’ or ‘matching’ stage, the algorithm will compare multiple figures across the dataset, and if the numbers are similar enough, then a match is identified – although this might take millions of goes. The similarity of the represented elements remains unintelligible to the human eye, calling a ‘common sense’ understanding of similarity-as-visual-resemblance into question.
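To make the matching stage a little more concrete, here is a minimal sketch in Python of the kind of comparison described above: faces already reduced to numerical feature vectors are declared a ‘match’ when their similarity clears a threshold. The vector size, the similarity measure and the threshold value are all assumptions made for illustration, not details of any particular facial recognition system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe: np.ndarray, candidate: np.ndarray, threshold: float = 0.8) -> bool:
    # A 'match' is declared when the numerical similarity clears a chosen threshold;
    # the 0.8 here is arbitrary - real systems tune it against error rates.
    return cosine_similarity(probe, candidate) >= threshold

# Toy feature vectors standing in for faces already encoded as numbers.
rng = np.random.default_rng(0)
probe = rng.normal(size=128)             # the face presented to the system
database = rng.normal(size=(1000, 128))  # previously enrolled faces

matches = [i for i, candidate in enumerate(database) if is_match(probe, candidate)]
print(f"{len(matches)} candidate matches out of {len(database)}")
```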

 

In the western culture of portraiture the face has traditionally acted as visual shorthand for the individual; through this new technology, the face (read, the person) is transfigured into numeric code, allowing for algorithmic comparison and categorisation across a vast database of other faces/persons that have been similarly processed. How Do You See Me? asks what it means to be ‘recognised’ by AI. What version of ‘you’ emerges from a system like this, and how is it identifiable as such? Dewey-Hagborg described the project as an attempt to form her own subjectivity in relation to AI, noting that her starting point was a curiosity as to how she might be represented within the structure of the system, and what these abstractions of her ‘self’ might look like. In an attempt to answer these questions, she used an algorithm to build a sequence of images stemming from the same source, but varying widely in terms of appearance, working on the hypothesis that eventually one of these pictures would be detected as ‘her’ face. The grid of abstract and figuratively indistinct images on the wall of The Photographers’ Gallery can thus be understood as a loose form of self-representation, the evolution of a set of (non-pictorial) figures that attempt to be detected as a (specific) face. By interrogating the predominantly visual associations of ‘similarity’ – which may once have implied a mimetic ‘likeness’, with connotations of the pictorial portrait, arguably a dominant technology for the production of persons from the sixteenth to the twentieth century, but which now suggests a statistical correspondence – How Do You See Me? draws attention to changing ideas about how a ‘person’ might be identified and categorised.

 

Following her own presentation, Dewey-Hagborg discussed her practice with Daniel Rubinstein (Reader in Philosophy and the Image at Central St Martins). Rubinstein argued that this new technology of image-making can teach us something about contemporary identity. Considering our apparent desire to be ‘recognised’ by our phones, computers, and other smart appliances, Rubinstein suggested that the action of presenting oneself for inspection to a device resembles the dynamics of an S&M relationship where the sub presents themselves to the dom. Rubinstein argued that we want to be surveyed by these technologies, because there is a quasi-erotic pleasure in the abdication of responsibility that this submission entails. Citing Heidegger, Rubinstein argued that technology is not just a tool, but reveals our relation to the world. Life and data are not two separate things (and never have been): we need to stop thinking about them as if they are. The face is already code, and the subject is already algorithmic.

 

Rubinstein’s provocative remarks certainly provide one answer to the question of why people might choose to ‘submit’ to selected technologies of personalisation. They also help us to approach the question of personalisation more broadly. The project People Like You promises that we will try to ‘put the person back into personalisation’. Whilst this could be taken to imply that there is a single real person – the individual – we aim instead to consider multiple figurations of the ‘person’ on an equal footing with each other. As Rubinstein’s comments suggest, rather than thinking about this relationship in terms of original and copy (the ‘real’ person or individual and a corresponding ‘algorithmic subject’ produced through personalisation), the ‘person’ is always every bit as constructed a phenomenon as an ‘algorithmic’ subject. Or, to put this another way, rather than taking a liberal notion of personhood for granted as our starting point, our aim is to interrogate the contemporary conditions that make multiple different models of personhood simultaneously possible.

Who gets to feed at the biobank?

William Viney

10 September 2019

In the United Kingdom, initiatives such as UK Biobank and the 100,000 Genomes Project are now complete, and the NHS Genomic Medicine Service launched last year. With the consent of patients, local NHS trusts collect data and samples for research purposes. Each is a kind of biobank – an organised collection of biological specimens associated with computerised files, including demographic, clinical and biological data. Biobanks are an increasingly important part of research infrastructures in biomedicine and are important to realising the NHS’s desire for a more personalised healthcare system.

More recently, clinicians and researchers have been calling for wider participation in biobanking. This is because participation in biomedical research is seen as fundamental to developing more ‘targeted’ treatments and to fostering a transition from ‘one-size-fits-all’ models of healthcare to more timely, accurate, and preventative interventions. Researchers and clinicians may also need wide and inclusive participation – including patients traditionally excluded from research – to make sure that biological samples and datasets are diverse and representative.

The People Like You project is interested in these and other developments that link healthcare, research, data science, and data infrastructures. My own involvement in biobanking began before I joined the project, when I enrolled as a participant in TwinsUK based at the Department of Twin Research, King’s College London – the UK’s largest registry for twins. When my brother and I visited TwinsUK, the group collected basic biometric data, measuring height, weight, and blood pressure, as well as the strength of our grip and the capacity of our lungs. We gave samples of our blood, hair and spit, from which DNA, RNA, metabolites and numerous other molecules can be extracted. Our faces were swabbed in different places to test our sensitivity to different chemicals. All was recorded. We were not only enrolled, we were incorporated.

Participating in a biobank is different to enrolling in a discrete study because participants are not told exactly when and how their samples or data are used. The data stored by TwinsUK is available to any bona fide researcher, anywhere in the world. And so a biobank is not only a store of samples and data. It is also a registry or store of names and contact details, linking to individuals who have declared themselves interested in research and will give time, energy, and lots of different kinds of data. When the wind blew in the direction of studies interested in ‘personalised’ tests and interventions, this registry faced new opportunities and challenges, as did its participants.

In 2018, TwinsUK asked if I would take part in a new study called PREDICT. I was interested because it was described as a ‘ground-breaking research study into personalised nutrition’ that would ‘help you choose foods for healthy blood sugar and fat levels.’ Being involved was not straightforward. After a visit to St. Thomas’ Hospital, participants returned home and spent the next 14 days measuring blood glucose, insulin, fat levels, inflammation, sleep patterns and their gut microbiome diversity, both in response to standardised foods and to each participant’s chosen diet. In return, participants would be given summary feedback on their metabolic response. What interested me was how recruitment targeted existing members of the registry using the usual email format and their unique study numbers. And so it looked like any other Department of Twin Research study. But it is not like any other study.

Although King’s College London is the study sponsor and the Health Research Authority has provided the usual ethical approval, PREDICT is a large collaboration between several European and American universities, backed by venture capital investment from around the world. Tim Spector, the director of TwinsUK, is part of the scientific group that leads the study and has an equity stake in ZOE, a private company that aims ‘to help people eat with confidence’. It is ZOE, not TwinsUK, that is processing the data that will build predictive – and ‘personalised’ – algorithms for future ZOE customers.

There is nothing nefarious or illegal about PREDICT. Collaborations between university scientists and private companies have been common for centuries. But the presentation of PREDICT’s results led me to think differently about biobanks and biobank participation in an era of personalised medicine and healthcare. PREDICT’s innovation threads together a set of historical tendencies that are important for how personalisation is seen as a desirable, evidence-based, and marketable product.

Changes in how UK universities are funded and the NHS is structured have changed the potential uses of biobanks. This is not always obvious to existing research participants (who, at TwinsUK, have a mean age of 55 years; some have been volunteers for 25+ years). In the case of PREDICT, TwinsUK assure me that all the proper licences and contracts are in place so that data can be shared with commercial collaborators, and participants are given information sheets explaining how their data is used. But what does informed consent become – and ‘participation’ signify – when the purpose of a biobank shifts to include corporate interests outside the health service?

Initial results from PREDICT have been more actively disseminated in the mainstream media than in peer-reviewed journals (summary results have been presented at a large conference in the US). Significant resources have been ploughed into garnering widespread coverage in The New York Times, Daily Mail, The Times and The Guardian. The data from the first PREDICT study has not been made available to other groups.

Begun in 1993 to investigate ageing-related diseases, TwinsUK started in the public sector. It still receives money from the Biomedical Research Centre at Guy’s and St Thomas’ NHS Foundation Trust and King’s College London, to make translational research benefit everyone, and its other funders – the Medical Research Council, Wellcome Trust, and the European Commission – are committed to the principles of open and equitable science. But with the turn towards ‘personalised’ interventions in nutrition, a fresh wave of transatlantic venture capital has become available to biomedical researchers who have access to people, resources, and data accumulated over years of state-funded work.

One facet of what Mark Fisher called ‘capitalist realism’ is the insistence that things are what they are and they cannot be another way. In biomedicine, this has affected the kinds of research that get funded and the corporate interests allowed to inform research, when and how. It is understandable that the microbiome that feeds you may be more worthy of research than the many that are not so financially nourishing. But who is keeping an eye on the opportunity costs?

 

 

 

Whose person is it anyway?

Scott Wark

16 July 2019

One of this project’s lines of inquiry is to ask who the “person” is in “personalisation”. This question raises others: Is personalisation actually more personal? Are personalised services about persons, or do they respond to other pressures? This question also resonates differently in the three different disciplines that we work in. In health, it might invoke the promise of more effective medicine. In data science, the problem of indexing data to persons. In digital culture, though, this tagline immediately invokes more sinister—or at least more ambiguous—scenarios, for me at least. When distributed online services are personalised, how are they using the “person” and to whose benefit? Put another way: Whose person is it anyway?

 

What got me thinking about these differences was a recently-released report on the use of facial recognition technologies by police forces in the United Kingdom. The Metropolitan Police in London have been conducting a series of trials in which this technology is deployed to assist in crime prevention. Other forces around the country, including South Wales and Leicester, have also deployed this technology. These trials have been contentious, leading to criticism by academics, rights groups, and even a lawsuit. As academics have noted elsewhere, these systems particularly struggle with people with darker skin, which they have difficulty processing and recognising. What it also got me thinking about was the different and often conflicting meanings of the “person” part of personalisation.

 

Facial recognition is a form of personalisation. It takes an image, either from a database—in the case of your Facebook photos—or from a video feed—the Met system is known as “Live Facial Recognition”—and processes it to link it to a profile. Online, this process makes it easier to tag photographs, though there are cases in which commercial facial recognition systems have used datasets of images extracted from public webpages to “train” their algorithms. The Live Facial Recognition trials are controversial because they’re seen as a form of “surveillance creep”, or a further intrusion of surveillance into our lives. Asking why is instructive.

 

The police claim that they are justified in using this technology because they operate it in public and because it will make the public safer. The risk that the algorithms underlying these systems might actually reproduce particular biases built into their datasets, or exacerbate problems with accuracy around different skin tones, challenges these claims. They’re also yet to be governed by adequate regulation. But these issues only partly explain why this technology has proven to be so controversial. Facial recognition technologies may also be controversial because they create a conflict between different conceptions of the “person” operating in different domains.

 

To get a little abstract for a moment, facial recognition technology creates an interface between different versions of our “person”. When we’re walking down the street, we’re in public. As more people should perhaps realise, we’re in public when we’re online, too. But the person I am on the street and the person I am online aren’t the same. And neither person is the same as the one the government constructs a profile of when I interact with it—when I’m taxed, say, or apply for a passport. The controversy surrounding facial recognition technology arises, I think, because it translates a data-driven form of image processing from one domain—online—to another: the street. It translates a form of indexing, or linking one kind of person to another, from the domain of digital culture into the domain of everyday life.

 

Suddenly, data processing techniques that I might be able to put up with in low-stakes, online situations in exchange for free access to a social media platform have their stakes raised. The kind of person I thought I could be on the street is overlaid by another: the kind of person I am when I’m interfacing with the government. If I think about it—and maybe not all of us will—this changes the relative anonymity I might otherwise have expected when just another “person on the street”. This is made clear by the case of a man who was stopped during one facial recognition trial for attempting to hide his face from the cameras, ending up with a fine and his face in the press for his troubles. Whether or not I’m interfacing with the government, facial recognition means that the government is interfacing with me.

 

In the end, we might gloss the controversy created by facial recognition by saying this. We seem to have tacitly decided, as a society, to accept a little online tracking in exchange for access to different—even multiple—modes of personhood. Unlike online services, there’s no opt-out for facial recognition. Admittedly, the digital services we habitually use are so complicated and multiple that opting out of tracking is impracticable. But their complexity and the sheer weight of data that’s processed on the way to producing digital culture means that, in practice, it’s easy to go unnoticed online. We know we have to give up our data in this exchange. Public facial recognition is a form of surveillance creep and it has rightly alarmed rights organisations and privacy advocates. This is not only because we don’t want to be watched. After all, we consent to being watched online, having our data collected, in exchange for particular services. Rather, it’s because it produces a person who is me, but who isn’t mine. The “why” part of “Why am I being tracked?” merges with a “who”—both “Who is tracking me?” and “Who is being tracked? Which me?”

 

In writing this, I don’t mean to suggest that this abstract reflection is more incisive or important than other critiques of facial recognition technology. All I want to suggest is that recognising which “person” is operating in a particular domain can help us to get a better handle on these kinds of controversies. After all, some persons have much more freedom in public than others. Some are more likely to be targeted for the colour of their skin, how respectable they seem, how they talk, what they’re wearing, even how they walk. In the context of asking who the “person” is in “personalisation”, what this controversy shows us is that what “person” means is dependent not only on this question’s context, but on the ends to which a “person” is put. Amongst other things, what’s at stake in technologies like these is the question of whose person a particular person is—particularly when it’s nominally mine.

 

The question, Whose person is it anyway?, is a defining one for digital culture. If recent public concern over privacy, data security, and anonymity teaches us anything, it’s that it’ll be a defining question for new health technologies and data science practices, too.

Tails you win

William Viney

13 May 2019

I came home from a trip to Italy one day having heard that my dear dog Wallace was gravely ill. He had an iron temperament – haughty and devious, a great dog but not much of a pet. He was my constant companion from the age of 10. By the time I was home that summer in 2003 he was already in the ground. The log we used to chain him to – the only way we could stop him running off – was already on the fire. He lived fast and died young. The cause of his death was uncertain, but it was likely connected to Wallace’s phenomenal appetite. Our farm dogs had carnivorous diets: canned meats and leftovers and dry food, all mixed together. But this was never enough for Wallace, who was a very hungry beagle, and who died after eating something truly gruesome on the farm. Pity Wallace, who died for the thing he loved.

While I was browsing Twitter a few weeks ago, a promoted ad appeared suggesting I should buy personalised dog food. I felt a familiar pang of sadness. True to the idea that any product can have the word ‘personalised’ attached to it, Tails.com have sought to personalise pet food – the stuff that is proverbially uniform, undifferentiated, derivative – with ingredients selected especially for your dog’s individual needs. Beyond the familiar platitudes, I wondered what is being ‘personalised’ when dog food is personalised: what is this product, and why is it being sold to me?

 

I don’t have a dog or anything else in the house that might eat dog food. I have the memory of a dog now dead for 15 years. Such is the informational asymmetry on social media platforms that I can guess, but I don’t really know, how Tails.com decided to spend money marketing their product on my Twitter feed. How had I been selected? Because I associated myself with the weird abundance of ‘doggo’ accounts? Surely something more sophisticated is needed than my having interacted with some canine-related content? But for a relatively new company like Tails.com, which now has Nestlé Purina Petcare as its majority shareholder, advertising to new customers is also a way of announcing themselves to investors and rivals, since their ads celebrate their innovation within a market – ‘the tailor-made dog food disrupting the industry’ – as well as promising products ‘as unique as your dog’. Whatever made me the ostensible target for this company’s product, the algorithmic trap was sprung from social media in order to ‘disrupt’ how you care for the animals in your home.

 

Tails.com provide personalised rather than customised products. The personalised object or experience is iterative and dynamic, it can be infinitely refined: personalisation seeks and develops a relationship with a person or group of persons; it may even develop the conditions for that group to join together and exist. Personalisation is primarily a process rather than a one-off event. A customised thing, by contrast, is singular and time-bound; it may have peers but it has no equal or sequel. So, many surgical interventions are individualised according to the person, but the patient usually hopes it’s a single treatment. Personalised medicine, on the other hand, is serial and data-driven; a testing infrastructure that recalibrates through each intervention, shaping relationships between different actors within a system. Tails.com sells dog food to dog owners. It does this by capturing and managing a relationship between dogs and owners, mediated by the processing of group and individual-level data. Such a system can be lifelong, informing not one but multiple interactions.

 

While debates continue to turn on the ethical uses of machine learning, its misrepresentations and its inherent biases, I am struck by how even critical voices seek adjustments and inclusions according to consumer rights: an approach that is happily adapted to capitalist prosumerism. ‘Personalise #metoo!’ To simply disregard Tails.com’s ads on Twitter as an intrusive failure of targeted marketing and personalisation may overlook a wider project that is harder to evaluate from an individual, rights-based, or anthropocentric perspective. The promise of disruption through personalised dog food tells us something about personalisation that stretches beyond transactions between company and client.

 

By personalising pet care, Tails.com seeks to enhance interactions between different ‘persons’, extending values of consumer preference and taste, satisfaction and brand loyalty with a blanket of anthropocentric ‘personhood’ to cover both the machines that market and deliver this product and the animal lives that we are told should benefit. No one asks the dog what it wants or needs. The whole system, from company to client and canine, is being personalised, but from a wholly human point of view. And yet, despite messages to the contrary, dogs probably don’t care that their food is ‘personalised’ in the way that Tails.com desire.

 

It’s not hard to imagine the kind of dog food customised to canine desires, the kind of foods that kill dogs like Wallace. I doubt, somehow, that Tails.com would like to facilitate this deathwish, since it would be a customised last supper rather than a personalised relation, sold over and over again.

This is a … toilet

Celia Lury

2 March 2019

In the project ‘People Like You’ we are interested in the creation and use of categories: from the making of natural kinds to what has been called dynamic nominalism, that is, the process in which the naming of categories gives opportunities for new kinds of people to emerge. And while the making of categories is often the prerogative of specialised experts, the last few years have seen a proliferation of categories associated with social, political and medical moves to go beyond the binaries of male/female and men/women. Emerging categories include: transgender, gender-neutral, intersex, gender-queer and non-binary.

The question of who gets included, who gets excluded and who belongs in categories is complicated, and depends in part on where the category has come from, who created it, who maintains it, who is conscripted into it, who needs to be included and who can avoid being categorised at all. Categories are rarely simply accepted; they need to be communicated, are frequently contested and may be rejected. There is a politics of representation in the acceptance – or not – of categories.

Take this example of a sign for a ‘gender-neutral’ toilet. Before I saw it, I knew what would be behind the door to which it was attached, since the building work associated with the conversion of men’s and women’s toilets into gender-neutral toilets had taken weeks. But when the building work was finished and I was confronted with this sign – marking the threshold into a new categorical space – I didn’t know whether to laugh or cry. I am familiar, as no doubt you are too, with signs for what might now be called gender-biased toilets; that is, toilets for either men or women. Typically, the signs make use of pictograms of men or women, with the figure for ‘women’ most frequently distinguished from an apparently unclothed ‘man’ by the depiction of a skirt. Sometimes the signs also employ the words ‘men’ and ‘women’, or ‘gentlemen’ and ‘ladies’. But the need to signal to the viewer of the sign that they would be occupying a gender-neutral space on the other side of the door seemed to have floored the institution in which the toilet was located. The conventional iconography was, apparently, wanting. Perhaps it seemed impolitic – too difficult, imprudent or irresponsible – to represent a category of persons who are neither ‘men’ nor ‘women’. But in avoiding any representation of a person, in making use of the word and image of a toilet (which of course is avoided in the traditional iconography, presumably as being impolite if not impolitic), I couldn’t help but think that the sign was inviting me – if I was going to step behind the door – to identify, not with either the category ‘men’ or ‘women’, but with a toilet.

The sign intrigued me. Why, I wondered, if it was considered so difficult to depict a gender-neutral person, not just make this difficulty visible once, and simply show either a pictogram of a toilet or the word ‘toilet’? Why ‘say’ toilet twice? I recalled a work of art by the artist Magritte titled The Treachery of Images (1929). In this work, a carefully drawn pipe is accompanied by the words ‘Ceci n’est pas une pipe’, or ‘This is not a pipe’. Magritte himself is supposed to have said, ‘The famous pipe. How people reproached me for it! And yet, could you stuff my pipe? No, it’s just a representation, is it not? So if I had written on my picture “This is a pipe”, I’d have been lying!’ In an essay on this art work (1983), Michel Foucault says the same thing differently: he observes that the word ‘Ceci’ or ‘This’ is (also) not a pipe. Foucault describes the logic at work in the art work as that of a calligram, a diagram that ‘says things twice (when once would doubtless do)’ (Foucault 1983: 24). For Foucault, the calligram ‘shuffles what it says over what it shows to hide them from each other’, inaugurating ‘a play of transferences that run, proliferate, propagate, and correspond within the layout of the painting, affirming and representing nothing’ (1983: 49).

What, then, does the doubling of the gender-neutral door sign imply about the category of the gender-neutral? Perhaps there is a nostalgia for when there was a play of transferences, when the relations between appearance and reality could be – and were – continually contested. Perhaps, however, it is a new literalism, what Antoinette Rouvroy and Thomas Berns call ‘a-normative objectivity’ (2013).
Then again (and is this my third or fourth attempt to work out why the sign made me want to laugh and cry?), perhaps there is also an invitation to call into existence ‘something’– rather than the ‘nothing’ that Foucault celebrates – even if, for the category of the gender-neutral to come into existence, you have to (not) say something twice.

____________________________________________________________________________

Bibliography

Foucault, M. (1983) This is Not a Pipe, translated and edited by J. Harkness, Berkeley and Los Angeles: University of California Press.

Rouvroy, A. and Berns, T. (2013), ‘Algorithmic governmentality and prospects of emancipation’, Réseaux, 2013/1, no. 177, pp. 163-196, translated by Elizabeth Libbrecht.

Data Portraits

Fiona Johnstone

13 February 2019

One of the aims of People Like You is to understand how people relate to their data and its representations. Scott Wark has recently written about ‘data selves’ for this blog; an alternative (and interconnected) way of thinking about persons and their data is through the phenomenon of the data portrait.

A quick Google of ‘data portraits’ will take you to a website where you can purchase a bespoke data portrait derived from your digital footprint. Web-crawler software tracks and maps the links within a given URL; the information is then plotted onto a force-directed graph and turned into an aesthetically pleasing (but essentially unrevealing) image. Drawing on a similar concept, Jason Salavon’s Spigot (Babbling Self-Portrait) (2010) visualises the artist’s Google search history, displaying the data on multiple screens in two different ways; one using words and dates, the other as abstract bands of fluctuating colour. The designation of the work as a self-portrait raises interesting questions about agency and intentionality in relation to one’s digital trace: as well as referring to identities knowingly curated via social media profiles or personal websites, the data portrait can also suggest a shadowy alter-ego that is not necessarily of our own making.

Erica Scourti’s practice interrogates the complex interactions between the subject and their digital double: her video work Life in AdWords (2012-13) is based on a year-long project where Scourti regularly emailed her personal diary to her Gmail account, and then performed to webcam the list of suggested ad-words that each entry generated. A ‘traditional’ portrait in the physiognomic sense (formally, it consists of a series of head-and-shoulders shots of the artist speaking directly to camera), Life in AdWords is also a portrait of the supplementary self that is created by algorithmically generated, ‘personalised’ marketing processes. Pushing her investigation further, Scourti’s paperback book The Outage (2014) is a ghost-written memoir based on the artist’s digital footprint: whilst the online data is the starting point, the shift from the digital to the analogue allows the artist to probe the gaps between the original ‘subject’ of the data and the uncanny doppelgänger that emerges through the process of the interpretation and materialisation of that information in the medium of the printed book.

Other artists explore the implications of representation via physical tracking technologies. Between 2010 and 2015, Susan Morris wore an Actiwatch, a personal health device that registers the body’s movement. At the end of each year she sent the data to a factory in Belgium, where it was translated into coloured threads and woven into a tapestry on a Jacquard loom (a piece of technology that was the inspiration for Babbage’s computer), producing a minute-by-minute data visualisation of her activity over the course of that year. Unlike screen-based visualisations, the tapestries are highly material entities that are both physically imposing (SunDial:NightWatch_Activity and Light 2010-2012 (Tilburg Version) is almost six metres long) and extremely intimate, with disruptions in Morris’s daily routine clearly observable. Morris was attracted to the Actiwatch for its ability to collect data not only during motion, but also when the body is at rest; the information collected during sleep – represented by dark areas on the canvas – suggests an unconscious realm of the self that is both opaque and yet quantifiable.

Susan Morris, SunDial:NightWatch_Activity and Light 2010-2012 (Tilburg Version), 2014. Jacquard tapestry: silk and linen yarns, 155 x 589cm.  © Susan Morris.

Katy Connor is similarly interested in the tensions between the digital and material body. Using a sample of her own blood as a starting point, Connor translates this biomaterial through the scientific data visualisation process of Atomic Force Microscopy (AFM), which imagines, measures and manipulates matter at the nanoscale. Through Connor’s practice, this micro-data is transformed into large 3D sculptures that resemble sublime landscapes of epic proportions.

 

Katy Connor, Zero Landscape (installation detail), 2016.
Nylon 12 sculpture against large-scale risograph (3m x 12m); translation of AFM data from the artist’s blood.  © Katy Connor.

One strand of the People Like You project focuses particularly on how people relate to their medical data. Tom Corby was diagnosed with Multiple Myeloma in 2013, and in response began the project Blood and Bones, a platform for the data generated by his illness. The information includes the medical (full blood count / proteins / urea, electrolytes and creatinine); the affective (mood, control index, physical discomfort index, stoicism index, and a ‘hat track’ documenting his headwear for the duration of the project); and the financial (detailing the costs to the NHS of his treatment). Applying methods from data science to the genre of illness blogging, Corby’s project is an attempt to take ownership of his data creatively, and thus to regain a measure of control over living with disease.

In the final pages of his influential (although now rather dated) book, Portraiture, the art historian Richard Brilliant envisaged a dystopian future where the existence of portraiture (as mimetic ‘likeness’) is threatened by ‘actuarial files, stored in some omniscient computer, ready to spew forth a different kind of personal profile, beginning with one’s Social Security number’ (Brilliant 1991). Brilliant locates the implicit humanism of the portrait ‘proper’ in opposition to a dark Orwellian vision of the individual reduced to data. Writing in 1991, Brilliant could not have foreseen the ways in which future technologies would affect ideas about identity and personhood; comprehending how these technologies are reshaping concepts of the person today is one of the aims of People Like You.

Sophie Day

14 January 2019

2018-2019 Science Cafes are launched at Maggie’s West London

Our series was formally launched with introductions from Kelly Gleason, Cancer Research UK senior research nurse, and Iain McNeish, Head of Division, Cancer (both at Imperial College London & Imperial College Healthcare NHS Trust). Later we heard from Adam Taylor (National Physical Laboratory) about the work of the Rosetta Team, led by Josephine Bunch, which is supported through the first round of CRUK Grand Challenges to map cancer and so improve our understanding of tumour metabolism (https://www.cancerresearchuk.org/funding-for-researchers/how-we-deliver-research/grand-challenge-award/funded-teams-bunch).

To begin with, we learned about the breakthrough presented by tamoxifen in the development of personalised cancer medicine before hearing more about the infinite complexity of cancer biology. Twenty years ago, treatments were given to everyone with an anatomically defined cancer. This was frustrating since staff knew from experience that the treatment wouldn’t work for most people and many patients were disappointed. The introduction of tamoxifen led to stratification based on a common oestrogen receptor. Later, in ovarian cancer, it became clear that PARP inhibitors could be used successfully on approximately 20% of patients, who had inherited particular susceptibilities (in BRCA-1 and BRCA-2). Nonetheless, sub-group or stratified medicine is a long way from the goal of delivering unique treatment to everyone’s unique cancer.

This complexity is clear from the preliminary application of a range of integrated techniques by physicists, chemists and biologists in the Rosetta Team, as Adam then explained. Collaborators map and visualise tumours as a whole in their particular environments, along with their constituents down to the level of individual molecules in cells. In combination, these measures give both a detailed picture of different tumour regions and a holistic overview. Amongst the many techniques are AI methods that we have encountered through Amazon or Tesco platforms, which find patterns by reducing complexity. For example, in one application of varied mass spectrometry techniques, 4,000 variables are reduced to three coloured axes that label different chemical patterns. You can find regions of similarity in the data by colour coding, and explore their molecular characteristics.
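As a rough illustration of that kind of reduction (a sketch only, on synthetic numbers rather than the Rosetta Team’s actual pipeline), the Python below compresses a 4,000-variable ‘spectrum’ per pixel into three components and rescales them so that they can be displayed as red, green and blue colour channels.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for imaging mass spectrometry data: one 'spectrum' of
# 4,000 intensity values for each pixel of a 32 x 32 tissue image.
rng = np.random.default_rng(1)
spectra = rng.random((32 * 32, 4000))

# Reduce each 4,000-dimensional spectrum to three components...
components = PCA(n_components=3).fit_transform(spectra)

# ...then rescale each component to [0, 1] so it can act as a red, green or
# blue channel: pixels with similar chemistry end up with similar colours.
lo, hi = components.min(axis=0), components.max(axis=0)
rgb_image = ((components - lo) / (hi - lo)).reshape(32, 32, 3)
print(rgb_image.shape)  # (32, 32, 3)
```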

Amazon has applied non-negative matrix factorisation to predict how likely we are to buy a particular item once we have bought another specific item. A similar approach enabled McNeish’s group to find patterns among samples of ovarian cancer that had all looked different. The team traced 7 patterns driven by 7 mechanisms among these samples.
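For readers curious what this looks like in practice, here is a hedged sketch of non-negative matrix factorisation in Python, using made-up data and seven components to echo the seven patterns mentioned above; it is not the McNeish group’s analysis, just the general technique.

```python
import numpy as np
from sklearn.decomposition import NMF

# A made-up, non-negative 'samples x measurements' matrix standing in for
# molecular profiles of tumour samples.
rng = np.random.default_rng(2)
profiles = rng.random((200, 500))

# Non-negative matrix factorisation: profiles is approximated by
# weights @ patterns, with seven underlying patterns.
model = NMF(n_components=7, init="nndsvda", random_state=0, max_iter=500)
weights = model.fit_transform(profiles)  # how strongly each sample expresses each pattern
patterns = model.components_             # what each pattern looks like across the measurements

# Each sample can then be grouped with others that share its dominant pattern.
dominant_pattern = weights.argmax(axis=1)
print(np.bincount(dominant_pattern, minlength=7))
```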

Embedded in the study of cancer’s biology and chemistry, data scientists ‘know that these are not just numbers. They know where the numbers come from and the biological and technical effects of these numbers.’ Non-linear methods such as t-SNE (t-distributed stochastic neighbour embedding) help in the analysis of very large data sets. Neural networks have also been developed for use in a hybrid approach, where a random selection of data is analysed with t-SNE to provide a training set for neural network applications, which are then validated using t-SNE methods on another randomly selected chunk of data.
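A minimal sketch of that hybrid workflow, on invented data and with arbitrary parameter choices, might look like the following: embed one random chunk with t-SNE, derive cluster labels from the embedding, train a small neural network to predict those labels from the raw features, and then compare its predictions against a fresh t-SNE analysis of a held-out chunk.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.manifold import TSNE
from sklearn.metrics import adjusted_rand_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
data = rng.random((600, 50))          # stand-in for a much larger spectral dataset
idx = rng.permutation(len(data))
train, hold_out = data[idx[:300]], data[idx[300:]]

# 1. Analyse a random chunk with t-SNE and derive labels from the embedding.
train_embedding = TSNE(n_components=2, random_state=0).fit_transform(train)
train_labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(train_embedding)

# 2. Use those labels as a training set for a small neural network that works
#    directly on the original high-dimensional features.
net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
net.fit(train, train_labels)

# 3. Validate: repeat the t-SNE-plus-clustering analysis on a second random
#    chunk and measure how well the network's predictions agree with it
#    (the adjusted Rand index ignores arbitrary cluster numbering).
hold_embedding = TSNE(n_components=2, random_state=0).fit_transform(hold_out)
hold_labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(hold_embedding)
print("agreement:", adjusted_rand_score(hold_labels, net.predict(hold_out)))
```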

This approach combines fine-grained detail with broad pattern recognition in different aspects of tumour metabolism. It might lead to the development of a ‘spectral signature’ to read the combined signature of thousands of molecules at diagnosis.

At the end of the evening, most of us revealed anxieties about the attribution of a wholly singular status through personalising practices. Those affected by cancer wanted the ‘right’ treatment for them but we were reassured by the recognition that we also share features with other people. We appreciated the sense of combining and shifting between the ‘close up’, which renders us unique, and a more distant view, where we share a great deal with others.

Many thanks to Maggie’s West London for their hospitality.

 

You and Your (Data) Self

Scott Wark

2 January 2019

You might have seen these adverts on the TV or on a billboard: a man and his doppelgänger, one looking buttoned up and neat and the other, somehow cooler. “Meet your Data Self”, says the poster advert on the tube station wall I often stare at when I’m waiting for the next train. In smaller type, it explains: “Your Data Self is the version of you that companies see when you apply for things like credit cards, loans and mortgages”. And then: “You two should get acquainted”.

This advert has bothered me for quite a while. I’m sure that’s partially intentional—whether I find it funny or whether I find it irritating, its goal is to make the brand it’s advertising, Experian PLC, stick in my mind. I find the actor who plays this everyman and his double, Marcus Brigstocke, annoying—score one to the advert. Beyond Brigstocke’s cocked brow, what bothers me is that this advert raises far more questions than it answers.

Who is this “Data Self” it’s telling me to get acquainted with? Is this person really like me, only less presentable? What impact does this other me have on what the actual me can do? And—this question might come across as a little odd—who does this other me belong to?

Experian is a Credit Reference Agency, so presumably the other ‘me’ is a representation of my financial history: how good I am at paying my bills on time; whether I’ve been knocked back for a credit card or overdraft; even if I’ve been checking my credit history a lot lately, which might come across as suspicious. Banks, credit card companies, phone companies, car dealers—anyone who might extend you credit so you can get a loan or pay something off over time will check in with agencies like Experian to see if you’re a responsible person to lend to.

As a recently-finished PhD student, I’ve no doubt that my other me is not so presentable, to use the visual metaphor presented by this advert’s actor/doppelgänger. A company like Experian might advise another company, like a bank, to not front me money for the long summer holiday I’m dreaming of taking to Northern Italy as I wait for the next packed tube. This “me” might not be trustworthy. Or, to put it another way, this “me” might not indicate trustworthiness.

The point of this advert is to get me to order a credit report from Experian so that I can understand my credit history and so that I can build it up or make it better. This service is central to the contemporary finance industry, which has to weigh the risk of lending money or extending credit to someone like me against the reward they get when I pay it back. If I want to be a better me, it suggests, I ought to get better acquainted with myself—or rather, my data self. If I want that holiday, its visual metaphor suggests, I’d better straighten my data self’s tie.

There’s lots more that might be said about how credit agencies inform the choices we can make and handle our data. One of the more straightforward comments we might make about them is also one that interests us most: This other, data “me” isn’t me. This is perhaps obvious—the advert’s doppelgänger is a metaphor, after all. It’s a person like me, it’s constructed from data about me, and it influences my life, but it’s not me. But this also means that This other, data “me” isn’t mine.

This advert presents just one example of the many data selves produced when we consciously or inadvertently give up our data to other companies. In this case, we agree to our data being passed on to credit rating agencies like Experian every time we get given credit. What’s interesting about this data self is that whilst it isn’t you, it has an effect on a future version of you—in my offhand example, a you who might be holidaying in Italy; or, more problematically, a you who might need an overdraft to make ends meet month-to-month. To riff on our project’s title, these data selves are, quite literally, people like you. They might not be you, but they have a real effect on your life.

We need to do a lot more work researching who these datafied versions of ourselves actually are and what effect they have on being a person in our big data present. As Experian point out in another campaign fronted by food writer and austerity campaigner Jack Monroe, several million U.K. residents are “invisible” to the country’s financial services because they don’t have a credit profile. Conversely, we might ask, what does it mean to be a person in our big data present if who we are is judged on our data doppelgängers? What does it mean when my other “me” isn’t mine—when it’s opaque, confusing, and sold to me as a service?

Countless other digital platforms and services create both fleeting and lasting “data selves” that are used to try to sell us products, for instance, or to better tailor services to our needs. This process is called “personalisation”. One of the things we want to ask as part of our research project is this: who are we when who we are is determined by who we are like? Credit Reference Agencies and the “data selves” they produce make this tangled question tangible, but it applies to many other areas of contemporary life—from finance to medicine, from our participation in digital culture to our status as individuals, actors, citizens, and members of populations. This question raises others about what it means to be a “me” in the present. These are the questions, I think, that bind this project together.

 

For more information about Credit Reference Agencies, see the Information Commissioner’s Office information page.

What is Personalisation?

William Viney

26 November 2018

Personalisation is at once ubiquitous in contemporary life and a master of disguise. Its complexity hides in plain sight. Personalisation may mean producing products and services to ideas of individual demand, but it also means much more than this. Personalisation connects diverse practices and industries such as finance and marketing, medicine and online retail. But it also goes by many aliases – patient-centred, user-oriented, stratified and segmented – in ways that can make it hard to follow. It’s not always clear what personalised products and services share in common.

The ‘People Like You’ project does not shy away from this diversity. It works across the fields of medicine, data science, and digital culture to understand the differences in each of these domains, as well as how people and practices work across them. One challenge of understanding emerging practices that are forming within and between particular industries is that histories of personalisation may be contested, sensitive, or rapidly developing. We want to find ways to explore different meanings of the term ‘personalisation’ in the United Kingdom, among people from different working backgrounds: academic and commercial scientists in biomedicine, biotechnology and pharmacology; public policy; advertising and public relations; communications; logistics; financial analysis. So we have designed a study that might be the first of its kind in the UK – an oral history of personalisation.

The ‘What is Personalisation?’ study uses stakeholder interviews to establish how and why each industry personalises, and with what techniques of categorisation, monitoring, tracking, testing, retesting, aggregation and individuation. These interviews are in-depth and semi-structured. They usually last an hour or more. Interviews allow us an opportunity to understand how a particular individual views their work, industry, profession or experience.

A wide range of policy makers, activists, scientists, technologists, and healthcare professionals have already participated, detailing how they see the emergence of personalisation affecting their lives. Striking themes have revealed just some of the connective aspects of personalised culture: the links between standardisation, promise and failure; how languages of democratic and commercial empowerment contest state, regulative, or market legislative and economic power; how products or services can treat prototyping as a continuous process; the influence of management and design consultancies; and the way mobile technologies interpret data in real time to produce ‘unique’ experiences for users. These are just some of the ideas that we have talked about during our interviews. We also get to discuss when and how these ideas emerged and became popular in a given industry, field or policy area.

The connections that can be made across different fields, practices, or industries can be contrasted to the highly specific emergence of personalisation in some areas. For instance, the special confluence of disability and consumer rights activism that formed alongside and, at times, in opposition to deregulation in healthcare systems in the late 1980s created individual (later personalised) health budgets, now an important policy instrument used by the National Health Service’s personalised care services. The challenge is to understand the historical and social formation of a particular patch in personalisation’s history, its various actors and networks, and to recognise adjacent and comparable developments. We are doing this whilst recognising broader patterns that are germane to other contemporary figures of personalisation. One of these may be the specific inclusion and exclusion factors that prevent a personalised service becoming a mass standardised service. Another is to understand whether personalisation is being heralded as a success or as a response to failure – not the best of all available options but an alternative to foregone possibilities.

Our work takes patience and a lot of help from those who are passionate experts in their field. If you feel you have an experience of personalisation that would make an important contribution to this study then please get in touch with William Viney (w.viney@gold.ac.uk).