This past week I spent my time at eyeo in Minneapolis, a gathering of creative coders, artists, and data lovers. At eyeo you find amazing people. Like a sculptural data artist who takes wildfire data, codes it, and creates large-scale sculpture in a new context – in this case, her interpretation of the fire's effects on the environment and on her personally. Or you could sit in on a show & tell and find people who make ridiculously cute cat cartoons that appear when you open a new browser tab – simply to elicit a smile and break the monotony of someone's workday.
I instantly loved the warmth, the people, and the venue – the kind of un-conference community I enjoy. I opt for gatherings like eyeo that put me outside my comfort zone, and then wait for the connections to hit. As a social scientist who studies how people engage with products and services – my industries are emerging technologies, healthcare, and social impact – I wondered how I'd fit in. I anticipated a fit between myself and those who want to know how their techie creations are being used, but I spent the first 48 hours struggling to understand my relationship to the coders, animators, and artists who made what they made for sometimes very personal reasons – perhaps never intending for many to see or use their creations.
During Adrien's 'post data experience' talk, I got the sonic BOOM:
Artists are compelled to authenticity.
They get humanity. My work is a quest for authenticity, and truth. Honoring humanity by approaching people as fellow humans, not buyers or consumers. Creative coders, artists and designers seek the truth too. Data sits in the middle of us.
Adrien (whose work obviously blew me away) wrangles data that few outside of a climate analyst or forestry buff would pick up, into something new and meaningful. Later in the talk, she began to question data and what constitutes the truth. She showed several examples of the same content represented as data, each looking very different and communicating an entirely different message. I started to see how she and others in the community turn data on its head to expose various layers of truth – or to create a new truth. This is analogous to the depth of experience I strive for when I examine an individual's behavior, emotional reactions, environment, and context in my research. And then I began to see connections with many other speakers who like to mess with data for the sake of the process.
That same day I listened to a panel called Topologies, Warezpunk, and Human-Scale by a map maker, an interaction designer, and a software developer, respectively. Jesse Kriss (code/pixel/sound engineer) asked us what the world would be like if we made 'humane technology' – if we "created technology as if all people mattered." He gave examples of how products fail to address major shortcomings, and how technology can take away agency at the same time that it gives you things you don't want – like the stocks app or U2 songs you can't get off your phone. He then discussed the idea of calm, non-intrusive technology, and how to make it. This immediately struck home. Data is a tricky concept. It is inherently based on the individual, but now that we are inundated with data in every facet of our lives, a lot gets lost when we aggregate. When we 'collect facts to make the basis of reasoning' (the Wiki definition), what are we leaving out? Fully half of what I consider data is observational inference, unspoken communication, cultural cues, and environment. I shudder to think where (if?) those pieces of the puzzle land in the giant aggregations we now use to benchmark society. Calm technology would have to think about how the individual consumes it to be truly calm.
When I did my show & tell talk on Empathic Immersion, my point was to show the community that social scientists approach people as humans, not as customers. We count many things as data and truth that others do not. Think applied social anthropology – but with humans, not birds and monkeys. I wondered which examples would resonate with the crowd. Would they get it? The one that many people later came to talk with me about was a personal safety device designed for women (though certainly not gender exclusive). The small, wearable device alerts friends and family when a woman is in danger, and calls the police. As part of the company's social responsibility mission, it was given free to women in countries where human trafficking is epidemic. Not only did it fail to keep a single girl safe, no one ever used the device at all. In these countries, safety is determined by social network, not authority. In fact, calling the police could invite harm on its own.
Empathic Immersion looks for cultural norms and ideological conventions, and counts them as data. This discovery would have been quite challenging to obtain in a focus group, usability lab or survey. Developing rapport with girls and going (way) off the beaten path of questioning was the only way to get at that truth. Turns out, data at that individual level was the key to creating a safety device that could actually keep girls safe.
So thanks, eyeo community.
For a few days of indulgent, hurt-your-brain exploration. As I continue to connect the data dots, it is inspiring to see the convergence and talk it through with creators and fellow data lovers. I keep thinking about the role of empathic research in creating humane technology, finding authenticity, and making things as if all people matter.
In the end, data should serve humanity. I got the impression that those I've met in the past week work hard to ensure that mission comes to pass.