What is Context?

What Some Call Context is Literally its Opposite.

What is Context?
Peter Max’s cover art for “Global Information Infrastructure: Agenda for Cooperation,” the 1995 report in which US Vice President Al Gore promoted his term “Information Superhighway.”
“Every place deserves an atlas, an atlas is at least implicit in every place, and to say that is to ask first of all what a place is. Places are leaky containers. They always refer beyond themselves, whether island or mainland, and can be imagined in various scales, from the drama of the back alley to transcontinental geopolitical forces and global climate. What we call places are stable locations with unstable converging forces.” — Rebecca Solnit, Infinite City

Context is critical to the development of automated systems. It is the principle behind them, in fact: that some point of data correlates to some other point of data. A sensor sees a leaf turn red: it must be autumn in New York. Turn up the thermostat, but just a touch. Link that sensor to a thermostat in Atlanta, though, and everybody sweats.
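As a minimal sketch of that failure, here is the leaf-and-thermostat rule in Python. Everything in it is hypothetical (the sensor flag, the two-degree nudge); the point is that the rule encodes a correlation without the context that makes the correlation hold:

```python
# A deliberately naive automation rule. The sensor flag and the
# two-degree nudge are hypothetical stand-ins, not any real API.

def adjust_thermostat(leaf_is_red: bool, current_temp_f: float) -> float:
    """Infer the season from a single data point, then act on it."""
    if leaf_is_red:
        # The system "concludes" it is autumn and nudges the heat up a touch.
        return current_temp_f + 2.0
    return current_temp_f

# In New York the correlation holds: a red leaf really does mean cool weather.
print(adjust_thermostat(leaf_is_red=True, current_temp_f=66.0))  # 68.0

# Link the same sensor to a thermostat in Atlanta, where it may still be 80°F
# when that leaf turns, and the rule misfires: everybody sweats.
print(adjust_thermostat(leaf_is_red=True, current_temp_f=80.0))  # 82.0
```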

Automated systems invent context from data, so it’s important to understand what that data represents, where it came from, and how it was gathered before the machine draws any conclusions. This helps us identify bias, misapplication, bad calibration, and the many other gremlins of automation.

Data science — which has invented the ways we extract data for automated systems more by historical accident than by deliberate design — begins with abstraction. Real-world events are transformed into representations written down in a language of numbers. The “behavior” of these numbers is then observed, and conclusions are drawn. An automated system then enacts those conclusions out in the world.

When it comes to systems that follow concrete sets of rules, this makes sense. Chemicals change at certain temperatures, for example, and leaves will not turn red if they’ve already fallen from the tree. Some things are predictable.

Many things, however, are not. The numbers can be said to represent one thing, but we have no way of knowing whether they really do. The way we collect that data — the way we phrase questions, the way we interpret answers — influences what we get.

This is particularly true in observing human social behavior. As Catherine D’Ignazio and Lauren F. Klein write in Data Feminism: “data are not neutral or objective, they are products of unequal social relations, and this context is essential for conducting accurate, ethical analysis” (149).

Automating systems around human social behavior, especially mass social behavior, can be a bit like putting a parrot into a birdcage and drawing conclusions about the way it squawks. Wild parrots do not seek crackers, and when they want food, they don’t ask for it in English. Instead, we train parrots with a limited set of symbols (words), and when they repeat those symbols, we easily assume that we understand their intent. Humans are not parrots; humans reduced to data points, however, could be.

Durkheim, the grandfather of sociology, noted early on that abstraction wasn’t all bad, depending on how we use it. Bad abstraction is manipulative. A questionnaire may ask for race but omit several options, forcing respondents to parrot the questionnaire’s categories rather than their reality. Or it may ignore the social factors behind disproportionate graduation or employment rates. Bad abstraction accumulates socially constructed assumptions and repeats them until, metaphorically speaking, we start believing that wild parrots want crackers.
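To make that failure mode concrete, here is a minimal sketch in Python of how an incomplete questionnaire bakes its assumptions into a dataset. The option list and the function are hypothetical, invented purely for illustration:

```python
# A hypothetical intake form that hard-codes an incomplete set of categories.
# Anyone outside the list is coerced into "Other": the form's abstraction,
# not the person's reality, is what enters the dataset.

RACE_OPTIONS = ["White", "Black", "Asian", "Other"]  # deliberately incomplete

def record_race(self_description: str) -> str:
    """Reduce a person's self-description to whatever the form allows."""
    return self_description if self_description in RACE_OPTIONS else "Other"

print(record_race("Black"))                    # "Black"
print(record_race("Pacific Islander"))         # "Other" -- the detail is gone
print(record_race("Mixed, Lakota and Irish"))  # "Other" -- so is this
```

Every record that passes through record_race() looks clean and consistent downstream, which is exactly the problem: the coercion happens at collection time and leaves no trace in the data.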

Understanding context can fill in what abstraction leaves out. Where abstraction loses granular details, the quest for context means looking for them. A science of context would be a science of exceptions, feedback, relationships, and interactions.

The problem is, in the tech world, designers have abstracted the idea of context into something else entirely.

Cover illustration for “The Group Context” by Sheila Thompson

First, some definitions. Context comes from the Latin contexere, “a weaving together,” a compound of com- (“with”) and texere (“to weave, to make”). Sometime during the 1560s, it began to be applied to writing: “the parts of a writing or discourse which precede or follow, and are directly connected with, some other part referred to or quoted.” A dictionary from 1911 defines it, curiously, as “texture,” that is, “the entire texture or connected structure of a discourse.”

Today, Oxford tells us that context is “the circumstances that form the setting for an event, statement, or idea, and in terms of which it can be fully understood and assessed.”

The original sense of the word described a shared, interactive process: context was made, together. This is still true. After a century of the Gutenberg press, however, English speakers began applying the word to stored information (books). Today it is understood as an abstraction, a series of events observed rather than co-created.

In designing technology, especially interactive and automated technologies, context has, paradoxically, come to mean something else entirely. Here, I argue, is where all the trouble starts.

What Some Call Context is Literally its Opposite.

Context in tech design does not refer to social context or “the circumstances that form the setting of an event.” It doesn’t refer to people at all; it refers to how the technology being designed will be used. The context is the place where tech gets dropped. But as Solnit writes: “Places are leaky containers. They always refer beyond themselves.”

Here’s one example. “Context-Driven Design” refers to contexts of use: essentially, whether your phone is being held vertically or horizontally, or whether you’re on a train or in the bathroom. This sense of context assumes the technology is the center of the discourse, and the “weaving together” is how it is woven into the life of a specific user.

That’s not the “Social Context” of technology. In Context-Driven Design, “Social Context” refers to things such as “who else is nearby” when an app is running. Robert Scoble’s The Age of Context (2013) is about tech development within the context of other tech development, highlighting the “connected structure” of wearable computing, big data, sensor data, and social networking. (Appropriately, he delivers the talk wearing Google Glass frames.)

Likewise, “Environmental Context” refers to whether the user is indoors or outdoors, or might use the product differently depending on the weather. It does not consider, for example, climate change, or the environmental cost of extracting the materials the product is made from. Again, context is centered on the technology’s use rather than on understanding its impact.
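Taken together, these industry senses of context fit comfortably into a single data structure. Here is a hypothetical sketch in Python; the field names are mine, not drawn from any cited framework. Notice that every field describes the device’s situation, and none describes the social world the device lands in:

```python
from dataclasses import dataclass

# A hypothetical "context" object in the industry's sense: contexts of use,
# "Social Context" as who is nearby, "Environmental Context" as the weather.
# All field names are illustrative, not drawn from any real framework.

@dataclass
class UsageContext:
    orientation: str           # "portrait" or "landscape"
    location_type: str         # "train", "bathroom", "office", ...
    nearby_devices: list[str]  # the industry's "Social Context"
    is_indoors: bool           # the industry's "Environmental Context"
    weather: str

# Every field answers "how will the product be used right now?"
# None can represent non-users, displacement, or climate.
ctx = UsageContext("portrait", "train", ["watch-41f2"], True, "rain")
print(ctx)
```

Nothing in this record can express the leaky edges of a place; the whole “setting of an event” is reduced to a weather string and a list of gadgets.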

We can go back to 1994 to find the original definition of “Context-Aware Systems” — this time in computer science — which takes a counterintuitive view of context right from the abstract:

We believe that a limited amount of information covering a person’s proximate environment is most important for this form of computing since the interesting part of the world around us is what we can see, hear, and touch.

Is it, though? This defines context not as a lens for understanding technology’s place in a broader social and cultural discourse, but as one that specifically limits and restricts access to that discourse.

Let’s be generous here. It might be unfair to examine design jargon this way. Language has its contexts too, and in this case language as internal industry shorthand varies widely from the way we use that language in our broader social context.

The issue I have with these definitions of context is that they reflect a myopic focus on users rather than the world outside the frame. By focusing so narrowly on users as the context, they avoid any attempt to understand the impact on non-users.

Non-users create the social context in which the user operates.

The user is not a social context in and of themselves. That assumption reflects the weirdest strand of the Californian Ideology, which emphasizes tech’s role in assuring the liberty of the individual over the needs of those around them. It’s not socially responsible design; it’s design for social withdrawal and alienation. It is also bad design, because it leads to alienation and frustration among users — but perhaps tech companies have different priorities.

When I think of context, I think of its root: the spaces that we weave together. Understanding context means widening the scope of the “discourse” we look at to include more pieces of our co-dependent systems, to see what flows into and out from a piece of technology’s introduction to those systems. That’s an approach to context that puts our relationships, exchanges, and values at the center, rather than “users.”

Obviously, designing technology for corporations was not meant to be a revolutionary social project. But since it turned into one, anyway, maybe it’s time to decide what kind of revolution it ought to be.

A pie chart showing four areas for Design Excellence: Mobility, Enhancement of the Natural Environment, Preservation of Community Values, and Safety.
This 1998 pixelated disaster of a visual, ironically titled “Design Excellence,” comes from the “Beyond the Pavement” conference proceedings published by the Maryland State Highway Administration.

Beyond the Pavement

The most aligned set of design principles I could find comes from… The Maryland State Highway Administration.

Highway planning doesn’t exactly have the track record you’d want tech to follow. It has a long history of destructive ignorance, with highways plowing through neglected parts of cities, driving out ethnic minorities, and contributing to redlining and environmental destruction. But let’s be honest: the worst parts of the tech sector are contributing to similar problems today, on broader scales.

As American urban planning started to grapple with its many failures, it eventually arrived at Context Sensitive Design. I can’t speak to its success or failure as a protocol. What I can say is that, on paper, it is surprisingly close to where I had assumed “context-driven” practices were looking.

Here’s the list of eight Context-Sensitive Principles drawn up at the 1998 “Beyond the Pavement” conference for planning roads and highways (see page 3):

  • The project satisfies the purpose and needs as agreed to by a full range of stakeholders.
  • This agreement is forged in the earliest phase of the project and amended as warranted as the project develops.
  • The project is a safe facility for both the user and the community.
  • The project is in harmony with the community, and it preserves environmental, scenic, aesthetic, historic, and natural resource values of the area, i.e., exhibits context sensitive design.
  • The project exceeds the expectations of both designers and stakeholders and achieves a level of excellence in people's minds.
  • The project involves efficient and effective use of the resources (time, budget, community) of all involved parties.
  • The project is designed and built with minimal disruption to the community.
  • The project is seen as having added lasting value to the community.

We already think about our technology as highways, in the Al Gore sense; designing technology as if it were transportation infrastructure might begin to make sense, too. Not in the smart-cities sense, but in the sense of grappling with the same set of historical legacies that highways have imposed on communities: tearing down poor and minority neighborhoods, displacing historical centers of gathering and interaction, and fundamentally retooling ideas of mobility toward the individual driver instead of collective mass transit.

Nobody wants a highway when they could have a park. The equivalent for apps is out there, if only we’d take some responsibility for it.

P.S.

I should give a tip of the hat to Batya Friedman and David Hendry’s Value Sensitive Design, which is less about context-driven principles and more about context-driven methods. The methods matter significantly more.


Things I am Doing This Week

The new album is out this week. I’ve created a website with lyrics, videos, and credits if you’re inclined to check it out. You can still get physical copies, too, over at Notype.

Things I am Reading This Week

####

Latin American perspectives on datafication and artificial intelligence: Traditions, interventions and possibilities

Special issue of Palabra Clave, h/t to AllModels.ai

This journal issue (linked in English, also available in Spanish and Portuguese) explores AI through the lens of South American media studies, with a lot of good deeper reading for those of you interested in non-British, non-American media studies approaches. Heavy emphasis on decolonization and participatory data systems, categorized by, in the opening editorial’s words, “infrastructure (or the material dimension), imaginaries (identifying how people make sense of and relate to technology), and practices (or the agentic dimension of what people do with technology).”

####

What Public Art Might Look Like After the Pandemic

Furtherfield

Read about a project that is democratizing art in London’s Finsbury Park, where 50% of its participants have never been to an art museum. It’s also a refreshing case for the use of blockchain for governance that doesn’t involve wealth production: “Art history lied when it said art is only made by a lone genius and bought by fat cats to line their pockets and walls (in that order) because creativity and radical imagination are not things individuals have but something collectives do.”

####

The Internet Was Built for Connection. How Did It Go So Wrong?

Richard Seymour

A history of the WWW pairing timelines of specific forms of development and its consequent forms of alienation, asking how else it might have been: “the history of internet technology shows that there have long existed alternatives to our present digital derangement.”