Community Data as Collective Care

Abigail Feldman
11 min read · Mar 23, 2021

Why designers should empower communities to steward their own data.

Society if data was used for public good.

As sensing technologies and ubiquitous computing continue to make their way into public spaces, designers, civic planners, and urbanists have dared to imagine a world without traffic, where energy-efficient sensors reduce our ecological footprint and the internet is accessible to everyone (Rujan, 2018; Templeton, 2020). A world where smart lampposts and integrated security systems mean that women never have to walk home alone in the dark and communities are protected (Broom, 2019). What makes such a vision possible? Data.

As designers, we love to imagine this world because it showcases how the interfaces we build can facilitate public good. But how can we ensure that smart cities live up to such utopian visions? What do we need to know about data collection to make sure smart cities actually serve their citizens? While imagination is integral to the design process, we also have to be responsible for the practical implications of a design's implementation. We must determine how an imagined interface would function, how it would impact the lived experience of users, and whether that impact would amount to public good.

Public Good = Empowering the Public

What is public good? Let’s consider the following scenario: a bridge that connects members of a community separated by a river could be an example of public good. But would a bridge continue to be public good in a community that isn’t centered around a river? What if the community is centered around a river, but already has a complex and efficient canal system? What if community members don’t want a bridge? Without accounting for the idiosyncrasies at hand, a new bridge becomes a hindrance rather than public good.

Case in point: the Borneo Cat Drop likewise illustrates this principle. The World Health Organization (WHO) sent DDT, a potent insecticide, to Borneo to solve a malaria problem. The DDT passed from the sprayed insects to the geckos that ate them and then to the island's cats, which died, leaving an overpopulation of plague-carrying rats. The WHO ultimately had to parachute cats onto the island to restore the balance. Without understanding the complexity and interrelatedness of a community, some seemingly helpful interventions actually do more harm than good (Brooks, 2014).

Designer Bruce Nussbaum questions the benefits of similar endeavors: situations in which outside designers staged community interventions that ended up harming the very communities they meant to serve (Nussbaum, 2010). While the discourse around his provocations varies, one thread stands out: the people who best understand the complexity and needs of a community are its members. Designers should empower community members to determine what public good means for themselves by putting tools directly into their hands to increase autonomy.

Data: “The Things We Measure and Care About”

The technologies that power those utopian designer futures are often packaged together as a 'smart city.' In a smart city, sensors gather data in situ, algorithms mine that data for patterns used in identification, prediction, and recommendation, and actuators respond to it in real time (Ratti & Claudel, 2016). Data can be construed as the activities and patterns of people, but in context, data is proof of our experiences. Data can be a conversation between friends or the birth of a baby. Mimi Ọnụọha succinctly defines data as “the things we measure and care about” (Ọnụọha, 2018). When communities have the power to measure, or to collect data themselves, they can deepen their understanding of their own experiences, advocate for themselves, and manifest their ideals of public good.
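
To make that loop concrete, here is a minimal sketch in Python of the sense-detect-actuate cycle described above. Everything in it is hypothetical: the air quality sensor, the threshold, and the ventilation actuator are stand-ins for whatever a real deployment would use.

```python
import random
import time

def read_air_quality_sensor():
    """Stand-in for an in-situ sensor reading (e.g., PM2.5 in ug/m3)."""
    return random.uniform(5, 60)

def detect_unhealthy(readings, threshold=35.0):
    """A toy 'algorithm': flag a pattern of sustained high readings."""
    recent = readings[-5:]
    return len(recent) == 5 and sum(recent) / len(recent) > threshold

def actuate_ventilation(on):
    """Stand-in for an actuator, e.g., a transit-station ventilation fan."""
    print("ventilation", "ON" if on else "OFF")

readings = []
for _ in range(20):
    readings.append(read_air_quality_sensor())       # sense
    actuate_ventilation(detect_unhealthy(readings))  # detect + respond
    time.sleep(0.1)
```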

Data as Autonomy

When users have access to such data, they can reflect on it. I do this using Untappd, an app I use to ‘check in’ the beers I drink. I look back at my check-ins and use the beer, date, and location to remember the night I split a rich barrel-aged stout after work with a friend who now lives far away, a memory I would otherwise have forgotten. I can also use Untappd to remember which beers I liked and which I didn't, and use that knowledge to better understand what might be fun to try in the future. Accessing my data through Untappd brings me closer to intimate moments from my past and helps me make informed decisions about what I want to consume.

Data can also help us become more in tune with our bodies. Using my Nike Run Club app, I can review the times, splits, locations, and elevations of my runs and adjust my training based on what I learn. I can monitor my pace in real time and push myself by exact increments. With an Apple Watch I can check my heart rate and recognize when I've hit my threshold. By regularly interacting with my fitness data I can build healthy habits that work for my personal abilities.

Users can also develop a closer relationship with the natural world through multi-species data. With the Audubon app, a tool for birders, I can determine where different birds are likely to be when I'm seeking them out in the field, and from home I can extrapolate where else the cardinals and tufted titmice at my feeder might frequent.

A user's data can empower autonomy, but when outside entities hold the power to measure, that autonomy shrinks. When the collection of a user's data becomes automatic, ubiquitous, and omnipresent, their autonomy is at even greater risk.

When users measure their data, they often do so by logging into apps and entering web spaces that are powered by, or at least connected to, conglomerates like Google and Facebook (Amnesty International, 2019). As data is produced across a wide range of online environments and applications, a massive data ecosystem is generated and made accessible to these companies. This ecosystem enables what Shoshana Zuboff terms “economies of scope and scale,” in which these and other businesses profit by using the data to determine what kind of users people are: what they're interested in, who they're involved with, what they like to do, and what their attitudes, opinions, and beliefs are (Zuboff, 2020).

Knowing this, companies can calculate when a user will be most likely to respond to a stimulus, how often the user needs to interact with it for it to be effective, and what form it should take. While the stimulus is often an advertisement, the Cambridge Analytica scandal highlights that access to personal data has value as a tool not only to push people to buy things, but to alter what they think and, by extension, what they do (Gold, 2018). Shoshana Zuboff describes this as the shift in power from ownership of the means of production to ownership of the means of making meaning: the power behind surveillance capitalism (Zuboff, 2020). This loss of the ability to make meaning, to measure, is the antithesis of autonomy.

The solution to this loss, and to the constant nudging and optimizing, might once have been to close our laptops, turn off our phones, and step outside. In the sensing-driven smart city, this is no longer an option. Embedded, ubiquitous data collection renders the power relations behind it invisible (Galloway, 2004). Intelligent billboards and signs can already detect qualities like a pedestrian's gait in order to tailor advertisements to the specific user (Johnson, 2017). Sidewalk Labs, a subsidiary of Google's parent company, Alphabet, is piloting a program in Portland that uses smartphone data to track users' movements in order to study and adjust transportation infrastructure. Sidewalk Labs claims that all data will be anonymized (Templeton, 2020), but privacy researchers and legal experts note that precise geolocation data is nearly impossible to anonymize. The New York Times Privacy Project demonstrates how easily identity can be extrapolated from location in pieces like “Twelve Million Phones, One Dataset, Zero Privacy,” which explores the personal information revealed by the location pings of over 12 million Americans (Thompson & Warzel, 2019).
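
To see why location traces resist anonymization, consider a minimal sketch with invented data (an illustration, not the Times's dataset or methodology): a device's most frequent nighttime and daytime coordinates approximate a home and a workplace, and that pair is often unique enough to join against public, address-keyed records like property rolls or voter files.

```python
from collections import Counter

# Hypothetical data: each ping is (device_id, hour_of_day, lat, lon),
# with device_id a random token rather than a name ("anonymized").
pings = [
    ("a91f", 23, 45.5231, -122.6765),  # night ping -> likely home
    ("a91f", 2, 45.5231, -122.6765),
    ("a91f", 10, 45.5152, -122.6784),  # day ping -> likely workplace
    ("a91f", 14, 45.5152, -122.6784),
]

def anchor_points(device_pings):
    """Infer a device's likely home and work from ping frequency."""
    night = Counter((lat, lon) for _, h, lat, lon in device_pings
                    if h >= 21 or h <= 5)
    day = Counter((lat, lon) for _, h, lat, lon in device_pings
                  if 9 <= h <= 17)
    return night.most_common(1)[0][0], day.most_common(1)[0][0]

# A (home, work) coordinate pair is close to unique; joining it against
# any public record keyed by address recovers a name, so the
# "anonymous" trail is anonymous no longer.
home, work = anchor_points(pings)
print("home:", home, "work:", work)
```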

Beyond the manufacture of behavior, the stakes of ubiquitous data collection grow more sinister. Those with access to location data can use it to infer identity at events like protests by connecting smartphones present at the event to users' personal location histories from before and after (Thompson & Warzel, 2019). Amazon has begun forming partnerships with police departments in cities like Jackson, Mississippi, that allow police to access video footage from Ring, Amazon's home security system (Guariglia, 2020). In some cities, the local government offers discounts that scale with how much public space is viewable in the footage (Guariglia, 2020). When public video surveillance is integrated with facial recognition technology, it becomes a government tool for advanced racial profiling; the Chinese government's use of it against Uighurs, Palantir's partnership with ICE, and documented police bias are but three demonstrations (Crockford, 2020; MacMillan, 2019; Wee & Mozur, 2019). Such tools create a Foucauldian panopticon that suppresses autonomy by breeding behavioral conformity, obedience, and submission, limiting creativity, exploration, and dissent (Greenwald, 2014).

Ethical Options for Designers

Under these circumstances it is difficult, if not impossible, for users to be empowered. To resist, advocacy groups push for bans on facial recognition tools. These bans do keep citizens safe from facial recognition, but focusing on a single identification method doesn't account for technologies that can identify people by their heartbeat, gait, iris patterns, and fingerprints (Schneier, 2020). Designers can look to Europe's General Data Protection Regulation (GDPR) to support transparent data collection that is limited in purpose and scope, accurate, and confidential (Albrecht, 2016). However, while the GDPR lays out a strong foundation of informed consent, it doesn't account for the fact that humans exist in communities of affiliation. Data can capture social interactions like drinks between friends, or subtler associations like sharing the same wifi network (Eveleth, 2020). Multiple people are involved in producing such data, so while only one person may give consent, they may unknowingly implicate those they're involved with.
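
A toy example makes the point about communities of affiliation concrete. In the sketch below, with all names and data invented, one person's consent to a hypothetical wifi-session logger sweeps in everyone who shares the network:

```python
# One member of a shared wifi network consents to an app's data
# collection, but the session records name every device on the network.
consents = {"amara": True, "ben": False, "chika": False}

wifi_sessions = [
    # (network, devices observed together on it)
    ("home-net", ["amara-phone", "ben-phone", "chika-laptop"]),
]

def collect(sessions, consents):
    """Collect sessions for consenting users; note who else is swept in."""
    collected, implicated = [], set()
    for network, devices in sessions:
        owners = {d.split("-")[0] for d in devices}
        if any(consents.get(o) for o in owners):  # one yes is enough
            collected.append((network, devices))
            implicated |= {o for o in owners if not consents.get(o)}
    return collected, implicated

data, swept_in = collect(wifi_sessions, consents)
print("collected:", data)
print("implicated without consent:", swept_in)  # {'ben', 'chika'}
```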

To acknowledge, respect, and empower the affiliations between community members, and by extension between their data, designers must recognize that “we are not here for individual wealth; we are here for community wholeness” (Eveleth, 2020). In practice, designers must create data collection and storage systems that serve not the individual wealth of corporations and data brokers but the collective wellbeing of communities. When designers enact these systems, the motive for data collection can shift from profit to discovery, understanding, empathy, and autonomy.

Urbanist and activist Jane Jacobs wrote, “Cities have the capability of providing something for everybody, only because, and only when, they are made by everybody” (Jacobs, 2016). To give citizens the autonomy to ‘make’ the smart cities they want for themselves, designers can look to tools and systems like Field Kit, Citizen Sense, and the Atlas of Community Based Monitoring in the Arctic, which let users gather their own data directly with sensors. Sharing the resulting data with others through closed-loop platforms creates intention, builds collective value, and strengthens community ties. By putting citizen sensing in the hands of communities, and allowing members to be stewards of their own data, we can resist surveillance capitalism by restoring to users the means of making meaning.
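
What might closed-loop stewardship look like in practice? The sketch below is a hypothetical illustration, not the actual Field Kit or Citizen Sense APIs: raw readings stay with the members who gathered them, and the community publishes only the aggregate it has agreed to share.

```python
import statistics

# A hypothetical "closed-loop" community data pool: members keep raw
# readings under collective control and share only what they choose,
# here an aggregate rather than any member's raw trace.
class CommunitySensorPool:
    def __init__(self, members):
        self.raw = {m: [] for m in members}  # raw data stays per-member

    def record(self, member, value):
        self.raw[member].append(value)

    def community_summary(self):
        """Publish only an aggregate the members agreed to share."""
        all_values = [v for vals in self.raw.values() for v in vals]
        return {
            "n_members": len(self.raw),
            "median_reading": statistics.median(all_values),
        }

pool = CommunitySensorPool(["house_a", "house_b", "house_c"])
pool.record("house_a", 12.0)  # e.g., noise or particulate readings
pool.record("house_b", 18.5)
pool.record("house_c", 14.2)
print(pool.community_summary())  # shared; raw traces never leave the pool
```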

References

Albrecht, J. P. (2016). How the GDPR will change the world. Eur. Data Prot. L. Rev., 2, 287.

Amnesty International. (2019). Surveillance Giants: How the Business Model of Google And Facebook Threatens Human Rights. Retrieved from https://amnestyusa.org/wp-content/uploads/2019/11/Surveillance-Giants-Embargo-21-Nov-0001-GMT-FINAL-report.pdf

Brooks, S. (2014, May 6). Systems thinking: A cautionary tale video (cats in Borneo). Retrieved from https://sustainabilityillustrated.com/en/portfolio/systems-thinking-a-cautionary-tale/

Broom, D. (2019, June 19). The EU wants to create 10 million smart lampposts. Retrieved from https://www.weforum.org/agenda/2019/06/the-eu-wants-to-create-10-million-smart-lampposts/

Crockford, K. (2020). How is face recognition surveillance technology racist? American Civil Liberties Union. Retrieved from https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist/

Eveleth, R. (2020). Dollars For Data. Retrieved October 07, 2020, from https://www.flashforwardpod.com/2020/07/21/dollars-for-data/

Galloway, A. (2004). Intimations of everyday life: Ubiquitous computing and the city. Cultural studies, 18(2–3), 384–408.

Gold, A. (2018, May 16). Cambridge Analytica whistleblower warns of ‘new Cold War’ online. Retrieved October 07, 2020, from https://www.politico.eu/article/cambridge-analytica-whistleblower-christopher-wylie-warns-of-new-cold-war-online/

Greenwald, G. (2014). Why privacy matters. TED. Retrieved from https://www.ted.com/talks/glenn_greenwald_why_privacy_matters

Guariglia, M. (2020) Police will Pilot a Program to Live-Stream Amazon Ring Cameras. Electronic Frontier Foundation. Retrieved from https://www.eff.org/deeplinks/2020/11/police-will-pilot-program-live-stream-amazon-ring-cameras

Jacobs, J. (2016). The death and life of great American cities. Vintage.

Johnson, T. (2017). Smart billboards are checking you out — and making judgments. The Seattle Times. Retrieved from https://www.seattletimes.com/business/smart-billboards-are-checking-you-out-and-making-judgments/

MacMillan, D. (2019). The war inside Palantir: Data-mining firm’s ties to ICE under attack by employees. The Washington Post. Retrieved from https://www.washingtonpost.com/business/2019/08/22/war-inside-palantir-data-mining-firms-ties-ice-under-attack-by-employees/

Nussbaum, B. (2010). Is Humanitarian Design the New Imperialism?. Fast Company. Retrieved from https://www.fastcompany.com/1661859/is-humanitarian-design-the-new-imperialism

Ọnụọha, M. (2018). What is Missing is Still There. Nichons-Nous Dans L’Internet.

Ratti, C., & Claudel, M. (2016). The city of tomorrow: Sensors, networks, hackers, and the future of urban life. Yale University Press.

Rujan, A. (2018). Thinking about becoming a smart city? 10 benefits of smart cities: Explore Our Thinking: Plante Moran. Retrieved from https://www.plantemoran.com/explore-our-thinking/insight/2018/04/thinking-about-becoming-a-smart-city-10-benefits-of-smart-cities

Schneier, B. (2020). We're Banning Facial Recognition. We're Missing the Point. The New York Times. Retrieved from https://www.nytimes.com/2020/01/20/opinion/facial-recognition-ban-privacy.html

Templeton, A. (2020). SimCity In Stumptown: Portland To Get Model Powered By Cellphone Data For Planning. Retrieved October 07, 2020, from https://www.opb.org/news/article/cellphone-location-data-portland-google-privacy/

Thompson, S., & Warzel, C. (2019). Twelve Million Phones, One Dataset, Zero Privacy. New York Times. Retrieved from https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html

Wee, S., & Mozur, P. (2019). China Uses DNA to Map Faces, With Help From the West. New York Times. Retrieved from https://www.nytimes.com/2019/12/03/business/china-dna-uighurs-xinjiang.html

Zuboff, S. (2020). You Are Now Being Remotely Controlled. New York Times. Retrieved from https://www.nytimes.com/2020/01/24/opinion/sunday/surveillance-capitalism.html
