Amy Guy, <[email protected]>
The work of members of the SSI community intersects with a number of existing broad-reaching ecosystems (such as the Web), as well as introducing new concepts, platforms, and ways of engaging with the world. There are many moving parts and a great deal of complexity, and it's next to impossible for a person or team focusing on solving a specific problem to be aware of everything else in the space at the same time. It is difficult to predict how the work you are doing in one place may impact the work of others in the community, or the end users of the various systems.
Many technologies are not used in the ways their creators intended, which can be very positive or very harmful. Whilst technology designers can't be held responsible for every possible future implementation - in contexts which may not even exist today - it is important to think critically about technologies which have the potential to directly and indirectly affect large numbers of people.
This is particularly important as the global and networked reach of some of these systems can have an amplifying effect on social phenomena. Technology which poses a threat or provides a benefit to a small number of people may result in the same threat or benefit becoming commonplace across a larger population.
It can be incredibly difficult to consider, or to give credibility or appropriate weight to, scenarios outside of your own direct experience, even when you have the best of intentions. It can also be very daunting or discouraging to attempt to do so. Sometimes it feels like a distraction or digression from the 'real work' you need to get done. Perhaps you believe very strongly in your approach, and are quietly afraid that if you think too hard about the impact of what you are building on someone far away, in different circumstances from your own, you might discover a negative consequence that doesn't justify the benefit you create for those around you.
This should not be an exercise in shutting down good work, but a push for extra effort towards mitigating harms and widening access to the benefits. A frank and honest discussion of potential harms is not an attack, but an opportunity for improvements.
I believe it is vital for the community to engage in such discussions. Where to start?
If you have seen the TV show Black Mirror, ask yourself: if the technology I am working on appeared in an episode of Black Mirror, what would the story be?
The Societal Impacts Questionnaire is a work-in-progress draft by the W3C Technical Architecture Group. The focus is on Web Platform technologies, but the questions are generic and can be applied in the SSI and related communities as well.
This initial set of questions is intended to help technology designers and implementers think about perspectives other than their own, as well as worst case scenarios.
- What kinds of activities do you anticipate your technology becoming a critical part of? What problems occur if something becomes temporarily or suddenly unavailable?
- What kinds of activities could your technology become a part of that you are not designing explicitly for? Are there other groups or communities you could reach out to in order to discuss this potential further?
- What risks do you see in features of your technology being misused, or used differently from how you intended? Think about worst case scenarios. Think about how the threat model changes if your technology is used in ways you did not design for. What are potential unexpected consequences?
- Can users of a platform choose not to use particular features of your technology? Can people give meaningful consent to the use of these features? How easy is it for people to opt out?
- What groups of people are excluded from using features of your technology? How accessible and inclusive is your technology?
- In what ways might your technology impact minority or historically disadvantaged groups? Think about parts of society who may have trouble getting their voice heard, or their needs taken seriously; who have faced prejudice and discrimination in the past, or still do so today.
- What are the power dynamics at play in implementations of your technology? How is power granted or removed? Where is power concentrated in the ecosystem of which your technology is a part?
- What points of centralisation may appear in implementations of your technology?
- To what extent do the features in your technology impact the natural environment?
- What is the expected lifetime of the various features of your technology? Do you have a strategy for deprecation, replacement or improvement? What happens if upgrades are applied unevenly across implementations?
- What do you need to consider as part of a full privacy audit of your technology?
- What do you need to consider as part of a full security audit of your technology?
I would like to see a safe and open space at RWOT where attendees can begin to think through honest responses to some of these questions for particular technologies or projects.
I would also value input from the community at RWOT about the applicability of these questions to the SSI and related ecosystems, how they can be expanded upon to be useful, and what kinds of things are missing that we should encourage ourselves and our colleagues to think deeply about.