The intellectual challenge of CSCW: the gap between social requirements and technical feasibility
2011/03/04
Introduction. Human activity is highly flexible, nuanced, and contextualized, so computational entities such as information transfer, roles, and policies must be equally flexible. Social-technical gap – the divide between what we know we must support socially and what we can support technically.
CSCW summary. Emphasize social aspects that are most relevant to gap. Limited rational actor model used along with other assumptions.
- social activity is fluid and nuanced (deeply detailed, fine-grained); information needs to be recontextualized to reuse experience or knowledge
- organization members can have differing goals; conflict may be as important as cooperation for resolution
- conflicting goals mean some members hide their goals
- lack of shared meanings/histories means they must be negotiated
- boundary objects – information artifacts that span 2+ groups; allow coordination despite differing interpretations of objects
- exceptions are normal in work processes; roles are often informal and fluid
- people prefer to know who else is present, using this awareness to guide their work
- awareness vs privacy; awareness vs disturbing others
- visibility of communication exchanges and of information enables learning and greater efficiencies, but can also make work more formal and reduce sharing
- norms of use of CSCW system are often actively negotiated among users; should have backchannels for communication
- critical mass problem – need minimum number of users to take off; melt-down problem – falling below threshold leads to collapse
- people do not just adapt to systems, but adapt systems to own needs
- incentives are critical – need reward structure for collaboration
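The critical-mass and melt-down dynamics above can be sketched as a toy threshold model (all function names and numbers here are illustrative, not from the paper):

```python
# Toy model of critical mass and melt-down (illustrative numbers only):
# above a user threshold the community grows; below it, departures accelerate.

def step(active: int, threshold: int, growth: float = 0.05, churn: float = 0.2) -> int:
    """One time step of community size."""
    if active >= threshold:
        return round(active * (1 + growth))   # enough users: value attracts more
    return round(active * (1 - churn))        # below critical mass: collapse

def simulate(start: int, threshold: int, steps: int = 20) -> list[int]:
    history = [start]
    for _ in range(steps):
        history.append(step(history[-1], threshold))
    return history

sustained = simulate(start=120, threshold=100)   # stays above threshold and grows
melted = simulate(start=80, threshold=100)       # falls toward collapse
```

Starting just above the threshold yields growth; starting just below it yields a slide toward collapse, mirroring the critical-mass/melt-down pair.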
The social-technical gap in action. P3P as a privacy standard for the Web. The issue: those writing the client-side portion want automated information handling, but users also want direct control in many particular ways. Users manage an effectively infinite information space – the social means of taking up "faces" must all be possible under P3P, or else a gap will emerge. Categorization and collection into meta-categories are inherently political. Three aligned issues cause the gap:
- systems do not allow sufficient nuance
- systems are not socially flexible
- systems do not allow sufficient ambiguity
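A minimal sketch of the P3P-style matching problem (field names are invented for illustration, not the actual P3P vocabulary): automated client-side handling reduces disclosure to a rigid rule check, with no room for the nuanced, relationship-dependent "faces" users actually present.

```python
# Hypothetical client-side policy matching (field names invented for illustration).
# The agent accepts or rejects a site's declared policy against fixed preferences —
# a binary decision that cannot express "it depends who is asking."

SITE_POLICY = {"purposes": {"admin", "marketing"}, "retention": "indefinite"}

USER_PREFS = {
    "allowed_purposes": {"admin", "current"},  # purposes the user will accept
    "allow_indefinite_retention": False,
}

def acceptable(policy: dict, prefs: dict) -> bool:
    """Automated accept/reject with no social nuance."""
    if not policy["purposes"] <= prefs["allowed_purposes"]:
        return False
    if policy["retention"] == "indefinite" and not prefs["allow_indefinite_retention"]:
        return False
    return True
```

Every disclosure decision collapses into this single boolean – exactly the loss of nuance, flexibility, and ambiguity the three bullets describe.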
Technical research in CSCW. Early dichotomy between designers and social analysts. Achieving greater social nuance and flexibility made explicitness undesirable. A big issue in CSCW is that people and computers are tolerant of different issues and to different degrees.
Arguments against the significance of the gap.
- only a result of ignorance/habit of designers
  - but designers are aware; it just doesn't work
- the social-technical gap will be solved shortly by new technology
  - aka: von Neumann machine architecture is flawed and will get replaced
  - but the gap is still around; assume it will last until proven otherwise
- the gap is a historical circumstance that people will adapt to
  - aka: we should adapt to technologies, or technology will coevolve with us
  - but that doesn't mean we shouldn't try to address it, or at least make it visible when it happens
- but (overall): technological trajectories are a response to social and intellectual direction
What to do? CSCW derives vitality from understanding the fundamental nature of the gap. The gap should become CSCW's intellectual focus, reconceptualizing the field as a science of the artificial.
A return to Simon: the science of CSCW. Simon claims engineering and design are fundamentally different from the sciences. Engineering is about synthesis of the artificial; other such fields include economics, organizational science, computer science, and artificial intelligence. Simon's two main shortcomings: (1) he confused the task of identifying fundamental issues with specific technical ideas/implementations; (2) he did not confront long-term, systemic incapability as a possibility. CSCW as a science of the artificial: both the engineering of systems for collectives and a social science of the basis for constructing systems in the social world. CSCW centralizes the gap between what is preferred and what can be achieved. Practical achievement requires: palliatives to ameliorate current social conditions; first-order approximations to explore the design space; fundamental lines of inquiry to create the science.
Palliatives: ideological, political, and educational. Centralizing the gap makes these coherent. The ideological palliative prioritizes the people using the system – e.g. stakeholder analysis, participatory design, the Scandinavian approach. The educational palliative argues for the fundamental nature of social requirements.
Beginning systemic exploration: first-order approximations. First-order approximations are tractable solutions that partially solve problems with known trade-offs (Simon's approximations). They are constructed from experimentation or derived from theory; here they are used to work around the social-technical gap with known trade-offs. Some already in use: provide systems that partially meet requirements; know which social arrangements are needed for a specific task/setting; incorporate new computational mechanisms to substitute for social mechanisms or provide for new social issues; create technical architectures that do not invoke the gap – that just provide support/augmentation/advice to users. Critic-based architectures act as a set of resources users can evaluate and choose to use as desired. Achieving a science will require organizing these explorations and demarcating fundamental questions and issues.
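The critic-based architecture idea can be sketched as follows (the interfaces, critic names, and document fields are my own invention, not from the paper): critics return non-binding advice that users may evaluate and ignore.

```python
# Sketch of a critic-based architecture: each critic inspects the user's
# artifact and returns suggestions; nothing is enforced automatically.
# (Critic names and document fields are invented for illustration.)

from typing import Callable

Critic = Callable[[dict], list[str]]

def privacy_critic(doc: dict) -> list[str]:
    """Flags potential privacy issues without blocking anything."""
    return ["Sharing your email publicly; consider an alias."] if doc.get("shares_email") else []

def tone_critic(doc: dict) -> list[str]:
    """Flags potential tone issues without blocking anything."""
    return ["All-caps text may read as shouting."] if doc.get("all_caps") else []

def gather_advice(doc: dict, critics: list[Critic]) -> list[str]:
    """Collect advice from every critic; the user decides what to act on."""
    return [note for critic in critics for note in critic(doc)]

advice = gather_advice({"shares_email": True, "all_caps": False},
                       [privacy_critic, tone_critic])
```

The design choice is that `gather_advice` only aggregates suggestions – the system supports and augments rather than enforcing, so it never invokes the gap directly.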
Toward making CSCW into a science of the artificial. Key is determining systematic methods for designing around the gap. Failures and successes will gradually support understanding only if systematically examined (contrast this with CBR’s emphasis on lazy learning). Guiding questions based on gap and its role in CSCW:
- when can a computational system successfully ignore the need for nuance and context?
- when can a computational system augment human activity to make up for loss in nuance and context?
- can these trade-offs be systematized so we know when a system creates benefit instead of loss?
- what types of future research will solve some of the gaps between technical capabilities and what people expect in their full range of social and collaborative activities?
Also need to address evolving technical capabilities and infrastructures, along with artifact-study-theory cycle.
Conclusion. Social-technical gap is the fundamental problem of CSCW. Other problems: generalizability from small groups to general population, prediction of affordances, applicability of new technological possibilities.
Handling social fluidity likely demands something of an evolving or developing system. Developing systems can act as a tutor if the goals are relatively fixed. Evolving systems can work with more flexible goals that change with time.
The role of incentives here is likely more nuanced than presented. While removing barriers to participation is crucial, if incentives drive participation users will likely lose intrinsic motivation to be involved. It is important to get users involved, but involvement should not hinge on external reward if the system is to function in the long term. Blizzard has largely managed to sustain participation by creating a community and guild system that bridges the gaps between releases of new incentives. They still experience significant drops when new content looms near (especially expansions), even though the older content has not been altered.
The critic system proposed has trade-offs not made explicit: offered warnings/advice are often taken as imperative by users (seen by Brian O'Neil when testing the ReQUEST system). There are key trade-offs among irritation, intervention, support, and advice that are difficult to handle and center on cultural norms and relations to the computer. In US culture the computer is generally taken to be authoritative; combined with the tendency to treat it as quasi-human, this leads users to feel compelled to do what is suggested. The other trade-off is that multiple critics can and likely will conflict, leading to confusion and possibly overwhelming users. Psychological work on choice and decision-making explicitly notes that more choices typically do not lead to better choices.
Games have a unique edge compared to CSCW with respect to the social-technical gap: while most games need to be intuitive and usable, it is usually possible to get away with more limitations in some areas by reference to being a game. Games, which center on overcoming obstacles or handling constraints, inherently involve not being capable of all that is needed, and so can justify some simplifications. By referencing game goals and task structure they can bypass the issues of full representation/fidelity.
Chat that allows norm creation and negotiation mirrors the role of game economies and guilds in MMOs. These are among the few ways real social flexibility is enabled (along with chat), as most game systems depend on rigidity for their functionality. How might a more socially flexible game be created?
A big issue to consider is the negotiation of shared meanings. Star's boundary objects seem promising – game rules seem to serve a similar function. Designers see rules as balancing out players and events in the game. Players see them as affordances and constraints in pursuit of their goals. The game itself treats them as processes to follow. Such rules allow a kind of communication – players pursuing goals uncover game bugs, in turn alerting designers to alterations to make; players also enact balances and imbalances among game elements that designers strive to adjust. Rules also serve as common ground both for negotiation (when socially flexible, as with house rules) and for debate and conflict (e.g. zero-sum games).
People’s preference for knowing who is present and using this to guide their work shows up in many ways in games. Communities of players arise who take up certain roles (e.g. LARPing, raids, etc.) but distinguish these roles from their normal lives. Nintendo has pushed to keep players physically near one another rather than in online spaces, with Miyamoto tirelessly advocating multiplayer scenarios where players share the same space. These discussions benefit from distinguishing passive and active sociality, as well as synchronous and asynchronous varieties. The passive-active divide separates modes where information (such as status) is conveyed without any required action on the part of those providing (or receiving) it from modes that do require action (e.g. Facebook status updates). Synchronous social activity is real-time or live, while asynchronous covers message posting and other modes where posted information persists.
For game design the main tension is between standardization and creativity. Should the model be what is, what should be, or what could be? How is it to be designed? What will players perceive/receive? What sells in the market? Market value vs. personal value (to the author) vs. community value (to the game design community) vs. player value vs. political/ethical/social value.
The three issues that drive the social-technical gap in P3P affect games as well:
- games do not allow game actions/rules to sufficiently capture all intents/expectations from their referents
- lack of social flexibility in games means players are constrained to play by digital rules (to some extent)
- games generally rely on diversity of content, not ambiguity; class systems and input are inherently delimited (Blizzard points out that many choices often simply make most of them inherently uninteresting or mandatory, in either case removing their actual ambiguity/depth)
Games are an interesting counterpoint to some CSCW issues: whereas CSCW systems often strive to be implicit, games generally aim for the dual targets of meeting player expectations and explicating anything ambiguous. Rather than mask uncertainty and social nuance, games are built around explicating the bounds of their structures to players, with other factors built around appealing to players' fluency with familiar concepts. The issue here is that evolved player agents lack these assumptions and thus can mislead systems into evolving toward sets of actions/expectations not germane to players or humans.
What is a science of games? A combination of engineering/designing games for play and a social science of understanding the basis for game creation in the social and psychological world.
Game ideological palliatives are basically player-focusing in design. While generally held as an ideal, most major "deviators" are those not explicitly trying to entertain the player but to engage them emotionally, politically, ethically, etc. Player-centered design is obvious to observers but often much harder in implementation – designers make the games they want to make, often ignoring or marginalizing actual players until late in development. The study of game ethics, media effects, etc. all focus on the centrality of games as learning systems involved in sorts of education. Drawing from the paper, the argument would be that game designers should be taught the affordances and constraints of the systems being used. If anything, this may be better framed as teaching ways to explore and understand these factors, as the rapid pace of technological growth precludes a static set of ideas (not that they should not be demonstrated with existing technology).
Game first-order approximations already exist in some designs, although the approaches presented are clarifying. First-order approximations in games typically appear as heuristics and known patterns or compromises in design. Designing games to focus on what counts is often a problem, but it is a central focus of these approximations: where does story fit? how should verisimilitude be balanced against iconicity and engagement/immersion? what elements of games give designers the most "bang" for their "buck"? Providing chat in games is already a common means of supporting users, with the current ease of wiki creation often helping to organize interested communities, along with official forums and other public areas. Using new computational mechanisms to handle social ones has appeared in Blizzard's development of automated complaint and correction systems for common issues, previously handled by GMs directly. Similarly, automated and semi-restrictive roll systems, gear advantages, dungeon queuing, guild management, etc. all aim to computationally manage social conflicts that crop up frequently. Technical architectures that only support/augment users are essentially the computer-as-coach metaphor – in games this appears in many tutoring system options, mods/addons, etc. that aim to help use without other social resources.
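One concrete instance of substituting a computational mechanism for a social one is a need-before-greed loot roll; a sketch under invented details (this is not Blizzard's actual algorithm):

```python
# Hypothetical need-before-greed loot resolution: a computational substitute
# for socially negotiated loot rules. All details are invented for illustration.

def resolve_loot(rolls: dict[str, tuple[str, int]]) -> str:
    """rolls maps player -> (intent, d100 result); any 'need' beats every 'greed'."""
    needers = {p: r for p, (intent, r) in rolls.items() if intent == "need"}
    pool = needers if needers else {p: r for p, (_, r) in rolls.items()}
    return max(pool, key=pool.get)   # highest roll within the winning tier

winner = resolve_loot({"Ayla": ("greed", 97),
                       "Borin": ("need", 12),
                       "Cale": ("need", 55)})
# Cale wins: a 'need' roll outranks Ayla's numerically higher 'greed' roll.
```

The point of the mechanism is exactly the trade-off the notes describe: it settles a recurring social conflict automatically, at the cost of flattening whatever nuanced claims ("I've been passed over three weeks running") a human negotiation would have admitted.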
Unlike CSCW, games can push to define new genres/experiences by explicitly manipulating the social as a variable. Ignoring context and nuance points to when games can be inconsistent with, or differ from, expectations of everyday life. In a way this deals with a sort of cognitive physics: when do things make sense even if not true in real life (e.g. fireball physics underwater in Mario games)? Augmenting human activity is really more the issue of supporting designers. Games for these purposes typically serve to reframe the activity as a game, generally using reward structures and different descriptions/structures/goals.