- Dec 2018
-
Our understanding of the gap is driven by technological exploration through artifact creation and deployment, but HCI and CSCW systems need to have at their core a fundamental understanding of how people really work and live in groups, organizations, communities, and other forms of collective life. Otherwise, we will produce unusable systems, badly mechanizing and distorting collaboration and other social activity.
The risk of CSCW not driving toward a more scientific pursuit of social theory, understanding, and ethnomethodology and instead simply building "cool toys"
-
The gap is also CSCW’s unique contribution. CSCW exists intellectually at the boundary and interaction of technology and social settings. Its unique intellectual importance is at the confluence of technology and the social, and its ...
CSCW's potential to become a science of the artificial resides in the study of interactions between society and technology
-
Nonetheless, it has been argued here that the unique problem of CSCW is the social–technical gap. There is a fundamental mismatch between what is required socially and what we can do technically. Human activity is highly nuanced and contextualized. However, we lack the technical mechanisms to fully support the social world uncovered by the social findings of CSCW. This social–technical gap is unlikely to go away, although it certainly can be better understood and perhaps approached.
Factors involved in the socio-technical gap:
Social needs vs technical capacity
Human activity
Technical mechanisms continue to lag social insights
-
Nonetheless, several guiding questions are required based on the social–technical gap and its role in any CSCW science of the artificial:
• When can a computational system successfully ignore the need for nuance and context?
• When can a computational system augment human activity with computer technologies suitably to make up for the loss in nuance and context, as argued in the approximation section earlier?
• Can these benefits be systematized so that we know when we are adding benefit rather than creating loss?
• What types of future research will solve some of the gaps between technical capabilities and what people expect in their full range of social and collaborative activities?
Questions to consider in moving CSCW toward a science of the artificial
-
The final first-order approximation is the creation of technical architectures that do not invoke the social–technical gap; these architectures neither require action nor delegate it. Instead, these architectures provide supportive or augmentative facilities, such as advice, to users.
Support infrastructures provide a different type of approximation to augment the user experience.
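A minimal sketch of what such an augmentative facility might look like: the system observes an action and offers advice, but never blocks the action or acts on the user's behalf. All names and heuristics here (advise_on_share, the audience-size threshold) are hypothetical illustrations, not from the paper.

```python
# Hypothetical sketch: an advisory facility that neither requires nor
# delegates action -- it only offers suggestions the user may ignore.
from dataclasses import dataclass

@dataclass
class Advice:
    message: str
    severity: str  # "info" | "caution" -- advisory only, never blocking

def advise_on_share(doc_title: str, audience_size: int) -> list[Advice]:
    """Return advice about a sharing action without preventing it."""
    advice = []
    if audience_size > 50:  # invented heuristic for illustration
        advice.append(Advice(
            f"'{doc_title}' will be visible to {audience_size} people; "
            "consider whether all of them need edit rights.",
            "caution"))
    return advice

# The caller decides what to do; the facility never intervenes.
for a in advise_on_share("Q3 budget", audience_size=120):
    print(f"[{a.severity}] {a.message}")
```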
-
Another approximation incorporates new computational mechanisms to substitute adequately for social mechanisms or to provide for new social issues (Hollan & Stornetta, 1992).
Approximate a social need with a technical cue. Example: in Google Docs, anonymous user icons on the page indicate presence but not identity.
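A rough sketch of that kind of cue, presence without identity, is below. This is only an illustration of the pattern under invented names; it does not reflect Google Docs' actual implementation.

```python
# Hypothetical sketch: surface *that* collaborators are present without
# revealing *who* they are -- a technical cue standing in for the social
# cue of seeing someone across the room.

def presence_labels(live_session_ids: list[str]) -> list[str]:
    """Map live editing sessions to anonymized labels; identity is never exposed."""
    return [f"Anonymous viewer #{i}"
            for i, _ in enumerate(live_session_ids, start=1)]

print(presence_labels(["sess-a1", "sess-b2"]))
# ['Anonymous viewer #1', 'Anonymous viewer #2']
```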
-
First-order approximations, to adopt a metaphor from fluid dynamics, are tractable solutions that partially solve specific problems with known trade-offs.
Definition of first-order approximations.
Ackerman argues that CSCW needs a set of approximations that drive the development of initial work-arounds for the socio-technical gaps.
Essentially, how to satisfy some social requirements and then approximate the trade-offs. Doesn't consider the product a solution in full but something to iterate and improve
This may have been new/radical thinking 20 years ago but seems to have been largely adopted by the CSCW community
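The metaphor tracks how first-order approximations work in applied mathematics: replace an intractable function with a tractable linear term and accept a known, bounded error. This gloss on the metaphor is mine, not Ackerman's formalism:

```latex
f(x) \approx \underbrace{f(a) + f'(a)\,(x - a)}_{\text{tractable first-order term}}
           + \underbrace{O\big((x-a)^2\big)}_{\text{known trade-off}}
```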
-
Similarly, an educational perspective would argue that programmers and users should understand the fundamental nature of the social requirements.
Ackerman argues that CS education should include understanding how to design/build for social needs and also an appreciation of the social impacts of technology.
-
CSCW’s science, however, must centralize the necessary gap between what we would prefer to construct and what we can construct. To do this as a practical program of action requires several steps—palliatives to ameliorate the current social conditions, first-order approximations to explore the design space, and fundamental lines of inquiry to create the science. These steps should develop into a new science of the artificial. In any case, the steps are necessary to move forward intellectually within CSCW, given the nature of the social–technical gap.
Ackerman sets up the steps necessary for CSCW to become a science of the artificial and to try to resolve the socio-technical gap:
Palliatives to ameliorate social conditions
Approximations to explore the design space
Lines of scientific inquiry
-
Ideological initiatives include those that prioritize the needs of the people using the systems.
Approaches to address social conditions and "block troublesome impacts":
Stakeholder analysis
Participatory design
Scandinavian approach to info system design requires trade union involvement
-
Simon’s (1969/1981) book does not address the inevitable gaps between the desired outcome and the means of producing that outcome for any large-scale design process, but CSCW researchers see these gaps as unavoidable. The social–technical gap should not have been ignored by Simon. Yet, CSCW is exactly the type of science Simon envisioned, and CSCW could serve as a reconstruction and renewal of Simon’s viewpoint, suitably revised. As much as was AI, CSCW is inherently a science of the artificial, ...
How Ackerman sees CSCW as a science of the artificial:
"CSCW is at once an engineering discipline attempting to construct suitable systems for groups, organizations, and other collectivities, and at the same time, CSCW is a social science attempting to understand the basis for that construction in the social world (or everyday experience)."
-
At a simple level, CSCW’s intellectual context is framed by social constructionism and ethnomethodology (e.g., Berger & Luckmann, 1966; Garfinkel, 1967), systems theories (e.g., Hutchins, 1995a), and many large-scale system experiences (e.g., American urban renewal, nuclear power, and Vietnam). All of these pointed to the complexities underlying any social activity, even those felt to be straightforward.
Succinct description of CSCW's framing: social constructionism, ethnomethodology, systems theory, and large-scale system implementation.
-
Yet, The Sciences of the Artificial became an anthem call for artificial intelligence and computer science. In the book he argued for a path between the idea for a new science (such as economics or artificial intelligence) and the construction of that new science (perhaps with some backtracking in the creation process). This argument was both characteristically logical and psychologically appealing for the time.
Simon defines "Sciences of the Artificial" as new sciences/disciplines that synthesize knowledge that is technically or socially constructed or "created and maintained through human design and agency" as opposed to the natural sciences
-
The HCI and CSCW research communities need to ask what one might do to ameliorate the effects of the gap and to further understand the gap. I believe an answer—and a future HCI challenge—is to reconceptualize CSCW as a science of the artificial. This echoes Simon (1981) but properly updates his work for CSCW’s time and intellectual task.
Ackerman describes "CSCW as a science of the artificial" as a potential approach to reduce the socio-technical gap
-
As Heilbroner (1994) and other researchers have argued, technological trajectories are responsive to social direction. I make the case that they may also be responsive to intellectual direction. Indeed, a central premise of HCI is that we should not force users to adapt.
Ackerman concludes the discussion of the socio-technical gap by arguing that people should not be forced to adapt to technology.
Technology can and should respond to social and intellectual direction.
Cites Heilbroner (1994), who writes about technological determinism; I should take a look at that.
-
The coevolutionary form of this argument is that we adapt resources in the environment to our needs. If the resources are capable of only partial satisfaction, then we slowly create new technical resources to better fit the need.
Another argument that social practices should adapt and evolve alongside technology. Ackerman raises concerns about this viewpoint becoming "invisible" and simply accepted or assumed as a norm without question.
-
A second argument against the significance of the gap is historically based. There are several variants: that we should adapt ourselves to the technology or that we will coevolve with the technology.
Alternatively, humans should adapt to or coevolve with intractable technologies. Ackerman cites neo-Taylorism (a management model in which work is produced through redundant, standardized processes and splintered socio-technical activities).
-
A logically similar argument is that the problem is with the entire von Neumann machine as classically developed, and new architectures will ameliorate the gap. As Hutchins (1995a) and others (Clark, 1997) noted, the standard model of the computer over the last 30 years was disembodied, separated from the physical world by ill-defined (if defined) input and output devices.
A related argument: neural-network-style systems will overcome the socio-technical gap created by highly architected computer systems that are explicit and inflexible. Ackerman argues here, too, that those advances have not yet arrived and the gap has endured.
Quick summary of von Neumann architecture
-
First, it could be that CSCW researchers merely have not found the proper key to solve this social–technical gap, and that such a solution, using existing technologies, will shortly exist.
One argument against the socio-technical gap is that future advances in technology will solve the problem. Ackerman argues this is unlikely since the gap has existed for more than 20 years despite attempts to bridge the gap.
-
The problem, then, was centered by social scientists in the process of design. Certainly, many studies in CSCW, HCI, information technology, and information science at least indirectly have emphasized a dichotomy between designers, programmers, and implementers on one hand and the social analyst on the other.
Two different camps on how to resolve this problem:
1) Change more flexible social activity/protocols to better align with technical limitations
2) Make systems more adaptable to ambiguity
-
In particular, concurrency control problems arise when the software, data, and interface are distributed over several computers. Time delays when exchanging potentially conflicting actions are especially worrisome. ... If concurrency control is not established, people may invoke conflicting actions. As a result, the group may become confused because displays are inconsistent, and the groupware document corrupted due to events being handled out of order. (p. 207)
This passage helps explain the emphasis in CSCW papers on time/duration as a system design concern for workflow coordination (e.g., the milliseconds between MTurk HITs) versus time/representation considerations for system design.
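A minimal sketch of one common answer to the problem the quote describes: optimistic concurrency control with version checks, so stale or out-of-order edits are rejected rather than corrupting the shared document. This is an illustrative pattern, not the specific mechanism the quoted passage discusses.

```python
# Hypothetical sketch: version-checked updates to a shared document.
# An edit is applied only if it was made against the latest version;
# otherwise it is rejected and the client must re-sync, preventing
# out-of-order events from corrupting shared state.

class SharedDoc:
    def __init__(self) -> None:
        self.version = 0
        self.text = ""

    def apply_edit(self, base_version: int, new_text: str) -> bool:
        """Apply an edit only if it was based on the current version."""
        if base_version != self.version:
            return False  # stale edit: the client saw an old state
        self.text = new_text
        self.version += 1
        return True

doc = SharedDoc()
assert doc.apply_edit(0, "hello")        # accepted
assert not doc.apply_edit(0, "goodbye")  # conflicting edit, rejected
print(doc.version, doc.text)             # 1 hello
```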
-
Moreover, one of the CSCW findings was that such categorization (and especially how categories are collapsed into meta-categories) is inherently political. The preferred categories and categorization will differ from individual to individual.
Categories have politics.
See: Suchman's 1993 paper
https://pdfs.semanticscholar.org/764c/999488d4ea4f898b5ac5a4d7cc6953658db9.pdf
-
Because some of the idealization must be ignored to provide a working solution, this trade-off provides much of the tension in any given implementation between “technically working” and “organizationally workable” systems. CSCW as a field is notable for its attention and concern to managing this tension.
Nice summation of the human and technical tensions in CSCW
-
Incentives are critical.
Costs, motives, and incentives drive collaboration. Again, refer to the peer production literature from Benkler and Mako, and from Kittur, Kraut, et al.
-
People not only adapt to their systems, they adapt their systems to their needs
Another reference to matching technology to design heuristics -- user control and system/real world needs.
-
There appears to be a critical mass problem for CSCW systems
A perpetual problem, but is critical mass more of a market issue (large vs. niche need, and who will pay for it) or a technical issue (meets the need but sees low adoption due to being ahead of its time)?
-
The norms for using a CSCW system are often actively negotiated among users.
Community norms are well-discussed in the crowdsourcing and peer production literature.
See: Benkler, Mako and Kittur, Kraut, et al
-
Visibility of communication exchanges and of information enables learning and greater efficiencies
Evokes the distributed cognition literature as well as peer production, crowdsourcing, and collective intelligence practices.
-
People prefer to know who else is present in a shared space, and they use this awareness to guide their work
Awareness, disclosure, and privacy concerns are key cognitive/perception needs to integrate into technologies. Social media and CMCs struggle with this knife edge a lot.
It also seems to be a big factor in SBTF social coordination, leading to over-compensation and pluritemporal loading of interactions between volunteers.
-
Exceptions are normal in work processes.
More specific reference to workflow as a prime CSCW concern. Exceptions, edge cases, and fluid roles need to be accommodated by technology.
-
Members of organizations sometimes have differing (and multiple) goals, and conflict may be as important as cooperation in obtaining issue resolutions (Kling, 1991). Groups and organizations may not have shared goals, knowledge, meanings, and histories (Heath & Luff, 1996; Star & Ruhleder, 1994).
A lot to unpack here as this bullet gets at the fundamental need for boundary objects (Star's work) to traverse sense-making, meanings, motives, and goals within artifacts.
-
One finding of CSCW is that it is sometimes easier and better to augment technical mechanisms with social mechanisms to control, regulate, or encourage behavior (Sproull & Kiesler, 1991)
HCI / interface design heuristics re: user controls, etc.
See: https://www.nngroup.com/articles/ten-usability-heuristics/
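A toy illustration of that trade: instead of a hard technical lock, the system permits the action but makes it socially visible, letting group norms do the regulating. The names and the audit-log design are my invention, not an example from Sproull & Kiesler.

```python
# Hypothetical sketch: regulate behavior by social visibility rather
# than technical enforcement. The edit is never blocked; it is simply
# announced to the group, and norms do the rest.

def edit_shared_record(record: dict, field: str, value: str,
                       editor: str, audit_log: list[str]) -> None:
    old = record.get(field)
    record[field] = value
    # Social mechanism: a visible, attributable trail instead of a lock.
    audit_log.append(f"{editor} changed {field!r}: {old!r} -> {value!r}")

log: list[str] = []
rec = {"status": "draft"}
edit_shared_record(rec, "status", "final", editor="wendy", audit_log=log)
print(log[-1])  # wendy changed 'status': 'draft' -> 'final'
```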
-
because people often lack shared histories and meanings (especially when they are in differing groups or organizations), information must be recontextualized to reuse experience or knowledge. Systems often assume a shared understanding of information.
References Goffman's work on identity and representation.
Touches again on Suchman's work on context in situations.
-
Yet, systems often have considerable difficulty handling this detail and flexibility.
This remains a problem in HCI/CSCW nearly two decades after this paper was published.
Why?
Do the theories and models (symbolic vs. non-symbolic) not adequately describe the human side of the technical interaction? Or the technical side of the human behavior/motive/need?
Or is the gap less about nuance in (detail about) behavior and more a function of humans being fickle, contradictory, and illogical?
-
Social activity is fluid and nuanced, and this makes systems technically difficult to construct properly and often awkward to use.
CSCW assumption.
See also: Suchman's 1987 situated action book and the contesting arguments in Vera and Simon's 1993 paper
Gist of SA is that HCI (and its breakdowns) must be studied in real-life situations, knowing is inseparable from doing, and cognition can't be separated from context.
Good summary here:
https://en.wikipedia.org/wiki/Situated_cognition
https://www.sciencedirect.com/science/article/abs/pii/S0364021305800084
https://onlinelibrary.wiley.com/doi/epdf/10.1207/s15516709cog1701_5
-
March and Simon’s (1958; Simon, 1957) limited rational actor model underlies CSCW
Refers to Simon's argument that "decision makers have limited information processing capabilities" due to cognitive constraints that limit computational thinking, memory and recall.
Instead of searching for the best outcome, people use a "good enough" standard. (see Tapia and Moore 2014 crisis informatics paper).
"Satisficing" describes the process of ending the search for possible decisions once an option achieves a "good enough" alternative. (see Palen, Vieweg and Anderson, 2010 everyday analysts paper)
-
I also argue later that the challenge of the social–technical gap creates an opportunity to refocus CSCW as a Simonian science of the artificial (where a science of the artificial is suitably revised from Simon’s strictly empiricist grounds).
Simonian Science of the Artificial refers to "a physical symbol system that has the necessary and sufficient means for intelligent action."
From Simon, Herbert, "The Sciences of the Artificial," Third Edition (1996)
-
In summary, they argue that human activity is highly flexible, nuanced, and contextualized and that computational entities such as information sharing, roles, and social norms need to be similarly flexible, nuanced, and contextualized.
CSCW assumptions about social activity
-
The social–technical gap is the divide between what we know we must support socially and what we can support technically. Exploring, understanding, and hopefully ameliorating this social–technical gap is the central challenge for CSCW as a field and one of the central problems for human–computer interaction.
primary challenge for CSCW scholars and practitioners
Tags
- workflow
- socio-technical gap
- motives
- spatial time
- heuristics
- cscw
- palliatives
- hci
- peer production
- temporal structures
- science of the artificial
- pluritemporal
- approximations
- distributed cognition
- presence
- categories
- situated action
- incentives
- presentation of self
- satisficing
- classification
- boundary objects
- ai
- technological determinism
- system design