45 Matching Annotations
  1. Aug 2024
  2. Jun 2024
    1. Tension

       The ability to see like a data structure afforded us the technology we have today. But it was built for and within a set of societal systems—and stories—that can’t cope with nebulosity. Worse still is the transitional era we’ve entered, in which overwhelming complexity leads more and more people to believe in nothing. That way lies madness. Seeing is a choice, and we need to reclaim that choice. However, we need to see things and do things differently, and build sociotechnical systems that embody this difference.

       This is best seen through a small example. In our jobs, many of us deal with interpersonal dynamics that sometimes overwhelm the rules. The rules are still there—those that the company operates by and laws that it follows—meaning there are limits to how those interpersonal dynamics can play out. But those rules are rigid and bureaucratic, and most of the time they are irrelevant to what you’re dealing with. People learn to work with and around the rules rather than follow them to the letter. Some of these might be deliberate hacks, ones that are known, and passed down, by an organization’s workers. A work-to-rule strike, or quiet quitting for that matter, is effective at slowing a company to a halt because work is never as routine as schedules, processes, leadership principles, or any other codified rules might allow management to believe.

       The tension we face is that on an everyday basis, we want things to be simple and certain. But that means ignoring the messiness of reality. And when we delegate that simplicity and certainty to systems—either to institutions or increasingly to software—they feel impersonal and oppressive. People used to say that they felt like large institutions were treating them like a number. For decades, we have literally been numbers in government and corporate data structures.

       Breakdown

       As historian Jill Lepore wrote, we used to be in a world of mystery. Then we began to understand those mysteries and use science to turn them into facts. And then we quantified and operationalized those facts through numbers. We’re currently in a world of data—overwhelming, human-incomprehensible amounts of data—that we use to make predictions even though that data isn’t enough to fully grapple with the complexity of reality.

       How do we move past this era of breakdown? It’s not by eschewing technology. We need our complex socio-technical systems. We need mental models to make sense of the complexities of our world. But we also need to understand and accept their inherent imperfections. We need to make sure we’re avoiding static and biased patterns—of the sort that a state functionary or a rigid algorithm might produce—while leaving room for the messiness inherent in human interactions. Chapman calls this balance “fluidity,” where society (and really, the tech we use every day) gives us the disparate things we need to be happy while also enabling the complex global society we have today.
  3. Sep 2021
    1. Vice versa, many researchers and practitioners who are mainly interested in human-centered social constructs choose to ignore the, to them, often alienating world of technical systems design.
    2. human-centered aspects that predominate in community informatics, like ethics, legitimacy, empowerment, and socio-technical design
    3. socio-technical
  4. Dec 2018
    1. Our understanding of the gap is driven by technological exploration through artifact creation and deployment, but HCI and CSCW systems need to have at their core a fundamental understanding of how people really work and live in groups, organizations, communities, and other forms of collective life. Otherwise, we will produce unusable systems, badly mechanizing and distorting collaboration and other social activity.

      The risk of CSCW not driving toward a more scientific pursuit of social theory, understanding, and ethnomethodology and instead simply building "cool toys"

    2. The gap is also CSCW’s unique contribution. CSCW exists intellectually at the boundary and interaction of technology and social settings. Its unique intellectual importance is at the confluence of technology and the social, and its

      CSCW's potential to become a science of the artificial resides in the study of interactions between society and technology

    3. Nonetheless, it has been argued here that the unique problem of CSCW is the social–technical gap. There is a fundamental mismatch between what is required socially and what we can do technically. Human activity is highly nuanced and contextualized. However, we lack the technical mechanisms to fully support the social world uncovered by the social findings of CSCW. This social–technical gap is unlikely to go away, although it certainly can be better understood and perhaps approached.

      Factors involved in the socio-technical gap:

      Social needs vs technical capacity

      Human activity

      Technical mechanisms continue to lag social insights

    4. Nonetheless, several guiding questions are required based on the social–technical gap and its role in any CSCW science of the artificial:

       • When can a computational system successfully ignore the need for nuance and context?

       • When can a computational system augment human activity with computer technologies suitably to make up for the loss in nuance and context, as argued in the approximation section earlier?

       • Can these benefits be systematized so that we know when we are adding benefit rather than creating loss?

       • What types of future research will solve some of the gaps between technical capabilities and what people expect in their full range of social and collaborative activities?

      Questions to consider in moving CSCW toward a science of the artificial

    5. The final first-order approximation is the creation of technical architectures that do not invoke the social–technical gap; these architectures neither require action nor delegate it. Instead, these architectures provide supportive or augmentative facilities, such as advice, to users.

      Support infrastructures provide a different type of approximation to augment the user experience. A minimal sketch of such an advisory facility follows.
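      A minimal Python sketch, under hypothetical assumptions, of what such an advisory facility might look like: the system surfaces suggestions but neither requires nor delegates action. The scenario, names, and messages are illustrative, not Ackerman's design.

      ```python
      # Hypothetical sketch of an advisory (augmentative) facility: the system
      # offers advice but never acts on the user's behalf.
      from dataclasses import dataclass

      @dataclass
      class Advice:
          message: str
          rationale: str

      def advise_on_shared_edit(doc_owner: str, editor: str, section: str) -> list[Advice]:
          """Offer (but never enforce) guidance about a potentially sensitive edit."""
          suggestions = []
          if editor != doc_owner:
              suggestions.append(Advice(
                  message=f"Consider notifying {doc_owner} before rewriting '{section}'.",
                  rationale="Edits to someone else's section often carry social meaning."))
          return suggestions

      # The user decides what to do; the system never blocks or auto-sends anything.
      for advice in advise_on_shared_edit("alice", "bob", "Introduction"):
          print(f"- {advice.message} ({advice.rationale})")
      ```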

    6. Another approximation incorporates new computational mechanisms to substitute adequately for social mechanisms or to provide for new social issues (Hollan & Stornetta, 1992).

      Approximate a social need with a technical cue. Example in Google Docs: anonymous user icons on the page indicate presence but not identity. A sketch of this pattern follows.
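      A minimal Python sketch of that presence-without-identity cue, assuming a hypothetical PresenceChannel service; the class, tokens, and aliases are illustrative, not Google Docs' actual implementation.

      ```python
      # Hypothetical sketch: broadcast *that* someone is present without *who*,
      # in the spirit of Google Docs' anonymous user icons.
      import uuid

      class PresenceChannel:
          """Tracks opaque session tokens instead of user identities."""

          def __init__(self):
              self._sessions = {}  # opaque token -> anonymous display alias

          def join(self, real_user_id: str) -> str:
              # The real identity never leaves this method; peers only ever
              # see an anonymous alias.
              token = uuid.uuid4().hex
              self._sessions[token] = f"Anonymous {len(self._sessions) + 1}"
              return token

          def leave(self, token: str) -> None:
              self._sessions.pop(token, None)

          def roster(self) -> list[str]:
              # Presence (a count and aliases), but not identity.
              return list(self._sessions.values())

      channel = PresenceChannel()
      t1 = channel.join("alice@example.com")
      t2 = channel.join("bob@example.com")
      print(channel.roster())  # ['Anonymous 1', 'Anonymous 2']
      channel.leave(t1)
      print(channel.roster())  # ['Anonymous 2']
      ```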

    7. First-order approximations, to adopt a metaphor from fluid dynamics, are tractable solutions that partially solve specific problems with known trade-offs.

      Definition of first-order approximations.

      Ackerman argues that CSCW needs a set of approximations that drive the development of initial work-arounds for the socio-technical gaps.

      Essentially, satisfy some of the social requirements and accept the known trade-offs. The product is not considered a full solution but something to iterate on and improve.

      This may have been new/radical thinking 20 years ago but seems to have been largely adopted by the CSCW community

    8. Similarly, an educational perspective would argue that programmers and users should understand the fundamental nature of the social requirements.

      Ackerman argues that CS education should include understanding how to design/build for social needs but also to appreciate the social impacts of technology.

    9. CSCW’s science, however, must centralize the necessary gap between what we would prefer to construct and what we can construct. To do this as a practical program of action requires several steps—palliatives to ameliorate the current social conditions, first-order approximations to explore the design space, and fundamental lines of inquiry to create the science. These steps should develop into a new science of the artificial. In any case, the steps are necessary to move forward intellectually within CSCW, given the nature of the social–technical gap.

      Ackerman sets up the steps necessary for CSCW to become a science of the artificial and to try to resolve the socio-technical gap:

      Palliatives to ameliorate social conditions

      Approximations to explore the design space

      Lines of scientific inquiry

    10. Ideological initiatives include those that prioritize the needs of the people using the systems.

      Approaches to address social conditions and "block troublesome impacts":

      Stakeholder analysis

      Participatory design

      Scandinavian approach to info system design requires trade union involvement

    11. Simon’s (1969/1981) book does not address the inevitable gaps between the desired outcome and the means of producing that outcome for any large-scale design process, but CSCW researchers see these gaps as unavoidable. The social–technical gap should not have been ignored by Simon. Yet, CSCW is exactly the type of science Simon envisioned, and CSCW could serve as a reconstruction and renewal of Simon’s viewpoint, suitably revised. As much as was AI, CSCW is inherently a science of the artificial,

      How Ackerman sees CSCW as a science of the artificial:

      "CSCW is at once an engineering discipline attempting to construct suitable systems for groups, organizations, and other collectivities, and at the same time, CSCW is a social science attempting to understand the basis for that construction in the social world (or everyday experience)."

    12. At a simple level, CSCW’s intellectual context is framed by social constructionism and ethnomethodology (e.g., Berger & Luckmann, 1966; Garfinkel, 1967), systems theories (e.g., Hutchins, 1995a), and many large-scale system experiences (e.g., American urban renewal, nuclear power, and Vietnam). All of these pointed to the complexities underlying any social activity, even those felt to be straightforward.

      Succinct description of CSCW as social constructionism, ethnomethodology, systems theory, and large-scale system implementation.

    13. Yet, The Sciences of the Artificial became an anthem call for artificial intelligence and computer science. In the book he argued for a path between the idea for a new science (such as economics or artificial intelligence) and the construction of that new science (perhaps with some backtracking in the creation process). This argument was both characteristically logical and psychologically appealing for the time.

      Simon defines "Sciences of the Artificial" as new sciences/disciplines that synthesize knowledge that is technically or socially constructed or "created and maintained through human design and agency" as opposed to the natural sciences

    14. The HCI and CSCW research communities need to ask what one might do to ameliorate the effects of the gap and to further understand the gap. I believe an answer—and a future HCI challenge—is to reconceptualize CSCW as a science of the artificial. This echoes Simon (1981) but properly updates his work for CSCW’s time and intellectual task.

      Ackerman describes "CSCW as a science of the artificial" as a potential approach to reduce the socio-technical gap

    15. As Heilbroner (1994) and other researchers have argued, technological trajectories are responsive to social direction. I make the case that they may also be responsive to intellectual direction. Indeed, a central premise of HCI is that we should not force users to adapt.

      Ackerman concludes the discussion of socio-technical gaps by arguing that people should not be forced to adapt to technology.

      Technology can and should respond to social and intellectual direction.

      Cites Heilbroner (1994), who writes about technological determinism; I should take a look at this.

      http://www.f.waseda.jp/sidoli/Heilbroner_1994.pdf

    16. The coevolutionary form of this argument is that we adapt resources in the environment to our needs. If the resources are capable of only partial satisfaction, then we slowly create new technical resources to better fit the need.

      Another argument that social practices should adapt and evolve alongside technology. Ackerman raises concerns about this viewpoint becoming "invisible" and simply accepted or assumed as a norm without question.

    17. A second argument against the significance of the gap is historically based. There are several variants: that we should adapt ourselves to the technology or that we will coevolve with the technology.

      Alternatively, humans should adapt to or coevolve with intractable technologies. Ackerman cites neo-Taylorism (an economic model that describes work produced by redundant processes and splintered socio-technical activities).

    18. A logically similar argument is that the problem is with the entire von Neumann machine as classically developed, and new architectures will ameliorate the gap. As Hutchins (1995a) and others (Clark, 1997) noted, the standard model of the computer over the last 30 years was disembodied, separated from the physical world by ill-defined (if defined) input and output devices.

      A related argument is that neurally inspired systems will overcome the socio-technical gap created by highly architected computer systems that are explicit and inflexible. Ackerman argues here, too, that those advances have not yet arrived and the gap has endured.

      Quick summary of von Neumann architecture

      https://en.wikipedia.org/wiki/Von_Neumann_architecture

    19. First, it could be that CSCW researchers merely have not found the proper key to solve this social–technical gap, and that such a solution, using existing technologies, will shortly exist.

      One argument against the socio-technical gap is that future advances in technology will solve the problem. Ackerman argues this is unlikely since the gap has existed for more than 20 years despite attempts to bridge the gap.

    20. The problem, then, was centered by social scientists in the process of design. Certainly, many studies in CSCW, HCI, information technology, and information science at least indirectly have emphasized a dichotomy between designers, programmers, and implementers on one hand and the social analyst on the other.

      Two different camps on how to resolve this problem:

      1) Change the more flexible social activity/protocols to better align with technical limitations.

      2) Make systems more adaptable to ambiguity.

    21. In particular, concurrency control problems arise when the software, data, and interface are distributed over several computers. Time delays when exchanging potentially conflicting actions are especially worrisome. … If concurrency control is not established, people may invoke conflicting actions. As a result, the group may become confused because displays are inconsistent, and the groupware document corrupted due to events being handled out of order. (p. 207)

      This passage helps explain the emphasis in CSCW papers on time/duration as a system design concern for workflow coordination (e.g., milliseconds between MTurk HITs) versus time/representation considerations for system design. A sketch of one classic event-ordering mechanism follows.
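      A minimal Python sketch of one classic remedy, Lamport logical clocks, which give every replica the same total order over concurrent events; Replica, Event, and the edit operations are hypothetical illustrations, not from the quoted paper.

      ```python
      # Hypothetical sketch: Lamport logical clocks as one classic way groupware
      # can order concurrent edits so replicas don't apply events out of order.
      from dataclasses import dataclass, field

      @dataclass(order=True)
      class Event:
          timestamp: int                  # Lamport clock value
          site_id: int                    # tie-breaker so the order is total
          op: str = field(compare=False)  # the edit itself; not part of ordering

      class Replica:
          def __init__(self, site_id: int):
              self.site_id = site_id
              self.clock = 0
              self.log: list[Event] = []

          def local_edit(self, op: str) -> Event:
              self.clock += 1
              event = Event(self.clock, self.site_id, op)
              self.log.append(event)
              return event

          def receive(self, event: Event) -> None:
              # Lamport rule: clock = max(local, remote) + 1
              self.clock = max(self.clock, event.timestamp) + 1
              self.log.append(event)
              self.log.sort()  # every replica converges on the same order

      a, b = Replica(1), Replica(2)
      e1 = a.local_edit("insert 'x' at 0")
      e2 = b.local_edit("delete char 3")
      a.receive(e2)
      b.receive(e1)
      assert [e.op for e in a.log] == [e.op for e in b.log]  # consistent displays
      ```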

    22. Moreover, one of the CSCW findings was that such categorization (and especially how categories are collapsed into meta-categories) is inherently political. The preferred categories and categorization will differ from individual to individual.

      Categories have politics.

      See: Suchman's 1993 paper

      https://pdfs.semanticscholar.org/764c/999488d4ea4f898b5ac5a4d7cc6953658db9.pdf

    23. Because some of the idealization must be ignored to provide a working solution, this trade-off provides much of the tension in any given implementation between “technically working” and “organizationally workable” systems. CSCW as a field is notable for its attention and concern to managing this tension.

      Nice summation of the human and technical tensions in CSCW

    24. Incentives are critical.

      Costs, motives, and incentives drive collaboration. Again, refer to peer production literature here from Benkler and Mako, and Kittur, Kraut, et al

    25. People not only adapt to their systems, they adapt their systems to their needs

      Another reference to matching technology to design heuristics -- user control and system/real world needs.

    26. There appears to be a critical mass problem for CSCW systems

      Perpetual problem, but is critical mass more of a market issue (large vs. niche need, and who will pay for it) or a technical issue (meeting the need vs. low adoption from being ahead of its time)?

    27. The norms for using a CSCW system are often actively negotiatedamong users.

      Community norms are well-discussed in the crowdsourcing and peer production literature.

      See: Benkler, Mako and Kittur, Kraut, et al

    28. Visibility of communication exchanges and of information enables learning and greater efficiencies

      Evokes the distributed cognition literature as well as peer production, crowdsourcing, and collective intelligence practices.

    29. People prefer to know who else is present in a shared space, and they use this awareness to guide their work

      Awareness, disclosure, and privacy concerns are key cognitive/perception needs to integrate into technologies. Social media and CMCs struggle with this knife edge a lot.

      It also seems to be a big factor in SBTF social coordination that leads to over-compensating and pluritemporal loading of interactions between volunteers.

    30. Exceptions are normal in work processes.

      More specific reference to workflow as a prime CSCW concern. Exceptions, edge cases, and fluid roles need to be accommodated by technology.

    31. Members of organizations sometimes have differing (and multiple) goals, and conflict may be as important as cooperation in obtaining issue resolutions (Kling, 1991). Groups and organizations may not have shared goals, knowledge, meanings, and histories (Heath & Luff, 1996; Star & Ruhleder, 1994).

      A lot to unpack here as this bullet gets at the fundamental need for boundary objects (Star's work) to traverse sense-making, meanings, motives, and goals within artifacts.

    32. One finding of CSCW is that it is sometimes easier and better to augment technical mechanisms with social mechanisms to control, regulate, or encourage behavior (Sproull & Kiesler, 1991)

      HCI / interface design heuristics re: user controls, etc.

      See: https://www.nngroup.com/articles/ten-usability-heuristics/

    33. because people often lack shared histories and meanings (especially when they are in differing groups or organizations), information must be recontextualized to reuse experience or knowledge. Systems often assume a shared understanding of information.

      References Goffman's work on identity and representation.

      Touches again on Suchman's work on context in situations.

    34. Yet, systems often have considerable difficulty handling this detail and flexibility.

      This remains a problem in HCI/CSCW nearly two decades after this paper was published.

      Why?

      Do the theories and models (symbolic vs non-symbolic) not adequately describe the human-side of the technical interaction? Or the technical-side of the human behavior/motive/need?

      Or is the gap less about nuance in (detail about) behavior and more a function of humans being fickle, contradictory, and illogical?

    35. Social activity is fluid and nuanced, and this makes systems technically difficult to construct properly and often awkward to use.

      CSCW assumption.

      See also: Suchman's 1987 situated action book and the contesting arguments in Vera and Simon's 1993 paper

      Gist of SA is that HCI (and its breakdowns) must be studied in real-life situations, knowing is inseparable from doing, and cognition can't be separated from context.

      Good summary here:

      https://en.wikipedia.org/wiki/Situated_cognition

      https://books.google.com/books?hl=en&lr=&id=AJ_eBJtHxmsC&oi=fnd&pg=PR7&dq=suchman&ots=KrKpjGFHGV&sig=hmJ_pyJymoEweA_XDFWdMedSL4s#v=onepage&q=suchman&f=false

      https://www.sciencedirect.com/science/article/abs/pii/S0364021305800084

      https://onlinelibrary.wiley.com/doi/epdf/10.1207/s15516709cog1701_5

    36. March and Simon’s (1958; Simon, 1957) limited rational actor model underlies CSCW

      Refers to Simon's argument that "decision makers have limited information processing capabilities" due to cognitive constraints that limit computational thinking, memory and recall.

      Instead of searching for the best outcome, people use a "good enough" standard. (see Tapia and Moore 2014 crisis informatics paper).

      "Satisficing" describes the process of ending the search for possible decisions once an option achieves a "good enough" alternative. (see Palen, Vieweg and Anderson, 2010 everyday analysts paper)

      See: http://oxfordre.com/politics/view/10.1093/acrefore/9780190228637.001.0001/acrefore-9780190228637-e-405
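      A small Python sketch contrasting satisficing with the exhaustive optimizing that the fully rational actor model assumes; the apartment-hunting data and aspiration threshold are hypothetical.

      ```python
      # Hypothetical sketch: satisficing stops at the first "good enough" option,
      # while optimizing compares every option to find the best one.
      from typing import Callable, Iterable, Optional, TypeVar

      T = TypeVar("T")

      def satisfice(options: Iterable[T], score: Callable[[T], float],
                    good_enough: float) -> Optional[T]:
          """Return the first option whose score meets the aspiration level."""
          for option in options:       # bounded search, no global comparison
              if score(option) >= good_enough:
                  return option        # stop; possibly better options are ignored
          return None                  # nothing qualified; lower the aspiration level

      def optimize(options: Iterable[T], score: Callable[[T], float]) -> T:
          """Exhaustively pick the best option."""
          return max(options, key=score)

      apartments = [("A", 6.0), ("B", 8.5), ("C", 9.9)]
      rating = lambda apartment: apartment[1]
      print(satisfice(apartments, rating, good_enough=8.0))  # ('B', 8.5): good enough
      print(optimize(apartments, rating))                    # ('C', 9.9): best overall
      ```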

    37. I also argue later that the challenge of the social–technical gap creates an opportunity to refocus CSCW as a Simonian science of the artificial (where a science of the artificial is suitably revised from Simon’s strictly empiricist grounds).

      Simonian Science of the Artificial refers to "a physical symbol system that has the necessary and sufficient means for intelligent action."

      From Simon, Herbert, "The Sciences of the Artificial," Third Edition (1996)

    38. In summary, they argue that human activity is highly flexible, nuanced, and contextualized and that computational entities such as information sharing, roles, and social norms need to be similarly flexible, nuanced, and contextualized.

      CSCW assumptions about social activity

    39. Thesocial–technical gapis the divide between what we know we must support sociallyand what we can support technically. Exploring, understanding, and hopefullyameliorating this social–technical gap is the central challenge for CSCW as afield and one of the central problems for human–computer interaction.

      primary challenge for CSCW scholars and practitioners