5 Matching Annotations
  1. Apr 2019
    1. Arledge, C. (2019). Design Ethics and the Limits of the Ethical Designer. https://www.viget.com. Retrieved from https://www.viget.com/articles/design-ethics-and-the-limits-of-the-ethical-designer

      I have been thinking quite a lot about ethics. I was inspired to pursue a career in UX after working with a severely visually impaired student and seeing how poorly our conventional learning and teaching tools translated to her particular needs and abilities. As I began my journey through SILS, I began to suspect that this was the tip of a much larger iceberg, one that I am passionately interested in, as it seems to speak to the underlying content of our collective character. I wanted to share this article, from a SILS alum at Viget, because I think he is speaking to the battle ethical designers need to fight in order to restore some kind of moral fiber to the world of UX, not as simple add-on features that help sell more smartphones, but as genuine tools for good that can help to open our minds.

      Arledge starts with an assumption that I think is fair, but maybe a bit defeatist: that designers’ influence sits atop the influence of their business, and that atop the influence of the infrastructure in which it operates. This is true, but it’s a bit like saying that the only way for people to get more rights is from the top down. I’m sure it’s happened somewhere at some point, but almost all cries for the advancement of society come from the bottom up, from revolution to the civil rights movement. I think there is a place (and indeed, a need) for digital activism, pushes for open source, an expansion of social media outside of Facebook, search outside of Google, and hardware beyond Lenovo, Microsoft, and Apple.

      As Mike Monteiro argued in his article ‘Design’s Lost Generation’ (https://medium.com/@monteiro/designs-lost-generation-ac7289549017), designers need to be more comfortable saying no and asking why. A reader on hypothes.is responded: “no is much harder when you’re working for a client.” This is a very fair point, and (I think) the one that Arledge is making with his conceptual model, but I think that we, as the future of tech, will sometimes have to do the hard things. The problem, of course, is that saying no has little impact when there are literally hundreds of people lined up behind you, each just a little more morally flexible or financially compromised, who will not find the reserves to say no.

      I think there needs to be a new social contract, one between the public and designers, that says that what we build will have to meet some standard of intention and impact that protects us all. Even when environmental legislation was in its infancy, companies were aware that smoggy cities were going to be bad for business, and the air quality results were impressive. I believe we are approaching such a tipping point, where companies are starting to see that tech addiction, election manipulation by a foreign government, driving while texting, and cyber-bullying (among so many other things) are bad for business, and I hope that smoggy websites will soon be bad for business too.

    1. Your own website at your own address: This one is so obvious, but we seem to have forgotten all about it: The web was designed so that everybody was supposed to have their own website, at its own address

      This seems more cultural than structural, and anyone could do some amount of this if they wanted to, but I suspect very few people would be all that interested in actually hosting their own site.

    2. It’s no surprise, then, that the ability to create web pages was reserved for Netscape Gold,


    3. View Source: For the first few years of the web, the fundamental way that people learned to build web pages was by using the “View Source” feature in their web browser.

      This is interesting: it is certainly not a prominent feature in any modern browser I've used.

    1. One reason is that products are often designed in ways that make us act impulsively and against our better judgment. For example, suppose you have a big meeting at work tomorrow. Ideally, you want to spend some time preparing for it in the evening and then get a good night’s rest. But before you can do either, a notification pops up on your phone indicating that a friend tagged you on Facebook. “This will take a minute,” you tell yourself as you click on it. But after logging in, you discover a long feed of posts by friends. A few clicks later, you find yourself watching a YouTube video that one of them shared. As soon as the video ends, YouTube suggests other related and interesting videos. Before you know it, it’s 1:00 a.m., and it’s clear that you will need an all-nighter to get ready for the following morning’s meeting. This has happened to most of us.

      This makes me think about the question of social and moral responsibility: I understand that YouTube and Facebook didn't develop these algorithms with nefarious intent, but it is a very drug-like experience, and I know I'm not the only one who can relate.