454 Matching Annotations
  1. Jul 2022
    1. Recommendation media is the new standard for content distribution. Here’s why friend graphs can’t compete in an algorithmic world.
    1. Mark last week as the end of the social networking era, which began with the rise of Friendster in 2003, shaped two decades of internet growth, and now closes with Facebook's rollout of a sweeping TikTok-like redesign.
    1. Data Policy and related material sometimes, on the contrary, demonstrate an oversupply of very high level, generalised information at the expense of a more concise and meaningful delivery of the essential information necessary for the data subject to understand the processing being undertaken and to exercise his/her rights in a meaningful way. Furthermore, while Facebook has chosen to provide its transparency information by way of pieces of text, there are other options available, such as the possible incorporation of tables, which might enable Facebook to provide the information required in a clear and concise manner, particularly in the case of an information requirement comprising a number of linked elements. The importance of concision cannot be overstated nonetheless. Facebook is entitled to provide additional information to its user above and beyond that required by Article 13 and can provide whatever additional information it wishes. However, it must first comply with more specific obligations under the GDPR, and then secondly ensure that the additional information does not have the effect of creating information fatigue or otherwise diluting the effective delivery of the statutorily required information. That is simply what the GDPR requires.

      The DPC again schools Facebook in reality.

    2. In expressing disagreement with the proposed literal interpretation of Article 13(1)(c) GDPR set out in the Preliminary Draft Decision, Facebook submitted that “Facebook Ireland’s interpretation directly tracks the actual wording of the relevant GDPR provision which stipulates only that two items of information be provided about the processing (i.e. purposes and legal bases). It says nothing about processing operations.”102 Facebook submitted that because, in its view, Article 13(1) GDPR applies “at the time data is collected”, and therefore refers only to “prospective processing”. It submits that, on this basis, Article 13(1)(c) GDPR does not relate to ongoing processing operations, but is concerned solely with information on “intended processing”.103 Facebook’s position is therefore that Article 13(1)(c) GDPR is future-gazing or prospective only in its application and that such an interpretation is supported by a literal reading of the GDPR.

      This is both a ballsy and an utterly stupid argument. The kind of argument that well-paid lawyers will make in order to keep getting paid.

    3. For these reasons, I conclude that, as a matter of fact, Facebook did not rely, or purport to rely, on the Complainant’s consent as a legal basis for the processing of personal data under the Terms of Service.

      First conclusion: No consent. It's 6(1)(b) time.

    4. In light of this confirmation by the data controller that it does not seek to rely on consent in this context, there can be no dispute that, as a matter of fact, Facebook is not relying on consent as the lawful basis for the processing complained of. It has nonetheless been argued on the Complainant’s behalf that Facebook must rely on consent, and that Facebook led the Complainant to believe that it was relying on consent.

      Here Helen bitchslaps Max by noting that despite what they hope and wish for, FB is relying on contract, and not consent.

    5. On this basis, the issues that I will address in this Draft Decision are as follows:

       Issue 1 – Whether clicking on the “accept” button constitutes or must be considered consent for the purposes of the GDPR

       Issue 2 – Reliance on Article 6(1)(b) as a lawful basis for personal data processing

       Issue 3 – Whether Facebook provided the requisite information on the legal basis for processing on foot of Article 6(1)(b) GDPR and whether it did so in a transparent manner.

      Key issues identified in the draft opinion. Compare later to see whether these differ in the final decision.

    1. This article leans negative, but it is fair on the whole: once a product has been around for ten years, the internal and external challenges it faces pile up; older users are losing interest while competitors capture the attention of newer ones. Facebook's product has thus become an awkward case study in how to add new features to an aging product. As Armstrong writes:

      If you open your Facebook app, I think you would be shocked at the number of products currently housed there... Yet each one solves a different job, so the value proposition of downloading Facebook becomes increasingly blurry.

      Another accurate observation concerns Messenger. The app, originally split out from Facebook's private messaging feature, is now being folded back into the main app, and an even more ambitious effort aims to merge the messaging portions of Facebook, Instagram, and Whatsapp, three products with little in common, into one.

      The feed and private messaging are the two poles of a social product: the former is public space, where, amplified by algorithmic recommendation, publishers have all but lost control over how their content is distributed; the latter is private space. Over the past decade-plus of social platform history (media and networks alike), we have come to understand that the two have a clear boundary, and putting the wrong content in the wrong place can cause irreversible damage.

      Armstrong sums up this history as follows: Over the past few years, we have learned a few things about so-called social networks. First, we don't actually care what most people think. The social part isn't as compelling as first imagined, because much of people's daily lives is boring. We simply don't care, not at all, about the political views of someone you knew in high school. Content from acquaintances just isn't that interesting. We have also figured out that posting most of our social existence online carries a negative opportunity cost. All of us, and I mean every one of us, have at some point used foul language in public, made a racist joke, said something derogatory, or just been plain stupid. Since 2007 it has become increasingly clear that the benefits of sharing your life online aren't worth the risk of saying something that feels good now but could destroy your life ten years from now. In short, the social graph is just a poor substitute for the interest graph. What people are looking for from their so-called "social" media is simply media they're interested in.

  2. Jun 2022
    1. What's become clear is that our relationships are experiencing a profound reset. Across generations, having faced a stark new reality, a decades-long trend reversed as people are now shifting their energy away from maintaining a wide array of casual connections to cultivating a smaller circle of the people who matter most.

      ‘how the demand for deeper human connection has sparked a profound reset in our relationships’.

      The Meta Foresight (formerly Facebook IQ) team conducted a survey of 36,000 adults across 12 markets.

      Among their key findings:

      72% of respondents said that the pandemic caused them to reprioritize their closest friends
      Young people are most open to using more immersive tech to foster connections (including augmented and virtual reality), though all users indicated that tech will play a bigger role in enhancing personal connections moving forward
      37% of people surveyed globally reported reassessing their life priorities as a result of the pandemic
      
    1. And when corporations started to dominate the Internet, they became de facto governments. Slowly but surely, the tech companies began to act like old power. They used the magic of tech to consolidate their own power, using money to increase their influence and blocking the redistribution of power from the entrenched elites to ordinary people.

      "Money is its own kind of power"

      The corporations built by white, male, American, and vaguely libertarian people became a focal point of power because of the money they had to influence governments and society. They started looking like "old power."

      Later:

      Facebook took advantage of tech's tradition of openness [importing content from MySpace], but as soon as it got what it wanted, it closed its platform off.

  3. May 2022
    1. We believe that Facebook is also actively encouraging people to use tools like Buffer Publish for their business or organization, rather than personal use. They are continuing to support the use of Facebook Pages, rather than personal Profiles, for things like scheduling and analytics.

      Of course they're encouraging people to do this. The business side is where they make all their money.

    1. Facebook provides some data portability, but makes an odd plea for regulation to make more functionality possible.

      Why do this when they could simply choose to do the right thing? They don't need to be forced, and they could certainly work to ensure security themselves. It wouldn't be any worse than the tons of personal data they've failed to protect in the past.

    1. devises a train system that brings the victims to Auschwitz as quickly and smoothly as possible, and in doing so forgets what happens to them in Auschwitz.

      Not comparable, but also an example of technology being misused because it is misunderstood as an end in itself: Facebook and human trafficking. See the Jan Böhmermann episode on the Facebook leaks.

    1. Content moderation takes place within this ecosystem.

      The essay makes the point that "Facebook has many faces - it is not a monolith". But algorithmic content moderation is monolithic. Let's see whether this tension is investigated.

  4. Apr 2022
    1. identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information;

      This establishes that identical or equivalent content, once struck down, can be kept down by automated tools.
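
      As a purely illustrative sketch (not anything Facebook or the CJEU describes), an "automatic tool" for the identical-content case, which the ruling treats as not requiring an independent assessment by the host, could be as simple as a hash-based blocklist. The normalization step and the class below are assumptions invented for illustration.

```python
import hashlib


class StayDownFilter:
    """Illustrative stay-down check: blocks re-uploads of text identical
    (after trivial normalization) to content previously declared unlawful."""

    def __init__(self):
        self.blocked_hashes = set()

    @staticmethod
    def _fingerprint(text: str) -> str:
        # Normalize whitespace and case so trivial re-posts still match.
        normalized = " ".join(text.lower().split())
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    def add_unlawful(self, text: str) -> None:
        """Register content that has been declared unlawful."""
        self.blocked_hashes.add(self._fingerprint(text))

    def should_block(self, text: str) -> bool:
        """True if a new post is identical to previously blocked content."""
        return self._fingerprint(text) in self.blocked_hashes


stay_down = StayDownFilter()
stay_down.add_unlawful("Some statement a court has found unlawful.")
print(stay_down.should_block("some   statement a court has found UNLAWFUL."))  # True
print(stay_down.should_block("A completely different statement."))             # False
```

      Handling merely "equivalent" content is where the ruling's caveats bite: the match has to stay narrow enough that the host never needs to make an independent judgment of its own.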

    1. Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), in particular Article 15(1), must be interpreted as meaning that it does not preclude a court of a Member State from:

       – ordering a host provider to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information;

       – ordering a host provider to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content, and

       – ordering a host provider to remove information covered by the injunction or to block access to that information worldwide within the framework of the relevant international law.

      C‑18/18 Eva Glawischnig-Piesczek v Facebook

      Key for the Art 17 AG opinion: this line of argument justifies the ex ante blocking of manifestly infringing content.

  5. Mar 2022
    1. Ben Collins. (2022, February 28). Quick thread: I want you all to meet Vladimir Bondarenko. He’s a blogger from Kiev who really hates the Ukrainian government. He also doesn’t exist, according to Facebook. He’s an invention of a Russian troll farm targeting Ukraine. His face was made by AI. https://t.co/uWslj1Xnx3 [Tweet]. @oneunderscore__. https://twitter.com/oneunderscore__/status/1498349668522201099

    1. First is that it actually lowers paid acquisition costs. It lowers them because the Facebook Ads algorithm rewards engaging advertisements with lower CPMs and lots of distribution. Facebook does this because engaging advertisements are just like engaging posts: they keep people on Facebook. 

      Engaging advertisements on Facebook benefit from lower acquisition costs because the Facebook algorithm rewards more interesting advertisements with lower CPMs and wider distribution. This is done, as with everything driven by surveillance capitalism, to keep eyeballs on Facebook.

      This isn't too dissimilar to large cable networks that provide free high quality advertising to mass manufacturers in late night slots. The network generally can't sell all of its advertising inventory, particularly in low viewing hours, so it will offer free or incredibly cheap commercial rates to its bigger buyers (like Coca-Cola or McDonald's, for example) to fill space and have more professional looking advertisements between the low quality advertisements from local mom and pop stores and the "as seen on TV" spots. These higher quality commercials help keep the audience engaged and prevent viewers from changing the channel.
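
      A toy model of the mechanism described above, not Facebook's actual auction logic: the effective price an advertiser pays falls as the predicted engagement rate of the ad rises, so engaging creative buys more distribution per dollar. The base CPM, the weighting constant, and the engagement rates below are invented numbers.

```python
def effective_cpm(base_cpm: float, engagement_rate: float, weight: float = 0.5) -> float:
    """Toy model: discount the CPM in proportion to predicted engagement.

    engagement_rate -- predicted reactions/comments/shares per impression (0..1)
    weight          -- how strongly engagement is rewarded (made-up constant)
    """
    return base_cpm * (1.0 - weight * min(engagement_rate, 1.0))


# A boring ad vs. an engaging ad, both starting from a $10 base CPM.
boring = effective_cpm(10.0, engagement_rate=0.02)    # about $9.90 per 1,000 impressions
engaging = effective_cpm(10.0, engagement_rate=0.30)  # about $8.50 per 1,000 impressions

budget = 1_000.0  # dollars to spend
print(f"Boring ad impressions:   {budget / boring * 1000:,.0f}")    # ~101,010
print(f"Engaging ad impressions: {budget / engaging * 1000:,.0f}")  # ~117,647
```

      Same budget, noticeably more impressions, which is the "lower paid acquisition costs" the quoted passage describes.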

  6. Jan 2022
  7. Dec 2021
    1. She thinks the companies themselves are behind this, trying to manipulate their users into having certain opinions and points of view.

      The irony is that this is, itself, something of a conspiracy theory.

      Though I think a more nuanced understanding is closer to the truth:

      • The real purpose is not to influence people to believe anything. It's money. It's ad spend and data collection to sell. We need to demonstrate to advertisers that their ads are actually getting seen. The more they get seen, the more money we make. And the more time that is spent on the service, the more data we have to sell... which is as valuable as the ad spend.
      • Companies jigger algorithms to maximize time spent on the service.
      • As the Bible makes clear, the heart of man is wicked, and the kinds of things that maximize time spent are themselves attitudes of evil, malice, wickedness, and hatred, the very list of things Paul repeatedly tells us to avoid. Go figure.
      • So, people feel the platforms are basically like smoking, and yet, they can't stop.
    2. Only 10 percent say Facebook has a positive impact on society, while 56 percent say it has a negative impact and 33 percent say its impact is neither positive nor negative. Even among those who use Facebook daily, more than three times as many say the social network has a negative rather than a positive impact.

      Here's the rub. Only 1 out of 10 Americans surveyed thinks Facebook has a positive impact on society.

      Over half of Americans surveyed actually think Facebook is bad for them and society as a whole. And yet, the general sense is now that life is impossible without it.

      How does the church respond to this? Do we tell people to get off or "use in moderation?"

  8. Nov 2021
    1. Source: De Agostini Picture Library / Getty

      This is a searing image for what this article is about:

      Muted, dull painting of what appears to be a 17th-century gallows being erected in front of a line of soldiers with guns and bayonets and a crowd with shovels. Instead of a gallows, the structure being erected is a large Facebook thumbs up image on a pole. Various flags with the Facebook logo fly around the scene.

      Could be entitled "A different kind of social justice."

    1. TIME magazine's October cover is a photo of Zuckerberg overlaid with the words "Delete 'Facebook'?" The alert uses an iOS-style dialog box, but it is paired with a mouse cursor making the selection. It is entirely understandable why it was done this way, but it still sparked some discussion, the most interesting of which came from a group of former Apple employees describing their old design process at Apple. Majd Taby, the designer previously responsible for redesigning me.com, said his job at the time was to port the iPad's visual style wholesale to the me.com web app, and the end result was selecting iPad-style alerts with a mouse, exactly like the effect shown on this cover. Sebastiaan de With, who previously worked on the iCloud web apps, said there was no sharing of UIKit assets or design source files at all between Apple's teams; whenever a team needed to do something similar, it had to start from scratch. Keep in mind this was the era of skeuomorphic design, when drawing a full set of UI took an enormous amount of time, and the reason for all of it was "secrecy." The first thing Martin Pedrick was told on joining the user interface group was not to share design source files through any resource-sharing tools.

    1. via David Dylan Thomas in Come and get yer social justice metaphors! (11/05/2021 11:26:10)

  9. Oct 2021
    1. https://www.theatlantic.com/ideas/archive/2021/10/facebook-papers-democracy-election-zuckerberg/620478/

      Adrienne LaFrance outlines the reasons we need to either abandon Facebook or cause some more extreme regulation of it and how it operates.

      While she outlines the ills, she doesn't make a specific plea about how to solve the problem. There's definitely a raging fire in the theater, but no one seems to know what to do about it. We're just sitting here watching the structure burn down around us. We need clearer plans for what must be done to solve this problem.

    2. When the most powerful company in the world possesses an instrument for manipulating billions of people—an instrument that only it can control, and that its own employees say is badly broken and dangerous—we should take notice.
    3. Facebook could say that its platform is not for everyone. It could sound an alarm for those who wander into the most dangerous corners of Facebook, and those who encounter disproportionately high levels of harmful content. It could hold its employees accountable for preventing users from finding these too-harmful versions of the platform, thereby preventing those versions from existing.

      The "moral majority" has screamed for years about the dark corners of the internet, and now they seem to be actively supporting a company that actively pushes people to those very extremes.

    4. Facebook could shift the burden of proof toward people and communities to demonstrate that they’re good actors—and treat reach as a privilege, not a right.

      Nice to see someone else essentially saying something along the lines that "free speech" is not the same as "free reach".

      Traditional journalism has always had thousands of gatekeepers who filtered and weighed who got the privilege of reach. Now anyone with an angry, vile, or upsetting message can get it for free. This is one of the worst parts of what Facebook allows.

    5. “While we have other systems that demote content that might violate our specific policies, like hate speech or nudity, this intervention reduces all content with equal strength. Because it is so blunt, and reduces positive and completely benign speech alongside potentially inflammatory or violent rhetoric, we use it sparingly.”)

      If it's neither moral nor legal for one to shout "fire" in a crowded theater, why is it somehow both legal and moral for a service like Facebook to allow their service to scream "fire, fire, fire" within a crowded society?

    6. Facebook wants people to believe that the public must choose between Facebook as it is, on the one hand, and free speech, on the other. This is a false choice.
    7. One example is a program that amounts to a whitelist for VIPs on Facebook, allowing some of the users most likely to spread misinformation to break Facebook’s rules without facing consequences.
    8. “I am worried that Mark’s continuing pattern of answering a different question than the question that was asked is a symptom of some larger problem,” wrote one Facebook employee in an internal post in June 2020, referring to Zuckerberg. “I sincerely hope that I am wrong, and I’m still hopeful for progress. But I also fully understand my colleagues who have given up on this company, and I can’t blame them for leaving. Facebook is not neutral, and working here isn’t either.”

      Glad to see that others are recognizing that Mark Zuckerberg seems to be the one whose flaws are killing Facebook.

    9. An internal message characterizing Zuckerberg’s reasoning says he wanted to avoid new features that would get in the way of “meaningful social interactions.” But according to Facebook’s definition, its employees say, engagement is considered “meaningful” even when it entails bullying, hate speech, and reshares of harmful content.

      Meaningful social interactions don't need algorithmic help.

    10. At the time, Facebook was already weighting the reactions other than “like” more heavily in its algorithm—meaning posts that got an “angry” reaction were more likely to show up in users’ News Feeds than posts that simply got a “like.” Anger-inducing content didn’t spread just because people were more likely to share things that made them angry; the algorithm gave anger-inducing content an edge. Facebook’s Integrity workers—employees tasked with tackling problems such as misinformation and espionage on the platform—concluded that they had good reason to believe targeting posts that induced anger would help stop the spread of harmful content.
    11. Facebook offers a collection of one-tap emoji reactions. Today, they include “like,” “love,” “care,” “haha,” “wow,” “sad,” and “angry.” Company researchers had found that the posts dominated by “angry” reactions were substantially more likely to go against community standards, including prohibitions on various types of misinformation, according to internal documents.

      "Angry" reactions can be a measure of posts being against community standards and providing misinformation.

      What other signals might misinformation carry that could be used to guard against them at a corporate level?
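
      A rough sketch of the two ideas in these passages: reaction-weighted ranking, where non-"like" reactions count for more than a "like", and using the share of "angry" reactions as one crude signal to route a post to review. The weights and the threshold are invented for illustration and are not Facebook's actual values.

```python
# Hypothetical per-reaction weights. Internal documents reportedly weighted
# non-"like" reactions more heavily; these exact numbers are made up.
REACTION_WEIGHTS = {"like": 1, "love": 5, "care": 5, "haha": 5, "wow": 5, "sad": 5, "angry": 5}

ANGRY_REVIEW_THRESHOLD = 0.4  # invented: flag posts where >40% of reactions are "angry"


def ranking_boost(reactions: dict) -> int:
    """Weighted engagement score that pushes a post higher in a feed."""
    return sum(REACTION_WEIGHTS.get(name, 1) * count for name, count in reactions.items())


def flag_for_review(reactions: dict) -> bool:
    """Crude misinformation signal: a high share of 'angry' reactions."""
    total = sum(reactions.values())
    return total > 0 and reactions.get("angry", 0) / total > ANGRY_REVIEW_THRESHOLD


post = {"like": 20, "angry": 80}
print(ranking_boost(post))    # 420: outranks a post with 100 plain likes (score 100)
print(flag_for_review(post))  # True: the same signal could instead route it to review
```

      Under this kind of scoring, the anger-heavy post wins distribution precisely because of the reactions the Integrity workers treated as a warning sign.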

    12. that many of Facebook’s employees believe their company operates without a moral compass.

      Not just Facebook, but specifically Mark Zuckerberg who appears to be on the spectrum and isn't capable of being moral in a traditional sense.

    13. Facebook has dismissed the concerns of its employees in manifold ways. One of its cleverer tactics is to argue that staffers who have raised the alarm about the damage done by their employer are simply enjoying Facebook’s “very open culture,” in which people are encouraged to share their opinions, a spokesperson told me.
      1. Share opinions
      2. Opinions viewed as "fact"
      3. "Facts" spread as news.
      4. Platform accelerates "news".
      5. Bad things happen
      6. Profit
    1. Not only is Zuckerberg being called out for negligence, but it’s obvious that his ridiculous proposal for “Instagram for Kids”, a social platform targeting children under the age of 13, is projected only to exacerbate the problem.
    1. Facebook's vision for the future is built on AR (augmented reality), but the interaction methods of today's AR devices are clearly not yet good enough for everyday use. To realize that vision, their team has the following ideas for entirely new modes of interaction:

      • Ultra-low-friction input: no resistance between intention and action, for example by using electrical signals from the wrist
      • Personalized AI that understands context
      • Devices that can be worn all day
  10. Sep 2021
    1. Ben Collins on Twitter: “A quick thread: It’s hard to explain just how radicalized ivermectin and antivax Facebook groups have become in the last few weeks. They’re now telling people who get COVID to avoid the ICU and treat themselves, often by nebulizing hydrogen peroxide. So, how did we get here?” / Twitter. (n.d.). Retrieved September 26, 2021, from https://twitter.com/oneunderscore__/status/1441395300002848769?s=20

    1. We may think of Pinterest as a visual form of commonplacing, as people choose and curate images (and very often inspirational quotations) that they find motivating, educational, or idealistic(Figure 6). Whenever we choose a passage to cite while sharing an article on Facebook or Twitter, we are creating a very public commonplace book on social media. Every time wepost favorite lyrics from a song or movie to social media or ablog, weare nearing the concept of Renaissance commonplace book culture.

      I'm not the only one who's thought this. Pinterest, Facebook, twitter, (and other social media and bookmarking software) can be considered a form of commonplace.

  11. Aug 2021
  12. Jul 2021
    1. Facebook used to be a company led by the dual core of Mark Zuckerberg and Sheryl Sandberg, but since Trump took office that arrangement has changed a great deal; employees now generally believe Facebook's power structure has shifted from the old dual core to a single core plus everyone else. The reason is that Sheryl Sandberg failed to manage Facebook's relationship with Washington during Trump's term.

      This piece is excerpted from the forthcoming book An Ugly Truth: Inside Facebook's Battle for Domination.

    1. Facebook AI. (2021, July 16). We’ve built and open-sourced BlenderBot 2.0, the first #chatbot that can store and access long-term memory, search the internet for timely information, and converse intelligently on nearly any topic. It’s a significant advancement in conversational AI. https://t.co/H17Dk6m1Vx https://t.co/0BC5oQMEck [Tweet]. @facebookai. https://twitter.com/facebookai/status/1416029884179271684

  13. Jun 2021
  14. May 2021
    1. “Over the next five to ten years people will start to learn the importance of privacy and keeping their data,” says Moore. “Facebook’s business model is all about tracking – they are not a social media company, they are an advertising company and if they can track you they can make more money. Apple has got nothing to worry about, but Facebook could be gone in ten years.”
    1. A former FB executive and long-standing friend of Zuckerberg emailed him in 2012 (page 31) to say “The number one threat to Facebook is not another scaled social network, it is the fracturing of information / death by a thousand small vertical apps which are loosely integrated together.”

      And this is almost exactly what the IndieWeb is.

    2. The single alternative platform is absolutely not the Facebook-killer.

      This is the truth. One need only look at cable television providers or telephone service providers to see the problems here.

    3. To change incentives so that personal data is treated with appropriate care, we need criminal penalties for the Facebook executives who left vulnerable half a billion people’s personal data, unleashing a lifetime of phishing attacks, and who now point to an FTC deal indemnifying them from liability because our phone numbers and unchangeable dates of birth are “old” data.

      We definitely need penalties and regulation to fix our problems.

    1. A strong and cogent argument for why we should not be listening to the overly loud cries from Tristan Harris and the Center for Humane Technology. The boundary of criticism they're setting is not extreme enough to make the situation significantly better.

      It's also a strong argument for who to allow at the table or not when making decisions and evaluating criticism.

    2. it makes a difference whether the argument made before Congress is “Facebook is bad, cannot reform itself, and is guided by people who know what they’re doing but are doing it anyway—and the company needs to be broken up immediately” or if the argument is “Facebook means well, but it sure would be nice if they could send out fewer notifications and maybe stop recommending so much conspiratorial content.”

      Note the dramatic difference between these two framings and the potential each leaves for things to get better.

    1. Whether Trump can return to Facebook (and Instagram) will be determined on Wednesday morning, when Facebook’s Oversight Board offers its ruling on the company’s indefinite ban. Check TheWrap.com around 6:15 a.m. PT on Wednesday for an update.

      Let's hope that the answer is a resounding "NO!"

  15. Apr 2021