289 Matching Annotations
  1. Last 7 days
    1. Facebook AI is introducing M2M-100, the first multilingual machine translation (MMT) model that can translate between any pair of 100 languages without relying on English data. It’s open sourced here. When translating, say, Chinese to French, most English-centric multilingual models train on Chinese to English and English to French, because English training data is the most widely available. Our model trains directly on Chinese to French data to better preserve meaning. It outperforms English-centric systems by 10 points on the widely used BLEU metric for evaluating machine translations. M2M-100 is trained on a total of 2,200 language directions — or 10x more than the previous best English-centric multilingual models. Deploying M2M-100 will improve the quality of translations for billions of people, especially those who speak low-resource languages. This milestone is the culmination of years of Facebook AI’s foundational work in machine translation. Today, we’re sharing details on how we built a more diverse MMT training data set and model for 100 languages. We’re also releasing the model, training, and evaluation setup to help other researchers reproduce and further advance multilingual models.

      Summary of the 1st AI model from Facebook that translates directly between languages (not relying on English data)
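      The direction counts behind the "10x" claim can be sanity-checked with simple arithmetic. This sketch only restates numbers from the passage above (100 languages, 2,200 trained directions); everything else is elementary counting.

```python
# With N languages, an English-centric model only trains directions that
# pass through English, while an any-to-any model can in principle train
# on every ordered pair of distinct languages.

def english_centric_directions(n: int) -> int:
    # X -> English and English -> X for each non-English language
    return 2 * (n - 1)

def all_pair_directions(n: int) -> int:
    # every ordered pair of distinct languages
    return n * (n - 1)

pivot = english_centric_directions(100)   # 198 directions via English
direct = all_pair_directions(100)         # 9,900 possible directions
# M2M-100 trains on 2,200 of those 9,900 directions, roughly 11x the
# 198 an English-centric setup would cover.
```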

  2. Oct 2020
    1. you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings).
    1. This is the story of how Facebook tried and failed at moderating content. The article cites many sources (employees) who were tasked with flagging posts according to platform policies. Things got complicated when high-profile people (such as Trump) started posting hate speech on their profiles.

      Moderators have no way of getting honest remarks from Facebook. Moreover, they are badly treated and exploited.

      The article cites examples from different countries, not only the US, including extreme right groups in the UK, Bolsonaro in Brazil, the massacre in Myanmar, and more.

      In the end, the only thing that changes Facebook behavior is bad press.

    1. When Wojcicki took over, in 2014, YouTube was a third of the way to the goal, she recalled in investor John Doerr’s 2018 book Measure What Matters. “They thought it would break the internet! But it seemed to me that such a clear and measurable objective would energize people, and I cheered them on,” Wojcicki told Doerr. “The billion hours of daily watch time gave our tech people a North Star.” By October 2016, YouTube hit its goal.

      Obviously they took the easy route. You may need to measure what matters, but getting to that goal by any means necessary or through indefensible shortcuts is the fallacy here. They could have had that North Star, but the means they used to reach it were wrong.

      This is another great example of tech ignoring basic ethics to get to a monetary goal. (Another good one is Mark Zuckerberg's "connecting people" mantra, when what it should be is "connecting people for good" or "creating positive connections".)

    1. Meta co-founder and CEO Sam Molyneux writes that “Going forward, our intent is not to profit from Meta’s data and capabilities; instead we aim to ensure they get to those who need them most, across sectors and as quickly as possible, for the benefit of the world.”

      Odd statement from a company that was just acquired by Facebook founder's CVI.

    1. A spokeswoman for Summit said in an e-mail, “We only use information for educational purposes. There are no exceptions to this.” She added, “Facebook plays no role in the Summit Learning Program and has no access to any student data.”

      As if Facebook needed it. The fact that this statement is made papers over the idea that Summit itself could do something just as nefarious with the data as Facebook might, or worse.

    1. M.B can’t be reduced to stereotypes, of course. But there’s also a bar to entry into this social-media network, and it’s a distinctly technophilic, first-world, Western bar.

      You can only say this because I suspect you're comparing it to platforms that are massively larger by many orders of magnitude. You can't compare it to Twitter or Facebook yet. In fact, if you were to compare it to them, then it would be to their early versions. Twitter was very technophilic for almost all of its first three years, until it crossed over into the broader consciousness in early 2009.

      Your argument is somewhat akin to doing a national level political poll and only sampling a dozen people in one small town.

    1. Schemas aren't neutral

      This section highlights why relying on algorithmic feeds in social media platforms like Facebook and Twitter can be toxic. Your feed is full of what they think you'll like and click on instead of giving you the choice.

    1. In fact, these platforms have become inseparable from their data: we use “Facebook” to refer to both the application and the data that drives that application. The result is that nearly every Web app today tries to ask you for more and more data again and again, leading to dangling data on duplicate and inconsistent profiles we can no longer manage. And of course, this comes with significant privacy concerns.
    1. We believe that Facebook is also actively encouraging people to use tools like Buffer Publish for their business or organization, rather than personal use. They are continuing to support the use of Facebook Pages, rather than personal Profiles, for things like scheduling and analytics.

      Of course they're encouraging people to do this. Pushing them to the business side is where they're making all the money.

    1. Thanks to a Facebook page, perhaps for the first time in history, an internet user could click yes on an electronic invitation to a revolution
    1. Most previous explanations had focussed on explaining how someone’s beliefs might be altered in the moment.

      Knowing a little of what is coming in advance here, I can't help but think: How could this riot theory be used to influence politics and/or political campaigns? It could be particularly effective to get people "riled up" just before an election to create a political riot of sorts and thereby influence the outcome.

      Facebook has done several social experiments with elections, showing users that their friends and family voted and thereby affecting other potential voters. When done in a way that targets people of particular political beliefs to increase turnout, one is given a means of drastically influencing elections. In some sense, this is an example of this "Riot Theory".

    1. Even publishers with the most social media-savvy newsrooms can feel at a disadvantage when Facebook rolls out a new product.

      The same goes in triplicate when they pull the plug without notice too!

    1. People come to Google looking for information they can trust, and that information often comes from the reporting of journalists and news organizations around the world.

      Heavy hit in light of the Facebook data scandal this week on top of accusations about fake news spreading.

    2. We’re now in the early stages of testing a “Propensity to Subscribe” signal based on machine learning models in DoubleClick to make it easier for publishers to recognize potential subscribers, and to present them the right offer at the right time.

      Interestingly, the technology here isn't that different from the Facebook data that Cambridge Analytica was using; the difference is that they're not using it to directly impact politics, but to drive sales. Does this mean they're more "ethical"?
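      For readers curious what a "Propensity to Subscribe" signal amounts to, it is typically a learned probability score over reader behaviour. The sketch below is a toy logistic model; the feature names and weights are invented for illustration and are not Google's actual model.

```python
import math

# Hypothetical features and weights; a real system would learn these
# from reader-behaviour data rather than hard-coding them.
WEIGHTS = {"articles_read_30d": 0.08, "return_visits_30d": 0.15, "newsletter_signup": 1.2}
BIAS = -3.0

def propensity_to_subscribe(features: dict) -> float:
    """Logistic score in (0, 1): higher means more likely to subscribe."""
    z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

heavy_reader = propensity_to_subscribe(
    {"articles_read_30d": 20, "return_visits_30d": 10, "newsletter_signup": 1})
casual_visitor = propensity_to_subscribe({"articles_read_30d": 1})
# heavy_reader scores far above casual_visitor, so the publisher would
# present the subscription offer to the former.
```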

    1. Facebook’s use of “ethnic affinity” as a proxy for race is a prime example. The platform’s interface does not offer users a way to self-identify according to race, but advertisers can nonetheless target people based on Facebook’s ascription of an “affinity” along racial lines. In other words, race is deployed as an externally assigned category for purposes of commercial exploitation and social control, not part of self-generated identity for reasons of personal expression. The ability to define one’s self and tell one’s own stories is central to being human and how one relates to others; platforms’ ascribing identity through data undermines both.
    1. You could throw the pack away and deactivate your Facebook account altogether. It will get harder the longer you wait — the more photos you post there, or apps you connect to it.

      Links create value over time, and so destroying links typically destroys the value.

    1. My hope is that it will somehow bring comments on Facebook back to the blog and display them as comments here.

      Sadly, Aaron Davis is right that Facebook turned off their API access for this on August 1st, so there currently aren't any services, including Brid.gy, anywhere that allow this. Even WordPress and JetPack got cut off from posting from WordPress to Facebook, much less the larger challenge of pulling responses back.

  3. Sep 2020
    1. What were the “right things” to serve the community, as Zuckerberg put it, when the community had grown to more than 3 billion people?

      This is just one of the contradictions of having a global medium/platform of communication being controlled by a single operator.

      It is extremely difficult to create global policies to moderate the conversations of 3 billion people across different languages and cultures. No team, no document, is qualified for such a task, because so much is dependent on context.

      The approach to moderation taken by federated social media like Mastodon makes a lot more sense. Communities moderate themselves, based on their own codes of conduct. In smaller servers, a strict code of conduct may not even be necessary - moderation decisions can be based on a combination of consensus and common sense (just like in real life social groups and social interactions). And there is no question of censorship, since their moderation actions don't apply to the whole network.

    1. “With no oversight whatsoever, I was left in a situation where I was trusted with immense influence in my spare time,” she wrote. “A manager on Strategic Response mused to myself that most of the world outside the West was effectively the Wild West with myself as the part-time dictator – he meant the statement as a compliment, but it illustrated the immense pressures upon me.”
    2. “There was so much violating behavior worldwide that it was left to my personal assessment of which cases to further investigate, to file tasks, and escalate for prioritization afterwards,” she wrote.

      Wow.

    3. Facebook ignored or was slow to act on evidence that fake accounts on its platform have been undermining elections and political affairs around the world, according to an explosive memo sent by a recently fired Facebook employee and obtained by BuzzFeed News.The 6,600-word memo, written by former Facebook data scientist Sophie Zhang, is filled with concrete examples of heads of government and political parties in Azerbaijan and Honduras using fake accounts or misrepresenting themselves to sway public opinion. In countries including India, Ukraine, Spain, Brazil, Bolivia, and Ecuador, she found evidence of coordinated campaigns of varying sizes to boost or hinder political candidates or outcomes, though she did not always conclude who was behind them.
    4. “In the office, I realized that my viewpoints weren’t respected unless I acted like an arrogant asshole,” Zhang said.
    1. On and on it goes, until the perceived cost of not being on Facebook is higher than the perceived downsides of joining the platform.

      The cost of not being on Facebook is higher than the downsides of joining the platform. I need to let that sentence sink in a few more times. I can't quite picture it yet.

    1. it is fundamental to understand that, on Facebook, I am as if standing in front of a window

      ... but Facebook's window-pane transparency is not so obvious: it holds, certainly, for what we Like or post publicly; but far more significantly, there is the tracking of every one of our behaviours (every click on a link, every website visited where Facebook or one of its subsidiaries is present, every fraction of a second in which we stop scrolling). This deeply asymmetric gaze that Facebook turns on us, without our knowledge, is what matters most.

  4. Aug 2020
    1. The mass surveillance and factory farming of human beings on a global scale is the business model of people farmers like Facebook and Google. It is the primary driver of the socioeconomic system we call surveillance capitalism.
    1. Facebook has apologized to its users and advertisers for being forced to respect people’s privacy in an upcoming update to Apple’s mobile operating system – and promised it will do its best to invade their privacy on other platforms.

      Sometimes I forget how funny The Register can be. This is terrific.

    1. Facebook is warning developers that privacy changes in an upcoming iOS update will severely curtail its ability to track users' activity across the entire Internet and app ecosystem and prevent the social media platform from serving targeted ads to users inside other, non-Facebook apps on iPhones.

      I fail to see anything bad about this.

  5. Jul 2020
    1. But the business model that we now call surveillance capitalism put paid to that, which is why you should never post anything on Facebook without being prepared to face the algorithmic consequences.

      I'm reminded a bit of the season 3 episode of Breaking Bad where Jesse Pinkman invites his drug-dealing pals to a Narcotics Anonymous-type meeting so that they can target their meth sales. Fortunately the two lowlifes had more morality and compassion than Facebook can manage.

      https://www.youtube.com/watch?v=20kpzC3sckQ

  6. Jun 2020
    1. One of the new tools debuted by Facebook allows administrators to remove and block certain trending topics among employees. The presentation discussed the “benefits” of “content control.” And it offered one example of a topic employers might find it useful to blacklist: the word “unionize.”

      Imagine your employer looking over your shoulder constantly.

      Imagine that you're surveilled not only in regard to what you produce, but also in what you, if you're an office worker, tap out in chats to colleagues.

      This is what Facebook does, and it's not very different from what China has created with their Social Credit System.

      This is Orwellian.

    1. Alarmingly, Google now deploys hidden trackers on 76% of websites across the web to monitor your behavior and Facebook has hidden trackers on about 25% of websites, according to the Princeton Web Transparency & Accountability Project. It is likely that Google and/or Facebook are watching you on most sites you visit, in addition to tracking you when using their products.
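      The Princeton measurements boil down to a simple idea: load a page, collect the third-party domains it pulls resources from, and match them against a list of known trackers. A minimal sketch, assuming a tiny illustrative tracker list (the project's real crawler is far more thorough):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Tiny illustrative subset of known tracker domains.
KNOWN_TRACKERS = {"google-analytics.com", "connect.facebook.net", "doubleclick.net"}

class ResourceCollector(HTMLParser):
    """Collects the domains a page loads scripts/images from."""
    def __init__(self):
        super().__init__()
        self.domains = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "src" and value and value.startswith("http"):
                self.domains.add(urlparse(value).netloc)

def trackers_on_page(html: str) -> set:
    collector = ResourceCollector()
    collector.feed(html)
    return {d for d in collector.domains
            if any(d == t or d.endswith("." + t) for t in KNOWN_TRACKERS)}
```

      A page embedding the Facebook SDK script would be flagged even when no visible Facebook widget appears on it.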

    1. And while all major tech platforms deploying end-to-end encryption argue against weakening their security, Facebook has become the champion-in-chief fighting against government moves, supported by Apple and others.
    1. WhatsApp has become the dominant messaging platform, dwarfing all other contenders with the exception of its Facebook stablemate Messenger. In doing so, this hyper-scale “over-the-top” platform has also pushed legacy SMS messaging into the background
    1. The breach was caused by Facebook’s “View As” feature, which allows users to view their own account as if they were a stranger visiting it.
    2. “We have a responsibility to protect your data,” said Zuckerberg, in March. “And if we can’t, then we don’t deserve to serve you.”
    1. Facebook already harvests some data from WhatsApp. Without Koum at the helm, it’s possible that could increase—a move that wouldn’t be out of character for the social network, considering that the company’s entire business model hinges on targeted advertising around personal data.
  7. May 2020
    1. The high number of extremist groups was concerning, the presentation says. Worse was Facebook’s realization that its algorithms were responsible for their growth. The 2016 presentation states that “64% of all extremist group joins are due to our recommendation tools” and that most of the activity came from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our recommendation systems grow the problem.”
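      The mechanics behind a "Groups You Should Join" style recommender are worth seeing concretely. This is a deliberately naive co-membership heuristic with invented data, not Facebook's actual algorithm: groups whose members overlap with yours float to the top, which is exactly the dynamic that lets a fringe group with active, overlapping members get amplified.

```python
from collections import Counter

# Invented membership data for illustration.
MEMBERSHIPS = {
    "alice": {"gardening", "fringe_group"},
    "bob": {"gardening", "fringe_group", "cooking"},
    "carol": {"gardening"},
}

def recommend_groups(user: str) -> list:
    """Rank groups the user is not in by membership overlap with the user."""
    mine = MEMBERSHIPS[user]
    scores = Counter()
    for other, groups in MEMBERSHIPS.items():
        if other == user:
            continue
        overlap = len(mine & groups)
        for g in groups - mine:
            scores[g] += overlap
    return [g for g, _ in scores.most_common()]

# carol only joined "gardening", yet "fringe_group" tops her
# recommendations because her fellow gardeners are in it.
```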
  8. Apr 2020
    1. Acton's $850M moral stand and the $122mn fine for deliberately lying to the EU Competition Commission

      Under pressure from Mark Zuckerberg and Sheryl Sandberg to monetize WhatsApp, he pushed back as Facebook questioned the encryption he'd helped build and laid the groundwork to show targeted ads and facilitate commercial messaging. Acton also walked away from Facebook a year before his final tranche of stock grants vested. “It was like, okay, well, you want to do these things I don’t want to do,” Acton says. “It’s better if I get out of your way. And I did.” It was perhaps the most expensive moral stand in history. Acton took a screenshot of the stock price on his way out the door—the decision cost him $850 million.

      Despite a transfer of several billion dollars, Acton says he never developed a rapport with Zuckerberg. “I couldn’t tell you much about the guy,” he says. In one of their dozen or so meetings, Zuck told Acton unromantically that WhatsApp, which had a stipulated degree of autonomy within the Facebook universe and continued to operate for a while out of its original offices, was “a product group to him, like Instagram.”

      For his part, Acton had proposed monetizing WhatsApp through a metered-user model, charging, say, a tenth of a penny after a certain large number of free messages were used up. “You build it once, it runs everywhere in every country,” Acton says. “You don’t need a sophisticated sales force. It’s a very simple business.”
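      Acton's metered-user model is easy to make concrete. The quota and price below are placeholders in the spirit of his "tenth of a penny" remark; only that phrase comes from the article.

```python
# Free up to a quota, then a tenth of a penny per message.
FREE_MESSAGES = 10_000          # hypothetical free tier
PRICE_PER_MESSAGE = 0.001       # a tenth of a penny, in dollars

def monthly_charge(messages_sent: int) -> float:
    billable = max(0, messages_sent - FREE_MESSAGES)
    return billable * PRICE_PER_MESSAGE

# A user sending 12,000 messages would pay $2.00; most users pay nothing,
# which is why "you build it once, it runs everywhere" needs no sales force.
```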

      Acton’s plan was shot down by Sandberg. “Her words were ‘It won’t scale.’ ”

      “I called her out one time,” says Acton, who sensed there might be greed at play. “I was like, ‘No, you don’t mean that it won’t scale. You mean it won’t make as much money as . . . ,’ and she kind of hemmed and hawed a little. And we moved on. I think I made my point. . . . They are businesspeople, they are good businesspeople. They just represent a set of business practices, principles and ethics, and policies that I don’t necessarily agree with.”

      Questioning Zuckerberg’s true intentions wasn’t easy when he was offering what became $22 billion. “He came with a large sum of money and made us an offer we couldn’t refuse,” Acton says. The Facebook founder also promised Koum a board seat, showered the founders with admiration and, according to a source who took part in discussions, told them that they would have “zero pressure” on monetization for the next five years... Internally, Facebook had targeted a $10 billion revenue run rate within five years of monetization, but such numbers sounded too high to Acton—and reliant on advertising.

      The warning signs emerged before the deal even closed that November. The deal needed to get past Europe’s famously strict antitrust officials, and Facebook prepared Acton to meet with around a dozen representatives of the European Competition Commission in a teleconference. “I was coached to explain that it would be really difficult to merge or blend data between the two systems,” Acton says. He told the regulators as much, adding that he and Koum had no desire to do so.

      Later he learned that elsewhere in Facebook, there were “plans and technologies to blend data.” Specifically, Facebook could use the 128-bit string of numbers assigned to each phone as a kind of bridge between accounts. The other method was phone-number matching, or pinpointing Facebook accounts with phone numbers and matching them to WhatsApp accounts with the same phone number.
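      The phone-number matching described above is, mechanically, just a join on normalized numbers. A minimal sketch with invented account data:

```python
import re

def normalize(phone: str) -> str:
    """Strip everything but digits so different formats compare equal."""
    return re.sub(r"\D", "", phone)

def match_accounts(facebook: dict, whatsapp: dict) -> list:
    """Join account ids from both services that share a phone number."""
    by_phone = {normalize(p): fb_id for fb_id, p in facebook.items()}
    return [(by_phone[normalize(p)], wa_id)
            for wa_id, p in whatsapp.items()
            if normalize(p) in by_phone]
```

      Two differently formatted copies of the same number still match, which is what makes this bridge between accounts so easy to build.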

      Within 18 months, a new WhatsApp terms of service linked the accounts and made Acton look like a liar. “I think everyone was gambling because they thought that the EU might have forgotten because enough time had passed.” No such luck: Facebook wound up paying a $122 million fine for giving “incorrect or misleading information” to the EU—a cost of doing business, as the deal got done and such linking continues today (though not yet in Europe). “The errors we made in our 2014 filings were not intentional,” says a Facebook spokesman.

      Acton had left a management position on Yahoo’s ad division over a decade earlier with frustrations at the Web portal’s so-called “Nascar approach” of putting ad banners all over a Web page. The drive for revenue at the expense of a good product experience “gave me a bad taste in my mouth,” Acton remembers. He was now seeing history repeat. “This is what I hated about Facebook and what I also hated about Yahoo,” Acton says. “If it made us a buck, we’d do it.” In other words, it was time to go.

    1. Also see Social Capital's 2018 Annual Letter which noted that 40c of every VC dollar is now spent on Google, Facebook, and Amazon (ads)

      The leaders of more than half a dozen new online retailers all told me they spent the greatest portion of their ad money on Facebook and Instagram.

      “In the start-up-industrial complex, it’s like a systematic transfer of money” from venture-capital firms to start-ups to Facebook.

      Steph Korey, a founder of Away, a luggage company based in New York that opened in 2015, says that when the company was starting, it made $5 for every $1 it spent on Facebook Lookalike ads.

      They began trading their Lookalike groups with other online retailers, figuring that the kind of people who buy one product from social media will probably buy others. This sort of audience sharing is becoming more common on Facebook: There is even a company, TapFwd, that pools together Lookalike groups for various brands, helping them show ads to other groups.

    1. The Advantages of Facebook Ads: You have many reasons to prefer Facebook Ads over other social media platforms. Once you learn the platform, it is very easy to publish your ads and turn a profit for your business. So why should you choose Facebook? Let's take a look together.
    1. These platforms clearly have a key role to play in the fight against cyberviolence, most of which takes place within them. One can cite the latest measures taken by Facebook against intimidation and harassment: the option to hide or delete several comments at once under a post, and the option to report content deemed abusive that was published on a friend's account.

      An example of Facebook taking measures to limit cyberviolence. The victim can thus delete abusive comments (after having read them), and it is possible, as a friend, to intervene. But when will we see real measures, such as deleting the offender's account, reporting them to the authorities, or sufficient checks to ensure that a child under X years of age is not on the platform?

    1. This page documents our efforts to track political advertising and produce an ad transparency report for the 2019 European Parliament Election. We attempted to download a copy of the political ads on a daily basis using the Facebook Ad Library API and the Google Ad Library, starting on March 29 and May 11 respectively, when the two companies released their political ads archive. We provide this data collection log so that external researchers, journalists, analysts, and readers may examine our methods and assess the data presented in our reports.

      Facebook provides an Application Programmable Interface ("API") to authorized users who may search for ads in their archive. However, due to the inconsistent state of the Facebook Ad Library API, our methods to scan and discover ads must be adapted on a daily and sometimes hourly basis to deal with design limitations, data issues, and numerous software bugs in the Facebook Ad Library API. Despite our best efforts to help Facebook debug their system, the majority of the issues were not resolved. The API delivered incomplete data on most days from its release through May 16, when Facebook fixed a critical bug. The API was broken again from May 18 through May 26, the last day of the elections.

      We regret we do not have reliable or predictable instructions on how to retrieve political ads from Facebook. Visit the methods page for our default crawler settings and a list of suggested workarounds for known bugs, or scroll down to see our log. In general, we encountered three categories of issues with the Facebook Ad Library API. First, software programming errors that cripple a user's ability to complete even a single search, including the following bugs:

      Classic example of transparency hobbled by non-bulk access and poor coding. Contrast with Google who just provided straight up bulk access.

      Also reflects FB's lack of technical prowess: they are craigslist x 10. No technical quality: just the perfect monopoly platform able to hoover up cents on massive volume.
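      For anyone wanting to reproduce this kind of crawl, one page-fetch request against the Facebook Ad Library API looks roughly like the sketch below. The endpoint and parameter names follow the API as publicly documented around 2019, but treat them as assumptions to verify; the token and field list are placeholders.

```python
from urllib.parse import urlencode

# Graph API version and endpoint current around the 2019 crawl; check
# Facebook's documentation before relying on them.
AD_ARCHIVE = "https://graph.facebook.com/v3.3/ads_archive"

def build_request(access_token: str, country: str, page_size: int = 250) -> str:
    """Build one page-fetch URL for political ads reaching `country`."""
    params = {
        "access_token": access_token,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": country,
        "search_terms": "''",  # assumed workaround to match all ads
        "limit": page_size,
        "fields": "id,page_id,ad_delivery_start_time,spend,impressions",
    }
    return AD_ARCHIVE + "?" + urlencode(params)
```

      A daily crawler would loop over such URLs, follow the paging cursors in each response, and retry on the API errors the log above describes.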

  9. Mar 2020
    1. Right now, Facebook is tackling “misinformation that has imminent risk of danger, telling people if they have certain symptoms, don’t bother going getting treated … things like ‘you can cure this by drinking bleach.’ I mean, that’s just in a different class.”
    1. Facebook does not even offer an absolute opt out of targeted advertising on its platform. The ‘choice’ it gives users is to agree to its targeted advertising or to delete their account and leave the service entirely. Which isn’t really a choice when balanced against the power of Facebook’s platform and the network effect it exploits to keep people using its service.
    2. So it’s not surprising that Facebook is so coy about explaining why a certain user on its platform is seeing a specific advert. Because if the huge surveillance operation underpinning the algorithmic decision to serve a particular ad was made clear, the person seeing it might feel manipulated. And then they would probably be less inclined to look favorably upon the brand they were being urged to buy. Or the political opinion they were being pushed to form. And Facebook’s ad tech business stands to suffer.
    1. Using Facebook ads, the researchers recruited 2,743 users who were willing to leave Facebook for one month in exchange for a cash reward. They then randomly divided these users into a Treatment group, that followed through with the deactivation, and a Control group, that was asked to keep using the platform. 

      The effects of not using Facebook for a month:

      • on average another 60 free mins per day
      • small but significant improvement in well-being, and in particular in self-reported happiness, life satisfaction, depression and anxiety
      • participants were less willing to use Facebook from now
      • the group was less likely to follow politics
      • deactivation significantly reduced polarization of views on policy issues and a measure of exposure to polarizing news
      • 80% agreed that the deactivation was good for them
  10. Feb 2020
    1. Last year, Facebook said it would stop listening to voice notes in messenger to improve its speech recognition technology. Now, the company is starting a new program where it will explicitly ask you to submit your recordings, and earn money in return.

      Given Facebook's history of things like breaking laws and ending up paying billions of USD in damages (even though that's a joke to them), selling ads to people who explicitly wanted to target people who hate Jews, and spending millions of USD every year solely on lobbying, don't sell your personal experiences and behaviours to them.

      Facebook is nefarious and psychopathic.

  11. Jan 2020
    1. received a message telling me that my account had been locked because I was incarcerated and as such, disallowed from using Facebook
  12. Dec 2019
    1. Madison’s design has proved durable. But what would happen to American democracy if, one day in the early 21st century, a technology appeared that—over the course of a decade—changed several fundamental parameters of social and political life? What if this technology greatly increased the amount of “mutual animosity” and the speed at which outrage spread? Might we witness the political equivalent of buildings collapsing, birds falling from the sky, and the Earth moving closer to the sun?

      Jonathan Haidt, you might have noticed, is a scholar I admire very much. In this piece, he and his colleague Tobias Rose-Stockwell ask the following question: Is social media a threat to our democracy? Let's read the article and think about their question together.

  13. Nov 2019
    1. Facebook reactions are a nuanced way for each user to express their sentiments on posts. Used over 300 billion times since their launch, Facebook reactions are still popular.
    1. Loading this iframe allows Facebook to know that this specific user is currently on your website. Facebook therefore knows about user browsing behaviour without the user's explicit consent. If more and more websites adopt the Facebook SDK, then Facebook could potentially have a user's full browsing history! And as "with great power comes great responsibility," it's part of our job as developers to protect users' privacy even when they don't ask for it.
    1. In June 2012, Facebook announced it would no longer use its own money system, Facebook Credits.

      Gave up in 2012 on their scam. Why hasn't this been brought up? Especially by regulators?

    1. Found a @facebook #security & #privacy issue. When the app is open it actively uses the camera. I found a bug in the app that lets you see the camera open behind your feed.

      So, the Facebook app uses your camera even while you're not actively using it.

    1. An explosive trove of nearly 4,000 pages of confidential internal Facebook documentation has been made public, shedding unprecedented light on the inner workings of the Silicon Valley social networking giant.

      I can't even start telling you how much schadenfreude I feel at this. Even though this paints a vulgar picture, Facebook are still doing it, worse and worse.

      Talk about hiding in plain sight.

    1. Somewhere in a cavernous, evaporative cooled datacenter, one of millions of blinking Facebook servers took our credentials, used them to authenticate to our private email account, and tried to pull information about all of our contacts. After clicking Continue, we were dumped into the Facebook home page, email successfully “confirmed,” and our privacy thoroughly violated.
    1. In 2013, Facebook began offering a “secure” VPN app, Onavo Protect, as a way for users to supposedly protect their web activity from prying eyes. But Facebook simultaneously used Onavo to collect data from its users about their usage of competitors like Twitter. Last year, Apple banned Onavo from its App Store for violating its Terms of Service. Facebook then released a very similar program, now dubbed variously “Project Atlas” and “Facebook Research.” It used Apple’s enterprise app system, intended only for distributing internal corporate apps to employees, to continue offering the app to iOS users. When the news broke this week, Apple shut down the app and threw Facebook into some chaos when it (briefly) booted the company from its Enterprise Developer program altogether.
    1. Take Facebook, for example. CEO Mark Zuckerberg will stand onstage at F8 and wax poetic about the beauty of connecting billions of people across the globe, while at the same time patenting technologies to determine users' social classes and enable discrimination in the lending process, and allowing housing advertisers to exclude racial and ethnic groups or families with women and children from their listings.
    1. If the apparatus of total surveillance that we have described here were deliberate, centralized, and explicit, a Big Brother machine toggling between cameras, it would demand revolt, and we could conceive of a life outside the totalitarian microscope.
    1. Senior government officials in multiple U.S.-allied countries were targeted earlier this year with hacking software that used Facebook Inc’s (FB.O) WhatsApp to take over users’ phones, according to people familiar with the messaging company’s investigation.
  14. Oct 2019
    1. Facebook said on Wednesday that it expected to be fined up to $5 billion by the Federal Trade Commission for privacy violations. The penalty would be a record by the agency against a technology company and a sign that the United States was willing to punish big tech companies.

      This is where surveillance capitalism brings you.

      Sure, five billion American Dollars won't make much of a difference to Facebook, but it's notable.

    1. The company has sky-high hopes that Libra could become the foundation for a new financial system not controlled by today’s power brokers on Wall Street or central banks.

      Facebook want another way to circumvent government? Well, let's circumvent Facebook.

    2. The cryptocurrency, called Libra, will also have to overcome concern that Facebook does not effectively protect the private information of its users — a fundamental task for a bank or anyone handling financial transactions.
    1. “We are a nation with a tradition of reining in monopolies, no matter how well-intentioned the leaders of these companies may be.”Mr. Hughes went on to describe the power held by Facebook and its leader Mr. Zuckerberg, his former college roommate, as “unprecedented.” He added, “It is time to break up Facebook.”