1,048 Matching Annotations
  1. Jun 2016
  2. May 2016
    1. The Defense Department is building a massive information-sharing system detailing national security personnel and individuals cleared for accessing U.S. secrets, to flag who among them might be potential turncoats or other "insider threats."
  3. Apr 2016
    1. In its default configuration, the CJRS web service (deployed either as an executable jar or as a war file in a servlet container) is configured to use the SLURM JobExecutionService, and directly invokes the ‘srun’, ‘sbatch’, and ‘salloc’ commands available on the host it is running on. A natural consequence of this is that SLURM jobs are submitted under the same user ID as the owner of the CJRS web service process. For the purposes of training and demonstration, it is recommended to deploy the application so that it runs as a single, unprivileged user created specifically for that purpose. In theory, however, anybody who obtains the executable jar file may run it on a machine they have access to, bound to some random high port exclusive to that user, allowing it to launch SLURM jobs on their behalf via the REST API.

      This will likely not be portable to Docker because of the Docker daemon's attack surface; two separate users will be needed: https://docs.docker.com/engine/security/security/#docker-daemon-attack-surface
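
      A minimal sketch of the submission path described above, in Python. The function names and the use of subprocess are illustrative assumptions, not the actual CJRS implementation; the point is that whatever UID owns the process owns every submitted job.

```python
import getpass
import subprocess

def build_sbatch_command(script_path, partition=None):
    """Build the argument list for an 'sbatch' job submission.

    Hypothetical sketch: a service like the one described shells out to
    the SLURM binaries available on its host.
    """
    cmd = ["sbatch"]
    if partition:
        cmd += ["--partition", partition]
    cmd.append(script_path)
    return cmd

def submit(script_path, partition=None):
    # Runs under the service's own UID -- every REST caller's job is
    # submitted as that same user, which is why a dedicated,
    # unprivileged account is recommended.
    return subprocess.run(build_sbatch_command(script_path, partition),
                          capture_output=True, text=True)

if __name__ == "__main__":
    print("service user:", getpass.getuser())
    print(build_sbatch_command("train_job.sh", partition="debug"))
```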

  4. Feb 2016
  5. Jan 2016
    1. Raj: Now, we are back to the feeling of the need for relief, aren’t we, Paul? Which simply means you feel a need to withdraw. We may stop talking in this fashion, but you don’t have to stop checking to see if I am here, and listening for my response. Just notice the feeling of the need for withdrawal into privateness—without judgment. But be aware of it. I will tell you that you can tolerate the active connection longer, and I want you to remember that the reason we are talking is because of your choice. It is not because I am forcing myself upon you. You do not need to withdraw from me. That is the excuse, but the excuse can only make sense if you can be distracted from the fact that you reached out for the connection. You are not, in fact, withdrawing from my embrace of you. You are withdrawing into privateness that does not allow you to experience the fact that you are embracing always! The suggestion is that you are shutting me out. But know that you are shutting yourself in—self-protective withdrawal into isolation for security that doesn’t constitute security, but which constitutes incarceration.

      "Just notice the feeling of the need for withdrawal into privateness—without judgement. But be aware of it."

      Feeling what we feel without judging it.

      "You are withdrawing into privateness that does not allow you to experience the fact that you are embracing always! The suggestion is that you are shutting me out. But know that you are shutting yourself in—self-protective withdrawal into isolation for security that doesn't constitute security, but which constitutes incarceration."

    1. Linode Cloud Service has been under DDoS attack for a few days. Now they've discovered some stolen passwords. It is not yet known whether the same attacker is responsible for both.

      A security investigation into the unauthorized login of three accounts has led us to the discovery of two Linode.com user credentials on an external machine. This implies user credentials could have been read from our database, either offline or on, at some point. ... The entire Linode team has been working around the clock to address both this issue and the ongoing DDoS attacks. We've retained a well-known third-party security firm to aid in our investigation. Multiple Federal law enforcement authorities are also investigating and have cases open for both issues.

  6. Dec 2015
    1. A TOP-SECRET document dated February 2011 reveals that British spy agency GCHQ, with the knowledge and apparent cooperation of the NSA, acquired the capability to covertly exploit security vulnerabilities in 13 different models of firewalls made by Juniper Networks, a leading provider of networking and Internet security gear.

      Matt Blaze, a cryptographic researcher and director of the Distributed Systems Lab at the University of Pennsylvania, said the document contains clues that indicate the 2011 capabilities against Juniper are not connected to the recently discovered vulnerabilities.

      So the NSA and GCHQ (and CIA and FBI, etc) are constantly working to find -- or create -- security flaws wherever they can. Civilians get jail time for things like that. Concern for national security should require them to report flaws they discover to the firms that make the hardware and software. But CISA isn't about security.

    1. Manhattan district attorney Cyrus R. Vance Jr. says that law enforcement agencies want Google and Apple to return to systems without full-disk encryption -- those before iOS 8 and Android Lollipop -- which they could unlock in compliance with a warrant.

      He says that's all they're asking. If that's true, they should be speaking out loudly against mass surveillance and FBI demands for backdoors.

    1. "There has always been a tension in the intelligence community between the intel side that wants to exploit the information from social media and the operational or the policy community that wants to do something to shut it down," Mike Flynn, who directed the Defense Intelligence Agency from 2012 to 2014
    1. Apple CEO Tim Cook has repeatedly and strongly criticized those in government who have demanded backdoors, explaining: “You can’t have a back door in the software because you can’t have a back door that’s only for the good guys.” And a representative of many of the large tech companies recently remarked: “Weakening security with the aim of advancing security simply does not make sense.” Eighty-five percent of cybersecurity experts recently surveyed by Politico called backdoors “a bad idea”. (We know, for example, the NSA in particular loves to prey on foreign phone companies’ backdoors.)
    1. The Senate’s recently passed bill, known as the Cybersecurity Information Sharing Act (CISA), is expected to serve as the basis for the finished language. The compromise text will also likely include elements from a bill that originated in the House Intelligence Committee, observers said. This completed product would mostly sideline the privacy advocate-preferred bill from the House Homeland Security Committee. They believe the Homeland Security bill includes the strongest provisions to protect people’s sensitive data from falling into the NSA's hands. Specifically, the Homeland Security bill would give the greatest role to the Department of Homeland Security (DHS) for collecting cyber threat data from the private sector and disseminating it throughout the government. It’s believed the DHS is best suited to scrub data sets of personal information.

      It seems necessary to encourage -- or force -- industrial and financial firms to share information with the government about hacks and attempted hacks. But that should not be used as license to transfer and collect customer metadata.

    1. "It makes zero sense to lock up this information forever," said Jeremiah Grossman, who founded cybersecurity firm WhiteHat Security. "Certainly there are past breaches that the public should know about, is entitled to know about, and that others can learn from."

      I used to think the most fanciful thing about the movie "WarGames" was not the A.I., but the defense computer connected to a public network. But if industrial control systems can be reached by the Internet or other public lines -- then maybe the government is that stupid.

    1. It is important to note that the path attribute does not protect against unauthorized reading of the cookie from a different path. It can be easily bypassed using the DOM, for example by creating a hidden iframe element with the path of the cookie, then accessing this iframe's contentDocument.cookie property. The only way to protect the cookie is by using a different domain or subdomain, due to the same origin policy.
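
      The scoping (and its limits) can be demonstrated with Python's standard-library cookie handling; the domain and paths below are illustrative. The `path` attribute only controls which requests a cookie is *attached* to, which is why a well-behaved client honors it, yet any same-origin script can sidestep it with the iframe trick described above:

```python
from http.cookiejar import Cookie, CookieJar
from urllib.request import Request

# The Path attribute is request-routing, not an isolation boundary: in
# a browser, any same-origin page can load the scoped path in a hidden
# iframe and read the cookie via iframe.contentDocument.cookie.

def make_cookie(name, value, path):
    return Cookie(
        version=0, name=name, value=value, port=None, port_specified=False,
        domain="example.com", domain_specified=True, domain_initial_dot=False,
        path=path, path_specified=True, secure=False, expires=None,
        discard=True, comment=None, comment_url=None, rest={},
    )

def cookies_sent_to(jar, url):
    """Return the Cookie header a client would send to this URL."""
    req = Request(url)
    jar.add_cookie_header(req)
    return req.get_header("Cookie")

jar = CookieJar()
jar.set_cookie(make_cookie("session", "s3cret", "/admin"))

# Sent only to the matching path -- but that is a courtesy of the
# client, not a guarantee of confidentiality within the origin.
print(cookies_sent_to(jar, "http://example.com/admin/panel"))
print(cookies_sent_to(jar, "http://example.com/public/"))
```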
  7. Nov 2015
    1. Businesses need to be more careful to avoid revealing customers' personal information. They should also record customer-service calls and review them collectively over time for signs of suspicious activity.

      The harasser in this article tricked customer service representatives into giving him private details about his victims. Starting with whatever information he could find online (a birthdate, the name of a pet) he would call repeatedly until he succeeded in getting other details -- which would make him still more convincing, so he could get more details.

      In one case, he pretended to be a company technician for ISP Cox Communications. They didn't have a procedure to verify the ID of their own technicians?

      (Social engineering)

    1. The call for backdoors is nothing new. During my career in the private sector, I’ve seen requests to backdoor encryption software so as to please potential investors, and have seen people in the field who appeared to stand for secure software balk under the excuse of “if that’s what the customer wants,” even if it results in irreparable security weaknesses. I’ve had well-intentioned intelligence officers ask me informally, out of honest curiosity, why it is that I would refuse to insert backdoors. The issue is that cryptography depends on a set of mathematical relationships that cannot be subverted selectively. They either hold completely or not at all. It’s not something that we’re not smart enough to do; it’s something that’s mathematically impossible to do. I cannot backdoor software specifically to spy on jihadists without this backdoor applying to every single member of society relying on my software.
    2. When you make a credit card payment or log into Facebook, you’re using the same fundamental encryption that, in another continent, an activist could be using to organize a protest against a failed regime. ... If a terrorist is suspected of using a Toyota as a car bomb, it’s not reasonable to expect Toyota to start screening who it sells cars to, or to stop selling cars altogether. ... The brouhaha that has ensued from the press has been extreme. ... A Wired article, like many alongside it, finds an Arabic PDF guide on encryption and immediately attributes it as an “ISIS encryption training manual” even though it was written years ago by Gaza activists with no affiliation to any jihadist group.

    1. All new Dell laptops and desktops shipped since August 2015 contain a serious security vulnerability that exposes users to online eavesdropping and malware attacks.

      "At issue is a root certificate installed on newer Dell computers that also includes the private cryptographic key for that certificate. Clever attackers can use this key from Dell to sign phony browser security certificates for any HTTPS-protected site."

    1. Another provision of the proposed Investigatory Powers Bill is that internet service providers (ISPs) must retain a record of all the websites you visit (more specifically, all the IP addresses you connect to) for one year. This appears to be another measure to weaken privacy while strengthening security – but in fact, it is harmful to both privacy and security. In order to maintain a record of every website you have visited in the last year, the ISP must store that information somewhere accessible. Information that is stored somewhere accessible will sooner or later be stolen by attackers.
    2. I’ll say it again, to be absolutely clear: any mechanism that can allow law enforcement legitimate access to data can inevitably be abused by hostile foreign intelligence services, and even technically sophisticated individuals, to break into systems and gain unauthorised access to the same data.
    3. Any method that provides exceptional access immediately exposes the system to attacks by malicious parties, rendering the protection of encryption essentially worthless. Exceptional access would probably require that government departments have some kind of master keys that allowed them to decrypt any communication if required. Those master keys would obviously have to be kept extremely secret: if they were to become public, the entire security infrastructure of the internet would crumble into dust. How good are government agencies at keeping secrets?
    1. Every three years, the Librarian of Congress issues new rules on Digital Millennium Copyright Act exemptions. Acting Librarian David Mao, in an order (PDF) released Tuesday, authorized the public to tinker with software in vehicles for "good faith security research" and for "lawful modification." The decision comes in the wake of the Volkswagen scandal, in which the German automaker baked bogus code into its software that enabled the automaker's diesel vehicles to reduce pollutants below acceptable levels during emissions tests.
  8. Oct 2015
  9. Jun 2015
  10. May 2015
  11. Feb 2015
  12. Jan 2015
    1. It’s primarily from data and not their algorithms that powerful companies currently derive their advantages, and the only way to curb that power is to take the data completely out of the market realm, so that no company can own them. Data would accrue to citizens, and could be shared at various social levels. Companies wanting to use them would have to pay some kind of licensing fee, and only be able to access attributes of the information, not the entirety of it.

      Yes, well at present the security services are complicit with the present economic and legislative model, and this makes imagining any change to existing structures very difficult, because such changes will be resisted by the rather shadowy security services. Cameron does a deal with them: he makes a point somewhat in support of their agenda, in return for which he bigs up his position on security at the cost of looking an idiot - not a huge cost for a politician, it seems.

    2. But if you turn data into a money-printing machine for citizens, whereby we all become entrepreneurs, that will extend the financialization of everyday life to the most extreme level, driving people to obsess about monetizing their thoughts, emotions, facts, ideas—because they know that, if these can only be articulated, perhaps they will find a buyer on the open market. This would produce a human landscape worse even than the current neoliberal subjectivity. I think there are only three options. We can keep these things as they are, with Google and Facebook centralizing everything and collecting all the data, on the grounds that they have the best algorithms and generate the best predictions, and so on. We can change the status of data to let citizens own and sell them. Or citizens can own their own data but not sell them, to enable a more communal planning of their lives. That’s the option I prefer.

      Very well thought out. The author obviously knows about the read/write web, TLS certificate issues, etc. But what does "neoliberal subjectivity" mean? An interesting phrase.

  13. Nov 2014
    1. This criterion requires that an independent security review has been performed within the 12 months prior to evaluation. This review must cover both the design and the implementation of the app and must be performed by a named auditing party that is independent of the tool's main development team. Audits by an independent security team within a large organization are sufficient. Recognizing that unpublished audits can be valuable, we do not require that the results of the audit have been made public, only that a named party is willing to verify that the audit took place.
  14. Mar 2014
    1. An authenticity infrastructure is only needed when there is no way to have advance knowledge of what SSL certificate a client should expect to see -- but your app knows where it will be connecting, and it knows exactly what it should expect.

      Succinct way to highlight this distinction.

    2. Google is already doing this. They have an “app” called Chrome, and when their app makes SSL connections to their own services, it checks to make sure that the certificates it sees are the ones it knows Google is using. They call this “pinning,” and you should do it for your mobile apps.
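
      A sketch of what pinning amounts to, in Python. The pinned value below is a placeholder, not any real fingerprint, and real apps often pin the public-key (SPKI) hash rather than the whole certificate; this is just the shape of the check:

```python
import hashlib
import ssl

# Placeholder pin -- in a real app this SHA-256 fingerprint is baked in
# at build time, taken from the certificate you know your server uses.
PINNED_FINGERPRINT = "00" * 32

def fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()

def pin_matches(der_cert: bytes, pinned: str) -> bool:
    # Unlike ordinary CA validation, this accepts exactly one
    # certificate, so a phony cert signed by any trusted CA still fails.
    return fingerprint(der_cert) == pinned

def connect_with_pin(host: str, port: int = 443,
                     pinned: str = PINNED_FINGERPRINT) -> bool:
    # Fetch the certificate the server actually presents and refuse the
    # connection unless it is the one we expected.
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    return pin_matches(der, pinned)
```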
  15. Sep 2013
    1. Much as it is not the criminal defense lawyer's place to judge their client regardless of how guilty they are, it is not the doctor's place to force experimental treatment upon a patient regardless of how badly the research is needed, and it is not the priest's place to pass worldly judgement on their flock, it is not the programmer's place to try and decide whether the user is using the software in a "good" way or not.

      Taking this to heart / putting it on my wall.