26 Matching Annotations
  1. Aug 2020
    1. In a typical election setting with secret ballots, we need: enforced secrecy, a way for each voter to cast a ballot secretly, with no way to prove how they voted (lest they be unduly influenced); individual verifiability, a way for each voter to gain confidence that their own vote was correctly recorded and counted; and global verifiability, a way for everyone to gain confidence that all votes were correctly counted and that only eligible voters cast a ballot.

      The requirements of the ideal voting system are:

      1. Enforced secrecy — Each voter can be sure their vote cannot be tied to their identity.
      2. Individual verifiability — Each voter can verify their vote was cast and counted.
      3. Global verifiability — Everyone can verify that all votes were correctly counted and that only eligible voters cast ballots.
  2. Jul 2020
    1. prevent its disclosure to any person not authorized to create the subscriber's digital signature

      So another entity may create the subscriber's digital signature, provided that entity was authorized beforehand.

      So this would permit a statement such as: "I authorize [organization] to create a cryptographic key-pair on my behalf, and to create the digital signature."

  3. Jun 2020
    1. In contrast, theorems in Abstract Cryptography can be proved at a (high) level of abstraction without the instantiation of the lower levels. The lower levels inherit these theorems if they satisfy the postulated axioms of the higher level. Each abstraction level can thus focus on specific aspects, such as composability or efficiency.


    1. When you make a call using Signal, it will generate a two-word secret code on both profiles. You will speak the first word and the recipient will check it. Then he will speak the second word and you can check it on your end. If both words match, the call has not been intercepted and is connected to the correct profile.
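The check described above is a short-authentication-string (SAS) comparison: both endpoints derive the same words from the shared call secret, so an interceptor who substituted keys would produce mismatching words. A minimal sketch, assuming a hypothetical word list and session key (Signal's real derivation and word list differ):

```python
import hashlib

# Hypothetical word list; the real client ships a much larger one.
WORDS = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot",
         "golf", "hotel", "india", "juliet", "kilo", "lima"]

def two_word_code(session_key: bytes) -> tuple:
    """Map the shared call secret to two words; both ends compute this."""
    digest = hashlib.sha256(session_key).digest()
    return (WORDS[digest[0] % len(WORDS)], WORDS[digest[1] % len(WORDS)])

# Both endpoints hold the same session key, so the words match;
# a man-in-the-middle holds different keys with each side, so they don't.
caller = two_word_code(b"shared-session-key")
callee = two_word_code(b"shared-session-key")
assert caller == callee
```

The security rests on the spoken channel: an attacker relaying the call cannot easily forge both voices saying the right words in real time.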
  4. Apr 2020
    1. Rails does not issue the same stored token with every form. Neither does it generate and store a different token every time. It generates and stores a cryptographic hash in a session and issues new cryptographic tokens, which can be matched against the stored one, every time a page is rendered.
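One way to get a fresh token per form that still matches a single stored session value is masking: pair the session token with a random one-time pad. This is a sketch loosely modeled on Rails' masked CSRF tokens; the function names are illustrative, not Rails API:

```python
import hmac
import os

def mask_token(raw: bytes) -> bytes:
    """Issue a fresh per-form token: pad || (pad XOR raw)."""
    pad = os.urandom(len(raw))
    return pad + bytes(p ^ r for p, r in zip(pad, raw))

def valid(masked: bytes, raw: bytes) -> bool:
    """Unmask and compare in constant time against the stored session value."""
    pad, body = masked[:len(raw)], masked[len(raw):]
    unmasked = bytes(p ^ b for p, b in zip(pad, body))
    return hmac.compare_digest(unmasked, raw)

session_token = os.urandom(32)   # stored once, server-side, in the session
t1 = mask_token(session_token)
t2 = mask_token(session_token)
assert t1 != t2                                        # each form differs
assert valid(t1, session_token) and valid(t2, session_token)
```

Because every rendered form carries a differently masked token, an attacker who captures one token learns nothing reusable, yet the server can still verify all of them against the one stored value.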
    1. Google says this technique, called "private set intersection," means you don't get to see Google's list of bad credentials, and Google doesn't get to learn your credentials, but the two can be compared for matches.
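A common way to realize private set intersection is commutative blinding: each party raises hashed items to a secret exponent, and doubly blinded values match exactly on the intersection. This is a toy sketch of the Diffie-Hellman-style idea (the group, hashing, and message flow here are simplified assumptions, not Google's actual protocol):

```python
import hashlib
import secrets

P = 2**521 - 1  # a Mersenne prime; a toy group, not a vetted parameter

def h(item: str) -> int:
    """Hash an item into the group."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def blind(items, secret):
    """Raise each hashed item to a private exponent."""
    return {pow(h(x), secret, P) for x in items}

a = secrets.randbelow(P - 2) + 1   # client's secret exponent
b = secrets.randbelow(P - 2) + 1   # server's secret exponent

leaked = {"hunter2", "password1"}              # server's breached set
mine   = {"hunter2", "correct horse battery"}  # client's credentials

# Each side blinds its own items; the other side blinds them again.
# Since (h^a)^b == (h^b)^a, only common items collide.
server_side = {pow(v, b, P) for v in blind(leaked, a)}
client_side = {pow(v, a, P) for v in blind(mine, b)}
matches = server_side & client_side
assert len(matches) == 1  # only "hunter2" is in both sets
```

Neither side ever sees the other's raw items, only blinded group elements, yet the overlap is revealed by equality of the doubly blinded values.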
  5. Dec 2019
    1. One of the more clever aspects of the agent is how it can verify a user's identity (or more precisely, possession of a private key) without revealing that private key to anybody.
    2. It's very important to understand that private keys never leave the agent: instead, the clients ask the agent to perform a computation based on the key, and it's done in a way which allows the agent to prove that it has the private key without having to divulge the key itself
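The mechanism behind this is challenge-response with a signature: the client sends a challenge, the agent signs it with the private key, and the server verifies with the public key alone. A toy sketch using textbook RSA with tiny primes (illustrative only; the real ssh-agent uses proper key sizes and padding):

```python
import hashlib
import secrets

# Toy RSA keypair (tiny primes, for illustration only -- never use in practice).
p, q, e = 2003, 2459, 65537
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, held by the agent

def agent_sign(challenge: bytes) -> int:
    """The agent signs with d; the private exponent never leaves this function."""
    digest = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(digest, d, n)

# The verifier needs only the public key (n, e).
challenge = secrets.token_bytes(16)
sig = agent_sign(challenge)
digest = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
assert pow(sig, e, n) == digest  # possession of d proven, d never revealed
```

The verifier learns that whoever answered holds the private key, but the only data that crossed the boundary was the signature over a fresh challenge.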
  6. Nov 2019
  7. Aug 2019
  8. Jul 2019
    1. This tip will help the reader understand how, using C# .NET and the Bouncy Castle library, one can encrypt and decrypt data with Elliptic Curve Cryptography.

      Unfortunately this example is for Bouncy Castle in C#, but if we hope for compatibility between libraries, Java should follow a similar pattern.

  9. Dec 2016
  10. Feb 2016
  11. Dec 2015
    1. Representatives of the White House seemed to listen attentively, but shared little about their thoughts. They maintained that President Obama’s position has not changed in the last few months. While they seemed well aware of our concerns about the technical infeasibility of inserting backdoors, they didn’t necessarily share them. That worried us a great deal.
    1. this week’s announcement by Google that a machine made by a Canadian company, D-Wave Systems, which is marketed as “the world’s first commercial quantum computer”, had shown spectacular speed gains over conventional computers. “For a specific, carefully crafted proof-of-concept problem,” Google’s Hartmut Neven reported, “we achieved a 100-million-fold speed-up.”
  12. Nov 2015
    1. In this rush to blame a field that is largely unknowable to the public and therefore at once alluring and terrifying, little attention has been paid to facts: The Paris terrorists did not use encryption, but coordinated over SMS, one of the easiest to monitor methods of digital communication. They were still not caught, indicating a failure in human intelligence and not in a capacity for digital surveillance.
    2. The call for backdoors is nothing new. During my career in the private sector, I’ve seen requests to backdoor encryption software so as to please potential investors, and have seen people in the field who appeared to stand for secure software balk under the excuse of “if that’s what the customer wants,” even if it results in irreparable security weaknesses. I’ve had well-intentioned intelligence officers ask me informally, out of honest curiosity, why it is that I would refuse to insert backdoors. The issue is that cryptography depends on a set of mathematical relationships that cannot be subverted selectively. They either hold completely or not at all. It’s not something that we’re not smart enough to do; it’s something that’s mathematically impossible to do. I cannot backdoor software specifically to spy on jihadists without this backdoor applying to every single member of society relying on my software.
    3. When you make a credit card payment or log into Facebook, you’re using the same fundamental encryption that, in another continent, an activist could be using to organize a protest against a failed regime.<br> ...<br> If a terrorist is suspected of using a Toyota as a car bomb, it’s not reasonable to expect Toyota to start screening who it sells cars to, or to stop selling cars altogether.<br> ...<br> The brouhaha that has ensued from the press has been extreme. ... A Wired article, like many alongside it, finds an Arabic PDF guide on encryption and immediately attributes it as an “ISIS encryption training manual” even though it was written years ago by Gaza activists with no affiliation to any jihadist group.

  13. Oct 2015
    1. Nearly all applications of probability to cryptography depend on the factor principle (or Bayes' Theorem).

      This is easily the most interesting sentence in the paper: Turing used Bayesian analysis for code-breaking during WWII.
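The factor principle is Bayes' Theorem stated in odds form: the posterior odds on a hypothesis equal the prior odds multiplied by the likelihood ratio (the "factor") contributed by the evidence. In symbols:

```latex
\underbrace{\frac{P(H \mid E)}{P(\lnot H \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(H)}{P(\lnot H)}}_{\text{prior odds}}
\;\times\;
\underbrace{\frac{P(E \mid H)}{P(E \mid \lnot H)}}_{\text{Bayes factor}}
```

This multiplicative form is what made it practical for code-breaking: independent pieces of evidence simply multiply their factors (or, in Turing's "ban" units, add their logarithms).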

  14. Apr 2015
    1. This post discusses the relative merits and dangers of different compositions of message authentication and encryption.
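The composition usually recommended is encrypt-then-MAC: authenticate the ciphertext, and verify the tag before decrypting anything. A minimal sketch using Python's standard library, with a toy counter-mode keystream standing in for a real cipher (the keystream construction is an illustrative assumption, not a vetted design):

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 (illustrative, not vetted)."""
    out = b""
    for ctr in range((n + 31) // 32):
        out += hashlib.sha256(key + nonce + ctr.to_bytes(4, "big")).digest()
    return out[:n]

def seal(enc_key: bytes, mac_key: bytes, msg: bytes) -> bytes:
    nonce = os.urandom(16)
    ct = bytes(m ^ k for m, k in zip(msg, keystream(enc_key, nonce, len(msg))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag  # the MAC covers the ciphertext, not the plaintext

def open_(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("bad tag")  # reject before touching the ciphertext
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

ek, mk = os.urandom(32), os.urandom(32)
blob = seal(ek, mk, b"attack at dawn")
assert open_(ek, mk, blob) == b"attack at dawn"
```

MAC-ing the ciphertext (rather than the plaintext, or encrypting the MAC) lets the receiver discard forged or tampered messages without ever running the decryption path, which closes off padding-oracle-style attacks.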

  15. Jan 2015
    1. But if you turn data into a money-printing machine for citizens, whereby we all become entrepreneurs, that will extend the financialization of everyday life to the most extreme level, driving people to obsess about monetizing their thoughts, emotions, facts, ideas—because they know that, if these can only be articulated, perhaps they will find a buyer on the open market. This would produce a human landscape worse even than the current neoliberal subjectivity. I think there are only three options. We can keep these things as they are, with Google and Facebook centralizing everything and collecting all the data, on the grounds that they have the best algorithms and generate the best predictions, and so on. We can change the status of data to let citizens own and sell them. Or citizens can own their own data but not sell them, to enable a more communal planning of their lives. That’s the option I prefer.

      Very well thought out. The author obviously must know about the read-write web, TLS certificate issues, etc. But what does neoliberal subjectivity mean? An interesting phrase.