82 Matching Annotations
  1. Last 7 days
    1. Identify your user agents When deploying software that makes requests to other sites, you should set a custom User-Agent header to identify the software and provide a means to contact its maintainers. Many of the automated requests we receive have generic user-agent headers such as Java/1.6.0 or Python-urllib/2.1 which provide no information on the actual software responsible for making the requests.
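      For example, a minimal sketch in Node.js of sending a descriptive User-Agent instead of the runtime default (the bot name, URL, and mailto address here are all hypothetical):

        // Identify the software and give maintainers a way to reach you.
        const USER_AGENT =
          "ExampleBot/1.2 (+https://example.org/bot; mailto:ops@example.org)";

        async function fetchPage(url: string): Promise<string> {
          const res = await fetch(url, {
            // Overrides generic defaults such as "Java/1.6.0" or "Python-urllib/2.1".
            headers: { "User-Agent": USER_AGENT },
          });
          return res.text();
        }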
  2. Oct 2020
    1. Perhaps we should detect URLSearchParams objects differently (using duck-typing detection instead of instanceof window.URLSearchParams, for example), but the solution isn't adding a specific polyfill to Axios (as it'd increase the bundle size and still wouldn't work with other polyfills).
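      A sketch of what duck-typing detection could look like (not Axios's actual implementation): accept any object with the URLSearchParams shape, so polyfilled or cross-realm instances pass too.

        // Duck-typing check: rely on the object's shape, not its constructor.
        function isURLSearchParams(value: unknown): boolean {
          const v = value as { append?: unknown; getAll?: unknown; toString?: unknown } | null;
          return (
            v != null &&
            typeof v.append === "function" &&
            typeof v.getAll === "function" &&
            typeof v.toString === "function"
          );
        }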
    1. Sometimes we can’t implement a solution that’s fully spec-compliant, and in those cases using a polyfill might be the wrong answer. A polyfill would translate into telling the rest of the codebase that it’s okay to use the feature, that it’ll work just like in modern browsers, but it might not in edge cases.
  3. Sep 2020
    1. LOD was defined as x̄_bi + k·s_bi, where x̄_bi is the mean of the no-template controls, s_bi is the s.d. of the no-template controls, and k = 2.479 (99% confidence interval)

      ddPCR
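      A quick numeric sketch of that definition (the replicate values and units are invented for illustration):

        // LOD = mean(NTC) + k * sd(NTC), with k = 2.479 (99% CI).
        function lod(ntcReplicates: number[], k = 2.479): number {
          const n = ntcReplicates.length;
          const mean = ntcReplicates.reduce((a, b) => a + b, 0) / n;
          const sd = Math.sqrt(
            ntcReplicates.reduce((a, b) => a + (b - mean) ** 2, 0) / (n - 1)
          );
          return mean + k * sd;
        }
        // e.g. lod([0.1, 0.3, 0.2, 0.2]) ≈ 0.2 + 2.479 * 0.082 ≈ 0.40 copies/µL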

    1. López, J. A. M., Arregui-Garcĺa, B., Bentkowski, P., Bioglio, L., Pinotti, F., Boëlle, P.-Y., Barrat, A., Colizza, V., & Poletto, C. (2020). Anatomy of digital contact tracing: Role of age, transmission setting, adoption and case detection. MedRxiv, 2020.07.22.20158352. https://doi.org/10.1101/2020.07.22.20158352

  4. Aug 2020
  5. Jul 2020
    1. LoD = LoB + 1.645 × SD(low-concentration sample)

      LoD is the lowest analyte concentration likely to be reliably distinguished from the LoB and at which detection is feasible. LoD is determined by utilising both the measured LoB and test replicates of a sample known to contain a low concentration of analyte.

      LoB is the highest apparent analyte concentration expected to be found when replicates of a blank sample containing no analyte are tested.

      LoB = mean_blank + 1.645 × SD_blank
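      Putting the two formulas together in a short sketch (the sample arrays a caller would pass in are invented):

        // LoB from blank replicates; LoD from LoB plus the spread of a known
        // low-concentration sample (both use the one-sided 1.645 factor).
        const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
        const sd = (xs: number[]) => {
          const m = mean(xs);
          return Math.sqrt(xs.reduce((a, b) => a + (b - m) ** 2, 0) / (xs.length - 1));
        };

        const loB = (blanks: number[]) => mean(blanks) + 1.645 * sd(blanks);
        const loD = (blanks: number[], lowSample: number[]) =>
          loB(blanks) + 1.645 * sd(lowSample);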

    1. Zhong, H., Wang, Y., Shi, Z., Zhang, L., Ren, H., He, W., Zhang, Z., Zhu, A., Zhao, J., Xiao, F., Yang, F., Liang, T., Ye, F., Zhong, B., Ruan, S., Gan, M., Zhu, J., Li, F., Li, F., … Zhao, J. (2020). Characterization of Microbial Co-infections in the Respiratory Tract of hospitalized COVID-19 patients. MedRxiv, 2020.07.02.20143032. https://doi.org/10.1101/2020.07.02.20143032

    1. Davis, J. T., Chinazzi, M., Perra, N., Mu, K., Piontti, A. P. y, Ajelli, M., Dean, N. E., Gioannini, C., Litvinova, M., Merler, S., Rossi, L., Sun, K., Xiong, X., Halloran, M. E., Longini, I. M., Viboud, C., & Vespignani, A. (2020). Estimating the establishment of local transmission and the cryptic phase of the COVID-19 pandemic in the USA. MedRxiv, 2020.07.06.20140285. https://doi.org/10.1101/2020.07.06.20140285

  6. Jun 2020
  7. May 2020
    1. Chu, H. Y., Englund, J. A., Starita, L. M., Famulare, M., Brandstetter, E., Nickerson, D. A., Rieder, M. J., Adler, A., Lacombe, K., Kim, A. E., Graham, C., Logue, J., Wolf, C. R., Heimonen, J., McCulloch, D. J., Han, P. D., Sibley, T. R., Lee, J., Ilcisin, M., … Bedford, T. (2020). Early Detection of Covid-19 through a Citywide Pandemic Surveillance Platform. New England Journal of Medicine, NEJMc2008646. https://doi.org/10.1056/NEJMc2008646

  8. Apr 2020
    1. Adams, E. R., Anand, R., Andersson, M. I., Auckland, K., Baillie, J. K., Barnes, E., Bell, J., Berry, T., Bibi, S., Carroll, M., Chinnakannan, S., Clutterbuck, E., Cornall, R. J., Crook, D. W., Silva, T. D., Dejnirattisai, W., Dingle, K. E., Dold, C., Eyre, D. W., … Sanchez, V. (2020). Evaluation of antibody testing for SARS-Cov-2 using ELISA and lateral flow immunoassays. MedRxiv, 2020.04.15.20066407. https://doi.org/10.1101/2020.04.15.20066407

  9. Feb 2020
  10. Sep 2019
  11. Mar 2018
    1. J.M. Berger Former Brookings Expert

      Paying attention to the qualifications of the author(s)/composer(s) is another crucial step in crap detection, as it helps discern whether to take the piece seriously or to use it for further research.

    2. Markaz

      In the text, Rheingold explains the importance of paying attention to the website layout as well as the content. In doing so, however, you must tune your crap detection and remember that not everything with a fancy layout is reliable, and vice versa.

    3. I took a detailed look at how ISIS functions online, breaking it down into a five-part template, which can be implemented in different ways depending on the target’s disposition:

      Rather than simply stating information, the author (Berger) explains his source and the way in which he broke his research down into smaller categories. This citation is also a part of crap detection, drawing on a reliable source.

    4. detected through social media analysis,

      Embedding this specific link gives important attribution and increases source reliability. The text makes a statement and backs it up with an external, secure source.

    5. there are practical and ethical limits to how much we can interdict discovery.

      Though Rheingold stresses the importance of crap detection and researching your sources, he accepts that there are limits to how far we can discern validity. This is shown as the ISIS busters reach the ethical and practical limits of their search. The point is that one mustn't get overwhelmed with tracing the true origin of a source, because you can only go so far.

    6. stripping away the mystique and focusing on the mechanics.

      Rheingold stresses the importance of looking at the base of things: rather than stopping at the surface and what you see initially, it is important to dig deeper and examine sources from a questioning yet structured angle.

    7. Monday, November 9, 2015

      The article's domain ends in '.edu', which, as Rheingold states, increases the estimation of its credibility.

    8. This post originally appeared on VOX-Pol.

      The post originates from a non-secure site that appears a tad amateur, which invites speculation. It is a blog site, and considering this, I take what is posted with a grain of salt.

    9. How does ISIS acquire new recruits online and convince them to take action? J.M. Berger explains, arguing that efforts to counter terrorists’ online activity can be more effective if the mechanics are clearly understood.

      I begin critiquing this article based on Rheingold's initial conversation with his daughter. In the text, Rheingold suggests using a free internet service, Whois, to check the validity of a source. After plugging this domain name into the site, I find that the registered owner is 'Educause', a nonprofit that runs a core data service for research and analysis.

    10. How terrorists recruit online (and how to stop it)

      I will be connecting this text to Howard Rheingold's "Crap Detection 101" from chapter 2 of his book Net Smart: How to Thrive Online. This allows for a further critique of this article in terms of that theme.

  12. Feb 2017
    1. how it uses zones

      Does anyone have an authoritative link for this concept of zones and how they work? It'd be much appreciated.

  13. Jan 2017
    1. Early event detection problems can go here. Two example cases that come to mind: 1) in emergency response, detecting a disaster quickly is important; 2) in computational journalism, many locals suddenly starting to talk about an event means something newsworthy is going on.
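      One toy way to operationalize the second case (the counts, window, and threshold are arbitrary placeholders):

        // Flag a burst when the current window's mention count sits several
        // standard deviations above the recent baseline.
        function isBurst(baselineCounts: number[], current: number, z = 3): boolean {
          const n = baselineCounts.length;
          const mean = baselineCounts.reduce((a, b) => a + b, 0) / n;
          const sd = Math.sqrt(
            baselineCounts.reduce((a, b) => a + (b - mean) ** 2, 0) / n
          );
          return sd > 0 && (current - mean) / sd >= z;
        }
        // isBurst([4, 6, 5, 5, 4], 40) === true → potentially newsworthy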

  14. Nov 2016
    1. Finally, by assuming the non-detection of a species to indicate absence from a given grid cell, we introduced an extra level of error into our models. This error depends on the probability of false absence given imperfect detection (i.e., the probability that a species was present but remained undetected in a given grid cell [73]): the higher this probability, the higher the risk of incorrectly quantifying species-climate relationships [73].

      This will be an ongoing challenge for species distribution modeling, because most of the data appropriate for these purposes is not collected in such a way as to allow the straightforward application of standard detection probability/occupancy models. This could potentially be addressed by developing models for detection probability based on species and habitat type. These models could be built on smaller/different datasets that include the required data for estimating detectability.
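      As a back-of-the-envelope illustration of why this matters (the per-survey detection probability p here is hypothetical): under a constant detectability assumption, the probability of a false absence after n surveys is (1 − p)^n.

        // P(species present but never detected in n surveys) = (1 - p)^n
        function falseAbsenceProb(p: number, nSurveys: number): number {
          return Math.pow(1 - p, nSurveys);
        }
        // e.g. p = 0.3 per survey: one survey → 0.70, five surveys → 0.17,
        // so sparse grid-cell data can carry a substantial false-absence risk.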

  15. Nov 2015
    1. Presentation summarizing an approach to duplicate web page detection that was developed by a researcher whilst at Google in the early 2000s

  16. Sep 2015
  17. arxiv.org
    1. Given an LSH family H, the LSH scheme amplifies the gap between the high probability P1 and the low probability P2 by concatenating several functions

      Useful recap of LSH
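      The amplification trick in one formula (parameters below are chosen only for illustration): concatenating k functions into a band (AND) and using L bands (OR) turns a base collision probability p into 1 − (1 − p^k)^L.

        // AND-OR amplification of an LSH family.
        function amplifiedCollisionProb(p: number, k: number, L: number): number {
          return 1 - Math.pow(1 - Math.pow(p, k), L);
        }
        // With P1 = 0.8, P2 = 0.4, k = 5, L = 20:
        //   amplifiedCollisionProb(0.8, 5, 20) ≈ 0.9996  (similar pairs still collide)
        //   amplifiedCollisionProb(0.4, 5, 20) ≈ 0.186   (the P1/P2 gap is widened)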

    2. Recent survey paper for hashing-based approaches to similarity search

    1. This paper has a very useful overview of previous work that is worth reading under section 9.

    2. We used the following publicly available real datasets in the experiment

      Datasets used are DBLP, ENRON, and UNIREF-4GRAM. All are small (<1M records) in web terms and, I would guess, all have small document sizes.

      Given a lengthy paper, one could potentially divide it into smaller documents (1 doc === 1 page) and do the signature calculation on a per-page basis. This could have the benefit of bounding the search time by limiting the number of pages that need to be rendered to text in order to start the lookup process.
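      A rough sketch of that per-page idea (toy hash and tiny signature; a real system would use a proper MinHash family and shingling):

        // One signature per page, so a lookup can be bounded by pages
        // rather than whole documents.
        function hashToken(token: string, seed: number): number {
          let h = seed >>> 0;
          for (let i = 0; i < token.length; i++) {
            h = Math.imul(h ^ token.charCodeAt(i), 2654435761) >>> 0;
          }
          return h;
        }

        function minhash(tokens: string[], numHashes = 64): number[] {
          const sig = new Array(numHashes).fill(0xffffffff);
          for (const t of tokens) {
            for (let j = 0; j < numHashes; j++) {
              const h = hashToken(t, j + 1);
              if (h < sig[j]) sig[j] = h;
            }
          }
          return sig;
        }

        // 1 doc === 1 page: one signature per page of a long paper.
        const perPageSignatures = (pages: string[]) =>
          pages.map((p) => minhash(p.toLowerCase().split(/\s+/).filter(Boolean)));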