34 Matching Annotations
  1. Sep 2024
  2. Jul 2024
    1. NAFTA displays the classic free-trade quandary: Diffuse benefits with concentrated costs.

      for - key insight - free trade - from - Backfire: How the Rise of Neoliberalism Facilitated the Rise of The Far-Right

      quote - free trade - (see below)

      key insight - free trade
      - NAFTA displays the classic free-trade quandary:
      - Diffuse benefits with
      - concentrated costs
      - While the economy as a whole may have seen a slight boost,
      - certain sectors and communities experienced profound disruption.
      - A town in the Southeast loses hundreds of jobs when a textile mill closes,
      - but hundreds of thousands of people find their clothes marginally cheaper.
      - Depending on how you quantify it, the overall economic gain is probably greater but barely perceptible at the individual level;
      - the overall economic loss is small in the grand scheme of things,
      - but devastating for those it affects directly.

      from - Backfire: How the Rise of Neoliberalism Facilitated the Rise of The Far-Right - https://hyp.is/F6XYujyREe-TaldInE8OGA/scholarworks.arcadia.edu/cgi/viewcontent.cgi?article=1066&context=thecompass

  3. Jun 2023
  4. Dec 2022
  5. Sep 2022
    1. tions will not always fit without inconvenience into their proper place; and the scheme of classification, once adopted, is rigid, and can only be modified with difficulty. Many librarians used to draw up their catalogues on this plan, which is now universally condemned.

      Others, well understanding the advantages of systematic classification, have proposed to fit their materials, as fast as collected, into their appropriate places in a prearranged scheme. For this purpose they use notebooks of which every page has first been provided with a heading. Thus all the entries of the same kind are close to one another. This system leaves something to be desired; for addi

      The use of a commonplace method for historical research is marked as a poor choice because:
      The topics with similar headings may be close together, but ideas may not ultimately fit into their pre-allotted spaces.
      The classification system may be too rigid as ideas change and get modified over time.

      They mention that librarians used to catalog books with this method, but realized that their system would be out of date almost immediately. (I've got some notes on this particular idea to which this could be directly linked as evidence.)

  6. Aug 2022
    1. Using git remote set-head has the advantage of updating a cached answer, which you can then use for some set period. A direct query with git ls-remote has the advantage of getting a fresh answer and being fairly robust. The git remote show method seems like a compromise between these two that has most of their disadvantages with few of their advantages, so that's the one I would avoid.
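
      What follows is a minimal sketch, not the commenter's code: it assumes Node.js with git on the PATH and a remote named "origin", and the helper name is mine. It uses the "fresh answer" approach via git ls-remote; the cached alternative would be git remote set-head origin --auto, after which refs/remotes/origin/HEAD can be read locally.

      ```typescript
      // Sketch: ask the remote directly for its default branch (the "fresh answer" route).
      // Assumptions: Node.js, git available on PATH, a remote named "origin".
      import { execSync } from "node:child_process";

      function remoteDefaultBranch(remote: string = "origin"): string {
        // `git ls-remote --symref <remote> HEAD` prints a line such as:
        //   ref: refs/heads/main    HEAD
        const out = execSync(`git ls-remote --symref ${remote} HEAD`, { encoding: "utf8" });
        const match = out.match(/^ref:\s+refs\/heads\/(\S+)\s+HEAD/m);
        if (!match) {
          throw new Error(`Could not determine the default branch of ${remote}`);
        }
        return match[1];
      }

      console.log(remoteDefaultBranch()); // e.g. "main" or "master"
      ```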
  7. Jun 2022
  8. Jan 2022
  9. Sep 2021
    1. Melamine is considered the black sheep of the sheet goods’ family by most carpenters. Typically because it creates a lower quality cabinet than other materials. But, also because it is so darn hard to construct with without getting chips. However, melamine does have a place and a purpose, and if you know how to build with melamine, you can produce some budget-friendly spaces.
  10. Aug 2021
  11. Jun 2021
    1. That’s not the only way of writing end-to-end tests in Rails. For example, you can use the Cypress JS framework and IDE. The only thing stopping me from trying this approach is the lack of multiple-session support, which is required for testing real-time applications (i.e., those with AnyCable 😉).
    1. I'm not sure why MSFT decided to change these codes in the first place. While it might have been a noble goal to follow the IETF standard (though I'm not really familiar with this), the old codes were already out there, and most developers don't benefit from the new codes, nor care about what these codes are called (a code is a code). Just the opposite occurs in fact, since now everyone including MSFT itself has to deal with two codes that represent the same language (and the resulting problems).

      My own program needs to be fixed to handle this (after a customer contacted me with an issue), others have cited problems on the web (and far more probably haven't publicised theirs), and MSFT itself had to deal with this in their own code. This includes adding both codes to .NET even though they're actually the same language (in 4.0 they distinguished between the two by adding the name "legacy" to the full language name of the older codes), adding special documentation to highlight this situation in MSDN, making "zh-Hans" the parent culture of "zh-CHS" (not sure if it was always this way, but it's a highly questionable relationship), and even adding special automated code to newly created "add-in" projects in Visual Studio 2008 (only to later remove this code in Visual Studio 2010, without explanation and therefore causing confusion for developers - long story).

      In any case, this is not your doing of course, but I don't see how anyone benefits from this change in practice. Only those developers who really care about following the IETF standard would be impacted, and that number is likely very low. For all others, the new codes are just an expensive headache. Again, not blaming you of course.
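
      The kind of fix the commenter describes usually boils down to normalizing the legacy identifiers to their IETF-style replacements at the program's boundary. A hypothetical sketch follows (names and policy are mine; only the zh-CHS/zh-Hans pair appears in the comment above, and zh-CHT/zh-Hant is its Traditional Chinese counterpart):

      ```typescript
      // Hypothetical sketch: map legacy culture codes to their IETF-style replacements
      // so the rest of a program only ever sees one spelling per language.
      const LEGACY_TO_IETF: Record<string, string> = {
        "zh-chs": "zh-Hans", // Simplified Chinese (legacy code -> IETF tag)
        "zh-cht": "zh-Hant", // Traditional Chinese (legacy code -> IETF tag)
      };

      function normalizeCultureCode(code: string): string {
        return LEGACY_TO_IETF[code.toLowerCase()] ?? code;
      }

      // Both spellings of the same language now normalize to one code:
      console.log(normalizeCultureCode("zh-CHS")); // "zh-Hans"
      console.log(normalizeCultureCode("en-US"));  // "en-US" (unchanged)
      ```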
    1. >> We have that already, it's named 'json_each_text'

      > Apparently you haven't looked at json parse/deparse costs ;P

      Well, a PL function is gonna be none too cheap either. Using something like JSON definitely has lots to recommend it --- eg, it probably won't break when you find out your initial spec for the transport format was too simplistic.
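
      For context, json_each_text is a built-in PostgreSQL set-returning function that expands a JSON object into one row per key/value pair, with values returned as text. A minimal sketch of calling it (assuming Node.js, the pg package, and a PostgreSQL server reachable through the usual PG* environment variables; none of this comes from the thread itself):

      ```typescript
      // Sketch: expand a JSON object into key/value rows with json_each_text.
      import { Client } from "pg";

      async function main(): Promise<void> {
        const client = new Client(); // connection settings taken from PG* env vars
        await client.connect();
        const res = await client.query(
          `SELECT key, value FROM json_each_text('{"op": "insert", "id": "42"}')`
        );
        console.log(res.rows); // [{ key: 'op', value: 'insert' }, { key: 'id', value: '42' }]
        await client.end();
      }

      main().catch(console.error);
      ```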
  12. Apr 2021
  13. Mar 2021
    1. In a 1772 letter to Joseph Priestley, Franklin lays out the earliest known description of the Pro & Con list,[100] a common decision-making technique, now sometimes called a decisional balance sheet:

      I still use this method today. In my job, we use it to decide on possible courses of action, weighing the pros and cons of each against the others.

  14. Feb 2021
  15. Jan 2021
    1. Progress is made of compromises, which implies that we have to consider not only the disadvantages but also the advantages. The advantages very clearly outweigh the disadvantages. This doesn't mean it's perfect, or that work shouldn't continue to minimize and reduce the disadvantages, but considering only the disadvantages is not the correct approach.
    2. If folks want to get together and create a snap-free remix, you are welcome to do so. Ubuntu thrives on such contribution and leadership by community members. Do be aware that you will be retreading territory that Ubuntu developers trod in 2010-14, and that you will encounter some of the same issues that led them to embrace snap-based solutions. Perhaps your solutions will be different. .debs are not perfect, snaps are not perfect. Each has advantages and disadvantages. Ubuntu tries to use the strengths of both.
  16. Nov 2020
    1. SVG has the advantage that it integrates very well with Svelte: since it's XML, its nodes can be managed as if they were HTML. Canvas, on the other hand, is more efficient, but it has to be generated entirely with JavaScript.
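
      To make the contrast concrete, here is a minimal framework-free browser sketch (my own illustration, not from the quoted article): the SVG branch builds retained DOM nodes that a Svelte template could just as well declare as markup, while the Canvas branch draws the same shape imperatively with JavaScript.

      ```typescript
      // SVG: each shape is a DOM node, manageable like any HTML element.
      const SVG_NS = "http://www.w3.org/2000/svg";
      const svg = document.createElementNS(SVG_NS, "svg");
      svg.setAttribute("width", "120");
      svg.setAttribute("height", "120");
      const circle = document.createElementNS(SVG_NS, "circle");
      circle.setAttribute("cx", "60");
      circle.setAttribute("cy", "60");
      circle.setAttribute("r", "20");
      svg.appendChild(circle);
      document.body.appendChild(svg);

      // Canvas: one element; every shape is rasterized by script, so there are
      // no per-shape nodes for the markup layer to manage.
      const canvas = document.createElement("canvas");
      canvas.width = 120;
      canvas.height = 120;
      document.body.appendChild(canvas);
      const ctx = canvas.getContext("2d");
      if (ctx) {
        ctx.beginPath();
        ctx.arc(60, 60, 20, 0, Math.PI * 2);
        ctx.fill();
      }
      ```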
  17. Oct 2020
  18. Sep 2020
    1. Also, Rollup, which I use as the bundler in the article, is pretty slow. Why? Because it needs to re-compile the whole shebang every time you change a file. It produces very small and efficient bundles, though.
  19. Jul 2020
  20. Jun 2020
  21. May 2020
    1. See this Hacker News comment thread for more discussion of the issues that might arise and some pro/con comparisons of using Alpine-based images.
  22. Dec 2019
  23. May 2019
    1. engine that is the problem but, rather, the users of search engines who are. It suggests that what is most popular is simply what rises to the top of the search pile
      • I wanted to highlight the previous sentence as well, but for some reason it wouldn't let me*

      I understand why the author is troubled by the campaign's position that "it's not the search engine's fault". It makes it seem as if nothing could be done to stop promoting those ideas, and that if something is popular it simply has to be the result at the top.

      This can be problematic, as people who were not initially searching for that specific phrase may click through to read racist, sexist, homophobic, or otherwise biased information (to name just a few) that perpetuates inaccuracies and negative stereotypes. It provides easier access to dangerous thinking built on the foundations of racism, sexism, etc.

      If the algorithms were changed or monitored to remove those negative searches, the number of people exposed to those ideas would decrease, which could help tear down the extremist communities that can build up around them.

      While I do understand this view, I also think the system can be helpful. All the search engine does is reflect the most popular searches, and if negative ideas are what people are searching for, then we can become aware of it and redirect their paths toward more educational and unbiased sources. It could be interesting to see what would happen if someone clicked on a link that said "Women belong in the kitchen" and was led to results about equality and feminism.