19 Matching Annotations
  1. Sep 2017
  2. Dec 2016
  3. Apr 2016
  4. Feb 2016
    1. This is a good short overview of how change detection works in Angular 2.

      It fixes the fundamental algorithmic complexity problem that change detection has in Angular 1.x by making it possible to prune parts of the component tree from change detection if the inputs have not changed.

      Unfortunately the zone.js implementation involves some horrifying monkey-patching of various DOM APIs.
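
      Roughly what that pruning looks like in practice (a minimal sketch, not taken from the article; the component name and template are made up, and the exact import path differs across early Angular 2 releases):

      ```typescript
      // With OnPush, Angular skips change detection for this component's subtree
      // unless an @Input reference changes or an event originates inside it.
      import { Component, Input, ChangeDetectionStrategy } from '@angular/core';

      @Component({
        selector: 'user-card',                        // hypothetical component
        template: '<span>{{ user.name }}</span>',
        changeDetection: ChangeDetectionStrategy.OnPush
      })
      export class UserCardComponent {
        // The parent must assign a *new* object reference for this card to re-render.
        @Input() user: { name: string };
      }
      ```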

  5. Dec 2015
  6. Nov 2015
    1. Join Darin Fisher, VP of Chrome, as he talks about the past, present and future of the web.

      RAIL - Response (<100ms), Animation (<16ms per frame), Idle (work in <50ms chunks), Load (<1000ms)

  7. Oct 2015
    1. It’s known that Angular becomes slower with around 2,000 bindings due to the process behind dirty-checking. The fewer bindings we add toward this limit the better, as bindings can add up without us really noticing!

      From some preliminary testing it looks like we get close to hitting the 2,000 watch count on the /stream view - which explains the lag.
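
      A quick way to sanity-check that number from the browser console (a sketch, assuming Angular 1.x with debug info enabled; nothing here is from the article):

      ```typescript
      declare const angular: any;  // provided globally by Angular 1.x

      // Sum the $$watchers arrays of every scope attached to the DOM,
      // de-duplicating scopes by their $id.
      function countWatchers(): number {
        const seen: { [id: string]: boolean } = {};
        let total = 0;
        document.querySelectorAll('.ng-scope, .ng-isolate-scope').forEach((el) => {
          const ngEl = angular.element(el);
          const scope = ngEl.isolateScope() || ngEl.scope();
          if (scope && !seen[scope.$id]) {
            seen[scope.$id] = true;
            total += (scope.$$watchers || []).length;
          }
        });
        return total;
      }

      console.log('watcher count:', countWatchers());
      ```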

    1. Here are a few handy tips on measuring your project’s performance profile:

      In particular, see this section on recommendations for setting up tests that reasonably simulate "real world" conditions.

    2. Web performance article from a couple of the Chrome team's developers/developer advocates. Mentioned on Twitter: https://twitter.com/aerotwist/status/649877465079390209

  8. Sep 2015
    1. Some interesting slides on CSS styling performance on GitHub, particularly focusing on their diff pages.

      Several slides have direct references to WebKit internals explaining the impact on rule resolution performance.

      Mentions a useful tool for understanding the performance implications of CSS selectors, css-explain.

    1. If your timeline graph is dominated by the color green after recording

      Green is used to denote time spent painting

    2. there does not seem to be a general rule for how many workers to spawn. Some developers claim that 8 is a good number, but use an online calculator and suit yourself

      Web workers are heavyweight objects, as each one includes an entire JS VM instance; 8 sounds like a lot.
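
      If a pool is needed at all, sizing it from the hardware seems saner than a magic 8. A sketch (the worker.js URL and the cap of 4 are placeholders):

      ```typescript
      // navigator.hardwareConcurrency reports logical cores (fall back to 4 where
      // unsupported); cap the pool since each worker carries a full JS VM.
      const cores = navigator.hardwareConcurrency || 4;
      const poolSize = Math.min(cores, 4);

      const pool: Worker[] = [];
      for (let i = 0; i < poolSize; i++) {
        pool.push(new Worker('worker.js'));   // placeholder script URL
      }
      ```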

    1. The $digest loop keeps iterating until the model stabilizes

      cf. React, where an event triggers an event handler, which can trigger state changes and calls to React.render(). These are then batched together, resulting in a single re-render, a DOM diff and the application of the result to the DOM. Consequently you can't have an infinite state-update loop; the exception is when a state change happens asynchronously and that state change triggers another async state change (and so on...).
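
      A minimal Angular 1.x sketch of why $digest can take more than one pass (module and property names are made up): one watcher's listener dirties another watched value, so a second pass is needed, and Angular throws if the model still hasn't stabilized after 10 passes.

      ```typescript
      declare const angular: any;  // provided globally by Angular 1.x

      angular.module('app', []).controller('NameCtrl', function ($scope: any) {
        $scope.first = 'Jane';
        $scope.last = 'Doe';

        // Pass 1: this listener writes to $scope.full, dirtying the model...
        $scope.$watch('first', function (value: string) {
          $scope.full = value + ' ' + $scope.last;
        });

        // ...so $digest runs pass 2, where this watcher sees the new value.
        $scope.$watch('full', function (value: string) {
          console.log('full name is now', value);
        });
      });
      ```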

    1. The value function should return the value which is being watched. AngularJS can then check the value returned against the value the watch function returned the last time

      Ah, so since the input is a scope, this means that Angular needs to call every watch value fn that might be affected by a change. Should look into whether it has any optimizations to avoid that for common watch expressions.
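
      For reference, the function form being described (a sketch with a made-up item-count example): the value fn runs on every digest pass regardless of what actually changed, and the listener fires only when its return value differs from the previous pass.

      ```typescript
      declare const angular: any;  // provided globally by Angular 1.x

      angular.module('app', []).run(function ($rootScope: any) {
        $rootScope.items = [];

        $rootScope.$watch(
          // Value fn: called with the scope on every digest pass.
          function (scope: any) { return scope.items.length; },
          // Listener: called only when the returned value differs from last time.
          function (newVal: number, oldVal: number) {
            console.log('item count changed:', oldVal, '->', newVal);
          }
        );
      });
      ```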

    1. Both accessibility and performance are invisible aspects of an experience and should be considered even if they aren’t explicit goals of the project.
    1. So instead we introduced the loading spinner on each widget. Nothing is actually any faster, but the whole experience feels more responsive
    2. Simply reducing some CSS transition times from 500ms to 250ms, and cutting others out entirely, made the whole dashboard feel snappier and more responsive.
    3. This means content is served from the nearest possible geographic location to the user, cutting down request latency from as high as 2500ms (in places such as Singapore) to tens of milliseconds
    4. We eventually came up with a compromise solution based on Addy Osmani’s basket.js, using a combination of server-side script concatenation and localStorage for caching. In a nutshell, the page includes a lightweight loader script, which figures out which JS and CSS it has already cached and which needs to be fetched. The loader then requests all the resources it needs from the server in one request, and saves all the resources into localStorage under individual keys. This gives us a great compromise between cutting down the number of HTTP requests while still being able to maintain cacheability, and not re-downloading code unnecessarily when it hasn’t changed. Additionally, after running a few benchmarks, we found that localStorage is (sometimes) actually faster than the native HTTP cache, especially on mobile browsers.
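
      A loose sketch of that loader idea, not their actual implementation (the /bundle endpoint, cache keys and response shape are invented here): check localStorage for each resource, fetch only what is missing in a single request, cache the bodies under individual keys, then inject everything inline.

      ```typescript
      interface Resource { key: string; type: 'js' | 'css'; }

      function loadBundle(resources: Resource[]): Promise<void> {
        // Resources already in localStorage don't need to be requested again.
        const missing = resources.filter(r => localStorage.getItem(r.key) === null);

        const fetched: Promise<void> = missing.length === 0
          ? Promise.resolve()
          : fetch('/bundle?keys=' + missing.map(r => encodeURIComponent(r.key)).join(','))
              .then(res => res.json())
              .then((bodies: { [key: string]: string }) => {
                // Cache each resource under its own key.
                missing.forEach(r => localStorage.setItem(r.key, bodies[r.key]));
              });

        return fetched.then(() => {
          resources.forEach(r => {
            const source = localStorage.getItem(r.key) || '';
            const el = document.createElement(r.type === 'js' ? 'script' : 'style');
            el.textContent = source;   // inline script/style runs/applies on insertion
            document.head.appendChild(el);
          });
        });
      }
      ```

      Presumably the cache keys are versioned (e.g. by a content hash) so that changed code invalidates only the affected entries.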