384 Matching Annotations
  1. Jun 2020
    1. Common table expressions sound interesting.

    1. # scenario-1: delete in session: SA might set category_id of all children Products to None
       c1 = session.query(Category).get(1)
       session.delete(c1)
       session.commit()

       # scenario-2: delete without loading an object into the session: SA will perform no additional logic
       session.query(Category).filter(Category.id == 2).delete()
       session.commit()

      Weird SQLAlchemy behavior but totally an accurate statement. When using Postgres and not specifying how to handle deletes, if you delete via the object instead of filter().delete(), SQLAlchemy will set the foreign key ids of all children to None.
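
      One way to avoid the NULL-ing behavior is to declare the cascade on the relationship. A minimal sketch, assuming models that mirror the Category/Product example above:

      from sqlalchemy import Column, ForeignKey, Integer
      from sqlalchemy.ext.declarative import declarative_base
      from sqlalchemy.orm import relationship

      Base = declarative_base()

      class Category(Base):
          __tablename__ = "category"
          id = Column(Integer, primary_key=True)
          # delete-orphan makes session.delete(category) delete its products
          # instead of setting product.category_id to None
          products = relationship("Product", cascade="all, delete-orphan")

      class Product(Base):
          __tablename__ = "product"
          id = Column(Integer, primary_key=True)
          category_id = Column(Integer, ForeignKey("category.id"))

      Note that the bulk filter().delete() in scenario-2 still bypasses ORM-level cascades.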

    1. Integration of Naming Conventions into Operations, Autogenerate

      Importance of naming conventions for sqlalchemy when running alembic db migrations.
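
      A sketch of the convention dict from the SQLAlchemy docs, attached to the MetaData that Alembic's autogenerate targets so constraints get deterministic names:

      from sqlalchemy import MetaData

      naming_convention = {
          "ix": "ix_%(column_0_label)s",
          "uq": "uq_%(table_name)s_%(column_0_name)s",
          "ck": "ck_%(table_name)s_%(constraint_name)s",
          "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
          "pk": "pk_%(table_name)s",
      }

      metadata = MetaData(naming_convention=naming_convention)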

  2. Apr 2020
    1. def handle_exception(self, job, *exc_info):

      To unit test an exception handler:

      import sys

      worker = Worker(..., exception_handlers=[handle_exception])
      try:
          raise Exception()
      except Exception:
          exc_info = sys.exc_info()

      worker.handle_exception(job, *exc_info)
      
    1. failure_ttl

      How long to keep a failed job.

    2. result_ttl=600

      How long to keep a successful job.
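
      A small sketch showing both TTLs at enqueue time (the queue name and dotted task path are made up):

      from redis import Redis
      from rq import Queue

      q = Queue("default", connection=Redis())

      # keep the result of a successful run for 10 minutes,
      # and keep a failed job around for a day before it is cleaned up
      job = q.enqueue("myapp.tasks.process_report", result_ttl=600, failure_ttl=86400)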

    3. job.meta['handled_by'] = socket.gethostname()
       job.save_meta()

      You can add metadata to the job, for example to keep track of the number of times a job has been retried.

    1. w = Worker([q], exception_handlers=[foo_handler, bar_handler])

      Exception handlers are attached to the worker.

    2. def my_handler(job, exc_type, exc_value, traceback):
           # do custom things here

      Write an exception handler that requeues a failed job.
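
      A sketch of one way to do that, assuming rq's handler-chain convention that returning False stops further handlers; the retry limit and meta key are made up:

      from rq import Queue

      MAX_RETRIES = 3  # assumed limit for this sketch

      def retry_handler(job, exc_type, exc_value, traceback):
          retries = job.meta.get('retries', 0)
          if retries >= MAX_RETRIES:
              # give up and let the next handler (or the default failure logic) run
              return True

          job.meta['retries'] = retries + 1
          job.save_meta()

          # push the job back onto the queue it originally came from
          Queue(job.origin, connection=job.connection).enqueue_job(job)
          return False  # stop the exception handler chain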

    1. docker-compose rm -f -s -v yourService

      useful commands for launching a single service in a docker-compose file without running it in the background so you can see the logs:

      docker-compose rm -fsv service
      docker-compose up service
      
    1. _.debounce is a function provided by lodash to limit how often a particularly expensive operation can be run. In this case, we want to limit how often we access yesno.wtf/api, waiting until the user has completely finished typing before making the ajax request. To learn more about the _.debounce function (and its cousin _.throttle), visit: https://lodash.com/docs#debounce

      Seems like it could be useful at some point.

    2. Computed vs Watched Property

      Really useful example of explaining what not to do and how to simplify code.

    1. The components located under src/components are less likely to be used in a route whereas components located under src/views will be used by at least one route.
  3. Mar 2020
    1. the stone of heaven is that this artifact is always found in soil layers dating to at least 12000 BC. The stone was certainly produced by an unknown, highly advanced civilization lost in time
    2. It was found to be composed of 77% oxygen, along with traces of carbon, silicon, calcium, and sodium.
    1. an abnormal noise from the upper airway might be audible during galloping, which usually is a sign that there is something amiss with the upper airway
    1. 3 tbsp lemon zest

      Add this to the recipe, along with 2 drops of orange extract.

    2. use a bit more if you like it sweeter

      Add 2 extra tablespoons powdered sugar to balance the lemon juice.

    3. 3⁄4 cup mascarpone cheese sour cream

      substitute 8oz cream cheese, 1/4 cup butter, 1/4 cup whipping cream

    4. How To Make Devonshire Cream in Just 5 Minutes

      Devonshire cream recipe.

    1. Instead of mutating the state, actions commit mutations. Actions can contain arbitrary asynchronous operations.

      So for example an action may be a combination of mutation calls and async api calls.

    1. When you create a vue component you must give it a name:

      <script>
      export default {
        name: "TodoItem",
      }
      </script>
      
    2. The following binds a class on a condition in order to apply conditional styling:

      <div class="todo-item" v-bind:class="{'is-complete':todo.completed}">
      
    3. Styling can be scoped to just the component with the scoped attribute shown below:

      <style scoped>
      
      </style>
      
    4. When looping in the template each rendered element needs a unique key, so we use v-bind:key and give it the id of the object as shown below:

      <div v-bind:key="todo.id" v-for="todo in todos">
         <h3>{{todo.title}}</h3>
      </div>
      
    5. Really nice Vue plugin for Chrome dev tools; looks like it's worth installing.

  4. Feb 2020
    1. if you’re using user federation (things like single sign-on and OpenID Connect), JWTs become important because you need a way to validate a user’s identity via a third party.
    2. If you’re building API services that need to support server-to-server or client-to-server (like a mobile app or single page app (SPA)) communication, using JWTs as your API tokens is a very smart idea.
    3. If your website is popular and has many users, cache your sessions in a backend like memcached or redis, and you can easily scale your service with very little hassle.
    4. in most web authentication cases, the JWT data is stored in a session cookie anyways, meaning that there are now two levels of signing. One on the cookie itself, and one on the JWT.
    5. Almost every web framework loads the user on every incoming request. This includes frameworks like Django, Rails, Express.js (if you’re using an authentication library), etc. This means that even for sites that are primarily stateless, the web framework you’re using is still loading the user object regardless.
    6. since JWTs are larger (in bytes) and also require CPU to compute cryptographic signatures, they’re actually significantly slower than traditional sessions when used in this manner.
    7. This means that on most websites, the stateless benefits of a JWT are not being taken advantage of.
  5. Jan 2020
    1. const mapStateToProps = state => ({
         todos: getVisibleTodos(state.todos, state.visibilityFilter)
       })

       const mapDispatchToProps = dispatch => ({
         toggleTodo: id => dispatch(toggleTodo(id))
       })

       export default connect(
         mapStateToProps,
         mapDispatchToProps
       )(TodoList)

      Example of passing state to the component.

    1. npx create-react-app my-app

      This builds out a nice template project to get started with all the tooling and setup already done for you.

    1. Instead of allowing any and all components to fetch and manipulate data, which can make debugging pretty much suck, we want to implement a pattern that's in line with the Single Responsibility Principle, and that keeps our code DRY.
    1. export LIBRARY_PATH=$LIBRARY_PATH:/usr/local/opt/openssl/lib/

      This worked for me when I ran into this issue in a virtualenv pip install.

  6. Dec 2019
    1. HTTPretty

      Use this to mock the request object so that it will be easier to mock the request context manager.
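
      A minimal sketch of stubbing an HTTP call with HTTPretty (the URL and payload are made up):

      import httpretty
      import requests

      @httpretty.activate
      def test_fetch_things():
          # register a canned response for the URL the code under test will hit
          httpretty.register_uri(
              httpretty.GET,
              "https://api.example.com/things",
              body='{"things": []}',
              content_type="application/json",
          )

          response = requests.get("https://api.example.com/things")
          assert response.json() == {"things": []}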

  7. Nov 2019
    1. Preheat oven to 425 degrees F. Whisk pumpkin, sweetened condensed milk, eggs, spices and salt in medium bowl until smooth. Pour into crust. Bake 15 minutes.

      Whisk eggs and condensed milk in a pot on the stove at low-medium heat. Then add in spices and add pumpkin slowly. Reduce for about 15 min. Then pour into the pie crust and bake. My grandma always added in more sugar (1/4 cup brown & white) to make it sweeter. I also add in about 1/4 tsp cloves.

    2. pumpkin pie recipe

    1. Regex library for Lua. string.match doesn't support some characters such as |, so we need to use a more complete library.

    1. t = { name = "Bob" }

       function t:sayHello() -- note the 't:' part
         print("Hello " .. self.name)
       end

       t:sayHello() -- "Hello Bob"

      Implementing a class in lua.

    1. Nice example repo for lua.

    2. lint: @luacheck -q .

      We should probably run a lua linter at some point.

    1. -- patch os.getenv to not have existing environment variables interfere
       local _getenv = os.getenv
       os.getenv = function() end -- luacheck: ignore
       finally(function()
         os.getenv = _getenv -- luacheck: ignore
       end)

      patch os.getenv

    1. Tip: If you see require('mymodule') it is just an alternative syntax for require 'mymodule' the two perform identically and are interchangeable.
    2. Here is the code from main() using different variables, both work fine:

      How to import and call a function in a module in lua. You can also use the local MM = require('mymodule') syntax.

    3. This is the module, notice how the interface table mymodule is local and is returned on the last line of the module:

      How to return functions from a module in lua.

    1. It's good practice to keep file loaded by content_by_lua_file at a minimum and place all processing logic into external modules. This allows lua_code_cache to work its magic and simplifies your testing.

      Kinda counterintuitive, but in a way it makes sense since require does built-in caching and it makes testing easier. This actually makes me feel a lot more comfortable making a utils area and just testing the functions in there.

    1. output = lustache:render("{{title}} spends {{calc}}", view_model)

      This will return a string of the rendered template.

    1. import operator

       s = sorted(s, key=operator.itemgetter(1, 2))

      Sort by multiple indexes.

    2. To reverse only one attribute, you can sort twice: first by the secondary

       s = sorted(s, key=operator.itemgetter(2))

       then by the primary

       s = sorted(s, key=operator.itemgetter(1), reverse=True)

      This makes sense, particularly given how sort order is specified in Pandas. If I recall correctly you usually specify the primary key last, which makes me think the backend implementation does something similar, sorting from the lowest priority key to the highest.
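
      A small sketch with made-up tuples showing how the two stable passes give primary-descending, secondary-ascending order:

      import operator

      s = [("a", 1, 2), ("b", 2, 1), ("a", 2, 3), ("b", 1, 0)]

      # pass 1: secondary key (index 2), ascending
      s = sorted(s, key=operator.itemgetter(2))
      # pass 2: primary key (index 1), descending; Python's sort is stable,
      # so ties on index 1 keep the index-2 ordering from the first pass
      s = sorted(s, key=operator.itemgetter(1), reverse=True)

      print(s)  # [('b', 2, 1), ('a', 2, 3), ('b', 1, 0), ('a', 1, 2)]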

    1. How to butterfly a lobster tail. Also remember to remove the digestive vein. You can put a squeezed quarter lemon slice under the tail and use it to prop the meat up while it cooks.

    1. This looks similar to the other way of monkey patching the XMLHttpRequest and looks relatively straightforward.

    1. Whisk in 1 tbs. butter; when the butter melts add another piece. Continue adding butter pieces, 1 cup (two sticks) total. Do not let the butter come to a boil or the butter will separate. Try to keep the butter between 160 and 175 degrees F. Use an instant read thermometer to keep it under 180 F.

      This is also a tasty way to cook lobster. Add garlic to the mix and make sure to add enough salt to the butter.

    1. Make text substitutions in response bodies, using both regular expressions and fixed strings, in this filter module.

      We need to use this. It's documented under NGINX Plus though, so does that mean we have to pay for it, or that it doesn't work with regular nginx?

    1. nginx_substitutions_filter is a filter module which can do both regular expression and fixed string substitutions on response bodies. This module is quite different from the Nginx's native Substitution Module.

      We might need to switch to this if we want to do replacements with regexes.

    1. This is a "protocol-relative" link. It uses http or https depending on what was used to load the current page.

      So src="//…" means use the same scheme as the current page.

    1. http {
         proxy_cache_path /data/nginx/cache levels=1:2 keys_zone=STATIC:10m inactive=24h max_size=1g;
         server {
           location / {
             proxy_pass http://1.2.3.4;
             proxy_set_header Host $host;
             proxy_buffering on;
             proxy_cache STATIC;
             proxy_cache_valid 200 1d;
             proxy_cache_use_stale error timeout invalid_header updating http_500 http_502 http_503 http_504;
           }
         }
       }

      example proxy_cache config.

    1. Default: proxy_cache off;

      So sounds like proxy caching is off by default.

    2. Default: proxy_cache_lock off;

      hmm...Ok this is off by default so probably not a bottleneck in that case.

    3. Default: proxy_cache_lock_age 5s;

      Again, seems like this should be a smaller timeout.

    4. Default: proxy_cache_lock_timeout 5s;

      This is a very long time! We should definitely shorten this. I actually wonder if this is perhaps why we are facing a bottleneck when we see a bunch of requests at once. When I performance tested, I ran without caching so it's possible the caching is actually bottlenecking us.

    5. Sets an offset in bytes for byte-range requests. If the range is beyond the offset, the range request will be passed to the proxied server and the response will not be cached.

      I don't see a need for this right now as it's not the range requests that are really the perceived slowness but it might be worth looking at later.

    6. “GET” and “HEAD” methods are always added to the list, though it is recommended to specify them explicitly.

      I wonder why?

    7. If the value is set to off, temporary files will be put directly in the cache directory.

      So use_temp_path should probably be set to off. I don't see a reason why we would need to first write them to a different directory.

    8. Cached data that are not accessed during the time specified by the inactive parameter get removed from the cache regardless of their freshness. By default, inactive is set to 10 minutes.

      This default seems like it should probably be set to at least an hour, probably more. Cloudflare is set at 4 hours, which seems perfectly reasonable to me.

    9. The special “cache manager” process monitors the maximum cache size set by the max_size parameter. When this size is exceeded, it removes the least recently used data.

      So we might need to increase this?

    10. Cache data are stored in files.

      So it caches them on the file system by way of storing them in a temp file and renaming them. The default here says --. I thought nginx cached things by default, so does this mean that's not the case, or maybe it doesn't save them? Confused; need to investigate more.

    11. Default: proxy_cache_min_uses 1;

      This seems right for us.

    12. The levels parameter defines hierarchy levels of a cache: from 1 to 3

      Basically the same concept as a hardware cache, with 1, I assume, being the first cache that will be checked.

    13. each level accepts values 1 or 2

      I don't understand what this means.

  8. Oct 2019
    1. Really useful page for generating regexes for IP ranges. Note they are missing some parentheses in places though.

    1. location / {
         sub_filter '<a href="http://127.0.0.1:8080/' '<a href="https://$host/';
         sub_filter '<img src="http://127.0.0.1:8080/' '<img src="https://$host/';
         sub_filter_once on;
       }

      How to replace strings in an html response on-the-fly.

      Note the response must be requested as not compressed (no gzip) for this to work.

    1. # request will be sent to backend without uri changed
       # to '/' due to if

       location /proxy-pass-uri {
           proxy_pass http://127.0.0.1:8080/;
           set $true 1;
           if ($true) {
               # nothing
           }
       }

      This is a weird one. I guess as long as you aren't using $uri it has no impact though.

    1. Is anyone aware of a lua http lib that supports keepalive?

      When sending a request you can pass the following keepalive settings which will keep the connection open:

      local http = require "resty.http"
      local httpc = http.new()
      httpc:connect("127.0.0.1", 9081)
      local response, err = httpc:request({
        path = "/proxy" .. ngx.var.request_uri, 
        method = "HEAD",
        headers = ngx.req.get_headers(),
        keepalive_timeout = 60,
        keepalive_pool = 10,
      })
      
    1. Also, it's always a good idea to add ipv6=off to the resolver directive when your dns server may return IPv6 addresses and your network does not support it.

      This might help.

    1. Another difference between $uri and $request_uri in proxy_cache_key is that $request_uri will include the anchor tag part, but $uri$is_args$args will ignore it. Do a curl operation: curl -I static.io/hello.htm

      This could be a problem. We'll probably have to fix this when we move via2 outside lms.

    1. The proxy_buffering option tells NGINX to pass the response directly back to the client. Otherwise, it will try to buffer it in memory or on disk. I recommend this if the upstream response can be large.

      We may want to not buffer on pdf endpoints.

    1. set $stripped_cookie $http_cookie;

       if ($http_cookie ~ "(.*)(?:^|;)\s*sessionid=[^;]+(.*)$") {
           set $stripped_cookie $1$2;
       }
       if ($stripped_cookie ~ "(.*)(?:^|;)\s*csrftoken=[^;]+(.*)$") {
           set $stripped_cookie $1$2;
       }

    1. Indicate number of NA values placed in non-numeric columns.

      This is only true when using the Python parsing engine.

      Filled 3 NA values in column name
      

      If using the C parsing engine you get something like the following output:

      Tokenization took: 0.01 ms
      Type conversion took: 0.70 ms
      Parser memory cleanup took: 0.01 ms
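
      A quick sketch (with made-up CSV data) of the difference; verbose=True with the Python engine reports NA fills, while the C engine reports parser timings:

      import io
      import pandas as pd

      csv_data = "name,age\nAlice,30\n,25\nBob,\n"  # one missing value in each column

      # Python engine: prints e.g. "Filled 1 NA values in column name"
      pd.read_csv(io.StringIO(csv_data), engine="python", verbose=True)

      # C engine (the default): prints tokenization / type conversion timings instead
      pd.read_csv(io.StringIO(csv_data), engine="c", verbose=True)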
      
    1. location / {
           proxy_pass http://backend;
           # You may need to uncomment the following line if your redirects are relative, e.g. /foo/bar
           #proxy_redirect / /;
           proxy_intercept_errors on;
           error_page 301 302 307 = @handle_redirect;
       }

       location @handle_redirect {
           set $saved_redirect_location '$upstream_http_location';
           proxy_pass $saved_redirect_location;
       }

      Usually the redirect is returned as the response and the client follows the redirect. This will follow a redirect inside nginx rather than the client.

    1. The X-Forwarded-Proto request header helps you identify the protocol (HTTP or HTTPS) that a client used to connect to your load balancer. Your server access logs contain only the protocol used between the server and the load balancer; they contain no information about the protocol used between the client and the load balancer.

      The load balancer may talk to the server via http so using $scheme in nginx when there's an AWS load balancer in front may lead to the $scheme being unexpectedly http instead of https.

      http {
          map $http_x_forwarded_proto $original_scheme {
            "" $scheme;
            default $http_x_forwarded_proto;
          }
      }
      
    1. I had a similar issue with nginx+passenger (for Ruby on Rails / Rack / etc.), and I confirm that by default, multiple slashes are collapsed (in both PATH_INFO and REQUEST_URI). Adding merge_slashes off; in the server context of the nginx configuration fixed it
    1. non-blocking internal requests

      Note ngx.location.capture only works on internal requests, which means if you want to request an external endpoint dynamically then you need to set up something like the below and call that internal endpoint instead of calling the external url directly.

      Say for example you want to send a request to the / endpoint with the third-party url as part of the path (http://proxy-server.com/http://example.com).

      location /external/ {
        internal;
        set $upstream "";
        rewrite_by_lua_file ./lua/get_external.lua;
        resolver 8.8.8.8;
        proxy_pass $upstream;
      }
      

      Where lua/get_external.lua:

      -- strip beginning '/' from uri path
      ngx.var.upstream = ngx.var.request_uri:sub(2)
      
    1. set $template_root /usr/local/openresty/nginx/html/templates;

      We should probably use this instead of root since root has other implications.