20,173 Matching Annotations
  1. Mar 2021
    1. Or even a simple one-liner in the Contract that references an AR Model, so you don't have to rewrite the validations in that contract, and don't have to choose between writing the validations only in the contract or only in the AR Model?
    2. how to have some validations in the model and some in the contract/form object without duplicating them

    1. Don't let the highly rated reviews fool you, this is one of the worst Steam games I've personally bought and played in years (as of writing this I'm closing in on 4000 games in my Steam library).
    1. Dole Mandarin Oranges in Light Syrup, All Natural Fruit, Non-GMO, 15oz Can
    1. Third configurable block to run.

      I like how they identify in the description which order things run in: 1st, 2nd, 3rd, and last.

      Though, it would be more readable to have a list of them, in chronological order, rather than having them listed in alphabetical order.

    2. Last configurable block to run. Called after frameworks initialize.
    1. Trying to force this one thing to work for everyone is the worst way to do that.
    2. Don’t get me wrong — standards are great. Uniformity is bad.
    3. The elimination of what is arguably the biggest monoculture in the history of software development would mean that we, the community, could finally take charge of both languages and run-times, and start to iterate and grow these independently of browser/server platforms, vendors, and organizations, all pulling in different directions, struggling for control of standards, and (perhaps most importantly) freeing the entire community of developers from the group pressure of One Language To Rule Them All.
    4. JavaScript needs to fly from its comfy nest, and learn to survive on its own, on equal terms with other languages and run-times. It’s time to grow up, kid.
    5. If JavaScript were detached from the client and server platforms, the pressure of being a monoculture would be lifted — the next iteration of the JavaScript language or run-time would no longer have to please every developer in the world, but instead could focus on pleasing a much smaller audience of developers who love JavaScript and thrive with it, while enabling others to move to alternative languages or run-times.
    6. Ironically, what we’re doing today, is essentially the opposite: rather than reducing the scope of the problem, we continue to grow it, effectively increasing the number of details — and problems — for everyone.
    7. for whatever reasons, it hasn’t really liberated anyone from JavaScript.
    8. Despite a growing variety of languages that compile to JavaScript, the language itself remains the dominant language in both client-side and server-side eco-systems for web development. The idea of replacing JavaScript with languages that compile to JavaScript, has been explored, and for whatever reasons, it hasn’t really liberated anyone from JavaScript.
    9. We standardize on a finite subset of JS (such as asm.js) — and avoid the endless struggle through future iterations of the JavaScript language, competing super-sets and transpilers

      asm.js and RPython sound similar (restrictive subsets)

    10. agree to accept JavaScript for what it is, but start to think of it as a kind of VM for other languages
    11. Again, this is all opinion-based, and due to the sheer number of developers who rely on this technology as their bread and butter, sub-communities and religiousness forms around patterns, anti-patterns, practices, de-facto standards, micro-packages, polyfills, frameworks, build-tools, etc.
    12. For instance, those who prefer classical inheritance may enjoy the addition of the class keyword, while others may reject it as conflicting with the idea of a prototypical inheritance model.
    13. JavaScript, as a language, has some fundamental shortcomings — I think the majority of us agree on that much. But everyone has a different opinion on what precisely the shortcomings are.
    14. While various shortcomings of the standard run-time library are the obvious, immediate reason for the creation of micro-packages
    15. As to opinions about the shortcomings of the language itself, or the standard run-times, it’s important to realize that every developer has a different background, different experience, different needs, temperament, values, and a slew of other cultural motivations and concerns — individual opinions will always be largely personal and, to some degree, non-technical in nature.

    1. If the only realistic consumer of a package is the monorepo, and you can’t realistically see normal users installing that 1 package out of 138 other packages in that repository, there’s probably no need to have it as a separate package. Ideally it would be better to let a user install 1 package that contains everything, and reduce the overhead.
    2. become more obscure in functionality to the point where some names literally describe what they do
    3. It’s an incredible amount of overhead and waste. Packages increasingly consume more hard drive space and increase installation times
    4. When you look inside a node_modules directory, there’s likely hundreds if not thousands of packages, even for a relatively basic application.
    5. Very often in these monorepos, packages are so incredibly specific in functionality, the question then becomes why even have a separate package at all if it’s tightly coupled? Can you use these packages independently or are they tied to specific versions of other packages in the monorepo? It’ll probably be easier to remove the mask and just work as a monolith.
    6. After all, that’s why it’s in one repository to begin with, right?
    7. There’s typically a complex tree of dependencies, where packages all tend to rely on each other in order to function.
    8. However, if all of these are hosted in the same repository, you lose a lot of those benefits.
    9. parallelise development across multiple teams
    10. There’s several benefits to splitting code into multiple packages, whether it be a library, micro-services or micro-frontends.
    11. Also with one history, these packages will always have commits that are in sync or “atomic”.
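      For context on the monorepo setup these annotations critique: a root manifest typically declares every package in the repository as a workspace. A minimal sketch of a yarn/npm workspaces root manifest (the package name here is hypothetical):

      ```json
      {
        "name": "example-monorepo",
        "private": true,
        "workspaces": [
          "packages/*"
        ]
      }
      ```

      Each directory under packages/ then ships as its own published package, which is exactly the granularity the author is questioning.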
    1. The criticism of small modules is a bit ironic because it runs deeper than many realize.
    2. But I believe the core philosophy of tiny modules is actually sound and easier to maintain than giant frameworks.
    3. This isn't to say that on a case by case basis there aren't modules that are grossly overcomplicated.
    4. I wanted to examine this criticism because I don't agree with it.
    5. Isaac then continues on to compare that philosophy to Node.js. They are slightly less succinct but still very enlightening.
    6. he goes on to talk about third-party problems: how you're never guaranteed that something is written correctly, and even if it is, you don't know whether it's the most optimal solution
    7. he goes on to say that simple functions should not be packages because they are too small.
    8. "Functions Are Not Packages" - Well why not?
    9. He says that writing the function yourself makes it easy to modify and to fix bugs or improve efficiency.

    10. so you can learn about the ones you don't.
    11. Write modules that solve a problem you know
    12. By treating even small functions like a black box it promotes separation of concerns and allows said black box to evolve independently.
    13. I would much rather have a "cosine" module than a "trigonometry" module because chances are good I only need a small fraction of the utilities provided by the larger trig module.
    14. I found this bit a tad ironic considering he's simultaneously admonishing small modules while complaining about how difficult it is to debug other people's code.
    15. Write modules quickly, to meet your needs, with just a few tests for compliance. Avoid extensive specifications.
    16. Write modules for publication, even if you only use them privately. You will appreciate documentation in the future.
    17. Of course how each developer interprets and applies these very generalized guidelines is subjective and will vary from person to person.
    18. For one, anyone using this module would automatically benefit from any future performance improvements without having to do anything themselves.
    19. Small modules are extremely versatile and easy to compose together in an app with any number of other modules that suit your needs.
    20. Refactor ruthlessly. Rewrite bravely.
    21. Write modules that are small. Iterate quickly.
    22. Write modules that are agnostic about the source of their input or the destination of their output.
    23. Write modules that do one thing well. Write a new module rather than complicate an old one.

    24. Write modules that encourage composition rather than extension.
    25. Sure sometimes my changes get rejected, but it almost always comes with a reason why and I can work together with the maintainer to come up with a sensible solution to my issue.
    26. Second, I don't agree that there are too many small modules. In fact, I wish every common function existed as its own module. Even the maintainers of utility libraries like Underscore and Lodash have realized the benefits of modularity and allowed you to install individual utilities from their library as separate modules. From where I sit that seems like a smart move. Why should I import the entirety of Underscore just to use one function? Instead I'd rather see more "function suites" where a bunch of utilities are all published separately but under a namespace or some kind of common name prefix to make them easier to find. The way Underscore and Lodash have approached this issue is perfect. It gives consumers of their packages options and flexibility while still letting people like Dave import the whole entire library if that's what they really want to do.
    27. You might get the impression after reading David's article above that this trend arose from lazy developers who "forgot how to program", but the reality is that the tiny-module ecosystem on NPM was the intention from the beginning
    28. How are hundreds of dependencies and 28,000 files for a blank project template anything but overly complicated and insane?
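      Re: the Underscore/Lodash annotation above, a minimal sketch of the difference between importing a whole utility library and importing a single-function module (both package names are real npm packages; the save handler is hypothetical):

      ```js
      // Hypothetical handler we want to debounce.
      function save() { /* persist state somewhere */ }

      // Whole-library import: pulls in every utility just to debounce one handler.
      const _ = require('lodash');
      const debouncedAll = _.debounce(save, 300);

      // Single-function module: the same utility published standalone.
      const debounce = require('lodash.debounce');
      const debouncedOne = debounce(save, 300);
      ```

      Consumers who want everything can still take the first path; consumers who want one function aren't forced to.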

    1. Hasty generalization usually follows the pattern:
    2. Hasty generalization is the fallacy of examining just one or very few examples or studying a single case, and generalizing that to be representative of the whole class of objects or phenomena.
    1. the Unix Philosophy is a crucial part of the patterns, opinions, and culture of Node.js
    2. All too often, people get hung up on the wrong aspects of the Unix Philosophy, and miss the forest for the trees
    3. Those sorts of complaints are like saying that someone is not a Buddhist unless they speak Pali.
    4. In the real world, we are faced with the completely unfair constraint of being human while writing programs and while debugging them, and none of these costs can ever be reduced to zero.
    5. Simplicity is better than anything.
    6. Compatibility is better than purity.
    7. Focus is better than features.
    8. Working is better than perfect.
    9. It is about balancing the twin needs of writing good software, and writing any software at all.
    10. It’s a practical set of advice for trading a moderate increase in development cost for a much larger reduction in maintenance costs.
    11. Nothing about the Unix Philosophy explicitly relates to a culture of software sharing. However, it should be no mystery that it comes from the software community where we argue at length about the best way to make our programs properly Free. Software that is developed according to these principles is easier to share, reuse, repurpose, and maintain.
    12. The Unix Philosophy is an ideology of pragmatism.
    1. There's a joke in philosophy that goes like this: The First Law of Philosophy: For every philosopher, there exists an equal and opposite philosopher. The Second Law of Philosophy: They're both wrong.
    2. Let's define idealism as a rigid belief system in which you live your life based upon a morality as it is "supposed to be" or "should be,"
    3. let's define pragmatism as doing what is practical, regardless of how things are supposed to be or should be
    1. The customer overspecified the requirements and now we're contractually required to build it this way. Does he think he's an engineer?
    2. To specify in excessive detail.
    1. Why separate out red tests from green tests? Because my green tests serve a fundamentally different purpose. They are there to act as a living specification, validating that the behaviors work as expected. Regardless of whether they are implemented in a unit testing framework or an acceptance testing framework, they are in essence acceptance tests because they’re based upon validating behaviors or acceptance criteria rather than implementation details.
    2. When I refactor my code, I expect that none of my green tests will break. If red tests break then that’s okay because remember, my red tests can be implementation dependent and when I change an implementation it may cause some red tests to break. But it shouldn’t break any green tests. I find that this is a valuable distinction.
    3. Conversely, red tests are tests I write after the code is written to lock down some implementation.
    4. Have you ever played the game 20 questions? Most of us have played that game at one point in our lives. One person thinks of something that could be an animal, vegetable, or mineral and then they answer yes/no questions that are asked of them. The point of the game is to ask as few questions as possible in order to accurately guess what the person is thinking. This is how I think of the unit tests I write to specify behavior as I’m doing test-first development. I ask what are the fewest tests that I need to write in order to assert the behavior I want to create.
    5. So the question becomes how many tests are enough?
    6. I’m proposing that writing those tests from the perspective of specifying the behaviors that we want to create is a highly valuable way of writing tests, because it drives us to think at the right level of abstraction for creating behavioral tests and that allows us the freedom to refactor our code without breaking it
    7. I am a big advocate of having a complete test base and even erring on the side of caution when it comes to quality engineering and software validation, but that is not what we’re talking about here. What we’re talking about here are the tests that we write when we’re doing test-first development, and I’m proposing that writing those tests from the perspective of specifying the behaviors that we want to create is a highly valuable way of writing tests because it drives us to think at the right level of abstraction for creating behavioral tests and that allows us the freedom to refactor our code without breaking it.
    8. The number one problem that I see developers have when practicing test-first development that impedes them from refactoring their code is that they over-specify behavior in their tests. This leads developers to write more tests than are needed, which can become a burden when refactoring code.
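      A minimal sketch of the green/red distinction described above, using a hypothetical cart module and Jest-style assertions (all names are illustrative, not from the article):

      ```js
      // Hypothetical module under test.
      const cart = require('./cart');

      // "Green" test: specifies observable behavior (an acceptance criterion).
      // Refactoring the internals should never break it.
      test('adding an item increases the total by its price', () => {
        const c = cart.create();
        cart.add(c, { sku: 'A1', price: 5 });
        expect(cart.total(c)).toBe(5);
      });

      // "Red" test: locks down an implementation detail (internal storage shape).
      // A refactor may legitimately break it, and that is acceptable.
      test('items are stored internally as a Map keyed by sku', () => {
        const c = cart.create();
        cart.add(c, { sku: 'A1', price: 5 });
        expect(c.items instanceof Map).toBe(true);
      });
      ```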
    1. Sometimes a change impact analysis is performed to determine an appropriate subset of tests

      Hey, I do that sometimes so I can run a smaller/faster subset of tests. Didn't know it had a fancy name though.

    2. non-regression testing

      That would probably be a better name because you're actually testing/verifying that there hasn't been any regression.

      You're testing for the absence of regression. But I guess testing for one also tests for the other, so it probably doesn't matter. (If something is not true you know it is false, etc.)

    3. Regression testing (rarely non-regression testing[1]) is re-running functional and non-functional tests to ensure that previously developed and tested software still performs after a change.[2] If not, that would be called a regression.
    1. The granularity of data refers to the size in which data fields are sub-divided
    2. Note that, although the modifying terms, fine and coarse are used consistently across all fields, the term granularity is not.
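      A small sketch of the definition above: the same record at coarse and fine granularity (field names hypothetical):

      ```js
      // Coarse-grained: the whole address is a single opaque field.
      const coarse = {
        name: 'Ada Lovelace',
        address: '123 Main St, Springfield, IL 62704',
      };

      // Fine-grained: the same data sub-divided into smaller fields,
      // so each part can be queried, sorted, or validated independently.
      const fine = {
        name: 'Ada Lovelace',
        street: '123 Main St',
        city: 'Springfield',
        state: 'IL',
        zip: '62704',
      };
      ```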
    1. I like to take it a step further and define a technologist as a General Technology Specialist, just to ramp up the oxymoron. However, as most technologists know, that’s exactly what we are – general specialists.

      Wouldn't that make us both a generalist and a specialist? Which is more accurate, a generalist specialist or a specialist generalist?

    2. However, as most technologists know, that’s exactly what we are – general specialists. We’ve spent decades honing our skill-sets into fine points… in many, many different areas. These finely sharpened points may not be very deep, mind you, but boy are they sharp! The old “jack of all trades, master of none” chestnut comes into play a bit.
    3. the term “technologist” contains a fair amount of tongue-in-cheek.
    1. Essentially we're trying to figure out when it's appropriate for "my" code to become "everyones" code, and if there are steps in between. ("Standard library", for example.)
    2. Look no further than C++, where nearly every major software suite has its own strings, vectors, etc. implemented, frequently duplicating functionality already implemented in (1) STL, and (2) Boost. I seem to recall that the original Android Browser, for example, had no fewer than 5 kinds of strings on the C++ side of the code base, because it interfaced with several different systems and each had its own notion of what a string should be.
    3. One thing that would be useful to this debate is an analysis of a language ecosystem where there are only "macropackages", to see if the same function shows up over and over again across packages.
    4. this only applies to end products which are actually deployed. For my modules, I try to keep dependency version ranges at defaults, and recommend others do the same. All this pinning and packing is really the responsibility of the last user in the chain, and from experience, you will make their life significantly more difficult if you pin your own module dependencies.
    5. here is my set of best practices.

      I review libraries before adding them to my project. This involves skimming the code or reading it in its entirety if short, skimming the list of its dependencies, and making some quality judgements on liveliness, reliability, and maintainability in case I need to fix things myself. Note that length isn't a factor on its own, but may figure into some of these other estimates. I have on occasion pasted short modules directly into my code because I didn't think their recursive dependencies were justified.

      I then pin the library version and all of its dependencies with npm-shrinkwrap.

      Periodically, or when I need specific changes, I use npm-check to review updates. Here, I actually do look at all the changes since my pinned version, through a combination of change and commit logs. I make the call on whether the fixes and improvements outweigh the risk of updating; usually the changes are trivial and the answer is yes, so I update, shrinkwrap, skim the diff, done.

      I prefer not to pull in dependencies at deploy time, since I don't need the headache of github or npm being down when I need to deploy, and production machines may not have external internet access, let alone toolchains for compiling binary modules. Npm-pack followed by npm-install of the tarball is your friend here, and gets you pretty close to 100% reproducible deploys and rollbacks.

      This list intentionally has lots of judgement calls and few absolute rules. I don't follow all of them for all of my projects, but it is what I would consider a reasonable process for things that matter.
    6. I suspect you aren't seeing much discussion because those who have a reasonable process in place, and do not consider this situation to be as bad as everyone would have you believe, tend not to comment on it as much.
    7. Clearly JS and NPM have done a lot RIGHT, judging by success and programmer satisfaction. How do we keep that right and fix the wrong?
    8. That said, I wish more people would talk both sides. Yes, every dependency has a cost. BUT the alternatives aren't cost free either. For all the ranting against micropackages, I'm not seeing a good pro/con discussion.
    1. Democrat Chicago to allow the economy to open up less than a week after Biden's inauguration...it's all planned to make Biden appear successful! Democrats allowed millions of people to suffer and lose businesses all for their own greed and power!
    1. Whenever majorities trample upon the rights of minorities—when men are denied even the privilege of having their causes of complaint examined into—when measures, which they deem for their relief, are rejected by the despotism of a silent majority at a second reading—when such become the rules of our legislation, the Congress of this Union will no longer justly represent a republican people.
    1. Colin D asks how to preserve the JSON structure of the array, so that the final output is a single JSON array rather than a stream of JSON objects. The simplest way is to wrap the whole expression in an array constructor:
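      The example itself wasn't captured in the annotation; a minimal jq sketch of the technique (input file and field names hypothetical):

      ```sh
      # Emits a stream of JSON objects, one per input element:
      jq '.[] | {name}' users.json

      # Wrapping the whole expression in [...] collects that stream
      # into a single JSON array instead:
      jq '[.[] | {name}]' users.json
      ```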
    1. jq uses the Oniguruma regular expression library, as do php, ruby, TextMate, Sublime Text, etc, so the description here will focus on jq specifics.
  2. en.wikipedia.org
    1. RPython is now also used to write non-Python language implementations such as Pixie.
    2. PyPy was funded by the European Union being a Specific Targeted Research Project
    3. Bootstrapping (compilers)
    4. Thus the recursive logo of PyPy is a snake swallowing itself, since RPython is translated by a Python interpreter.
    5. RPython puts some constraints on the Python language such that a variable's type can be inferred at compile time.
    6. There used to be other backends in addition to C: Java, CSharp, and Javascript but those suffered from bitrot and have been removed.
    7. PyPy was conceived to be an implementation of Python written in a programming language that is similar to Python.
    8. PyPy aims to provide a common translation and support framework for producing implementations of dynamic languages, emphasizing a clean separation between language specification and implementation aspects.
    9. PyPy uses a technique known as meta-tracing, which transforms an interpreter into a tracing just-in-time compiler.
    1. Refactoring is a means of addressing the problem of software rot. It is described as the process of rewriting existing code to improve its structure without affecting its external behaviour.
    2. Suppose an administrator creates a forum using open source forum software, and then heavily modifies it by adding new features and options. This process requires extensive modifications to existing code and deviation from the original functionality of that software.
    3. cannot be run on any modern day computer or computer simulator, as it was developed during the days when LISP and PLANNER were still in development stage, and thus uses non-standard macros and software libraries which do not exist anymore
    4. Software that is not currently being used gradually becomes unusable as the remainder of the application changes.
    5. much software requires continuous changes to meet new requirements and correct bugs, and re-engineering software each time a change is made is rarely practical.
    6. This creates what is essentially an evolution process for the program, causing it to depart from the original engineered design. As a consequence of this and a changing environment, assumptions made by the original designers may be invalidated, introducing bugs.
    7. Infrequently used portions of code, such as document filters or interfaces designed to be used by other programs, may contain bugs that go unnoticed. With changes in user requirements and other external factors, this code may be executed later, thereby exposing the bugs and making the software appear less functional.
    8. There are changes in the environment not related to the program's designer, but its users. Initially, a user could bring the system into working order, and have it working flawlessly for a certain amount of time. But, when the system stops working correctly, or the users want to access the configuration controls, they cannot repeat that initial step because of the different context and the unavailable information (password lost, missing instructions, or simply a hard-to-manage user interface that was first configured by trial and error).
    9. "the quality in a technical system that prevents a user from restoring the system, once it has failed

    10. When changes occur in the program's environment, particularly changes which the designer of the program did not anticipate, the software may no longer operate as originally intended.
    11. will eventually lead to software becoming faulty, unusable, or in need of upgrade.
    12. This is not a physical phenomenon: the software does not actually decay, but rather suffers from a lack of being responsive and updated with respect to the changing environment in which it resides.
    1. Or perhaps there was no printed manual, only a link to a web page - that has since disappeared (because the provider went bust, or just changed their web content management system).
    2. A product’s onceability is, to a certain extent, linked to its usefulness. If it is really useful, we will certainly go to considerable lengths to repair it.
    3. But sometimes not even that helps; the onceability factor can, ultimately, trump the usefulness.
    4. Even if the damned thing would be really helpful in the long run, I can't give it the time and attention needed to make it work again ...  Not right now. And ultimately never.
    5. Onceability can be the result of the exaggerated demand for un-memorable passwords.
    6. I have proposed a new word for this quality: onceability.
    7. It could be defined, tentatively, as "the quality in a technical system that prevents a user from restoring the system, once it has failed".
    8. This, I suggest, is an inherent quality in much new technology: the fact that you, as a user, manage to do something once - but not a second time.
    9. Digital technology may contain no moving parts but it still, somehow, gets worn, splintered and corroded. It rots. It decays. The rot, though, is mostly invisible (and un-smellable). Still, one day, the thing is broken.
    10. But every so often, I wind up a "somewhat-later abandoner". 
    11. I searched for a replacement, but the list of plug-ins had 5000 items and the search function couldn't find anything of the same kind...
    1. As a simple example of a basic runtime system, the runtime system of the C language is a particular set of instructions inserted into the executable image by the compiler. Among other things, these instructions manage the process stack, create space for local variables, and copy function-call parameters onto the top of the stack. There are often no clear criteria for deciding which language behavior is considered inside the runtime system versus which behavior is part of the source program. For C, the setup of the stack is part of the runtime system, as opposed to part of the semantics of an individual program, because it maintains a global invariant that holds over all executions. This systematic behavior implements the execution model of the language
    1. The question, 'What is library and information science?' does not elicit responses of the same internal conceptual coherence as similar inquiries as to the nature of other fields, e.g., 'What is chemistry?', 'What is economics?', 'What is medicine?' Each of those fields, though broad in scope, has clear ties to basic concerns of their field. [...] Neither LIS theory nor practice is perceived to be monolithic nor unified by a common literature or set of professional skills. Occasionally, LIS scholars (many of whom do not self-identify as members of an interreading LIS community, or prefer names other than LIS), attempt, but are unable, to find core concepts in common
    2. fragmented adhocracy

      first sighting: adhocracy

    3. The "Pluridisciplinary" or "multidisciplinarity" level The genuine cross-disciplinary level: "interdisciplinarity" The discipline-forming level "transdisciplinarity"
    4. Some believe that computing and internetworking concepts and skills underlie virtually every important aspect of LIS, indeed see LIS as a sub-field of computer science!
    5. In the last part of the 1960s, schools of librarianship, which generally developed from professional training programs (not academic disciplines) to university institutions during the second half of the 20th century, began to add the term "information science" to their names.
    1. Documentation science gradually developed into the broader field of information science.
    1. antimicro is a graphical program used to map keyboard keys and mouse controls to a gamepad.

      why is it named this?

    2. As of May 24, 2016, antimicro has moved from https://github.com/Ryochan7/antimicro to https://github.com/AntiMicro/antimicro. Additionally, project management has passed from Travis (Ryochan7) to the AntiMicro organization due to Travis having other interests and priorities.
    3. This repo is currently unmaintained. The code hasn't been updated for a while. But not all is lost, antimicro has a future!

      Have to read on to understand...

    1. Peer dependency: A dependency (listed in the peerDependencies field of the manifest) describes a relationship between two packages. Contrary to regular dependencies, a package A with a peer dependency on B doesn't guarantee that A will be able to access B - it's up to the package that depends on A to manually provide a version of B compatible with the request from A. This drawback has a good side too: the package instance of B that A will access is guaranteed to be the exact same one as the one used by the ancestor of A. This matters a lot when B uses instanceof checks or singletons.
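      A minimal manifest sketch of the relationship described above (package names hypothetical): a plugin declares the host library as a peer, so it shares the consumer's single copy of that library, which is what keeps instanceof checks and singletons working:

      ```json
      {
        "name": "example-react-widget",
        "version": "1.0.0",
        "peerDependencies": {
          "react": "^17.0.0"
        }
      }
      ```

      The application that depends on example-react-widget must itself provide a compatible react; the widget never gets its own nested copy.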