48 Matching Annotations
  1. Apr 2024
    1. The consensus is reached in the same way as for transactions, i.e. using the hashgraph consensus algorithm. The only difference is that the events in question in the hashgraph now contain another type of data instead of transactions

      Not necessarily; how to store received events is an implementation detail. One could dump them into a separate array, which can be as efficient as an array of pointers to events, where the index in that array is the event's position in the total order. A rough sketch of this idea follows below.
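      A minimal sketch of that idea, assuming a hypothetical Event record (nothing here is taken from the hashgraph paper): once an event reaches consensus, append a reference to it to a flat list, and the list index is the event's position in the total order.

      ```python
      # Hypothetical sketch: a flat log where the list index == total-order position.
      from dataclasses import dataclass, field
      from typing import Any, List

      @dataclass
      class Event:                       # invented event record, not the paper's structure
          creator: str
          payload: Any                   # transactions or any other kind of data

      @dataclass
      class ConsensusLog:
          ordered: List[Event] = field(default_factory=list)

          def commit(self, event: Event) -> int:
              """Append an event that has just reached consensus; return its position."""
              self.ordered.append(event)
              return len(self.ordered) - 1

          def at(self, position: int) -> Event:
              """O(1) lookup of an event by its total-order position."""
              return self.ordered[position]
      ```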

  2. Aug 2023
  3. Jul 2023
    1. Conceptual data model: describes the semantics of a domain, being the scope of the model. For example, it may be a model of the interest area of an organization or industry. This consists of entity classes, representing kinds of things of significance in the domain, and relationship assertions about associations between pairs of entity classes. A conceptual schema specifies the kinds of facts or propositions that can be expressed using the model. In that sense, it defines the allowed expressions in an artificial 'language' with a scope that is limited by the scope of the model.
    2. "Data models for different systems are arbitrarily different. The result of this is that complex interfaces are required between systems that share data. These interfaces can account for between 25-70% of the cost of current systems".
    3. The term data model can refer to two distinct but closely related concepts
    4. A data model can sometimes be referred to as a data structure, especially in the context of programming languages.
    5. Sometimes it refers to an abstract formalization of the objects and relationships found in a particular application domain
    6. A data model[1][2][3][4][5] is an abstract model that organizes elements of data and standardizes how they relate to one another and to the properties of real-world entities.
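      As a rough illustration of the "conceptual data model" excerpt above (entity classes plus relationship assertions between pairs of entity classes), here is a toy sketch; the entity names and the relationship (Customer, Order, "places") are invented for the example, not taken from the annotated article.

      ```python
      # Toy conceptual schema: entity classes and a relationship assertion between them.
      from dataclasses import dataclass

      @dataclass
      class EntityClass:
          name: str

      @dataclass
      class Relationship:
          name: str
          source: EntityClass
          target: EntityClass

      customer = EntityClass("Customer")
      order = EntityClass("Order")

      # "A Customer places an Order" -- the kind of proposition the schema permits.
      places = Relationship("places", source=customer, target=order)

      print(f"{places.source.name} {places.name} {places.target.name}")
      ```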
  4. Jan 2023
    1. The claim that individual learning may depend on the behavior of others highlights the importance of viewing the learning environment as a system involving multiple interacting participants
  5. Aug 2022
  6. Apr 2022
    1. ReconfigBehSci. (2022, January 24). @STWorg @FraserNelson @GrahamMedley no worse- he took Medley’s comment that Sage model the scenarios the government asks them to consider to mean that they basically set out to find the justification for what the government already wanted to do. Complete failure to distinguish between inputs and outputs of a model [Tweet]. @SciBeh. https://twitter.com/SciBeh/status/1485625862645075970

    1. Dr Nisreen Alwan 🌻. (2020, March 14). Our letter in the Times. ‘We request that the government urgently and openly share the scientific evidence, data and modelling it is using to inform its decision on the #Covid_19 public health interventions’ @richardhorton1 @miriamorcutt @devisridhar @drannewilson @PWGTennant https://t.co/YZamKCheXH [Tweet]. @Dr2NisreenAlwan. https://twitter.com/Dr2NisreenAlwan/status/1238726765469749248

  7. Feb 2022
  8. Jan 2022
  9. Jul 2021
  10. May 2021
  11. Mar 2021
  12. Nov 2020
    1. We love dbt because of the values it embodies. Individual transformations are SQL SELECT statements, without side effects. Transformations are explicitly connected into a graph. And support for testing is first-class. dbt is hugely enabling for an important class of users, adapting software engineering principles to a slightly different domain with great ergonomics. For users who already speak SQL, dbt’s tooling is unparalleled.

      when using [[dbt]], the [[transformations]] are [[SQL statements]] - already something that our team knows

    1. The attribution data model: In reality, it’s impossible to know exactly why someone converted to being a customer. The best thing that we can do as analysts is provide a pretty good guess. In order to do that, we’re going to use an approach called positional attribution. This means, essentially, that we’re going to weight the importance of various touches (customer interactions with a brand) based on their position (the order they occur in within the customer’s lifetime). To do this, we’re going to build a table that represents every “touch” that someone had before becoming a customer, and the channel that led to that touch.

      One of the goals of an [[attribution data model]] is to understand why someone [[converted]] to being a customer. This is impossible to do accurately, but this is where analysis comes in.

      There are some [[approaches to attribution]], one of those is [[positional attribution]]

      [[positional attribution]] means weighting the importance of touch points, or customer interactions, based on their position within the customer's lifetime. A rough sketch of this idea follows below.
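      A small illustrative sketch of positional attribution, not the source's actual dbt/SQL model; the 40/20/40 split below is a common convention and an assumption here, not a weighting taken from the annotated article.

      ```python
      # Assign a weight to each touch based on its position in the customer's lifetime.
      from typing import Dict, List

      def positional_weights(touches: List[str]) -> Dict[int, float]:
          """Return {position: weight} for an ordered list of touches; weights sum to 1.0."""
          n = len(touches)
          if n == 0:
              return {}
          if n == 1:
              return {0: 1.0}
          if n == 2:
              return {0: 0.5, 1: 0.5}
          weights = {0: 0.4, n - 1: 0.4}        # first and last touches get the most credit
          middle_share = 0.2 / (n - 2)          # remaining credit split across the middle touches
          for i in range(1, n - 1):
              weights[i] = middle_share
          return weights

      # Example: three touches before conversion
      touches = ["organic_search", "email", "paid_search"]
      for position, weight in positional_weights(touches).items():
          print(touches[position], weight)
      ```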

  13. Oct 2020
  14. Aug 2020
  15. Jul 2020
  16. May 2020
  17. Apr 2020
  18. Feb 2020
  19. Jan 2020
    1. The Web Annotation Data Model specification describes a structured model and format to enable annotations to be shared and reused across different hardware and software platforms.

      The publication of this web standard nearly three years ago was a game changer, but the game is still in progress. I look forward to true testing of interoperable open annotation. The future potential is unlimited!
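      As a rough illustration of the model the excerpt describes, here is a minimal annotation in the W3C Web Annotation Data Model, sketched as a Python dict and serialized to JSON-LD; the IRIs and body text are invented placeholders.

      ```python
      # Minimal Web Annotation sketch: an Annotation with a textual body and a target.
      import json

      annotation = {
          "@context": "http://www.w3.org/ns/anno.jsonld",
          "id": "http://example.org/anno1",          # hypothetical annotation IRI
          "type": "Annotation",
          "body": {
              "type": "TextualBody",
              "value": "A comment about the target passage.",
              "format": "text/plain",
          },
          "target": "http://example.org/page1",      # the resource being annotated
      }

      print(json.dumps(annotation, indent=2))
      ```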

  20. Nov 2019
  21. Sep 2019
    1. On the other hand, a resource may be generic in that as a concept it is well specified but not so specifically specified that it can only be represented by a single bit stream. In this case, other URIs may exist which identify a resource more specifically. These other URIs identify resources too, and there is a relationship of genericity between the generic and the relatively specific resource.

      I was not aware of this page when the Web Annotations WG was working through its specifications. The word "Specific Resource" used in the Web Annotations Data Model Specification always seemed adequate, but now I see that it was actually quite a good fit.
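      For reference, this is roughly what a Specific Resource looks like in the Web Annotation Data Model: a more specific view of a generic source resource, narrowed by a selector. A sketch follows; the source URL and quoted text are invented placeholders.

      ```python
      # Sketch of a SpecificResource: a generic source plus a selector that narrows it.
      import json

      specific_resource = {
          "type": "SpecificResource",
          "source": "http://example.org/page1",      # the generic resource
          "selector": {
              "type": "TextQuoteSelector",           # narrows it to one passage
              "exact": "a relationship of genericity",
          },
      }

      print(json.dumps(specific_resource, indent=2))
      ```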

  22. Apr 2019
  23. Sep 2016
    1. The importance of models may need to be underscored in this age of “big data” and “data mining”. Data, no matter how big, can only tell you what happened in the past. Unless you’re a historian, you actually care about the future — what will happen, what could happen, what would happen if you did this or that. Exploring these questions will always require models. Let’s get over “big data” — it’s time for “big modeling”.
  24. Feb 2015