1,267 Matching Annotations
  1. Last 7 days
    1. and if there's no server there's really no way to disseminate that address to the peers thus ----> hard code it

      hard code it yes

    2. You'll probably want to hardcode the global - let's call it "leaderboard" - db address in the app anyway, so maybe that's enough

      leaderboard address hard coded
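
      A minimal TypeScript sketch of the hard-coding suggestion above; `openDatabase` is a hypothetical stand-in for whatever distributed-database client the app actually uses, and the address string is a placeholder.

      ```typescript
      // Sketch only: the global "leaderboard" database address is baked into every build,
      // so each peer can open the same shared database without a server to disseminate it.
      // `openDatabase` stands in for the real client library call (not an actual API).
      const LEADERBOARD_ADDRESS = '/db/global-leaderboard-v1'; // hard-coded placeholder

      export async function openLeaderboard(
        openDatabase: (address: string) => Promise<unknown>
      ): Promise<unknown> {
        // Every peer resolves to the same database because the address is identical everywhere.
        return openDatabase(LEADERBOARD_ADDRESS);
      }
      ```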

    1. However, the role of tacit knowledge and interpersonal relationships should not be neglected.

      tacit knowledge

    2. Tacit knowledge has been most often referred to concerning interpersonal strategies. The concept of tacit knowledge emerged in the middle of the 1990s and became very popular in management studies.

      Tacit knowledge in the 1990s? Rather Tacit Knowing, some 30 years earlier

    3. Data management workload should decrease, and personalization of web-based services will emerge.

      Personalization of web based services

    4. Chapter authors define them as technologies, techniques, and methods that can be utilized under ICT or interpersonal knowledge management strategy. Techniques are focused more on tacit knowledge and emphasize human interaction. In addition, they are more affordable and easier to maintain but require learning strategy to function well. In contrast, technologies require ICT infrastructure and ICT skills and are more expensive to purchase and maintain. They are oriented primarily on explicit knowledge (Al-Ghassani et al., 2005, p. 84).

      interpersonal cheaper

    5. In contrast, tools for interpersonal knowledge management are often not mentioned in holistic considerations. Organizations have difficulties selecting the appropriate tools due to the vast array of available products.

      interpersonal not mentioned

    6. ICT strategies are connected to explicit knowledge, which is data driven, codifiable, and not connected to people. Interpersonal strategies are connected to tacit knowledge, which is a person’s own knowledge and is usually not codifiable. In practice, entire strategic focuses are branching off from two main types of knowledge

      codifiable explicit interpersonal tacit

    7. the implementation of knowledge management strategies is shown to have a direct positive impact on organization’s performance

      KM 4 performance

    8. Organizations are currently facing dynamic transitional changes and are becoming increasingly knowledge-based.

      dynamic transitional changes

  2. Oct 2019
    1. in anticipation of a time when #HTML documents would become a major vehicle for #LinkedData deployment to the #LODCloud (the world's largest #KnowledgeGraph, by far!).

      HTML for Linked Data

    1. the conceptualizing and prototyping of a PKM System (PKMS) aiming at departing from today’s centralized institutional solutions and at strengthening individuals’ sovereignty and collaborations, not at the expense of Organizational KM Systems, but rather as the means to foster a fruitful co-evolution

      strengthening individuals' sovereignty is the key to fruitful co-evolution with Organizational KM systems.

    1. Weaving a Decentralized Semantic Web of (Personal) Knowledge

      part of Web 3.0

    2. MindGraph seamlessly integrates the articulation of knowledge with the elaboration of the terms used to organize and make sense of this knowledge

      instance first situated ontologies as conceptualizations encompassing intents and interpretations

    3. We describe a Semantic Graph model which we call MindGraph, which renders documents virtual, and enables scholarly work to be ‘born semantic’. The content created using MindGraph forms a potentially emergent Global Giant Semantic Web of decentralized Knowledge.

      born semantic virtual documents

    1. only the related nodes to be matched and traversed in the XML document. It avoids traversing through large XML documents and materialize included intensional elements with service calls to external resources.

      avoid traversing through large XML documents

    2. annotating XML documents, in our case, is to define the active parts of the document, as an access point to the data values for the relevant parts of the document

      annotating XML documents

    3. technique of labelled property graph layer to query the semantics in XML data

      labelled property graph layer

    4. intensional parts of the document that represent service calls to the relevant resource of the respective data.

      intensional parts

    5. What types of the XML data will be graphically connected; and how this graph connection can improve the traversing process of the XML document by indexing the XML nodes in graph structure

      graphically connected indexing XML nodes in a graph structure

    6. Online analytical processing (OLAP)

      OLAP

    7. online transaction processing (OLTP)

      OLTP

    8. Tree pattern query (TPQ) matching process

      tree pattern query

    1. but a broken wheel won't work and so attempts to fix a broken wheel produce more variations of a broken wheel

      That's why we have Framework Fatigue.

    1. Notice that the Child component class has no state field. The component object created by the custom JSX tag has values of the JSX tag properties. If you define a state field in the component, your state field overwrites the JSX properties.

      no state field
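
      A sketch of the behavior described above, assuming AppRun's `Component` class and a JSX factory configured for AppRun; the component and property names are illustrative.

      ```tsx
      // Child defines no `state` field, so it renders the values passed as JSX tag properties;
      // adding a `state` field would overwrite those properties, as the quote explains.
      import app, { Component } from 'apprun';

      class Child extends Component {
        // no `state` field here on purpose
        view = (state: { title: string }) => <h2>{state.title}</h2>;
      }

      class Parent extends Component {
        state = {};
        view = () => (
          <div>
            <Child title="Passed in as a JSX property" />
          </div>
        );
      }
      ```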

    2. The lazy components are useful for single-page applications (SPAs), where we can mount multiple components to a single element.

      lazy components

  3. Sep 2019
    1. use components to divide the application code into smaller and manageable units.

      divide code to manageable units

    2. Debugging: We can debug the AppRun applications right inside Visual Studio Code. To do so, we need to install the Debugger for Chrome extension from the marketplace.

      debugger

    3. event publication and subscription (event pub-sub), also known as the event emitter pattern.

      event pub-sub, event emitter pattern

    4. Well-structured applications have decoupled modules or loosely coupled modules.

      well-structured: decoupled or loosely coupled modules are the key to interoperability

    5. I did not find a good answer to the architectural question of how to decouple code modules.

      How to decouple modules? is key

    6. architecture of Elm and the one-way data binding concept from Elm and React because they are simple yet brilliant solutions.

      one way data binding

    7. Application architecture is the discipline that guides application design, as defined in the Gartner IT Glossary. In the book Patterns of Enterprise Application Architecture, Martin Fowler explains application architecture as “The highest-level breakdown of a system into its parts.” Application architecture is not only the structure of the application but also the discipline for breaking down the application logic.

      The best architecture is the one that minimizes the cost of changing the way application logic is envisioned and of changing how it was broken down earlier.

    8. All the ancient and old-fashioned stuff has beaten out the high-technology. This story tells us that technology changes, but culture lasts. The real value of culture is buried in time. So, sticking to core concepts and finding out the true value of tools is the way to navigate through JavaScript fatigue.

      Technology changes, culture lasts; core concepts and the true value of tools are the way to navigate through JavaScript fatigue

    9. What we need in real-world business application development is a stable technology that developers can use not only to develop new applications but also to continue developing new features progressively for existing applications even when they are using different technologies.

      no rewrites; progressive development based on stable core concepts

    10. The benefit of using events is that they can decouple modules. Module A and Module B do not know each other. They only need to know the global app object. Module B does not have reference to and is not dependent on Module A. Module A and Module B depend only on the global app. Therefore, Module A and Module B are decoupled. Event pub-sub is an effective method to decouple modules. By using event pub-sub, the building blocks or modules in the AppRun architecture are decoupled from each other. They communicate and invoke the functionalities through the events.

      depend only on the global app (see the pub-sub sketch below)
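
      A sketch of the decoupling described above, using AppRun's global event bus (`app.on` to subscribe, `app.run` to publish); the module split and event name are illustrative.

      ```typescript
      import app from 'apprun';

      // "Module B": subscribes to an event by name; it knows nothing about Module A.
      app.on('items-loaded', (items: string[]) => {
        console.log(`Module B received ${items.length} items`);
      });

      // "Module A": publishes the event; it only knows the global `app` object.
      app.run('items-loaded', ['alpha', 'beta', 'gamma']);
      ```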

    11. AppRun solves the component relationship and communication problem by using event pub-sub. AppRun components are decoupled and isolated modules. Elm’s concern is not an issue in AppRun. In AppRun applications, the component is a mini-application and has a component-scoped AppRun architecture, which includes the three architecture parts discussed previously: state, view, and update. Components communicate with each other through the events.

      decoupled components, each a mini-application composed of state, view, and update (see the sketch below)
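
      An illustrative component-scoped state/view/update triple, as described above; it assumes AppRun, and the counter example itself is mine, not from the book.

      ```tsx
      import app, { Component } from 'apprun';

      class Counter extends Component {
        state = 0;
        view = (state: number) => (
          <div>
            <button $onclick="dec">-</button> {state} <button $onclick="inc">+</button>
          </div>
        );
        update = {
          inc: (state: number) => state + 1,
          dec: (state: number) => state - 1,
        };
      }

      // Mount the mini-application onto the element with id "counter".
      new Counter().start('counter');
      ```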

    12. Single-Page Applications

      SPA

    13. the Hacker News reader application uses both the Firebase API and the JSON API. It leverages AppRun pub-sub to connect the Firebase events to the AppRun events.

      AppRun pub-sub

    1. hypertextual method of inquiry

      Hypertext as Method

    1. rise of computational notebooks as a prime example of a new kind of collaborative and highly malleable applications.

      computational notebooks

    1. a mechanism that extracts knowledge models from textbooks and enriches their content with additional links (both internal and external). The textbooks essentially become hypertext documents where individual pages are annotated with important concepts in the domain. We also show that extracted models can be automatically connected to the Linked Open Data cloud, which helps further facilitate access, discovery, enrichment, and adaptation of textbook content. Integrating multiple textbooks from the same domain increases the coverage of the composite model while keeping its accuracy relatively high. The overall results of the evaluation show that the proposed approach can generate models of good quality and is applicable across multiple domains.

      knowledge models from text book linked to LODC

    1. RealWorld Routing specification is as follows:

      Thinking of routes as events triggered by changing URLs (see the sketch below)
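
      A generic sketch of "routes as events": every hash change publishes an event named after the new hash, and whatever registered a handler for that name renders. This only illustrates the idea; it is not the RealWorld spec's or any framework's actual router.

      ```typescript
      type RouteHandler = () => void;
      const routes = new Map<string, RouteHandler>();

      export function route(path: string, handler: RouteHandler): void {
        routes.set(path, handler);
      }

      window.addEventListener('hashchange', () => {
        const handler = routes.get(location.hash || '#/');
        if (handler) handler(); // the URL change itself is the triggering event
      });

      // Usage: register handlers, then changing location.hash (or clicking a #-link) fires them.
      route('#/', () => console.log('render home'));
      route('#/settings', () => console.log('render settings'));
      ```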

    2. It has about 1,000 lines of source that can be gzipped to 18K. React/Redux has about 2,000 lines/211K, React/MobX has 1,900 lines/122K, Angular has 2,000 lines/570K, and Elm has 4,000 lines/101K.

      10x on React

    1. Remember: Logic and I/O are separate concerns.Logic is thinking. Effects are actions. Think before you act!

      Think before you act

    2. jQuery custom events to turn the DOM into a pub/sub event bus to decouple view rendering concerns from state logic.

      DOM pub/sub event bus

    3. Use pub/sub to decouple I/O from views and program logic. Rather than directly triggering side-effects in UI views or program logic, emit an event or action object describing an event or intent.

      pub/sub emit an event or action object describing the intent
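
      A sketch of the advice above: program logic emits a plain action object describing intent, and a separately subscribed I/O handler decides how to perform the side effect. The tiny event bus and the action name here are illustrative, not from the article.

      ```typescript
      type Action = { type: string; payload?: unknown };
      type Listener = (action: Action) => void;

      const listeners: Listener[] = [];
      const subscribe = (listener: Listener) => listeners.push(listener);
      const emit = (action: Action) => listeners.forEach(l => l(action));

      // I/O layer: performs the actual side effect when it sees the intent.
      subscribe(action => {
        if (action.type === 'SAVE_NOTE') {
          console.log('persisting', action.payload); // e.g. a fetch() call would live here
        }
      });

      // View / program logic: says what should happen, not how.
      emit({ type: 'SAVE_NOTE', payload: { id: 1, text: 'hello' } });
      ```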

    4. supposed atomic units of composition are not really atomic

      atomic units of composition

    5. Pure functions can be safely memoized, meaning that, if the system had infinite memory, any pure function could be replaced with a lookup table that uses the function’s input as an index to retrieve a corresponding value from the table.

      pure functions memoized
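
      A minimal memoization of a pure function, following the quote above: because equal inputs always yield equal outputs, the function can be backed by a lookup table keyed on its argument. The helper name is mine.

      ```typescript
      function memoize<A, R>(fn: (arg: A) => R): (arg: A) => R {
        const table = new Map<A, R>(); // the "lookup table"
        return (arg: A): R => {
          if (!table.has(arg)) table.set(arg, fn(arg));
          return table.get(arg)!;
        };
      }

      const square = memoize((n: number) => n * n);
      square(12); // computed
      square(12); // served from the table
      ```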

    6. less coupling is desirable for its own sake because it makes code easier to extend and maintain.

      less coupling easier to extend and maintain

    1. A pure JavaScript implementation of git for node and browsers!

      git for node

    1. This symbolic system is called IEML, for Information Economy MetaLanguage. It is: (1) an artificial language that automatically computes its internal semantic relations and translates itself into natural languages, (2) a metadata language for the collaborative semantic tagging of digital data, (3) a new addressing layer of the digital medium (conceptual addressing) solving the semantic interoperability and network efficiency problems, (4) a semantic coordinate system of the mind (the semantic sphere), allowing the computational modeling of human cognition and the self-observation of collective intelligence.

      IEML

    2. a new addressing layer of the digital medium (conceptual addressing) solving the semantic interoperability and network efficiency problems,

      conceptual addressing

    1. Giving back to the users the information that they produce, enabling reflexive collective intelligence.

      reflexive collective intelligence

    1. We plan to compete with React, Vue, Angular and other frameworks and libraries to become the #1 library.

      compete with React

    1. The new generation of AI systems that are powered by knowledge graphs tackle those issues by integrating data in a graph structure and by providing contextual knowledge.

      powered by knowledge graphs

    1. Two-way data binding is possible, but think YAGNI

      end of two way data binding

    2. We drive the app/component update life-cycle using events.

      drive life cycle via events

    1. The sooner you paint and the sooner someone can do something, the better the experience for the person who is using the App.

      AppRun the best

    1. I argue against using these fast food frameworks ever because they create slow user experiences, which of course is contrary to the goal of PWA

      fast food frameworks, clueless bloatware

    1. Universal Language Model Fine-tuning (ULMFiT), an effective transfer learning method that can be applied to any task in NLP, and introduce techniques that are key for fine-tuning a language model.

      ULMFiT

      transfer learning

    1. applicative style programming is made available in a history sensitive, but non-von Neumann system.

      stateless computation driving state transitions in meaningful chunks

    2. Applicative State Transition Systems (AST Systems)

      applicative style programming is made available in a history sensitive, but non-von Neumann system. Just what is needed on the (re)(decent)ralised Web.

    3. 3 Structure of an AST System

      AST system structure

    4. Remarks About AST Systems

      AST Remarks

    5. The Structure of Algol Compared to That of AST System

      read from here. That's the model we need

    1. the technology for coping with large-scale computer systems merges with the technology for building new computer languages, and computer science itself becomes no more (and no less) than the discipline of constructing appropriate descriptive languages.

      language oriented programming

    2. Metalinguistic abstraction--establishing new languages--plays an important role in all branches of engineering design. It is particularly important to computer programming, because in programming not only can we formulate new languages but we can also implement these languages by constructing evaluators. An evaluator (or interpreter) for a programming language is a procedure that, when applied to an expression of the language, performs the actions required to evaluate that expression.

      metalinguistic abstraction definition
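
      A tiny illustration of the quoted definition: an evaluator is a procedure that, applied to an expression of a language (here a made-up arithmetic language, not one from the book), performs the actions required to evaluate it.

      ```typescript
      type Expr =
        | { kind: 'num'; value: number }
        | { kind: 'add'; left: Expr; right: Expr }
        | { kind: 'mul'; left: Expr; right: Expr };

      function evaluate(expr: Expr): number {
        switch (expr.kind) {
          case 'num': return expr.value;
          case 'add': return evaluate(expr.left) + evaluate(expr.right);
          case 'mul': return evaluate(expr.left) * evaluate(expr.right);
        }
      }

      // (2 + 3) * 4 expressed in the made-up language:
      evaluate({
        kind: 'mul',
        left: { kind: 'add', left: { kind: 'num', value: 2 }, right: { kind: 'num', value: 3 } },
        right: { kind: 'num', value: 4 },
      }); // 20
      ```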

    1. Language Oriented Programming in MetaLisp, Gyuri Lajos's 1992 thesis, University of Leeds

      Taking the idea of metalinguistic abstraction to its logical conclusion

    1. only documents have a status on the web.

      Web Annotations like

    2. As a matter of fact text has no status on the web, only documents do.

      See my reply

    1. freeing metadata

      is important.

      One of the key lessons of the past 20 years of the semantic web has been:

      “He who controls metadata controls the Web and, through the Web, many things of our world”.

      https://hyp.is/T-mzyHYbEemwRHMZqK0FnQ/hal.inria.fr/hal-01935898/document

    2. What really drives me is the idea of freedom of expression, freedom of knowledge transfer and ultimately freedom of exchange.

      freedom of expression indeed

    1. "the power of high level languages is notational rather than computational"

      Beyond Programming

  4. Aug 2019
    1. Creative Intelligence or CQ

      A creative term for Increasing Collective Intelligence. Like it very much.

    2. embrace ambiguity and to engage in a sophisticated, nuanced relationship with complexity

      Pretty Zen, being comfortable with the unknown

    1. STEPS Toward The Reinvention of Programming, 2009 Progress Report Submitted to the National Science Foundation (NSF) October 2009

      STEPS Towards The Reinvention of Programming 2009

    1. Inventing & building a language for STEPS “UI and applications”, and rewriting Frank in it

      inventing & building languages for STEPS "UI and applications"

    2. Producing a workable “chain of meaning” that extends from the screen all the way to the CPU

      workable "chain of meaning" that extends

    3. We did enough to show how "universal" documents combined with a wide variety of storage and transmission mechanisms could subsume word processing, desktop publishing, presentations, email, web-pages, and Hypercard-like pages.

      Universal documents

    1. Writing the previous page of this report in Frank, with a “Halloween themed” user interface look

      writing report on Frank in Frank

    2. as size and complexity increases, architectural design dominates materials!

      size architecture design

    1. In a history-sensitive language, a program can affect the behavior of a subsequent one by changing some store which is saved by the system. Any such language requires some kind of state transition semantics. But it does not need semantics closely coupled to states in which the state changes with every detail of the computation. "Applicative state transition" (AST) systems are proposed as history-sensitive alternatives to von Neumann systems. These have: (a) loosely coupled state-transition semantics in which a transition occurs once per major computation; (b) simple states and transition rules; (c) an underlying applicative system with simple "reduction" semantics; and (d) a programming language and state transition rules both based on the underlying applicative system and its semantics. The next four sections describe the elements of this approach to non-von Neumann language and system design.

      AST summary
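
      A TypeScript sketch (not Backus's FP notation) of the shape summarized above: a pure applicative core computes an output and a whole new state, and the surrounding loop performs exactly one loosely coupled state transition per major computation. The state shape and inputs are illustrative.

      ```typescript
      type State = { history: string[] };

      // Pure applicative core: no assignments to a shared store inside the computation.
      const step = (state: State, input: string): { output: string; next: State } => ({
        output: `seen ${state.history.length} inputs before "${input}"`,
        next: { history: [...state.history, input] }, // the entire new state, produced at once
      });

      // Transition loop: the store changes only between major computations.
      let state: State = { history: [] };
      for (const input of ['a', 'b', 'c']) {
        const { output, next } = step(state, input);
        console.log(output);
        state = next; // the single transition for this computation
      }
      ```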

    2. John Backus IBM Research Laboratory, San Jose

      Backus

    3. Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs

      programming liberated von Neumann Style

    4. The failure of von Neumann languages to treat names as functions may be one of their more important weaknesses. In any case, the ability to use names as functions and stores as objects may turn out to be a useful and important programming concept, one which should be thoroughly explored.

      names as functions stores as objects

    5. functional nature of names (Reynolds' GEDANKEN)

      look it up

    6. By defining appropriate functions one can, I believe, introduce major new features at any time, using the same framework. Such features must be built into the framework of a von Neumann language. I have in mind such features as: "stores" with a great variety of naming systems, types and type checking, communicating parallel processes, nondeterminacy and Dijkstra's "guarded command" constructs [8], and improved methods for structured programming

      introducing stores

    1. A typical example of a conceptual model that represents instances and types connected in this way are maps, which rely on a conceptual modeling language (whose semantics is expressed by the map’s legend) whose constructs connect individual streets, crossings, etc., with the universals they instantiate [3]. Maps are very different then from directly depicting representations – to use a Wittgensteinian expression –

      maps vs directly depicting representations

    2. lucid (such that all the language constructs are interpreted in an unequivocal way in terms of those concepts). These properties are based on the conversational maxims of pragmatical efficiency put forth by the philosopher of language Paul Grice [15],

      lucid

    3. Conceptual models as explicit descriptions of mental models.

      explicit descriptions of mental models

    4. fundamental tenet of Cognitive Science and Philosophy of Mind that cognitive processes generate, use and transform mental representations of the world.

      Seemed a good idea at the time. Clearly a dead end

    5. Attached assertions or procedures to every node, capturing the semantics of the concept being represented

      assertions procedures attached to nodes

    6. Peter Chen’s Entity-Relationship Model

      Chen's Entity-Relationship model

    7. associationist stance, including semantic networks, object-oriented models and description logics

      associationist stance

    8. models of how we conceive of that domain

      models of how we conceive of a domain

    9. argue for the thesis that conceptual models are models of conceptual mental representations that cognitive agents build, use and manipulate during cognition.

      conceptual mental representations

      articulation of tacit awareness understanding in terms of symbols and their interrelations

    1. This shows that the ontology can be progressively improved with more data sheets and the feedback from domain experts.

      ontology progressively improved more data and feedback from experts

    2. We had a domain expert adding the correct entities (group III) to the enriched ontology and disjointing the incorrect entities (group II) from the enriched ontology.

      human in the loop

    3. Ontology enrichment
      • keyword
    4. This paper presents ConTrOn – Continuously Trained Ontology – a system that automatically augments ontologies

      automatically augment ontologies

    5. ontologies rely solely on experiences and perspectives of their creators at the time of creation and cannot accumulate knowledge over time on their own.

      ontologies accumulate knowledge over time

    1. siloed (often meaning duplicate) data entries,

      siloed duplicate

    2. “most companies think they have “Big Data” problems while they actually have big “data problems””

      big "data problems" indeed

    1. high-performance, integrated data warehouse appliance

      data warehouse appliance

    2. content intelligence platform

      content intelligence platform

    3. managing a discussion board or knowledgebase, project management or time tracking tasks, or doing other workflow operations, in order to help extend SharePoint beyond its out-of-the-box capabilities.

      extend SharePoint: discussion board, knowledge base, project planning, workflow

    4. internet-like enterprise search experience

      internet-like enterprise search experience

    5. natural language processing, machine learning, and knowledge graphing

      knowledge graphing

    6. access and share files directly through Google Drive

      google drive

    7. addresses evolving needs.

      address evolving needs

    8. Self-service solution for customer interaction

      self-service customer interaction

    9. remixed into new documents

      remix

    1. Share a more complete understanding of data among both people & applications

      share complex understanding

    2. Benefit from entity-centric views on your data

      entity centric view

    3. Benefit from a “schema-late” approach

      schema late

    4. Benefit from standards-based data models, even along the whole industry supply chain

      standards-based data models

    5. Integrate heterogeneous data sources (structured & unstructured)

      integrate heterogeneous data

    6. Benefit from unified views across multiple data silos within the enterprise

      unified view

    7. "People think RDF is a pain because it is complicated. The truth is even worse. RDF is painfully simplistic, but it allows you to work with real-world data and problems that are horribly complicated."

      RDF painfully simple; could do better handling multidimensionality

    1. Dynamic semantic content publishing: The entity-centric view of text-based information facilitates the dynamic publication of content assets across multiple platforms. This opens up enormous innovation potential for the whole publication life cycle. Configurable landing pages can display varying digital assets depending on subject-matter parameters. With graph queries such as “Display a picture, a video, a blog entry, and the most up-to-date news about XY,” organizations can reuse existing content on the fly and create new digital products. Knowledge workers can search for information and the context surrounding it more precisely, which also makes content creation more efficient.

      semantic content publishing dynamic reuse of existing content on the fly context for content creation
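
      A hedged sketch of issuing such a graph query from application code. The endpoint URL, the topic IRI, and the use of schema.org terms are illustrative assumptions, not details from the article.

      ```typescript
      const ENDPOINT = 'https://example.org/sparql';
      const QUERY = `
        PREFIX schema: <http://schema.org/>
        SELECT ?asset ?type WHERE {
          ?asset schema:about <https://example.org/topic/XY> ;
                 a ?type .
        }
        LIMIT 10`;

      export async function fetchAssetsAboutXY(): Promise<unknown> {
        const res = await fetch(`${ENDPOINT}?query=${encodeURIComponent(QUERY)}`, {
          headers: { Accept: 'application/sparql-results+json' },
        });
        return res.json(); // the bindings can then drive which content blocks a landing page shows
      }
      ```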

    2. superior data processing operations along the graph can be executed on the fly

      execute data processing along the graph

    3. dynamic data with high personalization requirements on an application level.

      personalization application level

    1. semantic middleware for RDF data processing

      semantic middleware

    1. Note: Custom elements are supported by default in Firefox, Chrome, and Edge (76). Opera and Safari so far support only autonomous custom elements.

      What a shame
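
      A minimal autonomous custom element, the variant that, per the note above, even Opera and Safari support; the element name is just an example.

      ```typescript
      class HelloCard extends HTMLElement {
        connectedCallback(): void {
          this.textContent = `Hello, ${this.getAttribute('name') ?? 'world'}!`;
        }
      }
      customElements.define('hello-card', HelloCard);

      // Usage in HTML: <hello-card name="AppRun"></hello-card>
      ```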

    1. Graph-based knowledge representation has been researched for decades and the term knowledge graph does not constitute a new technology. Rather, it is a buzzword reinvented by Google and adopted by other companies and academia to describe different knowledge representation applications.

      Graph-based Knowledge Representation researched for decades KG buzzword

    2. conversely – a knowledge

      a knowledge graph that crawls the entire web could be interpreted as self-contained Semantic Web.

    3. In conclusion, the Semantic Web could be interpreted as the most comprehensive knowledge graph

      Semantic Web most comprehensive Knowledge graph

    4. question arises what constitutes the difference between the Semantic Web and knowledge graphs.

      semantic web as knowledge Graph

    1. The data relationships are more “connectorial” than combinatorial.

      This idea has its deep roots in the view that "the advantages of programming languages are notational rather than computational". The inventor of LISP was not interested when told that it could be run on a computer; the idea of a universal function was far more important. Fifty years on, Jerry Sussman hailed it as the greatest discovery of our time that living systems are universal in the sense of universal machines or functions.

    1. The METHONTOLOGY approach, which is an ontology engineering methodology

      ontology engineering

    2. Leverage the transparency of the theories used to design these systems as well as allowing representing good design principles for effectively designing gamified ITS—i.e., the later benefits could be very useful to aid the design of authoring tools for constructing such systems

      transparency of theories design authoring tools

    3. provide a standard representation for the infrastructure of gamified ITSs, which may enable the interoperability (e.g., to interoperate educational resources) between different architectures of these systems

      interoperability

    4. There is a growing interest in the use of ontologies to address e-learning problems. Particularly, in the context of ITS, ontologies have been used to represent the domain model concepts, to represent students’ modeling allowing automated reasoning, to interoperate heterogeneous ITSs, and so on (Al-Yahya et al., 2015). Ontology is defined as “explicit specification of a conceptualization” (Gruber, 1993). It is “explicit” because of its classes and properties visibility. Conceptualization is understood to be an abstract and simplified version of the world to be represented. Moreover, ontologies can be logically reasoned and shared within a specific domain (Guarino, 1998). Thus, ontologies are a standard form for representing the concepts within a domain, as well as the relationships between those concepts in a way that allows automated reasoning

      ontologies for e-learning problems allow machine reasoning

    1. For years, my only metric of success was building a billion-dollar company. Now, I realize that was a terrible goal.

      terrible goal indeed

    1. simplify system development by allowing the automation of everything capable of being automated.

      simplify through automation

    2. software development tools for agile software, based on capturing and managing knowledge of both users and customers

      agile software based on knowledge


    1. content calendar is a must, to have it mapped out for each stage of the buying process by persona.

      content calendar mapped by persona buying process

    2. define measures for lead nurturing and actions to take

      measures lead nurturing actions

    3. having a strategic approach that will lead you to reinvent your ways of working.

      reinvent ways of working

    1. The first study compares the development of mapping rules by employing either the MapVOWL visual notation or the RML language directly

      RML MapVOWL

    2. Following an approach similar to Fresnel’s lenses and formats two templates were defined for higher level concepts—one consisting of a SPARQL query to extract relevant resources and another to specify their visual presentation. The initial prototypes were evaluated as a part of the development process, and the evaluation results were incorporated into the successive development workflow

      Fresnel-lenses

    3. node-link diagrams, based on the SARO ontology, were created to facilitate the visual identification of relationships between skills.

      node-link diagrams

    4. complementary visualizations—timelines, small-multiples, parallel coordinates—were implemented to provide overview, enable identification of trends and regions of interests, and obtain insight.

      timelines, small-multiples, parallel coordinates

    5. Users typically have problems understanding certain language features and justifications of OWL entailments. As a result, the addition and removal of axioms may lead to unintended consequences, which poses a significant cognitive challenge for developers to identify, as tools rarely provide adequate feedback. Exploring the inferred class hierarchy after running a reasoner is a common activity for examining the consequences of ontology changes.

      understanding language features

    6. Developing ontologies is a difficult and error-prone undertaking due to the complexity of knowledge representation language

      developing ontologies error-prone

    7. converting content stored in relational databases into RDF and studied seventeen tools comprising of both tools compliant to the W3C R2RML recommendation and tools implementing their own mapping languages.

      relational to RDF R2RML

    8. ontology-based data access (OBDA)

      OBDA

    9. The authors propose Linked Data-driven Web Components where the RDF data model is also used to describe their content, metadata, scope of user interactions and to support customization

      Web Components

    10. Cognitive aspects emerge as an essential ingredient for future work on knowledge acquisition, representation, reasoning, and interactions on the Semantic Web

      essential ingredient

    11. cognitive aspects of user activities

      cognitive aspects

    12. inherently intricate content.

      inherently intricate

    13. visualization and interaction techniques for ontology engineering as well as the production and consumption of Linked Data in traditional and novel interaction contexts

      visualizations, interaction techniques

    14. Meanwhile, visual interfaces for modeling, editing, exploring, integrating, etc., of semantic content have not received much attention yet.

      visual interfaces semantic contents

    15. The Semantic Web enables intelligent agents to create knowledge by interpreting, integrating and drawing inferences from the abundance of data at their disposal. It encompasses approaches and techniques for expressing and processing data in machine-readable formats. All these tasks demand a human-in-the-loop; without them the great vision of the Semantic Web would hardly be achieved

      create knowledge interpreting integrating inferencing requires human-in-the-loop

    1. The KM perspective taken prioritizes a decentralizing agenda benefiting knowledge workers while also aiming to foster a fruitful co-evolution with traditional organizational KM approaches. Findings – The notions of generative fit and capacities in their technical, informational, and social interpretations prove to be able to not only accommodate diverse KM models but also to cumulatively synthesize a wide range of related concepts and perspectives.

      decentralizing agenda co-evolving personal and organizational knowledge management

    1. Business level primary keys, even be they single-valued, get into some real-life problems

      "Key" Problems

      identity and naming

      what is a thing, what's in a name, identity of things, what's in a link etc

    2. Object-Role-Modeling (ORM)

      ORM

    3. What you see above is actually a directed graph representing a piece of a data model at the most “atomic level.” Chen’s work really is quite similar to the graph models depicted earlier in this chapter. And he published this in 1976

      Chen's model 1976

    4. The “semantic web” category of graph databases, networks of “triples” representing subject-predicate-object “sentences” in the RDF (XML) language

      semantic web RDF
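
      An illustration of the triple structure mentioned above: every statement is a subject-predicate-object "sentence". The predicates use the (real) FOAF vocabulary, but the individuals and the query are made-up examples.

      ```typescript
      type Triple = { subject: string; predicate: string; object: string };

      const graph: Triple[] = [
        { subject: 'http://example.org/alice', predicate: 'http://xmlns.com/foaf/0.1/knows', object: 'http://example.org/bob' },
        { subject: 'http://example.org/alice', predicate: 'http://xmlns.com/foaf/0.1/name',  object: '"Alice"' },
      ];

      // Querying the graph is pattern matching over triples:
      const friendsOfAlice = graph
        .filter(t => t.subject === 'http://example.org/alice' &&
                     t.predicate === 'http://xmlns.com/foaf/0.1/knows')
        .map(t => t.object); // ["http://example.org/bob"]
      ```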

    5. structure (relationships) is of higher importance than contents (the list of properties).

      structure (relationships) more important than content properties

    6. relationships, and they express vivid dynamics. This is the space that the graph data models explore.

      graphs explore vivid dynamics

    7. Relationships in the sense that there is a relationship between Customer and Order are second class citizens labeled “constraints.”

      relationships second class citizens as labelled constraints

    8. One of the great misconceptions of the data modeling community has been that relational databases are about relationships.

      misconception relational about relationships

    9. related attributes existing together

      related attributes exist together

    10. data models (which represent business semantics)

      data models = business semantics

    11. graphs emerged remarkably late

      indeed

    12. Frankly, I much prefer Peter Chen’s representation covered a couple of sections ago

      prefer Chen to UML indeed

    13. With the advent of SQL, the named relationships were not named anymore. Since foreign key relationships are constraints, and constraints may have names in most SQL implementations, it really is strange why this happened. From a business semantics point of view, this is a very sad loss of information

      named relationships not named

    14. This book is about recycling data models expressed (rather high level, typically) in meta models such as XML / UML®, concept maps etc.

      model-2-model transformation

    1. more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble

      comprehension solutions

    1. if we want algorithms to understand textual sources we need to provide these machine readers with links to concepts with unambiguously defined meaning.

      provide machine readers with links to concepts with unambiguously defined meaning

      for human readers provide links to things in a personal knowledge graph described in LinkedText

    2. Leaving digital “margin” notes for the new readers on the block

      digital margin notes new readers on the block

    3. “Marginalia (or apostils) are marks made in the margins of a book or other document. They may be scribbles, comments, glosses (annotations), critiques, doodles, or illuminations.

      marginalia illuminations

    4. Semantic annotation is a tool that gives us the ability to express, refer to and thus make documents and parts of texts machine-processable.

      semantic annotation make texts machine-processable

    5. turning textual sources into data assets is best applied in the areas where knowledge is explicit and multiple, and ambiguous interpretations are rare

      ambiguous interpretations are rare

    6. interpretation is a matter of computation.

      interpretation matter of computation

    7. not a silver bullet for discovering knowledge and teasing meaning out of data

      teasing meaning out of data

    8. Knowledge discovery (through automatically discovering references to concepts and entities)

      discovery of references to concepts and entities

    9. Knowledge management ( through aggregating all relevant information)

      aggregating relevant information

    10. platform for interactive relationships discovery, called Linked Life Data,

      relationship discovery

    11. CONNECTING THE DOTS BECOMES CONNECTING THE NODES

      connecting the dots as nodes

    12. Examples of companies that took the leap into creating smart content

      smart content

    13. Turning texts into data coupled with linking these data to other sources

      text to data linked

    14. semantic information extraction and semantic annotation

      semantic information extraction annotation

    15. meaning rather than the structure of the data

      meaning rather than structure

    16. express rich, self-describing interrelations of data in a form that machines can process.

      rich, self-describing interrelations

    17. show related facts and items instead of just word matching

      show related facts and items

    18. Why Interweave Semantic Data Into Texts

      interweave semantic data into text

    19. the information our textual sources contain is only as good as our ability and tools to extract and interpret it

      ability to extract interpret