245 Matching Annotations
  1. Last 7 days
    1. The Lisp Machine (which could just as easily have been, say, a Smalltalk machine) was a computing environment with a coherent, logical design, where the “turtles go all the way down.” An environment which enabled stopping, examining the state of, editing, and resuming a running program, including the kernel. An environment which could actually be fully understood by an experienced developer.  One where nearly all source code was not only available but usefully so, at all times, in real time.  An environment to which we owe so many of the innovations we take for granted. It is easy for us now to say that such power could not have existed, or is unnecessary. Yet our favorite digital toys (and who knows what other artifacts of civilization) only exist because it was once possible to buy a computer designed specifically for exploring complex ideas.  Certainly no such beast exists today – but that is not what saddens me most.  Rather, it is the fact that so few are aware that anything has been lost.
    2. The reason for this is that HyperCard is an echo of a different world. One where the distinction between the “use” and “programming” of a computer has been weakened and awaits near-total erasure.  A world where the personal computer is a mind-amplifier, and not merely an expensive video telephone.  A world in which Apple’s walled garden aesthetic has no place. What you may not know is that Steve Jobs killed far greater things than HyperCard.  He was almost certainly behind the death of SK8. And the Lisp Machine version of the Newton. And we may never learn what else. And Mr. Jobs had a perfectly logical reason to prune the Apple tree thus. He returned the company to its original vision: the personal computer as a consumer appliance, a black box enforcing a very traditional relationship between the vendor and the purchaser. Jobs supposedly claimed that he intended his personal computer to be a “bicycle for the mind.” But what he really sold us was a (fairly comfortable) train for the mind. A train which goes only where rails have been laid down, like any train, and can travel elsewhere only after rivers of sweat pour forth from armies of laborers. (Preferably in Cupertino.) The Apple of Steve Jobs needed HyperCard-like products like the Monsanto Company needs a $100 home genetic-engineering set. The Apple of today, lacking Steve Jobs — probably needs a stake through the heart.
    1. It is this combination of features that also makes HyperCard a powerful hypermedia system. Users can build backgrounds to suit the needs of some system, say a rolodex, and use simple HyperTalk commands to provide buttons to move from place to place within the stack, or provide the same navigation system within the data elements of the UI, like text fields. Using these features, it is easy to build linked systems similar to hypertext links on the Web.[5] Unlike the Web, programming, placement, and browsing were all the same tool. Similar systems have been created for HTML but traditional Web services are considerably more heavyweight.
    1. Such are great historical men—whose own particular aims involve those large issues which are the will of the World-Spirit.
  2. Jul 2019
    1. One way to look at this is that when a new powerful medium of expression comes along that was not enough in our genes to be part of traditional cultures, it is something we need to learn how to get fluent with and use. Without the special learning, the new media will be mostly used to automate the old forms of thought. This will also have effects, especially if the new media is more efficient at what the old did: this can result in gluts that act like legal drugs (as indeed did the industrial revolution's ability to create sugar and fat); it can also overproduce stories, news, status, and new forms of oral discourse.
    2. To understand what has happened, we only need to look at the history of writing and printing to note two very different consequences (a) the first, a vast change over the last 450 years in how the physical and social worlds are dealt with via the inventions of modern science and governance, and (b) that most people who read at all still mostly read fiction, self-help and religion books, and cookbooks, etc.* (all topics that would be familiar to any cave-person).
    1. A practical example of service design thinking can be found at the Myyrmanni shopping mall in Vantaa, Finland. The management attempted to improve the customer flow to the second floor as there were queues at the landscape lifts and the KONE steel car lifts were ignored. To improve customer flow to the second floor of the mall (2010) Kone Lifts implemented their 'People Flow' Service Design Thinking by turning the Elevators into a Hall of Fame for the 'Incredibles' comic strip characters. Making their Elevators more attractive to the public solved the people flow problem. This case of service design thinking by Kone Elevator Company is used in literature as an example of extending products into services.
    1. In 1996 and 1998, a pair of workshops at the University of Glasgow on information retrieval and human–computer interaction sought to address the overlap between these two fields. Marchionini notes the impact of the World Wide Web and the sudden increase in information literacy – changes that were only embryonic in the late 1990s.

      it took a half a century for these disciplines to discern their complementarity!

    1. I do not actually know of a real findability index, but tools in the field of information retrieval could be applied to develop one. One of the unsolved problems in the field is how to help the searcher to determine if the information simply is not available.
    2. Although some have written about information overload, data smog, and the like, my view has always been the more information online, the better, so long as good search tools are available. Sometimes this information is found by directed search using a web search engine, sometimes by serendipity while following links, and sometimes by asking hundreds of people in our social network or hundreds of thousands of people on a question-answering website such as Answers.com, Quora, or Yahoo Answers.
    1. Unfortunately, misguided views about usability still cause significant damage in today's world. In the 2000 U.S. elections, poor ballot design led thousands of voters in Palm Beach, Florida to vote for the wrong candidate, thus turning the tide of the entire presidential election. At the time, some observers made the ignorant claim that voters who could not understand the Palm Beach butterfly ballot were not bright enough to vote. I wonder if the people who made such claims have ever made the frustrating "mistake" of trying to pull open a door that requires pushing. Usability experts see this kind of problem as an error in the design of the door, rather than a problem with the person trying to leave the room.
    2. The web, in yet another example of its leveling effect, allows nearly everyone to see nearly every interface. Thus designers can learn rapidly from what others have done, and users can see if one web site's experience is substandard compared to others.
    1. At the start of the 1970s, The New Communes author Ron E. Roberts classified communes as a subclass of a larger category of Utopias.[5] He listed three main characteristics. Communes of this period tended to develop their own characteristics of theory, though, so while many strove for variously expressed forms of egalitarianism, Roberts' list should never be read as typical. Roberts' three listed items were: first, egalitarianism – that communes specifically rejected hierarchy or graduations of social status as being necessary to social order. Second, human scale – that members of some communes saw the scale of society as it was then organized as being too industrialized (or factory sized) and therefore unsympathetic to human dimensions. And third, that communes were consciously anti-bureaucratic.
    1. Another prominent conclusion is that joint asset ownership is suboptimal if investments are in human capital.

      Does that have to be the case?

    1. Other examples of complex adaptive systems are: Stock markets: Many traders make decisions on the information known to them and their individual expectations about future movements of the market. They may start selling when they see the prices are going down (because other traders are selling). Such herding behavior can lead to high volatility on stock markets. Immune systems: Immune systems consist of various mechanisms, including a large population of lymphocytes that detect and destroy pathogens and other intruders in the body. The immune system needs to be able to detect new pathogens for the host to survive and therefore needs to be able to adapt. Brains: The neural system in the brain consists of many neurons that are exchanging information. The interactions of many neurons make it possible for me to write this sentence and ponder the meaning of life. Ecosystems: Ecosystems consist of many species that interact by eating other species, distributing nutrients, and pollinating plants. Ecosystems can be seen as complex food webs that are able to cope with changes in the number of certain species, and adapt – to a certain extent – to changes in climate. Human societies: When you buy this new iPhone that is manufactured in China, with materials derived from African soils, and with software developed by programmers from India, you need to realize that those actions are made by autonomous organizations, firms, and individuals. These many individual actions are guided by rules and agreements we have developed, but there is no ruler who can control these interactions.
    2. Path Formation: Paved paths are not always the most desirable routes going from point A to point B. This may lead pedestrians to take short-cuts. Initially pedestrians walk over green grass. Subsequent people tend to use the stamped grass path instead of the pristine grass, and after many pedestrians an unpaved path is formed without any top-down design.
  3. Jun 2019
    1. However, indexes in the modern sense, giving exact locations of names and subjects in a book, were not compiled in antiquity, and only very few seem to have been made before the age of printing. There are several reasons for this. First, as long as books were written in the form of scrolls, there were neither page nor leaf numbers nor line counts (as we have them now for classical texts). Also, even had there been such numerical indicators, it would have been impractical to append an index giving exact references, because in order for a reader to consult the index, the scroll would have to be unrolled to the very end and then rolled back to the relevant page. (Whoever has had to read a book available only on microfilm, the modern successor of the papyrus scroll, will have experienced how difficult and inconvenient it is to go from the index to the text.) Second, even though popular works were written in many copies (sometimes up to several hundreds), no two of them would be exactly the same, so that an index could at best have been made to chapters or paragraphs, but not to exact pages. Yet such a division of texts was rarely done (the one we have now for classical texts is mostly the work of medieval and Renaissance scholars). Only the invention of printing around 1450 made it possible to produce identical copies of books in large numbers, so that soon afterwards the first indexes began to be compiled, especially those to books of reference, such as herbals. (pages 164-166) Index entries were not always alphabetized by considering every letter in a word from beginning to end, as people are wont to do today. Most early indexes were arranged only by the first letter of the first word, the rest being left in no particular order at all. Gradually, alphabetization advanced to an arrangement by the first syllable, that is, the first two or three letters, the rest of an entry still being left unordered.
Only very few indexes compiled in the 16th and early 17th centuries had fully alphabetized entries, but by the 18th century full alphabetization became the rule... (p. 136) (For more information on the subject of indexes, please see Professor Wellisch's Indexing from A to Z, which contains an account of an indexer being punished by having his ears lopped off, a history of narrative indexing, an essay on the zen of indexing, and much more.) Indexes go way back beyond the 17th century. The Gerardes Herbal from the 1590s had several fascinating indexes according to Hilary Calvert. Barbara Cohen writes that the alphabetical listing in the earliest ones only went as far as the first letter of the entry... no one thought at first to index each entry in either letter-by-letter or word-by-word order. Maja-Lisa writes that Peter Heylyn's 1652 Cosmographie in Four Bookes includes a series of tables at the end. They are alphabetical indexes and he prefaces them with "Short Tables may not seeme proportionalble to so long a Work, expecially in an Age wherein there are so many that pretend to learning, who study more the Index then they do the Book."
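The shift the passage describes, from first-letter-only alphabetization to full letter-by-letter ordering, can be illustrated with a toy sort (my own sketch; the entry words are invented, not from the source):

```python
# First-letter-only alphabetization vs. full alphabetization.
entries = ["herbal", "horse", "hawk", "heron"]

# Early-index style: sort by the first letter only. Python's sort is stable,
# so entries sharing an initial keep their original (unordered) sequence.
first_letter_only = sorted(entries, key=lambda w: w[0])

# Modern style: full letter-by-letter ordering.
fully_alphabetized = sorted(entries)

print(first_letter_only)   # ['herbal', 'horse', 'hawk', 'heron']
print(fully_alphabetized)  # ['hawk', 'herbal', 'heron', 'horse']
```

Since every word here begins with "h", the early-index scheme leaves them effectively unordered, exactly the situation the passage describes for entries beyond the first letter or syllable.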
    2. Pliny the Elder (died 79 A.D.) wrote a massive work called The Natural History in 37 Books. It was a kind of encyclopedia that comprised information on a wide range of subjects. In order to make it a bit more user friendly, the entire first book of the work is nothing more than a gigantic table of contents in which he lists, book by book, the various subjects discussed. He even appended to each list of items for each book his list of Greek and Roman authors used in compiling the information for that book. He indicates in the very end of his preface to the entire work that this practice was first employed in Latin literature by Valerius Soranus, who lived during the last part of the second century B.C. and the first part of the first century B.C. Pliny's statement that Soranus was the first in Latin literature to do this indicates that it must have already been practiced by Greek writers.
    1. Smil notes that as of 2018, coal, oil, and natural gas still supply 90% of the world's primary energy. Despite decades of growth of renewable energy, the world uses more fossil fuels in 2018 than in 2000, by percentage.
    1. Jevons received public recognition for his work on The Coal Question (1865), in which he called attention to the gradual exhaustion of Britain's coal supplies and also put forth the view that increases in energy production efficiency lead to more, not less, consumption.[5]:7f, 161f This view is known today as the Jevons paradox, named after him. Due to this particular work, Jevons is regarded today as the first economist of some standing to develop an 'ecological' perspective on the economy.
    1. The term first appeared in 1984 as the topic of a public debate at the annual meeting of AAAI (then called the "American Association of Artificial Intelligence"). It is a chain reaction that begins with pessimism in the AI community, followed by pessimism in the press, followed by a severe cutback in funding, followed by the end of serious research.[2] At the meeting, Roger Schank and Marvin Minsky—two leading AI researchers who had survived the "winter" of the 1970s—warned the business community that enthusiasm for AI had spiraled out of control in the 1980s and that disappointment would certainly follow. Three years later, the billion-dollar AI industry began to collapse.
    1. volatility and leverage are co-determined and are pro-cyclical; that is, together, they amplify the impact of shocks. The mechanism, to be specific, is that declining volatility reduces the cost of taking on more leverage and furthers a buildup of risk. The lesson: Risk managers must resist the temptation to sell volatility when it is low and falling. The AMH implicitly embraces modeling such behavior with heterogeneous agents that use heuristics.
    1. Throughout the past two decades, he has been conducting research in the fields of psychology of learning and hybrid neural networks (in particular, applying these models to research on human skill acquisition). Specifically, he has worked on the integrated effect of "top-down" and "bottom-up" learning in human skill acquisition,[1][2] in a variety of task domains, for example, navigation tasks,[3] reasoning tasks, and implicit learning tasks.[4] This inclusion of bottom-up learning processes has been revolutionary in cognitive psychology, because most previous models of learning had focused exclusively on top-down learning (whereas human learning clearly happens in both directions). This research culminated in the development of an integrated cognitive architecture that can be used to provide a qualitative and quantitative explanation of empirical psychological learning data. The model, CLARION, is a hybrid neural network that can be used to simulate problem solving and social interactions as well. More importantly, CLARION was the first psychological model that proposed an explanation for the "bottom-up learning" mechanisms present in human skill acquisition: his numerous papers on the subject have brought attention to this neglected area in cognitive psychology.
    1. Bob Barton [said] "The basic principle of recursive design is to make the parts have the same power as the whole." For the first time I thought of the whole as the entire computer, and wondered why anyone would want to divide it up into weaker things called data structures and procedures. Why not divide it up into little computers... Why not thousands of them, each simulating a useful structure?
    1. To keep recession away, the Federal Reserve lowered the Federal funds rate 11 times - from 6.5% in May 2000 to 1.75% in December 2001 - creating a flood of liquidity in the economy. Cheap money, once out of the bottle, always looks to be taken for a ride. It found easy prey in restless bankers—and even more restless borrowers who had no income, no job and no assets. These subprime borrowers wanted to realize their life's dream of acquiring a home. For them, holding the hands of a willing banker was a new ray of hope. More home loans, more home buyers, more appreciation in home prices. It wasn't long before things started to move just as the cheap money wanted them to.
  4. May 2019
    1. Virtually all BPMs have utilities for creating simple, data-gathering forms. And in many types of workflows, these simple forms may be adequate. However, in any workflow that includes complex document assembly (such as loan origination workflows), BPM forms are not likely to get the job done. Automating the assembly of complex documents requires ultra-sophisticated data-gathering forms, which can only be designed and created after the documents themselves have been automated. Put another way, you won't know which questions need to be asked to generate the document(s) until you've merged variables and business logic into the documents themselves. The variables you merge into the document serve as question fields in the data gathering forms. And here's the key point - since you have to use the document assembly platform to create interviews that are sophisticated enough to gather data for your complex documents, you might as well use the document assembly platform to generate all data-gathering forms in all of your workflows.
  5. Mar 2019
  6. Feb 2019
    1. In a 2011 Reddit IAmA, Jennings recalled how in 2004 the Democratic politicians Chuck Schumer and Harry Reid unsuccessfully asked Jennings to run for the United States Senate from Utah. Jennings commented, "That was when I realized the Democratic Party was f@#$ed in '04."[19]
  7. Jan 2019
    1. You don't need complex sentences to express complex ideas. When specialists in some abstruse topic talk to one another about ideas in their field, they don't use sentences any more complex than they do when talking about what to have for lunch. They use different words, certainly. But even those they use no more than necessary. And in my experience, the harder the subject, the more informally experts speak. Partly, I think, because they have less to prove, and partly because the harder the ideas you're talking about, the less you can afford to let language get in the way.
    2. It seems to be hard for most people to write in spoken language. So perhaps the best solution is to write your first draft the way you usually would, then afterward look at each sentence and ask "Is this the way I'd say this if I were talking to a friend?" If it isn't, imagine what you would say, and use that instead. After a while this filter will start to operate as you write. When you write something you wouldn't say, you'll hear the clank as it hits the page.Before I publish a new essay, I read it out loud and fix everything that doesn't sound like conversation. I even fix bits that are phonetically awkward; I don't know if that's necessary, but it doesn't cost much.
    3. If you simply manage to write in spoken language, you'll be ahead of 95% of writers. And it's so easy to do: just don't let a sentence through unless it's the way you'd say it to a friend.
  8. Dec 2018
    1. “They’re actively, actively recruiting,” said Cheddar’s Alex Heath. “They’re also trying to scoop up crypto start-ups that are at the white-paper level, which means they don’t really even have a product yet.”
  9. Sep 2018
    1. The selloff partly reflects a broader malaise in emerging markets. U.S. interest rate increases and a stronger dollar have lured cash back to America, often at the expense of developing economies. Some countries have come under additional pressure because of U.S. tariffs or sanctions, while economic turmoil in Turkey and Argentina have further fueled investors’ concerns.
  10. Aug 2018
    1. Bakkt will provide access to a new Bitcoin trading platform on the ICE Futures U.S. exchange. And it will also offer full warehousing services, a business that ICE doesn’t have. “Bakkt’s revenue will come from two sources,” says Loeffler, “the trading fees on the ICE Futures U.S. exchange, and warehouse fees paid by the customers that buy Bitcoin and store with Bakkt.”
    2. Bakkt plans to offer a full package combining a major CFTC-regulated exchange with CFTC-regulated clearing and custody, pending the approval from the commission and other regulators.

      still pending regulatory approval

    3. At a recent meeting with the couple in the plush Bond Room at the NYSE, Sprecher stressed that Loeffler has been a collaborator in charting ICE’s next big move. “Kelly and I brainstormed for five years to find a strategy for digital currencies,” says Sprecher.

      bakkt is 5 years in the making

    4. Cracking the 401(k) and IRA market for cryptocurrency would be a huge win for Bakkt. But the startup’s plans raise the prospect of an even more ambitious goal: Using Bitcoin to streamline and disrupt the world of retail payments by moving consumers from swiping credit cards to scanning their Bitcoin apps. The market opportunity is gigantic: Consumers worldwide are paying lofty credit card or online-shopping fees on $25 trillion a year in annual purchases.

      Allowing money from 401(k)s and IRAs would allow for a huge influx of passive capital.

      The retail component would actually cause selling pressure, as was seen in 2015 when more and more retailers started accepting bitcoin.

    1. The idea of a gold exchange-traded fund was first conceptualized by Benchmark Asset Management Company Private Ltd in India when they filed a proposal with the SEBI in May 2002. However, it did not receive regulatory approval at first and was only launched later, in March 2007.

      Took 5 years to get approval for gold etf in India

    1. However, most ETCs implement a futures trading strategy, which may produce quite different results from owning the commodity.
    2. However, generally commodity ETFs are index funds tracking non-security indices. Because they do not invest in securities, commodity ETFs are not regulated as investment companies under the Investment Company Act of 1940 in the United States, although their public offering is subject to SEC review and they need an SEC no-action letter under the Securities Exchange Act of 1934. They may, however, be subject to regulation by the Commodity Futures Trading Commission.

      Commodity ETFs are regulated by the CFTC but need a no-action letter from the SEC to be approved.

    3. The idea of a Gold ETF was first officially conceptualised by Benchmark Asset Management Company Private Ltd in India when they filed a proposal with the SEBI in May 2002.[32] The first gold exchange-traded fund was Gold Bullion Securities launched on the ASX in 2003, and the first silver exchange-traded fund was iShares Silver Trust launched on the NYSE in 2006. As of November 2010 a commodity ETF, namely SPDR Gold Shares, was the second-largest ETF by market capitalization.[33]

      In 8 years the gold ETF became the second-largest ETF by market cap

  11. Jul 2018
    1. Mayor de Blasio and his administration have made progress in meeting their goal of building 200,000 affordable units over the span of a decade, as 21,963 new units were added in 2016, the most in 27 years. However, there continues to be a shortage in East Harlem. Out of the nearly 20,000 affordable units the city brought to all five boroughs, just 249 units have been built in East Harlem, according to a new report by the Department of Housing Preservation and Development (HPD). To better accommodate these residents, the city plans on expediting the construction of 2,400 units of affordable housing over the next few years, as DNA Info reported.
    1. However, price time-series have some drawbacks. Prices are usually only positive, which makes it harder to use models and approaches which require or produce negative numbers. In addition, price time-series are usually non-stationary, that is, their statistical properties change over time.
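A common remedy for both drawbacks is to model log returns rather than raw prices: returns can be negative, and their statistical properties tend to be far more stable over time. A minimal sketch with made-up prices:

```python
import math

# Made-up daily closing prices (always positive, trending).
prices = [100.0, 101.5, 99.8, 102.2, 103.0, 101.1]

# Log returns: log(p_t / p_{t-1}). Unlike prices, these can be negative
# and tend to hover around zero, which suits models that assume
# (approximate) stationarity.
log_returns = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

print(log_returns)
```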
    1. Denote N as the number of instances of evidence we possess. As we gather an infinite amount of evidence, say as N → ∞, our Bayesian results (often) align with frequentist results. Hence for large N, statistical inference is more or less objective. On the other hand, for small N, inference is much more unstable: frequentist estimates have more variance and larger confidence intervals. This is where Bayesian analysis excels. By introducing a prior, and returning probabilities (instead of a scalar estimate), we preserve the uncertainty that reflects the instability of statistical inference of a small-N dataset.

      The law of large numbers gets us to the frequentist result, but the Bayesian perspective reflects the instability of inferential statistics when the number of observations is small.
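A numerical sketch of this point (my own illustration, with invented coin-flip data): under a Beta(1, 1) prior on a coin's head probability, the posterior mean is pulled toward the prior for small N, while for large N it converges on the frequentist estimate and the posterior variance shrinks.

```python
def frequentist_estimate(heads, n):
    # Point estimate: the observed frequency.
    return heads / n

def bayesian_posterior(heads, n, alpha=1.0, beta=1.0):
    # Beta-Bernoulli conjugacy: the posterior is Beta(alpha + heads, beta + tails).
    a, b = alpha + heads, beta + (n - heads)
    mean = a / (a + b)
    var = (a * b) / ((a + b) ** 2 * (a + b + 1))
    return mean, var

for n in (10, 100_000):
    heads = int(0.7 * n)  # same observed frequency at both scales
    mean, var = bayesian_posterior(heads, n)
    print(n, frequentist_estimate(heads, n), mean, var)
```

At N = 10 the posterior mean (2/3) sits between the prior mean (0.5) and the observed frequency (0.7); at N = 100,000 the two estimates agree to several decimal places, mirroring the claim that Bayesian and frequentist results (often) align as N → ∞.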

    1. A core tenet of the Y Combinator playbook for startups is to talk to your users. If you’re interested in building a third party app on top of a fat protocol, the lesson might be to also talk to competing apps’ users to figure out what needs aren’t being served. In a similar vein, protocol developers should talk to app developers and learn what they think end users want.

      This isn't happening nearly enough, which is why protocols don't provide the tech components for viable end-user apps

    1. The reason Mr. Wonderful loves royalty-based funding is because it is a big win for both businesses and investors. Investors see a return on helping businesses succeed. Experienced investors will even offer guidance to help business owners avoid the pitfalls that many entrepreneurs stumble into. On the business side, entrepreneurs get the financing they need without debt or sacrificing ownership of their companies in any way. Additionally, since repayment of royalty-based financing is structured around revenue, there is no rigid payment schedule. Royalty-based funding provides financing and flexibility, which gives businesses the freedom to reach their potential, while simultaneously providing healthy returns to investors.
    1. Here are the definitions to make sure we’re on the same page: Subscription model — a periodic (monthly, yearly, or seasonal) payment to gain access to products or services. Transactional model — you pay as you use the products and services.
    1. Second, recall that the impetus for moving from proof-of-work to proof-of-stake is to reduce the amount of computational resource and energy required to maintain the network by a couple orders of magnitude. That’s good for scalability and potential adoption, but also means a commensurate reduction in the PQ of the network.

      The impetus is to reduce technical debt and increase the efficiency of network resource provision. The computational resources used for mining rather than for processing transactions get repurposed to increase the number of transactions that can be processed.

      This assumption that Q is constant is bizarre. Looking just at transaction throughput, the goal is to be able to process several hundred thousand transactions per second, if not millions. P and Q are clearly inversely correlated.

    2. Is that added value enough to offset its inefficiency compared to the incumbent centralised Twitter? Would Token Twitter offer compellingly higher utility compared to centralised Twitter, including enough surplus utility to offset the cost of operating the consensus mechanism? I’m not so sure.

      Considering that the majority of the cost of operating Twitter comes from human capital, marketing, legal, and accounting, not from IT (which continues to fall on a per-unit basis while the aforementioned continue to increase): yes. If the assumption is that legal and accounting are no longer needed, and that developers and other employees are overpaid relative to an entirely crowdsourced labor force, then you might see the redundancy costs in IT operations offset by cost reductions in other operational expenses.

    3. The combined effect of low and falling PQ and potentially very high V is that the utility value of utility cryptoassets at equilibrium should in fact be relatively low.

      The utility value of utility cryptoassets is not entirely a function of the cost of network resources. These assets also provide influence, which can't be measured financially, as it captures for each participant the expected future value of the network and what having influence over its direction can afford that particular participant.

      Also, I would think that PQ is artificially high right now because of how inefficient blockchains are, but as P falls due to further scalability work, Q should increase not only to offset declines in P but to overcompensate for the declining P as more services can be built on this infrastructure. A premium will be placed on the network effects of a protocol that has a successful application that other applications will want to interoperate with for its data and microservices (e.g. identity, accounts, financial services, etc.).

    1. Additionally, there is work available in most countries for people living outside the US, but only workers in the US and India can withdraw cash. Workers from other countries can only redeem their earnings through Amazon gift cards.
  12. Jun 2018
    1. there have always been far more users/consumers than suppliers, which means that in a world where transactions are costly owning the supplier relationship provides significantly more leverage.
    2. The value chain for any given consumer market is divided into three parts: suppliers, distributors, and consumers/users. The best way to make outsize profits in any of these markets is to either gain a horizontal monopoly in one of the three parts or to integrate two of the parts such that you have a competitive advantage in delivering a vertical solution. In the pre-Internet era the latter depended on controlling distribution.
    1. Since you use a cryptoasset once, and then it’s in someone else’s hands, this discounting methodology is not accumulative over each year the way it is with a DCF.

      Why does a token have to be used once and change hands? A token can be taken out of circulation.

    1. Basically, all token pitches include a line that goes something like this: "There is a fixed supply of tokens. As demand for the token increases, so must the price." This logic fails to take into account the velocity problem
    1. This, of course, leaves us none the wiser as to how to model velocity, as the equation of exchange is nothing more than an identity. MV=PQ just says that the money flow of expenditures is equal to the market value of what those expenditures buy, which is true by definition. The left and right sides are two ways of saying the same thing; it’s a form of double-entry accounting where each transaction is simultaneously recorded on both sides of the equation. Whether an effect should be recorded in M, V, P, or Q is, ultimately, arbitrary. To transform the identity into a tool with predictive potency, we need to make a series of assumptions about each of the variables. For example, monetarists assume M is determined exogenously, V is constant, and Q is independent of M and use the equation to demonstrate how increases in the money supply increase P (i.e. cause inflation).
    2. The first practical problem with velocity is that it’s frequently employed as a catch-all to make the two sides of the equation of exchange balance. It often simply captures the error in our estimation of the other variables in the model.
    3. The core thesis of current valuation frameworks is that utility value can be derived by (a) forecasting demand for the underlying resource that a network provisions (the network’s ‘GDP’) and (b) dividing this figure by the monetary base available for its fulfillment to obtain per-unit utility value. Present values can be derived from future expected utility values using conventional discounting. The theoretical framework that nearly all these valuation models employ is the equation of exchange, MV=PQ.
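      A minimal numeric sketch of this framework (the function, parameter names and inputs are hypothetical, not from the article):

```python
def utility_value_per_token(network_gdp, velocity, token_supply):
    """Equation-of-exchange (MV = PQ) valuation sketch.

    The monetary base M needed to support a network 'GDP' (PQ)
    at velocity V is M = PQ / V; dividing M by the token float
    gives a per-token utility value.
    """
    monetary_base = network_gdp / velocity  # M = PQ / V
    return monetary_base / token_supply

# Hypothetical network: $1B of annual on-chain resource spend,
# velocity of 20, 100M tokens outstanding.
print(utility_value_per_token(1e9, 20, 1e8))  # → 0.5
```

      Note how a higher velocity shrinks the per-token value, which is exactly the velocity problem described above.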
    1. Mechanism design studies solution concepts for a class of private-information games. Leonid Hurwicz explains that 'in a design problem, the goal function is the main "given", while the mechanism is the unknown. Therefore, the design problem is the "inverse" of traditional economic theory, which is typically devoted to the analysis of the performance of a given mechanism.'[1] So, two distinguishing features of these games are: that a game "designer" chooses the game structure rather than inheriting one; and that the designer is interested in the game's outcome

      Advantages over traditional game theory for token economics:

      • a game "designer" chooses the game structure rather than inheriting one
      • the designer is interested in the game's outcome
    1. 1. Thesis: Open Standards, Market Cycles and Investment Returns. Information technology evolves in multi-decade cycles of expansion, consolidation and decentralization.

      Open standards reduce production costs, which bring down prices for consumers and increase the potential size of the market.

      New entrants, realizing that costs are now low, competition is scarce and the potential reward is high, attempt to disrupt incumbents with more efficient and scalable business models.

      Market consolidates around the platforms of the companies that realize and implement these business models first.

      Demand then builds for a low cost, open source alternative to the incumbent platforms.

    2. We favor spreading price and risk by building up and averaging out of positions over time rather than speculating on speculation. A committed capital structure with significant capital reserves for staged follow-ons gives us the flexibility to build up our investments independent of market sentiment. We are shielded from having to dump assets on the market to honor redemption requests, avoiding the dreaded “death spiral” which can plague more liquid fund structures.
    3. We fund the development of decentralized information networks coordinated by a scarce cryptoasset – or token – native to the protocol. Our thesis is that decentralization and standardization at the data layer of the internet is collapsing the production costs of information networks, eliminating data monopolies and creating a new wave of innovation.
    4. Crypto provides a new mechanism for organizing human activity on a global basis using programmable financial incentives. It’s an opportunity to design information networks which can achieve unprecedented levels of scale by decentralizing the infrastructure, open sourcing the data, and distributing value more broadly. What we’ve discovered is the native business model of networks – which, as it turns out, encompass the entire economy.
    5. Most of the use cases today involve compensating machine work (transaction processing, file storage, etc.) with tokens: the building blocks of decentralized applications. But the greatest long-term opportunity is in networks where tokens are earned by end-users themselves.
    6. We’ve also realized how inefficient the joint-stock equity industry model is at accounting for and distributing the real value created by online networks. The value of a share of stock is necessarily a function of profits; the price of Twitter’s stock only reflects Twitter Inc’s ability to monetize the data – and not the actual worth of the service. Tokens solve this inefficiency by deriving financial value directly from user demand as opposed to “taxing” by extracting profits.
    7. Following the history of information technology and the massive trend towards open source, we can see that democratizing information is the natural next step in the incessant trend to open source, and thus the next big opportunity for innovation.
    8. The way to play a consolidating market is to invest heavily into the consolidating incumbents (which are likely to continue growing strongly for a long period of time) and to invest progressively in the insurgent platforms that will grow to commoditize the incumbent business models and create a new wave of innovation. We are focused on the latter.
    9. Those who succeed the most and establish successful platforms “on top” of the open standard later tend to consolidate the industry by leveraging their scale (in assets and distribution) to integrate vertically and expand horizontally at the expense of smaller companies. Competing in this new environment suddenly becomes expensive and startups struggle to create value in the shadow of incumbents, compressing venture returns. Demand then builds for a low cost, open source alternative to the incumbent platforms, and the cycle repeats itself: the new open standard emerges and gets adopted, the market decentralizes as new firms leverage the cost savings to compete with the old on price, value creation shifts upwards (once more), and so on
    10. Information technology evolves in multi-decade cycles of expansion, consolidation and decentralization. Periods of expansion follow the introduction of a new open platform that reduces the production costs of technology as it becomes a shared standard. As production costs fall, new firms come to market leveraging the standard to compete with established incumbents, pushing down prices and margins, and decentralizing existing market powers. The price drop attracts new users, increasing the overall size of the market and creating new opportunities for mass consumer applications. Entrepreneurial talent moves to serve the new markets where costs are low, competition is scarce, and the upside is high. Often these early entrepreneurs will introduce new kinds of business models, orthogonal to existing ones
    1. In this kind of situation one might well ask: why continue to make the 80 per cent of products that only generate 20 per cent of profits? Companies rarely ask these questions, perhaps because to answer them would mean very radical action: to stop doing four-fifths of what you are doing is not a trivial change.

      Relevant on the larger scale of global economies.

    2. There are two routes to achieving this. One is to reallocate the resources from unproductive to productive uses, the secret of all entrepreneurs down the ages. Find a round hole for a round peg, a square hole for a square peg, and a perfect fit for any shape in between. Experience suggests that every resource has its ideal arena, where the resource can be tens or hundreds of times more effective than in most other arenas. The other route to progress—the method of scientists, doctors, preachers, computer systems designers, educationalists and trainers—is to find ways to make the unproductive resources more effective, even in their existing applications; to make the weak resources behave as though they were their more productive cousins; to mimic, if necessary by intricate rote-learning procedures, the highly productive resources. The few things that work fantastically well should be identified, cultivated, nurtured and multiplied. At the same time, the waste—the majority of things that will always prove to be of low value to man and beast—should be abandoned or severely cut back.
    3. George Bernard Shaw put it well: ‘The reasonable man adapts himself to the world. The unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.’
    4. Certainly, the principle brings home what may be evident anyway: that there is a tragic amount of waste everywhere, in the way that nature operates, in business, in society and in our own lives. If the typical pattern is for 80 per cent of results to come from 20 per cent of inputs, it is necessarily typical too that 80 per cent, the great majority, of inputs are having only a marginal—20 per cent—impact.
    5. Both phenomena help to show how the universe abhors balance. In the former case, we see a natural flight away from a 50/50 split of competing phenomena. A 51/49 split is inherently unstable and tends to gravitate towards a 95/5, 99/1 or even 100/0 split. Equality ends in dominance: that is one of the messages of chaos theory. The 80/20 Principle’s message is different yet complementary. It tells us that, at any one point, a majority of any phenomenon will be explained or caused by a minority of the actors participating in the phenomenon. 80 per cent of the results come from 20 per cent of the causes. A few things are important; most are not.
    6. Related to the idea of feedback loops is the concept of the tipping point. Up to a certain point, a new force—whether it is a new product, a disease, a new rock group or a new social habit such as jogging or roller-blading—finds it difficult to make headway. A great deal of effort generates little by way of results. At this point many pioneers give up. But if the new force persists and can cross a certain invisible line, a small amount of additional effort can reap huge returns. This invisible line is the tipping point.
    7. We can see positive feedback loops operating in many areas, explaining how it is that we typically end up with 80/20 rather than 50/50 relationships between populations. For example, the rich get richer, not just (or mainly) because of superior abilities, but because riches beget riches. A similar phenomenon exists with goldfish in a pond. Even if you start with goldfish almost exactly the same size, those that are slightly bigger become very much bigger, because, even with only slight initial advantages in stronger propulsion and larger mouths, they are able to capture and gobble up disproportionate amounts of food
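      The goldfish dynamic can be reproduced with a toy simulation (all names and parameters here are illustrative, not from the book): each round, one unit of "food" goes to a player chosen with probability proportional to current wealth, so slight early advantages compound.

```python
import random

def top20_share(n=100, rounds=2000, seed=1):
    """Rich-get-richer toy model: everyone starts equal, but each
    round's gain goes to a player picked with probability
    proportional to current wealth, so early luck compounds."""
    random.seed(seed)
    wealth = [1.0] * n
    for _ in range(rounds):
        winner = random.choices(range(n), weights=wealth)[0]
        wealth[winner] += 1.0
    wealth.sort(reverse=True)
    # Share of total wealth held by the top 20 per cent of players.
    return sum(wealth[: n // 5]) / sum(wealth)

print(top20_share())  # noticeably above the egalitarian 0.2
```

      Despite identical starting conditions, the proportional-allocation rule alone is enough to concentrate wealth well beyond an even split.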
    8. At the heart of this progress is a process of substitution. Resources that have weak effects in any particular use are not used, or are used sparingly. Resources that have powerful effects are used as much as possible. Every resource is ideally used where it has the greatest value. Wherever possible, weak resources are developed so that they can mimic the behaviour of the stronger resources.
    9. Why should you care about the 80/20 Principle? Whether you realize it or not, the principle applies to your life, to your social world and to the place where you work. Understanding the 80/20 Principle gives you great insight into what is really happening in the world around you.
    10. The reason that the 80/20 Principle is so valuable is that it is counterintuitive. We tend to expect that all causes will have roughly the same significance. That all customers are equally valuable. That every bit of business, every product and every dollar of sales revenue is as good as another. That all employees in a particular category have roughly equivalent value. That each day or week or year we spend has the same significance. That all our friends have roughly equal value to us. That all enquiries or phone calls should be treated in the same way. That one university is as good as another. That all problems have a large number of causes, so that it is not worth isolating a few key causes. That all opportunities are of roughly equal value, so that we treat them all equally. We tend to assume that 50 per cent of causes or inputs will account for 50 per cent of results or outputs. There seems to be a natural, almost democratic, expectation that causes and results are generally equally balanced. And, of course, sometimes they are. But this ‘50/50 fallacy’ is one of the most inaccurate and harmful, as well as the most deeply rooted, of our mental maps.
    11. The key point is not the percentages, but the fact that the distribution of wealth across the population was predictably unbalanced.
    12. In business, many examples of the 80/20 Principle have been validated. 20 per cent of products usually account for about 80 per cent of dollar sales value; so do 20 per cent of customers. 20 per cent of products or customers usually also account for about 80 per cent of an organization’s profits. In society, 20 per cent of criminals account for 80 per cent of the value of all crime. 20 per cent of motorists cause 80 per cent of accidents. 20 per cent of those who marry comprise 80 per cent of the divorce statistics (those who consistently remarry and redivorce distort the statistics and give a lopsidedly pessimistic impression of the extent of marital fidelity). 20 per cent of children attain 80 per cent of educational qualifications available. In the home, 20 per cent of your carpets are likely to get 80 per cent of the wear. 20 per cent of your clothes will be worn 80 per cent of the time. And if you have an intruder alarm, 80 per cent of the false alarms will be set off by 20 per cent of the possible causes. The internal combustion engine is a great tribute to the 80/20 Principle. 80 per cent of the energy is wasted in combustion and only 20 per cent gets to the wheels; this 20 per cent of the input generates 100 per cent of the output!
    13. The 80/20 Principle asserts that a minority of causes, inputs or effort usually lead to a majority of the results, outputs or rewards. Taken literally, this means that, for example, 80 per cent of what you achieve in your job comes from 20 per cent of the time spent. Thus for all practical purposes, four-fifths of the effort—a dominant part of it—is largely irrelevant. This is contrary to what people normally expect.
    1. Overall the potential tariff charges could hit $450 billion worth of Chinese products entering into the US, which is likely to spill over into the working classes and create havoc for the citizens of both economies.  Meanwhile amidst this trade war, the geo-political unrest has directed a surge of capital back into the global crypto market, as individuals on both sides of the pacific withdraw into decentralized virtual assets to protect their interests. The global market capital has enjoyed a $16 billion increase in the last 24hrs and is looking promising to retrace back towards $300 billion, as US and Chinese stock market FUD increases.
    1. Pantera Capital has had a thesis of investing into local exchanges since the inception of its venture capital fund. Local exchanges have an advantage of a local team who understands the culture and marketing of a specific geography in addition to having the relationships for banking and regulations. In June 2014, Pantera investigated and became the lead US investor in the largest cryptocurrency exchange in Korea, Korbit. Korea was a compelling geography for a local exchange investment because of the country’s familiarity with virtual currencies, becoming one of the first countries to adopt them for gaming, having a government that is pro-innovation, having a large mobile ecosystem.
    1. Jonathan Evans suggested dual process theory in 1975. In his theory, there are two distinct types of processes: heuristic processes and analytic processes. He suggested that during heuristic processes, an individual chooses which information is relevant to the current situation. Relevant information is then processed further whereas irrelevant information is not. Following the heuristic processes come analytic processes. During analytic processes, the relevant information that is chosen during the heuristic processes is then used to make judgments about the situation.
    1. Journalists usually describe the organization or structure of a news story as an inverted pyramid. The essential and most interesting elements of a story are put at the beginning, with supporting information following in order of diminishing importance. This structure enables readers to stop reading at any point and still come away with the essence of a story.
    2. Charney states that "an effective lead is a 'brief, sharp statement of the story's essential facts.'"[10] The lead is usually the first sentence, or in some cases the first two sentences, and is ideally 20–25 words in length. A lead must balance the ideal of maximum information conveyed with the constraint of the unreadability of a long sentence. This makes writing a lead an optimization problem, in which the goal is to articulate the most encompassing and interesting statement that a writer can make in one sentence, given the material with which he or she has to work. While a rule of thumb says the lead should answer most or all of the five Ws, few leads can fit all of these.
    3. News stories also contain at least one of the following important characteristics relative to the intended audience: proximity, prominence, timeliness, human interest, oddity, or consequence.
    4. News writing attempts to answer all the basic questions about any particular event—who, what, when, where and why (the Five Ws) and also often how—at the opening of the article. This form of structure is sometimes called the "inverted pyramid", to refer to the decreasing importance of information in subsequent paragraphs.
    1. Awan: Our growth has been a journey of constant learning, but if I were to pinpoint the principles our growth team lives by today, which I hope would help others building and growing products, they would be:

      • Define one "North Star" metric for success that is aligned with how your users get value as well as with the success of your business, then measure everything and how it contributes to the North Star metric. Good examples of true-north metrics for growth are measures like how many users are truly engaged with your product
      • Growth is a team sport so hire wisely and invest in your team
      • Good product, measured by long-term retention, comes first; growth comes second
      • Invest in multiple growth channels and identify potential channels by looking at existing user behavior, especially how they are currently discovering your product. At LinkedIn, our biggest channels are viral growth, search-engine optimization of profiles and other member-generated content and partnerships.
      • Understand that growth requires continuous prioritization and feedback so always be testing
    3. Awan: Here’s some advice to others as they focus on growing their own company:

      • Make sure you have product-market fit before you invest in growth. What that means is you have validated that your product is in a market with large demand and your product is satisfying the need for users who try the product. You can measure this by retention rate, which is essentially the percentage of your users who keep coming back. If your retention is not stable, you’d want to improve the product first rather than wasting resources on growth.
      • Prioritize growth from the very beginning and build it right into the product. It’s much better for your users to bring other users to the product as they create and share photos or other content, or invite others in the course of normally using the app, than to try growing through marketing that feels bolted onto the product. This is critically important if your product has network effects because the product value is limited for early users if the network doesn’t grow fast enough.
      • As your company starts to scale, it’s important to create a dedicated multi-disciplinary growth team covering product, design, marketing, engineering, data science, and business operations. In a startup, a single person may be playing multiple of those roles (an engineer who’s also the data scientist) but as you scale, you can create dedicated functional roles.
    4. Instead, we define the goal of the growth team as accelerating the realization of LinkedIn’s vision, which is to create economic opportunity for every member of the global workforce. Keeping this vision top of mind led us to set the right growth objectives and priorities.
    1. With over 65% contribution to the total revenue, talent solutions are the most important services and tools included in the LinkedIn business model. Talent solutions include premium recruiting tools for the companies and recruiters to help them find the most suitable employees/partners for their business.
    2. LinkedIn, apart from being the best recruitment platform, is also a sought-after social networking website for marketers to execute their marketing campaigns. This service contributes over 18% of the total revenue of the company and offers features which let companies not only create a company page but also enhance their marketing efforts by creating sponsored content, sponsored InMails and text advertisements.
    3. LinkedIn not only connects you with other professionals but also with companies and recruiters. The company has uniquely positioned itself as the only platform worthy of professional networking.
    4. The Linkedin business model is a freemium model which works as a community to connect professionals globally.
    5. The platform currently has over 500 Million users in over 200 countries and territories, 80% of which consider professional networking important to their career success.
    6. The mobile version of the website was launched in February 2008 and the company was bought by Microsoft in February 2016 for $26.2 billion.
    1. Thus mass collaboration is more refined and complex in its process and production on the level of collective engagement.
    2. Modularity enables a mass of experiments to proceed in parallel, with different teams working on the same modules, each proposing different solutions. Modularity allows different "blocks" to be easily assembled, facilitating decentralised innovation that all fits together.
    1. "When tasks require high coordination because the work is highly interdependent, having more contributors can increase process losses, reducing the effectiveness of the group below what individual members could optimally accomplish". When a team is too large, overall effectiveness may suffer even when the extra contributors increase the resources. In the end, the overall costs of coordination might overwhelm other costs.
    2. Games such as The Sims Series, and Second Life are designed to be non-linear and to depend on collective intelligence for expansion. This way of sharing is gradually evolving and influencing the mindset of the current and future generations.[117] For them, collective intelligence has become a norm.
    3. The UNU open platform for "human swarming" (or "social swarming") establishes real-time closed-loop systems around groups of networked users molded after biological swarms, enabling human participants to behave as a unified collective intelligence.[140][141] When connected to UNU, groups of distributed users collectively answer questions and make predictions in real-time.[142] Early testing shows that human swarms can out-predict individuals.[140] In 2016, an UNU swarm was challenged by a reporter to predict the winners of the Kentucky Derby, and successfully picked the first four horses, in order, beating 540 to 1 odds.
    4. Epistemic democratic theories refer to the capacity of the populace, either through deliberation or aggregation of knowledge, to track the truth and relies on mechanisms to synthesize and apply collective intelligence.
    5. Research performed by Tapscott and Williams has provided a few examples of the benefits of collective intelligence to business:[38]

      • Talent utilization: At the rate technology is changing, no firm can fully keep up in the innovations needed to compete. Instead, smart firms are drawing on the power of mass collaboration to involve participation of the people they could not employ. This also helps generate continual interest in the firm in the form of those drawn to new idea creation as well as investment opportunities.[38]
      • Demand creation: Firms can create a new market for complementary goods by engaging in open source community. Firms also are able to expand into new fields that they previously would not have been able to without the addition of resources and collaboration from the community. This creates, as mentioned before, a new market for complementary goods for the products in said new fields.[38]
      • Costs reduction: Mass collaboration can help to reduce costs dramatically. Firms can release a specific software or product to be evaluated or debugged by online communities. The results will be more personal, robust and error-free products created in a short amount of time and costs. New ideas can also be generated and explored by collaboration of online communities creating opportunities for free R&D outside the confines of the company.[38]
    6. In one high-profile example, a human swarm was challenged by CBS Interactive to predict the Kentucky Derby. The swarm correctly predicted the first four horses, in order, defying 542–1 odds and turning a $20 bet into $10,800.
    7. To address the problems of serialized aggregation of input among large-scale groups, recent advancements in collective intelligence have worked to replace serialized votes, polls, and markets with parallel systems such as "human swarms" modeled after synchronous swarms in nature.
    8. While modern systems benefit from larger group size, the serialized process has been found to introduce substantial noise that distorts the collective output of the group. In one significant study of serialized collective intelligence, it was found that the first vote contributed to a serialized voting system can distort the final result by 34%
    9. To accommodate this shift in scale, collective intelligence in large-scale groups has been dominated by serialized polling processes such as aggregating up-votes, likes, and ratings over time
    10. The idea of collective intelligence also forms the framework for contemporary democratic theories often referred to as epistemic democracy.
    11. Condorcet, whose "jury theorem" states that if each member of a voting group is more likely than not to make a correct decision, the probability that the majority vote of the group is the correct decision increases with the number of members of the group (see Condorcet's jury theorem).
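      The jury theorem is easy to check numerically; a minimal sketch (the function name is my own, and n is kept odd so there are no ties):

```python
from math import comb

def majority_correct(n, p):
    """Probability that a majority of n independent voters is right,
    when each voter is right with probability p (n assumed odd)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

# With p = 0.6, larger juries become more reliable:
print(majority_correct(1, 0.6))    # ≈ 0.6
print(majority_correct(3, 0.6))    # ≈ 0.648
print(majority_correct(101, 0.6))  # close to certainty
```

      The binomial tail sum is exactly the probability that more than half the voters are correct, and it grows monotonically with n whenever p > 0.5.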
    12. The basis and goal of collective intelligence is mutual recognition and enrichment of individuals rather than the cult of fetishized or hypostatized communities."
    13. Collective intelligence (CI) is shared or group intelligence that emerges from the collaboration, collective efforts, and competition of many individuals and appears in consensus decision making.
    1. An upper ontology (or foundation ontology) is a model of the common relations and objects that are generally applicable across a wide range of domain ontologies.
    2. A domain ontology (or domain-specific ontology) represents concepts which belong to a part of the world, such as biology or politics.
    3. At present, merging ontologies that are not developed from a common upper ontology is a largely manual process and therefore time-consuming and expensive.
    1. When users can freely choose tags (creating a folksonomy, as opposed to selecting terms from a controlled vocabulary), the resulting metadata can include homonyms (the same tags used with different meanings) and synonyms (multiple tags for the same concept), which may lead to inappropriate connections between items and inefficient searches for information about a subject.
    2. Tagging systems open to the public are also open to tag spam, in which people apply an excessive number of tags or unrelated tags to an item (such as a YouTube video) in order to attract viewers. This abuse can be mitigated using human or statistical identification of spam items.[48] The number of tags allowed may also be limited to reduce spam.
    3. Hierarchical classification systems can be slow to change, and are rooted in the culture and era that created them; in contrast, the flexibility of tagging allows users to classify their collections of items in the ways that they find useful,
    4. The success of Flickr and the influence of Delicious popularized the concept,[21] and other social software websites—such as YouTube, Technorati, and Last.fm—also implemented tagging
    5. Tagging systems have sometimes been classified into two kinds: top-down and bottom-up.[3]:142[4]:24 Top-down taxonomies are created by an authorized group of designers (sometimes in the form of a controlled vocabulary), whereas bottom-up taxonomies (called folksonomies) are created by all users.
    6. People use tags to aid classification, mark ownership, note boundaries, and indicate online identity. Tags may take the form of words, images, or other identifying marks. An analogous example of tags in the physical world is museum object tagging. People were using textual keywords to classify information and objects long before computers. Computer based search algorithms made the use of such keywords a rapid way of exploring records.
    1. About 600,000 people visit News Genius a month, Lehman said, a figure that had grown 10 times since before President Donald Trump was inaugurated. And the number of people who annotate a post on Genius each month is now at 10,000, up 30 percent from the start of the year. “More people are using News Genius now than ever,” Lehman said. Meanwhile, overall traffic to the website and apps has grown to 62 million a month.
    2. Promised partnerships with major news media organizations never materialized, except in the case of The Washington Post’s Fix blog, which still occasionally uses the platform to annotate the news.
    3. Soon after, Genius made a definitive push to realize Andreessen’s vision. By 2015, Genius claimed 40 million visitors to its website a month, 1 million of whom had annotated a post.
    4. But the biggest problem with the annotator from Genius’ perspective is that few individuals are using it. After more than two years of development, the Chrome extension has only 12,320 users. It was last updated in June 2016.
    5. But it faced a storm of criticism last year after some writers complained the tool was being used to harass them. The annotator also raised concerns that it could have been used to inject malicious code onto visitors’ computers, though it’s since been tweaked to address that vulnerability.
    6. In January of that year, the company began testing a tool called the web annotator, which allowed anyone to add genius.it/ before any URL and then highlight and annotate text.
    7. “Rap Genius is going to be the fabric of the internet,” co-founder Mahbod Moghadam said in 2014. “We’re going to have annotations on other sites, so every other site in the world like the Wall Street Journal and the New York Times are going to be Genius-powered and they’re going to have our annotations on them. And then the Genius platform will take over the internet; everyone’s most important statistic that they have in life is their Genius IQ.”
    8. “The change we made in January was in recognition of the fact that we needed to shift resources from capturing knowledge — which we've been doing almost exclusively for the past five years — toward packaging and distributing knowledge into easy-to-consume formats like video and Spotify Behind the Lyrics,” Lehman told The Verge.
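The web annotator's URL scheme from item 6 is simple enough to sketch. The excerpt only says users could "add genius.it/ before any URL"; whether the http:// scheme was kept or dropped in practice is not stated here, so this just prepends the prefix literally.

```python
# Sketch of the genius.it mechanism described above: any page could be
# annotated by adding "genius.it/" before its URL.

def genius_proxy_url(url: str) -> str:
    """Return the genius.it-prefixed form of a page URL."""
    # The excerpt does not specify scheme handling, so prepend literally.
    return "genius.it/" + url

print(genius_proxy_url("www.washingtonpost.com/news/the-fix/"))
# genius.it/www.washingtonpost.com/news/the-fix/
```

The proxied page would then be served with Genius's highlight-and-annotate layer injected — which is also why the excerpt in item 5 mentions the code-injection concerns the approach raised.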
    1. Everipedia is going to be the answer to fake news
    2. The big difference is that Reddit’s discussions, not unlike Facebook’s, are haphazard and chaotic. [We plan for] Everipedia to have the most sophisticated groupthink software of all time, modeled off sites like Quora, Stack Overflow, and Genius.
    1. The combination of human expertise and automated analysis can exist in multiple overlays. Climate scientists, economists, political analysts, and automated fact checkers might converge on a single sentence in a story on climate change. Nothing depends on any domain-specific vocabulary or schema. Annotation is simply the connective tissue that makes statements in web pages addressable, and binds those addresses to conversations, supporting documents, source data, or truth claims that bear on annotated statements.
    2. The annotated web embodies that pattern. Systems that embrace it will tend to work well with one another. Their outputs will be available to mine, crosslink, and remix, and those activities will drive collective improvement.
    3. The web we know is an information fabric woven of linked resources. By increasing the thread count of that fabric, the annotated web enables a new class of application for which selections in documents are first-class resources.
    1. By requiring a lock up period for the DCR to obtain tickets, Decred hopes that only users invested in the long-term growth of the network will be involved in the consensus process. Short-term speculators and day traders of DCR will not be able to participate in consensus or governance without making their holdings illiquid.
    2. One concern in the Decred community is that the rising ticket price (about 100 DCR, as of mid-2018) excludes small holders from participating in governance and block validation.
    3. ‘A Hybrid Proof of Work, Proof of Stake Crypto-currency’. The initial design of Decred was also inspired by the Proof of Activity whitepaper co-authored by Litecoin founder Charlie Lee.
    4. Further, the lack of clear development funding methods in Bitcoin is often seen as problematic. The core network software exists as open source code on Github, but it is difficult for developers to directly monetize their contributions to the codebase. Funding for Bitcoin Core developers was entirely donation driven until 2014.
    5. Bitcoin has no formal governance structure, and decisions to alter the protocol are made entirely off-chain, typically by insiders/early adopters and heads of large mining operations.
    6. Block rewards are split 60/30/10 between PoW miners, PoS stakers, and a development fund controlled by community vote.
    7. The network was launched in February 2016
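The 60/30/10 reward split in item 6 can be sketched as simple arithmetic. The block reward value below is illustrative only, not a real Decred network parameter.

```python
# Sketch of the 60/30/10 block reward split described above:
# PoW miners / PoS stakers / community-governed development fund.

def split_reward(block_reward: float) -> dict:
    """Split a block reward 60/30/10 between PoW, PoS, and the dev fund."""
    return {
        "pow_miners": block_reward * 0.60,
        "pos_stakers": block_reward * 0.30,
        "dev_fund": block_reward * 0.10,
    }

print(split_reward(20.0))
# {'pow_miners': 12.0, 'pos_stakers': 6.0, 'dev_fund': 2.0}
```

The notable design choice, relative to Bitcoin's funding problem described in items 4 and 5, is the 10% slice: development funding is paid by the protocol itself rather than by donations.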
    1. The Web is distributed, with different systems working together to provide access to content. Annotations can be used to link those resources together, being referenced as the Body and Target
    2. Comments about shared photos or videos, reviews of products, or even social network mentions of web resources could all be considered as annotations. In addition, there are a plethora of "sticky note" systems and stand-alone multimedia annotation systems. This specification describes a common approach to expressing these annotations, and more
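The Body/Target structure mentioned in item 1 comes from the W3C Web Annotation Data Model. A minimal annotation in that model can be sketched as a Python dict: the Body carries the comment, and the Target addresses the annotated resource, here via a TextQuoteSelector. The URL and quoted text are illustrative.

```python
import json

# A minimal annotation in the W3C Web Annotation Data Model: the Body holds
# the comment, the Target addresses a selection in the annotated page.

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "body": {
        "type": "TextualBody",
        "value": "An interesting claim worth checking.",
        "format": "text/plain",
    },
    "target": {
        "source": "http://example.com/article",
        "selector": {
            "type": "TextQuoteSelector",
            "exact": "the annotated web",
        },
    },
}

print(json.dumps(annotation, indent=2))
```

Because both Body and Target are ordinary web resources identified by URL or selector, the same model covers the excerpt's examples — comments on photos, product reviews, sticky notes — without any domain-specific schema.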
    1. Discretionary traders are decision-based traders who scan the markets and place manual orders in response to information that is available at that time. System traders, on the other hand, use some level of automation to implement an objective set of rules, allowing a computer to both scan for trading opportunities and handle all order entry activity.
    2. Traders continue to monitor their open positions and look for any more opportunities.
    3. If everything is working properly, traders start scanning the markets for potential trading opportunities.
    4. After reading about events and making note of what the analysts are saying, traders head to their workstations, turn on their computers and monitors and open up their analysis and trading platforms.
    5. Traders will also review economic calendars to find out which market-moving financial reports – such as the weekly petroleum status report – are due that day.
    6. This involves reading stories from various newspapers and financial websites, as well as listening to updates from financial news networks, such as CNBC and Bloomberg.
    1. Right now, they estimate the global taxi market is worth $108 billion, which is triple the size of the $36-billion ride-hailing market. At the same time, they calculate an average of 15 million ride-hailing trips a day globally, which they expect to increase to 97 million by 2030.
    1. Recent studies have indicated that Uber’s U.S. driver churn has sharply increased this year, to rates as high as 96%. Needless to say, it’s hard (and costly) to maintain double-digit growth rates, when only 4% of mission critical, de facto employees stay on the job for more than a year.
    2. In fact, Uber has struggled to achieve market share leadership in many large foreign markets, including China, India, SE Asia and Brazil. Moreover, while network effects do exist within each metro market, the benefits are significantly weakened by extremely low switching costs, which enable drivers and riders to utilize whichever ridesharing service offers the best deal on any given trip.
    3. In historical context, Uber’s extraordinary losses are thus not just a case of growing pains of an ambitious Silicon Valley startup, but a reflection of the deep structural deficiencies in ride-hail industry economics. Prior to artificial regulatory supply caps, the unregulated taxi industry was unprofitable and subject to growing concerns over negative externalities. Uber is now facing the same relentless drag on its P&L.
    1. The Achilles’ heel of Uber and Lyft is their centralized management of pricing. This week's uproar by drivers — and their willingness to join an alternative — shows the failure of that approach. You cannot build a long-term relationship with drivers if you are taking away their ability to set their own pricing. Arcade City will decentralize those decisions to the level of the driver and their customers.
    1. Most people think of loyalty programs as an airline that gives miles to frequent fliers, a hotel that gives points toward a stay or a restaurant that offers a punch card incentive. While these may be called loyalty programs, I’ll argue that they are actually marketing programs disguised as loyalty programs. And while I don’t have a problem with this concept, we need to have a clear understanding of the differences between loyalty and marketing.
    1. Non-cooperative companies solve this problem by taking up front capital and using that to subsidize one or both sides of the marketplace – guaranteeing fees to musicians and doing lots of marketing to recruit users. Cooperatives lack this up front capital, making it hard to get started.
    2. First and foremost, it is not clear that Info Coops will produce more open information. After all, the logic of the information coop model is that information is only shared with members.
    3. An Information Coop would pool resources to create or purchase a specific piece or type of information. For example, you could have an Information Coop to fund research on a particular drug or disease; or to design and develop a new software application; or to purchase rights to music or movies for its members.
    1. An underlying theme in much of the work in the field is that existing government regulation of copyright, security, and antitrust is inappropriate in the modern world. For example, information goods, such as news articles and movies, now have zero marginal costs of production and sharing. This has made redistribution without permission common and has increased competition between providers of information goods.
  13. Dec 2017
    1. Max Keiser and Michael R. Burns were awarded U.S. patent no. 5950176 in 1999 for the invention.

      Wow, Keiser owns a patent on HSX. Need to reach out about BitShares.

    2. simulated money to buy and sell "shares" of actors, directors, upcoming films, and film-related options.

      keyword simulated

    1. Additionally Michael Schuman of Time magazine noted that these banks kept injecting new funds into unprofitable "zombie firms" to keep them afloat, arguing that they were too big to fail. However, most of these companies were too debt-ridden to do much more than survive on bail-out funds. Schuman believed that Japan's economy did not begin to recover until this practice had ended.

      Seems like the root cause of this barren period.

    2. Trying to deflate speculation and keep inflation in check, the Bank of Japan sharply raised inter-bank lending rates in late 1989.[11] This sharp policy caused the bursting of the bubble and the Japanese stock market crashed.

      The Fed would not repeat the same mistake as Japan. If anything, they would attempt to refuel the bubble if it started to falter.

    1. It is also based on an entirely decentralized storage network (DSN) using encrypted blockchain technology that is impervious to eavesdropping, disruption, or other kinds of interference.

      Does it use IPFS?

    1. UBS

      main-subject

    2. Just in time for the implementation of stringent new regulatory requirements, some of the largest banks in the world have revealed a pilot designed to simplify compliance using ethereum.

      summary

    1. for booking musicians and selling tickets to live events using smart contracts.

      larger scope around coordinating events

    2. by creating an Ethereum-based platform

      Constrained by that protocol's limitations.

    1. A lot of people had decided to go online, and to go online with AOL. Many were even starting to run their businesses through email and things like that. The fact that we were down for 23 hours was frustrating to them and disappointing to them. We understood that. We felt like we had let lots of people down. The upside was that it was an interesting sign that the internet had come of age, that AOL had come of age. Even a few short years before, no one even knew what we were doing or cared what we were doing.

      Same phenomenon with tokens

    2. The cost of the networks, the Time-Net, Telenet, and other things back then was typically about $10 an hour. And when you actually got online, there wasn't much to do, there wasn't much content.

      Costs were prohibitive (e.g. Bitcoin congestion and fees). Now those fees are far lower (e.g. PoS protocols). People are still using proof-of-work protocols. Ugh.

    3. The third wave, Case believes, is the concept of the "Internet of Everything," where every part of our lives will rely on an internet connection. He sees this new wave defined not by hardware or software but by partnerships—especially between business and government. New partnerships, Case believes, will be able to change the way our institutions, like healthcare, education, and agriculture, integrate the internet into our lives.

      New partnerships? Cooperatives?

    1. Given the speed of bitcoin’s take-off this year, there are concerns. Futures markets and derivatives exchanges like Cboe and CME are battle tested, but the exchanges where actual bitcoins are traded and held are vulnerable. They’re basically startups contending with immense growth; trading disruptions, withdrawal freezes, and hacks at bitcoin exchanges are fairly common.

      Ultimately they will get hacked, and the price will correct faster than futures can adjust.

    1. Ethereum is no longer the only major smart contract platform in existence

      Except it is. Large organizations and financial institutions are not using other platforms, because they are either vaporware or too new. Ethereum has another six months without any competition and the largest dev community.