1,025 Matching Annotations
  1. Jun 2019
    1. The term first appeared in 1984 as the topic of a public debate at the annual meeting of AAAI (then called the "American Association of Artificial Intelligence"). It is a chain reaction that begins with pessimism in the AI community, followed by pessimism in the press, followed by a severe cutback in funding, followed by the end of serious research.[2] At the meeting, Roger Schank and Marvin Minsky—two leading AI researchers who had survived the "winter" of the 1970s—warned the business community that enthusiasm for AI had spiraled out of control in the 1980s and that disappointment would certainly follow. Three years later, the billion-dollar AI industry began to collapse.
    1. volatility and leverage are co-determined and are pro-cyclical; that is, together, they amplify the impact of shocks. The mechanism, to be specific, is that declining volatility reduces the cost of taking on more leverage and furthers a buildup of risk. The lesson: Risk managers must resist the temptation to sell volatility when it is low and falling. The AMH implicitly embraces modeling such behavior with heterogeneous agents that use heuristics.
    1. Throughout the past two decades, he has been conducting research in the fields of psychology of learning and hybrid neural network (in particular, applying these models to research on human skill acquisition). Specifically, he has worked on the integrated effect of "top-down" and "bottom-up" learning in human skill acquisition,[1][2] in a variety of task domains, for example, navigation tasks,[3] reasoning tasks, and implicit learning tasks.[4] This inclusion of bottom-up learning processes has been revolutionary in cognitive psychology, because most previous models of learning had focused exclusively on top-down learning (whereas human learning clearly happens in both directions). This research has culminated with the development of an integrated cognitive architecture that can be used to provide a qualitative and quantitative explanation of empirical psychological learning data. The model, CLARION, is a hybrid neural network that can be used to simulate problem solving and social interactions as well. More importantly, CLARION was the first psychological model that proposed an explanation for the "bottom-up learning" mechanisms present in human skill acquisition: His numerous papers on the subject have brought attention to this neglected area in cognitive psychology.
    1. Bob Barton [said] "The basic principle of recursive design is to make the parts have the same power as the whole." For the first time I thought of the whole as the entire computer, and wondered why anyone would want to divide it up into weaker things called data structures and procedures. Why not divide it up into little computers... Why not thousands of them, each simulating a useful structure?
    1. To keep recession away, the Federal Reserve lowered the Federal funds rate 11 times - from 6.5% in May 2000 to 1.75% in December 2001 - creating a flood of liquidity in the economy. Cheap money, once out of the bottle, always looks to be taken for a ride. It found easy prey in restless bankers—and even more restless borrowers who had no income, no job and no assets. These subprime borrowers wanted to realize their life's dream of acquiring a home. For them, holding the hands of a willing banker was a new ray of hope. More home loans, more home buyers, more appreciation in home prices. It wasn't long before things started to move just as the cheap money wanted them to.
  2. May 2019
    1. Virtually all BPMs have utilities for creating simple, data-gathering forms. And in many types of workflows, these simple forms may be adequate. However, in any workflow that includes complex document assembly (such as loan origination workflows), BPM forms are not likely to get the job done. Automating the assembly of complex documents requires ultra-sophisticated data-gathering forms, which can only be designed and created after the documents themselves have been automated. Put another way, you won't know which questions need to be asked to generate the document(s) until you've merged variables and business logic into the documents themselves. The variables you merge into the document serve as question fields in the data gathering forms. And here's the key point - since you have to use the document assembly platform to create interviews that are sophisticated enough to gather data for your complex documents, you might as well use the document assembly platform to generate all data-gathering forms in all of your workflows.
  3. Mar 2019
  4. Feb 2019
    1. In a 2011 Reddit IAmA, Jennings recalled how in 2004 the Democratic politicians Chuck Schumer and Harry Reid unsuccessfully asked Jennings to run for the United States Senate from Utah. Jennings commented, "That was when I realized the Democratic Party was f@#$ed in '04."[19]
  5. Jan 2019
    1. You don't need complex sentences to express complex ideas. When specialists in some abstruse topic talk to one another about ideas in their field, they don't use sentences any more complex than they do when talking about what to have for lunch. They use different words, certainly. But even those they use no more than necessary. And in my experience, the harder the subject, the more informally experts speak. Partly, I think, because they have less to prove, and partly because the harder the ideas you're talking about, the less you can afford to let language get in the way.
    2. It seems to be hard for most people to write in spoken language. So perhaps the best solution is to write your first draft the way you usually would, then afterward look at each sentence and ask "Is this the way I'd say this if I were talking to a friend?" If it isn't, imagine what you would say, and use that instead. After a while this filter will start to operate as you write. When you write something you wouldn't say, you'll hear the clank as it hits the page. Before I publish a new essay, I read it out loud and fix everything that doesn't sound like conversation. I even fix bits that are phonetically awkward; I don't know if that's necessary, but it doesn't cost much.
    3. If you simply manage to write in spoken language, you'll be ahead of 95% of writers. And it's so easy to do: just don't let a sentence through unless it's the way you'd say it to a friend.
  6. Dec 2018
    1. “They’re actively, actively recruiting,” said Cheddar’s Alex Heath. “They’re also trying to scoop up crypto start-ups that are at the white-paper level, which means they don’t really even have a product yet.”
  7. Sep 2018
    1. The selloff partly reflects a broader malaise in emerging markets. U.S. interest rate increases and a stronger dollar have lured cash back to America, often at the expense of developing economies. Some countries have come under additional pressure because of U.S. tariffs or sanctions, while economic turmoil in Turkey and Argentina have further fueled investors’ concerns.
  8. Aug 2018
    1. Bakkt will provide access to a new Bitcoin trading platform on the ICE Futures U.S. exchange. And it will also offer full warehousing services, a business that ICE doesn’t have. “Bakkt’s revenue will come from two sources,” says Loeffler, “the trading fees on the ICE Futures U.S. exchange, and warehouse fees paid by the customers that buy Bitcoin and store with Bakkt.”
    2. Bakkt plans to offer a full package combining a major CFTC-regulated exchange with CFTC-regulated clearing and custody, pending the approval from the commission and other regulators.

      still pending regulatory approval

    3. At a recent meeting with the couple in the plush Bond Room at the NYSE, Sprecher stressed that Loeffler has been a collaborator in charting ICE’s next big move. “Kelly and I brainstormed for five years to find a strategy for digital currencies,” says Sprecher.

      bakkt is 5 years in the making

    4. Cracking the 401(k) and IRA market for cryptocurrency would be a huge win for Bakkt. But the startup’s plans raise the prospect of an even more ambitious goal: Using Bitcoin to streamline and disrupt the world of retail payments by moving consumers from swiping credit cards to scanning their Bitcoin apps. The market opportunity is gigantic: Consumers worldwide are paying lofty credit card or online-shopping fees on $25 trillion a year in annual purchases.

      Allowing money from 401(k)s and IRAs would allow for a huge influx of passive capital.

      The retail component would actually cause selling pressure, as was seen in 2015 when more and more retailers started accepting bitcoin.

    1. The idea of a gold exchange-traded fund was first conceptualized by Benchmark Asset Management Company Private Ltd in India when they filed a proposal with the SEBI in May 2002. However, it did not receive regulatory approval at first and was only launched later, in March 2007.

      Took 5 years to get approval for a gold ETF in India

    1. However, most ETCs implement a futures trading strategy, which may produce quite different results from owning the commodity.
    2. However, generally commodity ETFs are index funds tracking non-security indices. Because they do not invest in securities, commodity ETFs are not regulated as investment companies under the Investment Company Act of 1940 in the United States, although their public offering is subject to SEC review and they need an SEC no-action letter under the Securities Exchange Act of 1934. They may, however, be subject to regulation by the Commodity Futures Trading Commission.

      Commodity ETFs are regulated by the CFTC but need a no-action letter from the SEC to be approved.

    3. The idea of a Gold ETF was first officially conceptualised by Benchmark Asset Management Company Private Ltd in India when they filed a proposal with the SEBI in May 2002.[32] The first gold exchange-traded fund was Gold Bullion Securities launched on the ASX in 2003, and the first silver exchange-traded fund was iShares Silver Trust launched on the NYSE in 2006. As of November 2010 a commodity ETF, namely SPDR Gold Shares, was the second-largest ETF by market capitalization.[33]

      In 8 years the gold ETF became the second-largest ETF by market cap

  9. Jul 2018
    1. Mayor de Blasio and his administration have made progress in meeting their goal of building 200,000 affordable units over the span of a decade, as 21,963 new units were added in 2016, the most in 27 years. However, there continues to be a shortage in East Harlem. Out of the nearly 20,000 affordable units the city brought to all five boroughs, just 249 units have been built in East Harlem, according to a new report by the Department of Housing and Preservation Development (HPD). To better accommodate these residents, the city plans on expediting the construction of 2,400 units of affordable housing over the next few years, as DNA Info reported.
    1. However, price time-series have some drawbacks. Prices are usually only positive, which makes it harder to use models and approaches which require or produce negative numbers. In addition, price time-series are usually non-stationary; that is, their statistical properties are less stable over time.
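      A common remedy, sketched below with NumPy (the price series is hypothetical), is to work with log returns, which can be negative and are typically closer to stationary than raw prices:

      ```python
      import numpy as np

      prices = np.array([100.0, 101.5, 99.8, 102.3, 103.1])  # hypothetical price series

      # Log returns: log(P_t / P_{t-1}); sign-unrestricted and usually more
      # stable over time than the price level itself.
      log_returns = np.diff(np.log(prices))
      print(log_returns)
      ```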
    1. Denote N as the number of instances of evidence we possess. As we gather an infinite amount of evidence, say as N → ∞, our Bayesian results (often) align with frequentist results. Hence for large N, statistical inference is more or less objective. On the other hand, for small N, inference is much more unstable: frequentist estimates have more variance and larger confidence intervals. This is where Bayesian analysis excels. By introducing a prior, and returning probabilities (instead of a scalar estimate), we preserve the uncertainty that reflects the instability of statistical inference of a small-N dataset.

      The law of large numbers gets us to the frequentist result, but the Bayesian perspective preserves the instability of statistical inference when the number of observations is small.
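      A minimal beta-binomial sketch of that point (using SciPy; the counts are hypothetical): the posterior's 95% credible interval stays wide for small N and tightens toward the frequentist point estimate as N grows.

      ```python
      from scipy.stats import beta

      def posterior_interval(successes, n, prior_a=1.0, prior_b=1.0):
          """Beta posterior for a Bernoulli rate under a uniform Beta(1,1) prior;
          returns the central 95% credible interval."""
          post = beta(prior_a + successes, prior_b + n - successes)
          return post.ppf(0.025), post.ppf(0.975)

      # Same 60% observed frequency, very different uncertainty:
      print(posterior_interval(3, 5))        # small N: wide interval
      print(posterior_interval(3000, 5000))  # large N: narrow, near 0.6
      ```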

    1. A core tenet of the Y Combinator playbook for startups is to talk to your users. If you’re interested in building a third party app on top of a fat protocol, the lesson might be to also talk to competing apps’ users to figure out what needs aren’t being served. In a similar vein, protocol developers should talk to app developers and learn what they think end users want.

      This isn't happening nearly enough, which is why protocols don't provide the technical components needed for viable end-user apps.

    1. The reason Mr. Wonderful loves royalty based funding is because it is a big win for both businesses and investors. Investors see a return on helping businesses succeed. Experienced investors will even offer guidance to help business owners avoid the pitfalls that many entrepreneurs stumble into. On the business side, entrepreneurs get the financing they need without debt or sacrificing ownership of their companies in any way. Additionally, since repayment of royalty based financing is structured around revenue, there is no rigid payment schedule. Royalty based funding provides financing and flexibility, which gives businesses the freedom to reach their potential, while simultaneously providing healthy returns to investors.
    1. Here are the definitions to make sure we’re on the same page: Subscription model — a periodic (monthly, yearly, or seasonal) payment to gain access to products or services. Transactional model — you pay as you use the products and services.
    1. Second, recall that the impetus for moving from proof-of-work to proof-of-stake is to reduce the amount of computational resource and energy required to maintain the network by a couple orders of magnitude. That’s good for scalability and potential adoption, but also means a commensurate reduction in the PQ of the network.

      The impetus is the reduction of technical debt and the increased efficiency of network resource provision. The computational resources used for mining rather than for processing transactions get repurposed to increase the number of transactions that can be processed.

      This assumption that Q is constant is bizarre. Looking at transaction throughput alone, the goal is to process several hundred thousand transactions per second, if not millions. P and Q are clearly inversely correlated.

    2. Is that added value enough to offset its inefficiency compared to the incumbent centralised Twitter? Would Token Twitter offer compellingly higher utility compared to centralised Twitter, including enough surplus utility to offset the cost of operating the consensus mechanism? I’m not so sure.

      Considering that the majority of the cost of operating Twitter comes from human capital, marketing, legal, and accounting, not from IT (which continues to fall on a per-unit basis while the aforementioned costs continue to increase), yes. If the assumption is that legal and accounting are no longer needed, and that developers and other employees are overpaid relative to an entirely crowdsourced labor force, then you might see the redundancy costs in IT operations offset by cost reductions in other operational expenses.

    3. The combined effect of low and falling PQ and potentially very high V is that the utility value of utility cryptoassets at equilibrium should in fact be relatively low.

      The utility value of utility cryptoassets is not entirely a function of the cost of network resources. These assets also provide influence, which can't be financially measured, as it captures for each participant the expected future value of the network and what having influence over its direction can afford that particular participant.

      Also, I would think that PQ is artificially high right now because of how inefficient blockchains are, but as P falls due to further scalability, Q should increase not only to offset declines in P but to overcompensate for the declining P as more services can be built on this infrastructure. A premium will be placed on the network effects of a protocol with a successful application that other applications will want to interoperate with for its data and microservices (e.g. identity, accounts, finance, etc.).

    1. Additionally, there is work available in most countries for people living outside the US, but only workers in the US and India can withdraw cash. Workers from other countries can only redeem their earnings through Amazon gift cards.
  10. Jun 2018
    1. there have always been far more users/consumers than suppliers, which means that in a world where transactions are costly owning the supplier relationship provides significantly more leverage.
    2. The value chain for any given consumer market is divided into three parts: suppliers, distributors, and consumers/users. The best way to make outsize profits in any of these markets is to either gain a horizontal monopoly in one of the three parts or to integrate two of the parts such that you have a competitive advantage in delivering a vertical solution. In the pre-Internet era the latter depended on controlling distribution.
    1. Since you use a cryptoasset once, and then it’s in someone else’s hands, this discounting methodology is not accumulative over each year the way it is with a DCF.

      Why does a token have to be used once and change hands? A token can be taken out of circulation.

    1. Basically, all token pitches include a line that goes something like this: "There is a fixed supply of tokens. As demand for the token increases, so must the price." This logic fails to take into account the velocity problem
    1. This, of course, leaves us none the wiser as to how to model velocity, as the equation of exchange is nothing more than an identity. MV=PQ just says that the money flow of expenditures is equal to the market value of what those expenditures buy, which is true by definition. The left and right sides are two ways of saying the same thing; it’s a form of double-entry accounting where each transaction is simultaneously recorded on both sides of the equation. Whether an effect should be recorded in M, V, P, or Q is, ultimately, arbitrary. To transform the identity into a tool with predictive potency, we need to make a series of assumptions about each of the variables. For example, monetarists assume M is determined exogenously, V is constant, and Q is independent of M and use the equation to demonstrate how increases in the money supply increase P (i.e. cause inflation).
    2. The first practical problem with velocity is that it’s frequently employed as a catch-all to make the two sides of the equation of exchange balance. It often simply captures the error in our estimation of the other variables in the model.
    3. The core thesis of current valuation frameworks is that utility value can be derived by (a) forecasting demand for the underlying resource that a network provisions (the network’s ‘GDP’) and (b) dividing this figure by the monetary base available for its fulfillment to obtain per-unit utility value. Present values can be derived from future expected utility values using conventional discounting. The theoretical framework that nearly all these valuation models employ is the equation of exchange, MV=PQ.
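      As a rough illustration of that recipe (not any particular author's model; all numbers hypothetical), here is a minimal sketch of deriving per-token utility value from MV = PQ and discounting it to the present:

      ```python
      def utility_value_per_token(network_gdp, velocity, token_supply):
          """MV = PQ  =>  per-token utility value = PQ / (V * M).
          network_gdp is the network's annual 'GDP' (PQ), velocity is how many
          times a token changes hands per year (V), token_supply is M."""
          return network_gdp / (velocity * token_supply)

      def discount_to_present(future_value, rate, years):
          """Conventional discounting of a future expected utility value."""
          return future_value / (1 + rate) ** years

      # Hypothetical inputs: $10B network GDP in 10 years, velocity 20, 100M tokens.
      future = utility_value_per_token(10e9, 20, 100e6)  # $5.00 per token
      print(discount_to_present(future, 0.30, 10))       # ~$0.36 today
      ```

      Note how doubling the assumed velocity halves the per-token value, which is exactly the velocity problem the surrounding quotes describe.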
    1. Mechanism design studies solution concepts for a class of private-information games. Leonid Hurwicz explains that 'in a design problem, the goal function is the main "given", while the mechanism is the unknown. Therefore, the design problem is the "inverse" of traditional economic theory, which is typically devoted to the analysis of the performance of a given mechanism.'[1] So, two distinguishing features of these games are: that a game "designer" chooses the game structure rather than inheriting one; and that the designer is interested in the game's outcome.

      Advantages over traditional game theory for token economics (see the sketch after this list):

      • a game "designer" chooses the game structure rather than inheriting one
      • that the designer is interested in the game's outcome
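      A minimal sketch of the canonical mechanism-design example, the Vickrey (second-price) auction, where the designer chooses the rules so that truthful bidding is a dominant strategy (the bidder values are hypothetical):

      ```python
      def second_price_auction(bids):
          """Highest bidder wins but pays the second-highest bid, which makes
          bidding one's true value a dominant strategy."""
          order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
          winner, price = order[0], bids[order[1]]
          return winner, price

      values = [10.0, 7.0, 4.0]  # hypothetical private values

      # Bidding truthfully: bidder 0 wins and pays 7, keeping a surplus of 3.
      print(second_price_auction(values))           # (0, 7.0)

      # Shading the bid below value can only lose the item, never lower the price.
      print(second_price_auction([6.0, 7.0, 4.0]))  # (1, 6.0): bidder 0 now loses
      ```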
    1. 1. Thesis: Open Standards, Market Cycles and Investment Returns. Information technology evolves in multi-decade cycles of expansion, consolidation and decentralization.

      Open standards reduce production costs, which bring down prices for consumers and increase the potential size of the market.

      New entrants, realizing that costs are now low, competition is scarce, and the potential reward is high, attempt to disrupt incumbents with more efficient and scalable business models.

      Market consolidates around the platforms of the companies that realize and implement these business models first.

      Demand then builds for a low cost, open source alternative to the incumbent platforms.

    2. We favor spreading price and risk by building up and averaging out of positions over time rather than speculating on speculation. A committed capital structure with significant capital reserves for staged follow-ons gives us the flexibility to build up our investments independent of market sentiment. We are shielded from having to dump assets on the market to honor redemption requests, avoiding the dreaded “death spiral” which can plague more liquid fund structures.
    3. We fund the development of decentralized information networks coordinated by a scarce cryptoasset – or token – native to the protocol. Our thesis is that decentralization and standardization at the data layer of the internet is collapsing the production costs of information networks, eliminating data monopolies and creating a new wave of innovation.
    4. Crypto provides a new mechanism for organizing human activity on a global basis using programmable financial incentives. It’s an opportunity to design information networks which can achieve unprecedented levels of scale by decentralizing the infrastructure, open sourcing the data, and distributing value more broadly. What we’ve discovered is the native business model of networks – which, as it turns out, encompass the entire economy.
    5. Most of the use cases today involve compensating machine work (transaction processing, file storage, etc.) with tokens: the building blocks of decentralized applications. But the greatest long-term opportunity is in networks where tokens are earned by end-users themselves.
    6. We’ve also realized how inefficient the joint-stock equity industry model is at accounting for and distributing the real value created by online networks. The value of a share of stock is necessarily a function of profits; the price of Twitter’s stock only reflects Twitter Inc’s ability to monetize the data – and not the actual worth of the service. Tokens solve this inefficiency by deriving financial value directly from user demand as opposed to “taxing” by extracting profits.
    7. Following the history of information technology and the massive trend towards open source, we can see that democratizing information is the natural next step in the incessant trend to open source, and thus the next big opportunity for innovation.
    8. The way to play a consolidating market is to invest heavily into the consolidating incumbents (which are likely to continue growing strongly for a long period of time) and to invest progressively in the insurgent platforms that will grow to commoditize the incumbent business models and create a new wave of innovation. We are focused on the latter.
    9. Those who succeed the most and establish successful platforms “on top” of the open standard later tend to consolidate the industry by leveraging their scale (in assets and distribution) to integrate vertically and expand horizontally at the expense of smaller companies. Competing in this new environment suddenly becomes expensive and startups struggle to create value in the shadow of incumbents, compressing venture returns. Demand then builds for a low cost, open source alternative to the incumbent platforms, and the cycle repeats itself: the new open standard emerges and gets adopted, the market decentralizes as new firms leverage the cost savings to compete with the old on price, value creation shifts upwards (once more), and so on.
    10. Information technology evolves in multi-decade cycles of expansion, consolidation and decentralization. Periods of expansion follow the introduction of a new open platform that reduces the production costs of technology as it becomes a shared standard. As production costs fall, new firms come to market leveraging the standard to compete with established incumbents, pushing down prices and margins, and decentralizing existing market powers. The price drop attracts new users, increasing the overall size of the market and creating new opportunities for mass consumer applications. Entrepreneurial talent moves to serve the new markets where costs are low, competition is scarce, and the upside is high. Often these early entrepreneurs will introduce new kinds of business models, orthogonal to existing ones.
    1. In this kind of situation one might well ask: why continue to make the 80 per cent of products that only generate 20 per cent of profits? Companies rarely ask these questions, perhaps because to answer them would mean very radical action: to stop doing four-fifths of what you are doing is not a trivial change.

      Relevant on the larger scale of global economies.

    2. There are two routes to achieving this. One is to reallocate the resources from unproductive to productive uses, the secret of all entrepreneurs down the ages. Find a round hole for a round peg, a square hole for a square peg, and a perfect fit for any shape in between. Experience suggests that every resource has its ideal arena, where the resource can be tens or hundreds of times more effective than in most other arenas. The other route to progress—the method of scientists, doctors, preachers, computer systems designers, educationalists and trainers—is to find ways to make the unproductive resources more effective, even in their existing applications; to make the weak resources behave as though they were their more productive cousins; to mimic, if necessary by intricate rote-learning procedures, the highly productive resources. The few things that work fantastically well should be identified, cultivated, nurtured and multiplied. At the same time, the waste—the majority of things that will always prove to be of low value to man and beast—should be abandoned or severely cut back.
    3. George Bernard Shaw put it well: ‘The reasonable man adapts himself to the world. The unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.
    4. Certainly, the principle brings home what may be evident anyway: that there is a tragic amount of waste everywhere, in the way that nature operates, in business, in society and in our own lives. If the typical pattern is for 80 per cent of results to come from 20 per cent of inputs, it is necessarily typical too that 80 per cent, the great majority, of inputs are having only a marginal—20 per cent—impact.
    5. Both phenomena help to show how the universe abhors balance. In the former case, we see a natural flight away from a 50/50 split of competing phenomena. A 51/49 split is inherently unstable and tends to gravitate towards a 95/5, 99/1 or even 100/0 split. Equality ends in dominance: that is one of the messages of chaos theory. The 80/20 Principle’s message is different yet complementary. It tells us that, at any one point, a majority of any phenomenon will be explained or caused by a minority of the actors participating in the phenomenon. 80 per cent of the results come from 20 per cent of the causes. A few things are important; most are not.
    6. Related to the idea of feedback loops is the concept of the tipping point. Up to a certain point, a new force—whether it is a new product, a disease, a new rock group or a new social habit such as jogging or roller-blading—finds it difficult to make headway. A great deal of effort generates little by way of results. At this point many pioneers give up. But if the new force persists and can cross a certain invisible line, a small amount of additional effort can reap huge returns. This invisible line is the tipping point.
    7. We can see positive feedback loops operating in many areas, explaining how it is that we typically end up with 80/20 rather than 50/50 relationships between populations. For example, the rich get richer, not just (or mainly) because of superior abilities, but because riches beget riches. A similar phenomenon exists with goldfish in a pond. Even if you start with goldfish almost exactly the same size, those that are slightly bigger become very much bigger, because, even with only slight initial advantages in stronger propulsion and larger mouths, they are able to capture and gobble up disproportionate amounts of food
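      A quick simulation of that proportional-growth mechanism (all parameters hypothetical) shows how near-equal starting points drift toward an 80/20-style split:

      ```python
      import random

      def rich_get_richer(n_agents=1000, rounds=10000, seed=42):
          """Each round, one unit of 'food' goes to an agent with probability
          proportional to its current size (proportional growth)."""
          rng = random.Random(seed)
          wealth = [1.0] * n_agents
          for _ in range(rounds):
              r = rng.uniform(0, sum(wealth))
              acc = 0.0
              for i, w in enumerate(wealth):
                  acc += w
                  if acc >= r:
                      wealth[i] += 1.0
                      break
          wealth.sort(reverse=True)
          top_20pct = sum(wealth[: n_agents // 5])
          print(f"Top 20% of agents hold {100 * top_20pct / sum(wealth):.0f}% of the total")

      rich_get_richer()
      ```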
    8. At the heart of this progress is a process of substitution. Resources that have weak effects in any particular use are not used, or are used sparingly. Resources that have powerful effects are used as much as possible. Every resource is ideally used where it has the greatest value. Wherever possible, weak resources are developed so that they can mimic the behaviour of the stronger resources.
    9. Why should you care about the 80/20 Principle? Whether you realize it or not, the principle applies to your life, to your social world and to the place where you work. Understanding the 80/20 Principle gives you great insight into what is really happening in the world around you.
    10. The reason that the 80/20 Principle is so valuable is that it is counterintuitive. We tend to expect that all causes will have roughly the same significance. That all customers are equally valuable. That every bit of business, every product and every dollar of sales revenue is as good as another. That all employees in a particular category have roughly equivalent value. That each day or week or year we spend has the same significance. That all our friends have roughly equal value to us. That all enquiries or phone calls should be treated in the same way. That one university is as good as another. That all problems have a large number of causes, so that it is not worth isolating a few key causes. That all opportunities are of roughly equal value, so that we treat them all equally. We tend to assume that 50 per cent of causes or inputs will account for 50 per cent of results or outputs. There seems to be a natural, almost democratic, expectation that causes and results are generally equally balanced. And, of course, sometimes they are. But this ‘50/50 fallacy’ is one of the most inaccurate and harmful, as well as the most deeply rooted, of our mental maps.
    11. The key point is not the percentages, but the fact that the distribution of wealth across the population was predictably unbalanced.
    12. In business, many examples of the 80/20 Principle have been validated. 20 per cent of products usually account for about 80 per cent of dollar sales value; so do 20 per cent of customers. 20 per cent of products or customers usually also account for about 80 per cent of an organization’s profits. In society, 20 per cent of criminals account for 80 per cent of the value of all crime. 20 per cent of motorists cause 80 per cent of accidents. 20 per cent of those who marry comprise 80 per cent of the divorce statistics (those who consistently remarry and redivorce distort the statistics and give a lopsidedly pessimistic impression of the extent of marital fidelity). 20 per cent of children attain 80 per cent of educational qualifications available. In the home, 20 per cent of your carpets are likely to get 80 per cent of the wear. 20 per cent of your clothes will be worn 80 per cent of the time. And if you have an intruder alarm, 80 per cent of the false alarms will be set off by 20 per cent of the possible causes. The internal combustion engine is a great tribute to the 80/20 Principle. 80 per cent of the energy is wasted in combustion and only 20 per cent gets to the wheels; this 20 per cent of the input generates 100 per cent of the output!
    13. The 80/20 Principle asserts that a minority of causes, inputs or effort usually lead to a majority of the results, outputs or rewards. Taken literally, this means that, for example, 80 per cent of what you achieve in your job comes from 20 per cent of the time spent. Thus for all practical purposes, four-fifths of the effort—a dominant part of it—is largely irrelevant. This is contrary to what people normally expect.
    1. Overall the potential tariff charges could hit $450 billion worth of Chinese products entering the US, which is likely to spill over into the working classes and create havoc for the citizens of both economies. Meanwhile, amidst this trade war, the geopolitical unrest has directed a surge of capital back into the global crypto market, as individuals on both sides of the Pacific withdraw into decentralized virtual assets to protect their interests. The global market capital has enjoyed a $16 billion increase in the last 24 hours and looks promising to retrace towards $300 billion, as US and Chinese stock market FUD increases.
    1. Pantera Capital has had a thesis of investing into local exchanges since the inception of its venture capital fund. Local exchanges have an advantage of a local team who understands the culture and marketing of a specific geography in addition to having the relationships for banking and regulations. In June 2014, Pantera investigated and became the lead US investor in the largest cryptocurrency exchange in Korea, Korbit. Korea was a compelling geography for a local exchange investment because of the country’s familiarity with virtual currencies, becoming one of the first countries to adopt them for gaming, having a government that is pro-innovation, and having a large mobile ecosystem.
    1. Jonathan Evans suggested dual process theory in 1975. In his theory, there are two distinct types of processes: heuristic processes and analytic processes. He suggested that during heuristic processes, an individual chooses which information is relevant to the current situation. Relevant information is then processed further whereas irrelevant information is not. Following the heuristic processes come analytic processes. During analytic processes, the relevant information that is chosen during the heuristic processes is then used to make judgments about the situation.
    1. Journalists usually describe the organization or structure of a news story as an inverted pyramid. The essential and most interesting elements of a story are put at the beginning, with supporting information following in order of diminishing importance. This structure enables readers to stop reading at any point and still come away with the essence of a story.
    2. Charney states that "an effective lead is a 'brief, sharp statement of the story's essential facts.'"[10][full citation needed][clarification needed] The lead is usually the first sentence, or in some cases the first two sentences, and is ideally 20–25 words in length. A lead must balance the ideal of maximum information conveyed with the constraint of the unreadability of a long sentence. This makes writing a lead an optimization problem, in which the goal is to articulate the most encompassing and interesting statement that a writer can make in one sentence, given the material with which he or she has to work. While a rule of thumb says the lead should answer most or all of the five Ws, few leads can fit all of these.
    3. News stories also contain at least one of the following important characteristics relative to the intended audience: proximity, prominence, timeliness, human interest, oddity, or consequence.
    4. News writing attempts to answer all the basic questions about any particular event—who, what, when, where and why (the Five Ws) and also often how—at the opening of the article. This form of structure is sometimes called the "inverted pyramid", to refer to the decreasing importance of information in subsequent paragraphs.
    1. Awan: Our growth has been a journey of constant learning, but if I were to pinpoint the principles our growth team lives by today, which I hope would help others building and growing products, they would be:

      • Define one "North Star" metric for success that is aligned with how your users get value as well as with the success of your business, then measure everything and how it contributes to the North Star metric. Good examples of true-north metrics for growth are measures like how many users are truly engaged with your product.
      • Growth is a team sport, so hire wisely and invest in your team.
      • Good product, measured by long-term retention, comes first; growth comes second.
      • Invest in multiple growth channels and identify potential channels by looking at existing user behavior, especially how they are currently discovering your product. At LinkedIn, our biggest channels are viral growth, search-engine optimization of profiles and other member-generated content, and partnerships.
      • Understand that growth requires continuous prioritization and feedback, so always be testing.
    2. Awan: Here’s some advice to others as they focus on growing their own company:

      • Make sure you have product-market fit before you invest in growth. What that means is you have validated that your product is in a market with large demand and your product is satisfying the need for users who try the product. You can measure this by retention rate, which is essentially the percentage of your users who keep coming back. If your retention is not stable, you’d want to improve the product first rather than wasting resources on growth.
      • Prioritize growth from the very beginning and build it right into the product. It’s much better for your users to bring other users to the product as they create and share photos or other content, or invite others in the course of normally using the app, than to try growing through marketing that feels bolted onto the product. This is critically important if your product has network effects, because the product value is limited for early users if the network doesn’t grow fast enough.
      • As your company starts to scale, it’s important to create a dedicated multi-disciplinary growth team covering product, design, marketing, engineering, data science, and business operations. In a startup, a single person may be playing multiple of those roles (an engineer who’s also the data scientist) but as you scale, you can create dedicated functional roles.
    3. Instead, we define the goal of the growth team as accelerating the realization of LinkedIn’s vision, which is to create economic opportunity for every member of the global workforce. Keeping this vision top of mind led us to set the right growth objectives and priorities.
    1. With over 65% contribution to the total revenue, talent solutions are the most important services and tools included in the LinkedIn business model. Talent solutions include premium recruiting tools for the companies and recruiters to help them find the most suitable employees/partners for their business.
    2. LinkedIn, apart from being the best recruitment platform, is also a sought-after social networking website for marketers to execute their marketing campaigns. This service contributes over 18% of the total revenue of the company and offers features which let companies not only create a company page but also enhance their marketing efforts by creating sponsored content, sponsored InMails and text advertisements.
    3. LinkedIn not only connects you with other professionals but also with companies and recruiters. The company has uniquely positioned itself as the only platform worthy of professional networking.
    4. The Linkedin business model is a freemium model which works as a community to connect professionals globally.
    5. The platform currently has over 500 Million users in over 200 countries and territories, 80% of which consider professional networking important to their career success.
    6. The mobile version of the website was launched in February 2008 and the company was bought by Microsoft in February 2016 for $26.2 billion.
    1. Thus mass collaboration is more refined and complex in its process and production on the level of collective engagement.
    2. Modularity enables a mass of experiments to proceed in parallel, with different teams working on the same modules, each proposing different solutions. Modularity allows different "blocks" to be easily assembled, facilitating decentralised innovation that all fits together.
    1. "When tasks require high coordination because the work is highly interdependent, having more contributors can increase process losses, reducing the effectiveness of the group below what individual members could optimally accomplish". Having a team too large the overall effectiveness may suffer even when the extra contributors increase the resources. In the end the overall costs from coordination might overwhelm other costs.
    2. Games such as The Sims Series, and Second Life are designed to be non-linear and to depend on collective intelligence for expansion. This way of sharing is gradually evolving and influencing the mindset of the current and future generations.[117] For them, collective intelligence has become a norm.
    3. The UNU open platform for "human swarming" (or "social swarming") establishes real-time closed-loop systems around groups of networked users molded after biological swarms, enabling human participants to behave as a unified collective intelligence.[140][141] When connected to UNU, groups of distributed users collectively answer questions and make predictions in real-time.[142] Early testing shows that human swarms can out-predict individuals.[140] In 2016, an UNU swarm was challenged by a reporter to predict the winners of the Kentucky Derby, and successfully picked the first four horses, in order, beating 540 to 1 odds.
    4. Epistemic democratic theories refer to the capacity of the populace, either through deliberation or aggregation of knowledge, to track the truth and relies on mechanisms to synthesize and apply collective intelligence.
    5. Research performed by Tapscott and Williams has provided a few examples of the benefits of collective intelligence to business:[38]
      • Talent utilization: At the rate technology is changing, no firm can fully keep up in the innovations needed to compete. Instead, smart firms are drawing on the power of mass collaboration to involve participation of the people they could not employ. This also helps generate continual interest in the firm in the form of those drawn to new idea creation as well as investment opportunities.[38]
      • Demand creation: Firms can create a new market for complementary goods by engaging in open source community. Firms also are able to expand into new fields that they previously would not have been able to without the addition of resources and collaboration from the community. This creates, as mentioned before, a new market for complementary goods for the products in said new fields.[38]
      • Costs reduction: Mass collaboration can help to reduce costs dramatically. Firms can release a specific software or product to be evaluated or debugged by online communities. The results will be more personal, robust and error-free products created in a short amount of time and costs. New ideas can also be generated and explored by collaboration of online communities creating opportunities for free R&D outside the confines of the company.[38]
    6. In one high-profile example, CBS Interactive challenged a human swarm to predict the Kentucky Derby. The swarm correctly predicted the first four horses, in order, defying 542–1 odds and turning a $20 bet into $10,800.
    7. To address the problems of serialized aggregation of input among large-scale groups, recent advancements in collective intelligence have worked to replace serialized votes, polls, and markets with parallel systems such as "human swarms" modeled after synchronous swarms in nature.
    8. While modern systems benefit from larger group size, the serialized process has been found to introduce substantial noise that distorts the collective output of the group. In one significant study of serialized collective intelligence, it was found that the first vote contributed to a serialized voting system can distort the final result by 34%
    9. To accommodate this shift in scale, collective intelligence in large-scale groups has been dominated by serialized polling processes such as aggregating up-votes, likes, and ratings over time
    10. The idea of collective intelligence also forms the framework for contemporary democratic theories often referred to as epistemic democracy.
    11. Condorcet, whose "jury theorem" states that if each member of a voting group is more likely than not to make a correct decision, the probability that the majority vote of the group is the correct decision increases with the number of members of the group (see Condorcet's jury theorem).
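      The theorem is easy to check numerically; a minimal sketch (the voter counts and per-voter accuracy are hypothetical):

      ```python
      from math import comb

      def majority_correct(n_voters, p):
          """Probability that a simple majority of n_voters independent voters,
          each correct with probability p, reaches the correct decision
          (n_voters odd to avoid ties)."""
          majority = n_voters // 2 + 1
          return sum(comb(n_voters, k) * p**k * (1 - p)**(n_voters - k)
                     for k in range(majority, n_voters + 1))

      # Even a slight individual edge (p = 0.51) compounds with group size:
      for n in (1, 11, 101, 1001):
          print(n, round(majority_correct(n, 0.51), 3))
      ```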
    12. The basis and goal of collective intelligence is mutual recognition and enrichment of individuals rather than the cult of fetishized or hypostatized communities."
    13. Collective intelligence (CI) is shared or group intelligence that emerges from the collaboration, collective efforts, and competition of many individuals and appears in consensus decision making.
    1. An upper ontology (or foundation ontology) is a model of the common relations and objects that are generally applicable across a wide range of domain ontologies.
    2. A domain ontology (or domain-specific ontology) represents concepts which belong to a part of the world, such as biology or politics.
    3. At present, merging ontologies that are not developed from a common upper ontology is a largely manual process and therefore time-consuming and expensive.
    1. When users can freely choose tags (creating a folksonomy, as opposed to selecting terms from a controlled vocabulary), the resulting metadata can include homonyms (the same tags used with different meanings) and synonyms (multiple tags for the same concept), which may lead to inappropriate connections between items and inefficient searches for information about a subject.
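      One common mitigation for the synonym side of this problem is a lightweight synonym map layered over the free-form tags; a minimal sketch (the vocabulary is hypothetical), noting that homonyms would additionally need context to disambiguate:

      ```python
      # Hypothetical synonym map collapsing multiple tags onto one canonical concept.
      SYNONYMS = {
          "nyc": "new york city",
          "big apple": "new york city",
          "ny city": "new york city",
      }

      def canonicalize(tag: str) -> str:
          """Normalize case and whitespace, then collapse known synonyms."""
          t = tag.strip().lower()
          return SYNONYMS.get(t, t)

      print(canonicalize("  NYC "))     # -> "new york city"
      print(canonicalize("Big Apple"))  # -> "new york city"
      ```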
    2. Tagging systems open to the public are also open to tag spam, in which people apply an excessive number of tags or unrelated tags to an item (such as a YouTube video) in order to attract viewers. This abuse can be mitigated using human or statistical identification of spam items.[48] The number of tags allowed may also be limited to reduce spam.
    3. Hierarchical classification systems can be slow to change, and are rooted in the culture and era that created them; in contrast, the flexibility of tagging allows users to classify their collections of items in the ways that they find useful,
    4. The success of Flickr and the influence of Delicious popularized the concept,[21] and other social software websites—such as YouTube, Technorati, and Last.fm—also implemented tagging
    5. Tagging systems have sometimes been classified into two kinds: top-down and bottom-up.[3]:142[4]:24 Top-down taxonomies are created by an authorized group of designers (sometimes in the form of a controlled vocabulary), whereas bottom-up taxonomies (called folksonomies) are created by all users.
    6. People use tags to aid classification, mark ownership, note boundaries, and indicate online identity. Tags may take the form of words, images, or other identifying marks. An analogous example of tags in the physical world is museum object tagging. People were using textual keywords to classify information and objects long before computers. Computer based search algorithms made the use of such keywords a rapid way of exploring records.
    1. About 600,000 people visit News Genius a month, Lehman said, a figure that had grown 10 times since before President Donald Trump was inaugurated. And the number of people who annotate a post on Genius each month is now at 10,000, up 30 percent from the start of the year. “More people are using News Genius now than ever,” Lehman said. Meanwhile, overall traffic to the website and apps has grown to 62 million a month.
    2. Promised partnerships with major news media organizations never materialized, except in the case of The Washington Post’s Fix blog, which still occasionally uses the platform to annotate the news.
    3. Soon after, Genius made a definitive push to realize Andreessen’s vision. By 2015, Genius claimed 40 million visitors to its website a month, 1 million of whom had annotated a post.
    4. But the biggest problem with the annotator from Genius’ perspective is that few individuals are using it. After more than two years of development, the Chrome extension has only 12,320 users. It was last updated in June 2016.
    5. But it faced a storm of criticism last year after some writers complained the tool was being used to harass them. The annotator also raised concerns that it could have been used to inject malicious code onto visitors’ computers, though it’s since been tweaked to address that vulnerability.
    6. In January of that year, the company began testing a tool called the web annotator, which allowed anyone to add genius.it/ before any URL and then highlight and annotate text.
    7. “Rap Genius is going to be the fabric of the internet,” co-founder Mahbod Moghadam said in 2014. “We’re going to have annotations on other sites, so every other site in the world like the Wall Street Journal and the New York Times are going to be Genius-powered and they’re going to have our annotations on them. And then the Genius platform will take over the internet; everyone’s most important statistic that they have in life is their Genius IQ.”
    8. “The change we made in January was in recognition of the fact that we needed to shift resources from capturing knowledge — which we've been doing almost exclusively for the past five years — toward packaging and distributing knowledge into easy-to-consume formats like video and Spotify Behind the Lyrics,” Lehman told The Verge.
    1. Everipedia is going to be the answer to fake news
    2. The big difference is that Reddit’s discussions-not unlike Facebook’s-are haphazard and chaotic. [We plan for] Everipedia to have the most sophisticated groupthink software of all time, modeled off sites like Quora, Stackoverflow, and Genius.
    1. The combination of human expertise and automated analysis can exist in multiple overlays. Climate scientists, economists, political analysts, and automated fact checkers might converge on a single sentence in a story on climate change. Nothing depends on any domain-specific vocabulary or schema. Annotation is simply the connective tissue that makes statements in web pages addressable, and binds those addresses to conversations, supporting documents, source data, or truth claims that bear on annotated statements.
    2. The annotated web embodies that pattern. Systems that embrace it will tend to work well with one another. Their outputs will be available to mine, crosslink, and remix, and those activities will drive collective improvement.
    3. The web we know is an information fabric woven of linked resources. By increasing the thread count of that fabric, the annotated web enables a new class of application for which selections in documents are first-class resources.
    1. By requiring a lock up period for the DCR to obtain tickets, Decred hopes that only users invested in the long-term growth of the network will be involved in the consensus process. Short-term speculators and day traders of DCR will not be able to participate in consensus or governance without making their holdings illiquid.
    2. One concern in the Decred community is that the rising ticket price (about 100 DCR, as of mid-2018) excludes small holders from participating in governance and block validation.
    3. …‘A Hybrid Proof of Work, Proof of Stake Crypto-currency’. The initial design of Decred was also inspired by the Proof of Activity whitepaper co-authored by Litecoin founder Charlie Lee.
    4. Further, the lack of clear development funding methods in Bitcoin is often seen as problematic. The core network software exists as open source code on Github, but it is difficult for developers to directly monetize their contributions to the codebase. Funding for Bitcoin Core developers was entirely donation driven until 2014.
    5. Bitcoin has no formal governance structure, and decisions to alter the protocol are made entirely off-chain, typically by insiders/early adopters and heads of large mining operations.
    6. Block rewards are split 60/30/10 between PoW miners, PoS stakers, and a development fund controlled by community vote.
    7. The network was launched in February 2016
    1. The Web is distributed, with different systems working together to provide access to content. Annotations can be used to link those resources together, being referenced as the Body and Target
    2. Comments about shared photos or videos, reviews of products, or even social network mentions of web resources could all be considered as annotations. In addition, there are a plethora of "sticky note" systems and stand-alone multimedia annotation systems. This specification describes a common approach to expressing these annotations, and more
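      A minimal W3C Web Annotation expressed as a Python dict (the id and target URLs are hypothetical placeholders), showing the Body/Target structure the quote refers to:

      ```python
      import json

      annotation = {
          "@context": "http://www.w3.org/ns/anno.jsonld",
          "id": "http://example.org/anno1",      # hypothetical identifier
          "type": "Annotation",
          "body": {
              "type": "TextualBody",             # the comment itself
              "value": "A note about the target page",
              "format": "text/plain",
          },
          "target": "http://example.org/page1",  # the resource being annotated
      }

      print(json.dumps(annotation, indent=2))
      ```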
    1. Discretionary traders are decision-based traders who scan the markets and place manual orders in response to information that is available at that time. System traders, on the other hand, use some level of automation to implement an objective set of rules, allowing a computer to both scan for trading opportunities and handle all order entry activity.
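      As a toy illustration of "an objective set of rules" a system trader might automate, here is a minimal moving-average crossover check (the price history is hypothetical, not a strategy recommendation):

      ```python
      def crossover_signal(prices, fast=3, slow=5):
          """Return 'buy' when the fast moving average crosses above the slow
          one, 'sell' on the opposite cross, else 'hold'."""
          sma = lambda w: sum(prices[-w:]) / w            # current average
          sma_prev = lambda w: sum(prices[-w - 1:-1]) / w  # average one bar ago
          fast_now, slow_now = sma(fast), sma(slow)
          fast_prev, slow_prev = sma_prev(fast), sma_prev(slow)
          if fast_prev <= slow_prev and fast_now > slow_now:
              return "buy"
          if fast_prev >= slow_prev and fast_now < slow_now:
              return "sell"
          return "hold"

      print(crossover_signal([100, 99, 98, 99, 101, 104]))  # -> "buy"
      ```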
    2. Traders continue to monitor their open positions and look for any more opportunities.
    3. If everything is working properly, traders start scanning the markets for potential trading opportunities.
    4. After reading about events and making note of what the analysts are saying, traders head to their workstations, turn on their computers and monitors and open up their analysis and trading platforms.
    5. Traders will also review economic calendars to find out which market-moving financial reports – such as the weekly petroleum status report – are due that day.
    6. This involves reading stories from various newspapers and financial websites, as well as listening to updates from financial news networks, such as CNBC and Bloomberg.
    1. Right now, they estimate the global taxi market is worth $108 billion, which is triple the size of the $36-billion ride-hailing market. At the same time, they calculate an average of 15 million ride-hailing trips a day globally, which they expect to increase to 97 million by 2030.
    1. Recent studies have indicated that Uber’s U.S. driver churn has sharply increased this year, to rates as high as 96%. Needless to say, it’s hard (and costly) to maintain double-digit growth rates, when only 4% of mission critical, de facto employees stay on the job for more than a year.
    2. In fact, Uber has struggled to achieve market share leadership in many large foreign markets, including China, India, SE Asia and Brazil. Moreover, while network effects do exist within each metro market, the benefits are significantly weakened by extremely low switching costs, which enable drivers and riders to utilize whichever ridesharing service offers the best deal on any given trip.
    3. In historical context, Uber’s extraordinary losses are thus not just a case of growing pains of an ambitious Silicon Valley startup, but a reflection of the deep structural deficiencies in ride-hail industry economics. Prior to artificial regulatory supply caps, the unregulated taxi industry was unprofitable and subject to growing concerns over negative externalities. Uber is now facing the same relentless drag on its P&L.
    1. The Achilles’ heel of Uber and Lyft is their centralized management of pricing. This week's uproar by drivers — and their willingness to join an alternative — shows the failure of that approach. You cannot build a long-term relationship with drivers if you are taking away their ability to set their own pricing. Arcade City will decentralize those decisions to the level of the driver and their customers.
    1. Most people think of loyalty programs as an airline that gives miles to frequent fliers, a hotel that gives points toward a stay or a restaurant that offers a punch card incentive. While these may be called loyalty programs, I’ll argue that they are actually marketing programs disguised as loyalty programs. And while I don’t have a problem with this concept, we need to have a clear understanding of the differences between loyalty and marketing.
    1. Non-cooperative companies solve this problem by taking up front capital and using that to subsidize one or both sides of the marketplace – guaranteeing fees to musicians and doing lots of marketing to recruit users. Cooperatives lack this up front capital, making it hard to get started.
    2. First and foremost, it is not clear that Info Coops will produce more open information. After all, the logic of the information coop model is that information is only shared with members.
    3. An Information Coop would pool resources to create or purchase a specific piece or type of information. For example, you could have an Information Coop to fund research on a particular drug or disease; or to design and develop a new software application; or to purchase rights to music or movies for its members.
    1. An underlying theme in much of the work in the field is that existing government regulation of copyright, security, and antitrust is inappropriate in the modern world. For example, information goods, such as news articles and movies, now have zero marginal costs of production and sharing. This has made redistribution without permission common and has increased competition between providers of information goods.
  11. Dec 2017
    1. Max Keiser and Michael R. Burns, who were awarded a U.S. patent no. 5950176 in 1999 for the invention.

      Wow, Keiser owns a patent on HSX. Need to reach out about BitShares.

    2. simulated money to buy and sell "shares" of actors, directors, upcoming films, and film-related options.

      keyword simulated

    1. Additionally Michael Schuman of Time magazine noted that these banks kept injecting new funds into unprofitable "zombie firms" to keep them afloat, arguing that they were too big to fail. However, most of these companies were too debt-ridden to do much more than survive on bail-out funds. Schuman believed that Japan's economy did not begin to recover until this practice had ended.

      Seems like the root cause of this barren period.

    2. Trying to deflate speculation and keep inflation in check, the Bank of Japan sharply raised inter-bank lending rates in late 1989.[11] This sharp policy caused the bursting of the bubble and the Japanese stock market crashed.

      The Fed would not repeat the same mistake as Japan. If anything, it would attempt to refuel the bubble if it started to falter.

    1. It is also based on an entirely decentralized storage network (DSN) using encrypted blockchain technology that is impervious to eavesdropping, disruption, or other kinds of interference.

      Does it use IPFS?

    1. UBS

      main-subject

    2. Just in time for the implementation of stringent new regulatory requirements, some of the largest banks in the world have revealed a pilot designed to simplify compliance using ethereum.

      summary

    1. for booking musicians and selling tickets to live events using smart contracts.

      larger scope around coordinating events

    2. by creating an Ethereum-based platform

      Constrained by that protocol's limitations.

    1. A lot of people had decided to go online, and to go online with AOL. Many were even starting to run their businesses through email and things like that. The fact that we were down for 23 hours was frustrating to them and disappointing to them. We understood that. We felt like we had let lots of people down. The upside was that it was an interesting sign that the internet had come of age, that AOL had come of age. Even a few short years before, no one even knew what we were doing or cared what we were doing.

      Same phenomenon with tokens

    2. The cost of the networks, the Tymnet, Telenet, and other things back then was typically about $10 an hour. And when you actually got online, there wasn't much to do, there wasn't much content.

      Costs were prohibitive (e.g., Bitcoin congestion and fees). Now those fees are far lower (e.g., PoS protocols). People are still using proof-of-work protocols. Ugh.

    3. The third wave, Case believes, is the concept of the "Internet of Everything," where every part of our lives will rely on an internet connection. He sees this new wave defined not by hardware or software but by partnerships—especially between business and government. New partnerships, Case believes, will be able to change the way our institutions, like healthcare, education, and agriculture, integrate the internet into our lives.

      New partnerships? Cooperatives?

    1. Given the speed of bitcoin’s take-off this year, there are concerns. Futures markets and derivatives exchanges like Cboe and CME are battle tested, but the exchanges where actual bitcoins are traded and held are vulnerable. They’re basically startups contending with immense growth; trading disruptions, withdrawal freezes, and hacks at bitcoin exchanges are fairly common.

      Ultimately they will get hacked, and the price will correct faster than futures can adjust.

    1. Ethereum is no longer the only major smart contract platform in existence

      Except it is. Large organizations and financial institutions are not using other platforms, because they are either vaporware or too new. Ethereum has another six months without any competition and the largest dev community.

    2. Point is, if Dexaran is actually the DAO attacker (quite likely) then he has no incentive to support ETC long-term. If his plan is to develop a decentralized exchange, and also anonymity protocols on ETC – perfect for being able to liquidate the currently “tainted” ETC in the DAO attacker's address, perhaps? If that is the case then the so-called “Ethereum Commonwealth” is only here until the coins can be dumped on the market.

      Interesting point. Dumping the price has an opportunity cost; he could potentially make more money by not dumping.

    3. This post where he explains that his reason for developing on ETC is because he is concerned the chain will fade into obsolescence.

      I see Ethereum fading into obsolescence as well. That does not mean it lacks short-term utility.

    4. The problem is that it has become apparent to me that Ethereum Classic is not, in fact, a responsible, long-term-oriented Ethereum – it is a crippled, sad wannabe Ethereum. It's a sour grapes version of Ethereum – and all the ideology is mainly just a show.

      Neglects to realize that ETC can easily fork in proven features of Ethereum.

    5. Ethereum Classic's value proposition was that it was Ethereum, but (a) truly decentralized – not controlled by the Ethereum Foundation, and (b) minus the “move fast and break things” modus operandi.

      That's not really its value proposition. All of these platforms will become centralized because of the Pareto principle.

    1. EOS DPOS does optimistic pipelining that allows the blockchain to advance in “pending state” while the signatures are gathered.

      "optimistic pipelining" is a curious euphemism

    2. In order to reduce the frequency of producer set changes we have changed block scheduling to only include the top 21 producers. We are considering offering some kind of stand-by pay for the runner ups, but they will not actually be tasked with producing blocks.

      The system is designed to have as few producer changes as possible. If there is a change, it will force light clients to process more block headers.

    3. We will be experimenting with different hand-off periods

      seems very experimental

    4. On our eos-noon branch we have implemented a number of changes to the underlying DPOS framework to support 500 ms blocks (2 blocks every second). This change will dramatically increase the responsiveness of decentralized applications.

      game changer

    5. In past updates we indicated our intention to focus on shared-memory architectures so that developers could easily perform synchronous read-access and atomic transactions with other contracts. The consequence of this approach was a loss of horizontal scaling beyond a single high end machine.

      shared memory constrains horizontal scaling

    6. Under this model, the communication will be secured so long as at least ⅓ of producers are honest.

      Hmm, so it has a 66% fault tolerance?

    7. Whereas traditionally light clients have to process all block headers, EOS.IO will enable light clients that only have to process block headers when producers change or when a new message is required from a more recent block.

      Something about that sounds off

    8. EOS.IO will be the first proof-of-stake protocol with support for light client validation.

      Is it really the only PoS protocol with support for light clients?

    9. EOS Dawn 3.0 will re-introduce horizontal scaling of single chains and infinite scaling via secure inter-blockchain communication. With these two features there will be no limit to what can be built on blockchain technology, nor for how decentralized the network of blockchains can become.

      That's a bold statement! Can't wait to see how it plays out once everything runs on a blockchain.

    10. The contract is billed based upon the total data they store plus a constant overhead factor for each independent database entry. This in-memory database is independent and separate from the EOS.IO Storage protocol for decentralized bulk hosting and storage.

      Can rent memory
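
      A minimal sketch of the billing rule quoted above, assuming a hypothetical per-entry overhead constant; EOS.IO's real accounting and units may differ:

      ```python
      # Sketch of "total data stored plus a constant overhead per entry" billing.
      # ENTRY_OVERHEAD_BYTES is an assumed constant, not EOS.IO's actual value.
      ENTRY_OVERHEAD_BYTES = 32

      def billed_bytes(entry_sizes: list[int]) -> int:
          """Bill for raw payload bytes plus fixed overhead per database entry."""
          return sum(entry_sizes) + ENTRY_OVERHEAD_BYTES * len(entry_sizes)

      # Three entries of 100, 250, and 4 bytes -> 354 + 3 * 32 = 450 billed bytes.
      print(billed_bytes([100, 250, 4]))
      ```
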

    11. All accounts whose authority is required for the transaction will have their 3-day average bandwidth incremented based upon the size of the transaction.

      Uses a 3-day average for bandwidth calculations.
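
      A sketch of how a trailing 3-day average might be maintained per account; the sliding-window scheme and names here are assumptions for illustration, and the actual EOS.IO implementation may average differently:

      ```python
      # Sketch of trailing 3-day average bandwidth accounting per account.
      # The sliding window is an assumption; the real implementation may
      # use a different averaging method (e.g. exponential decay).
      from collections import deque

      SECONDS_PER_DAY = 86_400
      WINDOW_DAYS = 3

      class BandwidthTracker:
          def __init__(self) -> None:
              self.usage: deque = deque()  # (timestamp, tx_size_bytes) pairs

          def record(self, now: float, tx_size: int) -> None:
              """Charge a transaction's size to this account's usage."""
              self.usage.append((now, tx_size))
              self._expire(now)

          def average_per_day(self, now: float) -> float:
              """Average bytes per day over the trailing 3-day window."""
              self._expire(now)
              return sum(size for _, size in self.usage) / WINDOW_DAYS

          def _expire(self, now: float) -> None:
              """Drop usage records older than the window."""
              cutoff = now - WINDOW_DAYS * SECONDS_PER_DAY
              while self.usage and self.usage[0][0] < cutoff:
                  self.usage.popleft()
      ```
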

    12. All transactions consume some of the maximum network bandwidth as configured by the block producers.

      Block producers decide network capacity.

    13. Because of these outstanding attack vectors, performance testing will remain a task for private test networks, but feature testing can now be performed on a public test network which we are artificially limiting to 30 TPS to ensure uptime and access.

      Public test network is limited to 30 TPS.

    14. That said, there are known attack vectors for which we have unimplemented solutions. For example, compilation of new contracts for the first time can take up to 34ms, which if exploited could cause the network to fragment at transaction rates over 30 TPS.

      Taking a very iterative approach. Hopefully this long period of open development will allow white-hat hackers to identify all attack vectors.

    15. Our internal testing shows we can sustain several thousand transfers per second and 1 second blocks using our single-threaded implementation on average hardware.

      Estimate of 5-10k TPS.

    16. due to parallel development paths our implementation of Inter-Blockchain Communication exists on a separate branch that will not be used for the initial test network.

      Inter-blockchain communication is not implemented in the initial test network.

    1. But when CryptoKitties was officially released on October 28, it unexpectedly became a multi-million dollar digital kitten mill—perhaps the strongest ever confirmation that a fool and their Ethereum are easily parted.

      Hahaha so true. A clear indication

    1. Its equivalent in the non-profit world is called "micro-volunteering" whereby individuals donate their time and skills to undertake micro-tasks such as tagging pictures or transcribing handwritten messages in support of development projects worldwide.

      from micro volunteering to micropreneurship

    2. The micropreneur may then launch the business and become a traditional business owner if desired.

      gives entrepreneurs more flexibility

    3. micropreneur

      I don't think I like it, but it could grow on me.

    1. If MAX_ACCUMULATE_DAY = 7 and VOTING_CONSUME_FACTOR = 0.2, the user can carry out effective ratings five times per day with the accumulated coin days. However, there is an upper limit on daily ratings: every user is permitted to make no more than 35 ratings a day (excluding ineffective ratings), regardless of the accumulated coin days.

      Interesting constraint. Not sure why this is necessary
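
      The arithmetic behind the quoted limits, under the (assumed) reading that each effective rating consumes VOTING_CONSUME_FACTOR of one day's accumulated coin days, with accumulation capped at MAX_ACCUMULATE_DAY days:

      ```python
      # Worked arithmetic for the quoted YOYOW parameters.
      MAX_ACCUMULATE_DAY = 7
      VOTING_CONSUME_FACTOR = 0.2

      # One day's accumulation funds 1 / 0.2 = 5 effective ratings.
      ratings_per_accrued_day = round(1 / VOTING_CONSUME_FACTOR)        # 5
      # Seven days of accumulation caps out at 7 * 5 = 35 ratings per day.
      daily_rating_cap = MAX_ACCUMULATE_DAY * ratings_per_accrued_day   # 35

      print(ratings_per_accrued_day, daily_rating_cap)  # -> 5 35
      ```
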

    2. The POT (Proof of Taste) content rating algorithm, based on the principle that income distribution depends on net positive rating weight, is applied to the YOYOW network.

      What Steem refers to as "proof of brain".

    3. YOYOW, named for "You Own Your Own Words", is a blockchain-based network that aims to quantify the contributions of participants in the content-producing sector and reward them through decentralized consensus, so that content producers, content investors, curators, and consumers in the content ecosystem receive appropriate incentives and returns.

      Alignment of incentives between content producers and beneficiaries.

    1. We will provide SDKs and detailed development documents for third-party developers to help them build their own customized content platforms, like blockchain versions of Quora, Twitter, or Facebook. Different types of content platforms can be built on top of the YOYOW network. For independent bloggers or writers, we will provide plug-ins to enable them to create their own branded platforms.

      The Steem of China.

    1. Bullish scenario: Consolidation above the $10,000 mark for a next couple of days would improve the odds of bitcoin moving to fresh record highs.

      interesting, positive

    2. The doors look open for a drop to $9,000. A close below $9,202 (38.2 percent Fibonacci retracement) could yield a sell-off to $7,793 (61.8 percent Fibonacci retracement). However, the 10-day MA (seen today at $9,000) is still sloping upwards, thus losses below the same are likely to be short-lived.

      interesting, negative
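
      For reference, Fibonacci retracement levels are derived by stepping fixed ratios of the swing range down from the swing high; the swing points below are illustrative guesses, not the article's exact inputs:

      ```python
      # Fibonacci retracement levels: fixed ratios of the swing range,
      # measured down from the swing high. Swing points are illustrative.
      def retracement_levels(swing_low: float, swing_high: float) -> dict:
          rng = swing_high - swing_low
          return {f"{r:.1%}": swing_high - r * rng for r in (0.382, 0.5, 0.618)}

      # A rally from ~$5,700 to the ~$11,364 all-time high mentioned below:
      print(retracement_levels(5_700, 11_364))
      # approx. {'38.2%': 9200.35, '50.0%': 8532.0, '61.8%': 7863.65}
      ```
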

    3. The decline to below $10,000 comes despite news that Nasdaq plans to launch bitcoin futures in 2018, perhaps indicating that bitcoin's coming move towards the investment mainstream has been priced in by the markets.

      interesting, negative

    4. Notably, it's only been a day since the cryptocurrency clocked an all-time high of $11,363.99, before falling more than 15 percent to $9,295.79, as per CoinDesk's Bitcoin Price Index (BPI).

      interesting, negative

    5. Bitcoin prices are taking a hit at press time, and could suffer a deeper pullback over the weekend, the price charts indicate.

      summary

    1. some of the data required has also been reduced from the initial order, which included information such as wallet addresses and public keys.

      interesting, positive

    2. The dispute over user records has been ongoing since November 2016, with the IRS proposing a reduced summons in July this year – down from an initial 480,000 customer accounts requested.

      interesting, positive

    3. Coinbase must now provide the tax agency with the name, address, taxpayer identification numbers and date of birth of customers associated with these accounts, a court filing states.

      controversial, negative

    4. Following a lengthy legal battle between the two entities, the San Francisco district court ruled Tuesday that Coinbase must hand over user accounts at the exchange that bought, sold, sent or received sums of $20,000 and higher between 2013 and 2015.

      summary

  12. Nov 2017
    1. SMT creator may specify that Steem Power can control a portion of the SMT's rewards pool for an unlimited or limited amount of time, with increasing or decreasing influence. Altogether, Shared Influence may allow SMTs to be wholly or partially bootstrapped by the interest of existing and active Steem or other SMT community members. Through these tools, community managers and entrepreneurs launching a token may leverage existing user bases to accelerate the distribution of the SMT to a target market.

      User base targeting

    2. SMT-based ICOs allow a portion of STEEM tokens received to be sent into an SMT's on-chain, off-order-book market maker in order to provide liquidity to the SMT at a specified reserve ratio.

      Really important to provide liquidity to token holders; otherwise the tokens would be useless.
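
      One common construction for an on-chain, off-order-book market maker with a reserve ratio is the Bancor formula; whether SMTs use exactly this math is an assumption here, but it illustrates the mechanism:

      ```python
      # Bancor-style reserve-ratio market maker sketch. Whether the SMT
      # design uses exactly this formula is an assumption.
      def tokens_for_deposit(supply: float, reserve: float,
                             reserve_ratio: float, deposit: float) -> float:
          """SMTs issued when `deposit` STEEM is added to the reserve."""
          return supply * ((1 + deposit / reserve) ** reserve_ratio - 1)

      def spot_price(supply: float, reserve: float, reserve_ratio: float) -> float:
          """Instantaneous STEEM price of one SMT."""
          return reserve / (supply * reserve_ratio)

      # 1,000,000 SMTs backed by 10,000 STEEM at a 50% reserve ratio:
      print(spot_price(1_000_000, 10_000, 0.5))               # 0.02 STEEM/SMT
      print(tokens_for_deposit(1_000_000, 10_000, 0.5, 100))  # ~4,987.56 SMTs
      ```
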

    3. Entrepreneurs may now create tokens to integrate with their blog, application, or an entire network of applications and topics.

      The chief advantage of Steem is that developers integrate with it instead of developing on it. There are fewer development costs and risks. You don't have to write your own smart contracts in a new language and in a possibly insecure environment.