If wherever we encountered new information, sentence by sentence, frame by frame, we could easily know the best thinking on it.

If we had confidence that this represented the combined wisdom of the most informed people—not as anointed by editors, but as weighed over time by our peers, objectively, statistically and transparently.

If this created a powerful incentive for people to ensure that their works met a higher standard, and made it perceptibly harder to spread information that didn’t meet that standard.

These goals are possible with today’s technologies.

They are the objectives of Hypothesis.
