Hypothesis: Meeting the Audrey Test for educational technology


[Image: Hack Education logo (pigeon head in circle on black square) by Audrey Watters, used under a CC BY-NC-SA license.]

Anyone working on or with educational technology should take the work of Audrey Watters—widely known as the “Cassandra” of #edtech—very seriously. If your work withers under Audrey’s critical gaze, you’ve got more work to do. In that spirit, I wanted to hold Hypothesis up to the kind of scrutiny that Audrey might provide.

Back in 2012, Audrey posted “The Audrey Test”: Or, What Should Every Techie Know About Education? on her must-read Hack Education blog. The Audrey Test includes a short list of questions that she suggests every #edtech project, product, or company should answer in order to meet the high expectations we should all hold when we are working on educational tools that engage in what we should think of as “high stakes environments with other people’s children.”

How does Hypothesis fare in The Audrey Test? Pretty well, especially on some of her bigger, fundamental questions like how we are funded. We have some strong answers to her more technical questions too, but also some important work yet to do, like on mobile access and accessibility. Here’s Hypothesis’ Audrey Test:

The Audrey Test

(Reproduced here from “The Audrey Test”: Or, What Should Every Techie Know About Education? by Audrey Watters under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 license.)

Do you work closely with your potential users (teachers or students, for example) about product development?

Yes. Hypothesis engages users in the early stages of product development, often in the form of “alpha” applications that enable people to test capabilities in real-world situations. For example, read about our experiments to integrate annotation into the Instructure Canvas learning management system (LMS).

Do you offer data portability—not just for administrative data, but for students’ own information?

Yes, currently via a separate utility; supporting data portability directly in Hypothesis is on our public roadmap.

Is your tool available across platforms?

Yes. Anyone using a modern web browser on any operating system can use Hypothesis to annotate any web-based document via a browser extension, a bookmarklet, or an integration on the publishing server. Failing those, pasting a link to a document into the box on the Hypothesis home page adds a simple prefix to the URL so that the link opens with annotation enabled. We still have work to do to improve the experience on mobile devices.
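As a minimal sketch of the “simple prefix” approach described above: Hypothesis runs a proxy that loads a page with the annotation sidebar enabled when its hostname is prepended to the target URL. The `via.hypothes.is` hostname here reflects the proxy as currently deployed; treat it as an assumption rather than a stable contract.

```python
# Sketch of the URL-prefix approach to enabling annotation,
# assuming the via.hypothes.is proxy currently run by Hypothesis.

VIA_PREFIX = "https://via.hypothes.is/"

def annotatable_link(url):
    """Prefix a URL so it opens through the proxy with annotation enabled."""
    return VIA_PREFIX + url

print(annotatable_link("https://example.com/article"))
```

Opening the printed link loads the original page with the Hypothesis sidebar active, with no extension or bookmarklet required.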

Are you open source?

Yes.

Do you offer an API?

Yes.
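To illustrate, the public Hypothesis API exposes a search endpoint for retrieving annotations as JSON. This sketch builds a search query for public annotations on a page; the endpoint and parameter names reflect the public API, but the target URI is a placeholder and the network call is left commented out.

```python
# Sketch: querying the public Hypothesis API for annotations on a document.
# The /api/search endpoint and its "uri"/"limit" parameters come from the
# public API; the document URI below is an illustrative placeholder.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_SEARCH = "https://api.hypothes.is/api/search"

def build_search_url(document_uri, limit=10):
    """Return a search URL for public annotations on document_uri."""
    return API_SEARCH + "?" + urlencode({"uri": document_uri, "limit": limit})

url = build_search_url("https://example.com/article")

# Uncomment to fetch live results (the response is JSON with a "rows" list):
# with urlopen(url) as resp:
#     data = json.load(resp)
#     for row in data["rows"]:
#         print(row["user"], row.get("text", ""))
```

Each row in the response carries the annotation body, its author, and the selectors anchoring it to the document, which is what makes the data-portability answer above practical in the meantime.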

Is your educational content openly licensed?

Some, but the content is really owned by users. To support open use, annotations posted to the public stream carry a Creative Commons Zero (CC0) public-domain dedication, while users retain copyright to annotations made in Hypothesis groups. We are considering how to enable users to make other choices (e.g., Creative Commons licenses) about how they want to publish annotations in groups.

Is it accessible to those with disabilities?

Not fully, but we are actively making improvements and have an ongoing process to support accessibility, following inclusive design principles.

Do you have a revenue strategy that involves something other than raising VC investment?

Yes. While Hypothesis has been supported so far by philanthropy, we are now engaged in specific work to generate sustainable funding for our work.

Does your product reduce the “achievement gap”?

Unsure. We have seen anecdotal evidence that annotation can improve teaching and learning, but we have not yet seen rigorous, quantitative evidence of annotation’s effect on achievement gaps. As annotation sees increased use in education, we will support investigations of its efficacy.