*Crossposted on ORCID as a part of Peer Review Week 2017.*

Editors, reviewers and scholars are recognizing the potential for open annotation to streamline and improve traditional forms of peer review and to create a framework for new practices. Transparency, the focus of this year's Peer Review Week, ultimately depends on design choices that communities make when agreeing how scholarship can benefit from collective review. Open annotation can dramatically improve the potential for transparency by enabling novel approaches to review that would otherwise be difficult or impossible to achieve.

Annotation can enhance all types of peer review, including single- and double-blind review, open review, and post-publication peer review. By connecting observations, questions and suggestions to text selections, annotations enable a precise, fine-grained collaborative conversation on top of documents that can persist across versions and even after publication. By annotating in groups, whether private groups for closed reviews or public-facing groups for open or post-publication reviews, colleagues and peers can provide many different kinds of feedback using different models, even on the same article.

Hypothesis is an organization dedicated to the development and spread of open, standards-based annotation technologies and practices that enable anyone to annotate anywhere. Publishers embed Hypothesis in their platforms to support pre-publication workflows like peer review and post-publication engagement with invited experts and general readers. Scientists and researchers use Hypothesis to engage with documents and their peers, organize research, and embed related resources on top of existing texts. Educators and students in K-12 and higher education annotate with Hypothesis to embed teaching and learning directly in digital content. Journalists use Hypothesis to connect and discuss documents in investigative research and to enrich coverage of other texts.

Hypothesis and ORCID have a longstanding collaboration to connect scholarly identifiers in publication workflows, documents, and annotations, establishing reliable mechanisms that support trust, attribution and transparency across all scholarship. Scholars can already add their ORCIDs to their Hypothesis profiles, and publishers and platforms that use ORCID for authentication can now provision their users with annotation capabilities automatically.

While manuscript submission systems have automated many parts of the traditional peer review process, until now the reviewer's work process has changed little. Critique is still delivered via an unwieldy long-form document that references page, paragraph, and line numbers, sending editors and authors on scavenger hunts to track down the related text. Further, as the revision process proceeds, those numbers can be rendered meaningless as the text changes. Inline annotation brings the critique directly over the relevant text and allows a fluid conversation to unfold with editorial guidance.

Open source annotation frameworks like Hypothesis allow submission systems to incorporate annotation directly into their existing web apps, even hosting the annotations locally if they prefer. APIs enable annotated reviews to flow into dashboards for editors, reviewers, and authors, respecting granular permissions that indicate who can (or should) see various types of feedback. Additional capabilities like custom tagging or filtering can be added easily. Authors and reviewers can see the annotations in context atop the documents themselves and via summary documents like decision letters. Hypothesis' deep linking also enables reviewers and authors to connect specific passages to additional resources across the web, augmenting the review process and the final manuscript.
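To make the embedding side concrete, here is a minimal sketch of how a submission system might configure the open source Hypothesis client on a manuscript page. The `window.hypothesisConfig` hook and the `services` settings come from the client's public configuration; the API URL, authority, and grant token shown are placeholders for a publisher's own (possibly locally hosted) setup, not a definitive integration.

```ts
// Minimal sketch: configuring the embedded Hypothesis client on a
// manuscript page. All service values below are hypothetical placeholders.

declare global {
  interface Window {
    hypothesisConfig?: () => Record<string, unknown>;
  }
}

window.hypothesisConfig = () => ({
  openSidebar: true, // open the annotation sidebar when the page loads
  showHighlights: true,
  services: [
    {
      // A publisher-hosted annotation service (placeholder URL).
      apiUrl: "https://annotations.journal.example.org/api/",
      authority: "journal.example.org",
      // Short-lived token minted by the publisher's own login system
      // (e.g. after ORCID sign-in), so reviewers are provisioned
      // automatically without a separate Hypothesis account.
      grantToken: "<JWT-for-current-user>",
    },
  ],
});

// The client itself is then loaded on the page with:
//   <script src="https://hypothes.is/embed.js" async></script>

export {};
```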
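And here is a hedged sketch of the dashboard side: pulling a manuscript's review annotations from the Hypothesis search API and grouping them by reviewer tag for an editor's summary view. The endpoint, query parameters, and response shape follow the public API at `api.hypothes.is`; the group ID, manuscript URL, and token are placeholder values.

```ts
// Minimal sketch: feeding an editor dashboard from the Hypothesis search
// API. Group ID, manuscript URL, and token are hypothetical placeholders.

const SEARCH_URL = "https://api.hypothes.is/api/search";

interface Annotation {
  id: string;
  user: string;    // e.g. "acct:reviewer1@hypothes.is"
  text: string;    // body of the reviewer's comment
  tags: string[];  // e.g. ["methods", "major-revision"]
  uri: string;     // document the annotation targets
  updated: string; // ISO timestamp
}

async function fetchReviewAnnotations(
  manuscriptUrl: string,
  groupId: string,
  token: string,
): Promise<Annotation[]> {
  const params = new URLSearchParams({
    uri: manuscriptUrl,
    group: groupId, // restrict results to the private review group
    limit: "200",
    sort: "updated",
    order: "desc",
  });
  const res = await fetch(`${SEARCH_URL}?${params}`, {
    headers: { Authorization: `Bearer ${token}` }, // needed for private groups
  });
  if (!res.ok) throw new Error(`Hypothesis API error: ${res.status}`);
  const body = (await res.json()) as { rows: Annotation[] };
  return body.rows;
}

// Example: group feedback by tag so an editor can scan it by topic.
async function summarizeByTag(): Promise<void> {
  const annotations = await fetchReviewAnnotations(
    "https://journal.example.org/manuscripts/12345", // placeholder URL
    "placeholder-group-id",
    "placeholder-api-token",
  );
  const byTag = new Map<string, Annotation[]>();
  for (const a of annotations) {
    for (const tag of a.tags.length > 0 ? a.tags : ["untagged"]) {
      if (!byTag.has(tag)) byTag.set(tag, []);
      byTag.get(tag)!.push(a);
    }
  }
  for (const [tag, notes] of byTag) {
    console.log(`${tag}: ${notes.length} annotation(s)`);
  }
}
```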
Post-publication or crowdsourced peer review is another iteration on traditional peer review, but even here a number of challenges remain. Readers may lack context without access to data or additional resources that traditional peer reviewers might receive. All comments may be lumped into a single bucket irrespective of reviewer expertise. Reader comments that live on blogs or on Twitter may not connect back to the original article, so other readers don't even know that such feedback exists. Further, post-publication reviewers might submit corrections or updates that also don't connect effectively back to the original documents. And with the way content is disseminated today across multiple platforms, readers may be contributing feedback on only one of the many copies live on the web.

Annotation technology can help remove these barriers. Through the use of in-line annotation, additional materials can be connected through deep linking by authors, journal editors, or post-publication reviewers. Readers can tie their feedback to particular parts of the article according to their expertise, making it easier for other readers to home in on those precise areas.