Today we reached a huge milestone at JOSS: we published our 1000th paper! JOSS is a developer-friendly, free-to-publish, open-access journal for research software packages. Publishing 1000 papers (and reviewing the corresponding 1000 software packages) over the past ~4 years has been no small feat. This achievement has been possible thanks to the efforts of our journal team and our community of reviewers, all of whom have given their time to make JOSS a success. We take this opportunity to review some of what we’ve learned over the past four years and to outline some plans for the future.
A brief recap
Much has been written on the topic of research software and the challenges individuals face in receiving credit for this work. Software is critical for modern research, yet people who invest time in writing high-quality tools often aren’t well rewarded for it. Conventional scholarly metrics of the “impact” of a researcher’s work do a poor job of capturing contributions such as software.
JOSS was created in response to some of these challenges to supporting software development in academia. Launched in May 2016, JOSS provides a simple, reliable process for receiving academic career credit (through citation of software papers) for writing open source research software. Authors write and submit a short article (usually under 1000 words) about their software, and JOSS reviews both the paper and the software itself (something hardly any other journals do), assessing qualities including functionality, (re)usability, and documentation.
In establishing JOSS, we wanted the editorial experience to be very different from a traditional journal, and developer-friendly: short papers authored in Markdown, a review process conducted on GitHub, and open processes and documentation. At the same time, we follow best practices in publishing: depositing first-class metadata and open citations with Crossref, archiving papers and reviews with Portico, leaving copyright for JOSS papers with their authors, and more.
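To illustrate what “short papers authored in Markdown” means in practice, a JOSS submission is typically a single Markdown file with a small YAML metadata header. The package, author, and affiliation below are invented for the example:

```markdown
---
title: 'ExampleTool: A Python toolkit for processing research data'
tags:
  - Python
  - data processing
authors:
  - name: Ada Example
    orcid: 0000-0000-0000-0000
    affiliation: 1
affiliations:
  - name: Example University
    index: 1
date: 13 May 2020
bibliography: paper.bib
---

# Summary

A few hundred words describing the software, the research
it enables, and how it compares to related work [@some-citation].
```

The accompanying `paper.bib` file holds the references, and the rest of the review focuses on the software repository itself rather than the paper.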
We describe the journal this way: JOSS is an open access (Diamond OA) journal for reviewing open source research software. With a heavy focus on automation, our open and collaborative peer review process is designed to improve the quality of the software submitted and happens in the open on GitHub.
Some lessons learned publishing our first 1000 papers
JOSS is meeting a need of the research community
One thing that was unclear when starting JOSS was whether demand from the research community would be sufficient. A few years in, we can safely conclude that JOSS is meeting a real need of the academic community.
It has taken us a little over four years to publish 1000 papers. Before pausing submissions for two months starting in early March 2020 (to give our volunteers some relief during the pandemic), we were projecting to reach this milestone sometime in June of this year. It took us a little under a year to publish our 100th paper, and a further eight months to reach our 200th. Over that time we’ve grown our editorial board from an initial group of 10 to more than 50 editors today.
People are reading and citing JOSS papers
Well over half of all JOSS papers have been cited, and many have been cited hundreds of times.
While not designed for this purpose, JOSS is proving a useful resource for people interested in discovering new research software: the journal website currently receives ~10,000 visitors per month and provides language-based (e.g., https://joss.theoj.org/papers/in/Python) and topic-based (e.g., https://joss.theoj.org/papers/tagged/Exoplanets) search filters and feeds.
People are key
Every journal relies upon the expertise, knowledge, and time of its reviewers, and JOSS is no different. 935 individuals have contributed reviews for our first 1000 papers, many of them reviewing multiple times.
Like many journals, as the number of submissions has grown, JOSS has had to scale its human processes. Over the past few years we’ve added many more editors and associate editors-in-chief so that papers can be handled efficiently by the editorial team. At the same time, we’ve developed our editorial robot, Whedon, from an occasional assistant during the review process into the backbone of the whole JOSS editorial process.
Automation is important
A big part of keeping our costs low is automating common editorial tasks wherever possible. The primary interface for editors managing JOSS submissions is a GitHub issue, with the assistance of our Whedon bot, which supports a broad collection of common editorial tasks. Apart from reviewers, editors, and authors reading the paper, no additional copy editing is done before a paper is published: the Pandoc-generated proofs are the final version. PDF proofs and Crossref metadata are generated automatically by Whedon and, when the time comes, deposited with Crossref and published automatically too.
When starting JOSS, we thought that automation could be a big part of how things would work if the journal became successful. 1000 papers in, we believe it has been an absolutely critical part of our operations. We call this chatops-driven publishing.
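The core idea behind chatops-driven publishing is that editorial commands are ordinary comments on the review issue, which a bot parses and routes to task handlers. The sketch below is not Whedon’s actual code; the bot name, commands, and handlers are hypothetical, and it only shows the dispatch pattern:

```python
# Minimal sketch of chatops-style command dispatch (hypothetical, not
# Whedon's real implementation): issue comments that match a registered
# pattern are routed to a handler; everything else is ordinary discussion.
import re

HANDLERS = {}

def command(pattern):
    """Register a handler function for comments matching `pattern`."""
    def register(fn):
        HANDLERS[re.compile(pattern)] = fn
        return fn
    return register

@command(r"@botname assign (\S+) as reviewer")
def assign_reviewer(match):
    # In a real bot this would update the review issue via the GitHub API.
    return f"Reviewer {match.group(1)} assigned."

@command(r"@botname generate pdf")
def generate_pdf(match):
    # In a real bot this would kick off a Pandoc compilation job.
    return "Compiling proof..."

def dispatch(comment_body):
    """Route an issue comment to the first matching handler, if any."""
    for pattern, handler in HANDLERS.items():
        m = pattern.match(comment_body.strip())
        if m:
            return handler(m)
    return None  # not a command; ignore ordinary discussion
```

For example, `dispatch("@botname assign @alice as reviewer")` returns a confirmation, while a normal review comment is simply ignored. Keeping every action behind a comment like this also leaves a public audit trail of the whole editorial process in the issue history.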
Low-cost open access publishing
JOSS is committed to providing a high-quality service to the community at no cost to authors or readers (Diamond/Platinum Open Access). We’re transparent about our operating costs and have written about cost models for operating an online open journal.
While JOSS’ operating costs are modest, we’ve benefited from the support of a number of organizations including NumFOCUS (the ‘Open Journals’ organization is a sponsored project of NumFOCUS), the Gordon and Betty Moore Foundation, and the Alfred P. Sloan Foundation. It’s also possible to donate to JOSS if you would like to support us financially.
To the future!
With 1000 papers published and a further ~170 papers under review, JOSS is busier than ever, and there’s little sign of demand from the community slowing.
Over the next year or so, we’re going to be investing resources in a number of key areas to enable JOSS to scale further, to improve the experience for all parties, and to help others reuse the infrastructure we’ve developed for JOSS. We’ve captured much of what we want to achieve in this public roadmap. All of this will be possible thanks to a new grant from the Alfred P. Sloan Foundation. Some highlights include:
Smarter reviewer assignment and management: Finding reviewers for a JOSS submission is still one of the most time-intensive aspects of the JOSS editorial process (though a large fraction of those we ask to review tend to accept, as they are excited by our mission). We think there’s lots of opportunity for substantially improving the success rate of finding potential reviewers through automation. Making sure we’re not overloading our best reviewers will also be an important aspect of this work.
A major refactor of our editorial bot Whedon: Whedon is a critical part of our infrastructure but has become hard to maintain, and almost impossible for other projects to reuse. We’re planning to rework Whedon into a general framework with a set of reusable modules for common editorial tasks.
Investments in open source: JOSS relies upon a small number of open source projects such as Pandoc and pandoc-citeproc to produce scholarly manuscripts (PDFs) and metadata outputs (e.g., Crossref and JATS). We’re going to work with the Pandoc core team to generalize some of the work we’ve done for JOSS into Pandoc core.
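As a rough illustration of the Pandoc toolchain mentioned above, a proof for a Markdown paper with citations can be produced with an invocation along these lines. The exact flags and templates JOSS uses differ (Whedon applies the journal’s own styling), so treat this as an assumed, minimal example:

```shell
# Compile a Markdown paper with a BibTeX bibliography into a PDF proof.
# paper.md and paper.bib are the author-supplied files; pandoc-citeproc
# resolves the [@citation-key] references against the bibliography.
pandoc paper.md \
  --filter pandoc-citeproc \
  --bibliography=paper.bib \
  -o paper.pdf
```

The same source can be rendered to other targets (e.g., JATS XML for metadata deposits) by changing the output format, which is what makes Pandoc such a good fit for single-source publishing.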
For many of us on the editorial team, JOSS is a labor of love, and it has been quite a ride growing JOSS from an experimental new journal into a venue publishing close to 500 papers per year. For those of you who have helped us on this journey by submitting a paper to JOSS or volunteering to review: thank you ⚡🚀💥.
The JOSS editorial team.