Minimum publishable unit

tl;dr – JOSS is introducing new submission criteria whereby submissions under 1000 lines of code will automatically be flagged as potentially out of scope, and those under 300 lines will be desk-rejected. This blog post describes some of the motivations behind this decision.

Sometime in 2020, JOSS will publish its 1000th paper – an incredible achievement by a volunteer team of editors and reviewers.

Since its inception a little over four years ago, the primary goal of JOSS has always been to provide credit for authors of research software. Quoting from the original blog post announcing JOSS:

The primary purpose of a JOSS paper is to enable citation credit to be given to authors of research software.

One challenge we’ve always struggled with as an editorial team is defining clear guidelines for submissions allowed (and not allowed) in JOSS. Our current submission criteria are available online and include language about software having to be a significant contribution, feature complete, and having an obvious research application. In these criteria we also explicitly exclude a category of software we generally call “minor utilities”.

The challenge of defining a unit of publication credit

We think of JOSS as essentially granting “1 publication credit” for each software package that is reviewed and published in the journal. In empirical research, a publication is often the result of years of work. In engineering research, a paper rarely represents less than one year of work. Other fields may vary, but let’s say that a scientific paper resulting from work measured in just months is rare or exceptional.

Since the earliest days of the journal, there has been a range of views within the editorial team on the level of effort we should require from authors for a submission to be allowed in JOSS – with some JOSS editors feeling that every useful piece of software should be considered, and others believing that the “bar” for publishing in JOSS should be higher than it currently is.

Building trust in JOSS

As an editorial team, we want JOSS papers to count the same as any other publication in the CV of researchers who write software. With career credit being the stated primary reason for starting JOSS, if this isn’t the case then the mission of the journal is at risk.

In reality this means that our editorial policy requires us to balance two competing needs:

  1. Providing an excellent service to authors by offering peer-review of their software and the opportunity to receive career credit for their work.
  2. Building the trust of an existing academic culture to accept a JOSS paper as equal to any peer-reviewed journal paper.

These two aspects are in tension with each other because while we would dearly love to publish any and all research software in JOSS regardless of size/scale and level of effort to implement, part of building and maintaining that trust with the community relies on us ensuring that our published authors can continue to expect JOSS papers to “count” in their future merit reviews and promotions.

Updates to our submission requirements and scope-checking procedures

Over the past couple of years, as the number of submissions to JOSS has grown, we’ve found that our existing submission criteria and our protocol for rejecting papers as out of scope have been taking up a significant fraction of our editorial team’s time. With a volunteer team of editors, it’s essential that we use their time carefully, and an update to our procedures for handling scope assessments is long overdue.

Going forward we’re going to adopt the following new process:

Automatically flagging small submissions

As part of the pre-review process, incoming submissions that are under 1000 lines of code (LOC)¹ will be automatically flagged as potentially out of scope by the EiC on rotation.

Submissions under 300 lines of code² will be desk rejected with no further review.
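The thresholds above can be illustrated with a small triage sketch. This is hypothetical: `triage_by_loc` is not part of JOSS’s actual tooling, and real submissions are judged by the EiC, not by a script.

```python
def triage_by_loc(loc):
    """Illustrative triage of a submission by lines of code (LOC).

    Mirrors the policy described above: under 300 LOC -> desk reject,
    under 1000 LOC -> flag for a scope review, otherwise proceed normally.
    """
    if loc < 300:
        return "desk_reject"
    if loc < 1000:
        return "flag_for_scope_review"
    return "proceed_to_pre_review"
```

Per footnote 1 below, these counts assume a high-level language such as Python or R; more verbose languages warrant proportionally higher thresholds.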

Mandatory “Statement of need” section in JOSS papers

While a “Statement of need” section has always been preferred, it will now be mandatory: a clear statement of need is extremely valuable in helping the editorial team understand the rationale for the development of the software.

Gauging the scholarly content of the software as part of the review

Reviewers will be asked if the software under review is a “substantial scholarly effort” and guidelines will be provided on how they can make that assessment.

A streamlined editorial review process

Rather than each paper that is potentially out-of-scope being discussed in a separate thread on our editorial mailing list, JOSS is going to move to a weekly review of such papers by our editorial team. Topic editors will be asked to review papers flagged as potentially out of scope in their area and help the EiC team make a decision.

Arfon M. Smith on behalf of the JOSS editorial team

  1. In a high-level language such as Python or R. More verbose languages such as Java, C++, Fortran etc. will require more LOC. 

  2. We realize that introducing numerical thresholds may encourage some authors to unnecessarily “pad” their submissions with additional lines of code to meet our thresholds. As reviewers are already asked to judge the standard of the implementation as part of their review we expect that situations like these will usually be flagged during the review. 

Reopening JOSS

On March 12th of this year, we suspended new submissions to JOSS in order to reduce the load on our volunteer team of editors and reviewers. We revisited this suspension in early April and decided to continue the pause on new submissions for at least another month. Over the past two weeks, we’ve been reviewing our current status and discussing what it might look like to reopen JOSS.

After collecting input from the whole JOSS editorial team about their respective availabilities and discussing what to do next, we’ve decided to reopen JOSS in a “reduced service mode” from May 20th, 2020.

Reduced service mode

So what does a “reduced service mode” mean exactly? Essentially this is about adjusting our editorial processes to give authors, editors, and reviewers more time to carry out their activities during the JOSS review process.

We expect that JOSS will be operating in this mode for the foreseeable future with the following new editorial processes:

Revised (extended) review time: Before the suspension, we encouraged reviewers to complete their reviews in two weeks. In this reduced service mode, we’re asking that reviewers try to complete their reviews in six weeks. Reviewers are of course welcome to complete their reviews faster than this and given the iterative nature of JOSS reviews we still encourage everyone to start their review as soon as possible.

Reduced paper throughput: As we reopen JOSS, we plan to limit the number of incoming submissions that will be assigned to a handling editor (and therefore go out for review). In these first few weeks we’re planning on a limit of five papers per week, with papers after this held in a first come, first served waitlist pre-review issue on GitHub.

Reassigning submissions from unavailable editors: A small number of our editors are currently completely unable to edit due to other commitments. As we reopen JOSS, their papers will be reassigned to a new handling editor.

Growing our editorial team: Back in February, we issued a call for new editors which received an excellent response. Over the coming weeks we will begin onboarding new editors to help grow the capacity of JOSS.

What this means for JOSS

During the pause on submissions, we’ve managed to publish 37 papers from the existing backlog, which is an amazing feat in itself 💖. Since instigating the pause, however, activity across all of JOSS has been markedly reduced.

JOSS relies upon editors and reviewers volunteering their time to help authors improve their research software in an iterative, open review process. For many of us, JOSS is a labor of love, and an opportunity to give back to the communities we work in.

While we can never know exactly how the COVID-19 pandemic is affecting every JOSS author, editor, and reviewer, we hope that with these changes to our editorial processes we have found the right balance between accommodating the very real challenges many of us now face in our daily lives, and providing a service to the research software community.

Arfon M. Smith on behalf of the JOSS editorial team

Call for editors

Once again we’re looking to grow our editorial team at JOSS. We’re especially interested in recruiting editors with expertise in bioinformatics, material science, physics, R/statistics, and the social sciences.

Since our launch in May 2016, our existing editorial team has handled over 800 submissions (830 published at the time of writing, 119 under review) and the demand from the community continues to grow. The last three months have been our busiest yet, with JOSS publishing a little over one paper per day, and we see no sign of this demand dropping.

New editors at JOSS are asked to make a minimum 1-year commitment, with additional years possible by mutual consent. As some of our existing editorial team are reaching the end of their term with JOSS, the time is right to bring on another cohort of editors.

Background on JOSS

If you think you might be interested, take a look at our editorial guide, which describes the editorial workflow at JOSS, and also some of the reviews for recently accepted papers. Between these two, you should be able to get a good overview of what editing for JOSS looks like.

Further background about JOSS can be found in our PeerJ CS paper, which summarizes our first year, and our Editor-in-Chief’s original blog post, which announced the journal and describes some of our core motivations for starting the journal.

More recently we’ve also written in detail about the costs related to running JOSS and scaling our editorial processes, and talked about the collaborative peer review that JOSS promotes.

How to apply

Firstly, we especially welcome applications from prospective editors who will contribute to the diversity (ethnic, gender, disciplinary, and geographical) of our board.

✨✨✨ If you’re interested in applying please fill in this short form by 4th March 2020. ✨✨✨

Who can apply

We welcome applications from potential editors with significant experience in one or more of the following areas: open source software, open science, software engineering, and peer-review, noting again that editors with expertise in bioinformatics, material science, physics, R/statistics, and the social sciences are most needed.

The JOSS editorial team has a diverse background and there is no requirement for JOSS editors to be working in academia. Unfortunately individuals enrolled in a PhD program are not eligible to serve on the JOSS editorial team.

Selection process

The JOSS editorial team will review your applications and make their recommendations. Highly-ranked candidates will then have a short (~30 minute) phone call/video conference interview with the editor(s)-in-chief. Successful candidates will then join the JOSS editorial team for a probationary period of 3 months before becoming full members of the editorial team. You will get an onboarding “buddy” from the experienced editors to help you out during that time.

Lorena A. Barba, Daniel S. Katz, Kevin M. Moerman, Kyle E. Niemeyer, Kristen Thyng, Arfon M. Smith

New editors in 2019

2019 has been a big year for JOSS: This year, we’ve already published 300 papers and are on target to reach ~360 papers by the end of the calendar year. We’ve substantially improved our website and made a number of important changes to our editorial team to help us scale further.

In the last 12 months we’ve issued two calls for new editors (one in December 2018 and one in August 2019), but we’ve not done a very good job of announcing when new editors join the team, so let’s fix that…

Two new associate editors in chief

In the last few months, we’ve added two associate-editors-in-chief with Kevin M. Moerman and Kristen Thyng stepping into these new roles ⚡.

New editors joining the JOSS team

Twenty-one(!) new editors have joined the JOSS editorial team thus far in 2019. Ordered by the date they joined, they are:

Viviane Pons: Mathematics, Computer Science
Associate professor at Université Paris-Sud. Mathematician, computer scientist and strong defender of open-source and open science in general. Contributor and user of the SageMath software. Member of the OpenDreamKit European project for open-source development in Mathematics.

Jack Poulson: Numerical optimization, numerical linear algebra, PDEs, high-performance computing, lattice reduction
Independent computational scientist running Tech Inquiry and Hodge Star Scientific Computing. Previously, research scientist at Google and assistant professor of mathematics at Stanford. His research interests: software engineering of high-performance mathematical libraries (e.g., conic optimization, lattice reduction, determinantal point processes, numerical PDEs), their connections to pure mathematics (e.g., differential geometry, conic analysis, representation theory).

George K. Thiruvathukal: HPC, software engineering, programming languages, systems, computational science, digital humanities
Professor of computer science at Loyola University, Chicago, and visiting faculty at the Argonne National Laboratory Leadership Computing Facility. Research interests: high-performance & distributed computing, cyber-physical systems, software engineering, programming languages and systems, history of computing, computational and data science, computing education, and ethical/legal/social issues in computing. Past editor-in-chief of IEEE Computing in Science and Engineering.

Lorena Pantano: Small RNAseq, RNAseq, miRNA, isomiRs, visualization, genomics, transcriptomic, non-codingRNA, data integration
Research scientist at Harvard T.H. Chan School of Public Health. Focused on genomic regulation and data integration, with more than a decade of experience in biological data analysis and contributing to novel algorithms to improve the quantification and visualization of genomic data.

Juanjo Bazán: Astrophysics, Mathematics
Astrophysics researcher at CIEMAT, mathematician and software engineer currently developing chemical evolution models for galaxies. Juanjo has worked as an advisor on open source policies and contributed code to many popular libraries like Rails and Astropy. He is a member of the founding team of Consul, the most widely used open-source citizen participation software.

Bruce E. Wilson: Ecology, remote sensing, information sciences, material sciences
Manager for the Oak Ridge National Laboratory Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC) and Adjunct Professor of Information Sciences at the University of Tennessee, Knoxville. Originally trained as a chemist and statistician. Spent a few years as an Enterprise Architect. Research interests in citations, linked data, reproducible science, identity, cybersecurity, data reuse, and long-term data preservation.

Leonardo Uieda: Geoscience, Geophysics, Data Visualization
Geophysicist researching methods for determining the inner structure of the Earth from geophysical observations, mainly disturbances in the Earth’s gravity and magnetic fields. Developer of open-source software for processing, modeling, and visualizing geophysical data. Currently Visiting Research Scholar at the University of Hawai’i at Mānoa working on Generic Mapping Tools.

Alex Hanna: Social sciences, politics
Alex Hanna is a computational social scientist working on machine learning curriculum at Google. She received her PhD in sociology from the University of Wisconsin-Madison. Her research has focused on how new and social media has changed social movement mobilization and political participation. More recently, she has been interested in issues of fairness, accountability, and transparency in sociotechnical systems.

Charlotte Soneson: Bioinformatics, data visualization, transcriptomics, reproducible research
Research Associate at the Friedrich Miescher Institute for Biomedical Research in Basel, Switzerland, with a research background mainly in development and evaluation of analysis methods for transcriptomics data. Developer and maintainer of several open-source R packages for analysis, quality assessment and interactive visualization of high-throughput biological data.

Monica Bobra: Heliophysics, Data Science
Research scientist at Stanford University in the W. W. Hansen Experimental Physics Laboratory who studies the Sun and space weather as a member of the NASA Solar Dynamics Observatory science team and contributes to Heliopython and SunPy.

Yuan Tang: Machine Learning, Distributed Systems, Cloud Computing
Senior software engineer at Ant Financial, building AI infrastructure and AutoML platform. He’s a committer of TensorFlow, XGBoost, and Apache MXNet, maintainer of several Kubeflow projects, and author of numerous open source software projects. He’s also the author of the best-selling book TensorFlow in Practice, which is the first book teaching TensorFlow in Chinese and has been translated to several other languages such as traditional Chinese and Korean.

Olivia Guest: Computational cognitive modeling, cognitive science, cognitive neuroscience
Olivia Guest is a computational modeler in cognitive science and neuroscience. She creates and evaluates computational accounts for categorization and conceptual representation in healthy adults, patient groups, infants, and animals. She is also interested in using computational modeling and data science broadly in theoretical as well as applied contexts.

Marie E. Rognes: Applied mathematics, PDEs, numerical methods, biomechanics
Marie E. Rognes is Chief Research Scientist at Simula Research Laboratory, Oslo, Norway. Her research focuses on numerical methods for partial differential equations, software for scientific computing, with applications in biomechanics and neuroscience. She is a core member of the FEniCS and Dolfin-adjoint Projects.

Vincent Knight: Mathematics, Applied mathematics, Game Theory, Stochastic processes, Pedagogy, Python
Vince is a mathematician at Cardiff University. He is a maintainer and contributor to a number of open source software packages and a contributor to the UK Python community. His research interests are in the field of game theory and stochastic processes and he also has a keen interest in pedagogy. Vince is a fellow of the Software Sustainability Institute and is interested in reproducibility and sustainability of scientific/mathematical research.

Matthew Sottile: Computer science, programming languages, applied mathematics
Research scientist working on program synthesis tools for high performance computing, and adjunct faculty at the Washington State University Department of Mathematics and Statistics. His PhD work in computer engineering focused on measurement and analysis methods for performance analysis of high performance systems. He believes that open and reproducible research software artifacts are critical to the advancement of computational science.

Mark A. Jensen: Bioinformatics and Computational Biology, Cancer Genomics, Population and Evolutionary Biology, Databases and Data Architecture, Software Development Lifecycle Management
Director of Data Management and Interoperability at the Frederick National Laboratory for Cancer Research, he leads efforts to design, build, and maintain scientist-friendly research data systems that integrate clinical and multiomic data across thousands of cancer patient-donors. He is active in open source software development and a supporter of the FAIR (Findable, Accessible, Interoperable, Reusable) movement in scientific data management. A molecular evolutionary biologist by training, he served as an Associate Editor for the Journal of Molecular Evolution from 2008-2013.

Melissa Weber Mendonça: Applied Math, Numerical Optimization, Numerical Linear Algebra, Fortran, Python, MATLAB, Julia, Teaching
Applied mathematician, working with numerical optimization and numerical linear algebra at the Federal University of Santa Catarina, in Brazil. Interested in open science, open research, and teaching practices in mathematics and computer science.

Katy Barnhart: Geology, Geophysics, Earth Surface Dynamics, Terrain Analysis
Research scientist in Earth surface dynamics at the University of Colorado Department of Geological Sciences and Cooperative Institute for Research in Environmental Sciences. Her research focuses on integrating observations, model development, and model analysis to understand the evolution of the Earth’s surface. She is a core developer of the Landlab toolkit.

William Rowe: Bioinformatics, genomics
Bioinformatician working at the University of Birmingham, UK. Research areas include data sketching for genomics, variation graphs, and metagenome profiling. Currently working on long-read sequencing applications for real-time genomic epidemiology.

Dan Foreman-Mackey: Astrophysics, probabilistic programming, data science, python
Dan Foreman-Mackey is an Associate Research Scientist at the Flatiron Institute in the Center for Computational Astrophysics. His research program focuses on the development and application of probabilistic data analysis techniques to make novel discoveries and solve fundamental problems in astrophysics.

Marcos Vital: R, biostatistics, data visualization, open science, quantitative ecology, evolutionary biology, conservation biology, teaching
Biologist, quantitative ecologist and open science enthusiast at Universidade Federal de Alagoas, Brazil, where he coordinates the Quantitative Ecology Lab. He is interested in a wide range of topics related to evolutionary ecology and conservation biology, but also in other fields such as active learning methods and digital games in education.

Editors retiring from the team

In 2019, a few of our editors have stepped down from editorial duties at JOSS: Jason Clark, Pjotr Prins, and Yo Yehudi have already retired from the JOSS editorial team and Roman Valls Guimera, Melissa Gymrek, and Lindsey Heagy will be retiring from the team soon. JOSS only works because of the volunteer efforts from our editors and reviewers and we’re deeply indebted to them 💖.

We thank Jason, Pj, Yo, Roman, Melissa, and Lindsey for all of their contributions to JOSS!

Arfon Smith, Editor-in-Chief, Journal of Open Source Software.

Call for editors

Following our call for editors late last year, once again we’re looking to grow our editorial team.

Over the past ~40 months, our existing editorial team has handled close to 800 submissions (665 accepted at the time of writing, 101 under review) and the demand from the community continues to grow. The last three months have been our busiest yet, with JOSS publishing a little over one paper every day, and we see no sign of this demand dropping.

In order for JOSS to continue to thrive, we think the time is right to bring on another cohort of editors.

Background on JOSS

If you think you might be interested, take a look at our editorial guide which describes the editorial workflow at JOSS and also some of the reviews for recently accepted papers. Between these two, you should be able to get a good overview of what editing for JOSS looks like.

Further background about JOSS can be found in our PeerJ paper, which summarizes our first year, and our Editor-in-Chief’s original blog post, which announced the journal and describes some of the core motivations for starting it.

More recently we’ve also written in detail about the costs related to running JOSS and scaling our editorial processes.

How to apply

Firstly, we especially welcome applications from prospective editors who will contribute to the diversity (ethnic, gender, and geographical) of our board.

We’re especially interested in growing our editorial team in the following subject areas although all are welcome to apply:

  • Applied mathematics
  • Bioinformatics
  • Computational chemistry
  • Ecology
  • Geoscience/earth sciences
  • Material science
  • R/stats

✨✨✨ If you’re interested in applying please fill in this short form by 4th September 2019. ✨✨✨

Selection process

The JOSS editorial team will review your applications and make their recommendations. Candidates ranking highly will then have a short (~30 minute) phone call/video conference interview with the editor(s)-in-chief. Successful candidates will then join the JOSS editorial team for a probationary period of 3 months before becoming full members of the editorial team. You will get an onboarding “buddy” from the experienced editors to help you out during that time.

Lorena A. Barba, Daniel S. Katz, Kyle E. Niemeyer, Arfon M. Smith

Scaling the Journal of Open Source Software (JOSS)

The Journal of Open Source Software (JOSS) started with one editor-in-chief (EiC) and 10 topic editors, and in our first year, we published about 100 papers. Three years later, the rate of publication has increased to 300 papers per year. How did we scale up to this point, and how can we continue scaling the journal?

Scaling to-date

Today, we have one editor-in-chief (EiC), three associate editors-in-chief (AEiCs), and 23 topic editors. We recognize past service of editors that step down, for any reason, as emeritus editors on our website (currently, six). The journal is currently receiving about one new submission per day, a 3.5x increase in 2 years.

What scales and what doesn’t – or – what we did to scale to where we are now:

Assigning papers to editors: when a paper is submitted, it enters the “incoming” queue, from which an AEiC moves it to a Pre-Review issue and assigns it to a handling editor. We scaled this task, initially handled by the EiC, to the EiC and 3 AEiCs who rotate on-duty responsibilities weekly. The overall per-person workload for this task is thus manageable.
Assigning papers to reviewers: after a Pre-Review issue starts, the assigned handling editor has the task of finding and assigning reviewers to the paper. This task can scale up by 1) recruiting additional editors so that each editor has a manageable workload, as long as 2) there are sufficient appropriate reviewers available and 3) the editor can find them. JOSS currently assigns a minimum of two reviewers/paper. As we receive more submissions each year, more reviewers are needed. Although we can ask authors of accepted papers to review new submissions, with many more active papers simultaneously, we will need better tracking of reviews contributed, to avoid overburdening some reviewers, and we may need a better way to find appropriate reviewers than the keyword searches of existing volunteer reviewers and the personal knowledge and search skills of the editors that we currently use.
Overseeing active reviews: editors and AEiCs maintain a close eye on active reviews to keep things moving along, moderate conversations, and clarify the appropriate process when needed. This currently scales well for handling editors, who have a dashboard showing the state of their assigned papers with key activity stats. The AEiC on rotation is aided by the “In Progress Papers” dashboard, with a manageable workload to keep reviews from becoming stuck; this scales acceptably at this point with AEiC weekly rotations.
Accepting and publishing papers: the one AEiC on duty each week completes the final tasks needed after reviewers and handling editor recommend acceptance. Our current rotation system avoids excessive long-term workload on any one person, but may not scale to a much larger number of papers accepted each week, as part of the AEiC’s job is final proofreading and detail checking, which can be time consuming.
Editorial robot: Much of the editorial process is assisted by our editorial bot ‘Whedon’ which carries out a number of automated processes on behalf of editors (see this post for more details). These automated steps include preparing the final version of a publication, checking references for missing or malformed DOIs, adding the software release version and archival DOI to the metadata, building an XML file for Crossref metadata, and executing the Crossref DOI registration. All of these steps are possible with simple commands issued to Whedon in the review thread on GitHub. For example, typing the command @whedon accept in a comment on the Review issue will automatically compile the paper PDF from the author’s source in markdown, check the references, and create the Crossref XML, providing links to these files for the AEiC’s final visual check.
Overall process overview: We built a JOSS editor dashboard to 1) help the AEiCs balance editor loads, review unassigned submissions, and understand the status of all in-progress reviews. The dashboard also 2) helps editors keep track of all current submissions (in addition to an automated weekly email reminding editors of current assignments).
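The `@whedon accept` workflow described above follows a simple command-dispatch pattern: the bot only reacts to comments addressed to it, and maps the remaining words to an action. The sketch below is an illustration of that pattern only, not Whedon’s actual implementation; the handler table and its entries are hypothetical.

```python
def parse_command(comment, bot_name="whedon"):
    """Return the command words if the comment addresses the bot, else None."""
    words = comment.strip().split()
    if words and words[0] == f"@{bot_name}":
        return words[1:]
    return None

# Hypothetical handler table; real Whedon supports many more commands.
HANDLERS = {
    ("accept",): "compile PDF, check references, build Crossref XML",
    ("generate", "pdf"): "compile the paper PDF from markdown",
}

def dispatch(comment):
    """Look up and describe the action for a bot-addressed comment."""
    cmd = parse_command(comment)
    if cmd is None:
        return None  # ordinary review discussion, ignored by the bot
    return HANDLERS.get(tuple(cmd), "unknown command")
```

The appeal of this design is that the review thread itself is the interface: editors never leave GitHub, and every bot action is logged in the open alongside the review.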

Future scaling

If the number of submissions increased by another factor of 3 or even 10, we would need to make additional changes to handle the increase. Some of these could be:

Increase the number of AEiCs

If we do this, we might also need to change the schedule we now have. Our idea of rotating a single AEiC on duty at any one time likely wouldn’t work for assigning papers to editors; we might need multiple AEiCs on duty at a time, with papers assigned to AEiCs in either a round-robin or random manner.

A decreased rotation time might work for accepting and publishing papers, but round-robin or random assignment would also work.

Alternatively, each paper that comes in might be assigned to an AEiC for all work on the paper, from assigning an editor through to acceptance and publication, though this would remove the workload pauses that AEiCs currently have during their off-duty periods.
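The round-robin option above could be sketched as follows (a hypothetical illustration; the AEiC names are placeholders):

```python
from itertools import cycle

def make_round_robin_assigner(aeics):
    """Return a function that assigns each incoming paper to the next AEiC in turn."""
    rotation = cycle(aeics)
    return lambda paper: (paper, next(rotation))

assign = make_round_robin_assigner(["AEiC-1", "AEiC-2", "AEiC-3"])
assignments = [assign(f"paper-{i}") for i in range(4)]
# the fourth paper wraps back around to AEiC-1
```

Round-robin spreads load evenly but, unlike the current weekly rotation, gives no AEiC a fully off-duty week.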

Further increase number of editors

This should scale reasonably well.

Developing additional tooling for handling the editorial process

As both the number of AEiCs and editors increase, additional tooling likely will be needed, including:

  • Semi-automated tools for authors to suggest reviewers from a pool of volunteers and previous JOSS authors, and for editors to choose reviewers from these same sources as well as from the community outside JOSS
  • Semi-automated tools for assignment of papers to editors
  • Semi-automated checking of status for both editors and AEiCs
  • Reliable statistics on reviewer contributions to avoid reviewer fatigue

Better support

We could off-load some of the AEiC work (e.g., helping authors who have problems with formatting papers or other issues) either to a paid staff member, or more likely, we could set up support mechanisms for authors to be helped by other authors or editors, perhaps by Stack Overflow or something similar. Given that we are working in an open source environment, this would be a way of getting some members of the community to take on a little more, which might later lead to some becoming editors.

An alternative to scaling JOSS itself would be to scale up by starting more JOSS-like journals, even using the same infrastructure, perhaps by discipline, by programming language, etc. (Note to commercial publishers: We would still make these open access and free to publish; we don’t mean that you should start JOSS-like journals and charge APCs.)

Daniel S. Katz, Lorena A. Barba, Kyle E. Niemeyer, Arfon M. Smith

Cost models for running an online open journal

The Journal of Open Source Software (JOSS) is a free, open-access online journal, with no article processing charge (APC). We are committed to operating as a free service to our community, and we do so thanks to the volunteer labor of editors and reviewers, and by taking advantage of existing infrastructure. In this post, we examine the true costs of running a journal such as JOSS, and make the case that even when considering all the services we don’t currently pay for, the true cost per paper would not exceed $100. Current APCs at many “gold” open-access journals exceed that by one or more orders of magnitude (see, for example, PNAS, Nature, IEEE, etc.).

Real costs we *have* to pay

  • Annual Crossref membership: $275/year
  • JOSS paper DOIs: $1/accepted paper
  • JOSS website hosting: $19/month
  • JOSS domain name registration: $10/year

At 300 papers/year, this is $813, or $2.71/paper. This is what we actually pay today, covered by a grant from the Alfred P. Sloan Foundation.

JOSS is a fiscally sponsored project of NumFOCUS, through which we can receive donations. We remind authors about the opportunity to donate as part of the message announcing the acceptance of their papers. Over about 3 years of operation, we have received $285 in donations from a small number of individuals. (If you wish to donate now, use the NumFOCUS secure form!)

Services for which we don’t currently pay

The one-time cost for the initial development of Whedon (the bot that manages reviews in an issue tracker) and the JOSS web application would have been an estimated $50,000 (0.25 FTE). This was covered by a combination of volunteer effort by the JOSS EiC (Arfon Smith) and a small grant from the Sloan Foundation. Amortized over the first 1,000 papers, this would add $50/paper during that period, but that is also the time when the journal needs to keep costs down to encourage submissions. Let’s instead imagine we have to redesign and reimplement our system from scratch every 10 years, and add $5,000 per year for this.

In addition, we require some ongoing development of the existing system every year (adding features, fixing bugs, etc.), at an estimated cost of $5,000/year. This is now mostly done by volunteers, but also partly supported by the Sloan Foundation grant.

We currently host JOSS papers and run reviews on GitHub, which we don’t pay for. If we had to pay for this, or we wanted to run our own system for some reason, we could use GitLab for issue tracking instead, at $50/month.

Another aspect is running the organization that publishes the journal. Currently, we depend on NumFOCUS as our fiscal sponsor to provide us with the ability to accept funds such as the Sloan grant and donations, to pay for needed items and services, and to provide some financial management/accounting services. This is paid for by the overhead on the Sloan Foundation grant, which is handled by NumFOCUS. If we were to pay for these services directly, they might cost $10,000/year.

Summarizing these additional potential costs:

  • Reimplementation of software infrastructure every 10 years: $5,000/year
  • Ongoing software infrastructure development and maintenance: $5,000/year
  • GitLab instance: $50/month
  • Financial services: $10,000/year

Additional volunteer contributions

We depend on a set of volunteers to run JOSS. These include:

  • 1 Editor-in-chief working 4 hrs/week (0.1 FTE)
  • 4 Associate Editors-in-Chief (including the EiC), working a total of 10 hours/week, where the 4 rotate, so only one is on duty in any given week (0.25 FTE)
  • 27 Editors (including the AEiCs), working a few hours per week (~1.5 FTE)
  • 651 reviewers (as of 30 May 2019), working when assigned a paper

JOSS does not pay any of these people. Nor are these roles in other scholarly journals paid positions (with rare exceptions), whether open access or not, whether the publishers are for-profit or not. We are aware that some publishers/journals pay stipends to the editor-in-chief (ranging from a few thousand dollars to, in some cases, ten to twenty thousand), and a couple of publishers/journals have in-house salaried editors.

We rely on authors to provide “camera-ready” articles, though we help them generate these via an automated system that accepts Markdown text and author-supplied images. We do not work directly with authors to improve their article design/layout. Editors and reviewers do suggest changes to wording as part of the review process, but we do not provide or apply professional copy-editing, as a few journals do. Given that the average journal does not provide more author services than we do, and many provide less (for example, little language or grammar help from editors), we will not consider any costs associated with these services.

We do not have any marketing costs, other than giving out some stickers from time to time, at minimal cost. While we recognize that some large publishers regularly attend conferences where they have booths and have other marketing costs, we will not consider any costs associated with marketing since we don’t really do any.

Here are all the costs associated with running JOSS, assuming 300 papers/year:

  • Costs we now pay: $813/year
  • Services for which we don’t currently pay: estimated at $20,600/year
  • Volunteer editor services: 1.85 FTE of effort, or $370,000/year (at $200,000/FTE)
  • Reviewer time (not paid for in almost all journals, nor costed by JOSS)

Valuing all of this work would put the cost per paper at about $1,300 (excluding reviewer effort). But given current practices regarding editor compensation, if we instead include just $10,000 as an editor stipend (likely on the high side of today’s practices), we obtain a total annual operating cost of $31,413, requiring an article processing charge (APC) of about $100 per paper.

If we were a for-profit organization, we would also add a profit margin. While 10% is considered a good profit margin in many industries, 30-35% is more common in scholarly publishing. This would still only lead to an APC of about $140/paper.
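The arithmetic above can be checked with a quick back-of-the-envelope script. This is only a sketch of the cost model described in this post, using the figures quoted above (300 papers/year, $200,000/FTE); the variable names are ours:

```python
# Back-of-the-envelope JOSS cost model, using the figures quoted in this post.
PAPERS_PER_YEAR = 300

# Costs we actually pay today: Crossref membership, DOIs, hosting, domain
direct = 275 + 1 * PAPERS_PER_YEAR + 19 * 12 + 10
assert direct == 813  # matches "$813, or $2.71/paper"

# Services we don't currently pay for: reimplementation, maintenance,
# a GitLab instance, and financial services
unpaid_services = 5_000 + 5_000 + 50 * 12 + 10_000
assert unpaid_services == 20_600

# Valuing all volunteer editorial labor at market rates
editor_fte_value = 1.85 * 200_000  # $370,000/year

full_valuation = (direct + unpaid_services + editor_fte_value) / PAPERS_PER_YEAR
print(f"Per-paper cost, labor at market rates: ${full_valuation:,.0f}")  # about $1,300

# More realistic: a $10,000 editor stipend instead of full FTE valuation
operating_cost = direct + unpaid_services + 10_000
apc = operating_cost / PAPERS_PER_YEAR
print(f"Annual operating cost: ${operating_cost:,}")   # $31,413
print(f"Implied APC: ${apc:,.0f}/paper")               # about $100

# With a 35% publisher-style margin added
print(f"APC with 35% margin: ${apc * 1.35:,.0f}/paper")  # about $140
```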

Existing publishers and professional societies might consider adopting an online and open infrastructure like JOSS’s to reduce their costs. In any event, JOSS will continue to be openly accessible and free to publish.

Daniel S. Katz, Lorena A. Barba, Kyle E. Niemeyer, Arfon M. Smith

A special thanks to our most dedicated reviewers

JOSS is an adventure in next generation publishing, made possible by the volunteer work of many people. Our editors, of course, guide the style and the content of the journal. And our reviewers make a uniquely valuable contribution, both to the software they’ve reviewed and to the broader community of open-source research software. Some reviewers have been extra generous in contributing. Today, we want to say thank you to all our reviewers, but especially our most prolific ones.

Two reviewers take the top spot for the number of reviews they have contributed to JOSS: Bryce Mecum and Luiz Irber, who have each completed nine JOSS reviews!

Bryce (@amoeba on GitHub; @brycem on Twitter) is a scientific software engineer at the National Center for Ecological Analysis and Synthesis (NCEAS). He works on linked open data, semantics, ontologies, science metadata, and reproducible research. He has a B.S. in Biology and an M.S. in Fisheries, he dotes on his dog, and he lives in Juneau, Alaska. Thank you, Bryce!

Luiz (@luizirber on GitHub; @luizirber on Twitter) is a PhD student in computer science at UC Davis. He works at the Lab for Data Intensive Biology with C. Titus Brown, focusing on sketches, streaming, and online approaches for biological data analysis. Luiz is from Brazil, where he worked for more than three years at the National Institute for Space Research, developing tools for a coupled general circulation model. Obrigado, Luiz!

Both Bryce and Luiz are hereby named Top JOSS Reviewers for our first three years of existence. They are being rewarded with a cozy and geeky JOSS hoodie.

We also would like to acknowledge with an honorable mention the following JOSS reviewers, each of whom has contributed 5 or 6 software reviews: Kristian Rother (@krother), Maurizio Tomasi (@ziotom78), Philipp S. Sommer (@Chilipp), Kevin Mattheus Moerman (@Kevin-Mattheus-Moerman), and Nicolás Guarín-Zapata (@nicoguaro).

On a not-so-celebratory note, we have to acknowledge the lack of diversity on this list: not a single woman is among the group of highlighted reviewers. We understand that women and members of minority groups are asked to take on an outsized workload when it comes to contributing to diversity in technology. But we nevertheless want to extend an invitation to everyone to join our reviewer team, and let us know how we can support you. We will be happy to connect you with an experienced reviewer who can be your onboarding “buddy.”

Lorena A Barba, Associate Editor-in-Chief, Journal of Open Source Software

Editor’s note: JOSS recently had its third birthday. In the first three years we’ve published 557 papers (now 570 at the time of writing). I’m hugely grateful to all of our volunteer editors and reviewers for making this experiment in low-cost community-run journals possible! – Arfon Smith

Call for editors

JOSS is expanding its editorial board, and we’re opening this opportunity to the open source research software community at large. If you think you might be interested, take a look at our editorial guide, which describes the editorial workflow at JOSS, and at some of the reviews for recently accepted papers. Between these two, you should get a good overview of what editing for JOSS looks like.

Further background about JOSS can be found in our PeerJ paper, which summarizes our first year, and in my original blog post announcing the journal, which describes the core motivations for starting it.

Over the past ~30 months, our existing editorial team has handled over 500 submissions to JOSS (448 accepted at the time of writing, 84 under review).

How to apply

We especially welcome applications from prospective editors who will contribute to the diversity of our board.

If you’re interested in applying, please email me with the words "JOSS editor application" in the subject line and include:

  • A short statement of interest
  • Your specialist subject domains/research topics
  • Links to any past JOSS reviews you’ve carried out (not required)
  • A summary of your experience with open source software including any links to projects on e.g. GitHub

✨✨✨ Please submit your application by 18 January 2019. ✨✨✨

Selection process

The JOSS editorial team will review the applications and make their recommendations. Candidates ranking highly will then have a short (~30 minute) phone or video interview with the editor-in-chief(s). Successful candidates will join the JOSS editorial team for a probationary period of 3 months before becoming full members.

Changes to the JOSS editorial board

It’s been a busy couple of years and so we’re making a few changes to our editorial team to help us scale our editorial process.

Introducing our three new associate editors in chief

Lorena A Barba
Associate Professor of Mechanical and Aerospace Engineering at the George Washington University, leading a research group in computational fluid dynamics, computational physics and high-performance computing. Member of the Board for NumFOCUS, a non-profit in support of open-source scientific software.

Daniel S. Katz
Works on computer, computational, and data research at NCSA, CS, ECE, and the iSchool at the University of Illinois at Urbana-Champaign, and has a strong interest in studying common elements of how research is done by people using software and data.

Kyle Niemeyer
Mechanical engineer in the School of Mechanical, Industrial, and Manufacturing Engineering at Oregon State University. Computational researcher in combustion, fluid dynamics, and chemical kinetics, with an interest in numerical methods and GPU computing strategies.

Introducing our editors emeritus

A few of our editors are stepping down from the day-to-day editorial duties at JOSS. Abigail Cabunoc Mayes, Tracy Teal, and Jake Vanderplas were amongst the earliest members of our editorial team and have been a huge positive influence on JOSS: their input and guidance on the journal as we scoped it in the early days was invaluable. Thomas Leeper and George Githinji joined us more recently and between them edited nearly 40 JOSS submissions.

We thank Abby, George, Tracy, Jake, and Thomas for all their contributions to JOSS!

Arfon Smith, Editor-in-Chief, Journal of Open Source Software.

A new collaboration with AAS publishing

Today we’re starting something new at JOSS: we’re collaborating with American Astronomical Society (AAS) Journals to offer software reviews for some of the papers submitted to their journals. As part of this process, AAS Publishing will make a small contribution to our parent organization NumFOCUS to support the running costs of JOSS. We’re excited about raising the standard of research software in astronomy and astrophysics, and want to use this blog post as an opportunity to give a little background on the collaboration and how we plan to operate.

Ever since I announced JOSS back in May of 2016, I’ve always been clear that the primary purpose of a JOSS paper is to enable citation credit to be given to authors of research software. Raising the bar on the expected quality of research software has always been a strong motivation for the journal too.

After 30 months, with over 500 JOSS submissions (448 published at the time of writing) by more than 400 amazing volunteers, I think it’s safe to say it’s working [1]. One of my favorite things about working on JOSS is watching authors, reviewers, and editors all working together to improve a piece of software. Some of my favorite comments from the past couple of years:

Reviewing for @JOSS_TheOJ and #JOSE_theOJ (of the Open Journals) is an exercise in restoration of faith in the “scientific process”. Both times it has felt like I’m doing something worthwhile through a collaborative conversation with the author · @drvinceknight

@bovee Done! Thanks for all your feedback and for helping with this submission. I never thought publishing something could be so enjoyable · JOSS Review

Awesome, thanks everyone, the review process really made this package better! · JOSS Review

Some of you might know that my background is in the astronomical sciences, and over the past couple of years I’ve been delighted to see that the AAS has been working to open their journals to ‘software paper’ submissions. Take a moment to read this policy – it’s excellent. In many ways, AAS journals are among the most progressive society-run journals out there, and they have demonstrated a commitment to authors of research software with this policy. There’s a problem though: even though software papers are possible in many AAS publications, the AAS journals don’t instruct their referees to review the software itself; there’s nothing like the formal JOSS review, and there’s no requirement for an open source license.

In discussions with the AAS team over the past year, we realized there was an opportunity to work together to improve the quality of the software associated with their submissions by giving authors of AAS papers a chance to publish with JOSS too.

How this works

Under this new collaboration, authors will submit their paper to one of the AAS journals as usual; if their submission includes a substantial software component, they can choose to submit a JOSS paper to accompany their AAS submission.

If they decide to follow this route, the authors will prepare and submit a JOSS paper as usual, and the software will go through the standard JOSS editorial and review process, in the open on GitHub. As part of this review, it will be made transparent to all parties involved that the JOSS paper is associated with a submission to an AAS publication.

We’ll ask the reviewers to acknowledge that this is happening and if they’re not comfortable with this arrangement, then we will find an alternative reviewer.

Upon successful peer review of the AAS journal article and the JOSS submission, each paper will cite and link to the other.

The income

JOSS is a volunteer, community-run journal and we try to keep our running costs as low as possible. They’re not zero, however, and we’re always making small infrastructure changes to our toolchain that require real money to support development costs. We recognize, though, that adding money into any volunteer project like JOSS is not without risk and so we’re taking a few proactive steps to make the process as transparent as possible:

  • We’re keeping an open ledger showing the income we’ve received from this partnership, together with a summary of how the money has been spent (e.g. server costs, DOI registration fees with Crossref).
  • Any submission to JOSS that has come via AAS will be clearly marked as such in the review process. We’ll create some additional documentation explaining what’s going on.
  • Should any of the reviewers not be comfortable with a contribution being made to JOSS/NumFOCUS based on their review, we will find an alternative reviewer.

What’s next?

Honestly, I’m not exactly sure. At this point, I consider this collaboration with AAS an experiment in sustainability and an opportunity to extend the reach of JOSS to a new audience. The JOSS review is designed to improve the quality of the software submitted, and increasingly I believe JOSS represents a badge of quality that has value in the wider community. Collaborating with AAS on this project is our first attempt at exploring this.

Arfon Smith, Editor-in-Chief, Journal of Open Source Software.

  1. If you’d like more information about how JOSS works, see our paper describing the journal’s first year.