(Editor’s note: HPC is almost the definition of research software. Too often, however, HPC research software becomes orphaned or lost over time. The work of maintaining and delivering research software has taken on a more formal title: Research Software Engineer (RSE). The position combines two important skills: software expertise and an understanding of research.
Recently, a very well-attended conference, RSECon23, and the development of an international community indicate interest and growth in this emerging field.
In the United States, the US Research Software Sustainability Institute (URSSI) was created to improve the recognition, development, and use of software for a more sustainable research enterprise. It fosters collaboration in the development of education, outreach, and software services that emphasize open, transparent, reproducible, and cooperative practices for research software.
The following is republished (with permission) from the URSSI website and provides first-hand insights into careers in research software. As background, you can consult the two previous blog articles, “Charting the Way: Policy and Planning for Sustainable Research Software” and “Elevating Research Software: Co-creation of a Digital Roadmap.”)
The URSSI “Charting the Way” project organized a focused participatory workshop at the IEEE eScience Conference in Cyprus in October 2023 to gather international perspectives on the main challenges facing research software. At this workshop, a nuanced and in-depth discussion took place around the metrics and incentives that shape research software careers internationally, helping to clarify key questions and potential solutions being explored in different countries. Workshop participants were professionals involved with research software at different levels, including two individuals who lead large teams of research software engineers (RSEs) in different countries. The other participants were two early-career researchers who work with research software and hold doctorates in the social sciences. These differences in participant profiles provided a wide range of perspectives on the topic of advancing research software careers.
Beyond the “sophisticated digital receipt”
One participant, an early-career researcher in computational biology, argued that because software is a project “deliverable,” it often shares the spotlight with academic papers as a marker of a successful investment by the funder. However, this researcher noted that academic articles serve as a “sophisticated digital receipt” for research software work: a formal but detached record of the actual work, which is the software itself. This observation highlighted the persistent perceived need for research software professionals to tailor their work to traditional academic metrics, despite high-level policies indicating that research software contributions should be valued in their own right.
Communicating the contribution of research software
Another participant, who works on software for a major supercomputer project, highlighted the role of effective communication in overcoming the ambiguities and challenges affecting promotion and other key aspects of university life. Although software is “a very critical element,” its impact is amplified when it is disseminated through multiple channels, from academic articles to user group meetings. This multifaceted approach to knowledge sharing highlights the evolving criteria for what constitutes a “valuable” research output.
The assessment gap and the importance of social skills
The conversation then turned to the challenges of measuring the impact of software. One participant lamented the lack of a robust, evidence-based system, calling the current assessment framework “loose.” Participants noted a reliance on self-reported claims rather than concrete measurements, arguing for more reliable methods such as testimonials or third-party validation. One participant involved in annual reviews emphasized that technical prowess alone does not define a senior research software engineer (RSE). Equally vital is the ability to “stand your ground” in academic conversations and to mentor junior team members. This perspective could expand assessment criteria to include social and communication skills that may not be easily quantifiable but are essential for career advancement.
Operational challenges of scaling teams
The workshop also explored the operational aspects of recognizing the contributions of RSE teams, particularly as they scale. This discussion highlighted the need to maintain transparency in documenting the accomplishments of research software professionals. “The work system for our entire team resides on GitHub,” said one senior participant. This approach provides “continuity” in team interactions but shows its limitations as teams grow. “I’m already feeling the growing pains,” admitted another participant, emphasizing the need for more formalized systems for tracking research software achievements to ensure fair recognition and promotion for RSEs.
The puzzle of RSE recruitment
When it comes to hiring, the evaluation criteria take an interesting turn. “What I value most is intellectual curiosity,” said a senior team member, explaining that the team is not just looking for the “mythical 10x coder” but rather for people who can engage effectively with academics from diverse fields. This senior participant highlighted that one of their most successful RSE recruits (“absolutely phenomenal”) had a background in fine arts and a PhD focused on a traditional profession.
The role of specialists in RSE teams
As the conversation evolved, participants recognized the role of specialists in growing teams. While small teams need generalists who can “get involved in everything,” larger teams can afford specialists. “Maybe when we have a hundred people and I can afford these specializations, maybe we’ll hire more,” noted one participant.
Conclusion: a constantly evolving field
The workshop presented a microcosm of the broader changes taking place in computational research. It highlighted the need for a more nuanced, multidimensional approach to evaluating contributions to research software, one that goes beyond articles and citations to include direct contributions to software as well as communication skills. As the field continues to evolve, these discussions will undoubtedly shape the frameworks that define the success and impact of research software. The URSSI “Charting the Way” project works to identify and promote ways to advance these discussions. One way to get involved is to participate in the collaborative engagement platform we have launched on GitHub for ideas on improving the status, sustainability, and impact of research software.
Eric A. Jensen and Daniel S. Katz serve on the URSSI Steering Committee.