Capabilities Model Introduction and Guide to Use

Updated for V2.0, Oct 2023

Table of Contents:

  What is a Capabilities Model, and why do we need one?
  Acknowledgements
  Understanding Key Aspects of the Model:
    The Five Facings
    Availability across institution
    Service Operating Level
    Community engagement and collaboration
    Not relevant or applicable
    Local Priority
    Domain support levels


What is a Capabilities Model, and why do we need one?

Research Computing and Data[1]  (computing, data, and related infrastructure and services) is changing at an accelerating rate, while the range of academic fields and disciplines depending on this infrastructure is expanding and becoming increasingly diverse. This Capabilities Model was developed to identify the variety of relevant approaches taken to support the sector, and the key factors for providing this support, with a focus on the front lines of Research Computing and Data infrastructure. The Model is designed for use as an input to strategic decision making and organizational planning, and is intended to be inclusive across types of institutions (large and small, public and private, etc.), as well as different organizational models (e.g., centralized or decentralized).

Who should use the Capabilities Model?

This Model is designed to be useful to a diverse mix of stakeholders, including campus research computing and data practitioners, along with the principal investigators and research team members (faculty, staff) with whom they work, as well as key partners (e.g., central IT), and campus leadership.

What are common uses for the Model?

Institutions commonly use the Model to assess their current support for Research Computing and Data, to benchmark against peer institutions, and to inform strategic planning and decisions about where to invest next; the sections below touch on each of these uses.

What is the format for the Model?

The Model is provided as an assessment tool, formatted as a questionnaire, that allows institutions to conduct a self-assessment on a series of factors across a range of perspectives. The tool is a web-based application that supports collaboration: the individuals or teams involved in Research Computing and Data can work together to assess current capabilities. The next section lays out key aspects of the Model assessment and how organizations can go about using the tool.

Some institutions consider this assessment work to be a measure of maturity, and are very much interested in ranking/benchmarking against their peers. Others prefer to focus on the capabilities that are broadly understood to constitute Research Computing and Data, as an inspiration for next steps. In any case, it will not generally be realistic for an institution to achieve 100% across the full range of support areas, nor would it necessarily make strategic sense to do so.

Our research computing and data effort is still emerging. Is this assessment for me?

Some institutions choose to do a simpler assessment their first time, and include just a few key participants (e.g., an RCD support lead or equivalent, plus someone from the library). The resulting assessment data may not be complete or fully accurate, but this can be a quick way to learn about the RCD Capabilities Model and to explore how the resulting data, benchmarking reports, and related outputs can inform strategic planning at your institution.


Acknowledgements

The RCD Capabilities Model was developed through a collaboration of the Campus Research Computing Consortium (CaRCC), Internet2, and EDUCAUSE. This work has been supported in part by the National Science Foundation via OAC-1620695 and OAC-2100003.

The Assessment tool, this guide, and the other associated resources were developed with the generous contributions of time and expertise from the 2018 workshop participants, and the working group members: Alex Feltus, Ana Hunsinger, Cathy Chaplin, Claire Mizumoto, Dana Brunson, Deborah Dent, Doug Jennewein, Gail Krovitz, Galen Collier, Jackie Milhans, James Deaton, Jen Leasure, Jill Gemmill, Jim Bottum, Joe Breen, Joel Gershenfeld, John Hicks, John Moore, Karen Wetzel, Mike Erickson, Patrick Schmitz, Preston Smith, Timothy Middelkoop, and Tom Cheatham. In addition, individuals at a number of Universities provided valuable feedback on earlier versions, for which the working group is very grateful.

Understanding Key Aspects of the Model

There are six key concepts that underlie the Model, around which the tool is organized. These are:

  1. The Five Facings
  2. Availability across institution
  3. Service Operating Level
  4. Community engagement and collaboration
  5. Not relevant or applicable
  6. Local Priority

The Five Facings: The Model is organized into sections that reflect different roles that staff fill in supporting Research Computing and Data; the sections are named to reflect who or what each role is "facing" (i.e., focused on).[2] Within each Facing, the Model poses questions about aspects of research computing and data for the associated role; the questions are grouped into Topics.

Larger organizations may have a team associated with each facing role, while smaller organizations may have just a few people who cover these different roles. In filling out the assessment tool, you will likely want to involve people who work in the different roles; each can fill out the section of the assessment corresponding to their role.

Researcher-Facing Topics
  Description: Includes research computing and data staffing, outreach, and advanced support, as well as support in the management of the research lifecycle.
  Example roles: Research IT User Support, Research Facilitators, CI Engineers,[3] etc.

Data-Facing Topics
  Description: Includes data creation; data discovery and collection; data analysis and visualization; research data curation, storage, backup, and transfer; and research data policy compliance.
  Example roles: Research Data Management Specialists, Data Librarians, Data Scientists, etc.

Software-Facing Topics
  Description: Includes software package management, research software development, research software optimization or troubleshooting, workflow engineering, containers and cloud computing, securing access to software, and software associated with physical specimens.
  Example roles: Research Software Engineers, Applications Specialists, Research Computing support, etc.

Systems-Facing Topics
  Description: Includes infrastructure systems, systems operations, and systems security and compliance.
  Example roles: HPC Systems Engineers, Storage Engineers, Network Specialists, etc.

Strategy and Policy-Facing Topics
  Description: Includes institutional alignment, culture for research support, funding, and partnerships and engagement with external communities.
  Example roles: Research IT leadership (Director, Assistant/Associate Director, etc.)

Table 1 - Description and examples for the Five Facings


Questions for each Capability

The Capabilities Model presents roughly 150 capabilities (in the form of questions) structured around the five Facings. Within each Facing, questions are grouped into topics. Each question is considered through several lenses (described below): Availability across institution, Service Operating Level, and Community engagement and collaboration.

For each question, users choose answers for the three lenses, and these are combined to produce a numerical coverage value for that capability. Coverage values for the questions are averaged to produce a coverage value for the associated topic, and for each Facing overall.
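
The guide describes this roll-up only qualitatively; as a rough illustration, here is a minimal Python sketch in which the 0.0-1.0 scale, the equal weighting of the two primary lenses, and the function names are all hypothetical (the tool's actual formula may differ):

    from statistics import mean

    # Hypothetical 0.0-1.0 scores for the two primary lenses; the actual
    # scale and weighting used by the assessment tool are not given here.
    def question_coverage(availability: float, service_level: float,
                          engagement_modifier: float = 0.0) -> float:
        # Community engagement contributes only a small modifier
        # (see "Impact on the Computed Coverage" below).
        base = mean([availability, service_level])
        return min(1.0, max(0.0, base + engagement_modifier))

    def rollup(coverage_values: list[float]) -> float:
        # Topic coverage averages its questions' coverage; a Facing's
        # coverage is averaged in the same way.
        return mean(coverage_values)

    # Example: a topic with two questions.
    topic = rollup([question_coverage(1.0, 0.75), question_coverage(0.5, 0.5)])

Marking a capability Not relevant or applicable (described below) would simply exclude it from the values passed to the roll-up.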

Availability across institution: The Model assessment tool asks organizations to rate the level of availability across their institution, for each capability associated with supporting Research Computing and Data. Note that it should not matter how or where support is implemented (a lab, a central campus facility, a national facility, or the cloud). The point is whether researchers have access, and are supported for effective use. Broadly speaking, availability is a rating of the level and especially the breadth of support across the institution.

The scope of an "institution" is primarily intended to be a given university campus (e.g., Clemson University, UC Berkeley, Oklahoma State University - Stillwater). However, the tool can also be used with a narrower scope, such as a given school or college (e.g., a College of Engineering, a Medical School, etc.). The rating levels should be interpreted relative to the scope for which you are using the assessment tool.

Availability/supported institution-wide
  Full, production-quality technical capability or service is in place, with equitable access institution-wide. Note that if the local cost model makes access challenging for some groups (e.g., schools or departments), or if support staff lack the skills or capacity to cover the full range of institutional domains, this may not constitute an institution-wide deployment.

Availability/supported for parts of the institution
  Full, production-quality technical capability or service is in place, with access available to selected users or to specific areas or groups (e.g., departments), but not institution-wide.

Planning, piloting, and initial deployment
  The technology or service is not yet available to users in production mode, but meaningful planning for availability is under way: a plan is either in development or in place, and staff are investing significant time and resources to deploy the technology or service. This includes evaluating options (e.g., specific technologies, vendors, team members) with an expectation of production availability within a defined timeframe. Evaluation involves at least multiple person-weeks of staff time developing options and a proposal for required funding, and can include piloting or an initial deployment of the technology or service.

Tracking potential use
  Staff are assigned, but only to monitor and understand this technology or service (i.e., there is active work in understanding it, generally more than just reading articles or holding occasional discussions).

No availability or support[4]
  None of this technology or service is in place, and no work is under way or resources committed for this technology or service.

Table 2 - Levels of Deployment at Institution

The notion of Availability across institution is easiest to understand for things like researchers' access to consulting or to compute resources. Where the assessment capabilities focus on aspects like systems management or security practices, Availability/supported institution-wide should be understood to mean that all researchers in the institution can benefit from the capability, and/or that all teams supporting the function within the institution follow or support the practice; if only some have access, or only some teams follow a practice, Availability/supported for parts of the institution is the appropriate response. For capabilities concerning understanding or alignment, availability can be understood to mean the existence of that understanding, or the level of alignment.


Service Operating Level: As organizations and services mature, support transitions from ad hoc projects, to repeatable and defined services, and eventually to managed activities that work to optimize the service operations and functionality. A common model for describing and structuring this activity is known as IT Service Management (ITSM), and is characterized in this context by adopting a process approach towards management, focusing on researcher needs and IT services for researchers rather than on IT systems, and stressing continual improvement.[5] The Capabilities Model includes this dimension to let organizations assess the robustness, resilience, and sustainability of the support for a given capability of Research Computing and Data.

Each level below is described in two ways: once for capabilities that are services, and once for capabilities that are activities and/or practices.

Strong support, awareness, & commitment
  For services: The capability or service has a high level of operations and support (24x7 support), maintains a high level of customer response, and/or has sufficient resources for regular, significant new service development.
  For activities/practices: There is a high level of support for and awareness of this practice, with a high level of response to requests, and associated staff have sufficient resources and commitment/buy-in for regular, significant extensions of the practice.

Basic sustained service/support & awareness
  For services: The capability or service has good support for operations and some ongoing significant development or enhancement. Staff are able to respond to special customer demands for extensions or enhancements to the capability or service.
  For activities/practices: There is good support for and awareness of this practice, and there is ongoing significant development or enhancement. Associated staff are able to respond to some special requests or extensions of the practice.

Minimal resources & commitment
  For services: The capability or service supports only operations and the enhancements absolutely necessary to keep the service running. There is no significant development or enhancement. While the capability or service is not at significant risk of failure, any reduction in required funding could drop the level to Very limited support and/or at risk.
  For activities/practices: There is support for or awareness of this practice at a minimum viable level, with no significant investment in developing or growing the practice. While support for the practice is not at significant risk of failure, any reduction in required funding or leadership commitment could drop the level to Very limited support and/or at risk.

Very limited support and/or at risk
  For services: The capability or service supports only basic operations, with no meaningful development or improvement. The capability or service is at significant risk of failure, generally due to insufficient resources to anticipate and eliminate operational, security, or other failures.
  For activities/practices: There is very little support for or awareness of this practice, and no meaningful development or improvement. The support/awareness is at significant risk, generally due to insufficient resources, but perhaps also due to a lack of commitment or buy-in.

No existing service/support or awareness[7]
  For services: No capability or service support is in place, and no work is under way or resources committed to support the capability or service.
  For activities/practices: No support is in place and/or there is no awareness of this practice; no work is under way and no resources are committed to provide or develop this support/awareness.

Table 3 - Service Operating Level Descriptions

Not all capabilities are naturally described as "services"; for those, the concept maps to an "activity" or "practice," and the Service Operating Level should be read using the "For activities/practices" descriptions above.

Note that many organizations providing Research Computing and Data support lack the resources to operate at the Strong support, awareness, & commitment level for many of their capabilities. It is not uncommon to support many capabilities at the Minimal resources & commitment or Basic sustained service/support & awareness levels.


Community engagement and collaboration: The Model assessment tool also asks organizations to rate the level of Community engagement and collaboration for each of the capabilities for Research Computing and Data. This recognizes that research is often collaborative across institutions (and even whole communities), and so the RCD teams supporting researchers should also be engaged beyond their own organization. Engagement and collaboration can range from participation in structured community forums to grant-funded regional projects, and should involve engagement with other institutions and/or communities to develop and share leading practices, get advice and training, etc. Examples of communities include the CaRCC People Network, Campus Champions, and various open source projects related to RCD, as well as regional network collaborations like the Great Plains Network and the Ecosystem for Research Networking.[6]

Leading a community collaboration
  Your institution is leading/coordinating collaboration and engagement among multiple institutions, and is a recognized leader. (You're going above and beyond!)

Supporting a community collaboration
  One or more collaborations and/or engagements are ongoing, have support and momentum, and are likely to continue. Partnerships are well established, have clearly defined goals and outcomes, and have an established model for sustainability. (RCD teams are well connected to established communities and/or involved in formal collaborations.)

Engaging a community collaboration
  An initial or early collaboration and engagement is under way to explore the issues or to build relationships. A plan for collaboration is either in development or in place. Staff are investing significant time and resources planning for this collaboration (or engaging with a new community), and partners have been identified and are actively engaged.

Exploring a community collaboration
  There is interest in collaborating with others, and initial discussions may be under way. This involves active engagement and/or planning and exploratory discussions with potential partners, and represents more than simply reading articles or documents, or holding occasional meetings.

No engagement with community collaboration[7]
  No collaboration is in place, and no work is under way or resources committed to form or engage in such a collaboration. (Your team is inward focused.)

Table 4 - Levels of Collaboration across multiple Institutions

Community engagement and collaboration may be more easily understood by some roles/teams than others. For certain activities, the technology or service itself may be part of a collaboration (directly sharing resources among collaborating partners). For others, the staff supporting the service may be part of a community of practice/expertise that develops or shares resources like documentation, training materials, etc. Even for activities that are very locally or inwardly focused (e.g., aspects of data center operation), there can be collaboration on everything from the development and management of the data center to defining the standards and best practices for staff who perform key functions. Community engagement and collaboration is included as a factor in the Model to recognize the importance of working with peers, and of ensuring awareness of, and even active engagement in, the development of community best practices. Few institutions have the capacity to be collaboration leaders across many capabilities.

Impact on the Computed Coverage: This question has less weight in the computed coverage than the Availability and Service Operating Level questions. If you select No engagement with community collaboration or Exploring a community collaboration, the computed coverage for the associated capability will be slightly decreased. If you select Engaging a community collaboration, this will have a neutral impact on the computed coverage, and if you select Supporting or Leading a community collaboration, the computed coverage for the associated capability will be slightly boosted in recognition of the significance of that effort.
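
As a concrete reading of that weighting, the answer-to-modifier mapping might look like the following sketch; the guide states only the direction of each effect (slight decrease, neutral, slight boost), so the magnitudes here are invented for illustration:

    # Hypothetical modifier values; the guide specifies only the
    # direction (slight decrease, neutral, slight boost), not magnitudes.
    ENGAGEMENT_MODIFIER = {
        "No engagement with community collaboration": -0.05,
        "Exploring a community collaboration": -0.025,
        "Engaging a community collaboration": 0.0,
        "Supporting a community collaboration": 0.025,
        "Leading a community collaboration": 0.05,
    }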


Not relevant or applicable: For each capability, you have the option to mark it as Not relevant or applicable. This is intended for use only when support for a given capability makes no sense at your institution. The capability is effectively removed from the model and will not contribute to the computed coverage values for the Topic and Facing. When this value is chosen, the other values in the answer form will be disabled and grayed out to indicate this, although you can still add Work notes to annotate your decision to mark this Not relevant or applicable.

Note: This value should only be used in exceptional cases, and not simply when there is little or no support because it has not been deemed important or strategic (e.g., due to resource constraints).

An example of where this may make sense is the Domain support topic within each Facing: if your institution does not have a Medical School, you can mark that domain as Not relevant or applicable.


Local Priority: As you complete the assessment (or in a review session with your advisory groups) your team can optionally mark each capability as a priority for your organization. This can be used, e.g., to mark items you want to address in your strategic planning. The range of priority values is from 1 to 99 (where 1 is your top priority). Leave this empty or set it to "0" if this is not a priority.
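
For example, a minimal (hypothetical) helper for pulling out the top prioritized capabilities, consistent with the rules above (lower number = higher priority; 0 or empty = not a priority):

    # Hypothetical helper: 0 means "not a priority"; 1 is the top
    # priority; lower numbers sort first.
    def top_priorities(priorities: dict[str, int], n: int = 10) -> list[str]:
        ranked = {cap: p for cap, p in priorities.items() if p > 0}
        return sorted(ranked, key=ranked.get)[:n]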

Note that this question (and any answers you fill in) is for your own use; Local Priority values have no impact on the calculated coverage. However, you can view the top 10 priorities you marked on the main Assessment page; the full list is available in a view linked from that page.

A good practice is to mark as a priority only those capabilities that you can reasonably expect to devote meaningful resources to improving or expanding in the near term (e.g., in the next few years).

If everything is a priority, nothing is a priority!


Domain support levels: Each Facing includes the special topic Domain support. The questions in this topic allow you to assess the level of support across a range of different domains at your institution. Note that these are high-level categories; universities and colleges vary in where a given field or discipline is located, and there are countless interdisciplinary units. The aim here is a high level of aggregation that roughly corresponds to domains with relatively similar functional needs. While each Facing includes this topic, for some Facings at some institutions there may be less distinction or variation in support across domains.

The summary assessment page averages support levels for each domain across all five Facings. The average Domain support will indicate “(WIP)” until the questions are answered in each Facing.
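
As a sketch of that summary behavior (the function, scale, and use of None for a not-yet-answered Facing are all hypothetical):

    from statistics import mean
    from typing import Optional

    def domain_summary(levels_by_facing: dict[str, Optional[float]]) -> str:
        # Average a domain's support level across the five Facings,
        # reporting "(WIP)" until every Facing has been answered.
        if any(level is None for level in levels_by_facing.values()):
            return "(WIP)"
        return f"{mean(levels_by_facing.values()):.2f}"

    # Example: the Systems-Facing question is still unanswered.
    print(domain_summary({"Researcher": 0.75, "Data": 0.5, "Software": 0.5,
                          "Systems": None, "Strategy and Policy": 1.0}))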

Note that you can mark a domain as Not relevant or applicable, if that domain is not at all represented at your institution. This value should only be used if there are no researchers in that domain at your institution; it should not be used simply because the support is weak or there is a perception that researchers in that domain have little need for research computing and data resources.[8]

Note also that users at institutions that have worked with earlier versions of the RCD Capabilities Model assessment tools may notice that the v2.0 support for assessing Domain support is quite different from earlier versions. One aspect of the new model is that support for a given domain can be assigned a priority.

Requesting an additional domain: If you would like to include an assessment for a domain that is not represented in this list, please send your suggested addition to capsmodel-help@carcc.org for our review.


Footnotes

1 "Research computing and data" (abbreviated as RCD) includes technology, services, and people supporting the needs of researchers and research, and is intended as a broad, inclusive term covering computing, data, networking, security, and software. The National Science Foundation (NSF) uses the term “cyberinfrastructure,” and others use "Research IT."

2 For more about the roles associated with these areas, see the initial draft of a "Research Computing and Data Professionals Job Elements and Career Guide", available at: https://carcc.org/wp-content/uploads/2019/01/CI-Professionalization-Job-Families-and-Career-Guide.pdf

3 "CI Engineers" have different roles at different institutions, and some might (also) fill Systems-Facing roles.

4 When this value is chosen, the model will default to the corresponding values for the other questions in the form.

5 See also https://en.wikipedia.org/wiki/IT_service_management

6 The Ecosystem for Research Networking (ERN) was formerly the Eastern Regional Network.

7 The assessment tool will default to this value for a given aspect/factor when No availability or support is chosen as the level for Availability across institution.

8 Researchers in the Arts and Humanities and in the Social Sciences may have more modest requirements, but research computing and data resources are having a significant impact on scholarship in those fields. It has been noted that Social Sciences researchers use as much computing today as Physical Sciences researchers did just a decade ago.