Key Performance Indicators and Dashboards

Key performance indicators, or KPIs, are measurable quantities that an organization deems important. The adage in business is that "what is important is what is measured." Turning desires into metrics is essential if they are to be pursued with rigor and gusto.

One mechanism for tracking KPIs, particularly appropriate when a KPI is built up from multiple measurements, is a dashboard. With a dashboard, it is apparent at a glance whether goals are being met. Some dashboards and KPIs are shared externally, while others are only for internal use; in either case, particular outcomes might be touted through marketing efforts or other venues. In this sense, what is measured indicates whether we are making progress toward organizational goals.
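As a concrete illustration, here is a minimal sketch in Python of a KPI rolled up from several component measurements and rendered as a one-line dashboard entry. The KPI names, targets, and figures are hypothetical, not drawn from any actual center:

  from dataclasses import dataclass

  @dataclass
  class KPI:
      name: str
      target: float
      measurements: list  # component measurements rolled up into one indicator

      def value(self):
          return sum(self.measurements)

      def status(self):
          # At-a-glance indicator: is the goal being met?
          return "MET" if self.value() >= self.target else "BELOW"

  kpis = [
      KPI("Publications citing the center", 40, [18, 15, 9]),
      KPI("Training/workshop attendees", 200, [60, 85, 40]),
  ]

  for k in kpis:
      print(f"{k.name:32} {k.value():>6.0f} / {k.target:<6.0f} {k.status()}")

The same structure scales from a text report to a web page with green and red cells; the visual form matters less than having explicit targets to compare against.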

Universities often have mission statements, and many have vision statements, strategic plans, and similar guiding documents. These are less common within research computing units, especially smaller ones that are subsidiary to a larger organization. When such guiding documents exist, they may be instrumented as a series of metrics leading to KPIs and dashboards; currently, this seems uncommon in university environments.

The center I used to direct had explicit statements for mission, vision and values (via http://www.arsc.edu/arsc/about/our-mission/):

ARSC Mission

The Arctic Region Supercomputing Center (ARSC) provides an ensemble of outstanding expertise, state-of-the-art technology, and innovative research projects that:

  • Enable the creation and discovery of knowledge in science and engineering,
  • Enhance educational and research capabilities of the University of Alaska Fairbanks,
  • Advance knowledge of the polar regions, and
  • Contribute to a richer understanding of the world around us.

ARSC Vision

To use our capacity and capability in high performance computing to:

  • develop users and utilization,
  • facilitate use and access, and
  • diversify our resources to establish distinction as a leading computational center for polar research.

What do these statements mean in terms of day-to-day operations? Are they consistent with broader mission statements from the campus, or the university system? How are they measured?

For comparison, this is the mission at my current center, the KAUST Supercomputing Laboratory (via http://ksl.kaust.edu.sa/Pages/About.aspx):

KAUST Supercomputing Laboratory Mission

To inspire and enable scientific, economic and social advances through the development and application of high performance computing solutions, through collaboration with KAUST researchers and partners, and through the provision of world-class computational systems and services.

In both cases, the mission statements are carefully scoped to reflect items of interest to the parent institution. They are helpful for deciding where to apply effort for improvement and, equally importantly, for identifying potential activities that might be out of scope.

Mission, vision, and values statements are great ways of describing to the staff, the users, the community, and the world what an organization stands for. Without metrics, though, they are just words. How will alignment of activities with the mission and vision be assessed? What goals will be pursued to further the mission, and how do they relate to each other?

Let's work through some KPIs and associated metrics. Eventually, we could formulate a dashboard. Each research computing center will have its own approach, and exists within its own unique broader environment. As a starting point, I analyzed the ARSC approach.

From Mission and Vision to Goals

The language of Mission and Vision statements does not generally lend itself to measuring incremental progress. To identify which potential measures are of interest, and to track them over time, we can use goals. Goals in university environments are typically achievable on a relatively long time scale:

  1. to double the rate of alumni giving
  2. to raise the average SAT score of incoming freshmen
  3. to have at least 80% job placement within one year of graduation

In the spreadsheet accompanying this monograph, I have provided a listing of academic outcomes that might serve as dashboard items for centers. To facilitate alignment with institutional measures, the items selected are, hopefully, similar to what the institution is already tracking (a sketch of tallying them follows the list):

  • publications and presentations
  • graduates who utilized the center
  • attendees at training, workshops, etc.
  • provided resources (services, CPU/disk resources)
  • innovations, such as patents or software
  • income from grants, contracts, etc.
    • to the center or center staff
    • to users of the center
    • attributable, at least partially, to the center
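As a sketch of how such items might be tallied into dashboard values, the following Python fragment counts outcome records by category and sums grant income. The record schema, category names, and figures are invented for illustration:

  from collections import Counter

  # Hypothetical outcome records, as they might be gathered for one year.
  records = [
      {"category": "publication"},
      {"category": "publication"},
      {"category": "graduate"},
      {"category": "training attendee"},
      {"category": "grant income", "amount": 250_000},
  ]

  counts = Counter(r["category"] for r in records)
  income = sum(r.get("amount", 0) for r in records if r["category"] == "grant income")

  for category, n in sorted(counts.items()):
      print(f"{category:20} {n:>4}")
  print(f"{'grant income ($)':20} {income:>8}")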

Most universities have one or more annual reporting mechanisms by which data concerning institutional goals are gathered and reported. These might include data collected from faculty members (e.g., publications), from the registrar (e.g., graduations), and from central offices for grants and contracts. Users themselves can be asked to provide data when it is not otherwise available. For example, even if a publication and grants listing for every faculty member is available from campus sources, it may still be necessary to ask each faculty member to gauge the role of the research computing center in each item.
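That attribution step might look like the following sketch, in which campus sources supply the publication list and a short survey asks each faculty member to rate the center's role per item. The DOIs, the 0-3 rating scale, and the threshold are all assumptions made for illustration:

  # Publications per faculty member, as supplied by campus sources (hypothetical DOIs).
  campus_publications = {
      "doi:10.0000/example.1": "A. Researcher",
      "doi:10.0000/example.2": "B. Scholar",
      "doi:10.0000/example.3": "A. Researcher",
  }

  # Survey of the center's role in each item:
  # 0 = not used, 1 = incidental, 2 = significant, 3 = essential
  survey = {
      "doi:10.0000/example.1": 3,
      "doi:10.0000/example.2": 0,
      "doi:10.0000/example.3": 2,
  }

  # Count an item toward the center when the rated role crosses a threshold;
  # where to set that threshold is a policy choice, not a technical one.
  attributable = [doi for doi, role in survey.items() if role >= 2]
  print(f"{len(attributable)} of {len(campus_publications)} publications attributable to the center")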