Bridging the Data Gap: A Case for Standardized Reporting on OER Faculty Incentive Programs
Amber T. Burtis and Jennifer J. Horton
This case study describes the design and implementation of a research project aimed at systematically reviewing the literature on open educational resources (OER) faculty incentive programs. The project's main objective was to gather evidence on the effectiveness of these programs, particularly in promoting instructor uptake of OER and any resulting impact on student success. During the initial data collection phase, however, the authors discovered a significant lack of published data on the outcomes of these programs, especially data addressing program efficacy and its relationship to student success measures. Because relevant data are so rarely collected and reported in the research literature, the question of whether OER faculty incentive programs are effective, and whether they empirically increase faculty uptake of OER or improve student success, remains unanswered. This case study proposes the establishment of a shared language and methodological framework for collecting and reporting local data on OER faculty incentive programs, along with the development of an openly available shared dataset that can help address critical questions related to program outcomes.
Academic libraries and librarians have promoted and advocated for OER for over a decade (Bueno-de-la-Fuente et al., 2012; Okamoto, 2013). One strategy for accomplishing this goal has been for librarians to set up, manage, or assist with OER faculty incentive programs (Walz et al., 2016). These programs encourage faculty to write or use OER materials in their courses by providing incentives such as monetary awards, course leave, or recognition, and they can also offer librarian support through copyright assistance, technical expertise, and instructional support. A team of academic librarians at a research university turned to the scholarly literature to see what effect, if any, faculty incentive programs had on faculty adoption, adaptation, or creation of OER, and on any resulting measures of student success. The team embarked on a systematic review of the literature to address these research questions. Given the recent interest in and discussion of the topic, the researchers expected to find many scholarly works on the subject, whether theoretical or case-based, that included assessment data on the programs.
Using the SPICE framework, which structures a question around a topic's setting, perspective, intervention, comparison, and evaluation (Booth, 2006), the researchers developed the research question "Do college or university OER incentivizing programs lead to or increase faculty adoption, adaptation, or creation of OER materials?" They then developed inclusion and exclusion criteria, a list of databases to search, and a search string for each of those databases. Following a traditional systematic review procedure, the team searched the literature and conducted title/abstract and full-text screenings, after which twenty-three studies remained. The next step was data extraction: at the beginning of the project, the researchers had developed a list of items they wished to glean from each study, and they started the process. It was at this stage that the team noticed that, although the articles fit the topic, most did not report the data needed to answer the original research question, which was surprising. The bulk of the literature described setting up an incentive program but provided too little assessment detail for the researchers to make any determinations.
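For illustration only, a generic search string for this topic might resemble the sketch below; the review developed a distinct string for each database, and this example reproduces none of them:

```
("open educational resources" OR OER OR "open textbook*")
AND (facult* OR instructor*)
AND (incentiv* OR grant* OR stipend* OR award*)
```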
The researchers decided to change the project methodology to a scoping review (Burtis et al., 2024), which allowed them to characterize the state of the literature in the field without answering a specific research question. Most of the original methodology remained the same, except that the team had initially planned to use a tool such as Glynn's (2006) critical appraisal tool for risk-of-bias assessment in the systematic review; this step became unnecessary once the project switched to a scoping review. Instead, the researchers followed the PRISMA Extension for Scoping Reviews (PRISMA-ScR) Checklist and Explanation (Tricco et al., 2018). The scoping review aimed to provide an overview of any reported faculty incentive programs involving academic libraries without answering whether these incentives made a difference in faculty adoption, adaptation, or creation of OER, or in any resulting student success measures. The project became a way to assist academic library administrators in developing future OER library programs: it highlighted essential features of these programs and provided an overview of what had been accomplished and where these programs were taking place.
The data extraction for the full set of included studies was primarily descriptive in nature and did not allow for inferential statistical analysis. The following data were extracted (a sketch of how such records might be tabulated follows the list):
- Year of publication (year)
- Institution location (state or country)
- Type of institution (College Navigator classification category)
- Administration of Incentive Program
  - Library leads program (Y/N)
  - Other partners in the program (free-text)
  - Year program began (year)
  - Number of rounds completed (number)
- Structure of the Program
  - Overview of the program (free-text)
  - Faculty incentive (free-text)
- Funding of Incentive Program
  - Total funding for the program (dollar amount)
  - Amount per incentive (dollar amount)
  - Funding source (free-text)
- Program Applications
  - Eligibility to apply (free-text)
  - Requirements of application (free-text)
  - Number of applicants (number)
  - Number of recipients (number)
- Outcomes
  - Number of OER texts adapted, modified, or created (number)
  - Estimated cost savings to students (dollar amount)
  - Subject areas of OER texts adapted, modified, or created (free-text)
  - Additional outcomes reported (free-text)
  - Deliverables required to receive the incentive (free-text)
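As a sketch of how records in such an extraction template might be tabulated for descriptive reporting (an illustration, not the review's actual workflow; the field names and values are hypothetical, and pandas is assumed only for convenience):

```python
import pandas as pd

# Hypothetical extraction records keyed by the categories above;
# values are invented for illustration, not drawn from any included study.
records = [
    {"institution_type": "4-year public", "library_leads_program": True,
     "faculty_incentive": "monetary grant",
     "estimated_student_savings_usd": 120_000.00},
    {"institution_type": "4-year private", "library_leads_program": False,
     "faculty_incentive": "course release",
     "estimated_student_savings_usd": 45_000.00},
]
df = pd.DataFrame(records)

# Descriptive summaries of the kind a scoping review can report:
print(df["institution_type"].value_counts())      # mix of institution types
print(df["faculty_incentive"].value_counts())     # most common incentive
print(df["estimated_student_savings_usd"].sum())  # total reported savings
```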
In short, the key findings of the scoping review were that, in the majority of the reviewed studies, the library was reported as the administrator of the faculty incentive program, the most common incentive offered was a monetary grant, and funding typically came from sources outside the library budget. Most of the programs included in the review began between 2013 and 2016. While data on the number of applicants were insufficient, the review identified 611 reported recipients across the studies, and these faculty used 304 OER texts in their courses. Sixteen studies reported estimated cost savings to students, with a total reported savings of $33,905,818.33. The OER texts adopted, adapted, or created spanned a wide range of subject areas, with no single field dominating. Programs often required faculty to provide deliverables to receive the incentive, such as replacing course materials with OER, providing feedback, or completing surveys. The majority (76%) of the institutions represented in the included studies were 4-year public institutions. The review found that most of the articles were case studies rather than empirical research studies and that there were insufficient data for a comprehensive analysis of program effectiveness or long-term impacts.
The results of the study revealed that while many institutions offer financial incentives to encourage OER adoption, data on the impact of these programs remain fragmented. Our study extracted descriptive data from published studies, including program administration details, funding structures, and reported outcomes. However, crucial information regarding faculty engagement, student success measures, and longitudinal program impact was often missing or inconsistently reported. Specific challenges include:

- a lack of data to determine the relationship between faculty incentives and student success;
- inconsistent reporting on OER adoption outcomes, including cost savings and faculty retention in OER initiatives;
- an absence of standardized data categories, making cross-institutional comparisons difficult; and
- limited longitudinal data assessing the sustained impact of incentive programs on student success.
To address these challenges, we propose a shared language and methodology for OER faculty incentive program data collection, including standardized reporting on the administration and structure of programs, on their outcomes and impact, and on student success measures such as improved learning outcomes, satisfaction with learning materials, potential increases in enrollment and retention, and the affordability of course materials through cost savings. We also recommend collecting longitudinal data on faculty participation and student impact over multiple academic years, and developing a centralized data repository with an open-access dataset to support comparative analysis and research.
The research project detailed in this case study highlights the need for long-term assessment data on OER faculty incentive programs so that inferential statistics can be used to determine the relationship of programs to student success variables. The reviewed studies lacked assessment data and longitudinal data, making it difficult to determine the effectiveness of these programs.
Data on whether the programs improved students' overall knowledge, such as comparisons of learning outcomes in courses using OER linked to faculty incentive programs versus courses using traditional materials, would be valuable in assessing their educational impact. Data to help analyze whether the programs led to a sustained increase in faculty involvement with OER would also be helpful. Further research could analyze whether these programs helped institutions with recruitment and retention; answering those questions would require collecting data on student enrollment and retention in courses using OER developed through incentive programs.
The authors recommend the establishment of a shared language and a shared methodological framework for collecting and reporting local data on OER faculty incentive programs, along with the development of an openly available shared dataset that can help address critical questions related to program outcomes and student success measures. The dataset could be housed as a dashboard by a national OER organization or at an educational institution. Data could be collected by the administrators of the faculty incentive programs in a systematic way so that data from multiple institutions could be analyzed together. To facilitate a shared language, institutions could adopt a set of standardized data extraction categories for their OER faculty incentive programs.
We propose the following points of data collection to operationalize a shared language; a minimal machine-readable sketch of these categories follows the list.
- Administration of the incentive program: Data such as who leads the program, partners, start year, and number of rounds completed.
- Structure of the incentive program: Overview of the program and the specific faculty incentives offered.
- Funding of the incentive program: Data such as total funding, amount per incentive, and funding sources.
- Program applications: Eligibility criteria, application requirements, and the number of applicants and recipients.
- Outcomes: Number and subject areas of OER texts adapted, modified, or created, and the estimated cost savings to students.
- Deliverables required: Data on what faculty members must produce or report to receive the incentive.
- Other: Any other outcomes and deliverables associated with the programs should be documented. This might include changes in teaching practices, student performance, or increases in faculty engagement with OER.
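One way to operationalize these categories, offered as a minimal sketch rather than an adopted standard (all field names are our own invention), is a typed record like the following:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IncentiveProgramReport:
    """One reporting period for a single institution's OER faculty
    incentive program. Field names are illustrative, not a standard."""
    # Administration
    program_lead: str                     # e.g., "library"
    partners: list[str] = field(default_factory=list)
    year_began: Optional[int] = None
    rounds_completed: Optional[int] = None
    # Structure
    incentive_type: str = ""              # e.g., "monetary grant"
    # Funding
    total_funding_usd: Optional[float] = None
    amount_per_incentive_usd: Optional[float] = None
    funding_source: str = ""
    # Applications
    eligibility: str = ""
    num_applicants: Optional[int] = None  # frequently unreported today
    num_recipients: Optional[int] = None
    # Outcomes and deliverables
    oer_texts_count: Optional[int] = None
    subject_areas: list[str] = field(default_factory=list)
    estimated_student_savings_usd: Optional[float] = None
    deliverables_required: list[str] = field(default_factory=list)
    other_outcomes: Optional[str] = None
```

Optional fields make gaps explicit, so a shared dataset could distinguish "not reported" from "zero," a distinction the reviewed literature rarely made.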
 
A shared methodological framework would require clear and consistent definitions for key metrics, including how student cost savings are calculated, so that data reported by different institutions are comparable. To promote consistency in data collection and reporting, common reporting templates or guidelines could be developed, with standardized formats for application data, outcome reports, and student savings calculations.
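Cost savings illustrate why fixed definitions matter: totals vary widely with the definition used. A minimal sketch of one common definition (enrollment multiplied by the list price of the replaced commercial materials; the function name and parameters are our own):

```python
def estimated_student_savings(enrollment: int,
                              replaced_list_price_usd: float) -> float:
    """Savings for one course offering under one common definition:
    enrolled students * list price of the commercial materials replaced.
    Alternative definitions (used or rental prices, actual purchase rates)
    yield very different totals, which is why a shared framework must
    fix a single definition."""
    return enrollment * replaced_list_price_usd

# e.g., 120 students in a course that replaced a $150 textbook:
print(estimated_student_savings(120, 150.0))  # 18000.0
```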
A central, openly available repository or dataset could be developed to house anonymized data from various OER faculty incentive programs. This would allow researchers and administrators to analyze program outcomes and student success measures across different contexts. The data extraction categories mentioned above could form the basis of this dataset. The shared framework should emphasize the collection of longitudinal data to understand the sustained impact of these programs on faculty engagement with OER, student outcomes, and institutional benefits like recruitment and retention.
While quantitative data on OER counts and cost savings are important, the framework could also include qualitative data (e.g., through surveys or brief narratives) to capture faculty and student satisfaction with OER texts and the incentive programs. Institutions should be encouraged to document their methodologies for data collection and analysis, including survey instruments, data validation processes, and any limitations in their data. The establishment of a shared dataset would necessitate the development of community standards for data sharing, privacy, and governance, including guidelines on data anonymization, data usage policies, and mechanisms for data quality control.
While the study did not include data collection on OER initiatives other than faculty incentive programs, the authors recognize that data from other initiatives should also play a key role in any institutional or national OER data framework. Therefore, we suggest fostering collaboration with campus bookstores, teaching centers, bursar offices, and particularly institutional research offices, which could assist in incorporating OER metrics into existing institutional reporting systems. Additionally, it is crucial to recognize that not all OER adoption is driven by formal faculty incentive programs; efforts should be made to collect data from institutions that encourage OER adoption without offering direct incentives.
References
Booth, A. (2006). Clear and present questions: Formulating questions for evidence-based practice. Library Hi Tech, 24(3), 355-368. https://doi.org/10.1108/07378830610692127
Bueno-de-la-Fuente, G., Robertson, R. J., & Boon, S. (2012). The roles of libraries and information professionals in Open Educational Resources (OER) initiatives: Survey report (CETIS Report No. 2012:R02). Center for Educational Technology & Interoperability Standards. http://digital.library.wisc.edu/1793/63306
Burtis, A. T., Horton, J. J., & Taylor, M. K. (2024). Faculty incentive programs for Open Educational Resources: A scoping review. Journal of Library Administration, 64(5), 562-582. https://doi.org/10.1080/01930826.2024.2351246
Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. https://doi.org/10.1108/07378830610692154
Okamoto, K. (2013). Making higher education more affordable, one course reading at a time: Academic libraries as key advocates for open access textbooks and educational resources. Public Services Quarterly, 9(4), 267-283. https://doi.org/10.1080/15228959.2013.842397
Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K. K., Colquhoun, H., Levac, D., Moher, D., Peters, M. D. J., Horsley, T., Weeks, L., Hempel, S., Akl, E. A., Chang, C., McGowan, J., Stewart, L., Hartling, L., Aldcroft, A., Wilson, M. G., Garritty, C., … Straus, S. E. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Annals of Internal Medicine, 169(7), 467-473. https://doi.org/10.7326/M18-0850
Walz, A., Jensen, K., & Salem, J. A., Jr. (2016). Affordable course content and open educational resources (SPEC Kit 351). Association of Research Libraries. https://publications.arl.org/Affordable-Course-Content-Open-Educational-Resources-SPEC-Kit-351/