New Playbook: Incremental Credentialing in Graduate Education

Design and Development Research (DDR)

DESIGN AND DEVELOPMENT RESEARCH (DDR) STUDY PLAN (UPDATED APRIL 30, 2022)

Table of Contents

1. Purpose

2. Intervention

3. Study Populations

4. Study Design

Priority 1: Informing ongoing development and feasibility of the Incremental Credentialing Framework

Feasibility Research Questions

Feasibility Study Population

Feasibility Data Collection

Priority 2: Assessing promise for generating the intended learning outcomes

Outcome Research Questions

Research Setting and Context

Table 1: Study Partner Institutions

Outcome Measures

Outcome Data Collection Timing

Data Transformation for Analysis

Outcome Analysis

Matching, Baseline Equivalence

Power Analysis

5. Quality Assurance

Appendix A: Potential Comparisons for Student Outcome Analyses

1. Purpose

The Credential As You Go (CAYG) research project will apply a Design and Development Research (DDR) approach for the proposed study, consistent with guidance from the U.S. Department of Education and National Science Foundation Common Guidelines for Education Research and Development (IES & NSF, 2013). The DDR study will support two priorities: (1) informing ongoing development (rapid prototyping) of the Incremental Credentialing Framework to create targeted, meaningful incremental credentials (ICs) through collection of evidence of its feasibility; and (2) assessing the resulting incremental credentials’ “promise for generating the intended beneficial learning outcomes” (p. 20) through an examination of quantitative learner academic and perception data from pilot IC implementations, using a comparative interrupted time-series (CITS) analysis of individual-level data. The purpose of the outcome analyses is to quantitatively assess the promise of system- and institution-level policies and processes as they translate into implementation of incremental credentials to further targeted, meaningful postsecondary learner outcomes.

 

Analyses across data sources will be guided by initial understandings of theoretical relationships among the six components of the Framework identified during a Priority 1 policy study, framed to anticipate needed conditions for, and barriers to, implementing ICs, establishing a starting theoretical basis for the outcome analysis. The research will expand those understandings by examining (1) revised policies and processes at both system and institution levels, (2) the implementation of new ICs at participating institutions (an institution-level outcome), and (3) realization of beneficial outcomes for learners who participate in new IC offerings within the context of five postsecondary systems in Colorado, New York, and North Carolina. Findings will be developed at institution, academic area, and student subgroup levels.

 

2. Intervention

The education innovation being developed and piloted is the active establishment of a set of antecedent, institution-level conditions (including those promulgated by state higher education systems) theoretically necessary to implement effective incremental credentialing systems.

 

These conditions are framed by the Incremental Credentialing Framework, which includes six components: (1) Learn As You Go: ICs stand on their own, may or may not be connected to a degree, and prepare learners for specific workplace skills; (2) Add On As You Go: ICs are obtained for specializations that add onto a degree (or stand alone) while the learner is engaged in a degree pathway; (3) Stack As You Go: ICs add together, or stack, into larger credentials; (4) Transfer As You Go: ICs are built to transfer across institutions; (5) Partner As You Go: ICs, developed in conjunction with one or more industry partners, prepare for and include field-expected credentials for work, as well as work-related credentials that are accepted into a degree or other credentialing pathway; and (6) Retro As You Go: credentials are awarded for learning already acquired.

 

The institution-level treatment also includes implementation policies and processes required to put the Framework into practice (e.g., auto-awarding of credentials to reduce the additional step learners typically go through to “apply” for graduation). Establishment of the right set of conditions is anticipated to empower institutions to develop and execute innovative ICs, tailored to improve outcomes for specific groups of learners.

 

3. Study Populations

The primary population of interest is students enrolling in the new ICs developed for the treatment condition, compared with students in similar programs without ICs. This group is examined in terms of outcomes of IC implementation under Priority 2, the Outcome Study. The secondary population of interest, because the IC Framework is first and foremost a policy implementation, comprises the higher-education system and institution staff responsible for developing, executing, and supporting these programs. These stakeholders will provide the data for the Priority 1 Feasibility Study.

 

4. Study Design

 

Priority 1: Informing ongoing development and feasibility of the Incremental Credentialing Framework 

Feasibility Research Questions

The aim of Priority 1 will be furthered by answering the following Research Questions:

  1. How feasible is the IC Framework across systems and institutions?
    1. How feasible is the Incremental Credentialing Framework, for implementation at the institutional level within each system, from the perspectives of postsecondary systems, employers, and other state-level stakeholder groups?
    2. What factors influence implementation of the Framework at, and between, the system and institutional levels of state postsecondary education?
    3. To what extent do employers partnered with participating institutions know about, place value on, and use the emerging array of ICs in their hiring and advancement practices?
    4. How do grant-funded communication strategies (e.g., national campaign, website, communication materials) intended to advance awareness of and value for incremental credentialing further progress toward outcomes among system and institutional stakeholder groups?
    5. Which actions support implementation of the Framework and corresponding policies and processes across the different levels of state postsecondary education systems and stakeholder groups, and in what ways?
    6. How does execution of the Framework change institutions’ credentialing structures, technologies, activities, and services, considering anticipated action outcomes for personnel (e.g., changes in marketing, policies, articulation agreements, transcripting, advising, and student record systems)? 
    7. What conditions (e.g., existing student information and degree auditing systems) enable or constrain development and execution of ICs at the academic area level (within institutions), particularly for similar content at different institutions? At the institutional level? At the state (and system) level?
    8. What indicators of Framework readiness define key points in the evolution of system and institutional policies and processes for implementing ICs sufficient to assess the promise of efficacy of the model?

 

To answer these research questions, evidence relating to the feasibility of the Framework (IES & NSF, 2013) at the institution level will be collected through qualitative means from participants opting in from the defined study population. All data-collection strategies will address the same eight high-level research questions listed above to guide the inquiry.

 

Feasibility Study Population

This population is expected to comprise approximately 180 individuals across the three states who are guiding and/or engaged in the development and implementation of the anticipated 90 new ICs (over three years), to be offered across postsecondary levels: community colleges, four-year institutions, and graduate schools.

 

These individuals are current members of one of three groups of higher education stakeholders receiving stipends to contribute to the CAYG project through specified project responsibilities: (1) the National CAYG Advisory Board, (2) the State Coordinating Teams, or (3) the Institutional Academic Teams associated with each of the three states in the study.

 

These groups consist of individuals representing a number of professional roles: (1) state higher education system administrators, (2) college or university administrators, (3) employers (representatives of industry and affiliated professional associations), (4) administrators from community-based, philanthropic, research, policy, and advocacy organizations in the learn-and-work ecosystem (Learn and Work Professionals), and (5) institutional academic team members (including faculty, registrars, deans, and other academic affiliates from institutions participating in the development of new ICs).

 

Feasibility Data Collection

Data will be collected from all five professional role-based groups for this priority study using a combination of focus groups, interviews, and surveys.

 

Focus groups will be convened using a web voice-video meeting application, in role-alike clusters of 6-10 individuals within a state (across institutions) and/or across states and institutions, depending on informants’ roles, disciplines, and other factors contributing to incremental credential development and implementation (e.g., employer groups, system representatives, faculty). Focus groups will be conducted at the end of the fall and spring semesters. Questions will be tailored to group role (or roles, should groups be heterogeneous), considering (1) the Framework schema, (2) institutional personnel action outcomes required to implement the ICs, and (3) any specific issues identified by the development team relating to policies and processes aimed at deploying ICs.

 

Individual interviews will be conducted at the end of each year to provide more in-depth perspectives on the Framework and implementation procedures.

 

Survey questionnaires will also be used to gather ongoing feedback regarding feasibility, supplementing qualitative evidence with broader (but less in-depth) information to address strategies, professional development, expectations, barriers to incremental credentialing strategies, and other factors relating to the CAYG theory of action.

 

Emailed, web-based survey links will also be deployed to accommodate data collection from study participants who would prefer not to participate in a focus group or interview. Abbreviated versions of the questioning protocol can also be used to “prime” focus group or interview participants for web-mediated discussions, while providing preliminary input with which the research facilitators can adjust their questioning strategies.

 

Based on the responses from focus groups and interviews, surveys will be updated to gain additional perspectives on key items identified through ongoing analyses. For example, Research Question 1.8 asks about indicators of readiness of a system or institution to fully and effectively implement incremental credentials. High-level understandings of these conditions will be realized through the results from the focus groups and interviews. Research Question 1.4 examines communication strategies; again, results from the focus groups and interviews will provide insight on effective communication strategies. Once readiness indicators are identified, survey items can be developed to assess broader agreement on their importance, applying an Exploratory Sequential Design (Creswell et al., 2003), an approach particularly useful for developing theoretical models and policy frameworks.

 

Feedback from these multiple data collection efforts will be used to revise CAYG activities executing the Framework, as well as IC policies and processes, along the way. Final results will be used to refine the Framework conceptual model, and to disseminate and promote scaling of the Framework, processes, tools, and resources with other institutions and state systems. 

 

Priority 2: Assessing promise for generating the intended learning outcomes

Outcome Research Questions

Priority 2 will be advanced by answering the following Research Questions:

  1. What learner-level outcomes are realized from implementation of the IC Framework?
    1. Where Framework conditions meet expected levels of readiness, what suite of ICs are institutions implementing?
    2. Where ICs are being implemented, how are learners’ understanding of, and the value they place on, incremental credentialing changing? How do options for incremental credentials influence learners’ actions (e.g., choosing programs, accepting awarded credentials)?
    3. To what extent is participation in ICs, as implemented in this project, associated with improvements to academic outcomes for learners, including initial enrollment, persistence, progress in a program or pathway, and completion of a recognized credential and/or degree?
    4. To what extent do those learners who complete ICs transfer their credentials to another institution, continue their education, and/or obtain employment?
    5. In what ways do learner outcomes differ across groups by age, gender, race and ethnicity, and prior academic and persistence performance? By academic area and institution type (community college, four-year institution)?
    6. In what ways do learner outcomes differ between students who participate in ICs and non-participating students with regard to awareness and value, access, enrollment, persistence, progress, successful completion, and continuing education and employment?

 

To answer these research questions, the study design will include a comparative interrupted time-series (CITS) analysis of prioritized individual-level data (defined as “core outcomes”). Institutional data will serve as retrospective baseline information for examining new ICs (and IC-enhanced programs) developed by participating institutions – the CAYG treatment condition – and similar credentials and programs implemented at similar institutions (the control condition). Institutions will be considered similar based on comparisons of size, degree level, and academic offerings. Other participating institutions provide a convenient starting point. Similar programs or credentials will first be nominated by subject matter experts (e.g., state coordinators, administrative leadership in participating institutions that are developing the treatment ICs). Other programs or credentials will be selected based on content, level, and other factors applicable at that level of analysis. Again, this selection will start with programs within other participating institutions.

 

Data will be collected on activities related to incremental credentialing from both treatment and control credentials/programs in partner and other institutions. However, community colleges in all three states may also be participating in the Lumina Foundation Racial Equity Adult Credentials in Higher Education (REACH) Collaborative grant, in which they may also be developing ICs. For this reason, institutions will also be coded based on their “IC implementation status,” as (1) CAYG participation, (2) REACH participation, (3) other institutions that may or may not be implementing incremental credentialing activities incidentally, or (4) institutions with no incremental credentialing activities. 
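As a minimal illustration of how this four-level coding might be carried as an analysis variable, the sketch below assigns each institution an IC implementation status from hypothetical indicator fields; the field names (cayg_partner, reach_partner, incidental_ic) are assumptions, not the project’s actual data elements.

```python
# Illustrative only: assigning the four-level IC implementation status code
# to each institution. Indicator fields are hypothetical and would come from
# project records rather than this toy data frame.
import pandas as pd

def code_ic_status(row: pd.Series) -> str:
    """Return one of the four IC implementation status codes."""
    if row["cayg_partner"]:
        return "1_CAYG_participant"
    if row["reach_partner"]:
        return "2_REACH_participant"
    if row["incidental_ic"]:
        return "3_other_incidental_IC_activity"
    return "4_no_IC_activity"

institutions = pd.DataFrame({
    "institution_id": ["A", "B", "C", "D"],
    "cayg_partner":   [True, False, False, False],
    "reach_partner":  [False, True, False, False],
    "incidental_ic":  [False, False, True, False],
})
institutions["ic_status"] = institutions.apply(code_ic_status, axis=1)
print(institutions[["institution_id", "ic_status"]])
```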

 

Historical institutional data on learner-level outcomes (treatment/control) will be pulled for each targeted program going back 3-5 years, and analyzed to establish baselines prior to and during the COVID pandemic. Learner-level outcomes obtained during the study (for cohorts from funding Y2 and Y3) for both treatment and control institutions will be compared against projected trends from the baselines. Learner outcomes will also be compared within institutions at the department level to examine outcome patterns, also over the past 3-5 years, versus observed learner outcomes resulting from participation in IC offerings. 

 

The Framework and its policies and processes (the treatment condition) will be implemented in 21 grant project partner institutions across five higher-education systems in three states over the period of the grant-funded study. (Some states have separate systems for their community colleges and four-year institutions.) 

 

Research Setting and Context

Learner outcome data will be examined for all students who engage in ICs (completers and non-completers) at participating institutions. This population will include approximately 900 learners (an estimated 10 learners in each of the 90 new credentials) across the three states, representing diverse learner populations with respect to race and ethnicity, gender, age, and prior academic and persistence performance. Estimates for anticipated numbers of new ICs and impacted learners are provided in the following table.

Table 1: Study Partner Institutions

 

                              Colorado   New York   North Carolina   Totals
State Systems                        2          1                2        5
Participant Institutions             7          7                7       21
Estimated new ICs per year          10         10               10       30
Estimated Learners per year        100        100              100      300
Estimated Learners, 3 years        300        300              300      900

 

Outcome Measures

The current operational definitions of learner outcome measures include the following: 

  • Access – Evidence of readily available information on, advising for, and the ability to register for targeted credentials. The provision of ICs is one INDICATOR of this immediate outcome of the intervention, as is effective communication regarding availability, benefits, listings for registration, etc.
  • Awareness of ICs – The extent to which students know of ICs, understand what they are, and have the ability to research and enroll in them 
  • Value for ICs – The extent to which students perceive ICs as beneficial to their education and professional goals; likely to realize a return in terms of satisfaction and income 
  • Enrollment (core outcome for analysis) – Registration for an incremental credential, acceptance of all associated charges (i.e., a student who is registered but has not paid is not enrolled), and continued enrollment past the add/drop period for that credential at the institution.
  • Persistence (core outcome for CITS analysis) – Term-to-term continued enrollment (per previous definition) towards an identified educational goal (e.g., incremental credential, certificate, degree) or completion of that goal.
  • Progress (core outcome for CITS analysis) – Completion of enrolled courses and/or successful attainment of credential requirements towards an educational goal (e.g., incremental credential, certificate, degree) or completion of that goal.
  • Completion (core outcome for CITS analysis) – Attainment of a formal award (e.g., incremental credential, certificate, degree) by a learner, per institutional requirements (e.g., within a stipulated period of time).
  • Transfer (core outcome for CITS analysis) – A transition between or among postsecondary institutions in which the dominant destination institution grants the learner credit for courses taken at the origin institution; normally a one-way transition (i.e., temporary enrollment at a new institution with return to the first is not a transfer).
  • Continuous Education – Enrollment in the next sequential or additional educational credential (with or without a break) at the same institution or a different institution (e.g., incremental credential to degree, associate degree to bachelor’s degree).
  • Employment – Learner has remained employed at the same level, changed job levels or responsibilities, or secured new employment within 6 months after completion of IC. Employment data will be self-reported through the planned learner surveys. 
    • Better pay
    • Promotion
    • Move to different company/organization
    • Move to a different field

 

The core outcome measures of Enrollment, Persistence, Progress, Completion, and Transfer will be the priority for collection from state higher education agencies. The research team is confident that these measures are available by those means in forms appropriate for the outcome study.

Access data (notably when new ICs become available for student registration) will be aggregated from partner institutions along with other attributes of ICs in the treatment condition being tested. Student Awareness and perceptions of the Value of ICs will be assessed through student surveys, amplified with qualitative data collection as described for Priority 1. Distal outcomes of Continuous Education and Employment will be assessed by the same means. These data are available only inconsistently from institutions but any additional information that becomes available will be considered as data are synthesized across sources/methods to answer the research questions.

 

Additional data to be collected (e.g., for consideration of covariates and to facilitate subgroup analyses) will include the following:

  • Demographic Data variables including race/ethnicity, gender, age, socioeconomic status (e.g., FAFSA/Pell eligibility), prior academics (SAT; high school GPA if applicable), types of other credits (e.g., institutional, transfer, prior learning).
  • Higher education variables of learners within programs or institutions such as academic performance (GPA, # of terms enrolled at the institution, matriculated vs. non-matriculated status), and academic program of student (if matriculated)
  • Incremental credentials attribute data including program, topic, credit/noncredit, number of credits (if credit bearing), type of credential to be achieved, purpose of the credential, considerations of prior learning, any designated stackable pathways connected to the credential, assessment strategies, steps to completion, and alignment designations to employment. 

 

In addition, qualitative data will be collected from targeted IC-participant learners through surveys, focus groups, and individual interviews, to monitor the implementation of CAYG-informed ICs (informing the Priority 1 Feasibility Study) and to complement impact analyses with self-reported perceptions regarding awareness and value of ICs, their educational and employment goals, and plans for educational next steps. Surveys and focus groups will take place at the end of fall and spring semesters. Interviews will take place at the end of Years 1 and 2, and mid-year in Year 3.

 

Outcome Data Collection Timing

Outcome data will be collected on the following schedule:

  • Yearly learner outcomes data for participating institutions and non-participating institutions will be collected and analyzed for the past 3-5 years 
  • Yearly learner outcomes data for participating institutions and non-participating institutions will be collected and analyzed for the observed years (years 1-3)
  • Yearly learner outcomes data for participating departments/programs within participating institutions will be collected and analyzed for the past 3-5 years 
  • Yearly learner outcomes data for participating departments/programs within participating institutions will be collected and analyzed for the observed years (years 1-3)
  • Semester outcomes of participating learners (those enrolled in an IC developed during the grant period) will be collected in December, May, and August of each observed year (grant Years 1-3; the last year would not include August).

 

Implementation data collection (qualities and quantities of activities to develop and deploy ICs) will occur on different timelines across programs, at participating institutions, and even at the level of each partner system. This is necessary because conditions for the execution of a new IC can be realized at any time during a grant year. Introduction of any new IC may happen at the beginning or middle of a semester, subject to institutional policies and practices. To accommodate this, new offerings will be accounted for at the end of each academic period of each year — Fall (typically September-December), Winter (January-February), Spring (January-May), or Summer (June-August) sessions. 

 

Data Transformation for Analysis

Variables in the raw data files secured from partner systems and institutions will have to be transformed to represent the latent variable constructs of the core outcomes being considered, aligned with the definitions provided in this document. These transformations will also ensure uniform formatting (e.g., MM/DD/YYYY for dates) and consistency in tabulation across data sources.

In other cases, synthetic or scratch variables will need to be created and used when the student outcome of interest takes into consideration information spanning more than one point in time; persistence and progress are examples (see below). The synthetic variable will be used to carry information from one point in time to the next or later points in time. The use of a synthetic variable in this manner preserves transformed raw data so that auditing or undoing a particular calculation is possible. Without a synthetic variable, the transformed raw data would otherwise be overwritten, making data audits more difficult.

Enrollment

Term data concerning enrollment (by course) will be transformed from student-level raw data obtained from partner systems or institutions. For purposes of analysis, a binary value is necessary for each student, by term, to inform two other student outcomes – persistence and progress.

Persistence

The calculation for persistence (at the student level) requires a logical comparison of enrollment in, and satisfactory completion of, at least one course for a given term (e.g., quarter, semester, academic year) with that of the previous term. As this variable does not originate with raw student data, it is a synthetic variable.

Progress

Similar to persistence, progress is a synthetic variable. Calculation of this variable does not, however, require a chronological comparison of enrollments. Instead, it is the satisfactory completion of credential-relevant coursework in the current term that determines movement toward completion.

Completion

Like enrollment, completion is a transformed variable from relevant student-level raw data. A binary value specific to a given term is needed for analysis at the student level (i.e., a student either did or did not complete a given course or other credential component during the term in which they were enrolled).
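To make these transformations concrete, the following is a minimal sketch (in Python/pandas) of deriving term-level enrollment, progress, and completion flags and the synthetic persistence variable from raw course records. All column names and business rules shown are illustrative assumptions; actual raw extracts from partner systems and institutions will differ.

```python
# Minimal sketch of the student-term transformations described above.
# Raw columns (student_id, term, registered, paid, past_add_drop,
# course_completed, credential_relevant, credential_awarded) are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "student_id":          [1, 1, 1, 2, 2],
    "term":                [202109, 202109, 202201, 202109, 202201],
    "registered":          [True, True, True, True, True],
    "paid":                [True, True, True, False, True],
    "past_add_drop":       [True, True, True, True, True],
    "course_completed":    [True, False, True, False, True],
    "credential_relevant": [True, True, True, True, True],
    "credential_awarded":  [False, False, True, False, False],
})

# Transform raw course rows into one record per student per term.
term = (
    raw.assign(
        enrolled=raw.registered & raw.paid & raw.past_add_drop,    # Enrollment (binary)
        any_completion=raw.course_completed,                        # helper for persistence
        progressed=raw.course_completed & raw.credential_relevant,  # Progress (binary)
        completed=raw.credential_awarded,                           # Completion (binary)
    )
    .groupby(["student_id", "term"], as_index=False)
    .agg({"enrolled": "max", "any_completion": "max",
          "progressed": "max", "completed": "max"})
    .sort_values(["student_id", "term"])
)

# Persistence: a synthetic variable comparing the prior term (enrolled and
# completed at least one course) with enrollment in the current term;
# the transformed raw values are preserved rather than overwritten.
prior = term.groupby("student_id")[["enrolled", "any_completion"]].shift(1, fill_value=False)
term["persisted"] = prior["enrolled"] & prior["any_completion"] & term["enrolled"]
print(term)
```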

 

Outcome Analysis

An ordinary least-squares (OLS) regression approach will be used to conduct the multiple-group interrupted time-series analysis. Specifically, the ITSA command available in Stata version 17 will be used. This analytic approach is more flexible and more readily applicable for time-series analysis than approaches based on autoregressive integrated moving-average (ARIMA) models (Linden, 2015). The comparative interrupted time series (CITS) analyses will be conducted at the individual student level.
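For readers unfamiliar with the specification, the sketch below illustrates the kind of segmented-regression model a comparative interrupted time-series analysis estimates, written as an OLS model in Python rather than the project’s planned Stata itsa workflow. The input file and variable names (outcome, time, post, time_since, treated) are hypothetical placeholders.

```python
# Illustrative analog (not the project's Stata itsa workflow): a comparative
# interrupted time-series specification estimated with OLS via statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("learner_terms.csv")  # hypothetical analysis file
# time:       terms since the start of the baseline window
# post:       1 for terms after the IC went live, else 0
# time_since: terms elapsed since the IC went live (0 before)
# treated:    1 for the IC-enhanced program, 0 for the comparison program

model = smf.ols(
    "outcome ~ time + treated + time:treated + post + time_since"
    " + post:treated + time_since:treated",
    data=df,
).fit(cov_type="HC1")  # robust standard errors as a simple default
print(model.summary())
# post:treated       -> difference in the level change between groups
# time_since:treated -> difference in the post-intervention slope change
```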

Processes to assure the quality of study methods and results will be organized to align with guidance from IES regarding enhancing the generalizability of impact studies in education (Tipton & Olsen, 2022).

1. Define the target population

2. Develop a population frame

3. Design a sampling plan

4. Implement the sampling plan

5. Assess similarity

6. Adjust for differences

7. Report generalizability appropriately

 

Matching, Baseline Equivalence

Comparisons will be conducted on learner-level outcomes between baseline and post-implementation (within-subject) and also between individuals in similar credentials/programs at participating and non-participating institutions (between-subject). This approach acknowledges the potential for substantive differences between individuals and between programs (e.g., content, duration) and allows us to address baseline equivalence to help “rule out threats to the validity of causal claims yielded by a simple pre-post comparison” (Hallberg et al., 2020).
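One common way to document baseline equivalence under this approach is to report the standardized difference in baseline outcome means between the treatment and comparison groups. The sketch below illustrates that calculation with toy data; it is offered as one option, not as the project’s required procedure.

```python
# A common baseline-equivalence check: standardized difference in a baseline
# outcome between treatment and comparison groups, using a pooled SD.
import numpy as np

def standardized_difference(treat: np.ndarray, comp: np.ndarray) -> float:
    """(mean_T - mean_C) / pooled standard deviation of the baseline measure."""
    n_t, n_c = len(treat), len(comp)
    pooled_var = (((n_t - 1) * treat.var(ddof=1) + (n_c - 1) * comp.var(ddof=1))
                  / (n_t + n_c - 2))
    return (treat.mean() - comp.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)              # toy data for illustration only
treat = rng.normal(0.62, 0.20, 150)         # e.g., baseline persistence rates
comp = rng.normal(0.58, 0.20, 300)
print(round(standardized_difference(treat, comp), 3))
```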

 

Further analyses will be conducted by subgroup (e.g., equity, age, institution type, and/or state) to reduce the variance of outcome measures by increasing the similarity of these data along relevant measures. Within subgroups, matching based on outcome measures at baseline will further ensure suitability as a control group member. Comparisons will be elaborated by considering credential and/or program attribute variables, including but not limited to their implementation in the four types of institution defined by their IC implementation status: (1) Credential As You Go participant institutions, (2) REACH grantees, (3) other institutions offering ICs, and (4) those offering no ICs.

 

Importantly, comparisons are unitized at the “program-credential” level (versus the institutional or state higher education system levels). This strategy assumes that, in order to be a viable intervention to improve higher education, the credentialing strategies that make up the Framework will most likely be applied to programs that already exist in some form (perhaps ill-advisedly dubbed “traditional programs”). Program credentials assigned to the treatment group will therefore come from among the constellation of 90 new, IC-enhanced academic offerings developed under the CAYG initiative, the intervention being the application of the Framework to redesign a program in ways thought to leverage the benefits of IC strategies and the credentials they promulgate. The treatment group may include all 90 of these new programs, as they become offered, or some portion of them, based on when they go live for registration, what features each includes, and other factors. It is important to note that state/institutional partners have substantial leeway about which IC enhancements they apply under CAYG, so the study is dependent on their decision making. They will presumably prioritize courses and learning success outcomes that are comparatively important to their economic and social needs.

 

Given that, the population of comparison group programs will be selected from among current and past programs that are not IC-enhanced under the auspices of the Framework and CAYG project, but are otherwise similar in terms of (first) the program and (second) the profile of students. Program similarities will be established considering fundamental aspects of the IHE course offering (e.g., content, labor market segment supported). Student group equivalence will be established using propensity score matching, beyond the dichotomous variable indicating program affiliation (a matching sketch follows the list below). Noting that data are collected at the course enrollment-term level, specific comparisons for current new ICs will include, but not be limited to, the following:

  • The program they replace (old credential, same program, same institution, retrospective to prior term implementations)
  • The program they enhance, if it is still offered in non-IC form (old credential, same program, same institution, same time) – in some ways this is arguably the best comparison in terms of limiting unexplained/unmanaged differences
  • An otherwise-similar, non-IC program at the same institution (different credential; different program; same institution; same time, retrospective, or both) 
  • A similar program at a different CAYG institution (different credential; different program; different institution; same time, retrospective, or both)
  • A similar program at a non-CAYG (or comparison) institution (different credential; different program; different institution; same time, retrospective, or both)
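The following is a minimal propensity-score-matching sketch of the student-group equivalence step noted above. The input file, covariate names, and one-to-one nearest-neighbor matching (with replacement, no caliper) are illustrative assumptions rather than the project’s specified procedure.

```python
# Minimal propensity-score matching sketch for student-group equivalence.
# Covariate and file names are hypothetical; a real implementation would use
# the project's covariates plus calipers and balance diagnostics.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("students.csv")  # hypothetical analysis file; 'treated' is 0/1
covariates = ["age", "gpa_prior", "pell_eligible", "terms_enrolled"]

# 1. Estimate propensity of enrollment in the IC-enhanced (treatment) program.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. One-to-one nearest-neighbor matching on the propensity score (with replacement).
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_controls = control.iloc[idx.ravel()]

matched = pd.concat([treated, matched_controls])
print(matched.groupby("treated")[covariates].mean())  # simple balance check
```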

A table of possible comparison options is provided in Appendix A of this document. It is important to note that the comparisons described there and above answer Outcome Study (Priority 2) research questions, presuming that the treatment ICs are being implemented (per RQ 2.1. and 2.2.).

 

It should be further noted that comparisons defined truly at the institution level are limited to the Priority 1 Feasibility Study, assessing execution of policies and processes intended to develop and deploy ICs.

 

Power Analysis

The PowerUp! software package will be used to assess statistical power for the CITS analyses. Assuming α = .05, a one-tailed test, three years of baseline data, and an average of 100 learners in 60 ICs in the first year of implementation, the Minimum Detectable Effect Size (MDES) is estimated to be 0.24 for individual-level analyses (Dong & Maynard, 2013). As these assumptions are subject to change, informal post hoc analyses may be used to estimate attained statistical power and our degree of certainty in statistical findings from the CITS analyses.
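As a rough illustration of the kind of calculation PowerUp! performs, the sketch below computes an MDES using the simplified individual-level formula described by Dong and Maynard (2013). The inputs shown are hypothetical, and the actual CITS MDES depends on additional design assumptions (e.g., baseline trend length, clustering) handled within PowerUp!.

```python
# Hedged sketch of a simplified individual-level MDES calculation:
# MDES = M * sqrt((1 - R^2) / (P * (1 - P) * n)),
# where M is the sum of the t critical values for alpha and power.
import math
from scipy import stats

def mdes_individual(n: int, p_treat: float = 0.5, r2: float = 0.0,
                    alpha: float = 0.05, power: float = 0.80,
                    one_tailed: bool = True, n_covariates: int = 0) -> float:
    df = n - n_covariates - 2
    t_alpha = stats.t.ppf(1 - alpha if one_tailed else 1 - alpha / 2, df)
    t_power = stats.t.ppf(power, df)
    multiplier = t_alpha + t_power
    return multiplier * math.sqrt((1 - r2) / (p_treat * (1 - p_treat) * n))

# Hypothetical inputs, not the study's actual parameters:
print(round(mdes_individual(n=600, r2=0.30), 3))
```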

 

5. Quality Assurance

The proposed DDR project will be subject to ongoing “critical-friend” review of its design and execution, leveraging advisory and partnership relationships defined in the work plan. Regular inclusion of questions embedded in planned research data-collection processes will tap into impressions of key individuals from the expected national advisory board of approximately 100 members, state systems, institutions, and other partners to assess the collective work of the project team and research teams (Evaluand and Ad Hoc Analytics) in developing and studying the CAYG model and its theoretical framework, data collection strategies, analyses, and reporting. In addition, ongoing meetings will occur among the state data representatives, the research team, and the leadership team to review data collection and analysis processes and results on a regular basis.

 

While not a true “external evaluation,” this approach is broadly consistent with collaborative first principles of a DDR study. This approach will also have the added benefit of encouraging buy-in for the research effort by engaging individuals interested in, and qualified to comment on, the project team’s methods and results. This effort will be guided by a conceptual framework aligned with guidance from the Common Guidelines for Education Research and Development.

 

v.2022.0510.1442

 

Appendix A: Potential Comparisons for Student Outcome Analyses

Option | Program | Institution | CAYG Participant | Credential | Level of Data | Notes
0 | psychology | SUNY Empire | yes | CAYG IC (new) | student | A new IC created under CAYG; what the other rows are compared to
1 | same | same | yes | IC/BAU | student | Control (either non-CAYG IC or business as usual (BAU) – BA, MS, etc.)
2 | same | same | no | IC/BAU | n/a | This combination does not exist
3 | same | different | yes | IC/BAU | student | Either non-CAYG IC (control group A) or BAU degree (control group B)
4 | same | different | no | IC/BAU | ? | Useful comparison, but student-level data unlikely
5 | similar (e.g., counseling, education) | same | yes | IC/BAU | student | Controls may be similar enough for comparison
6 | similar | same | no | IC/BAU | n/a | This combination does not exist
7 | similar | different | yes | IC/BAU | student | Controls may be similar enough for comparison
8 | similar | different | no | IC/BAU | ? | May be useful comparison, but student-level data unlikely
9 | different | same | yes | IC/BAU | student | H: does CAYG IC outperform BAU in general?
10 | different | same | no | IC/BAU | n/a | This combination does not exist
11 | different | different | yes | IC/BAU | student | H: does CAYG IC outperform BAU in general?
12 | different | different | no | IC/BAU | ? | May be useful comparison, but student-level data unlikely

References

Creswell, J. W., Plano Clark, V. L., Gutmann, M., & Hanson, W. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209–240). Thousand Oaks, CA: Sage.

Dong, N., & Maynard, R. A. (2013). PowerUp!: A tool for calculating minimum detectable effect sizes and minimum required sample sizes for experimental and quasi-experimental design studies. Journal of Research on Educational Effectiveness, 6(1), 24-67.

Hallberg, K., Williams, R., & Swanlund, A. (2020). Improving the use of aggregate longitudinal data on school performance to assess program effectiveness: Evidence from three within study comparisons. Journal of Research on Educational Effectiveness, 13(3), 518-545. https://doi.org/10.1080/19345747.2019.1698088

Institute of Education Sciences, U.S. Department of Education, & the National Science Foundation. (2013, August). Common guidelines for education research and development. http://www.nsf.gov/pubs/2013/nsf13126/nsf13126.pdf

Linden, A. (2015). Conducting interrupted time-series analysis for single- and multiple-group comparisons. The Stata Journal, 15(2), 480-500.

Tipton, E., & Olsen, R. B. (2022). Enhancing the generalizability of impact studies in education (NCEE 2022-003). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved from http://ies.ed.gov/ncee
