NAIS Surveys: Frequently Asked Questions (FAQ)
CEPR is a client-focused, self-funded research center associated with the Office of the Vice Provost for Research at Indiana University.
- CEPR promotes and supports rigorous program evaluation and nonpartisan policy research primarily, but not exclusively, for education, health/human service and non-profit organizations.
- CEPR conducts international, national, state and local evaluation and research projects related to diverse content areas, including student engagement, STEM education, literacy programs, school improvement initiatives and school choice.
- Please visit CEPR's website for more information.
Although there is considerable variation in how student engagement has been defined and measured, the term generally describes meaningful student involvement throughout the learning environment. Engagement is best understood as a set of relationships:
- between the student and the school community
- between the student and school adults
- between the student and peers
- between the student and instruction, and
- between the student and the curriculum
Student engagement is a multidimensional construct whose dimensions are dynamically interrelated. These dimensions typically include the following: (a) behavioral engagement, focusing on participation in academic, social and co-curricular activities; (b) emotional engagement, focusing on the extent and nature of positive and negative reactions to teachers, classmates, academics and school; and (c) cognitive engagement, focusing on students' level of investment in learning. Student engagement is also malleable: it is a function of both the individual and the context, and it varies in intensity and duration.
Student engagement is increasingly viewed as one of the keys to addressing problems such as low student achievement, student boredom and alienation, and high dropout rates (Fredricks, Blumenfeld and Paris, 2004). Engaged students are more likely to perform well on standardized tests and less likely to drop out of school, and the conditions that foster student engagement (and reduce student apathy) contribute to a safe, positive and creative school climate and culture.
Research indicates that student engagement declines as students progress from the upper elementary grades into middle school, reaching its lowest levels in high school. Some studies estimate that by high school as many as 40-60 percent of youth are disengaged (Marks, 2000). Given the serious consequences of disengagement, more and more educators and school administrators are interested in obtaining data on student engagement and disengagement for needs assessment, diagnosis and prevention. Schools have the power to create the conditions under which students can achieve at high levels, become motivated to learn, and stay connected and engaged academically, socially and emotionally (ASCD, 2009).
HSSSE and MGSSE define student engagement as a complex, multidimensional construct. This definition is aligned with national research (e.g., Fredricks & McColskey, 2012) that conceptualizes student engagement as including behaviors (e.g., persistence, effort, attention, taking challenging classes), emotions (e.g., interest, pride in success) and cognitive aspects (e.g., solving problems, using metacognitive strategies). More specifically, HSSSE and MGSSE measure the following dimensions of student engagement:
Cognitive/Intellectual/Academic Engagement. This dimension can be described as “engagement of the mind.” It includes subscales for cognitive growth through personal skill development, level of effort in academic pursuits, and attitude toward learning. Questions describe:
- students’ effort, investment, and strategies for learning
- the work students do and the ways students go about their work
- engagement connected to instructional time
Social/Behavioral/Participatory Engagement. This dimension can be thought of as “engagement in the life of school.” Questions capture:
- students’ involvement in social, co-curricular, and non-academic school activities, including interactions with other students
- the ways in which students interact within the school community
- engagement with the school outside of instructional time
Emotional Engagement. This dimension can be described as “engagement of the heart.” It includes subscales for motivation for learning, emotional engagement with the school, positive relationships with adults in school, and positive relationships with other students. Questions describe:
- students’ feelings (positive or negative) about their current school situation
- students’ attitudes toward the people with whom they interact, the work, and school structures
- students’ affective reactions
From 2013 to 2015, approximately 90 independent schools participated in HSSSE, with a majority participating annually to allow for comparisons across time. MGSSE, the parallel survey for middle grades students, was recently developed so that independent schools can also gather student engagement data from 5th through 9th grade students. From 2016 to 2017, 63 independent schools participated in MGSSE.
HSSSE and MGSSE contain parallel survey items, with HSSSE targeting students in high school grade levels (i.e., generally grades 9 through 12) and MGSSE targeting students in middle school grade levels (i.e., generally grades 5 or 6 through 8). However, there is some flexibility in choosing which survey to use depending on your specific context, particularly for schools serving students across both middle and high school grade levels. We would be happy to discuss the options with you via e-mail or phone.
This question can be answered in a variety of ways. At a general level, evidence of the credibility and trustworthiness of HSSSE and MGSSE includes the following:
- HSSSE and MGSSE are strongly grounded in the research and literature on student engagement, in particular research related to the engagement of high school and middle grades students. Research describes student engagement as a multidimensional construct that includes behaviors (e.g., persistence, effort, attention, taking challenging classes), emotions (e.g., interest, pride in success) and cognitive aspects (e.g., solving problems, using metacognitive strategies). HSSSE and MGSSE specifically measure student engagement in each of these three dimensions (i.e., behavioral, emotional and cognitive).
- HSSSE and MGSSE were intentionally designed to satisfy the conditions needed for self-reported data to be reliable: (1) the information is known to respondents, (2) the questions are phrased clearly and unambiguously, (3) the questions refer to recent activities, (4) respondents think the questions merit a serious and thoughtful response, and (5) answering the questions does not threaten or embarrass students, violate their privacy, or prompt them to respond in socially desirable ways (e.g., conform to peer pressure).
More specifically, researchers and educators often discuss trustworthiness in terms of the validity and reliability of the instruments. These concepts are multifaceted, have diverse definitions, and can be examined with multiple methods. As a general concept, however, reliability refers to the degree to which an instrument produces consistent results across administrations. For example, a measure would not be reliable if one day it measured an object's length at 14 inches and the next day it measured the same object at 13 inches. Validity, as a general concept, refers to whether the results obtained from an instrument actually measure what was intended and not something else. Evidence supporting the validity and reliability of HSSSE includes the following (note: because MGSSE is newly released, similar reliability and validity evidence is not yet available):
- Content Validity (Face Validity). Content validity addresses the question: do the survey questions cover all facets of the scale or construct? This form of validity refers to the extent to which a measure represents all facets of a given construct. There are no statistical tests for this type of validity; instead, experts judge whether the instrument measures the construct well. To establish content validity, the Center for Evaluation, Policy, & Research (CEPR) at Indiana University convened an external Technical Advisory Panel in 2012-13 consisting of national academic experts in student engagement, K-12 practitioners and psychometricians. The panel examined the content validity of the HSSSE categories (i.e., dimensions of engagement), subcategories and items to assess the extent to which the constructs aligned with current research and literature on student engagement, and items were revised, refined or dropped from the instrument based on its recommendations. The content validity of HSSSE is therefore supported by the panel's integral involvement in the development and refinement of the instrument.
- Construct Validity. Construct validity is the degree to which an instrument measures the characteristics (or constructs) it is supposed to measure; it addresses the question: does the theoretical concept match up with a specific measurement/scale? The three dimensions of student engagement measured by HSSSE and MGSSE (i.e., cognitive engagement, emotional engagement and behavioral/social engagement) are commonly regarded in the research and literature as the key dimensions of high school and middle school student engagement (Fredricks, Blumenfeld and Paris, 2004; Fredricks, McColskey, Meli, Mordica, Montrosse, and Mooney, 2011). Confirmatory factor analyses of HSSSE data support the construct validity of the subscales for these three dimensions (a sketch of this kind of analysis follows this list).
- Response Process Validity. Response process validity addresses the question: do respondents understand the questions to mean what they are intended to mean? This form of validity refers to the extent to which respondents understand the construct(s) in the same way the researchers define them. There are no statistical tests for this type of validity; instead, data are gathered via respondent observation, interviews and feedback. To establish response process validity, the Center for Evaluation, Policy, & Research (CEPR) at Indiana University conducted focus groups and cognitive interviews with students at seven high schools using both paper and online versions of the instrument, and survey items were refined based on respondents' feedback.
- Reliability. The Center for Evaluation, Policy, & Research (CEPR) at Indiana University specifically examined internal consistency reliability, which addresses the question: do the items within a scale correlate well with each other? Internal consistency is the extent to which a group of items measure the same construct, as evidenced by how well they vary together, or intercorrelate. It is commonly measured with Cronbach's alpha; a coefficient greater than or equal to 0.70 is traditionally considered reliable in social science research (Thorndike & Thorndike-Christ, 2010). For HSSSE, Cronbach's alpha was calculated for each of the three dimensions of student engagement using 2013-2015 data from 64,911 students: alpha ranged from .71 to .91 across the cognitive engagement subscales, from .73 to .89 across the emotional engagement subscales, and was .70 for behavioral/social engagement (a brief computational sketch of Cronbach's alpha also follows this list).
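As referenced in the construct validity discussion above, confirmatory factor analysis (CFA) tests whether survey items group together into the factors a theory predicts. The sketch below shows what such a model can look like in code, using the open-source semopy package for Python; the item names (cog1, emo1, etc.) and the input file are hypothetical placeholders rather than actual HSSSE items, and this FAQ does not specify what software CEPR used.

```python
# A minimal CFA sketch with hypothetical item names; not CEPR's actual model.
import pandas as pd
from semopy import Model, calc_stats

# Three latent engagement factors, each measured by illustrative items.
description = """
cognitive  =~ cog1 + cog2 + cog3
emotional  =~ emo1 + emo2 + emo3
behavioral =~ beh1 + beh2 + beh3
"""

data = pd.read_csv("survey_responses.csv")  # hypothetical item-level file
model = Model(description)
model.fit(data)

# Fit indices (e.g., CFI, RMSEA) indicate how well the hypothesized
# three-factor structure matches the observed response patterns.
print(calc_stats(model))
```

If the three-factor model fits the data well, that is evidence the subscales measure three distinct but related dimensions of engagement, which is the construct validity claim made above.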
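Similarly, for readers who want to see the reliability statistic concretely: Cronbach's alpha for a k-item scale is alpha = k/(k-1) x (1 - (sum of the item variances) / (variance of the summed scale scores)). Here is a minimal, self-contained sketch with a made-up score matrix; it is illustrative only, not CEPR's analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 respondents answering a 4-item subscale on a 1-4 scale.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [3, 3, 4, 4],
])
print(round(cronbach_alpha(scores), 2))  # ~0.92, above the 0.70 threshold
```

Values at or above 0.70, like the HSSSE coefficients reported above, indicate that the items in a subscale vary together closely enough to be treated as measuring a single construct.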
By participating in HSSSE and/or MGSSE, schools receive actionable data about the activities and experiences of students in the school. These data can be used for a variety of purposes including the following:
- Accountability
- Accreditation self-studies
- Assessment and improvement (“How can we do better?”)
- Benchmarking
- Communication with internal and external stakeholders
- Staff development
- General education reform
- Institutional advancement
- School improvement processes
- Examining trends or changes over time
- Stimulating discussions about teaching and learning
Simply go to Registration and complete the requested information. If you are participating in both HSSSE and MGSSE, you will need to complete both registrations since schools may have different contact information for the respective grade levels.
Once you have completed the registration form online, you will receive a confirmation email with information on what to expect for the survey administration(s) and suggestions on how to prepare. If you have questions, or want further information before deciding whether or not to register, please feel free to call the Project Team at 812-856-0085 or email HSSSE@indiana.edu. (Note: Same e-mail address for both HSSSE and MGSSE.)
Surveying all students within the relevant grade levels is typically the simplest and most straightforward way to gather data about student engagement. It is also the most comprehensive, providing the most reliable and representative data without the problems inherent in sampling students (see below). However, for a variety of reasons, schools sometimes choose one of the following methods:
- Gathering data from a particular subpopulation of students. Some schools choose to focus on a particular subpopulation, such as only first-year high school students or only seniors. In these cases, gathering data from all students within the given subpopulation (as opposed to sampling them) provides the most reliable and representative data and avoids the problems inherent in sampling (see below).
- Gathering data from a sample of students. Some schools choose to gather data from a sample of students (either a sample of all students or a sample of a particular subpopulation). Schools choosing this method need to be careful and cautious: for the data to be reliable, unbiased and meaningful, schools must consider both sample size and representativeness, and students should be randomly selected to participate. A stratified random sample, in which students are randomly sampled within key demographic groups, should also be considered to ensure adequate representation; a sketch of one way to draw such a sample follows this list.
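For schools curious about the mechanics, here is a minimal sketch of drawing a stratified random sample, assuming a hypothetical roster file with grade and gender columns; the file name, column names and 20 percent sampling rate are illustrative stand-ins, not CEPR requirements.

```python
import pandas as pd

roster = pd.read_csv("student_roster.csv")  # hypothetical student roster

# Draw a 20% simple random sample within each grade-by-gender stratum so
# every subgroup is represented in proportion to its share of the school.
sample = roster.groupby(["grade", "gender"]).sample(frac=0.20, random_state=42)

# Verify that the sample mirrors the roster's grade distribution.
print(roster["grade"].value_counts(normalize=True))
print(sample["grade"].value_counts(normalize=True))
```

Fixing the random seed (random_state) makes the draw reproducible, which is useful if a school needs to document how its sample was selected.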
Convenience samples (i.e., choosing respondents who can be conveniently reached, without regard to their demographic characteristics) are particularly problematic because they are generally not representative of harder-to-reach individuals. For example, a school might administer HSSSE only to students currently taking a computer science course because it is convenient to administer the survey while they are in computer labs. However, students in the computer science class might not be representative of the general student population: they might be higher (or lower) performing students, and students drawn to an elective computer science course might differ from other students in terms of learning styles or other characteristics. Convenience samples often yield biased data that cannot be validly or meaningfully generalized to other students at the school.
Regardless of the method, schools should aim for high response rates in order to avoid selection bias: nonrespondents tend to differ from respondents, so their absence from the final sample makes it difficult to generalize the results to the overall target population. For example, students who are absent frequently (and therefore less likely to complete a survey administered at school) would likely respond very differently than their peers to a survey of student engagement. It is therefore important to maximize the percentage of targeted students who complete the survey, for example by providing repeated opportunities for survey completion, focusing specifically on increasing completion among traditionally hard-to-reach populations, and using multiple communication methods to help students understand the importance and value of completing the survey.
The comprehensive (standard) reports with survey results, along with the accompanying raw data files, are typically sent out in August or September. Although some schools may complete the survey in late winter or early spring, the analyses and comprehensive reports include comparative data that require CEPR to wait until all participating NAIS schools have completed the surveys; all comprehensive (standard) reports are therefore completed and delivered around the same time in late summer. Longitudinal results or custom reports may take 2-4 weeks longer.
Schools that participate in HSSSE and/or MGSSE will receive a comprehensive (standard) report that includes the following:
- Executive summary with an overview of survey responses, along with selected key findings
- Descriptive data for each of the three dimensions of engagement (cognitive/intellectual/academic engagement, social/behavioral/participatory engagement, and emotional engagement)
- Disaggregated data (e.g., responses broken down by grade level and sex/gender)
- Comparisons of mean responses with those of other participating NAIS schools (Note: Schools participating in HSSSE will also receive a comparison of their data to public school norms)
- Open-ended responses with personally identifiable information removed
- See Summary for further details and examples of the reports schools receive as part of their participation in HSSSE and/or MGSSE.
Schools also receive a file of the raw data with identifiers removed. Reports and raw data files are provided electronically, although hard copies of reports and/or flash drives with raw data can be provided for a small additional fee (see Survey costs).
In addition, schools may request optional analyses and reports (for an additional fee) that might be useful. For example, longitudinal reports are available to schools that have participated in more than one administration of HSSSE, and custom reports are available for schools with special requests outside the standard report. Examples of past custom report requests include: gender-specific comparisons (e.g., a school's female respondents compared to only the female respondents from other schools); school-size comparisons (e.g., a school compared to other schools of similar size); and association-specific comparisons (e.g., a school compared to other schools from the same accrediting association). These examples are provided for illustrative purposes only; CEPR would be happy to discuss the feasibility of any additional analyses or custom reports with interested schools.