About HSSSE & MGSSE

What is student engagement?

Student engagement is increasingly viewed as one of the keys to addressing issues such as low student achievement, student boredom and alienation, and high drop-out rates (Fredricks, Blumenfeld and Paris, 2004). Engaged students are more likely to perform well on standardized tests and less likely to drop out of school, and the conditions that foster student engagement (and reduce student apathy) contribute to a safe, positive, and creative school climate and culture.

Research indicates that student engagement declines as students progress from upper elementary grades to middle school, reaching its lowest levels in high school. Some studies estimate that by high school, as many as 40-60% of youth are disengaged (Marks, 2000). Given the serious consequences of disengagement, more and more educators and school administrators are interested in obtaining data on student engagement and disengagement for needs assessment, diagnosis, and prevention.

Schools have the power to create the conditions under which students can achieve highly, become motivated for learning, and stay connected and engaged academically, socially and emotionally (ASCD, 2009).

What are HSSSE & MGSSE?

HSSSE (targeting grades 9 through 12) and MGSSE (targeting grades 5 through 9) are student-focused surveys that investigate the attitudes, perceptions, and beliefs of students about their school work, the school learning environment, and their interactions with the school community. The surveys are modeled after the National Survey of Student Engagement (NSSE) with the following primary purposes:

  • To provide teachers and school administrators with valid and meaningful data on student engagement at their specific school
  • To help educators use student engagement data to develop and implement strategies and practices that increase student engagement at their specific school
  • To help schools interpret their own student engagement data relative to the aggregate results from other schools

HSSSE and MGSSE data can be invaluable to schools, as student engagement is one of the keys to building a safe, positive, and creative school climate and culture that increases student achievement and decreases student boredom, alienation, and drop-out rates. Unlike knowledge-based assessment instruments, HSSSE and MGSSE provide student engagement data that show how schools instill 21st-century skills in their students while providing a caring and safe environment that nurtures the whole child.


Survey Design

HSSSE and MGSSE contain parallel survey items. Both surveys are available as an online or paper survey instrument; most students complete the survey in 15–20 minutes.

Dimensions of engagement measured include:

  • cognitive/intellectual/academic
  • social/behavioral/participatory
  • emotional

The Surveys of Student Engagement define “student engagement” as a complex, multidimensional construct with three primary components.

Overview of Survey Tools (PDFs)

High School Survey of Student Engagement (HSSSE)

View/Download PDF

Middle Grades Survey of Student Engagement (MGSSE)

View/Download PDF

Valid and reliable data

Evidence of the credibility and trustworthiness of HSSSE and MGSSE includes the following:

HSSSE and MGSSE are strongly grounded in the research and literature on student engagement, and in particular research related to the engagement of high school and middle grade students. Research describes student engagement as a multidimensional construct that includes:

  • behaviors (e.g., persistence, effort, attention, taking challenging classes)
  • emotions (e.g., interest, pride in success), and
  • cognitive aspects (e.g., solving problems, using metacognitive strategies).

Like other self-report surveys, HSSSE and MGSSE yield trustworthy data when certain conditions are met. These conditions are:

  1. information is known to respondents;
  2. questions are phrased clearly and unambiguously;
  3. questions refer to recent activities;
  4. respondents think the questions merit a serious and thoughtful response; and
  5. answering questions does not threaten or embarrass students, violate their privacy, or prompt them to respond in socially desirable ways (i.e., concede to peer pressure).

Researchers and educators often discuss trustworthiness in terms of the validity and reliability of the instruments. These concepts are multifaceted and have diverse definitions; there are multiple methods for examining reliability and validity. However, as a general concept, reliability refers to the degree to which an instrument produces consistent results across administrations.

For example, a measure would not be reliable if one day it measured an object’s length at 14 inches and the next day it measured the same object at 13 inches. And as a general concept, validity refers to whether the results obtained from using an instrument actually measure what was intended and not something else. Evidence that supports the validity and reliability of HSSSE includes the following (Note: because MGSSE is newly released, similar reliability and validity evidence is not yet available):

  • Content Validity (Face Validity). Content validity addresses the question, do the survey questions cover all possible facets of the scale or construct? This form of validity refers to the extent to which a measure represents all facets of a given construct. There are no statistical tests for this type of validity, but rather we rely on experts to determine whether or not the instrument measures the construct well. To establish content validity, the Center for Evaluation, Policy, & Research (CEPR) at Indiana University convened an external Technical Advisory Panel in 2012-13 consisting of national academic experts in student engagement, K-12 practitioners and psychometricians. The Technical Advisory Panel examined the content validity of the HSSSE categories (i.e., dimensions of engagement), subcategories and items to assess the extent to which the constructs aligned with current research and literature on student engagement. Items were revised, refined or dropped from the instrument based on recommendations from the Technical Advisory Panel. Therefore, the content validity of HSSSE is supported by the integral involvement of the Technical Advisory Panel in the development and refinement of HSSSE.
  • Construct Validity. Construct validity is the degree to which an instrument measures the characteristics (or constructs) it is supposed to measure. Construct validity addresses the question, does the theoretical concept match up with a specific measurement/scale? The three dimensions of student engagement measured by HSSSE and MGSSE (i.e., cognitive engagement, emotional engagement and behavioral/social engagement) are commonly regarded in the research and literature as the key dimensions of high school and middle school student engagement (Fredricks, Blumenfeld and Paris, 2004; Fredricks, McColskey, Meli, Mordica, Montrosse, and Mooney, 2011). Confirmatory factor analyses of HSSSE data support the construct validity of the subscales for the three dimensions of cognitive engagement, emotional engagement and behavioral/social engagement.
  • Response Process Validity. Response process validity addresses the question, do respondents understand the questions to mean what they are intended to mean? This form of validity refers to the extent to which the respondents understand the construct(s) in the same way it is defined by the researchers. There are no statistical tests for this type of validity, but rather data are gathered via respondent observation, interviews and feedback. To establish response process validity, the Center for Evaluation, Policy, & Research (CEPR) at Indiana University conducted focus groups and cognitive interviews with students at seven high schools using both paper and on-line versions of the instrument. Survey items were refined based on respondents’ feedback in order to establish response process validity.
  • Reliability. The Center for Evaluation, Policy, & Research (CEPR) at Indiana University specifically examined internal consistency reliability. Internal consistency reliability addresses the question, do the items within a scale correlate well with each other? Internal consistency is the extent to which a group of items measure the same construct, as evidenced by how well they vary together, or intercorrelate. Internal consistency reliability is measured with Cronbach’s alpha; a Cronbach’s alpha coefficient greater than or equal to 0.70 is traditionally considered reliable in social science research (Thorndike & Thorndike-Christ, 2010). For HSSSE, the Cronbach’s alpha reliability coefficient was calculated for each of the three dimensions of student engagement (cognitive engagement, emotional engagement and behavioral/social engagement) using 2013–2015 data that included 64,911 students. Cronbach’s alpha was .71–.91 for the subscales of cognitive engagement, .73–.89 for the subscales of emotional engagement and .70 for behavioral/social engagement.
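To make the reliability coefficient above concrete, the sketch below computes Cronbach’s alpha using its standard formula — α = (k / (k − 1)) × (1 − Σ item variances / variance of total scores) — in plain Python. The response data are hypothetical Likert-type ratings invented for illustration, not actual HSSSE or MGSSE data.

```python
# Sketch: computing Cronbach's alpha for a scale.
# The responses below are hypothetical, NOT actual HSSSE/MGSSE data.

def cronbach_alpha(items):
    """items: one list per survey item, each holding one numeric
    response per student (all lists the same length)."""
    k = len(items)                     # number of items in the scale
    n = len(items[0])                  # number of respondents

    def variance(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(variance(item) for item in items)
    # each student's total score across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Hypothetical responses from 5 students to 3 items on a 1-4 scale
responses = [
    [3, 4, 2, 4, 3],
    [3, 4, 1, 4, 2],
    [2, 4, 2, 3, 3],
]
print(round(cronbach_alpha(responses), 2))
```

A coefficient at or above the conventional 0.70 threshold indicates that the items vary together closely enough to be treated as measuring one construct.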