Degrees of Change Pilots Student “Pulse Check” Survey

February 20, 2018

By Bill DeBaun, Director of Data and Evaluation

About a year ago, we brought you a very zappy blog post about how NCAN member Degrees of Change was connecting free or low-cost applications to facilitate its data collection. At our national conference in San Diego in September, Degrees of Change staff presented "Mapping the College Experience: Lessons from a Year-Long Pilot of Weekly 'Pulse Checks'," which showed a promising and useful application of this approach. Readers should start with the January blog post to get background and context and then come back here to learn more.

Dr. Tim Herron, Degrees of Change's executive director, and Dr. Kelly Bay-Meyer, director of research and evaluation, led the conference presentation. As the number of students served by Degrees of Change has grown, the program has had to look for approaches that scale efficiently while still delivering the timely, personal communications that make the program effective. Like other NCAN-member programs, Degrees of Change identified text message-based “nudging” as one such approach. Texting students is not enough on its own, though; the program also needed a quick, automated way to collect and analyze student satisfaction data.

From the college access literature and their own work, Degrees of Change staff knew that the first-year student experience has its ups and downs as students acclimate to the culture shock of being on campus. This is perhaps especially true for first-generation and low-income students, and the staff wanted a way to track this experience and intervene with students in distress. To do this, they developed a pilot that would test their single-question, two-touch "weekly pulse check" surveys over the course of an academic year. They sampled 39 scholars from seven communities and 12 colleges for the pilot. The prototype they used mirrored the second model from the aforementioned blog post. In short:

  1. Degrees of Change uses Mailchimp to email students a survey;
  2. Students open the mobile-friendly survey, which is pre-filled with their name and other key information;
  3. Respondents' answers are recorded in a central student database via Zapier, while non-respondents are flagged (also via Zapier) and automatically re-sent the survey through Mailchimp (a minimal sketch of this bookkeeping follows the list);
  4. Program staff receive automatic email notifications through FormSite (the survey application) if one of their students indicates a low satisfaction rating. This email notification includes the student’s survey response details and their contact information to help ensure quick follow-up;
  5. Students’ response data refreshes automatically into reporting dashboards in Microsoft Power BI, where staff can review individual responses and spot low response rates.
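
The presentation described this workflow in terms of the connected services rather than code, but the bookkeeping those services automate is simple. Below is a minimal Python sketch of step 3's logic, using a plain in-memory list as a stand-in for the central student database; the field names and roster are hypothetical, not Degrees of Change's actual schema.

```python
from datetime import date

# Hypothetical stand-in for the central student database.
student_database = []

# Placeholder roster of scholars in the pilot sample.
pilot_roster = {"scholar_a", "scholar_b", "scholar_c"}

def record_response(student_id, rating, week):
    """Record one pulse-check response, tagged with the week number."""
    student_database.append({
        "student_id": student_id,
        "rating": rating,          # 1 (terrible) through 5 (exceptional)
        "week": week,
        "recorded_on": date.today().isoformat(),
    })

def non_respondents(week):
    """Scholars with no recorded response for the given week; in the real
    workflow these are the students flagged for a second send."""
    responded = {row["student_id"] for row in student_database
                 if row["week"] == week}
    return pilot_roster - responded

# Example: two of three scholars respond in week 5.
record_response("scholar_a", rating=4, week=5)
record_response("scholar_b", rating=2, week=5)
print(non_respondents(week=5))  # {'scholar_c'} would be re-sent the survey
```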

From a technical standpoint, programs can use the same survey each week, but the week number or date field must be changed each week so that when the data are added to the central student database there is a longitudinal variable by which to disaggregate. 
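
To make that concrete, here is a short pandas sketch with made-up data; the column names are assumptions for illustration, but they show how a week field lets the combined responses be disaggregated longitudinally.

```python
import pandas as pd

# Hypothetical export from the central student database. The "week"
# column is the field that must change with each weekly send.
responses = pd.DataFrame({
    "student_id": ["a", "b", "c", "a", "b"],
    "week":       [1,   1,   1,   2,   2],
    "rating":     [4,   3,   5,   4,   2],
})

# With the week field in place, weekly means and response counts
# fall out of a simple groupby.
print(responses.groupby("week")["rating"].mean())
print(responses.groupby("week")["student_id"].nunique())
```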

The pulse checks ran for 23 weeks. Each Monday at 8 a.m. PST, Degrees of Change sent the survey to the pilot sample. The only question on the survey was, "How would you describe your overall college experience this week?" Responses used a 1-to-5 scale: terrible, poor, mediocre, good, and exceptional. Over the course of the pilot, Degrees of Change found:

  • a 91% overall response rate
  • a 76% same-day response rate
  • a 21% same-hour response rate
  • 82% responded between 8 a.m. and 4 p.m. (local), including:
    • 52% responded between 8 a.m. and 12 p.m. (local)
    • 30% responded between 12 p.m. and 4 p.m. (local)
  • 69% responded on mobile devices

The average response was 4.00 (i.e., good), and 79 percent of responses were positive (good or exceptional). Seventeen percent were moderate (mediocre), and 5 percent were negative (poor or terrible). Weekly response means ranged from a low of 3.19 (the week after the U.S. presidential election) to a high of 4.38 (the week of winter break, after final exams). The remaining 21 weeks had mean responses between 3.62 and 4.28. The chart below shows the average response over the course of the study period.

In terms of response categories, scholars most frequently used "exceptional," "good," and "mediocre," but 56 percent of scholars used "poor" or "terrible" at least once. The table below has more details. The "number of weeks used" column shows a range of the number of weeks the response category was used by a given student (e.g., the student who responded "good" least frequently did so for three weeks and the student who responded "good" most frequently did so for 22 weeks).

[Table: Response Category | % of Scholars Using | Range of # of Weeks Used]
Thirty-nine percent of scholars used three response categories during the pilot, and 33 percent used four. There was a steep drop-off after that, with two categories (13%), five categories (10%), and one category (5%) rounding out the distribution. This shows that most students had at least some fluctuation in how their college experience was going during the pilot.

Overall, Degrees of Change found that, for students in the pilot study, the first year of college got better over time. Male students’ mean response was 4.34, while female students' was 3.84, a statistically significant difference of 0.5. The chart below shows mean responses by year at school; these differences were also statistically significant, with third-years having the lowest mean response and second- and fifth-year students tying for the highest.
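
The post and presentation do not say which statistical test was used for these comparisons. As one plausible approach, an independent-samples t-test on per-student mean ratings might look like the sketch below; the numbers are entirely invented for illustration and are not the pilot's data.

```python
from scipy import stats

# Invented per-student mean ratings, for illustration only.
male_means = [4.5, 4.2, 4.4, 4.3, 4.1]
female_means = [3.9, 3.7, 4.0, 3.8, 3.6]

# Welch's t-test (unequal variances); one plausible choice of test,
# not necessarily the one Degrees of Change used.
t_stat, p_value = stats.ttest_ind(male_means, female_means, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```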

This pilot has multiple implications for program practice. Knowing when students are most likely to be struggling helps programs plan interventions (increased contact from advisors, care packages, supportive texts, etc.). The pilot incorporated Degrees of Change’s normal intervention tied to the "pulse check": students who responded with less than a four were flagged for follow-up with an advisor. This helps Degrees of Change get ahead of issues that can derail a student’s postsecondary experience.
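
In code terms, that intervention rule is a simple threshold filter. A minimal sketch, assuming a hypothetical list-of-dicts structure for each week's responses:

```python
FOLLOW_UP_THRESHOLD = 4  # per the pilot: responses below a 4 ("good") trigger outreach

def flag_for_follow_up(week, responses):
    """Return the responses an advisor should follow up on for a given week.

    `responses` is assumed to be a list of dicts with 'student_id',
    'rating', and 'week' keys; the structure is hypothetical.
    """
    return [r for r in responses
            if r["week"] == week and r["rating"] < FOLLOW_UP_THRESHOLD]

# Example: scholar_b's rating of 2 gets flagged for advisor follow-up.
week_five = [
    {"student_id": "scholar_a", "rating": 4, "week": 5},
    {"student_id": "scholar_b", "rating": 2, "week": 5},
]
print(flag_for_follow_up(5, week_five))
```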

The survey also does not have to be used exclusively for success services. For example, access programs could adapt this kind of weekly survey using the question, "How confident are you about going to college next fall?" This could help flag students who need an advisor to intervene and allay their concerns about matriculating.

Degrees of Change staff are interested in forming a working group to continue and broaden their investigation of the first-year experience and the use of this kind of automated pulse check surveying. If your organization is interested or you would like more information about how to set up this kind of system, please contact me. Thank you to Drs. Herron and Bay-Meyer and to Degrees of Change for continuing to share their expertise with the NCAN membership and the college access and success field!
