
In our previous blog post, you learned why customer satisfaction is essential in school nutrition. Now that you’re convinced of the benefits and ready to launch a customer satisfaction survey for your program, the next step is to design the survey.
In this second post, we’ll do a top-level review of how to design a customer satisfaction survey to achieve a reasonable response rate and gather answers that’ll help you develop effective action plans.
Why Customer Satisfaction Survey Design Matters
A survey is more than asking a series of questions.
Like you, the stakeholders in your school nutrition program already have a lot of demands on them. Completing a survey will be seen as one more demand, not a priority. Even if stakeholders start the survey, they will quit if they find the questions confusing, boring, or irrelevant. Every word, format, and layout decision affects the outcome.
A well-designed customer satisfaction survey can improve response rates and get you the data you need to make good decisions. For example, a survey can uncover hidden barriers to participation, identify preferences within USDA guidelines, and create opportunities for stakeholder engagement.
Let’s break down the elements of a successful survey.
Step 1: Define the Purpose of Your Customer Satisfaction Survey

Before writing any questions, clarify your goal.
Select one goal only; a single focus ensures your customer satisfaction survey reaches the right audience, asks appropriate questions, and stays a manageable length.
For example, are you trying to:
- Understand why students are or are not participating in the meals program?
- Get feedback on specific menu items from high school students?
- Assess staff satisfaction with dining services?
- Learn how parents view school meals?
Your purpose will shape:
- Who you survey
- What you ask (and how you ask it)
- How you analyze the results
Without a clearly defined purpose, surveys often drift, frustrating respondents (who then quit the survey) and leaving you with useless or conflicting data.
Step 2: Choose the Right Audience
Your customers vary. Depending on the goal established for your customer satisfaction survey, your audience may include:
- Students (elementary, middle school, high school)
- Teachers and staff
- Parents and caregivers
- Administrators
To be successful, you’ll want to tailor your language and questions to fit each group. For example, simple language and short questions will be essential for elementary school students.
If parent and caregiver participation in the meal program is low, you may ask about their perceptions of school meals rather than the actual taste of the food (as they may not have eaten a school meal).
A one-size-fits-all approach is not going to succeed.

Customer Satisfaction Survey Sampling
Because of time and resource constraints, you usually don’t survey every possible member of your chosen audience (called your population). Instead, you survey a smaller group drawn from that population, which is called a sample.
There are two major methods of sampling:
- Probability sampling: Every subject in the population of interest has an equal chance of being selected for the sample, which makes the sample likely to be representative of the population.
- Example: Of all the students who participated in the lunch program, you pull the name of every tenth student, and those students complete the survey (a quick way to script this selection is sketched at the end of this section). Assuming a large enough sample size, your survey results would be generalizable to all students who take part in the lunch program.
- Non-probability sampling: Not every subject in the population has an equal chance of being selected because a non-systematic process is used to create the sample, so you cannot say the survey results are generalizable to all subjects.
- Example: You decide to survey students who are eating a school meal and are willing to complete the survey. This group is called a convenience sample because participants were selected based on their availability. Your results would accurately reflect what that group of students thought. They would not necessarily be indicative of all students who eat a school meal.
There are several different types of probability and non-probability sampling, which you can read about in the article “Sampling Methods in Research Methodology: How to Choose a Sampling Technique for Research.”
You will most likely be using a convenience sample when you survey your audience, so remember that the results are not necessarily representative of everyone in your chosen audience.
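If you do want a probability sample and your lunch-program roster lives in a spreadsheet or list export, the every-tenth-student selection described above is easy to script. Here is a minimal Python sketch, assuming a hypothetical roster list of participating students; the random starting point keeps the selection from always beginning with the same name.

```python
import random

def every_tenth_student(roster, interval=10):
    """Systematic sample: start at a random point, then take every
    `interval`-th name from the participation roster."""
    start = random.randrange(interval)  # random start keeps the selection fair
    return roster[start::interval]

# Hypothetical roster of lunch-program participants
roster = [f"Student {n}" for n in range(1, 501)]
sample = every_tenth_student(roster)
print(f"Selected {len(sample)} of {len(roster)} students for the survey")
```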
Step 3: Keep It Short and Focused
SurveyMonkey took a random sample of 100,000 surveys and reported that when surveys took more than 7-8 minutes to complete, the rate of completion dropped anywhere from 5% to 20%.
In your customer satisfaction survey, aim for:
- 5–10 questions maximum
- Clear, direct language
- A completion time under 5 minutes for students, under 10 minutes for adults
Skip Logic
Use skip logic in your customer satisfaction survey to avoid asking irrelevant questions. This method involves asking a qualifying question and then, based on the respondent’s answer, directing them to a specific set of questions. Most online survey platforms offer skip logic options.
With a good layout and clear directions, you can use skip logic with paper surveys, too.
For example, if you were surveying parents, you might ask them if they have eaten in the cafeteria in the last six months.
- If they said “yes,” then you would have them answer a series of questions about their meal.
- If they said “no,” then you wouldn’t want to ask them about a meal they didn’t have. Instead, you may ask them about what stops them from eating in the cafeteria or what their thoughts are about the meal program.
Using skip logic keeps the survey focused and relevant to each respondent.
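Most survey platforms set up skip logic through their question settings rather than through code, but the underlying branching is simple. Here is a minimal Python sketch of the parent example above, using hypothetical question sets, just to show how a single qualifying answer routes respondents to different questions.

```python
# Hypothetical question banks for the parent survey example
MEAL_QUESTIONS = [
    "How would you rate the taste of your most recent cafeteria meal?",
    "How would you rate the speed of the serving line?",
]
BARRIER_QUESTIONS = [
    "What keeps you from eating in the cafeteria?",
    "What are your overall thoughts about the meal program?",
]

def next_questions(ate_in_cafeteria: bool) -> list[str]:
    """Skip logic: route the respondent based on the qualifying question."""
    return MEAL_QUESTIONS if ate_in_cafeteria else BARRIER_QUESTIONS

# A parent who answers "no" never sees questions about a meal they did not eat
print(next_questions(ate_in_cafeteria=False))
```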
Step 4: Select Effective Question Types
Most school nutrition customer satisfaction surveys should use a mix of these question types:
1. Likert Scale (Rating Scale)
Example: “How satisfied are you with the taste of school lunches?”
- Very dissatisfied / Dissatisfied / Neutral / Satisfied / Very Satisfied
Best Practices:
- Use odd-numbered scales (e.g., 5-point or 7-point) to include a neutral option.
- Align scales from left (low) to right (high) to minimize order bias. In other words, make sure you earn the higher rating rather than having the respondent choose it simply because it’s the first choice.
- Use clearly labeled points (not just numbers) unless you are using a specialty scale that requires only numbers or percentages.
2. Multiple Choice
Example: “What factors influence your decision to eat school lunch?”
- Taste
- Time
- Cost
- Friends
- Nothing / I always bring lunch
Best Practices:
- Allow multiple selections if applicable.
- Include an “Other” option with a write-in field.

3. Yes/No or True/False
Example: “Do you know where to find the school lunch menu?”
Best Practices:
- Use sparingly; these provide limited nuance.
4. Open-Ended Questions
Example: “What is one thing you would change about school meals?”
Best Practices:
- Use at the end of the survey.
- Keep to one or two maximum.
Don’t make the mistake of reserving open-ended questions for older students and adults.
In our previous taste-testing research, the elementary students often gave detailed, helpful (and humorous!) answers to open-ended questions.
Step 5: Avoid Common Customer Satisfaction Survey Mistakes
Even a few poor design choices can bias your results. Pew Research offers a more comprehensive video on survey question wording.
But here are a few mistakes to watch out for:
1. Leading Questions
Avoid: “How much do you love our new pizza option?”
Better: “How would you rate our new pizza option?”
2. Double-Barreled Questions
Avoid: “How satisfied are you with the speed and friendliness of the service?”
Better: Ask two separate questions: one about speed, one about friendliness.
3. Overuse of Jargon
Avoid: “Are you aware of the CEP program benefits at your site?”
Better: “Do you know if your school offers free meals to all students?”
4. Too Many Open-Ended Questions
One or two is plenty—respondents may skip them otherwise.
Step 6: Format for Clarity
Whether your survey is digital or paper-based, formatting affects readability. Consider:
- Logical groupings (e.g., food quality questions together)
- Consistent response scale alignment
- Minimal scrolling (digital)
- Large fonts and white space (paper)
- Bolded questions and instructions for tricky formats (e.g., “Select all that apply”)
For younger participants, have someone available to read the questions to them.
Step 7: Test Your Customer Satisfaction Survey Before Launching
Always test your survey with a few members of your intended audience:
- Does it make sense to them?
- Are any questions confusing or irrelevant?
- How long does it take?
You can also test response formats: Do most people select the first option listed? If so, it may suggest bias based on layout.
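If you pilot the survey digitally, a quick tally can flag this pattern. Here is a minimal Python sketch, assuming a hypothetical list of pilot answers to one multiple-choice question; it reports how often the first-listed option was chosen.

```python
def first_option_share(responses, options):
    """Fraction of pilot respondents who chose the first-listed option."""
    first = options[0]
    return sum(1 for r in responses if r == first) / len(responses)

# Hypothetical pilot data for one multiple-choice question
options = ["Taste", "Time", "Cost", "Friends"]
responses = ["Taste", "Taste", "Cost", "Taste", "Time", "Taste", "Taste"]

share = first_option_share(responses, options)
if share > 0.5:  # rough threshold; adjust for your pilot size
    print(f"{share:.0%} chose the first option listed - this may suggest layout bias")
```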
Step 8: Choose the Right Platform
There are many low-cost or free platforms available to launch your customer satisfaction survey. Some top options include:
- Google Forms (free, easy to use, integrates with Google Sheets)
- SurveyMonkey (has free tier, more formatting options)
- Microsoft Forms (integrates well with school-based Microsoft platforms)
- Paper surveys (still useful for young students or families without internet access)
Choose the platform that best fits your audience’s access and comfort level.
Step 9: Maximize Your Response Rate
A well-designed customer satisfaction survey is of no use if no one responds. To improve participation:
- Explain the purpose upfront and tell them how the data will be used, e.g., “This survey is about the school lunch menu and will be used to improve your school lunch experience.”
- Keep it anonymous unless you need follow-up info. Don’t use identifiers on the survey, such as names or student identification numbers.
- Time the survey wisely, choosing a window when participants have enough time to complete it (e.g., during homeroom, a staff meeting, or back-to-school night).
- Promote the survey ahead of time through multiple channels so your audience knows it is coming.
- Have reasonable expectations about response rates. Getting your audience to complete surveys is challenging. According to a meta-analysis on response rates of online surveys in published research, the average online survey response rate was 44%. Sending the survey to more participants or offering an incentive did not guarantee a higher response rate.
- Use incentives if you have the resources, but remember they don’t always work.
Equity Considerations
When designing customer satisfaction surveys, be mindful of inclusivity and accessibility:
- Translate surveys into common home languages.
- Use plain language. Check the reading level of your survey to ensure it fits the audience (a quick way to do this is sketched below).
- Consider accessibility needs (screen reader compatibility, high contrast, large font).
- Avoid assuming cultural preferences.
Creating a welcoming tone and inclusive format ensures more honest and representative data.
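One quick way to check reading level, as mentioned above, is with an automated readability score. Here is a minimal Python sketch using the third-party textstat package (an assumption about tooling; any readability checker or word-processor readability tool works just as well) to estimate the grade level of your survey wording.

```python
# Requires the third-party textstat package: pip install textstat
import textstat

survey_text = (
    "How satisfied are you with the taste of school lunches? "
    "Do you know where to find the school lunch menu?"
)

# Flesch-Kincaid grade roughly corresponds to a U.S. school grade level
grade = textstat.flesch_kincaid_grade(survey_text)
print(f"Estimated reading grade level: {grade}")
if grade > 5:  # rough target for an elementary audience; adjust as needed
    print("Consider simplifying the wording for younger students.")
```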
Remember the key takeaways:
- A strong customer satisfaction survey begins with a clear purpose and a targeted audience.
- Use short, focused questions that are easy to understand.
- Favor structured formats (rating scales, multiple choice) over too many open-ended questions.
- Layout, logic, and clarity all impact response quality.
- Test your survey and promote it strategically to get better results.
With thoughtful planning and research-backed design, your school nutrition customer satisfaction survey can be a powerful tool for program improvement.