If possible, partner with experts outside your department, such as the leaders of relevant campus offices (e.g., your office of institutional research, campus climate office, office of equity and inclusion, or human resources office) or an external consulting firm to develop, administer, and analyze surveys, interviews, and focus groups. Recognize that trained outside experts are likely to be more effective than department members at gaining respondents’ trust, ensuring their safety and anonymity, and collecting useful feedback while avoiding the potential for retaliation or other negative consequences. However, recognize that non-physicists may need guidance on understanding and exploring physics-department-specific issues or challenges.
General advice for writing questions
Tip #1: Determine what you need to know and how it will be used
Drafting good questions starts with your goals. As you write questions, ask yourself:
What exactly do we need to know? Write down a few high-level questions you want to answer.
What will we do with the information?
Do we know how our actions will depend on responses to this question? In other words, if the answer is A versus B, what will that tell you? If the question won’t give you actionable information, reword or eliminate it.
Choose among possible questions based on what you want to address. Do you want to assess students’ satisfaction with advising, their attendance at advising, or whether advising supported their career decision making? Each area of focus requires different types of questions.
Tip #2: Stick to relevant questions
Ask questions that respondents can answer based on their personal experience or opinions.
For each question, ask yourself, “Is this something we are just curious about? Or do we need to know this?” If it is not essential, eliminate it.
Tip #3: Write specific questions
Err on the side of being too precise in question wording, especially on a survey, to ensure useful responses. For example, if you want to know if homework is too easy or too hard, ask that; don’t ask if respondents feel the homework level is “appropriate.”
A common mistake is to ask if X (e.g., a particular course) is “important” or “essential.” Ask instead if X is important or essential for a particular purpose, e.g., for respondents’ career goals.
Define your terms and clarify details. For example, if asking about introductory courses, define “introductory course.” If you want to know about experiences “in the past year,” clarify whether you mean in the past 12 months or the last calendar year.
Tip #4: Write clear, neutral questions
Make your questions as neutral as possible. Avoid leading questions. For example, “What did you like about X?” assumes there is something they liked.
Use simple and concrete language.
Draw from examples. Do an internet search for questions on the topic at hand.
Ask only one question at a time; don’t combine multiple questions into one. For example, do not ask to list strengths and weaknesses in a single question.
Always show potential questions to someone else. Good sources of feedback include survey professionals, possible respondents, and colleagues with social science expertise.
Consider doing a pilot interview or survey to gauge how questions are received. Make appropriate revisions to the questions based on what you learn.
Tip #5: Pay attention to question ordering
Put the most important survey questions near the beginning, because many respondents drop out partway through.
To the extent possible, put sensitive questions toward the end, to give interviewees or respondents time to become comfortable and familiar with the context of the interview or survey.
Always finish with an open-ended question such as, “Is there anything else you would like to tell me?”
Tip #6 (For surveys): Use appropriate answer choices and scales
When asking for a value, offer answer choices with ranges (e.g., 0–5 years, 6–10 years, and 11+ years) rather than specific values, if possible. This makes survey questions easier to answer and to analyze. Include a choice of “other” with a space to write in an alternate answer for respondents whose answers don’t fit into the choices offered.
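As a minimal sketch of why ranges ease analysis, the helper below (all names and cutoffs are hypothetical, matching the 0–5 / 6–10 / 11+ example above) maps a write-in numeric value onto the offered answer choices, with “other” catching values outside them:

```python
# Hypothetical helper: map a respondent's numeric answer (e.g., years of
# experience) onto the range-based answer choices offered on the survey.
def to_range(years):
    """Return the answer-choice label for a numeric value."""
    if years < 0:
        return "other"       # falls outside the offered choices
    if years <= 5:
        return "0-5 years"
    if years <= 10:
        return "6-10 years"
    return "11+ years"

print(to_range(3))   # 0-5 years
print(to_range(12))  # 11+ years
```

Tallying labels like these is simpler than cleaning and binning free-form numeric write-ins after the fact.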
Include all reasonable possible answer choices in closed-ended questions, and make sure the choices are mutually exclusive.
Use data from an open-ended version of a question to later create answer choices for a closed-ended version of the question.
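One way to turn open-ended pilot data into answer choices is to keep the most frequent pilot responses and add an “other” write-in. A minimal sketch (the pilot answers below are hypothetical and assumed to be lightly normalized, e.g., lowercased and stripped):

```python
from collections import Counter

# Hypothetical free-text answers from an open-ended pilot question,
# e.g., "What department resources have helped you most?"
pilot_answers = [
    "office hours", "office hours", "tutoring center", "study groups",
    "tutoring center", "office hours", "peer mentoring", "study groups",
]

# Keep the most common pilot answers as the closed-ended choices,
# and add "other" with a write-in space for everything else.
top = [answer for answer, _ in Counter(pilot_answers).most_common(3)]
choices = top + ["other (please specify)"]
print(choices)
```

This keeps the closed-ended choices grounded in language respondents actually used, rather than in the survey writer’s guesses.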
Tip #7 (For surveys): Draw the reader’s attention appropriately
Don’t rely on respondents reading long explanations or introductions. Use one-line instructions.
Bold essential terms to draw attention to them. Do not use italics, as these are harder to read.
Tip #8 (For surveys): Keep it short
Clearly mark optional questions as “(optional)” to reduce perceived survey length. If a question is optional, consider whether it is needed at all.
Use open-ended questions sparingly; include preferably one or two, and definitely no more than four, per survey. Closed-ended questions are easier to analyze and reduce survey length. Save open-ended questions for information that is very valuable and not easily obtained via a multiple-choice scale.
Questions and rating scales
Guidelines for demographic questions
Include only demographic questions that are relevant and valuable; make them optional or include “prefer not to answer” as an option; and do not ask demographic questions that are likely to compromise the anonymity of respondents (e.g., very specific questions about a demographic group with very few respondents in your sample).
Check to see if you already have the demographic information you need from some other source, such as class rosters or your office of institutional research. Because of the potential for harm, and to shorten the survey, avoid asking questions whose answers you can find elsewhere.
Consult with your office of institutional research and/or your office of equity and inclusion for guidance on wording for questions about, e.g., gender, ethnicity, race, and disability status. Use care when designing such questions to avoid harm. For example, use inclusive options for gender, race, and ethnicity data; use self-reported demographic data; include “other” options with write-ins; and don’t lump options together.
Ask demographic questions at the end of the survey rather than at the beginning, to avoid triggering stereotype threat. For an interview, you can send a quick written survey about demographics after the interview. If you are able to build trust during an interview, you can ask a very general question at the end of the interview such as “We’d like to honor and acknowledge the experience of people from different backgrounds. Could you tell me about your gender and ethnicity, and/or any other facets of your identity that you’d like to share?”
Disaggregate data by demographics, major, or other critical subgroupings to understand the experiences of different groups. However, if disaggregating will compromise anonymity (e.g., when there are very few members of a particular demographic), find ways to aggregate across semesters and/or learn about the experiences of those individuals through other methods. See the section on Equity, Diversity, and Inclusion for details.
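The disaggregate-but-protect-anonymity pattern can be sketched as follows. This is a minimal illustration, not a complete analysis pipeline; the field names, the sample data, and the minimum group size of 5 are all hypothetical assumptions (choose a threshold appropriate for your context):

```python
# Minimal sketch: disaggregate a yes/no satisfaction item by a demographic
# field, suppressing any subgroup too small to report anonymously.
MIN_GROUP_SIZE = 5  # hypothetical threshold; set per your local guidance

responses = [
    {"gender": "woman", "satisfied": True},
    {"gender": "woman", "satisfied": False},
    # ... more responses ...
]

def disaggregate(responses, field):
    groups = {}
    for r in responses:
        groups.setdefault(r[field], []).append(r["satisfied"])
    report = {}
    for group, vals in groups.items():
        if len(vals) < MIN_GROUP_SIZE:
            report[group] = "suppressed (n too small)"
        else:
            report[group] = f"{sum(vals) / len(vals):.0%} satisfied (n={len(vals)})"
    return report

print(disaggregate(responses, "gender"))
```

Suppressing small cells before anyone sees the report (rather than afterward) is what keeps disaggregation from undermining the anonymity promised to respondents.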
Use demographic data to learn how your department can better support students and other department members from marginalized groups. Reflect on the conclusions you draw based on the results of demographic analysis, and ensure that such analysis is not used to characterize people or groups in ways that perpetuate stereotypes. If you identify achievement gaps between different groups of students, focus on what you can do to benefit students in ways that are empowering, rather than focusing on their deficits. Focus on students’ growth rather than gaps in relation to other students.
Open-ended question starters for surveys, interviews, or focus groups
These question starters could be used for surveys, interviews, or focus groups. For a survey, they could be turned into closed-ended questions with rating scales. In these starters, “X” should be replaced with the topic of interest. “Department” could also be replaced with a domain of interest (e.g., course, advising meetings, or research group).
Tell me about X. (This is a particularly good opener question for most interviews or focus groups; Jacob and Furgerson, 2012).
How do you feel about X?
How do you feel about your experience with X?
How have your feelings about X (or your experiences with X) evolved over time?
What is important to you about X?
How does X fit with your personal goals and values?
How do you do X?
Why do you do X?
What is best about how X is done?
What could be improved about how X is done?
What are some of the major challenges in X?
What activities in your department are helping to do X?
What activities would you like to see in your department to help improve on X?
What changes in your department have you noticed regarding X?
What do you think is important for us to know about X?
Might you like to contribute to X (change effort), and how?
If you could remake X from scratch, what are some key features you’d include?
Below are some open-ended questions related to diversity and inclusion in a classroom setting from Hogan and Sathy (2022) that could also be adapted to other settings. These could be used in focus groups or course surveys:
In what ways has your instructor demonstrated they care about your learning?
Is there content from the assignments or class discussions that has made you feel included or excluded? Explain.
Are there teaching approaches that have made you feel included or excluded? Explain.
How did the diversity of your class contribute to your learning in this course?
How might the class climate be made more inclusive?
Closed-ended question types for surveys
There are a variety of closed-ended question types, such as:
Yes or No
Have you ever enrolled in Physics 101?
True or False
I am the first person in my immediate family to attend college.
True / False
Choose all that apply*
I have participated in the following department activities this year (choose all that apply):
List of possible choices, displayed as checkboxes
Choose the best
Which of the following was the most valuable aspect of the advising sessions?
List of possible choices, displayed as radio buttons
Rating scale / Likert scale
The physics department is a welcoming environment.
List of options, such as: strongly agree / somewhat agree / somewhat disagree / strongly disagree / no opinion
Ranking
Rank the following career seminar topics in order of your level of interest.
List of options, such as: resume writing / careers in industry / teaching
*Avoid “choose all that apply” questions unless necessary, as they are more difficult to interpret.
Closed-ended question starters for surveys
Here are some possible question starters for formulating closed-ended survey questions:
To what extent is X valued in this department?
To what extent does X help you do Y?
During a typical semester how many times do you do X?
Which of the following do you see as benefits of X?
Rate the performance of X on the following dimensions…
How much do you agree that X will affect Y? (Possible open-ended follow-up: If so, how?)
Rating scales for closed-ended questions for surveys
A rating scale is a set of categories of responses; a survey respondent is asked to indicate where they would rate themselves within those categories. The most common rating scale is a Likert scale, where the survey respondent indicates which option best represents their positive or negative attitude or opinion (e.g., strongly agree / agree / neutral / disagree / strongly disagree). A rating scale is an example of a closed-ended question. The number of choices in a rating scale is the “point” of that scale; for example, the above agree/disagree scale is a 5-point scale.
Tip #1: Choose a scale that fits the question.
Make sure the scale answers the question. For example, the following question and scale fit: “To what extent do you feel the advising sessions helped you decide your course schedule? To no extent / to some extent / to a large extent.” On the other hand, the following question and scale do not fit: “To what extent do you feel the advising sessions helped you decide your course schedule? Definitely not / Possibly / Probably / Definitely.” Note that a scale can be unipolar (e.g., “none” … “extremely”) or bipolar (e.g., “strongly disagree” … “strongly agree”).
Tip #2: Use an appropriate number of answer choices.
When deciding how many answer choices (“scale points”) to provide, consider the balance between making the question easy to answer and making meaningful distinctions. All rating scales can be increased or decreased in their point value; for example, a 5-point agree/disagree scale (strongly disagree / disagree / neutral / agree / strongly agree) can be changed into a 3-point scale by removing “strongly disagree” and “strongly agree” or a 7-point scale by adding “slightly disagree” and “slightly agree”. While more points add more precision to the scale, they also take more time and energy for the respondent to answer. Five answer choices is usually a good balance if you don’t need to calculate a numerical score. If you want to calculate a mean or do other statistical analysis on individual questions, use at least seven answer choices, since ordinal data starts to approximate continuous data at seven points, and you can start to trust a mean. However, if a 7-point scale leads to cognitive overload in pilot testing, reduce the number of options.
Using an odd number of answer choices allows respondents to choose a middle (“neutral”) option, which is often preferred. Including “neutral” responses, as well as responses off the scale such as “not applicable” or “no opinion”, can help prevent respondents from making up a response or getting frustrated that they can’t answer honestly. There are times when it's more appropriate to remove the neutral option and force the respondent to take a side, but do this only if you have a good reason to.
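Scoring a 7-point scale for a mean can be sketched as below. The mapping of labels to 1–7 and the sample responses are hypothetical; the key detail is that off-scale options such as “not applicable” are excluded before averaging rather than being assigned a number:

```python
from statistics import mean

# Hypothetical 7-point agree/disagree scale mapped to numeric scores.
SCALE = {
    "strongly disagree": 1, "disagree": 2, "slightly disagree": 3,
    "neutral": 4, "slightly agree": 5, "agree": 6, "strongly agree": 7,
}

answers = ["agree", "strongly agree", "neutral", "not applicable", "slightly agree"]

# Exclude off-scale responses ("not applicable", "no opinion", blanks)
# before averaging, so they don't distort the score.
scores = [SCALE[a] for a in answers if a in SCALE]
print(f"mean = {mean(scores):.2f} (n={len(scores)})")
```

Reporting the n alongside the mean, as above, signals how many respondents were actually on-scale for that item.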
Tip #3: Use matrix questions to show multiple questions with the same scale.
It can be convenient to show several questions with the same rating scale (e.g., a set of five questions all with the agree/disagree set of response options). Such a “matrix question” is also easy for respondents to go through, as they only need to orient to one scale. However, it’s best to include no more than five to seven questions in a matrix scale, to avoid overwhelming the respondent.
Tip #4: Draw from example scales
Below are several example scales. In some cases, choosing the scale can help to identify the proper question wording to probe the issue at hand. The following list is adapted from Vagias (2006).
Not at all / slightly / somewhat / moderately / very
This scale is best associated with an adjective from the question. For example, “How concerned are you about your grades in this course? Not at all concerned / slightly concerned / very concerned”
Possible adjectives include concerned, familiar, aware, satisfied, responsible, influential, frustrated, important.
Very dissatisfied / dissatisfied / neither satisfied nor dissatisfied / satisfied / very satisfied
None / a few / some / most / all
This scale is best associated with a noun from the question, such as “No staff / a few staff / some staff / most staff / all staff.”
Never / rarely / sometimes / often / always
Almost never true / usually not true / seldom true / occasionally true / often true / usually true / almost always true
Very untrue of me / untrue of me / neutral / true of me / very true of me
Very poor / poor / acceptable / good / very good
Not likely / somewhat likely / very likely
Definitely won’t / probably won’t / probably will / definitely will
To no extent / to little extent / to some extent / to a large extent
Less than I would like / about right / more than I would like
Very low / below average / average / above average / very high
Not a priority / low priority / medium priority / high priority / essential
Even more examples can be found by searching the web for “Likert scale examples.”
Interview or focus group questions
Possible opening questions for interviews:
Tell me about your background
Tell me about your role
Possible probes for interviews
In addition to questions (“prompts”) you can develop “probes,” which are supplementary questions or responses used to get more details about the topic of a more general question. These are often listed as bullets under the question and remind the interviewer to make sure they have been covered by the response to the general question. For example, under a question, “Please tell me about your decision to major in physics,” topics for probes could include “parents,” “peers,” and “high school.” In addition to such specific probes, some general probes are often useful to further elucidate an answer:
Can you say more?
What do you mean by that?
How do you know?
And that’s because…?
Is this working?
Why or why not?
Have you always felt that way? How has your opinion / experience evolved over time?
S. B. Robinson and K. F. Leonard, Designing Quality Survey Questions, Sage Publications, Inc. (2019): A comprehensive text about practical survey design, which includes a very good checklist for survey design.
The STEM Learning and Research Center (STELAR): Provides a variety of tools to measure aspects of students’ experiences in STEM, such as confidence and interest in STEM, or experiences with particular course materials. These tools can be used to inform survey design.