Supplement: How to Design Surveys, Interviews, and Focus Groups

This supplement summarizes how to design survey, interview, and focus group questions. It includes guidance on both open-ended questions, in which the respondent forms their own answer, and closed-ended questions, in which the respondent selects from a set of answer choices (e.g., yes/no or true/false questions, rating scales, or choose-all-that-apply checklists), and it includes suggestions for question starters.

Open-ended questions yield qualitative data, which can be coded into themes and then treated quantitatively; they take more time to answer and to analyze but may provide richer insight into respondents’ thinking. Closed-ended question responses can be treated categorically or numerically, so they can be considered quantitative data; they are relatively quick to answer and to analyze but offer more limited information. It is often best to use a mix of the two to balance quality and efficiency. See the section on How to Select and Use Various Assessment Methods in Your Program for more effective practices on how to use surveys, interviews, and focus groups.

If possible, partner with experts outside your department, such as the leaders of relevant campus offices (e.g., your office of institutional research, campus climate office, office of equity and inclusion, or human resources office) or an external consulting firm to develop, administer, and analyze surveys, interviews, and focus groups. Recognize that trained outside experts are likely to be more effective than department members for gaining respondents’ trust, ensuring safety and anonymity, and collecting useful feedback while avoiding the potential for retaliation or other negative consequences. However, recognize that non-physicists may need guidance on understanding and exploring physics-department-specific issues or challenges.

General advice for writing questions

Tip #1: Determine what you need to know and how it will be used

Drafting good questions starts with your goals. As you write questions, ask yourself:

  • What exactly do we need to know? Write down a few high-level questions you want to answer.
  • What will we do with the information?
  • Do we know how our actions will depend on responses to this question? In other words, if respondents answer A versus B, what will that tell you? If the question won’t give you actionable information, reword or eliminate it.

Choose among possible questions based on what you want to address. Do you want to assess students’ satisfaction with advising, their attendance at advising, or whether advising supported their career decision making? Each area of focus requires different types of questions.

Tip #2: Stick to relevant questions

  • Ask questions that the respondent can answer based on their personal experience or opinions.
  • For each question, ask yourself, “Is this something we are just curious about? Or do we need to know this?” If it is not essential, eliminate it.

Tip #3: Write specific questions

  • Err on the side of being too precise in question wording, especially on a survey, to ensure useful responses. For example, if you want to know if homework is too easy or too hard, ask that; don’t ask if respondents feel the homework level is “appropriate.”
  • A common mistake is to ask if X (e.g., a particular course) is “important” or “essential.” Ask instead if X is important or essential for a particular purpose, e.g., for respondents’ career goals.
  • Define your terms and clarify details. For example, if asking about introductory courses, define “introductory course.” If you want to know about experiences “in the past year,” clarify whether you mean in the past 12 months or the last calendar year.

Tip #4: Write clear, neutral questions

  • Make your questions as neutral as possible. Avoid leading questions. For example, “What did you like about X?” assumes there is something they liked.
  • Use simple and concrete language.
  • Draw from examples. Do an internet search for questions on the topic at hand.
  • Ask only one question at a time; don’t combine multiple questions into one. For example, do not ask respondents to list strengths and weaknesses in a single question.
  • Always show potential questions to someone else. Good sources of feedback include survey professionals, possible respondents, and colleagues with social science expertise.
  • Consider doing a pilot interview or survey to gauge how questions are received. Make appropriate revisions to the questions based on what you learn.

Tip #5: Pay attention to question ordering

  • Put the most important survey questions near the beginning, because many respondents drop out part way through.
  • To the extent possible, put sensitive questions toward the end, to give participants or respondents time to become comfortable and familiar with the context of the interview or survey.
  • Always finish with an open-ended question such as, “Is there anything else you would like to tell me?”

Tip #6 (For surveys): Use appropriate answer choices and scales

  • When asking for a value, offer answer choices with ranges (e.g., 0–5 years, 6–10 years, and 11+ years) rather than specific values, if possible. This makes survey questions easier to answer and to analyze (see the sketch after this list). Include a choice of “other” with a space to write in an alternate answer for respondents whose answers don’t fit into the choices offered.
  • Include all reasonable possible answer choices in closed-ended questions, and make sure the choices are mutually exclusive.
  • Use data from an open-ended question to later create answer choices for a closed-ended version of the question.
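To illustrate why range-based answer choices are easy to analyze, here is a minimal Python sketch (the responses and the handling of “Other” write-ins are hypothetical) that tallies responses to the years-of-experience example above:

```python
from collections import Counter

# Hypothetical responses to: "How many years of teaching experience do you have?
# 0-5 years / 6-10 years / 11+ years / Other (please specify)"
responses = [
    "0-5 years", "6-10 years", "0-5 years", "11+ years",
    "Other: on and off for two decades", "0-5 years",
]

# Collapse write-ins into a single "Other" category for the frequency table.
counts = Counter("Other" if r.startswith("Other") else r for r in responses)

total = len(responses)
for choice in ["0-5 years", "6-10 years", "11+ years", "Other"]:
    n = counts.get(choice, 0)
    print(f"{choice}: {n} of {total} ({n / total:.0%})")
```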

Tip #7 (For surveys): Draw the reader’s attention appropriately

  • Don’t rely on respondents reading long explanations or introductions. Use one-line instructions.
  • Bold essential terms to draw attention to them. Do not use italics, as these are harder to read.

Tip #8 (For surveys): Keep it short

  • Clearly mark optional questions as “(optional)” so that respondents can skip them, reducing the effective length of the survey. If a question is optional, consider whether it is needed at all.
  • Use open-ended questions sparingly; include preferably one or two, and definitely no more than four, per survey. Closed-ended questions are easier to analyze and reduce survey length. Save open-ended questions for information that is very valuable and not easily obtained via a multiple-choice scale (a sketch of how coded open-ended responses can be tallied follows this list).
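As a sketch of part of what analyzing open-ended responses involves, here is a minimal Python example (the themes and coded responses are hypothetical) that tallies themes after each open-ended response has been read and assigned one or more codes:

```python
from collections import Counter

# Hypothetical coding of open-ended responses to "Is there anything else you
# would like to tell me?": each response has been assigned one or more themes.
coded_responses = [
    ["advising", "scheduling"],
    ["career planning"],
    ["advising"],
    ["scheduling", "career planning"],
    ["advising", "career planning"],
]

# Count how many responses mention each theme.
theme_counts = Counter(theme for themes in coded_responses for theme in themes)

total = len(coded_responses)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} of {total} responses ({count / total:.0%})")
```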

Questions and rating scales

Guidelines for demographic questions

  • Include only demographic questions that are relevant and valuable; make them optional or include “prefer not to answer” as an option; and do not ask demographic questions that are likely to compromise the anonymity of respondents (e.g., very specific questions about a demographic group with very few respondents in your sample).
  • Check to see if you already have the demographic information you need from some other source, such as class rosters or your office of institutional research. Because of the potential for harm, and to shorten the survey, avoid asking questions whose answers you can find elsewhere.
  • Consult with your office of institutional research and/or your office of equity and inclusion for guidance on wording for questions about, e.g., gender, ethnicity, race, and disability status. Use care when designing such questions to avoid harm. For example, use inclusive options for gender, race, and ethnicity data; use self-reported demographic data; include “other” options with write-ins; and don’t lump options together.
  • Ask demographic questions at the end of the survey rather than at the beginning, to avoid triggering stereotype threat (a phenomenon in which members of a marginalized group perform more poorly due to the threat of a negative stereotype about their abilities in the area being assessed). For an interview, you can send a quick written survey about demographics after the interview. If you are able to build trust during an interview, you can ask a very general question at the end of the interview such as “We’d like to honor and acknowledge the experience of people from different backgrounds. Could you tell me about your gender and ethnicity, and/or any other facets of your identity that you’d like to share?”
  • Disaggregate data by demographics, major, or other critical subgroupings to understand the experiences of different groups (see the sketch after this list). However, if disaggregating will compromise anonymity (e.g., when there are very few members of a particular demographic), find ways to aggregate across semesters and/or learn about the experiences of marginalized groups through other methods. See the section on Equity, Diversity, and Inclusion for details.
  • Use demographic data to learn how your department can better support students and other department members from marginalized groups. Reflect on the conclusions you draw based on the results of demographic analysis, and ensure that such analysis is not used to characterize people or groups in ways that perpetuate stereotypes. If you identify achievement gaps between different groups of students, focus on what you can do to benefit students in ways that are empowering, rather than focusing on their deficits. Focus on students’ growth rather than gaps in relation to other students.
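Here is a minimal sketch in Python with pandas (the column names, ratings, and minimum group size are hypothetical) of disaggregating a rating by subgroup while suppressing groups that are too small to report without compromising anonymity:

```python
import pandas as pd

# Hypothetical survey data: one row per respondent.
df = pd.DataFrame({
    "major": ["physics", "physics", "astronomy", "physics", "astronomy",
              "physics", "physics", "physics"],
    # 1-5 rating of "The physics department is a welcoming environment."
    "welcoming": [4, 5, 3, 2, 4, 5, 4, 3],
})

MIN_GROUP_SIZE = 5  # suppress any subgroup smaller than this to protect anonymity

summary = df.groupby("major")["welcoming"].agg(["count", "mean"])
summary.loc[summary["count"] < MIN_GROUP_SIZE, "mean"] = float("nan")  # suppressed

print(summary)
```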

Open-ended question starters for surveys, interviews, or focus groups

These question starters could be used for surveys, interviews, or focus groups. For a survey, they could be turned into closed-ended questions with rating scales. In these starters, “X” should be replaced with the topic of interest. “Department” could also be replaced with a domain of interest (e.g., course, advising meetings, or research group).

  • Tell me about X. (This is a particularly good opener question for most interviews or focus groups; Jacob and Furgerson, 2012).
  • How do you feel about X?
  • How do you feel about your experience with X?
  • How have your feelings about X (or your experiences with X) evolved over time?
  • What is important to you about X?
  • How does X fit with your personal goals and values?
  • How do you do X?
  • Why do you do X?
  • What is best about how X is done?
  • What could be improved about how X is done?
  • What are some of the major challenges in X?
  • What activities in your department are helping to do X?
  • What activities would you like to see in your department to help improve on X?
  • What changes in your department have you noticed regarding X?
  • What do you think is important for us to know about X?
  • Might you like to contribute to X (change effort), and how?
  • If you could remake X from scratch, what are some key features you’d include?

Below are some open-ended questions related to diversity and inclusion in a classroom setting from Hogan and Sathy (2022) that could also be adapted to other settings. These could be used in focus groups or course surveys:

  • In what ways has your instructor demonstrated they care about your learning?
  • Is there content from the assignments or class discussions that has made you feel included or excluded? Explain.
  • Are there teaching approaches that have made you feel included or excluded? Explain.
  • How did the diversity of your class contribute to your learning in this course?
  • How might the class climate be made more inclusive?

Closed-ended question types for surveys

There are a variety of closed-ended question types, such as:

  • Yes or No. Example: “Have you ever enrolled in Physics 101?” Choices: Yes / No.
  • True or False. Example: “I am the first person in my immediate family to attend college.” Choices: True / False.
  • Choose all that apply*. Example: “I have participated in the following department activities this year (choose all that apply).” Choices: a list of possible choices, displayed as checkboxes.
  • Choose the best. Example: “Which of the following was the most valuable aspect of the advising sessions?” Choices: a list of possible choices, displayed as radio buttons.
  • Rating scale / Likert scale. Example: “The physics department is a welcoming environment.” Choices: a list of options, such as strongly agree / somewhat agree / somewhat disagree / strongly disagree / no opinion.
  • Rank order. Example: “Rank the following career seminar topics in order of your level of interest.” Choices: a list of options, such as resume writing / careers in industry / teaching.

*Avoid “choose all that apply” questions unless necessary, as they are more difficult to interpret.
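The following minimal Python sketch (with hypothetical activities and responses) illustrates why “choose all that apply” items are harder to interpret: each option must be tabulated as its own yes/no item, the percentages do not sum to 100%, and an unchecked box is ambiguous.

```python
from collections import Counter

# Hypothetical "choose all that apply" responses: each respondent checked
# zero or more department activities.
responses = [
    {"colloquium", "mentoring program"},
    {"colloquium"},
    set(),                     # checked nothing: skipped, or didn't participate?
    {"outreach", "colloquium"},
]

option_counts = Counter(option for checked in responses for option in checked)

total = len(responses)
for option in ["colloquium", "mentoring program", "outreach"]:
    n = option_counts.get(option, 0)
    print(f"{option}: {n} of {total} respondents ({n / total:.0%})")
```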

Closed-ended question starters for surveys

Here are some possible question starters for formulating

Closed-Ended Question

A survey question in which the respondent must select from a set of answer choices. Also known as a closed-response question or multiple-choice question. Closed-ended questions can include yes/no or true/false questions, rating scales, choose-all-that-apply checklists, or other sets of options. Closed-ended question responses can be treated categorically or numerically, so they can be considered quantitative data. Closed-ended questions are relatively quick to answer and to analyze, but offer limited information compared to open-ended questions. It is often best to use a mix of open-ended and closed-ended questions to provide a balance between quality and efficiency.

questions.

  • To what extent is X valued in this department?
  • To what extent does X help you do Y?
  • During a typical semester how many times do you do X?
  • Which of the following do you see as benefits of X?
  • Rate the performance of X on the following dimensions…
  • How much do you agree that X will affect Y? (Open-ended follow-up: If so, how?)

Rating scales for closed-ended questions for surveys

A rating scale is a set of categories of responses; a survey respondent is asked to indicate where they would rate themselves within those categories. The most common rating scale is a Likert scale, where the survey respondent indicates which option best represents their positive or negative attitude or opinion (e.g., strongly agree / agree / neutral / disagree / strongly disagree). A rating scale is an example of a closed-ended question. The number of choices in a rating scale is the “point” of that scale; for example, the above agree/disagree scale is a 5-point scale.

Tip #1: Choose a scale that fits the question.

Make sure the scale answers the question. For example, the following question and scale fit: “To what extent do you feel the advising sessions helped you decide your course schedule? To no extent / to some extent / to a large extent.” On the other hand, the following question and scale do not fit: “To what extent do you feel the advising sessions helped you decide your course schedule? Definitely not / possibly / probably / definitely.” Note that a scale can be unipolar (e.g., “none” … “extremely”) or bipolar (e.g., “strongly disagree” … “strongly agree”).

Tip #2: Use an appropriate number of answer choices.

When deciding how many answer choices (“scale points”) to provide, consider the balance between making the question easy to answer and making meaningful distinctions. All rating scales can be increased or decreased in their point value; for example, a 5-point agree/disagree scale (strongly disagree / disagree / neutral / agree / strongly agree) can be changed into a 3-point scale by removing “strongly disagree” and “strongly agree”, or into a 7-point scale by adding “slightly disagree” and “slightly agree”. While more points add more precision to the scale, they also take more time and energy for the respondent to answer. Five answer choices is usually a good balance if you don’t need to calculate a numerical score. If you want to calculate a mean or do other statistical analysis on individual questions, use at least seven answer choices, since ordinal data starts to approximate continuous data at seven points and you can start to trust a mean (see the sketch below). However, if a 7-point scale leads to cognitive overload in pilot testing, reduce the number of options.
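As a sketch of the analysis referred to above, here is a minimal Python example (the item and responses are hypothetical) that reports the full response distribution for a 7-point agree/disagree item along with its mean on a 1–7 numeric coding:

```python
from collections import Counter
from statistics import mean

# 7-point agree/disagree scale, coded 1 (strongly disagree) to 7 (strongly agree).
SCALE = ["strongly disagree", "disagree", "slightly disagree", "neutral",
         "slightly agree", "agree", "strongly agree"]
NUMERIC = {label: i + 1 for i, label in enumerate(SCALE)}

# Hypothetical responses to "The advising sessions helped me decide my course schedule."
responses = ["agree", "strongly agree", "neutral", "agree", "slightly disagree",
             "agree", "strongly agree"]

counts = Counter(responses)
for label in SCALE:
    print(f"{label}: {counts.get(label, 0)}")

print("mean (1-7):", round(mean(NUMERIC[r] for r in responses), 2))
```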

Using an odd number of answer choices allows respondents to choose a middle (“neutral”) option, which is often preferred. Including “neutral” responses, as well as responses off the scale such as “not applicable” or “no opinion”, can help prevent respondents from making up a response or getting frustrated that they can’t answer honestly. There are times when it's more appropriate to remove the neutral option and force the respondent to take a side, but do this only if you have a good reason to.

Tip #3: Use matrix questions to show multiple questions with the same scale.

It can be convenient to show several questions with the same rating scale (e.g., a set of five questions all with the agree/disagree set of response options). Such a “matrix question” is also easy for respondents to go through, as they only need to orient to one scale. However, it’s best to include no more than five to seven questions in a matrix scale, to avoid overwhelming the respondent.

Tip #4: Draw from example scales.

Below are several example scales. In some cases, choosing the scale can help to identify the proper question wording to probe the issue at hand. The following list is adapted from Vagias (2006).

  • Strongly disagree / disagree / neutral / agree / strongly agree

  • Not at all / slightly / somewhat / moderately / very

    • This scale is best associated with an adjective from the question, for example, “How concerned are you about your grades in this course? Not at all concerned / slightly concerned / very concerned.”

    • Possible adjectives include concerned, familiar, aware, satisfied, responsible, influential, frustrated, important.

  • Very dissatisfied / dissatisfied / neither satisfied nor dissatisfied / satisfied / very satisfied

  • None / a few / some / most / all

    • This scale is best associated with a noun from the question, such as “No staff / a few staff / some staff / most staff / all staff.”

  • Never / rarely / sometimes / often / always

  • Almost never true / usually not true / seldom true / occasionally true / often true / usually true / almost always true

  • Very untrue of me / untrue of me / neutral / true of me / very true of me

  • Very poor / poor / acceptable / good / very good

  • Not likely / somewhat likely / very likely

  • Definitely won’t / probably won’t / probably will / definitely will

  • To no extent / to little extent / to some extent / to a large extent

  • Less than I would like / about right / more than I would like

  • Very low / below average / average / above average / very high

  • Not a priority / low priority / medium priority / high priority / essential

Even more examples can be found by searching the web for “Likert scale examples.”

Interview or focus group questions

Possible opening questions for interviews:

  • Tell me about your background
  • Tell me about your role

Possible probes for interviews

In addition to questions (“prompts”), you can develop “probes,” which are supplementary questions or responses used to get more details about the topic of a more general question. Probe topics are often listed as bullets under the question and remind the interviewer to check that each topic has been covered by the response to the general question. For example, under the question “Please tell me about your decision to major in physics,” topics for probes could include “parents,” “peers,” and “high school.” In addition to such specific probes, some general probes are often useful to further elucidate an answer:

  • Can you say more?
  • What do you mean by that?
  • How do you know?
  • And that’s because…?
  • Is this working?
  • Why or why not?
  • Have you always felt that way? How has your opinion / experience evolved over time?

Possible closing questions for interviews

  • Is there anything else you would like me to know?
  • How could we make these questions better?

Resources

Interview and focus group resources

Survey resources

  • Pew Research Center, Writing Survey Questions: Provides useful guidance in developing survey questions, choosing among open-ended and closed-ended questions, question wording, and question order.
  • University of Washington Office of Educational Assessment, Tips for Writing Questionnaire Items: A very useful overview of survey concepts.
  • S. B. Robinson and K. F. Leonard, Designing Quality Survey Questions, Sage Publications, Inc. (2019): A comprehensive text about practical survey design, which includes a very good checklist for survey design.
  • The STEM Learning and Research Center (STELAR): Provides a variety of tools to measure aspects of students’ experiences in STEM, such as confidence and interest in STEM or experiences with particular course materials. These tools can be used to inform survey design.
  • See the Resources for the section on Departmental Culture and Climate for examples of climate surveys.

References
