Student Affairs Planning, Assessment & Research
Texas A&M University
Division of Student Affairs


Assessment

Different Year, Different Assessment

September 1, 2020 by Darby

This academic year will be like no other that we have experienced in our lifetimes. So much change, so much still unknown. While it can cause us stress and anxiety, it can also be an opportunity to look at assessment differently. It’s easy to fall into a pattern of program—assessment—decisions—planning—program—assessment….That pattern has been disrupted because we can’t provide our programs and services in the same way, which necessitates us changing how we assess.

As always, it is important to understand who our students are, what they need, what they are learning, and how satisfied they are (to name a few). Right now, when many of our programs and services are being offered remotely, it might be even more important to understand our students' sense of belonging and the quality of those virtual engagements. How are students connecting with their peers, which is especially important for new students? How are students connecting with organization advisors? How do you know students are learning something when you cannot necessarily observe them as frequently?

It’s time to be a little creative in your big questions, your data collection, and your analysis. Take a breath, grab your favorite beverage, and reflect on the following question: What is the most important thing we need to know from the students we serve (and those we don’t serve) about their experience in our program or service during a global pandemic? Don’t rush this process. Jot down some words, draw a picture, think of a theme song….

When you have that question answered, think about how you are going to gather information about what you want to know. In the past, you might have had an in-person audience who could fill out a survey or provide verbal feedback on the spot. Without that, we have to figure out different ways to collect information. Maybe it's using Zoom polls or chats, doing something in the learning management system, having students tweet using a hashtag, or asking them to create a video. I'm not that creative, so I know people can come up with even more interesting ways to collect data.

When you have the data collected, think about the best way to analyze it. You might have done a poll, so you have quantitative data, but you could also have saved the chats which give you qualitative data. A great way to get a fresh perspective is to ask students to help you analyze the data to make meaning out of it. When you have the data analyzed think of creative ways to share it. Can you do something graphically with charts or word clouds?
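If you saved the chats, even a few lines of code can give you a quick word-frequency summary — the raw material for a word cloud or a starting list of themes. Here is a minimal sketch in Python; the sample messages and the tiny stop-word list are invented for illustration, and a real analysis would use a fuller stop-word list:

```python
from collections import Counter
import re

# Illustrative chat messages saved from a virtual session (not real data)
messages = [
    "I loved the breakout rooms",
    "Breakout rooms helped me meet people",
    "More time for questions, please",
]

# Small, illustrative stop-word list; real analyses use much longer ones
stop_words = {"i", "the", "a", "for", "me", "more"}

words = []
for msg in messages:
    # Lowercase and keep only word characters, dropping stop words
    for word in re.findall(r"[a-z']+", msg.lower()):
        if word not in stop_words:
            words.append(word)

# The most frequent words are candidates for a word cloud or theme list
print(Counter(words).most_common(3))
```

The same counts could be fed directly into a word-cloud tool, or used as a first pass before a more careful qualitative coding of the chat transcript.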

Your changes may not be earth shattering, but I hope they are meaningful for you and the students you work with. It’s going to take all of us to get through a pandemic, and I hope we take this opportunity to think about what is really important and how we can be of service to the students we serve on a daily basis.

Filed Under: Assessment

Rapid Data Collection

August 1, 2020 by Darby

Last month I was asked to present at a virtual conference related to COVID-19 with a colleague from NYU. If you know me, you might be saying, "Darby, you are a doctor, but you are not that kind of doctor." So true. The point of the presentation, though, was to talk about the value of rapid data collection in times of crisis or the unknown, when information, decisions, and actions can and do change very quickly.

I thought I would share some of the take-aways from that session, in case you find yourself in a global pandemic again or some other major campus crisis.

Good is better than perfect. (Okay, there is no perfect assessment, as much as we try!) In times of rapid change, we don’t always have time to plan the perfect data collection tool, and we can’t wait until we have all the information that we need to plan. While that doesn’t give you an excuse to launch a really bad survey, it does give you a little latitude to do the best you can with the information you have.

Collaborate. Collaborate. Collaborate. Other people may already have information that you need, so it would be detrimental to re-collect data. It is a waste of your time and annoying to your audience. While everyone may feel pressure to immediately collect data and make decisions in their own silo, there should be some coordinated effort that makes sense and streamlines the assessment and communication process.

Small samples and few questions are okay. Sometimes in assessment and research, we get caught up in having a large, representative sample and a high response rate. Those are great things, but in times of crisis, you may not have that luxury. If you can collect reasonably accurate and representative responses from a small group in a short period of time, that gives you the ability to make a decision, test an action, and determine whether that new action is working. When you re-assess using another small sample, you'll generally know whether the new intervention worked. If it didn't, you can try something else fairly quickly. Think about the minimum data points you need to maximize the information you can use.

Assessment can be low-tech. It could be as easy as tracking the topic of phone calls/emails that you get from students. Within a day, you probably could make a check sheet of common topics. Fairly quickly (within days or a week), you probably could summarize the main concerns and communicate them to the appropriate person/office. When some action is taken (e.g., putting more information on your website), you can determine whether the number of questions about that topic has decreased. If you saw students in person, you could ask them to write main concerns on an index card. If you are calling students, you can also track their concerns/questions.
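That check-sheet idea can stay on paper, but if the tallies end up in a spreadsheet, the before/after comparison is a one-liner per topic. A hypothetical sketch in Python — the topics and counts below are made up for illustration:

```python
# Hypothetical tallies of student calls/emails by topic, one week apart
week_before = {"housing": 14, "refunds": 9, "wifi": 5}
week_after = {"housing": 6, "refunds": 8, "wifi": 5}  # after adding a housing FAQ page

# For each topic, report whether questions decreased after the action was taken
for topic in week_before:
    change = week_after.get(topic, 0) - week_before[topic]
    status = "decreased" if change < 0 else "did not decrease"
    print(f"{topic}: {status} ({change:+d})")
```

The point is not the code — it is that a simple count, repeated after you act, tells you quickly whether the action addressed the concern.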

Equity is still a key component in assessment. In times of crisis, we need to recognize that some groups are impacted more than others. We need to collect data from a variety of populations, while not adding a burden to marginalized populations who are already experiencing greater trauma. In addition, when we are making decisions, we need to include multiple voices at the table. Policy decisions, new procedures, and new requirements can have an inequitable impact, even when we think they might be fair to all (e.g., wearing a face covering, requiring a laptop, etc.).

I hope that gives you some insight about collecting data in these times of rapid change. Student Life Studies is always here to help you in that process. Please let us know how we can help.

Filed Under: Assessment

Engaging Students Throughout the Assessment Process

July 1, 2020 by Darby

As we are gearing up for students to return to campus (in person or virtually), it’s time to start thinking about our program and service delivery. Obviously, I think that assessment should be a part of the planning and decision making process. But, I also strongly advocate for including students in the assessment process, especially if the assessment is about learning and student engagement.

A simple assessment cycle would go something like this: Determine your mission, goals, and outcomes. Plan how to deliver and assess your program/service/experience. Deliver that program. Collect data. Analyze and interpret data. Make a decision about improvements to your program. Tell people about your assessment. Implement those changes. Start the cycle again, being sure to reassess to know whether your changes made a positive difference with your participants. Let’s look at those steps individually to see how you could include student voices in each step.

Determine your mission, goals, and outcomes. Do you ask students about what they want to learn and be able to do following some experience? Do you engage them in updating the mission statement? Do you ask student leaders what they want their organization to accomplish? Do you all agree on what success looks like? How can you include multiple voices that you may not normally hear from?

Plan how to deliver and assess the experience. This gets into the logistics of the process. How would students like to deliver their experience (in person, virtually, hybrid)? How many people will be there? What is the goal of the program? What assessment methods will be used to determine success? Have you asked students how they want to be assessed? If you are planning on teaching effective verbal communication skills to 300 people in a one-hour workshop, you may not have time for each participant to practice a skill and get feedback on their performance. That’s why planning and assessment should be happening together. Have you gotten feedback from your participants, employees, or student leaders about how they could best represent their learning and development? A survey may be easy, but it may not be the best way to measure knowledge and learning.

Deliver the program. Is everyone clear about what the outcome and content are? How are you including students in program delivery? Are you able to use students to engage with other students? (Doing that will also increase the skills of the students who are presenting material.) Have you talked ahead of time about where assessment happens in the program delivery?

Collect data. Hopefully, you asked students ahead of time about how best they can demonstrate their learning. At this point in the cycle, can you engage students in the data collection? Maybe you are using a rubric to measure a leadership skill (verbal communication, event planning, meeting facilitation, etc.). Can the executive student leadership be trained to use the rubric to provide feedback to the next level of leaders? Can your students be assessment cheerleaders in talking to their peers about the importance of participating in the assessment?

Analyze and interpret data. Do you include students in analyzing the data and making meaning of it? Have you scheduled a meeting with students to do that? Do you bring multiple voices to the table to give you a variety of perspectives? This can be particularly useful and insightful if you have qualitative data. Students are good about providing the feedback about the student experience because they are living it.

Make a decision about improvements to your program. Based on what you are hearing from students, what do you need to do to improve? Have you asked students what specific changes they would make based on the peer feedback? What are their change priorities? What options might they think of that have not crossed your mind?

Tell people about your assessment. When you have assessment results and plans for change, do you tell students about it? Do you present to student leaders, event participants, student employees, student advisory groups? Do you post assessment information on your website to be transparent with students? Have you asked student leaders to present information to stakeholders? All of this helps participants know that you listened to their feedback, which may encourage them to participate in future assessment.

Implement those changes. How have you included students in implementing changes? It could be something that changes in the student organization or with a service you provide. Is it something that they can help implement, so they have ownership in the changes? Students can also be cheerleaders here by telling their peers about the positive changes that have been made.

Start the cycle again and reassess. Have you included students in how you will reassess to be sure the changes are working? Have you expressed a desire to have them involved in future changes and assessment?

If you are not incorporating students into your assessment process, you are losing important student voices. Assessment becomes something you do to students, rather than something you do with and for students. People buy into what they help create. Do not underestimate students' ability to engage in every step of the assessment process. They can be really insightful and develop skills they can use after graduation. Student Life Studies is always here to help you in that process. Please let us know how we can help.

Filed Under: Assessment

What Do You Want Your Audience to Know or Do?

June 1, 2020 by Darby

My last blog was about identifying your audience(s) to help prepare you to share assessment results. This time, I would like to address what you want them to know or do. Obviously, that is very dependent on the particular audience you are addressing. Tailoring your message is incredibly important.

Knowledge

Based on your assessment results, you might identify people that have an interest in what you are doing and assessing, but you don’t necessarily need for them to do something (at this point). At the very least, you can probably identify several people you think should know something about your program right now.

You want your audiences to speak knowledgeably, accurately, and positively about what you do. Again, they may want different pieces of information. You may want the department director to know that you came in under budget and that participants/clients/users were satisfied with their experience. You may want the Vice President to know that students learned something specific in your program and positively contributed to student success. The Vice President, in turn, may share that with a new faculty member who wants to know about programs in student affairs. You may want users (and non-users) to know that you listened to their suggestions and changed something about what you do.

Action

On the other hand, you may want some people to take specific action based on your communication of assessment results. Without knowledge of your program, assessment data, and plans to move forward, it may be difficult for people to act.

An obvious action would be to get more financial support for what you want to do. You may be requesting money from donors; they need to know the positive impact you have had on students. Your director may be requesting funding from the Student Affairs Fee Advisory Board (SAFAB). The board needs to know the value and effect on the student body. Because of the SAFAB process, you may want students to make positive comments on the SAFAB website when it is open for feedback on requests. You may need to increase participation fees; you need to let people know what their money will go toward to make it easier to support the change. The more information people have about what you do, the more likely they are to care about your program.

You may also be interested in having more people participate in your programs and services.  How will you get them to take that action? One way is to tell them how attentive you have been to assessment and what changes you have made based on their feedback. This information could be on your website or in other recruitment material that you send out. Another way is to work through past users/participants to spread the word. You could communicate with them through email or social media, encouraging them to tell their friends.

Maybe you want to forge new partnerships to expand or enhance your current program or service. If someone asked you to do more/different work and use already scarce resources, wouldn't you want to know as much as you could about the program? Being able to approach people with accurate, relevant, usable data will build other people's confidence in your program.

Those are only a few reasons to share assessment results. Be sure to keep those reasons, and others, in mind as you undertake an assessment project. Student Life Studies is always here to help you in that process. Please let us know how we can help.

Filed Under: Assessment

Sharing Assessment Results: Who is Your Audience?

May 4, 2020 by Darby

As the academic year winds down, it’s a good time to reflect on what programs have accomplished, what students have learned, and what changes you want to make moving forward. Wrapped up in all of that is how and what you want to share with others about the great things you have accomplished and why they should be invested in what you do.

According to dictionary.com, a stakeholder is “a person or group that has an investment, share, or interest in something, as a business or industry.” In other words, who cares (or should care) about what you do? Take a moment and jot down (even in your head) who has a stake in what you do. I’ll wait….

How many entities did you come up with? One, two, five, ten? You may have listed a combination of these stakeholders: students, potential participants, student leaders, your supervisor, your department head, other staff, parents, donors/sponsors, faculty, administrators, the Vice President’s Office, state legislature, Board of Regents, funding agencies, local business owners, the alumni association…the list goes on. You may have even thought of one or more not on that list.

Because your stakeholders can vary drastically in their need for information, your communication to them about results should also vary. Not everyone is interested in a five-page report with lots of tables and charts and individual quotes. You have lots of options: PowerPoint presentation, infographic, one-page executive summary, social media post, website, full report, newsletter, word cloud, poster, etc. The Vice President may not have time to read an extensive report, but she may be really interested in the impact on student learning and success and/or the cost per student in a one-page executive summary. New student leaders planning this year's program may be interested in what past participants thought went well and what did not, so a word cloud may be applicable. Future participants may look at the program's website to see what the learning outcomes are and how well they have been accomplished in the past. Donors want to see that their money is spent wisely and educating students—they may want to hear individual student voices/quotes talk about the impact of their experiences.

Here are a few pointers to help you decide how to share your news:

  1. Identify and prioritize your stakeholders.
  2. Determine what you want each stakeholder to know, do, or feel.
  3. Decide what information is most pertinent to each stakeholder based on what you want them to know, do, or feel.
  4. Understand the level of complexity and detail that you need to get the information communicated accurately. This will help you choose the best method.
  5. When you have drafted your message, have a trusted colleague give you feedback on your draft.
  6. Determine the best time to communicate results (immediately following a program, before budget decisions are made, leadership transition time).
  7. Reach out to your stakeholders with the information. Offer to meet with them in person, if applicable.
  8. Build this step into your annual planning process.

If you are fortunate enough to have marketing and communications staff available to you, consult with them for additional resources. As always, Student Life Studies is here to help you maximize your assessment. Let us know what you need, so we can better serve you.

Filed Under: Assessment

Words are Hard: Creating Good Survey Questions

March 1, 2020 by Darby

One of the most frequent tasks the staff in Student Life Studies do is helping people design good survey questions. It's a science and an art; it takes practice and an eye for detail. As with many aspects of assessment, you have to have a clear purpose statement in order to craft questions that help you get to what you need to know. In addition, the types of questions you use are dependent on what you want to know.

I'm going to highlight several different question types common in survey design. They each have their uses, advantages, and disadvantages. If you would like more information, see Student Affairs Assessment: Theory to Practice by Henning and Roberts.

The simplest question type is yes/no. That means a condition exists or it doesn’t. Sounds simple, but there may be times when the respondent isn’t sure or doesn’t know the answer, and you have to determine if you need that third option. Sample yes/no questions are: Are you at least 18 years of age? Did you attend the alcohol education workshop on February 1, 2020? Are you registered to vote in Brazos County?

Another type of question is choose one. In this case, you want respondents to pick one answer from a list or series. For example, you could ask respondents to specify which state they currently live in. There would be a list of 50 states (and the District of Columbia) and possibly an "I don't live in the United States" option. It should be fairly easy for the respondent to answer the question. You could also be asking about days of the week, months of the year, how people heard about a particular program, their favorite session of a conference, what residence hall they live in, etc. The order of the responses should make some sort of sense: alphabetical, chronological, etc. Just think, you would not want to see a list of 50+ states in a random order where you had to search for your state. Usually, you want to keep the list fairly manageable, so respondents don't get lost in a long list of items.

A choose all that apply question also provides a list of items, but the respondent has the option to pick more than one answer. When you create the response options, you want to keep them to a reasonable number in the list, just as you would for a choose one. You also want to specify the maximum number that respondents should choose, if you have that limit. The instructions could be worded as “choose up to three responses” for example. A common question is something like, “How did you hear about the program?” The answers could be a list such as flier, newspaper, radio, social media, word of mouth, and other (with a write in option). A choose one question and a choose all that apply question could be very similar in wording.

A very common type of question is a rating scale. You have probably seen these at restaurants, online retailers, or workshop evaluations that are seeking a satisfaction score. The questions ask respondents to indicate a strength of response. Common types of scales are strongly agree to strongly disagree, always to never, excellent to poor, etc. The scales are usually three to seven responses in length. People ask whether there should be a “neutral” middle option. I believe people can be neutral in their opinion, but it’s also okay if you don’t want people to be fence-sitters and remove the middle option.
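To summarize a rating-scale item, a common approach is to map the scale labels to numbers and average them. A minimal sketch — the question wording and responses below are invented for illustration:

```python
# Map a five-point agreement scale to numbers (5 = strongly agree)
scale = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# Illustrative responses to a statement like "The workshop met my expectations"
responses = ["agree", "strongly agree", "neutral", "agree", "strongly agree"]

# Convert labels to scores and compute the mean
scores = [scale[r] for r in responses]
mean_score = sum(scores) / len(scores)
print(f"Mean rating: {mean_score:.2f} (n={len(scores)})")
```

Reporting the full distribution of responses alongside the mean is usually more informative than the mean alone, especially with small samples.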

Similar to ratings, but without the granularity, is the ranking question. In this case, the respondent is given several statements or list items to put in a particular order (most important to least important, most frequent to least frequent, etc.). When you get the results back, you will know the order of how people ranked the items, but not the closeness or weight of the answers. Think about a horse race. The distance between the first and second horse doesn’t matter; a horse could win by a nose or by a length. If you wanted to know the strength of opinion, then you could convert your ranking into a rating scale by including a scale for each one of your ranking items. With ranking questions, you should be clear about how many items you want ranked: all of them, the top five, etc.
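One way to see the "order but not distance" point: average each item's rank across respondents. A lower mean rank means higher priority, but the gaps between the means tell you nothing about how strongly people felt. The sessions and rankings below are invented for illustration:

```python
# Each respondent ranks three conference sessions, 1 = most valuable
rankings = [
    {"keynote": 1, "panel": 2, "workshop": 3},
    {"keynote": 1, "panel": 3, "workshop": 2},
    {"keynote": 2, "panel": 1, "workshop": 3},
]

# Average each item's rank across all respondents
items = rankings[0].keys()
mean_rank = {item: sum(r[item] for r in rankings) / len(rankings) for item in items}

# Sorting by mean rank gives an order, not a strength of preference
for item, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{item}: mean rank {rank:.2f}")
```

Like the horse race, this tells you the keynote finished first, but not whether it won by a nose or a length; a rating scale per item would capture that.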

Open-ended questions allow respondents to use their own words to answer. When you are creating those questions, be sure they do not fit into another question type. For example, asking, "Did you learn something from the program?" which could be answered yes or no, is different than asking, "What did you learn from the program?" While the other question types are quantitative and possibly easier to analyze, qualitative questions will give you more descriptive information but may be more time-consuming to analyze.

That is a quick overview of question types. If you would like assistance with designing good survey questions, please reach out to Student Life Studies. We are here to help.

Filed Under: Assessment, Uncategorised

A Picture is Worth a Number of Words

February 3, 2020 by Darby

I have been thinking about fun assessment methods to get the year started…you know, try something new and different, take a risk, all that good stuff. Although it is a new calendar year, it is the middle of the academic year, with a little more time to think about assessment before May. So, I’m going to share a quick idea that could resonate with the students (or even staff) you work with.

Photovoice, also known as reflexive photography or photoelicitation, is a qualitative methodology where participants take picture(s) as a representation of their answer to a question. It usually includes some sort of narrative, written or oral, to explain the meaning behind the picture. While it typically has been used in participatory research, it can also be used in assessment to better understand a person's reflections on their environment or even as a team building exercise.

Here's how it could work with students in an organization or other students/staff you might work with. Participants are given a question or set of questions to respond to using photographs. (Some of you are not old enough to remember that students used to be given disposable cameras and had to get the film developed at a store!) They have a period of time to take the pictures and upload them somewhere or print them out. Because the vast majority of students today have a smart phone, they have built-in photography capability. If there are students without smart phones, you need to think about alternatives/resources so they can equally participate.

The next step is for participants either to write a brief description of why this particular photo captured their answer or to verbally describe the photo's meaning. The best way to share the results depends on the situation. If you are working with the executive board of a student organization, you might take part of an exec meeting for each person to share their photo and its meaning. Alternatively, you could use it in a one-on-one discussion with an employee. You could use it at the beginning of the year to capture expectations or as part of a teamwork activity. On the other hand, you might use it at the end of an experience to capture reflections on learning.

[Photo: Reveille IX]

Here are a few sample questions to get you thinking about the process:

  1. What represents leadership qualities you desire?
  2. How do you see the university core values (leadership, integrity, selfless service, respect, excellence, loyalty) depicted?
  3. Where are places on campus you feel included and excluded?
  4. What did you learn from being involved in this student organization/employment experience/event?

The questions can be as unique as the situation, and the pictures will be unique to the individual taking them. This method is flexible, especially with the technology available.

As the facilitator of the process, you need to emphasize that there is no right or wrong answer—everyone has a unique perspective. It's also helpful to point out that the quality of the picture is not the key element of the exercise. If the pictures are being shared publicly or in a group, build trust and set parameters so participants feel comfortable and safe sharing.

If you are interested in using photovoice as a data collection method and want assistance, please reach out to Student Life Studies. We are here to help.

Articles on Photovoice

Clark-Ibáñez, M. (2004). Framing the social world with photoelicitation interviews. American Behavioral Scientist, 47(12), 1507–1526.

Goodhart, F. W., Hsu, J., Baek, J. H., Coleman, A. L., Maresca, F. M., & Miller, M. B. (2006). A view through a different lens: Photovoice as a tool for student advocacy. Journal of American College Health, 55(1), 53–56. doi:10.3200/JACH.55.1.53-56

Harrington, C. E., & Schibik, T. J. (2003). Reflexive photography as an alternative method for the study of the freshman year experience. NASPA Journal, 41(1), 23–40.

Filed Under: Assessment

Student Affairs Planning, Assessment & Research (formerly Student Life Studies) Services

January 2, 2020 by Darby

If you are new to the Division of Student Affairs at Texas A&M, you might not know what Student Affairs Planning, Assessment & Research (SAPAR) is or does on a daily basis.  Here’s a quick overview to introduce you to what we do and the common services we provide.

Philosophically, Student Affairs Planning, Assessment & Research is here to answer the big questions you have so you can improve your programs and services for students and other stakeholders. More concretely, we help staff and student organizations develop assessment instruments, gather and analyze data, and use information for improvement. We take clients from beginning to end—from idea to change.

The most common service that SAPAR provides is survey design, administration, analysis, and reporting. We use software to develop both online surveys and paper-based surveys. The staff in the department help people create appropriate questions for what they want to know. In our partnership model, you are the expert about your topic, and we are the experts about the process. We help you ensure that your questions are not biased, are the right type of question for what you want to know, are in an order that makes sense, and much more! Overall, we help ensure that the answers you get will help you accomplish your goals and outcomes.

Surveys are just one part of the assessment services. We also help people design focus groups, from the logistics to the questions. Depending on our availability, we may also facilitate the focus groups for you. We have easy-to-use digital recorders if you are interested in transcribing an audio recording. Although we do not transcribe for you, we can point you in the direction of cost-effective and timely services and resources. We can also assist you in analyzing and interpreting the information you get.

If you have an educational/developmental program or experience where you want students to demonstrate proficiency in certain areas, we can help you design rubrics. The staff will help you determine your outcomes, criteria, and levels of demonstration. These are great if you have the opportunity to observe some behavior or skill, and they can be used as student self-reflection/rating, supervisor/advisor rating, or both. You can also find premade rubrics at https://sapar.tamu.edu/home/sllorubrics/.

Currently, the university uses Qualtrics as the common survey software. Student Affairs Planning, Assessment & Research administers the accounts for the Division of Student Affairs. We help you get an account, provide group training, and provide individual consultation. Through years of experience, our staff knows shortcuts and best practices that we are happy to share with you. If and when the institution goes to a different platform in the future, we will continue to provide training and consultation services as needed.

The staff can also work with you to analyze or interpret results that you already have. One request, though: if you plan to collect data through a medium such as Google Forms, please contact us before you do. Sometimes when staff have brought us a data file, the data is not in good condition. We may have a quicker and easier way for you to collect and analyze data.

While this blog hits the highlights and most common services Student Affairs Planning, Assessment & Research provides, we always encourage staff to contact the department with any questions. If we don't know the answer or provide the service, we can find out the best way to assist you. We are always here to help.

Filed Under: Assessment

Closing the Loop: Reassessing for Confirmation

December 1, 2019 by Darby

Assessment professionals use the term “close the loop” a lot. (The irony is that if it is a loop, it is already closed.) The gist is that you reassess the changes you made based on some assessment you did, which led to the decision to make that change in the first place. Makes sense, right?

I think the challenge is that we move quickly and move on to the next thing quickly. In order to really “close the loop,” we need to plan and document. Let’s take a look at this step by step as a long(er) term process.

Let’s say you have an idea to assess something…a conference, student learning, climate, etc. You spend quality time thinking about what you really want to know, from whom, in what manner, and when. You create a great assessment (survey, focus group, observation, etc.) that answers your question(s). Usually, you confirm that you are doing many things right. Your audience is having a positive experience, learning what you intended them to learn, etc. Most of the time, there are a couple of things that you could do better based on your assessment results. (Remember, your assessment should be very focused because you can’t work on 100 things in the next year.)  As a good student affairs professional, you make plans to change something in your program/service/activity and move on to the next project or program.

But wait! You’re not done. As you decide to make a change (or two or three), You need to make sure you actually implement them. You also need to make a plan to reassess the thing(s) you change. Don’t just walk away and assume the change is beneficial. Do you have a rationale as to why you think your change will be an improvement? When will you reassess the change to be sure it was positive? Where will you document all of that good stuff?

Let's look at a simple example. If you host an annual conference and collect data through a survey after each iteration, you may have a built-in mechanism. Last year's participants really disliked the food: it averaged 2.3 on a 5-point scale, and their comments expressed displeasure. You discussed the results with your staff and decided to seek out a new caterer. After taste testing three options (what a great assessment gig!), you chose the new caterer that you thought would meet your needs within your budget. This year, the assessment results changed dramatically, with a score of 4.4 out of 5.0. You decide to renew your contract with that caterer for next year and note that in your transition documents. While that is a very basic example, it illustrates that you assessed, reviewed the results with others, made a decision about change, implemented the change, assessed the change, and documented the process and assessment.

While this process is the right thing to do, it also helps with accountability. Departmental and institutional assessment plans seek documentation about assessment, change, and reassessment. It can be really easy to overlook the reassessment step without accountability to remind us of the importance of doing it.

I hope this information was helpful in framing “closing the loop.” Feel free to reach out to Student Life Studies for assistance. We are always here to help.

Filed Under: Assessment, Planning

The Role of the Institutional Review Board*

September 2, 2019 by Darby

Every now and then, Student Life Studies' staff gets asked, "Does this project need to go to the Institutional Review Board (IRB)?" Every now and then, Student Life Studies' staff says, "This project definitely needs to go to the IRB." So, what exactly is the IRB, and why would projects need its approval? Keep in mind that every campus has different processes and procedures, so Texas A&M is different from other institutions.

I'll hit some of the basics, but to really understand the role of the IRB at Texas A&M, see https://vpr.tamu.edu/human-research-protection-program/. That website describes the definitions, steps for approval, resources, and training requirements. The IRB is here to protect humans (living subjects) in the research process (a systematic investigation resulting in generalizable knowledge). In research, the investigator gathers identifiable data about people through some sort of intervention or interaction. But not every data collection activity requires IRB approval. In many cases in the Division of Student Affairs, assessment is completed to improve a specific program or improve student learning in a particular activity. Those data collection functions, although inherently involving interaction with humans, do not have the purpose of creating new, generalized knowledge. Alternatively, there are topics and populations for which Student Life Studies would recommend IRB review. Those typically include sensitive topics or groups (alcohol/drugs, illegal activity, sexual activity, sexual misconduct, minors, cognitively impaired adults, pregnant women, prisoners, etc.) or cases when you know you want to publish the results. I hope you can see where there might be overlap or questions between program improvement data collection and human subjects research.

The IRB offers, and in some cases requires, training for any investigator. If you are just collecting some feedback about your program to make changes for next year, you might think you don't need to take the IRB training. But the training provides good information about the ethics of data collection, regardless of whether you are conducting actual research. Texas A&M uses the Collaborative Institutional Training Initiative (CITI) as the online training system. The nice thing is that the training is good for five years before you have to do a refresher course. The course can be accessed at https://rcb.tamu.edu/humans/training/human-subject-protection-training-in-citi. Although not the most exciting of professional development options, it is still important for student affairs staff who collect any data.

If you need to submit a proposal to the IRB, there are several steps to complete and questions to answer. This page, https://vpr.tamu.edu/human-research-protection-program/approval-process/before-you-submit/, provides an overview of the documents that you will need. To understand the questions on the application, see the Socio-Behavioral Protocol Template at https://rcb.tamu.edu/humans/toolkit/templates/templates. When you are ready to submit, log into the iRIS portal at https://iris.tamu.edu/shibboleth-ds/index.html. The instructions at https://vpr.tamu.edu/human-research-protection-program/approval-process/how-to-submit/ walk you through the process. The questions can be confusing because of some of the jargon, so feel free to reach out to the IRB hotline at 845-4969 or the general office number, 458-4067, to find the Division of Student Affairs liaison.

I hope this information gave you an introduction to the IRB with resources you can access for more detail. Feel free to reach out to Student Life Studies for assistance as well. We are always here to help.

 

*The overview provided here is very simplified. Please refer to the IRB or Student Life Studies for more specific information.

Filed Under: Assessment

