Student Affairs Planning, Assessment & Research
Texas A&M University

Division of Student Affairs

Blog

A Response to “Why Do I Assess?”

March 1, 2019 by Darby

On January 31, 2019, Linda Suskie, a well-known higher education assessment author and consultant, published a blog post, “Why Do I Assess?” Not only did the post garner comments on her blog, it created a whirlwind of conversation on a higher education assessment listserv. (By the way, I’m a huge fan of Linda’s for her practical, down-to-earth perspective and her clear writing style.)

I think everyone should take a few reflective moments to think about why they assess (yes, I think everyone should assess something at some point). Because it’s required? Out of curiosity? For program improvement? To be a better educator? To know what learning occurred? To create an educated society? Each person may have a unique set of reasons, which could be a great conversation to have among peers, and we need to understand that about each other.

Here are a few of my reasons to assess.

First, I love learning. That statement goes in two directions. One, I like to know more about the world around me. When I describe my job, I tell people that I know a little bit about a lot of things. Not surprisingly, two of my Top 5 Strengths are Input and Learner. Over the years, I think I have become more curious and been able to ask deeper questions. Two, I am excited about the learning process among college students, particularly outside of the classroom. Learning happens everywhere. I am motivated when I see students have an “aha” moment or describe how they were changed by a co-curricular experience. When students transfer learning to other situations or articulate how a developed skill can be used in their first job after college, I feel good about the work we do in higher education and student affairs.

Second, improvement is important (one of my other Top 5 Strengths is Achiever). How do we know how well we do if we don’t assess? I truly believe that we come to work every day to do the best we can for the stakeholders we serve (students, staff, etc.). We might know anecdotally that we are positively impacting students, but it helps to have more than that to support our assertion. And, in case you haven’t noticed, students and our environment change over time. A program created 50 years ago (or even 10 years ago) may not serve the students of today. We need to know that, so we can adapt. Why would you want to continue to do something that takes time, energy, and money, but doesn’t meet the needs of your audience?

Third, it’s just fun for me. Not only do I like the process of assessment, I like the product of assessment. I have fun sitting down with people to brainstorm what they want to know, formulating and implementing a plan, and figuring out how to share and use the results. I think it’s fun to give back to the profession through teaching and writing. I’m very fortunate to be in a role that fits my personality, interests, and skills. I enjoy coming to work and helping people answer their burning questions.

So, those are my thoughts about why I assess. What are your reasons to assess? I would love to hear them.

Filed Under: Assessment

Assessing Everything All the Time

February 2, 2019 by Darby

A student affairs assessment colleague at another institution recently reached out to me with a challenge she is facing. Her problem is not a lack of staff motivation to assess; it’s actually the opposite. They are trying to assess every activity all the time, particularly using surveys as the data collection method. Of course, that has led to survey fatigue, a common ailment when staff catch the assessment bug.

It’s not uncommon for the pendulum to swing from no assessment, or outright antipathy about it, to assessing every…single…thing…all…of…the…time. But not only is that unnecessary, it leads to some unintended consequences, including participant fatigue, staff fatigue, information overload, inability to make a decision or changing direction too frequently, and lack of focus on what’s truly important. How do you overcome that? Here are a few suggestions.

–Focus on what’s important to the program, the unit, the department, the division, and the institution. The alignment is important and helps you understand how what you do fits into the larger context. What are the goals and strategic plans that guide practice? If your program or services do not align with guiding documents, you might need to assess if you should be doing it at all.

–Develop and revisit outcomes frequently. Similar to aligning within the organizational structure above, looking at the outcomes helps you stay internally focused on what is important. If you have developed student learning outcomes, how will you know students have actually learned what you wanted them to learn? If you are assessing program or process outcomes, how are you keeping track of those to know you are effective in your practice?

–Vary your assessment methods. Surveys are overused in student affairs assessment. Perhaps you can set up focus groups or interviews as a follow-up to survey responses, or focus on individual experiences and perceptions. If you are promoting student learning with a small group, you could create rubrics for self-evaluation and evaluation by others. For longer-term, deeper experiences, participants could journal or reflect on photos they have taken.

–Create a calendar for both short-term and long-term assessment practices. In the short term, look at the number of programs/activities you do, the academic calendar, and planning and reporting timelines. If you present a canned program 10 times a semester, do you really need to assess it 10 times, or can you take a sample? Maybe you pick every second or third program in the first semester to determine what changes need to be made for the following semester. If you are looking at usage of services, maybe you pick one or two “typical” weeks in a semester to review, rather than all 15 weeks. In the long run, you can create a calendar of important topics you want to know about and how you plan to assess them. Maybe one year focuses on satisfaction (using a survey), while the next year focuses on student learning (using rubrics or exit interviews), and the year after that focuses on tracking usage (using observation and preexisting data). Your assessment will be ongoing, but it will also give you focus areas for improvement each year.

–Be brief. See my previous blog post, “You Only Get Five Questions.” People are much more likely to respond to a few quick questions than a long, involved survey. Focus on what you NEED to know, not what you just want to know. Besides, you can’t address 100 things in a year, so you don’t need to ask 100 questions on a survey. Moreover, if you have asked the same questions for several iterations, the answers have been (acceptably) consistent, and you have no plans to change that area, STOP asking about it for a while. You already know the answer.

–If you haven’t used past data for change, don’t reassess yet. Why would you think the answer would be any different? Change and improvement take time to implement, especially for large-scale changes. Keston Fulcher and his colleagues at James Madison University wrote a great article, “A Model for Learning Improvement: Weigh Pig, Feed Pig, Weigh Pig.” If you assess something (weigh pig) without taking action (feed pig) before assessing again (weigh pig), you will likely get the same results. You can assess too frequently, which becomes a waste of time and resources for both you and the participants.

I hope this gives you some ideas about just saying no to over assessing. You need time to do the great things you do for students and staff; you don’t need to assess everything all the time. Take that advice from someone who loves assessment and works with staff to do it well every day!

Filed Under: Assessment, Planning

Extracurricular vs. Co-curricular

January 2, 2019 by Darby

Language is important. We can all agree on that, right? It helps us communicate meaning so we understand each other. In higher education, we create our own jargon, slang, and acronyms that are supposed to aid in communication and exemplify our culture. Unfortunately, we don’t always stop to confirm that we are talking about the same thing.

Language also changes over time as our environment changes and we develop more nuanced vocabulary or more appropriate verbiage. One example is “extracurricular” versus “co-curricular.” On the face of it, they look very similar, but to me they can have very different meanings for college students and those of us who work with them.

The curriculum is defined as the aggregate of courses of study in a school. If you look at the prefix of extracurricular, extra-, it means outside or beyond. In the same vein, the co- in co-curricular means together, jointly, or in partnership.

We used to talk about students engaging in extracurricular activities. The tagline of Student Activities used to be “The Other Education.” Looking back, we see that those phrases made it sound like student involvement and engagement were outside of and separate from the learning and education that takes place on a college campus. It was treated as something that wasn’t necessary or valued by others outside of Student Affairs.

Alternatively, we now hear the word co-curricular to mean those experiences that happen alongside or with the curricular, rather than separate from it. Learning takes place all over, regardless of whether it is in a classroom, a residence hall, or a camping tent. We, as student affairs educators, need to continue to embrace the co-curricular as our classroom. In our positions, we have the tremendous opportunity to influence student learning. Many times, as advisors or supervisors, we have one-on-one or small group discussions with students that help them make meaning of their experiences, both in the formal classroom and outside of it. Faculty may not have that luxury when teaching large classes.

There are many examples of how the co-curricular engages students in learning. On the more academic side, experiences such as study abroad and internships provide students with valuable learning experiences. Closer to student affairs work, we see opportunities such as employment, student leadership, and international excursions as tremendous learning experiences. A Resident Advisor (RA) who is also an engineering major gains skills in teamwork, problem solving, critical thinking, and verbal communication. Those RA skills can be applied to their curricular activities in group projects, presentations, and lab work. Faculty members don’t have the time to teach those “soft skills” but probably expect students to develop them along the way. As advisors and supervisors, we teach those skills and provide real-time feedback that students can use in multiple environments.

Students (should) have one college experience, rather than several siloed experiences. Students should be able to integrate and transfer their learning. As student affairs educators, we should strive to enhance the co-curricular so students have a challenging and rewarding college experience and productive post-graduation life.

Filed Under: Learning

Increasing Response Rates

December 5, 2018 by Darby

One of the most common questions we get in Student Life Studies is how to increase response rates, especially for electronic surveys. Students are over-surveyed, so just because you can technically send out a survey to 65,000 students doesn’t mean you should (Hint: you should rarely, if ever, send out a survey to your whole population—Just Don’t Do It!). There is a science, and a bit of luck, that goes along with increasing response rates. As with so many aspects of assessment, you must keep your audience in mind every step of the way. Here are a few suggestions based on our staff experiences, as well as Dillman, Smyth, and Christian (2009).

1. Make the survey interesting, relevant, and brief. No one wants to take a long survey that doesn’t matter to them. We have found that surveys sent to a particular group (organizations, participants in a specific program, etc.) get higher response rates than surveys sent to a random sample.

2. Make the cover email interesting, relevant, and brief. You need to hook potential respondents in, even before they might click on a link. Have the email come from someone they know, make the subject line engaging, and explain the purpose of the survey and the importance of their participation. If you can, provide examples of how you have used their feedback in the past.

3. Set a deadline. If you are like me, you might put off doing something because it doesn’t have a clear deadline to establish any urgency. Most surveys stay open about 7-14 days. You rarely need to keep a survey open longer than that.

4. For students, send surveys early in the semester, on Mondays or Thursdays, and in the afternoon. A few years ago, we looked at response rates based on the month, the day of the initial invitation, and the time of day of the invitation. It was eye opening: timing matters. Also, know what other large surveys are going out around the same time, so you don’t impede each other’s efforts.

5. If you have the ability, you might want to pre-notify your potential respondents that a survey is coming to them. Then people anticipate getting the survey and can mentally plan time to take it. Notification can be by email, word of mouth, social media, mail, flyers, etc.

6. Plan for 2-3 reminders with different messages. Each time a reminder is sent, you will get a spike in responses. The initial email will always garner the most responses (usually in the first 24 hours), then responses will decrease each day until you send a reminder. Then the response rate will jump a little, and the pattern will repeat with each reminder. There is a fine balance between sending reminders and making potential respondents annoyed.

7. Incentives have mixed reviews when it comes to impacting response rates. You have to balance the cost of incentives and administering them against the potential benefit of increased response. You also have to decide whether every respondent gets something small or a few people win something large. We have not found a significant impact on response rate from using incentives. See #6 for an alternative.

8. Say please and thank you, just like your mom taught you. Ask for advice or help and show positive regard for people’s time and effort. Personalize the communication and make the task seem important. In the email and on the survey, be sure to thank people for their input.

9. Ensure confidentiality (to the extent you can). People will be more likely to be honest when they trust you to protect their information.

10. Ask Student Life Studies to help you! We have expertise and practice in this area and can guide you through the process.

Following these tips will help you increase your response rate. Even when the response rate is lower than you want, there are things we can look at to help you understand whether your data is going to be useful/representative of your population. Plan, Plan, Plan!

Filed Under: Assessment

Assessment vs. Research: What’s the Difference?

November 1, 2018 by Darby

You may have heard the terms “assessment” and “research” used interchangeably. Are they really the same thing? Does it matter? (And that doesn’t even include throwing “evaluation” into the mix!) There have even been recent debates among professionals about it (http://www.presence.io/blog/assessment-and-research-are-different-things-and-thats-okay/, https://www.insidehighered.com/views/2016/11/21/how-assessment-falls-significantly-short-valid-research-essay, https://onlinelibrary.wiley.com/doi/abs/10.1002/abc.21273).

In my opinion, assessment and research have a lot in common. They are about collecting data to learn something, they use similar data collection methodologies (qualitative and quantitative), they require knowledge and practice to be effective, and they are important to student affairs and higher education. There are expectations of good practice in both areas.

On the other hand, there are some key differences. The purpose of research is to create generalizable knowledge, that is, to be able to make credible statements about groups of people beyond one campus. It might be about first year college students, new professionals in student affairs, or college graduates in STEM fields. Research may also be used to develop new theories or test hypotheses. Assessment is typically confined to one program, one campus, or one group. In that case, the purpose is to collect information for improvement to that particular area of interest. Assessment rarely would set up an experimental design to test a hypothesis. The results are not meant to apply to a broader area, but they are key to decision making. Assessment can provide reasonably accurate information to the people who need it, in a complex, changing environment.

The timing of research and assessment may differ. Research may have more flexibility in the time it takes for data collection because it may not be tied to one particular program, service, or experience that will change. Alternatively, assessment may be time bound, because the information is being collected about a particular program or service, so changes can be implemented. It may be an event that occurs on an annual basis, information is needed for a budget request, or data needs to be provided for an annual report.

The expectations of response rate may also be different. Of course, everyone wants a high response rate that reflects the population of interest. Realistically, though, that may not happen. In research, there may be more effort and resources to recruit respondents over a longer time or use already collected large data sets. There may be effort to determine if late responders were similar to early responders or if more recruitment needs to happen. In assessment, partially because of the time-bound nature, and the over-assessment of college students, staff may have to settle for the response rate they get and decide if the results are credible.

The audience may also differ. Ideally, all professionals should be keeping up with the literature in their field based on sound research. Research results are published in journals for other researchers to see and use. More narrowly, though, assessment provides (hopefully) useful information to decision makers and practitioners about their particular area. In the big picture, assessment results can inform research questions and vice versa.

Research typically requires Institutional Review Board (IRB) approval before collecting data from “human subjects.” That board wants to ensure that people are not harmed and appropriate processes are followed. Because of its narrow focus and usually low-risk nature, assessment is typically excused from the IRB process.

All in all, both assessment and research belong in student affairs and higher education. They are important to individual campuses and departments. They just may look a little different in structure and use. Practitioners need to access both to be the best they can be.

Filed Under: Assessment

Is Assessment a Four Letter Word?

October 1, 2018 by Darby

A few weeks ago, there was a thread on a couple of listservs about the use of the word “assessment,” which for some people has a negative connotation. Obviously, I like the word assessment, but I understand how some people may be scared or turned off by it.

Listserv members offered up a variety of alternatives that might be more palatable for the folks on their campuses:
• Evidence
• Improvement
• Effectiveness
• Learning
• Storytelling
• Evaluation
• Research
• Problem Solving
• Inquiry
• Quality Assurance

As you can see, there were a number of suggestions to replace the word. All of those are good words, and they have different connotations, depending on how precise you want to be about the definitions. You can see that “accountability” is not on that list—I think that is another word that creates negative associations with a potentially frustrating experience.

At the same time, there were a number of people who said that we should continue to use the word assessment and change the attitude toward it (reclaim the word). Several people gave examples of how their colleagues had one bad experience and generalized that to all future endeavors. It becomes hard to change someone’s mind when they have negative associations with a word or were somehow punished, felt like they wasted time, just did busy work, etc. This line of thinking focused on the activities of assessment, rather than the language.

Here’s my thought: I don’t really care what you call it. My vision is that you are doing something to know that you have improved the lives and experiences of college students. You cannot continue to implement your program/service/experience/course the same way you did a decade ago and think that the students are getting the same benefit from it. That something should have some structure and system to it, but it doesn’t need to be your dissertation. It can be quantitative or qualitative. It can be a sample (of people, timeframes, services), rather than all things all the time. It can be fun. (Really, it can be fun.)

It’s about how you know what you know, so that you can continue to do better and do the best for the students you serve. It’s about documenting (even in a creative way) to tell others about what you are doing and what you know. It’s about thoughtfully taking action based on something other than your gut feeling.

So, is assessment a bad word? I’d love to hear your perspective.

Filed Under: Assessment

We Are ALL Educators

September 1, 2018 by Darby

Over a decade ago, Elizabeth Whitt wrote an article for About Campus (January-February 2006) asking, “Are ALL of Your Educators Educating?” I still think about that article today (hence, writing a blog post about it in 2018). I wholeheartedly agree with its opening statement: “Institutions that excel are filled with educators in the curriculum and the cocurriculum who believe student learning is everyone’s business.” I think the article is still relevant today as we talk about student success and retention. It’s great to get students to graduation, but we cannot underestimate what goes on between matriculation and graduation in terms of student learning.

Whitt made 10 recommendations about creating an engaging campus.
1. Focus on student learning. Period.
2. Create and sustain partnerships for learning.
3. Hold all students to high expectations for engagement and learning, in and out of class, on and off campus.
4. Implement a comprehensive set of safety nets and early warning systems.
5. Teach new students what it takes to succeed.
6. Recognize, affirm, and celebrate the educational value of diversity.
7. Invest in programs and people that demonstrate contributions to student learning and success.
8. Use data to inform decisions.
9. Create spaces for learning.
10. Make every residence hall a learning community.

I think all of these ideas are invaluable, especially when used in concert with each other. In the article, Whitt provided reflective questions to consider after the explanation of each recommendation. The questions make you think about where we put our resources, how we integrate the curricular and cocurricular, how we communicate across the institution and with students, and more. I believe that every staff member in student affairs, regardless of position, education level, or time at an institution, has a role in educating students and contributing to their success. We shouldn’t leave learning, teaching, and education just to the faculty—we have to be partners in student success.

Are you educating to your fullest potential?

Filed Under: Uncategorised

Five Questions…That’s All You Get

August 1, 2018 by Darby

I’m fairly sure that we can agree that students (and staff) are over-surveyed, particularly via electronic surveys. I think we can also agree that taking really long surveys is annoying, especially if the questions are not that interesting. On the other hand, part of building a culture of assessment involves collecting information from important stakeholders to help you make decisions. How do you balance that?

I facetiously have said to staff, “You only get five questions…make them good.” Okay, I am being serious when I say that, but it’s hard for people to constrain themselves when there are many interesting data points to collect. That’s the conflict: what is interesting vs. what is necessary information for you to be able to improve your practice.

Let’s face it: there are only so many hours in the day to make changes to your program or service in a given time frame. You can’t focus on more than a few things at any given point, nor is anyone asking you to. So, why are you asking a whole lot of questions that are not going to help you do better? Just because you can doesn’t mean you should.

When you ask a lot of questions on a survey (“a lot” is a relative term), there are a couple of negative consequences. First, you annoy your respondents. People today do not have the time, patience, or attention span to complete a survey more than a few minutes long. This is especially true if they do not have a stake in the topic or outcome. Second, related to the first, your audience will stop answering the survey. If respondents see how long a survey is in the first place, they may not take it at all. If they start answering a survey and get bored/tired/busy, they will stop answering. The result is that you do not get all of the feedback you were looking for and your response rate is low.

So, what’s the answer? When you start thinking about assessment you want to do, particularly an online survey, focus yourself. What are, at most, the five most important pieces of information you NEED to know? It may be that, rather than several scale questions, you ask an open-ended question. Or, you may choose to assess one component this year and another one next year. Or, you may set up your sample so some of them see a set of questions and another portion of your sample sees other questions. You have options.

When in doubt, consult with Student Life Studies. We would be happy to help you streamline your questions!

Filed Under: Uncategorised

Reflections on the NASPA Assessment and Persistence Conference

July 1, 2018 by Darby

In June, most of the Student Life Studies staff attended the NASPA Assessment and Persistence Conference, where student affairs assessment professionals, institutional researchers, faculty, and administrators came together to talk about pressing student success issues and what we know from assessment to address some of them.

As I was reflecting on what I learned and how I would use it, several key points stuck with me. First, Tia Brown McNair, from AAC&U, spoke about the difference between being “campus ready” and being “student ready.” Most campuses think in “campus ready” terms: are students ready to attend our college? They may not have put things in place to be student ready: What has the campus done to prepare for entering students? Do campuses even know about the students who are entering and what their needs are? What policies and practices are in place that could inhibit success? What biases and assumptions do we have as educators about different demographic groups on our campuses? How do we understand and build upon student assets (rather than focusing on a deficit model)? I think we have to continually challenge ourselves and others to disaggregate data and ask ourselves how a policy/plan may impact specific groups (first-generation students, low socio-economic status students, transgender students, students with children, etc.).

One of the other key points is that language is important. Are we all speaking the same language? For example, take a moment in your head to define “first-generation college student.” What is your definition? Neither parent received a four-year degree? Neither parent earned a four-year degree in the United States? Neither parent has any form of higher education? Does it include nuances for biological parents, guardians, other family members? The more you think about the intricacies of today’s family structure, the more confusing it can become. Do incoming students even know that they are first generation (however you choose to define it)? There are also nuances to First Time in College, transfer students, etc., especially if you consider how students may be getting college credit for experiences, such as being in the military. And then there is the “non-traditional” student…. The changing demographics on most campuses these days seem to be making the “non-traditional” the traditional. Students may be older, have families, have been in the military, have significant work experience, or be attending part-time and working full-time. They are not a monolithic group within “non-traditional.” Before we assess, we need to know how our variables and population are defined.

The last key point for today is that data will continue to be a hot topic. Who has it? Who wants it? How do we share it? What are the ethics around it? Does predictive modeling make assumptions about particular individuals that are damaging? Many campuses are trying to figure out how to combine data to provide a comprehensive picture of students, so that successful interventions can be created and decisions made with evidence. We have to be responsible in using data so that we are not drawing incorrect conclusions to the detriment of students.

The conference provided a lot of opportunity to consider where we are going in higher education and what data we need to determine appropriate interventions. Next year, the 2019 conference will be combined with several others to provide a broad focus (and specific tracks) on student success, first-generation college students, financial well-being, and more. Lots of information to reflect on!

Filed Under: Uncategorised

Preparing for the Fall

June 1, 2018 by Darby

Summer is supposed to be slow, right? The more people I talk to, the more I hear that summer seems to be almost as busy as the regular school year. That makes it even more important to plan and prepare for assessment when you have the opportunity, before getting back into the busy-ness of the semester.

Have you looked at any assessment results you have from the previous academic year? Before you get too far removed from the last semester, now is the time to review results, interpret their meaning, and make plans for the new academic year. If you have results that you need to review, put time on your calendar. Schedule a meeting with yourself and/or other people who might be interested in what you did. That could be colleagues, students, a committee or your supervisor. It helps to share ideas and perspectives with other people.

Summer is also a good time to catch up on your reading. Pull out those journals and newsletters sitting on a shelf that you have been meaning to read. Get on your professional association’s website to access its publications. Look at Amazon.com to see what new books are available. You might even look beyond your own functional area. There might be good resources from business, sociology, psychology, or neuroscience that pique your interest. Once again, schedule time with yourself to read and digest the literature that will help you think in a different way.

Make a plan. It’s easy to think of things you could do, but you have to make a concrete plan to implement them before you forget and fall back into your comfort zone of “we’ve always done it that way, and it works.” Put deadlines on your calendar or task list. Again, this may be something you bring colleagues or students in to help with. Don’t forget to build in assessment as part of the plan, so you know if your changes are effective.

I do hope that you have some down time in the summer to rejuvenate and refresh. Take some time away from work to clear your mind, so when you return you are ready to begin anew. If you need help, Student Life Studies is always here for you. Just give us a call, email us, or stop by 222 Koldus. We are here to serve.

Filed Under: Planning
