• Home
  • Summary Reports
  • Services
    • Assessment Process
    • Comprehensive Program Review
    • Division of Student Affairs Annual Reporting Process
  • Learning Center
    • Blog
    • Audio Resources
    • FAQs
    • Assessment Methods
    • Assessment Training Videos
    • Web Links
    • Books and Articles
    • Survey Building with Qualtrics
    • Resources
  • About Us
    • Mission, Vision and Goals
    • Staff
    • Standards of Ethical Practice
    • Contact
Student Affairs Planning, Assessment & Research
Texas A&M University

Division of Student Affairs



Blog

Words are Hard: Creating Good Survey Questions

March 1, 2020 by Darby

One of the most frequent tasks for the staff in Student Life Studies is helping people design good survey questions. It’s a science and an art; it takes practice and an eye for detail. As with many aspects of assessment, you have to have a clear purpose statement in order to craft questions that help you get to what you need to know. In addition, the types of questions you use depend on what you want to know.

I’m going to highlight several question types common in survey design. They each have their uses, advantages, and disadvantages. If you would like more information, see Student Affairs Assessment: Theory to Practice by Henning and Roberts.

The simplest question type is yes/no. That means a condition exists or it doesn’t. Sounds simple, but there may be times when the respondent isn’t sure or doesn’t know the answer, and you have to determine if you need that third option. Sample yes/no questions are: Are you at least 18 years of age? Did you attend the alcohol education workshop on February 1, 2020? Are you registered to vote in Brazos County?

Another type of question is choose one. In this case, you want respondents to pick one answer from a list or series. For example, you could ask respondents to specify which state they currently live in. There would be a list of 50 states (and the District of Columbia) and possibly an “I don’t live in the United States” option. It should be fairly easy for the respondent to answer the question. You could also be asking about days of the week, months of the year, how people heard about a particular program, their favorite session of a conference, what residence hall they live in, etc. The order of the responses should make some sort of sense: alphabetical, chronological, etc. Just think, you would not want to see a list of 50+ states in a random order where you had to search for your state. Usually, you want to keep the list fairly manageable, so respondents don’t get lost in a long list of items.

A choose all that apply question also provides a list of items, but the respondent has the option to pick more than one answer. When you create the response options, you want to keep the list to a reasonable number of items, just as you would for a choose one question. You also want to specify the maximum number that respondents should choose, if you have that limit. The instructions could be worded as “choose up to three responses,” for example. A common question is something like, “How did you hear about the program?” The answers could be a list such as flier, newspaper, radio, social media, word of mouth, and other (with a write-in option). A choose one question and a choose all that apply question could be very similar in wording.
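If you process choose all that apply responses yourself, the “choose up to three” limit can be checked in code. Here is a minimal sketch; the option list and the limit are the hypothetical ones from the example above, not a real survey configuration:

```python
# Hypothetical "How did you hear about the program?" options from the
# example above, with a "choose up to three" limit.
OPTIONS = {"flier", "newspaper", "radio", "social media",
           "word of mouth", "other"}
MAX_CHOICES = 3

def validate(selected):
    """Return True if the selection follows the question's rules:
    every choice is a real option, and between 1 and MAX_CHOICES
    items were picked."""
    choices = set(selected)
    return choices <= OPTIONS and 1 <= len(choices) <= MAX_CHOICES

print(validate(["radio", "flier"]))                        # True
print(validate(["radio", "flier", "newspaper", "other"]))  # False: over the limit
```

The same check is what survey software enforces for you behind the scenes when you set a maximum-selection rule on a question.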

A very common type of question is a rating scale. You have probably seen these at restaurants, online retailers, or workshop evaluations that are seeking a satisfaction score. The questions ask respondents to indicate the strength of their response. Common scales run from strongly agree to strongly disagree, always to never, excellent to poor, etc. The scales are usually three to seven responses in length. People ask whether there should be a “neutral” middle option. I believe people can be neutral in their opinion, but it’s also okay to remove the middle option if you don’t want people to be fence-sitters.

Similar to ratings, but without the granularity, is the ranking question. In this case, the respondent is given several statements or list items to put in a particular order (most important to least important, most frequent to least frequent, etc.). When you get the results back, you will know the order of how people ranked the items, but not the closeness or weight of the answers. Think about a horse race. The distance between the first and second horse doesn’t matter; a horse could win by a nose or by a length. If you wanted to know the strength of opinion, then you could convert your ranking into a rating scale by including a scale for each one of your ranking items. With ranking questions, you should be clear about how many items you want ranked: all of them, the top five, etc.

Open-ended questions allow respondents to use their own words to answer. When you are creating those questions, be sure they do not fit into another question type category. For example, asking, “Did you learn something from the program?” which could be answered yes or no, is different from asking, “What did you learn from the program?” While the other question types are quantitative and possibly easier to analyze, qualitative questions will give you more descriptive information but may be more time-consuming to analyze.

That is a quick overview of question types. If you would like assistance with designing good survey questions, please reach out to Student Life Studies. We are here to help.

Filed Under: Assessment, Uncategorised

A Picture is Worth a Number of Words

February 3, 2020 by Darby

I have been thinking about fun assessment methods to get the year started…you know, try something new and different, take a risk, all that good stuff. Although it is a new calendar year, it is the middle of the academic year, with a little more time to think about assessment before May. So, I’m going to share a quick idea that could resonate with the students (or even staff) you work with.

Photovoice, also known as reflexive photography or photoelicitation, is a qualitative methodology where participants take pictures as a representation of their answer to a question. It usually includes some sort of narrative, written or oral, to explain the meaning behind the picture. While it has typically been used in participatory research, it can also be used in assessment to better understand a person’s reflections on their environment, or even as a team-building exercise.

Here’s how it could work with students in an organization or other students/staff you might work with. Participants are given a question or set of questions to respond to using photographs. (Some of you are not old enough to remember that students used to be given disposable cameras and had to get the film developed at a store!) They have a period of time to take the pictures and upload them somewhere or print them out. Because the vast majority of students today have a smartphone, they have built-in photography capability. If there are students without smartphones, you need to think about alternatives/resources so they can participate equally.

The next step is for participants either to write a brief description of why this particular photo captured their answer or to describe the photo’s meaning verbally. The best way to share the results depends on the situation. If you are working with the executive board of a student organization, you might take part of an exec meeting for each person to share their photo and its meaning. Alternatively, you could use it in a one-on-one discussion with an employee. You could use it at the beginning of the year to capture expectations or as part of a teamwork activity. On the other hand, you might use it at the end of an experience to capture reflections on learning.

Reveille IX

Here are a few sample questions to get you thinking about the process:

  1. What represents leadership qualities you desire?
  2. How do you see the university core values (leadership, integrity, selfless service, respect, excellence, loyalty) depicted?
  3. Where are places on campus you feel included and excluded?
  4. What did you learn from being involved in this student organization/employment experience/event?

The questions can be as unique as the situation, and the pictures will be unique to the individual taking them. This method is flexible, especially with the technology available.

As the facilitator of the process, you need to emphasize that there is no right or wrong answer—everyone has a unique perspective. It’s also helpful to point out that the quality of the picture is not the key element of the exercise. If the pictures are being shared publicly or in a group, build trust and set parameters so participants feel comfortable and safe sharing.

If you are interested in using photovoice as a data collection method and want assistance, please reach out to Student Life Studies. We are here to help.

 

Articles on Photovoice

Clark-Ibáñez, M. (2004). Framing the social world with photoelicitation interviews. American Behavioral Scientist, 47(12), 1507–1526.

Goodhart, F. W., Hsu, J., Baek, J. H., Coleman, A. L., Maresca, F. M., & Miller, M. B. (2006). A view through a different lens: Photovoice as a tool for student advocacy. Journal of American College Health, 55(1), 53–56. doi:10.3200/JACH.55.1.53-56

Harrington, C. E., & Schibik, T. J. (2003). Reflexive photography as an alternative method for the study of the freshman year experience. NASPA Journal, 41(1), 23-40.

 

 

Filed Under: Assessment

Student Affairs Planning, Assessment & Research (formerly Student Life Studies) Services

January 2, 2020 by Darby

If you are new to the Division of Student Affairs at Texas A&M, you might not know what Student Affairs Planning, Assessment & Research (SAPAR) is or does on a daily basis.  Here’s a quick overview to introduce you to what we do and the common services we provide.

Philosophically, Student Affairs Planning, Assessment & Research is here to answer the big questions you have so you can improve your programs and services for students and other stakeholders. More concretely, we help staff and student organizations develop assessment instruments, gather and analyze data, and use information for improvement. We take clients from beginning to end—from idea to change.

The most common service that SAPAR provides is survey design, administration, analysis, and reporting. We use software to develop both online and paper-based surveys. The staff in the department help people create appropriate questions for what they want to know. In our partnership model, you are the expert on your topic, and we are the experts on the process. We help you ensure that your questions are unbiased, are the right type for what you want to know, are in an order that makes sense, and much more! Overall, we help ensure that the answers you get will help you accomplish your goals and outcomes.

Surveys are just one part of the assessment services. We also help people design focus groups, from the logistics to the questions. Depending on our availability, we may also facilitate the focus groups for you. We have easy-to-use digital recorders if you are interested in transcribing an audio recording. Although we do not transcribe for you, we can point you in the direction of cost-effective and timely services and resources. We can also assist you in analyzing and interpreting the information you get.

If you have an educational/developmental program or experience where you want students to demonstrate proficiency in certain areas, we can help you design rubrics. The staff will help you determine your outcomes, criteria, and levels of demonstration. These are great if you have the opportunity to observe some behavior or skill, and they can be used as student self-reflection/rating, supervisor/advisor rating, or both. You can also find premade rubrics at https://sapar.tamu.edu/home/sllorubrics/.

Currently, the university uses Qualtrics as the common survey software. Student Affairs Planning, Assessment & Research administers the accounts for the Division of Student Affairs. We help you get an account, provide group training, and provide individual consultation. Through years of experience, our staff knows shortcuts and best practices that we are happy to share with you. If and when the institution goes to a different platform in the future, we will continue to provide training and consultation services as needed.

The staff can also work with you to analyze or interpret results that you already have. One request, though, is that if you know you are planning to collect data through a medium such as Google Forms, please contact us before you do that. Sometimes when staff have brought us a data file, the data is not in good condition. We may have a quicker and easier way for you to collect and analyze data.

While this blog hits the highlights and most common services Student Affairs Planning, Assessment & Research provides, we always encourage staff to contact the department for any questions. If you don’t know the answer or provide the service, we can find out the best way to assist you. We are always here to help.

Filed Under: Assessment

Closing the Loop: Reassessing for Confirmation

December 1, 2019 by Darby

Assessment professionals use the term “close the loop” a lot. (The irony is that if it is a loop, it is already closed.) The gist is that you reassess the changes you made based on some assessment you did, which led to the decision to make that change in the first place. Makes sense, right?

I think the challenge is that we move quickly and move on to the next thing quickly. In order to really “close the loop,” we need to plan and document. Let’s take a look at this step by step as a long(er) term process.

Let’s say you have an idea to assess something…a conference, student learning, climate, etc. You spend quality time thinking about what you really want to know, from whom, in what manner, and when. You create a great assessment (survey, focus group, observation, etc.) that answers your question(s). Usually, you confirm that you are doing many things right. Your audience is having a positive experience, learning what you intended them to learn, etc. Most of the time, there are a couple of things that you could do better based on your assessment results. (Remember, your assessment should be very focused because you can’t work on 100 things in the next year.)  As a good student affairs professional, you make plans to change something in your program/service/activity and move on to the next project or program.

But wait! You’re not done. As you decide to make a change (or two or three), you need to make sure you actually implement it. You also need to make a plan to reassess the thing(s) you change. Don’t just walk away and assume the change is beneficial. Do you have a rationale as to why you think your change will be an improvement? When will you reassess the change to be sure it was positive? Where will you document all of that good stuff?

Let’s look at a simple example. If you host an annual conference and collect data through a survey after each iteration, you may have a built-in mechanism. Last year’s participants really disliked the food, which rated a 2.3 on a 5-point scale, and they expressed displeasure in their comments. You discussed the results with your staff and decided to seek out a new caterer. After taste testing three options (what a great assessment gig!), you choose the new caterer that you think will meet your needs within the budget you have. This year, the assessment results changed dramatically, with a score of 4.4 out of 5.0. You decide to renew your contract with that caterer for next year and note that in your transition documents. While that is a very basic example, it illustrates that you assessed, reviewed the results with others, made a decision about change, implemented the change, assessed the change, and documented the process and assessment.
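The year-over-year comparison in that example boils down to comparing mean ratings. Here is a minimal sketch with invented individual responses (chosen so they happen to average to the 2.3 and 4.4 in the story; real survey files would come from your survey software):

```python
# Hypothetical individual food ratings on a 5-point scale, one list per
# conference year.
last_year = [2, 3, 2, 2, 3, 2, 2, 3, 2, 2]
this_year = [5, 4, 4, 5, 4, 5, 4, 4, 5, 4]

def mean(scores):
    """Average rating for one year's responses."""
    return sum(scores) / len(scores)

# The reassessment question: did the change (the new caterer) move the score?
change = mean(this_year) - mean(last_year)
print(f"Food rating moved from {mean(last_year):.1f} to {mean(this_year):.1f}")
```

Documenting both numbers, and the decision they led to, is exactly the trail that makes the loop visible later.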

While this process is the right thing to do, it also helps with accountability. Departmental and institutional assessment plans seek documentation about assessment, change, and reassessment. It can be really easy to overlook the reassessment step if there is no accountability to remind us of the importance of doing it.

I hope this information was helpful in framing “closing the loop.” Feel free to reach out to Student Life Studies for assistance. We are always here to help.

Filed Under: Assessment, Planning

The Role of the Institutional Review Board*

September 2, 2019 by Darby

Every now and then, Student Life Studies’ staff gets asked, “Does this project need to go to the Institutional Review Board (IRB)?” Every now and then, Student Life Studies’ staff says, “This project definitely needs to go to the IRB.” So, what exactly is the IRB, and why would projects need its approval? Keep in mind that every campus has different processes and procedures, so Texas A&M is different from other institutions.

I’ll hit some of the basics, but to really understand the role of the IRB at Texas A&M, see https://vpr.tamu.edu/human-research-protection-program/. That website describes the definitions, steps for approval, resources, and training requirements. The IRB is here to protect humans (living subjects) in the research process (a systematic investigation resulting in generalizable knowledge). In research, the investigator gathers identifiable data about people through some sort of intervention or interaction. But not every data collection activity requires IRB approval. In many cases in the Division of Student Affairs, assessment is completed to improve a specific program or improve student learning in a particular activity. Those data collection functions, although inherently involving interacting with humans, do not have the purpose of creating new, generalized knowledge. Alternatively, there are topics and populations for which Student Life Studies would recommend IRB review. Those typically include sensitive topics or groups (alcohol/drugs, illegal activity, sexual activity, sexual misconduct, minors, cognitively impaired adults, pregnant women, prisoners, etc.) or cases when you know you want to publish the results. I hope you can see where there might be overlap or questions between program improvement data collection and human subjects research.

The IRB offers/requires training for any investigator. If you are just collecting some feedback about your program to make changes for next year, you might think you don’t need to take the IRB training. But the training provides good information about the ethics of data collection, regardless of whether you are conducting actual research. Texas A&M uses the Collaborative Institutional Training Initiative (CITI) as the online training system. The nice thing is that the training is good for five years before you have to do a refresher course. The course can be accessed at https://rcb.tamu.edu/humans/training/human-subject-protection-training-in-citi. Although not the most exciting of professional development options, it is still important for student affairs staff who collect any data.

If you need to submit a proposal to the IRB, there are several steps to complete and questions that need answers. This page, https://vpr.tamu.edu/human-research-protection-program/approval-process/before-you-submit/, provides an overview of the documents that you will need. To understand the questions on the application, see the Socio-Behavioral Protocol Template at https://rcb.tamu.edu/humans/toolkit/templates/templates. When you are ready to submit, you log into the iRIS portal at https://iris.tamu.edu/shibboleth-ds/index.html. The instructions at https://vpr.tamu.edu/human-research-protection-program/approval-process/how-to-submit/ walk you through the process. The questions can be confusing because of some of the jargon, so feel free to reach out to the IRB hotline at 845-4969 or the general office number, 458-4067, to find the Division of Student Affairs liaison.

I hope this information gave you an introduction to the IRB with resources you can access for more detail. Feel free to reach out to Student Life Studies for assistance as well. We are always here to help.

 

*The overview provided here is very simplified. Please refer to the IRB or Student Life Studies for more specific information.

Filed Under: Assessment

Change is Hard

August 2, 2019 by Darby

Change. It’s hard. People don’t like it. It causes disruption. It’s hard. It causes conflict. It takes too much time. It’s hard….Okay, now that we have that out of the way….

 

Assessment is one way to inspire change. Results from a survey, focus group, observation, or cost/benefit analysis, may tell you that you need to do something different. It could be something big or small, short term or long term. How do you go about creating sustainable, meaningful change, especially when you are leading a group of people who need to buy into the change? For big changes, you need a process or framework to help you sustain that change.

 

I like John Kotter’s (2014) eight-step model of change (https://www.kotterinc.com/8-steps-process-for-leading-change/) as a way to frame and organize a successful process. The first step is to create a sense of urgency. What will motivate people to see the need for immediate action? Human nature causes us to deprioritize things that are not of immediate need. Basically, you need to provide the vision with a deadline.

 

The second step is to build a guiding coalition. In Student Affairs, I think we do that a fair amount. Who are the right people to guide the change and communicate to others? Who is in the unit affected by the change that is passionate about the change? Bring that core group of people together to coordinate your efforts.

 

Step 3 involves building a strategic vision and initiatives. What is the “Big Opportunity” statement that can spur people to volunteer to be part of the change? This builds ownership as a part of the process. People want to be part of something successful, and they need to see what the better future looks like.

 

The fourth step is to enlist a volunteer army. How do you get a large number of people engaged in the change? Granted, your assessment results and/or staff size may not indicate an “army” needs to be involved, but be sure you are asking yourself if there are people who should be involved but are not.

 

Step 5 enables action by removing barriers. What or who are the perceived roadblocks? How can you remove silos and build collaboration? This serves as an opportunity to rethink the processes and systems currently in place. Your assessment results may clearly support some sort of system change to make things more efficient and effective.

 

Step 6 is to generate short-term wins. Be sure to keep track and share your milestones, so everyone can see the progress. Don’t wait for the final bell to release the results. Keep people informed and engaged.

 

The seventh step is to sustain acceleration. Build momentum from each success in Step 6 as you get closer to your goal. Because change can take a while, you need to keep up the energy along the way, so the change doesn’t get abandoned when the next urgent thing comes along.

 

The eighth, and final, step is to institute the change. It’s really important to tie the new change to success, so old habits don’t sneak back into the work. It’s hard to change a habit, so you have to be sure the behavior/process/system sticks and becomes the fabric of what you do until you determine you need another change.

 

I hope that helps you think about the change process in a new light. If you need help, Student Life Studies is here for you.

Filed Under: Planning

Reflections on the Student Success in Higher Education Conference

July 2, 2019 by Darby

Last month, I attended the new NASPA Student Success in Higher Education Conference. It combined four conferences in one: Closing the Achievement Gap, Student Financial Wellness, First-Generation Student Success, and Assessment, Persistence and Data Analytics. Because all of those topics intertwine, the conference had something for everyone.

From the assessment standpoint, I continue to be surprised, although I should not be, at the number of people who have had assessment added to their job without any skills or knowledge about assessment. Over 50 people attended the “Assessment 101” pre-conference session to build themselves a foundation. In an assessment culture session, it was clear that some professionals are at the beginning stages of implementing assessment in their department or division. At the same time, the conversation about student affairs assessment has matured at the conference, so there are sessions that are more advanced, focusing on the use of big data, different data collection methodologies, and using data to identify at-risk students.

Several sessions addressed student employment as a financial as well as a developmental experience, which I think is going to continue to be a hot topic as more students work during college. Some students, even those that work, struggle with food and housing insecurity. They are a challenging population to assess because they may not be readily identifiable.

Students are complex. Higher education is complex. Student affairs is complex. Data is complex. This conference tried to address some of these issues and how they intersect, so that we can help students be successful in their college life and beyond.

NASPA has not announced where the 2020 conference will be, but I encourage you to consider it as part of your professional development options.

Filed Under: Assessment

Documenting Your Assessment

June 1, 2019 by Darby

It’s the end of the year. Whoop! You might be taking a well-deserved breather or already in the planning mode for next year. Either way, don’t forget to document what you did assessment-wise this past year. You probably will not be in your current job forever, and you might even win the lottery tomorrow and decide to quit your job. Are you setting up the next person for success? If you are here next year at this time, will you remember what you did and why you did it?

That’s where good documentation comes in. Don’t worry, it doesn’t have to be really formal. You might jot down notes in a Word document about your program’s purpose, the planning timeline, any outcomes you developed, and how you assessed what you did. If you have some sort of assessment results (statistical output, qualitative comments, summary report, etc.) be sure to save those as well. Maybe most importantly, from an assessment standpoint, document what decisions were made and any subsequent actions taken. Be sure to put the documentation on a shared drive, so other people can access it if you are not around or in charge of that project any more. It can be frustrating to not be able to get the information you need.

You never know when someone will bring up an idea or a change. With the documentation, you know whether that was something that was tried before and/or whether the assessment results would lead to that change being successful. We are all busy people juggling multiple responsibilities, so having that documentation also helps you remember what you did and why you did it. That information can also be part of your assessment plan reporting for accountability purposes.

Don’t make documentation more work than it has to be. It should help you work smarter, not harder. You might think it’s not important or you don’t have time, but it is easier to jot a few notes rather than having to remember all of the details or burden someone else with having to figure it out.

Filed Under: Planning

What I Learn from Program Reviews

May 1, 2019 by Darby

Recently, I was asked to review a relatively new student affairs assessment office at a different institution using the CAS Standards (https://www.cas.edu/). I have done program reviews before, and they always remind me to think about our own processes and the future. This one was no different.

Student Life Studies has been around for more than 20 years. As one of the first “departments” (more than one person) of student affairs assessment, we learned as we went. There wasn’t a manual or an elder giving sage advice. There was not an association of student affairs assessment professionals. There were no conferences devoted to the topic. We made mistakes along the way and continue to look at our own organization for improvement. I think part of my role as an “expert” is to help other student affairs assessment offices get up to speed quicker than we did a couple of decades ago. There are so many improvements in technology, project management tools, and resources that we did not have back then! At this point, I think learning and improvement can be more rapid than it was when we started.

The department I reviewed already had staff training in place to build capacity among their staff. They have in-person group training, individual consultation, and resources available online. Their division staff have gained confidence and competence from these efforts. All of their division staff have access to Qualtrics, so it is easy for staff to send out surveys (which has its own challenges!). They are well on their way to developing a culture of assessment, especially as more staff become accustomed to assessment.

In the last few years, data has become a hot topic. Who has it? Who wants it? How is it used? How should it be used? How is it protected? This unit is already building what has taken us a while to get in place. As systems become more technologically complex, but also potentially more integrated, the data conversation becomes even more important. It’s a way to decrease silos that have developed over the years if the right people can get together to take action. It gives us a more accurate, complete picture of the student experience.

This department has clear support from the relatively new Vice President of Student Affairs, who saw the importance of creating a multi-person office, rather than having it as a small percentage of one person’s job. Student Life Studies has leadership support as well, but I wonder if DSA staff take it for granted because we have been around for so long. The department I reviewed still has the “new” factor in its favor—staff remember what it was like when the department did not exist. Not many Texas A&M DSA staff were here before 1998.

While I am asked to review other departments based on my experience in Student Life Studies, I always learn something from other places about improvements or innovations we can make here. I hope my advice helps them move forward quickly, but I always appreciate ideas that I get from other offices even if they don’t have the same history.

Filed Under: Assessment

Assessment as Counseling?

April 1, 2019 by Darby

A few weeks ago, in the graduate class I teach, one of the students asked (paraphrasing) if I used counseling skills when working with people on assessment (thanks, Dylan!). I hadn’t ever thought about it that way. I am certainly not a trained and licensed psychologist or counselor like the fabulous staff at Counseling and Psychological Services (http://caps.tamu.edu/), and it’s been many years since I had a counseling/helping skills class, but I can see a few similarities. The assessment relationship helps people through a reflective and analytical process, ultimately to make some sort of improvement. Potentially oversimplifying, the counseling process usually takes people through reflection and thought/behavior change to make some decision or take some action to improve.

From the very beginning, the assessment professional has to put the client at ease. The client may be nervous about the process, fear the feedback, or be unsure of the amount of work it might take. Because staff take ownership of their work, they may have some anxiety about how the program or service reflects on their own performance. The assessment professional can address those concerns by talking about the process, the outcomes, and the good things that assessment can help them with. The unknown can be scary, but it can also be seen as an opportunity. The more motivated the client is, the easier the assessment process can be. I see that building rapport is something a counselor needs to be good at to get their client comfortable with the process. I certainly would not want a counselor who made me feel more anxious or stupid.

Counselors and assessment professionals ask good questions. That's part of the job. They know how to push without being pushy, how to challenge without causing clients to shut down, and how to get clients talking. When clients come in for assessment help, they may be lost as to where to start, what to ask, and what to do with results. The assessment professional can ask probing questions to get to the key issues, the background and priorities, the desired outcomes, and the risks of change. That also sounds like a counselor strategy.

Counselors maintain impartiality. The assessment professional is also a neutral third party, in that they don't have a stake in the outcomes of most assessments. That allows them to see things with some clarity, without preconceived notions, and with an open mind for the future (and yes, we all come with some biases related to our experiences; a good professional can recognize and address them as needed). Because the client is immersed in the program/service/activity, they may have a hard time stepping back and looking at things objectively.

Related to that, the assessment professional may be able to see more options because they know about other assessments, how others have used assessment, and what is happening in other environments. While the client is immersed in their program, the assessment professional can look across other assessments and available data that could inform the assessment process or the use of results. Just like a counselor, the assessment professional can look at a topic from several angles and use several strategies to help the client.

Perhaps most comforting, the counselor and the assessment professional both provide support to the client. Sometimes it's being a cheerleader, sometimes a coach, and sometimes an empathetic listening ear. I was going to say a shoulder to cry on, but, to paraphrase a famous movie quote, "there is no crying in assessment." (Okay, that's not really true, but I hope there is less crying in our office than in Counseling and Psychological Services.)

I hope this month’s blog has given you some perspective about the relational and support side of being an assessment professional. I would love to hear your perspective.

Filed Under: Assessment

