What To Do If You’re Worried People Won’t Respond To Your 360

The better the response rate to a 360-degree feedback survey, the more weight you can give the results. But why do some organizations achieve a 95% response rate, while others languish in the 60s and 70s? And how do you attract more people to respond? One excellent method is to discover why people choose not to respond.

When we introduced our “Opting Out” option to PanoramicFeedback.com, we recognized that it had the potential to add to our knowledge of the factors that influence people’s choices about responding. We studied the results for a year.

Here’s how opting out worked. Potential responders received emails inviting them to provide 360-degree feedback on subjects. Each email offered a choice of two URLs. The first took them straight to the 360 questionnaire to reply.

But if they chose the alternate portal, it allowed them to opt out. It ensured that they would receive no further emails for the particular project, removing all pressure to take part in the 360.

On this special portal, they also found a comment box, so they could, if they wished, tell the survey organizers why they had chosen not to take part. Approximately ten percent provided comments about their motivation.

Our study tabulated hundreds of opt-out messages, a fascinating glimpse into the minds and motivations of potential responders. The study provided valuable information about how to avoid the most common mistakes made by designers of 360-degree feedback.

It is important to note that the overwhelming majority, many thousands of people, elected to respond to the questionnaires, despite the opt-out option. Even when this option is offered, perhaps especially when it is offered, most people are eager to respond.

Opt-out explanations

Those who opted out identified seven major reasons why responders may choose not to take part in 360-degree feedback:

  1. Personal unwillingness
  2. Mistrust of the organization
  3. “I can’t really assess myself, can I?”
  4. Subjects included incorrectly
  5. Responders badly chosen
  6. Inadequate communication
  7. “Buzz words and sound bites”

The responders’ criticisms turned out to be helpful because each one contained the seeds of a solution. We learned from them that it is not very difficult to make 360s more responder-friendly.

(All quotations below have been purged of identifying data in order to maintain confidentiality.)

Response block 1.

Personal unwillingness

Sometimes people are just not willing, and there is little you can do that will change that. They may simply attribute their refusal to “personal reasons”. Clever questionnaire design will not help here. Cajoling will not help. A vague sense of unwillingness is all the potential responder is willing to communicate.

For others, that sense of resistance is perfectly focused. Who can argue with the person whose eyes are already fixed on the start of a new life? “I am retiring in eight more working days and will not be available to participate in this review,” said one.

And who can argue with the sheer pathos of this message: “I would still like to complete this survey, if possible. My mother passed away, and I have not been in the office. Please advise as soon as possible.”

On the other hand, the following message contains a tantalizing blend of admiration with lack of motivation: “Jack is a wonderful person and a fantastic manager. He is professional and dedicated to providing only the best services from himself and his staff. I just simply do not have time to fill out the questionnaire right now.”

These messages suggest that you may be able to improve the way you “sell” the 360-degree feedback process to some, but not all, of your potential responders. We’ll return to that concept as we look at other responses.

Response block 2.

Mistrust of the organization

Some unwilling responders have lost their belief in the organization. “Sorry,” said one responder, “I have become way too cynical for this project.”

Others say that they have lost faith that confidentiality will be honored, a clear indication of trust issues within the organization. Sometimes their lack of trust is exaggerated by individual personality quirks. But they may still have good reason to fear that those who speak their truth within the organization risk punishment.

Occasionally the refusal to respond is a symptom of anger at the subject. “Donna needs to be fired,” exclaimed one responder who decided to opt out. “Her lack in programming [for her department] proves the inadequate job she and her staff are doing.”

Occasionally, cynicism works in the opposite direction. One genuine admirer had read that the responses would be summarized using Olympic averaging (where the highest and lowest scores are discarded, to reduce the impact of extreme responders). The individual believed that would invalidate his enthusiastic response. “I am opting out because Marshall gets all 10s [on a scale of 1 to 10] and your computer would throw my responses out. He has no weaknesses, in my opinion.”
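For readers unfamiliar with the mechanics, Olympic averaging as described above can be sketched in a few lines. This is a minimal illustration of the general technique, not PanoramicFeedback.com’s actual implementation:

```python
def olympic_average(scores):
    """Average a list of scores after discarding the single highest and
    single lowest value, to reduce the impact of extreme responders.

    Requires at least three scores; with fewer, trimming would leave
    too little to average meaningfully.
    """
    if len(scores) < 3:
        raise ValueError("Olympic averaging needs at least 3 scores")
    trimmed = sorted(scores)[1:-1]  # drop one lowest and one highest
    return sum(trimmed) / len(trimmed)

# An all-10s response is not "thrown out": only one 10 is discarded
# as the highest (and one as the lowest); the rest still count.
print(olympic_average([10, 10, 10, 10, 10]))  # → 10.0
print(olympic_average([2, 8, 9, 9, 10]))      # ≈ 8.67
```

As the first example shows, the admirer’s fear was unfounded: an enthusiastic straight-10s response still contributes its full value to the trimmed mean.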

If you receive this kind of response, it’s clear that there is more work for you and your leaders to do in communicating the good faith of the organization and its commitment to continuous learning, open communication, confidentiality, and safety for all.

Response block 3.

“I can’t really assess myself, can I?”

Responders who ask whether they should assess themselves may be driven by a variety of motivations: lack of understanding, laziness, the fear of reflection, or the discomfort of comparing their self-assessments with those of others. One subject saw her own name at the top of the invitation to respond and blurted in the comment box, “This is Myself!” Others react more reflectively: “This is my own personal ID…not sure why I would be evaluating myself.”

Some recognize that they cannot see themselves objectively, and abdicate the opportunity. “The subject of this questionnaire is MYSELF. I have already responded to several questionnaires on others, but did not think it would be any use to comment on myself.”

But others imply fear of the impact of self-analysis. “I feel,” said one, “as if I try to perform the duties required of my position to the best of my abilities, and evaluate myself on a daily basis.” Responders like this one have already worked hard to assess themselves privately, and wonder why they should subject themselves to the added effort of completing a questionnaire.

In many such cases, hesitancy masks a fear that someone else (their boss or coach) will read their honest self-evaluation, and judge them for any discrepancy between it and the assessment of others.

How do you deal with this perceived danger? One opt-out responder suggested s/he had received inadequate preparation as a subject. “I do not remember any guidance that we were to rate OURSELVES. Please get back with me if I should be accomplishing a self-examination.”

The solution is crisp, clear communication about the value of self-assessment. Subjects need to hear in advance of the survey that documenting how they see themselves is a crucial part of the process of self-understanding. When they look at how their self-assessment accords with that of the outside world, they are learning to see themselves with greater objectivity.

Response block 4.

Subjects included incorrectly

When people have been incorrectly identified as subjects, the cost may not be enormous, but the confusion can be demoralizing. Said one potential responder, “I do not know who this person is.” She then added the obvious corollary, “As such, I am unable to provide any effective feedback.” The ironic tone of the comment suggests that the responder did not have a high opinion of the survey organizers.

Another responder replied, “The above individual has retired after 50 years of civil service. I see no need to complete this test for this person.” As the mildly irritated second sentence suggests, errors in the setup of the 360 process do diminish its credibility.

Messages about unnecessary errors abounded in our study: “Marty no longer works at Organization Inc.” “Jeannette has moved overseas.” “This person has accepted a position with another organization and will be leaving next month.” “Mark is no longer my supervisor and is no longer with our organization,” reported one invitee, who then added, again with subtle annoyance, “He has retired.”

Careful management of the selection of subjects can largely eliminate such credibility destroyers. A helpful procedure is for subjects’ names to be submitted to their immediate managers in advance, to ensure that those who have been chosen are not retiring or inappropriate for other reasons.

Response block 5.

Responders badly chosen

Sometimes the selected responders are simply not the best choice. And often they are the best judge of that.

One subject chose a responder whom she hoped would remember her from a single meeting far in the past. “Sorry,” came the reply. “It has just been too long since we had the meeting for me to really be able to provide meaningful feedback about her.”

“Don’t know how Daniel works in the areas mentioned,” said another. “He was working in another area during this period of time and I had no direct contact with him or direct knowledge of his performance.”

Scrupulous invitees sometimes responded like this one: “I do not spend enough time with him during the work week to fairly analyze his performance.” They often said they didn’t wish to provide information that might not be “fair or accurate”. They didn’t wish to skew the results.

“I started through the questionnaire,” said one, “and was unable to relate a lot of the questions to my involvement & contact with Belinda’s roles. I feel that my answers may not truly be reflected as I intended them to be.”

Responders to 360 surveys may be selected by the subject, by supervisors, or by project organizers. It is important that whoever takes on the task be fully briefed about how to choose appropriate responders.

Sophisticated 360-degree feedback providers offer an interface that enables subjects to choose their own responders. In many cases (not all, as one of the notes above attests), this can ensure that responders are chosen who actually know the subject’s work.

Our study found that the majority of opt-outs came in bunches, representing certain organizations that were highly prone to opt-outs. Our impression is that these were generally larger organizations, and that because of the sheer numbers involved, responders may have been chosen automatically or without consultation.

The lesson is that even when there are large numbers to deal with, it is important to ensure that the process includes a personal invitation from the subject to the potential responders. Nothing can be more appealing and convincing than to hear, “I’d appreciate getting feedback from you in particular.”

Response block 6.

Inadequate communication

The confusion noted so far points toward the need for more effective communication about 360. Just when you think the survey organizers have informed everyone in the organization about every issue imaginable, you may be knocked off your feet by a simple message like this one: “I have no idea what this is.”

Somehow your best efforts at explaining the 360 process have entirely escaped this person’s attention, leaving him utterly bewildered about the invitation to respond.

Sometimes you think you have communicated clearly when you haven’t. For instance, in one survey, the administrators failed to make the case to managers about the benefits of using the 360-degree feedback questionnaire as an integral part of the performance management process. As a result, one manager opted out of the process: “As her supervisor, I will be writing Marg’s review directly.”

In this organization, managers were not informed that employees would benefit by seeing their managers’ responses in the context of the responses of their peers, customers, and direct reports. The manager quoted did not understand that 360-degree feedback actually provides a concise basis for writing a performance review.

Sometimes the information provided about the questionnaire is not carefully written, leading people to feel that they don’t know enough to comment on the subject’s performance even when their information may be highly valuable. “Blake is a valued colleague to me,” read an opt-out comment, “enthusiastic, open to new ideas, patient, and very keen to do an excellent job as a manager. I believe Blake’s strength, from our limited interactions, is her openness to explore new ideas and to embrace change.”

Had the remark above been captured in a 360-degree feedback response, it would have been valuable and helpful to the subject. We can assume that the organizers of this survey encouraged the selection of responders like the person quoted above. So it would have been beneficial to communicate that comments would be appreciated even from those who had to select the “Not Certain” response to some questions.

The best way to find out how well you are communicating is to talk with individuals who have no presuppositions. Choose people who have not been involved in the preparation for 360-degree feedback in any way. Ask what they have heard. What confuses them? What do they think about the 360 process? Do they see the value for themselves?

The answers to these questions will provide clear evidence of those areas where you have communicated well and those where more or better communication is required.

Response block 7.

“Buzz words and sound bites”

Here is a response that goes to the heart of questionnaire design. In this case, the questionnaire had been provided to potential responders ahead of time. “I feel I should opt out of responding to this questionnaire because the majority of the skills we are suppose to assess are either not understandable or at this level, not applicable. These skills read more like buzz words and sound bites, and not a skill that can be quantified.”

In another organization, a spelling-challenged individual described the problem more succinctly: “Unable to asses most of the questions.”

Survey organizers may or may not agree with the comment that follows, but what is clear from it is that the writer speaks in the voice of a significant sub-culture in the workplace. “Many of the questions relate to an outdated style of management (e.g. an excessive and outmoded focus on teamwork) and I doubt the value of any results that may be achieved.”

The responder continued: “I read this survey over and over and tried to relate the questions/skills to what is realistic, in our specific business and at my supervisor’s level and could not the majority of the time. I believe there should be a break-out of different skills for different levels of management, and if they are expected to achieve the skills, then the skills need to be explained, trained, coached, and mentored.”

This response reflects the problem of communicating effectively with those who have not bought in to changes in the corporate culture. It also raises important questions about whether the organization is walking its talk about the skills required of managers.

When designing questionnaires, organizers may stumble if they uncritically accept the organizational “line” about competencies. Open discussion with others is often the best way of deciding how to phrase questions, and discovering the extent to which they reflect workplace reality.

360 questionnaire design is not complicated. But it does require some insight into how ordinary people read. The best way to make sure your questionnaire asks what you intend to ask is to ask for frank comments about it ahead of time.

As more organizations encourage their employees to use the Internet, they are drawing into regular reading even those who would not readily pick up a book or newspaper. Like more sophisticated Internet users, these readers prefer brief, direct writing online. So an opt-out comment like this, “survey is too long and it’s confusing,” is a useful reminder of the reading preferences of the average reader.

An unexpected outcome of the decision to give unwilling responders a voice has been that they became instructors not only for the organizers of particular surveys, but for all of us.

© Panometrics Inc.
