The Value of Customer Relationship Surveys


Are customer relationship surveys valuable for an organization? Does establishing a baseline put a shared services entity in a better position to improve services? The NASA Shared Services Center (NSSC) considers the survey method of determining customer satisfaction a key element to its success and continuous improvement.

As a shared services organization, the NSSC is responsible for consolidating, standardizing, and automating select business processes in the functional areas of financial management, human resources, information technology, and procurement. We provide select services from these functional areas to the nine NASA field centers and our Headquarters organization. Customer surveys, supplemented with focus groups and executive interviews in a shared services environment, allow our organization to track the accuracy, timeliness, quality, and overall effectiveness of the support the NSSC provides to NASA employees and customers.

Four survey types

The NSSC employs four types of surveys to gauge its success. The baseline is the initial survey, deployed before an activity transitions to the NSSC. It establishes current employee satisfaction levels for a service being provided by a NASA center. For the NSSC, that means surveying employees at centers where services are currently performed by their local staff or by a support contractor.

After the activity has transitioned to the NSSC and we have performed the service for at least three months, a broad-based survey is issued. Participants are questioned on how well the NSSC is performing the activity versus prior center performance.

The third type of survey used at the NSSC is the transactional or event-driven/high-volume survey, geared toward measuring only that portion of the activity for which the NSSC is responsible. Survey questions measure the four target areas of timeliness, accuracy, quality, and overall effectiveness.

Finally, we employ non-transactional surveys, which are conducted less frequently and address the functional community responsible for overseeing local activity. Survey questions poll the targeted community on all, or multiple, support activities provided by the NSSC. These questions are generally greater in length and complexity.

Customer Relationship Surveys

Baseline
  Purpose: To establish pre-transition conditions against which the effects of a finished project can be compared.
  Frequency: Conducted within a year of the projected transition of services to the shared services organization.

Broad-based
  Purpose: To survey the initial baseline process group to see how well shared services is now performing those functions.
  Frequency: Conducted annually for all activities that have transitioned within the past 12 to 18 months (but not less than 3 months).

Transactional or Event-Driven/High-Volume
  Purpose: To measure only the portion of the activity for which the shared services organization is responsible. Consists of a limited number of questions that measure each of the following areas: timeliness, accuracy, quality, and overall effectiveness.
  Frequency: Conducted weekly, monthly, quarterly, or semi-annually. Remains open for response for five workdays.

Nontransactional
  Purpose: To specifically address the opinions of the functional community responsible for local oversight of the activity. Includes questions that poll that community on multiple activities.
  Frequency: Conducted less frequently, usually annually. Deployed and remains open for five days.


All surveys are conducted electronically via email or the Web. Other methods that may be used include focus groups, telephone interviews, paper/hard copy, in-person sessions, and US mail.

I’d like to highlight that every survey includes a section where our customers can provide written comments if they wish. We review these comments closely and consider them key to a successful survey process.

In designing our survey questions, we incorporate the perspectives of the NSSC functional area (financial management, human resources, information technology and procurement) leaders from both NASA and our service provider, Computer Sciences Corporation. After extensive review by our senior leadership, the questions and a timeline for execution of the survey are established.

Once the survey is deployed, customers are allowed a reasonable response time and response rate statistics are gathered during the process. A draft of the survey results is then provided to the NSSC senior leadership team, and later to the functional area managers.

Survey results are compared to industry benchmarks and leading practices and may be used as a basis for reengineering processes. The results also serve as a tool for identifying efficiencies critical to garnering future NASA community support. Margins of error and confidence levels are also taken into account. Our survey plan requires that all results and anticipated changes in a survey process or procedure be communicated appropriately with survey participants and their management.
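As an illustration of how the margins of error and required sample sizes mentioned above relate, the standard survey formulas can be sketched in a few lines of Python. This is not the NSSC's actual tooling; the 95% confidence z-score, the conservative 50% expected proportion, and the example pool of 500 customers are all assumptions:

```python
import math

def required_sample_size(population, moe=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion within the given
    margin of error, with a finite-population correction applied."""
    n0 = (z ** 2) * p * (1 - p) / (moe ** 2)            # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite-population correction

def margin_of_error(sample, population, z=1.96, p=0.5):
    """Margin of error actually achieved by a given number of responses."""
    se = math.sqrt(p * (1 - p) / sample)
    fpc = math.sqrt((population - sample) / (population - 1))  # finite-population correction
    return z * se * fpc

# e.g. a hypothetical pool of 500 surveyable customers at 95% confidence, +/-5%
n = required_sample_size(500)
```

A useful consequence of the finite-population correction is that small customer pools need a proportionally larger share of responses to reach the same confidence level, which is one reason response-rate statistics are tracked while a survey is open.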

Example: Permanent Change of Station (PCS) relocation services

Consider the following example of survey results and how they influenced the way we do business at the NSSC:

The NSSC conducted surveys on Permanent Change of Station (PCS) relocation services, which are provided to employees moving from one NASA center to another. Services include the sale of the employee’s home, moving and temporary housing expenses, and guidance and counseling services. At the outset of the transition, the paperwork and counseling services were coordinated through the NSSC’s Human Resources Office and the vouchers (reimbursements) were handled by the NSSC’s Finance Office. Early surveys consistently yielded poor marks from a customer satisfaction standpoint and a failure to meet established metrics.

To address this challenge, we employed continuous improvement methods to restructure our organization, consolidated all components of PCS under the NSSC Finance Office, and delegated all PCS responsibility to one manager. Accordingly, our processes were evaluated and improvements were implemented. More importantly, however, we purchased industry software that tracks and manages all the financial aspects of PCS transactions. The new software replaced the older spreadsheet method and, though it has only been in place for a few months, we have already seen improvements in the process. We plan to re-survey and are confident the results will be more favorable.

Follow up

Following any survey, respondents receive a thank-you e-mail with a link to the Executive Summary results, posted on our Customer Service website. Key findings are highlighted by bullets for easy reading and a link is provided for detailed results.

Throughout the survey process, results are gathered, recorded, and compared. Data is collected from all surveys on a monthly basis, and quarterly reports are issued containing the following information:

  • results collection date
  • frequency and type
  • survey pool source
  • total surveyable candidates
  • required sample size
  • level of satisfaction, etc.
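The report fields listed above could be modeled as a simple record — a minimal sketch in which the field names and the `coverage` helper are illustrative, not the NSSC's actual schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SurveyReport:
    """One row of a hypothetical quarterly survey report."""
    results_collection_date: date
    frequency: str              # e.g. "monthly"
    survey_type: str            # e.g. "transactional"
    survey_pool_source: str     # where the candidate list came from
    total_surveyable: int       # total surveyable candidates
    required_sample_size: int
    satisfaction_level: float   # e.g. mean score on a 1-5 scale

    @property
    def coverage(self) -> float:
        """Share of the surveyable pool the required sample represents."""
        return self.required_sample_size / self.total_surveyable
```

Keeping the fields in one structured record makes the monthly collection and quarterly roll-up straightforward to automate.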

Following evaluation, favorable and unfavorable results are targeted for action or resolution during future Operational Readiness Reviews, which serve as a structured approach to ensuring we are adequately prepared to transition and perform an activity.

Lessons learned

One of the main lessons learned through the NSSC survey process is that excessive surveying of our customers should be avoided. We try not to survey customers more than once within a 90-day period and have found that any increase in frequency decreases the response rate.
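The 90-day spacing rule could be enforced with a simple lookup of each customer's last survey date. The sketch below assumes an in-memory dictionary keyed by email address, purely for illustration; a real implementation would sit on whatever customer database the organization maintains:

```python
from datetime import date, timedelta

MIN_GAP = timedelta(days=90)  # survey a customer at most once per 90 days

last_surveyed: dict[str, date] = {}  # customer email -> date of last survey

def eligible(customer: str, today: date) -> bool:
    """True if the customer has not been surveyed within the last 90 days."""
    last = last_surveyed.get(customer)
    return last is None or today - last >= MIN_GAP

def record_survey(customer: str, today: date) -> None:
    """Note that the customer received a survey today."""
    last_surveyed[customer] = today
```

Filtering each candidate pool through a check like `eligible` before deployment is one way to keep survey frequency from eroding response rates.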

Notably, our highest response rates have come from the baseline surveys, distributed prior to an activity transition. The baseline allows for a unique customer perspective and greatly benefits the development of our activity communication plans, which are implemented prior to roll-out. At the NSSC, we make great use of the talents and skills of our center liaisons, co-located at the centers, who serve as our eyes and ears for customer feedback. They communicate with center stakeholders and customers and contribute to our activity transition success rate.

Each survey includes surprises, and it is important not to overreact to the findings. Frequently, there are explanations for less-than-satisfactory marks, and evaluating the answers, investigating the reasons for customer dissatisfaction, and working to change or improve the activity are paramount to success. Despite management commitment to customer service and satisfaction, transitioning and administering an unpopular service is sometimes unavoidable. For example, employee drug testing is a necessary and mandatory federal policy, especially in the dynamic and often dangerous NASA environment. Some employees surveyed have been uncomfortable with the randomness of the testing, the inconvenience, and the invasiveness; however, by increasing our communication efforts, we have made the process more palatable to employees.

Surveying is a continuous process at the NSSC, and we value our customers’ input and take action to improve daily. For an organization to thrive, reflection is a necessary step, especially where surveys are concerned. Initially, the NSSC did not develop an automated database for tracking what we characterize as "nuisance" surveys — the excessive or repeated solicitation of customers who have already completed a prior survey. We learned that an effective, technology-based database can provide the information needed to avoid such mistakes.

A survey strategy, a well-thought-out implementation plan, and continuous improvement are important for a shared services organization. Surveys should be considered proactive tools offering valuable insight into customers’ acceptance of new processes. They also serve as an excellent barometer of customer satisfaction, which, ultimately, is essential to an organization’s success.

About the Author

Richard Arbuthnot oversees all aspects of activating and managing the NASA Shared Services Center, including defining the organizational structure and staffing the nearly 500-person organization. His duties include ensuring all facility, IT and business systems, including chargeback mechanisms, are in place to operate the organization. Mr. Arbuthnot has been recognized for his executive leadership and expertise in organizational structure and design, and is credited with introducing a number of significant initiatives across NASA. Prior to his current role, Mr. Arbuthnot served in various capacities at NASA, including Director of Human Resources at Stennis Space Center and Director of Administration at Kennedy Space Center. Mr. Arbuthnot is a graduate of Kansas State University with a Master’s degree in Public Administration and holds a Bachelor’s degree in Political Science from Wayne State College.

