NASA SSC – Leading Practices Assessment


SSON: Why did NASA decide to go through the process of conducting an LPA?

RA: In 2008, we had been through the majority of our services transitions, and it was time to take the next step: for the NASA Shared Services Center (NSSC) to establish external benchmarks in order to better understand NSSC processes, costs and rates in comparison to private industry standards. We thought it prudent to focus this review on some of our high-volume services, including accounts payable, payroll and travel. It was important to secure this data so we could understand where the gaps were and how to close them while improving our processes, costs and quality of services. While our performance up to the point we decided to go through the LPA had been very good, it was important that we not rest on our laurels. We understood that continuous improvement is critical in the shared services environment, essential to improving processes and quality of services and to lowering costs. In addition to using the information gained internally, we needed to provide our stakeholders insight into the cost drivers for NSSC rates; develop an understanding of policy and process changes which, if implemented, could increase efficiency and reduce rates; and identify new technology that could be leveraged to increase efficiency and reduce rates.

SSON: At what point was your SSC in development when you decided to do LPA?

RA: Approximately 95 percent of the scheduled services had been transitioned and we were beginning to stabilize many of them, so we believed it was time to conduct the LPA.
 
SSON: So you had started the centre a couple of years earlier?

RA: The NSSC became operational in March 2006.

SSON: What did you learn in going through the LPA process?

RA: We found that NSSC performance varied across the three functions. Overall, the NSSC compared most favorably to private industry in the area of quality, as a result of the attention given to compliance and the diligence of the audit and review processes. Conversely, the focus on quality sometimes had a negative impact on cost and productivity, with the largest gaps between private industry and NSSC performance found in the productivity category (i.e. the number of transactions processed per employee per month). This was primarily driven by the number of regulatory requirements and NASA policies that restricted the NSSC's ability to adopt changes that would align performance more closely with private industry. In some cases, this gave us the documentation needed to address specific policy impacts with NASA agency policy makers and effect changes in policy that were beneficial to NASA and NSSC customers.

In many cases, we could not make an apples-to-apples comparison, but we were able to take away meaningful information to help drive costs down and quality up. An example: the LPA found that NSSC's invoice processing costs were higher than private industry benchmarks due to the manual effort required from invoice validation through payment authorization; the lack of a fully integrated and automated technology solution; the time and effort involved in the certification process, which is generally not found in private industry; and employee headcount, which is higher than typically found in industry based on the volume of transactions processed at the NSSC. Based on this specific feedback, the NSSC was authorized by the Office of the Chief Financial Officer to develop a white paper and present a potential solution providing for electronic invoicing within NASA. We are currently on track: the white paper is being vetted and potential solutions are being developed that would address many of the issues noted above.

SSON: You mentioned it highlighted what you were and weren’t doing well. Can you give me a couple of examples?

RA: The primary one that comes to mind is the need for electronic invoicing—mentioned above.

SSON: You mentioned different cost drivers…

RA: Reporting, audit and compliance requirements associated with areas such as financial management are very different in the government than in private industry. In industry, you have the Sarbanes-Oxley Act requirements. In the government, you have the stringent internal control requirements associated with and detailed in OMB Circular A-123, "Management's Responsibility for Internal Control".

SSON: Were there any other government organisations to benchmark against, Rick?

RA: Not that we were aware of.

SSON: Would that be helpful for you?

RA: Yes, it would be very helpful. I know ScottMadden is currently working on a federal sector benchmarking exercise. What I have found is that in the federal sector, and I suspect even in the private sector, there is not a consistent understanding of what shared services is, so the first thing we need is a common definition of shared services to ensure the validity of the benchmarking efforts.

Each organization does business a little differently. For example, in the government there are some organizations indicating that they provide shared services, but they don't have a Customer Contact Center (CCC), which can be a significant cost driver and must be factored into your rates and overall costs. If you don't have a CCC, or other functions such as an electronic data management capacity, there might be a misleading perspective on what it costs to perform a certain function, and you end up with an apples-to-oranges comparison.

SSON: How has the LPA had a direct impact on operations moving forward?

RA: It forced us to take a hard look at the impact of policies and procedures on efficiency and cost.  It gave us the opportunity to make educated recommendations to revise existing policies, and it also drove us to make a commitment to use net operating results for capital investments that should eventually drive costs down and improve processes (e.g. electronic invoicing).

To ensure that we were focused on these lessons learned, we incorporated several suggestions and findings as strategic and tactical goals in our Balanced Scorecard.

SSON: How long did that process take?

RA: We formally incorporated the goals in the following fiscal year, FY 2009, but we took the findings and recommendations seriously and, to the maximum extent possible, began considering ideas and plans for making changes early on.

SSON: Can you give me an example of a couple of the biggest impacts that the LPA has had? You mentioned electronic invoicing.

RA: As a result of the gap analysis contained in the LPA, we knew that we needed to adopt a methodology for continuous improvement. Before this, we had organizations attempting to make improvements, many of which lacked the rigor needed to support the claimed outcome; that is, the result was difficult to validate. A team of senior managers at the NSSC, supported by a follow-on Leading Practices Review by ScottMadden, developed a standard for continuous improvement, which was concurred in by the NASA and Service Provider Senior Leadership Teams. This team is in the process of formulating the NSSC Continuous Improvement Implementation Strategy, using Lean Six Sigma techniques and methodologies.

SSON: How many functions are you running, Rick?

RA: We have fifty-one activities across four functional areas: human resources, procurement, financial management and information technology (IT).

SSON: Did you start them all at the same time?

RA: No, we brought activities in each of the functional areas on board in phases over a three-year period, using a disciplined project transition scheduling approach. For very high-volume activities, such as accounts payable and personnel action processing, we made a conscious decision to transition those services later in the schedule, in order to mature and stabilize our operating practices and processes first. This provided the time to work out any significant operational issues prior to transition, preventing significant impacts on our customers and stakeholders.

SSON: Did the LPA cover all four functions?

RA: No, it covered financial management functions, including two of our highest-volume activities. Other services, specifically IT, did not have the volume or content suitable for the LPA.

SSON: Would you then want to roll IT through the LPA after it’s been up and running for a while?

RA: Absolutely, yes. Specifically, we plan to transition an agency IT Tier 0/Tier 1 Service Desk and Service Request System to the NSSC beginning early in Fiscal Year 2011. We have already done a considerable amount of industry benchmarking for these activities but, once they are stabilized, we believe we could learn even more through a follow-on LPA.
 
SSON: Describe an example of one of the best improvements you've made since your organisation began operations, not necessarily related to the LPA.

RA: We have a greater understanding of the importance of using technology to improve the customer experience and to drive costs down. We developed a Grants Status Web page, accessible by all grantees and by center and headquarters managers, which provides immediate information by grant; that is something we previously had to respond to by telephone or email. We are in the process of implementing a customer-relationship management tool, which will provide a more robust search engine as well as a dynamic FAQ capability, and we already talked about the importance of electronic invoicing. A vendor status page enables a vendor to know immediately the status of their invoice in the payment cycle. We also contracted for a thorough assessment of our electronic data management capability/capacity and used the results to make improvements in that area as well.

The more we adopt efficient processes and make NSSC interfaces easier for our customers, the more we can move the NSSC toward a self-service mode, drive costs down and improve customer relationships.

A couple of other things come to mind. First, with the Grants process, we've cut approximately 30 percent of the steps out of the process, which is a significant improvement. Also, we've produced an Extended Temporary Duty (TDY) video for our Web site for all NASA employees who will be going on Extended TDY at the various locations. This doesn't sound like a big deal, but it's very significant because, before the NSSC, there was no consistent application of Extended TDY practices across the agency, and inconsistencies in this process often caused negative financial repercussions for the employee on Extended TDY.

SSON: Tell me about the Grants process. Why was that such a significant improvement?

RA: There were multiple handoffs, not only between the NSSC and NASA Headquarters but even within the NSSC. Grants processing was one of the original activities that we opened with, and we realized there were efficiencies to be gained both internal to the NSSC and at the agency level. We were able to cut about 30 percent of the handoffs, which really improved cycle time. Based on those early process improvements, we have been able to continuously refine and fine-tune Grants processing to optimize both processing (cycle) time and the quality of the Grants documents. We are currently evaluating how to handle the diversity of Grants we process. For example, a $25,000 Grant should be handled much differently than a $25,000,000 Grant. Our goal is to do a better job of managing high-dollar-value, complex Grants. As we take advantage of lessons learned, benchmarking and process improvements, we are able to adapt our practices and processes to better serve our customers and stakeholders.

SSON: What sort of advice would you have for other organisations that are considering going through this assessment process?

RA: Knowledge is power: the more you know about your business as compared to others in similar businesses, the more information you have to become world class. The benefits far outweigh the angst about the potential outcome. You can look at the data that comes in and say, "Wow, we need to do better in this area and this area," or, "We are doing really well in that area." The one stipulation is that you always need to understand the context, the numbers and the way the data was derived. If you understand that context, then you're ready to make informed decisions about areas you might improve.

Find the things that are most comparable to your set of circumstances and don’t try to bite off more than you can chew.

You can try to go after 25 things, but it's going to be difficult. You really want to pick the top 4 or 5 big-ticket items that will give you the biggest bang for the buck in reducing your rates or increasing customer satisfaction.

SSON: So Rick, you purposefully left out some processes knowing they'd be less comparable, is that right?

RA: I wouldn't say we left them out. I'd say we looked at them and asked, OK, what's driving these gaps and differences? In some cases, the process genuinely differs between the federal government and industry; in other cases, there's not that much difference in how industry does it, but we don't know that until we start delving into the detail.

SSON: Given the opportunity to go through the LPA again, is there anything you’d do differently?

RA: I would say maybe do some homework up front and then really restrict your efforts to the top 4 or 5 things that are most comparable and offer the greatest opportunity for improvement.
 
SSON: Thank you Rick!

 
