Project Name:

Development of a Prototype for the Standard Application Process Portal

Contractor: RTI International

Lessons Learned

  • Internal review cycles prior to submission to OMB can be time consuming and can create timeline risks for projects with short periods of performance.
  • Internal review prior to OMB submission will catch potential issues and shorten the ultimate OMB review time.
  • An OMB generic clearance will require full guides and scripts to be developed for each population.

The process of obtaining OMB approval was more involved and took longer than anticipated. Given the short length of the contract and the need to complete deliverables on schedule, it likely would have been more appropriate to obtain feedback from key stakeholders using methods that do not require OMB approval.

  • OMB review processes also affected the timelines for user testing against the prototypes.
  • We started the OMB package for user testing early, based on our experience with OMB review for user research. However, this resulted in a misalignment between the OMB interview scripts and the features of the prototype, which had not yet been developed.

Summary:

o OMB: The combination of long cycles of serial OMB review and multiple OMB packages did not work well with this period of performance. OMB reviews took about three months for user research and almost six months for user testing. Ideally, the OMB scripts for user testing needed to be developed after the prototype was developed, in order to have a question flow that better aligned with the product, and for features and labels to be accurately identified in the interview guides. However, if that had happened, the timeline for prototype development, OMB approvals, and user testing would have necessitated a longer period of performance. Another approach would have been to design the user testing with methods that did not require OMB approval. The best of both worlds would be for the OMB guides to be flexible in wording and question order, so that the process does not become serial, yet testing outcomes can still reflect a broader set of users.
o Recruitment: We used an NCSES-based email list to recruit volunteers for both user research and user testing against the prototype. The benefit of this approach was that multiple concurrent projects had equitable access to potential participants. A downside was that the email list comprised people who had opted in voluntarily and were not necessarily users of the existing portal site. Ideally, we might have explored recruiting from registered users to ensure that more people who had used the process or the system provided feedback.
o General: The participants and stakeholders are very passionate about this process and about access to restricted federal data, and there is a lot of energy available for improvements and growth.

Disclaimer: America’s DataHub Consortium (ADC), a public-private partnership, implements research opportunities that support the strategic objectives of the National Center for Science and Engineering Statistics (NCSES) within the U.S. National Science Foundation (NSF). These results document research funded through ADC and are being shared to inform interested parties of ongoing activities and to encourage further discussion. Any opinions, findings, conclusions, or recommendations expressed above do not necessarily reflect the views of NCSES or NSF. Please send questions to ncsesweb@nsf.gov.