GSA Government-wide Section 508 Accessibility Program

Updating the Accessibility Vision: Leveraging Program Maturity Data

Thu, 11/19/2015 - 11:41 -- melinda.davey@j...

On September 28, 2015, Federal Section 508 Coordinators gathered to discuss how to strengthen the federal approach to accessibility program management. Sessions covered GSA’s Government-wide Section 508 Program and its future vision for services and tools, as well as how to measure Section 508 Program maturity. The breakout session “Leveraging the Data,” led by Katie Pittman of the CIOC Accessibility Community of Practice, is described below.

Background

On January 24, 2013, OMB published the “Strategic Plan for Improving Management of Section 508 of the Rehabilitation Act - A Framework for Enhancing and Sustaining Management Improvements to Increase the Accessibility of Electronic and Information Technology.” The Strategic Plan required the CIO Council Accessibility Community of Practice (formerly called the CIOC Accessibility Committee) to develop a standard government-wide template for agencies to use in reporting assessments of their Section 508 programs.

This Section 508 OMB Dashboard/Reporting Template identifies five Section 508 program metrics, each to be assigned one of four possible maturity measures (ad hoc, planned, resourced, and measured). These measures provide a framework for defining the maturity level of key Section 508 program activities, and together imply a general Section 508 Program Maturity Model (sketched informally below).

This session focused on sharing experiences in gathering and reporting this data, understanding similarities and differences in that process across agencies, and discussing next steps.
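
As a rough, illustrative sketch only (not material from the session), the template’s structure can be modeled as a set of metric-to-maturity assignments. The metric names used here are hypothetical placeholders; the actual five metrics are defined by the template itself.

    from dataclasses import dataclass
    from enum import Enum

    class Maturity(Enum):
        # The four maturity measures defined in the reporting template
        AD_HOC = 1
        PLANNED = 2
        RESOURCED = 3
        MEASURED = 4

    @dataclass
    class MetricAssessment:
        metric: str          # one of the five program metrics (placeholder names below)
        maturity: Maturity

    # Hypothetical example; the real metric names come from the template itself.
    report = [
        MetricAssessment("Metric A (placeholder)", Maturity.PLANNED),
        MetricAssessment("Metric B (placeholder)", Maturity.RESOURCED),
    ]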

Agency ICT Management Structure and Organization

Agencies differ in how they are organized with respect to IT budget and policy authority. This difference can be viewed roughly as a continuum ranging from highly centralized, with consolidated budget control and concentrated policy authority, to highly federated, with distributed IT budgets and dispersed policy authority.

Figure 1. Continuum of federal agency IT budget and policy authority, ranging from consolidated, centralized IT budget and policy authority (left) to federated, distributed IT budgets and policy authority (right).

Hypothesis

Review of the data available so far suggests differences in how agencies collect and report their data, differences that can in turn affect data quality and reliability. The hypothesis for this session was that agency data collection and reporting practices differ based on the degree of centralization (or decentralization) of the agency’s IT budget and policy authority.

Exercise

Representatives from eighteen federal agencies participated in the session and identified where they thought their agency fell on the centralization continuum for IT budget and policy authority. The placement was subjective, reflecting each participant’s perception of their agency along the provided spectrum. Eight agencies self-identified toward the more centralized end of the scale, and ten self-identified at the midpoint or toward the more federated end.

The participants were separated into two groups based on their position on the IT budget and policy authority continuum:

  • Group 1 consisted of the eight agencies below the midpoint of the scale, representing more consolidated, centralized IT budget and policy authority.
  • Group 2 consisted of the ten agencies at or above the midpoint of the scale, representing less centralized, more federated and distributed IT budget and policy authority.

Each group was asked to respond separately to two questions about the agency process for collecting and reporting data for the Accessibility Community of Practice (ACoP)/OMB Section 508 Program Maturity Assessments.

The questions and responses recorded from each group are outlined below:

1. Of the processes your agency is currently using for data collection and reporting, what are 3 things that are working out well?

Group 1 (Centralized):

  • Lean collection methods are employed, with a single point of contact for reporting at the highest component level
  • Testing and data collection are embedded in the System Development Life Cycle (SDLC) and performed continuously or at high frequency
  • Vendors and contractors are informed of testing and data collection requirements and are required to fit into the SDLC

Group 2 (Federated):

  • Automated reporting systems and scorecards assist data collection
  • Partnering with the right stakeholders and gaining the support of upper management has proven essential
  • Outreach and training on the reporting systems and processes in place, along with allowing components to provide input into the Enterprise Architecture and IT acquisition processes, improve reporting and results

2. What are the 3 biggest challenges to submitting data that reflects actual agency performance?

Group 1 (Centralized):

  • Vague testing requirements in the reporting template
  • Frequent reporting intervals that might overlap the collection process
  • Inconsistent data collection from components at different levels, and lack of feedback when submitting data up the chain

Group 2 (Federated):

  • Components don’t see the consolidated agency-level data that is submitted
  • Components have short lead times to collect and report data
  • Testing samples may not be representative of the full universe of agency electronic and information technology (EIT)

Discussion and Conclusions

Participants noted an anticipated trend for agencies to move toward a more centralized IT model, with budget and policy authority increasingly concentrated at the agency level, driven by the increased responsibilities assigned under the Federal Information Technology Acquisition Reform Act (FITARA).

All participants then discussed the responses to the two questions. From the recorded responses and the ensuing discussion, a general observation emerged: agencies with more centralized IT budget and policy authority were generally working to improve their processes, the accuracy of their data collection, and the impact of the reporting results within their agencies, while those with a more decentralized model were generally working on the basics of collecting data and coordinating reporting to meet the minimum requirements of the mandated reporting.

Moving forward, the results from this session suggest that sharing lessons learned, training, and outreach will be most effective when agencies collaborate with other agencies at a similar point on the IT budget and policy authority continuum.