
November 2015

Contents

Executive Summary

Background

NZ Government Web Standards

Goals of the 2014 Self-Assessments

Agency response

Analysis and results

Self-assessment process and timeline

Conclusions and future plans

Appendix A: Self-Assessment Methodology

Appendix B: Timeline

Executive Summary

The 2013 New Zealand Disability Survey, and projects like the Govt.nz website redesign, make it clear that the public come to Government websites with a range of abilities. For instance, 40% of New Zealand’s working population have low literacy levels.[1] Such facts inform the New Zealand Government’s intention to make its websites easy to use for as many people as possible.

A website that everyone can use, including people with disabilities, has greater reach, allowing more people to interact with government when and where they want. At the same time, this helps to minimise the business costs associated with providing the same information and services through more traditional channels, e.g. call centres.

In line with the above intention, the New Zealand Government Web Accessibility and Web Usability Standards, issued July 2013, set requirements for the design, development, and content of Government websites.

The Web Accessibility Standard is based on internationally recognised guidelines and is the Standard for ensuring access for people with disabilities. The Web Usability Standard addresses a number of policy obligations (e.g. copyright, privacy) and sets some minimum content requirements to help all people use Government websites.

Input from a number of sources makes it clear that NZ Government agencies’ and commercial web vendors’ knowledge, commitment and expertise with regard to web accessibility are variable. Both agencies and vendors need more accessibility guidance and training.

2014 Self-Assessments

From November 2014 to February 2015, agencies self-assessed their websites against the Standards. The goals of the self-assessments were to report on agencies’ progress, set a baseline for planning continual improvement, and raise awareness, knowledge and skill with regard to the Web Standards. This was not a compliance exercise; rather, it was intended to help identify where improvements and additional training were needed.

To conduct a self-assessment, each agency had to select and assess a collection of web pages against the Web Standards, and report the results along with a risk management plan. The assessment methodology was developed by the Web Standards Working Group (WSWG) with input from Statistics New Zealand. The WSWG also audited agencies’ work for accuracy against the Web Standards, and interpretation of risk. The WSWG is made up of representatives from various Government agencies and disability communities selected for their specialist knowledge. They are supported by digital specialists in the Government Web Standards office.

Results

The results from agencies’ self-assessments identified the requirements with the lowest average compliance across NZ Government websites. With a few differences, these were confirmed by the WSWG’s audit of those results (Figure 1). Notably, the requirements with the lowest average compliance tend to be those that are most often incorrectly assessed or understood, indicating low levels of awareness or knowledge of these requirements.

Variable knowledge and skill

The WSWG audits of agencies’ self-assessments indicate that there is significant variability in agencies’ Web Standards knowledge and skill. In some cases, a web page was deemed by an agency to have passed a requirement, whereas the WSWG deemed the web page to have failed, and vice versa.

The difference between these audit and self-assessed scores indicates the degree to which conformance was accurately assessed or a requirement correctly understood by an agency. There is a clear correlation between this variability and the degree to which an agency’s websites meet the Standards: the less an agency understands a specific requirement, the less likely it is to implement it correctly.

Website design and development

The self-assessment results indicate that NZ Government websites are designed and developed in a manner that tends to overlook the range of ways that people access and interact with web content. In particular, they present barriers to access and ease of use for people with impaired vision, and those who don’t use a mouse or who might use special software or other devices to interact with web content.

Website policy and contact information

Agency websites tend not to include all the content required for copyright and privacy statements. They also do not always provide a clear mechanism for contacting an agency through its website. This can make it difficult for a person to contact the agency, for whatever reason, after interacting with its website.

Assessing and managing risk

On the whole, agencies successfully completed the self-assessments and risk management reports. However, they did have some difficulty translating their self-assessment findings into relevant risk statements and actionable mitigation plans.

Conclusions

The 2014 Self-Assessments were largely successful in meeting the intended goals. Agencies established where their websites need improvement with respect to the Standards. At a cross-government level, the least understood and correctly implemented requirements have been identified, setting the priority areas for future training and guidance. Through performing the self-assessments and attending self-assessment workshops, participants increased their awareness and understanding of the Web Standards.

As a result of the 2014 Self-Assessments, a number of actions are planned.

Workshops

A programme of ongoing, practical workshops for agencies and vendors will be established to:

shift agency focus from strict compliance to practical risk management, including consideration of accessibility awareness as part of the GCIO ICT maturity model

focus on top common failures and requirements that agencies and assessors understand least, using specific instances from the self-assessments as real examples

address the inclusion of conformance and testing as part of the procurement and product acceptance process

help agencies understand their own motivations for meeting the Web Accessibility Standard, and how to embed the necessary practices in their day-to-day activities and processes.

As most NZ Government web work is outsourced, workshops and guidance will also be provided for the vendor community, and will focus on technical development issues and techniques.

Guidance

Much of the guidance currently provided to government agencies lacks the degree of practical implementation detail that practitioners are looking for. Drawing on the priority areas for work highlighted by the self-assessments, more practical guidance will be published. In order to avoid reinventing the wheel, effort will be spent directing people to and helping them navigate the vast range of guidance and other resources that already exist online.

Improvements to the Web Toolkit website’s structure and navigation will also be considered to make it easier for users to find the guidance they are after.

Virtual community of expertise

Accessibility expertise in New Zealand is limited. Whether through the WSWG or the assistance of informed experts and practitioners, more of a coaching role will be taken, lifting the capability of web practitioners through hands-on training and workshops.

Instead of relying on one working group or a few experts within a single agency, a broader community of accessibility advocates and experts will be established. This will allow for a more efficient and effective sharing of knowledge and skill.

Future assessments

The cost of assessing websites against complex, technical standards is always going to be high. While the self-assessment results are reliably useful for identifying the Web Standards requirements that are least understood or correctly implemented by agencies, the accuracy of those results is relatively low. Given the degree of effort and resources involved, it is questionable if there is sufficient value in agencies self-assessing their websites.

To obtain an overall view of how NZ Government websites are performing against the Standards, more cost-effective, accurate, and repeatable approaches will be considered.

Background

In July 2013, the Government Chief Information Officer (GCIO) issued the NZ Government Web Accessibility and Web Usability Standards. These two new Standards, mandatory for Public Service departments and Non-Public Service departments in the State Services in accordance with a 2003 Cabinet mandate [CAB Min (03) 41/2B], set requirements for the design, development, and content of Government websites to help make them easier for the public to use. They also require that agencies, when asked, assess and report on their conformance with the Standards, and submit a risk assessment and management plan regarding any areas of non-conformance.

In 2011, NZ Government organisations performed a simple self-assessment of their websites against the then current NZ Government Web Standards 2.0. Since then, those self-assessments have been supplemented by ad hoc reviews of government websites and input from the government web community. Together, these indicate significant variability in Government organisations’ and commercial web vendors’ knowledge and skill in implementing and assessing against the Web Standards. This is especially the case where web accessibility for people with disabilities is concerned.

In December 2012, the Web Standards Working Group (WSWG), made up of representatives from various Government agencies and disability communities, was re-established to help agencies meet the Web Standards and deliver more accessible government online information and services. Its first task was to deliver revised web standards for the NZ Government, ensuring that they remained fit for purpose.

To help address the need for training, and to inform a current state assessment, the WSWG developed a programme for agencies to report on their levels of Web Standards conformance. From November 2014 to February 2015, agencies self-assessed their websites against the Web Standards. This report describes the goals, process, and outcomes of that activity, as well as future plans to improve the quality of NZ Government websites by lifting the capability of agencies and vendors with regard to the Standards.

NZ Government Web Standards

The 2013 New Zealand Disability Survey, and projects like the Govt.nz website redesign, make it clear that the public come to Government websites with a range of abilities. For instance, 40% of New Zealand’s working population have low literacy levels.[2] In 2013, 24% of New Zealanders were identified as disabled, a total of 1.1 million people.[3] Such facts inform the New Zealand Government’s intention to make its websites easy to use for as many people as possible.

As more and more information and services move online, it makes sense to maximise the benefits that the web offers to both providers and consumers. A website that everyone can use, including people with disabilities, has greater reach, allowing more people to interact with government when and where they want. At the same time, this helps to minimise the business costs associated with providing the same information and services through more traditional channels, e.g. call centres. These potential gains become greater every year as use of the online channel increases, and the elderly population in New Zealand, which has a greater incidence of disability, grows.

The NZ Government Web Accessibility and Web Usability Standards are available on the Web Toolkit website.

Web Accessibility Standard

The Web Accessibility Standard exists to help ensure that as many people as possible, including people with disabilities, are able to access and interact with online government information and services. Improving accessibility supports the aims of the GCIO as Functional Leader for ICT, and is directly aligned with strategic goals identified in the Government ICT Strategy, and Better Public Services (Results 9 and 10). It also helps the Government to meet its obligations under the Human Rights Act 1993, and the United Nations Convention on the Rights of Persons with Disabilities, which New Zealand has signed and ratified.

The Web Accessibility Standard is founded on the internationally recognised Web Content Accessibility Guidelines (WCAG) 2.0 from the World Wide Web Consortium (W3C). WCAG 2.0 comprises a collection of testable requirements designed to help web content providers reduce or remove barriers to access by people with disabilities.

Web Usability Standard

The Web Usability Standard addresses a number of policy obligations and sets some minimum content requirements to help people use a site and its content. For example, this Standard requires that:

each site includes a link to the Govt.nz website

each site has a “Contact us” page, as well as copyright and privacy policies

each link to a downloadable file indicates the file’s size and format

the main content on each web page can be printed legibly on standard sheets of paper.
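
A few of these content requirements lend themselves to a rough automated screen. The following is a minimal sketch in Python, assuming a page’s HTML is available as a string; the link-text and file-type patterns are illustrative assumptions, not part of the Standard, so its output is a prompt for manual review rather than a conformance verdict.

```python
# Illustrative sketch only -- not an official test procedure for the Web
# Usability Standard. It screens a page's HTML for a few of the minimum
# content items listed above; the patterns used are assumptions.
import re
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collects (href, link text) pairs from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []       # finished (href, text) pairs
        self._href = None     # href of the <a> element currently open
        self._text = []       # text fragments collected inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href") or ""
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, " ".join(self._text).strip()))
            self._href = None


def screen_page(html: str) -> dict:
    """Rough screening for some of the minimum content requirements."""
    collector = LinkCollector()
    collector.feed(html)
    links = collector.links

    def any_link(pattern):
        return any(re.search(pattern, f"{href} {text}", re.I) for href, text in links)

    # Links to downloadable files should state the file's format and size,
    # e.g. "Annual report (PDF, 1.2 MB)". An empty list passes trivially.
    downloads = [(h, t) for h, t in links
                 if re.search(r"\.(pdf|docx?|xlsx?|pptx?|zip)(\?|$)", h, re.I)]
    downloads_labelled = all(
        re.search(r"pdf|word|excel|powerpoint|zip", t, re.I)
        and re.search(r"\d+(\.\d+)?\s*(kb|mb)", t, re.I)
        for _, t in downloads)

    return {
        "link to Govt.nz": any_link(r"govt\.nz"),
        "contact page linked": any_link(r"contact"),
        "copyright statement linked": any_link(r"copyright"),
        "privacy statement linked": any_link(r"privacy"),
        "download links state format and size": downloads_labelled,
    }


if __name__ == "__main__":
    sample = ('<a href="https://www.govt.nz">New Zealand Government</a>'
              '<a href="/contact">Contact us</a>'
              '<a href="/annual-report.pdf">Annual report (PDF, 1.2 MB)</a>')
    for check, ok in screen_page(sample).items():
        print(f"{check}: {'found' if ok else 'review manually'}")
```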

Goals of the 2014 Self-Assessments

The overall goals of the 2014 Web Standards Self-Assessments were to report on agencies’ progress, set a baseline for planning continual improvement over time, and generally raise their awareness, knowledge and skill with regard to the Web Standards. Achieving a particular compliance score or developing league tables comparing agencies’ performance was not a goal, and tends to distract from the more important work of identifying, understanding, and tackling priority areas for ongoing improvement.

These goals were achieved by:

having agencies identify those aspects of their websites that do not meet the Web Standards

having agencies assess the risks associated with those aspects and identify plans to address them

improving agency web assessors’ understanding of, and familiarity with, the Web Standards through self-training and workshops during the Self-Assessment process

establishing a baseline of agencies’ understanding with respect to the Web Standards, and what their most common issues are

directing agencies’ efforts towards continual improvement over time

informing the development and delivery of training and education for agencies and vendors.

Fit for purpose

To reduce the potentially significant workload for agencies with many websites, each agency was asked to assess a representative sample of web pages from across its entire web presence. This is in contrast to the set number of pages per website each agency was asked to assess for the 2011 Self-Assessments. This new approach also ensured that the common issues raised by the self-assessment were generally representative of the agency’s websites.

The following table summarises the main differences between the 2011 and 2014 Self-Assessments.

2014: Each agency assessed up to 5 home pages, 5 “Contact Us” pages, and a maximum of 68 randomly selected pages from across all its websites.
2011: Each agency assessed between 6 and 20 (average 10) pages from a number of its websites. [For the larger agencies, this added up to several hundred web pages.]

2014: Assessment results were recorded in a single spreadsheet, with one pass/fail mark per requirement for each web page, specified by URL.
2011: Assessment results were recorded in an online form with one single, aggregate pass/fail mark per requirement for each website. The specific web pages assessed were not recorded.

2014: Agencies were required to review their own assessment results, and develop and submit a risk management plan.
2011: Agencies were responsible for any follow-up activities based on their self-assessment results.

Agency response

A total of 34 agencies responded. Each of the mandated 29 Public Service departments and 4 Non-Public Service departments in the State Services submitted reports, as did the Earthquake Commission, a Crown Agent not mandated by Cabinet to meet the Web Standards.

Only 28 of the 33 mandated agencies submitted full self-assessments and risk register reports as required. The remaining five agencies submitted reports explaining why they did not perform a self-assessment. With the exception of the Ministry of Pacific Island Affairs, which cited lack of expertise and resource given its small size, the other agencies did not self-assess because the resources that would normally have been assigned to the self-assessment were otherwise dedicated to website consolidation or redevelopment efforts in those agencies.

Analysis and results

Summary

Agencies submitted their self-assessment results, which were then audited by the WSWG. Members of the WSWG possess a degree of Web Standards expertise due to specialist knowledge, workshops delivered during the development of the Web Standards, and participation at regular WSWG meetings. The WSWG also had access to the expertise of the Government Web Standards office.

The results from agencies’ self-assessments, and the WSWG’s audits of those results identify the Web Standards requirements with the lowest average compliance across NZ Government websites. Notably, these same requirements tend to be those that are most often incorrectly assessed or understood by agencies and assessors, indicating low levels of awareness or knowledge of these requirements. It is thus reasonable to focus future guidance and training on these particular requirements.

Website design and development

The self-assessment results indicate that the design and development of NZ Government websites are overly reliant on visual presentation and interaction at the expense of a more inclusive approach that acknowledges the range of ways that people access and interact with web content. In particular, they show a lack of consideration for:

people who don’t use a mouse and instead rely on a keyboard or other input device and software to interact with web content

people with impaired vision

people who use special software to help them interpret and understand the structure and relationships between different bits of content on a web page (e.g. what’s a heading, what’s a list item, etc.).
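
The third point is the kind of consideration that can be partially screened for in an automated way. As a minimal, assumption-laden sketch, the Python below checks only for form controls that lack a programmatically associated label element (one aspect of WCAG 1.3.1); it ignores other valid labelling techniques, so anything it flags needs manual review rather than being an automatic failure.

```python
# Rough, illustrative screen for one aspect of WCAG 1.3.1: form controls
# with no <label for="..."> pointing at them. Other valid labelling
# techniques (aria-label, aria-labelledby, wrapping <label>, title) are
# ignored here, so flagged controls need manual review.
from html.parser import HTMLParser


class FormFieldScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self.controls = []          # (tag, id) of controls that need a label
        self.label_targets = set()  # ids referenced by <label for="...">

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("input", "select", "textarea") and \
                a.get("type") not in ("hidden", "submit", "button", "reset", "image"):
            self.controls.append((tag, a.get("id")))
        elif tag == "label" and a.get("for"):
            self.label_targets.add(a["for"])


def unlabelled_controls(html: str) -> list:
    scan = FormFieldScan()
    scan.feed(html)
    return [(tag, cid) for tag, cid in scan.controls
            if cid is None or cid not in scan.label_targets]


if __name__ == "__main__":
    page = ('<label for="email">Email</label><input id="email" type="text">'
            '<input type="text" placeholder="Search">')
    print(unlabelled_controls(page))  # [('input', None)] -- the search box
```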

Website policy and contact information

There is also a general lack of attention paid to the formal policy-related requirements around copyright and privacy statements, and the need to provide a clear mechanism for contacting an agency through its website.

Assessing and managing risk

Agencies were asked to consider and report on the various risks associated with their websites’ failures of the Web Standards. On the whole, agencies successfully completed these risk management reports. However, they did have some difficulty translating their self-assessment findings into relevant risk statements and actionable mitigation plans. Accordingly, how different Web Standards failures represent risk and how to integrate their treatment as part of the agency’s web management activities are areas in which agencies could use some guidance.

Compliance scores

This report does not include individual agencies’ compliance scores, or the scores as audited by the Web Standards Working Group. As previously mentioned, compliance scores have never been the point of the Self-Assessments. Agencies’ self-assessed result scores are not accurate (see “Audit variances” below). The accuracy of the WSWG audit scores is equally subject to a degree of variability in the skill and interpretations of the WSWG members who performed the audits. Additionally, the audits are not sufficiently representative to reliably adjust agencies’ self-assessed scores. Finally, because the Web Accessibility Standard doesn’t address every possible aspect of a website’s design or development, a high compliance score does not necessarily equate with an accessible website.

This is not to suggest, however, that these scores are not otherwise useful. For instance, at an individual requirement level, the scores suggest which requirements agencies, web teams, and vendors need to address further.

Risk register analysis

The WSWG reviewed the risk registers that agencies submitted to verify their completeness. This review had two parts: to check how each agency rated the impact severity of the risks that it registered, and to identify any risks that it might have overlooked.

To address the first part of the review, the WSWG reassessed the impact severity of each of the risks that the agencies identified. This reassessment was informed only by what was contained in each of the submitted risk registers, together with the experience and expertise of the WSWG. As such, the reassessed impact severities do not reflect any agency-specific information or conditions that the WSWG was not privy to. For example, an agency may have rated the severity of a particular risk higher than expected because of the projected cost of responding to the risk’s occurrence. Without access to such information, the WSWG would have assigned a lower impact severity to that risk. Accordingly, the reassessed impact severities may not reflect the total impact severity for this or that risk given an agency’s actual circumstances.

Based on the reassessed impact severity ratings, a vast majority of agencies originally overrated the severity of the risks introduced by their Web Standards compliance issues. This could be based on facts or information the WSWG was not privy to and that would justify a higher risk impact severity. Or it could indicate a degree of confusion in assessing risk impact severity.

For instance, when rating the fiscal impact of a risk, some agencies considered the cost of proactively and fully repairing any Web Standards failures that contributed to that risk. However, the cause of a risk is not the same thing as its occurrence. A risk may be realised in any number of ways that don’t require a full-scale remediation of the underlying Web Standards issue. It is the fiscal impact of the risk as it occurs that needs to be considered when assessing the risk’s impact severity. In other words, the cost of removing the cause of the risk is not a contributing factor to the fiscal impact of that risk being realised. Agencies would have benefitted from more detailed guidance and instruction on this aspect of the risk assessment.

While the review of agency-assessed risk impacts does not reveal any firm and significant findings, it does suggest that agencies are giving these risks sincere consideration, at least in so far as their inclusion in the risk registers is concerned. Acknowledging the various reputational, business, legal, and fiscal risks associated with a failure to meet the Web Standards is key to addressing that failure. In this regard, agencies’ assessments of their Web Standards risks are positive.

The second part of the risk register review was to identify any risks that an agency might have overlooked and not included in its risk register. This was accomplished by comparing the agency’s self-assessment results and the WSWG’s audit results with the aggregate risks that the agency included in its risk register, and noting any absences.

Of the 29 risk registers submitted by agencies, 23 were missing at least one risk that the WSWG considered significant enough to include. Further, over half of the risk registers were missing at least four risks that the WSWG thought should have been included. This strongly suggests that agencies had some difficulty translating their self-assessment results into representative aggregate reputational, business, legal, or financial risks. Identifying the particular risks and potential impacts associated with different Web Standards failures, then, is something that agencies could use some help with.

Most common severe risks

The severest risks that were commonly identified by agencies were all related to the Web Accessibility Standard. In analysing the risks identified by agencies, the WSWG grouped them according to WCAG’s 12 guidelines. These 12 guidelines establish the overall goals around which the specific WCAG requirements are organised and that web authors should try to meet when enabling access for people with different disabilities.

The following table lists the top 6 WCAG guidelines with the most associated risks. The most common specific causes or issues behind the risks are also noted.

60 risks (15% of total)
Guideline: Provide ways to help users navigate, find content, and determine where they are.
Specific risk causes: lack of visible focus; incorrectly assessed bypass blocks; ambiguous link purpose, often incorrectly assessed.

57 risks (14% of total)
Guideline: Make it easier for users to see and hear content including separating foreground from background.
Specific risk causes: insufficient colour contrast; information communicated solely through colour (esp. links); text delivered in images instead of as text.

56 risks (14% of total)
Guideline: Create content that can be presented in different ways (for example simpler layout) without losing information or structure.
Specific risk causes: form labels not programmatically associated with form fields; headings not marked up as headings; general semantic relationships not provided in HTML mark-up.

34 risks (8% of total)
Guideline: Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols or simpler language.
Specific risk causes: images with missing, insufficient or meaningless alternative text, commonly with logos, but also content images.

34 risks (8% of total)
Guideline: Maximize compatibility with current and future user agents, including assistive technologies.
Specific risk causes: HTML code errors; insufficient exposure of programmatic name/role/state for interactive controls.

28 risks (7% of total)
Guideline: Make all functionality available from a keyboard.
Specific risk causes: lack of keyboard-only access, esp. with carousels and navigation menus.

Consistent with the self-assessment methodology, five of the six severest risks that agencies raised reflect the top ten Web Standards failures identified both by agencies’ self-assessments and by the WSWG’s audits (see “Self-assessment scores” and “WSWG audit scores” below). Agencies had some difficulty identifying all of the relevant risks associated with their various compliance issues. They nonetheless consistently identified significant risk issues with common causes that can be addressed across government through additional guidance and training.

Mitigation plans

Agencies were asked what they would do, and when, to reduce or remove the risks they identified, and to plan their responses should any of the risks occur. The Self-Assessment Methodology included questions that agencies could ask to help them determine their mitigation plans for each risk:

Where can the biggest improvements be made for the least cost and effort?

Can the fix be implemented immediately, or is it more reasonable/efficient/effective to wait for the next design or development window?

What will you do if the risk occurs, e.g. a member of the public complains or asks for an alternative version of inaccessible content?

When addressing Web Standards risks, it is important that mitigation activities be integrated with website project plans and development workflows. Few of the submitted risk registers included mitigation plans that went much beyond identifying what needed to be fixed. For example, if a risk was raised because images lacked alternative text for people who can’t see them, the proposed mitigation was just to add the alternative text. This action did not have start and end dates, nor was it included as part of a broader design or development activity. This makes it difficult to act on and measure success. Additionally, very few of the risk mitigation plans established or considered responses in the case that a risk manifested.

Admittedly, the sample mitigation plans in the risk register template did not clearly establish the expectation that mitigation plans should integrate with existing website management plans or upcoming events, e.g. redesigns, code releases, and where possible, identify dates for implementation. Given the mitigation plans submitted by the majority of agencies, more guidance and effort in this area is required.

Self-assessment scores

The following table lists the top ten Web Standards requirements with the lowest average rate of compliance, as scored by agencies themselves. The average compliance rate is indicated as a percentage, where 100% represents full compliance.

WCAG 2.4.7 Focus Visible (56% average compliance): Sighted users who don’t use a mouse can tell which page element currently has focus.

WCAG 4.1.1 Parsing (57% average compliance): Any errors in the HTML code do not cause browsers or other assistive technologies to misinterpret the page.

WCAG 1.4.3 Contrast (Minimum) (62% average compliance): Text stands out from the background such that sighted users can read it easily.

WCAG 2.4.1 Bypass Blocks (63% average compliance): Users are provided a way to jump past blocks of content that are repeated across multiple pages, e.g. site menu.

WCAG 1.3.1 Info and Relationships (67% average compliance): The structure and relationships between bits of content can be interpreted by assistive technologies.

Usability 2.2.4 Website's 'Contact us' page has required content (68% average compliance): Sufficient methods for contacting the website owners are provided.

Usability 2.3.3 Unambiguous link on home page to website's general copyright statement (69% average compliance): The home page provides an easy way to get to the site’s copyright information.

WCAG 1.1.1 Non-text Content (70% average compliance): Content presented in images is also provided in text.

WCAG 4.1.2 Name, Role, Value (72% average compliance): What each interactive component is called, what it does, and its current state can be interpreted by assistive technologies.

Usability 2.4.2 Website's privacy statement has required content (75% average compliance): Sufficient information regarding users’ information gathered by the site is provided.

WSWG audit scores

Agencies’ self-assessment results were audited by the WSWG. The following table lists the top ten Web Standards requirements with the lowest average rate of compliance, as scored by the WSWG. The average compliance rate is indicated as a percentage, where 100% represents full compliance.

WCAG 1.3.1 Info and Relationships (25% average compliance): The structure and relationships between bits of content can be interpreted by assistive technologies.

WCAG 1.1.1 Non-text Content (34% average compliance): Content presented in images is also provided in text.

WCAG 1.4.3 Contrast (Minimum) (36% average compliance): Text stands out from the background such that sighted users can read it easily.

WCAG 4.1.2 Name, Role, Value (36% average compliance): What each interactive component is called, what it does, and its current state can be interpreted by assistive technologies.

WCAG 2.4.7 Focus Visible (38% average compliance): Sighted users who don’t use a mouse can tell which page element currently has focus.

WCAG 4.1.1 Parsing (54% average compliance): Any errors in the HTML code do not cause browsers or other assistive technologies to misinterpret the page.

Usability 2.2.4 Website's 'Contact us' page has required content (59% average compliance): Sufficient methods for contacting the website owners are provided.

Usability 2.4.2 Website's privacy statement has required content (59% average compliance): Sufficient information regarding users’ information gathered by the site is provided.

Usability 2.3.3 Unambiguous link on home page to website's general copyright statement (62% average compliance): The home page provides an easy way to get to the site’s copyright information.

WCAG 1.4.1 Use of Color (64% average compliance): Information conveyed through the use of colour is provided in other ways for people with colour deficiencies.

With the exception of WCAG 2.4.1 Bypass Blocks and WCAG 1.4.1 Use of Color, the top ten requirements with the lowest average compliance are the same across both agencies’ self-assessment results and the WSWG audits, albeit with some difference in position (refer to Figure 1). This fairly reliably indicates that, regardless of the accuracy of the self-assessment and audit scores, these are the top ten Web Standards failures present on government websites generally.

Additionally, three of the top four requirements with the lowest levels of compliance (based on the WSWG’s audit), namely WCAG 1.3.1 Info and Relationships, WCAG 1.1.1 Non-text Content, and WCAG 4.1.2 Name, Role, Value, are arguably the three most technically difficult requirements, indicating a need for more training around their implementation and assessment.

Audit variances

For each requirement in the Web Standards, agencies’ own assessments of their web pages were compared with the results from the WSWG’s audit. The difference or variance between the two is an indication of the degree to which compliance with the requirement was accurately assessed or correctly understood. In some cases, a web page was deemed by an agency to have passed a requirement, whereas the WSWG deemed the web page to have failed, and vice versa. The greater the variance, the less the requirement was understood or accurately assessed.
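
The report does not prescribe exact formulas, but on the assumption that each result is a simple pass/fail mark per requirement per assessed page, the compliance and variance figures used below can be sketched as follows (the sample data is invented):

```python
# Illustrative only: assumes each result is a pass/fail mark per requirement
# per assessed page. "Compliance" is the share of pages passing a requirement;
# "variance" is the share of pages where the agency's mark and the WSWG audit
# mark disagree (100% would mean the audit was totally opposite).
def compliance_rate(marks):
    """marks: list of booleans (True = pass) for one requirement."""
    return 100 * sum(marks) / len(marks)


def audit_variance(agency_marks, audit_marks):
    """Share of pages where the agency's mark and the audit mark differ."""
    disagreements = sum(a != b for a, b in zip(agency_marks, audit_marks))
    return 100 * disagreements / len(agency_marks)


# Invented example: five audited pages, one requirement (say, Focus Visible).
agency = [True, True, False, True, True]    # agency says 4 of 5 pages pass
wswg = [False, True, False, True, False]    # the audit says 2 of 5 pages pass

print(f"agency compliance: {compliance_rate(agency):.0f}%")     # 80%
print(f"audit compliance: {compliance_rate(wswg):.0f}%")        # 40%
print(f"audit variance: {audit_variance(agency, wswg):.0f}%")   # 40%
```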

The following table lists the top ten requirements with the greatest average variance between the agencies’ own scores and the WSWG’s audit scores. The variance is expressed as a percentage, where 100% would indicate that the WSWG’s audit scores were totally opposite those of the agencies.

WCAG 1.3.1 Info and Relationships (49% average variance): The structure and relationships between bits of content can be interpreted by assistive technologies.

WCAG 2.4.7 Focus Visible (48% average variance): Sighted users who don’t use a mouse can tell which page element currently has focus.

WCAG 1.1.1 Non-text Content (48% average variance): Content presented in images is also provided in text.

WCAG 4.1.2 Name, Role, Value (40% average variance): What each interactive component is called, what it does, and its current state can be interpreted by assistive technologies.

WCAG 1.4.3 Contrast (Minimum) (38% average variance): Text stands out from the background such that sighted users can read it easily.

Usability 2.4.2 Website's privacy statement has required content (38% average variance): Sufficient information regarding users’ information gathered by the site is provided.

WCAG 2.4.1 Bypass Blocks (34% average variance): Users are provided a way to jump past blocks of content that are repeated across multiple pages, e.g. site menu.

Usability 2.6.1 Main content is printable in full on standard paper (29% average variance): A page’s core content can be printed legibly on standard sheets of paper for people who prefer to read from a hard copy.

WCAG 1.4.1 Use of Color (28% average variance): Information conveyed through the use of colour is provided in other ways for people with colour deficiencies.

WCAG 2.4.4 Link Purpose (In Context) (28% average variance): Each link’s purpose or target is clear from its text and context.

To the degree that the variance between the agency and WSWG audit scores reflects the agencies’ understanding of a requirement, one would expect an inverse relationship between variance and compliance score. This is what the data shows: as compliance scores improve from requirement to requirement, the variance between WSWG audit and agency self-assessment score tends to go down (see Figure 2 below).

Of interest is that the top six requirements with the highest variance appear in the top ten requirements with the lowest compliance scores from both agency self-assessments and WSWG audits. These six requirements are both common failures across agency web pages, and are also those most often incorrectly assessed or understood.

Of equal interest is that the top five requirements with the highest variance are arguably five of the most important WCAG requirements for ensuring accessibility of content for people with disabilities. These include:

ensuring that semantic information (e.g. what’s a heading, what’s a list, etc.) conveyed through visual presentation is also available to the technologies used by people who cannot see the visual presentation (WCAG 1.3.1)

providing a visual indication of which page component currently has focus for sighted users who rely on a keyboard only or other non-mouse input device (WCAG 2.4.7)

providing textual equivalents for images and other non-text content (WCAG 1.1.1)

ensuring that the role (e.g. button), name (e.g. “Search”), and state or other values (e.g. pressed, visited) of interactive components are made available to the technologies used by people with different disabilities (WCAG 4.1.2)

providing sufficient contrast between text and its background to ensure easy legibility (WCAG 1.4.3).
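
To make the last point concrete: WCAG 2.0 defines contrast as a ratio of relative luminances, with a minimum of 4.5:1 for normal-size text and 3:1 for large text. The sketch below applies the published formula to two example colours; it is a worked calculation, not a substitute for a full colour audit of an agency's site.

```python
# Worked example of the WCAG 2.0 contrast calculation (Success Criterion 1.4.3).
# Relative luminance and the contrast ratio follow the formulas published with
# WCAG 2.0; the thresholds are 4.5:1 for normal text and 3:1 for large text.
def relative_luminance(hex_colour: str) -> float:
    def channel(value):
        c = value / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)


def contrast_ratio(foreground: str, background: str) -> float:
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)),
        reverse=True)
    return (lighter + 0.05) / (darker + 0.05)


ratio = contrast_ratio("#767676", "#ffffff")   # mid grey text on a white background
print(f"{ratio:.2f}:1 ->", "passes" if ratio >= 4.5 else "fails", "for normal-size text")
```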

It’s also worth noting that WCAG 2.4.1 Bypass Blocks and WCAG 1.4.1 Use of Color, the two requirements that were not common to both the agency self-assessments and the WSWG audit results, do appear in the above list of top ten requirements with the highest variance. This is consistent with agencies not fully understanding these two requirements and therefore incorrectly failing pages on WCAG 2.4.1 Bypass Blocks, and missing failures of WCAG 1.4.1 Use of Color.

Average cost and time

Some agencies conducted the self-assessment internally, while some contracted external consultants. Still others conducted the easier portion of the self-assessment themselves, while contracting out the more technically complex portion of the work.

The total spend as reported by all agencies was $86,022. The total time taken as reported by all agencies was 2,289.5 hours.

For those agencies that performed the self-assessment internally only, the average number of hours spent per agency was 116.5.

For agencies that entirely outsourced the work, the average cost per agency was $13,193. For agencies that did some of the work internally, and outsourced part of it, the average money spent per agency was $9,036.

Note that the three averages above do not include the time or cost reported by agencies where those values were below a reasonable threshold for having performed a meaningful self-assessment, as deemed by the WSWG. The WSWG estimated that a comprehensive assessment against just the Web Accessibility Standard would take approximately 1 hour per page. This translates to approximately 78 hours of assessment time for most agencies, and does not include the time required to identify the web pages to be tested, the risk analysis of those test results, or the preparation of the final report.

It is true that not all agencies had to test the same number of web pages, as the number of pages to assess depended on the number and size of an agency’s websites. Yet, even accounting for the number of pages assessed, the time or dollars spent by some agencies still does not represent the type of sustained effort that a reasonably complete self-assessment would require. For instance, one agency indicated that it assigned only internal staff for a total of only 9 hours to assess 36 pages, while another agency had an external resource complete a self-assessment of 75 pages in just 6 hours for a total cost of $900.

Other trends

A range of other factors were examined to understand if there were any relationships between them:

Overall compliance score vs. degree of audit variance, i.e. do high compliance scores tend to be associated with high audit variances?

Internal/external assessor vs. compliance score and audit variance, i.e. do self-assessments conducted by external contractors tend to have higher or lower compliance scores, and are they more or less accurate than internally performed self-assessments?

Time/cost vs. compliance score and audit variance, i.e. does more time or money spent on a self-assessment result in higher or lower and more or less accurate compliance scores?

In a few of the above scenarios, there were some weak correlations indicated, but given the number of samples and the significant variability in the distribution of data points, there is not enough confidence to draw a reliable conclusion regarding any relationships between the factors considered.
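
By way of illustration only (the figures below are invented, not the agencies’ actual data), a quick correlation check across roughly 30 data points shows why a weak association of this kind is not enough to draw conclusions from:

```python
# Invented figures for illustration only; not the agencies' actual data.
import random
from statistics import correlation  # Pearson's r (Python 3.10+)

random.seed(1)
hours_spent = [random.uniform(5, 200) for _ in range(28)]          # 28 agencies
compliance = [55 + 0.05 * h + random.gauss(0, 15) for h in hours_spent]

r = correlation(hours_spent, compliance)
print(f"Pearson r across {len(hours_spent)} agencies: {r:.2f}")
# With this few data points and this much scatter, a modest r value does not
# support a reliable conclusion about any underlying relationship.
```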

Self-assessment process and timeline

As part of the self-assessment process, agencies produced the following artefacts:

self-assessment results for all web pages tested, with pass/fail marks for each requirement in the Web Standards

a summary of time invested and cost incurred by completing the self-assessment (including both internal and external resources)

an aggregated assessment of the risks resulting from non-conformance with the Web Standards, to assist agencies in identifying priorities for remediation efforts.

The artefacts were audited by the WSWG for accuracy against the Web Standards, and interpretation of risk.

From their audits, the WSWG produced the following artefacts:

assessment results for a 5-page subset of each agency’s sample of web pages, with pass/fail marks for each requirement in the Web Standards

a list of the most common Web Standards failures across agency websites (and therefore possible training opportunities)

a list of the most common significant shortcomings in the skill and knowledge of agency practitioners (and therefore possible training opportunities) with regard to the Web Standards

feedback for each participating agency regarding their own performance

feedback for each agency regarding their assessment of risk with respect to their Web Standards compliance issues.

Details of the Self-Assessment Methodology can be found in Appendix A.

Timeline

The original plan, as communicated to agencies at the end of May 2014, was to initiate the official self-assessment period in July 2014 on the one-year anniversary of the new Web Standards. The self-assessment period was to last for two months. However, delivery of the support materials, in particular the Web Accessibility Standard Assessment Guide, was significantly delayed due to resource constraints. The Self-Assessments formally began mid-November 2014, and they lasted until mid-February 2015.

See Appendix B for a detailed chronology of the 2014 Self-Assessments.

Conclusions and future plans

A worthwhile effort

The Web Standards exist to help ensure that the public can access and use government online information and services in an easy and effective manner. Having agencies self-assess their websites is one way to both lift their capability in this regard and determine what yet needs to be done to improve the public’s experience. The 2014 Self-Assessments were a lot of work for agencies, but they were successful in meeting most of the intended goals.

Activities such as the self-assessment and review, as well as the hands-on workshops that were held, help those who performed the work to advance their own understanding of and familiarity with the Web Standards. Feedback from workshop participants indicates that the opportunity to discuss the requirements, learn what issues other agencies are facing, and ask questions of those with more expertise and experience was very beneficial.

While analysis of the reports submitted by agencies confirmed some things we already suspected, it has equally provided evidence regarding which aspects of the Web Standards agencies and agency websites are having the most difficulty with.

A majority of agencies seem to have had difficulty fully translating their self-assessment results into related risk issues with actionable mitigation plans. However, there is significant value, to both the WSWG and agencies, in their having completed a formal process whereby they reviewed, considered, and proposed solutions for the Web Standards issues they found and the risks those issues presented. It is only through this kind of review that an agency will be able to understand its current state with regard to the Web Standards, and know what to address in order to improve over time.

Together, the results from agencies’ self-assessments, the WSWG audits of those results, and the risk register reports, are very important for informing the training and education for agencies and vendors that is now to be developed and delivered.

Future assessments

As noted in this report, the 2014 Self-Assessments resulted in some important outcomes. However, the cost of assessing the compliance of NZ Government websites against complex, technical standards is always going to be relatively high. Agencies maintain a large number of websites, and a meaningful, representative assessment necessarily takes time and money. Given the degree of effort and resources involved, it is questionable if there is sufficient value in asking agencies to formally self-assess in the same comprehensive manner, at least over the next several years. The self-assessment results, while generally consistent across agencies and websites, are not highly accurate. They are very useful for identifying common issues to be addressed, but they are not particularly useful for establishing a baseline measure of compliance per se against which future assessments can be compared to determine degree of improvement.

The real benefit of the self-assessment process is to engage agencies and agency web practitioners with the Web Standards, increase their awareness, knowledge and skill in that regard, and help them integrate practices that will support continual improvement in the delivery of conformant, accessible websites.

To obtain an overall view of how NZ Government websites are performing against the Web Standards, it could be that a more centralised assessment of a sample of pages from across all-of-government would be a more cost effective, accurate, and repeatable approach to take. Another approach might be to prioritise the assessment of the key agencies with the greatest online delivery of Government information and services.

In international jurisdictions, both self-assessment and testing by a centralised body are approaches that have been used for measuring web standards progress. In New Zealand, the Government has a number of other standards and requirements frameworks for which there are assessment processes, e.g. Protective Security Requirements, Privacy Maturity Assessment Framework, Public Records Act 2005 Audit Programme. If another Web Standards assessment is deemed reasonable, it will be important to learn from and align with these other activities when considering possible approaches. It will be equally important for the team that coordinates any future assessments to consider ways to address those aspects of the 2014 Self-Assessments that could have been improved.

Self-Assessment areas for improvement

Risk assessment and mitigation

It was important to have agencies address their Web Standards failures from a relatively formal risk management perspective to acknowledge and more fully integrate them as part of their broader ICT and assurance-related activities. That said, the methodology for assessing and addressing risk severity and impact was complicated and lacked clarity.

Expectations for the risk mitigation plans to be submitted by agencies to the GCIO were not clearly set. Nor were agencies provided sufficient guidance and instructions for developing their mitigation plans. Likely as a result, very few agencies’ risk mitigation plans integrated their proposed actions with their other web management activities (e.g. website redevelopments, redesigns, consolidations, etc.). Too few of the mitigation plans considered what the agency’s response would be in the case that any of the identified risks happened. Too many of the risk mitigation activities amounted to simple statements that the cause of the risk would be fixed, which does not amount to an actionable plan.

Sampling methodology

Expert advice from Statistics New Zealand was sought to inform the methodology for selecting the pages to test. This was to ensure that the self-assessments would deliver statistically meaningful results at the agency level. The methodology was relatively complex, as it required an agency to generate lists of specific page URLs from the web analytics packages of all of its websites, then manipulate and randomise those lists within a special spreadsheet. An alternative was provided, involving a simpler but much more manual selection process. The more websites an agency had to assess, the more complicated the process became, as familiarity with those websites tended to be distributed across multiple individuals and business units.

To the extent that representative results are desirable, it is difficult to see how the page selection approach could be simplified. In the event of future assessments, this question will be pursued with Statistics NZ for possible options.

Qualified assessors

Based on direct feedback during and following the self-assessments, many agency employees noted an increase in their familiarity with, and expertise in, the Web Standards, and particularly web accessibility. This was an important and beneficial outcome of the Self-Assessments. However, many of the individuals who performed the work are not regularly involved in the actual day-to-day web design, development, or content authoring for their agency’s web presence.

As such, there was a lack of experience and expertise among many of the agency practitioners, making their task of assessing web pages against the Web Standards relatively more frustrating and difficult, and their results relatively less accurate. This was borne out in the questions and discussions raised by participants at the self-assessment workshops.

Workshops

The Self-Assessment workshops held in January and February 2015 were very well received. Participants and others have since enquired regarding future workshops. A programme of ongoing, hands-on workshops regarding the Web Standards will be planned. The results from the 2014 Self-Assessments will allow these workshops to:

shift agency focus from strict compliance to practical risk management, including consideration of Web Standards awareness as part of the GCIO ICT maturity model

focus on top common failures and requirements that agencies and assessors understand least, using specific instances from the Self-Assessments as real examples

address the inclusion of Web Standards compliance and testing as part of the procurement and product acceptance process

help agencies understand their own motivations for meeting the Web Standards, and how to embed the necessary practices in their day-to-day activities and processes.

As most web development work across government is outsourced, it is expected that workshops for agencies will tend to be oriented more to the authoring of accessible web content, and how the Web Standards need to be addressed by internal policies and processes (e.g. risk management, procurement). Workshops for the vendor community, on the other hand, will tend to focus on more technical development issues and techniques.

Guidance

The NZ Government Web Toolkit (webtoolkit.govt.nz) already contains an array of guidance regarding the Web Standards and their application. Yet, much of this guidance lacks the degree of practical implementation detail that practitioners are looking for when wanting to meet this or that requirement. There’s also been feedback that the site’s structure and navigation make it somewhat difficult for users to find the guidance they are after, so the team responsible for the Web Toolkit will consider improvements on that front.

Drawing on the priority areas for work highlighted by the 2014 Self-Assessments, more practical guidance will be published on the Web Toolkit. The Web Accessibility Standard and Web Usability Standard Assessment Guides used by agencies for their self-assessments will also be published. In order to avoid reinventing the wheel, effort will be spent directing people to and helping them navigate the vast range of guidance and other resources that already exist online.

Virtual community of expertise

The availability of Web Standards expertise among agencies and vendors in New Zealand is limited. By taking more of a coaching role and delivering hands-on training and workshops - whether through the WSWG or the assistance of informed experts and practitioners - the capability of web practitioners will be lifted. As this happens, we will want to take advantage of their increased expertise and interest to establish a broader community of Web Standards advocates and experts that can answer questions and share their knowledge and skill, instead of relying on one working group or a team within a single agency.

The GovtWeb Yammer group is an example of an online collaborative space where this type of sharing of Web Standards knowledge and experience among agencies is already taking place, and that might serve as a prompt to establish a distributed community of Web Standards experts.

Appendix A: Self-Assessment Methodology

The Self-Assessment Methodology had roughly three main steps:

Identify the pages to assess.

Assess those pages against the Web Standards requirements.

Complete and submit the self-assessment report and risk register.

Selecting pages to assess

Each agency was asked to assess up to five home pages, up to five ‘contact us’ pages, and depending on the total number of web pages across all of its sites, up to 68 randomly selected pages. The randomly selected pages were weighted towards the most visited pages from the agency's most visited websites. In other words, the sample included more popular pages from the agency's most popular sites and fewer pages from its least popular sites. At these sizes, the sample is statistically representative of the agency's overall web presence within a 10% margin of error, 18 times out of 20 (90% confidence level).

The Self-Assessment Methodology suggested procedures by which these pages could be selected, for example, by using Google Analytics. The self-assessment spreadsheet in which agencies recorded their actual assessment results also included some tools to help agencies identify the pages to test.
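
The spreadsheet tooling itself is not reproduced here, but the two ideas behind the selection approach, a sample sized for roughly a 10% margin of error at 90% confidence and random selection weighted towards the most visited pages, can be sketched as follows (the page counts and visit figures are invented):

```python
# Illustrative sketch with invented data. Two ideas underlie the selection
# approach described above: a sample size that gives roughly a 10% margin of
# error at 90% confidence, and random page selection weighted by visit counts.
import math
import random


def sample_size(population: int, margin=0.10, z=1.645, p=0.5) -> int:
    """Sample size for a proportion, with finite population correction.
    z = 1.645 corresponds to 90% confidence; p = 0.5 is the conservative case."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # about 68 for large populations
    return math.ceil(n0 / (1 + (n0 - 1) / population))


def weighted_sample(pages, visits, n):
    """Weighted random sample WITHOUT replacement (Efraimidis-Spirakis keys):
    heavily visited pages are more likely to be drawn, each page at most once."""
    keyed = sorted(pages, key=lambda p: random.random() ** (1 / visits[p]), reverse=True)
    return keyed[:n]


# Invented example: 3,000 pages with visit counts skewed towards a few pages.
pages = [f"/page-{i}" for i in range(3000)]
visits = {p: 10_000 // (i + 1) + 1 for i, p in enumerate(pages)}

n = sample_size(len(pages))
print(n)                                  # 67 for a population of this size
print(weighted_sample(pages, visits, 5))  # a handful of the sampled URLs
```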

Assessing the web pages

Once an agency had selected its sample of pages, it was expected to assess those pages against the requirements in the Web Accessibility and Web Usability Standards, whether this was performed in-house or by an external vendor, or some combination of both.

To help with the actual testing, two support documents were provided: the Web Usability Standard Assessment Guide and the Web Accessibility Standard Assessment Guide. Acknowledging that, in many cases, the individuals performing the self-assessments would not be experts in web development or the Web Standards, each of these guides was an attempt to present the requirements in an easy-to-understand manner with practical directions on how to test a web page against them.

Admittedly, especially where the Web Accessibility Standard is concerned, the requirements can be quite technically complex. In order to ensure that any agency employee tasked with completing the self-assessment could succeed, some of the technical detail required for a full understanding and assessment of certain requirements had to be left out. As such, where those more technically complex requirements are concerned, it was expected that some self-assessments might incorrectly identify a pass where in fact there was a failure to meet the requirement, or vice versa.

Reporting and risk management

Upon completing the self-assessment, each agency was to submit a report including both the completed self-assessment spreadsheet and a risk management report comprising an aggregated list of risks and mitigation plans. Additionally, agencies were to include their best estimate of total employee time and/or vendor cost required to complete the self-assessment, including the reporting.

The risk management report lists the risk issues raised by the self-assessment in a simple risk register. A template with sample content was provided.

Instead of drafting a risk statement for every individual failure of the Web Standards, agencies were to group and classify the failures. Common failures were to be grouped as much as possible into single, broader statements that captured the thrust and scope of the overall issue. Then each risk statement was to be assessed using an impact severity matrix adapted from the All-of-Government Risk Assessment Process Report Template.

For each risk statement, agencies were asked to consider what they would do, and when, to reduce or remove the risk. They were also to plan their response should the risk manifest. The risk statements were then prioritised by impact severity, from most to least severe, and the top 10 most severe risks and their corresponding mitigations were entered into the risk register template.
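
As a minimal sketch of that prioritisation step (the risk statements and the severity scale here are invented; the actual impact severity matrix came from the All-of-Government Risk Assessment Process Report Template), the grouping and ordering might look like this:

```python
# Invented example of the prioritisation step: aggregated risk statements are
# ordered by assessed impact severity and the top entries are carried into the
# risk register. The statements and the 1-5 severity scale are illustrative only.
from dataclasses import dataclass


@dataclass
class Risk:
    statement: str
    impact_severity: int   # e.g. 1 (minimal) to 5 (severe) on the agency's scale
    mitigation: str


risks = [
    Risk("Images across the site lack text alternatives, so screen reader users miss key content",
         4, "Add alt text during the next content review; handle complaints via the call centre"),
    Risk("Keyboard-only users cannot operate the homepage carousel",
         3, "Replace the carousel component in the next scheduled release"),
    Risk("The privacy statement is missing required content",
         2, "Publish an updated statement in the current publishing cycle"),
]

TOP_N = 10
register = sorted(risks, key=lambda r: r.impact_severity, reverse=True)[:TOP_N]
for risk in register:
    print(f"[severity {risk.impact_severity}] {risk.statement} -> {risk.mitigation}")
```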

Appendix B: Timeline

29 May 2014: Government Chief Technology Officer (GCTO) asked agencies to self-assess their websites against the Web Standards over a two month period beginning in July 2014

30 June 2014: Delay providing self-assessment spreadsheet and assessment guides

1 August 2014: Update to Web Usability Standard 1.2 — clarifies certain requirements and provides agencies greater flexibility in how they meet them.

11 September 2014:

Self-assessment spreadsheet ready

Assessment guides still in progress

Agencies urged to begin selecting pages to assess

15 October 2014: Web Usability Standard Assessment guide delivered to agencies

22 October 2014: Web Accessibility Standard Assessment guide delivered to agencies

14 November 2014: Self-Assessments officially underway; reports due 13 February 2015

5 December 2014: Updated self-assessment spreadsheet and assessment guides delivered

January – February 2015: Five hands-on workshops attended by various agency assessors, on how to assess specific web content against particular requirements.

4 February 2015: Deadline extended one week to 20 February to acknowledge earlier delays and to allow agencies more time

11–20 February 2015: 24 of 34 self-assessment reports submitted by the 20 February deadline, including one from a non-mandated agency (Earthquake Commission)

23 February–12 June 2015:
