Groups seek clear guidance on digital exams, assistive technology features

Disability groups want the Education Department's proposed regulations to ensure that digital assessment applications conform to an acceptable set of accessibility standards.

Specifically, Section 200.6 of ED’s proposed regulations to implement the Every Student Succeeds Act, Pub. L. No. 114-95, would require that a state’s academic assessment system provide “appropriate accommodations for each student with a disability, including the ability to use [their own] assistive technology devices.”

The proposed regulations say the use of assistive technology devices must be consistent with nationally recognized accessibility standards, but Oklahoma ABLE Tech and the Association of Assistive Technology Act Programs expressed concern in letters submitted to ED that the proposal lacks “clear requirements” for assessments to be accessible.

“Our members have been dismayed to watch the shift to digital assessments increase, rather than decrease, access barriers [that] penalize students with disabilities by forcing them to learn new built-in access features (e.g., text-to-speech system, magnification technology, etc.) rather than using their own AT,” the groups wrote. “This creates a situation in which the assessment is measuring a student’s ability to learn the new access technology as much if not more than it is measuring their expertise on academic content.”

“These are massive learning and efficiency barriers that will negatively impact students’ performance,” said Linda Jaco, director for Oklahoma ABLE Tech. She added that it is “sort of like asking someone to take their driver’s test with a manual transmission car when they don’t know how to drive a stick shift.”


Learning curve

Indeed, stakeholders say the proposed guidelines appear to overlook the time it would take individuals with disabilities to learn an unfamiliar piece of assistive technology well enough to yield valid assessment results.

Many students with disabilities require access features that cannot be built into assessments, such as voice recognition and more complex alternative input devices like eye gaze. “It takes a long time to train a voice recognition system and it can be very complex to set up direct keyboard commands with screen reader technology,” Jaco said in an interview.

When an assessment is not accessible and their own AT does not work, these students have no independent access to it and are forced to rely on human supports or, in some cases, to take hard-copy “alternative format” assessments while other students take the online digital version, the groups noted.

“In some cases, there simply is no alternative other than the AT the student uses, as that is the only computer interface that works,” Jaco said. “All of the student’s time would be spent in learning the [new or alternative] technology as opposed to working through the questions on the assessment.”

Conforming to national standards

Such scenarios could be avoided if assessments were developed consistent with a nationally accepted accessibility standard such as the National Instructional Materials Accessibility Standard or the Web Content Accessibility Guidelines 2.0, Jaco added.

The groups said that, as written, the proposed regulation applies the phrase “consistent with nationally recognized accessibility standards” to assistive technology devices, which they called “inappropriate and inaccurate” because there are no accessibility standards for AT devices.

Instead, nationally recognized accessibility standards apply to assessments, the groups noted.

In their letters to ED, they said the proposed rules should be revised so that the phrase “consistent with nationally recognized accessibility standards” applies to the assessment, not the assistive technology.

Jaco said in an interview that it is relatively easy to develop a web-based assessment that is WCAG 2.0 conformant and that will support interoperability with assistive technologies “if that is done from the beginning.”

“The problem is that none of the developers did that, nor did the [Education Department] provide strong guidance to make sure this happened, and now all of the vendors are struggling to fix their [assessment] code,” she said.
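To illustrate what that interoperability depends on, the sketch below is a hypothetical TypeScript fragment, not drawn from any vendor's assessment code, that renders a multiple-choice item with standard, explicitly labeled form controls. This is the kind of semantic structure WCAG 2.0 conformance relies on so that screen readers and other AT can identify and operate each control; the question text, ids, and function name are invented for the example.

```typescript
// Illustrative sketch only: a multiple-choice item built from standard,
// labeled HTML controls so assistive technology can expose each option.
// The question text, id scheme, and function name are hypothetical.

function renderChoiceItem(container: HTMLElement): void {
  const fieldset = document.createElement("fieldset");

  // A <legend> ties the question text to the group of answer options,
  // so AT announces the question when focus enters the group.
  const legend = document.createElement("legend");
  legend.textContent = "Which number is a prime?";
  fieldset.appendChild(legend);

  const options = ["9", "11", "15", "21"];
  options.forEach((text, i) => {
    const id = `item1-option-${i}`; // hypothetical id scheme

    const input = document.createElement("input");
    input.type = "radio";
    input.name = "item1";
    input.id = id;

    // An explicit <label for=...> gives each option an accessible name,
    // rather than relying on visual layout alone.
    const label = document.createElement("label");
    label.htmlFor = id;
    label.textContent = text;

    fieldset.append(input, label);
  });

  container.appendChild(fieldset);
}

renderChoiceItem(document.body);
```

Markup written this way from the start exposes the question and its answer choices through the browser's standard accessibility interfaces, which is what allows a student's own screen reader or alternative input device to work without retrofitting the assessment code.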

Test developers, by and large, believe running third-party assistive technology software over their assessment application poses a major security risk, she said.

Consequently, “even if the application is WCAG conformant, there are still major barriers to students being able to use their own AT because security settings lock out the code that is needed for the AT to run,” she added.

She said her colleagues in the field have seen instances where students are forced to use two separate computers to take the assessments so that the computer with the assistive technology is “secured” from the one with the assessment code.

“It would be great to get some focus on this issue so it is fixed,” Jaco said.

At LRP’s National Future of Education Technology Conference (FETC), experts will provide insights for K-12 districts in this area. This includes workshops that address ways to strategically select technology to help students with disabilities access state standards and assessments, as well as avoid costly disputes in choosing and implementing assistive technology. FETC will be held Jan. 24-27, 2017, in Orlando, Fla.

