Data Interpretation

After reviewing responses, data fields were defined based on the questions to provide both qualitative and quantitative results. A SQL database was designed with a SCHOOLSDATA table for complex responses and a SPREADSHEET table for discrete responses. The table in Appendix 1, section 7.2 lists each data field by table, along with its data type and the question on which its value is based. Questions not marked there were not used for data fields. Although the SPREADSHEET table was initially meant to hold boolean values, this was changed to accommodate the gray areas of partial accessibility and the types of responses received. The table in Appendix 1, section 7.3 shows the code used to represent discrete response types.
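A minimal sketch of such a two-table design, using SQLite: the column names other than ATEDUCSUPPORTSW are hypothetical placeholders, since the actual field list is given in Appendix 1, section 7.2.

```python
import sqlite3

# Illustrative in-memory database; the real schema is described in
# Appendix 1, section 7.2. Columns other than ATEDUCSUPPORTSW are
# hypothetical placeholders.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# SCHOOLSDATA holds complex (free-text) responses.
cur.execute("""
    CREATE TABLE SCHOOLSDATA (
        SCHOOLID INTEGER PRIMARY KEY,
        SCHOOLNAME TEXT,
        WEBSITENOTES TEXT
    )
""")

# SPREADSHEET holds discrete (coded) responses; codes follow
# Appendix 1, section 7.3 (e.g. 0 = No, 1 = Partial, 2 = Yes).
cur.execute("""
    CREATE TABLE SPREADSHEET (
        SCHOOLID INTEGER REFERENCES SCHOOLSDATA(SCHOOLID),
        ATEDUCSUPPORTSW INTEGER
    )
""")
conn.commit()
```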

Contact Information

The given school’s URL was used to locate the college’s main website and verify college contact information. If the given contact information differed from that on the college’s main website, the field was replaced with the information from the main website.

The given AT services department’s URL was used to assess the clarity of information and to identify any inconsistencies between the survey responses and the information on the given website. No information was replaced at this step, but abbreviated office locations were expanded for clarity (example: LIB 2153 was changed to Library building, room 2153).

Website Accessibility

The response to the open-ended question about the college’s website accessibility was evaluated for accuracy by testing each college’s main website for inaccessibility with the evaluation tools WAVE (“WAVE Web Accessibility Tool”) and AChecker (“IDI Web Accessibility Checker: Web Accessibility Checker”). Neither tool can truly determine whether a website is accessible; “only a human can determine true accessibility” (“WAVE Help”). However, both check for indicators of non-compliance with web accessibility standards based on Section 508 and WCAG 2.0. Errors on WAVE “indicate accessibility errors” (“WAVE Help”). Known Problems on AChecker are “problems that have been identified with certainty as accessibility barriers” (“AChecker Handbook”). Both indicate elements that must be fixed before the website can be deemed accessible. The following is the initial algorithm used to test for inaccessibility:

  1. Run AChecker

     • If 1 or more Known Problems, failed AChecker

  2. Run WAVE

     • If 1 or more Errors, failed WAVE

  3. If [(failed WAVE) or (failed AChecker)], change accessibility to 1 (Partial compliance)
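The pass/fail check above can be sketched as follows; the function signature is an assumption, since the Known Problems and Errors counts were read from each tool's report rather than computed programmatically.

```python
def apply_initial_algorithm(current_value: int, known_problems: int, errors: int) -> int:
    """First algorithm: return 1 (Partial compliance) if either tool
    flags the site; otherwise keep the existing coded value.

    known_problems -- AChecker "Known Problems" count for the site
    errors         -- WAVE "Errors" count for the site
    """
    failed_achecker = known_problems >= 1
    failed_wave = errors >= 1
    if failed_wave or failed_achecker:
        return 1  # Partial compliance
    return current_value
```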

Initial implementation of this algorithm had an equalizing effect on the data. A second algorithm was therefore devised to show possible variations in the data:

  1. Each school is given an initial score of 0

  2. Run AChecker

     • Add the number of Known Problems to the score

  3. Run WAVE

     • Add the number of Errors to the score

While this algorithm does not indicate which websites are more or less accessible, it does indicate, for each website, the minimum number of elements that absolutely must be changed before the website could be tested for accessibility by humans.
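The scoring procedure reduces to a simple sum; as with the first algorithm, the counts would be taken manually from each tool's report, so the signature is illustrative.

```python
def inaccessibility_score(known_problems: int, errors: int) -> int:
    """Second algorithm: sum the counts flagged by both tools.

    known_problems -- AChecker "Known Problems" count for the site
    errors         -- WAVE "Errors" count for the site
    """
    score = 0                # each school starts at 0
    score += known_problems  # AChecker Known Problems
    score += errors          # WAVE Errors
    return score
```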

Finally, when asked about assistive technology available to students, two colleges reported they did not provide educational support software at their institution, yet indicated elsewhere in the survey that their offices provided a Kurzweil product, which is in fact educational support software (“Text to Speech, Literacy Software — Kurzweil Educational Systems”). For these two colleges, the value for ATEDUCSUPPORTSW was changed from 0 (No) to 2 (Yes).
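That correction amounts to a single update of the discrete-response field. A sketch, assuming a SQLite backend; the table layout and school identifiers here are hypothetical, since the actual identifiers are not given.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal stand-in for the SPREADSHEET table; SCHOOLID values are
# hypothetical.
cur.execute("CREATE TABLE SPREADSHEET (SCHOOLID INTEGER, ATEDUCSUPPORTSW INTEGER)")
cur.executemany("INSERT INTO SPREADSHEET VALUES (?, ?)",
                [(1, 0), (2, 0), (3, 2)])

# The two colleges that elsewhere reported a Kurzweil product are
# recoded from 0 (No) to 2 (Yes).
cur.execute("UPDATE SPREADSHEET SET ATEDUCSUPPORTSW = 2 WHERE SCHOOLID IN (1, 2)")
conn.commit()
```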

The complete data tables are available upon request.