Evaluation: Software & Tutorials

Sample Digital Library Evaluation

 

Below is an example of a digital library evaluation process, including the development and ranking of evaluation criteria and their application to an existing collection. It was conducted within the framework of the Evaluate an Existing Digital Library Exercise.

Part I & II: Developing and Ranking a Set of Evaluation Criteria

The eight criteria that I have chosen for evaluating a digital library stem from Tefko Saracevic’s six classes of criteria. The criteria, listed in order of importance, are:

Accessibility

Accessibility comes from the “technology” class. I have chosen accessibility as my first and foremost criterion because one must be able to access the digital library in order to use it. If the means to reach the collection is down and nobody can visit it, then what is the point? The library might as well not exist as far as its digital users are concerned.

Quality

While the quality of a digital library is certainly subject to individual interpretation, this “content” criterion should be met in order to increase the perceived value and encourage sustained use of the collection. The content should be worth the effort of locating it.

Ease of Use

To ensure that a wide variety of people can readily navigate the digital library, Saracevic’s “interface” class should be addressed. Should the navigation cues prove irksome, many users may give up and go elsewhere for their information.

Interpretation Difficulty

This “process/service” criterion comes into play when users cannot understand what is offered. Clarity and the anticipation of possible communication failures can reduce the likelihood that patrons will leave confused and frustrated.

Accuracy

Found in the “content” class, accuracy speaks to the correctness of the material provided. Depending on the purpose and subject of the information search, inaccuracies could lead to the dissemination of false statements, which in turn can cast the information user and the information provider in a bad light. Once you provide inaccurate results, you lose your patrons’ trust.

Success

Success is the goal of the “user” class. We want our digital library visitors to have a fruitful experience and to find what they are looking for. Visitors come to the digital library with a purpose they expect us to fulfill; when we meet their information need and their expectations, we build trust and, hopefully, a following.

Institutional Fit

This criterion of the “context” class is probably lesser known, but worth mentioning. It speaks to the degree to which the digital library is connected to its parent institution. Does it make sense that this provider is the purveyor of this material? This can be important in relation to real or perceived authoritativeness. If the digital library covers an area the institution is not known for, can you trust that the institution is a good source? Also, if times turn hard, the areas least likely to retain investment are those that do not align with the main purpose and goals of the parent institution. Maintaining such a disparate digital library may be a lost cause.

Attractiveness

It sounds shallow, but looks matter, even in digital libraries. It may be that this “interface” criterion is what gets your digital library noticed long enough for the viewer to fall in love with the content. A sense of the aesthetic, though again open to different tastes, seems to be hard-wired into people. Some time should be devoted to this area, especially if you are looking for funding from people who are only peripherally interested in your subject matter.

Part III: Application of the Evaluation Criteria to an Existing Collection

Now that I have outlined eight evaluative criteria, I will apply them to the Digital Library of Appalachia.  For continuity’s sake, I will discuss the criteria in the same order as presented above.

Accessibility

Technologically speaking, I have had no accessibility issues with this website during my contact with it. I was able to locate it easily with the help of a search engine and to return to it once I added its URL to my list of “Favorites.” I conducted both quick and advanced searches, browsed by library, topic, and essays, and reached the various tabs provided without incident. I did have a problem with the preferences, though: once I chose my preferred color scheme and applied it, the appearance did not change. That is a technical problem worth addressing, since some users may struggle with the default color contrast; I find the brownish orange on medium gray a little tough to read.
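
To put that readability impression on firmer footing, the contrast between two colors can be computed with the WCAG 2.0 contrast-ratio formula. The short Python sketch below is only an illustration: the two hex values are stand-ins for a brownish orange on medium gray, not colors sampled from the actual site.

```python
# Minimal sketch: WCAG 2.0 contrast ratio between two sRGB colors.
# The hex values below are stand-ins, not sampled from the actual site.

def relative_luminance(hex_color: str) -> float:
    """Relative luminance per WCAG 2.0 for an sRGB color like '#b5651d'."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearize(c: float) -> float:
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """(L1 + 0.05) / (L2 + 0.05), with the lighter luminance on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Hypothetical "brownish orange on medium gray" pairing.
ratio = contrast_ratio("#b5651d", "#808080")
print(f"Contrast ratio: {ratio:.2f}:1 (WCAG AA asks for 4.5:1 for normal text)")
```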

Quality

In my perusal of the topic of “education” in the Digital Library of Appalachia, it looks as though the contributing members have made efforts to research the topic and provide several resources for additional reading.  The Digital Library of Appalachia’s participants consist of several college and university libraries, institutions unlikely to invest the effort in useless information or have an agenda other than education.

Ease of Use

The DLA is fairly easy for an adult to navigate. There are several discovery options: a keyword “quick search,” advanced searching, and browsing by topic or library. A few times the results took longer than I expected – 10 seconds or more. As people come to expect faster retrieval, such delays could deter use.

Interpretation Difficulty

I did encounter an interesting issue with the terms “violin” and “fiddle.” As I understand it, the two terms are interchangeable. A search for “violin” returned six results, while a search for “fiddle” produced 4141 hits. Clearly, in the minds of those providing the metadata, there is a distinct difference between the two instruments. This is probably a response to the local preference for calling the instrument a fiddle, but it could frustrate users who choose the wrong term.
Additionally, the reading level could become an obstacle for visitors with limited educational attainment. This particular digital library is not very kid-oriented in topic or set-up; the content and expected audience appear to be high school level or older.

Accuracy

I did not find reason to doubt the veracity of any of the resources I examined while checking this digital library.

Success

This is a little harder to gauge, as I did not visit the Digital Library of Appalachia with a specific question to answer, but I did find interesting material when I looked through the resources. I tried a few common terms like “children” and “school” and received lists of multiple items to look through, so I would consider my searches successful.

Institutional Fit

The institutional fit is good. All of the contributing libraries are part of the Appalachian College Association, and the materials are supplied from their special collections. Joining this group gives each member library and its school another way to be discovered and noticed.

Attractiveness

The color scheme of gray, rust, green, and orange does not particularly appeal to me. The quilt examples are interesting, and the photos on the main page catch the eye; I think they would stand out more on a brighter background, though. The layout is O.K. and pretty much what one comes to expect of a CONTENTdm website.

All in all, based on my chosen criteria, I would say that the Digital Library of Appalachia is a helpful resource for those in high school or older who are interested in learning more about this area of our country.

Website Accessibility

Assistive Technology Devices

People with disabilities can use assistive technology devices to help them use computers, access Web information, and search for information. A list of assistive technology devices can be found through the WheelchairNet Organization.

Website Accessibility Guidelines and Evaluation

The World Wide Web Consortium (W3C) offers several guidelines and techniques for maintaining web accessibility. Additionally, it provides a listing of accessibility evaluation resources.

Accessibility Metadata Project. Beyene (2017) discussed accessibility metadata that can improve the accessibility of information and communication technology solutions. Accessibility metadata can be used to expose accessible resources for the benefit of users with disabilities and helps users quickly discover materials that fit their needs. Accessibility metadata terms can be matched to the WCAG 2.0 guidelines. Two WCAG principles, “perceivable” and “understandable,” mostly matched “accessibilityFeature” (e.g., alternativeText, captions, audioDescription, highContrastAudio, highContrastDisplay, tactileGraphic, tactileObject, readingOrder, signLanguage, braille, displayTransformability, etc.), while another principle, “robust,” matched “accessibilityAPI” (e.g., AndroidAccessibility, ARIA, ATK, AT-SPI, BlackberryAccessibility, iAccessible2, iOSAccessibility, JavaAccessibility, MacOSXAccessibility, etc.). The remaining principle, “operable,” was related to “accessibilityControl” (e.g., fullKeyboardControl) and “accessibilityHazard” (e.g., flashing, noFlashingHazard, motionSimulation, noMotionSimulationHazard, sound, and noSoundHazard), as well as “accessibilityFeature” (e.g., readingOrder, structuralNavigation, and timingControl).
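
To make this mapping concrete, a digital library record could carry these terms as schema.org properties embedded as JSON-LD. The Python sketch below builds one such record; the resource and its values are hypothetical, chosen only to illustrate how the four properties discussed above might appear in practice.

```python
import json

# Hypothetical digital-library record carrying schema.org accessibility metadata.
# The resource itself is invented; the property names (accessibilityFeature,
# accessibilityAPI, accessibilityControl, accessibilityHazard) are the
# schema.org accessibility terms discussed above.
record = {
    "@context": "https://schema.org",
    "@type": "CreativeWork",
    "name": "Example oral-history recording with transcript",
    "accessibilityFeature": [      # maps mostly to "perceivable"/"understandable"
        "alternativeText",
        "captions",
        "readingOrder",
        "structuralNavigation",
    ],
    "accessibilityAPI": ["ARIA"],  # maps mostly to "robust"
    "accessibilityControl": [      # related to "operable"
        "fullKeyboardControl",
    ],
    "accessibilityHazard": ["noFlashingHazard", "noSoundHazard"],
}

# Serialize as JSON-LD, e.g. for a <script type="application/ld+json"> block.
print(json.dumps(record, indent=2))
```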

Search Engine Optimization (SEO) Factors. Moreno & Martinez (2013) offered recommendations for Web content accessibility by analyzing the overlap between on-page SEO factors and WCAG 2.0: keyword use (in image alt text, video subtitles and transcriptions, internal/external link anchor text on the page, the first word(s) of the title tag, and H1 headline tags), HTML validation against W3C standards, the existence of a meta description tag, and location within the site's information architecture.
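
A rough way to spot these overlapping factors on a page is to scan its markup programmatically. The Python sketch below uses only the standard-library HTML parser to check a hypothetical page fragment for image alt text, a title tag, a meta description, and H1 headings; it is a simplified illustration, not a replacement for full WCAG or SEO auditing tools.

```python
from html.parser import HTMLParser

class OverlapChecker(HTMLParser):
    """Collects on-page elements where SEO factors and WCAG 2.0 overlap."""

    def __init__(self):
        super().__init__()
        self.images_missing_alt = 0
        self.has_title = False
        self.has_meta_description = False
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):   # no alt text, or empty alt
            self.images_missing_alt += 1
        elif tag == "title":
            self.has_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.has_meta_description = True
        elif tag == "h1":
            self.h1_count += 1

# Hypothetical page fragment used only for illustration.
sample_html = """
<html><head><title>Appalachian Quilts</title>
<meta name="description" content="Quilt patterns from the region."></head>
<body><h1>Quilt Collection</h1><img src="quilt.jpg"></body></html>
"""

checker = OverlapChecker()
checker.feed(sample_html)
print("Images missing alt text:", checker.images_missing_alt)
print("Has <title>:", checker.has_title)
print("Has meta description:", checker.has_meta_description)
print("H1 headings:", checker.h1_count)
```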