Virtual reference for videos
and wading through government statistics
(Only 2 of the 3 scheduled speakers were present…) Listening to contributed paper presentations is an excellent way to get a glimpse at innovations coming down the pike. Ron Brown, a doctoral candidate at UNC-Chapel Hill, described the GovStat Statistical Information Glossary (SIG) research project. The SIG project is a component of the National Science Foundation (NSF)-funded GovStat research effort out of the HCI labs at UNC-Chapel Hill and the University of Maryland. The project identifies users’ lack of statistical knowledge as a significant barrier to finding and evaluating statistical information on the web. The goal of the SIG project is to help users become more familiar with statistical terms WITHOUT “pulling [them] away from their primary information task” (à la the parent project’s motto: finding what they need and understanding what they find). The study looked at the effectiveness of offering definitions/explanations of frequently used statistical terms in a variety of formats (text only, animation, and animation plus text) to accommodate different learning styles. It also examined the impact (perceived and real) that three distinct levels of interactivity had on user learning rate and satisfaction.
VR for Videos
The video-based virtual reference IS a novel concept. Most of us (librarians and info specialists) are adept at guiding users to those elusive text-based objects (digital and print). With digitally born objects being made available on the web by the minute, it is only a matter of time before we will be asked to help users locate multimedia bits and clips stored in corporate or institutional repositories, major search engines, or open access resources like the Open Video Project. Xiangming Mu and Lili Luo’s paper details the system design, testing, and pilot user study of the VideoHelp tool. The VideoHelp tool is Java based and employs chat, escorted navigation (basically co-browsing), and shared access to and control over video files stored in a database.
Chat w/escorted navigation should be familiar from most 24/7 VR products currently available. Here is a jpeg¹ of the VideoHelp screen shot (A = video player, B = shared browser, and C = live chat window). One worrisome note: the tool searches video files only via timestamp. There was a cursory mention of metadata, but it was not clear [to me] that other descriptive information was attached or mapped to the timestamps. Would this only work for known-item video searching? Expanding search functionality and options should be explored. Mu suggested that a more extensive usability study would be conducted in the future. An interesting and potentially practical tool. Would it play nice with the developing Open Video Toolkit? I wonder if a product like Jybe will eventually have this capability or if a slick Ajax version is in the works.
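To make the metadata worry concrete: here is a minimal sketch of what mapping descriptive metadata onto timestamps might look like, so that video segments could be found by keyword rather than only by a known time offset. All names and the schema here are my own assumptions for illustration, not VideoHelp’s actual design.

```python
# Hypothetical sketch (not VideoHelp's schema): descriptive keywords
# attached to timestamped segments, enabling keyword search over video.
from dataclasses import dataclass, field


@dataclass
class Segment:
    start: float                                   # segment start, seconds
    end: float                                     # segment end, seconds
    keywords: list = field(default_factory=list)   # descriptive metadata


@dataclass
class VideoRecord:
    title: str
    segments: list = field(default_factory=list)


def find_segments(video: VideoRecord, query: str) -> list:
    """Return (start, end) pairs whose metadata mentions the query term."""
    q = query.lower()
    return [
        (s.start, s.end)
        for s in video.segments
        if any(q in k.lower() for k in s.keywords)
    ]


# Toy record: two tagged segments of a sample clip.
video = VideoRecord("sample clip", [
    Segment(0.0, 42.5, ["launch", "rocket"]),
    Segment(42.5, 90.0, ["moon landing", "Apollo"]),
])

print(find_segments(video, "apollo"))  # → [(42.5, 90.0)]
```

Without something like the `keywords` layer, a user would have to already know the timestamp they want, which is exactly the known-item limitation noted above.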
¹image from 2005 ASIST Proceedings cd. Full papers available in the ASIST 2005 conference proceedings.
Beatrice