Example of Effective Searching

The University of North Texas College of Music has an online audio and video archive for distributing student recitals and large-ensemble concerts. The main purpose of this archive is to get media to people as quickly as possible.

Our users range from college students on their mobile phones or small notebooks to much older patrons of widely varying technical competency. If users can't find what they are looking for within a few pages, I get an email asking where to find their event. Many were more than inconvenienced by the idea of dropping DVDs and CDs in favor of some online system; before, they simply checked their mailbox and had a disc waiting for them. That's a tough order to match online.

While many sites on the Internet focus on retention, my goal is to get people on and off the site in as few steps as possible, with my reward being less confusion and fewer emails.

Using Hotjar video recordings (I was an early beta adopter, too!), I quickly found several parts of the site that were not being used, not working properly, or not producing good results. Google Analytics and other stats applications focused on things like funnels, timelines, and visitor flows, but they could not tell me why people were failing to find their recordings despite surfing page after page. In addition, Hotjar allows tagging during a recording, so I could tag specific events while the user was active, such as selecting input fields, what they typed, hitting enter, or clicking specific links, and watch them later.

Watching in Horror

We had a JavaScript app that searched through a giant JSON list for matches. It was amazingly fast, and I had thought the results were good enough.
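As a minimal sketch of what that client-side approach looks like, the search amounts to a linear scan over a preloaded array. The record shape and field names below are assumptions for illustration, not the archive's actual schema:

```javascript
// Hypothetical records; the real archive's JSON schema is not shown here.
const recitals = [
  { title: "Wind Symphony Fall Concert", date: "2014-10-02" },
  { title: "Jane Doe Senior Recital", date: "2014-11-14" },
  { title: "Opera Theatre: The Magic Flute", date: "2014-12-05" },
];

// Match every whitespace-separated term against the title, case-insensitively.
function searchRecitals(query, records) {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return records.filter((r) =>
    terms.every((t) => r.title.toLowerCase().includes(t))
  );
}
```

This is fast because everything sits in memory, but every keystroke re-scans the entire list, which is part of why low-end phones struggled once the list grew large.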

The recordings showed quite the opposite. Early estimates put the click-through from this search at 10–15%. You could feel the frustration in their mouse movements watching these gut-wrenching recordings. Worse still, the site's other category and sorting views were minimal, because I had assumed search was enough, and I watched members struggle to use those views.

On an iPhone the performance was horrible: it was slow, and on lower-end iPhones it would crash the browser.

Search 1.0

I couldn't find an easy JavaScript solution for better sorting, so I moved the search server-side using a plugin that returned great results. Add a touch of async retrieval, and the change was hardly noticeable. After about three months we are seeing a 60% click-through on immediately returned results, with the other 40% simply hitting enter and being taken to a more in-depth results page. That puts search at about 70% usage across the site.
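The async retrieval piece can be sketched as below. The `/search` endpoint, its response shape, and the debounce delay are all assumptions; the real server-side plugin isn't named in this post. The debounce keeps a fast typist from hammering the server on every keystroke:

```javascript
// Hypothetical endpoint and response shape, for illustration only.
// fetchFn is injectable so the function can be exercised without a server.
async function serverSearch(query, fetchFn = fetch) {
  const res = await fetchFn(`/search?q=${encodeURIComponent(query)}`);
  if (!res.ok) throw new Error(`search failed: ${res.status}`);
  return res.json();
}

// Wait until the user pauses typing before firing the request.
function debounce(fn, ms) {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}

// Browser wiring sketch (assumes an input with id="search"):
// const onType = debounce((e) => serverSearch(e.target.value).then(render), 250);
// document.getElementById("search").addEventListener("input", onType);
```

Because results arrive asynchronously and render in place, the move off the client-side app is nearly invisible to the user, which matches the "hardly noticeable" change described above.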

Me, Me, Me.

What members were typing was not a surprise, but it was not as obvious as it should have been. Student musicians were searching for themselves. They were searching for their own concerts. Duh. At the time only some types of events had performers tagged; we quickly moved to tagging every performer in every concert for more complete results, and saw reduced surfing.

Search 2.0, The Search Strikes Back.

Since the students have to log in, we know their names, so why not search for them automatically? A "You Might Be In" section was added to the landing page after login, searching for each student's exact name string. This knocked input-field search on the landing page down from 80% to 40%, with that 40% going to the new view. The quickest conversions (login to file downloads) come from this input field.

This statistic will only get higher as more events are tagged with complete program information.
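In sketch form, assuming each event carries a list of tagged performer names (the field names here are placeholders, not the real data model), the "You Might Be In" section is just an exact-match filter on the logged-in name:

```javascript
// Placeholder data model: performers tagged per event.
const events = [
  { title: "Jane Doe Senior Recital", performers: ["Jane Doe", "John Smith"] },
  { title: "Symphony Orchestra Concert", performers: ["John Smith"] },
];

// Exact name-string match against the logged-in user's name.
function youMightBeIn(fullName, records) {
  const name = fullName.toLowerCase();
  return records.filter((r) =>
    (r.performers || []).some((p) => p.toLowerCase() === name)
  );
}
```

An exact match keeps the section conservative: it never shows a student someone else's recital, at the cost of missing events where the program tagging is incomplete, which is why fuller tagging raises the hit rate.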

Search 3.0, The Return of the Search.

While name searching was popular with students, it was not with faculty. First, their login rate is much lower than students', even after accounting for the difference in faculty versus student headcount. Age and technical skill are also considerations, and their platform is mostly desktop, though they connect from off campus just as much as students do.

Ensemble directors always access their own ensemble's recordings, so showing them their ensemble before anything else was easy. I also surface the current metadata for them to review.

Instrumental faculty tend to look for their students, or for students who play the same instrument or are in the same area. Some faculty need to listen to recordings for grading, some for teaching review. Again, relevant results are displayed right after login, further decreasing searching and surfing.
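The faculty personalization described above can be sketched as a role-based dispatch at login. The role names and record fields here are hypothetical; the post doesn't document the actual data model:

```javascript
// Hypothetical roles and fields, for illustration only.
function facultyLanding(user, records) {
  switch (user.role) {
    case "director":
      // Ensemble directors see their own ensemble's recordings first.
      return records.filter((r) => r.ensemble === user.ensemble);
    case "instrumental":
      // Studio faculty see their instrument/area, plus their own students.
      return records.filter(
        (r) =>
          r.instrument === user.instrument ||
          (user.students || []).some((s) => (r.performers || []).includes(s))
      );
    default:
      // No personalization known: fall back to the full listing.
      return records;
  }
}
```

The point of the dispatch is that each audience lands on the view it would have searched for anyway, so the search box becomes a fallback rather than the first step.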

More interesting is that most faculty I talk to about this are not aware of the lengths we have gone to in customizing their experience. Those who are aware comment on how different it is from other university systems. Even for those who don't notice, the data shows these personalizations are saving them time and energy in completing the task.

I manage a recording department for a large music college: audio, video gear, web streaming, and web development.