Adventures in simplification
Libraries have certainly embraced the convenience factor. What is discovery search but Google on steroids? Yet I hear a lot of librarians having second thoughts or even serious doubts about having gone this route. Simplicity on the front end does not mean you will have usable data at the other end.
Despite our efforts, information seekers continue to prefer Google to library databases. So maybe we should move our information literacy efforts into our users’ arena by teaching them how to optimize Google and Google Scholar, find dissertation archives, and create good search strategies. I already do that in my graduate credit courses but with a crafty twist: First, I have students use the library’s databases to do a search assignment. Then they do the same assignment with Google Scholar. Their reaction? Almost universally, it is that Google Scholar sucks. It lacks the mechanisms to do precision searches and the results are confusing. Then you need to take the further step of checking holdings through the library links option in order to get full text for most items.
So, if we can find a way past the convenience factor to actually convince users that they are better off not putting all their eggs in the Google basket, maybe there’s some hope in getting users to the best tools. But it’s a tall order.
Measuring value
A colleague and I have been working on this problem for some time now. We’ve seized upon an interesting partial solution in 75-minute sessions with our University 101 students. Here is the scenario: Students are supposed to write a paper on climate change in the Arctic, finding at least five peer-reviewed articles that are fully relevant and no more than 5 years old.
My colleague starts with Google, searching on "climate change" arctic. Happily, Google recognizes this as a potentially academic topic and provides three “scholarly articles” at the top of the results page. The first is a hazardous materials risk assessment from the Federal Highway Administration, dated 1999 (so much for Google’s precise search ability). The second is a scholarly article on climate change in the Arctic, but it’s too old (2005). The third is from Nature, so it passes the academic test, but it dates from 2001. Three strikes, so we have a look at the rest of the Google results.
Moving down the results list, we find a site sponsored by Pew Charitable Trusts, and it’s definitely on Arctic climate change (oceansnorth.org/climate-change). There is a stated author but no indication of any sort of peer review. After the inevitable Wikipedia entry, there are a number of sites from various organizations concerned with the issue. Many of them look fairly scholarly, but none are clearly peer reviewed. A variety of blogs and reports follow, but nothing peer reviewed. All these efforts have taken time, and the convenience factor is screaming that it wants a better way to solve the problem.
A Google Scholar search retrieves articles closer to the assignment parameters, but they cover multiple aspects of the topic with no coherent focus. And there is still the problem of finding full text through library links. Scholar results are messy; half the time, students don’t even know what they are looking at: book, article, or something else.
At that point in our teaching session, I step in, rather smugly, and use EBSCO’s Academic Search Premier to do an initial search, enlisting the subject headings and peer-reviewed articles features on the results page to get five peer-reviewed articles dated in the time frame required. The results are not only relevant but also enable me to narrow the topic to the plight of polar bears in the light of Arctic climate change. Total elapsed time: less than 10 minutes.
A student came up to me after one of our sessions and told me he was giving up Google for academic research. Maybe his response was a bit extreme, but he’d gotten the message that the convenient Google may not be as convenient as it seems to be.
The big issues
We are left with two questions. First question: Are we academics setting criteria that are too strict for today’s information environment? In my Google search on climate change in the Arctic, I found any number of pretty good resources—government reports and other studies by known environmentalists, such as David Suzuki. Is pretty good not good enough? My answer would be that such reports and studies could well offer good “things,” even academic “things.” There is value there. But there is also an academic tradition of putting academic work to the test. Peer review may be under attack on several fronts, but it survives as our best way to test academic findings.
The convenience factor that lets us simply post our academic work on the internet, or release a fine-looking report from our organization online, suggests that peer review (an inconvenient process) may not be necessary. But without adequate critique from external parties, the sources students use could devolve into an environment of opinion and untested assertions.
Second question: If Google-like tools actually make the task of finding what we need more complicated than the same quest in a sophisticated academic database, is there some way to alert users that what appears to be convenience may actually be an illusion? A search in Google, Google Scholar, or a discovery tool seems so right, so easy, that users may not notice that wading through their mess of results takes longer than the same quest would in an academic database such as Academic Search Premier.