FEATURE
Bing vs. Google: What Patrons Should Know About Search
by Jennifer Bruneau
It’s information overload. The amount of material available online is almost incalculable: scientific data, advertisements, social media posts, government documents, and personal websites. The internet has the power to give users access to quality information from scholarly sources. It also has the power to lead the user down a rabbit hole of misinformation. The art of internet searching is not simply to Google a term and click on the first website that appears, but to hone a search so that the user discovers high-quality information from good sources. Librarians are in a perfect position to guide patrons through the mountains of digital bits to the relevant and authoritative sources. Let’s examine Google and Bing, the two largest search engines in the Western world. The searches described in this article were conducted in March 2019; they will produce more results with each passing day, as new websites are created and the indexes of Google and Bing continue to grow.
Test Cases: Top Results
A Google search for “when did the American Revolution end” nets 248 million results. That’s impressive, but nobody is going to read through 248 million results; most users will look only at the results on the first page or two. As Mary Ellen Bates (2019) says, “users believe that Google search results automatically put the most reliable results first.” Search engines such as Google and Bing play into this perception by making the top results even more prominent. A “featured snippet” is a highlighted information box appearing at the top of search results, such as the one produced by this search on the American Revolution (Figure 1). The snippet does accurately indicate the end year of the American Revolution (1783), but librarians can note for the patron that it carries no citation. The following cases simulate reference requests in public and academic library settings. OK, time to get into reference mode.
CASE 1: A patron wants to read John Grisham’s novels in the order in which they were published. The question is specific and straightforward. A Google search for “what was John Grisham’s first novel” produces a featured snippet that displays A Time to Kill in bold letters, with the information pulled from jgrisham.com (Figure 2). An identical Bing search produces a panel of Grisham novels as the top result (Figure 3). The panel can be arranged to list the books from oldest to most recent, a very handy interactive tool for readers’ advisory. Beneath the panel, a snippet pulled from thoughtco.com also identifies A Time to Kill as Grisham’s first novel.
CASE 2: This is the kind of question for which a simple internet search might be the best reference tool. A patron wants a list of comedies starring John Candy. A Google search for “comedies starring John Candy” returns an interactive snippet (Figure 4). The snippet includes several movie titles, each with a corresponding picture and production date, arranged horizontally above the search results; we can slide the panel left and right to see more titles. The first results are Planes, Trains and Automobiles (1987), Uncle Buck (1989), and The Great Outdoors (1988). An interesting feature of this Google snippet is that the user can filter the John Candy filmography by year and genre. For instance, selecting Drama from a drop-down menu shows that Candy also starred in JFK (1991). The patron will likely appreciate how easy it is to see the different titles without navigating away from the search page. If this were a more serious reference question, the librarian would note that the snippet does not carry a citation.
An identical Bing search produces a similar panel illustrating Candy’s filmography (Figure 5). The listing includes fewer titles than the Google snippet, and there is no interactive feature to switch genres or years. However, Bing includes ratings for each film, which appear to be pulled from imdb.com, but no citation is given. These ratings let the user know that The Blues Brothers (1980) holds up better than Splash (1984).
CASE 3: A student needs information about the Egyptian pyramids. She uses a library computer to conduct a search. The Bing search for “when were the pyramids built” presents an important information-literacy teaching moment. The featured snippet offers ambiguous information, as it skims information from the Wikipedia page and highlights November 2008—definitely not the year the pyramids were built (Figure 6). This example shows how a search engine can pull information from a webpage, post it at the top of the search results, and potentially mislead a user. While the top Bing result is problematic, the student will find National Geographic further down on the first page of results. The librarian can explain that National Geographic is a better source for a school paper than Wikipedia, since the user can cite a specific author in National Geographic. A college-level paper may need a scholarly source, in which case the librarian can lead the student to a scholarly database.
A Google search about the pyramids produces a better first result, displaying 2630 B.C. in bold in the featured snippet (Figure 7). This result is drawn from history.com, the online presence of the History Channel. The History Channel might suffice for a K–12 paper, but this popular source may not cut it at the college level. This is a good case in which a librarian can explain the differences among Wikipedia, a quality popular source such as National Geographic, and scholarly resources.
CASE 4: A patron is searching for a power of attorney form. Here we see the limits of internet search. The librarian should steer the patron away from Googling, say, “power of attorney form,” or “is my migraine a tumor,” or “the best companies to invest in.” It really is the responsibility of the librarian to let the patron know that certain cases, such as these, are beyond the scope of an internet search. The patron should talk to an attorney about legal documents, schedule an appointment with a doctor about that awful headache, or discuss investing options with a trusted financial advisor. Many maintenance and repair issues may also require professional consultation. For instance, the patron who has Googled “fixing a sparking electrical outlet” should be furnished with phone numbers for local electricians.
Tricks of the Trade
The truth about internet search is this: We don’t need 5 million answers—we just need a single correct one. Fortunately, librarians can demonstrate simple search strategies for patrons that apply to both Google and Bing. These simple tools can narrow the limitless heaps of information retrieved from a search.
A Google search for “sharks” retrieves 617 million results. At the top of the Google search results are the statistical tables for the San Jose Sharks NHL franchise, as well as a link to the franchise webpage. Bing provides 28.5 million results. Similar to Google, Bing assumes we are sports fans and offers an interactive panel listing the latest hockey statistics. However, Bing also provides an adjacent panel at the same level that discusses sharks as wildlife, with information drawn from a Wikipedia page. While Wikipedia is not a scholarly scientific resource, Bing provides a wider spectrum of subject interpretation in this case.
How can we limit the search to sharks the animals, not the hockey team? There is an easy way to limit Google and Bing searches to more authoritative sources: results can be narrowed to college and university webpages by simply adding site:.edu after the search term. While a Google search for sharks results in more than 600 million hits, a search using sharks site:.edu reduces the number to 860,000. All of the websites from this search are domains that end in .edu, meaning that many of these webpages are authored by academic departments or university faculty. While the Google search “sharks” retrieves a sports team as the top result, the search using sharks site:.edu retrieves the Smithsonian National Museum of Natural History as the top result. This trick also works in Bing, which likewise retrieves the Smithsonian as the top result.
Librarians can show patrons another easy trick, one that limits search results to government websites: simply add site:.gov to the search string. For instance, a Bing search using sharks site:.gov results in more than a million government webpages. The top result is the State of Hawaii Division of Aquatic Resources. The identical search in Google retrieves the National Oceanic and Atmospheric Administration. With a few keystrokes, Bing and Google become much more authoritative indexes.
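For librarians who teach this trick often, the operator is easy to script. Below is a minimal Python sketch that builds a site:-limited search URL; the only assumption is the familiar q query parameter that both engines’ search pages accept.

```python
from urllib.parse import urlencode

# A minimal sketch: build a search URL with an optional site: filter.
# The q parameter is the conventional query parameter for both engines'
# search pages; everything else here is illustrative.
BASES = {
    "google": "https://www.google.com/search",
    "bing": "https://www.bing.com/search",
}

def build_search_url(engine: str, terms: str, domain: str = "") -> str:
    """Return a search URL, optionally restricted to a top-level domain."""
    query = f"{terms} site:{domain}" if domain else terms
    return BASES[engine] + "?" + urlencode({"q": query})

# Limit a sharks search to academic and government sources.
print(build_search_url("google", "sharks", ".edu"))
# https://www.google.com/search?q=sharks+site%3A.edu
print(build_search_url("bing", "sharks", ".gov"))
# https://www.bing.com/search?q=sharks+site%3A.gov
```

The same helper works for any domain filter a librarian might demonstrate, such as site:.org for nonprofit organizations.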
Scholarly Sources
Let’s get serious. When a patron needs technical or peer-reviewed materials, librarians can turn to Google’s and Bing’s scholarly sisters: Google Scholar and Microsoft Academic. For instance, suppose a college student needs to write a 10-page paper discussing Noam Chomsky’s theory of universal grammar. Rather than surfacing popular sites about universal grammar, Google Scholar and Microsoft Academic allow the user to locate the scholar’s actual works.
The user can search Google Scholar (scholar.google.com) by typing a simple string into the search box, but Google Scholar also has an advanced search: click on the menu button, and the Advanced Search option appears (Figure 8). It allows the user to hone the search to a specific author, in this case Chomsky (Figure 9). A panel to the left of the search results gives users the option to narrow the search by year or to list the results by relevance or date. Results with available full-text links are shown to the right of the citation. Google Scholar provides helpful information on several levels, including citation assistance, the number of times an article has been cited by other scholarly works, and links to WorldCat for locating monographs in libraries.
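This author restriction can also be typed straight into the query box. Here is a minimal Python sketch that assembles such a query as a URL; it assumes Scholar’s author: operator and q query parameter behave as on the public search page, and the author string “N Chomsky” is purely illustrative.

```python
from urllib.parse import urlencode

# A sketch of an author-restricted Google Scholar query. The author: operator
# and the q parameter mirror what the Advanced Search form submits; the author
# spelling used here is illustrative.
def scholar_author_search(author: str, terms: str) -> str:
    query = f'author:"{author}" {terms}'
    return "https://scholar.google.com/scholar?" + urlencode({"q": query})

print(scholar_author_search("N Chomsky", "universal grammar"))
# https://scholar.google.com/scholar?q=author%3A%22N+Chomsky%22+universal+grammar
```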
Let’s take a look at Microsoft Academic (academic.microsoft.com) and run the string “Chomsky universal grammar” in the search box. The results can be narrowed by the limiters to the left of the results (Figure 10), which cover date range, author, author affiliation, field, journal, and publication type. This search retrieved 60 citations, fewer than the 301 results found in Google Scholar. However, if we click on Chomsky’s name in one of the results, we receive a listing of his papers indexed in Microsoft Academic. Currently, this database lists 575 papers by this author, and these articles have netted 225,320 citations. Also, when a user searches Microsoft Academic by author, he or she can sort the author’s publications by Relevance, Newest First, Oldest First, or Most Citations. When we sort Chomsky’s articles by Most Citations, we quickly identify the author’s most influential papers, an amazingly helpful tool for scholarly research (Figure 11). Microsoft Academic details that Chomsky’s 1965 book Aspects of the Theory of Syntax has been cited an incredible 35,683 times. Similar to Google Scholar, we can retrieve available full-text papers in Microsoft Academic, as well as access citation tools to build bibliographies.
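The Most Citations sort is easy to picture in miniature. The sketch below ranks a short list of papers by citation count; only the Aspects of the Theory of Syntax figure comes from the search above, and the other counts are placeholders for illustration.

```python
# A miniature version of the "Most Citations" sort. Only the Aspects of the
# Theory of Syntax count (35,683) comes from the search described above; the
# other counts are illustrative placeholders.
papers = [
    ("Aspects of the Theory of Syntax", 1965, 35_683),
    ("Lectures on Government and Binding", 1981, 20_000),  # placeholder count
    ("The Minimalist Program", 1995, 18_000),              # placeholder count
]

# Sort descending by citation count, the third field of each record.
for title, year, cites in sorted(papers, key=lambda p: p[2], reverse=True):
    print(f"{cites:>7,}  {title} ({year})")
```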
Search On!
Bing and Google provide the user with powerful search tools, and both engines perform well in many reference areas. An informed user can retrieve great information quickly and efficiently on either platform.