Search Engine Design Perspectives

CIRI Blog

Published: October 22, 2021 by Dr. Virginia Tucker

Two years ago, I had the privilege of participating in the IEEE Award Ceremony in Palo Alto, California, where the DIALOG search system was recognized with an IEEE Milestone Award (2019a), joining an elite group of inventions ranging from ARPANET in 1969 to Marconi’s wireless telegraphy experiments in 1895 (2019b). The award citation describes how DIALOG, in 1966 (Summit, 1967), was the first interactive, online search system to allow “iterative refinement of results” across multiple databases, and that it “preceded major Internet search tools by more than two decades” (IEEE, 2019a).

I worked at the company as a product architect, client training manager, and instructional designer from the early 1980s to the 2000s, before returning to academia at the iSchool. At the time of the ceremony, I was working on an article about the evolution of search design, and the event brought into stark relief how the extraordinary achievements in search capabilities in recent years have left behind some of the strengths of early search engines such as DIALOG. Most notable is the ability to search iteratively, refining initial results based on what displays, to have a true conversation, a dialogue (the origin of the system’s name), with the retrieval system. This kind of exploration refines not only the results but also the question to be answered, in support of ‘berrypicking’ in information seeking (Bates, 1989). In today’s web search engines, for example, fine-tuning an initial result set through the inclusion of additional search terms is not supported, nor is faceted iteration or restriction of terms through different proximity operators or by appending field limiters, all standard fare since the 1970s on aggregator search systems. Web search engines do have features to narrow results by such options as filetype, date range, and page segments (e.g., title or URL), if the user knows the correct syntax, and this is not to minimize in any way their significant smarts in relevance ranking and in drilling into page and file content.

The article was published this year (Tucker & Edwards, 2021), reporting results from a study that expanded an existing theoretical model of how university students experience web searching, framed in the larger context of how search engine design has evolved. This evolution is discussed from multiple perspectives: the market-driven, rather than user-driven, design of search engines; anticipatory search and algorithmic biases; and the trend away from learning-to-search modes.
As Bates (2019) expressed it: “The goal is no longer to make search results more valuable, the search industry is about making as much money as possible from people’s data” (‘How search really works’).
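To make the contrast concrete, here is a rough illustration of the two styles of syntax described above. The DIALOG commands are reconstructed from memory and are indicative rather than exact, and the set sizes shown are invented for the example; the web operators are ones major engines have documented:

```text
DIALOG-style iterative search (illustrative):
?  SELECT solar(W)energy/TI        proximity operator (W); /TI limits to the title field
   S1   1542  SOLAR(W)ENERGY/TI    the system saves the result as set S1
?  SELECT S1 AND PY=1985:1989      refine the saved set by publication-year range
   S2    310  S1 AND PY=1985:1989  the refined set becomes S2, available for further steps

Web search engine operators (illustrative):
   filetype:pdf intitle:"solar energy" site:nasa.gov
```

The key difference is that the DIALOG dialogue produces named intermediate sets (S1, S2) the searcher can inspect and recombine, while the web query must be re-issued in full each time.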

Another experience during the IEEE event was reflecting on my responsibilities at the iSchool and on how I work with the mental models of students in the information retrieval and search classes I teach. Students whose search experiences have largely been inside the single search box on Google (or similar) find it difficult to build from that knowledge toward an understanding of data structures and user-selected search options, wrestling to detach from their dependence on anticipatory search. They also typically begin not knowing that the generation of their search results and anticipatory displays involves filter bubbles, sponsorship, algorithmic biases, and the like, nor how to grasp what they might have missed. And, sadly, a valuable and well-designed learning-to-search interface on Google that supported the user in developing more advanced skills was removed (Tucker & Edwards, 2021, pp. 5-6).

A recent event hosted at San José State brought increased attention to the racial and gender biases built into the algorithms of web search engines (Tufekci, 2015; Wachter-Boettcher, 2018). In April this year, Shalini Kantayya, director of the documentary film Coded Bias (2020), spoke to the campus community via Zoom on the fight for algorithmic justice, featuring the research of Dr. Joy Buolamwini (2016) and others; the ensuing discussion pointed out bias within educational technologies as well. A few weeks later, Simmons University’s School of Library and Information Science hosted a virtual conversation with Dr. Safiya Umoja Noble, author of Algorithms of Oppression (2018), facilitated by Dr. Marie desJardins. Such events are keeping the conversation going about search engine design, where it’s been, and what its trajectory needs to be.

References

Bates, M. J. (1989). The design of browsing and berrypicking techniques for the online search interface. Online Review, 13(5), 407-424.

Bates, M. J. (2019). How search really works [Podcast]. UX Radio. https://soundcloud.com/ux-radio/how-search-really-works-with-dr-marcia-bates

Buolamwini, J. (2016). How I’m fighting bias in algorithms. TED Talk. https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms

Institute of Electrical and Electronics Engineers (IEEE). (2019a). DIALOG Online Search System Milestone Award. https://ethw.org/Milestones:DIALOG_Online_Search_System,_1966

Institute of Electrical and Electronics Engineers (IEEE). (2019b). Engineering and Technology: List of IEEE Milestones. https://ethw.org/Milestones:List_of_IEEE_Milestones

Kantayya, S. (Director). (2020). Coded Bias [Film]. Seventh Empire Media.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.

Simmons University School of Library and Information Science. (2021, May). Conversation with Dr. Safiya Noble. YouTube. https://youtu.be/0iZJmTtuPlg

Summit, R. K. (1967). DIALOG: An operational on-line reference retrieval system. Proceedings of the 22nd ACM National Conference, pp. 51-56. https://doi.org/10.1145/800196.805974  

Tucker, V. M., & Edwards, S. L. (2021). Search evolution for ease and speed: A call to action for what’s been lost. Journal of Librarianship and Information Science, 53(4), 668-685. https://doi.org/10.1177/0961000620980827

Tufekci, Z. (2015). Algorithmic harms beyond Facebook and Google: Emergent challenges of computational agency. Colorado Technology Law Journal, 13(2), 203-217.

Wachter-Boettcher, S. (2018). Technically wrong: Sexist apps, biased algorithms, and other threats of toxic tech. W.W. Norton.
