Collaborative Browsing Experiment

Team: Observer-Kathi Blocher, Searcher-Hiroaki Takatsuki, Assistant-Marko Turpeinen


Observer's Observations

Scenarios: Paris | Ralph Nader | Caffeine | LEGO | Summary


Temperature and Humidity in Paris

Hiroaki, the searcher, immediately asked Marko, the assistant, which type of search engine would be best for finding weather information for Paris. Marko met the challenge and continued to be helpful in a proactive fashion throughout the search. He recommended Excite because of the detail that particular search engine provides and suggested Hiroaki type in 'Paris Weather'. Hiroaki asked whether the string needed quotes, implying that he wanted Marko to be more instructional. Marko continued to lead the search by explaining why the recommended choices were suggested. When Hiroaki appeared frustrated by a slow response or an uninformative page, Marko would recommend the Back button or the Go menu in the browser to display other options listed in the initial search. After twenty minutes of searching for humidity, Marko suggested trying a different search question.

Number of States Ralph Nader appeared on the US Presidential ballot

Before Hiroaki began the next search, he asked whether he should use the same information retrieval tool, Excite. Marko recommended Alta Vista and suggested he type 'AltaVista.com' in the Netscape URL Location box, because Net Search no longer includes Alta Vista. Hiroaki appeared to be considering whether quotes were necessary, and Marko recommended he enter "Ralph Nader", with quotes, as the search string. After receiving over 4,000 possibilities, Marko suggested a more detailed search using Boolean connectives, "Ralph Nader" AND "presidential" OR "states", and an unexpected list appeared with sources unrelated to Ralph Nader. Marko reached toward the keyboard reflexively, as if to make a quick modification. When Hiroaki asked what had just happened, Marko explained that the OR operator returns every URL matching "states" OR "Ralph Nader AND presidential", and recommended typing "Ralph Nader" in the Results Ranking Criteria field of the advanced search option. Scrolling through the selections, Marko noticed from the number of name occurrences in the summary listings that Ralph Nader is part of the Green Party, and mentioned how a super-intelligent agent could recommend a search using Green Party too. Instead, Marko suggested the Resource Page, and in a short time he pointed to '23 US states' on the screen.
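The unexpected list reflects operator precedence: AND binds more tightly than OR, so the typed query was read as ("Ralph Nader" AND "presidential") OR "states". A minimal Python sketch of that set logic over a few invented documents (the documents and helper function below are illustrative only; a real engine such as Alta Vista ranks pages rather than simply filtering them):

    # Toy illustration of Boolean operator precedence in a search query.
    # The documents are invented; a real engine ranks pages rather than
    # simply filtering them, but the set logic is the same.

    docs = {
        1: "ralph nader presidential campaign coverage",
        2: "list of united states and their capitals",
        3: "ralph nader consumer advocacy group",
        4: "presidential election results by states",
    }

    def select(predicate):
        """Return the ids of all documents whose text satisfies the predicate."""
        return {doc_id for doc_id, text in docs.items() if predicate(text)}

    # AND binds tighter than OR, so the typed query is read as
    # ("ralph nader" AND "presidential") OR "states":
    as_typed = select(lambda t: ("ralph nader" in t and "presidential" in t)
                                or "states" in t)

    # What was intended: "ralph nader" AND ("presidential" OR "states"):
    as_intended = select(lambda t: "ralph nader" in t
                                   and ("presidential" in t or "states" in t))

    print(as_typed)      # {1, 2, 4} -- includes pages that never mention Nader
    print(as_intended)   # {1}       -- only pages that mention Ralph Nader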

Chemical name for Caffeine

One browser, one selection, one scroll, and Marko pointed to the chemical name for caffeine.

Year when LEGO toys were introduced to the United States

Marko recommended going directly to the LEGO Web page without using Net Search. He told Hiroaki the likely URL, LEGO.com, and then recommended different button options on that page which might lead to the answer. After ten minutes of browsing without finding the specific date they were hunting for, Marko suggested trying a different approach. Three different dates appeared for LEGO toys in the United States, and the date when LEGO toys were introduced to the US proved more difficult to locate. Using Alta Vista, a search that turned up someone's personal page of Frequently Asked Questions about LEGO toys led to the year 1961.

Experiment Summary

Marko provided expert assistance to Hiroaki, the searcher. An instructional relationship was established after only a few questions, and the remainder of the assistance was proactive and adaptive, with the searcher being led to the solution of the problem. Hiroaki's gestures and facial expressions communicated signals to Marko for further assistance. Marko's expert assistance combined intelligence, intuitiveness, and expertise; an equivalently talented software agent would need sophisticated algorithms, a combined expert system, and affective visual devices to simulate such human-like intelligence.

The searcher and assistant interacted smoothly with few misunderstandings. Any time the searcher seemed confused, the assistant adapted quickly, offering a clarification or an alternative solution. Marko exhibited patience and empathy towards Hiroaki throughout the exercise, even when leading Hiroaki through a search.

Because we were sitting close to other teams, we overheard different dates for the LEGO question, and found different dates in our own search, which raises the question of validity: the accuracy of the source and its data is questionable. Trusting the assistant to recommend the correct source and find the right answer may be difficult to program into a software agent. This exercise illustrates the complex requirements on a software system if it is to provide human-like assistance. What features are reasonable to expect and achievable in software agents, today and in the near future?

Kathi Blocher




Searcher's Observations


Q. How helpful was it to have the Assistant?


Q. Were there times when the Assistant was more distracting than helpful?

Q. Was there any kind of assistance you might have liked from the Assistant that you didn't get?

Q. What could the Assistant have done if he/she knew more about your personal likes, dislikes, experience, etc.?

Q. If the Assistant were a computer rather than a person, what could it have done or what would be impossible?

Q. If the Assistant were another person [but not in the same room], how could a computer have facilitated interaction between you?

Other Comments

In the experiment, Marko, the Assistant, was not only a navigator of the WWW, but also an instructor in using search engines. To me, being a good instructor is more valuable than being a powerful navigator, because the results a search engine returns depend on how the keywords are chosen. After the experiment, I personally tried the same searches. For example, when I used the keywords "lego" AND ("in" NEAR "19") on Alta Vista for the LEGO question, I got the right web page as the first selection. In that pattern, "in" NEAR "19" means "in 19??". I think there are many patterns like this, and if an agent could propose such patterns, it would be more powerful.

Moreover, search engines often return more than 100 web sites. If I checked half of them from a computer connected to the net over a modem, it could easily take more than half an hour. Even with a powerful navigator assisting me, this downloading time would not be much shorter, and the heavy traffic would be a problem. Proposing appropriate patterns would also help with this problem.
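For example, a simple Python sketch of an agent proposing such patterns might look like the following. The function name and the pattern templates are only my illustration, and NEAR is an Alta Vista advanced-search operator, so other engines may not accept these strings.

    # Sketch of the pattern-proposing idea; the function name and the
    # pattern templates are hypothetical.

    def propose_year_queries(topic):
        """Suggest query strings for questions that ask in what year something happened."""
        return [
            # years in this century usually appear on a page as "in 19xx"
            f'"{topic}" AND ("in" NEAR "19")',
            # other common phrasings of a date answer
            f'"{topic}" AND ("introduced in" OR "since 19")',
            f'"{topic}" AND "history"',
        ]

    for query in propose_year_queries("lego"):
        print(query)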

Hiroaki Takatsuki



Assistant's Observations

Q: How hard was it to provide assistance?

It was hard. There are many levels of interaction, communication and knowledge involved in even a simple search task. I had to translate my knowledge about search engines into useful assistance on the fly, while adapting to the level of sophistication of the searcher.

Language added to the complexity of the task, since neither the searcher, nor I, nor the computer has English as a mother tongue.

Q: Could the Searcher have provided any kind of input or advice that would have helped to give assistance?

That actually happened a lot during the exercise. When I wasn't clear or specific enough, the searcher asked me to repeat or confirm something I had said.

Q: Could you have done a better job giving assistance if you had had more time to work on the question yourself?

Definitely. As it was, we wasted a lot of time on things that I could have tried on my own computer while he was doing the searching.

Q: Would you have an easier time giving assistance to the same person next time as a result of your experience?

Maybe, since now I know his basic level of expertise in using WWW search tools. However, I don't know anything about his personal preferences.

Q: Did you learn anything from assisting this person that you could apply if you were assisting someone else?

During the process I discovered some new things about the search tools myself. Also, a big problem for me was formulating my knowledge into understandable English, which would be easier next time.

Q: How much of your assistance could have been provided by a computer agent [assuming current technology]?

To a certain level of sophistication, it could have been done by a computer. If the agent (or a search engine) could do natural language processing on the search task, it could try to come up with good suggestions.
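As a very rough Python sketch of what I mean, an agent could drop common stop words from the question and quote runs of capitalized words so they are searched as phrases. The stop-word list and the sample question below are only illustrative; real natural language processing would go much further.

    # Rough sketch of a query suggestion: drop common stop words and quote
    # runs of capitalized words so they are searched as phrases.

    STOP_WORDS = {"on", "how", "many", "did", "the", "a", "of", "in", "to",
                  "what", "is", "was", "were", "when", "for"}

    def suggest_query(question):
        words = question.rstrip("?").split()
        keywords, phrase = [], []
        for word in words:
            if word[0].isupper() and word.lower() not in STOP_WORDS:
                phrase.append(word)               # collect a capitalized name
                continue
            if phrase:                            # a name just ended
                keywords.append('"' + " ".join(phrase) + '"')
                phrase = []
            if word.lower() not in STOP_WORDS:
                keywords.append(word.lower())
        if phrase:
            keywords.append('"' + " ".join(phrase) + '"')
        return " AND ".join(keywords)

    print(suggest_query("On how many states did Ralph Nader appear on the presidential ballot?"))
    # states AND "Ralph Nader" AND appear AND presidential AND ballot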

Q: If the Searcher were another person [but not in the same room], how could a computer have facilitated interaction between you?

By offering a shared workspace to show the results of browsing and to show examples ("see what I found!"), or a chat space.

Other comments:

I think the most interesting part of the experiment was when we searched for the LEGO question. We came up with one answer, 1987, from the LEGO history page at www.lego.com. Then Kathi, our Observer, noted that she had memories of playing with LEGO bricks dating from much earlier than that.

I suggested that maybe they were similar blocks, but not actual LEGO bricks, since I know the company has been very careful with its patents and trademarks. So I tried to find a reasonable explanation for my wrong answer! Then we noticed that 1987 was the year when LEGO and Duplo BUCKETS were introduced to the US. I think this was an interesting example of collaboration between humans in a computer-assisted search task.


Marko Turpeinen

