AI Challenges

This blog specializes in Generative AI challenges for professional use.

Get better search results with LLM generated search queries in my Web Search Assistant
Published 4.10.2024 by Christian Gintenreiter


The other day I was optimising my Web Search Assistant and wanted to improve the search results.

My Web Search Assistant is a small web tool that takes a question, searches via the Brave Search API, and answers the question based on the contents of the first pages of the search results. You could think of it as a mini Perplexity.ai with a very narrow focus. I did not want to create a remake of Perplexity.ai - I just needed a playground to try out some ideas.
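As a sketch of the search step: the endpoint and the `X-Subscription-Token` header below follow Brave's public API documentation, while the function name and the dict shape are my own illustration.

```python
def build_brave_request(query: str, api_key: str, count: int = 5) -> dict:
    """Assemble the pieces of a Brave Search API web search request.

    Endpoint and header names follow the public Brave Search API docs;
    keeping this as a plain dict makes the request easy to inspect and test.
    """
    return {
        "url": "https://api.search.brave.com/res/v1/web/search",
        "headers": {
            "X-Subscription-Token": api_key,  # your Brave Search API key
            "Accept": "application/json",
        },
        "params": {"q": query, "count": count},
    }
```

The returned dict can then be handed to an HTTP client, e.g. `requests.get(req["url"], headers=req["headers"], params=req["params"])`.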

Using the question 1:1 as search query

The challenge of using the question 1:1 as search query

What I want is for the user to be able to enter any question in natural language and get an answer.

The Brave Search API works best with simple search queries like "What is the capital of Germany?" or "Who is the author of the book '1984'?", or even simpler keyword queries like "birth date of Marc Márquez".

So I found that the same sentence might not be a good fit for both the search engine and the LLM.

The idea to find a more suitable input for the search query

I added an option called 'Use LLM for search queries' which uses an LLM to generate a search query as a first step. Using a small LLM and instructing it well is key here, so that this extra step does not prolong the response time too much.

This way the LLM extracts the keywords, and the search query is much more likely to return good results, even if the user entered a complex or unclear question.
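A minimal sketch of this first step, with the LLM passed in as a plain callable so any small, fast model can be plugged in (the prompt wording here is illustrative, not the exact prompt from this post):

```python
def question_to_search_query(question: str, llm) -> str:
    """Compress a natural-language question into a keyword-style search query.

    `llm` is any callable mapping a prompt string to a completion string,
    e.g. a thin wrapper around a small, fast model.
    """
    prompt = (
        "Rewrite the following question as a short keyword web search query.\n"
        f"Question: {question}\n"
        "Search query:"
    )
    # Strip whitespace and any quotes the model may wrap around the query.
    return llm(prompt).strip().strip("'\"")
```

With a model that follows the instruction, "When was Marc Márquez born?" comes back as something like "birth date Marc Márquez", which suits the search API much better.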

How to deal with questions that can not be answered by a single search query

The challenge of answering questions that can not be answered by a single result set

What about answering questions like 'Who is older, Marc Márquez or Valentino Rossi?' ...

You would have to be pretty lucky to find either the answer itself or both birth dates in the results of a single search query.

The idea to use the LLM to generate multiple search queries and combine the results

So I extended the option 'Use LLM for search queries' to generate multiple search queries where applicable, instead of just one. The LLM has full freedom to decide how many search queries to generate.
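Combining the result sets can be as simple as concatenating them and dropping duplicate URLs before handing the pages to the answering LLM. A sketch, assuming each result is a dict with a `url` field as the Brave API returns:

```python
def combine_results(result_sets: list[list[dict]], limit: int = 8) -> list[dict]:
    """Merge the result lists of several search queries, dropping duplicate
    URLs, so the answering LLM sees one combined context."""
    seen: set[str] = set()
    combined: list[dict] = []
    for results in result_sets:
        for result in results:
            url = result.get("url")
            if url and url not in seen:
                seen.add(url)
                combined.append(result)
    return combined[:limit]
```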

Here is the prompt I use:

You are a helpful assistant that knows how to transform a user question into one or several web search queries.

General information:
 - Today is the 04.10.2024
 - The current time is 11:20 UTC

Here are some examples of how to transform user questions into web search queries:

User question: When was Lewis Hamilton born ?
Web search queries: 'Birth date Lewis Hamilton'

User question: Is Michael J. Fox older than Bjork ?
Web search queries: 'Birth date Michael J. Fox' +++ 'Birth date Bjork'

User question: Are there more inhabitants in Paris or London ?
Web search queries: 'Number of inhabitants Paris' +++ 'Number of inhabitants London'

Extract maximum 3 web search queries from the user question.
Always exactly follow the format: 
- 'query1' for one query
- 'query1' +++ 'query2' for two queries
- 'query1' +++ 'query2' +++ 'query3' for three queries

User question: {query}
Web search queries:

I added the day and time in case the user asks a question that might relate to the current day or time.
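Given the strict output format the prompt enforces, parsing the response back into a list of queries is a simple split on the '+++' separator (a sketch; the function name is my own):

```python
def parse_search_queries(llm_output: str, max_queries: int = 3) -> list[str]:
    """Split the LLM response on '+++' and strip quotes and whitespace."""
    parts = llm_output.split("+++")
    queries = [p.strip().strip("'\"").strip() for p in parts]
    # Drop empty entries and enforce the maximum from the prompt.
    return [q for q in queries if q][:max_queries]
```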

Takeaways

By letting the LLM generate the search queries, I was able to improve the search results of my Web Search Assistant.

For slightly more complex questions I now get real answers, where before this change the answer often was 'The context does not provide enough information to answer this question.'