What is the challenge?
There are times when an AI query spans many workspaces and data objects. When large data sets are involved, the amount of data may exceed the context window limits of current LLMs (GPT-4.1/GPT-5, Claude, Gemini).
What is the impact? |
If users are not aware that they are asking the AI assistant to process more data than the LLM context window allows, they may let the AI assistant run for a long time before receiving a notification that the query could not be processed.
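One way to avoid that long wait is a pre-flight check that estimates the token cost of a query's data before anything is sent to the model. The sketch below is illustrative only: the 4-characters-per-token ratio and the 128,000-token limit are assumed placeholder values, not the real limits of any particular model or product.

```python
# Assumed, illustrative values -- not real model limits.
CONTEXT_WINDOW_TOKENS = 128_000  # hypothetical context window size
CHARS_PER_TOKEN = 4              # rough heuristic for English text

def estimate_tokens(text: str) -> int:
    """Cheap token estimate based on character count."""
    return len(text) // CHARS_PER_TOKEN

def exceeds_context_window(documents: list[str]) -> bool:
    """Warn the user up front if the combined data likely won't fit."""
    total = sum(estimate_tokens(doc) for doc in documents)
    return total > CONTEXT_WINDOW_TOKENS
```

With a check like this, the assistant could surface a warning (or suggest narrowing the query) immediately, instead of failing after a long run.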
Describe your idea |
Include in-app guidance on how best to use the AI assistant when large data sets are going to be in play.
It would also be useful if the AI assistant processed data in multiple increments, keeping each increment within the data limits of the context window.