Anthropic is adding a new artificial intelligence feature to its Claude chatbot. On Monday, the San Francisco-based company unveiled an update that lets Claude recall and reference previous user conversations, so users can pick up discussions where they left off. The change is meant to spare users from scrolling through past messages to find unfinished projects. The feature is initially available on select paid subscription tiers.
Claude Can Now Refer to Previous Chats
In a post on X (formerly Twitter), Claude’s official account announced the chat reference capability, which is currently available to Max, Team, and Enterprise subscribers. The company said the feature will soon extend to other subscription plans, including Pro, but it remains unclear whether it will reach users on the free plan.
The ability to reference past conversations is not a new concept for AI chatbots, as both OpenAI’s ChatGPT and Google’s Gemini already provide this functionality to their entire user base, including those on free tiers. This feature typically utilizes a retrieval-augmented generation (RAG) method to access older dialogues upon request or when deemed necessary by the chatbot.
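None of these companies have published implementation details, but the general retrieval-augmented pattern the article refers to can be sketched roughly as follows. This is a minimal illustration, not Anthropic's actual implementation; the `embed` function, the example conversation store, and all names here are hypothetical placeholders.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical placeholder: map text to a fixed-size vector.
    A real system would call an embedding model here."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(256)

# Past conversations stored alongside their embeddings.
history = [
    "Draft of the Q3 marketing plan we worked on last week.",
    "Debugging session for the image-resize script.",
    "Brainstorm about names for the new podcast.",
]
history_vecs = np.stack([embed(c) for c in history])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k stored conversations most similar to the query
    by cosine similarity."""
    q = embed(query)
    sims = history_vecs @ q / (
        np.linalg.norm(history_vecs, axis=1) * np.linalg.norm(q)
    )
    top = np.argsort(sims)[::-1][:k]
    return [history[i] for i in top]

# Retrieved snippets are prepended to the prompt so the model can
# "remember" earlier work without the user pasting it back in.
query = "Let's keep working on the image-resize script."
context = "\n".join(retrieve(query))
prompt = f"Relevant past conversations:\n{context}\n\nUser: {query}"
print(prompt)
```

In a setup like this, retrieved history is injected into the prompt and therefore counts toward input-token usage, which is the basis for the rate-limit concern discussed below.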
Although Anthropic added two-way voice conversations and web search to Claude in May 2025, the company has been relatively slow to bring new features to its chatbot. The chat reference system is expected to help users who work with Claude regularly, saving them from digging through long chat histories. Users will be able to ask about a previous conversation, and Claude will fetch the relevant information and build on the ongoing task.
Last month, the company introduced new weekly rate limits for paid subscribers after a small group of users exploited the previous policy, under which limits reset every five hours. At the time, Anthropic reported that some individuals had run Claude Code continuously, racking up usage costs in the tens of thousands of dollars.
The new chat reference feature has raised concerns among some users that pulling information from dense, lengthy conversations may push them toward their rate limits faster. The company has not clarified whether using the feature consumes tokens from a subscriber's allowance.