Google Search AI Unlocks Visual Data with Interactive Charts and Graphs
Google deepens AI integration, empowering Search with visual data and Gemini with proactive task automation.
June 7, 2025

Google is significantly advancing its artificial intelligence capabilities with new functionality in its Search platform's AI Mode and its dedicated Gemini application. These updates aim to give users more sophisticated tools for information retrieval and task management, signaling a deeper integration of AI into Google's core services. Notably, AI Mode in Google Search can now display charts and tables for more complex queries, while the Gemini app is gaining "scheduled actions" that allow for proactive AI assistance.
The evolution of Google Search continues with the expanded rollout and enhancement of AI Mode, which is now becoming more widely available to users in the U.S. without requiring a sign-up through Search Labs.[1][2] This mode is positioned as Google's most powerful AI search experience, designed for more advanced reasoning and multimodal inputs.[3][4] A key development within AI Mode is its new capability to generate custom charts and interactive graphs, particularly for financial and sports-related queries.[5][6][7][8] Users can ask complex questions, such as comparing the stock performance of multiple companies over a specific period, and AI Mode will analyze the data and present it visually.[5][6][7] These charts are interactive, allowing users to hover over data points for more detail and ask follow-up questions to refine the information, such as inquiring about dividend payouts for the companies displayed.[5][6] This feature, currently being tested in Labs, leverages real-time data and Google's Knowledge Graph to provide these visualizations.[1][6] Beyond charts, AI Mode is also being bolstered by other advanced features. 
"Deep Search" is designed to provide more thorough, expert-level reports on complex topics by issuing hundreds of searches and synthesizing the information.[3][1][8] "Search Live," powered by Project Astra, enables real-time conversational search experiences, initially focusing on voice and planned to incorporate video streaming capabilities.[9][10] Personalization is another focus, with AI Mode soon to offer tailored results based on past search history and, with user consent, data from other Google apps like Gmail to assist with tasks like travel planning.[3][2] Underpinning these advancements is Google's powerful Gemini 2.5 model, which enhances the reasoning and multimodal capabilities of both AI Mode and AI Overviews.[3][1][2][4] AI Overviews, which provide AI-generated summaries at the top of search results, have already seen increased user engagement and are expanding globally.[3][11][2][12]
Parallel to the enhancements in Search, Google's standalone AI assistant, the Gemini app, is gaining a significant new feature called "scheduled actions."[13][14] This functionality allows users to automate tasks and receive proactive updates from Gemini.[14][15] Users can instruct Gemini to perform specific actions at set times or on a recurring basis.[14][16][15] Examples include receiving a daily summary of calendar events and unread emails, getting weekly blog post ideas, or obtaining updates on a favorite sports team after a game.[13][14][15][17] One-off tasks, like requesting a summary of an awards show the day after it airs, can also be scheduled.[14][17] This feature aims to transform Gemini from a purely reactive chatbot into a more proactive productivity tool.[15] The setup process is designed to be intuitive, with users able to define tasks, set timings, and manage these scheduled actions within the app's settings.[14][16][15] Scheduled actions are currently rolling out to users with Google AI Pro or Ultra subscriptions, as well as qualifying Google Workspace business and education plans.[13][14][15] Gemini can also use location information, if permitted, to provide more relevant responses for location-based scheduled actions, such as weather updates.[16] Users retain control over these actions, with options to edit, delete, pause, or resume them at any time.[16]
These new features represent a significant step in Google's broader AI strategy, which emphasizes making AI more personal, proactive, and deeply integrated into its product ecosystem.[14][18][19] The introduction of scheduled actions in the Gemini app, for instance, positions it more directly against other AI assistants like OpenAI's ChatGPT, particularly in terms of productivity and task automation.[15] The ability for Gemini to connect with other Google apps like Calendar, Notes, and Photos further enhances its personalization capabilities.[19]

The advancements in AI Mode, particularly the inclusion of data visualizations and deep research tools, aim to change how users interact with search, moving beyond simple keyword queries to more complex, conversational information discovery.[4][5][6] This ongoing development of AI in Search, including AI Overviews, has already led to an increase in search activity for queries that trigger these features in markets like the U.S. and India.[3][2]

However, the increasing prevalence of AI-generated summaries and direct answers in search results also carries significant implications for the wider digital ecosystem.[20][21] Content creators and businesses that rely on organic search traffic may see shifts, as users might find the information they need directly within the AI-generated responses without clicking through to individual websites.[20][21][7] This could necessitate a re-evaluation of SEO strategies, with a greater emphasis on becoming a cited authority within AI summaries and structuring content to be easily discoverable and understandable by AI systems.[20][21] Google has acknowledged past "edge cases" where AI Overviews provided incorrect information and states it is continuously working on improving accuracy and reliability through AI training and feedback.[2] The company also emphasizes user control over personal data and the use of information from other Google apps for personalization.[2][22]
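In practice, "structuring content to be easily discoverable and understandable by AI systems" commonly means publishing machine-readable markup such as schema.org structured data in JSON-LD form, which search crawlers and the Knowledge Graph can parse. A minimal sketch generating such markup for a hypothetical article page (all field values are placeholders):

```python
# One established way to make page content machine-readable: schema.org
# structured data serialized as JSON-LD. The values below are placeholders
# for a hypothetical article, not a recipe endorsed in the source text.
import json

article_metadata = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "datePublished": "2025-06-07",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# This string would be embedded in the page inside a
# <script type="application/ld+json"> element.
json_ld = json.dumps(article_metadata, indent=2)
print(json_ld)
```

Markup like this does not guarantee citation in an AI summary, but it gives summarization systems an unambiguous statement of what the page is and who wrote it.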
In conclusion, Google's latest updates to AI Mode in Search and the Gemini app underscore its commitment to embedding advanced AI into the daily digital experiences of its users. The ability to generate charts and tables directly in search results offers a new dimension to information access, particularly for data-intensive queries. Simultaneously, the introduction of scheduled actions in the Gemini app marks a move towards a more proactive and automated AI assistant. These developments highlight a future where AI is not just a tool for finding information but an active partner in managing tasks and navigating complex data. As these AI capabilities become more sophisticated and widespread, they are set to reshape user expectations and interactions with technology, while also prompting adjustments across the digital content and marketing landscape. The focus on integrating these AI features with user data and other Google services further points towards a more interconnected and personalized AI-driven future.
Sources
[3]
[6]
[9]
[10]
[11]
[12]
[13]
[14]
[16]
[17]
[18]
[19]
[20]
[21]
[22]