Cloud Blog: Grounding Analytical AI Agents with Looker’s Trusted Metrics

Source URL: https://cloud.google.com/blog/products/data-analytics/grounding-analytical-ai-agents-with-lookers-trusted-metrics/
Source: Cloud Blog
Title: Grounding Analytical AI Agents with Looker’s Trusted Metrics

Feedly Summary: The growth and adoption of generative AI enables new ways for users to engage with products and services. Your organization has likely spent significant time and resources establishing trusted self-service analytics capabilities – and AI is revolutionizing how users can interact with their business data. For example, instead of building a query, users can use their voice or type their question using natural language. AI is the ‘magic’ that sits between the user and the data being queried – and that’s the kind of interaction and experience product and engineering teams are tasked with bringing to their users.
Over the last decade, Bytecode has helped over 1,000 organizations build successful data stacks and deploy Looker for trusted self-service analytics. Today, a common question we hear from clients is: how can I bring AI to my analytics, and how long will it take? Clients who have invested in Google Cloud’s vertically integrated data and AI cloud (BigQuery, Looker, and Vertex AI) are excited to learn that the answer is both easier and faster than they expected.

In the age of generative AI, Looker customers have a tremendous time-to-value advantage: extending Looker’s trusted metrics into analytical agents delivers fast, reliable insights for AI-powered business intelligence.

At the heart of many generative AI use cases are large language models (LLMs). A growing number of available LLMs are pre-trained on massive amounts of data, which makes them impressive out of the box at interpreting, translating, or summarizing text and other modalities, but largely lacking when it comes to focused knowledge about your data and business. Point the ‘magic’ of AI at your data, and frustration can arrive just as quickly. Off-the-shelf LLMs aren’t (yet) trained to speak your specific language or know your unique business. Frustration arises when LLM-generated definitions differ from your actual business terms, and this becomes particularly clear in analytical use cases. For example, how does your organization define a high-value customer? It could be a combination of dollars spent over their lifetime, the recency of their last purchase, or their repeat purchasing cadence. The permutations are endless and specific to every business.
This data dilemma presents an opportunity to leverage your existing investments in Google’s Data Cloud, including Looker’s semantic modeling layer, where your business definitions are globally available to all your users and systems, including generative AI. By combining the large language models in Gemini with Looker’s semantic layer, we can quickly tune and ground your analytical agent in your ‘business truths,’ delivering a new interface for end users to access trusted insights and make decisions.

For end users, getting analytical answers can now be as easy as asking a question. We’re seeing demand skyrocket within organizations that have successfully deployed analytical agents.

One of Bytecode’s clients is OfficeSpace, a SaaS workplace management platform. OfficeSpace is integrating AI agents into its core product to make desk reservations and space management faster and easier. Central to the product are analytics on office space usage and utilization. By leveraging Gemini models and Looker, OfficeSpace can go beyond dashboards and deliver accurate, governed insights directly within its chat experience. The AI models are grounded in OfficeSpace’s data, tuned to its specific use cases, and use LookML to generate the queries, ensuring the answer is the same whether the user is chatting a question, reviewing a dashboard, or downloading a dataset.
The challenge
LLMs provide a critical piece of the technology needed to make AI-powered BI a reality. For OfficeSpace, delivering its analytics chat experience is a two-part problem:

How to translate the many different ways a person could ask about occupancy rate into a standard question. This is the domain where LLMs are strongest.

How to create a query that accurately calculates occupancy rate. This is the domain where semantic layers are strongest. 

The solution
What each technology is doing:

The LLM is responsible for:

Translating the end user’s question into a standard question

Picking the Looker measure and dimension that most closely match the question

Looker is responsible for:

Using the semantic layer to define business logic and generate the query

Delivering the resulting data summary and/or visualization

Getting Gemini and Looker to communicate end-to-end is trivial. The accuracy of the experience is unlocked through a robust semantic layer, grounding of the language model, and prompt engineering that tunes Gemini to accurately translate the question and select the right objects in LookML. Creating data agents with Gemini is very similar to creating curated self-service analytics: we focus on the end users, the questions they need to be able to answer, and then curate the experience.
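The division of labor above can be sketched in a few lines of Python. This is an illustrative sketch, not OfficeSpace's or Looker's actual implementation: the model, explore, and field names (`workplace`, `space_usage`, `space_usage.occupancy_rate`) are hypothetical, and in a real deployment the assembled payload would be sent to the Looker API's inline-query endpoint rather than simply returned.

```python
# Hypothetical sketch: the LLM maps a free-form question to LookML objects,
# and we assemble a Looker inline-query payload from its structured answer.
# All model/explore/field names below are invented for illustration.

def build_looker_query(llm_selection: dict) -> dict:
    """Turn the LLM's chosen explore and fields into a Looker query body."""
    return {
        "model": llm_selection["model"],
        "view": llm_selection["explore"],            # the Looker Explore
        "fields": llm_selection["fields"],           # dimensions + measures
        "filters": llm_selection.get("filters", {}),
        "limit": "500",
    }

# Example: "What was our occupancy rate by floor last month?"
selection = {
    "model": "workplace",
    "explore": "space_usage",
    "fields": ["space_usage.floor", "space_usage.occupancy_rate"],
    "filters": {"space_usage.date": "last month"},
}

query_body = build_looker_query(selection)
print(query_body["fields"])
```

Because the semantic layer owns the definition of `occupancy_rate`, the LLM only has to pick the right objects; the query logic itself stays governed in LookML.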
Demystifying the training process 
Training sounds complicated, but it’s really similar to a traditional analytics problem. Just like when we build a dashboard, we identify the business questions our end users need to be able to answer and then build a set of visualizations to help make decisions. 
In the training process, we take the same approach. We come up with a list of business questions and provide Gemini with the expected Looker Explore result for each. By giving Gemini common business questions and their expected results, you will see dramatic improvements in the accuracy of the generated responses.
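One way to realize this question-to-expected-result pairing is as few-shot examples injected into the model's prompt. The sketch below assumes that approach; the example questions, explore, and field names are hypothetical, and the prompt format is illustrative rather than anything Gemini requires.

```python
# Hypothetical sketch of the "training" set described above: common business
# questions paired with the expected Looker Explore result, formatted as
# few-shot examples for the LLM's prompt. Names are invented for illustration.

EXAMPLES = [
    {
        "question": "What is our occupancy rate this quarter?",
        "explore": "space_usage",
        "fields": ["space_usage.occupancy_rate"],
        "filters": {"space_usage.date": "this quarter"},
    },
    {
        "question": "Which floors are busiest on Mondays?",
        "explore": "space_usage",
        "fields": ["space_usage.floor", "space_usage.desk_bookings"],
        "filters": {"space_usage.day_of_week": "Monday"},
    },
]

def few_shot_prompt(examples: list) -> str:
    """Format question -> expected-Explore pairs as prompt text."""
    lines = []
    for ex in examples:
        lines.append(f"Q: {ex['question']}")
        lines.append(f"A: explore={ex['explore']} fields={ex['fields']} "
                     f"filters={ex['filters']}")
    return "\n".join(lines)

print(few_shot_prompt(EXAMPLES))
```

Adding or correcting a pair in this list is how the agent's accuracy improves over time, which is why the process feels so much like curating a dashboard's question set.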
Lowering barriers grows data-empowered users
The delivery of trusted AI in business intelligence (BI) is a significant development for internal and productized analytics. We are breaking down even more barriers to adoption as we simplify the interface for users to answer data questions. Other customers are leveraging AI in BI to deliver analytics to front-line workers who just need a quick answer.
We’re excited to partner with Google Data Cloud to demystify the process and deliver accurate, governed AI-powered BI. Reach out to Bytecode if you are searching for an experienced partner to help realize your analytical AI agent use cases, or visit the Google Cloud console to access Looker, Vertex AI and Gemini models to get started on your own.

AI Summary and Description: Yes

**Summary:** The text discusses the integration of generative AI, specifically Large Language Models (LLMs), into business intelligence (BI) and analytics tools, centering around the role of Google Cloud’s Looker and Gemini models. It highlights how organizations can utilize AI to facilitate data queries through natural language, transforming user interactions with analytics while addressing challenges related to data specificity and user-defined metrics.

**Detailed Description:**
The text outlines how generative AI, particularly LLMs, enhances business engagement with analytics tools by allowing users to query data using natural language rather than traditional querying methods. Here are the key components of the discussion:

– **Generative AI in Analytics:**
– Adoption of generative AI changes the dynamics of user engagement with data, allowing for more intuitive interactions.
– The focus is on enabling self-service analytics through AI, helping users transition from complex queries to straightforward conversational requests.

– **Case Study of Bytecode’s Clients:**
– Bytecode has focused on helping organizations integrate AI into their analytics environments, with over 1,000 clients, including OfficeSpace, benefiting from this transformation.
– Organizations employing Google Cloud’s vertically integrated data and AI cloud can rapidly implement AI-enhanced analytics, significantly reducing time-to-value.

– **Role of LLMs and Semantic Models:**
– While LLMs are powerful in understanding varying user queries, they often lack specific business knowledge which is crucial for accurate analytics.
– Looker’s semantic modeling layer plays a vital role in grounding the AI outputs in a user’s business context, providing tailored analytics.

– **Challenges and Solutions:**
– The primary challenges faced involve translating user questions into standard queries and accurately calculating parameters like occupancy rates.
– The AI model (LLM) handles user question translation and standardization, while Looker provides the defined business logic to generate the necessary queries.

– **Training the AI:**
– The training process aims to streamline LLM responses by providing expected results for common business questions, improving accuracy dramatically over time.

– **Broader Implications:**
– The integration of trusted AI in BI systems promotes greater adoption and accessibility of analytics for end users, empowering them to make data-driven decisions more quickly.
– The partnership with Google Data Cloud exemplifies a direct approach to simplify and enhance the analytics experience through accurate AI-powered solutions.

**Key Takeaways:**
– Generative AI facilitates a much more user-friendly interaction with data analytics tools.
– The combination of LLMs and structured semantic layers is crucial for accurate and meaningful data analysis.
– The development of AI-powered BI tools opens doors for more users, particularly frontline workers, to engage and utilize data effectively without needing extensive analytics training.

In summary, this text offers insights into how generative AI, particularly in the context of LLMs and tools like Looker, is redefining the analytics landscape and enhancing user engagement with data, addressing both existing challenges and potential solutions in business intelligence.