
The Rise of the Conversational Analyst: How LLMs are Transforming How We Interact with Data

Where could AI offer a breakthrough in this next generation of BI?

If you’re a knowledge worker who frequently makes strategic decisions using data and analytics, you know the plight of finding the right data or dashboards, running SQL queries, and hunting for patterns and trends. You turn to your friendly analysts, who are already swamped and overworked, with no room for additional analysis in their pipeline. You’re left frustrated, waiting for weeks, or you proceed without adequate insights. Now imagine an AI-powered analyst that not only sifts through loads of data and queries but also surfaces powerful trends, insights, and actionable recommendations, all while you brew your coffee. A new AI-led era in business intelligence (BI) is transforming the landscape, offering a glimpse into a future where data access and analysis are not just streamlined and democratized but also enriched with deeper insights drawn from a broader range of information sources.

Early beginnings to Cloud-based innovation

We’ve come a long way in business intelligence. The birth of on-premises giants like MicroStrategy and Business Objects in the 1980s moved us from raw data to data mining and centrally generated reporting, a revolution at the time. Yet these tools empowered only a select few, leaving the promise of data-driven decision-making unfulfilled for the broader enterprise. Then came Tableau in the 2000s, querying relational databases, online analytical processing (OLAP) cubes, and spreadsheets to generate rich data visualizations. And while these produced visually beautiful reports and dashboards, IT admins still had to manage and administer servers and scale infrastructure, with the usual update and maintenance overhead. Then came Power BI (2011) and Data Studio (2016), leveraging a serverless, browser-based IDE with a drag-and-drop canvas for data visualization, so anyone could self-serve dashboards built on data from spreadsheets, enterprise systems, CRMs, or data warehouses. No admin overhead meant analysts could spend more time actually extracting insights and less time interacting with IT.

Direct access to the database, plus lightweight data modeling, meant every team could have its own front-end, a data app for each use case. Teams with prolific dashboards made data-informed decisions, rapidly spreading beautiful data stories, but a data model in every front-end application meant there was no commonly shared glossary of business metrics. Every department had its own model, with new calculated fields and formulas layered on top of the data warehouse or data lake. Departments now have more than five definitions of ARR (annual recurring revenue), and when producing QBR (quarterly business review) reports for the C-suite, no one quite knows the single source of truth.
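This metric drift is easy to reproduce. In the sketch below (all figures, rules, and field names are invented for illustration), two departments compute “ARR” from the same subscription records using different, equally defensible rules:

```python
# Two departments compute "ARR" from the same subscription records,
# but apply different (equally defensible) calculated-field rules.
subscriptions = [
    {"customer": "A", "mrr": 100, "status": "active",  "contract": "annual"},
    {"customer": "B", "mrr": 200, "status": "active",  "contract": "monthly"},
    {"customer": "C", "mrr": 150, "status": "churned", "contract": "annual"},
]

def arr_finance(rows):
    # Finance: only active annual contracts count toward ARR.
    return sum(12 * r["mrr"] for r in rows
               if r["status"] == "active" and r["contract"] == "annual")

def arr_sales(rows):
    # Sales: annualize every active subscription, monthly or not.
    return sum(12 * r["mrr"] for r in rows if r["status"] == "active")

print(arr_finance(subscriptions))  # 1200
print(arr_sales(subscriptions))    # 3600
```

Same warehouse, same rows, two “correct” ARR numbers, and nothing in either front-end flags the disagreement.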

Meanwhile, a revolutionary product called Looker was born in 2012, at the perfect time for centralized, governed models. Enterprise administrators and central data teams love the semantic layer because one instance can serve practically the entire company: everyone shares the same metrics, with common definitions and a central source of truth. However, business and domain experts still want to run ad hoc analysis, so the guardrails felt like handcuffs, and exporting to sheets and CSV files for custom formulas proliferated even further.

Solving one problem always seems to breed another: Which dashboard is the right one to answer my questions? Which version of the report should I trust? How do we evolve the data model? We swing between the governance and consistency of a central model and the flexibility of self-serve dashboards. Then ChatGPT was announced in November 2022, and the world was taken by storm. Virtually every technology leader was talking about incorporating AI into their products. Where could AI offer a breakthrough in this next generation of BI?

The promise of AI 

What was once limited to augmented-analytics products like ThoughtSpot or Sisu, with ML-first insights and forecasts, now seems ubiquitous in products with LLMs. Major BI products, from Looker and Power BI to QuickSight and Tableau, are going beyond dashboards and reports to bring new data experiences and conversational analysis to the forefront. LLMs are not just powering insightful data stories and summaries, AI-assisted calculated formulas, and auto-completed SQL queries within dashboards; generative AI is enabling entirely new experiences. AI and LLMs might finally be the answer to dashboard overload. Once we needed to scour countless dashboards, contact their owners, and try various ways to search dashboard titles and dimensions; now all of that can be handled by the conversational analyst. From Tableau Pulse with Einstein GPT and Duet AI in Looker, to Copilot in Power BI with Microsoft Fabric and ThoughtSpot Sage, the BI industry is transforming from a dashboard-first world to an intelligent-assist-first world where insights are ubiquitous and pervasive.
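The core loop behind such a conversational analyst can be sketched in a few lines: the model turns a natural-language question into SQL against a governed schema, the query runs, and a human reviews the result. In this sketch, `call_llm` is a hypothetical stand-in for a real model API (here it returns a canned query so the example runs offline), and the schema is invented for illustration:

```python
# Sketch of a "conversational analyst" loop: natural-language question ->
# LLM-generated SQL -> execution -> answer.
import sqlite3

SCHEMA = "CREATE TABLE revenue (region TEXT, quarter TEXT, amount REAL)"

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in: a real system would send the prompt (schema +
    # question) to an LLM. Here we return a canned query for illustration.
    return "SELECT region, SUM(amount) FROM revenue GROUP BY region ORDER BY 2 DESC"

def ask(question: str, conn) -> list:
    prompt = f"Schema: {SCHEMA}\nQuestion: {question}\nSQL:"
    sql = call_llm(prompt)               # model drafts the query
    return conn.execute(sql).fetchall()  # a human should review before trusting

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.executemany("INSERT INTO revenue VALUES (?, ?, ?)",
                 [("EMEA", "Q1", 120.0), ("AMER", "Q1", 300.0), ("EMEA", "Q2", 80.0)])
print(ask("Which region earned the most?", conn))
# -> [('AMER', 300.0), ('EMEA', 200.0)]
```

Production systems wrap this loop with schema grounding, query validation, and guardrails, but the shape — question in, governed SQL out, result summarized — is the same.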

The rise of this AI-first conversational analyst demands a careful approach, one made with a human in the loop. Imagine situations where AI misinterprets trends or makes recommendations without fully understanding context, drawing flawed conclusions that affect major business decisions. For example, a historical trend of declining subscriptions for a nutrition and protein-shake manufacturer might lead to a recommendation to lower production, while failing to recognize the effects of inflation or an aging customer base on those declines, factors missing from the data that need to be addressed differently. Correlating the number of car accidents with driver age or peak traffic times may not be causal or even statistically significant, yet AI might hallucinate a recommendation to raise auto insurance premiums or change driver-age requirements. Historical patterns, statistical limitations, generative AI’s non-deterministic nature, and external factors all demand human-in-the-loop oversight and strong ethical AI principles. The future of BI is not just about technological advancement but about ensuring these technologies are deployed in a manner that is ethical, reliable, and ultimately beneficial to individuals, organizations, and society at large. Where do you think the BI industry will go next?
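That correlation pitfall is easy to demonstrate. On a small, entirely made-up sample of driver ages and accident counts, the correlation coefficient looks sizable yet fails a basic significance test, exactly the kind of pattern an unsupervised AI analyst could mistake for a finding:

```python
# A correlation can look large yet be statistically meaningless on a small
# sample. Illustrative (invented) data: driver age vs. accident counts.
import math

ages      = [18, 25, 34, 47, 62]
accidents = [3, 1, 2, 2, 1]

def pearson_r(xs, ys):
    # Pearson correlation coefficient, computed from first principles.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(ages, accidents)
n = len(ages)
# t-statistic for H0 "no correlation"; with n - 2 = 3 degrees of freedom,
# |t| must exceed ~3.18 for significance at the 5% level.
t = r * math.sqrt((n - 2) / (1 - r ** 2))
print(round(r, 2), round(t, 2))  # -0.54 -1.11 -- sizable r, far from significant
```

A human analyst would ask for more data before acting on |r| = 0.54 from five points; a system without that check might not.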

Disclaimer: All views expressed by Aqsa are their own and personal. They should not be considered as attributable to their employer. 

Aqsa Fulara
Aqsa is an accomplished product manager and AI thought leader with enterprise expertise. An author, international speaker, and judge, she has been recognised on several forums for her product management outcomes and influence. Her experience spans machine learning, recommendations, business intelligence, and data analytics, from her time across Google Ads, Cloud, and Assistant. She has a proven track record of achieving business outcomes through strategy and execution across the end-to-end product lifecycle.