
Joining the Journey to the Next Frontier in AI-Powered Analytics

Over the last few years, I've been on a fascinating personal journey — part marketer, part data geek — exploring the promise (and pitfalls) of generative AI (GenAI). I've seen its magic in content creation, idea generation, and brainstorming. But I've also seen its Achilles' heel: it will confidently make things up.

That's fine if you're drafting a blog outline. But in analytics — where precision drives decisions — hallucinations aren't just an annoyance, they're a dealbreaker.

That's why I'm excited to announce my next chapter: joining Codd AI as GTM leader and co-founder alongside CEO and founder, Ravi Punuru — the visionary and engineering mastermind behind our platform.

Why Context Has Always Been King in Analytics

My obsession with data started in the early '90s in Cape Town, when I was drowning in 4,000+ COBOL reports that could only answer yesterday's questions. Discovering Business Objects was a revelation. It introduced me to the semantic layer — a way to translate complex database structures into plain business terms, while preserving the relationships that give data meaning.

Figure 1: Business Objects Universe semantic layer data model

Over the years, this idea evolved:

The EDW and ETL

The next notable step in the evolution of the semantic layer was the advent of the enterprise data warehouse (EDW), whose structure was designed to reflect the relationships in the data relevant to the kinds of questions the business needed to ask. The other advance was the move toward MPP (massively parallel processing) architectures, which delivered far greater performance and capacity than traditional database designs.

Getting data from transactional systems into the EDW required a new technology: ETL (extract, transform, load). Vendors like Informatica and Business Objects delivered ETL tools as the primary way to extract data from transactional systems and transform it into the new structure (and context) of the EDW.
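
To make the pattern concrete, here is a minimal sketch of an ETL flow, assuming a hypothetical orders table and fact table (the names, the monthly bucketing, and the use of SQLite as a stand-in for both systems are purely illustrative):

    import sqlite3

    source = sqlite3.connect(":memory:")      # stand-in for a transactional system
    warehouse = sqlite3.connect(":memory:")   # stand-in for the EDW

    # Extract: pull raw order rows out of the transactional system.
    source.execute("CREATE TABLE orders (id INTEGER, product TEXT, amount REAL, sold_at TEXT)")
    source.execute("INSERT INTO orders VALUES (1, 'Widget', 19.99, '2024-03-15')")
    rows = source.execute("SELECT id, product, amount, sold_at FROM orders").fetchall()

    # Transform: reshape the rows into the warehouse's fact-table structure (its context).
    facts = [(order_id, product, round(amount, 2), sold_at[:7])   # bucket sales by month
             for order_id, product, amount, sold_at in rows]

    # Load: write the transformed rows into the EDW fact table.
    warehouse.execute(
        "CREATE TABLE fact_sales (order_id INTEGER, product TEXT, revenue REAL, month TEXT)")
    warehouse.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)", facts)
    warehouse.commit()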

The semantic layer remained a critical construct, though: every BI tool deployed on the EDW required the manual creation of its own semantic layer and model before it could be used against the warehouse.

Hadoop and ELT

Then came the internet, and data volumes went through the roof (or so we thought at the time). To handle them, a new technology called Hadoop came to the fore, and it became the jumping-off point (as it still is) for many new data processing engines and storage formats.

With Hadoop, data was landed more or less in raw form in storage (HDFS, in Hadoop's case), and structure and context were applied only when the data was used or queried. So instead of ETL we now had ELT (extract, load, transform): transactions from all your systems simply get extracted and dumped into the data lake without any context.
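
The contrast, in an equally made-up sketch: the raw records are landed untouched, and structure only appears in the code that reads them at query time.

    import json
    import pathlib

    lake = pathlib.Path("data_lake")   # stand-in for HDFS or object storage
    lake.mkdir(exist_ok=True)

    # Extract + Load: dump the raw transactions as-is, with no modeling and no context.
    raw_events = [{"id": 1, "product": "Widget", "amount": 19.99, "sold_at": "2024-03-15T10:02:00"}]
    (lake / "orders_2024_03_15.json").write_text(json.dumps(raw_events))

    # Transform, much later and at query time: structure and context are imposed only
    # when someone finally asks a question of the data.
    records = json.loads((lake / "orders_2024_03_15.json").read_text())
    revenue_by_month = {}
    for r in records:
        month = r["sold_at"][:7]
        revenue_by_month[month] = revenue_by_month.get(month, 0.0) + r["amount"]
    print(revenue_by_month)   # {'2024-03': 19.99}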

While we won big on cost and performance, we lost out massively because BI tools do not work without structure or context. The result was heavy manual effort by data engineers and data scientists to build custom data environments and dashboards. An additional major challenge with these disparate semantic layers and modeling approaches was inconsistency and a lack of reuse, which led to unsustainable overhead and cost.

Traditional AI

The traditional AI world does not escape this paradigm. Data scientists and data engineers are required to model the data and create the semantic layer as they extract and use it, do feature engineering, train models, and then rigidly apply the resulting scores. In fact, Spark (and Databricks, the company founded to commercialize it) was built on this very paradigm: data scientists using the engine to create their own data pipelines and data processing.
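
As a tiny, purely illustrative sketch of that hand-built workflow (hypothetical churn data, with scikit-learn standing in for whatever modeling stack is actually in use):

    from sklearn.linear_model import LogisticRegression

    # The data scientist extracts the data and models it by hand (tiny illustrative sample) ...
    raw = [
        {"tenure_months": 3,  "tickets": 5, "churned": 1},
        {"tenure_months": 24, "tickets": 1, "churned": 0},
        {"tenure_months": 12, "tickets": 2, "churned": 0},
        {"tenure_months": 2,  "tickets": 7, "churned": 1},
    ]

    # ... hand-crafts the features (the "semantic layer" lives in this code, reusable by no one) ...
    X = [[r["tenure_months"], r["tickets"], r["tickets"] / r["tenure_months"]] for r in raw]
    y = [r["churned"] for r in raw]

    # ... then trains a model and rigidly applies its scores downstream.
    model = LogisticRegression().fit(X, y)
    print(model.predict([[6, 4, 4 / 6]]))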

Context is King

The net insight for me was that the meaning of data is interpreted through relationships, and those relationships provide the context for answering the question. Whether you predefine the context in the schema (as in an EDW or data mart) or the data engineer applies it when extracting the data to create an analytical insight, context is a must-have.

As an aside, the effort of creating, maintaining, and enriching semantic layers was a major challenge for organizations using Business Objects and every other semantic layer-based tool. The person asking the question has to translate a business requirement into a set of (business) objects that can express it.

For example, picking REVENUE on its own is interesting but means nothing. If you want to know revenue by product for last quarter, you need to pick PRODUCT, REVENUE, and QTR, and build a filter on QTR for the date range. And if one of those objects did not exist, someone had to go back and create it. In a world of rapidly changing dynamics, ad hoc analysis, and fast turnaround, this was a constant burden on the data engineering team.
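
A hypothetical sketch of what that object-to-SQL translation looks like underneath (the object names come from the example above; the tables, columns, and joins are made up and not specific to any tool):

    # Each business object maps to SQL plus whatever joins it needs.
    SEMANTIC_LAYER = {
        "PRODUCT": {"sql": "p.product_name",   "join": "JOIN products p ON p.id = f.product_id"},
        "REVENUE": {"sql": "SUM(f.revenue)",   "join": None},
        "QTR":     {"sql": "f.fiscal_quarter", "join": None},
    }

    def build_query(objects, filters):
        """Translate a pick-list of business objects plus filters into SQL against the warehouse."""
        selects = [SEMANTIC_LAYER[o]["sql"] for o in objects]
        joins = [SEMANTIC_LAYER[o]["join"] for o in objects if SEMANTIC_LAYER[o]["join"]]
        where = " AND ".join(f"{SEMANTIC_LAYER[k]['sql']} = '{v}'" for k, v in filters.items())
        group_by = [s for s in selects if not s.startswith("SUM")]
        return (f"SELECT {', '.join(selects)} FROM fact_sales f {' '.join(joins)}"
                f" WHERE {where} GROUP BY {', '.join(group_by)}")

    # "Revenue by product for last quarter": pick PRODUCT, REVENUE and QTR, then filter QTR.
    print(build_query(["PRODUCT", "REVENUE", "QTR"], {"QTR": "2024-Q4"}))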

Where GenAI Breaks (and How to Fix It)

GenAI can chat with your data in theory, but without business context it will:

  • Connect unrelated dots
  • Produce inconsistent answers
  • Require constant manual cleanup by data teams

The fix? Give GenAI the same thing BI tools have relied on for decades: a rich, accurate, up-to-date semantic layer. One that encodes business rules, definitions, and relationships before it starts generating answers.

Empowered with this comprehensive semantic layer, your traditional BI tools or GenAI conversational canvas can start using the engine to answer pertinent business questions, and then iteratively build on those insights, framing them in the correct context for your business.
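
To show the grounding idea in its simplest possible form (this is not Codd AI's implementation; the metric definitions are made up, and call_llm is a placeholder for whatever model endpoint you use):

    # Put the semantic layer in front of the model before it generates anything.
    SEMANTIC_LAYER = {
        "CLAIM_COUNT": "Count of distinct claim IDs in fact_claims, net of duplicates.",
        "IBNR":        "Incurred-but-not-reported reserve, per the agreed chain-ladder methodology.",
        "LOSS_RATIO":  "SUM(paid_losses) / SUM(earned_premium), grouped by policy year.",
    }

    def answer(question, call_llm):
        # Business rules, definitions and relationships go into the prompt so the model
        # generates against agreed meaning instead of guessing (and hallucinating).
        definitions = "\n".join(f"{name}: {meaning}" for name, meaning in SEMANTIC_LAYER.items())
        prompt = (
            "Answer using ONLY the metric definitions below. "
            "If a term is not defined, say so rather than inventing one.\n\n"
            f"Definitions:\n{definitions}\n\nQuestion: {question}"
        )
        return call_llm(prompt)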

Figure 2: Codd AI conversational analytics (claims analysis), showing AI-powered analytics with business context built in

Figure 3: Codd AI knowledge query (IBNR calculation methodology), showing the methodology and calculations behind insights

Figure 4: Codd AI conversation (insights and follow-up questions), showing natural language queries with accurate, contextual responses

Enter Codd AI

At Codd AI, we're making that possible — and effortless.

We've built a platform that uses GenAI to automatically create, maintain, and enrich an interactive semantic layer. This layer is embedded in an AI agent that can be accessed from your BI tools, your own conversational interfaces, or our canvas.

  • Need three new customer support metrics? Just ask.
  • Want the same answers in Excel, Tableau, or Slack? No problem.

By automating what used to take a team of data engineers, we're cutting labor costs by up to 70% and slashing time-to-value.

And this isn't just about speed — it's about trust. With business context baked in, GenAI stops hallucinating and starts delivering accurate, consistent answers.

Bringing Data to People — Not People to Data

Too often, "data literacy" means training people to think like analysts or data engineers. I think we've had it backwards.

With intelligent, context-aware agents, we can bring the data — and the insight — directly to people in the tools they already use. GenAI, done right, becomes your personal data engineer or data scientist. No SQL, no modeling, no waiting in line for IT.

That's the future we're building at Codd AI.


📩 If you want to see how this works in your world, email me at [email protected] or schedule a session here.