By Dan Jaye, CEO

Lately, we’ve seen some big, bold moves in the data infrastructure space. Snowflake’s acquisition of Crunchy Data, Databricks’ $1 billion bet on Neon, and Salesforce’s Informatica deal all tell the same story: the AI race is shifting. It’s no longer just about who has the biggest model. As a recent Forbes article nicely puts it, it’s about who can deliver AI-ready data, resiliently and at scale.

I think these are smart moves. They show that the market is maturing, and that the industry now understands something we’ve believed at Aqfer from the start – AI is only as good as the data it runs on.

But while this new focus on infrastructure is promising, it’s not the full picture.

Fast and Scalable Isn’t the Same as Context-Aware

Modern databases are evolving quickly. They’re more scalable, more real-time, and better equipped to support AI-driven applications than ever before. But even the most performant infrastructure has a blind spot: context – especially when that context depends on knowing who the data is about.

For example, many industry architectures confuse streaming with insight. Streaming captures what just happened, but it doesn’t explain why it matters or to whom it happened.

In marketing and customer analytics, that “who” is rarely simple. The data is fragmented. IDs don’t align neatly. And the story of any customer is often split across dozens of interactions, devices, and platforms. The events themselves – a purchase, a page view, a sign-up – are immutable. But the way we interpret those events changes as we gather new evidence about identity. 

We may realize two users are actually the same person. Or that one user was mistakenly grouped into a cohort they don’t belong in. That fluidity of identity is essential to how marketers operate, and it’s something traditional infrastructure doesn’t handle well.
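To make that concrete, here’s a quick sketch in Python – my own toy illustration, not how Aqfer (or anyone) implements this in production. It pairs an immutable event log with a mutable identity layer, modeled here as a simple union-find structure; every name in it is hypothetical.

```python
# Immutable facts: each event is recorded once and never rewritten.
EVENTS = [
    {"user_id": "u1", "type": "purchase", "sku": "A100"},
    {"user_id": "u2", "type": "page_view", "page": "/pricing"},
    {"user_id": "u3", "type": "purchase", "sku": "A100"},
]

# Mutable identity: a union-find structure that regroups raw IDs
# into people as new evidence arrives (illustrative only).
parent = {}

def resolve(uid):
    """Follow merge links to the canonical ID for a raw user ID."""
    parent.setdefault(uid, uid)
    while parent[uid] != uid:
        parent[uid] = parent[parent[uid]]  # path compression
        uid = parent[uid]
    return uid

def merge(a, b):
    """Record new evidence that raw IDs a and b are the same person."""
    parent[resolve(a)] = resolve(b)

# Before any evidence: three events look like three different people.
print(sorted({resolve(e["user_id"]) for e in EVENTS}))  # ['u1', 'u2', 'u3']

# Evidence arrives (say, a shared login): u1 and u3 are one person.
merge("u1", "u3")
print(sorted({resolve(e["user_id"]) for e in EVENTS}))  # ['u2', 'u3']
```

Notice that nothing in the event log changed – only the grouping did. The facts are fixed; the interpretation is not.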

The Identity Graph Is the Lens That Brings It All Into Focus

At Aqfer, we think of the identity graph as a lens – a dynamic layer that brings meaning to raw event data. That data is immutable: a purchase was made, a page viewed, a click recorded. These facts don’t change.

But the identity behind them? That’s mutable. It evolves as new evidence emerges. What looked like three users may later resolve to one. Or one might split into several. The data stays the same – your understanding gets sharper. And that distinction matters a lot.

That lens lets you revisit the past with sharper clarity – answering questions like, “How often has this person bought this product?” or “Is this their first complaint, or their fifth?” It also helps you look ahead – powering better audience targeting, more accurate attribution, smarter personalization, and AI outputs that actually reflect the customer as they are now, not as they were six weeks ago in a fragmented data table.
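Continuing the hypothetical sketch above, the “lens” is simply the discipline of always querying immutable events through the current state of identity resolution – so yesterday’s answer sharpens as today’s evidence lands:

```python
def purchases_by_person(events, sku):
    """Count purchases of a SKU per person, not per raw ID, by
    passing every event through the current identity lens."""
    counts = {}
    for e in events:
        if e["type"] == "purchase" and e.get("sku") == sku:
            person = resolve(e["user_id"])
            counts[person] = counts.get(person, 0) + 1
    return counts

# Before the merge, this reported {'u1': 1, 'u3': 1} – two one-time
# buyers. After the merge, the same immutable events answer differently:
print(purchases_by_person(EVENTS, "A100"))  # {'u3': 2} – one repeat buyer
```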

Most systems don’t account for this. They store data beautifully, move it quickly, and expose it to models – but without a cohesive identity layer in place. That means the model is doing inference on fragments, not people; even the most advanced AI ends up working off partial, disconnected context. That’s where we come in: Aqfer makes sure the lens is in place, so the data your AI sees is complete, current, and connected.

The quality of that lens determines the quality of your insight.

Where I See the Next Wave of Innovation

The acquisitions made by Snowflake, Databricks, and Salesforce are smart bets on where the AI infrastructure stack is going. But they also create a need for capabilities that turn data into understanding.

This means helping enterprise teams move beyond real-time ingestion and toward real-time interpretation. For marketing use cases, that means AI systems that understand customer history, resolve identity with confidence, and avoid the blind spots that lead to wasted spend or flawed personalization.

As more enterprises invest in the core, they’ll also need to invest in the connective tissue that brings meaning to the data and keeps it trustworthy across every touchpoint.

From Data Plumbing to Strategic Enablement

If the last decade was about building data lakes and pipelines, the next one will be about making the data meaningful. Not just fast or fresh, but structured in a way that reflects how customers behave and how identities evolve.

That’s how we move from experimentation to true intelligence. And that’s the work I believe is still ahead of us. The foundation is finally getting the attention it deserves – now it’s time to build the rest of the system.

 
