Perspectives

Executive Digest: AI Orchestra, Human Maestro: The New Era of Data Modernization

2026/01/09 3 min read


Luis Marques | Link’s Head of Data & AI

Original Article PT: Executive Digest


Although Artificial Intelligence (AI) dominates the conversation across virtually every industry — from insurance to logistics, from banking to retail — it is data that continues to underpin what truly matters: competitive, efficient, and profitable organizations.

The problem is that many organizations remain trapped in legacy data platforms, burdened with high operational costs, inefficiencies, and an inability to respond to business demands. In my view, this is one of the greatest challenges organizations face today.

In the past, modernization initiatives took years — long, expensive projects that, by the time they reached production, were already built on technology that was no longer truly modern.

Generative AI has changed this completely, introducing tools that simply did not exist before. It is now finally possible to modernize without giving any CIO or CEO cause for alarm.

One of the most powerful examples is the assisted modernization and migration of data and code (using specialized models such as Claude) to the latest data platforms, directly addressing the technical debt of legacy systems.

At its core, this means deploying AI agents to analyze code, requirements, data, and existing documentation, while other agents interpret these outputs and generate recommendations and ideas — and yet others write data and code quality tests. It becomes an orchestrated system of agents, designed to dramatically accelerate the entire process.

This orchestration can be implemented through an “agent manager” that breaks down high-level objectives into sub-tasks and distributes them to specialized agents, collecting artifacts and metrics to determine next steps. This manager integrates with the organization’s engineering ecosystem and brings together agents for code analysis, mapping and translation, data engineering, as well as testing and observability.
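The structure described above can be sketched in a few lines. This is a minimal illustration, not a reference to any specific product: the agent names (CodeAnalysisAgent, TranslationAgent, TestingAgent) and the one-task-per-agent decomposition are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Artifact:
    """What an agent hands back to the manager: who produced it and what it is."""
    producer: str
    kind: str
    payload: str

class Agent:
    name = "generic"
    def run(self, task: str) -> Artifact:
        raise NotImplementedError

class CodeAnalysisAgent(Agent):
    name = "code-analysis"
    def run(self, task: str) -> Artifact:
        # Stub: a real agent would inspect the legacy codebase.
        return Artifact(self.name, "analysis", f"dependency map for {task}")

class TranslationAgent(Agent):
    name = "translation"
    def run(self, task: str) -> Artifact:
        return Artifact(self.name, "translated-code", f"modernized version of {task}")

class TestingAgent(Agent):
    name = "testing"
    def run(self, task: str) -> Artifact:
        return Artifact(self.name, "tests", f"quality tests for {task}")

class AgentManager:
    """Breaks a high-level objective into sub-tasks, routes each one to a
    specialist agent, and collects the artifacts that drive the next step."""
    def __init__(self, agents: list[Agent]):
        self.agents = agents

    def decompose(self, objective: str) -> list[tuple[Agent, str]]:
        # Naive decomposition: one sub-task per specialist, in pipeline order.
        return [(agent, objective) for agent in self.agents]

    def execute(self, objective: str) -> list[Artifact]:
        return [agent.run(task) for agent, task in self.decompose(objective)]

manager = AgentManager([CodeAnalysisAgent(), TranslationAgent(), TestingAgent()])
artifacts = manager.execute("legacy ETL job")
for artifact in artifacts:
    print(artifact.producer, "->", artifact.kind)
```

In practice the manager would also persist the artifacts and metrics between runs, so each iteration starts from the previous one's outputs rather than from scratch.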

Coordination also includes built-in validation mechanisms: one agent proposes a transformation, a second reviews and simplifies it, a third generates tests, and only then is the code promoted for human review. This creates “chain review cycles,” where the output of one agent becomes the input of another, reducing systematic errors and increasing confidence.
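A chain review cycle of this kind can be shown as plain functions, each standing in for one agent, with the output of one becoming the input of the next. The transformation and review rules here are toy stand-ins chosen only to make the flow concrete.

```python
def propose_transformation(legacy_sql: str) -> str:
    # Agent 1: proposes a modern equivalent (stubbed as a simple rewrite).
    return legacy_sql.replace("SELECT *", "SELECT id, name")

def review_and_simplify(proposal: str) -> str:
    # Agent 2: reviews and simplifies; here, just normalizes whitespace.
    return " ".join(proposal.split())

def generate_tests(code: str) -> list[str]:
    # Agent 3: emits quality checks for the transformed code.
    return [
        f"check: runs without error -> {code}",
        f"check: row counts match source -> {code}",
    ]

def chain_review(legacy_sql: str) -> dict:
    proposal = propose_transformation(legacy_sql)
    reviewed = review_and_simplify(proposal)
    tests = generate_tests(reviewed)
    # Only after all three agents have run is the artifact promoted
    # to a human reviewer.
    return {"code": reviewed, "tests": tests, "status": "pending-human-review"}

result = chain_review("SELECT *   FROM customers")
print(result["status"])  # pending-human-review
```

The point of the chain is that no single agent's output reaches a human unreviewed: each stage either catches a systematic error or adds evidence that the code is sound.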

In the context of technical debt, this agent-based platform can continuously generate “health” metrics, such as the number of obsolete dependencies, architecture violations, and risks related to sensitive data exposure. With each migration iteration, these indicators are updated, enabling smarter prioritization.
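Those health indicators can be folded into a single debt score that is recomputed after each migration iteration. The weights below are purely illustrative assumptions, not a recommended scoring model.

```python
from dataclasses import dataclass

@dataclass
class HealthMetrics:
    """Technical-debt indicators collected by the agent platform."""
    obsolete_dependencies: int
    architecture_violations: int
    sensitive_data_risks: int

    def debt_score(self) -> int:
        # Weighted score; the weights (1, 3, 5) are example assumptions
        # reflecting that data-exposure risks matter most.
        return (self.obsolete_dependencies
                + 3 * self.architecture_violations
                + 5 * self.sensitive_data_risks)

baseline = HealthMetrics(obsolete_dependencies=40,
                         architecture_violations=12,
                         sensitive_data_risks=5)
after_iteration = HealthMetrics(obsolete_dependencies=28,
                                architecture_violations=9,
                                sensitive_data_risks=2)

print("before:", baseline.debt_score())         # 101
print("after:", after_iteration.debt_score())   # 65
```

A falling score between iterations confirms the migration is paying down debt; a stagnant one tells the manager to reprioritize its sub-tasks.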

When discussing AI, security and compliance can never be overlooked. Agents must be governed by strict data-access policies and data-masking rules, and every decision must be fully auditable. This not only supports future audits but also enables a “human-in-the-loop” approach, where data owners approve critical steps while the agent orchestration handles the heavy lifting.
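A governance gate along these lines might look as follows. The masking rule, audit-log shape, and approval flag are simplified assumptions; a real deployment would integrate with the organization's policy engine and identity system.

```python
import re

# Every agent decision lands here, with sensitive values already masked,
# so the trail supports future audits.
AUDIT_LOG: list[dict] = []

def mask_sensitive(record: dict) -> dict:
    """Toy masking rule: redact any string value that looks like an email."""
    masked = {}
    for key, value in record.items():
        if isinstance(value, str) and re.search(r"\S+@\S+", value):
            masked[key] = "***MASKED***"
        else:
            masked[key] = value
    return masked

def agent_step(action: str, record: dict,
               critical: bool, owner_approved: bool) -> bool:
    """Log the decision, then block critical steps until a data owner signs off."""
    AUDIT_LOG.append({
        "action": action,
        "record": mask_sensitive(record),
        "critical": critical,
    })
    if critical and not owner_approved:
        return False  # human-in-the-loop: waits for the data owner
    return True

ok = agent_step("migrate-table",
                {"email": "ana@example.com"},
                critical=True,
                owner_approved=False)
print(ok, AUDIT_LOG[-1]["record"]["email"])  # False ***MASKED***
```

Note that the step is logged whether or not it proceeds: the audit trail records blocked attempts as well as approved ones, which is exactly what a compliance review needs.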

But we should not be misled. Despite these new capabilities brought by AI, the true foundation remains people. They design the orchestration, lead it, make the difficult decisions, and define the strategy.

We are living in a unique era — one where AI can be used to accelerate the modernization of what ultimately enables better decision-making: data.