Why ESG Software Was Built Wrong

Juanjo Mestre, CEO & Cofounder · 12 min read



The sustainability software industry has a $30 billion problem, and it’s not what anyone thinks it is. It’s not a data quality problem. It’s not a regulation problem. It’s not even a talent problem. It’s an architecture problem. And almost no one is talking about it.

The original sin

When the first wave of ESG software emerged, it was designed to answer one question: how do we fill out this report? That sounds reasonable. Companies faced new disclosure requirements (GRI, CDP, early iterations of what would become CSRD) and they needed tools to compile information and produce documents. The market responded with what it always responds with when faced with a compliance deadline: purpose-built reporting tools.

Here’s the problem. Building software around a reporting framework means you inherit the structure of that framework as your data architecture. Your data model isn’t shaped by how sustainability data actually behaves inside an organization. It’s shaped by how a specific framework decided to ask questions in a specific year.

Framework-first architecture is the original sin of ESG software. Everything that frustrates companies today flows from it. The industry didn’t just build the wrong product. It built the wrong foundation. And then spent a decade furnishing a building with structural cracks.

The Framework Trap

Consider what happens when a company adopts a typical ESG platform. The tool asks: what’s your Scope 1 emissions figure? The company enters a number. The tool maps it to the right cell in the right framework. Report generated. Compliance achieved.

Now a second framework asks for the same underlying data, but structured differently. A third asks for overlapping but not identical data. An investor requests a custom cut. An internal team needs the raw inputs for operational decisions. The platform can’t serve any of these without rework, because the data was captured for a framework, not captured as structured evidence that can be reused across any output.

This is the Framework Trap: the more frameworks you support, the more redundant data capture you create, and the more fragile your entire system becomes. Every new reporting request doesn’t simplify, it multiplies the work. Most companies don’t realize they’re in this trap until they’re managing three or four frameworks simultaneously and spending more time reconciling numbers across reports than actually improving performance. By then, they’ve built years of institutional knowledge on top of a broken foundation.

The category error

This isn’t a feature gap. It’s a category error. A reporting tool takes inputs and produces a formatted output. Its value is in the last mile: the document, the submission, the PDF. The data exists to serve the report. Data infrastructure does the opposite. It captures evidence once, structures it at the source, and makes it available to any downstream process (reporting included, but not exclusively). The report is one of many possible outputs. The value is in the architecture, not the document.

The distinction:

- Framework-first (what the industry built): Data → Framework Template → PDF. Repeated per framework. Reconciled manually. Breaks when anything changes.
- Architecture-first (what should have been built): Evidence → Structured Data Layer → Infinite Outputs. Captured once. Mapped dynamically. Scales with complexity.

In the first model, data is shaped by the question. In the second, the question is shaped by the data. The industry confused document generation with data design. It built translation software when it needed to build a data operating system.
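The evidence-first flow fits in a few lines of Python. This is an illustration of the idea, not any vendor’s implementation: the `Evidence` fields, the `render` helper, and the framework labels ("E1-5", "C8.2a") are hypothetical stand-ins.

```python
from dataclasses import dataclass

# Hypothetical sketch: evidence is captured once, in its own shape,
# and rendered into any framework's field names on demand.

@dataclass
class Evidence:
    source: str   # e.g. the document the number came from
    metric: str   # internal, framework-independent metric name
    value: float
    unit: str
    period: str   # reporting period the evidence covers

def render(evidence: list[Evidence], mapping: dict[str, str]) -> dict:
    """Render structured evidence into one framework's labels.

    `mapping` translates internal metric names to the framework's
    field names; the evidence itself is never re-captured."""
    out: dict[str, float] = {}
    for e in evidence:
        if e.metric in mapping:
            label = mapping[e.metric]
            out[label] = out.get(label, 0.0) + e.value
    return out

records = [
    Evidence("utility_bill_q1.pdf", "electricity_kwh", 12000.0, "kWh", "2024-Q1"),
    Evidence("utility_bill_q2.pdf", "electricity_kwh", 9500.0, "kWh", "2024-Q2"),
]

# Two frameworks, two mappings, zero extra data collection.
# (Labels below are illustrative, not exact datapoint codes.)
csrd_view = render(records, {"electricity_kwh": "E1-5 Energy consumption"})
cdp_view = render(records, {"electricity_kwh": "C8.2a Electricity"})
```

In the framework-first model, each of those views would have been a separate data-entry exercise; here both are derived from the same two records.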

Why this happened

It wasn’t stupidity. It was incentive structure. The buyers of first-generation ESG software were sustainability teams with compliance deadlines. Their success was measured by reports submitted on time. They didn’t need infrastructure, they needed output. Fast.

So vendors optimized for time-to-report. Ingest data in whatever format the client had, map it to the framework, generate the report. Ship it. Renew the contract. This created a market where every vendor competed on framework coverage. Who supports CSRD? Who supports CBAM? Who added DMA first? The race to cover more frameworks only deepened the architectural debt underneath.

Nobody stopped to ask: what if the data layer was independent of any framework? Nobody asked because nobody was incentivized to ask. The sustainability team wanted reports. The vendor wanted renewals. The regulator wanted disclosures. Compliance urgency created architectural debt at industry scale.

The compounding cost

Each year, the regulatory (and client-request) surface area expands. Each new framework adds another capture cycle. Each cycle adds another reconciliation burden. The cost doesn’t grow linearly, it grows geometrically.

Redundant capture. The same data point (energy consumption at a facility) gets collected, formatted, validated, and entered multiple times for multiple frameworks. Each instance introduces variance. The data isn’t wrong, exactly. It’s just never authoritative.

Structural fragility. When a framework changes, the entire pipeline breaks. Not because the underlying reality changed, but because the mapping layer is hardcoded to a specific version of a specific standard. A framework update should be a configuration change. Instead, it’s a migration.

No operational value. Data locked inside a reporting structure can’t inform decisions. You can’t run scenario analysis on numbers trapped in a CSRD template. You can’t optimize procurement with data captured for CDP. Sustainability data stays siloed, disconnected from the operational systems that could actually use it.

Audit exposure. When the same number appears in three reports with three slightly different values, you don’t have a data problem. You have a governance problem. And mandatory assurance is about to surface it.

An entire software category is compounding its own fragility. And the reporting tool, by design, has no mechanism to flatten that curve.

What the architecture should have been

The principle: capture once, structure at the source, reuse infinitely. A company generates sustainability-relevant evidence constantly: invoices, utility bills, sensor readings, supplier declarations, HR records, procurement data… That evidence exists whether or not a framework asks for it.

The job of the software is not to ask the company to re-enter that evidence into a framework-shaped form. The job is to ingest the evidence in its raw form, structure it against a universal data model, and then render it into whatever output is required. The framework becomes a rendering layer, not a data model. The report becomes a view, not a destination.

This is Single Capture Architecture. One ingestion point per data source. One structured representation per evidence unit. Unlimited outputs. Adding a new framework becomes a mapping configuration, not a data collection project. Updating a framework version becomes a schema change, not a migration. Serving an ad-hoc client request becomes a query, not a three-week sprint.

The marginal cost of the next framework approaches zero. The data becomes auditable by default, one authoritative source instead of fifteen copies. And crucially, the data becomes available for operational use, because it’s not locked inside a report. This isn’t one of several possible approaches. It’s the only architecture that survives the next five years of regulatory expansion without breaking.
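“Frameworks as configuration” can be made concrete with a minimal sketch. The registry, metric names, and framework labels below are all invented for illustration; the point is that adding or versioning a framework touches only the registry, never the capture pipeline or the evidence store.

```python
# Hypothetical sketch: the framework is a rendering configuration.
# Each entry maps internal metric names to one framework's labels.
FRAMEWORKS = {
    "csrd_v1": {
        "scope1_tco2e": "E1-6 Gross Scope 1",
        "electricity_kwh": "E1-5 Energy",
    },
    "cdp_2024": {
        "scope1_tco2e": "C6.1 Scope 1",
    },
}

# One authoritative value per structured metric (captured once).
EVIDENCE = {
    "scope1_tco2e": 4210.0,
    "electricity_kwh": 21500.0,
}

def report(framework: str) -> dict:
    """Render the evidence store through one framework's mapping.

    Metrics the framework asks for but the store lacks are simply
    omitted; in a real system they would surface as gaps."""
    mapping = FRAMEWORKS[framework]
    return {label: EVIDENCE[metric]
            for metric, label in mapping.items()
            if metric in EVIDENCE}

# A framework update is a new registry entry, not a migration:
FRAMEWORKS["csrd_v2"] = {**FRAMEWORKS["csrd_v1"],
                         "scope2_tco2e": "E1-6 Gross Scope 2"}
```

Under this shape, “supporting a new framework” is the cost of writing one mapping, which is why the marginal cost of the next output trends toward zero.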

Why agents are impossible without this

Here is where the argument becomes binary. The current model assumes human-driven workflows: a person collects data, validates it, maps it, generates the report. The software automates some steps, but the cognitive architecture is still manual. This model has a ceiling. The complexity surface (more frameworks, more granularity, more jurisdictions, more assurance requirements) is growing faster than teams can hire. The manual model cannot keep up. That point isn’t theoretical. It’s arriving.

The only system that can absorb this complexity without proportional headcount growth is agentic: software agents that understand the data model, monitor evidence streams, flag gaps, execute mappings, and generate outputs autonomously. But here is the dependency the market is ignoring: agents are not an enhancement you add to existing software. They are structurally impossible without architectural correction.

An agent needs a coherent data model to reason over. It needs to traverse relationships between facilities, suppliers, time periods, and evidence sources. It needs to trace any number to its origin, assess confidence, identify what’s missing, and determine how to resolve it. None of this works on framework-shaped data. An agent looking at a Scope 3 cell in a CSRD template sees a number. It doesn’t see the supply chain evidence that produced it, the methodology applied, the confidence level, or the coverage gaps. The agent is blind, not because the AI isn’t capable, but because the data structure gives it nothing to navigate.

The equation is binary. Framework-shaped architecture → no meaningful agents. Ever. You can add chatbots, draft generators, cosmetic AI. But you cannot deploy agents that autonomously manage the evidence lifecycle, because the architecture doesn’t support reasoning. Evidence-based architecture → agents become inevitable. Once the data layer is structured, coherent, and framework-independent, deploying agents is a natural consequence. The architecture invites agency because it provides everything an agent needs: structured evidence, complete lineage, semantic relationships, gap awareness.

This is what an Agentic Workspace looks like. Not AI features bolted onto a reporting tool, but a data architecture where evidence lives once, is structured once, and intelligent agents manage everything from ingestion to disclosure. The workspace isn’t the interface. It’s the data layer. Companies that built reporting tools cannot bolt agents onto their current architecture. They would need to rebuild from the foundation. Which means admitting the architecture was wrong from the start.
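What “agent-navigable” evidence might look like, as a hypothetical sketch: every value carries its origin, methodology, and a confidence score assigned at ingestion, so a gap check becomes a query rather than a manual audit. The field names and thresholds are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch: evidence an agent can reason over carries
# lineage and confidence, not just a bare number in a template cell.

@dataclass
class EvidenceUnit:
    metric: str
    value: float
    source_doc: str   # where the number came from (lineage)
    method: str       # e.g. "metered" vs "spend-based estimate"
    confidence: float # 0.0 to 1.0, assessed at ingestion

def find_gaps(units: list[EvidenceUnit],
              required: set[str],
              min_conf: float = 0.7) -> tuple[set[str], list[str]]:
    """Return what an agent would need to resolve: metrics with no
    evidence at all, and metrics whose evidence is low-confidence."""
    present = {u.metric for u in units}
    missing = required - present
    weak = [u.metric for u in units if u.confidence < min_conf]
    return missing, weak

units = [
    EvidenceUnit("scope1_tco2e", 4210.0, "fuel_invoices.csv",
                 "metered", 0.95),
    EvidenceUnit("scope3_cat1_tco2e", 18000.0, "spend_ledger.csv",
                 "spend-based estimate", 0.4),
]

missing, weak = find_gaps(
    units, {"scope1_tco2e", "scope2_tco2e", "scope3_cat1_tco2e"})
# missing == {"scope2_tco2e"}; weak == ["scope3_cat1_tco2e"]
```

On framework-shaped data, none of these fields exist, which is the concrete sense in which the agent is blind: there is nothing to trace, score, or query.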

The bifurcation

The sustainability software market splits along this exact line.

On one side: reporting tools competing on framework coverage, fighting over shrinking margins as frameworks converge and simplify. Their moat erodes every time a regulator consolidates requirements, because simplification removes the complexity they were built to manage. No path to agentic capability. Increasing cost per output. A ceiling on value.

On the other side: data infrastructure that treats sustainability evidence as a first-class data asset, independent of any regulatory output. Value increases with every new framework, every new stakeholder request, every new use case, because the marginal cost of a new output from structured data is near zero. And the architecture naturally supports the agentic systems that the next decade demands.

One architecture gets more expensive as complexity grows. The other compounds.

The choice

The ESG software industry optimized for output when it should have optimized for architecture. Compliance urgency created architectural debt at industry scale. The result is an entire category of software that makes the problem it was designed to solve progressively worse.

The fix isn’t better reporting tools. It isn’t more connectors, more templates, more framework translators. It isn’t an AI copilot layered on a broken data model. The fix is rebuilding from the data layer up. Single capture. Structured evidence. Framework-agnostic architecture. Agentic execution.

The companies that rebuild their data layer will turn compliance into a byproduct and regulatory complexity into leverage. The ones that don’t will keep hiring analysts to reconcile spreadsheets, while the complexity they’re trying to manage grows faster than they can staff against it. That’s not a choice between two products. It’s a choice between two futures.


Ready to see architecture-first sustainability software? Request a demo to explore how Dcycle’s Single Capture Architecture works.

Tags: ESG, AI, Sustainability, Dcycle

Collect once. Use everywhere.

See how Dcycle can cut your reporting time by 70% and give your auditors what they need, the first time.

See Dcycle in action