CLONES turns human workflows into verifiable, traceable, and trusted AI

‘Verification shouldn’t come after intelligence is built. It should build the intelligence,’ say the platform’s founders

By Urian Buenconsejo | Oct. 15, 3 p.m. | 3 min read
Victor and Waylon of CLONES
'We’re not just recording how machines learn, we’re recording how people teach them,' says Victor of CLONES AI

Data in AI is both its foundation and its flaw. Most systems are trained on information that cannot be traced to a verified source. Datasets are scraped, reconstructed, or anonymized until accountability dissolves. For executives integrating AI into regulated environments (finance, defense, healthcare), that opacity is not a technical problem but a structural one.

CLONES, founded by Victor and Waylon, proposes a structured way to record, verify, and trade the digital traces of human-computer interaction, turning demonstrations, rather than documents, into the foundation of machine learning.

“Verification shouldn’t come after intelligence is built,” Victor said. “It should build the intelligence.”

Introducing Proof of Use (PoU)

At the center of CLONES’ architecture is Proof of Use (PoU), a verification mechanism that logs and validates how users interact with software in real time. Each workflow is encoded with metadata and timestamps that preserve both context and sequence.
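CLONES has not published the PoU schema, but the mechanism described (sequenced actions, timestamps, and metadata that a validator can re-check) maps naturally onto a hash-chained event log. A minimal sketch, with hypothetical field names:

```python
import hashlib
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PoUEvent:
    """One recorded user action inside a workflow (illustrative schema, not CLONES' actual format)."""
    session_id: str   # groups events into a single workflow
    sequence: int     # preserves ordering within the session
    action: str       # e.g. "click", "input", "navigate"
    target: str       # the UI element or screen the action touched
    timestamp: float  # wall-clock time of the action
    prev_hash: str    # digest of the previous event, chaining the sequence

    def digest(self) -> str:
        """Content hash a validator can recompute to confirm integrity."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

# Chaining each event to its predecessor makes the log tamper-evident:
e1 = PoUEvent("sess-42", 0, "click", "export-button", time.time(), prev_hash="")
e2 = PoUEvent("sess-42", 1, "input", "filename-field", time.time(), prev_hash=e1.digest())
```

Reordering or deleting a step would break every digest downstream, which is what lets a workflow be treated as verified rather than merely recorded.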

For machine learning teams, this offers a different kind of training corpus: one that reflects how real users navigate and problem-solve within interfaces, rather than relying on machine-generated assumptions about human behavior. 

“Every action that trains a model is an act of value creation,” Waylon said. “The difference is whether that value is visible, provable, and rewarded.”

PoU reframes data acquisition as a measurable process rather than a hidden pipeline. Each contribution is verifiable, traceable, and paired with consent, allowing organizations to demonstrate compliance and researchers to confirm integrity.

The idea is not to compete with large-scale annotation or synthetic datasets, but to complement them, adding a human-authored layer of interaction that other systems can’t simulate.

The data supply chain

Beneath the philosophical framing is a market design problem. The training-data industry, valued at $2.6 billion in 2024 by Grand View Research, depends on distributed labor and fragmented ownership. Companies can spend months sourcing or validating data, often without knowing who produced it or how.

CLONES approaches this as a supply-chain challenge: how to create liquidity and traceability for a commodity that has never been priced transparently.

Its marketplace structures each dataset as a tokenized asset. When a buyer accesses the data, a burn event executes an on-chain license, leaving a permanent record of its use.
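The article does not detail the on-chain format, but a burn-executed license reduces to a small immutable record. A sketch, with illustrative field names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BurnLicense:
    """Hypothetical record left on-chain when a buyer's burn event licenses a dataset."""
    dataset_id: str     # token identifying the dataset asset
    buyer: str          # address of the licensee
    tokens_burned: int  # quantity consumed by the purchase
    tx_hash: str        # transaction hash: the permanent, auditable trace of use
```

Because the record is immutable and the tokens are destroyed, the same license cannot be silently reused or resold, which is the property that makes use provable.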

Prices adjust dynamically through bonding-curve mechanics, reflecting real demand rather than speculative hype.
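CLONES has not disclosed its curve, but the simplest bonding curve is linear: each license already sold raises the price of the next by a fixed step. A sketch with illustrative parameters:

```python
def bonding_curve_price(supply_sold: int, base_price: float = 1.0, slope: float = 0.05) -> float:
    """Linear bonding curve: price is a deterministic function of units already sold.
    Curve shape and parameters are illustrative, not CLONES' actual mechanics."""
    return base_price + slope * supply_sold

# Demand moves the price mechanically; there is no order book to speculate against:
for sold in (0, 10, 100):
    print(sold, bonding_curve_price(sold))  # 1.0, 1.5, 6.0
```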

“Every dataset should have a supply chain,” Waylon said. “That’s how you guarantee quality and origin.”

For enterprises, the result is not a replacement for existing procurement but a parallel channel, where provenance, cost, and authorship are recorded automatically. This allows AI teams to source smaller, validated data segments incrementally rather than waiting for monolithic annotation cycles.

The design borrows less from crypto markets than from logistics management. Each dataset has a creation point, a validator, and a recipient. The process can be audited at any time, establishing what Victor calls proof of contribution, a record that ties human expertise to the data it produced.
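Under that logistics framing, an audit reduces to walking the chain of custody. A minimal sketch, assuming a three-hop creator-validator-recipient trail with hypothetical names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProvenanceStep:
    """One hop in a dataset's supply chain (field names are illustrative)."""
    actor: str        # who handled the data at this hop
    role: str         # "creator", "validator", or "recipient"
    record_hash: str  # digest binding this step to the dataset's content

def audit(trail: list[ProvenanceStep]) -> bool:
    """Passes only if the dataset moved creator -> validator -> recipient intact."""
    roles = [step.role for step in trail]
    hashes = {step.record_hash for step in trail}
    return roles == ["creator", "validator", "recipient"] and len(hashes) == 1

trail = [
    ProvenanceStep("alice", "creator", "0xabc"),
    ProvenanceStep("audit-node-7", "validator", "0xabc"),
    ProvenanceStep("acme-ml-team", "recipient", "0xabc"),
]
assert audit(trail)  # proof of contribution: expertise tied to the data it produced
```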

Visibility as infrastructure

What CLONES introduces to AI infrastructure is not moral oversight, but transparency. It doesn’t define what “ethical” training looks like, but it ensures that the path from data creation to model training can be inspected.

That visibility has structural consequences. It enables measurable accountability in environments where audit trails are increasingly mandatory, and it recognizes human participation as a quantifiable input in machine learning, something rarely reflected in current pipelines.

AI will continue to learn. The remaining question is how much of that learning will be traceable. CLONES’ founders argue that visibility, not velocity, is the more sustainable foundation for intelligent systems.

As Victor summarized: “We’re not just recording how machines learn, we’re recording how people teach them.” That’s a verifiable premise: not an ideology, but an infrastructure that can be measured by what it reveals.
