
Fluence Is Turning the Cloud Inside Out — and Making It Cheaper

“The right moment to expand into GPUs was probably two years ago,” Evgeny Ponomarev says. “AI demand is exploding, and our customers kept asking for GPU capacity. It was a natural next step.”

For years, Fluence has been building a decentralized cloud layer that challenges hyperscalers like AWS and Google Cloud. The company first made its mark with CPU-based compute, powering blockchain nodes and data-processing workloads across its distributed network. Now, with AI dominating global infrastructure conversations, Fluence is stepping boldly into GPU territory.

From Nodes to Neural Nets

“Our first product was simple: virtual servers,” Ponomarev explains. “People could run databases, backends, even analytics. But because we’re native to Web3, most of our customers use Fluence to run blockchain nodes.”

Now, many of those same customers need GPU power, not only for AI training but also to support new blockchain protocols that rely on AI-driven processes. “We realized some nodes simply can’t function without GPUs anymore,” he says. “So expanding our infrastructure was both a customer need and a network opportunity.”

Beyond Cost: Choice and Control

Fluence is often described as up to 85% cheaper than centralized clouds, but Ponomarev insists that’s only part of the story.
“Cost efficiency matters,” he says, “but the real value is customization.”

He gives an example:
“Imagine you need ten H100 GPUs sitting right next to petabytes of training data in the same data center. Hyperscalers rarely let you customize setups like that. On Fluence, because we aggregate hardware from multiple providers and data centers, we can actually give you that flexibility.”
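The placement problem Ponomarev describes can be sketched in a few lines. This is a purely illustrative model, not Fluence’s actual API: it assumes hypothetical provider offers aggregated from several data centers, and greedily picks offers that colocate the requested GPUs with the customer’s dataset.

```python
# Illustrative sketch of GPU placement with a data-locality constraint.
# All names and data structures here are assumptions for the example.
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    datacenter: str
    gpu_model: str
    gpus_free: int

def place(offers, dataset_dc, gpu_model, gpus_needed):
    """Pick offers in the dataset's data center until the GPU count is met."""
    picked, remaining = [], gpus_needed
    for o in offers:
        if o.datacenter == dataset_dc and o.gpu_model == gpu_model and remaining > 0:
            take = min(o.gpus_free, remaining)
            picked.append((o.provider, take))
            remaining -= take
    # Return None if no combination in that data center covers the request.
    return picked if remaining == 0 else None

offers = [
    Offer("provider-a", "fra-1", "H100", 6),
    Offer("provider-b", "fra-1", "H100", 8),
    Offer("provider-c", "ams-2", "H100", 16),
]
print(place(offers, "fra-1", "H100", 10))  # [('provider-a', 6), ('provider-b', 4)]
```

Because capacity is pooled across independent providers, a request a single operator couldn’t fill (here, ten H100s in one site) can be satisfied by combining two providers in the same data center.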

It’s a vision of compute diversity, where anyone, from AI startups to research labs, can access the exact hardware they need without gatekeepers.

Bridging Web3 and AI

As decentralized protocols begin to blend with AI systems, Fluence finds itself at the intersection of two massive revolutions.
“Many new Web3 protocols have AI components that require GPU compute,” Ponomarev says. “We’re enabling these networks to stay decentralized by running their nodes across our CPU and GPU capacity.”

That convergence is more than technical; it’s philosophical. AI, often criticized for centralization, now has a decentralized foundation to build upon.

Vision 2026: The Decentralized Cloud Stack

Fluence’s roadmap reads like a manifesto for open infrastructure.
“By 2026, success means having the minimum viable cloud to serve any customer,” says Ponomarev. “That includes CPU and GPU instances, block storage, load balancing, managed Kubernetes: the full stack.”

But there’s also a tokenized economy developing behind the scenes.
“Today, people stake to secure compute and earn rewards. Next, we’re introducing a stablecoin collateralized by our token, FLT, to pay for capacity directly,” he explains. “We’re also tokenizing hardware itself, so you can invest in certain kinds of machines, like GPUs, and share in their revenue.”
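The revenue-sharing idea behind tokenized hardware reduces to simple pro-rata arithmetic. A minimal sketch, with entirely hypothetical holders and numbers; Fluence’s actual tokenomics are not specified in the interview:

```python
# Illustrative arithmetic only: splitting one machine's revenue pro-rata
# among token holders. The mechanism and figures are assumptions.
def share_revenue(holdings, revenue):
    """Split `revenue` proportionally to each holder's share of tokens."""
    total = sum(holdings.values())
    return {holder: revenue * n / total for holder, n in holdings.items()}

# Hypothetical token holders for a single tokenized GPU machine.
holders = {"alice": 500, "bob": 300, "carol": 200}
print(share_revenue(holders, 1000.0))  # {'alice': 500.0, 'bob': 300.0, 'carol': 200.0}
```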

It’s a complex system, but Ponomarev distills it simply:
“We’re turning compute into an on-chain marketplace where anyone can participate, provide resources, or use them. That’s the future of the cloud.”

The Bottom Line

Fluence’s expansion into GPU compute isn’t just a feature update; it’s a statement of intent. As AI’s hunger for compute deepens, decentralized alternatives like Fluence are proving that scale, efficiency, and openness don’t have to be mutually exclusive.

“People used to think decentralized infrastructure couldn’t compete with hyperscalers,” says Ponomarev. “Now we’re proving it can outperform them in cost, flexibility, and resilience.”

The post Fluence Is Turning the Cloud Inside Out — and Making It Cheaper appeared first on Metaverse Post.
