Inclusive Computing: Championing Decentralization
Disclosure: The views expressed here are solely those of the author and do not necessarily represent the perspectives of crypto.news’ editorial team.
If artificial intelligence resembles electricity, then a handful of private entities possess the ability to turn off the power for everyone else at their discretion.
Summary
- Computational power has emerged as the vital bottleneck — the top models and innovations hinge on a limited number of centralized servers, creating a lopsided competition instead of a fair race.
- View computing as essential infrastructure — akin to electricity or broadband, it should function as a utility with transparent pricing, open availability, and fair allocations.
- Decentralization is more beneficial than concentration — positioning computing resources near renewable energy sources and local hubs eases grid strain, lowers costs, and complicates monopolistic practices.
- Broader access fosters innovation — allowing more individuals to experiment openly accelerates the pace of iteration, unearthing breakthroughs and redistributing power within the ecosystem.
Access to the most extensive models, the boldest experiments, and the pace of discovery has now become dependent on a few tightly controlled servers and accelerators. This scenario diverges sharply from a truly free market; it’s more like gatekeeping that determines who shapes the future and who must wait.
Centralized computing does more than raise costs; it skews competition. When training opportunities are allocated through exclusive arrangements and biased systems, the results are predetermined long before the race begins; just look at Meta’s $10 billion cloud deal with Google.
Rationed by quotas, ambitious labs and students learn to curb their curiosity, and entire research directions are cut short. This feeds a self-sustaining narrative of ‘inevitable winners.’ Such an environment smothers innovation not in headlines, but through the quiet loss of ideas that never come to fruition.
Build the network, not the bottleneck
Recognize computing as the critical infrastructure it is, build accountability into every rack, and move quickly. Tie incentives to access metrics rather than exclusivity, and make that data public; then nothing stays hidden, the network grows, and everyone can take part in AI’s evolving story.
The real discussion revolves not around increasing capacity, but rather who will control it, under which conditions, and how broadly the benefits will be distributed. Concentration converts a universal technology into a privately controlled toll road. To ensure intelligence positively impacts the masses, computing needs to be treated as a public utility that guarantees equal access — no special privileges here.
Global electricity consumption by data centers is projected to more than double to approximately 945 terawatt-hours by 2030, largely driven by AI, according to the International Energy Agency. Concentrating this demand within a few locations amplifies grid pressure and costs.
Now, imagine that demand redistributed across sites positioned near new renewable energy sources and flexible energy grids. This strategy would yield a system that is cleaner, cheaper, and harder to monopolize, benefiting a much wider network.
Public funding should be directed toward ensuring public access today, encompassing provisions for open scheduling, fixed allocations for newcomers (including students, civic initiatives, and first-time founders), and transparent, cost-driven pricing.
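What a fixed allocation for newcomers with cost-driven pricing could look like can be sketched as a toy allocator. This is a hypothetical illustration, not any real scheduler: the share, price, and all names are invented for the example, and newcomers who exhaust the reserved pool simply fall through to the general queue.

```python
from dataclasses import dataclass

NEWCOMER_SHARE = 0.20     # fixed share of capacity reserved for newcomers (assumed)
COST_PER_GPU_HOUR = 2.50  # transparent, cost-driven price in dollars (assumed)

@dataclass
class Request:
    user: str
    gpu_hours: int
    is_newcomer: bool

def allocate(total_gpu_hours: int, requests: list[Request]) -> dict[str, int]:
    """Serve newcomers from the reserved pool first; everyone else (and any
    newcomer overflow) draws from the general pool, first come, first served."""
    reserved = int(total_gpu_hours * NEWCOMER_SHARE)
    general = total_gpu_hours - reserved
    grants: dict[str, int] = {}
    for r in requests:
        if r.is_newcomer and reserved >= r.gpu_hours:
            reserved -= r.gpu_hours
            grants[r.user] = r.gpu_hours
        elif general >= r.gpu_hours:
            general -= r.gpu_hours
            grants[r.user] = r.gpu_hours
    return grants
```

With 100 GPU-hours of capacity, a student requesting 15 hours is served from the 20-hour reserved pool, so a large lab requesting 80 hours cannot crowd them out of the queue.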
Europe’s AI Continent Action Plan proposes a network of AI Factories and regional antennas aimed at improving access and interoperability across borders. Regardless of one’s views on Brussels, the focus on diffusion rather than capture is a step in the right direction.
In other regions, the stakes are even higher, exemplified by U.S. President Donald Trump’s allocation of up to $500 billion for AI infrastructure. While it may appear advantageous for everyone, the outcome could either nurture a diverse ecosystem or solidify a monopoly, contingent on the terms implemented.
End scarcity-as-a-service
Let’s be honest. Scarcity has become the business model for centralized computing; it’s not merely a flaw. Major cloud agreements are often branded as ‘efficiency,’ yet they fundamentally create dependence as negotiating power consolidates around server locations.
When access hinges on contracts instead of merit, valuable concepts might falter before reaching validation. What is critical is a reserved, transparent portion of capacity for newcomers at equitable, cost-driven rates to ensure accessibility for all.
APIs must be open, schedulers need to interoperate, queue times and acceptance rates should be public, and any exclusive lockups must be disclosed to eliminate hidden barriers.
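As one illustration of what publishing queue times and acceptance rates could mean in practice, a scheduler might reduce its decision log to a small public summary. The log format here is invented for the sketch; a real operator would define their own schema.

```python
from statistics import median

def public_metrics(log: list[dict]) -> dict:
    """Summarize a scheduler decision log for publication.
    Each (hypothetical) entry: {"wait_hours": float, "accepted": bool}."""
    waits = [e["wait_hours"] for e in log if e["accepted"]]
    accepted = len(waits)
    return {
        "requests": len(log),
        "acceptance_rate": accepted / len(log) if log else 0.0,
        "median_wait_hours": median(waits) if waits else None,
    }
```

Publishing numbers like these makes hidden rationing visible: a provider claiming open access but showing a 30% acceptance rate and week-long median waits has some explaining to do.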
View this as more than just gaining access to machines or processing power; it’s a fundamental right to computing. Much like societies value literacy, healthcare, and broadband, computing should be recognized as an essential pillar for creativity, science, and advancement. Embracing this perspective means embedding guarantees into the system: ensuring portability so work and data move fluidly across environments, adopting carbon-aware scheduling to prevent innovation from harming the planet, and developing community or campus-level nodes that connect directly to a shared, resilient network.

The terminology matters. This isn’t about charity, handouts, or subsidies. It’s about unlocking progress and enabling anyone with an idea to test, iterate, and build without obstructive hurdles.
When more individuals can experiment — when they can try, fail, and try again without begging for access or waiting weeks for approval — the speed of iteration dramatically increases. What used to take months can happen within days. This freedom not only accelerates prototype development but also enhances learning curves, facilitates quicker pivots, and ultimately leads to faster breakthroughs. Moreover, beyond the technical advantages, a subtler change occurs: politics diminish. Build the network, not the bottleneck.