Early Anthropic investor is building an AI server "computing power grid"


Anjney Midha was a general partner at Andreessen Horowitz (a16z) and an early personal investor in Anthropic. According to my colleague Katie, he is finally ready to publicly unveil his much-anticipated AI infrastructure startup, for which he previously sought to raise more than $10 billion.

Midha said the new company, named AMP, is building an "AI computing power grid" that works much like a centralized electrical grid coordinating power supply, fundamentally changing how AI developers access scarce server resources. (The name AMP is, fittingly, derived from the ampere, the unit of electric current.)

After a long conversation with Midha at NVIDIA's GTC conference, I realized he is advancing an idea he has been developing for years: AI computing resources should be sold like public electricity, at lower cost and with broader reach.

OpenAI CEO Sam Altman also discussed this concept last week at the BlackRock Infrastructure Summit, but he indicated that OpenAI itself would become a supplier of such resources.

Either way, this would mark a significant shift from how AI infrastructure is sold today.

During his tenure at Andreessen Horowitz, Midha built AMP's prototype: the Oxygen computing cluster, which pooled NVIDIA chips for shared use among the firm's portfolio companies. Concerned that AI computing resources were rapidly concentrating in the hands of a few companies holding large numbers of GPUs, he decided to spin the project off and establish AMP.

Today, NVIDIA graphics processing units (GPUs) are supplied mainly through long-term leases (reserved instances) or hourly rentals (spot instances), an allocation model that Midha considers fundamentally inefficient.

Just as the power grid became a key facility for businesses to share scarce electricity a century ago, AMP aims to provide a similar shared model for AI developers in need of servers. Midha envisions that AI developers will not have to individually procure and maintain infrastructure — whether renting or purchasing from cloud vendors or chip companies — but will instead use a more efficient shared system.

He declined to name AMP's other partners, whether server suppliers or compute consumers, but said that top research labs and cloud vendors are already participating in the project.

He did not fully detail AMP's business model. Rather than building and operating data centers, AMP will launch an application that connects server suppliers with AI developers who need them. Midha likened it to an independent system operator in the electricity market: an entity that may not own the underlying infrastructure but is responsible for coordinating supply and demand.

To that end, AMP is developing software that allocates a shared pool of computing power among AI developers, scheduling when and on which nodes different workloads run. AMP will not rent out GPUs by the hour, however, nor will it charge AI developers directly.
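AMP's actual scheduling software is not public, so the following is a purely hypothetical sketch of the general idea: jobs drawing on one shared pool of GPUs instead of per-company reservations, with higher-priority workloads claiming capacity first. All names, numbers, and the greedy policy here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    gpus_needed: int
    priority: int  # higher-priority jobs claim GPUs first

def allocate(jobs, pool_size):
    """Greedy allocation from a shared pool: highest-priority jobs
    claim GPUs first; jobs that don't fit wait for the next cycle."""
    scheduled, waiting = [], []
    free = pool_size
    for job in sorted(jobs, key=lambda j: -j.priority):
        if job.gpus_needed <= free:
            free -= job.gpus_needed
            scheduled.append(job.name)
        else:
            waiting.append(job.name)
    return scheduled, waiting, free

# Hypothetical workloads sharing one 768-GPU pool
jobs = [Job("training-run", 512, 10),
        Job("inference", 128, 5),
        Job("research-sweep", 256, 1)]
scheduled, waiting, free = allocate(jobs, pool_size=768)
# The 512- and 128-GPU jobs fit; the 256-GPU sweep must wait
```

A real grid-style operator would of course handle preemption, time windows, and heterogeneous hardware; the point is only that allocation decisions move from individual buyers to a shared coordinator.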

Beyond GPUs

Notably, AMP plans to let developers rent various types of AI hardware. Midha did not say whether Google's Tensor Processing Units (TPUs) would be included, but given that AMP's founding team includes engineers who previously ran Google's large internal infrastructure-management systems, it is reasonable to assume they could build such support. (Google has reportedly taken concrete steps to open TPU access to AI developers outside Google Cloud.)

There are already companies aggregating AI servers across providers, such as Together AI and NVIDIA itself (which previously tried to build a marketplace for trading idle GPU capacity), but AMP's model does not map neatly onto either.

Midha stated, “You must be a neutral, independent entity, establish uniform standards, and allow all participants to connect.”

AMP plans to release a mission statement later today, aimed at attracting more companies to join this computing power grid. I am very curious to see which organizations will ultimately participate.

Today, AI companies generally treat server capacity as a strategic advantage, so AMP will need to offer economic incentives to persuade them to use its system or contribute their own servers.

Given Midha's close ties to Anthropic as an early investor, I suspect that Claude's developer will participate in the project.

Midha declined to comment on the company’s capital structure but mentioned that several hundred million dollars have already been invested in the project over the past few months.

The Challenge of Scaling Computing Power

Midha's inspiration for founding AMP came from working with Anthropic and other startups, where he saw firsthand how critical servers are to developing new models. The observation that models improve predictably as they are trained with more computing power is known in AI as the "scaling law."

However, he stated that scaling servers is not easy because developers’ demand for computing devices is hard to predict.

“Observing computing loads reveals that demand fluctuates dramatically,” he said. “A team’s load pattern typically involves: large-scale training tasks causing peaks, followed by periodic research and inference work, which is extremely difficult to predict.”

As a result, AI developers either reserve too few servers from cloud vendors or over-reserve and leave capacity sitting idle. They also face a constant dilemma: spend scarce computing power training better models, or use it to run existing models that serve customers and generate revenue.
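The over-reservation problem can be made concrete with a toy calculation using invented numbers: two teams with bursty demand that peaks at different times each reserve for their own peak, versus sharing one pool sized for the combined peak.

```python
# Toy numbers (hypothetical) showing why bursty demand wastes
# reserved capacity, and how pooling offsets the peaks.
# Each team's hourly GPU demand over a 4-hour window:
team_a = [800, 100, 100, 100]   # big training burst, then light research
team_b = [100, 100, 800, 100]   # bursts at a different hour

def reserved_utilization(demand):
    """Each team reserves for its own peak; utilization = used / reserved."""
    reserved = max(demand) * len(demand)
    return sum(demand) / reserved

def pooled_utilization(*demands):
    """A shared pool is sized for the combined peak instead."""
    combined = [sum(hour) for hour in zip(*demands)]
    reserved = max(combined) * len(combined)
    return sum(combined) / reserved

print(f"separate: {reserved_utilization(team_a):.0%}")
print(f"pooled:   {pooled_utilization(team_a, team_b):.0%}")
```

With these made-up figures, each team alone uses about a third of what it reserves, while the shared pool runs at roughly twice that utilization, because the peaks do not coincide. Real workloads are messier, but this is the basic arbitrage a compute grid exploits.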

He said, “Many of the most productive teams in cutting-edge research globally are the least efficient when utilizing computing power, the most valuable resource.”

In many cases, this drives companies to hoard AI server chips — even when many devices are idle. “This deeply troubles me,” he said.
