Anthropic in early talks to buy DRAM-less AI inference chips from UK startup — Fractile's SRAM architecture reduces the need for pricey memory amid an extreme pricing and shortage crunch

(Image credit: Anthropic, AMD)

Anthropic has reportedly held early discussions with London-based chip startup Fractile about purchasing the company's inference accelerators, The Information reported on Friday, citing people familiar with the matter. The talks would add Fractile as a fourth source of AI server silicon for the Claude developer, which already uses chips from Nvidia, Google, and Amazon.

Fractile's chips aren’t expected to reach commercial readiness until around 2027, placing any deployment well outside Anthropic's near-term procurement plans and roughly inside the same window as its Google-Broadcom TPU partnership.


Luke James
Contributor

Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.

  • alan.campbell99
    2027, assuming Anthropic doesn't fall over and die if the bubble pops. How will they pay for these on top of all their other commitments, alongside their insane cash burn? This ARR that gets thrown around is a have; I believe to date they've made something like $5 billion? They have this apparent surging demand, yet are still signing up as many customers as they can get their hands on? At least there was a token acknowledgement of their inference costs; neither they nor OpenAI are profitable here. I do wish some tech reporters would stop blindly accepting details from known liars and con men, and maybe start being a bit more critical instead of carrying these guys' water.
    Reply
  • American2021
    alan.campbell99 said:
    2027, assuming Anthropic doesn't fall over and die if the bubble pops. How will they pay for these on top of all their other commitments, alongside their insane cash burn? This ARR that gets thrown around is a have; I believe to date they've made something like $5 billion? They have this apparent surging demand, yet are still signing up as many customers as they can get their hands on? At least there was a token acknowledgement of their inference costs; neither they nor OpenAI are profitable here. I do wish some tech reporters would stop blindly accepting details from known liars and con men, and maybe start being a bit more critical instead of carrying these guys' water.
    It is a private company, so no financials are available on the sec.gov website; its detailed internal financials remain confidential. This fuels a lot of speculation and interesting reporting. That said, they did file a Form D, which can be accessed in the EDGAR database, but there is very limited data there. Presently, there is a lot of speculative reporting around Anthropic PBC from the usual suspects (e.g., Crunchbase, PitchBook, Bloomberg) that report on private-company valuations, likely off track to whatever degree they are.
    Reply
  • bit_user
    The article said:
    Fractile is one of several inference-focused startups pursuing SRAM-based or near-memory architectures, including Groq and Cerebras.
    Yes, most purpose-built AI chips I've read about seem to do this, including Tenstorrent and Graphcore (now defunct, I think). The downside is that it can take lots and lots of chips to hold the weights of a large model for inference.
    Reply
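
The chip-count concern raised in the comments can be made concrete with a rough back-of-envelope calculation: divide the model's weight footprint by the on-die SRAM per chip. The figures below (model size, weight precision, SRAM capacity) are purely illustrative assumptions, not Fractile specifications:

```python
import math

def chips_needed(params_billion: float, bytes_per_param: float, sram_mb_per_chip: float) -> int:
    """Estimate how many SRAM-only chips are needed just to hold a model's weights.

    Ignores activations, KV cache, and redundancy, so this is a lower bound.
    """
    model_bytes = params_billion * 1e9 * bytes_per_param
    sram_bytes = sram_mb_per_chip * 1024**2
    return math.ceil(model_bytes / sram_bytes)

# e.g. a hypothetical 70B-parameter model with 8-bit weights (1 byte/param)
# on chips each carrying 512 MB of on-die SRAM:
print(chips_needed(70, 1, 512))  # -> 131 chips, weights alone
```

Even under these generous assumptions, a single large model spans on the order of a hundred chips, which is why SRAM-heavy inference designs lean on massive multi-chip scale-out rather than per-chip capacity.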