SuperNODE™ platform, capable of supporting dozens of Corsair™ AI inference accelerators in a single node, delivers unprecedented scale and efficiency for next-generation AI inference workloads.
ATLANTA--(BUSINESS WIRE)--d-Matrix today officially launched Corsair™, an entirely new computing paradigm designed from the ground up for the next era of AI inference in modern datacenters. Corsair ...
Startup launches “Corsair” AI platform built on Digital In-Memory Computing with on-chip SRAM, producing 30,000 tokens/second at 2 ms/token latency for Llama3 70B in a single rack. Using ...
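As a back-of-the-envelope check on those figures, the sketch below relates per-token latency to aggregate throughput. It assumes the quoted 2 ms/token is the per-stream generation latency and the 30,000 tokens/second is the rack-level aggregate; under those assumptions the numbers imply roughly 60 concurrent generation streams. This interpretation is ours, not stated in the announcement.

```python
# Hypothetical sanity check of the quoted Corsair figures.
# Assumption: 2 ms/token is per-stream latency; 30,000 tokens/s is aggregate rack throughput.
per_token_latency_s = 0.002      # 2 ms per token, per stream
rack_throughput_tps = 30_000     # aggregate tokens/second for the rack

tokens_per_stream = 1 / per_token_latency_s                  # 500 tokens/s per stream
implied_streams = rack_throughput_tps / tokens_per_stream    # ~60 concurrent streams

print(f"Per-stream rate: {tokens_per_stream:.0f} tokens/s")
print(f"Implied concurrent streams: {implied_streams:.0f}")
```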
The company sees a window in which it can launch its cost-effective solution and gain traction ahead of others’ next-gen silicon. d-Matrix has closed a $110 million Series B funding round led by ...
CARLSBAD, Calif. – Edge-to-core AI platform company GigaIO today announced the next phase of its partnership with d-Matrix to deliver an inference solution for enterprises deploying AI at scale.
At a time when millions of people are using AI services, and with the rise of agentic AI, reasoning, and multi-modal interactive content, the industry's focus has shifted from model training to deploying AI ...
GPU Alternative d-Matrix Raises $110 Million for AI Inference. Microsoft and other investors have poured $110 million into d-Matrix, an artificial intelligence chip company, ...