Piris Labs
Winter 2026 · New
Cerebras-speed inference but scalable
Piris Labs offers a full-stack inference service that eliminates the AI data movement bottleneck. By pairing proprietary photonic hardware with a vertically optimized software stack, it sidesteps the "memory wall" that limits expensive GPU clusters, delivering the same performance as traditional clusters at a fraction of the cost. The technology improves effective FLOP utilization and reduces latency, finally making the unit economics of trillion-parameter models sustainable. The company was founded by a team of MIT physicists and Meta AI experts.
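The "memory wall" claim can be made concrete with a back-of-envelope calculation: during autoregressive decoding, every model weight must be streamed from memory for each generated token, so throughput is capped by memory bandwidth rather than FLOPs. The sketch below uses illustrative, hypothetical figures (a 70B-parameter model, H100-class bandwidth), not numbers from Piris Labs.

```python
# Back-of-envelope: why LLM decoding is memory-bandwidth-bound.
# All figures here are illustrative assumptions, not Piris Labs specs.

def decode_tokens_per_sec(param_count: float, bytes_per_param: float,
                          mem_bw_gb_s: float) -> float:
    """Upper bound on decode throughput for a single accelerator.

    Each decoded token streams every weight from memory once, so
    tokens/s <= memory bandwidth / model size in bytes.
    """
    model_bytes = param_count * bytes_per_param
    return mem_bw_gb_s * 1e9 / model_bytes

# Hypothetical 70B-parameter model at 16-bit precision on hardware
# with ~3,350 GB/s of HBM bandwidth (roughly H100-class):
tps = decode_tokens_per_sec(70e9, 2, 3350)
print(f"~{tps:.1f} tokens/s upper bound per accelerator")
```

Even with abundant compute, this bound (~24 tokens/s in the example) is why raising effective memory bandwidth, e.g. via faster interconnects, directly raises inference throughput.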
AI Investor Summary
Piris Labs is building a full-stack inference service leveraging proprietary photonic hardware to eliminate the AI data movement bottleneck, offering Cerebras-speed performance at a fraction of the cost of traditional GPU clusters. Their strong technical team from Google and Meta is well-positioned to capture the massive and growing AI inference market by making trillion-parameter models economically viable.
Key Highlights
- Founders with strong AI/ML backgrounds from Google and Meta.
- Addressing a critical bottleneck in AI inference with a novel hardware/software solution.
- Potentially disruptive technology for the rapidly growing AI infrastructure market.
Risk Factors
- Hardware development risk and manufacturing scalability.
- Customer adoption risk in a market dominated by established GPU providers.
- Demonstrating clear unit economics and ROI for potential customers.
- Competition from other emerging AI hardware and infrastructure solutions.
Founders
Ali Khalatpour is the co-founder of Piris Labs, a Y Combinator startup focused on AI for scientific research. His background includes significant experience in AI and machine learning, with a focus on developing innovative solutions for complex scientific challenges. Prior to Piris Labs, he held roles at prominent tech companies, contributing to advancements in AI applications.
I am an AI technologist with a decade of experience scaling AI/ML products. At Meta and X, I specialized in leading tiger teams to launch high-stakes 0-to-1 initiatives. Now, I am the Co-Founder of Piris Labs. We are building high-speed photonic interconnects to solve the energy and latency constraints of AI inference. My current focus is iterating on our inference service and onboarding customers onto our platform.
Score Breakdown
Strong technical team with deep AI/ML expertise from top-tier companies (Google, Meta) and prestigious academic backgrounds (Stanford PhD, Stanford MS, Berkeley BS). Ali's research focus and Keyvan's experience scaling AI/ML products and leading 0-to-1 initiatives are highly relevant. The combination of research and product scaling experience is a significant plus. [Boost +2: Founder from Google; PhD from Stanford]
Large addressable market in AI inference infrastructure, driven by the exponential growth of AI models and the increasing demand for efficient, cost-effective deployment. The timing is excellent as the industry grapples with the limitations of current GPU-based inference. Regulatory tailwinds for AI development are also a positive factor. [Boost +1: Large TAM mentioned]
Product shows promise with a novel approach combining photonic hardware and a vertically optimized software stack to address the data movement bottleneck. The claim of Cerebras-speed inference at a fraction of the cost is compelling. Technical differentiation is present, and the potential for a defensible moat through proprietary hardware is high. UX quality and platform potential are yet to be fully demonstrated at this early stage.
Early stage with limited public traction metrics. While there is positive press coverage and investor interest (Tekedia Capital investment), concrete revenue, user numbers, and growth rates are not yet available. This is typical for a Winter 2026 batch, but a key area for future evaluation. [Boost +2: Tier-1 VC: accel]
News
A video introducing Piris Labs, highlighting their optical interconnect solution for faster and cheaper AI inference by addressing data movement bottlenecks.
Piris Labs has launched, offering fast and cost-effective AI inference powered by optical interconnects and a custom software stack, addressing the 'memory wall' bottleneck.
Piris Labs has raised $500K in a Seed round from Y Combinator in 2026, with Y Combinator being its sole investor.
Piris Labs is building a low-latency inference architecture by using proprietary photonic interconnects to increase bandwidth and reduce power consumption, aiming to solve the data movement bottleneck in AI infrastructure.
Tekedia Capital has invested in Piris Labs, a startup focused on creating a full-stack AI inference platform by leveraging proprietary photonic hardware to overcome data movement limitations in AI computing.
Piris Labs delivers low-latency, low-cost AI inference by eliminating the memory wall using full-stack optical interconnects and a purpose-built software stack, founded by MIT physicists and former Meta AI experts.
Piris Labs has joined the Y Combinator W26 batch, which will accelerate its mission to build new compute architectures for AI inference by leveraging proprietary photonic interconnects to address data movement bottlenecks.
Piris Labs offers a full-stack inference service that eliminates the AI data movement bottleneck by pairing proprietary photonic hardware with a vertically optimized software stack, aiming to deliver Cerebras-level performance at a lower cost.
Piris Labs is developing a full-stack AI inference service using photonic hardware to overcome the memory bandwidth bottleneck, aiming to provide Cerebras-level inference speeds at a lower cost and with greater scalability than traditional GPU clusters.
Tekedia Capital has invested in Piris Labs, a company building a full-stack AI inference platform that uses proprietary photonic hardware to address the data movement bottleneck, aiming for sustainable unit economics for large AI models.
Quick Info
- Batch: Winter 2026
- Team Size: 2
- Location: Unspecified
- Founders: 2
- Scraped: 4/10/2026