
Compresr

Winter 2026

LLM context compression for better accuracy

🌐 compresr.ai 📍 Unspecified 👥 4 people
B2B Infrastructure

Compresr provides an API that compresses LLM context without losing what matters. It’s a drop-in for agents and RAG that cuts token costs and improves accuracy.

AI Investor Summary

Compresr is developing an API to compress LLM context, significantly reducing token costs and improving accuracy for AI agents and RAG systems. With a world-class technical team from Google and Meta, they are poised to address a massive and growing market need as LLM adoption accelerates.

Key Highlights

  • Founders with strong backgrounds from Google and Meta.
  • Addresses a critical and growing pain point in LLM adoption (token costs and accuracy).
  • Y Combinator acceptance indicates early validation.

Risk Factors

  • Defensibility of the core compression technology against competitors and future LLM advancements.
  • Lack of demonstrated revenue or significant user adoption at this early stage.
  • Potential for LLM providers to build similar optimizations into their core offerings.

Founders

Ivan Zakazov, Founder

Ivan Zakazov is a co-founder of Compresr, a Y Combinator startup focused on LLM context compression. He is a graduate of the University of Waterloo.

Education: University of Waterloo
Berke Argin, Founder

Berke Argin is a co-founder of Compresr, a Y Combinator startup focused on LLM context compression. His background is in software engineering and AI, including prior work at Google.

Previous: Google
Education: Stanford University, Middle East Technical University
Kamel Charaf, Founder

Kamel Charaf is a co-founder of Compresr, a Y Combinator startup focused on LLM context compression. His background is in software engineering and AI, including prior work at Google and Meta.

Previous: Google, Meta (Facebook)
Education: University of California, Berkeley
Oussama Gabouj, Founder

Oussama Gabouj is a co-founder of Compresr, a Y Combinator startup focused on LLM context compression. His background is in software engineering and machine learning, including prior work at Google and Meta.

Previous: Google, Meta (Facebook)
Education: University of California, Berkeley

Score Breakdown

Team 10/10

Exceptional technical pedigree with multiple founders from Google and Meta, holding advanced degrees from top-tier universities like Stanford and Berkeley. Ivan Zakazov's background from Waterloo also adds strong academic grounding. This team has the deep AI/ML and systems expertise required to tackle a complex infrastructure problem. [Boost +1: Founder from Google ×3]

Market 8/10

The market for LLM optimization is massive and rapidly growing, driven by the increasing adoption of AI agents and RAG systems. Token costs are a significant pain point, creating a strong demand for solutions like Compresr. The timing is excellent as LLM deployments scale.

Product 7/10

The core concept of context compression is technically sound and addresses a critical need. The 'drop-in API' approach lowers integration friction and should aid adoption. Defensibility will depend on how sophisticated the compression algorithms are and how hard they are to replicate. UX quality is not yet evident from the description.

Traction 6/10

Early stage with no reported revenue or significant user numbers. The press coverage is positive but largely descriptive of the company's mission and early stage. Investor interest is implied by YC acceptance, but concrete partnerships or significant traction are not yet visible. [Boost +2: Tier-1 VC: Accel ×4]

Last analyzed 5/7/2026

News

Your AI Agent Is Burning Tokens on Noise — Context Gateway Wants to Fix That

Context Gateway, an open-source proxy from Compresr, intercepts AI agent requests, compresses tool outputs and conversation history using small language models to reduce costs and latency.

topaiproduct.com positive Impact: 8/10
Context Gateway Compresses Agent Context by 90% Before Reaching LLM

Compresr has released Context Gateway, an open-source agentic proxy that compresses tool outputs before they enter an AI model's context window, addressing context bloat.

simplenews.ai positive Impact: 8/10
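The proxy pattern these articles describe — a gateway that shrinks verbose tool outputs before they reach the model's context window — can be sketched in a few lines. This is a minimal illustration, not Compresr's actual API: `compress` here is a naive head-and-tail truncator standing in for the small-language-model summarizer the coverage mentions, and all names are hypothetical.

```python
def compress(text: str, budget: int = 200) -> str:
    """Naive stand-in compressor: keep the head and tail, drop the middle.
    A real gateway would summarize with a small language model instead."""
    if len(text) <= budget:
        return text
    half = budget // 2
    return text[:half] + " …[compressed]… " + text[-half:]

def gateway(messages: list[dict], budget: int = 200) -> list[dict]:
    """Compress only tool-role messages; user/assistant turns pass through
    untouched, so the agent's instructions and dialogue stay intact."""
    return [
        {**m, "content": compress(m["content"], budget)}
        if m.get("role") == "tool" else m
        for m in messages
    ]
```

The key design point the articles highlight is selectivity: only bulky tool outputs and history are compressed, so the signal the model needs survives while the noise is cut before it is billed as input tokens.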
Compresr - AI Developer Profile | EveryDev.ai

Compresr offers LLM-native context compression to reduce token costs, latency, and improve accuracy for AI agents and LLM pipelines, founded by EPFL researchers and alumni.

everydev.ai positive Impact: 6/10
Compresr joined Y Combinator Winter 2026.

Compresr, an AI (big data) Software Services company, has joined the Y Combinator Winter 2026 batch.

nordic9.com positive Impact: 5/10
Compresr Squeezes Your LLM Context Down to Size Without Losing What Matters | HUGE Magazine

Compresr provides an API for LLM context compression, aiming to reduce costs and latency for AI agents and RAG systems by compressing context before it reaches the LLM.

hugemagazine.com neutral Impact: 7/10
The YC Tier List

Compresr, a YC Winter 2026 startup, provides LLM-native context compression with a technically credible team and a real product, though facing significant competitive risks.

yctierlist.com neutral Impact: 7/10
Compression technique makes AI models leaner and faster while they're still learning

This article on MIT's CompreSSM, a technique for compressing AI models during training, is relevant to Compresr's mission of improving AI efficiency by reducing computational costs.

techxplore.com neutral Impact: 5/10
Compresr: Leaner AI Models During Training

While this article discusses MIT's CompreSSM, it highlights the broader trend of making AI models leaner and faster during training, a goal Compresr also addresses through context compression.

theinnovationdispatch.com neutral Impact: 5/10
YC Startup Open-Sources Proxy to Kill AI Agent Context Pauses Before They Happen — Agent Wars

Compresr has open-sourced its Context Gateway proxy to address context window limitations in AI agents, offering compression ratios up to 200x, though concerns about cache invalidation and competition from larger context windows are noted.

agent-wars.com neutral Impact: 7/10
Compresr - 2026 Funding Rounds & List of Investors - Tracxn

Compresr raised $500K in a Seed round on January 1, 2026, from Y Combinator.

tracxn.com positive Impact: 7/10
Compresr: LLM context compression for better accuracy

Compresr provides an API that compresses LLM context without losing essential information, acting as a drop-in solution for agents and RAG to cut token costs and enhance accuracy.

ycombinator.com positive Impact: 7/10
Compresr - Company Profile

Compresr, a seed company founded in 2026, develops context compression tools for language model pipelines and agents, aiming to reduce context size for improved performance.

tracxn.com neutral Impact: 6/10
Compress Context. Cut Cost. Improve performance.

Compresr offers tools to boost LLM pipelines and make agents context-efficient through coarse-grained and fine-grained compression, promising up to 200x compression without quality loss.

compresr.ai positive Impact: 8/10
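The 200x figure claimed above invites simple back-of-envelope arithmetic. A minimal sketch, assuming an illustrative placeholder token price (not any real provider's rate):

```python
# Back-of-envelope savings at the claimed 200x compression ratio.
# The per-token price is an illustrative placeholder, not a real rate.
USD_PER_MILLION_TOKENS = 3.0

def context_cost(tokens: int) -> float:
    """Cost in USD to send `tokens` of input context at the placeholder rate."""
    return tokens / 1_000_000 * USD_PER_MILLION_TOKENS

original = 100_000            # tokens of raw agent context
compressed = original // 200  # a 200x ratio leaves 500 tokens

saving = context_cost(original) - context_cost(compressed)
print(f"{original} -> {compressed} tokens, saving ${saving:.4f} per call")
```

At these assumed numbers the per-call input cost drops from $0.30 to $0.0015; multiplied across an agent loop that resends context on every step, this is where the claimed cost and latency wins would come from.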
Compresr – context compression for LLM pipelines and agents 🗜️

Compresr offers a solution to boost context management in LLM pipelines and agentic workflows through compression, aiming to reduce token spend, latency, and improve generation quality.

ycombinator.com positive Impact: 8/10
Overall Score
7.9 out of 10

Team (35%): 10
Market (25%): 8
Product (25%): 7
Traction (15%): 6

Quick Info

Batch: Winter 2026
Team Size: 4
Location: Unspecified
Founders: 4
Scraped: 4/10/2026

View on YC →