
Redis® Released 2025

Leela Ambience, New Delhi
January 2025
Attended

My Experience at Redis Released 2025
Architecture, Context, and New Perspective

I attended Redis Released 2025 last week, and the sessions and technical conversations shifted my understanding of how Redis is evolving - especially in the context of AI systems, vector search, and semantic caching.

Right from the start, discussions with developers and architects revolved around real production use cases: cache layers, vector sets, multi-region strategies, active-active clusters, and LLM pipelines.

I also spent time speaking with several industry leaders and Redis solution experts. Their insights added clarity to how Redis is expanding beyond traditional caching into context-aware AI infrastructure.

Key Technical Notes by Different Speakers


Omkar Konnur

  • Context Engineering is the real power in today's era.
  • Redis is evolving from "Redis for webapps" to "Redis for LLM systems."
  • How Redis vector sets, released earlier this year, handle redundancy and deduplication.

Kanika Gupta

How agentic workflows can speed up migration and real-time knowledge-base transformation, with Redis as the dynamic layer.

Suman Guha

Architecture is about tradeoffs. Always focus on what's fit for purpose.

Rhythm Goyal

Caching is the unsung hero layer. It silently does the heavy lifting that makes app performance skyrocket.

Murali Mohan Chakravarthy

  • Ensure the right data goes into Redis.
  • How to design your architecture to avoid cache stampedes and the thundering herd problem.
  • Five stages of latency grief.
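The stampede-avoidance idea is worth a sketch: on a cache miss, only one caller should recompute while concurrent callers wait and reuse the fresh value. Below is a minimal in-process Python sketch of that pattern - in production with Redis you would typically use a distributed lock (e.g. `SET key value NX PX ttl`) instead of a thread lock; the class and names here are illustrative.

```python
import threading
import time

class StampedeSafeCache:
    """Tiny in-process sketch of stampede protection: on a miss,
    only one caller recomputes; concurrent callers wait on a per-key
    lock and reuse the fresh value instead of hammering the backend."""

    def __init__(self):
        self._data = {}
        self._locks = {}
        self._meta_lock = threading.Lock()
        self.recomputes = 0  # instrumentation: how often the loader ran

    def _lock_for(self, key):
        with self._meta_lock:
            return self._locks.setdefault(key, threading.Lock())

    def get(self, key, loader):
        if key in self._data:
            return self._data[key]
        with self._lock_for(key):       # only one recompute per key
            if key in self._data:       # filled while we waited: reuse it
                return self._data[key]
            self.recomputes += 1
            value = loader()
            self._data[key] = value
            return value

def slow_loader():
    time.sleep(0.05)  # stand-in for an expensive DB query
    return "expensive-value"

cache = StampedeSafeCache()
threads = [threading.Thread(target=cache.get, args=("hot-key", slow_loader))
           for _ in range(10)]
for t in threads: t.start()
for t in threads: t.join()
print(cache.recomputes)  # 1: ten concurrent misses, one recompute
```

Without the per-key lock, all ten callers would run `slow_loader` at once - exactly the stampede the talk warned about.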

Ashwin Hariharan

  • How agentic AI works under the hood.
  • What goes into long-term and short-term memory and how Redis supports this design.
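The short-term / long-term split can be sketched simply: short-term memory is a capped buffer of recent conversation turns (with Redis, often a trimmed list or stream), while long-term memory is a durable keyed store (often backed by a vector index). This toy Python sketch uses illustrative names and sizes, not any particular framework's API.

```python
from collections import deque

class AgentMemory:
    """Sketch of the short-term / long-term memory split behind many
    agent designs. Short-term holds only the last few turns and evicts
    automatically; long-term holds durable facts keyed for retrieval."""

    def __init__(self, short_term_size=4):
        self.short_term = deque(maxlen=short_term_size)  # recent turns, auto-evicted
        self.long_term = {}                              # durable facts

    def observe(self, turn):
        self.short_term.append(turn)

    def remember(self, key, fact):
        self.long_term[key] = fact

    def context(self, key=None):
        """Assemble prompt context: durable fact (if any) + recent turns."""
        facts = [self.long_term[key]] if key in self.long_term else []
        return facts + list(self.short_term)

mem = AgentMemory(short_term_size=2)
mem.remember("user:name", "User's name is Priya")
for turn in ["hi", "what's redis?", "and langcache?"]:
    mem.observe(turn)

print(mem.context("user:name"))
# ["User's name is Priya", "what's redis?", "and langcache?"]
```

Note how "hi" has already been evicted from short-term memory, while the remembered fact survives - that eviction-versus-durability tension is what the talk's memory design addresses.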

Rishabh Gupta

A real product implementation using Bloom filters - great to see practical usage of the probabilistic data structure I've always enjoyed studying.
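For readers who haven't used one: a Bloom filter answers "have I seen this item?" with possible false positives but never false negatives, by setting k hash positions in an m-bit array. Redis offers this natively via the RedisBloom `BF.ADD` / `BF.EXISTS` commands; here is a minimal pure-Python sketch with illustrative sizes.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch: k hash functions over an m-bit array.
    Membership tests may return false positives, never false negatives."""

    def __init__(self, m_bits=1024, k_hashes=4):
        self.m = m_bits
        self.k = k_hashes
        self.bits = 0  # a Python int doubles as an arbitrary-size bit array

    def _positions(self, item):
        # Derive k independent positions by salting the hash with an index.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item):
        return all(self.bits >> pos & 1 for pos in self._positions(item))

bf = BloomFilter()
for user in ["alice", "bob", "carol"]:
    bf.add(user)

print(bf.might_contain("alice"))    # True (added items are never missed)
print(bf.might_contain("mallory"))  # almost certainly False (tiny false-positive chance)
```

The appeal in products is the space trade-off: a few kilobytes can screen millions of lookups before touching the real datastore.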

Prasan Kumar

A great walkthrough and demonstration of vector search, vector databases, semantic search, semantic caching, and Redis LangCache.
"Faster AI is not bigger models, it's better context engineering."

Tanuj Tyagi

  • How Redis's active-active feature deals with latency when data spans multiple regions and availability zones.
  • How powerful Redis is in game development.

My Perspective Shift

Earlier, for me, Redis mostly meant "Caching, Rate Limiting and Distributed Locks". After this event, Redis looks more like a context engine powering LLM and agentic workloads.

I personally liked how everyone focused on their problem statement first, and only then showed how Redis was a good fit for purpose.

Met several amazing people out there - Suryash Kumar Jha, Amandeep Prajapati, Bhavya Chopra, Sagar Dalal, Jatin Gupta, Kutub Jhabuawala - and had good conversations with great perspectives shared.

If you attended Redis Released and we didn't get a chance to connect, feel free to reach out here on LinkedIn. Always up for discussing backend architecture, caching strategies, or Redis in modern AI systems.

Conference Photos

Redis Released 2025 Conference

Key Takeaways

  • Context Engineering is the real power in today's era - Redis is evolving from "Redis for webapps" to "Redis for LLM systems"
  • Caching is the unsung hero layer that silently does the heavy lifting behind skyrocketing app performance
  • Faster AI is not bigger models, it's better context engineering
  • Architecture is about tradeoffs - always focus on what's fit for purpose
  • Redis active-active features help deal with latency across multiple regions and availability zones
  • Redis is transforming from traditional caching to a context engine powering LLM and agentic workloads
  • Design architecture to avoid cache stampede and thundering herd problems
#redisreleased #redisreleased2025 #techconference #modernAI #architecture #caching