Daniel Kornev’s Post

Daniel Kornev

CEO at Sentius | Techstars'24 | Microsoft Alumni'10 | Xoogler'11 | 2nd-time Founder

Super proud of Mike and his team. Many people have asked me over the last year why long context and reasoning are so important. My answer is simple: long context is, in essence, much better memory. It lets an LLM fully apply its ability to comprehend incoming text, which, as we all know, works far better than RAG. A very clear use case is processing huge documentation such as a corporate wiki. The deal with reasoning over long context is also simple: if models gain the ability to process long context but do it poorly, what's the value? That's why a proper benchmark like BABILong is so desperately needed. It opens your eyes to how inefficient most modern methods for processing long context with LLMs really are, and it helps you understand what the real goal is. Let's make LLMs understand long context better together!

Akshay Jain

Founder and Managing Director at DNA Growth

2w

Great point on the importance of long context for LLMs, Mike - it really highlights the difference a comprehensive benchmark like BABILong can make in evaluating their efficacy.

Great! I hope all is going well, Daniel!
