What a week! We're excited to see our Head of Data, Alexandra Manthey, speak at #AdoptAI today - don’t miss her insights at 'The New Frontiers of Generative AI' roundtable at 4:30PM #TechTalks #GenAI #RAG2.0
About us
Enterprise LLMs
- Website
- https://contextual.ai
- Industry
- Software Development
- Company size
- 2-10 employees
- Type
- Privately Held
Updates
-
Our CEO Douwe Kiela is speaking at the #Snowflake Data Cloud Summit next week! Join us on Jun 3rd at 1PM PDT to hear his discussion on the future of AI, alongside Bryan Catanzaro (Nvidia) and Adrien Treuille (Snowflake). Register here: https://lnkd.in/ecH6cJaE
-
RAG 2.0 is the new way to build #enterpriseai. Our CTO Amanpreet Singh will share how we accelerate #AI training workloads with @GoogleCloudNext tech. Join the discussion on April 10 → g.co/cloudnext
-
Join us on April 10 as we take part in #GoogleCloudNext. Our CEO Douwe Kiela will dive into retrieval-augmented generation (RAG), which he pioneered at Facebook, and share how Contextual AI's RAG 2.0 approach is key to #generativeAI deployment in the #enterprise. Register to join → g.co/cloudnext
-
Our CEO Douwe Kiela spoke at SaaStr's AI Day yesterday - want to know what it takes to build AI products for the enterprise? Watch the recording here! https://lnkd.in/eV7pCK2J
SaaStr AI Day: Building AI Products for the Enterprise with Contextual AI's CEO
-
With RAG 2.0, the generator and retriever are always working together. Whether you're building a house or enterprise-grade AI, teamwork makes the dream work. Find out how to make the dream work at http://rag2.ai
-
We’re excited to announce RAG 2.0, our end-to-end system for developing production-grade AI. Using RAG 2.0, we’ve created Contextual Language Models (CLMs), which achieve state-of-the-art performance on a variety of industry benchmarks. CLMs outperform strong RAG baselines built using GPT-4 and top open-source models like Mixtral, according to our research and customers. On axes critical for enterprise work, such as open-domain question answering, faithfulness, and freshness, CLMs significantly improve performance over current systems. CLMs achieve even bigger gains over current approaches when applied to real world data, as we have seen with our early customers in domains such as finance and engineering. We’re thrilled about the results we’re seeing with RAG 2.0 and can’t wait to bring it to more leading enterprises. Read more in our blog post here: http://rag2.ai
-
#GTC24 is here! Don't miss Amanpreet Singh on March 20th as he shares how we leverage Kahneman-Tversky Optimization (KTO) for production-grade #generativeai
-
We’re a week away from #GTC24! Join us on March 20th to hear Amanpreet Singh discuss how we developed Kahneman-Tversky Optimization (KTO) to speed up the #LLM and human feedback loop for #generativeai. Add to your schedule: https://bit.ly/3Iye9jD
-
We're excited to share our latest work on GRIT, which yields a single language model that achieves open state-of-the-art results on both embedding (MTEB) and generative tasks (BBH, etc.). With GritLM, we also achieve significant boosts in the RAG pipeline (> 60%) by using it jointly as an embedder and a language model. GritLM's 7B version outperforms Mistral 7B Instruct on most generative tasks such as MMLU, GSM8K, and TyDiQA, as well as on the MTEB embedding benchmark. Furthermore, you can use GritLM as a reranker, boosting performance on most retrieval datasets - giving you an embedder, a reranker, and an LM in a single model 🤯 Read our blog post for more details: https://lnkd.in/gN7EYb4X Work led by Niklas Muennighoff. At Contextual AI, we're building the next generation of foundation models for enterprises on top of our cutting-edge research like GRIT. If this excites you, join us.
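The pattern described above - one set of weights serving as both the retriever's embedder and the generator in a RAG loop - can be sketched in a few lines. This is an illustrative toy, not GritLM's actual API: the `ToyUnifiedModel` class, its `embed`/`generate` methods, and the bag-of-words embedding are all stand-in assumptions chosen so the example runs on its own.

```python
# Sketch of a GRIT-style unified model: the SAME object embeds documents
# for retrieval and generates the final answer. Toy stand-in, not GritLM.
import math
from collections import Counter

class ToyUnifiedModel:
    """Stand-in for a model exposing both embed() and generate()."""

    def embed(self, text):
        # Toy bag-of-words "embedding"; a real model returns a dense vector.
        return Counter(text.lower().split())

    def generate(self, prompt):
        # Toy generation: echo the prompt; a real model writes an answer.
        return f"[answer conditioned on]\n{prompt}"

def cosine(a, b):
    # Cosine similarity between two sparse bag-of-words vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rag_answer(model, query, docs, k=1):
    # One model, two roles: embed() ranks documents, generate() answers.
    q = model.embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, model.embed(d)), reverse=True)
    context = "\n".join(ranked[:k])
    return model.generate(f"Context: {context}\nQuestion: {query}")

model = ToyUnifiedModel()
docs = [
    "GRIT unifies embedding and generation.",
    "Unrelated note about cooking.",
]
print(rag_answer(model, "What does GRIT unify?", docs))
```

Because retrieval and generation share one model here, the pipeline loads a single set of weights instead of separate embedder and generator models - the efficiency the post's RAG numbers allude to.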