Lorenzo Thione’s Post

Public Speaker & Investor in Artificial Intelligence / Broadway Producer / 🏳️‍🌈 Advocate

Lots of AI news, and it's only the beginning of January! 2024 is going to be the year of AI at the edge. We're not going to spend the year just chasing the next bigger, better model. Next-gen model training will continue, but optimization is going to be just as significant as we look toward a future in which LLMs are deployed across a wide range of use cases.

Apple's December surprise is a case in point: a method that uses windowing and row-column bundling to keep model weights in flash storage and stream them into RAM on demand, significantly increasing the size of models an iPhone can run (expect Siri to get smarter soon). Microsoft's Phi-2 showed how information-dense, consistent, carefully curated training data can produce a small model with remarkable performance for its size. And the innovation in AI-specific chips from companies like Rain.ai, Femtosense, and Applied Brain Research keeps pushing the boundaries of AI computing at the edge.

Expect AI to come to a device (every device) near you this year. Optimization, alongside major breakthroughs in LLM functionality, will bring us closer to a reality in which AI lives alongside us: on our phones, on our watches, and in devices perfect for pairing with AI that have yet to be dreamed up or devised. The possible applications will be endless and revolutionary.

#Gaingels #VC #VentureCapital #Founders #Entrepreneurship #AI #OpenAI
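
For anyone curious about the flash-storage idea, here is a minimal, hypothetical sketch of loading weights from disk on demand instead of holding them all in RAM. This is not Apple's actual implementation (which pairs a sparsity predictor with windowing and row-column bundling); the file name, matrix sizes, and cache policy below are invented purely for illustration.

```python
import numpy as np

# Toy stand-in for a weight matrix that lives in "flash" (a file on disk)
# rather than RAM. File name and sizes are made up for this sketch.
ROWS, COLS = 4096, 512
PATH = "toy_up_proj.f16.bin"

w = np.memmap(PATH, dtype=np.float16, mode="w+", shape=(ROWS, COLS))
w[:] = 0.01
w.flush()

# Reopen read-only: the full matrix is never resident in RAM.
weights = np.memmap(PATH, dtype=np.float16, mode="r", shape=(ROWS, COLS))

# "Windowing" idea: keep rows touched by recent tokens in a small RAM cache
# so repeated activations do not trigger another read from flash.
cache = {}

def load_rows(active_rows):
    """Pull only the rows a (hypothetical) sparsity predictor marked active."""
    for r in active_rows:
        if r not in cache:
            cache[r] = np.array(weights[r])  # copies one contiguous row into RAM
    return {r: cache[r] for r in active_rows}

rows = load_rows([3, 17, 256])
print(sum(v.nbytes for v in rows.values()), "bytes of weights pulled into RAM")
```

As I understand it, the real method goes further: row-column bundling stores the matching rows and columns of the FFN up- and down-projections together, so each active neuron costs one larger contiguous flash read, which is what makes reading from flash fast enough to be practical.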

Exciting to see such rapid advancements in AI, especially with innovative approaches to model efficiency and deployment that promise to revolutionize our interactions with technology.

The flash memory optimizations look like they will be huge! I need to read more about that.

Carlo Cisco
Founder & CEO of Select

💯