Steve Rosenbush’s Post

Bureau Chief, WSJ Pro Enterprise Technology at The Wall Street Journal

My takeaways from Apple's WWDC24: The new emphasis on AI opens the door to changes in the way people design and use software, primarily by allowing applications to work together in greater concert. Today, most applications live on desktops and mobile devices as separate entities. On Monday, Apple laid out a sweeping set of AI features and initiatives that includes, among other things, a lessening of those barriers.

Kelsey Peterson, Apple's director of machine learning and AI, explained it this way: she wanted Siri to tell her when her mom's flight was going to land. "What is awesome is that Siri actually cross-references flight details that my mom shared with me by email with real-time flight tracking to give me her up-to-date arrival time," Peterson said. Next, she imagined asking Siri for the details of lunch plans with her mom. If the answer isn't in her calendar, the new version of Siri can check casual conversations mentioned in a text. And it can determine how long it will actually take to get from the airport to lunch. Users will be able to interact with Siri via text as well as voice. "I haven't had to jump from Mail to Messages to Maps to figure out this plan. And … tasks that would have taken minutes on my own … could be addressed in a matter of seconds," she said.

From the developers' point of view, Apple's approach to AI will unlock new possibilities, according to Figma CEO Dylan Field. "I think it's really neat when you can say, okay, there's different parts of functionality from different apps that can be woven together into one intent … I'm very curious where that goes over time," said Field, who was in the audience during Apple's presentations. Field said Apple talked deeply about integrating AI functionality into various app experiences and into the Apple platform, in contrast to the "sprinkle AI on top" approach taken by some companies. "I'm hopeful this … kicks off a lot more serious investment by developers into how do you truly improve your app experience and really design with AI in mind," he said.

"You'll pull up your phone, you'll ask it a question. It'll either tell you the answer or it'll complete the action for you," said Box CEO Aaron Levie. "What we saw today was the biggest stepping stone in that direction that Apple's shown, probably since the creation of Siri."

Craig Federighi, Apple's senior vice president of software engineering, said the "personal intelligence system" comprises large language and diffusion models that are specialized for everyday tasks and can adapt on the fly to the user's current activity. It also includes an on-device semantic index that can organize and surface information from across apps. When you make a request, Apple Intelligence uses its semantic index to identify the relevant personal data and feeds it into generative models, many of which run entirely on-device but can scale out to Apple's private cloud.
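What does Field's "woven together into one intent" look like for a developer? The mechanism Apple gives apps for exposing functionality to Siri is the App Intents framework. As a rough sketch only: AppIntent, @Parameter, and perform() are the framework's real building blocks, but FlightStatusIntent, flightNumber, and lookUpArrival are invented here for illustration, not anything Apple demoed.

```swift
import AppIntents

// Hypothetical intent exposing a flight-tracking feature to the system.
// Once an app declares something like this, Siri can invoke it as one
// step in a larger request ("when does Mom's flight land?").
struct FlightStatusIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Flight Status"

    // The system can fill this parameter from context (say, a flight
    // number found in Mail) rather than asking the user to retype it.
    @Parameter(title: "Flight Number")
    var flightNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Stand-in for the app's real flight-tracking logic.
        let arrival = try await lookUpArrival(flightNumber)
        return .result(dialog: "Flight \(flightNumber) lands at \(arrival).")
    }
}

// Placeholder for illustration; a real app would query its own data.
private func lookUpArrival(_ flight: String) async throws -> String {
    "2:37 PM"
}
```

The design point is that each app describes what it can do in small, typed units; the system, not the app, decides how to chain those units across Mail, Messages, and Maps to satisfy a single spoken request.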
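Federighi's description suggests a retrieve-then-generate pipeline: the semantic index supplies relevant personal context, and a generative model produces the answer, running on-device when possible and scaling out to the private cloud when not. Apple has not published this as an API, so the sketch below is a toy under stated assumptions; every type, function, and the routing threshold is invented.

```swift
import Foundation

// Toy illustration of the flow Federighi described: retrieve relevant
// personal data from a cross-app semantic index, then route the request
// to an on-device model or a private-cloud model. All names and the
// routing heuristic are invented for illustration.
struct IndexedItem {
    let sourceApp: String   // e.g. "Mail", "Messages"
    let text: String
    let embedding: [Double] // vector produced when the item was indexed
}

func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot  = zip(a, b).map(*).reduce(0, +)
    let magA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (magA * magB + 1e-9)
}

// Pull the k items most similar to the request, regardless of which
// app they came from -- the cross-app behavior in Peterson's demo.
func retrieve(_ query: [Double], from index: [IndexedItem], k: Int) -> [IndexedItem] {
    index
        .sorted { cosineSimilarity($0.embedding, query) > cosineSimilarity($1.embedding, query) }
        .prefix(k)
        .map { $0 }
}

// Invented routing rule: small prompts stay on-device, larger ones
// scale out. This stands in for whatever policy Apple actually uses.
func answer(request: String, context: [IndexedItem]) async -> String {
    let prompt = context.map(\.text).joined(separator: "\n") + "\n" + request
    return prompt.count < 2_000
        ? await onDeviceModel(prompt)
        : await privateCloudModel(prompt)
}

// Stubs standing in for the real generative models.
func onDeviceModel(_ prompt: String) async -> String { "on-device answer" }
func privateCloudModel(_ prompt: String) async -> String { "private-cloud answer" }
```

The interesting architectural choice, in this reading, is that retrieval over personal data stays on the device; only the assembled request would ever leave it, and then only to hardware Apple controls.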
