Posts tagged with "WWDC 2024"

The Issues of iPadOS 18’s New Tab Bars

Earlier today on Mastodon, I shared some concerns regarding the Books app in iPadOS 18 and how Apple implemented the new tab bar design in the app. Effectively, by eschewing a sidebar, the app has returned to feeling like a blown-up iPhone version – something I hoped we had left behind when Apple announced they wanted to make iPad apps more desktop-class two years ago.

Unfortunately, it gets worse than Books. As documented by Nico Reese, the developer of Gamery, the new tab bars seem to fall short of the previous design’s visual affordances as well as its flexibility for developers. For starters, the new tabs are just text labels, which may work well in English, but not necessarily in other languages:

Since the inception of the iPhone, tabs in a tab bar have always included a glyph and a label. With the new tab style, the glyphs are gone. Glyphs play a crucial role in UX design, allowing users to quickly recognize parts of the app for fast interaction. Now, users need to read multiple text labels to find the content they want, which is slower to perceive and can cause issues in languages that generally use longer words, such as German. Additionally, because tab bars are now customizable, they can even scroll if too many tabs are added!

You’ll want to check out Nico’s examples here, but this point is spot-on: since tab bars now sit alongside toolbar items, the entire UI can get very condensed, with buttons often ending up hidden away in an overflow menu:

Although Apple’s goal was to save space on the iPad screen, in reality, it makes things even more condensed. Apps need to compress actions because they take up too much horizontal space in the navigation bar. This constant adjustment of button placement in the navigation bar as windows are resized prevents users from building muscle memory. The smaller the window gets, the more items collapse.

If the goal was to simplify the iPad’s UI, well, now iPad users will end up with three ways to navigate apps instead of two, with the default method (the top bar) now generally displaying fewer items than before, without glyphs to make them stand out:

For users, it can be confusing why the entire navigation scheme changes with window resizing, and now they must adjust to three different variations. Navigation controls can be located at the top, the bottom, or the left side (with the option to hide the sidebar!), which may not be very intuitive for users accustomed to consistent navigation patterns.

The best way I can describe this UI change is that it feels like something conceived by the same people who thought the compact tab bar in Safari for iPad was a good idea, down to how tabs hide other UI elements and make them less discoverable.
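To put the developer side of this in context, here’s a minimal SwiftUI sketch of how an app might adopt the new design, assuming iPadOS 18’s sidebar-adaptable tab view style; the tab names and placeholder views are hypothetical:

    import SwiftUI

    // A minimal sketch of iPadOS 18's new tab bar API.
    // Tab names and placeholder views are hypothetical.
    struct LibraryRootView: View {
        var body: some View {
            TabView {
                // Each tab still declares a symbol, but the new top tab bar
                // on iPad shows only the text labels; the glyphs appear on
                // iPhone and in the sidebar.
                Tab("Home", systemImage: "house") {
                    Text("Home")
                }
                Tab("Library", systemImage: "books.vertical") {
                    Text("Library")
                }
                Tab("Search", systemImage: "magnifyingglass") {
                    Text("Search")
                }
            }
            // The same declaration renders as the top tab bar on iPad
            // (expandable into a sidebar) and a regular tab bar on iPhone.
            .tabViewStyle(.sidebarAdaptable)
        }
    }

That single declaration is what becomes the top tab bar, the optional sidebar, or the familiar bottom tab bar on iPhone depending on context, which is where the three navigation variations described above come from.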

Nico’s post has more examples you should check out. I think Marcos Tanaka (who knows a thing or two about iPad apps) put it well:

It makes me quite sad that one of the three iPad-specific features we got this year seems to be missing the mark so far. I hope we’ll see some improvements and updates on this front over the next three months before this feature ships to iPad users.

Permalink

WWDC 2024: The AppStories Interviews with ADA and Swift Student Challenge Distinguished Winners

Devin Davies, the developer of Crouton.

To wrap up our week of WWDC coverage, we just published a special episode of AppStories that was recorded in the Apple Podcasts Studio at Apple Park. Federico and I interviewed three of this year’s Apple Design Award winners:

Devin Davies.

  • Devin Davies, the creator of Crouton, which won an ADA in the Interaction category
Katarina Lotrič and Jasna Krmelj of Gentler Streak.

  • Katarina Lotrič, CEO and co-founder, and Jasna Krmelj, CTO and co-founder, of Gentler Streak, which won an ADA in the Social Impact category

James Cuda, CEO, and Michael Shaw, CTO, of Procreate.

  • James Cuda, CEO, and Michael Shaw, CTO, of Procreate, which won an ADA for Procreate Dreams in the Innovation category

We also interviewed two of the Swift Student Challenge Distinguished Winners:

  • Dezmond Blair, a student at the Apple Developer Academy in Detroit. His app marries his passion for biking and the outdoors with technology to create an immersive experience.
  • Adelaide Humez, a high school student from Lille, France. Her winning app, Egretta, allows users to create a journal of their dreams based on emotions.

In addition to being available as an audio-only podcast in your favorite podcast app, this special episode of AppStories is on our new MacStories YouTube channel, which is also the home of Comfort Zone, one of the two podcasts we launched last week, and our other video projects.


We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page or read the AppStories+ FAQ.

Permalink

Opting Out of AI Model Training

Dan Moren has an excellent guide on Six Colors that explains how to exclude your website from the web crawlers used by Apple, OpenAI, and others to train large language models for their AI products. For many sites, the process simply requires a few edits to the robots.txt file on your server:

If you’re not familiar with robots.txt, it’s a text file placed at the root of a web server that can give instructions about how automated web crawlers are allowed to interact with your site. This system enables publishers to not only entirely block their sites from crawlers, but also specify just parts of the sites to allow or disallow.

The process is a little more complicated with a WordPress site, which is what MacStories uses, and Dan covers that too.

Unfortunately, as Dan explains, editing robots.txt isn’t a solution for companies that ignore the file. It’s simply a convention that doesn’t carry any legal or regulatory weight. Nor does it help with Google or Microsoft’s use of your website’s copyrighted content unless you’re also willing to remove your site from the biggest search engines.

Although I’m glad there is a way to block at least some AI web crawlers prospectively, it’s cold comfort. We, like many other sites, have years of articles that have already been crawled to train these models, and you can’t unring that bell. That said, MacStories’ robots.txt file has been updated to ban Apple and OpenAI’s crawlers, and we’re investigating additional server-level protections.
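For reference, the change amounts to a few lines in robots.txt. A minimal sketch, assuming Applebot-Extended (Apple’s opt-out token for Apple Intelligence training) and GPTBot (OpenAI’s crawler) are the agents you want to exclude; check each company’s documentation for the current names:

    # Opt out of Apple Intelligence training (assumed token: Applebot-Extended)
    User-agent: Applebot-Extended
    Disallow: /

    # Opt out of OpenAI's crawler (assumed token: GPTBot)
    User-agent: GPTBot
    Disallow: /

As Apple documents it, Applebot-Extended doesn’t fetch pages itself; it only governs whether content already crawled by the regular Applebot can be used for training, so blocking it shouldn’t affect how your site appears in Siri or Spotlight search.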

If you listen to Ruminate or follow my writing on MacStories, you know that I think what these companies are doing is wrong both in the moral and legal sense of the word. However, nothing captures it quite as well as this Mastodon post by Federico today:

If you’ve ever read the principles that guide us at MacStories, I’m sure Federico’s post came as no surprise. We care deeply about the Open Web, but ‘open’ doesn’t give tech companies free rein to appropriate our work to build their products.

Yesterday, Federico linked to Apple’s Machine Learning Research website where it was disclosed that the company has indexed the web to train its model without the consent of publishers. I was as disappointed in Apple as Federico. I also immediately thought of this 2010 clip of Steve Jobs near the end of his life, reflecting on what ‘the intersection of Technology and the Liberal Arts’ meant to Apple:

I’ve always loved that clip. It speaks to me as someone who loves technology and creates things for the web. In hindsight, I also think that Jobs was explaining what he hoped his legacy would be. It’s ironic that he spoke about ‘technology married with Liberal Arts,’ which superficially sounds like what Apple and others have done to create their AI models but couldn’t be further from what he meant. It’s hard to watch that clip now and not wonder if Apple has lost sight of what guided it in 2010.


You can follow all of our WWDC coverage through our WWDC 2024 hub or subscribe to the dedicated WWDC 2024 RSS feed.

Permalink

Designing Dark Mode App Icons

Apple’s announcement of “dark mode” icons has me thinking about how I would approach adapting “light mode” icons for dark mode. I grabbed 12 icons we made at Parakeet for our clients to illustrate some ways of going about it.

Before that though, let’s take some inventory. Of the 28 icons in Apple’s preview image of this feature, only nine have white backgrounds in light mode. However, all icons in dark mode have black backgrounds.

Actually, it’s worth noting that five “light mode” icons have black backgrounds, which Apple slightly adjusted to have a consistent subtle black gradient found on all of their new dark mode icons. Four of these—Stocks, Wallet, TV, and Watch—all seem to be the same in both modes. However, no other (visible) icons are.

Fantastic showcase by Louie Mantia of how designers should approach the creation of dark mode Home Screen icons in iOS 18. In all the examples, I prefer Mantia’s take to the standard black background version.

See also: Gavin Nelson’s suggestion, Apple’s Human Interface Guidelines on dark mode icons, and the updated Apple Design Resources for iOS 18.

Permalink

Apple Details Its AI Foundation Models and Applebot Web Scraping

From Apple’s Machine Learning Research¹ blog:

Our foundation models are trained on Apple’s AXLearn framework, an open-source project we released in 2023. It builds on top of JAX and XLA, and allows us to train the models with high efficiency and scalability on various training hardware and cloud platforms, including TPUs and both cloud and on-premise GPUs. We used a combination of data parallelism, tensor parallelism, sequence parallelism, and Fully Sharded Data Parallel (FSDP) to scale training along multiple dimensions such as data, model, and sequence length.

We train our foundation models on licensed data, including data selected to enhance specific features, as well as publicly available data collected by our web-crawler, AppleBot. Web publishers have the option to opt out of the use of their web content for Apple Intelligence training with a data usage control.

We never use our users’ private personal data or user interactions when training our foundation models, and we apply filters to remove personally identifiable information like social security and credit card numbers that are publicly available on the Internet. We also filter profanity and other low-quality content to prevent its inclusion in the training corpus. In addition to filtering, we perform data extraction, deduplication, and the application of a model-based classifier to identify high quality documents.

It’s a very technical read, but it shows how Apple approached building AI features in their products and how their on-device and server models compare to others in the industry (on servers, Apple claims their model is essentially neck and neck with GPT-4-Turbo, OpenAI’s older model).

This blog post, however, pretty much parallels my reaction to the WWDC keynote. Everything was fun and cool until they showed generative image creation that spits out slop “resembling” (strong word) other people; and in this post, everything was cool until they mentioned how – surprise! – Applebot had already indexed web content to train their model without the consent of publishers, who can only opt out now. (This was also confirmed by Apple executives elsewhere.)

As a creator and website owner, I guess that these things will never sit right with me. Why should we accept that certain data sets require a licensing fee but anything that is found “on the open web” can be mindlessly scraped, parsed, and regurgitated by an AI? Web publishers (and especially indie web publishers these days, who cannot afford lawsuits or hiring law firms to strike expensive deals) deserve better.

It’s disappointing to see Apple muddy an otherwise compelling set of features (some of which I really want to try) with practices that are no better than the rest of the industry.


  1. How long until this becomes the ‘Apple Intelligence Research’ website? ↩︎
Permalink

Interview Roundup: Apple’s Executives Talk Up Apple Intelligence and WWDC

In what has become a yearly WWDC tradition, Apple executives have been out talking about the big announcements from this year’s conference. Craig Federighi, Greg Joswiak, John Giannandrea, and Tim Cook have given interviews to YouTubers, news sites, and John Gruber on a special edition of The Talk Show streamed live in spatial video.

They gave fascinating answers to some questions, particularly about Apple Intelligence, so without further ado, here’s a roundup of interesting Apple executive interviews over the past few days.

Read more


tvOS 18: The MacStories Overview

Yesterday, during its WWDC 2024 opening keynote, Apple officially revealed its latest software story for Apple TV. Coming this fall, tvOS 18 introduces new intelligence-based features such as InSight and on-device Siri, native 21:9 aspect ratio support, new screen savers, and a host of noteworthy additions to enhance the at-home TV viewing experience. Let’s jump into everything new coming to Apple TV.

InSight

Apple’s video player is somewhat of a hidden gem when it comes to playback and controls for audio and captions. A few years ago, the company expanded its functionality with a quick swipe down gesture revealing an Info panel with details of the currently-playing content and quick access to the user’s Up Next queue. Premiering this fall is a new feature nestled between those two elements called InSight.

A new addition to Apple TV+, InSight gives users real-time access to information about the actors and their characters onscreen, as well as the soundtrack in a given scene, allowing viewers to quickly add that song or musical performance to an Apple Music playlist to enjoy later. InSight is much like Amazon Prime Video’s X-Ray feature that came before it; there’s plenty of granular detail that could still be added before its fall launch, but this is a great start.

In addition to accessing InSight on the big screen, users will be able to view the same real-time actor, character, and music information through the Remote app found in Control Center on iOS and iPadOS, offering a distraction-free experience when watching with friends and family.

Read more


Apple Announces New Features Coming to Its Services This Fall

Alongside updates to Apple’s platforms and Apple Intelligence, the company announced an assortment of new features coming to its line of services this fall. From the press release in Apple Newsroom:

“So many of our users rely on Apple services throughout their day, from navigating their commute with Apple Maps, to making easy and secure payments with Apple Pay, to curating playlists with Apple Music,” said Eddy Cue, Apple’s senior vice president of Services. “We’re excited to give them even more to love about our services, like the ability to explore national parks with hikes in Apple Maps, redeem rewards or access installments with Apple Pay, and enjoy music with loved ones through SharePlay in Apple Music.”

I like that this services roundup is becoming an annual WWDC tradition. Some of these features were mentioned or shown on-screen during the keynote, but it’s easy for them to get overlooked in light of major operating system changes. While they might seem small in comparison, improvements to Apple’s services can have lasting day-to-day impacts on those who use them, myself included.

A few of my favorite services updates this year:

  • A new Places Library in Maps that allows you to save locations and write notes about them.
  • Tap to Provision, an easier way to add credit and debit cards to Wallet by tapping them instead of entering card numbers.
  • Redesigned event tickets in Wallet that can feature new types of data, including parking and weather information.
  • The Library tab in Apple Fitness+ for quicker access to saved workouts, Custom Plans, and Stacks.
  • Redesigned iCloud settings to better surface recommendations and features you’re using.

Check out the press release for all the updates coming to Apple’s services this fall. There’s a lot to look forward to there, and I’m happy to see the company continuing to push its services forward.


You can follow all of our WWDC coverage through our WWDC 2024 hub or subscribe to the dedicated WWDC 2024 RSS feed.

Permalink

Apple Intelligence: The MacStories Overview

After months of anticipation and speculation about what Apple could be doing in the world of artificial intelligence, we now have our first glimpse at the company’s approach: Apple Intelligence. Based on generative models, Apple Intelligence uses a combination of on-device and cloud processing to offer intelligence features that are personalized, useful, and secure. In today’s WWDC keynote, Tim Cook went so far as to call it “the next big step for Apple.”

From the company’s press release on Apple Intelligence:

“We’re thrilled to introduce a new chapter in Apple innovation. Apple Intelligence will transform what users can do with our products — and what our products can do for our users,” said Tim Cook, Apple’s CEO. “Our unique approach combines generative AI with a user’s personal context to deliver truly helpful intelligence. And it can access that information in a completely private and secure way to help users do the things that matter most to them. This is AI as only Apple can deliver it, and we can’t wait for users to experience what it can do.”

It’s clear from today’s presentation that Apple is positioning itself as taking a different approach to AI than the rest of the industry. The company is putting generative models at the core of its devices while seeking to stay true to its principles. And that starts with privacy.

Read more