Omni Roadmap 2024 — Post-WWDC Update

(Image: A road through the trees, alluding to Omni’s roadmap and macOS 15 Sequoia.)

It seems like only yesterday that we shared our Omni Roadmap 2024, featuring a Brady Bunch-inspired image of team members wearing our brand new Apple Vision Pro devices. It was an exciting time as we entered the first week of the new visionOS platform and released OmniPlan 4 for visionOS, making our powerful universal project management app available on day one for Apple’s latest offering.

As our next step, we focused on bringing our second universal app to visionOS. We had just shipped OmniFocus 4.0 weeks earlier and subsequently provided several updates. We were eager to leverage our investment in making OmniFocus 4 universal by bringing it to Apple’s latest platform. Now that we had hardware to test the app on, one thing we quickly noticed was that while we had over one hundred Omni Automation plug-ins available, their installation process on visionOS was no longer straightforward. So as we put the final touches on OmniFocus for visionOS, we also invented and implemented new “Look, Tap, and Approve!” install links for our plug-ins—ensuring a seamless experience for installing plug-ins when we shipped OmniFocus for visionOS. All within six weeks of the platform’s launch!

Our momentum continued through the spring, as we settled into a routine of shipping new features and making performance improvements:

  • With OmniFocus 4.2, we made perspectives much more powerful by adding new custom perspective rules. For example, it’s now very easy to find all items whose defer dates have already passed by using the new “Has date in range” rule.

  • After seeing how those “Look, Tap, and Approve!” install links reduced the friction of installing plug-ins on visionOS, we brought install links to all four platforms—first with OmniFocus 4.2, then with OmniPlan 4.8, OmniGraffle 7.23 for Mac, and OmniGraffle 3.20 for iOS. (And we will soon be bringing install links to OmniOutliner as well, with OmniOutliner 5.13 for Mac and OmniOutliner 3.11 for iOS.)

  • At the start of this month, OmniFocus 4.3 added support for device Focus Filters—again, across all platforms—and further improved performance while adding new rich text formatting options.

Having OmniFocus and OmniPlan share code across all platforms has brought us significant benefits, making it easier to update them simultaneously. This achievement results from our team’s dedicated efforts to rebuild our apps using SwiftUI for shared code—and from Apple’s ongoing enhancements to support larger applications like ours. We anticipate similar advantages in OmniGraffle and OmniOutliner, with recent TestFlight builds of OmniGraffle 8 already heavily leveraging SwiftUI and sharing more code across all platforms.

This brings us to June, which is always one of the most important milestones in our annual roadmap. Apple improves all of their platforms every year, and their annual developer conference (WWDC) is where we learn what’s coming next. Last year, this was where we first learned about their plans for spatial computing and visionOS. This year, we learned about Apple Intelligence.

It’s clear that Apple has thought deeply about artificial intelligence and machine learning for a very long time. When we first brought our apps from the NeXT platform to Mac OS X (back in the ’90s!), we discovered that Apple had a number of great early AI features that we immediately wanted to support, from the ability to summarize text to voice control and speech synthesis. Over the years, we saw Apple start to bring more of that power directly to consumers, with features like identifying people and objects in photos, and general voice interactions using Siri. As Apple’s machine learning got more advanced, we saw Apple dedicate more and more of their hardware to implement machine learning more efficiently, with their introduction of the Neural Engine on the iPhone in 2017 (with the Apple A11 Bionic) and on the Mac in 2020 (with the M1).

Throughout that time, we saw Apple consistently treat privacy as a fundamental human right. Here’s John Giannandrea, three years ago:

“From the beginning Apple has chosen a different path from others when it comes to our approach to AI, and how we use and handle data to protect our customers’ privacy. Our engineers put an enormous amount of thought into how we innovate while keeping privacy at the center of everything we create.”

With this month’s debut of Apple Intelligence, we see Apple taking the next step to make their AI even more powerful and capable—describing it as “Powerful, Intuitive, Integrated, Personal, and Private.” That last descriptor—Private—is a key differentiator. It makes it possible for us to safely leverage their AI in our apps.

One of the best things about Apple Intelligence is that it does most processing on device, rather than on someone else’s computer. We’ve been watching what other companies have done with AI for years, but the common approach they’ve taken is a non-starter for our apps, since we won’t send any of our customers’ data to third parties. In contrast, Apple Intelligence enhances the local device—making it smarter, so it can work in new ways with data and capabilities it already has. The device already has all of the data it needs! What’s new is that it’s learning how to use that data even more effectively to serve the user’s personal requests. Apple never has access to any of that data; you can decide exactly how and when your data is used to serve your own needs. This aligns perfectly with our privacy policy: At Omni, Your Data Is Yours!

So what sorts of things will Apple Intelligence be able to help us with? Well, that’s a great question! Apple has said Apple Intelligence will be available in beta this fall, so it’s not yet available for me to test, and at this point I’m just speculating. But several ideas came to mind as I watched Apple’s sessions:

  • Could we ask Siri to make a list of the actionable items in today’s email and send that list over to OmniFocus?
  • Could we ask Siri to make an org chart in OmniGraffle based on relationships between the senders of those email messages?
  • Could we ask Siri to make a packing list and add that to OmniFocus?

Based on my explorations with locally hosted large language models, these sorts of tasks seem entirely plausible. I look forward to exploring these and other questions over the coming months as Apple makes the technology available for us to play with!

So, in case anyone was wondering: yes, of course we plan to have our apps support Apple Intelligence. (In fact, we think they already do to some extent.) And we’re thrilled by Apple’s direction of making our devices even more capable, in ways that ensure that none of the personal data on our devices is being exploited—even when bringing extra power to bear on processing that data using Apple’s Private Cloud Compute (which can run much larger AI models than portable devices can).

What exactly does supporting Apple Intelligence look like? Well, Apple Intelligence is designed to interface with our apps using App Intents. These were actually introduced in earlier operating system updates: in fact, we just shipped some App Intents when we implemented device Focus Filters in OmniFocus 4.3. App Intents can also be used to implement Shortcuts actions, and over the coming months we will be exploring how we might migrate our existing Shortcuts actions over to the newer App Intents system. We have lots of other ideas we’ll be exploring for App Intents throughout the summer, as well as ideas for other new features introduced at WWDC such as Control Center widgets for third-party apps. If you’re eager to explore this with us, we encourage you to keep an eye out for our Summer Exploration TestFlights.
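For readers curious what an App Intent looks like in code, here is a minimal sketch using Apple’s AppIntents framework. The `AddTaskIntent` name and its parameter are hypothetical illustrations of my own, not an actual Omni intent:

```swift
import AppIntents

// A hypothetical intent that Siri or Shortcuts could invoke to add a
// task by name. The type name and parameter are illustrative only.
struct AddTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Task"
    static var description = IntentDescription("Adds a new task to the inbox.")

    // The system prompts for (or infers) this value when the intent runs.
    @Parameter(title: "Task Name")
    var taskName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would create the task in its data store here.
        return .result(dialog: "Added “\(taskName)” to your inbox.")
    }
}
```

Because the system discovers these types automatically, a single intent definition like this can surface in Siri, Shortcuts, and Spotlight without extra registration code.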

We’re very excited about the work we’ve been doing this year, and with WWDC’s news we now know about the other technologies we need to explore this summer. Apple Intelligence is an exciting direction for the platforms and our apps!

But on a more mundane note, we will also be working (as always) to make sure all our apps will be compatible with all of the new operating systems that are coming—macOS 15 (Sequoia), iOS 18, iPadOS 18, watchOS 11, and visionOS 2—on the day they ship. And we continue to work towards the next major versions of OmniGraffle and OmniOutliner.

Oh, and I see we have new betas to install already. We have a fun summer ahead!


(At the Omni Group, we make powerful productivity apps which help you accomplish more every day. Feedback? I’d love to hear from you! You can find me on Mastodon at @kcase@mastodon.social, or send me email at kc@omnigroup.com.)