SwiftyClip v1.0.3 — agentic, Apple-native, and finished
We shipped SwiftyClip v1.0 exactly one week ago. The feedback has been relentless. While the initial release established our core thesis—that professional video clipping should happen entirely on-device—it was only the beginning. Today, with v1.0.3, we are closing the feature gap with the world's most sophisticated Apple-native creative tools. In a single twenty-four-hour sprint, we have delivered nine major features that transform SwiftyClip into an agentic powerhouse.
The philosophy behind this update is simple: SwiftyClip should feel like it was built by Apple for power users. We are not interested in building cross-platform abstractions that dilute the user experience. We are building for the macOS user who values speed, privacy, and deep system integration. This release is about maturing the product and proving that a focused team can outpace cloud competitors by leveraging the platform primitives that others ignore. This is SwiftyClip v1.0.3, the most significant leap in our short history.
Shortcuts and Siri via AppIntents
Automation is the heartbeat of a modern creative workflow. With v1.0.3, we are introducing deep integration with the Shortcuts app via AppIntents. We have exposed the core mechanics of our video engine to the system. You can now use actions like IngestVideo, ScoreLatestClip, and RenderLatestClip directly within your custom Shortcuts workflows.
This means you can build a workflow that automatically triggers when a new recording is added to a folder, sends it to SwiftyClip for analysis, and returns the highest-scoring segments autonomously. Because these are native AppIntents, they are fully voice-activated via Siri. You can say, "Siri, score my latest recording," and watch the Neural Engine go to work. This level of system-wide agency separates a professional tool from a toy. For those building complex, multi-tool automations, we've updated our guide to agentic workflows to include these new native capabilities.
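As a rough illustration of how an action like ScoreLatestClip could be exposed, here is a minimal AppIntents sketch. The type names `ScoreLatestClipIntent` and the placeholder scoring logic are hypothetical, not the shipped implementation; the AppIntents surface (`AppIntent`, `@Parameter`, `perform()`) is the real system API.

```swift
import AppIntents

// Hypothetical sketch: exposing clip scoring to Shortcuts and Siri.
// The real engine call is elided; a constant stands in for the result.
struct ScoreLatestClipIntent: AppIntent {
    static var title: LocalizedStringResource = "Score Latest Clip"
    static var description = IntentDescription(
        "Scores the most recent recording for highlight-worthy segments."
    )

    @Parameter(title: "Minimum Score", default: 0.5)
    var minimumScore: Double

    func perform() async throws -> some IntentResult & ReturnsValue<Double> {
        // In the real app this would invoke the on-device analysis engine.
        let score = 0.87 // placeholder result
        return .result(value: score)
    }
}
```

Because the intent declares a return value, Shortcuts can chain its output directly into the next action in a workflow.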
Writing Tools on Captions
With macOS 26, Apple Intelligence is now foundational to the desktop. In v1.0.3, we have integrated Writing Tools directly into our caption editor. You can now highlight any text and use Apple's native LLM to rewrite it inline. We've implemented five specific styles optimized for social media: punchier, quieter, question-first, hook, and concise.
This matters because context-switching kills productivity. Copying transcripts into external chatbots is now obsolete. By using on-device Writing Tools, you get world-class language modeling with zero latency and absolute privacy. Your content never leaves your Mac. Whether turning a rambling sentence into a sharp hook or shortening a caption for better on-screen readability, the tools are now right where you need them. It's how we leverage Apple's investment in AI to provide a superior experience.
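Wiring a text view into Writing Tools takes very little code. The sketch below assumes a SwiftUI caption editor (`CaptionEditor` is an illustrative name); `writingToolsBehavior(_:)` is the system modifier, available on macOS 15.1 and later, that opts a text view into the full inline Writing Tools experience.

```swift
import SwiftUI

// Sketch: opting a caption editor into full Writing Tools support.
// `CaptionEditor` is illustrative; `.writingToolsBehavior(.complete)`
// enables the complete inline rewrite UI rather than the limited panel.
struct CaptionEditor: View {
    @Binding var caption: String

    var body: some View {
        TextEditor(text: $caption)
            .writingToolsBehavior(.complete)
            .font(.body)
    }
}
```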
ScreenCaptureKit Live Capture
The most requested feature since launch was recording video directly into SwiftyClip. We've delivered a full implementation of ScreenCaptureKit. You can now record your entire display, a specific window, or a selected application directly into the SwiftyClip pipeline. This is a high-performance capture engine that feeds directly into our analysis system.
For streamers and solo creators, this eliminates the need for a separate recording step. Capture your gameplay, coding session, or presentation, and as soon as you hit stop, the video is already being transcribed and scored for viral moments. By using ScreenCaptureKit, we ensure the recording process is extremely efficient, utilizing hardware-accelerated encoding to minimize CPU overhead. This leaves more power for your work while SwiftyClip handles the heavy lifting in the background. It collapses the distance between creation and distribution.
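For the curious, setting up a display capture with ScreenCaptureKit looks roughly like this. The function name and the idea of passing in a combined output/delegate object are illustrative; the `SCShareableContent`, `SCContentFilter`, `SCStreamConfiguration`, and `SCStream` calls are the real framework API.

```swift
import ScreenCaptureKit
import CoreMedia

// Sketch: starting a full-display capture that feeds sample buffers into
// an analysis pipeline. The caller supplies an object implementing both
// SCStreamOutput (receives frames) and SCStreamDelegate (receives errors).
func startDisplayCapture(
    into output: any SCStreamOutput & SCStreamDelegate
) async throws -> SCStream {
    let content = try await SCShareableContent.excludingDesktopWindows(
        false, onScreenWindowsOnly: true
    )
    guard let display = content.displays.first else {
        throw CocoaError(.fileNoSuchFile)
    }

    let filter = SCContentFilter(display: display, excludingWindows: [])
    let config = SCStreamConfiguration()
    config.width = display.width * 2                               // Retina scale
    config.height = display.height * 2
    config.minimumFrameInterval = CMTime(value: 1, timescale: 60)  // cap at 60 fps

    let stream = SCStream(filter: filter, configuration: config, delegate: output)
    try stream.addStreamOutput(output, type: .screen,
                               sampleHandlerQueue: .global(qos: .userInitiated))
    try await stream.startCapture()
    return stream
}
```

The hardware-accelerated encoding mentioned above happens downstream of this, when the delivered sample buffers are handed to the encoder.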
Raycast-Style ⌘K Command Palette
Speed is a feature. In v1.0.3, we have introduced a global command palette, accessible via ⌘K. This allows you to fuzzy-search every action within the application. Whether switching aspect ratios, triggering a filler-word sweep, or exporting a specific clip, you can now do it entirely from the keyboard.
This pattern is the fastest way to navigate a complex application. We have indexed every menu item and common workflow. For developers, this command palette is a GUI wrapper around our internal toolset, which we've also made available programmatically via our Model Context Protocol (MCP) server. Whether you drive it from the keyboard or through an AI agent, the underlying capability remains consistent and accessible.
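The core of a palette like this is the matcher. A common approach, and a reasonable sketch of what ⌘K-style fuzzy search typically does, is subsequence matching: a query matches a command if its characters appear in order, case-insensitively. This is an illustrative implementation, not SwiftyClip's shipped one.

```swift
// Sketch of subsequence-style fuzzy matching for a command palette:
// "ar" matches "Aspect Ratio" because 'a' and 'r' appear in order.
func fuzzyMatches(_ query: String, _ candidate: String) -> Bool {
    var remaining = Substring(query.lowercased())
    for ch in candidate.lowercased() {
        if ch == remaining.first {
            remaining = remaining.dropFirst()
        }
        if remaining.isEmpty { return true }
    }
    return query.isEmpty
}
```

Real palettes layer a ranking score on top (consecutive-character and word-boundary bonuses) so the best match floats to the top, but the boolean filter above is the foundation.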
MenuBarExtra Render Queue
Professional software should stay out of your way. A common frustration is needing to keep the main window open during renders. In v1.0.3, we've introduced a MenuBarExtra render queue. You can now close the main SwiftyClip window and see the progress of every in-flight render directly in your macOS menu bar.
This is standard macOS citizenship. Your Mac is a multitasking machine, and SwiftyClip respects that. The menu bar icon provides a subtle visual indicator of progress, and a quick click reveals detailed breakdowns, including estimated time remaining and controls to pause or cancel. When a render finishes, you'll receive a native notification to reveal the file in Finder or share it immediately. It's a small change that significantly improves the day-to-day feel of the application.
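Structurally, a queue like this is a small amount of SwiftUI. The sketch below is a minimal standalone assumption of how it could be wired, with `RenderQueue` and its `Job` type as hypothetical stand-ins for the real model; `MenuBarExtra` and `.menuBarExtraStyle(.window)` are the actual system APIs.

```swift
import SwiftUI

// Sketch: a menu-bar scene showing per-job render progress.
// `RenderQueue` is a hypothetical observable model, not the shipped type.
final class RenderQueue: ObservableObject {
    struct Job: Identifiable {
        let id = UUID()
        let name: String
        var fractionComplete: Double
    }
    @Published var jobs: [Job] = []
    var isRendering: Bool { !jobs.isEmpty }
    func cancelAll() { jobs.removeAll() }
}

@main
struct SwiftyClipMenuBarDemo: App {
    @StateObject private var queue = RenderQueue()

    var body: some Scene {
        MenuBarExtra(
            "SwiftyClip",
            systemImage: queue.isRendering ? "arrow.triangle.2.circlepath" : "scissors"
        ) {
            ForEach(queue.jobs) { job in
                ProgressView(job.name, value: job.fractionComplete)
            }
            Divider()
            Button("Cancel All") { queue.cancelAll() }
        }
        .menuBarExtraStyle(.window) // rich content instead of a plain menu
    }
}
```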
One-Keystroke Filler-Word Sweep (⇧⌘F)
Editing a transcript shouldn't be a chore. One of the most tedious parts of podcast editing is removing "ums," "uhs," and "likes." In v1.0.3, we've introduced a one-keystroke filler-word sweep via ⇧⌘F. In a single action, the SwiftyClip engine scans your entire transcript and strips these distractions while maintaining continuity.
This isn't simple find-and-replace. Our engine uses precise word-level timestamps to perform surgical cuts that sound natural. For a thirty-minute interview, this can save twenty minutes of manual trimming. We want to remove the mechanical friction of editing so you can focus on the narrative. By making this a single global command, we've turned a project-long headache into a momentary task.
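To make the timestamp-driven approach concrete, here is a simplified sketch of the idea. The `TimedWord` type, the filler list, and the padding heuristic are illustrative assumptions; the shipped engine's word list and cut smoothing are more sophisticated.

```swift
// Sketch: given word-level timings, drop filler words and return the
// time ranges to KEEP, merging adjacent ranges to avoid micro-cuts.
struct TimedWord {
    let text: String
    let start: Double  // seconds
    let end: Double    // seconds
}

let fillerWords: Set<String> = ["um", "uh", "like"]

func keepRanges(for words: [TimedWord],
                padding: Double = 0.05) -> [ClosedRange<Double>] {
    var ranges: [ClosedRange<Double>] = []
    for word in words where !fillerWords.contains(word.text.lowercased()) {
        let start = max(0, word.start - padding)
        let end = word.end + padding
        // Merge with the previous range when they touch or overlap.
        if let last = ranges.last, start <= last.upperBound {
            ranges[ranges.count - 1] = last.lowerBound...max(last.upperBound, end)
        } else {
            ranges.append(start...end)
        }
    }
    return ranges
}
```

The resulting ranges map directly onto composition cuts, which is why the output sounds natural: the cut points come from the transcript's own word boundaries, padded slightly so breaths aren't clipped.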
OSLog Signposts and Instrumentation
We believe in technical transparency. When we claim SwiftyClip can analyze a sixty-minute podcast in under five minutes, we want to prove it. In v1.0.3, we have instrumented our pipeline with OSLog signposts. Every stage—from ingestion to saliency analysis and rendering—is now traceable in the macOS Instruments app.
For users, this means better performance and reliability, as we use these traces to eliminate bottlenecks. For power users, it means you can see how the software interacts with your hardware. Watch exactly how we utilize the Neural Engine and GPU during a render. This level of engineering rigor is rare for "AI wrappers," but standard for a truly Apple-native application. We are building for efficiency and giving you the tools to verify it.
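Instrumenting a stage with signposts is lightweight. This sketch uses the real `OSSignposter` API; the subsystem string, category, and the `transcribe` function are illustrative placeholders for our pipeline stages.

```swift
import Foundation
import OSLog

// Sketch: marking a pipeline stage as a signpost interval so it appears
// as a labeled span in Instruments' os_signpost track.
let signposter = OSSignposter(subsystem: "com.swiftyclip.engine",
                              category: "Pipeline")

func transcribe(_ url: URL) async throws {
    let state = signposter.beginInterval("Transcribe",
                                         id: signposter.makeSignpostID())
    defer { signposter.endInterval("Transcribe", state) }
    // ... actual transcription work happens here ...
}
```

Open Instruments, choose the os_signpost instrument, filter on the subsystem, and each stage shows up as a named interval you can correlate with CPU, GPU, and Neural Engine activity.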
Aspect-Ratio Top-Chrome Switcher
Modern distribution requires multiple formats. While 9:16 is king for TikTok, you still need 1:1 for LinkedIn and 16:9 for YouTube. In v1.0.3, we've added an aspect-ratio switcher directly into the top chrome. Toggle between 9:16, 1:1, 16:9, and 4:5 with a single click.
We've taken inspiration from the best-in-class patterns but kept the implementation minimal. Switching ratios doesn't just crop video; it intelligently adjusts framing based on face-tracking data gathered during analysis. The UI remains clean, putting important controls right where you expect them. It's about making the transition between platforms feel effortless, ensuring your content looks its best everywhere.
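The geometry behind face-aware reframing is straightforward to sketch: fit the largest rect of the target aspect inside the source, center it on the tracked face, and clamp it to the frame. This is an illustrative heuristic, not the shipped framing model.

```swift
import Foundation

// Sketch: compute a crop rect of a given aspect ratio (width / height),
// centered on a tracked face and clamped inside the source frame.
func cropRect(source: CGSize,
              targetAspect: CGFloat,
              faceCenter: CGPoint) -> CGRect {
    var width = source.width
    var height = width / targetAspect
    if height > source.height {        // target is taller than the source fits
        height = source.height
        width = height * targetAspect
    }
    // Center on the face, then clamp so the crop never leaves the frame.
    let x = min(max(faceCenter.x - width / 2, 0), source.width - width)
    let y = min(max(faceCenter.y - height / 2, 0), source.height - height)
    return CGRect(x: x, y: y, width: width, height: height)
}
```

For a 1920×1080 source cropped to 9:16, this yields a 607.5-point-wide column that slides horizontally to follow the speaker.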
Multi-Aspect Export in Parallel
The final piece of v1.0.3 is multi-aspect export. Once you've finished editing, you can trigger a parallel render in all three primary platform ratios (9:16, 1:1, 16:9) simultaneously. Each export utilizes its own face-tracked framing to ensure the subject is always centered.
This is where Apple Silicon shines. By optimizing our render engine with Metal and AVFoundation, we can saturate the hardware video encoders, running multiple simultaneous renders with minimal performance penalty. Instead of waiting for three separate exports, you wait for one. This is a massive efficiency boost for multi-platform creators, turning the final stage of the workflow into a fast, automated process.
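In Swift concurrency terms, fanning out the three renders is a task group. The sketch below assumes a hypothetical `renderClip(aspect:)` entry point (stubbed here so the example is self-contained); the structured-concurrency pattern is what matters.

```swift
import Foundation

// Hypothetical async render entry point; stubbed for illustration.
// The real version would drive an AVAssetExportSession or a Metal-backed
// encoder with face-tracked framing for the given aspect ratio.
func renderClip(aspect: CGFloat) async throws -> URL {
    URL(fileURLWithPath: "/tmp/clip-\(aspect).mp4")
}

// Sketch: render the three platform ratios concurrently with a task group.
func exportAllAspects() async throws -> [String: URL] {
    let aspects: [String: CGFloat] = ["9:16": 9 / 16, "1:1": 1, "16:9": 16 / 9]
    return try await withThrowingTaskGroup(of: (String, URL).self) { group in
        for (name, ratio) in aspects {
            group.addTask {
                let url = try await renderClip(aspect: ratio)
                return (name, url)
            }
        }
        var results: [String: URL] = [:]
        for try await (name, url) in group {
            results[name] = url
        }
        return results
    }
}
```

Because each child task owns an independent encoder session, the group maps naturally onto the parallel hardware encoders, and cancellation of the group cancels every in-flight render.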
Closing the Loop
SwiftyClip v1.0.3 is more than a collection of features; it's a statement of intent. We are building the most agentic, Apple-native video clipper, pacing faster than cloud competitors. By focusing exclusively on macOS, we deliver performance, privacy, and integration that browser-based tools fundamentally cannot match.
We aren't stopping here. In the coming weeks, we will ship our signed DMG, launch our public TestFlight, and release a comprehensive new demo video showcasing these agentic workflows in action. Our roadmap is full. Join us as we redefine what's possible with video on the Mac.
Check out our pricing page for current offers, including our lifetime license. For those getting started, our first-clip guide will walk you through the process in minutes. We've updated our changelog with technical notes, and you can explore our vision for agentic AI and enterprise solutions. The future of video editing is on-device, agentic, and Apple-native.