Press kit · Reddit launch posts

Reddit launch posts

Five sub-specific posts. Read each sub's rules before posting; some require mod approval for self-promo. Stagger the posts across 48 hours for better reach.

r/podcasting

Built an AI clipper that runs 100% on your Mac — no cloud uploads, $149 lifetime

Longtime lurker, building something for myself. SwiftyClip takes your long-form podcast and clips it into short-form vertical videos, but everything happens on your Mac's Neural Engine. No uploads, no queue, no per-clip credits.

Why I built it: I hated that Opus Clip would extract the same 30 seconds 8 times from a 2-hour episode. SwiftyClip uses embedding-based dedupe so near-duplicate clips get filtered out instead of resurfacing.
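If anyone asks how the dedupe works, here's a rough sketch of the idea (illustrative only, not the app's actual code — names, threshold, and toy vectors are made up): embed each candidate clip, walk the candidates best-score-first, and keep a clip only if it isn't too similar to anything already kept.

```python
# Sketch of embedding-based clip dedupe (illustrative, not SwiftyClip's code).
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def dedupe(candidates: list[str], embeddings: dict[str, np.ndarray],
           threshold: float = 0.9) -> list[str]:
    """Greedy filter: candidates are sorted best-score-first; a clip is kept
    only if it is below the similarity threshold against every kept clip."""
    kept: list[str] = []
    for cid in candidates:
        if all(cosine(embeddings[cid], embeddings[k]) < threshold for k in kept):
            kept.append(cid)
    return kept

# Toy vectors: clips A and B are near-duplicates, C is distinct.
emb = {
    "A": np.array([1.0, 0.0, 0.1]),
    "B": np.array([0.98, 0.05, 0.12]),
    "C": np.array([0.0, 1.0, 0.0]),
}
print(dedupe(["A", "B", "C"], emb))  # → ['A', 'C'] (B dropped as a near-dup of A)
```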

Pricing: free tier (5 clips/mo), Starter $9/mo unlimited, or $149 lifetime. Mac only (Apple Silicon required).

Happy to answer: how it handles your interview show, whether WhisperKit transcription accuracy is good enough on your audio setup, anything else.

I'll include the link in the post if mods allow; otherwise it's just swiftyclip.com.

Note: r/podcasting mods are strict on self-promo. Check the rules and consider posting in the weekly promo thread first.

r/videoediting

On-device AI video clipper for Mac (Apple Silicon) — $0 per-clip, 93 tests passing, SwiftUI

Sharing what I built after getting priced out of Opus Clip at $27/mo. SwiftyClip is a single-purpose clipping app — drag a video in, get scored clips out. Everything runs locally via WhisperKit + Vision + MLX.

Currently v1.0.3:
- 11 MCP tools so Claude Code / Cursor can drive it end-to-end
- Multi-aspect export (one click → 9:16 + 1:1 + 16:9 in parallel)
- ScreenCaptureKit live capture (record a display/window/app straight into the pipeline)
- Filler-word sweep ⇧⌘F (strips um/uh across a whole transcript)
- Drag-and-drop batch import queue

Not affiliated with Opus / Submagic / CapCut. Honest comparison at swiftyclip.com/alternatives.

Note: Include a screenshot of the workspace or a short before/after clip as a gallery post if the sub allows.

r/MacApps

SwiftyClip — native Mac AI video clipper. On-device, Apple-first, MCP-native.

Apple-native stack: Swift 6 strict concurrency, SwiftUI, AVFoundation, Vision, WhisperKit, MLX, SwiftData + CloudKit. macOS 15 minimum, Apple Silicon required.

What it does: drop a long-form video, get short-form clips with captions and face-tracked reframing. Zero cloud uploads. Zero per-clip credits.

What's different: full agent surface via MCP. Claude Code can chain ingest → analyze → score → render with one prompt. There's even a ⌘K Raycast-style palette + ⌘/ shortcut cheat sheet.

Free tier exists. Starter $9/mo. Lifetime $149 one-time.

Feedback welcome — especially on the Settings → Agents pane.

Note: r/MacApps appreciates specific Apple-framework mentions and a real screenshot/demo.

r/artificial

MCP-native video clipper running fully on Apple Silicon — Claude Code drives the whole pipeline

If you've been playing with MCP servers, this one's 100% local: SwiftyClip exposes 11 clip.* tools and runs its ML on the Neural Engine + GPU. No OpenAI, no Anthropic, no Gemini calls required. WhisperKit for transcription, Vision for analysis, MLX for hook scoring.

Full tool catalog: swiftyclip.com/api/mcp/schema
OpenAPI spec: swiftyclip.com/api/mcp/openapi.json
Claude Code setup: swiftyclip.com/integrations/claude-code

Interesting bits:
- 11 tools (including clip.registerWebhook so agents can subscribe to render events)
- stdio + loopback WebSocket transports
- No outbound network calls for video/audio content, ever.
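To make the pitch concrete for MCP users, a minimal Claude Code server entry might look like this (server name, binary path, and flag are assumptions — the linked Claude Code setup page is the source of truth):

```json
{
  "mcpServers": {
    "swiftyclip": {
      "command": "/Applications/SwiftyClip.app/Contents/MacOS/swiftyclip-mcp",
      "args": ["--stdio"]
    }
  }
}
```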

Would love feedback on the tool shape — especially if you've integrated other MCP servers and have thoughts on ergonomics.

Note: This is the most forgiving sub for AI/dev tooling. Post under a Show & Tell flair or thread if they have one.

r/selfhosted

Local-first clipper for podcasters — runs on your Mac, no cloud, MCP-controllable

Local-first cousin of the Opus Clip / Submagic / Vugola cohort. SwiftyClip is a Mac app rather than a server you host, but the spirit is the same: your data never leaves your machine.

What runs locally:
- Transcription (WhisperKit → Apple Neural Engine)
- Analysis (Vision: faces, saliency, scene cuts)
- Hook scoring (MLX embeddings)
- Rendering (AVFoundation + Metal)

What hits the network (only if you opt in):
- Auth (Supabase) — if you use scheduled posts
- Billing (Stripe) — for paid tiers
- Transactional email (Resend) — waitlist confirmations

Video / audio / transcript content: never leaves the Mac. Full doc at swiftyclip.com/security-whitepaper.

Note: r/selfhosted cares a lot about data sovereignty — lead with that, downplay the paid tiers.