Privacy details for the current MingHelper build.
Last updated March 7, 2026. This page describes the behavior of the current MingHelper macOS build, as reflected in the codebase prepared for App Store submission.
Overview
MingHelper is a macOS desktop assistant. It can answer typed prompts, analyze screenshots that you capture, transcribe system audio for coaching workflows, and display Tutorial Mode guidance on screen.
Information you provide
MingHelper may process prompts you type, screenshots you intentionally capture or attach to a workflow, audio context you enable for coaching features, and configuration details such as custom instructions.
Permissions
The app may request Screen Recording, Speech Recognition, and Accessibility access. These permissions are used only for the features that depend on them. Accessibility access is optional and mainly improves Tutorial Mode target snapping.
API requests
When you use assistant features, MingHelper sends requests directly to the OpenAI API. An OpenAI API key is required for live requests. Certain reply modes may also enable OpenAI's web-backed search tools as part of the request.
Local storage
The current build stores some settings locally in macOS UserDefaults, including the selected mode, custom instructions, other preferences, and the contents of the API key field. The current codebase stores the API key in UserDefaults rather than in the Keychain.
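For context, settings stored this way live in the app's UserDefaults preferences plist on disk. The following is a minimal illustrative sketch of that pattern; the key names below are hypothetical and are not the app's actual identifiers:

```swift
import Foundation

let defaults = UserDefaults.standard

// Hypothetical keys for illustration only.
defaults.set("tutorial", forKey: "selectedMode")
defaults.set("Answer concisely.", forKey: "customInstructions")

// A value written this way is stored in the app's plain preferences
// plist, not in the encrypted Keychain.
defaults.set("example-api-key", forKey: "apiKey")

let mode = defaults.string(forKey: "selectedMode")
```

Values written with `UserDefaults.set(_:forKey:)` persist in an unencrypted property list scoped to the app's container, which is the distinction the paragraph above draws against Keychain storage.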
Session retention
In the current repository build, conversation history, screenshots, and transcripts are kept in memory for the active session and are not written to a local chat database by this codebase.
Data sharing
MingHelper does not sell personal data. Information is shared only as needed to provide the assistant features that you explicitly use, such as sending prompts and optional captured context to OpenAI.
Contact
- Support page: mingllm.com/minghelper-support
- Email: support@mingllm.com