Settings

A reference for every option in the LucidPal Settings screen.

Open Settings by tapping the gear icon in the top-right corner of the main screen.


Simple vs Advanced Mode

At the top of Settings, a segmented picker lets you switch between two views of the settings screen.

| Mode | What you see |
| --- | --- |
| Simple | Data Sources (Notes, Habits, Contacts, Calendar, Location, Web Search), Text Model (model picker + Download More), Voice, and General (Notifications, About, Debug Logs) |
| Advanced | Everything in Simple, plus: Vision toggle and vision model picker, full Inference controls (context window, temperature, max tokens, timeout, KV Cache info), and Shortcuts/Siri section |

The selected mode is remembered across app launches. New users start in Simple mode.

Tip: Switch to Advanced mode when you want to fine-tune how the AI generates responses or configure a vision model. For everyday use, Simple mode is all you need.


Data Sources

These toggles control which personal data LucidPal can read and act on. All processing is on-device — nothing leaves your iPhone.

| Section | Toggle | What it does |
| --- | --- | --- |
| Notes | Notes | LucidPal can save ideas when you say "save this" or "make a note" |
| Habits | Habits | LucidPal can log and query habits: "log my workout", "did I meditate today?" |
| Contacts | Contacts Access | LucidPal can look up phone numbers and email addresses from your contacts |
| Calendar | Use calendar in chat | Upcoming events are included in the AI prompt for scheduling and reminders |
| Location | Include city in AI context | Your detected city is added to the AI prompt for location-relevant answers |
| Web Search | (tap to open sub-screen) | Configure web search provider and API key; see Web Search |

Calendar

The Calendar row adapts to the current permission state:

| State | What you see |
| --- | --- |
| Not authorized | Allow Access button; tap to trigger the iOS permission prompt |
| Authorized | Toggle to include or exclude event context from the AI |

Once access is granted, a Default Calendar picker also appears — choose which calendar new events are created in ("System Default" uses the iOS default).

Location

The Location row shows different states depending on permission:

| State | What you see |
| --- | --- |
| Not yet requested | Enable button; tap to request iOS location permission |
| Granted | Row label shows the detected city inline, e.g. "Location — Montreal" |
| Denied | Denied badge; re-enable via iOS Settings → Privacy & Security → Location Services → LucidPal |

The city string is included in the system prompt and is never stored on any server.


Vision

| Setting | What it does |
| --- | --- |
| Vision toggle | When on, photo attachments are processed by the vision model. Turn off to force text-only inference and save RAM. |

See Vision & Photos for how to attach and analyze images.


Text Model

Lists all downloaded and available text models for your device. Tap a model to select and load it.

  • A checkmark indicates the active model.
  • On device / Not downloaded shows download status.
  • File size is shown next to each model name.
  • Swipe left on a downloaded model to Delete it and recover storage.
  • Tap Download More Models to browse and download additional models.

Device RAM and available storage are shown in the section footer to help you choose.

For a full model comparison table, see AI Models.


Vision Model

Lists available vision models (separate from the text model, unless you chose an integrated model).

| Badge | Meaning |
| --- | --- |
| Integrated | Single model handles both text and vision; no separate download needed |
| Vision | Dedicated vision model; pairs with your text model |

  • Tap a downloaded vision model to activate it.
  • Tap Download Vision Models to fetch one if none are downloaded.
  • If no vision models are compatible with your device, the section shows a notice.

Inference

Controls voice input behavior and how the AI generates responses.

| Setting | Default | What it does |
| --- | --- | --- |
| Start voice on open | Off | Automatically activates the microphone when you open a new chat |
| AirPods auto-voice | Off | Starts listening when AirPods connect; stops on silence |
| Auto-send after speech | On | Submits the transcribed message without requiring a tap (hidden when "Start voice on open" is on) |
| Context Window | Device max | Tokens the model keeps in memory; larger = longer conversations, more RAM |

Context window options are capped to your device's RAM. The app auto-selects the largest safe value on first launch and after upgrades.
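The "largest safe value" selection can be pictured with a small sketch. The option list and the RAM-to-token heuristic below are illustrative assumptions, not LucidPal's actual values or code:

```swift
import Foundation

// Hypothetical context-window options, smallest to largest.
let contextOptions = [2048, 4096, 8192, 16384]

/// Picks the largest option the device can safely hold.
/// Assumes (purely for illustration) that each 1 GB of RAM
/// supports about 2048 tokens of context.
func largestSafeContextWindow(deviceRAMGB: Double) -> Int {
    let estimatedMax = Int(deviceRAMGB * 2048.0)
    // Fall back to the smallest option on very low-RAM devices.
    return contextOptions.last(where: { $0 <= estimatedMax }) ?? contextOptions[0]
}
```

Under this sketch, a 4 GB device would land on 8192 tokens, while an 8 GB device gets the full 16384.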

Advanced Inference Controls

These settings are visible in Advanced mode only.

| Setting | Range | Default | What it does |
| --- | --- | --- | --- |
| Temperature | 0.0 – 2.0 | 0.35 | Lower = focused/deterministic; higher = creative/varied |
| Max Response Length | 128 – 2048 tokens | 768 | Cap on how long a single reply can be |
| Timeout | 30 – 300 s | 90 s | Generation is cancelled if it takes longer than this |
| KV Cache | Fixed | | Shows the quantization type used for the key-value cache (read-only) |

Temperature and context window changes take effect the next time the model loads (i.e., next new chat).
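The ranges and defaults in the table can be summarized as a settings struct. This is a hedged sketch of how such settings might be modeled; the type and the clamping behavior are assumptions, not LucidPal's implementation:

```swift
import Foundation

// Defaults match the table above; ranges are enforced by clamping.
struct InferenceSettings {
    var temperature: Double = 0.35   // 0.0 – 2.0
    var maxResponseTokens: Int = 768 // 128 – 2048
    var timeoutSeconds: Int = 90     // 30 – 300

    // Clamp each value into its documented range.
    mutating func clampToDocumentedRanges() {
        temperature = min(max(temperature, 0.0), 2.0)
        maxResponseTokens = min(max(maxResponseTokens, 128), 2048)
        timeoutSeconds = min(max(timeoutSeconds, 30), 300)
    }
}
```

For example, an out-of-range temperature of 3.0 would be pulled back to the 2.0 ceiling.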

Thinking Mode

Thinking mode is toggled per-chat via the brain icon in the chat toolbar. The last state is remembered across chats. See AI Models → Thinking Mode for details on which models support it.


Notifications

| Setting | What it does |
| --- | --- |
| Pre-event reminders | Sends a notification 10 minutes before each calendar event with a tap-to-prepare shortcut |

Toggling this on triggers an iOS notification permission prompt if not yet granted.


Shortcuts

LucidPal exposes four actions to the Shortcuts app for automation:

| Action | Description |
| --- | --- |
| Ask LucidPal | Query the AI assistant and receive a text response |
| Create Event | Add a calendar event with title, time, and duration |
| Check Next Meeting | Get details of your next upcoming calendar event |
| Find Free Time | Search for available time slots in your calendar |

Tap Open Shortcuts App to jump directly to the Shortcuts app and build automations.

Action parameters

| Action | Parameters | Defaults |
| --- | --- | --- |
| Ask LucidPal | query (text) | |
| Create Event | eventTitle, startTime, durationMinutes | duration: 60 min |
| Check Next Meeting | (none) | |
| Find Free Time | searchDate, durationMinutes | date: now, duration: 60 min |

Empty or whitespace-only inputs are rejected — the action returns an empty result rather than creating a malformed entry.
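The rejection rule amounts to trimming and checking for emptiness. A minimal sketch, assuming a hypothetical helper (the function name is not part of LucidPal's API):

```swift
import Foundation

/// Returns a trimmed parameter value, or nil for empty or
/// whitespace-only input — the "empty result" described above.
func validatedShortcutInput(_ raw: String?) -> String? {
    guard let trimmed = raw?.trimmingCharacters(in: .whitespacesAndNewlines),
          !trimmed.isEmpty else { return nil }
    return trimmed
}
```

So a title of `"   "` produces no event, while `" Team sync "` is accepted as `"Team sync"`.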

See Siri & Shortcuts for step-by-step automation examples.


About

Shows the app version and build number.

Debug Logs

Tap Debug Logs to open an in-app log viewer that captures real-time events from the AI, voice transcription, calendar, and other subsystems.

| Control | What it does |
| --- | --- |
| Filter (funnel icon) | Filter entries by category (LLM, Whisper, Calendar, …) or log level (info / warning / error) |
| Search bar | Full-text search across all log messages |
| Copy icon | Copies the filtered log as plain text; paste into a bug report or email |
| Trash icon | Clears all log entries |

Logs are stored in memory only and are cleared when you quit the app. If you are reporting a bug, reproduce the issue and then tap the copy icon before closing the app.