Support

Help center · Look & Talk · Last updated: May 7, 2026

Welcome to the Look & Talk support center. Here you'll find how-to guides, frequently asked questions, and a direct way to reach the team. Look & Talk is fully VoiceOver-compatible — every control on this page and inside the App is labeled for screen reader navigation.

Direct contact

Email: christian@irack.mx

We reply within 48 business hours, Monday through Friday (Mexico City time).

Important: assistive tool, not a medical or safety device

Look & Talk helps with everyday reading and recognition tasks. It is not a medical device, a navigation device, a safety device, or a substitute for professional orientation and mobility training. For critical decisions — medication doses, financial documents, traffic, hazards — please verify the result with a sighted assistant, a certified specialist, or a clinical tool.

How to use Look & Talk

Read printed text (OCR)
  1. Open Look & Talk and go to the Read tab
  2. Hold the iPhone about 20–30 cm from the printed material
  3. The camera scans continuously; recognized text is read aloud through VoiceOver in real time
  4. Tap the Capture button to freeze a frame and read the full text again at your own pace
  5. Captured text is saved to History automatically

Works best with clean printed text in good lighting. Handwriting and very stylized fonts may be misread — verify critical content with a sighted assistant.
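For the curious, on-device text recognition like the Read tab's is typically built on Apple's Vision framework. The sketch below is illustrative only — it shows the kind of call involved, not Look & Talk's actual source code:

```swift
import CoreGraphics
import Vision

// Illustrative sketch: recognize printed text in a camera frame on-device.
// This is the general Vision pattern, not the App's real implementation.
func recognizeText(in cgImage: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the best candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true  // helps most with clean printed text

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Because recognition runs frame by frame on the device itself, no image ever needs to leave the iPhone — which is also why good lighting and steady framing matter so much.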

Describe a scene
  1. Open Look & Talk and go to the Describe tab
  2. Aim the camera at the room, object, or scene
  3. Tap the Describe button (or use the volume-up shortcut)
  4. Look & Talk runs Apple Foundation Models on-device and reads the description aloud
  5. The description is saved to History; you can re-listen at any time

Description quality depends on lighting, framing, and visual context. The model does not always identify people, brands, or text correctly. Treat descriptions as helpful guidance, not as a verified report.

Use the magnifier
  1. Open Look & Talk and go to the Magnify tab
  2. Pinch to zoom or use the slider to set the magnification level
  3. Tap Freeze to capture a still image you can pan and read at your own pace
  4. Toggle High-contrast filters to invert colors or boost contrast for low-vision reading
  5. Toggle the Torch if the lighting is poor

Designed for prescription bottles, ingredient lists, business cards, and any fine print. The freeze-frame is held in memory only and is discarded when you exit the tab.
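The Torch toggle maps to a standard AVFoundation call. The following is a hedged sketch of that pattern, not the App's actual code:

```swift
import AVFoundation

// Illustrative sketch: turn the iPhone torch on or off.
// Standard AVFoundation usage, not Look & Talk's real source.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        // Configuration can fail if another process holds the camera.
    }
}
```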

Identify a color
  1. Open Look & Talk and go to the Color tab
  2. Aim the camera at the surface (clothing, paint, button, label)
  3. Hold the camera steady about 10 cm from the surface
  4. Look & Talk announces the color name through VoiceOver — for example, "navy blue", "warm gray", "bright red"
  5. The color is saved to History if you tap Save

Color names are localized in your iPhone's primary language. Detection accuracy depends on lighting — natural daylight gives the best results.

Review and manage History
  1. Open Look & Talk and go to the History tab
  2. Browse OCR text, scene descriptions, and color names you've saved
  3. Tap any entry to listen to it again or copy the text
  4. Swipe left on an entry to delete it
  5. Tap Clear All in Settings to wipe History entirely

History is stored locally on your iPhone using SwiftData and is never uploaded anywhere. Uninstalling Look & Talk deletes the local store, removing History permanently.
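A local-only SwiftData store is usually modeled like the sketch below. The type and property names here are hypothetical, chosen for illustration — this is not Look & Talk's actual schema:

```swift
import Foundation
import SwiftData

// Illustrative sketch of a local History entry (hypothetical names,
// not the App's real model).
@Model
final class HistoryEntry {
    var kind: String      // e.g. "ocr", "scene", or "color"
    var text: String      // recognized text, description, or color name
    var createdAt: Date

    init(kind: String, text: String, createdAt: Date = .now) {
        self.kind = kind
        self.text = text
        self.createdAt = createdAt
    }
}

// The backing store lives inside the app sandbox, so deleting the app
// deletes the data with it:
// let container = try ModelContainer(for: HistoryEntry.self)
```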

Use Look & Talk with VoiceOver

Look & Talk is built VoiceOver-first. Every button, label, and result is read aloud automatically. Swipe with three fingers to move between tabs, swipe with one finger to move through controls, and double-tap to activate the selected control.

If a control is not announced correctly, please email christian@irack.mx with the screen name and the control affected — accessibility bug reports are treated as the highest priority.
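In SwiftUI, VoiceOver support of this kind comes from explicit accessibility labels and hints on each control. The snippet below is a generic illustration of that pattern — the control and strings are hypothetical, not taken from the App:

```swift
import SwiftUI

// Illustrative sketch: a labeled control that VoiceOver can announce.
// The name "CaptureButton" and the strings are hypothetical examples.
struct CaptureButton: View {
    var action: () -> Void

    var body: some View {
        Button(action: action) {
            Image(systemName: "camera.circle")
        }
        .accessibilityLabel("Capture")
        .accessibilityHint("Freezes the current frame and reads the text aloud")
    }
}
```

When every control carries a label and hint like this, VoiceOver can announce it correctly — which is why a report naming the screen and the silent control is enough to pin down the bug.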

Change the language and voice

Look & Talk is available in English, Español, and Português (Brasil). Change it from the Settings tab. The change applies immediately and persists across sessions.

Voice rate, pitch, and the speech voice itself are controlled by your iPhone's system VoiceOver settings (Settings → Accessibility → VoiceOver). Look & Talk respects whatever you have configured there.

Frequently asked questions

Does Look & Talk upload my photos or camera frames?

Never. Every camera frame, every OCR result, every scene description, every color identification stays on your iPhone. Look & Talk has no backend, no cloud sync, and no third-party AI provider. Recognition runs on-device using Apple's Vision and Foundation Models frameworks. See our Privacy Policy.

Why does Look & Talk need camera and Photo Library access?

The camera is used to read text, describe scenes, magnify content, and identify colors — without it, the App cannot do its job. The Photo Library is optional and only used when you pick a saved photo to run OCR or scene description on. Look & Talk never browses your library in the background.

Can Look & Talk read handwriting?

Apple Vision can recognize many printed and some handwritten styles, but accuracy with handwriting is limited. Cursive, stylized fonts, and notes written in a hurry may be misread. For critical handwritten content (signatures, prescriptions, legal notes), please verify with a sighted assistant.

Does Look & Talk work offline?

Yes. All recognition runs on-device. After installing the App, you can use OCR, scene description, magnification, and color detection without any internet connection. The App makes no network requests.

Can Look & Talk replace a sighted assistant or guide dog?

No. Look & Talk is an assistive tool, not a navigation, mobility, or safety device. It cannot detect curbs, traffic, obstacles, hazards, or sudden changes in the environment. For orientation and mobility, please rely on professional training, a guide dog, a cane, or a sighted assistant. For medical decisions, financial documents, and legal matters, always verify the App's output with a qualified human.

Does Look & Talk drain my battery?

Live OCR, scene description, and continuous magnification keep the camera and the on-device ML model active, which uses more battery than typical apps. Sustained use for 30 minutes or more will noticeably reduce battery. If you plan to use Look & Talk for an extended session, plug in your iPhone or use a battery pack.

Which iPhones are supported?

Any iPhone running iOS 18 or later. Scene description (Apple Foundation Models) requires a device that supports Apple Intelligence; on older devices, that feature may be unavailable while OCR, magnification, and color detection still work.

Does Look & Talk work on iPad, Mac, or Apple Watch?

Currently iPhone only. iPad and Mac versions are on the roadmap if there is demand from the community.

Do I have to pay anything?

No. Look & Talk is free, with no In-App Purchases, no subscription, no ads, and no paywall. It is offered as a permanent free assistive tool. If you'd like to give something back, sharing feedback or suggesting a feature at christian@irack.mx is more than enough.

Is there a way to export my History?

Not yet in v1.0. Because Look & Talk keeps everything local, export is planned as a share-sheet option for individual entries. If you need this sooner, please email the team and describe the use case.

Can I take photos of other people with Look & Talk?

The camera captures whatever is in front of it, including people. Laws around photographing other people vary by country and state, and some jurisdictions require consent. You are responsible for complying with the law where you live. Please ask before taking a picture of another person.

How do I report a bug?

Email christian@irack.mx and include:
  1. Your iPhone model and iOS version
  2. The tab or screen where the problem occurred
  3. What you expected to happen and what happened instead
  4. The steps to reproduce the problem, if you can

Can I suggest a new feature?

Yes — feedback from the blind and low-vision community is the most important input Look & Talk receives. Email christian@irack.mx with your proposal. We review every suggestion.

Contact

Developer: Chris Flores (Ingeniería.dev)
Postal address: Av. Javier Barros Sierra 495, Santa Fe, Lomas de Santa Fe, Zedec Santa Fé, Álvaro Obregón, 01219, Mexico City, CDMX, Mexico

Estimated response time: 48 business hours (Monday through Friday, Mexico City time). Accessibility bug reports are treated as the highest priority.