I am often told: "Basically, Lumyeye is just an app that talks, right?" or "Yeah, it's basically Gemini!"

These remarks make me smile. It's the best compliment anyone can give me, because it means the complexity is invisible. But today, I want to take you behind the scenes of the creation of this app for the blind.

If Lumyeye has become a true alternative to electronic magnifiers and heavy, expensive assistive devices, it is at the cost of a fierce battle against every millisecond.

The mirage of fluid conversation

When you ask Lumyeye a question, it feels like a straight line. In reality, it’s an infernal relay race in 7 steps:

  1. Your voice is recorded.
  2. It is converted to text (STT).
  3. This text goes to our servers.
  4. The AI deciphers, analyzes, and drafts a response.
  5. The response comes back to your phone.
  6. The text is transformed into human voice (TTS).
  7. The sound is broadcast.

For the experience to remain human, this cycle must last less than 8 seconds. Every half-second gained requires months of work from us. We optimize the code, change protocols, reduce data packet sizes... All so that you never feel like you're waiting.
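To make that budget concrete, here is a minimal sketch of the kind of stopwatch we keep in mind (purely illustrative, not our production code; the stage names simply mirror the seven steps above, and only the 8-second target comes from real life):

```swift
import Foundation

// Purely illustrative: a naive stopwatch for the 7-step relay race.
// The 8-second budget is the target mentioned in the article; every
// stage has to fit inside it, or the conversation stops feeling human.
enum Stage {
    case record, speechToText, upload, inference, download, textToSpeech, playback
}

struct LatencyBudget {
    static let total: TimeInterval = 8.0
    private var timings: [Stage: TimeInterval] = [:]

    mutating func measure(_ stage: Stage, _ work: () -> Void) {
        let start = Date()
        work() // run one stage of the relay
        timings[stage] = Date().timeIntervalSince(start)
    }

    var elapsed: TimeInterval { timings.values.reduce(0, +) }
    var overBudget: Bool { elapsed > Self.total }
}

var budget = LatencyBudget()
budget.measure(.speechToText) { /* call the STT engine here */ }
print("Elapsed: \(budget.elapsed)s, over budget: \(budget.overBudget)")
```

Shaving half a second means attacking whichever stage of that table is the fattest, release after release.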

⚡ Latency = the communication delay across the network, the key indicator of a fluid experience!

The secret war against VoiceOver

This is the technical point that made us pull our hair out (even the little I have left!). VoiceOver is vital for a non-sighted person, but it’s a very chatty neighbor.

Imagine: you are talking to the AI, and suddenly VoiceOver starts reading the time or a notification. The app's microphone picks up the phone's own voice, the AI gets confused, and the exchange fails.

We had to code very complex "politeness rules" so that VoiceOver steps back at exactly the right moment, without ever compromising the phone's overall accessibility. By the way, our latest update is devoted entirely to this point: we finally fixed the issue of VoiceOver talking over Lumyeye while it is listening at application startup!
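For the technically curious, here is a simplified sketch of what one such politeness rule can look like on iOS (my illustration, not the actual Lumyeye code): queue the announcement instead of interrupting, and only open the microphone once VoiceOver confirms it has finished speaking.

```swift
import UIKit

// Simplified "politeness rule": let VoiceOver finish speaking before
// the microphone opens, so the app never records the phone's own voice.
final class ListeningCoordinator {
    private var observer: NSObjectProtocol?

    func announceThenListen(startRecording: @escaping () -> Void) {
        guard UIAccessibility.isVoiceOverRunning else {
            startRecording() // No VoiceOver running: open the mic immediately.
            return
        }
        // Wait until VoiceOver reports that our announcement is done.
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.announcementDidFinishNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            if let token = self?.observer {
                NotificationCenter.default.removeObserver(token)
            }
            startRecording() // Only now is it safe to open the mic.
        }
        // A queued (non-interrupting) announcement keeps VoiceOver polite.
        let text = NSAttributedString(
            string: "Opening discussion",
            attributes: [.accessibilitySpeechQueueAnnouncement: true]
        )
        UIAccessibility.post(notification: .announcement, argument: text)
    }
}
```

The hard part is doing this for every sound the app makes, without ever muting VoiceOver for the rest of the phone.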

Creating an interface without eyes: vibrations and sound cues

How do you know the application is "thinking" when you can't see the screen? We replaced loading bars with a sensory language:

  • Listening mode is signaled by a precise haptic vibration.
  • AI reflection is accompanied by small "sound pellets" (discreet and reassuring sounds).

Today, the experience is so integrated that you just have to say "Hey Siri, open Lumyeye". The application opens, vibrates to confirm it is listening, and announces "Opening discussion". Achieving that one second of fluidity took hundreds of hours of testing.
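As an illustration, the two cues map onto standard iOS APIs roughly like this (a sketch; the real app uses custom-designed sounds, and the built-in system sound ID below is just a stand-in):

```swift
import UIKit
import AudioToolbox

// Sketch of the sensory language: a haptic tap for "listening",
// a short system sound for "thinking".
final class SensoryCues {
    private let haptic = UIImpactFeedbackGenerator(style: .medium)

    func signalListening() {
        haptic.prepare()
        haptic.impactOccurred() // Precise vibration: "I'm listening."
    }

    func signalThinking() {
        // A discreet "sound pellet" while the AI works; 1057 is one of
        // iOS's built-in system sounds, used here only as an example.
        AudioServicesPlaySystemSound(1057)
    }
}
```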

The war of the buttons: creating simplicity is unspeakably complex

At the very beginning of the Lumyeye adventure, the screen looked like an airplane cockpit. We thought we were doing well by putting a button for every mode, every option, every setting. We filled the space.

Then we understood: for a non-sighted user, every button is a potential obstacle, a source of confusion under VoiceOver. So we put the interface on a radical diet. We deleted, refined, hid. Today, when you open Lumyeye, there is almost nothing left. Only one button remains: the settings button.

But make no mistake: creating simplicity is unspeakably complex. For there to be no more buttons, the artificial intelligence must understand everything, all by itself. The application must anticipate whether you want to read a letter or bake a cake. Removing a button means writing hundreds of additional lines of code so the app "guesses" the user's intention. Making an empty screen is the ultimate engineering challenge.
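To give a feel for what "guessing" means in code, here is a toy version (mine, for illustration only; the real routing relies on the language model itself, and these mode names are just examples):

```swift
import Foundation

// Toy sketch: route a transcript to a mode without any button.
// The real app lets the AI decide; this keyword version only shows
// why deleting a button means adding logic somewhere else.
enum Intent {
    case readDocument, recipe, describeScene, unknown
}

func guessIntent(from transcript: String) -> Intent {
    let text = transcript.lowercased()
    if text.contains("read") || text.contains("letter") {
        return .readDocument
    }
    if text.contains("recipe") || text.contains("bake") || text.contains("cook") {
        return .recipe
    }
    if text.contains("describe") || text.contains("what do you see") {
        return .describeScene
    }
    return .unknown // Fall back to asking the user, still by voice.
}

print(guessIntent(from: "Can you read this letter for me?")) // readDocument
```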

The nightmare of updates (iOS & Android) for a mobile app for the blind

The most frustrating thing in my job? The ground is always moving. As soon as Apple or Google releases an update, communication protocols change. What worked yesterday can suddenly break because the system handles Bluetooth or the microphone differently. We then have to reopen everything and re-check everything, on dozens of different phone models, from the latest iPhone to entry-level Androids.
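One concrete example of this moving ground (a simplified sketch, not our full handling): when a Bluetooth headset connects or drops, iOS fires a route-change notification, and if the app doesn't react, the microphone can silently stop working.

```swift
import AVFoundation

// Sketch: survive Bluetooth route changes instead of losing the mic.
// Production code has to handle many more cases than this.
final class AudioRouteWatcher {
    init(onRouteChange: @escaping () -> Void) {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil, queue: .main
        ) { note in
            guard
                let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
                let reason = AVAudioSession.RouteChangeReason(rawValue: raw)
            else { return }
            switch reason {
            case .oldDeviceUnavailable, .newDeviceAvailable:
                onRouteChange() // e.g. restart the recording engine
            default:
                break
            }
        }
    }
}
```

And an OS update can change exactly when and how those notifications fire, which is why every release from Apple or Google sends us back to the test bench.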

Why Lumyeye isn't "just a Gemini" (and why it's nearly 30% cheaper)

People often tell me: "Yeah, basically, it's Gemini, right?" Not quite.

First, let's talk concrete numbers: the price. At €14.90 per month versus €21 for Gemini Advanced, Lumyeye is 29% cheaper (€21 - €14.90 = €6.10 saved, and 6.10 / 21 ≈ 29%). But beyond the savings, the real difference is philosophical: Lumyeye is an application made by and for the blind.

Where the giants offer generalist tools, we have developed surgically precise modes. Our Recipe Mode, for example, doesn't just read out an indigestible block of text: it lists the ingredients, then the steps one by one, waiting for your voice confirmation before moving to the next. It's an AI that cooks with you, at your pace. We also added vital details that the giants (GAFAM) ignore, like "blurry photo" alerts or the "end of text" signal that guarantees you haven't missed the bottom of a page.
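Here is a sketch of the idea behind that pacing (my simplified illustration; the real mode also handles repeats, questions, and corrections):

```swift
import Foundation

// Sketch of Recipe Mode's pacing: ingredients first, then one step
// at a time, advancing only on the user's voice confirmation.
struct RecipeReader {
    let ingredients: [String]
    let steps: [String]
    private(set) var current = 0

    func readIngredients(speak: (String) -> Void) {
        ingredients.forEach(speak) // List the ingredients one by one.
    }

    // Called each time the user confirms by voice ("next", "done"...).
    mutating func confirmAndAdvance(speak: (String) -> Void) {
        guard current < steps.count else {
            speak("Recipe finished. Enjoy!")
            return
        }
        speak(steps[current]) // Read the current step aloud...
        current += 1          // ...and only then move the cursor forward.
    }
}

var reader = RecipeReader(
    ingredients: ["200 g flour", "3 eggs"],
    steps: ["Whisk the eggs.", "Fold in the flour."]
)
reader.readIngredients(speak: { print($0) })
reader.confirmAndAdvance(speak: { print($0) }) // triggered by a voice command
```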

The most important part? Lumyeye is 100% vocal. It was designed to be usable immediately, even by those who do not master complex VoiceOver touch gestures.

A word of truth about "Vision Live" mode

I also want to be totally transparent with you about the live video mode. It is a technological feat, but it has physical limits. For the service to remain stable for everyone, Gemini enforces access restrictions when usage becomes excessive within a single day.

It is also a real marathon for your phone: real-time video analysis stresses the processor so much that the device can heat up, to the point where the system sometimes cuts the camera for safety. It is a very battery-hungry feature, and it does not forgive network drops: if your connection weakens, the stream cuts out automatically. That is the price of carrying such powerful technology in your pocket!
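For readers wondering how an app can even tell the phone is overheating, iOS exposes a public thermal-state API. Here is a minimal sketch of the kind of safeguard involved (illustrative; I'm not claiming this is how Gemini does it):

```swift
import Foundation

// Sketch: watch the device's thermal state and stop the live video
// stream before the system does it for you. Camera control is stubbed.
final class ThermalGuard {
    init(stopCamera: @escaping () -> Void) {
        NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil, queue: .main
        ) { _ in
            switch ProcessInfo.processInfo.thermalState {
            case .serious, .critical:
                stopCamera() // Back off before the OS cuts the camera.
            default:
                break // .nominal / .fair: keep streaming.
            }
        }
    }
}
```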

We chose a different approach: no cuts, no overheating, and a service that works all the time, without a single button press, at a competitive price.

Lumyeye on glasses?

This is a request we get often, yet for now it is not possible. Why? Meta does not allow us to port our application to its flagship Ray-Ban glasses. Moreover, glasses currently represent a significant extra cost, which remains out of reach for many. Finally, glasses are one more device to charge, carry, and update. And what about people who already wear prescription glasses? For all these reasons, at this stage, they are not the right fit for us.

We are following developments closely!

Why do we put ourselves through this?

Because autonomy shouldn't cost €3,000 or weigh 2 kilos in a bag. By transforming the smartphone into an alternative to expensive technical aids, we make freedom accessible to all.

The next time you use Lumyeye and everything feels "normal", know that behind that silence and speed, there is a developer who hunted down every last millisecond for you!