“We built it for the person who picks it up alone, in the dark, for the first time.”
Seven students. One constraint: no screen, ever.
أبصرني: "See Me."
Built from a single question
Abserny began as a graduation project in late 2024 at the Faculty of Engineering. We chose accessibility because over 2.2 billion people worldwide live with visual impairment, and AI had reached a point where it could make a real, practical difference in their daily lives.
The question that shaped every decision: what if a blind user picked up this app for the very first time, alone? Most assistive tools fail at exactly that moment; they require sighted help to choose a language, read instructions, or navigate setup. We set out to eliminate that dependency completely.
The result is Abserny v3.1, a fully voice-first Android application in which no visual interaction is required at any point, from first launch through every day of use.
What We Offer
Real-time Scene Description
Powered by Gemini 2.0 Flash Lite, Abserny describes your environment in natural language, hazards first and spatially aware, in Arabic or English, with a median latency of 1,430 ms.
Works Offline, Always
When there's no internet, ML Kit steps in automatically. The app never fails silently; it routes to on-device detection and keeps speaking. No data connection required.
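The cloud-first, offline-fallback routing described above can be sketched as a simple dispatcher. This is a minimal illustration, not Abserny's actual code; the class and method names (`SceneRouter`, `Describer`, `describe`) are hypothetical.

```java
import java.util.function.BooleanSupplier;

// Illustrative sketch: route to a cloud describer when online,
// fall back to on-device detection on failure or when offline.
public class SceneRouter {
    public interface Describer { String describe(byte[] frame); }

    private final Describer cloud;        // e.g. a Gemini-backed describer
    private final Describer onDevice;     // e.g. an ML Kit-backed describer
    private final BooleanSupplier isOnline;

    public SceneRouter(Describer cloud, Describer onDevice, BooleanSupplier isOnline) {
        this.cloud = cloud;
        this.onDevice = onDevice;
        this.isOnline = isOnline;
    }

    // Never fails silently: any cloud error routes to on-device detection,
    // so the app always has something to speak.
    public String describe(byte[] frame) {
        if (isOnline.getAsBoolean()) {
            try {
                return cloud.describe(frame);
            } catch (RuntimeException e) {
                // Network or API failure: fall through to on-device path.
            }
        }
        return onDevice.describe(frame);
    }
}
```

The key design point is that the fallback is unconditional: whether the device is offline or the cloud call throws, the user still hears a description.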
Gesture-Only Control
Five gestures replace the entire UI. Double tap to scan. Long press to repeat. Triple tap for settings. Swipe to change mode. No screen interaction, no voice commands, no menus.
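A gesture vocabulary this small can be modeled as a fixed dispatch table. The sketch below is illustrative only; the enum values and the split of "swipe" into left/right for previous/next mode are assumptions, not Abserny's actual implementation.

```java
import java.util.EnumMap;
import java.util.Map;

// Illustrative sketch: five gestures, each bound to exactly one action.
public class GestureDispatcher {
    public enum Gesture { DOUBLE_TAP, LONG_PRESS, TRIPLE_TAP, SWIPE_LEFT, SWIPE_RIGHT }

    private final Map<Gesture, Runnable> actions = new EnumMap<>(Gesture.class);

    public GestureDispatcher(Runnable scan, Runnable repeat, Runnable settings,
                             Runnable prevMode, Runnable nextMode) {
        actions.put(Gesture.DOUBLE_TAP, scan);      // double tap to scan
        actions.put(Gesture.LONG_PRESS, repeat);    // long press to repeat
        actions.put(Gesture.TRIPLE_TAP, settings);  // triple tap for settings
        actions.put(Gesture.SWIPE_LEFT, prevMode);  // swipe to change mode
        actions.put(Gesture.SWIPE_RIGHT, nextMode);
    }

    public void on(Gesture g) {
        Runnable action = actions.get(g);
        if (action != null) action.run();
    }
}
```

Because the table is closed over five entries, every recognized gesture maps to exactly one spoken-feedback action, with no menus or modes to get lost in.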
Our Values
Gesture Vocabulary
Five gestures cover every function. The vocabulary is intentionally small; a larger set would be harder to remember and more error-prone.
Meet the Team
Seven students from the Faculty of Engineering, Computer Science Department, Academic Year 2025–2026.

Saeed Hany
Lead Developer & AI Engineer
Architecture, gesture system, AI integration, and hooks design

Omaretooo
Speech & Therapy
Without him, the team would be lost in the darkness

Mohamed Eid
Speech & Language Systems
TTS queue, bilingual support, Arabic prompt engineering

Bassant Wael
Research & Testing
User testing, gesture accuracy evaluation, accessibility validation

Rashida Hassan
Backend & Offline Systems
ML Kit integration, AbserneyVision training pipeline, offline routing

Hana Hamed
Onboarding & UX
Voice-first onboarding design, gesture tutorial, settings overlay

Mariam Suroor
Documentation & QA
Graduation book, latency benchmarks, quality evaluation
Join Our Journey
Abserny is an open graduation project. We welcome feedback from accessibility researchers, developers, and anyone with lived experience of visual impairment.
Contribute on GitHub