Audio Atlas Project: find your way more easily through our stations

The Audio Atlas project was created as part of our accessibility mission. The concept? To offer any passenger who has trouble moving through our spaces, whether because of a disability or simply because they are unfamiliar with our network, an app that guides them to a specific platform, exit or connection within one of our metro or RER stations. We are already testing this new app.

The Audio Atlas project

The Audio Atlas is designed for passengers who need assistance as they move through our metro and RER stations. Whether you have a disability (sight-impaired passengers, wheelchair users, etc.) or are simply not very familiar with our network, the Audio Atlas is made for you.

We are conducting this joint project with Urbilog, a specialist in digital accessibility for persons with disabilities, and the Technology, Handicaps, Interfaces and Multimodality (THIM) department of the University of Paris 8.


This line offers the following opportunities:

  • Historic stations modernised (addition of elevators, etc.)
  • New stations accessible to passengers with all types of disability
  • Four station types (complex, intermediate, simple)
  • A line with the S3A accessibility label
  • An extended line
[Visual: the audio companion]

Until this solution is rolled out, the website lists the routes for 121 stations.

How does the Audio Atlas work?

The Audio Atlas is an audio guide rather similar to those found in certain museums. It allows you to move through our spaces, segment by segment, according to the route best adapted to your profile (sight impaired, passengers who must use elevators instead of stairs, occasional users, foreign tourists, etc.). 
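
As an illustration only (the names, types and selection logic below are hypothetical, not the app's actual design), a route adapted to a passenger profile can be thought of as a sequence of segments filtered by accessibility constraints:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One step of a route through a station, with a spoken instruction."""
    description: str
    has_stairs: bool = False
    has_elevator: bool = False

@dataclass
class Profile:
    """Passenger profile used to pick the best-adapted route."""
    avoid_stairs: bool = False

def adapted_route(candidates: list[list[Segment]], profile: Profile) -> list[Segment]:
    """Return the first candidate route compatible with the profile."""
    for route in candidates:
        if profile.avoid_stairs and any(seg.has_stairs for seg in route):
            continue  # skip routes that include stairs for this profile
        return route
    raise ValueError("no accessible route found")

# Two candidate routes to the same platform: one via stairs, one via an elevator.
with_stairs = [Segment("Take the stairs down to the platform", has_stairs=True)]
with_lift = [Segment("Take the elevator down to level -1", has_elevator=True)]

route = adapted_route([with_stairs, with_lift], Profile(avoid_stairs=True))
print(route[0].description)  # the elevator route is selected
```

A real implementation would carry many more constraints per segment (escalators, gaps, lighting), but the principle of filtering candidate routes against a profile is the same.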

A specific feature also triggers real-time alerts, thanks to mobile phone sensors or external equipment such as Li-Fi (light fidelity), a LED-based wireless data-transmission technology. The purpose of these alerts is to strengthen localisation, to warn of difficult segments, and to provide additional information to describe complex spaces.

Our partner THIM is also working on the automatic calculation of routes based on maps, using a principle rather similar to that of optical character recognition (OCR) tools.
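
Once a station map has been converted into a graph of walkable segments, computing a route reduces to a shortest-path search. A minimal sketch using Dijkstra's algorithm, with an invented station layout and distances purely for illustration:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path: nodes are landmarks, edge weights are walking distances."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, dist in graph.get(node, {}).items():
            if neighbour not in seen:
                heapq.heappush(queue, (cost + dist, neighbour, path + [neighbour]))
    return None  # goal unreachable from start

# Invented layout: entrance -> hall -> (stairs or elevator) -> platform.
station = {
    "entrance": {"hall": 20},
    "hall": {"stairs": 10, "elevator": 15},
    "stairs": {"platform": 5},
    "elevator": {"platform": 5},
}
print(shortest_route(station, "entrance", "platform"))
# → (35, ['entrance', 'hall', 'stairs', 'platform'])
```

The hard part of THIM's work is upstream of this search: extracting the graph itself from map imagery, which is where the analogy with OCR comes in.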

The next tests will be conducted in June 2017. We will keep you updated!