Feedback about: Virtual maps for the blind and visually impaired


Like many makers and [OpenStreetMap](https://www.openstreetmap.org/) volunteers, I've been involved in attempts to make physical tactile maps: everything from etched PCBs and lasercut acrylic to 3D prints. TBH, most of those projects weren't particularly outcome-driven: they produced something pretty for sighted folks to feel good about rather than a useful device for vision-impaired folks to actually use. And for Braille users, there's no useful embossing technology a makerspace can offer: 3D printers have far too low a resolution for standard Braille characters.

While there are indoor navigation schemes that use beacons and Bluetooth positioning, they only work where that infrastructure has been installed.

A visually impaired tech user I volunteered with last year pointed out that, while narrated navigation apps do exist, much of the underlying map data isn't precise or detailed enough to be useful. The example he gave: sidewalk trees aren't universally mapped, and phone GPS accuracy is typically ±5 metres. This led to some unfortunate interactions as he traversed his neighbourhood.
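If you want to check how (un)mapped the trees in your own neighbourhood are, OpenStreetMap data can be queried directly. A minimal sketch using the public Overpass API: the endpoint and the `natural=tree` tag are real OSM conventions, but the coordinates below are arbitrary placeholders, and the helper names are my own.

```python
# Sketch: count natural=tree nodes OSM knows about near a point,
# via the public Overpass API. Coordinates are placeholders.
import json
import urllib.request

def overpass_tree_query(lat, lon, radius_m=100):
    """Build an Overpass QL query for natural=tree nodes near a point."""
    return (
        "[out:json];"
        f'node["natural"="tree"](around:{radius_m},{lat},{lon});'
        "out;"
    )

def count_nearby_trees(lat, lon, radius_m=100):
    """POST the query to the main Overpass endpoint; return the node count."""
    url = "https://overpass-api.de/api/interpreter"
    data = overpass_tree_query(lat, lon, radius_m).encode()
    with urllib.request.urlopen(url, data=data) as resp:
        return len(json.load(resp)["elements"])
```

In well-surveyed city centres this returns plenty of hits; on many residential sidewalks it returns zero, which is exactly the gap he was describing.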

Commercial map providers have little incentive to make their data more detailed. More accurate outdoor data can be submitted to OpenStreetMap (the basis of most map apps' data), but it's slow to collect and needs expensive, complicated equipment. Even the cheapest homebrew RTK GPS setup for roughly centimetre-level accuracy runs about $1000 and demands levels of geekery well above mine. And even then, you're still limited by the accuracy of the user's device …