Viewing 13 posts - 1 through 13 (of 13 total)

    The most successful ways to teach the blind and visually impaired about maps seem to be tactile and 3D-printed maps, combined with physical reconstruction of the spaces through similar tactile means. I have come across a variety of tools that combine text-to-speech, augmented reality, and physical maps, but have not come across a successful, purely virtual map.

    I’m trying to learn the encumbrances of technologies that have attempted to move tactile maps to smartphones, but I am not finding much. Does anyone here know someone I could speak with to advance my understanding?


    Hi Stephen, I am going to try to connect you with someone who may know more through our chapter leaders in our community:
    @SRv @MMC_Kristina @alice @Butzuk @jlee245 @loretodalt @Bham_makerspace @mlevac @jbengall @ConnorM @zhangcu2 Do any of you know much about virtual maps?


    Thanks Zee!


    Like many makers and [OpenStreetMap](https://www.openstreetmap.org/) volunteers, I’ve been involved in attempts to make physical tactile maps: everything from etched PCBs and lasercut acrylic to 3D prints. TBH, most of the projects weren’t particularly outcome-based: they produced a prettier thing for sighted folks to feel good about rather than a useful device for vision-impaired folks to actually use. Additionally, for Braille users there isn’t a useful embossing technology that can be made in the makerspace: 3D printers have far too low a resolution for standard characters.

    While there are indoor navigation schemes that make use of beacons and Bluetooth positioning, they only work where installed.

    A comment I got from a visually-impaired tech user I volunteered with last year was that, while narrated navigation apps do exist, much of the underlying map data isn’t precise or detailed enough to be useful. The example he gave was that sidewalk trees aren’t universally mapped, and phone GPS accuracy is typically ±5 metres. This led to some unfortunate interactions as he traversed his neighbourhood.
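To make that precision problem concrete, here is a minimal sketch (the coordinates are hypothetical) of why a ±5 metre GPS fix cannot separate closely spaced sidewalk features: the great-circle distance between two points a few metres apart is smaller than the position uncertainty.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000  # mean Earth radius, metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Two sidewalk features roughly 3 m apart (hypothetical coordinates):
d = haversine_m(45.52310, -122.6765, 45.52313, -122.6765)

# With a typical +/-5 m phone GPS error, the position uncertainty
# exceeds the feature separation, so the two are indistinguishable.
ambiguous = d < 5.0
```

At this separation `d` comes out around 3.3 m, well inside the error radius, which is exactly the situation described above with unmapped or closely spaced sidewalk obstacles.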

    Commercial map databases have no incentive to make their map data more detailed. More accurate outdoor data can be submitted to OpenStreetMap (the basis of most map apps’ data), but it’s slow to collect and requires expensive, complicated equipment. Even the cheapest homebrew RTK GPS setup for roughly centimetre-range accuracy is around $1000 and demands a level of geekery well above mine. And that still leaves the accuracy of the user’s device …


    Thank you @SRv for such a thoughtful response.

    My thoughts in regard to mapping are provoked by the issues that you’ve just described. Tactile maps are not cost-effective and typically work better for the late blind, as the congenitally blind tend to use egocentric reference frames. BLE beacons are coming, but they pose the same issues you described. Also, recent studies suggest that habitual use of GPS negatively impacts spatial memory during self-guided navigation.

    The transfer of orientation and mobility knowledge for the blind between virtual and real environments is well established, and I can’t figure out why there aren’t more UIs that allow for environment exploration from the home. As you alluded to, there is a lack of trust in real-time navigation, and many of the projects I have come across seem to have been made in vain.

    I’m attempting to find research that demonstrates the comparative inefficacy of cross-modal interactions between two-dimensional objects like tablets/smartphones and three-dimensional objects like tactile maps. I’m not finding a lot, and it leaves me wondering if there is a gap in map UIs that could be valuable.


    @stephengduke, I reached out to our resource center for persons with disabilities, here are the responses:
    “Our talking map is an X-Y tablet that speaks the location of your finger when pressing on a 2D surface. A tactile map is usually placed on the surface and the user is able to feel buildings or roads and hear them described when pressed.

    Another area of virtualization uses Haptics. On a touch screen, for example, controlled vibrations are used to simulate the presence of objects.

    A robotic pen can restrict movement to simulate the feel of a 3D object. https://www.3dsystems.com/haptics-devices/touch

    Many different haptic devices are being used to guide users as they travel.”
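The X-Y talking-tablet interaction quoted above amounts to a hit-test from finger coordinates to named map regions. Here is a minimal sketch; the region names, coordinates, and the `on_press`/`speak` interface are all hypothetical, not taken from any actual product:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A named rectangular area of the tactile overlay, in tablet units."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical overlay calibrated to a campus tactile map
regions = [
    Region("Library", 0, 0, 40, 30),
    Region("Main Hall", 45, 0, 90, 30),
]

def on_press(x, y, speak=print):
    """Called with the pressed finger's coordinates; speaks the region label.

    `speak` stands in for a text-to-speech call.
    """
    for region in regions:
        if region.contains(x, y):
            speak(region.name)
            return region.name
    return None
```

A real device would replace the rectangles with the traced outlines of buildings and roads, but the lookup-then-speak loop is the core of the interaction.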

    Does this help? Thanks.


    Very helpful, thanks so much!


    @stephengduke Hi Stephen, more information from the director of the resource center, who is himself visually impaired and is sharing his own experience: “The best blindness-related maps I experience are those which have raised attributes and a paired touch pad, as Stephen describes. That is what we would have shown in the ATC here. The Iveo tablet has great promise which we are still working to fully deliver upon.

    We have a talking globe here also that Dr. Hwang made accessible with clear glue as a boundary marker. It was purchased from Amazon as a smart globe and his work made it viable for blind students.

    Touch screen devices like iPhone and iPad have great promise also. I have had several such apps but didn’t find any of them to be game changers. I like the possibility there with a larger iPad screen but there is still no substitute to embossed maps with a touch pad to add voice to the maps.”


    I’ve been watching this message board for a while, but this is my first time posting. I’m Michael Cantino. I’m an accessibility specialist and braille transcriber in Portland, Oregon.
    I think I can help with this one! Prepare for info dump!

    I worked on a research project focused on [Interactive Printed Models](https://www.interactiveprintedmodels.com/) for visually impaired learners. We used a Blender add-on to add annotations to 3D models, and we loaded those annotations into an augmented reality app. We then 3D printed the models and affixed a QR code tracker cube. Using the app, users could explore models and trigger the annotations with their finger and voice.
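A rough sketch of how that kind of annotation triggering might work under the hood, assuming annotations are exported as labelled 3D anchor points and the app tracks the fingertip in model coordinates. The JSON shape, `nearest_annotation` function, and distance threshold here are all illustrative, not the project's actual format:

```python
import json
import math

# Hypothetical annotation export: each entry pairs a label/description
# with a 3D anchor point on the printed model (model coordinates, mm).
annotations = json.loads("""
[
  {"label": "River",  "description": "The river bisects the map.",  "point": [12.0, 4.5, 3.0]},
  {"label": "Bridge", "description": "Crossing at the north end.",  "point": [12.0, 30.0, 5.0]}
]
""")

def nearest_annotation(finger_xyz, max_dist_mm=8.0):
    """Return the annotation closest to the tracked fingertip, if any is in range."""
    best, best_d = None, max_dist_mm
    for ann in annotations:
        d = math.dist(finger_xyz, ann["point"])
        if d <= best_d:
            best, best_d = ann, d
    return best

hit = nearest_annotation([12.5, 5.0, 3.2])  # fingertip near the "River" anchor
```

The AR tracking (QR cube pose plus fingertip detection) does the hard work of producing `finger_xyz`; once you have it, triggering the right spoken description is just a nearest-point query like this.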

    [Augmented Reality and Virtual Reality Research Articles](https://drive.google.com/drive/folders/1G8dU4HftsvFNkbk18j0koxgHxUXono5y?usp=sharing)
    [Educators in VR article with tons of resource links](https://educatorsinvr.com/2019/05/31/accessibility-disabilities-and-virtual-reality-solutions/)
    The first link is a folder of research articles I reviewed toward the end of the project. These are some of the best examples I’ve seen of virtual maps for the blind, and there are several different approaches. Er et al. use audio and haptic feedback to explore virtual maps. Kreimeier and Götzelmann use a virtual reality headset and walk-in-place locomotion to explore virtual maps. Zhao et al. explore the use of a haptic cane virtual reality controller. Kunz et al. use a VR headset, a typical cane, and haptic feedback to explore virtual maps. Lots of really interesting stuff to explore!

    [3D Printed Tactile Maps Research Articles](https://drive.google.com/drive/folders/1tKxqKQDJcFGYnf1z_YJuMPDX4MLMlnjY?usp=sharing)
    Tactile maps are widely used for orientation and mobility training, and they are cost effective to produce. Tactile maps are most often used to study routes before and after visiting a location with an Orientation and Mobility specialist. Interpreting tactile graphics is a skill, and complex graphics can be difficult to decipher. People’s effective use of tactile graphics, in my experience, is directly related to their experience and familiarity with braille and tactile graphics.

    With a braille embosser, you can produce maps for pennies per page. Embossed graphics are possibly the most common medium for graphics, but they can be difficult to interpret. The [IVEO Tactile Tablet](https://exceed.lv/index.php/en/produkti/braila-produkti/braila-printeri/view/266:braille-printer-viewplus-iveo) that Tracy mentioned and the [Talking Tactile Tablet](https://touchgraphics.com/portfolio/ttt/) are excellent tools for making interactive embossed tactile graphics or mixed-media “collage graphics” that provide the user with important information as they examine a graphic. The tablets aren’t very portable, though. I think you might be interested in the Götzelmann articles in the 3D printed articles folder. He tries a few different approaches using smartphones and small tactile maps to make portable, interactive maps.
    Another common medium for tactile maps is microcapsule paper or “swell touch” paper: ink on the paper swells when heated to create raised lines and braille. The machines are very easy to work with, but expensive, and the paper itself is $1-2 per sheet depending on size.

    In terms of equipment and materials, 3D printing might actually be the most cost effective medium. I regularly print high quality tactile graphics and braille on budget printers with no modifications. 3D printed graphics take longer to print and a little longer to design, but material costs are in line with microcapsule paper, about $1-2 per map depending on size.
    3D printed tactile maps also have a ton of benefits. The ability to exaggerate the height of important symbols and areas is amazing. On an embossed page (even with a nice embosser), you can vary line and pattern heights only slightly, even though height variation is an important method for conveying complex graphic information. That limitation doesn’t exist with 3D printing. I’m fortunate enough to always be in collaboration with blind friends, staff, and students, and the ease of use and clarity offered by raising the height of symbols is invariably the most popular feature of 3D printed maps.
    3D maps are also durable and more portable. Paper-based graphics can be creased or damaged if not handled carefully. 3D printed maps can be thrown in a bag and then studied while traveling.
    3D printers can print lovely, smooth braille! There are some tricks though. To get smooth, round domes, you should avoid printing the braille parallel to the print surface. If you want to print braille parallel to the print bed, using flat cylinders seems to produce smoother braille, but it’s less durable. Benetech put together this excellent [document about 3D printing braille](https://drive.google.com/file/d/192qh75UKhT9zqWizkT3ml76DNjaRmaLp/view?usp=sharing). I usually use [this braille font module](https://www.thingiverse.com/thing:3490757). The module produces braille at the standard size, so don’t rescale the braille when adding labels to models.
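As a rough illustration of laying out braille for a 3D model, here is a sketch that converts Unicode braille characters into dot centre coordinates. The 2.5 mm dot spacing and 6.0 mm cell spacing are commonly cited standard dimensions, not taken from the linked font module, so check the Benetech document above before printing:

```python
DOT_SPACING = 2.5   # mm between adjacent dots within a cell (assumed standard)
CELL_SPACING = 6.0  # mm between corresponding dots of adjacent cells (assumed)

def dot_positions(text, origin=(0.0, 0.0)):
    """Return (x, y) centres, in mm, for the raised dots of a Unicode braille string.

    Dots 1-6 map to the low six bits of the code point's offset from U+2800:
    bits 0-2 are the left column top-to-bottom, bits 3-5 the right column.
    """
    ox, oy = origin
    pts = []
    for i, ch in enumerate(text):
        mask = ord(ch) - 0x2800
        for dot in range(6):
            if mask & (1 << dot):
                col, row = divmod(dot, 3)
                x = ox + i * CELL_SPACING + col * DOT_SPACING
                y = oy - row * DOT_SPACING
                pts.append((x, y))
    return pts
```

For example, '⠁' (dot 1 only) yields a single centre at the cell origin. Extruding small domes at these centres in a CAD tool gives standard-pitch braille; per the advice above, keep the dot size unscaled when adding labels to a model.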

    To improve upon 3D models even further, many approaches are being used to offer supplemental information through interaction. You can read more in the 3D research folder. Whether through embedded electronics, touch screen overlays, or AR overlays, interactive 3D printed models offer an experience similar to the IVEO graphics, but in 3D.

    GPS guided navigation is a very useful tool, but it has its limitations, as previously noted. I ran into an engineer named Kelvin Crosby who made [an incredible Smart Guider Cane](https://www.thesmartguider.com/). He’s using lidar to detect objects, and then swivels the cane tip to direct users away from obstacles.

    I hope this helps! I think I actually still have more resources if you want any more information.

    Michael Cantino


    Thank you so much Michael for the response! I love the Interactive Printed Models that you linked up and the work you did with them. This is all some great information and I’m still in the process of taking it all in! You’re a peach in a garden of lemons.


    Thx for your info @mcantino and welcome to our community! So glad to have someone with your expertise involved!


    Happy to be here! Sorry for the delayed response. Fall term has been really busy!

    I hope the information is useful. Stephen, I had some thoughts about a couple of your previous questions.

    > I’m trying to learn the encumbrances of technologies that have attempted to move tactile maps to smartphones, but I am not finding much.


    > I’m attempting to find research that proves the comparative inefficacy of cross-modal interactions between two-dimensional objects like tablets/smartphones and three-dimensional objects like tactile maps. I’m not finding a lot, and it leaves me wondering if there is a gap in map UI’s that could be valuable.

    I do think there is a gap here. The research articles I posted, while acknowledging any known flaws in their respective approaches, overall seem to find that interactive tactile maps are effective learning tools. It seems that the barrier to wider adoption is not the efficacy of these maps. I think the primary barriers are:

    * Authoring tools and production methods are too complex to be easily adopted.

    * Access to the equipment and software required to author or interact with the maps is limited.

    Many of the methods in the [3D Printed Maps folder](https://drive.google.com/drive/folders/1tKxqKQDJcFGYnf1z_YJuMPDX4MLMlnjY?usp=sharing) involve electronics embedded into 3D printed maps or some other complex production process. Some studies feature automated tools that simplify the preparation process, but it seems those tools have not been maintained after completion of the studies. The high learning curve that users face when trying to implement these approaches is likely enough to prevent wider adoption.

    Some people may not have access to the necessary equipment to reproduce some of these approaches, and many authoring tools aren’t accessible, preventing blind users from creating their own maps. To simplify his production process, Götzelmann recommends using a 3D printer with a dual extruder to print conductive filament wires into the map. Printers that are capable of this are few and far between. In the [AR/VR Folder](https://drive.google.com/drive/folders/1G8dU4HftsvFNkbk18j0koxgHxUXono5y?usp=sharing), Thevin and Brock use a very simple authoring and production process to augment typical raised-line tactile graphics, but their approach requires at least a camera and a projector (and their authoring software). They also developed a method that allows blind users to create their own maps to use with their system. While this is a very promising approach, projectors aren’t necessarily a common household object.

    Smartphones and tablets have been so widely adopted that they seem like an excellent tool to leverage for interactive tactile maps. The [project I worked on](https://www.interactiveprintedmodels.com/) required a smartphone or tablet with a stand, QR code stickers, and 3D printed models. The authoring process was a little complex, and it wasn’t accessible. I found this approach to be very useful, but unfortunately I don’t think it will be publicly available any time soon.

    Some sort of simple UI for creating or utilizing interactive tactile maps or other objects would be fantastic!


    Hey @mcantino! My name is Tyler and I am one of the engineering interns at MMC. First, I want to thank you for the excellent information you shared in this post. I think there is a lot of potential to bring this information to several future projects at MMC, in particular a [braille calculator](https://makeymakey.com/blogs/how-to-instructions/makey-makey-braille-calculator-by-tracy-zhang) device that was developed by @zhangcu2. I am working on some documentation for the braille calculator device so it can be posted on our website, but I had some problems sourcing braille stickers or finding a DIY braille option. I think 3D printed tiles that could be glued to the calculator would be a fantastic option. @mcantino, would you have some time in the future to chat about this and some questions I have around braille? If so, please feel free to reach out to me here or at [email protected]
