{"688648":{"#nid":"688648","#data":{"type":"news","title":"New \u2018Touchable Sound\u2019 Museum Display Makes Data More Accessible","body":[{"value":"\u003Cp\u003EBlind and low vision (BLV) people may soon have access to and more easily understand scientific data in museum exhibits through new \u201ctouchable sound\u201d displays.\u003C\/p\u003E\u003Cp\u003EAssociate Professor Jessica Roberts and Ph.D. student Emily Amspoker of Georgia Tech\u2019s School of Interactive Computing are working with the \u003Ca href=\u0022https:\/\/gacoast.uga.edu\/\u0022\u003E\u003Cstrong\u003EUniversity of Georgia\u2019s Marine Extension and Georgia Sea Grant in Savannah\u003C\/strong\u003E\u003C\/a\u003E. Together, they\u2019ve developed a prototype display that uses sonification and texture to convey sea floor habitat information from \u003Ca href=\u0022https:\/\/graysreef.noaa.gov\/\u0022\u003E\u003Cstrong\u003EGray\u2019s Reef National Marine Sanctuary\u003C\/strong\u003E\u003C\/a\u003E off the coast of Georgia.\u003C\/p\u003E\u003Cp\u003ESonification is the process of translating data points into sound.\u003C\/p\u003E\u003Cp\u003EThe display functions as a map that BLV users can follow to learn about each habitat. It is made from a wooden board with laser-cut patterns engraved into the surface. Each pattern represents information about the four types of habitats found in Gray\u2019s Reef. 
Each pattern has a distinct sound that corresponds to a legend on the board, which provides an audio description of each habitat.\u003C\/p\u003E\u003Cp\u003EThe four habitats are:\u003C\/p\u003E\u003Cul\u003E\u003Cli\u003EFlat sand \u2014 smooth sandy seafloor with little topographic variation that provides habitat for burrowing organisms such as worms, clams, and sand dollars.\u003C\/li\u003E\u003Cli\u003ERippled sand \u2014 sandy bottom shaped into small wave-like ridges by currents and wave action; supports microhabitats of small invertebrates and attracts fish feeding on buried prey.\u003C\/li\u003E\u003Cli\u003ESparse live bottom \u2014 areas of exposed hard surfaces with scattered attached organisms like sponges, corals, and algae, offering structure and shelter for reef-associated fish and invertebrates.\u003C\/li\u003E\u003Cli\u003EDense live bottom \u2014 hard-bottom reef areas with abundant attached marine life, providing high biodiversity and offering food and breeding sites for numerous species.\u003C\/li\u003E\u003C\/ul\u003E\u003Cp\u003EBy allowing learners to explore these environments, the team hopes to emphasize the importance of protecting diverse ocean habitats.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003E\u201cOur job was to figure out how we can use sounds and touch to represent each of the four habitat types so our visitors can explore the ocean without being able to see it,\u201d she said.\u003C\/p\u003E\u003Cp\u003ERoberts said the project is critical to advancing understanding of how science and informal learning can be more inclusive of those who have difficulty processing visual data displays.\u003C\/p\u003E\u003Cdiv\u003E\u003Cdiv\u003E\u003Cp\u003E\u201cThis was particularly exciting to figure out how we could broaden accessibility to data sets because just like so much other scientific data, it\u2019s out there and available, but when it\u2019s presented to the public, it\u2019s usually in visual form,\u201d she said. 
\u201cThere are many open questions about how to do this well within a museum with complex scientific data. We\u2019re moving the needle on that, but there\u2019s a long way to go.\u201d\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003ERight Combination\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003EAmspoker and Roberts created three different versions of the prototype. One was sound-only, one was texture-only, and the other was a combination of sound and texture.\u003C\/p\u003E\u003Cp\u003E\u201cWe expected the multimodal version would work best,\u201d Amspoker said. \u201cWe found people used sound and texture in different ways when interacting with it. In cases where people relied on texture, it was still difficult to tell when they crossed the barrier from one texture to another. Sound was very useful in that case.\u201d\u003C\/p\u003E\u003Cp\u003EAmspoker said computer vision and an app she designed allow the technology to be deployed on any surface, whether a mobile device, a wooden board, or even a classroom floor. A camera set up above the display tracks the user\u2019s hand movements.\u003C\/p\u003E\u003Cp\u003E\u201cIt figures out where you are on the board, and then our code uses the location of your finger to decide what sound should play from the computer,\u201d she said. \u201cWhat\u2019s nice about our system is it only needs a computer and a webcam, and you can use whatever materials you have on hand for the map.\u201d\u003C\/p\u003E\u003Ch4\u003E\u003Cstrong\u003EBuilding on a Legacy\u003C\/strong\u003E\u003C\/h4\u003E\u003Cp\u003ERoberts said she is building on the work of a previous NSF-funded collaboration with Dr. 
Amy Bower, a senior scientist at the Woods Hole Oceanographic Institution in Massachusetts who is blind.\u003C\/p\u003E\u003Cp\u003EBower lost her vision in graduate school, but because of her lifelong interest in oceanography, she set out to create ways to learn about ocean data through sound.\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EIn 2021, she launched the \u003Ca href=\u0022https:\/\/accessibleoceans.whoi.edu\/\u0022\u003E\u003Cstrong\u003EAccessible Oceans\u003C\/strong\u003E\u003C\/a\u003E project through the National Science Foundation\u2019s Advancing Informal STEM Learning program. The interdisciplinary team, including Roberts and collaborators Leslie Smith of Your Ocean Consulting and Jon Bellona of the University of Oregon, created auditory displays of sonified data for museums.\u003C\/p\u003E\u003Cp\u003EIn 2023, the team published \u003Ca href=\u0022https:\/\/tos.org\/oceanography\/article\/expanding-access-to-ocean-science-through-inclusively-designed-data-sonifications\u0022\u003E\u003Cstrong\u003Ean article in \u003C\/strong\u003E\u003Cem\u003E\u003Cstrong\u003EOceanography,\u003C\/strong\u003E\u003C\/em\u003E\u003Cstrong\u003E the official magazine of the Oceanography Society\u003C\/strong\u003E\u003C\/a\u003E.\u003C\/p\u003E\u003Cp\u003E\u201cInformal learning environments are increasingly recognizing the importance of employing multiple modalities to engage all learners and are leveraging sound to enhance visitor experience,\u201d the authors wrote.\u003C\/p\u003E\u003Cp\u003E\u201cWhile sonic additions of music, soundscapes, and field recordings add qualitative value, there is a need to explore the potential of sound to facilitate engagement with quantitative information. 
Data sonification is a promising avenue for increasing accessibility to data within the museum context.\u201d\u003C\/p\u003E\u003C\/div\u003E\u003C\/div\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EGeorgia Tech researchers have created a prototype \u201ctouchable sound\u201d museum exhibit that helps blind and low-vision visitors explore scientific data by combining tactile maps with sonification of seafloor habitats. The display translates information about different ocean environments into distinctive textures and sounds so users can follow a physical map of Gray\u2019s Reef National Marine Sanctuary and hear data-driven audio cues. The team hopes this multimodal approach will make complex visual data more inclusive and broaden access to informal science learning.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Georgia Tech researchers have developed a prototype \u201ctouchable sound\u201d museum display that uses sonification and tactile maps to make complex scientific data about ocean habitats more accessible to blind and low-vision visitors."}],"uid":"36530","created_gmt":"2026-03-03 15:13:03","changed_gmt":"2026-03-20 12:52:09","author":"Nathan Deen","boilerplate_text":"","field_publication":"","field_article_url":"","location":"Atlanta, GA","dateline":{"date":"2026-03-03T00:00:00-05:00","iso_date":"2026-03-03T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"679503":{"id":"679503","type":"image","title":"2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg","body":null,"created":"1772550793","gmt_created":"2026-03-03 15:13:13","changed":"1772550793","gmt_changed":"2026-03-03 15:13:13","alt":"Jessica 
Roberts","file":{"fid":"263675","name":"2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg","image_path":"\/sites\/default\/files\/2026\/03\/03\/2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2026\/03\/03\/2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg","mime":"image\/jpeg","size":118705,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2026\/03\/03\/2026-Jessica-Roberts-Reef-Data-Sonification-2.jpg?itok=UaqIj7yh"}}},"media_ids":["679503"],"groups":[{"id":"47223","name":"College of Computing"},{"id":"1188","name":"Research Horizons"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[{"id":"153","name":"Computer Science\/Information Technology and Security"}],"keywords":[{"id":"360","name":"accessibility"},{"id":"194701","name":"go-resarchnews"},{"id":"9153","name":"Research Horizons"},{"id":"9092","name":"museums"},{"id":"181370","name":"oceanography"},{"id":"176552","name":"data sonification"},{"id":"1102","name":"blind"},{"id":"2751","name":"visually impaired"}],"core_research_areas":[{"id":"39501","name":"People and Technology"}],"news_room_topics":[{"id":"71881","name":"Science and Technology"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}