{"671507":{"#nid":"671507","#data":{"type":"news","title":"CS Majors Win \u0027Best Hack for Health\u0027 with App for the Visually Impaired","body":[{"value":"\u003Cp\u003EA team of first-year Georgia Tech computer science (CS) majors has created an AI-powered app that empowers visually impaired people to lead more independent lives.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EJARVIS \u2013 yes, the team members are big Iron Man fans \u2013 uses open-source AI tools, machine learning frameworks, and computer vision techniques to provide users with a richer, more thorough understanding of their immediate surroundings.\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECS majors Arnav Chintawar, Dhruv Roongta, and Sahibpreet Singh developed JARVIS as their entry for\u0026nbsp;Cal Hacks\u0026nbsp;10.0, a hackathon organized by a student group at the University of California, Berkeley.\u0026nbsp;The team won multiple awards, including Best Hack for Health, Best Use of Zilliz, and Best Use of GitHub, during the in-person event in late October, which attracted more than 2,000 participants.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe multifunctional app acts much like a personal assistant for users. It can be integrated with a smartwatch and can:\u003C\/p\u003E\r\n\r\n\u003Cul\u003E\r\n\t\u003Cli\u003ERecognize and interpret a person\u2019s environment, offering detailed scene descriptions.\u003C\/li\u003E\r\n\t\u003Cli\u003ERead text and make recommendations.\u003C\/li\u003E\r\n\t\u003Cli\u003ERecognize friends and family.\u003C\/li\u003E\r\n\u003C\/ul\u003E\r\n\r\n\u003Cp\u003EAlong with these capabilities, JARVIS can perceive, interpret, and describe the non-verbal cues of people near the user.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EA visually impaired family member inspired the team to develop JARVIS. 
Along with volunteering with local organizations, the team consulted advocates from the Center for the Visually Impaired to better understand the difficulties the community faces and how the app could help.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cWe set out to bridge the accessibility gap for blind and visually impaired individuals by giving them unprecedented situational awareness of their surroundings. We hope that JARVIS can improve the quality of life for this community,\u201d said Roongta.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe team started by making JARVIS easy to use. Responding to spoken queries, JARVIS could help a user meet a friend for dinner. The app would describe the general setting and layout of the restaurant and provide an approximate number of people present and the activities observed.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cWe used a speech-to-text and text-to-speech model similar to Siri or Alexa to ensure JARVIS would be easy to access and seem familiar to users,\u201d said Chintawar.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe user could then ask JARVIS to scan the room for friends, family, or others based on a database of images uploaded by the user. Once it recognizes someone, the app says their name and where they are in the room. The team estimates that the identity classification model behind this feature is about 95% accurate.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EJARVIS would then analyze the friend\u2019s facial expressions as the pair chats before ordering, conveying the detected emotions via audio descriptions or haptic pulses through the user\u2019s smartwatch. The pulses vary in intensity with the strength of the emotion observed.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EWhen ready to review the menu, the user could ask JARVIS to list the appetizers or the vegetarian options. 
This capability integrates optical character recognition technology with the team\u2019s text-to-speech model, allowing the app to make relevant recommendations.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cThis project has broadened our technical knowledge and instilled in us a profound sense of empathy and a commitment to enhancing the lives of visually impaired individuals,\u201d said Singh.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EBuilding on this success, the team is pushing JARVIS forward with an eye toward future entrepreneurial competitions. Planned upgrades include extending compatibility to a broader range of wearable computing devices and developing more robust description capabilities.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cWe look forward to participating in the 2024 Georgia Tech InVenture Prize competition with an improved version of JARVIS. This will likely include customizing the vision model and fine-tuning it on custom data,\u201d said Roongta.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAdditional details about the technologies behind JARVIS and the team\u2019s development approach are available on its Cal Hacks 10.0 hackathon\u0026nbsp;\u003Ca href=\u0022https:\/\/devpost.com\/software\/jarvis-wbfx7n\u0022\u003Edevelopment site\u003C\/a\u003E.\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EThree Georgia Tech CS majors have won multiple awards for an app they created as part of one of the largest student hackathons in the world. 
The team used machine learning frameworks and computer vision techniques to provide visually impaired users with a richer, more thorough understanding of their immediate surroundings.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"Three undergraduate students have won multiple awards for an app they created as part of one of the largest student hackathons in the world."}],"uid":"32045","created_gmt":"2023-12-11 13:07:35","changed_gmt":"2024-05-13 14:32:50","author":"Ben Snedeker","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2023-12-11T00:00:00-05:00","iso_date":"2023-12-11T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"672551":{"id":"672551","type":"image","title":"CS majors Arnav Chintawar, Sahibpreet Singh, and Dhruv Roongta","body":null,"created":"1702383368","gmt_created":"2023-12-12 12:16:08","changed":"1702383368","gmt_changed":"2023-12-12 12:16:08","alt":"CS majors Arnav Chintawar, Sahibpreet Singh, and Dhruv Roongta.","file":{"fid":"255805","name":"Screenshot 2023-12-08 at 3.38.06\u202fPM.png","image_path":"\/sites\/default\/files\/2023\/12\/12\/Screenshot%202023-12-08%20at%203.38.06%E2%80%AFPM.png","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/2023\/12\/12\/Screenshot%202023-12-08%20at%203.38.06%E2%80%AFPM.png","mime":"image\/png","size":886724,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2023\/12\/12\/Screenshot%202023-12-08%20at%203.38.06%E2%80%AFPM.png?itok=PQPlkHJ3"}}},"media_ids":["672551"],"related_links":[{"url":"https:\/\/youtu.be\/IwB2bDj1UpY?si=FcedDNUBARdJnx8e","title":"CS majors explain JARVIS, an app to help visually impaired people"}],"groups":[{"id":"47223","name":"College of Computing"}],"categories":[{"id":"193158","name":"Student Competition Winners (academic, innovation, and research)"}],"keywords":[{"id":"10199","name":"Daily Digest"},{"id":"11075","name":"The 
Whistle"}],"core_research_areas":[{"id":"39501","name":"People and Technology"}],"news_room_topics":[{"id":"71871","name":"Campus and Community"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EBen Snedeker, Communications Mgr.\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECollege of Computing\u003C\/p\u003E\r\n","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}