Code, create, and collaborate with a community to make your Sense-Hub more than the sum of its parts.
A forum designed to encourage discussion and put haptic and multi-sensory technology within reach of more developers.
We want to put that capability directly in the hands of developers and, with their help, popularize multi-sensory technology for VR and AR environments, making it easier for developers to add sensory input to their products and processes by integrating digital sensory databases directly with their products.
Sense-Hub is a platform created to host forums focused on open projects. It includes a dedicated library forum, called OpenSenseLibrary (similar in spirit to OpenGL, the Open Graphics Library used in computer graphics for application development, 3D environments, and games), where software and hardware developers can share and find support files containing haptic sensory information and models. These files are free to take, modify, save, share, and use in all your projects, alongside collections of subroutines used in the development of sensory applications, all to facilitate the sharing and development of haptic tools and the use of augmented sensory reality.
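As a purely hypothetical illustration (OpenSenseLibrary is a proposal, so the file format and every function name below are invented), a shared haptic "support file" could be as simple as a JSON description of a vibration pattern that any project can load, modify, and reuse:

```python
import json

# Hypothetical schema for a shared haptic pattern file: a name plus a
# list of (duration_ms, intensity) segments, where intensity is 0.0-1.0.
def save_pattern(name, segments, path):
    with open(path, "w") as f:
        json.dump({"name": name, "segments": segments}, f)

def load_pattern(path):
    with open(path) as f:
        data = json.load(f)
    return data["name"], [tuple(s) for s in data["segments"]]

def total_duration_ms(segments):
    # Total playback time of the pattern.
    return sum(d for d, _ in segments)

# A "heartbeat" buzz: strong pulse, pause, softer pulse, long pause.
heartbeat = [(100, 0.8), (80, 0.0), (100, 0.6), (400, 0.0)]
save_pattern("heartbeat", heartbeat, "heartbeat.json")
name, segments = load_pattern("heartbeat.json")
```

A format this small keeps patterns diff-friendly and easy to remix, which matters more than expressiveness for a community library.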
We aim to foster discussion and develop solutions in a forum: not to discuss therapies or medical devices, nor to inform end users, but to pursue proofs of concept and hyper-experimentation for developers seeking to push the boundaries of imagination and creativity in the use of augmented reality.

Human-computer interaction is undergoing a revolution, entering a multisensory era that goes far beyond the current paradigm. We seek to foster a forum that can accelerate this revolution. Voice, gestures, touch, haptics, force feedback, and many other sensors and actuators are now available, promising to simplify and simultaneously improve human interaction with computers. The revolution has begun, with touch- and gesture-based systems reinventing mobile phones and video games. But the pace of development, and the arrival of these advances on the market, has been painfully slow for users with sensory impairments (visual, hearing, and others). That is what we want to change. By promoting open-source development across the many interaction devices currently available (touch screens, motion sensors, speech recognition, and many others), we are working to create a forum to discuss open-source development capable of quickly and easily supporting the design and development of new user interfaces by merging the various types of input devices available. Sense-Hub allows developers to explore different possibilities for interaction.
Faster development means more iterations of a new interface on the way to a usable multimodal one. Developers can choose any interaction device they want to discuss and learn about.
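As a sketch of the kind of input merging described above (the device names and event format here are invented for illustration), events from several device streams can be combined into one time-ordered timeline that an interface prototype consumes:

```python
import heapq

# Each device produces (timestamp_ms, device, event) tuples, already
# sorted by time within its own stream.
touch = [(10, "touch", "tap"), (250, "touch", "swipe")]
speech = [(120, "speech", "open menu")]
motion = [(40, "motion", "wave"), (300, "motion", "point")]

# Merge the per-device streams into a single timeline; heapq.merge
# keeps the result sorted without concatenating and re-sorting.
timeline = list(heapq.merge(touch, speech, motion))
for ts, device, event in timeline:
    print(f"{ts:4d} ms  {device:6s} {event}")
```

Because `heapq.merge` is lazy, the same pattern works on live event streams, not just the fixed lists shown here.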
The idea is to attract people and open discussion threads in the forum (sense-hub.com). The goal is to promote and encourage the independent, experimental, and open-source development of haptic technology and augmented reality, especially technologies designed for users with sensory impairments (hearing, visual, and others), as tools for social inclusion, assistance, mobility, and independence in routine activities.
Greetings! You are welcome to join our hackathon!
Meet the IRIS Project
IRIS (Intuitively Regulated Intel System) will be one of the projects hosted on Sense-Hub, open to collaboration through a dedicated forum for a community of developers. The goal of the project is to create data models that improve the management of sensory information through haptic and augmented reality technologies. Using information gathered from the developer network, the models will be used to identify discrepancies, filter data, fill in data where information is missing, and build behavioral profiles, providing the tools to manage sensory information more efficiently.
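A minimal sketch of the three data operations named above, on an invented sensor trace: flag readings that deviate strongly from the median (discrepancies), drop them (filtering), and fill gaps with the mean of their nearest valid neighbours (completing missing data):

```python
from statistics import median

# Invented sensor trace: one spike (discrepancy) and one gap (None).
readings = [1.0, 1.2, 9.5, 1.1, None, 1.3]

def flag_discrepancies(xs, factor=3.0):
    vals = [x for x in xs if x is not None]
    m = median(vals)
    # A reading is a discrepancy if it lies far from the median.
    return [x is not None and abs(x - m) > factor * m for x in xs]

def filter_and_impute(xs):
    flags = flag_discrepancies(xs)
    # Filtering: treat flagged readings as missing too.
    cleaned = [None if f else x for x, f in zip(xs, flags)]
    out = []
    for i, x in enumerate(cleaned):
        if x is not None:
            out.append(x)
            continue
        # Imputation: mean of the nearest valid neighbours on each side.
        left = next((cleaned[j] for j in range(i - 1, -1, -1)
                     if cleaned[j] is not None), None)
        right = next((cleaned[j] for j in range(i + 1, len(cleaned))
                      if cleaned[j] is not None), None)
        neighbours = [v for v in (left, right) if v is not None]
        out.append(sum(neighbours) / len(neighbours))
    return out

cleaned = filter_and_impute(readings)
```

Real models for this job would learn per-sensor profiles rather than use a fixed median threshold; this only illustrates the pipeline shape.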
Sense-Hub also seeks to be a space for work on exoskeleton designs and soft wearable robots that provide external mechanical forces, driven by voluntary muscle signals, to assist the wearer's intended joint movement: deflecting objects, or helping compose and modulate movement more precisely, for example through haptic adjustments and guidance when reaching for a cup or performing a movement in a sports activity.
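A toy illustration of the control idea behind muscle-driven assistance (the signal values, threshold, and gains below are all invented): rectify and smooth a raw muscle signal, then command an assist torque proportional to activation above a threshold, capped at a safe maximum:

```python
# Toy muscle-signal assist loop: rectify, smooth with a moving average,
# and map activation above a threshold to an assist torque command.
def smooth(signal, window=3):
    rectified = [abs(s) for s in signal]
    out = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def assist_torque(activation, threshold=0.2, gain=5.0, max_torque=3.0):
    # No assist below threshold; proportional above it, capped for safety.
    if activation <= threshold:
        return 0.0
    return min(gain * (activation - threshold), max_torque)

raw = [0.05, -0.1, 0.4, -0.5, 0.6, 0.1]
torques = [assist_torque(a) for a in smooth(raw)]
```

The dead zone below the threshold is what keeps the device quiet during rest; real controllers add filtering, calibration per wearer, and rate limits on top of this shape.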