INTEGRATED SMART HELMET AND METHODS AND SYSTEMS OF CONTROLLING SMART HELMET
20220047028 · 2022-02-17
CPC classification
H04W4/90 (ELECTRICITY)
B62J43/30 (PERFORMING OPERATIONS; TRANSPORTING)
Abstract
Exemplary embodiments of the present disclosure are directed towards an integrated smart helmet system comprising a control unit-PCB 201 wirelessly connected to a computing device 303 over a network 305. The computing device 303 is configured to enable a user to use different functionalities without having to remove a helmet 102 to access the computing device 303, and the control unit-PCB 201 is configured to detect crashes while the user is wearing the helmet 102 and to notify crash-detection information to the computing device 303 over the network 305. Buttons 213a-213d are positioned at the rear or side of the helmet 102 and are electrically coupled to the control unit-PCB 201; the buttons 213a-213d are configured to initiate prompts directing the user to put away the computing device 303 while driving and to disable certain dangerous functions.
Claims
1. An integrated smart helmet system, comprising: a control unit-PCB 201 wirelessly connected to a computing device 303 over a network 305, whereby the computing device 303 is configured to enable a user to use different functionalities without having to remove a helmet 102 to access the computing device 303, and the control unit-PCB 201 is configured to detect crashes while the user is wearing the helmet 102 and to notify the crash-detection information to the computing device 303 over the network 305; and buttons 213a-213d positioned at the rear or on the side of the helmet 102, wherein the control unit-PCB 201 is electrically coupled to the buttons 213a-213d, the buttons 213a-213d being configured to initiate prompts directing the user to put away the computing device 303 while driving and also to disable certain dangerous functions.
2. The integrated smart helmet system as claimed in claim 1, wherein the control unit-PCB 201 comprises a wireless communication device 306 configured to transmit and receive information via the network 305.
3. The integrated smart helmet system as claimed in claim 1, wherein the control unit-PCB 201 is electrically coupled to a microphone 205 configured to record an SOS signal.
4. The integrated smart helmet system as claimed in claim 1, wherein the control unit-PCB 201 is electrically coupled to ultrasonic sensors 211a-211b placed at blind-spot detection angles and configured to scan for and report objects in the user's blind spot.
5. The integrated smart helmet system as claimed in claim 1, wherein the control unit-PCB 201 is electrically coupled to at least two front LED indicators 209a-209b and at least one rear LED indicator 209c, the LED indicators being configured to direct the user to put away the computing device 303 while driving and also to disable certain dangerous functions.
6. The integrated smart helmet system as claimed in claim 1, wherein the smart helmet 102 comprises a remote button 402 paired via an RF chip circuit 404 and a battery 405, the remote button 402 being placeable on a handlebar.
7. The integrated smart helmet system as claimed in claim 1, wherein the computing device 303 comprises a data processing module 310 configured to compute and store data from the smart helmet 102 by leveraging the processing power of the user's computing device 303.
8. A method, comprising: activating a smart helmet 102; pairing the smart helmet 102 with a computing device 303 of a user; commencing the ride by the user; activating a gyroscope sensor 215 upon commencement of the ride; obtaining the relative mean head position of the user; calculating where the user is looking at a given point in time; and calculating the exact location of the smart helmet 102 and its angle relative to the riding position.
9. The method as claimed in claim 8, further comprising a step of allowing the user to select a timer to cancel the SOS.
10. The method as claimed in claim 8, further comprising a step of updating the live location from GPS.
11. The method as claimed in claim 8, further comprising a step of sending the location-based notifications to emergency contacts via the data processing module 310.
12. A method, comprising: activating a smart helmet 102; pairing the smart helmet 102 with a computing device 303 of a user; commencing the ride by the user; detecting a head-on crash greater than a threshold limit by an accelerometer 219; authenticating data readings from the accelerometer 219 and motion detection from the GPS location after the impact, and then determining whether the detected motion corresponds to a passive crash or an active crash; cross-referencing the accelerometer data readings with the data processing module 310 at the computing device 303, checking whether the user is in motion by referring to a gyroscope sensor 215, and then updating the GPS live location when the user does not cancel the crash protocol before timeout; if it is the active crash, sending the location-based notifications to emergency contacts via the data processing module 310; and if it is the passive crash, sending a notification to the user that the helmet was dropped when the user does not cancel the crash protocol before timeout.
13. The method of claim 12, further comprising concussion detection, including the steps of: detecting an angular velocity greater than a threshold limit by a gyroscope sensor 215; measuring head position and angular velocity; accessing GPS data before and after the impact; identifying linear acceleration and position; determining a concussion from the combined GPS, accelerometer 219, and gyroscope sensor 215 data; sending a concussion alert to a coach and/or medical support; and detecting a head-on crash greater than a threshold limit by the accelerometer 219 and identifying the impact and linear acceleration.
14. The method of claim 12, further comprising rotational head movement detection during a crash, including the steps of: detecting an angular velocity greater than a threshold limit by the gyroscope sensor 215; allowing the user to select a timer to cancel the crash protocol; and sending notifications to the emergency contacts by the data processing module 310.
15. The method of claim 12, further comprising ride detection, including the steps of: enabling the smart helmet to enter an active state and listen for an SOS; accessing GPS and geofence data from the computing device 303; checking for the smart helmet connection with the wireless communication device 306; and sending notifications to the computing device 303 that the smart helmet is not connected.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0048] Other objects and advantages of the present invention will become apparent to those skilled in the art upon reading the following detailed description of the preferred embodiments, in conjunction with the accompanying drawings, wherein like reference numerals have been used to designate like elements, and wherein:
DETAILED DESCRIPTION
[0067] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[0068] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
[0069] The drawing figures are intended to illustrate the general manner of construction and are not necessarily to scale. In the detailed description and in the drawing figures, specific illustrative examples are shown and herein described in detail. It should be understood, however, that the drawing figures and detailed description are not intended to limit the invention to the particular form disclosed, but are merely illustrative and intended to teach one of ordinary skill how to make and/or use the invention claimed herein and for setting forth the best mode for carrying out the invention.
[0070] In accordance with various non-limiting exemplary embodiments of the present subject matter, helmet systems and methods are disclosed wherein the helmet systems are integrated with wireless communication devices connecting the helmet system with devices such as mobile phones, thus allowing users to use the different functionalities of their devices without having to remove their helmets every time.
[0073] In accordance with one or more exemplary embodiments, the smart helmet provides a crash detection feature to the user. The smart helmet detects any blow to the head while the user wears the smart helmet. Whenever the user has a crash while wearing the smart helmet, it is automatically detected by the control unit-PCB 201, which notifies this information to the emergency contacts and shares the exact location of the user after the crash is detected. The control unit-PCB 201 initiates the protocol, and the data processing module (not shown) on the computing device (not shown) sends location-based notifications to the preassigned emergency contacts. The location-based notifications may include, but are not limited to, SMS, email, alerts, and so forth. The smart helmet communicates with the data processing module (not shown) via the network (not shown). The smart helmet may distinguish between the user accidentally dropping the helmet and being involved in a fatal crash. This is done by a proprietary algorithm and code which cross-references the impact information and the location of the user with the GPS movements and speed of the user on the computing device (not shown) and classifies whether the user is driving or not. If the data generated by the accelerometer 219 and the computing device (not shown) shows a change in location and motion, the data processing module (not shown) authenticates and relays this information. This helps in distinguishing whether the user had a real crash or merely dropped the smart helmet. The smart helmet is also capable of working with different types of head injuries, which may include, but are not limited to, head-on injuries, rotational injuries, and so forth. When the user has rotational head movement during a crash, the smart helmet calculates the angular velocity and the rotational angles of the head, and when they cross the threshold limit the crash detection protocol is activated. When the user has a crash, the smart helmet calculates the impact detected from the accelerometer 219, the angular velocity and head position from the gyroscope, and the linear acceleration from the GPS. By combining this information (the head position and the impact from the head-on and rotational collision), the smart helmet may classify whether the user has possibly suffered a concussion or a high impact on the head. After a concussion is detected, this can be reported to the emergency services as well as to people monitoring the data, e.g., the coach or medic of a sports team.
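By way of illustration only, the following is a minimal sketch of the passive-versus-active distinction described above; it is not the disclosed proprietary algorithm. The impact threshold, speed threshold, and field names are assumptions chosen to show how an impact reading might be cross-referenced with GPS motion to separate a dropped helmet from a real crash.

```python
from dataclasses import dataclass

IMPACT_THRESHOLD_G = 8.0    # assumed head-impact threshold in g
MIN_RIDING_SPEED_MPS = 2.0  # assumed speed above which the user counts as riding

@dataclass
class SensorSnapshot:
    impact_g: float       # peak acceleration from the impact accelerometer
    gps_speed_mps: float  # speed derived from GPS fixes around the impact
    gps_moved_m: float    # displacement between pre- and post-impact fixes

def classify_crash(s: SensorSnapshot) -> str:
    """Return 'none', 'passive' (helmet dropped), or 'active' (real crash)."""
    if s.impact_g < IMPACT_THRESHOLD_G:
        return "none"
    # A real crash should coincide with the rider actually being in motion.
    if s.gps_speed_mps >= MIN_RIDING_SPEED_MPS or s.gps_moved_m > 5.0:
        return "active"
    return "passive"

print(classify_crash(SensorSnapshot(impact_g=12.0, gps_speed_mps=7.5, gps_moved_m=30.0)))  # active
print(classify_crash(SensorSnapshot(impact_g=12.0, gps_speed_mps=0.0, gps_moved_m=0.0)))   # passive
```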
[0074] In accordance with one or more exemplary embodiments, the smart helmet provides an SOS feature. SOS is a safety feature which, when triggered, may notify the real-time, live location to selected contacts that may be pre-assigned. Triggering the SOS function auto-enables live location tracking and updates the contacts with a live map of where the user is moving. It also sends location-based notifications to the preassigned emergency contacts via different communication modes (such as SMS and email) with the exact location. This feature can also be used as a simple live location tracker. The SOS feature may be activated from the smart helmet or directly from the data processing module (not shown). Triggering from the smart helmet may be done with a triple tap of the button on the helmet or a triple tap on the helmet itself (the latter may be implemented by the accelerometer detecting taps on the surface of the smart helmet).
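The triple-tap trigger lends itself to a short sketch. The detector below is a hypothetical stand-in, assuming the accelerometer delivers tap-interrupt timestamps and that three taps within an illustrative 1.5-second window should arm the SOS protocol.

```python
class TripleTapDetector:
    def __init__(self, window_s: float = 1.5):
        self.window_s = window_s
        self.taps = []  # timestamps of recent tap interrupts

    def on_tap(self, t: float) -> bool:
        """Feed a tap-interrupt timestamp; return True when SOS should fire."""
        self.taps = [x for x in self.taps if t - x <= self.window_s]
        self.taps.append(t)
        if len(self.taps) >= 3:
            self.taps.clear()
            return True
        return False

det = TripleTapDetector()
fired = False
for t in (0.0, 0.4, 0.9):  # three taps within the window
    fired = det.on_tap(t)
print(fired)  # True
```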
[0075] In accordance with one or more exemplary embodiments, the smart helmet with rise-to-wake technology may detect motion of the helmet in a sleep state and wake the smart helmet to an active state. This is done by the accelerometer interrupt function, which sends an interrupt when it detects significant motion and can thereby also classify whether a user has picked up the helmet. After it goes to the active state, it starts advertising for the user's computing device (not shown) to connect. This ensures that the smart helmet is ready to start pairing with the user's computing device as soon as it is picked up. Once the user disconnects the computing device (not shown) from the smart helmet, the smart helmet may go back to the sleep state after a timeout period.
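A minimal sketch of this rise-to-wake behaviour follows. The state names, timeout value, and event handlers are illustrative assumptions; the disclosure only specifies that a significant-motion interrupt wakes the helmet and that it sleeps again after a timeout once disconnected.

```python
import enum
import time

class State(enum.Enum):
    SLEEP = "sleep"    # low-power, waiting for the motion interrupt
    ACTIVE = "active"  # awake and advertising / connected

class HelmetPower:
    DISCONNECT_TIMEOUT_S = 60.0  # assumed timeout before returning to sleep

    def __init__(self):
        self.state = State.SLEEP
        self.disconnected_at = None  # time of the last disconnect, if any

    def on_motion_interrupt(self):
        """Accelerometer 'significant motion' interrupt: wake and advertise."""
        if self.state is State.SLEEP:
            self.state = State.ACTIVE

    def on_disconnect(self):
        """Computing device disconnected: start the sleep countdown."""
        self.disconnected_at = time.monotonic()

    def tick(self):
        """Called periodically; drops back to sleep after the timeout."""
        if (self.state is State.ACTIVE and self.disconnected_at is not None
                and time.monotonic() - self.disconnected_at > self.DISCONNECT_TIMEOUT_S):
            self.state = State.SLEEP
            self.disconnected_at = None
```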
[0076] In accordance with one or more exemplary embodiments, the smart helmet includes an inertial measurement unit configured to measure how the head is rotating when the user puts on the smart helmet. The smart helmet may track the angles at which the head is positioned and give the absolute orientation of the head. The smart helmet system also measures the angular velocity of the head, which may be mapped to real-time head movement. The inertial measurement unit is a multi-axis combination of the gyroscope 215 and the accelerometer 219. The smart helmet may further be configured to measure the absolute orientation by obtaining the quaternions of the helmet in real time. The data is sent from the smart helmet to the server for processing and storage. The smart helmet is coupled with the computing device (not shown) and the data processing module (not shown) for visualizing and analyzing the data. Visualizations include the real-time head movement rendered with 3D object files using the data from the remote sensor. The smart helmet needs to communicate the data from the sensor for visualization; the data may be sent to the data processing module (not shown) via the network (not shown). For example, if the smart helmet is implemented with wireless low-energy communication, it is mainly suited to a single-user-at-a-time approach; wireless low energy is known to consume very little power and is an ideal choice for a single-user approach. The data from the smart helmet may be streamed via an L2CAP profile to the paired computing device (not shown). The data processing module (not shown) may process this information and present a dashboard for visualizing the real-time head movements, the angular velocity at which the head is moving, and quaternion data with angles of head rotation. The data processing module (not shown) can also classify the data for any predefined gestures or movements. In another example, the smart helmet implemented with WiFi-based communication protocols may work for a single user at a given time or for multiple users in a group. The smart helmet may act as a UDP client whilst the UDP server streams the data. The data is used for real-time head tracking and for visualizing real-time head movements, angles, and positions. The server (not shown) may be connected to the data processing module (not shown), which has a user interface built for the respective sport or activity. The interface may provide all the vital information with respect to any sport that involves a helmet and a need for head tracking.
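Since the IMU reports absolute orientation as quaternions, a dashboard typically converts them to head angles for display. The sketch below uses the standard quaternion-to-Euler conversion; the yaw/pitch/roll convention and the decision to convert on the receiving side are illustrative assumptions, not taken from the disclosure.

```python
import math

def quaternion_to_euler(w: float, x: float, y: float, z: float):
    """Convert a unit quaternion to (yaw, pitch, roll) in degrees."""
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    # Clamp guards against floating-point drift outside asin's domain.
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    return tuple(math.degrees(a) for a in (yaw, pitch, roll))

# Identity quaternion: head facing straight ahead.
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```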
[0077] In accordance with one or more exemplary embodiments, real-time player analysis is done using the wireless sensors placed inside the smart helmet of the player, which provide all the vital information. The wireless sensors consist of an IMU (Inertial Measurement Unit), GPS (Global Positioning System), an impact detection system (impact accelerometer), and an on-board WiFi chip. The wireless sensors are responsible for capturing the real-time data from the IMU, GPS, and impact detection system and sending it wirelessly to the computing device (not shown), a cloud computer, or a local server (not shown) for processing the data and streaming it to the smart computing device (not shown) or data processing module (not shown) with a UI for the respective sport. The IMU is responsible for the real-time head tracking done on the player; it calculates the absolute orientation, angular velocity, and any tap gestures on the smart helmet. The GPS is responsible for calculating the absolute position of the player on the pitch, as well as the speed and linear acceleration of the player on the pitch. The impact detection system is responsible for measuring any high impacts on the helmet and detecting rapid changes in acceleration, linear acceleration, and deceleration. The on-board WiFi is responsible for streaming this real-time data to the cloud computer (not shown) or local server, which can process the data and infer information about the player. The data that may be extracted from this system includes player position, player current speed, player heat map, impact and hit detection, head tracking and orientation, gesture recognition, player top speed, player average distance covered, and concussion detection, together with information about how the sensor systems collect the information.
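Several of the listed metrics (distance covered, top speed) can be derived directly from timestamped GPS fixes. The sketch below is illustrative only: it assumes a simple (t, lat, lon) sample format and uses the haversine formula, neither of which is specified in the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def player_metrics(fixes):
    """fixes: list of (t_seconds, lat, lon). Returns (distance_m, top_speed_mps)."""
    distance, top_speed = 0.0, 0.0
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        d = haversine_m(la0, lo0, la1, lo1)
        distance += d
        if t1 > t0:
            top_speed = max(top_speed, d / (t1 - t0))
    return distance, top_speed

fixes = [(0, 51.5000, -0.1200), (5, 51.5002, -0.1200), (10, 51.5005, -0.1200)]
print(player_metrics(fixes))  # roughly 55 m covered, peak speed about 6.7 m/s
```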
[0078] In accordance with one or more exemplary embodiments, the real-time team analysis uses a fleet of connected wireless sensors carried by the users that are part of the team. The wireless systems are all connected to the same network. The wireless sensors may talk to the computing device (not shown), which can process data, and may also communicate among the other wireless sensors. The connected wireless systems consist of an IMU, GPS, impact accelerometer, and on-board WiFi. The fleet of wireless systems, when connected to the same network, will share information from any other node in the network (not shown). This helps in tracking all the users and synchronizing the data to form collective information for the team. The GPS could effectively show the position of each user in an area, constantly updated with each movement. Practice session monitoring is also done using the real-time team analysis. This works by first setting up the system with a feed of users, positions, and the unique ID of each wireless sensor system. For example, the practice session monitoring mode allows coaches to predefine any formation or strategy and observe in real time how the players are performing with the help of the data processing module (not shown), which is connected to the computing device (not shown).
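As a rough sketch of the fleet idea, each helmet node could broadcast its identifier and GPS fix over UDP on the shared network, and any listener could aggregate a live team position map. The port number and JSON message format below are assumptions for illustration; the disclosure does not specify a wire protocol.

```python
import json
import socket

PORT = 50123  # assumed shared port for all sensor nodes

def broadcast_position(node_id: str, lat: float, lon: float):
    """One helmet node announcing its GPS fix to the shared network."""
    msg = json.dumps({"id": node_id, "lat": lat, "lon": lon}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, ("<broadcast>", PORT))

def collect_positions(timeout_s: float = 1.0) -> dict:
    """Listen briefly and return {node_id: (lat, lon)} for whoever broadcast."""
    positions = {}
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", PORT))
        s.settimeout(timeout_s)
        try:
            while True:
                data, _ = s.recvfrom(1024)
                fix = json.loads(data)
                positions[fix["id"]] = (fix["lat"], fix["lon"])
        except socket.timeout:
            pass
    return positions
```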
[0079] In accordance with one or more exemplary embodiments, the smart helmet with the entertainment system integrated seamlessly into it can be used for listening to music, making and taking calls, and activating personal assistants (Siri or Google Assistant, for example). The entertainment system is also equipped with an onboard battery and a USB Micro-B port for charging and software upgrades. The entertainment system does this by having an onboard wireless communication device connecting to the computing device with the HFP (Hands-Free Profile) and AVRCP (Audio/Video Remote Control Profile). The smart helmet also provides a device firmware upgrade feature through the USB Micro-B port, accompanied by the data processing module that detects the smart helmet when connected in a firmware upgrade state. The system comes equipped with buttons for changing between different modes. The entertainment system has mainly three modes: pairing mode, active mode, and DFU mode. Pairing mode: the entertainment system can go into pairing mode to be available for nearby devices to connect.
[0080] Active mode: the entertainment system in active mode is connected to a nearby wireless communication device with the AVRCP and HFP profiles and is used to listen to music and to make and receive calls.
[0081] DFU mode: DFU mode is primarily used for updating the firmware of the entertainment system inside the smart helmet. The firmware of the system can be updated by connecting the smart helmet to the computing device and entering DFU mode.
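The three modes can be summarized as a simple state machine. The transitions below are assumptions consistent with the description: connecting moves the system from pairing to active, disconnecting returns it to pairing, and a USB firmware-upgrade request selects DFU mode.

```python
import enum

class Mode(enum.Enum):
    PAIRING = "pairing"  # discoverable to nearby devices
    ACTIVE = "active"    # connected with AVRCP/HFP: music and calls
    DFU = "dfu"          # firmware upgrade over the USB Micro-B port

class EntertainmentSystem:
    def __init__(self):
        self.mode = Mode.PAIRING

    def on_connected(self):
        if self.mode is Mode.PAIRING:
            self.mode = Mode.ACTIVE

    def on_disconnected(self):
        if self.mode is Mode.ACTIVE:
            self.mode = Mode.PAIRING

    def on_usb_dfu_request(self):
        self.mode = Mode.DFU
```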
[0089] As a continuation to step 814, at step 816 the data processing module is subjected to a predefined process. At step 818 it is enquired whether the SOS signal has been received via the wireless network; step 818 is connected to step 806. If the enquiry at step 818 is yes, then the GPS location is collected at step 820. At step 822 the emergency alert and the GPS location are sent to favorite contacts, which may include, but are not limited to, friends, family, guardians, and the like. At step 824 a live audio stream captured through the helmet microphone is received, which is saved in the memory of the computing device at step 826. Steps 808 and 824 are interconnected. Further, at step 828 the app may be relaunched as required. The wireless band normally tends to be idle, as depicted in step 830. It is enquired at step 832 whether the SOS button has been long-pressed. If the enquiry at step 832 is yes, then the SOS signal is sent to the helmet at step 834; steps 834 and 812 are interconnected. If the enquiry at step 832 is no, then the process reverts to step 830.
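The device-side portion of this flow (steps 818 through 826) can be sketched as follows. All function names are hypothetical stand-ins for the app's actual handlers, shown only to make the step ordering concrete.

```python
def handle_sos(get_gps_fix, send_alert, audio_stream, storage):
    """Hypothetical handler for an incoming SOS signal (step 818 onwards)."""
    lat, lon = get_gps_fix()                   # step 820: collect GPS location
    for contact in ("friends", "family", "guardians"):
        send_alert(contact, lat, lon)          # step 822: alert + location to contacts
    for chunk in audio_stream:                 # step 824: live audio from the helmet
        storage.append(chunk)                  # step 826: save to device memory

# Usage with dummy stand-ins:
saved = []
handle_sos(lambda: (51.5, -0.12),
           lambda c, la, lo: print(f"alert -> {c}: {la},{lo}"),
           iter([b"audio0", b"audio1"]),
           saved)
print(saved)  # [b'audio0', b'audio1']
```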
[0098] Digital Processing System 1600 may contain one or more processors such as a central processing unit (CPU) 1610, Random Access Memory (RAM) 1620, Secondary Memory 1630, Graphics Controller 1660, Display Unit 1670, Network Interface 1680, and Input Interface 1690. All the components except Display Unit 1670 may communicate with each other over Communication Path 1650, which may contain several buses as is well known in the relevant arts.
[0099] CPU 1610 may execute instructions stored in RAM 1620 to provide several features of the present disclosure. CPU 1610 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 1610 may contain only a single general-purpose processing unit.
[0100] RAM 1620 may receive instructions from Secondary Memory 1630 using Communication Path 1650. RAM 1620 is shown currently containing software instructions, such as those used in threads and stacks, constituting Shared Environment 1625 and/or User Programs 1626. Shared Environment 1625 includes operating systems, device drivers, virtual machines, machine language, etc., which provide a (common) run time environment for execution of User Programs 1626.
[0101] Graphics Controller 1660 generates display signals (e.g., in RGB format) to Display Unit 1670 based on data/instructions received from CPU 1610. Display Unit 1670 contains a display screen to display the images defined by the display signals. Input Interface 1690 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network Interface 1680 provides connectivity to a network (e.g., using Internet Protocol) and may be used to communicate with other systems.
[0102] Secondary Memory 1630 may contain Hard Drive 1635, Flash Memory 1636, and Removable Storage Drive 1637. Secondary Memory 1630 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable Digital Processing System 1600 to provide several features in accordance with the present disclosure.
[0103] Some or all of the data and instructions may be provided on Removable Storage Unit 1640, and the data and instructions may be read and provided by Removable Storage Drive 1637 to CPU 1610. A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, or removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 1637.
[0104] Removable storage unit 1640 may be implemented using medium and storage format compatible with removable storage drive 1637 such that removable storage drive 1637 can read the data and instructions. Thus, removable storage unit 1640 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
[0105] In this document, the term “computer program product” is used to generally refer to removable storage unit 1640 or hard disk installed in hard drive 1635. These computer program products are means for providing software to digital processing system 1600. CPU 1610 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.
[0106] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as Secondary Memory 1630. Volatile media includes dynamic memory, such as RAM 1620. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[0107] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus (communication path) 1650. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0108] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0109] Furthermore, the described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the above description, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, etc., to provide a thorough understanding of embodiments of the disclosure.
[0110] In different embodiments, the helmet system is integrated with artificial intelligence (AI), wherein the AI integrated helmet systems can detect various situations and react/adapt accordingly in an intelligent manner.
[0111] In different embodiments, the helmet system is configured to collect data and correlate and compute it with mobile applications, such as maps, on the user's computing device to provide real-time traffic and weather data for various purposes.
[0112] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.
[0113] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description.