INTELLIGENT COMPANION APPLICATIONS AND CONTROL SYSTEMS FOR ELECTRIC SCOOTERS
20230294670 · 2023-09-21
Assignee
Inventors
- Christopher L Oesterling (Troy, MI, US)
- Anthony J. Sumcad (Rochester Hills, MI, US)
- Russell A Patenaude (Macomb Township, MI, US)
CPC classification
B60W2510/06
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
Presented are adaptive operator assistance systems for motor-assisted manually powered (MMP) vehicles, methods for making/using such systems, and electric scooters equipped with such systems. A method of operating an MMP vehicle using a handheld mobile computing device (MCD) includes the handheld MCD receiving path plan data for the MMP vehicle and then receiving, based on this path plan data, MMP-specific ambient data that is aligned with the vehicle's present location and contains surrounding environment data particular to the MMP vehicle. A wireless location device of the handheld MCD tracks the MMP vehicle's real-time location, and a sensing device of the handheld MCD detects MMP-specific threat data that is aligned with the vehicle's real-time location and contains user danger data particular to the MMP vehicle. The handheld MCD then commands a resident subsystem of the MMP vehicle to execute a control operation based on the MMP-specific ambient data and/or threat data.
Claims
1. A method of operating a motor-assisted manually powered (MMP) vehicle using a handheld mobile computing device (MCD), the method comprising: receiving, via the handheld MCD, path plan data including a vehicle origin for the MMP vehicle; determining, via the handheld MCD based on the received path plan data, MMP-specific ambient data aligned with the vehicle origin and containing a predefined set of surrounding environment data particular to a vehicle species of the MMP vehicle; tracking, via a wireless location device of the handheld MCD, a real-time vehicle location of the MMP vehicle while traversing from the vehicle origin to a vehicle destination; detecting, via a sensing device of the handheld MCD, MMP-specific threat data aligned with the real-time vehicle location and containing a predefined set of user danger data particular to the vehicle species of the MMP vehicle; and transmitting, via the handheld MCD to a resident vehicle subsystem attached to the MMP vehicle, a command signal to execute a control operation based on the MMP-specific ambient data and/or the MMP-specific threat data.
2. The method of claim 1, further comprising: determining, via the handheld MCD, a vehicle subspecies of the MMP vehicle; and modifying the control operation based on the vehicle subspecies of the MMP vehicle.
3. The method of claim 1, further comprising: determining, via the handheld MCD, a user skill level specific to an operator of the MMP vehicle, and modifying the control operation based on the user skill level of the operator.
4. The method of claim 1, further comprising: receiving, via a human-machine interface (HMI) of the handheld MCD, a user-selected preference input by an operator of the MMP vehicle; and modifying the control operation based on the user-selected preference.
5. The method of claim 1, wherein the sensing device of the handheld MCD includes a video camera and/or a proximity sensor, and wherein the predefined set of user danger data includes target object data indicative of a motor vehicle approaching the MMP vehicle.
6. The method of claim 5, further comprising transmitting, via the handheld MCD to a motor vehicle subsystem of the motor vehicle, a notification alerting a driver to a presence of the MMP vehicle relative to the motor vehicle.
7. The method of claim 1, wherein the path plan data further includes a predicted path from the vehicle origin to the vehicle destination, and wherein the predefined set of surrounding environment data includes a memory-stored hazard located on the predicted path, the method further comprising determining, via the handheld MCD, an alternate route for traversing from the vehicle origin to the vehicle destination.
8. The method of claim 1, wherein the resident vehicle subsystem includes an audio device, a video device, and/or a tactile device mounted to a vehicle body of the MMP vehicle, and wherein the control operation includes an audible, visual, or tactile notification.
9. The method of claim 1, further comprising outputting, to a user of the MMP vehicle via an audio device and/or a tactile device of the handheld MCD, an audible or tactile notification based on the MMP-specific ambient data and/or the MMP-specific threat data.
10. The method of claim 1, wherein the predefined set of surrounding environment data of the MMP-specific ambient data includes hazards data indicative of path hazards proximal the vehicle origin, weather data indicative of ambient weather conditions proximal the vehicle origin, and/or timed systems data indicative of home-automated irrigation, lighting and/or door systems proximal the vehicle origin.
11. The method of claim 1, wherein the predefined set of user danger data of the MMP-specific threat data includes hazards data indicative of detected path hazards proximal the real-time vehicle location, approaching vehicles data indicative of a detected motor vehicle approaching the real-time vehicle location, and/or distractions data indicative of a detected one of a plurality of preset user distractions proximal the real-time vehicle location.
12. The method of claim 1, wherein the handheld MCD includes a smartphone, and the sensing device includes an accelerometer, a gyroscope, a proximity sensor, a magnetometer, a temperature sensor, a global positioning system transceiver, and/or a light sensor.
13. The method of claim 1, wherein the vehicle species of the MMP vehicle includes an electric pedal cycle, an electric standing kick scooter, or an electric skateboard each equipped with a motor operable to generate intermittent assist torque to propel the MMP vehicle.
14. A motor-assisted manually powered (MMP) vehicle, comprising: a vehicle body with a platform configured to support thereon a standing user; a plurality of road wheels attached to the vehicle body; a motor attached to the vehicle body and operable to drive one or more of the road wheels and thereby assist with propelling the MMP vehicle concurrent with manual propulsion via the standing user; a vehicle controller attached to the vehicle body and configured to communicate with a handheld mobile computing device (MCD) carried by the standing user; and a dedicated mobile software application executable on the handheld MCD and programmed to: receive path plan data including a vehicle origin for the MMP vehicle; receive, based on the received path plan data, MMP-specific ambient data aligned with the vehicle origin and containing a predefined set of surrounding environment data particular to a vehicle species of the MMP vehicle; track, using a wireless location device of the handheld MCD, a real-time vehicle location of the MMP vehicle while traversing from the vehicle origin to a vehicle destination; detect, using a sensing device of the handheld MCD, MMP-specific threat data aligned with the real-time vehicle location and containing a predefined set of user danger data particular to the vehicle species of the MMP vehicle; and transmit, to the vehicle controller for execution via a resident vehicle subsystem mounted to the MMP vehicle, a command signal to execute a control operation based on the MMP-specific ambient data and/or the MMP-specific threat data.
15. A non-transitory, computer-readable medium (CRM) storing instructions executable by one or more processors of a handheld mobile computing device (MCD) to operate a motor-assisted manually powered (MMP) vehicle, the instructions, when executed by the one or more processors, causing the handheld MCD to perform operations comprising: receiving path plan data including a vehicle origin for the MMP vehicle; determining, based on the received path plan data, MMP-specific ambient data aligned with the vehicle origin and containing a predefined set of surrounding environment data particular to a vehicle species of the MMP vehicle; tracking, using a wireless location device of the handheld MCD, a real-time vehicle location of the MMP vehicle while traversing from the vehicle origin to a vehicle destination; detecting, using a sensing device of the handheld MCD, MMP-specific threat data aligned with the real-time vehicle location and containing a predefined set of user danger data particular to the vehicle species of the MMP vehicle; and transmitting a command signal to a resident vehicle subsystem of the MMP vehicle to execute a control operation based on the MMP-specific ambient data and/or the MMP-specific threat data.
16. The CRM of claim 15, wherein the operations further comprise: determining a vehicle subspecies of the MMP vehicle; and modifying the control operation based on the vehicle subspecies of the MMP vehicle.
17. The CRM of claim 15, wherein the operations further comprise: determining a user skill level specific to an operator of the MMP vehicle; and modifying the control operation based on the user skill level of the operator.
18. The CRM of claim 15, wherein the operations further comprise: receiving, via a human-machine interface (HMI) of the handheld MCD, a user-selected preference input by an operator of the MMP vehicle; and modifying the control operation based on the user-selected preference input by the operator.
19. The CRM of claim 15, wherein the sensing device of the handheld MCD includes a video camera and/or a proximity sensor, and wherein the predefined set of user danger data includes target object data indicative of a motor vehicle approaching the MMP vehicle.
20. The CRM of claim 15, wherein the path plan data further includes a predicted path from the vehicle origin to the vehicle destination, wherein the predefined set of surrounding environment data includes a memory-stored hazard located on the predicted path, and wherein the operations further comprise determining an alternate route for traversing from the vehicle origin to the vehicle destination.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The present disclosure is amenable to various modifications and alternative forms, and some representative embodiments are shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the novel aspects of this disclosure are not limited to the particular forms illustrated in the above-enumerated drawings. Rather, the disclosure is to cover all modifications, equivalents, combinations, subcombinations, permutations, groupings, and alternatives falling within the scope of this disclosure as encompassed, for example, by the appended claims.
DETAILED DESCRIPTION
[0018] This disclosure is susceptible of embodiment in many different forms. Representative embodiments of the disclosure are shown in the drawings and will herein be described in detail with the understanding that these embodiments are provided as an exemplification of the disclosed principles, not limitations of the broad aspects of the disclosure. To that extent, elements and limitations that are described, for example, in the Abstract, Introduction, Summary, and Detailed Description sections, but not explicitly set forth in the claims, should not be incorporated into the claims, singly or collectively, by implication, inference or otherwise.
[0019] For purposes of the present detailed description, unless specifically disclaimed: the singular includes the plural and vice versa; the words “and” and “or” shall be both conjunctive and disjunctive; the words “any” and “all” shall both mean “any and all”; and the words “including,” “containing,” “comprising,” “having,” and the like, shall each mean “including without limitation.” Moreover, words of approximation, such as “about,” “almost,” “substantially,” “generally,” “approximately,” and the like, may each be used herein in the sense of “at, near, or nearly at,” or “within 0-5% of,” or “within acceptable manufacturing tolerances,” or any logical combination thereof, for example. Lastly, directional adjectives and adverbs, such as fore, aft, inboard, outboard, starboard, port, vertical, horizontal, upward, downward, front, back, left, right, etc., may be with respect to a motor vehicle, such as a forward driving direction of a motor vehicle when the vehicle is operatively oriented on a horizontal driving surface.
[0020] Referring now to the drawings, wherein like reference numbers refer to like features throughout the several views, there is shown in
[0021] MMP vehicle 10 of
[0022] To impart motive power to the vehicle 10, the traction motor 16 is drivingly coupled to the two drive wheel units 22A, 22B through a suitable power transmission, such as a belt-drive or a chain-drive transmission 30. The vehicle's final drive system employs a split-power differential gear train 32 that apportions motor-generated torque and power between the wheel units 22A, 22B. Each of two axle shafts 34A (
[0023] With continuing reference to
[0024] Electric scooter 10 of
[0025] Handlebar set 40 projects upwardly from the box-type support frame 36 and allows the rider to manually control the heading and directional changes of the vehicle 10. Right-hand and left-hand brake lever assemblies 44A and 44B, respectively, are mounted on the handlebar set 40 adjacent respective handle grips 46A and 46B. These brake lever assemblies 44A, 44B allow the user to selectively slow and stop the vehicle 10 by actuating right-side and left-side drum brake assemblies 48A (
[0026] Located at the front of the MMP vehicle 10, forward cargo bed 42 provides a rigid surface for seating thereon and supporting a cargo payload. Although not shown, the cargo bed 42 may incorporate guard rails, a basket, or a container to provide additional retention and protection while transporting cargo placed on the vehicle 10. A slide bracket 52 mechanically couples the rearward end of the cargo bed 42 to the frame 36 and allows for adjustable repositioning of the bed 42. Optional support plates 54 may be mounted to the frame 36 fore and aft of the left-hand and right-hand ground wheel units 22A and 22B.
[0027] E-assist capabilities may be selectively provided by the traction motor 16 in response to motor control signals from the vehicle controller 18. Real-time interface of an operator 11 (
[0028] As indicated above, resident vehicle controller 18 is constructed and programmed to govern, among other things, operation of the traction motor 16, display device 56, etc. Controller, control unit, control module, module, microprocessor, processor, and permutations thereof may be used interchangeably and synonymously to reference any one or various combinations of one or more of logic circuits, Application Specific Integrated Circuit(s) (ASIC), integrated circuit device(s), central processing unit(s) (e.g., microprocessor(s)), and may include appropriate signal conditioning, input/output, and buffer circuitry, and related components to provide herein described functionality. Associated memory and storage (e.g., read only, programmable read only, random access, hard drive, etc.), whether resident, remote, or a combination of both, stores software, firmware programs, routines, instructions, and/or data retrievable by a controller.
[0029] Software, firmware, programs, instructions, routines, code, algorithms and similar terms may mean any controller executable instruction sets including calibrations and look-up tables. The controller may be programmed with a set of control routines executed to provide desired functions. Control routines are executed, such as by a central processing unit or a networked controller or control modules, and are operable to monitor inputs from sensing devices and other networked control modules, to execute control and diagnostic routines for controlling operation of devices and actuators. Routines may be executed in real-time, near real-time, continuously, systematically, sporadically and/or at regular intervals, for example, each 100 microseconds or 10 or 50 milliseconds, etc., during ongoing vehicle use or operation. Alternatively, routines may be executed in response to occurrence of any one of a set of calibrated events during operation of the vehicle 10.
[0030] Turning next to
[0031] For enhanced operation of the e-scooter 10, the rider 11 may activate and interface with a dedicated mobile software application (“companion app”) 15 that is executable on the handheld MCD 60, as indicated at control operation (S1) of the process workflow in
[0032] Activated at control operation (S2) is an integrated IAN component 17 app that operates within the companion app 15 and provisions vehicle warning and control features that are distinctively tailored to MMP vehicle-specific use cases. As will be explained in extensive detail below, for example, the IAN component 17 may offer scooter-specific terrain hazard notifications, alerts for motor vehicles in proximity to the scooter (e.g., approaching from behind), and warnings of scooter-specific distractions. The IAN component 17 leverages available MCD hardware and software to effectively transmute the smartphone 60 into an active sensor farm and advanced rider assistance system for an e-scooter 10 that may otherwise lack such functionality. IAN component 17 may also enable a rider 11 of an MMP vehicle 10 to wirelessly communicate with drivers and in-vehicle subsystems of nearby automobiles, e.g., to militate against a potential collision event.
[0033] Upon activation of the IAN component 17, the smartphone 60 executes control operation (S3) to automate a first-time aggregation of ambient condition data for the surrounding area proximal the MMP vehicle's present location or a user-selected start location. Ambient condition data may include ride-specific data that is retrieved from memory, third party resources, backend host services, MCD hardware/software, etc. IAN component 17 may pool open street map data, user-saved route data, and crowd-sourced geographic data (collectively “Terrain & Hazard Data” 19) to identify—in addition to standard streets and roadways with associated dangers—sidewalks, alleys, and other pathways navigable via MMP vehicles and any hazards attendant thereto (collectively “Hazards Data” 21). The IAN component 17 may also prompt a third-party weather service 23 to provide weather forecasts, warnings of hazardous weather conditions, and other weather-related data (collectively “Weather/Climate Data” 25). In addition, the smartphone 60 may access a timed systems database 27 to pull historical or crowd-sourced lighting, irrigation, and gate systems information (collectively “Timed Systems Data” 29). A database may be maintained, e.g., via host cloud computing service 13, for any of the data sets based on previous geocoded riders or crowd-sourced riders and, if applicable, related inertial events.
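For illustration only, the first-time aggregation described in paragraph [0033] might be structured as a fan-out over independent data sources, each of which may fail without aborting the whole operation. The function names, source labels, and stub data below are assumptions for the sketch and are not part of the disclosed system:

```python
def aggregate_ambient_data(origin, fetchers):
    """Pool MMP-specific ambient data sets for a start location.

    `fetchers` maps a data-set label (e.g. "hazards", "weather",
    "timed_systems") to a callable returning records near `origin`.
    A source that is unreachable yields an empty set rather than
    failing the aggregation.
    """
    ambient = {}
    for label, fetch in fetchers.items():
        try:
            ambient[label] = fetch(origin)
        except OSError:
            ambient[label] = []
    return ambient

# Stub sources standing in for map/hazard, weather, and timed-systems data:
hazards = lambda loc: [{"type": "pothole", "dist_m": 40}]
weather = lambda loc: [{"condition": "rain", "severity": "moderate"}]
timed = lambda loc: [{"system": "irrigation", "active": True}]

data = aggregate_ambient_data(
    (42.60, -83.15),
    {"hazards": hazards, "weather": weather, "timed_systems": timed},
)
```

The per-source error handling reflects the paragraph's mix of memory, third-party, and backend sources, any one of which may be temporarily unavailable on a ride.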
[0034] After aggregating the first-time data retrieved at control operation (S3), the combined data is preprocessed, analyzed, and compared against available path plan data, including origin, destination, and predicted routing data, for the present trip of the MMP vehicle 10 in order to generate ride-specific recommendations at control operation (S4). Non-limiting examples of ride-specific recommendations may include presenting the user with an alternate route, a warning of an identified hazard or distraction, information specific to an identified hazard or distraction, etc. A ride-specific recommendation may be presented to the user via the e-scooter 10 (e.g., using display device 56), via the smartphone 60 (e.g., using touchscreen display device 66 or one of the output devices 68), or both. Notification thresholds and notification timing may be configured by a rider 11 through the companion app 15. For instance, the rider 11 may request to receive only audible and tactile notifications, and may request such notifications be generated and output within a predefined window of time (e.g., approximately three (3) seconds before an inertial event, such as a large bump in the sidewalk, or a friction risk event, such as a wet or water-pooled sidewalk). The companion app 15 may take into consideration scooter location, direction, speed, and (optionally) type to determine if/when to notify the rider 11.
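The notification-timing behavior in paragraph [0034] — alerting within a predefined window (e.g., about three seconds) before an inertial or friction risk event, taking scooter speed into account — reduces to a travel-time comparison. This is a minimal sketch under that reading; the function name and defaults are illustrative:

```python
def should_notify(distance_m, speed_mps, window_s=3.0):
    """Return True once a known hazard is within the rider's
    configured notice window of travel time (default ~3 s)."""
    if speed_mps <= 0:
        return False  # stationary rider: no time-based alert needed
    return distance_m / speed_mps <= window_s

# A rider moving 5 m/s toward a wet patch 12 m ahead is 2.4 s away:
assert should_notify(12, 5) is True
assert should_notify(30, 5) is False  # 6 s away, outside the window
```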
[0035] In addition to presenting the rider 11 with ride-specific notifications related to predetermined hazards, weather conditions, and distractions, the smartphone 60 may also monitor the surrounding environment of the e-scooter 10 while en route to a desired destination to identify potential hazards and distractions in real-time or near real-time. At control operation (S5), for example, the IAN component 17 may first associate the rider 11 with one of a variety of predefined rider types (e.g., novice vs expert, aggressive vs conservative, etc.). Rider type information may be entered by the rider 11 or learned via the IAN component 17, e.g., using deep neural network learning techniques. Additional rider-specific information that may be collected at control operation (S5) includes a vehicle type (also referred to herein as “species”) and trim type (also referred to herein as “subspecies”). For MMP vehicle implementations, vehicle species may include e-scooters, e-bikes, e-skateboards, e-roller skates/blades, and a variety of other manually-powered vehicles with a resident motorized propulsion assist system. In this regard, vehicle subspecies may include “trim options” for the vehicle; for e-scooter applications, this may include standard, foldable, stunt, big wheel, and the like. By way of example, rider notifications, alerts, and warnings may be tailored differently for an expert rider on a competition class e-scooter with aggressive riding tendencies (e.g., using expert rider data 31) as opposed to an intermediate rider on a foldable e-bike with conservative riding tendencies (e.g., using standard rider data 33).
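One simple way to realize the tailoring in paragraph [0035] is a lookup keyed by rider skill and vehicle subspecies, with a fallback default. The table contents and key names here are hypothetical examples, not values from the disclosure:

```python
# Hypothetical notice windows (seconds) keyed by (skill, subspecies).
PROFILE_WINDOWS_S = {
    ("expert", "stunt"): 1.5,        # aggressive expert: later, terser alerts
    ("intermediate", "foldable"): 3.0,
    ("novice", "standard"): 5.0,     # conservative novice: earliest warning
}

def notice_window(skill, subspecies, default=3.0):
    """Return the alert lead time for a rider/vehicle profile,
    falling back to a default for unlisted combinations."""
    return PROFILE_WINDOWS_S.get((skill, subspecies), default)
```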
[0036] After identifying rider-specific data particular to the current operator 11 and the subject MMP vehicle 10, IAN component 17 begins to accumulate polling data to detect impending hazards and distractions along the upcoming path segments of the moving e-scooter 10, as indicated at control operation (S6). Polling data, such as expert-rider polled data 31 and standard-rider polled data 33 of
[0037] At control operation (S7), the IAN component 17 preprocesses and analyzes the polling data collected at control operation (S6), independently or in combination with the first-time data aggregated at control operation (S3), to generate and output rider-and-ride specific notifications. Non-limiting examples of rider-and-ride specific notifications may include presenting the user with a warning to avoid the current route, a warning of an upcoming hazard or distraction, information specific to an upcoming hazard or distraction, a warning to allow an approaching automobile to overtake and pass the e-scooter 10, etc. With the handheld MCD 60 in the rider's back pocket and the camera facing rearward, for example, the IAN component 17 may detect an automobile approaching from behind; a haptic transducer resident to the device 60 may issue a single “alert” vibration notifying of the automobile's presence or, when appropriate, a series of vibrations with progressively increasing intensity/duty cycle to indicate a more complex notification of distance and target confidence. As another option, the IAN component 17 may leverage the smartphone's Bluetooth connectivity to activate one or more LEDs or haptic feedback devices on the handlebars of the e-scooter 10. Other options may include using multiple sensors and output devices to indicate a side of approach, a speed of approach, a size of the approaching vehicle, a proximity of the approaching vehicle, etc.
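The escalating haptic scheme in paragraph [0037] — a single pulse for a distant target, a denser burst of stronger pulses as distance shrinks — can be sketched as a mapping from detection distance and confidence to a pulse train. All thresholds and scaling constants below are assumed for illustration:

```python
def haptic_pattern(distance_m, confidence):
    """Map an approaching-vehicle detection to a vibration pattern.

    Returns a list of (intensity 0-1, duration_s) pulses: nothing
    below a confidence threshold, a single 'alert' pulse for a
    distant target, and an escalating burst of stronger, longer
    pulses as the target closes.
    """
    if confidence < 0.5:
        return []              # below reporting threshold: stay silent
    if distance_m > 20:
        return [(0.4, 0.2)]    # single presence alert
    # Closer targets: more pulses with rising intensity/duty cycle.
    pulses = max(2, int((20 - distance_m) // 5) + 2)
    return [(min(1.0, 0.5 + 0.1 * i), 0.15 + 0.05 * i)
            for i in range(pulses)]
```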
[0038] With reference next to the flow chart of
[0039] Method 100 begins at terminal block 101 with memory-stored, processor-executable instructions for a programmable controller or control module or similarly suitable processor to call up an initialization procedure for an adaptive rider assistance protocol, such as companion app 15 of
[0040] After initializing the companion application at terminal block 101, method 100 advances to internal storage (RAM) process block 103 to activate a scooter-centric component, such as IAN component 17 of
[0041] At data input/output process block 107, method 100 uses the received path plan data to determine MMP-specific ambient data that is proximal to the vehicle origin and/or aligned along a predefined segment or segments of the predicted path. MMP-specific ambient data contains one or more predefined sets of surrounding environment data, each of which is tailored to the vehicle type/species of the MMP vehicle. By way of example, and not limitation, one predefined set of surrounding environment data may contain memory-stored hazards that are proximal to the e-scooter's current location or located on the predicted path and predetermined to be potentially detrimental to MMP vehicles. One predefined set of surrounding environment data may contain weather data that is indicative of ambient weather conditions proximal the vehicle origin/path and predetermined to be unfavorable or potentially injurious to riders of MMP vehicles. Another predefined set of surrounding environment data may contain timed systems data that is indicative of home-automated irrigation, lighting, door and gate systems, etc., proximal the vehicle origin/path and predetermined to be distracting or potentially dangerous to riders of MMP vehicles. In addition to MMP-specific ambient data, the companion application may also retrieve user-saved historical trip data, real-time geolocation data, open street map data, crowd-sourced data, etc. Using this data, ride-specific recommendations, such as those described above with reference to control operation (S4) of
[0042] Advancing to decision block 111, the MMP-vehicle tailored component within the companion app determines if the skill level of the current operator of the subject MMP vehicle is an expert. As explained above, additional and alternative metrics may be considered at this juncture when determining the types of data that will be collected and evaluated when presenting alerts and notifications to riders. If the current rider is not an expert (block 111=NO), method 100 may poll “live” real-time data for riders having an intermediate or novice skill level at data input/output block 113. Conversely, if the current operator is an expert rider (block 111=YES), method 100 may poll “live” real-time data for riders having an expert skill level at data input/output block 115.
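Decision block 111 amounts to selecting a live-polling profile by skill level. A toy sketch of that branch follows; the sensor names and the expert/non-expert profiles are illustrative assumptions only:

```python
def poll_live_data(skill_level, sensors):
    """Decision block 111 (sketch): choose which handheld-MCD sensors
    to poll based on rider skill. `sensors` maps a sensor name to a
    zero-argument read callable."""
    if skill_level == "expert":
        wanted = ("camera", "proximity", "gyroscope")  # denser sensing
    else:
        wanted = ("camera", "proximity")
    return {name: sensors[name]() for name in wanted if name in sensors}

# Stub sensor reads standing in for real device APIs:
sensors = {
    "camera": lambda: "frame-0",
    "proximity": lambda: 12,
    "gyroscope": lambda: (0.0, 0.0, 1.0),
}
```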
[0043] With continuing reference to
[0044] Advancing from predefined process block 117, method 100 executes data output (display) block 119 in order to present the rider-and-ride specific notifications to the current operator of the MMP vehicle. For instance, the handheld MCD may transmit one or more command signals to a resident vehicle subsystem, such as touchscreen interactive display device 56 and/or an array of LEDs or haptic transducers mounted to the MMP vehicle, to execute one or more control operations based on the MMP-specific ambient data and/or the MMP-specific threat data. As noted above, the resident vehicle subsystem may take on a variety of different forms, including audio components, video components, touch-sensitive components, etc., that singly or collectively produce audible/visual/tactile feedback to a user of the MMP vehicle. In addition, or alternatively, the handheld MCD may transmit one or more wireless signals to an approaching vehicle; the motor vehicle may responsively activate a resident vehicle subsystem in order to alert the driver or other vehicle occupant as to the presence, location, speed, trajectory, etc., of the MMP vehicle. Each control operation may be selectively modified based, for example, on the vehicle subspecies of the MMP vehicle, the user skill level of the operator, and/or any received user-selected preferences.
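The fan-out in paragraph [0044], where a notification is routed only to the output channels the rider has enabled (per the user-selected preferences of claim 4), might look like the following sketch. Channel names and command tuples are hypothetical:

```python
def build_commands(notification, preferences):
    """Fan a notification out to the resident-subsystem channels the
    rider has enabled; channels default to enabled when unspecified."""
    channels = {
        "audio": ("chime", notification),
        "video": ("banner", notification),
        "tactile": ("pulse", None),
    }
    return [cmd for name, cmd in channels.items()
            if preferences.get(name, True)]

# A rider who disabled visual alerts receives only audio and tactile output:
cmds = build_commands("vehicle approaching from behind", {"video": False})
```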
[0045] Aspects of this disclosure may be implemented, in some embodiments, through a computer-executable program of instructions, such as program modules, generally referred to as software applications or application programs executed by any of a controller or the controller variations described herein. Software may include, in non-limiting examples, routines, programs, objects, components, and data structures that perform particular tasks or implement particular data types. The software may form an interface to allow a computer to react according to a source of input. The software may also cooperate with other code segments to initiate a variety of tasks in response to data received in conjunction with the source of the received data. The software may be stored on any of a variety of memory media, such as CD-ROM, magnetic disk, and semiconductor memory (e.g., various types of RAM or ROM).
[0046] Moreover, aspects of the present disclosure may be practiced with a variety of computer-system and computer-network configurations, including multiprocessor systems, microprocessor-based or programmable-consumer electronics, minicomputers, mainframe computers, and the like. In addition, aspects of the present disclosure may be practiced in distributed-computing environments where tasks are performed by resident and remote-processing devices that are linked through a communications network. In a distributed-computing environment, program modules may be located in both local and remote computer-storage media including memory storage devices. Aspects of the present disclosure may therefore be implemented in connection with various hardware, software, or a combination thereof, in a computer system or other processing system.
[0047] Any of the methods described herein may include machine readable instructions for execution by: (a) a processor, (b) a controller, and/or (c) any other suitable processing device. Any algorithm, software, control logic, protocol or method disclosed herein may be embodied as software stored on a tangible medium such as, for example, a flash memory, a solid-state drive (SSD) memory, a hard-disk drive (HDD) memory, a CD-ROM, a digital versatile disk (DVD), or other memory devices. The entire algorithm, control logic, protocol, or method, and/or parts thereof, may alternatively be executed by a device other than a controller and/or embodied in firmware or dedicated hardware in an available manner (e.g., implemented by an application specific integrated circuit (ASIC), a programmable logic device (PLD), a field programmable logic device (FPLD), discrete logic, etc.). Further, although specific algorithms may be described with reference to flowcharts and/or workflow diagrams depicted herein, many other methods for implementing the example machine-readable instructions may alternatively be used.
[0048] Aspects of the present disclosure have been described in detail with reference to the illustrated embodiments; those skilled in the art will recognize, however, that many modifications may be made thereto without departing from the scope of the present disclosure. The present disclosure is not limited to the precise construction and compositions disclosed herein; any and all modifications, changes, and variations apparent from the foregoing descriptions are within the scope of the disclosure as defined by the appended claims. Moreover, the present concepts expressly include any and all combinations and subcombinations of the preceding elements and features.