SELECTIVE MANUAL OPERATION OF A VEHICLE

20250348080 · 2025-11-13

Abstract

A vehicle includes a frame, a drive system coupled to the frame to propel and steer the vehicle, an energy storage device coupled to the frame and configured to provide power to the drive system, a lift implement coupled to the frame, the lift implement comprising a cradle to support a load and a lift assembly configured to adjust a position of the cradle relative to the frame, one or more sensors configured to provide sensing data indicative of an environment surrounding the vehicle, and a controller comprising one or more memory devices having instructions stored thereon, that, when executed by one or more processors, cause the one or more processors to: operate the vehicle along a first path, determine the first path extends into a first predefined zone, generate a second path that avoids the first predefined zone, and operate the vehicle along the second path.

Claims

1. A vehicle, comprising: a frame; a drive system coupled to the frame to propel and steer the vehicle; an energy storage device coupled to the frame and configured to provide power to the drive system; a lift implement coupled to the frame, the lift implement comprising a cradle to support a load and a lift assembly configured to adjust a position of the cradle relative to the frame; one or more sensors configured to provide sensing data indicative of an environment surrounding the vehicle; and a controller comprising one or more memory devices having instructions stored thereon, that, when executed by one or more processors, cause the one or more processors to: operate the vehicle along a first path; determine the first path extends into a first predefined zone in the environment surrounding the vehicle; generate a second path based on the sensing data that avoids the first predefined zone; and operate the vehicle along the second path.

2. The vehicle of claim 1, wherein the controller is further configured to sense, via the one or more sensors, at least one indicator in the environment.

3. The vehicle of claim 2, wherein the at least one indicator is a physical indicator positioned in the environment.

4. The vehicle of claim 2, wherein the controller is further configured to determine, based on the at least one indicator, a boundary of the first predefined zone.

5. The vehicle of claim 2, wherein the at least one indicator comprises a plurality of indicators, and wherein the plurality of indicators at least partially define a boundary of the first predefined zone.

6. The vehicle of claim 1, wherein the controller is further configured to receive a position of the first predefined zone from a user device communicably coupled to the vehicle.

7. The vehicle of claim 1, wherein the first predefined zone is a virtual zone.

8. The vehicle of claim 7, wherein the controller is further configured to receive zone data from a user device indicating a position of the virtual first predefined zone in the environment surrounding the vehicle.

9. The vehicle of claim 1, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: determine the second path extends into a second predefined zone in the environment surrounding the vehicle; adjust the second path to a third path based on the sensing data to avoid the second predefined zone; and receive a command to override the adjustment and operate the vehicle according to the second path through the second predefined zone.

10. A system, comprising: a user device; and a vehicle communicably coupled to the user device, wherein the vehicle comprises: a frame; a drive system coupled to the frame to propel and steer the vehicle; an energy storage device coupled to the frame and configured to provide power to the drive system; a lift implement coupled to the frame, the lift implement comprising a cradle to support a load and a lift assembly configured to adjust a position of the cradle relative to the frame; and a controller comprising one or more memory devices having instructions stored thereon, that, when executed by one or more processors, cause the one or more processors to: operate the vehicle along a first path; receive, from an external device, a position of a first zone in an environment surrounding the vehicle, wherein the first path extends at least partially into the first zone; adjust the first path to a second path based on the position of the first zone to avoid the first zone; and operate the vehicle along the second path.

11. The system of claim 10, wherein the vehicle further comprises one or more sensors configured to provide sensing data indicative of an environment surrounding the vehicle.

12. The system of claim 11, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: determine, based on the sensing data, the second path extends at least partially into a second zone, adjust the second path to a third path based at least partially on the sensing data to avoid the second zone; and operate the vehicle along the third path.

13. The system of claim 12, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: identify, based at least partially on the sensing data, at least one indicator in the environment; and determine, based at least partially on the at least one indicator, a position of the second zone in the environment.

14. The system of claim 10, wherein the external device comprises: a motion sensor configured to sense movement of the user device; and a user device controller comprising one or more memory devices having instructions stored thereon, that, when executed by one or more processors, cause the one or more processors to: establish the position of the first zone in the environment based at least partially on data from the motion sensor indicating movement of the user device.

15. The system of claim 14, wherein the user device further comprises a camera, and wherein the instructions, when executed by the one or more processors, cause the one or more processors to establish the position of the first zone in the environment based at least partially on data from the motion sensor indicating movement of the user device and image data from the camera.

16. The system of claim 10, wherein the external device is a mobile phone.

17. The system of claim 10, wherein the external device is a second vehicle communicably coupled to the vehicle by a local area mesh network established at least in part by the vehicle and the second vehicle.

18. A method of operation for a vehicle, comprising: operating, by one or more processing circuits, a vehicle along a first path, wherein the vehicle comprises: a frame; a drive system coupled to the frame to propel and steer the vehicle; an energy storage device coupled to the frame and configured to provide power to the drive system; a lift implement coupled to the frame, the lift implement comprising a cradle to support a load and a lift assembly configured to adjust a position of the cradle relative to the frame; and one or more sensors configured to provide sensing data indicative of an environment surrounding the vehicle; determining, by the one or more processing circuits, the first path intersects a first predefined area in the environment; generating, by the one or more processing circuits, a second path, wherein the second path avoids the first predefined area; and operating, by the one or more processing circuits, the vehicle along the second path to avoid the first predefined area.

19. The method of claim 18, further comprising: receiving, by the one or more processing circuits, the first predefined area from an external device.

20. The method of claim 18, further comprising: identifying, by the one or more processing circuits, the first predefined area based on sensing data from the one or more sensors indicating at least one indicator in the environment.

Description

BRIEF DESCRIPTION OF THE FIGURES

[0007] The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, wherein like reference numerals refer to like elements, in which:

[0008] FIG. 1 is a perspective view of a vehicle according to an exemplary embodiment.

[0009] FIG. 2 is a top view of the vehicle of FIG. 1.

[0010] FIG. 3 is a perspective view of the vehicle of FIG. 1 equipped with a lifting implement, according to an exemplary embodiment.

[0011] FIG. 4 is a perspective view of the vehicle of FIG. 3 and another vehicle cooperating to support a telehandler, according to an exemplary embodiment.

[0012] FIG. 5 is a perspective view of the vehicle of FIG. 1 equipped with a cart implement, according to an exemplary embodiment.

[0013] FIG. 6 is a perspective view of the vehicle of FIG. 3 interfacing with a cart supporting a boom assembly, according to an exemplary embodiment.

[0014] FIG. 7 is a block diagram of a control system for the vehicle of FIG. 1.

[0015] FIG. 8 is a top view of a production system including the vehicle of FIG. 1, according to an exemplary embodiment.

[0016] FIG. 9 is a flow chart of a method for adjusting the operation of a vehicle, according to an exemplary embodiment.

[0017] FIG. 10 is a top view of a production system with a plurality of zones including the vehicle of FIG. 1, according to an exemplary embodiment.

[0018] FIG. 11 is a flow chart of a method for operating a vehicle, according to an exemplary embodiment.

[0019] FIG. 12 is a flow chart of a method for operating a vehicle, according to an exemplary embodiment.

DETAILED DESCRIPTION

[0020] Before turning to the FIGURES, which illustrate certain exemplary embodiments in detail, it should be understood that the present disclosure is not limited to the details or methodology set forth in the description or illustrated in the figures. It should also be understood that the terminology used herein is for the purpose of description only and should not be regarded as limiting.

[0021] Referring generally to the FIGURES, one or more vehicles may be configured to operate in one of a plurality of different modes of operation, including different degrees of autonomous operation (e.g., manually controlled, partially autonomous, or fully autonomous). In some embodiments, a vehicle can be configured to switch between different modes of operation (e.g., between a mode of operation that is substantially user controlled and a mode of operation that is fully, or at least substantially, autonomous). Stated differently, in some embodiments, the vehicles may be configured to switch between different levels of autonomous operation (e.g., minimally autonomous to fully autonomous and vice versa) based on a user input or information related to the operation of the one or more vehicles, including based on the feasibility of operating a vehicle autonomously in the presence of one or more different operational conditions (e.g., inclement weather, power outage, steep grades, spills, unexpected obstacles, etc.). For example, the vehicles may switch between a first mode of operation and a second mode of operation based on a match value determined from sensor data indicative of one or more operational conditions and from one or more operational criteria.
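
The mode-selection logic described above can be illustrated with a minimal sketch. This is not the patented implementation; the disclosure does not specify how the match value is computed, and all names (`match_value`, `OperationMode`, the thresholds) are hypothetical, chosen only to show one way sensed conditions could be scored against operational criteria to pick an operating mode.

```python
# Illustrative sketch only: scoring sensed operational conditions against
# operational criteria and selecting a mode of operation from the result.
from enum import Enum

class OperationMode(Enum):
    MANUAL = 0
    PARTIALLY_AUTONOMOUS = 1
    FULLY_AUTONOMOUS = 2

def match_value(conditions: dict, criteria: dict) -> float:
    """Fraction of operational criteria satisfied by the sensed conditions."""
    met = sum(1 for key, limit in criteria.items()
              if conditions.get(key, 0.0) <= limit)
    return met / len(criteria)

def select_mode(conditions: dict, criteria: dict) -> OperationMode:
    """Map the match value to a level of autonomous operation (thresholds assumed)."""
    m = match_value(conditions, criteria)
    if m >= 0.9:
        return OperationMode.FULLY_AUTONOMOUS
    if m >= 0.5:
        return OperationMode.PARTIALLY_AUTONOMOUS
    return OperationMode.MANUAL
```

For example, if every sensed condition (grade, rainfall, etc.) is within its criterion, the sketch selects fully autonomous operation; if some conditions exceed their limits, it degrades toward manual control.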

[0022] The different modes of operation may work together. A manually controlled mode of operation may be used to guide a vehicle along a path. The manually guided vehicle may leave virtual waypoints to mark the path for future semi-autonomous or autonomous navigation along the same path. While being guided, an operator may also manually identify one or more milestones along the path. The vehicle may mark the location of the milestones relative to the path for future semi-autonomous or autonomous navigation. In a semi-autonomous or autonomous mode, the vehicle may move along the path and to one or more milestones indicated on the path in response to a command or automatically. An operator may also manually identify one or more zones the vehicle is to avoid moving through. The operator may use a user device (e.g., mobile phone, camera, laser pointer, IR emitter, motion-sensing controller, etc.) to indicate the outer boundaries of a zone to the vehicle. An operator may also manually identify the one or more zones by placing a plurality of indicators around an outer perimeter of the zone. The indicators may include visual indicators (e.g., flags, cones, markers, QR codes, etc.) or other detectable indicators (e.g., NFC tags, radio-signal emitters, etc.) which may be detected by the vehicle. A vehicle may sense the zone a single time and remember the position of the zone relative to the surrounding work environment even after the indicators are removed.
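
One way the indicator-based zone definition above could work is sketched below. The disclosure does not name a particular algorithm; this sketch assumes the zone boundary is the convex hull of the detected indicator positions and that the vehicle checks its path waypoints against that polygon. All function names are illustrative.

```python
# Hypothetical sketch: derive a keep-out zone boundary from detected
# indicator (x, y) positions, then test whether a path enters the zone.

def zone_boundary(indicators):
    """Convex hull (Andrew's monotone chain) of indicator positions."""
    pts = sorted(set(indicators))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def path_enters_zone(path, boundary):
    """True if any path waypoint lies inside the boundary polygon (ray casting)."""
    def inside(pt):
        x, y = pt
        hit = False
        for i in range(len(boundary)):
            x1, y1 = boundary[i]
            x2, y2 = boundary[(i + 1) % len(boundary)]
            if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                hit = not hit
        return hit
    return any(inside(p) for p in path)
```

Because the boundary is stored as geometry rather than as live sensor detections, a vehicle can keep avoiding the zone after the physical indicators are removed, consistent with the "sense once and remember" behavior described above.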

[0023] During operation in a semi-autonomous or autonomous mode, a manual override may also be provided to force the vehicle to switch to a manual operation mode. The manual override can be generated from a controller onboard the vehicle or from an external device (e.g., remote controller, mobile phone, laptop). The device can be onsite or offsite at a remote location. When taking manual control, an operator can take control of a plurality of vehicles or a single vehicle depending on the override provided.
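
The override scoping described above (one vehicle versus a group) can be sketched as follows. The data shapes and names here are assumptions for illustration only; the disclosure does not define an override message format.

```python
# Illustrative sketch (names and message format assumed): applying a
# manual-override command that targets either a single vehicle or an
# entire group, forcing each targeted vehicle into a manual mode.
def apply_override(vehicles: dict, override: dict) -> list:
    """vehicles maps vehicle id -> current mode string.
    override is {"scope": "all"} or {"scope": "single", "vehicle_id": ...}.
    Returns the list of vehicle ids that were switched to manual."""
    if override["scope"] == "all":
        targets = list(vehicles)
    else:
        targets = [override["vehicle_id"]]
    for vid in targets:
        vehicles[vid] = "manual"
    return targets
```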

Overall Vehicle

[0024] Referring to FIGS. 1 and 2, a machine, vehicle, trolley, transport, hauler, mule, or tug is shown as vehicle 10 according to an exemplary embodiment. The vehicle 10 may be configured to support, push, pull, turn, or otherwise facilitate movement of a product or components of a product throughout a manufacturing environment. By way of example, the vehicle 10 may move a product (e.g., another vehicle or machine) along a manufacturing line as the product is assembled. The vehicle 10 may move the product between stations where different assembly operations are performed. Additionally or alternatively, the vehicle 10 may be used to move parts or subassemblies (e.g., booms, engines, tires, etc.) throughout the manufacturing environment (e.g., to the product, to a storage area, etc.).

[0025] The vehicle 10 may be manually controlled, partially autonomous, or fully autonomous. In some embodiments, the vehicle 10 is configured as a semi-automated guided vehicle (SGV). When configured as an SGV, the vehicle 10 may be manually operated by an operator (e.g., through a wireless or tethered user interface). By way of example, the operator may manually control the steering of the vehicle 10. In some embodiments, the operator may also act as a supervisor over the autonomous operation of the vehicle 10. For example, the vehicle 10 may have a plurality of goals or objectives to complete (e.g., tasks, waypoints, etc.) arranged in an order of completion. The vehicle may then proceed to accomplish those goals automatically in the semi-autonomous mode. The operator may then prioritize one goal over the others such that the prioritized goal is completed first before the remaining goals. In some embodiments, the user control is provided over an entire group of vehicles 10 to manually take control and prioritize certain goals (e.g., tasks) over others. In some embodiments, the vehicle 10 is configured as an automated guided vehicle (AGV). When configured as an AGV, the vehicle 10 may navigate along a predefined route (e.g., using a magnetic strip or other fixed navigation element). If the vehicle 10 configured as an AGV encounters an obstacle, the vehicle 10 may rely on manual intervention from an operator (e.g., through a user interface) to correct course and navigate around the obstacle. In some embodiments, the vehicle 10 is configured as an autonomous mobile robot (AMR). When configured as an AMR, the vehicle 10 may autonomously navigate through an area without requiring a predefined path. The vehicle 10 configured as an AMR may avoid obstacles without manual intervention by an operator.

[0026] The vehicle 10 includes a chassis, shown as frame 12, that supports the other components of the vehicle 10. In some embodiments, the frame 12 defines an enclosure that contains one or more components of the vehicle 10. The frame 12 includes a pair of side portions, shown as drive modules 14, a central portion, shown as controls enclosure 16, and a lateral member, shown as back plate 18. The drive modules 14 each extend longitudinally along the vehicle 10 and are laterally offset from one another. The controls enclosure 16 and the back plate 18 each extend laterally between the drive modules 14, fixedly coupling the drive modules 14 to one another. The controls enclosure 16 and the back plate 18 are longitudinally offset from one another, such that a recess or passage, shown as implement recess 20, is defined between the controls enclosure 16, the back plate 18, and the drive modules 14.

[0027] The drive modules 14 may contain components that facilitate propulsion of the vehicle (e.g., the drivetrain 40). The drive modules 14 may include one or more removable or repositionable panels, shown as drive module doors 24, that facilitate access to components within the drive modules 14 from outside of the vehicle 10. The controls enclosure 16 may contain components that facilitate powering or control over the vehicle (e.g., the controller 102, the batteries 110). The controls enclosure 16 includes a removable or repositionable panel, shown as controls enclosure door 22, that facilitates access to components within the controls enclosure 16 from outside of the vehicle 10. In other embodiments, the vehicle 10 includes a separate housing, body, or enclosure that is coupled to the frame 12 and contains one or more components of the vehicle.

[0028] The frame 12 defines a top surface 30, a front surface 32, a rear surface 34, and a pair of side surfaces 36 of the vehicle 10. The top surface 30 extends substantially horizontally across the drive modules 14 and the controls enclosure 16. A distance from the top surface 30 to the ground beneath the vehicle 10 may define a height of the vehicle 10. The front surface 32 is positioned at a front end portion of the frame 12 and extends substantially vertically and laterally across the drive modules 14 and the controls enclosure 16. The rear surface 34 is positioned at a rear end portion of the frame 12 and extends substantially vertically and laterally across the drive modules 14 and the back plate 18. The side surfaces 36 each extend longitudinally along one of the drive modules 14, between the front surface 32 and the rear surface 34.

[0029] The vehicle 10 includes a drive system or driveline, shown as drivetrain 40, that is configured to propel and steer the vehicle 10. The driveline includes a pair of actuators or motors (e.g., hydraulic motors, pneumatic motors, electric motors, etc.), shown as drive motors 42. In some embodiments, the drive motors 42 are electric motors powered by an electrical energy source (e.g., the batteries 110, energy from a power grid external to the vehicle 10, etc.). The drive motors 42 are each configured to provide rotational mechanical energy to drive rotation of one or more tractive elements 44 (e.g., wheel and tire assemblies, shown in FIG. 2). In some embodiments, the drive motors 42 drive the left and right sides of the drivetrain 40 independently, facilitating skid steer operation of the vehicle 10. By way of example, the tractive elements 44 may be driven at the same speed and in the same direction to travel straight. By way of another example, the tractive elements 44 may be driven in different directions and/or at different speeds to turn the vehicle 10. By driving the tractive elements 44 at the same speed and in opposite directions, the drivetrain 40 may rotate the vehicle 10 about a substantially vertical axis, shown as central axis 46, that is substantially centered relative to the frame 12. Rotation of the vehicle 10 about the central axis 46 may facilitate reorienting the vehicle 10 without changing position (i.e., turning in place).
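
The skid-steer behavior above follows standard differential-drive mixing, which can be written in a few lines. This sketch is illustrative, not taken from the disclosure; `v`, `w`, and `track_width` are assumed names for the forward-speed command, turn-rate command, and lateral spacing between the left and right tractive elements.

```python
# Illustrative differential-drive mixing for skid steer operation:
# equal speeds in the same direction drive straight; equal speeds in
# opposite directions rotate the vehicle in place about its central axis.
def skid_steer_speeds(v: float, w: float, track_width: float):
    """Return (left, right) tractive element speeds for forward command v
    and turn-rate command w (positive w turns toward the left side)."""
    left = v - w * track_width / 2.0
    right = v + w * track_width / 2.0
    return left, right
```

With `w = 0` both sides match (straight travel); with `v = 0` the sides are equal and opposite, producing the turn-in-place behavior described for rotation about the central axis 46.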

[0030] The frame 12, the drivetrain 40, and various other components coupled to the frame 12 form a base portion of the vehicle 10, shown as base assembly 48. To facilitate moving a product, the vehicle 10 may include an implement that selectively couples the base assembly 48 to a product. FIGS. 3 and 4 illustrate a first implement, shown as lifting implement 50, and FIGS. 5 and 6 illustrate a second implement, shown as cart implement 60. Each implement may be received within the implement recess 20 and fixedly coupled to the frame 12. In some embodiments, the implement is removable from the implement recess 20 to facilitate interchanging with another type of implement. By way of example, the lifting implement 50 may be removed and replaced with the cart implement 60. In other embodiments, the implement is permanently installed on the vehicle.

[0031] Referring to FIGS. 3 and 4, the lifting implement 50 includes a product interface, shown as cradle 52, and a lift device or lifting assembly, shown as lift assembly 54. The cradle 52 is configured to receive and directly support a product, shown as telehandler 56. By way of example, the cradle 52 may receive an axle assembly of the telehandler 56. The lift assembly 54 couples the cradle 52 to the frame 12. The lift assembly 54 may be extended to raise the cradle 52 or retracted to lower the cradle 52. Accordingly, the lift assembly 54 may be used to raise or lower the telehandler 56.

[0032] Certain large products, such as the telehandler 56, may be difficult to support with only a single vehicle 10. To facilitate steering the product and spreading out the weight of the product, multiple vehicles 10 may be utilized. In the example shown in FIG. 4, a front axle of the telehandler 56 is supported by one vehicle 10, and a rear axle of the telehandler 56 is supported by another vehicle 10. In some embodiments, the vehicles 10 are independently operable. In other embodiments, operation of one vehicle 10 is dependent upon the other vehicle 10. By way of example, a first vehicle 10 may supply electrical energy to, propel, and/or control operation of the other vehicle 10.

[0033] Referring to FIGS. 5 and 6, the cart implement 60 includes a pair of protruding interface elements (e.g., pins), extending above the top surface 30. Specifically, the cart implement 60 includes a central pin, shown as driving pin 62, and an offset pin, shown as turning pin 64, that can each be selectively raised and lowered by an actuator of the cart implement 60. The driving pin 62 is centered about the central axis 46, and the turning pin 64 is offset from the central axis 46. The driving pin 62 and the turning pin 64 are positioned to engage a mobile platform, shown as cart 66, that supports a product subassembly, shown as boom assembly 68.

[0034] When extended, the driving pin 62 and the turning pin 64 each engage the cart 66 to limit movement of the cart 66 relative to the base assembly 48. When both the driving pin 62 and the turning pin 64 engage the cart 66, the cart 66 may be fixed to the base assembly 48. When only the driving pin 62 engages the cart 66, the base assembly 48 may rotate freely about the central axis 46 relative to the cart 66, but movement of the vehicle 10 in a particular direction may cause movement of the cart 66 in that same direction. When the driving pin 62 and the turning pin 64 are both retracted away from the cart 66, the vehicle 10 may move freely relative to the cart 66.

[0035] The cart 66 may be equipped with casters or slides to facilitate free movement of the cart 66 along the ground. In some embodiments, the cart 66 supports some or all of the weight of the boom assembly 68. The driving pin 62 and the turning pin 64 may generally push horizontally on the cart 66, such that there may be little or no transmission of vertical forces between the cart implement 60 and the cart 66. Accordingly, the vertical load on the vehicle 10 may be minimized while still permitting the vehicle 10 to move the cart 66 and the boom assembly 68 throughout the environment as desired. This reduction in load may reduce the overall cost of the vehicle 10.

[0036] Referring to FIG. 7, the vehicle 10 and a control system 100 for the vehicle 10 are shown according to an exemplary embodiment. The control system 100 may facilitate operation of the vehicle 10 and/or other devices of a production environment. Although certain components are shown as being included in the base assembly 48 and/or the implements 50 and 60, it should be understood that any component may be positioned in the base assembly 48, the lifting implement 50, or the cart implement 60 or duplicated across multiple thereof.

[0037] The vehicle 10 includes a controller 102 that controls operation of the vehicle 10. The controller 102 includes a processing circuit, shown as processor 104, and a memory device, shown as memory 106. The memory 106 may contain one or more instructions that, when executed by the processor 104, cause the processor 104 to perform the various functions described herein.

[0038] The controller 102 further includes a communication interface 108 (e.g., a communication circuit, a network interface, etc.) that facilitates communication with (e.g., to and from) other components of the vehicle 10 and/or the control system 100. The communication interface 108 may facilitate wired communication (e.g., through CAN, Ethernet, communication of power, etc.). Additionally or alternatively, the communication interface 108 may facilitate wireless communication (e.g., through Bluetooth, Wi-Fi, radio transmission, inductive transmission of energy, etc.).

[0039] The base assembly 48 includes one or more energy storage devices, shown as batteries 110. The batteries 110 store energy (e.g., as chemical energy). The batteries 110 may deliver electrical energy to other components of the vehicle 10 to power the vehicle 10. The batteries 110 may be charged by an outside source of energy (e.g., an electrical grid, a wireless charging interface, etc.). In other embodiments, the base assembly 48 includes a different type of energy storage device (e.g., a fuel tank for an internal combustion engine of a generator, a fuel cell, etc.).

[0040] The base assembly 48, the lifting implement 50, and the cart implement 60 may each include one or more sensors 112 operatively coupled to the controller 102. The sensors 112 may provide sensor data describing the current status of the vehicle 10 and/or the surrounding environment. By way of example, the sensors 112 may include mapping or imaging sensors (e.g., LIDAR sensors, light curtains, cameras, ultrasonic sensors, etc.). By way of example, the sensors 112 may include position sensors (e.g., GPS, potentiometers, encoders, etc.). By way of example, the sensors 112 may include orientation or acceleration sensors (e.g., accelerometers, gyroscopic sensors, inertial measurement units, compasses, etc.). By way of example, the sensors 112 may include pressure sensors, flowmeters, buttons, or other types of sensors.

[0041] The base assembly 48 may include one or more operator interface elements (e.g., input devices, output devices, etc.), shown as user interface 114. The user interface 114 may include output devices that provide information to one or more users. By way of example, the user interface 114 may include displays, speakers, lights, haptic feedback (e.g., vibrators, etc.), or other output devices. The user interface 114 may include input devices that receive information (e.g., commands) from one or more users. By way of example, the user interface 114 may include buttons, switches, knobs, touchscreens, microphones, or other input devices.

[0042] The lifting implement 50 and/or the cart implement 60 may include one or more actuators 116 that facilitate controlled movement (e.g., movement of the lifting implement 50 or the cart implement 60). The actuators 116 may include linear actuators (e.g., electric linear actuators, hydraulic cylinders, etc.), motors (e.g., electric motors, hydraulic motors, etc.), or other types of actuators. The actuators 116 may be electrically-powered, hydraulically-powered, or otherwise powered.

[0043] The lifting implement 50 and/or the cart implement 60 may include a hydraulic system 120. The hydraulic system 120 may supply pressurized hydraulic fluid (e.g., hydraulic oil) to facilitate operation of other components of the vehicle 10. By way of example, the hydraulic system 120 may supply pressurized hydraulic fluid to an actuator 116. In some embodiments, the hydraulic system 120 forms a self-contained hydraulic loop with one or more actuators 116.

[0044] The hydraulic system 120 includes a low-pressure reservoir, shown as tank 122, that stores a volume of hydraulic fluid at a low pressure. A pump 124 receives electrical energy from the batteries 110, draws hydraulic fluid from the tank 122, and supplies a flow of pressurized hydraulic fluid. One or more valves 126 (e.g., solenoid valves, directional control valves, etc.) control the flow of the hydraulic fluid from the pump 124. By way of example, the valves 126 may control the flow rate, direction, and destination of hydraulic fluid flowing throughout the hydraulic system 120. The controller 102 may control operation of the actuators 116 by controlling the valves 126.
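
One way the controller 102 might command the valves 126 to drive an actuator 116 toward a target position is a simple proportional valve command, sketched below. This is purely illustrative; the disclosure does not specify a control law, and the gain and clamp limits are assumed values.

```python
# Hypothetical sketch: a controller commanding a proportional valve to
# move a hydraulic actuator toward a target position. The spool command
# is normalized to [-1, 1]; its sign selects the flow direction and its
# magnitude sets the flow rate through the valve.
def valve_command(target_pos: float, measured_pos: float, gain: float = 2.0) -> float:
    """Proportional valve command from position error, clamped to [-1, 1]."""
    u = gain * (target_pos - measured_pos)
    return max(-1.0, min(1.0, u))
```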

[0045] The control system 100 further includes additional devices in communication with the vehicle 10. The devices may communicate with the vehicle 10 directly or through a network 130 (e.g., a local area network, a wide area network, the Internet, etc.). The network 130 may utilize wireless and/or wired communication. In some embodiments, the network 130 is a mesh network formed between multiple devices of the control system 100 (e.g., permitting indirect communication between two devices through a third device).

[0046] The control system 100 may include multiple vehicles 10. A vehicle 10 may communicate with other vehicles 10 to share information and facilitate operation. By way of example, a vehicle 10 may provide commands to another vehicle 10 to coordinate transportation of a large item that is carried by both of the vehicles 10. By way of another example, a vehicle 10 may provide its location to another vehicle 10 to facilitate path generation and avoid collisions.

[0047] The control system 100 may include one or more user devices 132 (e.g., smartphones, tablets, laptops, desktop computers, etc.). The user devices 132 may facilitate a user monitoring and/or controlling operation of the vehicles 10. By way of example, the user devices 132 may indicate statuses of the vehicles 10 (e.g., positions, whether maintenance is needed, if any errors are occurring, what task a vehicle 10 is assigned, etc.). By way of example, the user devices 132 may permit a user to command a vehicle 10 to travel to a different place or to assign a vehicle 10 to a particular production line.

[0048] The control system 100 may include one or more remote devices 134 (e.g., servers). In some embodiments, a remote device 134 functions as a production manager that controls various operations throughout a manufacturing environment. The production manager may receive requests for production of certain equipment (e.g., fifteen telehandlers are requested for production by Apr. 12, 2025, etc.). The production manager may monitor the statuses of vehicles 10, personnel, equipment, and raw materials. By way of example, the vehicles 10 may provide sensor data from the sensors 112 to a remote device 134 for storage and/or analysis. Based on the available data, the production manager may generate assignments for vehicles 10, personnel, equipment, and raw materials to meet the production requests. The production manager may adapt to changes in availability (e.g., by reassigning a vehicle 10 to a different task or area in response to a failure of one of the vehicles 10). The assignments for a vehicle 10 may include a path along which the vehicle 10 should travel, a desired configuration of the vehicle 10 (e.g., the type of implement available to the vehicle 10), an amount of time that the vehicle 10 should wait at a given station, etc.

[0049] Referring to FIG. 8, a manufacturing environment or production system 150 is shown according to an exemplary embodiment. The production system 150 may include a series of vehicles 10 that move a product 152 and a subassembly 154 through various stages of assembly (e.g., as controlled by a remote device 134). The vehicles 10 move the product 152 along a first path, shown as manufacturing line 156, and the vehicles 10 move the subassembly 154 along a second path, shown as manufacturing line 158. A series of manufacturing or assembly stations, shown as stations 160, are spaced at regular intervals along the manufacturing lines 156 and 158. Each station 160 may be associated with a different manufacturing or assembly process that is performed there. By way of example, there may be stations 160 for attaching components to a product 152, coupling components with hoses or wires, confirming that certain functions are operating properly, etc.

[0050] Initially the product 152 and the subassembly 154 move along separate manufacturing lines 156 and 158. After the last station 160 needed to prepare the subassembly 154, the manufacturing line 158 intersects the manufacturing line 156, and the subassembly 154 is attached to the product 152. The product 152 and the subassembly 154 then move together along the manufacturing line 156. This proceeds until the product 152 is fully assembled and removed from the vehicles 10. The vehicles 10 may then return to collect another product that requires assembly, and the manufacturing process is repeated.

[0051] In some embodiments, the product 152 assembled by the production system is a vehicle or work machine. By way of example, the product 152 may be a lift device, such as a telehandler, a scissor lift, a boom lift, a vertical lift, an aerial work platform, or another type of lift device. By way of another example, the product 152 may be a fire truck, an aircraft rescue and firefighting apparatus (ARFF) truck, a refuse vehicle, a concrete mixing truck, a tow truck, a broadcast van, a military vehicle, a robot, a truck, a van, a passenger vehicle, or another type of vehicle. In other embodiments, the product 152 is not a vehicle (e.g., is a stationary piece of equipment).

Reconfiguration of Vehicle Operation and Corresponding Autonomous Modes

[0052] As described above, one or more of the vehicles 10 (e.g., one or more embodiments of the vehicle 10 shown in, and described with reference to, FIG. 1) may be configured to operate in a plurality of different modes of operation, including different degrees of autonomous operation (e.g., manually controlled, partially autonomous, or fully autonomous). In some embodiments, a vehicle 10 can be configured to switch between different modes of operation (e.g., between a mode of operation that is substantially user controlled and a mode of operation that is fully, or at least substantially, autonomous). Stated differently, in some embodiments, the vehicles may be configured to switch between different levels of autonomous operation (e.g., minimally autonomous to fully autonomous and vice versa) based on information related to the operation of the one or more vehicles, including based on the feasibility of operating a vehicle autonomously in the presence of one or more different operational conditions (e.g., inclement weather, power outage, steep grades, spills, unexpected obstacles, etc.). For example, the vehicles may switch between a first mode of operation and a second mode of operation based on a match value determined from sensor data, which is indicative of one or more operational conditions, and one or more operational criteria.

[0053] The vehicles 10 may switch between a first mode of operation and a second mode of operation based on sensor data indicative of one or more operational conditions and one or more operational criteria. More specifically, the vehicle may determine a mode of operation based on sensor data relevant to the operation of the vehicle, including, for example, data collected by one or more light sensors, traction data (e.g., sensor data indicating a quality of the terrain on which the vehicle operates), load data (e.g., sensor data indicative of a load position, load orientation, the distribution of a load including between two or more vehicles, etc.), location data, and the like. The sensor data may be provided by sensors 112 of the vehicle 10, or based on sensor data provided by one or more remote devices 134 or other vehicles 10 communicably coupled to the vehicle 10.

[0054] For example, a vehicle 10 may initially operate in a first mode, as an autonomous mobile robot, and receive sensor data from one or more sensors 112 (e.g., cameras, LIDAR, etc.) indicative of a large unexpected obstacle in the vehicle's path. The vehicle 10 may initially remain in the first mode and continue to operate as an autonomous mobile robot. Upon unsuccessfully attempting to navigate around the unexpected obstacle, or due to sensor data indicating one or more additional operational conditions (e.g., low light/visibility, unexpected location, and/or an unstable load carried by the vehicle), the vehicle 10 may change operation to operate in a second mode (e.g., as an autonomous guided vehicle or a semi-autonomous guided vehicle, as described above). In some embodiments, the vehicle 10 may operate in one of a plurality of modes such as an autonomous mode, a semi-autonomous mode, or a manual mode and transition to a different mode of the plurality of modes based on a command from at least one of a user device (e.g., user device 132), user interface 114, or remote device 134.

[0055] FIG. 9 depicts an example method 2600 for adjusting the mode of operation of a vehicle, according to one embodiment of the present disclosure. The method 2600 can be performed by at least the controller 102 and/or processor 104 of the system 100 depicted in FIG. 7, but is not limited thereto. In some implementations, one or more of the steps may be performed by a different processor, server, or any other computing device (e.g., user devices 132 and/or remote devices 134 of FIG. 7). For instance, one or more of the steps may be performed via a cloud-based service including any number of servers, which may be in communication with a processor of the vehicle 10 and/or an associated control system.

[0056] Although the steps are shown in FIG. 9 having a particular order, the steps may be performed in any order. In some instances, some of these steps may be optional. The method 2600 may be executed to improve the operation of one or more vehicles, including vehicles operating autonomously and/or semi-autonomously, within a manufacturing environment.

[0057] The method 2600 can, in some embodiments, include a method of operating a vehicle. In some embodiments, the method 2600 can include operating 2610 a vehicle in a first mode of operation. The vehicle 10 may include a plurality of modes of operation such as an autonomous mode, a semi-autonomous mode, and a user-guided or manual mode. In the autonomous mode, the vehicle 10 may be referred to as a self-guided vehicle or an autonomous mobile robot (AMR). In the semi-autonomous mode, the vehicle may be referred to as an automated guided vehicle (AGV). In the user-guided mode, the vehicle may be a remote-controlled vehicle. For example, in some embodiments, the method 2600 may include operating a vehicle in a first mode of operation that is fully autonomous or as an autonomous mobile robot, as described above, with reference to FIGS. 1 and 2. The autonomous mobile robot may be configured to operate within a manufacturing environment, including to move one or more components and/or products, without, or substantially without, user intervention or direct input from a user to control the operation of the vehicle 10.

[0058] Alternatively, or in addition, the vehicle 10 may operate in a first mode of operation that is substantially user controlled and/or that requires substantial input from a user to operate the vehicle 10, such as to steer the vehicle 10 while it is in a semi-automated guided vehicle configuration. For example, the vehicle 10 may operate in a semi-autonomous guided vehicle configuration based on the presence of one or more operating conditions.

[0059] The method 2600 can further include receiving 2620 sensor data indicative of one or more operational conditions. In some embodiments, operational conditions may include one or more conditions of the vehicles' environment that relate to the vehicle's operation. Operational conditions may include, for example, local weather, vehicle location, grade, traction, connectivity, proximity to one or more additional vehicles, tire pressure, battery levels, detected obstacles and/or obstructions, load data, light and/or visibility, sound, temperature, etc. The sensor data may be provided by the sensors 112 of the vehicle 10 or received from one or more other vehicles 10.

[0060] For example, in some embodiments, the method 2600 can receive sensor data indicative of an exceptionally steep grade based on one or more orientation sensors of the vehicle 10 and may further receive sensor data indicative of a low traction environment (e.g., as traction data based on, or received from, the operation of the vehicle's drivetrain).

[0061] The method 2600 can include accessing 2630 one or more operational criteria, wherein the one or more operational criteria correspond to one or more operational modes of the vehicle. The operational criteria may include one or more criteria specified for one or more different modes of operation of the vehicle, including, for example, one or more criteria that prohibit, or substantially conflict with, one or more modes of operation of the vehicle 10. For example, the operational criteria may include criteria that specify the vehicle 10 operate in a user controlled mode when a grade is above a given angle, when vehicle traction is inconsistent, or when the vehicle's load is precarious and/or difficult to maneuver safely.

[0062] Alternatively, or in addition, the one or more operational criteria may specify that a semi-autonomous and/or partially autonomous operation of the vehicle 10 may be used in the presence of certain operational conditions. For example, where the vehicle 10 has reliable connectivity to a vehicle network and/or guiding features, the vehicle may continue to operate autonomously even when a load is large, heavy, and/or costly, or the vehicle must maneuver between relatively small spaces and/or with little physical clearance.

[0063] In some embodiments, the method 2600 can include determining 2640 a match value between the sensor data (e.g., received at 2620) and the one or more operational criteria (e.g., accessed at 2630). For example, the method 2600 can include determining a match value via the controller 102 and/or the processor 104 of the vehicle. In some embodiments, determining the match value may include determining a degree of overlap and/or similarity between the operational conditions indicated by the sensor data and the one or more operational criteria accessed at 2630. For example, the method 2600 can include determining a match value for each operational condition for which sensor data has been received and for which an operational criterion may apply.
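As one illustrative interpretation of the match-value determination at 2640 (this sketch is not part of the disclosure; the criterion names, predicates, and thresholds, such as the 15-degree grade limit, are assumptions), the match value may be computed as the fraction of applicable operational criteria satisfied by the sensed conditions:

```python
def determine_match_value(sensor_data, criteria):
    """Return the fraction of applicable operational criteria matched.

    sensor_data: dict mapping condition name -> measured value
    criteria: dict mapping condition name -> predicate over the value
    """
    applicable = [name for name in criteria if name in sensor_data]
    if not applicable:
        return 0.0
    matched = sum(1 for name in applicable if criteria[name](sensor_data[name]))
    return matched / len(applicable)

# Hypothetical criteria under which user-controlled operation is specified:
manual_mode_criteria = {
    "grade_deg": lambda g: g > 15.0,      # steep grade
    "traction": lambda t: t < 0.4,        # low/inconsistent traction
    "load_stability": lambda s: s < 0.5,  # precarious load
}

sensor_data = {"grade_deg": 18.0, "traction": 0.3, "load_stability": 0.9}
match = determine_match_value(sensor_data, manual_mode_criteria)
```

Here two of the three applicable criteria are matched, so the match value is 2/3; how such a value is thresholded to trigger a mode switch is left to the implementation.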

[0064] In some embodiments, the method 2600 can include determining if a user override is received at 2650. The user override can be a user input designating a mode of operation for the vehicle to operate in. For example, the user override can indicate the vehicle 10 should change from an autonomous mode to a manual mode. The override may be received directly at the vehicle 10 from the user interface 114. In some embodiments, the override is received from a remote device external to the vehicle 10 such as the user device 132 (e.g., a mobile application) or the remote device 134 (e.g., offsite control). In some embodiments, an override received at a first vehicle 10 is sent to other vehicles 10 within a predetermined distance of the first vehicle 10.

[0065] In some embodiments, if a user override is not received, the method 2600 can include switching 2660 operation of the vehicle 10 from the first mode of operation to a second mode of operation based on the match value indicating a match between one or more operational criteria and the one or more operational conditions indicated by the sensor data. For example, the controller 102 of the vehicle 10 may change the operation of the vehicle from fully autonomous, or autonomous mobile robot operation, to a second mode of operation as a semi-autonomous guided vehicle for which steering must be controlled via user input. The vehicle may change to the semi-autonomous guided vehicle mode of operation based on the match value indicating the vehicle 10 lacks network connectivity and is located in an unexpected location (e.g., outside of an expected manufacturing environment and/or manufacturing station).

[0066] In some embodiments, if a user override is received at 2650, the method 2600 can include switching 2670 operation of the vehicle 10 from the first mode of operation to a second mode of operation based on the user override. For example, a user can override the vehicle and inhibit it from transitioning to an autonomous mode, and instead direct it to a manual mode. The user override may otherwise direct the vehicle 10 to transition to any of the modes.
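The decision logic of steps 2650 through 2670 can be sketched as follows; this is an illustrative assumption, as the disclosure does not specify how the match value is compared, and the threshold parameter is hypothetical:

```python
def select_mode(current_mode, match_value, threshold, criteria_mode, override=None):
    """Sketch of steps 2650-2670: the user override takes precedence;
    otherwise switch modes when the match value meets the threshold."""
    if override is not None:
        # Steps 2650/2670: an override designates the mode directly.
        return override
    if match_value >= threshold:
        # Step 2660: criteria matched, so switch to the corresponding mode.
        return criteria_mode
    # Otherwise remain in the first mode of operation.
    return current_mode
```

For example, `select_mode("autonomous", 0.67, 0.5, "manual")` would return `"manual"`, while supplying `override="semi-autonomous"` would return the override regardless of the match value.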

Selective Manual Operation

[0067] Referring now to FIG. 10, the vehicle 10 is shown in the manufacturing environment or production system 150, according to an exemplary embodiment. The production system 150 includes a plurality of zones, shown as a first zone 2710, a second zone 2715, and a third zone 2725. The plurality of zones, such as the first zone 2710 and the second zone 2715, are areas of the production system 150 that an operator such as operator 2705 may determine the vehicle 10 should not enter or should operate within only under certain circumstances or conditions (e.g., only in manual modes, only below a certain speed, only during certain hours, etc.). In some embodiments, a zone may surround a hazard. For example, the first zone 2710 surrounds hazard 2714 in the production system 150. Still, one or more of the zones may simply be an area the vehicle 10 is not intended to enter. In some embodiments, the plurality of zones may be established manually by an operator, such as operator 2705, using the user device 132 to mark the boundaries of the zone. The plurality of zones may additionally and/or alternatively be established manually by an operator positioning one or more indicators (e.g., indicators 2720) in the production system 150. The vehicle 10 determines the location of the zones 2710, 2715 in the production system 150 by sensing one or more indicators (e.g., indicators 2720) identifying the zone, or by receiving the location of the zone from another device (e.g., user device 132, another vehicle 10, etc.).

[0068] As shown in FIG. 10, the first zone 2710 surrounds a hazard 2714 and is established by a plurality of indicators 2720. The indicators 2720 are physical devices positioned in the production system 150 to delineate the boundaries of the first zone 2710 for the vehicle 10. In some embodiments, a user may place the indicators 2720 in the production system 150 to ensure a vehicle 10 does not enter the zone 2710. The indicators 2720 may include visual markers (e.g., flags, text, colors, shapes, symbols, QR codes, barcodes, etc.). For example, the indicators 2720 may be cones or flags simply placed by an operator in the production system 150. In some embodiments, when the indicators 2720 include visual markers, the vehicle 10 may sense the indicators 2720 using optical sensors in the sensors 112 (e.g., cameras, photodetectors, infrared sensors, etc.). The indicators 2720 may additionally and/or alternatively emit a signal (e.g., sound, electromagnetic wave, etc.). The signal may be a wireless signal such as Bluetooth, Wi-Fi, NFC, UWB, or other types of radio signals, a light with predetermined characteristics (e.g., color, brightness, etc.), or another type of signal. In some embodiments, when the indicators 2720 emit a signal, the vehicle 10 may include a corresponding receiver in the sensors 112 to receive the signal and determine a location of the respective indicator 2720. In some embodiments, there are a plurality of types of indicators 2720, wherein each type includes a unique visual marker and/or emits a unique signal which differentiates one type of indicator 2720 from another.

[0069] Upon sensing the indicators 2720, the controller 102 of the vehicle 10 performs one or more actions to determine the outer perimeter of the zone 2710. The actions may depend on the type of indicators 2720 that are sensed. The controller 102 of the vehicle 10 may reference a lookup table to determine a corresponding action or sequence of actions to take for each type of indicator 2720. For example, a first type of indicator 2720 may represent a corner of a zone, as shown in the first zone 2710. Upon determining that the indicators 2720 are of the corner type, the vehicle 10 determines an outer perimeter line, shown as perimeter 2712, which is defined by the indicators 2720. The perimeter 2712 may be a continuous line formed by connecting adjacent indicators 2720 (i.e., the vertices or corners of the first zone 2710) to each other until each indicator 2720 is connected to the rest of the indicators 2720. The types of indicators 2720 may indicate that the indicator 2720 is a corner, a center, or a side, or may indicate a height of the zone to be established, a duration of the zone to be established, etc. For the height of the zone, the indicator 2720 may indicate that the zone, such as zone 2710, should not be entered by a vehicle 10 with a height greater than the height indicated by the type of indicator. For example, the type of indicator may represent a zone which begins 2 feet above the ground of the production system 150, such that vehicles 10 shorter than 2 feet may freely enter the zone, but vehicles 10 at or greater than 2 feet may not. The duration of a zone may indicate how long vehicles 10 are to avoid the zone. In some embodiments, once a zone defined by indicators 2720 is sensed by a vehicle 10, the indicators 2720 may be removed and the vehicle 10 may maintain the zone (e.g., in its memory 106). The duration the zone is maintained and/or avoided may be a predetermined amount of time, such as 1 hour, 1 day, or 1 week, or a predetermined number of passes by a vehicle 10, such as after passing and identifying the zone once, twice, three times, etc. After the duration has elapsed, the vehicle 10 may freely enter the zone. In such a manner, the features of the zones may be customized and/or determined by the type of indicator 2720 used.
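One minimal way to connect corner-type indicators into a closed perimeter (a sketch under the assumption of a convex zone and 2D indicator positions, not the disclosed implementation) is to order the sensed positions by angle around their centroid and then test candidate positions against the resulting polygon with ray casting:

```python
import math

def perimeter_from_corners(corners):
    """Order corner indicators counterclockwise around their centroid so that
    connecting adjacent indicators yields a closed perimeter (convex zones)."""
    cx = sum(x for x, _ in corners) / len(corners)
    cy = sum(y for _, y in corners) / len(corners)
    return sorted(corners, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

def point_in_zone(point, perimeter):
    """Ray-casting test: True if the point lies inside the perimeter polygon."""
    x, y = point
    inside = False
    for i in range(len(perimeter)):
        x1, y1 = perimeter[i]
        x2, y2 = perimeter[(i + 1) % len(perimeter)]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

corners = [(4, 3), (0, 0), (0, 3), (4, 0)]  # corner indicators, sensed in any order
zone = perimeter_from_corners(corners)      # -> [(0, 0), (4, 0), (4, 3), (0, 3)]
```

Here a position at (2, 1) falls inside the zone while (5, 1) does not; a production system would also carry the height and duration attributes encoded by the indicator type.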

[0070] Upon determining the boundary of the first zone 2710, the vehicle 10 may automatically adjust its path, shown as path A, to avoid the first zone 2710. In some embodiments, the vehicle 10 selects a path from a plurality of possible paths. In some embodiments, the selected path may be the path with the highest average distance from the plurality of zones along the path. In some embodiments, the selected path may be the path with the lowest average distance from the plurality of zones along the path. In some embodiments, the selected path may be the path that extends adjacent or next to the fewest zones. In some embodiments, the selected path may be the path with the largest minimum distance between the vehicle 10 and one of the plurality of zones. In some embodiments, the selected path may be the shortest path that maintains a predetermined minimum distance from each of the plurality of zones. In such embodiments, the vehicle 10 may be in a semi-autonomous or autonomous mode and automatically determine the appropriate path taking into account the sensed first zone 2710.
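One of the selection rules above (largest minimum distance between the vehicle and any zone) can be sketched as follows; representing candidate paths as sampled point sequences and zones by representative points is an assumption for illustration, not part of the disclosure:

```python
import math

def min_distance_to_zones(path, zone_points):
    """Smallest distance from any sampled point on the path to any zone point."""
    return min(math.dist(p, z) for p in path for z in zone_points)

def select_path(paths, zone_points):
    """Select the candidate path with the largest minimum distance to the zones."""
    return max(paths, key=lambda path: min_distance_to_zones(path, zone_points))

path_a = [(1.0, 0.0), (2.0, 0.0)]  # passes within 1 unit of the zone
path_b = [(3.0, 0.0), (4.0, 0.0)]  # stays at least 3 units away
chosen = select_path([path_a, path_b], [(0.0, 0.0)])
```

The other rules in the paragraph (highest or lowest average distance, fewest adjacent zones, shortest path above a minimum clearance) would substitute a different key function in the same structure.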

[0071] In some embodiments with only two indicators 2720, the controller 102 determines a line connecting the indicators 2720 together, and the vehicle 10 treats the line as a boundary not to be crossed. In some instances, a single indicator 2720 is used, such as for the third zone 2725. In such cases, the vehicle 10 establishes a zone, shown as the third zone 2725, with the indicator 2720 at its center. The radius of the third zone 2725 may be a predetermined distance from the indicator 2720. The predetermined distance may be based on the type of indicator 2720 used to indicate the third zone 2725. In other embodiments, the predetermined distance is configured by an operator or owner of the vehicle 10. Still, in other embodiments, the vehicle 10 may be configured to determine a maximum possible distance from the indicator 2720 based on the position of the indicator 2720 relative to surrounding objects, walls, or hazards in the production system 150.
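The single-indicator and two-indicator cases reduce to simple geometric tests; the following is a minimal sketch (2D positions assumed, not the disclosed implementation):

```python
import math

def in_circular_zone(point, indicator, radius):
    """Single-indicator zone (e.g., third zone 2725): a circle of a
    predetermined radius centered on the indicator."""
    return math.dist(point, indicator) <= radius

def crosses_boundary(p1, p2, a, b):
    """Two-indicator zone: True if planned travel from p1 to p2 crosses the
    boundary segment between indicators a and b (orientation tests)."""
    def orient(p, q, r):
        return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (orient(p1, p2, a) * orient(p1, p2, b) < 0 and
            orient(a, b, p1) * orient(a, b, p2) < 0)
```

For instance, travel from (0, -1) to (0, 1) crosses a boundary drawn between indicators at (-1, 0) and (1, 0), while travel shifted to x = 2 does not.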

[0072] The production system 150 also includes the second zone 2715. The second zone 2715 is shown in a partially completed state as it is being defined by an operator 2705. The second zone 2715 may be defined or established by an operator, shown as operator 2705, operating the user device 132. The user device 132 may be a motion/motion-sensing controller, smartphone, tablet, laptop, camera, laser pointer, IR emitter, etc. The user device 132 may facilitate a user mapping out the zone 2715 in virtual space. By way of example, the user device 132 may be a motion/motion-sensing controller with an accelerometer, gyroscope, and/or a magnetometer that is capable of determining its own orientation in 3D space. The user device 132 can then act as a digital marker or wand that can be wielded by an operator such as operator 2705 to delineate the boundaries of the second zone 2715. The operator 2705 may point the user device 132 at a point of the production system 150. The path 2717 traced by the virtual end 133 of the user device 132 may represent the outer boundary of a zone. Using the user device 132, an operator such as operator 2705 may trace a line or a boundary of a zone. The user device 132 is communicably coupled to the vehicle 10 and may provide the path 2717 traced by the user device 132 to the vehicle 10 as zone data for the vehicle 10 to process and place the zone in the production system 150.

[0073] By way of another example, the user device 132 may include a touch interface, a display, and a camera. The camera may image the production system 150 and the operator 2705 may provide one or more user inputs to identify the path on the image of the production system 150 displayed. The user inputs may include tracing the boundary of the second zone 2715, and/or selecting multiple points in the image of the production system 150 to be connected to form the boundary of the second zone 2715. The data from the user device (e.g., the image, the user inputs, and/or the second zone 2715) may be provided to the vehicle 10.

[0074] In some embodiments, the operator 2705 may place the vehicle 10 in a manual operating mode and control the vehicle 10 to drive around the zone to define the boundary of the zone. The path traversed by the vehicle 10 may then be stored in memory as a boundary not to be crossed.

[0075] FIG. 11 depicts an example method 2800 for establishing and traversing a plurality of zones with a semi-autonomous or autonomous vehicle, according to an exemplary embodiment. The method 2800 can be performed by at least the controller 102 and/or processor 104 of the system 100 depicted in FIG. 7, but is not limited thereto. In some implementations, one or more of the steps may be performed by a different processor, server, or any other computing device (e.g., user devices 132 and/or remote devices 134 of FIG. 7). For instance, one or more of the steps may be performed via a cloud-based service including any number of servers, which may be in communication with a processor of the vehicle 10 and/or an associated control system.

[0076] Although the steps are shown in FIG. 11 having a particular order, the steps may be performed in any order. In some instances, some of these steps may be optional. The method 2800 may be executed to improve the operation of one or more vehicles, including vehicles operating autonomously and/or semi-autonomously, within a manufacturing environment.

[0077] The method 2800 can, in some embodiments, include a method of operating a vehicle such as vehicle 10. The vehicle 10 may include a plurality of modes of operation such as an autonomous mode, a semi-autonomous mode, and a user-guided or manual mode. In the autonomous mode, the vehicle 10 may be referred to as a self-guided vehicle or an autonomous mobile robot (AMR). In the semi-autonomous mode, the vehicle may be referred to as an automated guided vehicle (AGV). In the user-guided mode, the vehicle may be a remote-controlled vehicle. For example, in some embodiments, the method 2800 may include at step 2810 operating a vehicle in a first mode of operation that is fully autonomous or as an autonomous mobile robot, as described above, with reference to FIGS. 1 and 2. The autonomous mobile robot may be configured to operate within a manufacturing environment, including to move one or more components and/or products, without, or substantially without, user intervention or direct input from a user to control the operation of the vehicle 10.

[0078] Alternatively, or in addition, at step 2810 the vehicle 10 may operate in a first mode of operation that is substantially user controlled and/or that requires substantial input from a user to operate the vehicle 10, such as to steer the vehicle 10 while it is in a semi-automated guided vehicle configuration. For example, the vehicle 10 may operate in a semi-autonomous guided vehicle configuration based on the presence of one or more operating conditions.

[0079] The method 2800 further includes at step 2820 establishing one or more zones in a workspace. The one or more zones (e.g., first zone 2710, second zone 2715, third zone 2725, etc.) may be established by an operator using a user device (e.g., user device 132) to virtually define the boundaries of the one or more zones in a workspace (e.g., production system 150) and/or by an operator placing one or more physical indicators in the zone (e.g., indicators 2720). The physical indicators may include visual markings and/or emit one or more sounds or signals to be sensed by the vehicle 10. In some embodiments, zones are established according to one or more rules based on the type of physical indicator used. There may be a plurality of types of indicators, with each type corresponding to a different function or feature of the zone. In some embodiments, rather than an operator placing the one or more physical indicators, a second vehicle 10 may place the one or more physical indicators in the workspace.

[0080] The method 2800 further includes at step 2830 receiving, at the vehicle, the one or more zones. In some embodiments, where the zones are virtually defined by an operator using a user device, step 2830 includes receiving zone data from the user device 132. The zone data may include a position of the zone in the production system 150. In other embodiments, the zone data may include the user inputs, and the controller of the vehicle itself may determine the zone position in the production system 150 based on the user inputs. In some embodiments, where the zones are defined by physical indicators (e.g., indicators 2720), step 2830 includes using one or more sensors of the vehicle (e.g., sensors 112) to detect the physical indicators in the workspace. Upon detecting the physical indicators, the vehicle may then determine the boundary of the zone defined by the physical indicators, and thereby receive at the vehicle the one or more zones.

[0081] The method 2800 further includes at step 2840 operating the vehicle in the workspace to avoid the one or more zones. In step 2840 the vehicle may be operated in a manual mode, a semi-autonomous mode, or an autonomous mode. In a semi-autonomous or autonomous mode, the vehicle may determine a path to an endpoint that avoids each of the one or more zones in the workspace. The path may be a selected path from a plurality of possible paths. The vehicle may determine the selected path based on one or more characteristics of the plurality of possible paths such as length, duration to traverse, minimum distance to a zone, maximum average distance to a zone, number of adjacent zones along the path, etc. The vehicle may then automatically traverse the production system 150 along a path that avoids the one or more zones. In some embodiments, for zones indicated by physical indicators (e.g., the indicators 2720), once the vehicle 10 has determined the zone defined by the physical indicators, the vehicle 10 can remember the zone and continue to avoid it even if the physical indicators are removed. In some embodiments, any zones received or sensed by a first vehicle may be provided to a second vehicle, for the second vehicle to receive and avoid in a manner similar to the first vehicle. Thus, a zone sensed by a first vehicle may be provided to and avoided by a second vehicle. In some embodiments, each of a plurality of vehicles senses only a part of a zone, and by communicating each of the sensed parts to each other, the plurality of vehicles thereby sense the entire zone, and each vehicle can thereafter automatically avoid the sensed zone.
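The multi-vehicle case, in which each vehicle senses only part of a zone, can be sketched as a union of the indicator positions each vehicle reports; the set-of-tuples representation is an assumption for illustration:

```python
def merge_sensed_indicators(*partial_sets):
    """Each vehicle reports the subset of indicator positions it has sensed;
    the union of the reports reconstructs the full set defining the zone."""
    merged = set()
    for indicators in partial_sets:
        merged |= set(indicators)
    return merged

vehicle_a_sightings = {(0, 0), (4, 0)}
vehicle_b_sightings = {(4, 3), (0, 3), (4, 0)}  # overlapping sighting de-duplicated
full_zone_corners = merge_sensed_indicators(vehicle_a_sightings, vehicle_b_sightings)
```

Once merged, the combined corner set can be processed into a perimeter and stored by each vehicle so the zone persists even after the physical indicators are removed.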

[0082] FIG. 12 depicts an example method 2900 for establishing and traversing a plurality of zones with a semi-autonomous or autonomous vehicle, according to an exemplary embodiment. The method 2900 can be performed by at least the controller 102 and/or processor 104 of the system 100 depicted in FIG. 7, but is not limited thereto. In some implementations, one or more of the steps may be performed by a different processor, server, or any other computing device (e.g., user devices 132 and/or remote devices 134 of FIG. 7). For instance, one or more of the steps may be performed via a cloud-based service including any number of servers, which may be in communication with a processor of the vehicle 10 and/or an associated control system.

[0083] Although the steps are shown in FIG. 12 having a particular order, the steps may be performed in any order. In some instances, some of these steps may be optional. The method 2900 may be executed to improve the operation of one or more vehicles, including vehicles operating autonomously and/or semi-autonomously, within a manufacturing environment.

[0084] The method 2900 can, in some embodiments, include at step 2910 a method of operating a vehicle such as vehicle 10 in a manual mode of operation. In the manual or user-guided mode, the vehicle may be a remote-controlled vehicle.

[0085] The method 2900 further includes at step 2920 receiving one or more waypoints. The vehicle 10 may receive a user input (e.g., via the user interface 114, the user device 132, the remote device 134, etc.) indicating the location of a waypoint. The location of the waypoint may be the location of the vehicle 10 at the time the user input is received. In some embodiments, the waypoint also includes a time series of the speed of the vehicle recorded since the last waypoint. In other embodiments, the vehicle 10 automatically determines waypoints based on a predetermined duration of time elapsing. For example, a vehicle may set waypoints of its current location every second, every 5 seconds, every minute, etc. In some embodiments, the vehicle determines a waypoint at each moment the vehicle changes its direction of travel. As the vehicle 10 is manually controlled in the workspace such as the production system 150, the vehicle may receive a plurality of waypoints.

[0086] The method 2900 further includes at step 2930 determining a path based on the plurality of waypoints. The plurality of waypoints may define a path of the vehicle 10 by connecting each adjacent (i.e., in time or in space) waypoint to the next consecutively. In some embodiments, the plurality of waypoints define only a portion of the path traversed by the vehicle 10 while it has been under manual control. The portion of the total traversed path indicated by the waypoints may be saved as a discrete path separate and apart from the total traversed path.
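Steps 2920 and 2930 can be sketched as follows; the waypoint layout (a position plus the time series of speeds recorded since the previous waypoint, as described at step 2920) is an illustrative assumption:

```python
def path_from_waypoints(waypoints):
    """Step 2930: connect each waypoint to the next consecutively,
    yielding a list of path segments."""
    positions = [wp["position"] for wp in waypoints]
    return list(zip(positions, positions[1:]))

# Waypoints recorded under manual control (step 2920); each carries the
# speeds observed since the previous waypoint.
waypoints = [
    {"position": (0, 0), "speed_series": []},
    {"position": (5, 0), "speed_series": [1.0, 1.2]},
    {"position": (5, 5), "speed_series": [0.8]},
]
segments = path_from_waypoints(waypoints)  # [((0, 0), (5, 0)), ((5, 0), (5, 5))]
```

A subset of waypoints (e.g., only those recorded between two user inputs) could be passed through the same function to save a discrete portion of the traversed path.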

[0087] The method 2900 further includes, at step 2940, operating the vehicle in a second mode of operation to follow the path. The vehicle 10 may be operable in a plurality of modes of operation. While the vehicle 10 may learn the path and obtain the waypoints in a manual mode of operation (i.e., while controlled by an operator), when the vehicle 10 is in a semi-autonomous or autonomous mode, the vehicle 10 may traverse the path indicated by the plurality of waypoints automatically. In some embodiments, the vehicle 10 may proceed along the path at the same speed as in the manual mode. Still, in other embodiments, the vehicle 10 may follow the path at a predetermined speed, which may be less than or greater than the speed at which the vehicle 10 traversed the path in the manual mode.
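The two replay options of step 2940 — reusing the speeds recorded in manual mode versus substituting a predetermined speed — can be sketched as follows (Python; `follow_path` and its parameters are hypothetical and intended only to illustrate the choice of speed profile):

```python
def follow_path(segments, recorded_speeds=None, preset_speed=None):
    """Yield (segment, speed) commands for autonomous traversal of a
    previously learned path. If preset_speed is given, it overrides
    the manual-mode speeds (and may be slower or faster than them)."""
    for i, seg in enumerate(segments):
        if preset_speed is not None:
            yield seg, preset_speed          # predetermined speed
        elif recorded_speeds is not None:
            yield seg, recorded_speeds[i]    # same speed as manual mode
        else:
            yield seg, 1.0                   # hypothetical default speed
```

A downstream drive-system controller would consume these commands; the mode switch from manual to semi-autonomous or autonomous operation determines whether this replay logic runs at all.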

[0088] As utilized herein with respect to numerical ranges, the terms approximately, about, substantially, and similar terms generally mean +/−10% of the disclosed values. When the terms approximately, about, substantially, and similar terms are applied to a structural feature (e.g., to describe its shape, size, orientation, direction, etc.), these terms are meant to cover minor variations in structure that may result from, for example, the manufacturing or assembly process and are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.

[0089] It should be noted that the term exemplary and variations thereof, as used herein to describe various embodiments, are intended to indicate that such embodiments are possible examples, representations, or illustrations of possible embodiments (and such terms are not intended to connote that such embodiments are necessarily extraordinary or superlative examples).

[0090] The term coupled and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If coupled or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of coupled provided above is modified by the plain language meaning of the additional term (e.g., directly coupled means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of coupled provided above. Such coupling may be mechanical, electrical, or fluidic.

[0091] References herein to the positions of elements (e.g., top, bottom, above, below) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.

[0092] The hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or, any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some embodiments, particular processes and methods may be performed by circuitry that is specific to a given function. The memory (e.g., memory, memory unit, storage device) may include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present disclosure. The memory may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. According to an exemplary embodiment, the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.

[0093] The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.

[0094] Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.

[0095] It is important to note that the construction and arrangement of the vehicle 10 and the production system as shown in the various exemplary embodiments is illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein. For example, the vehicle of the exemplary embodiment shown in at least FIGS. 1 and 2 may be incorporated in the exemplary embodiment shown in at least FIGS. 3 and 4. As another example, the vehicle 10 of the exemplary embodiment shown in at least FIG. 1 may be incorporated into, or used to perform, the method 2600 and/or any portion thereof. Although only one example of an element from one embodiment that can be incorporated or utilized in another embodiment has been described above, it should be appreciated that other elements of the various embodiments may be incorporated or utilized with any of the other embodiments disclosed herein.