AUTOMATIC DETECTION OF TREAT RELEASE AND JAMMING WITH CONDITIONAL ACTIVATION OF ANTI-JAMMING IN AN AUTONOMOUS PET INTERACTION DEVICE
20220022417 · 2022-01-27
Inventors
- Henrik PETERSEN (Kongens Lyngby, DK)
- Ole Lim CHRISTIANSEN (Kongens Lyngby, DK)
- Troels Kjeldsen KRISTENSEN (Kongens Lyngby, DK)
- Hanne JARMER (Kongens Lyngby, DK)
CPC classification
International classification
A01K15/02
HUMAN NECESSITIES
Abstract
Certain aspects of the present disclosure provide techniques for detecting an operational state of an autonomous pet interaction device, including: commanding a treat dispenser to dispense a treat from the autonomous pet interaction device; capturing image data using an image sensor of the autonomous pet interaction device; and determining, based on the image data, whether or not the treat was dispensed from the treat dispenser.
Claims
1. A method for controlling an autonomous pet interaction device, comprising: commanding a treat dispenser to dispense a treat from the autonomous pet interaction device; capturing image data using an image sensor of the autonomous pet interaction device; and determining, based on the image data, whether or not the treat was dispensed by the treat dispenser.
2. The method of claim 1, wherein determining, based on the image data, whether or not the treat was dispensed from the treat dispenser comprises: providing the image data to a model; and receiving from the model a determination of whether or not the treat was dispensed.
3. The method of claim 2, wherein: the model is a deep neural network model, and the image sensor is a video image sensor.
4. The method of claim 1, further comprising entering a mode configured to dislodge a stuck treat upon determining, based on the image data, that the treat was not dispensed from the treat dispenser.
5. The method of claim 4, further comprising rotating one of a carousel or a control arm of the treat dispenser in a first direction and then a second direction, different from the first direction, a plurality of times in order to dislodge the treat.
6. The method of claim 5, further comprising: entering a normal state if rotating the carousel or the control arm dislodges the treat; and entering a disabled state if rotating the carousel or the control arm does not dislodge the treat.
7. The method of claim 1, further comprising determining based on rotation sensor data whether or not a carousel or a control arm of the treat dispenser rotated after commanding the treat dispenser to dispense the treat.
8. The method of claim 7, further comprising reversing a rotation direction of the treat dispenser after determining based on the rotation sensor data that the carousel or the control arm of the treat dispenser did not rotate after commanding the treat dispenser to dispense a treat.
9. The method of claim 1, wherein the treat dispenser comprises a treat carousel configured to hold treats in a plurality of treat enclosures.
10. The method of claim 1, wherein the treat dispenser comprises a treat dispensing control arm configured to selectively block or unblock a treat exit.
11. An autonomous pet interaction device, comprising a treat dispenser; a motor configured to rotate the treat dispenser; an image sensor configured to capture image data of a treat being dispensed from the treat dispenser; a memory comprising computer-executable code; and a processor configured to cause the autonomous pet interaction device to: command the treat dispenser to dispense a treat from the autonomous pet interaction device; capture image data using the image sensor; and determine, based on the image data, whether or not the treat was dispensed by the treat dispenser.
12. The autonomous pet interaction device of claim 11, wherein in order to determine, based on the image data, whether or not the treat was dispensed from the treat dispenser, the processor is further configured to: provide the image data to a model; and receive from the model a determination of whether or not the treat was dispensed.
13. The autonomous pet interaction device of claim 12, wherein: the model is a deep neural network model, and the image sensor is a video image sensor.
14. The autonomous pet interaction device of claim 11, wherein the processor is further configured to cause the autonomous pet interaction device to enter a mode configured to dislodge a stuck treat upon a determination, based on the image data, that the treat was not dispensed from the treat dispenser.
15. The autonomous pet interaction device of claim 14, wherein the processor is further configured to cause the autonomous pet interaction device to rotate one of a carousel or a control arm of the treat dispenser in a first direction and then a second direction, different from the first direction, a plurality of times in order to dislodge the treat.
16. The autonomous pet interaction device of claim 15, wherein the processor is further configured to cause the autonomous pet interaction device to: enter a normal state if rotating the carousel or the control arm dislodges the treat; and enter a disabled state if rotating the carousel or the control arm does not dislodge the treat.
17. The autonomous pet interaction device of claim 11, wherein the processor is further configured to cause the autonomous pet interaction device to determine based on rotation sensor data whether or not a carousel or a control arm of the treat dispenser rotated after commanding the treat dispenser to dispense the treat.
18. The autonomous pet interaction device of claim 17, wherein the processor is further configured to cause the autonomous pet interaction device to reverse a rotation direction of the treat dispenser after a determination based on the rotation sensor data that the carousel or the control arm of the treat dispenser did not rotate after commanding the treat dispenser to dispense a treat.
19. The autonomous pet interaction device of claim 11, wherein the treat dispenser comprises a treat carousel configured to hold treats in a plurality of treat enclosures.
20. The autonomous pet interaction device of claim 11, wherein the treat dispenser comprises a treat dispensing control arm configured to selectively block or unblock a treat exit.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The appended figures depict certain aspects of the one or more embodiments and are therefore not to be considered limiting of the scope of this disclosure.
[0022] To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the drawings. It is contemplated that elements and features of one embodiment may be beneficially incorporated in other embodiments without further recitation.
DETAILED DESCRIPTION
[0023] Aspects of the present disclosure provide apparatuses, methods, processing systems, and computer-readable mediums for automatically detecting operational states in an autonomous pet interaction device and automatically attempting to correct any operational errors if an error is detected.
[0024] Autonomous pet interaction devices, as described herein, may be configured to provide mental stimulation to animals, such as household pets (e.g., dogs and cats), in an automatic, secure, adaptive, and interactive way. In some embodiments, an autonomous pet interaction device may be connected to a display device commonly found in a home, such as a television or computer monitor, to provide auditory and visual instructions and feedback to a pet. Generally, the autonomous pet interaction device may be configured to start with easy cues or instructions that result in a treat being dispensed to a pet upon successful completion.
[0025] Because treats may come in many forms, an autonomous pet interaction device may be designed with generalized treat dispensing hardware, which may not always be optimal for certain treats. As such, it is possible that a treat may become “stuck” in a dispensing mechanism. In order to improve on conventional devices, the autonomous pet interaction devices described herein may include various sensors and models configured to detect a “stuck” or otherwise non-conforming operational condition of the autonomous pet interaction device, such as a stuck treat dispenser that is not dispensing treats when commanded.
[0026] Notably, embodiments described herein may detect an operational condition even when the autonomous pet interaction device is not experiencing a physical jam. For example, a treat that is stuck inside a dispensing element may fail to be ejected from the dispensing element when commanded, but may not otherwise impact the operation of the device, such as the advancing of a treat carousel comprising many dispensing elements. Similarly, an autonomous pet interaction device may have run out of treats and thus does not dispense a treat when commanded despite being in an otherwise operational state. Conventional devices cannot detect a failure to dispense a treat when the device is otherwise mechanically unaffected.
[0027] Generally, autonomous pet interaction devices, as described herein, may have multiple sensing devices (sensors) to detect an operational state of the device.
[0028] For example, embodiments described herein may include an image sensor (e.g., a still or live video camera sensor) placed in view of a dispensing path of the device. The image sensor may capture image data, which may include still or moving image data, and provide it to a trained model, such as a machine learning model, to determine whether a treat was dispensed when a treat was expected to be dispensed. For example, the machine learning model may be trained to perform image recognition on the image data to determine whether a treat was detected (e.g., while being dispensed) during a configurable time interval, such as an interval associated with a command to dispense a treat.
[0029] Further embodiments may include an audio sensor, such as a microphone, to provide further sensing capabilities and additional data for a sensing model in an autonomous pet interaction device. For example, a machine learning model may be trained to detect the sound of a treat being dispensed based on audio data from an audio sensor. This model may be in addition to, instead of, or integrated with the aforementioned image-based model. For example, a combined model may be configured to output a probability of a treat being dispensed based on the video data and the audio data together, or separate probabilities of a treat being dispensed based on the video data and on the audio data individually. In the latter case, the video-based and audio-based outputs (e.g., treat-dispensed probabilities) may be used in a voting mechanism to improve the fidelity of problem detection.
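One way the voting mechanism described above could be realized, assuming each modality-specific model outputs a treat-dispensed probability (the majority policy and the threshold value are illustrative assumptions, not specified by the source):

```python
def treat_dispensed_vote(probabilities, threshold=0.5):
    """Combine per-modality treat-dispensed probabilities (e.g., one
    from a video model and one from an audio model) by majority vote.

    Each modality votes 'dispensed' when its probability meets the
    threshold; the treat is considered dispensed when a strict majority
    of modalities agree."""
    votes = sum(p >= threshold for p in probabilities)
    return votes > len(probabilities) / 2
```

With exactly two modalities, this strict-majority policy requires both models to agree, trading some sensitivity for fewer false positives.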
[0030] Other sorts of sensor inputs may be used to detect the successful dispensing of a treat as well, such as proximity sensors, weight sensors, and vibration sensors. For example, if a treat is only partially dispensed, a sensor in a treat dispensing carousel of the autonomous pet interaction device may detect the carousel's failure to advance partially or fully. As another example, if a treat is dispensed, a vibration associated with the treat falling out of the dispenser may be detected (or not detected if the treat fails to be dispensed). As yet another example, a weight of a treat dispenser (e.g., a carousel) may change when a treat is dispensed. In various embodiments, the output data from one or more of these sorts of sensors may be provided to a trained machine learning model to infer an operational state of the autonomous pet interaction device, such as a normal state in which treats are being dispensed correctly, or an error state in which treats are not being dispensed correctly. As above, in some cases each input type may be used by an individual model configured for that type of input, whereas in other cases multiple input types may be combined in a single model architecture. In some cases, models may be combined in an ensemble to make an ultimate determination as to whether a treat was dispensed.
[0031] Upon detecting a failure to dispense a treat when a treat was meant to be dispensed, the autonomous pet interaction devices described herein may initiate a procedure to correct the operational issue. For example, a second release may be attempted, and the same sensing to determine whether a treat was dispensed on the second attempt may be performed.
[0032] If a second attempt fails and/or the autonomous pet interaction device determines (e.g., by a model's output) that a treat has become stuck, the autonomous pet interaction device may engage a treat loosening or dislodging operational mode. For example, in embodiments where the autonomous pet interaction device includes a carousel, the carousel may be instructed to move back and forth a set number of times (e.g., 3 times) in order to try and dislodge or loosen the stuck treat through vibration and perturbation. In some embodiments, a transducer may also be used to vibrate the carousel in order to try and dislodge or loosen the stuck treat (e.g., in addition to commanding the carousel to move).
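The back-and-forth dislodging routine described above might be sketched as follows; the `motor` driver object and its `forward()`/`reverse()`/`stop()` methods are hypothetical, and the default attempt count matches the "3 times" example while the step duration is purely illustrative:

```python
import time


def dislodge_stuck_treat(motor, attempts=3, step_s=0.2):
    """Jog the carousel back and forth to shake a stuck treat loose
    through vibration and perturbation.

    `motor` is a hypothetical driver exposing forward(), reverse(),
    and stop(); each attempt turns briefly in one direction and then
    the other."""
    for _ in range(attempts):
        motor.forward()
        time.sleep(step_s)
        motor.reverse()
        time.sleep(step_s)
    motor.stop()
```

A transducer-based vibration step, as also mentioned above, could be triggered inside the same loop.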
[0033] If automatic attempts to remedy an operational condition preventing treats from being dispensed fail, the autonomous pet interaction device may enter an idle (or error) mode in which interaction with a pet is not offered in order to not encourage the pet without an appropriate treat reward. In some embodiments, the pet may be presented with a cue, such as a visual or audio cue, indicating that the interaction is over. This beneficially prevents the pet from confusing the intended stimulus and response provided by the autonomous pet interaction device.
Example Computer Vision Algorithms for Determining Operational State of Autonomous Pet Interaction Device
[0034] Various computer vision techniques may be used to detect the presence (or absence) of a treat being dispensed. In one embodiment, video frames captured at a sufficiently high frame rate (e.g., >30 FPS) from a forward-facing camera (e.g., facing in the direction in which a treat will be dispensed) may be used to detect the successful dispensing of treats. In such embodiments, the mean light intensity of video frames is computed over time, and the short- and long-term deviation of the current frame is compared to a threshold indicating that a treat passed in front of the camera. Notably, the camera may be adjusted so that its field of view, focal length, aperture, and the like are optimized for detecting a change in mean light intensity when a treat is dispensed. In some embodiments, a monochromatic (e.g., greyscale) camera sensor may be used, whereas in others, a color sensor may be used.
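A minimal sketch of this intensity-deviation approach, assuming greyscale frames; the window sizes and the deviation factor `k` are illustrative defaults, not values from the source:

```python
import numpy as np


def intensity_event(frames, short_win=3, long_win=30, k=3.0):
    """Flag a treat passing the camera as a short-term change in mean
    frame intensity relative to the long-term baseline.

    frames: sequence of 2-D greyscale arrays. An event is flagged when
    the recent (short-window) mean deviates from the long-window mean
    by more than k long-window standard deviations."""
    means = np.array([f.mean() for f in frames], dtype=float)
    if len(means) < long_win:
        return False  # not enough history to form a baseline yet
    long_mean = means[-long_win:].mean()
    long_std = means[-long_win:].std() + 1e-9  # avoid divide-by-zero
    short_mean = means[-short_win:].mean()
    return abs(short_mean - long_mean) > k * long_std
```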
[0035] In other embodiments, treat detection may be based on computer vision techniques such as background subtraction, blob detection, and image segmentation, among others.
[0036] By continually computing a difference image of the current image and a background image (e.g., either a certain predefined frame, or a more general frame computed by, for example, Gaussian Mixture Models from a given number of previous frames), a binary image can be computed by comparing the pixel differences based on a certain difference threshold value. After performing a certain set of morphological operations on the binary image, it is possible to detect blobs (binary large objects) and from certain properties, such as area and central moments, determine whether a blob corresponds to a treat passing in front of the camera.
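A simplified sketch of this difference-image approach; the morphological cleanup and moment-based blob analysis described above are omitted here, and the difference threshold and treat-size area range are assumptions:

```python
import numpy as np


def detect_treat_blob(frame, background, diff_thresh=40.0,
                      min_area=20, max_area=500):
    """Background-subtraction treat check.

    Threshold the absolute difference between the current frame and a
    background frame to get a binary image, then test whether the
    foreground area falls within a plausible treat-size range. A full
    implementation would also apply morphological operations and
    inspect blob central moments."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    binary = diff > diff_thresh
    area = int(binary.sum())
    return min_area <= area <= max_area
```

The background frame could be a fixed reference image or, as noted above, a model maintained over previous frames (e.g., via Gaussian Mixture Models).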
[0037] In some examples, the aforementioned models configured to detect whether a treat is successfully dispensed may be implemented as machine learning models, such as deep neural network models. In such cases, a training data set may be generated based on sensor data captured by an autonomous pet interaction device, with labels indicating for each training instance whether a treat was or was not dispensed.
[0038] In some embodiments, the model(s) used by an autonomous pet interaction device may be updateable, such as by software update on the autonomous pet interaction device. For example, an autonomous pet interaction device may include a data port (e.g., USB port) for uploading updated models, or may include a wireless network connection and be configured to receive updated model(s) from a remote service.
[0039] In some embodiments, a plurality of autonomous pet interaction devices may be used to perform federated learning of one or more machine learning models for detecting whether a treat was dispensed.
Example Autonomous Pet Interaction Device Control Logic
[0041] In the depicted example, interaction code 102 represents the logic for how to mentally stimulate an animal (e.g., a pet dog) by providing interactive tasks. The interaction code 102 provides the dispense treat signal 103 when it is determined that the animal successfully performed a given task.
[0042] Further in the depicted example, motor code 104 represents the logic for driving the motor 106 that turns a treat carousel (or activates another type of treat dispenser) and thereby releases a treat. The motor 106 may be activated (e.g., turned in one direction or another) by transmitting a motor enable signal 109, which may, for example, activate a general-purpose input/output (GPIO) pin of a main printed circuit board (PCB), as depicted in
[0043] Further in the depicted example, motor stall detection code 110 represents the logic for detecting whether the motor 106 is unable to turn (e.g., is stalled), which may indicate that something has become “stuck” and is preventing a treat from being dispensed. The stall detection signal 111 may be based on rotation values 107 received from the rotation sensor 112. In some cases, the stall detection signal 111 is a Boolean signal that is input to the treat dispensing control code 108.
[0044] Further in the depicted example, treat detection code 114 represents the logic for detecting whether a treat is successfully released. In one embodiment, treat detection code 114 implements a computer vision algorithm to analyze whether a treat passes in front of the camera 116 based on output video frames 115 from camera 116. An output of treat detection code 114 may be treat detected signal 117 that is provided to the treat dispensing control code 108.
[0045] Further in the depicted example, treat dispensing control code 108 represents the logic for determining whether the dispense treat signal 103 has resulted in a successfully dispensed treat. In one example, based on the output of the motor stall detection code 110 and the treat detection code 114, treat dispensing control code 108 either outputs a success/fail signal 113 (e.g., a Boolean signal) to the interaction code 102 or a reverse motor signal 119 to the motor code 104 in case a stall has been detected.
[0046] Note that in
[0048] In the depicted example, the logical flow of dispensing a treat starts by providing a signal to turn the motor at 202, which may correspond to turning the motor until a rotation value corresponding to one carousel treat slot (as depicted in
[0049] If a stall is detected at 210, a stall attempts counter (e.g., a variable in program code) is increased by 1 at 214, and then it is determined whether the stall attempts counter has exceeded a threshold at 218. If not, then the controller attempts to turn the motor first in the reverse direction at 220, and then in the normal direction again back at 202. If the stall is detected again at 210, then the flow 200 continues until no stall is detected at 210 or a maximum number of attempts is reached at 218, and the flow exits with a fail at 224.
[0050] If a stall is not detected at 210, then it is determined whether a treat has been detected at 208. If no treat is detected at 208, then a detection attempts counter (e.g., a variable in program code) is increased by 1 at 212, and then it is determined whether the detection attempts counter has exceeded a threshold at 216. If so, the flow exits with a fail at 224. If not, then the flow 200 returns to turning the motor at 202, which prompts another treat detection at 204.
[0051] If at 208 a treat is detected, then the flow exits with a success at 222.
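Flow 200 as described in the preceding paragraphs can be sketched as a control loop. The hardware hooks (`turn_motor`, `stall_detected`, `treat_detected`, `reverse_motor`) are hypothetical callables standing in for the motor and sensing code, and sharing a single attempt limit between the stall (218) and detection (216) checks is a simplification:

```python
def dispense_with_retries(turn_motor, stall_detected, treat_detected,
                          reverse_motor, max_attempts=3):
    """Sketch of flow 200: turn the motor (202), reverse on stall (220),
    retry on missed detection, and report success (222) or failure (224)."""
    stall_attempts = 0
    detect_attempts = 0
    while True:
        turn_motor()                       # 202
        if stall_detected():               # 210
            stall_attempts += 1            # 214
            if stall_attempts > max_attempts:  # 218
                return False               # fail (224)
            reverse_motor()                # 220, then back to 202
            continue
        if treat_detected():               # 208
            return True                    # success (222)
        detect_attempts += 1               # 212
        if detect_attempts > max_attempts:     # 216
            return False                   # fail (224)
```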
Example Autonomous Pet Interaction Device with Treat Dispenser
[0052]
[0053]
[0054]
[0055]
[0056]
[0057]
[0058] In the depicted example, the autonomous pet interaction device is shown in a configuration to be mounted against a vertical surface, such as a wall, so that a treat may be dispensed by rotating the treat carousel 502 in alignment with the treat exit 302 and allowing the treat to fall out of the autonomous pet interaction device by action of gravity. Further, the treat exit 302 is aligned with the image sensor 304 so that when the treat falls out of the autonomous pet interaction device, it will pass directly in front of the image sensor 304.
[0059] The processing system 706 is generally connected to other sub-components, such as the motor 704, the camera 304, and the rotation sensor 604, by suitable electrical connections and configured to control those sub-components, for example, according to the flows in
[0060] As above, the image sensor 304 is used, among other things, for the determination of a successful or an unsuccessful treat release. For example, image data from image sensor 304 may be processed by a model stored in a memory of processing system 706.
[0061] The motor 704 initiates the rotation of the treat carousel 502, which enables the treat release functionality.
[0062] The gear 606 is connected to the motor 704 and interfaces with an internal gear 710 located in the treat carousel 502, allowing the treat carousel 502 to rotate.
[0063] The rotation sensor 604 provides angular rotation feedback to processing system 706, which allows precise control of the angle of treat carousel 502 in order to release only one treat from one carousel treat slot each time. Further, rotation sensor 604 enables processing system 706 to determine when the carousel 502 has made a full rotation and has thus dispensed all the treats.
[0064] In the depicted embodiment, rotation sensor 604 interacts with a magnet 702 whose poles are positioned diametrically. In some cases, magnet 702 is a cylindrical, diametrically magnetized magnet. Rotation sensor 604 can read rotation angles when the polarity changes as the treat carousel 502 is rotated about its axis of rotation. The position of the magnet 702 as well as its proximity to the rotation sensor 604 below allows for a precise reading of the rotation angle of treat carousel 502.
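As a small illustration of how the angular feedback described above might be reduced to whole treat-slot counts for per-slot release control (the carousel slot count of eight is an assumption, not from the source):

```python
def slots_advanced(angle_start_deg, angle_now_deg, n_slots=8):
    """Number of whole treat slots the carousel has advanced, given the
    starting and current rotation-sensor angles in degrees.

    The modulo handles wrap-around across the 0/360 boundary, so a full
    rotation (all slots dispensed) can also be detected by comparing
    the result to n_slots."""
    delta = (angle_now_deg - angle_start_deg) % 360.0
    return int(delta // (360.0 / n_slots))
```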
[0067] As depicted in
[0068] As further depicted in
[0069] As depicted in
[0070] Notably, the motion of treat dispensing control arm 906 may be controlled based on determining whether a treat has been dispensed (e.g., according to flow 100 described above with respect to
Example Method for Controlling an Autonomous Pet Interaction Device
[0072] Method 1000 begins at step 1002 with commanding a treat dispenser to dispense a treat from the autonomous pet interaction device. For example, a dispense treat signal (such as 103 in
[0073] Method 1000 then proceeds to step 1004 with capturing image data using an image sensor of the autonomous pet interaction device.
[0074] Method 1000 then proceeds to step 1006 with determining, based on the image data, whether or not the treat was dispensed from the treat dispenser.
[0075] In some embodiments of method 1000, determining, based on the image data, whether or not the treat was dispensed from the treat dispenser comprises: providing the image data to a model; and receiving from the model a determination of whether or not the treat was dispensed. In some embodiments, the model is a deep neural network model and the image sensor is a video image sensor.
[0076] Though not depicted in
[0077] In some embodiments, method 1000 may further include entering a normal state if rotating the carousel or the treat dispensing control arm dislodged the treat; and entering a disabled state if rotating the carousel or the treat dispensing control arm did not dislodge the treat. A normal state may generally describe a state in which the device is operating normally, such as interacting with an animal and dispensing treats based on that interaction. A disabled state, on the other hand, may generally describe a state in which the device has stopped interacting with the animal and is not attempting to dispense treats.
[0078] In some embodiments, method 1000 may further include determining based on rotation sensor data whether or not the carousel or the treat dispensing control arm of the treat dispenser rotated after commanding the treat dispenser to dispense the treat.
[0079] In some embodiments, method 1000 may further include reversing a rotation direction of the treat dispenser after determining based on the rotation sensor data that the carousel of the treat dispenser did not rotate after commanding the treat dispenser to dispense a treat.
[0080] Note that method 1000 is just one example, and many other examples, which may include additional, alternative, or fewer steps, are possible consistent with this disclosure.
Example Processing System for an Autonomous Pet Interaction Device
[0082] In the depicted embodiment, processing system 1100 includes processor 1102, which may be a general purpose processor, a microcontroller, an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), and the like.
[0083] Processor 1102 is configured to receive data from sensors 1104, which may include an image sensor (e.g., 304 as described with respect to
[0084] Processor 1102 is further configured to control motor 1106, which may in turn rotate a treat dispenser, as described herein (e.g., as described with respect to motor 704 of
[0085] Processor 1102 is further configured to receive data from and send data to various input and output interfaces 1108. For example, processor 1102 may output video and audio signals through an audio/visual interface, such as HDMI™ interface 404 as depicted in
[0086] Processor 1102 is further configured to execute computer-executable code in memory 1110, which implements functional components. For example, processor 1102 is further configured to execute an animal interaction component 1110A, such as described with respect to code 102 in
[0087] Processor 1102 is further configured to execute models 1110D, which may include machine learning models for detecting treats, as described above, and which may be used by treat dispensing and detection component 1110C.
[0088] In some embodiments, processor 1102 is configured to implement the flows of
[0089] Note that processing system 1100 is just one example depicting selected aspects for simplicity and clarity, and other processing systems are possible consistent with the description herein.
Example Clauses
[0090] Implementation examples are described in the following numbered clauses:
[0091] Clause 1: A method for controlling an autonomous pet interaction device, comprising: commanding a treat dispenser to dispense a treat from the autonomous pet interaction device; capturing image data using an image sensor of the autonomous pet interaction device; and determining, based on the image data, whether or not the treat was dispensed by the treat dispenser.
[0092] Clause 2: The method of Clause 1, wherein determining, based on the image data, whether or not the treat was dispensed from the treat dispenser comprises: providing the image data to a model; and receiving from the model a determination of whether or not the treat was dispensed.
[0093] Clause 3: The method of Clause 2, wherein: the model is a deep neural network model, and the image sensor is a video image sensor.
[0094] Clause 4: The method of any one of Clauses 1-3, further comprising entering a mode configured to dislodge a stuck treat upon determining, based on the image data, that the treat was not dispensed from the treat dispenser.
[0095] Clause 5: The method of Clause 4, further comprising rotating one of a carousel or a control arm of the treat dispenser in a first direction and then a second direction, different from the first direction, a plurality of times in order to dislodge the treat.
[0096] Clause 6: The method of Clause 5, further comprising: entering a normal state if rotating the carousel or the control arm dislodges the treat; and entering a disabled state if rotating the carousel or the control arm does not dislodge the treat.
[0097] Clause 7: The method of any one of Clauses 1-6, further comprising determining based on rotation sensor data whether or not a carousel or a control arm of the treat dispenser rotated after commanding the treat dispenser to dispense the treat.
[0098] Clause 8: The method of Clause 7, further comprising reversing a rotation direction of the treat dispenser after determining based on the rotation sensor data that the carousel or the control arm of the treat dispenser did not rotate after commanding the treat dispenser to dispense a treat.
[0099] Clause 9: The method of any one of Clauses 1-8, wherein the treat dispenser comprises a treat carousel configured to hold treats in a plurality of treat enclosures.
[0100] Clause 10: The method of any one of Clauses 1-8, wherein the treat dispenser comprises a treat dispensing control arm configured to selectively block or unblock a treat exit.
[0101] Clause 11: An autonomous pet interaction device, comprising: a treat dispenser; a motor configured to rotate the treat dispenser; an image sensor configured to capture image data of a treat being dispensed from the treat dispenser; a memory comprising computer-executable code; and a processor configured to cause the autonomous pet interaction device to: command the treat dispenser to dispense a treat from the autonomous pet interaction device; capture image data using the image sensor; and determine, based on the image data, whether or not the treat was dispensed by the treat dispenser.
[0102] Clause 12: The autonomous pet interaction device of Clause 11, wherein in order to determine, based on the image data, whether or not the treat was dispensed from the treat dispenser, the processor is further configured to: provide the image data to a model; and receive from the model a determination of whether or not the treat was dispensed.
[0103] Clause 13: The autonomous pet interaction device of Clause 12, wherein: the model is a deep neural network model, and the image sensor is a video image sensor.
[0104] Clause 14: The autonomous pet interaction device of any one of Clauses 11-13, wherein the processor is further configured to cause the autonomous pet interaction device to enter a mode configured to dislodge a stuck treat upon a determination, based on the image data, that the treat was not dispensed from the treat dispenser.
[0105] Clause 15: The autonomous pet interaction device of Clause 14, wherein the processor is further configured to cause the autonomous pet interaction device to rotate one of a carousel or a control arm of the treat dispenser in a first direction and then a second direction, different from the first direction, a plurality of times in order to dislodge the treat.
[0106] Clause 16: The autonomous pet interaction device of Clause 15, wherein the processor is further configured to cause the autonomous pet interaction device to: enter a normal state if rotating the carousel or the control arm dislodges the treat; and enter a disabled state if rotating the carousel or the control arm does not dislodge the treat.
[0107] Clause 17: The autonomous pet interaction device of any one of Clauses 11-16, wherein the processor is further configured to cause the autonomous pet interaction device to determine based on rotation sensor data whether or not a carousel or a control arm of the treat dispenser rotated after commanding the treat dispenser to dispense the treat.
[0108] Clause 18: The autonomous pet interaction device of Clause 17, wherein the processor is further configured to cause the autonomous pet interaction device to reverse a rotation direction of the treat dispenser after a determination, based on the rotation sensor data, that the carousel or the control arm of the treat dispenser did not rotate after commanding the treat dispenser to dispense a treat.
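The rotation-sensor check and direction reversal of Clauses 17 and 18 can be sketched as follows. The function name, the single-retry policy, and the boolean return value are illustrative assumptions; the sensor and motor are again abstracted behind callables.

```python
def dispense_with_rotation_check(command_dispense, rotation_detected,
                                 reverse_direction):
    """Command a dispense, then verify via rotation sensor data that the
    carousel or control arm actually rotated; if it did not, reverse the
    rotation direction and retry the dispense.

    Returns True if a reversal was needed, False otherwise.
    """
    command_dispense()
    if rotation_detected():
        return False          # carousel/arm rotated; nothing to do
    reverse_direction()       # Clause 18: reverse the rotation direction
    command_dispense()        # retry in the opposite direction
    return True
```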
[0109] Clause 19: The autonomous pet interaction device of any one of Clauses 11-18, wherein the treat dispenser comprises a treat carousel configured to hold treats in a plurality of treat enclosures.
[0110] Clause 20: The autonomous pet interaction device of any one of Clauses 11-18, wherein the treat dispenser comprises a treat dispensing control arm configured to selectively block or unblock a treat exit.
[0111] Clause 21: A processing system, comprising: a memory comprising computer-executable instructions; one or more processors configured to execute the computer-executable instructions and cause the processing system to perform a method in accordance with any one of Clauses 1-10.
[0112] Clause 22: A processing system, comprising means for performing a method in accordance with any one of Clauses 1-10.
[0113] Clause 23: A non-transitory computer-readable medium comprising computer-executable instructions that, when executed by one or more processors of a processing system, cause the processing system to perform a method in accordance with any one of Clauses 1-10.
[0114] Clause 24: A computer program product embodied on a computer-readable storage medium comprising code for performing a method in accordance with any one of Clauses 1-10.
Additional Considerations
[0115] The preceding description is provided to enable any person skilled in the art to practice the various embodiments described herein. The examples discussed herein are not limiting of the scope, applicability, or embodiments set forth in the claims. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments. For example, changes may be made in the function and arrangement of elements discussed without departing from the scope of the disclosure. Various examples may omit, substitute, or add various procedures or components as appropriate. For instance, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. Also, features described with respect to some examples may be combined in some other examples. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method that is practiced using other structure, functionality, or structure and functionality in addition to, or other than, the various aspects of the disclosure set forth herein. It should be understood that any aspect of the disclosure disclosed herein may be embodied by one or more elements of a claim.
[0116] As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Any aspect described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects.
[0117] As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c or any other ordering of a, b, and c).
[0118] As used herein, the term “determining” encompasses a wide variety of actions. For example, “determining” may include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like. Also, “determining” may include resolving, selecting, choosing, establishing and the like.
[0119] The methods disclosed herein comprise one or more steps or actions for achieving the methods. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is specified, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims. Further, the various operations of methods described above may be performed by any suitable means capable of performing the corresponding functions. The means may include various hardware and/or software component(s) and/or module(s), including, but not limited to a circuit, an application specific integrated circuit (ASIC), or processor. Generally, where there are operations illustrated in figures, those operations may have corresponding counterpart means-plus-function components with similar numbering.
[0120] The following claims are not intended to be limited to the embodiments shown herein, but are to be accorded the full scope consistent with the language of the claims. Within a claim, reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. No claim element is to be construed under the provisions of 35 U.S.C. § 112(f) unless the element is expressly recited using the phrase “means for” or, in the case of a method claim, the element is recited using the phrase “step for.” All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims.