USER INPUT DEVICES
20220177267 · 2022-06-09
Inventors
- Javier Munoz Sotoca (Rivas-Vaciamadrid, ES)
- Borja de Diego Restrepo (Madrid, ES)
- Marcos Garcia Gonzalez (Madrid, ES)
CPC classification
B66B2201/4615
PERFORMING OPERATIONS; TRANSPORTING
B66B2201/4638
PERFORMING OPERATIONS; TRANSPORTING
B66B2201/4623
PERFORMING OPERATIONS; TRANSPORTING
G06F3/0421
PHYSICS
B66B1/461
PERFORMING OPERATIONS; TRANSPORTING
International classification
B66B1/46
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A user input device (202) for providing an input to an elevator controller is disclosed. The user input device (202) includes a plurality of input regions (204A-204F), each of which corresponds to a different input command; and a single touchless sensor (216) arranged to detect the presence of an object in any of the plurality of input regions (204A-204F).
Claims
1. A user input device (202, 302, 402), for providing an input to an elevator controller (115), comprising: a plurality of input regions (204A-204F, 304A-304F, 404A-404L) each of which correspond to a different input command; and a single touchless sensor (216, 316A, 416) arranged to detect the presence of an object in any of the plurality of input regions (204A-204F, 304A-304F, 404A-404L).
2. The user input device (202, 302, 402) as claimed in claim 1, wherein the single touchless sensor comprises a distance sensor (216, 316A, 416) configured to determine the distance at which an object is detected, and wherein each of the plurality of input regions (204A-204F, 304A-304F, 404A-404L) is spaced from the distance sensor (216, 316A, 416) by a known distance.
3. An elevator control system comprising the user input device (202, 302, 402) as claimed in claim 2 and a controller (219), wherein the controller (219) is configured to determine the input region (204A-204F, 304A-304F, 404A-404L) in which an object is present based on the known distance of each of the input regions (204A-204F, 304A-304F, 404A-404L) from the distance sensor (216, 316A, 416) and the distance determined by the distance sensor (216, 316A, 416).
4. The user input device as claimed in claim 2, wherein the known distance for each input region (204A-204F, 304A-304F, 404A-404L) comprises a range of distances (228).
5. The user input device (202, 302, 402) as claimed in claim 2, wherein the distance sensor (216, 316A, 416) comprises an emitter (217) configured to emit a beam (210, 310A, 410) through the plurality of input regions (204A-204F, 304A-304F, 404A-404L) and a receiver (221) configured to receive the beam (210, 310A, 410) reflected by an object (226) placed in one of the plurality of input regions (204A-204F, 304A-304F, 404A-404L).
6. The user input device (202, 302, 402) as claimed in claim 1, wherein the plurality of input regions (204A-204F, 304A-304F, 404A-404L) are arranged in a straight line.
7. The user input device (402) as claimed in claim 5, wherein the plurality of input regions (404A-404L) comprises a first subset of input regions (404A-404F) and a second subset of input regions (404G-404L), wherein the user input device (402) further comprises a beam director (430) arranged to direct the beam (410) through at least the second subset of input regions (404G-404L).
8. The user input device (202, 302, 402) as claimed in claim 1, wherein the single touchless sensor (216, 316A, 416) is arranged behind a faceplate (208, 308), and wherein the plurality of input regions (204A-204F, 304A-304F, 404A-404L) are behind the faceplate (208, 308) and at least partially defined by a plurality of apertures (220) within the faceplate (208, 308).
9. The user input device (202, 302, 402) as claimed in claim 8, wherein each of the apertures (220) comprises at least one wall (222A; 222B) which extends behind the faceplate (208) and which partially defines each of the plurality of input regions (204A-204F, 304A-304F, 404A-404L).
10. The user input device (202, 302, 402) as claimed in claim 9, wherein the at least one wall (222A; 222B) comprises at least one at least partially transparent portion (224A; 224B) arranged to allow a beam (210, 310A, 410) to pass through each of the plurality of regions (204A-204F, 304A-304F, 404A-404L).
11. The user input device (302) as claimed in claim 1, wherein the user input device (302) comprises a further plurality of input regions (304G-304L) each of which correspond to further input commands and a further single touchless sensor (316B) arranged to detect the presence of an object in one of the further plurality of input regions (304G-304L).
12. The user input device (202, 302, 402) as claimed in claim 1, wherein the user input device (202, 302, 402) is a car operating panel for installation in an elevator car (103) or a landing operation panel for installation in an elevator landing (125).
13. The user input device (202, 302, 402) of claim 1, wherein the plurality of input regions (204A-204F, 304A-304F, 404A-404L) comprise regions for inputting at least one of: a destination floor command, a door open command, a door close command, an emergency operation command, an alarm command, a personal identification code, a travel up command and a travel down command.
14. A user input device (502, 602), for providing an input to an elevator controller, comprising: a faceplate (508, 608) which comprises at least one aperture (520) therein which allows access to a corresponding input region (504A-504F, 604A-604F), located behind the aperture (520), which corresponds to an input command; and a touchless sensor (516A-516F, 616A-616F) arranged behind the faceplate (508, 608), spaced from the aperture (520) and configured to detect the presence of an object in the input region (504A-504F, 604A-604F).
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] Certain examples of the present disclosure will now be described with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0055] The tension member 107 engages the elevator machine 111, which is part of an overhead structure of the elevator system 101. The elevator machine 111 is configured to control movement between the elevator car 103 and the counterweight 105, and thus control the position of the elevator car 103 within the elevator shaft 117. The encoder 113 may be mounted on a fixed part at the top of the elevator shaft 117, such as on a support or guide rail, and may be configured to provide position signals related to a position of the elevator car 103 within the elevator shaft 117. In other embodiments, the encoder 113 may be directly mounted to a moving component of the elevator machine 111, or may be located in other positions and/or configurations as known in the art. The encoder 113 can be any device or mechanism for monitoring a position of an elevator car and/or counterweight, as known in the art.
[0056] The controller 115 is located, as shown, in a controller room 121 of the elevator shaft 117 and is configured to control the operation of the elevator system 101, and particularly the elevator car 103. For example, the controller 115 may provide drive signals to the elevator machine 111 to control the acceleration, deceleration, levelling, stopping, etc. of the elevator car 103. The controller 115 may also be configured to receive position signals from the encoder 113 or any other desired position reference device. When moving up or down within the elevator shaft 117 along guide rail 109, the elevator car 103 may stop at one or more landings 125 as controlled by the controller 115. The controller 115 may also be configured to receive, or determine, input commands based on an output of a user input device as will be described below. Although shown in a controller room 121, those of skill in the art will appreciate that the controller 115 can be located and/or configured in other locations or positions within the elevator system 101. In one embodiment, the controller may be located remotely or in the cloud.
[0057] The elevator machine 111 may include a motor or similar driving mechanism. The elevator machine 111 may be configured to include an electrically driven motor. The power supply for the motor may be any power source, including a power grid, which, in combination with other components, is supplied to the motor. The elevator machine 111 may include a traction sheave that imparts force to tension member 107 to move the elevator car 103 within elevator shaft 117.
[0058] Although shown and described with a roping system including a tension member 107, elevator systems that employ other methods and mechanisms of moving an elevator car within an elevator shaft may employ embodiments of the present disclosure. For example, embodiments may be employed in ropeless elevator systems using a linear motor to impart motion to an elevator car. Embodiments may also be employed in ropeless elevator systems using a hydraulic lift to impart motion to an elevator car.
[0060] An object, e.g. a user's finger or a key, may be placed in the first input region 204A if a user wishes to open the elevator doors, or an object may be placed in one of the second to sixth input regions 204B-204F to select a destination floor command. A beam 210, emitted from a single touchless sensor (not visible in this Figure), passes through each of the input regions 204A-204F. A set of indicators 212 is arranged at the top of the user input device 202 and may be used to indicate the state of the elevator system to a user. For example, the indicators 212 may indicate that an alarm has been pressed, or indicate the direction of movement of the elevator car. Additionally, a speaker 214 is arranged to make audible announcements regarding operation of the user input device 202. For example, the speaker 214 may be configured to make a noise when a user places an object in one of the input regions 204A-204F, i.e. indicating the registration of their input command.
[0062] Each of the input regions 204A-204F is defined by a respective aperture 220 in the faceplate 208, and two walls 222A, 222B. Only one aperture 220 and corresponding walls 222A, 222B are labelled for clarity purposes, but the apertures and arcuate walls for the other input regions 204A-204F may be identical. Whilst in the examples shown the walls 222A, 222B have an arcuate shape, it will be appreciated that they may have any suitable form. The two walls 222A, 222B are separated from one another and define two at least partially transparent portions in the form of two openings 224A, 224B. The two openings 224A, 224B allow the beam 210 emitted from the touchless sensor 216 to pass through all of the input regions 204A-204F unimpeded. The two walls 222A, 222B act to limit the space in which a user can place an object behind the faceplate 208 and partially define each of the input regions. Whilst two walls 222A, 222B are included in the example shown, these may be replaced by a single wall. In this case, a single wall may comprise at least partially transparent portions, e.g. in the form of openings or at least partially transparent material, which allow the beam to pass through.
[0063] Operation of the user input device 202 will now be described with reference to
[0064] As discussed above, this may be performed by the controller 219 shown in
[0065] The controller 219 may comprise a memory on which the known distances for each input region 204A-204F are stored. The known distances and their associated input commands may be stored in the form of a look-up table. Depending on the size of the apertures 220 and the size of the object which is inserted therein, the object may be placed at a range of positions in the input regions 204A-204F. Accordingly, the measured distance for an object in any one of the input regions 204A-204F may vary depending on the exact position of the object in the input region 204A-204F. Thus, the known distances may comprise a range of distances for each input region 204A-204F. This is depicted in
[0066] Following determination of the distance by the touchless sensor 216, the input command may be determined. This may be performed by the controller 219 as discussed above. This may be achieved, for example, by the controller 219 comparing the measured distance to a look-up table of known distances and associated input commands which may be stored in a memory of the controller 219. The input command corresponding to the input region in which the user has placed the object 226 may then be directed towards an appropriate part of the elevator system, e.g. the controller 115 depicted in
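The look-up process described in paragraphs [0065] and [0066] can be sketched as follows. This is a minimal illustration only: the distance ranges, region labels, and commands below are invented for the purpose of the example and are not taken from the disclosure.

```python
# Hypothetical look-up table mapping each input region's known range of
# distances (in mm from the touchless sensor) to its input command, as
# might be stored in the memory of the controller 219. All values are
# illustrative assumptions.
LOOKUP_TABLE = [
    ((20, 60), "door open"),    # e.g. input region 204A
    ((70, 110), "floor 1"),     # e.g. input region 204B
    ((120, 160), "floor 2"),    # e.g. input region 204C
    ((170, 210), "floor 3"),    # e.g. input region 204D
    ((220, 260), "floor 4"),    # e.g. input region 204E
    ((270, 310), "floor 5"),    # e.g. input region 204F
]


def input_command(measured_distance_mm):
    """Return the input command for the region whose known range of
    distances contains the measured distance, or None if the detected
    object lies outside every input region."""
    for (low, high), command in LOOKUP_TABLE:
        if low <= measured_distance_mm <= high:
            return command
    return None
```

Because each known distance is a range rather than a single value, the comparison tolerates variation in where exactly the object sits within an input region, as described in paragraph [0065].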
[0067] The user input device 202 described above is just one example of a user input device in accordance with the present disclosure.
[0070] Whilst the touchless sensors described above are in the form of a distance sensor, it will be appreciated that any other form of touchless sensor that is capable of determining the presence of an object in any one of the input regions may be used.
[0072] The wall 522 may be integrally provided with the faceplate 508. In the example shown in
[0074] The touchless sensors 516A-516F may comprise any sensor that is capable of detecting an object which moves into the vicinity of the touchless sensor, without requiring physical contact with the touchless sensor. The touchless sensors 516A-516F may, for example, comprise a capacitive touch sensor, a Hall Effect sensor, an ultrasound sensor, a beam-type sensor, e.g. an infrared, laser or visible light sensor. Certain types of sensor may only be capable of detecting certain types of objects. For example, Hall Effect sensors may only be capable of detecting metal objects.
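The interchangeability of sensor types described above could, purely as a hypothetical illustration, be modelled by giving every sensor a common interface for reporting whether an object is present in its input region. All class names, attributes, and threshold values below are invented for this sketch and do not appear in the disclosure.

```python
from abc import ABC, abstractmethod


class TouchlessSensor(ABC):
    """Hypothetical common interface: any touchless sensor type need
    only report whether an object is present, without contact."""

    @abstractmethod
    def object_present(self) -> bool:
        ...


class CapacitiveSensor(TouchlessSensor):
    """Illustrative capacitive variant: a nearby object raises the
    measured capacitance above a calibration threshold."""

    def __init__(self, raw_reading: int = 0, threshold: int = 100):
        self.raw_reading = raw_reading
        self.threshold = threshold

    def object_present(self) -> bool:
        return self.raw_reading > self.threshold
```

Under this sketch, a beam-type, ultrasound, or Hall Effect variant would each subclass `TouchlessSensor` with its own detection logic, while the rest of the user input device remains unchanged.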
[0077] As depicted in
[0078] Any of the user input devices described above may be employed in the elevator system 101 shown in
[0079] Accordingly, it will be appreciated by those skilled in the art that examples of the present disclosure provide an improved user input device for providing an input to an elevator controller. While specific examples of the disclosure have been described in detail, it will be appreciated by those skilled in the art that the examples described in detail are not limiting on the scope of the disclosure.