DISPLAY METHOD AND APPARATUS
20220020339 · 2022-01-20
Inventors
CPC classification
G09G2370/06
PHYSICS
H04N21/442
ELECTRICITY
H04L12/2821
ELECTRICITY
G06F3/1423
PHYSICS
G06F3/1454
PHYSICS
H04Q2213/13175
ELECTRICITY
G09G2370/08
PHYSICS
G09G2356/00
PHYSICS
G06F3/14
PHYSICS
G09G5/12
PHYSICS
International classification
G09G5/12
PHYSICS
G06F3/14
PHYSICS
Abstract
One example display method includes: in response to receiving a display request from a target display device, determining a first display device and a second display device that support display of a target service; when a first distance between the first display device and a user is less than a second distance between the second display device and the user, obtaining current first display data of the target service from the target display device and sending the current first display data to the first display device; and when it is subsequently determined that the first distance reported by the first display device is greater than the second distance reported by the second display device, obtaining current second display data of the target service from the target display device and sending the current second display data to the first display device and the second display device.
Claims
1. A display method, comprising: receiving, by a control device, a display request sent by a target display device when the target display device needs to display a target service; determining, by the control device in response to the display request, a first display device and a second display device that support display of the target service; obtaining, by the control device, a first distance between the first display device and a user, and a second distance between the second display device and the user; when the first distance is less than the second distance, obtaining, by the control device, current first display data of the target service from the target display device, and sending the first display data to the first display device, wherein the first display device displays the target service based on the first display data; and when subsequently determining that the first distance is greater than the second distance, obtaining, by the control device, current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device, wherein both the first display device and the second display device display the target service based on the second display data.
2. The method according to claim 1, wherein at least one of: the first display data comprises at least one layer that is of all layers of the target service and that supports display by the first display device; or the second display data comprises at least one layer that is of all layers of the target service and that supports display by the second display device.
3. The method according to claim 1, wherein after the sending, by the control device, the second display data to the first display device and the second display device, the method further comprises: when a preset duration in which the second display device displays the target service expires, stopping, by the control device, sending the second display data to the first display device.
4. The method according to claim 1, wherein after the sending, by the control device, the second display data to the first display device and the second display device, the method further comprises: when the second distance is less than a preset distance threshold, stopping, by the control device, sending the second display data to the first display device.
5. The method according to claim 1, wherein after the sending, by the control device, the second display data to the first display device and the second display device, the method further comprises: when the second distance is less than a preset distance threshold, determining, by the control device, a duration in which the second distance between the user and the second display device is less than the preset distance threshold; and if the duration is greater than a preset duration threshold, stopping, by the control device, sending the second display data to the first display device.
6. The method according to claim 1, wherein after the sending, by the control device, the first display data to the first display device, the method further comprises: when the control device subsequently obtains that the first distance is equal to the second distance, instructing, by the control device, the first display device and the second display device to perform face detection; and if a face detection result reported by the first display device is obtained, obtaining, by the control device, the current second display data of the target service from the target display device, and sending the second display data to the first display device; or if a face detection result reported by the second display device is obtained, obtaining, by the control device, the current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device.
7. The method according to claim 1, wherein the display request comprises attribute information of a to-be-displayed layer that needs to be displayed for the target service; and wherein the determining, by the control device in response to the display request, a first display device and a second display device that support display of the target service comprises: determining, by the control device according to the attribute information, the first display device and the second display device that support displaying the target service.
8. A display method, comprising: receiving, by a mobile phone, a video call; obtaining, by the mobile phone, a first distance between a user of the mobile phone and a smart television and a second distance between the user and a display device; selecting, by the mobile phone, the smart television when a preset condition is satisfied, wherein the preset condition comprises: the first distance is less than the second distance; and sending, by the mobile phone, display data of the video call to the smart television for display.
9. The display method according to claim 8, wherein the preset condition further comprises: the smart television supports attribute information of a first to-be-displayed layer for the video call, and wherein the display data of the video call comprises display data of the first to-be-displayed layer.
10. The display method according to claim 9, wherein the attribute information of the first to-be-displayed layer indicates one or more of the following: a size of the first to-be-displayed layer, and a privacy attribute of the first to-be-displayed layer.
11. The display method according to claim 9, wherein the method further comprises: obtaining, by the mobile phone, device information of the smart television, wherein the device information of the smart television comprises one or more of the following: screen resolution, a rendering capability of a graphics processing unit (GPU), a frequency of a central processing unit (CPU), a size of a layer that the smart television supports, and a privacy attribute of a layer that the smart television supports; and wherein the device information of the smart television is used to determine that the smart television supports the attribute information of the first to-be-displayed layer for the video call.
12. The display method according to claim 8, wherein the preset condition further comprises: the user focuses on the smart television.
13. A control device comprising: at least one processor; and one or more memories coupled to the at least one processor and storing programming instructions for execution by the at least one processor to enable the control device to perform operations comprising: receiving a display request sent by a target display device when the target display device needs to display a target service; determining, in response to the display request, a first display device and a second display device that support display of the target service; obtaining a first distance between the first display device and a user, and a second distance between the second display device and the user; when the first distance is less than the second distance, obtaining current first display data of the target service from the target display device, and sending the first display data to the first display device, wherein the first display device displays the target service based on the first display data; and when subsequently determining that the first distance is greater than the second distance, obtaining current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device, wherein both the first display device and the second display device display the target service based on the second display data.
14. The control device according to claim 13, wherein at least one of: the first display data comprises at least one layer that is of all layers of the target service and that supports display by the first display device; or the second display data comprises at least one layer that is of all layers of the target service and that supports display by the second display device.
15. The control device according to claim 13, wherein the operations comprise: when a preset duration in which the second display device displays the target service expires, stopping sending the second display data to the first display device.
16. The control device according to claim 13, wherein the operations comprise: when the second distance is less than a preset distance threshold, stopping sending the second display data to the first display device.
17. The control device according to claim 13, wherein the operations comprise: when the second distance is less than a preset distance threshold, determining a duration in which the second distance between the user and the second display device is less than the preset distance threshold; and if the duration is greater than a preset duration threshold, stopping sending the second display data to the first display device.
18. The control device according to claim 13, wherein the operations comprise: when it is subsequently determined that the first distance is equal to the second distance, instructing the first display device and the second display device to perform face detection; and if a face detection result reported by the first display device is obtained, obtaining the current second display data of the target service from the target display device, and sending the second display data to the first display device; or if a face detection result reported by the second display device is obtained, obtaining the current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device.
19. The control device according to claim 13, wherein the control device further comprises a display connected to the at least one processor, and wherein the display is configured to display the target service based on at least one of the first display data or second display data sent by the target display device.
20. A mobile phone, comprising: at least one processor; and one or more memories coupled to the at least one processor and storing programming instructions for execution by the at least one processor to enable the mobile phone to perform operations comprising: receiving a video call; obtaining a first distance between a user of the mobile phone and a smart television and a second distance between the user and a display device; selecting the smart television when a preset condition is satisfied, wherein the preset condition comprises: the first distance is less than the second distance; and sending display data of the video call to the smart television for display.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0053] The terms “first” and “second” mentioned below are merely intended for a purpose of description, and shall not be understood as an indication or implication of relative importance or implicit indication of a quantity of indicated technical features. Therefore, a feature limited by “first” or “second” may explicitly or implicitly include one or more features. In the description of the embodiments of this application, unless otherwise stated, “a plurality of” means two or more than two.
[0054] A display method provided in an embodiment of this application may be applied to a display system 100 shown in
[0055] As shown in
[0056] The control device 200 may be connected to each display device by using a wireless network (for example, Wi-Fi, Bluetooth, or a cellular mobile network) or a wired network (for example, an optical fiber). This is not limited in this embodiment of this application.
[0057] In some embodiments of this application, the control device 200 stores device information that can reflect a display capability and the like of each display device. The mobile phone 201 is used as an example. After establishing a connection to the control device 200, as shown in
[0058] Similarly, for each display device connected to the control device 200, device information of the display device may be recorded in the control device 200. Subsequently, when a display device initiates a target service (for example, playing a video or running a game) that needs to be displayed, the display device may send a corresponding display request and display data corresponding to the target service to the control device 200. In this case, the control device 200 may determine, based on the recorded device information of each display device, a proper display device as a target device for the current target service, and send the display data corresponding to the target service to the target device for display.
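As a minimal sketch of the bookkeeping described above, the control device could keep a registry of device information and use it to pick a target device for an incoming display request. The names here (`DeviceInfo`, `select_target`) and the selection rule (largest capable screen) are illustrative assumptions, not taken from this application:

```python
from dataclasses import dataclass

@dataclass
class DeviceInfo:
    device_id: str
    resolution: tuple       # (width, height) in pixels
    supports_video: bool    # whether the device can render a video layer

class ControlDevice:
    def __init__(self):
        self.registry = {}  # device_id -> DeviceInfo

    def register(self, info: DeviceInfo):
        """Record device information when a display device connects."""
        self.registry[info.device_id] = info

    def select_target(self, needs_video: bool):
        """Illustrative policy: the capable device with the largest screen area."""
        candidates = [
            d for d in self.registry.values()
            if d.supports_video or not needs_video
        ]
        if not candidates:
            return None
        best = max(candidates, key=lambda d: d.resolution[0] * d.resolution[1])
        return best.device_id

ctrl = ControlDevice()
ctrl.register(DeviceInfo("phone-201", (1080, 2340), True))
ctrl.register(DeviceInfo("tv-202", (3840, 2160), True))
print(ctrl.select_target(needs_video=True))  # tv-202 has the larger screen
```

In practice the selection would weigh the recorded device information (resolution, GPU rendering capability, supported layer attributes) rather than screen area alone.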
[0059] For example, as shown in
[0060] A layer (view or layer) is a basic composition unit of a display interface on a display device. After a plurality of layers are stacked in sequence, a final display effect of the display interface is formed. Each layer may include one or more controls, and a definition rule of each layer and a stacking sequence of the plurality of layers may be defined by a developer during application development. An Android system is used as an example. Some basic layers, such as ImageView, AdapterView, and RelativeLayout, are defined in the Android system. A developer may use or modify these basic layers to draw a customized layer.
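The layer stacking described above can be illustrated with a toy composition step, assuming a simple integer z-order per layer; the names and structure are illustrative, not Android's actual view machinery:

```python
# Each layer carries an illustrative z-order; stacking the layers in
# sequence (lowest z first, topmost last) yields the final display effect.
layers = [
    {"name": "controls",   "z": 2},
    {"name": "background", "z": 0},
    {"name": "video",      "z": 1},
]

def compose(layers):
    """Return layer names in draw order: lowest z first, topmost last."""
    return [l["name"] for l in sorted(layers, key=lambda l: l["z"])]

print(compose(layers))  # → ['background', 'video', 'controls']
```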
[0061] As shown in
[0062] In this case, when the mobile phone 201 needs to display the chat interface (namely, the target service) in
[0063] Still as shown in
[0064] In this case, the control device 200 may send response information of the display request to the mobile phone 201, to trigger the mobile phone 201 to generate display data of the video call service (namely, data of the to-be-displayed layer) and send the display data to the control device 200. As shown in
[0065] Certainly, if a connection relationship is established between the mobile phone 201 and the smart television 202, the control device 200 may also add an identifier of the smart television 202 to the response information. In this way, the mobile phone 201 may send, based on the identifier of the smart television 202, the generated display data of the video call service to the smart television 202 for display.
[0066] Alternatively, the control device 200 may have an image processing capability, for example, an image rendering capability. In this case, after receiving the display data of the video call service generated by the mobile phone 201, the control device 200 may perform secondary rendering on the display data based on the device information such as resolution of the smart television 202, to obtain display data that conforms to a display capability of the smart television 202, and send the display data to the smart television 202 for display.
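A hedged sketch of the "secondary rendering" idea above, reduced to its simplest resolution-fitting step and assuming aspect-ratio-preserving scaling (the function name and the letterboxing policy are illustrative):

```python
def fit_resolution(src, dst):
    """Return the largest size that fits within dst while keeping
    src's aspect ratio (i.e. letterboxed scaling)."""
    sw, sh = src
    dw, dh = dst
    scale = min(dw / sw, dh / sh)
    return (round(sw * scale), round(sh * scale))

# A portrait phone frame (1080x2340) scaled onto a 4K television (3840x2160).
print(fit_resolution((1080, 2340), (3840, 2160)))  # → (997, 2160)
```

Real secondary rendering would also reconcile pixel density, color format, and the layers each device supports, per the recorded device information.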
[0067] It may be learned that, in the display method provided in this embodiment of this application, the plurality of display devices of the user may be interconnected with the control device 200, and the device information of each display device is recorded in the control device 200, so that the control device may intelligently select, based on the device information of each display device, a proper target device for the current target service, and project a layer corresponding to the target service to the target device for display.
[0068] In other words, any display device in the display system 100 may be used as a source device that provides screen source data when triggering the target service, and the control device 200 in the display system 100 may intelligently determine a screen-projection occasion and a controlled device displaying the target service, so that the source device and the controlled device in a multi-screen display scenario may be flexibly set based on a service requirement, thereby improving efficiency of collaboration between the plurality of devices.
[0069] In some other embodiments of this application, as shown in
[0070] In this case, as shown in
[0071] In this way, provided that the mobile phone 201 initiating the target service reports the target service to the control device 200, a smart screen projection function of projecting the target service to another display device for display can be implemented, thereby reducing implementation complexity and power consumption of each display device in the display system 100.
[0072] In addition, when determining the proper target device for the target service, the control device 200 may further obtain a distance between the user and each display device in the display system 100 in this case, and determine a display device closest to the user as the target device for displaying the target service.
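The closest-device rule in the paragraph above amounts to taking a minimum over the reported distances; a sketch with illustrative device identifiers and distances (in practice the distances would come from each device's distance sensor):

```python
def nearest_device(distances):
    """distances: dict of device_id -> distance to the user (meters).
    Returns the id of the device closest to the user."""
    return min(distances, key=distances.get)

print(nearest_device({"tv-202": 3.0, "tablet-203": 1.2, "laptop-204": 2.5}))
# → tablet-203
```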
[0073] For example, as shown in
[0074] It should be noted that a specific implementation form of the control device 200 in the display system 100 is not limited in this embodiment of this application. For example, as shown in (a) in
[0075] In addition, in the foregoing embodiment, an example in which the target service of the display device is projected to another display device for display is used for description. It may be understood that a terminal connected to the control device 200 in the display system 100 may alternatively be a terminal having another output function, for example, a Bluetooth speaker having an audio output function. In this case, when receiving a to-be-played audio service initiated by any terminal, the control device 200 may intelligently select a proper audio playing device for the terminal to perform the to-be-played audio service. This is not limited in this embodiment of this application.
[0076] In some embodiments of this application, the display device (or the control device 200) in the display system 100 may specifically be any terminal such as a mobile phone, a wearable device, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a tablet computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, or a personal digital assistant (personal digital assistant, PDA). Certainly, a specific form of the terminal is not limited in the following embodiments.
[0077] As shown in
[0078] As shown in
[0079] The parts of the mobile phone 201 are described in detail below with reference to
[0080] The processor 101 is a control center of the mobile phone 201. The processor 101 is connected to the parts of the mobile phone 201 by using various interfaces and lines, and performs various functions of the mobile phone 201 and processes data by running or executing an application program stored in the memory 103 and invoking data stored in the memory 103. In some embodiments, the processor 101 may include one or more processing units. Optionally, an application processor and a modem processor may be integrated into the processor 101. The application processor mainly processes an operating system, a user interface, an application program, and the like, and the modem processor mainly processes wireless communication. Optionally, the modem processor and the application processor may alternatively be independent of each other.
[0081] In this embodiment of this application, the processor 101 may include a GPU 115 and a CPU 116, or may be a combination of a GPU 115, a CPU 116, a digital signal processor (digital signal processor, DSP), and a control chip (for example, a baseband chip) in a communications unit. In an implementation of this application, the GPU 115 and the CPU 116 each may be a single computing core, or may include a plurality of computing cores.
[0082] The GPU 115 is a microprocessor specially used for performing image computing on a personal computer, a workstation, a game console, and some mobile devices (such as a tablet computer and a smartphone). The GPU 115 may convert and drive display information required by the mobile phone 201, provide a row scanning signal to the display 104-2, and control correct display of the display 104-2.
[0083] Specifically, in a display process, the CPU 116 may send a corresponding drawing command to the GPU 115. For example, the drawing command may be "draw a rectangle with a length and width of a and b respectively at a coordinate position (x, y)". In this case, the GPU 115 may quickly calculate all pixels of the graphic according to the drawing command, and draw the corresponding graphic at the specified position on the display 104-2.
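The rectangle drawing command in the example above can be mirrored in a few lines that compute the pixels the rectangle covers. This is an illustrative CPU-side sketch of the calculation the text attributes to the GPU, not actual GPU code:

```python
def rectangle_pixels(x, y, a, b):
    """Return every (col, row) pixel covered by an a-wide, b-tall
    rectangle whose top-left corner is at coordinate (x, y)."""
    return [(x + i, y + j) for j in range(b) for i in range(a)]

# "Draw a rectangle with length 3 and width 2 at coordinate (10, 20)."
pixels = rectangle_pixels(10, 20, 3, 2)
print(len(pixels))  # 3 * 2 = 6 pixels
```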
[0084] It should be noted that the GPU 115 may be integrated in the processor 101 in a form of a function module, or may be disposed in the mobile phone 201 in an independent entity form (for example, a video card). This is not limited in this embodiment of this application.
[0085] The radio frequency circuit 102 may be configured to receive and send a radio signal in an information receiving and sending process or in a call process. Particularly, after receiving downlink data from a base station, the radio frequency circuit 102 may send the downlink data to the processor 101 for processing. In addition, the radio frequency circuit 102 sends related uplink data to the base station. Usually, the radio frequency circuit includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuit 102 may further communicate with another device through wireless communication. The wireless communication may use any communication standard or protocol that includes but is not limited to a global system for mobile communications, a general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, an email, a short message service, and the like.
[0086] The memory 103 is configured to store an application program and data. The processor 101 performs various functions of the mobile phone 201 and processes data by running the application program and the data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area. The program storage area may store an operating system, and an application program required by at least one function (for example, a sound play function or an image play function). The data storage area may store data (for example, audio data or a phone book) created based on use of the mobile phone 201. In addition, the memory 103 may include a high-speed random access memory (random access memory, RAM), or may include a non-volatile memory such as a magnetic disk storage device, a flash storage device, or another non-volatile solid-state storage device. The memory 103 may store various operating systems such as an iOS® operating system developed by Apple and an Android® operating system developed by Google. The memory 103 may be independent, and is connected to the processor 101 by using the communications bus. Alternatively, the memory 103 may be integrated into the processor 101.
[0087] The touchscreen 104 may specifically include a touchpad 104-1 and the display 104-2.
[0088] The touchpad 104-1 may collect a touch event performed by the user of the mobile phone 201 on or near the touchpad 104-1 (for example, an operation performed by the user on or near the touchpad 104-1 by using any proper object such as a finger or a stylus), and send collected touch information to another component (for example, the processor 101). The touch event performed by the user near the touchpad 104-1 may be referred to as a floating touch. The floating touch may mean that the user does not need to directly touch the touchpad for selecting, moving, or dragging a target (for example, an icon), and the user only needs to be near the terminal to perform a desired function. In addition, the touchpad 104-1 may be implemented in a plurality of types such as a resistive type, a capacitive type, an infrared type, or a surface acoustic wave type.
[0089] The display (also referred to as a display screen) 104-2 may be configured to display information entered by the user or information provided for the user, and various menus of the mobile phone 201. The display 104-2 can be configured in a form of a liquid crystal display, an organic light emitting diode, or the like. The touchpad 104-1 may cover the display 104-2. When detecting the touch event on or near the touchpad 104-1, the touchpad 104-1 transfers the touch event to the processor 101 to determine a type of the touch event. Then, the processor 101 can provide a corresponding visual output on the display 104-2 based on the type of the touch event. Although the touchpad 104-1 and the display screen 104-2 in
[0090] The mobile phone 201 may further include the Bluetooth apparatus 105, configured to exchange data between the mobile phone 201 and another terminal (for example, a mobile phone or a smartwatch) at a short distance away from the mobile phone 201. In this embodiment of this application, the Bluetooth apparatus may be an integrated circuit, a Bluetooth chip, or the like.
[0091] The mobile phone 201 may further include at least one sensor 106 such as a fingerprint collection component 112, a light sensor, a motion sensor, and another sensor. Specifically, the fingerprint collection component 112 may be disposed on a back face (for example, under a rear-facing camera) of the mobile phone 201, or on a front face (for example, under the touchscreen 104) of the mobile phone 201. For another example, the fingerprint collection component 112 may alternatively be configured in the touchscreen 104 to implement a fingerprint recognition function. In other words, the fingerprint collection component 112 may be integrated with the touchscreen 104 to implement the fingerprint recognition function of the mobile phone 201. The light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor may adjust luminance of the display of the touchscreen 104 based on ambient light luminance. The proximity sensor may power off the display when the mobile phone 201 approaches an ear. As a motion sensor, an accelerometer sensor can detect a value of acceleration in each direction (usually, on three axes), can detect a value and a direction of gravity in a static state, and can be used in an application for identifying a mobile phone posture (such as screen switching between a landscape mode and a portrait mode, a related game, and magnetometer posture calibration), a function related to vibration identification (such as a pedometer and a knock), and the like. Other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be disposed in the mobile phone 201. Details are not described herein.
[0092] In this embodiment of this application, the sensor 106 of the mobile phone 201 further includes a distance sensor 113, configured to sense a distance between the mobile phone 201 and an object (or the user) to complete a preset function. According to different working principles, the distance sensor may be classified into an optical distance sensor, an infrared distance sensor, an ultrasonic distance sensor, or the like. This is not limited in this embodiment of this application.
[0093] The Wi-Fi apparatus 107 is configured to provide, for the mobile phone 201, network access that complies with a Wi-Fi-related standard protocol. The mobile phone 201 may access a Wi-Fi access point by using the Wi-Fi apparatus 107, to help the user receive and send an email, browse a web page, access streaming media, and the like. The Wi-Fi apparatus 107 provides wireless broadband internet access for the user. In some other embodiments, the Wi-Fi apparatus 107 may alternatively be used as a Wi-Fi wireless access point, and may provide another terminal with Wi-Fi network access.
[0094] The positioning apparatus 108 is configured to provide a geographic location for the mobile phone 201. It may be understood that the positioning apparatus 108 may specifically be a receiver of a positioning system such as the global positioning system (GPS), the Beidou navigation satellite system, or the GLONASS of Russia. After receiving the geographic location sent by the positioning system, the positioning apparatus 108 sends the information to the processor 101 for processing, or sends the information to the memory 103 for storage. In some other embodiments, the positioning apparatus 108 may alternatively be a receiver of an assisted global positioning system (AGPS). The AGPS system serves as an assisted server to assist the positioning apparatus 108 in completing ranging and positioning services. In this case, the assisted positioning server communicates with a terminal, for example, the positioning apparatus 108 (namely, the GPS receiver) of the mobile phone 201, through a wireless communications network, to provide positioning assistance. In some other embodiments, the positioning apparatus 108 may alternatively use a positioning technology based on a Wi-Fi access point. Each Wi-Fi access point has a globally unique media access control (media access control, MAC) address. The terminal can scan and collect a broadcast signal of a surrounding Wi-Fi access point when Wi-Fi is enabled. Therefore, the terminal can obtain a MAC address broadcast by the Wi-Fi access point. The terminal sends such data (for example, the MAC address) that can identify the Wi-Fi access point to a location server by using the wireless communications network. The location server retrieves a geographic location of each Wi-Fi access point, calculates a geographic location of the terminal with reference to strength of the Wi-Fi broadcast signal, and sends the geographic location of the terminal to the positioning apparatus 108 of the terminal.
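The Wi-Fi-based positioning described above could be sketched as a lookup of known access-point positions by MAC address, followed by a signal-strength-weighted average. The MAC addresses, positions, and weighting scheme below are all illustrative assumptions:

```python
# Location server's table of known access-point positions (illustrative).
AP_LOCATIONS = {
    "aa:bb:cc:00:00:01": (0.0, 0.0),
    "aa:bb:cc:00:00:02": (10.0, 0.0),
}

def estimate_position(scan):
    """scan: dict of MAC address -> signal strength (higher = closer).
    Returns a signal-strength-weighted average of the AP positions."""
    total = sum(scan.values())
    x = sum(AP_LOCATIONS[mac][0] * s for mac, s in scan.items()) / total
    y = sum(AP_LOCATIONS[mac][1] * s for mac, s in scan.items()) / total
    return (x, y)

# A scan that hears the first AP three times as strongly as the second.
print(estimate_position({"aa:bb:cc:00:00:01": 3.0, "aa:bb:cc:00:00:02": 1.0}))
# → (2.5, 0.0)
```

A production location server would use a calibrated path-loss model or fingerprinting rather than a linear weighting, but the lookup-then-combine structure is the same.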
[0095] The audio circuit 109, a speaker 113, and a microphone 114 may provide audio interfaces between the user and the mobile phone 201. The audio circuit 109 may transmit, to the speaker 113, an electrical signal converted from received audio data, and the speaker 113 converts the electrical signal into a sound signal for output. In addition, the microphone 114 converts a collected sound signal into an electrical signal, and the audio circuit 109 converts the electrical signal into audio data after receiving the electrical signal, and then outputs the audio data to the radio frequency circuit 102 to send the audio data to, for example, another mobile phone, or outputs the audio data to the memory 103 for further processing.
[0096] The peripheral interface 110 is configured to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, an external memory, and a subscriber identity module card). For example, the mouse is connected by using a universal serial bus (universal serial bus, USB) interface, and the subscriber identity module (subscriber identification module, SIM) card provided by a telecommunications operator is connected by using a metal contact on a card slot of the subscriber identity module card. The peripheral interface 110 may be configured to couple the external input/output peripheral devices to the processor 101 and the memory 103.
[0097] The mobile phone 201 may further include the power supply apparatus 111 (for example, a battery and a power management IC) that supplies power to the parts. The battery may be logically connected to the processor 101 by using the power management IC, so that functions such as charging management, discharging management, and power consumption management are implemented by using the power supply apparatus 111.
[0098] Although not shown in
[0099] With reference to the display system 100 shown in
[0100] 801a. A first display device sends a first connection request to a control device.
[0101] 802a. After receiving the first connection request, the control device establishes a connection relationship with the first display device.
[0102] In steps 801a and 802a, an example in which the first display device (for example, the foregoing mobile phone 201) actively establishes a connection to the control device is used for description.
[0103] In some embodiments of this application, after the control device 200 accesses a network, for example, a local area network whose Wi-Fi name is “1234”, the control device 200 may add an identifier (for example, a MAC address of the control device 200) of the control device 200 to first indication information for periodic broadcast. The first indication information is used to indicate that the device is the control device 200. In this case, after the mobile phone 201 also accesses the local area network whose Wi-Fi name is “1234”, the mobile phone 201 may receive the first indication information, to determine the current control device 200.
[0104] Then, as described in step 801a, a processor of the mobile phone 201 may invoke, based on the identifier of the control device 200, a Wi-Fi apparatus of the mobile phone 201 to send the first connection request to the control device 200 by using the Wi-Fi network named “1234”. The first connection request is used to request to establish the connection relationship between the mobile phone 201 and the control device 200, and the first connection request may carry an identifier of the mobile phone 201 (for example, a MAC address of the mobile phone 201).
[0105] In this case, after the control device 200 receives the first connection request sent by the mobile phone 201, as described in step 802a, the control device 200 may store the identifier of the mobile phone 201 in a memory, to establish the connection relationship with the mobile phone 201. Subsequently, both the control device 200 and the mobile phone 201 can find each other by using the identifier of the control device 200 and the identifier of the mobile phone 201, to perform communication.
[0106] In some other embodiments of this application, after the mobile phone 201 and a plurality of other devices access a same network (for example, the local area network whose Wi-Fi name is “1234”), as shown in
[0107] In this case, after detecting this entry operation of the user, the mobile phone 201 can set the mobile phone 201 as the control device 200, and add the identifier of the mobile phone 201 to the first indication information for periodic broadcast. In this case, after receiving the first indication information, another display device in the local area network may add an identifier of the display device to the first connection request, and send the first connection request to the mobile phone 201 (namely, the control device), so that the mobile phone 201 stores the received identifier, to establish a connection relationship with each display device in the local area network.
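The discovery and connection bookkeeping of steps 801a and 802a can be sketched as follows. This is a minimal sketch under stated assumptions: the class names, message fields, and method names are all illustrative inventions, not part of the patent; only the flow (broadcast identifier, answer with a connection request carrying the requester's identifier, store it) mirrors the text.

```python
# Hypothetical sketch of steps 801a/802a: the control device broadcasts its
# identifier in the first indication information; a display device that
# receives it sends back a connection request carrying its own identifier,
# which the control device records to establish the connection relationship.

class ControlDevice:
    def __init__(self, mac):
        self.mac = mac                 # identifier carried in the broadcast
        self.connected = {}            # MAC -> device name (the stored record)

    def first_indication(self):
        """First indication information: marks this device as the control device."""
        return {"type": "control-device", "mac": self.mac}

    def on_connection_request(self, device_mac, device_name):
        """Step 802a: store the requester's identifier to establish the relationship."""
        self.connected[device_mac] = device_name
        return True

class DisplayDevice:
    def __init__(self, mac, name):
        self.mac, self.name = mac, name
        self.control_mac = None        # learned from the broadcast

    def on_broadcast(self, indication, control):
        """Step 801a: learn the control device and send a connection request."""
        if indication.get("type") == "control-device":
            self.control_mac = indication["mac"]
            return control.on_connection_request(self.mac, self.name)
        return False
```

After the exchange, each side holds the other's identifier, which is what the patent relies on for subsequent communication.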
[0108] 801b. The control device sends a second connection request to the first display device.
[0109] 802b. After receiving the second connection request, the first display device establishes the connection relationship with the control device.
[0110] In steps 801b and 802b, an example in which the control device actively establishes a connection to the first display device (namely, the mobile phone 201) is used for description.
[0111] Similar to the foregoing steps 801a and 802a, the control device 200 can add the identifier of the control device 200 to the second connection request, and send the second connection request to the mobile phone 201. In this case, after receiving the second connection request, the mobile phone may store the identifier of the control device 200, and send the identifier of the mobile phone 201 to the control device 200, so that the control device 200 also stores the identifier of the mobile phone 201 in the memory of the control device, to establish the connection relationship with the mobile phone 201. Subsequently, both the control device 200 and the mobile phone 201 can find each other by using the identifier of the control device 200 and the identifier of the mobile phone 201, to perform communication.
[0112] It should be noted that, in the foregoing embodiments, an example in which the first display device establishes the connection relationship with the control device is used for description. Another display device may also establish a connection relationship with the control device based on the foregoing method, to build the display system 100 shown in
[0113] 803. The first display device sends device information of the first display device to the control device.
[0114] 804. After receiving the device information, the control device stores the device information in the memory of the control device for recording.
[0115] An example in which the mobile phone 201 is used as the first display device is still used for description. In step 803, because the mobile phone 201 has established the connection relationship with the control device, the mobile phone 201 may send the device information of the mobile phone 201 to the control device based on the stored identifier of the control device 200, for example, parameters reflecting a display capability of the mobile phone 201 such as the screen resolution of the mobile phone 201, a rendering capability of a GPU, and a frequency of a CPU, a parameter reflecting an audio playing capability such as an audio format supported by the mobile phone 201, and a parameter reflecting whether display of user privacy is supported. This is not limited in this embodiment of this application.
[0116] The user privacy may specifically include information such as secure transaction information (for example, a stock transaction page), information having a chat nature (for example, an SMS message or a message notification), location information of the user, and information, such as a contact number, that the user never wants to disclose.
[0117] For example, whether the display device supports display of the user privacy may be determined based on a parameter such as a type of the display device and/or a geographical location of the display device. For example, when the display device is a device with strong mobility, for example, a mobile phone or a wearable device, because the user usually carries such a device, in other words, privacy of the device is relatively high, it may be determined that the device supports display of the user privacy. When the display device is a device with weak mobility, for example, a Bluetooth speaker or a smart television, because a location of such a device is relatively fixed and the device usually does not move with the user, in other words, privacy of the device is relatively low, it may be determined that the device does not support display of the user privacy.
[0118] Then, in step 804, after receiving the device information sent by the mobile phone 201, the control device 200 may store a correspondence between the mobile phone 201 and the device information of the mobile phone 201 in the memory of the control device 200 for recording.
[0119] The control device 200 may record received device information of each display device. As shown in Table 1, the device information of each display device is maintained in the control device 200. Subsequently, when a target service needs to be displayed, the control device 200 may determine, for the target service and based on the recorded device information of each display device, a proper display device as a target device for displaying the target service.
TABLE 1

                       Device information
                      Display capability                              Playing capability   Supports privacy
Display device        Resolution    CPU capability  GPU capability    Supported format     display or not
Mobile phone 201      1920 × 1080   Strong          Strong            MP3 and WAV          Yes
Smart television 202  3840 × 2160   Weak            Weak              MP3                  No
Tablet computer 203   2560 × 1440   Strong          Middle            MP3 and WAV          Yes
[0120] For example, after receiving an incoming call service, the mobile phone 201 may send, to the control device 200, attribute information of attributes of one or more to-be-displayed layers related to the incoming call service, for example, resolution supported by the to-be-displayed layer, a CPU capability and a GPU capability that are supported by the to-be-displayed layer, and whether user privacy is included in the to-be-displayed layer. The control device 200 matches the received attribute information of the to-be-displayed layer with the device information of each display device recorded in Table 1, to obtain one or more display devices that support display of the to-be-displayed layer.
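The matching in paragraph [0120] can be sketched as a filter over the recorded device table. The table values mirror Table 1, but the dictionary layout, field names, and the function itself are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch: match a to-be-displayed layer's attribute information
# against the device information recorded in Table 1 to obtain the display
# devices that support displaying the layer.

DEVICE_TABLE = {
    "mobile phone 201":     {"resolution": (1920, 1080), "formats": {"MP3", "WAV"}, "privacy": True},
    "smart television 202": {"resolution": (3840, 2160), "formats": {"MP3"},        "privacy": False},
    "tablet computer 203":  {"resolution": (2560, 1440), "formats": {"MP3", "WAV"}, "privacy": True},
}

def supporting_devices(layer):
    """Return the devices whose recorded capabilities cover the layer's needs."""
    matches = []
    for name, info in DEVICE_TABLE.items():
        if layer["min_resolution"][0] > info["resolution"][0]:
            continue                                   # screen resolution too low
        if not layer["formats"] <= info["formats"]:
            continue                                   # audio format unsupported
        if layer["has_privacy"] and not info["privacy"]:
            continue                                   # privacy display not supported
        matches.append(name)
    return matches
```

For instance, a layer that contains user privacy is matched only against the mobile phone and the tablet computer, since Table 1 records the smart television as not supporting privacy display.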
[0121] For example, the control device 200 determines that the mobile phone 201, the smart television 202, and the tablet computer 203 in Table 1 all support display of the to-be-displayed layer of the incoming call service. In this case, to help the user learn of the incoming call service in time, when the mobile phone 201, the smart television 202, and the tablet computer 203 all remain connected to the control device 200, the control device may send second instruction information to the mobile phone 201, the smart television 202, and the tablet computer 203. The second instruction information is used to instruct each display device to report its distance from the user.
[0122] Then, after receiving the second instruction information, the mobile phone 201, the smart television 202, and the tablet computer 203 each may obtain its distance from the user through periodic detection by a distance sensor (for example, a camera or an infrared sensor) of the device or in another existing manner, and report the distance obtained through detection to the control device 200. In this way, the control device 200 may determine a display device closest to the user, for example, the smart television 202 in the bedroom, as the target device for displaying the to-be-displayed layer of the incoming call service. In addition, the target device may be selected based on a real-time distance in a process of performing the incoming call service. In the process of performing the incoming call service, when the distances between the user and the plurality of display devices change, the incoming call service may be freely switched on the plurality of display devices, thereby improving efficiency of collaboration between the plurality of devices while greatly improving user experience.
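The closest-device selection described in paragraph [0122] reduces to taking the minimum over reported distances, treating a device that reports nothing as infinitely far away (consistent with paragraph [0130]). The function name and call shape are assumptions for illustration.

```python
# Hypothetical sketch: pick the nearest candidate display device from the
# distances the devices reported; a missing report counts as infinite
# distance (the user is outside that device's ranging range).
import math

def pick_target(reported_distances, candidates):
    """reported_distances: device name -> metres. Returns the closest candidate."""
    return min(candidates,
               key=lambda device: reported_distances.get(device, math.inf))
```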
[0123] In this embodiment of this application, a house structure diagram showing a location of each display device may also be pre-stored in the control device 200.
[0124] In this case, when the control device 200 determines, based on the distance between the user and each display device, the target device displaying the target service, with reference to the specific location of each display device shown in
[0125] In addition, each display device in the rooms may further periodically report the distance between the user and the display device to the control device 200, for example, report the distance between the user and the display device every 30 seconds. Therefore, when the user moves in the rooms, for example, as shown in
[0126] However, when the control device 200 switches the target service from television 2 to television 1, the user may not enter the bedroom, or may not enter an optimal viewing area in the bedroom, and consequently, the user misses a related image of the target service.
[0127] Therefore, an embodiment of this application provides a display method. That the foregoing display system 100 includes a control device 200, and a first display device and a second display device that are connected to the control device 200 is used as an example. As shown in
[0128] 901. The control device obtains a first distance between the first display device and a user, and a second distance between the second display device and the user.
[0129] Specifically, a distance sensor may be disposed in each of the first display device and the second display device. The first display device may measure the current first distance away from the user by using the distance sensor of the first display device, and the second display device may measure the current second distance away from the user by using the distance sensor of the second display device. Then, the first display device and the second display device may separately send the measured first distance and second distance to the control device.
[0130] Certainly, if the user is not within a ranging range of the first display device (or the second display device) at the moment, for example, the second display device does not detect the user in the bedroom, it may be considered that the distance between the second display device and the user is infinite.
[0131] Alternatively, one or more cameras connected to the control device may further be disposed in the display system 100, and the control device may capture a user image by using the camera. In this case, the first distance between the first display device and the user and the second distance between the second display device and the user may be determined in combination with a pre-stored location of each display device.
[0132] Certainly, because a wearable device (or a mobile phone) is usually carried by the user, the control device may further obtain a positioning result of the user by using a positioning apparatus of the wearable device (or the mobile phone), and then determine, in combination with the pre-stored locations of the display devices, the first distance between the first display device and the user and the second distance between the second display device and the user.
[0133] Optionally, the control device may alternatively obtain the first distance and the second distance in another existing manner such as by using indoor positioning. This is not limited in this embodiment of this application.
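When the user's position comes from a camera or a carried device's positioning apparatus, as in paragraphs [0131] and [0132], the first and second distances follow directly from the pre-stored device locations. The sketch below assumes a flat 2-D coordinate system; the coordinates and names are illustrative only.

```python
# Hypothetical sketch: derive the first/second distances from pre-stored
# display-device locations and one user position (2-D coordinates assumed).
import math

DEVICE_POSITIONS = {
    "first display device":  (0.0, 0.0),   # illustrative pre-stored locations
    "second display device": (6.0, 0.0),
}

def distances_to_user(user_xy):
    """Euclidean distance from the user to each display device."""
    return {name: math.dist(pos, user_xy)
            for name, pos in DEVICE_POSITIONS.items()}
```

As the user moves, repeatedly recomputing these distances gives the control device the same inputs as the sensor-reported case.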
[0134] In some embodiments of this application, when the control device receives a display request initiated by a target service, the control device may be triggered to obtain the first distance and the second distance. For example, as shown in
[0135] Certainly, the mobile phone in
[0136] 902. When the first distance is less than the second distance, the control device instructs the first display device to run the target service.
[0137] 903. The first display device displays the target service in real time.
[0138] When the first distance D1 is less than the second distance D2, as shown in
[0139] The to-be-displayed layer may include some layers during running of the video playing service. For example, the control device may remove a layer that includes user privacy during running of the video playing service, and send, to the first display device, a layer that does not include privacy as the to-be-displayed layer. Alternatively, the control device may send, to a third display device that supports display of user privacy, a layer that includes user privacy during running of the video playing service, and send, to the first display device, a layer that does not include privacy as the to-be-displayed layer. Certainly, the to-be-displayed layer may alternatively include all layers during running of the video playing service. This is not limited in this embodiment of this application.
[0140] In addition, when the first distance D1 is less than the second distance D2, if the first distance is less than a preset value, for example, the distance between the user and the first display device is less than 3 meters (or another preset distance), or the duration in which the distance between the user and the first display device is less than 3 meters (or another preset distance) is greater than preset duration, the control device may be triggered to instruct the first display device to run the target service. This avoids a problem that the first display device is triggered to display the target service when the user merely passes by the first display device quickly, which would increase power consumption of the first display device.
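The dwell condition in paragraph [0140] is a debounce: the switch fires only after the user has stayed within the preset distance for longer than the preset duration. The thresholds, sample format, and function name below are assumptions chosen to match the 3-meter example in the text.

```python
# Hypothetical sketch of the trigger condition: return True once the most
# recent run of distance samples below the preset distance has lasted at
# least the preset duration, so a user quickly passing by does not trigger
# the target service.

def should_trigger(samples, preset_distance=3.0, preset_duration=5.0):
    """samples: list of (timestamp_seconds, distance_metres), oldest first."""
    run_start = None
    for t, d in samples:
        if d < preset_distance:
            if run_start is None:
                run_start = t          # run of close samples begins here
            if t - run_start >= preset_duration:
                return True
        else:
            run_start = None           # user moved away; reset the run
    return False
```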
[0141] Certainly, before the control device sends, to the first display device, the to-be-displayed layer generated when the mobile phone runs the video playing service, the control device may further perform secondary rendering on the to-be-displayed layer sent by the mobile phone, for example, adjust a size of the to-be-displayed layer to adapt to resolution of the first display device. This is not limited in this embodiment of this application.
[0142] 904. The control device continues obtaining the first distance between the first display device and the user, and the second distance between the second display device and the user.
[0143] While the first display device displays a real-time image of the target service, the first display device and the second display device may continue detecting and reporting the distance between the first display device and the user and the distance between the second display device and the user, so that the control device continues obtaining the first distance D1 between the first display device and the user, and the second distance D2 between the second display device and the user.
[0144] 905. When the first distance is greater than the second distance, the control device sends a first instruction to the first display device, and sends a second instruction to the second display device.
[0145] The first instruction is used to instruct the first display device to continue displaying the target service in real time, and the second instruction is used to instruct the second display device to display the target service in real time starting from a target image currently displayed by the first display device.
[0146] In addition, before sending the second instruction to the second display device, the control device may first determine whether the second display device is currently connected to the control device, in other words, whether the second display device is online. When the second display device is online, the control device may be triggered to send the second instruction to the second display device.
[0147] However, when the second display device is offline, the control device may re-establish a connection relationship with the second display device, and then send the second instruction to the second display device. Alternatively, when the second display device is offline, the control device may reselect another display device that is currently closer to the user and that is connected to the control device, and send the second instruction to that display device. This is not limited in this embodiment of this application.
[0148] 906. In response to the first instruction, the first display device continues displaying the target service in real time within the preset duration.
[0149] 907. In response to the second instruction, the second display device displays the target service in real time starting from a target layer currently displayed by the first display device.
[0150] When the first distance D1 is greater than the second distance D2, as shown in
[0151] It should be noted that the to-be-displayed layer generated when the second display device displays the target service may be the same as or different from the to-be-displayed layer generated when the first display device displays the target service. For example, when the second display device supports display of user privacy but the first display device does not support display of privacy, the to-be-displayed layer sent by the control device to the second display device may include a layer related to user privacy, for example, a layer including a contact number and SMS message content, but the to-be-displayed layer sent by the control device to the first display device does not include a layer related to user privacy.
[0152] For the first display device, the control device continues sending a to-be-displayed layer of video A after three minutes and 45 seconds to the first display device instead of immediately stopping sending the to-be-displayed layer of the video playing service to the first display device. In other words, when the control device switches the video playing service from the first display device to the second display device, the first display device and the second display device simultaneously display same images in a period of time.
[0153] A reason is that a process in which the user moves from the first display device to the second display device is a continuous process. When it is detected that the first distance D1 is greater than the second distance D2, still as shown in
[0154] Therefore, when it is detected that the first distance D1 is greater than the second distance D2, in addition to switching the video playing service from the first display device to the second display device, the control device continues sending the to-be-displayed layer of the video playing service to the first display device, so that the first display device continues displaying the video playing service for a period of time (for example, 30 seconds). In this way, the user can still view the video playing service played in real time before leaving the room in which the first display device is located, thereby ensuring that the video playing service can be stably transited when being switched between different display devices. In addition, seamless connection of the video playing service on the different display devices is implemented, thereby improving efficiency of collaboration between the plurality of devices.
[0155] Further, to enable the first display device and the second display device to play the video playing service at the same time as much as possible, the control device may send the to-be-displayed layer of the target service to the first display device and the second display device at the same time. In this way, the first display device and the second display device may immediately display the to-be-displayed layer of the target service after receiving the to-be-displayed layer, thereby improving synchronization of playing the video playing service by the first display device and the second display device.
[0156] Alternatively, a synchronization mechanism may be preset between the display devices of the display system, so that system time of the first display device is synchronized with that of the second display device. In this case, the control device may add a display time point of the target service to the first instruction sent to the first display device and to the second instruction sent to the second display device. In this way, when the display time point arrives, the first display device and the second display device may be triggered to simultaneously play the target service, to improve synchronization of playing the video playing service by the first display device and the second display device.
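The time-point mechanism in paragraph [0156] can be sketched as follows. It assumes the devices' system time is already synchronized, as the paragraph states; the instruction field names and functions are illustrative inventions.

```python
# Hypothetical sketch of paragraph [0156]: the control device puts one shared
# future display time point into both the first and the second instruction;
# each device acts when its (synchronized) clock reaches that time point, so
# the two devices display the target service simultaneously.

def build_instructions(now, lead_seconds=0.5):
    """Build the first and second instructions carrying a common time point."""
    display_at = now + lead_seconds
    first = {"action": "continue-display", "display_at": display_at}
    second = {"action": "start-display", "display_at": display_at}
    return first, second

def ready_to_display(instruction, device_clock):
    """A device acts on its instruction once the shared time point arrives."""
    return device_clock >= instruction["display_at"]
```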
[0157] 908. When the control device obtains that the second distance between the second display device and the user is less than a distance threshold, the control device sends a close instruction to the first display device, to enable the first display device to stop displaying the target service.
[0158] Optionally, in step 908, after switching the video playing service from the first display device to the second display device, the control device may further continue obtaining the second distance D2 between the user and the second display device. As shown in
[0159] Further, the control device may determine duration in which the second distance between the second display device and the user is less than the threshold. If the duration is greater than a duration threshold, it indicates that the user has stayed in front of the second display device for a period of time. In this case, the control device may be triggered to send the close instruction to the first display device, to enable the first display device to stop displaying the target service.
[0160] In addition, if the first display device does not receive, within the preset duration, the to-be-displayed layer that is of the video playing service and that is sent by the control device, the first display device may stop sending the first distance between the first display device and the user to the control device, to reduce power consumption of the first display device.
[0161] Certainly, after the control device stops sending the to-be-displayed layer of the video playing service to the first display device, the first display device may still continue periodically sending the first distance between the first display device and the user to the control device, so that in a subsequent movement process of the user, the control device may determine in time, based on the first distance, whether to switch the video playing service to the first display device until the control device sends an instruction for stopping reporting the first distance to the first display device. This is not limited in this embodiment of this application.
[0162] In addition, after step 904, if the control device obtains that the first distance is equal to the second distance, in other words, the distance between the user and the first display device is equal to the distance between the user and the second display device, still as shown in
[0163] 909. The first display device and the second display device separately perform face detection (or human eye detection).
[0164] When the first distance is equal to the second distance, the control device may determine the current focus of the user based on an orientation of a face (or an eye) of the user, and further determine whether to switch the video playing service from the first display device to the second display device.
[0165] Specifically, a camera may be disposed on each of the first display device and the second display device. In this way, the first display device and the second display device each may capture an image of the user by using the camera, and further recognize the image of the user based on a face detection (or human eye detection) algorithm. When the face (or the human eye) is recognized, it indicates that a face detection (or human eye detection) result is obtained in this case.
[0166] 910. If the first display device obtains the face detection (or human eye detection) result, the control device instructs the first display device to continue displaying the target service.
[0167] If the first display device obtains the face detection (or human eye detection) result, it indicates that the current focus of the user still falls on the first display device in this case, and the control device may continue sending, to the first display device, the to-be-displayed layer that is generated in real time by the video playing service, and does not need to switch the video playing service from the first display device to the second display device.
[0168] In this embodiment of this application, the first display device and the second display device may alternatively periodically capture images of the user by using the cameras. In this case, in step 910, when the first display device obtains the face detection (or human eye detection) result, the control device may further identify, by using a face (or human eyes) identification algorithm, whether a face (or a human eye) detected in current detection is the same as the face (or the human eye) detected in previous face detection (or human eye detection). If the face (or the human eye) detected in the current detection is the same as the face (or the human eye) detected in the previous face detection (or human eye detection), it indicates that the user focusing on the first display device does not change, and the control device may instruct the first display device to continue displaying the foregoing target service. Otherwise, the control device may ignore a face detection (or human eye detection) result reported by the first display device in the current detection.
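The identity check in paragraph [0168] can be sketched with the face (or human eye) identification algorithm abstracted away as a feature-vector comparison. The distance threshold, vector representation, and function names are all assumptions; the patent does not specify the algorithm.

```python
# Hypothetical sketch of paragraph [0168]: a new face/eye detection result is
# acted on only if it matches the identity from the previous detection;
# otherwise the result is ignored.

def same_identity(prev_features, cur_features, threshold=0.25):
    """Accept the detection when the two feature vectors are close enough."""
    if prev_features is None or cur_features is None:
        return False
    dist = sum((a - b) ** 2 for a, b in zip(prev_features, cur_features)) ** 0.5
    return dist <= threshold

def handle_detection(prev_features, cur_features):
    """Return the action: keep displaying on the same device, or ignore."""
    return "continue-display" if same_identity(prev_features, cur_features) else "ignore"
```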
[0169] 911. If the second display device obtains the face detection (or human eye detection) result, the control device performs the foregoing steps 905 to 908.
[0170] If the second display device obtains a face detection (or human eye detection) result, it indicates that the current focus of the user has been transferred to the second display device. In this case, the control device may switch the video playing service from the first display device to the second display device. For a specific switching method of the video playing service, refer to related descriptions in steps 905 to 908, and details are not described herein again.
[0171] Similar to step 910, when the second display device obtains the face detection (or human eye detection) result, the control device may further identify, by using the face (or human eye) identification algorithm, whether a currently detected face (or human eye) is the same as a face (or human eye) that is previously reported by the first display device. If the currently detected face (or human eye) is the same as the face (or human eye) that is previously reported by the first display device, it indicates that the user originally focusing on the first display device transfers the focus to the second display device. Then, the control device may switch the video playing service from the first display device to the second display device by performing the foregoing steps 905 to 908. Otherwise, the control device may ignore the face detection (or human eye detection) result reported by the second display device this time.
[0172] Certainly, the user may also manually switch the target service from the first display device to the second display device.
[0173] For example, as shown in
[0174] For another example, the user may further trigger a switching process of the target service by performing a corresponding gesture on the first display device. As shown in
[0175] Optionally, the mobile phone may further display the determined relative position relationship between the mobile phone and the smart television on a display screen of the mobile phone in a form of a text, a picture, an animation, or the like, to prompt the user to switch the target service between the mobile phone and the smart television by performing a directional drag operation.
[0176] Further, in this embodiment of this application, the user may alternatively select to switch display content in an area on the display interface of the first display device to the second display device for display. For example, a current display interface of the first display device includes a plurality of display windows, for example, an input method window 1102 and a window of an SMS message application that are displayed by the mobile phone in
[0177] It may be understood that, to implement the foregoing functions, the control device or the display device includes corresponding hardware structures and/or software modules for performing the functions. A person skilled in the art may be easily aware that, in combination with the units, algorithms, and steps described in the embodiments disclosed in this specification, the embodiments of this application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered as going beyond the scope of the embodiments of this application.
[0178] In the embodiments of this application, the foregoing terminal may be divided into function modules based on the foregoing method examples. For example, each function module may be obtained through division based on each corresponding function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in a form of hardware, or may be implemented in a form of a software function module. It should be noted that, in the embodiments of this application, module division is an example, and is merely a logical function division. During actual implementation, another division manner may be used.
[0179] When the function modules are divided based on corresponding functions,
[0180] The receiving unit 2101 is configured to support the control device in performing the processes 901 and 904 in
[0181] The determining unit 2102 is configured to perform control management on an action of the control device. The receiving unit 2101 and the sending unit 2103 are configured to support a communication process between the control device and another device. In addition, the control device may further include a storage unit, configured to store program code and data of the control device. For example, the storage unit may be configured to store device information sent by each display device.
[0182] When the control device is used as a display device in the display system, the control device may further include a display unit, configured to display information entered by a user or information provided for a user, and various menus of the control device.
[0183] For example, the determining unit 2102 may be a processor, the receiving unit 2101 and the sending unit 2103 may be transceiver devices such as RF circuits or Wi-Fi apparatuses, the storage unit may be a memory, and the display unit may be a display. In this case, the control device provided in this embodiment of this application may be the mobile phone 201 shown in
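One way to picture the module division above is a control device object whose receiving, determining, sending, and storage units are plain methods and fields. This is purely illustrative; the unit numbers 2101 to 2103 follow the description, while the message format and method signatures are assumptions:

```python
class ControlDeviceModules:
    """Illustrative module division of the control device."""

    def __init__(self):
        self.device_info = {}   # storage unit: device info per display device
        self.outbox = []        # records what the sending unit transmits

    def receive(self, sender_id, message):
        """Receiving unit 2101: accept a message from a display device
        and store its device information."""
        self.device_info[sender_id] = message
        return self.determine(message)

    def determine(self, message):
        """Determining unit 2102: control management; decide whether
        display data must be forwarded."""
        if message.get("type") == "display_request":
            return self.send(message["target"], message["data"])
        return None

    def send(self, device_id, data):
        """Sending unit 2103: forward display data to a display device."""
        self.outbox.append((device_id, data))
        return device_id
```

In an actual device, the determining unit would be a processor and the receiving/sending units transceivers (for example, RF circuits or Wi-Fi apparatuses), as the paragraph above notes; the sketch only shows the logical division.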
[0184] All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used to implement the embodiments, the embodiments may be implemented completely or partially in a form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are all or partially generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid-state drive (SSD)), or the like.
[0185] The foregoing descriptions are merely specific implementations of this application, but are not intended to limit the protection scope of this application. Any variation or replacement within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.