ELECTRONIC DEVICE CONTROL METHOD AND ELECTRONIC DEVICE
20240244314 · 2024-07-18
CPC classification
G06F3/017
PHYSICS
H04N23/632
ELECTRICITY
H04M1/0216
ELECTRICITY
G06F1/1652
PHYSICS
G06F1/1616
PHYSICS
G06F1/1677
PHYSICS
G06F3/0346
PHYSICS
G06F2200/1614
PHYSICS
H04M1/72448
ELECTRICITY
G06F1/1626
PHYSICS
Abstract
This application provides an electronic device control method and an electronic device. The method includes: displaying a first screen, wherein the first screen comprises a first control displayed at a first position, wherein a first side of the electronic device is at a first angle with respect to a direction of gravity; detecting a first operation on the first control, and shooting, by the electronic device, a first image; receiving a second operation on the electronic device, wherein, after the second operation, the first side of the electronic device is at a second angle with respect to the direction of gravity; displaying, by the electronic device in response to the second operation, a second screen, wherein the second screen comprises: a second control displayed at a second position; and detecting a third operation on the second control and shooting, by the electronic device using the camera, a second image.
Claims
1.-16. (canceled)
17. An electronic device, comprising a memory configured to store computer program instructions and a processor configured to execute the program instructions, wherein the memory is coupled to the processor, and the processor is configured to: display a first screen of a first application, wherein the first screen comprises an image acquired in real time by a camera of the electronic device and a first control displayed at a first position, wherein a first side of the electronic device is at a first angle with respect to a direction of gravity; detect a first operation performed by a user on the first control, and shoot, by the electronic device using the camera, a first image; receive a second operation performed by the user on the electronic device, wherein, after the second operation, the first side of the electronic device is at a second angle with respect to the direction of gravity, the second angle being different from the first angle; display, by the electronic device in response to the second operation, a second screen, wherein the second screen comprises: an image acquired in real time by the camera of the electronic device and a second control displayed at a second position, the second position being different from the first position; and detect a third operation performed by the user on the second control and shoot, by the electronic device using the camera, a second image.
18. The electronic device according to claim 17, wherein the second screen further comprises the first control.
19. The electronic device according to claim 17, wherein the processor is further configured to: when the first application is running in photo mode, detect a third operation performed by the user on a third control; switch the first application to video mode in response to the third operation; detect a fourth operation performed by the user on the first control, shoot, by the electronic device using the camera, a first video, and display a third screen, wherein the third screen comprises: a fourth control and a fifth control, the fourth control being at a same position as the first control, and the fifth control being at a same position as the second control; and detect a fifth operation performed by the user on a sixth control, stop shooting, by the electronic device, the first video, and display a fourth screen, wherein the fourth screen comprises the first control displayed at the first position and the second control displayed at the second position.
20. The electronic device according to claim 17, wherein a difference between the second angle and the first angle is 90°.
21. The electronic device according to claim 17, wherein the processor is further configured to: after the second screen is displayed by the electronic device in response to the second operation, detect a fourth operation performed by the user on the second control; and display the second control at a third position in response to the fourth operation, wherein the third position is different from the first position and the second position.
22. The electronic device according to claim 21, wherein the processor is further configured to: after the second control is displayed at the third position in response to the fourth operation, receive a fifth operation performed by the user on the electronic device, wherein, after the fifth operation, the first side of the electronic device is at the first angle with respect to the direction of gravity; display, by the electronic device in response to the fifth operation, a fifth screen, wherein the fifth screen comprises an image acquired in real time by the camera of the electronic device and the first control displayed at the first position; receive a sixth operation performed by the user on the electronic device, wherein, after the sixth operation, the first side of the electronic device is at the second angle with respect to the direction of gravity; and display, by the electronic device in response to the sixth operation, a sixth screen, wherein the sixth screen comprises an image acquired in real time by the camera of the electronic device and the second control displayed at the third position.
23. The electronic device according to claim 21, wherein the processor is further configured to: after the second control is displayed at the third position in response to the fourth operation, display, by the electronic device in response to a seventh operation by the user, a seventh screen, wherein the seventh screen does not belong to the first application; display, by the electronic device in response to an eighth operation by the user, an eighth screen of the first application, wherein the eighth screen comprises an image acquired in real time by the camera of the electronic device and the first control displayed at the first position, wherein the first side of the electronic device is at the first angle with respect to the direction of gravity; and display, by the electronic device in response to a ninth operation by the user, a ninth screen, wherein the ninth screen comprises an image acquired in real time by the camera of the electronic device and the second control displayed at the third position.
24. The electronic device according to claim 17, wherein the processor is further configured to: display a tenth screen in response to a tenth operation on the first screen, wherein the tenth screen comprises a first option and a second option, and the first option has been selected; select the second option in response to an eleventh operation on the tenth screen; display an eleventh screen of the first application, wherein the eleventh screen comprises an image acquired in real time by the camera of the electronic device, the first control displayed at the first position, and the second control displayed at a fourth position, wherein the first side of the electronic device is at the first angle with respect to the direction of gravity; receive a twelfth operation performed by the user on the electronic device, wherein, after the twelfth operation, the first side of the electronic device is at the second angle with respect to the direction of gravity; and display, by the electronic device, a twelfth screen, wherein the twelfth screen comprises an image acquired in real time by the camera of the electronic device, the first control displayed at the first position, and the second control displayed at the fourth position.
25. The electronic device according to claim 17, wherein the electronic device is an electronic device having a foldable display, the electronic device having a foldable display is in an unfolded state, and the processor is further configured to: receive a thirteenth operation performed by the user on the electronic device, wherein, after the thirteenth operation, the electronic device changes from the unfolded state to a folded state; display a thirteenth screen of the first application, wherein the thirteenth screen comprises an image acquired in real time by the camera of the electronic device and the first control displayed at the first position, wherein the first side of the electronic device is at the first angle with respect to the direction of gravity; receive a fourteenth operation performed by the user on the electronic device, wherein, after the fourteenth operation, the first side of the electronic device is at the second angle with respect to the direction of gravity; and display, by the electronic device, a fourteenth screen, wherein the fourteenth screen comprises an image acquired in real time by the camera of the electronic device and the first control displayed at the first position.
26. The electronic device according to claim 17, wherein the second control has a same pattern as the first control.
27. The electronic device according to claim 17, wherein the second control is smaller than the first control.
28. The electronic device according to claim 17, wherein the second control has a transparency greater than 0.
29. The electronic device according to claim 17, wherein the processor is further configured to: detect the first operation performed by the user on the first control, wherein after the first operation performed by the user on the first control is detected, the second control and the first control display a same motion effect; and detect the third operation performed by the user on the second control, wherein after the third operation performed by the user on the second control is detected, the second control and the first control display a same motion effect.
30. A control method, comprising: displaying a first screen of a first application, wherein the first screen comprises an image acquired in real time by a camera of an electronic device and a first control displayed at a first position, wherein a first side of the electronic device is at a first angle with respect to a direction of gravity; detecting a first operation performed by a user on the first control, and shooting, by the electronic device using the camera, a first image; receiving a second operation performed by the user on the electronic device, wherein, after the second operation, the first side of the electronic device is at a second angle with respect to the direction of gravity, the second angle being different from the first angle; displaying, by the electronic device in response to the second operation, a second screen, wherein the second screen comprises: an image acquired in real time by the camera of the electronic device and a second control displayed at a second position, the second position being different from the first position; and detecting a third operation performed by the user on the second control and shooting, by the electronic device using the camera, a second image.
31. The control method according to claim 30, wherein the second screen further comprises the first control.
32. The control method according to claim 30, wherein when the first application is running in photo mode, the method further comprises: detecting a third operation performed by the user on a third control; switching the first application to video mode in response to the third operation; detecting a fourth operation performed by the user on the first control, shooting, by the electronic device using the camera, a first video, and displaying a third screen, wherein the third screen comprises: a fourth control and a fifth control, the fourth control being at a same position as the first control, and the fifth control being at a same position as the second control; and detecting a fifth operation performed by the user on a sixth control, stopping shooting, by the electronic device, the first video, and displaying a fourth screen, wherein the fourth screen comprises the first control displayed at the first position and the second control displayed at the second position.
33. The control method according to claim 30, wherein a difference between the second angle and the first angle is 90°.
34. The control method according to claim 30, further comprising: after the second screen is displayed by the electronic device in response to the second operation, detecting a fourth operation performed by the user on the second control; and displaying the second control at a third position in response to the fourth operation, wherein the third position is different from the first position and the second position.
35. The control method according to claim 34, further comprising: after the second control is displayed at the third position in response to the fourth operation, receiving a fifth operation performed by the user on the electronic device, wherein, after the fifth operation, the first side of the electronic device is at the first angle with respect to the direction of gravity; displaying, by the electronic device in response to the fifth operation, a fifth screen, wherein the fifth screen comprises an image acquired in real time by the camera of the electronic device and the first control displayed at the first position; receiving a sixth operation performed by the user on the electronic device, wherein, after the sixth operation, the first side of the electronic device is at the second angle with respect to the direction of gravity; and displaying, by the electronic device in response to the sixth operation, a sixth screen, wherein the sixth screen comprises an image acquired in real time by the camera of the electronic device and the second control displayed at the third position.
36. A computer-readable storage medium, wherein the computer-readable storage medium comprises a program, which when executed by a processor, causes the processor to perform operations, the operations comprising: displaying a first screen of a first application, wherein the first screen comprises an image acquired in real time by a camera of an electronic device and a first control displayed at a first position, wherein a first side of the electronic device is at a first angle with respect to a direction of gravity; detecting a first operation performed by a user on the first control, and shooting, by the electronic device using the camera, a first image; receiving a second operation performed by the user on the electronic device, wherein, after the second operation, the first side of the electronic device is at a second angle with respect to the direction of gravity, the second angle being different from the first angle; displaying, by the electronic device in response to the second operation, a second screen, wherein the second screen comprises: an image acquired in real time by the camera of the electronic device and a second control displayed at a second position, the second position being different from the first position; and detecting a third operation performed by the user on the second control and shooting, by the electronic device using the camera, a second image.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0048] The following describes the technical solutions in the embodiments of this application with reference to the accompanying drawings in the embodiments of this application. In the description of the embodiments of this application, unless otherwise specified, "/" indicates an "or" relationship. For example, A/B may represent A or B. In this specification, "and/or" is merely an association relationship for describing associated objects, and indicates that three relationships may exist. For example, "A and/or B" may represent the following three cases: only A, both A and B, and only B. In addition, in the description of the embodiments of this application, "a plurality of" means two or more than two.
[0049] The terms "first", "second", and "third" in the following are merely intended for a purpose of description, and shall not be understood as an indication or implication of relative importance or an implicit indication of the number of the indicated technical features. Therefore, a feature limited by "first", "second", or "third" may explicitly or implicitly include one or more features.
[0050] The electronic device control method provided in the embodiments of this application may be applied to a terminal device such as a mobile phone, a tablet computer, a wearable device, an in-vehicle device, an augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, or a personal digital assistant (personal digital assistant, PDA). The embodiments of this application impose no limitation on a specific type of the terminal device.
[0051] For example,
[0052] It should be understood that a structure illustrated in this embodiment of this application does not constitute a specific limitation on the terminal device 100. In some other embodiments of this application, the terminal device 100 may include more or fewer parts than shown in the figure, or combine some parts, split some parts, or have different part arrangements. The parts shown in the figure may be implemented by using hardware, software, or a combination of software and hardware.
[0053] A software system of the terminal device 100 may use a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment of this application, a software structure of the terminal device 100 is described by using an Android system with a layered architecture as an example.
[0055] As shown in
[0056] The application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications at the application layer. The application framework layer includes some predefined functions. As shown in
[0057] The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
[0058] The core library includes two parts: one part is performance functions that need to be invoked by the Java language, and the other part is the Android core library.
[0059] The application layer and the application framework layer run on the virtual machine. The virtual machine executes Java files of the application layer and the application framework layer as binary files. The virtual machine is configured to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
[0060] The system libraries may include a plurality of functional modules, for example, a surface manager (surface manager), a media library (media libraries), a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
[0061] The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
[0062] A hardware layer may include various types of sensors, such as the various types of sensors described in
[0063] With reference to the electronic device described above in
[0064] With reference to an electronic device control method in an embodiment of this application, the following illustrates the working processes of software and hardware of the electronic device 100. The electronic device control method provided in this embodiment of this application is mainly implemented through cooperation among a touch panel (touch panel, TP) module, one or more of the physical components described above, and the software architecture layers of the electronic device 100.
[0065] The TP module receives a touch operation performed by a user on the touch display, and transfers the touch operation to a physical state monitoring module in the system library, which identifies the touch operation performed by the user. The physical state monitoring module transfers the touch operation to a state machine management module of the electronic device, and the state machine management module controls a window management system at the FWK layer, so as to control a series of actions, displays, and the like of the electronic device.
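The module chain described above can be sketched as a small event pipeline. All class and method names below are illustrative assumptions; the real modules are platform-internal and are not specified in code in this application.

```python
# Toy sketch of the described touch-delivery chain: TP module ->
# physical state monitoring module -> state machine management module ->
# window management system at the FWK layer.

class WindowManager:
    """Stands in for the FWK-layer window management system."""
    def __init__(self):
        self.actions = []

    def apply(self, action):
        # A real system would update windows and displays here.
        self.actions.append(action)

class StateMachineManager:
    """Drives the window management system based on identified touches."""
    def __init__(self, window_manager):
        self.window_manager = window_manager

    def handle(self, touch):
        self.window_manager.apply(("redraw_for", touch["type"]))

class PhysicalStateMonitor:
    """Identifies the raw touch event and hands it to the state machine."""
    def __init__(self, state_machine):
        self.state_machine = state_machine

    def identify(self, raw_event):
        touch = {"type": raw_event.get("gesture", "tap")}
        self.state_machine.handle(touch)

def tp_module_receive(raw_event, monitor):
    """Entry point standing in for the TP module's touch delivery."""
    monitor.identify(raw_event)
```

For example, delivering a drag gesture through `tp_module_receive` ends with the window manager recording a corresponding redraw action.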
[0066] In addition, engagement and cooperation of more other modules and sensors are also required in implementation of the whole process, such as a skin module of the application layer for controlling a display screen of the touch screen. Details are not described in this application.
[0067] For ease of understanding, the following embodiments of this application will take an electronic device with the structure shown in
[0068] First, the four screen layouts of 0°, 90°, 180°, and 270° mentioned in the following embodiments are described. When the user holds the electronic device, the electronic device has an actual rotation angle, which describes the rotation angle of the electronic device in the clockwise direction. The electronic device may collect the actual rotation angle using a gravity sensor, and determine the current screen layout based on the collected actual rotation angle. In an example, a direct mapping relationship may exist between the screen layouts and actual rotation angles, as shown in Table 1 below.
TABLE 1

  Screen layout    Actual rotation angle
  0°               345° to 0° and 0° to 75°
  90°              75° to 165°
  180°             165° to 255°
  270°             255° to 345°
[0069] In the example in Table 1, regardless of whether the electronic device is rotated clockwise or counterclockwise, when the actual rotation angle of the electronic device is between 345° and 0° or between 0° and 75°, the electronic device has the screen layout of 0°. Similarly, when the actual rotation angle of the electronic device is between 75° and 165°, the electronic device has the screen layout of 90°; when the actual rotation angle of the electronic device is between 165° and 255°, the electronic device has the screen layout of 180°; and when the actual rotation angle of the electronic device is between 255° and 345°, the electronic device has the screen layout of 270°. It should be noted that the mapping relationship between the screen layouts and actual rotation angles shown in Table 1 is only used as an example. For different sensors or different electronic devices, the corresponding relationship between the screen layouts and actual rotation angles may be different. Details are not described herein.
[0070] In another example, an indirect mapping relationship may exist between the screen layouts and actual rotation angles, as shown in Table 2 below.
TABLE 2

  Screen layout    Rotation state    Actual rotation angle
  0°               First state       345° to 0° and 0° to 75°
  90°              Second state      75° to 165°
  180°             Third state       165° to 255°
  270°             Fourth state      255° to 345°
[0071] In the example in Table 2, a specific mapping relationship exists between the actual rotation angles and the rotation states, and each rotation state corresponds to one screen layout. After the rotation state of the electronic device is detected using the gravity sensor, the current screen layout of the electronic device can be determined.
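The mapping in Table 1 can be condensed into a simple lookup. The boundary values below follow the table and are illustrative only; as noted above, different sensors or electronic devices may use different thresholds.

```python
def screen_layout(angle_deg: float) -> int:
    """Map an actual clockwise rotation angle (in degrees) to a screen layout.

    Boundary values follow Table 1 and are illustrative assumptions.
    """
    a = angle_deg % 360  # normalize to [0, 360)
    if 75 <= a < 165:
        return 90
    if 165 <= a < 255:
        return 180
    if 255 <= a < 345:
        return 270
    return 0  # covers 345° to 360° and 0° to 75°
```

For example, an actual rotation angle of 120° yields the 90° screen layout, and 350° yields the 0° layout, matching the ranges in Table 1.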
[0073] In an optional implementation, control options corresponding to a floating shutter may include Off, Standard, and Smart.
[0074] Optionally, the user selects the Off option as the control option of the floating shutter, and a shooting screen includes a fixed shutter, with no floating shutter.
[0075] For example, as shown in
[0076] When the electronic device is rotated to another state, the shooting screen still includes the fixed shutter, with no floating shutter, and details are not described herein.
[0077] In the example shown in
[0078] Optionally, the user selects the Standard option as the control option of the floating shutter. In the Standard option, a floating shutter may be available on a non-foldable electronic device or an electronic device having a curved display, and may also be available on an electronic device having a foldable display in an unfolded state or a folded state.
[0079] The user starts the camera application, the electronic device displays a screen 600a shown in
[0080] The floating shutter 604 in
[0081] The floating shutter 604 provided in the foregoing embodiment may be in a same pattern as the fixed shutter 602, with a size slightly smaller than a size of the fixed shutter 602, and has a preset transparency (for example, 80% transparency, not shown in the figure). This not only helps the user recognize that the floating shutter 604 has a same function as the fixed shutter 602, but also minimizes occlusion of the shooting preview area 601 by the floating shutter 604.
[0082] The floating shutter 604 may be displayed directly, or may appear with a preset motion effect, for example, popping up from a border of the shooting screen, or fading in with the transparency changing from 0 to 80%. The pattern and attributes (including parameters such as size and transparency) of the floating shutter 604 shown in
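The appearance effect described above (transparency changing from 0 to the preset 80%) can be sketched as a simple ramp. The duration and linear easing below are assumptions for illustration, not values from this application.

```python
def shutter_transparency(t: float, duration: float = 0.3,
                         target: float = 0.8) -> float:
    """Transparency of the floating shutter t seconds after it appears.

    Linearly ramps transparency from 0 to the preset 80%; the 0.3 s
    duration and linear interpolation are illustrative assumptions.
    """
    if t <= 0:
        return 0.0
    if t >= duration:
        return target
    return target * (t / duration)
```

A rendering loop would sample this function each frame until the transparency settles at the preset value.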
[0083] Optionally, on the basis of
[0084] In the Standard option, when the electronic device is rotated to another angle (180° or 270°), the position of the floating shutter 604 in the shooting screen remains unchanged, which is not described herein again. In other words, in the Standard mode, the position of the floating shutter 604 is independent of the rotation of the electronic device.
[0085] In the Standard mode, the floating shutter 604 may be dragged.
[0086] Optionally, in addition to tap-and-drag, a manner of moving the floating shutter 604 may be long-press-and-drag, or multiple taps at a position where the user wants the floating shutter 604 to appear, or by other operations. This is not limited in this application. A preset motion effect may also be displayed during dragging of the floating shutter 604.
[0087] In
[0088] Optionally, the user selects the Smart option as the control option of the floating shutter. Taking the electronic device having a foldable display in
[0089] For example, when it is detected that the folded angle of the electronic device is greater than a preset angle, the electronic device is in an unfolded state, and the electronic device displays a shooting screen 700a as shown in
[0090] On the basis of
[0091] The floating shutter 704 in
[0092] In the example shown in
[0093] The floating shutter 704 may be displayed directly, or may appear with a preset motion effect, for example, popping up from a border of the shooting screen 700b, or fading in with the transparency changing from 0 to 80%. The pattern and attributes (including parameters such as size and transparency) of the floating shutter 704 shown in
[0094] In the foregoing embodiment, the floating shutter may be generated after the electronic device is rotated to a preset state. It should be noted that the example in
[0095] If the user continues to rotate the electronic device 90° clockwise on the basis of
[0096] If the user continues to rotate the electronic device 90° clockwise on the basis of
[0097] It should be noted that if the electronic device is rotated back by the user to the 0° state as shown in
[0098] In the foregoing embodiment, the floating shutter is generated after the electronic device is rotated by the user to a preset state. Specifically, the floating shutter may be generated when the electronic device detects that it has been rotated to a preset angle, and a default position of the floating shutter is intelligently recommended under the different screen layouts of 90°, 180°, and 270°, so as to facilitate user operations at various rotation angles. Optionally, the electronic device may implement the foregoing solution using a stored preset configuration file, and the configuration file may record different screen layouts and the default positions of the floating shutter in the different screen layouts.
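The preset configuration file mentioned above might, for example, map each screen layout to a default floating-shutter position. The structure and coordinate values below are purely hypothetical; the application does not specify a configuration format.

```python
# Hypothetical configuration mapping screen layouts to default positions
# of the floating shutter, as normalized (x, y) screen coordinates.
# The 0° layout has no floating shutter, matching the description above.
DEFAULT_FLOATING_SHUTTER = {
    90:  (0.85, 0.50),
    180: (0.50, 0.15),
    270: (0.15, 0.50),
}

def default_position(layout: int):
    """Return the default floating-shutter position for a layout, or None
    when that layout shows no floating shutter (coordinates are assumptions)."""
    return DEFAULT_FLOATING_SHUTTER.get(layout)
```

A user-dragged position, as described in the following paragraphs, could simply override the entry for the corresponding layout.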
[0099] In practice, however, different users have different hand sizes and different habits when operating an electronic device, and therefore, the foregoing default positions cannot meet the needs of every user. In order to further improve the user operating experience in using the camera application, an improved solution is provided on the basis of the solution provided by the Smart option above, and is described below on the basis of
[0100] The user moves the floating shutter 704 in
[0101] If dragging of the floating shutter 704 continues in the shooting screen 700e of
[0102] The foregoing description is made based on the solution in
[0103] It should be noted that when the user rotates the electronic device, an actual rotation angle is random. For example, with reference to
[0104] According to the electronic device control method proposed in the foregoing embodiment, an additional floating shutter different from the fixed shutter can be generated after the electronic device is rotated to a preset angle. In this way, when the fixed shutter originally provided on the shooting screen is no longer convenient for the user to operate with rotation of the electronic device, the additional floating shutter convenient for the user to touch is provided, so that regardless of how the electronic device is rotated, a shutter control that is convenient to operate is provided for the user, thereby enabling the user to conveniently shoot pictures at various rotation angles of the electronic device. Further, after the electronic device is rotated, the user can further drag the floating shutter generated through the rotation of the electronic device to enable the floating shutter to move to a position indicated by the user, thereby satisfying different needs of different users.
[0105] Optionally, when the control option of the floating shutter of the electronic device is selected as Smart, instruction information may be displayed when the electronic device is rotated to 90°, 180°, or 270° for the first time. For example, as shown in
[0106] In another optional embodiment, on the basis of
[0107] In an optional solution, the floating shutter and the fixed shutter may not only have the same pattern and functions, but also have the same motion effect. The following separately describes the motion effects of the floating shutter and the fixed shutter in the shooting function, the time-lapse shooting function, and the video recording function of the camera application.
[0108] The motion effect of the floating shutter in the shooting function of the camera application is described first. On the basis of
[0109] Next, the motion effect of the floating shutter in the time-lapse shooting function of the camera application is described. In the screen layout corresponding to
[0110] Further, the motion effect of the floating shutter in the video recording function of the camera application is described. During video recording by the electronic device, the fixed shutter can change to a pause or stop control, and a shooting-during-video-recording control is added. In the screen layout corresponding to
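The mode-dependent control changes described above can be sketched as a lookup from camera function to the set of controls shown. The control names below are illustrative assumptions, not identifiers from this application.

```python
# Hypothetical control sets per camera function. During video recording
# the fixed shutter changes to a pause or stop control and a
# shooting-during-video-recording control is added, as described above.
CONTROLS_BY_FUNCTION = {
    "photo": ("fixed_shutter", "floating_shutter"),
    "time_lapse": ("fixed_shutter", "floating_shutter"),
    "video_recording": ("stop_control", "pause_control",
                        "capture_during_recording", "floating_stop_control"),
}

def active_controls(function: str) -> tuple:
    """Return the controls shown for a camera function (names are assumptions)."""
    return CONTROLS_BY_FUNCTION[function]
```

In this sketch, switching the application from photo mode to video recording replaces the shutter controls with the recording controls in one lookup.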
[0112] S101. Display a first screen of a first application, where the first screen includes: an image acquired in real time by a camera of an electronic device and a first control displayed at a first position, where a first side of the electronic device is at a first angle with respect to a direction of gravity.
[0113] The first application may be a camera application of the electronic device. The first screen may be the screen 700a shown in
[0114] The first angle is described by taking
[0115] The first position may be a relative position of the first control with respect to certain hardware of the electronic device, or may be coordinate information of the first control in the coordinate system of the electronic device screen.
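The two position representations mentioned above can be illustrated as follows; the anchor name "camera" and all helper names are hypothetical:

```python
# Illustrative sketch of the two position representations described above:
# absolute screen coordinates, or coordinates relative to a hardware anchor.
# All names here are assumptions for illustration only.

def absolute_position(x_px, y_px):
    return {"kind": "absolute", "x": x_px, "y": y_px}


def relative_position(anchor, dx_px, dy_px):
    return {"kind": "relative", "anchor": anchor, "dx": dx_px, "dy": dy_px}


def resolve(pos, anchors):
    """Convert either representation to absolute screen coordinates."""
    if pos["kind"] == "absolute":
        return pos["x"], pos["y"]
    ax, ay = anchors[pos["anchor"]]
    return ax + pos["dx"], ay + pos["dy"]


# Example: a control 60 px left of and 10 px below a hypothetical camera anchor.
anchors = {"camera": (540, 40)}
print(resolve(relative_position("camera", -60, 10), anchors))
```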
[0116] S102. Detect a first operation performed by a user on the first control, and shoot, by the electronic device using the camera, a first image.
[0117] The first operation performed by the user on the first control may be a tap operation. Still referring to
[0118] S103. Receive a second operation performed by the user on the electronic device, after which the first side of the electronic device is at a second angle with respect to the direction of gravity, where the second angle is different from the first angle.
[0119] The second operation performed by the user on the electronic device may be a rotation operation on the electronic device. The second angle may be 90°, or may be any angle corresponding to the 90° screen layout in Table 1 or Table 2.
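Snapping a measured tilt to one of the four screen layouts can be sketched as below; nearest-quadrant rounding is an assumption made here for illustration, since the actual angle ranges are given by Table 1 and Table 2 of the description:

```python
# Illustrative snapping of a measured angle (between the first side of the
# device and the direction of gravity) to one of the four layout angles.
# Nearest-90-degree rounding is an assumption, not the thresholds of
# Table 1 / Table 2.

def layout_angle(measured_deg):
    """Snap a measured angle to 0, 90, 180, or 270 degrees."""
    return round((measured_deg % 360) / 90) % 4 * 90


print(layout_angle(95))   # treated as the 90-degree layout
print(layout_angle(182))  # treated as the 180-degree layout
print(layout_angle(350))  # close to a full turn, so the 0-degree layout
```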
[0120] S104. Display, by the electronic device in response to the second operation, a second screen, where the second screen includes: an image acquired in real time by the camera of the electronic device and a second control displayed at a second position, the second position being different from the first position.
[0121] The second screen may be as shown in the screen 700b of
[0122] The second screen may alternatively be the screen 700h shown in
[0123] S105. Detect a third operation performed by the user on the second control and shoot, by the electronic device using the camera, a second image.
[0124] The third operation performed by the user on the second control may be a tap operation. Still referring to
[0125] In an implementation, in S104, the second screen further includes the first control. In this example, the second screen is as shown in the screen 700b of
[0126] In an implementation, the second screen further includes a third control. The method further includes: the first application running in photo mode; detecting a third operation performed by the user on the third control; switching the first application to video mode in response to the third operation; shooting, by the electronic device using the camera, a first video when a fourth operation performed by the user on the first control is detected, and displaying a third screen, where the third screen includes a fourth control and a fifth control, the fourth control being at a same position as the first control, and the fifth control being at a same position as the second control; and detecting a fifth operation performed by the user on a sixth control, stopping shooting, by the electronic device, the first video, and displaying a fourth screen, where the fourth screen includes the first control displayed at the first position and the second control displayed at the second position.
[0127] In the foregoing solution, the third control may be a control used to change the shooting mode in the first application. Taking
[0128] The fourth control being at a same position as the first control may mean that the positions of the fourth control and the first control are exactly the same, or that the positions of the fourth control and the first control are close.
[0129] After the first application is switched to video mode, the functions of the fixed shutter 702 and the floating shutter 704 change, and their patterns may also change. The user taps the fixed shutter 702 or the floating shutter 704 to perform the fourth operation, so that the electronic device starts to shoot the first video using the camera. The electronic device displays a screen 900 as shown in
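The control changes around the third, fourth, and fifth operations can be sketched as a small state object; the class name and string labels below are illustrative assumptions, not taken from this application:

```python
# Illustrative sketch of the control changes described above when the first
# application switches from photo mode to video mode and recording starts
# and stops. All names and labels are assumptions.

class CameraControls:
    def __init__(self):
        self.mode = "photo"
        self.fixed_shutter = "shutter"     # first control, at the first position
        self.floating_shutter = "shutter"  # second control, at the second position
        self.extra = None

    def switch_to_video(self):
        # Third operation: the patterns and functions of both shutters change.
        self.mode = "video"
        self.fixed_shutter = "record"
        self.floating_shutter = "record"

    def start_recording(self):
        # Fourth operation: both shutters become pause/stop controls (the
        # fourth and fifth controls, at the same positions), and a
        # shoot-while-recording control is added.
        self.fixed_shutter = "pause_stop"
        self.floating_shutter = "pause_stop"
        self.extra = "shoot_during_recording"

    def stop_recording(self):
        # Fifth operation: recording stops and the original controls return.
        self.fixed_shutter = "record"
        self.floating_shutter = "record"
        self.extra = None


c = CameraControls()
c.switch_to_video()
c.start_recording()
print(c.fixed_shutter, c.floating_shutter, c.extra)
```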
[0130] In an implementation, a difference between the second angle and the first angle is 90°.
[0131] In the foregoing solution, the first angle may be 0°, and the second angle may be 90°.
[0132] In an implementation, after the electronic device displays the second screen in response to the second operation, the foregoing method further includes: detecting a fourth operation performed by the user on the second control, and displaying the second control at a third position in response to the fourth operation, where the third position is different from the first position and the second position.
[0133] The fourth operation may be a sliding operation. As shown in
[0134] In an implementation, after the second control is displayed at the third position in response to the fourth operation, the foregoing method further includes: the first side of the electronic device being at the first angle with respect to the direction of gravity after a fifth operation performed by the user on the electronic device is received; displaying, by the electronic device in response to the fifth operation, a fifth screen, where the fifth screen includes: an image acquired in real time by the camera of the electronic device and the first control displayed at the first position; the first side of the electronic device being at the second angle with respect to the direction of gravity after a sixth operation performed by the user on the electronic device is received; and displaying, by the electronic device in response to the sixth operation, a sixth screen, where the sixth screen includes: an image acquired in real time by the camera of the electronic device and the second control displayed at the third position.
[0135] After the second control is moved to the third position, the electronic device may store the third position. When the electronic device is rotated back to the second angle after being rotated to another angle, the second control may be displayed at the recorded third position.
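Remembering the dragged position per rotation angle, as described above, can be sketched as follows; the class name and default position are illustrative assumptions:

```python
# Illustrative sketch: remember the user-chosen ("third") position for each
# rotation angle so the floating shutter reappears where the user left it.
# Names and the default position are assumptions.

class FloatingShutterMemory:
    def __init__(self, default=(0.85, 0.5)):
        self.default = default
        self.saved = {}  # rotation angle -> last user-chosen position

    def remember(self, angle, position):
        self.saved[angle] = position

    def position_for(self, angle):
        return self.saved.get(angle, self.default)


mem = FloatingShutterMemory()
mem.remember(90, (0.3, 0.9))    # user drags the shutter while at the 90-degree layout
print(mem.position_for(180))    # other angles still use the default position
print(mem.position_for(90))     # the 90-degree layout restores the remembered position
```

The same store could be persisted so that the position survives exiting and re-entering the application, as the following paragraphs describe.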
[0136] In an implementation, after the second control is displayed at the third position in response to the fourth operation, the electronic device displays a seventh screen in response to a seventh operation performed by the user, where the seventh screen does not pertain to the first application. The electronic device displays an eighth screen of the first application in response to an eighth operation by the user, where the eighth screen includes: an image acquired in real time by the camera of the electronic device and the first control displayed at the first position, where the first side of the electronic device is at the first angle with respect to the direction of gravity. The electronic device displays a ninth screen in response to a ninth operation by the user, where the ninth screen includes: an image acquired in real time by the camera of the electronic device and the second control displayed at the third position.
[0137] The seventh operation may be an operation for exiting the first application. After the second control is moved to the third position, the electronic device may store the third position. After the electronic device exits the first application and then enters it again, when the electronic device is rotated to the second angle, the second control is displayed at the stored third position.
[0138] In an implementation, the foregoing method further includes: displaying a tenth screen in response to a tenth operation on the first screen, where the tenth screen includes a first option and a second option, and the first option is selected; selecting the second option in response to an eleventh operation on the tenth screen; displaying an eleventh screen of the first application, where the eleventh screen includes: an image acquired in real time by the camera of the electronic device, the first control displayed at the first position, and the second control displayed at the fourth position, where the first side of the electronic device is at the first angle with respect to the direction of gravity; and the first side of the electronic device being at the second angle with respect to the direction of gravity after a twelfth operation performed by the user on the electronic device is received, and displaying, by the electronic device, a twelfth screen, where the twelfth screen includes: an image acquired in real time by the camera of the electronic device, the first control displayed at the first position, and the second control displayed at the fourth position.
[0139] With reference to
[0140] When the floating shutter option is set to Standard, as shown in
[0141] In an implementation, the electronic device is an electronic device having a foldable display, and the electronic device having a foldable display is in an unfolded state. The foregoing method further includes: changing the electronic device from the unfolded state to a folded state after a thirteenth operation performed by the user on the electronic device is received; displaying a thirteenth screen of the first application, where the thirteenth screen includes: an image acquired in real time by the camera of the electronic device and the first control displayed at the first position, where the first side of the electronic device is at the first angle with respect to the direction of gravity; and after a fourteenth operation performed by the user on the electronic device is received, the first side of the electronic device being at the second angle with respect to the direction of gravity, and displaying, by the electronic device, a fourteenth screen, where the fourteenth screen includes: an image acquired in real time by the camera of the electronic device and the first control displayed at the first position.
[0142] The thirteenth operation may be an operation of folding the unfolded electronic device having a foldable display. After the user performs the thirteenth operation on the electronic device, the Smart option becomes unavailable, that is, the electronic device no longer displays the second control. As shown in
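The folded-state rule described above, together with the Smart and Standard options, can be sketched as a single decision function; the option names are taken from the description, while the function name, parameters, and the assumption that the Standard option shows the floating shutter at every angle are illustrative:

```python
# Illustrative decision sketch: whether the second (floating) control is
# displayed, given the selected option, the fold state, and the layout angle.
# Function name and parameters are assumptions for illustration.

def shows_floating_shutter(option, folded, angle):
    if option == "Smart":
        # When the device is folded, the Smart option becomes unavailable,
        # so rotation no longer produces the floating shutter.
        return (not folded) and angle in (90, 180, 270)
    if option == "Standard":
        # Assumed reading of the Standard option: the floating shutter is
        # displayed at both the first and the second angle.
        return True
    return False  # floating shutter disabled


print(shows_floating_shutter("Smart", folded=False, angle=90))
print(shows_floating_shutter("Smart", folded=True, angle=90))
```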
[0143] In an implementation, the second control may have a same pattern as the first control.
[0144] In an implementation, the second control may be smaller than the first control.
[0145] In an implementation, the second control may have a transparency greater than 0.
[0146] In an implementation, the foregoing method further includes: displaying, by the second control and the first control, a same motion effect after the first operation performed by the user on the first control is detected; and displaying, by the second control and the first control, a same motion effect after the third operation performed by the user on the second control is detected.
[0147] Taking
[0148] An embodiment of this application further provides an electronic device, including a memory configured to store computer program instructions and a processor configured to execute the program instructions, where when the computer program instructions are executed by the processor, the electronic device is triggered to perform the foregoing related method steps to implement the method in the foregoing embodiments.
[0149] An embodiment of this application further provides a computer-readable storage medium. The computer-readable storage medium stores computer instructions. When the computer instructions are run on a terminal device, the terminal device is enabled to perform the foregoing related method steps to implement the method in the foregoing embodiments.
[0150] An embodiment of this application further provides a computer program product. When the computer program product is run on a computer, the computer is enabled to perform the foregoing related steps to implement the method in the foregoing embodiments.
[0151] In addition, an embodiment of this application further provides an apparatus. The apparatus may be specifically a chip, an assembly, or a module. The apparatus may include a processor and a memory that are connected to each other. The memory is configured to store computer-executable instructions. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the method in the foregoing method embodiments.
[0152] The terminal device, the computer storage medium, the computer program product, or the chip provided in the embodiments of this application is configured to perform the corresponding method provided above. Therefore, for beneficial effects that can be achieved by the electronic device, the computer storage medium, the computer program product, or the chip, refer to the beneficial effects in the corresponding method provided above. Details are not described herein.
[0153] From the descriptions of the foregoing implementations, a person skilled in the art may realize that, for ease and brevity of description, only division into the foregoing function modules is used as an example for description; in actual application, the foregoing functions may be allocated, depending on a requirement, to different function modules for implementation, that is, an internal structure of the apparatus is divided into different function modules to implement all or some of the functions described above.
[0154] In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the described apparatus embodiment is merely an example. For example, division into the modules or units is merely a logical function division, and another division manner may be used during actual implementation. For example, a plurality of units or components may be combined, or may be integrated into another apparatus, or some features may be discarded or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented by using some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
[0155] The units described as separate parts may or may not be physically separated, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of different places. Some or all of the units may be selected based on an actual requirement, so as to achieve the objectives of the solutions in the embodiments.
[0156] In addition, function units in the embodiments of this application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in a form of hardware, or may be implemented in a form of a software function unit.
[0157] If the integrated unit is implemented in the form of a software function unit and is sold or used as an independent product, the integrated unit may be stored in a readable storage medium. Based on such an understanding, the technical solutions in the embodiments of this application essentially, or the part contributing to the prior art, or all or some of the technical solutions, may be implemented in a form of a software product. The software product is stored in a storage medium, and includes several instructions for instructing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to perform all or some of the steps of the method described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (read-only memory, ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disc.
[0158] The foregoing content is merely specific implementations of this application, but is not intended to limit the protection scope of this application. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.