US20110310010A1 - Gesture based user interface - Google Patents

Gesture based user interface

Info

Publication number
US20110310010A1
Authority
US
United States
Prior art keywords
sub
region
cursor
movements
regions
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/161,508
Inventor
Amir Hoffnung
Micha Galor
Jonathan Pokrass
Roee Shenberg
Shlomo Zippel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
PrimeSense Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by PrimeSense Ltd.
Priority to US13/161,508 (published as US20110310010A1).
Assigned to PrimeSense Ltd. Assignment of assignors' interest (see document for details). Assignors: Shenberg, Roee; Galor, Micha; Hoffnung, Amir; Pokrass, Jonathan; Zippel, Shlomo.
Publication of US20110310010A1.
Assigned to Apple Inc. Assignment of assignors' interest (see document for details). Assignor: PrimeSense Ltd.
Assigned to Apple Inc. Corrective assignment to correct the application # 13840451 and replace it with correct application # 13810451, previously recorded on reel 034293, frame 0092. Assignor: PrimeSense Ltd.
Priority to US15/434,081 (US10429937B2).
Priority to US16/550,423 (US10928921B2).

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 - Indexing scheme relating to G06F3/048
    • G06F2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 - Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Definitions

  • In some embodiments of the invention, movement identifier 102 comprises a glove, such as that described in U.S. Pat. No. 4,988,981, the disclosure of which is incorporated herein by reference.
  • Alternatively or additionally, movement identifier 102 operates based on capacitance coupling and/or on the user's hand holding or wearing a signal transmitter, such as an infrared emitting module. Movement identifier 102 may also comprise other devices.
  • Controlled application 108 may run on processor 104 together with a control application which performs the control tasks described herein, or may run on a separate processor. Such a separate processor may be included with processor 104 in the same housing, or may be a completely external unit, connected to processor 104 through a cable and/or wirelessly.
  • Controlled application 108 may include, for example, a television set, a set-top box, a VCR, a DVD player, a computer game, a console or other devices, such as home appliances and kitchen appliances.
  • Screen 106 is optionally shared by processor 104 and controlled application 108, although in some embodiments separate screens may be used.
  • FIG. 2 is a schematic illustration of a horizontal bar user interface 200 displayed on screen 106, in accordance with an embodiment of the present invention.
  • User interface 200 includes a controlled cursor 202, which is constrained to move along a horizontal bar 204, following left and right hand gestures.
  • Horizontal bar 204 is divided into a plurality of sub-regions 206A, 206B, 206C, 206D, 208A and 208B corresponding to respective controls.
  • One or more of the sub-regions is a quick-access sub-region 208 (marked 208A and 208B), for which the entrance of controlled cursor 202 into the sub-region causes processor 104 to actuate the corresponding command without requiring additional input from the user.
  • One or more other sub-regions 206 are regular sub-regions, in which the presence of controlled cursor 202 does not, on its own, cause processor 104 to actuate the corresponding control.
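  • To make the two actuation modes concrete, here is a minimal Python sketch of such a bar (all type, function and command names are illustrative assumptions; the patent specifies behavior, not code). Quick-access sub-regions fire on mere cursor entry, while regular sub-regions wait for a separate gesture event:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class SubRegion:
    left: float                  # left edge, in bar coordinates (0.0 to 1.0)
    right: float                 # right edge
    command: Callable[[], None]  # control command associated with the sub-region
    quick_access: bool = False   # actuated on mere entry when True

class HorizontalBar:
    """Horizontal bar 204 modeled as a list of sub-regions; the cursor
    position is a float in [0.0, 1.0]."""

    def __init__(self, sub_regions: List[SubRegion]):
        self.sub_regions = sub_regions
        self.current: Optional[SubRegion] = None

    def locate(self, x: float) -> Optional[SubRegion]:
        for r in self.sub_regions:
            if r.left <= x < r.right:
                return r
        return None

    def on_cursor_moved(self, x: float) -> None:
        region = self.locate(x)
        if region is not self.current:
            self.current = region
            # A quick-access sub-region fires as soon as the cursor enters it.
            if region is not None and region.quick_access:
                region.command()

    def on_gesture(self) -> None:
        # A regular sub-region fires only on an explicit further gesture,
        # e.g. an upward hand movement reported by the movement identifier.
        if self.current is not None and not self.current.quick_access:
            self.current.command()

# Example: quick-access sub-regions on the edges, regular ones between them.
bar = HorizontalBar([
    SubRegion(0.00, 0.15, lambda: print("scroll left"), quick_access=True),
    SubRegion(0.15, 0.50, lambda: print("play/pause")),
    SubRegion(0.50, 0.85, lambda: print("menu")),
    SubRegion(0.85, 1.00, lambda: print("scroll right"), quick_access=True),
])
bar.on_cursor_moved(0.90)   # entering the right edge actuates immediately
bar.on_cursor_moved(0.30)   # entering a regular sub-region does nothing yet
bar.on_gesture()            # ...until the upward gesture arrives
```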
  • Sub-regions 206 may all be of the same size, or different sub-regions 206 may have different sizes, for example according to the relative importance of their corresponding command.
  • quick-access sub-regions 208 may all be the same size or different quick-access sub-regions 208 may have different sizes according to their corresponding controls. For example, popular commands may correspond to larger sub-regions and/or controls whose commands have a larger span of values may correspond to larger sub-regions.
  • In some embodiments, quick-access sub-regions 208 are on the edges of the predetermined region. This allows the user to quickly move cursor 202 into a quick-access sub-region 208 without worrying about overshooting the sub-region.
  • Alternatively or additionally, one or more quick-access sub-regions 208 are not on the edges of the horizontal bar, thus allowing a larger number of quick-access sub-regions 208.
  • Such quick-access sub-regions 208 which are not on edges of the horizontal bar are referred to herein as hover sub-regions.
  • Optionally, hover sub-regions are located adjacent to edge-located quick-access sub-regions having a similar associated command, such as controlling the same parameter but to a different extent (e.g., sub-region 206D is a hover sub-region having a command similar to that of sub-region 208B).
  • For example, quick-access sub-regions 208 may be used to control scrolling of items in a menu, with hover sub-regions and adjacent edge-located sub-regions differing in the scrolling speed.
  • In some embodiments, a plurality of adjacent sub-regions are associated with different values of a single parameter.
  • Optionally, each regular sub-region 206 and/or each quick-access sub-region 208 is associated with a respective symbol 216 (marked 216A, 216B, 216C and 216D) or 218 (marked 218A and 218B), which indicates the command associated with the sub-region.
  • When cursor 202 is in a sub-region, the corresponding symbol 216 or 218 is modified to indicate the presence of cursor 202 in the respective sub-region.
  • the modification of symbol 216 or 218 includes, for example, enlargement of the symbol and/or increasing of the brightness of the symbol. Other modifications may be used additionally or alternatively, such as color changes, rotation and/or changes in the shape or texture of the symbol itself.
  • the entrance of cursor 202 into a different sub-region is indicated by an audio sound, for example a quiet tick sound.
  • Different sounds may be used for the entrance into different sub-regions or for the entrance into different types of sub-regions (e.g., 206 vs. 208), or the same sound may be used for all sub-regions.
  • sounds are used only to indicate entrance into regular sub-regions 206 .
  • Optionally, when cursor 202 enters a sub-region, the size of the sub-region is enlarged, such that leaving the sub-region requires a larger movement than entering it. This provides a hysteresis effect and prevents flickering between sub-regions due to small user movements.
  • the entered sub-region 206 may be enlarged only in the direction closest to the current location of cursor 202 or may be enlarged in all directions for symmetry.
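  • One way to realize this hysteresis, sketched below as an extension of the HorizontalBar sketch above (the margin value is an illustrative assumption), is to treat the edges of the currently occupied sub-region as enlarged, so that leaving it takes a larger movement than entering it:

```python
HYSTERESIS = 0.03  # extra extent, in bar coordinates, required to leave a sub-region

def locate_with_hysteresis(bar: "HorizontalBar", x: float):
    """Resolve cursor position x to a sub-region, keeping the current
    sub-region 'sticky': its edges are treated as enlarged by HYSTERESIS,
    so small jitter around a boundary does not flip the selection."""
    cur = bar.current
    if cur is not None and (cur.left - HYSTERESIS) <= x < (cur.right + HYSTERESIS):
        return cur                 # still inside the enlarged current sub-region
    return bar.locate(x)           # otherwise fall back to the plain lookup
```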
  • When a command corresponding to a sub-region 206 is actuated, its corresponding symbol 216 is modified to so indicate, for example by increasing its brightness and/or changing its color in a manner different from that used to indicate that cursor 202 is in the corresponding sub-region 206.
  • Optionally, a distinct audio signal indicates command actuation; the same audio signal may be used for all command actuations, or different sounds may be used for different commands.
  • Optionally, the audio signal indicating command actuation is different from the audio signal indicating entrance into a sub-region 206.
  • Optionally, horizontal bar 204 corresponds to a sufficiently large extent of horizontal hand movement, so that small inadvertent movements are not interpreted as user commands to move between sub-regions 206 or to enter a quick-access sub-region 208.
  • The extent of horizontal movement is optionally greater than 10 centimeters, greater than 15 centimeters or even greater than 20 centimeters.
  • On the other hand, the extent of horizontal movement corresponding to the extent of horizontal bar 204 is not too large, so as not to require uncomfortably large movements from the user in order to move between sub-regions.
  • Optionally, the extent of horizontal movement is smaller than 35 centimeters, smaller than 30 centimeters or even smaller than 25 centimeters. In one particular embodiment the horizontal extent is 24 centimeters.
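  • As an illustration of this mapping, the sketch below converts an absolute horizontal hand position into a normalized bar coordinate, using the 24-centimeter extent of the particular embodiment mentioned above (the function and parameter names are assumptions):

```python
BAR_SPAN_CM = 24.0  # horizontal hand-movement extent mapped onto the full bar

def hand_to_bar(hand_x_cm: float, origin_cm: float) -> float:
    """Map an absolute horizontal hand position (in centimeters, e.g. from a
    depth mapper) to a bar coordinate in [0.0, 1.0]; origin_cm is the hand
    position corresponding to the left edge of horizontal bar 204."""
    x = (hand_x_cm - origin_cm) / BAR_SPAN_CM
    return max(0.0, min(1.0, x))
```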
  • Optionally, sub-regions 206 are adjacent to each other, without gaps between them.
  • Alternatively, the horizontal bar includes gaps between sub-regions. In the gaps, upward user movements are ignored, as the gaps are not associated with a command.
  • In some embodiments, downward movements are ignored and are not used for commands, as downward movements are commonly performed inadvertently.
  • Optionally, movements along the depth axis are also ignored.
  • Alternatively, some downward movements having specific characteristics may be interpreted as commands.
  • For example, a substantial downward movement may be interpreted as an instruction to discontinue control of controlled application 108 based on the user's movements, allowing the user to move freely without causing unwanted control operations.
  • Optionally, a predetermined user movement is used to re-establish the control.
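  • A minimal sketch of such filtering (threshold and names are illustrative assumptions): downward and depth components of the hand displacement are discarded, and a substantial drop suspends control until a re-engagement movement is identified:

```python
DISENGAGE_DROP_CM = 20.0  # illustrative threshold for a "substantial" downward movement

class MovementFilter:
    def __init__(self):
        self.engaged = True

    def horizontal_delta(self, dx_cm: float, dy_cm: float, dz_cm: float) -> float:
        """Reduce a raw 3-D hand displacement to a cursor displacement.
        Downward (negative dy) and depth (dz) components are ignored; a
        large drop of the hand suspends control entirely."""
        if dy_cm < -DISENGAGE_DROP_CM:
            self.engaged = False   # user dropped the hand: stop controlling
        return dx_cm if self.engaged else 0.0

    def reengage(self) -> None:
        # Called when the predetermined re-engagement movement is identified.
        self.engaged = True
```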
  • Optionally, upward hand gestures are considered as commands only if provided within a predetermined time from entering a sub-region 206.
  • In this way, the chances of inadvertent gestures being interpreted as commands are decreased.
  • Optionally, after a period of non-use, cursor 202 is moved to a default location, for example the center of horizontal bar 204, so as to reduce the chances of an inadvertent move into a quick-access sub-region 208.
  • Alternatively, cursor 202 is not moved after a period of non-use, allowing a user to leave cursor 202 near a quick-access sub-region 208 and quickly enter the sub-region when desired.
  • Actuation of the command corresponding to a regular sub-region 206 is optionally performed responsive to a specific user hand movement, for example an upward movement of at least a predetermined extent, for example at least 0.5 centimeter, at least 2 centimeters, at least 5 centimeters or even at least 8 centimeters. Requiring at least a minimal extent of the upward movement reduces the possibility that an inadvertent movement is interpreted as a user instruction.
  • Optionally, the extent of the hand movement (e.g., upward movement) considered by processor 104 as a user instruction is user-adjustable.
  • Alternatively or additionally, processor 104 automatically adjusts the extent of hand movement considered as a user command by tracking instructions which are cancelled by the user shortly after they are given. For example, processor 104 may keep track of the extent of upward hand movements of user commands and the corresponding time passing between when the instruction is given and when it is cancelled. If it is determined that a large percentage of commands given by relatively short upward hand gestures are cancelled within a short period, the threshold for interpreting upward gestures as commands may be increased.
  • Processor 104 may also monitor the number of occurrences of relatively short upward movements which were ignored and were then followed by a larger upward movement interpreted as a user command, in the same sub-region.
  • the threshold of the size of a movement may be lowered if at least a predetermined number (e.g., at least 1, at least 3, at least 5) of such occurrences are identified within a predetermined period (e.g., 10 minutes, an hour).
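  • The following sketch shows one way such self-tuning could work (all constants and names are illustrative assumptions; the patent does not specify an algorithm): the activation threshold is raised when short gestures produce promptly cancelled commands, and lowered when ignored short gestures are repeatedly followed by larger accepted ones:

```python
class AdaptiveGestureThreshold:
    """Self-tuning threshold for the upward activation gesture."""

    def __init__(self, threshold_cm: float = 2.0):
        self.threshold_cm = threshold_cm
        self.quick_cancels = 0    # commands from short gestures, cancelled soon after
        self.retried_shorts = 0   # ignored short gestures followed by a larger one

    def on_command_cancelled(self, extent_cm: float, seconds_to_cancel: float) -> None:
        # Short gestures whose commands are promptly cancelled suggest the
        # threshold is too low, letting inadvertent movements through.
        if extent_cm < 1.5 * self.threshold_cm and seconds_to_cancel < 5.0:
            self.quick_cancels += 1
            if self.quick_cancels >= 5:
                self.threshold_cm *= 1.25
                self.quick_cancels = 0

    def on_ignored_then_repeated(self) -> None:
        # A sub-threshold gesture followed by a larger accepted one in the
        # same sub-region within a predetermined period suggests the
        # threshold is too high.
        self.retried_shorts += 1
        if self.retried_shorts >= 3:
            self.threshold_cm *= 0.8
            self.retried_shorts = 0
```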
  • FIG. 3 is a schematic illustration of a displayed two-dimensional user interface 300, in accordance with an embodiment of the invention.
  • In user interface 300, sub-regions 306 and 308 are distributed in a two-dimensional array 304, and a cursor 302 is moved around array 304 to select a sub-region 306 or 308.
  • Sub-regions 308 on the periphery of array 304 are optionally quick-access sub-regions, for which entrance of cursor 302 causes actuation of their respective command without requiring further user input.
  • Sub-regions 306 are regular sub-regions which require additional user input, such as a push, pull, circular or swerve gesture, to actuate their corresponding command.
  • Other gestures may also be used to actuate the command of a sub-region 306, such as quick left-right flicks.
  • Alternatively or additionally, the command of a sub-region 306 is actuated if cursor 302 is in the sub-region for longer than a predetermined period.
  • The predetermined period is optionally at least 3 seconds, at least 5 seconds or even at least 10 seconds.
  • Optionally, quick-access sub-regions 308 are located on the upper area of array 304, such that downward movements, which are more commonly performed inadvertently, do not cause cursor 302 to move into a quick-access sub-region 308.
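  • A dwell-based activator of this kind could look like the sketch below (reusing the SubRegion shape from the earlier sketch; the period is illustrative):

```python
import time

DWELL_SECONDS = 3.0  # the patent suggests at least 3, 5 or even 10 seconds

class DwellActivator:
    def __init__(self):
        self.region = None
        self.entered_at = 0.0
        self.fired = False

    def update(self, region) -> None:
        """Call periodically with the sub-region currently under the cursor;
        fires the region's command once the cursor has stayed in the same
        regular sub-region for longer than DWELL_SECONDS."""
        now = time.monotonic()
        if region is not self.region:
            self.region, self.entered_at, self.fired = region, now, False
        elif (region is not None and not self.fired
              and not region.quick_access
              and now - self.entered_at > DWELL_SECONDS):
            region.command()
            self.fired = True
```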
  • In some embodiments, the commands corresponding to regular sub-regions 206 are toggling commands which simply require a user indication that the command is to be performed.
  • The toggling commands may include, for example, on/off, select, menu display and/or increase of a variable (e.g., volume) by a predetermined value.
  • Alternatively, for sub-regions 206 corresponding to multi-value variables, a separate horizontal bar for receiving a desired value of the variable is displayed responsive to a user's upward hand gesture in the corresponding sub-region 206.
  • Optionally, the upward hand gesture controls the variable value according to its extent above a minimal threshold.
  • In some embodiments, the variable value is controlled starting from its current value.
  • Alternatively, each time the variable is controlled, the variable value begins from a minimal value at the point at which the upward gesture passes the threshold.
  • Optionally, two separate sub-regions 206 are assigned to multi-value variables, one for increasing the value of the variable and the other for decreasing it.
  • Optionally, a visual vertical bar is displayed near the symbol of the current sub-region 206, indicating the extent of upward movement of the hand gesture.
  • The bar may indicate the extent required in order to reach the minimal threshold for the gesture to be considered a user indication and/or may indicate the extent beyond the threshold, for cases in which the command involves a range of values.
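  • As a worked example of controlling a variable from the extent of the upward gesture (threshold and step size are illustrative assumptions, not values from the patent):

```python
def gesture_to_value(extent_cm: float, current: int,
                     threshold_cm: float = 2.0, cm_per_step: float = 0.5,
                     lo: int = 0, hi: int = 100) -> int:
    """Map the extent of an upward gesture to a variable value (e.g. volume).
    Movement up to threshold_cm is ignored; beyond it, every cm_per_step of
    additional lift adds one step, starting from the current value (one of
    the embodiments described above)."""
    if extent_cm <= threshold_cm:
        return current
    steps = int((extent_cm - threshold_cm) / cm_per_step)
    return max(lo, min(hi, current + steps))

# Example: a 4.5 cm lift with a 2 cm threshold and 0.5 cm steps adds 5 steps.
assert gesture_to_value(4.5, current=20) == 25
```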
  • In some embodiments, quick-access sub-regions 208 and/or 308 are used for toggling commands.
  • Alternatively or additionally, one or more of the quick-access sub-regions 308 are used for controlling a parameter value on a multi-value scale, such as for scrolling, channel control or volume control.
  • Optionally, each time cursor 202 or 302 enters the quick-access sub-region 208 or 308, the value of the parameter is increased by a predetermined amount.
  • Alternatively or additionally, the time for which cursor 202 remains in quick-access sub-region 208 defines the extent of change of the parameter value.
  • Optionally, two quick-access sub-regions 208 or 308 are used for each parameter, one for increasing the parameter and the other for decreasing it.
  • Alternatively or additionally, one or more quick-access sub-regions 208 or 308 are used for controls requiring indication of an extent.
  • Optionally, one or more of quick-access sub-regions 208 or 308 is divided into a plurality of zones which are associated with different values or with different extents of value change.
  • In such cases, the value control optionally depends only on horizontal movements in the sub-region 308, ignoring vertical movement components.
  • Sub-regions 308 may have various shapes, including, as shown, rectangular and L-shaped sub-regions. Other shapes, including triangular and round shapes, may be used.
  • Optionally, any of the control methods described above for sub-regions 206 may be used for sub-regions 208, including opening a separate horizontal or vertical bar.
  • In some embodiments, quick-access sub-regions 208 and 308 are assigned to commands that are used more frequently than other commands and/or to commands that should be available at all times.
  • For example, quick-access regions may be used for "mute" or "pause" in a video/audio control system and/or for a home command, for returning to a home state of the user interface.
  • The commands assigned to quick-access sub-regions 208 or 308 are optionally commands which are easily reversible by the user. Alternatively or additionally, the commands assigned to quick-access sub-regions 208 or 308 do not directly control an operation of controlled application 108, but rather perform adjustments or selections which require other commands to cause operations of controlled application 108. In other embodiments, the commands assigned to quick-access sub-regions 208 or 308 do not control acts of controlled application 108 which interact with the user, such as changing the volume and/or beginning and/or changing a viewed program.
  • FIG. 4 is a schematic illustration of a joystick user interface 400, in accordance with another embodiment of the invention.
  • Interface 400 includes a central sub-region 406 serving as a joystick button and four directional triangles 412 formed of pairs of quick-access sub-regions 410 and 408.
  • Sub-regions 410 are hover sub-regions associated with slow movement in a specific direction, and adjacent sub-regions 408 are associated with movement in the same direction, but at a faster speed.
  • Optionally, triangles 412 may each include more than two sub-regions, possibly 3, 4 or even more, allowing the user more flexibility in choosing the speed.
  • Optionally, central sub-region 406 may be divided into a plurality of sub-regions corresponding to different button commands.
  • In some embodiments, a horizontal bar 420 including a plurality of sub-regions, such as one similar to user interface 200, is located within central sub-region 406.
  • Horizontal bar 420 may be displayed continuously, or may be displayed responsive to a user command given when cursor 402 is within central sub-region 406.
  • Optionally, when usage of horizontal bar 420 is invoked, for example by hovering cursor 402 over its location and providing a unique user movement, such as a hand twist or a movement along the depth axis, the movement of cursor 402 is confined to horizontal bar 420.
  • Optionally, one of the sub-regions of horizontal bar 420 is assigned to a command that allows cursor 402 to leave the horizontal bar.
  • Alternatively, a unique user movement may be used to indicate leaving horizontal bar 420 from any point therein.
  • Interfaces 200 and 300 may include control of all commands that are available to a user. Alternatively, one or more of the sub-regions may be assigned to a command of entering a sub-interface and/or of returning back to a parent or main interface.
  • Optionally, when a sub-region is assigned to entering a sub-menu, the sub-menu includes a sub-region assigned to a command of returning to the main menu.
  • the “return” command is optionally achieved by an upward gesture, like the other commands, so that a downward gesture for the “return” command is not required.
  • processor 104 automatically returns to the main menu after a predetermined time and/or after a predetermined period of inaction.
  • downward gestures are allowed in sub-menus, while being ignored in the main menu. Accordingly, downward gestures may be used to return to the main menu.
  • quick-access sub-regions 208 or 308 are assigned to a play/pause command and to a volume control, while regular sub-regions 206 or 306 are assigned to menu selections for movement to other interfaces.
  • one sub-region 206 may be assigned to entrance into a sub-interface for controlling display speed (e.g., fast-forward and rewind) and another may be assigned to entrance into a movie selection interface.
  • quick access regions are assigned to a command for opening a frequently used menu and/or to a command for returning to a parent or main menu.
  • quick-access sub-regions 208 are assigned to rewind and fast-forward controls and regular sub-regions 206 are assigned to play, volume control and program selection commands.
  • most or all of the regular sub-regions 206 or 306 are assigned to menu choices, with each sub-region assigned to an item (e.g., a movie in a movie selection menu).
  • Quick-access sub-regions 208 or 308 are optionally assigned to scrolling commands which present additional sub-regions corresponding to items, particularly when the number of selectable items is larger than the number of sub-regions presented concurrently on the interface.
  • a single-dimension menu interface includes three sub-regions: a middle sub-region 206 corresponding to a selection command of a current item in the menu and side sub-regions 208 for left and right scrolling in the menu. If desired, more selection sub-regions 206 may be included, to allow faster scrolling through the items of the menu.
  • Optionally, one or more of the sub-regions 206, 306 is associated with different commands depending on attributes of the user movement. For example, a first action may be performed if the user raises a single finger, while a different command is performed if the user raises two fingers.
  • FIG. 5 is a schematic illustration of a display 500 with a horizontal bar user interface 502, in accordance with an embodiment of the invention.
  • In some embodiments, the movements of a cursor 504 are confined, in one or more interface states, to user interface 502, making it simple for the user to identify where the cursor is located.
  • Optionally, user interface 502 covers less than 20%, less than 10%, less than 5% or even less than 2% of the area of display 500.
  • User interface 502 may be a horizontal bar as shown in FIG. 2, a two dimensional array as shown in FIG. 3, or may have any other suitable shape. It is noted that the system may include one or more other states in which cursor 504 is allowed to move over larger parts of display 500, possibly even over all of the area of display 500.
  • Display 500 may be used, for example, to show a movie, with interface 502 used to control fast-forward/rewind, volume, play/pause and/or other commands relevant to viewing a movie. In one or more other states, part or all of display 500 is used for presenting options for selection, e.g., movies.
  • Interface 502 may be located at the bottom of display 500, as shown, or in other locations, such as the top, right, left or middle.
  • FIG. 6 is a schematic illustration of a display 600 with a user interface 602, in accordance with still another embodiment of the invention.
  • User interface 602 is located on the upper outskirts of display 600, with the central part 604 of display 600 used for other purposes, such as displaying a movie and/or selection options.
  • Central part 604, used for general display, is located between a right arm 606 and a left arm 608 of interface 602.
  • A cursor 612 of interface 602 is confined to movement within interface 602, without entering central part 604.
  • Optionally, movements throughout interface 602, including arms 606 and 608, are performed responsive to right-left movements of the user, and downward movements are ignored.
  • Alternatively, movements in arms 606 and 608 are performed based on upward and downward movements of the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A gesture based user interface includes a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand. A processor is configured to provide at least one interface state in which a cursor is confined to movement within a single dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single dimension region.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Patent Application 61/355,574, filed Jun. 17, 2010, which is incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The present invention relates generally to user interfaces, and specifically to gesture based user interfaces.
  • BACKGROUND OF THE INVENTION
  • Gesture based user interfaces allow users to control electronic devices and/or provide user input by hand gestures. Various systems have been described for identifying the hand gestures.
  • U.S. Pat. No. 4,988,981 to Zimmerman et al., titled: “Computer Data Entry and Manipulation Apparatus and Method”, describes a glove worn on a user's hand which is used in generating control signals for the manipulation of virtual objects.
  • U.S. Pat. No. 4,550,250 to Mueller et al., titled: “Cordless Digital Graphics Input Device”, describes a cordless graphics input device based on an infrared emitting module.
  • PCT publication WO2007/043036 to Zalevsky et al. describes identifying hand gestures using a coherent light source and a generator of a random speckle pattern.
  • Systems based on identifying hand gestures allow a wide range of inputs and can be used, for example, for text entry and for three dimensional control of animation in real time, such as in a virtual reality program running on a computer, as described in U.S. Pat. No. 6,452,584 to Walker et al., titled: “System for Data Management Based on Hand Gestures”.
  • Hand gestures may be used in simpler environments. US patent application publication 2008/0256494 to Greenfield describes using hand gestures to control the flow and temperature of water of a faucet. In one embodiment it is suggested that movements of the hand with one finger held up are interpreted as controlling the water flow, and movements with two fingers held up control the temperature.
  • U.S. Pat. No. 5,549,469 to Freeman et al., titled: “Hand Gesture Machine Control System”, describes a system in which hand gestures are used to move a hand icon over various controls on a screen. Only a single gesture is used to control multiple functions. An additional gesture is used to turn on the system and thus random gestures are prevented from being interpreted as control movements.
  • U.S. Pat. No. 7,821,541 to Delean, titled: “Remote Control Apparatus Using Gesture Recognition”, describes controlling a television system using hand gestures. Up and down movements are interpreted as controlling the volume, and left and right movements are interpreted as controlling the channel. In order to increase the number of commands that may be invoked via hand gestures, a sequence of multiple hand gestures can be interpreted as a single command. In order to avoid interpreting random movements as control instructions, a dormant mode is defined, and the user is required to signal a desire to move to an active mode before providing instructions.
  • SUMMARY OF THE INVENTION
  • Embodiments of the present invention that are described hereinbelow provide systems for control of multiple functions using hand gestures.
  • There is therefore provided in accordance with an embodiment of the present invention a gesture based user interface, comprising a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand; a display; and a processor configured to move a cursor responsive to the signal from the movement monitor, within a predetermined region on the display, formed of a plurality of sub-regions associated with respective control commands, and to provide the commands to a controlled application responsively to the movements of the hand. The sub-regions include at least one quick-access sub-region, for which the processor provides the command associated with the sub-region responsively to entrance of the cursor into the quick-access sub-region without additional user hand movements, and at least one regular sub-region, for which the processor provides the command associated with the sub-region responsively to identifying a predetermined hand gesture while the cursor is in the sub-region.
  • Optionally, the predetermined region comprises a single-dimensional region. Optionally, the predetermined region comprises a horizontal bar. Optionally, the processor is configured to define two quick-access sub-regions, one on each end of the single-dimensional region. Optionally, the predetermined region is a convex region. Optionally, the predetermined region does not include gaps between the sub-regions associated with respective control commands. Optionally, the first sub-regions are on the edges of the region.
  • Optionally, the processor is configured to ignore downward components of movements of the hand. Optionally, the quick-access sub-regions are located within the predetermined region such that downward movements do not lead to a quick-access sub-region.
  • There is further provided in accordance with an embodiment of the present invention a gesture based user interface, comprising a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand; a display; and a processor configured to move a cursor responsive to the signal from the movement monitor, within a predetermined region on the display, formed of a plurality of sub-regions associated with respective control commands, and to provide the commands to a controlled application responsively to the movements of the hand. The processor is configured to ignore downward movements in controlling the cursor.
  • Optionally, the processor is configured to entirely ignore downward movements.
  • There is further provided in accordance with an embodiment of the present invention a method of receiving user input, comprising identifying movements of the hand of a user; moving a cursor within a predetermined region, formed of a plurality of sub-regions associated with respective control commands, by a processor, responsive to the identified movements; and actuating a control command associated with a sub-region in which the cursor is located,
  • wherein the sub-regions include at least one quick-access sub-region for which the associated command is actuated responsive to entrance of the cursor into the quick-access sub-region without requiring additional user hand movements and at least one regular sub-region, for which the associated command is actuated responsive to identifying a predetermined hand gesture while the cursor is in the sub-region.
  • Optionally, the predetermined hand gesture comprises an upward gesture. Optionally, moving the cursor within the predetermined region comprises moving along a single-dimensional region.
  • There is further provided in accordance with an embodiment of the present invention a computer software product, comprising a tangible computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to identify movements of a user; move a cursor within a predetermined region, formed of a plurality of sub-regions associated with respective control commands, responsive to the identified movements; and actuate a control command associated with a sub-region in which the cursor is located.
  • The sub-regions include at least one quick-access sub-region for which the associated command is actuated responsive to entrance of the cursor into the quick-access sub-region without requiring additional user hand movements and at least one regular sub-region, for which the associated command is actuated responsive to identifying a predetermined hand gesture while the cursor is in the sub-region.
  • There is further provided in accordance with an embodiment of the present invention a gesture based user interface, comprising a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand; a display; and a processor configured to provide at least one interface state in which a cursor is confined to movement within a single dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single dimension region.
  • Optionally, the single-dimensional region comprises a horizontal bar in which the cursor is confined only to right and left movements. Optionally, the processor is configured to allow the cursor to exit the single-dimensional region from a predetermined sub-region of the single-dimensional region. Optionally, the single-dimensional region covers less than 10% of the display. Optionally, the single-dimensional region comprises a plurality of adjacent sub-regions associated with different values of a single parameter.
  • The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a control system based on identifying hand gestures, in accordance with an embodiment of the present invention;
  • FIG. 2 is a schematic view of a horizontal bar user interface, in accordance with an embodiment of the present invention;
  • FIG. 3 is a schematic view of a two dimensional user interface, in accordance with an embodiment of the present invention;
  • FIG. 4 is a schematic illustration of a joystick user interface, in accordance with another embodiment of the invention;
  • FIG. 5 is a schematic illustration of a display with a horizontal bar user interface 502, in accordance with an embodiment of the invention; and
  • FIG. 6 is a schematic illustration of a display with a user interface, in accordance with another embodiment of the invention.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Although gesture based user interfaces have made significant advances, there are at least two major issues that still require additional attention. The first is the differentiation between user movements intended to be interpreted as control instructions and random movements that should be ignored. The second is allowing the user a range of commands, without making the interface too complex.
  • An aspect of some embodiments of the invention relates to a user interface based on identification of hand gestures in which a processor controls movement of a cursor (represented by an arrow, hand, or other icon or pointer) responsive to identified hand movements. The movements of the cursor are confined within a predetermined region on a display, formed of sub-regions corresponding to different respective controls. At least one of the sub-regions is a quick-access sub-region, which enables actuation of the respective control by positioning the cursor in the sub-region and/or moving the cursor within the sub-region, without additional user gestures, while other sub-regions require an additional user gesture to actuate the respective control of the sub-region.
  • In some embodiments, the quick-access sub-regions are on the edges of the predetermined region. Optionally, the region comprises a one-dimensional bar, such as a horizontal bar, in which the cursor is restricted to horizontal movements. Alternatively, the region comprises a two dimensional region. Optionally in accordance with this alternative, quick-access sub-regions cannot be accessed from a non-quick-access sub-region by a mere downward movement, such that downward movements (which are often inadvertent movements) are not undesirably interpreted as control commands.
  • The quick-access sub-regions allow fast actuation for their respective control acts, without requiring additional user gestures. Limiting this feature to specific sub-regions prevents control acts from being performed inadvertently.
  • Optionally, the region on the display is a continuous region without gaps between adjacent sub-regions. In some embodiments of the invention, the region is a convex region. Alternatively or additionally, the region covers only a fraction of the screen on which it is displayed, such as less than 30%, less than 20% or even less than 10% of the area of the screen.
  • An aspect of some embodiments of the invention relates to a user control system based on identification of hand gestures, having at least one state in which a processor controls movement of a cursor within a single dimension horizontal bar. The horizontal bar is formed of sub-regions corresponding to different respective controls, for selection of a control to be manipulated. By confining the movements of the cursor to a horizontal bar, it is easier for the user to identify the location of the cursor and to control its movements, at the expense of limiting the number of control icons available for selection.
  • In some embodiments of the invention, in one or more of the sub-regions, a command of the sub-region is invoked by a specific predetermined user movement, such as an upward gesture, a twist gesture and/or a movement of a specific number of fingers. Alternatively or additionally, the command of one or more sub-regions is invoked by the mere entrance into the sub-region. Optionally, the cursor is not moved out of the horizontal bar at all.
  • Alternatively, in one or more sub-regions, the command of the sub-region is invoked by moving a cursor (represented by the same icon or by a different icon) controllably outside the horizontal bar. For example, when invoking a sub-region associated with a multi-value parameter (e.g., volume control), a parameter value control may be displayed, for example in the form of a vertical bar or a separate horizontal bar, and the user moves a cursor within this bar. Optionally, when the user completes adjusting the value of the parameter, a user indication, such as a swivel or no movement for a predetermined period, is interpreted as a return to the horizontal bar.
  • Optionally, the horizontal bar covers only a small part (e.g., less than 20%, less than 10% or even less than 5%) of a general display. In some embodiments, one or more of the sub-regions allows exiting the horizontal bar into the general area of the entire display. In order to exit to the general display, the user moves the cursor to the specific sub-region of the horizontal bar assigned to the exit of the cursor, possibly through a specific gate in the border of the sub-region. Optionally, the return to the horizontal bar is also limited only to the specific sub-region or to the specific gate. Alternatively, the return to the horizontal bar is allowed from any direction, to simplify the return to the horizontal bar.
  • In some embodiments of the invention, the general display is divided into a plurality of sub-areas and the access to each sub-area is from a different sub-region and/or gate of the horizontal bar.
  • An aspect of some embodiments of the invention relates to a user interface based on identification of hand gestures in which a processor controls movement of a cursor within a predetermined region, formed of sub-regions corresponding to different respective controls, in which downward motions are ignored.
  • It is noted that in some embodiments, once a user selects a specific control to be manipulated by further hand gestures, downward gestures may be used for the further control of the specific control. In other embodiments, downward gestures are not used at all. Avoiding the use of downward movements reduces the chances of inadvertent movements being interpreted as requesting desired actions.
  • FIG. 1 is a block diagram of a control system 100 based on identifying hand gestures, in accordance with an embodiment of the present invention. Control system 100 comprises a movement identifier 102 adapted to identify movements of a user's hand and a processor 104 which receives indications of movements from identifier 102 and accordingly adjusts a display on a screen 106 and provides commands to a controlled application 108. Processor 104 typically comprises a general-purpose computer processor, with a suitable memory and control interfaces. The processor is programmed in software to carry out the functions described hereinbelow. This software may be downloaded to processor 104 in electronic form, over a network, for example. Alternatively or additionally, the software may be stored on tangible storage media, including non-volatile storage media, such as optical, magnetic, or electronic memory.
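  • By way of illustration only, the sketch below outlines this control flow in Python: indications from movement identifier 102 drive a processor object that updates screen 106 and forwards commands to controlled application 108. All class, function and field names here are illustrative assumptions, not part of the original disclosure.

```python
# A minimal structural sketch of the control system of FIG. 1.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class HandMovement:
    dx_mm: float  # horizontal displacement (positive = rightward)
    dy_mm: float  # vertical displacement (positive = upward)

class ControlSystem:
    """Glue corresponding to processor 104: receives indications from
    movement identifier 102, updates screen 106, commands application 108."""

    def __init__(self,
                 update_display: Callable[[HandMovement], None],
                 send_command: Callable[[str], None],
                 interpret: Callable[[HandMovement], Optional[str]]):
        self.update_display = update_display  # drives screen 106
        self.send_command = send_command      # drives application 108
        self.interpret = interpret            # embodiment-specific mapping

    def on_movement(self, movement: HandMovement) -> None:
        self.update_display(movement)         # reflect movement on screen
        command = self.interpret(movement)    # gesture -> command, or None
        if command is not None:
            self.send_command(command)        # forward to application 108
```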
  • In some embodiments of the invention, movement identifier 102 comprises an infrared depth mapping system, which may comprise a coherent light source and a generator of a random speckle pattern as described, for example, in PCT publication WO2007/043036 to Zalevsky et al., the disclosure of which is incorporated herein by reference. Another suitable depth mapping system that may be used for this purpose is described in US patent application publication 2010/0007717, whose disclosure is also incorporated herein by reference. This sort of depth map is segmented, and parts of the user's body are identified and tracked in order to identify user gestures, as described, for example, in U.S. patent application Ser. No. 12/854,187, filed Aug. 11, 2010, and in US patent application publication 2011/0052006, whose disclosures are also incorporated herein by reference. Processor 104 computes or receives location coordinates and gesture indications with respect to the user's hand within a certain volume in space, and associates this volume with a region on screen 106, as described further hereinbelow.
  • In other embodiments, movement identifier 102 comprises a glove, such as described in U.S. Pat. No. 4,988,981, the disclosure of which is incorporated herein by reference. In still other embodiments, movement identifier 102 operates based on capacitive coupling and/or on the user's hand holding or wearing a signal transmitter, such as an infrared emitting module. Movement identifier 102 may also comprise other devices.
  • Controlled application 108 may run on processor 104 together with a control application which performs the control tasks described herein, or may run on a separate processor. Such a separate processor may be included with processor 104 in a same housing, or may be a unit entirely external to processor 104, connected to it through a cable and/or wirelessly. Controlled application 108 may be embodied in, for example, a television set, a set-top box, a VCR, a DVD player, a computer game, a console or other devices, such as home appliances and kitchen appliances.
  • Screen 106 is optionally shared by processor 104 and controlled application 108, although in some embodiments, separate screens may be used.
  • FIG. 2 is a schematic illustration of a horizontal bar user interface 200 displayed on screen 106, in accordance with an embodiment of the present invention. User interface 200 includes a controlled cursor 202, which is constrained to move along a horizontal bar 204, following left and right hand gestures. Horizontal bar 204 is divided into a plurality of sub-regions 206A, 206B, 206C, 206D, 208A and 208B corresponding to respective controls. Optionally, one or more of the sub-regions is a quick-access sub-region 208 (marked 208A and 208B), for which the entrance of controlled cursor 202 into the sub-region causes processor 104 to actuate the corresponding command without requiring additional input from the user. In contrast, one or more other sub-regions 206 (marked 206A, 206B, 206C and 206D) are regular sub-regions, in which the presence of controlled cursor 202 does not, on its own, cause processor 104 to actuate the corresponding control command.
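  • The following sketch shows one possible realization of this layout, assuming the bar is parameterized on [0, 1] and that each sub-region carries a command name. The boundary values and command names are illustrative assumptions; only the distinction between quick-access and regular sub-regions follows the description above.

```python
QUICK_ACCESS = "quick"   # actuated on mere entrance (sub-regions 208)
REGULAR = "regular"      # requires an additional gesture (sub-regions 206)

# (start, end, kind, command) -- quick-access sub-regions on both edges
SUB_REGIONS = [
    (0.00, 0.10, QUICK_ACCESS, "scroll_left"),    # 208A
    (0.10, 0.30, REGULAR, "volume"),              # 206A
    (0.30, 0.50, REGULAR, "play"),                # 206B
    (0.50, 0.70, REGULAR, "menu"),                # 206C
    (0.70, 0.90, REGULAR, "program_select"),      # 206D
    (0.90, 1.00, QUICK_ACCESS, "scroll_right"),   # 208B
]

def sub_region_at(x: float):
    """Return the sub-region containing cursor position x in [0, 1]."""
    for region in SUB_REGIONS:
        start, end, kind, command = region
        if start <= x < end:
            return region
    return SUB_REGIONS[-1]  # clamp at the right edge

def on_cursor_moved(x: float, previous_region, actuate):
    """Quick-access commands fire on entrance; regular sub-regions
    wait for a further gesture (e.g., an upward movement)."""
    region = sub_region_at(x)
    if region is not previous_region and region[2] == QUICK_ACCESS:
        actuate(region[3])  # no additional user gesture needed
    return region
```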
  • Sub-regions 206 may all be of the same size, or different sub-regions 206 may have different sizes, for example according to the relative importance of their corresponding command. Similarly, quick-access sub-regions 208 may all be the same size or different quick-access sub-regions 208 may have different sizes according to their corresponding controls. For example, popular commands may correspond to larger sub-regions and/or controls whose commands have a larger span of values may correspond to larger sub-regions.
  • In some embodiments, quick-access sub-regions 208 are on the edges of the predetermined region. This allows the user to move cursor 202 quickly into a quick-access sub-region 208 without worrying about overshooting it. Alternatively or additionally, one or more quick-access sub-regions 208 are not on edges of the horizontal bar, thus allowing a larger number of quick-access sub-regions 208. Such quick-access sub-regions 208, which are not on edges of the horizontal bar, are referred to herein as hover sub-regions. Optionally, hover sub-regions are located adjacent edge-located quick-access sub-regions having a similar associated command, such as controlling the same parameter but to a different extent (e.g., sub-region 206D is a hover sub-region having a similar command as sub-region 208B). Thus, if the user inadvertently reaches one of the sub-regions rather than the other, the same general command is performed. For example, quick-access sub-regions 208 may be used to control scrolling of items in a menu, with hover sub-regions and adjacent edge-located sub-regions differing in the scrolling speed. In some embodiments, a plurality of adjacent sub-regions (e.g., at least 3, at least 5 or even at least 8 adjacent sub-regions) are associated with different values of a single parameter.
  • Display Indications
  • Optionally, each regular sub-region 206 and/or each quick-access sub-region 208 is associated with a respective symbol 216 (marked 216A, 216B, 216C and 216D) or 218 (marked 218A and 218B), which indicates the command associated with the sub-region. In some embodiments of the invention, when controlled cursor 202 enters a sub-region 206 and/or 208, the corresponding symbol 216 or 218 is modified to indicate the presence of cursor 202 in the respective sub-region. The modification of symbol 216 or 218 includes, for example, enlarging the symbol and/or increasing its brightness. Other modifications may be used additionally or alternatively, such as color changes, rotation and/or changes in the shape or texture of the symbol itself. In some embodiments of the invention, the entrance of cursor 202 into a different sub-region is indicated by an audible sound, for example a quiet tick. Different sounds may be used for the entrance into different sub-regions or into different types of sub-regions, e.g., 206 vs. 208, or the same sound may be used for all sub-regions. In one embodiment, sounds are used only to indicate entrance into regular sub-regions 206.
  • In some embodiments of the invention, when cursor 202 enters a sub-region 206 or 208, the size of the sub-region is enlarged, such that leaving the sub-region requires moving a larger extent than entering the sub-region. This provides a hysteresis effect and prevents flickering between sub-regions due to small user movements. The entered sub-region 206 may be enlarged only in the direction closest to the current location of cursor 202 or may be enlarged in all directions for symmetry.
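  • A minimal sketch of this hysteresis behavior follows, building on the sub_region_at helper of the earlier sketch and enlarging the occupied sub-region symmetrically (the all-directions option described above); the margin value is an illustrative assumption.

```python
HYSTERESIS_MARGIN = 0.02  # illustrative: fraction of the bar added per side

def region_with_hysteresis(x: float, current_region):
    """Keep the cursor in its current sub-region until it moves past the
    enlarged bounds, so small jitters do not flicker between sub-regions."""
    if current_region is not None:
        start, end = current_region[0], current_region[1]
        # Leaving the occupied sub-region requires a larger movement
        # than entering it did.
        if start - HYSTERESIS_MARGIN <= x < end + HYSTERESIS_MARGIN:
            return current_region
    # Outside the enlarged bounds: fall back to the plain lookup.
    return sub_region_at(x)
```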
  • Optionally, when a command corresponding to a sub-region 206 is actuated, its corresponding symbol 216 is modified to so indicate, for example by increasing its brightness and/or changing its color in a manner different from that used to indicate that cursor 202 is in the corresponding sub-region 206. Alternatively or additionally, a distinct audio signal indicates command actuation; the same audio signal may be used for all command actuations, or different sounds may be used for different commands. Optionally, the audio signal indicating command actuation is different from the audio signal indicating entrance into a sub-region 206.
  • Horizontal Bar Extent
  • Optionally, horizontal bar 204, regardless of its display length, corresponds to a sufficiently large extent of horizontal movement of the hand, so that small inadvertent movements are not interpreted as user commands to move between sub-regions 206 or to enter a quick-access sub-region 208. For example, the extent of horizontal movement is optionally greater than 10 centimeters, greater than 15 centimeters or even greater than 20 centimeters. On the other hand, the extent of horizontal movement corresponding to the extent of horizontal bar 204 is not too large, so as not to require uncomfortably substantial movements from the user in order to move between sub-regions. Optionally, the extent of horizontal movement is smaller than 35 centimeters, smaller than 30 centimeters or even smaller than 25 centimeters. In one particular embodiment the horizontal extent is 24 centimeters.
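  • The corresponding mapping from physical hand displacement to bar coordinates may be sketched as follows, using the 24-centimeter extent of the particular embodiment above; decoupling this extent from the bar's on-screen length keeps the sensitivity constant regardless of how the bar is drawn.

```python
BAR_EXTENT_MM = 240.0  # hand movement corresponding to the full bar

def update_cursor(x: float, dx_mm: float) -> float:
    """Advance the normalized cursor position x in [0, 1] by dx_mm
    millimeters of hand movement, clamped to the ends of the bar."""
    x += dx_mm / BAR_EXTENT_MM
    return min(max(x, 0.0), 1.0)
```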
  • As shown, in some embodiments, sub-regions 206 are adjacent each other, without gaps between them. In other embodiments, the horizontal bar includes gaps between sub-regions. In the gaps, user upward movements are ignored as the gaps are not associated with a command.
  • Inadvertent Command Avoidance
  • In some embodiments of the invention, downward movements are ignored and are not used for commands, as downward movements are commonly performed inadvertently. Optionally, movements along the depth axis are also ignored. Alternatively, some downward movements having specific characteristics may be interpreted as commands. For example, a substantial downward movement may be interpreted as an instruction to discontinue control of controlled application 108 based on the user's movements, allowing the user to move freely without causing unwanted control operations. A predetermined user movement is then used to re-establish control.
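  • A minimal sketch of this filtering follows, assuming movement indications arrive as (dx, dy, dz) displacements with positive dy pointing upward; the suspension threshold is an illustrative assumption, and the re-establishing movement is not shown.

```python
SUSPEND_THRESHOLD_MM = 150.0  # illustrative: "substantial" downward movement

def filter_movement(dx_mm: float, dy_mm: float, dz_mm: float, state: dict):
    """Return the (dx, dy) components actually used for control."""
    if state.get("suspended"):
        return 0.0, 0.0  # control suspended until a predetermined
                         # re-establishing movement (not shown here)
    if dy_mm < -SUSPEND_THRESHOLD_MM:
        state["suspended"] = True  # user may now move freely
        return 0.0, 0.0
    # Downward components are clipped to zero; the depth axis (dz_mm)
    # is ignored entirely in this embodiment.
    return dx_mm, max(dy_mm, 0.0)
```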
  • Optionally, upward hand gestures are considered as commands only if provided within a predetermined time from entering a sub-region 206. Thus, the chances of inadvertent gestures being interpreted as commands are decreased. Alternatively or additionally, when not moved for at least a predetermined time, cursor 202 is moved to a default location, for example in the center of horizontal bar 204, so as to reduce the chances of an inadvertent move into a quick-access sub-region 208. In other embodiments, cursor 202 is not moved after a period of non-use, allowing a user to leave cursor 202 near a quick-access sub-region 208 and quickly enter the sub-region when desired.
  • Command Actuation
  • Actuation of the command corresponding to a regular sub-region 206 is optionally performed responsive to a specific user hand movement, for example an upward movement of at least a predetermined extent, for example at least 0.5 centimeter, at least 2 centimeters, at least 5 centimeters or even at least 8 centimeters. Requiring at least a minimal extent of the upward movement reduces the possibility that an inadvertent movement is interpreted as a user instruction. In some embodiments of the invention, the extent of the hand movement (e.g., upward movement) considered by processor 104 as a user instruction is user-adjustable.
  • In some embodiments, processor 104 automatically adjusts the extent of hand movements considered as a user command according to tracking of instructions which are cancelled by the user shortly after they are given. For example, processor 104 may keep track of the extent of upward hand movements of user commands and the corresponding time passing between when the instruction is given and when it is cancelled. If it is determined that a large percentage of commands given by relatively short upward hand gestures are cancelled within a short period, a threshold for interpreting upward gestures as commands may be increased.
  • Processor 104 may also monitor the number of occurrences of relatively short upward movements which were ignored and were then followed by a larger upward movement interpreted as a user command, in the same sub-region. The threshold of the size of a movement may be lowered if at least a predetermined number (e.g., at least 1, at least 3, at least 5) of such occurrences are identified within a predetermined period (e.g., 10 minutes, an hour).
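  • Both adjustment directions described in the preceding two paragraphs may be sketched as follows. The scaling factors and the cancellation window are illustrative assumptions; the three-occurrence count and the 10-minute window follow options mentioned above.

```python
import time

class UpwardGestureThreshold:
    """Adaptive threshold for interpreting upward gestures as commands."""

    def __init__(self, threshold_mm: float = 20.0):
        self.threshold_mm = threshold_mm
        self.cancelled_short = 0    # short gestures cancelled soon after
        self.retried_ignored = []   # timestamps of ignored-then-retried

    def on_command_cancelled(self, gesture_mm: float, seconds_to_cancel: float):
        # A short gesture cancelled quickly was likely inadvertent;
        # after several such events, raise the threshold.
        if gesture_mm < 1.5 * self.threshold_mm and seconds_to_cancel < 5.0:
            self.cancelled_short += 1
            if self.cancelled_short >= 3:
                self.threshold_mm *= 1.2
                self.cancelled_short = 0

    def on_ignored_then_retried(self):
        # A too-short gesture was ignored and then repeated with a larger
        # movement in the same sub-region: the threshold may be too strict.
        now = time.monotonic()
        self.retried_ignored = [t for t in self.retried_ignored
                                if now - t < 600.0]  # 10-minute window
        self.retried_ignored.append(now)
        if len(self.retried_ignored) >= 3:
            self.threshold_mm *= 0.85
            self.retried_ignored.clear()
```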
  • Two-Dimensional Embodiment
  • FIG. 3 is a schematic illustration of a displayed two-dimensional user interface 300, in accordance with an embodiment of the invention. In this embodiment, sub-regions 306 and 308 are distributed in a two-dimensional array 304, and a cursor 302 is moved around array 304 to select a sub-region 306 or 308. Sub-regions 308 on the periphery of array 304 are optionally quick-access sub-regions, for which entrance of cursor 302 therein causes actuation of their respective command without requiring further user input. Sub-regions 306, on the other hand, are regular sub-regions which require additional user input, such as a push, pull, circular or swerve gesture, to actuate their corresponding command. Alternatively or additionally, other gestures may be used to actuate the command of a sub-region 306, such as quick left-right flicks. In some embodiments, the command of a sub-region 306 is actuated if cursor 302 is in the sub-region for longer than a predetermined period. The predetermined period is optionally at least 3 seconds, at least 5 seconds or even at least 10 seconds.
  • As shown, quick-access sub-regions 308 are located on the upper area of array 304, such that downward movements, which are more commonly performed inadvertently, do not cause cursor 302 to move into a quick-access sub-region 308.
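  • A minimal sketch of this two-dimensional variant follows, assuming quick-access sub-regions occupy the top row of the array (per the upper-area arrangement just described) and regular sub-regions actuate after a 3-second dwell, the shortest predetermined period mentioned above. The cell indexing and names are illustrative assumptions.

```python
import time

DWELL_PERIOD_S = 3.0  # shortest of the predetermined periods above

class TwoDimensionalInterface:
    def __init__(self, rows: int, cols: int, actuate):
        self.rows, self.cols = rows, cols
        self.actuate = actuate   # callback issuing the cell's command
        self.current = None      # cell currently holding the cursor
        self.entered_at = 0.0

    def is_quick_access(self, cell) -> bool:
        row, _col = cell
        return row == 0  # quick-access sub-regions 308 on the upper area

    def on_cell(self, cell):
        """Called whenever the cursor's containing cell is re-evaluated."""
        now = time.monotonic()
        if cell != self.current:
            self.current, self.entered_at = cell, now
            if self.is_quick_access(cell):
                self.actuate(cell)  # actuate on mere entrance (308)
        elif (not self.is_quick_access(cell)
              and now - self.entered_at >= DWELL_PERIOD_S):
            self.actuate(cell)      # regular sub-region (306) after dwell
            self.entered_at = now   # restart dwell to avoid refiring
```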
  • Command Types
  • In some embodiments of the invention, the commands corresponding to regular sub-regions 206 are toggling commands which simply require a user indication that the command is to be performed. The toggling commands may include, for example, on/off, select, menu display and/or increase of a variable (e.g., volume) by a predetermined value. Optionally, in these embodiments, for sub-regions 206 assigned to controlling a variable, such as volume, which requires user input as to the extent of change of the variable value, a separate horizontal bar for receiving the desired value of the variable is displayed responsive to an upward hand gesture in the corresponding sub-region 206. Alternatively, the upward hand gesture controls the variable value according to its extent above a minimal threshold. In some embodiments of the invention, the variable value is controlled starting from its current value. Alternatively or additionally, each time the variable is controlled, the variable value begins from a minimal value at the point at which the upward gesture passes the threshold.
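  • The alternative in which the gesture's extent above the threshold directly controls the variable may be sketched as follows; the threshold and scale factor are illustrative assumptions, and the sketch starts from the variable's current value, per one of the options above.

```python
ACTUATION_THRESHOLD_MM = 20.0  # illustrative minimal upward extent
UNITS_PER_MM = 0.5             # illustrative variable units per millimeter

def variable_value(upward_extent_mm: float, current_value: float) -> float:
    """Map an upward gesture to a new variable value, starting from the
    variable's current value."""
    if upward_extent_mm <= ACTUATION_THRESHOLD_MM:
        return current_value  # below threshold: no change
    return current_value + (upward_extent_mm - ACTUATION_THRESHOLD_MM) * UNITS_PER_MM
```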
  • In some embodiments of the invention, two separate sub-regions 206 are assigned to multi-value variables, one for increasing the value of the variable and the other for decreasing the variable value.
  • Optionally, a visual vertical bar is displayed near the symbol of the current sub-region 206, indicating the extent of upward movement of the hand gesture. The bar may indicate the extent required in order to reach the minimal threshold required for the gesture to be considered a user indication and/or may indicate the extent beyond the threshold, for cases in which the command involves a range of values.
  • Quick-Access Sub-Regions
  • In some embodiments of the invention, quick-access sub-regions 208 and/or 308 are used for toggling commands. Alternatively, one or more of the quick-access sub-regions 308 are used for controlling a parameter value on a multi-value scale, such as for scrolling, channel control or volume control. Optionally, each time cursor 202 or 302 enters the quick-access sub-region 208 or 308, the value of the parameter is increased by a predetermined value. Alternatively or additionally, the time for which the cursor 202 is in quick-access sub-region 208 defines the extent of change of the parameter value. In some embodiments of the invention, two quick-access sub-regions 208 or 308 are used for each parameter, one for increasing the parameter and the other for decreasing the parameter.
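  • The dwell-time alternative above may be sketched as follows, with the parameter changing at a fixed rate while the cursor remains in the quick-access sub-region; the rate is an illustrative assumption.

```python
import time

RATE_PER_SECOND = 2.0  # illustrative: parameter units per second of dwell

def dwell_adjust(parameter: float, in_region: bool, last_tick: float):
    """Advance the parameter while the cursor dwells in the quick-access
    sub-region; call periodically and keep the returned timestamp."""
    now = time.monotonic()
    if in_region:
        parameter += RATE_PER_SECOND * (now - last_tick)
    return parameter, now
```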
  • Alternatively, the horizontal orientation of quick-access sub-regions 208 or 308 is used for controls requiring indication of an extent. In some embodiments of the invention, one or more of quick-access sub-regions 208 or 308 is divided into a plurality of zones which are associated with different values or with different extents of value change. The value control optionally depends only on horizontal movements in the sub-region 308, ignoring vertical movement components. Sub-regions 308 may have various shapes, including, as shown, rectangular and L-shaped sub-regions. Other shapes, including triangular and round shapes, may be used.
  • Further alternatively or additionally, any of the control methods described above regarding sub-regions 206 may be used for sub-regions 208, including opening a separate horizontal or vertical bar.
  • Optionally, quick-access sub-regions 208 and 308 are assigned to commands that are used more frequently than other commands and/or for commands that should be available at all times. For example, quick-access regions may be used for “mute” or “pause” in a video/audio control system and/or for a home command, for returning to a home state of the user interface.
  • The commands assigned to quick-access sub-regions 208 or 308 are optionally commands which are easily reversible by the user. Alternatively or additionally, the commands assigned to quick-access sub-regions 208 or 308 do not directly control an operation of controlled application 108, but rather perform adjustments or selections which require other commands to cause operations of controlled application 108. In other embodiments, the commands assigned to quick-access sub-regions 208 or 308 do not control acts of controlled application 108 which interact with the user, such as changing the volume and/or beginning and/or changing a viewed program.
  • FIG. 4 is a schematic illustration of a joystick user interface 400, in accordance with another embodiment of the invention. Interface 400 includes a central sub-region 406 serving as a joystick button and four directional triangles 412 formed of pairs of quick-access sub-regions 410 and 408. Sub-regions 410 are hover sub-regions associated with slow movement in a specific direction, and adjacent sub-regions 408 are associated with movement in the same direction, but at a faster speed.
  • It is noted that triangles 412 may include more than two sub-regions each, possibly 3, 4 or even more, allowing the user more flexibility in choosing the speed. Alternatively or additionally, central sub-region 406 may be divided into a plurality of sub-regions corresponding to different button commands. In some embodiments of the invention, a horizontal bar 420 including a plurality of sub-regions, such as one similar to user interface 200, is located within central sub-region 406. Horizontal bar 420 may be displayed continuously, or may be displayed responsive to a user command given when cursor 402 is within central sub-region 406. Optionally, when usage of horizontal bar 420 is invoked, for example by hovering cursor 402 over its location and providing a unique user movement, such as a hand twist or a movement along the depth axis, the movement of cursor 402 is confined to horizontal bar 420. Optionally, one of the sub-regions of horizontal bar 420 is assigned to a command allowing cursor 402 to leave the horizontal bar. Alternatively or additionally, a unique user movement may be used to indicate leaving horizontal bar 420 from any point therein.
  • Sub-Menus
  • Interfaces 200 and 300 may include control of all commands that are available to a user. Alternatively, one or more of the sub-regions may be assigned to a command of entering a sub-interface and/or of returning back to a parent or main interface.
  • In some embodiments of the invention, the sub-menu includes a sub-region thereof assigned to a command of returning back to the main menu. The “return” command is optionally achieved by an upward gesture, like the other commands, so that a downward gesture for the “return” command is not required. In other embodiments, processor 104 automatically returns to the main menu after a predetermined time and/or after a predetermined period of inaction. In some embodiments, downward gestures are allowed in sub-menus, while being ignored in the main menu. Accordingly, downward gestures may be used to return to the main menu.
  • EXAMPLES
  • In some example embodiments, quick-access sub-regions 208 or 308 are assigned to a play/pause command and to a volume control, while regular sub-regions 206 or 306 are assigned to menu selections for movement to other interfaces. For example, one sub-region 206 may be assigned to entrance into a sub-interface for controlling display speed (e.g., fast-forward and rewind) and another may be assigned to entrance into a movie selection interface. In other embodiments, quick-access regions are assigned to a command for opening a frequently used menu and/or to a command for returning to a parent or main menu.
  • In another example, quick-access sub-regions 208 are assigned to rewind and fast-forward controls and regular sub-regions 206 are assigned to play, volume control and program selection commands.
  • In other example embodiments, most or all of the regular sub-regions 206 or 306 are assigned to menu choices, with each sub-region assigned to an item (e.g., a movie in a movie selection menu). Quick-access sub-regions 208 or 308 are optionally assigned to scrolling commands which present additional sub-regions corresponding to items, particularly when the number of selectable items is larger than the number of sub-regions presented concurrently on the interface. It is noted that in a simple embodiment, a single-dimension menu interface includes three sub-regions: a middle sub-region 206 corresponding to a selection command of a current item in the menu and side sub-regions 208 for left and right scrolling in the menu. If desired, more selection sub-regions 206 may be included, to allow faster scrolling through the items of the menu.
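  • In the (start, end, kind, command) convention of the earlier horizontal-bar sketch, this simple three-sub-region menu may be written as follows; the boundary values are illustrative assumptions.

```python
QUICK_ACCESS, REGULAR = "quick", "regular"

# A minimal single-dimension menu: a middle selection sub-region flanked
# by two quick-access scrolling sub-regions, per the simple embodiment
# described above.
MENU_BAR = [
    (0.0, 0.2, QUICK_ACCESS, "scroll_left"),     # 208: scrolls on entrance
    (0.2, 0.8, REGULAR, "select_current_item"),  # 206: needs upward gesture
    (0.8, 1.0, QUICK_ACCESS, "scroll_right"),    # 208: scrolls on entrance
]
```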
  • In some embodiments, in order to provide for more commands, one or more of the sub-regions 206, 306 is associated with different commands depending on attributes of the user movement. For example, a first action may be performed if the user raises a single finger, while a different command is performed if the user raises two fingers.
  • FIG. 5 is a schematic illustration of a display 500 with a horizontal bar user interface 502, in accordance with an embodiment of the invention. The movements of a cursor 504 are confined for one or more interface states to user interface 502, making it simple for the user to identify where the cursor is located.
  • Optionally, user interface 502 covers less than 20%, less than 10%, less than 5% or even less than 2% of the area of display 500. User interface 502 may be a horizontal bar as shown in FIG. 2, a two-dimensional array as shown in FIG. 3, or may have any other suitable shape. It is noted that the system may include one or more other states in which cursor 504 is allowed to move over larger parts of display 500, possibly even over all of the area of display 500.
  • Display 500 may be used for example to show a movie, with interface 502 used to control fast-forward/rewind, volume, play/pause and/or other relevant commands of viewing a movie. In one or more other states, part or all of display 500 is used for presenting options for selection, e.g., movies.
  • Interface 502 may be located at the bottom of display 500, as shown, or in other locations, such as the top, right, left or in the middle.
  • FIG. 6 is a schematic illustration of a display 600 with a user interface 602, in accordance with still another embodiment of the invention. User interface 602 is located on the upper outskirts of display 600, with the central part 604 of display 600 used for other purposes such as displaying a movie and/or selection options. Thus, central part 604 used for general display is located between a right arm 606 and a left arm 608 of interface 602.
  • In some embodiments of the invention, a cursor 612 of interface 602 is confined to movement within interface 602, without entering central part 604. Optionally, the movements throughout interface 602, including arms 606 and 608 are performed responsive to right-left movements of the user, and downward movements are ignored. Alternatively, movements in arms 606 and 608 are performed based on upward and downward movements of the user.
  • Conclusion
  • While the above description relates to hand movements, the principles of the present invention may be used with other movements, such as leg movements, body movements and/or finger movements.
  • It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.

Claims (20)

1. A gesture based user interface, comprising:
a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand;
a display; and
a processor configured to move a cursor responsive to the signal from the movement monitor, within a predetermined region on the display, formed of a plurality of sub-regions associated with respective control commands, and to provide the commands to a controlled application responsively to the movements of the hand,
wherein the sub-regions include at least one quick-access sub-region, for which the processor provides the command associated with the sub-region responsively to entrance of the cursor into the quick-access sub-region without additional user hand movements, and at least one regular sub-region, for which the processor provides the command associated with the sub-region responsively to identifying a predetermined hand gesture while the cursor is in the sub-region.
2. The user interface of claim 1, wherein the predetermined region comprises a single-dimensional region.
3. The user interface of claim 2, wherein the predetermined region comprises a horizontal bar.
4. The user interface of claim 2, wherein the processor is configured to define two quick-access sub-regions, one on each end of the single-dimensional region.
5. The user interface of claim 1, wherein the predetermined region is a convex region.
6. The user interface of claim 5, wherein the predetermined region does not include gaps between the sub-regions associated with respective control commands.
7. The user interface of claim 5, wherein the quick-access sub-regions are on the edges of the region.
8. The user interface of claim 1, wherein the processor is configured to ignore downward components of movements of the hand.
9. The user interface of claim 1, wherein the quick-access sub-regions are located within the predetermined region such that downward movements do not lead to a quick-access sub-region.
10. A gesture based user interface, comprising:
a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand;
a display; and
a processor configured to move a cursor responsive to the signal from the movement monitor, within a predetermined region on the display, formed of a plurality of sub-regions associated with respective control commands, and to provide the commands to a controlled application responsively to the movements of the hand,
wherein the processor is configured to ignore downward movements in controlling the cursor.
11. The user interface of claim 10, wherein the processor is configured to entirely ignore downward movements.
12. A method of receiving user input, comprising:
identifying movements of the hand of a user;
moving a cursor within a predetermined region, formed of a plurality of sub-regions associated with respective control commands, by a processor, responsive to the identified movements; and
actuating a control command associated with a sub-region in which the cursor is located,
wherein the sub-regions include at least one quick-access sub-region for which the associated command is actuated responsive to entrance of the cursor into the quick-access sub-region without requiring additional user hand movements and at least one regular sub-region, for which the associated command is actuated responsive to identifying a predetermined hand gesture while the cursor is in the sub-region.
13. The method of claim 12, wherein the predetermined hand gesture comprises an upward gesture.
14. The method of claim 12, wherein moving the cursor within the predetermined region comprises moving along a single-dimensional region.
15. A computer software product, comprising a tangible computer-readable medium in which program instructions are stored, which instructions, when read by a computer, cause the computer to:
identify movements of a user;
move a cursor within a predetermined region, formed of a plurality of sub-regions associated with respective control commands, responsive to the identified movements; and
actuate a control command associated with a sub-region in which the cursor is located,
wherein the sub-regions include at least one quick-access sub-region for which the associated command is actuated responsive to entrance of the cursor into the quick-access sub-region without requiring additional user hand movements and at least one regular sub-region, for which the associated command is actuated responsive to identifying a predetermined hand gesture while the cursor is in the sub-region.
16. A gesture based user interface, comprising:
a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand;
a display; and
a processor configured to provide at least one interface state in which a cursor is confined to movement within a single-dimensional region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single-dimensional region.
17. The user interface of claim 16, wherein the single-dimensional region comprises a horizontal bar in which the cursor is confined only to right and left movements.
18. The user interface of claim 16, wherein the processor is configured to allow the cursor to exit the single-dimensional region from a predetermined sub-region of the single-dimensional region.
19. The user interface of claim 16, wherein the single-dimensional region covers less than 10% of the display.
20. The user interface of claim 16, wherein the single-dimensional region comprises a plurality of adjacent sub-regions associated with different values of a single parameter.
US13/161,508 2010-06-17 2011-06-16 Gesture based user interface Abandoned US20110310010A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/161,508 US20110310010A1 (en) 2010-06-17 2011-06-16 Gesture based user interface
US15/434,081 US10429937B2 (en) 2010-06-17 2017-02-16 Gesture based user interface
US16/550,423 US10928921B2 (en) 2010-06-17 2019-08-26 Gesture based user interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US35557410P 2010-06-17 2010-06-17
US13/161,508 US20110310010A1 (en) 2010-06-17 2011-06-16 Gesture based user interface

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/434,081 Continuation US10429937B2 (en) 2010-06-17 2017-02-16 Gesture based user interface

Publications (1)

Publication Number Publication Date
US20110310010A1 true US20110310010A1 (en) 2011-12-22

Family

ID=45328163

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/161,508 Abandoned US20110310010A1 (en) 2010-06-17 2011-06-16 Gesture based user interface
US15/434,081 Active US10429937B2 (en) 2010-06-17 2017-02-16 Gesture based user interface
US16/550,423 Active US10928921B2 (en) 2010-06-17 2019-08-26 Gesture based user interface

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/434,081 Active US10429937B2 (en) 2010-06-17 2017-02-16 Gesture based user interface
US16/550,423 Active US10928921B2 (en) 2010-06-17 2019-08-26 Gesture based user interface

Country Status (1)

Country Link
US (3) US20110310010A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111580652B (en) * 2020-05-06 2024-01-16 Oppo广东移动通信有限公司 Video playing control method and device, augmented reality equipment and storage medium
US11803247B2 (en) 2021-10-25 2023-10-31 Kyndryl, Inc. Gesture-based control of plural devices in an environment
DE102022116737A1 (en) 2022-07-05 2024-01-11 Friedrich-Alexander-Universität Erlangen-Nürnberg, Körperschaft des öffentlichen Rechts System, method, computer program and computer-readable medium

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7263668B1 (en) 2000-11-09 2007-08-28 International Business Machines Corporation Display interface to a computer controlled display system with variable comprehensiveness levels of menu items dependent upon size of variable display screen available for menu item display
US8905834B2 (en) 2007-11-09 2014-12-09 Igt Transparent card display
WO2007098206A2 (en) 2006-02-16 2007-08-30 Hillcrest Laboratories, Inc. Systems and methods for placing advertisements
WO2007096893A2 (en) 2006-02-27 2007-08-30 Prime Sense Ltd. Range mapping using speckle decorrelation
US8139029B2 (en) 2006-03-08 2012-03-20 Navisense Method and device for three-dimensional sensing
EP2201761B1 (en) 2007-09-24 2013-11-20 Qualcomm Incorporated Enhanced interface for voice and video communications
US9772689B2 (en) * 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
JP5361349B2 (en) 2008-11-28 2013-12-04 任天堂株式会社 Information processing apparatus, computer program, information processing system, and information processing method
US11589754B2 (en) 2009-05-20 2023-02-28 Sotera Wireless, Inc. Blood pressure-monitoring system with alarm/alert system that accounts for patient motion
US8487871B2 (en) 2009-06-01 2013-07-16 Microsoft Corporation Virtual desktop coordinate transformation
US8977972B2 (en) 2009-12-31 2015-03-10 Intel Corporation Using multi-modal input to control multiple objects on a display
US8756532B2 (en) 2010-01-21 2014-06-17 Cisco Technology, Inc. Using a gesture to transfer an object across multiple multi-touch devices
US20110289455A1 (en) 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Recognition For Manipulating A User-Interface
US9519418B2 (en) 2011-01-18 2016-12-13 Nokia Technologies Oy Method and apparatus for providing a multi-stage device transition mechanism initiated based on a touch gesture
US9104239B2 (en) 2011-03-09 2015-08-11 Lg Electronics Inc. Display device and method for controlling gesture functions using different depth ranges
US8840466B2 (en) 2011-04-25 2014-09-23 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US8854433B1 (en) 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020057383A1 (en) * 1998-10-13 2002-05-16 Ryuichi Iwamura Motion sensing interface
US6229541B1 (en) * 1999-09-03 2001-05-08 Isurftv Use of templates for cost-effective secure linking of video stream objects
US20060248475A1 (en) * 2002-09-09 2006-11-02 Thomas Abrahamsson Graphical user interface system
US20060020905A1 (en) * 2004-07-20 2006-01-26 Hillcrest Communications, Inc. Graphical cursor navigation methods
US20090027337A1 (en) * 2007-07-27 2009-01-29 Gesturetek, Inc. Enhanced camera-based input
US20100229125A1 (en) * 2009-03-09 2010-09-09 Samsung Electronics Co., Ltd. Display apparatus for providing a user menu, and method for providing user interface (ui) applicable thereto

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US20100235786A1 (en) * 2009-03-13 2010-09-16 Primesense Ltd. Enhanced 3d interfacing for remote devices
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US9454225B2 (en) 2011-02-09 2016-09-27 Apple Inc. Gaze-based display control
US9342146B2 (en) 2011-02-09 2016-05-17 Apple Inc. Pointing-based display interaction
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US9504920B2 (en) 2011-04-25 2016-11-29 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US20130194632A1 (en) * 2012-01-30 2013-08-01 Kazuhisa Kishimoto Information processing apparatus, control method therefor, and control program
US8867079B2 (en) * 2012-01-30 2014-10-21 Konica Minolta Business Technologies, Inc. Method and apparatus for communicating user information using body area network
US9600078B2 (en) 2012-02-03 2017-03-21 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US11169611B2 (en) 2012-03-26 2021-11-09 Apple Inc. Enhanced virtual touchpad
US8934675B2 (en) 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US9098739B2 (en) 2012-06-25 2015-08-04 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching
US8830312B2 (en) 2012-06-25 2014-09-09 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching within bounded regions
US8655021B2 (en) 2012-06-25 2014-02-18 Imimtek, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US20140026101A1 (en) * 2012-07-20 2014-01-23 Barnesandnoble.Com Llc Accessible Menu Navigation Techniques For Electronic Devices
US9310891B2 (en) 2012-09-04 2016-04-12 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US10474342B2 (en) * 2012-12-17 2019-11-12 Microsoft Technology Licensing, Llc Scrollable user interface control
US20140173504A1 (en) * 2012-12-17 2014-06-19 Microsoft Corporation Scrollable user interface control
US8615108B1 (en) 2013-01-30 2013-12-24 Imimtek, Inc. Systems and methods for initializing motion tracking of human hands
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US10671342B2 (en) * 2013-01-31 2020-06-02 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
US20150331668A1 (en) * 2013-01-31 2015-11-19 Huawei Technologies Co., Ltd. Non-contact gesture control method, and electronic terminal device
US9498885B2 (en) 2013-02-27 2016-11-22 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with confidence-based decision support
US9393695B2 (en) 2013-02-27 2016-07-19 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with person and object discrimination
US9731421B2 (en) 2013-02-27 2017-08-15 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with person and object discrimination
US9798302B2 (en) 2013-02-27 2017-10-24 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with redundant system input support
US9804576B2 (en) 2013-02-27 2017-10-31 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with position and derivative decision reference
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
WO2014174513A1 (en) * 2013-04-21 2014-10-30 Biogaming Ltd. Kinetic user interface
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
CN103558927A (en) * 2013-11-21 2014-02-05 广州视声电子实业有限公司 3D gesture control method and device
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
WO2015153673A1 (en) * 2014-03-31 2015-10-08 Google Inc. Providing onscreen visualizations of gesture movements
CN107918481A (en) * 2016-10-08 2018-04-17 天津锋时互动科技有限公司深圳分公司 Man-machine interaction method and system based on gesture identification

Also Published As

Publication number Publication date
US10928921B2 (en) 2021-02-23
US20170185161A1 (en) 2017-06-29
US20190377420A1 (en) 2019-12-12
US10429937B2 (en) 2019-10-01

Legal Events

Date Code Title Description
AS Assignment

Owner name: PRIMESENSE LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HOFFNUNG, AMIR;GALOR, MICHA;POKRASS, JONATHAN;AND OTHERS;SIGNING DATES FROM 20110619 TO 20110623;REEL/FRAME:026525/0073

AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRIMESENSE LTD.;REEL/FRAME:034293/0092

Effective date: 20140828

AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION # 13840451 AND REPLACE IT WITH CORRECT APPLICATION # 13810451 PREVIOUSLY RECORDED ON REEL 034293 FRAME 0092. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:PRIMESENSE LTD.;REEL/FRAME:035624/0091

Effective date: 20140828

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION