Enhanced interface for voice and video communications
First Claim
1. A computer-implemented method comprising:
recognizing an engagement gesture of a user from a sequence of camera images, wherein a plurality of persons are present in at least one image of the sequence of camera images, the plurality of persons including the user;
focusing one or more camera images on the user using a user reference location identified based on the recognized engagement gesture and a face of the user;
providing a user interface comprising a control and a representation of the user;
causing the representation to interact with the control based on the recognized gesture; and
controlling a telecommunication session based on the interaction.
Abstract
An enhanced interface for voice and video communications, in which a gesture of a user is recognized from a sequence of camera images, and a user interface is provided that includes a control and a representation of the user. The process also includes causing the representation to interact with the control based on the recognized gesture, and controlling a telecommunication session based on the interaction.
23 Claims
1. A computer-implemented method comprising:
recognizing an engagement gesture of a user from a sequence of camera images, wherein a plurality of persons are present in at least one image of the sequence of camera images, the plurality of persons including the user;
focusing one or more camera images on the user using a user reference location identified based on the recognized engagement gesture and a face of the user;
providing a user interface comprising a control and a representation of the user;
causing the representation to interact with the control based on the recognized gesture; and
controlling a telecommunication session based on the interaction.
(Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 20, 21, 22)
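Purely as an illustration of the sequence the claim recites (not the patented implementation), the pipeline can be sketched in Python. All type names, gesture labels, and crop dimensions below are assumptions introduced for the example; real gesture recognition and camera control are stubbed out with simple data structures.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical stand-ins for a vision stack; names are illustrative only.

@dataclass
class Person:
    face_center: Tuple[int, int]   # (x, y) face position in the image
    gesture: Optional[str]         # e.g. "engagement_wave", or None

@dataclass
class Frame:
    persons: List[Person]

def find_engaged_user(frames: List[Frame]) -> Optional[Person]:
    """Recognize an engagement gesture among a plurality of persons."""
    for frame in frames:
        for person in frame.persons:
            if person.gesture == "engagement_wave":
                return person
    return None

def user_reference_location(user: Person) -> Tuple[int, int]:
    """Derive a user reference location from the gesture and the user's face.
    Here we simply reuse the face centre; a real system could fuse both cues."""
    return user.face_center

def focus_on(location: Tuple[int, int], width: int = 320,
             height: int = 240) -> Tuple[int, int, int, int]:
    """'Focus' the camera image by cropping a window around the user."""
    x, y = location
    return (x - width // 2, y - height // 2, x + width // 2, y + height // 2)

def control_session(gesture: str) -> str:
    """Map the recognized gesture, via an on-screen control, to a session action."""
    actions = {"engagement_wave": "answer_call", "dismiss_swipe": "end_call"}
    return actions.get(gesture, "no_op")

# Example: two persons in frame, only one makes the engagement gesture.
frames = [Frame(persons=[Person(face_center=(100, 80), gesture=None),
                         Person(face_center=(400, 90), gesture="engagement_wave")])]
user = find_engaged_user(frames)
crop = focus_on(user_reference_location(user))
action = control_session(user.gesture)
print(crop, action)
```

The sketch mirrors the claim order: gesture recognition selects one user from several persons, a reference location derived from that user's face drives the crop ("focusing"), and the gesture ultimately controls the telecommunication session.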
18. A device comprising a processor configured to:
recognize an engagement gesture of a user from a sequence of camera images, wherein a plurality of persons are present in at least one image of the sequence of camera images, the plurality of persons including the user;
identify a user reference position of the user based at least in part on the engagement gesture and a face position of the user of the plurality of persons;
set the user as active using the user reference position;
focus one or more camera images on the user using the user reference position;
provide a user interface comprising a control and a representation of the user;
cause the representation to interact with the control based on the recognized gesture; and
control a telecommunication session based on the interaction.
19. A computer-readable storage medium encoded with a computer program comprising instructions that, when executed, operate to cause a computer to perform operations comprising:
recognizing an engagement gesture of a user from a sequence of camera images, wherein a plurality of persons are present in at least one image of the sequence of camera images, the plurality of persons including the user;
identifying a user reference position of the user based at least in part on the engagement gesture and a face position of the user of the plurality of persons;
setting the user as active using the user reference position;
focusing one or more camera images on the user using the user reference position;
providing a user interface comprising a control and a representation of the user;
causing the representation to interact with the control based on the recognized gesture; and
controlling a telecommunication session based on the interaction.
23. An apparatus comprising:
means for recognizing an engagement gesture of a user from a sequence of camera images, wherein a plurality of persons are present in at least one image of the sequence of camera images, the plurality of persons including the user;
means for focusing one or more camera images on the user using a user reference location identified based on the recognized engagement gesture and a face of the user;
means for providing a user interface comprising a control and a representation of the user;
means for causing the representation to interact with the control based on the recognized gesture; and
means for controlling a telecommunication session based on the interaction.
Specification