Multi-level selection methods and apparatus using context identification for embedded data graphical user interfaces
Abstract
An image capture user interface receives an image of an area of a user interface selected by a user and translates the image into operations performable by a computer. The user interface comprises graphic entities and embedded code. The user places an image capture device, such as a camera pen, on or near a graphic entity of the user interface and presses a button on the device to indicate selection of that graphic entity. In response to the button press, an image corresponding to the selected graphic entity is captured. The image includes embedded code, which is analyzed to develop an image capture code corresponding to the captured image area. The image capture code is then mapped to a selection code corresponding to the selected graphic entity. The user may then make other selections. The selection codes are processed for a particular syntax, and a computer operation is performed when a selection code, or combination of selection codes, is received that indicates an operation is to be performed. In other embodiments, the mapping of image capture codes to selection codes and the syntax processing may be performed in accordance with a particular context.
-
1. Apparatus for use with a user interface having sensory indicia and embedded data code, and an image capture device for selecting a portion of the user interface and sending image information corresponding to the portion, comprising:
-
an image processor for decoding first image information into a first image capture code;
a resolver for mapping the first image capture code into a first selection code; and
a syntax processor for analyzing the first selection code with a second selection code.
-
2. The apparatus according to claim 1, wherein the resolver comprises:
a plurality of mapping modules for mapping image capture codes to selection codes based on context.
-
3. The apparatus according to claim 2, wherein the syntax processor comprises:
a plurality of syntax modules for analyzing selection codes based on context.
-
4. The apparatus according to claim 2, wherein the syntax processor comprises:
a single syntax module for analyzing selection codes.
-
5. The apparatus according to claim 1, wherein the resolver comprises:
a single mapping module for mapping image capture codes to selection codes.
-
6. The apparatus according to claim 5, wherein the syntax processor comprises:
a plurality of syntax modules for analyzing selection codes based on context.
-
7. The apparatus according to claim 5, wherein the syntax processor comprises:
a single syntax module for analyzing selection codes.
-
8. The apparatus according to claim 1,
wherein the resolver comprises a plurality of mapping modules, one of which determines a mapping based on a first context; and
wherein the syntax processor comprises a plurality of syntax modules, one of which determines a computer system operation based on the first context.
-
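Dependent claims 2 through 8 cover a resolver holding a plurality of mapping modules selected by context. A minimal sketch of that dispatch, with all names and codes hypothetical:

```python
# Illustrative context-based dispatch: the resolver holds several mapping
# modules (plain dicts here), and the active context picks one.

class ContextResolver:
    def __init__(self, modules: dict[str, dict[str, str]]):
        # One mapping module per context code.
        self.modules = modules
    def resolve(self, context: str, capture_code: str) -> str:
        # The same capture code can yield different selection codes
        # depending on which context is active.
        return self.modules[context][capture_code]
```

The design point is that the capture code alone does not determine the selection; the context chooses which mapping module interprets it.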
9. The apparatus according to claim 1, wherein the sensory indicia comprises a graphic element.
-
10. The apparatus according to claim 1, wherein the embedded data code comprises glyphs.
-
11. The apparatus according to claim 1, wherein the embedded data code comprises an address code.
-
12. The apparatus according to claim 1, wherein the embedded data code comprises a glyph address code.
-
13. The apparatus according to claim 1, wherein the embedded data code comprises a label code associated with a graphic entity.
-
14. The apparatus according to claim 1, wherein the user interface comprises:
-
a first user interface portion including a first graphic entity on a first substrate, wherein the first graphic entity is associated with the first selection code; and
a second user interface portion including a second graphic entity on a second substrate, wherein the second graphic entity is associated with the second selection code.
-
15. The apparatus according to claim 14, wherein the first user interface portion comprises a dynamic display.
-
16. The apparatus according to claim 1, wherein the user interface comprises:
-
a first user interface portion having an address space on a first substrate corresponding to a first context code; and
a second user interface portion having an address space on a second substrate corresponding to a second context code.
-
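Claim 16 (and the parallel claim 54) associate each user interface portion's address space on a substrate with a context code. One way to picture this is a partition of the address range, where the decoded address identifies the context; the partition and names below are assumptions for illustration:

```python
# Hypothetical context identification from address spaces: each user
# interface portion owns a distinct address range on its substrate.

def context_for_address(address: int) -> str:
    spaces = {
        range(0, 5000): "CONTEXT_A",      # first user interface portion
        range(5000, 10000): "CONTEXT_B",  # second user interface portion
    }
    for space, context in spaces.items():
        if address in space:
            return context
    raise ValueError(f"address {address} not in any known address space")
```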
17. A method for use with a user interface having sensory indicia and embedded data code, and an image capture device for selecting a portion of the user interface and sending image information corresponding to the portion, comprising:
-
decoding first image information into a first image capture code;
mapping the first image capture code into a first selection code; and
analyzing the first selection code with a second selection code.
-
18. The method according to claim 17, wherein the step of mapping comprises:
mapping image capture codes to selection codes based on context using a plurality of mapping modules.
-
19. The method according to claim 18, wherein the step of analyzing comprises:
analyzing selection codes based on context using a plurality of syntax modules.
-
20. The method according to claim 18, wherein the step of analyzing comprises:
analyzing selection codes using a single syntax module.
-
21. The method according to claim 17, wherein the step of mapping comprises:
mapping image capture codes to selection codes using a single mapping module.
-
22. The method according to claim 21, wherein the step of analyzing comprises:
analyzing selection codes based on context using a plurality of syntax modules.
-
23. The method according to claim 21, wherein the step of analyzing comprises:
analyzing selection codes using a single syntax module.
-
24. The method according to claim 17, further comprising:
-
determining a mapping using a mapping module corresponding to a context; and
determining a computer system operation using a syntax module corresponding to the context.
-
25. The method according to claim 17, wherein the sensory indicia comprises a graphic element.
-
26. The method according to claim 17, wherein the embedded data code comprises glyphs.
-
27. The method according to claim 17, wherein the embedded data code comprises an address code.
-
28. The method according to claim 17, wherein the embedded data code comprises a glyph address code.
-
29. The method according to claim 17, wherein the embedded data code comprises a label code associated with a graphic entity.
-
30. The method according to claim 17, wherein the user interface comprises:
-
a first user interface portion including a first graphic entity on a first substrate, wherein the first graphic entity is associated with the first selection code; and
a second user interface portion including a second graphic entity on a second substrate, wherein the second graphic entity is associated with a second selection code.
-
31. The method according to claim 30, wherein the first user interface portion comprises a dynamic display.
-
32. The method according to claim 17, further comprising:
-
providing a first interface having an address space on a first substrate corresponding to a first context code; and
providing a second interface having an address space on a second substrate corresponding to a second context code.
-
33. A graphical user interface for use with a user interface system including a device for capturing a portion of the graphical user interface in response to selection of sensory indicia, and a computer system responsive to embedded code in the portion for performing an operation related to the sensory indicia and a first selection code, comprising:
-
sensory indicia; and
embedded data codes having a predetermined spatial relationship to the sensory indicia, wherein the embedded data codes comprise a context code.
-
34. The graphical user interface according to claim 33, wherein the embedded data codes comprise:
an address carpet for relating selected locations in the user interface to corresponding computer system entities.
-
36. The graphical user interface according to claim 33, wherein the embedded codes comprise glyphs.
-
37. The graphical user interface according to claim 33, wherein the embedded codes comprise:
a label for relating the sensory indicia to a computer system entity.
-
38. The graphical user interface according to claim 33, wherein the embedded data codes comprise a distributed label context code.
-
39. Apparatus for determining computer system operations, comprising:
-
a user interface having sensory indicia and embedded data code;
an image capture device for selecting a portion of the user interface and sending image information corresponding to the portion;
an image processor for decoding image information into an image capture code;
a resolver for mapping the image capture code into a selection code; and
a syntax processor for analyzing multiple selection codes to determine corresponding computer system operations.
-
40. The apparatus according to claim 39, wherein the resolver comprises:
a plurality of mapping modules for mapping image capture codes to selection codes based on context.
-
41. The apparatus according to claim 40, wherein the syntax processor comprises:
a plurality of syntax modules for analyzing selection codes based on context.
-
42. The apparatus according to claim 40, wherein the syntax processor comprises:
a single syntax module for analyzing selection codes.
-
43. The apparatus according to claim 39, wherein the resolver comprises:
a single mapping module for mapping image capture codes to selection codes.
-
44. The apparatus according to claim 43, wherein the syntax processor comprises:
a plurality of syntax modules for analyzing selection codes based on context.
-
45. The apparatus according to claim 43, wherein the syntax processor comprises:
a single syntax module for analyzing selection codes.
-
46. The apparatus according to claim 39,
wherein the resolver comprises a plurality of mapping modules, one of which determines a mapping based on a first context; and
wherein the syntax processor comprises a plurality of syntax modules, one of which determines a computer system operation based on the first context.
-
47. The apparatus according to claim 39, wherein the sensory indicia comprises:
a graphic element.
-
48. The apparatus according to claim 39, wherein the embedded data code comprises glyphs.
-
49. The apparatus according to claim 39, wherein the embedded data code comprises an address code.
-
50. The apparatus according to claim 39, wherein the embedded data code comprises a glyph address code.
-
51. The apparatus according to claim 39, wherein the embedded data code comprises a label code associated with a graphic entity.
-
52. The apparatus according to claim 39, wherein the user interface comprises:
-
a first user interface portion including a first graphic entity on a first substrate, wherein the first graphic entity is associated with the first selection code; and
a second user interface portion including a second graphic entity on a second substrate, wherein the second graphic entity is associated with a second selection code.
-
53. The apparatus according to claim 52, wherein the first user interface portion comprises a dynamic display.
-
54. The apparatus according to claim 39, wherein the user interface comprises:
-
a first user interface portion having an address space on a first substrate corresponding to a first context code; and
a second user interface portion having an address space on a second substrate corresponding to a second context code.
-