Method and apparatus for selectively integrating sensory content
First Claim
1. A machine-implemented method, comprising in a processor:
establishing a reference position;
establishing a physical data model representing at least one physical entity, said physical data model being spatially dynamic in time;
establishing a notional data model representing at least one notional entity, said notional data model being dynamic in time and non-exclusive of spatial coincidence with said physical data model;
establishing a first sensory property from said notional data model;
wherein said first sensory property represents a physical environmental phenomenon;
wherein said physical environmental phenomenon comprises at least one of physical illumination, physical shadowing, a physical volumetric effect, and an optical phenomenon; and
wherein said physical volumetric effect comprises at least one of a group consisting of at least one falling element, at least one flying element, and at least one suspended element;
determining a second sensory property, said second sensory property corresponding with said first sensory property for another of said physical and notional data models;
generating notional sensory content representing at least a portion of said notional data model with at least a portion of said second sensory property applied thereto; and
outputting to said reference position with a perceive-through display said notional sensory content, registered with said physical data model.
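Outside the claim language, the final step — outputting notional sensory content "registered with said physical data model" to a see-through display — implies, for the visual case, a per-pixel depth comparison so that notional (virtual) content is masked wherever a physical entity is nearer to the reference position. A minimal sketch of that occlusion step, assuming per-pixel depth maps are available (all function and variable names here are illustrative, not taken from the patent):

```python
import numpy as np

def composite_notional(physical_depth, notional_depth, notional_rgba):
    """Mask notional (virtual) pixels that lie behind a physical entity.

    physical_depth: HxW depths of the physical data model from the
                    reference position (np.inf where no physical entity).
    notional_depth: HxW depths of rendered notional content (np.inf where empty).
    notional_rgba:  HxWx4 rendered notional sensory content.

    Returns the notional content with alpha zeroed wherever a physical
    surface is closer, so a see-through display shows the physical
    entity occluding the notional one.
    """
    occluded = physical_depth < notional_depth   # physical surface is nearer
    out = notional_rgba.copy()
    out[occluded, 3] = 0.0                       # occluded pixels become transparent
    return out
```

On a see-through display the physical scene is visible directly, so "rendering" an occlusion amounts to not emitting the occluded notional pixels at all.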
Abstract
A sensory property such as occlusion, shadowing, or reflection is integrated among physical and notional (e.g., virtual/augmented) visual or other sensory content, providing an appearance of similar occlusion, shadowing, etc. in both models. A reference position, a physical data model representing physical entities, and a notional data model are created or accessed. A first sensory property is established from either data model, a second sensory property corresponding with the first is determined for the other data model, and notional sensory content is generated from the notional data model with the second sensory property applied thereto. The notional sensory content is output to the reference position with a see-through display. Consequently, notional entities may appear occluded by physical entities, physical entities may appear to cast shadows from notional light sources, and so on.
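The converse case mentioned in the abstract — a physical entity appearing to cast a shadow from a notional light source — can be illustrated with classic planar shadow projection: each vertex of the physical model is projected along the ray from the notional light onto a receiving plane, and the resulting dark polygon is emitted as notional content. A sketch for a ground plane at y = 0 (names and the planar assumption are illustrative, not from the patent):

```python
def project_shadow_point(light, point):
    """Project `point` onto the plane y = 0 along the ray from `light`.

    light, point: (x, y, z) tuples with light[1] > point[1] > 0.
    Returns the (x, 0, z) shadow position, i.e. the dark spot a
    notional light would cast through a physical vertex.
    """
    lx, ly, lz = light
    px, py, pz = point
    t = ly / (ly - py)   # ray parameter where the light-to-point ray meets y = 0
    return (lx + t * (px - lx), 0.0, lz + t * (pz - lz))
```

Applying this to every vertex of a physical entity's silhouette yields a flattened shadow shape that is then rendered as ordinary notional content at the reference position.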
27 Claims
1. A machine-implemented method, comprising in a processor:
establishing a reference position;
establishing a physical data model representing at least one physical entity, said physical data model being spatially dynamic in time;
establishing a notional data model representing at least one notional entity, said notional data model being dynamic in time and non-exclusive of spatial coincidence with said physical data model;
establishing a first sensory property from said notional data model;
wherein said first sensory property represents a physical environmental phenomenon;
wherein said physical environmental phenomenon comprises at least one of physical illumination, physical shadowing, a physical volumetric effect, and an optical phenomenon; and
wherein said physical volumetric effect comprises at least one of a group consisting of at least one falling element, at least one flying element, and at least one suspended element;
determining a second sensory property, said second sensory property corresponding with said first sensory property for another of said physical and notional data models;
generating notional sensory content representing at least a portion of said notional data model with at least a portion of said second sensory property applied thereto; and
outputting to said reference position with a perceive-through display said notional sensory content, registered with said physical data model.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9
10. A machine-implemented method, comprising in a processor:
establishing a reference position;
establishing a physical data model representing at least one physical entity, said physical data model being spatially dynamic in time;
establishing a notional data model representing at least one notional entity, said notional data model being dynamic in time and non-exclusive of spatial coincidence with said physical data model;
establishing a first sensory property from said notional data model;
wherein said first sensory property represents a physical environmental phenomenon;
wherein said physical environmental phenomenon comprises at least one of physical illumination, physical shadowing, a physical volumetric effect, and an optical phenomenon; and
wherein said physical volumetric effect comprises at least one of a group consisting of ash, debris, dust, fog, gas, hail, heat distortion, insects, leaves, mist, rain, sleet, smoke, snow, spray, and steam;
determining a second sensory property, said second sensory property corresponding with said first sensory property for another of said physical and notional data models;
generating notional sensory content representing at least a portion of said notional data model with at least a portion of said second sensory property applied thereto; and
outputting to said reference position with a perceive-through display said notional sensory content, registered with said physical data model.
Dependent claims: 11, 12, 13, 14, 15, 16, 17, 18
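For a volumetric effect of the kind claim 10 enumerates (fog, mist, smoke, etc.), the corresponding "second sensory property" for notional content is the same distance-dependent attenuation that the effect imposes on the physical scene. If attenuation is modeled with Beer–Lambert falloff, matching the virtual content to the fogged physical view can be sketched as (the density constant and names are illustrative):

```python
import math

def fog_blend(color, fog_color, distance, density=0.1):
    """Blend a notional pixel toward the fog color with Beer-Lambert
    attenuation, so virtual content fades with distance the same way
    physical content does when seen through the same fog."""
    f = math.exp(-density * distance)   # fraction of unscattered light
    return tuple(f * c + (1.0 - f) * g for c, g in zip(color, fog_color))
```

Nearby notional entities keep their own color, while distant ones converge to the fog color, matching the appearance of physical entities at the same depth.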
19. A machine-implemented method, comprising in a processor:
establishing a reference position;
establishing a physical data model representing at least one physical entity, said physical data model being spatially dynamic in time;
establishing a notional data model representing at least one notional entity, said notional data model being dynamic in time and non-exclusive of spatial coincidence with said physical data model;
establishing a first sensory property from said notional data model;
wherein said first sensory property represents a physical environmental phenomenon;
wherein said physical environmental phenomenon comprises at least one of physical illumination, physical shadowing, a physical volumetric effect, and an optical phenomenon; and
wherein said physical environmental phenomenon comprises at least one of a group consisting of diffraction, diffusion, glory, haloing, lens flare, and reflection;
determining a second sensory property, said second sensory property corresponding with said first sensory property for another of said physical and notional data models;
generating notional sensory content representing at least a portion of said notional data model with at least a portion of said second sensory property applied thereto; and
outputting to said reference position with a perceive-through display said notional sensory content, registered with said physical data model.
Dependent claims: 20, 21, 22, 23, 24, 25, 26, 27
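Among the optical phenomena claim 19 enumerates, reflection is the simplest to illustrate: a physical entity's reflection in a notional mirror surface can be produced by mirroring the physical-model geometry across the mirror plane and rendering the mirrored copy as notional content. A sketch for a plane through the origin with unit normal n (names and the through-origin assumption are illustrative):

```python
def mirror_point(point, normal):
    """Reflect `point` across a plane through the origin with unit
    `normal`, e.g. to place a physical entity's reflection behind a
    notional mirror: p' = p - 2 (p . n) n."""
    d = sum(p * n for p, n in zip(point, normal))
    return tuple(p - 2.0 * d * n for p, n in zip(point, normal))
```

The mirrored geometry is then shaded and output like any other notional content, so the physical entity appears to reflect in the virtual surface.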
Specification