Method and apparatus for placing sensors using 3D models
Abstract
A method and apparatus for dynamically placing sensors in a 3D model are provided. Specifically, in one embodiment, the method selects a 3D model and a sensor for placement into the 3D model. The method renders the sensor and the 3D model in accordance with sensor parameters associated with the sensor and parameters desired by a user. In addition, the method determines whether an occlusion to the sensor is present.
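For orientation, here is a minimal sketch in Python of the coverage computation the abstract implies. The cone-shaped coverage model (position, direction, field of view, range) and all names are illustrative assumptions; the patent does not prescribe any particular representation or API.

```python
import math

# Hypothetical sensor parameters; the abstract's "sensor parameters" are not
# enumerated, so a camera-like cone (FOV + range) stands in here.
sensor = {"pos": (0.0, 0.0, 3.0), "dir": (1.0, 0.0, -0.3),
          "fov_deg": 60.0, "range": 25.0}

def covered(point, s=sensor):
    """True if `point` lies inside the sensor's cone of coverage."""
    ray = tuple(p - q for p, q in zip(point, s["pos"]))
    d = math.sqrt(sum(c * c for c in ray))
    if d == 0 or d > s["range"]:
        return False                      # out of range
    dn = math.sqrt(sum(c * c for c in s["dir"]))
    cos_a = sum(r * v for r, v in zip(ray, s["dir"])) / (d * dn)
    return cos_a >= math.cos(math.radians(s["fov_deg"] / 2))

# Sample points on the model's floor: which does the sensor cover?
floor = [(x, y, 0.0) for x in range(0, 30, 3) for y in (-6, 0, 6)]
print(sum(covered(p) for p in floor), "of", len(floor), "samples covered")
```

This frustum test alone ignores occlusion; the independent claims below add that, and the sketches following them do as well.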
Claims (38)
1. A method for dynamic sensor placement comprising:
positioning at least one sensory device in a scene of a 3D site model supported in a computer, said 3D site model including data defining a plurality of surfaces in three dimensions making up a plurality of objects in the site model;
rendering in said computer an image of at least part of said scene of the 3D site model in which at least part of a coverage of said at least one sensory device within the scene of said 3D site model is displayed, and part of the scene of the 3D site model outside said coverage is displayed, said coverage being derived in accordance with sensor parameters associated with said at least one sensory device;
said rendering of said image being derived for a view point in said 3D site model that is different from the position of said sensory device;
wherein said rendering step renders the coverage of said sensor in accordance with said sensor parameters such that surfaces in the 3D site model in said image have a texture that differentiates the coverage from the part of the scene that is not in said coverage, said surfaces in said image being disposed in the site model at a plurality of different respective three-dimensional orientations; and
wherein, when one of the surfaces or objects of the 3D site model is positioned so as to be an occlusion between said at least one sensory device and an occluded area that, absent the occlusion, would be in the coverage of said at least one sensory device, said image is rendered so that the occluded area has a texture that differentiates it from surfaces in the coverage of the sensory device.
Dependent claims: 2, 3, 4, 24, 25.

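Claim 1's three texture classes (covered, outside coverage, occluded) can be illustrated with a small, self-contained classifier. The single rectangular wall, the specific sensor values, and the texture names below are hypothetical stand-ins for the 3D site model's real geometry.

```python
import math

# Hypothetical sensor description; the claim only requires "sensor parameters".
SENSOR = {"pos": (0.0, 0.0, 3.0),   # mounted 3 m up
          "dir": (1.0, 0.0, -0.3),  # viewing direction
          "fov_deg": 60.0,          # full cone angle
          "range": 25.0}            # maximum useful distance

# One occluding wall, modeled as an axis-aligned rectangle at x = 10.
WALL = {"x": 10.0, "y": (-2.0, 2.0), "z": (0.0, 4.0)}

def in_frustum(p, s=SENSOR):
    """Is point p inside the sensor's cone of coverage (FOV + range)?"""
    ray = tuple(pc - sc for pc, sc in zip(p, s["pos"]))
    d = math.sqrt(sum(c * c for c in ray))
    if d == 0 or d > s["range"]:
        return False
    dn = math.sqrt(sum(c * c for c in s["dir"]))
    cos_a = sum(r * v for r, v in zip(ray, s["dir"])) / (d * dn)
    return cos_a >= math.cos(math.radians(s["fov_deg"] / 2))

def blocked_by_wall(p, s=SENSOR, w=WALL):
    """Does the segment sensor->p cross the wall rectangle?"""
    a, b = s["pos"], p
    if (a[0] - w["x"]) * (b[0] - w["x"]) >= 0:
        return False                       # both endpoints on the same side
    t = (w["x"] - a[0]) / (b[0] - a[0])    # crossing parameter along segment
    y = a[1] + t * (b[1] - a[1])
    z = a[2] + t * (b[2] - a[2])
    return w["y"][0] <= y <= w["y"][1] and w["z"][0] <= z <= w["z"][1]

def texture_for(p):
    """Pick a texture class as the claim requires: coverage, occlusion,
    and everything else must be visually distinguishable."""
    if not in_frustum(p):
        return "plain"        # outside coverage
    if blocked_by_wall(p):
        return "occluded"     # would be covered, but the wall intervenes
    return "covered"

print(texture_for((5.0, 0.0, 1.0)))    # covered
print(texture_for((15.0, 0.0, 1.0)))   # occluded (behind the wall)
print(texture_for((5.0, 20.0, 1.0)))   # plain (outside the FOV)
```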
5. A method for dynamic sensor placement comprising:
selecting a 3D site model supported in a computer, said 3D site model including data defining a plurality of surfaces in a scene;
selecting a sensor for placement into said 3D site model; and
rendering said sensor within the scene of said 3D site model in accordance with sensor parameters associated with said sensor, said rendering being performed by said computer for a point of view other than the location of the sensor, and including preparing an image of the scene from said point of view that includes at least part of a coverage area for said sensor, said coverage area being made up of the surfaces or parts of the surfaces that are covered by the sensor as derived in accordance with the 3D site model and the sensor parameters, and also includes a portion of the 3D site model that is not in said coverage area;
wherein said rendering step renders the coverage area covered by said sensor in accordance with said sensor parameters such that the surfaces in the scene of the 3D model that constitute the coverage area have a texture that differentiates said surfaces from the surfaces in the scene that are not in said coverage area, said surfaces that constitute the coverage area being oriented in a plurality of three-dimensional orientations in the 3D site model; and
wherein, when one of the surfaces of the 3D site model is positioned so as to be an occlusion between the sensor and an occluded area that, absent the occlusion, would be covered by said sensor, said image is rendered so that the occluded area has a texture that differentiates it from the coverage area.
Dependent claims: 6, 7, 8, 26, 27.

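One common real-time way to derive the covered/occluded split that claims 1 and 5 describe is a shadow-map-style depth comparison, shown here in a 2D plan-view miniature. This technique is an editorial illustration, not something the claims mandate: the first pass records the nearest surface sample per viewing direction, and the second pass labels each sample covered, occluded, or outside.

```python
import math

def angle_bin(sensor_pos, point, fov_deg, n_bins):
    """Angular bin of `point` as seen from a sensor facing +x,
    or None if the point is outside the field of view."""
    ang = math.degrees(math.atan2(point[1] - sensor_pos[1],
                                  point[0] - sensor_pos[0]))
    if abs(ang) > fov_deg / 2:
        return None
    return min(int((ang + fov_deg / 2) / fov_deg * n_bins), n_bins - 1)

def classify(sensor_pos, points, fov_deg=90.0, max_range=30.0, n_bins=64):
    """Points stand in for surface samples of the 3D site model."""
    # Pass 1: build a 1D "depth map" -- nearest in-range hit per angular bin.
    depth = [math.inf] * n_bins
    cached = []
    for p in points:
        b = angle_bin(sensor_pos, p, fov_deg, n_bins)
        d = math.dist(sensor_pos, p)
        cached.append((b, d))
        if b is not None and d <= max_range:
            depth[b] = min(depth[b], d)
    # Pass 2: a point is covered if it is the nearest surface in its
    # direction, occluded if something nearer occupies the same direction.
    labels = []
    for b, d in cached:
        if b is None or d > max_range:
            labels.append("outside")
        elif d <= depth[b] + 1e-9:
            labels.append("covered")
        else:
            labels.append("occluded")
    return labels

# Sensor at the origin facing +x; the nearer point hides the farther one.
pts = [(5.0, 0.0), (12.0, 0.1), (10.0, 25.0)]
print(classify((0.0, 0.0), pts))   # ['covered', 'occluded', 'outside']
```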
9. A computer-readable medium having stored thereon a plurality of instructions, the plurality of instructions including instructions which, when executed by a processor, cause the processor to perform the steps comprising:
positioning at least one sensor in a scene of a 3D model, said 3D model including data defining a plurality of surfaces forming objects in the 3D model, said surfaces each being oriented at different three-dimensional orientations and/or locations in the scene; and
rendering dynamically images of said sensor in the scene of said 3D site model in accordance with sensor parameters associated with said sensor, wherein said rendering renders an image including a view of at least one of the surfaces that has an area covered by said sensor in accordance with said sensor parameters;
wherein the images are from one or more viewpoints, none of which is that of the sensor;
wherein in said rendering the area covered by said sensor in accordance with said sensor parameters is rendered such that surfaces in the images of the scene of the 3D model that are covered by the sensor have a texture that differentiates said surfaces from surfaces in the rendered images that are not covered by the sensor, said surfaces in said image being oriented at a plurality of different respective three-dimensional orientations; and
wherein, when one of the surfaces of the 3D site model is positioned so as to be an occlusion between the sensor and an occluded area that, absent the occlusion, would be covered by said sensor, said image is rendered so that the occluded area has a texture that differentiates it from the area covered by said sensor.
Dependent claims: 10, 11, 28, 29.

12. Apparatus for dynamic sensor placement, said apparatus comprising:
means for positioning at least one sensor in a scene of a 3D model; and
means for rendering dynamically images of said sensor within the scene of said 3D site model in accordance with sensor parameters associated with said at least one sensor, and for displaying said images to a user;
wherein the images are from one or more viewpoints, none of which is that of the sensor;
wherein in said rendering the area covered by said sensor in accordance with said sensor parameters is rendered such that surfaces in the image of the scene of the 3D model that are covered by the sensor have a texture that differentiates said surfaces from surfaces in the rendered images that are not covered by the sensor, said surfaces in said image being oriented at a plurality of different respective three-dimensional orientations; and
wherein, when one of the surfaces of the 3D site model is positioned so as to be an occlusion between the sensor and an occluded area that, absent the occlusion, would be covered by said sensor, said image is rendered so that the occluded area has a texture that differentiates it from the coverage area.
Dependent claims: 13, 30, 31.

14. A method for placing a plurality of surveillance cameras in a site, said method comprising:
providing on a computer scene data of a 3D model of the site;
providing to said computer position data defining discrete positions for each of a plurality of cameras in said 3D model, each camera being associated with data defining viewing parameters defining coverage thereof;
rendering with said computer an image of the site from a viewpoint based on said 3D model, said image showing at least a part of the coverage of at least one of the cameras in said 3D model determined from the position data for said camera and the viewing parameters thereof, wherein the coverage is marked in the image with a texture applied to surfaces in the 3D model in said coverage, said surfaces being disposed in the 3D site in a plurality of different three-dimensional orientations; and
displaying said image so as to be viewed by a user;
wherein in said rendering the texture applied to surfaces in the 3D model in each of said coverages is a pattern that indicates resolution of the view thereof by the associated camera.
Dependent claims: 15, 16, 17, 32.

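Claim 14's distinguishing limitation is that the coverage texture is a pattern indicating the camera's viewing resolution of each surface. One simple resolution model (an editorial assumption, not the patent's formula): pixels per meter fall off with distance and with obliquity between the viewing ray and the surface normal, and the value is bucketed into pattern classes. The threshold values and pattern names below are placeholders.

```python
import math

def pixels_per_meter(cam_pos, cam_hfov_deg, image_width_px, point, normal):
    """Approximate on-surface resolution: image width spread over the
    frustum's width at the point's distance, scaled by foreshortening."""
    ray = tuple(p - c for p, c in zip(point, cam_pos))
    d = math.sqrt(sum(c * c for c in ray))
    base = image_width_px / (2.0 * d * math.tan(math.radians(cam_hfov_deg) / 2))
    # Foreshortening: a surface viewed edge-on resolves to ~0 px/m.
    nray = tuple(c / d for c in ray)
    cos_inc = abs(sum(a * b for a, b in zip(nray, normal)))
    return base * cos_inc

def resolution_pattern(px_per_m):
    """Bucket the value into the texture patterns the claim calls for
    (thresholds are illustrative placeholders)."""
    if px_per_m >= 100: return "fine-hatch"
    if px_per_m >= 30:  return "medium-hatch"
    if px_per_m >= 10:  return "coarse-hatch"
    return "sparse-dots"

# Floor point 8 m out, seen by a camera 3 m up with a 60 deg, 1920 px view.
ppm = pixels_per_meter((0, 0, 3), 60.0, 1920, (8, 0, 0), (0, 0, 1))
print(round(ppm, 1), resolution_pattern(ppm))   # ~68.3 medium-hatch
```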
18. A method for placing a plurality of surveillance cameras in a site, said method comprising:
providing on a computer scene data of a 3D model of the site;
providing to said computer position data defining discrete positions for each of a plurality of cameras in said 3D model, each camera being associated with data defining viewing parameters defining a coverage area thereof;
rendering with said computer an image of the site from a viewpoint based on said 3D model, said image showing at least a part of the coverage area of at least one of the cameras in said 3D model determined from the position data for said camera and the viewing parameters thereof, wherein the coverage area is marked in the image with a texture applied to surfaces in the 3D model in said coverage, said surfaces being disposed in the 3D site in a plurality of different three-dimensional orientations; and
displaying said image so as to be viewed by a user;
wherein the rendering of said image includes ray tracing between the viewpoint and a point on a surface in the 3D model and ray tracing between the point on the surface in the 3D model and each of the cameras, said point being displayed as in the coverage area when said ray tracings do not encounter any occlusion in the 3D model between said point on said surface and at least one of the cameras, and being displayed as outside the coverage area when there is an occlusion between the point and all of said cameras.
Dependent claims: 19, 33, 34.

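Claim 18 spells out the occlusion test concretely: trace a ray from the rendering viewpoint to a surface point, then from that point to each camera; the point is drawn as covered if at least one camera's ray is unobstructed. A minimal sketch, with spherical occluders standing in for the model's real surface geometry:

```python
import math

# Occluders as spheres for a compact intersection test (an illustrative
# choice; the patent's 3D model would supply real surfaces).
OCCLUDERS = [((6.0, 0.0, 1.5), 1.0)]   # (center, radius)

def segment_hits_sphere(a, b, center, radius):
    """Does segment a->b pass through the sphere? Closest-point test."""
    ab = tuple(bb - aa for aa, bb in zip(a, b))
    ac = tuple(cc - aa for aa, cc in zip(a, center))
    ab2 = sum(c * c for c in ab)
    t = max(0.0, min(1.0, sum(x * y for x, y in zip(ab, ac)) / ab2))
    closest = tuple(aa + t * d for aa, d in zip(a, ab))
    return math.dist(closest, center) < radius

def point_coverage(viewpoint, cameras, surface_point):
    """Claim 18's test: trace viewpoint->point, then point->each camera.
    The point is drawn as covered if it is visible from the viewpoint and
    at least one camera reaches it without hitting an occluder."""
    def clear(a, b):
        return not any(segment_hits_sphere(a, b, c, r) for c, r in OCCLUDERS)
    if not clear(viewpoint, surface_point):
        return "not visible in this image"
    if any(clear(surface_point, cam) for cam in cameras):
        return "covered"
    return "occluded"

# The sphere blocks the first camera, but the second still sees the point.
cams = [(0.0, 0.0, 3.0), (12.0, 5.0, 3.0)]
print(point_coverage((0.0, -10.0, 6.0), cams, (8.0, 0.0, 0.5)))  # covered
```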
20. A method for dynamic sensor placement comprising:
positioning at least one sensory device in a scene of a 3D site model supported in a computer, said 3D site model including data defining a plurality of surfaces in three dimensions making up a plurality of objects in the site model;
rendering in said computer an image of at least part of said scene of the 3D site model in which at least part of a coverage of said at least one sensory device within the scene of said 3D site model is displayed, and part of the scene of the 3D site model outside said coverage is displayed, said coverage being derived in accordance with sensor parameters associated with said at least one sensory device;
said rendering of said image being derived for a view point in said 3D site model that is different from the position of said sensory device;
wherein said rendering step renders the coverage of said sensor in accordance with said sensor parameters such that surfaces in the 3D site model in said image have a texture that differentiates the coverage from the part of the scene that is not in said coverage, said surfaces in said image being disposed in the site model at a plurality of different respective three-dimensional orientations; and
wherein the texture of the coverage is a pattern indicative of the resolution of the sensory device's coverage of the surfaces.
Dependent claims: 35.

21. A method for dynamic sensor placement comprising:
selecting a 3D site model supported in a computer, said 3D site model including data defining a plurality of surfaces in a scene;
selecting a sensor for placement into said 3D site model; and
rendering said sensor within the scene of said 3D site model in accordance with sensor parameters associated with said sensor, said rendering being performed by said computer for a point of view other than the location of the sensor, and including preparing an image of the scene from said point of view that includes at least part of a coverage area for said sensor, said coverage area being made up of the surfaces or parts of the surfaces that are covered by the sensor as derived in accordance with the 3D site model and the sensor parameters, and also includes a portion of the 3D site model that is not in said coverage area;
wherein said rendering step renders the coverage area covered by said sensor in accordance with said sensor parameters such that the surfaces in the scene of the 3D model that constitute the coverage area have a texture that differentiates said surfaces from the surfaces in the scene that are not in said coverage area, said surfaces that constitute the coverage area being oriented in a plurality of three-dimensional orientations in the 3D site model; and
wherein the texture of the coverage area is a pattern indicative of resolution of the coverage of the sensor on the respective surface.
Dependent claims: 36.

22. A computer-readable medium having stored thereon a plurality of instructions, the plurality of instructions including instructions which, when executed by a processor, cause the processor to perform the steps comprising:
positioning at least one sensor in a scene of a 3D model, said 3D model including data defining a plurality of surfaces forming objects in the 3D model, said surfaces each being oriented at different three-dimensional orientations and/or locations in the scene; and
rendering dynamically images of said sensor in the scene of said 3D site model in accordance with sensor parameters associated with said sensor, wherein said rendering renders an image including a view of at least one of the surfaces that has an area covered by said sensor in accordance with said sensor parameters;
wherein the images are from one or more viewpoints, none of which is that of the sensor;
wherein in said rendering the area covered by said sensor in accordance with said sensor parameters is rendered such that surfaces in the images of the scene of the 3D model that are covered by the sensor have a texture that differentiates said surfaces from surfaces in the rendered images that are not covered by the sensor, said surfaces in said image being oriented at a plurality of different respective three-dimensional orientations; and
wherein the texture of the area covered by the sensor is a pattern indicative of resolution of the coverage of the sensor on the respective surface.
Dependent claims: 37.

23. Apparatus for dynamic sensor placement, said apparatus comprising:
means for positioning at least one sensor in a scene of a 3D model; and
means for rendering dynamically images of said sensor within the scene of said 3D site model in accordance with sensor parameters associated with said at least one sensor, and for displaying said images to a user;
wherein the images are from one or more viewpoints, none of which is that of the sensor;
wherein in said rendering the area covered by said sensor in accordance with said sensor parameters is rendered such that surfaces in the images of the scene of the 3D model that are covered by the sensor have a texture that differentiates said surfaces from surfaces in the rendered images that are not covered by the sensor, said surfaces in said image being oriented at a plurality of different respective three-dimensional orientations; and
wherein the texture of the area covered by the sensor is a pattern indicative of resolution of the coverage of the sensor on the respective surface.
Dependent claims: 38.