Cybernetic 3D music visualizer
Abstract
3D music visualization process employing a novel method of real-time reconfigurable control of 3D geometry and texture, driven by blended control combinations of software oscillators, computer keyboard and mouse, audio spectrum, control recordings, and the MIDI protocol. The method includes a programmable visual attack, decay, sustain and release (V-ADSR) transfer function applicable to all degrees of freedom of the 3D output parameters, enhancing even binary control inputs with continuous and aesthetic spatio-temporal symmetries of behavior.
A “Scene Nodes Graph” for authoring content acts as a hierarchical, object-oriented graphical interpreter for defining 3D models and their textures, as well as for flexibly defining how the control source blend(s) are connected, or “Routed”, to those objects. An “Auto-Builder” simplifies Scene construction by auto-inserting and auto-routing Scene Objects. The Scene Nodes Graph also includes means for real-time modification of the control scheme structure itself, and supports direct real-time keyboard/mouse adjustment of all parameters of all input control sources and all output objects. Dynamic control schemes are also supported, such as control sources modifying the Routing and parameters of other control sources.
An Auto-Scene-Creator feature generates scenes automatically by exercising the full range of the visualizer’s variables, yielding a nearly unbounded set of scenes. A Realtime-Network-Updater feature allows multiple local and/or remote users to co-create scenes simultaneously in real time, with universal variables interactively updated across a networked community environment, thus enabling scene co-creation in a global environment. In terms of human subjective perception, the method creates, enhances and amplifies multiple forms of both passive and interactive synesthesia.
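To make the V-ADSR idea concrete, here is a minimal illustrative sketch (not the patented implementation) of how a binary trigger such as a key press could be shaped into a continuous envelope and applied to one degree of freedom of a 3D parameter. The function name, time constants, and the scale parameter are all hypothetical assumptions for illustration:

```python
# Hypothetical sketch of a "visual ADSR" (V-ADSR) transfer function: a binary
# trigger (e.g. a key press) is shaped into a continuous envelope in [0, 1]
# that can modulate any 3D output parameter. Names and constants are
# illustrative, not taken from the patent.

def v_adsr(t, gate_off_t, attack=0.1, decay=0.2, sustain=0.6, release=0.5):
    """Envelope value at time t (seconds) since the trigger fired.

    gate_off_t: time at which the binary input was released (None = still held).
    """
    if gate_off_t is None or t < gate_off_t:
        # Gate held: linear attack, then decay toward the sustain level.
        if t < attack:
            return t / attack
        if t < attack + decay:
            frac = (t - attack) / decay
            return 1.0 - frac * (1.0 - sustain)
        return sustain
    # Gate released: linear release from the level held at release time.
    level_at_off = v_adsr(gate_off_t, None, attack, decay, sustain, release)
    frac = (t - gate_off_t) / release
    return max(0.0, level_at_off * (1.0 - frac))

# Apply the envelope to one degree of freedom, e.g. a model's scale.
base_scale = 1.0
scale = base_scale + 0.5 * v_adsr(0.05, None)  # mid-attack -> 1.25
```

Even a one-bit input (pressed/released) thus produces a smooth, symmetric rise-and-fall in the output parameter, which is the behavior the abstract attributes to the V-ADSR transfer function.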
The method utilizes transfer functions providing multiple forms of applied symmetry in the control feedback process, yielding an increased level of perceived visual harmony and beauty. The method enables a substantially increased number of both passive and human-interactive interpenetrating control/feedback processes to be employed simultaneously within the same audio-visual perceptual space, while maintaining distinct recognition of each and reducing the ergonomic effort required to distinguish them even when coexistent. Taken together, these novel features of the invention can be employed (by means of considered Scene content construction) to realize an increased density of “orthogonal features” in cybernetic multimedia content. This in turn increases the maximum number of human players who can simultaneously participate in shared interactive music visualization content while each still retains relatively clear perception of their own control/feedback parameters.
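The “matrix of input-output control transfer functions” recited in the claims can be pictured as a routing table mapping weighted blends of control sources (oscillators, audio spectrum, MIDI, keyboard/mouse) onto output object parameters. The sketch below is an illustrative assumption of one such blend; all source names, parameter names, and weights are hypothetical:

```python
import math

# Illustrative sketch (not the patented implementation) of routing weighted
# blends of control sources to output 3D parameters. Names are hypothetical.

def lfo(t, hz=0.5):
    """A software-oscillator control source, normalized to [0, 1]."""
    return 0.5 + 0.5 * math.sin(2.0 * math.pi * hz * t)

def sample_sources(t, audio_level, midi_cc):
    """Sample all control sources at time t; audio_level and midi_cc would
    come from the audio spectrum analyzer and MIDI input in a real system."""
    return {"lfo": lfo(t), "audio": audio_level, "midi": midi_cc / 127.0}

# Routing matrix: output parameter -> {control source: blend weight}.
routing = {
    "model.rotation_y": {"lfo": 1.0},
    "texture.brightness": {"audio": 0.8, "midi": 0.2},
    "particles.emit_rate": {"audio": 1.0},
}

def evaluate(sources, routing):
    """Blend each parameter's routed sources into one control value."""
    return {
        param: sum(weight * sources[name] for name, weight in weights.items())
        for param, weights in routing.items()
    }

params = evaluate(sample_sources(t=0.5, audio_level=0.25, midi_cc=64), routing)
```

Because the routing table is plain data, it can itself be edited at runtime, which is one way to read the abstract’s claim that control sources may modify the Routing and parameters of other control sources.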
12 Claims
1. A visualization method for real-time modulation of visual object parameters of a 3D computer graphics animation, the method comprising:
a. A real-time software runtime interpreter having one or more visualizer 3D ‘scenes’ comprised of a matrix of input-output control transfer functions loaded into RAM prior to runtime from an external non-volatile data store;
b. Loading of a plurality of 3D resources from an external data store prior to runtime into RAM data utilized by the interpreter and applying and modulating such resources during runtime in the output 3D visual space;
c. Production and output of 3D animation modulations and effects that are precisely synchronized with simultaneously presented musical content;
d. Allowing of simultaneous real-time control inputs from a plurality of control sources;
e. Allowing of simultaneous modulation of a plurality of 3D objects and their parameters including 3D spatial geometry of models, 3D applied surface textures, 3D particles and video effects;
f. Production in real-time of visualizer outputs on a primary display device or window of either a 2D (CRT or other panel) or 3D (stereoscopic or volumetric) type;
g. Input of streaming digital video resource in real-time into the interpreter and applying and modulating such resources at runtime in the output 3D visual space;
- View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
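Elements (a) and (b) of claim 1 describe a lifecycle in which scene definitions and 3D resources are loaded into RAM before runtime, so that the real-time loop touches only RAM-resident data. A minimal sketch of that lifecycle, under the assumption of a JSON scene format and illustrative field names (none of which are specified by the claims):

```python
import json

# Minimal sketch of the claimed lifecycle: scenes load from non-volatile
# storage into RAM prior to runtime; the interpreter's real-time loop then
# reads only RAM-resident data. The JSON format and field names are
# illustrative assumptions, not the patent's format.

def load_scene(text):
    """Parse a scene definition into a RAM-resident dict prior to runtime."""
    scene = json.loads(text)
    scene.setdefault("objects", [])
    return scene

def render_frame(scene, controls):
    """One real-time step: apply control values to each object's parameters."""
    frame = {}
    for obj in scene["objects"]:
        source = obj["routed_from"]
        frame[obj["name"]] = obj["base"] + obj["depth"] * controls.get(source, 0.0)
    return frame

scene = load_scene('{"objects": [{"name": "cube.scale", '
                   '"routed_from": "audio", "base": 1.0, "depth": 0.5}]}')
frame = render_frame(scene, {"audio": 0.4})  # cube.scale -> 1.2
```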
Specification