System and method for capture and rendering of performance on synthetic string instrument
First Claim
1. A method comprising:
using a first portable computing device as a synthetic string instrument;
presenting on a multi-touch sensitive display of the portable computing device, and in correspondence with a musical score, temporally synchronized visual cues relative to respective strings of the synthetic string instrument;
capturing user gestures indicative of length of respective strings of the synthetic string instrument from data sampled in correspondence with respective finger contacts with the multi-touch sensitive display along visual depictions of the respective strings;
capturing user gestures indicative of excitation of at least one of the strings;
encoding a gesture stream for a performance of the user by parameterizing at least a subset of the string length and string excitation indicative user gestures; and
audibly rendering the performance on the portable computing device using the encoded gesture stream as an input to a digital synthesis of the synthetic string instrument executing on the first portable computing device, wherein the captured gesture stream, and not the musical score itself, drives the digital synthesis.
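The claim's core capture-and-encode steps can be illustrated with a short sketch. This is not code from the patent; the constants, field names, and the inverse-length pitch model are illustrative assumptions only. It shows one way a finger contact along a string's visual depiction might be parameterized into a timestamped gesture-stream event.

```python
import time

# Hypothetical values (assumptions, not from the patent): open-string
# frequencies for four strings and the on-screen string length in pixels.
OPEN_STRING_HZ = {0: 196.00, 1: 293.66, 2: 440.00, 3: 659.25}
STRING_LENGTH_PX = 800.0

def contact_to_pitch(string_index, contact_y_px):
    """Map a finger contact along the string depiction to a pitch.

    A shorter vibrating length raises pitch; here frequency is modeled
    as inversely proportional to the remaining string length.
    """
    remaining = max(1.0, STRING_LENGTH_PX - contact_y_px)
    return OPEN_STRING_HZ[string_index] * (STRING_LENGTH_PX / remaining)

def encode_event(stream, string_index, contact_y_px, t=None):
    """Parameterize one sampled finger contact into the gesture stream."""
    stream.append({
        "t": time.monotonic() if t is None else t,
        "string": string_index,
        "freq_hz": contact_to_pitch(string_index, contact_y_px),
    })

stream = []
encode_event(stream, 2, 0.0, t=0.0)    # open string: 440 Hz
encode_event(stream, 2, 400.0, t=0.5)  # stopped at half length: 880 Hz
```

The resulting list of parameterized events, rather than any score data, would then be handed to the synthesis engine, consistent with the claim's "gesture stream ... drives the digital synthesis" limitation.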
Abstract
Synthetic multi-string musical instruments have been developed for capturing and rendering musical performances on handheld or other portable devices in which a multi-touch sensitive display provides one of the input vectors for an expressive performance by a user or musician. Visual cues may be provided on the multi-touch sensitive display to guide the user in a performance based on a musical score. Alternatively, or in addition, uncued freestyle modes of operation may be provided. In either case, it is not the musical score that drives digital synthesis and audible rendering of the synthetic multi-string musical instrument. Rather, it is the stream of user gestures captured at least in part using the multi-touch sensitive display that drives the digital synthesis and audible rendering.
36 Claims
1. A method comprising:
using a first portable computing device as a synthetic string instrument;
presenting on a multi-touch sensitive display of the portable computing device, and in correspondence with a musical score, temporally synchronized visual cues relative to respective strings of the synthetic string instrument;
capturing user gestures indicative of length of respective strings of the synthetic string instrument from data sampled in correspondence with respective finger contacts with the multi-touch sensitive display along visual depictions of the respective strings;
capturing user gestures indicative of excitation of at least one of the strings;
encoding a gesture stream for a performance of the user by parameterizing at least a subset of the string length and string excitation indicative user gestures; and
audibly rendering the performance on the portable computing device using the encoded gesture stream as an input to a digital synthesis of the synthetic string instrument executing on the first portable computing device, wherein the captured gesture stream, and not the musical score itself, drives the digital synthesis.
(Dependent claims: 2-14)
15. A method comprising:
using a first portable computing device as a synthetic string instrument;
capturing user gestures relative to respective strings of the synthetic string instrument from data sampled in correspondence with respective finger contacts with a multi-touch sensitive display of the portable computing device;
capturing user gestures indicative of bow traversal of at least one of the strings;
encoding a gesture stream for a performance of the user by parameterizing at least a subset of events captured from the finger contacts and bow traversal;
audibly rendering the performance on the portable computing device using the encoded gesture stream as an input to a digital synthesis of the synthetic string instrument executing on the first portable computing device; and
presenting on the multi-touch sensitive display, and in correspondence with a musical score, temporally synchronized visual cues to guide the user's gestures relative to the respective strings of the synthetic string instrument, wherein the gesture stream captured from the user's performance, and not the musical score itself, drives the digital synthesis.
(Dependent claims: 16-33)
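The "digital synthesis of the synthetic string instrument" recited above is not specified by the claim. As a purely illustrative sketch, one well-known technique for synthesizing a plucked or excited string is Karplus-Strong synthesis (a noise burst circulated through a lowpass-filtered delay line); the function and field names below are assumptions, not the patent's.

```python
import random

def karplus_strong(freq_hz, sample_rate=44100, duration_s=0.5, seed=0):
    """Karplus-Strong string synthesis: a noise burst is fed through a
    delay line whose feedback path averages adjacent samples (a lowpass),
    with slight damping so the tone decays."""
    rng = random.Random(seed)
    period = int(sample_rate / freq_hz)      # delay length sets the pitch
    buf = [rng.uniform(-1.0, 1.0) for _ in range(period)]
    out = []
    for i in range(int(sample_rate * duration_s)):
        s = buf[i % period]
        nxt = buf[(i + 1) % period]
        buf[i % period] = 0.5 * (s + nxt) * 0.996  # average + damping
        out.append(s)
    return out

def render(gesture_stream, sample_rate=44100):
    """Render each excitation event in the encoded gesture stream.

    Note that the stream of captured gestures, not any score, determines
    what is synthesized."""
    return [karplus_strong(ev["freq_hz"], sample_rate) for ev in gesture_stream]

notes = render([{"t": 0.0, "freq_hz": 440.0}, {"t": 0.5, "freq_hz": 880.0}])
```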
34. An apparatus comprising:
a portable computing device having a multi-touch display interface; and
machine readable code executable on the portable computing device to implement the synthetic musical instrument, the machine readable code including instructions executable to capture both:
(i) user gestures relative to respective strings of the synthetic string instrument from data sampled in correspondence with respective finger contacts with a multi-touch sensitive display of the portable computing device and (ii) user gestures indicative of bow traversal of at least one of the strings, and to encode a gesture stream for a performance of the user by parameterizing at least a subset of events captured from the finger contacts and bow traversal; and
the machine readable code further executable to audibly render the performance on the portable computing device using the encoded gesture stream as an input to a digital synthesis of the synthetic string instrument executing on the portable computing device;
wherein the apparatus is configured to wirelessly communicate with a second portable computing device proximate thereto, the second portable computing device including either or both of a multi-axis accelerometer and a gyroscopic sensor for capture of orientation and motion dynamics of the second portable computing device, and machine readable code executable on the second portable computing device to compute and wirelessly communicate to the first portable computing device the bow traversal gestures based on the captured orientation and motion dynamics.
(Dependent claim: 35)
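Claim 34's second device computes bow traversal gestures from captured motion dynamics and transmits them wirelessly. A minimal sketch of that computation, assuming accelerometer samples along a single bowing axis and an invented JSON wire format (the axis choice, field names, and message shape are all hypothetical):

```python
import json

def bow_velocity(accel_samples, dt):
    """Approximate bow velocity (m/s) along the traversal axis by
    integrating acceleration samples (m/s^2) over fixed timesteps."""
    v, trace = 0.0, []
    for a in accel_samples:
        v += a * dt
        trace.append(v)
    return trace

def bow_gesture_message(velocities, dt):
    """Package the computed traversal as a message the first device's
    synthesis engine could consume (hypothetical wire format)."""
    return json.dumps({"type": "bow", "dt": dt, "v": velocities})

# Bow accelerates, coasts, then decelerates.
trace = bow_velocity([1.0, 1.0, 0.0, -1.0], dt=0.01)
msg = bow_gesture_message(trace, 0.01)
```

In practice such dead-reckoning integration drifts, so a real implementation would likely fuse accelerometer and gyroscope data, but the sketch captures the claim's compute-then-communicate structure.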
36. A computer program product encoded in media and including instructions executable to implement a synthetic musical instrument on a portable computing device having a multi-touch display interface, the computer program product encoding and comprising:
instructions executable to capture both:
(i) user gestures relative to respective strings of the synthetic string instrument from data sampled in correspondence with respective finger contacts with a multi-touch sensitive display of the portable computing device and (ii) user gestures indicative of bow traversal of at least one of the strings, and to encode a gesture stream for a performance of the user by parameterizing at least a subset of events captured from the finger contacts and bow traversal;
further instructions executable to audibly render the performance on the portable computing device using the encoded gesture stream as an input to a digital synthesis of the synthetic string instrument executing on the portable computing device; and
further instructions executable to present on the multi-touch sensitive display, and in correspondence with a musical score, temporally synchronized visual cues to guide the user's gestures relative to the respective strings of the synthetic string instrument, wherein the gesture stream captured from the user's performance, and not the musical score itself, drives the digital synthesis.
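The "temporally synchronized visual cues ... in correspondence with a musical score" limitation implies mapping score time to display time. As an illustrative sketch only (the lead-time parameter and function name are assumptions, not the patent's), cue display times could be derived from note onsets and tempo, with each cue surfaced slightly before its note is due:

```python
def cue_times(onsets_beats, tempo_bpm, lead_s=0.5):
    """Convert score note onsets (in beats) to cue display times (in
    seconds), showing each cue lead_s seconds ahead of the note."""
    sec_per_beat = 60.0 / tempo_bpm
    return [max(0.0, b * sec_per_beat - lead_s) for b in onsets_beats]

times = cue_times([0, 1, 2, 4], tempo_bpm=120)  # -> [0.0, 0.0, 0.5, 1.5]
```

The score thus times the cues shown to the user, while, per the claims, only the gestures the user actually makes reach the synthesis engine.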
Specification