Automated music composition and generation system driven by emotion-type and style-type musical experience descriptors
First Claim
1. An automated music composition and generation system comprising:
a system user interface for enabling system users to create a project for a digital piece of music to be composed and generated, and review and select one or more emotion-type musical experience descriptors, one or more style-type musical experience descriptors, as well as time and/or space parameters; and
an automated music composition and generation engine, operably connected to said system user interface, for receiving, storing and processing said emotion-type and style-type musical experience descriptors and time and/or space parameters selected by the system user;
wherein said automated music composition and generation engine includes a plurality of function-specific subsystems cooperating together to automatically compose and generate one or more digital pieces of music in response to said emotion-type and style-type musical experience descriptors and time and/or space parameters selected by said system user;
wherein said digital piece of music composed and generated has a rhythmic landscape and a pitch landscape and contains a set of musical notes arranged and performed using an orchestration of one or more musical instruments selected for said digital piece of music being composed;
wherein said plurality of function-specific subsystems include a rhythmic landscape subsystem, a pitch landscape subsystem, and a digital piece creation subsystem;
wherein said rhythmic landscape subsystem is configured to generate and manage the rhythmic landscape of said digital piece of music being composed;
wherein said pitch landscape subsystem is configured to generate and manage the pitch landscape of said digital piece of music being composed;
wherein said digital piece creation subsystem is configured for creating said digital piece of music, employing one or more automated music synthesis techniques, and delivering said digital piece of music to said system user interface; and
wherein said digital piece of music composed and generated has the emotional and stylistic characteristics expressed throughout the rhythmic and pitch landscapes of said digital piece of music as represented by said set of emotion-type and style-type musical experience descriptors and time and/or space parameters supplied by said system user to said automated music composition and generation engine.
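The subsystem architecture recited in claim 1 (an engine coordinating a rhythmic landscape subsystem, a pitch landscape subsystem, and a digital piece creation subsystem) can be illustrated with a minimal sketch. All class names, mappings, and values below are hypothetical illustrations, not part of the patent; the claim does not define an API or any particular descriptor-to-music mapping.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MusicSpec:
    """Hypothetical model of the user's musical experience descriptors."""
    emotions: List[str]      # emotion-type descriptors, e.g. "happy"
    styles: List[str]        # style-type descriptors, e.g. "pop"
    duration_seconds: float  # time parameter for the piece

class RhythmicLandscapeSubsystem:
    """Generates and manages the rhythmic landscape (tempo, meter, timing)."""
    def generate(self, spec: MusicSpec) -> dict:
        tempo = 140 if "happy" in spec.emotions else 80  # toy mapping
        return {"tempo_bpm": tempo, "meter": "4/4"}

class PitchLandscapeSubsystem:
    """Generates and manages the pitch landscape (key, scale, contour)."""
    def generate(self, spec: MusicSpec) -> dict:
        key = "C major" if "happy" in spec.emotions else "A minor"
        return {"key": key}

class DigitalPieceCreationSubsystem:
    """Renders the piece with a chosen orchestration and delivers it."""
    def create(self, rhythm: dict, pitch: dict, spec: MusicSpec) -> dict:
        return {
            "rhythm": rhythm,
            "pitch": pitch,
            "instruments": ["piano"],  # orchestration selection
            "duration_seconds": spec.duration_seconds,
        }

class CompositionEngine:
    """Coordinates the function-specific subsystems in sequence."""
    def __init__(self):
        self.rhythm = RhythmicLandscapeSubsystem()
        self.pitch = PitchLandscapeSubsystem()
        self.creator = DigitalPieceCreationSubsystem()

    def compose(self, spec: MusicSpec) -> dict:
        r = self.rhythm.generate(spec)
        p = self.pitch.generate(spec)
        return self.creator.create(r, p, spec)

engine = CompositionEngine()
piece = engine.compose(MusicSpec(emotions=["happy"], styles=["pop"],
                                 duration_seconds=30.0))
print(piece["rhythm"]["tempo_bpm"], piece["pitch"]["key"])  # prints: 140 C major
```

The sketch only shows the claimed data flow: user descriptors enter the engine, the rhythmic and pitch landscape subsystems each derive their portion of the piece, and the digital piece creation subsystem assembles and delivers the result.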
Abstract
An automated music composition and generation system for automatically composing and generating digital pieces of music using an automated music composition and generation engine driven by a set of emotion-type and style-type musical experience descriptors and time and/or space parameters supplied by a system user during an automated music composition and generation process. The system includes a system user interface allowing a system user to input (i) linguistic and/or graphical-icon-based musical experience descriptors, and (ii) a video, audio recording, image, slide-show, or event marker.
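The abstract describes two kinds of user input: experience descriptors and an optional piece of media the music is scored to. A small sketch of that input contract follows; the type names and the validation rule are illustrative assumptions, not defined by the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

# Media kinds named in the abstract.
MEDIA_KINDS = ("video", "audio-recording", "image", "slide-show", "event-marker")

@dataclass
class ProjectInput:
    """What a system user supplies through the system user interface."""
    descriptors: List[str]            # linguistic and/or icon-based descriptors
    media_kind: Optional[str] = None  # one of MEDIA_KINDS, if media is attached
    media_path: Optional[str] = None  # location of the attached media

def validate(inp: ProjectInput) -> bool:
    """Require at least one descriptor; any media must declare a known kind."""
    if inp.media_path is not None and inp.media_kind not in MEDIA_KINDS:
        return False
    return bool(inp.descriptors)

print(validate(ProjectInput(descriptors=["happy", "pop"],
                            media_kind="video",
                            media_path="clip.mp4")))  # prints: True
```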
182 Citations
24 Claims
1. An automated music composition and generation system comprising: (claim 1 is set forth in full under First Claim above) - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17)
18. An automated music composition and generation system comprising:
an automated music composition and generation engine realized on a semiconductor chip containing electronic circuits; and
a system user interface allowing a system user to automatically compose and generate a digital piece of music by providing one or more emotion-type musical experience descriptors, one or more style-type musical experience descriptors, as well as time and/or space parameters to said system user interface;
wherein said automated music composition and generation engine is operably connected to said system user interface, for receiving, storing and processing said emotion-type and style-type musical experience descriptors and time and/or space parameters selected by said system user;
wherein said automated music composition and generation engine includes a plurality of function-specific subsystems cooperating together to automatically compose and generate one or more digital pieces of music in response to said emotion-type and style-type musical experience descriptors and time and/or space parameters selected by said system user;
wherein said digital piece of music composed and generated has a rhythmic landscape and a pitch landscape and contains a set of musical notes arranged and performed using an orchestration of one or more musical instruments selected for said digital piece of music;
wherein said plurality of function-specific subsystems include a rhythmic landscape subsystem, a pitch landscape subsystem, and a digital piece creation subsystem;
wherein said rhythmic landscape subsystem is configured to generate and manage the rhythmic landscape of said digital piece of music being composed;
wherein said pitch landscape subsystem is configured to generate and manage the pitch landscape of said digital piece of music being composed;
wherein said digital piece creation subsystem is configured for creating said digital piece of music, employing one or more automated music synthesis techniques, and delivering said digital piece of music to said system user interface; and
wherein said digital piece of music composed and generated has the emotional and stylistic characteristics expressed throughout the rhythmic and pitch landscapes of said digital piece of music as represented by said set of emotion-type and style-type musical experience descriptors and time and/or space parameters supplied by said system user to said automated music composition and generation engine. - View Dependent Claims (19, 20, 21, 22, 23, 24)
Specification