Situated simulation for training, education, and therapy
First Claim
1. A motion suit based responsive training system, comprising:

a motion suit; and

a training device comprising:

a display screen,

a non-transitory computer readable medium structured to store a plurality of instructions, and

a microprocessor structured to execute the plurality of instructions, wherein the plurality of instructions, when executed by the microprocessor, cause the microprocessor to:

generate an avatar to portray a communication disorder by rendering the avatar on the display screen using electronic motion data from one or more hardware input devices including at least the motion suit;

define a set of nodes in an electronic data structure using an electronic data object, the electronic data structure corresponding to actions indicative of the communication disorder that are selectively performed by the avatar when rendered on the display screen;

electronically populate, in the electronic data structure, measures associated with the set of nodes with values that correspond to diagnostic tools selected by a user via electronic inputs to analyze the actions performed by the avatar;

transition between nodes of the set of nodes based, at least in part, on the values selected by the user via the electronic inputs in response to the actions of the avatar in order to provide adaptive training to the user by causing the avatar to be rendered on the display screen according to the selected values, wherein each of the set of nodes causes the avatar on the display screen to perform different actions; and

electronically generate and assign a score to the values selected by the user via the electronic inputs to evaluate a training competency of the user in identifying the communication disorder portrayed by the avatar when rendered on the display screen according to the electronic motion data and the electronic inputs.
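The claimed node-and-transition mechanism amounts to a small state machine: each node makes the avatar perform a different action, the user's diagnostic selection drives the transition to the next node, and correct identifications accumulate into a competency score. A minimal sketch follows; every name here (`Node`, `TrainingSession`, `select`) is an illustrative assumption, not the patent's implementation:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the claimed structure: nodes render avatar actions,
# user-selected diagnostic values drive transitions and a competency score.

@dataclass
class Node:
    node_id: str
    action: str                                       # action the avatar performs at this node
    transitions: dict = field(default_factory=dict)   # selected value -> next node_id

@dataclass
class TrainingSession:
    nodes: dict          # node_id -> Node
    current: str         # node_id of the node currently rendered
    expected: dict       # node_id -> correct diagnostic value (for scoring)
    score: int = 0

    def select(self, value: str) -> str:
        """Record the user's diagnostic selection, score it, and transition."""
        node = self.nodes[self.current]
        if self.expected.get(node.node_id) == value:
            self.score += 1                            # correct identification
        self.current = node.transitions.get(value, self.current)
        return self.nodes[self.current].action

nodes = {
    "n1": Node("n1", "avatar repeats initial syllables",
               {"stutter": "n2", "typical": "n3"}),
    "n2": Node("n2", "avatar speaks with prolonged sounds"),
    "n3": Node("n3", "avatar speaks fluently"),
}
session = TrainingSession(nodes, "n1", {"n1": "stutter"})
print(session.select("stutter"), session.score)  # -> avatar speaks with prolonged sounds 1
```

The key design point mirrored from the claim is that the same selected value is used twice: once to score the user and once to choose which node (and hence which avatar action) is rendered next, which is what makes the training adaptive.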
1 Assignment
0 Petitions
Abstract
Systems, methods, and other embodiments associated with producing an immersive training content module (ITCM) are described. One example system includes a capture logic to acquire information from which the ITCM may be produced. An ITCM may include a set of nodes, a set of measures, a logic to control transitions between nodes during a training session, and a logic to establish values for measures during the training session. Therefore, the example system may also include an assessment definition logic to define a set of measures to be included in the ITCM and an interaction logic to define a set of interactions to be included in the ITCM. The ITCM may be written to a computer-readable medium.
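The ITCM the abstract describes bundles four parts (nodes, measures, transition rules, interactions) and is ultimately written to a computer-readable medium. That assembly might be sketched as follows; the function names and the JSON layout are assumptions for illustration, not the patent's format:

```python
import json
import os
import tempfile

# Illustrative sketch only: bundle the four ITCM parts named in the abstract
# into a plain dict and serialize it to a computer-readable medium.

def build_itcm(nodes, measures, transitions, interactions):
    """Assemble an immersive training content module as a plain dict."""
    return {
        "nodes": nodes,                # scenario states the avatar can occupy
        "measures": measures,          # quantities assessed during a session
        "transitions": transitions,    # rules controlling node-to-node movement
        "interactions": interactions,  # user interactions available in the module
    }

def write_itcm(itcm, path):
    """Write the ITCM to a computer-readable medium (here, a JSON file)."""
    with open(path, "w") as fh:
        json.dump(itcm, fh)

itcm = build_itcm(
    nodes=["greeting", "conversation", "closing"],
    measures=["response_latency", "disorder_identified"],
    transitions={"greeting": "conversation", "conversation": "closing"},
    interactions=["select_diagnosis", "replay_action"],
)
path = os.path.join(tempfile.gettempdir(), "itcm_example.json")
write_itcm(itcm, path)
```

Serializing the module to a file is just one reading of "written to a computer-readable medium"; the abstract does not commit to any particular encoding.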
32 Citations
14 Claims
1. A motion suit based responsive training system (independent claim, reproduced in full under First Claim above; dependent claims 2, 3, 4, 5, 6, 7).
8. A responsive training device comprising:

an input port structured to receive data from a motion suit;

a display screen;

a processing device; and

a non-transitory computer-readable medium storing computer-executable instructions that, when executed by the processing device, cause the processing device to:

generate, using electronic motion data received from the motion suit via the input port, an avatar to portray a communication disorder by rendering the avatar on the display screen;

define a set of nodes in an electronic data structure corresponding to actions indicative of the communication disorder that are selectively performed by the avatar when rendered on the display screen, where defining the set of nodes includes defining the set of nodes using an electronic data object stored in the electronic data structure;

electronically populate measures associated with the set of nodes with values that correspond to diagnostic tools selected by a user via a user input to analyze the actions performed by the avatar displayed on the display screen;

transition between nodes based, at least in part, on the values selected by the user via the user input in response to the actions of the avatar rendered on the display screen, where transitioning between the nodes provides adaptive training to the user according to the selected values, and where each of the set of nodes causes the avatar to be rendered on the display screen performing different actions; and

electronically generate and assign a score to the values selected by the user to evaluate a training competency of the user in identifying the communication disorder portrayed by the avatar when rendered on the display screen according to the electronic motion data and the user input.

Dependent claims: 9, 10, 11, 12, 13, 14.
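Claim 8's input port receives raw motion-suit data that drives the avatar's rendering. A minimal sketch of decoding such data follows, assuming a made-up frame layout (a joint id plus three rotation angles per frame); real suits use vendor-specific protocols, so everything here is a labeled assumption:

```python
import struct

# Hypothetical wire format for motion-suit frames arriving at the input port
# (an assumption for illustration, not any vendor's actual protocol):
# each frame is a joint id (uint16) followed by three float32 Euler angles,
# little-endian, with no padding.
FRAME = struct.Struct("<Hfff")

def parse_frames(payload: bytes):
    """Yield (joint_id, (x, y, z)) tuples from a raw motion-data payload."""
    for off in range(0, len(payload) - FRAME.size + 1, FRAME.size):
        joint, x, y, z = FRAME.unpack_from(payload, off)
        yield joint, (x, y, z)

def apply_to_avatar(pose: dict, payload: bytes) -> dict:
    """Update the avatar's joint rotations from one packet of suit data."""
    for joint, angles in parse_frames(payload):
        pose[joint] = angles
    return pose

# Simulate one packet carrying two joint frames and apply it to an empty pose.
packet = FRAME.pack(7, 10.0, 0.0, -5.0) + FRAME.pack(12, 0.0, 90.0, 0.0)
pose = apply_to_avatar({}, packet)
```

The resulting `pose` mapping stands in for whatever per-joint state the training device's renderer would consume when drawing the avatar on the display screen.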
Specification