Voice-assisted clinical note creation on a mobile device
First Claim
1. A computerized method, carried out by at least one server having one or more processors, of utilizing natural language processing to generate a user interface for display on a touch screen of a mobile device associated with a clinician, wherein the user interface facilitates creation of a voice-assisted shorthand clinical note on the mobile device in real time, wherein the mobile device includes a microphone to capture a spoken conversation between a patient and a clinician, the method comprising:
capturing the spoken conversation between the patient and the clinician;
as the spoken conversation is in progress between the patient and the clinician, substantially simultaneously:
converting the spoken conversation into a text stream, analyzing the text stream to determine a presence of one or more clinically-relevant concepts in the text stream by applying a set of rules to text in the text stream using a natural language processing system by matching at least a portion of the text in the text stream to the one or more clinically-relevant concepts, and extracting the one or more clinically-relevant concepts from the text stream and temporarily presenting, via the user interface, the one or more clinically-relevant concepts in a first display area of the user interface, wherein the one or more clinically-relevant concepts are selectable;
receiving a selection, from the clinician, of at least one clinically-relevant concept from the one or more clinically-relevant concepts and populating the selected at least one clinically-relevant concept into a clinical note display area of the user interface, wherein the clinical note display area is adjacent to the first display area of the user interface;
substantially simultaneously determining that the at least one clinically-relevant concept selected by the clinician comprises an indication of a symptom, a set of symptoms, or a diagnosis;
upon determining that the at least one clinically-relevant concept selected by the clinician comprises the indication of the symptom, the set of symptoms, or the diagnosis, automatically determining, in real time, one or more of at least one suggested problem list, at least one orders set, and at least one alert corresponding to the at least one clinically-relevant concept by utilizing one or more of differential diagnoses lists, standards-of-care, and best practices stored in a data store;
temporarily presenting the one or more of the determined at least one suggested problem list, the at least one orders set, and the at least one alert on a second display area of the user interface for further selection by the clinician;
upon receiving further selection from the clinician of at least one of the one or more of the determined at least one suggested problem list, the at least one orders set, and the at least one alert, dynamically updating the clinical note display area by populating the clinical note display area of the user interface according to the further selection; and
as the spoken conversation is in progress between the clinician and the patient, dynamically updating and adjusting display parameters of at least one of the first display area, the clinical note display area, and the second display area of the user interface for maximizing screen real estate of the touch screen of the mobile device.
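The concept-extraction step recited above (applying a set of rules to the transcribed text stream and matching portions of the text to clinically-relevant concepts) can be sketched as follows. This is an illustrative approximation, not the patent's implementation; the rule set, concept labels, and function names are invented for the example, and a production system would draw its rules and vocabulary from a clinical terminology rather than a hard-coded table.

```python
import re

# Hypothetical rule set: each rule pairs a text pattern with a normalized
# clinically-relevant concept. The entries are illustrative only.
CONCEPT_RULES = {
    r"\bchest pain\b": "Chest pain (symptom)",
    r"\bshort(ness)? of breath\b": "Dyspnea (symptom)",
    r"\bdiabet(es|ic)\b": "Diabetes mellitus (diagnosis)",
}

def extract_concepts(text_chunk: str) -> list[str]:
    """Apply the rule set to one chunk of the text stream and return the
    clinically-relevant concepts whose patterns match a portion of the text."""
    found = []
    for pattern, concept in CONCEPT_RULES.items():
        if re.search(pattern, text_chunk, flags=re.IGNORECASE):
            found.append(concept)
    return found

# As the conversation is transcribed, each chunk is scanned and any matches
# would be surfaced as selectable concepts in the first display area.
chunk = "Patient reports chest pain and shortness of breath since Tuesday."
print(extract_concepts(chunk))  # → ['Chest pain (symptom)', 'Dyspnea (symptom)']
```

In the claimed method this scan runs substantially simultaneously with speech-to-text conversion, so matches appear on screen while the corresponding words are still being spoken.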
1 Assignment
0 Petitions
Abstract
Methods, systems, and computer-readable media are provided for facilitating the voice-assisted creation of a shorthand clinical note on a mobile or tablet device. A microphone on the device is used to capture a conversation between a clinician and a patient. Clinically-relevant concepts in the conversation are identified, extracted, and temporarily presented on the device's touch screen interface. The concepts are selectable, and upon selection, the selected concept is populated into a clinical note display area of the touch screen interface. The shorthand clinical note may be used as a memory prompt for the later creation of a more comprehensive clinical note.
10 Citations
18 Claims
1. A computerized method, carried out by at least one server having one or more processors, of utilizing natural language processing to generate a user interface for display on a touch screen of a mobile device associated with a clinician (set out in full as the First Claim above). - View Dependent Claims (2, 3, 4, 5)
6. One or more non-transitory computer-readable media having computer-executable instructions embodied thereon that, when executed, generate a dynamic user interface for display on a touch screen of a mobile device associated with a clinician, wherein a spoken conversation between a patient and the clinician is captured using a microphone associated with the mobile device and substantially simultaneously is automatically converted into a text stream for analysis, wherein the dynamic user interface displayed on the touch screen of the mobile device comprises:
a first display area that temporarily presents one or more selectable clinically-relevant concepts identified from the text stream of the spoken conversation between the patient and the clinician by using a natural language processing system and matching at least a portion of text in the text stream to the one or more selectable clinically-relevant concepts, wherein the one or more selectable clinically-relevant concepts are presented on the first display area at substantially the same time as they are spoken by the patient or the clinician in the spoken conversation;
a clinical note display area presented at the same time as the first display area, wherein the clinical note display area, upon receipt of a selection of at least one of the one or more selectable clinically-relevant concepts presented in the first display area, becomes populated with the selected at least one of the one or more selectable clinically-relevant concepts in real time, and wherein, upon a determination that the selected at least one of the one or more selectable clinically-relevant concepts comprises a symptom, a set of symptoms, or a diagnosis, one or more of at least one suggested diagnoses list, at least one orders set, and at least one alert corresponding to the selected at least one of the one or more selectable clinically-relevant concepts are automatically determined, in real time, by utilizing one or more of differential diagnoses lists, standards-of-care, and best practices stored in a data store;
a diagnoses list display area for dynamically displaying the at least one suggested diagnoses list;
an order display area for dynamically displaying the at least one orders set; and
an alert display area for dynamically displaying the at least one alert,
wherein, as the spoken conversation between the patient and the clinician progresses, display parameters of at least one of the first display area, the clinical note display area, the diagnoses list display area, the order display area, and the alert display area are dynamically updated and adjusted. - View Dependent Claims (7, 8, 9, 10, 11)
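Claim 6's user interface can be modeled as a small state object: one transient area of selectable concepts, a clinical note area populated on selection, and the suggestion areas that update alongside it. The sketch below is a hypothetical data-structure view of those display areas (the class and method names are invented), not a rendering implementation.

```python
from dataclasses import dataclass, field

@dataclass
class NoteUIState:
    """Illustrative model of the display areas recited in claim 6."""
    first_area: list[str] = field(default_factory=list)      # transient, selectable concepts
    clinical_note: list[str] = field(default_factory=list)   # populated on selection
    diagnoses_area: list[str] = field(default_factory=list)  # suggested diagnoses list
    orders_area: list[str] = field(default_factory=list)     # orders set
    alerts_area: list[str] = field(default_factory=list)     # alerts

    def present(self, concepts: list[str]) -> None:
        """Temporarily present newly identified concepts for selection."""
        self.first_area.extend(c for c in concepts if c not in self.first_area)

    def select(self, concept: str) -> None:
        """On selection, move a concept from the first area into the note."""
        if concept in self.first_area:
            self.first_area.remove(concept)
            self.clinical_note.append(concept)

ui = NoteUIState()
ui.present(["Chest pain (symptom)", "Aspirin use"])
ui.select("Chest pain (symptom)")
print(ui.clinical_note)  # → ['Chest pain (symptom)']
print(ui.first_area)     # → ['Aspirin use']
```

The claim's "dynamically updating and adjusting display parameters" would sit on top of such a state: as areas gain or lose content, the layout reallocates the limited touch-screen real estate among them.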
12. One or more non-transitory computer-readable media having computer-executable instructions embodied thereon that, when executed, facilitate a method of generating a user interface for display on a display screen of a mobile device, wherein the user interface is usable for creating a voice-assisted shorthand clinical note using natural language processing, in real time, and wherein the mobile device comprises a microphone for capturing a spoken conversation between a clinician and a patient, the method comprising:
capturing the spoken conversation between the patient and the clinician;
using natural language processing, identifying one or more clinically-relevant concepts in the spoken conversation;
extracting the one or more clinically-relevant concepts from the spoken conversation;
temporarily presenting the one or more clinically-relevant concepts in a first display area of the user interface displayed on the display screen of the mobile device, wherein the one or more clinically-relevant concepts are presented at substantially the same time as the one or more clinically-relevant concepts are spoken in the spoken conversation between the clinician and the patient;
receiving a selection, from the clinician, of a first clinically-relevant concept of the one or more clinically-relevant concepts while the first clinically-relevant concept is being presented in the first display area;
presenting the first clinically-relevant concept in a clinical note display area of the user interface displayed concurrently with the first display area on the display screen of the mobile device;
determining whether the first clinically-relevant concept selected by the clinician comprises a symptom, a set of symptoms, or a diagnosis;
upon determining that the first clinically-relevant concept selected by the clinician comprises a symptom, a set of symptoms, or a diagnosis, automatically determining, in real time, one or more of at least one suggested problem list, at least one orders set, and at least one alert corresponding to the first clinically-relevant concept; and
presenting the determined at least one suggested problem list in a suggested problem list display area of the user interface, the at least one orders set in an orders set display area of the user interface, and the at least one alert in an alert display area of the user interface, for selection by the clinician, wherein as the spoken conversation is in progress between the clinician and the patient, dynamically updating and adjusting display parameters of at least one of the first display area, the suggested problem list display area, the orders set display area, and the alert display area of the user interface, as the user interface becomes updated with the one or more clinically-relevant concepts identified from the spoken conversation in real time. - View Dependent Claims (13, 14, 15, 16, 17, 18)
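The suggestion step that claims 1, 6, and 12 share can be sketched as a lookup: once a selected concept is determined to be a symptom, set of symptoms, or diagnosis, the corresponding suggested problem list, orders set, and alerts are retrieved from stored reference content (differential diagnoses lists, standards of care, best practices). Everything below is invented for illustration, including the store contents and the simplistic tag-based classifier; the patent leaves the actual classification and data-store mechanisms open.

```python
# Hypothetical data store mapping symptom/diagnosis concepts to suggestions.
# Real content would come from curated clinical reference sources.
SUGGESTION_STORE = {
    "Chest pain (symptom)": {
        "problem_list": ["Acute coronary syndrome", "GERD", "Costochondritis"],
        "orders_set": ["ECG", "Troponin", "Chest X-ray"],
        "alerts": ["Rule out MI per chest-pain standard of care"],
    },
}

def is_symptom_or_diagnosis(concept: str) -> bool:
    """Simplified classifier keyed off a tag appended during extraction;
    the patent does not specify how this determination is made."""
    return concept.endswith("(symptom)") or concept.endswith("(diagnosis)")

def suggest(concept: str) -> dict:
    """Return the suggested problem list, orders set, and alerts for a
    symptom or diagnosis concept; return nothing for other concepts."""
    if not is_symptom_or_diagnosis(concept):
        return {}
    return SUGGESTION_STORE.get(concept, {})

result = suggest("Chest pain (symptom)")
print(result["orders_set"])  # → ['ECG', 'Troponin', 'Chest X-ray']
```

Each returned category would then populate its own display area (problem list, orders set, alert) for further selection by the clinician, with the selections flowing into the clinical note.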
Specification