Computerized systems and methods for determining authenticity using micro expressions
First Claim
1. A method for calculating authenticity of a human user, performed by a device having at least one processor, the method comprising:
receiving, via a network, an electronic request from a user device, the electronic request instantiating a video connection with the user device;
generating, using a database of questions, a first question regarding the electronic request;
providing, via the network, the generated first question to the user device;
analyzing video and audio data received via the video connection, the analyzing comprising extracting a plurality of facial expressions from the video data;
calculating, using a first convolutional neural network, first data corresponding to one or more predetermined emotions based on at least one extracted facial expression, and using a second convolutional neural network, second data corresponding to the one or more predetermined emotions based on at least two extracted facial expressions and the audio data;
generating candidate emotion data using the first and second data;
determining whether the candidate emotion data predicts a predetermined emotion;
based on determining whether the candidate emotion data predicts the predetermined emotion:
generating a second question to collect additional data for aggregating with the first and second data;
or determining the authenticity of the human user and using the determined authenticity to decide on the electronic request.
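The fusion and prediction steps of the claim can be pictured with a minimal sketch. It assumes each convolutional neural network emits a per-emotion score vector; the label set, the late-fusion average, and the 0.6 threshold are illustrative assumptions, not details recited in the claim.

```python
# Minimal sketch of fusing the two networks' outputs into candidate
# emotion data and testing whether it predicts a predetermined emotion.
# The label set, averaging rule, and 0.6 threshold are hypothetical.

EMOTIONS = ["neutral", "surprise", "contempt"]  # hypothetical label set

def fuse(first_data, second_data):
    """Combine per-emotion scores from the two CNNs (simple average)."""
    return [(a + b) / 2 for a, b in zip(first_data, second_data)]

def predicted_emotion(candidate, threshold=0.6):
    """Return the predicted emotion, or None if no score clears the bar."""
    best = max(range(len(candidate)), key=candidate.__getitem__)
    return EMOTIONS[best] if candidate[best] >= threshold else None

candidate = fuse([0.2, 0.7, 0.1], [0.1, 0.8, 0.1])  # first/second data
print(predicted_emotion(candidate))  # -> surprise
```

When no score clears the threshold, `predicted_emotion` returns `None`, which corresponds to the claim's branch where a second question is generated to collect additional data.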
Abstract
Systems and methods are provided for calculating authenticity of a human user. One method comprises receiving, via a network, an electronic request from a user device, the request instantiating a video connection with the user device; generating, using a database of questions, a first question; providing, via the network, the generated question to the user device; analyzing video and audio data received via the connection to extract facial expressions; calculating, using convolutional neural networks, first data and second data corresponding to predetermined emotions based on the facial expressions and audio data; generating candidate emotion data using the first and second data; determining whether the candidate emotion data predicts a predetermined emotion; and generating a second question to collect additional data for aggregating with the first and second data, or determining the authenticity of the user and using the determined authenticity to decide on the user request.
20 Claims
1. A method for calculating authenticity of a human user, performed by a device having at least one processor, the method comprising:
receiving, via a network, an electronic request from a user device, the electronic request instantiating a video connection with the user device;
generating, using a database of questions, a first question regarding the electronic request;
providing, via the network, the generated first question to the user device;
analyzing video and audio data received via the video connection, the analyzing comprising extracting a plurality of facial expressions from the video data;
calculating, using a first convolutional neural network, first data corresponding to one or more predetermined emotions based on at least one extracted facial expression, and using a second convolutional neural network, second data corresponding to the one or more predetermined emotions based on at least two extracted facial expressions and the audio data;
generating candidate emotion data using the first and second data;
determining whether the candidate emotion data predicts a predetermined emotion;
based on determining whether the candidate emotion data predicts the predetermined emotion:
generating a second question to collect additional data for aggregating with the first and second data;
or determining the authenticity of the human user and using the determined authenticity to decide on the electronic request. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A computing system for calculating authenticity of a human user, comprising:
at least one processor;
at least one memory storing instructions, wherein the instructions cause the at least one processor to:
receive, via a network, an electronic request from a user device, the electronic request instantiating a video connection with the user device;
generate, using a database of questions, a first question regarding the electronic request;
provide, via the network, the generated first question to the user device;
analyze video and audio data received via the video connection, the analyzing comprising extracting a plurality of facial expressions from the video data;
calculate, using a first convolutional neural network, first data corresponding to one or more predetermined emotions based on at least one extracted facial expression, and using a second convolutional neural network, second data corresponding to the one or more predetermined emotions based on at least two extracted facial expressions and the audio data;
generate candidate emotion data using the first and second data;
determine whether the candidate emotion data predicts a predetermined emotion;
based on determining whether the candidate emotion data predicts the predetermined emotion:
generate a second question to collect additional data for aggregating with the first and second data;
or determine the authenticity of the human user and use the determined authenticity to decide on the electronic request. - View Dependent Claims (11, 12, 13, 14, 15, 16, 17, 18, 19)
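The branch at the end of the system claim (either generate a follow-up question or determine authenticity) can be illustrated as a small control loop. The question list, the 0.6 threshold, and the return values are hypothetical stand-ins, not the claimed implementation.

```python
# Illustrative control flow for the claimed branch: while the candidate
# emotion data does not yet predict a predetermined emotion and questions
# remain, generate another question; otherwise determine authenticity.
# The question list, threshold, and decision tuple are assumptions.

QUESTIONS = ["What is the purpose of your request?",
             "Can you confirm the amount requested?"]

def interview(round_scores, threshold=0.6):
    """round_scores: candidate emotion score observed after each question."""
    asked = 1  # the first question has already been provided
    for score in round_scores:
        if score >= threshold:
            return ("determine_authenticity", score)
        if asked < len(QUESTIONS):
            asked += 1  # generate a second question, aggregate more data
    return ("determine_authenticity", max(round_scores))

print(interview([0.4, 0.7]))  # -> ('determine_authenticity', 0.7)
```

The loop exits as soon as a round's aggregated score predicts an emotion; if the question database is exhausted first, it falls through to a decision on the data gathered so far.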
20. A method for determining authenticity of a loan applicant, performed by a user device having at least one processor, the method comprising:
receiving, via a network, an electronic request from a user device, the electronic request instantiating a video connection with the user device;
generating, using a database of questions, a first question regarding the electronic request;
providing, via the network, the generated first question to the user device;
analyzing video and audio data received via the video connection, the analyzing comprising extracting a plurality of facial expressions from the video data;
calculating, using a spatial convolutional neural network, first data corresponding to one or more predetermined emotions based on at least one extracted facial expression, and using a temporal convolutional neural network, second data corresponding to the one or more of the predetermined emotions based on at least two extracted facial expressions and the audio;
generating candidate emotion data using the first and second data;
determining whether the candidate emotion data predicts a predetermined emotion;
based on determining whether the candidate emotion data predicts the predetermined emotion:
generating a second question to collect additional data for aggregating with the first and second data;
or determining the authenticity of the loan applicant and using the determined authenticity to decide on the electronic request.
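Claim 20's spatial/temporal split can be pictured with a toy two-stream scorer: a spatial stream over a single frame, and a temporal stream over frame-to-frame motion plus audio. The hand-rolled arithmetic below is a placeholder for the actual trained CNNs and is purely illustrative.

```python
# Toy two-stream split mirroring claim 20: a spatial stream scores a
# single frame, a temporal stream scores motion across at least two
# frames together with audio energy. Real systems would use trained
# CNNs; these hand-rolled scores are illustrative placeholders only.

def spatial_score(frame):
    """Score one frame (flattened pixel intensities in 0..1)."""
    return sum(frame) / len(frame)

def temporal_score(frames, audio_samples):
    """Score frame-to-frame change plus mean audio energy."""
    motion = sum(abs(a - b)
                 for f1, f2 in zip(frames, frames[1:])
                 for a, b in zip(f1, f2))
    energy = sum(s * s for s in audio_samples) / max(len(audio_samples), 1)
    return motion + energy

frames = [[0.1, 0.2], [0.4, 0.1]]           # two tiny "frames"
print(spatial_score(frames[0]))             # mean intensity of frame 0
print(temporal_score(frames, [0.5, -0.5]))  # motion + audio energy
```

Note the division of labor: the spatial stream needs only one extracted facial expression, while the temporal stream requires at least two (to measure change) plus the audio, matching the claim's "at least one" versus "at least two" phrasing.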
Specification