Evaluating Crowd Sourced Information Using Crowd Sourced Metadata
Abstract
An approach is provided for utilizing crowd sourced data to score, or weigh, candidate answers in a question/answer (QA) system. In the approach, a question is received from a user, and the system identifies question keywords and a context in the question using natural language processing (NLP). The system mines crowd sourced data sets for crowd sourced information, the mining being based on the identified question keywords and context. The crowd sourced data sets store the collective opinion of a crowd of individuals. The system evaluates the mined crowd sourced information based on crowd sourced metadata, and the evaluation yields a most likely answer that is returned to the user, with the most likely answer incorporating a portion of the crowd sourced information.
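The evaluation step described above can be sketched in code. The following is a minimal, hypothetical illustration of weighing candidate answers with crowd sourced metadata; the field names (votes, author reputation) and the scoring formula are assumptions for illustration, not details taken from the patent itself.

```python
# Hypothetical sketch: rank candidate answers mined from crowd sourced
# data, weighting relevance by crowd sourced metadata. The metadata
# fields and scoring formula below are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Candidate:
    text: str                  # candidate answer text
    votes: int                 # crowd approval count (assumed metadata)
    author_reputation: float   # 0.0-1.0 trust score (assumed metadata)
    keyword_overlap: float     # fraction of question keywords matched


def score(c: Candidate) -> float:
    # Combine relevance (keyword overlap with the question) with a
    # crowd-confidence weight derived from the metadata.
    crowd_weight = c.votes * c.author_reputation
    return c.keyword_overlap * (1.0 + crowd_weight)


def most_likely_answer(candidates: list[Candidate]) -> Candidate:
    # The highest-scoring candidate is returned to the user.
    return max(candidates, key=score)


candidates = [
    Candidate("Answer A", votes=10, author_reputation=0.9, keyword_overlap=0.5),
    Candidate("Answer B", votes=2, author_reputation=0.5, keyword_overlap=0.8),
]
print(most_likely_answer(candidates).text)  # → Answer A
```

Here the heavily upvoted, high-reputation answer outranks a slightly more relevant but weakly endorsed one, which is the trade-off the metadata-based evaluation is meant to capture.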