TRUST BASED MODERATION
First Claim
1. A network device for selectively managing display of content items on a content system over a network, comprising:
a transceiver to send and receive data over the network; and
a processor that is operative to perform actions, including:
receiving at least one abuse report indicating that a content item is considered to be abusive by a reporter;
determining a content item abuse reputation based, in part, on an accumulation of determined reputations for each reporter of the at least one abuse report for the content item;
determining for an author of the content item an overall content creation reputation for the content item; and
selectively hiding a display of the content item on the content system based on a comparison between the determined content item abuse reputation and the determined overall content creation reputation for the author for the content item.
6 Assignments
0 Petitions
Abstract
A network device, system, and method are directed towards detecting trusted reporters and/or abusive users in an online community using reputation event inputs, such as abuse reports. When an abuse report is received for a content item, the combined trust (reputation) of previous reporters on the reported content item and the trust (reputation) of the content author are compared to determine whether to trust the content item. If the content item is un-trusted, the content item may be hidden from public view. In one embodiment, the content item might still be visible to the content author, and/or members in the author's contact list, or the like, while being hidden from another user in the community. In one embodiment, the author may appeal the determined trust, and results of the appeal may be used to modify a trust of at least one reporter.
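The comparison the abstract describes can be sketched in a few lines of Python. This is an illustrative reading only: the class and function names, and the simple summation used to accumulate reporter reputations, are assumptions for the sketch, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    """A reported content item (illustrative; names are assumed)."""
    author: str
    reporter_reputations: list = field(default_factory=list)

def abuse_reputation(item: ContentItem) -> float:
    # Accumulate the determined reputations of every reporter of this item.
    # A plain sum is one possible accumulation; the patent does not fix one.
    return sum(item.reporter_reputations)

def should_hide(item: ContentItem, author_reputation: float) -> bool:
    # Hide the item when the combined trust of its reporters outweighs
    # the author's content creation reputation for the item.
    return abuse_reputation(item) > author_reputation

item = ContentItem(author="alice", reporter_reputations=[0.4, 0.35, 0.5])
print(should_hide(item, author_reputation=1.0))  # 1.25 > 1.0 -> True
```

Under this sketch, a single low-reputation reporter cannot hide content from a trusted author, but several trusted reporters together can.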
168 Citations
21 Claims
1. A network device for selectively managing display of content items on a content system over a network, comprising:
a transceiver to send and receive data over the network; and
a processor that is operative to perform actions, including:
receiving at least one abuse report indicating that a content item is considered to be abusive by a reporter;
determining a content item abuse reputation based, in part, on an accumulation of determined reputations for each reporter of the at least one abuse report for the content item;
determining for an author of the content item an overall content creation reputation for the content item; and
selectively hiding a display of the content item on the content system based on a comparison between the determined content item abuse reputation and the determined overall content creation reputation for the author for the content item.
View Dependent Claims (2, 3, 4, 5, 6, 7)
8. A method for use in managing display of content at a content system over a network, comprising:
receiving at least one abuse report from at least one abuse reporter indicating that the reporter considers a display of a content item on the content system to be abusive, in violation of terms of service or guidelines, inappropriate, or illegal;
determining an accumulated abuse reporters' reputation for the content item based, in part, on reputations of each abuse reporter of the received at least one abuse report;
determining a reputation of an author of the content item; and
selectively hiding a display of the content item on the content system based on a result of a comparison between the determined reputation of the author for the content item and the determined accumulated abuse reporters' reputation.
View Dependent Claims (9, 10, 11, 12, 13, 14)
15. A system for use in managing display of content over a network, comprising:
a content server configured and arranged to display submitted content items; and
a reputation service configured and arranged to receive at least one of an abuse report or a positive report for at least one user generated content item, and to perform actions, including:
determining a content item abuse reputation based, in part, on an accumulation of determined reputations for each abuse reporter providing an abuse report for the content item;
determining for an author of the content item an author reputation for the content item; and
selectively hiding a display of the content item on the content system based on a result of a comparison between the determined content item abuse reputation and the determined author reputation for the content item.
View Dependent Claims (16, 17, 18, 19, 20, 21)
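The abstract also notes that a hidden item may remain visible to its author and to members of the author's contact list while being hidden from other users. A minimal sketch of that visibility rule, with all names assumed for illustration:

```python
def is_visible(viewer: str, author: str, contacts: set, hidden: bool) -> bool:
    """Decide whether `viewer` may see an item by `author`.

    An un-trusted (hidden) item stays visible to its author and the
    author's contacts, but is hidden from other community members.
    All parameter names here are illustrative, not from the patent.
    """
    if not hidden:
        return True  # trusted content is visible to everyone
    return viewer == author or viewer in contacts

print(is_visible("bob", "alice", {"carol"}, hidden=True))    # False
print(is_visible("alice", "alice", {"carol"}, hidden=True))  # True
```

Keeping hidden content visible to the author supports the appeal path the abstract describes: the author can still see, and contest, the moderation decision.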
Specification