Trust based moderation
6 Assignments
0 Petitions
Abstract
A network device, system, and method are directed towards detecting trusted reporters and/or abusive users in an online community using reputation event inputs, such as abuse reports. When an abuse report is received for a content item, the combined trust (reputation) of previous reporters on the reported content item and the trust (reputation) of the content author are compared to determine whether to trust the content item. If the content item is un-trusted, the content item may be hidden from public view. In one embodiment, the content item might still be visible to the content author, and/or members in the author's contact list, or the like, while being hidden from another user in the community. In one embodiment, the author may appeal the determined trust, and results of the appeal may be used to modify a trust of at least one reporter.
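The core decision the abstract describes can be sketched in a few lines: accumulate the reporters' trust, compare it against the author's trust, and, if the item is hidden, keep it visible to the author while suppressing it for other users. This is a minimal illustrative sketch; the function names and the use of a simple sum and direct comparison are assumptions, not the patent's specified implementation.

```python
# Illustrative sketch of the trust comparison described in the abstract.
# All names and the scoring scheme are assumptions for illustration.

def should_hide(reporter_reputations, author_reputation):
    """Hide a content item when the accumulated trust of the abuse
    reporters outweighs the trust (reputation) of the content author."""
    accumulated = sum(reporter_reputations)
    return accumulated > author_reputation

def visible_to(viewer, author, hidden):
    # A hidden item may remain visible to its author (and, in one
    # embodiment, the author's contacts) while hidden from other users.
    return (not hidden) or viewer == author
```

For example, two reporters with reputations 0.6 and 0.7 together outweigh an author reputation of 1.0, so the item is hidden from visitors but the author still sees it.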
18 Claims
1. A network device for selectively managing display of content items on a content system over a network, comprising:
a transceiver to send and receive data over the network; and
a processor that is operative to perform actions, including:
receiving at least one abuse report from a plurality of abuse reporters indicating that a content item is considered to be abusive by each reporter;
determining a content item abuse reputation based, in part, on an accumulation of determined reputations for each reporter of the received at least one abuse report for the content item;
determining for an author of the content item an overall content creation reputation for the content item; and
selectively hiding a display of the content item on the content system based on a comparison between the determined content item abuse reputation and the determined overall content creation reputation for the author for the content item, wherein selectively hiding the display of the content item on the content system further comprises inhibiting display of the content item on the content system to at least one visitor to the content system, while enabling at least the author of the content item to view the display of the content item on the content system.
- View Dependent Claims (2, 3, 4, 5, 6)
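The selective-hiding step of claim 1 distinguishes between viewers: display is inhibited for an ordinary visitor while the author can still view the item. A minimal sketch of that per-viewer rendering decision, with an illustrative item representation and function names that are assumptions rather than claim language:

```python
# Sketch of claim 1's selective hiding: the item abuse reputation is an
# accumulation of reporter reputations, and a hidden item is suppressed
# for visitors but still rendered for its author. The dict layout and
# names are illustrative assumptions.

def item_abuse_reputation(reporter_reputations):
    # Accumulate the determined reputation of each abuse reporter.
    return sum(reporter_reputations)

def render_for(viewer, item):
    hidden = item_abuse_reputation(item["reporter_reps"]) > item["author_rep"]
    if hidden and viewer != item["author"]:
        return None           # display inhibited for this visitor
    return item["body"]       # at least the author can still view it
```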
7. A method for use in managing display of content at a content system over a network, comprising:
receiving at least one abuse report from each of a plurality of abuse reporters indicating that each abuse reporter considers a display of a content item on the content system to be abusive, in violation of terms of service or guideline, inappropriate, or illegal;
determining an accumulated abuse reporters' reputation for the content item based, in part, on reputations of each abuse reporter of the received at least one abuse report;
determining a reputation of an author of the content item;
selectively hiding a display of the content item on the content system based on a result of a comparison between the determined reputation of the author for the content item and the determined accumulated abuse reporters' reputation;
if the content item is determined to be selectively hidden, notifying the author of the content item of an availability of an appeal process;
receiving a request from the author of the content item to employ the appeal process; and
employing an outcome of the appeal process to modify at least one of a reputation of at least one abuse reporter or the reputation of the author of the content item. - View Dependent Claims (8, 9, 10, 11, 12)
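The appeal loop in claim 7 feeds the appeal outcome back into the reputation system: the outcome modifies the reputation of at least one abuse reporter or of the author. A hedged sketch of one plausible realization, where a successful appeal reduces the reporters' reputations and a failed one reduces the author's; the adjustment step and clamping to zero are illustrative assumptions:

```python
# Sketch of claim 7's appeal outcome handling. The fixed step size and
# the choice to clamp reputations at zero are assumptions, not claim
# language; the claim only requires that the outcome modify at least
# one reporter's or the author's reputation.

def apply_appeal_outcome(upheld, author, reporters, reputations, step=0.1):
    """Modify reputations based on the appeal outcome.

    upheld=True means the appeal succeeded (the item was wrongly
    flagged), so each abuse reporter's reputation is reduced;
    otherwise the author's reputation is reduced instead.
    """
    if upheld:
        for r in reporters:
            reputations[r] = max(0.0, reputations[r] - step)
    else:
        reputations[author] = max(0.0, reputations[author] - step)
    return reputations
```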
13. A system for use in managing display of content over a network, comprising:
a content server configured and arranged to display submitted content items; and
a reputation service configured and arranged to receive at least one of an abuse report or a positive report for at least one user generated content item, and to perform actions, including:
determining a content item abuse reputation based, in part, on an accumulation of determined reputations for each abuse reporter of a plurality of abuse reporters providing an abuse report for the content item;
determining for an author of the content item an author reputation for the content item;
selectively hiding a display of the content item on the content system based on a result of a comparison between the determined content item abuse reputation and the determined author reputation for the content item; and
if the author of the content item to be selectively hidden is determined to be a repeat abusive author, enabling deletion of each content item automatically from the content server, absent notification to the author. - View Dependent Claims (14, 15, 16, 17, 18)
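Claim 13 adds an escalation path: once an author is determined to be a repeat abusive author, further hidden items can be deleted from the content server automatically, with no notification. A minimal sketch under stated assumptions; the threshold value, counter store, and dict-based content store are all illustrative, as the claim does not define how "repeat abusive" is determined:

```python
# Sketch of claim 13's repeat-abuser handling: after an assumed number
# of hidden items, an author's content is deleted automatically from
# the content store, absent notification to the author. The threshold
# and data structures are illustrative assumptions.

REPEAT_ABUSE_THRESHOLD = 3  # assumed; the claim specifies no number

def handle_hidden_item(author, item_id, hidden_counts, content_store):
    hidden_counts[author] = hidden_counts.get(author, 0) + 1
    if hidden_counts[author] >= REPEAT_ABUSE_THRESHOLD:
        # Repeat abusive author: delete the item outright from the
        # content server, with no notification sent.
        content_store.pop(item_id, None)
        return "deleted"
    return "hidden"
```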
Specification