System and method for detecting unwanted content
First Claim
1. A system comprising:
one or more application servers hosting an application for exchanging messages between users of the application;
one or more processors;
message rate logic executable by the one or more processors to track a message rate at which a first user sends messages;
block rate logic executable by the one or more processors to track a block rate at which the first user is blocked by other users;
message uniqueness logic executable by the one or more processors to track the message uniqueness of the messages sent by the first user;
peer uniqueness logic executable by the one or more processors to track the message addressee uniqueness of the messages sent by the first user;
peer symmetry logic executable by the one or more processors to track a percentage of the messages sent by the first user that are addressed to other users that are (a) within a first address book of the first user and that (b) have other address books that include the first user;
unwanted content logic executable by the one or more processors to determine whether two or more of the message rate, the block rate, the message uniqueness, or the percentage of the messages tracked by the peer symmetry logic indicates a likelihood that the first user is sending unwanted content; and
watch list logic that, when executed by the one or more processors, causes the one or more processors to add the first user to a watch list of suspicious behavior when the unwanted content logic determines that the first user is likely sending unwanted content.
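As one illustrative reading of the claimed decision rule, a sender is flagged when two or more tracked metrics cross their thresholds. The following sketch is an assumption-laden illustration, not the patented implementation; the metric names, threshold values, and directions ("above"/"below") are all hypothetical:

```python
# Hedged sketch of the claimed decision rule: a sender is added to a
# watch list when two or more tracked metrics cross their thresholds.
# All names and threshold values here are illustrative assumptions.

# Per-user metrics, as tracked by the claimed logic modules.
metrics = {
    "message_rate": 42.0,       # messages per minute
    "block_rate": 0.9,          # blocks received per day
    "message_uniqueness": 0.05, # fraction of messages that are distinct
    "peer_symmetry": 0.10,      # fraction of messages to mutual contacts
}

# Thresholds; for uniqueness/symmetry, *low* values are suspicious.
thresholds = {
    "message_rate": ("above", 30.0),
    "block_rate": ("above", 0.5),
    "message_uniqueness": ("below", 0.2),
    "peer_symmetry": ("below", 0.25),
}

def indicates_unwanted_content(metrics, thresholds):
    """Return True when two or more metrics cross their thresholds."""
    crossed = 0
    for name, (direction, limit) in thresholds.items():
        value = metrics[name]
        if (direction == "above" and value > limit) or \
           (direction == "below" and value < limit):
            crossed += 1
    return crossed >= 2

watch_list = set()
if indicates_unwanted_content(metrics, thresholds):
    watch_list.add("first_user")
```

Under these assumed thresholds, the sample sender crosses all four limits and is placed on the watch list; a sender crossing only one limit would not be.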
Abstract
A system and method for detecting unwanted electronic content, such as spam. As a user operates an application to send messages, several metrics are tracked to allow the system to analyze her activity. Illustrative metrics may include, but are not limited to, block count (e.g., how many other users have blocked her), block rate (e.g., the rate at which other users block her), peer symmetry (e.g., the percentage of her messages sent to other users who have her in their address books), message uniqueness (e.g., how distinct her messages are from one another), peer uniqueness (e.g., how distinct the addressees of her messages are), and message rate (e.g., the rate at which she sends messages). Periodically, each metric may be compared to a corresponding threshold. Depending on whether a threshold is crossed, and which one, she may be banned from using the application or placed on a watch list.
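To make the peer-symmetry metric concrete: it is the share of a sender's messages addressed to users with a mutual address-book relationship. A minimal sketch, with hypothetical user names and address-book data:

```python
def peer_symmetry(sender, sent_to, address_books):
    """Fraction of messages whose addressee (a) is in the sender's
    address book and (b) has the sender in their own address book.

    sent_to: list of addressees, one entry per message sent.
    address_books: dict mapping user -> set of contacts.
    """
    if not sent_to:
        return 0.0
    mutual = sum(
        1 for peer in sent_to
        if peer in address_books.get(sender, set())
        and sender in address_books.get(peer, set())
    )
    return mutual / len(sent_to)

# Example: alice messages bob twice and a stranger once; only bob is mutual.
books = {"alice": {"bob"}, "bob": {"alice"}, "stranger": set()}
score = peer_symmetry("alice", ["bob", "bob", "stranger"], books)  # 2/3
```

A low peer-symmetry score suggests a sender who mostly messages people outside any mutual relationship, which is consistent with the spam pattern the abstract describes.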
19 Claims
1. A system comprising the elements recited under "First Claim" above.

Dependent claims: 2–13.
14. A method comprising:
receiving from a first user an electronic message having one or more addressees;

updating a message rate metric identifying a rate at which the first user is sending electronic messages;

updating a message uniqueness metric identifying uniqueness of messages sent by the first user;

updating a peer uniqueness metric identifying uniqueness of addressees of messages sent by the first user;

for each addressee in the one or more addressees, updating a peer symmetry metric identifying a percentage of the messages sent by the first user that are addressed to other users that (a) are within a first address book of the first user and (b) have other address books that include the first user;

comparing each of the updated metrics to one or more threshold values;

determining that two or more of the updated metrics indicate a likelihood that the first user is sending unwanted content when a respective updated metric exceeds at least one of the one or more threshold values; and

adding the first user to a watch list of suspicious behavior when it is determined that the first user is likely sending unwanted content.

Dependent claims: 15–18.
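The per-message update flow of the method claim above might be sketched as follows. The sliding-window rate and hash-based uniqueness measures are illustrative assumptions, since the claims do not fix how the metrics are computed:

```python
import hashlib
import time
from collections import deque

class SenderMetrics:
    """Hypothetical per-sender state updated on each message sent."""

    def __init__(self, window_seconds=60.0):
        self.window = window_seconds
        self.timestamps = deque()   # send times inside the sliding window
        self.body_hashes = set()    # distinct message bodies seen
        self.addressees = set()     # distinct peers messaged
        self.total_messages = 0

    def update(self, body, addressees, now=None):
        now = time.time() if now is None else now
        # Message rate: count messages within the trailing window.
        self.timestamps.append(now)
        while self.timestamps and now - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        # Message uniqueness: distinct bodies / total messages.
        self.total_messages += 1
        self.body_hashes.add(hashlib.sha256(body.encode()).hexdigest())
        # Peer uniqueness: distinct addressees / total messages.
        self.addressees.update(addressees)

    @property
    def message_rate(self):
        return len(self.timestamps) / self.window

    @property
    def message_uniqueness(self):
        return len(self.body_hashes) / self.total_messages

    @property
    def peer_uniqueness(self):
        return len(self.addressees) / self.total_messages
```

Under these assumed definitions, a spammer sending one body to many distinct peers shows low message uniqueness and high peer uniqueness, the combination the threshold comparison is meant to catch.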
-
19. A non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to perform a method comprising:
receiving from a first user an electronic message having one or more addressees;

updating a message rate metric identifying a rate at which the first user is sending electronic messages;

updating a message uniqueness metric identifying uniqueness of messages sent by the first user;

updating a peer uniqueness metric identifying uniqueness of addressees of messages sent by the first user;

for each addressee in the one or more addressees, updating a peer symmetry metric to reflect whether the first user is included in an address book of the addressee;

comparing each of the updated metrics to one or more threshold values;

determining that two or more of the updated metrics indicate a likelihood that the first user is sending unwanted content when a respective updated metric exceeds at least one of the one or more threshold values; and

adding the first user to a watch list of suspicious behavior when it is determined that the first user is likely sending unwanted content.
Specification