Filtering bot actions
The Altcraft platform can detect bot activity and connections with modified proxy settings and filter out the corresponding events. As a result, clients do not receive distorted data about user activity on their sites, and analytics and reports on the platform remain accurate.
The platform checks each connection and compares its UserAgent pattern or subnet against the rules in the directory. Settings for filtering bot actions are contained in two config files: "bots_ua.json" and "bots_ip.json". Currently, these files contain 1139 network rules and 575 UserAgent rules.
Validation can be performed on the following events:
| Event | Description |
|---|---|
| import_form | Import from a form with a web layer |
| import_popup | Import from pop-up |
| open | Opening a message |
| click | Clicks and confirmations for all channels, and redirects to the subscription manager |
| click_sms | Clicks on the SMS handler |
| read | Checking the read pixel in an email |
| unsub | Unsubscribing from channels |
| click_push | Clicks on the push handler |
| pixel_open | Registration of a target action and redirect via pixel |
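Which of these events a rule validates is controlled by its "events" field, described in the config structures below. As a small sketch (the pattern is an invented placeholder, not a shipped rule), a rule limited to read-pixel and unsubscribe events would list them explicitly:

```json
{
    "pattern": "SampleBot",      // invented pattern, for illustration only
    "events": ["read", "unsub"], // validate only these two events; [] - all supported events
    "actions": []                // [] - default ua_ignore and ip_ignore
}
```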
The following actions are available for an identified bot/proxy connection:
| Action | Description |
|---|---|
| event_ignore | The event is ignored entirely. Used for events that were not performed by the user but by an automated process (for example, an antivirus protection system following links). |
| ua_ignore | Ignores the UserAgent of the event and all associated data. Applies when a user did perform the action, but the UserAgent data is distorted (e.g., by a VPN) and should therefore be ignored. |
| ip_ignore | Ignores the IP address of the event and all associated data, including geo_ip. Applies when a user did perform the action, but the IP data is distorted (e.g., by VPN usage) and should therefore be ignored. |
| block | The request is blocked and an error page is shown. Used to protect against malicious traffic (e.g., botnets or crawlers). |
| log | Regardless of other actions, additionally records all details of the event at the info logging level ("Bot detector action"). |
| none | Used to temporarily disable actions for a particular rule. |
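Actions can be combined within a single rule. As an illustrative sketch (the pattern below is an invented placeholder, not a rule shipped with the platform), a UserAgent rule that blocks matching requests and also writes their details to the log could look like this:

```json
{
    "pattern": "EvilCrawler",   // invented UserAgent pattern, for illustration only
    "events": [],               // [] - check all supported events
    "actions": ["block", "log"] // block the request and additionally log event details
}
```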
If you need to change the configuration, create a copy of the corresponding config file and name it "custom.bots_ua.json" or "custom.bots_ip.json". Do not make any changes to the original files!
Structure of bots_ua.json
```json
{
    "pattern": "MTRobot",        // UserAgent pattern to match
    "events": ["click", "open"], // List of events to check ([] - all supported events)
    "actions": [                 // List of actions to take (default [] - ua_ignore and ip_ignore)
        "ua_ignore",
        "ip_ignore"
    ],
    // Information fields (optional)
    "added": "2023-09-08",
    "url": "https://metrics-tools.de/robot.html",
    "instances": [
        "MTRobot/0.2 (Metrics Tools Analytics Crawler; https://metrics-tools.de/robot.html; crawler@metrics-tools.de)"
    ],
    "source": [
        "https://raw.githubusercontent.com/monperrus/crawler-user-agents/master/crawler-user-agents.json"
    ]
},
```
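The snippet above is a single rule object; judging by the trailing comma, the config file holds an array of such objects (an assumption based on the snippet, not a documented guarantee). A minimal custom.bots_ua.json could then look like the following sketch, where the second rule is invented for illustration and the optional informational fields are omitted:

```json
[
    {
        "pattern": "MTRobot",
        "events": ["click", "open"],
        "actions": ["ua_ignore", "ip_ignore"]
    },
    {
        "pattern": "ExampleBot", // invented pattern, for illustration only
        "events": [],            // [] - all supported events
        "actions": []            // [] - default ua_ignore and ip_ignore
    }
]
```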
Structure of bots_ip.json
```json
{
    "network": "87.240.128.0/18", // Subnet in CIDR notation
    "events": ["click", "open"],  // List of events to check ([] - all supported events)
    "actions": [                  // List of actions to take (default [] - ua_ignore and ip_ignore)
        "ua_ignore",
        "ip_ignore"
    ],
    // Information fields (optional)
    "added": "2023-09-08",
    "url": "https://metrics-tools.de/robot.html",
    "instances": [
        "MTRobot/0.2 (Metrics Tools Analytics Crawler; https://metrics-tools.de/robot.html; crawler@metrics-tools.de)"
    ],
    "source": [
        "https://raw.githubusercontent.com/monperrus/crawler-user-agents/master/crawler-user-agents.json"
    ]
},
```
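By the same assumption, a custom.bots_ip.json holds an array of rule objects keyed by subnet. Here is a sketch of a custom rule that blocks and logs traffic from a suspicious subnet (the subnet below is a placeholder from the RFC 5737 documentation range, not a real botnet):

```json
[
    {
        "network": "198.51.100.0/24", // placeholder subnet (RFC 5737 documentation range)
        "events": [],                 // [] - check all supported events
        "actions": ["block", "log"]   // block the request and log the details
    }
]
```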
To temporarily disable a rule, specify "none" in the "actions" parameter. If you specify an empty list, the default actions ua_ignore and ip_ignore will be executed.
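For example, to switch off the MTRobot rule from the structure above without deleting it, the rule can be carried into the custom config with "none" as its only action (a sketch under the same assumptions):

```json
{
    "pattern": "MTRobot",
    "events": ["click", "open"],
    "actions": ["none"] // the rule stays in the config, but no action is taken
}
```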