So, I blocked Bing via robots.txt on all my sites. For now I don't see any big difference, but maybe it takes a while because of caching.
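For reference, what I put in robots.txt is more or less this (assuming "bingbot" is the right user-agent name):
Code:
# block Bing's crawler (assuming the user-agent token is "bingbot")
User-agent: bingbot
Disallow: /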
I also found these rules in my .htaccess, which should stop Yandex and the Chinese bot (Baidu):
Code:
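# return 403 Forbidden when the User-Agent contains MJ12bot, Yandex or Baidu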
RewriteCond %{HTTP_USER_AGENT} ^.*MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Yandex [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Baidu [NC]
RewriteRule .* - [L,F]
Then I went to IP2Location, signed up, and generated the file, but I didn't understand how to use the file they gave me...
I generated the Linux iptables version, and they gave me something like this:
Code:
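# drop all traffic coming from these IP ranges (output generated by IP2Location)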
iptables -A INPUT -s 104.146.0.0/18 -j DROP
iptables -A INPUT -s 104.146.100.0/22 -j DROP
iptables -A INPUT -s 104.146.104.0/21 -j DROP
iptables -A INPUT -s 104.146.112.0/24 -j DROP
But on Ubuntu on DigitalOcean there is UFW, so how can I use the IP2Location file with UFW?
Do I need to install iptables? Will UFW still keep working?
I have seen some sites that say to open a UFW configuration file and add the lines, but my lines are in a different format, and the file to edit is different on every site...
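The only format I actually understood is the plain ufw command, something like this (I'm not even sure this is the right approach, and typing it by hand for every single range seems crazy):
Code:
# deny a single IP2Location range with a plain UFW command (just my guess)
ufw deny from 104.146.0.0/18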
I also thought of manually changing this:
Code:
iptables -A INPUT -s 104.146.100.0/22 -j DROP
into this:
Code:
# block IP
-A ufw-before-input -s 104.146.100.0/22 -j DROP
and adding those lines to the /etc/ufw/before.rules file, as shown in the Ubuntu wiki.
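If that is the right idea, I guess I could convert the whole file with something like this instead of editing every line by hand (the file name is just what I called the IP2Location download, and I'm not sure this is correct):
Code:
# rewrite each "iptables -A INPUT ..." line into the before.rules format
# ip2location_iptables.txt is just the name I gave the downloaded file
sed 's/^iptables -A INPUT/-A ufw-before-input/' ip2location_iptables.txt > ufw_rules.txt
# then the lines in ufw_rules.txt would go into /etc/ufw/before.rules before the COMMIT line (I think)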
But I'm not sure that doing this manually is a good idea
I'm not really understanding anything.
And I would rather not use the .htaccess rules either, first because those too are in a different format from what I already use, and second because there are so many of them...
Do I need to install Fail2ban?
I really need to fix this quickly because my server is melting down, can you help me?