Since .htaccess can do many things, I will keep this article updated as much as I can. It collects assorted tips, so it will be a bit messy.
If you want to prevent people from hotlinking your files, use the following:
code:
--------------------------------------------------------------------------------
Options +FollowSymlinks
ErrorDocument 401 /errordocs/404.htm
ErrorDocument 403 /errordocs/404.htm
ErrorDocument 404 /errordocs/404.htm
ErrorDocument 500 /errordocs/404.htm
RewriteEngine On
RewriteOptions inherit
# Allow requests with an empty referer (direct downloads, privacy proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow both domains, with or without www (dots escaped in the regex)
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain\.com [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain2\.com [NC]
# [NC] covers .ZIP, .Zip etc. without listing each case separately
RewriteRule \.(zip|exe|rar)$ /errordocs/404.htm [NC,R,L]
--------------------------------------------------------------------------------
That will only allow downloading zip, exe and rar files when the referer is yourdomain.com or yourdomain2.com (with or without www); everyone else will get 404.htm.
Note that the .htaccess snippet above also sets custom error pages. Just create /errordocs/404.htm in your site root.
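A common variant of the same technique serves a small replacement image instead of the error page when images are hotlinked (just a sketch: the image extensions and the /errordocs/hotlink.gif path are example names, replace them with your own):

code:
--------------------------------------------------------------------------------
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain\.com [NC]
# Exclude the errordocs folder itself, or the redirect would loop
RewriteCond %{REQUEST_URI} !^/errordocs/
# Send hotlinked images to a replacement image instead of the 404 page
RewriteRule \.(gif|jpe?g|png)$ /errordocs/hotlink.gif [NC,R,L]
--------------------------------------------------------------------------------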
That takes care of hotlinking from other domains/locations, but it does not prevent downloaders/rippers from downloading your files. For that you can use these lines in your .htaccess file:
code:
--------------------------------------------------------------------------------
SetEnvIfNoCase User-Agent "^WebStripper" bad_bot
SetEnvIfNoCase User-Agent "^WebWhacker" bad_bot
SetEnvIfNoCase User-Agent "^WebZIP" bad_bot
SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^JOC" bad_bot
# Deny needs the Order/Allow context to take effect
Order Allow,Deny
Allow from all
Deny from env=bad_bot
--------------------------------------------------------------------------------
But most of these downloaders/rippers/crawlers have options to identify themselves as Internet Explorer, Netscape, etc., so they can easily get around this method.
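If you prefer to keep everything in mod_rewrite, the same blocking can be done with a RewriteCond on the user agent, returning a 403 Forbidden directly (a sketch using the same agent names as above; the same bypass caveat applies):

code:
--------------------------------------------------------------------------------
RewriteEngine On
# Match any of the known ripper user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} ^(WebStripper|WebWhacker|WebZIP|Wget|JOC) [NC]
# [F] returns 403 Forbidden instead of serving the file
RewriteRule .* - [F,L]
--------------------------------------------------------------------------------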