Website security

halo2_god New York state
edited September 2007 in Science & Tech
Not sure if this belongs here, but it does deal with web security. I use 110mb.com, and I hate robots, crawlers, spiders, spam, and bad bots (the kind listed at http://www.user-agents.org/). I checked out this site and am wondering how to do what it describes: http://blamcast.net/articles/block-bots-hotlinking-ban-ip-htaccess. What do I name the root directory, and what do I name the file I upload to the server? Please, someone help; my bandwidth usage is off the hook. XD All help is appreciated.

Comments

  • kryyst Ontario, Canada
    edited September 2007
    Your root directory is the first directory your website exists in, usually / or public_html, or possibly the domain name, depending on how your site is set up and hosted. You don't rename it. Just look for where your index.html or index.php page lives; that's your root.

    All you do is create a .htaccess file in your root directory. If your host supports .htaccess files, add the code snippets the article talks about to that file and they'll take effect.

    Read the .htaccess tutorial they linked to; it's quite good.
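
    To give you an idea, here's a rough sketch of the kind of rules that tutorial covers. This assumes Apache with mod_rewrite enabled; the bot names, the example.com domain, and the IP address are placeholders, not values to copy verbatim:
    # Send known bad bots a 403 Forbidden response.
    # "BadBot" and "EvilScraper" are made-up names; substitute real ones.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [NC]
    RewriteRule .* - [F,L]

    # Stop other sites from hotlinking your images (example.com = your domain).
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http://(www\.)?example\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

    # Ban a single IP address (Apache 2.2 syntax; 192.0.2.1 is an example).
    Order Allow,Deny
    Allow from all
    Deny from 192.0.2.1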
  • Linc Owner Detroit Icrontian
    edited September 2007
    I dunno, I just throw a file called "robots.txt" in the root http directory with just this in it:
    User-agent: *
    Disallow: /
    
    This doesn't physically prevent spiders from getting at the site (like .htaccess will), but no search engine will index your site with that in there, and any spider worth its salt will read it once (it's the first file they're supposed to request) and not come back for a long time. It's the only "security" I use against spiders on our development site for Icrontic, and as far as I can tell it's 100% effective.
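
    If you ever need something in between (say, letting real search engines in while warning off one specific crawler), robots.txt rules are applied per user agent, so you can target a bot by name. A quick sketch, where "BadBot" and the /private/ path are made-up examples:
    User-agent: BadBot
    Disallow: /

    User-agent: *
    Disallow: /private/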