How To Use Your .htaccess File

Spammers have a knack for developing overrides to even the most secure parts of a system, including those that are not readily recognized as potential targets. The .htaccess file can be used to keep e-mail harvesters away. This is considered very effective because these harvesters identify themselves in some way through their user-agent strings, which gives .htaccess the ability to block them.

Spam Countered By .htaccess

Bad bots are the spiders that do a site far more harm than good, such as e-mail harvesters. Site rippers are offline browsing programs that a surfer may unleash on a site to crawl and download every one of its pages for offline viewing. Both jack up a site's bandwidth and resource usage, sometimes to the point of crashing the site's server. Since bad bots typically ignore the wishes of one's robots.txt file, they can be banned with .htaccess, essentially by identifying them through their user-agent strings.

A useful code block can be inserted into the .htaccess file to block many of the known bad bots and site rippers currently in circulation; a sketch appears below. Affected bots receive a 403 Forbidden error when they attempt to view a protected site, which usually results in a significant bandwidth saving and a decrease in server resource usage.

Bandwidth stealing, commonly referred to in the web community as hotlinking, means linking directly to non-HTML objects that are not on one's own server, such as images and CSS files. The victim's server is robbed of bandwidth and money while the perpetrator enjoys showing the content without having to pay for its delivery.

Hotlinking to one's own server can be disallowed with .htaccess. Anyone who attempts to link to an image or CSS file on a protected site is either blocked or served different content. Being blocked usually means a failed request in the form of a broken image, while an example of different content would be an image of an angry man, presumably to send a clear message to violators. mod_rewrite must be enabled on one's server for this aspect of .htaccess to work; a sketch follows the bad-bot example below.

Disabling hotlinking of certain file types requires adding code to the .htaccess file, which is uploaded to the root directory, or to a particular subdirectory to localize the effect to just one section of the site. A server is typically set to prevent directory listing. If this is not the case, the required directives should be placed in the .htaccess file of the image directory so that nothing in that directory can be listed; the third sketch below shows this.

The .htaccess file can also reliably password-protect directories on websites. Other options can be used, but only .htaccess offers total security: anyone wishing to get into the directory must know the password, and no back doors are provided. Password protection with .htaccess requires adding the appropriate lines to the .htaccess file in the directory that is to be protected.

Password-protecting a directory is one of the functions of .htaccess that takes a little more work than the others, because a file containing the usernames and passwords that are allowed to access the site has to be created. It can be placed anywhere within the website, although it is advisable to store it outside the web root so that it cannot be accessed from the web. The last sketch below shows a typical setup.
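As a minimal sketch of such a block, assuming Apache with mod_rewrite enabled: the user agents named below are just a few well-known harvesters and rippers, and a real list maintained from a current blacklist would be far longer.

    # Return 403 Forbidden to known bad bots and site rippers,
    # matched by their self-reported user-agent strings.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Teleport [NC]
    RewriteRule .* - [F]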
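Next, a minimal sketch of hotlink protection, assuming the protected site is example.com and that the optional replacement image lives at /angryman.gif; both names are placeholders.

    # Block hotlinking: requests for images or CSS whose referrer is
    # neither empty nor this site receive a 403 (a broken image).
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com [NC]
    RewriteRule \.(gif|jpe?g|png|css)$ - [F]

    # To serve a replacement image instead, repeat the two referrer
    # conditions and use a rule like the following, which excludes
    # the replacement itself so the rewrite cannot loop:
    # RewriteCond %{REQUEST_URI} !/angryman\.gif$
    # RewriteRule \.(gif|jpe?g|png)$ /angryman.gif [R,L]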
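The directory-listing safeguard is a single directive. Placed in the image directory's own .htaccess, the following sketch (assuming the host permits Options overrides) stops that directory's contents from being listed.

    # Disable automatic directory indexes for this directory.
    Options -Indexes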
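Finally, a minimal sketch of the password-protection setup, assuming the username-and-password file is stored at /home/user/.htpasswd, a placeholder path outside the web root.

    # .htaccess in the directory being protected
    AuthType Basic
    AuthName "Restricted Area"
    AuthUserFile /home/user/.htpasswd
    Require valid-user

The password file itself can be created with Apache's htpasswd utility, for example: htpasswd -c /home/user/.htpasswd username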
Recommended Practices To Deter Spam

Avoiding the publication of referrers is one way of discouraging spammers. It would be pointless for them to bother sending spoofed requests to blogs when this information is never displayed. Unfortunately, most bloggers believe that being able to click on a link such as "sites referring to me" is a neat feature and have not evaluated its detrimental effect on the whole blogosphere.

If publishing referrers is a definite must, there should be built-in support for a referral-spam blacklist, and the referrers page should be excluded in robots.txt. This specifically tells Googlebot and its relatives not to index the referrers page, so spammers are unable to get the PageRank they seek. This only works, however, when referrers are published separately from the rest of the site's content; a sketch of such a robots.txt entry appears at the end of this article.

The use of rel="nofollow" likewise denies spammers their desired PageRank, and it does so at the link level rather than just the page level as robots.txt does. Every link in the referrer section of the website that points to an external website should carry this attribute, without exception, so as to offer maximum protection; an example follows the robots.txt sketch below.

Referrer statistics gathered from beacon images loaded via JavaScript document.write statements are more reliable than what the raw web server logs contain, since most spam bots never execute JavaScript. There is even the option of totally disregarding the referrers section of a site's server logs: a cleaner list of referrers can be gathered from the JavaScript-and-beacon-image statistics instead. A sketch of such a beacon appears at the end of this article.

The current master blacklist file can be a powerful and efficient weapon against spam. A log-file analysis program that filters referrers against this list can help root out spam. The master blacklist is a simple text file that can be downloaded from a website or simply mirrored. It is far from perfect, though: checking the file against the referrers that got through shows that few or none of them were listed.

The idea of combating comment spam by harnessing DNS-based black hole lists (DNSBLs) could also be used to ferret out other forms of spam, such as referral spam. The proposal is rather simple: for any request that carries a referrer, query the requesting IP against a blacklist. If the IP is blacklisted, or has a high score across a multitude of blacklists, refrain from listing the referring URL in any section of the site's web stats. Once a given site has been identified as a referral-spam host name, the blacklist need not be queried again for IPs presenting the same host name in the HTTP request, as a matter of efficiency. The last sketch at the end of this article shows such a lookup.

Various forms of spam have grown exponentially along with the popularity of blogs, probably because very few restrictions are placed on who can post a comment. This is easily exploited by spammers intent on getting their goods in front of people. Spammers have automated tools on a constant lookout for blogs that can easily be spammed. Spamming, in all its forms, carries heavy consequences for those trying to use the Internet and the World Wide Web in a productive way.
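As a minimal sketch of the robots.txt exclusion described above, assuming the referrers are published on a page at /referrers.html (a placeholder path):

    # robots.txt: keep compliant crawlers, Googlebot included,
    # away from the separately published referrers page.
    User-agent: *
    Disallow: /referrers.html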
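The rel="nofollow" attribute is a one-word addition to each outbound link in the referrer section; the URL below is a placeholder.

    <!-- rel="nofollow" asks search engines not to pass PageRank
         through this link. -->
    <a href="http://example.com/" rel="nofollow">example.com</a>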
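A minimal sketch of the beacon-image technique, assuming a server-side endpoint at /beacon.gif that records its query string; the endpoint and parameter names are placeholders.

    <script type="text/javascript">
    // Load a 1x1 beacon image, passing the browser-reported referrer
    // as a query parameter. Bots that never execute JavaScript never
    // request the beacon, so the resulting referrer log is cleaner.
    document.write('<img src="/beacon.gif?ref='
        + encodeURIComponent(document.referrer)
        + '" width="1" height="1" alt="">');
    </script>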
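Lastly, a minimal sketch of the DNSBL query, written in JavaScript for Node.js; the blacklist zone sbl.example.org is a placeholder, and a real deployment would consult several lists and score the results as described above.

    const dns = require("dns").promises;

    // A DNSBL is queried by reversing the IP's octets, appending the
    // blacklist zone, and attempting an A-record lookup: the name
    // resolves only if the IP is listed.
    async function ipIsBlacklisted(ip, zone = "sbl.example.org") {
      const query = ip.split(".").reverse().join(".") + "." + zone;
      try {
        await dns.resolve4(query);
        return true;
      } catch (err) {
        return false; // NXDOMAIN or lookup failure: treat as not listed
      }
    }

    // Host names already identified as referral-spam sources, cached
    // so the blacklist is not queried again for the same host name.
    const knownSpamHosts = new Set();

    async function shouldHideReferrer(ip, referrerHost) {
      if (knownSpamHosts.has(referrerHost)) return true;
      if (await ipIsBlacklisted(ip)) {
        knownSpamHosts.add(referrerHost);
        return true;
      }
      return false;
    }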