Title: Internet geeks here! Who can determine how many web spiders/crawlers are on 4um??
Source: none
URL Source: http://none
Published: Nov 20, 2009
Author: X-15
Post Date: 2009-11-20 03:04:08 by X-15
Keywords: None
Views: 2722
Comments: 170
"Web spiders/crawlers: programs that search websites looking for specific words or patterns to compile into a database."
A popular gun website I visit had 20 running; if 4um has fewer, then I assume it has a lower profile in the eyes of FedGov.
Internet geeks here! Who can determine how many web spiders/crawlers are on 4um??
You'd need access to christine's server logs to get a good idea. However, there are many kinds of spiders, some quite difficult to detect.
Here is a quickie spider that I wrote. It runs on the Mac under OS X 10.5 Leopard; however, it is a standard Bash script and should work easily on Linux or Unix systems, and probably in a Cygwin setup on Windows too.
The script uses Lynx, a venerable text-only browser, to fetch my Comments page into a file called htmlsource1. It then uses the stream editor sed to parse this captured HTML, scanning the right-hand column of news stories and extracting the thread titles and their 4um URLs into a file called htmlsource2.
It then uses Lynx to capture each thread to a separate file, named by thread number, in a subdirectory called '4um'.
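To give non-geeks a feel for it, here is a bare-bones sketch of that kind of capture script. It is not my exact code; the index URL and the 'ArtNum=' link pattern below are guesses about the site's markup that you would have to adjust, but the overall shape is the same: lynx pulls the index page, sed and grep pull out the thread numbers, and lynx pulls each new thread into the 4um directory.

#!/bin/bash
# Sketch of a forum capture script (illustrative; the URL scheme is assumed).

INDEX_URL="http://freedom4um.com/"                                # page listing recent threads (assumed)
THREAD_URL="http://freedom4um.com/cgi-bin/readart.cgi?ArtNum="    # assumed thread URL pattern
CACHE_DIR="4um"                                                   # one file per thread, named by thread number

mkdir -p "$CACHE_DIR"
date

# Step 1: capture the raw HTML of the index page.
lynx -source "$INDEX_URL" > htmlsource1

# Step 2: pull the thread numbers out of the links in the captured HTML.
# 'ArtNum=[0-9]*' is an assumption about how the site links its articles.
grep -o 'ArtNum=[0-9]*' htmlsource1 | sed 's/ArtNum=//' | sort -un > htmlsource2

# Step 3: fetch each thread that is not already in the cache.
new=0
while read -r artnum; do
    [ -f "$CACHE_DIR/$artnum" ] && continue
    lynx -source "${THREAD_URL}${artnum}" > "$CACHE_DIR/$artnum"
    echo "fetched $artnum..."
    new=$((new + 1))
done < htmlsource2

[ "$new" -eq 0 ] && echo "no new articles on freedom4um.com"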
You could build a database or use text search tools like grep to mine the stored threads for info.
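For instance, once the threads are sitting in the 4um directory, a couple of one-liners will do the mining (the keywords here are purely illustrative):

# list cached threads that mention any of these keywords
grep -l -i -E 'militia|ruby ridge|waco' 4um/*

# rank threads by how many lines mention a single keyword
grep -c -i 'waco' 4um/* | sort -t: -k2 -nr | head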
No doubt various federal agencies, and outfits like the ADL or SPLC, use scripts like this to capture whole forums, then use grep and other search tools to scan each captured thread for relevant keywords and flag it for review by human beings.
I'm presenting this so that 4um folks can get some idea of how these agencies and busybodies operate. Geeks know this stuff but the average person has no idea how easy it is. People should know how easy it is to database their every remark since we no longer live in a free country.
Just for fun, I ran my script again for the first time in months. Since I didn't have any recent threads in my cache of 4um threads, it processed the entire news article sidebar. You can see the time stamp so you can check your logs and find this access.
Sat Nov 21 05:29:23 CST 2009
fetched 108669... (this one is your sticky thread)
fetched 110723...
fetched 110722...
fetched 110721...
fetched 110720...
fetched 110719...
fetched 110718...
fetched 110717...
It only took 30 seconds to capture 68 full threads to my hard drive. I note that about ten threads are missing; did you remove some, perhaps? Or maybe there's some little bug in my script; this code wasn't highly important to me, so I didn't go nuts over it.
Then I ran the script again a few minutes later. Since no new articles had been posted, it found nothing new to cache.
Sat Nov 21 05:34:29 CST 2009
no new articles on freedom4um.com
If I used a cron job to schedule this script to run regularly, say every hour or even every 12 hours, it would capture every thread posted at 4um. To capture all the comments, you'd have to revisit each thread, perhaps sniffing the headers to detect whether it had changed, because comments can come in days, months, or even years later. You'd capture 99% of these chat forums' content if you just waited one month before capturing a thread, parsing it, and storing it.
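For the curious, the scheduling part is a one-line cron entry, and the "has this thread changed" check can be as cheap as a HEAD request. The path and the thread URL below are made up for illustration, and a dynamic CGI page may not send Last-Modified at all, in which case you'd compare Content-Length or a hash of the stored body instead.

# crontab -e: run the capture script at the top of every hour
0 * * * * $HOME/bin/capture4um.sh >> $HOME/4um.log 2>&1

# ask only for the response headers of a cached thread, no page download
lynx -head -dump "http://freedom4um.com/cgi-bin/readart.cgi?ArtNum=110723" \
    | grep -i -E 'Last-Modified|Content-Length'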
As I said, this crappy little Bash script isn't really even a spider, just a webserver capture script. If you wrote it in a higher-level language and sniffed headers and such, it would be quite easy to get extremely sophisticated about this. I could churn out a real spider in very short order, as could Neil or tons of script kiddies.
As for the posters here at 4um, one should assume that the FBI and other agencies may employ agents provocateurs in the classic style they used against the Klan and others. Assuming you had, say, a half-dozen posters here posting the most vitriolic content on the site, they could quite easily raise the profile of 4um with the SPLC/ADL/FBI. After all, what would those outfits do if they couldn't point to the dire threat of rampant political incorrectness online? And how else could they assemble the "evidence" that there are vicious racists out there, which justifies the usual begging letters sent to donors by the SPLC/ADL, or the begging to Congress for staff and vast new computer systems by the FBI/NSA/etc.? Hey, if that well isn't producing enough, you just have to prime the pump a little, baby!
So even if you're running a legit free-speech forum (and I have no reason to believe otherwise, though I consider the possibility), that doesn't mean your forum might not be used as a honeypot by race hustlers like the ADL/SPLC or by various letters of the alphabet like F, B, or I.