Ruby Twitter Scraper

Requires the twitter gem; install it with "gem install twitter". Code as follows:

#!/usr/bin/env ruby
# encoding: utf-8

require 'twitter'
require 'csv'

client = Twitter::REST::Client.new do |config|
  config.consumer_key        = "insert"
  config.consumer_secret     = "insert"
  config.access_token        = "insert"
  config.access_token_secret = "insert"
end

# Recursively page backwards through results using max_id until a page
# comes back empty, then return the accumulated tweets.
def collect_with_max_id(collection=[], max_id=nil, &block)
  response = yield(max_id)
  collection += response
  response.empty? ? collection.flatten : collect_with_max_id(collection, response.last.id - 1, &block)
end

# Fetch everything the API will hand over for a user (the timeline endpoint
# caps history at roughly 3,200 statuses), 200 tweets per request.
def client.get_all_tweets(user)
  collect_with_max_id do |max_id|
    options = {:count => 200, :include_rts => true}
    options[:max_id] = max_id unless max_id.nil?
    user_timeline(user, options)
  end
end

abort("Usage: #{$0} <screen_name>") if ARGV.empty?

tweets = client.get_all_tweets(ARGV[0])

CSV.open("#{ARGV[0]}.csv", "w") do |csv|
  tweets.each do |tweet|
    csv << [tweet.id, tweet.created_at, tweet.user.screen_name, tweet.text, tweet.source, tweet.geo]
  end
end

Excellent. I’m going to revise it as necessary, but it’s a most effective scraper. Though I’d love to add some sort of progress bar to it, I haven’t succeeded in that yet. I’ll keep you posted and update it as the iterations of this thing change. It was smashed together from the twitter gem’s bare example scraper, with CSV output added. I’m quite pleased. Going to also consider adding time and date statistics compilation. I might just write an entirely separate script for that. Not sure yet.
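For what it’s worth, a minimal progress sketch (everything here is my own placeholder, not part of the script above): since the API doesn’t tell you the total up front, the simplest thing is a running count rewritten onto one terminal line after each page comes back.

```ruby
# A minimal running-count progress sketch. Track how many tweets have been
# pulled so far; there's no known total, so a counter beats a percentage bar.
fetched = 0

report_page = lambda do |batch|
  fetched += batch.length
  # \r rewrites the same terminal line instead of scrolling.
  print "\rFetched #{fetched} tweets..."
  $stdout.flush
end

# Hypothetical batches standing in for pages returned by user_timeline.
[Array.new(200), Array.new(200), Array.new(47)].each { |page| report_page.call(page) }
puts
```

Dropping the `report_page` body into the block passed to collect_with_max_id would be the obvious place to hook it in.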

[]

Envy Code R

This is pretty excellent. Check it out.

It’s called “Envy Code R”; you can grab it here.

I might use this as my “hosted font of choice” for rendering scripts and the like. It’s pretty damn cool.

[]

(╯°□°)╯︵ ┻━┻

If you’re in the automotive manufacturing industry, you’re pretty familiar with re-certification. It’s that time of year when everyone panics and freaks the fuck out about either surveillance or re-cert audits, works seven days a week in ten-hour days, and has a mild heart attack over every misspelling or problematic procedure in their company’s control documents.

What disturbs me is that I absolutely despise this process. I hate it, yet no matter where I go I won’t be able to avoid it. I need to get out of the department that has to deal with that garbage. I need to get into engineering, production, some other department. It doesn’t matter. I can’t put in these hours. It’s too much. I have a family. I have a life outside of work. It’s affecting me in some rather deeply disturbing ways. I don’t sleep as much as I need to, and every day feels like a hallucination.

[]

Joseph A. Camp, #Hostage68

Jojo’s PO contact info via @5hm00p

Attention to everyone who was harassed or threatened by Joseph Camp:

I just got word that his sentencing is Tuesday, and he's going to
serve 5 months, and when he's out he's free to do as he pleases. I
myself don't believe this is someone who belongs in a normal functioning
society, therefore a few of us have started a campaign to get him some
sort of supervision when he is released. How can you help?
Simple....write to:

US Parole Officer George Martin
100 State St.
Rochester, NY 14614

or

you can reach him by phone at (585) 666-5901


and tell him about your encounter with this maniac. The last thing we
want is him around again harassing people for Sue Basko (don't worry,
disbar comin) again and invading people's lives and their privacy. This
is your chance to put a completely horrible human being away and
hopefully make him learn his lesson.

[]

An Open Letter to Fabrizio Schiavi

Mr. Schiavi,

I’ve read your EULA for WOFF/EOT format fonts. Ridiculous. If this is standard practice, it’s disgusting. This EULA glorifies prostitution. It makes mopping semen at an adult book store seem like a venerable career. I suppose you might’ve gathered that I’m anti-DMCA.

Dropping $26 on a font is borderline acceptable if you enjoy the font enough. To be honest with you, I’m considering dumping your font from my site out of principle. Paying per page-view for a font license is pretty unreasonable.

[]

Putting It All Together

Grab the mention counter. Grab the tweet scraper. Point the scraper at four or five “known” associated targets and grab 3k tweets each at regular intervals. Better yet, use tweepy to grab tweets as they’re sent. Once you’ve amassed a good number of tweets, fire up the mention counter. Make your cutoff large. Make it count.
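As a rough illustration of what the mention-counting step does (the actual mention counter isn’t reproduced here; the function name, column layout, and cutoff below are my own placeholders), counting @-mentions per author pair from the scraped CSV and keeping only edges above a cutoff might look like this:

```ruby
# Count @-mentions per (author, mentioned) pair from scraped tweet rows.
# Assumed column layout matches the scraper's CSV: id, created_at,
# screen_name, text, ... — only columns 2 and 3 matter here.
def mention_edges(rows, cutoff: 2)
  counts = Hash.new(0)
  rows.each do |(_id, _ts, author, text, *_rest)|
    text.scan(/@(\w+)/).flatten.each do |mentioned|
      counts[[author.downcase, mentioned.downcase]] += 1
    end
  end
  # Keep only pairs at or above the cutoff.
  counts.select { |_pair, n| n >= cutoff }
end

rows = [
  ["1", "t", "alice", "hey @bob and @carol"],
  ["2", "t", "alice", "@bob again"],
  ["3", "t", "bob",   "@alice hi"],
]
edges = mention_edges(rows)
# edges => {["alice", "bob"] => 2}

# Gephi imports an edge list as Source,Target,Weight columns, e.g.:
# CSV.open("edges.csv", "w") do |csv|
#   csv << ["Source", "Target", "Weight"]
#   edges.each { |(src, dst), w| csv << [src, dst, w] }
# end
```

The cutoff is the knob the post tells you to crank: a large cutoff prunes one-off mentions so the Gephi map only shows the relationships that repeat.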

Fire up Gephi and open your CSV in it. It should automatically generate an interesting map. Use the heat map feature to get even more interesting results. Be amazed that you can OSINT. Most of all, have fun.

[]