[Geowanking] Fwd: war tweets

Mikel Maron mikel_maron at yahoo.com
Wed Sep 10 11:50:20 PDT 2008


http://www.obleek.com/iraq/

military casualties, not civilian. still one of the most effective visualizations i've seen.
 

----- Original Message ----
From: "Sean.Gorman at fortiusone.com" <Sean.Gorman at fortiusone.com>
To: geowanking at lists.burri.to
Cc: geowanking at lists.burri.to
Sent: Wednesday, September 10, 2008 10:06:38 AM
Subject: Re: [Geowanking] Fwd: war tweets

Cool application - it would be very interesting to have it mapped out.  Layering real-time casualties on top of historic casualties would let a viewer see the toll the war has taken over time.  Below are a few data sets that might be useful along those lines.

best,
sean

Iraq Coalition Casualty Count Organization, Iraq War Fatalities by Hometown, USA, 2007  http://finder.geocommons.com/overlays/1655

WITS, Violence in Iraq, Iraq, 2004 - March 2007
http://finder.geocommons.com/overlays/98

DoD, Active Duty Operations Enduring Freedom/Iraqi Freedom Casualties by State, USA, April 26, 2008
http://finder.geocommons.com/overlays/288

National Counter Terrorism Center, Violence Against NGO/Humanitarian Workers/Human Rights Activists, World, 2004- 2007
http://finder.geocommons.com/overlays/121
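The "toll over time" layering can be sketched in a few lines. The sample rows below are made up - a real run would pull date and count fields out of one of the casualty data sets linked above - so this is only the shape of the computation:

```ruby
require 'date'

# made-up sample rows: [report date, deaths reported] - a real run would
# extract these fields from one of the casualty data sets above
rows = [
  [Date.new(2007, 1, 3), 12],
  [Date.new(2007, 1, 3), 5],
  [Date.new(2007, 2, 10), 8],
]

# collapse to per-day totals, then accumulate a running toll - the series
# a "historic + real time" layer would animate over
daily = Hash.new(0)
rows.each { |date, n| daily[date] += n }

running = 0
toll = daily.keys.sort.map { |date| [date, running += daily[date]] }
toll.each { |date, total| puts "#{date}  #{total}" }
# prints:
# 2007-01-03  17
# 2007-02-10  25
```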


FortiusOne Inc,
2200 Wilson Blvd. suite 307
Arlington, VA 22201
cell - 202-321-3914

----- Original Message -----
From: "Sean Gillies" <sgillies at frii.com>
To: geowanking at lists.burri.to
Sent: Wednesday, September 10, 2008 12:27:11 PM GMT -05:00 US/Canada Eastern
Subject: Re: [Geowanking] Fwd: war tweets

Snark: Have you considered using the 120mm GeoRSS cannon deployed at
http://barcamp.org/BarCampMil? Image at
http://barcamp.org/f/barcamp_mil_logo_small_web_trans.png.

Seriously: this is a sobering application of twitter. Imagine the impact
it might have had early on, especially if realtime.

Sean

Anselm Hook wrote:
> Paige and I threw this together yesterday morning when we had a free hour -
> it attempts to twitter new civilian casualties in iraq.
> 
> Perhaps folks can suggest ways to improve it - it is kind of brute force.
> Mapping the deaths could also be done, but I'm assuming third party
> services already do that, and the location is published with each tweet,
> albeit using David Troy's notation.
> 
>  - me
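For the mapping idea, a third-party service only needs to split the location back out of each tweet: the script below appends the location with David Troy's TwitterVision-style "L:" tag. A minimal sketch - the grammar assumed here ("L:" through the end of the message) is my guess from the posting code, not a spec:

```ruby
# pull the trailing "L:" location tag out of a tweet; the assumed
# grammar ("L:" through end of message) is a guess, not a spec
def location_of(tweet)
  m = tweet.match(/\bL:(.+)\z/)
  m && m[1].strip
end

puts location_of("Unknown male, 34, killed on 09 Sep 2008 at L:Baghdad")
# prints: Baghdad
```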
> 
> ---------- Forwarded message ----------
> From: paige saez <paigesaez at gmail.com>
> Date: Tue, Sep 9, 2008 at 1:33 PM
> Subject: war tweets-
> To: "makerlab at googlegroups.com" <makerlab at googlegroups.com>
> 
> 
> http://twitter.com/IRAQDEATHS
> 
> 
> #!/usr/local/bin/ruby
> 
> #
> # we want to fetch posts from the iraq death count and show them
> # and we want to only check every day or so ( an external cron job )
> # basically then we take a look at the most recent 50 only...
> # and we twitter up to 50 of them basically...
> # we reverse the twitter order so that the most recent ones are first
> #
> 
> twittercap = 50 # twitter this many posts max
> 
> #
> # 1) fetch the data
> #
> # http://www.iraqbodycount.org/database/download/ibc-individuals
> 
> require 'net/http'
> 
> url = "http://www.iraqbodycount.org/database/download/ibc-individuals"
> 
> data = Net::HTTP.get_response(URI.parse(url)).body
> 
> 
> 
> #
> # 2) chew on the data a bit - masticate it good
> #
> # our first goal is to read the rather dorky csv file iraq war deaths provides
> # we have to skip past the header crap... so lets make a copy of the data without that header
> # we will just loop through all the data and pull out the good stuff and store it into another array
> # clearly not the "perfectly clean" way to do it but good enough and we don't really care
> # http://www.rubytips.org/2008/01/06/csv-processing-in-ruby/
> # http://fastercsv.rubyforge.org/classes/FasterCSV.html
> 
> require 'rubygems'
> require "fastercsv"
> @results = FasterCSV.parse(data)   # unused approach -> read("ibc-individuals")
> @deaths = []
> inside_stupid_header = true
> @results.each do |death|
>   if death[0] == "IBC code"
>     inside_stupid_header = false
>   elsif inside_stupid_header == false
>     @deaths << death
>   end
> end
> 
> #
> # 3) put the data in our storage area...  keep a copy in our belly
> #
> # http://sqlite-ruby.rubyforge.org/sqlite3/faq.html
> #
> # our second goal is to store this in a structured way so we can process it
> # i will bother to keep it in a database although i could just hold it in memory
> # we are going to want to avoid storing duplicates - because we will call this multiple times
> #
> # this gets slightly more involved
> #
> # 3a) let us make a table to store the data - if it already exists this code will crash but thats ok
> # we wrap the whole thing in a begin / rescue and just ignore the crash if it happens
> #
> # The format seems to be here in the ibc file : "IBC code","Name or Identifying Details","Age","Sex","Marital Status","Parental Status","Earliest Date","Latest Date","Location"
> #
> # 3b) um, store everything...
> 
> 
> =begin
> require 'rubygems'
> require 'sqlite'
> # this was a manual database approach - tedious!
> db = SQLite::Database.new( "endiraqwar.db" )
> begin
>   result = db.execute("CREATE TABLE deaths(id INTEGER, code VARCHAR(80), name VARCHAR(255), age INTEGER, sex VARCHAR(32), marital VARCHAR(64), parental VARCHAR(64), earliest DATE, latest DATE, location VARCHAR(255))");
> rescue
> end
> @deaths.each do |death|
>   result = db.execute("INSERT INTO ")
> end
> =end
> 
> #
> # 3) ok scratch the above, lets use this:
> #
> # http://datamapper.org/why.html
> #
> # lets manage the data in a way that takes advantage of object oriented design
> # we will grab datamapper which lets us pretend our data is a ruby data object
> # (instead of treating it like radioactive waste and holding it at arms length)
> # that means we have to define what our data "is" for ruby...
> # also we will hop over to postgres as our back end datastore - away from sqlite ... just because
> #
> 
> require 'rubygems'
> require 'dm-core'
> 
> DataMapper.setup(:default, {
>     :adapter  => 'postgres',
>     :database => "endiraqwar",
>     :username => 'endiraqwar',
>     :password => '',
>     :host     => 'localhost'
> })
> 
> class Death
>   include DataMapper::Resource
>   property :id,         Integer, :serial => true
>   property :code,       String
>   property :name,       Text
>   property :age,        Text
>   property :sex,        Text
>   property :marital,    Text
>   property :parental,   String
>   property :earliest,   DateTime
>   property :latest,     DateTime
>   property :location,   Text
>   property :created_at, DateTime
>   property :posted,     DateTime, :default => nil
> end
> 
> # we do not want to do this now
> # because it would erase our database
> # DataMapper.auto_migrate!
> 
> # go ahead and store all the deaths
> @deaths.each do |death|
>   if Death.first(:code => death[0] )
>     # puts "We already found this death #{death[1]} #{death[0]} so not saving"
>     next
>   end
>   # take a second to convert the date phrase into a machine date
>   death[6] = DateTime.parse(death[6])
>   death[7] = DateTime.parse(death[7])
>   # go ahead and make a blobby thing to hold all of this
>   record = Death.new(
>               :code => death[0],
>               :name => death[1],
>               :age => death[2],
>               :sex => death[3],
>               :marital => death[4],
>               :parental => death[5],
>               :earliest => death[6],
>               :latest => death[7],
>               :location => death[8]
>            )
>   puts "recording the passing of #{record.name} at #{record.earliest} and
> #{record.code}"
>   record.save
> end
> 
> #
> # 4) ok cool, now we have the data - lets twitter!
> #
> # http://twitter.rubyforge.org/
> # http://github.com/jnunemaker/twitter/tree/master/examples/posting.rb
> #
> 
> require 'twitter'
> twitter = Twitter::Base.new("endiraqwar","")
> @deaths = Death.all(:order => [:earliest.desc], :limit => twittercap)
> @copyofdeaths = []
> @deaths.each do |death|
>   @copyofdeaths << death
> end
> 
> @copyofdeaths.reverse.each do |death|
>  # publish deaths that are new
>  next if death.posted != nil
>  result = twitter.post("#{death.name}, #{death.age}, #{death.sex}, #{death.marital}, #{death.parental} killed on #{death.earliest.strftime("%d %b %Y")} at L:#{death.location}")
>  # remember that we already published this death
>  death.posted = DateTime.now
>  death.save
>  puts "posted the death of #{death.name} #{death.code}"
> end
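Since the thread keeps circling back to mapping (and GeoRSS), one way to close the loop: once a record's place name has been geocoded, each death could be emitted as a GeoRSS-Simple point inside an Atom entry. The lat/lon below are made up - the IBC "Location" field is a place name and would need geocoding first:

```ruby
# hypothetical, already-geocoded record; real records would come out of
# the Death table above and go through a geocoder first
death = { :name => "Unknown", :location => "Baghdad", :lat => 33.34, :lon => 44.4 }

# GeoRSS-Simple encodes a point as "lat lon" in a georss:point element
entry = <<ENTRY
<entry xmlns:georss="http://www.georss.org/georss">
  <title>#{death[:name]}, #{death[:location]}</title>
  <georss:point>#{death[:lat]} #{death[:lon]}</georss:point>
</entry>
ENTRY

puts entry
```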
> 
> 
> 
> 
> 
> ------------------------------------------------------------------------
> 
> _______________________________________________
> Geowanking mailing list
> Geowanking at lists.burri.to
> http://lists.burri.to/mailman/listinfo/geowanking
