Give Me My Data API Hacking Masterclass
Thursday, July 27, 2012
A ‘hands-on’ workshop offering a technical and theoretical overview of the contemporary ‘Application Programming Interfaces’ (APIs) of large social networks, and how to use them for your own project or application.
We will cover the ins and outs of creating Facebook apps, web applications, and how to play with the Twitter, Foursquare, Flickr and Instagram APIs.
With WORM’s current ‘Artists In Residence’ Tim C. Schwartz and Owen Mundy you’ll learn, from a programmer’s perspective, how to approach your target network and make it do what you need it to… As examples, think of Owen’s project “Give Me My Data” or WORM’s previous release of the “Web 2.0 Suicide Machine”.
No specific technical skills are required, but a curious mind for the ins & outs of social networking is a must!
Buy your ticket online or send your reservation to email@example.com. Participation is limited, so act fast!
– reduced/discount tickets available for WORM volunteers & students (with ID)
Regular: presale € 15.00, at the door € 15.00
Reduced: presale € 10.00, at the door € 10.00
Location: WORM, Boomgaardsstraat 71, 3012 XA Rotterdam
Review of “Python: Visual QuickStart Guide (2nd edition)”
I have to admit I had my doubts about this book; I have not been impressed with Visual QuickStart Guides in the past. I respect the idea of presenting only essential information in an easy-to-understand, introductory manner, but in my experience that was rarely the case. Too often I found them to be a mix of non-essential information presented in a way that did little for readers new to the subject.
That being said, their Python (2nd edition) breaks with that past. So far, I have almost always found it concise and to the point as an introduction to Python and to programming languages in general. The examples are relevant and work as printed. While I normally head to O’Reilly for an in-depth look at a language, this book is a great starting point.
One note and criticism: the examples using input() are broken for readers running Python versions below 3 (most of us at the moment). You have to use raw_input() in its place, though the book does not point this out.
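A portable workaround is to alias whichever string-returning input function exists; this minimal sketch (the name read_line is my own, not from the book) works on either version:

```python
# In Python 2, input() eval()s whatever the user types, while raw_input()
# returns the raw string. Python 3 removed raw_input() and made input()
# return a string. This shim picks the string-returning function.
try:
    read_line = raw_input  # Python 2
except NameError:
    read_line = input      # Python 3
```

With this at the top of the book’s examples, read_line("Name: ") behaves the same on both versions.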
Scrapy in process
How to install Scrapy with MacPorts (full version)
Here is a step-by-step guide explaining how I got Scrapy running on my MacBook Pro (Mac OS X 10.5) using MacPorts to install Python and all the required libraries (libxml2, libxslt, etc.). The following has been tested on two separate machines with Scrapy 0.10.
Many thanks to the users who shared helpful amendments to the default installation guide. My original intention was to post this at Stack Overflow, but their instructions discourage posting questions that have already been answered, so here it is…
1. Install Xcode with options for command line development (a.k.a. “Unix Development”). This requires a free registration.
2. Install MacPorts
3. Confirm and update MacPorts
$ sudo port -v selfupdate
4. “Add the following to /opt/local/etc/macports/variants.conf to prevent downloading the entire unix library with the next commands”
+bash_completion +quartz +ssl +no_x11 +no_neon +no_tkinter +universal +libyaml -scientific
5. Install Python
$ sudo port install python26
If for any reason you forgot to add the above variants, cancel the install and run a “clean” to delete all the intermediary files MacPorts created. Then edit the variants.conf file (above) and install Python again.
$ sudo port clean python26
6. Change the reference to the new Python installation
If you type the following, you will see a reference to the default installation of Python on Mac OS X 10.5 (Python 2.5).
$ which python
You should see this:
/usr/bin/python
To change this reference to the MacPorts installation, first install python_select
$ sudo port install python_select
Then use python_select to change the $ python reference to the Python version installed above.
$ sudo python_select python26
UPDATE 2011-12-07: python_select has been replaced by port select so…
To see the available Python installations, run
port select --list python
From that list, choose the one you want and switch to it, e.g.
sudo port select --set python python26
Now if you type
$ which python
You should see
/opt/local/bin/python
which is a symlink to the MacPorts installation at
/opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/python
Typing the following will now launch the Python 2.6 interactive shell (ctrl + d to exit).
$ python
7. Install required libraries for Scrapy
$ sudo port install py26-libxml2 py26-twisted py26-openssl
Other posts recommended installing py26-setuptools, but it kept returning errors, so I skipped it.
8. Test that the correct architectures are present:
$ file `which python`
Note that those are backticks, not single quotes. For Intel Macs running 10.5, it should spit out:
/opt/local/bin/python: Mach-O universal binary with 2 architectures
/opt/local/bin/python (for architecture i386): Mach-O executable i386
/opt/local/bin/python (for architecture ppc7400): Mach-O executable ppc
9. Confirm the libxml2 library is installed (those really are single quotes this time). If there are no errors, it imported successfully.
$ python -c 'import libxml2'
10. Install Scrapy
$ sudo /opt/local/bin/easy_install-2.6 scrapy
11. Make the scrapy command available in the shell
$ sudo ln -s /opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/scrapy /usr/local/bin/scrapy
One caveat for the above: on a fresh computer you might not have a /usr/local/bin directory, so you will need to create it before you can run the above to create the symlink.
$ sudo mkdir /usr/local/bin
12. Finally, type either of the following to confirm that Scrapy is indeed installed on your system.
$ python -c 'import scrapy'
$ scrapy
One final bit… I also installed IPython from MacPorts for use with Scrapy.
sudo port install py26-ipython
Make a symbolic link
sudo ln -s /opt/local/bin/ipython-2.6 /usr/local/bin/ipython
An article on ipython
Set up MacPorts Python and Scrapy successfully
“Scrapy is a fast high-level screen scraping and web crawling framework, used to crawl websites and extract structured data from their pages. It can be used for a wide range of purposes, from data mining to monitoring and automated testing.”
But, it can be a little tricky to get running…
Attempting to install Scrapy on my MBP following the Scrapy documentation and the help of this post, I kept running into errors with the libxml and libxslt libraries.
I wanted to let MacPorts manage all the libraries, but I had trouble with it referencing the wrong installation of Python. I began with three installs:
- The default Apple Python 2.5.1 located at: /usr/bin/python
- A previous version I had installed located: /Library/Frameworks/Python.framework/Versions/2.7
- And a Macport version located: /opt/local/bin/python2.6
My trouble was that typing:
$ python
would always launch the 2.7 version when I needed it to use the MacPorts version. The following did not help:
$ sudo python_select python26
I even removed the 2.7 version, which only caused an error.
I figured out I needed to change the default path to the Macports version using the following:
$ PATH=$PATH:/opt/local/bin ; export PATH
And then rerun the port commands, etc.
Finally, these instructions did not make the scrapy-ctl.py command available by default, so I had to reference the scrapy-ctl.py file directly.
A quick addendum to this post with instructions to create the link, found on the Scrapy site (#2 and #3).
Starting with #2, “Add Scrapy to your Python Path”
sudo ln -s /opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/scrapy-ctl.py /opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/scrapy
And #3, “Make the scrapy command available”
sudo ln -s /opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/scrapy-ctl.py /usr/local/bin/scrapy
How to easily set up a campaign finance database (well, kind of) or Make Python work with MAMP via MySQLdb
I’ve been trying for a few hours to run a Python script from The Sunlight Foundation Labs which downloads (and updates) a campaign finance database from the Center for Responsive Politics. See their original post for more information.
In the process of getting this working I accidentally broke a working copy of MySQL and overwrote a database installed on my MBP (which I had stupidly not backed up since last year). FYI, you can rebuild a (MyISAM) MySQL database from the original .frm, .MYD, and .MYI files if you 1. recreate the database in the new install of MySQL and 2. drag the files into the mysql data folder.
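The two-step restore above can be sketched in a few lines of Python. This is only an illustration; the data directory path is an assumption (adjust it to your own MySQL install), and restore_tables is a name I made up:

```python
# Hedged sketch: restore MyISAM tables by copying their raw .frm/.MYD/.MYI
# files into the server's data directory for an already-created database.
import shutil
from pathlib import Path

DATA_DIR = Path("/Applications/MAMP/db/mysql")  # assumption: MAMP's data dir

def restore_tables(backup_dir, db_name, data_dir=DATA_DIR):
    """Copy table files from backup_dir into data_dir/db_name.

    Step 1 (CREATE DATABASE db_name) must already have happened so the
    db_name directory exists and the server knows about the database.
    """
    target = Path(data_dir) / db_name
    copied = []
    for f in Path(backup_dir).iterdir():
        # Only the three MyISAM table files matter; skip everything else.
        if f.is_file() and f.suffix.lower() in {".frm", ".myd", ".myi"}:
            shutil.copy2(f, target / f.name)
            copied.append(f.name)
    return sorted(copied)
```

Stop the MySQL server before copying and restart it afterward so it picks the tables up cleanly.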
I struggled quite a bit getting Python to work with MySQL via MySQLdb. I’m documenting some of the headaches and resolutions here in case they are useful. I’ve tried to include error messages for searches as well.
The Sunlight Foundation instructions require Python and MySQL, but don’t mention that you must already have wrestled with the madness of installing Django on your machine. Here is what I did to get it working on my MacBook Pro (Intel Core 2 Duo). I’ve included their original instructions along with my own (and those of a host of others).
- Install MAMP.
While I had working installations of MySQL and Python (via installers on respective sites), I couldn’t get Python to connect to MySQL via MySQLdb. I decided to download and try MAMP for a clean start.
- Install XCode
Past installs are available on the Apple Developer website.
- Install setuptools
Required for the MySQLdb driver. Remove the .sh extension from the filename (setuptools-0.6c11-py2.7.egg.sh) and in a shell:
~$ chmod +x setuptools-0.6c11-py2.7.egg
- Install the MySQLdb driver
After downloading and unzipping, from the directory:
~$ python setup.py build
~$ sudo python setup.py install
Continue following the advice of this post to the end: How to install Django with MySQL on Mac OS X.
I also followed another piece of advice in Python MySQL on a Mac with MAMP to change the mysql_config path.
Especially useful is his test script for making sure that Python is indeed accessing MySQL.
- Create a symbolic link between Python and MySQL in MAMP
This is required in order to use a socket to connect to the MySQL. See How to install MySQLdb on Leopard with MAMP for more information.
~$ sudo ln -s /Applications/MAMP/tmp/mysql/mysql.sock /tmp/mysql.sock
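With the socket link in place, a quick script like the following confirms Python can reach MAMP’s MySQL (in the spirit of the test script mentioned above). The credentials are assumptions based on MAMP’s defaults (root/root), and connection_kwargs is my own helper name:

```python
# Build the arguments for MySQLdb.connect() against a MAMP MySQL server.
MAMP_SOCKET = "/Applications/MAMP/tmp/mysql/mysql.sock"

def connection_kwargs(db="test"):
    """Keyword arguments for MySQLdb.connect(); db name is illustrative."""
    return {
        "host": "localhost",
        "user": "root",        # MAMP default user (assumption)
        "passwd": "root",      # MAMP default password (assumption)
        "db": db,
        "unix_socket": MAMP_SOCKET,
    }

# Typical usage, once the MySQLdb driver is installed:
#   import MySQLdb
#   conn = MySQLdb.connect(**connection_kwargs())
#   print(conn.get_server_info())
#   conn.close()
```

If the connect call raises “Can’t connect to local MySQL server through socket”, re-check the symlink created above.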
- Create a directory and put the two Python files in it.
- Modify the top of the sun_crp.py file to set certain parameters–your login credentials for the CRP download site and your MySQL database information.
- Install pyExcelerator
ImportError: No module named pyExcelerator
I had to install this module next.
- Comment out multiple lines
NameError: name 'BaseCommand' is not defined
In download.py, comment out the following import:
from django.core.management.base import BaseCommand, CommandError
as well as everything from class CRPDownloadCommand(BaseCommand): to the end of the file.
- From the command line, run the script by typing, from the proper directory: python sun_crp.py.
- It will take several hours to download and extract the data, especially the first time it’s run. But after that, you’re good to go.
Automata: Counter-Surveillance in Public Space paper on the Public Interventions panel at ISEA2010
ISEA2010 RUHR Conference in Dortmund, Germany
P26 Public Interventions
Tue 24 August 2010
Volkshochschule Dortmund, S 137a
Moderated by Georg Dietzler (de)
- 15:00h | Owen Mundy (us): Automata: Counter-Surveillance in Public Space
- 15:20h | Christoph Brunner (ch/ca), Jonas Fritsch (dk): Balloons, Sweat and Technologies. Urban Interventions through Ephemeral Architectures
- 15:40h | Georg Klein (de): Don’t Call It Art! On Artistic Strategies and Political Implications of Media Art in Public Space
- 16:00h | Georg Dietzler (de): Radical Ecological Art and No Greenwash Exhibitions
About my talk:
Automata is the working title for a counter-surveillance internet bot that will record and display the mutually beneficial interrelationships between institutions of higher learning, the global defense industry, and world militaries. Give Me My Data is a Facebook application that helps users reclaim and reuse their Facebook data. The two projects, both ongoing, address important issues surrounding contemporary forms of communication, surveillance, and control.
New Automata sitemaps
Facebook Death Star
New sitemaps with NodeBox
Here are some new sitemaps, built using the Python Scrapy library and NodeBox.