I made a small app to compare the number of search results for two phrases from the Digital Public Library of America for a hackathon / workshop here at Florida State next week.
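The app is a thin wrapper around the DPLA search API. A minimal sketch of the idea in PHP, assuming you have requested your own api_key (the helper function name here is mine, not part of the app):

```php
<?php
// Count DPLA search results for a phrase. A minimal sketch, assuming
// a valid api_key; the v2 /items endpoint returns a JSON "count".
function dpla_count($phrase, $key) {
    $url = 'https://api.dp.la/v2/items?q=' . urlencode($phrase)
         . '&api_key=' . urlencode($key);
    $json = json_decode(file_get_contents($url), true);
    return isset($json['count']) ? $json['count'] : 0;
}

$key = 'YOUR_API_KEY';
printf("%s: %d results\n%s: %d results\n",
    'panther', dpla_count('panther', $key),
    'alligator', dpla_count('alligator', $key));
```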
The Digital Scholars Reading and Discussion Group will simulate its second “hackathon” on April 21, allowing participants to learn more about the back-end structure of the Digital Public Library of America (DPLA). With its April 2013 launch, the DPLA became the first all-digital library to aggregate metadata from collections across the country and make them available from a single point of access. The DPLA describes itself as a freely available, web-based platform for digitized cultural heritage projects, as well as a portal that connects students, teachers, scholars, and the public to library resources hosted on other platforms.
From a critical point of view, the DPLA simultaneously relies on and disrupts the principles of location and containment, making its infrastructure particularly interesting to observe.
In this session, we will visit the DPLA’s Application Programming Interface (API) Codex to observe some of the standards that contributed to its construction. We will consider how APIs function, how and why to use them, and who might access their metadata and for what purposes. For those completely unfamiliar with APIs, this session will serve as a useful introduction, as well as a demonstration of why a digital library might also want to serve as an online portal. For those more familiar with APIs, it will be an opportunity to try out different tasks using the metadata that the DPLA aggregates from collections across the country.
At this particular session, we are pleased to be joined by Owen Mundy from the FSU Department of Art and Richard Urban from the FSU College of Communication and Information, both of whom have worked with APIs like the DPLA’s on projects ranging from visualization and graphics scripting to collections dashboards.
As before, the session is designed with a low barrier to entry in mind, so participants should not worry if they do not have programming expertise or are still learning the vocabulary associated with open-source projects. We come together to learn together, and all levels of skill are accommodated, as are all attitudes and leanings.
Participants are encouraged to explore the Digital Public Library of America site prior to our meeting and to familiarize themselves with the history of the project. Laptops will be available for checkout, but attendees are encouraged to bring their own.
Joelle Dietrick and I embarked on a new body of work this summer called “Packet Switching.” Inspired by her Sherwin Series images and wall paintings, and my work deconstructing and re-visualizing source code and other data, we’ve created two new software projects, as well as a series of limited edition prints, large photo installations, wall-sized paintings, and animations.
The full statement explains our process and intent clearly:
Packet Switching is an ongoing body of work by Joelle Dietrick and Owen Mundy that visualizes architecture as fragments affected by economic and communications systems.
The title of the series references how contemporary communications systems break digital files into smaller, manageable blocks of data called packets. Each packet is sent through a network along the quickest available route, and the packets are reassembled once they reach their destination. One JPG image, for example, might be broken into several packets, each of which may travel a different path through the net, even through different cities, before being recompiled into a copy of the original file.
To reference this common process used in networked systems, we wrote custom software that deconstructs a 3D model’s source code and produces unique fragments. We further remixed these fragments using an original application created in Processing. The resulting images become limited edition prints, large photo installations, wall-sized paintings, and animations.
Our process underscores how incidental fragmentation and automation can streamline markets, but also make them vulnerable to systems failure. The use of architecture specifically points to recent real estate market volatility and considers how the pursuit of profit margins, enabled by communication technology, alters our most basic needs.
The first piece of software, which “deconstructs a 3D model’s source code and produces unique fragments,” is open source and available on GitHub. Essentially, the PHP software parses a 3D COLLADA file and exports a set number of geometries, which can then be further broken down and used in an artwork or design.
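The core of the parsing step can be sketched in a few lines (a simplified illustration, not the actual repository code; COLLADA is XML, so the geometries can be pulled out with SimpleXML):

```php
<?php
// Simplified sketch: split a COLLADA (.dae) file into geometry fragments.
// Filenames are placeholders; the real project on GitHub does more.
$dae = simplexml_load_file('model.dae');
$dae->registerXPathNamespace('c', 'http://www.collada.org/2005/11/COLLADASchema');

foreach ($dae->xpath('//c:library_geometries/c:geometry') as $i => $geometry) {
    // Each <geometry> element becomes its own fragment file.
    file_put_contents("fragment-$i.xml", $geometry->asXML());
}
```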
The second software, which we will release soon, remixes these fragments using Processing. The video below shows an example of the whole process.
Wall painting at “Temporary Home” in Kassel, Germany
While artists-in-residence at Temporary Home in Kassel, Germany, a residency that coincided with Documenta 13, Joelle Dietrick and I completed a wall-sized temporary painting based on the architecture of the Bauhaus school at Dessau and on 2012 American color forecasts.
Commission at Weimer Hall at the University of Florida
Joelle and I have also received a commission to complete Packet Switching (Weimer Hall) at the University of Florida College of Journalism and Communications this fall. This will be inkjet on adhesive polyester on a large wall (approx. 177.5 ft. x 20.2 ft.). More details soon.
A hands-on workshop offering a technical and theoretical overview of the contemporary Application Programming Interfaces (APIs) of large social networks, and how to use them for your own project or application.
We will cover the ins and outs of creating Facebook apps, web applications, and how to play with the Twitter, Foursquare, Flickr and Instagram APIs.
With WORM’s current artists-in-residence Tim C. Schwartz and Owen Mundy, you’ll learn from a programmer’s perspective how to approach your target network and subsequently make it do what you need it to… As examples, think of Owen’s project “Give Me My Data” or WORM’s previous release, the “Web2.0 Suicide Machine.”
No specific technical skills are required, but a curious mind for the ins and outs of social networking is a must!
Buy your ticket online or send your reservation to moddr@worm.org. Participation is limited, so act fast!
– reduced/discount tickets available for WORM volunteers & students (with ID)
No one would be surprised to learn that almost all user-generated-content websites use our personal data to sell advertisements. In fact, 97% of Google’s revenue comes from advertising.[1] That’s why it’s important these sites provide as much access as possible to the real owners of our data: us. After all, we put it there and allow them to use it in exchange for the use of their software. Seems like a fair trade if you ask me.
A year and a half ago, Facebook didn’t provide any such access. That’s why I created Give Me My Data: to help users reclaim and reuse the personal data they put on Facebook.
By giving more agency to users of online systems, Give Me My Data may have already impacted the nature of online application development. In November 2010, almost a year after I launched Give Me My Data, Facebook created its own service for users to export their profiles from Facebook as a series of HTML pages. Unlike Give Me My Data, the Facebook service doesn’t allow you to select which data you want or to choose custom formats to export. It also doesn’t give you options for visualization like the custom network graphs that Give Me My Data offers.
I believe their motivation originates in part with my application, likely due to the popularity of Give Me My Data, and points to the potential usefulness of similar apps. While years down the road may reveal many other online systems giving users control over their data, I see this as a positive effect in which the content we create, as well as the means to share and manage it, is democratized.
Meanwhile, the above also keeps me hard at work developing the Give Me My Data project. This week I rewrote the program to use Facebook’s new OAuth authorization, which also required rewriting all of the code that fetches the data. Previously it used the REST API, which is being deprecated (sometime?) in the future. I also added new data types, fixed the CSV format (which had the rows and columns mixed up), and added the option to export in the JSON data format.
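The new fetching code follows the standard Graph API pattern. Something like this sketch (the endpoint and fields are illustrative, not the app’s actual source), which also shows the corrected CSV orientation, one record per row:

```php
<?php
// Sketch of the Graph API + OAuth pattern, not the app's actual source.
// $token would come from Facebook's OAuth authorization flow.
$token = 'USER_ACCESS_TOKEN';
$url = 'https://graph.facebook.com/me/friends?access_token=' . urlencode($token);
$response = json_decode(file_get_contents($url), true);

// CSV with records as rows and fields as columns (the fixed orientation).
$out = fopen('friends.csv', 'w');
fputcsv($out, array('id', 'name'));
foreach ($response['data'] as $friend) {
    fputcsv($out, array($friend['id'], $friend['name']));
}
fclose($out);

// JSON export is then just a re-encode of the same data.
file_put_contents('friends.json', json_encode($response['data']));
```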
Finally, in the data selector, I distinguished between standard and customized data types. When I say customized, I mean that I’ve written code that mashes together more than one data table and/or addresses a specific question. For example, right now users can select from two types of network graphs and corresponding formats. One describes the user’s relationship to their friends; the other describes the user’s relationship to their friends, as well as all of their friends’ relationships to each other, in various graph description languages. This is how I made the network graph image below. I’m also interested in hearing other suggestions for custom queries I might add. The project will be open source on GitHub soon, so even code contributions will be welcome.
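For the mutual-friends graph, the export is just a plain-text graph description. A toy example of writing one in Graphviz’s DOT language (the names are invented; GDF or GraphML output works the same way):

```php
<?php
// Toy example: write friend pairs as an undirected Graphviz DOT graph.
// Names are invented, not real exported data.
$edges = array(
    array('Alice', 'Bob'),
    array('Alice', 'Carol'),
    array('Bob', 'Carol'),
);
$dot = "graph friends {\n";
foreach ($edges as $pair) {
    $dot .= sprintf("    \"%s\" -- \"%s\";\n", $pair[0], $pair[1]);
}
$dot .= "}\n";
file_put_contents('friends.dot', $dot); // render with: dot -Tpng friends.dot
```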
Anyway, please try out the new version. You may have to delete the app from your allowed applications and then re-authorize it if you’ve used it before. As usual, you can provide feedback on the application page, and you can also contact me on Twitter via @givememydata.
A two-day workshop, with both technical hands-on and idea-driven components. Learn to scrape data and reuse public and private information by writing custom code and using the Facebook API. Additionally, we’ll converse and conceptualize ideas to reclaim our data literally and also imagine what is possible with our data once it is ours!
Here are the slides and some of the code samples from the Freedom for Our Files (FFOF) workshop I just did in Linz at Art Meets Radical Openness (LiWoLi 2011).
The first one is a basic scraping demo that uses “find-replace” parsing to change specific words (I’m including examples below the code)
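In outline, it looks something like this (a minimal stand-in, not the exact workshop file):

```php
<?php
// Find-replace parsing: fetch a page and swap specific words
// before printing the modified markup back out.
$html = file_get_contents('http://example.com/');
$html = str_replace('Example Domain', 'Scraped Domain', $html);
echo $html;
```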
Basic scraping demo with “foreach” parsing
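Roughly like so (again a sketch in the same spirit): load the markup into a DOM and loop over every link.

```php
<?php
// "foreach" parsing: walk the DOM and print each link's text and href.
$doc = new DOMDocument();
@$doc->loadHTMLFile('http://example.com/'); // @ suppresses warnings from sloppy HTML
foreach ($doc->getElementsByTagName('a') as $link) {
    echo $link->textContent . ' -> ' . $link->getAttribute('href') . "\n";
}
```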
Basic scraping demo with “regex” parsing
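For instance (a stand-in for the original file):

```php
<?php
// "regex" parsing: pull the page title out of the raw markup
// with a single pattern instead of a DOM parser.
$html = file_get_contents('http://example.com/');
if (preg_match('/<title>(.*?)<\/title>/is', $html, $m)) {
    echo 'Title: ' . trim($m[1]) . "\n";
}
```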
Basic scraping demo with “foreach” and “regex” parsing
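Combining the two approaches (sketch): match every link with a regex, then foreach over the captures.

```php
<?php
// "foreach" + "regex" parsing: match every link, then loop over
// the captures to print text/href pairs.
$html = file_get_contents('http://example.com/');
preg_match_all('/<a[^>]+href="([^"]+)"[^>]*>(.*?)<\/a>/is', $html, $m, PREG_SET_ORDER);
foreach ($m as $match) {
    echo strip_tags($match[2]) . ' -> ' . $match[1] . "\n";
}
```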
And the example, which presents the same information in a new way…
Advanced scraping demo with “regex” parsing. Retrieves the current weather in any city and colors the background accordingly. The normalization math below could use some work.
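A sketch of the idea (the endpoint and markup here are hypothetical placeholders; the workshop version scraped a live weather site):

```php
<?php
// Weather demo sketch. weather.example.com and the <span class="temp">
// markup are hypothetical stand-ins for a real weather page.
// The normalization maps roughly -10..40 degrees C onto blue..red.
$city = isset($_GET['city']) ? $_GET['city'] : 'Linz';
$html = file_get_contents('http://weather.example.com/' . urlencode($city));
if (preg_match('/<span class="temp">(-?\d+)/', $html, $m)) {
    $temp = (int) $m[1];
    $t = max(0, min(1, ($temp + 10) / 50)); // normalize to 0..1, clamped
    $red  = (int) round($t * 255);
    $blue = 255 - $red;
    printf('<body style="background: rgb(%d, 0, %d)">%s: %d&deg;C</body>',
        $red, $blue, htmlspecialchars($city), $temp);
}
```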
For the next part of the workshop we used Give Me My Data to export our information from Facebook in order to revisualize it with Nodebox 1.0, a Python IDE similar to Processing.org. Here’s an example:
Update: Some user images from the workshop. Thanks to all who joined!
Mutual friends (using Give Me My Data and Graphviz) by Rob Canning
identi.ca network output (starting from my username (claude) with depth 5, rendered to svg with ‘sfdp’ from graphviz) by Claude Heiland-Allen
This weekend I am presenting a lecture about Give Me My Data and conducting a two-day data-scraping workshop at Art Meets Radical Openness in Linz, Austria. Here are the details.
The Self-Indulgence of Closed Systems
May 13, 18:45 – 19:15
Part artist lecture, part historical context, Owen Mundy will discuss his Give Me My Data project within the contexts of the history of state surveillance apparatuses, digital media and dialogical art practices, and the ongoing contradiction of privacy and utility in new media.
Freedom for Our Files: Creative Reuse of Personal Data
May 13-14, 14:00 – 16:30
A two-day workshop, with both technical hands-on and idea-driven components. Learn to scrape data and reuse public and private information by writing custom code and using the Facebook API. Additionally, we’ll converse and conceptualize ideas to reclaim our data literally and also imagine what is possible with our data once it is ours! Register here
LiWoLi is an open lab and meeting spot for artists, developers and educators using and creating FLOSS (free/libre open source software) and Open Hardware in the artistic and cultural context. LiWoLi is all about sharing skills, code and knowledge within the public domain and discussing the challenges of open practice.
I launched Keyword Intervention in January 2007, and for almost four years now it has been scraping topical search terms and attracting random traffic. Today I moved the project to its own domain, keywordintervention.com, and also updated the documentation on the site. Below is a sample of the last 500 search terms entered by users all around the world. The full list is here.
P26 Public Interventions
Tue 24 August 2010
15:00–16:30h
Volkshochschule Dortmund, S 137a
Moderated by Georg Dietzler (de)
15:00h | Owen Mundy (us): Automata: Counter-Surveillance in Public Space
15:20h | Christoph Brunner (ch/ca), Jonas Fritsch (dk): Balloons, Sweat and Technologies. Urban Interventions through Ephemeral Architectures
15:40h | Georg Klein (de): Don’t Call It Art! On Artistic Strategies and Political Implications of Media Art in Public Space
16:00h | Georg Dietzler (de): Radical Ecological Art and No Greenwash Exhibitions
About my talk:
Automata is the working title for a counter-surveillance internet bot that will record and display the mutually beneficial interrelationships between institutions of higher learning, the global defense industry, and world militaries. Give Me My Data is a Facebook application that helps users reclaim and reuse their Facebook data. The two projects, both ongoing, address important issues surrounding contemporary forms of communication, surveillance, and control.
After four months, the new yourarthere.net website and member-run content management system is now live. Thanks to Braylin and Brittany Morales, Beth Lee, and Chris Cumbie for all their hard work.
The site is valid XHTML/CSS and runs on PHP/MySQL using the CodeIgniter framework. All the details of our research, from inception onward, are archived here.
This site is based around the idea that members should have control of the content on the website. Every member has a profile where they can add images, text, tags, and events to promote their artwork or group. Members can create a new profile for every domain they host with yourarthere.net.
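In CodeIgniter, a profile page like that boils down to a small controller. A hypothetical sketch (CodeIgniter 1.x conventions; the model and view names are invented, not the site’s actual code):

```php
<?php
// Hypothetical member-profile controller (CodeIgniter 1.x style).
// member_model and profile_view are invented names for illustration.
class Profile extends Controller {

    function view($member_id) {
        $this->load->model('member_model');
        $data['member'] = $this->member_model->get_member($member_id);
        // The view renders the member's images, text, tags, and events.
        $this->load->view('profile_view', $data);
    }
}
```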