Posts Tagged ‘code’

I Know Where Your Cat Lives launched

Wednesday, July 16th, 2014

I just launched a new ongoing project this week. Here’s the text, a video and some screenshots. I’ll post more about how I made it soon.

Welcome to today’s internet—you can buy anything, every website is tracking your every move, and anywhere you look you find videos and images of cats. Currently, there are 15 million images tagged with the word “cat” on public image hosting sites, and thousands more are uploaded daily from unlimited positions on the globe.

“I Know Where Your Cat Lives” iknowwhereyourcatlives.com is a data experiment that visualizes a sample of 1 million public pics of cats on a world map, locating them by the latitude and longitude coordinates embedded in their metadata. The cats were accessed via publicly available APIs provided by popular photo sharing websites. The photos were then run through various clustering algorithms using a supercomputer at Florida State University in order to represent the enormity of the data source.
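
For the curious, the geolocation step can be sketched in a few lines: a photo’s EXIF block often carries GPS coordinates written by the camera or phone. Something roughly like this PHP pulls out a decimal latitude and longitude (the file name is a placeholder, and the project’s actual pipeline and clustering code are not shown here):

<?php
// Convert an EXIF rational string like "52/1" to a float.
function exifRationalToFloat($rational) {
    list($num, $den) = explode('/', $rational);
    return $den == 0 ? 0 : $num / $den;
}

// Convert EXIF degrees/minutes/seconds plus a hemisphere ref to a signed decimal.
function exifGpsToDecimal($dms, $ref) {
    $decimal = exifRationalToFloat($dms[0])
             + exifRationalToFloat($dms[1]) / 60
             + exifRationalToFloat($dms[2]) / 3600;
    return in_array($ref, array('S', 'W')) ? -$decimal : $decimal;
}

$exif = @exif_read_data('cat-photo.jpg');  // placeholder file name
if ($exif && isset($exif['GPSLatitude'], $exif['GPSLongitude'])) {
    $lat = exifGpsToDecimal($exif['GPSLatitude'], $exif['GPSLatitudeRef']);
    $lon = exifGpsToDecimal($exif['GPSLongitude'], $exif['GPSLongitudeRef']);
    echo "lat: $lat, lon: $lon\n";  // one point to plot on the map
}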

This project explores two uses of the internet: the sociable and humorous appreciation of domesticated felines, and the status quo of personal data usage by startups and international megacorps who are riding the wave of decreased privacy for all. This website doesn’t visualize all of the cats on the net, only the ones that allow you to track where their owners have been.

Folks can also contribute to a Kickstarter to help with hosting costs.

[Screenshots of the I Know Where Your Cat Lives map interface]

Term vs. Term for Digital Public Library of America hackathon

Monday, April 14th, 2014

I made a small app to compare the number of search results for two phrases from the Digital Public Library of America for a hackathon / workshop here at Florida State next week.

http://owenmundy.com/work/term-vs-term
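
Under the hood the app boils down to two requests against the DPLA Items API and a comparison of the returned counts. A rough server-side PHP equivalent (the live app runs in the browser, and the API key and search terms below are placeholders):

<?php
// Count DPLA search results for a phrase via the Items API.
function dplaCount($phrase, $apiKey) {
    $url = 'https://api.dp.la/v2/items?q=' . urlencode($phrase) . '&api_key=' . $apiKey;
    $json = json_decode(file_get_contents($url), true);
    return isset($json['count']) ? (int) $json['count'] : 0;
}

$apiKey = 'YOUR_DPLA_API_KEY';  // placeholder
$a = dplaCount('manatee', $apiKey);
$b = dplaCount('mermaid', $apiKey);
echo "manatee: $a vs. mermaid: $b\n";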

[Screenshot: DPLA Term vs. Term search comparison]

Digital Humanities Hackathon II – Digital Public Library of America

Monday, April 21, 2:00-3:30 p.m.
Strozier Library, Scholars Commons Instructional Classroom [MAP]

The Digital Scholars Reading and Discussion Group will simulate its second “hackathon” on April 21, allowing participants to learn more about the back-end structure of the Digital Public Library of America. With its April 2013 launch, the DPLA became the first all-digital library that aggregates metadata from collections across the country, making them available from a single point of access. The DPLA describes itself as a freely available, web-based platform for digitized cultural heritage projects as well as a portal that connects students, teachers, scholars, and the public to library resources housed on other platforms.

From a critical point of view, the DPLA simultaneously relies on and disrupts the principles of location and containment, making its infrastructure somewhat interesting to observe.

In this session, we will visit the DPLA’s Application Programming Interface (API) codex to observe some of the standards that contributed to its construction. We will consider how APIs function, how and why to use them, and who might access their metadata and for what purposes. For those completely unfamiliar with APIs, this session will serve as a useful introduction, as well as a demonstration of why a digital library might also want to serve as an online portal. For those more familiar with APIs, this session will serve as an opportunity to try out different tasks using the metadata that the DPLA aggregates from collections across the country.

At this particular session, we are pleased to be joined by Owen Mundy from the FSU Department of Art and Richard Urban from the FSU College of Communication and Information, who have considered different aspects of working with APIs for projects such as the DPLA, including visualization and graphics scripting, and developing collections dashboards.

As before, the session is designed with a low barrier of entry in mind, so participants should not worry if they do not have programming expertise or are still learning the vocabulary associated with open-source projects. We come together to learn together, and all levels of skill are accommodated, as are all attitudes and leanings.

Participants are encouraged to explore the Digital Public Library of America site prior to our meeting and to familiarize themselves with the history of the project. Laptops will be available for checkout, but attendees are encouraged to bring their own.

After Douglas Davis – The World’s First Collaborative Sentence

Thursday, September 5th, 2013

[Screenshot: After Douglas Davis]

README for After Douglas Davis
==============================

Statement
————–

The World’s First Collaborative Sentence was created by Douglas Davis in 1994 and donated to the Whitney Museum of American Art in 1995. Much like today’s blog environments and methods for crowdsourcing knowledge, it allowed users to contribute practically any text or markup to a never-ending sentence with no limits on speech or length.

At some point the sentence stopped functioning, and in early 2012 the Whitney Museum undertook a “preservation effort” to repair and relaunch the project. Measures were taken during the “restoration” to stay true to the original intent of the artist, leaving dead links and the original code in place.

During the preservation the curators placed small sections of garbled ASCII text from the project on Github with the hope that others would “fork” the data and repair the original. However, the Whitney Museum failed to realize that the collaborative culture of the net Davis predicted has actually arrived. This is evident not only in sites like Wikipedia, Facebook, and Tumblr, but also in the open source movement, which brings us Linux, Apache, and PHP (the very technologies used to view this page), as well as others like Firefox, Arduino, Processing, and many more.

In the spirit of open source software and artists like Duchamp, Levine, runme.org and Mandiberg, on September 5, 2013, I “forked” Douglas Davis’ Collaborative Sentence by downloading all pages and constructing from scratch the functional code which drives the project. I have now placed this work on Github with the following changes:

1. All pages are updated to HTML5 and UTF-8 character encoding
2. The functional code was rewritten from scratch, including a script to remove malicious code
3. The addition of this statement

I was originally disappointed the Whitney Museum didn’t place the full source code in the public domain. What better way to make it possible for artists and programmers to extend the life of Davis’ project by learning from, reusing, and improving the original code than to open source this work? Though, possibly like Davis, my motivation is largely an interest in constructing a space for dialog, framing distinct questions and new possibilities, and waiting to see what happens from this gesture.
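
For reference, item 2 above leans on HTML Purifier (listed under Included software below) to strip malicious markup before a contribution is appended to the sentence. A minimal sketch of that filtering step, with an illustrative form field name and allowed-tag list rather than the project’s exact configuration:

<?php
require_once 'library/HTMLPurifier.auto.php';  // path depends on where the library lives

// Allow a small set of tags and attributes; everything else is stripped.
$config = HTMLPurifier_Config::createDefault();
$config->set('HTML.Allowed', 'a[href],b,i,em,strong,br,img[src|alt]');
$purifier = new HTMLPurifier($config);

$contribution = $_POST['sentence'];          // illustrative form field name
$clean = $purifier->purify($contribution);   // sanitized fragment

// Append the cleaned fragment to the ever-growing sentence.
file_put_contents('sentence.html', $clean . ' ', FILE_APPEND);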

Included software
————–
HTML Purifier http://htmlpurifier.org/

Live version
————–
Enter After Douglas Davis

About the author
————–
Owen Mundy http://owenmundy.com/

Washington Post review of “Grid, Sequence Me” show + documentation

Tuesday, January 22nd, 2013


The Washington Post recently published a review of Joelle’s and my exhibition at Flashpoint Gallery in D.C. Check it out: Joelle Dietrick & Owen Mundy: Grid, Sequence Me, by Maura Judkis, Jan. 11, 2013.

A few elements will be recognizable, such as the brutalist outline of the J. Edgar Hoover FBI Building, but many are stripped down to their most generic shapes, making rows of windows look like charts and bar graphs. The projections of some of those shapes echo and interplay with the forms of the Flashpoint gallery interior.

Dietrick and Mundy also scraped The Post’s listings of recent home sales, with architectural elements from some of those homes appearing before a dense thicket of live-streamed code. It’s a visual reminder of just how complicated the housing industry has become.

There’s a sense in the animation that the structures are tumbling away from you — just as homeownership has slipped out of the grip of many Americans. But the piece will elicit a different reaction here than in Florida, where the effects of the housing market crash have been far more pronounced. In Washington, we’ve mostly been insulated from it: Foreclosures are few, short sales are sparse. In the jumble of buildings and code, “Grid, Sequence Me” may serve as a warning for those who haven’t experienced that sense of loss — but who indirectly, through policy work, may have influenced the systems that led to the crash.
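
The scraping the review mentions is conceptually simple: fetch a listings page, pull out the text of each listing, and hand it to the visualization software. A rough PHP sketch, with a placeholder URL and XPath query standing in for the actual sources used in the installation:

<?php
// Fetch a (placeholder) page of D.C. short sale listings and extract listing text.
$html = file_get_contents('https://example.com/dc-short-sales');

$doc = new DOMDocument();
@$doc->loadHTML($html);          // suppress warnings from messy real-world markup
$xpath = new DOMXPath($doc);

// Placeholder selector: assume each listing sits in an element with class "listing".
$nodes = $xpath->query('//div[contains(@class, "listing")]');

$listings = array();
foreach ($nodes as $node) {
    $listings[] = trim($node->textContent);
}

// Hand the text off to the visualization (e.g. a Processing sketch reading this file).
file_put_contents('listings.json', json_encode($listings));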

I also finished a short piece with video from the installation and screen captures of the Processing visualization.

Grid, Sequence Me @ Flashpoint Gallery, Washington D.C.

Thursday, January 17th, 2013

[Installation view: Grid, Sequence Me at Flashpoint Gallery]

Surrounded by images of cross-sectioned buildings and source code excerpts, gallery visitors encounter fragments of Washington, DC architecture—a vaguely familiar roofline or grid of office windows—remixed with data and source code representing the latest housing sales in the area. Constantly changing, the live data streams into the gallery from both local sources (DC short sale listings) and national ones (federal policy sites), emphasizing the effects of related micro-macro shifts.

[Screen captures from the Grid, Sequence Me software]

Generated with custom software, these fragments echo financial systems and housing market fluctuations. They mirror mortgages repackaged and sold, titles lost in administrative tape, and dreams confused by legal jargon. Like the complex financial systems of the housing market heyday, the software generates an infinite number of arrangements. The complexity of unique and dynamically-created algorithmic outcomes contrasts with the comforting predictability referenced in the exhibition’s title, “Grid, Sequence Me.”

—Joelle Dietrick and Owen Mundy

[Additional installation views and screen captures]

Packet Switching exhibition, Center for Emerging Media @ UCF

Thursday, January 10th, 2013

Catching up on some documentation from recent shows. Below are images from the Packet Switching exhibition at the Center for Emerging Media, University of Central Florida, October 2012.

[Installation views: Packet Switching at the Center for Emerging Media, UCF]

Grid, Sequence Me projection test (cyan) – work in progress

Wednesday, November 14th, 2012

Joelle and I are working on an installation for Flashpoint Gallery in Washington, D.C. This animation was made using ColladaFragmenter to deconstruct the architecture, and Processing with OBJloader to visualize the 3D fragments. The text is scraped from pages describing short-sale real estate in D.C. This is a work in progress:

Packet Switching project: ColladaFragmenter software, Kassel, Germany and University of Florida Public Commission

Tuesday, August 21st, 2012

Joelle Dietrick and I embarked on a new body of work this summer called “Packet Switching.” Inspired by her Sherwin Series images and wall paintings, and my work deconstructing and re-visualizing source code and other data, we’ve created two new software projects, as well as a series of limited edition prints, large photo installations, wall-sized paintings, and animations.

The full statement explains our process and intent clearly:

Packet Switching is an ongoing body of work by Joelle Dietrick and Owen Mundy that visualizes architecture as fragments affected by economic and communications systems.

The title of the series references how contemporary communications systems break digital files into smaller manageable blocks of data called packets. The packets are then sent through a network, each taking the quickest route possible, and reassembled once they reach their destination. One JPG image, for example, might be broken into several packets, each of which may travel a different path through the net, even through different cities, before being recompiled into a copy of the original file.

To reference this common process used in networked systems, we wrote custom software that deconstructs a 3D model’s source code and produces unique fragments. We further remixed these fragments using an original application created in Processing. The resulting images become limited edition prints, large photo installations, wall-sized paintings, and animations.

Our process underscores how incidental fragmentation and automation can streamline markets, but also make them vulnerable to systems failure. The use of architecture specifically points to recent real estate market volatility and considers how communication-technology-enabled pursuits of profit margins alter our most basic needs.

The first software, which “deconstructs a 3D model’s source code and produces unique fragments,” is open source and available on Github. Essentially, the PHP software parses a 3D COLLADA file and exports a set number of geometries, which can then be further broken down and used in an artwork or design.
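
Since a COLLADA (.dae) file is plain XML, the parsing step can be illustrated with SimpleXML. The sketch below is an approximation of the idea, with a placeholder model file, not the actual ColladaFragmenter code:

<?php
// Split a COLLADA (.dae) file into one file per <geometry> node.
$ns  = 'http://www.collada.org/2005/11/COLLADASchema';  // COLLADA 1.4 namespace
$dae = simplexml_load_file('model.dae');                // placeholder input model
$dae->registerXPathNamespace('c', $ns);

$geometries = $dae->xpath('//c:library_geometries/c:geometry');
foreach ($geometries as $i => $geometry) {
    // Each geometry fragment is written out on its own, ready to be remixed.
    file_put_contents("fragment_$i.xml", $geometry->asXML());
}
echo count($geometries) . " fragments written\n";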

The second software, which we will release soon, remixes these fragments using Processing. The video below shows an example of the whole process.

Wall painting at “Temporary Home” in Kassel, Germany

While artists-in-residence at Temporary Home, in Kassel, Germany, which coincided with Documenta13, Joelle Dietrick and I completed a wall-sized temporary painting based on the architecture from the Bauhaus School at Dessau and 2012 American color forecasts.

Commission at Weimer Hall at the University of Florida

Joelle and I have also received a commission to complete Packet Switching (Weimer Hall) at the University of Florida College of Journalism and Communications this fall. This will be inkjet on adhesive polyester on a large wall (approx. 177.5 ft. x 20.2 ft.). More details soon.

PaletteVisualizer

Saturday, August 11th, 2012

I’ve been working on a lot of projects this summer using various color palettes. Since my work involves writing programs to use these palettes in different visual outcomes, I keep wishing for an easy way to view and negotiate the lists of colors. So today I wrote a simple app to visualize a list of hexadecimal values and sort them by hue, saturation, and value (HSV). The app can be used here, and the code is open under the MIT License.
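
The core of the app is just a hex-to-RGB-to-HSV conversion followed by a sort. The live version runs in the browser; here is a back-of-the-envelope PHP version of the same idea (the sample palette is arbitrary):

<?php
// Convert "#rrggbb" to an HSV triple (hue in degrees, saturation and value in 0..1).
function hexToHsv($hex) {
    $hex = ltrim($hex, '#');
    $r = hexdec(substr($hex, 0, 2)) / 255;
    $g = hexdec(substr($hex, 2, 2)) / 255;
    $b = hexdec(substr($hex, 4, 2)) / 255;

    $max = max($r, $g, $b);
    $min = min($r, $g, $b);
    $d   = $max - $min;

    if ($d == 0)        $h = 0;
    elseif ($max == $r) $h = 60 * fmod(($g - $b) / $d, 6);
    elseif ($max == $g) $h = 60 * (($b - $r) / $d + 2);
    else                $h = 60 * (($r - $g) / $d + 4);

    $s = $max == 0 ? 0 : $d / $max;
    return array(fmod($h + 360, 360), $s, $max);
}

// Sort an example palette by hue, then saturation, then value.
$palette = array('#e8ddcb', '#cdb380', '#036564', '#033649', '#031634');
usort($palette, function ($a, $b) {
    return hexToHsv($a) <=> hexToHsv($b);  // element-wise array comparison (PHP 7+)
});
print_r($palette);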

Thanks to Alexei Kourbatov for the color format conversion functions.

Give Me My Data API Hacking Masterclass

Thursday, July 26th, 2012

Give Me My Data API Hacking Masterclass
Thursday, July 27, 2012

A ‘hands-on’ workshop with a technical and theoretical overview of contemporary ‘Application Programming Interfaces’ (APIs) of large social networks, and how to use these for your own project or application.

We will cover the ins and outs of creating Facebook apps, web applications, and how to play with the Twitter, Foursquare, Flickr and Instagram APIs.
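
To give a flavor of what that looks like in practice: hitting a public API is often a single GET request. For example, a quick PHP call to the Flickr search endpoint (the API key is a placeholder; each service’s parameters and authentication requirements differ):

<?php
// A single GET against the Flickr API: search public photos tagged "cats".
$params = array(
    'method'         => 'flickr.photos.search',
    'api_key'        => 'YOUR_FLICKR_API_KEY',  // placeholder
    'tags'           => 'cats',
    'per_page'       => 10,
    'format'         => 'json',
    'nojsoncallback' => 1,
);
$url = 'https://api.flickr.com/services/rest/?' . http_build_query($params);
$response = json_decode(file_get_contents($url), true);

foreach ($response['photos']['photo'] as $photo) {
    echo $photo['title'] . "\n";
}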

With WORM’s current ‘Artists in Residence’ Tim C. Schwartz and Owen Mundy you’ll learn, from a programmer’s perspective, how to approach your target network and subsequently make it do what you need it to…

As an example, you can think of Owen’s project “Give Me My Data” or WORM’s previous release of the “Web2.0 Suicide Machine”.

No specific technical skills are required, but a curious mind towards the ins and outs of social networking is a must!

Buy your ticket online or send your reservation to moddr@worm.org.

Participation is limited, so act fast!
- Reduced/discount tickets are available for WORM volunteers & students (with ID)

Regular
Presale: € 15.00
At the door: € 15.00
Reduced
Presale: € 10.00
At the door: € 10.00

Location: WORM, Boomgaardsstraat 71, 3012 XA Rotterdam