Posts Tagged ‘networks’

Art and the Internet book published by Black Dog Publishing

Tuesday, July 8th, 2014

Here are some shots from the recently released Art and the Internet (Black Dog Publishing, London) with contributions from Joanne McNeil, Domenico Quaranta, and Nick Lambert. The book is a welcome update to writing on the subject and contains many well-known works by artists I’ve admired for years. Nice to be included.

Trust (Evidence Locker) (2004) by Jill Magid. Essay by Joanne McNeil.

Simple Net Art Diagram (1997) by MTAA. Essay by Domenico Quaranta.

Colorflip.com (2008) by Rafaël Rozendaal

Every Icon (1997) by John F. Simon Jr.

Tommy Chat Just Emailed Me (2006) by Ryan Trecartin

1:1 (1999-2002) by Lisa Jevbratt

They Rule (2001) by Josh On

My Generation (2010) by Eva and Franco Mattes

Artist’s Statement No. 45,730,944: The Perfect Artistic Website (2000) by Young-Hae Chang Heavy Industries

I am Unable to Fulfill Your Wish (2012) by Owen Mundy

No-one Ever Cried At A Website (Speed Show) Friday, November 22

Tuesday, November 19th, 2013

No-one Ever Cried At A Website (Speed Show)

November 22, 5:00–6:30 pm EST, Computer training room, Facility for Arts Research, 3216 Sessions Road, Tallahassee FL

With FSU students: Monique Boileau, Alexis Cooper, Jonathan Davito, Danielle English, Justin Greenstein, Antoinette Janus, Scotty Johnson, Melissa Lidsky, Michelle Medrano, Denise Morrow, Lena Weissbrot, Meghan “Red” Yancey. Curated by: Owen Mundy

Students from the Fall 2013 Network Art and Typography classes in the Department of Art at Florida State University are staging an exhibition titled No-one Ever Cried At A Website (Speed Show) on November 22, 5:00–6:30 pm EST, at the computer training room in the Facility for Arts Research, Tallahassee FL.

The exhibition title is borrowed from “No-one Ever Cried At A Website,” an article by artist and coder Matt Pearson. The essay examines how emotion is often forgotten when analyzing technologically sophisticated works of art, such as those that exist on the internet. It reminds readers that painting was once a technology, and asks how beauty, empathy, and interaction can all trigger emotional responses regardless of the medium of delivery. The prompt for the works in this show, most of them created collaboratively over the course of 10 days and specifically for this exhibition, is to address how emotion can be used to engage online audiences to look, listen, and be moved by internet-based art.

Speed Show exhibitions, a format popularized by artist Aram Bartholl, are arranged as follows: “Hit an Internet-cafe (or computer classroom), rent all computers they have and run a show on them for one night. All art works of the participating artists need to be on-line and are shown in a typical browser with standard plug-ins.”

Poster: Print resolution and E-mail resolution

MutaMorphosis: Examining Networks of Power stream today

Friday, December 7th, 2012

I am an “attractor” for a stream of interest at a conference happening this weekend in Prague. There are thought-provoking presentations lined up. Check it out if you are nearby.

Examining Networks of Power
Attractor: Owen Mundy (US)

Addressing the Future: The Tactics of Uncertainty
Attractor: Claudia Westermann (AT)

MutaMorphosis: Tribute To Uncertainty
Saturday, December 8, 2012
13:30-15:30 | Academy 2

Marc Garrett (UK) | Revisiting the Curious World of Art & Hacktivism
Diane Derr (QA) | Morphological Mapping of the Arab Spring through an Ecological Lens
Daphne Dragona (GR) | The power of counterpower
Katerina Karoussos (UK) | LUMEN HILARE – The (bio)photonic emission of a church and its metabolism in human consciousness

Owen Mundy: This thread intends to examine representations of contemporary control structures and to discuss whether images, particularly network graphs depicting power relationships, information flows, economic activity, and so on, have the ability to reverse trends of concentrated power. Does depicting power lead to greater democratization of that power, or are we merely creating beautiful images that detract from their original intent: to reveal, examine, and act? To what extent does mapping complex data about influential actors, institutions, moments in time, and finance systems onto a two-dimensional surface unravel the power they represent by distributing that power? Methodology: We’ll begin the discussion by looking at historical examples of communication that intend to affect power structures. We’ll examine works by artists and cultural practitioners such as Hans Haacke, Mark Lombardi, Josh On (theyrule.net), Bureau d’études, and more. We’ll consider Manuel Castells’ “Network Theory of Power” within these contexts. Then we’ll discuss the visual components of network graphs, asking in particular: what data and representational forms help make a graph speak to as many people as possible? Finally we’ll work in groups to extend the discussion into possible solutions and various statements and recommendations for such representations.

Claudia Westermann: More than 30 years ago, Michel de Certeau suggested that a shift in methodologies was necessary, from strategies that operate on the basis of planning and prediction to tactics that operate on the basis of appropriation. Within this context, competition for the best prediction is replaced by a search for authorities that create the places that give space. This stream of interest, titled ‘Addressing the Future – The Tactics of Uncertainty’, suggests that even though Utopia has not been valued recently, it is nevertheless the actual turn towards the future, this radical uncertainty, that allows for a profound re-questioning of our methodologies. How do the sciences and the arts address the future? Can we speak to it, and in doing so, can we create openings in the present? Methodology: The main intention of this stream of interest is to provide a ‘good’ starting question from which the theme of uncertainty can be addressed in a rather radical way. I am interested in viewpoints from all areas of the sciences and arts. In addition to presentations, there could be conversational panels and workshops. The latter could also be contextualized within the city of Prague. Which inclination this theme takes also depends on the other streams of interest and should be discussed.

PRAGUE | CZECH REPUBLIC | DECEMBER 6-8, 2012 | WWW.MUTAMORPHOSIS.ORG | ORGANIZED BY CIANT | WWW.CIANT.ORG

Project Presentation and Mini-Seminar: Live Project Launch, Workshop Outcomes and Talks on [Social] Media Hacking

Friday, July 27th, 2012

Tonight Tim Schwartz, Walter Langelaar, Birgit Bachler, and I share our new project. The concept has taken a turn from the original plan, but will be exciting nonetheless.

Project Presentation and Mini-Seminar: Live Project Launch, Workshop Outcomes and Talks on [Social] Media Hacking
Friday, July 27, 2012

Tonight we launch the new version of “Give Me My Data!”, a project revised and revived in WORM’s moddr_lab together with Artists in Residence Tim C. Schwartz and Owen Mundy.

Give Me My Data offers users functionality to retrieve and back up data and files from several key social networks; it is partly designed as a backup tool, but above all it targets networks that completely lack these functions.

Besides the project launch and an in-depth presentation by the artists, Walter Langelaar of WORM will give an introduction and an overview of similar (art)works and earlier projects that came out of WORM’s studios, like the “Web2.0 Suicide Machine”.

Furthermore, we’ll have presentations of current and ongoing projects from the lab: Birgit Bachler’s ‘online social gardening’ platform “Talk To The Plant”, the Force of Freedom (Roel Roscam Abbing and Micha Prinsen) presenting “partsba.se”, and Geert Lovink talking about the “Unlike Us” initiative. The last addition to tonight’s programme is a presentation by Greenhost.nl on their excellent RePress project: a WordPress plugin that automagically converts your site into a proxy server, countering censorship on the internet!

To conclude, there will be an open Q&A and panel discussion moderated by Florian Cramer of Creating010.

Projects & Speakers

Florian Cramer – Creating010
Florian Cramer is a reader and programme director at the applied research center Creating 010 at Hogeschool Rotterdam, The Netherlands. He is a critical writer on arts, culture, and information technology. Recent publications include Exe.cut(up)able statements: Poetische Kalküle und Phantasmen des selbstausführenden Texts, Wilhelm Fink, 2011.

Unlike Us / Geert Lovink
The aim of Unlike Us is to establish a research network of artists, designers, scholars, activists and programmers who work on ‘alternatives in social media’. Through workshops, conferences, online dialogues and publications, Unlike Us intends to both analyze the economic and cultural aspects of dominant social media platforms and to propagate the further development and proliferation of alternative, decentralized social media software.

Tim C. Schwartz – moddr_/WORM Artist in Residence
Tim Schwartz grew up in St. Louis, MO. He received a BA in Physics from Wesleyan University and an MFA in Visual Arts from the University of California, San Diego. In January 2010, he developed a technology to help reunite missing people affected by the earthquake in Haiti, and he now co-runs an organization dealing with family reunification. Last year Schwartz spent four months traveling the country in a mobile research laboratory, investigating what is lost as archives become digital.

Birgit Bachler – moddr_/WORM
Birgit is an Austrian artist living and working in Rotterdam/NL.
She graduated with a BA in Information Design / Media & Interaction Design from the University of Applied Sciences in Graz/AT and is a recent graduate of the MA Networked Media at the Piet Zwart Institute Rotterdam. She has a background in interactive, audiovisual media and programming.
Her interests focus on the influence of new media on our everyday lives and the similarities and differences between human and computational behavior.

RePress / Greenhost.nl
“This plugin was made in response to the ongoing limitation of the Open Web. In the dawn of 2012 we found ourselves confronted with a court-ruling blocking the Piratebay.org in the Netherlands. On the other side of the ocean new laws are being discussed to curtail web-freedom even further.”

We are pioneers in green hosting. We developed an innovative, energy-saving hosting platform that uses 70% less energy than other hosting providers. Our servers are housed at Evoswitch, the most sustainable data center in the Netherlands.

Owen Mundy – moddr_/WORM Artist in Residence
Owen Mundy is an artist, designer, and programmer who investigates public space and its relationship to data. His artwork highlights inconspicuous trends and offers tools to make hackers out of everyday users. He has an MFA in Visual Art from the University of California, San Diego and is an Assistant Professor of Art at Florida State University.

partsba.se / Force Of Freedom
At partsba.se you can upload, share, and download digital designs for real physical products. Partsba.se allows you to share designs of any nature, whether these designs are copyrighted or dangerous. Unlike other websites, partsba.se does not claim any rights to your designs once you upload them. In the near future partsba.se will run on a fully secure and anonymous server.

We believe that users should be free to reverse engineer any everyday objects that surround them, whether to improve these objects, customize them, repair them, or just to understand them.

The Force Of Freedom is a Rotterdam-based collective founded by Micha Prinsen and Roel Roscam Abbing in 2009. It researches ways in which we can relate to things that happen online.

Tickets
The following tickets are available:
Regular
Presale: € 5.00
At the door: € 5.00

Locatie
WORM
Boomgaardsstraat 71
3012 XA Rotterdam

Call for Abstracts: MutaMorphosis: Tribute to Uncertainty

Friday, May 25th, 2012

MutaMorphosis: Tribute to Uncertainty conference
December 6-8, 2012, Prague, Czech Republic.

Call for Abstracts
Deadline: July 1, 2012

Examining Networks of Power
By Owen Mundy

This thread intends to examine representations of contemporary control structures and to discuss whether images, particularly network graphs depicting power relationships, information flows, economic activity, and so on, have the ability to reverse trends of concentrated power. Does depicting power lead to greater democratization of that power, or are we merely creating beautiful images that detract from their original intent: to reveal, examine, and act? To what extent does mapping complex data about influential actors, institutions, moments in time, and finance systems onto a two-dimensional surface unravel the power they represent by distributing that power?

Methodology: We’ll begin the discussion by looking at historical examples of communication that intend to affect power structures. We’ll examine works by artists and cultural practitioners such as Hans Haacke, Mark Lombardi, Josh On (theyrule.net), Bureau d’études, and more. We’ll consider Manuel Castells’ “Network Theory of Power” within these contexts. Then we’ll discuss the visual components of network graphs, asking in particular: what data and representational forms help make a graph speak to as many people as possible? Finally we’ll work in groups to extend the discussion into possible solutions and various statements and recommendations for such representations.

Keywords: Networks; Power; Data; Politics

We invite you to respond to the Tribute to Uncertainty theme and/or to address one of the 21 streams of interest led by 28 conference Attractors.

Do you have something original to say about our world, which is increasingly fuzzy, unstable, and chaotic? Are you interested in how crisis, uncertainty, and complexity can come together to question the known as well as to predict and/or model the yet unknown? Do you want to share projects that intrinsically link the domains of scientific, artistic, and technological research and creativity, and that can be introduced as relevant tools for a better understanding of our common future?

The aim of this Call for Abstracts is to provide an opportunity for a community of interdisciplinary practitioners to engage in a transformative conversation on mutant futures.

Feel free to address conference Attractors and respond to a specific stream of interest in order to get involved with other interested individuals and collectives in structured collaborative efforts before, during, and hopefully also after the conference. Research groups formed by the Attractors, who serve as the Reviewers, will be expected to elaborate on the general theme Tribute to Uncertainty while turning attention to concrete research objectives as defined by the streams.

The results should be curated panels, roundtables, and workshops, as well as publications.

For more information about the MutaMorphosis conference (planned publications, preparatory committee, partnerships), please visit mutamorphosis.org.

Freedom for Our Files: Canvas starter (Facebook) app

Thursday, December 22nd, 2011

I’m happy to share the code from a Facebook app I created for a workshop earlier this year.

This is an example of a simple Facebook canvas application. This code was originally demoed during the Freedom for Our Files Facebook API workshop at the 2011 Art Meets Radical Openness festival in Linz, Austria. You can view and download the source code on GitHub.

The application is very simple: it creates a Facebook object, performs calls to get data belonging to the current user, and then prints the data exactly as it is returned. It includes examples of basic Facebook Graph API calls as well as an example of FQL (Facebook Query Language).
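For reference, here is a minimal sketch of that structure, assuming the Facebook PHP SDK of the time; the app ID and secret are placeholders, and the actual code lives in the GitHub repository mentioned above.

<?php

/*	Hypothetical sketch of a simple canvas app, not the exact workshop
 *	code; assumes the Facebook PHP SDK (facebook.php) of that era */

require 'facebook.php';

$facebook = new Facebook(array(
	'appId'  => 'YOUR_APP_ID',	// placeholder
	'secret' => 'YOUR_APP_SECRET',	// placeholder
));

if ($facebook->getUser())
{
	// Graph API call: basic profile of the current user
	$me = $facebook->api('/me');
	print '<pre>' . print_r($me, true) . '</pre>';

	// FQL call: similar data via Facebook Query Language
	$fql = $facebook->api(array(
		'method' => 'fql.query',
		'query'  => 'SELECT uid, name FROM user WHERE uid = me()',
	));
	print '<pre>' . print_r($fql, true) . '</pre>';
}
else
{
	// not logged in: link to the Facebook login dialog
	print '<a href="' . $facebook->getLoginUrl() . '">Log in</a>';
}

?>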

Internet service just got creepy: How to set up a wireless router with Comcast internet service

Saturday, August 27th, 2011

I just moved back to Florida after a one-year research project in Berlin and have subscribed to Comcast broadband service. The whole experience left a bad taste in my mouth, though not because the tech showed up two hours after the installation appointment window, nor because he held loud personal conversations on his cell phone while he was setting up the service. No, the icky feeling is more corporate and selfish, and it intrudes much further into my private space than “Joe the cable guy” ever could.

Comcast made me install software on my computer in order to use their broadband.

Upon his arrival, “Joe” announced he would need access to my computer to set up broadband service. Understanding that most of the people Joe deals with might not be IT whizzes, and might not manage to connect their machines without his help, I decided to let him use it rather than attempt to prove I was not a member of the usual group. After half an hour of complaining about previous customers to his friend on his cellphone, waiting for another Comcast person to flip a switch allowing him to do his job, and multiple trips to his truck, he showed me that the internet was indeed accessible on my computer.

At this point the laptop was connected directly to the cable modem via an ethernet cable. He announced I was to follow the steps on the screen, and he was out the door. The web page he had left up required me to agree to some terms, create a username, and then… install software? Really? I tried to access the net without the final step, but nothing doing. Unless I installed this software I was stuck. So I did it, still not believing that a company had really initiated this final invasion onto every customer’s computer. After it was done I had new bookmarks everywhere, for Comcast email, security, and some branding nonsense called “XFINITY” (I thought “X” went out with the ’90s and X Games?).

So I thought, “OK, Comcast, you got me, hit me with your best marketing slime. Whatever, I can delete the bookmarklets you installed in my browser, just let me access the service I paid for, wirelessly, on whichever device I want.”

But this is where the relationship got really creepy. Apparently when I installed the Comcast (spyware?) on my machine, it made note of my MAC address, a unique identifier of networked machines, so that it would only allow my machine (or another machine with that MAC address) to connect to the internet. This means when I attached a wireless router to the cable modem I could connect to the wifi, but there was no internet.

So it turns out that Comcast is not only forcing their adware on customers, it’s also making it difficult (though not impossible) for them to use more than one device. Presumably Comcast is doing this to prevent sharing of service among neighbors, but the end result is that you can’t share the service between more than one device, or between roommates or spouses for that matter.

An example (albeit a geeky one): between my wife and me we have two laptops, two smartphones, and a desktop computer that all might be talking to each other or accessing the net. Comcast’s so-called internet service didn’t allow for any such geekery because it only allows one device, with the correct MAC address, to connect.

So, here’s what I did, on my Mac, with some help from my sister’s boyfriend, Tom, and a lot from Google, to get my Linksys wireless router to work with Comcast internet.

  1. Confirm you can access the internet with your machine connected directly to the Comcast cable modem.
  2. Open Terminal and type (without the quotes): "ifconfig en0 | grep ether". This prints your computer’s MAC address.
  3. Now disconnect your computer from the modem and connect the modem ethernet cable to your wireless router. Make sure both are plugged-in.
  4. Connect to your wireless router via the airport on your machine.
  5. Go to the following link: http://192.168.1.1
  6. Under Setup, choose DHCP as the Internet Connection Type. Save Settings.
  7. Under Setup > MAC Address Clone, enter the alphanumeric MAC address returned by Terminal. Save Settings.
  8. Configure your wireless router like you normally would and you are up and running.
  9. Snicker at Comcast

Give Me My Data upgrade: New API, authorization, and data formats

Monday, July 4th, 2011

No one would be surprised to learn that almost all user-generated content websites use our personal data to sell advertisements. In fact, 97% of Google’s revenue comes from advertising.[1] That’s why it’s important these sites provide as much access as possible to the real owners of our data: us. After all, we put it there and allow them to use it in exchange for the use of their software. Seems like a fair trade if you ask me.

A year and a half ago Facebook didn’t provide any access. That’s why I created Give Me My Data: to help users reclaim and reuse the personal data they put on Facebook.

By giving more agency to users of online systems, Give Me My Data may have already impacted the nature of online application development. In November 2010, almost a year after I launched Give Me My Data, Facebook created their own service for users to export their profile from Facebook as a series of HTML pages. Unlike Give Me My Data, the Facebook service doesn’t allow you to select which data you want or to choose custom formats to export. It also doesn’t give you options for visualization like the custom network graphs that Give Me My Data offers.

I believe their motivation originates in part with my application, likely due to the popularity of Give Me My Data, and it points to the potential usefulness of similar apps. While the years ahead may reveal many other online systems giving users control over their data, I see this as a positive shift in which the content we create, as well as the means to share and manage it, is democratized.

Meanwhile, all of the above keeps me hard at work developing the Give Me My Data project. This week I rewrote the program to use Facebook’s new OAuth authorization, which also required rewriting all of the code that fetches the data. Previously it used the REST API, which is being deprecated (sometime?) in the future. I also added new data types, fixed the CSV format (which had the rows and columns mixed up), and added the option to export in the JSON data format.
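As a rough illustration of the difference between those two export formats (this is not the app’s actual code, and the records are made up), the same rows can be written as CSV and as JSON like so:

<?php

/*	Hypothetical sketch: the same records exported as CSV and as JSON */

$friends = array(
	array('uid' => '100001', 'name' => 'Alice'),
	array('uid' => '100002', 'name' => 'Bob'),
);

// CSV: one header row, then one row per record
$out = fopen('php://output', 'w');
fputcsv($out, array_keys($friends[0]));
foreach ($friends as $row)
{
	fputcsv($out, $row);
}
fclose($out);

// JSON: the whole array encoded at once
print json_encode($friends);

?>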

Finally, in the data selector I distinguished between standard and customized data types. When I say customized, I mean that I’ve written code that mashes together more than one data table and/or addresses a specific question. For example, right now users can select from two types of network graphs and corresponding formats. One describes the user’s relationships to their friends; the other describes the user’s relationships to their friends as well as all of their friends’ relationships to each other, in various graph description languages. This is how I made the network graph image below. I’m also interested in hearing other suggestions for custom queries I might add. The project will be open source on GitHub soon, so even code contributions will be welcome.
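To give a sense of what the mutual-friends graph type produces, here is a small hypothetical sketch that writes friend connections in the Graphviz DOT language; the names are invented, and the real export builds the same kind of structure from the Facebook API data.

<?php

/*	Hypothetical sketch: a friend network written as a Graphviz DOT file */

$me = 'Me';
$friends = array('Alice', 'Bob', 'Claude');
$mutual  = array(			// friend-to-friend connections
	array('Alice', 'Bob'),
	array('Bob', 'Claude'),
);

$dot = "graph friends {\n";
foreach ($friends as $friend)		// edges from me to each friend
{
	$dot .= "\t\"$me\" -- \"$friend\";\n";
}
foreach ($mutual as $pair)		// edges between friends
{
	$dot .= "\t\"{$pair[0]}\" -- \"{$pair[1]}\";\n";
}
$dot .= "}\n";

// render with Graphviz, e.g.: dot -Tpng friends.dot -o friends.png
file_put_contents('friends.dot', $dot);

?>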

Anyway, please try out the new version. You may have to delete the app from your allowed applications and then re-authorize it if you’ve used it before. As usual, you can provide feedback on the application page, and you can also contact me on Twitter via @givememydata.

[1] “Google Financial Tables for Quarter Ending June 30, 2009.” Retrieved October 13, 2010.

Semantic network of hierarchical tags from Camp La Jolla Military Park

Wednesday, June 29th, 2011

A semantic network visualization and detail using tags generated from the hierarchical tagging system I created with Thomas Evan Lecklider as part of my Camp La Jolla Military Park project.

For example, this item in the park, Defense Contractors recruit at UCSD Job Fair, is filed under: business » arms industry » spending » recruiting.
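As a quick sketch of how such a tag path can be split into parent-child edges for a semantic network (illustrative only, not the project’s actual implementation):

<?php

/*	Hypothetical sketch: hierarchical tag path to parent-child edges */

$path = 'business » arms industry » spending » recruiting';

// split the path into individual tags
$tags = array_map('trim', explode('»', $path));

// pair each tag with the one below it in the hierarchy
$edges = array();
for ($i = 0; $i < count($tags) - 1; $i++)
{
	$edges[] = array($tags[$i], $tags[$i + 1]);
}

print_r($edges);	// e.g. ('business', 'arms industry'), ...

?>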

Freedom for Our Files: Code and Slides

Monday, May 16th, 2011

A two-day workshop with both technical hands-on and idea-driven components. Learn to scrape data and reuse public and private information by writing custom code and using the Facebook API. Additionally, we’ll converse and conceptualize ideas to literally reclaim our data and imagine what is possible with our data once it is ours!

Here are the slides and some of the code samples from the Freedom for Our Files (FFOF) workshop I just did in Linz at Art Meets Radical Openness (LiWoLi 2011).

The first one is a basic scraping demo that uses “find-replace” parsing to change specific words (I’m including examples below the code)

<?php

/*	Basic scraping demo with "find-replace" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */

$url = "http://www.bbc.co.uk/news/";	// 0. url to start with

$contents = file_get_contents($url);	// 1. get contents of page in a string

					// 2. search and replace contents
$contents = str_replace(		// str_replace(search, replace, string)
			"News",					
			"<b style='background:yellow; color:#000; padding:2px'>LIES</b>",
			$contents);

print $contents;			// 3. print result

?>

Basic scraping demo with “foreach” parsing

<?php

/*	Basic scraping demo with "foreach" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */
 
$url = "http://www.bbc.co.uk/news/";	// 0. url to start with

$lines = file($url);			// 1. get contents of url in an array

$get_content = false;			// initialize the flag and buffer
$data = "";				// so they exist before the loop

foreach ($lines as $line_num => $line) 	// 2. loop through each line in page
{
					// 3. if opening string is found
	if(strpos($line, '<h2 class="top-story-header ">') !== false)
	{
		$get_content = true;	// 4. we can start getting content
	}

	if($get_content == true)
	{
		$data .= $line . "\n";	// 5. then store content until closing string appears
	}

	if(strpos($line, "</h2>") !== false) 	// 6. if closing HTML element found
	{
		$get_content = false;	// 7. stop getting content
	}
}

print $data;				// 8. print result

?>

Basic scraping demo with “regex” parsing

<?php

/*	Basic scraping demo with "regex" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */
 
$url = "http://www.bbc.co.uk/news/";		// 0. url to start with

$contents = file_get_contents($url);		// 1. get contents of url in a string
											
						// 2. match title
preg_match('/<title>(.*)<\/title>/i', $contents, $title);

print $title[1];				// 3. print result

?>

Basic scraping demo with “foreach” and “regex” parsing

<?php

/*	Basic scraping demo with "foreach" and "regex" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */

// url to start
$url = "http://www.bbc.co.uk/news/";

// get contents of url in an array
$lines = file($url);

// initialize the flag and the buffer before the loop
$get_content = false;
$data = "";

// look for the string
foreach ($lines as $line_num => $line) 
{	
	// find opening string
	if(strpos($line, '<h2 class="top-story-header ">') !== false) 
	{
		$get_content = true;
	}
	
	// if opening string is found 
	// then store content until closing string appears
	if($get_content == true) 
	{
		$data .= $line . "\n";
	}

	// closing string
	if(strpos($line, "</h2>") !== false) 
	{
		$get_content = false;
	}
}

// use regular expressions to extract only what we need...

// png, jpg, or gif inside a src="..." or src='...' 
$pattern = "/src=[\"']?([^\"']?.*(png|jpg|gif))[\"']?/i";
preg_match_all($pattern, $data, $images);

// text from link
$pattern = "/(<a.*>)(\w.*)(<.*>)/ismU";
preg_match_all($pattern, $data, $text);

// link
$pattern = "/(href=[\"'])(.*?)([\"'])/i";
preg_match_all($pattern, $data, $link);

/* 
// test if you like
print "<pre>";
print_r($images);
print_r($text);
print_r($link);
print "</pre>";
*/

?>

<html>
<head>
<style> 
body { margin:0; } 
.textblock { position:absolute; top:600px; left:0px; }
span { font:5.0em/1.0em Arial, Helvetica, sans-serif; line-height:normal; 
background:url(trans.png); color:#fff; font-weight:bold; padding:5px } 
a { text-decoration:none; color:#900 }
</style>
</head>
<body>
<img src="<?php print $images[1][0] ?>" height="100%">
<div class="textblock"><span><a href="<?php print "http://www.bbc.co.uk".$link[2][0] ?>"><?php print $text[2][0] ?></a></span><br>
</div>
</body>
</html>

And the example, which presents the same information in a new way…

Advanced scraping demo with “regex” parsing. Retrieves current weather in any city and colors the background accordingly. The math below for normalization could use some work.

<?php

/*	Advanced scraping demo with "regex" parsing. Retrieves current 
 * 	weather in any city and colors the background accordingly. 
 *	The math below for normalization could use some work.
 *	Owen Mundy Copyright 2011 GNU/GPL */

?>

<html>
<head>
<style> 
body { margin:20px; font:1.0em/1.4em Arial, Helvetica, sans-serif; } 
.text { font:10.0em/1.0em Arial, Helvetica, sans-serif; color:#000; font-weight:bold; } 
.navlist { list-style:none; margin:0; position:absolute; top:20px; left:200px }
.navlist li { float:left; margin-right:10px; }
</style>
</head>

<body onLoad="document.f.q.focus();">

<form method="GET" action="<?php print $_SERVER['PHP_SELF']; ?>" name="f">

	<input type="text" name="q" value="<?php print isset($_GET['q']) ? htmlspecialchars($_GET['q']) : '' ?>" />
	<input type="submit" />

</form>

<ul class="navlist">
	<li><a href="?q=anchorage+alaska">anchorage</a></li>
	<li><a href="?q=toronto+canada">toronto</a></li>
	<li><a href="?q=new+york+ny">nyc</a></li>
	<li><a href="?q=london+uk">london</a></li>
	<li><a href="?q=houston+texas">houston</a></li>
	<li><a href="?q=linz+austria">linz</a></li>
	<li><a href="?q=rome+italy">rome</a></li>
	<li><a href="?q=cairo+egypt">cairo</a></li>
	<li><a href="?q=new+delhi+india">new delhi</a></li>
	<li><a href="?q=mars">mars</a></li>
</ul>

<?php

// make sure the form has been sent
if (isset($_GET['q']))
{
	// get contents of url in an array
	if ($str = file_get_contents('http://www.google.com/search?q=weather+in+'
						. str_replace(" ","+",$_GET['q'])))
	{
		
		// use regular expressions to extract only what we need...
		
		// 1, 2, or 3 digits followed by any version of the degree symbol 
		$pattern = "/[0-9]{1,3}[º°]C/";
		// match the pattern with a C or with an F
		if (preg_match_all($pattern, $str, $data) > 0)
		{
			$scale = "C";
		}
		else
		{
			$pattern = "/[0-9]{1,3}[º°]F/";
			if (preg_match_all($pattern, $str, $data) > 0)
			{
				$scale = "F";
			}
		}
		
		// remove html
		$temp_str = strip_tags($data[0][0]);
		// remove everything except numbers and points
		$temp = preg_replace("/[^0-9.]/", "", $temp_str);
		
		if ($temp)
		{
			
			// what is the scale?
			if ($scale == "C"){
				// convert ºC to ºF
				$tempc = $temp;
				$tempf = ($temp*1.8)+32;
			}
			else if ($scale == "F")
			{
				// convert ºF to ºC
				$tempc = ($temp-32)/1.8;
				$tempf = $temp;
			}
			// normalize the number to an index into the color scale (0-8)
			$color = (int) max(0, min(8, round($tempf/140,1)*10));
			// cool -> warm
			// scale -20 to: 120
			$color_scale = array(
					'0,  0,255',
					'0,128,255',
					'0,255,255',
					'0,255,128',
					'0,255,0',
					'128,255,0',
					'255,255,0',
					'255,128,0',
					'255,  0,0'
					);	
		
?>

<style> body { background:rgb(<?php print $color_scale[$color] ?>) }</style>
<div class="text"><?php print round($tempc,1) ."&deg;C " ?></div>
<?php print round($tempf,1) ?>&deg;F

<?php
		
		}
		else 
		{
			print "city not found";	
		}
	}
}
?>

</body>
</html>




For an XPath tutorial, check this page.
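For comparison with the regex demos above, here is a sketch of the same top-story extraction using PHP’s built-in DOM and XPath classes (the class name in the query matches the BBC markup used earlier and may since have changed).

<?php

/*	Sketch: the "foreach"/"regex" example rewritten with DOM + XPath */

$url = "http://www.bbc.co.uk/news/";

$doc = new DOMDocument();
@$doc->loadHTMLFile($url);		// @ silences warnings about messy HTML

$xpath = new DOMXPath($doc);

// find the link inside the top-story heading
$nodes = $xpath->query('//h2[contains(@class, "top-story-header")]/a');

foreach ($nodes as $node)
{
	print $node->nodeValue . "\n";			// link text
	print $node->getAttribute('href') . "\n";	// link target
}

?>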

For the next part of the workshop we used Give Me My Data to export our information from Facebook in order to revisualize it with NodeBox 1.0, a Python IDE similar to Processing.org. Here’s an example:

Update: some user images from the workshop. Thanks to all who joined!

Mutual friends (using Give Me My Data and Graphviz) by Rob Canning

identi.ca network output (starting from my username (claude) with depth 5, rendered to svg with ‘sfdp’ from graphviz) by Claude Heiland-Allen