Internet service just got creepy: How to set up a wireless router with Comcast internet service

I just moved back to Florida after a one-year research project in Berlin and have subscribed to Comcast broadband service. The whole experience left a bad taste in my mouth, though not because the tech showed up two hours after the installation appointment window. Nor was it because he held loud personal conversations on his cell phone while he was setting up the service. No, the icky feeling is more corporate and selfish, and intrudes much further into my private space than “Joe the cable guy” ever could.

Comcast made me install software on my computer in order to use their broadband.

Upon his arrival, “Joe” announced he would need access to my computer to set up broadband service. Understanding that most of the people Joe deals with might not be IT whizzes, and might genuinely be unable to connect their machines without his help, I decided to let him use it rather than attempt to prove I was not a member of the usual group. After half an hour of complaining about previous customers to his friend on his cell phone, waiting for another Comcast person to flip a switch allowing him to do his job, and multiple trips to his truck, he showed me that the internet was indeed accessible on my computer.

At this point the laptop was directly connected to the cable modem via an ethernet cable. He announced I was to follow the steps on the screen, and he was out the door. The web page he had left up required me to agree to some terms, create a username, and then… install software? Really? I tried to access the net without the final step, but nothing doing. Unless I installed this software I was stuck. So I did it, still not believing that a company had really initiated this final invasion onto every customer’s computer. After it was done I had new bookmarks everywhere, for Comcast email, security, and some branding nonsense called “XFINITY” (I thought “X” went out with the ’90s and the X Games?)

So I thought, “OK, Comcast, you got me, hit me with your best marketing slime. Whatever, I can delete the bookmarks you installed in my browser, just let me access the service I paid for, wirelessly, on whichever device I want.”

But this is where the relationship got really creepy. Apparently when I installed the Comcast software (spyware?) on my machine, it made note of my MAC address, a unique identifier assigned to each network interface, so that only my machine (or another machine with that MAC address) would be allowed to connect to the internet. This means when I attached a wireless router to the cable modem I could connect to the wifi, but there was no internet.

So it turns out that Comcast is not only forcing their adware on customers, it’s also making it difficult (though not impossible) for them to use more than one device. Presumably Comcast is doing this to prevent neighbors from sharing service, but the end result is that you can’t share the service among your own devices, or with roommates or spouses for that matter.

An example (albeit a geeky one): between my wife and me we have two laptops, two smartphones, and a desktop computer that all might be talking to each other or accessing the net. Comcast’s so-called internet service didn’t allow for any such geekery because it only permits one device, with the correct MAC address, to connect.

So, here’s what I did, on my Mac, with some help from my sister’s boyfriend, Tom, and a lot from Google, to get my Linksys wireless router to work with Comcast internet.

  1. Confirm you can access the internet with your machine connected directly to the Comcast cable modem.
  2. Open Terminal and type (without the quotes): “ifconfig en0 | grep ether”. This prints your computer’s MAC address; note it down.
  3. Now disconnect your computer from the modem and connect the modem ethernet cable to your wireless router. Make sure both are plugged in.
  4. Connect to your wireless router via AirPort on your machine.
  5. Go to the following link: http://192.168.1.1
  6. Under Setup, choose DHCP as the Internet Connection Type. Save Settings.
  7. Under Setup : MAC Address Clone, enter the alphanumeric characters returned from Terminal. Save Settings.
  8. Configure your wireless router like you normally would and you are up and running.
  9. Snicker at Comcast
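Incidentally, step 2 can also be done without Terminal. The sketch below uses Python’s standard library to read the same 48-bit hardware address that `ifconfig en0 | grep ether` reports (note: `uuid.getnode()` may fall back to a random value on machines where the MAC can’t be read, so treat the output as a convenience, not gospel):

```python
import uuid

def mac_address():
    """Format the machine's 48-bit MAC address as colon-separated hex,
    the same value `ifconfig en0 | grep ether` reports."""
    node = uuid.getnode()  # 48-bit integer
    return ":".join(f"{(node >> shift) & 0xff:02x}" for shift in range(40, -8, -8))

print(mac_address())  # e.g. 00:1b:63:84:45:e6
```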

Give Me My Data upgrade: New API, authorization, and data formats

No one would be surprised to learn that almost all user-generated content websites use our personal data to sell advertisements. In fact, 97% of Google’s revenue comes from advertising.[1] That’s why it’s important these sites provide as much access as possible to the real owners of the data: us. After all, we put it there and allow them to use it in exchange for the use of their software. Seems like a fair trade if you ask me.

A year and a half ago Facebook didn’t provide any access. That’s why I created Give Me My Data, to help users reclaim and reuse the personal data they put on Facebook.

By giving more agency to users of online systems, Give Me My Data may have already impacted the nature of online application development. In November 2010, almost a year after I launched Give Me My Data, Facebook created their own service for users to export their profile from Facebook as a series of HTML pages. Unlike Give Me My Data, the Facebook service doesn’t allow you to select which data you want or to choose custom formats to export. It also doesn’t give you options for visualization like the custom network graphs that Give Me My Data offers.

I believe their motivation originates in part with my application, likely due to the popularity of Give Me My Data, and it points to the potential usefulness of similar apps. If the coming years reveal many other online systems giving users control over their data, I will see this as a positive effect, one where the content we create, as well as the means to share and manage it, is democratized.

Meanwhile, the above also keeps me hard at work developing the Give Me My Data project. This week I rewrote the program to use Facebook’s new OAuth authorization, which also required rewriting all of the code that fetches the data. Previously it used the REST API, which is being deprecated (sometime?) in the future. I also added new data types, fixed the CSV format (which had the rows and columns mixed up), and added the option to export in the JSON data format.
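The CSV fix amounts to transposing a table. Give Me My Data isn’t written in Python, but as a sketch of the idea, swapping rows and columns is essentially a one-liner with `zip`:

```python
import csv
import io

def transpose_csv(text):
    """Swap the rows and columns of a CSV string."""
    rows = list(csv.reader(io.StringIO(text)))
    out = io.StringIO()
    csv.writer(out).writerows(zip(*rows))  # columns become rows
    return out.getvalue()

# Fields mistakenly written down the columns...
print(transpose_csv("name,Alice,Bob\r\nage,30,25\r\n"))
# ...come back written across the rows:
# name,age
# Alice,30
# Bob,25
```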

Finally, in the data selector, I distinguished between standard and customized data types. When I say customized, I mean that I’ve written code that mashes together more than one data table and/or addresses a specific question. For example, right now users can select from two types of network graphs and corresponding formats. One describes the user’s relationship to their friends; the other describes the user’s relationship to their friends as well as all the friends’ relationships to each other, in various graph description languages. This is how I made the network graph image below. I’m also interested in hearing other suggestions for custom queries I might add. The project will be open source on GitHub soon, so even code contributions will be welcome.
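To give an idea of what the second, mutual-friends graph type involves: given each friend and which of my other friends they know, the graph’s edges are simply the pairs who know each other. A sketch with made-up names (not the app’s actual code):

```python
from itertools import combinations

# Hypothetical export: each friend mapped to the friends of mine they know.
friends_of = {
    "alice": {"bob", "carol"},
    "bob":   {"alice"},
    "carol": {"alice"},
}

# An edge exists wherever two of my friends know each other.
edges = sorted(tuple(sorted(pair)) for pair in combinations(friends_of, 2)
               if pair[1] in friends_of[pair[0]])
print(edges)  # [('alice', 'bob'), ('alice', 'carol')]
```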

Anyway, please try out the new version. You may have to delete the app from your allowed applications and then re-authorize it if you’ve used it before. As usual, you can provide feedback on the application page, and you can also contact me on Twitter via @givememydata.

[1] “Google Financial Tables for Quarter ending June 30, 2009.” Retrieved October 13, 2010.

Semantic network of hierarchical tags from Camp La Jolla Military Park

A semantic network visualization and detail using tags generated from the hierarchical tagging system I created with Thomas Evan Lecklider as part of my Camp La Jolla Military Park project.

For example, this item in the park, Defense Contractors recruit at UCSD Job Fair, is filed under: business » arms industry » spending » recruiting
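A path like that is straightforward to decompose into its levels; consecutive levels then become the parent/child edges of the semantic network. A minimal sketch:

```python
# Split a hierarchical tag path into its levels; parent/child pairs
# become the edges of the semantic network.
path = "business » arms industry » spending » recruiting"
levels = [tag.strip() for tag in path.split("»")]
edges = list(zip(levels, levels[1:]))
print(levels)    # ['business', 'arms industry', 'spending', 'recruiting']
print(edges[0])  # ('business', 'arms industry')
```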

Give Me My Data 2.0 advances to 2nd round of Rhizome Commission voting

I just learned the Give Me My Data 2.0 proposal I put together with Dr. Phil Steinberg has advanced to the second stage of 2011 Rhizome Commission voting. Give Me My Data 2.0 is an open source web application that exports your user data from any crowdsourcing website in multiple, reusable formats.

I created Give Me My Data while working on Automata, a project which visualizes powerful relationships and which also made it to the second round of the Rhizome Commissions two years ago.

Please VOTE for Give Me My Data if you are a Rhizome member!

Freedom for Our Files: Code and Slides

A two-day workshop with both technical hands-on and idea-driven components. Learn to scrape data and reuse public and private information by writing custom code and using the Facebook API. Additionally, we’ll discuss and conceptualize ways to literally reclaim our data, and imagine what is possible with our data once it is ours!

Here are the slides and some of the code samples from the Freedom for Our Files (FFOF) workshop I just did in Linz at Art Meets Radical Openness (LiWoLi 2011).

The first one is a basic scraping demo that uses “find-replace” parsing to change specific words (I’m including examples below the code)

<?php

/*	Basic scraping demo with "find-replace" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */

$url = "http://www.bbc.co.uk/news/";	// 0. url to start with

$contents = file_get_contents($url);	// 1. get contents of page in a string

					// 2. search and replace contents
$contents = str_replace(		// str_replace(search, replace, string)
			"News",					
			"<b style='background:yellow; color:#000; padding:2px'>LIES</b>",
			$contents);

print $contents;			// 3. print result

?>

Basic scraping demo with “foreach” parsing

<?php

/*	Basic scraping demo with "foreach" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */
 
$url = "http://www.bbc.co.uk/news/";	// 0. url to start with

$lines = file($url);			// 1. get contents of url in an array

$get_content = false;			// initialize state and output
$data = "";				// so neither is undefined below

foreach ($lines as $line_num => $line) 	// 2. loop through each line in page
{		
					// 3. if opening string is found
					// (strict compare: strpos returns false, not 0, on no match)
	if(strpos($line, '<h2 class="top-story-header ">') !== false) 	
	{
		$get_content = true;	// 4. we can start getting content
	}
	
	if($get_content == true)
	{
		$data .= $line . "\n";	// 5. then store content until closing string appears
	}

	if(strpos($line, "</h2>") !== false) 	// 6. if closing HTML element found
	{
		$get_content = false;	// 7. stop getting content
	}
}

print $data;				// 8. print result

?>

Basic scraping demo with “regex” parsing

<?php

/*	Basic scraping demo with "regex" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */
 
$url = "http://www.bbc.co.uk/news/";		// 0. url to start with

$contents = file_get_contents($url);		// 1. get contents of url in a string
											
						// 2. match title
preg_match('/<title>(.*)<\/title>/i', $contents, $title);

print $title[1];				// 3. print result

?>

Basic scraping demo with “foreach” and “regex” parsing

<?php

/*	Basic scraping demo with "foreach" and "regex" parsing
 *	Owen Mundy Copyright 2011 GNU/GPL */

// url to start
$url = "http://www.bbc.co.uk/news/";

// get contents of url in an array
$lines = file($url);

// initialize state and output so neither is undefined below
$get_content = false;
$data = "";

// look for the string
foreach ($lines as $line_num => $line) 
{	
	// find opening string
	// (strict compare: strpos returns false, not 0, on no match)
	if(strpos($line, '<h2 class="top-story-header ">') !== false) 
	{
		$get_content = true;
	}
	
	// if opening string is found 
	// then store content until closing string appears
	if($get_content == true) 
	{
		$data .= $line . "\n";
	}

	// closing string
	if(strpos($line, "</h2>") !== false) 
	{
		$get_content = false;
	}
}

// use regular expressions to extract only what we need...

// png, jpg, or gif inside a src="..." or src='...' 
$pattern = "/src=[\"']?([^\"']?.*(png|jpg|gif))[\"']?/i";
preg_match_all($pattern, $data, $images);

// text from link
$pattern = "/(<a.*>)(\w.*)(<.*>)/ismU";
preg_match_all($pattern, $data, $text);

// link
$pattern = "/(href=[\"'])(.*?)([\"'])/i";
preg_match_all($pattern, $data, $link);

/* 
// test if you like
print "<pre>";
print_r($images);
print_r($text);
print_r($link);
print "</pre>";
*/

?>

<html>
<head>
<style> 
body { margin:0; } 
.textblock { position:absolute; top:600px; left:0px; }
span { font:5.0em/1.0em Arial, Helvetica, sans-serif; line-height:normal; 
background:url(trans.png); color:#fff; font-weight:bold; padding:5px } 
a { text-decoration:none; color:#900 }
</style>
</head>
<body>
<img src="<?php print $images[1][0] ?>" height="100%">
<div class="textblock"><span><a href="<?php print "http://www.bbc.co.uk".$link[2][0] ?>"><?php print $text[2][0] ?></a></span><br>
</div>
</body>
</html>

And the example, which presents the same information in a new way…

Advanced scraping demo with “regex” parsing. Retrieves current weather in any city and colors the background accordingly. The math below for normalization could use some work.

<?php

/*	Advanced scraping demo with "regex" parsing. Retrieves current 
 * 	weather in any city and colors the background accordingly. 
 *	The math below for normalization could use some work.
 *	Owen Mundy Copyright 2011 GNU/GPL */

?>

<html>
<head>
<style> 
body { margin:20px; font:1.0em/1.4em Arial, Helvetica, sans-serif; } 
.text { font:10.0em/1.0em Arial, Helvetica, sans-serif; color:#000; font-weight:bold; } 
.navlist { list-style:none; margin:0; position:absolute; top:20px; left:200px }
.navlist li { float:left; margin-right:10px; }
</style>
</head>

<body onLoad="document.f.q.focus();">

<form method="GET" action="<?php print $_SERVER['PHP_SELF']; ?>" name="f">

	<input type="text" name="q" value="<?php print isset($_GET['q']) ? htmlspecialchars($_GET['q']) : '' ?>" />
	<input type="submit" />

</form>

<ul class="navlist">
	<li><a href="?q=anchorage+alaska">anchorage</a></li>
	<li><a href="?q=toronto+canada">toronto</a></li>
	<li><a href="?q=new+york+ny">nyc</a></li>
	<li><a href="?q=london+uk">london</a></li>
	<li><a href="?q=houston+texas">houston</a></li>
	<li><a href="?q=linz+austria">linz</a></li>
	<li><a href="?q=rome+italy">rome</a></li>
	<li><a href="?q=cairo+egypt">cairo</a></li>
	<li><a href="?q=new+delhi+india">new delhi</a></li>
	<li><a href="?q=mars">mars</a></li>
</ul>

<?php

// make sure the form has been sent
if (isset($_GET['q']))
{
	// get contents of url in an array
	if ($str = file_get_contents('http://www.google.com/search?q=weather+in+'
						. str_replace(" ","+",$_GET['q'])))
	{
		
		// use regular expressions to extract only what we need...
		
		// 1, 2, or 3 digits followed by any version of the degree symbol 
		$pattern = "/[0-9]{1,3}[º°]C/";
		// match the pattern with a C or with an F
		if (preg_match_all($pattern, $str, $data) > 0)
		{
			$scale = "C";
		}
		else
		{
			$pattern = "/[0-9]{1,3}[º°]F/";
			if (preg_match_all($pattern, $str, $data) > 0)
			{
				$scale = "F";
			}
		}
		
		// remove html
		$temp_str = strip_tags($data[0][0]);
		// remove everything except numbers and decimal points
		// (preg_replace; the old ereg_replace is deprecated)
		$temp = preg_replace("/[^0-9.]/", "", $temp_str);
		
		if ($temp)
		{
			
			// what is the scale?
			if ($scale == "C"){
				// convert ºC to ºF
				$tempc = $temp;
				$tempf = ($temp*1.8)+32;
			}
			else if ($scale == "F")
			{
				// convert ºF to ºC
				$tempc = ($temp-32)/1.8;
				$tempf = $temp;
			}
			// normalize the temperature to an array index,
			// clamped so extreme values stay in range
			// cool -> warm, scale -20 to 120
			$color = max(0, min(8, (int) round(($tempf + 20) / 140 * 8)));
			$color_scale = array(
					'0,  0,255',
					'0,128,255',
					'0,255,255',
					'0,255,128',
					'0,255,0',
					'128,255,0',
					'255,255,0',
					'255,128,0',
					'255,  0,0'
					);	
		
?>

<style> body { background:rgb(<?php print $color_scale[$color] ?>) }</style>
<div class="text"><?php print round($tempc,1) ."&deg;C " ?></div>
<?php print round($tempf,1) ?>&deg;F

<?php
		
		}
		else 
		{
			print "city not found";	
		}
	}
}
?>

</body>
</html>




For an XPath tutorial, check this page.

For the next part of the workshop we used Give Me My Data to export our information from Facebook in order to revisualize it with Nodebox 1.0, a Python IDE similar to Processing.org. Here’s an example:

Update: Some user images from the workshop. Thanks all who joined!

Mutual friends (using Give Me My Data and Graphviz) by Rob Canning

identi.ca network output (starting from my username (claude) with depth 5, rendered to svg with ‘sfdp’ from graphviz) by Claude Heiland-Allen

Convert NTSC video to PAL with smooth motion

When converting NTSC digital video to PAL, the frame size needs to change from 720 x 480 (NTSC) to 720 x 576 (PAL). Depending on your project, the more important problem is the transition from 29.97 (NTSC) frames per second to 25 (PAL).

I found that Final Cut Pro and QuickTime both convert 29.97 to 25 frames per second by simply cutting the roughly five extra frames each second to make it fit. This results in a loss of temporal resolution, making motion in the footage jerk and skip because frames which created the illusion of motion are missing.

There are a few commercial applications that can convert NTSC to PAL with smooth motion, but I followed advice on this forum which suggested using Compressor for the standards conversion:

  1. Export an NTSC Quicktime movie from Final Cut Pro without compression
  2. In Compressor, select a DV PAL preset
  3. Turn on Frame controls and set resizing and retiming to “better” or “best.”
  4. Run Compressor. This took >3 hours for 12 minutes of uncompressed footage.

This should give you a 720 x 576 (PAL CCIR 601) file at 25 frames per second. Finally, in DVD Studio Pro, make sure you choose PAL before you import any footage, and leave all the regions selected, which is the default.
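Some back-of-the-envelope arithmetic shows why the naive conversion jerks: NTSC actually runs at 30000/1001 ≈ 29.97 fps, so nearly five frames per second have to be dropped (or, with proper retiming, blended) to reach PAL’s 25 fps:

```python
# Frames that must disappear when going from NTSC to PAL.
ntsc_fps = 30000 / 1001           # ~29.97 fps
pal_fps = 25
extra_per_second = ntsc_fps - pal_fps
print(round(extra_per_second, 2))         # 4.97

# Over the 12-minute clip mentioned above:
seconds = 12 * 60
print(round(seconds * extra_per_second))  # 3578 frames dropped or blended
```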

UNIX: List open files


Can’t eject a CD or unmount an external hard drive on your Mac because of this error: “The disc is in use and could not be ejected. Try quitting applications and try again.”?

This UNIX command reports a list of all open files and the applications that opened them. Open Terminal and replace the name of your volume below to test.

$ lsof | grep /Volumes/media/

Thanks

Addendum: Here’s an even more helpful command: eject the disk with UNIX when the GUI won’t allow it.

$ diskutil eject [Mount Point|Disk Identifier|Device Node]

Oh, and a final tip nestled in this addition: if the volume name contains a space (for example, you had two disks named “backup” mounted and OS X named the second one “backup 1”), you can reference the name (or any file or directory name with a space) with a backslash, which “escapes” the character. Typing the first few characters and then pressing the Tab key will complete it automatically.

$ diskutil eject /Volumes/backup\ 1/

Network graph grouping: A small art world

This “Mutual friends network graph,” created with Nodebox using data I exported with Give Me My Data, contains 540 “Facebook friends” and their connections to each other. When the graph renders, it attempts to position people who have many connections closer together. With this you can see groups unfold based on your own social networks. Since I have spent more time in academia than at specific jobs, my “clusters” are based mostly on my academic history.

You can also see that there are a lot of connections between my high school and where I did my undergraduate study, likely because they are located very close to each other, so friends from high school also chose the same university or town to live in. There are also many interconnections between Indiana University, where I did my undergrad, the University of California, San Diego, where I did graduate study, and Florida State University, where I teach now. This is probably due to the fact that my connections are all within a given field, in my case visual arts, and points to the often-expressed notion that “the art world is actually very small.”

Random Hacks of Kindness (RHoK) and Google Person Finder

Last weekend I took part in Random Hacks of Kindness, an international hackathon dedicated to creating useful systems that respond to critical global challenges. I met with other programmers at the Betahaus in Berlin and worked with Tim Schwartz and Mikkel Gravgaard on Google Person Finder, a searchable database of missing persons that helps people find loved ones during disasters. It was used during the 2010 Haiti and Chilean earthquakes and is developed by volunteers and employees of Google.

Photo by Flickr user rhokberlin

Photo by Flickr user nblr

Photo by Flickr user nblr

Give Me My Data and exporting mutual friends

On the one-year anniversary of the beginning of Give Me My Data, I’m very happy to announce that you can now export your friends and mutual friends from Facebook using two new formats. Both data formats are geared towards making graphs by describing objects and their relationships. Needless to say, this is the most often requested feature since the official beta launch in April 2010. See below for more information.

The DOT language

DOT is a plain text graph description language and can be rendered using a variety of layout applications like Graphviz or Tulip.

This example (saved as a plain text file with the .dot extension)

graph G
{
	a -- b -- c;
	b -- d;
}

Produces something like this
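Since DOT is just plain text, generating a file like the one above from an exported edge list takes only a few lines; here is a sketch in Python with hypothetical node names:

```python
# Hypothetical friend pairs, as might come from a Give Me My Data export.
edges = [("a", "b"), ("b", "c"), ("b", "d")]

# Emit the DOT graph line by line.
lines = ["graph G", "{"]
lines += [f"\t{x} -- {y};" for x, y in edges]
lines.append("}")
dot = "\n".join(lines)
print(dot)
```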

Python / Nodebox 1.0

The other file format is also for visualizing relationships. You can copy and paste the contents into a plain text file saved with a .py extension and open it in Nodebox, a Mac application that uses Python to create 2D visuals. Learn more about creating graphs in Nodebox.

Here’s an example file. My mutual friends exported from Facebook…
