Desert of My Real Life

I don’t think anyone would accuse me of being a Luddite. I began to learn to program in the late 1970s when I was in high school, majored in computer science, worked as a software developer and got a PhD in computer science. I love my tech toys and tools and think that overall, we are better off with the technology we have today than we were before it was available. But I am often a skeptic when it comes to educational technology.

I was reminded of my skepticism about a month ago when I came across this photo and caption. For those of you who won’t click through, I’ll describe it. It is a photo of a classroom smart board being used as a bulletin board, with large sheets of paper taped to it, completely covering the smart board itself. The poster of the photo asks a number of questions, including whether the teacher who uses the equipment in this manner should be reprimanded for educational malpractice. The comments on the photo imply that the fact that the teacher is using this equipment in this way is evidence that the teacher is resistant to using the equipment appropriately. I was happy to see that the poster of the photo also asked some questions about why a teacher might use the equipment in this way, such as not having had enough training. But I think the issue really is that the teacher has not had the right kind of training, and the probable reason for that is that the promoters of educational technology are almost always focused on the technology itself and not on the education that the technology can provide.

The fact that someone would consider reprimanding a teacher for using technology in this (admittedly inappropriate) way is part of the problem that I see in all corners of educational technology. When we engage in technology training for teachers, we almost always focus on how and not why. That is, we focus on how to use the technology and don’t engage in meaningful discussion of the pedagogical advantages of using the technology in the classroom. The impression, then, is that we want to wow our students with this new technology, to do something flashy because the flashiness will capture the attention of the students. I see several problems with this idea. First, if students are using similar technology in all of their classes, the newness of the technology wears off and the flashiness disappears. Second, we should be in the business of getting students to actually learn something, and if we don’t have proof that a particular technology (used appropriately) improves learning, perhaps we shouldn’t be investing in such high-priced items. In other words, I do not see technology as a panacea for our educational problems.

I’ll give an example of how this has played out in my own teaching. A few years ago, my University purchased a bunch of clickers. I went to several training sessions for the clickers, hoping to hear a pedagogical explanation for why the use of the clickers might improve student learning. I heard a lot about how to use the clickers (technical details) as well as the cool things I could do to survey my students to see where their misunderstandings are. But even this last point didn’t convince me that the technology was worth the cost or the effort to use it because I already have ways to survey my students to see where their misunderstandings are. In fact, I’ve been developing those kinds of techniques for years, without the use of technology. So what I wanted to know was how the technology would improve on those techniques so that my students learn better. And no one could provide me with those answers. This summer, however, I went to a technology institute for faculty in the University System of New Hampshire. One of our presenters told us about a learning framework which might help us think about technology use in the classroom. He cited several studies that sought to identify why individual tutoring of students is so effective at improving student learning. The results show that students learn best when they get immediate feedback about their learning (the more immediate the better), can engage in conversation about their learning (that is, when they have to try to explain what they learned to someone else) and have learning activities that are customized to their needs (so that they are not wasting their time going over material that they already understand). What technology can do, he argued, is help us provide individual tutoring learning experiences for large numbers of students cost-effectively.
Therefore, we can use clickers, not to provide the teacher with information about student learning but rather to provide the students themselves with information about their own learning. That is, clickers allow us to ask questions of the class, have all the students answer simultaneously and then, when we reveal the answer(s), each student can see how he fared compared to his classmates and compared to the correct answer(s). This immediate feedback provides an individual tutoring type of experience only if it is done with an eye toward making sure students understand what they are supposed to get out of the use of the clickers. But too often, clickers are used in the classroom simply because they are cool, and new, and innovative.

So back to the question of whether the teacher who used the smart board inappropriately should be reprimanded. If, instead of having students write on big pieces of paper which she taped onto the smart board, the teacher had the students type their items into a computer and then she had displayed them on the smart board in the “appropriate” manner, we would not be having this discussion. But in neither case have we asked what her pedagogical motivations were for the exercise that the students engaged in. That to me is the important question and the one that would determine whether she has committed “educational malpractice.” And before we spend tons of money on smart boards and iPads and clickers and and and…, I think we should focus on the learning improvements that might be gained from the use of such technology. In most cases, I don’t think we have a whole lot of evidence that it does improve learning. And I definitely don’t think we’re training teachers to use it in a way that takes advantage of the ways that it might improve learning.

{November 13, 2011}   Software Development and the User

I’ve been thinking about software usability a lot lately, mostly because I encounter so much software that isn’t particularly usable. There are two pieces of software that I use a lot right now which drive me crazy for their lack of usability. And yet, I still use them. Perhaps that’s why the usability doesn’t improve. Anyway, here are some thoughts.

The software development company that I worked for right out of college was The Geary Corporation, founded by Dave Geary in Pittsfield, Massachusetts. It doesn’t exist anymore because Dave died of MS and not too long after, his family sold it to Keane, Inc. But while it was around, Geary was an awesome company to work for and one of the things that distinguished it from other companies at the time (and apparently, companies now) was its focus on the user. We did a lot of development for Fortune 500 companies, which have a lot of middle management type people. Dave would not deal with those folks as we developed our software and this is a lesson I learned well. He would make the contract with the folks at the top of the decision chain and then he would go straight to the users. We might deal some with the users’ direct supervisor but all decisions about how the software needed to work were passed by the users on the front line, tested by them and approved by them. I learned this lesson so well that it is a central tenet of the software engineering textbook that I co-wrote.

I think about this a lot when I’m using Facebook. It’s a great tool for social networking but as time has gone on, I think the folks at Facebook have forgotten the user. The latest example of this is their recent upgrade of the Newsfeed so that it is no longer presented chronologically. Instead, Facebook decides what to show you. The Facebook site explains that this is for people who don’t visit Facebook very often and so Facebook tries to predict what will be most interesting so they don’t have to wade through a lot of minutiae. That’s fine but did Facebook test this out with folks who use Facebook every day or multiple times a day? Given the subsequent uproar, I would guess not. To their credit, Facebook recently announced that they’ll be rolling out another update to give users an option concerning how they want their Newsfeed to appear. I keep using Facebook because the advantages outweigh the disadvantages (so far) but I have installed a cool app that gives me more control over my experience with the site. The app is called Social Fixer (used to be Better Facebook) and although it doesn’t work perfectly, given that it’s created by one guy in his living room, it’s awesome.

The other piece of software that is giving me fits is the tool that we use at PSU to search for courses. It’s always been ugly and clunky and not easy to understand but we have such a shortage of IT folks to help fix these things that I’ve never officially complained about it. We recently decided to stop printing a paper list of our courses which forces everyone to use this search tool. And so someone recently decided to upgrade the tool. To do the same search now requires more clicks and more scrolling than before. That’s a sign to me that whoever did the upgrade didn’t talk to faculty about how they use it. I suspect that they also didn’t talk to students. What a horribly inefficient use of time–why would you spend time upgrading a tool so that the result is less usable? If someone had come to talk to me for ten minutes, I would have explained, for example, that searching for courses by department is not an “advanced” use of the software and so I don’t want to have to click an extra time to get to that option.

None of this probably seems like a huge deal. But when you think of the amount of time we spend developing software and then using that software, it seems crazy to me that we would not take a few minutes early on to get user input as to how the tool can be most efficient and effective.

{June 25, 2011}   Technology in Education

I just got back from a three day workshop on academic technology.  As a computer scientist, I was intrigued by the idea of this workshop but I was worried that it would be a disappointment because so many of these workshops focus on what I consider to be the wrong things.  I am so glad I attended the workshop because I learned a lot and was inspired by a lot of what I heard.

The reason I’m often disappointed by technology workshops and technology training for educators is because they are often led by people whose focus is on the technology and teaching the participants how to use that technology.  This is definitely an important task but it is one that I typically find tedious because I’m comfortable with technology and want to go faster than these workshops usually go.  And I want to have conversations about more than “how” to use the technology.  I want to talk about “why” we should use the technology.  We discussed this topic quite a bit (more than I ever have) at this technology workshop.

My big take-away from the workshop concerning “why” we should use technology came from the Day 2 keynote speaker, Michael Caulfield, who is an instructional designer at Keene State College.  He presented research that shows that average students become exemplary students if they can have conversation about the topic they are learning, can have instruction that is customized to them and what they are not understanding, and can receive immediate feedback about their learning.  Basically, if every student can have a full-time, one-on-one tutor, she can move from being an average student to being an exemplary student.  Sounds great, but who wants to pay for that (especially in this economic climate)?  So, Caulfield explained, we really need to figure out how to provide “tutoring at scale.”  That is, we need to figure out how to provide each student with conversation, customization and feedback in classrooms that have more than one student.  Caulfield then discussed various uses of instructional technology (which was called “rich media” at this workshop, a phrase that I’m still processing and deciding whether I like) and how to leverage technology to provide “tutoring at scale.”  Caulfield’s talk gave me a great perspective through which to view all of the activities we engaged in during the workshop.

My one critique of the workshop (and it is a small one) is that we didn’t sufficiently separate faculty development of “rich media” artifacts for use in providing “tutoring at scale” from faculty development of assignments that require students to create their own “rich media” artifacts.  It feels like the issues are related to each other but are also quite separate, with different things for the faculty member to consider.

I would strongly encourage my PSU colleagues to apply to and attend next year’s Academic Technology Institute.  It is well worth the time!

{June 15, 2011}   Tumblr Review–Part 2

It has taken me more than a month and a half to write the second part of this review.  I think it’s because I said in my last post that I would write about THIS topic in my next post.  Since that promise (or threat–take your pick) seems to have stymied me for a while, you can bet that I will never do that again.

I’ve been looking for a long time for a tool that would make it easy for me to implement a web site that looks the way I want it to and organizes information in the way I want it to.  When I first came across Tumblr, I thought I had found a tool that was pretty close to what I wanted.  As I read what the site promises, I realized that it wasn’t exactly what I wanted.  And then as I started to use the site, I realized that the developers of Tumblr hadn’t delivered on what they said Tumblr was going to be and so the tool is even further away from what I’m looking for than I realized.  The first part of my review of the tool focused on the things they promised but didn’t deliver.  I should point out that Tumblr no longer offers the options that I complained about in the first part of my review.  And despite my extensive contact with the technical folks at the company, no one has contacted me about how they’ve decided to resolve these issues. Perhaps it would be difficult to contact a customer (even a non-paying one) to tell them that their complaints prompted you to remove options rather than fix them. In any case, I think my dissatisfaction with Tumblr arises from my overall dissatisfaction with Web 2.0 in general and the values embraced by the people who develop tools for this environment.  So in this second part of my review, I’m going to focus on the main difficulty I have with Tumblr.  I should point out, however, that I am critiquing Tumblr for not doing something they have never promised to do.  I just wish the tool worked differently.

I am one of the few people my age who actually grew up with computer technology.  I started to develop computer software in 1978 when I was a sophomore in high school.  Although the Internet existed then, the World Wide Web did not (trivia: the birth year of the World Wide Web is debated depending on which event you use to mark its birth but it was sometime between 1990 and 1992).  Developing new tools and content for the World Wide Web was somewhat challenging and required a deep knowledge of how it all worked as well as significant programming skills. In other words, I have been producing content since the days of fairly difficult content production.  In those days, the line between content production and content consumption (viewing of that content) was pretty clear.

Gradually, however, tools were developed to allow the creation of content by more and more people. Together, these tools (things like blogging software, photo sharing sites, wikis and so on) make up Web 2.0.  I personally believe that the addition of these new, less technical content producers is a positive thing, leading to more diversity of content on the Web.  But when all of these new, easier-to-use tools entered the marketplace, I recognized that the underlying values of the tools were changing.  I’m only now beginning to fully understand the implications of these changing values.

One of the new underlying values involves a changing understanding of the word production.  I have always thought of production as the creation of new content.  Increasingly, I have come to understand that in Web 2.0 content consumption is in itself a kind of production.  In fact, this is the primary underlying value of Tumblr.  As a user browses the Web, she will inevitably find content that she finds interesting and wants to share with her online friends.  Tumblr makes sharing incredibly easy.  In fact, my unscientific review of Tumblr sites suggests that the vast majority of them are sites where the owner reposts content that she has found elsewhere on the Web.  In other words, the Tumblr owner is producing a new site that is idiosyncratically hers.  Her unique Web content consumption results in the production of a mashup, a site made of pieces of other sites.  For example, this Tumblr reposts items from around the Web that the owner finds “the most entertaining.”  None of the individual items is created by the owner of the Tumblr.  Instead, the owner produces the unique combination of these individual items.  This understanding of production by combining sites is very different than what I had been looking for when I found Tumblr.  Because I wanted to combine my various sites of production (on which I produce the individual items) into a single site, I was looking for something that would automatically grab content from those various sites of production.  Because Tumblr is designed for a human to make qualitative decisions about which content to include (from sites owned by a variety of people), the automatic grabbing of content is not as critical to Tumblr’s designers as it is to me.  As an aside, I am really interested in how this idea of consumption as production is affecting my students and their understanding of things like research and citations and intellectual property and originality.  
It’s difficult to know whether changing attitudes about these issues are driving changes in technology or vice versa.  In any case, this difference in understanding of the word production is the main reason I am dissatisfied with Tumblr.  What would I be satisfied with?

I would like a tool that automatically consolidates all of my other production sites while also allowing me to easily share Web content produced by others that I find interesting.  And I would like to be able to fully customize the layout of the site into what I will call “channels.”  That is, I’d like a “channel” that shows the content from this blog, another “channel” that shows my Flickr feed and so on, and I’d like to be able to arrange the “channels” on the page in a variety of ways.  And finally, I’d like the tool to allow me to customize how items appear in the various channels.  Another of Web 2.0’s underlying values is the privileging of recency.  That is, the most recent items on a site are the most important and, therefore, appear first.  I’ve written about my concerns about this value before.  Some sites, such as Twitter, take this focus on recency to extremes by deleting any tweets that are more than a few weeks old, which, of course, makes it really difficult to go back at a later time to find tweets that you found interesting in the past.  Therefore, I would like a site that allows me to override the default order of items and to provide my characterization of what is most important.  This last requirement leads me into an entirely new discussion about information organization that I think is an unsolved research problem for the technical world to tackle.  But I want my next blog entry to take me less than a month and a half to write so I won’t promise that that discussion will appear in my next entry.
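As a sketch of what such a tool might look like: assuming each production site exposes an RSS feed, a small script could pull items into named “channels” and let the owner override the default most-recent-first ordering with her own ranking. Everything in this example is hypothetical (the feed contents, channel names and pinning scheme are made up for illustration), and real feeds would be fetched over the network rather than inlined.

```python
import xml.etree.ElementTree as ET

# Inline sample feeds so the sketch runs standalone; in real use each of
# these would be fetched (e.g. with urllib) from a production site's feed
# URL. All feed contents and channel names here are made up.
FEEDS = {
    "Blog": """<rss><channel>
        <item><title>Tumblr Review</title><pubDate>2011-06-15</pubDate></item>
        <item><title>Moving to Apple</title><pubDate>2011-03-24</pubDate></item>
    </channel></rss>""",
    "Flickr": """<rss><channel>
        <item><title>Campus in spring</title><pubDate>2011-05-01</pubDate></item>
    </channel></rss>""",
}

# The owner's own notion of importance: titles listed here are pinned to
# the top of their channel, overriding the default recency ordering.
PINNED = {"Blog": ["Moving to Apple"]}

def build_channels(feeds, pinned):
    """Parse each feed into a named channel, newest items first, then move
    any pinned items to the front of their channel."""
    channels = {}
    for name, xml_text in feeds.items():
        items = [{"title": item.findtext("title"),
                  "date": item.findtext("pubDate")}
                 for item in ET.fromstring(xml_text).iter("item")]
        items.sort(key=lambda i: i["date"], reverse=True)  # the Web 2.0 default
        pins = pinned.get(name, [])
        # Stable sort: pinned titles take their pin rank; the rest keep order.
        items.sort(key=lambda i: pins.index(i["title"])
                   if i["title"] in pins else len(pins))
        channels[name] = items
    return channels

for name, items in build_channels(FEEDS, PINNED).items():
    print(name, "->", [i["title"] for i in items])
```

The point of the two-pass sort is exactly the override described above: recency remains the default, but the owner’s characterization of what is most important wins when the two conflict.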

{March 24, 2011}   Moving to Apple

I got a new “toy” today.  It’s an 11.6 inch MacBook Air.  I have never owned an Apple computer although in the last few years, I’ve become a fan (mostly) of Apple’s iPod products.  A few months ago, my friend Julie showed me her new MacBook Air, which she had gotten for Christmas.  I am not usually someone who gets particularly excited about new technology.  I’ve seen (and purchased) too many “solutions” to think that any one tool is going to change anything about my life.  I did, however, get really excited about the MacBook Air.  I never thought that it was going to “change my life.”  But I thought it was pretty cool and could see that it provides a level of convenience that I haven’t seen in other products yet.  I was particularly excited about its use of flash technology for storage.  There are no moving parts in the hard drive of the MacBook Air.  Instead, it has a large flash drive, similar to the thumb drives that have become so ubiquitous, as its hard drive.  The lack of moving parts means that the computer boots almost instantaneously.  It also means that the hard drive doesn’t generate much heat, reducing the need for large cooling fans.  All of this leads to the thing that excited me most about this new computer.  It is SMALL!

There are smaller computers available.  My iPod Touch, for example, is a much smaller computer than this new MacBook Air.  But the iPod Touch does not include a full-sized keyboard.  Instead, it uses an on-screen keyboard which I find somewhat cumbersome to use.  I would never try to write a blog entry on my iPod Touch, for example.  It would be much too tedious.  I could get a Bluetooth keyboard but then it seems stupid to carry the iPod Touch AND its Bluetooth keyboard around with me.  The MacBook Air, on the other hand, is an actual laptop with a full-sized keyboard and an 11.6 inch screen.  So it feels much more like a computer.  But because of the lack of a regular hard drive, it is much smaller than an ordinary laptop.  The main thing that excited me about this computer is its weight–it weighs less than 2.5 pounds.  What does that mean?  Go find a 5 subject spiral bound PAPER notebook.  That is about what this laptop weighs.  And its dimensions are smaller than that.  It is significantly less than an inch thick.  And its height and width are smaller than an 8.5×11 inch piece of paper.  In other words, this is a computer that I can see carrying with me and using in a lot of situations where I have used paper up to this point.  And that excites me.

Although this is probably not a laptop that can completely replace every computer you use (mostly because the flash drive on the 11.6 inch version is only 128GB), there are some other nice features that Apple provides that will make it extremely useful.  The main tool to help with the small flash drive size is a product called MobileMe.  This product also solves the problem of having multiple devices and wanting access to the same set of files, an issue that anyone who has both a personal computer and a work computer has probably encountered.  MobileMe is Apple’s “cloud” solution which provides space on the Internet for you to store your files and folders.  It also provides a syncing function so that when you change something on one computer, it automatically updates your space in the cloud so that your other devices have access to the changes.  I just signed up for a 60 day free trial, after which it will cost me $99 for a year’s worth of access to 20GB of space in the cloud.  I’m still loading my space with files from my PC so I can’t review how it works yet.  I will say that Apple was having some major technical problems with new users and MobileMe just when I was signing up for the service.  Although I wasn’t happy with those glitches, everyone at Apple’s customer service was great and didn’t make me jump through stupid hoops when I made it clear that I had already tried a whole bunch of stuff to fix the problem.  Within about 3 hours, they had the problem resolved.  Although I think the idea for MobileMe is brilliant, I’ll reserve judgment on this particular implementation until I’ve had time to use it.

So, I’m a happy geek with a new toy!  Now I just have to figure out how to use this Multi-Touch trackpad with no right or left mouse buttons.

{February 2, 2011}   iTunes Annoyance

I bought a 3rd generation iPod Touch a while ago and have been an enthusiastic supporter of Apple’s various music players ever since.  I had owned another brand of mp3 player previously but when I made the switch to the iPod, I gained the convenience of the iTunes store.  I think that is one of the main reasons that Apple has maintained their lead in this crowded market.  I spent a few months on a conversion project, ripping all of my CDs so that my iTunes library now contains all of my music.  I subscribe to several podcasts.  I have created a bunch of playlists.  I have purchased a bunch of music and applications from iTunes.  I love my iPod and use it all the time, for all kinds of things.

I just got a new laptop.  And here’s where I have encountered my first annoyance with the way that iTunes works.  What I would have liked to do is simple.  I wanted to install iTunes on my new laptop, plug my iPod into the new laptop and have my entire library downloaded from the iPod to the new laptop via the iTunes sync function.  That sounds simple, and it seems intuitive that that’s the way iTunes would work.  To my surprise, I discovered that this is NOT the way iTunes works.

I downloaded iTunes onto the new laptop without incident.  I authorized my new computer to access my iTunes library and discovered that each library can have as many as five computers authorized to use it at any given moment.  Not a problem since I only have two.  I made sure that my new, blank iTunes library would not overwrite the library on my iPod and started the sync process.  When it was complete, I noticed that hardly any of my music had been transferred and none of my playlists, applications or podcasts had been transferred.  When I looked more closely, I realized the only things that had been transferred were the songs that I had purchased from the iTunes store.  None of the music I had ripped from my CDs had been transferred to the new laptop.  Thinking I had done something wrong, I checked all of the options and settings available and tried syncing again.  No additional items were transferred.

I then searched for a solution and was shocked to discover that what I wanted to do is something that is not easy to do.  The best article I read on the topic is a bit arcane but the gist of it is that Apple has decided that the relationship between your iPod Touch and your computer is primarily a one-way relationship.  It’s easy to get media files from your computer to your iPod but much more difficult to get media files from your iPod to your computer.  The only exception to that “rule” is a media file that you purchased from the iTunes store.

One theory about why Apple has made this non-user-centered choice is that they are trying to appease their corporate partners concerning copyright issues.  That may be the reason but why then would they have made it easy to transfer your iTunes store purchases?  In any case, this should not be as difficult as it is proving to be.  One of the things I tried was to use the iTunes software to make a backup of my library on my external hard drive and then import that backup to my library on my new computer.  But when I tried this option within the software, it would only let me make a backup to a CD or DVD.  I could not choose where I wanted to store that backup–it had to be stored on a disk in my DVD drive.  This would require many, many CDs or DVDs and so I think people are unlikely to really choose this option for backing up their libraries.

My latest attempt (one that I am in the middle of) is to go outside of the iTunes software to use Windows to copy the iTunes folder from my old C: drive to my external drive and then from my external drive to my new computer’s C: drive.  I fear this might not work because of a number of issues that I’ve read about.  If it doesn’t, the articles that I’ve read suggest that I should purchase one of several pieces of software that have been written by third party vendors to help out people in my situation. 
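For what it’s worth, that folder copy can be scripted. Below is a minimal sketch; the source and destination paths are purely illustrative (the actual iTunes folder location varies by Windows version and by the settings under Edit > Preferences > Advanced in iTunes), so check your own machine before running anything like this.

```python
import os
import shutil

# Illustrative paths only -- yours will almost certainly differ.
OLD_LIBRARY = r"C:\Users\me\Music\iTunes"
EXTERNAL_BACKUP = r"E:\iTunes-backup"

def copy_library(src, dst):
    """Copy the entire iTunes folder -- the media files plus the library
    database and XML files that hold playlists, ratings and play counts --
    and return the number of files copied. dst must not already exist."""
    shutil.copytree(src, dst)
    return sum(len(files) for _, _, files in os.walk(dst))

# copy_library(OLD_LIBRARY, EXTERNAL_BACKUP) would perform the first leg
# of the migration; repeat from the external drive to the new machine's
# C: drive to finish the job.
```

The key point is to copy the whole folder, not just the media files, since the database files alongside them are what let iTunes on the new machine recognize the playlists and play history.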

I am very annoyed with Apple at the moment.  This task is a common, reasonable task to want to accomplish.  The choice that they have made here does little to thwart piracy but instead wastes the time of a lot of their honest customers.  Come on, Apple.  You can do better than this.

{December 27, 2010}   Popular Culture and TIA

I just finished watching the five episodes of the BBC miniseries The Last Enemy.  Ann had recommended it because it is about computers and privacy and also because Benedict Cumberbatch (of recent Sherlock Holmes fame) is the star.  I mostly liked the series but there were a couple of things that really bothered me about it.

The plot begins when Stephen Ezard (played by Cumberbatch) returns home to England after living in China for four years.  He’s coming home to attend the funeral of his brother Michael, an aid worker who was killed in a mine explosion in some Middle Eastern desert.  Ezard is a mathematical genius who went to China to be able to work without all the distractions of life in England.  He is a germaphobe (at least in the first episode–that particular personality trait disappears once the plot no longer needs it) who is horrified by the SARS-like infections that seem to be running rampant on the plane and throughout London.  After his brother’s funeral, Stephen goes to Michael’s apartment and discovers that Michael was married to a woman who was not at the funeral and who appears to be in hiding.  She’s a doctor who is taking care of a woman who is dying from some SARS-like infection–and that woman is in Michael’s apartment.  Despite his germaphobia, Stephen immediately has sex (in this germ-infected apartment) with his brother’s widow.

Meanwhile, Stephen’s ex-girlfriend is an MP who is trying to push through legislation that would allow the use of a program called Total Information Awareness (TIA).  TIA is already largely in place but the people of England are not happy about it.  So Ezard is recruited as a “famous” apolitical mathematician who will look at the program and sell it to the public.  What is TIA?  It’s a big database that collects all kinds of electronic information.  Every credit card purchase, building entry with an id card, video from street cameras, and so on is stored in this database.  The idea is that by sifting through this information, looking for certain patterns, English authorities will be able to find terrorists before they strike.  The interesting thing about this idea is that it isn’t fiction.   In 2002, the US government created the Information Awareness Office in an attempt to create a TIA system.  The project was defunded in 2003 because of the public outcry.  At the time, I was concerned about the project both as a citizen with rights that would potentially be threatened and as a computer scientist critical of the idea that we could actually find the patterns necessary to stop terrorism.

This is where the plot of The Last Enemy became problematic for me.  Michael’s widow, Yassim, who is now Stephen’s lover, disappears.  Stephen takes the job as spokesperson for TIA primarily so he’ll have access to a system that will allow him to track Yassim.  We see many scenes of him sitting for hours and hours wading through data with the help of the TIA computer system.  At one point, he tracks the car that Yassim had been riding in by looking for video footage taken by street surveillance cameras and finding the license plate of the car in the video.  This is completely unrealistic and one of the main reasons that, with our current technology, a TIA system will never work.  We don’t yet have the tools to wade through the massive amounts of irrelevant data to find only the data we’re interested in.  And when that data comes in the form of photos or video, we don’t really have quick, efficient electronic means of searching the visual data for useful information.  Since so much of the plot of The Last Enemy hinges on Stephen finding these “needles in a haystack” in a timely manner, I had a difficult time suspending my disbelief.  The problem is that it is very difficult to find relevant information in the midst of huge amounts of irrelevant information.  Making this kind of meaning is one of the open problems of current information technology research.

The second major problem that I had with the plot of this series has to do with Stephen, a brilliant mathematician and computer expert, not understanding that his electronic tracks within the system would be easy to follow.  He makes no attempt, not even minimal steps, to cover those tracks, so as soon as he logs off, his pursuers log on behind him and look at everything he looked at.  And many major plot points hinge on his pursuers knowing what he knows.  Yet he seems surprised that others have followed him.  This is completely unrealistic if he really is the brilliant computer expert he would need to be for the government to hire him in this capacity.

I won’t ruin the surprises of the rest of the plot.  But let’s just say that much of the premise seems realistic to me, as though we’re not too far off from having to confront some of these issues ourselves.  For that reason, I recommend the series, despite the problems I saw and despite the unbelievable melodrama that arises from Stephen’s relationship with his brother’s widow.  There is a particularly laughable scene between the two of them in which she teaches him to draw blood by letting him practice on her.  It’s supposed to be erotic, which is weird enough given the danger they’re in at that point, but the dialog is so bad that I laughed out loud.  Still, the series explores enough interesting questions that I kept watching, wanting to know how the ethical questions would be resolved.

{December 26, 2010}   More About Net Neutrality

This entry was inspired by Meg, who asked some great questions after I posted my last entry.  In that entry, I explained what the net neutrality debate is about and why consumers should care about the FCC’s recent ruling requiring that traditional ISPs not discriminate among the traffic they carry over their wires.  This is a good thing for consumers (IMHO).  Near the end of the post, I also suggested that the ruling didn’t go far enough because it didn’t apply the same rules to wireless providers.  I didn’t explain what I meant by that, which prompted Meg’s questions.  So here’s a further investigation of the FCC ruling as it applies to wireless providers.

An article from Wired summarizes the three rules that the FCC passed for wired ISPs: 1. they must be “transparent about how they handle network congestion”; 2. they cannot block any particular traffic on wired networks; and 3. they cannot “unreasonably” discriminate on those networks.  This last rule means that the speed of data transmission must be the same regardless of the source of that data.  So Time Warner (as an ISP) cannot make your connection to Netflix‘s online video service slower than the connection to Time Warner‘s own online video service (if they had one).

Despite these consumer protections, the ruling is being trashed because it does not apply these rules to wireless providers of Internet access.  What does that mean?  It means that if you access the Internet on your phone, your phone company can charge you different rates to access different sites.  If Facebook is particularly popular, for example, your phone company can charge you more to access it than it charges to access MySpace.  Or worse, if your phone company creates its own social networking site, it can charge you more to access all of its competitors’ sites than it charges to access its own.  Or even worse yet, it can prevent you from using its wireless network to access the competitors’ sites at all.  This is clearly not in the best interest of consumers.  It’s also not in the best interest of innovation, since most innovation does not come from the biggest companies, and small companies could get squeezed out if no one is able to access their sites.

Right now, these (non)rules concerning wireless providers apply mostly to cell phone companies who provide Internet access.  Most other access is wired access.  Even when we have wireless networks in our homes and places of work, we have wired access that comes into the building and then we have a local wireless network set up.  So the ISP isn’t providing the Internet access wirelessly.  And so they would be governed by the stricter rules imposed by the FCC ruling.  But that may not always be the case.  In the future, more and more ISPs may figure out ways to effectively and efficiently provide wireless access into our homes and businesses.  And if that happens, those new networks will be governed by the softer rules.  This seems short-sighted to me.  And it seems like it happens because the folks on the FCC are not tech people and so don’t really understand what is different and what is the same about different kinds of technology.  Let’s hope that changes.

The debate about net neutrality has been around for a while.  I taught my students about it back when I was still in the Computer Science Department, during the Bush administration.  Today, finally, we’ve gotten a ruling from the Federal Communications Commission about this “controversial” subject.  But to understand the FCC ruling, we first have to understand the debate.  And that means that we have to understand what the Internet actually is.

So, what is the debate?  It’s about your access to the Internet.  The Internet was founded as a decentralized network of computers.  That’s right.  The Internet is a network of computers.  Each of these computers provides some service.  So when you connect to the “Internet,” you are connecting to a bunch of computers.  And you ask those computers to provide you with some sort of service.  Like viewing a web page.  Or looking at your email.  Or listening to music.  Or watching a movie.  Each of these services involves sending your computer data in the form of a bunch of zeroes and ones that your computer then translates into something that you (as a human) recognize.  Some of these services involve a few zeroes and ones while others involve MANY zeroes and ones.  The Internet was founded on the idea that zeroes and ones are zeroes and ones.  That is, we should not make any distinction between THIS set of zeroes and ones and THAT set of zeroes and ones.  That’s the idea of net neutrality.
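The “zeroes and ones are zeroes and ones” point is easy to demonstrate.  Here is a minimal sketch (the content is made up) showing that a web page, an email, and the start of an image file are, once they’re headed for the wire, all the same kind of thing: a sequence of bytes, with nothing about the bytes themselves announcing which service they belong to:

```python
# Three different "services" reduced to the raw bytes that would
# actually travel over the network.
web_page = b"<html><body>Hello</body></html>"
email_msg = b"From: me@example.com\r\nSubject: Hi\r\n\r\nHello"
png_header = b"\x89PNG\r\n\x1a\n"  # the first 8 bytes of any PNG image

for payload in (web_page, email_msg, png_header):
    # To the network, each payload is just bytes; only the software at
    # the endpoints interprets them as a page, a message, or an image.
    print(type(payload).__name__, len(payload), payload[:10])
```

Treating all three identically in transit is net neutrality; charging differently depending on which one you’re sending is what the debate is about.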

How does this relate to you and your everyday, online life?  It means that when you use your Internet Service Provider (Time Warner Cable or Netzero or Verizon or whoever) to connect to Google (or Microsoft or LL Bean or YouTube or Hulu or whoever), the zeroes and ones are not discriminated.  All zeroes and ones are treated equally.  So, for example, Time Warner cannot make a deal with Microsoft to make Bing (Microsoft’s search engine) run faster than Google (Bing’s direct competitor).  And Time Warner cannot make a deal with Microsoft to charge you more to access Google than to access Bing.  AND Time Warner cannot make a deal with Microsoft to completely block your access to Google so that you MUST use Bing as a search engine.  THAT is net neutrality.

So the issue has been whether to consider the Internet to be more like a communication network or an entertainment provider.  If the Internet is about communication, then it should be regulated in the same ways that phone communication has been regulated.  Phone companies must carry all calls, and their rates can vary only by factors like distance.  In other words, they can charge you more to call California than to call the town next to you, but they can’t charge you more to call Business A than to call Business B based solely on the fact that Business A is a different business.  And they can’t block your call to any place; they must carry all calls.  On the other hand, if the Internet is about entertainment, then ISPs should be able to make deals the way your cable company makes deals.  For example, my cable company, Time Warner, recently failed to come to an agreement with an ABC affiliate out of Vermont.  As a result, I no longer get that channel in my cable lineup–I cannot access that channel no matter what I do (unless I change to a cable or satellite provider that gives me that access–but, of course, most cable companies have monopoly access in the towns where they provide service).  In addition, if I want access to certain channels, my cable company may charge me more.  I have access to The Sundance Channel but not to the Independent Film Channel because I pay at the level that includes Sundance but not at the level that includes IFC.

So the question has been, is the Internet a communication network (like phones) or an entertainment network (like cable TV)?  Another way to ask this question is: should Internet service provision be regulated to prevent differential access to certain sites?   Many Republicans have argued that deregulation, allowing companies to do whatever they want, promotes competition and is therefore good for consumers.  And so they have argued that we should allow Internet Service Providers to charge different amounts for different kinds of access and to actually block access to certain sites.  I generally believe that consumers are best served by rules that promote net neutrality.  So I have argued for a long time that the FCC should make rules that prevent situations such as what happened with my ABC affiliate and my cable TV provider.

So today, the FCC ruled in favor of net neutrality.  THIS is a good thing (IMHO) for consumers–and THAT is why you should care about this.  Some Republicans have called this ruling “regulatory hubris.”  Many on the other side of the debate have also decried the ruling because it doesn’t go far enough in its regulations.  The ruling explicitly singles out cell phone operating systems, such as Android, as the reason that the FCC was softening its rules for net neutrality on wireless networks.  This is definitely something that consumers need to pay attention to.

{December 10, 2010}   Zero Views

Recently, my favorite NPR show, On the Media, had a story about an interesting blog called Zero Views.  The blog celebrates “the best of the bottom of the barrel” by posting the funniest YouTube videos that no one (NO ONE–hence the name “Zero Views”) has watched.  I found several things about this story that are worth commenting on. 

First, this is the kind of meta-site on the Web that I love.  It’s a site that highlights content from another site.  But here’s the thing.  As soon as this site features a video that has zero views, it is HIGHLY likely that the video will no longer have zero views.  And in fact, if the Zero Views blog is at all popular (and my sense is that it is fairly popular), any video it features is likely to go viral and become incredibly popular with thousands of views.  That, to me, is a really interesting phenomenon.

The second thing that I find interesting about this story is an underlying issue about popularity.  This is something that I’ve been thinking about for a while.  What makes a blog, a site, a video “popular?”  The easy answer has to do with numbers of views.  But that somehow feels unsatisfying to me.  I’ve watched many videos and followed many links that were recommended to me, only to feel…dissatisfied with what I’ve seen.  This makes me think that popularity must have something to do with “likeability” or some related concept.  How would we measure “likeability”?  And surely the fact that someone “recommended” a particular site, blog, or video to me must have some relationship to its “likeability,” right?

There are sites such as Technorati that try to measure “popularity” by measuring the number of links that each site has to it.  That is, the more other sites link to your site, the higher you rank in Technorati’s popularity rankings.  There are many problems with this idea of “popularity,” the most obvious of which is that more tech-literate folks are more likely to link to other sites.  So if you are “popular” among less tech-literate folks, you are less likely to be linked to so you will be ranked as less “popular.”
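Link-count popularity of the kind Technorati used can be sketched in a few lines.  Given a map of which site links to which (the sites below are hypothetical), a site’s rank is simply the number of distinct sites linking to it:

```python
from collections import Counter

# Hypothetical link graph: each site maps to the sites it links out to.
links_out = {
    "alice.example": ["bigblog.example", "news.example"],
    "bob.example":   ["bigblog.example"],
    "carol.example": ["bigblog.example", "alice.example"],
    "news.example":  ["bigblog.example"],
}

# Popularity = number of distinct sites linking *in*.
inbound = Counter()
for source, targets in links_out.items():
    for target in set(targets):  # count each source at most once per target
        inbound[target] += 1

for site, count in inbound.most_common():
    print(f"{site}: {count} inbound link(s)")
```

Here bigblog.example tops the ranking with four inbound links, but the sketch also makes the bias visible: only sites that publish links at all get to “vote,” so a site beloved by less tech-literate readers who never link to anything scores zero no matter how well liked it is.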

I don’t actually know how to measure “popularity” of websites, blogs, videos and so on.  The proliferation of “top 100” or “top 10” shows on TV makes me think that “popularity” is a cultural phenomenon, something we are interested in as a culture.  But I’m curious about what various groups of people mean when they use the word “popular” when it comes to online content.  What do you think?  I’m also really interested in the kinds of activities and behaviors that can affect the “popularity” of online content.  What do you think about that?
