How much traffic do the Edmonton Journal and iNews880 get from Twitter?

Depending on who you talk to, Twitter is either killing news media or saving it. A recent analysis by Hitwise found that less than 0.2% of people who use Twitter wind up going to news and media sites (thanks to Karen for the link). Their analysis looks at Twitter as a whole though, and I’m not sure how well it accounts for local news sites. I believe very strongly that social media has the greatest impact at the local level (more on this in a future post). Given that, I have long wondered how Twitter has impacted local news media here in Edmonton. Last night, I finally did some analysis. I decided to explore how much traffic the Edmonton Journal and iNews880, Edmonton’s two top tweeting media outlets, received from Twitter last year.

@EdmontonJournal

First up, the Edmonton Journal. They’ve been tweeting news articles since at least January 2009, so I had lots of data to play with. They used Tweetburner to shorten links until September, when they switched to bit.ly. Using the APIs available from those services, I added up all the click stats for all the links posted by The Journal. Here’s what I found:

Lots of variation, as you can see. Some of that comes down to the use of two different services, and some of it reflects the growing number of Twitter users over the year. There are probably dozens of other factors too.

For the period January 30 through December 31:

  • A total of 153,968 clicks were recorded on 4737 links.
  • That’s an average of 33 clicks per link, and an average of 15 links per day.
  • According to the stats on the bit.ly links, 95.4% of clicks come from the Edmonton Journal’s hash*.
  • The link with the most clicks (700) was this one, on May 26. It no longer works – annoyingly, The Journal doesn’t keep old articles available – but it appears it was about Edmonton’s Poet Laureate Roland Pemberton.
  • The day with the most clicks, September 14, doesn’t appear to be special…just lots of clicks that day for some reason (any ideas?).
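For the curious, the arithmetic behind these averages is simple once the per-link click counts have been fetched. Here’s a minimal Python sketch of the roll-up; the click counts below are illustrative placeholders, not the real dataset pulled from the bit.ly and Tweetburner APIs:

```python
from datetime import date

# Illustrative per-link click counts; in practice, one entry per shortened
# link, fetched from the bit.ly and Tweetburner stats APIs.
clicks_per_link = [700, 120, 45, 33, 12, 8, 0, 5, 60, 27]

# Inclusive date range for the analysis period.
start = date(2009, 1, 30)
end = date(2009, 12, 31)
days = (end - start).days + 1

total_clicks = sum(clicks_per_link)
num_links = len(clicks_per_link)

avg_clicks_per_link = total_clicks / num_links
avg_links_per_day = num_links / days

print(total_clicks)                 # 1010
print(round(avg_clicks_per_link))   # 101
```

The same roll-up over the real data (153,968 clicks on 4737 links across 336 days) produces the averages above.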

@iNews880

Next up, iNews880, one of the first local media organizations to join Twitter. They used tinyurl.com until July, when Twitter switched its default shortener to bit.ly, so unfortunately I only have click data for the latter half of the year:

For the period July 14 through December 31:

  • A total of 90,500 clicks were recorded on 3811 links.
  • That’s an average of 24 clicks per link, and an average of 22 links per day.
  • According to the stats on the bit.ly links, 93.8% of clicks come from iNews880’s hash*.
  • The link with the most clicks (1933) was this one, on August 2 (that’s the huge spike in the graph above). The link goes to the report on the Big Valley Jamboree stage collapse, and it was popular because it included before and after photos.

Edmonton Journal vs. iNews880

I wanted to do a quick comparison, so I chose the period September 17 through December 31, because both sites used bit.ly for links during that time. Here’s what it looks like:

During that time:

  • The Edmonton Journal posted 2369 links (23 per day) and iNews880 posted 2261 links (22 per day).
  • A total of 79,519 clicks were recorded on Edmonton Journal links (an average of 751 per day or 34 per link).
  • A total of 53,815 clicks were recorded on iNews880 links (an average of 508 per day or 24 per link).

Thoughts

That’s a lot of clicks! Clearly Twitter and other social networking sites (where most shortlinks are posted) are having an impact. But how much? According to the latest report by the Newspaper Audience Databank (NADbank), weekly online readership at EdmontonJournal.com increased by 35% last year to 115,900 from 85,800 in 2008. That’s an increase of 30,100 weekly readers. According to the click stats above, The Journal received 3208 clicks per week in 2009 (153,968 clicks over roughly 48 weeks). So what does that mean?

Roughly 10.7% of the Edmonton Journal’s online readership increase in 2009 came as a result of posting links to Twitter.

And if I had to guess, I’d say my analysis probably underestimates things. Apparently the NADbank data is based on surveys, so I’m not sure how accurate it is, but it’s probably within acceptable margins of error. I’m also not sure what exactly a “reader” is – a page view, a visit, etc.

Caveats

I’ve tried to be as accurate as possible, but I can’t make any guarantees!

  • All the click stats are current as of last night.
  • I’m suggesting that all the clicks come via Twitter, when that’s probably not entirely true. Links get passed around, displayed on websites, etc. But the shortlinks do originate at Twitter.
  • It’s possible that The Journal or iNews880 posted a link to somewhere other than their own sites, but that’s uncommon. I did remove one link from the iNews880 dataset because it pointed to an Environment Canada site (it was obvious – the total click count was unusually high because others had linked there too). For the rest, I’m assuming the links point to the news sites.
  • I don’t know how reliable the stats from bit.ly and Tweetburner are. I suspect they are quite a bit different from what server logs or Google Analytics would show.
  • I would assume that both services tweaked the way stats are calculated throughout the year, so 15 clicks on a bit.ly link in May is probably different than 15 clicks on a bit.ly link in December.

* – When you shorten a link using bit.ly, you get a unique hash. If I shorten the same link, I get a different hash. The stats are recorded and made available individually and in aggregate, however.

January 2010 Headlines: Edmonton Journal vs. Edmonton Sun

I think it’s fair to say that Edmonton’s two major dailies have strong stereotypes attached to them. The Edmonton Journal, as the capital region’s newspaper of record, is generally considered reliable, encompassing, and important, with an emphasis on politics and current events. The Edmonton Sun, which has just under half of the Journal’s weekly circulation (according to data from 2008), is generally considered a bit more tabloid-like, with an emphasis on sports and special sections. But I’m not happy with stereotypes – I like data!

There is obviously much more to a newspaper than its headlines, but I figured that was a good starting point for comparison. Using data extracted from Twitter (which means it may be incomplete) I compared headlines from The Journal and The Sun for January 2010. I counted 662 headlines for The Journal (in blue) and 589 headlines for The Sun (in red).


The most frequently used words in The Journal’s headlines were: Edmonton, Alberta, new, fire, man, woman, Oilers, Calgary, gallery, and police.

The most frequently used words in The Sun’s headlines were: Haiti, Canada, city, man, Canadian, Edmonton, Alberta, Haitian, new, and quake.
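Counting word frequencies like this is straightforward. Here’s a rough Python sketch of the approach, using a few made-up headlines rather than the real dataset, and a small stop-word list to filter out the most common filler words:

```python
from collections import Counter
import re

# Made-up headlines for illustration; the real analysis used ~600 per paper.
headlines = [
    "Edmonton man charged after downtown fire",
    "Oilers fall to Calgary in overtime",
    "New gallery opens in downtown Edmonton",
    "Police investigate Edmonton fire",
]

STOP_WORDS = {"a", "an", "the", "in", "to", "after", "of", "and"}

words = []
for headline in headlines:
    for word in re.findall(r"[a-z']+", headline.lower()):
        if word not in STOP_WORDS:
            words.append(word)

top = Counter(words).most_common(3)
print(top[0])  # ('edmonton', 3)
```

Run over the full set of headlines, the same count produces the top-ten lists above.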

Here’s a quick comparison of the average length, average number of words, and average Automated Readability Index (ARI) for each headline:

I’m not sure that calculating the ARI for a headline is valid, but calculating it for the collection of headlines isn’t valid either (because they aren’t equivalent to sentences). I did look at the collection though – The Journal used 865 complex words, whereas The Sun used 552 (a complex word is three syllables or more, as determined using this online tool).
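For reference, the ARI is computed purely from character, word, and sentence counts. Here’s a sketch of the standard formula in Python, treating each headline as a single sentence – which is exactly the validity question raised above:

```python
import re

def ari(text, sentences=1):
    """Automated Readability Index:
    4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43,
    where characters counts letters and digits only."""
    words = re.findall(r"\w+", text)
    chars = sum(len(w) for w in words)
    return 4.71 * (chars / len(words)) + 0.5 * (len(words) / sentences) - 21.43

score = ari("Edmonton man charged after downtown fire")
print(round(score, 1))  # 9.0
```

Note how sensitive the result is: six fairly long words counted as one “sentence” already scores around a grade-9 reading level, which is why averaging ARI over headlines is dubious.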

I don’t know what the takeaway is here, but I thought it was interesting enough to share. I’ll probably revisit this again in the future, with additional news sources, and probably some sentiment analysis as well. If you have any suggestions, let me know in the comments!

CTV Edmonton launches Inner Tube blog

On Friday afternoon, just hours before the start of the Vancouver 2010 Winter Olympics, CTV Edmonton launched a new blog called Inner Tube. I’m not sure if the timing was just a coincidence or if CTV Edmonton purposefully wanted to “soft launch” the blog, but either way, this “online experiment of sorts” is something that’s worth paying attention to.

First, the key points:

  • Inner Tube is a group blog. Entries will be written by a variety of people at CTV Edmonton, including Carrie Doll and Josh Classen.
  • This is an Edmonton project, not something that came from Toronto.
  • Posts are edited for clarity, and comments are moderated.
  • From the about blurb: “You’ll read stories about the inner workings of the news process, how we develop our stories, or just casual observations about what makes north central Alberta so special.”

I called Stewart Shaw, web guru at CTV Edmonton, to learn more about the site that he has been working to launch for the last six months or so. My first question was why it took so long! Stewart diplomatically explained that many people just aren’t as comfortable with technology as the rest of us, and that convincing all of the necessary people took time (as it would in any typical corporate environment). He was pleased with how things progressed.

Stewart told me that CTV Edmonton sees this as an extension of what they’ve been doing for more than 50 years. The station has always felt that it was part of the community, and the blog is just a modern way of ensuring that remains true. And while the CTV Edmonton account on Twitter has been quite successful, and most stories on the news website offer the ability to leave comments, neither offers the same kind of connection that the blog can (though Carrie Doll, Josh Classen, and other personalities regularly interact with other Twitter users). Stewart said that the Save Local TV open house last year opened some eyes – it was the first time in a long time that CTV Edmonton had invited the public to the station, and they were overwhelmed by the response. The idea with Inner Tube is to open up a little, to provide a glimpse behind the curtain from the people that make CTV Edmonton tick.

Local media blogs are not new, of course. The Edmonton Journal, iNews880, and Edmonton Sun have had blogs on their websites for a long time, with varying levels of success. The difference is that CTV Edmonton has created a group blog that everyone will contribute to, rather than individual blogs for each employee or personality. The idea is that it’ll be a little easier to keep fresh, and easier to build a following around. I think the jury’s still out on which approach is more successful, but I like that CTV Edmonton is experimenting with something different.

Inner Tube is off to a good start, with roughly half a dozen posts already up on the blog. It’ll be interesting to see how it evolves – I hope it opens the door to even more online activity from the local media. Congrats to Stewart and everyone else at CTV Edmonton for launching Inner Tube, and good luck!

Recap: MediaCamp Edmonton Initial Meeting

Tonight we held an initial planning meeting at Credo Coffee for an event called MediaCamp. I have wanted to hold a local event to bring mainstream (or old) media together with social (or new) media for some time, and last week Karen Unland provided the necessary spark when she tweeted about a hacker event that took place recently in London, UK. A bunch of us very quickly settled on a hashtag – #yegmediacamp – and we got the ball rolling on Google Wave.

MediaCamp Edmonton Planning

The meeting tonight was appropriately informal, and gave everyone an opportunity to meet one another and share ideas. We went around the circle with introductions and initial thoughts, and then discussed what MediaCamp might look like. Karen probably has better notes than I do, but here are some of the things I wrote down:

  • Should it be a small event or a large one? The consensus seemed to be “go big”.
  • Would an event focus on business models? Technology? Something else?
  • BarCamp is pretty unstructured, TransitCamp had a bit more structure but used the same kind of model. What’s the right approach for MediaCamp? The consensus seemed to be that we should have some structure.
  • Lightning Thoughts was something that everyone thought was a good idea – quick, five-minute demos.
  • Multiple streams or not? We want to break down silos and encourage input from everyone.
  • Will they come? How do we remove barriers to entry? How can we ensure a good mix of mainstream media folks and social media folks?
  • As with most of these events, connections are perhaps the greatest outcome.
  • Potential dates: April 10, May 8
  • It was decided we’d loosely follow the ChangeCamp structure, striking subcommittees to focus on sponsorship/budget, volunteers, day-of, etc. The first step – create a Google Group to get everyone connected.

I agree with Bruce that labels seem to be a necessary evil, so I’ll use them here. The common thread seemed to be, “let’s work together”. What can old media learn from new media, and just as importantly, what can new media learn from old media?

I was quite impressed with the turnout, especially since it was just a planning meeting. Here’s who made it out tonight: Karen, Cam, Asia, Alain, Dave, Rachelle, Kelly, Jeff, Eugene, Brittney, Diane, Jas, Curtis, Reg, Bruce, Marty, Kerry, and myself. I know there were many more who wanted to come but couldn’t make it work!

Please follow along on Twitter, and join the Google Group. I think there’s a lot of excitement around MediaCamp, and I’m eager to see what comes of it!

Upcoming Speaking Engagements & Events

On Saturday I was invited to speak at the Annual Sustainable Campuses Conference, on the subject of Open Data. I gave an overview of open data, shared some examples of open data apps for sustainability, and described how open data arrived here in Edmonton. We also spent some time chatting about Twitter, both in general and how it is being used by the open data community. Thanks to the organizers for including me!

I’ve got a few additional speaking engagements coming up that I wanted to mention:

  • January 25-28: Social Media for Government (on ShareEdmonton)
    The conference started today and finishes tomorrow, with post-conference workshops taking place on Thursday. I’m leading the last workshop that afternoon, where I’ll be taking participants through a social media campaign from start to finish. We’ll look at examples of campaigns that have worked, as well as examples of what to avoid. There are some really smart people speaking at the event, such as Jas Darrah, Diane Begin, Troy Wason, Ken Chapman, Walter Schwabe, and many more.
  • February 5/6: What Happens Next? Future of Story (on ShareEdmonton)
    I’m really excited to be taking part in this conference, hosted by MacEwan’s School of Communications. I’m participating in a panel on “The Next New Journalism” along with Karen Unland and Colby Cosh. The panel will be moderated by Rey Rosales, Associate Dean at MacEwan’s Centre for Arts and Communications. I obviously have some strong ideas about the future of journalism and media, as I’m sure Karen and Colby do, so it should be interesting, and fun!
  • March 11: MacEwan Student Business Conference 2010 (on ShareEdmonton)
    This conference aims to connect students with business leaders and innovators. I’ll be taking part in a roundtable discussion on social media – what is it, why is it important, how can businesses use it, and how it relates to an overall communications strategy. Hopefully we’ll also have some great discussion about how students are using social media.
  • May 5/6: Technocon 2010 (on ShareEdmonton)
    I’m honored to be one of the keynotes for this conference, open to all City of Edmonton and University of Alberta IT employees. The conference focuses on three key themes: open, world class, and transformation. I’ll be talking about open data, open government, social media, transparency, and more, all related to the theme of open. Can’t wait!

Here are a few other upcoming events I’ll be at:

Hope to see you at a few of them!

Recent media links & thoughts

I read a lot about new media, journalism, publishing, news, etc. I always try to think about the things I read from both a global and a local perspective. Here are some thoughts on the things I’ve read recently.

From Jeff Jarvis:

I’m not so sure journalism is storytelling anymore.

Jeff points out that saying “journalism = storytelling” is limiting. Journalism is about more than the story, it’s a process. I agree completely. Data, algorithms, aggregators – all are aspects of journalism. They always have been, of course, but their importance/visibility has been heightened lately, thanks to new tools and technologies.

From paidContent.org:

Time Warner’s CNN is taking a stake in hyperlocal aggregator Outside.in—the latest example of a big media organization making a play in the hyperlocal space.

Smart move, just like MSNBC’s purchase of EveryBlock. And the news today that Google is in talks to buy Yelp. The dollars are starting to flow toward local/hyperlocal news companies. You know how the saying goes: follow the money.

From TechCrunch:

So what really scares me? It’s the rise of cheap, disposable content on a mass scale, force fed to us by the portals and search engines.

From ReadWriteWeb:

In my view both writers and readers of content will need to work harder to get quality content. Right now ‘quantity’ still rules on the Web, ‘quality’ is hard to find.

Lots of others have already discussed the “content farm” issue that made the rounds in the blogosphere last week. My view on it is pretty simple: readers need to become more active. There’s so much information so easily available that you can’t afford to passively consume the news. You have to seek out sources and recommendations. Certainly we’ll get better tools (aggregators, filters, search engines) but I think readers need to make more of an effort. See also: Content farms v. curating farmers.

From Clay Shirky:

…one of the things up for grabs in the current news environment is the nature of authority. In particular, I noted that people trust new classes of aggregators and filters, whether Google or Twitter or Wikipedia (in its ‘breaking news’ mode.)

I called this tendency algorithmic authority.

Fascinating. I think there’s incredible opportunity, both globally and locally, to take advantage of this. Who do you trust for your news? Is it the same people/organizations that you trusted five years ago?

From Unlikely Words:

Ken Auletta from the New Yorker wrote a book about Google, “Googled: The End of the World as We Know It” and before he published it, he cut the last chapter of 25 media maxims.

Now you can read them online. A few of my favorites:

  • Passion Wins
  • Adapt or Die
  • Digital is Different
  • Don’t Ignore the Human Factor

And finally, one of my favorite new tools: Times Skimmer. We need more innovation like that at the local level!

Reporting live in a world with Twitter

As you are undoubtedly aware, a gunman held eight people hostage at the WCB in downtown Edmonton last week. I happened to be on Breakfast Television that morning, so I was on the Citytv set as news was trickling in. I had the opportunity to tweet about the news live on the air:

Unconfirmed via @CitytvEdmonton: armed man holed up in the WCB building downtown. #yeg

It all happened very quickly and if the news wasn’t so terrible, I’d have said it was exciting. Certainly it was a good illustration of one aspect of the social media tools I was scheduled to talk about that morning.

A couple of hours later, I set up a live page on ShareEdmonton to cover the story (the feature is a work-in-progress, so it should be time-boxed but isn’t currently). That enabled anyone to quickly look at the stream of updates coming from Edmontonians related to the hostage situation. I used it throughout the day, and the feedback I received was mostly positive. I think what was most powerful about it was that you simultaneously got updates from the local media (in particular, @lyndasteele) and regular citizens, some discussing the event, others simply trying to find out what was going on. I’m sure many more people were just monitoring the #yeg hashtag in Twitter Search, TweetDeck, or some other app.

I think most found Twitter to be a useful resource that day, but not everyone was happy. Can you guess who complained about the Twitter coverage? Some members of the local media, of course. I heard from a number of journalists throughout the day that they were concerned about posting news on Twitter. Esther Enkin from CBC even wrote about it:

The task is complicated further by the sheer volume of communication. Facebook and Twitter were working overtime. At one point, there was a rumour that someone holed up in the building was updating the situation on Facebook.

The level of speculation and misinformation on Twitter was an object lesson on the need to verify and sift the facts.

Late in the day, someone from CBC tweeted that some hostages had contacted us. We weren’t reporting the fact that we had become involved for a bunch of reasons.

But here is a really important principle. We should not tweet what we wouldn’t put on the air.

I’m not going to deny that verifying the facts is important, but I will disagree that the level of “speculation and misinformation” on Twitter was higher than normal. I think it was the opposite actually – I think Twitter enabled citizens to get the facts faster. Faster than walking around talking to neighbours or coworkers, which is where speculation truly thrives, and certainly faster than waiting for the six o’clock news.

Esther takes care in her article to deny that they were withholding information for competitive reasons:

One reason we didn’t let on is because we didn’t want every other news organization jumping in. Not for competitive reasons, but because the chaos could be dangerous.

Really? Chaos would ensue from other media organizations knowing that CBC had talked to the hostage taker? I’m not so sure.

If there’s one thing I’ve learned about the media over the last year it’s that they are incredibly competitive. That was the primary concern when Twitter hit the scene in Edmonton back in February – “we can’t tweet that or our competitors will find out.” Maybe Esther is telling the truth, but I don’t believe it.

Her other reason for withholding the information was based on CBC’s Journalistic Standards and Practices, last amended in 2004 (before Twitter, you’ll note).

Of course, the primary danger of live reporting and detailed descriptions of what is going on outside in a situation like this is that the hostage taker can be listening, watching and logging on.

That makes sense at first, but think about it a little more. That statement implicitly suggests that reporters can collectively control the information the hostage taker is receiving. Really?

Trying to control the information is impossible. You have to assume the hostage taker is going to be looking for information. These days, that probably means he or she is carrying a device with Internet capabilities. You also have to assume that regular people are going to be posting information, people who never went to journalism school and who don’t work for a media organization. Some of those people are going to be merely observers, looking at the situation from the outside. Others will be part of the event.

All the signs point to more information, from more people, faster than ever before. Most of us walk around with phones or other Internet connected devices, and in a couple years you probably won’t be able to buy a device without Internet connectivity. I think that’s the reality, and that’s the world the media need to visualize themselves in.

Stop complaining about the misinformation on social networks, and start preempting it. Stop trying to control the flow of information, and start figuring out how to effectively contribute the facts.

Introducing ShareEdmonton

Today I’m excited to launch ShareEdmonton, a local aggregation platform for Edmonton and area. With it, I want to redefine local media and improve Edmonton by embracing the fact that communication is increasingly taking place online.

You can think of ShareEdmonton as an events calendar, at least right now. It certainly has that functionality, and I want it to become the de facto events calendar for Edmonton. I’ve taken the opposite approach of most online calendaring sites, such as Upcoming or Eventful. Instead of starting at the global level and working down, I’ve started at the local level. This is a simple, but important distinction.

I believe that place is more important than ever. That’s why place is at the heart of ShareEdmonton. Unlike other sites, each place exists once and only once in ShareEdmonton. If you want to find out what’s going on at the Shaw Conference Centre, you can be confident there’s only one in the system. In addition to individual places, ShareEdmonton currently supports neighbourhoods, such as Downtown. This is a powerful way to roll up data about a collection of places.

What kind of data? Events, obviously, but also tweets. For any event, place, or neighbourhood you can see recent related tweets written by people in the Edmonton area, in real-time. Or you can see all tweets written by local users. Over time, I’ll be adding other kinds of data alongside tweets, including blog posts, photos, and more.

ShareEdmonton is all about aggregating the immense amounts of data available online and helping you find the bits that are important, relevant or interesting to you, through place, topic, or some other filter. Here’s an example – George W. Bush is in town tonight, at the Shaw Conference Centre. Here’s the event page on ShareEdmonton (and here’s the page for the rally against him). On it, you find information about the event, the location, and recent related tweets – people talking about the event. Two more examples, using topic as a filter: weather and traffic. That’s pretty powerful, I think, and has the potential to become even more powerful over time.

What’s available today is just a small part of what I hope the site will become. Today is step one, and there’s a long way to go until the vision is realized. I have grand ambitions for ShareEdmonton!

Here are a few other quick points:

  • Though I’m not calling this a beta, it is a work-in-progress.
  • The entire site features clean, hackable URLs.
  • The site also features Microformats. If you’re running Firefox, install Operator and you’ll see your browser “light up” with events, locations, tagspaces, and more.
  • All tweets pages have RSS feeds, and most event listing pages have both RSS and iCal feeds.
  • ShareEdmonton is not, and will not be, open source. I am and will be embracing the concepts of open data, however, so stay tuned for more on that.
  • The engine is generic, so you could in theory turn on ShareCalgary or an aggregator for another city.
  • As I was quoted on Saturday – I’m really not concerned with the business model at the moment. I want to build something that is valuable first.

I want to say a big thank you to everyone who has provided feedback, done testing, and otherwise helped me out with this, especially Chris, Cam, Reg, Eric, Rob, Dickson, Jas, and Adam. Also, though she probably would rather me not say it, Sharon played a big behind-the-scenes role in this – thanks!

Please check out ShareEdmonton, and let me know what you think. Tweet it, blog it, leave a comment below, email me, or post something on the Uservoice forum I’ve set up. You can also follow ShareEdmonton on Twitter, which is where I’ll announce new features. Thanks!

Friday musings on hyperlocal news

A couple weeks ago, Matthew Hurst created the Hyperlocal page on Wikipedia. Previously, the Hyperlocal redirect went to Local News. Here is Matthew’s rationale for the change:

One of the reasons behind separating these two is that hyperlocal content, and especially blogging, is not simply content about a location and of a particular geographic granularity. It is intended for people resident in that location and, importantly, it is written by residents of the location. Local news does not require the latter.

According to the article, hyperlocal content is characterized by three major elements:

  1. It refers to entities and events that are located within a well-defined, community-scale area.
  2. It is intended primarily for consumption by residents of that area.
  3. It is written by an individual resident in that area.

I think this definition is missing a few things.

Much of what I write on this blog could be considered hyperlocal under the above definition (assuming Edmonton falls under the well-defined, community-scale part). The same could be said of The Edmonton Journal, however, which is why I think the current definition on Wikipedia is missing something. The most obvious addition would be a fourth point about being locally owned/operated.

I like that the definition does not mention any particular medium, such as blogging, but rather leaves it open. However, I’m not sure the third point is general enough. The phrase “written by” suggests that we’re talking about the traditional article format, with sentences and paragraphs. I think hyperlocal is much more than that. Consider sites like EveryBlock, which contain hyperlocal news created by software (though I suppose EveryBlock conflicts with the locally owned/operated concept, but you get the idea). Sure, humans wrote the software, but the content produced for consumption comes from an algorithm. Shouldn’t that count?

Another thought – what about the people who create hyperlocal content, whether writers or programmers or other creatives? Should we call them Hyperlocal Journalists? Before you journalist types get all defensive, consider that there are twenty types of journalism listed on Wikipedia. What’s the harm in adding one more? 🙂

Finally, I think there’s a place for aggregators and curators in the hyperlocal ecosystem. Perhaps another defining characteristic of hyperlocal content is that it is spread all over the place. Aggregators and curators can sift through all of that content to help make it more discoverable.

Patent for podcasting? Seriously?

A company you’ve probably never heard of before announced today that it has been awarded a patent on podcasting. VoloMedia was awarded U.S. Patent 7,568,213, titled “Method for Providing Episodic Media,” yesterday. I think the fact that VoloMedia’s Murgesh Navar posted an entry defending the patent before anyone even knew about it underscores just how silly it is.

Here’s what Dave Winer wrote today in response:

I’m certainly not a lawyer or an expert in patent law, but it seems the work Adam Curry and I did in creating the format and protocol for podcasting, in 2001, may have inspired their "invention." It certainly predates it.

Honestly it boggles my mind how software patents are awarded. First of all, VoloMedia applied for the patent in November 2003. Why did it take nearly six years for it to be decided? It’s a cliche, but that’s an eternity on the Internet. Second of all, how could the patent office not discover prior art within those six years? It’s just ridiculous.

According to NewTeeVee, VoloMedia is in talks with Apple and TV networks, among others, “about growing the business and market.” Seriously? I hope VoloMedia fails fast. I really dislike companies that exist solely to sue other companies for violating patents they should never have been awarded in the first place. That’s exactly what VoloMedia is becoming.

For more, check out Ars Technica. Here’s hoping that VoloMedia’s patent is invalidated.