Maybe next time Elections Alberta can spend $1 million on something useful

Well there’s $1 million down the drain. Voter turnout for the 2015 provincial election here in Alberta ended up being 53.7%, down from 54.4% in 2012. The flashy ad campaign that Elections Alberta ran probably had very little impact on those numbers, if it had any impact at all. I’d wager that anger against the PCs and enchantment with the NDP’s orange crush did more to affect voter turnout than #ChooseYourAlberta did.

choose your alberta

They could have spent that money on things that would have actually, measurably impacted turnout. Like more voting stations. Or better educational resources on how to vote. Or online voting. Or, as I will argue for in this post, on a better website and on open data. Let’s start with the website.

Is a functional, reliable, up-to-date website too much to ask for in 2015?

The Elections Alberta website is an unmitigated disaster. It’s garish, uses tables for layout, and is horribly unfriendly to use on a mobile device. Worse, there’s not just one website, but many. Here are some of the subdomains I’ve come across:

There are probably others that I haven’t even found yet, too. Each of those sites has a different navigation menu even though they share a similar design, which makes them very disorienting. Worse, they change seemingly on a whim. Links are removed or change, redirects are put in place, and there is no revision history.

I suppose you could argue that we don’t have elections very often so it’s not worth putting a lot of money into the website. But I’m not talking about a fancy, complicated, expensive redesign. I’m talking about a simple, responsive, and trustworthy website that is actually useful.

I think being trustworthy is especially important. Elections Alberta is the authority on elections in Alberta – I would expect to be able to go to their website to find accurate, reliable information. But it’s hard to trust a site that is constantly in flux, with information appearing for a few days and then disappearing again, or links that look like they were added almost as an afterthought.

For instance, I downloaded a list of candidates in the 2015 election in Excel format a few weeks before election day, as I was building my results dashboard. It was somewhere on the WTV site. Today that page is gone, and the WTV site simply redirects to the results site. Thanks to the Wayback Machine, I can see that a completely different site used to be there, with the link to the Excel document I had downloaded. Why remove that?

It seems they have removed nearly all of the previous information and functionality now that the election is over. Searching for your candidates has been replaced with finding your MLA. Which kind of makes sense, except that you’re on the Elections Alberta site, not the Legislative Assembly website. I expect to find election-related information at Elections Alberta, thank you very much!

A small fraction of the $1 million ad campaign budget would have gone a long way toward addressing these issues with the Elections Alberta website.

It’s time to get on board the open data bandwagon

I really like building things for elections. Whether it’s a results dashboard, a where-to-vote tool, a sign management system for a campaign, or something else entirely, I enjoy it all. These projects generally need data. Sometimes you crowdsource the data (where did volunteers drop all of the signs) but often you want official data from the election authority. In the case of the provincial election, I wanted to build a site that was useful before and after the election, with a where-to-vote feature, information on all of the candidates, and a results dashboard. I needed some data from Elections Alberta to make it happen. Here’s a rough overview of what I wanted:

  • A list of all parties (ideally with contact info)
  • A list of all candidates (ideally with contact info, their electoral district, etc.)
  • A list of all electoral districts (ideally with returning officers and other info)
  • A list of all polling stations (ideally with addresses and contact info)
  • The geographical boundary data for each electoral district
  • The geographical boundary data for each polling station
  • Results data for the 2015 election
  • Historical results data

Each of those datasets would allow me to build additional features, especially when combined with my own data. All of them are fairly straightforward in my opinion, and should be things that the authority on elections would have. Once I knew which datasets I needed, I set about finding them.

My first stop was the Alberta Open Data Portal: “The portal makes data the provincial government collects on behalf of citizens publicly available in machine readable formats with an open licence.” Like the City of Edmonton’s data catalogue, the Alberta Open Data Portal should be a one-stop shop for open data. But unfortunately, it contains no election-related data. I of course submitted a dataset request, but knew it wouldn’t be actioned in time. I still haven’t heard anything back about it.

I knew at this point that I’d have to hunt each dataset down individually, likely on the Elections Alberta site. And given what I wrote above about the website, I knew that was likely to be problematic.

As mentioned, I found the list of candidates in Excel format. I also managed to find the electoral district boundary information and the polling station boundaries here. I ended up scraping nearly everything else, including the list of electoral districts. Just four days before the election, after repeated requests that went unanswered, they added an Excel document of all the polling stations (which you can see here via the Wayback Machine).

I’m pretty happy with the way the results dashboard turned out, but again it was all scraped. Instead of making a results feed available, or any kind of structured data, Elections Alberta only provides a static HTML page (which, of course, does not validate correctly, making scraping even more difficult). Now that the election is over, I see they have added the resultsnew site, which appears to provide an option to download the results in Excel. Too little, too late.
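I won’t reproduce my scraper here, but the general approach can be sketched in a few lines of Python using only the standard library. Note that the table markup below is invented for illustration (deliberately malformed, like the real page); it is not Elections Alberta’s actual structure:

```python
from html.parser import HTMLParser

class ResultsScraper(HTMLParser):
    """Tolerant scraper for a static HTML results table.

    The real results page did not validate, so a forgiving parser
    (rather than a strict XML parser) is essential. The columns here
    -- candidate, party, votes -- are illustrative only.
    """
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.row = []
        self.rows = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True
        elif tag in ("tr", "table"):
            self._flush()  # a new row starting implies the old one ended

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
        elif tag in ("tr", "table"):
            self._flush()

    def _flush(self):
        # Save the accumulated row even if its </tr> was missing.
        if self.row:
            self.rows.append(self.row)
        self.row = []
        self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.row.append(data.strip())

# Invented, intentionally sloppy markup with unclosed tags:
page = """
<table>
<tr><td>Notley, Rachel</td><td>NDP</td><td>12345
<tr><td>Prentice, Jim</td><td>PC<td>6789</td></tr>
</table>
"""

scraper = ResultsScraper()
scraper.feed(page)
print(scraper.rows)
# [['Notley, Rachel', 'NDP', '12345'], ['Prentice, Jim', 'PC', '6789']]
```

The key point is that the parser never trusts the page to close its tags; any new `<tr>` or the end of the table flushes whatever row has accumulated.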

One quick note on historical data. You can get PDFs here, but that’s pretty useless for anything other than manual lookups. I couldn’t find anything else. The only reason my results dashboard is able to show results data from 2012 is that I had saved copies of the static HTML results files that year.

This situation is untenable. Scraping data, hunting around a constantly changing website, and pleading for more complete datasets is not my idea of an open and accessible government. Open data is not a new concept, and the Province already has an open data catalogue. All Elections Alberta needs to do is make their data available inside of it.

There’s plenty of time to fix this before the next election!

I know that election time is crunch time, and that the folks at Elections Alberta were probably incredibly stressed out and constantly faced an uphill battle. And I know there are smart, dedicated Albertans who work there. Keila Johnston, Director of IT and Geomatics for Elections Alberta, was particularly helpful. But now the election is over, and I’d really like to see some positive change.

It would be an incredible shame if we got to the next election here in Alberta and found ourselves in the same position: with a website that’s out-of-date and unreliable, and a lack of open data to power new tools and experiences for voters. Elections Alberta has the talent and ability to fix both of those issues, if they prioritize it. And the best part? It shouldn’t cost $1 million to do so.

Media Monday Edmonton: The Wanderer

Edmonton’s online coverage got a little bit richer last July when The Wanderer officially launched. Described as “Edmonton’s premier daily online magazine,” The Wanderer was born at the University of Alberta but aims to reach beyond campus by highlighting local politics, culture, science, sports, and more. I sat down recently with Emerson Csorba, one of the site’s founders, to learn more.

Emerson is entering his fourth year of Sciences Politiques at Campus Saint-Jean after spending a year working for the Students’ Union. Last spring he started throwing around the idea of starting a newspaper or magazine with some friends. “We wanted to highlight Edmonton a little differently,” he said, citing influences such as The Atlantic, Gawker, and GOOD. They wanted to provide an alternative to The Gateway, but also didn’t want to be restricted to covering university-related news. The other founders included Sansitny Ruth, Dongwoo Kim, Katrina Regino, Skye Oleson-Cormack, and Sydney Rudko. In the summer they decided to make it happen.

Emerson Csorba

Emerson and the team recruited about 20 writers and started posting content, with the site officially launching on July 5, 2012. Today they’re up to about 70 contributors, 20 of whom contribute regularly. All are volunteers. “We run off gratitude,” Emerson told me. “Thanks for contributing!” Emerson is hoping to have some professors start writing for the site consistently too, perhaps talking about their research. And another challenge is to find a core group of younger students who can contribute. “We want to have a reunion 20 years from now!”

The goal is to publish something new every day. Contributors have quite a bit of autonomy, though usually a piece will get bounced off at least one other person before going live. The site runs on WordPress and contributors are granted “editor” privileges. For the most part this works well, though it can backfire occasionally. The satirical paragraph about northsiders in this piece didn’t come off well, Emerson told me (nor did his piece on Plastiq). Still, they didn’t take it down. As of April 15, a total of 847 articles had been published on a variety of topics.

The name of the site was a suggestion from Sansitny. “At first I didn’t like it,” Emerson admitted, adding that it has grown on him since. It’s meant to capture the idea that students are wanderers, experimenting as they work to find their path. Other names that were considered included “Butterdome Republic” and “Rutherford Post”.

I have really been enjoying the content at The Wanderer, especially lately. Interviews with Omar Mouallem, Edmonton Opera’s CEO Sandra Gajic, and Mayor Mandel have all been great reads. An earlier project that received a lot of attention was The Wanderer’s list of the Top 100 Undergrads. I asked Emerson if he considers himself a journalist, but he shunned the label. “We want people to write about things they’re involved in and passionate about,” he told me.

The Wanderer

As for what’s next for the site, Emerson says “consistency is the goal,” at least in terms of posting content. Watch for podcasts and videos in the future, as well as enhanced visual arts coverage. Emerson is also hoping to have The Wanderer branch out into events. “Maybe we can do a half day conference on education,” he mused. “Tie all the levels of education together.” Another area of interest is community leagues, and how to engage more youth (Emerson served as president of the Parkallen Community League for a year, so he knows a thing or two about that!). There’s clearly a lot of energy and ideas flowing. I think their recent “Thank you, readers” post captures the possibilities well:

The Wanderer honestly doesn’t have an end-point in mind; we evolve based on our writers’ ideas. We provide autonomy to our writers and tell them to basically “go for it.”

The Wanderer is off to a great start, with a Yeggie nomination in the “Best in Edmonton” category (if that wasn’t proof enough that The Wanderer is on to something, a website called Ualberta Green Onion poked fun at them recently), and more than 60,000 unique visitors and 200,000 page views since launch. Add to that a large team of contributors producing quality content, and you’ve got a local site to keep an eye on!

ShareEdmonton updated with support for blogs & news releases

Today I am very excited to share with you the latest release of ShareEdmonton, my ongoing effort to build a platform for finding, filtering, and sharing Edmonton-related content and information. When I launched the site back in 2009, I said: “I want to redefine local media and improve Edmonton by embracing the fact that communication is increasingly taking place online.” While I still haven’t achieved my complete vision for ShareEdmonton, today’s release is another big step in the right direction.

The first thing you’ll notice if you have been using ShareEdmonton for any length of time is the new design. I decided to embrace Twitter Bootstrap as the foundation for the website’s layout and styling, a decision I am extremely happy about. With Bootstrap, the site is responsive and mobile-friendly, lightweight, and standards-compliant. It should look great on an HD screen, a mobile phone, and everything in-between. Another UI-related change is that Google Maps has been replaced with Leaflet and OpenStreetMap.

The majority of the work in this release was done behind-the-scenes. For instance, I completely revamped the way ShareEdmonton imports data so that it is more automatic and much more reliable. Another big change is the underlying infrastructure – ShareEdmonton now runs completely on Windows Azure. This is the kind of thing that is transparent to the end user but means that I can spend far less time worrying about servers and much more time adding new features!

Speaking of new features, there are a bunch in the latest release:

  • Blogs! You can now use ShareEdmonton to keep up-to-date with local bloggers. More than 100 blogs and 10,000 posts have been indexed so far. I have focused on blogs that update fairly regularly, but I know there are many, many more that should be included. You can add blogs here. There are two primary views for blog posts: the visual view and the headline view (which shows posts from the last week).
  • News Releases! Similar to blogs, ShareEdmonton is now indexing news releases. There’s a lot of room to improve, but so far about 20 sources and a few thousand releases have been indexed, including every City of Edmonton news release since January 2009.
  • Event pages have been updated with the new visual view, similar to blogs and news releases. I think it’s a much more enjoyable way to browse what’s coming up. Also, you can now add your own calendars for ShareEdmonton to import automatically.
  • The weather page now supports watches and warnings. If Environment Canada has issued a watch or warning, you’ll see it across the top of the weather page, but you’ll also now see a notification icon on the toolbar across the top of the site, no matter what page you’re on.

There are also dozens and dozens of bug fixes, data updates, and other small improvements.

I’ll be writing more about some of these new features over the next week, but for now, take a look and let me know what you think!

Building a Results Dashboard for the 2012 Alberta Election

Like many Albertans, I have spent a significant amount of time over the last month paying attention to the election! Reading about the candidates, following all the drama, and spending lots of time with the #abvote hashtag on Twitter. As the candidates were making one final push over the weekend before the election, I decided to build a results dashboard. I like a good challenge and enjoyed building it, but it was especially rewarding to see that it proved to be quite popular too! In this post I’ll tell you a little about how and why I built the website, and what I learned from it.

abvote results

If you haven’t checked out the dashboard, you can see it here. I’ve added a bunch of stuff since election night, which I’ll explain below.

The Idea

By late Friday afternoon, my thoughts had drifted to election day itself. I started to think about how exciting it would be to see the results come in – I love election nights! I knew there would be television coverage and that the media would have some web coverage as well, but I also felt that I could build something unique and valuable. If only I had the data! So I looked around, and found the Elections Alberta results site. At that time, the results page was full of test data. I immediately saved a copy to my computer, and saved a few of the electoral division pages too. That proved to be a wise decision, because a few hours later the site went offline!

elections alberta

Before I took a crack at scraping the website, I wanted to know if there was a data feed of some kind available. I blindly emailed the general Elections Alberta address, and to my surprise, received a response shortly thereafter! Unfortunately there was no data feed available, so I set about writing a scraper. Within a couple of hours, I was correctly scraping the main results page as well as all of the electoral division pages. Now that I had the data, I felt pretty confident that I could build a dashboard over the weekend. I didn’t get back to the project until Sunday morning, so that meant I had to prioritize what I was going to build. It took about six hours, but I finished my initial version late that evening.

The Design

This was not my first election results dashboard. If you’ve been reading my blog for a while, you’ll recall that I built a dashboard for the municipal election here in Edmonton back in 2010. I learned a lot from that experience, and I remember it being a lot more rushed and difficult than this dashboard! Among other lessons, it was clear that design and colors matter, and that mobile devices are important (even then lots of people were asking for mobile support). I also knew that forcing users to refresh the page is less than ideal – it’s not a very delightful experience, and it puts unnecessary strain on the server. I also disliked the limited real estate that I had to work with (the current ShareEdmonton page width is fixed…but I’m working on a new version that is fluid).

So, I wanted a mobile-friendly, fluid-width, Ajax-enabled, attractive looking design. I immediately decided to use Twitter Bootstrap. I have used it a few times now, and I absolutely love it. I can’t thank the folks at Twitter enough for making such an excellent framework available for free! It gave me everything I needed to get going from a UI perspective. In particular it features responsive design, which makes it possible for the pages to scale from the desktop down to mobile devices without much work. For the backend, I used ASP.NET MVC 3. I use it for everything, so I know it well.

For performance reasons, it definitely made sense to cache the data. I decided on a fairly straightforward approach: I’d scrape the data from Elections Alberta and would store it using Memcached for two minutes. That meant that every two minutes, a request would take slightly longer because it had to download the data again, but this seemed reasonable (and as it turned out, the Elections Alberta site was incredibly quick). I also designed the pages to poll for new data every 30 seconds, which prevented users from having to reload the page manually.

The Cloud

When I built the ShareEdmonton dashboard a couple years ago, it was hosted on one of my servers. That worked fine, but it did slow down under load and I didn’t have much ability to scale up or out without a lot of additional cost, time, and effort. I really wanted to avoid that situation this time, so I decided to host the dashboard using Windows Azure. I’m in the process of migrating ShareEdmonton to Azure, so I already had an account and was pretty familiar with how it worked. Deploying to Azure is so easy – I simply had to add a deployment project in Visual Studio, and then I could deploy new versions in just a couple of clicks.

Windows Azure supports a range of instance types – basically you get to choose how big and powerful you want your server to be. I started with “Extra Small”, the least powerful and therefore least expensive type. As the polls were about to close at 8pm, I scaled up to “Small”, which meant redeploying the app (which took about 8 minutes, but happened completely behind-the-scenes). About half an hour later, I had to add capacity because the site was starting to get quite sluggish. This time I scaled out, by adding a second instance. All I had to do was change a configuration setting in the Azure management console, and the service took care of everything. Within a few minutes, I had two load-balanced “Small” instances. The performance boost was immediately noticeable. About an hour later, I added a third instance, and kept the system running that way until about 1am. I scaled it back down in stages, and now have it running as a single “Extra Small” instance again.

Two Key Decisions

I think the two most important decisions I made were:

  1. Using Twitter Bootstrap
  2. Using Windows Azure

The first decision meant that the website looked good and worked across browsers, screen resolutions, and devices. I got all of that engineering effort and testing for free, which meant I could focus on building an election results dashboard rather than building a website. I didn’t have to figure out how to lay things out on the screen, or how to style tables. The second decision was perhaps even more important. By using Windows Azure, I could deploy new versions of the dashboard in minutes, plus I could scale up and out simply by changing a few settings. That meant I could quickly respond when the site came under load. The other big advantage of using Azure was the cost – running the site on election night cost me just $1.54. Incredible!

Some Statistics

The dashboard served around 60,000 page views on election night alone, which is pretty good for a website launched just hours before the main event. Keep in mind that because the data on the site updated automatically, users didn’t have to refresh the page, which kept that statistic lower than it would otherwise have been. The visit duration metric is another way to see that – 20% of all visitors spent at least 10 minutes on the site. I actually would have guessed a higher percentage than that, but perhaps the high mobile usage was the reason.

The top screen resolution for visitors was 320×480, not a desktop resolution! Roughly 36% of all visits that night were made on mobile devices (which includes tablets). The iPhone was the most popular device, followed by the iPad. Clearly using a framework like Twitter Bootstrap with responsive design was a good decision.

The other statistic worth sharing is that the vast majority of visitors (about 73%) found the site by way of social networks, and two in particular. Facebook accounted for 78% of all those visits, while Twitter accounted for 20%.

Recent Improvements

Since Monday I have made numerous improvements to the dashboard. Here’s a brief overview of the new features:

  • All the data is now stored locally, which means I’m no longer reliant on Elections Alberta. They have made numerous updates over the last two days, and I have updated the site’s local data store accordingly.
  • I updated the voter turnout chart and added regional voter turnout to the front page. I also added a table of the five closest races.
  • District pages now show voter turnout and the list of polls is now sortable.
  • There’s a new Districts Grid, which lets you see lots of information about all the districts in a single, sortable view. For example, you can quickly see which district had the best voter turnout, which were the closest races, and which had the most candidates.
  • There’s also a Candidates page, which lets you see information about all of the candidates in a single, sortable view.
  • Last night I also added a Maps page, which has interactive maps for the province, as well as zoomed-in maps for Calgary and Edmonton. Click on any region for details and a link to the district page.

What’s Next?

I plan to keep the dashboard up as it is now, though at some point I’ll probably transition it from being a dynamic website to a static one (far cheaper to host over the long-run). If you have any suggestions on things to add or improve, let me know! I hope the site will serve as a valuable reference tool going forward.

Thanks for reading, and thanks to everyone who sent positive comments about the dashboard my way. It’s great to hear that so many people found it useful on election night!

It’s time to stop investing in Edmonton Stories

Nearly three years ago the City of Edmonton launched Edmonton Stories, a new approach to marketing Edmonton. The project will be discussed by Executive Committee tomorrow, and at least one Councillor has been quite vocal about his desire to shut it down. Councillor Diotte wrote about the issue yesterday on his blog:

I argue we have no performance measures for the website. Social media gurus tell me the costs surrounding Edmontonstories are astronomically high and we can’t even gauge if it alone has drawn a single person to come live in this city.

I don’t always agree with Councillor Diotte, but in this case I think he’s right – it is time to very seriously ask if continuing to put resources into Edmonton Stories is the right thing to do. I first raised questions about the value we’re getting back in September 2009, and followed up with then Communications Branch Manager Mary Pat Barry in February 2010. My conclusion at the time was that while the cost was high, the site was starting to deliver results. The case study that was created in conjunction with the Edmonton Police Service was a really positive step.

Now, two years later, where are we? Not much further ahead. Here’s the sad reality:

  • In its first four months, the site attracted 113,979 total visits. Five months later, that number had grown to 203,685. And in the two years since, it has attracted just 358,691 more visits, bringing the total to 558,376. Most of the growth took place in the first year! Since a picture is worth a thousand words, here’s a graph to show you what the growth curve looks like (linear and logarithmic):

edmonton stories traffic

  • And remember that those numbers are total visits. There’s no word on how many are uniques. The number of people visiting from outside Edmonton is even smaller, especially when you consider that when an Edmontonian’s story goes up, they likely share it with friends and family in the city.
  • The number of stories on the site likewise has grown very slowly. The total now sits at 339 compared to 272 in February 2010.
  • The same case study that was held up in defense of the site two years ago is the one Administration is using now (the EPS one). The report mentions just six organizations that have joined the Recruitment Campaign Partnership. Six! Out of all the organizations in Edmonton!
  • And yes, the budget is a concern. Incredibly, the report does not make it clear how much has been spent on the project. It does state that $1.5 million was allocated in the first year and that a consultant’s estimate of the “right” investment amount was about $5 million. Councillor Diotte says that with this year’s $600,000 budget factored in, a total of $3.5 million will have been spent on the site since it launched.
  • Worse than the overall budget, however, is the breakdown. UPDATE: The numbers have now been posted. Here’s the split identified for the 2012 budget:

So, let me get this straight:

  • $180,000 is being spent to advertise the website to extend its reach, yet we know that the growth rate has declined significantly over time.
  • $144,000 is being spent on the recruitment program, which has attracted just seven partner organizations in the last two years.
  • $126,000 is being spent on “managing, maintaining, monitoring and engaging target audiences of various social media platforms.” You know, the stuff you and I do every day for free.
  • $54,000 is being spent on “research, planning & development.” I’m not exactly sure what this would refer to in the third year of a program like this.
  • $54,000 is being spent on “website development & maintenance.” I pay $90 per month total to host this site and at least half a dozen others on Amazon EC2. And I can confirm that it more than handles the kind of traffic Edmonton Stories gets.
  • $30,000 is being spent to extend the brand into trade shows and other events.
  • $12,000 is being spent to help people write new stories, yet just 67 new stories have been posted in the last two years.

Clearly the cost is a concern. But perhaps the biggest problem is that the site’s champion is no longer driving the site forward. I don’t think it is a coincidence that after Mary Pat left the City the site received less attention. Reading the report from Administration, it certainly feels like there’s a gap from 2010 until now. It’s hard to look after someone else’s baby.

I recognize that you don’t get results overnight and that developing a successful program can often take time. But three years should be enough time to decide whether or not to pull the plug. That’s an eternity in the online world! Incredibly, Administration thinks we should do the opposite by reaching out to more organizations, recruiting student partners, and enhancing the site with things like Google Maps.

I think there’s value in what has been created, and I believe there are ways to continue to leverage that (perhaps via EEDC, which always did seem like a more suitable home for it), but I don’t think the City should be investing any more into the project.

Media Monday Edmonton: How fast are local media websites?

On the web, page speed matters. If your site takes too long to load, people will go elsewhere. Google proved this by purposefully slowing down its search engine: they found that adding even half a second of delay resulted in fewer searches. Bottom line: users love fast sites!

With that in mind, I decided to look at local media sites – how fast or slow are they? Rather than looking at page load times, I decided to use YSlow to determine the performance of each site. Lots of factors can impact the amount of time a page takes to load (your ISP, the speed of your computer, where you are geographically, etc.), but everyone has to download the same amount of data for a web page, so I figured YSlow is a little more fair than a stopwatch.

Here are the fourteen sites I looked at (you can download all the data here):

As you can see, none of the sites received an “A” grade. The only one to receive a B was Only Here for the Food. The grade can be somewhat misleading, however. Here’s what the front page performance for each site looks like:

It turns out that Sharon’s blog is the heaviest of them all with an empty cache (the first time you visit the site). This is due to the large number of images she has on her site. In fact, almost all of the weight of the page is due to the images. If you look at the primed cache (subsequent visits to the site) then the Edmonton Journal is the heaviest. The Edmonton Journal has the worst performance improvement going from an empty to a primed cache:

To be fair, I decided I should compare an “article page” on each site as well. With social media in particular, an article page is more likely to be a visitor’s entry point to the site. For this test, I simply clicked on the top article on each site:

One caveat: I used the second story for iNews 880, because their top story was over 25 MB in size! Evidently they think it is fine to embed full-size, uncompressed images.

As you can see, Valerie’s blog is the heaviest, again due to the number of pictures she has. Once again, the Edmonton Journal has the worst performance improvement going from an empty to a primed cache:

Final Thoughts

I thought there would be more of a difference between the new and traditional media sites, but there isn’t really. In general the heaviest part of the blogs is images and the heaviest part of the traditional media sites is JavaScript, but there are exceptions. On average, the first time you visit the front page of one of these sites you’re going to have to download just over 2 MB. On a 56.6 Kbps dial-up connection, that would take you nearly 5 minutes. On a typical high-speed connection, it’s less than 10 seconds.
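The dial-up arithmetic is easy to check (the 2 Mbps figure below is just an assumed modest high-speed connection, not a measurement):

```python
PAGE_BYTES = 2 * 1024 * 1024       # "just over 2 MB" front page, empty cache
page_bits = PAGE_BYTES * 8         # 16,777,216 bits to download

dialup_bps = 56_600                # 56.6 Kbps dial-up
dialup_seconds = page_bits / dialup_bps
print(f"Dial-up: {dialup_seconds / 60:.1f} minutes")   # Dial-up: 4.9 minutes

broadband_bps = 2_000_000          # an assumed, fairly modest 2 Mbps connection
broadband_seconds = page_bits / broadband_bps
print(f"2 Mbps: {broadband_seconds:.1f} seconds")      # 2 Mbps: 8.4 seconds
```

Even that modest broadband connection comes in under 10 seconds, and a faster one would finish in a second or two.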

I think perceived performance is often more important than actual performance, but that’s obviously harder to measure. In my experience, most of these sites load fairly quickly. When I do notice a speed issue, it’s usually because the page I am trying to load has a lot of stuff on it.

Another thing I learned from this exercise is that all of the sites have room for improvement!

What has your experience been like? Which sites do you find slow?

Daily Deals in Edmonton

I’m amazed at how many “daily deal” or “group coupon” sites there are in Edmonton now, let alone the rest of the world. I’m sure you’ve heard of Groupon, and maybe one or two others such as GoodNews, but did you know there are at least ten such sites in Edmonton? Here’s a list of the ones I have found:

There are also a bunch of sites preparing to launch in our city:

Daniel has created a Twitter list for some of the services here.

I don’t see how that many sites are going to succeed. My guess is that most won’t. Some are so similar in form and function that I wonder if they’re actually the same company. Others have clearly noticed how overwhelming the choice is: there are now dozens of “daily deal aggregators” out there so that you don’t have to sign up for each service individually. Here are some Edmonton-related ones I have found:

I’m sure there are many others that I have missed.

The most popular item on daily deal sites seems to be spa packages, so you may or may not find value in these services. I have used Groupon a couple of times, and I think they are here to stay. LivingSocial seems to be the second biggest of the sites, at least in the US. SwarmJam should be able to leverage its existing relationships with advertisers, so maybe they’ll find success. GoodNews has a bit of a twist in that it supports local charities. As for the rest – who knows.

Edmonton’s 2010 Grey Cup Festival Never Happened

In November 2010, Edmonton hosted the 98th Grey Cup. The Montreal Alouettes defeated the Saskatchewan Roughriders for the second straight year to capture the CFL’s top prize. Of course, the event was more than just a football game. We’re Festival City, and we turned the Grey Cup into a very successful festival. There was something for everyone, and downtown was full of people, which unfortunately doesn’t happen very often. It wasn’t a perfect event, but I think you’d be hard-pressed to find an Edmontonian who would consider it anything less than a success.

2010 Grey Cup Festival Kickoff

Here’s what Todd Babiak wrote (archive):

Ten years from now, only the statisticians and the really, really heartbroken will recall the winner of Sunday’s Grey Cup game in Edmonton.

What we would like to remember, in 10 years, is that many thousands of warmly audacious people from Saskatchewan came to witness Edmonton’s transition from a cosy little prairie city to something else.

I would go further and say that we absolutely need to remember what we accomplished with the Grey Cup Festival. We need to be proud of it, we need to learn from it, and we need to improve upon it.

But, the Grey Cup Festival never happened.

If you try to visit the festival website, you’re redirected to the website of the Edmonton Eskimos. As far as the web is concerned, the festival never happened. And in 2011 and beyond, the web is all that matters. Think about it for a second – less than two months after the event took place, the most important online record of it has vanished.

Ignoring the fact that the website barely worked during the festival (which is an important, but different issue), this is troubling. I have written before about the need to preserve our local, digital, cultural artifacts. The web is the single most important platform for doing so. The web is accessible and pervasive. Too often, however, it is not permanent. We can and must do better. We also need to stop thinking of event websites as only being relevant during the event.

Now obviously the festival happened. And there are other places online that provide evidence and a record of it. There’s the Wikipedia entry, the many blog posts that were written, thousands of photos uploaded to the web, etc. But all of these should be ancillary to the event website, not a substitute for it. And there’s no guarantee that they’ll exist in the future. For instance, you can read Todd’s article today, but in six months it will no longer be available on the web (hopefully my archive link will outlast it…this is a problem the Journal is aware of and hopes to address).

The saddest part about this particular instance is that I guessed it would happen. I should have spoken up sooner. The good news is that I archived the entire site on November 27, 2010. You can see the front page here.
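One habit that makes this kind of archiving sustainable is baking the capture date into the snapshot's path, so multiple versions of the same page can coexist instead of overwriting each other. A minimal sketch of what I mean – the layout here is my own invention, not any archival standard:

```python
from datetime import date
from urllib.parse import urlparse

def archive_path(url, captured):
    """Build a filesystem path for a snapshot of url taken on a given date."""
    parsed = urlparse(url)
    page = parsed.path.strip("/") or "index"  # fall back to "index" for the front page
    page = page.replace("/", "_")             # flatten nested paths into one filename
    return f"archive/{parsed.netloc}/{captured.isoformat()}/{page}.html"

# A hypothetical example URL, not the festival's actual address
print(archive_path("http://example.com/schedule/", date(2010, 11, 27)))
# archive/example.com/2010-11-27/schedule.html
```

With a scheme like this, re-archiving a site a week later simply adds a new dated folder alongside the old one.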

I don’t think this is an easy problem to solve, but I believe it is important that we do solve it. I’m going to do what I can to help educate others about why this is so important, I’ll continue learning from the very smart people we have in the “archival” business, and I’ll continue doing what I can to help with archiving.

Alberta Budget 2010 website – security through obscurity

Tomorrow, Tuesday, is budget day here in Alberta. Like many Albertans, I am curious about what Finance Minister Ted Morton is going to deliver, so I started poking around online. First stop: last year’s budget website.

It seemed logical that the 2010 budget would live at a similar URL. So I tried it, and was prompted with a login screen. The first thing that came to mind was “administrator” and “password”. Voila:

Fortunately for Mr. Morton, the documents don’t appear to have been uploaded yet. You can see all the placeholders though, which is kind of funny. And it seems you can leave feedback.

It does reveal the theme of the budget, Striking the Right Balance. Last year was Building on Our Strength.

This is what is known as “security through obscurity”. It’s not really secure, it’s just hidden. I’d suggest that programmers working at the Government of Alberta invest in Writing Secure Code, a fantastic book on the subject.
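What makes this so frustrating is how cheap the fix is. Even a throwaway password for a pre-launch site can be generated properly instead of left at a guessable default – for example, with Python’s secrets module. This is a general sketch, not a claim about how the Government of Alberta’s site is built:

```python
import secrets

# The handful of defaults any curious visitor will try first
DEFAULT_PASSWORDS = {"password", "admin", "administrator", "123456"}

# token_urlsafe(16) draws 16 random bytes (~128 bits of entropy)
# and encodes them as a 22-character URL-safe string
password = secrets.token_urlsafe(16)

print(password not in DEFAULT_PASSWORDS)  # True: it can never collide with a short default
print(len(password))                      # 22
```

A hidden URL plus a default credential is not a security boundary; a random credential at least makes the “obscurity” part irrelevant.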

I hope this isn’t a reflection of the budget we see tomorrow…cutting corners, etc.

UPDATE: Sometime around 9:45 AM today they changed the password, and I think pointed the virtual directory somewhere else.

UPDATE2: The Journal wrote about this today.

UPDATE3: The site is now officially live with all the budget documents. Enjoy!

We need to preserve our local, digital, cultural artifacts

As Edmonton continues its climb toward global status, I think it’s important that we consider the digital cultural artifacts that we create along the way. It’s rare that something big happens in Edmonton (or anywhere in the world for that matter) without a website or other online presence of some kind being created. That online presence is important in the weeks and months leading up to an event, but it’s just as important after the fact too. We need to start considering that from the beginning.

Think about big events that Edmonton has hosted in recent years. The 2001 World Championships in Athletics should come to mind. If you do a search for Edmonton 2001, you’ll find:

And linked from the official IAAF website, and from many other pages that show up in the results, is the Edmonton 2001 website. The problem is, that site no longer exists.

What would happen if the IAAF took down the page they are hosting? It doesn’t have to happen on purpose, it could be an unfortunate side effect of a redesign, server relocation, etc. The article at Wikipedia is pretty sparse, containing mainly result information. And the mention on the EEDC site is insignificant. It’s almost as if the event didn’t happen.

Additionally, I’d argue that none of the links that still exist tell the story of Edmonton 2001. The effort that went into it, the many volunteers and organizations that made it happen, the effect it had on the city, etc. I think it’s important that we capture that information, and that we do so online, where it is easily accessible by all.

Another more recent example would be the ICLEI World Congress, held in June 2009. The City of Edmonton has a brief page devoted to the event, but most of the information exists at the ICLEI site. That’s fine, but again we’re relying on someone else for the information, and we’re missing an opportunity to tell our story. The advantage the ICLEI World Congress had over Edmonton 2001 is that many bloggers wrote about the event and many photographers posted photos, and their content will likely continue to exist for quite some time. The new Transforming Edmonton blog will help too, I think.

The idea of digital preservation applies to smaller-scale events too. Try to find an online presence for the 2005 K-Days (now Capital EX), the year the event’s attendance record was set. Or try to find out about the 2008 Fringe festival.

I recognize that there are costs associated with preserving our online cultural artifacts. Someone has to pay for them, and someone has to maintain them. And if we go that extra step and treat some online presences as legacy projects, with updates and other information to tell our story, there are obviously costs associated with that too. I think the costs would be quite minimal, however, and definitely worth it.

Perhaps this is something for the Edmonton Heritage Council to tackle? Or the Edmonton Historical Board? Or maybe just you and me. Either way, we need to start taking digital preservation more seriously.