Potholes in Edmonton

Every year the City of Edmonton spends a few million dollars to fill a few hundred thousand potholes. Are potholes just a fact of life, or can we do something about them? I think the latter. It’s time for a more sophisticated and creative discussion about potholes in Edmonton!

Pothole
Pothole photo by More Bike Lanes Please

We hear the same thing every year. As spring approaches, dozens of stories are published about Edmonton’s pothole problem. We hear all about the winter freeze/thaw cycle that makes the potholes so bad. We hear that the City has crews out all the time fixing potholes – on average about 400,000 per year. We hear that a lot of money is being spent on the problem!

Here’s what Mayor Mandel said a few weeks ago:

“If you look at this winter — we’ve had freezing and thawing, freezing and thawing way more than any other year,” said Mandel, “and we have had a little more snow than normal. It creates havoc.”

“It’s not our intention to create a pothole … but it is a fact of life in our city,” said Mandel. “It will be there forever and we’ll never catch up.”

That sounds like a challenge!

I started digging into potholes – figuratively, anyway. I started with a series of questions, and then I just began researching. I went through old council minutes, I looked at City reports, I searched through old newspaper articles, and more. What was supposed to take a few hours turned into days! After a while I realized I had better stop and share what I had gathered, so that’s what you’ll find in this post.

Here’s a video for those of you in the TL;DR camp:

Here are some of the highlights of what I found:

  • Potholes form when water and traffic are present at the same time.
  • The City has filled more than 5.6 million potholes since 2000.
  • On average, the City fills about 433,000 potholes each year, with a budget of $3.5 million.
  • Annual pothole budgets have ranged from $1.5 million to $5.9 million since 1990, for a total of about $85 million (or $104 million adjusted for inflation).
  • Edmonton seems to fill twice as many potholes as any other large Canadian city.
  • The City maintains more than 4,600 kilometers of roads. The average quality of an arterial road is 6.1 out of 10, just below the industry standard. There is not enough funding in place to prevent this from falling.
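The inflation adjustment in that last budget figure is a simple CPI ratio. Here’s a minimal sketch in Python – the CPI values below (78.4 and 119.9) are placeholders for illustration, not official Statistics Canada figures:

```python
def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Convert a historical dollar amount into current dollars via the CPI ratio."""
    return amount * cpi_now / cpi_then

# Hypothetical example: a $1.5M budget year adjusted with assumed
# CPI values of 78.4 (then) and 119.9 (now) – placeholders, not real data.
print(adjust_for_inflation(1_500_000, 78.4, 119.9))
```

Summing the adjusted figure for each budget year since 1990 is what produces a total like the $104 million above.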

There’s a lot more information in this PDF report that I’ve put together:

I put all of the data I gathered into an Excel document that you can download here. You’ll find some data in there that is incomplete – if you have the missing information, please let me know! If you use it to generate your own analysis, I’d love to learn from you so please share!

How can we solve the pothole problem in Edmonton? I don’t know. But doing the same thing over and over isn’t going to change anything either. Here are some ideas on how to make progress:

  1. Information is only useful if we can bring it together to turn it into knowledge. I’ve started to do some of that in the report above. In the absence of good data about weather patterns or traffic patterns, it’s easy to make assumptions. I feel as though I’ve only scratched the surface – there’s a lot more information that could be correlated to develop a better picture of the pothole problem.
  2. We need to make better use of the tools and expertise that we have in Edmonton. I’m thinking of tools like the Open Data Catalogue, for instance, and expertise like the transportation engineers and soil experts we have. Edmonton is one of the few cities that tracks the number of potholes filled, let alone makes that data available online, but we can do more! We also need to do a better job of harnessing the collective power of all Edmontonians for crowdsourcing ideas and data. Potholes don’t have to be just a transportation problem.
  3. There are lots of interesting things happening elsewhere – Edmonton is not the only city that has to deal with potholes! What can we learn from others? There are self-heating roads, crack-proof concrete made possible by nanotechnology, and all sorts of polymers designed to make roads less brittle. How can we apply some of that knowledge?

What if we brought together engineers, scientists, designers, programmers, and other citizens for a one-day pothole unconference? What would they come up with? I think it’s an idea worth exploring.

Splash
Splash photo by Owen’s Law

I don’t think we’ll solve the pothole problem in Edmonton just by throwing more money at it, and we certainly won’t get anywhere with cheap gimmicks. Instead I think we need to get a bit more holistic and creative in our approach.

For now, I have two calls-to-action:

  1. If you’ve never reported a pothole using the City’s online form, give it a shot here. Don’t bother with forms or maps on other sites – use the official one.
  2. If you found anything in this post valuable, please share it with others.

Thanks for reading and happy pothole dodging!

Homicide Rates in Canada: Statistics & Trends

About a month ago I shared some statistics about Edmonton’s homicide rate. As an initial effort, I think I got my point across: the homicide rate in Edmonton over the last thirty years has been trending downward and is not that different from other large cities in Canada. I have since done some additional research on this subject and would like to share what I have learned.

The graphs below generally compare the ten largest census metropolitan areas in Canada. I have used the homicide rate (the number of homicides per 100,000 people in the CMA) to compare rather than the absolute number of homicides. Where appropriate, I have included the overall Canadian rate and the average of the ten largest CMAs. The data all comes from Statistics Canada (the 2010 information is here). You can click on any graph to see a larger version.
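The rate metric itself is straightforward to compute. A minimal sketch – the homicide count and population below are made up for illustration, not Statistics Canada data:

```python
def homicide_rate(homicides, population):
    """Homicides per 100,000 people in a census metropolitan area."""
    return homicides / population * 100_000

# Illustrative only: 30 homicides in a CMA of 1.2 million people
print(round(homicide_rate(30, 1_200_000), 2))  # 2.5
```

Using the rate rather than the absolute count is what makes a city of one million comparable to a city of five million.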

Here are the homicide rates over the last thirty years:

You can see a few spikes (for Ottawa-Gatineau and Winnipeg in particular) but overall the rates are all pretty similar.

Here are the highest recorded homicide rates:

Nearly every location has had spikes at one time or another. But a few places consistently record the highest homicide rates:

You can see that Winnipeg has most often recorded the highest homicide rate among large cities, followed by Ottawa-Gatineau. Edmonton has most often recorded the second highest rate, followed by Vancouver.

Here are the average homicide rates over the last thirty years:

Half of the ten largest cities are below the Canadian average. As a result, the average for the ten largest cities isn’t that much higher than the Canadian average.

Here is Edmonton’s homicide rate compared against the overall rate in Canada and the average of the ten largest cities. You can see that it is trending downward, despite spikes in 2005/2006:

Over the last thirty years, Edmonton has never recorded a homicide rate lower than the Canadian rate. Only three times has Edmonton’s homicide rate been lower than the average for the ten largest cities:

As homicide rates in Canada have generally been trending downward, I thought it would be useful to look at the rates by decade. Here are the average homicide rates by decade since 1981:

You can see that with the exception of Winnipeg, every location recorded a lower average homicide rate in the period 2001-2010 than they did in the period 1981-1990.

This graph shows the change a little more clearly:

Every location’s average rate decreased in the 1990s. Only three locations (Edmonton, London, and Winnipeg) have recorded increases since 2000, and only Winnipeg’s was enough to increase past 1990 levels.

What’s next?

Today, our city’s new violence reduction action plan was unveiled. You can read the whole thing in PDF here. The report concludes:

The problem of violence in society is complex and multi-faceted. It requires diligent, ongoing coordinated work across a number of agencies and organizations. This includes other orders of government, who have information and resources that will be required in order that solutions be comprehensive, and sustainable over the long-term.

The City and its key partners will continue their efforts to understand and address the root causes of violence and maintain order and safety in our community, keeping the livability of Edmonton among the best in Canada and the world.

I think understanding where we’re at is an important part of unraveling this mystery. Hopefully the information I have shared above will help in that regard. I look forward to the community conversations slated to take place this fall.

In a follow-up post, I’ll take a closer look at Edmonton’s homicide rate in the context of our demographics, economic situation, and other factors.

1.2 zettabytes of data created in 2010

For the last five years or so, IDC has released an EMC-sponsored study on “The Digital Universe” that looks at how much data is created and replicated around the world. When I last blogged about it back in 2008, the number stood at 281 exabytes per year. Now the latest report is out, and for the first time the amount of data created has surpassed 1 zettabyte! About 1.2 zettabytes were created and replicated in 2010 (that’s 1.2 trillion gigabytes), and IDC predicts that number will grow to 1.8 zettabytes this year. The amount of data is more than doubling every two years!
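Those two growth claims are consistent with each other: going from 1.2 ZB to 1.8 ZB in a year is a 1.5× annual factor, which compounds to more than 2× over two years. A quick check:

```python
# If data grows from 1.2 ZB to 1.8 ZB in one year, the annual growth
# factor is 1.5; compounded over two years that's 1.5**2 = 2.25,
# consistent with "more than doubling every two years".
annual_growth = 1.8 / 1.2
two_year_growth = annual_growth ** 2
print(round(two_year_growth, 2))  # 2.25
```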

Here’s what the growth looks like:

How much data is that? Wikipedia has some good answers: exabyte, zettabyte. EMC has also provided some examples to help make sense of the number. 1.8 zettabytes is equivalent in sheer volume to:

  • Every person in Canada tweeting three tweets per minute for 242,976 years nonstop
  • Every person in the world having over 215 million high-resolution MRI scans per day
  • Over 200 billion HD movies (each two hours in length) – would take one person 47 million years to watch every movie 24/7
  • The amount of information needed to fill 57.5 billion 32GB Apple iPads. With that many iPads we could:
    • Create a wall of iPads, 4,005 miles long and 61 feet high extending from Anchorage, Alaska to Miami, Florida
    • Build the Great iPad Wall of China – at twice the average height of the original
    • Build a 20-foot high wall around South America
    • Cover 86 per cent of Mexico City
    • Build a mountain 25 times higher than Mt. Fuji
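Some of those equivalences can be sanity-checked with simple unit arithmetic. Using decimal units and the nominal 32 GB capacity, the iPad figure comes out close to EMC’s number (the small gap from 57.5 billion presumably reflects different capacity assumptions on their end – that part is my guess):

```python
ZETTABYTE = 10**21  # decimal bytes
GB = 10**9

data = 1.8 * ZETTABYTE      # IDC's 2011 projection
ipad_capacity = 32 * GB     # nominal capacity; usable space is a bit less

ipads = data / ipad_capacity
print(f"{ipads / 1e9:.2f} billion iPads")  # 56.25 billion iPads
```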

That’s a lot of data!

EMC/IDC has produced a great infographic that explains more about the explosion of data – see it here in PDF. One of the things that has always been fuzzy for me is the difference between data we’ve created intentionally (like a document) and data we’ve created unintentionally (sharing that document with others). According to IDC, one gigabyte of stored data can generate one petabyte (1 million gigabytes) of transient data!

Cost is one of the biggest factors behind this growth, of course. The cost of creating, capturing, managing, and storing information is now just 1/6th of what it was in 2005. Another big factor is that most of us now carry the tools of creation – digital cameras, mobile phones, and so on – at all times, everywhere we go.

You can learn more about all of this and see a live information growth ticker at EMC’s website.

This seems as good a time as any to remind you to back up your important data! It may be easy to create photos and documents, but it’s even easier to lose them. I use a variety of tools to back up data, including Amazon S3, Dropbox, and Windows Live Mesh. The easiest by far though is Backblaze – unlimited storage for $5 per month per computer, and it all happens automagically in the background.

Edmonton’s new Centre for Public Involvement

One of the items that was discussed at today’s Executive Committee meeting (agenda in Word) was the proposed Centre for Public Involvement, a joint venture of the City of Edmonton and the University of Alberta. The idea is to combine the strengths of both organizations to “intentionally consider and apply the most effective means to undertake public involvement.” Here’s the proposed mission:

To provide leadership in understanding and applying innovative practices and new technologies for citizen participation, engagement, and deliberation.

The centre would try to strike a balance among research, best practices, and consulting. Annual operating costs would be $300,000, split equally between the City and the University. Other partners may join at some point in the future.

I really like the idea. That said, I want to echo the opening statement of the prospectus:

The timing is right for establishing the proposed Centre. In reality, the timing is probably late by ten years.

Both the City and the University have already started exploring new forms of public involvement. The City has been quite successful with its social media endeavors, and the University is starting to experiment as well. It seems there’s a new U of A account on Twitter each week (the latest I’ve come across is the International Centre)!

While it is true that there is some frustration among the public with regard to being able to impact decision-making, not everyone has become angry or complacent. Initiatives such as ChangeCamp are proof that some citizens are already engaged in re-imagining public involvement.

I think there’s a great opportunity here for the City, the University, and the public to work together to explore the future of public involvement. I think Raffaella nailed it in a recent post discussing the new City of Edmonton blog she’s been working on:

We seek to create informed communities, engaged citizens, and generally make our lives better.

You can download the Centre for Public Involvement Prospectus in PDF here.

Faster wireless, everywhere

As a tech geek I’m interested in a lot of things, but I have a particular interest in wireless technologies. I want to have the ability to connect to the Internet wherever I go, using whatever device I happen to have with me. Despite the progress we’ve made in recent years, that vision is still a long way from being realized. A couple of things I came across recently look promising though.

The first is an article in MIT’s Technology Review, discussing research to make wireless faster:

One way to achieve faster speeds is to harness the millimeter-wavelength frequency of the wireless spectrum, although this usually requires expensive and very complex equipment. Now, engineers at Battelle, a research and development firm based in Columbus, OH, have come up with a simpler way to send data through the air with millimeter-wave technology.

Apparently they’ve been able to achieve speeds of 10.6 gigabits per second in a point-to-point field test, with antennas 800 meters apart. In the lab, they’ve demonstrated speeds of 20 gigabits per second. Those are fiber-like speeds! Of course this wouldn’t work for blanket wireless coverage (like a cell network), but it could have some really useful applications.

The second article discusses a new study by market researcher In-Stat:

In-Stat said that more than 294 million consumer electronics devices with Wi-Fi shipped in 2007. But that number is quickly growing and will likely reach 1 billion by 2012. The fastest-growing embedded Wi-Fi segment is mobile handsets. By 2011, dual-mode cell phones will surpass PCs as the largest category of Wi-Fi devices, the In-Stat report said.

The phenomenal growth of consumer electronics devices is nothing new, but the takeaway here is that wireless Internet access demand is going to grow quite a bit over the next few years. After all, what good is a device with Wi-Fi capabilities if there is no Wi-Fi network available? This is good news for the Free Wi-Fi project.

A world with faster, more ubiquitous wireless Internet access is a world I want to live in.

Facebook's virtual gifts – money well spent?

In a post at VentureBeat yesterday, Eric Eldon shared some estimates that suggest Facebook’s revenue from virtual gifts this year will be in the range of $28 million to $43 million. That’s a serious amount of coin for nothing more than an image on a web page.

Gifts are priced at $1 each, and the study found that an average of 470,000 are sold each week.
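A back-of-envelope check of those numbers: at $1 per gift and roughly 470,000 gifts per week, a year of sales lands just under the low end of those revenue estimates, which suggests the estimates also assume some growth over the year (my guess):

```python
gifts_per_week = 470_000
price_per_gift = 1.00  # dollars
weeks_per_year = 52

annual_revenue = gifts_per_week * price_per_gift * weeks_per_year
print(f"${annual_revenue / 1e6:.1f}M")  # $24.4M
```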

Facebook introduced the gifts feature in February of 2007. A gift is simply an image of something – a heart, a flower, or hundreds of other options – that, when given, shows up in a “gift box” on a user’s profile. If the gift is public, then the recipient’s friends can see it, too. If it’s private, only the recipient and the giver can see it.

I think the key there is “simply an image”. This is definitely one of those things where you can’t help but think, “why didn’t I come up with that?”

Clearly, gifts are a good source of income for Facebook. I wonder who buys them though. Why are people so happy to pay $1 for a bunch of pixels on a web page?

Surely that $40 million could have been spent on something better?

In Edmonton, we like to drive

Statistics Canada has released some new data from the last census that shouldn’t shock anyone who lives in Alberta’s capital city. Nearly 80% of us get to work in a vehicle:

The new data from the 2006 census found that 12.7 per cent of workers in the city of Edmonton get to work using public transit, while 79 per cent either drive or travel in a vehicle as a passenger.

Statistics Canada said the reliance on cars seems to increase with the age of the commuter. While those under the age of 25 travelled by vehicle 70.7 per cent of the time, that rate increased to 81.6 per cent for those aged 25 to 34. The rate was even higher for those aged over 35, at 87.2 per cent.

Cheap Gas?

The average Alberta commuter takes a car 84% of the time, so we’re slightly better than the rest of the province.

I guess Bob Boutilier, our city’s Transportation Department GM, wasn’t kidding at the ETS conference a few weeks ago when he said a big challenge is the “pickup truck and two car” mentality of most Albertans. Thanks to the census data, I now have a number to attach to that statement.

Some people like to suggest that we’ll never improve our public transit system until everyone experiences just how bad it is right now. Maybe there’s some truth to that after all. I wouldn’t be surprised to learn that the majority of that 80% have never been on a bus or LRT car.

That needs to change.

281 exabytes of data created in 2007

I typed the title for this post into Windows Live Writer, and a red squiggly appeared under the word “exabytes”. I just added it to the dictionary, but I can’t help but think that it’ll be in there by default before long.

Either it takes three months to crunch the data or March is just the unofficial “how much did we create last year” month, because researchers at IDC have once again figured out how many bits and bytes of data were created in 2007. You’ll recall that in March of last year, they estimated the figure for 2006 to be 161 exabytes. For 2007, that number nearly doubled, to 281 exabytes (which is 281 billion gigabytes):

IDC attributes accelerated growth to the increasing popularity of digital television and cameras that rely on digital storage. Major drivers of digital content growth include surveillance, social networking, and cloud computing. Visual content like images and video account for the largest portion of the digital universe. According to IDC, there are now over a billion digital cameras and camera phones in the world and only ten percent of photos are captured on regular film.

This is obviously a very inexact science, but I suspect their estimates become more accurate with experience.

Interestingly, this is the first time that we’ve created more data than we have room to store (though one wonders if that’s due more to a lack of historical data than to anything else).

Read: ars technica

Microsoft's WorldWide Telescope

If you spend any time in the blogosphere, you probably heard about Robert Scoble’s sob session on Valentine’s Day. He said that he was shown a project at Microsoft Research that was so world-changing it brought tears to his eyes. Scoble said he couldn’t tell anyone what it was until February 27th, and he kept that promise. Today he explained:

Lots of people are asking me questions about what made me cry at Microsoft a few weeks ago.

If I told you “a telescope” you’d make fun of me, right? Tell me I’m lame and that I don’t deserve to be a geek and that I should run away and join the circus, right?

Well, that’s what I saw.

The project is called the WorldWide Telescope. Here’s how it is described on the official website:

The WorldWide Telescope (WWT) is a rich visualization environment that functions as a virtual telescope, bringing together imagery from the best ground- and space telescopes to enable seamless, guided explorations of the universe. WorldWide Telescope, created with Microsoft®’s high-performance Visual Experience Engine™, enables seamless panning and zooming across the night sky blending terabytes of images, data, and stories from multiple sources over the Internet into a media-rich, immersive experience.

It does sound like a pretty cool project for astronomy, and like Scoble says, it could have a really huge impact on education and the way we view and understand our place in the universe. Scoble will have a video up on Monday showing it off, and it should be officially available sometime this spring.

Read: Scobleizer

Checkers solved at the U of A

How many games of checkers can you win in a row before someone beats you? Quite a few? Doesn’t matter – eventually you’ll lose, right? You think, “it’s only a matter of time.” Well, some Computing Science researchers at the U of A have figured out why – it’s because humans make mistakes. They’ve solved checkers completely, and have software that is invincible:

After more than 18 years and sifting through 500 billion billion (a five followed by 20 zeroes) checkers positions, Jonathan Schaeffer and his colleagues have built a checkers-playing computer program that cannot be beaten. Completed in late April, the Chinook program may be played to a draw but will never be defeated.
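Chinook’s proof relied on massive endgame databases and proof-tree search, but the core idea of “solving” a game – determining the best achievable outcome from every position under perfect play – can be illustrated with a tiny minimax-style solver. This sketch solves a toy Nim-like game, not checkers, and is purely for illustration:

```python
from functools import lru_cache

# Toy game: players alternately take 1 or 2 stones from a pile;
# whoever takes the last stone wins. solve() returns True if the
# player to move can force a win from this position.
@lru_cache(maxsize=None)
def solve(stones):
    if stones == 0:
        return False  # no stones left: the previous player just won
    # A position is winning if some move leads to a losing position
    # for the opponent.
    return any(not solve(stones - take) for take in (1, 2) if take <= stones)

# Multiples of 3 are losses for the player to move.
print([n for n in range(10) if not solve(n)])  # [0, 3, 6, 9]
```

Checkers is far beyond brute force like this – hence the 18 years and the 500 billion billion positions – but the underlying question (“can the player to move force at least a draw?”) is the same.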

Their research and “proof” were to be published in today’s edition of the journal Science.

This is pretty incredible when you think about it. It speaks to the advances we’ve made not only with technology, but with our understanding of how to harness it to do things that previously seemed impossible.

I generally consider checkers to be a fairly simple game, but don’t let that fool you:

The popular game may be simple to play, but it holds a potential 500 billion billion positions. That’s one million times more complicated than any other game solved before, says Jonathan Schaeffer, the computer science professor who began the project in 1989.

Congratulations to Schaeffer and his team! I can’t imagine what they’ll figure out next.

Read: ExpressNews