Homicide Rates in Canada: Statistics & Trends

About a month ago I shared some statistics about Edmonton’s homicide rate. As an initial effort, I think I got my point across: the homicide rate in Edmonton over the last thirty years has been trending downward and is not that different from other large cities in Canada. I have since done some additional research on this subject and would like to share what I have learned.

The graphs below generally compare the ten largest census metropolitan areas in Canada. I have used the homicide rate (the number of homicides per 100,000 people in the CMA) to compare rather than the absolute number of homicides. Where appropriate, I have included the overall Canadian rate and the average of the ten largest CMAs. The data all comes from Statistics Canada (the 2010 information is here). You can click on any graph to see a larger version.
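
In case it’s helpful, here’s exactly what that calculation looks like (a minimal sketch; the numbers are made up for illustration and are not Statistics Canada figures):

```python
def homicide_rate(homicides, population):
    """Homicides per 100,000 people - the measure used in all of the graphs below."""
    return homicides / population * 100_000

# Hypothetical example: 30 homicides in a CMA of 1,200,000 people
print(round(homicide_rate(30, 1_200_000), 2))  # 2.5
```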

Here are the homicide rates over the last thirty years:

You can see a few spikes (for Ottawa-Gatineau and Winnipeg in particular) but overall the rates are all pretty similar.

Here are the highest recorded homicide rates:

Nearly every location has had spikes at one time or another. But a few places consistently record the highest homicide rates:

You can see that Winnipeg has recorded the highest homicide rate among large cities most often, followed by Ottawa-Gatineau. Edmonton has most often recorded the second highest rate, followed by Vancouver.

Here are the average homicide rates over the last thirty years:

Half of the ten largest cities are below the Canadian average. As a result, the average for the ten largest cities isn’t that much higher than the Canadian average.

Here is Edmonton’s homicide rate compared against the overall rate in Canada and the average of the ten largest cities. You can see that it is trending downward, despite spikes in 2005/2006:

Over the last thirty years, Edmonton has never recorded a homicide rate lower than the Canadian rate. Only three times has Edmonton’s homicide rate been lower than the average for the ten largest cities:

As homicide rates in Canada have generally been trending downward, I thought it would be useful to look at the rates by decade. Here are the average homicide rates by decade since 1981:

You can see that with the exception of Winnipeg, every location recorded a lower average homicide rate in the period 2001-2010 than it did in the period 1981-1990.

This graph shows the change a little more clearly:

Every location’s average rate decreased in the 1990s. Only three locations (Edmonton, London, and Winnipeg) have recorded increases since 2000, and only Winnipeg’s increase was large enough to push its average back above its 1981-1990 level.

What’s next?

Today, our city’s new violence reduction action plan was unveiled. You can read the whole thing in PDF here. The report concludes:

The problem of violence in society is complex and multi-faceted. It requires diligent, ongoing coordinated work across a number of agencies and organizations. This includes other orders of government, who have information and resources that will be required in order that solutions be comprehensive, and sustainable over the long-term.

The City and its key partners will continue their efforts to understand and address the root causes of violence and maintain order and safety in our community, keeping the livability of Edmonton among the best in Canada and the world.

I think understanding where we’re at is an important part of unraveling this mystery. Hopefully the information I have shared above will help in that regard. I look forward to the community conversations slated to take place this fall.

In a follow-up post, I’ll take a closer look at Edmonton’s homicide rate in the context of our demographics, economic situation, and other factors.

Edmonton’s Homicide Rate: How much has changed in 30 years?

Reading that we’ve had 28 murders so far this year in Edmonton is disheartening, as others have noted. And without a doubt something needs to be done to understand why this is happening and what we can do to stop it. But has the picture really changed all that much from previous years?

Our homicide rate (the number of homicides per 100,000 people) currently sits at roughly 2.41. That compares to Winnipeg’s 2.08 (they have had 16 murders so far this year). If we extrapolate for the rest of the year, we’d finish with a homicide rate of roughly 4.82. That would indeed be our highest ever. However, a rate that high has only been experienced in large cities twice in the last 30 years:

Given that history, I would be shocked if we finished 2011 with a homicide rate above 4.8 (which would equate to 56 murders).
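
For anyone who wants to check that arithmetic, here’s the back-of-envelope version (a sketch; the CMA population is implied by the figures above rather than taken from an official source):

```python
# Back-of-envelope check of the figures above (population is implied, not official)
homicides_so_far = 28
rate_so_far = 2.41                       # homicides per 100,000, year to date

implied_population = homicides_so_far / rate_so_far * 100_000
print(round(implied_population))         # roughly 1,162,000

# Extrapolate: assume the second half of the year matches the first half
print(rate_so_far * 2)                   # 4.82

# Murders implied by a full-year rate of 4.8
print(round(4.8 * implied_population / 100_000))  # roughly 56
```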

Here’s the average homicide rate for each of those cities:

And here’s what the rate looks like from year to year (it appears Montréal has experienced the most steady decline – we should find out what they did):

As for the title of Murder Capital of Canada – that distinction clearly goes to Winnipeg. It has recorded the highest homicide rate among large cities more often over the last 30 years than any other city:

In recent years, it has generally been Winnipeg #1 and Edmonton #2, or vice versa.

It sounds bad: “we’ve had more murders in the first six months of 2011 than we did all of last year”. That’s the kind of statement that will spur us into action. But I don’t think the situation is really all that different from previous years.

The other negative side effect of all of this is the knock on Edmonton’s image throughout Canada and around the world. Countless stories have been written about our homicide rate. I was interviewed by CTV about this today. I said that the words ‘homicide’ and ‘murder’ have been mentioned by Edmontonians on Twitter about 1200 times in the last month or so. What I didn’t get to do in the interview was compare that to previous years:

The absolute number of mentions is higher this year than it was in the last two years, but so is the total number of tweets overall. So I normalized the data. If the same number of tweets had been posted in June 2009 as were posted in June 2011, the words ‘homicide’ and ‘murder’ would have been mentioned more two years ago than today. All this to say: Edmontonians are talking about this topic, but perhaps not more than they have in the past. I would guess that other Canadians are talking about our homicide rate more than is normal, however.
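
If the normalization step isn’t clear, this is all it amounts to (a sketch with made-up tweet totals; only the ~1200 mention figure comes from my data):

```python
# Compare mention counts after adjusting for overall tweet volume.
# The total-tweet figures below are placeholders for illustration only.
mentions_june_2011 = 1200        # 'homicide'/'murder' mentions (approx., from above)
total_june_2011 = 500_000        # hypothetical total Edmonton tweets that month

mentions_june_2009 = 400         # hypothetical
total_june_2009 = 100_000        # hypothetical

# Scale 2009's mention rate up to 2011's tweet volume
adjusted_2009 = mentions_june_2009 / total_june_2009 * total_june_2011
print(adjusted_2009 > mentions_june_2011)  # True under these made-up numbers
```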

UPDATE (8/5/2011): I updated the second paragraph to better reflect the way Statistics Canada calculates homicide rates, so that the numbers better align with the rest of the post. I had originally stated that extrapolating for the rest of 2011 would result in a homicide rate above 5.0, when it should have been 4.8. My argument remains the same – statistically speaking, that is very unlikely.

1.2 zettabytes of data created in 2010

For the last five years or so, IDC has released an EMC-sponsored study on “The Digital Universe” that looks at how much data is created and replicated around the world. When I last blogged about it back in 2008, the number stood at 281 exabytes per year. Now the latest report is out, and for the first time the amount of data created has surpassed 1 zettabyte! About 1.2 zettabytes were created and replicated in 2010 (that’s 1.2 trillion gigabytes), and IDC predicts that number will grow to 1.8 zettabytes this year. The amount of data is more than doubling every two years!
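
If the units make your head spin, the conversion is straightforward (a quick sketch using decimal prefixes, which is how these reports count):

```python
ZB = 10**21   # 1 zettabyte in bytes (decimal/SI prefix)
GB = 10**9    # 1 gigabyte in bytes

print(1.2 * ZB / GB)              # 1.2e12 -> 1.2 trillion gigabytes

# "More than doubling every two years": growing from 1.2 ZB to 1.8 ZB in a single
# year is a factor of 1.5 per year, or 2.25 over two years.
print(round((1.8 / 1.2) ** 2, 2)) # 2.25
```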

Here’s what the growth looks like:

How much data is that? Wikipedia has some good answers: exabyte, zettabyte. EMC has also provided some examples to help make sense of the number. 1.8 zettabytes is equivalent in sheer volume to:

  • Every person in Canada tweeting three tweets per minute for 242,976 years nonstop
  • Every person in the world having over 215 million high-resolution MRI scans per day
  • Over 200 billion HD movies (each two hours in length) – would take one person 47 million years to watch every movie 24/7
  • The amount of information needed to fill 57.5 billion 32GB Apple iPads. With that many iPads we could:
    • Create a wall of iPads, 4,005 miles long and 61 feet high extending from Anchorage, Alaska to Miami, Florida
    • Build the Great iPad Wall of China – at twice the average height of the original
    • Build a 20-foot high wall around South America
    • Cover 86 per cent of Mexico City
    • Build a mountain 25 times higher than Mt. Fuji

That’s a lot of data!

EMC/IDC has produced a great infographic that explains more about the explosion of data – see it here in PDF. One of the things that has always been fuzzy for me is the difference between data we’ve created intentionally (like a document) and data we’ve created unintentionally (sharing that document with others). According to IDC, one gigabyte of stored data can generate one petabyte (1 million gigabytes) of transient data!

Cost is one of the biggest factors behind this growth, of course. The cost of creating, capturing, managing, and storing information is now just 1/6th of what it was in 2005. Another big factor is that most of us now carry the tools of creation with us at all times, everywhere we go: digital cameras, mobile phones, and so on.

You can learn more about all of this and see a live information growth ticker at EMC’s website.

This seems as good a time as any to remind you to back up your important data! It may be easy to create photos and documents, but it’s even easier to lose them. I use a variety of tools to back up data, including Amazon S3, Dropbox, and Windows Live Mesh. The easiest by far though is Backblaze – unlimited storage for $5 per month per computer, and it all happens automagically in the background.

Still Trending Down: Computing-related graduates in Alberta

If we’re serious about shifting the Alberta Advantage, I think we need to focus on technology. If we really want to be in the sweet spot of adding lots of value, participating in the economy of the future, and being globally competitive, we need smart people who can be creative and innovative in the appropriate sectors and industries. Technology is absolutely going to be at the heart of any sector or industry that will enable us to be world-class and trendsetting; there’s just no question about it.

That’s why this graph absolutely shocked me:

The data comes from the University of Alberta, but I think it is representative of the province as a whole.

The number of students graduating in the fields of Computing Science and Computer Engineering in Alberta is trending downward, with no correction in sight. How can we build the economy of the future when the picture looks like this?

Here’s a bit more detail – with the number of graduates broken out by degree/program:

I haven’t looked, but I suspect enrollment numbers would be similar (that is, I don’t think an incredible number of students register in computing-related programs and then switch out).

Bill Gates has been talking about the need for more students to take up computer science for years now. There’s more demand than supply, even when you factor in immigration. The need for us to stay competitive in this regard is well-documented. It looks like we’re falling further behind.

I don’t know what the answer is. I don’t know how we get more students interested in computer-related degrees. But I do think it is important to consider this data when we talk about the success of our provincial technology sectors, and indeed when we consider shifting the Alberta Advantage.

Economics and more with John Rose, the City of Edmonton’s Chief Economist

John Rose moved to Edmonton last May to become the City of Edmonton’s Chief Economist. It’s an important role at the City, though it is one that most people know very little about. I sat down with John last week to chat about his new job and to get his thoughts on Edmonton.

John loved geography when he was younger, and wanted to work in a field where he could apply that passion. He settled on urban planning, but while studying at the University of Toronto, switched to economics. He has been in the field ever since, working for the federal Foreign Affairs department in the 1980s in West Germany and South Korea before returning to Toronto to tackle the consulting business. He most recently worked for PricewaterhouseCoopers.

The move to Edmonton was a unique opportunity for John to combine his interests in urban planning and economics. “I’m interested in what drives the economics of municipalities forward.” He brings an outsider’s perspective to the City of Edmonton, something that initially made him wary. “I thought people would just say ‘here’s another Easterner showing up, telling us what to do’, but people have been very welcoming.”

As the City’s Chief Economist, John is responsible for publishing the reports that the City relies on for budget planning and strategy, among other things. Twice a year he publishes a long-range forecast, using a statistical model of Edmonton’s economy that looks both 3 and 10 years into the future. On a quarterly basis, he publishes City Trends, which provides current information on social, economic, demographic, land development, and transportation trends (here’s the PDF for Q3 2010, the latest to be posted to the website).

The City of Edmonton uses economic models developed by The Centre for Spatial Economics (C4SE). Somewhat surprisingly, Calgary and the Province of Alberta also use models from C4SE. The models can be complex, but John said recent technology improvements are making a difference. “In the 80s, you needed a mainframe to drive even the most simplistic models,” he told me. “Now the tech required is ubiquitous.” While acknowledging that economics is abstract – “you can’t touch the economy” – John said technology is increasingly getting rid of the mystique and mystery.

If you look at the Economic & Demographic section of the City’s website, you’ll find that most of the information is out-of-date. John explained that the transition from his predecessor is the cause, but he said to expect changes. “There’s a lot of value in the information and we want to get it out there, we want it in the public realm.” John noted there currently isn’t a way to notify people when new information is posted, but said an internal effort currently underway should change that. Getting everyone on the same page is a major push for his office this year.

John would also like to see a shift toward more regional forecasting. He works with a variety of organizations outside the City of course, including EEDC, the Chamber of Commerce, and the Finance department at the Province, but sees room for improvement. “We already do regional forecasting to a degree, because we do the CMA and extract a forecast for the City from that.” John noted that most statistics are available at the Census Metropolitan Area (CMA) level, and so it makes sense to look regionally when setting up models. “With a larger economic entity, the trends smooth out a little.” John suggested the Capital Region Board might be the logical place to host a regional forecasting effort.

Speaking of the capital region, John said that while 2011 will be a strong year for Edmonton, “most of the growth will take place outside the City of Edmonton proper.” This is partly because the City itself didn’t suffer as large a setback as a result of the recent downturn. “The manufacturing sector took a big hit in Alberta and Edmonton,” and that sector is largely concentrated outside the city proper, in areas such as the northeast. In a recent interview with the Edmonton Journal, John said we should see an annual growth rate of nearly 4% here in Edmonton.

He was also bullish on the province. “Alberta will be ahead of the national economy as a whole in 2011,” John told me. Again, this is due in part to the way the economic slump affected the province. “The impression is that Canada came through it very well, but the truth is the province didn’t.” In 2011, John expects Alberta to post the first or second best provincial growth rate in the country, depending on how Saskatchewan does.

Turning to individual sectors in the City, John told me that construction will show growth, but mostly due to commercial projects. The residential construction sector will be somewhat sluggish because “there’s just not a big demand for a lot of new housing.” FIRE (finance, insurance, and real estate) will do well, but John cautions that increased regulation will have an impact. The retail sector will also grow more slowly this year, because people are reluctant to take on more debt and as a result savings rates are going up. “The consumer-oriented durable component in particular” will grow slowly according to John, because as people buy fewer houses, the need for new vehicles, furniture, appliances, etc. also diminishes.

John talks about trends and forecasts all the time – he has made it part of his job to do interviews, meet people, and spread the word on Edmonton’s economy. He can rattle the numbers off with ease, and is obviously very knowledgeable. As our discussion shifted toward the city more generally, John became more thoughtful. We talked about the common refrain that Edmonton’s head office situation is dismal at best, and John pointed out that the larger question is how to “attract and retain investment, and talent.” He said we should do “exit interviews” with organizations that leave the City, to try to highlight any cross-cutting themes.

I asked John about the push to revitalize downtown, and in particular, about the City Centre Airport and the proposed arena. He called the ECCA redevelopment a “good move” by the City, because making such a large piece of land adjacent to the core largely residential will have a positive impact on our downtown. “The key to developing a vibrant downtown is to have people living, working, being entertained, doing all those things, in the core.” He doesn’t think a blanket policy on financial incentives (such as the Railtown subsidy) to attract more residents to downtown makes sense, however. “If there’s an area that we want developed in a particular way, then the City could become active, but otherwise there’s enough opportunity already.” John didn’t take a side on the arena, but said “it depends on how the development takes place” and said his main concern is that “we don’t want to be in a situation of two competing facilities.” He cited the Air Canada Centre in Toronto and the positive changes and increased activity it brought to the area south of Front Street. “It is very nicely integrated into the city.”

I asked John what he missed about Toronto, and he quickly replied “jazz clubs.” He said while the Yardbird Suite is great, there was more variety with regard to venues back in Toronto. John joked that by moving away from Toronto when he did, he avoided the current political drama that is taking place with new mayor Rob Ford. That led us into a discussion about transit, and LRT in particular, something John considers “the urban equivalent of an enabling technology – if you have it, you can do a lot of great things.” Projects such as the LRT expansion “are a big benefit to the local economy” in the short-term and are “vital for the City’s future,” John told me. The real value to the economy is what the LRT enables, rather than the jobs it directly creates. “If you don’t have mass transit downtown, you’re going to have a hard time developing nightlife, for instance.”

I really enjoyed talking with John (and not just because when I asked him if he was now an Oilers fan he replied, “that implies I was a Maple Leafs fan before!”). Stay tuned to his section of the City’s website for future economic updates.

Social Media and the City

We’ve all heard the stat: more than half of the world’s population now lives in cities and towns. Wellington E. Webb, former mayor of Denver, is credited as saying “The 19th century was a century of empires, the 20th century was a century of nation states. The 21st century will be a century of cities.” Urban areas are extremely important, for the allocation of resources (such as education and health care) and the creation of social and economic opportunity, among other things. As the UNFPA says: “The challenge for the next few decades is learning how to exploit the possibilities urbanization offers. The future of humanity depends on it.”

I believe that technology is vital for this challenge. It was technology that made the city possible, after all, by enabling and encouraging increased population densities. Urban settlements expose incredible network efficiencies because of this density, whether for trade, communication, or service delivery. It is these network efficiencies that, as strategy consultant and fellow Canadian Jeb Brugmann said, “make cities the world’s strategic centres of social innovation.”

Technology will be used in an endless number of ways to exploit the possibilities and to address the challenges of urbanization, but I think creating a sense of place will be key. Resilient cities, those that are sustainable, eco-efficient, and place-based, are one of the four possible outcomes for cities in a world of significant challenges like climate change, according to Dr. Peter Newman (PDF). Telling the story of a place is necessary for a city to become resilient, because creating a stronger sense of place increases the viability of the local economy and facilitates innovation. Social media is driving transparency in cities and is enabling citizens to tell the story of their place like never before.

One definition for social media comes from JD Lasica and Chris Heuer, and it goes like this: “Any online technology or practice that lets us share (content, opinions, insights, experiences, media) and have a conversation about the ideas we care about.” Put another way, you could say that social media tools and technologies are strengthening democracy.

Social media is becoming the best amplifier of a city that we’ve ever seen. True, social media makes it easy to spread the word beyond a single city and there’s definitely value in that, but it’s at the local level where social media truly shines, by taking the network efficiencies created by cities to the next level. Social media is helping to facilitate a new relationship between government and citizens, is enabling creatives inside cities to better connect with one another, and is empowering citizens like never before. In short, it improves a city’s social capital.

Natural capital is made up of the natural environment, such as the river valley here in Edmonton. On top of that we build infrastructure capital – roads, houses, buildings, lights, etc. Human capital and organizational capital refer to the individuals and organizations that use the natural and infrastructure capital to start and grow families, to build companies, and to otherwise create economic value. Social capital represents trust, social engagement, civic participation, reciprocity, and networks.

Social capital is critical for enabling innovation, making it possible to tackle tough problems. Within a city, social capital is vitally important because as Cameron Sinclair pointed out in his TED Wish, “all problems are local and all solutions are local.” Or as you’ve probably heard in the past, “think global, act local.” I think that applies quite broadly; for instance, to climate change. It’s a global problem, but it’s one that we need to approach locally. If we don’t succeed at reducing our impact on the environment at the local level, there’s no hope for solving the problem globally.

For these reasons, I’m extremely passionate about social media and the city. I’ve written a lot in the past about the impact social media is having on Edmonton and other cities, and I’ll continue to do so. Cities are increasingly important, and social media is making them stronger. I think that’s very exciting!

Related links worth clicking:

Thanks to Ted Gartside for the Creative Commons-licensed globe photo of New York.

How much traffic do the Edmonton Journal and iNews880 get from Twitter?

Depending on who you talk to, Twitter is either killing news media or saving it. A recent analysis by Hitwise found that less than 0.2% of people who use Twitter wind up going to news and media sites (thanks to Karen for the link). Their analysis looks at Twitter as a whole though, and I’m not sure how well it accounts for local news sites. I believe very strongly that social media has the greatest impact at the local level (more on this in a future post). Given that, I have long wondered how Twitter has impacted local news media here in Edmonton. Last night, I finally did some analysis. I decided to explore how much traffic the Edmonton Journal and iNews880, Edmonton’s two top tweeting media outlets, received from Twitter last year.

@EdmontonJournal

First up, the Edmonton Journal. They’ve been tweeting news articles since at least January 2009, so I had lots of data to play with. They used tweetburner to shorten links until September when they switched to bit.ly. Using the APIs available from those services, I added up all the click stats for all the links posted by The Journal. Here’s what I found:
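
The aggregation itself is nothing fancy; roughly speaking it looked like this (a sketch: the click counts come from the shortener APIs, and the sample data at the bottom is made up, not real Journal numbers):

```python
from collections import defaultdict
from datetime import date

def summarize(links):
    """links: list of (clicks, date_posted) pairs, where each click count was
    pulled from the relevant shortener's stats API (tweetburner or bit.ly)."""
    clicks_by_day = defaultdict(int)
    for clicks, posted in links:
        clicks_by_day[posted] += clicks
    total_clicks = sum(clicks for clicks, _ in links)
    days = (max(d for _, d in links) - min(d for _, d in links)).days + 1
    return {
        "total_clicks": total_clicks,
        "avg_clicks_per_link": round(total_clicks / len(links)),
        "avg_links_per_day": round(len(links) / days),
        "clicks_by_day": dict(clicks_by_day),
    }

# Hypothetical sample, for illustration only
sample = [(40, date(2009, 5, 26)), (25, date(2009, 5, 26)), (33, date(2009, 5, 27))]
print(summarize(sample))
```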

Lots of variation, as you can see. Some of that is down to the switch between two shortening services, and some of it is because of the growing number of Twitter users. There are probably dozens of other factors too.

For the period January 30 through December 31:

  • A total of 153,968 clicks were recorded on 4737 links.
  • That’s an average of 33 clicks per link, and an average of 15 links per day.
  • According to the stats on the bit.ly links, 95.4% of clicks come from the Edmonton Journal’s hash*.
  • The link with the most clicks (700) was this one, on May 26. It no longer works, because annoyingly The Journal doesn’t keep old articles available, but it appears it was about Edmonton’s Poet Laureate Roland Pemberton.
  • The day with the most clicks, September 14, doesn’t appear to be special…just lots of clicks that day for some reason (any ideas?).

@iNews880

Next up, iNews880, one of the first local media organizations to join Twitter. They used tinyurl.com until July, when Twitter switched the default to bit.ly, so unfortunately I only have data for the latter half of the year:

For the period July 14 through December 31:

  • A total of 90,500 clicks were recorded on 3811 links.
  • That’s an average of 24 clicks per link, and an average of 22 links per day.
  • According to the stats on the bit.ly links, 93.8% of clicks come from iNews880’s hash*.
  • The link with the most clicks (1933) was this one, on August 2 (that’s the huge spike in the graph above). The link goes to the report on the Big Valley Jamboree stage collapse, and it was popular because it included before and after photos.

Edmonton Journal vs. iNews880

I wanted to do a quick comparison, so I chose the period September 17 through December 31, because both sites used bit.ly for links during that time. Here’s what it looks like:

During that time:

  • The Edmonton Journal posted 2369 links (23 per day) and iNews880 posted 2261 links (22 per day).
  • A total of 79,519 clicks were recorded on Edmonton Journal links (an average of 751 per day or 34 per link).
  • A total of 53,815 clicks were recorded on iNews880 links (an average of 508 per day or 24 per link).

Thoughts

That’s a lot of clicks! Clearly Twitter and other social networking sites (where most shortlinks are posted) are having an impact. But how much? According to the latest report by the Newspaper Audience Databank (NADbank), weekly online readership at EdmontonJournal.com increased by 35% last year to 115,900 from 85,800 in 2008. That’s an increase of 30,100 readers per week. According to the click stats above, The Journal received 3208 clicks per week in 2009. So what does that mean?

Roughly 10.7% of the Edmonton Journal’s online readership increase in 2009 came as a result of posting links to Twitter.
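
Here’s where that number comes from (just the arithmetic, using the NADbank and click figures above):

```python
# Weekly readership increase, per NADbank
weekly_reader_increase = 115_900 - 85_800       # 30,100

# Clicks per week from Twitter links: ~153,968 clicks over the ~48 weeks tracked
weekly_twitter_clicks = 153_968 / 48            # ~3,208

print(f"{weekly_twitter_clicks / weekly_reader_increase:.1%}")  # ~10.7%
```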

And if I had to guess, I’d say my analysis probably underestimates things. Apparently the NADbank data is based on surveys, so I’m not sure how accurate it is, but it’s probably within acceptable margins of error. I’m also not sure what exactly a “reader” is – a page view, a visit, etc.

Caveats

I’ve tried to be as accurate as possible, but I can’t make any guarantees!

  • All the click stats are current as of last night.
  • I’m suggesting that all the clicks come via Twitter, when that’s probably not entirely true. Links get passed around, displayed on websites, etc. But the shortlinks do originate at Twitter.
  • It’s possible that The Journal or iNews880 posted a link to somewhere other than their own sites, but that would be uncommon. I did remove one link from the iNews880 dataset because it pointed to an Environment Canada site (it was easy to spot: it had an unusually high total click count, since others have linked there too). For the rest, I’m assuming that the links point to the news sites.
  • I don’t know how reliable the stats from bit.ly and tweetburner are. I suspect they are quite a bit different than server logs or Google Analytics metrics.
  • I would assume that both services tweaked the way stats are calculated throughout the year, so 15 clicks on a bit.ly link in May is probably different than 15 clicks on a bit.ly link in December.

* – When you shorten a link using bit.ly, you get a unique hash. If I shorten the same link, I get a different hash. The stats are recorded and made available individually and in aggregate, however.

Recent media links & thoughts

I read a lot about new media, journalism, publishing, news, etc. I always try to think about the things I read from both a global and a local perspective. Here are some thoughts on the things I’ve read recently.

From Jeff Jarvis:

I’m not so sure journalism is storytelling anymore.

Jeff points out that saying “journalism = storytelling” is limiting. Journalism is about more than the story, it’s a process. I agree completely. Data, algorithms, aggregators – all are aspects of journalism. They always have been, of course, but their importance/visibility has been heightened lately, thanks to new tools and technologies.

From paidContent.org:

Time Warner’s CNN is taking a stake in hyperlocal aggregator Outside.in—the latest example of a big media organization making a play in the hyperlocal space.

Smart move, just like MSNBC’s purchase of EveryBlock. And the news today that Google is in talks to buy Yelp. The dollars are starting to flow toward local/hyperlocal news companies. You know how the saying goes: follow the money.

From TechCrunch:

So what really scares me? It’s the rise of cheap, disposable content on a mass scale, force fed to us by the portals and search engines.

From ReadWriteWeb:

In my view both writers and readers of content will need to work harder to get quality content. Right now ‘quantity’ still rules on the Web, ‘quality’ is hard to find.

Lots of others have already discussed the “content farm” issue that made the rounds in the blogosphere last week. My view on it is pretty simple: readers need to become more active. There’s so much information so easily available that you can’t afford to passively consume the news. You have to seek out sources and recommendations. Certainly we’ll get better tools (aggregators, filters, search engines) but I think readers need to make more of an effort. See also: Content farms v. curating farmers.

From Clay Shirky:

…one of the things up for grabs in the current news environment is the nature of authority. In particular, I noted that people trust new classes of aggregators and filters, whether Google or Twitter or Wikipedia (in its ‘breaking news’ mode.)

I called this tendency algorithmic authority.

Fascinating. I think there’s incredible opportunity, both globally and locally, to take advantage of this. Who do you trust for your news? Is it the same people/organizations that you trusted five years ago?

From Unlikely Words:

Ken Auletta from the New Yorker wrote a book about Google, “Googled: The End of the World as We Know It” and before he published it, he cut the last chapter of 25 media maxims.

Now you can read them online. A few of my favorites:

  • Passion Wins
  • Adapt or Die
  • Digital is Different
  • Don’t Ignore the Human Factor

And finally, one of my favorite new tools: Times Skimmer. We need more innovation like that at the local level!

Twitter promoting Search despite major issues

Yesterday Twitter launched a new home page that puts more emphasis on search and trending topics. There’s a nice big search box on top, with up-to-date, daily, and weekly trends underneath. The aesthetic is different from the rest of the site however (you don’t see any of this if you’re logged in), so don’t be surprised to see additional changes in the coming weeks.

If you enter a query or click on a trending topic, the search results appear below. It looks a lot like Twitter Search. Among the improvements: the trending topics now come with descriptions (Hell’s Kitchen was given the description “A reality television cooking competition”), and search tips appear in a little box on the right.

I don’t think the new design should be a surprise to anyone – it has been clear for quite some time that Twitter Search is important.

What’s surprising is that they’re promoting search even though it has major issues:

  • Stale Results: Twitter itself has become very stable lately, but the same cannot be said for Twitter Search. Results routinely become stale, sometimes for as long as an hour or two (so the newest tweets to show up in the results were posted an hour or two ago). For a real-time search engine, the stale results issue happens surprisingly frequently.
  • Missing Tweets: Over the last few weeks I’ve noticed that the number of missing tweets has increased (though I think it has always been an intermittent problem). It used to be that I could enter my username and see all replies at Twitter Search, but lately I can’t. Some tweets simply don’t appear in the Twitter Search index. I’ve submitted a support request about this, but have not heard anything back yet.
  • Other Intermittent Issues: There are a few good reasons that someone might not appear in search results (such as if they have a private account) but lately Twitter has had issues keeping the index up-to-date with new accounts.
  • Lack of Innovation: With the exception of adding the “source” property to search results, Twitter has done very little to improve the service they purchased a little over a year ago. Real-time search is new and ripe for innovation, but Twitter doesn’t seem interested. One of the oldest quirks is that user IDs returned from Twitter Search don’t match up with user IDs at Twitter itself (you can see this for yourself with the sketch after this list). This is scheduled to be fixed in the next version of the API, but it’s not clear when that will happen.
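
If you want to see that ID quirk for yourself, something like the following works (a sketch against the public, unauthenticated endpoints as I understand them; the URLs, parameters, and field names are my assumptions and could change at any time):

```python
# Sketch only: these endpoints and field names are assumptions about the current
# public Twitter Search and REST APIs, not a supported, documented interface.
import json
from urllib.request import urlopen

def search_user_id(screen_name):
    """User ID reported by Twitter Search for the account's latest tweet."""
    url = f"http://search.twitter.com/search.json?q=from:{screen_name}&rpp=1"
    results = json.load(urlopen(url))["results"]
    return results[0]["from_user_id"] if results else None

def rest_user_id(screen_name):
    """User ID reported by the main Twitter REST API for the same account."""
    url = f"http://twitter.com/users/show.json?screen_name={screen_name}"
    return json.load(urlopen(url))["id"]

name = "example_user"  # placeholder handle
print(search_user_id(name), rest_user_id(name))  # these should match, but often don't
```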

Worst of all, Twitter has been terrible at communicating about the above issues. The Twitter Status blog is never updated when search results go stale, and very little has been shared regarding the future direction of Twitter Search.

The good news is that Twitter is finally starting to acknowledge that they need to improve search. Last night, Biz wrote: “We have a lot of work to do when it comes to the quality of our search results and trend analysis…”

Search is vitally important to Twitter, and I want to see them succeed. If they don’t address the above issues however, someone else is going to come along and steal their thunder.

You’re asking the wrong question

Last week’s issue of SEE Magazine was a “theme” issue, focusing on the future of the media industry (“print in peril”). In addition to this interesting article, there was a panel made up of four local newspeople with lots of experience: Linda Hughes (U of A, formerly Edmonton Journal), Ron Wilson (CBC), Jeremy Lye (iNews880), and Roy Wood (MacEwan, formerly Edmonton Journal). They discussed a range of things, including the fact that the industry didn’t develop these problems overnight. The general consensus is that journalism is important, but what it looks like in the future is up in the air.

Of course, you can’t have an article on the future of media without asking who’s going to write about City Council, and the panel didn’t disappoint! Linda Hughes asks:

But with breaking news and local-level news, who is going to go sit in a courtroom all day for a three-paragraph story that is important to know about but isn’t sexy and is just part of the public discourse? Who is going to do that? Bloggers often provide a lot of insight, but most bloggers are not going to go to sit in city council committee meetings for five hours to keep track of what city council is doing.

Ask a sports writer about the future of news and he’ll probably use this defense, even though he never sets foot inside City Hall! It’s the easy way out, and it’s an incredibly common response lately from journalists in the hot seat. To make things worse, SEE asked the question again later in the piece:

If newspapers and mass media outlets do dwindle, then, who will be the watchdogs in society to ensure politicians don’t run wild? Who will pay for the investigative reporters who can zero in on one thing for months and all of a sudden have the biggest story of the year?

Sigh. There will still be passionate individuals who follow specific topics and do investigative reporting. Probably more now than ever thanks to easy publishing systems (blogs, wikis, Twitter, etc). And they’ll produce much more interesting content than someone who does it just because they get paid to.

Let’s ignore that argument for a minute, however. Asking how to pay a journalist to sit through meetings to get three paragraphs is still the wrong question!

The real question is, why have we ever had to pay someone to sit through five hours of City Council committee meetings? Let’s get rid of that absurd need altogether and this discussion becomes irrelevant.

This is why I’m so excited about ChangeCamp and the possibilities it represents. If we can change the way our government communicates with us, the need for a newspaper filter could go away altogether.

Let’s focus less on how we’re going to pay a journalist to sit with Council all day and more on how we can get Council to communicate with us in a meaningful way. If we can do that, the journalist will have much better things to cover!