20 Years of Visual Studio: #MyVSStory

Today Microsoft is marking the 20th anniversary of Visual Studio, their integrated development environment. To celebrate, they’ve released Visual Studio 2017! Over the last month or two, Microsoft has encouraged developers to share their Visual Studio story on social media. Here’s mine!

Microsoft Visual Studio .NET

I became interested in programming at a very early age and started playing with BASIC on our computer at home. I remember writing a program that asked you for your name and favorite color and then printed something like “Hi Name!” in that color to the screen. I’m pretty sure I tested it on my younger brother, but he was much less interested than I was. I thought it was magical.

One day I was in my Dad’s office and found a box for Visual Studio 97, the first release of the product. He let me take it home to install on our home computer and that started a long and fruitful relationship with Visual Basic. Though I started to learn other languages too, it was VB6 that I really enjoyed. When I started my software company Paramagnus back in 2000, it was VB6 that we wrote our first programs in.

While I probably did use Visual Studio 6.0, the second version that Microsoft released, it was Visual Studio .NET, released in 2002, that really changed things for me. I was able to transition my VB6 knowledge into VB.NET and became smitten with the new .NET platform. I remember reading a magazine article about something called “COOL”, a new “C-like Object Oriented Language” from Microsoft that was kind of like Java. Well, that became C# in Visual Studio .NET, and it wasn’t long until I switched from VB.NET to C#. It became my primary language and remains so today.


In the early 2000s, I was involved with a .NET user group here in Edmonton. I remember meeting developer evangelist John Bristowe through that group in 2005, when he came up from Calgary to show us “Whidbey”, which would become Visual Studio 2005. I always enjoyed John’s presentations and his passion for Visual Studio, which he often called “God’s IDE”. That always stuck with me!

From 2003 until 2005, during my time at the University of Alberta, I had a side job as the Alberta Student Representative for Microsoft Canada. Part of my role was to organize and deliver presentations for students, and in 2004 I became an Academic MVP. That brought some nice perks along with it, including an MSDN subscription which meant all of a sudden I had access to everything!

Tech·Ed North America 2010
With the Channel 9 guy at Tech·Ed North America in 2010

It’s safe to say that Visual Studio has had a big impact on my life, on everything from my profession to some really rewarding personal experiences. I competed in the Imagine Cup student programming competition in 2003 and represented Canada at the worldwide competition in Spain, for instance. Along the way I’ve met some great people and learned a lot from some excellent developers.

I still use Visual Studio every day, though not always the IDE. These days there’s Visual Studio Team Services, which offers a place to store code, plan work, and test, build, and deploy software. I also use Visual Studio Code, a lightweight, cross-platform code editor. In fact, I’m writing this blog post inside Code because it is such a fantastic Markdown editor. I can’t wait to see what the next 20 years bring.

Happy birthday Visual Studio!

I’m going to Tech·Ed North America!

Tech·Ed is one of Microsoft’s most important annual conferences for developers and IT professionals, held in several places around the world. This year, Tech·Ed North America is in New Orleans in June, and I’m going to be there! I was invited by Microsoft Canada to attend, an opportunity I jumped at. I’ll be there with John Bristowe, taking in the sessions and labs, learning as much as I can, meeting other developers & IT pros, and generally having a good time. And of course, I’ll be blogging, tweeting, photographing, and otherwise recording & sharing the whole experience.

The conference runs from June 7 to 10. There are literally hundreds of sessions during the week, as well as a couple of keynotes and other special presentations. There’ll be some awesome parties too! The sessions are organized into 21 technical tracks, everything from Architecture to Office & SharePoint. I’m particularly interested in sessions on:

  • Open Data (obviously)
  • WCF and WF in .NET Framework 4
  • Windows Phone 7
  • Azure (cloud computing)

If you’re going to Tech·Ed, what sessions are you planning to check out? If you’re not going to Tech·Ed, what do you think I should see? Let me know!

I’m going to blog about my experience a little here, but also at Techvibes and the Canadian Developers blog. I’ll be tweeting about it too, using the official hashtag #teched. Can’t wait!

OpenID Connect

I’ve been doing some work with OpenID and OAuth lately, making use of the excellent DotNetOpenAuth library. I am pretty much a beginner when it comes to these technologies, but I have been able to get up to speed fairly quickly. I was a big fan of Facebook Connect, and I quite like the new Graph API too (which uses OAuth 2.0). Though it was easy to develop against, I think the biggest benefit of Facebook Connect was the excellent end user experience. It was consistent and simple.

In contrast, OpenID is a little more cumbersome, and a lot less consistent. The discussion on how to make it easier and sexier has been going on for a while now. It seems like some significant progress will be made this week when OpenID Connect is discussed at the Internet Identity Workshop. What is OpenID Connect?

We’ve heard loud and clear that sites looking to adopt OpenID want more than just a unique URL; social sites need basic things like your name, photo, and email address.

We have also heard that people want OpenID to be simple. I’ve heard story after story from developers implementing OpenID 2.0 who don’t understand why it is so complex and inevitably forget to do something. Because it’s built on top of OAuth 2.0, the whole spec is fairly short and the technology easy to understand. Building on OAuth provides amazing side benefits such as potentially being the first version of OpenID to work natively with desktop applications and even on mobile phones.
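To give a flavour of why building on OAuth 2.0 keeps things short, here’s a sketch of the authorization request an OAuth 2.0 client constructs. The provider endpoint, client ID, and redirect URI below are made-up values for illustration, not any real provider’s.

```javascript
// Sketch: building an OAuth 2.0 authorization request URL.
// Endpoint, client_id, and redirect_uri are hypothetical values;
// a real provider publishes its own endpoints and issues client IDs.
function buildAuthorizeUrl(endpoint, params) {
  var query = Object.keys(params)
    .map(function (key) {
      return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
    })
    .join('&');
  return endpoint + '?' + query;
}

var url = buildAuthorizeUrl('https://provider.example/oauth/authorize', {
  response_type: 'code',          // ask for an authorization code
  client_id: 'my-app',            // hypothetical registered client
  redirect_uri: 'https://myapp.example/callback',
  scope: 'openid profile email',  // the basics: identity, name, email
  state: 'xyz123'                 // anti-CSRF token, echoed back to the client
});
```

The whole dance is essentially “redirect the user to a URL like this, then exchange the code that comes back for a token” – which is a lot less to get wrong than OpenID 2.0’s discovery and association steps.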

Chris Messina has some additional thoughts on the proposal here:

After OpenID 2.0, OpenID Connect is the next significant reconceptualization of the technology that aims to meet the needs of a changing environment — one that is defined by the flow of data rather than by its suppression. It is in this context that I believe OpenID Connect can help usher forth the next evolution in digital identity technologies, building on the simplicity of OAuth 2.0 and the decentralized architecture of OpenID.

It sounds very exciting – I hope OpenID Connect becomes a reality!

Alberta Budget 2010 website – security through obscurity

Tomorrow, Tuesday, is budget day here in Alberta. Like many Albertans, I am curious about what Finance Minister Ted Morton is going to deliver, so I started poking around online. First stop, last year’s budget, available at http://budget2009.alberta.ca/.

Seems logical that the 2010 budget would be at http://budget2010.alberta.ca. So I tried that URL and was presented with a login screen. The first things that came to mind were “administrator” and “password”. Voila:

Fortunately for Mr. Morton, the documents don’t appear to have been uploaded yet. You can see all the placeholders though, which is kind of funny. And it seems you can leave feedback.

It does reveal the theme of the budget, Striking the Right Balance. Last year was Building on Our Strength.

This is what is known as “security through obscurity”. It’s not really secure, it’s just hidden. I’d suggest that programmers working at the Government of Alberta invest in Writing Secure Code, a fantastic book on the subject.

I hope this isn’t a reflection of the budget we see tomorrow…cutting corners, etc.

UPDATE: Sometime around 9:45 AM today they changed the password, and I think pointed the virtual directory somewhere else.

UPDATE2: The Journal wrote about this today.

UPDATE3: The site is now officially live with all the budget documents. Enjoy!

TweetSharp for Twitter developers using .NET

Since January I’ve been using a library called TweetSharp in my various Twitter-related programming projects (including my monthly stats posts). Not only has it saved me from all of the effort that would have gone into writing my own Twitter library for .NET, but it has also taught me a few things about fluent interfaces, OAuth, and other topics. Here’s the description from the relatively new official website:

TweetSharp is a complete .NET library for microblogging platforms that allows you to write short and sweet expressions that convert automatically to web queries and fly to Twitter on your behalf.

Maybe this is a generalization, but I often feel that .NET developers get the short end of the stick when the “cool kids” release sample code for their APIs. Or more accurately, C# developers get the short end of the stick (because you can run Python, Ruby, and other languages on .NET if you really want to). Thus I’m grateful that Dimebrain (Daniel Crenna) has developed such a useful library.

TweetSharp is open source and under active development (hosted on Google Code), with a growing base of users reporting and fixing issues (I helped with the Twitter Search functionality initially). If you’re writing any kind of software for Twitter using .NET, you should be using TweetSharp.
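To illustrate what a fluent interface feels like – the part of TweetSharp that taught me the most – here’s a toy query builder. The method names are invented for illustration; this is not TweetSharp’s actual API (which is C#), just the pattern it uses, sketched in Javascript.

```javascript
// Toy sketch of the fluent-interface pattern: each method returns
// the builder itself, so calls chain into a readable "sentence"
// that only becomes a web query at the very end.
function TwitterQuery() {
  this.parts = {};
}
TwitterQuery.prototype.statuses = function () {
  this.parts.resource = 'statuses';
  return this; // returning "this" is what makes the chain work
};
TwitterQuery.prototype.onUserTimeline = function (screenName) {
  this.parts.timeline = 'user_timeline';
  this.parts.screenName = screenName;
  return this;
};
TwitterQuery.prototype.take = function (count) {
  this.parts.count = count;
  return this;
};
TwitterQuery.prototype.asUrl = function () {
  return 'https://twitter.com/' + this.parts.resource + '/' +
    this.parts.timeline + '.json?screen_name=' + this.parts.screenName +
    '&count=' + this.parts.count;
};

var url = new TwitterQuery()
  .statuses()
  .onUserTimeline('someuser')
  .take(20)
  .asUrl();
```

The chained calls read almost like the sentence “get the statuses on someuser’s timeline, take 20” – which is exactly the “short and sweet expressions” quality the TweetSharp description is getting at.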

I want an API for Edmonton Transit (ETS)

When the new edmonton.ca website launched back in the fall, I was hopeful that the Edmonton Transit portion of the site would receive more than just a facelift. Unfortunately, that didn’t happen. Though I’m disappointed, I can understand why. Edmonton Transit is not in the business of developing websites or software; it’s in the business of transportation. They’ve got to make sure buses and trains run efficiently and effectively first, and then they can focus on everything else.

That’s not to say that the website, or BusLink (over the phone), or the other services they offer aren’t important, just that ETS has limited resources and must deploy them accordingly. That’s why I think an Edmonton Transit API makes a lot of sense.

To build an application for looking up transit information, you need both an interface and data (I’m simplifying things a bit). ETS has all of the data of course – they know all of the route numbers, bus stops, and schedule information. What they lack are great interfaces. If ETS exposed their data through an API, third party developers could build great interfaces on top with relative ease.

Here’s the kind of information I’d like to see exposed through an ETS API:

  • Route Information – return name, start and end point, and other details for a given route
  • Stop Information – return coordinates, address, photo, and other information for a given stop
  • Route Stops with Stop Times – return a list of all stops along a given route with stop times
  • Routes at Stop with Times – return a list of routes for a given stop with stop times for each one
  • Search for Stop by Location – return the closest stops for a given address or set of coordinates

That list is similar to the information exposed by the unofficial TransLink API. A good starting point would be to simply clone what they’ve done! More advanced API features could include:

  • Route Interruptions – return a list of routes currently affected by construction or other interruptions
  • Stop Interruptions – return a list of stops currently affected by construction or other interruptions
  • Search for Stop by Landmark – return the closest stops for a given landmark
  • Trip Planner – return a list of route and transfer options for a given location of origin and destination
  • Information for St. Albert Transit and Strathcona County Transit
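To make the idea concrete, here’s a sketch of what a “Routes at Stop with Times” response and a third-party consumer of it might look like. The stop ID, route numbers, destinations, times, and field names are all invented – ETS hasn’t published anything like this.

```javascript
// Hypothetical response for "Routes at Stop with Times".
// Every value here is made up for illustration.
var response = {
  stopId: '1234',
  routes: [
    { route: '5',   destination: 'Westmount', times: ['14:05', '14:35', '15:05'] },
    { route: '120', destination: 'Stadium',   times: ['14:12', '14:42'] }
  ]
};

// A third-party interface (say, a Twitter bot) could then answer
// "when is the next bus at stop 1234?" with a few lines of code.
function nextDepartures(resp) {
  return resp.routes.map(function (r) {
    return 'Route ' + r.route + ' to ' + r.destination + ' at ' + r.times[0];
  });
}
```

Given data in roughly this shape, the interesting work for a developer is all presentation – exactly the part ETS shouldn’t have to spend its limited resources on.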

In the future, the sky is the limit. I know ETS is testing GPS technology on buses, so why not expose “distance from stop” information for a given route? That would be wicked, and incredibly useful when the weather dips below –25 C.

It’s not feasible for ETS to develop interfaces for each new platform that emerges. They have a website, but what about an iPhone application? Or a BlackBerry application? Or a Twitter bot? If they focused their limited software development energies on building an API, I’m confident that local entrepreneurs and software developers would build a plethora of interfaces on top of it. I would definitely build a Twitter bot!

There don’t seem to be many transit systems with APIs available, but that won’t be true for long. Here are a few others I’ve found: TransLink (unofficial), Bay Area Rapid Transit (official), Portland’s TriMet (official), Chicago Transit Authority (unofficial), Charlottetown Transit (unofficial). And here are a couple other resources I’ve come across: the Public Transit Openness Index, and a list of publicly available official GTFS (Google Transit Feed Specification) schedule data feeds.

I’d love to see Edmonton Transit take the lead and offer a completely free, fully functional transit API, and I’d be willing to help make it happen. In the meantime, don’t forget that you can now use Google Maps to find ETS trip plans.

Google Native Client: ActiveX for the other browsers

Today, Google announced Native Client, “a technology that aims to give web developers access to the full power of the client’s CPU while maintaining the browser neutrality, OS portability and safety that people expect from web applications.” Basically it’s a browser plugin that hosts a sandbox for native x86 code. So instead of writing a web page, you’d write a normal application and execute it in the browser.

I admit that I’ve only scanned the documentation and research paper so perhaps I’m missing the details, but Native Client seems entirely unnecessary for a bunch of reasons:

  • There are lots of ways to accomplish this already – Java, ActiveX, Flash/Flex, Silverlight 2, Alchemy, etc. Why do we need another one? Will it be very different or better? Heck, even ClickOnce seems better than this.
  • What’s the point of running native code inside a sandbox inside a browser? Unless the sandbox is super efficient and our browsers improve by an order of magnitude, it would seem to me that the benefits of native code would be erased.
  • Similarly, with the performance of Javascript/HTML/CSS in browsers consistently improving, why write native code at all? Web apps are becoming very fast.
  • I don’t really want to install yet another plugin. The classic “chicken and egg” plugin problem will be in effect here (users won’t install the plugin without great apps and developers won’t create great apps if no one has the plugin).

This project feels a lot like Google is reinventing the wheel. Or at the very least, throwing something else out there to see if it sticks. I hope developers think about this before jumping in. A bunch of the comments on Google’s post suggest that will happen, such as this one:

Um, isn’t this called desktop software?

That kinda says it all, I think!

When you get right down to it, Native Client is just ActiveX for browsers other than Internet Explorer. Sorry Google, but that doesn’t sound very appealing to me.

Recap: Edmonton Code Camp 2008

On Saturday we held Edmonton Code Camp 2008 downtown at MacEwan. Code Camp is an all-day event by developers, for developers. The only rule for presentations is that you have to show some code! Otherwise, it’s just a great opportunity to meet other developers in the community, and learn from one another.

We had three tracks of content. For posterity, here’s a list of the presentations we had:

I think my favorite was probably Mark Bennett’s talk on Javascript testing. It was about more than just jQuery, and I learned some really useful things. Like Mark, I’ve been thinking a lot lately about the best way to organize, test, and evolve an application that is Javascript-heavy.
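I won’t try to reproduce Mark’s material here, but the assertion-style Javascript testing he talked about can be sketched with a tiny homemade test runner. The slugify function is just an invented stand-in for code under test.

```javascript
// Minimal sketch of assertion-based Javascript testing.
// slugify is an example function under test, invented for this post.
function slugify(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')  // collapse non-alphanumerics to hyphens
    .replace(/^-|-$/g, '');       // trim leading/trailing hyphens
}

var results = [];
function test(name, fn) {
  try {
    fn();
    results.push(name + ': pass');
  } catch (e) {
    results.push(name + ': FAIL (' + e.message + ')');
  }
}
function assertEqual(actual, expected) {
  if (actual !== expected) {
    throw new Error('expected "' + expected + '" but got "' + actual + '"');
  }
}

test('lowercases and hyphenates', function () {
  assertEqual(slugify('Edmonton Code Camp 2008'), 'edmonton-code-camp-2008');
});
test('trims leading punctuation', function () {
  assertEqual(slugify('!!Hello'), 'hello');
});
```

Real test libraries add a lot on top of this, of course, but the core discipline – small functions, explicit assertions, named cases – is what makes a Javascript-heavy application possible to evolve.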

A large group of us went to Metro for lunch, which wasn’t the best idea because we were half an hour late getting started again in the afternoon! It turned out okay though.

Thanks to everyone who came out on Saturday, it was great to meet all of you! In particular, I was surprised at the number of Twitter users in attendance. For a smaller event, there were sure a lot of tweets posted!

Big thanks also to Steven Rockarts, who did most of the heavy lifting to get the event off the ground.

Let’s make next year’s code camp bigger and better!

Edmonton Code Camp 2008

We’re just over two weeks away from a really cool event for local software developers – Edmonton Code Camp! What is code camp? It’s a free event by developers, for developers. It’s an opportunity for local developers to get together to share with and learn from one another. Similar to DemoCamp, slide decks are frowned upon – show us the code! It doesn’t matter what your programming language of choice is, everyone is welcome!

Code Camp is an annual event here in Edmonton, organized primarily by Steven Rockarts from EDMUG. I’m really looking forward to it! Here are the details:

WHEN: Saturday, November 29th, 2008 from 9:00am until 4:30pm
WHERE: Building #5, MacEwan Downtown Campus (map)
Click here to register!

[geo_mashup_map height="200" width="575" zoom="15"]

You should be able to see the embedded map above. I originally tried the WP Geo plugin, which worked quite well, but I’m now testing the Geo Mashup plugin – it seems to work better, and I can specify the zoom on a per-post basis!

If you’d like to present something at code camp, let us know! You can add your name to the wiki, leave a comment here, or email Steven Rockarts. Just want to attend? That’s cool too! Just register here, and then tell your friends!

We’ll have more updates as we get closer to the event, so keep an eye on the website and wiki. Hope to see you there!

Microsoft is adopting jQuery moving forward

Just came across some really excellent news for developers. Microsoft’s ScottGu has announced that the ASP.NET team is adopting the popular jQuery library and will be shipping it with Visual Studio moving forward:

We are really excited to be able to partner with the jQuery team on this. jQuery is a fantastic library, and something we think can really benefit ASP.NET and ASP.NET AJAX developers. We are looking forward to having it work great with Visual Studio and ASP.NET, and to help bring it to an even larger set of developers.

I think this is just fantastic. I’m a fairly recent convert to jQuery, but I’m sold. I won’t build another website without it. The most immediate benefit of this announcement is the Intellisense support that Microsoft will be shipping in a few weeks as a free download.

You can read jQuery creator John Resig’s comments on the partnership here. This is an interesting kind of move for Microsoft. Instead of building their own version or trying to buy a competitor, as they usually do, they’re recognizing that jQuery is great as it is. Using jQuery will benefit Microsoft, and I’m sure it will benefit jQuery too, as Microsoft can submit patches, bug reports, and other contributions.

Great stuff!