Recap: DemoCamp Edmonton 31

Tonight was robot & games night at Edmonton’s 31st DemoCamp which took place at the Centennial Centre for Interdisciplinary Sciences (CCIS) on the University of Alberta campus. After missing the last two, it was great to be back to see some inspiring new projects and entrepreneurs. You can read my recap of DemoCamp Edmonton 29 here.

If you’re new to DemoCamp, here’s what it’s all about:

“DemoCamp brings together developers, creatives, entrepreneurs and investors to share what they’ve been working on and to find others in the community interested in similar topics. For presenters, it’s a great way to get feedback on what you’re building from peers and the community, all in an informal setting. Started back in 2008, DemoCamp Edmonton has steadily grown into one of the largest in the country, with over 200 people attending each event. The rules for DemoCamp are simple: 7 minutes to demo real, working products, followed by a few minutes for questions, and no slides allowed.”

In order of appearance, tonight’s demos included:

Bento Arm

Rory & Jaden showed us the latest version of the Bento Arm, a 3D-printed robotic arm. It features pressure sensors in the fingertips, servo motors that track velocity and other metrics, potentiometers, and even a camera embedded in the palm. The idea behind all of those sensors is to use machine learning to improve the arm's capabilities over time (for instance, the camera might recognize objects to help the arm pick them up). The demo showed how the hand could be controlled with a joystick, moving the arm around and opening and closing the fingers. The Bento Arm runs on the Robot Operating System (ROS), and the team plans to open source everything, hardware and software. To end the demo, they played rock-paper-scissors against the Bento Arm, which won. Welcome to the future!
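As a rough illustration of the joystick-to-gripper control they demoed, the mapping boils down to rescaling a joystick axis into a servo angle. This is only a sketch under my own assumptions (the real Bento Arm software runs on ROS; the function name and angle range here are hypothetical):

```python
def joystick_to_aperture(axis_value, min_angle=0.0, max_angle=90.0):
    """Map a joystick axis reading in [-1.0, 1.0] to a gripper servo
    angle in degrees: -1.0 = fully closed, +1.0 = fully open."""
    # Clamp the input so a noisy reading can't command an out-of-range angle.
    axis_value = max(-1.0, min(1.0, axis_value))
    # Linearly rescale [-1, 1] onto [min_angle, max_angle].
    return min_angle + (axis_value + 1.0) / 2.0 * (max_angle - min_angle)
```

In a ROS setup, a function like this would typically sit in a node that subscribes to joystick messages and publishes the resulting angle as a servo command.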

vrNinja demo

Nathaniel & Alexander were up next, and they showed us vrNinja, a ninja simulation game built for the Oculus Rift VR headset. In the game you are a ninja, and you must learn and use new weapons as things get faster and faster. The game features positional audio and requires you to move quite a bit in order to play (so be careful what's next to you). The team is hoping to release it in the Oculus store in the next month or so, and they have plans to look into the HTC Vive VR headset as well. If you'd like a closer look, you can check out the game this weekend at GDX Edmonton.


Next, Ian & Evan showed us what they have been working on with Anthrobotics. The idea is to build robots that do all the boring, redundant tasks that we all need to do each day. They showed three prototypes. The first was an anthropomorphic robot named Robio who sat in a wheelchair. Unfortunately the demo gods got the better of him and the speech demo didn't work. They said they liked the humanoid form (even though it is difficult to build) because they think it has the greatest potential for being useful in our world. The next two prototypes were a hand that featured an opposable thumb and a leg that could move as a whole or at just the foot. They are using Arduino boards right now but have plans to add Raspberry Pis in the future. Their robots are very much in the prototype stage, but if this is what they're doing in high school, I can't wait to see what they build in the future!

Hugo, the Twitter-powered robot

Jeff and a couple of his colleagues from Paper Leaf were up next to show us Hugo, the Twitter-powered robot that you probably tweeted inappropriate things to last year when it launched. The way it works is that you tweet something with the hashtag #hugorobot and Hugo will speak it aloud. You can read more about Hugo here. Hugo was a big success, and even helped Paper Leaf to win an ACE Award. At the experiment's peak, Hugo was receiving 3100 tweets per hour, and more than 7000 people watched the livestream. Hugo was posted to Reddit, 4chan, and 9gag, all of which meant that the team had to work hard to keep the blacklist updated. It's a fun project, and Jeff says you could apply the same concepts of social media and crowdsourcing elsewhere.
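The moderation problem Jeff describes, keeping a blacklist current as tweets pour in, comes down to filtering each incoming message before it reaches the text-to-speech step. Here's a minimal sketch of that idea (the function and word list are my own invention, not Paper Leaf's actual code):

```python
def is_speakable(tweet_text, blacklist):
    """Return True if the tweet contains no blacklisted word.
    Matching is case-insensitive and token-based, so a ban on
    'ass' doesn't also trip on 'class'."""
    words = set(tweet_text.lower().split())
    return words.isdisjoint(w.lower() for w in blacklist)

# Only clean tweets would be passed on to the speech engine.
blacklist = {"badword"}
queue = ["Hello Hugo! #hugorobot", "badword #hugorobot"]
speakable = [t for t in queue if is_speakable(t, blacklist)]
```

In practice the hard part isn't the filter itself but keeping the list updated as people invent creative workarounds, which is exactly what the team ran into.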

Our final demo of the evening was from Matt & Logan, who showed us RunGunJumpGun. It's a 2D side-scrolling "helicopter-style" game that they first prototyped at last year's GDX Edmonton. Now, a year later, they have improved and refined the game, and they plan to release it this summer. The game features 40 levels that increase along a difficulty curve, so that as you progress you should master the skills needed to win. Though honestly, the last level looked impossible to pass! There's a certain amount of frustration that comes along with the style of play, but it also has a high degree of replayability. They plan to launch an iPhone version at some point too.


Some upcoming events to note:

  • Monthly Hack Day is coming up this Saturday at Startup Edmonton
  • GDX Edmonton takes place Saturday and Sunday at the Robbins Health Learning Centre downtown
  • Preflight Beta takes place Tuesday at Startup Edmonton and “helps founders and product builders experiment and validate a scalable product idea”
  • The full Preflight program started today!
  • The next ROS Robotics Meetup takes place on May 19 at Startup Edmonton

Over 150 meetup events took place at Startup Edmonton last year! Keep an eye on the Startup Edmonton Meetup group for more upcoming events. They have also added a listing of all the meetups taking place at Startup to the website. You can also follow them on Twitter.

See you at DemoCamp Edmonton 32!

DARPA Race Won!

Maybe the title should say "finished" instead of "won", as DARPA's race for robots had never before been completed. At least three robots have now completed the harsh race:

Stanford University’s Racing Team has accomplished a historic feat of robotics, finishing first in the DARPA Grand Challenge, a 131.6-mile driverless car race that no artificially intelligent machine has ever conquered before.

"We had a great day," said Sebastian Thrun, director of Stanford's artificial intelligence lab and head of the racing team. Stanford's "Stanley," a modified Volkswagen Touareg with sensors and radar mountings, crossed the finish line in eight hours and 14 minutes, under the 10-hour requirement, according to times posted on the DARPA race Web site.

Director Dr. Tony Tether had this to say in the press release:

"It's incredible what Stanford and the two Carnegie Mellon teams did today, and what the other two teams can still achieve," Tether said. "We had anticipated from the beginning that we might have to carry the competition over to a second day."

"When the Wright Brothers flew their little plane, they proved it could be done," Tether continued. "And just as aviation took off after those achievements, so will the very exciting and promising robotics technologies displayed here today."

Truer words have never been spoken. I remember how difficult it was to get our robot to move when we were building it, so I have great respect for all the entrants of this competition. I can’t imagine how much ingenuity it would take to build a robot that can travel that distance, all by itself.

Read: CNET

Podbot in MAKE!

We were really fortunate to meet Phillip Torrone at Gnomedex, and to have the opportunity to chat with him about the Podbot. He’s got a new entry up in the MAKE: Blog on our beloved podcasting robot, so check it out. And watch future issues of Make Magazine because you never know, we might publish instructions on how to build your own!

The entry includes a number of pretty cool photos of the Podbot too. I especially like the one of me holding up my tablet with the control software open; it looks pretty intense. If this is the first you've heard of the Podbot, be sure to check out our official site.

Read: Make Blog

Announcing the Podbot!

I’ve been waiting to post this for quite some time now. I am very happy to introduce to you, the Podcast Wizard Robot, or Podbot for short. You may have heard rumblings about a podcasting robot already, and if so, you heard correctly!

The Podbot is exactly that, a podcasting robot. It moves around like a mini car, and is controlled wirelessly. It’s equipped with a webcam and of course, a microphone. The Podbot has a Tablet PC on board, to act as both the interface and brains of the robot. We control movement, recording, and other functionality remotely using another Tablet PC connected over Wi-Fi.

All of the software is written in .NET. The control software which handles communication with the Podbot and functionality like movement was written specifically for the Podbot. The podcasting software is actually Podcast Wizard, our upcoming podcasting tool. Our podcast is hosted at Podcast Spot, and all of our episodes are tagged with Podcast Tags.
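The actual control software is .NET, but just to illustrate the remote-control idea, a movement command sent from the controller tablet to the robot over Wi-Fi might be serialized as a small line-delimited message. Everything here (message format, field names, command set) is my own invention for the sketch:

```python
import json

def encode_move(direction, speed):
    """Build a one-line JSON movement command, suitable for sending
    over a TCP socket between the two tablets. 'direction' is one of
    'forward', 'back', 'left', 'right', 'stop'; 'speed' is 0-100."""
    if direction not in ("forward", "back", "left", "right", "stop"):
        raise ValueError("unknown direction: " + direction)
    # Clamp speed so a bad controller value can't overdrive the motors.
    speed = max(0, min(100, int(speed)))
    return json.dumps({"cmd": "move", "dir": direction, "speed": speed}) + "\n"

def decode_move(line):
    """Parse a received command line back into (direction, speed) on the robot."""
    msg = json.loads(line)
    return msg["dir"], msg["speed"]
```

A newline-delimited text protocol like this is easy to debug by hand, which matters a lot when the robot is wandering away from you mid-test.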

As you’ll see from the website, the Podbot was created by myself, Dickson, Andrew and Ashish. Above all, it was a fun project, and we all learned a lot. Thanks to Andrew and Ashish for all of the hard work you did – the Podbot just wouldn’t exist without you!

You can check out the website for more information on the Podbot, or if you’re at Gnomedex, come see it in person! You can listen to our first podcast with the robot here.

Read: Podcast Wizard Robot

Cockroach Powered Robot

As some of you may know, we’re in the process of building a robot. We’ve encountered our fair share of problems so far, specifically with regards to getting the damn thing to move. So I was particularly interested to read about graduate student Garnet Hertz and his solution for robotic movement:

He uses the Madagascar hissing cockroach, Gromphadorhina portentosa, which can grow as big as a mouse. In the summer of 2004, he built a three-wheeled cart that rises about knee high. Atop the aluminum structure sits a modified computer trackball pointer, with a Ping-Pong ball in place of the usual trackball, which is heavier.

The roach–he currently maintains a stable of four–rides on top of the trackball. As it scampers, the robot moves in the direction the roach would travel if it were on the ground; a Velcro patch and harness keep it in place.

Quite an interesting approach! I guess the “robo-roach” could be seen as something of a glimpse into the future, where we might have hybrid biological and mechanical robots. I am not sure I would have picked roaches though – I wouldn’t want to have to look after them just for the robot!
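The trackball trick maps nicely onto a few lines of code: each sensor update gives a displacement of the Ping-Pong ball, and the cart steers along that vector. A sketch of the idea, using my own simplified coordinate convention rather than anything from Hertz's actual setup:

```python
import math

def ball_to_heading(dx, dy):
    """Convert a trackball displacement (dx, dy) into a heading in
    degrees (0 = straight ahead, positive = clockwise) and a speed
    proportional to how fast the roach is scampering."""
    heading = math.degrees(math.atan2(dx, dy))  # treat +dy as 'forward'
    speed = math.hypot(dx, dy)
    return heading, speed
```

The robot's drive loop would then just poll the modified trackball pointer and feed each (dx, dy) pair through a function like this.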

Read: CNET