December 10, 2015

Clarifai Featured Hack: How to Build an Artificially Intelligent, 3D-Printed Lightsaber


If your answer is SPACE MAGIC then you are correct, young Padawan.


Chad Ramey, a Georgia Institute of Technology computer science undergrad, created an awesome 3D-printed Lightsaber that recognizes motion gestures (insert swishy Lightsaber sound here). Using a custom-trained Clarifai classifier and an Inertial Measurement Unit (IMU) to record movement data, Chad was able to build an artificially intelligent Lightsaber in a single weekend.

WHY WE ❤ IT

Not only is Chad’s use of our image recognition API for gesture recognition a great demonstration of out-of-the-box thinking, it’s also a testament to the power of our technology. Chad’s project shows that Clarifai’s artificial intelligence API can be used to recognize patterns in any non-visual data that can be represented graphically (for example, a song’s spectrogram).

HOW YOU DO IT

We had the chance to catch up with Chad about his epic Lightsaber project and how he managed to build it. If you want to learn from a real Jedi hackathon master, read on!


Clarifai: What inspired you to build this project for the UGAHacks hackathon?

Chad: My teammate Josh Marsh and I wanted to take sci-fi objects and translate the crazy things that happen in the movies to things that happen in real life. For example, if you were to wave Harry Potter’s magic wand in real life, we wanted to be able to recognize that gesture and make something happen as a result of it, like turn on a light. We ended up going with a Lightsaber because it was the only thing big enough to put the accelerometer and microcontroller inside.


Using our visual recognition API for gesture recognition is pretty innovative. How did you come up with this idea?

I have a background in machine learning so I was immediately interested in using the Clarifai API. Raw motion data is very noisy and difficult to translate into meaningful patterns. It traditionally involves a lot of complex, difficult math – I hoped that Clarifai’s AI would be able to pick up patterns in the data for us.

How did you implement Clarifai in your project?

“With Clarifai, you don’t have to worry about any of the hard math to detect patterns.”


We trained one of Clarifai’s custom binary classifiers on a small data set of graphs to identify four different gestures – right, left, up, and down. We used a Python library called matplotlib to take all the data we were getting from our Lightsaber’s sensor, plot it as a chart, and save it as a .jpg image. Then we custom trained Clarifai to recognize which plot meant left, which meant right, and so on. Super, super simple – you only need an image and none of the math.
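
For a sense of what that plotting step looks like in practice, here is a minimal sketch, assuming the IMU samples arrive as (x, y, z) accelerometer tuples – the window size, figure styling, and file name are illustrative, not Chad’s exact code:

```python
# Minimal sketch: turn one gesture's worth of raw IMU samples into a
# plot image that a classifier can learn from. The sample format and
# styling are assumptions. Saving to .jpg requires Pillow.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

def save_gesture_plot(samples, out_path="gesture.jpg"):
    """Plot accelerometer x/y/z traces for one gesture and save as .jpg."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    zs = [s[2] for s in samples]

    fig, ax = plt.subplots(figsize=(2, 2))
    ax.plot(xs)
    ax.plot(ys)
    ax.plot(zs)
    ax.axis("off")  # the classifier only needs the curve shapes, not axes
    fig.savefig(out_path, dpi=100)
    plt.close(fig)
```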

If someone else wanted to build a similar project, what technical skill level would they need?

“The really awesome thing that I got super excited about was how easy it was to work with the Clarifai API. It was an hour or two of work – just 100 lines of Python code for me to write.”


Unlike with open source toolsets, you don’t have to understand what’s going on under the Clarifai hood in order to use the technology properly. For example, the best open source tool available for motion recognition right now is the Gesture Recognition Toolkit (GRT), but you need a lot of machine learning knowledge to know what to do with it.

“With the Clarifai API, you don’t need to be a data scientist or machine learning expert. Any developer can use it.”
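
To make that concrete, here is a rough sketch of training and predicting against a custom gesture model using Clarifai’s later 2.x Python client – the model ID, file names, and concept labels are illustrative assumptions, and this is not necessarily the exact API Chad’s hackathon code called:

```python
# Sketch using Clarifai's 2.x Python client; model ID, file names, and
# concept labels are illustrative assumptions, not Chad's original code.
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")

# Upload labeled gesture plots (one concept per gesture direction).
for direction in ["left", "right", "up", "down"]:
    app.inputs.create_image_from_filename(
        "plots/%s_example.jpg" % direction, concepts=[direction]
    )

# Create a small custom model over those four concepts and train it.
model = app.models.create("lightsaber-gestures",
                          concepts=["left", "right", "up", "down"])
model.train()

# Classify a freshly plotted gesture.
result = model.predict_by_filename("gesture.jpg")
print(result["outputs"][0]["data"]["concepts"])
```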


What did it take to 3D print and assemble the actual Lightsaber?

It took about 10 hours for the Lightsaber parts to print on our two 3D printers. We took a strip of individually addressable RGB LEDs and put it on the inside of the saber for the glow. In the hilt, we stored the accelerometer and a microcontroller to collect data and control the lights. We used an Arduino microcontroller – a really awesome tool that you can build robots with. It’s very popular in the maker community and only costs $10-12.
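
On the data-collection side, reading that sensor stream on a laptop could look something like the sketch below, assuming the Arduino prints comma-separated accelerometer values over USB serial – the port name, baud rate, and line format are all assumptions:

```python
# Sketch: read accelerometer samples streamed by an Arduino over serial.
# Port name, baud rate, and the "x,y,z" line format are assumptions.
import serial  # pyserial

def read_gesture(port="/dev/ttyUSB0", baud=9600, n_samples=100):
    """Collect one gesture's worth of (x, y, z) accelerometer readings."""
    samples = []
    with serial.Serial(port, baud, timeout=1) as conn:
        while len(samples) < n_samples:
            line = conn.readline().decode("ascii", errors="ignore").strip()
            try:
                x, y, z = (float(v) for v in line.split(","))
                samples.append((x, y, z))
            except ValueError:
                continue  # skip partial or malformed lines
    return samples
```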

Are there real-world applications for how you used Clarifai in your Lightsaber project?

“Our project essentially proves that Clarifai can be used for gesture recognition, so there are limitless real-world motion-based applications that could be built with Clarifai.”

You could make your own activity tracker (like a FitBit) or smartwatch using a bracelet with a motion sensor and Clarifai as the backend. You could train the model to understand whether you’re doing a push-up or a pull-up, or whether you’re running or walking, and track your fitness that way. Or you could control Wi-Fi-connected things through gestures – you could draw the letters “TV” in the air to turn on your Wi-Fi-connected TV.


Thanks for sharing, Chad!


For everyone inspired by Chad’s epic project, sign up for a free Clarifai account to start using our API – all it takes is three lines of code to get up and running! We’re super excited to share all the cool things built by our developer community, so don’t forget to tweet @Clarifai to show us your apps.
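
For the curious, those three lines looked roughly like this with the v1 Python client of that era – the image URL is a placeholder, and the exact client calls are our best recollection of that version, not a current API reference:

```python
# Roughly the v1-era quickstart; assumes CLARIFAI_APP_ID and
# CLARIFAI_APP_SECRET are set in your environment.
from clarifai.client import ClarifaiApi
clarifai_api = ClarifaiApi()
result = clarifai_api.tag_image_urls("https://example.com/lightsaber.jpg")
```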