DIY a Robotic Emoji Lamp


This is my interactive emoji lamp, built with the SeeedStudio BeagleBone Green Wireless, a robotic arm, some servos and IR sensors. The lamp is controlled by infrared detection: if you try to touch its left side, it turns right, and at the same time the emoji displayed on the OLED screen changes. What's more, you can put your hand on top of the lamp to switch it on or off.


See how it works in the demo video:

I got the inspiration while thinking about a birthday gift for my 8-year-old little brother, who is a robot fan and also loves emoji. I also had an old robotic arm I had built before. Combining all of these, I decided to give him a unique, funny lamp.


It took me about 2 days to finish. Even though I am a beginner with the SeeedStudio BeagleBone Green Wireless and Python, the project was not too difficult for me. The SeeedStudio Grove modules helped me a lot. With the Grove – OLED Display, Grove – Line Finder and Grove – Various LED, I hardly needed to worry about hardware problems. No PCB design, no soldering: I just connected them to the right places according to the wiki pages of the Grove modules, and they worked. All I needed to do was implement the application logic.

But there were still some challenges in this project. The first problem was how to display emoji on the OLED. I had only displayed text on it before, never pictures, so I was confused at the beginning. Luckily, I found the solution on the wiki page of the Grove – OLED: draw the emoji myself and save it as a BMP file, use a tool to convert the BMP file into a hex array, and finally display the emoji with the upmLCD draw method.
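Conceptually, the BMP-to-hex conversion just repacks the image into the SSD1308 controller's memory layout: the 128x64 screen is split into eight horizontal "pages", and each byte covers a vertical strip of 8 pixels. Here is a rough Python sketch of that packing; the commented-out `pyupm_i2clcd` calls at the end are my assumption about the upm API (and the I2C bus number and address depend on your board), so double-check them against the Grove – OLED wiki.

```python
def pack_ssd1308(pixels):
    """Pack a 2D list of 0/1 pixels (rows x columns, row count a multiple
    of 8) into SSD1308 page-ordered bytes. Each output byte is a vertical
    8-pixel strip; I assume bit 0 maps to the top row of the page --
    verify this against your panel's datasheet."""
    rows, cols = len(pixels), len(pixels[0])
    data = bytearray()
    for page in range(rows // 8):
        for x in range(cols):
            byte = 0
            for bit in range(8):
                if pixels[page * 8 + bit][x]:
                    byte |= 1 << bit
            data.append(byte)
    return data

# Sending the packed emoji to the display with upm (assumed API):
# from upm import pyupm_i2clcd
# lcd = pyupm_i2clcd.SSD1308(0, 0x3C)   # bus and address are placeholders
# lcd.clear()
# lcd.draw(bytes(data), len(data))
```

In practice the conversion tool produces the same page-ordered hex array, so you can also paste its output straight into the script instead of packing at runtime.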


The other challenge was making sure all the modules worked well at the same time. The servo, OLED and IR sensors were not easy to control together, so I decided to encapsulate the servo and OLED code into modules and call them whenever an IR sensor output went HIGH. Then I kept checking the IR sensors in my main loop, and the emoji lamp worked as I wanted. But I don't think the script runs as efficiently as I designed it to. Would anyone like to guide me?
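For what it's worth, the polling loop boils down to a tiny dispatch function that maps the three IR readings to one action, with the top sensor taking priority for the on/off gesture. This is only a sketch of the logic, not my actual script; the `mraa` pin numbers and the `lamp`/`arm` wrappers in the commented part are placeholders.

```python
def decide(left, right, top):
    """Map the three IR readings (True = hand detected) to one action.
    The top sensor wins, so the on/off gesture always takes priority."""
    if top:
        return 'toggle'
    if left:
        return 'turn_right'   # a hand on the left drives the lamp to the right
    if right:
        return 'turn_left'
    return None

# On the board, the main loop is roughly this (pin numbers are placeholders
# -- take the real ones from the Grove wiki for your sockets):
# import time, mraa
# left_pin, right_pin, top_pin = mraa.Gpio(59), mraa.Gpio(57), mraa.Gpio(51)
# while True:
#     action = decide(left_pin.read() == 1,
#                     right_pin.read() == 1,
#                     top_pin.read() == 1)
#     if action == 'toggle':
#         lamp.toggle()       # hypothetical LED-module wrapper
#     elif action is not None:
#         arm.move(action)    # hypothetical servo-module wrapper
#     time.sleep(0.05)        # ~20 Hz polling keeps the loop cheap
```

Adding a short sleep between polls is what keeps a loop like this from pegging the CPU, which may be part of the efficiency issue.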

When my little brother received this interactive emoji lamp, his voice trembled with excitement! He played with the lamp and showed it off to his friends all afternoon. But in the evening, he started asking me why there were only 4 emoji and why they changed randomly. So, taking inspiration from my little brother, I would like to add more emoji and more interactive functions to this lamp, such as displaying a happy face when someone taps it and an angry face when someone keeps touching it. What's more, I plan to rewrite my Python script to make it run faster and more efficiently. Here is my project on GitHub:
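One way to tell a tap from a sustained touch would be to time how long the top sensor stays blocked over a short window of samples. The helper below is purely hypothetical, a sketch of how the happy/angry faces might be triggered; the sample period and threshold are guesses to tune on the real lamp.

```python
def classify_touch(samples, sample_period=0.05, hold_threshold=1.0):
    """Classify a burst of top-sensor readings taken every sample_period
    seconds: a brief block is a 'tap' (show a happy face), a block lasting
    at least hold_threshold seconds is a 'hold' (show an angry face)."""
    blocked = sum(1 for s in samples if s)
    if blocked == 0:
        return None
    return 'hold' if blocked * sample_period >= hold_threshold else 'tap'
```

In the main loop this would run on the readings collected since the top sensor first triggered, so the lamp reacts once per gesture instead of once per poll.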

And if you want to know more details about my project, please check this recipe:

About Author


July 2016