A couple of months ago, Suz Hinton (@noopkat) released a library to control OLED screens from Johnny-Five: https://www.npmjs.com/package/oled-js, so I ordered an SSD1305 from Adafruit.
I would like to add a screen to Felix, eventually to give some status feedback, but mostly to display and animate eyes. Sort of the way Baxter from Rethink Robotics does it.
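To get a feel for it, here is a minimal sketch of blinking eyes with Johnny-Five and oled-js. The oled-js calls follow its README (`clearDisplay`, `fillRect`, `update`); the I2C address, pin choices and all the eye geometry are my own assumptions, not a tested setup.

```javascript
// Pure helper: eye height (px) for each frame of a blink cycle.
// Produces a triangle wave: fully open, closing to a 2 px slit, reopening.
function blinkHeights(openHeight, frames) {
  var heights = [];
  for (var i = 0; i < frames; i++) {
    var t = Math.abs(i - (frames - 1) / 2) / ((frames - 1) / 2);
    heights.push(Math.max(2, Math.round(openHeight * t)));
  }
  return heights;
}

// Hardware side -- requires a board plus the johnny-five and oled-js modules.
function startEyes() {
  var five = require('johnny-five');
  var Oled = require('oled-js');
  var board = new five.Board();
  board.on('ready', function () {
    var oled = new Oled(board, five, {
      width: 128, height: 64, address: 0x3C // typical I2C address; check yours
    });
    var frames = blinkHeights(24, 7);
    var i = 0;
    setInterval(function () {
      var h = frames[i++ % frames.length];
      oled.clearDisplay();
      // Two filled rectangles as simple eyes, vertically centered.
      oled.fillRect(24, 32 - h / 2, 24, h, 1);
      oled.fillRect(80, 32 - h / 2, 24, h, 1);
      oled.update();
    }, 80);
  });
}
// startEyes(); // uncomment to run on the robot
```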
To mount the screen I designed two head pieces and cut them on the laser cutter.
The head is controlled by two HXT900 servos. One servo rotates the head from left to right and the other tilts it. The rotation allows the two SHARP 2Y0A21 proximity sensors to scan the room. The tilt is just cute.
Besides the screen and sensors I also added a small piezo speaker.
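Wired up in Johnny-Five, the head could look something like the sketch below. The `GP2Y0A21YK` controller is Johnny-Five's name for the Sharp 2Y0A21 sensor; every pin number, the scan range and the beep threshold are assumptions for illustration.

```javascript
// Pure helper: pan angles for one back-and-forth scan pass.
function scanAngles(min, max, step) {
  var angles = [];
  for (var a = min; a <= max; a += step) angles.push(a);
  for (var b = max - step; b >= min; b -= step) angles.push(b);
  return angles;
}

// Hardware side: pan/tilt servos, one IR proximity sensor, a piezo.
function startHead() {
  var five = require('johnny-five');
  var board = new five.Board();
  board.on('ready', function () {
    var pan = new five.Servo(9);    // left/right rotation
    var tilt = new five.Servo(10);  // up/down tilt
    var eye = new five.Proximity({ controller: 'GP2Y0A21YK', pin: 'A0' });
    var piezo = new five.Piezo(3);

    tilt.to(90); // level the head
    var angles = scanAngles(30, 150, 15);
    var i = 0;
    setInterval(function () {
      pan.to(angles[i++ % angles.length]); // step through the scan
      if (eye.cm > 0 && eye.cm < 15) {
        piezo.frequency(880, 100); // beep when something is close
      }
    }, 250);
  });
}
// startHead(); // uncomment to run on the robot
```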
I’m happy with the result; now it’s time to do the wiring.
Mini Felix is a noisy and jittery fellow. Walking is ok, but doing animations with it is quite annoying. Instead I decided to assemble a new chassis using the latest design and use it to test the animation tool. At the same time I wanted to document the assembly so I can write a step-by-step tutorial. Putting it all together took about 4 hours, and there are a couple of things I can do to improve the design to ease this process. Today I did the soldering, connected the servos and was able to start playing with the animation editor. More on that later.
I really like laser cutters. They are like a 2.5D copy machine. We have two in our fablab =). For this week’s project I worked on a design for a simple gift-card automata. First I got the dimensions and mechanics right using plywood, my bandsaw and a Dremel. Then I scanned the pieces and traced them in Illustrator. Then I exported the vector file to the laser cutter. Our big Chinese cutter is not exactly a precision champion. It’s difficult to find the sweet spot between distance, power and speed to match the material. A rushed setup produced a somewhat charred result, but fine enough to test the assembly. If you want to give it a try, download the vector files (AI, SVG and DXF): WhyYouCry.
Right now the assembly requires 3 mm wood dowels and possibly a bit of glue. I might revise the design into a fully laser-cut solution. Another enhancement could be to print and transfer graphics onto the wood before it is cut (http://www.instructables.com/id/Image-Transfer-to-Wood/); you can of course just paint it afterward with acrylics.
If you make one, please tweet me a picture @RonaldXJT!
[Picture credits: Tommaso Ranzani]
One of the problems of soft robotics is that some tasks require a high degree of precision. Knowing exactly where your end effector is is crucial, and being able to control the “stiffness” of the system is a plus when you need accuracy.
Today I stumbled upon a paper that takes a novel approach to these issues. It’s a bioinspired soft manipulator by Dr. Tommaso Ranzani (http://iopscience.iop.org/1748-3190/10/3/035008). The manipulator consists of small segments of silicone. Each segment has four chambers. Three chambers are inflatable, and by controlling their air pressure you can move the manipulator around. Pumping two at the same time moves the manipulator toward the deflated one; pumping all three extends the manipulator. There is a 6 DOF sensor in each junction. By modeling the orientation of each junction and the length of the segments they can calculate the position of the tip.
So far nothing new. The genius is in the central chamber. This central chamber is filled with granular material (e.g. ground coffee), and by controlling the pressure in this chamber they can control the stiffness of the system.
It’s the same principle behind John Amend’s and Hod Lipson’s universal robotic gripper.
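The tip-position idea is essentially forward kinematics: chain a rotation per junction with a translation per segment. A minimal sketch, simplified to the plane (the paper works in 3D with full 6-DOF orientations, so this is only the flavor of the calculation, not their model):

```javascript
// Planar forward kinematics: given each junction's bend angle (radians)
// and each segment's length, accumulate the transforms to find the tip.
function tipPosition(angles, lengths) {
  var x = 0, y = 0, theta = 0;
  for (var i = 0; i < lengths.length; i++) {
    theta += angles[i];                // rotation contributed by junction i
    x += lengths[i] * Math.cos(theta); // translate along the segment's axis
    y += lengths[i] * Math.sin(theta);
  }
  return { x: x, y: y, theta: theta };
}

// A straight, unbent manipulator points along x:
// tipPosition([0, 0, 0], [10, 10, 10]) gives x = 30, y = 0.
```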
Lately there has been a renewed interest in the field of soft robotics. Big Hero 6 probably has something to do with it =).
Anyway, Harvard Biodesign Lab released a soft robotics toolkit last year. Doing some sort of project with the kit has been on my radar ever since, but I haven’t had a specific idea. But today a small centipede decided to walk by and I took a short clip in order to get a better idea of how he moved. Trying to mimic the locomotion pattern could be an interesting project.
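A common way to approximate that kind of locomotion in code is a traveling wave: every segment runs the same oscillation, but each one lags the previous by a fixed phase, so the wave appears to travel down the body. A sketch of the idea (the amplitude, frequency and phase-lag values are made up, not measured from the clip):

```javascript
// Actuation value for segment `i` at time `t` (seconds): a sine wave
// that travels down the body because segment i lags by i * phaseLag.
function segmentDrive(i, t, opts) {
  var amp = opts.amplitude;   // peak actuation (e.g. bend angle or pressure)
  var freq = opts.frequency;  // oscillations per second
  var lag = opts.phaseLag;    // radians of delay per segment
  return amp * Math.sin(2 * Math.PI * freq * t - i * lag);
}

// With phaseLag = PI/4, segment 2 is a quarter cycle behind segment 0,
// which is what makes the body undulate rather than flap in unison.
```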
This animation tool can now control servos using socket.io and Johnny-Five, and it’s actually quite fun! But it is far from production ready. I created a repo for the project: https://github.com/Traverso/go-johnny-go and I have already added a bunch of issues https://github.com/Traverso/go-johnny-go/issues. Nothing major, mostly debris from a rushed proof-of-concept coding spree.
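The bridge between a browser tool and the servos can be sketched in a few lines: a socket.io server receives keyframe messages and tweens a servo between them. The event name and message shape below are my invention for illustration, not the actual go-johnny-go protocol.

```javascript
// Pure helper: linear interpolation between two keyframe values.
// t runs from 0 (at `from`) to 1 (at `to`).
function tween(from, to, t) {
  return from + (to - from) * t;
}

// Hardware/server side: socket.io messages drive the servos.
function startServer() {
  var five = require('johnny-five');
  var io = require('socket.io')(8080);
  var board = new five.Board();
  board.on('ready', function () {
    var servos = [new five.Servo(9), new five.Servo(10)];
    io.on('connection', function (socket) {
      // Example payload: { servo: 0, from: 30, to: 120, frames: 10 }
      // (payload shape is assumed, not taken from the repo).
      socket.on('animate', function (msg) {
        var frame = 0;
        var timer = setInterval(function () {
          servos[msg.servo].to(tween(msg.from, msg.to, frame / msg.frames));
          if (++frame > msg.frames) clearInterval(timer);
        }, 20);
      });
    });
  });
}
// startServer(); // uncomment to run on the robot
```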
This week I got a bunch of 9G servos and some PWM motor shields from my local Adafruit pusher m.nu. I want to use them in a smaller (more affordable) version of Felix.
The redesigned chassis is just a scaled-down version of the current geometry, but I have to work on the casing for the electronics. There is not much space available for an Uno, a shield and a battery pack. If I lay the electronics down flat, Felix loses his slender figure and looks more like a turtle. Currently I’m working on a vertical arrangement, which gives him an unflattering hump, but he seems to prefer that.
Before I settle on and refine the design, I have to test how it affects Felix’s balance during the gait.