In the “old days”, if you wanted to display video on a web page you had to upload it to a server
and hope that people had the right media plugin to handle your video format. YouTube changed that by not only hosting and encoding the files, but also by allowing you to embed their viewer into your website. By “playing ball” with the rest of the web YouTube became an integral part of it (cats may have played a role as well).
Something similar happened with maps. By allowing third-party websites to embed their maps (and by providing a formal API) Google Maps evolved from a novel web-version of a classical desktop-app to a structural part of the web.
Currently I’m intrigued by the future of OnShape, a web- and cloud-based 3D CAD system.
They are openly developing the product at a furious pace (https://www.onshape.com/cad-blog).
At this rate they could soon reach feature parity with their nearest competitors.
But what’s really interesting is that the system doesn’t depend on a plugin. It’s all in the browser, pushing the envelope. It is entirely possible for OnShape to allow other sites to use their tech as an integrated service. And they seem to be open to the idea: https://forum.onshape.com/discussion/comment/6562#Comment_6562
Recently I ran into a relevant scenario. “Enabling the future” is a global network of volunteers that donate 3D print time to create prosthetics for amputees. They have created a great web app, the “Hand-o-matic”. The app lets you customise the design of a prosthetic hand and will mail you the generated STL files based on the supplied parameters.
They use OpenSCAD (an open source CAD modeller) in command line mode to generate still-images of the parts (and to generate STL files).
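Driving OpenSCAD headlessly boils down to an invocation like `openscad -o out.stl -D name=value model.scad`, where `-D` overrides a variable in the .scad file and `-o` selects the output format (STL, PNG, etc.). A small Node sketch of how such a command line could be assembled — the parameter names here are made up for illustration, not the Hand-o-matic’s real ones:

```javascript
// Build argv for a headless OpenSCAD render. "-Dname=value" overrides
// a variable declared in the .scad file; "-o" picks the output file,
// whose extension (.stl, .png) selects the format.
function openscadArgs(scadFile, outFile, params) {
  const defs = Object.entries(params).map(
    ([name, value]) => `-D${name}=${value}`
  );
  return ['-o', outFile, ...defs, scadFile];
}

// Hypothetical parameters, just to show the shape of the call:
const args = openscadArgs('hand.scad', 'hand.stl', { wristWidth: 55, scale: 1.1 });
console.log(args.join(' '));

// With OpenSCAD installed you would then run it, e.g.:
//   require('child_process').execFile('openscad', args, done);
```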
An embeddable CAD system could power a much richer experience for this and many other sites where there is a need for visualising and editing parametric 3D designs.
OnShape has the potential to enable a new kind of web-application hybrid (mashup) that takes CAD and the Web to new places.
The update rate of my OLED screen via I2C is too slow to do proper animation. Redrawing the eyes at a new position 12 times a second was out of the question.
Luckily the oled-js library supports a scroll method, so I went with that and implemented the movement of the eyes by controlling the scroll direction and timing the start-stop sequence with temporal. Because the hardware pushes the entire screen buffer around, the eyes don’t need to be as simplistic as I initially made them. They could just as well be a bitmap with a set of mesmerising cat eyes, but for now this will do.
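The start-stop sequence looks roughly like this. It assumes oled-js’s startScroll(direction, startPage, stopPage) and stopScroll() methods and temporal’s queue(), where each task fires `delay` ms after the previous one; the page range and delays are placeholders, not tuned values:

```javascript
// Build a left-then-right "glance" as a temporal task list. The oled
// object is only touched when a task fires, so the schedule itself can
// be inspected (or tested) without hardware attached.
function glanceSchedule(oled, holdMs) {
  return [
    { delay: 0,      task: () => oled.startScroll('left', 0, 7) },  // eyes drift left
    { delay: holdMs, task: () => oled.stopScroll() },               // hold
    { delay: holdMs, task: () => oled.startScroll('right', 0, 7) }, // drift back
    { delay: holdMs, task: () => oled.stopScroll() },
  ];
}

// With real hardware you would hand the schedule to temporal:
//   const temporal = require('temporal');
//   temporal.queue(glanceSchedule(oled, 300));
```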
I purchased an OLED screen to add eyes to Felix. To talk to the display I started playing with Suz Hinton’s (@noopkat) Oled library. I couldn’t find any project on the web using it, so I decided to make a quick tutorial to test the display. The result is a how-to for building an electronic Etch-A-Sketch gizmo.
You can find the tutorial on Github:
It’s a repo, so you are welcome to contribute if you stumble upon any issues.
A couple of months ago Suz Hinton (@noopkat) released a library to control OLED screens from Johnny-Five: https://www.npmjs.com/package/oled-js, so I ordered a SSD1305 from Adafruit.
I would like to add a screen to Felix to eventually give some status feedback, but mostly to display/animate eyes. Sort of the way Baxter from Rethink Robotics does it.
To mount the screen I designed two head pieces and cut them on the laser cutter.
The head is controlled by two HXT900 servos. One servo rotates the head from left to right and the other tilts it. The rotation allows the two SHARP 2Y0A21 proximity sensors to scan the room. The tilt is just cute.
Besides the screen and sensors I also added a small piezo speaker.
I’m happy with the result; now it’s time to do the wiring.
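The wiring can be sketched as a Johnny-Five program. The pin numbers below (9/10 for the servos, A0 for one SHARP sensor) are assumptions, not the actual hookup, and the sweep helper is kept pure so it can be tried without a board:

```javascript
// Pure helper: the pan angles for one left-to-right scan pass.
function scanAngles(from, to, step) {
  const angles = [];
  for (let a = from; a <= to; a += step) angles.push(a);
  return angles;
}

// With the board wired up (pins are guesses):
//   const five = require('johnny-five');
//   new five.Board().on('ready', function () {
//     const pan  = new five.Servo(9);    // left-right rotation
//     const tilt = new five.Servo(10);   // the cute tilt
//     const eye  = new five.Sensor('A0'); // SHARP 2Y0A21
//     const pass = scanAngles(30, 150, 10);
//     let i = 0;
//     this.loop(250, () => {
//       pan.to(pass[i % pass.length]);   // sweep the sensors across the room
//       i++;
//     });
//   });
```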
Mini Felix is a noisy and jittery fellow. Walking is ok, but doing animations with it is quite annoying. Instead I decided to assemble a new chassis using the latest design and use it to test the animation tool. At the same time I wanted to document the assembly so I can write a step-by-step tutorial. Putting it all together took about 4 hours and there are a couple of things I can do to improve the design to ease this process. Today I did the soldering and connected the servos and was able to start playing with the animation editor. More on that later.
I really like laser cutters. They are like a 2.5D copy machine. We have two in our fablab =). For this week’s project I worked on a design for a simple gift-card automata. First I got the dimensions and mechanics right using plywood, my bandsaw and a Dremel. Then I scanned the pieces and traced them in Illustrator. Then I exported the vector file to the laser cutter. Our big Chinese cutter is not exactly a precision champion. It’s difficult to find the sweet spot between distance, power and speed to match the material. A rushed setup produced a somewhat charred result, but fine enough to test the assembly. If you want to give it a try, download the vector files (ai, svg and dxf): WhyYouCry.
Right now the assembly requires 3 mm wood dowels and possibly a bit of glue. I might revise the design to be a fully laser-cut solution. Another enhancement could be to print and transfer graphics to the wood before it is cut (http://www.instructables.com/id/Image-Transfer-to-Wood/); you can of course just paint it afterwards with acrylics.
If you make one, please tweet me a picture @RonaldXJT!
[Picture credits: Tommaso Ranzani]
One of the problems of soft robotics is that some tasks require a high degree of precision. Knowing exactly where your end effector is is crucial, and being able to control the “stiffness” of the system is a plus when you need accuracy.
Today I stumbled upon a paper that takes a novel approach to dealing with these issues. It’s a bioinspired soft manipulator by Dr. Tommaso Ranzani (http://iopscience.iop.org/1748-3190/10/3/035008). The manipulator consists of small segments of silicone. Each segment has four chambers. Three chambers are inflatable, and by controlling their air pressure you can move the manipulator around. Pumping two at the same time moves the manipulator toward the deflated one; pumping all three extends the manipulator. There is a 6-DOF sensor in each junction. By modeling the orientation of each junction and the length of the segments they can calculate the position of the tip.
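The tip calculation can be illustrated with a small sketch. This is a planar simplification of the paper’s 3D model (the real system uses 6-DOF sensors): each junction contributes an orientation, each segment a length, and the tip is the running sum of the segment vectors:

```javascript
// Planar forward kinematics: accumulate each junction's measured
// angle into a running heading, then add that segment's vector.
function tipPosition(segmentLengths, junctionAngles) {
  let x = 0, y = 0, heading = 0;
  for (let i = 0; i < segmentLengths.length; i++) {
    heading += junctionAngles[i];            // orientation from sensor i
    x += segmentLengths[i] * Math.cos(heading);
    y += segmentLengths[i] * Math.sin(heading);
  }
  return { x, y };
}

// Two straight segments of length 1 reach (2, 0); bending the first
// junction 90 degrees points the single segment straight "up".
```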
So far nothing new. The genius is in the central chamber. This central chamber is filled with granular material (e.g. coffee), and by controlling the pressure in this chamber they can control the stiffness of the system.
It’s the same principle behind John Amend’s and Hod Lipson’s universal robotic gripper.
Lately there has been a renewed interest in the field of soft robotics. Big Hero 6 probably has something to do with it =).
Anyway, Harvard Biodesign Lab released a soft robotics toolkit last year. Doing some sort of project with the kit has been on my radar since, but I haven’t had a specific idea. But today a small centipede decided to walk by, and I took a short clip to get a better idea of how it moved. Trying to mimic its locomotion pattern could be an interesting project.
This animation tool can now control servos using socket.io and Johnny-Five, and it’s actually quite fun! But it is far from production ready. I created a repo for the project: https://github.com/Traverso/go-johnny-go and I have already added a bunch of issues: https://github.com/Traverso/go-johnny-go/issues. Nothing major, mostly debris from a rushed proof-of-concept coding spree.
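The socket.io-to-Johnny-Five bridge is roughly shaped like this. The event name ('frame'), the message shape (an array of angles) and the pins are assumptions for illustration, not the repo’s actual protocol:

```javascript
// Pure helper: clamp an incoming angle to the servo's safe range, so a
// buggy or malicious client can't slam the hardware past its limits.
function clampAngle(deg, min = 0, max = 180) {
  return Math.min(max, Math.max(min, deg));
}

// Server side, with hardware attached (sketch, not the repo's code):
//   const five = require('johnny-five');
//   const io = require('socket.io')(3000);
//   new five.Board().on('ready', () => {
//     const servos = [new five.Servo(9), new five.Servo(10)];
//     io.on('connection', socket => {
//       socket.on('frame', angles => {
//         angles.forEach((a, i) => servos[i].to(clampAngle(a)));
//       });
//     });
//   });
```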
This week I got a bunch of 9G servos and some PWM motor shields from my local Adafruit pusher m.nu. I want to use them in a smaller (more affordable) version of Felix.
The redesigned chassis is just a scaled-down version of the current geometry, but I have to work on the casing for the electronics. There is not much space available for a Uno, a shield and a battery pack. If I lay the electronics down flat, Felix loses its slender figure and looks more like a turtle. Currently I’m working on a vertical arrangement, which gives him an unflattering hump, but he seems to prefer that.
Before I settle on and refine the design, I have to test how it affects Felix’s balance during the gait.