Wednesday 8 October 2014

The beauty of Infra-Red over a WebSocket

This post will outline the new IR sensors we have built, the problems we faced building them, and how they were integrated with the software.

Hardware

The initial IR sensors we built were cheap and dirty (figuratively, of course). Using an all-purpose phototransistor turned out not to be a good idea. It picked up far too much ambient light, so wherever we went the phototransistors were affected by a non-trivial amount. This made it quite hard to decide what levels of contrast the software would need.

Furthermore, the IR sensors were affected by the type of material (more importantly, its colour) and by how far away each object was. For example, this system does not work well with black objects. This is a limitation that also applies to the Leap Motion; it is a limitation of Infra-Red itself - black objects absorb it so well!

Thus, I examined some other types of sensors we could use. The first idea I researched was whether you could modulate the signal coming from the IR LED and have a receiver tuned to that carrier frequency, so that the fingers' sensors wouldn't interfere with one another. To this end, I found a receiver called the TSOP4838, which receives signals modulated at around 38kHz. Modulating the light of the LED at 38kHz was not difficult - it was a very simple RC circuit (I also experimented using a function generator at 38kHz).
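As a rough sanity check of the "simple RC circuit" idea, a common way to generate a carrier near 38kHz is a 555 timer in astable mode. The component values below are hypothetical (the post doesn't give them); the snippet just evaluates the standard 555 astable frequency formula:

```javascript
// Hypothetical component values for a 555 astable oscillator near 38 kHz.
// These are assumptions -- the original circuit's values were not recorded.
const R1 = 1000;   // ohms
const R2 = 18000;  // ohms
const C = 1e-9;    // farads (1 nF)

// Standard 555 astable frequency formula: f = 1.44 / ((R1 + 2*R2) * C)
const f = 1.44 / ((R1 + 2 * R2) * C);
console.log(Math.round(f)); // ~38919 Hz, close enough to the TSOP's passband
```

The TSOP's band-pass filter is fairly tolerant, so a carrier within roughly a kilohertz of its centre frequency is still detected well.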


However, to my dismay, I found that the output of this TSOP was digital in nature. It would go HIGH or LOW, but would not output an analog value. This was not useful for our purposes, as the sensors needed to give a different analog value depending on how far away an object was. Interestingly, I found out that these TSOP receivers are more generally used for remote-control systems and thus have a long range. I was able to pick up the signal of this modulated LED from quite a fair distance - maybe 60-100cm. The LED I used has quite a narrow viewing angle, and I was surprised the receiver could still pick it up. In the end, I probably could have worked out a solution using the TSOPs, but instead I hit on a much simpler idea.

Infrared phototransistors: they work essentially the same as the phototransistors we were using before, but have a built-in filter for Infra-Red light! This meant the software needed only minimal changes, and a much more accurate level could be read. A note on the IR phototransistors and the IR LEDs - they have quite a narrow viewing angle, around 15 degrees, so the IR sensors are highly directional. In the future, it may be valuable to install LEDs with a wider viewing angle, or to install more of these sensors so more parts of the hand can have haptic feedback.

The circuitry sketches are shown below:


This sketch shows the basic idea of the phototransistor circuit


The top half of this drawing is the ideas behind how to actually build the VeroBoard for this sensor and how small I could make it.

The final product is below! Shen will detail more about the integration of the sensors and the fingers in another post - suffice it to say that these turned out quite well!




The only problem with these sensors is that they were not identically made. Even the slightest difference in the heights of the phototransistor and the LED would cause a very different reading, since the angle of each LED and phototransistor determines the optimal distance it can measure. The next section will detail the software readings and the problems that came with the IR sensors.

 

Software

In an earlier post, I detailed the way that the Arduino would work in reading the IR data. The polished version in JavaScript is below:

The levels I have chosen were experimentally determined. I decided on thresholds of 15cm-10cm, 10cm-3cm and 3cm-0cm. This was a somewhat arbitrary decision, but it gave what I felt to be the right amount of distance for a change to occur; if the outermost threshold were further than 15cm, the motors would start vibrating too early. This is what the checklevel() function in the above code does - it outputs a level from 0-3 for the intensity of the vibration (a later post will detail the software of the motors).
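The checklevel() idea can be sketched as follows. The raw ADC thresholds here are hypothetical placeholders - the real values were tuned per finger - but the shape of the function (three cut-offs mapping a reading to a level from 0 to 3) matches the description above:

```javascript
// Sketch of checklevel(): map a raw IR reading to a vibration level 0-3.
// Threshold values are hypothetical; reflected IR increases as objects
// get closer, so higher readings mean nearer objects.
const NEAR = 600; // reading above this ~ under 3cm   -> level 3
const MID = 300;  // ~ 3cm-10cm                       -> level 2
const FAR = 100;  // ~ 10cm-15cm                      -> level 1

function checklevel(reading) {
  if (reading > NEAR) return 3; // strongest vibration
  if (reading > MID) return 2;
  if (reading > FAR) return 1;
  return 0; // nothing in range, no vibration
}

console.log([50, 150, 400, 700].map(checklevel)); // [ 0, 1, 2, 3 ]
```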

Using Arduino code, the following values were recorded when there was nothing in front of the IR sensor:
Thumb: 9
Index: 5
Middle: 8
Ring: 10
Pinky: 5
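Since each finger idles at a slightly different value, one way to make the thresholds comparable across fingers is to subtract the idle reading first. This calibration step is my own sketch (the original code may handle it differently), using the idle values recorded above:

```javascript
// Hypothetical calibration: subtract each sensor's idle reading so all
// five fingers start from zero. Idle values are those recorded above.
const baseline = { thumb: 9, index: 5, middle: 8, ring: 10, pinky: 5 };

function calibrate(finger, raw) {
  // Clamp at zero so noise below the idle value doesn't go negative.
  return Math.max(0, raw - baseline[finger]);
}

console.log(calibrate('ring', 10));  // 0 -- idle reading maps to zero
console.log(calibrate('thumb', 60)); // 51
```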

The thresholds you see in the above code come from holding a white sheet of paper at differing distances in front of the IR sensors. This gives the best response; other colours give varying responses (i.e. the distance thresholds shift slightly), but the system works well with a variety of objects, as will be shown in a later demo.

The reading of the sensors comes from each of the small sections that start with "Finger.on('data'...". As NodeJS is asynchronous, as soon as there is data to read from a finger, it is taken and used to update the 'levelarray' that contains the intensity of response for each finger. The array is kept small because I felt it would be easier and quicker to send five fixed numbers over the socket than a fluctuating array. It is a small difference, but it turns out to be simpler to handle anyway.

Below is what you receive on the client-side:
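On the client side, the five levels can be recovered by parsing each WebSocket message back into an array. The comma-separated message format here is an assumption about how the numbers are sent:

```javascript
// Hypothetical client-side handler: five levels arrive as a
// comma-separated string and are parsed back into numbers.
function parseLevels(message) {
  return message.split(',').map(Number);
}

// In the browser this would hang off the socket, e.g.:
// ws.onmessage = event => console.log(parseLevels(event.data));
console.log(parseLevels('0,3,0,0,0')); // [ 0, 3, 0, 0, 0 ]
```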



You can probably see when we introduced an object in front of the fingers, and when we took it away! You will also see that the fingers are not always consistent in their readings - this is due to the asynchronous nature of the calls, which are quite sensitive to change. It does not affect the use of the system, however, and so does not cause a problem.


Coming soon: vibration motor software integration and fixing wiring problems
