Headlining on Indiegogo right now is the Fin wearable Bluetooth ring. While it may not keep you up to date on your messages like some other ring projects we’ve looked at, the Fin aims to be a super lightweight and unobtrusive user input device for any number of Bluetooth gadgets. It took some digging to figure out how it works, and even with that information, one wonders whether they will really be able to deliver the performance and user experience they’re advertising.
How does it work?
A dream of many tech fetishists is a completely unobtrusive user input device that somehow interprets natural human gestures to achieve exactly the result the user intended. Many projects, such as the Nymi, have shown “demos” where blatantly unspecific gestures trigger highly specific tasks, like a simple arm wave opening a car’s trunk. Fin aims to provide a more precise user interface that better utilizes the finer motor capabilities of our fingers to encode a number of hand gestures. From the look of their demonstrations, the ring is able to detect motion thanks to an inertial measurement unit (IMU) as well as interaction between the user’s thumb and the palm of their hand. Swipes between thumb and fingers can be used for less specific purposes (turn volume up, change slide) while taps of the thumb on individual fingers will treat your fingertips like buttons, where each section of each finger can trigger a different action.
It was with this latter feature specifically that I became very suspicious of the team’s claims. If you look at some of the images, it looks like the device sits around the base of the thumb and somehow turns the thumb into a stylus to tap the other fingers.
The top question to answer when validating a crowdfunding project like this is whether there’s an explanation for how the device could work. Sure, the ring has an accelerometer, but the level of precision required for something like this is far beyond what a simple motion sensor can do (especially considering that the other fingers don’t have sensors). I speculated about a possible electrical conductivity sensor or ultrasonic transceiver until I found some more information on their prototypes.
Unfortunately, all of the demonstrations shown in their campaign video are with nonfunctional form-factor mockups. This is a similar marketing strategy to what we saw with TellSpec, where a plastic dummy unit was used as a placeholder to demonstrate the end user experience rather than the current status of the product. In an appearance on TechCrunch’s Hardware Battlefield, however, Rohildev Nattukallingal, founder and CEO of RHL Vision Technologies (the company behind Fin), demonstrated what they have working right now:
As expected for any device currently in development, the prototype is obviously much bulkier than the final product they are advertising. More important than its size and appearance, however, is its level of functionality. As Rohildev demonstrates, the device uses an optical sensor to detect gestures where the user rubs the surface of the device (mounted on the thumb with tape) with his or her fingers. Up, down, left, and right can be mapped to different functions. In this setup, he can control volume and switch audio tracks.
What’s demonstrated is scarcely more complicated than a standard optical mouse or, more appropriately, the BlackBerry optical trackpad. When compared to an optical mouse sensor by one of the TechCrunch panelists, the team responded that their sensor is more sophisticated than a mouse sensor in that it can read the user’s actual fingerprint.
This answers one of the biggest questions we had about this device: how it will map the user’s fingers into buttons. An optical trackpad sensor can detect direction of motion, but it cannot detect absolute position. This is why you can pick your mouse up, set it back down over the same section of your mouse pad, and keep moving the cursor in the same direction. Apparently the team is planning to use features of the user’s skin as landmarks to determine the location of the sensor. I was ready to write this project off until I heard about this method, which is actually pretty clever. I’m surprised they haven’t mentioned it anywhere on their Indiegogo campaign.
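To make the relative-versus-absolute distinction concrete, here’s a toy sketch (my own illustration, not Fin’s code): an optical trackpad only reports deltas, so integrating them gives you a position relative to wherever tracking happened to start, which the sensor data alone can never pin down.

```python
# Toy illustration: an optical trackpad reports relative motion only.
# Summing the deltas gives a position relative to an unknown starting
# point, so absolute position is unrecoverable from the sensor alone.

def integrate_deltas(deltas, start=(0, 0)):
    """Dead-reckon a position by summing relative (dx, dy) reports."""
    x, y = start
    for dx, dy in deltas:
        x += dx
        y += dy
    return (x, y)

deltas = [(1, 0), (2, 1), (0, -1)]

# The same reported motion, started from two different (unknown)
# positions, ends at two different absolute positions -- and the
# sensor cannot tell which one is real.
print(integrate_deltas(deltas, start=(0, 0)))    # (3, 0)
print(integrate_deltas(deltas, start=(10, 10)))  # (13, 10)
```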
This method is very similar to what Swedish technology company Anoto licenses to digital pen companies such as the popular Livescribe. This system attaches optical sensors to pens that read precisely spaced tiny dots printed on special paper to uniquely locate the pen on the sheet:
The dots are placed such that by reading a small number of them, the pen can determine that it is in a specific location on the page and not anywhere else.
Fin is hoping to employ a similar system by replacing these dots with the unique fingerprints and “friction ridges” of the user’s fingers. Fingerprint recognition is typically achieved by analyzing both the fingerprint’s patterns (such as the familiar loop or whorl) as well as so-called “minutiae,” which are small unique features of individual ridges of a person’s fingerprint.
Reading up on minutiae, they are small, distinct features with a unique spatial distribution across a person’s fingerprint, which means they could act as a substitute for Anoto’s special dotted paper. Because everyone’s fingerprint is different, the device cannot be pre-programmed like a smart pen, so the user will have to enroll his fingerprints into the Fin software as described in their FAQ section:
I imagine the process will be something like enrolling a fingerprint on an iPhone 5s. This seems reasonable. After all, Fin isn’t looking to detect the ultra-precise location of a pen on a sheet of paper. They’re only looking to figure out which of only a few finger segments the user is currently touching.
I spent the better part of an evening looking for information on anyone else who has attempted something similar and came up empty-handed. There are numerous devices currently on the market and under development for capturing biometric data from people’s hands, but as far as I can tell, nobody is using that data to determine position. That being said, it’s a pretty clever solution, and if they can capture clear enough fingerprint data and process it fast enough, I think it could work.
It does have a few implications for the user experience though.
Misleading user experience
So far, every single functional prototype Fin has shown has placed some piece of technology between the thumb and whatever surface the thumb is touching.
In these three examples, the device covers the pad of the thumb and is rubbed along the user’s fingers. This makes sense given the optical sensor explanation, but it doesn’t explain the demonstrations in the video where the device is seated around the base of the thumb.
If they are indeed using friction ridge detection as a method of thumb localization, a physical interface is necessary, and many available technologies can provide one, such as optical, capacitive, and ultrasound sensing. Getting the same response without some kind of physical interface between the thumb and fingers could prove much more difficult, if not impossible. I found a few examples of contactless fingerprint scanners online, all of which seemed to use some method of optical sensing that basically amounts to taking a picture and analyzing it. These devices are bulky, as they contain a specific set of lights and lenses to capture fingerprints.
It gets more confusing looking closer at their mockup. It doesn’t appear to have any sensors visible from the outside as shown in this highly informational diagram:
This is a little troubling.
So maybe their mockup is just a stylized version of what the device will actually look like. I’m sure a lot of work can be done to slim down whatever sensor they end up using and it might not look much worse than that. That still leaves the question:
How are you supposed to wear this thing?
In these three examples, we see the user wearing the device with the raised section along the inside of the thumb facing the other fingers. In the first example, it looks like the device is touching the user’s index finger, and in the third example, the user actually rubs his index finger along the device much as we’ve seen demonstrated with the prototype.
But then there are these:
These devices are all being worn around the base of the thumb leaving no opportunity for an optical sensor to get anywhere close to the other fingers.
What’s also strange is the general lack of consistency with how the device is worn in the video. Sometimes, the elevated section is along the pad of the thumb, sometimes it’s on the back by the nail. Left hand. Right hand. Rattling around on a small finger. It’s really all over the place.
Here’s my theory of what’s going on here. The device is meant to be worn as demonstrated in the diagrams on the project page (the first two examples above). This places a section of the ring partially covering the pad of the thumb which puts it in a prime location for direct contact with the other fingers.
With a good idea and some non-functional prototypes, the team brought on a video production crew who took the idea and ran with it. This crew may not have been briefed on the technical limitations of the device and took some liberties with its capabilities. The general premise is still the same: a device on your thumb turns your hand into an input device, but their representation of this is very misleading to their backers.
I think a device which covers up a major sensory part of a person’s body (the thumbprint) offers a very different experience to the user than something as unobtrusive as what they’ve shown. It’s a great alternative to a phone’s touch screen, but for the odd time you want to use the phone’s actual screen, a device covering the thumb will certainly get in the way. It suddenly becomes something that you don’t wear passively like a piece of jewelry, but something you wear actively when you need it.
Processing and security
Biometric measurements are often extremely noisy: dirty sensors, scratches or cuts on fingers, and other physical alterations to the measurement can muddle things up. Getting through this noise to match a fingerprint to a sample requires a substantial amount of digital signal processing, which is hard to do on small, low-power processors.
Curious about what they’re using, I took a closer look at one of their prototypes:
With a little Photoshop, I was able to determine that they’re using an Atmel processor.
And sure enough, the footprint for their processor shown in the super low resolution screenshot of their flexible circuit board layout appears to match that of the ATmega48/88/168, the same processor series used in the popular hobbyist Arduino platform.
This is a very low power 8-bit processor that is great for battery-powered applications where large amounts of data processing are not necessary. Even the largest ATmega328 variant has only two kilobytes of RAM, enough to store about 0.6% of the above image. That is simply not enough power to analyze a stream of video data from a sensor in real time. For something like that, you would need a much more powerful processor, such as a DSP with special functions built in for analyzing this kind of data, as demonstrated in this video from Texas Instruments.
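A quick back-of-envelope check makes the mismatch obvious. The sensor resolution here is my assumption (a modest 128×128 grayscale imager), but the ATmega328’s 2 KB of SRAM is straight from the datasheet:

```python
# Back-of-envelope check (assumed sensor resolution): can an ATmega
# even buffer a single fingerprint frame in SRAM?

frame_width, frame_height = 128, 128   # assumed modest sensor resolution
bytes_per_pixel = 1                    # 8-bit grayscale
frame_bytes = frame_width * frame_height * bytes_per_pixel

atmega328_sram = 2 * 1024              # 2 KB SRAM, per the datasheet

print(frame_bytes)                     # 16384 bytes per frame
print(frame_bytes / atmega328_sram)    # 8.0 -- eight times the chip's SRAM
```

Even one frame is eight times larger than the chip’s entire SRAM, before any matching computation happens at all.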
That leaves two possibilities. Either one of the other components on their PCB is responsible for processing the fingerprint data, or that processing is happening on the phone.
The PCB shows two other main components besides the ATMega48:
My guess is that the component to the top left is some kind of optical sensor, while the device to the bottom right of the ATmega is a Bluetooth radio. The only possibility I can see for processing fingerprints on-device is if their optical sensor has some built-in DSP functionality, but I find this unlikely as they have a very special use case that is unlikely to be covered by an off-the-shelf solution like that. There’s also the possibility that this isn’t a complete circuit, as the device is also supposed to have an accelerometer.
Further evidence for the phone-processing explanation is that their Bluetooth radio is advertised as supporting 2.1+EDR, which stands for Enhanced Data Rate. This type of Bluetooth connection supports data rates as high as 2.1 Mbit/s, which is much higher than what would be necessary for simple button-press and swipe signaling but would be appropriate for streaming raw data to be processed.
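The orders of magnitude here are worth spelling out. The frame size and rates below are my own assumed numbers, but they show why a 2.1 Mbit/s link is wildly oversized for event signaling yet about right for streaming raw sensor data:

```python
# Rough bandwidth comparison (assumed numbers, to show the orders of
# magnitude involved).

MBIT = 1_000_000

# Simple button/swipe events: ~10-byte events, ~5 gestures per second.
event_bits = 10 * 8 * 5
print(event_bits / MBIT)             # 0.0004 Mbit/s -- negligible

# Streaming raw sensor frames: a small 64x64 8-bit image at 30 fps.
frame_bits = 64 * 64 * 8
stream_bits = frame_bits * 30
print(stream_bits / MBIT)            # 0.98304 Mbit/s

# Bluetooth 2.1+EDR tops out around 2.1 Mbit/s: thousands of times more
# than event signaling needs, but comfortable headroom for raw frames.
print(stream_bits / MBIT < 2.1)      # True
```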
Technically, the only real impact of a solution like this is a slight drain on the user’s phone which will need to donate some processing power to analyzing the fingerprint data. This also makes it easier for two users to share the same Fin as their fingerprint profiles will be enrolled in their phones and not their Fins.
From a security standpoint however, offloading the processing does mean finding some way to broadcast fingerprint information wirelessly to a device several feet away and managing it securely on the phone receiving it. I can imagine a few people wanting to keep their fingerprints private who might like to know more about how the Fin team plans on keeping that data secure. No information has been offered.
Can they do it?
Can it be done? Maybe. Any device with this level of sophistication is going to be difficult to make work well. Heck, even Apple’s iPhone fingerprint scanner doesn’t work all of the time. Cramming all of that into a tiny battery-powered wearable package doesn’t help.
That being said, assuming you’re willing to cope with the annoyance of a device at least partially covering your thumbprint, I think that something like what they’ve shown is possible. I am a little concerned about the team behind it though.
RHL Vision looks like an incredibly competent group of software developers, and their background in video processing for user interaction makes them well poised to tackle the software portion of this project. Making hardware is an entirely different animal, though. Their form factor is extremely tight, and taking on something this difficult as a first-ever hardware project makes me worry.
I expect they will ship something, but it won’t look nearly as sleek as their prototype (which by the way doesn’t even show a charging port), and I would be shocked if they ship in the seven months they’ve given themselves.