by gitgud on 10/11/2021, 11:16:06 AM
by tomcooks on 10/11/2021, 4:48:13 PM
This is a GREAT website, I can understand what it does with zero clicks, zero scrolls.
Really great, congratulations, I hope that I can find a way to apply this lesson to my SaaS.
by smoyer on 10/11/2021, 4:59:10 PM
I've been working on a couple of chording keyboard designs and was thinking I might be able to create a virtual keyboard using this library. It would be nice to also be able to recognize the hand from the back. A keyboard would also obviously need to track two hands at a time.
How does the application deal with different skin-tones?
by programmarchy on 10/11/2021, 5:04:53 PM
Was wondering how easy it'd be to port to native mobile, so I went looking for the source code, but it doesn't appear to actually be open source. The meat is distributed as binary (WASM for the "backend" code and a .bin for model weights).
Aside from being a cool hand tracker, it's a very clever way to distribute closed source JavaScript packages.
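For anyone curious about the mechanics: shipping compiled WebAssembly means consumers can call exported functions without ever seeing the source. A minimal, self-contained sketch of the loading pattern (using a tiny hand-assembled module for illustration, not the library's actual binary, which isn't public):

```javascript
// A tiny hand-assembled WebAssembly module exporting `add(a, b)`.
// Real libraries ship a much larger .wasm file plus weight blobs,
// but the loading pattern is the same: bytes in, callable exports out.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // body: a + b
]);

async function loadBackend(bytes) {
  // In a browser you would fetch() the .wasm and .bin files instead.
  const { instance } = await WebAssembly.instantiate(bytes);
  return instance.exports;
}

loadBackend(wasmBytes).then((exports) => {
  console.log(exports.add(2, 3)); // → 5; callable, but the source stays opaque
});
```

The exports behave like any JavaScript module, so from the consumer's side the package looks ordinary even though none of its logic ships as readable code.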
by phailhaus on 10/11/2021, 5:22:42 PM
An "undo" gesture seems necessary; it was a bit too easy to accidentally wipe the screen. Aside from that, this is fantastic! Love to see what WASM is enabling these days on the web.
by iainctduncan on 10/11/2021, 2:41:36 PM
Hi, I'm not sure if you've looked into this or not, but another area that might be very excited about this sort of thing is musical gesture recognition.
by layer8 on 10/11/2021, 9:29:34 PM
What would be nice is a version that can be used to paint on the screen with your fingers, such that the lines are visible on a remotely shared screen. The use-case is marking up/highlighting on a normal desktop monitor (i.e. non-touch) while screen-sharing, which is awkward using a mouse or touchpad (think circling stuff in source code and documents, drawing arrows etc.). That would mean (a) a camera from behind (facing the screen), so that the fingers can touch (or almost touch) the screen (i.e. be co-located to the screen contents you want to markup), and (b) native integration, so that the painting is done on a transparent always-on-top OS window (so that it's picked up by the screen-sharing software); or just as a native pointing device, since such on-screen painting/diagramming software already exists.
by stavros on 10/12/2021, 6:30:56 PM
This looks great! Recently I've been wanting to make a hand-tracking library for video editing. I'd make a hand gesture like an OK with my index and thumb to begin recording, and when I was done I'd make a thumbs up to keep the take or thumbs down to delete a bad take. That way, I could very easily record stuff while only keeping the good takes, to sort out later.
Hell, the library could even stitch the takes together, omitting the times when my hand started/finished doing the gestures.
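The control flow described above is really just a small state machine, independent of how the gestures themselves are detected. A sketch, where the gesture labels ("ok", "thumbs_up", "thumbs_down") are placeholders rather than anything the library actually emits:

```javascript
// Take-recording state machine: "ok" starts a take, "thumbs_up" keeps it,
// "thumbs_down" discards it. All gesture labels are hypothetical.
class TakeRecorder {
  constructor() {
    this.recording = false;
    this.takes = [];     // kept takes, as { start, end } timestamps
    this.current = null;
  }
  onGesture(gesture, timestamp) {
    if (gesture === "ok" && !this.recording) {
      this.recording = true;
      this.current = { start: timestamp };
    } else if (this.recording && gesture === "thumbs_up") {
      this.current.end = timestamp;
      this.takes.push(this.current); // keep the take
      this.recording = false;
      this.current = null;
    } else if (this.recording && gesture === "thumbs_down") {
      this.recording = false;        // discard the take
      this.current = null;
    }
  }
}

const rec = new TakeRecorder();
rec.onGesture("ok", 0);
rec.onGesture("thumbs_up", 10);   // kept: covers 0–10
rec.onGesture("ok", 20);
rec.onGesture("thumbs_down", 30); // discarded
console.log(rec.takes); // → [ { start: 0, end: 10 } ]
```

Stitching would then be a matter of cutting the source video at each kept take's timestamps, trimmed slightly inward so the gesture itself doesn't appear in the final edit.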
by tjchear on 10/11/2021, 7:44:24 PM
This reminds me of TAFFI [0], a pinching gesture recognition algorithm that is surprisingly easy to implement with classical computer vision techniques.
[0] https://www.microsoft.com/en-us/research/publication/robust-...
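TAFFI detects the closed loop formed between thumb and forefinger via image segmentation; with a landmark-based tracker like this one, a common shortcut is to threshold the thumb-tip/index-tip distance instead, with hysteresis so the pinch state doesn't flicker near the boundary. A sketch, assuming landmark coordinates normalized to [0, 1] (the thresholds below are made-up values you'd tune):

```javascript
// Pinch detector with hysteresis: pinch engages below ENTER and releases
// only once the tips separate past EXIT. Thresholds are illustrative.
const ENTER = 0.05;
const EXIT = 0.08;

function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

class PinchDetector {
  constructor() {
    this.pinching = false;
  }
  update(thumbTip, indexTip) {
    const d = distance(thumbTip, indexTip);
    if (!this.pinching && d < ENTER) this.pinching = true;
    else if (this.pinching && d > EXIT) this.pinching = false;
    return this.pinching;
  }
}

const det = new PinchDetector();
console.log(det.update({ x: 0.50, y: 0.50 }, { x: 0.53, y: 0.50 })); // → true  (0.03 < 0.05)
console.log(det.update({ x: 0.50, y: 0.50 }, { x: 0.56, y: 0.50 })); // → true  (0.06 is inside the hysteresis band)
console.log(det.update({ x: 0.50, y: 0.50 }, { x: 0.60, y: 0.50 })); // → false (0.10 > 0.08)
```

The two-threshold band is what makes the classical version "surprisingly easy": a single threshold would toggle rapidly whenever the measured distance hovers near it.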
by brundolf on 10/11/2021, 6:51:57 PM
Bit of feedback: the home page is pretty sparse. The video is great, but it wasn't obvious how to find the repo or where to get the package (or even what language it can be used with). I had to open the Demo, wait for it to load, and click the GitHub link there; only then did the readme tell me it was available on NPM.
Otherwise looks pretty impressive! I've been looking for something like this and I may give it a whirl
by eminence32 on 10/11/2021, 10:40:22 PM
The demo doesn't seem to work on my chromebook. Maybe it's too underpowered?
Web page doesn't say anything after `Warming up...` and the latest message in the browser console is:
Setting up wasm backend.
I expected to see a message from my browser along the lines of "Do you want to let this site use your camera", but I saw no such message.
by jakearmitage on 10/11/2021, 10:31:59 PM
I wish there was a nice open-source model for tracking hands and arms with multiple viewpoints (multiple cameras), similar to commercial software like this: https://www.ipisoft.com/
by WalterGR on 10/12/2021, 3:32:52 AM
Awesome.
Just note that in the demo video, the user is 'writing' everything mirrored.
by tomcooks on 10/11/2021, 4:50:43 PM
BTW this would be great for spaced-repetition foreign-character learning (Chinese, Arabic, Japanese, Korean, etc.): if the drawn figure is similar enough to the character the student is learning, mark it as studied.
Congrats again
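Scoring "similar enough" for a drawn character can borrow the resampling trick from the $1 Unistroke Recognizer: resample both the drawn stroke and a reference template to the same number of evenly spaced points, then take the average point-to-point distance. A minimal sketch (single-stroke only, and omitting the scale/rotation normalization a real recognizer would add):

```javascript
// Resample a polyline to n points evenly spaced along its arc length.
function resample(points, n) {
  const total = points.slice(1).reduce(
    (s, p, i) => s + Math.hypot(p.x - points[i].x, p.y - points[i].y), 0);
  const step = total / (n - 1);
  const out = [points[0]];
  let acc = 0;
  const pts = points.map(p => ({ ...p }));
  for (let i = 1; i < pts.length; i++) {
    const d = Math.hypot(pts[i].x - pts[i - 1].x, pts[i].y - pts[i - 1].y);
    if (acc + d >= step && d > 0) {
      // Interpolate a new point exactly one step along, then keep walking
      // from it by splicing it into the path.
      const t = (step - acc) / d;
      const q = { x: pts[i - 1].x + t * (pts[i].x - pts[i - 1].x),
                  y: pts[i - 1].y + t * (pts[i].y - pts[i - 1].y) };
      out.push(q);
      pts.splice(i, 0, q);
      acc = 0;
    } else {
      acc += d;
    }
  }
  while (out.length < n) out.push(pts[pts.length - 1]); // guard fp rounding
  return out;
}

// Average point-to-point distance after resampling both strokes;
// lower means more similar.
function strokeDistance(a, b, n = 32) {
  const ra = resample(a, n), rb = resample(b, n);
  return ra.reduce((s, p, i) => s + Math.hypot(p.x - rb[i].x, p.y - rb[i].y), 0) / n;
}
```

Marking a character "studied" would then be a threshold on `strokeDistance` against the stored template, tuned per script.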
by inetsee on 10/11/2021, 5:49:22 PM
My first question is whether this could be adapted to interpret/translate American Sign Language (ASL)?
by borplk on 10/11/2021, 6:25:14 PM
Very impressive.
I want something like this so I can bind hand gestures to commands.
For example scroll down on a page by a hand gesture.
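Once a tracker emits a gesture label per frame, binding gestures to commands mostly comes down to debouncing: fire only after a gesture has been held for several consecutive frames, then refuse to re-fire until it is released. A sketch, where the gesture name and hold length are hypothetical:

```javascript
// Debounced gesture-to-command dispatcher. A command fires once after its
// gesture is held for HOLD_FRAMES consecutive frames, and will not fire
// again until the gesture is released. Gesture names are hypothetical.
const HOLD_FRAMES = 5;

class GestureCommands {
  constructor(bindings) {
    this.bindings = bindings; // { gestureName: callback }
    this.lastGesture = null;
    this.heldFrames = 0;
    this.fired = false;
  }
  onFrame(gesture) {
    if (gesture !== this.lastGesture) {
      this.lastGesture = gesture;
      this.heldFrames = 0;
      this.fired = false;
    }
    this.heldFrames++;
    if (!this.fired && this.heldFrames >= HOLD_FRAMES && this.bindings[gesture]) {
      this.bindings[gesture]();
      this.fired = true;
    }
  }
}

let scrolls = 0;
const cmds = new GestureCommands({ palm_down: () => { scrolls++; } });
for (let i = 0; i < 12; i++) cmds.onFrame("palm_down"); // held 12 frames
cmds.onFrame(null);                                     // released
for (let i = 0; i < 5; i++) cmds.onFrame("palm_down");  // held again
console.log(scrolls); // → 2 (fires once per hold, not once per frame)
```

For a continuous action like scrolling you'd likely want the opposite behavior, repeating the callback while the gesture is held; that's a one-line change to drop the `fired` latch.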
by kseistrup on 10/12/2021, 7:09:52 AM
Could it be used to translate sign language signs to written or spoken words, I wonder.
by lost-found on 10/11/2021, 10:15:46 PM
Demo keeps crashing on iOS.
by adnanc on 10/11/2021, 6:50:27 PM
Great idea which is brilliantly executed.
So many educational uses, well done.
by karxxm on 10/11/2021, 6:10:04 PM
Would you provide the related paper to this approach?
by tcper on 10/12/2021, 7:27:32 AM
Great work!
by rglover on 10/11/2021, 6:20:31 PM
Great demo.
by itake on 10/11/2021, 5:06:25 PM
I think these tools are super interesting, but tools like this marginalize users with a non-standard number of limbs or fingers.
The demo really sells it here [1]. It's amazingly intuitive and easy to use; it should be a part of video-conferencing software.
[1] https://handtracking.io/draw_demo/