#handTracking

2026-03-11
Hand Tracking with MediaPipe (Task API)

Real-time hand tracking using the MediaPipe Task API and a TensorFlow Lite model.
The 21 hand landmark points are detected live and displayed as a skeleton. I used my old PlayStation 2 EyeToy camera with a resolution of 640×480 px.

Such systems can be used for gesture control, motion capture, VR/AR interaction, touch-free interfaces, robotics, or even computer games and creative projects.

Similar techniques can be used to implement other forms of computer vision, such as face or eye tracking, by using the corresponding model instead of the hand model.

Video workflow:

- Recorded with OBS
- Edited in Kdenlive
- Transcoded with VAAPI (H.264)
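The VAAPI transcode step looks roughly like this ffmpeg invocation. A sketch, not my exact command: input/output file names and the quality setting are placeholders, and the render node path may differ on your system.

```shell
# Hardware-accelerated H.264 transcode via VAAPI (Intel/AMD GPUs on Linux).
# /dev/dri/renderD128 is the usual render node; adjust if yours differs.
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -hwaccel_output_format vaapi \
       -i input.mkv \
       -vf 'format=nv12|vaapi,hwupload' \
       -c:v h264_vaapi -qp 23 \
       -c:a copy \
       output.mp4
```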

Everything runs on Linux + Python (FOSS), so anyone can set this up.

Background music: Kenke - Counting Stars (Rock Version) [Nightcore] (https://www.youtube.com/watch?v=y8OwQo225cI)

#ComputerVision #MediaPipe #MachineLearning #HandTracking #Python #Linux #OpenSource #RetroTech #EyeToy

Evangelion: Δ Cross Reflections, the new XR game that bets everything on hand tracking

fed.brid.gy/r/https://www.gala

The 30th anniversary of Neon Genesis Evangelion was celebrated in Tokyo with an event that drew fans and industry insiders. At the close of the show came one of the most anticipated announcements: Pixelity officially presented EVANGELION: Δ CROSS REFLECTIONS, a new XR project inspired by the famous anime series.

The focus of the presentation was clear from the start: advanced interaction built exclusively on hand tracking.
Metaverse 💞 beyond.pictures (@metaverse@eicker.news)
2026-02-25

Evangelion VR game will primarily use hand-tracking: The upcoming EVANGELION: Delta Cross Reflections, announced at the 30th Anniversary event in Tokyo, features controller-free interaction, allowing players to perform all actions using hand and finger movements. The three-part series begins in 2026. roadtovr.com/evangelion-vr-gam #Metaverse #VR #AR #Gaming #HandTracking

Meta Quest v85 introduces the Surface Keyboard: typing on any surface is now possible

fed.brid.gy/r/https://www.gala

Meta has released a new Quest update that introduces several notable features, including an experimental function called Surface Keyboard, designed to allow typing on any flat surface. With this update, the VR platform broadens its interaction options, pushing more decisively toward productivity and everyday use.

The Quest v85 update is currently rolling out through the Public Test Channel (PTC) and is the first major release since the recent reorganization of Reality Labs, announced by Meta with the goal of concentrating more resources on artificial intelligence and smart…

galal (@hololux)

The post describes a medical imaging viewer for patient education, built on Unreal Engine 5, said to be the world's first that works glasses-free and supports hand tracking. The developer notes that, as a solo developer, running this service currently costs more than $1,200 a month.

x.com/hololux/status/201359794

#unrealengine #medicalimaging #handtracking #spatialcomputing

Meta Quest improves hand-tracking with the v83 update: more precision, but doubts remain

fed.brid.gy/r/https://www.gala

Meta has released a new update for Meta Quest with the stated goal of noticeably improving hand-tracking performance and reliability. Version 83, currently rolling out on Horizon OS, introduces a series of optimizations intended to make hand tracking more precise in demanding situations, such as fast movements, locomotion, and throwing virtual objects.
Andrew Wang (@andyman404)
2025-11-18

WIP: Open palm to show the palette of pieces. Grab a unit/defense and place it. This is what it looks like w/ about 300 units placed.

Thankful for Unity's ECS/DOTS & Meta's spacewarp feature for making this performance possible. Recorded on Meta Quest 3.

Andrew Wang (@andyman404)
2025-11-17

WIP: I'm old & achy, but now I can grab things from across the room simply by wiggling my fingers at them!

Recorded on Meta Quest 3

Andrew Wang (@andyman404)
2025-11-14

WIP: More finger slingshot action w/ hand-tracking in . Smoother and the rubber band conforms to the object being launched. Also there's fun sounds!

Andrew Wang (@andyman404)
2025-08-17

WIP 1 - "Home Is Where The Trash Is".

This is a VR hand puppet game I'm making for Sizzling Shrimp VR Summer Jam. Turn on sound to hear me introduce the game through the raccoon!

AR Glasses: Real Smart Glasses (@arglasses)
2025-06-05

»Here’s what’s inside ’s experimental new smart glasses: with advanced and capabilities.« theverge.com/news/679707/meta-

N-gated Hacker News (@ngate)
2025-05-04

😆 Oh, joy! Another repository promising "easy" hand tracking, because who doesn't want their satellite to have a tenuous grip on reality? 🚀 With a menu more bloated than a engineer's ego, you'll be drowning in before you can even say "artificial intelligence". 🤖
github.com/benb0jangles/EzTrak

Toby on AR (@augmentedorg)
2025-02-19

My good friends at
@mixed-en.bsky.social
did a comparison for hand tracking on and . Seems like PSVR is getting better, but still both are not perfect. We still have to "learn" how to use our hands in VR to avoid glitches... mixed-news.com/en/playstation-

Now editing the new post for the #ParticularReality #DevLog, to be published in a few hours.

Meanwhile, it's #screenshotsaturday!

#vrdev #indiegamedev #gamedev #prototyping #handtracking #bodytracking #gamedesign #metaquest

First person perspective, looking towards a portal. Orange particles show the gesture to perform, and the player hands prepare to execute it.
Francis Mangion (M) (VR/AR/*) (@franciswashere@arvr.social)
2025-01-09
2025-01-05

Last week I tried to help a fellow pianist get the hang of #strumming which included filming our playing to illustrate differences in technique.

It later occurred to me that we now have real-time #handtracking on phones, and I’m wondering if anyone has developed a strumming instruction app. Personally, I’d find a graph of arm movements handy (pun intended).

You know when you wait for the holidays so that you can finally work in peace? :D

The #ParticularReality #DevLog will be back this Saturday!

#happyHolidays #MerryXMAS #gamedev #vrdev #prototyping #handtracking #bodytracking #metaquest3
