Christopher Getschmann

I make things. (he/him)

Christopher Getschmann @volzo@mastodon.world
2025-04-04

@floe Ah, I see, it's that time of the year again 😆 I guess this counts as (hay) fever-induced engineering.

Christopher Getschmann boosted:
2025-01-05

For the 4th year in a row, my all-sky camera has been taking an image of the sky above the Netherlands every 15 seconds. Combining these images reveals the changing length of the night throughout the year, the passage of clouds, and the motion of the Moon and the Sun through the sky. #astrophotography

This image shows the hourglass shape of the nights shortening and lengthening from winter to summer and back, passing clouds, the Moon moving through the night sky every month (diagonal bands), and the Sun climbing higher in the sky during summer.
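
Not my project, but for anyone wondering how such a year-long composite can be assembled: below is a minimal sketch that reduces every all-sky frame to its mean brightness and bins it by day of year and time of day. The folder layout and filename format are assumptions for illustration, not the original author's pipeline.

```python
# Minimal sketch (assumed layout, not the original pipeline): every frame in
# frames/ is named like 20240105_231500.jpg after its capture time. Each frame
# is reduced to one brightness value and binned by minute of day (row) and day
# of year (column), producing the "hourglass" image of nights growing and
# shrinking over the year.
from datetime import datetime
from pathlib import Path

import numpy as np
from PIL import Image

ROWS, COLS = 24 * 60, 366               # minute of day x day of year
sums = np.zeros((ROWS, COLS))
counts = np.zeros((ROWS, COLS))

for path in sorted(Path("frames").glob("*.jpg")):
    t = datetime.strptime(path.stem, "%Y%m%d_%H%M%S")
    brightness = np.asarray(Image.open(path).convert("L")).mean()
    row = t.hour * 60 + t.minute
    col = t.timetuple().tm_yday - 1
    sums[row, col] += brightness         # accumulate all frames in this minute
    counts[row, col] += 1

# average per bin, leave empty bins black, and save the composite
mean = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)
img = (255 * mean / max(mean.max(), 1)).astype(np.uint8)
Image.fromarray(img).save("year_of_sky.png")
```
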
Christopher Getschmann @volzo@mastodon.world
2024-11-20

@stk The WP had an interactive overview of this a while ago: washingtonpost.com/technology/

Christopher Getschmann boosted:
flacs @flacs
2024-06-25

as a: frog
I want to: be boiled
so that: I maximize shareholder value

Christopher Getschmann boosted:
þēodrīċ🔸(e/ack thbbft) @theodric@social.linux.pizza
2024-05-30

Coolest video I've seen today: an electronic scanner-paint plotter from 1970!

Christopher Getschmann @volzo@mastodon.world
2024-04-01

@floe Last year some numbers were reported. Looked it up: "The rate of moderation automation is very high at Meta: At Facebook and Instagram, respectively, 94% and 98% of decisions are made by machines". It will differ a bit by language for sure, but yeah, a scam report will probably never be seen by a human unless it's escalated multiple times...

Christopher Getschmann boosted:
Florian 'floe' Echtler @floe@hci.social
2024-02-12

And together with the indomitable @volzo, we present the LensLeech, an all-in-one soft silicone widget that will turn any camera into a tangible input device: dl.acm.org/doi/10.1145/3623509 #tei2024

Christopher Getschmann boosted:
OpenCV @opencv
2023-11-02

The campaign to fund OpenCV 5 is now live! Open your wallets to help secure a future for open source, non-profit, computer vision and AI technology that is freely available to anyone with the desire to learn. igg.me/at/opencv5

Christopher Getschmann boosted:
Emeritus Prof Christopher May @ChrisMayLA6@zirk.us
2023-09-16

Tom Gauld on the #watercrisis

Cartoon: Updated illustrations revealed for 'The Wind in the Willows':

1908: Messing around in a boat. A smiling Ratty rows a serene Mole in a boat along the river

2023: Boating around in a mess. A frowning Ratty rows their boat through garbage & sewage, while Mole wears a gas mask
Christopher Getschmann @volzo@mastodon.world
2023-08-18

@vsaw @floe Sure, always appreciated, but don't expect too much... If you just want to get a feeling for the kind of data that can/could be generated with the app, there are two example datasets available: despat.de/visualize/#dataset_e

Christopher Getschmann @volzo@mastodon.world
2023-08-18

@floe @vsaw Nah, wouldn't make sense. The state of the art is simply moving too fast when it comes to any ML application, and I assume the app is of little use now, almost half a decade later. Compiling from source would probably work, but Android 13 might require additional permissions for the background operation (waking up, CPU wake locks, and camera access with the screen disabled).

Christopher Getschmann @volzo@mastodon.world
2023-07-04

This was, in its entirety, quite a huge chunk of things to test, try, and learn, especially when starting from zero.
If you want to know more about how silicone molding with integrated lenses can be done, I did a separate video about this topic:

youtu.be/EsB0X7UcWaI

If you want to learn how the application examples work, I did a video about the hybrid viewfinder (and viewfinders in general):

youtu.be/7Gr8lYbwCyg

Christopher Getschmann @volzo@mastodon.world
2023-07-04

If there is a pattern printed on the silicone, it's straightforward to detect the deformation of the pattern and infer what the fingers are doing with the silicone (rough sketch below).

You can find the pre-print paper on arxiv: arxiv.org/abs/2307.00152
and additional info here: volzo.de/thing/lensleech/
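
For anyone who wants to play with the basic idea before reading the paper, here is a rough sketch of the kind of tracking involved. It is not the pipeline from the paper; it just uses stock OpenCV feature tracking between a relaxed reference frame and the current frame to estimate how the printed pattern has moved, which is already enough to tell a slide from a press.

```python
# Rough sketch (not the LensLeech paper's algorithm): track the printed
# pattern from a relaxed reference frame into the current frame and summarize
# the motion. A uniform shift suggests sliding; a large spread in the
# per-point displacements suggests pressing/deforming.
import cv2
import numpy as np

def pattern_motion(reference_bgr, current_bgr):
    ref = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)
    cur = cv2.cvtColor(current_bgr, cv2.COLOR_BGR2GRAY)

    # pick well-textured points on the printed pattern in the reference frame
    pts = cv2.goodFeaturesToTrack(ref, maxCorners=200, qualityLevel=0.01,
                                  minDistance=8)
    if pts is None:
        return None

    # follow those points into the current (deformed) frame
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(ref, cur, pts, None)
    ok = status.flatten() == 1
    if not ok.any():
        return None

    flow = (new_pts[ok] - pts[ok]).reshape(-1, 2)
    mean_shift = flow.mean(axis=0)               # dominant slide direction
    spread = np.linalg.norm(flow, axis=1).std()  # crude press indicator
    return mean_shift, spread
```

A simple way to use it would be to grab one reference frame while nothing touches the blob (e.g. from cv2.VideoCapture) and call pattern_motion() on every following frame.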

Christopher Getschmann @volzo@mastodon.world
2023-07-04

Quick summary:

If you want to enable some kind of on-lens interaction, you need to track fingers on or slightly above a camera lens.
With a piece of soft, clear silicone, it's easy to create a protective barrier for the lens, but the finger will be out of focus and just a skin-colored smudge.
By molding the clear silicone into the shape of a positive lens, it's possible to refract the light in such a way that the camera's focus is always on the fingertip when it touches the silicone.
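
A rough back-of-the-envelope illustration (my own assumed numbers, not dimensions from the paper): clear silicone has a refractive index of roughly n ≈ 1.4, so a plano-convex blob follows the lensmaker's equation

\[
\frac{1}{f} = (n - 1)\left(\frac{1}{R_1} - \frac{1}{R_2}\right)
\qquad\Longrightarrow\qquad
f = \frac{R_1}{n - 1} \approx \frac{10\,\mathrm{mm}}{0.4} = 25\,\mathrm{mm}
\quad (R_2 \to \infty)
\]

so even a small dome with a 10 mm radius of curvature acts as a fairly strong positive lens, strong enough to bend the light from a fingertip resting on it toward the camera's focus.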

Christopher Getschmann @volzo@mastodon.world
2023-07-04

Have you ever felt the urge to touch a lens like a button or a joystick? I built some soft silicone blobs that can transform camera lenses into physical input elements.

I tried to squeeze a paper into the shape of a YouTube video: youtu.be/lyz52IzMcnM

Christopher Getschmann @volzo@mastodon.world
2023-07-02

Recently I was looking into 1D LIDAR scanning for facade measurements, and it's crazy how hard it is to find one that works well and doesn't break the bank. The 6-year-old Garmin Lidar V3 still has the best price/range/precision ratio, while all of its successors and competitors perform worse, cost 3x as much, or both. Looks like shitty 2D sensors (1D sensors that spin very fast) have captured the whole market...

Christopher Getschmann @volzo@mastodon.world
2023-06-23

The hard and labor-intensive part is the UX for the vector-lines interface. Yet they offer their service by charging $99 for the plastic part.

I guess they could simply allow people to print a pattern on a regular piece of paper (just as they do with their computer-vision-assisted router) and charge a few cents for the web service. Do people really hate pay-what-you-use so much?

Christopher Getschmann @volzo@mastodon.world
2023-06-23

So, Shaper has a new product: a plastic computer vision marker frame to digitize hand-drawn lines from a photo in a web app:

kickstarter.com/projects/shape

The marker pattern looks very much like Pi-Tag by Bergamasco et al., but I guess they don't really need all the fancy tricks of that paper.

I generally very much like what they do (and they are one of the few successful companies that emerged from the field of human-computer interaction). But the pricing is a bit weird on this one.

Christopher Getschmann @volzo@mastodon.world
2023-05-31

A while ago I stumbled upon the fact that cameras can reverse the perspective of an image. The only thing you need for that is a lens larger than whatever you want to take a photo of.

Because I enjoy taking the most horrible photos of faces I can, I set out to find the largest lens I could get.

If you're curious how that looks, there is a YouTube video: youtu.be/d0Njtko93RQ

Christopher Getschmann @volzo@mastodon.world
2023-05-26

In case you went to a wedding recently (or you are old, sorry), you may know disposable cameras. ~27 pictures on film and a lot of plastic waste.
I wondered if it's possible to make something useful with them instead of throwing them away. The answer is: kind of.

Re/Upcycling disposable cameras for toy lenses:

Video: youtube.com/watch?v=mnvB71b50w

Blog post: volzo.de/thing/recyclinglenses
