A test using a dissolve shader + a skinned mesh renderer as a particle source.
References:
https://github.com/keijiro/Smrvfx
https://www.youtube.com/watch?v=taMp1g1pBeE
I combined several particle systems in Unity's VFX Graph to make a face-scanning effect.
References: https://github.com/keijiro/Dkvfx
In 2013, I programmed a custom blob detection and filtering application for identifying robots in the Symbiosis project. The application supported background subtraction, contrast stretching, blob detection, filtering blobs by area, and sending the data over the OSC protocol. There are many blob trackers out there today; the main asset of this code was that it integrated the great ofxUI library with OpenCV, making it easy to build nice interfaces for OpenCV applications.
Needless to say, this was coded in openFrameworks.
Find the code here.
VBOT, computer vision for an interactive application, from Sergio Bromberg on Vimeo.
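The blob-filtering step described above (label connected regions, then keep only blobs whose area matches a robot's expected size) can be sketched without OpenCV. This is a minimal, illustrative Python version of the idea, not the original openFrameworks code; all function names are hypothetical.

```python
from collections import deque

def find_blobs(mask):
    """Label 4-connected components in a binary grid (list of lists of 0/1).
    Returns a list of blobs, each a list of (row, col) pixel coordinates."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one component with a BFS queue.
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                blobs.append(blob)
    return blobs

def filter_by_area(blobs, min_area, max_area):
    """Keep only blobs whose pixel count lies within [min_area, max_area]."""
    return [b for b in blobs if min_area <= len(b) <= max_area]

# A toy binary mask: one 4-pixel blob (a robot) and one 1-pixel speck (noise).
mask = [
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
blobs = find_blobs(mask)
robots = filter_by_area(blobs, min_area=2, max_area=100)
print(len(blobs), len(robots))  # 2 1 — the single-pixel speck is rejected
```

In the real application the mask would come from background subtraction, and each surviving blob's centroid would be sent out over OSC.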
This is a two-camera rig for shooting 360 video. I use it with 220 degree lenses, which yields an output resolution (after stitching) of around 3600×1800. At Alterlab we use this rig for light works with little time for post-production where a 3600×1800 resolution is enough. I modeled this rig with the camera operator in mind: you can access all camera ports (HDMI, USB), SD card, buttons, and battery slot. The rig has a cap that secures the cameras, and also works as a holder for an accesory. We use it for a Zoom H2N recorder. The rig requires 2 M3x16 hex bolts and 1/4′ bolt. I use it with a “coupler” in order to attach it to a tripod.
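The numbers above fit together in a simple way: two back-to-back 220-degree lenses give more than full spherical coverage, and the excess beyond 180 degrees per lens is what the stitcher can use to blend each seam. A small sketch of that arithmetic (the function name is mine, not from any stitching tool):

```python
def seam_overlap(fov_degrees):
    """Overlap (degrees) available for blending at each seam of a
    back-to-back two-lens rig; each lens only needs 180 degrees."""
    return fov_degrees - 180

fov = 220
print(seam_overlap(fov))  # 40 degrees of shared coverage per seam

# The 2:1 equirectangular output mentioned in the post:
width, height = 3600, 1800
deg_per_px = 360 / width  # 0.1 degrees of longitude per pixel
print(deg_per_px)
```

That 40-degree overlap is generous, which is part of why a quick no-post workflow is viable at this resolution.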
I wrote these Unity scripts while studying ways of eliciting presence in virtual reality. You can:
Check this video to see an interface for recording meshes, and this video for an example of recorded meshes played back in virtual reality.
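Recording meshes for later playback boils down to serializing a vertex snapshot every frame and streaming the snapshots back in order. The actual Unity scripts are not reproduced here; this is a hypothetical Python sketch of one possible frame format (a vertex count followed by packed xyz floats), with illustrative names throughout.

```python
import struct
import io

def write_frame(stream, vertices):
    """Append one mesh frame: a uint32 vertex count, then
    three little-endian float32 values (x, y, z) per vertex."""
    stream.write(struct.pack("<I", len(vertices)))
    for x, y, z in vertices:
        stream.write(struct.pack("<3f", x, y, z))

def read_frames(stream):
    """Yield recorded frames back, each as a list of (x, y, z) tuples."""
    while True:
        header = stream.read(4)
        if len(header) < 4:
            return  # end of recording
        (count,) = struct.unpack("<I", header)
        yield [struct.unpack("<3f", stream.read(12)) for _ in range(count)]

# Record two frames into an in-memory buffer, then play them back.
buf = io.BytesIO()
write_frame(buf, [(0.0, 1.0, 2.0), (3.0, 4.0, 5.0)])
write_frame(buf, [(6.0, 7.0, 8.0)])
buf.seek(0)
frames = list(read_frames(buf))
print(len(frames), len(frames[0]))  # 2 2
```

A Unity implementation would typically pull the vertices from a baked skinned mesh each frame and rebuild a `Mesh` from the stored snapshots on playback.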