[Thread thumbnail]
Hi folks,
I’ve been attempting to do “EveryDay” VFX / tech art sketches in my spare time since Nov last year. Managed about a 2 in 3 rate as I just can’t do it every day, but I’m still pretty happy with some of the doodlings and experiments that have come out of it.
This month, the experiments have mostly been with LIDAR point cloud data from NYC with audio-reactive inputs – e.g. today’s is:
You can scroll through them from the start in this YouTube playlist if you feel so inclined (some of the days are missing where it was a static image, e.g. Aerialod maps/heightmap renders):
https://www.youtube.com/playlist?list=PLMqpdxi5SkXLHl5REBvOPOnOfW13muL80
I was mainly tweeting the dailies: https://twitter.com/duncanfewkes
But since that site’s going to crap, I’m trying out Mastodon: dunk (@dunk@mastodon.gamedev.place) – Gamedev Mastodon
Not sure if I’ll be spamming this thread with every single daily, or just with the ones I’m particularly pleased with.
Any feedback, ideas etc. more than welcome! I already have a sizeable ideas.txt file, but I find it helpful to have cool ideas on the backburner for when I get bored with whatever I’m messing about with, so the more the merrier
Cheers!
7 Likes
#EveryDay 289: Working through the current audio-reactive point cloud experiments. The first 2 are the most polished state, but there’s still scope for refining colours. Will need to trim out redundant test variants of the pull/push styles – will keep the “volumetric” ones, and maybe work on nicer colour gradients, as the ones in there are pretty much stock Unity VFX Graph ones with tweaked brightness.
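The pull/push idea is basically: displace each point along some direction by an amount driven by an audio band’s amplitude. A rough Python sketch of the general idea – not the actual VFX Graph setup; the point data, audio frames, band range and smoothing factor here are all made-up stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)
points = rng.random((10000, 3))   # stand-in for the LIDAR point cloud
center = points.mean(axis=0)

def band_amplitude(samples, lo_bin, hi_bin):
    """Average FFT magnitude over a bin range of one audio frame."""
    spectrum = np.abs(np.fft.rfft(samples))
    return float(spectrum[lo_bin:hi_bin].mean())

smoothed = 0.0
for frame in range(4):
    audio = rng.normal(size=1024)       # stand-in for one frame of audio input
    amp = band_amplitude(audio, 2, 32)  # e.g. a bass-ish band
    smoothed = 0.8 * smoothed + 0.2 * amp  # temporal smoothing to avoid flicker

    # Pull (negative scale) or push (positive scale) each point along its
    # direction from the cloud's center, from the rest positions each frame.
    dirs = points - center
    displaced = points + 0.01 * smoothed * dirs
```

In the real thing this kind of mapping sits in VFX Graph nodes with the audio amplitude exposed as a bound property, but the shape of the computation is the same.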
1 Like
Tweaked colour gradients a bit to square things away for these point cloud vfx:
Started on more variants of fluid sim motion triggered particles:
1 Like
“Whaaaaaaat” is my response watching this, very very cool
1 Like
#EveryDay 296: Shiny fish variant. No schooling/boids behaviour, still spawning from the fluid sim velocity buffer and then moving via a turbulence noise field (modulated by fluid sim speed).
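In pseudocode terms that spawning + motion works out to something like this rough Python sketch – the grid data here is random stand-in rather than a real fluid sim, and the turbulence function is just a cheap placeholder for a proper curl noise field:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the fluid sim outputs (in the real setup these come
# from the simulation's velocity/speed buffers, not random data).
GRID = 32
fluid_velocity = rng.normal(size=(GRID, GRID, GRID, 3)).astype(np.float32)
fluid_speed = np.linalg.norm(fluid_velocity, axis=-1)

def sample(buffer, positions):
    """Nearest-neighbour sample of a grid buffer at normalized [0,1) positions."""
    idx = np.clip((positions * GRID).astype(int), 0, GRID - 1)
    return buffer[idx[:, 0], idx[:, 1], idx[:, 2]]

def turbulence(positions, t):
    """Cheap placeholder for a turbulence/curl noise field."""
    return np.stack([
        np.sin(7.0 * positions[:, 1] + t),
        np.sin(7.0 * positions[:, 2] + 1.3 * t),
        np.sin(7.0 * positions[:, 0] + 2.1 * t),
    ], axis=-1)

# Spawn particles where fluid speed is high (thresholded), then advect
# them through the noise field, modulated by the local fluid speed.
positions = rng.random((1000, 3)).astype(np.float32)
alive = sample(fluid_speed, positions) > 1.0   # spawn condition
positions = positions[alive]

dt = 1.0 / 60.0
for frame in range(10):
    speed = sample(fluid_speed, positions)[:, None]
    positions += dt * speed * turbulence(positions, frame * dt)
    positions = np.mod(positions, 1.0)  # wrap inside the unit volume
```

The nice part of modulating by fluid speed is that particles go quiet wherever the sim goes quiet, so the motion reads as “caused by” the fluid even though the advection itself is just noise.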
1 Like
Wow, that looks good! You should make a GIF and add it as the first image in your first post so that it gets used as the thumbnail (or manually add one as a thumbnail). I almost didn’t click on your thread because there was no engaging thumbnail. But there are very interesting experiments in here!
1 Like
Thanks! Good tip on the gif thumbnail, will do at some point.
If you like these ones, check my YouTube for the other 300 or so videos
1 Like
Looks super interesting! Great stuff
I’ve seen these LED walls used for realtime movie VFX production a few times in person. Is the moiré an issue for the person controlling the effects?
Because up close I always found it somewhat disturbing / confusing
1 Like
For this one it’s not great, because the LED panels are 5mm pixel pitch and designed for a viewing distance of 10+ meters (the section shown is 1% of an LED installation going into an indoor theme park as a wraparound screen all the way around a central hub section).
So at the distance the Azure Kinect camera works (max depth about 6 meters, but motion picked up better around 3 meters) the viewing experience isn’t great for that screen. You can even see significant moiré in the recording – not helped by my iPhone 8 being pretty old and low-res now. The iPad Pro sensor, especially using the wide FoV lens/zoom, gives much less moiré when recording from the same distances and closer.
However, with much finer pixel pitch LED panels it’s much better – e.g. IIRC these ones are around 2mm pixel pitch and designed/specced for viewing distances around 2 meters (used as a body-tracked game installation on a cruise ship, with an LED floor in front so the visuals have to work up close):
There’s loads of interesting info about LED volumes for TV/film these days – AFAIK they use a similar pixel pitch (around 2mm) but the cameras filming have much higher resolution sensors and are further away than 2m, so moiré from pixel pitch isn’t an issue, but the panels have to be calibrated to the camera sensor response, not the human eye. This is one of the best articles I’d read when ILM/Favreau/Mandalorian first started showing off virtual production: The Mandalorian: This Is the Way – The American Society of Cinematographers (en-US)
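Those pitch/distance numbers line up with the usual rules of thumb: minimum comfortable viewing distance in meters roughly equals pixel pitch in millimeters, and individual pixels stop being resolvable to a typical eye (~1 arcminute acuity) a few times further out again. A quick Python sanity check of those figures:

```python
import math

def retina_distance_m(pitch_mm, acuity_arcmin=1.0):
    """Distance at which a pixel of the given pitch subtends `acuity_arcmin`
    of visual angle, i.e. where individual pixels stop being resolvable
    to an eye with that acuity (~1 arcminute is typical)."""
    theta = math.radians(acuity_arcmin / 60.0)
    return (pitch_mm / 1000.0) / math.tan(theta)

# Rule of thumb: minimum comfortable distance in m ~ pitch in mm.
for pitch in (5.0, 2.0):
    print(f"{pitch}mm pitch: rule-of-thumb min ~{pitch:.0f} m, "
          f"pixels invisible at ~{retina_distance_m(pitch):.1f} m")
```

So a 5mm-pitch wall only fully disappears around 17m out (hence the 10+ m spec), while 2mm panels are already acceptable at a couple of meters even though they aren’t strictly “retina” until ~7m.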
3 Likes
Yes, camera calibration and a minimum distance of ~5m sounds familiar. The setup in that video though, wow, that looks utterly magical
finally feeling like a magician
1 Like
Not really vfx, but still fun – trying out the Unreal Engine Lyra Starter Game with nDisplay head tracking and active stereo on the LED wall: