Matija
Erceg
15 years in interactivity, design, and tech.
Currently focusing on design and development for interactive experiences and performances.
This page is video-heavy, so it's best viewed on a desktop browser 🤙
Overworld Navigation System
An overworld with points of interest that can be visited, each opening a browser view of a specific Twitch channel alongside a blank card ready to be populated in real time.
OBS Window Capture > TD via NDI
Card editor interface using Custom Parameters in a Popup Window, including copy, number stats, image URL, and card type preset
Automated countdown timer and voting by viewers
Animated points of interest dynamically connect to an arbitrary number of specifically named OPs in a designated part of the network
POI placement controlled by OP node position for parameter-less positioning
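The node-position trick above can be sketched in a few lines of Python: the network editor's node coordinates become the POI's screen position, so rearranging nodes repositions points of interest without touching any parameters. A minimal sketch; the coordinate ranges are illustrative placeholders, not the actual network's values.

```python
# Sketch: remap a TouchDesigner network node position to screen-space POI
# coordinates, so moving a node in the network moves its point of interest.
# Ranges below are hypothetical; in TD the inputs would come from
# op('poi1').nodeX and op('poi1').nodeY.

def node_to_screen(node_x, node_y, net_min=(0, 0), net_max=(1000, 1000),
                   screen=(1920, 1080)):
    """Linearly remap a network-editor position to pixel coordinates."""
    u = (node_x - net_min[0]) / (net_max[0] - net_min[0])
    v = (node_y - net_min[1]) / (net_max[1] - net_min[1])
    return (u * screen[0], v * screen[1])

print(node_to_screen(500, 250))  # (960.0, 270.0)
```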
Instancing and Video Feedback
The performer controls Geometry instances in XYZ space, addressing each subset of instances through a combination of MIDI and wireless keypad controllers. Combined with smooth video feedback, optimized never to drop frames, this creates what we refer to as Visual ASMR.
One-thousand plus instances with no performance drop, across several geometries
Optimized HAP video encoding with transparency
Manually rotoscoped assets
Custom Object Browser and Placement Engine
A module that allows a database of Geometries to be browsed (top right) and placed in an orthographic, isometric space using an Xbox controller
Performant, using Geo instancing instead of Replicator
Highlighting, 3D Cursor, Positioning, Scaling
All controls mapped via custom module that converts Xinput data to CHOP data for each object
Session storage via CHOPs and DATs
Performer can move inside the scene using a combination of their own body position and a macro mover mapped to a controller button+stick combo
Custom extruder object generator that creates thick objects from just a PNG with transparency and a few sliders/toggles as inputs
'Reality View' Perk
Patron pledges, via an API-based alert, prompt the performers to enable a horizontal wipe transition between the post-processed billboard and the untouched raw camera footage, creating an interesting effect and a spending incentive.
Intensive AR Styling
In this scene, the performer is standing in a brightly lit green screen set. By filtering the negative (keyed) footage and reapplying it as object textures, we're able to affect both the virtual and real-world objects with saturated directional lighting.
Custom PBRs
Lighting presets inside of one scene
CPU Particles
Audience controlled bubbles
Audience controlled tube release
Fast Scene Creation
Mixture of pre-existing 3D FBX assets and in-app modeled Geometries for fast scene pre-vis and WYSIWYG layouts
Performer controlled camera dolly jump points
A double lighting system to account for mixing of PBR and Phong MATs
Noise generated terrain with texture-sourced vertex displacement for quick grass styling
Lots of Screen Space Reflection
"Communal Jukebox" Song Sequencer
A module that lets viewers use Twitch bits, subscriptions, donations, and Channel Point redeems to trigger individual tracks of a song, each playing in perfect tempo with precise start/end points
Custom Twitch API bridge triggers a per-track queue
Each user is credited above the currently playing track they triggered
Runs in its own TouchDesigner instance for optimized, drop-free playback
Custom mixing, VST, and ducking componentry
Tied to master timeline for frame-perfect playback and start/end timing
Custom 'smoke' visualization of track playback
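The per-track queue described above boils down to a simple pattern: Twitch events enqueue requests, and the master timeline pops one request per quantized start point so every track begins on tempo. A minimal sketch; the event names and track mapping are illustrative, not the actual bridge.

```python
from collections import deque

# Sketch of the per-track trigger queue: Twitch events (bits, subs,
# channel-point redeems) enqueue a (user, track) request; playback pops
# one request per quantized start point so tracks always begin on tempo.
# Event kinds and track names here are placeholders.

TRACK_FOR_EVENT = {'bits': 'drums', 'sub': 'bass', 'redeem': 'lead'}

queue = deque()

def on_twitch_event(user, kind):
    track = TRACK_FOR_EVENT.get(kind)
    if track:
        queue.append((user, track))

def on_bar_start():
    """Called by the master timeline at each quantized start point;
    returns the (user, track) to begin playing, or None if idle."""
    return queue.popleft() if queue else None

on_twitch_event('alice', 'bits')
on_twitch_event('bob', 'redeem')
print(on_bar_start())  # ('alice', 'drums')
```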
Particle System Optimization, Complex Compositing for AR
Multiple simultaneous CPU particle systems
Compositing of virtual and IRL-based billboards (robot flies in front, performer disappears in teleporter)
Background is a projected texture of real-time footage with custom lighting applied on top (note the glow under the robot; the performer's reflection is rendered in screen space, while her shadows are a mix of real and rendered soft shadows)
Trading Card Game Presentation Mode
At the end of each capture session, an automated timer and voting system lets the audience decide whether the work honored the Twitch channel referenced by the card; if so, the card is presented in a celebratory zoomed-in view.
Dynamically generated bump maps and roughness maps based on the dynamically generated color texture of each card
Card view renders simultaneously with the overworld render below it and the performer's 'cloud' object
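The audience vote that decides whether a card is honored reduces to a tally over a countdown window, one vote per user. A minimal sketch; the pass threshold is a placeholder, not the show's actual setting.

```python
# Sketch of the automated card vote: collect one yes/no vote per user
# during the countdown, then compare the yes-ratio to a threshold.
# The 0.5 pass ratio is an illustrative assumption.

def tally_votes(votes, pass_ratio=0.5):
    """votes: dict of user -> 'yes'/'no'. True if the card is honored."""
    if not votes:
        return False
    yes = sum(1 for v in votes.values() if v == 'yes')
    return yes / len(votes) > pass_ratio

print(tally_votes({'a': 'yes', 'b': 'yes', 'c': 'no'}))  # True
```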
Custom Foliage-Rich Scenery
Using a mixture of camera-facing 'sprites' and stationary 'billboards', we create the effect of thick foliage with very few polygons. A second renderer was custom-made to speed up the placement process, as the typical Geometry Viewer in TouchDesigner cannot display Geometries from multiple hierarchies in the network.
GLSL Shader Adjusting and Triggering
An existing shader's code was lightly edited to adjust movement speed and to let viewers control the position of the sun in the background
Twitch API bridge to send inputs to the ISF shader component
Minor shader editing of existing shader by artist nimitz
Background metaball blobs controlled by Kinect
Performer tracking (for reflections and billboard positioning) done via UV instead of Kinect due to Kinect input lag
Custom Drawing App
Cursor smoothing uses the previous and current frame's cursor positions to render line segments in a Render TOP, caching the output to a persistent image
Width and color controls
Zero lag going from UV > Tool Render TOP > Compositing/Caching TOP > Main Render TOP
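The smoothing step above amounts to blending each frame's raw cursor with the previous smoothed position and drawing the connecting segment into the cached canvas. A minimal sketch of that blend; the 0.5 factor is an illustrative assumption.

```python
# Sketch of the cursor smoothing: each frame, draw the segment from the
# previous smoothed position to a lerp of the previous and raw positions,
# then cache the result. Smoothing factor t is a placeholder value.

def smooth(prev, raw, t=0.5):
    """Blend previous and current cursor positions (simple lerp)."""
    return (prev[0] + (raw[0] - prev[0]) * t,
            prev[1] + (raw[1] - prev[1]) * t)

pos = (0.0, 0.0)
for raw in [(1.0, 0.0), (1.0, 1.0)]:
    new_pos = smooth(pos, raw)
    # the segment (pos -> new_pos) would be drawn into the Render TOP here
    pos = new_pos
print(pos)  # (0.75, 0.5)
```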
Complex 3D Augmented Reality Scenes
By using large chroma-green objects, we're able to have performers interact with the virtual rendered space in more complex ways, such as lying on this rendered bed while still using real pillows.
Accurate performer tracking and placement
Lighting affects virtual and real objects equally
Accurate ambient occlusion and soft shadows
Real-time wet glass diffraction on windows
Whimsical Set Building
Sometimes a small but significant visual effect can take the audience's sense of wonderment to the next level. Here, the performer twirls with the rotating Geometries.
Twitch-based microtransaction triggers the animation
Audience-adjustable SOP Particle bubbles and PBRs on glass
Xbox controller for puppeteering the robot
Geo Instancing Grass, SOP Particles
The audience can cut the grass periodically using Twitch Channel Points.
TOP > CHOP > Geo Instancing
Particles increase over time based on bits donated by the audience
UV-based performer depth tracking
Accurate AR Lighting on Real Life Objects
By calculating normals from the silhouette rather than from detail, we were able to cast directional lighting on completely flat billboard objects, creating more realism when compositing into a virtual scene. Notice how the light not only changes color as the performer moves, but lights the correct side of the performer's various body parts. Note that on the real-life stage, the performer is lit by white diffused lighting with no directionality, so this is all done in the render.
Came up with a Threshold > Normal Map TOP > Blur technique that allows lighting to be cast on the sides of the performer
Use of PBR allows us to adjust the mix of matte and shiny looks, depending on the scene
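The core idea of the silhouette-normal technique can be illustrated in 1D: treat the thresholded silhouette as a height field and take its gradient, so the flat billboard's edges gain surface directions that directional lights can respond to. A toy sketch only; the real pipeline is the 2D Threshold > Normal Map TOP > Blur chain described above.

```python
# Sketch of the silhouette-normal idea in 1D: a central-difference
# gradient of a 0/1 silhouette row yields nonzero values at the edges,
# which is what lets flat billboards catch directional light.

def edge_gradient(silhouette):
    """Central-difference gradient of a 1D 0/1 silhouette row."""
    g = []
    for i in range(len(silhouette)):
        left = silhouette[max(i - 1, 0)]
        right = silhouette[min(i + 1, len(silhouette) - 1)]
        g.append((right - left) / 2)
    return g

row = [0, 0, 1, 1, 1, 0, 0]
print(edge_gradient(row))  # nonzero only near the silhouette edges
```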
AR Transitions
By using the real time camera feed and projecting textures onto Geos, we're able to manipulate the real world room or make it disappear
Manual 1 to 1 stage measurements
Single render pipeline
Isometric World Building using AI Generated Assets
Isometric-game inspired levels
Dynamically generated scenes based on TD network structure
Came up with a parameter-free object placement system based on node position in the network
Z-depth compositing calculated automatically
Users can join and place customized avatars into spots in each scene using Twitch chat commands and Channel Points
UV-Based Performer Depth Tracking
It was imperative that we have lag-free depth tracking that positions the performer correctly in relation to virtual, rendered objects in the scene, while also casting shadows correctly on the rendered environment and generating ambient occlusion at meeting surfaces/corners. The solution is a custom component that takes the chroma-keyed silhouette of the performer and composites it with a red and green Ramp so that an Analyze TOP can provision the UV position of the feet. Then, a Render Pick CHOP uses the UV to discern the XYZ position of the performer, and we use that data to both scale and dolly the performer's billboard object. More info in my talk with Derivative founder Greg Hermanovic for Music Hackspace
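The final step, turning the feet's UV position into depth and billboard scale, can be sketched with a simple floor model. The linear mapping below is an illustrative stand-in for the Render Pick lookup described above; the depth range is a placeholder.

```python
# Sketch of the UV-to-depth step: the feet's vertical UV coordinate
# (0 = bottom of frame) maps to scene depth, which then drives billboard
# scale and dolly. The linear floor model and ranges are assumptions.

def depth_from_foot_v(v, z_near=2.0, z_far=10.0):
    """Map the feet's vertical UV to a depth between z_near and z_far."""
    return z_near + v * (z_far - z_near)

def billboard_scale(z, ref_z=2.0):
    """Scale the billboard with depth so its apparent size stays correct."""
    return z / ref_z

z = depth_from_foot_v(0.25)
print(z, billboard_scale(z))  # 4.0 2.0
```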
Streamer Card Catalog
Dynamically generated card "catalog view". Built using Replicator COMP in combination with nested Clone Geometry COMPs. Uses custom parameters on Geometries as storage for each card's unique copy, stats, and image URL.
Custom PBR Mats and Environmental Lighting
Ortho to Perspective camera transition
Live Touch Dev Sessions
Occasional live TouchDesigner development in front of a Twitch audience
Lots of question and answer moments
Many moments of having to figure out issues or solutions under pressure
Streamer Card 'Capturing' System
An overworld with points of interest that can be visited, each opening a browser view of a specific Twitch channel alongside a blank card ready to be populated in real time.
OBS Window Capture > TD via NDI
Card editor interface using Custom Parameters in a Popup Window, including copy, number stats, image URL, and card type preset
Automated countdown timer and voting by viewers
Physical and Virtual Guests
Using third-party low-latency videoconferencing frameworks, we were able to bring virtual guests into the scene as billboards and send them a low-latency video monitor showing how and where they appear.
NDI Virtual Camera output
IRL and virtual performer compositing, including digital puppets (i.e. the robot)
Audience participation 2D GUI overlays (rate-limited 'Lie Detector' voting) with graph and historical averaging
Pseudo-3D Scene Building from 2D Assets
Here, a purchased stock image of a hallway was used to build a simple, adjustable geometry onto which the asset was projected as a texture, allowing for fast MVP creation of a set versus manual modeling and lighting.
Flat lighting combined with a couple of spotlights to give the performer(s) some depth
Screen Space Reflections
Camera undulation can transition to a drone camera driven by an Xbox controller user
Triple-Chroma Key for an IRL Cursor
In addition to the green and blue chroma keys used for the performer and their costume, we use a third, red chroma key to track a real-world cursor, which can be used as an additional performance input. Here we've set it up with a simple caching feedback loop to create a rudimentary canvas to paint on.
An audience-controlled second paintbrush competes with the primary one; each brush's non-black pixels are tallied and shown as a percentage on the left UI element.
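The percentage readout is just a count of painted (non-black) pixels over the canvas size. A minimal sketch with a tiny grayscale list standing in for the cached TOP; the threshold is an illustrative assumption.

```python
# Sketch of the coverage readout: the UI percentage is the fraction of
# non-black pixels in the cached paint canvas. A flat list of grayscale
# values stands in for the TOP's pixel buffer.

def paint_coverage(canvas, threshold=0.0):
    """Fraction of pixels that are non-black (painted)."""
    painted = sum(1 for px in canvas if px > threshold)
    return painted / len(canvas)

canvas = [0.0, 0.8, 0.5, 0.0]  # 2 of 4 pixels painted
print(paint_coverage(canvas))  # 0.5
```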
Kinect to Metaball
Frequent use of metaballs generated from Kinect body-tracking CHOPs, fed through Line SOPs into Metaball SOPs
Rendered in a separate TD instance and composited via Touch Out/In TOPs for performance reasons
Audience controllable color/texture on the Metaball object
Physical Multi-Performer Capture
Automatic switching between up to four Kinect users for blob generation, in case any leave the scene
Subscribers can join as a blob in the background, and set their own custom color using a 3-digit rgb code (0-9), and change it later on demand
Xbox controller based drone camera
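The 3-digit color command above maps each digit 0-9 to a 0.0-1.0 channel value. A minimal sketch of that parsing; validation here is deliberately minimal.

```python
# Sketch of the subscriber color command: a 3-digit code like "059" maps
# each digit (0-9) to an RGB channel value in 0.0-1.0. Returns None for
# anything that isn't exactly three digits.

def parse_rgb_code(code):
    """'059' -> (0.0, 0.555..., 1.0); None for invalid input."""
    if len(code) != 3 or not code.isdigit():
        return None
    return tuple(int(c) / 9 for c in code)

print(parse_rgb_code('059'))
```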
Procedurally-Generated City Scape and Audience Avatars
One scene called for a stylized but complex futuristic cityscape backdrop. Buildings were generated using Noise TOPs for scale, variation, secondary geometries, and positioning offset. Decorative advertisements were placed by hand using a custom-developed object placement system driven by an Xbox controller, avoiding the use of OP parameters in TD. In this scene, audience members could choose a species and color of their liking, which then instanced them into the scene. These avatars used three frames of animation with simple offsetting rather than animation shaders; hundreds of avatars could join with minimal performance hit.
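The building generation can be sketched with seeded pseudo-random values standing in for the Noise TOPs that drove per-building scale and position offsets. All ranges and attribute names below are illustrative placeholders.

```python
import random

# Sketch of the procedural cityscape: each building gets a jittered grid
# position, a height, and an optional secondary geometry. A seeded RNG
# stands in for the Noise TOPs; ranges are assumptions.

def generate_buildings(n, seed=7):
    rng = random.Random(seed)  # seeded so the skyline is reproducible
    buildings = []
    for i in range(n):
        buildings.append({
            'x': i * 10 + rng.uniform(-2, 2),   # grid with positional jitter
            'height': rng.uniform(20, 120),     # scale variation
            'has_antenna': rng.random() < 0.3,  # secondary geometry
        })
    return buildings

city = generate_buildings(5)
print(len(city))  # 5
```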
Large Scenes and Skyboxes
Mix of in-TouchDesigner modeled and imported 3D assets
Xbox controller controlled drone camera with presets and reset functionality
Simple animated landscape with custom equirectangular projection skybox texture
'Clone Testing Games'
Audience-triggerable interactive elements with sound effects
Simple particle systems and on/off animations
Audience voting capability
Digital puppetry
"SHIFTr" User-Generated Character
SHIFTr is a character we sometimes used, remote-controlled by people with access to a shared folder: the alphabetically first file in the folder became the body of this googly-eyed character. It updated in real time, giving the audience a chance to create emergent improv play. The audience could also trigger text-to-speech for the character via a microtransaction, which required a bridge between Twitch, TouchDesigner, and Amazon AWS Polly (TTS)
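The "top alphabetical file" rule is simple to sketch: list the shared folder, keep only image files, and take the first name in sorted order, re-checking each frame. The folder path and extensions below are placeholders.

```python
import os
import tempfile

# Sketch of SHIFTr's body selection: the alphabetically first image in a
# shared folder becomes the character's body, re-checked in real time.
# Extensions and the demo folder are illustrative assumptions.

def current_body(folder, exts=('.png', '.jpg')):
    files = sorted(f for f in os.listdir(folder)
                   if f.lower().endswith(exts))
    return files[0] if files else None

# demo with a throwaway folder
d = tempfile.mkdtemp()
for name in ('b_hat.png', 'a_face.png', 'notes.txt'):
    open(os.path.join(d, name), 'w').close()
print(current_body(d))  # a_face.png
```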
"Shuttle Landing" Mini-Game
Sometimes, transitions are needed when a performer has to change costume or take a break. Making them interactive was always a plus; in this example, the audience must collaborate to keep the shuttle as close to the blue area as possible without touching the orange area. We had to invent many games that accounted for the lag with which viewers received the video feed, ranging from 2 to 6 seconds. Proximity to the blue or orange area produced a final score, which was automatically tracked and displayed at the end of each landing.
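The scoring logic reduces to a per-frame proximity check: closeness to the blue zone earns points, and touching the orange zone ends the run. A minimal sketch; the zone boundaries are illustrative placeholders, not the game's actual tuning.

```python
# Sketch of the landing score: proximity to the blue zone earns points,
# touching the orange zone means a crash. Zone bounds are assumptions.

def frame_score(y, blue=(0.4, 0.6), orange=(0.0, 0.1)):
    """Score one frame of shuttle height y in 0..1; None means crash."""
    if orange[0] <= y <= orange[1]:
        return None  # touched the orange area
    center = (blue[0] + blue[1]) / 2
    return max(0.0, 1.0 - abs(y - center) / center)

print(frame_score(0.5))  # dead center of the blue zone: 1.0
```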
User-Generated Image Improv
Here, users can submit items to the Pawn Show game via a Channel Point redeem that prompts for string input. The string is validated to be a URL with a .png suffix, then entered into a visual preview queue for us to approve or reject. Approved submissions can then be manually triggered to appear in the world as a billboard for the performer to improvise against.
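The validation step can be sketched with the standard library: the submitted string must parse as an http(s) URL whose path ends in .png before it enters the approval queue. A minimal sketch; the real checks may be stricter.

```python
from urllib.parse import urlparse

# Sketch of the submission validator: accept only http(s) URLs whose
# path ends in .png. Anything else never reaches the preview queue.

def is_valid_submission(text):
    try:
        parts = urlparse(text.strip())
    except ValueError:
        return False
    return (parts.scheme in ('http', 'https')
            and bool(parts.netloc)
            and parts.path.lower().endswith('.png'))

print(is_valid_submission('https://example.com/toy.png'))  # True
print(is_valid_submission('not a url'))                    # False
```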
Animated Scene Transitions
To add polish to the show, smooth animated scene transitions were sometimes needed. Here, the performer is using a chroma green stool to perform alongside a tractor-beam style transition.
Animation COMP utilization
Back End UI for transition resetting, and state management
Object fading animations utilizing Alpha-to-Coverage transparency for ease of setup
Audience Voting Widget
We iterated through several versions of an audience voting widget that takes into account total votes cast, the ratio of votes cast, and special voting privileges (for subscribed users) to output a meaningful test result for each iteration of the fictional test.
Python-less logic and storage
Custom 3D Text Modules
Custom back-end UI for widget
Video Feedback and Cache Loops
Custom bluetooth keyboard > MIDI > TouchDesigner control solution so that the performer has control even when TD is not in focus
Dual chroma key for ability to change costume color separately from keying the performer against the green screen studio
Controllable complex video feedback presets
Tempo controllable by performer via tapping
Cache clearing and freezing ability
UV-Based Hit Detection
Calculated by overlaying the throwable object's UV position as a circle on the performer's silhouette to detect overlap. Here, we avoid the Pythagorean calculation of the Object CHOP, as scenes with multiple throwable items can bog down the CPU. Each throwable is triggered by a microtransaction and has a success state (shown) and a fail state: if there is no overlap when the object reaches the performer's Z position, it drops to the floor instead.
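The overlap test itself is cheap: sample the performer's silhouette mask at pixels inside the throwable's circle and report a hit on the first set pixel. A minimal sketch with a tiny grid standing in for the silhouette TOP.

```python
# Sketch of the UV overlap test: check whether any set pixel of the
# performer's silhouette mask falls inside the throwable's circle.
# A small 2D list of 0/1 values stands in for the keyed TOP.

def circle_hits_mask(mask, cx, cy, r):
    """mask: 2D list of 0/1. True if any set pixel lies in the circle."""
    for y, row in enumerate(mask):
        for x, val in enumerate(row):
            if val and (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                return True
    return False

mask = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(circle_hits_mask(mask, 1, 1, 1))    # True: circle covers the pixel
print(circle_hits_mask(mask, 0, 0, 0.5))  # False: no overlap
```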
Performer-Controlled UV Cursor for Interactive Objects
To add visual interest and a sense of immersion, some toys had interfaces in 3D space rather than a 2D overlay. These could be controlled using our third color (red) chroma key and tracker alongside a Render Pick CHOP to detect when the performer is hovering over an interactive object from the point of view of the camera.
Custom text extrusion components
Parametrically generated scrollable list of image based items
Thresholded Video Feedback
Sometimes we required not only a fading video feedback trail, but one that displays only a set number of iterations. To that end, we built a module that adjusts the fade and applies a cutoff to eliminate any feedback below a certain transparency threshold. By keeping the fade slight, we could show many mostly solid iterations without keeping multiple caches, which are expensive in VRAM.
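The interaction of fade and cutoff determines how many trail copies stay visible: each iteration multiplies alpha by the fade factor, and anything below the cutoff is discarded. A minimal sketch; the fade and cutoff values are illustrative placeholders.

```python
# Sketch of the thresholded feedback: each iteration multiplies alpha by
# a slight fade, and anything below the cutoff is discarded, capping how
# many trail copies survive without extra caches.

def surviving_iterations(fade=0.9, cutoff=0.5):
    """Count feedback copies still visible before dropping below cutoff."""
    alpha, count = 1.0, 0
    while alpha >= cutoff:
        count += 1
        alpha *= fade
    return count

print(surviving_iterations())  # 7
```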