Big Tent was a featured venue at the 2018 Lotus World Music and Arts Festival in Bloomington, Indiana. Artistic collaborations with cinematographers, composers, videographers, animators, and musicians were featured during the festival, in addition to “unplugged” sets from some of the festival’s headliners!
August 3rd, 2018
Inspired by our partners and our time in Haiti, and with support from the Arts Council of Indianapolis, the Indiana Arts Commission, Reconnecting to Our Waterways, and the Indianapolis Public Library, we brought the Rara to Indianapolis. A Rara is a Haitian street band that grows and evolves as it moves through the city playing traditional instruments. Raras exist to celebrate the joy of living and to unite communities to dance and sing away their troubles.
The Near West Rara started in Hawthorne and moved with a band on a portable stage, playing short sets in each neighborhood. When the band wrapped up, that neighborhood joined the parade and traveled to the next (just like a Haitian Rara). It culminated on West Michigan Street with Big Tent, an interactive video installation filled with imagery from the Rara, and more music!
August 26 & 27, 2016
Big Tent was a featured venue at the 2016 Indianapolis Light Festival, IN Light IN, held on the downtown canal. The two-night event had an estimated attendance of over 4,000.
Animation and music for the event were generated live, driven by gestural motion controllers distributed among the audience and master-controlled by Ben Smith.
Photos courtesy Vasquez Photography and Hadley Fruits Photography.
Available in the Unity Asset Store.
A library of tools to support smooth, voxelized (i.e. fully deformable and destructible) terrain generation. Now with levels of detail (to enable infinite paging worlds) and an octree scene model (to minimize loading times).
Implements Marching Cubes and Dual Contouring.
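For the curious, here is a minimal sketch of the voxel idea in Unity-style C#. It is purely illustrative and is not Ruaumoko's actual API: the terrain is defined by a signed density field (positive inside the ground, negative in the air), so surfaces can overhang and tunnel, and a destructive edit is just a local change to densities before the surface is re-extracted.

```csharp
using UnityEngine;

// Illustrative only (not Ruaumoko's API). A voxel terrain is driven by a
// density function: the mesher extracts the surface where density crosses zero.
public static class VoxelDensitySketch
{
    // Positive = solid ground, negative = air; the surface sits at density == 0.
    public static float Density(Vector3 p)
    {
        float ground = -p.y;                                              // flat base plane
        float hills  = Mathf.PerlinNoise(p.x * 0.05f, p.z * 0.05f) * 8f;  // rolling hills
        float caves  = Mathf.PerlinNoise(p.x * 0.2f,  p.y * 0.2f) - 0.5f; // carve pockets
        return ground + hills + caves * 4f;
    }

    // A destructive edit lowers density inside a sphere; only the affected
    // chunks need re-meshing, which is what makes the terrain fully deformable.
    public static float Dig(float density, Vector3 p, Vector3 center, float radius)
    {
        float falloff = Mathf.Clamp01(1f - Vector3.Distance(p, center) / radius);
        return density - falloff * 10f;
    }
}
```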
From the pen of collaborator Dr. Fiona P. McDonald:
In the arts, humanities, and social sciences, a gap exists in how we use our senses to tell stories about shared lived experiences. In recent years, water has been at the center of debates over ownership, consumption, and rights, from early treaties between Indigenous peoples and colonial powers to today, when water rights are still contested and access is rewritten through Western policy systems. Fiona is looking at water as material culture, and at sound as a way to understand our shared experiences. Arts-based ethnographies that rely on the use of our senses may act as a way of going beyond words to harness a deeper understanding of our shared lived experiences.
This project integrated digital storytelling to create a sensory ethnography. It was carried out in collaboration with visual artist Ashley Blanton; Spanish language arts teacher Mr. Randall Grillo and the El Otro Lado program at Monte del Sol Charter School; and elementary music teacher Ms. Sarah Gachupin at Atalaya Elementary School in Santa Fe, New Mexico.
The final installation of this project was presented in collaboration with Dr. Benjamin Day Smith (IUPUI School of Engineering and Technology, Music and Arts Technology and Computer Information and Graphics Technology) and graduate student Neal Anderson (M.S. in Music Technology) on April 13, 2017, from 6:30 to 8:00 pm at the Santa Fe Art Institute (1600 St. Michael's Street).
In collaboration with the School of Media and the Department of Modern Dance, Big Tent was installed and presented at the Indiana University Memorial Union, September 2017.
Video works by Smith, Norbert Herber, and Doug Bielmeier were presented, along with music and dance by Robin Cox and Stephanie Nugent. The event culminated in an open contact improvisation work, Hourglass, led by Cox and Nugent.
Photos courtesy of Doug Bielmeier.
August, 2016.
Big Tent and Modality (a Virginia Tech-based new music ensemble) gave public performances at the Indianapolis Opera’s Basile Center.
All visuals were generated live and performed by Ben Smith, with interactive gestural controllers distributed among the audience.
November, 2015
Big Tent’s public premiere was held at the Indianapolis Museum of Art, featuring a collection of music, dance, and video work curated and directed by Benjamin Smith and Robin Cox.
Live animation and video were created by Smith and Danielle Riede. Live performers included Smith (violin), Cox (violin), and Stephanie Nugent (dance).
Big Tent was installed at The DaVinci Pursuit’s 2015 annual science and art conversations event, held at the Orchard School in Indianapolis, IN, featuring video by Ben Smith and Danielle Riede and music by Robin Cox.
2013
Continued exploration into the procedural generation of 3D terrain based on voxelized noise maps and Marching Cubes algorithms. Now published as “Ruaumoko” in the Unity 3D Asset Store!
(2013)
A preliminary look at a 3D procedural terrain generation system I am creating in Unity. This approach uses a marching cubes implementation to construct surfaces that can fold and overlap, unlike conventional height map terrain models. The texturing is very rough at this point.
See most recent update as “Ruaumoko.”
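As a rough illustration of how Marching Cubes walks such a density field (again a sketch, with the standard 256-entry triangle table omitted for brevity), each cell's eight corner samples form a case index, and surface vertices are placed on cell edges by interpolating to the iso value, which is why the extracted surfaces can fold and overhang:

```csharp
using UnityEngine;

// Illustrative Marching Cubes cell lookup; the 256-entry triangle table that
// turns each case index into triangles is omitted here.
public static class MarchingCubesCellSketch
{
    static readonly Vector3[] Corners =
    {
        new Vector3(0,0,0), new Vector3(1,0,0), new Vector3(1,0,1), new Vector3(0,0,1),
        new Vector3(0,1,0), new Vector3(1,1,0), new Vector3(1,1,1), new Vector3(0,1,1)
    };

    // 8-bit case index for the unit cell whose minimum corner is at 'origin'.
    public static int CaseIndex(Vector3 origin, System.Func<Vector3, float> density, float iso = 0f)
    {
        int index = 0;
        for (int i = 0; i < 8; i++)
            if (density(origin + Corners[i]) < iso)
                index |= 1 << i;          // mark corners lying outside the surface
        return index;                     // 0 or 255 means no surface crosses this cell
    }

    // Surface vertices sit on cell edges, interpolated to the iso value.
    public static Vector3 EdgeVertex(Vector3 a, Vector3 b, float da, float db, float iso = 0f)
    {
        float t = Mathf.Approximately(da, db) ? 0.5f : (iso - da) / (db - da);
        return Vector3.Lerp(a, b, t);
    }
}
```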
(2013)
Inspired by the infinite combinations afforded by the elegantly simple Lego block, the Duplo Interface explores tangible computer interfaces through these iconic children's toys.
The goals of the interface, listed below, are achieved using computer vision algorithms and machine learning models that capture the configuration of the Duplos in real time and translate the data into audio synthesis parameters.
The Duplo Interface is built in Max and C/C++. The mapping of Lego data to audio parameters is handled by an original implementation of a Self-Organizing Map neural network, which in turn drives a granular synthesis engine. A sketch of this mapping idea follows the list below.
• Approachable, not requiring any special clothing or expertise.
• Enabling intuitive, natural interactions.
• Leveraging affordances of the Lego block.
• Wire-free, non-invasive connection to the computer.
• Immediately responsive.
• Robust to many lighting conditions.
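Here is a minimal sketch of that mapping stage, illustrative only and not the project's actual code: a small Self-Organizing Map is trained online on feature vectors coming from the vision stage, and the best-matching node's grid coordinates are normalized and handed to the synthesis engine (for example as grain size and grain density).

```csharp
using System;

// Minimal Self-Organizing Map sketch in the spirit of the Duplo-to-audio mapping.
// Sizes, feature dimensions, and parameter names are illustrative assumptions.
public class SomSketch
{
    readonly int width, height, dims;
    readonly float[][] weights;          // one weight vector per map node
    readonly Random rng = new Random(42);

    public SomSketch(int width, int height, int dims)
    {
        this.width = width; this.height = height; this.dims = dims;
        weights = new float[width * height][];
        for (int i = 0; i < weights.Length; i++)
        {
            weights[i] = new float[dims];
            for (int d = 0; d < dims; d++) weights[i][d] = (float)rng.NextDouble();
        }
    }

    // Index of the node whose weights are closest to the input (best matching unit).
    public int BestMatch(float[] input)
    {
        int best = 0; float bestDist = float.MaxValue;
        for (int i = 0; i < weights.Length; i++)
        {
            float dist = 0f;
            for (int d = 0; d < dims; d++)
            {
                float diff = input[d] - weights[i][d];
                dist += diff * diff;
            }
            if (dist < bestDist) { bestDist = dist; best = i; }
        }
        return best;
    }

    // One online training step: pull the BMU and its neighbours toward the input.
    public void Train(float[] input, float learningRate, float radius)
    {
        int bmu = BestMatch(input);
        int bx = bmu % width, by = bmu / width;
        for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
        {
            float d2 = (x - bx) * (x - bx) + (y - by) * (y - by);
            float influence = (float)Math.Exp(-d2 / (2f * radius * radius));
            float[] w = weights[y * width + x];
            for (int d = 0; d < dims; d++)
                w[d] += learningRate * influence * (input[d] - w[d]);
        }
    }

    // The BMU's grid coordinates, normalized to 0..1, could feed e.g. grain size
    // and grain density in a granular synthesis engine.
    public (float, float) MapToParameters(float[] input)
    {
        int bmu = BestMatch(input);
        return ((bmu % width) / (float)(width - 1), (bmu / width) / (float)(height - 1));
    }
}
```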
2013
A first-person solo racing game featuring an adaptive procedural terrain generation system that continually attempts to devise an optimally challenging, yet satisfying, environment for each specific player!
The character, a reappearance of one of the 2012 Summer Olympic mascots at some forgotten point in the future, is rudely dropped into new space after new space, from which he must rapidly reach the exit gates in the hope of one day finding a way back to the world of frivolous physical revelry. Yet each new space adapts to his previous achievements, continually placing the exit gate just beyond easy reach.
The game is constructed in Unity 3D, featuring an original genetic algorithm model that evolves each level uniquely, given the player’s observed prowess and play style up to that point in the play session. The character modeling, animation, and texture mapping were created in Maya, and original shaders dynamically texture each map.
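To make the adaptive idea concrete, here is a hedged sketch of one evolutionary step, not the game's actual code: each genome is a small vector of level parameters, fitness rewards levels whose observed completion time lands near a target degree of challenge, and roulette-wheel selection, uniform crossover, and mutation produce the next generation of candidate levels.

```csharp
using System;
using System.Linq;

// Hedged genetic-algorithm sketch for adaptive level generation. Genome meaning,
// fitness shape, and parameter values are illustrative assumptions.
public static class LevelGaSketch
{
    static readonly Random Rng = new Random();

    public static float[][] NextGeneration(float[][] population, float[] completionTimes,
                                           float targetTime, float mutationRate = 0.1f)
    {
        // Fitness: higher when the player's time is close to the target challenge level.
        float[] fitness = completionTimes
            .Select(t => 1f / (1f + Math.Abs(t - targetTime)))
            .ToArray();

        var next = new float[population.Length][];
        for (int i = 0; i < population.Length; i++)
        {
            float[] a = RouletteSelect(population, fitness);
            float[] b = RouletteSelect(population, fitness);
            next[i] = Crossover(a, b, mutationRate);
        }
        return next;
    }

    // Fitness-proportional (roulette wheel) selection.
    static float[] RouletteSelect(float[][] population, float[] fitness)
    {
        float pick = (float)Rng.NextDouble() * fitness.Sum();
        for (int i = 0; i < population.Length; i++)
        {
            pick -= fitness[i];
            if (pick <= 0f) return population[i];
        }
        return population[population.Length - 1];
    }

    // Uniform crossover with small random mutations per gene.
    static float[] Crossover(float[] a, float[] b, float mutationRate)
    {
        var child = new float[a.Length];
        for (int g = 0; g < a.Length; g++)
        {
            child[g] = Rng.NextDouble() < 0.5 ? a[g] : b[g];
            if (Rng.NextDouble() < mutationRate)
                child[g] += (float)(Rng.NextDouble() - 0.5) * 0.2f;
        }
        return child;
    }
}
```

In use, the next level would be built from the fittest genome, and the player's completion time on it would be fed back as a new fitness observation.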
2013
An exploration into algorithmic tree generation, driven (“played”) by human motion using a Microsoft Kinect camera. As the player waves their arms, the tree is stimulated to grow in different directions, budding leaves and catching more ‘sun rays’ that fuel further growth. But unwieldy branches and unstable structures will cause the tree to drop limbs or even fall over!
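One growth update might look something like the following hedged sketch (illustrative, not the installation's actual code): hand velocity from the Kinect supplies growth energy, branches extend toward a blend of "up" and the movement direction, occasionally bud, and are pruned when their horizontal reach outgrows their thickness.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hedged sketch of a motion-driven tree growth step; all rules and constants
// are illustrative assumptions.
public class TreeGrowthSketch
{
    class Branch
    {
        public Vector3 Base, Tip;
        public float Thickness = 0.1f;
    }

    readonly List<Branch> branches =
        new List<Branch> { new Branch { Base = Vector3.zero, Tip = Vector3.up } };

    // Call once per frame with e.g. the Kinect hand-joint velocity.
    public void Step(Vector3 handVelocity)
    {
        float energy = handVelocity.magnitude * Time.deltaTime;
        var buds = new List<Branch>();

        foreach (var b in branches)
        {
            // Grow toward a blend of "up" (sun) and the player's movement direction.
            Vector3 dir = (Vector3.up + handVelocity.normalized * 0.5f).normalized;
            b.Tip += dir * energy;
            b.Thickness += energy * 0.01f;

            // Occasionally bud a new branch from the tip.
            if (Random.value < energy * 0.2f)
                buds.Add(new Branch { Base = b.Tip, Tip = b.Tip + Random.onUnitSphere * 0.2f });
        }
        branches.AddRange(buds);

        // Prune branches whose horizontal reach outweighs their thickness (unstable).
        branches.RemoveAll(b =>
        {
            Vector3 span = b.Tip - b.Base;
            float lever = new Vector2(span.x, span.z).magnitude;
            return lever > b.Thickness * 20f;
        });
    }
}
```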
Controlling a supercomputer with a violin!
Presented in 2011, with a paper published at the International Computer Music Conference, 2012.
How can highly complex simulations and systems be effectively controlled in real time? What sorts of interactions are appropriate in a space with thousands of independent, or complexly connected, coefficients and parameters? This work starts with a highly complex instrument, the violin, and an expert in its manipulation and control, a violinist. What happens if an interface is constructed such that the musician, through the creation of sound and music, is able to directly control the parameters of a supercomputer simulation?
A flocking simulation with 50,000 agents is run on Abe, a 9,600-core supercomputer at the National Center for Supercomputing Applications, and is rendered in real time. Simultaneously, data gathered from a live audio stream (the violinist's microphone) is analyzed and mapped onto the coefficients of the simulation.
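The mapping layer might look something like this hedged sketch (feature names, ranges, and coefficient choices are illustrative assumptions, not the actual system): a couple of standard audio features are scaled into flocking coefficients on every analysis frame and streamed to the simulation.

```csharp
using System;

// Hedged sketch of an audio-to-flocking mapping: RMS loudness and spectral
// centroid (assumed to come from an upstream analysis stage) are rescaled
// into simulation coefficients.
public static class ViolinFlockMappingSketch
{
    public struct FlockParams
    {
        public float Cohesion, Separation, MaxSpeed;
    }

    // 'rms' in 0..1; 'centroidHz' roughly 100..5000 for violin material.
    public static FlockParams Map(float rms, float centroidHz)
    {
        float brightness = Clamp01((centroidHz - 100f) / 4900f);
        return new FlockParams
        {
            // Louder playing pulls the flock together; brighter timbre spreads it out.
            Cohesion   = 0.2f + rms * 1.8f,
            Separation = 0.5f + brightness * 2.0f,
            MaxSpeed   = 1.0f + rms * brightness * 4.0f
        };
    }

    static float Clamp01(float v) => Math.Min(1f, Math.Max(0f, v));
}
```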
Featuring the UIUC Telematic Ensemble with Edgar Kautzner, violin, Benjamin Smith, director.
The culminating performances of my dissertation work at the University of Illinois, Urbana-Champaign.
This work explores the aesthetic affordances of the network-distributed acoustic ensemble, joining two to four stages with up to six musicians in concurrent performance. Most of the musicians were distributed around the UIUC campus, joined by Edgar at Monash University, Australia.
Following are a collection of videos from different perspectives, capturing and relating varying views of the distributed concerts.
A port of ffmpeg's libavcodec for Cycling '74's Max environment. It enables fast compression and decompression of video for live streaming or recording/playback, providing access to a wide array of popular codecs and presenting a new solution for video compression in Max and Jitter. For each codec, vipr also exposes the compression parameters for real-time control, enabling easy testing and tweaking as well as artistic control of those parameters during performance!
vipr provides access to a number of lossless codecs that are only available through ffmpeg. These are of particular value and interest to telematic music and distributed performances where video quality is a primary concern.
Downloads
The publicly available source code and binaries are accessible on GitHub.
Appalachian grooves and Irish melodies mix with a modern, Midwestern sensibility in the music of this young acoustic trio, featuring original, joyful dance tunes and wistful waltzes.
Click on the album covers (right) to listen to our music!
2007
A live camera captures a real-time ‘portrait’ of a participant and is used to mix video images into an abstract freeze-frame of their movement into and out of the canvas.