For our Datalogging assignment, Lia and I chose Parallax’s CH4 (methane) Sensor Module. The module is designed to determine whether a preset level of methane gas has been surpassed. It uses an MQ-2 methane sensor: when heated by an applied 5V, the sensor’s internal resistance drops as the concentration of methane gas in the air rises, changing its output voltage accordingly. The module also provides circuitry to set a trigger voltage that sounds an alarm when a specific level of methane is surpassed. For this assignment, we ignored the alarm circuitry and monitored the output voltage of the sensor.
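The monitoring step above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code: it assumes a 10-bit ADC reading (as on an Arduino) referenced to the module's 5V supply, and the 2.5V trigger level is an invented example value, not the module's preset.

```python
# Hypothetical sketch: scale a 10-bit ADC reading of the sensor's analog
# output to a voltage, and check it against an example trigger level.

ADC_MAX = 1023         # 10-bit ADC full scale (e.g. Arduino analogRead)
V_REF = 5.0            # module powered and referenced at 5 V
TRIGGER_VOLTAGE = 2.5  # assumed example threshold, not the module's preset

def adc_to_volts(reading):
    """Scale a raw 0-1023 ADC reading to 0-5 V."""
    return reading * V_REF / ADC_MAX

def methane_exceeded(reading, trigger=TRIGGER_VOLTAGE):
    """True if the sensor's output voltage passes the trigger level."""
    return adc_to_volts(reading) >= trigger
```

Logging the raw voltage (rather than just the alarm state) is what makes the module usable for datalogging, since the trigger comparison discards everything but one bit of information.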
Low-res video of audio-reactive aliens/jellyfish made in Unity3D:
The movement and luminosity of each of the four creatures corresponds to the maximum amplitude of one of four defined audio frequency ranges (bass, lower-middle, upper-middle, and treble). Following Mike Tucker’s tutorial, Soundflower is used to redirect audio input into the Audio Analysis plugin for VDMX5-b8 (see the Soundflower–>VDMX5-b8–>Unity3D posting for more info). Slider values for the maximum amplitude of each of the four frequency bands (as well as the overall gain) are normalized and sent to Unity3D via OSC. These values control each creature’s position and scale, as well as the position of the Particle System and the range and intensity of the Point Light to which they are parented. Three Interactive Cloths are attached to the Sphere Collider centered within each creature’s body. The lapping illusion is created by a camera rotation (with rotational velocity controlled by the overall amplitude maximum). The creature’s body was created in Maya.
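The normalization step can be sketched as follows. This is an assumption about how the band maxima might be scaled to the 0–1 range before being sent over OSC (a running per-band peak); VDMX5's actual normalization may differ, and the class and method names are invented for illustration.

```python
# Hypothetical sketch: normalize each band's amplitude maximum to 0-1
# by tracking a running peak per band. Not VDMX5's actual method.

class BandNormalizer:
    def __init__(self, n_bands=4):
        # Tiny nonzero seed avoids division by zero before any audio arrives.
        self.peaks = [1e-9] * n_bands

    def normalize(self, amplitudes):
        """Return each band's amplitude scaled by its running peak."""
        out = []
        for i, a in enumerate(amplitudes):
            self.peaks[i] = max(self.peaks[i], a)
            out.append(a / self.peaks[i])
        return out
```

Normalizing on the sending side keeps the Unity3D scene logic simple: every incoming OSC value can be mapped directly onto a position, scale, or light-intensity range without per-band calibration.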
I thought I’d try out my new XBees with the ADXL335 accelerometer I had on hand to make a 3-axis accelerometer visualization. I borrowed from project 21 in Tom’s Making Things Talk, but taped the accelerometer at the center of a frisbee (level with the frisbee’s orientation plane), along with an Arduino and an XBee. I added four LEDs: two placed at the points where the x-axis of the accelerometer intersects the frisbee’s circumference and two where the y-axis intersects it. I then programmed the Arduino to illuminate the LEDs corresponding to positive pitch or roll. I added an XBee to make it wireless and transmitted the acceleration (in g) of the three axes, along with the pitch and roll. I used Processing to sketch a visualization of this data. The ellipses in the graph move vertically along their acceleration (in g) axes. The orientation of the disc in the center (representing the frisbee) corresponds to the calculated pitch and roll of the accelerometer.
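The pitch and roll calculation can be sketched with the standard static-tilt formulas for a three-axis accelerometer. This is a reference sketch, not the post's actual Arduino or Processing code, which may compute the angles differently.

```python
import math

# Hypothetical sketch: standard tilt formulas for a static 3-axis
# accelerometer, taking the three axis readings in g.

def pitch_roll(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerations in g."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    return pitch, roll
```

With the frisbee flat and still, the reading is roughly (0, 0, 1) g and both angles are zero; tipping the disc toward the positive x-axis drives pitch positive, which is the condition for lighting the corresponding LED. Note that a static accelerometer cannot sense yaw (rotation about the gravity vector), which is why only pitch and roll drive the LEDs and the disc's orientation.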
I’m working on my second project for James George’s Pixels to Polygons class. After hearing my proposal, James suggested checking out Mike Tucker’s great tutorial on implementing Soundflower, VDMX5, and Unity3D over at Creative Applications. I decided to take a stab at the tutorial and hit a few snags, mostly due to a glitch between Lion (I have a new MacBook Pro running Lion 10.7.3) and Soundflower, and to changes within VDMX5 (now b8).
After rereading “Funes the Memorious”, particularly the description of Funes’ perception prior to his paralyzing accident, I was reminded of ants, unaware of the view a macroscopic perspective affords, crawling on a Möbius strip. For this assignment, I decided to recreate my own abstract memory of the Möbius strip from the Eames’ Mathematica exhibit at the Museum of Science in Boston. I wanted to play off of Funes’ geometric “clarity” and the idea of perspective.
I attempted to create small “worlds” of recursive boxes, via a nested for loop, upon cubes that followed a Möbius-like twisted shape. I also wanted to explore the Unity3D zoom and orientation shift as a tool for understanding perspective and dimension. Unfortunately, I got stuck in the mud by my graphics card.
I used the following parametric equation to create the cubed curves:
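The post's own equation is not reproduced in this excerpt; as a stand-in reference, the standard Möbius-strip parametrization can be sketched as below. The function names and the sampling helper are invented for illustration, and the author's actual equation may differ from this form.

```python
import math

# Reference sketch: standard Mobius-strip parametrization.
# u runs once around the strip (0..2*pi); v runs across its width.

def mobius_point(u, v):
    """Point on a unit-radius Mobius strip at parameters (u, v)."""
    x = (1 + (v / 2) * math.cos(u / 2)) * math.cos(u)
    y = (1 + (v / 2) * math.cos(u / 2)) * math.sin(u)
    z = (v / 2) * math.sin(u / 2)
    return x, y, z

def mobius_curve(n=100, v=0.0):
    """Sample n points along the strip, e.g. as cube positions."""
    return [mobius_point(2 * math.pi * i / n, v) for i in range(n)]
```

Sampling the centerline (v = 0) yields positions at which cubes can be placed in Unity3D; orienting each cube along the curve's tangent would complete the twisted-ribbon effect described above.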