Syncing data: A time-lapse recording of the sunrise
Every sunrise and sunset, the sun leaves a unique signature on the ionosphere. I have been waking predawn for the past month to set up a time-lapse recording of the sunrise (using a DSLR camera) to coincide with the VLF monitor data. I’m hoping to get the hang of this somewhat complicated setup before the next partly cloudy morning. Clear skies make for a pretty boring sunrise, as you can see in the time-lapse footage above.
Audio reactive animation using Processing’s minim library
This is one of my experiments using Processing’s minim library to analyze audio. The method employs a Fast Fourier Transform (FFT), allowing for real-time visualization of the frequency spectrum. In this sketch, I played around with frequency bins and intensity thresholds.
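The bins-and-thresholds idea from the sketch translates to any FFT library. As a rough equivalent of minim’s approach, here is a minimal NumPy sketch (the band count, threshold, and windowing choices are my own, not from the original Processing code):

```python
import numpy as np

def band_energies(samples, sample_rate, n_bands=8, threshold=0.0):
    """Compute the FFT of one audio frame, group the magnitudes into
    equal-width frequency bands (bins), and zero out any band whose
    average energy falls below an intensity threshold."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    edges = np.linspace(0, freqs[-1], n_bands + 1)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        energy = spectrum[mask].mean() if mask.any() else 0.0
        bands.append(energy if energy >= threshold else 0.0)
    return bands

# Example: a 440 Hz tone sampled at 8 kHz lands in the lowest band
t = np.arange(1024) / 8000.0
frame = np.sin(2 * np.pi * 440 * t)
print(band_energies(frame, 8000))
```

In an animation loop you would call this once per audio frame and map each band’s energy to a visual parameter (size, color, position), exactly as the minim sketch does per draw cycle.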
The song is “Aspectacle” by Can. I chose it both because I like it (and could handle listening to it repeatedly) and because, upon first hearing it, I imagined a psychedelic audio-reactive animation. I love doing work like this, as it hits all of the math, physics, music, and visual bells in my brain.
Notes from my first full day of testing the VLF monitor setup on my NYC roof
I’m using the SuperSID monitor and software with my small, 1 ft wide loop (made with magnet wire) at home on my deck. I wanted to test this small loop and compare it to the larger loop (made with insulated 24 AWG wire) that I plan on using at ITP.
There is significant noise in this setup, and I hope the combination of the larger loop and 24 AWG wire will yield a marked improvement. Still, there’s something beautiful about this noise: it’s the buzz, the radio noise (the near-field EM) of the city. There are marked spikes that correspond neither to the mains and its harmonics (which are filtered out in software) nor to the VLF stations. The graph below shows the power spectrum at about 8 pm last night. The spike at around 15.65 kHz stayed constant for most of the night; it disappeared sometime between midnight and 1 am and reappeared this afternoon.
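Separating mystery spikes like the 15.65 kHz one from mains harmonics can be automated. This is a hedged sketch of one way to do it (the thresholding scheme and tolerances are my own assumptions, not what the SuperSID software actually implements):

```python
import numpy as np

def find_spikes(freqs, power_db, mains_hz=60.0, tol_hz=5.0, z_thresh=4.0):
    """Flag spectral spikes that are NOT mains harmonics.

    freqs / power_db: arrays describing a power spectrum (Hz, dB).
    A bin counts as a spike if it exceeds the spectrum's median by
    z_thresh median-absolute-deviations AND is not within tol_hz of
    an integer multiple of the mains frequency."""
    med = np.median(power_db)
    mad = np.median(np.abs(power_db - med)) or 1.0
    spikes = []
    for f, p in zip(freqs, power_db):
        if (p - med) / mad < z_thresh:
            continue
        nearest_harmonic = round(f / mains_hz) * mains_hz
        if abs(f - nearest_harmonic) > tol_hz:  # not a mains harmonic
            spikes.append(f)
    return spikes
```

Run against a night of spectra, a function like this would let me log when the 15.65 kHz spike appears and disappears instead of eyeballing the plots.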
Realtime scientific data acquisition and visualization for makers, artists, citizen scientists, and other curious types via an exploration of space from Earth.
Listening to the Sun is an experiment in making real science accessible to curious types through affordable DIY technologies. By repurposing easy-to-find and often freely discarded consumer electronics and other materials, it is possible to create tools that enable the exploration of our atmosphere and outer space without ever leaving Earth. Coupling these tools with computers and open-source software and hardware allows for real-time data acquisition and visualization.
Thanks to Bill & Melinda Lord of SARA, I received the solar flare monitor (SuperSID) donated by Stanford in record time. Yesterday, NASA’s Solar Dynamics Observatory captured a pretty substantial eruption that should make for some interesting readings.
I constructed my first VLF antenna two weeks ago and have been using a great, open-source piece of software by “DL4YHF” called Spectrum Lab for spectrum analysis. The setup is on my rooftop in Brooklyn and is not yet waterproof, which limits the around-the-clock monitoring necessary for accurately identifying solar activity. However, I have been able to identify several VLF station peaks.
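Identifying station peaks amounts to matching measured peak frequencies against a table of known VLF transmitters. A minimal sketch of that lookup (the station frequencies below come from public listings and are illustrative; the matching tolerance is my own assumption):

```python
# A few well-known VLF transmitters (Hz), from public listings --
# verify against current schedules before relying on them.
VLF_STATIONS = {
    "NAA": 24000, "NLK": 24800, "NML": 25200,
    "NPM": 21400, "DHO38": 23400, "NWC": 19800,
}

def identify_peaks(peak_freqs_hz, tol_hz=200):
    """Map each measured peak to the nearest known station,
    keeping only matches within tol_hz."""
    matches = {}
    for f in peak_freqs_hz:
        best = min(VLF_STATIONS, key=lambda s: abs(VLF_STATIONS[s] - f))
        if abs(VLF_STATIONS[best] - f) <= tol_hz:
            matches[f] = best
    return matches
```

Peaks that fail to match anything in the table are exactly the interesting ones, like the unexplained 15.65 kHz spike from the rooftop tests.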
Sparkfun’s 9 Degrees of Freedom (Razor IMU) is essentially a breakout board for a small microcontroller and three separate MEMS sensors: a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetic sensor. While the Razor’s microcontroller ships with sample firmware that demos the output of the three sensors, the full power of the Razor is realized by uploading firmware that utilizes the device as a real-time 3D orientation sensor. The Razor’s onboard microcontroller can be programmed directly with an AVR programmer, or over a serial connection to a computer via the pre-programmed Arduino bootloader.
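With orientation firmware loaded, the host side mostly just parses a serial text stream. The widely used Razor AHRS firmware, for example, emits fused orientation as lines like `#YPR=yaw,pitch,roll` (degrees). A hedged host-side sketch, assuming that text format (a real setup would read lines from the port with a library such as pyserial):

```python
def parse_ypr(line):
    """Parse one '#YPR=yaw,pitch,roll' text line (angles in degrees)
    of the kind emitted by Razor AHRS-style orientation firmware.
    Returns (yaw, pitch, roll) or None for malformed input."""
    if not line.startswith("#YPR="):
        return None
    try:
        yaw, pitch, roll = (float(v) for v in line[5:].split(","))
        return yaw, pitch, roll
    except ValueError:
        return None

print(parse_ypr("#YPR=12.50,-3.00,0.25"))
```

In practice you would loop over `serial.Serial(port, 57600).readline()` (baud rate depends on the firmware) and feed each decoded line to a parser like this.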
My second project involved overcoming some difficult hurdles with VDMX5 and Maya. I hope to continue learning these tools and pick up Quartz Composer, 123D Catch, Syphon, AfterEffects, and OpenNI/NITE. For my final project, I wanted to create an audio-responsive projection mapping in my living room (the only place I could conceivably take over for a week) with generative geometry and some AfterEffects animations inspired by the unusual works of Paul Loffely.
I hope to finish weaving together layered AfterEffects animations into scenes with VDMX5, dealing with multiple Syphon servers, adding the additional surfaces I’ve modeled (and making them transparent), adding the dreamer on the couch with the Kinect, and incorporating the spatial shifts in Unity3D to make the projection interesting…
Below are some low-res videos (taken on my iPhone) and pictures: