Final Project PART 1

Last week in class, I said I wanted to use sound.js and gibber.js, but after further research I realized that I should focus on learning Gibber and stick to its web editor. Gibber is not just a library for sound composition, analysis, and visualization; it is also a live-coding environment. This allows live audiovisual performance with extra transparency, because the audience can watch you type your code.

My goal is to compose a piece that includes drums, synths, audio files from freesound.org (the library includes its API), and voice. Then I will create a visual with a couple of elements, each of them connected to the drum output or the master output. I am going to use both 2D and 3D objects, because in Gibber, when you combine 2D and 3D, the 3D object serves as a shade for the 2D object. I have created a basic outline of what my project might sound like and //commented on the things I still need to work on.

Here is a screen recording of a very simple live-coding session.

live_demo

I am not sure if I will be live coding for the final presentation (I hope I can), but I am definitely planning to make a screen recording of it.

gibber web editor: https://gibber.cc/

Access my code from here: https://www.openprocessing.org/sketch/

While I was writing this, my code stopped working. I have no idea why; it worked five minutes ago. Now my excitement has turned into doubt.

Research Project – Artist: Reza Ali

 

http://www.syedrezaali.com/   

For my research project, I decided to write about Reza Ali. He is a computational designer, software engineer, and creative director. He is highly accomplished and currently works full time at Google on WebXR, a Google-owned VR platform, where his main position is UX designer for AR/VR experiences. Along with that, Reza is an artist in residence at Autodesk. He uses technology and code to express himself, creating visual art, sculptures, and interactive installations.

Reza creates and collaborates with many people and different companies, and his work rarely stays in one field or area of interest. One of the fields he works in most is Augmented Reality. My favorite augmented reality piece from Reza Ali is called Paper Galaxy, a collaboration between Reza and Jeff Linnell. They used iPads to scan 3D objects and put them into a 3D AR artwork that can be searched. Reza said, “It’s about bringing students together and building communities through the use of creative coding and data visualization.” https://vimeo.com/105421711

Reza also makes interactive artwork. One of his most interesting 3D pieces that I found is entitled BioRhythm, and as its name implies, it is a biofeedback installation, which means it takes feedback from the body to create movement in the artwork. The piece is controlled by the user’s PPG, which measures the pulsing of blood through the user’s tissue. This biological content is the reason the artwork was made to look organic. You can also change the sliders on the side to get more color options, which I found relatable because just a week ago we learned to use the DOM library. http://www.syedrezaali.com/#/biorhythm/

He has also used Processing multiple times; one of the artworks he created with Processing is an audio-reactive visualization. The particles in his sketch repel and attract one another so that, in his view, they look like microorganisms. http://www.syedrezaali.com/#/a-drifting-up/

Reza has also been known to dabble in generative work; he created a generative terrain maker called Fluid Scapes. He generates these renders and 3D-prints them. http://www.syedrezaali.com/#/fluidscapes/

 

Final Project

I have the general setup for my final project: I want to allow the viewer to create the solar system, but I haven’t figured out all the details. So far, I have created two home screens: one where the user enters their name, and another that greets them, tells them the big bang happened, and says they are going to create the universe. I have all the planets drawn as well, and I’m trying to figure out how to use the boxClicked() function to allow the user to create and destroy a planet. I also want to have a button that destroys the universe, where a black hole comes and sucks everything up. I am still figuring out all the details because I definitely want to add more. Ideally, I would also add rotation and movement to the sketch to accurately show how planets move. I haven’t decided if I’m going to use the 3D library because I had difficulty downloading it. By next week I plan to have all the buttons working and hopefully add some rotation to the planets.
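The create/destroy click idea could be sketched roughly as below. This is only a sketch under my own assumptions (planets as circles with `x`, `y`, and `radius`; the function and field names are mine, not from the actual project); in a p5.js or Processing sketch the same check would run inside the mouse-press handler.

```javascript
// Hypothetical hit test: a click counts if it lands inside the planet's circle.
function planetClicked(planet, mx, my) {
  const dx = mx - planet.x;
  const dy = my - planet.y;
  return dx * dx + dy * dy <= planet.radius * planet.radius;
}

// Clicking an existing planet destroys it; clicking empty space creates one.
function togglePlanet(planets, mx, my) {
  const hit = planets.findIndex(p => planetClicked(p, mx, my));
  if (hit >= 0) {
    planets.splice(hit, 1); // destroy the clicked planet
  } else {
    planets.push({ x: mx, y: my, radius: 20 }); // create a new planet here
  }
  return planets;
}
```

The "destroy the universe" button could reuse the same idea by simply clearing the planet list before animating the black hole.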

Final Project – Shyam Mehta

For the final project, I would like to elaborate on one of my earlier sketches that was based on the work of Stan VanDerBeek and Ken Knowlton. I am interested in using, at this point, Processing over p5.js, and in the visualization of sound in connection with music. Earlier this semester I created a sketch that used mouse input and keys to create a randomization of colors on a “DJ Set”. However, I would like to use that as a basis to build something much bigger. What I am currently thinking is that I will have an input for the user to pick a mood (approximately 3–5 choices). Based on the mood, a certain preloaded song will begin to play, along with colors on the screen that coincide with the music. I will randomize the colors so their pattern corresponds to the volume of the music. Additionally, the colors will represent the mood: for example, if the mood is calm, I will have light blues and other muted colors; on the other hand, if the mood is party/energized, I will have bright colors with a complete randomization of orange, red, purple, yellow, etc. I am not totally sure how I will have the screen set up, but I would like it to, again, represent the mood selected by the user. The goal of this project is to create some kind of background the user could keep running depending on the mood (i.e., a tranquil background setting for calm, or crazy random lights to keep on at a party).
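One way the mood/volume idea could work is sketched below. Everything here is a placeholder of my own (the mood names, the example palettes, and the assumption that volume arrives as a 0..1 level): each mood maps to a small palette, a random entry is picked, and the measured volume scales how strongly the color shows.

```javascript
// Hypothetical mood palettes (RGB triples) - calm is muted, party is bright.
const PALETTES = {
  calm:  [[173, 216, 230], [176, 196, 222], [200, 200, 210]],
  party: [[255, 140, 0], [220, 20, 60], [148, 0, 211], [255, 215, 0]]
};

// Pick a random palette color and scale it by the current volume (0..1),
// so quiet passages fade toward black. rand is injectable for testing.
function moodColor(mood, volume, rand = Math.random) {
  const palette = PALETTES[mood] || PALETTES.calm;
  const [r, g, b] = palette[Math.floor(rand() * palette.length)];
  const level = Math.max(0, Math.min(1, volume));
  return [r * level, g * level, b * level].map(Math.round);
}
```

In the actual sketch, a color like this could feed the fill each frame, with the volume coming from an amplitude analysis of the playing song.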

Final Project Outline

My project is a game of “whack-a-mole” using Processing. To achieve this, my code will basically be a giant loop. The logic of the game will revolve around three main classes:

  1. Score: missed, time, round
  2. Holes
  3. Hammer

The goal for this first part is to have the “Hole” boolean controlled by “time” and “Hammer”, and to have it be true or false for a number of seconds chosen at random from within an interval each time.

Ultimately, I would like to link the first part to score keeping and timed intervals using loops. See the image for more logic detail.
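The Hole timing described above could be sketched like this. It is only an illustration under my own assumptions (the class shape and names are mine): the hole's boolean flips every time a randomly chosen duration elapses, with the clock and random source injectable so the logic is testable. In Processing, the same update would run once per draw() using millis().

```javascript
// Minimal sketch of a hole whose "up" boolean flips after a random
// duration chosen from [minSecs, maxSecs] each time.
class Hole {
  constructor(minSecs, maxSecs, rand = Math.random) {
    this.minSecs = minSecs;
    this.maxSecs = maxSecs;
    this.rand = rand;
    this.up = false;    // the boolean the hammer would check
    this.nextFlip = 0;  // time (seconds) of the next state change
  }
  randomDuration() {
    return this.minSecs + this.rand() * (this.maxSecs - this.minSecs);
  }
  update(now) {
    // Flip state whenever the current duration has elapsed.
    if (now >= this.nextFlip) {
      this.up = !this.up;
      this.nextFlip = now + this.randomDuration();
    }
    return this.up;
  }
}
```

Score keeping would then hang off this: a hammer hit while `up` is true scores, a flip back down without a hit counts as a miss.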

Final Project

For my final project, I want to stick with my idea of creating a game that teaches kids how to recycle, except it will be slightly more interactive than an online Flash game. The main character will be a sloth that can move across the screen from left to right, controlled by the user. Three bins – trash, recycling, and compost – sit at the bottom of the screen. Throughout the game, a random object will be given to the user, who has to decide which bin it goes into. For example, if the item is a water bottle, it should go into the recycling bin, and if the user does that correctly, the score counter increases by one. If it’s wrong, there will be an indication in the sketch that the object was dropped in the wrong bin. To make this sketch, I’ll be using serial communication between Arduino and Processing. The Arduino will have three inputs: two buttons and a PIR motion sensor. The buttons will move the sloth either left or right, and activating the motion sensor will drop the object into the bin. I’ll also be using the sprite library in Processing to animate the sloth when doing certain actions, such as opening his hands when dropping an object or being annoyed if he gets one wrong.
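The sorting-and-score logic could be sketched as below. This is only an illustration under my own assumptions (the item names and the lookup-table approach are placeholders of mine, not from the actual project); in the real sketch the drop would be triggered by the motion sensor's serial input rather than a direct function call.

```javascript
// Hypothetical table of which bin each item belongs in.
const CORRECT_BIN = {
  'water bottle':  'recycling',
  'banana peel':   'compost',
  'candy wrapper': 'trash'
};

// Dropping an item either bumps the score or flags a mistake.
// Returns a new game state plus whether the drop was correct.
function dropItem(state, item, bin) {
  const correct = CORRECT_BIN[item] === bin;
  return {
    score:  state.score  + (correct ? 1 : 0),
    misses: state.misses + (correct ? 0 : 1),
    correct
  };
}
```

The `correct` flag is what would drive the sloth's animation: open hands on a correct drop, the annoyed sprite on a wrong one.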

Interface design for clarification:

Final Project

Hello Everyone,

I hope you are all doing well.

For my final project, I have no idea what I want to do.

That being said, I do have some libraries that I want to use for the project. I want to use the camera features as well as the audio libraries of either Processing or p5.js to create an animation that incorporates sound and video to generate itself. I don’t know if I want the animation to be interactive or not, but I want to leave the user amazed. I want to get creative with this project by incorporating different physics libraries as well. I am currently inspired by working with algorithms and data, so I just need to think about how I will incorporate different algorithms into my animation.

-Cesar

 

Final Inspiration

For my final, I definitely want to do something related to sound and music. More specifically, I want to make a sketch where the user can draw shapes that produce sounds depending on various aspects of the shape. The shapes might be able to interact with each other, and that will create sound as well. I love the idea of combining drawing and music, and I definitely want to explore this further in my project. The website I took inspiration from affords the user the ability to create repeating patterns that can combine into something musical. The user can choose notes and note types; I am not sure I want to include this in my project due to its complexity, but ideally I will. I really want to make something that anyone can get lost in, and maybe even something that sounds like a fully fleshed-out song. For this I might have to use sound libraries and libraries for drawing elements, perhaps mapping shape characteristics to particular frequencies and timbres.

 

http://scribble.audio/
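The shape-to-sound mapping mentioned above could be sketched like this. The specific mappings are my own assumptions (size to pitch, with bigger shapes lower like real instruments, and vertical position to volume); in p5.js, values like these could drive an oscillator's frequency and amplitude.

```javascript
// Linear mapping from one range to another (same idea as p5's map()).
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Hypothetical mapping: shape radius 10..100 spans two octaves (880..220 Hz,
// larger = lower), and shapes near the top of the canvas play louder.
function shapeToSound(shape, canvasHeight) {
  return {
    freq: map(shape.radius, 10, 100, 880, 220),
    amp:  map(shape.y, 0, canvasHeight, 1, 0)
  };
}
```

Other shape characteristics (color, number of sides, speed) could map to timbre or note length in the same way.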

Final Project Inspo

I’m not exactly sure what I want to do for my final project, but I would love to work with music. I really liked the presentations on Rob Clouth and Casey Reas, and I think it would be really cool to do some kind of music video/coding combination, where the code interacts with the music and creates some kind of visualization on the screen. I’m not sure what form the visualization will take; it could be similar to Clouth, who makes shapes and designs on screen, or similar to Reas, who alters the video footage itself, or even both. I also want some kind of interactivity on the audience’s part instead of them just watching the screen. I think this interactivity will add another dimension to the piece, as it’s otherwise very easy to walk past/click away from a video with visualizations. Maybe the audience will make the music?

This sketch inspired me regarding interactivity:

https://www.openprocessing.org/sketch/418569

Reas’ music video inspired me regarding video manipulation: