Translation to p5.js

I succeeded in translating my sketch to p5.js, and I added an image file and a soundtrack.

The hardest part was adding the soundtrack; using noLoop() made the sound play correctly.
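A minimal version of that fix looks something like this (the file names are placeholders, and the p5.sound library is assumed to be loaded alongside p5.js):

let img;
let track;

function preload() {
  img = loadImage('picture.jpg');   // placeholder image file
  track = loadSound('track.mp3');   // placeholder sound file, needs p5.sound
}

function setup() {
  createCanvas(600, 400);
  noLoop(); // draw() runs only once, so play() is not called over and over
}

function draw() {
  image(img, 0, 0, width, height);
  track.play(); // starts the soundtrack a single time
}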

For the next project, I hope to add interaction with specific parts of the picture.

For example, clicking on the cherry blossom tree could make the blossoms fall off, or clicking the track could trigger a sound effect.
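A rough sketch of how that region-based click could work, with a plain rectangle and a made-up bounding box standing in for the tree:

let blossomsFalling = false;

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(220);
  rect(100, 50, 200, 200); // stand-in for the cherry blossom tree in the picture
  if (blossomsFalling) {
    text('blossoms falling...', 100, 320);
  }
}

function mousePressed() {
  // hypothetical bounding box covering the tree in the image
  if (mouseX > 100 && mouseX < 300 && mouseY > 50 && mouseY < 250) {
    blossomsFalling = true; // a real sketch would start a falling-petal animation here
  }
}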

Chapters 10+11

While I do feel proud of how far I’ve come in my understanding of computation and coding as a whole, I’m excited to spend more of my free time on it and use this increased confidence to expand my understanding of how to apply coding to my own artistic interests.

Reading chapters 10 and 11 gave interesting insight into a more structured approach to coding than I had used in the past. It’s fascinating to see how computation simultaneously agrees and conflicts with my natural and learned ways of thinking as an artist. I’ve always veered away from setting a concrete structure at the start of a project out of fear that it would stifle the possibility of improvisation. However, I now feel it would be far more beneficial for keeping track of the code and the various structures at work. Chapter 11 also yielded a lot of guidance, some of which I’d previously applied by asking a friend of mine for help on my midterm. I find myself debugging projects most of the time, but the puzzle-like nature of computation is what makes it a really interesting and engaging exercise for me, even when it’s frustrating.

Midterm 1 Part 1

Working on this project was quite fun, and I felt like a real programmer for the first time. As a CS major, most of my classes cover the conceptual and mathematical principles of code, and this project felt more real and down to earth. I am currently working on creating the entire solar system in 3D. I plan to eventually include all the planets, shining stars in the background, and hopefully the asteroid belt as well. I am still debating whether to include Pluto as a joke; I may or may not do so. I made a sphere for the planets, and I plan to use other 3D components, such as the camera angle, for the user. Other lighting properties will also be used in the second part of the midterm, and my hope is to make it somewhat interactive.

Overall, I really liked working with classes. They let me put everything into separate objects and keep my code manageable instead of having 300 lines of the same thing.
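A stripped-down sketch of that class-based structure in p5.js (the radii, orbit distances, and speeds here are made up):

let planets = [];

function setup() {
  createCanvas(600, 600, WEBGL);
  planets.push(new Planet(10, 80, 0.03));
  planets.push(new Planet(16, 140, 0.02));
  planets.push(new Planet(14, 200, 0.015));
}

function draw() {
  background(0);
  noStroke();
  fill(255, 200, 0);
  sphere(30); // the sun at the centre
  for (let p of planets) {
    p.update();
    p.show();
  }
}

class Planet {
  constructor(r, distance, speed) {
    this.r = r;               // planet radius
    this.distance = distance; // orbit radius
    this.speed = speed;       // angular speed per frame
    this.angle = random(TWO_PI);
  }

  update() {
    this.angle += this.speed;
  }

  show() {
    push();               // isolate this planet's transform
    rotateY(this.angle);  // orbit around the sun
    translate(this.distance, 0, 0);
    fill(150, 150, 255);
    sphere(this.r);
    pop();
  }
}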

https://www.openprocessing.org/sketch/512876

Midterm Week 1

Over the next two weeks, I want to create a funny “Match Maker” machine. These love testers/match-making machines can be found in dive bars and failed arcades across America, and I want to make a digital version. The final piece will have four bars, and each will rise when you press a button. Because this will obviously be a joke version of the traditional machine, I will write funny descriptions for each level of the bar. I also want to create little animations that move around the machine, similar to the ones in the video I’ve shared. I tried to start them early, but I couldn’t figure out how to make hearts in Processing, even after searching the web. As far as the aesthetics of the machine go, I went for as pink as possible. I have an irrepressible fondness for pastels, so I also included a light blue. As the project continues, I want to play up the campiness and make it as ridiculous as possible.

I will use click-based responsiveness and randomization to make the bars rise. I also want to have an effect when the button is pressed, but I’m not sure what that effect will be. I will have to continue teaching myself how to work with classes in order to complete this project. I will also have to teach myself how to make hearts, because a matchmaker without hearts is frankly a sad, sad prospect.
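One possible way to draw a heart is with two bezier curves; this is only a minimal p5.js sketch with made-up sizes and colours:

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255, 228, 240); // pastel pink
  drawHeart(width / 2, height / 2 - 50, 100);
}

// draws a heart whose top dip sits at (x, y); s controls the size
function drawHeart(x, y, s) {
  fill(255, 105, 180);
  noStroke();
  beginShape();
  vertex(x, y);
  bezierVertex(x - s / 2, y - s / 2, x - s, y + s / 3, x, y + s);
  bezierVertex(x + s, y + s / 3, x + s / 2, y - s / 2, x, y);
  endShape(CLOSE);
}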

Video Inspiration: IMG_1152

Midterm Part 1

For this assignment, I tried taking a different approach from everyone else by using images to convey a story. My original plan was to make something like a short movie clip. The idea was that a bird flying overhead drops an apple on a bear’s head; a hunter notices the bear getting too close to town, shoots the apple off its head, and the bear runs off. So far I have the last part, but not the portion where the apple lands on the bear’s head. I am not sure if going a different route and relying heavily on images was smart, but we will see how everyone reacts to it. The hardest part was switching images to change scenes; I researched for hours how to do it, until I finally figured it out by using classes and an if statement that decides which class gets displayed when. I think this scene change is a great example of the use of classes. Here are some images of how it looks when running properly. I was unable to upload it properly because of images not showing up in openprocessing.org, but I will be sure to upload it correctly soon.
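A stripped-down version of that scene-switching idea, with hypothetical scene classes and a click standing in for whatever actually triggers the change:

let scene = 1;
let forest, town;

class ForestScene {
  display() {
    background(120, 200, 120);
    text('bird drops an apple on the bear', 20, 40);
  }
}

class TownScene {
  display() {
    background(200, 200, 120);
    text('the hunter shoots the apple off', 20, 40);
  }
}

function setup() {
  createCanvas(600, 400);
  forest = new ForestScene();
  town = new TownScene();
}

function draw() {
  // the if statement decides which scene class gets displayed
  if (scene === 1) {
    forest.display();
  } else {
    town.display();
  }
}

function mousePressed() {
  scene = 2; // switch scenes on click
}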

Week 2 Interactive Assignment (Shahriar and Kamau)

For this assignment, Kamau and I created Pinocchio. We wanted to add two interactive elements: first, and perhaps most obviously, making his nose grow; second, having his eyes, mainly his pupils, always follow the mouse. Making the nose grow and shrink was not too bad once I figured out how to set variables, because it then became easy to increment and decrement a value to change the nose’s length. I even added an extra effect where the sketch flashes when you make it smaller or bigger, and some text appears. The biggest struggle was the eyes: they would move with the mouse, but the pupils would always come out of the eye sockets. I needed a way to keep them movable while still restricting their movement. This was really tough to do, but once I figured out the mapping technique it became a lot easier.
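A minimal version of that mapping trick for the pupils (all the numbers are made up):

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(240);
  drawEye(150, 200);
  drawEye(250, 200);
}

// the pupil follows the mouse, but map() keeps it inside the eye socket
function drawEye(x, y) {
  fill(255);
  ellipse(x, y, 60, 60);
  // squeeze the full canvas range into a small offset around the eye's centre
  let px = map(mouseX, 0, width, x - 15, x + 15);
  let py = map(mouseY, 0, height, y - 15, y + 15);
  fill(0);
  ellipse(px, py, 20, 20);
}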

Processing site: https://www.openprocessing.org/sketch/506733

Week 2 Interaction Project

For this project I partnered with Yuening Bai. We knew we wanted something following the mouse, and we discussed pulling images from the web and making the ball an actual cat, so that it would be chasing the mouse. I ran into some trouble trying to code this and could only figure out how to load a photo from the computer’s hard drive, not from the web. So instead we made it a ball that gets smaller as it gets closer to the mouse. I originally figured I would do this by making the height and width of the circle correlate to the distance from the mouse in the x and y directions. This had an effect similar to what I was going for, but the circle would disappear if you backtracked from the bottom right of the screen to the top left. To solve this, I used the absolute values of dx and dy instead.
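A stripped-down version of that idea, with the ball fixed at the centre so the absolute-value fix is easy to see:

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(220);
  let x = width / 2;
  let y = height / 2;
  // abs() keeps the width and height positive no matter which side the mouse is on
  let dx = abs(mouseX - x);
  let dy = abs(mouseY - y);
  ellipse(x, y, dx, dy);
}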

You can see the code and program here.

Week 2 Interactive Assignment (Caleb and SangDoo)

For the interactive assignment, SangDoo and I created a helicopter (made of squares and rectangles) that spins around the mouse while following it. When you hold the mouse button, the helicopter flies off into the sky. Once you release the mouse button, it comes back to the mouse.

One of the hardest parts of the code was having the helicopter spin around the location of the mouse. We didn’t want the helicopter perfectly connected to the mouse, so we created an ease variable. The translate and rotate functions are sometimes confusing to use, but we succeeded in having the helicopter spin around the mouse’s location with an easing effect.

The color of the stroke of the helicopter also changes depending on the location of the mouse.

Another difficulty came when we tried to change the background color once the mouse was clicked. We didn’t want the helicopter to have a trail of color behind its movement, so we created a variable called “color” to put into the first input for the background function. We were able to randomize this variable later in the mousePressed function, and everything ended up working nicely.
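A stripped-down version of those pieces together: the easing, the translate/rotate around the current position, and the background colour randomized in mousePressed (all the numbers are placeholders):

let x = 0;
let y = 0;
let angle = 0;
let bg = 200;        // first input for background(), randomized on click
let easing = 0.05;   // the ease variable that keeps the motion loose

function setup() {
  createCanvas(600, 400);
  rectMode(CENTER);
}

function draw() {
  background(bg); // redrawing the background each frame avoids a trail
  // ease toward the mouse instead of snapping to it
  x += (mouseX - x) * easing;
  y += (mouseY - y) * easing;
  angle += 0.1;
  push();
  translate(x, y); // move the origin to the helicopter's position
  rotate(angle);   // spin around that origin
  stroke(map(mouseX, 0, width, 0, 255)); // stroke colour depends on the mouse
  fill(255);
  rect(0, 0, 60, 15); // a single rectangle standing in for the helicopter
  pop();
}

function mousePressed() {
  bg = random(255); // randomize the background colour on click
}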

https://www.openprocessing.org/sketch/506233

 

Week 2 Interaction (Ahmed & Allen)

The square was the easiest part of this assignment. We successfully made it rotate according to the x and y coordinates of the mouse and speed up accordingly. We also succeeded in changing the colors every time the mouse is pressed. The ellipse caused the most issues. The difference between what was happening in Processing and what was happening in OpenProcessing was confusing, but the OpenProcessing version did what I originally wanted anyway. It was hard for me to wrap my head around how draw and mousePressed worked at first, but I soon realized that mousePressed is an event while draw is a continuous function. The translate function was hard to figure out at first, partly because of how unintuitive the location of the (0,0) coordinate is to us. Since it was in draw, it continuously translated everything on the screen, and we wanted to figure out how it could be applied to only a particular element. Overall, though, we were successful in using the material taught in class and implementing two different types of interactions.
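A stripped-down version of the two interactions (the speeds, sizes, and colours are made up):

let angle = 0;
let squareColor;

function setup() {
  createCanvas(600, 400);
  rectMode(CENTER);
  squareColor = color(100, 150, 255);
}

function draw() {
  background(240);

  // the rotation speed depends on the mouse's x and y coordinates
  angle += (mouseX + mouseY) * 0.0005;

  push();                           // isolate the transform so only the square is affected
  translate(width / 2, height / 2); // move (0, 0) to the centre for this element only
  rotate(angle);
  fill(squareColor);
  rect(0, 0, 80, 80);
  pop();

  ellipse(100, 100, 50, 50); // drawn outside push()/pop(), so translate does not move it
}

function mousePressed() {
  // mousePressed is an event: it fires once per click, while draw() loops every frame
  squareColor = color(random(255), random(255), random(255));
}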