Final Inspiration

For my final, I want to create a video game. My midterm got me about halfway there, but it wasn’t enough. I want to figure out how to make a bow and arrow shooter game in Processing. I’ve seen many attempts at bow and arrow mechanics, but never a full-fledged game, so I plan to build one. I drew some inspiration from a game I saw on OpenProcessing that is a complete copy of the old brick-breaking game a lot of people used to play on their BlackBerrys. I read through the code for that game and learned how to make an object bounce around until it hits its intended goal. I can apply that to the bow and arrow game: the arrow only does something when it hits its intended target, and does nothing or bounces off when it doesn’t. If I find more examples like this, I feel I could pull off the bow and arrow game. I also saw a Processing sketch of bow and arrow shooting; maybe that will help me figure out the mechanics of my game. I’m also excited about the DOM library, since I feel there are a lot of possibilities there as well. Maybe I can add buttons or sound to my game, which would make it feel like a real game.

Brick Breaking game I drew inspiration from:  https://www.openprocessing.org/sketch/533634
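A minimal p5.js-style sketch of that bounce-until-hit mechanic might look something like this; the arrow and target here are placeholder shapes, and the same dist()-based check carries over to Processing:

```javascript
// A projectile bounces off the canvas edges and only "scores"
// when it overlaps the target. Names (arrow, target) are placeholders.
let arrow = { x: 50, y: 200, vx: 4, vy: -3, r: 8 };
let target = { x: 350, y: 80, r: 25 };
let hit = false;

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(220);

  // Move the projectile
  arrow.x += arrow.vx;
  arrow.y += arrow.vy;

  // Bounce off the canvas edges
  if (arrow.x < arrow.r || arrow.x > width - arrow.r) arrow.vx *= -1;
  if (arrow.y < arrow.r || arrow.y > height - arrow.r) arrow.vy *= -1;

  // Only do something when the projectile reaches its intended target
  if (dist(arrow.x, arrow.y, target.x, target.y) < arrow.r + target.r) {
    hit = true;
  }

  fill(hit ? 'green' : 'red');
  ellipse(target.x, target.y, target.r * 2);
  fill(0);
  ellipse(arrow.x, arrow.y, arrow.r * 2);
}
```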


Final Project Inspiration

For my project, I’d like to create a game using p5.SceneManager. I don’t want it to be linear, with only one path you can take; I want the choices you make to determine how you progress through the game. This is inspired by a video game called “Until Dawn”. It’s a horror game where the player is given choices throughout the game, and each choice has some sort of butterfly effect. There are many different endings you can get in the game.


I love the concept of this game, although making all of the visuals could be a pain. I want the visuals to be minimal so that I can focus on the different choices the player can make and their outcomes. I want a style similar to the mobile game called “A Dark Room”. This game is really fun because of the mysterious storyline, even though the visuals are very minimal.

I’m not sure exactly how much I can do with p5.SceneManager, but I’m excited to try it out.
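As a rough sketch of how the branching could work, assuming the p5.SceneManager library is loaded alongside p5.js (the scene names and choices below are placeholders, and the exact API may vary slightly between versions):

```javascript
// Minimal branching-scene sketch with p5.SceneManager:
// the player's click decides which scene comes next.
let mgr;

function setup() {
  createCanvas(400, 400);
  mgr = new SceneManager();
  mgr.addScene(ChoiceScene);
  mgr.addScene(CabinScene);
  mgr.addScene(ForestScene);
  mgr.showScene(ChoiceScene);
}

function draw() {
  mgr.draw();
}

function mousePressed() {
  // Forward events so the active scene can respond to them
  mgr.handleEvent("mousePressed");
}

// Each scene is a function with its own draw/event handlers
function ChoiceScene() {
  this.draw = function() {
    background(20);
    fill(255);
    text("Click left to hide in the cabin, right to run into the forest", 20, height / 2);
  };
  this.mousePressed = function() {
    // The player's choice determines the next scene
    if (mouseX < width / 2) this.sceneManager.showScene(CabinScene);
    else this.sceneManager.showScene(ForestScene);
  };
}

function CabinScene() {
  this.draw = function() {
    background(60, 30, 30);
    fill(255);
    text("You hid in the cabin...", 20, height / 2);
  };
}

function ForestScene() {
  this.draw = function() {
    background(30, 60, 30);
    fill(255);
    text("You ran into the forest...", 20, height / 2);
  };
}
```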

Final Project

For the final project, I want to create a sound visualization. I have been watching videos of sound visualizations done in Processing since earlier this semester, and now I am confident enough to give it a try. Since I am not a music person, I will stick to visualizing the frequency and amplitude of the sound, though that might change after further research. Sound visualization and analysis can be done with the p5.sound library, but gibber.js also seems like an awesome library to play around with. With Gibber I can make a drum beat of my own and build a visualization of that.

Here is an introduction video of gibber. As you can see, you can easily connect sound with 2D or 3D objects.

I really like the video above because it has the aesthetics that I might want to try. I am not sure about other elements that are floating around, but a visualization with a circular center is similar to what I want to create.
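On the p5.sound side, a minimal sketch along these lines could map amplitude to a circular center and the FFT spectrum to a ring around it; the file name is a placeholder, and most browsers require a click before the sound starts:

```javascript
// Amplitude drives a central circle; the FFT spectrum is drawn
// as radial lines around it. "beat.mp3" is a placeholder file.
let song, amp, fft;

function preload() {
  song = loadSound('beat.mp3');
}

function setup() {
  createCanvas(400, 400);
  amp = new p5.Amplitude();
  fft = new p5.FFT();
}

function mousePressed() {
  // Browsers usually need a user gesture before audio can play
  if (!song.isPlaying()) song.loop();
}

function draw() {
  background(0);
  translate(width / 2, height / 2);

  // Amplitude (0..1) controls the size of the center circle
  let level = amp.getLevel();
  noStroke();
  fill(255);
  ellipse(0, 0, 50 + level * 200);

  // Frequency spectrum drawn radially around the center
  let spectrum = fft.analyze();
  stroke(255);
  for (let i = 0; i < spectrum.length; i += 8) {
    let angle = map(i, 0, spectrum.length, 0, TWO_PI);
    let r = 100 + map(spectrum[i], 0, 255, 0, 100);
    line(cos(angle) * 100, sin(angle) * 100, cos(angle) * r, sin(angle) * r);
  }
}
```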

Final project

While I do not know exactly what I want to do for my final project, I am leaning in the direction of something to do with AR. I was thinking of using the p5 dimensions library that I found last week to slightly tweak the 3D space my camera sees and make a sort of existential overlay. This would involve the DOM library as well as the video library for some aspects.

Snapchat sort of does this with their filters, and I want to give it a try.
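As a first step in that direction, here is a minimal, hypothetical p5.js sketch that pulls in the webcam with createCapture() and draws graphics on top of the feed; anything fancier with the dimensions library would build on something like this:

```javascript
// Webcam-overlay starting point: draw the live camera feed,
// then draw shapes over it. This is only a placeholder for a
// more elaborate "existential overlay".
let capture;

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.size(640, 480);
  capture.hide(); // hide the raw <video> element; we draw it ourselves
}

function draw() {
  image(capture, 0, 0, width, height);

  // Overlay: a ring drifting over the camera image
  noFill();
  stroke(255, 0, 0);
  strokeWeight(3);
  let x = width / 2 + cos(frameCount * 0.02) * 100;
  let y = height / 2 + sin(frameCount * 0.03) * 100;
  ellipse(x, y, 80, 80);
}
```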


Final Project

For the final project, I’m thinking of making an interactive toy for my cats.

The main feature is that when the cats interact with the toy, the camera takes pictures and posts them to a Twitter account. I can check that account whenever I miss my cats.

I’ll be using an Arduino as the main board, with a toy to attract the cat. A capacitive sensor and the camera’s built-in motion sensor act as inputs to detect whether the cat is there. The camera sits on a servo motor so its angle and position can be adjusted. ml5.js tests whether the picture actually contains a cat, and a Yún shield on the Arduino connects to Temboo in order to reach the Twitter API.


hardware:

Arduino, capacitive sensor (copper tape), camera with built-in motion sensor, servo motor, Yún shield (?), wooden box


cat face recognition:

https://ml5js.github.io/docs/simple-image-classification-example.html
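Based on that example, a sketch roughly like the following could check whether a captured image actually contains a cat before anything gets tweeted. The image path is a placeholder, and the call is classify() in newer ml5 releases (the older docs linked above use predict()):

```javascript
// Check whether an image contains a cat using ml5's MobileNet
// image classifier. "snapshot.jpg" is a placeholder; in the real
// project the image would come from the camera.
let classifier, img;

function preload() {
  classifier = ml5.imageClassifier('MobileNet');
  img = loadImage('snapshot.jpg');
}

function setup() {
  noCanvas();
  classifier.classify(img, gotResult);
}

function gotResult(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  // results[0].label is the most likely class, e.g. "tabby, tabby cat"
  let label = results[0].label;
  if (label.includes('cat') || label.includes('tabby')) {
    console.log('Cat detected! This picture is worth tweeting.');
  } else {
    console.log('No cat here:', label);
  }
}
```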

p5bots to communicate with Arduino:

https://github.com/sarahgp/p5bots


Final Project Inspiration

For my final project, I’m thinking of creating an interactive sketch where the user creates their own solar system. Using a drop-down menu, someone can choose the number of planets in the sky, and a slider will control the color of a certain number of planets. I plan to use the DOM library to make it easier for the user to build up and destroy the solar system. It would also be interesting to have a button that represents a black hole and destroys the solar system. I think the most challenging part will be letting the user control the number of planets and then generating them so they interact and rotate as a solar system. I think the initial sketch will have a rotating sun in the center. I don’t know if I’m going to work in 3D; I will first try to get the basics of the sketch down before I attempt to go into 3D.

I really like the linked sketch. It is not interactive, but I like how they formatted the solar system so that it looks like the planets are moving without the usual movement that most solar system sketches have. This could be an approach I try rather than something like https://www.openprocessing.org/sketch/386189. Ideally, I would like to combine the two designs and then allow the user to interact with the sketch.
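A minimal sketch of the DOM side of this idea, assuming a drop-down for the number of planets, a slider for their color, and a black-hole button (the control names and ranges are placeholders, and the orbits are simple rotation rather than real physics):

```javascript
// User-built solar system: a drop-down sets the number of planets,
// a slider shifts their color, and a "black hole" button wipes it out.
let countSelect, hueSlider, blackHoleButton;
let destroyed = false;

function setup() {
  createCanvas(400, 400);
  colorMode(HSB);

  countSelect = createSelect();          // drop-down: how many planets
  for (let n = 1; n <= 8; n++) countSelect.option(n);

  hueSlider = createSlider(0, 360, 180); // slider: planet color (hue)

  blackHoleButton = createButton('black hole');
  blackHoleButton.mousePressed(() => destroyed = true);
}

function draw() {
  background(230, 30, 15);
  translate(width / 2, height / 2);

  if (destroyed) return; // the black hole swallowed everything

  // Rotating sun in the center
  push();
  rotate(frameCount * 0.01);
  fill(45, 100, 100);
  square(-20, -20, 40);
  pop();

  // Planets orbit at evenly spaced radii, outer ones moving slower
  let count = int(countSelect.value());
  for (let i = 0; i < count; i++) {
    let radius = 50 + i * 20;
    let angle = frameCount * 0.02 / (i + 1);
    fill(hueSlider.value(), 80, 90);
    circle(cos(angle) * radius, sin(angle) * radius, 12);
  }
}
```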

Final Project Inspiration

For my final project, I want to use the p5.js library p5.play to make a game. I don’t have a concrete idea about what exactly I’d like to make, but I think I want to create a game that teaches children how to recycle. I found a sketch on OpenProcessing, a pong game that someone created using p5.play. Basically, it works like traditional pong, where there are two paddles on either side and a ball bouncing from side to side, except in this sketch the user controls both paddles and tries to get as many hits as possible. I think this is a great example of a game created with p5.play, and there are a couple of elements in it that I can see myself incorporating into my own game. For example, the paddles are mapped to the mouse, so they follow the mouse up and down no matter where it is on the x-axis. For my game, I want to have a character in the middle who is mapped to the mouse, so that the character can move from side to side and drop items into either a trash, recycling, or compost bin. I also really like the simplicity of the pong game, as there aren’t too many interactive elements that might confuse the user. I want to bring the same simplicity to my own sketch, but also make it visually appealing and fun to interact with. Similar to the hit counter in the middle of the pong sketch, I’ll probably create some sort of counter to keep track of points when the user puts an item in the correct bin.

For my project, I’ve started looking through some examples on the p5.play website, and I think I’ll have to create some sprite animations for the final result. So, when I actually start creating my game, I’ll first have to figure out which parts need to be animated so I can draw those first, and then start the coding process, which I still haven’t quite thought through.

Pong sketch:

https://www.openprocessing.org/sketch/518779
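A rough sketch of the core mechanic using the classic p5.play API (createSprite, overlap, drawSprites); the character, item, and bin here are placeholders, and newer versions of p5.play use a different API:

```javascript
// A character follows the mouse left/right, drops an item on click,
// and a counter increases when the item lands in the right bin.
let character, item, recyclingBin;
let score = 0;
let dropped = false;

function setup() {
  createCanvas(400, 400);
  character = createSprite(width / 2, 50, 40, 40);
  item = createSprite(width / 2, 80, 20, 20);
  recyclingBin = createSprite(100, 380, 80, 40);
}

function draw() {
  background(240);

  // The character is mapped to the mouse on the x-axis
  character.position.x = mouseX;

  // The item rides along with the character until it is dropped
  if (!dropped) {
    item.position.x = character.position.x;
  }

  // Score when the item reaches the correct bin, then reset it
  item.overlap(recyclingBin, () => {
    score++;
    resetItem();
  });

  // If the item misses every bin, put it back
  if (item.position.y > height) resetItem();

  drawSprites(); // p5.play moves sprites by their velocity and draws them
  fill(0);
  textSize(24);
  text(score, width / 2 - 10, height / 2);
}

function mousePressed() {
  dropped = true;
  item.velocity.y = 5; // let the item fall
}

function resetItem() {
  dropped = false;
  item.velocity.y = 0;
  item.position.y = 80;
}
```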

Research Project – HPSCHD

(John Cage and Lejaren Hiller working on HPSCHD)

HPSCHD, by the composer John Cage and Lejaren Hiller, a pioneer in computer music, is one of the wildest musical compositions of the 20th century. Its first performance, in 1969 at the Assembly Hall of the University of Illinois at Urbana-Champaign, included 7 harpsichord performers, 7 pre-amplifiers, 208 computer-generated tapes, 52 projectors, 64 slide projectors with 6,400 slides, and 8 movie projectors with 40 movies, and it lasted for about 5 hours.

(First performance)

Before explaining further, imagine listening to this for five hours.

When I first listened to the piece at MoMA, it felt like a devil was speaking to me, but the composition is actually one of the early examples of randomly generated computer music.

HPSCHD (the word ‘harpsichord’ contracted into a computer-language abbreviation) was created in celebration of the centenary of the University of Illinois at Urbana-Champaign in 1967. The prerequisite for the work was to involve the computer in one way or another. Cage did not want the computer to serve simply as an automatic machine that makes his work easier; he envisioned a process of composition in which the computer becomes an indispensable part.

The composition involves up to 7 harpsichord performers and 51 magnetic tapes pre-recorded with digital synthesis, which manipulates the pitches and durations of sounds in pieces by Mozart, Chopin, Beethoven, and Schoenberg. The music for each harpsichord performer is generated by the Illiac II computer using two programs written in Fortran: DICEGAME and HPSCHD.

DICEGAME is a subroutine designed to compose the music for the seven harpsichords. It uses a random procedure attributed to Mozart, the Dice Game, which generates music by selecting pre-formed musical elements with rolls of dice.

The second program, HPSCHD, is responsible for the sounds recorded on the tapes. It synthesized sounds with harmonics similar to those of a harpsichord, using a random procedure from the I Ching, or Book of Changes, an ancient Chinese divination text. It divided the octave into anywhere from 5 to 56 parts and calculated each division for the 64 choices of the I Ching procedure. I don’t completely understand how it works, but that apparently allows 885,000 different pitches to be generated.

Each performance of HPSCHD is supposed to be different, due to the randomly generated sounds from the tapes and performers, the varying number of tapes and performers, and the different arrangements of all these parts. A performance can play all of the sounds at once, individually, or anywhere in between. The recorded version above is just one of an infinite number of variations.

Further research:

https://www.jstor.org/stable/3051496?seq=1#page_scan_tab_contents

(An academic journal article from the University of Illinois Press. You can use your NYU ID to access it.)

https://www.wnyc.org/story/john-cage-and-lejaren-hiller-hpschd/

(Interesting podcast on HPSCHD)

DOM assignment

For this assignment, I made a burger order menu using checkboxes so that customers can choose the specific ingredients they would like inside their burger. I am going to use the DOM library, which can create HTML input types and other elements. On this canvas I am going to use createCheckbox() to create the ingredient checkboxes, along with an order submit button that deletes all the elements and creates a text that says ‘thank you for your order’.
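A minimal version of that plan might look like this, with a few placeholder ingredients:

```javascript
// Burger order menu sketch: checkboxes for ingredients plus a
// submit button that removes everything and shows a thank-you note.
let boxes = [];
let submitButton;

function setup() {
  noCanvas();
  createP('Build your burger:');

  // One checkbox per ingredient (the ingredient list is a placeholder)
  for (let ingredient of ['cheese', 'lettuce', 'tomato', 'bacon']) {
    boxes.push(createCheckbox(ingredient, false));
  }

  submitButton = createButton('order');
  submitButton.mousePressed(placeOrder);
}

function placeOrder() {
  // Log what was selected, then remove all the inputs
  for (let box of boxes) {
    console.log(box.elt.innerText.trim(), box.checked());
    box.remove();
  }
  submitButton.remove();
  createP('thank you for your order');
}
```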

Processing Library – Box2D

Hello everyone,

I hope you are all doing well.

I am posting in regard to a library that I found on the Processing website that you guys might find useful if you choose to work with Processing.

The library is called Box2D, and physics is its main focus. It gives the user the ability to work with statics, dynamics, vectors, velocities, and much more. I found it interesting because I know some of you may want to work on video games, and physics is a really important part of making a game feel realistic. I plan to use it for my final project, and I hope you guys use it as well.

P.S.: The coding guru Daniel Shiffman of The Coding Train made this Processing wrapper for the Box2D physics engine.

Link to source code:

https://github.com/shiffman/Box2D-for-Processing