The Future of Interaction is Not A Single Finger

This reading started out the way most things begin: disappointingly.

“Here is the future” and then, a video. A magical video. I was so excited to watch it. And then, as I clicked “play”….

VIDEO UNAVAILABLE

great.

I loved the tone of this reading. It kept me engaged and drew me in as it presented ideas I hadn’t thought of before. I completely agree with most of them. A group of inspired people is definitely one of the most powerful forces and hands are super important for doing anything in life.

I appreciated his ability to make me think about little actions I did and how those little actions weren’t so little for my hands. Tying my shoes, for example, involves such precise, nuanced movements and habits that I have learned over time. When you first learn how to tie your shoes, you’re clumsy and you don’t quite understand how to do it. As it becomes muscle memory, however, your hands figure out how to move in just the right way. Hands are such an impressive part of our body. With all the joints, bones, and possibilities, why wouldn’t we use them to enhance our technology?

The thing that stuck with me the most, however, was the quote “with an entire body at your command, do you seriously think the future of interaction should be a single finger?”

This really made me think about where our technology is right now and where we see it going in the future. Phones started with using one finger to swipe, one finger to type in a password. Then, it was one finger on a fingerprint reader to unlock the phone. Now, we’re even beyond that: facial recognition. Why are we straying from hands when Bret so truthfully pointed out that the future is in them?

Response: A Brief Rant on the Future of Interaction Design

I think that Victor makes some really good points in his article. This might seem odd, but it was not until he mentioned that humans desire some feedback from the tools and objects they are manipulating that I realized how important this was. Specifically with his mention of the iPad keyboard being not nearly as fun or immersive to play as an actual keyboard or piano, I realized just how important feedback is in the experience of using a product. Feedback not necessarily in the form of getting the desired outcome(s), but in the other things that make the product immersive, creating a “genuine” experience for the user.

As we start doing more and more things with our phones, tablets, and laptops, Victor makes an extremely good point that something may be lost when feedback and an immersive process disappear as these tasks become “computerized.” As an aside, I really find the discussion of how humans will react to technological advancement thought-provoking. How will everything becoming computerized influence the human condition, when tasks that humans have been doing for millennia are reduced to something a machine can do in a matter of seconds?

Midterm Proposal

For my midterm project, I hope to create a toy car that, when it detects objects extremely close in front of it, will go into reverse, rotate a little bit, and continue on its path forward. I was inspired by a remote-controlled car I saw at Yas Mall a few weeks ago that would spin its wheels when it came into contact with a ramp or slanted surface and then continue in the opposite direction.

This project will require the two wheels that come with the SparkFun Inventor’s Kit, a distance sensor, and the Arduino itself. I also plan on adding a sound library that would play a noise when the car comes close to a surface. I am also thinking of adding a servo motor with pictures of faces attached that would switch depending on the car’s current state (for example: a happy face for when the car is running unobstructed, and an angry face for when it is close to an object).

I am still trying to think of ways to make the project more interactive, specifically in how a person might interact with the car besides blocking its path and forcing it to go into reverse.
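As a rough starting point for the code, here is a minimal sketch of the obstacle-avoidance logic. It assumes an HC-SR04-style ultrasonic distance sensor and two DC motors behind a simple driver; the pin numbers, distance threshold, and timings are placeholders to be tuned on the real car.

```
// Minimal obstacle-avoidance sketch (assumed hardware: an HC-SR04-style
// ultrasonic sensor and two DC motors on a simple driver; pins are placeholders).

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const int LEFT_FWD = 3, LEFT_REV = 4;    // hypothetical motor-driver pins
const int RIGHT_FWD = 5, RIGHT_REV = 6;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(LEFT_FWD, OUTPUT);  pinMode(LEFT_REV, OUTPUT);
  pinMode(RIGHT_FWD, OUTPUT); pinMode(RIGHT_REV, OUTPUT);
}

// Read the distance in centimeters from the ultrasonic sensor.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
  return duration / 58;                            // ~58 us of echo per cm
}

// Drive each side forward (true) or in reverse (false).
void drive(bool leftFwd, bool rightFwd) {
  digitalWrite(LEFT_FWD,  leftFwd);  digitalWrite(LEFT_REV,  !leftFwd);
  digitalWrite(RIGHT_FWD, rightFwd); digitalWrite(RIGHT_REV, !rightFwd);
}

void loop() {
  long distance = readDistanceCm();
  if (distance > 0 && distance < 15) {  // something is extremely close
    drive(false, false);                // go into reverse for a moment
    delay(500);
    drive(true, false);                 // spin in place to rotate a little
    delay(300);
  }
  drive(true, true);                    // continue on the path forward
  delay(50);
}
```

The sound and the servo-driven face could hook into that same obstacle-detected branch, since that is the point where the car knows something is in its way.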

Reading response 3 – A brief rant on the future of interaction design

It was very interesting for me to read this. Even though I love my iPhone’s touch screen, I resonate so much with wanting to touch and feel things with our hands. I think it’s such a loss to lose that interaction. The best part of living in a 3D world is being able to see and feel different shapes and angles and textures. As kids, when we walk around in a store, we want to touch everything. For some of us, this is still the case… Kids are told, however, to stop touching things, and I feel this restriction resembles the restriction of touch in the advancement of technology. But there’s a reason that we want to touch different textures and shapes. It comes from our curiosity. The writer talks about a group of inspired people being powerful, but inspiration is derived from experiencing things for yourself, and becoming curious about them.

As someone who loves the internet but still refuses to give up their journal and planner for Google Calendar no matter how many invites people send, and who appreciates physical books so much more, I generally agree with the writer, even though I feel their stance is a little bit extreme. For example, people used to use typewriters, and then big keyboards, and slowly the keys got thinner and thinner until touchscreen keyboards were made. But the experience of a typewriter, however much less efficient, is so much stronger. I would definitely not want to give up my Mac’s keyboard for a touch keyboard. To me, it’s great for some things to be touchscreen, but others shouldn’t be, because they would lose the experience. And experiencing these little things is what keeps life interesting and inspiring.

Useless (but playful!) midterm proposal

Taking great inspiration from Zimoun’s work with random tiny objects that make natural sounds, I would like to create something similar – though obviously on a much smaller scale. A very rough plan is to create a structure with a piece of metal (or some other material) placed close to a servo with a long stick attached to it. On the stick, several small objects (glass balls, metal nuts, etc.) would hang on strings. In this way, I would like to combine the servo’s output with the analog input of a sensor (pressure, light, motion – I will see what works in the process). The sensor would trigger the servo to move just the right distance so the stick stops in front of the metal piece, and physics will do the magic of the small objects swinging and hitting it.

I am starting very simple, but as I learned with the last project, complexity can be added once the simple idea works.
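A minimal sketch of that sensor-to-servo trigger, assuming a photoresistor on A0 and a hobby servo on pin 9, might look like the following; the threshold and the two angles are guesses to be tuned while watching the Serial monitor.

```
#include <Servo.h>

// Assumed wiring: a photoresistor (in a voltage divider) on A0, a servo on pin 9.
// The sensor triggers the servo to swing the stick toward the metal piece and
// back, so the hanging objects keep swinging and knocking against it.

Servo stick;
const int SENSOR_PIN = A0;
const int REST_ANGLE = 20;    // stick hanging away from the metal piece
const int HIT_ANGLE  = 110;   // stick stops just in front of the metal piece
const int THRESHOLD  = 600;   // tune by watching the Serial monitor

void setup() {
  stick.attach(9);
  stick.write(REST_ANGLE);
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);
  Serial.println(reading);              // helps pick a sensible THRESHOLD
  if (reading > THRESHOLD) {
    stick.write(HIT_ANGLE);             // swing toward the metal piece
    delay(400);                         // let the objects swing and hit it
    stick.write(REST_ANGLE);
    delay(600);
  }
}
```

Swapping in a pressure or motion sensor would only change the wiring and the threshold, not the structure of the loop.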

Response to Bret’s Response

I was pleasantly surprised to see how much self-reflection Bret actually demonstrated while responding to the critiques of and comments on his article. He acknowledged the limitation of staying theoretical and lacking concrete examples, and elaborated a little more on the “vision” element, making ambitious inspiration for future designers the ultimate objective of the piece.

The negative effect that technological advancements have on the physical mobility of humans is a consequence I had rarely thought about. Frankly, Bret’s criticism feels a little counter-intuitive, yet I can see the point. For hundreds of years, people have been striving for energy-saving inventions that would minimise the work and effort (and thus movement) of people, yet Bret points out the danger of where humanity might end up if it continues in the same direction, painting an image that is a little frightening.

Though I agreed with his responses and mostly found them a reasonable defence, I came to a point where I did not completely buy one of his explanations. Bret uses the words of Bergström to describe how not practising with our hands, due to touchscreens, can lead to a disability of touch in a way similar to blindness. Obviously, such a complex topic cannot be summarised and answered within the scope of a single paragraph taken out of context.

I just found this question of what consequences will be seen in our bodies and senses oversimplified, and felt that he also dismisses the fact that while losing some skills and habits, we might gain new ones. Bret described the changes as inherently negative and argued that we should design technology according to what our bodies do at the current stage of development. Yet this ignores the fact that our bodies might be slowly merging with technology in a weird symbiosis, eventually opening the possibility of us not needing the specific grips, for instance, at all.

Response: “A Brief Rant on the Future of Interaction Design”

Bret’s article “A Brief Rant on the Future of Interaction Design” was a very stimulating read. I really enjoy reading half-formed ideas that do not necessarily serve the purpose of strong argument and persuasion so much as they serve as inspiration – which can push us to challenge the status quo in our own way, even at such banal scales as the design of the things we use daily. As mentioned in my previous posts, I struggle a lot with making myself think outside of the known forms – just because one thing was designed in a certain way, with only subtle alterations throughout the years, it does not mean that we should continue building on the very same design of “yesterday’s technology.” Yet I need to keep reminding myself of this constantly.

Here I found Bret’s central point of designing technology for humans particularly useful: he articulates the struggle I already had, but with attentive observation of human nature. Though he did not bring anything concrete to the table, his remarks regarding the sensitive use of hands, through which we can feel instant feedback from almost any object (apologies to Crawford), highlight well the need to observe human behaviour when designing a future interaction. Technology can be mended easily; human behaviour, not so much. Yet the way Bret describes it, this behaviour is unchangeable – and if designed badly, the interaction will remain unsuccessful, inefficient, and uncomfortable.

I immediately connected this problem to something I ran into in the last couple of weeks. I recently started drawing on a tablet with a stylus – and I could not wrap my mind around the weird sensation I experienced while using it. Though the designers put a lot of effort into making the stylus feel similar in shape and weight to a pen or pencil, as well as decreasing the distance between the glass and the tip of the pen, there was something inherently funny and unsettling about the interaction.

Yet I would not say it wasn’t tactile in the way Bret describes screens – on the contrary, maybe it was way too tactile, leaving my hand very confused about the flow, friction, and feedback of the whole drawing interaction, which came as a sudden replacement for the interaction between a hand, a piece of paper, and a pen. Frankly, I was not completely convinced by the argument that we traded the tactile for the visual with touchscreens – we still experience both, but as creatures of habit, we tend to get very whiny and uncomfortable when the known form we are used to changes. It made me think about the main obstacle we should focus on when predicting human behaviour around interaction: is the problem habits, which can be changed the way technology can be mended, or is it human nature, which is unchangeable and deeply encoded within our genes?

And, to leave this on a lighter note, here is a quick meme to balance the heaviness of the text – one that, weirdly, relates to Bret’s explanation of human capability, tool, and need, and the whole futuristic spirit of the article.

Midterm Proposal: Song Cover

For my midterm, I wanted to expand on my last weekly assignment, which was to build an instrument using tone and servos. What I had in mind, then, was to have a self-made band – or, in other words, to create a cover of a song using what I have.

The song I have chosen is “Pumped Up Kicks” by Foster the People, and my version is actually a cover of the acoustic live version of it. Here is the cover.

So basically, if you listen to the song, there is the drum, the bass guitar, and the acoustic guitar. I will now describe how each component will be re-created through Arduino.

Drum: I have already started on the drums. I made a small kit, as you will see in the video below, and now I just need the servos to be attached and do the job for me. I will also have to code the speed at which the servos, or should I say drumsticks, will move up and down.

https://www.youtube.com/watch?v=pbI4eq4lDOw&feature=youtu.be

I do not know why this video is not showing as embedded, but the link should redirect you to the video.

Bass, Acoustic: Both the bass guitar and the acoustic, I believe, should be re-created with tone. However, as you can hear, while the bass plays constantly throughout the song, the acoustic solo is only partial. So I have to make sure that while one loops constantly, the other plays only once when I perform a certain action.

All of this will be performed through an analog input. When I fiddle with the photoresistor, for instance, the drum will start playing, etc. I want to make sure that each layer is added when I start the song, and hopefully my singing voice will be the last one to be added, completing the cover for the song.
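To give a sense of how the layers could run together, here is a rough sketch of the timing structure: it uses millis() instead of delay() so that a servo drumstick and a looping bass line can play at the same time, with a photoresistor starting the song. The pins, the threshold, and the note frequencies are placeholders (not the actual bass line), and since tone() only drives one pin at a time on an Uno, the acoustic part would need a second buzzer handled in alternation or a separate board.

```
#include <Servo.h>

// Sketch of the layering idea: a looping bass line on a buzzer plus a servo
// drumstick keeping the beat, both timed with millis() so they run together.
// Assumed wiring: buzzer on pin 8, servo on pin 9, photoresistor on A0.

Servo drumstick;
const int BUZZER_PIN = 8;
const int LDR_PIN    = A0;
const int THRESHOLD  = 500;   // covering the photoresistor starts the song (tune this)

const int bassline[] = {196, 147, 165, 131};   // placeholder frequencies, not the real part
const int NOTE_MS = 400;      // length of each bass note
const int BEAT_MS = 500;      // time between drumstick hits

bool started = false;
bool stickUp = true;
int noteIndex = 0;
unsigned long lastNote = 0, lastBeat = 0;

void setup() {
  drumstick.attach(9);
  drumstick.write(90);        // stick raised
}

void loop() {
  // Fiddling with the photoresistor starts everything.
  if (!started && analogRead(LDR_PIN) < THRESHOLD) started = true;
  if (!started) return;

  unsigned long now = millis();

  if (now - lastNote >= NOTE_MS) {               // bass loops constantly
    tone(BUZZER_PIN, bassline[noteIndex], NOTE_MS - 50);
    noteIndex = (noteIndex + 1) % 4;
    lastNote = now;
  }

  if (now - lastBeat >= BEAT_MS) {               // drumstick alternates up/down
    drumstick.write(stickUp ? 60 : 90);
    stickUp = !stickUp;
    lastBeat = now;
  }
}
```

The one-shot acoustic part would follow the same pattern: a flag that flips when the trigger action happens, and a block that steps through its notes once instead of looping.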

Midterm Proposal

I have always been interested in the intersection between visual arts and technology, and Interactive Media acts as the perfect middle ground for those interests. For my midterm project, I am considering creating a small art installation or performance, which will combine the use of a certain medium (paint, graphite, ink, etc.) with different sensors and motors to facilitate the process of producing a work of art. The installation will be interactive in the sense that the user can decide which movements or brushstrokes will be undertaken by the motors by operating the different sensors.
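As one possible building block for this, here is a purely hypothetical sketch (assumed wiring: a potentiometer on A0 and a servo holding the brush on pin 9) in which the visitor’s hand movement is translated directly into a brushstroke angle:

```
#include <Servo.h>

// Hypothetical sensor-to-brushstroke mapping: a potentiometer on A0 steers
// a servo arm holding the brush, so the visitor's movement becomes the stroke.

Servo brushArm;

void setup() {
  brushArm.attach(9);
}

void loop() {
  int knob = analogRead(A0);                // 0-1023 from the potentiometer
  int angle = map(knob, 0, 1023, 0, 180);   // translate into a stroke angle
  brushArm.write(angle);
  delay(15);                                // give the servo time to move
}
```

More sensor-motor pairs (light, pressure, distance) could be added in the same way, each deciding a different aspect of the mark being made.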

Midterm Project Proposal – LoopMusicBox

For the midterm project, my idea centers on creating an interactive piece using analog sensors and different sounds. The LoopMusicBox will be an enclosed box with three labeled buttons for discoverability (SAVE, PLAY, RESET), a buzzer/disc, and a proximity sensor that allows the user to create music with notes determined by the user’s physical movement.

When the piece is powered, the buzzer will play a note whose frequency the user can change in real time via the mapped value from the sensor – the closer your hand is to the sensor, the higher the frequency of the played note.

When the user presses the SAVE button, the program will save the note at the frequency it has at that moment and place it in an array of saved notes. The user can repeat this process until the PLAY button is pressed.

When the user presses the PLAY button, the program will play the notes saved in the array until the RESET button is pressed. Then, when the RESET button is pressed, the program resets and restarts the process from the beginning.
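Below is a minimal sketch of that SAVE/PLAY/RESET flow, assuming an analog proximity sensor on A0, a buzzer on pin 8, and the three buttons wired with INPUT_PULLUP (pressed reads LOW); the pin numbers, frequency range, and direction of the sensor mapping are assumptions to be adjusted on the real box.

```
// LoopMusicBox flow sketch. Assumed wiring: analog proximity sensor on A0,
// buzzer on pin 8, SAVE/PLAY/RESET buttons on pins 2/3/4 with INPUT_PULLUP.

const int SENSOR_PIN = A0;
const int BUZZER_PIN = 8;
const int SAVE_PIN = 2, PLAY_PIN = 3, RESET_PIN = 4;

const int MAX_NOTES = 32;     // size of the loop; raise for longer phrases
int savedNotes[MAX_NOTES];
int noteCount = 0;
bool playing = false;

void setup() {
  pinMode(SAVE_PIN, INPUT_PULLUP);
  pinMode(PLAY_PIN, INPUT_PULLUP);
  pinMode(RESET_PIN, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(RESET_PIN) == LOW) {        // RESET: clear the loop, start over
    noteCount = 0;
    playing = false;
    noTone(BUZZER_PIN);
    delay(250);                               // crude debounce
  }

  if (!playing) {
    // Live mode: hand distance sets the pitch (assumes the sensor reading
    // rises as the hand gets closer; flip the map() range if it doesn't).
    int freq = map(analogRead(SENSOR_PIN), 0, 1023, 200, 2000);
    tone(BUZZER_PIN, freq);

    if (digitalRead(SAVE_PIN) == LOW && noteCount < MAX_NOTES) {
      savedNotes[noteCount++] = freq;         // SAVE: append the current note
      delay(250);
    }
    if (digitalRead(PLAY_PIN) == LOW) {       // PLAY: switch to looping the saved notes
      playing = true;
      delay(250);
    }
  } else {
    // PLAY mode: loop through the saved notes until RESET is pressed.
    for (int i = 0; i < noteCount && digitalRead(RESET_PIN) == HIGH; i++) {
      tone(BUZZER_PIN, savedNotes[i], 280);
      delay(300);
    }
  }
}
```

Storing the notes in a fixed-size array keeps the logic simple; a longer loop would just mean raising MAX_NOTES.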