In this post I will wrap up my work on my Major Studio 2 final project, Citizen Score, and present some of the final images of the system, along with videos outlining the work.
Building the Final Iteration
Just to recap: after my last check-in, I decided to unite two of the original experiment ideas into one project, Citizen Score, a speculative design installation that imagines a future where ubiquitous surveillance, facial and feature recognition, and artificial intelligence finally combine into a system that can gauge a citizen's worth and likelihood to dissent.
I also identified several next steps I needed to work on to get the final iteration moving:
Fixing Up the Code & Connecting the Experiments
After struggling with some of the technology, I was finally able to modify the cute robot from “Play with Me” to collect the baseline values for “good”, “bad”, and “neutral” expressions, values necessary to power “Threat Score”.
The first thing I needed to achieve was figuring out a way to collect several videos and YML samples of users making neutral, bad, and good faces, in order to establish a baseline for measuring their reactions to the propaganda video.
To achieve this, I used a speech library for openFrameworks to get the system to interact with users, encouraging them to display certain emotions that I could collect.
This solved the problem of needing standardized emotion templates, or a very large database of emotions to pull from: I could measure each person against their own emotional reactions.
Using this system, I built an interactive experience that worked with users to get them to display a good response (a smile) and a bad response (sadness combined with disgust), auto-capturing a screenshot and a YML file of their face at that position.
I then had the system auto-create the YML files with the names good, bad, and neutral, so I could automatically feed them into the program that would track users' reactions to the video and calculate their Citizen Score.
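The capture step above could be sketched roughly as follows. This is a hypothetical, framework-free C++ illustration, not the actual openFrameworks code: the `BaselineStore` type, its method names, and the stand-in feature vectors are all assumptions. The real system serializes face-tracker data to YML; the key idea shown here is deriving the filename from the prompted expression label so the scoring program can load the three baselines without manual renaming.

```cpp
#include <array>
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Stand-in for serialized face-tracker data (the real system writes YML).
using FeatureVector = std::vector<float>;

// Hypothetical sketch of the baseline-capture step: each prompted
// expression is saved under a fixed label, so downstream code can
// load "good.yml", "bad.yml", and "neutral.yml" automatically.
struct BaselineStore {
    std::map<std::string, FeatureVector> samples;

    // Called once per prompted expression; returns the filename the
    // real system would write for that expression.
    std::string capture(const std::string& label, FeatureVector features) {
        samples[label] = std::move(features);
        return label + ".yml";  // e.g. "good.yml"
    }

    // The experience is only complete once all three baselines exist.
    bool complete() const {
        static const std::array<const char*, 3> required{"good", "bad", "neutral"};
        for (const char* r : required)
            if (samples.find(r) == samples.end()) return false;
        return true;
    }
};
```

In this scheme, the interactive robot simply calls `capture` after each prompt and checks `complete` before moving the user on to the propaganda video.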
Here are some images taken from collecting the baseline emotions where the user is reacting to the interactive system.
Creating my Propaganda Video
I next needed to create a propaganda video to have users react to. After some initial user testing, I decided to go with three contentious stances that would be effective triggers for emotional reactions: pro/anti immigration and diversity, pro/anti gay rights, and pro/anti abortion.
User Testing the Experience
User tests showed that emotional reactions to the videos were projected quite visibly, and the code ran successfully all the way through to calculating a Citizen Score.
I also learned through prototyping that a random threat level wasn't very engaging, so I created ranges of scores with associated labels such as "Patriot", "Dissident", and "Average" to capture users' relation to the paradigm I had built. In the end, I was pleased to find that the more awkward parts of the interaction from earlier testing were largely eliminated by connecting the two experiences.
I was also able to get the code to reliably calculate a Citizen Score from these reactions.
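Since the post doesn't give the actual scoring formula, here is one minimal way the score-and-label step could work, as a hedged sketch: compare a reaction against the user's own "good" and "bad" baselines, score higher the closer it sits to "good", and bucket the result into the labels mentioned above. The function names and the threshold cutoffs are illustrative assumptions, not the installation's real values.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <string>
#include <vector>

// Euclidean distance between two feature vectors of equal length.
static float distance(const std::vector<float>& a, const std::vector<float>& b) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        float d = a[i] - b[i];
        sum += d * d;
    }
    return std::sqrt(sum);
}

// Hypothetical score in [0, 100]: 100 means the reaction matches the
// user's own "good" baseline exactly, 0 means it matches "bad".
float citizenScore(const std::vector<float>& reaction,
                   const std::vector<float>& good,
                   const std::vector<float>& bad) {
    float dGood = distance(reaction, good);
    float dBad  = distance(reaction, bad);
    if (dGood + dBad == 0.0f) return 50.0f;  // degenerate case
    return 100.0f * dBad / (dGood + dBad);
}

// Score ranges mapped to the labels from user testing; the cutoffs
// here are invented for illustration.
std::string labelForScore(float score) {
    if (score >= 70.0f) return "Patriot";
    if (score >= 40.0f) return "Average";
    return "Dissident";
}
```

The important design point, reflected in this sketch, is that the score is relative to each person's own captured baselines rather than to a universal emotion template.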
Here are some pictures and videos of the user testing.
I also wanted to see whether this would work with people in public, using captured video rather than a live feed from the webcam. I quickly created a companion program to illustrate this point and to include in my final project presentation video.
Creating the Prints
To finish up the project, I created several prints from the Citizen Score process and hung them up around our space at Parsons. Had I more time, I would have liked to keep these prints up longer and collect people's feedback: what they think of the posters, how the posters make them feel about the person portrayed, and so on.
The Final Video
By uniting the two projects and having the robot lead the user through the interaction, I was finally able to translate my ideas and critical concerns into a creepy yet fun experience that left users weirded out, but thinking.
Now I needed to create an amazing video to showcase the results and process in an impactful and thoughtful manner – something I felt I had been lacking all year and worked hard to improve throughout my second semester.
Here are two videos I created. The first is a demo/teaser for the project, which I published on Instagram and edited entirely on my phone – because I wanted to see if I could.
The second, final video was created using Adobe Premiere, and I was very proud of learning many effects on the go as I needed them. My video-editing game has DEFINITELY improved this semester.
As you will see in the video, I also spent some time at the end of the project building a physical casing for the technology, creating a cute animated robot face and a podium to place it on while hiding all of the wires and circuit boards.
I also created some branded items (logo, animations, posters) for the project so I could demo it at an upcoming Major Major show that would allow people to check out our work.