Journal

05/31/2013 18:00

To begin week 1 I arrived on campus and got settled in. I made sure to know where the building was before I had to be there, so I would not be late. I met my mentor, Professor Leilah Lyons, and she explained the entire EcoCollage project that I would be a part of. (Explained on the homepage) I also talked to Tia, the graduate student I would be working most closely with, and she informed me of the tasks I had at hand. For my first week she wanted me to get used to the different programs I would be using, break and fix code, and gauge exactly what my skill set was. Personally I wasn't ready to code, nor did I have a true desire to do so. However, the project as a whole was very interesting: the virtual graphics, the experimentation, the game design, and so on.

Once Tia left at lunchtime, I began to learn how to transcribe videos. Watching a group of students experiment with the prototype game board, I had to write down the time when each person began speaking, which "color" had spoken, what they said, and the time the sentence ended. These segments were about 12 minutes long and normally took two to three hours to transcribe. It was tedious and frustrating work, but it still beats coding for me.

Within the first week, I'd say the program I learned the most about was InqScribe: the different ways I could set shortcuts, the best ways to hear a person speak, listening at different play rates (faster or slower), and, for some amusement, making speakers repeat what they had said over and over until it became a little song. I also had to work with Xcode, which had JavaScript in it. Coding, once again, is not my favorite thing to do, but the application we had downloaded opened an iPhone on screen and ran a simulation exactly like a real iPhone. I could click an app to open it, hold it to make them all shake so they could be deleted, and even close the running apps. Although this was very interesting, I still had to look into the code behind it and try to see what the pieces did. Tia told me I should explore and be able to break and fix the different code segments. I was definitely lost, and I'm glad I could transcribe videos instead, no matter how time consuming that was.

Working with Xcode

  • Learning the different coding techniques
  • Learning what's possible to create with Xcode
  • Learning what controls what

Working with InqScribe

  • Transcribe Videos
  • Watch Student's reactions to Garden placement
  • Learn the strategies of the students
  • Do students grasp the concept of the placement?

06/07/2013 18:00

Beginning week 2 I met a new partner who attends UIC and is interested in being a part of EcoCollage. Since the other attendee is no longer able to make it to the project, his assistance will be much appreciated. Although Leilah is out of the office often, she was here and very helpful for the first two days of this week. She explained that we would no longer be trying to use iOS applications to project the game board; instead we will be using an overhead screen projector and figuring out how to simulate the game. So, instead of Xcode we will now be using the program Eclipse.

We had a meeting with other members of the EcoCollage team and collaborated on the different scenarios that could be incorporated into the game: what would really matter to the community when placing items to stop flooding, what could draw their attention so the game is not boring, and what would help players better understand each aspect. Listening to all of the ideas helped me see what matters to whom, such as people with basement flooding problems, commuters who deal with flooded streets, and the special "rose gardens" that would be ruined if flooded. It really opened me up to the different perspectives and helped me understand why people would or wouldn't want to install these different items.

I definitely got a reality check on what researching is: how I need to better understand what I am reading and condense my readings so others can learn the important information without having to read the same amount. My goal in the assigned readings was to find the best feature detectors, meaning the ones that would work best for the program we are attempting to create.

Goals going forward:

  • Begin downloading Eclipse
  • Test sample code
  • Find the threshold
  • See if it's possible to track multiple colors

06/14/2013 18:00

During week three I jumped into the coding aspect of the internship. The goal was to create a program that would show the threshold of a picture and be able to find green objects. It took about two to three days to figure this out, and with help from my co-volunteer we got a lot more work accomplished. He helped me focus more on the project and made coding a little more interesting. He is trying to learn more about coding, so the more he learns, the more he refreshes my memory of what I had learned in class before. This also teaches me some new things I may not have learned or remembered. Once we finally got the threshold to recognize green, we were told to focus on putting boxes around the green objects. That took a lot more work.
While playing around with the boxes we took different pieces of code and combined them with the code we had. With this new code we eventually ended up with three different windows: the original image, the threshold, and a screen that tried to draw contour lines. The contour-line screen was very interesting in that it changed the original colors to very odd ones; however, it did put boxes around the green images. Once this had been figured out I was assigned a new goal.
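The thresholding idea described above reduces to a per-pixel range check plus a bounding box over the matching pixels. Our actual code was C++ with OpenCV; the pure-Python sketch below is only an illustration of the concept, and the "green" bounds in it are made-up values, not the ones we calibrated.

```python
# Simplified stand-in for color thresholding: mark "green" pixels with a
# per-channel range check, then box the marked region.

def green_mask(image, lo=(0, 100, 0), hi=(100, 255, 100)):
    """Mark pixels whose (R, G, B) values fall inside the 'green' range."""
    return [[all(lo[c] <= px[c] <= hi[c] for c in range(3)) for px in row]
            for row in image]

def bounding_box(mask):
    """Smallest (top, left, bottom, right) box enclosing all marked pixels."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, hit in enumerate(row) if hit]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))

# Tiny 3x3 "image": one green pixel at (1, 1), the rest reddish.
img = [[(200, 50, 50), (200, 50, 50), (200, 50, 50)],
       [(200, 50, 50), (30, 200, 30), (200, 50, 50)],
       [(200, 50, 50), (200, 50, 50), (200, 50, 50)]]
print(bounding_box(green_mask(img)))  # → (1, 1, 1, 1)
```

In the real program the contour step drew these boxes over the live camera feed; here the box is just four coordinates.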

My goal was then to search for the green images only once a button had been pressed on the keyboard, to be able to hit escape to exit the running program, and to try to get the boxes onto the threshold image. The easiest assignment was hitting escape to exit, so that is the one I worked on first before continuing to the others. In the end, pressing the escape key to quit the running program was the only task I completed. The rest of the week was spent transcribing videos and seeing what other types of programs could be used with this project.
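The keyboard logic amounts to a small dispatch loop. In the real C++/OpenCV program a call like waitKey returns the pressed key each frame; this simplified Python stand-in drives the same logic with simulated key codes, where ESC (code 27) quits and a hypothetical 'g' key triggers the green search.

```python
# Simulated per-frame key handling: ESC exits, 'g' runs the green search,
# any other key just keeps showing frames.

ESC, KEY_G = 27, ord("g")

def run(key_presses):
    """Process simulated key codes; return the actions taken, in order."""
    actions = []
    for key in key_presses:
        if key == ESC:        # escape: quit the running program
            actions.append("exit")
            break
        elif key == KEY_G:    # 'g': search the current frame for green
            actions.append("find_green")
    return actions

print(run([KEY_G, ord("x"), KEY_G, ESC]))  # → ['find_green', 'find_green', 'exit']
```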

Tia, Hamlet, and I also went to visit the Innovation Center, where another partner on the project was working on different parts as well. He showed us numerous scenarios for showing the different types of systems on the map: swales, rain barrels, green roofs, permeable pavements, and even sewer drains. One of the programs worked off of certain patterns and another worked off the size of the square. When a piece was placed on the table, the program would place an image on the screen stating what type of system it was. This was a very interesting project and sparked many ideas and questions. It helped lead into week four.

06/21/2013 18:00

During this week Hamlet and I continued to work on grabbing the position of the thresholded object whenever a specific button was pressed. We first had to figure out how to track the object that was displayed. Once we figured that out, it was simple to capture the position at the press of a button. The only problem with the thresholding program is that it can only track one image at a time. We also got the contour boxes to go around the object of choice. However, because this program only thresholds and tracks one image at a time, we needed to find another that would be more helpful.
We were assigned to watch a video in which a man used Visual Basic to take a picture and have the program match that image against moving objects from the camera. We had to convert the code from Visual Basic to C++. We finally got the program to run properly and were able to take a picture of whatever we wanted and have the camera match and track the image. This program worked well until it encountered images without corners. I tried to create a picture with a paw print, heart, and check mark, but it was grabbing the corner points and couldn't really lock onto the rounder parts of each object. So we tried another program.

I believe the third program works with SURF as well. This program also ran very smoothly. It grabbed the important points in the picture that was projected on screen. It could grab multiple objects, which was a plus; however, we needed it to grab objects within a certain vicinity. Instead, it was just grabbing whatever seemed important to it; for instance, it circled my pupil, iris, and eye. It also had a very long range.
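SURF itself is far more involved, but the intuition behind "important points" (places where the intensity changes sharply in more than one direction) can be sketched with a crude gradient test on a grayscale grid. This is an illustrative stand-in, not the detector the project actually used.

```python
# Toy interest-point detector: a pixel is "interesting" if intensity
# changes sharply both horizontally and vertically around it.

def interest_points(gray, thresh=50):
    """Return (row, col) of interior pixels whose horizontal AND
    vertical intensity differences both exceed the threshold."""
    pts = []
    for r in range(1, len(gray) - 1):
        for c in range(1, len(gray[0]) - 1):
            dx = abs(gray[r][c + 1] - gray[r][c - 1])
            dy = abs(gray[r + 1][c] - gray[r - 1][c])
            if dx > thresh and dy > thresh:
                pts.append((r, c))
    return pts

# A bright wedge in a dark image: pixels along its corner edge trip
# both gradient tests, much like the program circling eyes on its own.
img = [[0,   0,   0,   0],
       [0,   0, 200, 200],
       [0, 200, 200, 200],
       [0, 200, 200, 200]]
print(interest_points(img))  # → [(1, 1), (1, 2), (2, 1)]
```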

During this week we also had a meeting with everyone working on the project, whether they were on Skype or in the meeting room. We talked about different pieces that could represent the objects at hand. Questions were brought up about how big the map should be, who the prospective candidates for testing the new simulation were, what types of materials would be used for the objects, and more. This was about a two-hour meeting with many different questions, concerns, and variables brought up. We now have new goals for next week and will hopefully be moving to a new lab so that we can better test our simulations.

Finally, during the week I got to take the objects Matt had made and test them in the SURF program that detected main features. I also created contrast by comparing each original picture with a black-and-white version. There were major increases in feature detection from the original pictures to the black-and-white ones. This was a pretty simple way to end the week.

06/28/2013 18:00

During this week we began to test the shapes with the overhead camera in the LSRI lab. We feel the overhead camera gives a better demonstration and also lets us measure more accurately how far the camera must be from the objects to get a reliable reading. We soon began working with the SURF code more so that the program would run better. We wanted the program to grab the object it detected and set it in a different window after a specified button was clicked. It took a while, but we finally got the piece working properly. Once we press a button, it now takes the image it is searching for, finds images related to it, and sets them off in separate windows. Our new goal is for it to properly select the images and the main features of each shape.
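Setting a detection off in a separate window boils down to cropping the detected bounding box out of the frame; the real code used OpenCV windows for display. This Python sketch shows just the slicing step, with a small number grid standing in for camera input.

```python
# Cut a detected region out of a frame so it can be shown on its own.

def crop(frame, box):
    """Cut the (top, left, bottom, right) region, inclusive, out of a 2-D frame."""
    top, left, bottom, right = box
    return [row[left:right + 1] for row in frame[top:bottom + 1]]

frame = [[1,  2,  3,  4],
         [5,  6,  7,  8],
         [9, 10, 11, 12]]
print(crop(frame, (1, 1, 2, 2)))  # → [[6, 7], [10, 11]]
```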

We had a meeting with Matt, our designer, and he showed us different materials that could be used in the project. There were wooden pieces from different cuts of wood, such as birch and butterwood. The shapes varied in size as well, from ¾ of an inch to 1¼ inches. This was to test how easily human hands could pick them up and how visible they were to the computer. Once we finished with the meeting we headed back to the ERF lab to finish up a little more coding, and then I began to transcribe more videos.

07/05/2013 18:00

To begin the 6th week we started combining the threshold code with the SURF code. The code is now able to search for a specific color and image that it is given. We have the program working well, with a few bugs where it crashes every so often. Our main goal is to move the equipment from the ERF lab to the LSRI lab so that we may begin testing with the overhead camera. This should give the SURF program better detection to run off of, since there is better lighting and a clear distance between the camera and the object. We carried the equipment over to the LSRI lab only to discover we had no table for the computer to sit on, so we came back to the ERF lab and I continued transcribing.

We eventually made it back to the LSRI lab so that we could begin testing the program. We set up the overhead camera by taping it to a pole and making sure it was parallel to the floor. The original camera had something wrong with it, so we had to use a wide-angle lens camera instead. With the wide-angle lens the camera has a bigger surface area to look at. It was my job to test which square sizes could be picked up and what the range was for each size. Tia had already finished a chart so that I could fill in the information easily.

I began testing the pieces, and at first it went smoothly. The two larger squares, the 1.75- and 1.5-inch ones, worked just fine. Anything smaller became a little fuzzy. Looking at the screen I could tell the camera was having problems picking up the pieces, so I started to tamper with the lighting of the area, brightening and dimming the lights to see if it would help. No matter what I did it still seemed to have trouble, so I looked closer at the screen to see if there was an immediate problem. It looked as if the table was reflecting the lights oddly, so I tried to cover the table with something smoother and less reflective. The only thing I could find in my workspace was a large piece of cardboard, and I believe the squares were too close in color to it to be discriminated by the camera. I tried moving the squares to several different areas, even the plain middle of the table, to see if they would get picked up, but nothing. I decided that since I couldn't find the others, I might as well try finding the exact points for the first two that were found.

This week was short because it was the Fourth of July, so we were only in for three days. I felt as if there was a decent amount of work completed, but there is still much more to be done.

07/12/2013 18:00

This week began with a meeting with Professor Lyons. We talked about all that I had been learning, what troubles I was having, and what still needed to get done. I told her that I was still a little iffy about asking questions, but she said that as I am learning I need to ask many questions, and that it also helps the graduate students practice teaching what they know. Even if they look a little frustrated with all the questions being asked, I should continue asking. We spoke about the previous week and the testing I had been doing. I was a little nervous that I hadn't gotten enough done, but she said what I had done was fine and that it was good I had tested out as many possibilities as possible. The goal isn't to have all the pieces working, but to see what would work best. She then sent an e-mail stating what needed to be done and my progress so far. Since Tia wasn't with me the first day this week, Professor Lyons told me what I needed to get done next and basically how I should get it done. It was mind-easing to speak to her and know I wasn't doing as badly as I thought.

On Tuesday, Tia was there to help me with the testing and the problems I had been having. We sat down and identified the main problems, which seemed to be the reflective table and the fact that the picture being grabbed wasn't big enough to find the key points. Tia then figured out how to threshold the image so that it could be read better. She also made it possible to resize the image the camera was getting so that it could match the key points much better. Once that was fixed I was able to finish testing the white squares on a black background. We had not yet gotten the other materials to work with, so I was done with testing and headed over to the ERF building to finish up my transcriptions.

Wednesday started off okay. We were successful in changing the threshold to look for black images on a white background. At first it did not work very well, but after some trial-and-error debugging we got it working. Then Tia took me through the code to make sure I knew what each part was doing. Once she left, Hamlet and I began testing the black pieces. An error window kept popping up, so I tried to fix it, and in doing so I seem to have broken the program. I searched for what to do and whether I needed to reinstall the libraries for Eclipse, but found nothing to help. After about two hours of trying to figure out what went wrong, I decided I should just wait until tomorrow to see if Tia could help fix it. I hoped I hadn't messed anything up too badly. I walked over to the ERF building to see if Chandan had any other videos for me to transcribe, since I couldn't figure out the coding and I had finished the ones assigned to me. So, for the rest of the day I transcribed more videos.

A frustrating Thursday began with still trying to fix the code in Eclipse. There were no errors shown in the classes; I restarted the computer, remade the project, and added all the necessary libraries, but it still would not run properly. The program even built, yet it kept reporting errors: "Binary not found" and application failures. I e-mailed Tia numerous times to see what I could do. I tried all of her suggestions and decided that by noon I should just go over to the other lab and continue transcriptions. Around 1 I received another e-mail from her stating that we had had this error before. She said she knew what she would have to do on Friday and would just fix it. I hoped she knew exactly what she was doing, and I felt a little silly for not being able to fix it myself since we had had this problem before. Anyway, the rest of the day was transcribing videos.

Friday was a long day of transcription and not much else. From 9 to 5 I transcribed the groups speaking. Not a fun day.

07/19/2013 18:00

Monday got off to a very bumpy start. It came off of Friday, when Tia had e-mailed me stating that we had fixed this problem before and I should have been able to fix it myself, clearly showing she was upset that I hadn't gotten any work done due to the issue. In her e-mail she stated that she had fixed the problem easily and had done some coding to clear up other bugs, so I should have been able to continue testing the pieces. However, when I went to run the program, all I saw were errors. I had no idea how to fix them, and at this point I was afraid to tell Tia about my problems. After about 3 hours of worrying and trying to fix the problem, I swallowed my pride and e-mailed Tia that the program was still having problems. She was upset that I hadn't contacted her earlier, but with the pictures I sent in the e-mail she could mostly tell what needed fixing. I had to comment out some of the code she hadn't finished and then redo some other things, because for some reason it couldn't find the executable file. I finally got it running and was able to test the rest of the pieces.

Tuesday began with a meeting of the minds: a Skype call with Professor Lyons, Moira, and Dean, while Brian, Tia, Chandan, Emily, and I were all in the same room. We discussed the type of map that would best be used, its size and dimensions, and other things we would need to work on with the map. They looked at the chart I had made from all the testing I had recently done to see which pieces and sizes would work best on the map. We also discussed using color recognition for the map pieces; Tia had stated this would be simpler to code. So later I will have to begin color detection and blob testing with the map and camera in the LSRI lab. There was also a lot of talk about how the public would react to the map. How would people interact if they didn't feel connected? Would one be more interested if the map were considered specifically their street block? What are the differences between private and public investors? There was a lot of talking. After the meeting was over, Tia informed me that we wouldn't continue to code until Friday and that for the rest of the day I should do gesture transcription. Sounds as fun as it looks. I had to go back through all of the videos I had already transcribed and write in the nonverbal gestures the users make.

Wednesday and Thursday were gesture-transcribing days, all day. The only good thing about gesture transcribing is that you don't have to listen to the speaking, so you can listen to music while you work.

On Friday we returned to the LSRI lab to begin writing the new code mentioned on Tuesday during the meeting. Tia was already at the lab when I arrived. She had begun working on the threshold code we had previously used. She changed what needed to be changed while I observed. We tried again to square out the fish-eye camera image, but we still couldn't get it perfectly correct. Once we finished the necessary code, I could begin testing the white pieces on the black background. Using threshold images was much easier for the program than SURF feature detection. Once I finished that testing I had to change the threshold to search for black pieces on a white background. I thought this would be a simple process, but of course it was beyond difficult for me. I was so close; I could see the square in the threshold screen, but the program couldn't detect where it was. I sent Tia an e-mail and she suggested changing the last two values and playing with them. I played with them for what seemed like forever, but still couldn't figure it out.

07/26/2013 18:00

To start the week off I began with more gesture transcription all of Monday.

On Tuesday, Tia and I met up in the LSRI lab to finish finding the color codes for thresholding without the projector. Some colors took a while because they were very close to each other, such as red and orange, or blue and black. We completed our task of getting all of the colors to work. Our next task was to find the coordinates of the squares on the black/white backgrounds again, but this time with the projector. I also had to find the correct HSV scale for the colors with the projector on. At first we couldn't get the projector hooked up with the VGA cable, but Matt found an HDMI hookup, so it worked. I decided I would test the black pieces on the white background and the white pieces on the black background like before. I finished the black pieces on the white background, and then called it a day.

Wednesday I came into a full workspace. Matt and another worker were already in the office area, so it was a little crowded. I finished up the white pieces on the black background by looking through old code to find the necessary threshold colors. It worked perfectly, but shortly after, something went wrong: the projector stopped connecting to the computer that was projecting the map. Much time was spent trying to figure out what was wrong with it. Tia had also left for a week-long vacation, so I was stuck doing this myself. After many hours I finally finished! I was so proud of myself; I think I actually understand the concept of HSV values in OpenCV now. The first set of parentheses holds the lower bounds and the second the upper bounds, so when you set them you are looking for the colors in between the two. Saturation was first explained to me as how gray the color is, which is what it looked like in the pictures, and it was also explained to me as the pigment of the color. Value is the lightness or darkness. In OpenCV the hue only goes to 180, unlike the regular color wheel, which is 360, so everything is split in half, but saturation and value both go up to 255. I learned a lot doing the testing on this part of the project. My next task will be to set up a switch operation in the code so that while running it can read more than one color and print the coordinates of each color read.
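The OpenCV HSV scaling described above can be sanity-checked with Python's standard library: colorsys returns H, S, and V in [0, 1], and OpenCV stores H halved (0 to 180, so it fits in a byte) with S and V stretched to 0 to 255. This is just a checking aid, not code from the project.

```python
import colorsys

def to_opencv_hsv(r, g, b):
    """Convert 0-255 RGB to OpenCV's (H 0-180, S 0-255, V 0-255) scale."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return round(h * 180), round(s * 255), round(v * 255)

print(to_opencv_hsv(255, 0, 0))  # pure red   → (0, 255, 255)
print(to_opencv_hsv(0, 255, 0))  # pure green → (60, 255, 255)
print(to_opencv_hsv(0, 0, 255))  # pure blue  → (120, 255, 255)
```

Note how green sits at hue 60 rather than 120, since everything is split in half.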

On Thursday I came in and decided I should test the other silicone piece we had, to see if its bolder color made a difference. In testing this piece I realized I would also have to retest the other colors to make sure they didn't clash with it. While retesting the pieces I decided I'd try to re-threshold each one so they would be more consistent. I was more successful with some than others. This is a very long process, but I wanted to get it right and look good for when Tia returned and Professor Lyons took a look at the project. What I really needed to be doing was trying to add the switch statement, but I'm still not confident enough in my coding skills to do so. Once I finished trying to perfect the thresholding, I attempted the switch statement. After looking up how a switch statement is written, I tried to place one in my code. I had no idea what I was doing, and seeing so many errors appear was becoming very frustrating. I really wasn't sure what to do and figured the best thing was to wait until Tia got back. I knew on Friday there would be a photo shoot for the prototype of the project, so that would hopefully fill my time.
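The "switch" goal, reading several colors in one run and printing each one's coordinates, reduces to classifying each detection's HSV value against a table of (low, high) ranges. Here is a sketch in Python, where a dictionary lookup plays the role of the C++ switch; the ranges and detections below are made-up placeholders, not our calibrated values.

```python
# Hypothetical HSV ranges on OpenCV's scale (H 0-180, S/V 0-255).
COLOR_RANGES = {
    "red":   ((0, 100, 100), (10, 255, 255)),
    "green": ((45, 100, 100), (75, 255, 255)),
    "blue":  ((100, 100, 100), (130, 255, 255)),
}

def classify(hsv):
    """Return the first color whose range contains hsv, else None."""
    for name, (lo, hi) in COLOR_RANGES.items():
        if all(lo[i] <= hsv[i] <= hi[i] for i in range(3)):
            return name
    return None

# Simulated detections: (hsv_value, (x, y) coordinate on the board).
detections = [((60, 200, 200), (3, 4)), ((120, 180, 220), (7, 1))]
for hsv, xy in detections:
    print(classify(hsv), "piece at", xy)
# → green piece at (3, 4)
# → blue piece at (7, 1)
```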

Friday started off well. There was a photo shoot for the prototype of the board game, and some very interesting angles and shots were taken. Matt also made some new stains for the pieces. I think a few of them will still be affected by the projector when placed on the board, but I would have to find out when I tested them. For now all I could do was test them without the projector.

08/02/2013 18:00

To begin Monday there was a meeting with Leilah, Emily, Brian, Matt, Dean, and Chandan to discuss what we had come up with so far. The pieces of the board were the main topic of discussion: how well they could be seen, how appealing to the eye they were, whether they could be read on the board, and so on. There was about a two-hour discussion about this and other aspects of the game. Once the meeting was done I had barely a 10-minute meeting with Leilah to show her what I had come up with. I really wish she were here more often to share what was happening and to discuss what was going on. With Tia gone at this point I felt pretty lost and wasn't sure what to do. I wasn't able to get the switch statement done, so I figured there had to be something better I could do with my time. I asked Leilah and she shot an e-mail over to Matt asking when he would have new test pieces I could try out; until then I figured I'd finish the gesture transcriptions I had left. So the rest of the day I did gesture transcription.

On Tuesday I went and picked up the new pieces Matt had created. Four were slight variations of the old ones with black backgrounds, and the others were made of paper with green, black, and pavement-colored backgrounds. I thought he was going to make the new pieces bolder like I had suggested, but every time he made new pieces they were just the same colors we had before, lightened or darkened. The red and orange pieces were still pretty difficult to tell apart, and now that he had added yellow, the yellow blended with the green. The blue, however, stood out pretty well. So for the rest of the day I had to re-test all of the new colors. I spent all day doing that and took a quick crack at the switch case again.

Wednesday was sort of an off day. Wasn't quite into working, so I tried to focus on what all needed to be done as a whole.

Tia returned on Thursday and waited for me to finish the color testing. I couldn't find any way to record gray, and I'm pretty sure I went through every combination possible. Once I finished, all I could write was that I couldn't find a combination for that color. Then I had to e-mail Matt to see if it was possible for me to use the projector. He had not responded and I had finished all of my transcribing tasks, so Tia told me to work on anything I had to do for DREU.

Friday was pretty easygoing. I finished up a few more gesture transcriptions, and then Chandan took me out for lunch and we got to talking about the overall experience and schooling itself. It was nice to talk to someone who has been through it all and knows what it is worth. It was a nice, easygoing day and I was thankful to finish out so well.

I was also very excited to be going home and seeing my family after missing them for ten weeks. This was a great and trying experience. It was definitely worth it.

Final Report

To review my final project report, please click here.

Contact

Brianna White b.n.white@iup.edu