Thursday, February 25, 2016
This past week I worked on the scripts for the ScanMatch analysis.
I read all of the cleaned .csv files I had created into Matlab. I then wrote a function that builds the fixation sequences for each participant and each task in the form ScanMatch can use. I also wrote a function that assigns a sequence number to each line in each file (that is, to each AOI). This way, when I create the sequence data for each participant and task, I can check the line and file name against my AOI sequence map and get the correct sequence number for each fixation.
I discovered that in the coming week I will need to add a special case to my sequence map for when a participant looked at the task description/answer .txt file instead of the source code. I will also be working on writing the substitution matrix needed to run the ScanMatch algorithm on our data.
The PME Regional Conference went very well and all of the participants had a great time. My talk was well received, too.
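To illustrate the sequence-map idea, here is a minimal sketch in Python (the real scripts are in Matlab; the function names and the (file, line) representation of an AOI are assumptions for illustration). ScanMatch encodes each AOI as a pair of letters, which is where its limit of 26 × 26 = 676 AOIs comes from.

```python
from string import ascii_lowercase

def build_aoi_map(aois):
    # Assign each AOI a two-letter code ("aa", "ab", ...), in the style
    # ScanMatch uses; this caps the number of AOIs at 26 * 26 = 676.
    if len(aois) > 676:
        raise ValueError("double-letter codes support at most 676 AOIs")
    codes = (a + b for a in ascii_lowercase for b in ascii_lowercase)
    return {aoi: code for aoi, code in zip(aois, codes)}

def encode_fixations(fixations, aoi_map):
    # fixations: (file_name, line) tuples in temporal order; the result
    # is one ScanMatch-style string per participant per task.
    return "".join(aoi_map[fix] for fix in fixations)
```

A lookup map like this is what lets each fixation's file name and line number be checked against the AOI list to produce the correct sequence element.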
Thursday, February 18, 2016
Week 25: 2/11/2016-2/18/2016
This past week I completed the preliminary analysis for the Kent study we are collaborating on, and I worked on scripts to prepare our data for ScanMatch.
I edited and ran the R script I wrote to clean the Kent study data, turning the raw iTrace output into usable fixation (eye-gaze) data. The script creates two .csv files: the first aggregates fixations by source code entity name and file name, so we can see how many times a person visited each entity; the second is a time-series file of the source code entities visited over time.
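The two outputs can be sketched as follows. This is an illustrative Python version (the actual script is in R), and the column names timestamp, file_name, and entity are assumptions:

```python
from collections import Counter

def split_outputs(rows):
    # rows: fixation records as dicts with "timestamp", "file_name",
    # and "entity" keys (assumed names), already sorted by time.
    # First output: visit counts per (file, entity) pair.
    counts = Counter((r["file_name"], r["entity"]) for r in rows)
    # Second output: the sequence of entities visited over time.
    sequence = [(r["timestamp"], r["file_name"], r["entity"]) for r in rows]
    return counts, sequence
```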
I did more research on ScanMatch and discovered that I will have to write scripts to get our data into the specific form it requires. ScanMatch does not allow more than 676 AOIs when creating sequences, so I will have to generate the sequences from our data myself. The toolbox also does not provide a way to create a substitution matrix that works properly for our data, so I will have to write scripts for that as well. After that, we should be able to use the ScanMatch algorithm with a few modifications to its Matlab code. I plan to get those scripts written and run this weekend.
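At its core, ScanMatch scores a pair of sequences with the Needleman-Wunsch alignment algorithm, using the substitution matrix to reward or penalize each AOI pairing. A rough Python sketch of that scoring step (ScanMatch itself is Matlab, and the gap penalty here is just a placeholder value):

```python
def needleman_wunsch(s1, s2, sub, gap=-1.0):
    # s1, s2: sequences of AOI indices; sub: substitution matrix where
    # sub[a][b] scores aligning AOI a with AOI b; gap: penalty for
    # inserting a gap in either sequence.
    n, m = len(s1), len(s2)
    score = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            score[i][j] = max(
                score[i - 1][j - 1] + sub[s1[i - 1]][s2[j - 1]],  # align
                score[i - 1][j] + gap,                            # gap in s2
                score[i][j - 1] + gap,                            # gap in s1
            )
    return score[n][m]
```

ScanMatch then normalizes this raw score; identical sequences get the maximum possible score, so the substitution matrix is what determines how "close" two different scanpaths are judged to be.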
I have also been preparing for the Pi Mu Epsilon Regional Conference that we are hosting at YSU this weekend, Saturday, February 20, 2016. As the current YSU chapter president of PME, I have to shop for food, set up and perform registration in the morning, stuff folders, give a welcoming speech, and present a talk.
Thursday, February 11, 2016
Week 24: 2/4/2016-2/11/2016
This past week I updated the VISSOFT website, researched ScanMatch, and wrote scripts to analyze the data we collected in a study this past November/December.
I added the award announcement (with a picture) to the Submission page on the VISSOFT website. I also added links and a picture for the journal that authors of selected papers will be invited to submit to.
I spent a lot of time researching ScanMatch and how we will need to manipulate our data to get it into the form the algorithm expects. We will need to generate our own weighted matrix based on whether one participant looked at the same program slice as another. To break JabRef into program slices, we will be using srcSlice. I am in the process of downloading, compiling, and running srcSlice on each .xml file generated by srcML for JabRef. Once that is done, I can begin writing scripts to build the weighted matrix.
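The slice-based weighting idea can be sketched in Python (illustrative only; the slice sets and the score values are assumptions): two AOIs get a high substitution score when they share at least one srcSlice-derived program slice.

```python
def slice_substitution_matrix(aois, slice_map, match=1.0, mismatch=0.0):
    # aois: list of AOI identifiers; slice_map: dict mapping each AOI to
    # the set of program-slice ids it belongs to (assumed structure).
    n = len(aois)
    matrix = [[mismatch] * n for _ in range(n)]
    for i, a in enumerate(aois):
        for j, b in enumerate(aois):
            # Identical AOIs, or AOIs sharing a slice, score high.
            if a == b or slice_map.get(a, set()) & slice_map.get(b, set()):
                matrix[i][j] = match
    return matrix
```

With a matrix like this, two participants who fixated different lines inside the same slice would still be scored as following similar scanpaths.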
Finally, I wrote some scripts to analyze data we collected in a study in collaboration with Kent State University. This weekend I have to modify the scripts and run them, to get the data into the form requested by the student (from Kent) who will be doing a lengthier analysis on the newly formatted data.
Thursday, February 4, 2016
Week 23: 1/28/2016-2/4/2016
This past week I participated in a mathematical modeling competition, called COMAP's MCM. My team and I had to develop a model to determine the best strategy a person can take to maintain the heat in a bathtub. Then we had to write a paper proposal describing the model and our findings. It started on Thursday, 1/28/2016, at 8pm and ended Monday, 2/1/2016, at 8pm.
I also looked into TraMineR to see if it would be a useful tool for sequential analysis of eye-tracking data. I found that it would be useful for converting our data into an appropriate format for another algorithm, ScanMatch. It would also be useful for visualizing the common and not-so-common eye-gaze sequences that eye-tracking data produce.