Christina and I spent our time in lab this week learning how to use MaxTRAQ 2D and MaxTRAQ 3D. This pair of programs is an essential companion to MaxMATE, the software we will use to analyze and interpret the data we collect from the water striders! It proved tricky to understand, but we eventually made it to the calibration step of the process last week. The software is pretty advanced and can automatically compute the acceleration, velocity, position, etc. of digitized points; however, the positions of the cameras and the testing area had to be documented first. This involved constructing a structure with eight trackable points (beads were used in our case, see picture below) representing the x, y, and z directions.

After constructing and measuring the calibration “tree,” we began filming so we could input the calibration data into the computer. Unfortunately, it took a few tries to get it right. The first time we attempted to calibrate the software, we didn’t record enough frames of the calibration, which meant that when we tried to analyze video in MaxTRAQ we could only digitize half of it. Apparently it is necessary to record as much calibration footage as it takes to cover all the frames of the video you want to analyze. The trickiest part about calibration, though, was that once calibrated, nothing could be moved: if a camera is bumped, or the viewing box shifted, the whole system must be recalibrated. Christina and I had a bit of a battle with tape and various clamps to set up the station.

By the end of the week we had a successful water strider jump, along with the calibration, loaded into MaxTRAQ 2D. There we digitized all the important points along the water strider’s body during a jump (tracking points of interest frame by frame) and input them into the 3D software.
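For anyone curious what the software is actually doing with those eight beads: MaxTRAQ’s internals aren’t documented in this post, but the standard approach to this kind of two-camera calibration is the direct linear transformation (DLT), which fits 11 coefficients per camera from the known 3D positions of the calibration points and where each one appears in that camera’s image. A minimal sketch in Python with NumPy (the function names are mine, not MaxTRAQ’s):

```python
import numpy as np

def dlt_calibrate(world_pts, image_pts):
    """Fit the 11 DLT coefficients for one camera from known 3D
    calibration points and their digitized 2D image coordinates.
    Needs at least 6 non-coplanar points; our tree has 8 beads."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        # Each bead contributes two linear equations (one for u, one for v).
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        b += [u, v]
    # Least-squares solve: 8 beads give 16 equations for 11 unknowns.
    L, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return L

def project(L, pt):
    """Use the fitted coefficients to predict where a 3D point
    should appear in this camera's image."""
    X, Y, Z = pt
    d = L[8] * X + L[9] * Y + L[10] * Z + 1
    return (
        (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / d,
        (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / d,
    )
```

Doing this once per camera is what lets the 3D software turn two 2D views into x, y, z coordinates, and it explains why the beads have to span all three directions (coplanar points make the fit degenerate) and why nothing can move afterward: bump a camera and its 11 coefficients no longer describe where it is pointing.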
The 3D software takes the views from both cameras, along with the dimensions of the calibration tree, and gives us data to analyze in MaxMATE, which we later export into Excel. As I’m writing this post, everything sounds like it went very smoothly. In reality it was a bit of a struggle (a huge struggle, actually) figuring out which format to save each file in… is it saved as a *cds, or *mqa, or perhaps *AVI…? Each step requires a different method of saving, and these methods often left our heads spinning. The software also works “most of the time,” not all of the time… but most often. In theory, this could be a very cool data analysis tool, providing instantaneous feedback on acceleration, velocity, position, etc.!
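As an illustration of where that “instantaneous feedback” comes from: once a point has been digitized frame by frame, velocity and acceleration are just numerical derivatives of the position trace. This sketch uses the textbook central-difference approach (not necessarily MaxMATE’s exact method):

```python
import numpy as np

def derivatives(position, fps):
    """Central-difference velocity and acceleration from one digitized
    coordinate of a tracked point, sampled once per video frame."""
    dt = 1.0 / fps  # time between frames, in seconds
    v = np.gradient(position, dt)  # first derivative: velocity
    a = np.gradient(v, dt)         # second derivative: acceleration
    return v, a
```

Differentiating twice amplifies digitizing noise, which is one reason careful frame-by-frame tracking matters so much.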
However, water strider research wasn’t the only work we did last week! A few weeks ago, Christina and I met with the SNU frog team, whose lab is on the floor above ours, and we helped an American student from California named Chelsea (she is also fluent in Korean!) with the enormous process of feeding at least fifty baby frogs. The room in which the frogs are kept is extremely small and extremely packed. The picture to the left shows the stack of frog containers we had to individually open, clean, and feed. It was difficult to keep track of the fed frogs in such a small space, as well as the escaping fruit flies, the frogs’ food. I really enjoyed talking with Chelsea about her experiences in Korea (she has been in Seoul for about two years) and learning about restaurants and markets to visit! She recommended an Indian joint by SNU station that Christina and I checked out later in the week. Spoiler alert: it was delicious!
In terms of research, we also met with Piotr and Sang-Im on Friday to discuss the logistics of the research and software. We had been having synchronization problems with the two cameras. Basically, the master camera is supposed to control the recording times of the slave camera; however, our master wasn’t “truly” a master camera, and the recording times were off. This made it very difficult to download the same sections of video from the two cameras. Fortunately, Jae Hak and Piotr were on the job and helped us fix the problem. We plan on meeting again this Wednesday to run some jumps with our new understanding of the software, see what data we can record, and decide on the most beneficial research questions.
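For illustration only (this is not how Jae Hak and Piotr fixed it): one simple way to measure how far out of sync two cameras are is to film an event visible in both views, such as a flash, and cross-correlate the per-frame mean brightness of the two videos. A sketch, assuming NumPy and brightness traces you have already extracted:

```python
import numpy as np

def frame_offset(brightness_a, brightness_b):
    """Estimate how many frames camera A lags camera B by
    cross-correlating per-frame mean brightness of the two videos
    (e.g. of a flash visible in both views)."""
    a = brightness_a - np.mean(brightness_a)
    b = brightness_b - np.mean(brightness_b)
    corr = np.correlate(a, b, mode="full")
    # Positive result: the event shows up later in camera A's footage.
    return int(np.argmax(corr)) - (len(b) - 1)
```

A nonzero offset like ours means the “same” frame numbers in the two files correspond to different instants, which is exactly why downloading matching sections of video was such a headache.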