Starting User Study Data Analysis

The TIDESS team has begun qualitative analysis of the data we collected from the tabletop user study, with the goal of characterizing how people learn from data visualizations on interactive tabletop displays. During the study sessions, we collected audio and video recordings as well as logs of all the touch interactions (gestures) that participants performed with the prototype.

To start, the team transcribed the user study session videos so that the transcriptions could be coded, i.e., labeled with the notable behaviors and spoken utterances that occurred. We then timestamped the transcripts so that the participants’ words could be matched with the gesture data logged by the application on the display. The team constructed a code book for the utterances by reviewing prior literature in the learning sciences and collaborative learning, along with insights from our own past work and the goals of this study. The codes in the code book allow the team to characterize group dynamics, collaborative work, and group meaning making. To refine the code book, all of the team members coded sample transcripts, and we discussed any disagreements during our team meetings until we agreed on an initial coding procedure.
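As a rough illustration of how timestamped utterances can be lined up with gesture logs, here is a minimal Python sketch. The file names, column names, timestamp format, and matching window are all hypothetical placeholders, not the team’s actual pipeline.

```python
import csv
from datetime import datetime, timedelta

# All names below (files, columns, formats) are hypothetical placeholders.
TIME_FMT = "%H:%M:%S"          # assumed session-relative timestamp format
WINDOW = timedelta(seconds=5)  # assumed window for matching gestures to speech

def load_rows(path):
    """Read a CSV with a 'timestamp' column and return its rows sorted by time."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        row["time"] = datetime.strptime(row["timestamp"], TIME_FMT)
    return sorted(rows, key=lambda r: r["time"])

utterances = load_rows("transcript.csv")  # columns: timestamp, speaker, utterance
gestures = load_rows("gesture_log.csv")   # columns: timestamp, gesture

# Attach to each utterance the gestures logged within WINDOW of when it was spoken.
for u in utterances:
    nearby = [g["gesture"] for g in gestures if abs(g["time"] - u["time"]) <= WINDOW]
    print(u["timestamp"], u["speaker"], u["utterance"], nearby)
```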

We are using MaxQDA, a qualitative data analysis program, to facilitate the coding. MaxQDA lets us apply the codes from the code book to classify the speech and actions in the transcripts. We’ve already begun coding the transcripts; next, we will analyze the coded transcripts to see where the interactions with the prototype helped or hindered group learning.

I am a 3rd-year Computer Science student at Brooklyn College, working in the INIT lab for the summer. It has been really interesting to see how people interacted with the interactive touchscreen tabletop display. Helping the team analyze the data has been fun, and I look forward to continuing the analysis.


TIDESS Open House

The Spring semester is complete here at the University of Florida, and so is the setup of our interactive spherical display from Pufferfish Ltd. To celebrate, the TIDESS team hosted an open house for members of the University of Florida community to come and interact with both the spherical display’s demos and the tabletop prototype we have been working on.

The open house was a success and was attended by students and faculty from both the Department of Agricultural Education and Communication (AEC) and the College of Engineering. Some of the attendees were also among the people who helped us navigate the logistical challenges of bringing the sphere to the University (e.g., customs, importing, and heavy lifting).

The TIDESS team was excited to see people interacting with the spherical display’s onboard demos, and to thank those who worked so hard to help bring the sphere to the lab. The attendees’ enthusiasm for interacting with this novel technology was interesting to see, and we look forward to tapping into that enthusiasm for our own research on the sphere.

The end of the Spring semester also marked the end of our data collection for the tabletop user study. As we move towards the summer, we will begin analyzing the data from the study, as well as exploring what will transfer from our tabletop prototype to the sphere platform.

I am a rising junior working in the INIT lab. This was my first semester working on the TIDESS project and my first time seeing a lab study. It has been incredibly exciting to watch the prototype develop and see each stage of research for the first time. I look forward to our continued partnership with Pufferfish as we move forward with the development of our first application on the sphere, and am very excited for the day that we have an open house with our own prototype displayed on the sphere.

Photos from our open house:


Interactive Spherical Display has Arrived!

In March, we received our interactive spherical display, the Puffersphere, from our partners at Pufferfish Ltd. The Puffersphere has a 360-degree viewing window with integrated audio and supports multi-touch interaction that allows users to explore and manipulate content on the display. When the sphere arrived, our entire team was excited to get started on the installation, which involved both assembling the hardware and getting the software up and running. For the initial hardware and software setup, we followed the installation manuals delivered with the system.

Setting up the hardware: The installation manuals provided the necessary information for setting up the connections and organizing the various system components. The system was delivered in four cases carrying the sphere, the projector, the enclosure panels (top plate, base plate, and side panels), and the lens. The installation process began with securing the base plate to the projector frame (Figure 1). After attaching the base plate, we fastened the back enclosure panel with wingnuts and washers (Figure 2) and connected the projector to the router with an Ethernet cable. We also connected the projector to the App machine (the computer) with a DVI cable. In addition, we connected the computer to the LED ring driver via USB, and then ran the LED cable and camera cable from the LED ring driver to the LED ring attached to the sphere. After making all the connections (Figure 3), we attached the front panel and then attached the top plate to the frame. We were then ready to fit the lens to the projector; the installation manual suggested removing the top and bottom caps of the lens to prevent any damage to the projector. To sharpen the projector’s focus, we projected the test pattern onto a white sheet of paper (about 30 cm away from the lens) and adjusted the focus until the image was crisp. After ensuring that everything was in place, we joined the sphere to the top plate and attached the side panels.

[Figures 1–3: hardware assembly]

Running demos on the sphere: After assembling the Puffersphere, we needed to calibrate it so that the image was balanced correctly on the spherical screen. The instruction manual directed us to use the projector’s Lens Shift function (both horizontal and vertical) to calibrate against the Pufferfish test pattern. We tried using the Lens Shift controls via the projector’s remote control but were unsuccessful: we were unable to get the computer to use the projector as a monitor, or the projector to recognize the computer as a source. We also found that no lights turned on when the router was plugged in. After consulting with Pufferfish about these issues, the main problem appeared to be that the router was not working. In addition, there were some loose power cables; to stabilize them, we decided to get a US power strip and use the default US power cables instead of the UK power strips.

To get the system working temporarily, we used another router operating at 2.4 GHz (54 Mbps). After replacing the router, the computer began communicating with the projector, and we saw a blue sphere with the Pufferfish logo (Figure 4). However, we still had trouble accessing the controller application via the web browser (IP address: 192.168.1.40). To access the machine’s operating system, we connected through a VNC server and launched the Pufferfish External control app manually, but we were still not able to reach the controller application. After some debugging, we noticed that the network was configured as public, so the Windows firewall was blocking the connection; this may have been a side effect of using the new router with the system. After making the network private, we could access the controller application and run the demos (Figures 5 and 6). With the system up and running, we were able to calibrate the image on the sphere using the projector’s Lens Shift controls and the Pufferfish test patterns (with concentric circles).
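For anyone debugging a similar setup, a quick scripted check of basic network reachability can help separate router and firewall problems from application problems. Below is a minimal Python sketch along those lines; the port number is an assumption (the manual only gave us the controller’s IP address), so treat it as illustrative rather than part of Pufferfish’s documented setup.

```python
import socket

CONTROLLER_IP = "192.168.1.40"  # controller address used in our setup
PORT = 80                       # assumed port for the web-based controller UI (hypothetical)

def reachable(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if reachable(CONTROLLER_IP, PORT):
    print("Controller machine is reachable; check the application and firewall profile next.")
else:
    print("No connection; check the router, cabling, and whether the firewall is blocking it.")
```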

[Figures 4–6: the Pufferfish logo and demos running on the sphere]

I am a 1st-year Ph.D. student in the Human Centered Computing Department at the University of Florida. It has been a thrilling experience to interact with the visually stimulating demos on this expertly designed display with a 360-degree viewing window. I am excited to start prototyping our first application on the sphere.



Introducing the Education Perspective

We have finally completed the iterative prototyping on the table-top interface and are now starting data collection! The project is coming along well, and excitement is stirring within the entire team. As a newcomer, it has been thrilling to see individuals interact with the table-top and to see, in the moment, how these interactions promote learning about Earth’s ocean basins. My name is Brittani Kirkland, and I am a first-year Master’s student in Agricultural Education and Communication at UF. I am excited to join this team and contribute my knowledge of and experience with non-formal (also known as informal or free-choice) learning. I have four years of experience in Extension education in the state of Florida with the Animal Science Department at the University of Florida. While animals are a bit different from the focus of this project, my experience developing non-formal education programs for the 4-H Horse programs will help me contribute to developing the mode of learning we are researching. I hope that my different background will add a new, creative perspective to this research team.

When I first learned of this project in January, it was difficult to wrap my head around. Not only was the concept foreign to me, but the idea of how these interactions facilitate learning was hard to define and digest. At the time, the team was revamping the prototype, as the iterative prototyping from the previous semester had revealed a need for some alterations. Being introduced in the middle of the project made it difficult for me to provide input on, or even follow, team discussions. However, within the last month I feel I have been able to grasp what we are researching and how these interactions foster scientific learning: allowing people to develop scientific skills, through interaction, that will transfer to future encounters with science exhibits. Conducting prototyping sessions last month was exciting, and I have seen how people interact with and learn from our prototype. Annie’s poster at the University of Florida Undergraduate Research Symposium gives an overview of the prototypes made throughout our iterative prototyping.

Going forward, we still have barriers to overcome. It is difficult to design something that allows for discovery without prompting. We live in a world of instant gratification, which makes it hard for us as researchers to retain the user’s attention while trying to investigate raw interaction. While observing, we must allow the user to become, and remain, engaged without providing positive or negative reinforcement that inhibits exploration. This has been a great challenge for our team: we want users to learn scientific skills, and as science educators, we must learn to take a step back and allow them to process and explore on their own in order to develop those skills. I am interested to see how we will continue to address this challenge and what our observations will tell us about how people interact with touch-enabled screens. I cannot wait to investigate the link between these interactions and learning, and to see how users discover more about our world and the changing temperatures in Earth’s ocean basins through interaction with this table-top interface as we begin our lab study sessions!


Preparing for the User Study

After months of iterative prototyping, the TIDESS team arrived at a prototype design to move forward with. This is the prototype we will use in the studies we conduct. The design combines ideas from our past prototypes with new ideas derived from the data in our Spring 2016 lab study.

To prepare for the study, the team recruited participants over two weekends at the Florida Museum of Natural History. Recruitment involved multiple team members engaging with museum-goers and handing out “packets” containing the information needed to contact the team about participating in the study. In particular, we were looking to recruit family groups of 2-4 that included at least one child aged 8-13 and at least one adult, which would allow us to observe how children and adults interact together at a museum exhibit. Recruitment overall proved to be a challenge: sometimes the museum didn’t have many visitors, which meant we gave out few packets, and of the many packets we did give out, only a few groups responded. This led us to brainstorm other ways to recruit families; we drafted a short ad to place in local newspapers and discussed possibly recruiting at a nearby school.

The next step was to rehearse the experiment script and get the logging setup ready. The experiment script is a guideline for the study that ensures every session we conduct is consistent. For logging, we set up two cameras to record in the lab study room and finished implementing a “logger” in the prototype, which records the names of the gestures performed and other details of participants’ interactions with the prototype.
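To give a flavor of the kind of records such a logger produces, here is a minimal Python sketch. It is illustrative only: the class name, fields, and example gestures are hypothetical, and the prototype’s actual logger is built into the tabletop application itself.

```python
import csv
import time

class GestureLogger:
    """Minimal sketch of a touch-gesture logger (illustrative; not the prototype's actual code)."""

    def __init__(self, path):
        # One log file per study session, with a header row.
        self.file = open(path, "w", newline="")
        self.writer = csv.writer(self.file)
        self.writer.writerow(["timestamp", "gesture", "target", "details"])

    def log(self, gesture, target="", details=""):
        # Record the gesture name plus any extra details (e.g., which element was touched).
        self.writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), gesture, target, details])
        self.file.flush()

    def close(self):
        self.file.close()

# Example usage: entries like these would be written as participants interact with the prototype.
logger = GestureLogger("session_log.csv")
logger.log("tap", target="ocean_basin_label")
logger.log("pinch", target="temperature_map", details="zoom_out")
logger.close()
```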

Once these details were sorted out, we were ready to begin conducting our studies. We drafted email responses to the families that inquired about participating and began scheduling study sessions so that we could start gathering data. This work will also lead us into the transition from the study on the tabletop to the one on the spherical display.

I’ve been working on the TIDESS project for two semesters now and have learned a great deal about research methods and research logistics. This is my first research project, and I’ve gained great insight into how to apply the methods and concepts I learned in my User Experience Design and Computer Science Education Research class. Overall, I’ve really enjoyed working on this project with my team, and I look forward to conducting the studies and collecting data so that we can move forward with the project.


Iterative Prototyping

Since the last update, we conducted several pilot studies to see how users interacted with our interactive tabletop prototype, which lets people explore Earth’s ocean temperatures. Pilot studies are smaller-scale studies with fewer groups of people that help guide the research and the study structure. We conducted the pilots because we wanted to make sure that the prototype was appropriate for the lab study we had designed.

During the pilots, we looked for evidence that people were able to deeply engage with the Earth’s ocean temperature visualizations using the interface we designed, and there were clearly some aspects that needed improvement on this front. Our IDC 2016 paper, in which we investigated museum visitors’ natural gesture interactions with Google Earth, helped us revise the design: we considered the most common interactions people tried in that study and followed the design recommendations we presented in the paper. For example, tapping was often the most common gesture museum visitors tried, so we wanted to design the interface so that this simple gesture lets people “dig in” to the content more quickly, as in the sketch below.
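As a purely hypothetical illustration of that design idea (not the prototype’s actual code), a tap handler might look something like this: the tapped region of the visualization simply reveals its more detailed content.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A tappable area of the visualization (illustrative only)."""
    name: str
    x: float
    y: float
    width: float
    height: float
    details_visible: bool = False

    def contains(self, px, py):
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

def on_tap(px, py, regions):
    """On a tap, reveal the detail layer for whichever region was touched."""
    for region in regions:
        if region.contains(px, py):
            region.details_visible = True  # e.g., show temperature detail for that basin
            return region
    return None

# Example: a single tap on the "Pacific" region reveals its detail view.
regions = [Region("Pacific", 0, 0, 400, 300), Region("Atlantic", 400, 0, 400, 300)]
tapped = on_tap(120, 80, regions)
print(tapped.name if tapped else "no region tapped")
```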

We decided to begin a round of iterative prototyping to enable more engagement during the user study. Iterative prototyping is the process of making changes to a design and testing it with users in a repeated cycle. Using the feedback from the pilots, we made changes to the design, from using different display elements to changing gestures. The pilots were extremely helpful because they gave us an idea of which parts of the interactive display were engaging and which were not. The goal of this process was to find a design that would enable users to discover all of the gestures and information presented on the interactive display. We also tried to keep in mind which design choices would transfer well to the Pufferfish spherical display, which we are hoping to receive in early March!

Another large development on the project is the addition of several new members to the team and the creation of our TIDESS website. Having these new members on board will help the project in many ways: with more team members, we can speed up implementation and bring more insights into the design.

I am a 3rd-year Computer Science student at the University of Florida. I’ve had a lot of fun and many insightful learning experiences on the TIDESS project, and I am excited to see where the iterative prototyping leads as we prepare for the user study. I am interested in how these design changes can afford better engagement with interactive displays.


TIDESS kicks off!

We are pleased to announce that the TIDESS project has recently been funded by the National Science Foundation under the Advancing Informal STEM Learning program.

This funding will enable TIDESS to investigate how children and adults can engage with data visualizations about science concepts, especially in earth and ocean science domains, on touch-interactive spherical displays.

Stay tuned for project updates!
