
U of A researchers link smartphone swipes to decision-making clues


Researchers at the University of Alberta are gaining insight into our decision-making processes by analyzing data from our touchscreen devices. Each interaction with a smartphone or tablet involves actions like reaching and swiping, the same movements we use to interact with the physical world.

Craig Chapman, an associate professor in the faculty of kinesiology, sport, and recreation at the U of A and founder of the Action in Complex Environments Laboratory (ACELab), is collecting this data and applying it to several fields. 

In a recent study, the ACELab explored the relationship between decision-making and interaction with technology. Chapman conducted the study with his master’s student Alexandra Ouellette Zuk as lead author and his PhD student Jennifer Bertrand.

Chapman explained that the ACELab is based on the concept “thinking is moving,” which relates to the tools he has developed to “measur[e] how people move and where they look.” As his research has progressed since graduate school, it “became clear that [they] could use these really sensitive measures as a way of telling what was going on inside of someone’s head.” 

Tracking these movements is “much better than hitting a button,” according to Chapman

The first experiment run by the ACELab “had people reaching out towards [a] screen to choose one of two targets that appeared on [the] screen.” Chapman and his team noticed that they could tell a lot about how hard the decision was for someone by “tracking how their hand moved through space.” 
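
The article does not describe how those hand paths are analyzed, but a minimal sketch of one common proxy appears below: how far a reach or swipe bends away from the straight line between its start point and the chosen target, with larger bends often read as a sign of a harder decision. The function and the sample coordinates are invented for illustration and are not taken from the ACELab study.

```python
# Hypothetical sketch (not the ACELab's actual pipeline): quantify how much a
# two-choice reach or swipe deviates from the direct path to the chosen target.
import math

def max_deviation(trajectory):
    """Maximum perpendicular distance of the path from the straight
    start-to-end line. `trajectory` is a list of (x, y) touch samples."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    line_len = math.hypot(x1 - x0, y1 - y0)
    if line_len == 0:
        return 0.0
    deviations = []
    for (x, y) in trajectory:
        # Perpendicular distance from (x, y) to the line through the start
        # and end points, via the cross-product formula.
        d = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / line_len
        deviations.append(d)
    return max(deviations)

# A swipe that bows away from the direct path (as if briefly "pulled" toward
# the unchosen option) versus one that goes straight to the target.
curved_swipe = [(0, 0), (20, 15), (45, 35), (70, 40), (100, 30), (140, 10), (180, 0)]
straight_swipe = [(0, 0), (45, 0), (90, 0), (135, 0), (180, 0)]

print(f"curved swipe deviation:   {max_deviation(curved_swipe):.1f} px")
print(f"straight swipe deviation: {max_deviation(straight_swipe):.1f} px")
```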

Compared to traditional methods like tracking a computer mouse on a screen, this approach is “better than hitting a button because [you] get [much] more information,” Chapman said. Mouse tracking “is not moving [your] real hand to interact with an object in the world,” he added.

This is where touchscreen devices become interesting: the user must physically interact with the device, but it still offers “some of the flexibility and ease of use of a cursor,” according to Chapman. This kind of interaction better reflects how “our brains have been wired across millions of years of evolution. We should directly act on the objects in our environment that we care about,” Chapman said, which sets it apart from what mouse-tracking experiments can capture.

One reason modern technology has moved toward touchscreen devices is that they are the easiest to interact with, Chapman said. Efforts to gather data about how we move and interact with the world around us, like those at the ACELab, are increasingly turning to touchscreen devices because the touchscreen is “the easiest place to get information.”

If “you have awareness and consent around these issues, you can really reap huge benefits,” Chapman says

Chapman and some of his colleagues started the U of A spin-off company Gaze and Movement Analysis (GaMA). One of the projects that preceded the company’s formation was a partnership with a United Kingdom (U.K.) organization called Neurosight.

“They’ve developed a tablet or screen-based questionnaire where, instead of a typical questionnaire, it makes you [perform a] movement to choose one of two options,” Chapman said. For example, if the questionnaire asked whether you prefer to work alone or in a group, “how you choose [your answer] is really indicative of how you feel about that question,” he said. Essentially, it is not possible to “cheat your motor system,” according to Chapman.

This research at the ACELab could also be applied to health care. Chapman said that some aspects of assessment protocols or monitoring can be automated. “We’re not trying to replace practitioners, we’re trying to take their expertise and … multiply it.” By doing so, the abilities and reach of providers could be drastically scaled up, according to Chapman.

A common concern regarding this type of data is privacy. This information can be classed as biometric information, “which means that it is uniquely identifiable,” like a fingerprint, according to Chapman. 

This type of information allows for personalized care regimens designed for a patient’s particular needs, but it can also be problematic if “you’re interacting with a website and you don’t know that you’ve clicked the button that says you’re giving away all your data. It also means that those organizations can capture that data about you and who you are, and learn a lot about your decision-making, for example, purchasing decisions,” Chapman said.

However, Chapman believes that as long as “you have awareness and consent around these issues, you really can reap huge benefits.”


