Dan Bennett
Feb 24, 2022
🧵
Can signatures in user movement inform us about experience & behaviour?
... about engagement, fluidity, and distribution of attention?

Our #CHI2022 preprint takes an empirical approach to these questions, grounded in phenomenology & fractal geometry!

psyarxiv.com/729xw
The idea of "readiness to hand", from philosophical phenomenology, has long been influential in HCI. 
The story goes like this:
When using a well-functioning tool, skillfully, you stop being aware of the tool as a distinct object. You perceive through it, to the task.
But this situation can break down - for instance if you are unskilled, or the tool malfunctions.

Then the tool becomes an object of conscious awareness; you detach from the task; your attitude becomes more reflective and less tied to the original task context.
For HCI researchers thinking about usability & UX, this story offers a frame for thinking about design, immersion, and breakdown, and it describes a useful regularity in user behaviour and experience

But interpretations of the story have not always been consistent
The engaged mode of tool use has been treated as a prototype of fluid technology use and good user experience.

But is the second, detached, mode necessarily a marker of bad experience? 
Some have argued that it supports creativity, observation, reflectiveness, learning
Other questions also arise:

Can these states be predicted? 
Is the phenomenon reliable across different contexts?
How strong is the effect?
Is this a hard dichotomy between two modes, or more of a spectrum?
Answering these questions seems worthwhile: readiness-to-hand is influential in HCI, and ideas associated with it - like fluidity, immersion, reflectiveness - are common currency.
Recent philosophy suggests the account may help identify cases where tech affects user autonomy
But answering these questions will rely on tools for empirical investigation, and some theory of how the phenomenon arises. 
To date, HCI has lacked both.

Luckily, recent research in cognitive science offers a way forward.
We built on work by @Dobri_Dotov, @tonychemero and colleagues, which associated the phenomenon of ready-to-hand tool use with complex signatures in task-directed movement
sciencedirect.com/science/articl…
The theory here is that ready-to-hand tool use arises from a kind of complex sensorimotor coupling with the tool: a "cascade structure" which is known to give rise to multifractal nesting signatures in movement, due to the coordination of a wide range of sub-processes
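(Aside for the curious: a "multifractal signature" means the scaling of movement fluctuations, F_q(s) ~ s^h(q), varies with the moment order q; a wide spread of h(q) indicates multifractality. One standard estimator is multifractal detrended fluctuation analysis (MFDFA). Below is a minimal Python sketch of MFDFA; it is illustrative only, not our exact analysis pipeline, and the function name, parameter choices, and toy data are my own assumptions.)

```python
import numpy as np

def mfdfa(series, scales, qs, order=1):
    """Minimal multifractal detrended fluctuation analysis (MFDFA).

    series: 1-D movement time series (e.g. mouse speed per frame).
    scales: window sizes (in samples) to evaluate.
    qs:     q-moments; negative q emphasises small fluctuations,
            positive q emphasises large ones.
    Returns h(q), the generalised Hurst exponents: a wide spread of
    h(q) across q is the "multifractal signature" discussed above.
    """
    # Work on the cumulative sum (the "profile") of the mean-centred series.
    profile = np.cumsum(series - np.mean(series))
    Fq = np.zeros((len(qs), len(scales)))
    for i, s in enumerate(scales):
        n_win = len(profile) // s
        # Variance of detrending residuals in each non-overlapping window.
        variances = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            x = np.arange(s)
            coeffs = np.polyfit(x, seg, order)       # local polynomial trend
            resid = seg - np.polyval(coeffs, x)
            variances.append(np.mean(resid ** 2))
        variances = np.array(variances)
        for j, q in enumerate(qs):
            if np.isclose(q, 0.0):
                # q = 0 needs the logarithmic limit form.
                Fq[j, i] = np.exp(0.5 * np.mean(np.log(variances)))
            else:
                Fq[j, i] = np.mean(variances ** (q / 2)) ** (1 / q)
    # h(q) is the log-log slope of F_q(s) against scale s.
    hq = [np.polyfit(np.log(scales), np.log(Fq[j]), 1)[0]
          for j in range(len(qs))]
    return np.array(hq)

# Illustrative use: width of h(q) as a crude multifractality index.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    speed = np.abs(rng.standard_normal(4096))        # stand-in for real data
    qs = np.arange(-5, 6)
    scales = np.unique(np.logspace(4, 10, 12, base=2).astype(int))
    hq = mfdfa(speed, scales, qs)
    print("multifractality (h(q) width):", hq.max() - hq.min())
```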
This kind of structure is known to support flexible adaptation in systems across nature.
It is hypothesized that it supports flexible skill in tool use, and that to maintain this state attentional resources are de-prioritised, resulting in the tool's "disappearance"
In the original study, the authors found that multifractal signatures of these structures, in players' mouse movements, were strong during normal play. At the same time, players performed poorly on a visual awareness test asking them to report visual properties of the game.
However, when the mouse was artificially "broken" during play, by distorting the relationship between mouse and cursor movement, multifractal signatures diminished, indicating reduced coupling. Simultaneously, performance on the visual awareness test increased.
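(To give a concrete flavour of that manipulation: a perturbation of the mouse-cursor mapping might look something like the sketch below. This is a hypothetical illustration in Python, not the original study's code; the smoothed-noise scheme and all names here are my own guesses.)

```python
import numpy as np

def perturb_delta(dx, dy, noise_state, rng, strength=1.0, rho=0.95):
    """Map one raw mouse displacement (dx, dy) to a distorted cursor
    displacement. strength=0 reproduces the normal, unperturbed mouse.

    Hypothetical illustration: the study distorted the mouse-cursor
    relation during play, but this particular scheme is my own guess
    at an implementation, not the authors' code.
    """
    # AR(1)-smoothed noise: the distortion drifts rather than jitters,
    # which feels more like a "broken" mouse than per-frame randomness.
    noise_state = rho * noise_state + (1.0 - rho) * rng.standard_normal(2)
    distorted = np.array([dx, dy], dtype=float) + strength * noise_state
    return distorted, noise_state

# Illustrative use inside an input loop:
rng = np.random.default_rng(1)
state = np.zeros(2)
for dx, dy in [(3, 0), (2, 1), (0, -4)]:          # stand-in mouse deltas
    (cx, cy), state = perturb_delta(dx, dy, state, rng, strength=5.0)
```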
We first replicated this result, addressing issues relevant to the HCI community: removing ecologically unrealistic constraints & refining the attention measurement

Then we extended the approach, testing new predictions drawn from the model and from accounts of readiness-to-hand
We asked how:
1. learning, and
2. task engagement
affect the multifractal signature.

1. Since ready-to-hand tool use relies on skill, we expected multifractality to be lower for an unfamiliar task, and to rise as familiarity increased over repeated play
2. Readiness-to-hand is associated with immersion & engagement, and previous work associates multifractality with engagement-related constructs like executive function.
We thus expected multifractal signatures to be higher during a more engaging version of the task.
We tested these questions using a simple game task.
In all cases the evidence supported our hypotheses: multifractal signatures tracked three distinct dimensions of readiness-to-hand, predicting task familiarity, task engagement, and attention during breakdown.
So what does this tell us? And where can we go from here?

First, this provides further evidence for the theory that phenomena associated with readiness-to-hand have their basis in complex sensorimotor coupling between user and tool.
Second, this theory and our approach to extending it provide a source of testable hypotheses about further aspects of readiness-to-hand, and support the testing and refinement of related design ideas
Finally, the work points to a useful addition to system designers' toolkits for inferring user experience and behaviour. Multifractal analysis requires only measurement of task-directed behaviour, likely obtainable through existing input devices
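(As a sketch of how cheap the measurement side can be: the snippet below turns logged cursor positions into a 1-D speed series of the kind multifractal analysis takes as input. Again illustrative Python under my own assumptions; `speed_series` is a hypothetical helper, and its output would feed something like the `mfdfa` sketch earlier in the thread.)

```python
import numpy as np

def speed_series(positions, timestamps):
    """Turn logged (x, y) cursor positions into a 1-D movement-speed
    series: the kind of task-directed signal MFDFA-style analysis needs.
    Hypothetical helper, not an API from the paper."""
    pos = np.asarray(positions, dtype=float)       # shape (N, 2)
    t = np.asarray(timestamps, dtype=float)        # shape (N,)
    step = np.linalg.norm(np.diff(pos, axis=0), axis=1)
    dt = np.maximum(np.diff(t), 1e-9)              # guard duplicate stamps
    return step / dt

# e.g. speeds = speed_series(logged_xy, logged_t); hq = mfdfa(speeds, ...)
```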
And while our work here focuses on mouse use, I have subsequently applied the approach to eye-gaze, typing and pen input. Other researchers have applied similar approaches to a wide range of movement types

This seems an incredibly flexible and under-investigated approach for HCI
We emphasise, of course, that this is early work - further work must be carried out on the way to applications!
But this is a celebratory tweet thread - head to the paper to see more discussion of that!
osf.io/2hm9u/
Finally: this is my first full CHI paper, and it took a loooong time to get it published. We found it difficult to explain some complex ideas clearly for a new audience, and we had a lot of help along the way.
So I'm going to do a cringey, effusive "THANKS!" bit
As well as @anneroudaut & @ousmet (my supervisors and co-authors), Jon Bird, @Feng58486062, @LewisChuang and @khornbaek were all generous with their time, helping us get clearer about this work and how to communicate it
@OtherFoovian & @dobri_dotov helped me grok the methods, both via their lucid papers and via conversations on twitter and email.

I guess the point is: if you're early career like me & struggling to publish something, there are probably people in your area you can ask for guidance and help
Correction here - I missed off @linnienyc who was first author on some of the first work on this!
