Research Agenda

This project is a collaboration between two research groups within the OU: the Digital Health Laboratory (DHL), which develops interactive systems for health and wellbeing; and the Software Engineering and Design (SEAD) group, which develops systematic approaches for engineering real-world secure, adaptive, and usable software systems. Relevant work includes: a computer-controlled vibrotactile device for multi-limb coordination; co-design approaches involving blind and visually impaired people to support technology development; digital platforms for emotional well-being during pandemic-induced social distancing; a handheld device that enables patients to self-log pain, aiding recovery assessment after joint replacement surgery and post-caesarean pain management; the use of sensors and haptic cues to enhance neurological patients’ walking gait recovery; real-time gait-lab-quality data for treatment decisions; machine learning techniques applied to stress echocardiography data to predict coronary artery disease risk; wearables and smart home technologies to aid older adults’ independent living; and research on socio-technical resilience in software development and autonomous systems.

Primary Aims and Objectives

This research aims to provide an adaptable, customisable, and affordable communication tool for people with complex disabilities.

Objectives:
O1: Create a knowledge base of people’s expressions and their corresponding meanings
O2: Analyse data to determine appropriate AI/machine learning techniques that can classify the collected data set, enabling mapping between an individual’s expressions and their meanings
O3: Identify appropriate sensors to support a range of individuals’ needs and abilities
O4: Develop an adaptable and customisable system architecture to meet each user’s needs and abilities
O5: Engineer iterative prototypes to evaluate the work
O6: Evaluate prototypes with various stakeholders
O7: Document and publish the findings of the evaluation results

Research methods

Part A) Knowledge base creation:
The knowledge base will be created from data collected through interviews with informal and formal carers and domain experts, and through sensor-based data capture. We will document (through a survey) the mapping between people’s expressions (gestures, actions, and sounds) and their corresponding meanings. We will recruit adult carers and domain experts (for example, medical professionals and physio-, speech-, and occupational therapists). Participants will be recruited by word of mouth through special needs schools, clinics, and therapy centres. We will also document through video, audio, and motion capture how specific individuals indicate their needs. Where appropriate for their needs, participants may be invited to wear non-invasive devices to capture motion and gestural data.
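
To make the intended output of Part A concrete, the sketch below shows one plausible way an expression-to-meaning record could be structured. The field names and types are illustrative assumptions, not a finalised schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: field names and structure are assumptions,
# not the project's actual schema.

@dataclass
class ExpressionRecord:
    """One observed expression and its documented meaning for one individual."""
    individual_id: str                  # pseudonymised participant identifier
    modality: str                       # e.g. "gesture", "action", or "sound"
    description: str                    # carer's description, e.g. "taps left armrest twice"
    meaning: str                        # documented intent, e.g. "wants a drink"
    source: str                         # "interview", "survey", "video", or "motion capture"
    confidence: Optional[float] = None  # carer's certainty in the mapping, if recorded
    media_clip: Optional[str] = None    # path to an associated video/audio/sensor segment

def expressions_for(records: list[ExpressionRecord], individual_id: str) -> list[ExpressionRecord]:
    """The knowledge base is then a collection of records, queryable per individual."""
    return [r for r in records if r.individual_id == individual_id]
```

Keeping the mapping per individual reflects the project’s premise that expressions are idiosyncratic rather than shared across users.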

Part B) Investigatory Data Analysis
In this stage we will analyse the collected data to determine appropriate AI/machine learning techniques that can classify the data set, enabling mappings between an individual’s expressions and their meanings. We will also identify appropriate sensors that could be used in prototype systems to support real-time capture of a user’s modes of communication.
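
As an indication of the kind of analysis this stage involves, the sketch below trains and cross-validates a baseline classifier on labelled sensor windows. It assumes fixed-length tri-axial accelerometer windows and uses scikit-learn as an example toolkit; the actual techniques will be determined by the analysis itself.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Sketch only: assumes each window is a (samples x 3) array of tri-axial
# accelerometer data, labelled with the meaning a carer assigned to it.

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-axis statistics for one window."""
    return np.concatenate([
        window.mean(axis=0),                           # mean acceleration per axis
        window.std(axis=0),                            # variability per axis
        np.abs(np.diff(window, axis=0)).mean(axis=0),  # mean change between samples
    ])

def baseline_accuracy(windows: list[np.ndarray], labels: list[str]) -> float:
    """Cross-validated accuracy of a baseline model, trained per individual."""
    X = np.stack([extract_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, X, np.array(labels), cv=5).mean()
```

Comparing such baselines across candidate techniques and sensor types is one way the data can drive the choices made in O2 and O3.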

Part C) System Architecture
Based on the selected AI/machine learning techniques and sensors, we will develop an adaptable and customisable system architecture. The architecture will accommodate each individual’s various modes of expression and will change according to the individual’s abilities and needs. We envisage a low-cost architecture built from existing devices (or, if needed, from new devices that can be developed inexpensively).
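
The sketch below illustrates the adaptability idea behind this part: sensors and per-individual models sit behind uniform interfaces, so low-cost devices and models can be swapped as a user’s needs change. The interfaces and names are our illustrative assumptions, not a committed design.

```python
from abc import ABC, abstractmethod
from typing import Any

# Illustrative architecture sketch only; interfaces and names are assumptions.

class SensorSource(ABC):
    """Uniform interface so heterogeneous, low-cost devices are interchangeable."""
    @abstractmethod
    def read(self) -> dict[str, Any]:
        """Return the latest readings, keyed by channel name."""

class ExpressionClassifier(ABC):
    """Per-individual model mapping combined sensor readings to a meaning."""
    @abstractmethod
    def classify(self, reading: dict[str, Any]) -> str:
        """Return the meaning inferred from one combined reading."""

class CommunicationPipeline:
    """Composes whichever sensors and model suit one user; either can be
    replaced at runtime as the user's abilities and needs change."""
    def __init__(self, sensors: list[SensorSource], model: ExpressionClassifier):
        self.sensors = sensors
        self.model = model

    def step(self) -> str:
        reading: dict[str, Any] = {}
        for sensor in self.sensors:
            reading.update(sensor.read())  # merge all channels into one reading
        return self.model.classify(reading)
```

Separating sensing from classification in this way is what allows the same pipeline to serve users with very different abilities: only the components change, not the overall system.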

Part D) Iterative prototype development and evaluation
Prototypes will be co-designed using a collaborative approach that actively engages stakeholders, users, and experts in the design process, helping to ensure the prototypes cater to the unique challenges and abilities of the population. We will apply user-centred methods to evaluate our prototypes, as well as to generate new ideas for further development.

Ethics

Our project received approval from the Human Research Ethics Committee (HREC/4823/Zisman) in September 2023, and we are committed to upholding the highest ethical standards throughout its duration. Before initiating the project, our researchers conducted risk assessments to ensure compliance with IT security and data protection regulations. We also adhere to the Open University’s policies and guidance, the “Ethics Principles for Research with Human Participants” and the “Code of Practice for Research,” to ensure ethical conduct at every stage of our work.

Relevant literature

Augmentative and alternative communication (AAC) systems are classified as “no-tech” (manual signing systems), “low-tech” (printed materials), or “high-tech” (computer-based arrays, synthesised voice output) [1]. Approximately 20% of people with cerebral palsy could benefit from some form of AAC [2], and AAC can improve the quality of life of those with communication difficulties, facilitating autonomy, social relationships, and education [3]. People whose speech is most affected can have difficulties using current high-tech systems [1], and past AAC research has often underutilised participatory design methods [4]. Our approach seeks to model real-world human communication, where gestures, sounds, movements, and other signals are given meaning. It is a significantly different paradigm from current approaches, with the objective of a system that adapts to a user rather than demanding that the user adapt to a system [5]. The approach will leverage advances in machine learning and decreases in the cost of processing and sensor technologies. By supporting this user-centred proposal, the results of the work will contribute to the engineering needed for a more inclusive future in which all people, regardless of abilities, can communicate effectively and thrive.

The individual components of this project have already been explored for diverse medical conditions. However, they have not yet been integrated to meet this important real-world need. For example, we developed a data aggregation platform and classified home-based sensor data to support individuals at risk (https://stretchproject.org). We used sensors and machine learning to help patients with osteoarthritis, or recovering from surgery, to improve their recovery and to provide clinicians with real-time gait-lab-quality data to inform treatment (https://www.mdpi.com/1424-8220/22/19/7482). Based on a database of patient data with stress echocardiography results, we developed a model that predicts the risk of coronary artery disease (https://oro.open.ac.uk/70005/). We developed an application to report and analyse individuals’ lifestyle factors to predict loneliness (https://serviceproject.org.uk/). Our experience in human-centred research and participatory design comes from our work with children to design new devices and interfaces for patient-reported pain monitoring (https://www.open.ac.uk/blogs/DHL/index.php/sample-page/projects/), and from the engineering of a hardware system for post-operative pain management in hospitals (https://oro.open.ac.uk/52825/). We developed technologies to engineer and evaluate secure adaptive information sharing based on a context-sensitive, rule-based access control system. We also explored different data sharing models for improving security and privacy in the healthcare domain (https://gtr.ukri.org/projects?ref=EP%2FR013144%2F1).

[1] https://discovery.dundee.ac.uk/ws/portalfiles/portal/95342571/1_s2.0_S1751722217301452_main.pdf
[2] https://discovery.ucl.ac.uk/id/eprint/1477557/1/M_Clarke_paediatrics%20and%20child%20health%20_2016_FINAL.pdf
[3] https://eprints.whiterose.ac.uk/74332/1/WRRO_74332.pdf?gathStatIcon=true
[4] https://kclpure.kcl.ac.uk/ws/portalfiles/portal/177534680/state_of_art_aac_assets_prepreint.pdf
[5] https://www.eecs.harvard.edu/~kgajos/papers/2018/wobbrock18ability.pdf

Relevant research from our team

Bennaceur, Amel; Hassett, Diane; Nuseibeh, Bashar and Zisman, Andrea (2023). Values@Runtime: An Adaptive Framework for Operationalising Values. In: IEEE/ACM 45th International Conference on Software Engineering: Software Engineering in Society (ICSE-SEIS), 14-20 May 2023, Melbourne, Australia.

Bennaceur, Amel; Stuart, Avelie; Price, Blaine; Bandara, Arosha; Levine, Mark; Clare, Linda; Cohen, Jessica; Mccormick, Ciaran; Mehta, Vikram; Bennasar, Mohamed; Gooch, Daniel; Gavidia-Calderon, Carlos; Kordoni, Anastasia and Nuseibeh, Bashar (2023). Socio-Technical Resilience for Community Healthcare. In: Proceedings of the First International Symposium on Trustworthy Autonomous Systems, ACM.

Bennasar, Mohamed; Price, Blaine; Gooch, Daniel; Bandara, Arosha and Nuseibeh, Bashar (2022). Significant Features for Human Activity Recognition Using Tri-Axial Accelerometers. Sensors, 22(19), article no. 7482.

Bennaceur, Amel; Zisman, Andrea; Mccormick, Ciaran; Barthaud, Danny and Nuseibeh, Bashar (2019). Won’t Take No for an Answer: Resource-driven Requirements Adaptation. In: 14th Symposium on Software Engineering for Adaptive and Self-Managing Systems 2019, 25-26 May 2019, Montréal, Canada.                  

Maia, Paulo; Vieira, Lucas; Chagas, Matheus; Yu, Yijun; Zisman, Andrea and Nuseibeh, Bashar (2019). Cautious Adaptation of Defiant Components. In: The 34th IEEE/ACM International Conference on Automated Software Engineering (ASE 2019) (Lawall, Julia and Marinov, Darko eds.), 11-15 Nov 2019, San Diego, California, USA, pp. 974–985.

Price, Blaine; Kelly, Ryan; Mehta, Vikram; Mccormick, Ciaran; Ahmed, Hanad and Pearce, Oliver (2018). Feel My Pain: Design and Evaluation of Painpad, a Tangible Device for Supporting Inpatient Self-Logging of Pain. In: CHI ’18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, ACM, New York, article no. 169. 

Gooch, Daniel; Mehta, Vikram; Price, Blaine; McCormick, Ciaran; Bandara, Arosha; Bennaceur, Amel; Bennasar, Mohamed; Stuart, Avelie; Clare, Linda; Levine, Mark; Cohen, Jessica and Nuseibeh, Bashar (2020). How are you feeling? Using Tangibles to Log the Emotions of Older Adults. In: Fourteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’20), ACM pp. 31–43.
