Short Communication

Developing a mobile data collection tool to manage a dispersed mental health workforce

AUTHORS

Sarah Maddox1
Master's in Clinical and Public Health Aspects of Addiction, King's College London, Evaluation Manager *

Donna M Y Read2 Research Associate

Hazel E Dalton3

David A Perkins4 Director

Nicholas N Powell5

CORRESPONDENCE

*Ms Sarah Maddox

AFFILIATIONS

1, 2, 3, 4, 5 Centre for Rural and Remote Mental Health, University of Newcastle, Bloomfield Campus, PO Box 8043, Orange East, NSW 2800, Australia

PUBLISHED

28 February 2020 Volume 20 Issue 1

HISTORY

RECEIVED: 22 August 2019

REVISED: 23 January 2020

ACCEPTED: 28 January 2020

CITATION

Maddox S, Read DMY, Dalton HE, Perkins DA, Powell NN. Developing a mobile data collection tool to manage a dispersed mental health workforce. Rural and Remote Health 2020; 20: 5616. https://doi.org/10.22605/RRH5616

AUTHOR CONTRIBUTIONS

This work is licensed under a Creative Commons Attribution 4.0 International Licence


ABSTRACT

Context: The Rural Adversity Mental Health Program (RAMHP) connects people who need mental health assistance in rural and remote New South Wales (NSW), Australia, with appropriate services and resources. In 2016, RAMHP underwent a comprehensive reorientation to meet new state and federal priorities. A full assessment of program data collection methods for management, monitoring and evaluation was undertaken. Reliable data were needed to ensure program fidelity and to assess program performance.
Issues: The review indicated that existing data collection methods provided limited and unreliable information, were inconvenient for RAMHP coordinators to use and unsuited to their itinerant role. A mobile collection tool (app) was developed to address RAMHP activity data needs. A design and implementation process was followed to optimise data collection and to ensure the successful use of the app by coordinators.
Lessons learned: The early planning investment was worthwhile: the app was successfully adopted by the coordinators and a much-improved data collection capability was achieved. Moreover, data capture increased, while errors decreased. Data are more reliable, specific, timely and informative and are used for strategic and operational planning and to demonstrate program performance.

Keywords:

app, Australia, change management, data collection, evaluation, frontline staff, health promotion, mental health, mobile, program review.

FULL ARTICLE

Context

The Rural Adversity Mental Health Program (RAMHP) connects people who need mental health assistance in rural and remote New South Wales (NSW), Australia, with appropriate services and resources. Currently, 19 RAMHP coordinators are located across nine rural local health districts.

In 2016, the program underwent a comprehensive review and reorientation to meet new state and federal priorities1. This review found that the existing data collection methods (a simple Excel spreadsheet and multiple disconnected data sources, such as administrative records) were inefficient and produced unreliable and limited information about program activities. Poor data reliability meant that RAMHP managers lacked detailed and accurate information about coordinators’ work, making it difficult to monitor the alignment of activities with program plans1. Furthermore, coordinators could not follow and learn from each other’s activities, contributing to reduced program cohesion1. For more information about the program and its review see Maddox et al1.

The present article describes the development of a mobile data collection tool, its implementation challenges and its resultant benefits. It is intended to inform service providers interested in employing app technology, currently underutilised by health services2.

Issues  

During the review and reorientation of RAMHP, data collection improvements were identified as a priority to assess progress against a newly revised program logic model (PLM)1. A PLM describes the relationship between a program’s activities, outputs and outcomes3. To aid data entry by the coordinators and facilitate data analysis, monitoring and evaluation by management, a mobile data collection tool was developed using a commercially available form-building app. Apps are increasingly used for the collection of health-related data. For example, they can be used to map place-based determinants of health2, or to monitor mental or physical health interventions4-6. Although there is little literature about the use of apps for collection of activity data by program staff, there is evidence that mobile technology can provide timely, efficient and comprehensive data collection2,7.

The steps in Figure 1, described below, ensured that the data collection process would meet the program data needs and would be acceptable to the coordinators.

Figure 1: Data collection tool development steps.

Audit existing data collection processes 

An internal audit revealed data were unreliable and poorly matched to program objectives. Redundant and missing data were part of an unnecessarily burdensome process. Subsequent interviews with RAMHP managers and coordinators sought insights into the advantages and disadvantages of the existing data collection system1. These were used to agree on criteria for a new system (summarised in Table 1).

Coordinators agreed that the Excel spreadsheet data collection tool was completed inconsistently due to poor data field definitions and inadequate training in its use. Furthermore, coordinators made notes while working away from the office and later transcribed this information to Excel, introducing the potential for transcription and omission errors. In addition, there was a low perceived need to record data among coordinators due to poor feedback regarding the supplied data. Without effective feedback, coordinators prioritised their work opportunistically, via requests, rather than strategically. Coordinators reported that their work was recorded inadequately due to a lack of qualitative information to contextualise their activity metrics.

Table 1: Criteria for a new data collection system

Select a data collection solution

An app was chosen because coordinators travel frequently and spend little time on office computers. Coordinators already relied on tablets for email, website and document access. An app offered a means to enter activity data in real time, negating the need to record data twice. Furthermore, it would store all data centrally, removing transcription risk and costs incurred with multiple Excel spreadsheets.

With the aid of a public sector IT adviser, and using the criteria compiled during the audit (Table 1), five app-building platforms were assessed on their capacity to deliver a suitable app. Two platforms, FastField and Formitize, met the key criteria most closely, and free trials enabled comparison of the functions and usability of each. Formitize was chosen because its Australian vendor offered strong user support. While Formitize generally serves non-health users, its features are compatible with the RAMHP data collection needs.

Determine data collection needs and build component forms

The data to be collected were guided by the PLM1 and stakeholder reporting requirements for RAMHP management, the Ministry of Health and local health districts. Five forms were created to capture coordinators’ quantitative and qualitative data: linking a person to a mental health service, delivering training, providing information at a community event, attending a professional meeting or generating mental health promotion via mass media. To protect client privacy8, personal identifying information was not included.

Change management and technical implementation

Using an app, rather than Excel, to collect data was a considerable change for coordinators, so implementation planning was necessary to ensure acceptance and a smooth transition. Although coordinators took part in the development of the PLM and understood the reasoning behind data collection1, there was apprehension about using unfamiliar technology. To manage this, regular communication about the purpose, benefits and development of the app took place before implementation. As with other quality improvement activities in RAMHP, the roll-out of the app was trialled by three coordinators, selected for their technological skills. These coordinators then demonstrated the app to their peers during a face-to-face team meeting, sharing their positive experience of field-testing the app. Individual support and group training were also provided. In the first months of use, the RAMHP evaluation manager reviewed monthly activity reports to monitor data entry accuracy and provide individualised feedback.

Two main implementation issues occurred. First, soon after deployment the app ‘froze’ and ceased to function for some coordinators. This was caused by tablet-related IT maintenance problems and was overcome by further training and support for coordinators. Second, an important automated reporting feature was not provided by the vendor. Consequently, all analysis and reporting have been managed manually.

Review and adjust

After 3 months’ use of the app, a review was conducted to assess whether improvements could be made. With coordinators comfortable with the app’s functions, it was possible to increase the number of questions and response options in the forms, thereby improving the data capture. New response options were identified by analysing the open-text (‘other’) responses in the forms. The first 3 months’ data were recoded to fit the new fields. It was important to optimise data collection early in the funding period to establish a baseline for ongoing comparability; however, a small number of refinements have been implemented more recently.
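The recoding step described above can be sketched in a few lines. This is an illustrative example only: RAMHP used the Formitize platform and managed analysis manually, and the field names, categories and threshold below are hypothetical, not drawn from the program's actual forms.

```python
# Hypothetical sketch: find frequent free-text 'other' responses, promote
# them to named response options, and recode earlier records to match.
from collections import Counter

def suggest_new_options(records, threshold=2):
    """Count free-text 'other' responses; those appearing at least
    `threshold` times become candidate options for the revised form."""
    counts = Counter(
        r["other_text"].strip().lower()
        for r in records
        if r.get("other_text")
    )
    return [text for text, n in counts.items() if n >= threshold]

def recode(record, new_options):
    """Recode an 'other' entry to a named option if one now exists."""
    text = record.get("other_text", "").strip().lower()
    if record["activity_type"] == "other" and text in new_options:
        return {**record, "activity_type": text, "other_text": ""}
    return record
```

For example, if two coordinators had each typed a free-text response such as "field day", that response would be promoted to a named option and the earlier records recoded to it, while one-off responses remain under 'other'.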

Lessons learned

All steps taken to develop and implement the app – such as choosing a provider, testing functionality, determining the data collected, deciding how analysis would occur and staff training – required considerable investment in time, resources and strategic planning9. This investment was crucial to ensure that the app provided worthwhile data and to minimise the need for subsequent changes. After 3 months, the coordinators were comfortable with the technology, and data collection enabled powerful analysis and reporting. Consideration of the logistics of evaluation and reporting in the program redesign stage assisted with the smooth implementation of the new data collection system.

Improved data collection

There has been a considerable and consistent increase in the program data collected since the introduction of the app. For example, in the 6 months preceding the app, the number of reported people linked to mental health services was 391, compared with 929 in the 6 months after the app was introduced. Coordinator feedback indicated that the app’s accessibility and ease of use, coupled with the monthly performance feedback, increased coordinators’ motivation to enter data; the increases therefore reflect more complete recording rather than dramatic changes in work practices.

Useful and timely data have increased coordinators’ awareness of the impact of their activities, and facilitated self-management. For example, coordinators receive feedback about the characteristics of training participants (gender, location and occupation), highlighting underrepresented groups to target.
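The kind of demographic feedback described above can be illustrated with a short sketch. RAMHP produced such reports manually from app exports; the field names, benchmark shares and tolerance here are assumptions for illustration, not the program's actual method.

```python
# Hypothetical sketch: flag participant categories whose share falls
# well below a benchmark (e.g. local population proportions), so
# coordinators can target underrepresented groups.
from collections import Counter

def underrepresented(participants, field, benchmark, tolerance=0.10):
    """Return categories whose share of participants is more than
    `tolerance` below the benchmark share for that category."""
    counts = Counter(p[field] for p in participants)
    total = sum(counts.values())
    flagged = []
    for category, expected_share in benchmark.items():
        actual_share = counts.get(category, 0) / total
        if actual_share < expected_share - tolerance:
            flagged.append(category)
    return flagged
```

For instance, if men made up only 20% of training participants against an expected 50%, the report would flag men as a group to target in future sessions.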

Moreover, since the app aligned with the PLM, co-developed with coordinators, its use has reinforced program objectives consistently and encouraged coordinators to value the data collection process. Further, managers have a more accurate picture of coordinators’ activities, to help determine program strategic priorities.

Improved communication

The specific questions within the app’s forms mean that RAMHP is able to share detailed, timely and accurate information with important partners and stakeholders. The geographic location of activities can be mapped, which is useful for demonstrating reach, such as into remote communities. Furthermore, the demographic data collected make clear the characteristics of the people linked to mental health services, such as age, gender and mental health issues experienced. Qualitative descriptions of encounters may also be recorded and these are used to provide narrative depth to reports and context to statistics. This information allows RAMHP to confidently articulate the program’s value, raise its profile and present high-quality evidence to key stakeholders.

Acknowledgements

Thanks to Tessa Caton for comments on the concept.

REFERENCES

1 Maddox S, Read DMY, Powell NN, Caton TJ, Dalton HE, Perkins DA. Reorientation of the Rural Adversity Mental Health Program: the value of a program logic model. Rural and Remote Health 2019; 19: 5217. Available: http://www.rrh.org.au/journal/article/5217 (Accessed 4 October 2019). https://doi.org/10.22605/RRH5217 PMid:31480849
2 Giovenco DP, Spillane TE. Improving efficiency in mobile data collection for place-based public health research. American Journal of Public Health 2019; 109(S2): S123-S125. https://doi.org/10.2105/AJPH.2018.304875 PMid:30785801
3 Centre for Epidemiology and Evidence. Developing and using program logic: a guide. Evidence and Evaluation Guidance Series. Available: https://www.health.nsw.gov.au/research/Publications/developing-program-logic.pdf (Accessed 22 August 2019).
4 Schoeppe S, Alley S, Van Lippevelde W, Bray NA, Williams SL, Duncan MJ, et al. Efficacy of interventions that use apps to improve diet, physical activity and sedentary behaviour: a systematic review. International Journal of Behavioral Nutrition and Physical Activity 2016; 13(127). https://doi.org/10.1186/s12966-016-0454-y PMid:27927218
5 Wang K, Varma DS, Prosperi M. A systematic review of the effectiveness of mobile apps for monitoring and management of mental health symptoms or disorders. Journal of Psychiatric Research 2018; 107: 73-78. https://doi.org/10.1016/j.jpsychires.2018.10.006 PMid:30347316
6 Covolo L, Ceretti E, Moneda M, Castaldi S, Gelatti U. Does evidence support the use of mobile phone apps as a driver for promoting healthy lifestyles from a public health perspective? A systematic review of randomized control trials. Patient Education and Counseling 2017; 100(12): 2231-2243. https://doi.org/10.1016/j.pec.2017.07.032 PMid:28855063
7 McClung MW, Gumm SA, Bisek ME, Miller AL, Knepper BC, Davidson AJ. Managing public health data: mobile applications and mass vaccination campaigns. Journal of the American Medical Informatics Association 2018; 25(4): 435-439. https://doi.org/10.1093/jamia/ocx136 PMid:29140434
8 Bakken S, Marden S, Arteaga SS, Grossman L, Keselman A, Le P-T, et al. Behavioral interventions using consumer information technology as tools to advance health equity. American Journal of Public Health 2019; 109(S1): S79-S85. https://doi.org/10.2105/AJPH.2018.304646 PMid:30699018
9 Materia FT, Miller EA, Runion MC, Chesnut RP, Irvin JB, Richardson CB, et al. Let’s get technical: enhancing program evaluation through the use and integration of internet and mobile technologies. Evaluation and Program Planning 2016; 56: 31-42. https://doi.org/10.1016/j.evalprogplan.2016.03.004 PMid:27018831