Tracie Farrell's profile document
Description for Tracie Farrell
Tracie Farrell
Tracie
Farrell
Research Fellow
Tracie Farrell is a Research Fellow at the Open University and recipient of the UKRI Future Leaders Fellowship (Round 6). Her transdisciplinary, mixed methods research explores the impact of Artificial Intelligence on society, including impacts on people, their communities, and the wider ecosystems in which they live. Before her academic career, Tracie worked for 18 years in the non-formal education sector on issues related to human rights, gender, leadership and citizenship.
The Open University account for Tracie Farrell
tmf88
Tracie Farrell's membership at KMi
Tracie Farrell on LinkedIn
Tracie Farrell's participation in Shifting Power
Shifting Power
How can we create a more just society with AI?
A.I. and "Justice" are brought together under a number of headings, including A.I. for Social Good (AI4SG) [Shi et al., 2020; Tomašev et al., 2020], Ethical A.I. [Yu, 2018], Responsible A.I. [Arrieta et al., 2020], Fair, Accountable and Transparent A.I. (FAT A.I. or FAT M.L.) [Zou, 2018; Mohseni et al., 2018; Arrieta et al., 2020], Cyberjustice [Sénécal et al., 2009; Mykytyn et al., 2019], and A.I. for democracy [Puaschunder et al., 2019]. However, each of these research communities has its own conceptualisation of what justice means and how to achieve it, which is not always informed by marginalised voices [Hagerty et al., 2019; Bones et al., 2020]. In addition, these domains are located within a wider context of A.I. research that has existing problems of representation and visibility [Stanford, 2019]. Pratyusha Kalluri of the Radical AI Network has proposed that asking whether AI is good or fair is not the right question if we want to understand potential benefits and harms; we have to look at how AI "shifts power". In this project, therefore, we are seeking a new paradigm for thinking about what can and should be done with AI technology: one that cannot be reduced to cultural complexity, and that takes into account the real-world forces (such as power and wealth) that help to predict what is likely to happen.
This research is funded by a UKRI Future Leaders Fellowship (round 6).
Tracie Farrell's participation in ARIA/GOMARCH
ARIA/GOMARCH
2017-05-01
Augmented Reality in Activism
ARIA was a project sponsored by HEIF to develop a prototype for using Augmented Reality (AR) as a participatory tool for social protest. The ARIA project examined some of the main barriers to physical participation in social protest, including mental and physical health challenges, fear of retribution, and a lack of time and resources. The ARIA team then developed "GoMarch", an integrated web platform and AR application that allows remote protesters or supporters to create and place a digital avatar at a given geolocation. This avatar can be viewed by users who are physically present in that location, through use of the AR smartphone application. The aim of the project was to find new ways of making the efforts of remote activists individually visible to other stakeholders and to provide a new avenue of participation.
Tracie Farrell's participation in Co-Inform
Co-Inform
2018-04-01
2021-03-31
Co-Creating Misinformation-Resilient Societies
Misinformation generates misperceptions, which have affected policies in many domains, including the economy, health, the environment, and foreign policy. Co-Inform is about empowering citizens, journalists, and policymakers with co-created socio-technical solutions, to increase resilience to misinformation and to generate more informed behaviours and policies. The aim of Co-Inform is to co-create these solutions with citizens, journalists, and policymakers for (a) detecting and combating a variety of misinforming posts and articles on social media, (b) supporting, persuading, and nourishing misinformation-resilient behaviour, (c) bridging between the public on social media, external fact-checking journalists, and policymakers, (d) understanding and predicting which misinforming news and content are likely to spread across which parts of the network and demographic sectors, (e) infiltrating echo chambers on social media, to expose confirmation-biased networks to different perceptions and corrective information, and (f) providing policymakers with advanced misinformation analysis to support their policy-making process and validation. To achieve these goals, Co-Inform will bring together a multidisciplinary team of scientists and practitioners to foster co-creational methodologies and practices for engaging stakeholders in combating misinformation posts and news articles, combined with advanced intelligent methods for misinformation detection, misinformation flow prediction, and real-time processing and measurement of crowds' acceptance or refusal of misinformation. Co-Inform tools and platform will be made freely available and open-sourced to maximise benefit and reuse. Three main stakeholder groups will be directly engaged throughout this process: citizens, journalists, and policymakers.
Tracie Farrell's participation in AI4EDI
AI4EDI
AI technologies to tackle EDI related issues
AI is here. We interact with AI technology every time we search online, interact on a social media platform, or use a credit card. We know that AI can be a force for good: for example, OU Analyse uses machine learning to help identify students at risk of failing. Given the ubiquity of this technology, it is important that we understand its potential impact, good and bad, for all users. Within AI4EDI we will highlight EDI issues related to AI research and innovation: in particular, how AI can help address EDI issues, such as the awarding gap for Black students, and EDI challenges that can be present in AI systems, such as data and decision-making bias.
Tracie Farrell's participation in HERoS
HERoS
Health Emergency Response in Interconnected Systems
HERoS aims to improve the effectiveness and efficiency of the response to the Covid-19 outbreak. HERoS creates and provides policies and guidelines for improved crisis governance, focusing on responders to public health emergencies and their need to make informed decisions. HERoS further improves predictions of how the disease spreads by understanding and modelling the impact of local behaviour on its spread. The OU's role in HERoS is to study the spread of misinformation, and the corresponding fact-checks, related to Covid-19 on social media.
Tracie Farrell's participation in CIMPLE
CIMPLE
Countering Creative Information Manipulation with Explainable AI
CIMPLE aims to experiment with innovative social and knowledge-driven AI explanations, and to use computational creativity techniques to generate powerful, engaging, and easily and quickly understandable explanations of complex AI decisions and behaviour. These explanations will be tested in the domain of detection and tracking of manipulated information, taking into account social, psychological, and technical explainability needs and requirements. Covid-19 and climate change are the two main use cases to be investigated.