Inclusive Education through AI-Driven Technologies

Shahla Ghobadi
Mar 7, 2023 · 6 min read


The rapid growth of tools such as ChatGPT has sparked ongoing discussions across the education sector. The Chartered Association of Business Schools (CABS) recently published a post that raises the question of whether ChatGPT signals the end of assessments. As a social scientist specialising in technology development and utilisation, I have researched the potential of tools such as ChatGPT for academic purposes. This work has given me ideas about how these technologies can foster inclusive education, in which all people, regardless of their background or abilities, can demonstrate their knowledge on an equal footing. In this post, I share my propositions on how ChatGPT and similar tools can be used to facilitate inclusive assessment, which in turn helps train individuals with inclusive mindsets.


In the Business School, we have emphasised critical thinking, which involves analysing a phenomenon from multiple perspectives using relevant theories. Humans tend to approach things based on past experiences, observations, and life views. In contrast, AI technologies draw on a wide range of experiences and perspectives. Instead of taking a personal stand on any given topic, they are designed to provide information and analysis from various viewpoints to help users better understand complex issues and make informed decisions.

Consider a scenario in which a business school educator is analysing a case study about a company that made work adjustments for an employee without disclosing the reason. Initially, the educator might assume that the adjustments were made for physical reasons, such as a broken leg or a back injury. However, the case study notes that the employee was still able to attend meetings and contribute to the company. By engaging with ChatGPT for additional information, the educator could learn that an invisible condition, such as chronic pain or neurodivergence, may have been the reason behind the adjustments. This could prompt them to recognise the diversity of employee experiences, leading to a deeper understanding of their complexity.

Educators encourage learners to use AI tools to explore and analyse diverse questions from multiple perspectives in their academic work. Over the long term, using these tools can influence individuals’ thinking, encouraging them to adopt a similarly inclusive approach to understanding different phenomena.

By leveraging diverse datasets, these technologies can help learners develop a greater awareness of the unique challenges and experiences faced by individuals with different abilities, and become more empathetic in their own responses. If a user identifies errors or biases in the technology’s responses, ChatGPT invites feedback, which can be used to improve its performance.

Educators prompt learners to reflect on how AI tools have broadened their understanding of diverse needs and experiences that they may not have encountered before. By encouraging such reflection, educators foster a culture of awareness and empathy, and empower learners to develop more inclusive mindsets.

It is essential to view ChatGPT and similar tools as supplementary technologies to work with. Students can use them to enhance their understanding while still maintaining their learning agency.

Educators encourage learners to leverage AI tools while actively reflecting on, and working around, their limitations. This helps learners strengthen their critical thinking skills so that they do not become overly reliant on emerging technologies.

For example, a business school student could be tasked with developing a marketing strategy for a new product launch. The assessment encourages the student to use ChatGPT to generate a range of ideas about the target market and potential marketing channels. ChatGPT thus provides a starting point for the student’s research, helping to identify potential gaps or blind spots in their thinking. The student can then evaluate and refine the ideas generated by ChatGPT, taking into consideration factors such as feasibility, market demand, and brand image. In this manner, the student develops a comprehensive marketing strategy while also gaining experience in using AI for reflection and decision-making.
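For readers curious about what such a workflow might look like in practice, here is a minimal sketch using the OpenAI Python library’s chat interface (the pre-1.0 style available in early 2023). The model name, product, and prompt are illustrative assumptions, not part of any prescribed assessment; the student’s own evaluation of the output remains the substance of the exercise.

```python
# Illustrative sketch only: prompting ChatGPT for brainstorming ideas that
# the student then evaluates. The product and prompt are hypothetical.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; set your own key

prompt = (
    "Suggest five target market segments and suitable marketing channels "
    "for a new reusable water bottle aimed at commuters."
)

# Ask the chat model for a first round of ideas.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

ideas = response["choices"][0]["message"]["content"]
print(ideas)

# The student's work starts here: each idea is weighed against feasibility,
# market demand, and brand image rather than accepted at face value.
```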

Risks, Developments, and Steps Forward

Developers’ life views influence the technology they build. ChatGPT and Bing, two AI-driven technologies, approach user interactions and response provision very differently, but both can help create a supportive environment for all learners.

We have a role to play in leveraging these technologies, raising awareness, challenging them, and ensuring that we never compromise our vision of creating more inclusive environments grounded in understanding towards all. One example is the issue of biased datasets, which can perpetuate harmful stereotypes through AI systems. To prevent this, it is essential to be conscious of the data sources used to train these systems and to ensure that they are diverse and representative of all groups. As we celebrate International Women’s Day, for instance, it is important to recognise that AI systems can perpetuate gender bias if they are not developed and trained carefully:

- If an AI system is trained on data that primarily features male faces, it may struggle to accurately recognise female faces.
- If an AI system is trained on data that only includes information about cisgender women, it may struggle to accurately identify or understand the experiences of trans women.
- If an AI system is used to screen job applications and is trained on data that has historically favoured men, it may be more likely to reject female candidates.
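To make the last example concrete, here is a purely illustrative sketch, using synthetic data and a toy scikit-learn classifier rather than any real screening system, of how a model trained on historically biased hiring decisions reproduces that bias in its own predictions.

```python
# Illustrative sketch only: synthetic data and a toy classifier.
# It shows how a model trained on historically biased hiring decisions
# can score equally qualified candidates differently by gender.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Features: years of experience, plus a gender flag (1 = female, 0 = male).
experience = rng.normal(5, 2, n)
is_female = rng.integers(0, 2, n)

# Historical labels: in this synthetic past, equally qualified women were
# hired less often, so the training data itself encodes the bias.
hired = (experience + rng.normal(0, 1, n) - 1.5 * is_female) > 5

X = np.column_stack([experience, is_female])
model = LogisticRegression().fit(X, hired)

# The same six-year candidate is scored differently depending only on gender.
print("P(hire | male):  ", model.predict_proba([[6.0, 0]])[0, 1])
print("P(hire | female):", model.predict_proba([[6.0, 1]])[0, 1])
```

The point of the sketch is not the particular model but the mechanism: the classifier has faithfully learned the pattern in its data, and the pattern itself is the problem.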

Companies often establish ethics departments to address ethical concerns in developing AI. However, recent discussions highlight that building ethical AI systems comes at a high cost, and the people doing this work risk burnout. To build ethical and unbiased AI systems, it is crucial to address root causes such as developers’ blind spots. One major issue is the “lack of diversity in the teams developing AI systems”, including gender diversity. When teams lack diversity, they may inadvertently introduce biases into their algorithms that go unnoticed and uncorrected. To mitigate this risk, it is important to develop AI systems using diverse data and with the involvement of diverse teams. For instance, including women with different life experiences and backgrounds in the development and use of AI systems can help ensure that their perspectives are taken into account and that their interests are protected.

In conclusion, inclusive education can be fostered through the use of AI-driven technologies like ChatGPT. By leveraging diverse datasets and drawing on a wide range of experiences and perspectives, these technologies have the potential to provide balanced and informative responses, leading to more comprehensive analysis and understanding of complex issues. However, it is important to engage actively in their use and to encourage the companies developing these systems to build diversity into their teams. I conclude with this brilliant thought from Steve Jobs (1955–2011): “Technology is nothing. What’s important is that you have a faith in people, that they’re basically good and smart, and if you give them tools, they’ll do wonderful things with them.”
