Bridging Pedagogy and Generative Artificial Intelligence

Dr. Arendt-Bunds explores a key question: How can we integrate GenAI with human-centered pedagogy while keeping the curriculum meaningful?

Generative artificial intelligence (GenAI) tools are changing how we teach, both in educational theory and in practice. Because guiding policy and regulation for AI have yet to leave the starting gate, educators must strike a balance between rapid technological evolution and the enduring principles of effective teaching and learning.

There are many ways AI can be used in pedagogy, such as curriculum and assessment development, customized feedback, and trend identification. An instructor can also use GenAI tools to generate the course title, course description, learning objectives, course content, rubrics, and even the responses to students. 

The question is not what GenAI is capable of nor if it should be permitted. The question is: How can we enhance the integration of human-centered pedagogy with GenAI to ensure the curriculum stays applicable and meaningful for students post-graduation?

Access and Equity

As with any digital tool, access to GenAI is not equal for all. Virtually all GenAI tools require access to a computer or mobile device and a stable internet connection. Many GenAI tools also require payment for ongoing use. 

Some strategies that can be implemented to bridge the digital divide and provide equitable access to necessary technological resources, especially for students who may face financial barriers or have accessibility challenges, include:

  • Work to ensure students have access to public spaces with equipment and Wi-Fi access such as libraries, open access computer labs, or community centers.
  • Work to ensure all content and tools used in the educational setting can be utilized via mobile technology.
  • As required by the Technology, Equality, and Accessibility in College and Higher Education (TEACH) Act, ensure that digital content is designed with accessibility in mind (for instance, content with text-to-speech capabilities). In cases where access is limited for some, offer alternatives to obtain the information.
  • Ensure, to the degree possible, that GenAI tools used by students are free or of limited cost (for instance, not behind a paywall). Use open source and free-to-use materials where possible to prevent additional costs for access to course-related content or software. 

While the number, capability, and quality of generative AI tools are expanding rapidly, educators need to work to ensure their classrooms remain equitable, fair, and available to all students. At the same time, they need to provide students with both human-centered and technological resources that build awareness and proficiency, so students are best prepared for the future.

Developing Awareness and Proficiency

In the classroom, educators not only expose students to an array of software like Google Docs and Instructure Canvas, but they teach students how to use them. Developing this awareness and proficiency increases access and equity. Similarly, with GenAI, educators and students alike must be aware of and proficient in these tools before they’re able to use them effectively. To build knowledge to share with students:

  • Find open-source resources with curated lists of AI tools, tutorials, and documentation for learning and support. These can be shared and used by instructors and students. Online forums and repositories (such as GitHub), research organizations (such as MIT), and technology blogs (such as That AI Collection) are good places to start searching.
  • Attend free or low-cost workshops and webinars that introduce AI tools and demonstrate their capabilities using online learning platforms (such as Course Hero), event platforms (such as Eventbrite), or social media (such as LinkedIn).
  • Participate in or host local or virtual meetups that provide a space for individuals to share knowledge, ask questions, collaborate, and network.

Students need to understand that these tools will help them not only in class but in their future careers and in other areas well beyond the classroom. They also need to understand the risks associated with using some of these tools.

Data Ownership, Privacy, Security, and Technology Dependence

Since the inception of the internet, there have been concerns about privacy and security. AI tools aren’t immune to these concerns. The costs of using AI tools include the potential loss of privacy, the permanent sharing of data, and perhaps an ever-increasing dependence on technology overall. Each of these costs must be weighed and compared to the expected benefit—not just in the short-term but in the medium- and long-term as well. 

Information given to AI systems often cannot be retracted. While not the case with all companies, the information submitted, as well as the results, may be owned by the sponsoring company.

OpenAI (owner of ChatGPT) assigns the user all its rights, title, and interest in and to output, per its Terms of Use. However, it saves all of the prompts, questions, and queries users enter into its systems. This can be helpful for end users because they can look back at their prior prompts, but it also means that the company can as well. While users can delete their history, it’s possible that the company has already extracted the data for use in improving its tools.

Companies also have terms of use policies that specify the acceptance of risk by the end user and often limit the company’s legal responsibilities, both in liability and in its obligations for services. Additionally, the terms of use may specify that a user must compensate (indemnify) the company for any losses it may bear as the result of a violation of the user’s obligations. All in all, terms of use are worth reviewing, even if they are written in legal terminology.

Tip: Try asking ChatGPT to simplify and summarize the terms of use by copying and pasting them into the prompt.

Relaying privacy and security considerations to students should be part of standard classroom practices and the larger educational process when using technological tools. This includes not just generative AI tools but all other software and database systems. Good methods for this include syllabus statements, assignment components that address these issues, educational modules, and the like.

Critical Thinking and Information Literacy

Along with considerations of access and ownership, there should be reflection on the importance of critical thinking and information literacy in using AI tools. While these tools may be revolutionizing our lives, they also present challenges:

  • The credibility of AI outputs is based on the credibility of the inputs it receives from all the sources it uses to compile results. Not all sources are reliable or accurate. 
  • Just as content may not always be reliable or accurate, it can also, in some cases, be biased. AI systems in turn can inadvertently perpetuate those biases. 
  • AI tools largely compile their outputs from existing resources and information; this means there are likely limits to the independent thought and creativity within the tools and, subsequently, within their outputs.
  • Deep learning is adding to the levels of originality and novelty by increasingly using more complex data extraction, variational techniques for generating new data, and reinforcement-learning-based fine-tuning.
  • Different AI tools and their varying potential uses may necessitate ethical considerations from appropriateness to fairness to impact. In many cases, regulations and laws have not kept pace with technological advancements. 

Educators use various strategies like debate, sample scenarios, and questioning to spark critical thinking in the classroom. To explore AI tools further, instructors may ask students what they see as a likely negative outcome of society’s increasing dependence on technology. They can then follow by asking students what they see as a likely positive outcome. This could lead to a discussion of steps that can be taken to increase the odds of the positive outcomes. Consider using these strategies as you and your students explore the potential challenges of AI tools and seek to understand them better.

Generative AI and Your Assessment and Evaluation

Types of assessment and evaluation used in education are vast and varied, but AI tools have an impact on all of them. They are causing educators like me to redefine how student assessment is performed, including re-evaluating the goals of learning overall. 

In my classroom, rote learning was never a goal. Instead, the goal has been for students to actively, quickly, and accurately find solutions to problems that have high value compared to other alternatives. I have long encouraged student use of external resources in finding answers, often by comparing, contrasting, and compiling varied options. Now, however, AI helps me create these assessments and helps my students complete them.  

Here are ways instructors and students can use AI tools for assessment:

Selected-response assessments:

  • Instructors: Create the questions and selected responses.
  • Students: Assist in locating the most likely correct answers.
  • Suggested tools: Quizizz, Quillionz, Yippity

Constructed-response assessments:

  • Instructors: Develop discussion prompts or story problems.
  • Students: Assist in generating responses.
  • Suggested tools: ChatGPT, Google Bard

Performance-based assessments:

  • Instructors: Create assessment rubrics and define scenarios.
  • Students: Assist in creating and combining artifacts.
  • Suggested tools: ChatGPT, Edulastic, Taskade 

Both instructors and students can also use AI to assist with portfolio, observation, criterion-referenced, and even self-assessments. It gives a whole new meaning to generative assessment. 

What I want to be sure of is that students understand these solutions and that they can weigh their value compared to other alternatives. Because of this, I have shifted my assignments to require more reflection, contemplation, and comparison. As an example, I have students ask the same question to multiple GenAI systems and compare the results; or I have students ask their own version of topic-based questions to AI, share their results with classmates, and discuss the benefits and drawbacks of the AI outputs. 
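For instructors who want to make the comparison step concrete, even a small script can help students see how much two AI responses to the same prompt diverge. The sketch below is illustrative only (the sample responses are invented); it uses Python's standard difflib to score word-level overlap between two answers, which can seed a class discussion about where the systems agree and differ:

```python
from difflib import SequenceMatcher

def overlap_ratio(a: str, b: str) -> float:
    """Return a 0-1 word-level similarity score between two AI responses."""
    # Compare lowercase word sequences rather than raw characters,
    # so small wording changes register as meaningful differences.
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

# Hypothetical answers from two different GenAI systems to the same prompt
resp_a = "photosynthesis converts light energy into chemical energy"
resp_b = "photosynthesis turns light energy into chemical energy stored in glucose"

score = overlap_ratio(resp_a, resp_b)
print(f"Word-level overlap: {score:.2f}")
```

A low score does not mean either answer is wrong; it simply flags a divergence worth investigating, which is exactly the kind of reflection and comparison these assignments aim to encourage.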

I had to ask myself what role these tools play in my learning outcomes and what role they play in how I assess student achievement of those outcomes. A majority of my assignments now incorporate student use of AI in some fashion. I see AI tools as resources students can use to find the best options and solutions, just as other software and technological tools are resources. All of these tools can help students reach my course learning objectives. They don’t undercut the goals of the assignments, and in many cases, they speed up the process of finding potential solutions.

Personalization and Differentiation in Education

There is an increasing expectation that the educational opportunities, feedback, and outcomes given to potential and current students be both personalized and differentiated. The expectation is that students should be able to bypass content they already know and instead be given additional individual support in areas of growth and development. This is now the contemporary student experience.

For educators, GenAI may prove quite helpful in working to meet these newfound expectations. It can be used to assess prior knowledge, such as by evaluating a student’s verbal strength in a foreign language with Smalltalk, by creating unique student feedback using Quillbot, or even by making snazzy custom audio tracks using Beatbot.

New images or content can be created almost on the fly using tools like Dall-E for images and AI Writer or Hemingway Editor for text. It becomes a question of how educators elect to use the tools over time, which in turn is a question of overall pedagogical integration.

Pedagogical Integration

As educators, we need to ask ourselves not only what we consider to be valid and reliable learning tools but also what we consider to be valid and reliable methods of content development, assessment formulation, and learning analysis. 

In our individual pedagogical spaces, we then need to consider the role we feel GenAI tools should and likely do play. Pertinent use of the tools can offer increased student engagement, improved learning outcomes, and reduced teacher workload. However, it requires reflection on our values, ethical standards, and risk tolerance. It also depends on how we interpret acceptable use. 

There is a role for a human touch in teaching and mentorship although how we each define that role will vary. There is also a role for technological tools, although again how we define that desired role will vary. Each educator needs to work to strike a balance between the rapid pace of technological evolution and the enduring principles of effective teaching and learning as they understand and embrace them in their areas of expertise.

About Dr. Anne Arendt-Bunds

Dr. Anne Arendt-Bunds holds a Bachelor of Arts (B.A.) degree in English, a Master of Business Administration (M.B.A.) from the University of Minnesota Carlson School of Management, a Master of Science (M.S.) degree in Educational Change and Technology Innovation from Walden University, and a Doctor of Education (Ed.D.) degree from Utah State University, with an emphasis in higher education.

Additionally, she is a certified Six Sigma Black Belt through the American Society for Quality (ASQ) and previously held Project Management Professional (PMP) certification through the Project Management Institute (now expired). Dr. Arendt-Bunds’ love for learning fuels her aspirations to inspire and support other students in cultivating their own passion for education.
