March 22, 2026

Teaching and Curriculum Design in the Age of AI with Hamza Sami


Want a smarter way to work with AI in the classroom without losing what makes learning human? Jackie had an engaging and insightful conversation with academic manager and curriculum designer Hamza Sami to unpack practical ways educators can harness generative AI as a learning partner while strengthening integrity, critical thinking, and authentic assessment.

We start by reframing generative AI with simple language students can use: it’s the confident friend who doesn’t always have the facts right. From there, we outline day-one norms that encourage curiosity and set clear boundaries—what’s green-light brainstorming, where caution applies, and when only original work is acceptable. Hamza shares why instructor AI literacy comes first, how to discuss bias and hallucinations in plain terms, and why students’ “I feel like I’m cheating” reactions signal values worth guiding, not suppressing.

Looking ahead, we land on the capacity every student needs next year and beyond: moral awareness paired with critical thinking. If the internet went down, could you still perform? Would you hire yourself? Like calculators, AI should sharpen our work, not replace our minds. Subscribe, share this with a colleague, and leave a review to help more educators build classrooms where AI supports deeper, more honest learning.

🔗 Website and Social Links:

Please visit Hamza Sami’s website and social media links below.

Hamza Sami’s Website

Hamza’s LinkedIn Page

📢 Call-to-Action: Need a hand bringing your learning ideas to life, whether curriculum design, instructional strategy, or even creative production? Reach out on LinkedIn, and let’s explore how we can make learning more engaging together.

Send Jackie a Text

Join PodMatch!
Use the link to join PodMatch, a place for hosts and guests to connect.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Support the show

💟 Designing with Love + allows you to support the show by keeping the mic on and the ideas flowing. Click on the link above to provide your support.

Buy Me a Coffee is another way you can support the show, either as a one-time gift or through a monthly subscription. 

🗣️ Want to be a guest on Designing with Love? Send Jackie Pelegrin a message on PodMatch, here: Be a guest on the show

🌐 Check out the show's website here: Designing with Love

📱 Send a text to the show by clicking the Send Jackie a Text link above. 

👍🏼 Please make sure to like and share this episode with others. Here's to great learning!


00:01 - Welcome & Series Setup

00:21 - Guest Background & BTEC Context

02:11 - Career Pivot To Education

07:20 - AI’s Rapid Rise & Human Role

09:19 - Framing Generative AI As A Partner

13:02 - Guardrails & Classroom Culture

18:44 - Authentic Assessment To Reduce Misuse

24:34 - Reflective Practice & Open AI Dialogue

29:04 - Curriculum Revision With AI Support

34:44 - One-Year Outlook: Integrity & Thinking

40:44 - Calculator Analogy & Mental Fitness

47:16 - What’s Next In The Series & Farewell

48:31 - Support, Reviews, & Contributions

WEBVTT

00:00:01.120 --> 00:00:04.000
Hello, and welcome to the Designing with Love podcast.

00:00:04.000 --> 00:00:11.839
I am your host, Jackie Pelegrin, where my goal is to bring you information, tips, and tricks as an instructional designer.

00:00:11.839 --> 00:00:16.879
Hello, instructional designers and educators.

00:00:16.879 --> 00:00:20.960
Welcome to episode 100 of the Designing with Love Podcast.

00:00:20.960 --> 00:00:27.519
Today I'm kicking off a three-part series with Hamza Sami on teaching and learning in the age of AI.

00:00:27.519 --> 00:00:38.719
Hamza works extensively with the UK's Pearson Business and Technology Education Council, or BTEC, which offers career-focused, competency-based programs recognized internationally.

00:00:38.719 --> 00:00:42.000
In addition, he leads curriculum design and quality assurance.

00:00:42.000 --> 00:00:43.840
Welcome to the show, Hamza.

00:00:45.439 --> 00:00:46.240
Thank you, Jackie.

00:00:46.240 --> 00:00:47.920
It's a pleasure to be here.

00:00:48.640 --> 00:00:49.039
Yes.

00:00:49.039 --> 00:00:51.520
And I'm so glad you're a fan of the show.

00:00:51.520 --> 00:00:57.359
That's how you discovered me, and we got connected on LinkedIn.

00:00:57.359 --> 00:00:58.880
And it's great.

00:00:58.880 --> 00:01:03.119
And then I uh rolled out the welcome mat for you to come on the show.

00:01:03.119 --> 00:01:05.760
So I'm so happy that we're we're doing this.

00:01:05.760 --> 00:01:06.640
It's great.

00:01:06.640 --> 00:01:17.359
So, to start, can you tell us a little bit about yourself and share what inspired you to focus on curriculum design in the higher education industry?

00:01:18.480 --> 00:01:18.879
Sure.

00:01:18.879 --> 00:01:26.959
Uh so I work as an academic manager at AXL International Academy, which is a Pearson-approved centre based in Turkey.

00:01:26.959 --> 00:01:31.040
What we do is we adopt the two-plus-one system.

00:01:31.040 --> 00:01:38.959
So basically, we provide an HND, a Higher National Diploma, for our students in the first two years.

00:01:38.959 --> 00:01:44.799
And this counts as the first two years of a bachelor's degree in the UK.

00:01:44.799 --> 00:01:51.840
And the third year is actually the last year of the bachelor's program.

00:01:51.840 --> 00:02:05.439
Our students go to the UK, to one of our partner universities there, and they finish the bachelor's, because in the UK the bachelor's degree is only three years.

00:02:05.439 --> 00:02:16.479
So, now to answer your question about how I got into curriculum design and my interest in that area.

00:02:16.479 --> 00:02:22.800
I come from a multimedia design and creative production background.

00:02:22.800 --> 00:02:36.560
I finished my bachelor's degree back in the United Arab Emirates, at the American University of Sharjah; I graduated in 2012.

00:02:36.560 --> 00:02:49.039
And I worked as a photographer, videographer, and graphic designer for around six years, both full-time and in freelance jobs, in Dubai.

00:02:49.039 --> 00:02:58.400
But I found that I had limitations in my career. For example, I always wanted to add value to my work.

00:02:58.400 --> 00:03:11.759
And sometimes the companies or the areas or the fields I was working for conflicted with my principles, for example.

00:03:11.759 --> 00:03:25.120
For example, if I was offered a job at a TV channel or network, you know, they usually have their own agenda, which contradicts mine.

00:03:25.120 --> 00:03:28.400
Uh so I felt that there's some limitation.

00:03:28.400 --> 00:03:37.840
Also, the idea of having to manipulate your audience with your work, you know.

00:03:37.840 --> 00:03:38.400
Right.

00:03:38.400 --> 00:03:45.199
This is something I found in doing media work, usually.

00:03:45.199 --> 00:03:55.039
So I decided to go to Vietnam to take a self-discovery trip, or journey, if you will.

00:03:55.039 --> 00:04:01.759
I stayed there for a couple of months, and I realized that education is what I needed to explore.

00:04:01.759 --> 00:04:14.319
Because we all have this one teacher, or more, actually, who affected our own life with their character or what they taught us.

00:04:14.319 --> 00:04:26.879
So I felt if I can influence one student, one person at a time, it will be like I've done my mission in this life.

00:04:26.879 --> 00:04:29.680
So I thought what's wrong with education?

00:04:29.680 --> 00:04:31.920
Uh why do we hate learning?

00:04:31.920 --> 00:04:40.000
Uh while as as kids we have this um urge to learn and to discover and to explore things.

00:04:40.000 --> 00:04:48.720
And I think it's a problem with the curriculum, the way we are being taught, the delivery of teaching.

00:04:48.720 --> 00:04:50.800
So I thought to explore that area.

00:04:50.800 --> 00:05:02.160
And this is when I went back to Dubai, and I joined an Arabic language center, working as an art director.

00:05:02.160 --> 00:05:11.439
And I talked to the manager there; I asked her to let me teach some Arabic language and to try teaching, you know.

00:05:11.439 --> 00:05:14.240
I want to experiment with that area.

00:05:14.240 --> 00:05:18.319
And I taught uh some kids for a couple of months and it was great.

00:05:18.319 --> 00:05:36.560
Then I had to move to Turkey, where I'm based now, with no experience, of course, in the education sector, and I applied to AXL as a media teacher.

00:05:36.560 --> 00:05:43.680
And they accepted me, and I tried my luck with them.

00:05:43.680 --> 00:05:51.680
I taught for a year, three modules: film and audio studies, and photography.

00:05:51.680 --> 00:06:02.000
Then I was offered the media program coordinator role for a couple of years, then I was promoted to academic manager.

00:06:02.000 --> 00:06:03.279
This is how I rose.

00:06:03.279 --> 00:06:08.399
And after I was promoted to academic manager, I applied for a master of education.

00:06:08.399 --> 00:06:15.519
And there I focused on curriculum development and its challenges.

00:06:15.519 --> 00:06:19.680
So that's the background, yeah, a bit more information.

00:06:20.560 --> 00:06:20.879
Right.

00:06:20.879 --> 00:06:29.439
So that started your passion for being able to teach a different language to students, wow.

00:06:29.439 --> 00:06:29.920
Wow.

00:06:29.920 --> 00:06:30.959
I love that.

00:06:30.959 --> 00:06:40.240
Yeah, and that sparked your interest and really got you into education, noticing that there was something with curriculum that needed improvement.

00:06:40.240 --> 00:06:41.759
And so I love that.

00:06:41.759 --> 00:06:45.759
You saw a gap and you saw a need and you wanted to be able to help and fill it.

00:06:46.079 --> 00:06:47.040
Yeah, yeah.

00:06:47.040 --> 00:06:51.839
It's a journey, because I had to shift my career.

00:06:51.839 --> 00:06:58.160
You know, I used to work in production, then I started from zero, you know, in education.

00:06:58.160 --> 00:07:00.319
So yeah, it was a challenge.

00:07:00.720 --> 00:07:10.079
Yeah, it's amazing how this field, instructional design and curriculum design, just keeps evolving and keeps growing.

00:07:10.079 --> 00:07:11.120
So it's great.

00:07:11.120 --> 00:07:20.399
You know, you and I were talking, before we hit the record button, about how you're coming into this field as well, with instructional design, and how I've been teaching for a while.

00:07:20.399 --> 00:07:35.439
And you know, it helps me stay fresh too, because with everything going on now, like what we're gonna talk about in a little bit with AI, it's really rocked education in a way that I don't think anybody expected it to.

00:07:35.439 --> 00:07:47.519
So it's really challenged me, not only as someone who works as an instructional designer in higher education, but also as an instructor.

00:07:47.519 --> 00:07:56.800
But it also allows me to step up my game a little bit more and say, okay, we can't just continually do things the same way.

00:07:56.800 --> 00:07:58.560
It's not always gonna work that way.

00:07:58.560 --> 00:08:04.399
So we have to be able to pivot and we have to be able to change a little bit and make things a little bit better.

00:08:04.399 --> 00:08:21.199
Yeah, and know that in the age of AI we can still make things human, and we can still put our personality into things; it can still be human, even in the age of all this machinery and technology, right?

00:08:21.199 --> 00:08:22.879
That we can still have that human side of it.

00:08:22.879 --> 00:08:23.839
So definitely.

00:08:23.920 --> 00:08:30.879
I don't believe that the human factor will be replaced by AI.

00:08:30.879 --> 00:08:34.080
Uh of course we will always need uh the human.

00:08:34.080 --> 00:08:38.639
Because we are the ones who created AI, after all.

00:08:38.639 --> 00:08:48.320
So we're the owners, and it's not like a new entity; it's from us, you know, just a tool to serve us, in a way.

00:08:48.639 --> 00:08:48.799
Right.

00:08:49.360 --> 00:08:50.159
But yeah, I agree.

00:08:50.159 --> 00:08:58.960
Its development is growing very fast, and it's kind of scary.

00:08:58.960 --> 00:09:01.519
You need to keep up every week, you know.

00:09:01.519 --> 00:09:03.919
Every week something new is showing up, you know.

00:09:03.919 --> 00:09:12.399
You just need to keep up, you know, with all the developments and the inventions.

00:09:12.879 --> 00:09:14.080
Right, it's amazing.

00:09:14.080 --> 00:09:14.639
Yeah.

00:09:14.639 --> 00:09:17.360
And hopefully it's used for more good than anything else.

00:09:17.360 --> 00:09:18.879
So yeah, that's great.

00:09:18.879 --> 00:09:28.799
So kind of to set that foundation uh for my listeners, how do you explain generative AI to students so it becomes a learning partner rather than a shortcut?

00:09:28.799 --> 00:09:30.960
Because I've had to do that with my students.

00:09:30.960 --> 00:09:34.720
So what's one simple way you can kind of explain that to them?

00:10:12.850 --> 00:10:18.850
Well, I like how you use "learning partner," which is how AI should be.

00:10:18.850 --> 00:10:23.649
Uh but first we can uh define what we mean by generative AI.

00:10:23.649 --> 00:10:35.410
It's the type of AI that can create new content or ideas, like stories, images, videos, music, these things.

00:10:35.410 --> 00:10:42.769
Now, when we speak about generative AI tools, the most popular one is ChatGPT.

00:10:42.769 --> 00:10:49.970
And I actually did my master's dissertation on the use of ChatGPT in education.

00:10:49.970 --> 00:11:02.850
Now, when you log in to OpenAI, ChatGPT's website, at the bottom of that page, in a small font, you will read: "ChatGPT can make mistakes.

00:11:02.850 --> 00:11:05.810
Consider checking important information."

00:11:05.810 --> 00:11:10.210
So I think this tells you exactly what we are dealing with.

00:11:10.210 --> 00:11:16.930
Now, it can be argued that, okay, we humans can make mistakes also.

00:11:16.930 --> 00:11:18.129
Uh what's the difference?

00:11:18.129 --> 00:11:26.850
But the thing is, unlike AI, humans have the capability to practice critical thinking.

00:11:26.850 --> 00:11:31.330
Now this is uh something you cannot find in an AI.

00:11:31.330 --> 00:11:47.649
So maybe this is what we need to do: we need to teach our students how to use critical thinking to double-check the sources, the information generated by these AI tools.

00:11:47.649 --> 00:11:54.290
After all, they are tools; they are not humans, you know.

00:11:54.290 --> 00:11:55.970
Even humans, they make mistakes.

00:11:55.970 --> 00:12:03.410
I always do this analogy: you know, we all have this friend who knows everything.

00:12:03.410 --> 00:12:10.930
They can never be wrong; whenever you ask them, they have an opinion about everything, they know everything.

00:12:10.930 --> 00:12:14.690
And sometimes they do, you're right, but they don't know everything.

00:12:14.690 --> 00:12:19.490
Now this is uh exactly how uh how AI acts.

00:12:19.490 --> 00:12:24.850
They they have you ever like asked AI and they told you we don't know the answer.

00:12:24.850 --> 00:12:32.450
So they are not built with that feature, just like that friend who always like knows something about everything.

00:12:32.450 --> 00:12:38.129
So what you do with this friend is uh you basically uh double check what they say.

00:12:38.129 --> 00:12:44.450
If they give you an advice on like a legal issue, you don't just go and do it.

00:12:44.450 --> 00:12:46.210
You have to ask a lawyer.

00:12:46.210 --> 00:12:52.129
So this is uh how uh we should look at uh AI in general.

00:12:52.129 --> 00:12:59.170
Okay, uh they give you information, just go double check them, uh the accuracy of that information uh provided.

00:12:59.170 --> 00:12:59.570
Yeah.

00:12:59.570 --> 00:13:14.769
This can basically be the foundation of how we look at AI. And also, over-reliance on it can affect our skills badly.

00:13:14.769 --> 00:13:17.970
Uh that's another thing, you know.

00:13:18.530 --> 00:13:20.850
Right, yeah, the over-reliance on it.

00:13:20.850 --> 00:13:24.610
And yeah, I like that term too, learning partner.

00:13:24.610 --> 00:13:32.370
I've heard the term from some of my other guests; they've used the term "collaborative partner" as well.

00:13:32.370 --> 00:13:43.090
So I think the key is that it's coming alongside you, and you're using it to help generate ideas.

00:13:43.090 --> 00:13:46.290
I always tell my students, the final product always has to be yours.

00:13:46.290 --> 00:13:48.850
You can't have it be AI generated.

00:13:48.850 --> 00:13:51.810
And yeah, so it's it's always important.

00:13:51.810 --> 00:13:58.129
Actually, the university established this, but I kind of took it and ran with it.

00:13:58.129 --> 00:14:16.530
So in all my weekly announcements, I have this AI stoplight method as a reminder: green light means this is allowed, yellow light means caution, and red light means you cannot use it for any reflections or anything like that.

00:14:16.530 --> 00:14:18.450
That has to be your own words, right?

00:14:19.090 --> 00:14:19.649
Your own thoughts.

00:14:20.530 --> 00:14:21.810
Yeah, so it's very interesting.

00:14:21.810 --> 00:14:37.970
And then there are assignments where, because I have some classes I'm teaching where they have to utilize AI, they're able to have those conversations and kind of figure out how they would handle this in a safe setting, right?

00:14:37.970 --> 00:14:40.210
And how would they handle it in the workplace?

00:14:40.210 --> 00:14:41.810
So it's very interesting.

00:14:41.810 --> 00:14:52.690
And in one assignment they actually utilize it to help them generate some content, or just to kind of get started with the e-learning module on workplace safety.

00:14:52.690 --> 00:14:59.570
And I'm like, oh, this is really interesting, because some of them are in K through 12 education; they don't know workplace training and things like that.

00:14:59.570 --> 00:15:02.850
So it's very interesting to kind of see how they use it.

00:15:02.850 --> 00:15:08.930
But some of my students have actually told me right up front, they're like, I feel like I'm cheating when I use AI.

00:15:08.930 --> 00:15:19.490
And I'm like, see, that's your intuition telling you that you have guardrails in place, and that's great, you know, that you're not just forging full ahead, but you're thinking about it.

00:15:19.490 --> 00:15:24.850
And I said, you know, I understand; sometimes I thought I was cheating too.

00:15:24.850 --> 00:15:31.490
But, I said, as long as you're not presenting it as your own work and you're being ethical, yeah.

00:15:31.490 --> 00:15:36.610
I think you can use it; just be cautious about how you're using it.

00:15:36.850 --> 00:15:38.370
So that's interesting.

00:15:38.370 --> 00:15:46.530
I thought it was only me who feels this way, because actually I feel I'm doing something wrong when I use it.

00:15:46.530 --> 00:16:03.170
And I noticed that from my students also, when we talked about publicly discussing the use of AI, and they were surprised, like, are we allowed to do that?

00:16:03.170 --> 00:16:05.649
Because they try to hide it, you know.

00:16:05.649 --> 00:16:15.330
Right, as we do, because we feel, okay, whoever reads this will think we used AI.

00:16:15.330 --> 00:16:18.450
So it's not entirely uh our work, you know.

00:16:18.450 --> 00:16:29.410
We think this in our head whenever we use AI and it gives us a nice, polished answer, you know what I mean, especially when we're doing something written.

00:16:29.410 --> 00:16:32.450
So we feel like, oh, should I use that?

00:16:32.450 --> 00:16:38.210
Or, I just want it to be authentic and totally mine.

00:16:38.210 --> 00:16:42.050
When someone asks me, Did you use AI, I tell them no, I didn't, you know.

00:16:42.050 --> 00:16:48.930
It feels like I'm doing something wrong, and it's a stain on my work.

00:16:48.930 --> 00:16:50.690
Yeah, I don't wanna have it.

00:16:50.690 --> 00:16:51.490
Yeah.

00:16:51.490 --> 00:16:54.290
That's uh natural.

00:16:54.290 --> 00:17:04.049
But the question is how to use it and when to use it, mainly, not if we can use it or not.

00:17:04.049 --> 00:17:08.049
Definitely we can, uh, but when and how, yeah.

00:17:08.529 --> 00:17:09.649
Right, exactly.

00:17:09.649 --> 00:17:15.329
Yeah, and that kind of goes into you know my next question too about classroom culture, right?

00:17:15.329 --> 00:17:16.690
That starts on day one.

00:17:16.690 --> 00:17:29.649
If you introduce guardrails around AI, not really rules, but more like guidance, and you wait until week three of the class, that's not a good way to start off.

00:17:29.649 --> 00:17:31.730
So it starts on day one with that culture.

00:17:31.730 --> 00:17:40.930
So, what are some norms and expectations you think instructors should set with students to help guide that ethical use of AI without shutting down curiosity?

00:17:42.449 --> 00:17:49.970
Yes, actually, it's important, what you said about not shutting down curiosity; we want to promote academic freedom.

00:17:49.970 --> 00:17:58.129
It's very important, because we encourage students to use all sources of information.

00:17:58.129 --> 00:18:06.529
Now, about the norms and expectations, uh usually they can be found in the school's policy or AI user guides.

00:18:06.529 --> 00:18:13.490
We can only refer the students to these, but after all, they might read it but not understand it.

00:18:13.490 --> 00:18:34.849
But of course, we are talking about the schools that integrate and allow the use of AI, not blocking it, because in my research I found that some universities actually block it; they forbid the use of AI in education, which is very strange.

00:18:34.849 --> 00:18:35.490
Yeah.

00:18:35.490 --> 00:18:38.289
Yeah, it is very strange, yeah.

00:18:38.289 --> 00:18:41.889
It's not the way to deal with such a tool.

00:18:41.889 --> 00:18:44.529
Now uh to answer your question, yeah.

00:18:44.529 --> 00:18:50.449
First, the instructors should be educated about the limitations of AI.

00:18:50.449 --> 00:19:16.929
They should have digital or AI literacy, which was actually my proposal in the research I've done: professional development in the form of teacher training using David Kolb's experiential learning theory, where I proposed that the teachers should apply it to themselves.

00:19:16.929 --> 00:19:24.049
Actually, I applied it to myself before I chose the topic for my master's dissertation.

00:19:24.049 --> 00:19:28.769
I hadn't used ChatGPT before.

00:19:28.769 --> 00:19:30.369
So I wanted to apply it.

00:19:30.369 --> 00:19:34.209
I had a negative opinion about it.

00:19:34.209 --> 00:19:38.049
So I wanted to test it, you know, and apply it on myself.

00:19:38.049 --> 00:19:40.129
This uh this tool.

00:19:40.129 --> 00:19:45.250
I wanted to see how it works and what good can uh come from it.

00:19:45.250 --> 00:19:53.009
So I applied Kolb's theory to myself and I started experimenting with it.

00:19:53.009 --> 00:20:01.409
And actually, by the end of my dissertation, I was 180 degrees the opposite.

00:20:01.409 --> 00:20:04.529
I was like, okay, that's not actually too bad.

00:20:04.529 --> 00:20:06.529
It's just like any other sources.

00:20:06.529 --> 00:20:17.649
If we're talking about cheating and plagiarism, we can do it any way, you know, with any method.

00:20:17.649 --> 00:20:32.609
So I think, yeah, the instructors first should be aware of the limitations; then they can pass it to the students.

00:20:32.609 --> 00:20:41.889
Then another important aspect of the teacher's role is to be a role model for the students.

00:20:41.889 --> 00:20:49.490
They have to equip the students with a responsible mindset through their leadership.

00:20:49.490 --> 00:21:02.769
Also, they need to explain it. Now, for the students, if they understand why this subject is important: why am I learning this subject, how is it important in my field?

00:21:02.769 --> 00:21:11.809
Or what learning is and how I gain knowledge, I think they will understand how to use it, you know, naturally.

00:21:11.809 --> 00:21:23.649
If I come to ChatGPT and ask it to write a report or a task or an assignment for me, I actually didn't learn anything.

00:21:23.649 --> 00:21:30.129
Even if I read it, I didn't do the hard work, I didn't experiment with it myself, you know.

00:21:30.129 --> 00:21:37.889
So the students should know that this way they will not learn.

00:21:37.889 --> 00:21:47.569
And if they don't learn, it will eventually affect their career, if they are thinking about their career, you know.

00:21:47.569 --> 00:21:54.849
So uh knowing uh the consequences of uh the misuse of AI can help also in that area.

00:21:54.849 --> 00:21:58.049
When to use it, how to use it, we spoke about that.

00:21:58.049 --> 00:22:06.369
Now, also, the instructor can promote open discussion.

00:22:06.369 --> 00:22:12.129
The students need to openly discuss their experience and the outcomes of that use.

00:22:12.129 --> 00:22:19.009
We were talking about when they feel they are doing something wrong, you know, when they use AI.

00:22:19.009 --> 00:22:31.809
Now, if they openly discuss their use with the teacher, then the teacher can actually guide them into the correct use.

00:22:31.809 --> 00:22:37.970
And this way they can acknowledge their use, so it's no longer plagiarism.

00:22:37.970 --> 00:22:43.889
They are not claiming it's their own work.

00:22:43.889 --> 00:22:50.209
Even if they copy-pasted it from the AI response, you know.

00:22:50.609 --> 00:22:51.329
Right, right.

00:22:51.569 --> 00:22:57.649
They are not doing something wrong unless they claim it's theirs; they need to understand what plagiarism is.

00:22:58.449 --> 00:22:59.569
Right, exactly.

00:22:59.569 --> 00:23:00.209
Yeah.

00:23:00.209 --> 00:23:08.049
You know, before AI became such a huge thing, students were cheating before that.

00:23:08.049 --> 00:23:14.449
So I always look at it as, if they cheated before, then they're just using another way to cheat.

00:23:14.449 --> 00:23:17.250
So, you know, exactly, it's just a new way.

00:23:17.409 --> 00:23:22.209
Before even Google was invented, uh we had cheating, right?

00:23:22.209 --> 00:23:25.809
People used to pay other people to do their assignment for them.

00:23:25.809 --> 00:23:31.089
And now you don't need to pay; it's free.

00:23:31.089 --> 00:23:32.289
That's the difference.

00:23:32.289 --> 00:23:44.049
Yeah, and you can even just ask the AI, or any tool, to imitate your style of writing.

00:23:44.049 --> 00:23:55.569
So the AI detector, or the instructor, won't recognize that it's AI-generated material.

00:23:55.970 --> 00:23:56.289
Right.

00:23:56.289 --> 00:23:57.250
That's true.

00:23:57.250 --> 00:23:59.250
Yeah, that's that's interesting.

00:23:59.250 --> 00:24:13.250
And I've even heard some people in my circle of curriculum development, and even some of my students, saying that if you see the em dash in writing, people automatically think it's AI.

00:24:13.250 --> 00:24:22.609
But I know people who are editors who are like, well, the em dash has been around for a long time, and it's actually a really good thing to use.

00:24:22.609 --> 00:24:30.209
So, you know, people think they have to take the em dash out of everything because it's going to get flagged as, oh, that's AI-generated.

00:24:30.209 --> 00:24:31.569
So it's really funny.

00:24:31.569 --> 00:24:34.129
But you can, actually; I did an experiment too.

00:24:34.129 --> 00:24:40.289
It's funny, I asked AI, can you take any of my responses and make sure there are no em dashes in it?

00:24:40.289 --> 00:24:41.730
And it did, it just took them out.

00:24:41.889 --> 00:25:04.769
So, you know, actually, when I was doing the literature review for my study, I found an interesting paper that said there was an increase in some verbs or words after the invention of ChatGPT; for example, the word "delve," you know, it's used a lot.

00:25:06.049 --> 00:25:06.689
It's used a lot.

00:25:07.329 --> 00:25:09.809
Okay, but this is an actual word, people use it.

00:25:09.809 --> 00:25:10.929
What's wrong with that?

00:25:10.929 --> 00:25:11.329
Right.

00:25:11.329 --> 00:25:16.369
And even if I use ChatGPT once, I learn from it.

00:25:16.369 --> 00:25:19.250
And I actually do, I learn from it.

00:25:19.250 --> 00:25:31.649
Actually, you know what's interesting? What made me choose this topic for my dissertation was a video I watched. I hadn't used ChatGPT before that video.

00:25:31.649 --> 00:25:41.649
I was watching a video of some YouTubers who were experimenting with asking ChatGPT questions, reading the answers, and commenting on them.

00:25:41.649 --> 00:25:50.369
And I noticed the answer was exactly how we want our students to cover the assessment criteria, you know, to evaluate.

00:25:50.369 --> 00:26:00.289
To evaluate, first you need to define the subject, then talk about the advantages and the disadvantages and give your opinion, you know what I mean?

00:26:00.289 --> 00:26:02.449
So I thought, that's actually great.

00:26:02.449 --> 00:26:11.889
We can teach our students how to evaluate and how to analyze just by reading the responses and imitating them, but in their own words.

00:26:11.889 --> 00:26:15.649
That's the only difference, you know.

00:26:15.649 --> 00:26:19.889
So yeah, we can learn from AI.

00:26:19.889 --> 00:26:21.009
What's wrong with that?

00:26:21.009 --> 00:26:28.849
If I notice the way it phrases a sentence, I can use that in my writing.

00:26:28.849 --> 00:26:31.089
So does that mean I'm plagiarizing?

00:26:31.089 --> 00:26:32.849
No, I'm just learning.

00:26:33.250 --> 00:26:33.490
Right.

00:26:33.490 --> 00:26:48.609
Yeah, you're just learning how to do it. You know, it's interesting too, because I'm writing a book based on one of the most popular podcast episodes, and I had the structure all laid out and the manuscript's pretty much ready to go.

00:26:48.609 --> 00:26:55.809
But as I was looking through the chapters, Hamza, I was like, something seems off, because it's all about the ID models and theories.

00:26:55.809 --> 00:27:03.970
And as educators, we know you typically want to start with the simple ones first and then go into the more complicated ones.

00:27:03.970 --> 00:27:08.209
But I felt like something was not right with the order of my chapters.

00:27:08.209 --> 00:27:16.129
So I decided, you know what, I'll open ChatGPT, stick my chapters in there, and then ask it, is this a good format?

00:27:16.129 --> 00:27:17.169
Is this in a good order?

00:27:17.169 --> 00:27:20.289
Can you make any suggestions for me on the chapter order?

00:27:20.289 --> 00:27:25.329
And it came back and it said, yeah, you're good, you know, you've got a good structure here, chapters one through five look good.

00:27:25.329 --> 00:27:29.329
But then in the middle, it seemed like things were just kind of a little bit off.

00:27:29.329 --> 00:27:31.970
I knew something was off and it gave me really good suggestions.

00:27:31.970 --> 00:27:35.169
But then it also said, why don't you break it into parts?

00:27:35.169 --> 00:27:37.329
Because then you reduce the cognitive load.

00:27:37.329 --> 00:27:40.369
And I'm like, wow, I should know that.

00:27:40.369 --> 00:27:45.649
I'm an instructional designer, but it was like a delightful moment.

00:27:45.649 --> 00:27:49.649
I'm like, gosh, why didn't I think of that? But yeah, 20 chapters is a lot.

00:27:49.649 --> 00:27:50.689
So I did that.

00:27:50.689 --> 00:27:54.369
I broke it into sections, and I just took ChatGPT's ideas.

00:27:54.369 --> 00:28:06.529
I just kind of massaged it a little bit, worked with the wording. But it was really neat, because, I don't know, I may have thought of that on my own, but maybe not.

00:28:06.529 --> 00:28:11.970
It may have taken me six months down the road, or it may have taken a publisher to go, hey Monet, why don't you do this?

00:28:11.970 --> 00:28:17.409
So for me to, you know, kind of think of that on the front end of it instead of later on, it was kind of neat.

00:28:17.409 --> 00:28:24.929
So yeah, that's a good example of how we can utilize it to come up with ideas and improve what we're already doing, right?

00:28:24.929 --> 00:28:25.809
Yeah.

00:28:26.449 --> 00:28:38.769
Yeah, and thank you for sharing that experience, because it made me think of the impact of ChatGPT, or the use of AI, on our mental health, right?

00:28:38.769 --> 00:28:41.329
Questioning ourselves: what's wrong with me?

00:28:41.329 --> 00:28:45.089
Why didn't I think of it this way, you know what I mean?

00:28:45.089 --> 00:28:51.569
Questioning our ability to think better than AI, you know.

00:28:51.569 --> 00:29:01.409
That's an interesting area to explore, too: the influence of AI on the mental health of its users.

00:29:01.809 --> 00:29:02.289
Right.

00:29:02.289 --> 00:29:05.490
Yeah, oh my gosh, that's amazing.

00:29:05.490 --> 00:29:06.369
Yeah.

00:29:06.369 --> 00:29:14.209
So I wanted to go back to something, because we were talking about how to set up that classroom for success.

00:29:14.209 --> 00:29:18.209
So, you know, design choices can also reduce misuse.

00:29:18.209 --> 00:29:21.089
So are there different types of assignment structures?

00:29:21.089 --> 00:29:28.929
Maybe you could share one that increases authentic learning while lowering the incentive to outsource work to AI.

00:29:28.929 --> 00:29:46.049
Because this is something we're doing a lot at our university with authentic assessments. We don't want students to think they don't write anymore because they can just have AI write a paper for them; we want to make it authentic for them.

00:29:46.049 --> 00:29:48.929
So are there some ways they can do that?

00:29:50.049 --> 00:29:50.849
Yeah, definitely.

00:29:50.849 --> 00:29:56.129
There are a lot of ways, actually, to do that when we design the assignments.

00:29:56.129 --> 00:30:01.250
In our center, we follow an assignment-based system.

00:30:01.250 --> 00:30:02.529
We don't do exams.

00:30:02.529 --> 00:30:22.369
We ask the students to cover the assessment criteria, which are provided by Pearson, by dividing the assignment into parts, and these parts include a presentation and a report, for example, or a reflective review.

00:30:22.369 --> 00:30:36.369
For the presentations, as we already discussed, the students present their work, and after that we have a Q&A discussion session.

00:30:36.369 --> 00:30:44.209
This way we can know that the learner understands what they said.

00:30:44.209 --> 00:30:52.609
Even if they copied from or used ChatGPT, or even if they acknowledged it, it's fine.

00:30:52.609 --> 00:30:57.089
Because eventually they learned what we want them to learn, you know?

00:30:57.089 --> 00:30:58.929
The purpose is achieved.

00:30:58.929 --> 00:31:02.449
No matter how it's done, it's done eventually.

00:31:02.449 --> 00:31:06.929
And if they don't have the correct information, we can fix it for them.

00:31:06.929 --> 00:31:33.250
But for the written part, for example the reports, here we have a problem. If they submit without discussing it with us and without a Q&A, we end up in that area where we suspect the use of AI, we put it in the AI detectors, and we might accuse them of plagiarizing and so on.

00:31:33.250 --> 00:31:41.089
So what we did is we asked them to discuss what they've done.

00:31:41.089 --> 00:31:51.009
How did they write the report, for example, what did they cover in it? And we ask them questions, just like with a presentation, you know.

00:31:51.009 --> 00:31:58.129
But instead of a presentation, it's a discussion, you know.

00:31:58.129 --> 00:32:11.970
This way we can tell that they know what they wrote, that they understand it, that they comprehend it, and we make sure we witness their understanding, not only read it somewhere.

00:32:11.970 --> 00:32:35.169
I think focusing on increasing higher-order thinking skills is also very important with the use of AI, because we want to leave behind the first levels of Bloom's taxonomy, like explain, define, remember.

00:32:35.169 --> 00:32:38.769
These things can be done in classroom activities.

00:32:38.769 --> 00:32:45.409
But we need to increase the critical thinking, the evaluation, the analysis.

00:32:45.409 --> 00:32:49.409
We want to hear the students' opinion on everything they say.

00:32:49.409 --> 00:32:53.409
Okay, even if you wrote this definition, what do you think of it?

00:32:53.409 --> 00:33:30.369
When we hear the student's opinion, okay, they can also ask AI for that, but this way we can encourage them to have their own opinion, not only copy what AI said. They can even say, for example, I asked ChatGPT about this topic and it said this, but I think it's wrong about that, and here is what I think about the topic.

00:33:30.369 --> 00:33:31.329
You know what I mean?

00:33:31.329 --> 00:33:31.889
Right.

00:33:31.889 --> 00:33:35.889
Acknowledging the use of AI is very important, also.

00:33:36.609 --> 00:33:38.049
Absolutely, right.

00:33:38.049 --> 00:33:40.049
Yeah, that's so important.

00:33:40.049 --> 00:33:47.009
Because I noticed this with the program I help teach for, the instructional design program; it's a master's-level program.

00:33:47.009 --> 00:33:49.250
And so I teach the instructional design courses.

00:33:49.250 --> 00:33:50.529
We talked a little bit about that.

00:33:50.529 --> 00:34:08.369
What's great about the program, the way they built it and the way they built the classes, is it's all project-based, which is great. Not only are they doing collaborative work now, which they didn't have before, but they just revised these courses earlier this year, around March or April.

00:34:08.369 --> 00:34:11.570
And now I'm starting to see the revisions in place.

00:34:11.570 --> 00:34:20.929
So I'm starting to teach these classes now with the revisions, and it's such a huge difference to see AI incorporated into them in an ethical way.

00:34:20.929 --> 00:34:28.530
But what's really cool is that they're also doing group projects, so I actually get to see their collaboration and how they connect with each other.

00:34:28.530 --> 00:34:31.809
So even in an online environment, they can still connect with each other.

00:34:31.809 --> 00:34:33.090
So it's really great.

00:34:33.090 --> 00:34:37.170
So the class I have coming up is the theories class.

00:34:37.170 --> 00:34:39.809
That's one of my favorite classes to teach.

00:34:39.809 --> 00:34:50.930
I love teaching the theories class, and they actually have this assignment where they have to do a pitch deck, where they're pitching their idea, right?

00:34:50.930 --> 00:35:00.530
So they do the presentation, and then they get together in pairs or in a small group, and they actually have to give that presentation to their peers.

00:35:00.530 --> 00:35:07.890
And then their peers have to evaluate them on it, and they fill out the rubric and all of that stuff and submit it.

00:35:07.890 --> 00:35:18.930
And then they do an individual reflection on that process, the ethical decision-making they went through, and how the collaboration process helped them.

00:35:18.930 --> 00:35:20.930
So that's authentic, right?

00:35:20.930 --> 00:35:22.370
Because you're yeah, yeah.

00:35:22.370 --> 00:35:25.490
Yeah, those types of assignments are amazing.

00:35:25.490 --> 00:35:34.930
I always love grading those, because it's great to see how they go about it and how they're able to complete those assignments.

00:35:34.930 --> 00:35:36.530
So it's really fun to see those.

00:35:36.530 --> 00:35:36.850
Yeah.

00:35:38.210 --> 00:35:47.809
Adopting a reflective approach is very, very important, to avoid relying entirely on AI.

00:35:47.809 --> 00:35:57.329
Also to discuss the use of AI and the responses, to evaluate and reflect on the responses generated.

00:35:57.329 --> 00:36:11.250
For example, in our session plans for the teachers, we suggest that every session include an AI-based activity.

00:36:11.250 --> 00:36:27.970
It's different every session, but for example, the teacher can give the students a topic, put them in two groups, and have them hold a debate.

00:36:27.970 --> 00:36:36.210
One group uses AI for their research or for looking for the answer, and the other group doesn't use AI, you know.

00:36:36.210 --> 00:36:43.410
They use secondary sources, you know, the regular articles and other things.

00:36:43.410 --> 00:36:59.570
After that they compare the responses and their findings, and they can discuss which one is more accurate, which one is more factually checked, you know what I mean?

00:36:59.570 --> 00:37:23.329
So this practice of openly discussing the use of AI teaches the learners the limitations of AI, and when to take the response as it is or when to double-check it for accuracy and bias, you know.

00:37:23.890 --> 00:37:24.930
Right, exactly.

00:37:24.930 --> 00:37:26.370
Yeah, that's wonderful.

00:37:26.370 --> 00:37:30.769
And that can actually be the bonus section of the episode.

00:37:30.769 --> 00:37:37.730
I love that, because that's a great example of how to make it authentic.

00:37:37.730 --> 00:37:45.329
And as educators, we're always trying to find authentic approaches that will help students in the workforce.

00:37:45.329 --> 00:37:57.490
I've run across this before in curriculum development, and it's a little disheartening when I see it in an undergraduate or graduate-level course.

00:37:57.490 --> 00:38:00.930
In this case, it was a graduate psychology course that we were revising.

00:38:00.930 --> 00:38:06.450
I was doing my pre-ID review before the revision kicked off.

00:38:06.450 --> 00:38:13.650
I kept looking through the syllabus, and I saw assignment after assignment was a paper, and then another paper.

00:38:13.650 --> 00:38:16.050
So eight weeks of papers.

00:38:16.050 --> 00:38:17.650
That's all it had: papers.

00:38:17.650 --> 00:38:20.050
And I'm like, okay, I've got to make some suggestions here.

00:38:20.050 --> 00:38:21.010
I'm like, can we?

00:38:21.010 --> 00:38:25.170
So I was looking at what the deliverable was and what they were being asked to do.

00:38:25.170 --> 00:38:28.769
And I'm like, okay, can we make this a PowerPoint instead?

00:38:28.769 --> 00:38:46.530
Or can we make this something else? I was trying to come up with ideas, because first of all, that's got to be disengaging for students, but then instructors have to grade papers that are 750 to a thousand words, or even ones that were 2,000 words.

00:38:46.530 --> 00:38:47.570
I'm like, that's a lot.

00:38:47.570 --> 00:38:50.130
I mean, 250 words is on average a page.

00:38:50.130 --> 00:38:53.010
So yeah, that's a lot of writing.

00:38:53.010 --> 00:38:55.970
By the end, I'm sure they would be burned out.

00:38:55.970 --> 00:38:58.930
Both instructor and student would be burned out with eight papers.

00:38:58.930 --> 00:39:01.809
So I think authentic assessment is so important.

00:39:01.809 --> 00:39:02.690
Yeah.

00:39:03.010 --> 00:39:08.850
Yeah, and then the assessors will maybe use an AI tool to assess the students' work and give them feedback.

00:39:08.850 --> 00:39:10.850
Now it's a thing.

00:39:10.850 --> 00:39:11.250
Right.

00:39:11.250 --> 00:39:16.450
Yeah, it's a thing now also, which is really unfortunate.

00:39:16.450 --> 00:39:21.890
It's an unfortunate thing.

00:39:21.970 --> 00:39:24.130
So yeah, yeah.

00:39:24.130 --> 00:39:30.530
So it's always good to look for those opportunities where we can provide that more authentic assessment.

00:39:30.530 --> 00:39:36.769
And I think AI can be a tool for that too. We do this in curriculum as well.

00:39:36.769 --> 00:39:39.329
This is cool, actually.

00:39:39.329 --> 00:39:41.570
We'll take an existing discussion question.

00:39:41.570 --> 00:39:55.809
If it's tied to professional standards, like counseling, for example. At the university I work for, we just got accreditation for the counseling program.

00:39:55.809 --> 00:39:57.890
It's called CACREP for short.

00:39:57.890 --> 00:40:06.690
Now that the programs are accredited, we have to switch over from the 2016 standards to the 2024 standards.

00:40:06.690 --> 00:40:09.650
And that has to be done by July of next year.

00:40:09.650 --> 00:40:12.450
But we can't just do that all in one fell swoop.

00:40:12.450 --> 00:40:13.970
We have to do it course by course, right?

00:40:13.970 --> 00:40:16.289
And that takes time to revise the courses.

00:40:16.289 --> 00:40:35.730
So what we've been doing is taking that existing discussion question, the objective it's tied to, and the new standards, and plugging them in. Not into an external tool; we have an internal, closed-system AI tool. We plug it in and ask, hey, does this meet the standards, and does it meet the objective?

00:40:35.730 --> 00:40:40.769
And if not, can you give suggestions on how we can revise it to make it better?

00:40:40.769 --> 00:40:42.850
And I give AI guardrails.

00:40:42.850 --> 00:40:46.210
Because if you don't give it guardrails, you know this.

00:40:46.210 --> 00:40:50.450
It'll just give you a discussion question that is almost like an assignment.

00:40:50.450 --> 00:40:53.090
Like, no, this is a discussion question.

00:40:53.090 --> 00:40:58.769
So I give it guardrails and I'm like, responses should usually be about 200 to 250 words.

00:40:58.769 --> 00:41:04.610
And I think as long as the AI knows what your parameters are, it can give you better responses.

00:41:04.610 --> 00:41:20.450
So prompt engineering is so important. It's really interesting because it helps guide our curriculum. We don't use the output as it is; it just gives us ideas, and then we have the subject matter experts tweak it a little bit and refine it.

00:41:20.450 --> 00:41:27.010
So it's a really neat process to see how AI can work in curriculum design as well.

00:41:27.250 --> 00:41:27.650
Yeah, yeah.

00:41:27.650 --> 00:41:35.890
Sometimes I find myself just losing it with ChatGPT, for example, or even Gemini.

00:41:35.890 --> 00:41:37.170
They don't get me.

00:41:37.170 --> 00:41:42.610
I mean, I keep asking them, and eventually I say, you know what, forget about it.

00:41:42.610 --> 00:41:48.130
I'll just rely on my human mind, you know, to figure it out.

00:41:48.130 --> 00:41:50.289
Yeah, I have to do the hard work.

00:41:50.930 --> 00:41:52.130
Exactly, right.

00:41:52.130 --> 00:41:52.850
I love it.

00:41:52.850 --> 00:41:57.809
So as we wrap up this episode, I wanted to pose one final question.

00:41:57.809 --> 00:42:08.450
Looking one year ahead, what do you think is the single capacity every student should have for responsible AI use, both in their academic and professional work?

00:42:10.370 --> 00:42:22.370
Well, as you said, one year ahead, we're just guessing what will happen in the AI field and what might be invented.

00:42:22.370 --> 00:42:41.090
But I think the one thing students can carry with them on the journey is building moral awareness, or moral integrity, righteousness, academic integrity and honesty.

00:42:41.090 --> 00:43:00.050
I think it's very important to build that into the students' character, because eventually we might fool the instructors, the assessors, even the AI detectors.

00:43:00.050 --> 00:43:10.130
But we cannot fool ourselves if we are doing something wrong, or if we are claiming this is our work, which is basically lying, you know.

00:43:10.130 --> 00:43:15.890
Plagiarism is taking something that isn't yours and claiming it's yours.

00:43:15.890 --> 00:43:30.850
So always remember that honesty is the key to academic integrity. Another thing is critical thinking.

00:43:30.850 --> 00:43:44.130
The difference between us and AI is that we practice critical thinking, and we need to use it in every aspect of our lives.

00:43:44.130 --> 00:43:56.769
And on the professional side, you need to build your future character, and you need to build professionalism.

00:43:56.769 --> 00:43:58.450
Uh what do you want to be?

00:43:58.450 --> 00:44:03.970
What do you want to do? Where do you want to work, you know?

00:44:03.970 --> 00:44:05.890
And think about your skills.

00:44:05.890 --> 00:44:09.250
What kind of skills are you building when you use AI?

00:44:09.250 --> 00:44:14.050
Are you activating the skills that God gave you?

00:44:14.050 --> 00:44:17.170
What are you doing with them?

00:44:17.170 --> 00:44:25.809
Are you using them? Also, put yourself in the shoes of the employer, the recruiter, or the company you're working for.

00:44:25.809 --> 00:44:35.890
Would you hire yourself if you couldn't perform well because the internet was simply shut down, or there was no access to the internet or AI?

00:44:35.890 --> 00:44:43.250
Would you be able to do the same task without the use of AI?

00:44:43.250 --> 00:44:50.370
Or do you always need to ask AI how to do it so it can instruct you?

00:44:50.370 --> 00:44:58.610
Yeah, I think these can be carried by every student, and they should be fine eventually.

00:44:59.250 --> 00:45:01.010
I yeah, I think that's great.

00:45:01.010 --> 00:45:01.650
I love that.

00:45:01.650 --> 00:45:07.170
It kind of makes me think of math: can students do math without a calculator?

00:45:07.170 --> 00:45:08.210
Maybe, maybe not.

00:45:08.210 --> 00:45:09.970
So it's kind of the same with AI.

00:45:09.970 --> 00:45:11.250
Yeah, yeah.

00:45:11.570 --> 00:45:24.210
Actually, during my research I found out that when the calculator was introduced, I think back in the 60s or something like that, there was a protest by mathematicians.

00:45:24.210 --> 00:45:36.930
They were protesting the use of the calculator, and they said it was the end of math as we know it.

00:45:36.930 --> 00:45:41.410
So it's the same; it's just a tool, you know.

00:45:41.410 --> 00:45:52.370
And I've noticed that when we try to do simple calculations, we always go to our phone and open the calculator to do it.

00:45:52.370 --> 00:45:59.010
The human mind is just like any muscle, or maybe it's an actual muscle, I don't know.

00:45:59.010 --> 00:46:04.050
But you need to exercise it in order for it to keep working, you know.

00:46:04.050 --> 00:46:07.490
So exercise it by thinking.

00:46:07.490 --> 00:46:21.730
Try first, relying on your own mind, then use AI to get another perspective, maybe, or to double-check your thinking process.

00:46:21.730 --> 00:46:29.809
And you'll be surprised how sometimes your thinking is much better than what AI gave you.

00:46:30.370 --> 00:46:31.650
Right, exactly.

00:46:31.650 --> 00:46:32.210
Wow.

00:46:32.210 --> 00:46:41.890
Well, thank you, Hamza, for starting this series with a clear, classroom-ready playbook for AI norms and learning-first design, because that's what it's all about.

00:46:41.890 --> 00:46:56.370
So next time, in part two, we're going to unpack learner autonomy and show how to calibrate self-direction using Knowles' self-directed learning, known as SDL, and Vygotsky's zone of proximal development.

00:46:56.370 --> 00:46:59.730
I knew I was gonna get that one wrong, but I did pretty good.

00:46:59.730 --> 00:47:03.970
Or ZPD, let's call it ZPD for short, but I just wanted to share what that was.

00:47:03.970 --> 00:47:05.809
Yep, so we'll start that in part two.

00:47:05.809 --> 00:47:11.010
So just real quickly before we end, Hamza, where can people find you?

00:47:11.010 --> 00:47:17.170
What's the best way for people to connect with you if they want to?

00:47:18.450 --> 00:47:32.850
Well, I'm not really on social media; the only social media I use is LinkedIn. They can just search my name, Hamza Sami, and they can connect with me right there, yeah.

00:47:33.250 --> 00:47:33.650
Great.

00:47:33.650 --> 00:47:35.490
I love that, and that's how we connected.

00:47:35.490 --> 00:47:36.690
So yeah, wonderful.

00:47:36.690 --> 00:47:36.930
Yeah.

00:47:36.930 --> 00:47:40.130
Well, thanks again, and we'll be back for part two.

00:47:40.130 --> 00:47:40.850
Appreciate it.

00:47:41.090 --> 00:47:41.970
I look forward to it.

00:47:41.970 --> 00:47:43.890
Thank you so much. Me too.

00:47:44.210 --> 00:47:45.090
Thank you.

00:47:45.570 --> 00:47:46.450
Thank you.

00:47:47.010 --> 00:47:50.769
Thank you for taking some time to listen to this podcast episode today.

00:47:50.769 --> 00:47:53.010
Your support means the world to me.

00:47:53.010 --> 00:48:01.809
If you'd like to help keep the podcast going, you can share it with a friend or colleague, leave a heartfelt review, or offer a monetary contribution.

00:48:01.809 --> 00:48:07.250
Every act of support, big or small, makes a difference, and I'm truly thankful for you.

Hamza Sami

Academic Manager

Based in Istanbul, Turkey. Graduated with a Master of Arts in Education: Leadership and Management from the University of Derby, U.K., and a Bachelor of Science in Multimedia Design from the American University of Sharjah, U.A.E. Currently working as an Academic Manager at Axcel International Academy, a higher education provider.
Educator and creative producer with over 10 years in higher education and the creative industries. Specialised in vocational learning, curriculum design, and instructional delivery, with a strong foundation in educational theory and hands-on application. Skilled in instructional design, blended learning, and academic quality assurance, particularly in BTEC and other competency-based programmes. Lead teaching and learning strategies, support staff development, and implement innovative, learner-centered approaches that bridge theory and practice. Passionate about enhancing student outcomes and aligning academic practices with institutional goals.