Feb. 8, 2026

From Haiti to Edge AI: Building Privacy-First Learning Tools with Sebastien Fenelon

What if your classroom could adapt to each learner without handing their data to the cloud? That’s the promise we dig into with technologist and founder Sebastien Fenelon, whose journey from scarce resources in Haiti to building privacy-first, edge AI tools reframes what “future-ready” really means for educators and instructional designers.

We start with the power of resilience—how self-taught coding, late-night study sessions, and community support can outpace limited infrastructure—and move into practical strategies for teaching code with clarity and context. Sebastien shares why AI should compress project timelines, not critical thinking, and offers a simple “100-hour” ramp to acquire new languages fast. From K–12 to higher ed, we outline how to design small, visible wins that build confidence while using AI to scaffold learning rather than replace it.

We close with a playbook for staying adaptable: keep learning in focused sprints, plug into communities that share what works, and seek mentors who reveal the path behind the skills. If you’re ready to personalize learning, protect student data, and keep your curriculum uniquely yours, this conversation offers a clear blueprint. If it resonates, follow and share with a colleague, and leave a quick review to help more educators find thoughtful, practical guidance on AI in the classroom.

🔗 Website and Social Links:

Please visit Sebastien Fenelon’s website and social media links below.

Sebastien Fenelon’s Website

Sebastien’s Facebook Page

Sebastien’s Instagram Page

Sebastien’s LinkedIn Page

📢 Call-to-Action: If you’d like to dive deeper, be sure to visit the InthraOS homepage. There, you’ll find resources, guides, and a newsletter that will keep you ahead of the curve on AI and privacy-first technology. It’s a great way to gain practical insights, connect with a growing community, and explore products you can start using right away. 

Send Jackie a Text

Join PodMatch!
Use the link to join PodMatch, a place for hosts and guests to connect.

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Support the show

💟 Designing with Love + allows you to support the show by keeping the mic on and the ideas flowing. Click on the link above to provide your support.

Buy Me a Coffee is another way you can support the show, either as a one-time gift or through a monthly subscription.

🗣️ Want to be a guest on Designing with Love? Send Jackie Pelegrin a message on PodMatch, here: Be a guest on the show

🌐 Check out the show's website here: Designing with Love

📱 Send a text to the show by clicking the Send Jackie a Text link above.

👍🏼 Please make sure to like and share this episode with others. Here's to great learning!


00:00 - Meet Sebastien and His Roots

01:06 - Learning to Code With Limited Resources

05:03 - From Design to AI Acceleration

07:20 - Teaching Kids to Code Earlier

10:01 - Personalization With AI in Classrooms

13:26 - Edge AI and Student Analytics

19:00 - Myths, Fears, and Privacy Risks

22:49 - EU AI Act and Data Transparency

27:16 - Local Models vs Closed Clouds

30:54 - Staying Adaptable: Keep Learning, Build Community

34:10 - Mentors, Meaning, and Career Paths

37:09 - Wrap Up, PodMatch Shoutout, Where to Find Sebastien

WEBVTT

00:00:00.160 --> 00:00:02.720
Hello, instructional designers and educators.

00:00:02.720 --> 00:00:06.879
Welcome to episode 88 of the Designing with Love podcast.

00:00:06.879 --> 00:00:10.480
I'm thrilled to have Sebastien Fenelon with me today.

00:00:10.480 --> 00:00:21.519
Sebastien is the founder and CEO of several technology startups with a deep focus on coding, artificial intelligence, and innovation in the tech space, which I'm so excited about.

00:00:21.519 --> 00:00:26.960
So he brings a unique perspective on how technology can shape education and future-ready skills.

00:00:26.960 --> 00:00:28.800
Welcome to the show, Sebastien.

00:00:29.359 --> 00:00:29.600
Thank you.

00:00:29.600 --> 00:00:30.239
Thank you so much.

00:00:30.239 --> 00:00:44.560
It's really great to be here and to share some of my experiences with you, both in design and AI, the very deep infrastructure around tech, and the thing I'm building right now with privacy and AI.

00:00:45.280 --> 00:00:45.679
Great.

00:00:45.679 --> 00:00:46.479
I love that.

00:00:46.479 --> 00:00:47.039
Yes.

00:00:47.039 --> 00:00:51.359
I'm glad that we're going to get into that because that seems to be a concern, right?

00:00:51.359 --> 00:00:56.159
In every industry, in education and technology in general.

00:00:56.159 --> 00:00:57.039
So that's great.

00:00:57.039 --> 00:00:57.840
I love that.

00:00:57.840 --> 00:01:05.519
So to start, can you tell us a little bit about yourself and share what inspired you to focus on the technology field, particularly coding and AI?

00:01:06.159 --> 00:01:06.719
Definitely.

00:01:06.719 --> 00:01:15.120
So, I grew up in a very, very small country, Ayiti, or Haiti, as you may call it.

00:01:15.120 --> 00:01:31.760
Technology wasn't something that came naturally, because we don't have the infrastructure, we don't have the equipment; we're not really a country that is even remotely able to build out infrastructure for electricity or internet connections.

00:01:31.760 --> 00:01:46.400
However, I had the small opportunity to be around people who were thinking differently, educating themselves in a different way, and asking themselves different questions in order to build our infrastructure and put Haiti on a different level.

00:01:46.400 --> 00:02:04.239
I was around these people very young, building things and coming to understand that, with resilience, you can make an impact no matter where you are, where you're from, or what tools you have around you.

00:02:04.239 --> 00:02:14.879
Really, that's what got me started learning to code from a very young age, self-taught in CHH and JavaScript, very young.

00:02:14.879 --> 00:02:43.039
And I built my first small project, which was a browser, very young. Nothing big, but that was one of the first times I really felt like I had built something that looked cooler than the browser that came with my desktop, and a lot faster too. That really started something for a young man: the understanding that you can build something from knowledge.

00:02:43.039 --> 00:02:50.800
And it has to start with education, it has to start with skills, and you have to learn those skills; education and teaching are important.

00:02:50.800 --> 00:03:08.800
And I was around people who, even though they didn't have that knowledge themselves, could tilt me toward where to find it: YouTube videos, documentation, or other people more advanced in the field.

00:03:08.800 --> 00:03:15.199
And I feel grateful for that; it really pushed me to be more resilient.

00:03:15.199 --> 00:03:25.120
And resilience meant staying up at night, waiting for the electricity to come on, because in Haiti that's the natural state of things.

00:03:25.120 --> 00:03:32.479
You don't have electricity for 18 hours a day, and building tech around that is very, very difficult.

00:03:32.479 --> 00:03:43.919
Even internet, some of the time: if you're not within a school or an establishment, you have to go to your neighbor's house, or wait next to your neighbor's house for hours, just to use their internet.

00:03:43.919 --> 00:03:55.439
These are things you can do only with resilience, and I think resilience is what really pushed me toward the understanding that I can build things.

00:03:55.439 --> 00:04:10.159
And even with my project here in New York, resilience is really the main thing that pushes me to understand that even though the world is shaped a certain way, we can create an impact by thinking differently and pushing the boundaries.

00:04:10.479 --> 00:04:16.399
Wow, I love that, Sebastien, because that's something I teach my students, even at the graduate level.

00:04:16.399 --> 00:04:18.959
Don't forget those things that you learned early on.

00:04:18.959 --> 00:04:25.600
And I think learning that early on has helped you be successful and shaped what you've been able to create.

00:04:25.600 --> 00:04:29.920
Not only to help; it's a labor of love, right?

00:04:29.920 --> 00:04:39.199
When you're able to create those tools and technologies, you're helping to shape the world in a better way and to make an impact.

00:04:39.199 --> 00:04:40.879
So yeah, I love that.

00:04:40.879 --> 00:04:43.759
That resilience is so important because I teach that to my students.

00:04:43.759 --> 00:04:44.720
Don't forget that.

00:04:44.720 --> 00:04:48.000
And when they're teaching their kids, teach them that, right?

00:04:48.000 --> 00:04:54.399
And hopefully parents are doing that too, but model that behavior in the classroom, right?

00:04:54.399 --> 00:04:56.240
And outside the classroom.

00:04:56.240 --> 00:04:56.800
I love that.

00:04:56.800 --> 00:04:57.439
That's great.

00:04:57.439 --> 00:04:59.040
I love that you brought that up.

00:04:59.040 --> 00:05:00.879
So yeah, that's great.

00:05:00.879 --> 00:05:02.399
So thank you for that introduction.

00:05:02.399 --> 00:05:03.279
I love that.

00:05:03.279 --> 00:05:13.759
So, as I mentioned in the introduction, you've built startups and worked at the forefront of emerging technologies, like building your first browser.

00:05:13.759 --> 00:05:22.959
So, from your perspective, how can coding and AI be introduced in ways that truly benefit learners in K through 12 or even higher education environments?

00:05:23.360 --> 00:05:32.079
So coding, definitely, is really the tool, the languages, that connect an idea to software.

00:05:32.079 --> 00:05:44.079
That's why coding exists, and that's how you learn and use coding: to take an idea to something that exists in software, or even hardware as well.

00:05:44.079 --> 00:05:59.040
From my experience coming from design, and I did design for a long time, because I really love the idea of designing things, making things appear in a different way, and creating impact from ideas.

00:05:59.040 --> 00:06:01.199
And I think that's what coding gives you as well.

00:06:01.199 --> 00:06:04.079
It gives you the ability to create and to empower.

00:06:04.079 --> 00:06:11.439
And with AI, it changes the speed of work.

00:06:11.439 --> 00:06:15.839
Now you can build within a week what used to take six months, right?

00:06:16.399 --> 00:06:16.560
Right.

00:06:17.519 --> 00:06:23.279
It has to start with understanding the language, right?

00:06:23.279 --> 00:06:31.759
Understanding, if you're building a platform, what it takes to build a system around that platform, and what language is best for the platform itself.

00:06:31.759 --> 00:06:37.600
First of all, what language do you know that lets you understand how to build that platform?

00:06:37.600 --> 00:06:50.879
And that is important, because in order to create a system, to create anything that exists, you have to know: what are the things I need to understand and build as a framework around this platform?

00:06:50.879 --> 00:06:52.480
And a lot of people miss that.

00:06:52.480 --> 00:07:26.319
We've heard about a lot of platforms that let you build platforms from a prompt. They do exist, in some way, but one day you'll wake up and understand that you need to understand every single part of that platform in order to give it your own impact and make it come from a real idea. Because the response you're getting from those platforms is a generic response; these giant AI models are broad, and they give the same response to everybody, right?

00:07:26.319 --> 00:07:30.720
So what you're getting is really a very, very generic response for everyone.

00:07:30.720 --> 00:07:32.800
So everyone's platform will be the same.

00:07:32.800 --> 00:07:35.360
And that's not truly an idea, really.

00:07:35.360 --> 00:07:39.439
That's just a product that you're replicating for everybody, right?

00:07:39.439 --> 00:07:45.040
In order for it to be an idea, in order for it to be your product, it has to come from yourself, right?

00:07:45.040 --> 00:07:53.759
It has to come from being able to manage those different languages, and to manage the different sectors or systems of the platform you're building.

00:07:53.759 --> 00:07:56.160
And that comes from understanding the languages.

00:07:56.160 --> 00:08:04.879
And I say that because I went to many bootcamps and really found ways to understand a language within two weeks.

00:08:04.879 --> 00:08:06.560
And I think everyone can do it.

00:08:06.560 --> 00:08:13.360
It's not easy, but I would recommend that everyone push themselves to try it.

00:08:13.360 --> 00:08:31.360
We call it the 100-hour program: you go on YouTube, or any kind of platform that allows you to learn a coding language, for 100 hours. And I believe 100 hours is all you need to be able to make an impact and build something of value, right?

00:08:31.360 --> 00:08:39.039
And it's really the base: really understanding, building muscle memory, and getting those skills off the bat.

00:08:39.039 --> 00:08:45.360
And then you can use AI in a more elevated way to empower and build faster.

00:08:45.840 --> 00:08:47.120
Wow, I love that.

00:08:47.120 --> 00:08:58.960
And I've noticed too, in education, I see children doing this at younger ages; I'm not sure how young, but K through 12.

00:08:58.960 --> 00:09:07.679
So even before they get to high school and college, they're already doing coding exercises and taking those creative approaches.

00:09:07.679 --> 00:09:09.600
And so their teachers are doing that.

00:09:09.600 --> 00:09:22.399
So, for example, I had a student in one of my graduate classes who teaches science and technology, and she had a fifth-grade class she was teaching how to code, Sebastien.

00:09:22.399 --> 00:09:23.039
It was amazing.

00:09:23.039 --> 00:09:24.480
And I was like, wow.

00:09:24.480 --> 00:09:29.600
When you and I were in fifth grade, were we learning that?

00:09:29.600 --> 00:09:30.960
I wasn't, that's for sure.

00:09:30.960 --> 00:09:36.559
But it was amazing because I didn't learn how to build my first website until I went to college.

00:09:36.559 --> 00:09:41.440
So it just shows how that's happening sooner and sooner for these students.

00:09:41.440 --> 00:09:48.960
And one of my other guests that I had on recently talked about Generation Alpha and how they're in front of screens all the time.

00:09:48.960 --> 00:09:54.879
So I think that's important to get them in front of that technology and get them exposed to it early on, right?

00:09:54.879 --> 00:09:56.879
So that they can have that.

00:09:56.879 --> 00:09:57.600
Yeah.

00:09:58.080 --> 00:10:00.960
And it comes all around resources, right?

00:10:01.120 --> 00:10:01.440
Right.

00:10:01.679 --> 00:10:10.559
When we were very young, we didn't have the resources to give every single kid a laptop with all these tools in front of them.

00:10:10.559 --> 00:10:13.039
That was so expensive back then.

00:10:13.039 --> 00:10:23.840
And right now, with all these platforms and all this technology, it's very, very easy to put this in front of them and give them context.

00:10:23.840 --> 00:10:28.159
And context really is the key to all kinds of knowledge.

00:10:28.159 --> 00:10:43.919
So you're not going to give them a huge LLM system to understand, but you can start very small by sharing with them the key elements of an HTML page, right?

00:10:43.919 --> 00:11:00.399
No one knew this in college or high school before, but now you can easily teach the basics of an HTML page in K-12, and students are able to really code an HTML page in six weeks.

00:11:00.399 --> 00:11:09.120
And that is what people would go to bootcamps for around 10 years ago, paying thousands and thousands of dollars, right?

00:11:09.120 --> 00:11:17.679
And it's not that we're getting smarter; it's more that information flows faster.

00:11:17.679 --> 00:11:27.120
We can be on our phones and get as much information as we did from a two- or six-week bootcamp ten years ago.

00:11:27.120 --> 00:11:42.320
So the information is a lot easier to find and easier to digest, because we can use those Gen AI platforms to understand it in real, actual sentences instead of staring at our code all day.

00:11:42.320 --> 00:12:03.919
What really changed in the last decade is having the resources, and having those platforms as helpers to balance code with real language, destroying the barriers and boundaries between understanding and building.

00:12:04.320 --> 00:12:05.519
Wow, I love that.

00:12:05.519 --> 00:12:06.559
That's great.

00:12:06.559 --> 00:12:11.120
Taking those boundaries that existed and just breaking them down.

00:12:11.120 --> 00:12:12.399
Yeah, I love that.

00:12:12.399 --> 00:12:13.200
That's great.

00:12:13.200 --> 00:12:20.320
So, as you know, many of my listeners are instructional designers and educators who are curious about using AI tools in their own work.

00:12:20.320 --> 00:12:23.360
Even me a year ago, I was like, how do I approach this?

00:12:23.360 --> 00:12:24.000
What do I do?

00:12:24.000 --> 00:12:29.919
So, what opportunities do you see for AI in making teaching and learning more engaging and effective?

00:12:29.919 --> 00:12:30.879
Sebastien, I think...

00:12:30.879 --> 00:12:31.600
Sorry.

00:12:31.600 --> 00:12:32.799
That's okay.

00:12:33.600 --> 00:12:40.559
It's truly around how you can make education more personalized.

00:12:40.559 --> 00:12:57.600
And I think that's what AI tools allow us, or AI itself allows us: the power to educate each and every student in a lot more detail, a lot more personalized.

00:12:57.600 --> 00:13:01.840
And we were never able to do this before, because we never had the tools until now.

00:13:01.840 --> 00:13:22.879
But with AI, we can allow ourselves to dissect and dismantle this kind of unified, one-size-fits-all education, connect with each and every student, and teach them in a very, very singular way.

00:13:22.879 --> 00:13:29.039
And how we can do that is with tools that give us lesson-plan assistance.

00:13:29.039 --> 00:14:24.080
Well, with edge AI, and something I really want to share about today is the power of edge AI in terms of personalization and classroom analytics: edge AI allows us to have very small AI models within devices. With students using those devices, you get real feedback from each student to the professors, and the opportunity to return a personalized response and output to each student on a daily basis. That, to my understanding, is the real power of AI in education: allowing each student to evolve and understand in a more personalized way, by letting those AI tools manage and create analytics on each student.

00:14:24.080 --> 00:14:32.080
And I think this never existed in education; no one had analytics around each student and their level.

00:14:32.080 --> 00:14:37.279
We had some understanding of grades, but we never had analytics around understanding.

00:14:37.279 --> 00:14:50.799
And with AI tools, we can build analytics around levels of understanding for each topic, each chapter, every single subject that we're teaching.

00:14:50.799 --> 00:14:52.720
I think that's very powerful.
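
To make "analytics around understanding" concrete, here is a minimal sketch of what per-student, per-topic mastery tracking could look like. The class name, scoring scheme, and recency weighting are illustrative assumptions, not a description of any product mentioned in this episode.

```python
from collections import defaultdict

class MasteryTracker:
    """Toy sketch: per-student, per-topic levels of understanding."""

    def __init__(self):
        # student -> topic -> list of exercise scores in [0, 1]
        self.scores = defaultdict(lambda: defaultdict(list))

    def record(self, student: str, topic: str, score: float) -> None:
        self.scores[student][topic].append(score)

    def mastery(self, student: str, topic: str) -> float:
        # Weight recent attempts more heavily, so improvement shows up.
        attempts = self.scores[student][topic]
        if not attempts:
            return 0.0
        weights = range(1, len(attempts) + 1)
        return sum(w * s for w, s in zip(weights, attempts)) / sum(weights)

tracker = MasteryTracker()
tracker.record("amara", "html-basics", 0.4)
tracker.record("amara", "html-basics", 0.8)
print(round(tracker.mastery("amara", "html-basics"), 2))  # 0.67
```

Because everything here is a plain in-memory structure, a tracker like this could run entirely on a classroom device, in the edge AI spirit Sebastien describes.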

00:14:53.120 --> 00:14:53.919
I love that.

00:14:53.919 --> 00:14:54.559
Great.

00:14:54.559 --> 00:15:01.360
Yes, because I think of tools like ChatGPT, where you can create your own chatbot, right?

00:15:01.360 --> 00:15:05.200
And then you can center that around something like a specific topic.

00:15:05.200 --> 00:15:53.490
And I have had guests on my show in the past who have done that; they're in education, and they were able to create their own chatbot for their students, centered around what they're teaching.

00:15:53.490 --> 00:16:06.850
And I agree that personalized learning is so important today, because we want to give that to our students, not make it a one-size-fits-all type of experience, but really personalize it.

00:16:06.850 --> 00:16:07.730
I love that.

00:16:07.730 --> 00:16:09.649
That's great, Sebastien.

00:16:09.649 --> 00:16:10.370
Yeah.

00:16:11.009 --> 00:16:19.250
One of the things I've done while learning a new language was with React, I believe around a year or a couple of years ago.

00:16:19.250 --> 00:16:33.330
I used an older version of ChatGPT, and I customized it to set up a project. Projects didn't exist at that time, but you were able to do that by really customizing your model.

00:16:33.330 --> 00:16:44.129
And what I did was customize the model to really learn and teach me at my own pace of advancement in React.

00:16:44.129 --> 00:16:58.129
And I was able to create a keyword for each step I was on, get exercises from that model, and let it build new exercises from the things I was answering and learning.
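
As a rough sketch of that keyword-driven tutor setup, here is what the customization could look like. The prompt wording and the `send_to_model` stub are assumptions standing in for whichever chat-completion API you use; nothing here reproduces Sebastien's actual configuration.

```python
# A self-paced tutor: one keyword per learning step, with new exercises
# adapted to the learner's previous answers.
SYSTEM_PROMPT = """You are a React tutor. The learner tracks progress with
step keywords. When the learner sends a keyword (e.g. STEP:hooks), generate
one exercise for that step, then adapt the next exercise to the mistakes in
their previous answer."""

def send_to_model(system: str, messages: list[dict]) -> str:
    # Hypothetical stand-in: wire this to your chat-completion API.
    raise NotImplementedError

history = [{"role": "user", "content": "STEP:hooks"}]
# exercise = send_to_model(SYSTEM_PROMPT, history)  # raises until wired up
```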

00:16:58.129 --> 00:17:06.769
And I think that's one of the ways we can have each student interact with an AI platform in order to really understand.

00:17:06.769 --> 00:17:25.410
And I think who wins from that is the one learning, the one really interacting fully, by making sure that this individual is a hundred percent getting full knowledge and full understanding of whatever is being taught.

00:17:25.410 --> 00:17:28.930
And I think that's why we are educating.

00:17:29.410 --> 00:17:30.450
Right, exactly.

00:17:30.450 --> 00:17:34.529
We're training them to be future ready, right?

00:17:34.529 --> 00:17:36.210
And we'll talk about that a little bit too.

00:17:36.210 --> 00:17:36.930
I love that.

00:17:36.930 --> 00:17:37.889
That's great.

00:17:37.889 --> 00:17:43.490
So, as we know, whenever new technologies come along, there are challenges, misconceptions, and even fears.

00:17:43.490 --> 00:17:48.929
I've experienced that with the faculty I work with, where they're fearful of this new technology.

00:17:48.929 --> 00:17:56.209
So, what's one of the biggest myths about coding or AI that you've encountered, and how do you help people move past that type of fear?

00:17:56.609 --> 00:18:08.689
I really love that fear, and I think that fear needs to exist, because it demands better products and better ways of building around AI.

00:18:08.689 --> 00:18:16.609
And I want to talk more about privacy first, and really about fear in AI.

00:18:16.609 --> 00:18:30.449
Because I have a background in that market, I have more understanding of how European culture is responding to AI at the moment, and the response in Europe compared to the US is very, very different.

00:18:30.449 --> 00:18:40.049
European AI companies are not really advancing the way they would want to; what they do instead is enter the US market.

00:18:40.049 --> 00:18:40.929
And why?

00:18:40.929 --> 00:18:52.769
It's because people are asking for privacy, asking for a full understanding of how their data is being utilized when it's used on an AI platform or any AI model.

00:18:52.769 --> 00:19:01.490
And now we have the EU AI Act, which is real now and asks these companies to be transparent in terms of how AI is being utilized.

00:19:01.490 --> 00:19:11.250
And the issue is that there were no actual systems put in place for them to tell you exactly how they are using your data.

00:19:11.250 --> 00:19:24.689
So the current state of AI implementation in the world right now is this: you're giving your content to that model, to any AI chat, ChatGPT, for example.

00:19:24.689 --> 00:19:35.250
You're giving them content and getting a response, and your content becomes part of the price you pay to get that response, to get that value back.

00:19:35.250 --> 00:19:47.490
And that data is being utilized in many different ways that you're not aware of, and it causes a lot of privacy issues and crosses privacy boundaries.

00:19:47.490 --> 00:19:52.129
For example, one of the main risks is prompt injection.

00:19:52.129 --> 00:19:57.569
In 2023, 20% of all prompt injections were successful.

00:19:57.569 --> 00:20:14.929
And what prompt injection really means is that someone from anywhere in the world can see, through your tokens, what you're asking an LLM or any AI model or chatbot, so they can really have access to the data being shared with an AI model.

00:20:14.929 --> 00:20:28.529
And one of the articles that came out from a media company found that four to eight percent of all content shared with an AI model or prompt has sensitive data in it.

00:20:28.529 --> 00:20:54.289
Thinking about those two numbers, it means there are millions and millions of pieces of sensitive data being shared with AI on a daily basis, and all that data can be exposed to a possible threat from a third-party hacker, someone who comes in to use your data, not just the AI company itself, right?

00:20:54.289 --> 00:20:58.929
Simply because no one is really creating a system around that privacy.

00:20:58.929 --> 00:21:13.169
And that is real, and we have to demand that transparency, demand this kind of architectural design for privacy around AI.

00:21:13.169 --> 00:21:23.409
And that's what I'm working on right now at InthraOS: building an AI system around privacy first, but still pushing the implementation of AI.

00:21:23.409 --> 00:21:29.009
Why do we say we'll still be pushing it? We really have to focus on that.

00:21:29.009 --> 00:21:37.329
And why we believe that will drive more AI implementation is because it connects more to the European market.

00:21:37.329 --> 00:21:44.929
If we can show that companies can be trusted with AI implementation, more users will be open to using those platforms.

00:21:44.929 --> 00:21:55.730
More users, instructors, and people in education can feel more comfortable sharing that platform with their class, with their students.

00:21:55.730 --> 00:21:57.250
And I think that's fair.

00:21:57.250 --> 00:22:03.569
You have to feel comfortable, and you have to be able to share that platform without fear, understanding how they're using the data, right?

00:22:03.569 --> 00:22:07.409
And then actually training a model around your students.

00:22:07.409 --> 00:22:12.289
And I think that's fair; it's really not a myth, right?

00:22:12.289 --> 00:22:14.929
It's really a fair point to make.

00:22:14.929 --> 00:22:21.490
And because they're not really sharing how they're utilizing that data, it's a fair assumption.

00:22:21.490 --> 00:22:35.490
And what we do is create a platform that creates audits for those companies and platforms, showing where the data goes when you ask the model for a response.

00:22:35.490 --> 00:22:46.369
For example, if your email and your phone number are in that data, it's already tokenized and sanitized before it gets to the model, and rehydrated afterward, so the model never sees it.

00:22:46.369 --> 00:23:01.809
So whether you're using our model, which is local, and I can talk about that in a second, or an AI model from OpenAI or Claude, those big AI companies, that data will not get there.

00:23:01.809 --> 00:23:17.009
But you're still going to get the same powerful output, powerful research, whichever responses you want from that model; you're still going to get that back, but without them seeing or utilizing your personal data, right?

00:23:17.009 --> 00:23:18.689
So email and phone number.
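
As a rough illustration of the tokenize-sanitize-rehydrate flow Sebastien describes, here is a minimal sketch. The regex patterns and placeholder scheme are assumptions made for the example, not InthraOS's actual implementation.

```python
import re

# Patterns for the two kinds of sensitive data named in the episode.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def sanitize(prompt: str):
    """Swap sensitive values for opaque placeholders before the prompt
    leaves the device, keeping a local vault for rehydration."""
    vault = {}
    for kind, pattern in PATTERNS.items():
        for i, match in enumerate(pattern.findall(prompt)):
            token = f"<{kind}_{i}>"
            vault[token] = match
            prompt = prompt.replace(match, token)
    return prompt, vault

def rehydrate(response: str, vault: dict) -> str:
    # Restore the original values locally, after the model responds.
    for token, value in vault.items():
        response = response.replace(token, value)
    return response

safe_prompt, vault = sanitize("Email jane@example.com about 555-867-5309.")
print(safe_prompt)  # Email <EMAIL_0> about <PHONE_0>.
```

The model, local or cloud, only ever sees the placeholders; the real values stay on the device.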

00:23:18.689 --> 00:23:25.490
And we have a list of many more kinds of sensitive data that we don't think about on a daily basis, but they still exist.

00:23:25.490 --> 00:23:31.730
We want to make sure we're aware of phone and email, but what about Social Security numbers?

00:23:31.730 --> 00:23:35.889
What about information connected to an image that has your location in it?

00:23:35.889 --> 00:23:47.169
And what about a URL we're sharing through an AI model that has something connecting it to your own personal computer or location?

00:23:47.169 --> 00:23:49.970
These are some things that we don't think about.

00:23:49.970 --> 00:23:50.849
And I do too.

00:23:50.849 --> 00:23:54.689
I share everything with AI sometimes.

00:23:54.689 --> 00:23:58.289
Because that's what exists, right?

00:23:58.289 --> 00:24:01.009
And we can have a powerful response.

00:24:01.009 --> 00:24:02.289
We have to use it.

00:24:02.289 --> 00:24:11.089
But we need to start asking for this kind of architectural design, and they can build it.

00:24:11.089 --> 00:24:12.049
And why?

00:24:12.049 --> 00:24:24.289
It's because we have built platforms with local models. Our model is local, edge AI-based: you have a device, and you're able to insert an AI model into it.

00:24:24.289 --> 00:24:29.009
You've heard of GPT-3.5 or GPT-5, right?

00:24:29.009 --> 00:24:40.449
These are models that OpenAI has built, and you can take different models, but because they're humongous, they're giant, it's almost impossible to have them on a single device in terms of space.

00:24:40.449 --> 00:24:51.730
But you can build different AI models that can live within those devices and be accessible offline, with no access to your data and no access to the cloud.

00:24:51.730 --> 00:24:58.209
So those companies have no access to filter and get that data, because the data lives within the device.

00:24:58.209 --> 00:25:02.209
And also because it's local, there is no internet access.

00:25:02.209 --> 00:25:13.009
So there is no way any prompt injector can find that data, because it's not going through a system of tokens to an AI cloud to access a response.

00:25:13.009 --> 00:25:16.129
So you're fully safe on many levels, which is incredible.

00:25:16.129 --> 00:25:35.250
And that's really the value we want to share: we want to show that we can build better infrastructure for AI implementation, and actually show people that AI can be trusted, giving great, powerful output without it coming at the cost of their privacy.
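
For readers who want to try the local, offline pattern Sebastien describes, here is a minimal sketch using llama-cpp-python as one possible on-device runtime. The model file path is a placeholder for any small quantized model you have downloaded; this illustrates the general edge AI pattern, not InthraOS's stack.

```python
# pip install llama-cpp-python  (one common local-inference runtime)
from llama_cpp import Llama

# Loads a quantized model from local disk; inference never touches
# the network, so prompts and data stay on the device.
llm = Llama(model_path="models/small-model.gguf", n_ctx=2048)

result = llm(
    "Explain what an HTML page is in one sentence.",
    max_tokens=64,
)
print(result["choices"][0]["text"])
```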

00:25:35.809 --> 00:25:36.289
Right.

00:25:36.289 --> 00:25:37.409
That's so important.

00:25:37.409 --> 00:25:38.769
I like that, Sebastien.

00:25:38.769 --> 00:25:41.490
And you know, it kind of makes me think of something too.

00:25:41.490 --> 00:25:44.209
Is this similar to a closed AI model?

00:25:44.209 --> 00:25:52.369
Because I have that where I work; we have proprietary curriculum, so we can't use ChatGPT, we can't use Claude or any of those tools.

00:25:52.369 --> 00:25:59.569
Now, students can use them, but as faculty and staff, we can't use those tools, especially with curriculum.

00:25:59.569 --> 00:26:07.730
So they built a closed AI model, and we utilize that; it's safe for us because we can upload that information.

00:26:07.730 --> 00:26:11.569
So is that similar to a closed model, with the local approach?

00:26:11.730 --> 00:26:27.970
Yeah, it's similar to a closed model, but that closed model is still online, so it's not on a device; there's still a cloud aspect to it, it's just not connected to a specific AI company, correct?

00:26:27.970 --> 00:26:39.569
So the company cannot sell that data to multiple companies, for ads, for example, correct?

00:26:39.809 --> 00:26:40.449
Right, exactly.

00:26:48.529 --> 00:26:53.970
You want to make sure that, even within all that, some information never gets to anyone.

00:26:53.970 --> 00:26:55.649
And really, that's something too.

00:26:56.209 --> 00:27:02.449
Exactly, because we utilize Microsoft 365; we have OneDrive and the cloud.

00:27:02.449 --> 00:27:10.049
So I can see what you're saying, because even with that closed AI model, there can still be vulnerabilities with the data, right?

00:27:10.049 --> 00:27:14.529
It's so important, and thankfully we have really good IT security.

00:27:14.529 --> 00:27:26.209
And they say, whenever you're in doubt, send it to IT security, and don't download any software to your work computers, because of malware and viruses and everything.

00:27:26.209 --> 00:27:28.449
But yeah, that's a good comparison.

00:27:28.449 --> 00:27:29.089
I like that.

00:27:29.089 --> 00:27:30.609
And I like that local option.

00:27:30.609 --> 00:27:31.250
That's really great.

00:27:31.250 --> 00:27:46.529
I'm sure my listeners will appreciate that. It sounds like that's an evolving area within AI: making sure privacy and security come first, and helping to alleviate some of the fear people have with it.

00:27:46.529 --> 00:27:47.490
So I like that.

00:27:47.490 --> 00:27:48.369
That's great.

00:27:48.369 --> 00:27:49.889
Wonderful, Sebastien.

00:27:49.889 --> 00:27:50.529
I love that.

00:27:50.529 --> 00:27:53.329
So I figured we do have time for a bonus question.

00:27:53.329 --> 00:28:00.609
So before we move on to the final question, I wanted to slip one in that I know my listeners will love and resonate with.

00:28:00.609 --> 00:28:06.769
So, as someone who's worked in startups, you've seen how quickly technology evolves, and I've seen it too.

00:28:06.769 --> 00:28:14.289
What advice, maybe one piece of advice, would you give to educators or designers who want to stay adaptable and future ready in their own careers?

00:28:14.609 --> 00:28:26.769
Really, at the moment, AI implementation evolves so fast that there's really no way to predict where it's going.

00:28:26.769 --> 00:28:33.089
I think it's just putting yourself in a place where you're always learning.

00:28:33.089 --> 00:28:42.929
And that is really the only way to keep up with this big noise, this bubble, around AI.

00:28:42.929 --> 00:28:52.209
Just keep learning, and keep pushing yourself to learn toward the thing you want to build.

00:28:52.209 --> 00:29:18.049
And that's really the power we want to showcase: with learning, with a really impactful system and community, and I really believe in community, the mindset stays a lot clearer. Having a community of learners, a community of educators who can teach each other, getting that information and sharing it, I think that's the most valuable thing.

00:29:18.049 --> 00:29:27.329
And from that, I would really recommend anybody to go find communities that are teaching about AI, that are educating about AI, or people who know things.

00:29:27.329 --> 00:29:35.649
Sometimes you look for things in many places when you can just ask your neighbor, and they have the answer for you, right?

00:29:35.649 --> 00:29:46.129
And I think having a community of people who are thinking the same way, who are really trying to keep up with that education and with those models, is very important.

00:29:47.009 --> 00:29:48.049
Absolutely, right.

00:29:48.049 --> 00:29:48.929
I love that.

00:29:48.929 --> 00:29:54.769
That's great advice, because where I work, we all work remotely.

00:29:54.769 --> 00:29:58.049
So you can feel disconnected and siloed when it comes to that.

00:29:58.049 --> 00:30:03.409
But what's great is that we have a really great community, and we have different committees.

00:30:03.409 --> 00:30:06.209
And one of them, you would like the sound of this one.

00:30:06.209 --> 00:30:08.849
It's called the Pioneering AI Committee.

00:30:08.849 --> 00:30:20.689
We talk about those different elements, and it's great because we talk about curriculum, how we can integrate AI into curriculum, and how it can improve our work as curriculum developers and instructional designers.

00:30:20.689 --> 00:30:24.289
So I love that idea of community and collaborating with others.

00:30:24.289 --> 00:30:25.490
That's great, Sebastien.

00:30:25.490 --> 00:30:26.769
That's wonderful advice.

00:30:26.769 --> 00:30:28.769
Absolutely great.

00:30:28.769 --> 00:30:46.929
So, as we wrap up, based on your own journey as a founder and technology leader, what's one piece of encouragement or advice you would give to listeners who are looking to transition into fields like technology or instructional design (we know instructional design has a lot of technology pieces in it), or who are just starting out in one of these fields?

00:30:46.929 --> 00:30:48.849
What kind of advice would you give them?

00:30:50.049 --> 00:30:55.809
Definitely go to the people who are already in those places.

00:30:55.809 --> 00:31:08.689
One of the main things I learned from going to college came from spending four hours having a conversation with a professor who was teaching something I valued so much.

00:31:08.689 --> 00:31:16.209
And that professor, if I can share this, would stay four hours after class with me.

00:31:16.209 --> 00:31:23.169
And it really showed me so much more about educating, and the value of it, from his point of view.

00:31:23.169 --> 00:31:28.929
And he really was like a father to me, one that I had in my lifetime.

00:31:28.929 --> 00:31:31.089
And that person really, really meant so much to me.

00:31:31.089 --> 00:31:40.529
But really, it's about being able to have mentors, and I still see him as a mentor, wherever he is right now.

00:31:40.529 --> 00:31:53.490
And that person shared so much with me from his life. We think things are moving fast in our time, but no, when the internet came in, it was the same thing as AI right now.

00:31:53.490 --> 00:31:59.970
It was really a big bubble, everybody was asking themselves questions, and people were like, oh my god, what are we going to do?

00:31:59.970 --> 00:32:01.490
Exactly.

00:32:01.730 --> 00:32:02.129
Right.

00:32:03.089 --> 00:32:31.649
And that person really taught me about change, because he was older and could give me examples of what life was like around those times, and he shared a lot of great things about life with me. It wasn't just about what I wanted to be; I really knew what I wanted to be after having those conversations, after spending the semester with him. He was a really impactful person in my life.

00:32:31.649 --> 00:32:51.730
From that, I would say: have someone you can look up to, someone who's where you want to be, and ask them questions, not just about what you want to learn, because in order to get somewhere, you have to see and be impacted by many things.

00:32:51.730 --> 00:32:58.769
And for you to become that person, it's not just going to a specific college or getting a specific degree.

00:32:58.769 --> 00:33:03.809
It's more in their personal experience; that's really what shaped them into the person they are.

00:33:03.809 --> 00:33:13.009
Understanding those really detailed experiences of their life is a lot more impactful than understanding what they know in terms of a curriculum.

00:33:13.009 --> 00:33:34.689
And I think that's really what shapes you, what gives you resilience, what gives you some kind of hope for your next 10, 20, 50-plus years: understanding what they lived, how they lived it, and what they learned from it, and taking that to the next level in your own life.

00:33:34.689 --> 00:33:37.730
I think that's really the most powerful thing for me.

00:33:37.730 --> 00:33:40.689
And I think that's really what I would recommend to anybody.

00:33:40.689 --> 00:33:47.970
Get in touch with someone who's doing something that you want to do, and understand how they truly did it.

00:33:48.289 --> 00:33:48.689
Right.

00:33:48.689 --> 00:33:49.569
I love that.

00:33:49.569 --> 00:33:50.209
Great.

00:33:50.209 --> 00:34:10.849
That reminds me of the curriculum I helped build: we ask our students to go find someone in that field, like you mentioned, and talk with them, interview them, so they don't feel like they're in a silo and then, all of a sudden, finish their degree, get into the field, and think, what do I do?

00:34:10.849 --> 00:34:13.490
Okay, I feel like I'm not fully prepared.

00:34:13.490 --> 00:34:21.970
So I love that idea of mentorship, yes, and being able to connect with someone, like you said, who's in that field and someone you truly trust, right?

00:34:21.970 --> 00:34:24.050
And who you know you can get that from.

00:34:24.050 --> 00:34:25.329
So I love that.

00:34:25.329 --> 00:34:27.010
Wonderful, wonderful advice.

00:34:27.010 --> 00:34:31.329
Thank you, Sebastien, for sharing your insights and drawing from your own journey today.

00:34:31.329 --> 00:34:34.289
I think it's a wonderful journey that you've been able to share with us.

00:34:34.289 --> 00:34:45.730
From the perspective you've offered through your experiences as a founder and technology leader, to the practical advice you shared, I know that realm of experience will inspire and encourage my listeners on their own paths as well.

00:34:45.730 --> 00:34:47.650
So I greatly appreciate this today.

00:34:47.650 --> 00:34:51.490
It's been wonderful to connect and find each other on PodMatch.

00:34:51.490 --> 00:34:53.250
I always like to give a shout-out to PodMatch.

00:34:53.250 --> 00:35:06.050
I didn't do that in the beginning, but I want to make sure to give a shout-out to PodMatch, because it's definitely changed this podcast, letting me connect with like-minded technology influencers like yourself.

00:35:06.050 --> 00:35:07.010
So I love it.

00:35:07.010 --> 00:35:08.530
So I appreciate that.

00:35:08.530 --> 00:35:11.090
And I know that I'll have you back on the show.

00:35:11.090 --> 00:35:15.010
I have a good feeling that we're going to stay connected and you'll be back on the show.

00:35:15.010 --> 00:35:27.809
And maybe we can go deeper into something you're passionate about as well: the future of AI and how you see it evolving and changing our lives.

00:35:27.809 --> 00:35:31.970
And we can look at the positive ways it will do that.

00:35:31.970 --> 00:35:33.490
So I'm looking forward to that.

00:35:33.490 --> 00:35:33.890
Yeah.

00:35:35.170 --> 00:35:35.250
Yeah.

00:35:35.410 --> 00:35:36.450
Maybe we can, yeah.

00:35:36.450 --> 00:35:42.930
Maybe we can look at the next decade and put that future lens on, right?

00:35:42.930 --> 00:35:48.289
It kind of reminds me of the movie The Matrix, the whole series of The Matrix.

00:35:48.289 --> 00:35:52.769
And I actually watched some of those recently and I thought, wow, that's very interesting.

00:35:52.769 --> 00:35:59.570
Or just different movies where you see the technology in them, and you're like, that seems so far down the road, but it's here.

00:35:59.570 --> 00:36:00.930
We're in it.

00:36:00.930 --> 00:36:03.010
So yeah, that's wonderful.

00:36:03.010 --> 00:36:03.490
Great.

00:36:03.490 --> 00:36:06.130
Well, I look forward to having you back on the show, Sebastien.

00:36:06.130 --> 00:36:08.370
Thank you so much for all your insights today.

00:36:08.370 --> 00:36:09.010
I appreciate it.

00:36:09.410 --> 00:36:10.450
Truly, truly a pleasure.

00:36:10.450 --> 00:36:11.890
Thank you so much for having me, Jackie.

00:36:11.890 --> 00:36:13.730
Thank you so much, PodMatch, as well.

00:36:14.130 --> 00:36:14.370
Yeah.

00:36:14.610 --> 00:36:19.570
For giving us the opportunity to connect; I would love, love to be back to share more.

00:36:19.570 --> 00:36:25.570
And really, the more I share here, the more I'm like, oh my god, I could share so much more.

00:36:25.570 --> 00:36:27.010
Yes, exactly.

00:36:27.170 --> 00:36:30.450
It opens up that whole realm of possibilities, exactly.

00:36:30.450 --> 00:36:31.570
So great.

00:36:31.570 --> 00:36:33.809
So we'll make sure to stay connected.

00:36:33.809 --> 00:36:37.250
And for my listeners, where can they find you?

00:36:37.250 --> 00:36:42.610
Is it best to find you on LinkedIn, or what's the best place to find you?

00:36:43.010 --> 00:36:48.930
I would definitely recommend checking out my platform, my business platform.

00:36:48.930 --> 00:36:55.170
It's inthraos.com, I-N-T-H-R-A-O-S dot com.

00:36:55.170 --> 00:36:58.530
And there you will learn a lot about privacy.

00:36:58.530 --> 00:37:08.450
We have a lot of articles on things that are happening, and we'll share more speakers, more often, around privacy.

00:37:08.450 --> 00:37:20.450
We really want people to connect with privacy and AI; that will create demand in this whole world of AI implementation that is so powerful for us.

00:37:20.450 --> 00:37:24.450
But we want to demand privacy as well, and be part of this movement.

00:37:24.769 --> 00:37:25.730
Great, wonderful.

00:37:25.730 --> 00:37:34.210
I'll make sure to link your business website in the notes, so they have access to that and to the call to action as well.

00:37:34.210 --> 00:37:40.210
They'll be able to learn more and connect with you in those different ways and on those different levels.

00:37:40.210 --> 00:37:41.170
So I appreciate it.

00:37:41.170 --> 00:37:41.570
Great.

00:37:41.570 --> 00:37:42.930
Thanks so much, Jackie.

00:37:42.930 --> 00:37:43.890
Wonderful.

00:37:43.890 --> 00:37:46.530
Well, I'll have you back on again soon.

00:37:46.530 --> 00:37:51.970
And I know my listeners will appreciate all the insights that you share in the future as well.

00:37:51.970 --> 00:37:52.769
Great.

00:37:52.769 --> 00:37:53.570
Thank you.