In this episode of the Campus Technology Insider Podcast, Rhea Kelly, editor-in-chief of Campus Technology, discusses the crucial topic of AI readiness in higher education with Kathe Pelletier, senior director of community programs at Educause. They delve into Educause's updated AI Readiness Assessment tool, its components, and how it aids institutions in strategizing and planning for AI integration on campuses. Learn about the importance of conversation, context-specific strategies, and action planning to navigate AI implementation effectively. This episode also highlights key insights from the second annual Educause AI Landscape Study and its role in shaping the assessment tool. 00:00 Introduction and Guest Welcome 00:29 Overview of AI Readiness Assessment 01:10 Key Components of the Assessment 02:34 Action Planning and Resources 03:41 Insights from the AI Landscape Study 06:28 Institutional AI Strategy and Readiness 09:35 Stakeholder Involvement in AI Readiness 12:48 Assessment Sections Breakdown 18:32 Post-Assessment Steps and Action Plans 24:15 Conclusion and Podcast Sign-Off Resource links: Educause Higher Education Generative AI Readiness Assessment 2025 Educause AI Landscape Study Music: Mixkit Duration: 25 minutes Transcript (Coming Soon)
It's YOUR time to #EdUp. In this episode, recorded LIVE from Ellucian LIVE 2025 in Orlando, Florida:
YOUR guests are John O'Brien, President, & Jenay Robert, Senior Researcher, EDUCAUSE
YOUR cohost is Dr. Chris Moloney, Principal Strategic Specialist, Ellucian
YOUR host is Dr. Joe Sallustio
How is EDUCAUSE serving as a trusted advisor for higher education technology? What does the 2025 AI landscape study reveal about institutional preparedness? Why is digital transformation becoming increasingly critical in higher education? How can institutions assess their AI readiness & implementation? What makes the EDUCAUSE conference unique among higher education events?
Topics include: Research insights on AI adoption in higher education; Cybersecurity & privacy concerns in the AI era; Creating effective technology transformation roadmaps; The importance of stakeholder inclusion in technology decisions; Balancing enthusiasm & caution in emerging technology implementation
Listen in to #EdUp. Do YOU want to accelerate YOUR professional development? Do YOU want to get exclusive early access to ad-free episodes, extended episodes, bonus episodes, original content, invites to special events, & more? Then BECOME AN #EdUp PREMIUM SUBSCRIBER TODAY - $19.99/month or $199.99/year (Save 17%)! Want YOUR org to cover costs? Email: EdUp@edupexperience.com
Thank YOU so much for tuning in. Join us on the next episode for YOUR time to EdUp! Connect with YOUR EdUp Team - Elvin Freytes & Dr. Joe Sallustio ● Join YOUR EdUp community at The EdUp Experience! We make education YOUR business!
In this episode of the Campus Technology Insider podcast, host Rhea Kelly, editor in chief of Campus Technology, discusses the Educause 2025 AI Landscape Study with Senior Researcher Jenay Robert. They delve into the study's history, methodology, and key findings, including shifts in attitudes towards AI, policy impacts, and the emerging digital divide between larger and smaller institutions. Robert also highlights the importance of effective communication, AI literacy, and community engagement in higher education. Listen as they explore how institutions are adapting to AI's growing presence and the ethical responsibilities involved. 00:00 Introduction and Guest Welcome 00:29 Overview of the Educause AI Landscape Study 02:07 Key Findings and Comparisons 04:07 New Questions and Methodology 08:41 AI Costs and Budgeting 13:46 Perceptions of AI Risks and Benefits 23:00 Final Thoughts and Recommendations 28:48 Conclusion and Farewell Resource links: Educause 2025 AI Landscape Study Educause 2024 AI Landscape Study Educause Shop Talk podcast Educause Library on AI Music: Mixkit Duration: 29 minutes Transcript: Coming Soon
In this episode of Campus Technology Insider Podcast Shorts, Rhea Kelly covers the latest advancements in education technology and AI. Highlights include OpenAI's new research feature for ChatGPT, which offers autonomous web browsing and report generation for professionals, and the release of HECVAT 4 by Educause, an upgraded toolkit for assessing vendor privacy practices and AI-specific criteria. Registration is also open for the Tech Tactics in Education Virtual Conference on May 7th, focusing on AI integration, cybersecurity, and data-driven student success strategies. 00:00 Introduction to Campus Technology Insider Podcast Shorts 00:16 OpenAI's New Research Feature 00:51 Education IT News: HECVAT 4 Release 01:27 Upcoming Tech Tactics in Education Virtual Conference 02:02 Conclusion and Further Resources Source links: New OpenAI 'Deep Research' Agent Turns ChatGPT into a Research Analyst Educause HECVAT Vendor Assessment Tool Gets an Upgrade Registration Now Open for Tech Tactics in Education: Thriving in the Age of AI Campus Technology Insider Podcast Shorts are curated by humans and narrated by AI.
AI has more applications in higher education than you think! This week, AI expert Dr. Michelle Kassorla joins the podcast to share the reason why she integrated AI into all of her courses this academic year. From enhancing classroom instruction to challenging traditional pedagogy, her instructional approach is shaping the future of AI literacy. Want to know how AI is transforming the way students learn and how teachers teach? Tune in as Dr. Kassorla breaks down her hands-on experiences with bringing AI into higher ed. Whether you're an educator, a student, or just curious about AI's role in the classroom, this conversation is packed with insights you won't want to miss! --- ABOUT OUR GUEST Michelle Kassorla, Ph.D., is an Associate Professor of English at Georgia State University, Perimeter College. Dr. Kassorla has served on the “AI Expert Panel” for EDUCAUSE and currently leads the “AI Literacy” committee, creating AI Literacy Standards for Higher Education. She is also the recipient of a Center for Excellence in Teaching, Learning, and Online Education (CETLOE) fellowship in the Scholarship of Teaching and Learning at Georgia State University. --- SUBSCRIBE TO THE SERIES: YouTube | Spotify | Apple Podcasts | YouTube Music | Overcast FOLLOW US: Website | Facebook | Twitter | LinkedIn POWERED BY CLASSLINK: ClassLink provides one-click single sign-on into web and Windows applications, and instant access to files at school and in the cloud. Accessible from any computer, tablet, or smartphone, ClassLink is ideal for 1to1 and Bring Your Own Device (BYOD) initiatives. Learn more at classlink.com.
In this episode, Jeff Utecht interviews Michelle Kassorla, an associate professor of English at Georgia State University, about using generative AI in higher education. They discuss the importance of teaching AI literacy, the role of AI in the writing process, and the benefits of using AI to support student learning. Michelle shares her insights on assessment, transparency, and the creative opportunities that AI provides for students. She also discusses the development of AI literacy standards for higher education and offers practical tips for integrating AI into the classroom. Michelle Kassorla, Ph.D., an Associate Professor at Georgia State University, Perimeter College, has more than 30 years of experience teaching English and Composition. She teaches With and Against AI in Composition I and II courses, integrating AI into all assignments this academic year. She's on the AI Expert Panel for EDUCAUSE, leading the AI Literacy committee for Higher Education. A recipient of a Center for Excellence in Teaching, Learning, and Online Education fellowship, she published "Teaching with GAI in Mind" in the EDUCAUSE Review and co-authors a textbook and papers with Eugenia Novokshanova. She's a Co-Principal Investigator in the "Perceptions of AI" study at GSU. She shares her insights on LinkedIn and her blog, "The Academic Platypus." As a mother of eight boys, she's continually inspired to excel. Connect with our guest: LinkedIn: https://www.linkedin.com/in/mkassorla/ Takeaways Teaching AI literacy is crucial in higher education to help students understand the limitations and biases of AI models. AI can be used to support the writing process by providing feedback on grammar and punctuation, generating topic ideas, and assisting with research. Assessment should focus on voice, tone, audience, transparency, and accuracy rather than traditional grammar and punctuation. AI can free up time for teachers to focus on higher-order thinking skills and creative aspects of writing. Developing AI literacy standards for higher education can help guide educators in teaching AI effectively. Thank you to our amazing show sponsor! Lebra https://www.lebrahq.com/
Recently Educause released its inaugural AI Landscape Study, which polled the higher education community about AI strategic planning and readiness, policies and procedures, impact on the workforce, and the future of AI in higher education. We spoke with report author and Educause Senior Researcher Jenay Robert for a deep dive into some of the thinking behind the study, what the survey findings tell us about institutions' AI journeys, and how "I don't know" might be the theme of the day when it comes to AI. Resource links: Educause AI Landscape Study Educause Innovation Summit National Artificial Intelligence Act of 2020 Educause Workforce Studies Educause Horizon Reports and Horizon Action Plans Music: Mixkit Duration: 29 minutes Transcript
This episode discusses privacy and other ethical considerations in extended reality settings. Our guest today is Mihaela Popescu, who is a Professor of Digital Media in the Department of Communication Studies at California State University, San Bernardino (CSUSB) and the Faculty Director of CSUSB's Extended Reality for Learning Lab (xREAL). She holds a Ph.D. in Communication from the University of Pennsylvania. Additional resources: Educause: https://www.educause.edu/ Electronic Frontier Foundation: https://www.eff.org/ CITI Program's Technology, Ethics, and Regulations course: https://about.citiprogram.org/course/technology-ethics-and-regulations/
Take a deep dive into how generative AI is changing how both higher ed and K-12 administrators are researching and analyzing student data and analytics. Melissa and Ryan explore what the University of California San Diego is doing and how their work is now being shared with other institutions across the country. In this episode, you'll hear from Vince Kellen, CIO of UC San Diego, and Greg Stinsa, Collaborations Manager at UC San Diego. To learn more and to connect with Vince and Greg, check out these resources: • "AI: Friend or Foe" Educause article • UC San Diego's StudentActivityHub.com --- Send in a voice message: https://podcasters.spotify.com/pod/show/instructurecast/message
#ICDEWC23 Exploring Future Trends in Higher Ed with John O'Brien, President & CEO, EDUCAUSE
In this episode of "Student Affairs Voices From the Field," Dr. Jill Creighton welcomes Dr. Eric Stoller, the VP of Marketing and Digital Content at territorium, for a conversation about the evolving landscape of higher education and the role of technology in student affairs. They discuss various trends in higher education technology and how it impacts both academic and student affairs divisions. Dr. Stoller traces his journey from his early experiences as a marketing specialist at the University of Illinois, Chicago, to becoming a respected thought leader in the higher education technology space. He emphasizes how technology has become an integral part of the entire higher education experience, noting the importance of CRM tools, mobile apps, and the shift toward hybrid and remote learning during the pandemic. The conversation delves into the changing value of higher education credentials and the importance of measuring and verifying outcomes related to critical thinking, skills development, and employability. Dr. Stoller discusses the growing focus on micro-credentials, badges, and the idea of a learner's "digital wallet" to showcase skills and experiences (a simple illustrative sketch of such a record appears after the transcript below). They also touch upon the need for interoperability in higher education technology and how data and analytics will play a more significant role in student affairs, helping institutions understand student needs and provide better support. The episode concludes with a discussion of the evolving role of student affairs in helping students navigate diverse pathways to success, emphasizing the need for personalized support and pathways for learners, regardless of whether they complete a degree. This episode sheds light on the transformative impact of technology on higher education and how student affairs professionals can adapt to these changes to better serve students in an evolving landscape. Please subscribe to SA Voices from the Field on your favorite podcasting device and share the podcast with other student affairs colleagues! Transcript Dr. Jill Creighton [00:00:02]: Welcome to Student Affairs Voices From the Field, the podcast where we share your student affairs stories from fresh perspectives to seasoned experts. This is season nine on transitions in Student Affairs. This podcast is brought to you by NASPA. And I'm Dr. Jill Creighton, she/her/hers, your SA Voices from the Field host. Today on SA Voices, I'm pleased to bring you a conversation with Eric Stoller. Eric is the VP of Digital at territorium with over 20 years of experience in higher education and education technology. As a strategist, writer, and thought leader, he founded and led a global higher education consultancy from 2010 to 2019 and created the Student Affairs and Technology blog for Inside Higher Ed. Dr. Jill Creighton [00:00:46]: Previous ed tech roles include leadership positions at ListEdTech, Element 451, and Gecko Engage. Earlier in his career, he was an academic advisor at Oregon State University and a marketing specialist at the University of Illinois, Chicago. Eric earned an associate's degree, a BA in communications, and an Ed.M. in College Student Services Administration. Eric, welcome to SA Voices. Dr. Eric Stoller [00:01:07]: Thanks so much for having me, Jill. Great to be here. Dr. Jill Creighton [00:01:09]: It's really great to see you. For our listeners, Eric and I met, I'm going to say 2005 maybe. Dr. Eric Stoller [00:01:16]: I think dinosaurs were just still roaming the earth. Yeah, it would have been 2004.
2005. Oregon State University in Corvallis, Oregon. Dr. Jill Creighton [00:01:25]: I think we were just escaping the trends of dial up internet and smartphones weren't smart yet in that time. So Eric and I actually worked together in the Office of Student Conduct when we were graduate students. So it's really lovely to see old friends and see careers blossom. And I'm really looking forward to talking about your transition today because I think you have a really unique one for someone who received their master's in Higher Ed. So would love to start with if you could tell us about your current position. And we always like to begin with a good come up story. How did you get to your current seat? Dr. Eric Stoller [00:01:57]: A good come up story, I love that. Well, so my current role is Vice President of Marketing and Digital Content at territorium, which is a global ed tech company that is all about bridging education to employability. And we'll probably get into that later on in the show. And it gets highly technical and I can't wait to dive into that. In terms of how I got into this seat, it is a long, winding story that started on a gravel road in Iowa, and I'm not going to bore your listeners with the full, you know, I went to community college, went to university. I thought I was kind of done with higher education. And then I actually started working at the University of Illinois at Chicago way back in the day in marketing and just loved the work. I was located within Student Services, and that's when I sort of first learned about what student affairs was even all about. Dr. Eric Stoller [00:02:42]: And was, as I am today, still very much into technology back then. And even I remember calling up Kevin Krueger, who's now the executive director for NASPA or the president of NASPA. I'm not sure the exact titles nowadays, but Kevin and I had a conversation when I was very new to the field, and I said to him, why is the Information Technology Knowledge Community, as it was known then, why is it gone? Because they had just gotten rid of it. And his first thought or question was, you know, who are you? And I said, yeah, I'm just a new professional, kind of bothering this leader of this association, or at the time, I think he was the associate director. Anyway, I went out to Oregon State, as you referenced, and I got my master's degree in higher education. Worked in a variety of different areas from enrollment management, financial aid, registrars, kind of a stint at student conduct, was an academic advisor. And then during that time when I was an academic advisor, I started writing for Inside Higher Ed. I started the Student Affairs and Technology blog and just loved that experience as a writer for Inside Higher Ed. Dr. Eric Stoller [00:03:41]: And it was also at that time when I started getting invitations to go out and do some freelance work and consult for institutions and speak at events. So I stopped working full time for Oregon State and I became a consultant for nine years in the US, the UK and beyond, various global events and working with institutions all over the place. And the focus was all around digital engagement. This was when sort of social media was kind of coming into its own still and really focusing on how student affairs divisions could just transform what they were doing with all things digital. Because the origin story of student affairs is one that it was all about face to face, one on one experiences with students.
And technology was seen, because my grad program, it was what, 2004, when I started, and technology was seen as this kind of gets in the way of that student experience. You fast forward to today, almost 20 years later, and the idea that technology would be separate from the student experience is something that people would never think about. It's really connected deeply. Dr. Eric Stoller [00:04:40]: And so I had this nine year experience as a freelancer, and then I started working for a higher ed chatbot company that was based in the UK and Scotland and did that for a little while, went back to freelancing, and then I worked for a higher ed CRM company. You're getting kind of a theme here in terms of my Ed tech experience, right? Sort of chat bot to CRM. And then we moved to the Netherlands in 2022 from the US. And so I was doing freelancing again, and a connection of my wife, professional connection, started talking to me about this potential marketing role at territorium, and they were launching their kind of US presence. territorium as a company has its origins in Monterrey, Mexico, and we're all over Latin America in terms of providing testing and a learning experience platform as well as our comprehensive learner record. But we hadn't really had as much of a presence in the US. And so we launched this US team back in December of last year that's for listeners on the call. I can't even do the math now. Dr. Eric Stoller [00:05:42]: Right? 2022. And so been with territorium since then and leading on all things marketing and digital combination of leading, strategy, producing, execution, go to market, a lot of things that are not part of our Master's degree program that Jill and I went through, but connected to both my undergraduate experience as a PR and marketing major. And then of course, my deep connections and network into higher education have kind of got me to this place. Dr. Jill Creighton [00:06:10]: So I'm going to just do a quick backup to a terms definition. You mentioned CRM, which might not be a term that's familiar to those in Higher Ed. Can you define that for us? Dr. Eric Stoller [00:06:18]: Of course. So this is where things really get interesting because as you know, every institution in the US kind of does things differently. If they're a college, they're a university, they're a community college. The structures, the systems, some institutions have divisions of student affairs, some have smaller sort of scale depending on their organization. But the one thing they all have in common is they all recruit students, they have admissions and they have recruitment. And whether they use a higher ed specific CRM, which is back in the day, it would have been a Customer Relationship Management tool, which is effectively how you keep track of who you're trying to recruit and communicate with them and engage them on a level from maybe they're a junior in high school or if they're an adult learner, how you're connecting with those folks through a variety of communication vehicles like email, SMS or maybe a chat bot. How it's all interconnected. Dr. Eric Stoller [00:07:10]: So there's the gargantuan CRMs out there like Salesforce or I happen to be working for Element 451, which is a much smaller shop, but they have quite a few clients as well. That's the CRM. I think the interesting thing about being in Higher Ed is I always say that you live in an acronym soup because you've got all the associations for higher education, all the different tools and platforms.
You've got the SIS, the Student Information System. I mentioned the Comprehensive Learner Record, which is shortened down to CLR, which is a record of skills and experiences and credentials for learners. That goes far beyond the transcript because it goes inside the classroom and outside the classroom. So that's the CLR. And so, yeah, if we need to, we can have a glossary of terms attached to this podcast. Dr. Eric Stoller [00:07:53]: In terms of all the acronyms that I might mention, I think for Higher Ed pros, most of these things you're already familiar with, you just didn't know. That's what it was called in corporate land, but things you're quite familiar with. I think the one that we've been using lately is Slate in terms of our CRM for prospective students. It's quite a popular rising one right now. So you do know these things. You just maybe got a new term to associate with it. Dr. Eric Stoller [00:08:15]: I think if you work in enrollment management, if you're in the admissions side, you're in these tools on a daily basis. I think it's one of those things if you're in student conduct or academic advising or every sort of functional area has its set of digital tools that it uses on a day to day basis. But when I was at Oregon State as an academic advisor, I was in Banner every single day. And so that was the tool of choice. That's from Ellucian. In terms of providers, I'll try not to do too much name dropping, but I think that in terms of the Edtech universe, there's so many different providers because so many different functional areas require just different tools to help with the work that they're doing. Dr. Jill Creighton [00:08:53]: One of the reasons I was really looking forward to our conversation is because you can talk about transitions in the digital space. A lot of the conversations we've had this season are personal transitions in career, which you've certainly had. But I think one of the things you've always had your finger on the pulse of in higher Ed is how digital kind of arenas, the digital vertical for higher education has really changed and reshaped the way that we do the work in our campus based positions. So I'm wondering if you can talk to us a little bit about that process and what you've seen in terms of trends, and the ed tech field is really new, 20, 30 years in terms of its boom. So any trends that you're seeing in terms of how educators are using these tools really well, yeah. Dr. Eric Stoller [00:09:34]: I mean, I think it's always good, like you said, to kind of look back where things were. When I was writing for Inside Higher Ed, I remember going to EDUCAUSE a couple of different times. The annual Educause Conference, which is kind of a giant ed tech convention. And most of the providers back then, those events, they were very much focused on the academic experience side of things. There weren't a lot of providers that were doing things that would even slightly sort of go into the student affairs areas. And now you fast forward to today and Edtech providers are in kind of every single space within institutions. As we've already referenced, the CRM tools have become extremely important because with the approaching enrollment cliff for that traditionally aged population, which is kind of a loaded phrase anyway in terms of what is traditional, but that sort of 18 to 22 year old, that population of university, that's a decline. There's just not as many young people that will be going into higher education. Dr.
Eric Stoller [00:10:33]: And so the CRM becomes a tool that is even more important as you communicate, as you hone your message, as you try to showcase the value of your institution, of the degrees that students will receive and earn and other systems as well. I mean, it used to be the digital experience was much more based on the staff or administrators who were at their desk with a big screen and students would come to their office and they would sort of navigate a system on behalf of the student. And then mobile apps kind of really entered in a meaningful way. And no longer are students sort of tethered to an individual and their desk and their office, but they can look things up on their phone and they can access a variety of services. They can ask questions to 24/7 chat bots. They can look at their course schedule. They can look at various activities and events on campus. Now, of course, when you said this, you referenced the question. Dr. Eric Stoller [00:11:29]: You kind of framed it as on campus. I think what the pandemic did was it showcased the need to serve and support students who weren't necessarily going to be on campus, or at least accelerated. Maybe more of a hybridized environment where students were on campus for a portion of the time, but they were also on their computers at home because it used to be that all your lectures were in a big auditorium. And then the idea of the sort of the flipped classroom came into play. Professors were recording their lectures and students could listen to a lecture at home and so that the discussion would actually happen when you went to the classroom. And then with the pandemic, it sort of said, okay, everything's going to be remote for certain people. And it was interesting because you start thinking about how did student affairs serve learners, who historically student affairs would have been saying, okay, in res life, there's no such thing as remote. Students are actually physically located on campus. Dr. Eric Stoller [00:12:27]: But then say, what about the other side of our institution that was serving adult learners or online only learners or people that were coming in for micro credentials, they were never going to set foot on campus. They maybe came once a year, if that. And so technology has really embedded itself throughout the entire higher education experience because the higher education experience has changed. It's such a blended, multimodal thing where students are learning through their phones, they are communicating like we are right now through Zoom or other media like this because you don't have to be bound to a certain geography. You could be in Iowa and studying at an institution in Oregon, or you could be in Berlin and studying at an institution in South Carolina. So the variety pack now and I think that's where I think back to our higher education master's program. And the fundamentals that we were taught were still very much constrained to a sort of model that was still constrained in some ways. It's like, you know, we were on a basketball court, for example, and we knew where the boundary lines were for everything, and we knew, like, okay, here's the two baskets, and we know how things work. Dr. Eric Stoller [00:13:39]: But then all of a sudden, you know, I live in Europe now, imagine if that basketball court was transported to a football pitch, which is enormous in size and different boundaries and different scope and scale. And I think that's where higher education finds itself.
It's having to, as a sort of nebulous thing, now recruit students that in the past might not have been recruited because, like I said, that enrollment decline for a certain demographic, and so all these technologies are really coming into their own. For instance, at territorium, one of the things that we've been really talking about a lot is this idea that why do people go to college? Why do people pay the bill? Why do the people get into debt? Most of us were not financially wealthy enough to just pay for school right away. You have to get a loan. You pay your student loan off over the course of a lifetime or however long it takes. And what's the value of higher education? Right? Yes. It's the experience. Dr. Eric Stoller [00:14:30]: It's about giving back to your community. It's about access. But by and large, most people go to university because they want to improve their overall employability or their chances for a career that will perhaps lead to financial stability because that's why they're doing it. And higher ed, I think, for the longest time, hasn't really talked about that. We shy away from that. We shy away from the fact that people are going to get their BA in English, for instance, and they're going to get into $50,000 worth of debt. But they're doing it because they love writing, they love the work, they love the art. But at the same time, is there a connection to employment at the end of that journey, or are universities just leaving students in debt? And so I think that's where you may have heard people talk a lot about the skills based economy. Dr. Eric Stoller [00:15:16]: And I went to community college for my first two years. I got my Associate of Arts. My brother, he went to the same community college. He got a two year technical degree. That's what he has, a technical degree. And he has done really well for himself career wise. And I think one of the things, when people hear the word skills based economy, they think, well, that's more technical or community college workforce based. But universities are really getting into that space now when it comes to micro credentials and badging and trying to sort out the sense of, okay, it's not just about a pretty campus. Dr. Eric Stoller [00:15:48]: It's not just about a winning football team. It is about what's the direct correlation to you get this degree or you get this credential and it's going to have a direct impact on your success? Because right now I think there's something like 39 million Americans have some college but no degree, and yet that accompanies that with a ton of debt, right? So there's a lot of issues there. And so how do you take folks who have maybe some college but no degree and let them showcase the sort of skills that they have, even though they don't have the diploma, because they might have a transcript that shows that they've taken five classes, but at the same time, how do they show that to employers? Because employers look, traditionally, employers wanted to see the diploma or that you've earned your four-year degree or you've earned your Master's or whatnot. And so I think that part of the things that higher ed has had in the past is, okay, we've kind of built this foundation of these are our core technologies. But I think there's this transition to, okay, what are some of those core technologies that might need to change, might need to evolve?
Because if you're a registrar, for example, you need something more than just a transcript because you're no longer just awarding A, B, C, D, F, you are awarding micro credentials. You're giving badges away to students. Faculty members are sort of looking at, okay, my students are learning these skills during the course of this particular class, and now we're going to award them badges that never would have happened five, 10 years ago. And now you've got employers saying, hey, we are going to hire students based on these skills that they have that are verified by the institution. Dr. Jill Creighton [00:17:15]: Those are really important points because what we're seeing is a transformation of the value of higher education that's not just U.S.-based, that's globally. Because when we look at what a degree means, I believe it means something extremely different to those of us working in the academy, to those folks that are outside of the academy looking to employ people who need individuals who can demonstrate critical thinking, problem solving skills, technical knowledge, all of those things. And that's part of what the degree is designed to do. But I would believe that, especially at a liberal arts institution like mine, we're teaching ways to think, not just facts and figures and things like that. And you need both. So the question is, how are we transitioning not only our offerings at the university as a whole from a credentialing perspective, but how are we also doing that in student affairs? And how can technology support those transitions for what the work needs to look like? So I'm wondering if you have any thoughts on that. Dr. Eric Stoller [00:18:12]: Yeah, well, I think part of it has to do with the fact that because you mentioned critical thinking, critical thinking is a huge part of the experience of higher education and a lot of student affairs programs the underpinnings of those programs definitely includes critical thinking, equity conversations, cultural diversity conversations. And I think that all those aspects, they just weren't measured in the past. Right, so what did you actually learn throughout your experience that wasn't in the classroom? NASPA, for as long as I can remember, has always talked about Learning Reconsidered. Right. That learning happens throughout the experience of a student, regardless of where they are on campus, off campus, in a class, outside of the class. And so I think that is part of the work that student affairs is going to have to do going forward, because there's a lot of scrutiny right now, obviously, on institutional budgets and outcomes. And the two big R's, of course, are recruitment and retention. And student affairs plays a big part in both of those areas. Dr. Eric Stoller [00:19:08]: And so I think that the student affairs side of things in terms of transforming kind of what was done to what is being done and what will continue to be done, is going to be verifying and measuring those outcomes so that there's a tangible way to sort of I mentioned badges earlier. How many student affairs divisions are awarding badges to students? You think a lot of times about badges is maybe coming from the academic affairs side of the house. I think that look at Career Services shops, look at the evolution of Career Services because like career centers, they have probably one of the most important roles at institutions.
And yet for the longest time, not so much now, but for a long time it was, okay, I'm a junior or a senior, I'll go and talk to career services kind of at the end of my institutional experience before I graduate. And now you see Career Services, they're front loading their engagement with students. So they're at orientation, they're there at first year experience courses, and they're also working alongside employers to connect students to this idea that this is just a step in your journey and we're going to try to help you along. And so I think we're going to see a lot more student affairs divisions awarding badges and getting into the LMS, getting into the badge systems, either coming directly out of a CLR or it comes from another provider. I think that's the other thing with this is Ed Tech providers have been very insular in the past. Dr. Eric Stoller [00:20:28]: Like, we've got a platform and it only works with our platform. And so student data is kind of stuck in this database that's very proprietary and an organization called 1EdTech, unless you're really deeply involved in sort of the Ed Tech space, you might not be aware of them. But one of the big facets of their work is interoperability, sort of this idea that all these digital assets that students have are like Lego and that you can kind of plug and play them independently of a certain system. So, for example, if you have a digital wallet that has all of your badges and has your skills, your credentials, all that stuff in there, you can take it to another institution. Kind of how students transfer from community college maybe to a university, but usually that's with a traditional transcript. But the overall vision will be learners will have this wallet of all of their verified skills and experiences and credentials that they carry with them in an interoperable plug and play type way. And so the sort of sovereignty of learners becomes a much bigger part of the conversation because there's a lot of data that has been part of this as well. And in student affairs, we don't really talk about data. Dr. Eric Stoller [00:21:36]: We don't talk about sort of the technical piece because we've been so much about the soft skills, the one to one. If you want to be a dean of students, you're not necessarily getting into a huge portion of the data unless maybe it's connected to retention or some other issue on campus directly. But the Ed tech space, there's so much data that is coming out of that. And so the thing I think will be interesting to see with student affairs throughout every functional area will be the various dashboards and analytics and outcomes coalescing into a space where you can sort of see, okay, where are students at? What do they need? What kind of support do they need? How is that going to influence things that we're doing programmatically as well as for the next as a student goes to another institution for the kind of a handoff, so to speak, because it won't just be your data is stuck at some institution. It's going with you. It's actually traveling along with you, and it might be enabled in some sort of bitcoin wallet that's kind of independent from an institution that's kind of a buzzword. But at the same time, that's kind of the ultimate goal, I think, for a lot of companies that are thinking more about the openness of all this. I mean, when you think about the space that I'm currently in and how we interface into higher Ed, it's not just, you know, NASPA and AACRAO are playing a big part in this. Dr.
Eric Stoller [00:22:53]: AACRAO is the Admissions and Registrars Association. They're kind of the home of registrars professionally and technically. Usually that's where the transcript resides. The Lumina Foundation, the big organizations focused on learning and outcomes over the years to even Walmart, because Walmart, I think they're the largest employer in the US. And one of the largest globally. They employ a huge number of people. And so they're thinking about the pathways from higher ed into different careers. I think the pathways piece is one I also want to introduce to this conversation, because it's important to give learners pathways even if they don't graduate, so that people aren't just left with debt and a handful of credits. Dr. Eric Stoller [00:23:31]: What is it they're actually going to be able to get, even if they don't finish. Because as you know, Jill, sometimes success for one person is just a couple semesters of college and that is like a hugely successful outcome for them. Whereas for a lot of other people, maybe it's graduation, maybe it's a master's degree, maybe it's a certificate. Success is very much an individualized thing. Dr. Jill Creighton [00:23:51]: Still, it's time to take a quick break and toss it over to producer Chris to learn what's going on in the NASPA world. Dr. Christopher Lewis [00:23:57]: Jill, so excited to be back again in the NASPA world. A ton of things happening in NASPA. So many of us have been hearing a ton about artificial intelligence. We are starting to explore it or delve deeper into it on our own college campuses. And in the most recent Leadership Exchange magazine, which you all have access to as a member of NASPA, the editors and authors of that magazine did delve deeply into artificial intelligence in the Metaverse and really asked a broader question of whether our profession, whether student affairs is ready for this. It was a fascinating article and definitely a fascinating magazine. To delve much deeper into this topic, I highly encourage you to go to the NASPA website and you can go under publications to the Leadership Exchange magazine and log in and be able to read that for yourself. Dr. Christopher Lewis [00:25:03]: If you want to check out all the different professional development opportunities, and I know I share a lot of them with you on a regular basis, but if you go under the Events and Online Learning tab, you're going to find everything that is happening within NASPA and around NASPA, all the different professional development opportunities that are available. And this is a great way for you to be able to find things that connect with your professional growth and professional learning that you want. And it will open up opportunities for you to be able to see different ways in which you can grow and learn in your own professional journey. So lots of things happening in NASPA, lots of ways to stay connected with NASPA. Start at the NASPA website, naspa.org, and go and check it out for yourself. Every week we're going to be sharing some amazing things that are happening within the association. So we are going to be able to try and keep you up to date on everything that's happening and allow for you to be able to get involved in different ways. Because the association is as strong as its members and for all of us, we have to find our place within the association, whether it be getting involved with a knowledge community, giving back within one of the centers or the divisions of the association. Dr.
Christopher Lewis [00:26:24]: And as you're doing that, it's important to be able to identify for yourself where do you fit, where do you want to give back. Each week, we're hoping that we will share some things that might encourage you, might allow for you to be able to get some ideas. That will provide you with an opportunity to be able to say, hey, I see myself in that knowledge community. I see myself doing something like that, or encourage you in other ways that allow for you to be able to think beyond what's available right now, to offer other things to the association, to bring your gifts, your talents to the association and to all of the members within the association. Because through doing that, all of us are stronger and the association is better. Tune in again next week as we find out more about what is happening in NASPA. Dr. Jill Creighton [00:27:19]: A wonderful NASPA World segment as always, Chris, we really appreciate you keeping us updated on what's going on in and around NASPA. Eric, we are now at our lightning round. I have 90 seconds for you to answer seven questions. You ready to roll? Dr. Eric Stoller [00:27:31]: That's like one of those if a train leaves Chicago heading 5 miles an hour kind of questions. I'm ready to go, Jill. Let's go. Dr. Jill Creighton [00:27:36]: All right, question number one. If you were a conference keynote speaker, what would your entrance music be? Dr. Eric Stoller [00:27:42]: Well, I've been a conference keynote speaker for many different events, so I always like to go with The Glitch Mob. They were always pretty good. Dr. Jill Creighton [00:27:49]: Number two, when you were five years old, what did you want to be when you grew up? Dr. Eric Stoller [00:27:52]: When I was five years old, I was a little kid in Iowa on a gravel road. I think I wanted to be probably an NBA player because then I would have pavement. Dr. Jill Creighton [00:28:01]: Number three, your most influential professional mentor. Dr. Eric Stoller [00:28:04]: Gosh, there have been so many. I'd say one of the most influential professional mentors I've ever had. Just one. So Kevin Krueger, when we were doing our pre show talk, he's been an instrumental part of my career over the years, and I always appreciated his leadership at NASPA. Dr. Jill Creighton [00:28:20]: Number four, your essential higher education read. Dr. Eric Stoller [00:28:23]: I would be remiss if I did not say InsideHigherEd.com. I know that Scott Jaschik is retiring as co-editor of Inside Higher Ed. It's still, in my view, one of the best sites out there for comprehensive coverage of what's going on in higher ed. Dr. Jill Creighton [00:28:37]: Number five, the best TV show you binged during the pandemic. Dr. Eric Stoller [00:28:40]: Oh, gosh, that was years ago now. The best show? Well, my second son was born during the pandemic, and I watched ridiculous amounts of things late, late at night. I would say some sort of Scandi noir thing on Netflix, because that was kind of what I was into at the time. Dr. Jill Creighton [00:28:56]: Pandemic's been over for years for you. It's only been over for eight months where I'm at. Number six, the podcast you spent the most hours listening to in the last year. Dr. Eric Stoller [00:29:04]: I would say anything from the Enrollify Podcast network. I like the work that they've done. I feel like their shows are really put together nicely, and there's always interesting topics in terms of higher ed innovation and technology. Dr.
Jill Creighton [00:29:17]: And finally, number seven, any shout outs you'd like to give, personal or professional? Dr. Eric Stoller [00:29:20]: First of all, I'd just like to say thank you to Jill for asking me to come on the show. I think that it's always nice to reconnect with folks from Oregon State. So I'll just give you a big shout out because it's been a blast to follow your career sort of vicariously through social networks and social media, you know. You've been just a huge leader around the globe. I mean, you've been everywhere, it seems. So I'm going to give Jill a shout out because I don't think she probably gets enough on these things. Dr. Jill Creighton [00:29:45]: Thank you. Appreciate it. Eric, it's been such a joy to catch up with you. I've also followed your career just on social. This is the strength of weak ties. I'll cite Granovetter here as a scholar that I read a lot in my public administration doctorate program. But the Strength of Weak Ties, we haven't spoken maybe ten years probably, but it's so lovely to understand and see how we're both contributing, knowing we started off as babies in grad school. And it's very nice to see what success looks like and means for various people from that time in our lives. Dr. Jill Creighton [00:30:15]: And if folks would like to reach you after the show, how can they find you? Dr. Eric Stoller [00:30:17]: Territorium.com? Or you can always just Google Eric Stoller. Something will come up, most likely. My email is Eric@territorium.com. Dr. Jill Creighton [00:30:25]: Eric with a C. Exactly. Dr. Eric Stoller [00:30:27]: E-R-I-C. Dr. Jill Creighton [00:30:28]: Thank you so much for sharing your voice with us. Dr. Eric Stoller [00:30:30]: Thanks so much, Jill. It's been great. Dr. Jill Creighton [00:30:32]: This has been an episode of SA Voices from the Field brought to you by NASPA. This show is always made possible because of you, our listeners. We are so grateful that you continue to listen to us season after season. If you'd like to reach the show, you can always email us at savoices@naspa.org or find me on LinkedIn by searching for Dr. Jill L. Creighton. We welcome your feedback and topic and especially your guest suggestions. Dr. Jill Creighton [00:30:58]: We'd love it if you take a moment to tell a colleague about the show. And please like, rate and review us on Apple Podcasts, Spotify or wherever you're listening now. It really does help other student affairs professionals find the show and helps us become more visible in the larger podcasting community. This episode was produced and hosted by Dr. Jill L. Creighton, that's me, produced and audio engineered by Dr. Chris Lewis. Guest coordination by Lu Yongru. Dr. Jill Creighton [00:31:23]: Special thanks to Duke Kunshan University and the University of Michigan, Flint for your support as we create this project. Catch you next time.
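The comprehensive learner record and digital wallet ideas discussed in the interview above amount to a portable, machine-readable bundle of verified achievements that travels with the learner rather than living inside one institution's SIS or CRM. As an illustration only, here is a minimal TypeScript sketch of what such a record might look like; the type names and fields are hypothetical and simplified, and they are not taken from 1EdTech's actual CLR or Open Badges specifications.

// Hypothetical, simplified shapes for a portable learner record.
// Illustrative only; these do not mirror 1EdTech's CLR or Open Badges schemas.
interface IssuerRef {
  name: string; // e.g., the awarding institution or student affairs unit
  url: string;  // where the issuer can be verified
}

interface Achievement {
  id: string;                                                    // stable identifier for the achievement
  type: "course" | "badge" | "microcredential" | "cocurricular";
  title: string;                                                 // e.g., "Orientation Leader"
  skills: string[];                                              // skills the issuer attests to
  issuer: IssuerRef;
  awardedOn: string;                                             // ISO 8601 date
  evidenceUrl?: string;                                          // optional link to supporting evidence
  signature: string;                                             // issuer's signature (verification detail omitted)
}

// The learner's "wallet": it travels with the learner,
// independent of any single institution's systems.
interface LearnerWallet {
  learnerId: string;
  achievements: Achievement[];
}

// Example: filtering a wallet for a skill an employer asks about.
function achievementsWithSkill(wallet: LearnerWallet, skill: string): Achievement[] {
  return wallet.achievements.filter(a =>
    a.skills.some(s => s.toLowerCase() === skill.toLowerCase())
  );
}

const wallet: LearnerWallet = {
  learnerId: "learner-123",
  achievements: [
    {
      id: "badge-017",
      type: "cocurricular",
      title: "Orientation Leader",
      skills: ["public speaking", "teamwork"],
      issuer: { name: "Example University Student Affairs", url: "https://example.edu/verify" },
      awardedOn: "2023-05-01",
      signature: "placeholder-signature",
    },
  ],
};

console.log(achievementsWithSkill(wallet, "teamwork").map(a => a.title)); // ["Orientation Leader"]

The interoperability point from the conversation is that both the wallet and each achievement can be verified outside the issuing institution's own systems, which is the kind of plug-and-play portability that standards work in this space aims to make routine.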
On this episode, Tom talks with John O'Brien, President and CEO of EDUCAUSE, the international nonprofit association whose mission is to advance higher education through technology innovation. Together they discuss a need for digital ethics across the curriculum, a renewed focus on what students want and need, and the importance of creating a culture of care in this world of exponential use of technology.
In this Noodle podcast episode, our guest is Alan Mlynek, Chief of Technology Solutions here at Noodle. As we gear up to physically attend both the 2023 EDUCAUSE and Leadership in Higher Education conferences, our conversation centers on the future of tech in education. We dive into leveraging AI to foster dynamic learning communities and talk about the challenge of engaging 'degree-completion' students, highlighting the need for a more tailored and inclusive approach.
Rick Howard, the CSO, Chief Analyst, and Senior Fellow at N2K Cyber, discusses the meaning of quantum computing through a cybersecurity perspective with CyberWire Hash Table guests Dr. Georgian Shea, Chief Technologist at the Foundation for Defense of Democracies, and Jonathan Franz, the Chief Information Security Officer at ISC2. Research contributors include Bob Turner, Fortinet's Field CISO – Education, Don Welch, New York University CIO, Rick Doten, CISO at Healthcare Enterprises and Centene, and Zan Vautrinot, Major General - retired. Howard, R., 2023. Cybersecurity First Principles: A Reboot of Strategy and Tactics [Book]. Wiley. URL: https://www.amazon.com/Cybersecurity-First-Principles-Strategy-Tactics/dp/1394173083. Deen, S., 2008. 007 | Quantum of Solace | Theme Song [Video]. YouTube. URL https://www.youtube.com/watch?v=YMXT3aJxH_A Dungey, T., Abdelgaber, Y., Casto, C., Mills, J., Fazea, Y., 2022. Quantum Computing: Current Progress and Future Directions [Website]. EDUCAUSE . URL https://er.educause.edu/articles/2022/7/quantum-computing-current-progress-and-future-directions. France, J., 2023. Quantum Compute and CyberSecurity, in: ISC2 Secure Summits. France, J., 2023. The Race Against Quantum: It's Not Too Late to be the Tortoise that Beat the Hare [Essay]. Infosecurity Magazine. URL https://www.infosecurity-magazine.com/opinions/race-quantum-tortoise-beat-hare/. Shea, Dr.G., Fixler, A., 2022. Protecting and Securing Data from the Quantum Threat [Technical Note]. Foundation for the Defense of Democracies. URL https://www.fdd.org/wp-content/uploads/2022/12/fdd-ccti-protecting-and-securing-data-from-the-quantum-threat.pdf
The recent hack of MOVEit has serious implications for higher education. MOVEit, an application used by the National Student Clearinghouse and many other institutions to move large files, directly affects numerous higher ed institutions and solution providers. This, coupled with the updated Gramm-Leach-Bliley Act (GLBA) Safeguards Rule requirements going into effect in early June of 2023, has (should have) put cybersecurity at the top of mind for college and university decision-makers. In his latest podcast episode, Dr. Drumm McNaughton once again speaks with virtual chief information security officer Brian Kelly, who this time returns to Changing Higher Education to discuss the ramifications of MOVEit getting compromised, tools that can help higher ed institutions protect themselves, all nine elements of the GLBA that colleges and universities must be in compliance with to receive financial aid, what GLBA enforcement could look like, and an online hub that states and higher ed can emulate to ensure students enter the cybersecurity field. Highlights § MOVEit, a third-party tool used by the National Student Clearinghouse and others to move large data files, was recently breached, compromising institutional data. This is having a downstream impact on higher ed since many institutions engage with the NSC. § In addition to performing triage and internal assessments, higher ed institutions must reach out to all of their vendors and contractors and ask if they use MOVEit and, if so, what they are doing to protect their data. § It is important to have a process in place for vetting third-party risk (a simple illustrative sketch of tracking such vendor reviews follows these show notes). EDUCAUSE's HECVAT can help address this and future problems. It's a standard set of questions that institutions can ask third-party vendors about security and privacy. Over 150 colleges and universities use HECVAT version 3.0's questionnaire in their procurement process. Large vendors like Microsoft and Google have completed it. § HECVAT makes it easier for vendors since they don't have to answer bespoke questionnaires from numerous institutions that might have their own nuances and differences. It also allows the community of CISOs and cybersecurity and privacy practitioners in higher ed to have a conversation around a grounded, standardized set of questions. § The Federal Trade Commission's Safeguards Rule, which changed the standards around safeguarding customer information, went into effect on December 9th, 2021. The GLBA compliance deadline in early June of 2023 required higher education institutions to meet the elements of those rule changes. There are nine elements. § The primary rule change is designating a CISO or a qualified individual responsible for protecting customer information or student financial aid data. The second is to perform a risk assessment at least annually by a third party or internally. § The third involves access review controls. Institutions must annually vet employees granted access to information and ensure more people haven't been granted access. Institutions must know where all data resides and that all incoming data is identified. Institutions must ensure data is protected and encrypted when it's being stored and in use, ensure the coding or development of any software that interacts with the Department of Education's data follows secure practices, ensure data that institutions should no longer have or that has aged out has been properly disposed of, and ensure change management has been implemented.
Institutions must identify who has access to customer information and annually review their logs. § The fourth ensures that institutions annually validate that these controls are in place and working as intended. The fifth mandates that the individuals who interact with the Department of Education and use customer information are appropriately trained and aware of the risks involved. The sixth ensures institutions have a program and process to address and test for third-party risks. Seventh mandates having a prescriptive plan for responding to incidents, regularly testing and validating the plan to see if it's working, and identifying the lessons learned. The ninth mandates that the CISO annually reports to the board or president. Read the podcast transcript → About Our Podcast Guest Brian Kelly supports the safeguarding of information assets across multiple verticals against unauthorized use, disclosure, modification, damage, or loss by developing, implementing, and maintaining methods to provide a secure and stable environment for clients' data and related systems. Before joining Compass, Brian was the CISO at Quinnipiac University and, most recently the Cybersecurity Program Director at EDUCAUSE. Brian is also an Adjunct Professor at Naugatuck Valley Community College, where he has developed and teaches cybersecurity courses. Brian has diverse experience in information security policy development, awareness training, and regulatory compliance. He provides thought leadership on information security issues across industries and is a recognized leader in his field. Brian holds a bachelor's degree from the University of Connecticut and a master's degree from Norwich University. He has served in various leadership roles on the local boards of the ISSA, InfraGard, and HTCIA chapters. Brian is also a retired Air Force Cyber Operations Officer. About the Host Dr. Drumm McNaughton, the host of Changing Higher Ed®, is a consultant to higher ed institutions in governance, accreditation, strategy and change, and mergers. To learn more about his services and other thought leadership pieces, visit his firm's website, https://changinghighered.com/. The Change Leader's Social Media Links LinkedIn: https://www.linkedin.com/in/drdrumm/ Twitter: @thechangeldr Email: podcast@changinghighered.com #HigherEducation #HigherEdCybersecurity #MOVEitHack
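The third-party risk process described in the notes above is essentially a recurring review loop: collect a vendor's standardized security answers (such as a completed HECVAT), record the institution's assessment, and revisit it on a schedule. As a loose illustration only, and not the actual HECVAT format or any particular institution's workflow, here is a small TypeScript sketch of how a procurement or security team might track those reviews; every name and field below is hypothetical.

// Hypothetical record for tracking third-party security reviews.
// This does not encode HECVAT's actual questions or the nine GLBA elements;
// it only illustrates keeping review cadence and data-handling flags in one place.
interface VendorReview {
  vendor: string;                       // e.g., a file-transfer or SIS vendor
  product: string;
  hecvatVersion: string;                // which HECVAT questionnaire was completed, e.g., "3.0"
  completedOn: string;                  // ISO 8601 date the vendor returned the questionnaire
  reviewedBy: string;                   // institutional security reviewer
  risk: "low" | "moderate" | "high";    // institution's own rating after review
  handlesStudentFinancialData: boolean; // flags records relevant to GLBA Safeguards obligations
  nextReviewDue: string;                // reviews recur, e.g., annually
}

// Surface reviews that are overdue so the team can follow up,
// the kind of check that matters after an incident like the MOVEit breach.
// ISO 8601 date strings compare correctly as plain strings.
function overdueReviews(reviews: VendorReview[], today: string): VendorReview[] {
  return reviews.filter(r => r.nextReviewDue < today);
}

const reviews: VendorReview[] = [
  {
    vendor: "Example File Transfer Co.",
    product: "Managed file transfer",
    hecvatVersion: "3.0",
    completedOn: "2022-05-15",
    reviewedBy: "infosec@example.edu",
    risk: "moderate",
    handlesStudentFinancialData: true,
    nextReviewDue: "2023-05-15",
  },
];

console.log(overdueReviews(reviews, "2023-07-01").map(r => r.vendor)); // ["Example File Transfer Co."]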
Pablo Molina, associate vice president of information technology and chief information security officer at Drexel University and adjunct professor at Georgetown University, leads the conversation on the implications of artificial intelligence in higher education. FASKIANOS: Welcome to CFR's Higher Education Webinar. I'm Irina Faskianos, vice president of the National Program and Outreach here at CFR. Thank you for joining us. Today's discussion is on the record, and the video and transcript will be available on our website, CFR.org/Academic, if you would like to share it with your colleagues. As always, CFR takes no institutional positions on matters of policy. We are delighted to have Pablo Molina with us to discuss implications of artificial intelligence in higher education. Dr. Molina is chief information security officer and associate vice president at Drexel University. He is also an adjunct professor at Georgetown University. Dr. Molina is the founder and executive director of the International Applied Ethics in Technology Association, which aims to raise awareness on ethical issues in technology. He regularly comments on stories about privacy, the ethics of tech companies, and laws related to technology and information management. And he's received numerous awards relating to technology and serves on the board of the Electronic Privacy Information Center and the Center for AI and Digital Policy. So Dr. P, welcome. Thank you very much for being with us today. Obviously, AI is on the top of everyone's mind, with ChatGPT coming out and being in the news, and so many other stories about what AI is going to—how it's going to change the world. So I thought you could focus in specifically on how artificial intelligence will change and is influencing higher education, and what you're seeing, the trends in your community. MOLINA: Irina, thank you very much for the opportunity, to the Council on Foreign Relations, to be here and express my views. Thank you, everybody, for taking time out of your busy schedules to listen to this. And hopefully, I'll have the opportunity to learn much from your questions and answer some of them to the best of my ability. Well, since I'm a professor too, I like to start by giving you homework. And the homework is this: I do not know how much people know about artificial intelligence. In my opinion, anybody who has ever used ChatGPT considers herself or himself an expert. To some extent, you are, because you have used one of the first publicly available artificial intelligence tools out there and you know more than those who haven't. So if you have used ChatGPT, or Google Bard, or other services, you already have a leg up to understand at least one aspect of artificial intelligence, known as generative artificial intelligence. Now, if you want to learn more about this, there's a big textbook about this big. I'm not endorsing it. All I'm saying, for those people who are very curious, there are two great academics, Russell and Norvig. They're in their fourth edition of a wonderful book that covers every aspect of—technical aspect of artificial intelligence, called Artificial Intelligence: A Modern Approach. And if you're really interested in how artificial intelligence can impact higher education, I recommend a report by the U.S. Department of Education that was released earlier this year in Washington, DC from the Office of Educational Technology. It's called Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations.
So if you do all these things and you read all these things, you will hopefully transition from being whatever expert you were before—a pandemic and Ukrainian war expert—to an artificial intelligence expert. So how do I think artificial intelligence is going to affect all these wonderful things? Well, as human beings, we tend to overestimate the impact of technology in the short run and really underestimate the impact of technology in the long run. And I believe this is also the case with artificial intelligence. We're in a moment where there's a lot of hype about artificial intelligence. It will solve every problem under the sky. But it will also create the most catastrophic future and dystopia that we can imagine. And possibly neither one of these two is true, particularly if we regulate and use these technologies and develop them following some standard guidelines that we have followed in the past, for better or worse. So how is artificial intelligence affecting higher education? Well, number one, there is a great lack of regulation and legislation. So, you know, for example: OpenAI released ChatGPT. People started trying it. And all of a sudden there were people like here, where I'm speaking to you from, in Italy. I'm in Rome on vacation right now. And the Italian data protection agency said: Listen, we're concerned about the privacy of this tool for citizens of Italy. So the company agreed to establish some rules, some guidelines and guardrails on the tool. And then it reopened to the Italian public, after being closed for a while. The same thing happened with the Canadian data protection authorities. In the United States, well, not much has happened, except that one of the organizations on whose board I serve, the Center for Artificial Intelligence and Digital Policy, earlier this year in March of 2023 filed a sixty-four-page complaint with the Federal Trade Commission, in which we're basically asking the Federal Trade Commission: You do have the authority to investigate how these tools can affect the U.S. consumers. Please do so, because this is your purview, and this is your responsibility. And we're still waiting on the agency to declare what the next steps are going to be. If you look at other bodies of legislation or regulation on artificial intelligence that can help us guide artificial intelligence, well, you can certainly pay attention to the U.S. Congress. And what is the U.S. Congress doing? Yeah, pretty much that, not much, to be honest. They listened to Sam Altman, the CEO of OpenAI, the company behind ChatGPT, who recently testified before Congress, urging Congress to regulate artificial intelligence. Which is quite clever on his part. So it was on May 17 that he testified that we could be facing catastrophic damage ahead if artificial intelligence technology is not regulated in time. He also sounded the alarm about counterfeit humans, meaning that these machines could replace what we think a person is, at least virtually. And he also warned about the end of factual evidence, because with artificial intelligence anything can be fabricated. Not only that, but he pointed out that artificial intelligence could start wars and destroy democracy. Certainly very, very grim predictions. And before this, many of the companies were self-regulating for artificial intelligence. If you look at Google, Microsoft, Facebook, now Meta—all of them have their own artificial intelligence self-guiding principles. Most of them were very aspirational.
Those could help us in higher education because, at the very least, they can help us create our own policies and guidelines for our community members—faculty, staff, students, researchers, administrators, partners, vendors, alumni—anybody who happens to interact with our institutions of higher learning. Now, what else is happening out there? Well, we have tons and tons of laws and regulations that have to do with technology. Things like the Gramm-Leach-Bliley Act, or the Securities and Exchange Commission rules, or Sarbanes-Oxley. Federal regulations like FISMA, and the Cybersecurity Maturity Model Certification, and the Payment Card Industry standards; there is the Computer Fraud and Abuse Act; there is the Budapest Convention; and cybersecurity insurance providers will tell us what to do and what not to do about technology. We have state laws and many privacy laws. But, to be honest, very few artificial intelligence laws. And it's groundbreaking in Europe that the European parliamentarians have agreed to discuss the Artificial Intelligence Act, which could be the first one really to be passed at this level in the world, after some efforts by China and other countries. And, if adopted, it could be a landmark change in the adoption of artificial intelligence. In the United States, even though Congress is not doing much, the White House is trying to position itself in the realm of artificial intelligence. So there's an executive order from February of 2023—that many of us in higher education read because, once again, we're trying to find inspiration for our own rules and regulations—that tells federal agencies that they have to root out bias in the design and use of new technologies, including artificial intelligence, because they have to protect the public from algorithmic discrimination. And we all believe this. In higher education, we believe in being fair and transparent and accountable. I would be surprised if any of us is not concerned about making sure that our technology use, our artificial intelligence use, follows these particular principles as proposed by the Organization for Economic Cooperation and Development, and many other bodies of ethics and expertise. Now, the White House also announced new centers—research and development centers with some new national artificial intelligence research institutes. Many of us will collaborate with those in our research projects. A call for public assessments of existing generative artificial intelligence systems, like ChatGPT. And it also is trying to enact or is enacting policies to ensure that the U.S. government—the executive branch—is leading by example when mitigating artificial intelligence risks and harnessing artificial intelligence opportunities. Because, in spite of all the concerns about this, it's all about the opportunities that we hope to achieve with artificial intelligence. And when we look at how specifically we can benefit from artificial intelligence in higher education, well, certainly we can start with new and modified academic offerings. I would be surprised if most of us will not have degrees—certainly, we already have degrees—graduate degrees on artificial intelligence, and machine learning, and many others. But I would be surprised if we don't even add some bachelor's degrees in this field, or we don't modify significantly some of our existing academic offerings to incorporate artificial intelligence in various specialties, in our courses, or in components of the courses that we teach our students.
We're looking at amazing research opportunities, things that we'll be able to do with artificial intelligence that we couldn't even think about before, that are going to expand our ability to generate new knowledge to contribute to society, with federal funding, with private funding. We're looking at improved knowledge management, something that librarians are always very concerned about, the preservation and distribution of knowledge. The idea would be that artificial intelligence will help us better find the things that we're looking for, the things that we need in order to conduct our academic work. We're certainly looking at new and modified pedagogical approaches, new ways of learning and teaching, including the promise of adaptive learning, something that really can tell students: Hey, you're not getting this particular concept. Why don't you go back and study it in a different way with a different virtual avatar, using simulations or virtual assistants? In almost every discipline and academic endeavor. We're also very concerned about offering, you know, good value for the money when it comes to education. So we're hoping to achieve extreme efficiencies, better ways to run admissions, better ways to guide students through their academic careers, better ways to coach them into professional opportunities. And much of this will be possible thanks to artificial intelligence. And also, let's not forget this, but we still have many underserved students, and they're underserved because they either cannot afford education or maybe they have physical or cognitive disabilities. And artificial intelligence can really help us reach those students and offer them new opportunities to advance their education and fulfill their academic and professional goals. And I think this is a good introduction. And I'd love to talk about all the things that can go wrong. I'd love to talk about all the things that we should be doing so that things don't go as wrong as predicted. But I think this is a good way to set the stage for the discussion. FASKIANOS: Fantastic. Thank you so much. So we're going to go to all of you now for your questions and comments, and to share best practices. (Gives queuing instructions.) All right. So I'm going first to Gabriel Doncel, adjunct faculty at the University of Delaware, who has a written question: How do we incentivize students to approach generative AI tools like ChatGPT for text in ways that emphasize critical thinking and analysis? MOLINA: I always like to start with a difficult question, so thank you very much, Gabriel Doncel, for that particular question. And, as you know, there are several approaches to adopting tools like ChatGPT on campus by students. One of them is to say: No, over my dead body. If you use ChatGPT, you're cheating. Even if you cite ChatGPT, we can consider you to be cheating. And not only that, but some institutions have invested in tools that can detect whether or not something was written with ChatGPT or similar tools. There are other faculty members and other academic institutions that are realizing these tools will be available when these students join the workforce. So our job is to help them do the best that they can by using these particular tools, to make sure they avoid some of the mishaps that have already happened. There are a number of lawyers who have used ChatGPT to file legal briefs.
And when the judges received those briefs, and read through them, and looked at the citations, they realized that some of the citations were completely made up, were not real cases. Hence, the lawyers faced professional disciplinary action because they used the tool without the professional review that is required. So hopefully we're going to educate our students and we're going to set policy and guideline boundaries for them to use these, as well as sometimes the necessary technical controls for those students who may not be that ethically inclined to follow our guidelines and policies. But I think that to hide our heads in the sand and pretend that these tools are not out there for students to use would be a disservice to our institutions, to our students, and to the mission that we have of training the next generation of knowledge workers. FASKIANOS: Thank you. I'm going to go next to Meena Bose, who has a raised hand. Meena, if you can unmute yourself and identify yourself. Q: Thank you, Irina. Thank you for this very important talk. And my question is a little—(laughs)—it's formative, but really—I have been thinking about what you were saying about the role of AI in academic life. And I don't—particularly for undergraduates, for admissions, advisement, guidance on curriculum. And I don't want to have my head in the sand about this, as you just said—(laughs)—but it seems to me that any kind of meaningful interaction with students, particularly students who have not had any exposure to college before, depends upon kind of multiple feedback with faculty members, development of mentors, to excel in college and to consider opportunities after. So I'm struggling a little bit to see how AI can be instructive for that part of college life, beyond kind of providing information, I guess. But I guess the web does that already. So welcome your thoughts. Thank you. FASKIANOS: And Meena's at Hofstra University. MOLINA: Thank you. You know, it's a great question. And the idea that everybody is proposing right here is that artificial intelligence companies, at least at first—we'll see in the future because, you know, it depends on how it's regulated—they're not trying, or so they claim, to replace doctors, or architects, or professors, or mentors, or administrators. They're trying to help precisely those people in those professions, and the people they serve, gain access to more information. And you're right in a sense that that information is already on the web. But we've always had a problem finding that information regularly on the web. And you may remember that when Google came along, I mean, it swept through every other search engine out there—AltaVista, Yahoo, and many others—because, you know, it had a very good search algorithm. And now we're going to the next level. The next level is where you ask ChatGPT in human-natural language. You're not trying to combine the three words that say, OK, is the economics class required? No, no, you're telling ChatGPT, hey, listen, I'm in the master's in business administration at Drexel University and I'm trying to take more economics classes. What recommendations do you have for me? And this is where you can have a preliminary answer, and also a caveat there, as most of these search engines—generative AI engines—already have, that tells you: We're not here to replace the experts. Make sure you discuss your questions with the experts. We will not give you medical advice. We will not give you educational advice.
We're just here, to some extent, for guiding purposes and, even now, for experimental and entertainment purposes. So I think you are absolutely right that we have to be very judicious about how we use these tools to support the students. Now, that said, I had the privilege of working for public universities in the state of Connecticut when I was the CIO. I also had the opportunity early in my career to attend a public university in Europe, in Spain, where we were hundreds of students in class. We couldn't get any attention from the faculty. There were no mentors, there were no counselors, or anybody else. Is it better to have nobody to help you, or is it better to have at least some technology guidance that can help you find the information that otherwise is spread throughout many different systems that are like ivory towers—admissions on one side, economics on the other, academic advising on another, and everything else? So thank you for a wonderful question and reflection. FASKIANOS: I'm going to take the next question, written from Dr. Russell Thomas, a senior lecturer in the Department of International Relations and Diplomatic Studies at Cavendish University in Uganda: What are the skills and competencies that higher education students and faculty need to develop to think in an AI-driven world? MOLINA: So we could argue here that something very similar has happened already with many information technologies and communication technologies. It is my understanding that at first faculty members did not want to use email, or the web, or many other tools because they were too busy with their disciplines. And rightly so. They were brilliant economists, or philosophers, or biologists. They didn't have enough time to learn all these new technologies to interact with the students. But eventually they did learn, because they realized that it was the only way to meet the students where they were and to communicate with them in efficient ways. Now, I have to be honest; when it comes to the use of technology—and we'll unpack the numbers—it was part of my doctoral dissertation, when I expanded the technology adoption models, which tell you about early adopters, and mainstream adopters, and late adopters, and laggards. But I uncovered a new category for some of the institutions where I worked, called the over-my-dead-body adopters. And these were some of the faculty members who say: I will never switch word processors. I will never use this technology. It's only forty years until I retire, probably eighty more until I die. I don't have to do this. And, to be honest, we have a responsibility to understand that those artificial intelligence tools are out there, and to guide the students as to what is the acceptable use of those technologies within the disciplines and the courses that we teach them in. Because they will find those available in a very competitive work market, in a competitive labor market, because they can derive some benefit from them. But also, we don't want to shortchange their educational attainment just because they go behind our backs to copy and paste from ChatGPT, learning nothing. Going back to the question by Gabriel Doncel, not learning to exercise the critical thinking, using citations and material that is unverified, that was borrowed from the internet without any authority, without any attention to the different points of view.
I mean, if you've used ChatGPT for a while—and I have personally, even to prepare some basic thank-you speeches, which are all very formal, even to contest a traffic ticket in Washington, DC, when I was speeding but didn't want to pay the ticket anyway, even for just research purposes—you could realize that most of the writing from ChatGPT has a very, very common style. Which is, oh, on the one hand people say this, on the other hand people say that. Well, the critical thinking will tell you, sure, there are two different opinions, but this is what I think myself, and this is why I think about this. And these are some of the skills, the critical thinking skills, that we must continue to teach the students, and not to, you know, put blinders on their eyes and say, oh, continue focusing only on the textbook and the website. No, no. Look at the other tools, but use them judiciously. FASKIANOS: Thank you. I'm going to go next to Clemente Abrokwaa. Raised hand, if you can identify yourself, please. Q: Hi. Thanks so much for your talk. It's something that has been—I'm from Penn State University. And this is a very important topic, I think. And some of the earlier speakers have already asked the questions I was going to ask. (Laughs.) But one thing that I would like to say is that, as you said, we cannot bury our heads in the sand. No matter what we think, the technology is already here. So we cannot avoid it. My question, though, is what do you think about the use of artificial intelligence by, say, for example, graduate students using it to write dissertations? You did mention the lawyers that used it to write their briefs, and they were caught. But in dissertations and also in class—for example, you have students—you have about forty students. You give a written assignment. You make—when you start grading, you have grading fatigue. And so at some point you lose interest in actually checking. And so I'm kind of concerned about how it will affect the students' desire to actually go and research without resorting to the use of AI. MOLINA: Well, Clemente, fellow colleague from the state of Pennsylvania, thank you for that, once again, both a question and a reflection here. Listen, many of us wrote our doctoral dissertations—mine at Georgetown. At one point in time, I was so tired of writing about the same topics, following the wonderful advice, but also the whims, of my dissertation committee, that I was this close to outsourcing my thesis to China. I didn't, but I thought about it. And now graduate students are thinking, OK, why am I going through the difficulties of writing this when ChatGPT can do it for me and the deadline is tomorrow? Well, this is what will distinguish the good students and the good professionals from the other ones. And the interesting part is, as you know, when we teach graduate students we're teaching them critical thinking skills, but also teaching them how to express themselves, you know, either orally or in writing. And writing effectively is fundamental in the professions, but also absolutely critical in academic settings. And anybody who's just copying and pasting from ChatGPT into these documents cannot do that level of writing. But you're absolutely right. Let's say that we have an adjunct faculty member who's teaching a hundred students. Will that person go through every single essay to find out whether students were cheating with ChatGPT? Probably not.
And this is why there are also enterprising people who are using artificial intelligence to find out and tell you whether a paper was written using artificial intelligence. So it's a little bit like this fight between different tools, and business opportunities for all of them. And we've done this. We've used antiplagiarism tools in the past because we knew that students were copying and pasting using Google Scholar and many other sources. And now oftentimes we run antiplagiarism tools. We didn't write them ourselves. Or we tell the students, you run it yourself and you give it to me. And make sure you are not accidentally failing to cite things, which could end up jeopardizing your ability to get a graduate degree because your work was not up to snuff with the requirements of our stringent academic programs. So I would argue that these antiplagiarism tools that we're using will, more often than not and sooner than expected, incorporate the detection of artificial intelligence writeups. And also the interesting part is to tell the students, well, if you do choose to use any of these tools, what are the rules of engagement? Can you ask it to write a paragraph and then you cite it, and you mention that ChatGPT wrote it? Not to mention, in addition to that, all the issues about artificial intelligence, which the courts are deciding now, regarding the intellectual property of those productions. If a song, a poem, a book is written by an artificial intelligence entity, who owns the intellectual property for those works produced by an artificial intelligence machine? FASKIANOS: Good question. We have a lot of written questions. And I'm sure you don't want to just listen to my voice, so please do raise your hands. But we do have a question from one of your colleagues, Pablo, Pepe Barcega, who's the IT director at Drexel: Considering the potential biases and limitations of AI models, like ChatGPT, do you think relying on such technology in the educational domain can perpetuate existing inequalities and reinforce systemic biases, particularly in terms of access, representation, and fair evaluation of students? And Pepe's question got seven upvotes, so we advanced it to the top of the line. MOLINA: All right, well, first I have to wonder whether he used ChatGPT to write the question. But I'm going to leave it at that. Thank you. (Laughter.) It's a wonderful question. One of the greatest concerns we have had, those of us who have been working on artificial intelligence digital policy for years—not this year when ChatGPT was released, but for years we've been thinking about this, and even before artificial intelligence, in general with algorithm transparency. And the idea is the following: That two things are happening here. One is that we're programming the algorithms using instructions, instructions created by programmers, with all their biases, and their misunderstandings, and their shortcomings, and their lack of context, and everything else. But with artificial intelligence we're doing something even more concerning than that, which is we have some basic algorithms but then we're feeding a lot of information, a corpus of information, to those algorithms. And the algorithms are fine-tuning the rules based on that information.
So it's very, very difficult for experts to explain how an artificial intelligence system actually makes decisions, because we know the engine and we know the data that we fed to the engine, but we don't really know how those decisions are being made through neural networks, through all of the different systems and methods that we have for artificial intelligence. Very, very few people understand how those work. And those few are so busy they don't have time to explain how the algorithms work to others, including the regulators. Let's remember some of the failed cases. Amazon tried this early on, for selecting employees for Amazon. And they fed in all the resumes. And guess what? It turned out that most of the recommendations were to hire young white people who had gone to Ivy League schools. Why? Because the resumes of their first employees were feeding those descriptions, and those employees had done extremely well at Amazon. Hence, by feeding it the information of past successful employees, only those profiles were recommended. And so that does away with the diversity that we need—from different academic institutions, large and small, public and private, from different countries, from different genders, from different ages, from different ethnicities. All those things went away because the algorithm was promoting one particular profile. Recently I had the opportunity to moderate a panel in Washington, DC, and we had representatives from the Equal Employment Opportunity Commission. And they told us how they investigated a hiring algorithm from a company that was disproportionately recommending that they hire people whose first name was Brian and who had played lacrosse in high school because, once again, a disproportionate number of people in that company had done that. And the algorithm realized, oh, these must be important characteristics to hire people for this company. Let's not forget, for example, the facial recognition artificial intelligence from Amazon Rekognition, you know, the facial recognition software: the American Civil Liberties Union decided, OK, I'm going to submit the pictures of all the congressmen to this particular facial recognition engine. And it turned out that it misidentified many of them, particularly African Americans, as felons who had been convicted. So all these artificial—all these biases could have really, really bad consequences. Imagine that you're using this to decide who you admit to your universities, and the algorithm is wrong. You know, you are making really biased decisions that will affect the livelihood of many people, but also will transform society, possibly for the worse, if we don't address this. So this is why the OECD, the European Union, even the White House, everybody is saying: We want this technology. We want to derive the benefits of this technology, while curtailing the abuses. And it's fundamental that we achieve transparency and make sure that these algorithms are not biased against the people who use them. FASKIANOS: Thank you. So I'm going to go next to Emily Edmonds-Poli, who is a professor at the University of San Diego: We hear a lot about providing clear guidelines for students, but for those of us who have not had a lot of experience using ChatGPT it is difficult to know what clear guidelines look like. Can you recommend some sources we might consult as a starting point, or where we might find some sample language? MOLINA: Hmm. Well, certainly this is what we do in higher education.
We compete for the best students and the best faculty members. And we sometimes compete a little bit to be first to win groundbreaking research. But we tend to collaborate on everything else, particularly when it comes to policy, and guidance, and rules. So there are many institutions, like mine, who have already assembled—I'm sure that yours has done the same—assembled committees, because assembling committees and subcommittees is something we do very well in higher education, with faculty members, with administrators, even with student representation, to figure out, OK, what should we do about the use of artificial intelligence on our campus? I mentioned before that taking a look at the big aspirational declarations by Meta, and Google, and IBM, and Microsoft could be helpful for these communities. But also, I'm a very active member of an organization known as EDUCAUSE. And EDUCAUSE is for educators—predominantly higher education educators, administrators, staff members, faculty members—to think about the adoption of information technology. And EDUCAUSE has done good work on this front and continues to do good work on this front. So once again, EDUCAUSE and some of the institutions have already published their guidelines on how to use artificial intelligence and incorporate that within their academic lives. And now, that said, we also know that even though all higher education institutions are the same, they're all different. We all have different values. We all believe in different uses of technology. We trust the students more or less. Hence, it's very important that whatever inspiration you would take, you work internally on campus—as you have done with many other issues in the past—to make sure it really reflects the values of your institution. FASKIANOS: So, Pablo, would you point to a specific college or university that has developed a code of ethics that addresses the use of AI for their academic community beyond your own, but that is publicly available? MOLINA: Yeah, I'm going to be honest, I don't want to put anybody on the spot. FASKIANOS: OK. MOLINA: Because, once again, there are many reasons. But, once again, let me repeat a couple of resources. One of them is from the U.S. Department of Education, from the Office of Educational Technology. And the article is Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations, published earlier this year. The other source really is educause.edu. And if you look at educause.edu on artificial intelligence, you'll find links to articles, you'll find links to universities. It would be presumptuous of me to evaluate whose policies are better than others, but I would argue that the general principles of nonbias, transparency, accountability, and also integration of these tools within the academic life of the institution in a morally responsible way—with concepts like privacy by design, security by design, and responsible computing—all of those are good words to have in there. Now, the other problem with policies and guidelines is that, let's be honest, many of those have no teeth in our institutions. You know, we promulgate them. They're very nice. They look beautiful. They are beautifully written. But oftentimes when people don't follow them, there's not a big penalty. And this is why, in addition to having the policies, educating the campus community is important. But it's difficult to do because we need to educate them about so many things.
About cybersecurity threats, about sexual harassment, about nondiscriminatory policies, about responsible behavior on campus regarding drugs and alcohol, about crime. So many things that they have to learn about. It's hard to get at another topic for them to spend their time on, instead of researching the core subject matter that they chose to pursue for their lives. FASKIANOS: Thank you. And we will be sending out a link to this video, the transcript, as well as the resources that you have mentioned. So if you didn't get them, we'll include them in the follow-up email. So I'm going to go to Dorian Brown Crosby who has a raised hand. Q: Yes. Thank you so much. I put one question in the chat but I have another question that I would like to go ahead and ask now. So thank you so much for this presentation. You mentioned algorithm biases with individuals. And I appreciate you pointing that out, especially when we talk about face recognition, also in terms of forced migration, which is my area of research. But I also wanted you to speak to, or could you talk about the challenges that some institutions in higher education would have in terms of support for some of the things that you mentioned in terms of potential curricula, or certificates, or other ways that AI would be woven into the new offerings of institutions of higher education. How would that look specifically for institutions that might be challenged to access those resources, such as Historically Black Colleges and Universities? Thank you. MOLINA: Well, very interesting question, and a really fascinating point of view. Because we all tend to look at things from our own perspective and perhaps not consider the perspective of others. Those who have much more money and resources than us, and those who have fewer resources and less funding available. So this is a very interesting line. What is it that we do in higher education when we have these problems? Well, as I mentioned before, we build committees and subcommittees. Usually we also do campus surveys. I don't know why we love doing campus surveys and asking everybody what they think about this. Those are useful tools to discuss. And oftentimes the thing that we do also, that we've done for many other topics, well, we hire people and we create new offices—either academic or administrative offices. With all of those, you know, they have certain limitations to how useful and functional they can be. And they also continue to require resources. Resources that, in the end, are paid for by students with, you know, federal financing. But this is the truth of the matter. So if you start creating offices of artificial intelligence on our campuses, however important the work may be on their guidance and however much extra work can be assigned to them instead of distributed to every faculty and the staff members out there, the truth of the matter is that these are not perfect solutions. So what is it that we do? Oftentimes, we work with partners. And our partners love to take—(inaudible)—vendors. But the truth of the matter is that sometimes they have much more—they have much more expertise on some of these topics. 
So for example, if you're thinking about incorporating artificial intelligence into some of the academic materials that you use in class, well, I'm going to take a guess that if you already work with McGraw Hill in economics, or accounting, or some of the other books and websites that they put out that you recommend to your students or make mandatory for your students, that you start discussing with them: hey, listen, are you going to use artificial intelligence? How? Are you going to tell me ahead of time? Because, as a faculty member, you may have a choice to decide: I want to work with this publisher and not this particular publisher because of the way they approach this. And let's be honest, we've seen a number of these vendors with major information security problems. McGraw Hill recently left a repository of data misconfigured out there on the internet, and almost anybody could access that. But many others before them, like Chegg and others, were notorious for their information security breaches. Can we imagine that these people are going to adopt artificial intelligence and not do such a good job of securing the information, the privacy, and the nonbiased approaches that we hold dear for students? I think they require a lot of supervision. But in the end, these publishers have the economies of scale for you to recommend those educational materials instead of developing your own for every course, for every class, and for every institution. So perhaps we're going to have to continue to work together, as we've done in higher education, in consortia, which could be local or regional, based on institutions with the same interests, or on student population, to try to do this. And, you know, hopefully we'll get grants, grants from the federal government, that can be used in order to develop some of the materials and guidelines that are going to help us embrace this—not only to operate better as institutions and fulfill our mission, but also to make sure that our students are better prepared to join society and compete globally, which is what we have to do. FASKIANOS: So I'm going to combine questions. Dr. Lance Hunter, who is an associate professor at Augusta University: There's been a lot of debate regarding whether plagiarism detection software tools like Turnitin can accurately detect AI-generated text. What is your opinion regarding the accuracy of AI text generation detection plagiarism tools? And then Rama Lohani-Chase, at Union County College, wants to know what plagiarism checker tools you would recommend—or, you know, what plagiarism detection for AI you would recommend. MOLINA: Sure. So, number one, I'm not going to endorse any particular company because if I do that I would ask them for money, or the other way around. I'm not sure how it works. I could be seen as biased, particularly here. But there are many out there, and your institutions are using them. Sometimes they are integrated with your learning management system. And, as I mentioned, sometimes we ask the students to use them themselves and then either produce the plagiarism report for us or simply see the results for themselves. I'm going to be honest; when I teach ethics and technology, I tell the students about the antiplagiarism tools at the universities. But I also tell them, listen, if you're cheating in an ethics and technology class, I failed miserably. So please don't. Take extra time if you have to take it, but—you know, and if you want, use the antiplagiarism tool yourself.
But the question stands and is critical, which is that right now those tools are trying to improve the recognition of text written by artificial intelligence, but they're not as good as they could be. So, like every other technology and what I'm going to call antitechnology—used to control the damage of the first technology—it's an escalation where we start trying to identify this. And I think they will continue to do this, and they will be successful in doing this. There are people who have written ad hoc tools using ChatGPT to identify things written by ChatGPT. I tried them. They're remarkably good for the handful of papers that I tried myself, but I haven't conducted enough research myself to tell you if they're really effective tools for this. So I would argue that for the time being you must assume that those tools, as we assume all the time, will not catch all of the cases, only some of the most obvious ones. FASKIANOS: So a question from John Dedie, who is an assistant professor at the Community College of Baltimore County: To combat AI issues, shouldn't we rethink assignments? Instead of papers, have students do PowerPoints, ask students to offer their opinions and defend them? And then there was an interesting comment from Mark Habeeb at Georgetown University School of Foreign Service. Knowledge has been cheap for many years now because it is so readily available. With AI, we have a tool that can aggregate the knowledge and create written products. So, you know, what needs to be the focus now is critical thinking and assessing values. We need to teach our students how to assess and use that knowledge rather than how to find the knowledge and aggregate that knowledge. So maybe you could react to those two—the question and comment. MOLINA: So let me start with the Georgetown one, not only because he's a colleague of mine. I also teach at Georgetown, which is where I obtained my doctoral degree a number of years ago. I completely agree. I completely agree with the issue that we have to teach new skills. And one of the programs in which I teach at Georgetown is our master's in analysis, which is basically for people who want to work in the intelligence community. And these people have to find the information and they have to draw inferences, and try to figure out whether it is a nation-state that is threatening the United States, or another, or a corporation, or something like that. And they use all of that critical thinking, and intuition, and all the tools that we have developed in the intelligence community for many, many years. And with artificial intelligence, if they suspend their judgment and only use artificial intelligence, they will miss very important information that is critical for national security. And the same is true for something like our flagship school, the School of Foreign Service at Georgetown, one of the best in the world in that particular field, where you want to train the diplomats, and the heads of state, and the great strategic thinkers on policy and politics in the international arena to precisely think not in the mechanical way that a machine can think, but also to connect those dots. And, sure, they should be using those tools in order to, you know, get the most favorable starting position, but they should also always use their critical thinking, and their capabilities of analysis, in order to produce good outcomes and good conclusions. Regarding redoing the assignments, absolutely true. But that is hard. It is a lot of work.
We're very busy faculty members. We have to grade. We have to be on committees. We have to do research. And now they ask us to redo our entire assessment strategy, with new assignments that we need to grade again and account for artificial intelligence. And I don't think that any provost out there is saying, you know what? You can take two semesters off to work on this and retool all your courses. That doesn't happen in the institutions that I know of. If you get time off because you're entitled to it, you want to devote that time to research, because that is really what you signed up for when you pursued an academic career, in many cases. I can tell you one thing: here in Europe, where oftentimes they look at these problems with fewer resources than we have in the United States, a lot of faculty members at the high school level, at the college level, are moving to oral examinations because it's much harder to cheat with ChatGPT in an oral examination. Because they will ask you interactive, adaptive questions—like the ones we suffered through when we were defending our doctoral dissertations. And they will realize, the faculty members, whether or not you know the material and you understand the material. Now, imagine oral examinations for a class of one hundred, two hundred, four hundred. Do you do one for the entire semester, with one chosen topic, and run them all? Or do you do several throughout the semester? Do you end up using a ChatGPT virtual assistant to conduct your oral examinations? I think these are complex questions. But certainly redoing our assignments and redoing the way we teach and the way we evaluate our students is perhaps a necessary consequence of the advent of artificial intelligence. FASKIANOS: So next question from Damian Odunze, who is an assistant professor at Delta State University in Cleveland, Mississippi: Who should safeguard against ethical concerns and misuse of AI by criminals? Should the onus fall on the creators and companies like Apple, Google, and Microsoft to ensure security and not pass it on to the end users of the product? And I think you mentioned at the top in your remarks, Pablo, about how the head of OpenAI, the maker of ChatGPT, was urging Congress to put into place some regulation. What is the onus on ChatGPT to protect against some of this as well? MOLINA: Well, I'm going to recycle more of the material from my doctoral dissertation. In this case it was the Molina cycle of innovation and regulation. It goes like this: basically, there are—you know, there are engineers and scientists who create new information technologies. And then there are entrepreneurs and businesspeople and executives who figure out, OK, I know how to package this so that people are going to use it, buy it, subscribe to it, or look at it, so that I can sell the advertising to others. And, you know, this begins, and very, very soon the abuses start. And the abuses are that criminals are using these platforms for reasons that were not envisioned before. Even the executives, as we've seen with Google, and Facebook, and others, decide to invade the privacy of the people because they only have to pay a big fine, but they make much more money than the fines or they expect not to be caught. And what happens in this cycle is that eventually there is so much noise in the media, so many congressional hearings, that regulators step in and try to pass new laws to address this, or the regulatory agencies try to investigate using the powers given to them.
And then all of these new rules have to be tested in courts of law, which could take years by the time a case sometimes reaches all the way to the Supreme Court. Some of them are even knocked down on the way to the Supreme Court when the courts realize this is not constitutional, there's a conflict of laws, and things like that. Now, by the time we regulate these new technologies, not only have many years gone by, but the technologies have changed. The marketing products and services have changed, the abuses have changed, and the criminals have changed. So this is why we're always living in a loosely regulated space when it comes to information technology. And this is an issue of accountability. We're finding this, for example, with information security. If my phone is hacked, or my computer, or my email, is it the fault of Microsoft, and Apple, and Dell, and everybody else? Why am I the one paying the consequences and not any of these companies? Because it's unregulated. So morally speaking, yes. These companies are accountable. Morally speaking, the users are also accountable, because we're using these tools and incorporating them professionally. Legally speaking, so far, nobody is accountable except the lawyers who submitted briefs that were not correct in a court of law and were disciplined for that. But other than that, right now, it is a very gray space. So in my mind, it requires everybody. It takes a village to do the morally correct thing. It starts with the companies and the inventors. It involves the regulators, who should do their job and make sure that there's no unnecessary harm created by these tools. But it also involves every company executive, every professional, every student, and professor who decides to use these tools. FASKIANOS: OK. I'm going to take—combine a couple questions from Dorothy Marinucci and Venky Venkatachalam about the effect of AI on jobs. Dorothy—she's from Fordham University—says she read something about Germany's best-selling newspaper Bild reportedly adopting artificial intelligence to replace certain editorial roles in an effort to cut costs. Does this mean that the field of journalism and communication will change? And Venky's question is: AI—one of the impacts is in the area of automation, leading to the elimination of certain types of jobs. Can you talk about both the elimination of jobs and what new types of jobs you think will be created as AI matures into the business world with more value-added applications? MOLINA: Well, what I like about predicting the future, and I've done this before in conferences and papers, is that, you know, when the future comes ten years from now people will either not remember what I said, or, you know, maybe I was lucky and my prediction was correct. In the specific field of journalism, we've seen it: the journalism and communications field has been decimated because the money that they used to make with advertising has gone away—and, you know, certainly a big part of that money was in the form of corporate profits, but much of it went to hiring good journalists and to investigative journalism; these people could spend six months writing a story, when right now they have six hours to write a story, because there are no resources. All that advertisement money went instead to Facebook, and Google, and many others because they work very well for advertisements. But now the lifeblood of journalism organizations has been really, you know, undermined.
And there's good journalism in other places, in newspapers, but sadly this is a great temptation to replace some of the journalists with more artificial intelligence, particularly the most—on the least important pieces. I would argue that editorial pieces are the most important in newspapers, the ones requiring ideology, and critical thinking, and many others. Whereas there are others that tell you about traffic changes that perhaps do not—or weather patterns, without offending any meteorologists, that maybe require a more mechanical approach. I would argue that a lot of professions are going to be transformed because, well, if ChatGPT can write real estate announcements that work very well, well, you may need fewer people doing this. And yet, I think that what we're going to find is the same thing we found when technology arrived. We all thought that the arrival of computers would mean that everybody would be without a job. Guess what? It meant something different. It meant that in order to do our jobs, we had to learn how to use computers. So I would argue that this is going to be the same case. To be a good doctor, to be a good lawyer, to be a good economist, to be a good knowledge worker you're going to have to learn also how to use whatever artificial intelligence tools are available out there, and use them professionally within the moral and the ontological concerns that apply to your particular profession. Those are the kind of jobs that I think are going to be very important. And, of course, all the technical jobs, as I mentioned. There are tons of people who consider themselves artificial intelligence experts. Only a few at the very top understand these systems. But there are many others in the pyramid that help with preparing these systems, with the support, the maintenance, the marketing, preparing the datasets to go into these particular models, working with regulators and legislators and compliance organizations to make sure that the algorithms and the tools are not running afoul of existing regulations. All of those, I think, are going to be interesting jobs that will be part of the arrival of artificial intelligence. FASKIANOS: Great. We have so many questions left and we just couldn't get to them all. I'm just going to ask you just to maybe reflect on how the use of artificial intelligence in higher education will affect U.S. foreign policy and international relations. I know you touched upon it a little bit in reacting to the comment from our Georgetown University colleague, but any additional thoughts you might want to add before we close? MOLINA: Well, let's be honest, one particular one that applies to education and to everything else, there is a race—a worldwide race for artificial intelligence progress. The big companies are fighting—you know, Google, and Meta, many others, are really putting—Amazon—putting resources into that, trying to be first in this particular race. But it's also a national race. For example, it's very clear that there are executive orders from the United States as well as regulations and declarations from China that basically are indicating these two big nations are trying to be first in dominating the use of artificial intelligence. And let's be honest, in order to do well in artificial intelligence you need not only the scientists who are going to create those models and refine them, but you also need the bodies of data that you need to feed these algorithms in order to have good algorithms. 
So the barriers to entry for other nations and for smaller technology companies are going to be very, very high. It's not going to be easy for any small company to say: Oh, now I'm a huge player in artificial intelligence. Because even if you may have created an interesting new algorithmic procedure, you don't have the datasets that the huge companies have been able to amass and work on for the longest time. Every time you submit a question to ChatGPT, the ChatGPT experts are using those questions to refine the tool. The same way that when we were using voice recognition with Apple or Android or other companies, they were using those voices and our accents and our mistakes in order to refine their voice recognition technologies. So this is the power. We'll see that the early bird gets the worm: those who are investing, those who are aggressively going for it, and those who are also judiciously regulating this can really do very well in the international arena when it comes to artificial intelligence. And so will their universities, because they will be able to really train those knowledge workers, they'll be able to get the money generated from artificial intelligence, and they will be able to, you know, feed back into one another. The advances in the technology will result in more need for students, and more students graduating will propel the industry. And there will also be—we'll always have a fight for talent where companies and countries will attract those people who really know about these wonderful things. Now, keep in mind that artificial intelligence was the core of this, but there are so many other emerging issues in information technology. And some of them are critical to higher education. So there's still, you know, lots of hype, but we think that virtual reality will have an amazing impact on the way we teach and we conduct research and we train for certain skills. We think that quantum computing has the ability to revolutionize the way we conduct research, allowing us to do computations that are not even thinkable today. We'll look at things like robotics. And if you ask me about what is going to take many jobs away, I would say that robotics can take a lot of jobs away. Now, we thought that there would be no factory workers left because of robots, but that hasn't happened. But keep adding robots with artificial intelligence to serve you a cappuccino, or your meal, or take care of your laundry, or many other things, or maybe clean your hotel room, and you realize, oh, there are lots of jobs out there that no longer will be there. Think about artificial intelligence for self-driving vehicles, boats, planes, cargo ships, commercial airplanes. Think about the thousands of taxi drivers and truck drivers who may end up being out of jobs because, listen, the machines drive safer, and they don't get tired, and they can be driving twenty-four by seven, and they don't require health benefits, or retirement. They don't get depressed. They never miss. Think about many of the technologies out there that have an impact on what we do. So artificial intelligence is a multiplier of technologies, a contributor to many other fields and many other technologies. And this is why we're so—spending so much time and so much energy thinking about these particular issues. FASKIANOS: Well, thank you, Pablo Molina. We really appreciate it.
Again, my apologies that we couldn't get to all of the questions and comments in the chat, but we appreciate all of you for your questions and, of course, your insights were really terrific, Dr. P. So we will, again, be sending out the link to this video and transcript, as well as the resources that you mentioned during this discussion. I hope you all enjoy the Fourth of July. And I encourage you to follow @CFR_Academic on Twitter and visit CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for research and analysis on global issues. Again, you can send us comments, feedback, and suggestions to CFRacademic@CFR.org. And, again, thank you all for joining us. We look forward to your continued participation in CFR Academic programming. Have a great day. MOLINA: Adios. (END)
Stephanie Moore + Heather Tillberg-Webb talk about their book, Ethics and Educational Technology, on episode 463 of the Teaching in Higher Ed podcast. Quotes from the episode: "Learning is change." -Heather Tillberg-Webb Resources: Ethics and Educational Technology: Reflection, Interrogation, and Design as a Framework for Practice, by Stephanie L. Moore and Heather K. Tillberg-Webb Imagination quote from Percy Bysshe Shelley Ely's quote is included in this EDUCAUSE article, co-authored by Stephanie Moore and others. CAST UDL Affiliate income disclosure: Books that are recommended on the podcast link to the Teaching in Higher Ed bookstore on Bookshop.org. All affiliate income gets donated to the LibroMobile Arts Cooperative (LMAC), established in 2016 by Sara Rafael Garcia.
In this Anthro to UX podcast episode, Joseph Galanek speaks with host Matt Artz about his UX journey. The conversation covers Joe's journey from being inspired by cultural experiences when traveling to pursuing degrees in anthropology and public health. He shares his challenges in finding a job in anthropology departments due to specific research interests and how he discovered opportunities in consulting agencies that value qualitative and quantitative research. The discussion also explores Joe's work leading client engagements as a UX Strategy Manager at AnswerLab and his experience mentoring anthropologists transitioning into business. About Joseph Galanek Joseph Galanek is an accomplished user experience researcher and UXR strategist with over 15 years of experience in academia and industry. As the UX Strategy Manager at AnswerLab, he collaborates with ecommerce leaders, designing and implementing customized research that meets partners' objectives and business goals. Joseph's expertise includes qualitative and mixed methods research, scoping, and client engagement. With a strong academic background, including a PhD in Cultural Anthropology and an MPH from Case Western Reserve University, Joseph has held research roles at various organizations such as HireWisdom, EDUCAUSE, and ICF. He has contributed to impactful projects, including CDC health campaigns and national HIV prevention initiatives, and is known for balancing methodological rigor with pragmatic strategies. Key Moments 00:02:00 - Reflections on a Career in Anthropology 00:06:36 - Exploring Career Opportunities in Research Consultancies 00:08:12 - Applying Qualitative Research Skills in an Applied Environment 00:14:17 - Collaborative Efforts and Demonstrating Value in Anthropology Research 00:16:00 - Moving from Academia to Business: Advice for Anthropologists 00:19:41 - Making a Good First Impression on LinkedIn 00:21:19 - Analysis of Research Objectives and Actionable Recommendations for Development Teams 00:22:48 - Exploring Trends in Cost-Effective UX Research
On this episode, Tom talks to Susan Grajek, Vice President for Partnerships, Communities and Research at EDUCAUSE, a nonprofit association with the mission to advance higher education through the use of information technology and data. Together they discuss the future of technology in higher education, how 'radical creativity' can be used to unlock novel ideas, and the critical relationship between culture and innovation.
In the third episode of a five-part series based on NACUBO's Top Five Higher Education Business Issues of 2022 project, Bryan Dickson, NACUBO's director, student financial services and educational programs, speaks with José Rodriguez, vice president and chief information officer at Pomona College. Together, they discuss José's “holistic approach” to cybersecurity, opportunities and threats of new AI platforms, and the importance of the human element in information technology. Links & Notes NACUBO's Top 5 Higher Education Business Issues of 2022 EDUCAUSE's 2023 Top 10 IT Issues Connect with Bryan on LinkedIn Connect with José on LinkedIn
Kathe Pelletier of EDUCAUSE joins the Digital2Learn Podcast to offer an insider look at digital transformation. In this two-part series, we hear from Kathe about credentialing and collaborations that make some higher institutions stand out from the rest. Listen now!
In this episode, the Manager of Collaborative Learning from ASU's Learning Experience team (Celia Coochwytewa) and the instructional designers from ASU's Edson College of Nursing and Health Innovation (Jinnette Senecal, Aaron Kraft) explore key concepts and practical applications of data literacy for educators and designers, as highlighted in the 2022 EDUCAUSE Horizon Report, Data and Analytics Edition. We then consider professional learning pathways for developing data literacy fluency and implementation skills. Today's HOT TOPIC is focused on a popular approach to enhancing professional/scholarly learning experiences: gamification. Resources from the episode: *Brown, A., Croft, B., Dello Stritto, M. E., Heiser, R., McCarty, S., McNally, D., Nyland, R., Quick, J., Thomas, R., & Wilks, M. (2022, February 9). Learning analytics from a systems perspective: Implications for practice. EDUCAUSE Review. https://er.educause.edu/articles/2022/2/learning-analytics-from-a-systems-perspective-implications-for-practice *Kraft, A., Coochwytewa, C., & Senecal, J. (Hosts). Van Leusen, P. (Guest participant). (2019, August 26). Adaptive learning in higher ed: Failure is not an option. (Summer 2019 Bonus Episode 03) [Audio podcast episode]. In Instruction By Design. Arizona State University, Edson College of Nursing and Health Innovation. https://soundcloud.com/ibd_podcast/summer-bonus-episode-03-adaptive-learning-in-higher-ed-failure-is-not-an-option *Panetta, K. (2021, August 26). A data and analytics leader's guide to data literacy. Gartner. https://www.gartner.com/smarterwithgartner/a-data-and-analytics-leaders-guide-to-data-literacy *Raffaghelli, J. E., & Stewart, B. (2020). Centering complexity in 'educators' data literacy' to support future practices in faculty development: a systematic review of the literature. Teaching in Higher Education, 25(4), 435–455. https://doi.org/10.1080/13562517.2019.1696301 *Ridsdale, C., Rothwell, J., Smit, M., Bliemel, M., Irvine, D., Kelley, D., Matwin, S., Wuetherick, B., & Ali-Hassan, H. (2015). Strategies and best practices for data literacy education knowledge synthesis report [Technical report]. https://www.researchgate.net/publication/284029915_Strategies_and_Best_Practices_for_Data_Literacy_Education_Knowledge_Synthesis_Report *Tippens Reinitz, B., McCormack, M., Reeves, J., Robert, J., Arbino, N., Anderson, J., Hamman, J., Johnson, C., Kew-Fickus, O., Snyder, R., & Stevens, M. (2022). 2022 EDUCAUSE Horizon Report, Data and Analytics Edition. EDUCAUSE. https://library.educause.edu/-/media/files/library/2022/7/2022hrdataandanalytics.pdf Hot topic and related resources: *Shellgren, M., & Becker, S. (2022, October 12). Gamifying professional learning. The OLC Blog. https://onlinelearningconsortium.org/gamifying-professional-learning/
On today's episode of the Illumination by Modern Campus podcast, host Amrit Ahluwalia was joined by Tony Casciotta to discuss the unique challenges in change management and how to strategically align where your institution is going. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, host Amrit Ahluwalia was joined by Phil Hill to discuss the depth of transformation for higher education over the past decade and the pain points when digesting this change. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, guest host Sharon Schwarzmiller was joined by Ian Wilhelm to discuss the looming challenges in store for higher education and how to stay innovative successfully in a volatile environment. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
This week on SA Voices From the Field, we brought back Dr. Amelia Parnell as we talked about the outcome of the midterm elections and the future of public policy as we end our public policy season. Amelia Parnell is vice president for research and policy at NASPA – Student Affairs Administrators in Higher Education, where she leads many of the Association's scholarly and advocacy-focused activities. Amelia's policy and practitioner experiences include prior roles in association management, legislative policy analysis, internal audit, and TRIO programs. Amelia Parnell writes and frequently speaks about topics related to student affairs, college affordability, student learning outcomes, leadership in higher education, and institutions' use of data and analytics. She is the author of the new book, You Are a Data Person: Strategies for Using Analytics on Campus, and host of the new podcast, Speaking of College. Amelia currently serves on the board of directors for EDUCAUSE and is an advisor to several other higher education organizations. She holds a Ph.D. in higher education from Florida State University and master's and bachelor's degrees in business administration from Florida A & M University. Amelia Parnell is also the host of the Speaking of College Podcast. Speaking of College is a show that answers common questions about college. The episodes are a good fit for anyone who is curious about how college works and how to navigate the college environment. Please subscribe to SA Voices from the Field on your favorite podcasting device and share the podcast with other student affairs colleagues!
On today's episode of the Illumination by Modern Campus podcast, guest host Sharon Schwarzmiller was joined by Allan Chen to discuss the key role IT leaders play, and the importance of understanding processes across campus to grow as an institutional leader. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, guest host Sharon Schwarzmiller was joined by Julie Ouska to discuss the shift in how IT is perceived and the balancing act that occurs when it comes to collective decision-making around new modern learner needs. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, EvoLLLution editor-in-chief and host Amrit Ahluwalia was joined by Katrina Biscay to discuss the importance of information security in a rapidly growing cyber world, and what leaders need to keep top of mind to help secure their institution. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, EvoLLLution editor-in-chief and host Amrit Ahluwalia was joined by Jenn Stringer to discuss the evolving role of technology leaders to empower an institution and the growing responsibilities of CIOs. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, EvoLLLution editor-in-chief and host Amrit Ahluwalia was joined by James Wiley to discuss enrollment woes and the shift in technology for modern institutions. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
On today's episode of the Illumination by Modern Campus podcast, EvoLLLution editor-in-chief and host Amrit Ahluwalia was joined by Lois Brooks to discuss meeting the needs of diverse audiences and competing priorities through technology without compromising security. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
Ravi Pendse is passionate about data privacy. As vice president for information technology and chief information officer at the University of Michigan, he has worked to ensure that privacy is a part of every technology decision on campus. At the same time, he is committed to fostering a robust data culture that democratizes the use of data to inform decision-making. At the center of that culture is transparency: making sure students, faculty, and staff know exactly what types of data are collected, and how that data is stored, accessed, managed, and shared. In this episode of the podcast, we talk about creating a data-aware, privacy-aware ecosystem, data governance challenges, making data visible to students, and more. Resource links: University of Michigan ViziBLUE portal Educause 2020 Student Technology Report Music: Mixkit Duration: 33 minutes Transcript
Technology and new platforms are among the most exciting parts of any convention, and Educause 2022 was no different. In our second episode of coverage, direct from the Educause show floor, we take a stroll down Startup Alley and meet a fantastic cast of talented characters who each have a platform you should know about for your institution. From engaging students more deeply to eliminating grading biases — these are just some of the glimpses of the Higher Ed future we'll hear about today. Join us as we stop by the booths for: Ribbon Education, Read.AI, Public Insight, Pressbooks, Level.io, Wildflower Education, Interact123, Gyan.AI, Rah Rah, and Kritik. We had a great time chatting with all of this episode's interviewees and cannot thank them enough for the time they gave us to share their insights and platforms! To hear this interview and many more like it, subscribe on Apple Podcasts, Spotify, or our website, or search for The Higher Edge in your favorite podcast player.
On today's episode of the Illumination by Modern Campus podcast, EvoLLLution editor-in-chief and host Amrit Ahluwalia was joined by Tom Andriola to discuss how higher ed institutions need digital excellence to thrive in this rapidly growing and evolving digital world. This episode was recorded live at Modern Campus's Educause 2022 booth in Denver.
Did you miss out on Educause 2022? Fear not! We saddled up our mobile podcast rig and took it to the floor of the show where we went in search of some of the greatest partnerships in higher education. Join us as we bring you a trio of success stories, direct from the floor of the conference. We'll hear from: Ken Connelly and Zach Meyers about how Breakpoint Labs has helped the University of Northern Iowa better secure their campus both physically and digitally (2:12) Troy Burnett and Deborah Lee and their partnership of engaging more students sustainably across their institution via the deployment of Pathify at Concordia University - Irvine (11:05) Joshua Sine and Jason Hill on their collaboration using Qualtrics to help Utah Valley University monitor university health and cleanliness (22:45) We had a great time interviewing these folks and certainly hope they inspire you to find the Higher Edge at your institution with their innovations. To hear this interview and many more like it, subscribe on Apple Podcasts, Spotify, or our website or search for The Higher Edge in your favorite podcast player.
Jim Siegl, CIPT, is a Senior Technologist with the Youth & Education Privacy team at the Future of Privacy Forum (FPF). For nearly two decades prior to joining FPF, Jim was a Technology Architect for the Fairfax County Public School District with a focus on privacy, security, identity management, interoperability, and learning management systems. He was a co-author of the CoSN Privacy Toolkit and the Trusted Learning Environment (TLE) seal program and holds a Master of Science in the Management of Information Technology from the University of Virginia. Shea Swauger (he/him) is a Senior Researcher for Data Sharing and Ethics at the Future of Privacy Forum. He holds a BA in Philosophy and a Master of Library Science, and is currently working towards a PhD in Education and Critical Studies. A 2021 Educause article showed that while many believe students who have grown up with technology are not concerned with the privacy of their data, research shows otherwise. A 2018 Pew Research Center survey found that almost all Americans 18 to 24 years old utilize social media, ranging from YouTube to Twitter. A 2019 Inside Higher Ed survey found that 46% of faculty taught at least one course online. As technology within education continues to grow, and the world becomes more connected through the internet, student privacy will continue to be a topic of discussion. In the U.S., specific laws and regulations aim to protect student privacy, such as the Family Educational Rights and Privacy Act (FERPA). Learn more about CITI Program: https://about.citiprogram.org/
Dr. Veronica Diaz is director of professional learning at EDUCAUSE, an international nonprofit professional association whose mission is to advance higher education through the use of information technology. In her role, she ensures that learning and development programs support and advance the overall strategic priorities of the professional learning, member communities, and research teams. She also supports the development and design of career pathways that guide the professionals supporting and transforming higher education through the innovative use of technology. In this episode of the Leading Learning Podcast, co-host Jeff Cobb talks with return guest Veronica about microcredentials, mentoring, professional pathways, personalization, the need to reevaluate COVID pivots, and a “three Cs” approach to sourcing your portfolio of offerings: curate, create, and commission. Full show notes and a transcript are available at https://www.leadinglearning.com/episode308.
Recorded April 22, 2022 After over a month away, we're back and have all sorts of things to talk about… but not on this episode. We start with more data from EDUCAUSE that suggests leadership should talk with their local IT staff who support the learning spaces on campus. It's a good article from a trusted source to forward on to your managers and directors. Then, what's all this event support we keep getting roped into? Can we build our learning spaces to support events or will users just want something else in addition? Marc wants to clearly define the support scope, Justin keeps running into his video production colleagues, Jamie is way ahead of the game with aux I/O in every classroom, and Chris would be happy if people would just stop putting gaff tape on his microphones. Finally, we all go over our clever little ways to avoid the dreaded capital asset tags. At least technology is getting cheaper… Article discussed: https://er.educause.edu/articles/2022/4/educause-quickpoll-results-learning-spaces-transformation
One of the takeaways from this year's Educause Horizon Report is that there's likely no such thing as a return to normal: Many of the changes that higher education has undergone over the past couple of years are here to stay. At the same time, the trends, technologies and practices impacting teaching and learning have developed more depth, more nuance. For example, while hybrid learning was addressed broadly in last year's report, this year the report drilled down into important facets such as hybrid learning spaces and professional development for hybrid teaching. To delve into the details of what's on the horizon for higher ed in 2022, we spoke with Kathe Pelletier, director of Educause's Teaching and Learning Program and co-author of the report. Resource links: 2022 Educause Horizon Report: Teaching and Learning Edition Educause Showcase: The Digital versus Brick-and-Mortar Balancing Game Campus Technology Leadership Summit: Building a Digital Transformation Strategy Music: Mixkit Duration: 39 minutes Transcript
Welcome to Episode 46 of the Canvascasters Podcast. Today we are "discussing discussions," which is one of the most powerful tools for learning within Canvas LMS! In this episode, we dig into the power of Discussions in K12 and Higher Education, fun ideas to innovate with discussions, and best practices from the experts as we are joined by Melissa Greer & Linda Lee. Melissa Greer began her career as a high school Latin teacher for the Newton College and Career Academy and STEM Institute in Covington, GA., where she, herself, attended growing up. This year she moved into a Digital Learning Coach role where she supports teachers at 3 middle schools. Currently, she is working on getting her Ed.D. in Curriculum and Instruction with a concentration in Instructional Technology from Valdosta State University. In her infinite free time (*jokes*) she likes to hang out with her boyfriend and dog, travel, cross stitch, decorate custom cookies, and just be a nerd (listen to podcasts about and play D&D, Star Trek, Battlestar Galactica, Star Wars, books by Brandon Sanderson, etc.) Linda Lee is a folklorist, educator, and instructional designer. She's the Director of Instructional Design at the Wharton School of the University of Pennsylvania, where she leads instructional design and training efforts for the IDEA Courseware Team. She came to instructional design by way of adjunct teaching. She's used Canvas as an instructor since 2014 and as an admin since 2015, when she joined Wharton as the school's first instructional designer. Since then, she's developed deep expertise in inclusive course design and all things Canvas, including Blueprint courses, discussions, and group assignments. She's presented widely on these topics, including at InstructureCon 2021 and Educause 2021, where she co-presented sessions on "Change the Prompt, Not the Tool: Developing Effective Discussions" with Penn colleagues Meryl Krieger and Adam Zolkover. ________________________________________________ Visit bit.ly/CCE_faqs for more information about the Canvas Certified Educator Program. ________________________________________________ Follow Melissa Greer at @greerslatin on Twitter Follow Linda Lee at @lindajeanlee on Twitter ________________________________________________ Learn more from Linda and Friends: Change the Prompt, Not the Tool: Developing Effective Discussions | Structuring Asynchronous Discussions | InstructureCon 2021 Video: https://www.instructure.com/canvas/resources/instructurecon-2021/change-the-prompt-not-the-tool-developing-effective-discussions Handouts link: http://whr.tn/educause-async-disc ________________________________________________ MUSIC PROVIDED BY: Finding Happiness by Dj Quads http://soundcloud.com/aka-dj-quads Music provided by Free Music for Vlogs https://youtu.be/Yh9fk9iLR4s --- Send in a voice message: https://anchor.fm/canvascasters/message
Welcome to Episode 45 of the Canvascasters Podcast. Today we are "discussing discussions," which is one of the most powerful tools for learning within Canvas LMS! In this episode, we dig into the power of Discussions in K12 and Higher Education, fun ideas to innovate with discussions, and best practices from the experts as we are joined by Melissa Greer & Linda Lee. Melissa Greer began her career as a high school Latin teacher for the Newton College and Career Academy and STEM Institute in Covington, GA., where she, herself, attended growing up. This year she moved into a Digital Learning Coach role where she supports teachers at 3 middle schools. Currently, she is working on getting her Ed.D. in Curriculum and Instruction with a concentration in Instructional Technology from Valdosta State University. In her infinite free time (*jokes*) she likes to hang out with her boyfriend and dog, travel, cross stitch, decorate custom cookies, and just be a nerd (listen to podcasts about and play D&D, Star Trek, Battlestar Galactica, Star Wars, books by Brandon Sanderson, etc.) Linda Lee is a folklorist, educator, and instructional designer. She's the Director of Instructional Design at the Wharton School of the University of Pennsylvania, where she leads instructional design and training efforts for the IDEA Courseware Team. She came to instructional design by way of adjunct teaching. She's used Canvas as an instructor since 2014 and as an admin since 2015, when she joined Wharton as the school's first instructional designer. Since then, she's developed deep expertise in inclusive course design and all things Canvas, including Blueprint courses, discussions, and group assignments. She's presented widely on these topics, including at InstructureCon 2021 and Educause 2021, where she co-presented sessions on "Change the Prompt, Not the Tool: Developing Effective Discussions" with Penn colleagues Meryl Krieger and Adam Zolkover. ________________________________________________ Visit bit.ly/CCE_faqs for more information about the Canvas Certified Educator Program. ________________________________________________ Follow Melissa Greer at @greerslatin on Twitter Follow Linda Lee at @lindajeanlee on Twitter ________________________________________________ Learn more from Linda and Friends: Change the Prompt, Not the Tool: Developing Effective Discussions | Structuring Asynchronous Discussions | InstructureCon 2021 Video: https://www.instructure.com/canvas/resources/instructurecon-2021/change-the-prompt-not-the-tool-developing-effective-discussions Handouts link: http://whr.tn/educause-async-disc ________________________________________________ MUSIC PROVIDED BY: Finding Happiness by Dj Quads http://soundcloud.com/aka-dj-quads Music provided by Free Music for Vlogs https://youtu.be/Yh9fk9iLR4s --- Send in a voice message: https://anchor.fm/canvascasters/message
In this episode, we'll discuss the do's and don'ts of performance reviews. How can leaders approach this important opportunity for feedback in a way that promotes growth, productivity and retention in their teams? And how can employees utilize performance review feedback to pursue personal goals and advance their careers? Our host Dan Link talks with Martin Muff, Senior Vice President at 5/3 Bank, and Nichole Arbino, Communities Program Manager at EDUCAUSE. They will provide insight from the point of view of a manager and an employee on ways to ensure a positive performance review experience.
Recorded December 10, 2021 It's a prediction show! Educause suggests 10 IT-related trends we should all be aware of, and we decide they're not in order, so we're starting with number 6. As with anything from Educause, it's highly researched and detailed, but often just a little too milquetoast for our tastes. Then each of the SuperFriends has a prediction for 2022. Some of us are suggesting general technologies (blockchain, AVoIP), others are focused more on the adoption of various technologies, and Larry is concerned that after almost two years of working, teaching, and learning from home, "good enough" will be seen as good enough and cause all sorts of integration and quality issues. Have we really lowered the bar that much? (Yes). Article: https://www.educause.edu/research-and-publications/research/top-10-it-issues-technologies-and-trends/2022
In this episode, President Joe Gottlieb welcomes Colleen Baker, co-founder and COO of Higher Digital and the creator of (SEA)change. During the episode, Joe and Colleen revisit their findings at EDUCAUSE 2021 and dive into how (SEA)change can unlock the future of strategic change management.
In this episode, Joe and Wayne discuss the 2021 EDUCAUSE Digital Transformation Survey. Listen in for an interesting conversation about how higher education institutions are, and are not, adopting digital transformation strategies as we return to near normalcy on college campuses around the country.
The InPursuit Podcast: Insights from the Education & Workplace Lifecycles
Join Rachel and me as we gain insights and inspirations from Dr. Amelia Parnell on the current state of Higher Education and the value of inclusivity with our teams. Amelia Parnell is vice president for research and policy at NASPA – Student Affairs Administrators in Higher Education, where she leads many of the Association's scholarly and advocacy-focused activities. Amelia's policy and practitioner experiences include prior roles in association management, legislative policy analysis, internal audit, and TRIO programs. Amelia writes and speaks frequently about topics related to student affairs, college affordability, student learning outcomes, leadership in higher education, and institutions' use of data and analytics. She is the author of the forthcoming book, You Are a Data Person: Strategies for Using Analytics on Campus, and host of the new podcast, Speaking of College. Amelia currently serves on the board of directors for EDUCAUSE and is an advisor to several other higher education organizations. She holds a Ph.D. in higher education from Florida State University and master's and bachelor's degrees in business administration from Florida A & M University. Contact her on LinkedIn at https://www.linkedin.com/in/amelia-parnell-0913735b/ --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app
Episode Overview On episode 10 of EdUp EdTech, I had the pleasure of speaking with Stephen Laster, Chief Product Officer at Ellucian. "Inspired by the transformative impact of education, Ellucian develops solutions that power the essential work of colleges and universities. As the world's leading provider of software and services designed for higher education, Ellucian works with more than 2,500 institutions in nearly 50 countries—enhancing operations and enriching the experience for over 18 million students." We had a wonderful conversation surrounding higher education and what Ellucian is doing to propel institutions forward as they step into a world of change post-COVID. A big announcement about their new learning platform drops in this episode. Check out the announcement article. About Stephen Laster As senior vice president and chief product officer, Stephen J. Laster leads all aspects of Ellucian's product direction and strategy, across the product and research and development organizations. Stephen is a seasoned technologist, business leader, strategist, e-learning designer, and educator with an extensive background in higher education. He comes to Ellucian from Penn Foster, where he served as chief digital and learning officer. In this capacity, he led the technology, courseware, and education practices to advance the impact and engagement of the organization's learning experiences, both in pursuit of learner success and to advance the technological capabilities of the firm. Prior to Penn Foster, Stephen served for more than a decade in technology leadership roles related to higher education, including as chief digital officer at McGraw-Hill Education, chief information officer at Harvard Business School, and director of curriculum innovation and technology at Babson College. His expertise in this field is widely respected and has led to advisory positions for organizations such as EDUCAUSE Corporate Advisory, IMS Global, Microsoft Higher Education, the Online Learning Consortium, Preserve, Inc., and Babson College. A frequent commentator on education technology and innovation, he has spoken and written articles for ASU GSV, EDUCAUSE, IMS Learning Impact, and Campus Technology. Stephen holds a bachelor's degree from Bowdoin College and an MBA from the F.W. Olin Graduate School of Business at Babson College. How to Connect Connect with Stephen on LinkedIn & Follow Ellucian on Social Media: Facebook | LinkedIn | Twitter | YouTube Connect with the host: Holly Owens Join the EdUp community at The EdUp Experience! Follow the EdUp Experience on Facebook | Instagram | LinkedIn | Twitter | YouTube We Make Education Your Business! --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app
Cybersecurity is a complex topic to understand. A plethora of standards, frameworks, processes, tools, conferences, presentations, courses, and even certifications may help but can also be a bit overwhelming. This is where the community comes in. Understanding the threats and the risks faced by a university is one thing; understanding the higher education system's inner workings and how to mitigate the risks are other challenges, yet all part of the same problem and solution -- a complex one. Trying to put all these things together successfully can represent a challenge even for the most seasoned InfoSec professional. So, how do the security leaders from many universities work to overcome this? They join forces, share information with each other, brainstorm solutions, and lend a helping hand to each other. They don't just do this haphazardly, however. There's a coordinated, singularly focused program designed to make this happen: the EDUCAUSE Cybersecurity Program. From the EDUCAUSE website: Through the EDUCAUSE Cybersecurity Program, you can find the tools, resources, and peer connections you need to learn about, better understand, or help promote information security and privacy to everyone across your campus, including institutional leadership, students, faculty, staff, and external partners. Establishing effective information security and privacy programs can help your college or university understand the risks it faces, the secure methods it should use, and the precautions it should take to keep campus constituents safe and institutional data protected. In turn, these programs can help you identify your key messages, know who your audiences are, and determine how and when you will communicate with these audiences. We are stronger together, and this program offers a chance for multiple universities to come together to share ideas and best practices, mentor each other, and raise the InfoSec bar across the entire higher education ecosystem. Have a listen to learn more about how the program works. Guest: Brian Kelly, Director of the Cybersecurity Program at EDUCAUSE (@bkinct on Twitter) This Episode's Sponsors: Bugcrowd: https://itspm.ag/itspbgcweb Devo: https://itspm.ag/itspdvweb Resources: Learn more about EDUCAUSE: https://www.educause.edu/ (@educause on Twitter) Learn more about the Cybersecurity Program: https://www.educause.edu/focus-areas-and-initiatives/policy-and-security/cybersecurity-program Professional Learning Catalogue: https://catalog.academy.educause.edu/ Mentoring Platform: https://mentoring.educause.edu/ To see and hear more of The Academy content on ITSPmagazine, visit: https://www.itspmagazine.com/the-academy Are you interested in sponsoring an ITSPmagazine Channel? https://www.itspmagazine.com/podcast-series-sponsorships