Summit Recordings Available
Videos capturing the 4th Learning Analytics Summit are now accessible to all with Kaltura.
Description of the video: Well, hello everybody. I'm really delighted to talk with you about why I'm excited about learning analytics and to share some of the work my colleagues and I have been doing to intentionally empower our faculty to transform student success. What to expect over the next 50 minutes or so: in this session, you'll be invited to contribute to our collective learning and ideas around learning analytics in the different contexts that you represent or have experienced, and hopefully we'll surface some potential communities for doing learning analytics work. I'll also spend part of the time sharing my story, the approach that my team and I have taken to bring learning analytics to our institution, and some of our initial lessons learned. And I certainly would love to hear from you and learn from your experiences as well. After the session is done, I will share a copy of the slides, which has a very long list of references at the end, if you want to go back and look through it later on. So let's go ahead and get started. I'll be sharing this in three parts. Part one is just a bit of background in terms of my journey into learning analytics. First and foremost, what guides my work, what's my North Star? I really believe in the power of education and the potential that every learner brings as they enter the educational space. And the vision for the work of our center is to really empower faculty to contribute to the success of all of our students. I use several lenses that guide my thinking and my approach to the work. One is systems thinking, which recognizes the connections and relationships between different individuals, different offices, and different activities in service of a goal.
I often take an entrepreneurial mindset to my work, where my curiosities can turn into thinking about connections and then lead to creating value in our efforts and activities. I also seek to incorporate empathy and an empathy mindset as I work with individuals and groups in the work that our office does. Additionally, I want to share some of the contexts that have shaped my perspective so far, so you understand where I come from and how that informs the work that we do. As Linda mentioned in her introduction, I serve as an educational developer in our Center for Teaching and Learning. Our center is what we call a center of one, in that I don't have a large group here. Beyond my center, I collaborate with colleagues across other offices that have different roles and focuses around supporting the academic mission of our institution. Our institution is a public regional comprehensive institution that is also a primarily undergraduate institution, with close to 8,000 students, 77% of whom are undergraduates. I currently am, and have been, a faculty member at multiple institutions, so I know what it's like in different ways to be a faculty member, and I remind myself of what that means as I work with faculty. As Linda mentioned, I also had the pleasure of serving as a program officer at the National Science Foundation, where I focused on a variety of STEM education programs. What that afforded me is a view of the whole range of different projects, activities, and goals that researchers, faculty members, and administrators have to support student learning and student success in different ways, which has been so eye-opening. My training is in environmental engineering, and certainly that lens comes out at times. I'm also a researcher within my discipline, as well as across different education, STEM education, and faculty development projects.
I just wanted to share that as some background. So what brought me into learning analytics? I would say I had the good fortune to have a variety of serendipitous encounters that have become the inspiration for some of the work we're doing currently at USI. Some of these came through attending conferences; some were just from exploring things that came across my inbox through newsletters and other channels; and some from the power of the Internet, where you find one thing and it leads you to another wonderful website and more information. I want to highlight a few of these. One is the work at the University of Kansas, and I think some of the colleagues at KU are part of the LA Summit this year and were in past years' summits. I was really inspired by the way they use data to prompt inquiry from faculty, and use that to have groups of faculty and departments look at ways to change their courses and curricula; they have had several programs to do that. Likewise, there is the work at IU that Linda and George have led so wonderfully through the Learning Analytics Fellows and other programs, providing systematic support and data to allow faculty and staff to explore research questions that have a data component, with an eye towards student success and student learning. There is other work, such as that at the University of California, Davis, where Marco and his team have developed what I consider really innovative datasets and visualizations for their faculty and their campus with a focus on equity. Similarly at UNC Chapel Hill, Viji and Kelly have been part of creating a dashboard that connects learning analytics data with inclusive teaching practices. There are many other examples, but these have been starting points for my inspiration into learning analytics.
I saw these examples as great ways to provide opportunities to bring data to inform action and to motivate change, to make improvements in our courses to support student success. As I started digging a little deeper into learning analytics, there is certainly, as you might all know from your own work and explorations, a wide variety of examples of different types of learning analytics activities, projects, and programs. Some of them focus on institutional academic data: data that accrues from one semester to the next and across a student's time at an institution. Examples are the first three that you see here on the slide. Then there's additional data that looks within the course management or learning management system (LMS); that is really at the course level. For the work at USI, we've been focusing more on academic and institutional data, the sorts of things you see highlighted in those top three bullet points. We have not yet dug into the more LMS- and course-level types of data. So I want to take a pause and invite you to share what has brought you to learning analytics, whether you're just starting your work and explorations or already have some activity in learning analytics. Since there are several of us here, maybe we can use the Zoom chat for you to share your thoughts and ideas. So let's chat, and I'll be monitoring the Zoom chat. I see Bill and Casey commenting on being able to understand student hurdles for success and institutional supports. Definitely. Bill is one of my colleagues on the project that we'll talk a little more about, so thanks, Bill, for offering that. And Casey, good to see you. Thanks for sharing your perspective and the work that you do.
Thanks, Stephen, for your contributions about your work on cloud technologies for storage and data processing. That just makes it easier to go through all of the data that might be available and to make meaning of it; that's really important work. Thanks, too, for sharing the work in educational development and thinking about how to leverage learning analytics, and for the interest in prediction of students at risk. And thanks for the comment about taking over some work that a colleague who has left had started; that's exciting. And hello, and thanks for your interest in hearing about what others are doing; I hope we'll all get to learn about that across different contexts. Jennifer, thanks for sharing that you want opportunities to work with other equity researchers in creating dashboards; that's great. And Marlene, thanks for sharing that you're using ExamSoft and the data around it in the context of medical school. Jean and Linda, thanks for sharing what brought you to learning analytics, too. So we can see there is really a variety of interests and different ways that each of us has come into being interested in and working on different aspects of learning analytics. Thanks to everyone for participating in that. So let's move on to part two, where I'd like to share the current project my colleagues and I have been working on, where our focus is building capacity for change, with learning analytics as one of the key ways towards that goal. I mentioned some of the learning analytics programs that I learned about at other institutions, as I shared just a few slides ago. Many of those are at large research institutions, and we were thinking about how to adapt those types of programs to our context as a smaller institution (not very small, but smaller) that also has a slightly different focus as a primarily undergraduate institution.
Some of the things we saw as challenges to us jumping in and providing, say, an equivalent of a Learning Analytics Fellows program, or some of the other examples I shared, is that at the moment we just don't have broad availability of and access to the academic institutional data, whether it's the raw data itself or visualizations that faculty can directly use and take action on. We have a small institutional research office, small but mighty, and the work they do is terrific. Each semester they provide dashboards on student enrollment, student retention rates, and graduation rates that we can also disaggregate. But so far they haven't had the availability to create additional data that's more at the course level, or that relates to the programmatic, college, or institutional levels. They have interest, but they don't have the bandwidth to take on those additional areas of work at the moment. And then there are some other challenges, like just not knowing whether we have a ready-made community to explore learning analytics from the faculty perspective. From just casual conversations and interactions, we know that faculty have a really wide range of areas of interest and question topics around learning analytics. So there's just a whole lot going on, and we've got to start somewhere. So we really wanted to take a capacity-building approach to our work. What you see in the slide here is a representation of the Appalachian Regional Community Consortium's multidimensional framework for capacity building. In their framework, they look at capacity building across different dimensions: capacity type and level. They also look at the project stage, and they look at the change outcomes. And so, for the project that we have ongoing and that I'll describe now, these are the different areas of capacity that we're seeking to focus our attention on at the moment.
So with that, and inspired by the work at other institutions, we went ahead and wrote and submitted a proposal to the National Science Foundation under their Improving Undergraduate STEM Education (IUSE) program, and we were fortunate enough to receive a grant to embark on this capacity-building project, which has four main focus areas. One focus area certainly is looking at student success in introductory and gateway-level STEM courses, because we know that if students are not successful in these early courses, it makes it really challenging for them to continue on and be successful in their programs. Another area of focus is that we want to develop some actual dashboards for faculty use, but with faculty as co-designers of those dashboards. We also want to find different ways to engage faculty and build a community of faculty who are interested in and curious about different aspects of student success and student retention. And then we have theories of change for how we can really move towards the vision that we have, and we want to be able to test our hypotheses and theories of change. So what are our assumptions and hypotheses for this project? One is that we want to provide multiple ways for faculty and others to engage in the work, which we hope will cultivate motivation and longer-term efforts. We also recognize that having data available alone will often not be sufficient to drive change, so we want to intentionally foster how everyone can connect their own perspectives with the data and with their experiences. And we also thought that a systems thinking approach, where we connect different pieces together, is a way to organize our efforts, and that helping to facilitate systems thinking amongst our faculty participants would also be useful in our long-term efforts. So what's the framework for our project to build capacity towards change?
You can see there on the top of the slide, highlighted in red, the three main themes of our project: engagement, data, and community, with stories built in as threads that we seek to weave across the different activities. With that, we hope to motivate interest in eventually taking part in testing out changes and transformations in individual courses, curricula, and policies, and in the way we all think about supporting our students' success. That was informed by a variety of theories of change. One that's often cited is expectancy-value theory of motivation, and there's a variant of it that also considers cost. Expectancy is the belief that it can be done and that one can do it, and value is the perceived value for the individual. Again, I have a set of references in my slides, and if you're interested in any of these papers, you can find them as references when those slides are made available. Other theories have to do with the diffusion of ideas into practice and the power of what Roxå and Mårtensson call significant conversations and significant networks. We also see that storytelling can be powerful in motivating change. So it's really these different theories that have informed the way we approach our project. There's also Kotter's eight-step change model, from which we take elements, particularly building coalitions, removing obstacles, and generating short-term wins. I mentioned that one focus area, or hypothesis, is that we want to offer multiple points of entry for our faculty, and so we have different activities at different levels of engagement. One occurs every semester and is for a broad group; you can see these presented in the blobs on this diagram here. These are what we call the mini activities. Mini activities are short: they're about 15 minutes.
They occur during our college meetings. Ours is the STEM college, our science, engineering, and math focused college, and all the faculty participate because it's a meeting called by our dean. During that meeting, we have the opportunity for a 15-minute activity. What the activity usually includes, and has included each semester, is a brief, fairly simple data visualization or data table that looks into some aspect of student success, so academic and institutional-level data within the college, plus a few short prompts for individual faculty to consider. We've been doing these over the past few semesters. Faculty are then grouped into small groups in Zoom breakout rooms, and they have a chance to chat briefly about what they see and some of their ideas and responses to the prompts, and this all occurs within 15 minutes. So again, really mini activities, but they give folks a small taste of learning analytics and also provide an invitation. We say: if you're interested in these conversations, and you find that talking about this data and thinking about student success and student learning is something you're really interested in doing, well, we have these two faculty communities available for you to participate in. That's what you see in the other red blobs on the diagram here. So, the two faculty communities. One is Inquiry into STEM Success, where faculty get together, typically every two weeks, and talk about different papers on different aspects of student learning and student success, different models, different interventions, and different research, with the topics driven by faculty interests and conversations. The other faculty community is our Data Tools Co-Design group. They are the ones who work with our institutional research office, our project team, and a consultant that we included as part of our project budget to help design the data dashboards.
They also test the data dashboards and go through this iterative process, again meeting typically biweekly. So we've had these two faculty communities for the past three academic semesters. Early on, the inquiry community focused on compiling a set of questions and "I wonder"s that came from conversations about the different papers we read together and different things that came up, but now we've been moving on to other topics. So those cover different levels of engagement. The other part is that we also want to be sure we have touch points with our department and program chairs. So at least once a semester, we also make brief visits during what's called the Deans' Council meeting to give updates on the project, to share some snippets of what the different faculty communities have been talking about, and to let the chairs know which faculty have been engaged, so they can recognize that amongst their faculty. I think I saw a question from Casey about that, so I'm happy to talk a little more about it; we do try to also engage department chairs, so thanks for that question. And I like the idea of data parties; that sounds great. So, a little bit more about faculty as co-designers. I mentioned those "I wonder"s and questions that came from our first semester of conversations amongst the two faculty communities. That generated 80-plus different questions and "I wonder"s. The team then explored and thematically analyzed those questions. Not surprising is the range of kinds of data that the questions prompt and the types of actions: being curious, wanting to customize a course, informing course redesign, or looking at trends and the impacts of changes.
Those are not surprising, but we want to be sure these are things that our faculty participants were interested in, so we know that our project is driven by their interests, their conversations, and their perspectives, as opposed to solely taking it from the literature or the work at other institutions. And again, I want to remind you of our context: learning analytics for us has been focused on institutional academic data, as opposed to, say, LMS or educational technology data. The other thing that's perhaps different from large research institutions is that all of our courses at the introductory and gateway level are in small course sections, typically fewer than 25 students per section. We don't have those large lecture-hall courses, say for your intro math or chemistry or biology courses. That's important when we talk about data and the size of the course sections. So, our progress so far. I talked about how one major area of capacity building is fostering community across those two communities, which also have a chance to interact with each other at least once a semester, and we do have some faculty who are part of both. Across the three semesters, we've had 20 active faculty members participate in these faculty community meetings that happen typically every two weeks, which is great when you consider all the constraints on our faculty's time. For context, that's out of around 86 or so full-time STEM faculty, so we think that's pretty good for capacity building and the level of engagement. And faculty who have been participating from one semester to the next continue to participate, so we feel like that's an indicator of the value of these faculty communities for our participants. In the mini activities that we hold during the college-wide meetings, there have been on average at least 130 participants.
So we know that we're touching, at least briefly, almost all of the members of the faculty and staff in our college. We also hear anecdotally about offline conversations that happen outside the faculty community meetings, and different action steps that groups have actually taken to further explore some of these topics. In terms of the dashboards that our faculty are co-designing, we have two so far, again working with our consultant and our internal project team. One, inspired by the dashboards at UC Davis and UNC Chapel Hill, is what we call the Who Are Your Students snapshot. This is data provided to faculty at the beginning of the semester with information about different attributes of their students. One particular aspect is the range of student majors in their courses. Certainly for introductory courses, it's not only students in the major who take an introductory, say, math or chemistry course; such a course often has students from across majors, and that can be important for faculty if they want to customize the course examples and the ways they discuss a topic, knowing who the student majors are. That's one dashboard. Another dashboard that everybody's really excited about is what we call the course sequence dashboard. Our faculty participants have identified three course sequences at the introductory and gateway level. Think of something like Calculus 1, 2, 3, or Physics 1 and 2 plus a follow-on course, or General Chemistry 1, General Chemistry 2, and Organic Chemistry 1. We're building dashboards where faculty can see how students performed in the first course and then how they performed in the follow-on course: a kind of matrix of course grades in different course pairs across a three-course sequence.
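As a rough illustration of the idea (not our actual implementation; the course names and records below are made up), a course-pair grade matrix like the one described can be sketched from enrollment records with a simple cross-tabulation:

```python
import pandas as pd

# Hypothetical enrollment records: one row per student per course in the pair.
records = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "course":     ["CHEM 1", "CHEM 2"] * 4,
    "grade":      ["A", "B", "B", "C", "C", "D", "A", "A"],
})

# Reshape so each student has one row with their grade in each course.
pairs = records.pivot(index="student_id", columns="course", values="grade")

# Matrix of grade combinations across the course pair:
# rows = grade in the first course, columns = grade in the follow-on course.
matrix = pd.crosstab(pairs["CHEM 1"], pairs["CHEM 2"])
print(matrix)
```

In a real dashboard, the same cross-tabulation could be recomputed after filtering the records by a student attribute (major, class standing, and so on), which is the disaggregation idea described next.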
This dashboard will also allow users to disaggregate the data across different student attributes and characteristics, to really let them home in on different questions around student success and how that might inform things they could do in their courses or across their programs, such as identifying bottlenecks in course sequences. And that could lead to additional conversations and action around changes to courses, topics, curricula, or policies. We have also seen other evidence of change from our capacity-building project so far. So what's next? One, we want to continue to strengthen and expand our communities. And one thing we haven't had much focus on yet is storytelling: storytelling for our participants in the communities to reflect on their journeys, but also using those stories to share with colleagues, hopefully to motivate others to be interested in different STEM student success topics and to see how data might be really important in the work that they do. So we see stories having different possibilities in terms of their use and their power; it's not just stories using the data, but stories from the participants and the faculty themselves. We also want to find additional ways to empower those who participate in our programs as change agents, because we're really in it for the long haul of thinking about change and transformation. We also recognize the importance, when we provide data, of users being trained to use the data ethically, responsibly, and appropriately, informed by a variety of ethical considerations and evidence-based practices. And in the long run of building capacity: capacity towards what? Capacity so that we can eventually have programs, implemented projects, and support mechanisms for groups of faculty, staff, and students to identify particular areas of change.
The aim is to develop some interventions and changes, test them out within courses and within curricula, and measure the impacts of that. Some possible models include department action teams and some of the programs I mentioned earlier that we were inspired by. But we're still in the capacity-building stage, so there's still lots of work to do. At the same time, we have to consider how we value faculty time, how we recognize their efforts in these types of programs and activities, and the reward structure: not necessarily monetary reward, but perhaps reward in terms of recognition and promotion and tenure evaluation. These are broader conversations to be had with lots of stakeholders. So far, we're a little less than a year and a half into this capacity-building project, so we're not done yet, and capacity building is really always ongoing. But a few lessons learned. Certainly, capacity building takes time. We could have just worked with our institutional research office, and perhaps others, to build a few dashboards and have them delivered ready-made for our faculty, but that wasn't the approach we wanted to take. We knew it was going to take time to have these faculty communities form and read through papers, and to have them work with our consultants and data team to build dashboards. But we think that's going to have long-term positive impacts, in that there's probably more buy-in and engagement during the process and in the outputs and products of these different efforts. That faculty know the dashboards were created by their colleagues, I think, has value too. It also takes time to be intentional and strategic: to provide regular communications and points of contact with our chairs, with faculty broadly, with project teams, and with different groups of faculty, and to communicate in a clear and contextual way.
We want to be able to offer that transparency and also help guide some of the conversations. And then certainly there's this ongoing process to plan, act, measure, refine, and repeat. We did have a logic model and an initial framework coming into the project, and we're further refining how we evaluate and assess our work thus far so that we can move from capacity to implementation. One lesson learned is that having an advisory board, particularly an external board, has been invaluable to our project. We're so grateful to our advisory board; they've provided lots of comments, feedback, and thoughts on the work that we do. Thanks, Catherine, and thanks for the work that you do. We've had a variety of different communities, not only ones we created for our project, but communities that have been supporting it and informing our work. Certainly, working in learning analytics has provided opportunities to engage a variety of different offices and the individuals in those offices. The list you see here is probably not surprising, but it provides another reason to have these conversations and to interact with folks in these different offices around common goals for student success. Externally, there's been a really wide variety of organizations and groups where we've learned from the work of others and been able to share and get feedback on our work. The items you see highlighted are hyperlinked, so in the document with these organizations you should be able to click through to the webpages for these different organizations and entities. And this is not an exhaustive list; certainly, we've interacted with and learned from a lot of other groups too, including some of the organizations that you all represent. So perhaps we have a little time for some more chat via the Zoom chat.
The question for you is: how might learning analytics connect to communities within your institution or organization that you're familiar with or are active in, and also to emerging or new communities that don't quite exist yet? I'd love to hear about the community aspect in your contexts and experiences, so feel free to use the Zoom chat. I know this topic of communities will be the focus of other sessions during the LA Summit, but I'm keen to hear any initial thoughts about existing communities, folks that you work with internally or externally, or ideas for communities that could be useful for your work. Stephen, thanks: at Berkeley you have some audiences, and yeah, definitely the undergrad advisors. There's really, I think, a really broad group there, and at our institution they have a slightly different set of data that could be useful. Yes, the holistic view. That's great, thanks. Carrie, thanks for your question: how do we leverage instructional designers and EdTech? Yeah, that's a good question. At our site, we have a separate office for online learning and another for EdTech, and we do collaborate with them. So far in our capacity building, we have plans, but we haven't yet formally brought in folks such as instructional designers, EdTech, those in our professional advising offices, and other student support offices, as part of that systems thinking that we do want as we move into the next stage. It's useful to have those conversations across different areas, to hear from each other, hear other perspectives, and learn from what's going on in those areas. Brian, thanks, on discipline-based education research, definitely. And actually, I'm not familiar with what was mentioned about Data Wise and data coaches; that's not something I know, but I'm certainly curious, so perhaps a bit later on we can hear a little more.
About that area: Casey, your question about faculty. We do have a broad definition of faculty, though so far the folks who have been engaged are faculty with the more traditional titles: instructor, assistant, associate, and full professor. Because we are a primarily undergraduate institution, we actually do not have teaching assistants, graduate or undergraduate, so GTAs and undergraduate TAs aren't really a population we have. Thanks for sharing that, and thanks for sharing that resource. I hope maybe we'll have time after this third part to follow up on some of these things that you all have offered and asked about. So with that, the final part of our presentation touches on some of the things you all are asking about and mentioning in the chat, which is what we see as potential opportunities and considerations for our learning analytics work. I don't know that these bullet points are anything new and revolutionary, and I think some of the other presentations in the summit this week highlight them too. But for us, we certainly see learning analytics as a great opportunity for collaboration and connection. As participants commented earlier, there are connections with all these different offices and individuals who have different roles and focus areas: certainly our faculty, broadly defined; our colleagues in the academic advising offices; our student support and success offices; our colleagues in the libraries; our EdTech and instructional design colleagues; our registrar's office; our IT; and of course our institutional research office. A lot of that has happened, and we've been able to interact with folks we haven't in the past through our project so far. And so we see that potential.
I think learning analytics, through the data that comes out of it, also provides a way to articulate the vision and values of a department, of a college, of an institution, of an individual faculty member, around student success and equity. It's just another way to highlight that. And certainly the data allows us to check our assumptions and our practices as we foster a culture of data-inspired, data-informed action. There's also an opportunity to bring in students as partners in learning analytics work, and I think we've seen examples of that in the different presentations this week. And, you know, what is the purpose of learning analytics efforts? Certainly to support student success, but what does that really mean in different contexts? These are our initial thoughts on it, and I'm certainly learning more from the presentations offered this week; we may have a little time to hear more about the different values and purposes of learning analytics work. At the same time as these opportunities, there are certainly challenges, considerations, and threats that I think it's important to be mindful of and to consider in our plans, in our implementation, in the way we connect with colleagues, and in how we carry out the work. There have been conversations already in the sessions this week, and in other work, about minimizing harm: how do we best mitigate bias and address the ethical issues around all the different types of data that exist and how we use them? Some of our faculty have expressed concerns about the data, and about whether any of this academic course-level data would become part of faculty evaluations. Our leadership has expressed that that is not the purpose of the learning analytics data; the purpose is to inform decisions so that we can make improvements. But still, these are things faculty are concerned about.
We need to find ways to address that directly so that there is trust in the data and how it's used, from both the faculty and the student perspective. That includes intentional framing and transparency, as well as some more technical issues around data visualization design. There are also issues around data across the institution having different owners, different sources, and different platforms: how do you put all that together in meaningful ways when the data live across different areas of the institution? And certainly we're looking at capacity building. We believe capacity building is ongoing no matter what stage of the work you're in, so there are considerations around how you scale and sustain efforts toward the longer-term goals of institutional change around student success, student retention, other metrics, and student learning as well. So with that, let's chat some more, whether following up on some of the great comments you all have offered in the Zoom chat, or on additional opportunities, as well as the challenges and threats for learning analytics that you're wondering about or working on. In the meantime, let me scroll up to some previous comments. I see an earlier comment: Jose, you mention training around learning analytics. So far we have not deployed and made the dashboards available, because they're still in a developmental stage, but training is an important aspect: training on the technical side, on how to navigate the dashboards, and having very clear information on the dashboards, where we define the different variables, define the different data elements, and provide context; but also training on handling the data, on responsible, ethical use of the data, data access, and things around that. That's going to be a really important effort, and we look forward to learning from the community around it.
I see a question about assessment data and self-study data. I've heard that question at other institutions, and we haven't quite wrangled with it yet. We recognize that there are tie-ins with existing self-study data, say around program assessment or internal program reviews, or other types of assessment data. I think that relates to the different sources of the data and where they might exist. From a systems thinking point of view, there's so much data that exists already; how we make the best use of it is a really big question we're still wrapping our heads around. Yeah, thanks for your comment there about applying and validating in technological environments; there are certainly a lot of different areas there. Fairness, yeah. I wonder if fairness also relates to equity concerns and issues. That's something of a concern, and actually, when we submitted our proposal to the NSF, one comment that was raised was: how do we mitigate bias and misuse of the data? So we've really been grappling with what kind of data is available in these dashboards. We know that disaggregating the data can be really important for making equity issues, concerns, and gaps visible, showing where there are differences and how we can help support students. The issue we have is that our section sizes are small, typically fewer than 25 students, as opposed to large lecture halls, so disaggregation for certain attributes wouldn't be feasible. There's certainly a balance there. That's also a reason for our use of storytelling. Initially we were thinking about how to create stories from the data, but then we recognized the data is typically student academic and institutional data about our students. So right now we're holding off on creating stories from the data because of that potential for bias, which goes against our goals.
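The small-section disaggregation concern described above is often handled with a minimum cell size rule: any subgroup smaller than a chosen threshold is suppressed rather than reported. As a rough illustration only (the threshold, field names, and helper are assumptions, not the speaker's actual dashboard logic), a sketch in Python:

```python
# Hypothetical sketch of a minimum-cell-size suppression rule for
# disaggregated student data. The threshold and field names are
# illustrative assumptions, not an actual institutional policy.

MIN_CELL = 10  # a common small-cell threshold; institutions choose their own

def disaggregate(records, attribute):
    """Count students per group, masking any group smaller than MIN_CELL."""
    counts = {}
    for rec in records:
        key = rec[attribute]
        counts[key] = counts.get(key, 0) + 1
    return {k: (v if v >= MIN_CELL else "<suppressed>") for k, v in counts.items()}

# A 22-student section: the 3-student group is masked, the 19-student group shown.
section = [{"first_gen": "yes"}] * 3 + [{"first_gen": "no"}] * 19
print(disaggregate(section, "first_gen"))
```

In sections under 25 students, as mentioned above, many attribute combinations would fall below any reasonable threshold, which is the feasibility problem the speaker describes.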
That's why we're really steering our storytelling efforts toward our faculty who have been participants in our activities, using storytelling for their reflection, to document their journey, and then having ways for them to share that with their colleagues. So it's about themselves, as opposed to telling a story from the data about students. That's kind of our current thinking about it, and I'm curious to hear other thoughts that you all have. "I have a question. This is Casey. Can you give us a little more insight into the coalition building you did on your campus? Who's part of your project team? Who are the stakeholders that are part of this process, either helping, needing to be informed, or part of the decision-making? What does that look like?" Yeah, thanks for that question. Actually, this next slide gives a little more insight. Our project team, meaning those who were part of our NSF grant, are those you see listed there. That includes our Associate Provost for Academic Affairs, who also happens to have a STEM background; our colleagues in our institutional research office; Bill, who's on the call, a department chair in STEM and also a faculty member in STEM; the dean of our STEM-focused college; and our consultants, who are helping to build out the dashboard, which is really the majority of the project funds. So that's the core team from the project standpoint. But we also consider core to our project our participants and our faculty communities, and you can see the range of faculty members there who have been actively participating across semesters. They are faculty at all different ranks and levels, from first-year faculty to faculty who have been at the institution for 20-plus years.
So really a wide range, across the different departments and disciplines in our STEM college. All the work has been in our STEM college, in part because it's a pilot project, but also because of the source of our funding, the NSF. But the institution recognizes that this pilot, we hope, can be expanded to our other colleges: our liberal arts college, our business college, our health professions college. And certainly our institutional research office is very interested in taking the dashboards and expanding them across the different departments and areas. So it's really a broad range. As a previous question asked, our next step, and it's in our plans, is to bring in colleagues in the different offices, like EdTech, the different academic support offices, areas that work in diversity, the registrar and enrollment; really a wide range there. So far our strategy has been to take baby steps and really build a core group of faculty, but also to help them become familiar with our colleagues across offices. Another important part of our project team is our external advisory board; the names are listed here. I want to give special recognition to George, who is hosting this, as well as Linda, who is also helping to facilitate this session. They, along with our other colleagues, have been instrumental in providing really great feedback and really valuable information as we move this project forward. So let me stop there, unless you have a question.
Description of the video: Well, I want to introduce Paul to the folks here. I first met Paul here at Indiana University, Bloomington, when he came to join us. He had been at the Pennsylvania State University teaching large introductory undergraduate courses since 1998, and in 2008 he came to Indiana University. One of the first things Paul did, actually in the summer as he arrived, was come to our teaching and learning center, and I thought that was wonderful, just that in itself, and it helped him develop some online courses. Later on he got involved in our learning analytics fellows program. So anyway, it's great to have you here. Paul Graf has been promoted twice in the economics department, most recently to teaching professor. He serves as the economics department's teaching specialist, the business principles course coordinator, and the undergraduate intern coordinator. He was awarded the Mumford Excellence in Extraordinary Teaching award in 2021 and a Trustees' award for non-tenure-track faculty in 2015, and he was a learning analytics fellow from 2015 to 2020. I really appreciate all the work you've done, both in the economics department as a teacher and as one of our very influential learning analytics fellows. So Paul, I'll turn it over to you. "Great, thank you so much, George. I really appreciate that. I should just stop right there; I think it's all going to go downhill from here, basically, but thank you. Thanks for coming and taking the time. As I was joking earlier, I was looking at the agenda and I think I'm the last person, so I'm keeping you all from probably better things, and I'll try to be efficient with my time. But again, thank you all for coming. As George said earlier, my name is Paul Graf."
I'm a teaching professor at Indiana University, and I'm going to share with you what's happened in my journey, I guess I'll call it, with learning analytics, and what it means for our department in terms of the action we took, ultimately for student success. I have a lot of people to thank. George, who you can see is also listed here, as he mentioned earlier, we've gotten to know each other over the years. The Center for Learning Analytics and Student Success (CLASS), and their former Vice Provost, Dennis Groth, who was really the one that got this off the ground for us at Indiana University. The folks at BAR, the Bloomington Assessment and Research office (I always want to say "reports"): Mike, Harsha, and then obviously Linda Shepard, the people behind the scenes who helped me with a lot of that data, whether it was gathering specific things; they were very helpful, to say the least. So a lot of thanks to them. And my colleagues: Professor Glomm and some grad students. Professor Glomm in particular I'll refer to in our time together, because he became a coauthor; he would reach out, and the people at BAR would say, well, you have this person in your department working on the data, you might want to go talk to Paul. And finally, a former grad student who helped me out with the more recent project, Nikita Lopatin, now at Ashland University. So once again, I couldn't be here talking to you all and enjoying the opportunity to share the success we've had if it wasn't for these folks. Once again, thank you all for that. Also, as George said, I don't know if I'd call myself influential; I often joke with George, are you sure you want to have me come back, because I keep doing this. I've been part of the learning analytics organization for quite a while.
I've walked away from it a little bit, and for good reasons, but I noticed that when I shared my results with my colleagues, they got involved. I think there was a year in particular when we had three or four of them present. So many of us in our department got involved with learning analytics in part because of our experiences with it. Even one of our grad students, and that was probably one of the more satisfying things: a grad student did her dissertation in learning analytics, on retention or perseverance, I think it was, on students who are econ majors and how they persevere through their majors and whatnot. Her name is Morgan Taylor, and she's now an instructor at the University of Georgia. What George maybe did not mention is that my primary role as an instructor is to teach large classes, and Morgan's now doing that over at Georgia. So again, that was probably because of the learning analytics opportunities during my time working with the data. As I said earlier, Professor Glomm, for example, who was department chair at the time, when he found out I was playing with the data, so to speak, would often come in and say, "Paul, tell me about the transfer credit, what's going on?" So having access to that data was really helpful for our departmental administrators; I could bring up the data on, for example, how a subset of students was performing in our introductory courses. Having that opportunity to be a resource for my department was really helpful, and it even went so far as to spread out to the entire department. I'm not sure of the exact reason why, but I'm non-tenure-track, so I basically teach the large classes and have no research responsibilities. I guess I can say thanks again to George, or maybe blame George, in the sense that I got involved because of the Center for Innovative Teaching and Learning.
The hook, the meeting that got me involved, was an event where they said, you know, Indiana University has a lot of institutional data, and if anyone wants to play with it, oh, and by the way, we've got this data visualization tool called Tableau. And I'm thinking, well, I don't do research, but this would be fun, to get back into research a little bit. Having that opportunity, and then being able to share it at departmental meetings and use the data visuals, was really helpful in that regard. I found not only was I getting benefit from experimenting and playing around with data, which I had not done in quite a long time, since I was a grad student, but I also simply had more questions whenever I played with the data. And then there were these spillover effects, as we say in my profession, into departmental meetings and so on, which was just really beneficial. Then I remember a couple years ago I got invited to the SEISMIC group, the Sloan Equity and Inclusion in STEM Introductory Courses collaboration. They revealed what I thought was a really interesting data point, one of the analytics: grade surprise. When a student experiences a grade in a particular class relative to some sort of average, it's either a positive surprise or a negative surprise. That one got me interested in a future discussion as well. And there are other colleagues here I have to give some additional credit to. She's not in the economics department, but Dr. Jennifer Meta Robinson in anthropology, if I remember correctly, did a study on this kind of grade surprise, on whether, when students are surprised in a classroom, that affects their learning, their student success. So, a lot of opportunity to explore on my own, to have interactions with people, and just an opportunity to play with the data.
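The "grade surprise" idea above can be sketched as a simple difference between a course grade and a baseline average. In this illustration the baseline is the student's mean grade in their other courses; the exact SEISMIC definition may use a different baseline, so treat the helper and its field choices as assumptions:

```python
# Hypothetical sketch of a "grade surprise" metric: a student's grade in one
# course minus a baseline average. Baseline choice (the student's mean grade
# in other courses) is an assumption, not the SEISMIC project's definition.

def grade_surprise(course_grade, other_grades):
    """Course grade minus the student's mean grade elsewhere (4.0 scale)."""
    baseline = sum(other_grades) / len(other_grades)
    return round(course_grade - baseline, 2)

# A student averaging a B+ elsewhere who earns a C+ here: a negative surprise.
print(grade_surprise(2.3, [3.3, 3.7, 3.0, 3.2]))
```

Positive values would then be "positive surprises" and negative values "negative surprises," in the sense the speaker describes.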
Like I said, I'll just give you a brief history of what my projects were all about. I'm not going to go through too much detail; there are a couple I want to highlight, because that's the point of this talk. And as I often do, and my students are very familiar with this, I forget things: I have to say thanks again to George, he's the one that came up with the title. I was running a little behind schedule, to the point where George said, we need to get a title, we need to get some things in. So this was George's title, "How an Economics Department Used Learning Analytics to Improve Success," and I thought it was perfect. So thanks again for that, George. My first opportunity was just looking at data. I just wanted to look at the classroom data and see what was going on, because at the time the economics department had just gone through a departmental review; our majors were up, our enrollments were solid, we were doing really well, and I thought, well, this is interesting. Maybe it's because, you know, there's a certain instructor who came over back in 2008; I'll play with the data. It wasn't, but I was able to look at the data a little bit. Then I said, okay, I've done the introductory level. E201 is the introductory microeconomics class; it's the original class that most students take when they come to IU, in particular business students, which I'll reference in a little bit. E202, which is basically my background, is introductory macroeconomics, so it's a sequence. I started with those in the first project. In the second project, I turned my attention to the intermediate class known as E321. As you can probably guess, E201 is a prerequisite for E321, introductory to intermediate. But when I looked at the intermediate data, I started to notice something.
First of all, the DFW rate: students received Ds, withdrew, or failed. I didn't see that at the introductory level, but at the intermediate level I would see this effect. So we started looking at it a little further, again mostly descriptive statistics, nothing too analytical in terms of rigorous statistical analysis; we were just playing around with it. And then conversations were occurring in our department, and my department chair said, well, let's look at this data a little more, because we're starting to lose enrollments. Our enrollments started to fall in the introductory courses, students were still struggling at that level, and at the intermediate level we saw this really amplified. So I'm going to talk a little bit about these two reports, these two data analyses over these two years, where we really dug in and said, okay, now we have some good rigorous data, we've got some evidence, because during these years our enrollments kept falling, partly because, as you will see, of transfer credits: students were bringing credit into IU, for E201 in particular, the introductory microeconomics course, but when they took the class at the next level, they would not perform very well. So we were able to take that data to administrators to get some changes made, ultimately, I believe, for student success. Then a couple more years go by and I'm playing around with some more data. I wanted to look at our majors, because around this time they had just started to decline. And again, as I mentioned earlier, the SEISMIC study about grade surprise: I just wanted to do one of those, and getting into the analytics, I compared not only economics but other classes as well. I see there's a chat here. Okay, thank you, George.
So I'm going to focus here on what put the economics department into action and what's happened since then. Again, just the summary statistics of our report in 2016: myself and our department chair reached out to the BAR group and Linda Shepard, and they said, okay, we looked at the data. What we found is that students were transferring a certain amount of credit in, and I was looking at about 3,400 units of data. Sorry, my screen is being blocked here and I'm trying to navigate. There we go, that's better. So, about 14 years of data. The results showed that even though the grade distribution was similar among instructors, because sometimes you can explain student performance based on the instructor, and there are some studies out there that show that, I'm drawing your attention to that 20 percent D, W, and F rate, which is obviously a little high; we also had somewhat elevated levels at the introductory level. I was able to play with the data and pull out where students were taking the classes: was it at IU Bloomington, or outside at other institutions? This was the first time I realized that students who were transferring in the credit for the introductory course, or its equivalent, performed worse. Student success was actually falling for students who were transferring it in. We then did a more rigorous analysis; we brought in an econometrician, and she did a really deep dive. She started looking at other effects, demographic effects; she controlled for variables like high school GPA, SAT scores, and all that. She really did the rigorous work. I thought maybe it was something I was observing just in the descriptive statistics, but in reality there was something going on, and basically we got a lot more data. So we pulled out the data for E201.
As you can see here, over this time period of about 14 semesters, we had about 20 percent of students transferring this credit in for E201, and not all of that 20 percent took E321. What we also saw was that business students in the Kelley School were taking these classes at a local community college and transferring them in, but we weren't really seeing them in the E321 data, because one of the requirements for business students is to take the introductory micro class, but they don't have to take the intermediate. So we were losing some data points there. Oh, a question about consistent grade distributions; thank you for the question. I was just looking at the numbers of A's, B's, C's, and D's, and the averages by instructor. What we found is that the instructors were awarding similar grades, the same percentages, give or take, of A's, B's, C's, and D's. There wasn't one instructor being harsher or easier than another, so there wasn't anything going on there; the DFW rates we saw in the aggregate weren't isolated to one class. I hope that answers your question. And once again, we got the results in terms of what happened, and I'll show you the numbers real quick. Basically, beyond the descriptive statistics, we got into this more rigorous analysis, and, for those of you who would like to know, I don't have that data here, but the results were very similar to the descriptive statistics in terms of the magnitude of the effects on student success. We did OLS regressions, ordinary least squares.
We did some probability models like logit and probit, and linear regressions, all that stuff I had not heard of in quite a long time until I started getting into this and working with a true econometrician, who, again, did a great job, and she's listed there in my thanks. We found this idea of a performance gap. You're welcome, thank you. Okay, let me just show you the results real quick, because I want to get to the point about what we did as a department in terms of turning around student success. Here is the Tableau view of the data, if you will, related to that first study. There are all the data points; I was able to look at the number of students taking Principles of Microeconomics. But it was this trend I saw in the data, both in prior projects and in the one focusing on student success at the intermediate level: students were transferring the credit in. It stops here at 2016, if you can see that, and goes back to 2010, and I just can't say this enough: we are very fortunate at Indiana University to have data that goes all the way back, I think even before 2010. I was really lucky to have this historical data to play around with. These are the top 20 transfer institutions; the names are listed there for you to read. And finally, down here, we were looking at the overall enrollments, and this is the DFW rate I was mentioning before. Across the years in question it's averaging about 20 percent, but over here are the students who were transferring the credit in, and you can see it almost doubling for them. So once again, students were not being successful at the intermediate level if they had transferred in the introductory course. And we were armed with this data, and again, there were a lot of students transferring in introductory microeconomics.
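The DFW comparison described above boils down to computing the share of D, F, and W grades separately for students who transferred the prerequisite in and those who took it natively. A minimal sketch, with invented records that are illustrations only and not the actual IU data:

```python
# Hypothetical sketch of a DFW-rate comparison by transfer status.
# The roster below is made up for illustration; it is not the IU data.

DFW_GRADES = {"D+", "D", "D-", "F", "W"}

def dfw_rate(records):
    """Fraction of records whose grade falls in the D/F/W set."""
    return sum(1 for r in records if r["grade"] in DFW_GRADES) / len(records)

roster = [
    {"grade": "B", "transfer": False}, {"grade": "C", "transfer": False},
    {"grade": "F", "transfer": False}, {"grade": "A", "transfer": False},
    {"grade": "B", "transfer": False},
    {"grade": "D", "transfer": True}, {"grade": "W", "transfer": True},
    {"grade": "C", "transfer": True}, {"grade": "B", "transfer": True},
]

native = [r for r in roster if not r["transfer"]]
transfer = [r for r in roster if r["transfer"]]
print(dfw_rate(native), dfw_rate(transfer))  # transfer group's rate is higher
```

The regression work the speaker mentions (OLS, logit, probit with controls for high school GPA and SAT) is what tests whether a gap like this survives after conditioning on student background.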
But again, they were doing it for a requirement; they would stay in the Kelley School of Business, so we would not see the data show up in E321, because E321 wasn't required. The introductory course, E201, was required to get into Kelley, but we weren't seeing the results there. What we knew was that students who took the class elsewhere, judged by the metric of E321, weren't learning as much; they weren't being successful. So this is something we put together very quickly, and I'll just cover the numbers real quick. This is the grade distribution in E321, the average grade distribution. If someone took the introductory course at IU Bloomington, that's what I mean by IUB, you can see they got about a 2.6 to 2.7, which is about the average for E201 classes, and I'll reference that in a little bit. But look at what happened when someone transferred in from a community college or any two-year institution: it's a 2.2 to 2.3, almost one letter grade lower. As a point of reference, a 2.6 or 2.7 is probably about a B minus, whereas a 2.3 is a C plus, so I would simply say this was a one-letter-grade difference. And when you look at the grade the student earned in the introductory class, in the next row here, the E201 distribution here at IU Bloomington was averaging again about a 2.6 or 2.7, B minus range, but they were getting a B plus average at the community college, a 3.3, or even a 3.7, which would be an A minus. So according to the data, they were performing extremely well when they took the class over there, but then, coming back in, they would struggle at the next level. So you want to compare the introductory level and then their performance at the intermediate level, introductory to intermediate. Meanwhile, the high school GPAs were very similar for the two groups, and their SAT scores were very similar.
And again, we ran those rigorous analyses with our econometrician, and she saw the same results. So what happened? Well, a couple of years later, our enrollments were still falling and our majors had started to decline, and partly we found it was because of the Kelley School, and that's kind of central here. We realized that not only were we losing enrollments at the introductory level and seeing students not perform well in the metric known as E321 grade performance, but we were losing students to the Kelley School. So finally the department chair, Gerhard Glomm, again my coauthor on those projects I highlighted, said, okay, we've got to finally do something. In August of 2019 we started working, myself and some other colleagues and instructors, and we developed a plan to address this issue of transferring in credit for E201 from outside institutions. Basically, we looked at the data and found, again not in the E321 data but in the numbers of people transferring, what students in the Kelley School were doing: they were transferring the credit. They would come to campus but pay money to go outside, to the community college. And as you saw in that prior slide, you can't really blame them. The gist of it is: a student comes in and wants to progress in the Kelley program, the business school, probably one of the largest colleges on campus, if not the largest; students gravitate toward it. They need to get a B, on average, in the introductory microeconomics class within their first year. They come to campus, where we're averaging about a B minus, but if you take the class elsewhere and transfer it in, even while still living in Bloomington, and that institution even had some online classes, you could average a B plus.
So we would see that behavior. But we said the reason we want to do this is that we want students to be successful. So we basically went to the Kelley School for suggestions and said, okay, what can we do so that your students will find success in our classroom? And they said, okay, you can revise the curriculum just a little bit. What they basically wanted, and I think this makes intuitive sense from our departmental standpoint, was: teach more behavioral economics. In introductory microeconomics we don't really do what's called behavioral economics; behavioral economics, along with game theory, to throw out some terms you may have heard, is relatively new in the field of economics. I mean, it's new in the sense that it's only 30 or 40 years old, if you want to go that far. We do a little bit of it, but we didn't emphasize it, and it makes sense that business students who want to know about strategy, for example, would like that. So we said, okay, let's revamp the introductory course in microeconomics, and let's do the same thing for macroeconomics. We had to talk about moving some content around and creating new content, and we're still in a revision process, but by the time we figured it out we said, okay, we need to launch this. The second thing the Kelley School said to us was, we want students to engage; we want students to apply the economics. So we created new assessments in the classroom. In essence, we said, okay, let's take some of the points, some of the grades we would assign students, and, as opposed to taking them from an exam, let's create low-stakes assessments where students engage with economic theory, economic data, or simply strategy.
For example, what I've done in my classroom is have the students play games as these low-stakes assessments. A classic strategic game you're probably familiar with is the prisoner's dilemma, the famous setup where two suspects are caught, separated, and interrogated by the police, each tempted to turn on the other. So we play these games now in the B classes, which is what we call the revamped courses. Other colleagues go out and show students economic research on why the material matters. Other instructors have students write papers: find an economic topic, or take a country and tell us what's going on in its economy. So we built an emphasis on what we now call modern, evidence-based applications. Between the change in content and this kind of hands-on learning, student success improved. Finally, as George mentioned earlier, there was an issue of consistency. Right around this time, I think in 2017, the department came to me and said, Paul, we would like you to teach our graduate students how to teach. That wasn't brand new; Professor Walker had instituted that program for our grad students. "Sorry, do you mind if I ask a question? Am I interrupting?" No, no, please go ahead. "I just want to clarify one thing. Did you wind up, then, with a business-school-specific economics course, different from the general economic theory course?" Correct. We relabeled and renumbered it: it's now called B251 and B252. And in essence, to your question, we also created non-business versions, so E201 and E202 are no longer the only options.
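Since the prisoner's dilemma is one of the low-stakes classroom games described here, a minimal sketch of it in code may help. The payoff numbers, player labels, and function names below are hypothetical choices for illustration, not material from the talk; they just show why each player is tempted to turn on the other.

```python
# A minimal prisoner's dilemma: two suspects each choose to stay "quiet"
# or "talk". Payoffs (higher is better) are hypothetical classroom numbers.
PAYOFFS = {
    ("quiet", "quiet"): (3, 3),
    ("quiet", "talk"):  (0, 5),
    ("talk",  "quiet"): (5, 0),
    ("talk",  "talk"):  (1, 1),
}

def best_response(opponent_action):
    """Player 1's payoff-maximizing action given the opponent's action."""
    return max(["quiet", "talk"],
               key=lambda a: PAYOFFS[(a, opponent_action)][0])

# "talk" is a dominant strategy: it is the best response to either choice,
# even though (quiet, quiet) would leave both players better off.
assert best_response("quiet") == "talk"
assert best_response("talk") == "talk"
```

The tension students discover in class is exactly the gap between the dominant strategy ("talk") and the jointly better outcome ("quiet", "quiet").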
Those old numbers are slowly going away. We created new non-business courses; the numbers have changed, but they're still at the 200 level. So we now have a non-business econ class and a business econ class. If a student is in the College of Arts and Sciences, which is where we're housed, by the way, or in another large college on campus, SPEA, the School of Public and Environmental Affairs, they still take those non-business classes, though I find some of those students end up in my class anyway. So we went to the Kelley School, which at the time, from what I understand, was even advising its students to go elsewhere, and got them on board. We said, okay, what can we do to improve the success of your students? And then there's the idea of consistency. I am now the coordinator, as George said, which I'm enjoying, for now at least; it has become a challenge, to say the least, and I'll talk more about that later. My job is to make sure there's consistency among the sections, because we want the student experience to be as close as possible whether a student takes my class or another instructor's, including sections taught by graduate students. That's coupled with the idea that instructors still have academic freedom: I may choose different applications than my colleagues, but we're all teaching the same content, with the emphasis the Kelley School wanted. We built this in 2019 and launched in 2020, right as COVID hit. Like every university, in the spring of 2020 we went from face-to-face to spring break to shutdown, and that fall we were completely online. I wish I could tell you more about the success, but I haven't fully analyzed the data, in part because the data is a bit problematic: with COVID, some policies changed at the institutional level at Indiana University.
So when we launched this brand-new class, the Fundamentals of Economics for Business, also known as B251, the first semester was taught by nothing but instructors, no graduate students, in large classes, and we were doing it online. So there was a challenge there. One thing I have found, and it showed up in prior studies too, is that business students are very motivated. In earlier analyses with one of my colleagues, when we separated out the performance data, business students at the introductory level were doing well relative to non-business students; they were already performing well in our introductory classes before any of this. In the spring of 2021, still during COVID, we launched the macro counterpart, fundamentals of economics for business part two, B252. Once again, I need to emphasize that B251 was the one we had to get right the first time. When a student comes into the Kelley School, they have to complete it in that first year with roughly a B; it's really about GPA, so a B minus can be enough in some cases, but it has to be done in that first year. B252, on the other hand, doesn't need to be finished until graduation. Right now our enrollments are pretty strong in B251, and now that it's been almost a year since we launched B252, we're seeing those numbers come up as well. After the launch a little over a year ago, the data is again a bit problematic. In the fall, for example, the university gave instructors the option to finish in 13 weeks. As coordinator of the B classes, I asked all of us to finish by the Thanksgiving break, but we still had to do something in those last two weeks.
I knew what I was going to do: play more games. There's a game called the ultimatum game, another strategic game, which you solve using what's called backward induction. I offered it for extra credit, figuring that any student on the borderline of that B would be the one to show up, while the students with A's wouldn't bother. I was wrong. Seventy percent of my class came back after we finished the final exam, and they said, we want more. I thought, oh my goodness. So my grades were a little higher than expected, because I did not expect that kind of motivation. Again, I'm not sure whether it's the motivation of the students themselves, because they're business students, or whether it's because we changed the course and they said, this is great, and if you're going to give us extra credit, even better. So the fall 2020 data is going to run a little on the high side; I told my colleagues, look, let's all do something similar, maybe some extra credit, so our means are a little higher than I would expect. But I'll say this: I was looking at the data a couple of days ago, and as I showed you earlier, in the traditional E201 class our average was about 2.6 to 2.7. Now, I think partly because of the motivation of the students, partly because of the revamped content, and more importantly because of the modern, evidence-based applications, we took some assessment and made it hands-on: if you play with the data, if you engage, if you do the activities, it's a way to improve your performance in the class. When you add all that up, we are now closer to about 3.0. Somewhere between 2.7 and 3 is where we're at, which one could argue is roughly in between our old on-campus average and the averages students were earning on those transfer credits.
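The ultimatum game mentioned above is solved by backward induction: reason from the last mover's choice back to the first. A tiny sketch, under the standard textbook assumption that both players simply maximize their own payoff; the stake of 10 points and the function names are mine, not from the talk.

```python
# Backward induction on a simple ultimatum game: the proposer splits 10
# points; the responder accepts (each gets their share) or rejects (both
# get 0). Stake size and payoff assumptions are hypothetical.
TOTAL = 10

def responder_accepts(offer):
    # Step 1 (last mover first): a payoff-maximizing responder accepts
    # any strictly positive offer, since rejecting yields 0.
    return offer > 0

def proposer_offer():
    # Step 2: anticipating that, the proposer keeps as much as possible
    # while still having the offer accepted.
    acceptable = [o for o in range(TOTAL + 1) if responder_accepts(o)]
    return min(acceptable)

print(proposer_offer())  # smallest accepted offer: 1
```

In a real classroom, of course, responders often reject "unfair" low offers, which is exactly the behavioral-economics wrinkle that makes the game worth playing.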
Now, once again, the DFW rates, the share of D's, F's, and withdrawals, have fallen. Part of that is because policy has changed here at Indiana University. We went face-to-face this past fall, and this past spring semester we had one snow day where we had to go back onto Zoom, but otherwise we were face-to-face. The policy now is that students can withdraw from a class up until the last day of class, which was not the case before COVID. But even just the raw numbers of D's and F's have fallen. I have not done the rigorous statistical analysis yet; the data is a bit muddy between all the policy changes and everything else. But just looking at quick descriptive statistics, performance is better; students are more successful in our classrooms. Obviously, we went back to the Kelley School, and we are now in communication with them at least once a year, asking, how are we doing? Anecdotally, we talked to their head advisor, and she said students love it. There are students with concerns, and there are some consistency issues we're still wrestling with on occasion, but by and large students are very happy. The Kelley School is happy, the economics department is happy, and we see the increase in enrollments. We've got our enrollments back, roughly to where we were in the mid-2010s, right before I started this whole journey into learning analytics. But there's one problem: our majors have not recovered. Although, looking at the data quickly before our time together, we saw a little blip upward, so hopefully we've bottomed out in the number of economics majors in the College of Arts and Sciences.
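For readers unfamiliar with the shorthand, the DFW rate is just the share of D, F, and W (withdrawal) grades in a section. A sketch of the kind of quick descriptive statistics being described, using an entirely made-up roster and a standard 4-point grade mapping (neither is course data from the talk):

```python
# Hypothetical descriptive statistics: DFW rate and mean grade points
# for one section. The roster below is invented for illustration.
POINTS = {"A": 4.0, "B+": 3.3, "B": 3.0, "B-": 2.7,
          "C": 2.0, "D": 1.0, "F": 0.0}

def dfw_rate(grades):
    """Share of the roster earning a D, an F, or withdrawing (W)."""
    dfw = sum(1 for g in grades if g in ("D", "F", "W"))
    return dfw / len(grades)

def mean_gpa(grades):
    """Mean grade points over completed (non-W) grades."""
    completed = [POINTS[g] for g in grades if g != "W"]
    return sum(completed) / len(completed)

section = ["A", "B", "B-", "C", "B+", "W", "D", "B", "A", "F"]
print(round(dfw_rate(section), 2))  # 0.3
print(round(mean_gpa(section), 2))  # 2.56
```

Note one wrinkle the speaker raises: a later withdrawal deadline mechanically changes who ends up in the W bucket, which is why he calls the post-COVID comparison muddy.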
I will say this: it's fun being back face-to-face with the students; I missed that kind of energy in the classroom. I've had students come up to me and say, you know, I'm really enjoying these new things you all are doing in the B classes, and I just wanted to let you know, Professor Graf, I've decided to pursue an economics major. And I say, that's great! And they say, yep, economic consulting at the Kelley School. So I get them excited about economics, and they still stay in the Kelley School, because there's a department in Kelley that teaches economic consulting. But I'll take a win-win wherever I can get it. Our majors are still struggling in that regard, so the next thing for us to figure out is how to get those numbers back up in the College of Arts and Sciences. Finally, I'll end with this, and then obviously I'll take more questions: we even got, shall I say, a shout-out about a year ago from the College of Arts and Sciences. Our chair gets a lot of information from the college, which holds these monthly meetings, I believe. Apparently, a year after we implemented B251 in the fall of 2020, it showed up in their numbers. The college started to notice, and the dean said, hey, look what's going on in the economics department: they're getting their enrollments back, and they're having student success in the classroom. And they basically told all the other departments in the college, if you're struggling, take a look at what econ is doing. So that was nice to see. But we have another problem, and I'll conclude my remarks over the next few minutes with the following: this past year, the Kelley School decided to go even further, because I guess they weren't big enough.
They changed some of their requirements and notified us; luckily, we had launched the B classes about a year beforehand. Last fall they said, by the way, econ, we need more people, we need more seats in your classrooms, because we are going to admit what's called the Godzilla class. That's not my phrase, it's theirs. I said, what do you mean? In the meeting they explained: we normally admit about 2,400 students, some as direct admits and so on, and we're going to do about 4,000. So we are quite busy, to say the least. Fast-forward to just this past spring: our B251 classes are pretty much at capacity. As I mentioned earlier, our graduate students are now teaching these classes, and we reallocated more grad students this past semester. As coordinator, I oversee 13 graduate students who each teach about 50 to 60 students. It's my job to make sure they're following procedures and to deal with any issues that arise in the classroom. I'm still teaching as a large-classroom instructor, and two other instructors are teaching large classes as well. We are at capacity, to the point where we might even be able to get some additional hiring out of it. And B252 is now steadily increasing. The rumor is that this coming fall of 2022 it's Godzilla part two; I don't know if that's Godzilla versus Mothra or whatever the throwback would be, but another 4,000 students are coming. In other words, we're going to be busy. So to summarize: thank you, learning analytics, and thank you, George, and everyone, because now we're a victim of our own success. We're busy. And this is a good problem to have, I like to say, because, as I said earlier, students are happier, from what I'm gathering, and they're more successful in our classes. And we are very busy.
I'm a little tired, but I will take it, because prior to all this we were on a very different trend. It was the data analytics that let us make the argument, to take it to the Kelley School and say, look, if you allow your students to transfer, they're not going to perform better in later classes. Help us help you: tell us what we can do to support your students' success. We couldn't have done that without George, and without Indiana University giving someone like me the opportunity to play around with data years ago. I probably wouldn't have this opportunity now, as coordinator of these new classes, if I hadn't signed up for those data analytics projects years ago. It has been truly a journey and a wonderful experience. I just want to say thank you all very much for your time, and I hope this was informative. Thank you.
Description of the video: I guess we'll get started. My name is Linda Shepard, here at Indiana University. Welcome to the session today. It's my great pleasure to introduce our speaker, James Folkestad, from Colorado State University. Dr. Folkestad, a professor of education at Colorado State, is the director of the Center for the Analytics of Learning and Teaching. He is the recipient of many teaching awards, including the University Distinguished Teaching Scholar designation at CSU. He received his PhD in education and human resource development from Texas A&M, and his research expertise is in technology-enhanced learning and analytics for learning, teaching, and educational innovation. I became acquainted with James maybe five years ago, when he was doing pioneering work in learning analytics, developing ethical principles and carrying those forward at Colorado State, which was quite interesting. Today he has changed direction a little bit and is focusing on a behavior application he has developed, which will be the subject of this talk. So thank you for being here, and I'll turn it over to you. Thanks, Linda, and thanks to all of you for being here today. Linda and I were talking earlier: if you have questions as I'm presenting, you can ask them and interrupt me. You can put them in the chat or just open your mic; I'm completely fine with that. I'd rather this be a conversation as we move through, with me providing clarification along the way, because I think that would benefit everybody. We should have plenty of time for questions at the end as well; I'm not going to take the full time allotted to me today, so we'll have time for a conversation and discussion around this application called U-Behavior that I've developed at Colorado State University. I'll just share my screen and then I'll start. Can everybody see that? Linda, does that look good? Oh, yes.
Okay, great. So U-Behavior is really a methodology, or a pedagogy, that we developed at Colorado State University out of multiple years of tinkering and experimenting with the Canvas learning management system, thinking about ways we could use Canvas to understand student learning behaviors and what those behaviors meant for what we call durable learning; I'll get into how I define durable learning in a moment. It was a multi-year experimentation that started in one particular class, microbiology. So I'll talk a little about the microbiology class, as well as another class that started to use U-Behavior after we had developed it and put it in place. And I'll be talking about two experiments that we ran in the fall of 2020 and the data we collected using U-Behavior. That's where we're going. I want to emphasize that this has been developed over multiple years and is a technology we're still tinkering with: we're still trying to improve its impact on students' learning behaviors and, in turn, on their performance. Linda introduced me; my contact information is on the slide if you'd like to reach out, and I'll put it up on the last slide as well. So, what is durable learning? I love the book Make It Stick; probably many of you have read it. Its subtitle is The Science of Successful Learning, and the definition it uses is acquiring knowledge and skills and having them readily available in memory so that you can make sense of future problems and opportunities. Basically: having knowledge on board that you have retained and can recall.
And then having the opportunity to apply it to unique and novel situations. I love this as a definition of durable learning, but I just love it as a definition of learning, period, because I think it's the goal of pretty much any faculty member: we'd like students to remember what we're teaching them, not only in our class but later on, in subsequent classes. When they go from a general microbiology class to a more specific, upper-division microbiology class, we want them to retain and be able to apply the information we taught them. So we focused in on that, and on what the science of learning tells us creates more durable learning. When we were sitting around talking about microbiology, we were saying, gosh, there's a lot of really difficult content in this class that we'd like students to retain; we want that knowledge to be durable. So we went to the science of learning and asked, okay, what does the literature say has an impact on that? And when we looked at those things, we realized that underneath, a lot of them were actually behavioral: it's the way that you practice that determines whether or not knowledge is going to stick with you. We got really intrigued by that, and we started to explore some things in Canvas, in particular the Canvas quizzing system, and what we could learn from it about these behaviors. So again, we came upon this idea that behaviors really impact durable learning; this is what we learned from the science of learning.
So there are these optimal behaviors. The first is testing oneself. When I talk about testing here, and when the science of learning talks about testing, it's about recall: testing to see whether you can pull information out of your head, not put it in. Testing yourself on your recall; that's a behavior. The second behavior is spacing that practice out across time. It's not, okay, I recalled it once; it's, can I recall it across time? If I can recall it today, can I recall it tomorrow? If I can recall it tomorrow, can I recall it in a week? And we can keep going. That idea of spacing out your practice has an impact on durable learning. The third thing we were intrigued by, which seemed like low-hanging fruit, was interleaving, or mixing up your practice. You can space out your practice and maybe recall one difficult concept or topic, but can you mix up the practice, go back and forth between different concepts and different learning objectives, and still recall them? We know that this behavior of mixing, or interleaving, also impacts durable learning. So that's where we started: steeped in what we know from the science of learning, thinking about what it meant, and really landing on, hey, these are behaviors. If we want students to have durable learning, we need to understand more about their behaviors: what's going on with them, how are they studying, and what practices are they using? And then we wanted to understand how students could take control and maybe change those practices.
That last piece was really the goal we were working toward. So the really basic question we asked was: what behaviors are students engaged in? What are they actually doing? Do we really know? We look around the classroom, as many of you have done, and we see them taking notes, using sticky notes, highlighting, maybe taking notes on their computers. But what are they doing with those notes after they leave the classroom? Do we know? If these behaviors are so important for durable learning, are students doing them? These are just questions, and we basically landed on: we really don't know. We have no idea what they do with those notes, whether they're rereading them, which isn't as beneficial as testing yourself and trying to recall, whether they're spacing their practice out across time, whether they're interleaving the content. We have no idea what they do when they leave the classroom. Then we asked a second question: what are instructors asking students to do, and do those activities translate into actually beneficial behaviors? This one I found even more fascinating. What do instructors ask students to do in the classroom and for practice, and are those actually good behaviors? Does the pedagogy we hand students translate into the behaviors the science of learning describes, or are we encouraging them to behave in less than optimal ways? That was really interesting to us. For instance, instructors sometimes provide students with quiz questions in Canvas and say, hey, these are for your practice.
They put up clicker questions and say, hey, answer these. We can see that instructors are trying to get students to test themselves with the clickers, and that's encouraging; we do see faculty members attempting these things. So we were left in this space of: okay, we really don't know what students do when they leave and take the learning materials with them. We do have some observations in the classroom of students clicking and trying to answer questions, but are those behaviors optimal? We don't really know. Do they have an effect? We weren't sure; they might, but we weren't sure. So we started at that basic level and said, okay, let's try to learn about this, and let's learn about it through actual evidence of practice behavior. We asked, what can we use in the learning management system that can get at the behaviors we're interested in promoting? And we landed on the quizzes in Canvas. We figured we could get students to test themselves, to try to recall information. We could encourage or ask them to space out their practice using those quizzes. And we could ask them to interleave, to mix up, their practice of those quizzes across the semester. So we said, okay, that's really interesting, but what we actually wanted to do was see what their behaviors were. That's where the learning analytics component came in for the work I've been doing: let's capture all the times they attempt these practice quizzes, and let's just take a look at what they do when we give them practice quizzes in a class.
That was the starting point for our research. We gave students practice quizzes in Canvas, as many of you have probably done or experienced in a classroom, and we asked: what did they do? The way we originally gathered this data, we went with a basic low-stakes quizzing methodology. We said, here are all these quizzes you can practice with. We did give them some encouragement; we told them, hey, it's best if you practice these things over time, space them out, and try to recall the content using just your brain rather than looking up the answers. And there's this thing called mixing, or interleaving, that's also beneficial: when you take these, don't just take the first quiz, then the second, then the third; try to mix them up a little and go back and forth. We encouraged them verbally to do that, and we had some instructions on the quiz itself. Then we looked at the data, and what we found was something like this; I'll show you actual graphs of this data as well, but this is just a visual. Students would take quiz number one until they got 100 percent on it. Then quiz number two, until they got 100 percent on it. Then quiz number three, until they got 100 percent on it, and so forth across the semester. When we captured the data underneath the quizzing system in Canvas, this is the pattern we saw from about 95 to 98 percent of the students. Even though we tried to encourage them, saying, hey, space these things out, it's good for your memory, almost all of them, let's put it that way, used this practice behavior.
I'm sure many of you are thinking, well, that's interesting, and you might even think the reason they're doing it, which I didn't really mention yet, is that they get to keep the high score on the quiz. Once they get 100 percent, they stop out, even though we tell them, listen, space these out, interleave them, mix them up. They just stop. So we sat there thinking: that's really interesting. The pedagogy drives practice; the pedagogy drives behavior; and the behaviors are important for durable learning. We were stuck at this spot saying, all right, we really don't like that pedagogy of just giving students practice quizzes. We need to change the pedagogy; we need a different way to get students engaged with these quizzes so that they practice them across time, so that we can impact durable learning. That was our thought. Now, these are actual graphs, and I'll talk through this because it's important. We created a tool that graphs out each student's attempts. At the top of the graph, each of the colored dots and shapes is a different quiz. The axis along the bottom starts at day zero and runs, let me grab my pointer, all the way to the end of the semester. The quizzes are labeled RPA for a reason I'll come back to. When a student takes the first quiz, you can see they take that red quiz until they get 100 percent, and the same with the little green plus signs for that RPA quiz: they take it over and over again. It is so consistent, we were shocked.
We had no idea that this was how students were behaving with the quizzes under the low-stakes quizzing methodology; we were shocked. Another really interesting thing: we ran this experiment in an intro psychology class that is completely, 100 percent, focused on the science of learning and on teaching students how to practice better. Guess what? When we put low-stakes quizzing into that class, even though the entire class is telling them to practice in the optimal ways the science of learning describes, they still did this. About 90 percent of those students still practiced this way. Now, to finish describing this graph: the vertical axis goes from zero to 100 percent correct. We didn't want students to focus on getting everything correct; we didn't want that to be the major emphasis, but we needed to plot the attempts somewhere, so that's the orientation we used. There are a couple of scores at the top that we generate for them automatically: the first is a spaced-practice score that we put together, and the other is a mixing score. These are two separate students; they mixed a couple of their RPAs on the bottom graph, but the student in the upper graph ended up at zero percent. They're both at zero because they didn't space their practice out at all and barely mixed it. As students space out the practice of each RPA and mix between RPAs, those scores go up. Does anybody have any questions? I can pause, because I've just delivered a lot of information, and I'm sure there are questions. If not, I'll keep going, but I want to make sure I don't just flood you. "A quick clarification: is there one of these sheets for each student?" Yes.
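The talk mentions an automatically generated spaced-practice score and a mixing score, but not their formulas. The sketch below therefore uses hypothetical definitions of my own: spacing as the fraction of attempts that land on a new day, and mixing as the fraction of consecutive attempts that switch to a different quiz. Only the qualitative behavior, massed same-day retakes scoring low and spread-out, interleaved attempts scoring high, is from the talk.

```python
# Hypothetical spacing and mixing scores over a student's attempt log.
# Each attempt is a (day, quiz_id) pair, in chronological order.

def spacing_score(attempts):
    """Share of attempts (after the first) that fall on a not-yet-used day,
    so repeated same-day retakes pull the score toward 0."""
    days = [day for day, _ in attempts]
    if len(days) < 2:
        return 0.0
    return (len(set(days)) - 1) / (len(days) - 1)

def mixing_score(attempts):
    """Share of consecutive attempt pairs that switch to a different quiz."""
    quizzes = [q for _, q in attempts]
    if len(quizzes) < 2:
        return 0.0
    switches = sum(a != b for a, b in zip(quizzes, quizzes[1:]))
    return switches / (len(quizzes) - 1)

# Massed behavior: hammer quiz 1 to 100%, then quiz 2. Both scores are low.
massed = [(0, 1), (0, 1), (0, 1), (1, 2), (1, 2)]
# Spaced, interleaved behavior: both scores are high.
spaced = [(0, 1), (3, 2), (7, 1), (12, 2), (20, 1)]
print(spacing_score(massed), mixing_score(massed))  # 0.25 0.25
print(spacing_score(spaced), mixing_score(spaced))  # 1.0 1.0
```

Whatever the real formulas are, the design point is the same: the scores reward when and in what order students practice, not how many answers they get right.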
"And about how many students? A typical classroom will have 300-plus students in it?" Yeah, we have a lot. Great. Carol? "Yeah, just a thought when I saw this: I'm wondering if part of the behavior is driven by the opportunity to improve immediately, and that satisfaction of getting that 100 percent and leaving it behind. I was successful there; now I can move on to the next thing." It is, Carol, and I think that's something we've trained students into over many, many years. It's like the points are learning to them, right? Almost like, hey, I got 100 percent, I learned a great amount, I'm moving onward. And it does feel good; that is very rewarding, and that's one of the challenges. I've talked to a lot of our cognitive psych people about the challenge of getting students to engage in these behaviors that we know are beneficial. They don't immediately have a kicker, they don't immediately feel good, they don't get that adrenaline, that "I learned it!" feeling. Because the first time they take it, and you will see this in the subsequent graphs as students start to change their behavior, they might get a 70. And they're going, man, okay, but I know I'm not supposed to take this over and over again, so I'm going to wait and try it again later. So there is a huge component of what you're talking about. All right, good questions. Okay. So, where we were: we just saw that this pedagogy was driving this particular behavior, and we asked the question, what can we do to change the behavior? That was our next question. And we then designed the U-Behavior pedagogy, this process that we put students through. We designed it with that intent in mind: we wanted to change their behavior.
And we wanted to design the U-Behavior methodology so that we could put it into any class. It was a relatively light lift; it wasn't like, hey, you have to change your entire class. We wanted to put this method into classes that already had quizzes, and we wanted it to be pretty simple to put in. And we wanted to run these experiments to see, one, if we could get students to change their behaviors, and two, whether we would see any boost in durable, longer-term learning. So, the U-Behavior methodology. I'm going to run through its components quickly here; you'll probably have questions about it later, but I'll give you a synopsis of what it entails. The first component is this one here, an online tutorial, ten minutes. We wanted to keep it short, because we can't teach a whole class on the science of learning, processes in the brain, retention, and recall. So we just gave students this ten-minute tutorial. It's a little cartoonish tutorial that tells them, hey, here are some things that are good for your ability to remember and recall information, basically running through those behaviors: you should test yourself and try to recall the information, you should space it, and you should interleave it. Then the tutorial teaches them about the U-Behavior graph, which you've seen one of already, and tells them how to use it in their process of planning their learning. And then it tells them how they'll be graded at the end. A major component of the U-Behavior process is that at the end, we grade not based on whether they get all the questions correct on each of the quizzes; we give them a grade based on their behavior, on how they practiced these particular quizzes. This is what the grading rubric is, and we give that to them.
We explain it to them in the tutorial and we give it to them so they have it, and then they earn the points. Now, what's kind of interesting about how we set up the experiments: in the traditional approach, the low-stakes quizzing, professors will typically use 5 percent of the overall grade or less. We just took those points and applied the rubric to them. So in the experiments, across the two conditions, students are earning the same number of points for doing these quizzes. One group is earning them for low-stakes quizzing, where we've seen that behavior, and the U-Behavior group is earning them for practicing in the way we want them to practice for durable learning. Does that make sense? Okay. They also do two reflections during the semester; this is an important component. They download their graph, which you've seen an example of, and look at how their behavior is going. It's really simple: they look at the behavior they've shown so far, then they submit, in just a blank box, whether they plan to change their behavior around their practice in the future. They do that at about one-third and two-thirds of the way through the semester. All right, let me see where this is; I think I've already covered this. So let's talk about the experiments and the results. We ran two experiments in fall 2020. The first was in a course on the introduction to biomedical sciences, a freshman-level course, MBS 100, and we had 124 students in that particular class. We randomly assigned, which I was talking to Linda about.
COVID gave us a unique opportunity to randomly assign the condition by student, because everybody was online. We had 59 students selected into the U-Behavior condition and 65 students in the low-stakes quizzing condition. So the students were literally placed in the two conditions and had two different experiences. We've talked about U-Behavior as this sort of minimalist approach to trying to get students to change their behavior, and that's what we were interested in: can we get students to change their behavior when it comes to this? The only thing the students got in this experiment was that ten-minute tutorial; they did the reflections I talked about during the semester, and then they submitted their graph at the end for grading. There was no additional push. I worked closely with the professors, who were co-researchers on this, and we specifically said: don't push the students, don't give them additional encouragement outside of that ten-minute tutorial, and let's see what happens. Because we really wanted to see what that would look like. Under both conditions, the U-Behavior and the low-stakes condition, the students had access to the same quizzes. We called them retrieval practice activities, because that's what we were encouraging. In the low-stakes quizzing methodology we did include a statement that said, hey, space this out, practice this over time, use this for recall. We said all those things, right? It wasn't like we totally shied away from giving them guidance. But they didn't see the tutorial and they didn't do the reflections. All right. So what happened? In the control condition, which was the low-stakes quizzing condition, we saw the same thing. Here are two different students' graphs.
And you can see they're both hovering right around zero on the scores that we generated. You can see that they're stacking the practice: they just repeat these things until they get the highest score. We weren't really surprised by that. Here's what two graphs look like from the U-Behavior students. Now, I need to tell you that we didn't get all the students to change their behavior. It would be amazing if we had, but we didn't; as a matter of fact, we didn't get up to 50 percent of the students to change their behavior. But we still got students to change their behavior, and if you just look at this visually, this is radically different practice behavior. When we ran the statistics on this, based on our scores, what we saw was a very large effect size: the difference between the two groups is somewhere around a 1.2 Cohen's d. So we know there is an effect when you put the U-Behavior experience in front of the students. You change the pedagogy slightly, the points are the same, the quizzes are the same, and we ended up with a significant behavior change between the two groups. I put this slide in there, I forgot, just to make sure everybody knows we didn't get everybody in the U-Behavior condition to change their behaviors. As a matter of fact, it looks like this; these are two graphs from the U-Behavior condition. They're still stacking, still sort of massing the practice, doing it repeatedly over and over again. Not all of them changed, and I think that's really interesting. So we knew we had work to do at this point to get more students to change their behavior.
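The spacing and mixing scores behind these comparisons weren't given as formulas in the talk, so here is a minimal, hypothetical sketch of how such scores could be derived from a student's attempt log. The function names, the week-long "full credit" gap, and the 0-100 normalization are my assumptions for illustration, not the presenters' actual tool:

```python
def spacing_score(attempt_days, full_credit_gap=7.0):
    """Hypothetical spacing score: average gap (in days) between successive
    attempts at the same RPA, scaled so that gaps of a week or more earn 100.
    Back-to-back attempts on the same day (gap 0) earn 0."""
    if len(attempt_days) < 2:
        return 0.0
    gaps = [b - a for a, b in zip(attempt_days, attempt_days[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return min(100.0, 100.0 * mean_gap / full_credit_gap)

def mixing_score(attempt_sequence):
    """Hypothetical mixing score: fraction of adjacent attempts that switch
    between different RPAs, scaled to 0-100. Massed practice (one quiz
    repeated until 100%, then the next) produces long runs and scores low."""
    if len(attempt_sequence) < 2:
        return 0.0
    switches = sum(1 for a, b in zip(attempt_sequence, attempt_sequence[1:]) if a != b)
    return 100.0 * switches / (len(attempt_sequence) - 1)

# A massing student: RPA "A" four times in a row, then RPA "B" four times.
massed = mixing_score(["A"] * 4 + ["B"] * 4)           # low: only 1 switch in 7
# An interleaving student alternates between the two RPAs.
interleaved = mixing_score(["A", "B", "A", "B", "A"])  # every transition switches
```

Under this kind of scoring, the low-stakes-quizzing pattern described above (repeat one quiz until 100 percent, then stop) collapses to zero on both measures, which matches the graphs shown.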
Carol, did you have another question, or did you just come on video? "Sorry. I love the visualization, and all these questions come up. I don't want to sidetrack you, but I noticed on the previous slide, and I didn't look too closely, the diamond on the lower one here, the second one: the score actually decreased." Yeah. "I'm just wondering if questions came up about that. It obviously didn't decrease for all of them, but for that one they did worse and worse, and it could have been that they were just walking through it and didn't care." Sure. This one up here, to your right, maybe this isn't the one you were looking at, but it decreased all the way. Yeah, so that's an interesting question; we haven't analyzed things at that level. One thing to note, which I totally skipped over, is that we developed pools of questions. So they're not seeing the same quiz over and over again: it randomly draws questions from the pool on the same topic. So they're seeing a new quiz, well, not entirely new every time; they'll see the same question again if they take it multiple times, but the quizzes are different. "Well, that's interesting to me, because that also creates less anxiety: they're not being graded on how well they do, but on having done it. So even if their performance is lower, you're saying you still value the fact that they're there." Yeah. And you know what, I would hope, and we don't know this, I'd love to know the answer because it's an interesting question: this student, as they were dropping off the map here, did they go, oh man, I really need to look into this more?
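The pooled-question drawing just described can be sketched in a few lines. This is an illustrative stand-in rather than the actual quiz engine, and the pool size and quiz length here are made up:

```python
import random

def draw_quiz(question_pool, n_questions=10, rng=None):
    """Build a fresh quiz attempt by sampling questions without replacement
    from a topic's pool. Across repeated attempts a student will see some of
    the same questions again, but rarely the exact same quiz twice."""
    rng = rng or random.Random()
    return rng.sample(list(question_pool), k=min(n_questions, len(question_pool)))

pool = [f"q{i}" for i in range(40)]  # one topic's question pool (made-up size)
attempt_1 = draw_quiz(pool, rng=random.Random(1))
attempt_2 = draw_quiz(pool, rng=random.Random(2))
```

Sampling without replacement within an attempt keeps a single quiz free of duplicates, while drawing from a shared pool across attempts gives the repeated exposure to material that retrieval practice depends on.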
And then did they study with that, and did that help them prepare for the final exam? That's an interesting question, and hopefully that's the behavior we would see, but we don't have insight into it. We only know that they're actually taking these things with a couple of weeks in between, or a week in between, and then taking them again, and they're getting that repetition that we know is theoretically so powerful for durable learning. So there are so many more questions that I have too. Thanks, Carol. "Isn't that always the case with data?" Oh, yeah. Right, there are all these new questions. So, as far as behavior change by condition goes, here's U-Behavior. The darker blue, hopefully you can see this, is the highly engaged, optimal behavior. So we're approaching 50 percent of the students engaging in the way we want them to, just with a ten-minute tutorial, you have to remember, plus the other pieces of the pedagogy. And over here is the low-stakes quizzing: we had virtually no one. I think in MBS 100 we might have had a contaminant; in other words, there was a student who might have talked to a student they knew, maybe a roommate. One student was spacing and mixing on their own. We rarely see that; they might have been taught these things in the past and just started using them; we don't know. But the point is that if we don't change the pedagogy, students don't use these practices. Okay, so let's talk about performance, and what we did for this analysis.
We binned the students into a high, optimal practice behavior group and a low practice behavior group, based on testing, mixing, and spacing their practice. What we ended up with in MBS 100 was 24 students falling into the high, optimal practice bin, and around 100 in the low practice behavior bin. Now, the reason these numbers shifted a little from the previous ones is that we did a critical dive into their practice for this analysis. We learned that you can actually game the system and bump your score up in the U-Behavior experience by massing and mixing rapidly at the end of the semester. So we took some students and moved them into the lower band, because we could see they were literally gaming the system: they were boosting their scores, not spacing their practice across the entire semester. So here's what we saw. On the midterm exam, the mean for the high bin is definitely higher, and when we ran a t-test, the midterm exam was significantly different between the two groups; the Cohen's d is quite large. On the final exam we were approaching significance; we didn't reach it, but the Cohen's d, the effect size, is still interesting and intriguing. So on this experiment, what we learned was that we could change their behavior, though we couldn't change everybody's behavior, and we saw some intriguing things around the exam scores as far as performance goes. On the final exam we were approaching significance; I think the lower n's here are why we didn't reach the significance level.
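For reference, the Cohen's d values quoted throughout are standardized mean differences between two groups. A self-contained sketch of the pooled-standard-deviation version (the exam scores below are invented purely to exercise the function, not the study's data):

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d using the pooled standard deviation: the difference between
    group means expressed in standard-deviation units. Rough conventions:
    0.2 is small, 0.5 medium, 0.8 and above large."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Invented scores: shifting one group up by exactly one within-group SD
# gives d close to 1.0 by construction.
low_bin = [70, 75, 80, 85, 90]
high_bin = [s + stdev(low_bin) for s in low_bin]
d = cohens_d(high_bin, low_bin)
```

A d of 1.2, as reported for the behavior scores, means the average U-Behavior student's score sat more than one full standard deviation above the average low-stakes student's score.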
The other thing that's really interesting about comprehensive final exams is that they are sort of longitudinal tests, but people can cram for them, and people do cram for them. So it gets a little murky: is this a true retention test? That's why, in the next experiment, which I'll talk about, we did a true retention test, where the students had no reason to cram for the exam. I guess a really eager student could have; we asked them that question, and I think we had one who said they did a little bit of studying for the retention test. But anyway, that's where we landed as far as performance goes on that. So let's talk about the next study, which was in a general microbiology class, a 300-level class, so these are junior-level students. Here we did not randomly assign by student; we randomly assigned by section. We had a total of 166 students: 87 in the U-Behavior condition and 79 in the low-stakes quizzing condition. Same pedagogy, same minimal-intervention pedagogy: the ten-minute tutorial, with reflections, and again no push from the instructors. We really wanted to keep them from encouraging students toward one behavior or the other. So essentially we asked the same questions: can we get students to change their behavior, and how does performance look on a retention test? Again, sorry, these flip-flopped: these are students in the U-Behavior condition. We again got radically different behavior, at least observably, with students intermixing and mixing things up, versus here are two from the low-stakes quizzing condition, where they just repeated the quizzes, stacking them up until they got the high score. And we saw similar changes in behavior between the U-Behavior and low-stakes quizzing conditions; we're approaching half again.
As far as performance on the exams goes, the results in the microbiology class were quite encouraging. This is again high versus low, and this is Exam 3; they actually have three exams during the semester and then a final comprehensive exam. I just took a look at Exam 3, but you can see that the mean is a lot higher for those who engaged in the practice, and it's quite healthy. The same goes for the comprehensive final exam: the t-tests show significant differences between the groups, and the Cohen's d values are healthy. In the social sciences, those are quite nice. If you've worked with Cohen's d before, that's approaching about a letter-grade difference in performance on those particular exams, which is really encouraging. The other nice thing about the Cohen's d here is that it's in line with what other studies have seen for this type of behavior change and enhanced performance; it's not like those Cohen's d values are way out of line, they're in line with what's been seen previously. Okay, so that was the exams. Then let's look at the bigger question, which is durable learning, via the retention test. Here we did a true retention test, four to five weeks later: we opened up an exam and invited students to come in. Now, this is really difficult to do, because the intervention was held in the fall, so this was over Christmas break and we were asking students to come back and take this exam. So we incentivized them: we gave them an Amazon gift card to come back. And these are the numbers.
We were pretty happy that we got this many students to come back, and we did get some numbers in each of the bins, the high behaviors and the low behaviors; that's what the n's look like. You can see the difference in the means here, and the t-test on those was significant. The Cohen's d effect size on the retention test was 0.64, so again about a letter-grade difference in performance. That's really encouraging; it's really encouraging that we're seeing that sort of performance hang with these students after five weeks. And again, they were a lot less likely to study for it; I don't think there was more than the one student who, like I said, might have looked at some notes before they took the retention test. Most of them just took it cold. Now, some of you might be thinking, and I didn't put this data in here, but I might as well talk about it: okay, you set up the conditions, you applied them to these sections. What if you just had one section with a bunch of really smart students in it, and another one with not-so-high-performing students? We looked at all of that. We collected all the demographics: high school GPA, the ACT/SAT score combo, and in this class we also looked at their CSU GPA the semester before they entered this particular class. There were no significant differences in those performance indicators between the two groups. So we felt pretty comfortable that we didn't just end up with a higher-performing group of students in the U-Behavior condition, with that explaining the differences we see. Anyway, I didn't put those stats in here. So, that's essentially what I have to present today.
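The baseline-equivalence check described here, comparing incoming indicators like GPA across the two conditions, can be sketched with a two-sample t statistic. This is an illustrative stand-in using made-up GPA values and Welch's t, not the study's data or necessarily its exact statistical procedure:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances
    allowed). Values near 0 suggest the groups are balanced on the covariate;
    a large |t| would flag a pre-existing difference between conditions."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Made-up incoming GPAs for the two conditions.
u_behavior_gpa = [3.2, 3.5, 3.1, 3.6, 3.4, 3.3]
low_stakes_gpa = [3.3, 3.4, 3.2, 3.5, 3.1, 3.6]
t = welch_t(u_behavior_gpa, low_stakes_gpa)  # near 0: groups look balanced
```

Finding no significant difference on these covariates is what licenses the claim that the exam and retention gaps reflect the pedagogy rather than a smarter section.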
There are really interesting questions that we're still trying to explore. We're collecting more data; we have more data than we know what to do with at this point. But we know that the behavior impacts durable learning. We know this from the literature, and we're also seeing it in our studies. So we feel that this pedagogy has some merit in changing student behaviors, and it looks like it has an impact on performance, not only on midterm and comprehensive exams but on retention tests. One of the things I was going to mention is that we did all this with a minimal intervention. In subsequent semesters we said, okay, this is really cool, we have this evidence; let's see how many more students we can get to change their behaviors through encouragement. So faculty members have now been telling students, yeah, you really should be doing this, you really should be practicing in these ways. And we've seen much larger percentages of students changing their behaviors in these classes, which I think is encouraging, and we're still seeing these trends of almost a letter-grade difference between those who practice in these optimal ways and those who don't. The other thing I'll mention is the visual RPA graph that we created: we have a lot of qualitative evidence from student feedback that students appreciate having those graphs, that the visuals have a really big impact on just getting them to think about their practice and change their behavior. So the visuals are an important component of the U-Behavior pedagogy, and we continue to support it and implement it. We've now implemented it in five courses.
We have it in microbiology, still in MBS 100; we're running it in Physics 141, which is an introductory physics course; we have it in the introductory psychology course that I talked about; and we have it in the introductory geology class. So we're expanding its use, collecting more data, and trying to understand more about it. But I'll stop there and see if you have questions; I'd be glad to answer any of them. Carol? "Sorry. That's okay. I really liked this presentation. I'm vocal, as you can see, but I could see this being a real benefit for language teachers as well, because durable learning is so important there. Have you found that the faculty who are involved with this, and also the ones they talk with, are changing some of their teaching practices as well, now that you're showing the relevance of these cognitive principles?" Yes. So in microbiology, where we started, they've had numerous discussions about changing their entire microbiology program, because they see this in MBS 100, the first course their students take, and they want to embed it going forward, making sure the opportunities to practice are there for students. And it's a really intriguing idea, because what we would like to know is whether, by the end of their second or third or fourth year, if they've had enough experience practicing in these ways, they continue to do it without being heavily prompted or reminded, whether they just get into this practice. That would be the ultimate goal for U-Behavior as far as I'm concerned: students just picking things up and practicing them this way. They've also been thinking about incorporating this into other pedagogical experiences their students get. So yes. "Thanks."
I didn't want to interrupt your presentation earlier, so I just put it in the chat. Somebody pointed out that some of the scores on your graphs went lower, and it reminded me of a story; actually, I'm going to tell you two stories. The first story goes back to the early 1980s, 40 years from today. We were experimenting back then, before desktop computers were even available, using minicomputers to do quizzes in order to help students learn. This was not a study I did, but I remember hearing about it; it was done at a law school, I believe at Indiana University. Colleagues of mine who were interested in instructional design had developed these great quizzes, a kind of practice that was new at that time. And they were expecting that when the law students got feedback on the questions, they would improve their scores. What they actually found was that the scores went down. So somebody got the bright idea: let's talk to the students and find out what's going on. They quickly learned that the reason the law students' scores were going down was that it was very helpful to them to answer questions incorrectly, to see the feedback on why those answers were not correct. Now, these were people preparing to be lawyers, and they would have questions like these on the bar exam. So that was quite surprising, because the designers had been thinking, okay, the better they do on the quizzes, the better they'll do in the course. This was a course on torts, in law. So I remember that. The next story happened with a dissertation student of mine, the first one I ever directed, which is partly why I remember it.
This was back in 1986-87. He was doing research on different kinds of quizzing and how they would impact student learning, similar to what you've been doing. This was also done by computer, which was relatively novel at the time; hardly anybody was doing that kind of thing back then. I was doing research on computer adaptive testing at that time, and he was looking at several variations; I don't remember the details. But basically, he got through this study with different conditions, similar to what you were describing, where he was expecting to find differences between the kinds of practice techniques being used, between two or three different groups; I don't remember what they were. The important thing was that he got no significant differences, and he was really disappointed. So we got to looking at the data together a little more, and what we noticed was a huge variation in the ways students accessed these practice quizzes. Let me give a more concrete setting for this: these were freshman students, the general university population, not law school students, and this was an intro course on learning to use computers, back when hardly anybody knew how to use computers and desktop computers were just coming on the scene. There was a textbook being used in the course with six chapters, and each one had a chapter quiz with lots of questions. What this student had done was put those onto the computer, build in feedback and so forth, and then add the different kinds of adaptive testing algorithms and so on, collecting this data. We didn't find any differences in what he was looking for, but we found huge differences in how people accessed these quizzes, much like you're describing. And the students sort of naturally grouped themselves.
There was an assignment or an expectation from the teacher of the course, who was somebody else, not the student doing the research: if students passed each of the six chapter quizzes, it would count for 5 or 10 percent toward their grade. That was the activity they were supposed to do. And the first thing we noticed was that some students took the quizzes a whole bunch of times, and others hardly did them at all. The natural groupings that came out were these: there was a group of students who did not do the basic assignment, in other words, they didn't get the credit because they didn't do all six chapters and pass the quizzes; then there were ones who did exactly what they were asked, once each; then ones who did it twice each, as I recall; and then ones who did it four times or more each. They sorted themselves into those natural groupings, not the groups he had assigned them to for the various interventions. Then there was a common final exam, based on the same six chapters, and we looked at the final exam scores for all these undergraduate students, broken out by group. As you might expect, the students who didn't do the basic quizzes and pass them at least once each scored roughly, I think, around 35 or 40 percent on the final; in other words, they basically flunked or got a D. The ones who passed each of the quizzes four times or more scored around 85 percent correct on the final. And the others were in between: the ones who did the basic assignment and passed them once averaged, I think, around 45 or 50 percent, and the other group was between that and the 85 percent. So basically what he concluded from this research was that it was consistent with the research that had been going on with academic learning time.
And basically, academic learning time has been well studied in educational research going way back, to the sixties and seventies and onwards. Academic learning time is successful engagement in tasks similar to those you're going to be tested on or assessed on later, and successful engagement is a good predictor of achievement. I think that's what your research is also showing here: the same kind of pattern, and also the variability in how much students actually take advantage of these things. Yeah, that's interesting, Ted; thanks for sharing those stories. One thing, just to respond to that: maybe some of you have this question, okay, did those in the low-stakes quizzing methodology just spend less time studying these things than those in the U-Behavior experience? What we found, and I think it's really fascinating, was that there were no significant differences in the amount of time the two groups spent. "But that's what the ALT findings have shown over and over in past research. Engaged time is important, yes, agreed, but what made the biggest difference was successful engaged time. In other words, it wasn't the number of quizzes or the amount of time the students in that doctoral student's study were spending; it was the number of successes they had. And the quizzes were randomly selecting questions out of a larger pool for each chapter, so the number of times they passed the test was the predictor of how well they did on the final exam. So it wasn't just the number of attempts or the amount of time spent; I don't remember offhand whether he investigated that further, but it was the number of successes, successful engagement." That makes sense. Are there other questions?
So I guess I was wondering, what are you recommending that we do to encourage students to adopt behaviors that are more meaningful for their time? Yeah. I mean, I guess my thought would be you should probably try to implement that kind of behavior change. I don't know. I think the metrics you set drive behavior significantly, right? So if you're going to do a low-stakes quizzing methodology and students get to keep their high score, you are guaranteed that they're going to adopt the less-than-optimal study behaviors and practices. Okay? We've seen it in every class we've looked at that uses that low-stakes quizzing methodology. So I think you have to change those metrics. You have to find a way to put the points not on that, but on practice, right? And then you need something that can give you an indication, and that's what the RPA graph does, the way that we're graphing these things from the data. It gives you an indication of whether students are practicing in optimal ways, right? Or you need to figure out a different way to do that; this was just our solution for it. Yeah. And you can imagine, I mean, I have kind of an imaginative mind, so I can think: okay, can we do this with clickers in the classroom? Right? I mean, clickers are great because you've got students clicking and they're engaged, at least not just sitting there staring at the wall. But can we space out the practice of these things across time in the use of clickers? Have we studied that to see if there's an effect? I would imagine that there would be. Mm-hm. Right? So if you figured out a way to manage that as an instructor and had these recall tests happening in your clicker system, that would be really cool. Yeah. Could I add something? Sure, in response to Linda's question.
I can't see who is speaking. On the question of what else you could do, I would just bring to people's attention the research that's been done on first principles of instruction. You're talking about the application principle, which is one of the five first principles of instruction. These are not things I invented; they're in the literature. I've done research on them, but they're out there because they're all associated with things that can be done to promote student learning. So I would respond to Linda's question that, yeah, successful practice is important, agreed, certainly. It makes a difference. It's consistent with all the research that's gone on for about 60 years on academic learning time in educational research. Well, that's one principle, okay? There are four more that are quite important. And what I would say is that principle number one is something to start thinking about for people who are hoping to motivate their students more, and that is coming up with authentic problems or tasks for students to engage in. Authentic problems or tasks are real-world tasks, in other words, whole tasks. And I know this becomes a huge challenge with these large-enrollment courses, so I don't want to go there right now; I'm just going with what will make a difference. Because with those tasks, first of all, students will see how they are relevant, okay, to what it is you want them to be doing. And then they're going to do not just one, but a number of them, and the key principle there is increasing the complexity of those authentic tasks over time. So that's the overriding principle, and that's consistent with a lot of the research that's been done on problem-based learning, especially in medical schools and other professional programs. The second principle that makes a difference is called activation, and that's getting students to connect what they already know with what they're supposed to be learning new.
And that's called the activation principle. The third one is demonstration: the students see examples of somebody, either the instructor or others, solving those tasks or problems. They see how it's being done. Then there's application, where students get to try it themselves. And then finally, integration, which is where students can take what they learn and bring it into their own lives, use it somehow, use what they've learned. Those principles, when taken together, will improve learning and increase the likelihood that the kind of long-term retention that you're aiming for is going to happen, and not just recall of facts. Okay? And we're talking about wide ranges of learning, whether it's a surgeon removing somebody's appendix or people learning to drive a truck. I mean, it applies to a wide range of different kinds of learning, not just what I would call factual recall and basic concept learning, which is what you're going for, and that's well reasoned. No, I think, yeah, I didn't really clarify that at the beginning, but a lot of the questions that are in the MIP 300 bank, the pools of questions, are conceptual questions. There is some jargon-language stuff in there, but a lot of them are conceptually based, pretty complex concepts in the responses. But I don't disagree with you. I didn't want this to come off as if this was the end of learning or the solution to learning. What we were trying to do, if you remember where I started the conversation, was look at the behaviors that we were seeing students engage in, behaviors that were less than optimal with the quizzing. We wanted to... Yeah, I get that. And also, this is my first participation in this particular learning analytics group, or summit.
But I'm really pleased, having been here about three days now and listening to or watching quite a range of presentations, I'm really glad to see that there are other people, not just educational researchers, but other people like chemists and biologists and so on, who are interested in how they can change what they're doing to improve learning. I find that very encouraging, I really do. I mean, I've been around quite a while in education. I've been retired for 12 years now, well, 10 years. And I've seen a lot, and the attitude that I recall seeing that was so disappointing years ago was that research is all that mattered if you're at a research-one institution, which is where I was. Yeah, there's teaching as well, okay, I've got to do it, but they didn't care whether students were really learning or not. So yeah, it is kind of nice. That's a nice thing to see, definitely: the microbiologists getting engaged and the physics instructors getting engaged. Are there other questions? Carol? So we're exploring the use of some AI in our learning management system, and I could see applying this methodology very specifically being a good use for that. Because why do you have AI in there? Essentially, we're looking at two tools that act like, as they call them, tutors. The reporting with those is pretty high-level, but essentially they allow the students to constantly test themselves on the knowledge they gain. They're limited, obviously, and they're just one tool in the learning toolkit. But this has been really interesting for me. That's a revelation, because it's definitely thinking more specifically around behavior change, which is not something we have gotten to the point of with our tools here. But that could very much be one of the goals going forward.
I really like that idea, and just to circle back around to where Linda started when introducing me: one of the reasons I got started in the ethics component was because of some of the things that I saw going into intelligent tutoring. It looked to me as though, let's put it this way, they were going to make students dependent on the learning management system. Okay. And I immediately said, this is not an institution I want to work for. I don't want to work for an institution that's not teaching people how to learn. Okay? So I immediately put together a task force to look at the ethics of learning analytics, and we pushed back against a bunch of those initiatives that were happening on our campus, specifically because I didn't like that direction. And believe me, there were plenty of faculty members who didn't like that direction whom we brought on board, and we wrote these concerns up. And then that kind of spurred me into this, right? Because I was like, okay, I am in this field of learning analytics, and I think there's opportunity here. That's when we started looking at things like, okay, how could we coach students better in the learning space? And that's where we ended up with the behavior work. It's been a long journey. It's been fun. Other questions? Yeah, I had a quick question. So, I loved this presentation, as an instructional designer who has to have many conversations about the usability and purposes of quizzes; I definitely appreciate the metacognitive aspect here. But is there any plan to make this application available outside of your institution? Yeah, so we actually have been working with Unizin to make it available through the Unizin server platform. We did run an experiment, and that experiment, which was during COVID, was sort of shut down for a period of time, and we're trying to bring it back online.
So if you're a Unizin member... Jeff, I don't know where you are. You're at Michigan? Okay, Western Michigan University. Okay. Yeah. So that's one way to go about that. If you're interested in talking to me further about potentially using the application, then reach out to me, okay? Because you could also run it locally; obviously, we're running it locally right now. Someone is asking whether the app's instructional videos are available publicly. If whoever that is reaches out to me, if you're still here, I'll see if I can find a way for you to see them. That interests me as well, just to see how you can lay out the information for students. Yeah, okay. I'd be glad to share that with you. How do you spell Unizin? U-N-I-Z-I-N, I think, is that right? That's correct. Okay, Unizin.org. Zinfandel, I think, is where the name came from. I think. Okay, no other questions? Thank you. Yeah. Thanks, everybody, if there are no other questions. Any other things to say? Linda as well: thank you so much for coming and talking about your innovations. We've been watching them over the years, so we're really happy that you shared this with us. And I didn't realize it was going to be such a discussion, which made it even better.
Description of the video:So anyway, thank you, Shannon, and I do appreciate your involvement in this work. I think you've come to this summit every year so far, in person and virtual, so it's really nice to see colleagues from Ohio State and share work that intersects with the interests we all have. Shannon Jaggars is the Assistant Vice Provost at the Ohio State University and directs the Student Success Research Lab, where she leads both qualitative and quantitative research projects. They focus on academic support programs, patterns of student academic progression, and instructional improvement initiatives. The thing that is really of interest for her talk today is that she published a book in 2015, published by Harvard University Press and co-authored with Thomas Bailey and Davis Jenkins, called Redesigning America's Community Colleges: A Clearer Path to Student Success. The research there, which is quite rich, presents evidence which supports the need for fundamental redesign of the way two-year colleges operate, stressing the integration of services with more clearly structured programs of support for students' goals. So, thank you very much, Shannon. I look forward to hearing your talk today, and I will turn it over to you. It sounds like you're also willing to take questions during the talk, so if folks want to raise their hand or write in the chat, I'll just interrupt you at the appropriate time and we'll go from there. Absolutely. Yeah, I'm happy to stop and go into more detail at any point. We have an hour and a half for this session, but George said that was a maximalist timeframe and that he does not expect this to take that long. So we have plenty of time to stop and chat about things if something comes up that you find interesting. So this talk is about transfer crediting among university graduates.
The question that we're asking with this paper is whether a statewide course equivalency policy increases the application of transfer courses to bachelor's degrees. So George mentioned that I've written a book on community colleges. I'm a big fan of community colleges, and this is a big issue for community college students. My collaborators on this project are Marcos Rivera, who was a postdoc in my lab and is now at the NCAA, and Katie. This is part of a larger project on transfer students that was funded by the Joyce Foundation. Last year at this conference, we presented some preliminary findings from our study of degree audits, and we've advanced that work over the past year and divided it into two papers. This paper is focused on transfer credit, and we're writing it for a policy audience, because a number of states have passed legislation related to transfer credit articulation, but nobody is really clear about how these policies work or whether they have an impact. So here we're trying to get a sense of whether Ohio's credit transfer policy makes a difference in terms of how our students' transfer credits are processed and applied. First, I want to give you some background on why we should care about this. Almost 40 percent of college students transfer between institutions at some point in their college career, and these students often lose course credits at the point of transfer. A big national study, Monaghan and Attewell, found that for 14 percent of community college transfer students, the receiving institution accepted almost none of their community college credits. For about 58 percent of students, the four-year college accepted all or almost all of their credits. And then there's the middle group, where the four-year college accepted between 10 and 89 percent of their credits. And this extent of credit loss really did impact the students' likelihood of graduation.
So those who were able to transfer most or all of their credits had 2.5 times greater odds of graduating within six years compared to those who transferred less than half of the credits that they had earned. Based on that, Monaghan and Attewell estimated that if we were able to get rid of credit loss entirely, and that's for college-level credits, not even counting remedial credits, then bachelor's attainment rates among community college transfer students would go up by about nine percentage points, from 45 percent to 54 percent. So state policymakers are very taken with this state of affairs. They don't like the idea that they are paying community colleges, or subsidizing community colleges, to have students take courses that those students then have to take again when they get to the state's four-year colleges. This is considered a waste of taxpayer money, and so they would like to reduce this situation as much as possible. So a lot of states have passed legislation related to credit articulation and course articulation. These are some screenshots from a recent report that came out from the Tackling Transfer project. I'm on the policy advisory board for Tackling Transfer, and so I provided some support in the creation of this report. I just added a couple of screenshots to give you a sense of how intensively states are pursuing the goal of trying to make course and credit transfer more seamless, especially between a state's community colleges and four-year colleges. I'm not going to go into detail about what all these different policy elements are; they're all focused on trying to make credit transfer more seamless. And you can see in the map that the states with the deeper blue color have more of these kinds of policies, or more advanced policies, whereas the ones with the lighter blue color have fewer policies, or maybe only one of these foundational policies.
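The 2.5-times-greater-odds figure cited above can be made concrete with a small worked example. The graduation counts below are invented purely for illustration; only the resulting odds ratio matches the figure from the study.

```python
# Worked example of an odds-ratio comparison; student counts are invented.

def odds(p: float) -> float:
    """Convert a probability of graduating into odds (p against 1-p)."""
    return p / (1 - p)

# Hypothetical six-year graduation rates by credit-transfer outcome
grad_rate_most_credits = 600 / 1000   # transferred most or all credits
grad_rate_few_credits = 375 / 1000    # transferred less than half

# odds(0.60) = 1.5, odds(0.375) = 0.6, so the ratio is 2.5
odds_ratio = odds(grad_rate_most_credits) / odds(grad_rate_few_credits)
print(f"odds ratio: {odds_ratio:.1f}")
```

Note that an odds ratio of 2.5 is not the same as a 2.5-times-higher graduation rate; odds ratios exaggerate relative to probabilities when the outcome is common, which is why the attainment-gap estimate is expressed separately in percentage points (45 to 54 percent).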
All of these policies, not always but often, in most states, are predicated on course equivalency as one of their core foundations. And so that was one of the reasons why we thought it would be really interesting to look at the concept of course equivalency, what it means and how it plays out in practice for real students. The movement around thinking about this started as early as the 1920s, not in terms of statewide policy, but in more individual college-to-college relationships. So this is actually a social network graph. It's something that I found through Creative Commons open licensing, but I liked it because to me it really illustrated the same kind of situation that you see with these individual institutional articulation agreements. What you have with those individual agreements is that individual pairs of colleges that have a high volume of transfer start working with each other to try to help students transfer seamlessly between just their two colleges. So, for example, one college's introductory calculus course would be accepted and treated as equivalent to the partner university's introductory calculus course. You can kind of see that there's just this extremely complex pattern of institution-to-institution articulation agreements. And so if you're going between two colleges that are quite close together in this graph, you might have lots of courses that you could transfer between them. But if you're going between colleges that are quite far apart in the graph, you might have few or no courses that could transfer between them. And just to zoom in on that idea a little further, here's another social network diagram that I added some text to, to help illustrate this idea. So in this theoretical example, we have University of State kind of in the middle of the diagram.
And they have lots of course agreements with community colleges and four-year colleges around their state. So you can see that they have some course agreements with an urban community college, with Urban State University, Urban Tech University, Urban Tech Community College, and Suburban Community College. Then we have another university, Small City University. It also has agreements with other colleges, not the same colleges as University of State, although some of the same colleges as University of State. These individual institution-to-institution articulation agreements most typically include the very common freshman and sophomore courses: gen eds, prereqs to majors, or introductory major courses. So very commonly you'll see introductory calculus, English composition, Spanish, psychology, organic chemistry, the first introductory courses in each of those disciplines. Can I ask you a couple of questions? Sure, sure thing. So, even going back to the other slide, I guess the way this network is working is that the further away a node is... what? No, that one is just a conceptual illustration. Okay. It's not a real graph; as I mentioned, it's openly licensed imagery, but it gives you the same kind of idea that you would see with a social network. You see it when you talk to lots of institutions about who they have articulation agreements with: it tends to be the institutions that are close by, where they have a lot of students transferring between them. You're not going to see an articulation agreement between, say, a community college in South Dakota and a university in Minnesota, right? Right, right. So yeah, sorry, I digress. The point of this slide was to say that articulation agreements began in the 1920s. That's how long ago we started thinking about this particular process.
Think about how complicated this is; you would have no way of knowing all this as a student, right? Yeah. And even beyond that, there is no one individual who would know all of this. The only way I know which universities or colleges one or another articulates with is through the ones where I happened to work, and they maintained records of each one of those individual agreements, right? Right. Yeah. Okay. So, let's see. In addition to those sorts of introductory courses, sometimes institutions will layer program articulation agreements on top. So for example, a psychology major at one university might specify which courses from a local community college should be accepted into that psychology degree. And so there's a very localized, very complex system that arose over time, and a single college could accumulate hundreds of different articulation agreements with other colleges, both near and far, very different agreements with each of those colleges. You can imagine that reviewing and updating all those agreements is really time-consuming, in a way that might not actually be feasible. So if the curricula change, the agreements might not be updated in a timely fashion, and that creates misinformation and confusion. This is the state of affairs that state policy started to wade into ten to fifteen years ago, I think, when this started to be a really big policy movement. So, to simplify and streamline this multitude of agreements, a lot of states created statewide course equivalency policies. And so I'm going to zoom in a little further, actually, just to say a couple more things about this. If we were looking at, say, University of State, okay, so we have University of State here in the graph, and we see that they do have some agreements with Urban Community College and some agreements with Suburban Community College. So let's zoom in a little further on what that means. Again, this is just an example.
This is representative of the kind of thing I've seen in real life. So University of State might have an agreement with Urban Community College that they will take their Intro Calc class, and that will be the equivalent of University of State's trigonometry and geometry class. So if you took Intro Calc at Urban CC, you know exactly how it's going to translate when you get to University of State: it's trigonometry and geometry. If you took Intro Physics, that's going to be algebra-based physics at University of State. If you took Physics for Majors at Urban CC, that's going to be calculus-based physics at University of State. If you took Intro Psychology at Urban Community College, well, there's no agreement for that. So it might get converted into Psychology 101 at University of State, if, say, the psychology chair or a credit coordinator takes a look at it and says, oh yeah, this looks like Psychology 101. Or it may not. Network Analysis, similarly: no agreement, so we're not real sure what's going to happen. Now, if instead of coming from Urban CC you're coming from Suburban CC, their intro calc course in this example would translate into Calculus 1. It translates differently than Urban CC's. Who knows why that is? It may very well be that Urban CC's intro course is actually below their Calculus 1 course; it's just called Intro Calc, and their next course up is the one called Calculus 1. So that might be why it translates differently than this one does. Or it might just be that University of State's math chair doesn't think highly of Urban CC's math courses and doesn't think they deserve to be called Calculus 1. That's also possible. There are similar situations here: at Suburban Community College, their Intro Physics becomes calculus-based physics at University of State. That might just be because of different naming conventions.
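An articulation agreement of the kind just described behaves like a lookup table keyed by sending college and course, with anything missing from the table falling through to human review. Here is a minimal sketch of the slide's example; all of the college and course names are the hypothetical ones used in the talk, not real catalog entries.

```python
# Sketch of an articulation agreement as a lookup table. Keys are
# (sending college, course); values are the University of State course
# the credit converts to. All names are hypothetical.
ARTICULATION = {
    ("Urban CC", "Intro Calc"): "Trigonometry and Geometry",
    ("Urban CC", "Intro Physics"): "Algebra-Based Physics",
    ("Urban CC", "Physics for Majors"): "Calculus-Based Physics",
    ("Suburban CC", "Intro Calc"): "Calculus 1",
    ("Suburban CC", "Intro Physics"): "Calculus-Based Physics",
}

def translate(college: str, course: str) -> str:
    """Apply the agreement; courses without an entry (e.g. Intro
    Psychology or Network Analysis) need a human reviewer."""
    return ARTICULATION.get((college, course), "manual review required")

print(translate("Urban CC", "Intro Calc"))       # Trigonometry and Geometry
print(translate("Urban CC", "Intro Psychology")) # manual review required
```

Notice that the same course title ("Intro Calc") maps to different receiving courses depending on the sending college, which is exactly the asymmetry the speaker highlights.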
Or it might be because of perceptions of the quality of each of these colleges' physics departments. And then, still further, if we zoom in and look at Urban Community College: let's say you took Intro Physics there. You sent your Intro Physics course to University of State, and it becomes algebra-based physics, according to the agreement. But then how is it applied to your degree? It depends on what major you're in. So in this example at University of State, if you're an earth sciences education major, algebra-based physics is a specific requirement for that major. So, check, you're good. If you're an English major at University of State, algebra-based physics fills the science GE. So again, great. If you're a physics major at University of State, algebra-based physics is not considered part of the physics major. It's a non-majors course, the course you can take before calculus-based physics if you feel like you really need an introduction to physics first. But given that calculus-based physics is what you're supposed to take, if you took algebra-based physics, it's probably not going to qualify as a science GE either, because you're taking so many of your other courses in the sciences that algebra-based physics is probably not going to be used to fill that bucket. And it's not a major requirement, so it's probably going to become an elective. So, does that make sense? Yes. How does that resonate with what you see at your own universities? Does it look familiar? Again, it's just an example based on what I've seen in some places. Yeah. And I think what you're showing us here is that there are many different decisions being made by many different people with many different interests in how these courses get translated, right? Is that a good summary of what you've learned so far? Yeah.
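The major-dependent applicability just walked through can be sketched as a toy degree-audit pass: where a translated course lands depends entirely on the major's rules, and anything no rule claims falls to electives. The requirement tables below are hypothetical, echoing the algebra-based-physics example from the talk.

```python
# Toy degree-audit pass; all requirement tables are hypothetical.
MAJOR_RULES = {
    "Earth Sciences Education": {"Algebra-Based Physics": "major requirement"},
    "English": {"Algebra-Based Physics": "science GE"},
    "Physics": {},  # algebra-based physics counts for nothing specific
}

def audit(major: str, courses: list[str]) -> dict[str, str]:
    """Assign each translated course to the bucket the major's rules
    name, or to 'elective' when no rule claims it."""
    rules = MAJOR_RULES[major]
    return {course: rules.get(course, "elective") for course in courses}

print(audit("English", ["Algebra-Based Physics"]))
print(audit("Physics", ["Algebra-Based Physics"]))
```

The same transferred course thus ends up as a major requirement, a GE, or a mere elective, depending only on the degree it is audited against.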
And I'm going through all this background because when I get to the data analytics part of this presentation, it gets really confusing. Yeah, it's already confusing, right? Yeah. Let me check the chat and see; I think somebody put something in. Yeah, okay. But anyway: I understood, I understand well what you've laid out, and it makes sense. Yes. Yeah. Thanks, Casey. Okay. So what states have been trying to do is pass legislation related to statewide course equivalency, and many of them have. Basically, the theory of change here is that the state requires some colleges, usually just public colleges, to recognize and honor each other's courses. Let's take introductory calculus as an example. The colleges are supposed to take each other's introductory calculus courses and accept them and apply them in exactly the same way as they do their own introductory calculus course. So if colleges comply, then they'll alter their business processes to ensure that equivalency, and that should increase the likelihood of degree application: that calculus course should apply to the student's GE or major requirements in just the same way that the calculus course they would have natively taken at that institution would. Now, this could also shift student course-taking toward the courses that are covered under the agreement, because students know that those courses are safe; if they take those courses, they know that they'll transfer. So if students changed their behavior that way, it would help magnify the impact of the policy, but it's not necessary in order for the policy to work. But this theory of change might run into a couple of challenges. First, colleges might feel uncomfortable with the policy. They might feel like, well, our intro calculus course is really different from those of the other colleges.
And students aren't going to be successful moving forward if they don't take our calculus course. So, for example, advisors might tell students: yeah, I mean, we'll take your calculus course, but I don't think you're going to be successful; I would recommend you take it again here. That could happen, and in that case, the first calculus course probably would not apply to the student's degree, because it would be a duplicated course. It could also be that colleges aren't assiduous in complying with the state policy, and so those courses that are supposed to be processed automatically as equivalents maybe aren't always processed as equivalents. And then, even if colleges do comply with the policy, the path from course equivalency to course applicability seems simple, but it represents a series of complicated and unclear mechanisms that could interlock in unexpected ways and could be moderated by the student's academic pathway. That's why I've got that little dotted line up from the white box. So what we're exploring in this paper is where this path from equivalency to application seems to work and where it breaks down. And we're focusing on one large university in Ohio, which shall go unnamed. So, some background information on Ohio and our policy. Back in the nineties, Ohio created an equivalency policy that was focused on general education courses. Then over time they built on this with equivalencies for pre-major and introductory major courses. They've since layered in technical courses and military credit, and in recent years they've been building on all this with statewide 2-plus-2 agreements that we can talk about at the end; those were not yet in place during the timeframe that we're studying. This policy framework applies to all of Ohio's public colleges. And it's really important to say here that Ohio's policy implementation has been really collaborative.
So the way that it works is, for each course, like a Calculus 1, the state has a faculty panel that includes representatives from two-year and four-year publics. Each panel, at the outset, collaborated to establish statewide common learning outcomes for Calculus 1, and then on a regular cycle they meet to re-examine and update those learning outcomes if necessary. Individual colleges have flexibility to tweak or add their own learning outcomes, but they have to be at least 70 percent the same as the state standards in terms of the learning outcomes. So if their intro calculus course meets the statewide standard for Calculus 1, then it will be approved by the state and go into a statewide database labeled as Calculus 1. This database is accessible through a web-based tool, so anyone can check and see how their Calculus 1 or intro calculus course will be processed into an equivalency at another college. When a participating college receives a course tagged as Calculus 1, they will convert it directly into their own equivalent of Calculus 1, which might not be called Calculus 1; it might be called Calculus 1100 or whatever. At our university we call that an active rule, which means that no real person ever sits down and looks at the course to decide which of our several calculus-related courses it should map onto. The system just knows: we received this course from this college, so it becomes Math 1150, with no person intervening in that process. But if a college receives a course that doesn't have an active rule, then an institutional agent has to evaluate the course to determine whether their college offers an equivalent. If the course has an equivalent at the receiving college, then its applicability to the student's current degree will be immediately clear.
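One way to read the 70-percent alignment requirement just described is as a set-overlap check: treat the learning outcomes as a set and ask what fraction of the state's outcomes a college's course covers. That reading, and all the outcome names below, are assumptions for illustration, not the state's actual rubric.

```python
# Sketch of the 70-percent alignment check, under the assumption that
# "70 percent the same" means covering 70% of the state's outcomes.
# Outcome names are invented placeholders.
STATE_CALC1_OUTCOMES = {
    "limits", "derivatives", "chain rule", "related rates",
    "optimization", "definite integrals", "fundamental theorem",
    "applications of integration", "implicit differentiation",
    "curve sketching",
}

def meets_state_standard(course_outcomes: set[str],
                         state_outcomes: set[str],
                         threshold: float = 0.70) -> bool:
    """A course may tweak or add outcomes, but must cover at least
    `threshold` of the state's outcomes to be tagged Calculus 1."""
    covered = len(course_outcomes & state_outcomes) / len(state_outcomes)
    return covered >= threshold

# A college course covering 8 of the 10 state outcomes, plus one extra
college_course = (STATE_CALC1_OUTCOMES
                  - {"curve sketching", "related rates"}) | {"history of calculus"}
print(meets_state_standard(college_course, STATE_CALC1_OUTCOMES))  # True
```

Once a course clears this check and is tagged in the statewide database, the "active rule" conversion downstream is a pure table lookup, with no human in the loop.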
So, depending on the college, a student or their advisor can run a degree audit and determine how the student's transferred courses are going to fill different requirements for that degree. But if the incoming course has no equivalent at the receiving college, it may still be accepted, but its applicability to the student's degree is unclear. So — Janet, may I interrupt again? When you just said that thing about running a process, what was it called? A degree audit? Yeah. Who's running the degree audit? It depends on the college; each college runs their own degree audits. I mean, is it the school that the student is transferring in to, or the school that they're trying to have credits applied from? Right. So let's say they're wanting to transfer to Ohio State. They can look up and see how their courses will be processed as equivalents, but they don't necessarily know exactly how those will apply to their degree. They've got to look at different degree maps and be like, okay. If they really want to know for sure, they can ask to have a degree audit run, or their advisor could run it for them, and say, okay, we see exactly where these courses are going: this course doesn't go anywhere, this course applies to the gen ed, and so on. That makes sense. Yeah, thank you. But if the incoming course doesn't have an equivalent at their new college, it's going to be unclear how it's going to be applied. So the student is going to need to work with their department's transfer credit coordinator, or a chair, or a senior advisor to figure out whether those credits can be assigned to any specific degree requirement bucket. So, to get an understanding of how this Ohio policy works, we're focusing on a large public university in Ohio. We extracted data from its degree audit system, and we looked at students who graduated with a bachelor's degree at any point across two academic years.
To make it simple — you'll see it's really not simple at all — but to make it as simple as possible, we included all the students who graduated with one major, no double majors. And for this analysis we include only students with any transfer courses, because what we really want to see is how those transfer courses were processed and applied to degrees. So this gave us almost 8,000 graduates; about half of those were officially transfer students. Together, across all these students, they transferred almost 100,000 courses. The average student in our sample had 13 transfer courses. Those who were officially transfer students had about 20 transfer courses, and those who started as freshmen had about four transfer courses. These were mostly either dual enrollment courses — college courses they took while in high school — or sometimes courses that they took during summer breaks from other universities or from, like, their home community college. So that's the sample that we're working with. We're excluding international and military credits from this analysis for a variety of reasons that I can talk about if any of you have a strong interest in international or military credits. Okay, and so then, who is in our sample? Before I get to who was in our sample — which colleges are in our sample? We have both in-state and out-of-state colleges in our sample: 104 in-state colleges and 152 out-of-state colleges. The in-state colleges sent over 7,000 students with 75,000 courses, and the out-of-state colleges about 2,000 students with 25,000 courses. These are all students who graduated from the university. And then this just gives you a sense, across all the possible breakdowns — two-year public and four-year public, in-state and out-of-state, and four-year privates in-state and out-of-state — of what the breakdown is.
So you can see that we do have — if not most, then more of — a concentration of the colleges being two-year public colleges, students coming from two-year public colleges, and courses coming from two-year public colleges. Okay? And then, in terms of sample demographics: our sample is about half female, and the races and ethnicities mirror the races and ethnicities of the larger university. The ages tend to be a bit older than our typical undergraduate student, because there are a lot of transfer students in the population, and there's a fair amount of first-generation students — first in their family to attend college. And then we have some academic characteristics. One of the things we were interested in looking at here was whether the students graduated with a competitive major. This is a major that you can't just enroll in once you're accepted to the university; you have to go through a selective, competitive process in order to enter the major. These are majors like engineering, business, and other majors that require you to complete certain prerequisites with certain grades in order to get into the major. Then, looking at the majors, we were also interested in knowing how many of these students had majors that required more than 120 credits in order to graduate — 120 being the minimum number of credits required to graduate from the university. Only about 25 percent of students in our sample were in these sort of high-credit-requirement majors; the other 75 percent only had to earn 120 credits for their undergraduate degree. Then we included this optioned-into-Exploration variable because we have this major called Exploration, which is for students who enter the university not quite sure what they want to study. And some students are put into that major, or "optioned" into that major, because they don't get into the major they wanted.
So this 9% of students in our sample who were optioned into Exploration represent students who had to change their major, or had to spend a lot more time making it into their selected major — so you could see how they might accrue excess credits during that period. And 52% were transfer students. And then we were also interested in this question: many of our students do change majors at some point. But when we looked at the average proportion of time that students spent in the final degree that they earned, most students were in their degree for at least 75 percent of the time that they were at the university, and we split it at about that point. We felt like we might see differences between students who changed majors a lot, or changed majors late, versus students who never changed majors or changed their major early. So that's what that variable is getting at. Okay. And then we also looked at the courses in our dataset to see how many of them had an active rule. If you remember, an active rule means that the course is covered by state policy — the policy says this course should be processed as an equivalent. In our sample, 38% of courses had an active rule and the other 62% did not. So 38% were covered by the state policy, and those are all going to be courses from Ohio publics, because Ohio privates and non-Ohio colleges are not covered by the state policy. And so, for those active rules — courses covered by the state policy — does that mean the courses are processed as equivalents? In general, yes. Of the courses that had an active rule, about 90% of them were processed as equivalent. The 8% that were not processed as equivalent — that was also allowable by state policy, because they were courses that are not offered at the university.
So let's say, for example, that a student takes a technical math class at another university, or another community college, that's covered by state policy — but the institution that they are transferring to does not offer a course that is equivalent to it. The university does not have to process it as an equivalent, because it can't, really; it figures out something else to do with that course. So at our university, of the courses covered by state policy, only about 8% were not processed as equivalents. Of the courses that did not have an active rule, a lot of them were also processed as equivalents: about 44% were processed as equivalents and the rest were not. Okay. So now we're starting to get more into the analysis. I'm showing you first the out-of-state colleges, because these are all courses that were transferred without an active rule — not covered by state policy, obviously, since they're out-of-state colleges — just to show you how they are processed when they get to the university. It's a little bit of a counterfactual: what might we expect if there's no policy covering it? So we can see the type of college on the left — non-Ohio two-years, non-Ohio four-year publics, non-Ohio four-year privates — and then we have how they were processed on the right-hand side. A course can be processed as an equivalent: the top two buckets are equivalent courses, and the bottom is non-equivalent courses. In the top two buckets, we have about 300 courses at the university that are tagged as GE courses. These are courses that can be used to fulfill GE requirements. They're not always used to fulfill GE requirements, but you can't really fulfill most GE requirements without these courses. So these are your typical GE courses: your philosophy, English, introductory math, the introductory languages, social sciences introductory courses, and so on.
The equivalent non-GE courses are courses processed as an equivalent, but not courses tagged as GE, so they probably wouldn't be applied as a GE either. So you can see that across all of the out-of-state colleges, it's more likely than not that a course will not be processed as an equivalent — it will be processed as a non-equivalent. But some good chunk of them do get processed as equivalent GE courses or equivalent non-GE courses. And so compare that to the situation for the in-state colleges. Here are the in-state colleges: the public colleges are covered by state policy, and the private colleges are not. So for the public colleges, these courses have active rules attached to them if that particular course is covered by state policy. We've color-coded it here: active rules, covered by state policy, are blue, and non-active rules are gray. You also see a slender thread of active rules coming from the Ohio four-year privates. That's because during this time period under study, we started to create some active rules for courses that weren't explicitly covered by policy. We saw that the active rule approach was working really well and we decided to start expanding it to other colleges that we had articulation agreements with, or thought we should create articulation agreements with. But in general, if it's blue it's covered by state policy, and if it's gray it's not. So you can see that courses coming from the publics are much more likely to be processed as equivalents, especially those that are covered by the active rules — covered by the state policy. So then the question is: okay, they've been processed as equivalent courses, for the most part. But what does that mean in terms of their application to a degree? Because that's what the state policy really cares about. I mean, the equivalency is just a mechanism; what we care about is how the course applies to the student's degree.
So we looked at how the courses were applied to a student's degree. And this is, again, just transfer courses — we're only looking at courses that students transferred in. About 43 percent of them were applied to the student's GE requirements, about 8% were applied to major requirements, and about 8% were applied as electives — yeah, all right, I'll come back to that. And then about 3% were inapplicable. Inapplicable courses are courses that simply cannot be applied to the degree. These are going to be remedial courses — developmental, sort of below-college-level courses — and duplicated courses. So for courses that you took a second time, you can apply one of them to a degree but not the other; generally students apply the one that they passed with the higher grade, and the other is excess. Okay? Electives are courses that didn't fit into the major bucket and didn't fit into the GE bucket, but that can be used to fulfill the remaining credit hours that are required for the degree, which is usually 120 credits, as I said, but could be more. And then there are the excess applicable courses. After the student has filled their major bucket, filled their GE bucket, and filled their electives bucket, they may have additional coursework which is qualified to fill one or more of those buckets — but it's not necessary, because the buckets were able to be filled without these courses. Usually such a course is fully fungible with a course in another bucket: it could have been used as a GE, or as a major course, or as an elective, and it's just kind of a matter of chance whether it ends up in one of those buckets or ends up as an excess course. Okay? So the major question that we had is: would equivalent courses be more likely to be applied to the GE or the major? Being applied as an elective is not that helpful, because you can apply almost any of these courses as an elective.
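The bucket logic just described — inapplicable, major, GE, elective, and excess — can be caricatured in a few lines. This is our simplification for illustration, not the university's actual degree-audit engine, and all the field names are invented.

```python
def classify_transfer_course(course: dict, audit: dict) -> str:
    """Assign one transfer course to a degree-audit bucket, in the priority
    order described in the talk. Mutates `audit` to track remaining need."""
    # Remedial/developmental and duplicated courses can never apply.
    if course["below_college_level"] or course["duplicate"]:
        return "inapplicable"
    if course["fits_major"] and audit["major_slots"] > 0:
        audit["major_slots"] -= 1
        return "major"
    if course["fits_ge"] and audit["ge_slots"] > 0:
        audit["ge_slots"] -= 1
        return "ge"
    if audit["elective_credits"] > 0:
        audit["elective_credits"] -= course["credits"]
        return "elective"
    # Qualified to fill a bucket, but every bucket was already full.
    return "excess"
```

The key point the sketch captures is that "excess" is not about course quality: an excess course would have counted somewhere if it had been processed before the buckets filled up, which is why the speaker calls the bucket assignment partly a matter of chance.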
An elective course can be anything, as long as it's not one of those inapplicable courses. And so having a course apply as an elective is like — yeah, it's okay, but it's really not what students are looking for. They really want their transfer courses to apply to the GE or apply to the major. And so we created this variable — did the course apply to your GE or major — and we looked at whether equivalent courses were more likely to do so. This is what we see. You can ignore the color-coding by active rule right now. But if a course was processed as an equivalent GE course, on the left, then the likelihood that it would be applied to the general education bucket of the degree audit was really high. You can see that almost all of the equivalent GE courses were applied to the degree as general education — to fulfill the general education requirement. If it was processed as an equivalent course that was not a GE, it kind of spreads out: some of them go to general education, but not very many; some of them are major courses, applied to the major; some of them end up as electives; and some of them end up as excess. Then, if your course was not processed as an equivalent — and therefore can't be processed as a GE here — some of them do end up being applied to the general education, but very small proportions of them. Some of them do end up as a major course or an elective, but most of them end up as excess credit. And so then we also did some logistic regressions, trying to understand the likelihood that if you were to transfer in a course from any of these four types of colleges — in-state or out-of-state, private or public — it would apply to your degree as a GE or major. And we did this separately for courses that were processed as equivalents versus not as equivalents.
And here we're controlling for a couple of other institutional characteristics, like whether it was a two-year or four-year college, and whether it had the word "tech" in its name — because, with exceptions like Ivy Tech, most of the time if it has the word "tech" in its name it's more of a technical college, and so its courses are probably not going to be equivalent, or not covered under policy, even if it's an in-state school. So, to understand what this chart is saying: if you look at the top set of non-equivalent courses — these are courses that came in and were processed as not equivalent — if the course came in from an Ohio public, the likelihood that it would apply to your GE or major was about 13 percent. You have about a 13 percent chance that it would apply. If it came from an Ohio private, about a 15 percent chance; if it came from a non-Ohio public, about an 18 percent chance; and from a non-Ohio private, about a 16 percent chance. So non-equivalent courses, no matter where they came from, were pretty unlikely to apply as a GE or major. The equivalent courses — these are all in the seventies. So there's a pretty strong chance, if your course is processed as equivalent, that it will apply as a GE or major, no matter what kind of college it came from. But your chances actually seem a little bit higher if that course came from, say, a non-Ohio private, and a little bit lower if it came from an Ohio public. So the real thing that we're seeing here is the impact of equivalency. Okay? So does the theory of change hold up? Well, at least at this university that we're studying, there is a very strong policy impact, in the sense that students' courses are much more likely to be processed as equivalent if covered by state policy. So that part definitely holds up. And again, this theory of change was referring to math specifically.
But in general, if the course was processed as equivalent, it was also more likely to apply to the GE or major. So it does kind of seem like we have a strong theory of change here. But there's a lot that happens to complicate the picture. So all of these Sankey diagrams that we've seen so far — we've put them all together right here into the same chart, so that you can see how it all works together. And what we see, when we look a little bit more carefully at the results, is that if we do logistic regressions predicting whether the transfer courses were applied to the GE or major, and we compare between these four different types of colleges, we do see that, in general, the Ohio publics seem to have a higher probability of the course applying to the GE or major — a 53 percent chance. And then it goes down the list: Ohio privates, then the non-Ohio colleges. If you're a non-Ohio private, you don't have a great chance — no matter whether the course is equivalent or not — that it's going to apply to your GE or major: about a 42% chance. So it kind of seems like the Ohio publics are getting a unique advantage here, because they're covered by the state policy. But maybe not so fast. When we predict course equivalency without controlling for student characteristics, we see really strong positive main effects for publics: if it's a public college, no matter if it's in-state or out-of-state, you're more likely to have the course processed as equivalent. We see positive main effects for in-state: courses coming from an in-state college, whether public or private, are more likely to be processed as equivalent. But what we're really interested in is the interaction, public by Ohio, because that's the unique boost that you might get from the policy. And we do see a strong positive interaction there.
So it does seem like the policy is making a really big difference in terms of courses being processed as course equivalents. But then, when you go to predict application to the major or GE — like I showed you, it looks like the Ohio publics are ahead, but that includes both the main effects and the interaction effect. And really, what's going on there is mostly main effects. There's a strong positive main effect for publics; there's a strong positive main effect for in-state; but then there's kind of a weak and inconsistent positive interaction term, which kind of suggests, like: Ohio publics' courses might be more likely to apply to the major or GE anyway, even if there were no policy, just because they're in Ohio and they're publics. The other colleges know them — they kind of know what their courses are. They're public, so their courses tend to be more similar in general; even in the absence of state policy, their courses might tend to be more similar. So it's hard to really know whether it's the policy that might be making a difference here, or just being a public and being in-state. And so we might see, in any state, even one without a policy, public in-state colleges having their courses be more likely to apply to majors or GEs. It's just a little bit fuzzy. And then, if we control for whether the course was processed as equivalent or not, and for student characteristics, we still see that positive, strong main effect for publics and strong main effect for in-state, but a near-zero interaction term. One of the problems with trying to look at this is that we are probably being too conservative in our attempt to isolate the impact of the policy, because we're trying to do it on all transferred-in courses.
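The main-effect-versus-interaction point can be made concrete with a toy decomposition. The coefficient values below are invented for illustration, not the study's estimates; the sketch just shows why the raw Ohio-public advantage conflates two main effects with the small policy-specific interaction.

```python
import math

# Hypothetical logit coefficients (NOT the study's estimates): log-odds that
# a transfer course applies to the GE or major.
BETA = {
    "intercept": -0.4,
    "public": 0.35,            # main effect of being a public college
    "in_state": 0.30,          # main effect of being in-state (Ohio)
    "public_x_in_state": 0.05, # interaction: the policy-specific boost
}

def apply_probability(public: bool, in_state: bool) -> float:
    """Predicted probability from main effects plus the policy interaction."""
    z = (BETA["intercept"]
         + BETA["public"] * public
         + BETA["in_state"] * in_state
         + BETA["public_x_in_state"] * (public and in_state))
    return 1 / (1 + math.exp(-z))
```

With these made-up numbers, Ohio publics (`public=True, in_state=True`) sit ahead of the other three groups, but almost all of the gap comes from the public and in-state main effects — mirroring the speaker's point that a near-zero interaction term leaves the policy's unique contribution hard to pin down.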
Whereas the policy only covers, you know, those introductory, freshman- and sophomore-level courses. So if we could restrict the analysis to just those kinds of courses that tend to be covered by these policies, maybe we would see an impact pop out. But we can't restrict our analysis in that way, because for the incoming courses that are not covered by the policy, we don't know if they're the kind of courses that would be covered by the policy. So if we see an intro calc course coming from, say, Indiana University, we don't know what that intro calc course means — we don't know if that means below Calculus 1 or if it means Calculus 1. The only way we can tell is by looking at the syllabus, and we don't have the syllabi in our analysis. So we don't know whether that's the kind of course that would be covered by policy or not. Okay. So, as we were thinking more about what could be going on here, we thought a lot of this could also have to do with the students' pathway characteristics — that dotted line, a moderating impact between whether the course is processed as equivalent and whether it applies to your major. So we started looking at some of the student academic and pathway characteristics. One of the things that we saw is that students who came in as freshmen — which are all these students over here — their transferred-in courses were more likely to apply to the GE or major than transfer students' transfer courses were. The other thing that we saw is that students who were in non-competitive programs — that's this cluster and this cluster — their courses were more likely to apply to the GE or major than the courses of students who graduated from competitive programs, which are these clusters.
So we decided to look at that public/private by Ohio/non-Ohio interaction for each of these clusters, to see: can we isolate the impact of being an Ohio public — having courses that are covered by state policy — over and above just being in-state or being public, for any of these clusters? And what we saw is that for transfer students in non-competitive programs, there is a positive coefficient for that interaction of public by Ohio, which boosts the public Ohio institutions above the other three institutional types, even beyond the main effects of being public or being Ohio. But we don't really see that for the other three groups. So we're continuing to do work on this paper. Really, what we're trying to do here is to draw out the things that will help policymakers understand what's working and what's not working about the current policy, and how they can try to make transfer more seamless on top of this baseline credit equivalency. Okay, so some policy impacts. Equivalency is very useful, but it's not sufficient. If a course is equivalent, it's more likely to apply to your GE or major — but that doesn't mean it necessarily will. Also, automation is really important in terms of ensuring adherence to the policy. The institution that we're talking about here has doubled its number of active rules in the past several years and now has nearly 100,000 active rules, including many for out-of-state and private institutions' courses. And it had to create the infrastructure to do that: because we had to adhere to state policy, we needed a way to do it efficiently, so we built the system of active rules to do that.
And once we built the system, and we saw the positive effects on processing efficiency, the institution was motivated to scale up that active-rules approach to all kinds of institutions. This has been happening in the past couple of years, especially since the students in this sample graduated. So in this way, the statewide policy might be creating a sort of rising-tide-lifts-all-boats situation. The policy might not, in the future, provide a unique boost for in-state residents alone, but it created a situation that has been benefiting all students. So trying to isolate the impact of the policy might not even make sense anymore, because it's helping everybody, not just in-state students. And then the other thing that we saw is that the competitive programs had less applicability to the GE and major. It may be that they're just more limited in what courses they feel they can accept as applying to their major, or even to their GE requirements, because of industry and accreditation constraints — because the companies that they are preparing students to enter are very specific about what they want students to be able to know and do. And so they may feel like they can't be flexible in what they allow students to apply to their GE or major credits. The other thing about competitive programs is that they have enrollment capacity constraints. They can't take all the students who might want to be in the program — they can't even take all the students who are qualified to be in the program and would be successful in it. They just don't have the space, because the labs are expensive and the faculty are expensive. And so that sort of creates some motivation to have a high bar in terms of prerequisites — which prereqs will apply, what grades they will accept, which prereq or major courses they'll allow to apply — to get into the program, and then, once you're in the program, to apply to the major.
And then the consistency of the GE is also an issue. You know, at many universities, including my university, you have a long list of GE-type courses, but what is acceptable to fill specific GE buckets in your major might vary from major to major. So for example, in a STEM major, you may not be able to take just any data analytics course to fulfill your data analysis GE — you might have to take a data analytics course that is very specific to your discipline. And similarly, fields like psychology or sociology often have very specific data analysis courses that they want you to take. I mean, you could take one in another department, like the statistics department, but it's probably not going to apply to that psychology statistics requirement. So, Shanna, yeah.
Description of the video:All right. I'm assuming everybody can see it; if not, then kick me. Okay. All right. So, first of all, thank you for the invitation, George. And I just want to say thank you for your leadership as well. Yes, I came to the summit, and you were doing what I wanted to see us do as well, and you've been encouraging many of us to do it. So thank you. And I enjoyed coming to Bloomington and I hope I get to do it again soon. So let me just give you a little overview of what I think we're going to do. First of all, I'm just here to stir the pot — I'll stir the pot and then promptly leave, and we'll see what we see. Just a little bit about UMBC: I want to talk a little bit about the origins and imperatives of analytics, and why I think learning analytics has stalled. I want to offer how I think learning analytics can work — it is not the only way; I'm putting this out there to stir the pot as well. Then I'll talk a little bit about FERPA as empowering, not limiting, and then we'll open it up to Q&A. Okay. So, just real briefly about UMBC: if anybody's ever been to the Baltimore-Washington International Airport, we're five minutes from there — just about 10 or 15 minutes out from Baltimore, 25 to 30 minutes from DC. We're about, you know, 13,000 students. We've been, you know, a predominantly white institution that is now known for being the leading producer of under-represented minorities who go on to get PhDs in STEM. And that is no small thing, you know? A lot of that is about our president and leadership — our president, Freeman Hrabowski, who is retiring this year. We've also been doing a lot in many other areas in terms of working with minority-serving institutions and things like that. I want to just talk a little bit about our learning analytics community, which really was inspired by your own.
And I want to give all the credit to George and Linda and Dennison and Kurt as well, for the help that you have given by modeling the kinds of things that we can do. I think it takes a village, and you're going to see several references to some community events that I think are applicable, maybe even generalizable, and we'll go from there. So let's dive right into the learning analytics origins and imperatives. What I want to do is just open with a poll — and so, Kristi, I think this is our cue; we'll see if that comes up here. Yes, there we go. So: if I gave you a list of 100 students who were predicted to fail with 100 percent certainty, which answer best describes what you would do about it? And I'm going to just leave this up here for a second. Kristi, I believe everybody can see the results as they're coming in, but if you could just confirm, that would be awesome. Yeah, they should be able to right now. You can see "learn more about what the prediction is based on" is the leader. Yep, yep — keep coming, keep them coming in, because we're going to lean on this theme a little bit. Okay? All right, so most also say, yeah, we want to learn more about the prediction. So I think that's good, and "I'm very interested in something else" — maybe we'll be able to get to that as well. So I'm going to stop sharing this right now and move on. So we've answered that, and we're going to say the consensus is D. All right? And that's important — or are we going to say the consensus is C? But I want to talk about why D might be the answer we want to look at as well. And in fairness, this is kind of a bogus thing: I don't have any other context as to whether you're an instructor of these students or whether you're an administrator. By now, I know that I'm sort of leading us astray, but I want to give some credit to a colleague of mine for this little thought experiment. So, I first heard the word analytics —
I first read about it in Phil Goldstein's ECAR article about academic analytics. And what he specifically said was that it was an imperfect equivalent term for business intelligence. At the time there was no mention of learning; "academic" was really more the context where things take place — administrative use of business intelligence, the notion that we could bring data to bear in informing our decisions. Shortly after that, John Campbell, who was at Purdue University, published — this was based on his dissertation, I think — a seminal article about academic analytics that really did start getting into what you could know about students, and what you could infer about students, based on the data that they both brought to the institution and generated in the learning management system. Famously, John is the one who coined the phrase "what is an institution's ethical obligation of knowing," which I think to this day is really the imperative: what do we think we know, and, more importantly, what, if anything, are we prepared to do about that to support students? Most people know about Purdue's Course Signals — I'll talk a little bit about that. But what I really liked in the work that John and his colleagues Kim Arnold and Matt Pistilli had done was that it really simplified the value proposition of analytics, and I just wanted to take a little bit of time to reference it here. Yes, we need to gather data — and increasingly, with our digital systems, from different types of systems. That has become harder to do, just because, a lot of times, the people who manage the student information system may not even know the people who are managing the learning management system. That's less the case now, but I think if you go back to 2007, it really made sort of strange bedfellows of different people who were stewards and stakeholders of data. The value proposition is: you pull it all together.
More importantly, once you have that data, especially about past students, you make a prediction, or you train a model on what you think current or future students will do. The piece where I think we've gotten hung up a little bit is the third area, the purple "Act": there's far more predicting than acting. I'll get into why I think that is, but my key takeaway is this: how do we get back to trying to inform the field by example, rather than perfecting predictions at the expense of trying a good intervention? If you do an intervention, you should monitor it, see what changes in student behavior you can make, and then refine; that might tell us we want to gather other data. So this cyclical model, I think, is really important, and I'm going to come back to it throughout the presentation. The most commonly used definitions come from the EDUCAUSE Learning Initiative's Malcolm Brown, in an article he published in EDUCAUSE Quarterly, and from the Learning Analytics and Knowledge conference (LAK) in 2011. I'm not going to read these, but I've highlighted the focus on acting as a key part of analytics. In fact, the title for my talk is a little phrase that I use sometimes: analytics without action is just analysis. Not that there's anything wrong with analysis; it's a big part of the gathering and predicting. But what are we going to do about it? That's the reason I asked that little thought experiment: if you could predict, what would the effect of not only instruction but attending college be on changing the trajectory of incoming students and their outcomes? We want them to be successful. Another way to look at this is a Venn diagram from Jenn Stringer, who is now not the Chief Academic Technology Officer at Berkeley but the Chief Information Officer, and I want to thank her for this as well.
It shows the intersection of institutional graduation rates, student success, academics in terms of performance, and actual learning and engagement. It looks neat and tidy, but it's actually a messy kind of thing that is challenging, because a lot of times the people who are responsible for these areas know their own area but may not know the others. So being able to home in on the interaction effects of different data and outcomes is key as well. I also want to point out some differences in the kinds of data and their implications for student success. Student information systems really focus on data across terms: passing and failing, retention, persistence, graduation. They're really looking at the trajectory, the journey the students are on. Whereas the learning management system, over the last 20 years or so, has become not simply a place to post the syllabus or post content; it has really become a data source for diagnostic use cases: what are we learning in terms of student engagement? I'll talk a little bit about that, because when I first started doing this I used to get some critiques on it, and I may still. But one of the challenges, I think, is that a lot of times the people who are focused on the learning don't necessarily have ready access to, or responsibility for, student success across terms, and vice versa: the people who are focused on those across-term student journeys may not always see what's going on in individual courses in terms of student engagement, or even curricular connections where one course blocks another. And that's where I think success is going to have to come from going forward: we need to be able to work on those things. So, why has learning analytics stalled? First and foremost:
This is another seminal article, and at the end I'll have a list of references you can go to, so don't worry about trying to grab these right now. Leah Macfadyen and Shane Dawson, back in 2012. The title really caught my eye: "Numbers are not enough: Why e-learning analytics failed to inform an institutional strategic plan." To cut to the chase: they had absolutely great data and great analysis, and they didn't get on the radar of institutional leadership, in part, they realized, because they were not moving the heart as well as the head. At the end of the piece, they talk about a more sociotechnical approach to analytics. Basically, what they were saying was: yes, we need to bring evidence and data, but we need to tell the story that can move the head and the heart for change. So this was a seminal piece that looked at why analytics was failing to engage; it's one thing to inform, but we also need to engage, and I would recommend it as well. I mentioned Purdue's Course Signals. Back in 2013, many of you may be familiar with what Purdue was doing with Course Signals. I'm not going to dive into the exact nature of it, but there were some questions about the claims of retention success; I think at one point it was 21 percent better retention for students who were working in the Course Signals courses. To be honest with you, I thought Purdue got beat up a little bit by the blogs. I'm also not convinced that leadership was supporting John, Kim, and Matt in terms of being able to share what they were doing, and I don't think it was quite as bad as some folks were saying. But I will say, because what Purdue did with Course Signals was so widely heralded in 2007 to 2009, when this critique came I think it had a chilling effect on those who might have had a similar audacious spirit about what you could attempt to accomplish with analytics. Again, I think in some cases some people went too far.
But I think this was a seminal moment in what happened with learning analytics. What's interesting, too, is that at the end of the piece, Caulfield says it remains a question how a system that boosts grades could fail to boost retention. We assume that students who do well in classes will do well in terms of graduating, and I'm going to show something that I think speaks to that as well. About this time, Mike Sharkey, a good friend and colleague and a leader in the analytics space, posed this notion: has there been a trough of disillusionment, relative to the Gartner Hype Cycle? You have that initial peak of inflated expectations, then the trough of disillusionment, maybe around 2012 or so. That really was a key piece. And then, as you start to get more realistic, you start to change what the expectations might be, and you settle into something, for lack of a better term, more operational. I think we are still in that slope of enlightenment, working towards the plateau, but definitely there have been some issues in terms of managing expectations of what learning analytics can accomplish. This is where I wanted to go back to Caulfield: how could students do well in terms of course grades but not do well in terms of student success? If you haven't seen it, I would encourage you to take a look at the Education Advisory Board's work, back around 2014 to 2016, on this "murky middle": 45 percent of students who drop out have a decent GPA. They may not be at a 4.0, but they're not failing. When we came across this, my colleague Bob Carpenter did a similar kind of analysis, and we were seeing the same kinds of things: after that first year, students, for a variety of reasons, would leave the institution in good standing.
This is a piece we need to look at as well, not just from the learning analytics side but from the academic, social, and institutional analytics side, because it may be that people are struggling with funding to continue, or maybe they're working too much. There are a variety of things; I know I'm preaching to the choir on this, but I wanted to put it in the context of where analytics may have been disillusioning or even plateauing. I want to credit Vince Kellen, who is the CIO at UC San Diego. I was at his presentation to IMS Global, and what he did was go through a list of facts: how many neurons are in the brain, how many synapses, how many minutes in 10 billion years. His point was that human learning is essentially a mystery. And so in some respects, I think this was around 2017, you started to hear less about learning analytics and maybe more about learner analytics, with a focus more on engagement than on learning, because there is a difference. His key point was cultivating a sense of humility: large jumps in teaching productivity akin to 20th-century technology are not likely. Again, on that enlightenment slope, maybe we have more manageable expectations about what can be accomplished. The other thing: even at LAK, the Learning Analytics and Knowledge conference, you started to see research appearing about the need to act. It's not enough to predict what students are learning or what they will do; what, if anything, is the effect of instruction and college, so that success isn't simply about recruitment? It's about what we do with the students we bring to the institution once they're part of us. And so you started to see these kinds of things as well.
And then lastly, I've tried to touch on this in a chapter I did a couple of years ago with John Whitmer, who was the one who told me to come to the Indiana Learning Analytics Summit. What John and I were noticing was that we would go to conferences, and this was when you started to hear more about data, Big Brother, and privacy. When people would talk about what could be accomplished with analytics, you would often hear, "How do you do ethical learning analytics?" Really, that was harkening to FERPA, the Family Educational Rights and Privacy Act. What he and I would joke about is that we'd go to conferences and you'd hear the word FERPA, and it really was code for "don't go there, don't attempt it," because student data and privacy are the most important thing. Some people would even talk about informed consent, prior consent from students, which is completely opposite to what FERPA enables and empowers, and I'll show that when we get to the end: you do not need students' prior consent in order to study your own students' prior data so that you can improve the experience of current and future students. We'll talk a little bit about that going forward as well. So let me just pause here for a second and ask: are you with me? Is everybody tracking along? Kristi, I'm not looking at the chat so much myself, so I don't know if there are any questions. Folks can also raise their hand in Zoom if they would like to ask a question. If we're good, I'll carry on; I know I'm throwing a lot of literature review at you and I just wanted to check in. "The only comment you've had so far is from Martha: most of the time, failing an exam has something to do with the student."
"But when everyone fails, it has a lot to do with the evaluation tool." She had posted that shortly after your survey question. Okay, thank you, Martha; we may come back to that as well. I'm going to carry on. So what I'm going to talk about here is just how learning analytics can work, with the caveat that your mileage may vary. A lot of this is based on our own work at UMBC. By no means am I trying to hold this up as "do it the way we do it"; I'm simply trying to stick my neck out a little bit and say here's how we're thinking about it. You can learn a lot about an institution's theory of change, or even how it views student learning, by looking at the interventions it employs, because an intervention assumes an understanding of what you think will move the needle. So I'll put it out there, and again, I'm open to feedback as we go. I'm going to start with our provost, Art Johnson, who served from 1998 to 2008. He could be somewhat harsh, and he had a withering sort of question back in 2003. We use Blackboard, and he asked, "So, is Blackboard making a difference?" As an IT guy, I was sort of, "Well, we had 700 courses last semester and now we've got 800 courses." And the look on his face... we both realized that that's not what he was talking about at all. It's that Prufrock line: "That is not what I meant at all." Actually, this is what spawned my dissertation, trying to answer his question; it took me 227 pages, but it was at least my attempt to answer it. He also said something else. In 2006, we were going through our Middle States accreditation working group, and we were basically preparing the kitchen sink of data to get ready for Middle States, and Art, the way he would,
just cut to the chase: "Why collect data that we're not going to act on?" It was kind of an epiphany for me. You can throw everything and the kitchen sink into it, but if you're never going to act on it, why do it? So it's about finding the sweet spot between having the most comprehensive model you can and the effort that goes into it. The trite phrase is, "Is the juice worth the squeeze?" But these were key questions for me over the last decade or so. The problem was that as an IT guy, I didn't have access to grades, so I couldn't answer Art's question. And even if I could, the problem with grades is that they occur too late in the term to be actionable. So that's where we started trying to look at different ways of getting at student engagement. I'll admit that student data trails in the LMS are a proxy, but you see the use of proxies in the social science literature all the time, whether for socioeconomic status or sense of belonging; we use these things as a way to approximate something that's difficult to measure. So about this time you're seeing the rise of the LMS as a source of data for actionable intelligence, to the point that back in 2007 we did this analysis, and it has carried forward ever since: UMBC students earning a final grade of D or F used our LMS 40 percent less than students earning a C or higher, every semester. Now, this is a summary aggregate across the institution; different courses vary, some higher, some lower. But two things were interesting. The same general pattern or trend persisted every term, and the pattern showed itself early in the semester as well: students who were not as engaged at the beginning of the semester tended to stay disengaged throughout.
That's an important piece, because if you're going to use this proxy or try to leverage it, you want something that is somewhat reliable but that you can act on. I'm going to talk a little bit about what we've tried to do. Among other things, we built this thing called Check My Activity. It's in our campus portal, where students would go to log into their Blackboard courses; the links down here would actually go to Blackboard. But what we would show them along the way was how active they were compared to their course peers. The black bar is the student; the gray bar is the course average. It was simply looking at how active they were in terms of hits and clicks. And I know, when I was doing this back then, I would get critiques that this was just "click-o-metry," that maybe some students were clicking the submit button a hundred times. I never saw a student's activity compel a faculty member to give a different grade. But these are the kinds of things we built. Long story short, we would show students an anonymous summary of their course peers so they could see how active they were. And this was the killer app: faculty would post grades, and students could then see that same anonymous summary based on students who had earned the same, higher, or lower grade up to that point. You could never see names, but you could see how active students were up to the point of the assignment grade. Back in 2017, Bodily and Verbert put together a literature review on student-facing dashboards. Sadly, they did not include ours, so I got my revenge by publishing our own chapter on this in 2017 as well. Among other things, one thing I feel good about is that I also identified the University of Michigan's ECoach. I don't have time to go into it all here, but suffice it to say, this was sort of genius.
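As an aside, the anonymous peer comparison at the heart of Check My Activity can be sketched in a few lines. Everything here is invented for illustration (the records, field names, and letter-grade ordering); UMBC's actual tool is built on real Blackboard activity data and works nothing like this toy.

```python
from statistics import mean

# Hypothetical records: (student_id, grade earned on an assignment,
# LMS clicks up to that point). Purely illustrative numbers.
records = [
    ("s1", "A", 120), ("s2", "A", 95), ("s3", "B", 80),
    ("s4", "B", 70), ("s5", "C", 40), ("s6", "C", 55),
]

def peer_activity_summary(records, my_grade):
    """Anonymous average clicks of peers who earned the same,
    a higher, or a lower grade than `my_grade`."""
    order = ["A", "B", "C", "D", "F"]
    rank = order.index(my_grade)
    buckets = {"same": [], "higher": [], "lower": []}
    for _sid, grade, clicks in records:
        r = order.index(grade)
        key = "same" if r == rank else ("higher" if r < rank else "lower")
        buckets[key].append(clicks)
    # Averages only; no names or ids ever leave this function.
    return {k: round(mean(v), 1) if v else None for k, v in buckets.items()}

print(peer_activity_summary(records, "B"))
```

The point of the design is that a student sees only aggregates, never individual peers, which is what kept the tool on the right side of privacy concerns.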
Tim McKay at Michigan, a physics professor and head of the Honors College, really wanted to take what he described as behavioral nudging and scale it up. He modeled it after what he described as the Centers for Disease Control public health model: when you're trying to get people to change their behavior, either to quit smoking or to take their blood pressure medicine, more often than not they're responsive to people who encourage that behavior and who look and sound like themselves. The genius of what I thought Michigan did was that they would talk to students who did well in the physics course and interview them: what kinds of strategies did you use, what kinds of tools or approaches? They would then harvest that knowledge and present it to the next cohort, the next semester, at key milestones in the course, including the first exam. This has built up quite a bit. I'm not going to go through all of it; I've provided some links here, but this is now being used at the schools listed on the left. I think this is an interesting model, and we've been looking at it as maybe either a supplement to or even a replacement for our Check My Activity. I really think Michigan is onto something here, and I would encourage you to take a look at it. By their own admission, it is primarily for larger courses; I think their smallest course is 300 students. But it's about how you make a large course small, and in being able to deliver custom, tailored messages to students at key milestones based on their performance, I think they are onto something, and I think some of these other schools are learning it as well. So here's where we found ourselves, going back to John Campbell's obligation of knowing. We think we know that LMS course design is related to higher student usage; I'll show some of that here in just a second.
We also think that higher student LMS usage is related to better student outcomes. What we were really trying to work on, though, was: could we show, could we prove, that LMS course design is related to better student outcomes? We could see how it improves engagement, but does it improve the outcomes? Sort of: A relates to B, and B relates to C, but does A relate to C? That was the problem we were trying to solve. So I want to talk a little bit about why I think course design matters, and I'll give you a couple of examples. This is again some work with John Whitmer and my UMBC colleague Tom Penniston; we wrote about it a year or so ago. We've also been presenting it, George, in our learning analytics community of practice. You'll have these slides if you want, so you can go to these things as well; we make the meeting recordings UMBC-only, but the presentation slides are available if you're interested. Long story short, the research is pretty clear that these are the three main ways that faculty use the learning management system. One is user and document management: posting the syllabus, maybe the presentations. If you then start moving into interactive tools, chat, discussion, things like that, it draws a different level of engagement from the students. But the killer app for an LMS is auto-graded assignments, quizzes, or even exams, and I'll talk a little bit about that. Electronic assignment delivery and collection: go to any course where faculty are doing number three, and there's an order-of-magnitude difference in student use of that course compared to number one. The literature shows how that pans out as well. Working with John when he was at Blackboard, we put this together, and this was one of the key findings; I wanted to take a little time to look at it.
These are two courses at UMBC: one is math, one is physics. On both, every row is a student, every column is a week in the semester, and the color density is the amount of time the student spent in the course's LMS site during that week, organized by the final grades that were earned. A couple of things jump out at you. First, you can see where Thanksgiving break fell in both of these courses. You can also see that in the first week of the semester, even though these are organized by final grades, there's really not a lot of difference in engagement between the A/B students and the D/F students; it varies a little between the two courses. But as the weeks tick by, you really start to see that some of the students who earned lower grades literally just evaporate. We call this our waterfall chart: if this were the top of the waterfall, here's a rock, Thanksgiving, blocking the flow, and you can see where the engagement patterns are. There are a couple of other things that are very different about the two courses. The course on the right has the typical two midterms and a final; the students are engaged right around the time they're getting ready for exams, the typical approach to a college course. On the left, you see a course that has weekly auto-graded reading quizzes and assignments; the students are doing something every week, and in some cases every day. The only thing I would point out, and this goes back to Check My Activity, is that I'd love to be able to show students a kind of you-are-here red dot. During the term we wouldn't have the benefit of final grades, because they haven't been earned yet.
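As a quick aside, the waterfall view described above can be sketched with a crude text rendering: rows are students sorted by final grade, columns are weeks, and a four-step shade stands in for the color density of LMS minutes. All the numbers here are invented; the real charts are built from actual LMS activity logs.

```python
# Invented data: each student has a final grade and weekly LMS minutes.
students = {
    "s1": {"grade": "A", "minutes": [60, 55, 70, 65, 50]},
    "s2": {"grade": "B", "minutes": [40, 45, 30, 35, 40]},
    "s3": {"grade": "D", "minutes": [35, 20, 10, 0, 0]},
    "s4": {"grade": "F", "minutes": [30, 10, 0, 0, 0]},
}

def waterfall_rows(students):
    """Rows sorted by final grade; minutes mapped to a 4-step shade."""
    shades = " .:#"  # 0 minutes -> ' ', heavy use -> '#'
    rows = []
    for _sid, info in sorted(students.items(), key=lambda kv: kv[1]["grade"]):
        line = "".join(shades[min(m // 20, 3)] for m in info["minutes"])
        rows.append(f"{info['grade']} {line}")
    return rows

for row in waterfall_rows(students):
    print(row)
```

Even in this toy output you can see the "evaporation": the D and F rows fade to blank space as the weeks go on, while the A row stays dense.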
But I think students would start to connect the dots themselves: am I engaged? What do I need to do to be moving forward? So I think these would be really important ways to proceed. Fast forward to Tara Carpenter, one of our recent learning analytics community of practice presenters. Tara teaches the largest course on campus, Principles of Chemistry II; her colleague Sarah Bass teaches Principles I. For us these are big courses: 600, 700, 800 students. What she was dealing with was that she's been teaching this course for 18 years, and incoming college students bring the same habits and patterns they brought from high school: they think of preparing for an exam as memorizing rather than learning. She was getting tired of this, and she wanted to try to do something about it. This builds off of something that started as a pandemic response: they created large question banks that they generated themselves. There's a lot here; just focus on the main screen. They didn't want to use surveillance software, so what they ended up doing was creating exam questions where no two were alike for any two students. They could randomize the groups, and using the Blackboard calculated question, you plug in a question prompt with a stem variable and it outputs a different answer variable, so no two students would get the same answer. They've been doing this since the pandemic, and even as we're coming back to campus they're continuing to do it. So both Sarah and Tara made this investment in an exam question bank. What basically happened was that Tara started using it for student practice as well. As for the problem she was trying to solve: I don't know how many of you have come across this; I didn't know about it until she introduced me to it.
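For readers unfamiliar with calculated questions, here's a toy sketch of the idea just described: one prompt stem, a randomized variable, and a computed answer key, so no two students see identical numbers. The stem, the value range, and the per-student seeding are my own illustration, not Blackboard's actual implementation.

```python
import random

# One prompt stem with a placeholder variable. Stem and ranges are
# invented for illustration.
STEM = "How many moles are in {mass} g of NaCl (molar mass 58.44 g/mol)?"

def make_question(seed):
    """Deterministic per-student variant: seed on the student id so the
    same student always sees the same numbers, but peers see different ones."""
    rng = random.Random(seed)
    mass = rng.randrange(10, 100)     # randomized variable in the stem
    answer = round(mass / 58.44, 3)   # answer key computed from the variable
    return STEM.format(mass=mass), answer

q1, a1 = make_question("student-1")
q2, a2 = make_question("student-2")
print(q1, "->", a1)
```

Because the answer is derived from the randomized variable, the grader can score every variant automatically, which is what makes question pools like this practical at the scale of a 700-student course.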
There's something called the Ebbinghaus forgetting curve. Ebbinghaus was famous for studying how long it would take him to forget a list of randomized letters. He took something like 2,300 three-letter nonsense syllables and memorized them. At a certain point he had immediate recall, but within 19 minutes he had forgotten 40 percent of what he had memorized. He never consulted the list; he would just keep testing himself, and you can see how that forgetting curve plays out. By 31 days, he could still remember just over 20 percent. He's also one of the people from whom we get the term "learning curve." And the ideal learning curve, what he found, was that regular, smaller, bite-size chunks of practice spread out over time give our brains a chance to process, organize, and then recall and retrieve information, much more than rote memorization alone. So with this in mind, Tara started doing spaced practice repetition using her question bank, her pool of questions, putting students on a regimen of daily practice. Now, I'm not going to brag too much on it, but I've been doing Duolingo for Spanish for about 340 days now; it's 15 to 20 minutes a day, and there are things that stick that I could never memorize alone. For recall and practice, you get through it. So that's what Tara did. What's interesting is that we have that same sort of waterfall: every row is a student, every column is a week in the semester. She introduced it after spring break in Spring 2021, and we see that same darker color in terms of what students were doing. You can see exactly where spaced practice began for her students.
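The spaced-practice intuition behind the forgetting curve can be sketched as a toy model: retention decays exponentially with time since the last review, and each spaced review strengthens the memory. The exponential form and the "stability doubles per review" rule are illustrative assumptions of mine, not Ebbinghaus's actual fitted data.

```python
import math

def retention(days_since_review, stability):
    """Toy forgetting curve: R(t) = exp(-t / s), s = memory stability in days."""
    return math.exp(-days_since_review / stability)

def simulate(review_days, horizon=30, s0=2.0, boost=2.0):
    """Retention at day `horizon`, given reviews on the listed (ascending) days.
    Each review multiplies stability by `boost` in this toy model."""
    s, last = s0, 0
    for d in review_days:
        s *= boost          # each spaced review strengthens the memory
        last = d
    return retention(horizon - last, s)

cram = simulate([0])                    # one massed session on day 0
spaced = simulate([0, 2, 7, 14, 21])    # spaced practice over three weeks
print(f"cram: {cram:.3f}  spaced: {spaced:.3f}")
```

Under these made-up parameters, cramming once leaves almost nothing by day 30, while five short spaced sessions leave most of the material retained, which is the qualitative effect Tara was after.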
What she ended up doing, George, was a learning analytics mini-grant patterned after what you folks do. She wanted to see: would students not only do well with the use of spaced practice in Chem I or II, but would they take those lessons learned to the next course that required hers, which was Organic Chemistry 351? So looking at the Spring 2021 cohort, there were probably about 600 or so students; I think for students who did all of it, nobody got anything lower than a C. Then she wanted to see how they performed going forward: about 250, actually 271, I guess. And what she found is that for students who used it, only 9.4 percent got a D, F, or W, whereas in the control group, and admittedly it's a small n, about 50, it was roughly 50-50, rounding up. It was interesting to see. So she's going to renew this grant, again taking a page from George: making it as easy as possible for faculty to continue their own investigations. What was also interesting is that she used an adaptive learning tool called Realizeit, which gave us some data to look at how students were doing as well. Again, these are the final grades: the dark green is an A; the darker red is a withdrawal. And again, every row is a student, every column is a day, showing how much spaced practice they had completed. And you find that same pattern: here's the break in Fall 2021 for Thanksgiving, and students were sort of disappearing. What's interesting, and she's been trying to figure out what's going on, is that you have some of these students who are really active every day, but they're not getting the grade that their effort would suggest. So she's been honing in on those students, really trying to encourage them to be more reflective about their thinking.
To be more metacognitive: thinking about their thinking and how they learn, so that they can adapt to problems different from what they might have memorized in the Realizeit practice environment. Among other things, one thing we're finding is that after only 14 days, the model is 82.6 percent accurate in predicting A, B, or C versus D, F, or W. It has nothing to do with other grades, only the final grade, and it's based purely on students' use of the practice environment she had set up. I think that's fascinating. What she's also found is that the grade distribution has changed dramatically from Fall 2020, when she did not use spaced practice. Then, the A's were fewer than the B's, which were fewer than the C's. In Fall 2021, after a semester and a half of spaced practice, she was finding that the A's increased and the B's and C's decreased. Now, there was an upward tick in the F's, but you're seeing that shift in the grade distribution, which was interesting as well. She also found that if you looked at that same grade distribution by race, white versus underrepresented minorities, that trajectory was changing quite a bit as well, to the extent that underrepresented minorities in the Fall 2021 instance were now four times more likely to earn the higher grades than their white counterparts. I think this is fascinating in terms of equity in digital education, which Tara had been a part of as well. I talked a little bit about the ways in which she was using question pools; this is where that came from. Among other things, we have found over time that, on average, our students spend about 3,000 minutes across all their Blackboard courses submitting 20 assignments. In Chem 101 or 102, they're spending 10,000 minutes on 200 mini micro-assessments with the practice.
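To illustrate the kind of early-warning classification described above, here's a toy sketch: after 14 days, classify students as ABC versus DFW from practice completion alone. The data and the 0.5 threshold are invented; the 82.6 percent figure quoted above comes from the real model trained on Tara's course data, not from anything like this sketch.

```python
# Invented records: (fraction of first-14-days practice completed,
# final outcome bucket).
students = [
    (0.9, "ABC"), (0.8, "ABC"), (0.75, "ABC"), (0.6, "DFW"),
    (0.4, "DFW"), (0.3, "DFW"), (0.85, "ABC"), (0.2, "DFW"),
]

def predict(completion, threshold=0.5):
    """Single-feature classifier: enough early practice -> predict ABC."""
    return "ABC" if completion >= threshold else "DFW"

def accuracy(data, threshold=0.5):
    """Fraction of students whose outcome the rule gets right."""
    hits = sum(predict(c, threshold) == y for c, y in data)
    return hits / len(data)

print(f"accuracy: {accuracy(students):.3f}")
```

The point is less the classifier than the feature: because the prediction uses only behavior students control (practice completion), it maps directly onto an intervention: nudge the students below the threshold.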
And she's trying to figure out a way to cut down the amount of time, because she does get some complaints initially from students who say this is too much. Those same students will come back when they've left her course and say, "Can you help me figure out how to do spaced practice on my own?" So we have that going forward. We have seen for years that in our College of Natural and Mathematical Sciences, our STEM college, the use of practice quizzes, reading quizzes, and assignments in Blackboard is always the highest. Even though there are fewer courses, the color intensity is the amount of interactions: they are basically using quizzes and assessments much more. I wanted to bring back Michigan, just because, unbeknownst to us, they were doing question pools as well to promote open-note online exams. We were also seeing this spring semester, and this one caught me off guard, that our biology faculty, who were also using question pools and who had seen what Tara had been doing, asked if they could give online exams in the return to class. This is not a classroom; it's a ballroom, one of the largest rooms on campus. You had, I think, 200 students using the LockDown Browser from Respondus, taking the exam. Among other things, they found only a handful of students didn't have a laptop with a battery that could last long enough, and just a handful had some problems getting the software set up. But they were saving reams of paper; I had no idea. For one exam they would typically use about 2,500 pieces of paper, and there are five exams in this course. So it was something that was really interesting to them. And we're seeing a blurring of the lines between face-to-face, hybrid, blended, and online.
I think it's fascinating, and I think looking at assessment is an interesting way to go forward. All right, some takeaways. I've thrown a lot at you, so I think it's time for another poll. I'm going to go ahead and put this up here, and Christy, maybe you can help me by bringing up this poll. So the first question is: if you had to predict the outcomes of current students based on what you think you know about prior ones, could you do so? Yes, no, I'm not sure. Good, come on. And it's okay if you're not sure, either yes or no, but we are settling out here. Okay, I'm going to say a slight lead for yes, like Rich Strike coming along at the end there. Very good. So we're starting to see some ways that we think we could intervene. Okay, so the next question is, let me go ahead and put that up there. Yes, here we go. So if you could predict, how would you intervene to fix the predicted outcome? Could you do so? Yes, no, I'm not sure. Okay, so a little different now. Again, this might be where we're struggling a little bit with translating the prediction into an intervention. So it's a little different from what we had seen with the first question. Did you want to say something, Christy? Okay. So we'll come back to this; I'm going to stop the poll right now. And the next one we're not going to do as a poll question, just in the chat: if you aren't sure, can you use one word that would best describe why? Let me see if I can see the chat myself. Heterogeneity, instructor, scale, okay. Ethics, interesting, okay. Context, diversity. Okay, a lot of context; maybe in the Q&A we can uncover what that means. Disconnection, uncertainty. That's fine, good. All right, keep this in mind.
So some of the takeaways. This goes back to '93: Vincent Tinto, often seen as the dean of student success. I came across this when I was doing my dissertation, and what struck me was, as I was trying to read the student success literature, it was often pitched as: what is the institution doing to save its students? And it absolutely is right to think about that. But what he said here is that students have a responsibility as well, and we can't absolve them of that. So the question is, how are we creating environments for them to not only demonstrate, but maybe improve, deepen, or mature their own responsibility for learning? That's one thing. The second thing, and this is the same thing I would say about the Check My Activity, or wanting to show the red dot of "you are here" on the waterfall, or to some extent the Michigan E-Coach, is this idea that maybe we can help through what Miller and Rollnick called motivational interviewing, where you essentially try to create a sense of discrepancy between what people say their goals are and what their behaviors show. It isn't that you tell them what to do, but you hone in on and sharpen any discrepancy between what they're saying and what they're doing. I'll show how that plays out a little bit more. We know we can't do nothing, and I want to call out Jennifer Meta Robinson. This was one of the key presentations that still resonates for me, from back when I came to your learning analytics summits; we invited Jennifer and George and Linda to join us for a summit. It's this idea that our students come in with an expectation of how they'll perform, and after they do so, and this was after the first exam, what is their expectation going forward?
They have a responsibility for learning, but are they honest and accurate in assessing what they currently know, understand, and can do? And if not, or even if they are honest, do they have the self-regulated learning skills and the discipline to remediate those kinds of things? I thought this was fascinating. I still reference it all the time, and it's one of the takeaways I got from you folks, so I thank you for that. Before I close, I want to talk about FERPA. I'm probably preaching to the choir here, but as I've been involved with this data, I have gone to our legal counsel countless times to ask, are we okay to do this? And I've always been encouraged, because their interpretation, which I agree with, is that it's not simply that FERPA gives rules for how institutions should manage student data. I actually think the focus on those with legitimate educational interests is an empowerment to schools to study their own data, to act on it, to collect it, and to make predictions so that you can try to help students going forward. It also allows us to access information without requiring students' prior consent; that's a key distinction. What I would hear sometimes, especially with the GDPR focus on data and privacy, which is understandable, since we've seen no shortage of abuses of big data, is that it creates a tension between what we want to accomplish for our students and what we think we can do with the data we have. I think it actually obliges institutions. To me, it gets back to Campbell's "obligation of knowing" that I think is embedded in our stakeholder and stewardship roles: to not simply protect student data and privacy from getting outside the institution, but to actively study the data and turn it into actionable intelligence. On every page of the university's portal, we have a link that talks about our use of student data.
We tell people that we're doing this. We give an example of how we're doing it, with some links to more information. Now, I can't promise that every student is looking at this, but what we're trying to do is be transparent about it. So with that, I want to close with a framework for action that we can think about for ourselves. It's based on Thaler and Sunstein's book Nudge and this notion of choice architecture: that we can design environments, and I would even say digital learning environments, to try to help students go in the way that we think they should go. Yes, it's paternalistic, but that is our responsibility. We should be informed by what students bring in and how they're engaged, but Thaler and Sunstein would say it's legitimate for choice architects to try to influence people's behavior in order to make their lives better, in a way that will make the chooser better off as judged by themselves. I realize there's some tension in here, but the notion is that we can create environments, whether they're the living-learning environments on campus, in the classroom, at the institution, and the environments that we invite students into can contribute to the choices that they make. It's not going to make their choices for them, but it can shape the range of options available to them. Finally, we've been playing with nudging as well. I just had a presentation from our analytics group, who has been working on this. Among other things, they've been doing some nudges at week 4 of the 15-week term and at week 7, basically trying to give students enough runway to turn around a predicted trajectory. You don't share the prediction, but if we've identified who they are, then we try to nudge them. And part of that is based on the top features in the model.
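The nudge-timing idea above, fixed checkpoints at weeks 4 and 7 of a 15-week term, where flagged students get an agency-focused message that does not disclose the prediction itself, could be sketched as follows. The threshold, message text, and data shapes are all invented for illustration; the talk does not specify how the real pilot selects or words its nudges.

```python
# Hypothetical checkpoint weeks and risk cut-off (not from the talk).
NUDGE_WEEKS = (4, 7)
RISK_THRESHOLD = 0.5

def build_nudges(current_week, risk_scores, instructor):
    """risk_scores: {student_id: score in [0, 1]} from some upstream model.
    Returns the messages to send this week (empty outside checkpoints).
    Note the body never reveals the prediction or the score."""
    if current_week not in NUDGE_WEEKS:
        return []
    return [
        {
            "to": student,
            "sender": instructor,  # sent on the instructor's behalf
            "body": ("There is still plenty of runway left this term. "
                     "Here is where other students have gone for help..."),
        }
        for student, score in sorted(risk_scores.items())
        if score >= RISK_THRESHOLD
    ]
```

Keeping the checkpoint weeks and threshold as named constants makes the "enough runway to turn a trajectory around" policy explicit and easy to adjust per course.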
These are the kinds of nudges that students would get. Right now it's been piloted in six courses, where the instructor allows us to send the message to students on their behalf, under their signature, so that we can study some of the global responses rather than just what comes back in email to the instructor. This last slide is kind of busy, so bear with me, but what we're trying to do is get the right message to the right student at the right time. And what I would ask you to focus on is the student agency on the left. Again, this is a timeline of our semester, which is 15 weeks long, and up until about midterm, we're really trying to focus on what students can do to help themselves. Now I want to show you what some of those things might be. I'm not going to go into any of them in detail, but it's meant to show not just the interventions, but the character or nature of the interventions, which is focused on student agency. We've had a lot of faculty do things with a syllabus quiz; these are some examples of what students are encouraged to do early on. We've been doing a lot with the LMS and textbooks; our textbook provider is VitalSource. A lot of things with the Check My Activity. Maybe students who have a high credit load but a low GPA get a nudge: here's where other students have gone to get help if you need it. And then as we start to head into the midterm point, the nudges become a little more intrusive: hey, students who are not where they want to be need to think about, can they pull it off? Are you doing all you can? We've talked about empathetic nudges in terms of what you can do going forward. And then finally, if they're not finishing strong, maybe it's time to encourage them to think about starting strong.
Start strong the next term, and take the withdrawal now so that you can keep it from being an F on your transcript. So I'm not going to go into any of these things; I just want you to see the nature of the nudge as the semester goes by. My conclusion is this: yes, you can't lead a horse to water, but a good farmer salts the horse's oats. It's basically recognizing that none of us learns from a position of comfort, including faculty. Sometimes it's not until they can see what their colleagues and peers are doing that it inspires some change. And I think course design can be the most stable form of institutional intervention. Afflict the comfortable and comfort the afflicted. The last thing I want to do is give some credit to our president, Freeman Hrabowski. This is his last semester; next week or so will be his last commencement. But he made this statement years ago, and I couldn't agree more: if you want to change the culture, shine light on success, not failure. And the reason I have this here is that, yes, we have access to a lot of data, and if we wanted to, we could shine light on where people are not measuring up. I prefer to be public in our success and let failure be private, so that people can come to that reflection on their own. And if deans or chairs were to think, well, I'm going to use this data to go after people, my caution to them has always been: well then, watch what happens to the LMS. It'll dry up sooner rather than later. And with that, I'll leave some references out there. I'll be honest with you, I'm about two-thirds of the way through adding all the references to this Zotero library. It's public, so give me just a few more hours and I'll get it updated, but you can go into it as well. And with that, I'll close and invite questions.
Description of the video: ...to Dr. John Gates, and I just want to mention a few things about him. Before I go into his official bio, I want to tell you that I first met Dr. Gates when I heard him speak at a SEISMIC meeting at the University of Michigan in 2019. For those of you that don't know SEISMIC, it is a Sloan-funded initiative created to improve student success for marginalized students in large introductory STEM courses. One of the things I learned through John is using proper terminology, and I always struggle with what term to use. So maybe he'll talk today about what term to use for what we have formerly, in keeping with NSF, called underrepresented minority students, a term I personally have some discomfort with. But anyway, when I first met John, it was at that presentation, and it was a transformational learning experience for me in the way I have come to understand issues of diversity, equity, and inclusion in higher education, and, as I mentioned, even the terms we use to speak about them and how powerful those can be. That led me to invite him to come give a talk on campus through the SEISMIC initiative back in 2020. And now I've had the good fortune to have him come back and speak with all of you, which I'm very excited about, and I'm very confident that you will get a lot out of this presentation. Dr. John Gates joined Purdue University in early 2019 as the Vice Provost for Diversity and Inclusion. He's also a clinical professor in the Krannert School of Management. Prior to his appointment at Purdue, he served as the inaugural associate dean at the University of Virginia School of Engineering and Applied Science, where he was responsible for diversity, inclusion, and engagement strategy.
He also provided leadership in the recruitment and retention of faculty and graduate students from underrepresented minorities, and guidance for diversity in curriculum and development programming. His other academic posts include associate dean of administration and finance at Harvard College, Harvard University, and special assistant to the president and provost of the university, focused on strategic initiatives, change management, and diversity. He has held a series of positions of increasing responsibility at New York University, culminating with his role as executive director of global operations. Gates also launched and led a management consulting firm, where he advised corporate clients on a range of workforce issues. He's a native of Gary, Indiana, and received his bachelor's degree from Morehouse College, a master's degree from New York University, and a PhD from the University of London. Please join me in a warm welcome for Dr. John Gates. Thank you, my friend. Thank you, George. It's wonderful to be here, wonderful to be with you again and to have this conversation with colleagues from IU and from other places. I greet you from your cousin down the street and around the corner. There is competitiveness between us, but at the end of the day we both aim to serve the state, IU and Purdue, in many ways. And so I'm delighted, as always, to be with you. Now, I'm a pretty regular speaker, right? So we're just going to talk about what is, what isn't, and what can be. I invite you to lean in at any point and ask questions or engage me. And I would ask those of you who can to come up on camera, so that I can see your faces and know that I'm with you while I do the presentation. Fabulous, fabulous, fabulous. I feel good, I feel good now. Thank you. Thank you so much.
I want to start with a little exercise that I do all over the country. I want each of you to do me a favor and put into the chat one word to define or describe diversity to you. One word to define or describe diversity to you. Let's see what we've got. Necessary. Variability. Holistic. Home. Ideal. Empathy. Human. Representation. Inevitable. Richness. Wholeness. Wholesomeness. These are fantastic. Fullness. Understanding. Inclusive. Come on, that's not everybody; I'll give you one more minute. One word to define or describe diversity to you. Inclusive. Wholeness. Understanding. Okay, we'll pause it there. Now I want to ask you: what isn't listed there? No one said race. No one said gender, sexual orientation, religion, nationality. We talked about wholeness, inclusivity. We talked about diversity with lots of feeling, and expressed a kind of exalted place of being, an ideal place of being, the best of all of us, that which we've been joined together in. It is interesting, then, that when we construct paradigms to address diversity, we often forget about those things, and we focus on race, gender, sexual orientation. So I would like, as we begin this conversation, for you to think about my Blackness, think about my gayness, think about my first-generation-ness, think about my being from Gary, Indiana, in the ways that you just wrote. My Blackness is whole. My gayness is incredible, right? My being from Gary, that's all excellence, right? I see me differently than you might see me. And in order to do the work of diversity, equity, and inclusion, perhaps especially from a data perspective, the question is: can you see me, and value me, the way I see and value myself? That is the first order of business for us: to change our thinking about what we're doing and who we're doing it for. I would also say to you that I am a strategic diversity specialist. That is not diversity...
...for diversity's sake; it is not doing it solely because it is the right thing to do. It is leveraging diversity to drive excellence in higher education. Leveraging diversity to drive excellence. So in this instance I'm less concerned, frankly, about politics and this and that, and more concerned with unleashing the potential of human beings, right? Let me start my presentation; see if I can get this right. Share screen. Okay. So when I spoke initially to the SEISMIC group that George was in and first heard me, my title was Doing Justice: A Road Map to Closing Achievement Gaps. I've since learned, and I've grown a bit, and I've taken another look at that. I remember some conversations at Purdue: when I first came here, I kept hearing this thing about at-risk students. And I said, who's at risk? Well, you know, first-generation students, underrepresented minorities, socioeconomically disadvantaged students. And I said, well, how did they get to be at risk? Well, you know, that's what the data tells us; we have outcomes. Tell me, tell me how we know. Well, we did an analysis; we have an algorithm; it has various factors. Tell me about the factors and how they're weighted. Well, you know, 35 factors, more heavily weighted around race, generational status in college, and wealth. I said, so you mean to tell me that every Black student who comes to Purdue is at risk? They said, based on the data, yeah. Every poor student is at risk? Yes. Every first-generation student is at risk? Indeed. And of course, if they share an intersectionality, one or more of those traits, they are doubly or triply at risk? But is that the kids? Yes, that's what the data tells us. I said, well, what do we do with that? Well, you know, we have support systems, and we engage students, and we engage support staff.
I said, oh, it's similar to what Harvard called them. They didn't call them at-risk; they called them teacups. Have you ever heard about that? A teacup is beautiful, able to be fully utilized, right? But fragile; it may break more easily. I said, now, a couple of things with this. Have you ever heard about stereotype threat? Well, I've heard about it. Stereotype threat is that we either perform to our stereotypes or take the extra energy to perform against our stereotypes. I taught a class at the University of Virginia School of Engineering called Race, Gender, and Diversity in the Engineering Workplace. And I asked students, I said, give me one word, right, same thing we just did, as a stereotype for Black. They went to town; they said fried chicken, watermelon, hostile, angry, and lazy. Safe space. Give me a stereotype for Hispanic or Latinx. They said illegal. Give me a stereotype for Asian, and they said smart. And I said, give me a stereotype for white. And they said, white men can't jump. Haha. And I said, you know about contingency theory? How would you perform academically if you were in an environment that thought of you as hostile, angry, lazy, or illegal? They had to ponder that. I said to my colleagues at Purdue, if all of these students are at risk, did we say that to them? Did we say, we invite you to Purdue, and upon entry you will be at risk? And the answer is no. In other words, words and concepts matter, and they matter from the very beginning. And what we decided to do at Purdue is to stop deficit-framing our students. We asset-frame our students. Every student comes to this university having demonstrated excellence and tremendous potential and promise; it is our job to lean into that potential and that promise to help maximize it. So I've changed the title of my talk away from closing achievement gaps to maximizing student potential. Doing justice, right? Not just about justice, not just talking about justice.
I'm calling upon us to do justice, to do the work to maximize student potential. As we move on: we at Purdue are undertaking some pretty significant efforts, and let me just make sure I've got this right, yes, some pretty significant efforts around data. I believe in the work here. We're developing a fully data-enabled diversity, inclusion, and belonging office, and so data is paramount. We combine data science with the lived experience to try to understand that which may be only nominally understood, and to deepen our connection. Maximizing student potential moves us from addressing gaps to maximizing potential. It causes us to identify, understand, and remove barriers where they are present. It calls on us to work collaboratively to maximize impact. Remembering the people represented in the data is critically important: every data point that you have is a human being, full of promise and purpose, right? So let us not remove the data, and our understanding of what we're doing, from the people. Understand that data has limitations; it has great possibility, and it has great limitations. I am uneasy with artificial intelligence as applied to the lived experience. We're going to lean into it, but we have to make sure that it's equitable, because it's difficult to understand what it is like to be any one of us. If you imagine someone who is first-generation, economically disadvantaged, from an impoverished city like Gary, where people may assume that they have limited potential, and who may have been taught that, how do you begin to see through the data what the possibilities are, right? And so those connections matter; we want to connect everything we do to the students, and we want to ask what it's like. So I want to begin the presentation by showing you a brief video that some students developed about underrepresented students, about what it's like to be Black or white or brown at Purdue, in a changing environment, coming from different places.
I'm going to stop sharing and ask my colleagues on the other end to share that video with us; I'll pick back up on the other side. Frankfort's pretty small. We had a movie theater when I was a kid, but it shut down, and my graduating class was 200 people. It's weird saying this, but I've never seen so many Asians before in my life, or even African Americans; there really aren't that many in my hometown. So it was kind of weird at first. I hated the fact that I noticed; like, I felt bad that this is so different from what I've grown up with. So I don't know, I don't feel like I'm racist, and I'm not trying to be; it's just, I notice, and I try not to let that affect how I interact with people. It's just different from what I grew up with. I went on a trip to Colorado with the Leslie Foundation, and there were five international students with us. That was kind of a little culture shock, because there's only white people in my town, like, that's it, there's not many other races. And we spent ten days in a little cabin, 22 people total. And I feel like being in a little cabin with a bunch of people really opened my eyes, because here, if they didn't like me, I could just go away, just wander somewhere else, but in the cabin I couldn't get away from them; they're everywhere. I couldn't just wander off, because it's the woods. American people, to me, are more abrasive. There is more focus on the superficial. A lot of Americans that I met showed little interest in learning about where I'm from. I don't feel very welcome. It seemed to me people would label me with whatever was convenient for them at the time. So if I was playing sports, they'd be like, oh man, you're Black, you're doing so good in basketball; and then when it didn't suit them, they switched it around.
And so that was really frustrating, because I just wanted to be me. I didn't want to have to prove my identity to people; I didn't want to have to prove which side I was on. That just seems so ridiculous, especially growing up where half your family is white and half your family is Black, and they get together and get along and have a good time. I mean, obviously it's kind of screwed up that I feel like I have to blend in with whatever group I'm with at the time and not stand out, so they don't ask questions or label me in unfortunate ways and stereotype me, but it's just something that I feel like I have to do, given the people I'm around. Since I feel like I have to fit in, I'm done. I'm tired of being made to feel like the other, the weird kid in the group. Me, I like to express myself. So with this being a conservative campus, at times I've kind of compromised my pride, nodding along just because I don't want to make it uncomfortable, so I've kind of left parts of myself outside and kept them inside. Or being in groups, in a majority-white area, being the one guy nobody else wants to sit next to. People look at you like you're crazy when you say something smart. Someone can be really smart and they don't really recognize it; we recognize it, though, but you might just be one person out of 40,000 people on this campus. And it's happening nearly five times a day, and it's kind of tough every day. I sometimes go all day without talking, because people don't talk to me; for me, if I don't initiate, nothing happens. I'm a math major, and people just drop their jaw, like, oh, Black people do math? Like, yes, yes, we do. How many times? I'm a very social guy.
So, like, the first two days of class, I'll get people to go do homework, and in my situation, more often than not, I'm the only Black person in the class. Now, a month down the road, the first test comes out. I got a 98, they got a 72. They look over: hey, you want to study together? Just a month prior, it was a different story. I have a class this year where I have at least a 90. Maybe it's because they didn't study; maybe it's because nobody reached out; it could have been so many things. But when it happens consistently, and you hear stuff like, maybe you're not the normal Black guy, or, hey man, you're actually working hard, you know, stuff like that sticks. I went to a party, and this was my dorm during BGR week, and a guy goes, hey, you play football? No. You play basketball? No. And he's like, oh, my bad, because I'm a tall Black guy, you know? It doesn't feel good. Going out to activities, finding what sounds like something you'd like to do, and then just meeting people. I think the only negative at Purdue is that it's pretty easy, if you're introverted or something like that, to just stay in your room, not meet anybody, just be on your computer or whatever, playing video games. I think you want to just go out a little bit; friendship is going to happen. Thank you; we can stop sharing that. So this is a video that was produced a few years back by a group of Purdue students as a documentary, to share with each other and the administration what it's like to be Black or Asian, or coming from a rural community, or to be an introvert, in this shared community of Purdue. Each one of these students talked about something related to their sense of need for belonging, right? How do I fit in? Where do I belong? How am I valued? How am I nurtured?
When I saw the piece from the young African American guy who says he can go all day without speaking, because no one speaks to him, it's heartbreaking. And it is also very, very true that these places can be wonderful and exciting and yet deeply isolating. I lived in New York City for a long time. You've got millions and millions of people in New York; you'd think that life is bustling all the time and that there is no end to the things one can do there. So true. And yet New York can be one of the loneliest places, I think, on the face of the earth, if one does not feel a sense of belonging. And so part of the challenge we have in data analysis is how to connect the data to the lived experience of the people we are talking about. We might have a number of assumptions about folks, but what might it be like to walk in the shoes of that young African American man, to walk around alone in the midst of a sea of people, to feel alone and unvalued? Always. At my age, 58 years old, I think I look good, I'm trying, but I still get nervous when I am in an elevator alone with a white woman, thinking that she may be feeling something about me, you know, clutching her purse or whatever. Whether that's the truth or not, she may have no thoughts of me at all. But there's a stereotype about Black folks, and particularly in relationship to some others, that we think has gone by. No, it is ever present, right? I am at an age now where I have a level of privilege, and I had never had an encounter with the police that was negative. I'm pro-Black, I'm pro-police, all that stuff, right? Until I came here, back home to Indiana, where my bubble was broken, right? I had lived in all of these great places, Chicago, Atlanta, New York. What I found here was that policing is different.
I had a situation where I was the first Black person to live in my neighborhood. I'd lived there for three years, but about a year ago, police started sitting outside of my house. Just sitting. What does this mean? I know the police chief; I asked, what is this? Right? Another incident: I've got work happening in my house, right, workers in and out. Neighbors had some concerns and called the police about some of the workers, thinking that maybe someone was breaking in; my garage door was open, and so was the door leading in. The police entered my home unannounced. Right there. You may have seen the video of the students, the lacrosse team, thank you, from Delaware State University, one of the HBCUs, where they were in the middle of Georgia and police stopped the charter bus for a traffic stop and stereotyped the students as having used drugs or whatever. That's become a big to-do, right? We see this all the time, and many of us say, well, these things don't exist here. We turn our heads and, right, we move on; or we say there cannot be systemic racism, systemic sexism, systemic anything. But we're living these experiences every day. Every day. That is part and parcel of what we have to include in the data. I just got a note from some folks: you may have noticed that we had a policing incident at Purdue not too long ago. It looked to be racially motivated; it turned out not to be. But we made lots of assumptions during that time about what was, and who was, and how things had come to be. I'm saying all of this to say that behind data points are lived experiences that are nuanced, that account for a great deal of what we might see. I also want to note to you that in looking at students, Black students and white students and Asian students and others, we talk about implicit bias a lot. We talk as well about unconscious bias.
We rarely talk about internalized bias and how that might present in student outcomes and in the data. Internalized bias is a belief in the validity of one's own marginalization or degradation or inferiority. You might recall the psychological experiment with Black and white kids from back in the 1950s and sixties; it has been replicated since, and it was a big thing on Anderson Cooper's show. You take two kids, in this case two little girls, one Black and one white, and you meet with them independently, not together. And you show them two dolls, one Black, one white. The kids are about three or four years old, so babies, right? No bias yet. And you say to the white child: which doll is pretty? She chooses the white doll. Which is good? She chooses the white doll. Which doll is bad? She chooses the Black doll. You take the Black child: which is pretty? She chooses the white doll. Which is good? The white doll. Which doll is bad? She chooses the Black doll. Both of them, independently. Which doll do adults like the best? They both choose the white doll. Which doll do you want to be? They both choose the white doll. For both children, it is evidence of a white-supremacist ideation. Both are innocent, but both have come to believe that white is better and preferred. For the Black child, it could be the beginning of a lifelong rendezvous with internalized bias. That is different from imposter syndrome. Imposter syndrome says: I just may not be good enough, even though I've got all these accolades. Internalized bias says: my Blackness is not good enough; it makes me not good enough. And it's a struggle to live with that sense of inferiority. So how do we account for that in the data? Let me go back to my slides; I'll share my screen and ask you to continue to think about question points, discussion points. So with all of that as background, I have taken a stand on what diversity is.
You look at the university's diversity statement and it's everything and the kitchen sink; it doesn't mean a whole lot to me. I've claimed diversity to be excellence expressing itself at the intersections of perspectives and lived experiences. Mine, from the space I come from; and Ted, yours, from the space you come from; and Sarah, the space you come from; and George and Christy and Heather. Diversity is excellence. My background, my intentionality, are reflections of excellence, not of impairment. I reject that my Blackness is deficient in any way. In other words, you and I don't get to determine what excellence feels like or looks like or acts like; the intersections of perspectives and lived experiences do. As radically different as our perspectives and lived experiences are, they are all valid for us. And so as we put a potential stake in the ground around what it is we're doing and who we're doing it for, I would offer to us: might we think of people as excellent, in whole and in expression, whether they are what we expect them to be or not? That's not to say that we cannot call out deficiencies. We certainly can, but we call out deficiencies with an eye toward explicating and elevating the possibility that the student has. Now, this is just the way I roll with diversity. I want to show a picture of what we're trying to do. Is this thing about diversity just about representation? I think not. This is, I like to call it, the Empire State Building; it's actually the Chrysler Building in New York. Empire State Building sounds better, right? So this is the Chrysler Building. It is divided into four parts, from the bottom. The first part is the foundation of the building, the basement, the footing, what we need to get started. That's representation in the diversity space: race, gender, sexual orientation, all those things that we deal with.
The second part is where the building really gets built. And here's what I want you to do if you are willing and able to, and the data can really help us with this: develop a strategy. What do we do with representation once we have it? Strategy should be so robust that it reinforces representation at the foundational end and leads us to the third part of the building, which is innovation, creating value. In DEI, if it's not innovative for the institution or for the individual, it will lack resonance, right? Then the fourth part of the building really gets built. That's excellence. That's where we have culture change. That's where everybody brings their full selves to bear and we get that full potential unleashed. And so the goal of the work that we're doing is to leverage representation to drive excellence. That's just a framework as we move forward. Now, we created a paradigm for analysis here. When I was at the University of Virginia's College of Engineering, we recognized some of the same issues that we have at Purdue and that you have in your institutions, and that's differential outcomes between population groups based on race and other factors. It is true that there are differential outcomes; we wanted to understand not just what the data says, but how the data matches up with other parts of the human experience. At Virginia, we committed to do this study, and we developed a phase-based analysis that we replicated and advanced here at Purdue. The first phase was a large-scale quantitative analysis, a longitudinal analysis of the data. And this is not only a dataset on race, right? We have it for every category. Within this dataset, we looked at change of major, country of origin and residency, ethnicity and residency, gender, housing, military status, scholarships, student athletes, student conduct, students who are first-generation, students with disabilities, study abroad, summer programs, and the like.
We had about 80 areas that provided us data, and we developed a data warehouse for this particular project. We did it over a ten-year period at Virginia, so we can analyze student success trajectories for every population that has been through the University of Virginia. Here at Purdue, we did it for 20 years. So the first phase is a large-scale quantitative analysis. The second phase begins the mixed-methods study. The third is documentation, design and prototyping of interventions, and documentation of policies. So here, phase one was focused on all the data involving the students. Phase two was a review of the conditions at Purdue that the students come into: policies, practices, etc. Phase three was an analysis of the psychological and psychosocial experiences that students are having in relationship with the university, and vice versa. That was a deep qualitative analysis: focus groups, surveys, et cetera. Then we developed targeted interventions that we tested, and then we went back and repeated the cycle. The outcome at UVA Engineering was that there was a significant differential in GPA attainment between Black and other students, which we were able to begin to close by looking at DFW rates across the board and at other areas of struggle. How did we get into this? Why did we spend $2 million at UVA, and why did we replicate the study at Purdue? In part, it's founded on lived experience. I'm going to tell you the story, so I'm going to stop sharing for a second. I went to UVA, a wonderful institution, and I was in the School of Engineering and Applied Science. I looked around, and then I looked at the data, and I said: why does it look like most of the Black students are in this one major here, close to 70 percent of them? Tell me about the major. The major was engineering science.
It was made up of two minors, two engineering minors. Industry didn't understand it, because its graduates weren't mechanical engineers or electrical engineers or whatever; they were this hybrid. It was the only program that we offered online at the university in the College of Engineering, and the only one community college students could take. It had the lowest earnings outcomes and the lowest graduate school participation rates. It was the lowest. And the majority of Black students were in this major. How did they get to be in this major? Did they just say, well, that'll work for me? So how do we choose students for it? Well, they are selected: they apply in the middle of their second semester, the spring semester, and there was a GPA requirement. So you mean to tell me that the only grades we have to go on are the first semester's? Okay. And what do we tell Black students when they come here? Take your time, acclimate, work your way into it. Where do we see a gap? It's usually math. And what advice do we give students about math? We say: if you're uncomfortable with or uncertain about your math skills, start with Calculus 1, which at UVA Engineering would put you behind. "Unsure or uncomfortable about your math skills": don't you know that's a bullseye for underrepresented minorities, for women, for first-gens? Because many of them feel uncomfortable or uncertain about their math skills, and yet they got into one of the top schools. And I said: what is the GPA requirement for moving into the major of your choice? A 3.0. What is the GPA, race by race, of students at the end of their first semester? White students, about a 3.2. Asian students, about a 3.2. Students of two or more races, about a 3.1. Latinx students, about a 3.0. Black students, about a 2.8. How long has this paradigm been in place, when we have known about differential achievement rates?
How long has it been this way? "This is just what it is," they tell me. No: we constructed a paradigm, a systemic one, that left Black students out and channeled them into a major that we knew would be less competitive, because it suited us. Let's talk some more about this mapping. So we did an analysis. I wanted to know how Black students were positioned. We did a boxplot analysis, and we found no correlation between students' incoming high school GPAs and math readiness. We did find a substantial correlation between their standardized test scores, their SAT and ACT math scores, and their placement. Come to the boxplot that we did. Remember, at UVA Engineering the foundational math is Calc 2 and Calc 3; it used to be Calc 1 way back in the day. We found nearly all Black students starting in Calculus 1. I wanted to know why. Remember, students were self-defining where they belonged; they were self-placing. So here's where that internalized bias comes in. Students who fit the band for Calculus 3 put themselves in Calculus 1, because they felt that they weren't ready for Calculus 3 and that they would have a better GPA start. Those were Black students. White and Asian students, conversely, who according to the boxplot should have been in Calculus 1, disproportionately placed themselves in Calculus 3. And there's a quarter of a point GPA differential between students who started in Calculus 1 and those who started in Calculus 3. Black students, who put themselves there thinking they would get ahead and build a greater foundation, were putting themselves behind, right? Not steered toward where they should be, because they felt inferior. Once we saw that information, we got to work with the students.
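The self-placement mismatch that the boxplot analysis surfaced can be sketched in a few lines. Everything below is hypothetical: the score cutoffs and the student records are invented for illustration, not UVA's actual bands or data. The shape of the check is the point: derive a recommended course from each student's standardized math score, then flag students whose self-chosen course sits below (under-placed) or above (over-placed) that recommendation.

```python
# Hypothetical score bands; illustrative cutoffs, not UVA's.
def recommended_course(sat_math):
    if sat_math >= 700:
        return "Calc 3"
    elif sat_math >= 600:
        return "Calc 2"
    return "Calc 1"

# Invented records: (id, group, SAT math score, self-placed course).
students = [
    ("s1", "Black", 730, "Calc 1"),
    ("s2", "White", 560, "Calc 3"),
    ("s3", "Black", 650, "Calc 2"),
    ("s4", "Asian", 590, "Calc 3"),
]

ORDER = {"Calc 1": 1, "Calc 2": 2, "Calc 3": 3}

def mismatches(rows):
    """Split students into under- and over-placed relative to their score band."""
    under, over = [], []
    for sid, group, score, chosen in rows:
        rec = recommended_course(score)
        if ORDER[chosen] < ORDER[rec]:
            under.append((sid, group, chosen, rec))  # placed below the data's band
        elif ORDER[chosen] > ORDER[rec]:
            over.append((sid, group, chosen, rec))   # placed above the data's band
    return under, over

under, over = mismatches(students)
```

Grouping the `under` and `over` lists by demographic group is then what turns this into the kind of disparity evidence the talk describes.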
And now at UVA Engineering, at least when I left, we had an even distribution of Black students between the various areas: Calculus 1, 2, 3. I still wanted to know what happened. Why were Black students so disproportionately in Calc 1? There is still a story to go with that. The college had credit requirements; they were something like 132, and they came down to 128. And the faculty simply decided: we're going to cut a class. We'll cut Calculus 1; it will no longer count. Every race, except Black students, made the transition to seeing Calculus 2 as the starting point. Black students never made the transition. They weren't advised to, they weren't encouraged to, and they didn't believe themselves ready to. They believed in their degradation, in their inferiority, in their need for greater support. And that's why I said: we're going to maximize student potential. Here, coming back to where we are: the study was designed as a design thinking process. We wanted to empathize, define, ideate, prototype, and test. And so we had a research team led out of the College of Education and the College of Engineering. We had psychologists, sociologists, economists, and others involved in the work. To do the analysis of the data, we developed a data clearinghouse at UVA Engineering, to grow over time and to help visualize what information we had. All of these various areas came into play. At the University of Virginia, there were key problems and areas of change that we wanted to identify. Key problem: underproduction of URM STEM graduates in the US. Area of change: increase that number. Call to action: improve the STEM pipeline, particularly for URMs. So we had some problems and some areas that we wanted to address, and we felt we could at the University of Virginia. This is a look at the impact on Black students. Now, this is still UVA; note that this is not Purdue.
This is a beginning look at how GPAs were arranged over time. I'm sorry, this is not Purdue; this is UVA. In the fall of 1990, we had these starting GPAs, all students at about 3.3, undergraduates and graduates. And then we started looking at a trailing line of Black student GPA. From the beginning, the average was down here; we began to close the gap by 2019. Now, this is Purdue. Let's talk about sense of belonging and isolation. I don't know how many of your institutions are like ours, but these are the numbers of Black people at Purdue. We have exactly 920 Black undergraduate students out of 37,101, for 2.5%. We have 277 Black graduate students out of 11,600. We have 48 Black professional students out of 925. We have 59 tenured and tenure-track Black faculty out of 1,910, and 62 non-tenure-track Black faculty out of 1,849. Staff: 327 out of 7,700. There are 1,693 Black people at Purdue out of 61,126, for a total of 2.8%. Why does this matter? Because representation matters, because seeing ourselves matters, and because a sense of belonging is really important. The greatest threat to a sense of belonging is isolation. Isolation eats up the soul. So if we look at these numbers, and you think that representation and affinity are important, what might it be like to be one of 200 Black students coming into the class? The Purdue stadium holds 57,000 people, roughly. If we put all the Black people at Purdue in that stadium, we would take up a small section, a single section. How many of your institutions are similarly positioned? Your numbers may be higher, with Black numbers not being that much different, and some of them are based on your regional campuses as well, which inflate the numbers. I would say to you that a sense of isolation undermines the sense of belonging on our campuses for some people. And it is not as if it's easy to just go out and make new friends.
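The campus-wide share quoted above is simple arithmetic, and a few lines make it reproducible. The counts below are the ones cited in the talk as best they can be reconstructed from the transcript, so treat them as illustrative rather than authoritative.

```python
# Representation arithmetic: sum the Black head counts by category
# and express them as a share of the total campus population.
# Counts are reconstructed from the talk; treat as illustrative.
black_counts = {
    "undergraduate students": 920,   # of 37,101 undergraduates (~2.5%)
    "graduate students": 277,        # of 11,600
    "professional students": 48,     # of 925
    "tenure-track faculty": 59,      # of 1,910
    "non-tenure-track faculty": 62,  # of 1,849
    "staff": 327,                    # of 7,700
}
campus_total = 61_126

total_black = sum(black_counts.values())
share_pct = 100 * total_black / campus_total

print(total_black, round(share_pct, 1))  # 1693 2.8
```

The same division applied per category (for example, 920 out of 37,101 undergraduates) reproduces the 2.5% undergraduate figure as well.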
Our environments need to be positioned to support us and to see us in our fullness. So I want you to look at the data in light of sense of belonging and how critically important that is. What we did at Purdue, as some of your institutions may have done after the murder of George Floyd, was to take stock: who are we? Where are we? How are we doing? And let's do better. We developed an Equity Task Force on the Black Boilermaker experience that had a heavy, heavy data component. The committee was made up of 150 people from all over the campus. We looked at the data in many different areas. We decided that we would work on systemic and environmental impediments to Black Boilermaker thriving. The focus had three pillars: representation, tied to recruitment; experience, tied to retention; success, tied to placement. I am a translationalist, and so it needed to make sense for me. Representation means that Purdue becomes the first-choice institution, first choice, not second choice, for prospective Black Boilermakers. Experience means that Purdue becomes a place to truly call home for Black Boilermakers. Success means that we are effectively engaged in maximizing Black Boilermaker potential. This is all around a sense of belonging. The goals for the Equity Task Force were to increase the number of Black undergraduates, graduate students, faculty, and staff. What have we done? We've hired early outreach coordinators in Marion and Lake counties, that's Indianapolis and the Gary area. We've developed new yield strategies that are data-science infused. We have three, now going to be four, waves of faculty cluster hires; the first cluster is just about completed, in public health, health equity, and health policy. New communication strategies are in place; we're enhancing graduate diversity. So, lots of things. I'm going to pause there, and stop sharing for a second, and say that there was one thing Purdue didn't understand.
Purdue didn't understand its diversity story. And it was easy for Purdue to deficit-frame Blackness and Brownness. It could deficit-frame because it didn't know its diversity story. How many of your institutions know your diversity story? And from a data science perspective, how many of you know your diversity story? Since we didn't know it, we hadn't codified it. And part of my role here, because I needed to express it, was to unearth it and articulate it. So here's something I want to tell you about Purdue. We were founded in 1869, shortly after the Civil War, at the same time that most historically Black colleges and universities were founded, and for similar reasons as well: to provide access and equity to the citizens of Indiana who were farmers, and to newly freed enslaved people and their offspring. 1869. We graduated our first Black student in 1890, on our 21st birthday, as we became grown. That student was in the College of Pharmacy. Pharmacy is the incubator of diversity, equity, and inclusion at Purdue. Our second Black student graduated in 1894, in the College of Engineering. Then came 1896. What happened in 1896? We had Plessy versus Ferguson and the establishment of segregation, separate but equal, which reigned for 100 years, where Black folk couldn't get into white schools. But Purdue and IU and a handful of other Midwest publics actively recruited Blacks into our undergraduate and graduate programs throughout the 100 years of segregation. We graduated our first Black woman in 1911, in the very heart of segregation, in the College of Pharmacy. Purdue is the founding home of NSBE, the National Society of Black Engineers; almost every Black engineer under the age of 50 in the United States has been part of NSBE. We founded the Society of Women Engineers. We are great. Now think about business schools. Think about Harvard. All the buildings at Harvard Business School are named after former Secretaries of the Treasury of the United States, except one.
It is named James I. Cash Hall. James I. Cash was the first African American to receive tenure at Harvard Business School, and he is a double Boilermaker. We are the number one producer among R1 institutions in the Big Ten of Blacks earning PhD degrees in chemistry over the last five years, and in computer science, economics, statistics, and a number of other STEM areas. We have the number one online master's communications program enrolling and educating Blacks. We have the number one program in engineering technology educating Blacks. In other words, what I had to say to Purdue, and what I'm asking you to do here, is not new to us; it's true to us. This work is in our founding documents. It is the reason for which we were founded; it's what we've been doing all the time. It took data to help tell the story of the excellence of Purdue in this space. I think about the state where I am, and my university, which I love. I believe that Notre Dame and Purdue are the citadels of conservative thought in the state of Indiana, and doing this work here is hard. It really requires not just heart; it requires data. And the data needs to make it make sense, right? If I did not have an exquisite data science team that I work with very closely at Purdue, we would not know who we are. We would not be able to tell the story. We would not be able to engender a higher sense of belonging. And we would not be able to de-stereotype the institution, right? We would not be able to just be. So I was telling you that story; back to sharing my screen. We're improving the Black Boilermaker experience: we're developing a new NPHC, National Pan-Hellenic Council, plaza; we're engaging the Black Cultural Center; we're building new programs and improving others. All of the things to help make existing and thriving more possible. We also have begun to inspire people.
We dedicated two of our newest residence halls to two Black women, Frieda and Winifred Parker, who in 1947 desegregated our residence halls. As wonderful as our story is, we are part of the American story as well. And so while we welcomed Black students, relatively speaking, to study here, they could not live here. Frieda and Winifred had to bus or walk a few miles every day in their freshman year to go home, because they were not allowed to live in West Lafayette. They had to live in Lafayette, across the river, and only in certain residences. They loved Purdue sufficiently that they fought for their right to be on campus. Their father was part of the civil rights movement and appealed to the governor of the state of Indiana, whose name was Gates at the time, who told Purdue, frankly, to desegregate. So I want you to know that in the midst of this, there were still struggles, and these struggles have had to be overcome in different ways. This is a picture over here of the sons and daughters of the Parker sisters with our president, dedicating those residence halls. We decided to develop strategic partnerships with a handful of HBCUs. We've decided to partner with Morgan State University in Baltimore, Maryland. Morgan is one of four Black R2 universities that want to go up to R1 status. Purdue will be Morgan's R1 sponsor as it develops its roadmap to R1 over the next ten years. We had the president of Morgan State, Dr. David Wilson, in the picture there, as our MLK speaker. And in the next picture we have a high-level delegation from Morgan State and Purdue engaged together, from which we will announce our institutional partnership in the coming couple of months.
This partnership will include large-scale, institution-level research collaborations, probably in artificial intelligence and cybersecurity and other areas. It will include faculty exchanges, the potential of joint faculty appointments, 3+2 programs, 4+1 programs, and so forth. It will be, we hope, a model engagement for R1 institutions working with HBCUs. We will fully support Morgan State on its journey toward R1 status. So this is what we're doing: jointly funded research, development of new graduate programs, joint teaching, faculty and student exchanges, dual degrees, etc. We have to talk about representation. We have Black fraternities and sororities. Purdue didn't realize, and perhaps others don't realize, that when it comes to Black progressive movement, political, educational, social, economic, there are three pillars of the Black community that are really vital: the Black church, Black colleges, and Black fraternities and sororities. We have no houses for our Black fraternities and sororities here, as most institutions don't. Some institutions have gone to developing Pan-Hellenic plazas. This is the Pan-Hellenic plaza at Wake Forest, which I think is one of the nicest in the country. We will develop our own Pan-Hellenic plaza, which will be the best in the country, because we know what the rest of the country has done, so we can build on it. We will honor and celebrate the rich legacy of the Black Greeks. Now, there's a data point here that's important. Seven percent of Black undergraduate students are members of Black fraternities and sororities. Six percent of Black alumni are members of the Purdue Black Alumni Association. And about 70 percent of the people who are in the Black Alumni Association were in a Black fraternity or sorority. There is a connection, from a data perspective, between increasing a sense of belonging on the front end and sustaining and increasing a sense of belonging on the back end. And that accounts for long-term institutional vitality.
So it all matters, the small things. Building a data-guided culture is a whole other matter. What I want you to note here is that being data-guided means using data to understand the real-life experiences of people and taking actions that have positive impacts. Belonging has three fundamental questions. Do I feel like I belong? It's a feeling; it's intrinsic. Do I envision the fullness of my potential coming into being here; can I see it? And am I nurtured or supported to reach that potential? Where we have affirmation to those questions, we have belonging; where we don't, friends, we have an opportunity to reconsider and to reframe. That is my presentation for you today. I thank you for listening to me and for being so patient, and I'm open to any questions you may have. Thank you very much, John. So, we can entertain questions in the chat. You're also welcome to raise your hand; it's certainly a small enough group, and I enjoy hearing people talk about their questions, so I'll give everybody a moment. And if there are no questions, I certainly have a bunch. Well, many of them will probably go that way. So: what did UVA do, right? How did UVA Engineering change? Everyone wanted it, but change is not easy, right? It's not easy. Race is not easy. Doing this work is not easy. So I will tell you a story. I decided that I couldn't live with what I saw, and that change was going to come. Change doesn't come easily, one way or another. I knew that people would not be happy, because there were some people who were comfortable with where things were. I had one department head say to me: John, if you've identified a problem, if you've got the students who are doing less well and you've got them corralled, leave them there and work on them there. And another department head said to me: John, I get it, but I don't want these students messing up my rankings, so leave them there.
There are a few things that actually transpired. I used to give talks about a sense of belonging at UVA, and a Black female student came to me and said: Dean Gates, I want to tell you about my experience. I listened to her, and it was very impactful. I said at the end of it: if you want to make a difference, tell the faculty. She said: how do I tell the faculty? I said: call a meeting. She said: I'm a student, an undergraduate; how do I call a meeting of the faculty? I said: call them by name. Professor X, Y, and Z: I am convening a conversation on my experience as a Black student at UVA, and I would like for you to attend. She did that with a group. The deans came, the department heads came. There were 18 Black students present. She asked them: how many of you got into the major of your choice? No hands went up. She asked: what was the impact of that on you? And tears began to run down their faces. One young woman said, I came to UVA. And I'm not dogging UVA, by the way; all of our institutions are the same. Don't believe the hype; I've been at Harvard, at Vermont, and it hasn't changed that much. So if you look at your data and what's actually happening, your institution may not be that different. They talked about having their dreams dashed, about not being able to be the people that they had thought themselves to be for many years, and about the sense of demoralization. The faculty were moved by that: less by the data than by the stories. Faculty talked and talked. This was in my first three months at the university, right? Because I saw this real quick. And I knew that I might not be able to stay at UVA if I took this on, because people were really invested in the status quo. We put a motion before the faculty and said: are we prepared to affirm that we are not denying equal access to education to Black people?
Based on the data that I shared with them in my first three months, the faculty voted to overturn the caps on majors. And now every student who enters UVA Engineering enters into the major of their choice. We liberated Black students, and all students. What did they change? They've gone deep. They changed how they advise students. There's a deeper engagement with student success. There is more scholarship support being awarded. We rebuilt the center that supports the students, and we began to work on the DFW rates, and those began to decline for Black students. We saw the mean GPA of Black students go up to be just about on par with majority students, right? We saw the outcomes increase over time. They are continuing that work, and we've seen some of the same things here. So let me see, I've got questions. Okay. What specific actions do you recommend for an institution to pivot to maximizing student potential rather than achievement gaps? For a student with lower prior knowledge, do you have suggestions? Yeah. So, interestingly, this does not negate the fact that students have differential achievement rates and that we need to do the work with them to support leveling them up. But it's the way we talk about it, right? The work is the same; it's the lens through which we interpret the work. So different people need different things. The data, the literature, and experience have indicated to us that some people need support and some people need nurture. Support is: here's what we've got, go do it, and if you need something, tell me. Nurture means expressing a belief in the student beyond that which they may believe of themselves. In other words, there's a difference between saying to a student: here's my syllabus, go at it.
And saying to Samantha, who may be one of only a handful of Black students, or whatever the background is, and who may or may not be feeling a sense of belonging or isolation: Samantha, I'm happy to have you in my class. I'm a tough professor and I have high expectations. I'm looking forward to you rocking it in this class. I'm so excited; you're going to just blow my mind. If there's anything I can do to help you along the way, let me know. But you've got this; I'm looking forward to it. What does that do for Samantha? It gives Samantha a sense that somebody cares, that somebody sees her in a different way, an elevated way, and that she's not just working for herself but working to prove to you that your faith in her is well placed. There is still a need for that human connection. So it's more about seeing what the potential of the student is, recognizing the gaps, the achievement differentials, and supporting the student to incrementally, systematically move to higher levels of attainment. Indicating, therefore, that the student doesn't have a gap per se (it's nuanced language, nuanced thinking) but rather a space and an opportunity to elevate from where they are. So: reframing, placing it in positive terms, providing supportive environments. And don't think that every Black student is in need; that's really important. Listen back to those videos. Black students do math. Black students rock. Black students are scholars: presidential scholars, trustee scholars, emerging leaders scholars here. We're the Cradle of Astronauts; we've not had a Black astronaut yet, but we're getting there, right? So place them at a higher level of expectation. Never, ever reduce your expectations and desired outcomes based on race. See that Black person as equally capable; expect and require the same results,
Knowing that they may need more support or nurture to meet those results. They need a personal connection, right? A personal connection. It might be, as you've read a paper, to scribble down: John, great job on questions 2, 3, and 4. I'd like to see you reflect more, and come back to me in conversation about questions 5 and 6. All right? So this isn't rocket science. This is just about tapping into the human. Okay, let's see what else we've got. At UVA or Purdue, have there been advocates of making changes to the curriculum or content of courses in the service of maximizing student potential? What about pedagogical changes? Yes. And, you know, it's difficult, because we're tied to our pedagogy. What we have at Purdue, for instance, and UVA has done some great things as well: we showed this data to some of our faculty. What do we call it here, the Teaching Academy. It's made up of the most distinguished teachers and professors on campus. These people have lots of cachet, and we took this particular data to them. And we said, we want to work with the faculty. They were thinking the same thing, and we developed a framework collectively around inclusive advising and inclusive pedagogy. So our Teaching Academy is championing inclusive pedagogy, framing it up. It's going into curricula across the university. We see departments wanting to change and augment their curriculum. We see departments looking closely at the DFW rates and disparate impact and doing things there. There seems to have been, over the past three years or so, a whole movement across the university towards this framing. If you ask somebody here about deficit framing, they're going to tell you we don't do deficit here, right? We don't talk about at-risk, all of that sort of thing. So the faculty have taken it on. And how do I know what's happening? The faculty are leaning into diversifying themselves. Yes, Lord.
Now we have these cluster hires happening, right? So we've got this focus on increasing black faculty. The ways that black faculty have come to Purdue, and check your own institutions because I bet yours aren't too much different: probably 90 percent of black faculty have come in through the strategic hiring paradigm, right, where we supplement salary and startup and whatnot. So black faculty have been thought of as a plus, not a must, right? Because there's this interesting financial incentive, right? What we found is that we have four strategic hiring lines per year. Over many, many years, black faculty coming in never exceeded four, and they did not generally come from outside of that pool. So whether they were first choice or not, they weren't presented as first choice; they were presented as good to have. Our faculty this year have leaned in significantly, and I expect to see a record year for black faculty hiring on campus, coming through the cluster hires, through the strategic hires, through the exceptional hiring pool, and through the regular doors. I expect that we will celebrate a record number of black faculty coming in this year, and that number would be a record even were it not for the equity task force. So part of what people needed here was to have a reason. Knowledge, right? Give us the knowledge, let us know what's happening, and now give us a reason to do the work. We did not castigate the faculty. We asked them to do the work of diversifying themselves, and they took that up. Okay. What phrasing recommendations do you have for speaking directly to students about changing their behavior, to encourage more help-seeking and tutoring? Yeah. So black students, first-gen students, a lot of students, they have an issue with saying, I need help, right? It could be Generation Z, whatever it is. But I can tell you, with black students, they don't want to be seen as not being able to cut the mustard.
And so they will suffer on their own, in silence and in isolation. What we did at UVA, and what we've done here in these heavy STEM areas, is to elevate the cachet of tutoring, and of counseling and other areas. In other words, the best students get tutoring. To elevate it, right? It's not the worst students, it's the best students that do that; we're all struggling and suffering, right? Reaching out for support and help is a sign of strength, of significant independence, et cetera. I think the biggest way to break things down, if we can, is to just have a conversation with the students, get them more comfortable, if they're willing to come to your office hours, right? Students are afraid of faculty. They're intimidated by faculty. They don't want you to think poorly of them. They think coming to your office is like going either to the principal's office or to the president's office, right? You are that for them. Oh, I've got to tell you this. This is a big thing for me, faculty. A dear colleague at UVA, Josipa Roksa, who co-wrote the book Academically Adrift years ago, had the most profound research finding, I think around 2016 or 2017, on motivation. You can find it out there. And the general takeaway is: whereas motivation to succeed in college is predictive of success, it is less so for African Americans. African Americans come to college with the highest motivation to succeed, second to Asians. But the predictive factor for African American students is whether they believe that their faculty believe in and nurture their excellence. I come from an HBCU, Morehouse. Our faculty were everything; we would not disappoint them. And they believed in and nurtured our excellence every step of the way. The question is, how do we convey to students that we believe in and nurture their excellence? It's in the small ways that I've just talked about. And you know, it's small ways; it's not rocket science.
We think about microaggressions. Microaggressions are the intentional or unintentional, non-physical assaults that minimize one's sense of being. What is the antithesis of a microaggression? It is micro-affirmation. It's the small ways in which we affirm the excellence, the brilliance, the dignity of these students. Learn to practice micro-affirmation with your students, if you haven't, and watch them blossom. Let's see if there are any more questions here. How do co-requisite developmental courses show in the data, compared with starting lower with prerequisite and bridge programs? So co-requisite and bridge programs, you know, work. They just work. Now, help me understand what you mean by co-requisite. Just pop on and share that with me so I can better understand. You can unmute. I want to be able to answer your question. When you're ready, I'll come back and do that. Did you have any programs specifically designed to address the issue of biased conceptualizations of inferiority in black students? Yes. Go ahead. I'm sorry. I work at a community college currently, and we have developmental courses. They test the students, and based on that test they recommend developmental courses, which are credit courses, and they are given alongside the main course. There is staffing for the program. And then, as I mentioned, if they're starting, for example, below calculus one, that's also increasing their time to completion, and sometimes, especially at community colleges, these are working individuals, and as time goes on they may drop out, and then the graduation rate doesn't look good. Does it affect things in those universities, in your data? You're right. What we found was a direct correlation to longer time to degree and higher indebtedness. So black students had the greatest indebtedness of all students, through that university and through this one as well.
Marginally, but slightly longer to graduate and slightly more overall indebtedness. We have here a co-requisite curriculum for students that are going through our Student Success Program. So these are often some of the same categories of students, but based on economics more than, say, race or other things, where there are special support systems, tutoring and the like, and some co-requisite courses. At UVA engineering, there were no co-requisites unless the student was coming back from academic probationary status, and there were some things that they needed to do. In both cases, at both institutions, there is a resistance to anything around remediation of students, right? So students must come in and demonstrate their excellence. But there were pathways of support for them. George, I see it's time.
Description of the video: Thank you for being here, Martha. Thank you, Linda, for that very kind introduction, and thank you, George, for the invitation to speak here. It's always hard to live up to really nice introductions like that, but I'm going to not think about that too long and just start my presentation. So let's see. Martha, may I interrupt? Before we get started, I want to mention that we're going to allow people to ask questions throughout the conversation. And it's a small enough group that if somebody wants to raise their hand and speak, they're welcome to do it. But otherwise, I'll monitor the chat and I'll try to interrupt as those come up. Is that how you want to go? Absolutely. Okay, thanks. Yeah. I mean, George has allotted me two hours, and if you've ever let an administrator talk for two hours, it's going to be pretty painful. So you should interrupt me freely and frequently. I won't be able to see everyone, but George has promised me that he will let me know if folks have questions. So the title of the talk today is Mapping Student Pathways Through the Curriculum, and I really wanted to try to do a couple of things in this presentation. Unlike most of you, I'm not trained as a data person, nor do I have any expertise in education. I grew up as a chemist and biochemist, and, as Linda said, I've been doing that for a long time. For a shorter time, I've been involved in efforts to improve equity in undergraduate STEM education, and then I became Associate Vice Provost for a couple of years. And I guess what George was hoping is that what I lack in expertise about data analysis, I can provide in sort of understanding, from my past as an Associate Chair in charge of the curriculum for chemistry and as a current instructor; I'm only an administrator half time. So I have to practice what I preach, or people will notice.
And so I think the hope is that I can sort of illustrate, through the use of some of our data, how such data might inform faculty, and I think also how rich conversations between faculty and administrators on the one hand and data folks on the other really lead to exciting new insights. So before I do anything else, I really want to thank three people. I came to this work initially through George, actually. A colleague of mine had done a learning analytics fellowship in which she was interested in whether we could use an intelligent tutor in chemistry to help students who didn't pass the test to get into our general chemistry class, whether we could use an intelligent tutor to have them review some things over the summer and come into the fall semester ready to take the general chemistry course. We have a pre-general-chemistry course, and she used the learning analytics center to compare those outcomes. And what she found is that the intelligent tutor worked just as well, especially if you'd had any chemistry before. So that was my introduction to the kind of powerful role that data could play in advising our students and in thinking through our curriculum. And then, as I started working, George introduced me to Linda and Stefano. I remember our first couple of conversations, where I think we were speaking totally different languages, and with time I've come to either understand better the language of data folks, or just be more comfortable only half understanding it. But we've had a lot of really vigorous conversations, and Linda says something that I think is really important, and that is: the data don't do us any good if we don't have the questions. We need to come to you, the faculty and administrators, for the questions, and then we can use this incredibly rich source of data.
And Stefano: I've worked with Stefano quite a bit, with George and Stefano, in the SEISMIC alliance, which is a ten-university alliance for equity in introductory STEM classes, and also the Bay View Alliance, which is interested in evidence-based teaching. The latter part of the talk will be all work that Stefano has done. Okay. So I'm going to go through three sort of vignettes. The first is how I came to understand that we're not ready for the students of the future. You all may want to know whether people actually pay attention to your data and think about them, and so I'll show you how I've thought about them. And I'll briefly talk about my approach to trying to do chemistry a little bit differently; call that teaching an old dog new tricks. And then finally, I'll talk about curriculum analytics as process mapping. Again, this is work for which Stefano really has been the main driver. Okay. So I guess I'll begin my talk. A few years ago, when I was Associate Chair of Chemistry, my chair was invited to go to an AAU symposium on effective STEM teaching, and she delegated that job to me. So I went, and I saw a talk by Marco Molinaro at UC Davis in which he showed data that basically showed that your college GPA correlates with the average income of the neighborhood in which you grew up. Right? And at first you think about this and you go, that's awful. And you think about it some more, and you think, we're not leveling the playing field. Many of us who are interested in education got into education because we thought we were leveling the playing field. And we weren't. The more I've learned about this, the more I think that not only are we not leveling it, we're making it worse.
And I know you heard from John Gates yesterday, and he can talk about that better than I can, so I won't dwell on it. But this is a figure from a paper, on which Stefano and Linda are coauthors, that was submitted from the SEISMIC collaboration. And basically they talk about this term, systemic advantages. So if you were white, male, affluent, and your parents went to college, you have four advantages. If none of those things are true, you have zero advantages. And there's a clear correlation, especially between about zero and three advantages, between your overall GPA and the advantages that you came into college with. And then I also learned from the SEISMIC collaboration about grade anomalies of women in STEM. I've been a woman in STEM for a long time; when I started at IU, I was the only woman faculty member in chemistry, and so I've had a longstanding interest in this issue. They published a really nice paper on grade anomalies that basically showed that women do worse in STEM classes than in their other classes. They don't necessarily do worse than men; they do worse than they themselves do in their other classes, unless it's a lab class, in which case they do a little bit better. So I got very interested in that, and my colleague Laura Brown and I started examining some of the data. And as we started examining the data, my eyes were really opened to just how significant these disadvantages are. And so, again: Martha, go back to the last slide for a second. Can you explain that scatter plot a little bit? It's kind of small on the screen for some of us. Yeah, it is small, and the basic idea is that on the y-axis is the gendered performance difference. Do men do better than women, or do women do better than men?
These are STEM classes, and so up here men do better than women, and down here women do better than men. Up here in yellow are the lab classes. So you can see women do better than men in the lab classes in general, and not quite as well as men in the non-lab classes. You can also see, on this axis, the average grade anomaly, which is basically how much harder this class is than a normal class. And what you can see again is that the men are outperforming the women. Not by a huge amount, but by a lot when compared to women's normal grades. Women in general, at least on our campus, have a GPA about two tenths of a point higher than men. So if you're performing the same in chemistry, either that means that chemistry is less discriminatory than everything else, or that women are just not doing quite as well as they do in their other areas. And that leads to a perception that I think leads to attrition, as I'll show you a bit later. Does that answer your question? Okay, thank you very much. Okay. So I came back and looked at our data, although the slide won't tell you that. And what I saw is that, in this case I did not consider gender, if you are white, affluent, and your parents went to college, your GPA is going to be almost 3.4 on average. But if you are from a historically excluded group on the basis of race or ethnicity, if you are a first-generation college student, or if you're eligible for Pell Grants, your GPA is going to be more on the order of 2.9. That's a 0.5 difference, which is huge considering the standard deviation of overall grades. And it turns out, I was curious what percentage of our population we're talking about: about 31 percent of students at this university fall into one of these groups of underserved students, and in an intro chemistry class like the one I've been teaching, it's 39 percent. So it's a not insignificant proportion.
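The "grade anomaly" idea described above can be made concrete with a small sketch. This is a hypothetical illustration, not the SEISMIC paper's actual methodology: the records, course names, and simple averaging are all invented for demonstration. The metric computed is each student's grade in a target course minus their mean grade elsewhere, averaged over students; negative values mean the course grades below students' other classes.

```python
# Hypothetical sketch of a per-course grade anomaly. Data are invented.
from statistics import mean

# (student, course, grade_points) enrollment records -- made-up
records = [
    ("s1", "CHEM101", 2.7), ("s1", "HIST101", 3.7), ("s1", "ENG101", 3.3),
    ("s2", "CHEM101", 3.0), ("s2", "HIST101", 3.3),
    ("s3", "CHEM101", 3.3), ("s3", "ENG101", 4.0),
]

def grade_anomaly(records, course):
    """Mean over students of (grade in `course` - mean grade elsewhere).
    Negative means the course grades harder than a student's other classes."""
    diffs = []
    students = {s for s, c, _ in records if c == course}
    for s in students:
        in_course = [g for st, c, g in records if st == s and c == course]
        elsewhere = [g for st, c, g in records if st == s and c != course]
        if in_course and elsewhere:
            diffs.append(mean(in_course) - mean(elsewhere))
    return mean(diffs)

print(round(grade_anomaly(records, "CHEM101"), 2))  # negative: a "hard" course
```

Splitting the same computation by gender (or any other attribute) would give the gendered performance difference plotted on the slide's y-axis.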
And, you know, there's a real big difference in grades. And one thing that's interesting, I think this is important: this is a slide that came out of a question a student asked of me. A student said, I want to understand how women of color do in intro classes. And she said, I want you to look at two math classes, one chemistry class, and one biology class. So I went and looked at the data and graphed grades for women who, and I apologize for using "URM", but that's what we say in our institutional data, so as a shortcut I've used it, women who are from an underrepresented group on the basis of race or ethnicity on the one hand, versus white and Asian American women on the other hand, and the same thing for men. And what I hope you can see is that if you're just thinking about non-URM folks, with the exception of one math class, the men and the women do pretty much the same, and women do better in biology. But if you look at minoritized women versus minoritized men, there's a big difference, except in biology. Okay? And so what that tells me is, if we have programs that are trying to address the performance of women in science, and they're not taking into account this kind of intersectionality, we're going to recruit middle-class white women and we're not going to move the needle at all. So it's really important that we analyze the data in these sorts of intersectional ways. And we have so much data that we really have to wait for someone to ask a good question before we can get a good answer. Okay. I also wanted to show you the DFW rates, that is, the percentage of students earning a D or an F in the class or withdrawing from the class.
And basically, if you are a first-generation student, if you're from a historically excluded racial or ethnic group, or if you're Pell eligible, your chances of not getting through a course are about one and a half times as high as they would be if you were not in one of these underserved groups. And it's pretty similar across all of these categories. Martha, what courses, what data are you looking at here? This is all courses at a large Midwestern university, all the courses to which I have access to the data. Every course? Yeah, this is every course, including humanities courses, everything. Okay. Yeah, that's a really important point, George. I've been focused on STEM a lot, and I get to continue being focused on STEM, even wearing a university hat, because we have terrible DFW rates. But what's different about STEM versus other courses? It's not so much the degree of inequity; it's that the background DFW rates are higher. So if your background DFW rate is 30 percent, that means students in one of these groups have a 45 percent chance of not getting through. So the numbers are just really big, but the problem is pervasive throughout the university, and I don't think that we're atypical in this regard. Now, of course, we know that if a student has to drop a class, or gets a D, F, or W in their first semester, it has a huge impact on retention and graduation. I decided to look at those data for our university, and I was a little surprised at just how strong an impact it has. For first-year retention, if you have only one D, F, or W, it doesn't make that much difference. But if you have two, big difference. And that just gets exacerbated as you go down through four-year and six-year graduation rates. So the difference from having one D, F, or W in your first year is pretty significant on the six-year graduation rate.
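The DFW comparison above reduces to two simple quantities: the fraction of enrollments ending in D, F, or W, and the ratio of that fraction between two groups. Here is a minimal sketch; the grade lists are invented, chosen only so the ratio comes out to the roughly 1.5x figure the talk mentions.

```python
# Minimal DFW-rate sketch. The rosters below are made-up illustration data.
def dfw_rate(grades):
    """Fraction of grades that are D, F, or W."""
    return sum(g in {"D", "F", "W"} for g in grades) / len(grades)

underserved = ["A", "D", "C", "F", "B", "W", "B", "C", "A", "B"]  # 3 of 10 DFW
comparison  = ["A", "B", "C", "B", "W", "A", "C", "B", "A", "F"]  # 2 of 10 DFW

r1 = dfw_rate(underserved)
r2 = dfw_rate(comparison)
print(r1, r2, r1 / r2)  # the last value is the relative risk of not passing
```

With a 30 percent base rate and a 1.5x relative risk, the underserved-group rate is 45 percent, which is the arithmetic behind the "45 percent chance of not getting through" remark.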
And if you have two, that's really, really significant. So it makes a big, big difference. And in fact, over half of our students get a D, F, or W in their first three semesters. So it's not a small group of students, it's not an outlier group of students, that we're thinking about. This really matters. Okay, but grades aren't the only thing that matters. In our chemistry courses at IU, women essentially do as well as men. But look at persistence in our curriculum, if we start with our pre-general-chemistry course and go up through general chemistry, organic chemistry 1, and organic chemistry 2. And I'll come back to this, because I don't really feel comfortable pointing fingers at other people's departments; I'll point fingers at my department, and also this is the department I know best. We started off in our pre-gen-chem class with about 62 percent women. So we don't have any problem getting women in the door in chemistry. But by the time we get to the third class, we have a much smaller percentage of women, only about 55 percent. And this is the kind of slide I used to talk the deans into giving us money. Because I say, look, if women persisted at the same rate as men, without even improving the rate at which men persist, we would have had an additional 800 women in these courses over the last five years. And in the end, customers in seats: this is how we earn our money. Another thing I found exploring these sorts of issues is that there's a really different picture in computer science. In computer science, where there's also a highly structured course sequence, it turns out that women actually have slightly worse DFW rates, at least in the first couple of courses, than men do. But they persist in the major to the same extent as men. So we have 18 percent women in the early courses, and we have 18 percent women in that fourth course.
So that's kind of interesting. That suggests to me that women are more likely to retake a course and plow through than men in computer science are. But we have a very different problem in computer science. In chemistry, we get women in the door and we need to keep them; in computer science, we need to get more women in the door. Okay, so those are the kinds of things that I've been exploring, and thinking about how we need to do things differently, how we can fix these problems. And Linda mentioned that I also am a fan of survey data, so I just want to show a couple of slides that came from the National Survey of Student Engagement, or NSSE. And I just want to highlight first-generation students here. So in the NSSE in 2021, which is given both to first-year students at the end of their first year and to senior students at the end of their senior year, fewer first-generation students at our university reported that they've studied with other students for exams or worked with other students on class projects, and this holds for both first-year and senior students. In the first year, first-generation students perceive courses as not being taught in as organized a way as other students perceive them. By the time they're seniors, either we get them used to our unorganized ways, or we reveal the hidden curriculum, and that goes away. Fewer students report that they've had good interactions with faculty, but only in the first year. So that's good; that's a problem that maybe we're solving as we go along. But again, it's persistent that they report fewer good interactions with other students. And first-generation students are much less likely to have participated in high-impact practices like internships, study abroad, and undergraduate research. So that's a real problem. All of these things persist.
We're not solving these problems as we go along. But more first-generation students report that they've connected their learning to societal issues and problems. And we know that especially this generation of students really, really wants that. And I have to admit, I've been in an ivory tower for a quarter century; I'm not that great at that. So having students on our campus who can do that is really important. They've also included diverse perspectives in their coursework to a greater extent than continuing-generation students. And so I use these data to point out that these students who are in underserved categories are bringing a richness to our campuses that we benefit from. We really need that. But we're not serving these students as well as we are serving affluent white students whose parents went to college, people like me. Okay? And we have an obligation, if we're going to use this richness, to solve this problem. Okay, so this is my part about teaching an old dog, sorry, it should be teaching an old dog, and an older one, a new trick. And I want to highlight that at the same time that I was learning about data, I was participating in a program on inclusive teaching that was the brainchild of the Assistant Dean for Diversity in the College of Arts and Sciences here, Carmen Henne-Ochoa, and she's just fantastic. I also want to highlight Claire Hynes, who was a student in the program. So this was a program on inclusive teaching; we read bell hooks and all kinds of stuff, including Teaching to Transgress. And I thought, wow, I thought I cared about student empowerment, but I've got a long way to go here. And we had faculty-student conversations that were really, really powerful.
And I want to credit Claire Hynes, who recently graduated with a degree in chemistry and actually helped me teach my class, for really helping me to understand a lot about the student experience; that changed my view, especially of assessment practices in chemistry. So here are the data for chemistry. I'm showing our DFW rates again, basically the student success rates, and they're classified here by URM and non-URM. On our campus, URM is largely black and Latinx students. And what you can see is that we have really high DFW rates in all of our classes. Right? The pre-gen-chem class is somewhere between 20 and 40 percent depending on the year; the base rate is around 20 percent. The gen chem base rate is around 25 percent. Organic chemistry can be up to 40 percent; organic 1 is our highest-DFW-rate class. And it goes back down a little bit closer to normal in organic 2, but they're pretty bad. What's the difference between the blue line and the orange line? Okay. The blue line is underrepresented minority students, and the orange line is non-URM students. Okay. So I wanted to highlight, first, that the background rates are high, and then the equity gaps are also high. We're seeing really big differences between the DFW rates for minoritized and non-minoritized students. But I looked at this slide and I thought, yeah. Somebody asked, and maybe you were going to say this, but the question is, what happened in organic 2 in 2013? Yeah, so I would say that sometimes these blips occur; you can see something in organic 1 as well. And in fact, I was going to make a joke about this. But I want to first say that when I looked at these data, right, some people say, well, you know, our job is to help students figure out what they're good at in college.
And if a student's not good at chemistry, we should let them know, so that they can go and major in something that they're really good at and lead a long, happy, and prosperous life. And that sounds good. But if that were true, if we were actually just identifying students who are good at chemistry and the other ones would go away and do something else, then we would expect to see the DFW rates go down by the third course, for sure. And that's our highest-DFW-rate class, right? And we've gotten rid of most everybody, and we're still kicking out 25 to 30 percent in the fourth class. It's astonishing, right? So I look at these data and I said, the system's broken. There's something fundamentally broken about the system, and we can't just tinker around with little changes. Right? And I just want to say, to reveal everything, there was one year in here that I taught organic 2 for almost the whole year, just not the summer, so I taught the vast percentage of students, and I wish I could tell you that it was 2013. But in fact, the year I taught it was 2011. And actually, I co-taught one of the times with the person who actually got these data and who taught it some other times too, so there's some fluctuation here. And I will tell you about some things that I think Laura Brown, who taught this year, has done to make things a lot better. But I won a teaching award that year for teaching this class. I had great student evaluations. I got great peer evaluations. But I was only serving two-thirds of my students. Right? That shouldn't be enough to get you a teaching award anymore. We need to rethink what it means to be an effective teacher. It's not just putting on a good show; we need to serve all of our students. And I don't want to say that there aren't challenges to teaching classes of 300-plus students.
There clearly are challenges to teaching classes of 300-plus students, and we can talk more about that if people are interested. But still, we've got to do better than we're doing. Just a second, Martha? Yes. A question came up; I think you're going to get to this, but what actions were taken to decrease the DFW rates? Yes, I will talk a little bit about that. For these years, what I would say is that there are a lot of faculty members in chemistry who were really interested in decreasing these. Not all the people who teach these classes, but a lot of them. And one of the things that was interesting to me as I started looking at these data: you know, I know who I think has a good attitude about diversity and who has a bad attitude about diversity, and I thought there would be some correlation between equity gaps and attitude, and there was none. Absolutely none. On one hand, that's a little bit disturbing, right? Because you can really care about your students, all your students, and still have similar outcomes. But I choose to look at this more optimistically: to me, it means there are a lot of faculty out there who want to do better than this and don't know how. And Carmen's program taught me how. I'll talk a little bit about my journey, and then I'll tell you a little bit about what some of my colleagues have done as well. So one of the things we knew, from early data that George and Linda provided us, is that students who start in the pre-gen-chem class struggle to pass gen chem. What I have here is the DFW rate in gen chem plotted against your grade in pre-gen-chem. And if you make a B plus or better in pre-gen-chem, you do just fine in gen chem. I mean, we have people who drop with a B plus, so a 10 percent DFW rate is pretty good. But you don't even have to go down to a B minus: if you make a B or worse, your DFW rate is 55 percent.
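The plot described above is a conditional DFW rate: the non-pass rate in gen chem, bucketed by the grade earned in pre-gen-chem. A minimal sketch of that look-up follows; the student records and the B-plus cutoff used for bucketing are invented for illustration (the talk's actual figures are 10 percent above the cutoff and 55 percent below).

```python
# Sketch of DFW rate conditioned on a prior course grade. Data are made-up.
GRADE_POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7, "C": 2.0}

# (pre_gen_chem_grade, passed_gen_chem) -- invented outcomes
students = [
    ("A", True), ("A-", True), ("B+", True), ("B+", False),
    ("B", False), ("B-", False), ("B", True), ("C", False),
]

def dfw_by_prior(students, cutoff="B+"):
    """Split on whether the prior grade met the cutoff; return the
    non-pass (DFW) rate for each side as (at_or_above, below)."""
    hi = [ok for g, ok in students if GRADE_POINTS[g] >= GRADE_POINTS[cutoff]]
    lo = [ok for g, ok in students if GRADE_POINTS[g] < GRADE_POINTS[cutoff]]
    rate = lambda xs: sum(not ok for ok in xs) / len(xs)
    return rate(hi), rate(lo)

print(dfw_by_prior(students))  # (rate above cutoff, rate below cutoff)
```

This kind of conditional table is what motivated reserving a supported gen chem section for students below the cutoff.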
So you have less than a 50 percent chance of getting through the course if you've made below a B-plus in the prerequisite class. And I learned a lot from Carmen and CLIL. And I went to my chair and I said, hey, in the spring we have a section that's reserved for students with really good backgrounds. Instead of using that section for that, I'd like to use it for students who have earned between a C-minus and a B in the pre-gen-chem class, and give them more support in gen chem. In my time working with Carmen, I had become convinced that grades are stupid. We can't get rid of grades all the way, but I think the data are really quite clear that grades incentivize performance rather than learning, and to the extent that they incentivize mastery at all, it's fairly superficial mastery. Grades don't incentivize persistence, learning from mistakes, teamwork, and I would argue even deep understanding. So that's a problem. So what I decided to do was to go with a mastery-based grading approach in my course. This was a pretty labor-intensive practice. I was hoping I'd be able to figure out how to do this for a class of 300; I'm not sure that I can, without serious help from the textbook industry, who has the resources to make some of this easier. But I came up with a list of competencies for each topic in the semester, and students could earn a competency by giving roughly B-level answers to those questions on a quiz. I had biweekly quizzes. The first half of the material would be a repeat from the last quiz, and the second half was new material. And the way it worked is, students would take the quizzes on Tuesday nights, and they'd come in on Wednesday morning, and I would have them work together to come up with the answer key to the quiz.
And we'd go around and try to make sure they didn't leave the room until they actually had the right answers, instead of just agreeing on the wrong answers. So we spent a lot of time trying to help students understand the quizzes more deeply as they were doing that. And this was spring of '21, so no spring break; everybody was getting tired. It worked pretty well in the beginning. Oh, I'm sorry, I should say: at the end of the semester, I gave them two additional attempts at all topics. So if they didn't quite have some things, they could study some more and get them. What I found is that if we incentivize persistence properly, the students will step up to the plate. I didn't target the group I wanted perfectly; I got about half of the people that I wanted, and half the folks made a B-plus or an A-minus in that class. So I calculated that the historical DFW rate for this group would have been about 30 percent. And after 14 weeks, I was a little worried: after the last quiz with new material, using the grading scale that I eventually used, we would have lost almost half, and the average grade would have been a C-minus. Instead I said, okay, you have three more shots. Three more shots. And we basically had Zoom rooms: you want to study topic one, come to this room; topic two, come to this room. And the students really stepped up to the challenge. After three more quizzes, they had an 8 percent DFW rate. 8 percent. And the average grade in the course was a B. That's kind of astonishing, right? I didn't expect anything that good. All right, did it work again this semester? Well, this semester, you would think, would be better for COVID, but I think if you talk to any faculty member, they'll say it was actually worse. And so here are the preliminary data; I just turned in grades a couple of days ago.
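The bookkeeping behind this mastery scheme is simple: a competency counts as earned once any attempt, including the extra end-of-semester attempts, clears the bar, and later attempts can never hurt. A sketch under that assumption, with hypothetical topic names and a pass/fail judgment per attempt:

```python
# Each competency is "earned" once any attempt reaches roughly B-level;
# extra attempts at the end of the semester can only add to the set.
def competencies_earned(attempts):
    """attempts: {topic: [True/False per attempt]} -> set of earned topics."""
    return {topic for topic, tries in attempts.items() if any(tries)}

# One hypothetical student's record across the biweekly quizzes.
student = {
    "stoichiometry": [False, True],         # earned on the repeat quiz
    "equilibrium":   [False, False, True],  # earned on an extra attempt
    "kinetics":      [False, False],        # not yet earned
}

earned = competencies_earned(student)
```

The design choice the speaker emphasizes is exactly the `any()` here: grading rewards eventual mastery, so persistence is directly incentivized.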
But this time, about 95 percent of the students in the course were from the target population: they were either retaking general chemistry or had earned between a C-minus and a B, inclusive, in pre-gen-chem. The median grade this year was a B-plus; the average was a little bit lower. The DFW rate was about 15 percent, but I'll qualify that by saying I have some incompletes. I was a little generous with incompletes because of the crises in the semester, and probably a couple of those will turn into DFWs as well. But if you compare that to the 55 percent DFW rate that you would expect for this group of students, that's not bad. So again, the students stepped up to the plate. And I suspect someone will ask: how are they doing in the next class? It turns out that's a really complicated question. So Chantal Levesque-Bristol at Purdue, who runs their IMPACT program, was telling me about a really interesting natural experiment they had at Purdue. They were reforming Calc 1 and Calc 2, and in the time that the reform was going into effect, some of the students took traditional Calc 1 and reformed Calc 2, some took reformed Calc 1 and traditional Calc 2, and some took both reformed. The students who did the best were the ones who had reformed Calc 1 and reformed Calc 2. The students who did the worst had reformed Calc 1 and traditional Calc 2. So, Martha, a question, maybe now or later, if you think it's useful: to what extent are the department faculty aware of these trends, and what strategies work to bring about awareness? Yeah, well, I can address that. What I believe is that showing these very data made a big difference. And I'll show you some new ways of looking at the data that Stefano has come up with that I haven't really introduced to the faculty yet; but now that the semester's over, we have time to breathe.
And what I would say is that in the chemistry department at IU, we have teaching faculty. These are non-tenure-track faculty whose primary job is teaching, and they are great. They're scholarly, they're smart, they write textbooks, they participate in SEISMIC. And so that group of people was very easy to convince. I would say the more traditional folks were a little bit harder to convince. And my hope is, I think in this kind of work, we start with a coalition of the willing. We figure out how to make it work with the people who care enough that they won't give up after the first setback. And then, once we figure out how to do it, we do two things. We provide infrastructure to make it easier for the faculty who are not yet doing things this way; I think if you make it easy enough, they'll do it. And we work to convince the folks who say it can't happen. Yeah, go ahead. In regard to what you were saying about the experiment at Purdue: eventually, if students persist in chemistry, they're going to be meeting teachers who are not on the same page, right? Yeah. So I think a couple of things about this. One is, working with this class, empirically what I found is that there were a lot of really smart students in this group who had really struggled at first. I had people make an A-plus, actually quite a few people making A-pluses, and some of those people made D's in the previous class. So what I've come to believe, what I've believed for a long time, is that performance in chemistry is a lot about confidence, right? And if you're from a group that's been told your group is just not good at science, that's one strike and you're out, basically. People just think, okay, I can't do this, and then you don't do it. You were mentioning resources.
So what kinds of resources are available to students, other than, you know, the change in your course? Yes. So we have a couple of things that we've done for students in general chemistry. We have this course; we also have a supplemental instruction course taught by a really, really good professor. So we can provide extra support within our classes. We have about 1,200 students who take gen chem a year, and we can provide extra support for a couple hundred of them. So that's actually not bad. I'd love to be able to do more, but that more or less meets the demand; to be honest, it's hard to get students to understand the importance of doing some of these things. So that's one part. Departmental tutoring, we have all of those other things. And we have a faculty member, Jill Robinson, who is a national expert in active learning and goes off and talks to analytical chemistry professors from all over the country about this, and has had grants. So we provide support to our faculty through our teaching faculty. But we still have horrible student-faculty ratios. Our teaching assistants: generally it's one teaching assistant to 200 students, sometimes 100, but not much less than that. And so we use undergraduate teaching interns, unpaid undergraduate teaching interns, frankly. And that actually helps. People are willing to do this because it helps them study for the MCATs, so they come back and work for free for us. And they care about teaching other students; they understand how hard the curriculum is, and they want to make it easier for other people. They're really terrific, actually. We get about 150 students doing that a semester. I'd say 100 of them are terrific, really terrific; 50, maybe, are a little bit in over their heads or overextended. So those are the kinds of things that we try to do.
What I really think is that community is essential. And I think we have to find ways to build community even in large courses, because large courses are not going away. There's an example from a biology professor at the University of Washington. He has a class of 1,200 students, and he has them sit with their TA, their teaching assistant, in the classroom. They sit by section, so at every lecture there's somebody there, their person, to help them understand something. And then they go to the discussion section and do the same. Now, for us that wouldn't work, because our TAs are doing four sections and may have a class at the same time as the class, et cetera, et cetera. But if you can make that work, it's great. I actually think probably the most cost-effective way to solve these problems is to start paying undergraduates to help us. So, does anyone want to raise anything else or ask another question before I change gears a bit? Okay. One of the other things I meant to say: we actually did a focus group of students who took my class last spring. And what we found is that they worked hard because they felt somebody cared about them, is what they said. And I don't think it was just me; I think it was the whole teaching team. And I should say that this course was designed and taught in conjunction with undergraduate students, and it really made a huge difference. There's no way I would have been able to do something like this without that. But then they got to the big class, and they had great teachers, actually the best teachers we could have given them, but they just felt lost again in the large class and didn't work as hard. So one of the things I'm trying to do is figure out how to encourage students who have built teams in my class to continue to use those teams in the next class, so they don't feel so alone.
But I think, too, that as you gain more confidence... I mean, the hard part is, we have smaller classes for those upper-level courses, and you don't have to be as good a teacher when you've got small classes. Students are less intimidated about asking you things, and you can realize, hey, I didn't explain that very well, and think of a way to re-explain it. But in a class of 300, nobody wants to raise their hand, right? So the big classes really are an issue. It's tough, and unfortunately there are just more people who need to take gen chem than who need to take physical chemistry, so we have to do it this way. But we're putting students in huge classes at a time when they haven't yet developed their study skills for college. It's a problem, and I don't know how to solve that problem. Okay. I just want to point out that my colleagues have found similar things about repeated assessments, giving students opportunities to try again. I think it's super important pedagogically to teach people to learn from failure. I've tried that for years. I've always told my students that if they expect to get everything right the first time they try it, they're never going to do anything important, and there's way too much talent in this room for you never to do anything important. But before 2020, in 25 years of teaching, I had never incentivized that directly with grading, and it makes a huge difference. So my colleague Laura Brown, who spoke at this symposium last year, also taught in the fall of 2020. Basically, when the pandemic hit, she was able to teach an asynchronous large course, and part of her strategy was to allow students to retake the automatically graded portion of her exams, which was only about 30 percent of the exams. And she had them do some peer-assessed writing and reading assignments and such, too. And what she found is that the DFW rate decreased from 24 percent to 12 percent.
But if you look at how students did on the part that was just the same as it was every other year, they did better. So the grading she used had more low-stakes assessments, but the improvement didn't come just from that: they did better on the high-stakes assessments than they had before, too. So these things were really helping student learning. Angela Robinson and Meghan Porter, who taught the 600-person general chemistry class this fall, decided, okay, Martha, we're going to allow our students to retake the exams on the basis of your data. And they found that the average exam grade improved by a full letter grade. The DFW rate was still pretty high; I think part of that was the pandemic, and it's really hard to compare these years because of the pandemic. But the gap between highly privileged and less privileged students shrank considerably. Okay, so the last part. I'm finally getting to what George asked me to talk about. This is some new work that Stefano Fiorini has done using process mapping to look at our curriculum as a whole. So instead of just eyeballing it and making some guesses about what's wrong with it, like I've been doing, he's got some interesting ways of looking at the data. I was really interested in: do students actually follow the curriculum map that we set up for them? So, there was increasing pressure a few years ago from the state to improve our four-year graduation rates. That's good; there should be increasing pressure for that, because the incentives are set up all wrong in higher ed, right? If people take six years to graduate, you're getting more tuition from them than if they take four years to graduate. So you do need state pressure for that, and that's good. And we implemented general education requirements in 2012, and that added an extra layer of requirements; it made it harder for students to navigate the college path. And so the state said you have to create degree maps for every major.
And so we developed something called the interactive Graduation Planning System, which I'm not allowed to look at because I'm not a student. But there is a map for every degree. And so the questions are: to what extent do students follow this plan, and if they do follow this plan, is it a good idea? The second question I can't answer; the first question I can show you a bit more about. Okay. So here I'm a little bit out of my comfort zone. Stefano is an anthropologist, and he's interested in actor-network theory, which basically says that everything has to do with social networks. And part of this network, obviously, is faculty and administrators telling students what to do. But there are other parts of the network, and it makes sense to consider education as a transformative process. So what Stefano decided to do was to use business process analytics, and you can see the reference here. The rest of that I don't understand at all. I can just show you the results, and if you have questions, you can talk to Stefano or Linda about this. But I do want to point out that Stefano is certainly not the first person to think about these ideas, and I wanted to make sure I had a slide that showed other folks who've been working on similar issues. If you are in the audience and I should have cited you and I didn't, put that down to my ignorance rather than Stefano's scholarship, and let me know and we'll add you to the slide. All right. So Stefano's not stupid, and he says, all right, I bet the Associate Vice Provost, being from chemistry, will get interested if we analyze chemistry first, and we can build some momentum here. And that worked very well. So this is a simple process map that he showed me. In chemistry, people come in, and if they're well prepared, they go straight to general chemistry. If they're less well prepared or less confident, we let them choose:
they go to pre-general-chemistry. And then after general chemistry, the normal pre-med sequence, biology major sequence, et cetera: you take Organic Chemistry 1, Organic Chemistry 2, and so forth. Okay, so this map covers what 60 percent of our students do, and it has people who stop taking chemistry after general chemistry. I think it picks up anything that over 5 percent of the students do. And it picks up some people who leave, though not those who leave after taking pre-gen-chem; as I'll show you in the next slide, people leave the university after each of these steps. But it looks pretty streamlined. It looks like, yeah, our students are doing more or less what we tell them. But that's only 60 percent of the students. What if we go to 100 percent coverage? We get a diagram that looks like this. And I'm sure you can all look at that in five seconds and understand everything you need to know about the chemistry curriculum. Now, obviously, this is complicated as hell, but let me just point out a couple of things. One is, we're now including the flows where students retake a course. So 10 percent of the students who start in our chemistry curriculum retake general chemistry, 12 percent retake Organic Chemistry 1, and 5 percent retake Organic Chemistry 2, whether they started in pre-gen-chem or gen chem. And the width of the arrow tells you what percentage of the students take each path. But you can also see things like what percentage of students leave without a degree, and it's 15 percent, which is a lot, but it's not far off our university average, actually. A lot of people leave with a degree, and some people leave without a degree: some leave without a degree at pre-gen-chem, some at gen chem, and some at Organic Chemistry 1 and Organic Chemistry 2.
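The process maps described here come from treating each student's course history as an event sequence and counting course-to-course transitions, which is the standard "directly-follows" construction in business process analytics. A minimal sketch under that assumption, with hypothetical course names and a "LEFT" marker for leaving without a degree:

```python
from collections import Counter

# Hypothetical enrollment sequences, one per student, in the order taken.
# "LEFT" marks leaving the university without a degree.
paths = [
    ["PreGenChem", "GenChem", "Orgo1", "Orgo2"],
    ["GenChem", "Orgo1", "Orgo1", "Orgo2"],   # includes an Orgo1 retake
    ["PreGenChem", "GenChem", "LEFT"],
]

def transition_counts(paths):
    """Count each directly-follows pair of courses. In a process map,
    arrow widths are proportional to these counts; a self-loop such as
    (Orgo1, Orgo1) is a retake."""
    edges = Counter()
    for path in paths:
        for a, b in zip(path, path[1:]):
            edges[(a, b)] += 1
    return edges

edges = transition_counts(paths)
```

Filtering this edge set to transitions above a frequency threshold (say, 5 percent of students) yields the streamlined 60-percent-coverage map; keeping everything yields the full, complicated one.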
But if I look at these data as an administrator, I say, okay, I've got ten courses to cover, I have six really good professors, and I have to decide where to put them, right? What do I do? Which courses do I prioritize? Well, you might say we should prioritize Organic Chemistry 1 because it's our highest DFW rate class. But far fewer people leave the university after Organic Chemistry 1; they're more likely to retake it and stay the course, right? So we chose to prioritize gen chem, and I think you'll see that we really should prioritize pre-gen-chem as well, and decided gen chem would help us work backwards toward pre-gen-chem and forwards toward Organic Chemistry 1 and 2. But that's the kind of decision that can be informed by this map, and that wouldn't be informed by the other data I showed you earlier. Okay, you can also do this. I find these percentages a little bit confusing, so I said, Stefano, I'm an old lady; give me some numbers. So you can also do a map like this. This, I think, is just for one year rather than the five years of starting cohorts from 2011 to 2016 that he used, and he looked at graduation rates, including four-year graduation rates. So this is the kind of map you can use to actually see how many students do each of these things. You data people love these complicated graphs; I like Excel. So I'm going to show you some boring Excel diagrams that I hope can illustrate how even somebody who's not good at data can figure out a few things from these kinds of process maps. So, I talked before about the systemic advantage index. It has a big impact on student pathways. If you look at our students who have all the advantages, half of them go straight into gen chem and half start in our pre-gen-chem course.
Among students who are low-income, first-generation college students, or minoritized, almost three-fourths start in pre-gen-chem and only about a quarter start in gen chem. So that's already a huge difference, right? And it's also the case that low-SAI students have a lower continuation rate than high-SAI students. So if we ask what percentage of the students who start in pre-gen-chem go on to gen chem, it's 72 percent for people with all the advantages and 63 percent for those without. And it continues; there's a small gap the whole way through. But that small gap adds up to a big difference in the percentage of people in each category who actually complete the sequence all the way through Organic Chemistry 2. So if you're an affluent white male and your parents went to college, 43 percent of those folks get through. Only 24 percent of people who have none of those advantages, or only one of those systemic advantages, complete the sequence. Okay? When you look at who leaves college completely, again, a big gap: many more students with low systemic advantage leave college than those with high systemic advantage. And even if you look at who has to retake a course, 55 percent. Now, some of these are the same student counted twice, because they're retaking more than one course, but the idea is the same: almost twice as many low-SAI students retake a course as high-SAI students. These data I just worked up in the last few days, and I haven't shown my colleagues, but you can bet that I will show my colleagues. Okay, so again, I asked Stefano: I've been concerned for a long time about whether we offer the same opportunities to students who don't come in with great chemistry backgrounds but might be really talented. Can you get a chemistry degree if you start in pre-gen-chem? So Stefano analyzed that.
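The point about a "small gap the whole way through" compounding is just repeated multiplication of per-step continuation rates. A sketch of that arithmetic: only the first pair of rates (72 percent versus 63 percent) comes from the talk; the later steps are invented purely to illustrate the compounding effect, not the actual IU numbers.

```python
# Per-step continuation rates through a four-course sequence.
# The first pair (0.72 vs 0.63) is from the talk; the remaining
# numbers are illustrative only.
high_sai = [0.72, 0.85, 0.80, 0.88]
low_sai  = [0.63, 0.80, 0.75, 0.84]

def completion(rates):
    """Fraction completing the whole sequence: the product of the
    per-step continuation rates."""
    out = 1.0
    for r in rates:
        out *= r
    return out

gap = completion(high_sai) - completion(low_sai)
# Gaps of 4-9 points at each individual step compound into a much
# larger end-to-end difference in who finishes the sequence.
```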
He analyzed the difference between students who start with pre-gen-chem and students who go straight to gen chem. And if you start with pre-gen-chem, you have only a 21 percent chance of getting through the curriculum. Of course, some of this reflects course requirements from other fields; we don't have separate courses for biology majors and chemistry majors and so on. So if you're an exercise science major, you leave after gen chem. Not all of this is a defeat. But boy, that's a big difference, more than a two-fold difference. If you start with pre-gen-chem, you're only half as likely to get all the way through Organic Chemistry 2. So we lose 18 percent after gen chem; these are people who get a degree but never take another chemistry course. After gen chem, we lose 36 percent of the people who started in pre-gen-chem and only 23 percent of those who started in gen chem. At Organic Chemistry 1 it's a bit more even, and at Organic Chemistry 2 it's a bit more even; if you can get through Organic Chemistry 1, you have a reasonable chance of getting through Organic Chemistry 2. But 19 percent of those who start with pre-gen-chem leave the university, and only 15 percent of those who start in gen chem do. So, we've got to fix this. To me, it's a moral imperative. It tells me nothing about how talented a student is that they didn't have great high-school chemistry. Nothing. Martha, when you were talking about the first set of slides, there are students who are not intending to be chemistry majors, right? Does that pre-gen-chem class count toward credit, or do they have to go on to the next course to get the credit? So they take two courses to get the credit, and that's permanent? Yes. So they always had to take two courses; we just changed what the courses were. So the deal is, pre-gen-chem actually can satisfy
the natural sciences and math requirement, though we try to get students to satisfy that with another course, because it's not the best course for that. But for exercise science, you can take pre-gen-chem and gen chem, or you can take the traditional general, organic, and biochem two-semester sequence, to satisfy those requirements. So yes, it does count for college credit, but it does not count toward a chemistry major, that extra course. So those are both important points. Anything else? Okay. And I thought, well, maybe it's just that we have some crappy attrition rates, and that taking three courses makes you more likely to drop out than taking two. But that's not true. If I don't care about which course it is: whether you start with gen chem or pre-gen-chem, you're just about as likely to go on one more course. And that's in part because pre-gen-chem has a much lower DFW rate than gen chem. But when you get to two courses, there's a really big difference already. So it's not just that there's one more hump you have to get through. And I do think having one more hump to get through matters; we'd like to have courses that are not humps but actually ways of getting you to the next class, and we're working on it, but it's not something that's been widely solved in our field, I'll say. So it's not just that. And it's also the case that retake rates are significantly higher for students who begin in pre-gen-chem. So again, this looks pretty similar to the data I showed you about systemic advantage. I think that plays a role in it, but I don't think it plays the whole role. So that's chemistry, and that was a deep dive into chemistry. I just want to briefly show you what data from some other fields look like. And I won't tell you what these fields are, because I haven't asked their permission to show the data.
We started with one more STEM field at IU that, let's say, has really bad DFW rates. So this is the same kind of graph I showed you before, with the DFW rate for non-URM students in blue, for URM students in orange, and also the average grades in courses. And this is a department that looked as bad as chemistry in 2010 but looks a hell of a lot better right now. They've worked hard on getting this better, and I think, again, it's been a few faculty members teaching large classes who have really worked hard to make this work. And if we look at their curriculum map, it's just as crazy as the chemistry curriculum map, as you might expect. And it turns out that 15 percent of students who start the chemistry curriculum leave, and 16 percent who start this curriculum leave, even though their courses have much lower retake rates and much better DFW rates. Of course, the students who major in this field have to take chemistry, so we may play a role in that. But I found it actually kind of interesting that there wasn't a strong correlation, and I'll show you an even more dramatic graph about that later, between DFW rates in a field and the percentage of students who leave the university. So it's hard. We chose initially to look at fields that have a pretty strong core curriculum; I'll show you one field that doesn't have that. So there are two foreign languages I'm going to show you. One of them was on my radar screen because they have high DFW rates and pretty high inequities. And in fact, in this field, 20 percent of students leave college at some point or another. So that's pretty high, and yet the retake rates are minuscule compared to chemistry. Minuscule. Still, we're losing 20 percent of students who start with this foreign language.
And, you know, I don't know yet: is this just a different group of students than the ones we're thinking about for the other maps? We haven't done any of that yet; we've just scraped the very surface of this iceberg. This is a really rich way to answer questions, I think. Okay, so I looked at a lower-DFW-rate language, and there are some really interesting patterns. I'll show you this in another slide a bit more clearly, but in this language, people mostly start in the third course. They have two good semesters already knocked out from high school, they start in the third course, and still 14 percent leave. And that's about the university average for leaving without a degree in six years. So I was shocked at how different the pattern is in these two languages, right? They're both languages that were offered in my high school; it's not like one of them is some language that only IU teaches. They're both pretty common languages. And in language A, the high-DFW-rate one, half the people start in the first course, and in the other one, 67 percent start in the third course. That's pretty interesting. What it means, I don't know, but it's pretty interesting. Okay, so now let's take something that has a less clear trajectory. So this is a social science, and you have to take four of these courses, social science A, B, C, and D, to major in this area. And of course, most of the people taking these courses are not majoring in this area. And you can take them in any order. And so, you know, I've been curious, for fields that don't have very strong curricula: are there different orders in which students proceed that are better than others, that lead to more success? I don't think we know the answer to that question yet. What I learned here is, if you're not a major in this area, you take course social science A. I was wondering why so many take this course rather than one of the others.
And I found out it actually meets the natural science and math requirement. So I think that's why we have so many people in this course; the other ones are social science courses. But again, we get about 13 percent of people leaving at the end, leaving without a degree. I'm sorry. So, this blew my mind: if we compare social science, chemistry, language A, and language B, you can see the leaving rates are all around 14 or 15 percent, except for language A, which is higher, and I don't know why. But look at the percentage of students who have to retake classes. I mean, it's astronomical in chemistry. And there's almost no correlation between the retake rate and the percentage of students who leave. We know the DFWs matter, but they matter in different ways in different fields, and I certainly haven't explored that; there may be folks out there who have, if you are interested in that. But I thought this was really amazing. So, like I said, we have looked at the chemistry data a fair amount. With the other data, we're just barely getting started, and the next step is to invite faculty and directors of undergraduate studies from those departments to look at the data and help us understand what's going on, and to ask the Vice Provost for Undergraduate Education how we can provide more support for these departments to serve their students better. Okay. One last short vignette I want to share is a story that Stefano, and these are Stefano's own slides, came up with. He was interested in, or maybe someone higher up in the administration asked him: how does changing your major impact degree completion? Is it a good thing or a bad thing to change your major? So he used this kind of enrollment data, and he says, okay, if we look at all of our students, about 42 percent of them change majors once. About a third of them don't change their majors at all, and the rest change majors two or more times.
And, you know, many of you may be parents of college students who wish they would quit changing their majors, and maybe changing them seven times makes a difference. But what Stefano found is that changing one time actually helped. So most students experience one or more major changes. And of those who are not enrolled or don't complete a degree in six years, more than half come from the no-change group. And almost half of those who do get a degree come from that one-change group. There's a two-or-more-change group as well. And then Stefano and I looked a little bit more carefully. Students can come to the university saying, I don't want to commit to a particular field yet, but here's what I think my major is going to be — and they're in something called University Division. They're the undecided students, basically. If you're admitted directly to the School of Business, you're a business student; if you're admitted directly to a department in the College, you're in the College. We call these direct-admit students. And it turns out that students in University Division experienced the greatest rate of changing majors. That makes sense. For the students who were directly admitted to the business school or the College of Arts and Sciences, changing your major is actually associated with lower attrition. And the highest attrition is found in the students in the business school who don't change their majors. Okay? And so to me, this is actually a good sign, right? We want students to learn things when they come to college. When they come to college, they don't know about most of these majors, and maybe they'll find something they like better than their initial plan — or maybe they're just majoring in business to make the parents happy, who knows. But, you know, it's not a bad thing for your kid to change their major.
So what Stefano proposes, and I agree with him, is that we should think of changing majors as an integral part of student progression to degree. It's positively associated with retention, positively associated with graduating in four to six years. Okay. And maybe we should think of this as a reflection of what Stefano calls the transformative cultural processes that are stimulated by our environment. We want people to learn things at our university. If learning things means you find something that you love that you never knew about — great, go after it. And this can be an indication of a student who is actually engaged with and committed to their academic path, or taking responsibility for their own learning, which is what we should want for everyone. So I'll stop here and give you a couple of take-home messages that I hope I was able to communicate. Institutional data can be used successfully to build empathy among faculty and administrators. I guess most of what I showed you was from n equals 1 — me — but my experience is that it works on other people too. Robust usage of institutional data to actually change what's going on in our classrooms requires an ongoing conversation among data experts, faculty, and administration. In general — and this is a conclusion that's based not just on data, but also on my experience working with this population of students in my general chemistry courses — we don't give underserved students enough time to build their confidence and figure out whether they're good at what they're doing. In these big gateway courses, we just knock people out before they've had a chance to find out they're good at it. And I've been amazed at how many times I have to tell some of my students that they're good at it, especially students from underserved categories.
We are really losing some diamonds that we need to help solve problems in our country, and we've got to pay attention to this. Process mapping reveals tremendous complexity in the paths students actually take as they navigate the curriculum. And process mapping can reveal major bottlenecks. We can use it to improve advising: if students are doing weird things, like going from general chemistry to Organic 2, is that some adviser who doesn't know the deal, or students doing something unwise? We can figure that out. And I think we can also use these maps to help administrators at the department level or higher set priorities for investment of resources. We've really only begun to scratch the surface of what this tool can show us. So with that, I'd like to thank you all for listening.
Description of the video: So thank you all for taking the time to join us for this working session. And this is a working session, which means Linda and I will talk for a bit, but we've got a couple of cases that we're going to have you work through and talk about data. And we have a couple of Padlets that go with those cases. We'd like to capture your thoughts about the kinds of data — both learning analytics data and financial data — that you would bring to bear within these cases. The whole aim of our session is to help you think more deeply about combining both learning analytics metrics and financial considerations as you're trying to make a case for transformation, or trying to measure the impact of your transformation projects. So who are we, and what gives us the right to talk to you today? I am a Professor of Higher Education Leadership and currently Associate Dean of my College of Education and Human Development at Western Michigan University. I'll only be associate dean for about another month and a half, and then I'll be going back to the faculty. But relevant to this conversation is that I am the PI and one of the hub leaders of the Accelerating Systemic Change Network, which Charles Henderson and I and Linda Slakey and several colleagues developed to bring change agents and change researchers together to talk about what we need to be looking into in order to accelerate change within higher education. So that's me. Linda? Yes — so I'm really pleased to be here with Andrea. She has been a great colleague already; I've only known her for a short time, and I'm thankful she's here, for her leadership. I have been involved in creating information for the campus — the Bloomington campus specifically — for many, many years. And we typically support units at the school or the department level.
And more recently, as I've been working with George, we've been working a lot more with faculty and learning centers, and trying to think through what information faculty need to participate in these conversations and to improve their classes. So that's my interest here, and I look forward to learning from all of you as well. Thank you. So we want to give a shout-out to Linda Slakey, who actually was the bridge between the Accelerating Systemic Change Network and the Bay View Alliance, which is a group of institutions that have committed to working together to identify and find funding for critical projects around transforming undergraduate STEM education. And Linda, did I get that right? All right. We wanted to shout out to her because she has been so active in all of these initiatives for so long and really has been the linchpin for so many conversations. She and I co-lead what we call Working Group 24 within our Accelerating Systemic Change Network. Originally these were two separate working groups, one looking at cost-benefit and one looking at demonstrating the impact of change. And after about eight or nine months of working independently, we all realized that really we were talking about the same issue from different perspectives, and that talking about them separately was not going to accelerate systemic change. So we combined forces between the working groups, because we realized that demonstrating impact is measuring benefit — and that if, in trying to demonstrate your impact, you're not taking costs into consideration, then you're probably not going to be as convincing as you would like to be for the folks you're trying to convince, and you're missing a big piece of what it means to try to build sustainable change. So, for those of you who might be interested in scrolling elsewhere as we're all talking:
The link to the resources that Working Group 24 has curated — around thinking about how to integrate cost and benefit analysis into change work — is on the screen. And I think I've boxed myself away from my Microsoft Word document that has the link easy to get to; if you go to ASCN, it's one or two clicks down. So what's challenging about measuring cost and benefit? One would think it seems pretty straightforward: you invest this amount of money, you get this kind of outcome. That's a business model, and it's business thinking. But higher education is a lot more complex and a lot less straightforward than business enterprises. And so applying business models of cost-benefit isn't going to take into account all of the interconnected costs that we incur when we try to change what we do within higher education, and it's not going to lead us to being able to identify how to measure benefits when they are also intertwined with the several initiatives likely going on on our campus at any one time. So, in the real world, costs and benefits can be really difficult to isolate, and they're not that amenable to parallel measurement. What do we mean by that? Well, as Director of the Center for Teaching and Learning — a faculty development center — I had costs for various programs, but I also had the costs for my staff and for myself. How do you parse those out when you are trying to demonstrate benefit in terms of instructional change? And how can I prove that the instructional change we may be measuring actually came from our programming, and not from something else that a faculty member or a department was engaged in? It's really hard to make a straight through-line. Another example is first-year experience. I had a big project that tried to alter first-year experiences and experiment with different models — with null results, after four very difficult, challenging years trying to put randomized controlled trials with undergraduate students together.
And we could not demonstrate that we were different from any other first-year experience. But we also had so many intervening factors, even though we were doing a randomized controlled trial, that it was really, really hard to say whether it was a failure, or whether, you know, it's trending nicely, so why don't we go in this direction as an institution? Most of the time, when we think about higher education, we're not businesses; we're not for profit. We're about maximizing the resources we do have for the best outcomes for the students. And so it's a different kind of thinking — but money is involved. So what's challenging about demonstrating impact? We all know that meaningful assessment processes — at the course level, at the student learning level, at the program level, and at the institution level — can be really hard to put into place. They meet with a lot of resistance. Folks aren't sure that what we're trying to measure is authentic to the goals and the values that they have within their academic programs. And creating metrics that really do get at what faculty value in student learning is — well, that's the reason we have a Learning Analytics Summit, right? Because it's hard. And demonstrating the full impact of a program takes time, and a lot of times we're not given the time to let things play out. So how do we maybe identify leading indicators instead of lagging indicators that we can track, to put together with the money that we're spending, and know whether we ought to continue pursuing a change initiative, or whether this is not getting us where we want to go? And finally, one of the challenges is your audience. Who are you trying to talk to? If you're trying to talk to your faculty colleagues, money's probably not going to be particularly compelling; the values and the learning and the outcomes and the number of majors are going to be compelling. If you're talking to your provost,
And you're not including finance — you're probably not hitting the things that wake them up at night, the things they have to really, really pay attention to. So that's the reason we wanted to have this session and talk about the kinds of financial considerations that would make sense with different levels of change. And I'm going to turn it over to Linda to talk a bit more about, like, what are we talking about when we're talking about money? And you just let me know when you need me to change the slides. Okay, great. So, what is at stake here? I think one of the things we're trying to discover is: what are the things the campus cares about? What are the things that would catch their attention? And of course, retention is one of those things. I think of retention as sort of a macro metric: it tells you what's happening at the campus. But retention is built on a variety of little building blocks of the student's experience that make them either successful or not successful. So what I'm going to try to do here is start with the whole concept of retention as one example of how we might approach this, and try to break it down into some maybe more meaningful and usable parts. So, Casey, your comment is spot on — I was thinking, what kinds of data are we talking about? Faculty records, instructional effort, budget data — all kinds of data could be relevant. So just because we're talking about student retention here doesn't mean that's the whole piece; it's just an example. What I've done here is take this example of retention down a few levels into some details, so any campus could figure out some metrics that would help us understand what the cost is of having a student stay on the campus versus leaving. And what this little diagram is showing is that if we retain one student for three years — typically they leave after the first year —
If we are able to keep that one student, the campus would generate $66,900 in additional tuition revenue. That has nothing to do with all the other costs associated with the student; this is just one high-level number. Then think about that in terms of percentage points. We have pretty solid retention rates, but if we were to increase them by 1 percent, that's maybe 70 or 80 students, and we're looking at about $2 million. So this is just trying to break down a little bit more what the costs could and should be. And let me also state that these metrics of retention are broadly used by our campuses as a macro metric in many, many instances. U.S. News and World Report uses them; that's how we are evaluated by our peers and by students who are coming to the campus, et cetera. Student Right-to-Know requires that we provide them. So it has multiple benefits, and we're all interested in this. You can go ahead and move that forward, Andrea. I, of course, lost the presentation — you all can still see it, correct? Yeah, I think we're seeing your notes there, the slide selector. Oh my, yes. Okay. I was trying to quietly get the Padlet stuff ready, and clearly that's not going to work. There, you've got it now — this is perfect. So what are the things that have an effect at the instructional level? One of the things we're talking about is improved teaching, and how that can lead to improved course performance for students; and then we can understand a little bit more about what the relationship to retention would be on those metrics. So this is just an example, where we have a 90 percent retention rate for the campus. For students who get a W in their first term — just one W — we see a four-percentage-point difference between the campus rate and those with the W grade. And what if they get a D or an F?
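The retention revenue figures above are simple multiplication, and they're easy to parameterize. As a hedged sketch — the per-year tuition and student counts below are illustrative assumptions chosen to roughly match the talk's numbers, not institutional figures:

```python
def retention_revenue(students_retained, annual_tuition, extra_years):
    """Additional tuition revenue from keeping students enrolled.

    All inputs are illustrative assumptions, not actual campus figures.
    """
    return students_retained * annual_tuition * extra_years

# One student kept for three more years, at an assumed $22,300/year
# tuition, yields the $66,900 figure mentioned in the talk.
print(retention_revenue(1, 22_300, 3))   # 66900

# A one-point retention gain (~80 students, one extra year here) is
# already on the order of $2 million.
print(retention_revenue(80, 22_300, 1))  # 1784000
```

The point of a sketch like this is that any campus can plug in its own tuition and cohort numbers to get a local estimate of what a retention improvement is worth.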
That's an 87 percent retention rate for those with a D or an F. Now, that's not a lot of students being saved, but it certainly is one of the levers that we have — maybe 10 to 20 students in each of those categories. But it does improve the overall picture; it adds to the puzzle of campus retention. You can go one more. Then we can think about this in terms of a longer time period. We see that the student who gets a W the first semester is influenced slightly, but take that out to a four-year graduation rate or a six-year graduation rate, and you see the differences increase a little bit more and more. So the student has disadvantaged themselves with a grade of W right from the onset, and that lingers — it carries through and has compounding effects. These are the types of things you could perhaps try to estimate in terms of the costs and savings of improving the student's performance in the class, which we hope these kinds of improved teaching would help them with. You can go ahead and move one more. Our campus is very interested in gaps, very interested in performance gaps. We want to mitigate gaps in course performance between the under-represented minorities and the traditional group. So what you see here: the gold line at the very top represents the percentage of under-represented minorities who received a D, F, or W in a course, and the blue line represents the non-under-represented minorities. You see that the under-represented minorities have a larger share of the DFWs in our populations, and trying to minimize that gap is what we're talking about. The bottom two lines are the average course grade; it's also prominent in the grades that count toward the GPA. The non-under-represented minorities have a higher GPA in those courses than the under-represented minorities,
where the gold line is following below. And we can do this by course; we can do this a variety of ways. We can expand this out into some costs, like we did with the very first slide when I started talking. These are things that are important to the campus, and we probably can map them out into real savings and real costs as we think through how we want to evaluate the effectiveness of our new methods. Okay, you can switch one more. And then I didn't want to forget about the qualitative data. This is just a word cloud of student evaluations — this happens to be a standard course, the types of things you would expect to see. We can go to the next slide. I didn't get it created in time for this because my connection wasn't working, but you can create these word clouds for the under-represented minorities and the non-under-represented minorities, and try to see what the experience looks like in the class and how we can improve it. So there's a variety of metrics that deal with different types of data that could also be quite helpful, I think, in mapping out your case. So I guess that leads us to the first case. Does anybody have any questions before we go into the case studies? I haven't been monitoring the chat — does anybody have any comments? So what we were hoping to do is get feedback from all of you. This is a dialogue in progress. Everybody will probably have to create their own scenario, their own arguments, their own justifications, and we just wanted to start generating ideas around it, to see if there were some things where all of us could benefit from each other's work and thoughts. And so we've created two different case studies. And I guess, Andrea, you ought to explain how we're going to do these breakouts. Thank you — Christy is waiting in the background to send you to breakout rooms, and Christy, it looks like three is going to be about right.
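The word clouds described above are just term frequencies rendered visually. The speakers don't say which tool they used, so as a minimal sketch, here is the frequency count underneath a word cloud, using only the standard library (the comments and stopword list are hypothetical); building one table per student group and comparing them side by side is the analysis suggested in the talk:

```python
import re
from collections import Counter

# A small illustrative stopword list; real analyses use longer ones.
STOPWORDS = {"the", "a", "an", "and", "was", "were", "is", "to", "of"}

def term_frequencies(comments):
    """Tokenize free-text evaluation comments and count terms,
    dropping stopwords. Any plotting library can render a word
    cloud directly from these counts."""
    counts = Counter()
    for text in comments:
        for tok in re.findall(r"[a-z']+", text.lower()):
            if tok not in STOPWORDS:
                counts[tok] += 1
    return counts

# Hypothetical evaluation comments for one student group:
comments = ["The lectures were clear", "Clear examples, great lectures"]
print(term_frequencies(comments).most_common(2))
```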
We want enough people to make a nice group. And I've created a couple of Padlets for you to throw ideas onto; we'll put those links into the chat. We'll also put this case in the chat to follow you into the breakout rooms. We created a couple of cases that should look relatively familiar to those of you who have been working on undergraduate education transformation, because we often bring learning analytics and metrics to bear within these situations, but not so often thinking about cost, financial investment, and possible financial gain, either for the institution or for the students. So it's not all about, you know, is the institution making money or breaking even — are we lowering costs for students so that their experience is enhanced? So we wanted you to think both about the kinds of learning outcomes and metrics that would be brought to bear in these cases, but also to be thinking about, okay, how would you add financial questions? And what kinds of financial questions would you want to ask about these two scenarios that will deepen your analysis and your planning around the change initiative? How long, realistically, should each breakout be open? Let's see — we're at half an hour in, and we have an hour; we could do 20 minutes per case. Thanks, Kristi. And do you want to be assigned to a room as well? Yeah, throw me into the mix, and George as well — he can serve as a third if I need to move the three of you around. I'm going to have it just automatically assign everyone, and then if we need to shift you, Linda or George, we can. All right. I think I've got it. Did you want to explain the case quickly? Okay. And I dropped that case study into the chat.
But basically, we have a junior instructor who has ideas about improving the introductory course, in terms of both trying to improve the content and make it more relevant to the students, and make it more authentic to the work of the field — and, by doing that, thinking that it would also create more diversity among those who are attracted to the major. The status quo in the department is that they're very pleased with the way students are performing in the upper-level classes, because they've been consistently doing well there, so the department thinks the lower-level classes prepare students well for the upper-level ones. They really don't want to bother; to increase the majors, they want to target the incoming freshmen and provide more information to the incoming students about what the major offers. So you have sort of two competing approaches to maybe a similar kind of problem, and then a few other nuances that deal with the personalities involved, I suppose. So basically, what we'd like you to do is think through the case study and try to imagine, you know, what are the problems, and what are the types of evidence that might be valuable in making a case for change and transforming the classrooms — and share that on the Padlet. So here's the Padlet link for case number one. There will be a different address for case number two; we'll kind of bring you back between them. And Linda will drop — actually, Linda, I've got the language in here. Okay. So just to clarify, Andrea: when we go into the breakout room, we click on the link? Yes. The Padlet will be what the Padlet is — it'll open your browser and send you to a web-based space. The chat is not liking our text; I wonder if we've exceeded the character limit. Let's see. All right, here's part number one, "Where there's a will, there's a way." Here we go.
So we'll be joining you all for this conversation, and Christy will pull us back into the main room after about 20 minutes so we can do a little bit of processing, and then we'll have a different-level case for you to think about. Thank you all for the Padlet contributions — I hope your conversation was as lively as ours was. Let me find our PowerPoint again. Just being mindful of time: when I introduce case number two, we'll have a similar kind of conversation. I think we'll shorten that timeframe, because you've already been warmed up and you're ready for the heavy-duty exercise. Your Peloton warm-up is done, and this is the meat. So case number two is at a little different level. I'm not going to bother to try to put it in presentation mode if you all can see this, and we will add it to the chat. This has to do much more with program redesign, looking at a group of departments who, as part of their regular review, are realizing they have a problem — a problem with the loss of majors who don't come into their programs with AP calculus or calculus taken in high school. Those students who have to take the sequence are far more likely to drop out or to leave their majors. And they decide that they need to do something collectively about this, because they all have similar enough majors, all requiring the same basic courses, and they need to address the loss. And actually, disaggregating the data gives them an even starker picture of what's going on. Faculty don't really agree on the solution. A small but vocal number are comfortable with it — to them, this is a quality indicator, and that loss should happen. There's another group that basically says, you know what, the math department's not doing us any favors; that sequence needs to go, it needs to die, and we need to integrate whatever we think is important for the students to know into our own early courses.
And then there's a third group advocating: no, the calculus sequence is good, it's solid — what we need is a bridge and some learning assistants who know both our intro courses and calculus, who can help the students integrate and be successful. And so that's what they're proposing. So we've got a status quo; we've got completely redesign our courses and get the math department out of our early sequence; and then we've got invest in support structures that will help students take both and be successful. And the questions that we have for you are really similar to the prior one. What kind of data could be collected to support the likelihood of success of each of the options? And what would success actually look like? If you've got an 18 percent loss, you're not going to get an 18 percent gain, right? You're not going to bridge it at 100 percent — so what would success look like? And what kind of data might help the departments understand the cost and benefit of either the redesign option or the learning assistant option? We're going to put the status quo to the side for now in this conversation. So that is our second case for you. And I am going to stop sharing, and we will add this text to the chat for you all to refer to. But here first is the Padlet, because we're going to do a second, different one for this particular conversation — sorry, talking and typing at the same time. And we're going to throw you back into the same groups. Like I said, you've been warmed up, and we're going to give you 15 minutes, because we really want to get the group back together and talk about what you see across these two cases — maybe data that is similar, what might need to be really different between the two, and just whatever else you see after this conversation. So, Kristi — and welcome back. So we have 15 minutes, and what Linda and I wanted to do is just sort of process with you all.
But what came of these cases for your groups? Casey and I were pretty busy with the Padlet on that second one. And I know we're representing two different groups, but across the course level and then the multiple-departments, program level — what do you see as data that's really similar, that could be collected and brought to bear? I think Casey's group had a terrific idea about the costs of retakes. Does one of them want to explain why? Well, if you look at the rates of withdrawal and failure, and then look at the number of retakes: how much do students pay additionally, that they wouldn't pay otherwise, by taking a class two or three times before they move through the curriculum? And what does that cost them in the longer term? Probably at least another semester if they're taking it twice, and maybe another year. Linda's data from IU is pretty compelling on that: one DFW drops your four-year graduation probability fairly dramatically, and the six-year graduation rates go up. But what that represents is a lot more time that students are taking to achieve their degree. Yeah — and what I suggested is that the math department shouldn't get that revenue if a student has to take it twice; it should go centrally, or to the dean, or whatever. You shouldn't be rewarded for failing students — do better, right? So we had exactly that conversation in our group. We were trying to talk about, okay, what incentive would a department have to address their DFWs, and the fact that there are departments that budget themselves on repeats — which, well, to take it further than that, you could say no, you get dinged for them. You know, there's a point at which it's like, whoa, are we then incentivizing social promotion? So where's the balance?
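The retake-cost idea raised in this exchange is rough arithmetic that any group could reproduce. As a sketch — every number below is an illustrative assumption, not a figure from the discussion:

```python
def retake_cost(credits, per_credit, extra_attempts, extra_terms, term_cost):
    """Rough additional cost to a student of repeating a course.

    Tuition charged again per credit for each extra attempt, plus the
    living/opportunity cost of each extra term the repeats add to
    time-to-degree. All inputs are illustrative assumptions.
    """
    return credits * per_credit * extra_attempts + extra_terms * term_cost

# A 4-credit course at an assumed $400/credit, retaken twice, pushing
# graduation back by one assumed $12,000 semester:
print(retake_cost(4, 400, 2, 1, 12_000))  # 15200
```

Even with conservative assumptions, the per-student number adds up quickly across the hundreds of students retaking a high-DFW gateway course, which is the group's point about who bears the cost.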
Well, where can your departments and programs retain the standards that they believe students need to be reaching, while getting rewarded for increasing their support, right? And that's an argument that many of those departments make: well, we can easily change that, we'll just give everybody higher grades. Yeah — which is a pretty silly response, I think, to the problems that we have. But I think what was important, which your group was talking about and which we didn't really touch on in quite the same way, is looking at the data not necessarily only from the financial health of the institution, but looking at the financial burden on the individual student. Certainly when it comes to needing another whole semester to graduate — I mean, there are multiple ways they're being burdened. One is that they have to remain in school, so they're no longer doing something else, making money instead of spending it, and they're having to pay for that tuition somehow. And, you know, most scholarship programs — almost all the programs that many of our students rely on — are only good for those first four years. After that, you're on your own, which for some students at our school would be a huge burden. And so no wonder they find themselves dropping out when they look at that. And estimating those costs, and the cost to the institution of having to recruit and replace all of the students who leave, drop out, or slow down, would certainly, I think, be compelling for deans and the provost. Is that compelling for the faculty? And, you know, how do you make financial data having to do with the students and the burden on them compelling to the folks who see themselves as
the arbiters of the quality of the education and the quality of the students that they pass in their classes? That's, I think, an interesting conundrum. It might be partially more amenable to qualitative data — stories of students who have left, where mathematics courses may have been a deciding factor. Faculty don't always see that kind of big picture as their responsibility, but if you can help them connect the dots back to the centrality of that course, or those courses, it might become so. And that really is where — we've talked about, you know, how accessible is the research literature on the effect of course redesign or, excuse me, a learning assistant program? Well, to me, it seems like it almost doesn't matter, because unless faculty are presented with data that come from their classes, from their program, benchmarked against other institutions that they care about comparing themselves to — then we can write all the journal articles in the world with the best randomized designs, and it's not going to be convincing. From the previous discussion — biology, for example, trying to avoid calculus. I was a biology major undergrad myself, and I remember what I was doing with the math in the traditional curriculum. The stream is prescribed, and I'm wondering how much this is going to continue — prescribing those common courses before you select your major. Because I don't remember where I ever wanted calculus. So I looked at the master's-level biostatistics courses. I was trying to apply for a PhD program in computational biology, which is much more artificial intelligence and related to statistics and data. So one thing that was raised is: okay, can we follow our graduates and what they are doing? Some data can come from the broader educational system and the job market.
We say, half as a joke, that the jobs we hold now may not exist ten years from now, and that the jobs for the generation that's coming haven't even been created yet. Still, the dilemma is there: alumni data may tell us where our graduates ended up and what they were doing, and that may tell us something. Another idea is selective content. I remember applying for a forensic computing master's program. My undergraduate degree was in biology and I had done some law, so when I moved into IT, I thought I would combine IT and law in a digital forensics program. They had a bridge program: they selected seven undergraduate courses that applicants were required to take and split them across two semesters. And they selectively incorporated only the material that applied to digital forensics, from the algorithms and mathematics sections, the networking, the computer architecture, only the parts applied to forensics. They were very selective in building it; it was a bridge, like a developmental program, but very, very targeted. I wonder whether such things may be helpful in the future, because they target specific skills rather than general understanding.

So here's the thing: the unbundling of majors and of pathways into graduate school has been a conversation that has gone up and down over the last decade or so. You don't hear so much about it right now, but the challenge for those approaches is scale, right? Scaling to the level of the average undergraduate experience. And your comment is really well taken that the focus on costs should be secondary. Costs for the institution, I think, should be secondary, but we should be thinking about the cost to the student of what it is we want to prepare them for. Is that what you meant? No, I gave that example in our group, but I was trying to get in and my mic was muted.
I didn't realize that somebody had unmuted me. I gave the example of when my own program in instructional systems technology made some changes thirty years ago. What stimulated those changes, since we were preparing master's and doctoral students to go out into both the business world and academia, was what those graduates told us they lacked in preparation, as well as what their employers told us they lacked. We found that out by talking to them; we did a needs assessment. Then we worked backwards from that and redesigned our master's program, starting around 1992 or so, and later the doctoral program. And we found some things we weren't expecting. For example, students needed to learn to work on teams and to work with each other, rather than competing with each other, which is what they were doing in most of our classes to earn grades. And they needed better communication skills, both speaking and writing. Those were the big items that came out of that needs assessment. So when we did our redesign of our core courses and other courses, we didn't add a course on communication or a course on team participation; we built those things into the existing courses. We let the goal, what we were trying to achieve, drive the changes, rather than just how much money or time it was costing. And actually, we created another kind of problem that we didn't anticipate. Our program had been very successful, top-ranked in the United States at that time, and we attracted too many students; it got so big we actually had to downsize. At the same time, we had created another problem: since we were preparing our graduates better, they went out to other institutions and became our competitors. Suddenly we found ourselves so successful that we had created competition outside the university.
We were then competing for those top students, recruiting them against our own graduates, who were recruiting them for their programs. That's my point. We were letting the goal, what we're trying to do, drive the decisions backwards, not whether you need Calculus I before something else. That's the logic I'm talking about. It seems to me that sometimes we're focusing on the wrong issues, and that's actually what I want to talk about in the breakout coming up: how the structure is driving our thinking, when maybe we need to change the structure so that we can get done what we're trying to do.

Just so people know, you're suggesting that you're going to host a roundtable on this conversation, and I want to join you in it. I'll admit I've been plugging this for years, trying to get people to come to it. Over to you, George.

Backward mapping, though, is I think really key, and as was alluded to earlier, we need to backward map what employers and our graduates are saying we're doing well and not doing well for them. And addressing those gaps is also a potential financial gain. Keeping in mind that there are also ways we can ask ourselves exactly how many students we can accommodate, and I think that's a question that sometimes doesn't really come up in these kinds of conversations. And yes, you get punished for your success; you mentioned a little bit about that as well. If you're a department that actually gets resources coming your way because you have more students, great. But if you're in a budget model within the institution where that doesn't necessarily flow automatically, you can find yourself in a really problematic space,
having more students than the resources that the powers that be allow you, more than you can really serve well. That's right. And there's another twist to that story. We were also one of the first programs in the country, anywhere, to do a complete online master's degree, and we were quite proud of that. Then we did a doctoral degree online, the first anywhere, I believe. One of the outcomes was that our online program was so successful, in fact it was top-rated in the U.S. News rankings back then, around 2000 to 2004, that we drew students away from our residential program. Competing with yourself. You got it. And the tuition for the online program was a lot less than the tuition we were charging residential students. So if I were advising students coming into our master's program, I would say: given our requirements and number of credit hours, the online program will cost you, I forget the number at the time, something like seven or eight thousand dollars in tuition, for out-of-state students I think it was. But if you do it residentially, it will cost you three times as much. My program actually did a little bit of that to itself as well.

We are at time and we have to close this down. But Linda, any last words of wisdom from you? These have been terrific conversations; I really appreciate everybody's input. I think we have some great ideas to move to the next level, and there are a lot of data points we could be collecting and should be paying attention to. Hearing these comments has been very helpful. Yes, thank you all for your engagement. We hope that this hour and a half was a good use of your time, and please get in touch with us directly if you have any questions.