2015 Southeast Educational Data Symposium

– ABSTRACTS –

SESSION 1: Academic Analytics & Student Success

Closing the Achievement Gap Using Predictive Analytics
Timothy M. Renick, Vice President for Enrollment Services and Student Success (Georgia State University)

Much national attention has deservedly been focused on the persistent gap in graduation rates between college students from different socio-economic backgrounds. Georgia State University has engaged in a multi-faceted effort to use analytics to increase student success, especially for at-risk students. Through the proactive use of data in the areas of advisement, financial aid, and course redesign, Georgia State has increased graduation rates by 22 percentage points and eliminated all achievement gaps based on race, ethnicity, and economics. It now confers 1,700 more degrees annually than five years ago and awards more bachelor’s degrees to African Americans than any non-profit university in the nation. The session will focus on Georgia State’s use of data and analytics to diagnose obstacles to student success and to design and implement a series of large-scale interventions, including an advising tracking system based on predictive analytics that produced 34,000 advisor-student interventions last year.
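
As a rough illustration of the kind of model behind such an advising system, the sketch below flags students whose predicted risk exceeds a threshold. The data fields, model choice, and cutoff are all hypothetical, not Georgia State’s actual implementation.

    # Illustrative sketch of a predictive advising flag (hypothetical fields).
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Assumed columns: prior-term GPA, credits attempted/earned, and a
    # binary label for whether the student later fell off track.
    students = pd.read_csv("student_history.csv")
    X = students[["prior_gpa", "credits_attempted", "credits_earned"]]
    y = students["off_track"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Flag students whose predicted risk exceeds a chosen threshold,
    # so advisors can reach out before the student is in trouble.
    students["risk"] = model.predict_proba(X)[:, 1]
    flagged = students[students["risk"] > 0.7]
    print(flagged[["student_id", "risk"]].head())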

You Can’t Always Get What You Want
Chaudron Gille, Associate VP for University Affairs and Academic Services (University of North Georgia)

Last year at the University of North Georgia we launched a project to implement a card-swipe tracking system to capture and analyze student use of academic support services. Our objectives were to create a reporting system that cross-references the data collected with demographic and course enrollment data in Banner, and then to use this data to promote student success by sharing it widely and intervening with at-risk students. For this project we chose to focus on Supplemental Instruction, the math lab, and the Academic Success Plan used by the Advising Center with students Not in Good Standing. We believed the project would impact success and completion in three ways: through the timely use of data to increase the effectiveness of interventions leading to success in courses; through the creation of an environment that encourages student-faculty-staff collaboration and engagement in student success; and through a holistic approach to supporting students via Academic Success Plans, with an emphasis on personal accountability. Although the project did not go smoothly, meaningful data were obtained and are being used to promote student success.
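
A minimal sketch of the cross-referencing step, assuming hypothetical file layouts and column names (the actual integration runs against Banner extracts):

    # Sketch of joining card-swipe visits with an enrollment extract;
    # file layouts and column names are invented for illustration.
    import pandas as pd

    swipes = pd.read_csv("swipes.csv", parse_dates=["swipe_time"])  # one row per visit
    enroll = pd.read_csv("banner_extract.csv")                      # one row per student-course

    # Count visits per student per service (e.g., math lab, SI session).
    visits = (swipes.groupby(["student_id", "service"])
                    .size()
                    .reset_index(name="visit_count"))

    # Attach demographics and course outcomes so service usage can be
    # compared against results for at-risk subgroups.
    report = visits.merge(enroll, on="student_id", how="left")
    print(report.groupby("service")["visit_count"].describe())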

You Can’t Play Baseball if You Don’t Know the Rules: Making a Case for Learning Analytics at Emory University
Timothy D. Harfield, Scholar in Residence, Learning Analytics (Emory University)

At Emory University, high retention rates and high levels of student performance have opened up opportunities for rethinking student success and for embedding analytics within learning environments in support of teaching, learning, and instructional design. Before throwing data and tools at practitioners, it is important that those practitioners have a strong grasp of how, why, and in what contexts analytics may be employed to best support desirable outcomes. This brief presentation will provide an overview of Emory’s short learning analytics journey, including early successes and failures, and ongoing efforts to educate stakeholders to make use of educational data in ways that are both meaningful and actionable.

SESSION 2: Integrated Tools & Data Visualization

Making Sense of Big Data: Analysis and Visualization of Data from 640,000 students from the Los Angeles Unified School District
Ben Sayeski, Managing Partner (Education Strategy Consulting)

ESC has been providing value-added analyses and data visualization for the Los Angeles Unified School District since 2009. The work is grounded in the district’s initiative to build a comprehensive system for measuring student progress across time, called Academic Growth Over Time (AGT). Like many value-added initiatives across the country, AGT seeks to illuminate the areas within the district that are beating expected performance.

A technical advisory committee of national experts on statistical modeling and education policy informed the value-added methodology used in LAUSD. ESC uses this methodology to run the value-added analyses on all charter schools in the district. Within that process, ESC handles data collection and quality (documentation, layout, consistency, longitudinal tracking, and coverage), adherence to business and suppression rules, modeling, and data visualization using a web-based tool, available at the following link: www.escmatrix.com/lausd/
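
For readers unfamiliar with value-added modeling, the following simplified sketch computes a school-level “growth over expectation” signal. The real AGT models are far richer, and the columns and model here are purely illustrative.

    # Simplified sketch of a value-added ("growth over expectation") estimate.
    import pandas as pd
    import statsmodels.formula.api as smf

    scores = pd.read_csv("student_scores.csv")  # prior_score, current_score, school_id

    # Expected performance given prior achievement.
    fit = smf.ols("current_score ~ prior_score", data=scores).fit()
    scores["residual"] = scores["current_score"] - fit.predict(scores)

    # A school's value-added signal: the mean residual of its students.
    print(scores.groupby("school_id")["residual"].mean().sort_values())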

Using Analytics in ALEKS to Enhance Instructional Design in General Chemistry
Tracy McGill, Senior Lecturer (Emory University)

ALEKS (Assessment and Learning in Knowledge Spaces) has been used as a pre-course assessment and homework tool in general chemistry since 2012. ALEKS uses adaptive questions to accurately assess what a student knows and what they are ready to learn in a course. The resulting analytics are exceptional advising tools and study guides for students and have led to great gains in the instructional design of the general chemistry course. I will present several ways that the powerful analytics ALEKS generates have been used to flip the class in order to both promote concept mastery and facilitate students’ ability to confidently work through complex problems.
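
The “ready to learn” idea comes from Knowledge Space Theory, the framework underlying ALEKS. In a toy sketch with an invented knowledge structure (not ALEKS’s actual content), the items a learner is ready for are those whose addition to the current state yields another feasible state:

    # Toy illustration of "ready to learn" in Knowledge Space Theory;
    # the items and structure below are invented for illustration.
    structure = [
        frozenset(),
        frozenset({"fractions"}),
        frozenset({"fractions", "ratios"}),
        frozenset({"fractions", "decimals"}),
        frozenset({"fractions", "ratios", "decimals"}),
    ]

    def outer_fringe(state, structure):
        """Items the learner is ready to learn: adding any one of them
        to the current state yields another feasible knowledge state."""
        all_items = set().union(*structure)
        return {q for q in all_items - state
                if frozenset(state | {q}) in structure}

    print(outer_fringe(frozenset({"fractions"}), structure))
    # -> {'ratios', 'decimals'}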

Saving Students Through Activity Analytics and Effective Intervention
David Lindrum, Founder & Instructional Designer (Soomo Learning)

For learning analytics to make any difference in outcomes, we have to know what to count and when, what it means, and what we need to do about it. In this brief case study we will explore weekly activity data from courses to better understand just how early patterns of failure manifest, and what can be done to reverse the trend before it’s too late.
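
One simple way to operationalize “what to count and when” is to compare each student’s early activity against the class norm. The data layout and cutoff below are illustrative, not Soomo’s actual rules.

    # Sketch of spotting early disengagement from weekly activity counts.
    import pandas as pd

    activity = pd.read_csv("weekly_activity.csv")  # student_id, week, actions

    # Compare each student's first two weeks to the class median.
    early = activity[activity["week"] <= 2]
    per_student = early.groupby("student_id")["actions"].sum()
    cutoff = per_student.median() * 0.5

    at_risk = per_student[per_student < cutoff]
    print(f"{len(at_risk)} students below half the median by week 2")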

SESSION 3: Learning at Scale

Exploring Student-Generated Text at Scale
Kevyn Collins-Thompson, Associate Professor, Information / Computer Science & Engineering (University of Michigan)

With collaborators, I’m applying text analytics to large datasets of student-generated content that have never been explored at scale. One source we’re currently analyzing is a dataset of millions of words of free-form student comments from years of online course evaluations. By connecting language-based insights from text analytics with existing numeric data on admissions, course properties, learning outcomes, and other variables, we hope to gain new insights into factors connected with effective teaching, learning, and student motivation and progress that we believe will be of research and institutional value.
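
A minimal sketch of one such language-based analysis, connecting free-form comments to numeric ratings; the column names and binarization are hypothetical:

    # Sketch: terms in comments most associated with high ratings.
    import numpy as np
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    evals = pd.read_csv("course_evals.csv")           # comment, overall_rating
    evals["positive"] = evals["overall_rating"] >= 4  # crude binarization

    vec = TfidfVectorizer(min_df=5, stop_words="english")
    X = vec.fit_transform(evals["comment"].fillna(""))
    clf = LogisticRegression(max_iter=1000).fit(X, evals["positive"])

    # Terms with the largest positive coefficients.
    terms = np.array(vec.get_feature_names_out())
    top = terms[np.argsort(clf.coef_[0])[-10:]]
    print("Terms linked to positive evaluations:", top)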

Digging for Meaning in Mountains of MOOC Analytics
Anissa Lokey-Vega, Assistant Professor, Instructional Technology (Kennesaw State University)
Jordan P. Cameron, Instructional Designer, Bagwell College of Education (Kennesaw State University)

Massive open online course platforms like Coursera give course designers and instructors a wealth of data, output in both raw and pre-computed forms; however, how to interpret these outputs and use them for instructional improvement is not always clear. The panelists will discuss how Coursera session analytics have been and are being used to improve learning.

A Data-Driven Exploration of the Backgrounds and Behavior of MITx MOOC Participants
Daniel Seaton, Educational Technologist (Davidson College)

Massive Open Online Courses have led to large and complex datasets describing the behavior of participants from around the world. Data from over 30 MITx courses, in which over 1 million participants generated more than 1 TB of clickstream data through the summer of 2014, will be used to support this narrative. As MOOCs continue to evolve, two themes are critical: 1) continued analysis of clickstream data to inform course providers about behavior and learning, and 2) grounding that analysis in a more rigorous understanding of participants and their enrollment motivations. Teacher enrollment in MITx MOOCs provides an example of why understanding participants is so important to analytics overall: nearly 1 in 10 survey respondents identify as current teachers, and activity measures indicate they are highly active in discussion forums. Further discussion will focus on a startup project at Davidson College (Davidson Next), whose goals are aimed specifically at empowering AP high school teachers to use edX in their classrooms. (Collaborators: Cody Coleman, Isaac Chuang, Julie Goff, Aaron Houck, Patrick Sellers)
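
Clickstream data at this scale is typically processed as streaming JSON lines rather than loaded whole. A minimal sketch, assuming the common edX tracking-log field names (verify against the actual export):

    # Sketch of streaming aggregation over edX-style clickstream logs.
    import json
    from collections import Counter

    events_per_user = Counter()
    with open("tracking.log") as fh:
        for line in fh:
            try:
                event = json.loads(line)
            except json.JSONDecodeError:
                continue                  # skip malformed lines
            user = event.get("username") or "anonymous"
            events_per_user[user] += 1

    print(events_per_user.most_common(10))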

SESSION 4: Learning Analytics & Instructional Design

Modeling Strategy as a Measure of Expertise
April Galyardt, Assistant Professor, Department of Educational Psychology (University of Georgia)

Strategy choice is one important dimension that distinguishes expert performance from that of a novice. Even when a novice is able to successfully complete a task, their strategy is often much less efficient than an expert’s. Analysis at the strategy level is difficult, if not impossible, when only correct/incorrect response data are collected. However, as fine-grained data collection (e.g., from an adaptive computer tutor) has become more common, strategy modeling is becoming more feasible. I discuss the psychological literature on strategies and the features of student performance that can be used to distinguish strategies in different contexts. I then compare existing psychometric models in their ability to analyze these necessary dimensions of student performance, and illustrate the potential of machine learning models to describe learner strategy usage.
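
A toy sketch of the machine-learning angle: clustering simple features of problem attempts to separate candidate strategies. The features and data are invented for illustration, not drawn from the talk.

    # Sketch: cluster attempt-level features to surface distinct strategies.
    import numpy as np
    from sklearn.cluster import KMeans

    # Each row: features of one problem attempt, e.g. number of steps,
    # hint requests, and normalized time per step.
    attempts = np.array([
        [4, 0, 1.0],   # short, confident attempts ...
        [5, 0, 1.2],
        [12, 3, 3.5],  # ... versus long, hint-heavy ones
        [11, 2, 3.1],
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(attempts)
    print(labels)  # cluster ids stand in for candidate strategies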

Using High-Stakes Exam Results to Estimate and Compare Undergraduate Students’ Chemistry Competency across Different Classes
Shannon Sung, Assistant Professor, Education Studies/Interdisciplinary STEM (Spelman College)

College teachers are often uncertain about the effectiveness of innovative instructional methods implemented across distinct groups of classes. This study records how a high-stakes chemistry test provides the basis for the instructor to compare students’ learning outcomes. The chemistry instructor adopted a combination of instructional strategies (i.e., gates, blended, and enhanced-blended learning) in distinct classes, and the study investigates whether students’ ability increases across those classes. The participants (N = 126) were chemistry majors taking the second general chemistry course in the first year of college. Student responses to the same high-stakes chemistry exam (n = 70) administered to four classes were tabulated in dichotomous (i.e., 0 or 1) format and analyzed with Item Response Theory (IRT). IRT offers a less labor-intensive approach to obtaining meaningful results for communicating the effectiveness of innovative pedagogies. Specifically, the Rasch model was applied to estimate item difficulties and students’ chemistry competency levels, which are plotted on the same logit scale in a Wright map. A one-way ANOVA demonstrated significant differences in chemistry competency levels among the classes using different instructional methods. Educational implications are discussed.
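
For readers unfamiliar with the Rasch model, it posits P(correct) = logistic(ability - difficulty). A toy sketch (simulated responses, not the study’s data) estimates both parameter sets on a shared logit scale, as in a Wright map; here the estimation is done with a near-unpenalized logistic regression on person and item dummies:

    # Minimal Rasch-style sketch: P(correct) = logistic(ability - difficulty).
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_students, n_items = 30, 10
    ability = rng.normal(0, 1, n_students)
    difficulty = np.linspace(-2, 2, n_items)

    # Simulate dichotomous (0/1) responses in long format.
    rows = [(s, i, rng.random() < 1 / (1 + np.exp(-(ability[s] - difficulty[i]))))
            for s in range(n_students) for i in range(n_items)]
    long = pd.DataFrame(rows, columns=["student", "item", "correct"])

    X = pd.get_dummies(long[["student", "item"]].astype(str))
    fit = LogisticRegression(C=1e6, max_iter=2000).fit(X, long["correct"])

    # Student coefficients estimate ability; item coefficients (negated)
    # estimate difficulty, all on the same logit scale, as in a Wright map.
    coef = pd.Series(fit.coef_[0], index=X.columns)
    print(coef.filter(like="item").sort_values())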

Informing Course Design with Analytics and Experiences
Roxanne Russell, Instructional Designer, Candler School of Theology (Emory University)

How can instructional designers take advantage of the opportunities presented by ever-growing access to learning analytics on faculty and student behaviors in online classrooms? This session will present an emerging program-evaluation framework for cross-referencing student perceptions of their learning experiences with learning analytics data on faculty and student interactions with learning tools. This framework offers a method for gathering and triangulating multi-perspective data about course design and student performance.

Analytics for Learning at Emory