Harrah's and Harveys Hotels, Lake Tahoe, Nevada, USA
Tuesday, December 10, 2013
Location: Harrah's Tahoe D
Schedule

Time | Duration | Event | Speaker | Slides
7:25-7:30 | 5 min | Welcome and introduction | Organizers
7:30-8:00 | 30 min | Discovering and removing barriers to learning | Ken Koedinger (CMU) | [pdf]
8:00-8:30 | 30 min | Demographics and learner behavior in MITx and HarvardX MOOCs | Daniel Seaton (MIT, edX) | [pdf]
8:30-9:00 | 30 min | Personalized Learning and Temporal Modeling at Khan Academy | Jascha Sohl-Dickstein (Stanford, Khan Academy) | [pdf]
9:00-9:30 | 30 min | Coffee Break
9:30-10:30 | 1 hr | Poster Spotlights
10:30-3:00 | - | Ski break and lunch
3:00-3:30 | 30 min | Creating Infinitely Adaptable Courseware | Zoran Popovic (UW)
3:30-4:00 | 30 min | What I've Learned about Learning | Peter Norvig (Google) | [pdf]
4:00-4:30 | 30 min | Opportunities and Challenges for Education Research on Coursera | Andrew Maas (Stanford/Coursera) | [pdf]
4:30-5:30 | 1 hr | Coffee break and Poster Session
5:30-6:15 | 45 min | Data Panel | Invited panelists
- | - | Conclusion | Organizers
Workshop Description

Motivation and Goals

Given the incredible technological leaps that have changed so many aspects of our lives in the last hundred years, it's surprising that our approach to education today is much the same as it was a century ago. While successful educational technologies have been developed and deployed in some areas, we have yet to see a widespread disruption in teaching methods at the primary, secondary, or post-secondary levels. However, as more and more people gain access to broadband internet and new technology-based learning opportunities are introduced, we may be witnessing the beginnings of a revolution in educational methods. In the realm of higher education, rising college tuition, cuts in funding to schools, and an ever-increasing world population that desires high-quality education at low cost have spurred the need to use technology to transform how we deliver education.
With these technology-based learning opportunities, the rate at which educational data is collected has also exploded in recent years, as an increasing number of students turn to online resources, both at traditional universities and in massive open online courses (MOOCs), for formal or informal learning. This change raises exciting challenges and possibilities, particularly for the machine learning and data science communities.
These trends and changes are the inspiration for this workshop, and our first goal is to highlight some of the exciting and impactful ways that our community can bring tools from machine learning to bear on educational technology. Some examples include (but are not limited to) the following:
- Adaptive and personalized education
- Gamification and crowdsourcing in learning
- Large scale analytics of MOOC data
- Multimodal sensing
- Optimization of pedagogical strategies and curriculum design
- Content recommendation for learners
- Interactive tutoring systems
- Intervention evaluations and causality modeling
- Supporting collaborative and social learning
- Data-driven models of human learning
- Analysis of social networks of students and teachers
- Automated, semi-automated, and crowdsourced methods for formative and summative assessment
The second goal of the workshop is to accelerate the progress of research in these areas by addressing the challenge of data availability. At the moment, there are several barriers to entry, including the lack of open and accessible datasets and the absence of standardized formats for such datasets. We hope that by (1) surveying a number of the publicly available datasets and (2) proposing, in a spirited panel discussion, ways to distribute other datasets such as MOOC data, we can make real progress on this issue as a community, thus lowering the barrier for researchers aspiring to make a big impact in this important area.
Target Audience
- Researchers interested in analyzing and modeling educational data,
- Researchers interested in improving or developing new data-driven educational technologies,
- Others from the NIPS community curious about the trends in online education and the opportunities for machine learning research in this rapidly-developing area.
Call for Papers

Submissions should follow the NIPS format and are encouraged to be up to six pages. Papers submitted for review do not need to be anonymized. There will be no official proceedings, but accepted papers will be made available on the workshop website. Accepted papers will be presented both as a poster and as a short spotlight presentation. We welcome submissions on novel research work as well as extended abstracts on work recently published or under review at another conference or journal (please state the venue of publication in the latter case); we also encourage visionary position papers on emerging trends in data-driven education.
Important Dates
- Paper Submission --- October 16th, 2013 (extended deadline)
- Author notification --- October 24th, 2013
- Camera ready deadline for accepted submissions --- October 30th, 2013
- Finalized workshop schedule out --- October 30th, 2013
- Deadline for NIPS Early Registration --- November 8th, 2013
- Data Driven Ed Workshop --- December 10th, 2013
Invited Speakers
- Ken Koedinger (CMU)
- Daniel Seaton (MIT, edX)
- Jascha Sohl-Dickstein (Stanford, Khan Academy)
- Zoran Popovic (UW)
- Peter Norvig (Google)
- Andrew Maas (Stanford/Coursera)
Panelists
- Eliana Feasley (Khan Academy)
- Una-May O'Reilly (MIT)
- Jim Bower (Barshop Institute, University of Texas Health Science Center, San Antonio, Texas, and Numedeon Inc.)
- Lori Breslow (MIT)
Accepted Papers
- Varun Aggarwal, Shashank Srikant, and Vinay Shashidhar. Principles for using Machine Learning in the Assessment of Open Response Items: Programming Assessment as a Case Study
- Sumit Basu, Chuck Jacobs and Lucy Vanderwende. Powergrading: a Clustering Approach to Amplify Human Effort for Short Answer Grading
- Franck Dernoncourt, Choung Do, Sherif Halawa, Una-May O'Reilly, Colin Taylor, Kalyan Veeramachaneni and Sherwin Wu. MOOCVIZ: A Large Scale, Open Access, Collaborative, Data Analytics Platform for MOOCs
- Jorge Diez, Oscar Luaces, Amparo Alonso-Betanzos, Alicia Troncoso and Antonio Bahamonde. Peer Assessment in MOOCs Using Preference Learning via Matrix Factorization
- Stephen E. Fancsali. Data-driven causal modeling of "gaming the system" and off-task behavior in Cognitive Tutor Algebra
- Damien Follet. A three-steps classification algorithm to assist criteria grid assessment
- Peter W. Foltz and Mark Rosenstein. Tracking Student Learning in a State-Wide Implementation of Automated Writing Scoring
- Jose P. Gonzalez-Brenes, Yun Huang and Peter Brusilovsky. FAST: Feature-Aware Student Knowledge Tracing
- Fang Han, Kalyan Veeramachaneni and Una-May O'Reilly. Analyzing student behavior during problem solving in MOOCs
- Mohammad Khajah, Rowan M. Wing, Robert V. Lindsey and Michael C. Mozer. Incorporating Latent Factors Into Knowledge Tracing To Predict Individual Differences In Learning
- Robert V. Lindsey, Jeff D. Shroyer, Harold Pashler and Michael C. Mozer. Improving students' long-term knowledge retention through personalized review
- Yun-En Liu, Travis Mandel, Zoran Popovic and Emma Brunskill. Towards Automatic Experimentation of Educational Knowledge
- Andras Lorincz, Gyongyver Molnar, Laszlo A. Jeni, Zoltan Toser, Attila Rausch and Jeffrey F. Cohn. Towards entertaining and efficient educational games
- Travis Mandel, Yun-En Liu, Zoran Popovic, Sergey Levin and Emma Brunskill. Unbiased Offline Evaluation of Policy Representations for Educational Games
- Sergiy Nesterko, Svetlana Dotsenko, Qiuyi Hu, Daniel Seaton, Justin Reich, Isaac Chuang, and Andrew Ho. Evaluating Geographic Data in MOOCs
- Andy Nguyen, Christopher Piech, Jonathan Huang and Leonidas Guibas. Codewebs: Scalable Code Search for MOOCs
- Zachary A. Pardos. Simulation study of a HMM based automatic resource recommendation system
- Arti Ramesh, Dan Goldwasser, Bert Huang, Snigdha Chaturvedi, Hal Daume III and Lise Getoor. Modeling Learner Engagement in MOOCs using Probabilistic Soft Logic
- Nihar B. Shah, Joseph K. Bradley, Abhay Parekh, Martin Wainwright and Kannan Ramchandran. A Case for Ordinal Peer Evaluation in MOOCs
- Adish Singla, Ilija Bogunovic, Gabor Bartok, Amin Karbasi and Andreas Krause. On Actively Teaching the Crowd to Classify
- Glenda S. Stump, Jennifer DeBoer, Jonathan Whittinghill and Lori Breslow. Development of a Framework to Classify MOOC Discussion Forum Posts: Methodology and Challenges
- Weiyi Sun, Siwei Lyu, Hui Jin and Jianwei Zhang. Analyzing Online Learning Discourse using Probabilistic Topic Models
- Joseph Jay Williams. Applying Cognitive Science to Online Learning
- Joseph Jay Williams and Betsy Williams. Using Interventions to Improve Online Learning
- Diyi Yang, Tanmay Sinha, David Adamson and Carolyn Penstein Rose. "Turn on, Tune in, Drop out": Anticipating student dropouts in Massive Open Online Courses
Organizers
Related Conferences and Workshops