Completed · 🏆 Project Excellence Award

The study behaviours and expectations of DATA1X01 students

Course Project · University of Sydney · 2024 S2
Tags: Survey EDA · Learning Styles · Study Habits · Mark Goals · Likert · R · tidyverse · ggplot2 · Visualization

Exploratory Data Analysis · Survey Analytics (DATA1X01)

Project Overview

We investigate DATA1X01 students’ study behaviours and expectations using cohort survey responses. Two questions guided the analysis:

1) How do study habits (steady work, last-minute work, subject-driven patterns) relate to self-identified learning styles (deep, surface, strategic)?
2) Across DATA1X01 streams, how do students’ mark goals relate to the weekly study hours they plan to invest?

We performed initial data analysis (IDA) in R (tidyverse/ggplot2), cleaned and encoded survey fields, and produced descriptive summaries and visuals. A key insight is that intended study time does not strongly align with mark goals, suggesting a gap between expectations and planned effort. We also observed mixed alignment between self-reported learning styles and declared habits, highlighting the limits of self-assessed study strategies.
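
The cleaning and encoding step can be sketched as follows. This is a minimal illustration, not the project script: the column names (`learning_style`, `planned_hours`) and category labels are hypothetical stand-ins for the actual survey fields.

```r
library(dplyr)

# Toy stand-in for the raw survey extract (real fields differed)
responses <- tibble(
  learning_style = c("Deep", "Surface", "Strategic", "deep ", NA),
  planned_hours  = c("10", "4", "8", "12", "6")
)

clean <- responses |>
  mutate(
    # Normalise free-text labels, then encode as a factor with fixed levels
    learning_style = factor(tolower(trimws(learning_style)),
                            levels = c("surface", "strategic", "deep")),
    # Hours arrive as text in the export; coerce to numeric
    planned_hours  = as.numeric(planned_hours)
  ) |>
  # Drop unanswered rows; the write-up documents this as a caveat
  filter(!is.na(learning_style))
```

Fixing the factor levels up front keeps tables and plot axes in a consistent order across every figure generated from the script.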

What I did

  • Framed research questions within the course scope; operationalized study habits and learning styles into analyzable variables.
  • Cleaned and encoded survey responses; handled missing/ambiguous entries; documented assumptions and data caveats.
  • Performed IDA in R (tidyverse): grouped summaries, cross-tabs, and clear visuals for patterns and outliers.
  • Interrogated links between mark goals and planned weekly hours; contrasted signals across different DATA1X01 streams.
  • Examined correspondence between declared learning styles and reported habits; highlighted misalignments and nuance.
  • Authored a reproducible analysis script and an accessible narrative for non-technical readers.
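
The grouped-summary and visualization steps above can be sketched like this. It is a hedged example on simulated data: the stream codes, goal labels, and column names are assumptions for illustration, not the cohort's actual values.

```r
library(dplyr)
library(ggplot2)

set.seed(1)  # reproducible toy data

# Hypothetical cleaned data: one row per student
survey <- tibble(
  stream        = sample(c("Stream A", "Stream B"), 200, replace = TRUE),
  mark_goal     = sample(c("Pass", "Credit", "Distinction", "HD"), 200, replace = TRUE),
  planned_hours = pmax(0, rnorm(200, mean = 8, sd = 3))
)

# Grouped summary: median planned hours per mark goal within each stream
goal_hours <- survey |>
  group_by(stream, mark_goal) |>
  summarise(median_hours = median(planned_hours), n = n(), .groups = "drop")

# Boxplots of planned hours by mark goal, faceted by stream
p <- ggplot(survey, aes(x = mark_goal, y = planned_hours)) +
  geom_boxplot() +
  facet_wrap(~ stream) +
  labs(x = "Mark goal", y = "Planned weekly study hours")
```

If mark goals and planned hours were strongly aligned, the boxplots would step upward from Pass to HD; a flat profile is the visual signature of the expectations/effort gap described above.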

Reflection

Winning the Project Excellence Award validated not only our results but our process. The strongest lesson for me was learning how to translate messy, self-reported survey constructs ("study habits", "learning styles", "mark goals") into analyzable features without over-claiming. I committed to a transparent, reproducible pipeline (scripted data cleaning, clear encoding rules, figure generation from code) so every chart had a traceable origin and every conclusion was proportional to evidence.

If repeating this project, I would incorporate objective traces (e.g., LMS activity, time-stamped submissions) to triangulate self-reports, add reliability checks for composite indices (e.g., internal consistency of "learning style" items), and pre-register primary questions and visuals to reduce researcher degrees of freedom. I would also design a lightweight follow-up experiment: a weekly planning nudge or progress dashboard, evaluated with simple A/B or stepped-wedge logic and clear ethics/consent language.

Finally, I learned to communicate with stakeholders. Instructors want actionable signals, not statistical flourish. Framing the gap between mark goals and planned hours as a concrete opportunity for targeted support (time-planning templates, early-semester check-ins) made the work useful. The project's recognition, I think, came from this combination of rigor, restraint, and relevance: an approach I'll carry into future data projects.