WebTA: Automated Iterative Critique of Student Programming Assignments
We introduce an interactive tool called WebTA that facilitates learning through automatic critique of student source code. Our tool provides immediate feedback to students and gives them experience with test-driven development. The feedback students receive embodies a form of cognitive apprenticeship, fostering tight, productive cycles of inquiry, critique, and learning.
WebTA compiles each student submission and executes it over a series of shakedown tests. Immediate feedback is given concerning errors and warnings, coupled with suggestions for debugging. The tool performs a textual analysis of the student's source code and critiques programming style based on standard programming guidelines. To encourage inquiry through test-driven development, edge-case coverage, and API compliance, students develop and submit their own tests to be evaluated by the software.
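To make the test-submission workflow concrete, here is a minimal sketch of the kind of student-developed tests WebTA might evaluate for edge-case coverage and API compliance. The `StudentTests` class and the stack exercise are illustrative assumptions, not WebTA's actual interface; plain assertions stand in for JUnit so the example is self-contained.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.NoSuchElementException;

// Hypothetical student-submitted tests for a stack assignment.
public class StudentTests {

    // Edge-case test: popping an empty stack should fail cleanly
    // rather than return a sentinel value.
    static boolean testPopEmpty() {
        Deque<Integer> stack = new ArrayDeque<>();
        try {
            stack.pop();
            return false; // no exception thrown: the test fails
        } catch (NoSuchElementException e) {
            return true;  // expected behavior
        }
    }

    // API-compliance test: pop must return the most recently
    // pushed element (LIFO order).
    static boolean testPushPop() {
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(41);
        stack.push(42);
        return stack.pop() == 42;
    }

    public static void main(String[] args) {
        System.out.println("testPopEmpty: " + (testPopEmpty() ? "PASS" : "FAIL"));
        System.out.println("testPushPop: " + (testPushPop() ? "PASS" : "FAIL"));
    }
}
```

A grader in this style would run each submitted test against both the student's implementation and a reference solution, rewarding tests that exercise boundary conditions like the empty-stack case above.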
We report on the use of WebTA in one first-year programming course and one second-year data structures course. Lab and assignment scores have improved with WebTA, and student comments attest to the effectiveness of the tool. Preliminary results indicate students receive higher grades with WebTA. One area with mixed results is WebTA's analysis of student-developed JUnit tests; this feature improved API compliance but reduced edge-case testing. With these successful initial results, we offer suggestions for future development.