6 changes: 6 additions & 0 deletions .gitignore
@@ -24,6 +24,9 @@ out/
# Compiled class file
*.class

# Generated libraries
lib/

# Log file
*.log

@@ -45,3 +48,6 @@ out/

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*

# backup files
*~
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -89,3 +89,9 @@ _4.15.2019_

**Repository**
- Switched CI to GitLab

#### 2.1.1
_12.12.2022_
- Fixed Gradescope example
- Updated library versions
- Improved error handling and reporting
304 changes: 8 additions & 296 deletions README.md
@@ -1,300 +1,12 @@

# JGrade
_A library for grading Java assignments_


[![pipeline status](https://gitlab.com/tkutcher/jgrade/badges/dev/pipeline.svg)](https://gitlab.com/tkutcher/jgrade/pipelines/dev/latest)
<a href="https://tkutcher.gitlab.io/jgrade/api">
<img src="https://img.shields.io/static/v1?label=%20&message=docs&color=informational" alt="docs"/>
</a>
<a href="https://tkutcher.gitlab.io/jgrade">
<img src="https://img.shields.io/static/v1?label=version&message=1.1.0&color=orange" alt="version"/>
</a>


<br>



[API Documentation](https://tkutcher.gitlab.io/jgrade/api)


---


NOTE - I've moved the CI to GitLab and am using GitLab to host the API docs (https://tkutcher.gitlab.io/jgrade), but this
will remain the primary repository. GitLab will just mirror the master and dev branches.


:bangbang: Help Wanted :bangbang:

Once upon a time, it was my priority to grade intermediate-level Java 8 assignments - but after graduating, that
priority has gone down a bit :grimacing: . I would be happy to help familiarize anyone who is interested in contributing
and keeping this maintained. Submit an issue in the project if you are interested or have any ideas!


- [Overview](#overview)
- [Quick Start](#quick-start)
- [Features and Usage](#features-and-usage)
- [Development](#development)
- [Ideas / Wishlist](#wishlist)

---

## Overview
JGrade is a helper tool with various classes designed to assist course instructors in "autograding" an assignment,
inspired by the [Gradescope Autograder](https://gradescope-autograders.readthedocs.io/en/latest/). There are classes
that the client can integrate with directly, or the jar's main method can be used (by providing a class with annotations),
which wraps a lot of common functionality (see [examples](https://github.com/tkutcher/jgrade/tree/development/examples)).
It was designed to produce the output needed for Gradescope while being extensible enough to produce different
outputs and to configure the specific JSON output Gradescope is looking for.


## Quick Start

To get started, grab the jar file from the [Releases](https://github.com/tkutcher/jgrade/releases) page.
It includes many classes you can use directly, as well as a main method for running graders and producing grading output.

With this, you could have the following setup:

A class that runs unit tests whose success we want to treat as a grade (these tests would import student code):

```java
import com.github.tkutcher.jgrade.gradedtest.GradedTest;
import org.junit.Test;

import static com.github.tkutcher.jgrade.gradedtest.GradedTestResult.HIDDEN;
import static com.github.tkutcher.jgrade.gradedtest.GradedTestResult.VISIBLE;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.fail;

public class ExampleGradedTests {
@Test
@GradedTest(name="True is true", points=2.0, visibility=VISIBLE)
public void trueIsTrue() {
assertTrue(true);
}

@Test
@GradedTest(name="False is false", number="2", points=3.0, visibility=HIDDEN)
public void falseIsFalse() {
assertFalse(false);
}

@Test
@GradedTest(name="Captures output")
public void capturesOutput() {
System.out.println("hello");
}

@Test
@GradedTest(name="This test should fail")
public void badTest() {
fail();
}
}
```

and a class `MyGrader.java` with some other grading-related logic that isn't unit tests:

```java
import com.github.tkutcher.jgrade.BeforeGrading;
import com.github.tkutcher.jgrade.AfterGrading;
import com.github.tkutcher.jgrade.Grade;
import com.github.tkutcher.jgrade.Grader;
import com.github.tkutcher.jgrade.gradedtest.GradedTestResult;

import static com.github.tkutcher.jgrade.gradedtest.GradedTestResult.HIDDEN;


public class MyGrader {

/* All @Grade/@BeforeGrading/@AfterGrading methods must take exactly one parameter
* of type Grader. This parameter is the same grader throughout.
*
* @BeforeGrading methods are run before others.
*/
@BeforeGrading
public void initGrader(Grader grader) {
grader.startTimer();
}

/* You can run unit tests that are annotated with @GradedTest to add
* GradedTestResults to the Grader in this way.
*/
@Grade
public void runGradedUnitTests(Grader grader) {
grader.runJUnitGradedTests(ExampleGradedTests.class);
}

/* You can also manually add GradedTestResults you create to the grader. */
@Grade
public void singleTestResult(Grader grader) {
grader.addGradedTestResult(
new GradedTestResult("manual test", "1", 1.0, HIDDEN)
);
}

/* Grader.startTimer() and Grader.stopTimer() can be used to time the grader */
@Grade
public void loopForTime(Grader grader) {
long startTime = System.currentTimeMillis();
while (System.currentTimeMillis() - startTime < 1000);
}

/* @AfterGrading methods are run after all other methods. */
@AfterGrading
public void endGrader(Grader grader) {
grader.stopTimer();
}
}
```

Then, you could run

```shell
java -jar ../lib/jgrade-1.1-all.jar -c MyGrader -o results.json
```

and get Gradescope-formatted JSON. See the [examples](/examples) for more complete examples, including how to set up a script
to work with Gradescope, and expand the usage below to see the arguments you can pass to this main program.

<details><summary>Usage</summary>
<p>

```
-c,--classname arg the class containing annotated methods to grade
-f,--format output-format specify output, one of 'json' (default) or 'txt'
-h,--help
--no-output don't produce any output (if user overriding)
-o destination save output to another file (if not specified,
prints to standard out)
--pretty-print pretty-print output (when format is json)
-v,--version

```

</p>
</details>


## Features and Usage

The way I used this library is to have a base class for the course (for example, a `_226Grader`) that contains
annotated methods for the functionality and grading steps that are consistent across all assignments. For example, the
`@BeforeGrading` method starts a timer, the `@AfterGrading` method stops it, and a `@Grade` method
does the "grading" of style with checkstyle. Subclasses, for example `Assignment1Grader` (or `Assignment0Grader`
I suppose :wink:), extend this and add `@Grade` methods for assignment-specific grading.
See the gradescope folder in the examples for a rough example setup.
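
As a rough sketch of what such a base class might look like - pieced together from the annotations and the `Grader`/`CheckstyleGrader` calls shown elsewhere in this README - here is a hypothetical `Grade226Assignment` (the point values, file paths, and the `CheckstyleGrader` import path are assumptions, not code from this repository):

```java
import com.github.tkutcher.jgrade.AfterGrading;
import com.github.tkutcher.jgrade.BeforeGrading;
import com.github.tkutcher.jgrade.CheckstyleGrader;  // assumed package; verify against the API docs
import com.github.tkutcher.jgrade.Grade;
import com.github.tkutcher.jgrade.Grader;

/* Hypothetical course-wide base class; per-assignment graders extend it
 * and add their own @Grade methods. */
public class Grade226Assignment {

    private static final double STYLE_POINTS = 5.0;  // placeholder value

    @BeforeGrading
    public void startGrading(Grader grader) {
        grader.startTimer();  // shared setup that runs before all @Grade methods
    }

    /* Style grading is the same for every assignment, so it lives in the base
     * class. The constructor arguments mirror the CheckstyleGrader example
     * below; the jar, source, and config paths here are placeholders. */
    @Grade
    public void gradeStyle(Grader grader) {
        CheckstyleGrader checker =
                new CheckstyleGrader(STYLE_POINTS, 1.0, "lib/checkstyle.jar", "src/");
        checker.setConfig("res/checks.xml");
        grader.addGradedTestResult(checker.runForGradedTestResult());
    }

    @AfterGrading
    public void endGrading(Grader grader) {
        grader.stopTimer();  // runs after all @Grade methods
    }
}
```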

### Features

See the [API Docs](https://tkutcher.gitlab.io/jgrade/api) for more complete documentation.

#### `CheckstyleGrader`

With the `CheckstyleGrader` you can specify grading deductions for checkstyle errors. The method below, for example,
would check the student's files and deduct a point for each type of checkstyle error (missing Javadoc, required `this`, etc.).

```java
@Grade
public void runCheckstyle(Grader grader) {
CheckstyleGrader checker = new CheckstyleGrader(5.0, 1.0, MY_CHECKSTYLE_JAR, STUDENTFILES);
checker.setConfig(MY_CHECKSTYLE_CONFIG);
GradedTestResult result = checker.runForGradedTestResult();
result.setScore(Math.max(0, 5 - checker.getErrorTypeCount()));
grader.addGradedTestResult(result);
}
```

#### `DeductiveGraderStrategy`

You can use this strategy to make failed tests deduct points from a total. Say an assignment has two parts, A and B.
If a student fails 30 one-point tests for part B, you don't want those deductions to cut into the part A score:

```java
public class GradeAssignment7 extends Grade226Assignment {

private static final int AVL_POINTS = 30;
private static final int TREAP_POINTS = 20;

@Grade
public void gradeAvlTree(Grader grader) {
grader.setGraderStrategy(new DeductiveGraderStrategy(AVL_POINTS, "AvlTreeMap"));
grader.runJUnitGradedTests(GradeAvlTreeMap.class);
}

@Grade
public void gradeTreapMap(Grader grader) {
grader.setGraderStrategy(new DeductiveGraderStrategy(TREAP_POINTS, "TreapMap"));
grader.runJUnitGradedTests(GradeTreapMap.class);
}
}
```


#### `CLITester`

A class to help with testing command-line programs. You subclass `CLITester`, implement
the `getInvocation()` method to specify how the command-line program is invoked, and then use
`runCommand(String)` to get the output in an object that you can check for expected output.


---

## Development

- `mvn install` to compile
- `mvn test` to run unit tests
- `mvn checkstyle:checkstyle` to run checkstyle
- `mvn javadoc:jar` to generate API docs.

Check out [contributing](/CONTRIBUTING.md) for more.


### Requirements
JGrade is written in [Java 8](https://www.oracle.com/technetwork/java/javase/overview/java8-2100321.html).
Since the library has classes designed to run alongside JUnit, [JUnit 4](https://junit.org/junit4/) is a dependency
for the entire project (as opposed to just for running the project's own unit tests).
The [org.json](https://mvnrepository.com/artifact/org.json/json) package is used to produce correctly formatted
JSON output, and the [Apache Commons CLI](https://commons.apache.org/proper/commons-cli/) library is used for
reading the command line in the main program.

For simplicity, the main jar (appended with "-all") includes all of these dependencies.

### Wishlist
- Feedback for required files
  - In our autograder, we built in something that took a list of required files and created a visible test case, worth 0 points, listing which files were missing - this helped students debug.
  - Could try to move some of that into this library.
- Actual Observer pattern
  - Allow people to register custom handlers whenever things like new graded test results are added.
  - The old "observer" terminology isn't really an observer.

This is a fork of [tkutcher/JGrade](https://github.com/tkutcher/jgrade),
created by Tim Kutcher. Because he has moved on to other projects, I
forked it so I could run it with the current (2022) version of Gradescope.
I have made minor changes to the JGrade library and significant changes
to the [Gradescope example](examples/gradescope/README.md). See
[my changes](https://github.com/tkutcher/jgrade/compare/tkutcher:jgrade:dev...espertus:jgrade:dev).

**This has been superseded by [Jacquard](https://github.com/espertus/jacquard).**
1 change: 1 addition & 0 deletions examples/gradescope/.gitignore
@@ -0,0 +1 @@
zips/*.zip
47 changes: 36 additions & 11 deletions examples/gradescope/README.md
@@ -1,17 +1,42 @@
# JGrade Gradescope Example

This is a full example that works on Gradescope, and models much of the setup from the original [java example](https://github.com/gradescope/autograder_samples/tree/master/java) that Gradescope links to.
This demonstrates a Gradescope autograder that uses checkstyle and JUnit.

It compiles all files into a created `classes/` directory (not tracked). The `lib/` folder contains all jars and library files - for this example, just `jgrade-1.0.0-all.jar` (which includes JUnit, etc.) and `checkstyle-8.12.jar`. The `res/` directory is for resources (like the checkstyle configuration file). `src/` is the main source code, and `test_submissions/` holds submissions to test with on Gradescope.
[![Watch the video](https://img.youtube.com/vi/o1FHbHZwyUY/maxresdefault.jpg)](https://youtu.be/o1FHbHZwyUY)

The source has 2 main packages, `staff` and `student`. The staff package contains the unit tests, a solution (to debug with) and the code to do the grading.
These are the files and directories:
* `make_autograder.sh`: zips up files for uploading to Gradescope
* `makefile`: provides an alternative to directly sourcing the shell scripts
* `setup.sh` [required by Gradescope]: sets up the environment by installing a recent JDK
* `run_autograder` [required by Gradescope]: attempts to copy student code into the `/autograder/source` directory, compile the grading code with the student code, and run the grader
* `compile.sh`: compiles the combined code, copying `compilation_error.json` into the `results` directory if compilation fails
* `compilation_error.json`: the JSON for indicating the [student's] code did not compile
* `run.sh`: runs JGrade, passing in the assignment-specific grading class (`GradeHello`)
* `README.md`: this file
* `classes`: the destination for compiled files
* `lib/`: the location of needed libraries, which are not checked into git
* `checkstyle-10.5.0-all.jar` [which you need to download if you want]
* `jgrade-2.1.1-all.jar` [which you need to build yourself for now]
* `README.md`: documentation
* `res/`: the location of resources
* `sun_checks.xml`: a configuration file needed by checkstyle
* `src/main/java/`
* `staff/hello/`: code provided by the instructor
* `GradeHello.java`: the controller (package is `staff.hello`)
* `Greeting.java`: code imported by the student (package is `student.hello`)
* `Hello.java`: model solution to the assignment (package is `student.hello`)
* `HelloTest`: test cases using JUnit (package is `staff.hello`)
* `student/hello/`: the student's code
* `Hello.java`: skeletal code that students need to complete
* `test_submissions/`: zip files of student submissions, to be manually provided to Gradescope for testing the autograder
* `correct.zip`: a fully functional project with checkstyle errors
* `errors.zip`: a project that fails some tests
* `nocompile.zip`: a project that has compile errors
* `zips/`: where `make_autograder.sh` places the zipped autograder it builds

To build the autograder, run either `$ sh make_autograder.sh` or `$ make autograder` which will place it in the `zips/` folder.

While debugging, a makefile is provided for compiling and running. `make output` will start fresh and run the autograder, pretty-printing the output to the console.
To test (and debug) the autograder before uploading it, execute:
```
./run_autograder --local
```

- `setup.sh`: Installs correct JDK
- `run_autograder`: Main script for the autograder. Copies in submission, compiles, and runs.
- `compile.sh`: Compiles all of the source into a classes directory
- `run.sh`: Runs JGrade, passing in the `GradeHello` file, writing output
- If run with `--local`, prints output to the console; otherwise, writes to the `results/results.json` file.