Dave's focus is on the transcript. He expresses concern about the volume of information that would appear, and about the difficulty of comparing students from a school that uses component-based grading with students from other schools.
On the first point, he explains:
Putting all this data on transcripts is a little iffier, especially if you propose to include the separate components for each class. In these times, pretty much everyone is short of time and attention, including hiring departments. A traditional transcript provides a certain (small) amount of data -- from your description, perhaps a dozen total or a few dozen yearly grades. Each letter grade represents about four bits' worth of information, small enough to be instantly "seen" rather than interpreted. (Assuming consistent grading standards, but you're already championing those!) The total transcript, or at least its summary section, probably comes out to a short paragraph's worth of data. A prospective employer can easily compare this small dataset among applicants, spot common and/or relevant patterns in the data, and so on.

If you include multiple component grades for each course, you risk swamping the reader's attention. Said reader will then look for a summary, that is, the total grades. If you haven't provided such a summary, they'll improvise or intuit one, by who-knows-what methods.

There is no question that all of us are drowning in information. I guess that's why it's called the information age. Dave is correct: moving from one grade (letter or number) in each of roughly 30 courses to component grades that could add up to 120, 150, or 180 grades would swamp anyone examining the transcript with other things on their "to do" lists. There definitely would need to be some aggregation of the various grades into more general measures. For some components this should not be too difficult. In a component grading system, writing skill would probably appear as a grade for most courses. Even if a student has an aberration in the pattern, it should be possible to take, say, 22 B+ grades and 3 B grades and wrap them into one B+ grade (or, better yet, convert them to their GPA equivalents and report the average). So perhaps a summary for each component, aggregated and averaged across all courses, would be the primary piece of information. Under current practice, transcripts are accompanied by an explanation of the grading system (because no two schools are identical), and surely an explanation of the component grading system could and should be attached. The detailed grades (120, 150, or 180 of them) could be provided in a second attachment, for those who want to dig into the student's abilities more deeply.

After all, when potential employers call me about a student, they're asking exactly those questions: was the student an active participant? Creative? Responsible? Meeting deadlines? Answers to those questions are probably more helpful than the "this student would be great for your firm" recommendation letters, which are in some ways more useful than transcripts and in some ways more difficult to assess and absorb.

One last comment about Dave's first point. Ultimately transcripts will be delivered in digital form. Under those circumstances, the data can be rearranged with a click (or two) so that it can be presented in a manner that suits the needs of a particular employer (or LL.M. program). A bit of spreadsheet programming (a macro package), generating an application for employer use (and/or registrar use), would be handy and surely forthcoming.
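The aggregation step described above -- folding many per-course component grades into one summary figure -- is easy to sketch. Here is a minimal illustration in Python, using the example of 22 B+ grades and 3 B grades; the letter-to-GPA mapping is an assumption, since scales vary by school:

```python
# Hypothetical letter-to-GPA mapping; actual scales vary by school.
GPA = {"A": 4.0, "A-": 3.67, "B+": 3.33, "B": 3.0, "B-": 2.67}

def aggregate(grades):
    """Average a list of letter grades on the GPA scale."""
    return sum(GPA[g] for g in grades) / len(grades)

# The example from the text: 22 B+ grades and 3 B grades for the
# writing-skill component across all courses.
writing = ["B+"] * 22 + ["B"] * 3
print(round(aggregate(writing), 2))  # prints 3.29
```

The GPA-equivalent average (here about 3.29, just below a B+) preserves the small aberration in the pattern, which simply rounding the whole set back to a single B+ letter grade would hide.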
On Dave's second point, he suggests:
Even if you provide overall grades for the component skills, you still have a lesser problem: the employer will now have trouble comparing your graduate with others using course grades. Further, when thinking about these grades, the employer will have to keep in mind the descriptions and implications of your new categories, and try to connect them to the business at hand. A law practice or faculty probably will understand the categories; but those aren't the only employers of new lawyers. Consider corporate and NGO legal departments, various public agencies and offices, etc. What are they to make of your categories? Note that this sort of confusion isn't insuperable, but it will be common to most major changes. I see it as a sort of "systemic inertia" that you'll need to just deal with.

If the components are sensibly labelled, and if an explanation is attached (as is done under the current system), lawyers who are making hiring decisions, and most non-lawyers doing screening for the lawyers, will be able to see the flags and cull the resumes accordingly. Firms looking for research assistance won't worry about a less-than-outstanding aggregated average component grade in spoken advocacy skills, whereas an employer looking for a litigator would be less interested in such a student. Low scores in a component such as "meeting deadlines" would tell employers far more than a B+ that reflects an A-level student who suffers from deadline phobia. Ideally, a law faculty adopting component grading would invite practitioners to share what they would like to see on a transcript. In other words, the system should not be something understood only by a faculty or something developed in a vacuum. Perhaps the attempt to institute component grading would be a strong bridge to span the gap between the "academy" and the practice world, a gap that has widened so quickly during the past decade as the philosophical side of law has overwhelmed those many law schools struggling (in vain, of course) to find a seat at the elite 30's head table in the rankings dining room.
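The culling described above -- each employer screening on the components that matter for its own work -- can be pictured as a simple filter. The sketch below is purely illustrative; the applicants, component names, and thresholds are all invented for the example:

```python
# Hypothetical applicants with GPA-equivalent component grades.
applicants = {
    "Applicant A": {"legal research": 3.7, "spoken advocacy": 2.9},
    "Applicant B": {"legal research": 3.1, "spoken advocacy": 3.8},
}

def screen(applicants, requirements):
    """Keep applicants whose grade meets every required component threshold."""
    return [name for name, grades in applicants.items()
            if all(grades[c] >= t for c, t in requirements.items())]

# A litigation practice screens on spoken advocacy; a firm seeking
# research assistance screens on legal research instead.
print(screen(applicants, {"spoken advocacy": 3.5}))  # prints ['Applicant B']
print(screen(applicants, {"legal research": 3.5}))   # prints ['Applicant A']
```

The same transcript yields a different short list for each employer, which is the point of reporting components rather than one blended grade.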
Dave presents his concept of the transcript:
At the very least, I would suggest a transitional approach, which would make the transcript rather lengthy but let the reader pick out the data they want easily. The transcript would be made up of several sections, preferably on separate pages:

1) explanatory material about the format and categories,
2) the traditional class and overall grades,
3) the overall skill component grades, and possibly...
4) the grand matrix of skill components for each class, as an appendix for dataholics.

This way, you're providing the traditional data for convenience, and also offering your new data for consideration, without forcing it on readers.

His ideas are not that far from mine. I'd put the explanation in an appendix, and I'd put the grand matrix in an appendix as well. The element that gives me pause is the traditional overall class grade. I know that if I shifted to component grading, I'd welcome the release from generating overall grades, and I'd find it very difficult to combine 6 or 8 components into one letter grade. For example, how much weight should each receive? Before someone says, "Well, that's what you're doing now," my response is, "Yes, in some ways I do this, but most law faculty do not necessarily think in these terms when determining the grade that a student has earned." Admittedly, though questions on one of my examinations that measure the ability to identify missing but necessary facts might earn 15 points (and thus represent x% of the course grade), they also measure other things, and ultimately I can't put a precise number on the weight given to "identifying facts" as a component. With component grading I'd design questions a bit differently, focusing each question on the skill it attempts to measure. I'd also take into account separately other components not measured by examination performance (which is far from the best measuring device available to law faculty).
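The difficulty of combining components into a single traditional grade is, mechanically, just a weighted average -- the hard part is choosing the weights. A minimal sketch, with entirely illustrative components and weights:

```python
# Hypothetical component grades (GPA scale) and weights for one student
# in one course; both sets of numbers are invented for illustration.
components = {"writing": 3.33, "identifying facts": 3.0, "analysis": 3.67,
              "spoken advocacy": 3.0, "meeting deadlines": 4.0, "participation": 3.33}
weights = {"writing": 0.25, "identifying facts": 0.15, "analysis": 0.30,
           "spoken advocacy": 0.10, "meeting deadlines": 0.10, "participation": 0.10}

def overall(components, weights):
    """Weighted average of component grades; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(components[c] * w for c, w in weights.items())

print(overall(components, weights))
```

The arithmetic is trivial; the objection in the text stands, because every choice of weights is a judgment call that most faculty never make explicitly.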
In any event, it's fun to bat this idea around. Whether or not it goes anywhere (other than around and around), it makes us think about what we are doing in the whole process of evaluating students and determining the grades that they have earned.