21 March 2012

But how do you know if they're the best?

A couple of days ago I posted this:
I continue to believe there is a huge opportunity for a school to market itself as the college which takes the best students and doesn't bother with any of the other dimensions which crop up on typical applications.

"Your parents didn't send you on a trip to Mexico to dig latrines so you would have a moving experiences to write about in your admissions essay? We don't care, because you aced multivariate calculus when you were fifteen. We take the best students we can find and forget all the other bullshit. Here, take a look at our admissions files and graduation rates and job placement statistics and alumni salary histories judge for yourself. Come to our school and you will be in classrooms with the smartest other young people we can find. We do not put our thumb on the scale in any way to tip the balance towards or away from any type of person, because we are entirely committed to admitting the most intelligent."
An anonymous commenter raised an interesting wrinkle:
A school which does this is going to have to do something about the Chinese student problem. Chinese students are constantly applying everywhere with ridiculous scores in everything. And then it turns out they got those scores by cheating at every opportunity, and they continue cheating in the university. That's the problem with the approach -- distinguishing between "you aced multivariate calculus at 15" and "you cheated your way to an A in multivariate calculus at 15."
I think this is worth responding to in a separate post.

(1) One approach is to require online "interviews." Jeff Atwood has mentioned this several times in the context of evaluating whether potential hires for programming positions can actually program (e.g.). There are systems which are a combination of VOIP connections and live screencasts. You ask the interviewee a question, and then you can see what they see as they type out the solution. I know similar systems exist for testing English competency. Calculus would probably be harder to test in this way than English or programming, but something along these lines could be worked out.

Of course these systems do not completely eliminate cheating, but they do raise the cost of cheating. As a side note, the rhythm of someone typing an answer that someone else is dictating to them differs from the rhythm of someone composing an answer live. There are, IIRC, machine learning routines that can differentiate between a student typing an essay they are composing as they type and a student re-typing an essay their "tutor" is providing to them.
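To make the idea concrete, here is a toy sketch of the kind of keystroke-timing signal such a system might look at. This is purely illustrative, not any real product's method: the feature choice (inter-key gaps and long "thinking" pauses) and the threshold are hypothetical stand-ins for a model trained on labeled sessions.

```python
from statistics import mean, pstdev

def interkey_features(timestamps):
    """Summarize typing rhythm from a list of key-press times (seconds).

    Returns (mean gap, gap standard deviation, long-pause ratio).
    The intuition: a writer composing as they type pauses to think,
    while a transcriber types at a steadier pace with few long pauses.
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    long_pauses = sum(1 for g in gaps if g > 1.0)  # gaps over one second
    return mean(gaps), pstdev(gaps), long_pauses / len(gaps)

def looks_transcribed(timestamps, pause_threshold=0.05):
    """Toy rule: flag sessions whose long-pause ratio is suspiciously low.

    A real system would train a classifier on labeled typing sessions;
    the hand-set threshold here is only for demonstration.
    """
    _, _, pause_ratio = interkey_features(timestamps)
    return pause_ratio < pause_threshold
```

A steady stream of keystrokes with no long pauses gets flagged, while a session punctuated by thinking pauses does not. Real keystroke-dynamics systems use far richer features (digraph latencies, hold times), but the shape of the signal is the same.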

(2) I would be satisfied if schools would simply verify that their foreign students can speak English. This is a problem I've experienced mostly in grad school. I'm all for more foreign students, but I think some basic language competency needs to be assured. I'd settle for them being able to answer "Is someone sitting here?" which is beyond more than a few of my classmates/co-workers.

Verifying that someone really knows calculus over the internet is difficult, but a simple Skype call should be enough to check whether they really speak English.

(3) Another solution is to make foreign students arrive several weeks before their first semester to take placement exams and courses. If you fail the exams, you pack your bags and head home.

(4) A more drastic option is to not accept foreign students at all. Or to not accept foreign students from countries with known cheating problems like China and, to a growing extent, Russia. Removing these applicants from consideration seems to contradict the commitment to "accept the best students," but I don't think it would be too difficult to explain that the policy must be implemented as "accept the students we can confidently verify are the best."

(5) Most US universities love foreign students because they pay full-freight tuition. A university that could foster a reputation among employers for diligently weeding out cheaters might be able to command a premium which would replace the lost potential revenue from admitting under-qualified foreign (or domestic, for that matter) applicants.

The cause of the wage premium from college degrees is hotly debated, but I think everyone agrees that at least a portion of it is due to employers getting colleges to do the work of pre-screening their potential hires for them.*
e.g. Firms are more willing to hire the guy from Yale not because he learned that much more there than the guy from NC State, but because he was good enough to get into Yale in the first place.
If a college could build a reputation for doing that screening more thoroughly than other schools, then its alumni could command higher wages, which would attract more talented applicants, which would make its screening more impressive, which would feed back in the form of higher alumni wages, and so on.

(6) Most colleges have no incentive to punish cheaters or expel students whose work is not up to snuff. A warm body that pays its bills on time is fine for most universities. A school which was building its entire reputation on winnowing the wheat from the chaff would have different incentives. Would that be enough for the school to properly identify cheaters/slackers/idiots-who-slipped-through-the-application-process-by-mistake? I don't know, but my inclination is that it would, provided the school had really, publicly committed itself to the academic quality of its student body above all else. If even a small fraction of the graduates had faked their way through, it would endanger the entire strategy.

(7) As a nice side effect, being willing to punish cheaters may also help to attract better faculty. I've lost track of how many furious articles and blog posts I've read from faculty members who catch people cheating and are told to brush it all under the rug because the university's incentives are to let it slide.

Most faculty members really don't want to look the other way. That's why I don't think it would take a huge shift in the university's incentive structure to make it possible to strongly crack down on cheating. The instructors are champing at the bit for an opportunity to do so.


  1. Starting a school is hard.

    It's what I'm doing right now in Mauritius.

    Between the accrediting bodies and all of the other interested parties, there are any number of groups that do nothing but suck up resources.

    There is a lot of unnecessary overhead in the typical educational institution, which should allow for-profit businesses to operate in the sphere. But there are too many other organizations that, in protecting their turf, try to suck up the margin available for profit.

    Reputation, even mediocre reputation in the case of my institution, is very hard to establish. It takes time. Time that too few investors are willing to endure.

    Someone is going to break out. They will make piles of money. I'm trying to be part of it.

  2. I know a group of TAs who discovered a cheating ring in one of the engineering lab courses because the group was turning in lab reports using data that didn't match the lab-performed experiments (e.g. it was all from prior years' work). The administration wouldn't even let them give Fs just for the lab reports they knew the students had cheated on! Soured me on ever wanting to teach...

  3. @Mike: good luck! Like you said, someone needs to break through. I hope it's you.

    The idea of starting a school seems like such a bureaucratic nightmare that I hadn't even considered it for the situation I was describing. I just assumed an already existing school should re-make itself with only-the-best-students policies. Starting up from scratch seems Herculean.

    @Random: yeah I've heard horror stories like that more times than I care to think about. It's definitely a turn-off for teaching.