
This article was published on May 30, 2017

Computer science students should learn to cheat, not be punished for it



Computer science students are constantly getting into trouble for lifting entire blocks of code from the Internet.

Yesterday, the New York Times published a fascinating piece about academic dishonesty in the computer science field, which it says is rampant.

Here are some eye-catching figures. At UC Berkeley, 100 out of a cohort of 700 computer science students were discovered to have used code that wasn’t entirely their own. At Brown University, almost half of all academic honor code violations involve CompSci students. And at Purdue University, two students were caught after they handed in projects that had 100 identical lines of code.

It’s not a new phenomenon either. In 2010, Ars Technica reported that 22 percent of all honor code violations at Stanford related to plagiarism in computer science.

Is the problem that computer science students are just bad, unethical, dishonest people? Or is the issue that the way people write software in an academic setting is completely at odds with how programmers work in the real world?

Plagiarism is a skill

It’s safe to say that in the academic world, plagiarism is a cardinal sin. There is nothing worse. If you get caught, you face sanctions, or even expulsion. In the case of the aforementioned Purdue students, they received a zero for their work and were docked a letter grade.

But in the professional world, things aren’t quite as cut-and-dried.

When you’re a professional coder, the priority isn’t to demonstrate originality with each line and algorithm, but rather to complete tasks as efficiently as possible.

In practice, this means consulting sites like Stack Overflow and Reddit to solve the problems you can’t crack on your own. Within a workplace context, plagiarism isn’t a vice, but a skill. It takes aptitude and understanding to look at how someone else solved a problem and integrate their solution into your own code.
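To make that concrete, here’s a minimal sketch of what that skill looks like in practice (a hypothetical example of my own, not one from the Times piece): a list-chunking recipe of the kind widely shared on Stack Overflow, adapted and attributed rather than passed off as original work.

```python
# Hypothetical example: a common list-chunking recipe, of the kind
# widely shared on Stack Overflow, adapted and attributed rather
# than passed off as original work.

# Adapted from a Stack Overflow answer on splitting a list into
# equal-sized chunks (link omitted; illustration only).
def chunked(items, size):
    """Yield successive chunks of `size` items from `items`."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Integrating borrowed code means understanding its behavior: here,
# the final chunk may be shorter than `size`, which suits paging.
if __name__ == "__main__":
    pages = list(chunked(list(range(10)), 3))
    print(pages)  # [[0, 1, 2], [3, 4, 5], [6, 7, 8], [9]]
```

The point isn’t the snippet itself; it’s that adapting it correctly requires knowing what it does at its edges.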

There’s a certain irony here: in fields outside of computer science, plagiarism is a sign that you didn’t understand the question. Within computer science, the opposite is true. Not only have you found an acceptable solution; you’ve understood it well enough to use it within the parameters of your own project.

Or, as the writer T.S. Eliot once said:

“Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different.”

Universities aren’t teaching job skills

This debate surrounding plagiarism is indicative of a wider discussion about the role universities play in training the next generation of software developers.

Universities teach computer science not from a job-training perspective, but with the aim of producing researchers and academics. This is evident not only in the curriculum, which emphasizes theoretical concepts unlikely to be used in the workplace, but also in the expectation that students behave in ways they never would in the real world.

The consequences of this are obvious. Within the UK, computer science suffers from staggeringly high graduate unemployment. Students who graduated in the academic year 2014 to 2015 faced a ten percent unemployment rate. That figure is higher than for those who studied mass communications and documentation, physical sciences, or engineering and technology.

Maybe it’s time we rethink how we teach computer science.

Perhaps, for the majority of students who don’t wish to become academics, but rather software developers, there should be an alternative to the current offering. Ideally, this would look something like the software development bootcamps that have sprung up everywhere.

This line of thought has its supporters. In 2008, Stack Overflow founder Jeff Atwood penned an article arguing that university computer science programs should be reformed to reflect present-day industry requirements.

The same year, CrossTalk — a defense software engineering publication — argued that computer science education was failing to teach basic professional skills. Interestingly, it said that the practice of teaching Java as a first language was partially to blame. This is an argument I’ve got a lot of sympathy for, myself.

“It is our view that Computer Science (CS) education is neglecting basic skills, in particular in the areas of programming and formal methods. We consider that the general adoption of Java as a first programming language is in part responsible for this decline.”

So, what would an ideal programming course look like?

Not only would it focus on teaching the fundamental skills required to be a developer; it would also emphasize the workplace skills that make a well-rounded professional. Off the top of my head, I’m thinking about source management with Git, testing, project management techniques, and devops fundamentals.
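To give one hedged example of what the testing side might look like: a course unit could have students ship unit tests alongside their code, as in this short Python sketch using pytest conventions (the `slugify` function and its tests are made-up exercise material, not from any real curriculum).

```python
# Illustrative sketch of the "testing" skill: a small exercise
# function plus the pytest-style unit tests a student would write
# alongside it. (`slugify` is a made-up exercise, not a real API.)

import re

def slugify(title):
    """Turn an article title into a URL-friendly slug."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

def test_slugify_basic():
    assert slugify("Students should learn to cheat!") == "students-should-learn-to-cheat"

def test_slugify_collapses_punctuation_and_spaces():
    assert slugify("  CS -- 101:  Intro  ") == "cs-101-intro"
```

Grading the tests, not just the code, would reward exactly the habits developers are expected to have on the job.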

And students wouldn’t get penalized for using Stack Overflow, because that’s what developers do in real life.
