
How can we best inform potential students of the nature of CS?

Computer Science Educators Asked by Buffy on December 30, 2020

Recent questions deal with the problem of too many applicants to CS programs, both in high school and university. Yes, this is a problem when only a limited number can be accommodated for reasons of resource availability. One of the easier issues to deal with is whether the student's past performance indicates potential success, though even this is hard. If the student hasn't studied Computer Science in the past, what courses are predictors? But this question is focused elsewhere.

Some students, in periods of high demand, choose a field simply because of its popularity at the time. While they may have the aptitude and background for it, their interest may be shallow or even misinformed. It would be good to help such students make better choices about their futures by informing them early of the nature of the field and how their future would likely play out if they stay in it. The goal is to reduce the number of drop-outs: students who start a program, but leave it before completion. This potentially wastes the student's time and resources as well as the institution's. It also risks denying other students an entrance slot that they might have had.

In particular, such students may not know all of the things that CS professionals do on a daily basis. What are the interesting things? What are the boring things? What are the risks? What are the rewards? What are the frustrations? What will they ultimately need to do to become successful, in either the commercial or academic world?

What resources can be brought to bear to ensure that possibly naive students get a good look at the profession, so they can better decide whether they want to enter it?

Google provides a lot of information for “What is Computer Science?”, including the summaries given by many colleges. However, most of it stresses only the positive things and so paints a somewhat incomplete picture.

In the early 1960s, as a teenager, I visited a local computer center. They had what was likely an IBM 650 with drum memory. I got to see what folks did back then, and I recall that I wasn't very impressed. I studied math instead; CS wasn't an option then. Something more was needed in my case to generate interest.


For the record, I’m interested in drawing an accurate picture of work in the field, not horror stories or overly optimistic projections.

5 Answers

If the main issue is that the not-so-great side of programming (to pick one area as an example) is not disclosed, then tell people point-blank, in an admissions conversation or while describing programming to groups:

Programming is one of the most frustrating and difficult things that humans have ever devised to do.

I say that to all my students right out, usually the first day. I also tell them about the big egg-shaped rock I used to wave threateningly at the (glass CRT) screen when I got particularly frustrated. And I tell them about the time I had to go back to the bookstore to buy a different book about the intricacies of writing DOS interrupt routines for networking with IPX over Novell: the first book had a missing line in a code example.

It certainly is true that one can succeed with poor background and preparation. I hated math classes, though they probably would have helped me get better at programming faster (the first career I chose, and the one I attended college to pursue). There is no substitute for being really interested in something, interested enough to defy your parents and change your major in college, interested enough to press on through many difficulties, come what may. "Being carried along is not enough."

Loving it even when you don't really know what it is will work; I do not know what else will. As it says about qualifications in the book At the Feet of the Master:

Love, if it is strong enough in a man, will force him to acquire all of the other qualifications, and all the rest without it would never be sufficient.

We are supposed to do what we love. Freud said that the keys to a meaningful life are Work and Love. Really, they are one.

Answered by user5288 on December 30, 2020

Every career has "gotchas". There is something about every job that is undesirable. That's why it is called a "job". But students can make sure they are matched as well as possible - minimize the "gotchas", so to speak.

I entered CS because I loved to write code. I stumbled into the field as a junior in college and immediately changed my educational course. I am thankful for that "accidental" encounter, but it wasn't an accident, I don't believe, that I loved computing as soon as I was exposed to it. My "desire" had been predicted. I remember taking a career assessment of some sort as a sophomore in high school. I remember the career assessment saying I was a good match for "computer science". I remember thinking "what the heck is that?" (this was prior to the PC revolution), and I remember immediately dismissing the results of my assessment. But the assessment was correct.

I say all that to say this: students should be encouraged to use career and interest assessments in high school, maybe even repeatedly (once a year). A quick Google search reveals many free assessments, and my state's college foundation provides 7 of them for free in one place.

Up until recently, the students at my high school had been required to do several of these assessments as sophomores. Unfortunately, with a change in management, that practice ceased. But I still have my students do them as part of my class. We all should, I think.

Using assessments such as these at least helps students better understand themselves, and may prevent them from making a very bad career choice. Of course, sometimes the assessments will be downright wrong. I recently had a student, one of my very best programmers, who loves to write code, is good at it, and knows she wants to go into CS, and an assessment told her she should be an artist. Even so, I don't believe it was entirely off track, because her code is very creative and is an artistic outlet for her. But sometimes career and self-assessments will help a student think outside the box about what they want to "be". And that is a good thing.

Another thing my school does is bring in professionals to share about their jobs. They speak briefly about their backgrounds, what they do, what they like about their jobs, and what they dislike about their jobs. They usually speak during lunch in a classroom. The kids eat their lunches while they speak. It is very informal and always well attended. The feedback from the students is always very positive. Our local CDC (career development coordinator) organizes these "Lunch and Learn" sessions, as we call them. We have 5-6 a year.

These are tangible things you can "do" to help students get ideas about careers. Some students may still fall headlong into something that is not a good match for them, because of tunnel vision, or peer pressure, or parental wishes. But some of them will identify or reinforce what they love to do, and they will be willing to make the trade-off: you pay me to do what I love X% of the time, and I will put up with the 100-X% of the time I have to do stuff I don't necessarily like.

Answered by Java Jive on December 30, 2020

This is not a complete answer, and it misses the "drawing an accurate picture of work in the field" goal, but I think it shows an approach worth mentioning.

I'm not sure you can show the picture accurately without a hands-on experience with coding.

The goal is to reduce the number of drop-outs: students who start a program, but leave it before completion.

That may not be the best metric. Even if it's actually possible to tell apart people with basic programming ability from those lacking it, that's not the whole solution. Some people may be quite capable but still dislike the process. So there will always be people studying CS who eventually want to leave.

Maybe it's better to just make these people leave quickly.

The way my university approached this was to put a fast-paced programming course with a focus on algorithms in the first semester. The course, apart from teaching the obvious skills, had the explicit purpose of failing or discouraging all the people without the skill or interest necessary to finish the studies¹. Then, in the second semester, it was followed by a sizable GUI application written as an individual project.
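For readers who want to convey concretely what such a course demands, here is a minimal sketch of the kind of algorithmic exercise a first-semester course like this might assign. The Sieve of Eratosthenes is my illustrative pick; the answer does not say which exercises were actually used.

    # A toy first-semester algorithms exercise: Sieve of Eratosthenes.
    # Illustrative only; the course described above may have used
    # entirely different problems.
    def primes_up_to(n):
        """Return a list of all primes <= n."""
        if n < 2:
            return []
        is_prime = [True] * (n + 1)
        is_prime[0] = is_prime[1] = False
        for p in range(2, int(n ** 0.5) + 1):
            if is_prime[p]:
                # Cross out every multiple of p, starting at p*p.
                for multiple in range(p * p, n + 1, p):
                    is_prime[multiple] = False
        return [i for i, flag in enumerate(is_prime) if flag]

    print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

A student who finds writing and debugging this sort of thing engaging rather than tedious has learned something useful about their fit with the field, which is exactly the signal such a course is meant to produce.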

While I don't have any numbers, I believe it worked reasonably well. It seems that whoever survived the first year and was willing to stay was quite capable of finishing the undergrad program and finding work in IT.

However, there are some caveats.

The main issue with this approach is that it's strongly negatively biased. It discourages all the people who should be discouraged, but it probably also discourages some who happened to be just a little bit too slow. And while a university soaking up all the best talent in the country can afford to be picky, I'm sure this doesn't work everywhere, at least not at this scale. It must also be trickier at a private school. My university was public, and people generally paid no tuition.


¹ Actually, the course, called "introduction to programming", had two variants: imperative (mainly for beginners) and functional (for people with some experience). The first one was the sieve; for the second, it was mostly a non-issue.

Answered by Frax on December 30, 2020

The fix is in accepting that failure, sometimes repeated failure, is not merely an option, but is absolutely necessary and an integral part of success.

The goal is to reduce the number of drop-outs: students who start a program, but leave it before completion.

Is it? Who says so? Is that yet another law of Nature we forgot to be told about? There are academic environments where drop-outs are how students learn about things not working out; they are pretty much a given... I dropped out of an automation and robotics M.Sc. degree at age 18, only to take Ph.D.-level robotics classes a decade and a half later, on another continent, and enjoy them immensely. No regrets. It's good that I attempted it and dropped out of that life-sapping, uninspired quagmire!

The corollary: sometimes you do want to enter something just to appreciate that it's not for you - not necessarily in the absolute, but at least at a given moment in time, or in a given mode of presentation (this includes institutional culture). And trust me: institutional culture matters a whole darn lot. There are teaching institutions that do a superb job at the lectern, yet whose culture may be wholly unbearable to a particular student. Whenever I think about studying just about anything in place X, I want to throw up: no "resources" would have helped beforehand! I had to try it and leave.

But unfortunately I know where your question comes from: the deeply American obsession with an educational system that financially ruins a good chunk of the students who enter it. The way to avoid the ruin is never, not even once, to avoid drop-outs. Instead, it is to appreciate that the fantasy of a student getting a loan, entering a major, finishing it, getting a job, and then paying the loan off in a reasonable amount of time is completely unrealistic for most people. It does everyone a disservice to wrap the educational system around this fantasy, trying to prevent drop-outs so as not to ruin students' finances, or to avoid slipping in some "rankings" or in eligibility for student loans.

Failure is an integral part of success. Until American academia wraps its thick collective head around this fact, there's no salvation, only misery. Life continuously bears this out, yet people pretend the problem is elsewhere. It's nuts.

In other words, the question as stated is totally begging the question in the most horrific way possible: the assumptions made ruin lives. I'm not joking.

Answered by Kuba hasn't forgotten Monica on December 30, 2020

Teach students coding in high school.

In Australia, the national curriculum makes teaching students coding mandatory up until year 10, after which it becomes an elective. As a result, you can expect most school-leavers entering university to have at least some experience with the negative aspects of IT, and those who would be disinclined to work in IT would be less likely to apply for an undergraduate IT degree.

Answered by nick012000 on December 30, 2020
