I have, instead, bad advice, more in the vein of How's your Perl? and How's your Perl? (II) The last time I used these questions, I asked a coworker who sat in on the interview about them. He responded by saying that if someone gets them right, you know they know Perl really well. If someone gets them wrong, however, you can't tell whether they know Perl well or not. I agree, so I'm more or less retiring them here.

Yes, I really did ask these questions of a candidate. In my defense, I prefaced my questions by saying that I was not very concerned with whether he got the answers right or not. These questions were meant to be "conversation pieces."

I think your answer is correct. My own answer looks at it from the perspective of where you might notice a difference in a real program. I think your explanation says more about how that difference arises (i.e., how it all works).

I'd decline any job offered by an organisation that asked these questions. The first couple were OK, but after that, you're just trying to catch the applicant out with trick questions involving code that they'd be sacked for if they used it in production code.

I'd leave the interview with the impression that you're just trying to show the others on the interview panel how much better you were than the applicants.

I'd decline any job offered by an organisation that asked these questions.

Mmmm ... and they'd probably be OK with that, because how you answer the question (or walk out of the room) is part of your answer.

An interview happens on many levels -- there's the basic, "Hi, How are you, ..." level, there's the technical level, and there's also the meta-technical level. For me, the meta-technical level is the most interesting -- sure, you know how something clever works, but can you explain it to someone so that they understand? And why was it necessary to do it that way? Can you explain your thought processes out loud as you go, so that your interviewers 'get' how you approach a problem? (In my most recent interview, I proposed a solution to a regex problem, was asked to explain it, started my first sentence, said out loud -- "Wait -- that won't work", paused, then proposed a second, different solution. Apparently, that approach works.)

This meta-behavior also helps them understand how you may well behave when you get stressed out doing too many things -- My response used to be to bark at people (don't get into that habit -- it upsets them), but now I look them in the eye and say "I'm in the middle of an emergency right now -- is your problem more urgent?" and wait for them to explain. I make a point to follow up a few minutes later, once my emergency is over, and deal with their emergency.

The first couple were OK, but after that, you're just trying to catch the applicant out with trick questions involving code that they'd be sacked for if they used it in production code.

I really don't think anyone gets the boot as a result of a code review. It has to be a combination of many factors, all pointing to the breakdown of the employer/employee relationship.

There shouldn't be any 'tricks' involved in interviews -- it's an exploration into whether there's a basis for a good relationship, based on mutual compatibility. But if someone has (unwisely) labeled themselves as a 'Perl guru', I guess they should expect a few of these tough questions.

Did you read the whole post? In the interview, the questions were posed as starting points for discussion, not as a pass/fail trick quiz. And that completely changes how I for one react to them. I'd say there are plenty of valid reasons for finding out what a candidate thinks about code like these examples!

My reaction to an interview along these lines would be:

Answer the questions.

Discuss why tricky code like that is bad, how I would refactor it, when and why I might choose to document rather than refactor, and the value of coding standards in preventing such code being written in the first place.

I'd make my decision on whether it was a place I wanted to work based on how the discussion went, not the questions themselves; and I'd hope the interviewer would judge me likewise.

I would be worried about hiring anyone for a perl job where at least some specific perl questions like this were not asked.
Consider:

1) perl gives you enough rope to "hang yourself from several trees while blowing your own foot off with it"

2) for those who can swim to the surface of programming language agnosticism, there's a justifiable reason for perl having the stereotype of being a "write once language"--it's called the swiss army chainsaw for good reason.

3) in the real world of businesses, deadlines, changing requirements, and general idiocy, all of us will at some point be forced down the slippery slope of committing bad code, designs, or architecture decisions, so at least try to minimize these flaws as an act of charity toward the poor sap who will end up maintaining your code.

A good perl programmer should be aware of a small handful of "bad perl idioms" and related gotchas, or at the very least be aware of the dichotomy between perl's blessing and curse: the flexibility of the language comes with a cost--you have to be careful with what you do or you can hang yourself with bad code. (Of course this is countered by the rewards gained by the smart use of tricks in the language.)
Someone who would walk obviously isn't interested in any of the above, and I'd also think twice about taking on a developer who isn't generally interested in puzzles related to programming languages themselves.
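To make the blessing-and-curse point concrete, here is a pair of the kind of context gotchas a Perl programmer should recognize on sight (the variable names and values are my own, purely for illustration):

```perl
use strict;
use warnings;

my @nums = (10, 20, 30);

# An array in scalar context yields its element count...
my $count = @nums;        # 3

# ...but parentheses on the left make it a list assignment,
# which grabs the first element instead.
my ($first) = @nums;      # 10

print "$count $first\n";  # prints "3 10"
```

Two assignments that look almost identical do very different things; that is exactly the sort of flexibility that rewards care.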

While you haven't said so, I'm guessing that what you have on your mind is something like what you said in Re^4: returning tied array, which is to say:

You can only return scalars and lists and (nothing) from subroutines. You can't return arrays or hashes directly, only as lists or references.

I suspect this statement is more meaningful to perl programmers than to Perl programmers. I haven't read perlguts, let alone perl source, but I'm guessing that under the covers somewhere, in the sea of C, it's really true that nothing can escape a sub besides a scalar, a list, or Nothing.

get_list and get_array return the same collection of values in different ways. They do different things in scalar context. This is why I think there's a difference between returning a list and returning an array—they behave differently.
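A minimal sketch of a get_list/get_array pair of the kind being discussed (my own guess at the definitions, since the thread doesn't reproduce them):

```perl
use strict;
use warnings;

sub get_array { my @a = (4, 5, 6); return @a }   # returns an array
sub get_list  { return (4, 5, 6) }               # returns a list

my @x = get_array();   # (4, 5, 6)
my @y = get_list();    # (4, 5, 6) -- same values in list context

my $m = get_array();   # 3 -- @a in scalar context gives its size
my $n = get_list();    # 6 -- the comma operator yields its last operand

print "$m $n\n";       # prints "3 6"
```

In list context the two are indistinguishable; only scalar context exposes the difference.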

One might say that they both do the same thing in scalar context—return a scalar (neither list, nor array).

I think these points of view are looking at the sub from different sides of the return. This is the difference between "imply" and "infer". It's the difference between what one says and what another hears. It's the difference between expression and interpretation.

If I scream "value" into the Void, have I still said something?

I think one could correctly say that sub { @_ } returns an array. It always returns an array regardless of context. Its caller may receive from it a scalar or a list or Nothing, depending on the context it provides, but what the sub itself does is the same every time.

Inside get_stuff somewhere there's a return with some "stuff" after it. I'm guessing that perl takes that "stuff" and turns it into something that makes sense to whatever "..." is, and it does that before the "stuff" gets out of get_stuff. So what actually comes out of the sub can only be a scalar, a list, or Nothing.
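That guess matches what wantarray exposes: the calling context is already known inside the sub, before anything "gets out". A small illustration (this get_stuff is my own stand-in, not code from the thread):

```perl
use strict;
use warnings;

sub get_stuff {
    # The caller's context is visible from inside the sub:
    # true for list context, false for scalar, undef for void.
    return wantarray ? "list" : defined(wantarray) ? "scalar" : ();
}

my @ctx = get_stuff();   # ("list")
my $ctx = get_stuff();   # "scalar"
get_stuff();             # void context; return value discarded

print "$ctx[0] $ctx\n";  # prints "list scalar"
```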

I could conceive of it being implemented differently. It could pass out of the sub whatever the sub "said" it wanted to return, and then coerce it into the appropriate type once it got there.

If that were the case, would we still say that subs can only return scalars, lists, or Nothing? Would they really behave any differently? More to the point, how is any of this distinction relevant to Perl programmers?

(There's an even subtler trick related to a fundamental truth about the semantics of Perl 5 in that example, which I only realized after I chose it.)

I'm guessing that under the covers somewhere, in the sea of C, it's really true that nothing can escape a sub besides a scalar, a list, or Nothing.

The implementation is what it is to support the semantics of Perl-with-an-uppercase-P. I'm not interested in an ontological debate as to which came first, but the internals could change if p5p decided that these language semantics needed to change.

More to the point, how is any of this distinction relevant to Perl programmers?

I find that correctly understanding the semantics of a programming language has a positive effect on the quality and correctness of code written in that language.

This reads to me like it is motivated by the all-too-common and deceptively flawed meme of "an array in scalar context gives its size while a list in scalar context returns its last element". I've seen that used to justify so very many flawed conclusions. You can also use it to make flawed justifications to quite a few "correct" conclusions, but that just demonstrates how the flaws in that meme are deceptive (which is probably part of why it is still "all too common").

The subroutine does /not/ return an array that then decides to give its size when it finds itself in a scalar context. There are many other ways that thinking "this sub returns an array" will mislead people. So it is better to divorce yourself from both of these flawed ways of thinking about context in Perl.

And, no, my objections are not based on some secret knowledge on how Perl works internally. They are based on repeatedly seeing people make mistakes about how Perl scripts would behave based on these ideas. There are lots of cases where these ideas give the right answer. But the cases where they lead to the wrong answers are surely more important.

if someone gets them wrong, however, you can't tell whether they know Perl well or not

I'd say the first two are important to know.

The next two fall into the category of "I wouldn't do that, therefore that code is suspect. I'm not sure what it does, but documentation needs to be added to explain what it's doing and/or it needs to be refactored."

The last one is just plain unimportant. The code dies quite loudly and reproducibly. A better discussion would be how to fix it.

Hey, that was fun! 5 out of 6 (the last one tripped me up). My explanations weren't as in-depth as yours, but I feel my interpretation of the end result justified a correct answer. I don't think I would have done nearly as well last year at this time. Thanks, Monks!

I hate these types of technical questions... I find they're just dumb!

If I could recite pi to X decimal places, would that be a good measure of my mathematical abilities?

If I were a Spelling Bee champion, would that indicate an ability to write better than other people?

NO!

There are more important skills than knowing the subtle differences between coding implementations (this is especially true when you have an interpreter to check your work).

What makes a good programmer? How about,

Can they write understandable code?

Can they abstract code to make it readable?

Are their coding behaviors consistent?

To me these are harder to evaluate but drastically more important. It is amazing the number of problems you can avoid with a good approach. A deep knowledge of a language's quirks and behaviors will not take you very far.

I have had horrible interviews with people who consider this intimate knowledge a sign of a good programmer.

These people also end up writing 600-line while() loops that no one can understand. So I have little respect for individuals who rely upon these sorts of evaluations.

Personally, I like to hire the type of person who finds these types of quiz to be fun. It doesn't matter if they know the answer, it's the attitude that matters. A person who loves programming sufficiently to care about these irrelevancies is likely to be a good programmer. The sort of person who views such questions as being about right and wrong answers is probably not an "-Ofun" type of person. The best way to determine a person's attitude towards this type of quiz is to ask a couple of questions. The ideal candidate will probably have one or two of their own to throw back at me. At that point the interview becomes fun, because I know I'm going to recommend to hire.

I rarely found these types of question fun. You're right that the attitude matters, but the people who use these questions are not looking for a good programmer; they are looking for a technical programmer, which is not the same thing. It is easier to identify a person's technical attributes than their qualities as a good programmer. Take this example object:

Can you see the problem? Technically they are both correct, but one of these basically creates havoc that cannot be overcome by intimate knowledge of the language.

The print statements display the same value because they access the same data. The call to access the data directly via the hash reference forces it to be a static implementation. It creates maintenance headaches and you cannot update your object without breaking code.
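A minimal reconstruction of the kind of object being described (my own sketch, since the original code isn't reproduced in this thread): both print statements show the same value today, but only the accessor survives a change of internals.

```perl
use strict;
use warnings;

package Counter;
sub new   { my $class = shift; bless { count => 42 }, $class }
sub count { my $self = shift; return $self->{count} }   # accessor

package main;
my $obj = Counter->new;

print $obj->count,   "\n";   # 42 -- goes through the interface
print $obj->{count}, "\n";   # 42 -- reaches into the hash directly

# If Counter later renames the key, or becomes an inside-out or
# array-based object, the first call keeps working and the second
# silently breaks.
```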

Focusing on technical trivia is not a replacement for development/design process.

Update: I did sloppy work on my object code (pointed out by shmem and I fixed it to reflect the problems he pointed out)

I always feel bad putting a candidate on the spot with "tricky" questions, but on the other hand I want to give exceptional candidates a chance to stand out. I have a favorite question which lets me sneak some advanced stuff in while not putting unfair pressure on non-advanced candidates:

Take a look at the following short snippet of code. There are some problems with it. Can you point out the problems and explain them?

I include syntax errors, programming logic errors, some really bad style elements and a couple of debatable style problems. Some of them are really obvious, but others are very subtle. It gives the good communicators a chance to show how well they can explain things, and the knowledgeable Perl programmers a chance to show off. At the same time it doesn't put anyone on the defensive, because there are a few easy-to-spot errors.

Another open-ended, conversation-starting question I like: write (or describe how you would write) code that performs some specified task, without using particular perl operators or functions. I asked one question like this during our last round of interviews -- manipulate a given array without using for/foreach/while/until -- and was pleasantly surprised to get several different correct answers. When I wrote the question, the keyword I was fishing for was "map", because a lot of beginners seem intimidated by it or completely unaware of it.
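For what it's worth, a couple of loop-free ways to transform an array, of the sort that question fishes for (doubling and filtering are my own arbitrary choices of task):

```perl
use strict;
use warnings;

my @nums = (1, 2, 3, 4);

# map: the idiomatic loop-free transformation
my @doubled = map { $_ * 2 } @nums;       # (2, 4, 6, 8)

# grep: loop-free filtering
my @evens = grep { $_ % 2 == 0 } @nums;   # (2, 4)

print "@doubled\n";   # prints "2 4 6 8"
print "@evens\n";     # prints "2 4"
```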

...but on the other hand I want to give exceptional candidates a chance to stand out.

This reminds me of an assessment technique I faced in high school once. A teacher from another school would come to the student and begin asking questions. At first, they were simple, and they'd gradually become more difficult. Once the teacher found the student's limit, the teacher would walk the questions back again. From the student's perspective, only one or two questions were too hard to answer, and they got to finish well. I could see using this method for technical assessment, but I'd have to have more questions available than I'd normally use.

I've had the "write code that does..." questions in interviews, and I've enjoyed them. I really like the code review method that you present also.

I've had the "write code that does..." questions in interviews, and I've enjoyed them.

Personally I've been through one interview with "write code that does..." and I really disliked it. But in part that was due to the fact that I was given a pen and paper (yep, pen, no erasing here). I don't write code on paper.

I was able to get a snippet of code that would work, but I write slowly -- especially if I'm trying to write legibly. So the whole thing made me uneasy and very self-conscious for the rest of the interview. If I'd been given a computer to type up the code (even a computer with nothing but Notepad) it would have alleviated a lot of anxiety. But maybe that's just me.

Fine questions, but they'll only tell you if someone is technically proficient. That's useful information but in my experience it's not nearly as important as whether the person is a good learner and cares about their work. I'd take a dedicated beginner over a careless guru any day. I still don't know how to interview for these qualities, but I wish I did!

Rather than just asking the candidate a few questions, I prefer to give them a task and a set time to do it in. The tasks that we use here (obviously I can't tell you what they are) test that he has sufficient clue about data structures, algorithms, and the language. We do *not* expect anyone to finish any of them, but can get a great deal of information by seeing how far they got and how they approached the problem.

The code they write can often lead to some quite interesting discussions.

If and when you do that, always give the candidate a collection of language reference-texts or free-access to a web equivalent. Be sure to emphasize to them that they are free to use any of those sources entirely without penalty. If they feel that they can't do the exercise and instead want to explain to you the approach that they would take, let them do it ... without penalty nor prejudice.

A good coder is fine, but a good conceptual designer who can present his or her thoughts and ideas to you in a cohesive and understandable way is infinitely better.

Another approach is to present a candidate with a block of code and ask them to explain, in their own words, what it's doing and perhaps what its data-structures look like. Ask them if they might have any comments or suggestions about the code. The code that you select for such a purpose should be the clearest, least-obscure code that you can find.

During all the community-college courses I have ever taught, students were allowed to have a hand-prepared “cheat sheet” with them during the exams. They turned-in a copy of those cheat-sheets with the exam. You could see their depth-of-understanding from the way in which they prepared that material, and I notice that the very best sheets were rarely used during the test.

Another important courtesy that I suggest, in these days of e-mail, is to send the candidate a detailed description of exactly what you intend for them to do during the interview. Consider sending them a preliminary e-mailed interview, not from Brain-whoeveritis, asking them to return their responses via reply. I'd have no problem at all telling anyone generally “what they are,” since each ‘exam’ when I actually sent it out would be unique. I'm not trying to test a candidate's ability to react to surprises, and I don't want to re-create grammar school with all of its anxieties.

Oh absolutely. They have the interweb, CPAN, all the man pages, and any books that we have in the office. We tell them in advance that there will be a programming test, but *not* what it will be. We give them an hour and a half to get as close to solving our chosen problem as they can. Giving them, say, six hours, or a day, would make it a lot harder for us to tell the excellent candidates from the merely good.

I think that I, too, would join the “walk out of the room” group. The mere fact that I was being asked such questions would tell me a great deal about the organization, including the fact that I would not want to work there.

I've been programming for ... well, for a very long time now ... and “picking up a new language” is frankly the work of a long weekend, at most. The task of understanding an obscure language-construct is the work of fifteen-minutes on Google.

... but the experience that enables me to know not to write such code in the first place, and to know to be repelled by it and to eliminate it (like kudzu) wherever it may be found, has taken ... well, a very long time.

Therefore, I frankly do not want to work for an organization that prizes its developers' knowledge of arcane language-lore. I don't want to have to deal with their code-base or with the rash of avoidable bugs that I know it will contain. I don't want to plunge into a nest of competing egos, because in such a nest there will be neither partnership nor communication. This will not be “a healthy place to work.” Instead, it will be a constantly-abrasive one that will grind you down, and life's too short for that. The best thing to do with such places (and they are legion...) is to avoid them at all costs.

The questions that you are asked during even the very first stages of the hiring process will tell you a great deal about what that organization values, what qualities it holds in high esteem, and how it defines its worth within the business organization in which it is situated. A company's interview process is a bright window straight into the personality and temperament of a fairly high-level manager whom you may never get to meet. They will reveal the organization's confidence (or lack thereof) in itself, and may illuminate the nature of the political image-battles which the organization fights.

I say again: interview questions are a magic mirror. A workgroup that peppers its candidates with obscure questions lacks confidence in itself, and therefore will lack confidence in you even if your name is Larry Wall. A workgroup that asks how you feel about teamwork and long-hours isn't a cohesive and well-managed team and pays for it with long hours. Per contra, a workgroup that talks about company-paid employee training early in the interview process is probably a well-run group that is on top of its game (as it should be), and confident-enough about staying there to pay attention to its members' professional growth and personal well-being.

Your reactions to being asked such questions will likewise tell you a lot about yourself. If you find that you have a visceral negative-reaction to it, do not ignore your ‘gut,’ no matter how badly you (think that you) want the job! It's tough to walk away from an interview, much less a firm offer, especially when you don't have another offer in-the-wings. But sometimes that's what you have to do. You want to “get to the ‘yes,’” but it must be the right “yes,” and the right one might not be the first one. If you are not satisfied with your job right now ... if it did not turn out to be what you expected it to be ... then unfortunately, you made a poor selection, too.

From the perspective of an interviewer, a big problem is that it's much easier for a candidate to BS you with a hand waving design issue than with raw code. You can hide a lot of ignorance behind an insistence on big-picture design issues. If I have 45 minutes (max 1 hour) with a candidate then it is worthwhile spending one or two minutes checking the fundamentals using this style of questions.

I sometimes compare this focus on fundamentals to the game of Go: strong players have a very strong sense of "direction of play" (i.e. strategy), but the way to get strong is to study life-and-death problems, and tesuji: the intricate tactical issues that weaker players ignore in favor of "power" moves (which work only against similarly weak players).

Of course, it depends on the resume. If you say "Perl expert", then I consider the questions fair: you should at least be able to get the answers wrong. If instead you claim "Ruby expert" then I wouldn't ask the same questions: I'd probe more for the ability to pick up new languages (hopefully such a candidate would know more than one, so I could find some common basis).

And how exactly does the interviewer know that your years of experience translate into skill? Given your years of experience, I'm sure you've met other developers who, even though they have many years under their belt, could barely write a "hello world" program.

Simply put, testing helps find out if the candidate is fibbing about their skills or not. And yes, a lot of people lie (or exaggerate if you wanna be nice about it).

I know that a fair number of candidates are fibbing, er, lying about their past experience. They have to, as long as gatekeepers filter out resumes based on the field-names they put into a “skills” array and the numeric value of that field.

The best solution that I have found for this phenomenon is to talk about soft skills in the job-requisition, and hope that enough of it makes it through the HR-gauntlet to be useful. Describe what the candidate will be responsible for, not in programming-terms but instead in business-terms.

A problem that you will very-frequently encounter is that candidates simply don't have “business” skills to begin with. Nothing in their formal education (that they might well have spent tens of thousands of dollars for) has prepared them for this. So they have studied “wrenches” for years, and they've maybe even torn-apart and rebuilt an engine, but if you make the mistake of asking them an abstract question you get a blank stare. But when I'm interviewing, it's those abstract questions that I want to get answers to.

One of my favorites: “In your opinion, what makes a Truly Great piece of software, and why?” Notice that there is no right answer. That throws a lot of people... I wish it didn't, and I don't mean for it to. It's another chapter of the story of folks who get out of school with a perfectly-honed ability to take tests, and no practical knowledge whatever. That's a failing of our educational and training system, not of those people.

Let me put it this way: “around here, we don't ‘write programs in Perl.’ Well, that's what we Little-D-Do. What we Big-D-Do is to build solutions to business problems for people who, quite frankly, don't want to give a damn about computers except to use them. We intend for them to find that our solutions are technically flawless (“but of course”) ... and to find that our solutions are great.”

I'll omit all mention of what programming languages we are using, if I can. The folks who don't particularly care what language we're using are the ones I want to talk to. The ones who know how to design-and-build Great Software... in anything.

It can, of course, be problematic to get these things through “the HR gauntlet,” and yet you have to work with these guys and do things their way, because they're the ones who make it their business to keep you from getting sued. “Hell hath no fury like a lover, er, loser-candidate spurned...”

I think the value of knowing arcane language-lore is not in its utility when writing but in its utility when debugging and refactoring (i.e., when reading). Yes, you can look up some obscure construct and figure out what it does fairly quickly. On the other hand, some constructs that do not appear obscure can have unexpected features anyway (the behavior of map versus for, for instance).
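One concrete example of a construct that doesn't look obscure but bites anyway: inside both map and foreach, $_ is an alias to the original element, so an innocent-looking assignment mutates the source array. (This sketch is mine, not code from the thread.)

```perl
use strict;
use warnings;

my @words = ('foo', 'bar');

# Safe: uc reads $_ without modifying it.
my @upper = map { uc } @words;       # ('FOO', 'BAR'); @words untouched

# Gotcha: $_ aliases the elements of @words, so this rewrites them.
my @also  = map { $_ = uc } @words;  # @words is now ('FOO', 'BAR') too

print "@words\n";                    # prints "FOO BAR"
```

Someone who can only look up what map "does" will miss this; someone who has internalized the aliasing will spot it in old code at a glance.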

Most (if not all) of the places I've worked have somewhere some old scary code written by someone who wasn't very experienced at the time. I want people who can read it and know what it really does, not just what it looks like it's doing or what the comments say it's doing.

Knowing the arcane can also reveal a passion for the subject.

Pretty much every conversation of strange constructions or obfuscated code that I've been in has included someone saying, "but writing that would be a bad idea anyway." If I'm talking to a candidate who doesn't say that, it makes me wonder. If I ever talk to a candidate who says, "I'll have to use that feature," that's almost certainly disqualification.
