Hiring Programmers

Human Resources people, Managers, and general users often have no idea how simple or complex computer programming is. They think they can just throw programmers from one task to another, and some HR people select programmers based on language (Syntax) rather than what really matters (skills and abilities). This would be like hiring an employee based on what school they attended and not what subjects they studied! This article will give non-programmers a better idea of what Programming is about, and what they should be looking for when hiring programmers.

Syntax

There are many computer languages. Non-programmers think that the language is what matters in programming. This would be like hiring an American auto-mechanic to add an addition to your house because he "spoke the right language". You shouldn't care about the language, but about the abilities and skills.

A Computer Language (like C, C++, Basic, FORTRAN) has perhaps 50 common commands (do-while, if-then-else, for-next, etc.), and about 50 symbols that define actions (+, -, *, /, =, etc.). Of those, about 60-80% will be in common with other computer languages. All the command names and rules for a programming language are collectively known as "the Syntax" for that Language. The basic structure of a computer language has far more in common with other languages than it has differences, so when you learn one language you are learning the concepts for most of them. More modern languages (C++) are more complex than older ones (like FORTRAN), but newer languages still aren't very complex. This has changed a bit with the complexities of Object Oriented Design and Programming, but not THAT much (for non-specialist jobs).
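
As a rough illustration (my sketch, not from the original article, written in C++ only because it is one of the languages mentioned above), here is that small shared core in action. The same structure could be written almost line-for-line in Basic, Pascal, or FORTRAN; only the spelling changes:

    #include <cstdio>

    // A handful of constructs nearly every language shares: variables,
    // if-then-else, and a counted loop ("for-next" in Basic, "DO" in FORTRAN).
    int main() {
        int total = 0;
        for (int i = 1; i <= 10; i++) {
            if (i % 2 == 0) {        // even numbers are added
                total = total + i;
            } else {                 // odd numbers are subtracted
                total = total - i;
            }
        }
        printf("total = %d\n", total);   // prints: total = 5
        return 0;
    }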

The scope of Syntax is small. A programmer who knows a Language (say Basic or Pascal) can be programming in a new language (say FORTRAN or C) in a day or two. To be pretty good at the new language will take a week or two, but they can be creating code during that learning process. There are idiosyncrasies with each language, and little gotchas that will take a month or two to get over, but the little mistakes they make while learning cost maybe 20-30% in productivity (compared to average). If they are any good, they'll be over that quickly. Even if they become a "guru" of a language (the top few percent), and know that language inside and out (which can take years), they are probably only another 10-20% more productive than the average programmer in that language. So language skill (Syntax) is not the key to being a good programmer.

The issue of productivity is not how well you know the syntax, but how well you know the scope of the problem you are solving, and your personal motivation to solve it. I've known some people who are mediocre "coders" as far as knowing all the details of a syntax, but who still make short work of almost everything a company needs.


To really utilize a syntax to its fullest, you would have to write code that only other gurus could follow. That means the others (the majority) can't maintain the code, because of all the esoteric "quirks" and tricks that are far too unique to that language to be good form (since most other "average" programmers won't know those tricks). It is better for the company if the experts learn to program like they are just "average" coders (in syntax), so that most of their peers will be able to understand and maintain that code more quickly (if they leave or are busy on something else). I don't want coders who try to show off their abilities by writing code that's more expensive to maintain, polluting the program with code that no one can follow.

A good example is in C. Instead of writing with the readable and self-documenting construct of "if-then-else", gurus use the symbolic conditional expression "? :", which means exactly the same thing, but which many average programmers may not even recognize.
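
A minimal sketch of the two equivalent forms (my example, not the article's; shown in C++ to keep all the examples in one language, though it is identical in C):

    #include <cstdio>

    int main() {
        int score = 72;

        // Readable, self-documenting form:
        const char *grade;
        if (score >= 60) {
            grade = "pass";
        } else {
            grade = "fail";
        }

        // Equivalent "? :" (conditional expression) form -- terser, but easy
        // to misread for programmers who rarely use it:
        const char *grade2 = (score >= 60) ? "pass" : "fail";

        printf("%s %s\n", grade, grade2);   // prints: pass pass
        return 0;
    }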

The point to all this is that mastery of a Language's Syntax is no big deal. A basic understanding of a Syntax is usually a matter of days or weeks. What language a programmer knows is practically irrelevant compared to the other issues and larger learning curves.

Libraries

A "routine" is some code, that is meant to do one "thing" well. Maybe it is drawing a line, or a character on the screen, or sometimes it is drawing a whole picture or displaying a menu, or one of a thousand other functions. A program is usually made up of a lot of routines (thousands or tens of thousands).

A library is a collection of prewritten routines (all related to one common area). These prewritten libraries help make a programmer more productive, because they don't have to add all that functionality to every program themselves. Many languages are not only a Syntax; they also come with a set of "libraries".

Learning all the Libraries for a language takes far longer than learning the Syntax itself. For example, C comes with a library of routines to manage text strings (a string is just a bunch of letters, often grouped as words, sentences or paragraphs), and another (Standard Input/Output, or STDIO) to put them on the screen, read characters from the keyboard, and so on. Let's say there are 50 different "routines" in those libraries alone, and there are many libraries. These routines behave like an "extension" to the language. So a programmer (who wants to use a library) not only needs to know C, but also needs to know that library. It takes weeks just to learn one of them, and there are quite a few different libraries that come with C, and many others that come with different development environments.
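
For instance (my sketch, not the article's; again in C++, which shares C's standard library), knowing the Syntax is not enough here. The programmer also has to know that these routines exist, what they are called, what arguments they take, and their quirks (like making sure the destination buffer is big enough):

    #include <cstdio>    // the Standard Input/Output library (printf, ...)
    #include <cstring>   // the string library (strlen, strcpy, strcat, ...)

    int main() {
        char greeting[64];
        strcpy(greeting, "Hello, ");        // copy one string into a buffer
        strcat(greeting, "world");          // append a second string
        size_t length = strlen(greeting);   // count the characters

        printf("%s (%zu characters)\n", greeting, length);   // put it on the screen
        return 0;
    }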

Development Environments -- Each company that sells tools will often add functionality (libraries) to a language to add value to their tools. So Microsoft has its own libraries for its version of C, as do Symantec, Metrowerks, Borland, Apple and others. Each environment's libraries are usually "unique" (but similar). These custom Libraries (and other environment complexities) can be as complex to learn as a new Syntax (including a language's standard libraries). Again, we are usually talking about learning curves measured in days to be usable, and weeks to be pretty proficient.


Company Code

As companies develop their own solutions (programs), much of the added value or competitive advantage is in the code they create. But that code is added complexity, and it often takes the form of a set of libraries as well as Application(s).

These custom libraries (unique to each company or application) and the applications themselves are usually far more complex to learn than a new Syntax. (We are sometimes talking weeks to become usable, and often months to become proficient.)

It is far more costly to train programmers in all these libraries and custom code than it is to teach them a Syntax. Some programmers will never work with the "standard libraries" at all, so despite being "C" experts, they are probably less than halfway to doing what you may want them to do. And that's assuming they are a menial programmer who just codes; if they need to be a higher-level programmer or team lead, they need to understand the business processes that are unique to the business and company in order to make better solutions for the customers. That can take years.

The point is that Libraries are often far MORE important when making hiring decisions than the language, and your business is more important still. But many managers or Human Resources people never think of that. What matters most to companies is NOT the Syntax or Libraries a programmer knows, but the quality of the programmer himself (and how fast they learn), and that is hard to get from a Resume.

Disciplines and specialties

There are also areas of expertise (or disciplines) -- think of specialties in medicine. Just because a Doctor is a good M.D. does not make him a good Dentist, Surgeon, Psychiatrist, or even a good Oncologist. These are all areas of expertise (or disciplines). The same "specialization" applies to computer programming.

There are areas like Networking and Communications, Low-Level Drivers, Database Programming, Graphics, User Interface, etc., and each has hundreds (or thousands) of unique problems associated with just that area alone (that discipline). Just because you know a language doesn't mean you know diddly about the discipline a company wants you to work in. In fact, the discipline or area is often far, far more important to how productive someone will be than the Language or even the System they are using.

Management or Human Resources often think of "programmers" as a generic resource, as if they can just replace one with another. They throw a low-level driver person (who works in assembly language) at a User Interface problem (to work in C), and think they will get adequate results. This is about as smart as using your proctologist as a brain surgeon. Of course, for the companies doing this, that might actually be effective, considering where their brains seem to be residing.

Systems

An Operating System (like MacOS, Unix or Windows) often has many (dozens of) complex libraries; each library can have hundreds of procedures (also called Functions) and dozens of structures associated with it. These System Libraries (called APIs) are individually far more complex than Language Libraries, and there are far more of them. They are often grouped by functionality, based on the specialty or problem they solve (and there are many, many specialties).

I've heard that Systems like MacOS or Windows have literally 15,000 - 30,000 different "routines" that can be called. Think about that. Compare it to Standard Language Libraries (a few hundred routines) or a Syntax (50 commands) to get a relative idea of the problem. Some companies will throw a Windows programmer at writing a Mac Program because the Mac people available don't know the Syntax (language) they want. If that sounds a little insane to you, don't worry -- the business world sometimes is.
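
To make the layering concrete, here is a small sketch (mine, not the article's) that calls the operating system's API rather than the language's standard library. It assumes a Unix-style (POSIX) system; a Windows programmer would reach for entirely different calls to do the same kind of thing, which is part of why switching Systems is a much bigger jump than switching languages:

    #include <cstdio>
    #include <unistd.h>       // POSIX OS API: getpid(), and thousands more
    #include <sys/utsname.h>  // POSIX OS API: uname(), describes the system

    int main() {
        struct utsname info;
        if (uname(&info) != -1) {                       // ask the OS about itself
            printf("System: %s %s\n", info.sysname, info.release);
        }
        printf("Process id: %ld\n", (long)getpid());    // another OS-level call
        return 0;
    }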

Now there is a lot of commonality between some OS's; much of early Windows was ripped off in architecture from the Mac. If a programmer is shifting platforms but staying in their area of expertise, they can often make the switch just fine (since the commonality in their specialty is greater than the differences between the platforms). But if you are throwing a general programmer, with a complex Program, onto a different platform (System), it can take weeks or months before they are truly productive -- the sheer number of libraries (and routines) they need to know can be overwhelming, on top of many other platform-specific gotchas.

Procedural vs Object Oriented

There are two totally different ways to think of programming. One is called Procedural Programming and the other is Object Oriented Programming (OOP, or OOD for Object Oriented Design). These are completely different paradigms (ways of thinking).

OOD programmers can often work with Procedural Programming, but the opposite is less often true. Sometimes procedural programmers just write "Object Oriented" code the same old way they know (as procedures), resulting in really bad code. It can take months or even years for Procedural Programmers to really "get it" and think in Objects.

Object Oriented Language experience does not guarantee that a programmer understands OOD/OOP. They could be using an Object Oriented language the old (procedural) way. They really need to understand concepts like Polymorphism, Encapsulation, Inheritance, and Data Hiding. I use my programmers to interview other programmers, and then I listen to them; they can get to the real technical capabilities of the interviewee.
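
A minimal sketch (mine, in C++) of what those concepts actually look like in code; the names (Account, SavingsAccount) are invented for illustration:

    #include <cstdio>

    // Encapsulation / Data Hiding: the balance is private and can only be
    // changed through the object's own methods.
    class Account {
    public:
        explicit Account(double opening) : balance(opening) {}
        virtual ~Account() {}
        void deposit(double amount) { balance += amount; }
        double getBalance() const { return balance; }
        virtual void monthlyUpdate() {}      // default: do nothing
    private:
        double balance;                      // hidden from outside code
    };

    // Inheritance: SavingsAccount reuses everything Account already does.
    class SavingsAccount : public Account {
    public:
        SavingsAccount(double opening, double rate) : Account(opening), rate(rate) {}
        void monthlyUpdate() override {      // Polymorphism: its own behavior
            deposit(getBalance() * rate);    // add interest
        }
    private:
        double rate;
    };

    int main() {
        SavingsAccount savings(1000.0, 0.01);
        Account *account = &savings;         // treated as a generic Account
        account->monthlyUpdate();            // still calls SavingsAccount's version
        printf("balance: %.2f\n", account->getBalance());   // prints: balance: 1010.00
        return 0;
    }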

Frameworks

Frameworks are basically Object Oriented Libraries. Code (procedures) is grouped into Objects (in OOP), and collections of Objects are grouped into Frameworks. These Frameworks are often the skeletons of working programs, and all the programmer has to do is add a little functionality to get an Application written. Basically, instead of writing a program, they just change the program that is already written in the ways they want. This can make some OOD programmers more productive than older procedural programmers. (This is the motivation behind creating OOP/OOD in the first place.)
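
A hedged sketch of that "skeleton" idea (the Application class and its method names are hypothetical, invented for illustration, not taken from any real framework): the framework supplies a program that already runs, and the programmer only overrides the pieces that should behave differently.

    #include <cstdio>

    // Provided by the (hypothetical) framework: a working application skeleton.
    class Application {
    public:
        virtual ~Application() {}
        void run() {                 // the framework drives the program
            startup();
            handleEvents();
            shutdown();
        }
    protected:
        virtual void startup()      { printf("default startup\n"); }
        virtual void handleEvents() { printf("default event loop\n"); }
        virtual void shutdown()     { printf("default shutdown\n"); }
    };

    // Written by the programmer: override only what needs to differ.
    class MyApp : public Application {
    protected:
        void startup() override { printf("open my window\n"); }
    };

    int main() {
        MyApp app;
        app.run();   // framework code calls back into MyApp where it was customized
        return 0;
    }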

The gotcha is that a Framework may have hundreds of classes, each with dozens of methods (functions). This is one of the most complex things a programmer has to learn. Shifting between frameworks can easily take weeks, and often months. Some are more similar to each other than others, but a framework change is still a huge effort for programmers. In fact, it is so huge that the cost of choosing a framework, and getting over the learning curve, can outweigh the gains you'll get from it; so you need to choose wisely.

Middleware (Java, Cocoa)

Imagine a Framework that would allow you to program once, and then deliver that App on multiple platforms. That Framework would be like a huge library that sits between programmers and the OS (in the middle). Better ones are implemented on multiple platforms so that you can reuse them everywhere. This middleware is really just a specialized prebuilt framework to help programmers create more powerful solutions more quickly.

(Note: the term middleware is also used in Database terminology. Some databases have multiple stages (or tiers), and the stage between the others is called "middleware". Middleware is broader than that database-specific term, even though the database usage is the more common one.)


These middleware frameworks have large learning curves, but can offer large productivity gains. Each of them is so complex, and has so many caveats, that you need to start thinking about hiring "specialists" when using them, or you will pay the price (in lost productivity and workaround/debug time).

Conclusion

Hopefully, I have explained some of the complexities of hiring programmers (and what programmers need to know).

The Syntax is of low importance, despite the archaic tendency of HR people and many managers to care about it. Programmers (worth anything) can pick up a Syntax in no time (a day). What takes far more time (money) is learning a System, a Specialty, a Framework, or a whole methodology or paradigm (like Object Oriented Programming).

The individual programmer's ability to learn and adapt is far more important to a company than what Syntax they know. This assumes you are looking for an employee who will be with you for years. If you're looking for an in-and-out contractor, then you pay for them to be specialized in exactly what you need (including syntax).

Libraries, frameworks, and systems all grow in complexity and thus take more time to learn (overall). But the most important stuff is your own solution and systems; that's going to have a larger learning curve (to be good at) than most languages. So you need to hire programmers based on their ability to learn, adapt, grow, and help your company. Their motivation and ability to grow, learn, and help you solve your problems are far more important than what they've done in their past. If you understand these concepts, then you understand more than most Human Resources people and hiring managers. Hopefully more of them will read this article, and the computing world will be a more sensible place.

Written 1998.01.08