Web development is hot right now. Really really hot. In Atlanta, code school grads with little experience are getting snapped up as soon as they graduate. I'm having to look to Eastern European remote talent because salary expectations in the US are unreal.

Literally every company needs a web presence, and it's quickly getting to the point where the usual crappy UX just isn't cutting it.

It also happens to be as hard as or harder than most other types of software engineering. You have to stay on top of trends and keep building your skills. You don't deal with algorithms much, but your OOP needs to be on point if you hope to build something maintainable for the web.

Most of the potential talent has a subtle disdain for web work; everyone wants to be a game dev or do work that's math-heavy or algorithmic.

> It also happens to be as hard as or harder than most other types of software engineering.

I would disagree with this point, as evidenced by the fact that people coming out of code schools are still getting hired. Doing software engineering in other fields (financial, embedded systems, game dev, operating systems, enterprise LOB, cloud platforms, crypto, big data/distributed systems, etc.) takes a lot more experience and training. A competent programmer can crank out a Rails-based Mom & Pop small business site in less than a day. You can't really crank out Bigtable, or Unreal Engine. However, the market for small business web sites vastly exceeds the market for Bigtable.

> A competent programmer can crank out a Rails-based Mom & Pop small business site in less than a day.

I've never written any code in C, but given a tutorial I'm pretty sure I could create a client-server application to exchange the time in milliseconds between two hosts. Would you use it instead of ntpd on a production server? Of course you wouldn't.
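For concreteness, here's what that toy looks like, sketched in Ruby rather than C (all names invented): a server that replies with the current time in milliseconds and a client that reads it. It "works", which is exactly the point; it has none of ntpd's latency compensation, clock discipline, smoothing, or security.

```ruby
require "socket"

# Server: picks a free port and answers each connection with one line of
# ASCII digits -- the current Unix time in milliseconds. That's the whole
# "protocol".
server = TCPServer.new("127.0.0.1", 0) # port 0 = let the OS pick a port
port = server.addr[1]

server_thread = Thread.new do
  client = server.accept
  millis = (Time.now.to_f * 1000).round
  client.puts(millis)
  client.close
end

# Client: connect, read the line, parse it. No round-trip correction at all,
# so network latency silently becomes clock error.
sock = TCPSocket.new("127.0.0.1", port)
remote_millis = sock.gets.to_i
sock.close
server_thread.join
```

A tutorial gets you this far in an afternoon; the decades of engineering in real NTP implementations are everything this sketch leaves out.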

Likewise, I'm very sceptical when I read about programmers who learned JavaScript in a weekend or wrote a Rails application in one week without knowing Ruby. Is it possible? Yes. Do I trust the app or the programmer? Absolutely not.

Web development is not considered programming by friends of mine who do (Linux) kernel development, write (or try to write) driver code, do reverse engineering, and whatnot.

But to create a modern website these days you need to know lots of things: CSS3, HTML5, Rails, Ruby, JavaScript (Angular? Ember?). You also need to know how to combine them, how to set everything up to mitigate attacks, etc.

Ruby and Rails are easy to learn but hard to master; the same goes for JS (with all the bells and whistles the language comes with, which the various frameworks try to paper over one way or another). There's also Go, Python, PHP... The same goes for all these languages: easy to learn, hard to master.

So, any language can be hard or easy depending on a wide variety of things, but web-dev is not easy these days. It's easy to be mediocre, it's not easy to be good.

It's a steep cliff. Everyone says the barrier to entry is low, and it is, but there's a ginormous difference between a Mom & Pop site and something suitable for even a small business. E-commerce is sorta simple, but the constant need for third-party integrations will bury you if you don't work hard to keep everything simple. And the demand for those integrations just keeps coming.

There's also a much larger number of technologies you have to work with. With financial, embedded, game dev, OS, crypto, or big data, you have a finite number of languages/tools to learn; once you learn them, you spend the rest of your career perfecting them. With web dev, you have to not only perfect your tools, but constantly integrate new tools and languages and paradigms into your workflow. It's a lot harder than it looks to do well.

And just because code school grads are getting hired doesn't mean they're making useful contributions from day one. Guaranteed it'll be some time before they're really worth their salary.

Web development's difficulty is a shallow complexity that doesn't require (deep) problem-solving ability so much as a willingness to tolerate tedium.

The kicker is that, in my opinion, almost all of the complexity in webdev is superficial, and to a high degree artificially introduced (by choosing an inappropriate application data transport protocol, HTTP), precisely because of the perception that developer talent in the field is cheap (cheaper than the alternatives, at least).

Edit: and your other comment suggesting you can offer somebody $60k implies there is not a real shortage of talent (good or otherwise) and that you consider webdev to be a commodity, and not actually "hard".

- web development is a monstrous sub-field of software engineering. As others have pointed out, just about everything has some sort of web presence somewhere. There's a pretty big difference between someone doing something groundbreaking and novel and someone making the 920375235236th standard CRUD-oriented Java enterprise site. The latter camp will grossly outnumber the former by multiple orders of magnitude, but I think it's clear that what vinceguidry is referring to is the former (at least I hope he is).

- People tend to think that their sub-field of the software world is some sort of special snowflake that makes it harder to deal with than the others. They're aware of the tricky bits of their own sub-field because they have a ton of specialized knowledge, but then they look at the others, see mainly the surface-level bits, and declare, "Oh, that's easy!".

- Everyone has different strengths and weaknesses and will tend to gravitate towards the areas where they'll excel. People also like to think that they're smart and that their coworkers are smart, so they assume that all the smart people are gravitating toward wherever they themselves are heading. To loop back to the OP a bit here, HN has a strong bias towards Bay Area startup culture, which has a strong bias towards web development (generally of the former category I described above; at least they'd all self-report themselves in that category), so everything is going to be seen through that bias.

In my circle of software friends we span a pretty broad range of the industry. Over time, the view we've taken when discussing each other's worlds is that we all could rapidly learn to do the jobs the others do; however, most of us wouldn't want to. Personally, my point of pain is when I start hearing about UI-oriented stuff, but when I start talking about reworking algorithms to scale from GB to PB, that's when they start tuning out. Neither's harder; they're just different.

People always think this when they think CRUD. The reality is, even CRUD can get quite complex. Even just laying out an HTML page properly takes a lot of knowledge. Understanding the 'zen' of CSS. The zen of HTML5. The zen of jQuery. What looks simple on the surface can get very complicated quickly. If you don't respect that complexity, then you'll find it tedious, and won't apply your problem-solving mind to it, won't abstract that difficulty away, and will find it unbelievably painful to maintain and add features to later on down the line.

I'm making what looks on the surface to be standard CRUD: a product info management app. Well, the product data itself is key-value pairs, and the design calls for being able to edit which keys you can put in. So there's no set database schema, and I don't want to do anything like a two-column table of values with a zillion rows, so I'm using Postgres hstore. Now I have to maintain an abstraction layer over that, which has gone through a few iterations and might have to withstand a few more.
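A minimal sketch of that abstraction-layer idea, with invented names (the real app sits on Postgres hstore, which stores text key-value pairs; a plain Hash stands in for the hstore column here so the example runs without a database). The point is that the rest of the code goes through the layer and never touches raw pairs:

```ruby
class Product
  def initialize(properties = {})
    @properties = properties # stand-in for the hstore column
  end

  # Read a dynamic attribute; returns nil for keys that were never set
  def [](key)
    @properties[key.to_s]
  end

  # Write a dynamic attribute; no schema constrains the key set
  def []=(key, value)
    @properties[key.to_s] = value.to_s # hstore values are text, so coerce
  end

  def keys
    @properties.keys
  end
end

product = Product.new("color" => "red")
product["weight"] = 12
product.keys # => ["color", "weight"]
```

When the storage iteration changes again (say, hstore to jsonb), only this class has to know.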

It makes creating views difficult because you don't know just how big everything's going to be. So I might spend a whole day laying out the list page, diving into the minutiae of CSS rules, thinking carefully about all the different types of data it's supposed to display and deciding where to put certain types of logic so that I can reuse it later if needed.

The thing about CRUD is that to really do it right, you have to also think carefully about the data domain. If you're not building flexibly, which takes up-front time, then when you have to move stuff it's going to feel really painful. I realized that I don't want the ability to add new SKUs to be accessible to normal users, so I moved it into the admin section.

It took five minutes and was as straightforward as it sounds. But if you're not all that great at web dev and/or don't respect the power tools at your disposal, such a move could take all day and introduce subtle bugs in your UX that you won't catch until next week when you're working on something else.

If you really are trying to re-solve a solved problem, like blogging, then don't do any programming at all, just fire up Wordpress. Absolutely solving solved problems again is going to be tedious. But if you're building a CRUD app, and you can't find something interesting about it, then you're not really applying your full mind to it. There's a reason you're being asked to build it, because the functionality they need isn't being offered elsewhere.

I built a vehicle reservation system that was "standard CRUD" but they wanted this dashboard style view that took up half the dev time that I thought was really interesting and turned out great. The rest of the CRUD came out easy, we have lots of tools at our disposal for generating that sort of thing.

I struggled with how to phrase that and eventually gave up hoping the ridiculous number I supplied would suffice.

I agree w/ everything you said, but I'd argue that oodles of software jobs out there are solving solved problems and doing it over and over and over again and these were the sorts of jobs I was trying to evoke.

The argument I'm trying to make is that it's not the job that's boring, it's the programmer that refuses to find something interesting about the job out of the mistaken belief that it's tedious or that it's a solved problem. It's not a solved problem. If it were solved, there would be a software package out there with a huge community, loads of extensions, that you could just use and be done with it. You wouldn't need any programming talent, just a guy smart enough to run an installer. Blogging is a specific case of CRUD that has just such a software package, Wordpress. Even so, Wordpress isn't the perfect solution to all instances of the blogging problem. Maybe you want a static one, maybe you want one in Haskell. Each combination is quite interesting in its own right. Hell, there's lots of things that Wordpress could be doing better.

There's lots of things out there that are half-solved, like the more general case of CRUD. Here there are frameworks that allow you to solve specific CRUD problems, but each specific instance of CRUD, for it to really be solved, needs its own software package just like Wordpress with a huge community and easy installer. The vast majority of them don't, so you can't call them solved. That means there's still interesting things to learn about them.

Remember 2048? I remember a lot of people saying it wasn't a terribly interesting game. Bullshit it's not! I'm building a Ruby library to run the game (didn't like any of the ones I found), then I'm going to start building an AI engine to solve it using heuristics. When I play 2048, I've gotten up to 8192 a few times, and I'd love to be able to program an AI to play like I play, and to build tooling to help me visualize my AI and watch it operate. 2048 is plenty interesting; if you don't think so, it's you that's the problem, not 2048.
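A toy sketch of the heuristic idea (not the actual library; names and weights are made up): score a 4x4 board by rewarding empty cells and keeping the biggest tile in a corner, two rules human players lean on. An AI then tries each legal move, scores the resulting boards, and picks the move whose board scores highest.

```ruby
EMPTY_WEIGHT = 10 # made-up weight: reward open space
CORNER_BONUS = 50 # made-up bonus: reward keeping the max tile anchored

def heuristic_score(board)
  cells = board.flatten
  score = cells.count(0) * EMPTY_WEIGHT             # more empties, more room
  score += CORNER_BONUS if board[0][0] == cells.max # biggest tile in corner
  score
end

board = [
  [64, 8, 2, 0],
  [16, 4, 0, 0],
  [ 2, 0, 0, 0],
  [ 0, 0, 0, 0]
]
heuristic_score(board) # => 150 (10 empties * 10, plus the corner bonus)
```

Real solvers layer on monotonicity and merge-potential terms and search a few moves deep, but the skeleton is this: a position, a number, and a comparison.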

I solve problems once. Once solved, if I need the solution again, I refactor or extract the needed solution to a library and reuse it. If I find myself writing the same kind of code enough times to where if I continued it'd become tedious, I switch gears and start looking for an abstraction.

Web applications 'grow'. Their growth needs to be managed. If you don't manage it well, your business needs suffer horribly.

> and your other comment suggesting you can offer somebody $60k implies there is not a real shortage of talent (good or otherwise) and that you consider webdev to be a commodity, and not actually "hard".

No, it means that we've tried the corporate approach and found it a non-starter, so I have lots of leeway to drive the hiring process. There just aren't any good candidates willing to work for the money we're willing to pay. I have to lead up front with our salary because it's happened too many times that once salary comes up, the whole process breaks down.

> There just aren't any good candidates willing to work for the money we're willing to pay.

...so pay more money? Or take a chance on the not so good candidates and have a great training program? Or start an internship/co-op program at a local university and find cheaper talent. Get creative.

Unless "the company" is an AI from the future, "the company" is made of people that can possibly be influenced to try something new. Except nothing I said is revolutionary. If you want to hire experienced devs, you'll have to pay more than entry level.

Agreed. While I completely disagree w/ the statement that webdev is de facto more difficult than other softeng fields, there are difficult challenges that they face.

However, I wouldn't touch webdev with a 10-foot pole, at least not the front-end stuff. While not intellectually complicated, there's something about that stuff that makes my eyes cross, my head hurt, and my brain melt.

Another reason contributing to the shortage is that universities are not emphasizing it. The university I went to doesn't even have a web development course. There was one experimental summer course, and then it never happened again. Meanwhile, the Android development class went on to become very popular.

So what's left is that those who work in web development are either self-taught or go through one of those dev bootcamps. But people who go into these programs don't necessarily have a CS background, and to be a "good web programmer", you do need that kind of background.

Note that universities haven't traditionally seen it as their role in society to teach job skills, believing that breadth of knowledge and subject area fundamentals are more difficult, more important, and more widely applicable, and that companies can and should foot the bill to train their employees on the details of their jobs. You can certainly argue that this is an outdated view, but it isn't a new thing.

Databases? Network protocols? 3D graphics? I tend to put these in the same league as web programming; they're job skills. Yet there are plenty of university courses on these subjects. So why not web programming?

First of all, I think these are really good, tough questions, and figuring out what universities should be versus what trade schools (or as we seem to prefer, "code schools") should be versus what makes sense for community colleges, is really interesting and fluid.

I'm not sure what a good definition for "fundamentals" would be, but I'll speak to your examples one by one, which I think gives a decent sense. My databases class talked about what databases are, why they came to be historically, an overview of the different approaches, and a discussion of their trade-offs. My networks class talked about why we started creating networks of computers historically, gave an overview of a bunch of different approaches, talked about trade-offs, discussed why TCP/IP+ethernet/wifi has become the most common end-user deployment, and talked about how it all works. My 3D graphics class was very mathematical, discussing how and why the 3D transformations work, with mostly toy algorithm implementations. This wasn't one of your examples but my programming languages class followed a similar schematic of "history, survey, trade-offs" as databases and networking (ditto for operating systems, and some others). I imagine a web programming class following that pattern to give a sense of how and why the web works. But that is pretty different than learning how to make applications using rails/nodejs/ember/angular/react/whatever like we do in industry.

Actually, when I was in school, this model really frustrated me; it seemed out of touch with industry, and like I wasn't getting the specific skill buzzwords I needed on my resume to get my first job. But looking back on it, I'm really glad to have gone through a program focusing more on history and concepts rather than technology and details. It's harder (though of course not impossible) to self-learn the broader subject matter, and the details are ever-changing and need to be kept up with constantly anyway.

I think a really good model for universities is broad coursework in combination with an aggressive internship program and industry-sponsored project classes.

I once got a lowball (cough) offer that was so bad, I wouldn't have been able to pay my (unusually low) rent, buy gas for my (paid off) high-mileage car, and make my (unusually low) monthly student loan payment. Health care was not included.

My experience is that employers with that attitude aren't worth working for. These employers don't respect my career, nor do they have a realistic understanding of how to make a profit.

Please don't treat me like an idiot. I've been studying OOP concepts for quite some time now. I'm quite capable of figuring out any deep concepts you care to throw at me. If you can't explain them in your own words, fine, but don't expect me to believe you have any real insight on this question because of that. I could watch that whole video and still not know what you were trying to argue.

Doing OOP in languages like Java and Ruby tends to involve classes, objects, encapsulation, and state coordination. Now, the truth is, state is really hard to maintain. You have this piece of information that everyone is trying to change or grab at; it's gonna end up leading to race conditions and locking. It may work for a small app, but as your application scales up, it's a maintainability disaster. So you said OOP done right helps. But OOP was the wrong paradigm to begin with; it's like saying "Horse-riding done right gets you anywhere faster". I'm not saying everyone who has been programming in OOP is wrong; it totally makes sense to use OOP because it's easier to model real-world things with. But no one ever questions, "This paradigm may be facilitating my understanding of the problem, BUT is it the right paradigm?"

> You have this piece of information that everyone is trying to change or grab at; it's gonna end up leading to race conditions and locking.

Why is everything trying to grab at state? I call information that everything wants access to 'data', and I manage that accordingly. Each application interested in it grabs the data, preferably stored in a database selected specifically for the needs of the data and how it's accessed, parses it into objects, does the operation it needs to on it, and then perhaps writes a new record of data. The objects can go away as soon as they go out of scope, leaving the data available to construct a new object when it's needed.
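A sketch of the pattern just described, with invented names (an in-memory Hash stands in for the database so the example runs): data lives in a store, a short-lived object is constructed from it for one operation, a new record is written back, and no shared mutable object survives the operation.

```ruby
STORE = { 1 => { name: "widget", stock: 5 } } # stand-in for the database

class Item
  attr_reader :name, :stock

  def initialize(record)
    @name  = record[:name]
    @stock = record[:stock]
  end

  # Returns a new record of data instead of mutating anything shared
  def sell(quantity)
    { name: @name, stock: @stock - quantity }
  end
end

item = Item.new(STORE[1]) # parse the data into an object
STORE[1] = item.sell(2)   # write a new record back
# `item` can now go out of scope; the data remains, ready to construct
# a fresh object the next time it's needed
STORE[1][:stock] # => 3
```

Nothing here is "state" contended between parts of the program; it's data, mediated by the store.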

People say to store state in a RDBMS, I think that's ridiculous and a perversion of OOP. Program state belongs in memory, not on the network. It's not intended to be tabular, an object's state often consists of references to other objects. I sure hope you're not storing these in a database.

An object's state is only supposed to be accessible through its interface. It's bad OOP to have other parts of the program interrogating its state directly. It's bad OOP to have an object interact with more than a few other objects. If you find yourself violating that, then you're treating data as state, and you need to start managing that data separately, through a persistence layer.

Maintainability means being able to alter a program's behavior without having to understand the whole thing or make drastic edits. If you follow the rules of OOP, and don't just say you're doing OOP because you're using classes and stuff, then you'll have earned maintainability: you'll be able to change a class's internal behavior without affecting the rest of the program, because the rest of the program goes through an interface rather than needing deep knowledge of the altered class. And you can change the interfaces too, only touching the two or so other classes that use them.
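A minimal sketch of that maintainability claim, with invented class names: callers depend only on Cart's interface (`add` and `total_cents`), so the internal representation could change (say, from a bare price array to line items with discount logic) without touching a single caller.

```ruby
class Cart
  def initialize
    @items = [] # internal detail; nothing outside the class sees it
  end

  def add(name, price_cents)
    @items << { name: name, price_cents: price_cents }
  end

  # The interface the rest of the program uses; its implementation can be
  # rewritten freely as long as this method keeps its contract.
  def total_cents
    @items.sum { |item| item[:price_cents] }
  end
end

cart = Cart.new
cart.add("widget", 250)
cart.add("gadget", 199)
cart.total_cents # => 449
```

If callers instead reached into `@items` directly, every representation change would ripple through the whole program; that's the difference between doing OOP and merely using classes.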

I'd be interested in learning about the demand in Atlanta. Which languages? What's a reasonable salary expectation?

I did a Ruby on Rails bootcamp in Chicago a while back. At the end, I decided to take a mid-six-figure job in another industry, and now I'm wanting to go back.

It's been two years, and while I've been able to save a couple years' reserve, I'm basically starting over. I lean towards Ruby on Rails because at one time I was offered an entry-level position, and I know maybe a dozen developers who went through the same program.

I have the money/time and the determination (in a previous job I taught myself chemical engineering, so Ruby on Rails was much easier).

Reasonably, you can probably expect $80K-$95K, or over six figures if you shop around. Or I could make you an offer for maybe $60K (send me an email; it's in my profile). I can put you in touch with a great recruiting firm. But since you have the time to look around, you should start coming to the meetups; there's usually a good turnout. Everyone I meet there is working at a company that's looking for at least one dev.

I don't consider it pedantry. I know where you're coming from, though. If I hear that a developer position offers a "six-figure salary", I figure that means $105k. But that's not what a "six-figure salary" means for a lawyer or a doctor. So I think it would help us all to treat that phrase more literally, and more in line with what it means in other industries.

> It also happens to be as hard as or harder than most other types of software engineering.

This remarkable statement is probably true if we correct difficulty levels to the average ability of the population of "web developers".

Doing stuff that's "math-heavy or algorithmic" often involves years of training before you are even remotely effective. People are not snapping up the equivalent of "code school grads with little experience" to design and build optimizing compilers, computer vision systems, high performance distributed systems, operating systems, etc.

My experience is that as soon as I mention that I'd soon be moving back to my home country (in Eastern Europe), the conversation dries up, as 99.9% of the SV companies that contacted me are only looking to move talent to SV and don't want to have remote positions.

Because this site has been overrun by (and/or revolves around) Silicon Valley entrepreneurs who don't give a damn about something unless it's in the form of a single-page noSQL web-scale big-data in-the-cloud CrappucinoScript imperatively dysfunctional HTTP2 Wangular.js monstrosity written by "hipster rockstar ninja devs" wielding Macbooks and plaid shirts and pocket calculators and half the inventory of ThinkGeek in a mockery of actual programming? ;)

More seriously, it's because web developers are in high demand, so there are going to be more postings for them; having a proper web presence is absolutely vital to modern businesses, and that requires developers to establish that presence. Hacker News is also run by Y Combinator, which specializes in funding startups - a market that leans very heavily on web development, since many of those startups are based on web apps - and will therefore already have an inflated share of web development jobs by that virtue alone.

Journalists and novelists are both writers, but they do not compete in the same niche.

The former will need to do several hundred writing jobs per year, while the latter may only need two. Thus, 99.5% of writing jobs may be for journalist stories. Any novelist looking for commissioned work will not want to sift through 200 posts to find even one relevant listing. So people posting such jobs would probably get better results on a site that explicitly excludes the noise.

In short, there are more "lightweight" postings because the people who solicit them and do them need to secure new work more often. If you can make 20 websites in the same time that you could build one enterprise application, you will probably see that job advertisements are 95% for websites and 5% for business software.

The demand for web devs might be artificially high right now, as most normal people haven't yet discovered that they don't really need a programmer to build their website. I think Squarespace / Wix / WP etc., maybe with a custom design, should cover most websites.

I've been doing back-end web programming as a freelancer for a few months, but it's really hard to find good clients, so I'm moving to something else.

All YC companies build technology, but many (most?) are building technology for markets that traditionally have not used technology in a modern way. These are things like cleaning (HomeJoy), flower delivery (Bloomthat), t-shirts (Teespring), etc. These technology-enabled businesses are the ones that primarily use web/mobile, and thus all the jobs.

If you are referring to the monthly jobs threads, I was wondering the same thing. I would have expected more people searching for mobile developers (especially freelancers), but I guess mobile apps are not yet the new "website".

Over the past 6 months I've had multiple clients drop their mobile requirement to focus on web. I think it has mainly to do with scope creep and cost.

All those extra features that you can cram onto a web page take that much more effort on mobile. People, for some reason, seem to understand that mobile costs a LOT of money, so they wait until later. But they expect to be able to build the web side really cheap, so they stick with that and argue price.