I've been an avid Evernote user since the beginning (one of the first few thousand users). I use it to record all sorts of ideas, thoughts, notes, reminders, research, and references.

One year ago, my girlfriend was using Evernote (on my suggestion) to write her travel journal on our trip to Southeast Asia. I saw her note sync a bunch of times (the iOS app shows a little blue arrow when it's uploading). But one day she opened it and the note was gone. I contacted support but they couldn't do anything. (They offered her a year of free Premium service and "apologized for the inconvenience".)

Since then, I've stopped recommending it to people because I don't want to feel personally responsible if they lose notes too. I also have a tinge of doubt every time I record important information. My biggest worry is Evernote quietly losing a note, because once I record something in Evernote I typically push it from my internal memory.

On top of that, their iOS app is incredibly slow. When I want to quickly jot an idea down, it's very inconvenient.

I've started using SimpleNote lately, which is far faster, but I don't know to what extent I should trust it to keep my data safe.

I remember the day a year and a half ago when I went out apartment hunting in a new town, looking at my notes on apartments in Evernote's Android app. It was a complex note, with lots of text in deep hierarchies of bullet points. At one point I tried to edit it, and after a few visual glitches, the text of the note disappeared. Then it synced, and there was no undo or history option in the app as far as I could tell.

I was able to get the note back by driving back to my hotel, retrieving my laptop where the note was cached, and opening Evernote while offline to ensure it wouldn't sync and wipe out that copy. Pretty frustrating. I've learned some tough lessons about cloud services and free stuff.

But even putting aside these syncing issues, it's a really terribly designed piece of software. There are so many issues with it, UI-wise, that just bug the hell out of me. (See the note below for an example of a badly designed feature.)

Still, Evernote fills a need I have that unfortunately there is no other solution for. And while it's taking a long time, it is gradually improving. So I'm still hoping that, one day, Evernote fulfils its destiny and becomes as amazing as it could be.

Note: Example of a badly designed feature: tagging support is crazy-bad. You can tag things, and you can even organise tags into a tag hierarchy - except, no you can't, because it's only supported on some platforms. And the "support" for it is purely visual - selecting a "parent" tag doesn't auto-select the child tags, so it is basically no help. So let's go to solution 2, which is to tag things with a prefix, like "History\Middle Ages" and "History\US". But now their generally awesome tag-completer will be annoying, since it will force you to type "History\" before getting to the point. So let's reverse the tag, like "US (History)". No, that won't work, since you can search tags by prefix (e.g. search for anything with a tag starting with "History") but not by suffix. Even though you can browse these tags through the UI, you can't find them with an actual search, so you can't select them.

I take lots and lots of paper notes. I even make my own notebooks according to my system, which has evolved a lot over the past 5 years.

Since 2008, I've made yearly Evernote migration attempts. So far, Evernote doesn't come anywhere close to the paper notebook + smartphone camera duo. On every count, except possibly search, it is inconvenient, complex, slow, and less reliable.

If Evernote's core handwriting-recognition (HWR) functions were available as a one-button app on my phone (like Camera), that would be a really strong value proposition for me.

Otherwise, the value of notes depends on their being within immediate reach (ideal: on the wall, open on the table). Every additional tap, click, or one-second wait halves their value.

Navigating a complex, unreliable app, paying for it, and worrying about privacy/reliability/bugs altogether makes it less than helpful for me - hence a no-go.

This. I'm always particularly annoyed by the tech support when I've tried to submit bug reports. One time I found a reproducible bug in the Chrome Clipper and even offered a possible explanation/solution for what was happening, and the person first insisted that it wasn't happening. I couldn't believe he was telling me what wasn't happening on my screen when I was looking right at it. I pay for Premium, so I next requested to be put in contact with a developer to submit a bug report and was denied. Finally, like the author, they asked me for activity logs, which I refused to fork over because they seemed too personal, so instead I just put up with a buggy clipper. I wish they focused less on selling socks and more on the software. [https://www.evernote.com/market/feature/socks?sku=SOCK00106]

I've been a paying customer almost from the start. Unfortunately, as Evernote has expanded, it's gotten less and less useful for me.

Their web clipper is great, the best around IMO (especially since Clipboard folded). However, there's no way to exclude those clipped pages from search, so after using the clipper for a while, searching for just about any phrase returns mostly irrelevant results. Ideally it'd be possible to filter by source, or to have default searches exclude certain types of content.

Another example of this is that I have a well-curated and geotagged Travel Notebook (this was actually much harder than it should have been since their geocoder is picky and you can't really massage it). I'd love to be able to see these notes on a map, but the "Atlas" map view that Evernote provides doesn't let you filter by notebook (or anything really).

Evernote does a great job of making it fairly painless to capture notes and despite the author's problems, has generally worked well on syncing everything. It's never done a good job for triaging/filing/finding or organizing notes though, and it seems to simply get worse as you use it more (and with each redesign). Evernote seems to want to encourage you to put "everything" into it, but as you do, it becomes harder and harder to get what you need out of it. Honestly, I'm baffled at how the Evernote devs/designers use it.

I really hope Evernote's take-away from this is that they need to scale back development on all their auxiliary stuff - Hello, Food, whatever - as well as all but the most critical feature requests, and focus as much as possible on making the core experience bulletproof. I would _hate_ to have to give up Evernote, but like others here, am extremely apprehensive about the possibility of losing data.

One stop-gap they might be able to implement quickly would be a scale-up of their version control. They could throw money (storage space and bandwidth) at the problem, increasing the number and frequency of revisions stored. Certainly not as good as preventing loss in the first place, but reliable versioning would help minimize catastrophic loss in the meantime, and would still continue to be valuable once things are more stable.

While this post must be pretty distressing for the Evernote team, their response time is pretty impressive! Within a couple of hours of this post being published, Phil Libin had already contacted him. See the edit at the end of the post:

"Update: Evernote CEO Phil Libin contacted me and we spoke about the issues described. He apologized, saying the post rings true and that there is a lot of work to be done both on the application and service fronts and that he hopes my impression will be reversed a few months from now."

I think this is a direct consequence of how thinly they're spreading themselves out across multiple platforms. They have a native app for every mobile and computer platform, along with web, plugins for every major browser, and then the other apps - skitch, penultimate, clearly, hello, etc...

It truly flies in the face of the "do one thing and do it well" mindset that many other companies subscribe to. It's a shame too, because I love Evernote. I truly do live in it... true to Phil's vision, my mind is thoroughly mapped out throughout my Evernote account.

I totally relate to this note on two levels: one, as a user, having had Evernote go wacky on me and just flat-out lose something it used to have saved; and two, as an engineer, having worked on systems that were not designed but instead evolved at the hands of people "getting things done."

The latter aspect is the most intriguing, because if Evernote is in fact evolving and not designing, they are vulnerable to being out-executed by someone with good design principles. I sometimes wish I could look inside their system and see how it is put together, and sometimes I worry about what I might find there if I did.

Yep. Lost 3 hours of writing once. No chance of recovery apparently. Now I've learned to not edit existing notes on mobile devices if I can avoid it. Copy the note to a new one and edit that so I have a backup.

So I've never used Evernote before, but after reading this article I decided to try it out, since I know a lot of people who swear by it. My first-time user experience was awful... I immediately started creating test notebooks, test notes, etc. just to get a feel for how it worked. Within seconds the app was freezing on me every time I tried to delete something (this was on a brand new iPad Air). I had to either rotate the device or put the app into the background in order to unfreeze it. This is a common scenario and I can't believe there isn't quality control for that. I'm a developer and I understand not having time for edge cases... but freezing on delete? I can repro it 100% of the time.

I've been meaning to move off of Evernote because their OSX client is just slow as all hell. Are there any alternatives, aside from something like Dropbox? Evernote's OCR implementation was/is really useful.

Definitely thought some of these issues were just me. My hypothesis is that Evernote gets increasingly unstable the more notes (and data) you have stored. Which sucks, because I have over 1200 notes and I pretty much need Evernote at this point. The desktop app is damn near unusable - it takes forever to create a new note.

I no longer trust that they will always have all of my notes, so I started to back them up to Dropbox via the HTML export. But I'm lazy, haven't done it for a while.

Perhaps this is an opportunity for a new company to do what Evernote is doing, better. Automatic backups to Dropbox, lightning fast no matter how many notes are stored, reliable and instant syncing, etc.

All developers know that feeling when using an app: you're dealing with something a little half-assed. Evernote has always had that feel for me. Switching over to something else, preferably based on flat files using something like Markdown, is on my to-do list.

I evaluated Evernote once two years ago. The iOS app crashed as I was appending to a text note. The app lost a half hour of unsaved meeting notes. I never trusted Evernote again. In my mind, a shoddy rich text editor cast serious doubt on the durability of Evernote's distributed revision control.

I don't mean to downplay this but seriously, make backups of all data that is important to you. Lots of things can go wrong everywhere and they do go wrong. Maybe it is a sucky app or maybe your own error -- if you have a backup, you don't need to worry.

I know this is no excuse for Evernote's app being at fault, but if something matters this much to you, you should not be trusting anyone or anything and the only way to stay safe is to have backups in multiple places. Might seem like a PITA but it is worth the effort.

I had to go find the original version of skitch, built by a different company, because of how badly the Evernote team massacred it. They just seem like a company more interested in high level bullshit than actual user experience.

I've had a lot of the same issues in the Windows client. One particularly annoying bug is how a simple slip of hitting backspace in the wrong spot can delete an audio recording with no possible way of undoing the operation. I've started using the service less and less over the years, to where I barely open it at all anymore, relying on services like git and Dropbox instead.

Several years ago[1] I installed Evernote on my Mac and used it pretty regularly for a few months, then just tired of the sluggishness and fragility of the app. Mind you, I've never installed the web client, none of the browser extensions, nor have I used it within a browser. It was always the OS X app for me.

Spurred by this post (nicely done, btw) I went and gave a look at what was inside my old Evernote account. Nothing. Everything's gone except the myriad folders and tags I'd added to help keep everything organized. It's a ghost town now.

I guess I don't really care the stuff is gone since I'd given up on the app long, long ago. Still, I can't help wondering what I'm missing, if there was anything truly important that marched in line & jumped off a cliff along with millions of other users' data.

I've been using the desktop app for quite some time now, and it's good enough for the most part. Just recently I was considering moving to the Android app when it hit me that the Android app doesn't even have an option for so-called local notes, even on a premium subscription. Whatever you write there will eventually be synchronized with the cloud. That's a no-go for me; I'm both paranoid and dealing with rather sensitive information.

Though I can't find a reasonable substitute on Android. Most apps in this category focus on capturing notes easily or on some to-do/calendar angle, and very few have a good set of features for organizing and navigating a vast db of notes. Springpad has the same notebook/tags system and pays a good deal of attention to the organization part, but alas, it is a web app with no option for private local notes.

Shit, here I am finally taking the time to see what Evernote is, and finding that I could use it in my day-to-day. Yet now there is no way I could ever trust it with my data; everything I would put in it is important to me.

I just realized that I've stopped using my Evernote account ever since I upgraded to iOS7...and that's just because it kept crashing on me even if I had a brand new phone. This post just reminded me to cancel my Evernote pro-account and move all my stuff over to -- I guess, Google Docs? (as others have recommended). For receipts, I've been using the Flickr app to auto-upload my iPhone photos into a private folder. Then, I just send download links to accounting for reimbursement. It's even easier than Dropbox -- and it's free for up to 1 TB. The security issues are scary...it'd be interesting to see Evernote's reply.

Both my wife and I have also lost notes in Evernote. I'm a paying member and just last week exported everything out into text files within Dropbox. I have to get the workflow right but at least I know my files won't disappear.

I've been using a Windows desktop app for years that's a lot like Evernote. I have many MB of notes, especially code snippets, but also lots of other stuff - exactly what Evernote is intended for. I would love to migrate it all to Evernote to get cloud access to it all, but my experience with Evernote is that it is just not trustworthy.

My fear with the desktop app is that Evernote is killing it. It's a great app, though. Never let me down, not once. Never crashes, never lost a note. And it has more features, more flexibility in formatting, and the ability to deeply nest what Evernote calls notebooks. But the UI look & feel is very outdated.

You probably aren't familiar with how "guarantees" work here in South America, ugh.

See, companies like Samsung and Toshiba have "certified" stores that "take guarantees", but they are not bound by the parent company; they are privately owned stores that just negotiated with the parent company to use its "sticker".

I bought a Philips shaver, and under warranty, the Philips station wanted me to pay 70% of the cost of a new one, despite it being a DoA device.

So while the sticker works as it should in the US and Europe, South America has a god damn wild west scenario. Anything goes, and if you don't like it, buy something else. Yep.

I purchased a Toshiba laptop in 2002. Within 3 months, the laptop's graphics card failed. Toshiba does not repair their own laptops; rather, they send them out to some 3rd-party repair center. The repair center took 3 weeks to repair the laptop. When I came back to pick it up, the laptop started, but the screen turned off as soon as I picked it up from the counter. I left it with the repair center. 2 weeks later they called again. This time it worked for a day before dying again. The 3rd time, they took another 3 weeks to repair it. After that, it worked for a month and died. I gave up and got a new laptop. Since then I've never purchased Toshiba. I don't care how good or bad their products are; their customer service is one of the worst.

I'm probably with you, but there's not a lot of information here. Where did you buy it? Could it have been from a dealer that wasn't authorized to issue this warranty? If they couldn't agree to it on Toshiba's behalf the contract would be null, right? And what is the problem with the laptop - though that is of course a separate question from that of honoring the warranty.

I've had good luck with Dells from that point of view. I've bought 3 in the US, and two, at some point in their lifetimes, have needed some love from a technician (a bad HD, and a cosmetic problem with a very new laptop that I wanted fixed because I spent quite a bit on it). Despite my being very much not in the US anymore, they promptly dispatched people on site (in Innsbruck, Austria, and Padua, Italy) to fix the problems, no questions asked.

I'm sorry to hear that you are having this problem. After buying/selling 10,000+ used laptops (every brand imaginable) over many years, I personally purchase and recommend only Toshiba laptops. Take it for what it's worth.

After buying a Toshiba Satellite P100-J01 years ago, and having to choose between either sound or ACPI when running GNU/Linux (until a BIOS update came out, and even then I had to patch the DSDT), I'll never buy Toshiba again.

I'll add another Toshiba support horror-story. It's why I haven't even looked at Toshiba products in 2-3 years:

My work laptop (supplied by my employer) was a Toshiba and had a 1-year warranty. After about 10-11 months of using it, the DVD drive stopped working. Toshiba's warranty support was typical ship-to-depot, so IT pulled the drive and sent the laptop off for repairs. I wouldn't ordinarily care about a laptop out for repair, but IT supplied me with a temporary machine that was at least a generation back (i.e., slow and heavy).

IT got a message that my machine had been received at the depot, but heard nothing else for weeks and weeks after. By the time I'd bugged a tech at my company enough to contact them, the warranty had lapsed ... and Toshiba refused to service the machine.

Toshiba refused to service it for several more weeks. I finally took over contacting support from the IT tech, and got the machine serviced after a half-dozen (long-hold-time) calls. But for the amount of time the IT dept and I spent getting an optical drive fixed, our company could have paid for two new machines.

> In 1987, Toshiba Machine, a subsidiary of Toshiba, was accused of illegally selling CNC milling machines used to produce very quiet submarine propellers to the Soviet Union in violation of the CoCom agreement, an international embargo on certain exports to COMECON countries. The Toshiba-Kongsberg scandal involved a subsidiary of Toshiba and the Norwegian company Kongsberg Vaapenfabrikk. The incident strained relations between the United States and Japan, and resulted in the arrest and prosecution of two senior executives, as well as the imposition of sanctions on the company by both countries.[6] Senator John Heinz of Pennsylvania said "What Toshiba and Kongsberg did was ransom the security of the United States for $517 million."

I'm sorry this happened to you, buddy. That really sucks. We United States consumers should be more concerned with the shortcuts and backhanded ways companies deal with customers outside of the States. A company that treats customers badly just because it CAN, instead of doing what it SHOULD, doesn't deserve our business.

Wonderful work, and thank you for documenting the experience. From the title, I thought this would be a story about decoding a banking website's cookies and gaining access to other people's accounts, or something similar. I was quite surprised to see that your bank did basically everything right. I was also surprised that you went so far as to implement an embedded clone. Very cool!

P.S. Consider yourself lucky to have such a bank. Here in the U.S., our major banks do not take security seriously by any stretch of the imagination (they have little incentive to).

This post had me guessing, but good work. First I saw the card with codes and thought you'd be showing that they weren't randomly created. But then you went on to the app -- and from the "What you'll need" section, when I saw the decompiler and the rest, I thought, "I know what comes next," but again I was surprised. You went above and beyond with the decryption of obfuscated error messages, etc. I could have guessed that it was OATH TOTP, as that's how these apps should work. Congrats on getting there from the source code, and indeed it's too bad they didn't retain compatibility with Google.

To fix the bug you mention -- root access from phone -- perhaps you could use something like Yubikey Neo loaded with ykneo-oath. I was searching the code for ykneo-oath (it's a java applet for the small key) to see where the timestamp was used for the dates, but it appears to be part of the YubiOATH app: https://play.google.com/store/apps/details?id=com.yubico.yub... So you'd have to modify the app source (it's on github). The advantage, however, is that your secret isn't stored on your phone and vulnerable to root apps. Instead, your secret is on a mostly-offline key inaccessible from your phone. There's a YouTube video on how it uses NFC to get that OTP from the Yubikey when you need it. In case you're somewhat extremely paranoid, this might interest you. :) For the truly paranoid, you've found a way to disable account recovery methods while mixing time-based and counter authentication mechanisms ;-)
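For anyone curious what OATH TOTP actually computes, here's a minimal sketch in Python using only the standard library, following RFC 6238 with the usual defaults (HMAC-SHA1, 30-second steps, 6 digits). The function name and the base32 secret format are my own choices for illustration, not anything taken from the bank's app:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6, now=None) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32.upper())
    # The "message" is just the number of periods since the Unix epoch.
    counter = int((time.time() if now is None else now) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset taken
    # from the low nibble of the last digest byte, mask the sign bit.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret "12345678901234567890", base32-encoded:
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59))  # → 287082
```

The printed value matches the RFC 6238 Appendix B test vector for T=59 (truncated to 6 digits), which is a handy sanity check for any reimplementation.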

A good lesson for those of us who have had the idea of building a similar app to generate one-time passwords. Now we have a better idea of the minimum that needs to be done to build such an app securely. Thanks.

The only point of these token generators is to provide a stream of tokens, so that if the generator is cloned (which is trivial), that can be detected. That's it. As far as I can tell, this attack does not prevent the server from detecting a cloned token.

(To do that, you would have to install a new client on the victim's device that will increment its counter and tell you the counter when you ask.)

The real language part of LINQ isn't the simple sequence functions (.NET had them before LINQ), it's the LINQ monad embedded into the language.

While personally I don't think I've ever used the LINQ monad syntax, and wish it had been implemented as a generic feature, not every language with support for anonymous functions or closures or function pointers is "LINQ".

Non-disparagement clauses can be seen as a throwaway item, a suffocating burden, an essential protection, or a damned nuisance, each according to taste or context.

To begin with, lawyers tend to see these clauses as essential protections and they are sometimes right. But, right or wrong, they tend to insist upon them, especially in the employment context. This explains their prevalence but, of course, does not necessarily justify their use.

Just to illustrate the cases where they truly are an essential protection: suppose you and a competitor have been fighting for years in court over ugly and untrue things that someone has said about you or your company - non-trivial things that have really hurt you. When it comes time to settle that case, a continuing non-disparagement obligation will be not only helpful but essential to the resolution. The same is true in many other legal fights. When emotions have run high, and parties have antipathy toward one another, it is good practice - to help ensure the peace after their fight has been settled - to require that they not speak badly of one another, and to provide a simple mechanism such as binding arbitration to resolve any follow-on dispute over whether they have done so. In such cases, there are excellent reasons to bind parties contractually to restraints on their ability to speak where they would normally be free to do so.

The employment context gets trickier because the antecedent acrimony that characterizes a legal fight may well not be present at the time of a termination and the question then arises: why am I being artificially muzzled? And there is a point to this: why be barred from speaking truthfully about a former situation even if it might be negative? why be at risk of a harassing lawsuit over what it means that something "may" reflect "negatively" about someone? why, in an age of easy communication through social media, be made to feel you cannot even speak about something that may have been a major part of your life, perhaps for many years? What may be seen as a throwaway item by some can be felt to be suffocating by others, and all the more so because it is tacked onto a token severance that gives you very little in exchange.

That said, I would say that the overwhelming number of employers and employees alike see these simply as throwaway items. They figure no one will care about such clauses except the lawyers. And, in most cases, they are probably right. The question then becomes whether one should not sign as a matter of principle or whether to just sign and take the money. Most employees take the money.

Of course, employees can push back if they have leverage. No one is obligated by law to sign a separation agreement. If the terms aren't right, and can be made right, then push back. Insist that the token severance be made more substantial. Or that non-disparagement, if it is to be included at all, be made mutual (it can be quite a head-ache for a large employer to keep control of its many people to ensure that none speak badly of you). Or insist that it be narrowed or clarified so as to reduce or eliminate vagueness about what may or may not be deemed disparaging. Or insist that it be coupled with other considerations that give you benefits apart from your normal final pay, etc. This sort of negotiation can make these clauses a big nuisance from the employer standpoint and may cause the employer simply to drop the clause. However, all of this assumes employee leverage, which doesn't often exist in the routine case, and so, as noted above, most employees simply take the money, accept the restriction, and don't bother to look back.

And so it all depends. For the author of this piece, this was a critical issue. For many others, it is not. Context is critical. And for all but trivial cases, do check with a good lawyer to understand the implications of what you are signing. If the risks are real, there is nothing worse than a harassing lawsuit from a former employer angry with you over some statement you made out of emotion. This is what gives these clauses a bad name and it is also what can make them dangerous. In such cases, be cautious about exposing yourself to such risks in exchange for some token severance. It is probably not worth it.

And I was soon informed that the president wished to assure me that there is nothing unusual about such clauses

Whoop whoop whoop! This sets off giant alarm bells in my head.

It might be totally normal. That doesn't mean you should sign it.

It's also an older-than-dirt salesman tactic to say that something you just made up is "totally common."

Of course, the company can attach whatever clauses it wants to a separation agreement. You aren't entitled to a severance payment. But I'd tell other engineers that two weeks' salary is a piddly amount for the company to pay for you to surrender such rights. You can just walk away. They are the ones who want you to sign it.

(It's kind of ironic, but after you have been given notice you are fired, you have power. They want you to do certain things, and what are they going to do? Fire you? Already did that. Withhold pay? Illegal.)

I have serious misgivings about this sort of "hush money". In general, I'd prefer not to interfere in private contracts, but this one has such serious implications for everyone else. In particular, it can end up creating an information imbalance, enforced by the courts, that allows a certain group of people to remain "in the know", with everyone else unaware of what is going on.

I read a while back about a law firm that had evidently done something very dodgy - representing an inventor and the firm purchasing the invention at the same time. The engineers were eventually paid a settlement, but part of the settlement was a gag order - nobody was allowed to talk about what had happened or the amount of money paid. This included, of course, the press.

Now, what do you want to bet that well connected lawyers, upper managers, and so forth, are able to access the terms of this deal - even if they weren't involved? What are the odds that an inventor who approaches a law firm will know what transpired and why? The imbalance of information will put the inventor at an overwhelming disadvantage.

My gut feeling is that there is a third party in all of this - me. Well, me, and all the little people. I understand the need to enforce contracts within reason, but I'm having a tough time seeing my own personal interest, or the general public interest, in enforcing these "stay quiet" contracts.

I'd also point out that this isn't really a situation where we are prying into a private transaction and forcing people to talk. Our courts are actually enforcing the gag rule that keeps most of us in the dark about what is really going on out there. My misgivings about regulating private transactions aren't as strong when all we'd need to do is stop enforcing contracts that are clearly against the public interest [1].

[1] I am still thinking this through. I'm not absolutely sure this is against the public interest, or, even if it is, if we the courts should refuse to enforce the provision. It's how I'm leaning, but I have a sense that there may be more to this. I am generally glad that courts won't enforce certain terms of contracts, such as very long non-compete clauses and the like...

Non-disparagement clauses are a lot like non-compete clauses. I would never sign one, but at the same time, I am reluctant to speak poorly of a former employer or to go into direct competition with former colleagues.

I want the right to do those things, but I don't actually want to do them.

Now of course, they always say, "Yeah, maybe not you, but somebody." To which I say, "That's why you took so long to check my references before hiring me. I'm no longer somebody. And if it mattered that much, you should have put it in the employment agreement so that I could have declined the job before taking it."

The only people who should feel morally compelled to sign no-disparagement clauses before accepting severance are people who are being fired for leaking confidential information on the job.

"This is a reminder not to let a digital world full of others' moments deceive you into devaluing your own. Their moments are infinite; yours are finite and precious."

For some time now, I've been spending the majority of my day jumping between Reddit, HN, and YouTube. Any time that I'm not on the computer, from the time I wake up to the time I go to bed (with a few exceptions), I'm listening to podcasts. Basically never idle, and rarely truly living my life - just endlessly consuming information in an attempt to fill the void.

I used to be a builder, living the high described in the post almost every day, but I've lost it somewhere along the way.

Something I hope to change. This post was a big kick in the pants for me.

Part of the reason we're at the top of the food chain is that we are chemically rewarded when we are industrious; it is evolutionarily advantageous to be productive.

And we're slowly and deviously being trained to forget this.

Good article, yet the author seems to be making the same mistake so many make - to assume that our present generations are being eroded away and forgetting how to be productive because the masses are hypnotized by social media and news on demand.

If you stop for a minute and look at what's being spread via social media and TV/Internet news, you quickly realize it's the exact same stuff hunter-gatherers probably spent 99% of their downtime gossiping about too: this person said that thing; this guy slept with that girl; this guy has so many resources, and isn't that so unfair to the rest of us; the guys in charge of tribal society have secretly been spying on all of us, isn't that scary... Social media and online news aren't changing anything beyond the mediums we gossip through, making said gossip more permanent and apparent and less ephemeral and transitory than it's previously been. But just because it's still there doesn't mean people spend much time obsessing over the gossip of yesterday; just like those in tribal societies, the news of yesterday is quickly forgotten, soon supplanted by the urgent, pressing news of TODAY.

I'm pretty sure in Archimedes's or Newton's days most people weren't sitting around removed from society on their parents' farms inventing calculus, or holed up in towers devising calculating machines and giant ship incendiary weapons... rather, they were going to the county dance, swilling home-brewed beer with the neighbors, and gossiping about the same things we gossip about today: wasn't it scandalous how Ellyn was behaving with the men at the dance? Isn't it a crime how much the poor are taxed by the local lord, while he lives in luxury? How unfair it is that the law applies so unevenly between peasant and lord! Can you believe that Brom and Beatrix are fighting again?

Despite the very long period of leisure that medieval peasants had during wintertime, not a whole lot of scientific or technological progress came out of the peasantry. While I agree there's little more satisfying than building something yourself, I'd differ with the article in suggesting that the masses of people today are in fact no different than the masses of people of times past - a minority produces new things, while the majority handles the day-to-day of maintaining what we've already got, and spends its leisure time consuming the output of those producers who've successfully managed to produce things others want and/or things useful to those others.

I completely agree with this idea. I also experience the "high" of completing a cool project, even if no one else ever sees it; just knowing it solved a problem in a cool or fun way gives me a buzz. I think this is a universal feeling that woodworkers, artists, musicians, and other artisans feel. In that way, the act of techno-creation is, I think, related to art.

I suspect (like myself) many, many people are interested in the idea of setting aside time in their lives to spend creating, whether it's writing, cooking, building, crafting, photography, whatever.

I also think that (like myself) many people, especially in the age of the internet, blogs, twitter, facebook, etc, are essentially stopped in their tracks before they start, by a fear of (for lack of a better word) "publishing".

I wonder if we would be freed from our fear if we agreed with ourselves to create, but without publishing. Write 500 or 1000 words per day, but don't publish it to your blog. Take a photo every day, but don't post it to Instagram.

The enemy of creativity is the inner censor ... and I wonder how many of us just need to be reminded that it's ok (indeed, arguably better) to create for nobody but ourselves.

This really hit home with me. For a while now, whenever I've been coding, I've been more focused on learning new things instead of creating things. While this isn't necessarily a bad thing, I had a project for a CS course that I didn't end up doing so well on, and I attribute that partly to not having built something in a long time.

I'd like to change that this year. When he posed the question "What was the last thing you built that gave you that Builder's High?", I sure as hell couldn't remember much that I've built recently that gave me that feeling.

Not those kinds of "systems". It is funny: it seems it was initially meant to be those kinds of "systems", and then it pivoted, as we like to say, to become "distributed-server-network-backend" systems rather than "hardware-kernel-OS" systems. And then the creators kind of winged it and remarked, "well, that's what we meant when we said systems".

As for Rust, yeah, there are already a few projects trying to build a kernel or drivers in Rust. Here is one for example:

An important part of an OS is that it implements support for concurrency - it builds processes and threads out of the raw materials such as page tables and timer interrupts. A kernel also builds its own spinlocks and higher-level waiting primitives. If you are relying on some high level language runtime to do the heavy lifting, is that really learning about how an operating system works?

For a kernel I actually see a lack of language support for concurrency as a plus. You need something to sit below the higher level stuff.

Is there any support for the article's assertion that C uses = for assignment (and == for equality) to save memory? dmr simply described it as a matter of taste and hinted that he was simply used to typing = for assignment. [Edit: And the compilers didn't keep source in memory, anyway.]

This just doesn't make a lot of sense to me, given that almost all systems programming in the real world is done in C and C++, and given the complexity of OS kernels, we're not going to see a general purpose commercial grade one written in Rust either ever (if it falls out of popularity) or for ten to fifteen years (if it becomes very popular). This isn't like application or web programming - in comparison to web technologies it moves at a glacial pace. So I think teaching an OS course in Rust is going to leave them underprepared and be a big shock if the students actually go on to do systems programming on real systems.

The other thing is that I did an OS course in C, where we got dirty playing around in the depths of the Linux kernel, replacing the scheduler, writing a filesystem, and so on. It was damn hard, but probably one of the only courses in Uni that really pushed me as a programmer generally (everything else I found easy). I think that it really helps to have experience in unsafe languages like C, just like you really need to do at least a bit of assembly when you teach computer organisation.

"I'm not aware of any other language that has concurrency constructs as elegant and easy to use as Rust's spawn, and I don't know of any language that comes close to the race-free safety guarantees provided by Rust."

I'd recommend having a look at Erlang; it lets you start lightweight pseudo-processes for concurrency and provides a robust data-sharing model.

Not so sure about low level access, we used C for those parts in the OS course I took.

Lol what? Unless tasks cannot communicate at all, or they are scheduled entirely deterministically, how can the compiler eliminate race conditions? How can a compiler even determine what is a race condition and what is intended non-deterministic behaviour?

So this is going to sound mean when I don't mean for it to, but I sort of doubt the reality of being fired for being a generalist. I just don't buy it. I totally believe that's what they /told/ him; I just don't think it's the actual truth.

So far I've made a career out of being a generalist. In my experience, companies of any size love having people like that. Specialists are valuable of course, but most of the time managers end up with a broad spectrum of problems and if they can throw problems to you without wondering if you can handle it, and feel confident it's going to get done, they probably don't care that you're only 70% as efficient as the specialist.

One could imagine a company with a young early employee (single-digit employee number), who is very good at what they do but difficult to slot into a traditional org chart, hiring an older industry veteran for a role which will be very well defined. There might be, hypothetically, political dimensions there: "report to a 25 year old" might be a non-starter for the new VP, who may have connections or gravitas perceived as key to the future growth of the business in strategic directions. Conversely, re-shuffling such that the guy who built things from the ground up ends up reporting to the new guy might also be a bit on the awkward side.

I'm not discussing any particular company here, but be aware that this has been known to happen before, and it will probably happen again. I hate to make age a factor [+], but in particular, if one is in one's mid-twenties, one identifies with this guy, one works for a company with enterprise software ambitions, and one's company has recently taken investment and hired the requisite team of industry veteran executives/VPs... well, I hope you're a founder. If you're not, consolation prizes are a) you're in the best hiring market for your skillset in the history of ever and b) you've probably got what it takes to be a founder next time.

[+] i.e. This is a description of the world, not the world I would like to be living in.

They did do you a favor, but not only for the reasons you think. They got you off the sinking ship early. Seriously, what kind of bumbling fuckwit fires a highly productive employee for not being a specialist? Broad-based problem solvers are a serious value.

There's something fishy about this story. Either you're not telling all of it, or...

You joined before money was raised, and possibly got an oversized equity stake that is being clawed back. If you were supposed to have more than 1%, I'd consider it possible. If you were supposed to have 5%, I'd consider it a certainty.

If my guess is correct, you should talk to a lawyer. You may have options.

Why the urgency to get rid of you 12 days before Christmas? You would think that someone that is a valued employee and someone who contributed to the growth of the company would get treated a lot better than that. They could have waited until the New Year, or even given the employee an option to move to a different role. To get him out after 8 months, 12 days before Christmas certainly doesn't sound like someone who was really appreciated.

I have to commend you on the brutal honesty. I thought you weren't going to revisit what they said, but you did.

You also mentioned you were fired from another job but didn't provide details... just make sure you examine what happened in each case and there's not a pattern. All I mean is I've seen people have to leave a job for certain behavior, exhibit the same behavior in the next job, and remain in denial about it when confronted.

I enjoyed the blog post and your writing style. I don't have any info aside from what you wrote about, but it sounds strange that you were fired when you were doing things like improving conversions by 400%. My gut instinct, which could very well be wrong, is that either there is a "behind the scenes" reason for you being let go, or the company is doing something seriously wrong. If you have any reason to suspect that it might be the former, you should ask your former teammates. It will help you out in the long run. Anyway, best of luck to you!

"But get this through your head: if you're not the best at something, you're replaceable."

I disagree. In theory this might be true, but in practice, it's not. I'm often not the best X for a particular job/role/etc., but I'm the best they're going to find.

Perhaps especially in software, THE BEST in their field are already turning down work and forging their own paths. THE BEST developer in tech XYZ is not going to close down their startup or leave MS or Google or Amazon to come work for your company's 'agile' team.

I feel pretty strongly that generalists have the edge in most cases, because they generally have a broader background and can see the bigger picture; they can often see patterns in how different areas connect (code areas, business areas, etc.). You certainly need specialists at some point, but rarely are those specialists the best in their field.

I've had a few phone calls with potential clients (and earlier, job negotiations) where people pulled this "we only look for the best XYZ people". At one point during a conversation I told someone (politely, I think) that I happened to know some of 'the best' people in the field they were looking for, and there was no way they were going to move across the country, take an 80% pay cut, and uproot their entire family to come work in some mid-level corporate dev team. On the other hand, I happened to be pretty good and would be interested in stopping by the next day in person to see if I could help solve their problem.

Actually, I've used that 'line' (not always the same words, but the same gist) on a few occasions, and in one case it got me a foot in the door. It's more about delivery with a bit of humor; it catches people off guard, I think.

It sounds like he was laid off, not fired. The CEO did not mention that his performance was subpar, or that he violated some company policy or law. The difference, at least in California, is that collecting unemployment is no problem when you are laid off through no fault of your own, but if you were fired, EDD will need to check with your employer as to whether you were fired for just cause, and maybe hold a hearing where both sides present their case.

Blog posts like this make me feel concerned that I'm becoming a generalist right now and I'm not sure I can avoid it. I'm on a really small team doing development of both front end (Javascript/HTML/CSS) and backend (PHP for our back end, SQL db, node server for websocket updates and some other stuff). I feel like I don't spend enough time doing any one thing in order to be considered a specialist at any of it, just passable enough to get done whatever needs to be done at the time.

Like others, I'm equally skeptical, and would argue that being a generalist has tough sustainability if you're only equally good at everything. Working for companies as a dev, this could put you in a position of stagnation. It's really tough to keep up with technology these days and to make the right choices.

Which brings me to the point that you're saying you've built a successful new website, which boosted revenue. Now that the website has been built and launched, just two days ago, they are letting you go 14 days before Christmas.

I for one would be pretty upset and demand an explanation other than "a generalist". It seems like a rather rude and insensitive act. To do that probably means they did not see anything in your skill set that could add more value to the company, and so they replaced you with a marketer...

I must therefore conclude that your website was a one-page infographic with a sign-up button.

Sorry, but the missing bits of information are frustrating, and given that, I don't see why you shouldn't be pissed off. I would most likely lawyer up. There are some important bits of information missing to be able to make sense of your blog.

The key is to be jack of all trades, master of some. Go super deep on one topic, then switch to another, then another. Now you are the best at two or three things, while still competent at another dozen or more.

I also agree that the generalist/specialist saying from the CEO was in poor form and should not be taken too literally.

I think you can shorten it to "You are expendable" without any qualifiers. Unless, perhaps, you have dug yourself in so deep that nobody else can do your job - not because of lack of expertise but because only you know how it works (bad practices).

I'm a generalist myself, and I'm lucky this hasn't happened to me. The best solution, in my view, is that when you have a clue the company has outgrown you and you are not visible in the big picture, you should choose to opt out and start finding a better opportunity, confident that you'll get one, being a generalist.

I had a somewhat similar situation. In the new year of 2012 I had to cope with the failure of a startup; the CEO handed me a cheque with some extra bucks plus salary and said "Good bye", and I didn't know what to do, confused but confident. This isn't exactly related to the story above, but I was employed within a week, and that's because I'm a generalist.

I have now concentrated more on one area, and it utterly sucks being a specialist when you remember being a generalist with many toys to play with, and now it's only one toy. But hey, cheers! Change is awesome!

Is there any guarantee that your (about to be former) employer will tell you the truth about why you're being canned? I don't believe so. They may just be giving you what they think is a safe answer. The real reasons for ending your employment could be anything (good, bad, silly, or even non-reasons). Nothing to take personally.

That was a very good article. The writing style is very "conversational", like a friend relating a story to you. I have never been fired before but friends always tell me that after being fired a lot more opportunities opened up for them. It's like being fired provides them with more impetus to seek a quick bounce back.

I'd be interested to hear what kind of success he has with this approach. Some would see it for what it appears to be. I assume many more would consider it unprofessional at best and begging at worst. I tried a similar approach a while back (pitching services while highlighting a personal financial need) and received overwhelmingly negative responses - around 15:1 negative to positive. Sadly, most people just don't care what you need and seem to be offended if you tell them.

I'm curious how the "you pick the rate" method is working out for you? The calendar on your site [1] still shows early December - does that mean you're no longer accepting clients this way?

I'd also love to know more about "Free Dev Time" (5 hrs/night). That seems like an interesting way to explore some new projects and technology without the hassle of estimates and contracts. And because you want to convert these people into paying clients, you've got some motivation to actually finish the projects you start for them, as opposed to personal projects that seem to fall by the wayside (for me, at least).

Great work, I'm inspired! And good luck with that plane flight. Looks like you'll have plenty of new work opportunities.

Hey! I live in Nashville and there's plenty of work out here. Within a week or two you could probably find a gig here and turn it into a business trip. Boom, 30% discount come tax time!

Hell - while you're in Chattanooga stop in at http://colab.co/. Their office is close to downtown. Find a project based out of Chattanooga and see if they'll pay for you to fly out every couple months!

As someone who has been in such a relationship before while on a very tight budget, I have respect for this guy. It is really hard to put up a post asking for business like this, too, I imagine.

To the OP: Good luck to you. I hope you not only find business and get your ticket but also do a great job for those who hire you. I sincerely wish you happiness and success in your relationship and your work.

My initial reaction was "that's not the definition of hacker" but when I tried to find a written definition that matched what was in my head, I kept running into the related stereotypes and connotations (e.g. being completely obsessed with computers, Richard Stallman talking about coding until 7 in the morning).

My takeaway is this: a lot of us feel impostor syndrome, feel that we'll never be a true hacker. The difference is that it appears easier for men, for whatever reason, to ignore or move beyond those feelings. And, you know, maybe that's a problem we should take seriously.

>Despite these feelings of difference, we find that male students report less distress, are less affected by the perceived difference between themselves and their peers, and leave the major in smaller proportion; and despite resistance to total absorption in computing, they do not feel like frauds. The 36% of male CS majors who say they feel different from their CS peers, regardless of experience level or obsession level, do not question their ability to become computer scientists if they choose to do so.

From personal experience, this resonated with me, right from the rocky start with high school CS in Logo onward.

I know the feeling, and (though I'm male) I recognize many of the waypoints described. And I, too, don't feel like a real hacker. But of course, the word means wildly different things to different people. Even though I love programming very much and I'm doing it all the time, and especially doing it for fun, it was only after joining HN that it dawned on me I could be considered a hacker in some circles.

Still, my primary definition of a hacker is someone who is insanely active in hacker and cracker culture, someone very interested in systems security, someone who knows how to debug a defunct DSL modem given only an oscilloscope and a bit of tinfoil. That's not me.

If I were to apply for a VC program that looks for hackers, I'd probably feel like a bad fit or a complete fraud. Nevertheless, I probably am a hacker. Labels are always an imperfect solution.

One other thing about getting people to program: Male or female, I wouldn't know where to start either. In school I was literally the only kid into actual programming, out of about 500 students. It was only much later in life that I made my first actual programmer friend, and I also recruited an unhappy English major into programming, but most people just aren't interested. And of those who can, and by necessity must, program, most would never do it in their spare time for fun.

It seems this discussion is going all the way back to high school and childhood. In that case, let's keep in mind that many (most?) of the kids who spent all their time "hacking" did so because they were often excluded from more social activities. Remember, there was essentially no social reward for "hacking" (unlike getting good grades, being a great athlete, able to tell jokes, speak in public, dance, sing, play an instrument and most other skills kids might develop).

Sure, today hacking is "cool" .. because money, fame and power are cool and many hackers have achieved that. But until very very recently (last couple years), the only reasons a kid would start hacking were curiosity and having little else to do because they couldn't (or didn't want to) fit in.

With that in mind, this article disturbs me in a way I can't easily explain. Particularly when she describes feeling like an imposter "in the face of the desirable hacker stereotype" and even claims the nerdy clothes that hackers wear are in and of themselves exclusionary simply because one might choose to be more fashionable.

This is no different from her stereotypical "hacker" complaining about having to wear different attire to fit in at a school dance (or should he/she later choose to go into banking, law or any profession other than programming at a startup).

I went to a top high school, where academic excellence (grades, SATs, AP tests, what college you got into, etc) was definitely cool and admired. Even then, if you saw a kid hacking in the hallway (or painting or otherwise working on things completely of their own choosing) you knew that he/she was likely at the bottom of the social totem pole.

The fact of the matter is that a disproportionate number of the best hackers come from that population. Those who were the "cool kids" in high school may want in now, but they'll have to earn their place at the table. If that means working ungodly hours, having to change the way they speak or dress to "fit in" .. when in Rome ..

(please do not misinterpret anything I wrote as condoning sexism, which is despicable in any setting)

I remember in the late 90s I was 10 or 11 and I figured out how to use the public library internet for more than the limit of 30 minutes.

I wanted to use internet so bad but I was not allowed a computer so I always hung out at the public library and I wished more than anything to use the internet as much as possible.

I started by asking: how does the computer know when the 30 minutes are up? There must be a timer! Can I prevent the timer?

Sure enough, for half a second there's a DOS terminal in Windows 95 that is displayed when I restart the computer. It took a few tries, but I managed to close it. Thirty minutes passed and I was still on. Then an hour, then another hour passed, and I still wasn't logged out!

I used the internet all day, from morning to dark. I let my sister use it too when I wanted to take a 'break'. I would see people check the wait list, look over at me, and leave very annoyed.

The next week, the librarian, who remembered me, told me they would need my library card as collateral, and that they would be keeping tabs on the time. The jig was up.

This was my hacker moment. Not exactly sophisticated, but I wanted something so badly, and I got it.

I can relate to a lot of this. Though I'm not female, I had some of the same early experiences. I did some BASIC on a VIC-20, some Logo on the school's Apples, some programming in a specialized adventure-game development tool on a Mac. I got lots of ad-hoc "systems" experience because I wanted to know how things worked behind the scenes. I never really considered doing "programming" as a career because it was "too much like math" or something. I didn't find out about Linux till I was 20 (this was in 1996, but still... not much cred!). I never would have had this career except for a lucky first job where I had so much time on my hands I learned how to program and automate all my tedious help-desk work. So they gave me more. Later I took some comp-sci classes, though I never did finish a degree in it; it's too hard to schedule 300+ level classes at night at a real university. I remember reading the description for the first batch of YCombinator and thinking "that's not me...". Now I feel more like a hacker at 37 than ever before.

Anyway, Paul was really talking about the fact that he can't make someone into a hacker in three months. Never did he say that people can't become a hacker at the advanced age of 22.

This resonated a lot with me as well -- elementary school computer science education in the '90s wasn't very developed. I went to a high school that had one of the better computer science programs for that level (in that it existed), but it still wasn't anything compared to the university level. The people who really learned the stuff were a self-motivated group who interacted outside of class as a club. These kinds of extracurricular studies were at best ignored by school faculty, and for students like myself, who was falling behind the curve in the school's demanding math/science/language program, were actively discouraged.

There's a crucial distinction between my experience and hers, though. I had the support of a core group of computer nerds, egging each other on and making simple games and the like. This group was almost entirely male, and were mostly interested in other stereotypical geeky hobbies like video games, anime, and tabletop role-playing games. We weren't consciously exclusionary, but anyone who didn't match the profile was probably going to feel very out of place.

I remember in my CS course, there were two girls in the class, and they were most definitely not part of the clique, even though they were brilliant by most objective definitions. One of them was definitely in the "gifted" category, and while the rest of the class was concerned with making video games, she was writing an equation plotter. We didn't talk to her all that much.

My point with all this rambling is that there's a lot of bundling of interests that goes on, all of them male-dominated, and if you aren't into those things, you don't get the same peer education as somebody who is. I think this explains a lot of the gender disparity we see in CS education today, and I don't know what to do about it. Education should not be tied to social cliques, but in reality, it often is.

I identify with the sentiment this person wrote about. My first languages were GW-BASIC and Logo at around that age; I learned some Pascal and C around high school, and just tried to learn more.

I miss those days, being a youngster and programming: doing without totally understanding but still learning. I still do stuff on my own, though, but no one will look at it or care.

But now all I do is fix bugs. I got one job after school, where I created something new. Looking back on it now, it was garbage, but it was my garbage. That was the last time, though. Now I do 'sustaining' work. I don't mind, but it feels like the industry as a whole has turned its back on me. I'm considering doing a startup sooner or later.

To me, a hacker is someone who thinks creatively about problems, and who views technology as a set of tools to solve them. The YC question about "the time you most successfully hacked some (non-computer) system to your advantage" is, to me, most resonant of the 'hacker' ethos. Although many answers I've seen to this question involve using a computer system to solve a non-computer problem, the sheer variety of non-tech "hacks" out there speaks volumes about what a 'hacker' is and does.

To me, the label 'hacker' isn't claimed, it's earned. I call myself a hacker, though I didn't until someone else did. I've never broken into secure systems armed only with a 28.8k modem and an active matrix screen, though -- I use it entirely in the 'problem-solver' sense of the word, and proudly.

Having programmed since childhood doesn't make me a hacker. I owe far more to years of Latin study, an unhealthy interest in logic problems and strategy games, and being trained via school that there is always a solution to every problem if you apply yourself hard enough.

But on the other hand, if I were just learning to program now, how much of the 'hacker' mindset would come along with it? Very little, in and of itself; I tutor beginning students and I'm always trying to teach them how to solve problems and look for answers, how to be creative and elegant, and how to reuse other people's work -- it wasn't until I started mentoring that I realised how little of this some people do naturally, and I still don't know the reasons why that is.

On a side note, and having just re-read Little Women, I have a thought about hacking and poverty. When you are poor, you have to be creative. How much of that mindset overlaps with what makes a good hacker? How many hackers grew up in disadvantaged circumstances and learned to make the most of the resources they had? How many hackers hung out in libraries, absorbing information like sponges, because it was free and warm and both their parents were at work?

I was about 9 or 10, not sure exactly, when I came across an article that taught you how to create a virus using Windows Notepad (lame, I know). It turned out the "virus" was just a fork bomb written in MS-Batch.

I quickly got into the hacker culture. I went onto some underground forum where there was a big "MS-Batch scene"; I have no idea why, but these guys were implementing games in ASCII, trojans, interpreters... all in Batch.

And so I started learning. Batch is a horrible language, yet pretty simple to learn. In the meantime I heard about these mythical developers who wrote code in C or Python, languages I thought were inaccessible and extremely complex.

I remember my first "big" project, I wrote some kind of graphical adventure where you were a "hacker" trying to "hack" into someone's PC by using commands such as "ping" and "telnet" (I had no idea about system administration or pentesting at the time).

Thinking back on that horrid code: bear in mind this was Batch, so no private variables, no functions, no structs... just an endless pile of GOTOs.

This is even funnier considering that this was 5 or 6 years ago, at a time when Python, Ruby, and JavaScript were a thing. I could have gone the easy way, but I took the side path... and I'm glad.

I am now 16. I still have a lot to learn, yet I've also learnt a lot. I do consider myself a hacker, just because I write code for fun and like reinventing the wheel when possible. But the definition of "hacker" is very wide. For me, a "hacker" is anyone who enjoys writing code. Maybe they work 8-17 writing Java in the enterprise, but if you enjoy what you do, if you come back home and keep writing code, if you want to improve: then you are a hacker.

Being a programmer does not make you a hacker. In fact, being a brilliant programmer does not make you a hacker. I suppose it's like how being a driver does not make you a racing driver. Hacking existed long before computers and programming.

What gets held up as an ideal gets followed, especially as children, when we're looking to establish our identities. I wonder if promoting a more diverse image of "hacking" to be included in other hobbies will help. While programming has always been a passion of mine, I also like astronomy, travel, and politics. Perhaps there should be discussion of how to change the practice and culture of programming to be one that more children can follow if they desire?

I want the education sector to start treating the ability to programme as they do literacy and numeracy. Everyone should know how to read and write a letter, but not necessarily write a novel. Everyone should know how to count, but not need to win the Fields Medal. I believe that everyone should be comfortable looking at a script and tweaking it to their needs, but not necessarily be able to create an AAA game from scratch.

I was vaguely aware of programming as a child, but had no education (unless you count mailmerge and a broken floor-turtle) and certainly no encouragement at school (in the UK if my spelling hasn't given it away). I basically forgot all about it until the middle of my degree (physics) when C was mandatory. It took until two years after a PhD to work out that a career in programming was what I really wanted.

Do I regret the way I got here? Nope. I learnt a lot of cool stuff along the way. But had it not been for that C course I may never have worked out what I wanted. I got lucky, and luck should not be a factor.

I don't know if I ever thought "hacker" was only a term for someone who has been programming for 10+ years. I'm about to graduate from college, and I started programming my freshman year. I unfortunately never got to see the beauty and joy of programming until then, and it does suck a little bit. But does that automatically mean I'm not a hacker?

Yet, I am arguably successful. I got a job offer from Amazon at the start of my senior year and took it, and of course, I eventually want to start my own company; that's why I'm on this site. I understand the stigma, because I personally have never told anyone I only started programming in college. I've never been asked, either. I love playing around with new languages and learning all I can about software, but I don't feel I'm that far behind people who started programming early, if behind at all.

We need to stop acting like people who have been programming since they were kids have a huge advantage. It is much more about what you constantly put in, and there are numerous other ways to be a hacker than just programming. Many people who never programmed might already have the "hacker" in them and just need to add the programming element to it.

I don't fit the stereotype and am okay with that: I wear dresses and heels instead of hoodies and sneakers, I keep a regular sleep schedule

I don't remember that being the stereotype for hackers. I would be interested to hear what other people think. But to me it has been someone who is intensely interested in computers and is very good with them. What they wear (wow, really shallow much?) hasn't really ever factored into it.

In humans, differences in physical and behavioral traits tend to lead to isolation, which leads to cultural differences. That's why the general trait of being very curious, experimental, and focused on technology and problem-solving has given rise to (among others) the hacker culture and its associated persona. But one need not consider oneself a part of the hacker culture if one does not want to. Indeed, one can be a programmer without having any of the above traits and/or considering oneself part of the hacker culture.

1) Regarding physical traits leading to culture, the same development can be seen to have given rise to Deaf culture, and also in the very concepts of men and women, which are more cultural in nature than most people think.

I consider myself a professional, not a hacker or a hack. It is a somewhat derogatory term, and as a programmer I can tell you that you don't have to accept it. Just correct people if someone calls you one. There was an application for an incubator that asked if there were any hackers on the team. The three of us programmers looked at each other and filled out: no, no hackers in our company.

The prevalence of the 'hacker' stereotype hurts those who don't identify with it, such as women.

The stereotype of sports prodigies like Michael Jordan and Tiger Woods is also hurting everyone else; why should other athletes be disadvantaged by not starting at an early age? Really?

"Hacker" doesn't equate to the best software engineer, the best founder, or much of anything other than having benefited from a longer period of time to gain experienceextra time that may or may not have been used effectively to gain additional knowledge

I think the issue is that the word hacker has too many meanings, and some of them contradictory. A hacker can be a prankster, a skilled programmer, a creative programmer, a skilled/creative programmer with some vague ideology that supports freedom of information, a computer security breacher, a person who tinkers with electronics, a person who subverts any type of system or authority (reality hacker/culture jammer), any sort of whimsical and creative fellow who applies unorthodox solutions to his everyday life, etc.

In addition, a hack can mean: an unskilled or untalented person, a charlatan, an inelegant but efficient solution to a problem (a kludge), an inelegant and inefficient solution to a problem (cruft), an elegant but unconventional solution to a problem, a skilled application of marketing (growth hack), a culture jam (reality hack), hacking off a limb, a prank, a computer security exploit or breach, a program to cheat at video games and so forth.

As far as I know, the "hacker" stereotype is that of the skilled but evil and socially isolated computer criminal.

Well, in my opinion the view a young person can develop by reading a site like Hacker News is a skewed reality of what it is to be a software engineer, a hacker.

One example of this is what I will generically call the "Google Interview". I know brilliant engineers who have done massive, non-trivial projects that have generated millions of dollars in revenue who could not pass such an interview.

The other aspect is the constant exposure to the language-of-the-day and framework-of-the-day club. I can see a young aspiring developer becoming utterly disappointed when realizing it is nearly impossible to keep up with it all. Where do you start? How do you learn this stuff? Someone could very easily think they are not up to par if they can't walk in these mythical shoes.

The truth is very different from that. There's a huge world out there for software engineers. There are real problems to be solved. And, no, not everything in software engineering lives and dies on a web browser or a smart phone.

Is the OP saying that she needed to feel like a hacker to feel legitimate? That would be sad.

What the hell is a hacker anyway? Definitions abound. In many ways it is more about how someone might approach hardware and software problems than anything else. You don't have to be YC material to be a legitimate software engineer. One could very easily argue there are tons of software engineers out there doing far more important work than almost any YC developer has done to date. Most of them are invisible. Think about the people writing code for MRI machines, aircraft avionics or even your car's ABS system.

If you are young and love software engineering please don't think that building websites is the only way you are going to become somebody in this business. Explore what's out there and dare to learn about other interesting problems you might be able to solve.

I'm not a young woman so it's hard for me to say, but it looks to me like people are doing a much better job of encouraging young women to program. There are lots of programs to support women in CS. On the other hand, I don't know how effectively these solve the deeper cultural problems where it's not considered "normal" for girls to be interested in coding.

"Hacker" is more about confidence, it seems, and cowboy coding isn't really a good thing in most cases. Don't get me wrong: there are a lot of good things about hacker culture's emphasis on flowful programming, fast iteration, and frequent engagement (demo early and often). Those are all good things, but I feel like (especially thanks to the VC chickenhawking) there is now more of an emphasis on the superficial-- the overblown confidence that is usually just massive ego and upper-middle- to upper-class entitlement-- that if you don't have that arrogant air, you're not seen as a real hacker. This is sad, because we need technologists more than ever to attack the truly hard problems (cancer, oil scarcity, global warming, economic inequality) and instead, the VC's have created this Disneyfied technology economy that is 99% hot air. It wouldn't be a problem if that nonsense were self-limiting, but it's now transferred so much wealth to undeserving people as to have set off a terrible and probably permanent housing price problem in the Bay Area.

I think there are more similarities than the author supposes between these "networks" and the traditional asian enterprise groups. For example, it's a classic feature of the keiretsu to have interlocking shareholdings, directors, and financial foundations, ensuring cross-pollination of resources and broadly aligned interests. I think that has more than a few parallels with emerging "business networks" such as ycombinator.

Obviously the scale is a little different, but hey, give it a century...

They switched to a new (outsourced) billing system recently that was supposed to "dramatically improve" billing statements.

At our house, we had two accounts for PSE - one for the house power and a separate account number for a streetlight-like fixture on our property. The reason for the two account numbers was that they were billed differently: house power as normal, streetlight at a fixed rate paid once every other month.

To add to the complexity, we were enrolled in an estimated payments plan. Our costs stayed the same, month to month, with adjustments twice a year to update our average payment amount up/down.

Before the new billing system, we received bills once a month - with the amount fluctuating every other month by the cost of the outdoor light.

Under the new system, we had the "estimated payments" box checked, the reps could see it in the system, and they claimed we were enrolled, but the bills that arrived acted as though we weren't in that payment plan.

They couldn't explain it. We were told "Oh, well, it's probably due to the cutover to the new system. Next month's bill should be normal." But it wasn't.

"Oh, it's probably just hiccups - I just changed something here so next month will work for you." Nope.

If they are not billing you, stop paying them! Go talk to a federal debt settlement attorney; this would make a great class action, and you could get paid good money for being a class rep. I am not an expert in this area, but I am pretty sure there is a case here for damages if they try to collect money from you without presenting you with a bill first.

It's likely much more than two databases. Most MSOs outsource billing and customer service to companies like Convergent. It's a mess of legacy regions and products with different vendors involved. Try troubleshooting phone number porting -- that goes through Accenture.

You can take one of my bills -- I receive two Ecobills and a paper bill!

As a longish-time Linode user, this makes me extremely happy. Keep it coming, Linode. Not that pv-grub is beyond my Linux-foo by any means, but after working on Linux all day professionally and dealing with $latest_version_of_fedora on my home PCs, I'd rather not play sysadmin too much more on my VPS boxes.

I took a different tack with clojure-scheme by compiling via Gambit Scheme, which can target Objective-C. https://github.com/takeoutweight/clojure-scheme. This approach made it easier to provide self-hosted compilation, enabling niceties like an on-device repl.

Java interoperability is Clojure's one real advantage and its biggest single selling point.

You can just code, nice and quick, in a language much nicer than Java, yet easily access all that code from our_stalled_corporate_crap.jar and package/deploy it the way they do in their Java EE world.

When I was in college (99-2000ish, maybe?), one of my roommates was an art major with a graphic design bent, and he had this high end printer and would copy and print $1 bills and take 20 at a time over to the vending machines around the corner from our apartments once a week or so. After about a month of doing this, the machines disappeared.

Of course, he's the same roommate who later got arrested for wire fraud and grand theft.

The algorithm that the author is painfully searching for by brute-forcing a picture is hidden simply by compilation. Decompiling a binary was hard in the 20th century; not so much now. Discovering the exact algorithm only requires some time and some skill.

This is one of the things that made me really see the value of free software.

I did verify experimentally that the EURion constellation alone doesn't trigger Photoshop's image-rejection algorithm. I think it would be fun to distribute a bunch of images that false-positive the Digimarc algorithms, just to mess with people.

When I used to make news graphics for a tv station, every now and then this problem in Photoshop would arise and it was incredibly annoying to make anything with a flat dollar bill as part of the background.

"Ah! I remember reading the original web page from years and years ago about the fact that this happens. I wonder if they've discovered how they do it now (the algorithm). I will click on this presumably new (on a news site) link that says 'how'. WTF? It's just the same old page from 2004!"

What bothered me in the other thread where I first learned this is that while I have always understood that I don't have a reasonable expectation of privacy on the Internet, I feel dirty as shit when I realize my computer determines when something is legal or not for me to do.

Whatever happened to just paying $40 for a game and getting to use it forever? (This killed Candy Crush for me. I love the gameplay dynamic and the graphics, but the constant upselling just makes me want to smash my phone into a fine paste and then feed the paste to the game's developers.)

Clever, but it will never take off. These real-world-type games always tank. Why? Because no one wants to go running off randomly looking for a stranger whilst looking like a complete twat. And making people actually pay for this? Pfft.

Wouldn't it be wiser to choose a hypermedia format for the API, and then use a generic hypermedia client on whichever platform you like? Then you write just as many client libraries (zero), but the problem of pushing updates to your clients is solved as well.

Just tried this out and Alpaca is an awesome tool. However, I'd never want to release this in production without tests. Alpaca doesn't generate tests, so you're back to maintaining the tests for your N different platforms/languages. But you'd have the same problem with its competitors, too; Thrift [0], for instance, doesn't generate tests either.

Overall, I'm not sure that the time savings is as big as it first appears, but I think it's great for quick projects.

This is cool. I would suggest that it would be very useful to have this kind of thing for JSON Schema [1], which is what I use with Python code to validate incoming JSON. (I was originally hesitant to use it, but since getting into it, I have yet to run into a use case it cannot handle.)

There is also an RFC for "JSON Hyper-Schema" which is intended to describe REST APIs. It doesn't have much library support in much of anything, but I am surprised that it hasn't taken off!

I like that this library is fairly opinionated (options for how to authenticate, supported formats, etc.), though I worry that creates a bit of inflexibility: what exactly does "oauth" actually mean? There are always vagaries.

I like the SPORE (Specification to a POrtable Rest Environment) approach better. You create a description file in JSON and each native language client can use that file to access the HTTP API. https://github.com/SPORE/specifications

SPORE already has clients for Clojure, Javascript, Lua, NodeJS, Perl, Python, and Ruby. I have used SPORE in a few projects and I was not disappointed. Another approach to solving the cross language library problem.

Very cool, but why come up with a new API schema rather than use an open standard like OData (http://www.odata.org/)? Then Alpaca would be compatible with a bunch of APIs that already exist today. In fact, something like this (generating client libraries from APIs) may exist for OData already, but if it does, I've only seen it for .NET and OData (Visual Studio 'Add Service Reference').

This is actually pretty similar to a side project I've been working on called Gargl (Generic API Recorder and Generator Lite) (https://github.com/jodoglevy/gargl). I haven't gotten around to doing a Show HN post yet, but I would love any feedback, or to combine efforts. Basically, it lets you generate an API for websites that don't have APIs publicly available, by looking at how a web page / form submission of that website interacts with the web server. You record web requests you make while normally browsing a website via a Chrome extension, parameterize them as needed, and then output your "API" to a template file. Then this template file can be converted to a client library in a programming language of your choosing.

I have to admit, this is very reminiscent of "Add Service Reference" in Visual Studio, a capability I have grown to despise over the years. The code was almost always incomprehensible. I cannot tell you how much I loathe seeing the comment at the top of a file: "This was generated by a tool".

Having said that, this tool does look interesting. I hope that a goal is to always make sure that the generated code is as readable, and maintainable, as possible. Also, as mentioned by others, adding generated tests to the generated client libraries is extremely important.

So, this is markedly similar to the project I've been working on in grad school, only without static typing: http://research.cs.vt.edu/vtspaces/realtimeweb/ Also, mine is explicitly geared towards educational purposes. I'm about one-third of the way through version two, but I wonder if we can cross-pollinate our code bases to get something even better.

That looks fantastic, and seems to be in alignment with some things I have been doing lately (generating the server-side controller of the API in Clojure, plus a set of documentation, from a set of definitions of API methods). Good job!

So I have an OWC 480 GB Pro 6G that, at the time, was $1.5k. It has an interesting intermittent failure mode: if the temperature of the laptop goes below 16 C (60 F), the system won't recognize it. Guess what the problem is? Condensation. It collects inside the SSD's plastic clamshell and shorts some of the pins. They failed to seal it and failed to include a desiccant in that space, or to just pot the whole thing in epoxy.

I've been exclusively using Samsung and Intel SSD for production machines- however I'm not too worried about the power cutting out. With dual power supplies and several diesel generators in my datacenter, I'll be okay.

The gods confound the man who first found out how to distinguish hours! Confound him, too, who in this place set up a sundial, to cut and hack my days so wretchedly into small portions! When I was a boy, my belly was my sundial: one surer, truer, and more exact than any of them. This dial told me when 'twas proper time to go to dinner, when I had aught to eat; but nowadays, why, even when I have, I can't fall to unless the sun gives leave. The town's so full of these confounded dials that the greatest part of the inhabitants, shrunk up with hunger, crawl along the street.

This reminds me of feelSpace[1]. It is a belt which has vibrating motors all around it and a compass. I believe the point is to give you an extra "sense" which after you have worn it enough you incorporate with your other senses.

This must be a gag, or the worst example of western opulence I could possibly think of.

Seriously, an electronic gadget made to disrupt your day every 5 minutes. I think this requires a better explanation of its purpose before I take it seriously. Right now, all the marketing speak combined with the banality of this device makes me think it's the perfect April Fools' joke, just a few months too soon.

Let's assume it's for real: why would you want to be reminded that 5 minutes have passed when you've lost yourself in a fun life moment? Or when you're waiting a long time for something? That would be even worse.

Water resistance just seems to be a requirement these days. Even my Pebble, which I'd never wear in the shower, is water resistant, so when I'm forced to react to a bath incident I can worry about my kids instead of the watch.

Sounds like a pet project - maybe they should hire a real watch designer who can get them from hobby to actual product at some point.

The idea might be good, but the price is extravagant. I think the parts should cost only $60-70, maybe less, so charging nearly $148 is laughable, especially when I can get a nice, brand-name watch for $118 or so.

Check out TicTocTrac (http://www.tictoctrac.com/). Open source watch that monitors when you check the time, and lets you track time perception whenever you want (instead of buzzing you every 5 minutes, you tap to start the test and then tap when you think X minutes have elapsed). You can save the data and use the site to visualize it later. The BOM comes to about $55.

I think a lot of its cost comes from the fact that the device is made in Oslo. In Norway, labor prices are upwards of $125 USD per hour. If hamburgers are $50 there, then it makes sense that this thing is priced like a couple of hamburgers.

I'm curious to see the results of this. I wonder if there are any negative impacts to consciously acknowledging the passage of time with a device like this considering that when "time flies", it does so because you aren't conscious of its passing.

From TFA:

> how time flies by when you enjoy yourself, and drags along when you wait in line at the post office

I suspect my experience at the post office would be made worse by the fact that there's something vibrating on my wrist reminding me of how long I've been waiting.

Wow... I've been having this idea for a watch for a while, and someone with the hardware knowledge pulled it off. Did the creators try having a short vibration every five minutes and then a longer vibration every hour?

Better: Install "Multilingual Speaking Clock" instead (windows only I think) and set it up so it announces the time every 5 minutes through your computer speakers.

It. Is. A. Mind. Game.

When your brain "idles", you lose track of time. Time in the mind only exists as a comparison between one moment in the past and another in the present. So when you're daydreaming (mostly brain idling), time appears to fly by because there are fewer data points for your mind to compare. But when you're checking the time constantly, time appears to go by very, very slowly.

It can become mentally exhausting when you do it too much though because it speeds up the mind. You become aware of every little moment. It can make you extremely productive but you really do have to take "brain idling" breaks every now and then. Also when you first start off, time will appear to fly by, it'll feel like "5 min" are flying by every 30 seconds. But after about 30 minutes your brain will stop idling so much, you'll become aware of more time points, and it'll feel like time is slowing down and you're thinking "faster".

> If there's any growth hack that worked well for us, it's been product integrations with successful companies. Early on, we looked at other companies with whom we shared mutual customers and with whom a product integration would be mutually beneficial. We built integrations with New Relic, Datadog, Pingdom, Librato, TempoDB, Heroku, and HipChat. Now, our signups love the fact that they can hook their status page up to their existing tools with only a few clicks.

"At the bottom of every status page, we include a small, Powered by StatusPage.io link. While we felt uneasy about this at first, one of our mentors encouraged us to include the link and it has worked incredibly well... One-third of new signups and customers originate from our existing customers' status pages."

Love it. We've had a few moments like this at Paydirt (http://paydirtapp.com/). As engineers, we tend to have hangups about little details like this that customers genuinely don't give a second thought to, and they can be a means to massive (and multiplicative!) business wins.

Now you're talkin'! When you first posted the $5k/mo blog post, I was like "OK, that's better than nothing, but let me know when you've got something that can support employees." At $25k/mo you're definitely well on your way there! At $100k you've got a $1m+ business, which is great.

> It's also possible to compress updates to the bloom filter. The server just needs to calculate the XOR of the version the client has and the updated version, then run that through LZMA (the input will be mostly zeros), and transmit the compressed diff to the client.

> Unfortunately, for a reasonably large user base, this strategy doesn't work because the bloom filters themselves are too large to transmit to mobile clients. For 10 million TextSecure users, a bloom filter like this would be ~40MB, requested from the server 116 times a second if every client refreshes it once a day.

Wait, would the LZMA-compressed daily diffs still be ~40MB? If the 40MB is just the initial bloom filter download, that's not so bad. If 40MB is the daily diff, I'd be interested in seeing the calculation for that.
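For what it's worth, the arithmetic behind both quoted paragraphs can be sketched directly. This is a rough back-of-the-envelope check, not Open Whisper Systems' actual parameters: the 10-million-user figure comes from the quote, while the false-positive rate is my assumption, chosen to land near ~32 bits per entry (which is what ~40MB for 10M users implies).

```python
import lzma
import math
import os

# Bloom filter sizing: m = -n * ln(p) / (ln 2)^2 bits for n entries
# at false-positive rate p. (p here is an assumed value.)
n = 10_000_000
p = 2e-7
m_bits = -n * math.log(p) / (math.log(2) ** 2)
print(f"filter size: {m_bits / 8 / 1e6:.1f} MB")  # comes out near 40 MB

# The XOR+LZMA diff idea from the quote: if only a few entries changed,
# old XOR new is almost all zero bytes and compresses extremely well.
size = 5_000_000                    # 5 MB toy filter for the demo
old = bytearray(os.urandom(size))
new = bytearray(old)
for i in range(0, size, 50_000):    # flip a handful of bits
    new[i] ^= 0x01
diff = bytes(a ^ b for a, b in zip(old, new))
compressed = lzma.compress(diff)
print(f"diff: {size} bytes -> {len(compressed)} bytes compressed")
```

So the answer to the question above is: the ~40MB is the initial download; a daily diff, being mostly zeros, shrinks to a tiny fraction of that under LZMA.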

This is great until you realize the average address book is a disgusting mess of spelling errors, wrong values in wrong fields, punctuation in places where punctuation isn't necessary (commas in phone number fields, digits in name fields, etc.)

The only field that might be fairly consistent is email address, and even that is no guarantee. Sure, it's possible to clean up a contact before hashing, but when you consider that the typical phone contains >500 contacts, it's not something that can be done in a reasonable amount of time on a modern smartphone. For the last two years I've been working on a product that involves cleaning/organizing contacts and making them searchable; when I started, I had no idea what a massive undertaking it would be.

So, hash to your heart's content, but if you have 10 different people with the same contact in their address books, I would be willing to bet that you will get 10 different hashes, because humans are not good at data entry.
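To illustrate the divergence described above, here is a minimal sketch of why normalizing before hashing matters. The normalization rules here (strip non-digits, assume a North American country code) are simplistic assumptions for the demo; real systems use far more thorough libraries such as libphonenumber.

```python
import hashlib
import re

def normalize_phone(raw, default_country="1"):
    """Very rough E.164-style normalization (assumes NANP by default)."""
    digits = re.sub(r"\D", "", raw)    # strip punctuation, spaces, letters
    if len(digits) == 10:              # prepend the assumed country code
        digits = default_country + digits
    return "+" + digits

def contact_hash(raw):
    return hashlib.sha256(normalize_phone(raw).encode()).hexdigest()

# The same person, entered three different ways by three different humans:
variants = ["(555) 867-5309", "555.867.5309", "+1 555 867 5309"]
hashes = {contact_hash(v) for v in variants}
print(len(hashes))  # 1 after normalization; hashing the raw strings gives 3
```

The point of the parent comment is precisely that real-world data is messier than this: typos and wrong-field values survive any normalization pass, so the hashes still diverge.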

The problem is that it's easy to upload 1000 randomly generated phone numbers and get back results. But I think you can distinguish random or sequentially generated numbers from a real user's actual contact list, by looking at the distribution of friends among the contacts. A real contact list should have a high density of friendships between contacts. The system should only return results if the uploader provides a subgraph that's more highly connected than a random one would be.

You'd need to look at real data and tune the parameters to make this effective.
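The density check described above can be sketched in a few lines. The toy graph, the names, and the idea of scoring uploads by the fraction of contact pairs that are already friends are all illustrative assumptions; as noted, real data would be needed to pick a threshold.

```python
from itertools import combinations

def friendship_density(uploaded, friend_graph):
    """Fraction of uploaded-contact pairs that are friends of each other
    in the service's existing graph (a dict of adjacency sets)."""
    pairs = list(combinations(sorted(set(uploaded)), 2))
    if not pairs:
        return 0.0
    linked = sum(1 for a, b in pairs if b in friend_graph.get(a, set()))
    return linked / len(pairs)

# Toy graph: a real contact list is clustered; a random scrape is not.
graph = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "carol"},
    "carol": {"alice", "bob"},
}
real_upload = ["alice", "bob", "carol"]
random_scrape = ["alice", "dave", "erin"]
print(friendship_density(real_upload, graph))    # 1.0
print(friendship_density(random_scrape, graph))  # 0.0
```

A server could refuse to return matches when the score falls below some tuned cutoff, on the theory that sequentially generated numbers form a near-empty subgraph.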

A new private-set intersection paper, claiming to scale orders of magnitude better than predecessors, has just been released [1]. I don't know whether it solves your problem, as I haven't read anything beyond the abstract, but you might want to check it out anyway.

Is it reasonable for me to trust you to run some arbitrary binary on my phone that does some magical black-box computation on my contact list and then phones home to your service, but not to trust that you will simply not be a dick with the raw data?

Stated more simply, if I am paranoid about my privacy, then why am I running your apps and letting you read my contact list at all?

I think there is a big difference between private contact discovery and contact discovery that isn't brute-forceable, like we are seeing with Snapchat and others. One change that could be made for these very small preimage corpora would be to ask the target user whether they should be exposed to the searcher.

1) I upload the hashes of my contacts which enqueues a request to those that match the hashes

2) The target of the request dequeues those requests and is given the option to expose themselves to the other user

3) If approved, the original uploader gets access to that user on the service

This would stop the massive discovery issue and would only put a medium amount of friction into the growth engine. It obviously doesn't cover the issue that the service itself has access to the contacts which means they could potentially be exposed. However, if the service discards hashes without matching queues and removes matches when dequeued, the risk to the user is much reduced from the full address book storage case.

This was written off the top of my head, so forgive me if I have left a big hole somewhere.
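For concreteness, the three steps above might be modeled roughly like this. Everything here (the function names, the in-memory queues, the `decide` callback) is a made-up sketch of the described flow, not any real service's API; a real implementation would persist the queues and verify hash ownership.

```python
from collections import defaultdict

pending = defaultdict(list)   # contact_hash -> list of requesters
approved = defaultdict(set)   # requester -> users they may see

def upload_hashes(requester, contact_hashes):
    """Step 1: uploading hashes enqueues a request per matching hash."""
    for h in contact_hashes:
        pending[h].append(requester)

def review_requests(user, user_hash, decide):
    """Step 2: the target dequeues requests and approves or rejects each.
    Dequeued entries are dropped either way, so unmatched hashes
    don't accumulate server-side (the risk-reduction noted above)."""
    for requester in pending.pop(user_hash, []):
        if decide(requester):
            approved[requester].add(user)   # step 3: requester gains access

upload_hashes("alice", ["hash_of_bob", "hash_of_carol"])
review_requests("bob", "hash_of_bob", lambda who: True)
review_requests("carol", "hash_of_carol", lambda who: False)
print(sorted(approved["alice"]))  # ['bob']
```

The friction lives entirely in `review_requests`: discovery only succeeds when the target explicitly opts in, which is what blunts bulk enumeration.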

So don't do it. I know I'm probably in the minority here but I've never given another app permission to use my facebook or twitter account to "Find Friends" and I personally find the whole practice offensive. Just have an option to invite people by their email address/number/whatever, but not in bulk.

Is it possible to leverage some sort of peer-assisted discovery, where you ask some subset of your trusted contacts already on the system to call the server on your behalf? That way the server is less likely to know whose contacts are being sent to it.

Of course, this can only be part of the full solution.

Disclaimer: just throwing a wild idea out there. don't know anything about the field.

> Building a social network is not easy. Social networks have value proportional to their size, so participants aren't motivated to join new social networks which aren't already large. It's a paradox where if people haven't already joined, people aren't motivated to join.

Value is relative. For stakeholders, value is indeed given by the network's size. For users (participants), it's very debatable whether size equals value. The author seems to be mistaking the two perspectives for a single one. No paradox here.

It's not a technical problem, it's a value-prop problem. Most users don't care about privacy...

Fundamentally, there is more value for the network operator in growing the network than there is for users in disclosing their details. Email works perfectly fine without you having to disclose your address book, because no one owns the network and therefore no one cares how many people are using email.

Alternatively one could just allow users to choose whether to make their info public and then the client can just search a directory.

I don't really think you need to solve this problem using crypto. I think a simpler solution would be to use the "hash it" approach, but restrict how many lookups each client can do. You can key the limit on the client's phone number, IP, or other unique information. This way an attacker would have a very hard time brute-forcing it.

Additionally, you could use captchas (or other humanity tests, such as SMS verification) to stop attackers from creating automated accounts (in order to fight bots spamming and polluting the network).
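A minimal sketch of the rate-limiting idea, with made-up names and an assumed quota; in practice the quota would be keyed to a verified phone number or IP and reset on some schedule:

```python
LOOKUP_QUOTA = 100  # assumed max hashes one client may ever query

class RateLimitedDirectory:
    """Hash-based contact directory that caps lookups per client."""

    def __init__(self, registered_hashes):
        self.registered = set(registered_hashes)
        self.used = {}  # client id -> lookups consumed so far

    def lookup(self, client_id, contact_hashes):
        used = self.used.get(client_id, 0)
        allowed = LOOKUP_QUOTA - used
        if allowed <= 0:
            raise PermissionError("lookup quota exhausted")
        # Only process as many hashes as the remaining quota permits.
        batch = list(contact_hashes)[:allowed]
        self.used[client_id] = used + len(batch)
        return [ch for ch in batch if ch in self.registered]
```

With a quota in the low hundreds, a legitimate user can check their whole address book, while brute-forcing the ~10^10 phone-number space through fresh accounts becomes the bottleneck (which is where the captcha/SMS gate comes in).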

At first, I thought this was encouraging. But then I checked out the three "providers" listed for my country (Germany). One is a club maintaining connectivity for a specific student housing project, one offers only dialup/ISDN, and only one offers actual ADSL connectivity. I'm sure it's the same in other countries.

While it's interesting as a concept, I don't really get the feeling there is much to see here. It's also not clear to me how these very few access providers would actually federate without real backbone connectivity.

It seems like the "from around the world" part isn't really up and running yet. I expected to see some of my local SF Bay Area ISPs on the map, but it's only showing some European ISPs right now. Further I wondered what the qualifications for DIY might be--individual-level shared networks or the typical small business thing? Next I wondered why it was so important to know about FDN attending some conference, vs. spending more time explaining in basic terms what they do and what they plan to do.

Have always been curious about setting up an ISP, especially recently. I haven't had much luck finding information, though I haven't tried as hard as I probably could. Anyone have any good links?

OP here - FYI - I was a software engineer at Novell over 20 years ago, then went to the business side. Last January, for some crazy reason, I decided to build a website. I couldn't write HTML last January, and though this is a pretty tame site by HN standards, I'm pretty proud of it. I've had to learn HTML, CSS, JavaScript, jQuery, Angular, D3, Require, Node, Apache, Bootstrap, a NoSQL DB (Cloudant/Couch), Linux system management - the works. It's been fun.

While I'm still active in software development, I haven't done web development in 10+ years. I'm currently working on a side-project which is also teaching me "the works" when it comes to web development and deployment. Of course, I will still be a n00b by the time I'm done, but I'll soon have written and deployed my own RoR, data-driven website.

My side-project also shares a motivation similar to yours: it's my belief that most data isn't taken advantage of because most people suck at data. I wanted to see if I could take an existing, well-picked-through data set and extract value, just by sucking less. That also describes my side project, which is in the sports domain.

You reaching this point is extremely inspiring. Thank you for sharing!

Very cool. I like the way that you show 'means vs time' on the left panel and then you can dig into the actual distribution on the right panel. FYI, I think that, e.g., http://weather-explorer.com/history/country/US/state/WA/city... should read "Daily High Distribution" rather than "Average Daily High", or something like that. The mean trend line is on the left panel, and the right shows the entire distribution. I'd also be curious to see what the 2nd and 3rd moments look like vs time, to see if the weather has an equal 'spread' month over month or if it tightens up for certain periods of the year.
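For the moments idea, something like the following would do it, assuming the daily highs are already grouped by month (the data shape and function name here are invented for illustration):

```python
import statistics

def monthly_moments(daily_highs_by_month):
    """daily_highs_by_month: dict of month label -> list of daily high temps.
    Returns month -> (mean, variance, skewness)."""
    out = {}
    for month, temps in daily_highs_by_month.items():
        mean = statistics.fmean(temps)
        var = statistics.pvariance(temps, mu=mean)  # 2nd central moment
        sd = var ** 0.5
        # 3rd standardized moment: E[(x - mean)^3] / sd^3
        skew = (sum((t - mean) ** 3 for t in temps) / len(temps) / sd ** 3
                if sd else 0.0)
        out[month] = (mean, var, skew)
    return out
```

Plotting the variance (or standard deviation) per month on the left panel, alongside the mean, would show directly whether the spread tightens up in certain seasons; skewness would show whether the tails lean hot or cold.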

Also, you need to drop in a full post with commentary, analogous to what you did with your "learning python" post. More feedback about tools, resources, learning sites, etc.

Which tools/languages/techniques did you enjoy learning and employing the most? It all sounds fun, but some of it sounds more fun than others (for example, I really love Python's legibility, and am impressed by D3's power but find it to have a pretty steep learning curve).

While you might have forgotten some programming, and quite a bit has changed over the last couple of decades, what is quite interesting to me is the efficacy of the result. Although I don't know what your work looked like 20 years ago, this probably shows that over the last couple of decades your sense of how to build something useful and present it has developed.

I would have guessed that a "what I did to catch up after 20 years of non-programming" project would have had a lousy UI (presentation), which yours does not. And while the presentation tools are better, the decisions of what to present and how are still up to the programmer.

Very cool site...with your background, it shouldn't be too hard to grok the concepts that you used to build the site...it's just that there are a lot of them, and joining them together (efficiently, and attractively) takes experience...if this is your first (or fiftieth) web project, it's one to be proud of.

(that said, you probably could've gotten away with just HTML, CSS, JS, jQuery, D3, and Bootstrap, as the site could probably run off of flat static files...but even sussing out that architecture is its own skill)