A recent Forbes article described the hot new trend of tech companies hiring newly-minted holders of liberal arts degrees. In my 20+ years of experience, this is neither new nor hot – I’ve worked on amazing dev teams full of people with multiple advanced degrees in the humanities who felt like earning an actual salary – but it’s worth talking about. While it’s absolutely true that there is a vast shortage of people with STEM skills in the US, and plenty of well-paying jobs sitting vacant for them at tech companies large and small, the notion that you ‘need’ a STEM degree to land one of these jobs is damaging, both to jobseekers and to companies. At the same time, there is an extremely tired idea that studying the arts or humanities is a waste of time because it doesn’t ‘prepare you for the workforce’ – and that’s simply untrue as well. Both sets of skills are necessary in the modern workplace – and getting beyond that initial entry-level engineering job may be easier for those with a liberal arts background, as we’ll discuss in a moment.

But the basic premise of the article maintains a strict tech/non-tech divide: a new Slack employee with an arts background was briefly profiled, but the article emphasized that she was so useful precisely because she was non-technical:

She’s been at the company for barely a month but she’s already helped a construction company assimilate Slack’s software to keep track of things as varied as plaster shipments and building regulations via employee smartphones. Lee says she’s in awe of her technical colleagues who write Slack’s code. They, in turn, respect her because of her untechnical ability to “connect with end users and figure out what they want.”

And this is the point that is often misunderstood: you can absolutely succeed in a technical role with a humanities or liberal arts background, as long as you’ve also got the technical chops. And even if you are in a purely non-technical role, remaining ‘in awe’ of your technical colleagues isn’t particularly helpful – you should have at least some understanding of what goes into what they do, and know that it’s hard work, not magic. Many new non-engineering graduates gained solid technical skills while they studied Proust or philosophy (which does, to be fair, get a mention in the article), but it’s not always a given.

On the flip side, moving into a management or leadership role with a purely technical background is a different sort of challenge. For those looking to brush up on their technical skills, there is a burgeoning industry of boot camps and self-directed learning. If you’re an engineer who needs to learn to write, present and influence decision-makers in a new role, the path forward is rather murkier – even for someone on a strict principal-engineer path. Good code isn’t enough to get you there. Some of the more theoretical aspects of an engineering degree program (which in itself is not exactly ‘vocational’ education, though that’s something that could be much more highly valued in tech) are fabulous for helping you develop ways to approach a technical problem; being able to lead a team and explain to your leadership why you’ve chosen a particular path forward isn’t as straightforward.

People with balanced strengths in social and math skills earn about 10% more than their counterparts who are strong in only one area. In fact, socially inept math whizzes fare no better than go-getters who struggle with numbers.

While I’d be more than happy to introduce you to some equally-introverted historians (they’d totally hate that, of course), there is a useful point buried here: a basic understanding of both technology and the liberal arts gives you adaptability; fluency in both can give you career superpowers. Understanding how to wrangle data is important. Being able to contextualize and tell a story with that data, to multiple audiences, is equally critical. And having the ability to pivot to an entirely new role, vertical or industry is more realistic if you simply have more tools in your toolbox; being able to switch back and forth between technical and non-technical positions as business or life conditions change gives you options you might not have otherwise.

This is not to say that specialization is a bad thing, or that all engineers lack writing and management skills – far from it. But developing expertise in one or more areas is what happens on the job, as you gain more experience, and technical degrees become ‘stale’ far more quickly than those in the humanities: a programming language you spent several months, or even a few years, learning as an undergraduate is most likely almost useless ten years down the line – if you’re still in the field, you’ve learned new languages and skills through work. But the ability to research, synthesize and present arguments, whether those are about the Corn Laws or stylistic pottery variations at Mohenjo-Daro, is still valuable when differently employed. The subject may be far removed, but the skills around critical thinking, thoughtful skepticism and time management are vital.

And arts/humanities graduates have another leg up when it comes to tech job descriptions: ‘comfort with ambiguity.’ You’ll see a similar phrase in nearly every job description from a tech company, in both tech and non-tech roles, and yes, it’s an extremely useful quality to have in this (and many other) fields. Fortunately, if you’ve spent several years gathering data, writing research papers and debating complex issues that don’t have a clearly-identified ‘solution,’ congratulations – you’ve got the right mental training for this career. I’ve seen some young engineers struggle with just this aspect of the field – you can’t always engineer your way out of the problem (well, often you can build something, but it leaves significant technical debt that you – or someone else – will need to deal with eventually), and there may be multiple paths forward. Experience in referring to historical precedent goes a long way.

In my own tech career, I’ve never had to reproduce any of my shaky college algebra (turns out it wasn’t even useful early on as a front-end and back-end web engineer), but I write research papers, give presentations and analyze strategies and processes; these are things I was quite well-prepared to do as both an undergraduate and graduate student of archaeology – and that’s especially true for the data analysis skills I learned there, though the technologies and techniques are now quite different.

So, where do we go from here? I was fortunate to be in the right place at the right time as a self-taught techie; early in the dot-com era, the skills were the important thing; it didn’t matter where you’d acquired them. To a certain extent that’s becoming true again – boot camps and coding challenges are offering other paths into the profession. But there’s a fundamental disconnect in the way we approach teaching both technology and the humanities, at least at the high school and college levels (there seems to be a little more room to experiment in the elementary years, though that seems largely driven by the STEM-only crowd). A newly-minted engineer, at either the undergraduate or graduate level, needs coursework and experience in writing. New English or Art History grads may have had some exposure to technology through text mining or other digital humanities projects, but ensuring a solid exposure to ‘real’ coding is just as important for them. Internships would also ideally include both coding and writing experience – and many more are starting to do just that.

We also need to do a better job as a profession helping people from purely technical backgrounds move into senior roles – a few hours of ‘management’ or ‘business writing’ training isn’t especially impactful in most cases, and there aren’t equivalent writing ‘boot camps’ to help hone those skills. Having a foundation as a matter of course, even if it wasn’t the key focus of a degree program, would go a long way toward setting people up for success – testing out of English 101 isn’t the same thing.

While many larger tech companies have figured out that an ever-broader population has tech skills as well as what we might term ‘business’ advantages, startups and smaller companies aren’t always aware that they should cast a wider net in tech recruitment. Librarians have often been forced to become software development managers, just by the nature of modern work in the field. PhD historians often outpace new data science grads – many of those skills are part and parcel of modern academia; they just pay very poorly in that setting.

There is an artificial barrier between these two broad skillsets that needs to disappear; having a foundation in both is critical for success in tech, and in many other businesses. Putting the two together brings out the best of both, just like the commercials said.

I recently had the good fortune to geek out on corporate culture with the wonderful people of Zappos (full disclosure: we are ‘cousins’ within the Amazon ecosystem, though I include my usual ‘Not Speaking for AWS’ disclaimer here). While they had a full spectrum of fascinating, positive things about their culture to latch onto, what struck me most was the role that shared experiences played in shaping their unique approach to work – and how the thoughtful, intentional creation of shared workplace experiences is often overlooked as a tool to drive a positive corporate culture.

I am certainly not unique in having worked for a variety of companies, large and small, that miss the mark when it comes to helping you learn how to navigate and thrive in their specific cultures. Back in Silicon Valley during the dot-com boom and bust, I experienced both little startups – I was employee 18 (or so) at a dot-com, pre-crash-and-burn – and, subsequently, a few huge, global tech companies. While those organizations were very different from each other in almost every way, they did share a total lack of structure around onboarding. That’s expected (though not really excusable) at a startup, but even at Big Tech Company No. 2, no one helped me figure out how to get paid until about three months in. There was no training, formal or informal, on in-house tools, norms or expectations. I don’t think I saw a company mission statement or had a specific new hire or role-based orientation program until about a decade into my career.

And then I have experienced the other side of that coin – training and process overkill. Another nameless company I worked for was insistent about transmitting everything to do with its goals, values, compliance, and culture via time-consuming, mandatory e-learning. While there is certainly a time and place for asynchronous training, especially when you have a global workforce, I argue that if you are looking to foster long-term business relationships and a strong, healthy company culture, e-learning and classroom training aren’t magic bullets. Live, shared experiences are the key, and that brings me back to Zappos.

Everyone who joins Zappos, regardless of role or level, joins a cohort of new hires who have four weeks of training – they learn the customer service role inside and out, they work the phones and speak directly to customers in the call center; no one gets to opt out to attend a ‘more important’ meeting. Their training is capped off by a real-life graduation ceremony, and many of the people I met, in a variety of roles, fondly recalled their training; it gave them a firm grounding in the company culture and expectations, and set them up for success at building relationships across departments and roles. I’m sure those relationships are a major factor in why there were so many long-term Zapponians – people whose tenure often exceeded a decade. From a tech perspective (including my own, which, again, is not unique, where I’ve seldom been at any one company more than 2-3 years), that’s astounding.

This is not to suggest that every company should go out and bolt on a four-week immersion experience to their hiring process; it’s certainly not cheap, and for a globally-dispersed team, small or large, it’s simply not always feasible or even desirable. But even fully-remote companies realize that technology alone can’t create and develop culture; Automattic’s approach of an annual meetup for the full company and smaller team get-togethers creates regular opportunities for their employees to share experiences. Other companies have town halls or all-hands meetings that serve similar functions; the cyclical, almost ritual repetition of these kinds of meetings (and, not infrequently, the trip to the ‘libation chamber’ – the bar – afterward) lets employees build organic relationships and memories – ‘remember the all-hands where X spoke or Y performed?’ That’s important.

Shared experiences drive shared purpose. As humans, we seek out cyclical, seemingly ritual, experiences – is an annual trip to Disneyland substantially different from a theoretical ‘pilgrimage’ to Avebury or Stonehenge undertaken by their builders (and, quite probably, their plus-ones)? We have good evidence that the ‘users’ of Stonehenge (to put it in vaguely techie terms) liked a good annual party; the motivations behind it may not have been terribly different from those of a modern company picnic or offsite: do something different from your regular workday, with your colleagues (and possibly your family as well), then consume food and beverages. There would have been other commonalities with our era – everyone would recall the colleague who got horribly drunk one summer, or the time someone’s dog tried to attack the fire-eater (you may recognize the voice of experience here). While the terms we use to talk about prehistoric gatherings tend toward the mystical or mysterious, that’s largely a function of the paucity of evidence and/or our tendency to want to make something we don’t immediately understand more meaningful; annual or seasonally-occurring events in the distant past may have been quite similar to ours – a working meeting with a party afterward.

In the workplace, we create rituals whether we mean to do so or not. A standing happy hour, a semi-organized run at lunch, a yearly offsite or even our more formal business mechanisms like annual reviews or daily standups drive our culture. How we create and evolve those experiences for employees says a lot about that culture – going back to Zappos, they ensure that everyone has the opportunity to attend their all hands meeting; it’s such a priority that the call center is shut down for the occasion, as it is – briefly – for some other seasonal events. Creating an environment in which all employees have consistent, shared experiences builds personal connections and deeper engagement – provided those are good experiences. Yes, it’s hard to do globally, at scale, but it’s worth trying.

A few simple guidelines:

Be intentional. What do you want to create, and why? How will you evolve it?

Be consistent. Create a regular cadence and stick to it.

Be inclusive. If your site or event doesn’t welcome everyone (and there may well be certain team- or role-specific events), what are you telling current and prospective employees?

Have fun. You may not see a direct ROI on every event, but if your employees want to be there for the long term, you’re doing something right by giving them something to remember that isn’t just their meeting schedule.

Finally, think long term. Everything you do is adding to your company’s history, whether that will eventually be long or short – what kind of story do you want your employees to tell their future grandchildren or robot overlords?

It’s been difficult to miss stories of tech and startup culture fails of late, whether it’s Uber or Thinx, and there have been many excellent suggestions on how to improve diversity and the employee experience, but I’ll throw another one into the mix: hire an archaeologist*.

No, it’s not a joke – I fully admit it may be a head-scratcher at first, but hear me out: I’ve been working in technology for 20+ years, and while I’m emphatically not speaking about my current role at AWS, where I’m the Culture Lead (yes, we’re secretive, but you knew that, and no, I’m not claiming we’ve ‘solved’ everything culture-wise), I can assure you that my two archaeology degrees have been incredibly useful in this field – though never more so than in my present position. Allow me to explain –

I fell into technology while working on my MA in archaeology at University College London in the 1990s; I began my tech career as a coder and moved (kinda/sorta) swiftly into people and technology management in Silicon Valley, NYC and elsewhere – I’m now happily situated in Seattle, where I get to do all sorts of Secret Things I Can’t Tell You About Right Now. Along the way, I’ve seen some pretty bizarre things from a company culture perspective (terrible brand rallies! awful ‘culture fit’ excuses in hiring! team and product names that are totally offensive to colleagues in other regions!), but I’ve also been lucky enough to see the good as well. After a few general culture protips, we’ll discuss how having an archaeological viewpoint can be a huge benefit – for real.

First, though, a few notes on What You Should Do; your company culture, like any other aspect of business, can’t be left to good intentions – it needs structure and mechanisms to reinforce it and to help it evolve in a positive direction. Whether you are a tiny startup or a huge multinational, you need mechanisms that will scale with your organization’s growth, and that can be consistently applied wherever your people are. You may need to modify them to work in some regions or for remote people or teams, but they should still be scalable and repeatable.

Your culture is modeled by your leadership, and that’s at every level, from the C-suite to brand-new dev managers. While it seems that every company has ‘values’ or ‘principles’ that were drawn up early on, in my experience the uptake on these ranges from absolutely embedded and referenced on a daily basis to openly mocked and derided, with most places falling somewhere in between. When they work, they are a valuable tool and a core driver of your business – they dictate hiring and promotions, and offer direction on key decisions. When they don’t work, there’s usually an obvious reason:

They were developed by outside consultants to ‘sound good’

They are meaningless platitudes that simply take up time during the onboarding process

They are actively terrible, and are used as an excuse to avoid diversity

I won’t dig too deeply (see what I did there?) into the third point, simply because it needs to be its own discussion (as it is here), but I’ll pivot to why they work when they work:

They are thoughtfully, and intentionally, developed in-house, taking into account a wide range of viewpoints

They are flexible and can be specifically applied to daily work, but aren’t ‘rules’ that must be obeyed

They are regularly reviewed and updated as the company grows

They are an expected, and hence unremarkable, part of daily worklife

If your company’s mechanisms for people management don’t reflect whatever your company’s stated values are – or if they overindex on a specific one or two points – you’ll very quickly see drift away from the good intentions that went into their creation. Having repeatable, measurable processes around your business life cycle and the people who make it happen is the key to a healthy culture, and this is where the archaeologists come in.

The popular view of archaeologists falls into one of two main camps: we’re either Indiana Jones or scruffy bearded people with a fondness for drink who wish they looked a bit more like Indiana Jones. I surely don’t need to point out that both of those impressions skew almost entirely male (feel free to insert a Tolkien joke about dwarf wives and their beards), but there’s a lot more going on than just drinking, digging and/or punching Nazis. While I won’t get too deeply into describing different approaches to archaeology (for example, did you know that theoretical archaeologists mainly argue about French social theory, and rarely, if ever, go outside, much less dig? Did you know that post-processual archaeology is real? Mostly true facts!), there are some commonalities that give archaeologists an edge in mapping and shaping company culture.

Everyone ‘knows’ that archaeologists can take an artifact (or, more typically, an assemblage of artifacts) and use clues from that artifact to tell us more about the people who created it, traded it, used it or who perhaps just thought it looked cool. At work, we create ‘artifacts’ every day without thinking twice about it – documents, wikis, websites, apps, you name it. And when we’re speaking about those internally-created artifacts that are used to hire and manage people – interview notes, performance reviews, presentations and so on – it’s easy to forget that the mechanisms that generated those artifacts were designed with specific long- or short-term goals in mind. Indeed, there may have been considerable ‘cultural drift’ between a mechanism’s original purpose and its current usage; for example, it may have once been the case that ‘big ideas’ went through a presentation-heavy gating process to get executive buy-in, but now it seems that absolutely every decision goes through some version of that. That’s not to say that processes and mechanisms like that can’t work, but the rationale behind them needs to be understood, and they need to be regularly reviewed to ensure they are still fit for purpose. Not infrequently, the employees who actually need to follow these processes have little-to-no information about why they were created, or what the unwritten rules are – it’s purely tribal knowledge.

And that’s another way archaeologists ‘get’ how to dig (har) into corporate culture: when they don’t know why something was created or can’t pin down an obvious purpose, there’s a default answer – ritual! (In all seriousness, this is a thing. It’s practically reflexive). But so much of what happens day-to-day at work falls into this bucket as well; as mentioned, the people who designed (or inherited) a process have left, or have long since forgotten its origin, and it has become almost entirely ritualistic – we do it ‘just because.’ Sure, we’d like to fix that broken process or mechanism, but it’s like that For A Reason, we assume – and thus are corporate sacred cows born. This is just as true looking at archaeological sites; while some pretty weird things do, indeed, fall under the ‘ritual’ heading (at least without further evidence), it’s also clear that people in the past not infrequently did things just because they were fun or looked cool – they aren’t so different from us.

Throwing an archaeologist at your company processes and mechanisms can turn up all sorts of unexpected things about your company’s culture; simply having a complete audit of all the ‘things’ you’re doing, how they came about, whom they affect, how and where they are implemented is quite illuminating. Turning an archaeological lens on this adds further value; as mentioned above, people rarely know precisely why they created something or how it evolved, so having a background in making educated guesses in that regard, based on data, is quite useful.

With this information in hand, you can begin to make better data-driven decisions that drive your company culture – did you discover a gap in your onboarding process in a specific region? Perhaps there is no policy to handle difficult employee situations, or you may simply have not had time to develop a codified, shared value system for your organization. Knowing where you have a potential problem and what resources you need to allocate is job one – you can thank an archaeologist when they help you unearth these clues.

Finally, a closing thought for the archaeologists out there: want to come work in tech? You have great skills in data analysis, project management, research and writing (to name just a few), and many of you have excellent coding skills – while we don’t get to spend much time studying the past over here, we have the opportunity to help our organizations be thoughtful about how we build the future. Bonuses: excellent pay and benefits (actual excellent pay and benefits, not what most rescue digs or academia can afford), opportunities to work remotely and/or travel, and a work culture that still enjoys a drink or three – though that’s certainly not a requirement. Beards are entirely optional.

*Other types of social scientists are also available, but I don’t know if they are as much fun.

After years of following along on Twitter, not to mention 20 years simply existing as a woman in tech, I finally made it to my first Grace Hopper Celebration of Women in Computing in Houston (#GHC16 for you Twitter nerds) this year. And on the whole, it was a fabulous event — great keynote speakers, especially Dr. Latanya Sweeney of Harvard and Ginni Rometty of IBM, and so many opportunities to share experiences with other women in the field. It seemed that the vast majority of the attendees were computer science students looking for internships (and more power to them); they were poised, well-prepared and passionate about what we do — I wish I had been that clear about career paths when I was in my early 20s, and I was thrilled to chat with them — it was a splendid chance to offer advice and, of course, try to recruit them. Hiring is a lot harder now than it was in the 1990s, though more on that in a moment.

But I did notice a creeping undercurrent about who ‘counts’ as a woman in tech — not, I hasten to add, coming from any of the sessions I attended, merely snatches of conversation I overheard while walking the conference floor or lining up to get into a heavily-oversubscribed talk or two. ‘She’s just the recruiter’ or ‘I think she’s in marketing, not a software engineer’ or even ‘she’s not a CS major, she’s just looking to find a job with a good salary.’ And I admit that earlier in my career, I also had similar divisions in my mind — the women (and we only ever remarked upon the women, never the men — unconscious bias is a bitch) in marketing didn’t ‘get’ what ‘we’ the developers did, they were a different breed. Never mind that back then, few of ‘us’ had actually studied computer science; we had fallen into the profession through various routes — perhaps coding on the side as a hobby, or taking an interesting tech elective, or even been ‘drafted’ into a long-open role by having the ability to fog a mirror. But we worked with code. We were techies. Different. Special. Highly in demand.

But having racked up a lot more work and life experience since then, I realize now that it’s just as easy to be the person on the other side of the ‘othering.’ A decade-plus into my career, when a CS degree was becoming the more standard route into tech (and the number of women I worked with dropped off quickly around that point), not having one suddenly became a bit suspect. Was I still a ‘real’ techie when I became ever-further-removed from hands-on coding? Sometimes my matrixed reports didn’t think so – and were on occasion surprised to learn that I understood what they were talking about and could call them out on sloppy development work. Were my project managers still techies? Maybe. What about tech writers, editors and designers? Sometimes – especially if they were men.

The current mania for ‘STEM education’ at the expense of the arts and humanities, especially at the undergraduate level, makes the tech/non-tech division seem natural and ‘correct’ — when, in fact, you cannot build good tech products and programs without a diverse mix of skills and backgrounds. Yes, we need more women (and people of many other underrepresented backgrounds) in technology, but we cannot let an undergrad CS degree and a great internship become the only path in, nor should we let people become so focused on writing great code that they cannot develop in other ways. I want to meet great engineers who can also write well, give a kick-ass presentation and become go-to mentors for others — and those so-called ‘soft skills’ are just as vital, and need nurturing from the start. Outside interests are just as important; you can be passionate about what you do without it being the only thing you do.

I digress to make the point that we’re all in this together; whether you are a woman working in HR at a tech company or a female software engineer just getting started at a non-profit, you’re both women in tech. Even if your current team has an ideal gender balance (and I’ve been on quite a few), it’s unlikely you’ll always be that lucky in your future career; being able to advocate for each other, instead of only those who are Just Like Us (and Just Like Us doesn’t have to be based on gender or background — when we define ourselves by our roles at work, either in whole or in part, it’s relevant) is hugely important. There are no Fake Tech Women any more than there are Fake Geek Girls. Women who want to transition into a tech career from another field, perhaps with decades of non-technical experience under their belts, should not feel unwelcome. Given how incredibly difficult it is to hire people with the right skills, we need to stop gatekeeping, even when it’s unintentional, and help build other solid paths in. Coding boot camps, especially those with industry support that include internships for so-called non-traditional candidates, are a good start, but coding is just one important element of a successful tech career. Code should not be the sole defining feature of what a tech career looks like, any more than being a white dude under 30 is what a tech worker ‘looks like.’ We need to focus on our commonalities and drive positive change; creating artificial barriers is no help to anyone, not even the bottom line.

And that leads me to my next topic — where are the senior women in tech? The metrics presented at #GHC16 showed an uptick in early career tech women, but still what looks like a sheer cliff in mid-career and senior executive positions. The guidance offered was that formal leadership development programs are the key, and it certainly sounds like a useful path forward; I’ve been fortunate enough to participate in some useful coaching programs in previous roles, but they tended to focus on developing capabilities for individual projects or programs, rather than looking at how to move to the next level — that just ‘happened’ along the way. And I am very much aware of the fact that most of the other women I worked with in my early career are gone — they’ve left the field entirely.

But I took great inspiration from walking the #GHC16 conference floor and watching companies work hard to impress potential interns, entry-level and early career folk — imagine if we had the same opportunities as Old People to be, as Lerner and Loewe once wrote, ‘worshiped and competed for’ at conferences that focused on sharing roles at those levels. Yes, we get random calls from recruiters, but it’s not the same as having the opportunity to see a fuller picture of what’s out there and what we might work toward, nor does that offer the same chance to do in-person networking and story-telling. Luckily, there were some of ‘us’ there, and while we may not have been explicitly catered to by the hiring companies — not really an issue since most of us were there to hire for our own teams — it was nice to have some representation. Your tech career doesn’t have to end when you switch careers at 35 or take some time out to travel or have a family, and it’s important to see people who are visible reminders of that, just as it’s important to see real-life examples of women of color in tech, transwomen in tech, disabled women in tech and so forth.

I’ve written before about how the media tends to portray ‘successful’ women in tech as those who made the C-suite before 40 (or 30, or 25, or hey, why not 12?), or as young company founders blazing new trails. But a mature field allows for a wide variety of career paths, and incremental success is just as valid as headline-friendly overnight success. Sure, I’d like to have retired wealthy by 40 and had the opportunity to become a world-traveling philanthropist, funding rare book libraries and specialist archives all along the way, but I do really love my current position — I’m still moving onward and upward in my career (which affords me a ludicrous level of freedom and privilege compared to most), and I have the opportunity to mentor others. Whether that means we need to have more conferences aimed specifically at mid- and senior-career women in tech I do not know, but I do know that representation matters, and there was a lot of it at #GHC16. Hopefully there is more to come.

My other takeaway was that people will stand in line for a very long time for a freshly screen-printed t-shirt, but I have yet to wrap my head around that one — though that said, it created an ideal bottleneck for career conversations, so all in all, a win. 🙂

Now, if I can just find (or kick off) one of those formal leadership development programs, I’ll be set for my next act…

Another DAMNY is in the books, and once again, there were more thought-provoking sessions than one could attend without bilocating, but to my mind, that is the sign of a healthy conference agenda and a maturing field. While there were still discussions on choosing the right DAM and making the vendor-vs-roll-your-own decision – and very important and useful those are for those new to the field – it was encouraging to see more panels looking to the future – indeed, some were beginning to address the gaps I see in the DAM world. I continually wonder when DAM, content strategy and knowledge management will all coalesce (or, barring that, make their boundaries clear in solutions that play nicely together), and this year’s conference confirmed that I’m not the only person asking those questions.

Sometimes a DAM is implemented without giving much thought to the foundational content strategy: in these cases, simply ‘getting a DAM’ is expected to solve any and all problems related to the digital supply chain, content marketing, audio and video encoding, web content management, rights management, digital preservation and content delivery, all in one fell swoop. A tool built to manage what we might now call ‘traditional’ digital assets – images, audio and video – may be tasked with being the single source of truth for copy and translations, contracts and filesharing; in short, handling and delivering structured and unstructured data of all stripes with varying degrees of success.

And perhaps that is indeed where we are going, albeit more thoughtfully – if the DAM is truly to be the core of the digital ecosystem, the end users may not need to know what it can and cannot do under the hood, as long as ancillary systems are seamlessly doing what the user needs, thanks to some deftly-designed data models, well-described asset relationships and friendly APIs. But without DAM leaders, both those at DAM vendors and expert DAM managers, developing these use cases and solutions for them, and demanding some firm industry standards, it will take some time to get to that ideal state. A case in point that came up in several sessions was that of the explosion in video resolution and formats – while that (exciting) problem will not apply to every organization, the approach to potential solutions will most likely affect the direction DAM vendors begin to move.

Similarly, the opportunities presented by linked data and well-described semantic relationships must be embraced; the digital humanities field was quite rightly called out for being at the forefront of this wave, having been surfing it long before business or even most technology companies thought to dip a toe in the water (just take a look at any THATCamp writeup). Indeed, it’s another example of how librarians have been key to the development of DAM over the past decade; not only can they (we) whip up a snazzy taxonomy and run your DAM better than anyone else, but they (we) can be amazing futurists – defining a roadmap for a product before the vendor thought to do so, or simply building a homegrown solution.

And that brings me to a slight worry; I noted (though I was far from the only one to do so) that a few of the technology-specific panels fit the dreaded all-male panel stereotype. This has not been my general experience at previous DAMNYs, and I did see that at least one of them had not been designed that way, but DAM managers and end users – frequently librarians and, nowadays, marketers – and DAM product managers and developers sometimes give the appearance of dividing along gender lines. I’ve previously raised the concern about how this could affect salaries (tl;dr – as a technical, or any other, profession becomes more ‘feminized,’ salaries shrink), but I would hope that as a small, though growing, profession, we can all be mindful of that pitfall and work together to avoid a needless binary, where (at least superficially) men develop the software and serve in senior executive roles, but women do the day-to-day work. I will certainly grant that as a woman with 20 years of experience in technology, my Spidey sense is more sharply attuned to look for this than it might be otherwise, but here’s how you can all make me feel better – take this year’s DAM Foundation Salary Survey and let the data speak.

But there is another way a rising tide can lift all ships in this field – we can be more proactive about creating mentoring opportunities, both for those looking to get into the field and for those looking to get to that next career step. The DAM Guru program does an excellent job of matching people with those looking for advice on a particular solution or for those who are just starting out, but we have no formal mechanism as DAM practitioners to take that next step for mid- and senior-level folk. As someone who has been ‘doing this’ a long time, and in different types of companies, I’d be more than happy to mentor those coming up, but I’d equally love to spend some time with some of those very senior executives who are driving the shape of DAMs to come. To borrow a phrase, I want to be in the room where it happens, and I’d like to help other people who want to get there find their own paths.

My biggest takeaway from this year’s DAMNY is that we’re at an exciting point in DAM’s maturity, and for those of us who are lucky enough to have found our way into this field, often by fairly circuitous routes, it’s always nice to re-convene to be among ‘our people’ – but let’s take lessons learned from other tech specialties and ensure that the DAM community’s diversity continues to grow, rather than contract. As we develop systems with ever-broader capabilities, the field as a whole can only benefit from a wide range of backgrounds and experience – let’s aim to keep adding new lifeblood.

I should probably propose a DAM career development workshop for next year…

Twenty years ago this month, I landed my first tech job, quite by chance – and fell headfirst into a career I neither planned for nor expected, yet here I am, two decades later, enjoying my standing desk in a gleaming tower. The setting for this serendipitous accident was London, and London in January of 1996 was an exciting place to be. Britpop was in full force (even if many of the bands lumped into that category did not embrace the tag, often quite rightly), amazing comedy was all over television and live clubs, and the theatre was in fantastic shape, from the RSC to tiny pub venues. Keeping track of the wealth of culture on offer was the purview of Time Out, and even as a relatively poor grad student, especially one who was thrilled to discover student discounts on theatre tickets were much deeper in the UK compared to the US, I happily paid for a copy of the magazine each week to plan my leisure time – more on that in a moment.

Of course, I should not have had such extensive free time; I was busy studying for my MA at the Institute of Archaeology, with plans to go on for a PhD, and then to become a clubby and chummy academic in the JRR Tolkien or MR James mold – obviously, I fell at the first hurdle by never using initials rather than my first name, or possibly by having two X chromosomes and not being born in the 19th century. Instead, I seemed to find ample opportunity to hang out at the British Museum (that totally counted as work, right?), see bands like David Devant and his Spirit Wife, catch Iain Glen and Judi Dench onstage, hit regular comedy nights and, just for fun, I learned to build websites.

My coding hobby began initially as a way to organize websites I liked for easy access – enormous shared desktop computers in a lab did not make bookmarking useful, but having my own hotlist (hotlists were a thing) gave me some portability and, oddly, kudos among my less-technical peers. Even in that now-distant era before web comments became an archive of discontent, I soon realized that my free webspace let me share my interests – and gave me a platform to complain about things. I believe the Spice Girls came in for a good deal of online umbrage from me in those early, pre-irony days, but as a cool indiekid, my online persona had to take against them. But I later turned this opportunity in a more positive direction by building sites for bands I liked – official versions were still some way in the future. There was also the instant gratification element missing from academic research – if I wanted to spin up a new webpage, it only took a few minutes to knock together some code, find an appropriately-’90s background image, and play around with fonts. A brief aside – I once had a turquoise and neon yellow tiled background that perfectly matched a cheap shirtdress I bought at C&A, or possibly Topshop – it is possible that I was cosplaying my own website before anyone discovered something so ridiculously meta was possible.
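For readers who never met one, a hotlist was simply a hand-edited HTML page of links. A minimal sketch of what such a mid-’90s page might have looked like (the links, colors and content here are illustrative, not a reconstruction of my actual page):

```html
<!-- Hypothetical mid-'90s hotlist page: no CSS, presentational
     attributes on <body>, and a hand-maintained list of links.
     URLs and colors are illustrative only. -->
<html>
<head>
<title>My Hotlist</title>
</head>
<body bgcolor="#40E0D0" text="#000000" link="#0000EE" vlink="#551A8B">
<h1>My Hotlist</h1>
<ul>
<li><a href="http://www.example.ac.uk/">Department homepage</a>
<li><a href="http://www.example.com/bands/">Band pages</a>
</ul>
<hr>
<address>Last updated: sometime in 1996</address>
</body>
</html>
```

Those unclosed `<li>` tags were perfectly normal at the time – browsers closed them for you, and stylesheets were still years from mainstream use.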

Then I realized you could get paid to do this.

One day while poking around on Time Out’s website – one of a very few covering London in any meaningful way at that point – I saw an ad for a web assistant. If memory serves (and it may not be as accurate as I believe it to be), it sounded slightly mournful – the site was getting bigger, but no one else had the requisite HTML skills to keep it updated. Could someone please apply and perhaps they would train them to do the work? ‘But I can do that right now,’ I thought – and I duly emailed off a copy of my resume and links to the pages I had built. I got a speedy reply and an invitation for an interview – the notion of attaching a resume as well as links to previous ‘work’ seemed to have been rather more than any other candidates had managed. Within a few days, I presented myself at Universal House, just a short walk down Tottenham Court Road from my UCL stomping grounds, and was hired immediately.

I discovered that in addition to the princely sum of £75/day (yes, really), I’d also be receiving a free copy of Time Out each week – two if I wanted them! Never having had a real job before, such an unexpected perk was especially welcome – my days of getting terrible free corporate art, snacks, software release t-shirts and on-site massages were still some way in the future. I’d get to hear about upcoming gigs in advance as I dropped them onto the website, and if something was missing, I could add a plug for a band I liked, as long as it matched the writing style of the rest of the site. I learned about an exciting new comedy group called the League of Gentlemen, who had yet to make their way to television. I got press kits from bands like My Life Story, and invitations to alcohol-soaked book launches. I discovered that there was a free drinks trolley that went around the office on certain afternoons. In short, there was not a better job for an overeducated 20-year-old with no real responsibilities.

But it wasn’t all just fun and games – I also got the chance to build on my skills. When my boss (the only full-time employee on the website for a very long time indeed) went out of town, I got to field all the questions about what we did, and generally run the show; when I came back after a week away, I was excited to learn that he’d tweaked the site to improve the layout with ‘a new thing – tables in HTML.’ With our nested tables (frames came later) and many, many carefully-sliced gifs, we could almost, but not entirely, get rid of imagemaps for the ‘graphics-heavy’ version of the site that was offered to people with faster dial-up connections. A second brief aside here: while I never liked the sound of a connecting modem, I do miss the Eudora ‘new email’ tone, which was an exciting thing to hear at the time. The office sounds fundamentally different today.

In many ways, that first job set the template for my career; if I wanted to try something novel on the site – Javascript, ASP or another ‘new’ technology – I was encouraged to experiment. If it worked, great, and if not, well, it was worth giving it a go, and it was never bad to add another technical string to one’s bow on company time; continuous learning was considered standard practice. I could dress as I liked, and my usual t-shirt-jeans-and-Doc Martens wardrobe was utterly unremarkable. Another plus: occasional-to-frequent free booze. That structure has served me well in the diverse directions my career has taken me since then – to Silicon Valley before the dot-com crash, where I worked at Women.com (an experience not unlike a triple-decker novel in many ways), Juniper Networks and Hewlett-Packard, to New York as a techie-in-non-tech companies (and ditto in Philadelphia), and back to the west coast, where I’m now an Amazonian in Seattle.

In those twenty years, I’ve only ever had to ‘dress up’ for work for the non-techie organizations (interestingly, it’s also only outside of tech-specific companies that I’ve experienced any overt sexism, though that’s another story) – it was delightful to donate all my ‘grownup’ work clothes when we moved back to the left coast, where I can wear my nerdy t-shirts, hoodies and DMs to work again without a second glance. Also back: occasional free booze, though as the tired parent of a tween and a toddler, I’m rarely out late – I need my sleep, so the ‘occasional’ aspect is really by choice these days.

If I have any work wisdom to impart as a ‘veteran’ tech nerd lady, it’s this: hire smart people, with diverse backgrounds and skillsets, and let them get on with solving tricky problems as a team in their own way – but set high expectations. Keep learning about new technology, languages and tools, even if you accept you can’t be an expert in everything; it’s especially important if your career evolution has taken you out of day-to-day development and into a leadership position. Volunteer for things – the non-profit world desperately needs your skills and experience, and you never know when your passionate hobby project may become your full-time concern. But most importantly, ensure that the ladders you used to find your way still exist – or build new ones if they do not. There is no single path into the tech world, but people from ‘outside’ are not always aware how transferable, and ultimately useful, their experiences might be for a technical team. A little coding knowledge on top of solid writing, communication and management skills can go a very long way, especially if you give someone the time and space to learn by doing. Beer helps, too.

And if there is a larger moral to my narrative, it is that procrastination can pay off in ways you never expected – just call it ‘learning’ and it becomes a virtue, rather than a vice!

I fondly recall my very first URL – it wasn’t a GeoCities site, though that would follow along in due course – but just the few KB (indeed) of web space every postgrad student was allotted by the Institute of Archaeology, University College London. Unfortunately, there’s no trace of the content now, though the URL lives on as a ‘not found’ snapshot in the Wayback Machine. It’s a shame, because while I don’t recall falling prey to blink tags or other early web missteps, it did have a very vivid teal-and-yellow tiled background that coincidentally matched a dress I’d bought at Topshop (more on them below), and I wouldn’t mind seeing either one again. So, while my first foray into web development doesn’t exist anymore (a bit ironic, given that archaeologists love preservation, digital and otherwise), at least I still remember this: http://www.ucl.ac.uk/~tcrnlag/index.html

But thanks to the Internet Archive’s drive to save GeoCities – and, of course, a vast galaxy of sites beyond – some of my early work, both professional and otherwise, does live on; so many websites captured from the Time Before CMS and DAM. After running out of space on my UCL account, I set up shop on GeoCities with a ‘hotlist’ related to my MA dissertation – those were a big deal circa ’95-’96, since search engines weren’t especially powerful, and even the site that would become Yahoo, Jerry and David’s Guide to the World Wide Web, was human-curated back then. I also built a GeoCities site for one of my favorite bands, David Devant and his Spirit Wife, and employed what seemed like a pretty cutting-edge Java applet, though alas, the applet hasn’t survived the freezing process. And I nearly forgot until the recent 20th anniversary that I used to help out on The Craggy Island Examiner, a Father Ted fansite. The site was powered by basic HTML, visible tables and not a few pints at a pub near Waterloo where we held ‘editorial meetings,’ and once a mini-Tedcon, circa 1996. But that bit of volunteer work did help lead to my first actual web job, at Time Out in January of 1996.

The site was a one-man operation when I started, so it was perhaps noteworthy that the web team immediately reached gender parity when I joined (though we did have some occasional help from another gentleman/former member of Hawkwind later). I believe one reason I got the job was simply that I emailed my resume and links to my ‘experience’ in response to the job posting; it was mentioned in the interview that no one else had taken that radical step. Time Out was a fantastic place to work in the mid-1990s – I got a free copy of the magazine each week, I got invited to book launch parties, occasional press passes and the inside scoop on some of my favorite bands. All I had to do was update the site each week – all the global sites (such as they were then, imagemaps and all) were run from London. And when I saw it again, I actually remembered dropping in that note about Budapest. Midway through my tenure at Time Out, we brought in a more structured layout with ‘complex’ tables – though still no sign of a CMS or anything approaching one.

I moved on to work for an agency that built sites for clients like Christies, Condé Nast and the Evans Group (retail clothing shops like Dorothy Perkins, Evans, Topshop and so on), where heavily-sliced images, complex tables and frames – and getting them to line up in competing browsers – became the bane of my existence. But I do fondly recall the spinning ‘D’ on the Debenhams site; that was also quite exciting back then. And this particular Dorothy Perkins page was a nightmare to build – so I’m glad it still exists. Unfortunately the early Topshop pages seem to be long gone, though it was fun working on something for which you were the target audience.

But the real mother lode (as it were) of my early web work comes from the Internet Archive’s snapshots of my career at Women.com in Silicon Valley. As the web nerd in charge of the homepage, both for Women.com itself as well as many of its affiliated sites like WomensWire, Prevention and more, there’s a great deal more preserved. I moved back to the US in late 1998 (when the site looked like this), and, having turned down a wildly underpaid job at Yahoo (yes, there were stock options, but you couldn’t have paid rent in the meantime), I commenced work at Women.com. It was an exciting time to be there, and at first, there was a lot of ‘smart content’ aimed at women – not in the modern sense of ‘smart content’ of course, but there was a lot of information on careers, finances and health. It wasn’t quite Bust Magazine territory, but it wasn’t as far off as it would be later. I was tasked with building the redesigned site in 1999 – now everything was yellow – but what’s most interesting to see is what remains of the content – features like the Bloomberg/Women.com 30 Index, tracking the success of woman-led companies on Wall Street; the ‘first ever online presidential primary for women’ (spoiler alert: Al Gore won) and the Men of Silicon Valley (‘high-tech’s hottest bachelors!’). So yes, that was a Thing That Happened.

There’s much more to dig up and record where that came from; I was at Women.com until 2001, when, with the writing on the wall for pure content sites, I moved on to Juniper Networks where ‘no layoffs’ were promised – when that turned out not to be true, I went to Hewlett-Packard, where Carly Fiorina was on what seemed to be a mission to destroy the entire company, largely from the recording studio next to my desk, but that’s a story for another time…

This year’s always-fascinating and very valuable DAM Foundation Salary Survey came out in February, and there were some interesting – though also, possibly worrying – trends to analyze. First, though, the positives: DAM jobs are becoming ever-more-global, as companies begin to realize the value of their digital assets (or, perhaps more accurately, as they discover how disorganized or missing digital assets are a huge money pit). This is an encouraging trend, and one I would hope continues to grow. And the influx of those with MS-LIS and other library degrees suggests that the value of accurate metadata is being recognized – though it also raises a concern I’ll explore in a moment.

Mapping job titles to skillsets and salaries was noted as a continued area of confusion, and one I have certainly seen borne out myself, as well as amongst my peers; while it’s to be expected in a still-somewhat-nascent profession, it can be an area of frustration, not only for the postholder, but for potential recruiters and managers. It may seem a minor point, but given the volume of confused recruiter calls I receive, I think it’s worth digging into it for a moment, given this background from the survey analysis:

“Those with the term “Director” in their title tended to make the highest salaries, and those with the term “Archivist” or “Archives” tended to have lower incomes. There were no other clear correlations between title and salary. One listing that included the word “Supervisor” in the title made as much as other “Director”s; many with the title “Specialist” showed no appreciable difference than those listed as a “Manager”. This suggests that when reviewing the resumes of experienced DAM workers, an analysis of their actual daily duties, tasks, and projects may be more of an indicator of skill level than job title.”

Indeed, I’ve had to explain on numerous occasions that my current title, Content Librarian, isn’t ‘just’ a content management role, and that I’m fairly senior in the hierarchy, where my tasks include crafting policies, setting standards and analyzing IT solutions – so likening it to a position such as ‘the’ University Librarian, rather than ‘a’ librarian who happens to work for a university, only makes sense to those coming from academia. When speaking with those from a straight-IT background, I explain it’s a bit like a product or program management role with a lot of taxonomy bolted on, though any DAM professional knows that’s still only a portion of ‘what we do.’ And having worked in traditional library and archival settings as well as in IT-focused environments, that brings me to my chief concern – will having more (very useful) library skills drive down DAM salaries, over time, simply through assumptions made by employers over title and background?

I’ve experienced the disparity between IT and library-land salaries first-hand – I began my career in IT, building websites and managing content back when it had to be done by hand, before DAM and CMS solutions existed. Even as software to help corral and catalog content and digital assets came into being, my salary working with those tools remained quite comfortable. Then I went back to library school, with a view toward using my IT background, augmented by my new taxonomy and knowledge management skills, in the heritage/academic sector – libraries, museums and archives. Despite having additional skills and experience, moving into that world reduced my pay by more than 50%; at the time, it was a manageable reduction, and I had a fantastic work environment and great colleagues, but it wasn’t sustainable in the long-term. I returned to IT, and immediately more than doubled my salary – using the same skills, but with a different job title and cost center. While part of that jump was down to non-profit vs corporate budgets, even in the for-profit world, I know other DAM ‘librarians’ and ‘archivists’ who have found that a change in job title made a vast difference to them in terms of salary. It’s anecdotal, to be sure, but it seems that those whose titles are more ‘techie,’ and less ‘librarian-y,’ often have higher incomes, albeit for the same sort of work – and good luck figuring out who is more junior or senior, if job title is your guide! Clearly, we have some work to do.

As more librarians – and more women – come into the DAM field, there is a danger that salaries may become depressed; we already know that the youngest cohort in the survey results have lower salaries, and that they are overwhelmingly female, though they have more library degrees. Having said that, it’s quite rightly noted that their youth and relative lack of experience is likely the key driver behind their lower pay. But historically, the ‘feminization’ of a profession (think teaching, or, going back much further, textile production) has never had a positive impact on salaries; quite the reverse. It would be nice to think that we can ignore historical precedent and that we’ve moved beyond that – and I’ve written elsewhere about what it’s like to be a mid-career woman in technology facing those issues – but given the existing salary gender gap in DAM, it’s something we should continue to be vigilant about – let’s make sure that gap is truly reflective of a historical blip, and that it doesn’t become wider.

I am a firm believer in the value of a library background in the DAM world – combined with solid IT and management skills, it’s an ideal, broad-based skillset for an evolving field. And I completely understand someone coming from years in ‘traditional’ library settings jumping at the first salary offered in a DAM role; given the lack of funding in academia and public libraries, it’s (sadly) likely to be a big bump, regardless of how ‘low’ it might be for an IT or marketing position. But it’s been well-documented that failing to negotiate in salary situations leads to lifelong repercussions, and as we see more highly-skilled, and likely previously-underpaid people coming into DAM roles, we should continue to share salary surveys and job title information as we build toward a more well-understood profession. Likewise, as hiring managers, we should do our best to keep salaries fair, and to help our recruiters and HR departments understand that a great DAM professional might not be obvious from their last job title or training.

My longer-term hope is that by highlighting the value of librarianship in digital asset management, we can help enhance information work all around, making the wider world realize that it’s a useful route into a technical profession, and one that deserves to be better-known and appreciated, and paid on par with other IT jobs. An MBA may be one ticket to a ‘good’ salary in DAM, but we need to demonstrate that it isn’t the only one, and that men and women have an equal shot at long-term advancement in the field.

Consider this a call to action to make an impact before the next salary survey!

Note: I wrote something work-related, after years of silence in that regard! Revel in the novelty.

Having read and considered the three recent articles on the lack of innovation in the digital asset management space, I can only agree that there are certainly issues with vendors, chief among them the lack of standards – and it starts at the most basic level of simply describing what their solutions do. Major Vendor A can call their solution ‘digital asset management’ while Major Vendor B uses a broadly similar tool for web content management, but they can each easily swap labels if that’s what the customer thinks it wants, perhaps because the customer doesn’t have anyone with real DAM expertise on staff to dig further into what’s on offer.

And that goes to the core of Jeff Lawrence’s article – customers aren’t demanding clarity, much less innovation. It’s almost depressingly common in our field to discover that the only person in an organization who truly understands how DAM works (or, perhaps, how it should work) wasn’t involved in the purchasing decision; they’ve often inherited something that wasn’t truly fit for purpose, and they don’t have the budget to do much about it now. But if the customer does not budget for enhancements or new systems, vendors can’t be expected to pay particular attention; understandably, they’ve moved on to selling their existing solution to a new client. Yes, new features may roll out if that bigger client demands more attention during the implementation phase, but after that, the feedback loop unravels.

But standards are again top of mind in Ralph Windsor’s piece on the role of the media; his points about the truly alarming lack of metadata knowledge give one pause, and the difficulty in measuring ROI certainly takes time away from crafting the perfect taxonomy model. Some DAM vendors have clearly given careful thought to the role of taxonomy and metadata, and considered how users, both administrative and end-user, might interact with that metadata (even if they don’t know they are doing it). But that’s not true across the board, and if DAM enhancements have fallen to someone who lacks experience in that space, it’s difficult to move true functionality improvements forward, since all DAM functionality flows from useful, well-managed metadata.

And while we ‘know’ that the DAM saves money in the long run, demonstrating that to those who hold the organizational purse strings isn’t as easy as it should be. This can prove a particular challenge if the team (or, let’s be realistic – person) running a DAM is that rare IT unicorn with a combination development/project management/taxonomy background; suddenly they also need to become an expert in presenting on their program’s successes and challenges to senior management. While that’s a great career development opportunity (and you may detect the voice of experience here), tools within DAM software should make getting to that supporting data simple.

To summarize, my view is that there is a lot of truth in each article, and it’s something of a vicious circle. DAM vendors (or vendors that have decided they have a DAM solution, even if it’s far from best of breed) aren’t incentivized to innovate because the clients don’t demand it. Clients don’t demand it because they have systems that can be difficult to use, which makes it hard to build a business case for further improvements when they’ve already spent their initial budget – not infrequently on the ‘wrong’ system, so they are essentially starting from scratch again when they can afford to ‘go shopping’ once more. And much DAM media is so internally-focused that the ‘right’ people in organizations that need DAMs don’t even know it exists. It seems that one solution would be for DAM vendors to seek out long-term DAM managers and librarians for product management roles – people who live and breathe the tools, and who understand the importance of standards – to really push the next generation of DAM solutions.

And as DAM professionals, we also need to keep the conversation going with our vendors; it’s not always easy, and there isn’t always a response, but keeping quiet hasn’t helped so far. Let’s get loud!

A bit of background is in order: I fell into my first coding job in 1996. I was meant to be working on my MA in archaeology in London, but I discovered that HTML, even back in those days before tables, offered a sense of instant gratification that is often lacking from the humanities. I duly emailed my (then very brief) resume to Time Out magazine in response to an ad seeking a Web Assistant, and that was such a novel approach that I was hired on the spot. From there, I bounced to Silicon Valley, turned down a job at Yahoo that would not have paid enough to live on, stock options notwithstanding, and spent the next several years at Women.com, where I quickly rose to the heady heights of Web Production Manager.

While I did work with a few men, unsurprisingly the company was nearly all-female, and even when other companies tried to headhunt me (something that happened all the time to everyone in the late ‘90s/early 2000s, presuming you had a pulse, basic HTML and JavaScript, and an ability to navigate 101), most of the development teams I met with were largely female. Sexism never crossed my mind back then; the only faint flicker of an ‘issue’ was on a night out with some fellow techies. One complained that ‘all the good female coders get swept up into management,’ and I was taken aback — wasn’t that the goal? I enjoyed working with code, but I didn’t plan on doing that forever — I also liked managing people (though it was less pleasant when they were pretending to be sick or were otherwise not especially interested in their jobs) and projects, figuring out if a vendor had a good solution, writing here and there, and translating what my team did up to the c-level. In my mind, being a coder was a foundation for being a good tech generalist, which was what could (I thought) propel you up the hierarchy. It hadn’t occurred to me that some people simply loved code, and that there was an attitude among some of them that women who started in code but moved on were somehow letting the side down. I filed it away as an interesting point of view, but not one that would be terribly relevant to me.

Then the dot-com crash happened, and things began to change.

It’s certainly true that many of the pre-bubble companies, including some I worked for, could have been more strategic, less spendy, and generally more thoughtful in how they did their business. But it also seemed that hands-on experience in being a part of the building of those companies’ products became less important than having a freshly-minted MBA, and the men in suits — and they were largely men — swooped in to pick over what was left. One of my former colleagues, who had been at the company much longer than I had, said she felt the experience of the boom and bust had been like getting an MBA in how not to run a company, and I fully agree, all these years later. With tech jobs becoming harder to find, many friends and colleagues went into other lines of work, and I found myself a minority, along with my fellow female tech holdouts. I took jobs that were a few steps below those I’d had during the boom, but assumed that would be a temporary step back – surely, the market would improve and I’d be back where I’d been at the age of 24. It was around that point I realized that while I often still had female managers, that was as high as things went on the tech side. There were women executives elsewhere in the companies I worked for, but they tended to be in marketing, HR and other roles. Somehow, the lady nerd pipeline stopped at middle management.

I ensured I still had as many strings to my bow as possible: true, I did less coding, but much more project and program management, more writing and content strategy, more taxonomy, more going to conferences, and spent a goodly amount of time thinking about what I wanted to be when I grew up. Social media and even its mainstream cousin began to fill with stories of women in tech being either subtly passed over or outright abused (online and in person), and that became one new reality. On the flip side of that, high-flying women in tech of the Sheryl Sandberg- or Marissa Mayer-variety were trumpeted as success stories. But for those of us somewhere in between, there was no real acknowledgement of our experiences, nor a clear pathway to get from the middle to closer to the top. There are a lot of possible lateral moves, but unless you want to found a new company — and more power to you if you do — it’s hard to get to the c-suite, or even just below it. And the women who do make it there often came from a less technical background, though I’d argue that your MBA won’t teach you how to choose an enterprise software solution nearly as effectively as living through trying to integrate the wrong one. That’s not to say, as some might, that they don’t belong there; anyone can learn to code, but learning to be a good writer, people manager and politician can be a much trickier road. That said, it still seems that comparatively few women who started off in the lower rungs of the tech world, whether that’s valuable experience gained doing tech support or writing code, are getting to those top-level positions.

We’re told we don’t ‘do’ enough — we should ‘lean in,’ we should speak at conferences, we should go to hackathons, we should give back through programs that teach girls to code, we should be mentors in our workplaces — all while doing our day jobs, continually learning more skills on the side, raising families and occasionally sleeping. These are all positive things, but one wonders if men are held to the same measuring stick; I know very few men who do all of these things, yet they seem to keep rising in the workplace without all the ‘extras’ that so often seem to be prerequisites for a female tech leader.

And tech is perhaps unique in that it’s possible to earn more money in a ‘lower level’ position; I am constantly reminded by recruiters that I could be earning considerably more as a developer than in my present role managing developers (among other things). While I still enjoy breaking out code from time to time, I like flexing other muscles more, and I’m well aware that in many coding languages, there are people who are simply better at it than I am at this point. That may mean I no longer pass the ‘real coder’ litmus test, which I find another irritating variety of the ‘female fake nerd’ straw woman, but it’s equally important to have someone in the middle who can see All The Things. And if I can still call out a vendor who claims there’s no solution to a problem when I found one in five minutes on StackExchange, so much the better.

But the question remains — how can we ‘upper middle management’ tech women get beyond our current levels (understanding that there’s already a huge amount of privilege and opportunity that is simply unavailable to most people on the planet, male or female, but that’s another essay), and into those CTO/CDO/CIO offices? Obviously to some extent you need to write your own ticket, and that’s not a path everyone wants to take, but the mid-career ceiling seems to be less made of glass, and more of a Red Rover situation.

From my perspective, what’s missing are the stories of women in tech who had a more varied path to the c-suite (or to whatever more senior role was their goal, understanding that goals can and do change) — those who haven’t had the editorial-friendly ultra-rapid rise to the top, who weren’t profiled in Wired, who didn’t have a book tour, and who can help bring others up behind them along the way. There’s nothing wrong or inauthentic about those who did have that experience, but it’s not reflective of those who started off as worker bees and continue to keep the hive humming.

If younger women look at tech careers and get the impression that the two options are to either encounter unbeatable sexism, or that you’ll have ‘made it’ by 30 (and that something is wrong if you haven’t done that), we’re doing a disservice not only to them, but to ourselves. Highlighting the other positives about tech — flexibility, a culture of continuous learning and experimentation, and a wide variety of potential career paths — would be hugely helpful, and would offer a more realistic view of the field, which is so often presented in mainstream coverage as a binary (see what I did there?) either/or.

If I’ve learned anything in nearly 20 years as a nerd-for-pay, it’s that you make career leaps when someone tosses an opportunity you weren’t expecting into your lap, and you’re left to sink or swim with it. But as I’ve gotten further up in the world, fewer of those have come my way. I’ve had people assume I ‘wasn’t interested’ because I was a parent (never mind that the man who got the opportunity was as well), and had managers who ‘didn’t realize’ I had a hands-on tech background. There is no road map for challenging these assumptions, and beyond the high farce of the dot-com crash and subsequent layoffs (oh, I have stories), the ‘normal’ career progression isn’t an immediately exciting topic for a book. Even so, stories from women in tech whose careers have trundled along nicely enough, thanks very much, would be of great value to others coming up behind them.