Before the match, I spent a fair amount of time describing Carlsen's astonishing endurance and ability to sustain his concentration over a six-, seven-, or even eight-hour chess game.

But his skill on shorter time frames is even greater.

And, although Karjakin was every bit Carlsen's equal during the standard time control games, today was all Carlsen.

So we move on. As I said, I don't think anyone is pleased that it had to go to tie breaks, but those are the rules, and that's how the match was organized; it was no surprise that this was a possibility.

Everybody is going to have their own opinions about the match, but overall I was pleased. It was fun chess to watch, and I can't wait for the next match! (Of course, not everyone shares my opinion.)

To probe these subtle shifts, scientists combined multiple radar scans from the Copernicus Sentinel-1 twin satellites of the same area to detect subtle surface changes – down to millimetres. The technique works well with buildings because they better reflect the radar beam.
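For a rough sense of why radar phase can resolve millimetre-scale motion: in repeat-pass interferometry, the phase difference between two scans of the same spot maps to line-of-sight displacement through the wavelength. A back-of-the-envelope sketch, where the wavelength is Sentinel-1's approximate C-band figure and the geometry and sign convention are simplified assumptions:

```python
import math

# Sentinel-1 carries a C-band radar; its wavelength is roughly 5.55 cm.
WAVELENGTH_M = 0.0555

def phase_to_displacement_mm(delta_phase_rad):
    """Convert an interferometric phase difference between two scans
    into line-of-sight surface displacement, in millimetres.

    The beam travels out and back, so one full 2*pi phase cycle
    corresponds to half a wavelength of surface motion.
    """
    return WAVELENGTH_M * delta_phase_rad / (4 * math.pi) * 1000

# Even a quarter of a phase cycle resolves to a handful of millimetres.
print(round(phase_to_displacement_mm(math.pi / 2), 1))  # 6.9
```

A strong, stable reflector like a building face keeps this phase measurement coherent between scans, which is why the technique favors built-up areas.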

Over the weekend, my wife and I were walking along the shore of the bay, approximately 7 miles from downtown, with a very clear view the day after a big storm, and my wife wondered if it was possible to tell which tower was the Millennium Tower from our perspective.

We had a lot of rain in November, and the reservoirs are filling up fast. New Melones and Pine Flat are still dramatic outliers at less than 25% capacity, but the Big Three (Shasta, Oroville, and Trinity) are rising quickly. Let's go, rain!

As it was making its slow descent, Schiaparelli’s Inertial Measurement Unit (IMU) went about its business of calculating the lander’s rotation rate. For some reason, the IMU calculated a saturation-maximum period that persisted for one second longer than what would normally be expected at this stage. When the IMU sent this bogus information to the craft’s navigation system, the navigation system calculated a negative altitude. In other words, it thought the lander was below ground level.
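ESA has not published the flight software details, so the following is only a toy reconstruction of the failure mode as described: if a saturation-ceiling rotation rate persists an extra second, the integrated tilt estimate blows past 90 degrees, and projecting the radar's slant range onto the vertical then yields a negative altitude. All numbers and the geometry here are invented for illustration:

```python
import math

SAT_MAX = math.radians(120)   # the toy IMU's saturation ceiling (assumed)
DT = 0.05                     # integration time step, in seconds

def integrate_tilt(reported_rates):
    """Integrate the reported rotation rates into a tilt estimate."""
    return sum(rate * DT for rate in reported_rates)

def altitude(slant_range_m, tilt_rad):
    """Project the radar altimeter's slant range onto the vertical."""
    return slant_range_m * math.cos(tilt_rad)

# Real motion: a short jolt pegs the gyro for 0.2 s, then the lander
# is steady again.
jolt = [SAT_MAX] * 4 + [0.0] * 20

# Faulty reading: the saturation-maximum value persists for a full
# extra second (20 more steps) after the jolt has actually ended.
stuck = [SAT_MAX] * 24

print(altitude(1000.0, integrate_tilt(jolt)))   # positive: still above ground
print(altitude(1000.0, integrate_tilt(stuck)))  # negative: "below ground level"
```

The point of the sketch is only that a one-second error in a rate signal, integrated, becomes a large attitude error, which in turn poisons the altitude estimate.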

...

Encouragingly, this behavior was replicated in computer simulations, which means mission planners stand a good chance of correcting the anomaly. The exact cause of the IMU’s miscalculation was not disclosed, but if it was tripped by some kind of mechanical problem, that would be bad news. The ESA is planning a similar mission in 2020, which doesn’t leave much time for an engineering overhaul. A software glitch, on the other hand, would likely prove to be an easier fix.

The bigger question at this point is how NATS Streaming will tackle scaling and replication (a requirement for true production-readiness in my opinion). Kafka was designed from the ground up for high scalability and availability through the use of external coordination (read ZooKeeper). Naturally, there is a lot of complexity and cost that comes with that. NATS Streaming attempts to keep NATS’ spirit of simplicity, but it’s yet to be seen how it will reconcile that with the complex nature of distributed systems. I’m excited to see where Apcera takes NATS Streaming and generally the NATS ecosystem in the future since the team has a lot of experience in this area.

We downsized the team working on the core components (the transactional, distributed key-value store) to the five engineers most familiar with that part of the codebase. We even changed seating arrangements, which felt dangerous and counter-cultural, as we normally distribute engineers randomly so that project teams naturally resist balkanization.

...

Relocating team members for closer proximity felt like it meaningfully increased focus and productivity when we started. However, we ended up conducting a natural experiment on the efficacy of proximity. First two, and then three, out of the five stability team members ended up working remotely. Despite the increasing ratio of remote engineers, we did not notice an adverse impact on execution.

...

The smaller stability team instituted obsessive review and gatekeeping for changes to core components. In effect, we went from a state of significant concurrency and decentralized review to a smaller number of clearly delineated efforts and centralized review.

Somewhat counter-intuitively, the smaller team saw an increase in pull request activity per engineer.

Concepts fight obsolescence. Even when ASP.NET inevitably dies, the concepts I've learned from programming in it for ten plus years will still be useful. Concepts have a longer shelf life than details, because details change. Languages are born and die, frameworks become unpopular overnight, companies go out of business, support will end. But the thoughts, the ideas, the best practices? They live forever.

Learn about SOLID. Learn KISS, DRY, and YAGNI. Learn how important naming is. Learn about proper spacing, functional vs object-oriented, composition vs. inheritance, polymorphism, etc. Learn soft skills like communication and estimation. Learn all the ideas that result in good code, rather than the details (syntax, limitations, environment, etc.) of the code itself. Mastering the ideas leads to your mind being able to warn you when you are writing bad code (as you will inevitably do).
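One of those long-lived ideas, composition over inheritance, fits in a few lines of Python (all class names here are illustrative): instead of inheriting behavior from a deep hierarchy, an object holds its collaborators and delegates to them, so each piece can be swapped or tested in isolation.

```python
class Engine:
    def start(self):
        return "engine started"

class Logger:
    def log(self, message):
        return f"[log] {message}"

class Car:
    """Composition: a Car *has* an Engine and a Logger and delegates
    to them, instead of inheriting from a deep Vehicle hierarchy."""
    def __init__(self, engine=None, logger=None):
        self.engine = engine or Engine()
        self.logger = logger or Logger()

    def start(self):
        return self.logger.log(self.engine.start())

print(Car().start())  # [log] engine started
```

The syntax is Python, but the idea survives the death of any particular language, which is exactly the point.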

What do modern applications look like? We are seeing the combination of rapid cloud-based provisioning, a DevOps culture transformation, and the journey from waterfall through agile to continuous delivery product development processes give rise to a new application architecture pattern called microservices. This shares the same principles as the service-oriented architecture movement from 10–15 years ago, but in those days, machines and networks were far slower, and XML/SOAP messaging standards were inefficient. The high latency and low messaging rates meant that applications ended up composed of relatively few large, complex services. With much faster hardware and more efficient messaging formats, we have low latency and high messaging rates. This makes it practical to compose applications of many simple, single-function microservices, independently developed and continuously deployed by cloud-native automation.

The AWS Well-Architected Framework documents a set of foundational questions that allow you to understand if a specific architecture aligns well with cloud best practices. The framework provides a consistent approach to evaluating systems against the qualities you expect from modern cloud-based systems, and the remediation that would be required to achieve those qualities.

Something that stood out in almost all of the presentations at DOES16 was the vast number of tools and point solutions companies are using to achieve their goals. Many speakers at some point in their presentations even listed the hodge-podge of vendors and tools they have in their toolchains. Different teams within an organization use a variety of different tools, and the resulting complexity can become overwhelming for enterprises.

Instead of racing to the bottom as the market plummets, Apple appears to be taking the “high road”, in a sense: They’re taking refuge at the high end of the market by introducing new, more expensive MacBook Pros, with a visible differentiating feature, the Touch Bar. This is known, inelegantly, as milking a declining business, although you shouldn’t expect Apple to put it that way.

Facebook has for some time allowed advertisers to create news stream ads with the option of not publishing them to the news feed, but it's still a fairly untapped play for the moment.

By employing this tactic, the advertiser mentioned above could run all four product ads as sponsored posts, target different audiences, split-test headlines, and even create personalized messages for demographic and geographic targets – literally run dozens of ads all on the same day – without a single ad showing in their own news stream.

The modern era of autonomous driving began in the 1980s. The US and Germany were the two nations at the forefront in this line of research. In the US, the research was largely funded by DARPA (Defense Advanced Research Projects Agency). In Germany, large automotive companies such as Mercedes-Benz funded research. The leading projects utilized computer vision-based systems, lidar, and autonomous robotic control. The decision-making systems were essentially driven by optimizing if-then-else algorithms (e.g., optimize speed subject to not exceeding the speed limit and not hitting anything; “if it is raining, then slow down…”, “if a pedestrian approaches within 5 feet on the left, then swerve right”). In other words, the systems were algorithmic, that is, codifying an algorithm describing the connection between road conditions and decisions related to speed and steering.
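A caricature of that rule-based style, with the two rules from the text hand-coded as a prioritized if-then-else cascade (the thresholds, names, and 80% rain factor are invented):

```python
def decide(speed_limit, raining, pedestrian_left_ft, current_speed):
    """A prioritized cascade of hand-written rules mapping road
    conditions to speed and steering decisions."""
    steering = "straight"
    target_speed = min(current_speed, speed_limit)
    if raining:
        # "if it is raining, then slow down"
        target_speed = min(target_speed, speed_limit * 0.8)
    if pedestrian_left_ft is not None and pedestrian_left_ft < 5:
        # "if a pedestrian approaches within 5 feet on the left,
        #  then swerve right"
        steering = "swerve right"
    return target_speed, steering

print(decide(65, True, 3, 70))  # slows to 80% of the limit and swerves
```

The trouble, as the next paragraph explains, is that real roads generate far more situations than any such list of rules can enumerate.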

However, to be able to drive on roads in unstructured and unpredictable environments, including ones where other drivers (human or autonomous) also exist, requires predicting the outcomes of a large number of possible actions. Codifying the set of possible outcomes proved too challenging. In the early 2000s, however, several groups began using primitive versions of modern machine-learning techniques. The key difference between the new method and the old method is that while the old method centered around optimizing a long list of if-then-else statements, the new method instead predicts what a human driver would do given the set of inputs (e.g., camera images, lidar information, mapping data, etc). This facilitated significant improvements in autonomous driving performance.
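A minimal caricature of the newer approach: instead of hand-coded rules, look up the most similar recorded human demonstration and return what the human did, a nearest-neighbor stand-in for behavior cloning. The sensor features and dataset here are invented:

```python
def predict_action(sensor_reading, demonstrations):
    """Return the action a human took in the most similar recorded
    situation: a nearest-neighbor caricature of behavior cloning."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, action = min(demonstrations,
                    key=lambda demo: squared_distance(demo[0], sensor_reading))
    return action

# Tiny invented dataset: (speed_mph, gap_to_car_ahead_m) -> human action.
demos = [((30, 50), "hold speed"),
         ((30, 10), "brake"),
         ((10, 80), "accelerate")]

print(predict_action((28, 12), demos))  # brake
```

Production systems use learned models over camera, lidar, and map inputs rather than a literal lookup, but the shift in framing is the same: predict the human decision instead of codifying every rule.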

Whereas the utopian view has argued that blockchain technology will affect every market by reducing the need for intermediation, we argue that it is more likely to change the scope of intermediation both on the intensive margin of transactions (e.g., by reducing costs and possibly influencing market structure) as well as on the extensive one (e.g., by allowing for new types of marketplaces). Furthermore, for the technology to have any impact in a specific market, verification of transaction attributes (e.g., status of a payment, identity of the agents involved, etc.) by contracting third parties needs to be currently expensive; or network operators must be enjoying uncompetitive rents from their position as trusted nodes above and beyond their added value in terms of market design.

Economically speaking, the American health care system is not built for patients, because patients aren’t the ones paying for it directly. Insurance companies are.

See, health care in the U.S. is mostly a B2B business. It is only B2C where insurance doesn't cover expenses for the patient. And even then, insurance still often pays when patients can't pay, don't pay, or both.

Thursday, November 24, 2016

You could just sort of tell, I think, that Carlsen wanted this game, badly.

On move 19, Karjakin offered an exchange of bishops, and re-took with his f-pawn, to open up the possible lanes for an attack.

But Carlsen quickly exchanged off the queens, and by move 30 the game had entered what became an extraordinarily complex endgame. Both sides had a pair of rooks and a knight, and all 16 pawns were still on the board.

But one of those pawns was doubled: Karjakin's f-pawn, offered back on move 19, became the focus of the entire game.

Carlsen maneuvered and maneuvered, patiently and carefully, taking his sweet, sweet time, as he is so willing to do.

Karjakin defended superbly, for 50 more moves, as the game stretched past the second time control, and entered its seventh hour.

And then, in a blink, it was over.

So now there are just 2 games to go in the match. Both sides have demonstrated they can win.

Wednesday, November 23, 2016

As we draw closer to the conclusion of the match, things have really heated up!

With his back against the wall, Carlsen is NOT going quietly.

In what appeared to be a very classic, very vanilla Ruy Lopez, Carlsen, with the black pieces, sacrificed a pawn in the opening for initiative, rapid development, and a quite threatening attack. By move 17, Karjakin's king was open and exposed, and Carlsen was lining up the big guns.

Moves 20 through 40 were, frankly, as exciting as chess ever gets, with the advantage see-sawing back and forth, pieces en prise all around, sacrifices, advanced passed pawns, each player's king being chased around the board, both barely avoiding disaster...

Then, just after both players had made the 40-move time control, all the pieces suddenly came off the board, leaving each player with just Queen and Bishop.

Karjakin had an extra pawn, but it was doubled, and Carlsen's pawns were connected, while Karjakin's were not.

Karjakin pressed hard, hard, hard for 30 more moves, but Carlsen was equally tenacious, and there was no breakthrough to be found by either player.

I'm really looking forward to the final three games. The match may have started slowly, but once blood was drawn, it's been as vibrant and vivid as I could have possibly hoped for.

But I think it is also like trying to count all the grains of sand on the beach.

There are so many.

And more keep washing up after each storm.

Trying to identify the bad guys and keep them out seems fundamentally flawed. It strikes me as somewhat analogous to the computer security debates about "white-listing" vs "black-listing". That is, you can try to enumerate all the things you don't want, but that's a long list. Perhaps better just to make a very short list of the sources you do trust.
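The white-listing idea in its simplest form, as a sketch (the domains are invented for illustration): the trusted set is short and explicit, and everything else is rejected by default.

```python
# An illustrative allow-list of sources we trust (domains invented).
TRUSTED_SOURCES = {"apnews.com", "reuters.com"}

def is_trusted(domain):
    """Allow-listing: accept only the short list of known-good
    sources, instead of trying to enumerate every bad one."""
    return domain in TRUSTED_SOURCES

print(is_trusted("reuters.com"))            # True
print(is_trusted("totally-real-news.biz"))  # False
```

The asymmetry is the whole argument: the trusted list is finite and maintainable, while the list of bad actors grows without bound.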

Or, even better, just to educate people about the need to "understand the context; understand the source".

Even, perhaps especially, we "simple code monkeys" who, in the end, build these algorithms and deploy them on these computers.

I had the opportunity, recently, to watch Shattered Glass, the now-15-years-old dramatization of the story of Stephen Glass and the fall of The New Republic, once perhaps the most respected magazine in all of journalism. It's not the greatest movie ever made, but it's an important story, and worth watching (or at least learning about). I thought the movie did a particularly good job of showing how so many different people were complicit, in so many different ways, in what happened.

I'm not sure where the answer lies.

But I'm very happy that the discussion is considerably more lively than it has been.

the U.S. Forest Service has identified an additional 36 million dead trees across California since its last aerial survey in May 2016. This brings the total number of dead trees since 2010 to over 102 million on 7.7 million acres of California's drought stricken forests. In 2016 alone, 62 million trees have died, representing more than a 100 percent increase in dead trees across the state from 2015. Millions of additional trees are weakened and expected to die in the coming months and years.

Who knows how you even count 102 million dead trees? I suspect they use techniques like those used to estimate the size of crowds during parades: divide a very large-scale picture into small sections; pick a handful of those sections at random; zoom in on each tiny image section and count the trees (by eyeballing them); assume an even distribution across the overall image and multiply by that factor.

That is, random sampling of large scale data.

Which can be quite sensitive to inaccuracies, but still, it's at least some data, collected as rigorously as their budget would allow, I suspect.
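That sampling scheme is easy to sketch: average the counts in randomly chosen sections, then scale by the total number of sections. A toy simulation, with an entirely invented grid of per-section tree counts:

```python
import random

def estimate_total(section_counts, sample_size, rng):
    """Estimate a survey-wide total by random sampling: average the
    counts in randomly chosen sections, then scale up by the total
    number of sections."""
    sample = rng.sample(section_counts, sample_size)
    return sum(sample) / sample_size * len(section_counts)

# Pretend the aerial imagery is 100,000 small sections, each holding
# some (invented) number of dead trees.
rng = random.Random(0)
grid = [rng.randint(0, 200) for _ in range(100_000)]

estimate = estimate_total(grid, 2000, rng)
true_total = sum(grid)
print(abs(estimate - true_total) / true_total)  # small relative error
```

The sensitivity mentioned above shows up directly in the sample size: shrink it, and the relative error of the scaled-up estimate grows.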

I was up in the mountains in August, for my back-packing trip, and again in October, for our visit to Mammoth Lakes and Yosemite. The route we took, both times, was CA 108, known as the Sonora Pass road.

From what I can tell, Sonora Pass has been roughly the dividing line between mild mortality and severe mortality, with the worst damage being to the south of Sonora Pass, while areas to the north have been significantly less affected.

However, the Forest Service cautions that even this not-as-bad-as-it-might-have-been news is changing quickly:

The majority of the 102 million dead trees are located in ten counties in the southern and central Sierra Nevada region. The Forest Service also identified increasing mortality in the northern part of the state, including Siskiyou, Modoc, Plumas and Lassen counties.

Those bark beetles are hungry, it seems, and once they ate all the healthy bark around Yosemite they flew off to find more trees to chew on.

On our October trip, we saw many of the results of the effort to collect and remove the astonishing number of dead trees in the various national forests, an effort the Forest Service has been championing, noting that:

"These dead and dying trees continue to elevate the risk of wildfire, complicate our efforts to respond safely and effectively to fires when they do occur, and pose a host of threats to life and property across California," said Vilsack. "USDA has made restoration work and the removal of excess fuels a top priority..."

Sunday, November 20, 2016

Together with my almost-13-year-old granddaughter, I saw Arrival this weekend. It was a nice way to spend a rainy day.

I admit that I was a bit of an easy target for Arrival.

I dabbled in linguistics as an undergraduate, and have always thought it was one of the most important fields of study, so any movie which is built around the Sapir-Whorf hypothesis was going to be right in my sweet spot.

Moreover, my daughter was a linguistics major at UCSC, and so any movie in which the heroine is a linguist was an easy sell.

And, one of the lead characters shares my granddaughter's name, so she liked it too!

More objectively, the movie is nicely paced, beautifully shot, and the director and writers have a good awareness of how to set up and deliver the principal breakthroughs for maximum effect.

And that logogram smoke-language? Gorgeous! Vivid! Inspired! Delightful! The subtly-delivered metaphor of comprehension emerging out of smoke and fog? Superb!

As modern science fiction movies go, this is probably as good as it gets nowadays.

But I can't help feeling that much of the current "buzz" around Arrival is due to its impeccable timing (probably mostly due to good luck), "arriving" as it does at a time when many of us are desperately wondering why it seems to be so hard to talk to, and to listen to, those we share this planet with, whether they be halfway across the world or just down the street.

Friday, November 18, 2016

This time, the draw was decided early, on move 32. The queens were still on the board, but each side had only Queen, Bishop, and pawns, and the bishops were of opposite colors.

Sunday, the second half of the match begins.

Due to a little quirk of the protocol, the order of the white-black pairing reverses for the second half of the match, so while Carlsen had the white pieces for the odd-numbered games in the first half of the match, Karjakin has the odd-numbered whites for the second half.

Which means that Karjakin had white today, and has white again on Sunday.

Tuesday, November 15, 2016

This game, in which Karjakin had white, went 94 moves and took nearly 7.5 hours, before it was finally agreed to be drawn.

Around move 70 or so, it seemed to me that Carlsen had a winning advantage.

Clearly, though, this just demonstrates how little I know. There were long stretches of this game which left me completely confused. The distance between my level of comprehension of chess, and their level of play of chess, is so vast as to frequently leave me slack-jawed, though mesmerized.

Leonard Barden's weekly Guardian chess column began in September 1955 and has continued since with no breaks for sixty-one years. It has broken the previous record for any columnist, held by English local columnist Tom Widdows, who wrote weekly in the Worcester News from October 1945 until April 2006, 60 years and 6 months. Leonard's other (daily!) column, in the Evening Standard, began in June 1956 and has continued every day since.

Mr. Barden has been writing a daily column since before I was born! (And I'm beginning to achieve a pretty ripe old age myself.)

The only other comparable person who comes readily to mind was the fantastic film critic Stanley Kauffmann, who worked for 55 years, right up until his death at age 97.

Congratulations, Mr. Barden, and may you keep it up as long as possible.

Mad Magazine’s Al Jaffee is the oldest working cartoonist, and he has the certificate from Guinness World Records to prove it. The satirist has worked with everyone from Stan Lee to EC Comics founder William Gaines over his career’s 73-year span, though he will probably go down in comics history as the inventor of the “fold-in”, a transformable painting that originally mocked lavish centerfolds in magazines like National Geographic and Playboy and later became Jaffee’s calling card.

Sunday, November 13, 2016

I knew this fact before, but every so often I re-learn that the various websites run by the various agencies of the State of California are of very high quality, and are a wonderful way to spend a quiet Sunday morning while listening to Mozart sonatas, drinking coffee, and frying potatoes and onions.

It typically takes weeks for counties to process and count all of the ballots. Elections officials have approximately one month (28 days for presidential electors and 30 days for all other contests) to complete their extensive tallying, auditing, and certification work (known as the "official canvass").

Meanwhile, there has been a dramatic increase in use of the vote-by-mail system.
The 2012 presidential election was the first election where a majority of the ballots cast were vote-by-mail.

As the results continue to be counted, the numbers continue to evolve. It is now looking like the overall California turnout will be much higher than I expected, as it's already up to 51% and clearly still climbing.

It is an extremely complicated thing, running an election of nearly 20 million registered voters.

Kudos to the California Secretary of State, and to all the individual county and city agencies that cooperate in the process, for what is, by all appearances, another extremely thorough, careful, open, transparent, and professional result.

Saturday, November 12, 2016

It seems to me that Karjakin actually had the advantage, and had the play, but somewhere around move 25 or so he must have played an inaccuracy, because all of a sudden Carlsen invaded the back rank and Karjakin settled for a repetition of moves.

One of the amazing things about computer programming is that you can lose yourself, deep, deep, deep into the contemplation of an arcane detail, and you can find that you've spent several hours on a beautiful fall day composing something like this, and yet it can just seem deeply, deeply satisfying.

Thursday, November 10, 2016

Lots of people are taking this opportunity to say something, so I'll chime in.

This election involved a lot of anger. Anger is a powerful motivator, but it is also an extremely destructive force.

Nobody was motivated to vote. Although the overall turnout was better than I dreaded, the final tally involved about 38% of the people currently living in the country; neither Trump nor Clinton managed to get even 20% of Americans to cast a vote for them.

People are made very uncomfortable by change, and I think many people made an implicit vote to "stop all those changes". But of course change does not stop.

People don't understand why their health care costs are so high, and continue to rise. Politicians said the solution was the A.C.A. Politicians said the problem is the A.C.A. But for the past 15 years, every year has seen double-digit price increases in healthcare, so manifestly both groups are wrong. No wonder people don't trust politicians, and, in their anger, use words like "corrupt", "liar", "rigged", which aren't at all the right words, but express a very valid and real frustration.

Yet let me say this; in fact, let us all stand up tall, throw our heads to the sky, and bellow this from every rooftop:

Just 10 weeks from now, for the 60th consecutive time in an unbroken 240-year span, the United States of America will conduct a peaceful transfer of power to a new, freely elected government.

Carlsen is, of course, an astonishing talent, and a tremendously exciting player to follow; his life story is at this point well-known:

Born in Norway

Noticed as a brilliant chess prodigy at a very young age

Was beating the top players in Norway by age 10

Earned the Grandmaster title when he was 13.5 years old

Came under the personal tutelage of Garry Kasparov at age 18

Defeated Viswanathan Anand in 2013 to become the World Chess Champion, and defended his title against Anand in 2014

Is particularly known for his extraordinarily precise calculations, coupled with an ability to win marathon games in endgame positions that most other players would assess as having far too small an advantage to be worth pursuing.

Is renowned as well for his brilliant and extraordinarily accurate play, which includes a number of spectacular games over the past 18 months during the major qualification tournaments for the championship

Surely Carlsen has to be the favorite, simply because he's "been there before."

But Karjakin should test him to the limit. In many ways, Karjakin should be a much stronger test than Anand was: Anand is considerably older at this point, and although he's always been my all-time favorite player, his skills were clearly below Carlsen's by the time they met.

Anyway, enough rambling.

I'm excited! The World Chess Championship begins in less than 4 days!

Because life's more fun this way, I feel the need to make a prediction, so here goes:

Carlsen wins 4 games. Karjakin wins 1. They draw 5 games, and the championship is decided after game 10.

So I set out on a road trip to the part of America most coastal elites don’t think about, except when they’re reading one of the fourteen daily pieces in the mainstream media where a journalist visits a town most coastal elites don’t think about.

Tuesday, November 1, 2016

It turns out that I'm just the right age to have witnessed the rise, and perhaps the fall, of a software industry segment known as "Enterprise Software".

Enterprise Software, as a term, came to mean the notion of a business which created software which was, in turn, sold to other businesses which used that software for their own purposes: to design and build their own products; to manage their books; to control their inventory; to track their customers; to streamline their internal processes.

Back in the late 1960s and early 1970s, software existed, but it wasn't thought of as a "product", as something you could "sell".

Rather, if you bought a computer (that is, bought a computer from IBM, or perhaps instead from Sperry or Burroughs or Honeywell or Control Data), you bought the hardware -- the actual computer, that is -- and the software was generally considered to be thrown in, as an accessory or a sweetener to the deal.

The history of the computer industry is short, as histories go; even if, as this entertaining article on storage media notes, we can trace computing hardware ideas back for a hundred years, it's really amazing how short the history of computing is, and yet how much it covers.

Obviously, the topic of the workshop was of great interest to me, having worked at both companies during the time period(s) in question, as well as because my manager of some two decades (Greg Batti) was one of the primary participants in the workshop.

It's interesting to remember what things were like, back in those days:

Butterworth: Well, that was kind of the beginning and, yes, we got the first products out so that was early in 1981. And at that point all the parts kind of worked. It wasn’t the world’s most robust software but it definitely got the job done. From that point forward, the big issue was trying to gear up the sales effort and gear up the technical effort to mature the product.

Janeway notes that this relationship between hardware and software as commercial "products" changed due to the U.S. Government's anti-trust activity with IBM:

IBM had come under anti-trust assault from the U.S. Department of Justice and, in 1969, its response was to unbundle software from its hardware leasing contracts, thereby opening up commercial space for an independent software industry.

Out of this basic alteration of the way the world viewed hardware and software, there quickly arose a brand-new industry:

Venture capital was needed for the initial cost of developing the software, including the cost of buying development hardware and software, but once there was a "minimal viable product" (to use a modern term) to sell, license payments funded the build out of the sales, marketing and customer support functions essential to turning a venture into a sustainable business. Moreover, the enterprise software industry was the first product-centric business not to require the "manufacturer" to hold inventory: losses on the revaluation of inventory were eliminated by construction and working capital needs were limited to financing receivables.

It was a time of great change, and of great progress in the creation of huge new corporations based on this notion of a software industry: Ingres and Sybase, of course, but also Microsoft, and Oracle, and many others whose names are just foggy memories nowadays.

Janeway's story, though, does not end there, but rather moves on to the next great event:

Software as a Service ("SaaS") transformed the customer’s purchase decision from a capital investment to a recurring operating expense. A product sale now generated cash flow to be collected in the future, whether based on a simple calculation of number of users licensed per period or transactions executed or some other metric of value realized over time versus anticipated value paid for upfront.

It is, of course, a great difference to transform an entire industry from "pay me now" to "pay me later" (or "pay me as you go", if you prefer), but just as important is how that change (from perpetual licensing to service subscription) changed the way that companies are built and operated nowadays:

The enterprise "customer" is now fragmented into multiple customers: the disparate business units where the need and budget are to be found...but not the technical capacity to deploy or manage complex code.

And, as Janeway goes on to highlight, this change in the "enterprise customer" has caused a corresponding transformation of the enterprise software sales process:

The disappearance of the internal enterprise IT Department as the customer for software means, in the first instance, that attempting to sell a new general purpose "platform" to the enterprise -- emulating the enormous success of Oracle in database or BEA in app server -- is quixotic. There is nobody home when the salesman calls. Instead, the technology must be fashioned into a "solution" relevant for and appealing to the distinct business unit customer. Now the problem is that there are too many homes on whose doors to knock.

For a near-perfect example of this transformation, you could do no better than to look at last year's hottest software IPO: Atlassian. Atlassian have just finished their first full year as a public company; you can read their Shareholder Letter here, although more relevant for this discussion is the minutiae in the Atlassian annual report, where we read things like:

We offer and sell our products via both the cloud and on premises using the customer’s own infrastructure. Our cloud offering enables quick setup and subscription pricing, while our on-premises offering permits more customization, a perpetual or term license fee structure and complete application control. Historically, our products were developed in the context of the on-premises offering, and we have less operating experience offering and selling our products via our cloud offering.

...

our software is frequently purchased by first-time customers to solve specific problems and not as part of a strategic technology purchasing decision.

...

We do not have a direct salesforce and our sales model does not include traditional, quota-carrying sales personnel. Although we believe our business model can continue to scale without a large enterprise salesforce, our viral marketing model may not continue to be as successful as we anticipate and the absence of a direct sales function may impede our future growth. As we continue to scale our business, a sales infrastructure could assist in reaching larger enterprise customers and growing our revenue. Identifying and recruiting qualified sales personnel and training them would require significant time, expense and attention and would significantly impact our business model. In addition, adding sales personnel would considerably change our cost structure and results of operations, and we may have to reduce other expenses, such as our research and development expenses, in order to accommodate a corresponding increase in marketing and sales expenses and maintain our profitability. If our lack of a direct salesforce limits us from reaching larger enterprise customers and growing our revenue and we are unable to hire, develop and retain talented sales personnel in the future, our revenue growth and results of operations may be harmed.

Predictions are hard, especially about the future.

But yes, Mr. Janeway, the evidence from the Atlassian Annual Report (and from many, many, many other sources) is crystal-clear: the times, they are a-changin'.

As private equity firms continue to make deeper entry into the software industry, people like me, who were perhaps aware of the existence of such firms but ignorant of the details, continue to learn more and more.

So here's a round-up of various interesting things I've learned recently.

Why is private equity buying up adtech companies? Basically, because there are too many of these companies, and the market that they are in keeps changing:

John Prunier, partner at Petsky Prunier, told Business Insider: "Confronted by scant interest from the largest and highest-paying strategic companies — Google most visibly — and a consequent drying up of growth capital, ad tech companies were forced to retool their products and business models."

Apparently the well-known Silicon Valley meme of: "Build me an adtech company; sell it to Google; retire!" has finally reached the inevitable result: "because there is a large supply of ad tech companies in the market, many ad tech company valuations still remain attractively modest enough for private equity firms."

As private equity partner Julie Langley observes, the private equity approach still requires certain properties of the business to be acquired:

Private equity much prefers enterprise software because of its stickiness. In other words: It’s painful to rip out and replace. A SaaS license model offers a degree of assurance in terms of driving longer terms revenues. This means private equity firms can feel more comfortable in putting more debt into the business, this in turn helps to drive their upside.

Along the same lines, Prunier notes that the adtech companies that private equity firms are looking to acquire should be ones that, among other things, have:

Repeatable, if not wholly-recurring revenue

Evidence that operating leverage or some other driver of EBITDA margin expansion will be attainable

That is: there must be reason to believe that revenue will remain mostly flat while expenses are slashed.
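To make that concrete, here is a back-of-the-envelope sketch of how the pieces quoted above (margin expansion plus "putting more debt into the business") combine to "drive their upside". All numbers and the model itself are my own invention for illustration, not from any of the sources quoted here, and the model ignores interest payments and debt paydown:

```python
def equity_multiple(revenue, margin_in, margin_out, debt_ratio, ebitda_multiple):
    """Multiple on invested equity for a toy leveraged-buyout model.

    revenue: annual revenue, assumed flat over the holding period
    margin_in, margin_out: EBITDA margin at purchase and at exit
    debt_ratio: fraction of the purchase price funded with debt
    ebitda_multiple: EBITDA multiple paid at purchase and received at exit
    """
    ebitda_in = revenue * margin_in
    ebitda_out = revenue * margin_out
    purchase_price = ebitda_in * ebitda_multiple
    debt = purchase_price * debt_ratio
    equity_in = purchase_price - debt
    exit_value = ebitda_out * ebitda_multiple
    equity_out = exit_value - debt  # debt repaid at exit; interest ignored
    return equity_out / equity_in

# Revenue flat at $100M; expenses slashed so the EBITDA margin goes from
# 20% to 30%; 60% of the deal funded with debt; bought and sold at 10x EBITDA.
print(equity_multiple(100.0, 0.20, 0.30, 0.60, 10.0))  # prints 2.25
```

With those invented numbers, a 50% improvement in EBITDA (from cost-cutting alone, with zero revenue growth) turns into a 2.25x return on equity, because the debt layer concentrates the gain on the smaller equity slice. That is the whole trick.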

You have massive pools of capital in China that want to be deployed in the U.S.; you have entrepreneurs in the U.S. who want do business in China. What’s missing is a level of trust and understanding of how the two sides do business with each other. CSC solved that problem in part by partnering with AngelList. We’re doing something similar for much bigger deals and for M&A.

Levit further notes that, although the Chinese investors have many similarities in approach to American private equity firms, the details are different:

The Chinese are most concerned about profit — true net income. It’s a very non-Silicon Valley mentality. Here, you reinvest your profit to make your company bigger. That doesn’t work so well in China.

We’re looking for companies that are massive in scale and throwing off earnings and look a lot more like a private equity target, though the Chinese are often willing to pay more than private equity.

And, in his always-superb Money Stuff column, Bloomberg columnist Matt Levine talks about how the last decade of monetary policy has had dramatic effects on American private equity firms: Big Data and Expected Returns

lower interest rates meant that expected returns on all sorts of asset classes went down. And so Goldman had to go out to investors and be like "hey it's not 20 percent anymore, sorry," and it was awkward.

Anyway, eight years into the modern low-interest-rate environment, private equity is going lower and longer

Joseph Baratta, the head of private equity at the Blackstone Group, the biggest alternative investment firm, said at a conference in Berlin on Tuesday that the firm was speaking with large investors about a new investment structure that would aim for lower returns over a longer period of time.

Many private equity executives say that these structures allay selling forced by a too-short time horizon, adding that longer time horizons will lower the number of sales between private equity shops as an exit strategy, which had surged this year.

They may also help combat the image of private equity as squeezing out profits in the short term, according to Jay Freedman at law firm Ropes & Gray.

I must say, I don't know what data Mr. Freedman of Ropes & Gray is seeing, because from my own observations there is no change in the behavior of "squeezing out profits in the short term." I see many examples of "operating leverage", of "throwing off earnings", and of "putting more debt into the business", and much less of "reinvest your profit to make your company bigger."

In fact, "invest" seems to be rapidly becoming a dirty word in the technology industry: Amazon is routinely pilloried for it, while companies that pay dividends or buy back stock are correspondingly celebrated.

From where I stand, the direction of the industry is clear.

Still, it's interesting stuff, even if much of it is still very puzzling to me.