Saturday, March 22, 2008

Dynafitted

Ever since I started skiing in the backcountry, I've been on Fritschi bindings. They got me into the sport, and I didn't have many complaints. Serious backcountry skiers kept telling me that Fritschis are heavy and have an unnatural walking mode, both of which waste energy on tour. But the fiddly aspects of the alternative, Dynafit bindings, and the fact that I finally had very good boots that worked both inbounds and in the backcountry, made me procrastinate. Seeing the efficiency of Dynafit gear again during my recent week in Canada, together with the good reviews of recent Dynafit-compatible boots, finally broke the logjam. I stopped at the Backcountry in Truckee last weekend and picked up the new setup today: Dynafit Zzero 4 C-TF boots, Dynafit Vertical ST bindings, and Karhu Jak BC 100 skis.

So light! Skinning up to Castle Peak was like no skinning I've ever done: the stride was so natural, and the connection between foot and ski so efficient, that it felt like a walk in the park. Skiing down on corn over sun crust was also smoother, thanks to the firm, low connection between boot and ski. The new skis are so light and fat that they deflect more easily than my stiffer, narrower skis, but the boot and binding make up much of the difference. I love this new setup.

About Me

I am VP and Engineering Fellow at Google, where I lead work on natural-language understanding and machine learning. My previous positions include chair of the Computer and Information Science department of the University of Pennsylvania, head of the Machine Learning and Information Retrieval department at AT&T Labs, and research and management positions at SRI International. I received a Ph.D. in Artificial Intelligence from the University of Edinburgh in 1982, and I have over 120 research publications on computational linguistics, machine learning, bioinformatics, speech recognition, and logic programming, as well as several patents. I was elected AAAI Fellow in 1991 for contributions to computational linguistics and logic programming, ACM Fellow in 2010 for contributions to machine-learning models of natural language and biological sequences, and ACL Fellow in 2017 for contributions to sequence modeling, finite-state methods, and dependency and deductive parsing. I was president of the Association for Computational Linguistics in 1993.