Alejandro Soto — a planetary scientist and aerospace engineer involved in the exploration of the solar system and extrasolar systems.
https://alejandrosoto.net/
Wed, 01 Apr 2020 18:30:47 -0600
<h1>Books read in 2017</h1>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/2017books_2018-02-04.png" alt="" />
<p class="image-caption"> </p>
</div>
<p>My girlfriend is on a mission to read 100 books in four years. Since she got a strong start in 2016, she was motivated to read at least 25 books in 2017. Her enthusiasm was infectious, so I decided to take on a similar challenge. I committed to reading 24 books in 2017. At two books a month, that seemed like a reasonable goal for the year.</p>
<!-- more -->
<p>I have always considered myself an avid reader, but apparently I have not been a frequent reader for a long time. Reading 24 books in 2017 proved to be more of a challenge than I expected. I achieved my goal, but I definitely had to work at maintaining a good pace. In the end, I read 9396 pages over 25 books in twelve months.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/2017books_Screen_Shot_2018-01-24_at_3.30.54_PM.png" alt="" />
<p class="image-caption"> </p>
</div>
<p>Mixing challenging, difficult books with fun, light reads proved to be a successful strategy for me. Two things helped. First, I started the year with some short books, including John Scalzi’s <em>Miniatures</em>, which clocked in at a whopping 104 pages. Starting with short books allowed me to tally some early “wins”, thereby building confidence that I might actually pull off this goal. Second, August proved to be an important month. In the preceding three months my reading rate had dropped to an average of one book per month. In August, I made up for that lackluster pace by finishing four books. Admittedly, one of those books was an audiobook (the entertaining <em>Born a Crime</em>, read by the author, Trevor Noah), but that counts from my perspective. A weeklong road trip in August was a perfect time for an audiobook.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/2017books_Screen_Shot_2018-01-24_at_3.31.12_PM.png" alt="" />
<p class="image-caption"> </p>
</div>
<p>To people who know me well, my choice of books might be predictable, although I think I have a few surprises in the list. There are 13 science fiction books (I am not counting <em>REAMDE</em>, though that’s debatable), which make up the bulk of my fiction reading. The other two fiction books were Neal Stephenson’s <em>REAMDE</em> and Don Winslow’s <em>The Cartel</em>, both of which I enjoyed immensely. Of the remaining 10 nonfiction books, two were science books, two were philosophically oriented, five were memoirs of sorts (including one of the science books), one was an essay, and one was a history (the wonderful <em>Hidden Figures</em>). If I had to guess the book that would most surprise my friends, it’s Megyn Kelly’s <em>Settle for More</em>. But who knows what really surprises people?</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/2017books_Screen_Shot_2018-01-24_at_3.32.00_PM.png" alt="" />
<p class="image-caption"> </p>
</div>
<p>Overall I enjoyed every book I read in 2017. They each entertained me and provoked me in their own way. For some of the books, I even wrote short book reviews, which I share below. These reviews were fashioned, in tone and length, for a <a href="http://www.goodreads.com">Goodreads</a> audience.</p>
<h2 id="the-currents-of-space-by-isaac-asimov"><em>The Currents of Space</em> by Isaac Asimov</h2>
<div class="pull-left">
<img src="/images/2017books_22007437.jpg" style="height:125px;" />
</div>
<p><em>The Currents of Space</em> by Isaac Asimov is one of Asimov’s older works, where you can see him working out what will eventually become his Foundation universe. This is one of the rare books where Asimov attempts to tackle issues of race while still providing one of his classic yarns, complete with a plot twist at the end. The story fails both to be exciting and to really challenge race issues in his fictional universe. This novel also continues the paper-thin portrayals of women that are so evident in early Asimov novels.</p>
<p>I read this book because Asimov “retconned” it, and the other two books in the Galactic Empire series, into his larger Foundation universe. However, the connections to the Foundation books are inconsistent, since he did not originally plan out how the various stories would connect. I enjoyed reading this book in connection with the Foundation series, but it does not stand alone very well.</p>
<h2 id="miniatures-the-very-short-fiction-of-john-scalzi"><em>Miniatures: The Very Short Fiction of John Scalzi</em></h2>
<div class="pull-left">
<img src="/images/2017books_33225997.jpg" style="height:125px;" />
</div>
<p>The title does not lie: these are very short pieces of fiction by John Scalzi. Despite their diminutive size, however, the stories are packed full of jokes and clever observations. Best of all, this is a quick read — perfect for the reading commitment-phobe!</p>
<h2 id="on-bullshit-by-harry-g-frankfurt"><em>On Bullshit</em> by Harry G. Frankfurt</h2>
<div class="pull-left">
<img src="/images/2017books_385.jpg" style="height:125px;" />
</div>
<p>This short book, an essay really, is a surprisingly thought-provoking discussion of the definition of bullshit. As often happens with philosophy, I began the book in disbelief that I needed a definition of such an obvious word, but in the end the process of exploring the definition was well worth it. At a minimum, Frankfurt’s discussions of the differences between telling the truth, telling a lie, and bullshitting are worth consideration. Throughout history, humans have benefited from understanding these differences and identifying their occurrence.</p>
<h2 id="the-collapsing-empire-by-john-scalzi"><em>The Collapsing Empire</em> by John Scalzi</h2>
<div class="pull-left">
<img src="/images/2017books_30282601.jpg" style="height:125px;" />
</div>
<p>If you are looking for an action packed read with space battles, palace intrigue, and a foul-mouthed heroine, this is your book! If you are looking for another classic Scalzi sci-fi then this is also your book. In <em>The Collapsing Empire</em>, Scalzi introduces a new universe on the edge of change where the people with solutions aren’t the people with power. The story is told in typical Scalzi style, with humor and fun. It’s the first of a series, so I eagerly await the sequel to see where this story is going.</p>
<h2 id="the-cartel-by-don-winslow"><em>The Cartel</em> by Don Winslow</h2>
<div class="pull-left">
<img src="/images/2017books_24739992.jpg" style="height:125px;" />
</div>
<p>In <em>The Cartel</em> by Don Winslow, we follow DEA agent Art Keller as he hunts down a Mexican cartel leader, ostensibly as part of the war on drugs but in actuality as revenge for the death of a DEA agent. Although this book is a sequel to Winslow’s <em>The Power of the Dog</em>, the death of the DEA agent is the only key piece of information needed from the previous book, and Winslow makes sure that you know all about it, with frequent reminders of Keller’s motivations as he hunts the cartel leader. This book is an exciting read because it’s one long, continuous chase, with plenty of action and just a little bit of romance and character development. What makes it a disturbing read is the level of historically accurate detail that Winslow brings to the book. The characters are fictional, but many of the cartels and their activities are drawn from real life, particularly the violence and corruption. Winslow has plumbed the tragedy of the war on drugs and its effects on the Mexican people to provide a disturbingly realistic crime story. As you follow Keller on the hunt for revenge, you are also challenged by the author to understand the consequences of the war on drugs and to empathize with its victims in Mexico. This book is worth reading, both for the story and for the challenging issues it raises.</p>
<h2 id="the-joy-of-x-by-steven-strogatz"><em>The Joy of x</em> by Steven Strogatz</h2>
<div class="pull-left">
<img src="/images/2017books_13356649.jpg" style="height:125px;" />
</div>
<p>In <em>The Joy of x</em>, Steven Strogatz introduces us to the wonders of mathematics. The book is based on a long-running blog, so each chapter is a short, stand-alone essay discussing a particular feature of math. The essays are organized thematically, but you can also enjoy them in isolation. I read the book over the course of a year, reading each essay and then letting the topic settle in my brain for a while. Strogatz succeeded in writing about math topics both practical and esoteric in a manner that should be accessible to readers regardless of their prior background in math. I wish I had a book like this assigned along with the textbook during my high school math classes.</p>
<h2 id="trespassing-in-einsteins-lawn-by-amanda-gefter"><em>Trespassing in Einstein’s Lawn</em> by Amanda Gefter</h2>
<div class="pull-left">
<img src="/images/2017books_19743444.jpg" style="height:125px;" />
</div>
<p>Hidden in the memoir about a woman and her father is a popular science story about reality. Or is it: hidden in this popular science story about reality is a memoir about a woman and her father? Either way, this is a wonderful book by Amanda Gefter.</p>
<p>Gefter’s father shared his love of physics and cosmology with his daughter. Together they have led their own investigation into the nature of reality. Inspired by that investigation, Gefter started a career in journalism that allowed her to ask the leading physicists questions about reality. With this book, Gefter brings us along on this journey, teaching us about fundamental physics while showing us the process by which questions lead to ideas, which lead to discoveries. This is a challenging book to read, filled with abstract and mind-bending concepts, but the journey is worth the work.</p>
<h2 id="between-the-world-and-me-by-ta-nehisi-coates"><em>Between the World and Me</em> by Ta-Nehisi Coates</h2>
<div class="pull-left">
<img src="/images/2017books_25489625.jpg" style="height:125px;" />
</div>
<p><em>Between the World and Me</em> by Ta-Nehisi Coates is a long letter from Coates to his son. Coates is sharing his experiences and knowledge of life as a black man in America. His intent is to guide his teenage son as the young man enters adulthood in America. For the rest of us, Coates provides an intimate view of life as a black man in America, a view that more non-black people need to read and see. We need this knowledge in order to have more compassion and sympathy, for all people. This was a powerful and challenging book that has not left me.</p>
Wed, 31 Jan 2018 00:00:00 -0700
https://alejandrosoto.net/blog/2018/01/31/books-read-in-2017/
Tags: reading, books
<h1>Astronomy Adventure in Argentina, Part 1</h1>
<p><em>This post <a href="http://planetary.org/blogs/guest-blogs/2017/20160616-mu69-occulation-campaign.html">originally appeared</a> on the guest blog at the <a href="http://planetary.org">Planetary Society</a>.</em></p>
<p>I work as a planetary scientist at the <a href="http://www.boulder.swri.edu">Southwest Research Institute in Boulder, CO</a>. My main area of research is the atmospheric and climate dynamics of terrestrial atmospheres, like those of Mars, Titan, and Pluto. However, I have a pretty eclectic background, having been involved in projects outside of my main area of research. Thus, when one of the astronomers in my office mentioned that I should get involved in an occultation campaign for the <a href="http://pluto.jhuapl.edu">New Horizons mission</a>, it made complete sense to me to say yes. And I am glad I did, because I had a lot of fun helping dozens of other astronomers in an ambitious observation campaign.</p>
<!-- more -->
<p>Although the Pluto flyby, which occurred in the summer of 2015, was the primary motivation for the mission, New Horizons always intended to visit other <a href="https://en.wikipedia.org/wiki/Kuiper_belt">Kuiper Belt Objects (KBOs)</a> after the primary mission ended. In 2014, using the Hubble Space Telescope, the New Horizons team <a href="http://www.planetary.org/blogs/emily-lakdawalla/2014/10151024-finally-new-horizons-has-a-kbo.html">eventually</a> found <a href="http://hubblesite.org/news_release/news/2014-47">targets</a>. Two years later, <a href="https://www.nasa.gov/feature/new-horizons-receives-mission-extension-to-kuiper-belt-dawn-to-remain-at-ceres">NASA approved</a> an extended mission to one of those targets, a KBO named <a href="https://en.wikipedia.org/wiki/(486958)_2014_MU69">2014 MU<sub>69</sub></a>.</p>
<p>MU69, as it’s known among the New Horizons team, is likely a small object, with an estimated radius in the <a href="https://en.wikipedia.org/wiki/(486958)_2014_MU69">tens of kilometers</a>. However, we are not really sure about its size, and we know very little about the <a href="https://en.wikipedia.org/wiki/Albedo">albedo</a> of the surface, which is basically its brightness. Additionally, we have no idea if there are rings or debris surrounding MU69, which could be hazards for the New Horizons spacecraft.</p>
<p>One of the best ways for us to learn about the size, albedo, and surrounding debris of MU69 is to watch the KBO create a <a href="https://en.wikipedia.org/wiki/Occultation">stellar occultation</a>. When MU69 passes in front of a star, it temporarily blocks (“occults”) the light of the star. If we take a time series of the light from the star, the occultation creates a deep dip in the light curve. The figure below shows an example of a stellar occultation, where the dots are actual measurements of the star’s brightness. The deep dip in the center of the figure is the occultation of a star by <a href="https://en.wikipedia.org/wiki/10199_Chariklo">Chariklo</a>, a <a href="https://en.wikipedia.org/wiki/Centaur_(small_Solar_System_body)">Centaur</a> that orbits between Jupiter and Neptune. The small dips before and after the main one are due to a ring of small particles in orbit around Chariklo. This is an example of the kind of information that the MU69 occultations will provide about the KBO and any rings and debris that might exist.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/20170614_chariklo-ring.png" alt="Chariklo occulation data" />
<p class="image-caption">A RING SYSTEM DETECTED AROUND THE CENTAUR (10199) CHARIKLO. Figure from F. Braga-Ribas et al., “A ring system detected around the Centaur (10199) Chariklo”, Nature 508, 72-75 (03 April 2014), http://dx.doi.org/10.1038/nature13155.</p>
</div>
<p>MU69 has three stellar occultations this summer, on June 3rd, July 10th, and July 17th. I had the opportunity to join the team of professional and amateur astronomers observing the June 3rd occultation. That occultation was observable from two areas on Earth: South America and South Africa. Since observing this occultation would greatly benefit the New Horizons mission, the team of scientists leading the campaign decided that we needed as many telescopes deployed in the field as possible. Thus, twelve portable telescopes were sent to South Africa and twelve were sent to South America. With two astronomers assigned to each telescope, along with support staff from the United States, South Africa, and South America, around 60 people were in the field observing the occultation. Even before we left for South America and South Africa, a lot of work was done to <a href="https://www.nasa.gov/feature/new-horizons-deploys-global-team-for-rare-look-at-next-flyby-target">prepare for this occultation observation campaign</a>.</p>
<p>I was sent down to Mendoza, Argentina, along with the rest of the team assigned to South America, to try our best to see the occultation. We had predictions of the path of the MU69 occultation as it intersected the Earth, but even with these predictions there was uncertainty as to the precise location we needed to be in to observe the occultation. Therefore, we did not know where in the Mendoza province we would set up the telescopes until we arrived. Most of us arrived on the Monday before the occultation event, which took place just after midnight local time on Friday night (i.e., 03:15 Universal Time). The next day, 12 teams of astronomers drove through the countryside scouting possible observation sites. They were looking for sites without trees or buildings, away from the city and any surrounding towns. Dozens of potential sites were found, and the final 12 sites used during the actual occultation observation came from this first day of scouting.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/mu69_June3_occultation_path.png" alt="June 3rd occultation path" />
<p class="image-caption">THE PATH TO THE MU69 OCCULTATION OF A DISTANT STAR. The occultation moved from right to left, so the team in South Africa observed it minutes before the team in South America. Image courtesy of the New Horizons team and SwRI.</p>
</div>
<p>I spent my days in the run-up to the occultation night slightly differently. Since I speak a decent amount of Spanish, I helped track down supplies in the town of Mendoza. These tasks provided me with the unexpected pleasure of sharing our adventures as astronomers with the citizens of Mendoza. My favorite moment was convincing two employees at Easy, the Argentine equivalent of Home Depot or Lowe’s, to cut some wood for me that the teams could use as shims to help level the telescopes out in the field. I like to say that I speak “kitchen” Spanish or “family reunion” Spanish – I can argue baseball with my relatives or swap stories about a cousin’s misadventures in childhood. However, I don’t know many hardware terms in Spanish. For example, I had no idea how to say “shims”<sup id="fnref:1"><a href="#fn:1" class="footnote">1</a></sup>. But we worked it out. I described what we were doing, and the employees got excited about helping us out. This is just one quick story of how local Argentines helped us with this campaign.</p>
<p>Every night before the event night, we practiced using the telescopes. Some team members had a lot of experience specifically with the type of telescopes we were using. Many of us did not. We came from a variety of backgrounds, including professors, engineers, and students, among others. On Tuesday, the first night, there were challenges for every team. But by Thursday night, our dress rehearsal, we were ready.</p>
<p>We had some fun adventures along the way. On Tuesday night, we were provided a police escort to a park just west of Mendoza, where we practiced as one giant group. The next night, we again had a police escort, but we split into three teams, each going to a different campground south of Mendoza. At the campground where I worked that night, I spent part of the evening talking with the caretakers of the campground and their family. They were puzzled by the trucks that rolled up to the site and began pulling out large boxes of gear. With the help of our police escort, I explained what we were doing, after which the campground caretakers were eager to help us out. Next thing I knew, I was on the phone with the nephew of one of the caretakers, telling the young man all about our project. He was clearly an astronomy fan. Later in the night I found him out in the field with us, watching the teams practice for the occultation event.</p>
<p>Thursday night was a complete dress rehearsal. Each team went out to their assigned site and practiced the sequence of observations that we would execute on the actual event night. The team I was on included an extremely experienced amateur astronomer, an optical engineer who was also an experienced amateur astronomer, and myself. One of the State department officials from the U.S. Embassy in Buenos Aires also joined us. We found a ridge to the east of the highway that leads south out of Mendoza; this was one of the sites previously scouted two days before. Our only neighbor was a Catholic chapel a quarter mile west along the ridge. Otherwise we were alone, with clear skies. We found a flat spot on the ridge, parked our truck to block any wind, and set up our telescope. Everything went smoothly and we took 45 minutes of practice data. When the dress rehearsal was done, I passed around some Argentine empanadas, which we ate on top of that ridge while talking about how great a site this would be for the actual occultation event.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/20170614_soto-mu69-telescope.png" alt="Telescope assembly" />
<p class="image-caption">ASSEMBLING THE TELESCOPE ON THE DRESS REHEARSAL NIGHT. Image: Alejandro Soto.</p>
</div>
<p>The next day – the big day – we were told by Marc Buie, the team leader for South America, that we would observe the occultation from a completely different location. “Wait. What?” We liked the site on the ridge, so we had an initial resistance to the change, but that resistance passed quickly. The entire week, Buie had warned us that there would be last minute changes. He and a few other team members were doing their best to optimize the location of each of the 12 telescopes in Argentina while coordinating with the 12 telescope locations in South Africa. It was a tricky problem that was fine tuned right up to Friday afternoon, less than half a day before the occultation event.</p>
<p>The night of the occultation we drove out to our newly assigned site: a campground near a dry river bed just southeast of Mendoza. Again, I was with the same two astronomers and the embassy representative. We quickly scouted the new site and identified a good spot for the telescope. After that, the week of training kicked in. We set up the telescope, aligned it, and found the field we were going to observe. We waited about half an hour, working to keep dew from collecting on the telescope mirrors. At the time set by Buie and his colleagues on the New Horizons team, we started our 45-minute observation. Then we stepped away from the telescope and did nothing.</p>
<p>That’s the nature of observing: most of the work happens well ahead of the observation event. When it comes time to make an astronomical observation, you ideally want it to be the most boring, mundane moment in the entire process. If you did everything right, you start the observation and let the telescope and camera collect the data.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/20170614_soto-mu69-telescope-team.png" alt="Occultation team." />
<p class="image-caption">ME AND MY FELLOW OBSERVERS. I am standing in the back, while to the right is the 16-inch telescope used for the observations. Image: Alejandro Soto.</p>
</div>
<p>We did everything right, so we stood in the cold night, waiting. I broke out empanadas again, almost always a crowd pleaser, and the experienced astronomers told stories about other campaigns. When the 45 minutes were up, we packed up and went back to our hotel. That was it. Thanks to good luck with the weather, which gave us clear skies all night, we executed our observations successfully. I would later find out that all of the other teams in South America had the same success.</p>
<p>After a week of scouting sites, running down last-minute errands and supplies, and practicing late each night, we had observed the MU69 occultation as a team. The team in South Africa was also successful, although they had more challenges and adventures than we did. The scourge of all astronomical observations, clouds, affected only a handful of sites in South Africa. In all, the occultation observation campaign on June 3rd, likely the largest coordinated deployment of its kind ever, went extremely well. We definitely celebrated the next night.</p>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/20170614_mu69-star-field.png" alt="Star field." />
<p class="image-caption">OUR VIEW OF THE STAR FIELD FOR THE OBSERVATION. In this image you cannot see the target star or the occultation, due to the display settings we were using and the fact that this image was taken 16 minutes before the occultation event, which lasted just two seconds. Time will tell if we actually saw the occultation or not. Image: Alejandro Soto.</p>
</div>
<p>“What about the results? Did you see the occultation?” you ask. Everyone asks this. Right now, we don’t know yet. This was an extremely challenging occultation to observe. MU69 occulted a very dim star, which always makes the observation tough. On top of that, we need to combine data from 24 deployed sites, plus some fixed sites, that cover a wide range of conditions, including <a href="https://en.wikipedia.org/wiki/Astronomical_seeing">seeing</a> and illumination. Therefore, the data analysis will take a while. It is likely that only a couple of the telescopes actually observed the occultation. All of the data, however, will be useful for the New Horizons mission. All of us, astronomers and the general public alike, just need to wait until the New Horizons team finishes the analysis and announces the results. I am not a New Horizons team member, so I am eagerly awaiting the results, like everyone else.</p>
<p>The New Horizons team is already planning the next two occultations, on July 10th and July 17th. The July 10th occultation will be observed by the <a href="https://www.nasa.gov/mission_pages/SOFIA/index.html">SOFIA</a> airborne telescope. Then, on July 17th, many of the observers from the June 3rd occultation will be in southern Argentina for the occultation, which will involve all 24 telescopes. I can only hope I get invited to join the fun again.</p>
<div class="footnotes">
<ol>
<li id="fn:1">
<p>I now know that I should have said “cuña” in Spanish, since it means “wedge or shim”. <a href="#fnref:1" class="reversefootnote">&#8617;</a></p>
</li>
</ol>
</div>
Sat, 17 Jun 2017 00:00:00 -0600
https://alejandrosoto.net/blog/2017/06/17/astronomy-adventure-argentina-1/
Tags: astronomy, occultation, mu69, research
<h1>Setting up my Mac for climate research</h1>
<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/wind_art.png" alt="Mars surface wind plot" />
<p class="image-caption">Surface wind vectors from a simulation of the ancient Martian climate.</p>
</div>
<p><strong>Updated 2016-10-07</strong> Thanks to <a href="https://twitter.com/michaelaye">@michaelaye</a> for useful comments and corrections.</p>
<p>I use a Mac computer for most of my climate research, since the Mac OS X operating system provides the computational foundation I need to develop and run planetary climate models. I am not a fanatical follower of Apple, and I will use Windows machines when the task demands it, e.g., CAD design in SolidWorks or mapping in ArcGIS. For me, a computer OS is just another tool, like Fortran, Python, a spectrometer, or a soldering iron. I have a toolbox and <a href="http://youtu.be/d8Oe9SteE3M" title="I'm a weapons man. (Ronin)">I put the tools in for the job</a>. The trick is setting up the tools right.</p>
<p>Since I am using the Unix underpinnings of Mac OS X, my setup requires a number of steps that the average Mac owner does not need in order to be productive. Most of these additional steps involve installing and configuring software for writing my modeling and analysis code. This is essential for my research. The rest of the additional steps are there just to make my life easier.</p>
<p>Once you dive into the Unix engine under the hood, you are no longer working with Mac OS X software installers. Instead, you are often in the realm of package managers, compiling your own code, and customizing the paths and configurations. Not being a computer scientist, I was intimidated at first. Fortunately, a number of people posted their own experiences in setting up their own systems<sup id="fnref:1"><a href="#fn:1" class="footnote">1</a></sup>. Over time I strung together disparate instructions and suggestions that resulted in a working system for me. In the spirit of <a href="http://en.wikipedia.org/wiki/Pay_it_forward" title="Paying it forward (Wikipedia)">paying things forward</a>, I am providing this description of my setup<sup id="fnref:2"><a href="#fn:2" class="footnote">2</a></sup> in case it might be useful to another scientist out there facing the same problems that I already faced.</p>
<!-- more -->
<h2 id="taming-bash-by-using-a-dotfiles-system">Taming BASH by using a dotfiles system</h2>
<p>I use a number of machines at work and home, with a roughly 50/50 split between Macs and Linux machines. I like to have the Unix environment set up the same on all of the machines. Ideally, this means using a bash shell with a custom prompt, colored <code class="highlighter-rouge">ls</code> output, and all of my standard aliases in place. To manage my environments across multiple computers, both Mac and Linux, I have created a <a href="http://dotfiles.github.io/" title="Google does dotfiles">‘dotfiles’ system</a> using a simple script and GitHub. This method is based on <a href="http://blog.smalleycreative.com/tutorials/using-git-and-github-to-manage-your-dotfiles/">Michael Smalley’s dotfiles</a> setup. I built on his script and setup to create my own dotfiles system.</p>
<p>The code in <a href="https://github.com/soto97/dotfiles" title="soto97's dotfiles">my repository</a> organizes my various dotfiles, including <code class="highlighter-rouge">.bashrc</code>, <code class="highlighter-rouge">.bash_profile</code>, <code class="highlighter-rouge">.vimrc</code>, and others. The repository is cloned into the home directory of any of my machines such that the path is <code class="highlighter-rouge">~/dotfiles/</code>. The <code class="highlighter-rouge">makesymlinks.sh</code> setup script creates symlinks in the home directory that point to the files in <code class="highlighter-rouge">~/dotfiles/</code>. The setup script is smart enough to back up my existing dotfiles into a <code class="highlighter-rouge">~/dotfiles_old/</code> directory, thus giving me a means of reversing any changes. By hosting <a href="https://github.com/soto97/dotfiles" title="soto97's dotfiles">the code</a> on GitHub, I can clone and set up this system on any Unix-based machine that I work on. The code is “smart enough” to set up environments for different flavors of Unix (Linux or Mac OS X). Right now, my dotfiles system only supports a bash environment, but a quick Google search will find versions for csh, zsh, and other shells.</p>
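<p>The symlinking step is simple enough to sketch. The following is not my actual <code class="highlighter-rouge">makesymlinks.sh</code>, just a minimal illustration of the back-up-then-symlink approach; the <code class="highlighter-rouge">link_dotfiles</code> function name and the scratch-directory demo are purely illustrative:</p>

```shell
#!/usr/bin/env bash
# Minimal sketch of a dotfiles symlinker -- not the actual makesymlinks.sh.
# link_dotfiles DOTDIR HOMEDIR: for each file in DOTDIR, back up any
# pre-existing real dotfile in HOMEDIR to HOMEDIR/dotfiles_old, then
# symlink HOMEDIR/.<name> to DOTDIR/<name>.
link_dotfiles() {
    local dotdir="$1" homedir="$2" olddir="$2/dotfiles_old"
    mkdir -p "$olddir"
    local file name
    for file in "$dotdir"/*; do
        name=$(basename "$file")
        if [ -e "$homedir/.$name" ] && [ ! -L "$homedir/.$name" ]; then
            mv "$homedir/.$name" "$olddir/"   # back up the real file
        fi
        ln -sfn "$file" "$homedir/.$name"     # (re)create the symlink
    done
}

# Demo in a scratch directory so nothing in the real home is touched.
demo=$(mktemp -d)
mkdir -p "$demo/dotfiles"
echo 'alias ll="ls -l"' > "$demo/dotfiles/bashrc"
echo 'set nocompatible' > "$demo/dotfiles/vimrc"
link_dotfiles "$demo/dotfiles" "$demo"
```

<p>The real script uses an explicit list of managed files rather than a glob, and of course operates on <code class="highlighter-rouge">$HOME</code> directly.</p>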
<h2 id="iterm-and-solarized">iTerm and Solarized</h2>
<p>I use <a href="http://www.iterm2.com/" title="iTerm2">iTerm2</a> for my Mac terminal. After years of fighting with terminal color schemes, I have settled on a scheme created for and used by many software engineers: <a href="http://ethanschoonover.com/solarized" title="Solarize">Solarized</a>. I am a bit indifferent to the specific colors, but the scheme works really well overall and gives me two consistent, easy-to-apply color schemes, one light and one dark. The color scheme is also available for a number of other programs. For example, I use <a href="http://ethanschoonover.com/solarized/vim-colors-solarized" title="Solarized colorscheme for vim">Solarized in vim</a> as well.</p>
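<p>As an example of reusing the scheme outside the terminal, this is roughly the <code class="highlighter-rouge">.vimrc</code> fragment the vim port calls for (a sketch, assuming the vim-colors-solarized plugin is already on your runtimepath):</p>

```vim
" Enable the Solarized color scheme in vim.
" Assumes the vim-colors-solarized plugin is installed.
syntax enable
set background=dark    " or 'light' for the light variant
colorscheme solarized
```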
<h2 id="xcode">XCode</h2>
<p>Alright, it is time to get started on configuring the Mac for scientific research. First, we need to be sure that we have XCode installed. XCode provides a number of tools that a scientific programmer will likely not need, but the Command Line tools included in XCode are critical for scientific programming. So, if you don’t already have XCode, get it from the App Store.</p>
<p>Once you have installed XCode from the App Store, then you need to install the command line developer tools. Using the command line, enter:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">xcode-select <span class="nt">--install</span></code></pre></figure>
<p>This will generate a pop-up message asking to install the command line developer tools. Go ahead and install. Once that is successfully done you will then have a number of command line tools that we will be using throughout the rest of this setup.</p>
<p><strong>Update</strong>: According to Michael Aye (<a href="https://twitter.com/michaelaye">@michaelaye</a>), you no longer need the entire XCode installation to get the command line tools:</p>
<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr"><a href="https://twitter.com/soto97">@soto97</a> nice blog post for setting up mac for research.Couple of things: 1/n Since 10.9 xcode-select cmd can B run w/out xcode installed.</p>&mdash; Michael Aye (@michaelaye) <a href="https://twitter.com/michaelaye/status/783822768722673664">October 6, 2016</a></blockquote>
<script async="" src="//platform.twitter.com/widgets.js" charset="utf-8"></script>
<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr"><a href="https://twitter.com/soto97">@soto97</a> 2/n it will just download the cmd-line tools then. And the link to GrADS is dead.And give the conda-forge channel a try, it has</p>&mdash; Michael Aye (@michaelaye) <a href="https://twitter.com/michaelaye/status/783822974663000064">October 6, 2016</a></blockquote>
<h2 id="install-x11">Install X11</h2>
<p>Mac OS X no longer comes with a pre-installed X-Window manager for use with the terminal and command line tools. Therefore, you need to be sure you have X11/XQuartz installed. Visit <a href="http://xquartz.macosforge.org/trac/wiki">the XQuartz site</a>, then download and install the most recent version, following the instructions there. You might need to fix the symlink it makes by entering the following command in the terminal:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="nb">ln</span> <span class="nt">-s</span> /opt/X11 /usr/X11</code></pre></figure>
<h2 id="package-manager-homebrew">Package Manager: Homebrew</h2>
<p>To install and manage many of my tools I use <a href="http://brew.sh" title="Homebrew">Homebrew</a>. There are other package managers for OS X, including <a href="http://www.macports.org/">MacPorts</a> and <a href="http://www.finkproject.org/">Fink</a>, but I have found Homebrew to be the most usable and useful. Needs and preferences will vary.</p>
<h3 id="a-fresh-installation-of-homebrew">A fresh installation of Homebrew</h3>
<p>To install Homebrew from scratch, run the following command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">ruby <span class="nt">-e</span> <span class="s2">"</span><span class="si">$(</span>curl <span class="nt">-fsSL</span> https://raw.github.com/mxcl/homebrew/go<span class="si">)</span><span class="s2">"</span></code></pre></figure>
<p>This will both download and install the Homebrew software. After installing, run <code class="highlighter-rouge">brew doctor</code> to ensure that everything was installed correctly. If everything is working well, then you can start installing packages. For example, I install HDF5, NetCDF, ack, and the Silver Searcher (ag), among others. The <a href="http://brew.sh" title="Homebrew">Homebrew website</a> provides details on how to use Homebrew. As well, typing <code class="highlighter-rouge">man brew</code> at the command line will bring up the manual page for Homebrew.</p>
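<p>To make that concrete, here are the sorts of install commands I mean. The formula names below are the ones I believe are current, but formula names do change over time, so check <code class="highlighter-rouge">brew search</code> if one is not found:</p>

```shell
brew install hdf5                  # HDF5 libraries for scientific data files
brew install netcdf                # NetCDF libraries (pulls in hdf5)
brew install ack                   # grep-like code search tool
brew install the_silver_searcher   # the 'ag' search tool
```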
<h3 id="updating-homebrew-from-a-previous-version-of-os-x">Updating Homebrew from a previous version of OS X</h3>
<p>Since I had Homebrew installed on a previous version of Mac OS X, I took the following steps to make sure everything was still working properly.</p>
<p>I started with the command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew list</code></pre></figure>
<p>which told me what packages I had installed. Many of these packages were installed in support of others, but I generally know which ones I intentionally installed. For each of those, I tried running the package. If the command worked, then I was all set and left things alone. If a particular package did not run, then I removed it and reinstalled, using the following commands:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew remove &lt;package&gt;</code></pre></figure>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install</span> &lt;package&gt;</code></pre></figure>
<p>These instructions are based on step 5 from <a href="https://gist.github.com/myobie/1860902">myobie’s Gist</a>.</p>
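<p>If you have a long <code class="highlighter-rouge">brew list</code>, the check-and-reinstall cycle can be roughly automated. This loop is only a sketch: it assumes each formula installs a command with the same name as the formula, which is not true for every package, so treat it as a starting point rather than running it blindly:</p>

```shell
# Reinstall any brewed package whose command no longer runs.
for pkg in $(brew list); do
    if ! command -v "$pkg" >/dev/null 2>&1; then
        brew remove "$pkg" && brew install "$pkg"
    fi
done
```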
<h3 id="installing-netcdf-operators-nco-using-homebrew-science">Installing NetCDF Operators (NCO) using homebrew-science</h3>
<p>The basic Homebrew database does not include formulas for all of the scientific software that I need. Instead, we need to use an additional Homebrew database, <a href="https://github.com/Homebrew/homebrew-science">‘homebrew-science’</a>. The <a href="https://github.com/Homebrew/homebrew-science">homebrew-science</a> page provides instructions for installing software from this alternative database. First, we need to tell brew to use this alternative database. This is done by ‘tapping’ the database. The command to do this for homebrew-science is:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew tap homebrew/science </code></pre></figure>
<p>Now that homebrew-science is ‘tapped’, we can start installing software from that database. The command is similar to any Homebrew install command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install</span> &lt;formula&gt;</code></pre></figure>
<p>If the formula conflicts with one from the <a href="https://github.com/Homebrew/homebrew">master database</a> or another tap, you can install with this version of the install command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>homebrew/science/&lt;formula&gt;</code></pre></figure>
<p>You can also install via URL:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>https://raw.github.com/Homebrew/homebrew-science/master/&lt;formula&gt;.rb</code></pre></figure>
<p>To get the NetCDF Operators, I then entered the following command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>nco</code></pre></figure>
<p>That’s it. I now have NetCDF Operators like <code class="highlighter-rouge">ncks</code> and <code class="highlighter-rouge">ncrcat</code> installed along with NCView (<code class="highlighter-rouge">brew install ncview</code>) for viewing NetCDF files. Since most climate models output their simulation results as NetCDF files, I am now ready to inspect the climate simulation output of almost any model.</p>
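<p>As a quick sanity check on the NCO install, <code class="highlighter-rouge">ncks</code> can dump a file’s metadata or pull out a single variable. The file and variable names here are placeholders:</p>

```shell
ncks -M model_output.nc                  # print the file's global metadata
ncks -v tsurf model_output.nc tsurf.nc   # extract variable 'tsurf' into a new file
```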
<h3 id="installing-grads-using-homebrew-science">Installing GrADS using homebrew-science</h3>
<p>The <a href="http://opengrads.org">GrADS</a> tool is useful for plotting climate data and can read in NetCDF files. Though I primarily use Python for plotting, GrADS has its place in my scientific workflow. In order to get GrADS, we will need to access an alternative Homebrew database. Similar to homebrew-science we need to tap homebrew-binary to get GrADS:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew tap homebrew/binary</code></pre></figure>
<p>I want a copy of GrADS, so I type at the command line</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>grads</code></pre></figure>
<p>If the formula conflicts with one from mxcl/master or another tap, you can install with:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>homebrew/binary/&lt;formula&gt;</code></pre></figure>
<p>You can also install via URL:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>https://raw.github.com/Homebrew/homebrew-binary/master/&lt;formula&gt;.rb</code></pre></figure>
<p>Again, that’s all there is to it. Now I have a copy of GrADS on my machine.</p>
<h2 id="installing-the-anaconda-python-system">Installing the Anaconda Python System</h2>
<p>To keep things simple, I use the <a href="https://www.continuum.io">Anaconda</a> system for installing Python, SciPy, NumPy, Matplotlib, and almost any other Python package<sup id="fnref:3"><a href="#fn:3" class="footnote">3</a></sup>. You can <a href="https://www.continuum.io/downloads">download</a> either a command line or package installer. Once you run the installer, double-check that you have a working version of Anaconda by typing:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">conda <span class="nt">--version</span></code></pre></figure>
<p>I installed the Anaconda system on top of my previous Python installations. This worked smoothly. The only thing I had to do was fix the PATH and PYTHONPATH variables. The Anaconda installer tried to do this automatically, but failed since I use a dotfiles system. Therefore, I had to manually remove old Python paths in my PATH and my PYTHONPATH. As well, I added the Anaconda path by adding the following line:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="nb">export </span><span class="nv">PATH</span><span class="o">=</span><span class="s2">"/Users/user_name/anaconda/bin:</span><span class="nv">$PATH</span><span class="s2">"</span></code></pre></figure>
<p>Once this was successfully done, I had NumPy, SciPy, Matplotlib, and other packages automatically installed. I needed to add a few more packages, like xarray, Cartopy, and netcdf4. This was easily done by typing at the command line:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">conda <span class="nb">install</span> &lt;package-name&gt;</code></pre></figure>
<p>Cartopy required a slightly different command, since it’s part of the SciTools suite:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">conda <span class="nb">install</span> <span class="nt">-c</span> scitools cartopy</code></pre></figure>
<p>Anaconda installed any dependencies and handled any conflicts. In this way, Anaconda is like Homebrew or pip<sup id="fnref:4"><a href="#fn:4" class="footnote">4</a></sup>, but for Python. Right now, it looks like any package I might need will likely be available through the Anaconda system.</p>
<p><em>One quick note:</em> I chose to install the complete Anaconda system<sup id="fnref:5"><a href="#fn:5" class="footnote">5</a></sup>, which is free. The complete system automatically installs various standard packages, like NumPy, Matplotlib, and Jupyter. You can also choose to install <a href="http://conda.pydata.org/miniconda.html">Miniconda</a>, a stripped-down version that does not automatically install packages. With Miniconda you get more control over which packages are installed on your system.</p>
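<p>Whichever installer you choose, conda’s environment management is what makes switching between Python 2 and Python 3 painless. A sketch, with an environment name of my own choosing (at the time of writing, environments are activated with <code class="highlighter-rouge">source activate</code>):</p>

```shell
conda create -n py3 python=3 numpy scipy matplotlib   # build a Python 3 environment
source activate py3                                   # switch into it
source deactivate                                     # and switch back out
```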
<h2 id="in-conclusion---">In conclusion . . .</h2>
<p>At this point, my Mac is ready for some scientific heavy lifting. I can compile scientific code, inspect and analyze climate model output, and manage my data with the tools that I have now installed and configured. Of course, this means that the fun has only begun, because it’s time to <a href="https://twitter.com/search?q=%23doingascience&amp;src=hash">do some science!</a></p>
<div class="footnotes">
<ol>
<li id="fn:1">
<p>For example, see Hacker Codex’s instructions for <a href="http://hackercodex.com/guide/mac-osx-mavericks-10.9-configuration/" title="Configuring Mac OS X Mavericks 10.9">configuring Mavericks</a> and <a href="http://hackercodex.com/guide/python-development-environment-on-mac-osx/" title="Python Development Environment on Mac OS X">installing Homebrew Python and other tools</a>. As well, both <a href="https://gist.github.com/myobie/1860902">Nathan</a> and <a href="http://www.lowindata.com/2013/installing-scientific-python-on-mac-os-x/">Lowin Data</a> have good instructions for various configurations. Finally, Damian Irving discusses <a href="https://drclimate.wordpress.com/2014/10/30/software-installation-explained/">installing software for climate research</a>. His <a href="https://drclimate.wordpress.com/2016/04/13/keeping-up-with-continuum/">rave reviews</a> of <a href="https://www.continuum.io">Anaconda</a> were the final straw to convince me to switch to using Anaconda to manage my Python environment. <a href="#fnref:1" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:2">
<p>These instructions apply to Mac OS X Yosemite and El Capitan. These instructions may or may not work with earlier versions of Mac OS X. <a href="#fnref:2" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:3">
<p>I previously used a more <a href="/blog/2014/01/22/setting-up-my-mac-for-scientific-research/">customized method</a> for installing Python and the various packages. I originally did that because older Python systems like Enthought (now Canopy) did not play well with my Fortran compiled NetCDF and other tools. With the introduction of <a href="https://www.continuum.io">Anaconda</a>, these previous problems no longer exist. Thus, I have been able to completely streamline my Python software management by using <a href="https://www.continuum.io">Anaconda</a>. Damian Irving provides a succinct argument for <a href="https://drclimate.wordpress.com/2016/04/13/keeping-up-with-continuum/">using Anaconda</a>. On top of this, I can use Anaconda’s environment management to switch between Python 2 and Python 3. This flexibility has allowed me to finally make the switch to Python 3 as my primary version of Python for scientific programming. <a href="#fnref:3" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:4">
<p>pip was installed as part of the <a href="https://www.continuum.io">Anaconda</a> installation. <a href="#fnref:4" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:5">
<p>I’ve been told that my terminology is a little off when it comes to <a href="https://twitter.com/michaelaye/status/783824037973635072">Anaconda</a> vs <a href="https://twitter.com/michaelaye/status/783824209474576384">conda</a>. <a href="https://twitter.com/michaelaye">@michaelaye</a> is right, but it doesn’t affect getting things up and running. The instructions that I followed and that I shared here get me a Python installation that works and the flexibility to have multiple environments. Once again, I have a toolbox and <a href="http://youtu.be/d8Oe9SteE3M" title="I'm a weapons man. (Ronin)">I put the tools in for the job</a>. I do not care how the tools are made, as long as they work. <a href="#fnref:5" class="reversefootnote">&#8617;</a></p>
</li>
</ol>
</div>
Tue, 16 Aug 2016 00:00:00 -0600https://alejandrosoto.net/blog/2016/08/16/setting-up-my-mac-for-climate-research/
https://alejandrosoto.net/blog/2016/08/16/setting-up-my-mac-for-climate-research/softwareresearchScientific Spoilers<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/rosetta_comet_banner.png" alt="Comet 67P/Churyumov-Gerasimenko" />
<p class="image-caption">A Rosetta image of Comet 67P/Churyumov-Gerasimenko, taken by the OSIRIS camera. ESA/Rosetta/MPS for OSIRIS Team MPS/UPD/LAM/IAA/SSO/INTA/UPM/DASP/IDA.</p>
</div>
<p>We live in a culture filled with spoilers. With the extensive, intricate, and redundant lines of communication that connect us to each other, any new bit of news or nugget of discovery is quickly shared. Sometimes the conclusion arrives before the beginning. This is especially true in the world of storytelling, where knowing the ending can spoil the enjoyment of the story. This problem is magnified by people who enjoy being the one to ‘share’ the news. Whether malicious or not, these spoilers, both the act and the actors, have stolen the enjoyment of many stories from many people.</p>
<p>In a similar way, the journey of scientific discovery can be spoiled. A scientist builds a story, beginning with data or an idea or both. From that beginning, the story progresses through challenges and obstacles until a discovery or new understanding is reached, i.e., the end of the scientific story is found. For many scientists, experiencing the complete story is what makes the job so exciting. Being given the ending as opposed to discovering the ending just spoils the joy of doing science.</p>
<!-- more -->
<p>The immediate sharing of scientific measurements and observations risks spoiling the scientific discovery, the storytelling of science, that a scientist may have spent years pursuing. Recently, there have been <a href="http://www.bbc.com/news/science-environment-30859411">complaints</a> about the proprietary period applied to the release of images from the Rosetta mission. The BBC article discusses the political and career reasons why these images are protected for a proprietary period. But the article glosses over the desire to avoid having the scientific journey spoiled for the scientists on the Rosetta mission. Avoiding ‘scientific’ spoilers is a strong motivation for imposing proprietary periods on the images.</p>
<p>The Rosetta science team has spent one to two decades on this project, from the first seeds of a mission idea to the landing on the comet. They have painstakingly pieced together a riveting story about the comet. Years of research to provide a backstory to this exploration of a comet were followed by years of struggle to acquire funding, to build the spacecraft and instruments, and to eventually fly the mission. Now that they have arrived at the comet, it is only natural that they want time to experience and enjoy the end of this story. But since the scientists continue to plan the spacecraft observations of the comet, along with maintaining their regular scientific duties (teaching, advising, proposal writing, etc.), the end of this story plays out slowly, too slowly to compete with the internet. If all of the data were immediately released, the Rosetta scientists would find this decades-long epic spoiled by their understandably eager colleagues and fans on the internet who would quickly make the discoveries that are waiting in the data.</p>
<p>As a planetary scientist who does not work on Rosetta, I too hunger for more images from the Rosetta mission; the images that have been released are so tantalizing, revealing <a href="http://www.planetary.org/blogs/emily-lakdawalla/2015/01260947-rosetta-science-results.html">dune-like features</a> and <a href="http://blogs.esa.int/rosetta/2015/01/22/getting-to-know-rosettas-comet-science-special-edition/">perplexing icy interiors</a>. But, I will gladly wait. I do not want to spoil someone else’s story. Instead, I will eagerly await the day when the Rosetta team is ready to share their incredible story about a comet, a story that they have spent decades making. No spoilers for me.</p>
Sun, 01 Feb 2015 11:30:00 -0700https://alejandrosoto.net/blog/2015/02/01/scientific-spoilers/
https://alejandrosoto.net/blog/2015/02/01/scientific-spoilers/scienceSetting up my Mac for scientific research<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="/images/light_dots.png" alt="" />
</div>
<p><br /></p>
<p><em>UPDATE: I have changed how I <a href="/blog/2016/08/16/setting-up-my-mac-for-climate-research/">setup my Python environment</a>. These instructions are no longer up to date and may not work on newer versions of the Mac OS.</em></p>
<p>I use a Mac computer for most of my research, with the exception of running large climate models, which is usually done on clusters built from Linux machines. For most of my work, the Mac OS X operating system provides me the computational foundation I need to develop and run planetary climate models. I am not a fanatic follower of Apple, and I will use Windows machines when the task demands it, e.g. CAD design in Solidworks or mapping in ArcGIS. For me, a computer OS is just another tool, like Fortran, Python, a spectrometer, or a soldering iron. I have a toolbox and <a href="http://youtu.be/d8Oe9SteE3M" title="I'm a weapons man. (Ronin)">I put the tools in for the job</a>. The trick is setting up the tools right.</p>
<p>Since I am using the Unix underpinnings of Mac OS X, my setup requires a number of steps that the average Mac owner does not need in order to be productive. Most of these additional steps involve installing and configuring software for writing my modeling and analysis code. This is essential for my research. The rest of the additional steps are there just to make my life easier.</p>
<p>Once you dive into the Unix engine under the hood, you are no longer working with Mac OS X software installers. Instead, you are often in the realm of package managers, compiling your own code, and customizing the paths and configurations. Not being a computer scientist, I was intimidated at first. Fortunately, a number of people posted their own experiences in setting up their own systems<sup id="fnref:1"><a href="#fn:1" class="footnote">1</a></sup>. Over time I strung together disparate instructions and suggestions that resulted in a working system for me. In the spirit of <a href="http://en.wikipedia.org/wiki/Pay_it_forward" title="Paying it forward (Wikipedia)">paying things forward</a>, I am providing this description of my setup<sup id="fnref:2"><a href="#fn:2" class="footnote">2</a></sup> in case it might be useful to another scientist out there facing the same problems that I already faced.</p>
<!-- more -->
<h2 id="wrangling-bash-preferences-my-new-dotfiles-system">Wrangling BASH preferences: my new dotfiles system</h2>
<p>I use a number of machines at work and home, with a roughly 50/50 split between Macs and Linux machines. I like to have the Unix environment set up the same on all of the machines. Ideally, this means using a bash shell with a custom prompt, colored ls output, and all of my standard aliases in place. Historically, I have gone through a tedious process with every new machine in which I manually recreate the setup that I already have on my other machines. With my most recent new computer, I have adopted a more systematic and automated way of maintaining and syncing my environments across machines: I have created a <a href="http://dotfiles.github.io/" title="Google does dotfiles">‘dotfiles’ system</a> using a simple script and GitHub.</p>
<p>This method is based on Michael Smalley’s dotfiles setup, which he described at his blog. I built on his script and setup to create my own dotfiles system. The code in <a href="https://github.com/soto97/dotfiles" title="soto97's dotfiles">my repository</a> organizes my various dotfiles, including <code class="highlighter-rouge">.bashrc</code>, <code class="highlighter-rouge">.bash_profile</code>, <code class="highlighter-rouge">.vimrc</code>, and others. The repository is cloned into the home directory of any of my machines such that the path is <code class="highlighter-rouge">~/dotfiles/</code>. The <code class="highlighter-rouge">makesymlinks.sh</code> setup script creates symlinks in the home directory that point to the files in <code class="highlighter-rouge">~/dotfiles/</code>. The setup script is smart enough to back up my existing dotfiles into a <code class="highlighter-rouge">~/dotfiles_old/</code> directory, thus giving me a means of reversing any changes. By hosting <a href="https://github.com/soto97/dotfiles" title="soto97's dotfiles">the code</a> on GitHub, I can clone and set up this system on any Unix-based machine that I work on. Right now, the files are designed to be universal, but eventually I will add some smarts to the system so that I can have customizations for different flavors of Unix (Linux or Mac OS X) and possibly for different shells (csh, tcsh, zsh, etc.).</p>
<h2 id="iterm-and-solarized">iTerm and Solarized</h2>
<p>I use <a href="http://www.iterm2.com/" title="iTerm2">iTerm2</a> for my Mac terminal. After years of fighting with terminal color schemes, I have settled on a scheme created and used by many software engineers: <a href="http://ethanschoonover.com/solarized" title="Solarize">Solarized</a>. I am a bit indifferent to the specific colors, but the scheme overall works really well and gives me two consistent, easy-to-apply color schemes, one light and one dark. Also, the color scheme is available for a number of other programs. For example, I use <a href="http://ethanschoonover.com/solarized/vim-colors-solarized" title="Solarized colorscheme for vim">Solarized for vim</a> as well.</p>
<h2 id="xcode">XCode</h2>
<p>Alright, it is time to get started on configuring OS X Mavericks for scientific research. First, we need to be sure that we have XCode installed. XCode provides a number of tools that a scientific programmer will likely not need, but the Command Line tools included in XCode are critical for scientific programming. So, if you don’t already have XCode, get it from the App Store.</p>
<p>Once you have installed XCode from the App Store, then you need to install the command line developer tools. Using the command line, enter:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">xcode-select <span class="nt">--install</span></code></pre></figure>
<p>This will generate a pop-up message asking to install the command line developer tools. Go ahead and install. Once that is successfully done you will then have a number of command line tools that we will be using throughout the rest of this setup.</p>
<h2 id="install-x11">Install X11</h2>
<p>Mac OS X no longer comes with a pre-installed X-Window manager for use with the terminal and command line tools. Therefore, you need to be sure you have X11/XQuartz installed. Visit <a href="http://xquartz.macosforge.org/trac/wiki">the XQuartz site</a>, then download and install the most recent version, following the instructions there. You might need to fix the symlink it makes by entering the following command in the terminal:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="nb">ln</span> <span class="nt">-s</span> /opt/X11 /usr/X11</code></pre></figure>
<h2 id="package-manager-homebrew">Package Manager: Homebrew</h2>
<p>To install and manage many of my tools I use <a href="http://brew.sh" title="Homebrew">Homebrew</a>. There are other package managers for OS X, including <a href="http://www.macports.org/">MacPorts</a> and <a href="http://www.finkproject.org/">Fink</a>, but I have found Homebrew to be the most usable and useful. Needs and preferences will vary.</p>
<h3 id="a-fresh-installation-of-homebrew">A fresh installation of Homebrew</h3>
<p>To install Homebrew from scratch, run the following command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">ruby <span class="nt">-e</span> <span class="s2">"</span><span class="si">$(</span>curl <span class="nt">-fsSL</span> https://raw.github.com/mxcl/homebrew/go<span class="si">)</span><span class="s2">"</span></code></pre></figure>
<p>This will both download and install the Homebrew software. After installing, run <code class="highlighter-rouge">brew doctor</code> to ensure that everything was installed correctly. If everything is working well, then you can start installing packages. For example, I install HDF5, NetCDF, ack, and the Silver Searcher (ag), among others. The <a href="http://brew.sh" title="Homebrew">Homebrew website</a> provides details on how to use Homebrew. As well, typing <code class="highlighter-rouge">man brew</code> at the command line will bring up the manual page for Homebrew.</p>
<h3 id="updating-homebrew-from-a-previous-version-of-os-x">Updating Homebrew from a previous version of OS X</h3>
<p>Since I had Homebrew installed under Mountain Lion before my upgrade to Mavericks, I took the following steps to make sure everything was still working properly.</p>
<p>I started with the command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew list</code></pre></figure>
<p>which told me what packages I had installed. Many of these packages were installed in support of others, but I generally know which ones I intentionally installed. For each of those, I tried running the package. If the command worked, then I was all set and left things alone. If a particular package did not run, then I removed it and reinstalled, using the following commands:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew remove &lt;package&gt;</code></pre></figure>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install</span> &lt;package&gt;</code></pre></figure>
<p>These instructions are based on step 5 from <a href="https://gist.github.com/myobie/1860902">myobie’s Gist</a>.</p>
<h3 id="installing-netcdf-operators-nco-using-homebrew-science">Installing NetCDF Operators (NCO) using homebrew-science</h3>
<p>The basic Homebrew database does not include formulas for all of the scientific software that I need. Instead, we need to use an additional Homebrew database, <a href="https://github.com/Homebrew/homebrew-science">‘homebrew-science’</a>. The <a href="https://github.com/Homebrew/homebrew-science">homebrew-science</a> page provides instructions for installing software from this alternative database. First, we need to tell brew to use this alternative database. This is done by ‘tapping’ the database. The command to do this for homebrew-science is:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew tap homebrew/science </code></pre></figure>
<p>Now that homebrew-science is ‘tapped’, we can start installing software from that database. The command is similar to any Homebrew install command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install</span> &lt;formula&gt;</code></pre></figure>
<p>If the formula conflicts with one from the <a href="https://github.com/Homebrew/homebrew">master database</a> or another tap, you can install with this version of the install command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>homebrew/science/&lt;formula&gt;</code></pre></figure>
<p>You can also install via URL:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>https://raw.github.com/Homebrew/homebrew-science/master/&lt;formula&gt;.rb</code></pre></figure>
<p>To get the NetCDF Operators, I then entered the following command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>nco</code></pre></figure>
<p>That’s it. I now have NetCDF Operators like <code class="highlighter-rouge">ncks</code> and <code class="highlighter-rouge">ncrcat</code> installed along with NCView for viewing NetCDF files. Since most climate models output their simulation results as NetCDF files, I am now ready to inspect the climate simulation output of almost any model.</p>
<h3 id="installing-grads-using-homebrew-science">Installing GrADS using homebrew-science</h3>
<p>The <a href="http://www.iges.org/grads/">GrADS</a> tool is useful for plotting climate data and can read NetCDF files. Though I primarily use Python for plotting, GrADS has its place in my scientific workflow. To get GrADS, we need yet another alternative Homebrew database: similar to homebrew-science, we tap homebrew-binary:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew tap homebrew/binary</code></pre></figure>
<p>I want a copy of GrADS, so I type at the command line:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>grads</code></pre></figure>
<p>As before, if the formula conflicts with one from the master database or another tap, you can install it with the fully qualified name:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>homebrew/binary/&lt;formula&gt;</code></pre></figure>
<p>You can also install via URL:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>https://raw.github.com/Homebrew/homebrew-binary/master/&lt;formula&gt;.rb</code></pre></figure>
<p>Again, that’s all there is to it. Now I have a copy of GrADS on my machine.</p>
<h3 id="installing-homebrew-python">Installing Homebrew Python<sup id="fnref:4"><a href="#fn:4" class="footnote">3</a></sup></h3>
<p>Plenty of arguments are given on the web for not simply using Apple’s installation of Python (e.g. <a href="http://hackercodex.com/guide/python-development-environment-on-mac-osx/">Hacker Codex’s discussion</a>), but for me it’s mainly that I want all of my packages to play well together within Homebrew.</p>
<p>Getting Python 2.7.6 installed is pretty straightforward. The command is:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>python <span class="nt">--with-brewed-openssl</span></code></pre></figure>
<p>This will also install package management tools like pip, which we’ll need later. For my scientific work, I only need Python 2.7.x. Most scientific and mathematical packages have not yet moved to Python 3.x.</p>
<h2 id="installing-the-scipy-superpack">Installing the SciPy Superpack</h2>
<p>Although the <a href="https://www.enthought.com/">Enthought</a> Python Distribution (EPD) provides an all-in-one, turnkey solution to getting SciPy and matplotlib installed, EPD does not play well with Homebrew, my preferred package manager on the Mac. Therefore I am trying a different route, namely the <a href="http://fonnesbeck.github.io/ScipySuperpack">SciPy Superpack</a>.</p>
<p>The SciPy Superpack is developed by <a href="http://stronginference.com/">Chris Fonnesbeck</a>. I used his very simple instructions to get this code installed. First, there was a curl command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">curl <span class="nt">-o</span> install_superpack.sh https://raw.github.com/fonnesbeck/ScipySuperpack/master/install_superpack.sh</code></pre></figure>
<p>Then I moved the script to my bin directory at <code class="highlighter-rouge">~/bin/</code> and ran it by typing:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">sh install_superpack.sh</code></pre></figure>
<p>Once the script finished, I had Numpy, SciPy, Matplotlib, iPython, Pandas, Statsmodels, Scikit-Learn, and PyMC installed.</p>
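<p>Before moving on, it’s worth a quick sanity check that the new packages actually import and run. This is just my own minimal smoke test (it is not part of the Superpack instructions), using only Numpy:</p>

```python
import numpy as np

# Minimal smoke test: draw 500 samples from a standard normal
# distribution and check that basic array operations behave sensibly.
data = np.random.randn(500)

print(data.shape)                      # (500,)
print(bool(np.isfinite(data).all()))   # True: no NaNs or infs
print(abs(data.mean()) < 0.5)          # True: sample mean is near zero
```

<p>If the import fails, re-run the install script and check its output for errors.</p>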
<h2 id="just-another-pretty-interface-qtconsole">Just another pretty interface: qtconsole</h2>
<p>I decided to take everyone’s (on the internets) suggestion and install qtconsole to provide an <a href="http://ipython.org/ipython-doc/dev/interactive/qtconsole.html">aesthetically pleasing interface</a> for iPython<sup id="fnref:3"><a href="#fn:3" class="footnote">4</a></sup>. This was the trickiest step yet. First I had to install the Qt software. Unfortunately, the newest version, 5.0, comes packaged with quite a bit of stuff (e.g. the “Creator”) that I do not want; I just want the console. So I went to <a href="https://qt-project.org/downloads">qt-project.org/downloads</a> and downloaded the Qt Library for version 4.8. Once I ran the installer, the basic Qt software was in place.</p>
<p>Next I downloaded the <a href="http://pyside.markus-ullmann.de/pyside-1.1.0-qt47-py27apple.pkg">PySide libraries</a> and used the package installer to install the libraries. Just follow that link and the PySide libraries should start downloading. Then, using pip<sup id="fnref:5"><a href="#fn:5" class="footnote">5</a></sup>, I installed pygments through the command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">pip <span class="nb">install </span>pygments</code></pre></figure>
<p>Finally, I installed pyqt by typing:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">brew <span class="nb">install </span>pyqt</code></pre></figure>
<p>Once I did all of this, I was able to verify that I had a working qtconsole by executing the following commands:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">ipython qtconsole <span class="nt">--pylab</span><span class="o">=</span>inline
plot<span class="o">(</span>randn<span class="o">(</span>500<span class="o">)</span>,rand<span class="o">(</span>500<span class="o">)</span>,<span class="s1">'o'</span>,alpha<span class="o">=</span>0.2<span class="o">)</span></code></pre></figure>
<p>The last command produces the following output:</p>
<div class="image-wrapper">
<img src="/images/qtconsole_screenshot.png" alt="" />
</div>
<p>So now I have a pretty iPython console with inline plotting.</p>
<h2 id="netcdf4-the-only-way-to-read-and-write-climate-data">NetCDF4: the only way to read and write climate data</h2>
<p>For my climate modeling work I need the netCDF4-python package installed. Fortunately, thanks to the <a href="https://pypi.python.org/pypi">Python Package Index (PyPI)</a> and the pip command, this is one of the easiest steps in the installation process.</p>
<p>Here’s the command:</p>
<figure class="highlight"><pre><code class="language-bash" data-lang="bash">pip <span class="nb">install </span>netCDF4</code></pre></figure>
<p>And that’s it. This set up works with the Homebrew Python, HDF5, and NetCDF4 as well as with the SciPy Superpack.</p>
<h2 id="in-conclusion---">In conclusion . . .</h2>
<p>At this point, I have a Mac that is ready for some scientific heavy lifting. I can compile scientific code, inspect and analyze climate model output, and manage my data with the tools that I have now installed and configured. Of course, this means that the fun has only begun, because it’s time to <a href="https://twitter.com/search?q=%23doingascience&amp;src=hash">do some science!</a></p>
<div class="footnotes">
<ol>
<li id="fn:1">
<p>For example, see Hacker Codex’s instructions for <a href="http://hackercodex.com/guide/mac-osx-mavericks-10.9-configuration/" title="Configuring Mac OS X Mavericks 10.9">configuring Mavericks</a> and <a href="http://hackercodex.com/guide/python-development-environment-on-mac-osx/" title="Python Development Environment on Mac OS X">installing Homebrew Python and other tools</a>. As well, both <a href="https://gist.github.com/myobie/1860902">Nathan</a> and <a href="http://www.lowindata.com/2013/installing-scientific-python-on-mac-os-x/">Lowin Data</a> have good instructions for various configurations. <a href="#fnref:1" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:2">
<p>These instructions apply to Mac OS X Mavericks; they may or may not work with earlier versions of Mac OS X. <a href="#fnref:2" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:4">
<p>Instructions based on <a href="http://hackercodex.com/guide/python-development-environment-on-mac-osx/">Hacker Codex’s instructions</a> and <a href="http://www.lowindata.com/2013/installing-scientific-python-on-mac-os-x/">Lowin Data’s instructions</a>. <a href="#fnref:4" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:3">
<p>See the articles <a href="http://www.thisisthegreenroom.com/2011/installing-python-numpy-scipy-matplotlib-and-ipython-on-lion/" title="installing-python-numpy-scipy-matplotlib-and-ipython-on-lion">“Installing python, numpy, scipy, matplotlib, and ipython on Lion”</a> and <a href="https://github.com/sympy/sympy/wiki/Installing-the-IPython-qtconsole-in-Mac-OS-X" title="Installing-the-IPython-qtconsole-in-Mac-OS-X">“Installing the IPython qtconsole in Mac OS X”</a> and <a href="http://sergeykarayev.com/work/2012-08-08/setting-up-mountain-lion/" title="setting-up-mountain-lion">“Setting up Mountain Lion”</a>. This is more than an aesthetic upgrade; the <code class="highlighter-rouge">qtconsole</code> provides additional functionality when working with iPython. The <a href="http://ipython.org/ipython-doc/dev/interactive/qtconsole.html">iPython website</a> points out that: “ This is a very lightweight widget that largely feels like a terminal, but provides a number of enhancements only possible in a GUI, such as inline figures, proper multiline editing with syntax highlighting, graphical calltips, and much more.” <a href="#fnref:3" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:5">
<p>pip was installed as part of the Homebrew Python installation. <a href="#fnref:5" class="reversefootnote">&#8617;</a></p>
</li>
</ol>
</div>
Wed, 22 Jan 2014 12:36:00 -0700https://alejandrosoto.net/blog/2014/01/22/setting-up-my-mac-for-scientific-research/
https://alejandrosoto.net/blog/2014/01/22/setting-up-my-mac-for-scientific-research/softwareresearchFunding: the Lifeblood of Scientists<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="http://farm3.staticflickr.com/2861/9278752797_2efae9e655.jpg" alt="Abstract spheres." />
</div>
<p>I submitted my first two proposals as a principal investigator (PI)<sup id="fnref:1"><a href="#fn:1" class="footnote">1</a></sup> this last month, which is a major milestone for a newly minted PhD. In fact, I have a bit of a head start, since I was able to submit proposals as a PI while still only a postdoctoral scholar. Most academic institutions do not allow postdocs, as we are commonly called, to be a PI on a grant proposal. Fortunately for me, I work at a non-profit research institute that encourages its scientists to pursue grant funding as a PI, both early and often. Since grants are the lifeblood of scientists here in America, submitting proposals as a PI is an important step in the progression of my scientific career.</p>
<!-- more -->
<p>In America, the systematic and extensive funding of science by the federal government is a modern, post-WWII phenomenon. The importance of federal investment in scientific research was recognized as much <a href="http://cnx.org/content/m14356/1.1/">as 80 years earlier</a>. With the <a href="http://www.nasonline.org/about-nas/history/archives/founding-and-early-work.html">founding of the National Academy of Sciences</a>, the U.S. federal government recognized the importance of science in government and economic policy by providing a formal mechanism for communication between the scientific and political communities. For most of the next 80 years, however, there was no ongoing, organized program for providing federal funds to the general scientific community. Specific projects were funded, but apparently the infrastructure that allows the modern scientist to propose research for funding was not yet in place.</p>
<p>Things began to change during the 1940s, due as much to <a href="http://cnx.org/content/m14356/1.1/">the demands of war as to the vision of President Roosevelt and others</a>. Near the end of World War II (WWII), at the request of President Roosevelt, Vannevar Bush, the Director of the Office of Scientific Research and Development, reported on the need for, and the means by which, the United States could maintain its rapid progress in scientific research once the war was over.<sup id="fnref:2"><a href="#fn:2" class="footnote">2</a></sup> Bush felt that scientific progress was fundamental to progress in our economic and social lives, and that understanding nature and its laws, i.e. basic research, was critical if we wanted to apply science to our society and our lives. Bush also recognized that basic research was not, in isolation, economically profitable. Instead, he supported continuous federal funding to provide the basic scientific research that industry required to make technological developments and thus improve our lives, both economically and socially.</p>
<blockquote><p>Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for ages past. Advances in science will also bring higher standards of living, will lead to the prevention or cure of diseases, will promote conservation of our limited national resources, and will assure means of defense against aggression.</p><footer><strong>Vannevar Bush</strong> <cite>Science the Endless Frontier</cite></footer></blockquote>
<p>Bush’s ideas formed <a href="http://cnx.org/content/m14356/1.1/">the basis of the National Science Foundation (NSF)</a>, which took the first step toward the continuous U.S. federal funding of science that has existed for the last 70 years.<sup id="fnref:3"><a href="#fn:3" class="footnote">3</a></sup> Today, scientists pursue funding for research by writing research proposals to any of a number of U.S. federal agencies, including the NSF, the National Aeronautics and Space Administration (NASA), the National Institutes of Health (NIH), the Department of Energy (DOE), and the Department of Defense (DOD). Depending on the particular rules of the grant opportunity, grants can be awarded for any length of time, from months to years or more, and for any number of resources, including researchers, hardware, and software. Whether university or college faculty, employees of non-profit organizations, or employees of for-profit organizations, many scientists depend on government funding for at least a portion, if not all, of their annual funding in order to conduct basic scientific research.</p>
<p>Which brings us to the present, when government-funded scientists spend a decent fraction of their time writing proposals to win funding in order to do research. Successfully completing their previous research is important, including publishing the results, but successfully winning funding can become far more vital to the survival of a scientific career. So a scientist combs through the announcement of opportunity, whether posted at the website of the NSF, the NIH, NASA, or elsewhere. She parses the instructions and the requests and then formulates a proposal outline that should satisfy those instructions and requests. Then the scientist tackles the toughest part of the proposal process: writing a description and justification of a research project that has never been done before. Even though the research is brand new, and thus a trip into the unknown, the funding agencies still want to know how you will manage the project to guarantee success and stay within budget. They want a schedule for something that has never been done before. This process is as difficult as it sounds, but the scientist must find a way to satisfy the proposal reviewers, both through research novelty and likelihood of success, because without this funding there will be no research at all.</p>
<p>This is the science funding process that I have now entered as a full-fledged PhD.<sup id="fnref:4"><a href="#fn:4" class="footnote">4</a></sup> Although I am funded, courtesy of a colleague, for two years, there is no time like the present to start acquiring my own funding. Since the <a href="http://science.nasa.gov/researchers/sara/grant-stats/">success rates are so low</a> for an individual proposal submitted to NASA, I have an incentive to start writing proposals as soon as possible and for as many programs as is reasonable. This strategy helps in two ways: one, this increases my odds, and, two, I get to learn about the proposal process while I still have the buffer of existing funding.<sup id="fnref:5"><a href="#fn:5" class="footnote">5</a></sup></p>
<p>And learn I did. First, there is the proposal process itself, e.g. developing a budget and schedule, and this means learning both the NASA process and the internal process of my institution. The latter is unique to each institution, so my experience working on instrument and spacecraft proposals as an engineer at the <a href="http://www.jpl.nasa.gov">NASA Jet Propulsion Laboratory (JPL)</a> was not as transferable as I would have hoped. In addition to learning the proposal process, you also learn about the research that you are proposing. This is where proposal writing became fun.</p>
<p>I am a scientist, not a writer, but I find that it is in writing that my work synthesizes into understanding. This happens while writing both journal papers and proposals. On each proposal, I discovered the questions I really wanted to ask and the methods I really needed to apply only when I began to put my fledgling ideas to paper (or, rather, <a href="http://www.latex-project.org/">LaTeX</a> in my case). The process was fun, though stressful due to the looming deadline. In the end, if I win I will understand my project better, thereby increasing the quality of my research. If I lose I will understand my project better, thereby allowing me to write a better proposal in the future. Sort of a win-win, right? Right?</p>
<p>For now, I am glad to have my first two proposals under my belt and look forward to the next funding opportunity, which is November for me. In the mean time, it is back to day to day research, which means coding, simulations, and analysis. And thinking about the planets, which is always fun.</p>
<h3 id="post-script">Post Script</h3>
<p>Admittedly, I was pretty excited when I submitted my first proposal as PI. Being the PI of a research project represents a major milestone in being an independent scientist. This being the 21<sup>st</sup> century, I turned to Twitter to share the excitement:</p>
<blockquote><p>Just submitted my first NASA proposal as Principle Investigator. #nobucksnobuckrogers Let's see how this goes. #fingerscrossed</p><footer><strong>@soto97</strong> <cite><a href="https://twitter.com/soto97/status/348147316101898240">twitter.com/soto97/status/&hellip;</a></cite></footer></blockquote>
<p>By the second proposal, two weeks later, I just wanted to get back to work:</p>
<blockquote><p>Second PI proposal submitted. No more NASA proposals until Nov. so back to research. Now, where did I put that Titan weather model . . . ?</p><footer><strong>@soto97</strong> <cite><a href="https://twitter.com/soto97/status/352163862268751873">twitter.com/soto97/status/&hellip;</a></cite></footer></blockquote>
<p>Novelty wears off so quickly.</p>
<div class="footnotes">
<ol>
<li id="fn:1">
<p>Many U.S. science agencies categorize the scientists working on a project into various groups. Most science teams include a single Principal Investigator (PI), who has ultimate responsibility for the science project, and multiple Co-Investigators (Co-Is), who support the PI in executing the science project. This organization is not completely universal: on larger NASA missions there may be instrument PI’s plus a Project Scientist who oversees all of the PI’s and a Project Manager who has budgetary responsibility. For the kinds of scientific grants that I am discussing, the PI has scientific, budgetary, and schedule authority and responsibility. Though the size of the project is likely minuscule compared to a space mission, being the PI of a science grant is still an important role. <a href="#fnref:1" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:2">
<p>Bush, Vannevar (Ed.). (1945). Science: The Endless Frontier. United States Office of Scientific Research and Development. Washington, DC: United States Government Printing Office. <a href="#fnref:2" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:3">
<p>This story deserves its own article, which I hope to get to in the future. <a href="#fnref:3" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:4">
<p>I will also outline this process in a later article. <a href="#fnref:4" class="reversefootnote">&#8617;</a></p>
</li>
<li id="fn:5">
<p>What if I win funding this year, while I already have funding? The PI of my current funding will just move that money to another researcher and I will switch to the new project. There is no excess windfall for me nor anyone else. <a href="#fnref:5" class="reversefootnote">&#8617;</a></p>
</li>
</ol>
</div>
Sun, 14 Jul 2013 15:37:00 -0600https://alejandrosoto.net/blog/2013/07/14/funding-the-lifeblood-of-scientists/
https://alejandrosoto.net/blog/2013/07/14/funding-the-lifeblood-of-scientists/sciencefundingPostdoc with Southwest Research Institute<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="http://farm9.staticflickr.com/8218/8430546639_a4fa5d921e.jpg" alt="Boulder, CO" />
<p class="image-caption">The Flatirons from Pearl Street (Boulder, CO).</p>
</div>
<p>The arrival of 2013 brings a new home and new job for me. A few weeks ago I started as a Postdoctoral Researcher at the <a href="http://www.boulder.swri.edu">Planetary Science Directorate</a> of the <a href="http://www.swri.edu">Southwest Research Institute (SwRI)</a>, where I will continue to study Titan’s atmospheric dynamics. The SwRI Boulder office has a great group of planetary scientists working on a number of projects, from exoplanets to <a href="http://www.boulder.swri.edu/~kretke/Research.html">planetary formation</a> to <a href="http://www.boulder.swri.edu/~hal/">solar system dynamics</a> to <a href="http://www.boulder.swri.edu/~hamilton/VEH/Home.html">planetary geology</a> to <a href="http://www.boulder.swri.edu/~rafkin/">planetary atmospheres</a>. Plus, they are in charge of the <a href="http://missionjuno.swri.edu">Juno mission</a> and the <a href="http://pluto.jhuapl.edu">New Horizons Pluto mission</a>. This will be a fun place to work, and the fact that I will be living in <a href="http://en.wikipedia.org/wiki/Boulder,_Colorado">Boulder, CO</a> is just a bonus.</p>
Sun, 03 Feb 2013 14:45:00 -0700https://alejandrosoto.net/blog/2013/02/03/postdoc-with-southwest-research-institute/
https://alejandrosoto.net/blog/2013/02/03/postdoc-with-southwest-research-institute/Thesis results at AGU 2012<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="http://farm3.staticflickr.com/2701/4212939715_107dd956ea.jpg" alt="San Francisco" />
<p class="image-caption">The San Francisco skyline in December.</p>
</div>
<p>The second stop of my thesis results world tour was the <a href="http://fallmeeting.agu.org/2012/">2012 American Geophysical Union (AGU) Fall Meeting</a>, which is held every year in downtown San Francisco at the Moscone convention center. This was a short trip for me, since I did not stay the whole week, but I did have four hours of conversations about my <a href="/docs/Soto_AGU2012_poster.pdf">poster</a>, which was a lot of fun.</p>
Thu, 06 Dec 2012 13:27:00 -0700https://alejandrosoto.net/blog/2012/12/06/thesis-results-at-agu-2012/
https://alejandrosoto.net/blog/2012/12/06/thesis-results-at-agu-2012/researchconferenceMarsthesisTime for Titan<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="http://farm9.staticflickr.com/8034/8055838015_ee4b8164bb.jpg" alt="Abstract Titan" />
</div>
<p>After a great half year in Colorado working with <a href="http://inside.mines.edu/~jcahanna/">Dr. Jeffrey Andrews-Hanna</a> I have returned to Pasadena, California to again collaborate with <a href="http://www.gps.caltech.edu/~tapio">Dr. Tapio Schneider</a>. Dr. Schneider and I will be investigating the methane cycle on present-day Titan, with an emphasis on understanding the large-scale processes that influence precipitation, both seasonally and secularly. Although I am still working on various Mars projects, I am excited to tackle a different terrestrial planet, particularly a slowly rotating planet with a thick atmosphere. There will be some interesting new physics to learn.</p>
Tue, 04 Sep 2012 22:30:00 -0600https://alejandrosoto.net/blog/2012/09/04/time-for-titan/
https://alejandrosoto.net/blog/2012/09/04/time-for-titan/jobresearchThesis Results at LPSC 2012<!-- _includes/image.html -->
<!-- From: http://codingtips.kanishkkunal.in/image-caption-jekyll/ -->
<div class="image-wrapper">
<img src="http://farm8.staticflickr.com/7077/6928992598_5940b9a9be.jpg" alt="A session at LPSC." />
<p class="image-caption">One of the Mars sessions on the afternoon of the last day of LPSC 2012. Only the diehards remain.</p>
</div>
<p>One of my thesis chapters discusses the climate dynamics of atmospheric collapse on Mars. Although I am still working on submitting a version of this chapter to a peer-reviewed journal, I thought that the Lunar and Planetary Science Conference (LPSC) would be a good place to present the results. Since the work would be difficult to squeeze into the standard 12-minute LPSC talk, I chose to present the work as a poster.</p>
<!-- more -->
<p>What is atmospheric collapse? If the temperature at any point in an atmosphere becomes colder than the condensation temperature of the primary constituent of the atmosphere, then that primary constituent condenses or deposits (basically, the bulk atmosphere ‘snows out’). “Atmospheric collapse” is the term used by many in the planetary atmosphere community to describe this phenomenon. For Mars, the primary constituent of the atmosphere is carbon dioxide. For the current global mean surface pressure of Mars, which is around 6 millibars, the condensation temperature of carbon dioxide gas is ~148 K. Therefore, if the atmospheric temperature ever dipped below ~148 K at some location, the carbon dioxide there would condense as ice. Thus, there would be atmospheric collapse.</p>
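<p>The ~148 K figure can be checked with a back-of-the-envelope calculation: integrate the Clausius–Clapeyron relation from the carbon dioxide triple point, assuming a constant latent heat of sublimation. Here is a Python sketch of my own (the constants are approximate textbook values, not taken from the climate models discussed here):</p>

```python
import math

# Estimate the CO2 condensation ("frost point") temperature at a given
# pressure by integrating Clausius-Clapeyron from the CO2 triple point,
# assuming a constant latent heat of sublimation.
L_SUB = 5.9e5                  # latent heat of CO2 sublimation, J/kg (approximate)
R_CO2 = 8.314 / 0.04401        # specific gas constant of CO2, J/(kg K)
T_TP, P_TP = 216.55, 5.185e5   # CO2 triple point: temperature (K), pressure (Pa)

def frost_point(pressure_pa):
    """Temperature (K) at which CO2 gas condenses at the given pressure (Pa)."""
    return 1.0 / (1.0 / T_TP - (R_CO2 / L_SUB) * math.log(pressure_pa / P_TP))

# At the ~6 millibar (610 Pa) global mean surface pressure of present-day Mars:
print(round(frost_point(610.0), 1))  # roughly 148 K
```

<p>With these approximate constants the estimate comes out near 147–148 K, consistent with the condensation temperature quoted above.</p>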
<p>Previously, scientists had hypothesized that the Martian atmosphere may have collapsed at some time in its past. Due to the large variations in the <a href="http://www.imcce.fr/Equipes/ASD/insola/mars/mars.html">Martian orbital elements</a>, it is possible that Mars experienced the right conditions to initiate the collapse of the atmosphere. The previous work used <a href="http://en.wikipedia.org/wiki/Climate_model">one-dimensional and two-dimensional climate models</a> to investigate the stability of the Martian atmosphere. I have continued this line of research by using a general circulation model to investigate atmospheric collapse.</p>
<p>The presentation of the <a href="/docs/soto_lpsc2012_poster.pdf">poster</a> went well. Some of my colleagues were very interested in the work, and some great discussions were had.</p>
<p><em>Update</em>: The paper on this topic was <a href="http://www.sciencedirect.com/science/article/pii/S0019103514006575">published in Icarus</a>.</p>
Fri, 30 Mar 2012 19:40:00 -0600https://alejandrosoto.net/blog/2012/03/30/thesis-results-at-lpsc-2012/
https://alejandrosoto.net/blog/2012/03/30/thesis-results-at-lpsc-2012/researchconferenceMarsthesis