Lorinda Cherry received her master's degree in computer science from the Stevens Institute of Technology in 1969. Computer science then consisted mostly of studying Turing machines and numerical analysis rather than compiler or data-structure courses. About these studies she says, "...I'm not sure that anyone who was here at that time was really a computer scientist. I think everyone's training was in something else, it was in math or engineering."

Cherry joined the Unix group in 1972 as an assembly-language programmer, initially tasked with bringing up Unix systems. Because each system had to be hand-crafted with the proper device drivers, it was important to have someone on the project doing this work. She soon became involved with text tools, however.

If people know her at all, it is mostly as co-author of eqn and bc. Co-authoring eqn with Brian Kernighan may have eclipsed her fame somewhat.
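As a taste of bc, here is a small sketch of its arbitrary-precision arithmetic; the function name f is just an illustrative choice (classic bc restricts function names to a single letter):

```bc
/* recursive factorial in bc; integer results can grow without bound */
define f(n) {
    if (n <= 1) return (1)
    return (n * f(n - 1))
}
f(25)    /* prints 25! exactly, with no overflow */
```

Piped into bc on a terminal, the definition and the call produce the full exact value, which is the point of a desk-calculator language with arbitrary precision.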

As the interview linked below makes clear, the text corpus on which much early Unix text-processing development was done was the Federalist Papers, typed into Unix from a paperback via a teletype terminal. The justification for much of this work was research into "statistical analysis of authorship."

A more detailed precis on Cherry and her work, by Malika Seth
An interview with Cherry at Princeton's 'History of Science' site
BC − An Arbitrary Precision Desk-Calculator Language (pdf)

On Quality and Naturality

Cherry: No, the graphics is easy. The hard part is getting a language that you can teach to a math typist that will just flow off her fingertips to complicated graphics. I think the language part of that is what was neat about it. It's still what's neat about it. The graphics part of it, I think TeX is still better as far as what EQN does and what TeX does. From a mathematical standpoint I think you'll find TeX better, but I don't think TeX stuff is anywhere near as natural to work with. That may be very prejudiced, I don't have a…
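The naturalness Cherry describes shows in eqn's input language, which reads almost like spoken mathematics. A small sketch (the quadratic formula, wrapped in the .EQ/.EN delimiters troff uses to mark equations):

```
.EQ
x = {- b +- sqrt { b sup 2 - 4ac }} over { 2a }
.EN
```

Read aloud, the source is nearly the sentence a math typist would hear dictated: "x equals minus b plus-or-minus the square root of b squared minus 4ac, over 2a."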