@mrmrs_ @jxnblk Some thoughts from testing it for PH:
* Would love if I could zoom into the specificity graph (i.e. what is the line that produces it)
* Color codes don't seem to be normalized, i.e. #8F8683 != #8f8683
* I want this as a weekly email report / automated test we can run (I'd pay for it!)
* For media queries it would also be useful to zoom in / see how often a certain query was used
Thanks for making this :)

@lukasfittl Thanks!
1. We plan on adding something like a detail view for all selectors – probably a table.
2. That's a conscious decision. It can help show code style inconsistencies. See https://github.com/mrmrs/cssstat...
3. Me too :)
4. More details about media queries is definitely something we want to add.

@staringispolite As a designer I definitely refactor a lot of css - so I was constantly asking a lot of questions about whatever codebase I was currently working on. I had started to use some Unix tools to automate answering them. Some were design-based questions like "what colors do I have to work with? what font sizes can I use?" I'd do things like grep for a certain property and pipe it to only show me unique values. I'm not the greatest wizard with Unix though, so there were a lot of questions I still couldn't get answers to easily, which frustrated me.
Then I saw a talk by Alex Sexton about hacking on Abstract Syntax Trees generated by open source preprocessors. This pretty much blew my mind at the possibilities of questions you could answer given the data structure of the AST.
Automating things with Unix can be powerful - but I wasn't good enough at it to grep for certain patterns. Once I had an AST in JSON format using a node module called css-parse, I was pretty quickly able to write some hacky JavaScript to start replicating my Unix scripts... and so much more. Around this time my friend @jxnblk and I were spending a lot of time talking about all the different questions we both had about css when working within a new system. After I showed him some of my hacky JavaScript, he helped me transition it into an actual webapp that we put out last November (2013), since we were finding the tool useful for our own dev needs and thought it might be useful for others.
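Once the stylesheet is a plain data structure, those "what colors do I have to work with?" questions become a few lines of JavaScript. Here's a rough sketch of the idea - the AST literal is hand-written in the general shape css-parse emits (stylesheet → rules → declarations), so the real module's output may differ slightly, and `uniqueValues` is a made-up helper for illustration, not part of cssstats:

```javascript
// Hand-written AST in roughly the shape css-parse produces from css-parse(source).
const ast = {
  stylesheet: {
    rules: [
      { type: 'rule', selectors: ['body'],
        declarations: [{ type: 'declaration', property: 'color', value: '#8F8683' }] },
      { type: 'rule', selectors: ['a'],
        declarations: [
          { type: 'declaration', property: 'color', value: '#8f8683' },
          { type: 'declaration', property: 'font-size', value: '16px' }] }
    ]
  }
};

// Collect every distinct value used for a property - the AST version of
// `grep 'color' styles.css | sort -u`.
function uniqueValues(ast, property, { normalize = false } = {}) {
  const values = new Set();
  for (const rule of ast.stylesheet.rules) {
    if (rule.type !== 'rule') continue;
    for (const decl of rule.declarations) {
      if (decl.type === 'declaration' && decl.property === property) {
        values.add(normalize ? decl.value.toLowerCase() : decl.value);
      }
    }
  }
  return [...values];
}

console.log(uniqueValues(ast, 'color'));                      // raw: two entries, case differs
console.log(uniqueValues(ast, 'color', { normalize: true })); // lowercased: one entry
```

The `normalize` flag is also the point raised above about #8F8683 vs #8f8683 - without it, the same color in different cases shows up twice, which is exactly the kind of code-style inconsistency the tool surfaces.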
Over the past year Jxn has done a lot of work around automating css documentation with js on various personal projects like Basscss, and has pushed forward a lot of the ideas we originally started chatting about.
Recently he took a stab at putting a new version of cssstats together with improved performance and architecture on both the front and back ends. The original version was brittle and difficult to iterate on, so we're excited to have a more flexible system moving forward.
I think most companies don't know how much time and money they're unnecessarily losing to pulling apart messy css / front-end systems - hopefully we'll see more tools like cssstats that help visualize the current state and complexity of a system so teams can make informed design / dev decisions. While there are neat ways to automate css documentation, they require forethought and maintenance. Ideally, I think cssstats would be able to spit out a rudimentary style guide that both designers and developers could use to build or refactor interfaces more quickly.

@jxnblk @mrmrs_ Wow, thanks for the super detailed response! At a previous company, it took them 5 years to get to a nice cohesive/interactive style guide. (And even then, it was a passion project of the main front-end engineer and never actually made it onto the roadmap.) I definitely feel your pain refactoring CSS from the engineering side. Automated style guides (and warnings of deviations?) would save a lot of time and add clarity to design discussions.
I also find myself imagining some benchmarks for CSS complexity now, like "your site has double the font sizes of Apple.com: consider simplifying" or "55% as many X as Salesforce.com: great job!"
Almost like HubSpot's "Website Grader", but from a design-complexity and performance standpoint.