tag:blogger.com,1999:blog-69315770837167527462017-09-05T23:23:23.721-04:00Finding FocusSoftware Development, Technology, Peace and HappinessJohn Rosehttps://plus.google.com/106748084801427416443noreply@blogger.comBlogger43125tag:blogger.com,1999:blog-6931577083716752746.post-4928822157091918322014-11-22T15:13:00.001-05:002014-11-22T15:13:26.676-05:00Pioneer A4 Airplay Speaker: Fixing The "Too Much Bass" Issue<p>Here's how you fix the "boomy bass" issue on your Pioneer A4. This is a method I haven't seen recommended elsewhere. </p><h3>Background</h3><p>From 2012 to 2013, Pioneer manufactured a line of fantastic network speakers: the A1, A2, A3, and A4. The top-of-the-line A4 model has superb sound and <a href="http://www.soundandvision.com/content/review-pioneer-a4-wireless-speaker-page-2">received rave reviews everywhere</a>, and it was <a href="http://www.thewirecutter.com">The Wirecutter's</a> pick for Best Airplay Speaker until Pioneer discontinued it. The Pioneer A4 supports Apple's AirPlay as well as HTC Connect, DLNA, Spotify Connect, and of course a standard 1/8" headphone jack audio input. </p><p>There's one drawback to the A4: while it does have best-in-class sound, the bass can be overpowering, as <a href="http://www.amazon.com/gp/product/B00903HKXI?tag=thewire06-20&linkCode=as2">Amazon reviewers have noted.</a></p><p>Most people recommend fixing the problem by adjusting iTunes' audio EQ settings, or by simply stuffing a sock in the bass ports in the A4's rear. Those methods both work but rob you of the A4's full potential. </p><h3>A Better Solution</h3><p>You can <i>acoustically isolate</i> your A4 by placing energy-absorbing rubber or foam feet between the speaker and the surface it rests upon. This gives you crystal-clear, powerful sound, including bass that I wouldn't have thought possible from a unit this size. </p><p>My favorite material for this job is Sorbothane, which is specifically designed to absorb vibrations. It's the same material some people place under their washing machines. Other types of foam may work as well, but you're on your own. </p><ol><li>Get some Sorbothane. For the most value, you can <a href="http://www.amazon.com/s/?_encoding=UTF8&camp=1789&creative=390957&field-keywords=sorbothane%20sheet&linkCode=ur2&tag=bootyproject02&url=search-alias%3Delectronics&linkId=NVHA4F64FWEJ3YMZ">buy it in sheets</a> and trim it however you like. Or you can buy it in <a href="http://www.amazon.com/s/?_encoding=UTF8&camp=1789&creative=390957&field-keywords=sorbothane%20disc&linkCode=ur2&rh=n%3A172282%2Ck%3Asorbothane%20disc&tag=bootyproject02&url=search-alias%3Delectronics&linkId=4VLT45LZZY5UVHWD">pre-cut discs</a> or <a href="http://www.amazon.com/s/?_encoding=UTF8&camp=1789&creative=390957&field-keywords=sorbothane%20feet&linkCode=ur2&rh=n%3A172282%2Ck%3Asorbothane%20feet&tag=bootyproject02&url=search-alias%3Delectronics&linkId=3XPPDC3WVZLBDFFL">bumpers</a>. </li><li>Place the Sorbothane between your A4 and the surface it's resting upon.</li><li>How much Sorbothane? It depends on the surface. I used four 1" square pieces of 1/4" thick Sorbothane.</li></ol><p>Ideally, you should be able to place your hand on the surface next to your speaker, play a song with powerful bass, and <i>not</i> be able to feel any vibrations. If you do feel vibrations, consider adding more Sorbothane until you're happy with the sound.
</p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com5tag:blogger.com,1999:blog-6931577083716752746.post-40214515954773913632014-10-22T11:17:00.001-04:002014-10-22T11:17:33.637-04:00Apple Recap<p>How'd I do with those predictions? </p><p>The big surprise, to me, was that the "It's Been Way Too Long" teaser image appears to have been nothing but a red herring. It doesn't seem to match up with any of the things they actually announced. It makes me wonder if perhaps they had to shelve a planned announcement. </p><p>The iPad Air and iPad Mini received incremental updates. The star of the show, obviously, was the Retina iMac. </p><p>As <a href="http://www.marco.org/2014/10/16/retina-imac-vs-mac-pro">Marco Arment notes</a>: </p><blockquote>According to early Geekbench reports, the 4 GHz, 4-core Retina iMac appears to be 25% faster than the 6-core Mac Pro in single-threaded tasks and only about 15% slower in multi-threaded tasks... <br><br>...The Mac Pro is still more expandable than the iMac in some ways. It has 6 Thunderbolt ports across 3 buses for more monitors and high-bandwidth external storage capacity, and it supports up to 64 GB of RAM instead of the iMac’s 32 GB ceiling. Otherwise, the differences are small. <br><br>Retina iMac with 4 GHz, 16 GB, 512 GB SSD, M295X: $3500<br>Mac Pro with 6-core, 16 GB, 512 GB SSD, D500: $4300</blockquote><p>But that's a bit like saying a $70,000 Corvette is cheaper than a $250,000 Ferrari. Sure it is, but it's way more than I could ever realistically spend. My most realistic hope involves my boss buying himself a Retina iMac so I can at least check it out. </p> <p><strong>Biggest Letdown.</strong> The Mac Mini's update was more of a downgrade, going from a quad-core CPU to a dual-core CPU, losing its second drive slot, and losing user-upgradable RAM. That's a real bummer, because Mac Minis make pretty good servers and you can host them very cheaply at hosts like <a href="http://www.macminivault.com/">MacMiniVault</a> or <a href="http://macminicolo.net/">MacMiniColo</a>. I'd also been considering getting a 2014 Mini I could use for development work, since you can plug a couple of 27" monitors into one -- but the new Mini is just too lackluster. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-2717507989646399402014-10-09T11:46:00.001-04:002014-10-09T11:46:50.086-04:00Apple's October 16 Event: "It's Been Way Too Long"<img style="float: right; margin-left:2em; margin-bottom:1em" src="http://lh5.ggpht.com/-UiQm5J5jAv0/VDao6eZBODI/AAAAAAAAARo/LmuArKih3uU/apple-invite-1014.jpg?imgmax=800" alt="Apple invite 1014" title="apple-invite-1014.jpg" border="0" width="300" height="161" /><p>That's an interesting tease for an Apple event. What could they be referring to? </p><p>Not the Mac Mini. It's one of my favorite Apple things, but it's also their least important (to them) product. There is zero chance it's headlining an event. </p><p>As everybody else has noted, the logo appears to be a throwback to the classic rainbow-colored Apple logo from the 70s, 80s, and 90s. It'd be cool if they were refreshing their branding. I give that possibility a "maybe," though I also don't think they'd throw a press party for that alone. </p><p>My guess is that the colors are a reference to colored Apple products. I'm almost embarrassed to admit that I hope this is true, but... I do.
The colored aluminum on the iPods of yore was cool; imagine a colored anodized MacBook in your color of choice. </p><p><strong>Things I Predict We'll See</strong><ul><li>OSX Yosemite, obviously.</li><li>Updated iPad Air, obviously.</li><li>Colored MacBook Pros and/or MacBook Airs.</li><li>Retina Thunderbolt Displays and/or iMacs.</li><li>Minor Mac Mini refresh. This might be a "silent" update where they don't mention it in the presentation, but it gets rolled out in the online store after the event.</li></ul></p><p><strong>Wish List</strong><ul><li>A "real" server-oriented Mac Mini. Max of 32-64GB of RAM instead of the current 16GB. ECC RAM for increased reliability. Lights-out management. Aside from a bump to 32GB I don't think these are even possible; the Mac Mini's form factor means it's pretty much always going to have "laptop guts" and not one of Intel's bona fide server-oriented chipsets.</li><li>Some possible way that a Retina Thunderbolt Display could work with my 2011 MacBook Pro @ 60Hz. Again, I don't think this one is physically possible.</li><li>Free hookers, world peace, pizza.</li><li>A revelation that the A7/A8/A9 chips are somehow hardware-optimized to run their Swift programming language at obscene speeds, faster than C or hand-tuned assembly, and that they're turning this feature on for everybody <em>today</em>. Yeah, this is even less likely than the hookers, peace, and pizza.</li><li>Some kind of renewed Free / open source software commitment.</li></ul></p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-81133719007290397472014-06-03T12:19:00.001-04:002014-06-03T12:19:12.107-04:00Apple's 2014 WWDC Announcements<img src="http://lh5.ggpht.com/-yj-fMozP1c8/U431fdoqu2I/AAAAAAAAAQY/vDVh07CwUgM/WWDC-2014-logo.jpg?imgmax=800" alt="WWDC 2014 logo" title="WWDC-2014-logo.jpg" border="0" width="300" height="168" style="float:right; margin: 0 0 1em 1em" /><p>I like that there were no hardware announcements. Nice that this one was just for the developers. Even if I just write code on Macs without targeting OSX or iOS. </p><p><strong>Swift Programming Language.</strong> If you're a programmer, you have pretty much the same schizophrenic reaction to any new language announcement. One half of your brain is excited about a new programming language, and the other half of your brain is screaming <em>oh God, yet another programming language</em>. </p> <p>It's great that there's an official script-ish alternative to Objective-C for iOS development. That is, I think, objectively a good thing. Especially when you can mix and match Obj-C and Swift in a single project, which is the single most important thing about this announcement; it means you can upgrade your old Obj-C projects in a piecemeal way, or quickly write something in Swift and replace bits of it with Obj-C if you need to, and so on. </p><p>But why a new language? Why not adopt Ruby or Python or Lua or...? </p><p>At the very least, I hope the <a href="http://www.lighttable.com/">Light Table-ish</a> live preview thing finds a niche in education. Maybe it's the <a href="http://en.wikipedia.org/wiki/Logo_%28programming_language%29">Logo</a> for our generation. </p><p><strong>Metal Graphics API.</strong> What is this, 1997? Think about how badly OpenGL would have to drop the ball for a couple of decades in order for people to even think about going back to the bad old days of every piece of graphics hardware having its own proprietary, incompatible API.
Well, that's how badly OpenGL dropped the ball. Everybody point at OpenGL and laugh.</p> <p>What I found unclear is what Metal actually is. iOS only? iOS and maybe OSX someday? iOS devices have a wide range of graphics hardware. So Metal lets you code "close to the metal," but <em>which</em> metal? All of them?</p> <p>It would be nice if Metal were an alternative to DirectX or OpenGL in the sense that it's hardware-agnostic but provides an approach that's a little more in tune with how modern graphics hardware actually functions. I don't think that's what it is, though.</p> <p><em>Edit: It looks like all iOS devices have used PowerVR graphics. I don't know how similar they are, but I'm assuming there's a fair bit of commonality. So I guess that's what Metal targets, specifically.</em></p> <p><strong>HomeKit.</strong> Yawn. Cool in theory, but in reality, there's just not really that much I want to automate in my home. We have pets, so I can't really let the house go down to 40F in the winter or let it get up to 90F in the summer to save a few pennies while we're at work.</p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-18019506151459491442014-02-04T05:59:00.001-05:002014-02-04T06:20:35.177-05:00"I'd Like To Learn To Code!"<p>So, you'd like to learn to code. </p><p>First things first: that's great news! </p><p>Making software is one of the more meritocratic pursuits we have going right now and, if your goal is to make it into a career, you can do it as a self-taught coder without a formal degree. Better yet, many of the tools and learning resources are free... and amazing. </p><h3>What Do I Need?</h3><p>You need to be smart, although not necessarily scary-smart. </p><p>You need to enjoy this stuff. And you need to enjoy staying on top of it, because things change really, really quickly... usually in fun and awesome ways, but always at a relentless pace. We are a young industry; just a few decades old. </p><p>If you don't want to constantly change and adapt, consider something like furniture making. I do not mean that as an insult. Making furniture is awesome and takes a lifetime to master. Your children are more likely to fight over a piece of furniture you made than some code you wrote. What I mean is that the basic practices of furniture making haven't changed much in centuries. </p><p>I am serious: for anything other than doing this on a hobby basis, you really need to enjoy the "constant change" part! </p><h3>You Don't Need To Be Good At Math</h3><p>It helps, of course! There are coding jobs that require it. If you're doing number-crunching for scientists, you'll need to know statistics. If you're doing 3D game programming, you'll need to know matrix math and such. And so forth. And in general there are quite a few parallels between the thinking one needs for math and programming. </p><p>But generally speaking, there's usually not any math involved. </p><h3>The Thing Nobody Tells You</h3><p>If you plan on making this into a career, you probably need to be good with people. I know -- nobody ever says this, right? I truly believe this, though. You will be working with others, and your job is to understand what they want and translate the needs of non-coders into code. </p><h3>What Language Should I Learn?</h3><p>Practically speaking, you'll need a few. </p><p>The good news is that most programming languages follow the same basic principles. Programmers move between them fairly easily...
and we generally find this process pretty fun! </p><p>We have lots of programming languages because different languages are suited to different tasks. It's just like the real world, where we have one "language" to describe music (sheet music), another "language" to describe math, and so forth. </p><p>For example, web application developers are at least conversant in... </p><ul><li>HTML, CSS, and JavaScript+jQuery to describe web pages.</li><li>A language like Java, PHP, Ruby, Python, Javascript, or C# that runs on the server and talks to the web browser.</li><li>Some tools for storing data have their own language, like SQL, though often you can work with them right from Java/PHP/Ruby/Python/C#/etc.</li></ul><p>There are plenty of other paths one can take. For example, complex games are usually written in a mix of a nitty-gritty language like C or C++ and a "scripting" language like Lua. Web games are usually JavaScript. iOS applications are written in Objective-C. Android applications are written in Java. </p><h3>Where Do I Start Learning?</h3><p>There are so many possible answers here! </p><p>I'm not going to pretend to have a comprehensive knowledge of every resource, because nobody does. </p><p>I'd recommend the following as a first step. These online courses run in your browser; you don't have to buy or install anything on your computer. These are good for discovering whether you're good at this stuff and whether you find it fun. </p><p><a href="http://tryruby.org/">http://tryruby.org/</a><br><a href="http://try.jquery.com/">http://try.jquery.com/</a><br><a href="http://www.trypython.org/">http://www.trypython.org/</a><br><a href="http://railsforzombies.org/">http://railsforzombies.org/</a></p><p>One that I personally use and recommend is: <a href="https://www.codeschool.com">https://www.codeschool.com</a></p><p>They offer a lot of free resources and, if you enjoy them, you can access all of their material for $29 a month. Their "courses" are split into short, fun videos. You watch a few minutes of video, do a coding exercise in your browser, and then move on to the next lesson. </p><p>I also recommend Code School because their courses are organized into various "paths" - they have a Ruby Path, a Javascript Path, an HTML/CSS Path, and so forth. <em>If you complete all or most of them, you'll be pretty darn ready to develop web applications.</em></p><h3>How Do I Actually Get a Job?</h3><p>Coding is a great hobby, obviously... it doesn't need to be a job. But if you are trying to get paid for it... </p><p>Well, you could make a great application and sell it. </p><p>If you want somebody to hire you to write code, the best way to get your foot in the door (when you have no prior experience) is to write some sample applications. Put them online and then put the code on GitHub, which is kind of like Facebook for writing code. Or contribute code to others' open-source applications... most of which are on GitHub these days. </p><p>One other thing to keep in mind is that your choice of languages will influence your employability a bit. For example, Microsoft's C# and Sun/Oracle's Java tend to be used in corporations. Open-source languages like Ruby and Python are more likely to be used at smaller companies and start-ups. Don't take my word for it; check the local job listings on Craigslist and see what's in demand in your area.
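</p><p>One last thing: if you've never seen a line of code and are wondering what it even looks like, here's a tiny but complete Ruby program -- a number-guessing game of the sort you might write in your first week. (Ruby is just my pick for this example, and the program is made up; every language mentioned above has an equivalent.) </p><pre><code># guess.rb -- run it from a terminal with: ruby guess.rb
secret = rand(1..10)        # pick a random number from 1 to 10
puts "I'm thinking of a number between 1 and 10."

loop do
  print "Your guess? "
  guess = gets.to_i         # read a line of input and turn it into a number
  if guess == secret
    puts "Got it!"
    break                   # leave the loop; the program ends
  elsif guess > secret
    puts "Too high. Try again."
  else
    puts "Too low. Try again."
  end
end</code></pre><p>If you can squint at that and roughly follow what's happening, you're going to do just fine.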
</p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com11tag:blogger.com,1999:blog-6931577083716752746.post-25646578587206719552013-09-17T10:20:00.001-04:002013-09-17T10:20:20.147-04:00Technology: It's About Empathy, Dummy<p>In the real world, technology and software development are about empathy as much as they're about engineering chops.</p><p>Writing successful software, even if you're just writing an API or component for another coder, requires an understanding of your users' wants and needs. </p><p>The typical real-world IT project goes something like this: something is going wrong, and you're asked to fix it with software. The people asking you for these solutions frequently have no idea what they want, and wouldn't have the technical vocabulary to describe it to you even if they knew. Worse, sometimes they think they know exactly what they want, and you need the finesse to tell them why their idea is wrong and suggest a better one... all while helping them to feel as though they're playing an important role in the process, which they actually are, just not in the ways they're intending to. </p><p>And that's fine, actually, because if they knew how to describe and write their own code, they wouldn't need you. That's why you have a job. </p><p>There are certainly engineering jobs where empathy plays less of a direct role. If you're designing processors at Intel, or writing software to sift through telescope data to discover asteroids, you will perhaps be less concerned with others' needs. </p><p>Know, however, that those kinds of jobs represent a very small portion of the opportunities out there in this amazing field. </p><p>Also remember this: if you're just a "good coder," your job can be fairly easily outsourced. What they can't outsource so easily is your empathy and your understanding of intangibles. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-53225163850335970152013-03-26T09:13:00.001-04:002013-04-04T10:40:06.732-04:00Hey, Men! Let's Be Awesome.<p>There's been much talk about the role and treatment of women in the tech industry lately. </p><p>Let's step away from the cesspit of online debate. There is a time and place for pointing out what others have done wrong; since the rest of the Internet has that covered already, let's focus on the awesome, positive things we can do. </p><p>As men, let's: </p><ul><li><strong>Recognize that we have a position of privilege and power in the overwhelmingly male technology industry.</strong> <em>"Privilege" doesn't mean you haven't worked hard and earned your achievements. I believe you when you say you've worked your butt off.</em></li><li><strong>Recognize that, particularly if you're a white male, you may have never experienced what it's like to be in the minority.</strong> Let's not tell women or anybody else how they should feel about it. <em>This does not mean you are bad because you are a white male! I'm a white male; I think I'm pretty alright. </em></li><li><strong>Realize that a lot of women don't appreciate sexual jokes and conversation from men they don't know.</strong> Some enjoy it, some don't care, and many dislike it. They may even find it threatening. Even if you think this is dumb (it's not), simply accept that a lot of women feel that way. We would not want our mothers, sisters, or wives subjected to unwanted sexual conversations from strangers.
</li><li><strong>Realize we can still make jokes about boobs and penises.</strong> Nobody is taking that away from us. Let's save it for our friends (of any gender) who enjoy those kinds of jokes. </li><li><strong>Realize that engineering is the art of creative problem-solving, and we benefit from others' perspectives.</strong> Solving problems involves understanding them. Often, this means understanding people. We need more perspectives, not fewer. </li><li><strong>Realize that accepting women into our industry means <em>accepting women.</em></strong> Not just accepting women as long as they "think like men." </li></ul><p>This industry is important; I really believe in it. If you're reading this, I think you believe in it. We really hurt this industry when we exclude bright minds and new perspectives from our field. </p><p>So, let's be awesome. </p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-80660354238370022912013-03-21T10:37:00.001-04:002013-03-21T10:37:58.353-04:00What Would A "Computer For Developers" Look Like?<p>"Workstation" computers typically seem designed for graphics/video professionals, or users who run scientific software. </p><p>Why have there been so few systems designed for software developers? The only example that really comes to mind is the <a href="http://www.dell.com/us/soho/p/xps-13-linux/pd">Developer Edition</a> of Dell's XPS 13 laptop. In this case, "Developer Edition" means that it comes preloaded with Ubuntu and Dell has sorted out any possible driver issues for you. (Dell, are you listening? I'll take one. Thanks.) </p><p>Software developers certainly seem like a market worth pursuing. There are <a href="http://en.wikipedia.org/wiki/Software_engineering_demographics">over 1.25 million Americans</a> who identify as software engineers or computer programmers. This is a <a href="http://www.bls.gov/ooh/Computer-and-Information-Technology/Software-developers.htm">well-paid profession</a> full of people who use their computers in a demanding fashion for at least eight hours a day, nearly every day. Why aren't computer manufacturers falling over themselves to serve this market? </p><p>One reason is that it's hard to define exactly what a software developer would want out of computer hardware, other than "fast, has a nice keyboard, and is hopefully portable." </p><p>Virtualization might be an answer. Software developers (and QA professionals) love to run multiple operating systems on a single computer, to test their software and take advantage of tools that only work under a particular operating system. Even the most diehard of Linux developers often needs multiple Windows installs around, if she's making web applications and needs to test them on Internet Explorer. </p><p>Existing desktop virtualization software like VMWare and Parallels works well, but can be clunky. You can't boot a guest operating system without booting the host, and guest operating systems have limited access to hardware resources like GPU acceleration. Full hypervisors like Xen or VMWare vSphere get around this issue, but are complex to configure and administer. </p><p>I'd love to see a company take Dell's "Developer Edition" approach a step further and sell a machine with something like Xen preinstalled, so that we could install/migrate/clone/snapshot multiple operating systems as easily as we copy around .txt files today. </p><p>Dell, are you listening?
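</p><p>To make that wish concrete, here's roughly the workflow I'm imagining, sketched in Ruby against VirtualBox's existing command-line tool. (The VM names are made up, and VirtualBox is just a stand-in -- it's a desktop app, not the bare-metal hypervisor I'm asking for -- but its real <code>clonevm</code> and <code>snapshot</code> commands already hint at the "VMs as files" experience.) </p><pre><code># A sketch of "VMs as easily as .txt files," driving VirtualBox's CLI from Ruby.
# The VM names below are hypothetical.

def run(cmd)
  puts "+ #{cmd}"
  system(cmd) or abort "command failed: #{cmd}"
end

# "Copy" a pristine VM: clone it for today's IE-testing session...
run %(VBoxManage clonevm "win2008r2-clean" --name "ie-test-today" --register)

# ...and snapshot the clone before doing anything destructive to it.
run %(VBoxManage snapshot "ie-test-today" take "fresh-clone")</code></pre><p>Now imagine those as first-class operations of the machine itself: bootable without a host OS, with real access to the hardware.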
</p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com2tag:blogger.com,1999:blog-6931577083716752746.post-9171206200003806922013-03-21T10:09:00.001-04:002013-03-21T10:09:32.647-04:00Apple, Mac Pros, and the "Halo Effect"<p>John Siracusa <a href="http://hypercritical.co/2013/03/08/the-case-for-a-true-mac-pro-successor">makes a case for Apple to re-invest in the Mac Pro line</a>. </p> <blockquote>In the automobile industry, there’s what’s known as a “halo car.” Though you may not know the term, you surely know a few examples. The Corvette is GM’s halo car. Chrysler has the Viper. ...Let’s talk about the Lexus LFA, a halo car developed by Toyota over the course of ten years. (Lexus is Toyota’s luxury nameplate.) When the LFA was finally released in 2010, it sold for around $400,000. A year later, only 90 LFAs had been sold. At the end of 2012, production stopped, as planned, after 500 cars. Those numbers should make any bean counter weak in the knees. The LFA is a failure in nearly every objective measure—including, I might add, absolute performance, where it’s only about mid-pack among modern supercars. The explanation for the apparent insanity of this product is actually very simple. Akio Toyoda, the CEO of Toyota, loves fast cars. He fucking loves them! That’s it. </blockquote><p>I'm a big believer in the halo effect, but I question whether it can be achieved in today's computer market. Apple, of course, is limited to using the same Intel chips that the rest of the industry uses, and just about anybody computer-savvy enough to crave a faster computer knows this. </p><p>Imagine a slightly parallel universe where the entire car industry had standardized on engines and power trains from General Motors. You'd never get excited about the new Ferrari or the new Mercedes because, at best, it'd be powered by the same Corvette engine that every other high-end car is using. It might have nicer seats and a better stereo, but those are nice-to-haves, not things you lust over. </p><p>Additionally, computers have been "fast enough" for most users for years now. I'm a software developer and I push my CPU pretty hard, and the fact is that the CPU in my 2011 MacBook Pro is almost never a limiting factor. </p><p>Are there any other ways Apple could build a "halo product" in today's computer industry? (Their crown jewel, of course, is OSX, but you already get that with every Mac.) </p><p><strong>Possibility #1: The "GPU Beast Mac Pro."</strong> Suppose Apple sells a Mac Pro that's stuffed to the gills with the biggest, best professional-grade GPUs that nVidia offers. That wouldn't be a bad idea, but unlike a faster CPU, very few users would benefit from extra GPU compute power. The vast majority of applications simply can't benefit from the speedup offered by GPU acceleration. Perhaps Apple could somehow make automatic utilization of GPU compute power more pervasive in OSX 10.9, but that's doubtful - Mail.app isn't going to fetch your emails any faster with the GPUs helping it out. </p><p><strong>Possibility #2: The "4K Retina Display iMac Pro."</strong> Apple's shown an inclination to differentiate its high-end notebook lineup by introducing displays with densities approaching 300dpi. One has to believe they're itching to be the first to accomplish this on the desktop, though of course they're limited by what the display manufacturers can currently produce.
What if the next "professional Mac" is a slightly more expandable iMac with an integrated <a href="http://en.wikipedia.org/wiki/4K_resolution">4K resolution</a> display? It's only a matter of time before Apple brings the high-DPI concept to the desktop, of course - it's just a question of how exactly they'll target it, and whether or not it will be part of the expandable "pro" machine that many of us are hoping for. </p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-48855770526751551982013-02-05T12:10:00.001-05:002013-02-05T12:10:29.643-05:00Plus / MinusWe'll start with the "plus" - keeping it somewhat positive in 2013! <h3>Plus Of The Week: Join.me</h3><p>A few years ago, it seemed like there was no drop-dead easy way to share your screen with a friend or coworker unless both of you were on a Mac and could use iChat's excellent screen-sharing feature. Services like LogMeIn required the installation of software; alternatives like VNC or Windows Remote Assistance were firewall-unfriendly. </p><p>Today, though, there are quite a few drop-dead easy alternatives that run inside a browser. Join.me, from LogMeIn, is one of them. About the only downside is that the person on the other end needs to have Flash enabled in their browser. There has to be something out there that functions in "HTML5-only" mode if Flash isn't installed, right? </p><h3>Minus Of The Week: No SFTP in Windows Server 2012</h3><p>It's kind of hard to believe that it's 2013 and Microsoft still ships a server version of Windows without SFTP, or some kind of substitute for it. (And no, FTP isn't a substitute.) </p><p>There are third-party implementations, but they tend to be spendy and/or bereft of any user community, which makes me awfully wary of them. </p><p>You can allow SSH/SFTP access via Cygwin - which is free, free, free - but in my experience there's no way to selectively chroot accounts on an account-by-account basis. You can lock all users into a single home directory, but you can't do it for some accounts and not others - so you're out of luck if you want to give full access to admins and restricted access to everyone else. It's entirely possible, of course, that I'm mistaken on this. <a href="mailto:johnedmundrose@gmail.com">Drop me a line and correct me</a>. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-69226854910035581532013-02-03T16:12:00.001-05:002013-02-03T16:25:57.774-05:00I Know It's 2013 But Seriously: Office 2011 For OSX<p> I <em>like</em> Office 2010 on Windows. I like the Ribbon interface. Really, I do. As much as I'm a Mac guy for many things, I think that the Ribbon interface, when done well, is a great use of space and is far preferable to lots of tiny menus, nested three or four deep. </p><p> I certainly like it better than the iWork (Pages, Numbers, and Keynote) interface on OSX, with its tons of floating toolbars. You can't just drag a Pages window around, nooooo - you must also drag three or four separate windows at times. I assume that OSX will eventually get a version of iWork that carries over many of the lessons learned from creating iWork for iOS, but who knows? </p><p> Anyway. Here's what spawned this post. </p><p> What was Microsoft's OSX team <em>thinking</em> when they designed Office 2011 for OSX? Seriously, this is just horrible.
It looks like somebody from 1995 traveled forward in time to 2010, had sixty seconds to memorize what OSX looked like, and then was transported back to 1995, whereupon he feverishly worked to recreate that look in an early version of Visual Basic. </p><p> There are extra bevels, some kind of bizarro tabbed interface, and the whole thing is a jumbled mix of flat icons and pseudo-3D-gradient-having buttons. Why? <em>Why?</em> Dear God. </p> <img src="http://lh4.ggpht.com/-CzCh5TPVsuI/UQ7TJrjb5wI/AAAAAAAAAKc/FDVhchqMh4o/Office2011-OSX.png?imgmax=800" alt="Office2011 OSX" title="Office2011-OSX.png" border="0" width="600" height="225" />John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-4763093868362260472013-01-22T19:13:00.001-05:002013-01-22T19:13:32.615-05:00What Working With Steve Jobs Was Actually Like<p><a href="http://inventor-labs.com/blog/2011/10/12/what-its-really-like-working-with-steve-jobs.html">In the words of Glenn Reid, who worked on iMovie and iPhoto with Jobs:</a></p> <blockquote> <p>“…when I worked with Steve on product design, there was kind of an approach we took, unconsciously, which I characterize in my mind as a ‘cauldron’. There might be 3 or 4 or even 10 of us in the room, looking at, say, an iteration of iPhoto. Ideas would come forth, suggestions, observations, whatever. We would ‘throw them into the cauldron’, and stir it, and soon nobody remembered exactly whose ideas were which. This let us make a great soup, a great potion, without worrying about who had what idea. This was critically important, in retrospect, to decouple the CEO from the ideas. If an idea was good, we'd all eventually agree on it, and if it was bad, it just kind of sank to the bottom of the pot. We didn't really remember whose ideas were which -- it just didn't matter. Until, of course, the patent attorneys came around and asked, but that's a whole ‘nother story.”</p></blockquote> <p>Even with his famously vast ego, there were times when Steve knew to set it aside for the good of the product.</p> <p>Also, perhaps a more subtle point. Let’s think about this sentence:</p> <blockquote> <p>“If an idea was good, we'd all eventually agree on it, <strong>and if it was bad, it just kind of sank to the bottom of the pot.</strong>”</p></blockquote> <p>My favorite part, actually. I believe that, in order to come up with good ideas, you need permission – from yourself and others – to come up with bad ideas along the way. Nobody produces brilliant ideas 100% of the time, and if you try, you’ll be mediocre at best and wrapped in “analysis paralysis” at worst.</p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-52235816312950105742012-06-25T12:39:00.001-04:002012-06-25T12:39:22.048-04:00Minority Report: Ten Harmful Years Later<p><a href="http://www.wired.com/underwire/2012/06/minority-report-idea-summit/">Wired took a look at the impact of the movie version of <em>Minority Report</em>, ten years later</a>. </p><blockquote>The year was 1999, and Steven Spielberg was preparing to turn Philip K. Dick’s short story “The Minority Report” into a $100 million action movie starring Tom Cruise. There was just one problem: The story was set in the undated future, and the director had no idea what that future should look like.
He wanted the world of the movie to be different from our own, but he also wanted to avoid the exaggerated and often dystopian speculation that plagued most science fiction.<br><br> ...To mark the 10th anniversary of Minority Report’s June 21 release, Wired spoke to more than a dozen people who were at the so-called “idea summit” that delved deep into the future. As participant Joel Garreau recalls, “I don’t think many of us knew what the fuck we were getting ourselves into.”</blockquote><p>The tech from <em>Minority Report</em> that people remember most -- the scenes of Tom Cruise waving his hands to navigate a computer interface -- <strong>is probably some of the most harmful ever filmed</strong>. </p><p>Like voice control and television wristwatches, controlling our devices with three-dimensional physical gestures <em>seems</em> like a good idea. However, like those other technologies, its usefulness is limited. </p><p>While the image of Tom Cruise waving his arms all over the place like an amphetamine-boosted symphony conductor makes for dramatic cinema, it turns casual computing tasks into physical tasks that are exhausting for a healthy person and impossible for a disabled person. </p><p>The lowly computer mouse and its cousin, the trackpad, are marvels of efficiency. By moving one hand (or finger) a couple of inches, one can navigate thousands of pixels' worth of computer interface. </p><p> The precision of mice and trackpads is unparalleled as well; they're accurate to within several pixels. When considering touch interfaces, accuracy drops by an order of magnitude: <a href="http://developer.apple.com/library/ios/#DOCUMENTATION/UserExperience/Conceptual/MobileHIG/Characteristics/Characteristics.html">Apple recommends that touch-based interface targets measure no less than 44x44 pixels</a>. When considering three-dimensional, gesture-based, <em>Minority Report</em>-style interfaces, accuracy drops by another order of magnitude.<sup>1</sup></p><p>Three-dimensional, gesture-based navigation clearly does have its uses. Microsoft's Kinect has shown that it can be very useful for specially-designed games, and low-precision tasks like simple media playback control. </p><p>Three-dimensional gestures are clearly a part of the future, but they are not <em>the</em> future. A variety of input methods (the command line, mice, keyboards, touch interfaces, gesture interfaces) will continue to be used, each fulfilling a role. </p><p>______________________<br><sup>1</sup> Speculation. I couldn't find hard data on this. It's tough to dispute, though. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-60322115473037094572012-06-15T10:06:00.001-04:002012-06-15T10:16:35.711-04:00This Developer's Life: Dinosaurs and Fortran<p>I've always had a weird semi-fascination with Fortran. </p><p>I don't even know what Fortran looks like, actually. </p><p>For years, I lumped it in with ancient languages like COBOL that are gone and not missed in the slightest. While there's always at least one weird guy living in a cave somewhere to prove you wrong, I don't think anybody misses COBOL. </p><p>Fortran, though, is apparently a different beast. It's not a general-purpose programming language, exactly. It's more like a thing that hardcore science and math dudes use to crunch numbers. </p><p>Apparently, Fortran has two interesting properties.
</p><ol><li><strong>Battle-Tested, Bulletproof Libraries.</strong> The kind of "bulletproof" you only get after a couple of decades of hardcore, NASA-launches-spaceships-with-this-shit, the-stock-market-runs-on-this-shit use.</li><li><strong>Disgustingly Parallel.</strong> Apparently it scales to about as many processors as you can throw at it, with no real extra work required. Hundreds, thousands -- the kind of warehouse-filling computers that predict weather or whatever.</li></ol><p><a href="http://www.thisdeveloperslife.com/post/2-0-7-dinosaurs">This episode of <em>This Developer's Life</em></a> features several segments, including one about a fresh-out-of-college kid who found himself in a job learning Fortran. After laughing at it and attempting to convert their codebase to C#, he became a Fortran convert. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-87708664402922837902012-06-11T15:10:00.001-04:002012-06-11T15:10:54.151-04:00Doing It...<p>A Hacker News comment: </p><blockquote>"I run Windows on my MBA for when I'm travelling. Apple make fantastic hardware; but their saccharine UIs make me retch. <br><br>One constant, though: over the past 10 years, I've moved my life further and further into the Cygwin command line, so that I'm insulated from the frippery going on at the edges; with my setup, I'm approximately as at home on Linux, Solaris, Mac and Windows. I'm not optimistic based on what I've seen of Windows 8's direction."</blockquote><p>Hey, look. I do a lot of Windows development work on my Mac in a Windows VM too. And some of the recent OSX UI stuff is regrettable. </p><p>But running a VM in OSX (a real BSD Unix) so that you can run Cygwin (a sorta-Unix sorta-emulation layer) in a virtual machine? </p><p>Doing it wrong. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-82653277349929388932012-05-23T15:01:00.001-04:002012-05-23T15:01:38.622-04:00VMWare Fusion and Excessive Idle CPU Usage<p>I've stuck with VMWare Fusion (currently at 4.02) over the years because I find it much more stable than Parallels. However, I've noticed that my virtual machines always have rather high CPU usage under Fusion. My Windows Server 2008 R2 VM has always consumed about 40% CPU usage on the host side, even when at close to 0% CPU usage on the guest side. </p><p>This is a problem. It makes my MacBook Pro run hot and puts a serious dent in my battery life. </p><p>Things I ruled out by trial and error: <ul><li>It's not a RAM issue - I have 16GB of RAM, 4GB of which is dedicated to the Windows VM, and I'm not seeing paging on the guest or host.</li><li>Removing the guest's virtual USB device is one suggested fix I've seen people offer. This seemed to improve CPU usage by several percent, but nothing significant.</li><li>Manually changing the virtualization engine didn't help.</li><li>Enabling or disabling hard disk buffering didn't help.</li><li>I disabled as many services as possible in the Windows 2008 R2 guest, and confirmed via Sysinternals' Process Explorer that nothing was chewing CPU or doing significant I/O in the background.</li><li>Enabling or disabling 3D acceleration in Fusion's virtual machine settings didn't help.</li><li>Enabling or disabling Aero on the guest didn't help.</li></ul></p><p>In the end, you know what worked?
<strong>I changed VMWare Fusion's settings for the virtual machine and reduced the number of virtual CPUs from 4 to 2.</strong> This took idle host CPU usage from ~40% down to ~20%, a figure I consider much more reasonable. </p><p>I'm not sure exactly <em>why</em> this worked. FWIW, this is on an early 2011 MacBook Pro with a 4-core Sandy Bridge i7 CPU. There are 4 physical cores and OSX "sees" 8 virtual cores. Therefore, my virtual machine is probably now running on a single physical core. I'm sure that has something to do with it. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-87464450355649730582012-05-22T07:48:00.001-04:002012-05-22T08:02:52.571-04:00New Software News: No More Aero Glass, GitHub for Windows, Coda 2<p><strong>Bye, Aero Glass.</strong> Microsoft announced that it's <a href="http://www.wired.com/gadgetlab/2012/05/microsoft-drops-aero-glass-ui-in-windows-8/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+wired%2Findex+%28Wired%3A+Index+3+%28Top+Stories+2%29%29&utm_content=Google+Reader">phasing out the Aero Glass UI in Windows 8</a>. The new interface is flatter and sharper. I like the direction they're taking. </p><blockquote>“This style of simulating faux-realistic materials (such as glass or aluminum) on the screen looks dated and cheesy now, but at the time, it was very much en vogue,” [Jensen Harris, the Director of Program Management for the Windows User Experience] writes in the blog post titled ‘Creating the Windows 8 User Experience.’ </blockquote><p><strong>GitHub For Windows.</strong> <a href="https://github.com/blog/1127-github-for-windows">GitHub for Windows is now available</a>. I've only played around with their OSX client very briefly, but it's friendly and works. </p><p><strong>Coda 2.</strong> Over on the OSX side of things, <a href="http://panic.com/coda/">Coda 2 is finally about to ship</a>. I'm a big fan of Coda 1 for certain things - it's great at editing remote files via FTP/SFTP. (But isn't that kind of an outdated mode of development?) </p><p>Catching my eye in Coda 2: Git support, easier color scheming, and a CSS editor that appears to have great support for creating gradients and other CSS effects. There's a built-in MySQL management GUI, which is cool, but also a little bit "five years ago" - it seems like people are either moving "down" to SQLite or NoSQL databases, or "up" to a more fully-featured RDBMS like Postgres. </p><p>According to Cabel from Panic, <em>"Coda 2 will be $75 ('upgrading pricing for everyone') for a while, Diet Coda will be $19. After the sale of course."</em> The sale he's referring to is the 24-hour sale on May 24th when both apps are 50% off. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-14031438037873250982012-05-08T07:31:00.001-04:002012-05-08T07:31:15.039-04:00HP ZR2740w Monitor Update<p>In an <a href="http://www.johnbooty.org/2012/03/finally-affordable-high-resolution.html">earlier article</a> I mentioned being pretty excited about ordering this monitor. </p><p>Hated it. Completely unacceptable monitor. I sent it back for a refund. However, it might work for <em>you</em>. Let me explain. </p><p>The anti-glare coating on the ZR2740w is unbelievably bad when you're looking at light-colored backgrounds.
The anti-glare coating is so thick and coarse that the screen actually <em>looks filthy</em>. If you know what text looks like on a dirty monitor, then you know what the ZR2740w looks like. </p><p><strong>Will It Work For You?</strong> It might, if you're not using any software that uses light backgrounds. If this is strictly a gaming machine, or if you're a coder who spends all day in customizable terminal windows or IDEs with dark color schemes, maybe it's worth a shot. </p><p>But then again, while ~$600+ is cheap for a monitor of this size and resolution, that's still a ton of cash to pay for something that's going to make anything on a light background look like shit. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-63551492309039629672012-05-08T07:22:00.001-04:002012-05-08T07:22:37.880-04:00"Mastered for iTunes" Revisited<p>Since my previous article on Apple's Mastered for iTunes program, more information has come to light. We now know what "Mastered for iTunes" actually means! <a href="http://www.npr.org/blogs/therecord/2012/02/24/147379760/what-mastered-for-itunes-really-means">NPR sums it up best; check the addendum at the end of this article</a>. </p><blockquote><p>"...I spoke again with Bob Ludwig, the mastering engineer quoted in the story, who has submitted "Mastered for iTunes" tracks to Apple. He says the company is simply providing mastering engineers with tools that allow them to see how songs mastered at 24 bits will clip (that is, distort audibly) when they go through the standardized AAC encoding process. The uncompressed files are then submitted to iTunes, which creates lossless versions before encoding the songs as 256 kbps AAC files for sale in the iTunes store.</p><p>...Why is this significant? Because the fact that Apple retains the lossless versions of the high-quality studio masters means that iTunes, at any time it decides to, can begin selling higher-quality encodes, or even lossless files."</p></blockquote> <p>Ars Technica chimed in with their opinion. With the aid of some professional audio engineers, <a href="http://arstechnica.com/apple/news/2012/04/does-mastered-for-itunes-matter-to-music-ars-puts-it-to-the-test.ars/3">they concluded that "Mastered for iTunes" can make a positive difference</a>, though it should be noted that not all of the audio engineers agreed with each other. </p><p> <strong>Is Everybody Missing The Point?</strong> Kind of. A lot of the discussion has centered around the fact that it's almost physically impossible for us to hear the difference between 24-bit and 16-bit audio, or 96kHz and 44.1kHz audio. While true, that misses the real point of high-resolution audio. </p><p> Whenever audio is transformed, data can be lost. It's just a mathematical reality. By default, audio goes through quite a few steps in the pipeline before making it to your ears. iTunes' volume control, its Sound Check and Sound Enhancer features, and the built-in equalizer all play a role. So do the volume controls built into Windows/OSX, as well as other sound "enhancements" performed by your audio device. </p><p>With high-resolution audio, there's simply more room for error - all those little rounding errors likely won't add up to something your ears can detect.
However, with 44.1kHz/16bit ("CD-quality") audio, there's not much room for error: 44.1/16 is <em>just</em> good enough to cover the range of normal human hearing, and excessive audio processing quickly adds up to something our ears can detect. </p><p>In many ways, it's exactly like working with lossy JPEGs. JPEGs are fine for viewing and can be nearly indistinguishable from uncompressed master photos, but once you start editing JPEGs extensively, all of the artifacts pile up pretty quickly. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-10665343451997948892012-03-18T17:20:00.001-04:002012-03-18T17:20:20.795-04:00Apple: Keeping It… Modest?<p>While Apple certainly promotes itself as a premium brand, one thing they do <em>not</em> do is change their hardware designs frequently. </p><p>Without close inspection, nobody knows you're using a MacBook Pro from 2008 and not one from 2012. </p><p>It's even harder for somebody to tell if you're using an iPhone 4s or an original iPhone 4 from two years ago. HTC alone has introduced what, literally twenty designs in that timespan? Thirty? </p><p>With the exception of IBM/Lenovo's iconic Thinkpads - which I also love - Apple holds on to their external designs longer than anybody in the industry. </p><p>You could make a case that Apple actually has the most modest designs of any PC or smartphone manufacturer today. Were Apple to ever ditch the big glowing Apple logo from their laptop lids (not likely, of course), it wouldn't even be close. </p><p>Please note that I spend less than two hours a week watching television, and well over forty hours a week using a Mac and an iPhone. In contrast, I see perhaps thirty seconds of Apple advertising a week. </p><p>So I'm talking about <em>actual Apple hardware</em> and not the yuppified marketing image they present in their TV ads. If you watch a lot of television and see a lot of Apple ads, and don't own any Apple products, you'll probably feel differently - but just know that your opinion is based more on marketing than the physical reality of their products. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-73455413305262217252012-03-16T15:29:00.001-04:002012-03-16T15:30:07.436-04:00Ruby: Staying "In The Zone" With Code Completion<p>Nothing breaks my flow of thought like a bunch of compiler errors -- or worse, subtle runtime errors -- because I misspelled an identifier name somewhere in my code. </p><p>As dynamic languages like Ruby have gained in popularity, we've often had to choose between robust, Intellisense-sporting environments like Visual Studio and the dynamic languages we really love. </p><p>Out of the box, <a href="http://www.sublimetext.com/2">Sublime Text 2</a> (OSX, Windows, Linux; $59; free unlimited evaluation during its prolonged beta period) has some excellent code completion for identifier names and built-in language constructs. There's an important shortcoming, however: the editor only "knows" about identifier names that appear elsewhere within the current file. A variable declared in File1.rb is invisible to the editor in File2.rb. </p><p><a href="https://github.com/Kronuz/SublimeCodeIntel/">SublimeCodeIntel</a> is a promising attempt to fix that shortcoming in Sublime Text 2. Based on my simple, initial tests, it works.
There still appears to be work to be done, as the code completion dropdown randomly fails to appear at times. </p><p>Another alternative for IDE-like Ruby development is <a href="http://www.jetbrains.com/ruby/index.html">JetBrains' RubyMine (OSX, Windows, Linux; $69), now on version 4.02</a>. RubyMine aims for the full IDE experience, as opposed to the smart-and-extensible-text-editor approach of Sublime Text 2. </p><p>And then there's the venerable <a href="http://macromates.com/">TextMate</a> (OSX only), which may or may not see itself replaced by TextMate 2 if the author ever gets around to it. While I love TextMate, I've never found the code completion to be particularly useful. </p><p>While I like Sublime Text 2 the best for Ruby code completion, the saga of TextMate's pseudo-abandonment makes me awfully wary. Like TextMate, Sublime Text 2 is largely (if not entirely) the work of a single developer. What happens if he tires of the project, or other life circumstances prevent him from devoting himself to it? </p><p>This is by no means an exhaustive list. Please let me know if you've got a favorite of your own. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-51206613450574828032012-03-16T11:46:00.001-04:002012-05-08T07:33:32.092-04:00Finally: Affordable, High-Resolution Monitors?<p><em><strong>Update, 5/1/2012</strong> I posted <a href="http://www.johnbooty.org/2012/05/hp-zr2740w-monitor-update.html">an update on the ZR2740W</a>. Long story short: This is a completely unacceptable monitor; don't buy it. </em></p><p>How much screen real estate do you need? </p><p>Every programmer has a different style. Command-line gurus are making the most of their screen real estate by using <a href="http://tmux.sourceforge.net/">tools like tmux to tile several terminals together</a>. </p><p>Others, by necessity or choice, have multiple space-gobbling GUI applications open at once. This is <em>my</em> reality, and shuffling through six or seven overlapping windows has a huge potential for interfering with my fragile mental focus. Dealing with multiple too-large windows on a too-small screen is like trying to do one's taxes on a tiny airline seat tray… maddening! </p><p>Fortunately, monitors with resolutions greater than 1080p have finally started to come down from the $999 price point. A number of eBay resellers have started to offer bare-bones, 2560x1440 27" S-IPS Korean displays for shockingly low prices right around $400. Quite a few <a href="http://www.ebay.com/itm/New-YAMAKASI-CATLEAP-Q270-SE-27-LED-2560X1440-WQHD-DVI-D-Dual-Computer-Monitor-/110834882557?pt=Computer_Monitors&hash=item19ce4617fd">members of Overclockers.net have ordered these displays</a> and have had generally good results. </p><p>Unfortunately, those displays are so bare-bones that they lack multiple inputs. None of the $400 displays offer DisplayPort compatibility, which means that Mac owners would need a pricy DisplayPort-to-dual-link-DVI adapter. Anecdotally, however, a minority of users have reported <a href="http://www.monoprice.com/products/product.asp?c_id=104&cp_id=10428&cs_id=1042802&p_id=6904&seq=1&format=4#feedback">flaky results with MonoPrice's $69 adapter</a> and <a href="http://store.apple.com/us/reviews/MB571Z/A">even worse results with Apple's $99 adapter</a>. Since the Korean eBay displays are essentially unreturnable, I wasn't willing to take the risk. </p><p>Luckily, salvation may be in sight.
<a href="http://www.anandtech.com/show/5636/hp-zr2740w-high-resolution-ips-that-doesnt-break-the-bank">AnandTech reviewed the HP ZR2740w</a> - a 27" S-IPS display that runs at 2560x1440. Long story short, this is their conclusion: </p><blockquote>If all you really want is a good display for your PC and you don't need to hook up multiple devices, the ZR2740w is an excellent choice. For such users we recommend it with very few reservations and present HP with our Bronze Editors' Choice award. </blockquote><p>Did I order one? You bet. </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-33568415337824387032012-02-21T19:57:00.001-05:002012-02-21T19:57:53.848-05:00What Does "Mastered For iTunes" Mean?<p>It's not empty hype - but there's no guarantee you'll be able to hear the difference. </p><p>Modern albums are typically recorded at sampling rates and bit depths far higher than those supported by CDs or typical home listening equipment. Whereas a CD supports 44.1kHz/16bit digital audio, music has typically been recorded at 96kHz/24bit for quite a while now. </p><p>In the past, record labels typically supplied the 44.1kHz/16bit masters to online music retailers like iTunes, who then converted them into AAC (iTunes) or MP3 (everybody else) format for sale. Therefore music was lossily converted twice: </p><blockquote style="text-align:center;"><strong>Non-"Mastered For iTunes" (3 steps)</strong><br><strong>(#1)</strong> 96kHz/24bit studio master recording &rarr; <strong>(#2)</strong> 44.1kHz/16bit CD master <br>&rarr; <strong>(#3)</strong> 44.1kHz/16bit MP3/AAC version for online sale </blockquote><p>Clearly, the middle step is a bit of a waste of time. "Mastered For iTunes" recordings skip the middleman, so to speak. </p><blockquote style="text-align:center"><strong>"Mastered For iTunes" (2 steps)</strong><br><strong>(#1)</strong> 96kHz/24bit studio master recording &rarr; <strong>(#2)</strong> 44.1kHz/16bit MP3/AAC for online sale </blockquote><p>Mathematically, this holds water. It's a fact: all other things being equal, less information is discarded this way. Whether or not you'll be able to hear the difference is another story. </p><p>You're particularly unlikely to hear a difference if the studio master was recorded, mixed, or mastered poorly as a part of the ongoing "<a href="http://en.wikipedia.org/wiki/Loudness_war">Loudness Wars.</a>" From a fidelity standpoint, these recordings are essentially pre-ruined before they ever reach step #2 or #3 in either scenario. =) </p><p>Source: <a href="http://www.apple.com/itunes/mastered-for-itunes/">Apple.com</a></p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-35441314474736696262012-02-21T12:22:00.001-05:002012-02-21T12:22:32.019-05:00Is My Saw Sharp Enough?<p>"Saw-sharpening" is <a href="http://www.codinghorror.com/blog/2009/03/sharpening-the-saw.html">shorthand for this cautionary tale of a frustrated lumberjack</a>. </p><blockquote>There's a guy who stumbled into a lumberjack in the mountains. The man stops to observe the lumberjack, watching him feverishly sawing at this very large tree. He noticed that the lumberjack was working up a sweat, sawing and sawing, yet going nowhere. The bystander noticed that the saw the lumberjack was using was about as sharp as a butter knife. So, he says to the lumberjack, "Excuse me Mr.
Lumberjack, but I couldn't help noticing how hard you are working on that tree, but going nowhere." The lumberjack replies with sweat dripping off of his brow, "Yes... I know. This tree seems to be giving me some trouble." The bystander replies and says, "But Mr. Lumberjack, your saw is so dull that it couldn't possibly cut through anything." "I know", says the lumberjack, "but I am too busy sawing to take time to sharpen my saw." </blockquote><p>I typically fall prey to just the opposite: too much time obsessing over tools and -- I fear -- not enough time actually chopping down trees. That's not <em>entirely</em> misguided; in a fast-changing industry like software development it's easy to "relax" for four or five years and find yourself terminally behind the curve. </p><p>Just ask those 50-year-old COBOL programmers who were laid off from a bank somewhere and can't find work any more because of "ageism." </p><p>Right now, I think I'm in a pretty good place. I'm mostly current on my tools, and I'm mostly focused on work instead of scanning GitHub and the Visual Studio Extension Gallery for new gems and extensions every day. </p><p>Mostly. </p> John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0tag:blogger.com,1999:blog-6931577083716752746.post-54647400375800667122012-02-16T19:18:00.001-05:002012-02-16T19:18:38.159-05:00But Don't Take My Word For It<p>Stevenf from Panic has a <a href="http://www.panic.com/blog/2012/02/about-gatekeeper/">great write-up on Gatekeeper and code signing in OSX Mountain Lion</a>. Panic is a leading third-party, independent software developer for OSX. They make Coda, Transmit, and a few other well-regarded apps. The majority of their apps <em>are not</em> available through the App Store, so they're quite sensitive to the needs of those who wish to independently distribute software. </p><p>Overall, he's extremely positive about the changes in OSX Mountain Lion. He does mention one important caveat about feature parity between App Store and non-App Store software. (I agree with his concerns there.) </p>John Rosehttps://plus.google.com/106748084801427416443noreply@blogger.com0