Nerval's Lobster writes: As the Slashdot community well knows, chasing features has never worked out for any software company. "Once management decides that’s where the company is going to live, it’s pretty simple to start counting down to the moment that company will eventually die," software engineer Zachary Forrest y Salazar writes in a new posting (Dice link). But how does any developer overcome the management and deadlines that drive a lot of development straight into mediocrity, if not outright ruination? He suggests a damn-the-torpedoes approach: "It’s taking the code into your own hands, building or applying tools to help you ship faster, and prototyping ideas," whether or not you really have the internal support. But given the management issues and bureaucracy confronting many companies, is this approach feasible?

Nerval's Lobster writes: Imagine a couple of employees at your company create a spreadsheet that lists their salaries. They place the spreadsheet on an internal network, where other employees soon add their own financial information. Within a day, the project had caught on like wildfire, with people not only listing their salaries but also their bonuses and other compensation-related info. While that might sound a little far-fetched, that’s exactly the scenario that recently played out at Google, according to an employee, Erica Baker, who detailed the whole incident on Twitter. While management frowned upon employees sharing salary data, she wrote, “the world didn’t end everything didn’t go up in flames because salaries got shared.” For years, employees and employers have debated the merits (and drawbacks) of revealing salaries (Dice link). While most workplaces keep employee pay a tightly guarded secret, others have begun fiddling with varying degrees of transparency, taking inspiration from studies that have shown a higher degree of salary-related openness translates into happier workers. (Other studies (PDF) haven't suggested the same effect.) Baker claims the spreadsheet compelled more Google employees to ask for, and receive, "equitable pay based on data in the sheet."

Nerval's Lobster writes: Whether or not certifications have value is a back-and-forth argument that’s been going on since before Novell launched its CNE program in the 1990s. Developer David Bolton recently incited some discussion of his own when he wrote an article for Dice in which he claimed that certifications aren't worth the time and money. But there's a lot of evidence that certifications can add as much as 16 percent to a tech professional's base pay; in addition, many tech companies use resume-screening software that weeds out any resumes that don't feature certain acronyms. There's also the argument (Dice link) that the very cost, difficulty, and annoyance of earning a certification is the best reason to go through with it, especially if you're looking for a job; it broadcasts that you're serious enough about the technology to invest a serious chunk of your life in it. But others might not agree with that assessment, arguing that all a certification proves is that you're good at taking tests, not that you necessarily know a technology inside and out.

Nerval's Lobster writes: A day after Facebook’s head of security suggested the Web would be better off without Adobe Flash, Mozilla executive Mark Schmidt announced that Firefox would block all versions of Flash by default. Adobe Flash isn’t going away anytime soon—a great many websites rely on it to power animations, forms, and other features. But the two-pronged attack from Facebook and Mozilla is sure to revive the long-running argument that the plugin is too error-riddled for its own good. The current criticisms of Flash focus almost exclusively on its security vulnerabilities, and Adobe’s perceived slowness in patching them. Given the comments by Facebook Chief Security Officer Alex Stamos, it seems likely that the social network will eventually sub out Flash for HTML5, notably for video playback. For those developers who specialize in Flash, the thought of thousands of websites suddenly deciding to dump the technology en masse is probably not a comforting one (or maybe it is, if you dislike working with the platform). Given Flash’s sizable presence, however, that doom date is likely a long time from now. Wherever he is, Steve Jobs might be grinning a little.

Nerval's Lobster writes: Having one or more certifications sounds pretty sensible in today’s world, doesn’t it? Many jobs demand proof that you’ve mastered a particular technology. But is the argument for spending lots of time and money to earn a certification as ironclad as it seems? In a new column (Dice link), developer David Bolton argues 'no.' Most certifications just prove you can pass tests, he argues, not that you've mastered a particular language or platform; and given the speed at which technology evolves, most are at risk of becoming quickly outdated. Nor are they the sole determinant of whether you can actually land a job: 'Recruiters sometimes have trouble determining a developer’s degree of technical experience, and so insist upon certificates or tests to judge abilities. If you manage to get past them to the job interview, the interviewer (provided they’re also a developer) can usually get a good feel for your actual programming ability and whether you’ll fit well with the group.' Are certifications mostly a rip-off, or are some (especially the advanced ones) actually useful, as many people insist?

Nerval's Lobster writes: WebAssembly is the next stage in the evolution of client-side scripting. In theory, it will improve on JavaScript’s speed. That’s not to say that JavaScript is a slowpoke: Incremental speed improvements have included the rollout of asm.js (an optimized subset) in 2013. But WebAssembly—while not a replacement for JavaScript—is intended as a “cure” for a variety of issues where JavaScript isn’t always a perfect fit, including video editing, encryption, peer-to-peer applications, and more. (Here’s a full list of the Web applications that WebAssembly could potentially improve.) If WebAssembly is not there to replace JavaScript but to complement it, the key to the integration rests with the DOM and garbage-collected objects such as JavaScript strings, functions (as callable closures), Typed Arrays, and Typed Objects. The bigger question is, will WebAssembly actually become something big, or is it ultimately doomed to suffer the fate of other hyped JavaScript-related platforms such as Dart (a Google-only venture), which attracted buzz ahead of a Minimum Viable Product release, only to quickly fade away afterward?

Nerval's Lobster writes: Simon Hughes, Dice's Chief Data Scientist, has put together an experimental visualization that explores how tech skills relate to one another. In the visualization, every circle or node represents a particular skill; colors designate communities that coalesce around skills. Try clicking “Java”, for example, and notice how many other skills accompany it (a high-degree node, as graph theory would call it). As a popular skill, it appears to be present in many communities: Big Data, Oracle Database, System Administration, Automation/Testing, and (of course) Web and Software Development. You may or may not agree with some relationships, but keep in mind, it was all generated automatically by code, untouched by a human. Building it started with Gephi, an open-source network analysis and visualization software package, by importing a pair-wise comma-separated list of skills and their similarity scores (as Simon describes in his article) and running a number of analyses: Force Atlas layout to draw a force-directed graph, Avg. Path Length to calculate the Betweenness Centrality that determines the size of a node, and finally Modularity to detect communities of skills (again, color-coded in the visualization). The graph was then exported as an XML graph file (GEXF) and converted to JSON format with two sets of elements: Nodes and Links. "We would love to hear your feedback and questions," Simon says.
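The final step of that pipeline — turning a pair-wise similarity list into a JSON graph with Nodes and Links — can be sketched in a few lines of stdlib-only Python. The skill names, scores, and field names below are illustrative assumptions, not Dice's actual data or schema:

```python
import json

# Pair-wise skill similarities, as might be exported from Gephi.
# (These skills and scores are made up for illustration.)
pairs = [
    ("Java", "Spring", 0.82),
    ("Java", "SQL", 0.55),
    ("SQL", "Oracle Database", 0.71),
]

# Collect the unique skills as nodes, assigning each an integer id.
skills = sorted({s for a, b, _ in pairs for s in (a, b)})
node_ids = {skill: i for i, skill in enumerate(skills)}

# Emit the two element sets the article describes: nodes and links.
graph = {
    "nodes": [{"id": i, "label": skill} for skill, i in node_ids.items()],
    "links": [
        {"source": node_ids[a], "target": node_ids[b], "weight": w}
        for a, b, w in pairs
    ],
}

print(json.dumps(graph, indent=2))
```

A D3-style force-directed renderer can consume exactly this node-link shape, which is presumably why the GEXF export was converted to JSON in the first place.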

Nerval's Lobster writes: A standard video game relies on a mountain of code, painstakingly pieced together by an army of programmers and developers. Then you have Tiny-Twitch. Inspired by a challenge from Australian game designer Ben Porter, developer Alex Yoder decided to create a game using 133 characters’ worth of JavaScript and HTML. The game itself is simple: A black “X” appears on your screen. When you click it, the “X” appears in another place. If you’re very easily amused, you could probably spend hours chasing that little digit around. Sure, as a piece of digital entertainment, it isn’t exactly “Arkham Knight,” but as an example of elegant coding, it’s pretty hard to beat.

Nerval's Lobster writes: jQuery isn't without its controversies, and some developers distrust its use in larger projects because (some say) it ultimately leads to breakage-prone code that's harder to maintain. Given its prevalence, however, jQuery is probably essential to know (Dice link); so what are the most important elements to learn in order to become adept enough at it? Chaining commands, understanding when the document is finished loading (and how to write code that safely accesses elements only after said loading), and learning CSS selectors are all key. The harder part is picking up jQuery's quirks and tricks, of which there are many... but is it worth studying to the point where you know every possible eccentricity?

Nerval's Lobster writes: Facebook is the latest tech giant to release a 2015 diversity report, and the data shows a company that's majority white and male. Facebook as a whole is currently 55 percent Caucasian and 36 percent Asian; its senior leadership is 73 percent Caucasian, and its technology ranks are 51 percent Caucasian and 43 percent Asian. Blacks constitute less than 5 percent of all categories, and Hispanics less than 10 percent. On the gender front, Facebook is 68 percent male; within its tech ranks, that percentage climbs to 84 percent male. In a corporate blog posting, Maxine Williams, Facebook’s Global Director of Diversity, tried to put a positive spin on the data. “Our work is producing some positive but modest change and our new hire numbers are trending up,” she wrote. “In addition to best practice programs we have been running in recruitment and retention, we are always trying creative approaches.” Those creative approaches reportedly include presenting hiring managers in some parts of Facebook’s U.S. operations with at least one qualified candidate from an “underrepresented group.” Facebook also requires its employees undergo a reworked Managing Bias training course, which features discussions about stereotypes and unconscious bias.

Nerval's Lobster writes: C++ is not an easy language to master, but many people are able to work in it just fine without being a 'guru' or anything along those lines. That being said, what separates C++ beginners from those with 'intermediate' skills, or even masters? According to this Dice article, it comes down to knowledge of several things, including copy constructors, virtual functions, how to handle memory leaks, the intricacies of casting, lambda functions in C++11, (safe) exception handling, and much more. Given all that, is there one particular thing or point that separates learners from masters?

Nerval's Lobster writes: In a posting that recently attracted some buzz online, .NET developer Justin Angel (a former program manager for Silverlight) argued that the .NET ecosystem is headed for collapse—and that could take interest in C# along with it. “Sure, you’ll always be able to find a job working in C# (like you would with COBOL), but you’ll miss out on customer reach and risk falling behind the technology curve,” he wrote. But is C# really on the decline? According to Dice’s data, the popularity of C# has risen over the past several years; it ranks No. 26 on Dice’s ranking of most-searched terms. But Angel claims he pulled data from Indeed.com that shows job trends for C# on the decline. Data from the TIOBE developer interest index mirrors that trend, he said, with “C# developer interest down approximately 60%, back to 2006-2008 levels.” Is the .NET ecosystem really headed for long-term implosion, thanks in large part to developers devoting their energies to other platforms such as iOS and Android?

Nerval's Lobster writes: In the eleven years since Mono first appeared, the Linux community has regarded it with suspicion. Because Mono is basically a free, open-source implementation of Microsoft’s .NET framework, some developers feared that Microsoft would eventually launch a patent war that could harm many in the open-source community. But there are some good reasons for using Mono, developer David Bolton argues in a new blog posting (Dice link). Chief among them are MonoDevelop, which he claims is an excellent IDE; its cross-platform abilities; and its utility as a game-development platform. That might not ease everybody's concerns (and some people really don't like how Xamarin has basically commercialized Mono as an iOS/Android development platform), but it's maybe enough for some people to take another look at the platform.

Nerval's Lobster writes: Since Python is a general-purpose language, it finds its way into a whole lot of different uses and industries. That means the industry in which you work has a way of determining what you actually need to know in terms of the language, as developer Jeff Cogswell explains in a new Dice piece. For example, if you’re hired to write apps that interact with operating systems and monitor devices, you might not need to know how to use the Python modules for scientific and numerical programming. In a similar fashion, if you’re hired to write Python code that interacts with a MySQL database, then you won’t need to master how it works with CouchDB. The question is, how much do you need to know about Python's basics? Cogswell suggests there are three basic levels to learning Python: Learn the core language itself, such as the syntax and basic types (and the difference between Python 2 and Python 3); learn the commonly used modules, and familiarize yourself with other modules; learn the bigger picture of software development with Python, such as including Python in a build process, using the pip package manager, and so on. But is that enough?
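The first two of those levels fit in a few lines. As an illustrative sketch (the examples below are ours, not Cogswell's), here is a core-language gotcha — the Python 2 vs. 3 division change he alludes to — followed by one of the commonly used standard-library modules:

```python
from collections import Counter

# Level one: core syntax and types. In Python 3, / is true division
# and // is floor division -- a frequent stumbling block for anyone
# coming from Python 2, where 3 / 2 evaluated to 1.
assert 3 / 2 == 1.5
assert 3 // 2 == 1

# Level two: the commonly used modules. collections.Counter tallies
# items without any manual bookkeeping.
votes = Counter(["python", "c++", "python", "java", "python"])
print(votes.most_common(1))  # → [('python', 3)]
```

The third level (build integration, pip, packaging) is more about workflow than code, which is presumably why Cogswell frames it as the "bigger picture" rather than another module to memorize.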