The Insider News

The Insider News is for breaking IT and Software development news. Post your news, your alerts and
your inside scoops. This is an IT news-only forum - all off-topic, non-news posts will be
removed. If you wish to ask a programming question please post it
here.


Moving beyond the gotcha blogs, there’s an actual reason to use technology products and services other than the ones you make (or that happen to be made by the company where you work or used to work). I think everyone knows that, even a thousand tweets later. The tendency in many industries to downplay or even turn hostile toward the competition is well documented and well studied, and the studies generally conclude that experiencing the competition is a good thing. Learning from the competition is not just required of all product development folks; it is also a skill worth honing. Let’s look at the ins and outs of using a competitive product.

If you know the enemy and know yourself, you need not fear the result of a hundred battles.

Languages form the terrain of computing. Programming languages, protocol specifications, query languages, file formats, pattern languages, memory layouts, formal languages, config files, mark-up languages, formatting languages and meta-languages shape the way we compute. So, what shapes languages? Grammars do. Grammars are the language of languages. After reading this article, you will be able to identify and interpret all commonly used notation for grammars.
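To make the notation concrete, here is a sketch of one common style: a toy arithmetic grammar written in BNF-like rules, with a recursive-descent parser that mirrors it rule for rule. The grammar and all names below are illustrative examples, not taken from the article.

```python
import re

# A toy grammar in common BNF-style notation:
#   expr   ::= term (("+" | "-") term)*
#   term   ::= factor (("*" | "/") factor)*
#   factor ::= NUMBER | "(" expr ")"
# The parser below has one function per rule.

def tokenize(s):
    return re.findall(r"\d+|[-+*/()]", s)

def parse(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok
    def factor():
        if peek() == "(":
            eat()             # consume "("
            val = expr()
            eat()             # consume ")"
            return val
        return int(eat())
    def term():
        val = factor()
        while peek() in ("*", "/"):
            op = eat()
            val = val * factor() if op == "*" else val / factor()
        return val
    def expr():
        val = term()
        while peek() in ("+", "-"):
            op = eat()
            val = val + term() if op == "+" else val - term()
        return val
    return expr()
```

The repetition operator `(...)*` in the grammar turns into a `while` loop in the parser, which is also what gives `-` and `/` their left associativity.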

Grammar is the logic of speech, even as logic is the grammar of reason.

Being compatible with Heisenberg uncertainty isn't enough for something to be realisable as a physical state. Is there a wavefunction that allows us to know the digits to the right of the decimal point as far as we want for both position and momentum measurements? Maybe surprisingly, the worlds of audio and graphics can help us answer this question.

The Dirac comb is an example of a wavefunction whose position and momentum aren't fuzzy.
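The discrete analogue of that claim is easy to check numerically: the Fourier transform of an impulse train is another impulse train, so a comb stays a comb when you move between the position and momentum pictures. A standard-library-only sketch (the sizes 64 and 8 are arbitrary choices):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

N, period = 64, 8
comb = [1.0 if n % period == 0 else 0.0 for n in range(N)]  # impulse train
spectrum = [abs(c) for c in dft(comb)]

# The spectrum is itself a comb: nonzero only every N/period = 8 bins,
# because the 8 unit phasors align exactly at those frequencies.
peaks = [k for k, mag in enumerate(spectrum) if mag > 1e-9]
```

Here `peaks` comes out as the bins 0, 8, 16, …, 56: the transform of a comb with spacing 8 is a comb with spacing N/8.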

I [was] passing the time by poking around on the imgur gallery, and saw a couple of animated GIFs based on one of my all-time favorite games, Super Mario Bros. It got me wondering: could I use matplotlib’s animation tools to create these sorts of GIFs in Python? Over a few beers at an SFO bar, I started to try to figure it out. To spoil the punchline a bit, I managed to do it... This animation was created entirely in Python and matplotlib, by scraping the image data directly from the Super Mario Bros. ROM. Below I’ll explain how I managed to do it.
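A minimal sketch of the animation machinery involved, assuming matplotlib with the headless Agg backend; the random "sprites" below are stand-ins for the frame data the author scraped from the ROM:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

FRAMES = 10
rng = np.random.default_rng(0)
# Placeholder frame data; the original post builds these from ROM sprites.
sprites = [rng.integers(0, 255, size=(16, 16)) for _ in range(FRAMES)]

fig, ax = plt.subplots()
im = ax.imshow(sprites[0], interpolation="nearest")
ax.axis("off")

def update(i):
    im.set_data(sprites[i])  # swap in frame i's pixel data
    return (im,)

anim = FuncAnimation(fig, update, frames=FRAMES, interval=100, blit=True)
# anim.save("mario.gif", writer="pillow")  # writing a GIF requires Pillow
```

The GIF-writing step is commented out since it needs an extra dependency; the `FuncAnimation` call itself is the core of the technique.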

After I got over how cool it was that this technique was going to be useful, I got uncomfortable. “There’s a giant space of programs that this computer can emit and consume that I have no way of ever understanding,” I thought. That’s cool, but it’s also scary. I made something that caused me to feel deeply insignificant. Programming became scary. The potential of the machine suddenly felt limitless.

Usernames and passwords are the de facto standard for accessing user accounts on the web, so it’s likely that if your users have accounts, that’s the way you have them sign in. Keeping up with best practices for handling passwords can be hard, but is important for your users safety. Here’s a quick list of the things you should be doing to secure your passwords today.
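The first item on any such list is storing only a salted, slow hash of each password, never the plaintext. A minimal sketch using only Python's standard library (the iteration count is an illustrative choice; current guidance tends to favor dedicated schemes such as bcrypt, scrypt, or Argon2 via a vetted library):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # deliberately slow; tune for your hardware

def hash_password(password, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256."""
    salt = salt or os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, expected):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)  # constant-time compare
```

The per-user salt defeats precomputed rainbow tables, the iteration count slows brute force, and the constant-time comparison avoids leaking information through timing.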

Biometrics sound like a good idea, but for anything beyond decrypting a local store they're not. Changing a widely used password is a pain if it's compromised; but try changing your fingerprint if an attacker gets a copy of it...

Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? -- Zachris Topelius

Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies. -- Sarah Hoyt

Biometrics for security is basically using a password that you leak everywhere you go and you can never change it.

And to make things worse: the hardware used as gatekeepers will always lag behind the latest gadgets available to criminals to analyze the information you are unknowingly and unwillingly spreading around every day.

Any $10K biometric security system that's secure today can be cracked by anyone with $1000 equipment tomorrow. It's an uphill battle you'll never win. Eventually you'll be spending so much money on security that you're better off getting hacked.

We feature a lot of different DIY electronics projects on Lifehacker, but the barrier for entry might seem high at a glance. However, it's not nearly as difficult as it looks. Here's how to get started.

So back to my original frustration about Windows Phone and having a default non-WebKit browser. This is a real issue, and I now feel that I recommended the wrong solution. The better solution is to add the capability to Windows Phone 8 to choose an alternative default browser. This would leave it open to the browser providers such as Google, Mozilla, Opera, etc. to provide a better option on the platform. Specifying a different default browser is something you can’t do on the iPhone, and Windows Phone could easily turn it into some positive publicity. After all, Windows 8 lets you pick your default browser, so why not enable this in Windows Phone?

Typography is one of the important aspects of web design. User Interface / UX (User Experience) design is one of the more challenging parts of designing and developing a website. In this blog post, we will explore some typography tools that will be useful to web designers.

Researchers have uncovered an ongoing, large-scale computer espionage network that's targeting hundreds of diplomatic, governmental, and scientific organizations in at least 39 countries, including the Russian Federation, Iran, and the United States. Operation Red October, as researchers from antivirus provider Kaspersky Lab have dubbed the highly coordinated campaign, has been active since 2007, raising the possibility it has already siphoned up hundreds of terabytes of sensitive information. It uses more than 1,000 distinct modules that have never been seen before to customize attack profiles for each victim.

You're afraid of our code. Well, you should be. Personally, I'd give us one chance in three.

I came across this article discussing why PHP has a bad reputation. A lot of it came down to developers using poor practices. This excerpt: "Copying internet tutorial code and not reviewing it" struck me as particularly relevant, and something I think we see a lot of in the PHP forum. Granted, for any language there are low-quality and outdated tutorials out there, but PHP seems to have an abundance of them.

A few months ago while sitting at a Burger King (yes, I know) I recorded a video on "How to use Windows 8 in 3 minutes" and threw it up on YouTube. It's been viewed nearly a half million times. Eek. It's got poor audio, and it's WAY too fast. I did it on a goof. However, people keep showing it to family and friends.... So tonight I took a few hours and did a new video that I'm VERY happy with and I hope you enjoy it. It's clean, clear, and only 25 minutes long and it explains, I believe, Windows 8 and its changes for anyone with basic Windows experience.

Here's the story, of a Metro UI, who was showing us some very lovely tiles...

By focusing on a simple circuit, the 6502 microprocessor chip can actually be understood at the silicon level. It's interesting to see how the complex patterns etched on the chip can be mapped onto gates, and their function understood.... In this article, I show how overflow is computed in the 6502 microprocessor at the transistor and silicon level. I've discussed the mathematics of the 6502 overflow flag earlier and thought it would be interesting to look at the actual chip-level implementation. Even though the overflow flag is a slightly obscure feature, its circuit is simple enough that it can be explained at the silicon level.
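The overflow rule the article traces through the silicon can be stated in a few lines: for ADC, the 6502 sets V exactly when the two operands agree in their sign bit but the 8-bit sum disagrees. A sketch in Python (a standalone illustration, not tied to any particular emulator):

```python
def adc_overflow(a, m, carry_in=0):
    """Return (8-bit sum, overflow flag) for a 6502-style ADC of a + m."""
    s = (a + m + carry_in) & 0xFF
    # V is set when a and m share a sign bit that differs from the sum's:
    # ~(a ^ m) has bit 7 set when the operands agree in sign, and
    # (a ^ s) has bit 7 set when the sum's sign differs from a's.
    v = bool((~(a ^ m) & (a ^ s)) & 0x80)
    return s, v
```

For example, 0x50 + 0x50 (two positive bytes) wraps to 0xA0, a "negative" byte, so V is set; adding bytes of opposite sign can never overflow, which is why the gate-level circuit is so small.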

Dijkstra is an eminently quotable computer scientist, mostly for his famous lists of uncomfortable truths. Oft repeated is his rallying cry against BASIC, most of the time without context: "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."... The thing people forget is that programming was substantially different in 1975. Dijkstra railed against Dartmouth BASIC, a glorified assembly language. It isn’t the BASIC used today.

Programming is changing. The PC era is coming to an end, and software developers now work with an explosion of devices, job functions, and problems that need different approaches from the single machine era. In our age of exploding data, the ability to do some kind of programming is increasingly important to every job, and programming is no longer the sole preserve of an engineering priesthood.... Where am I headed with this line of inquiry? The goal is to be able to describe the essential skills that programmers need for the coming decade, the places they should focus their learning, and differentiating between short term trends and long term shifts.

TPS reports, for sure... but also robots, smart devices and so much more.

In this episode, Robert revisits the Windows 8 line of business app he wrote. This app provides the ability for employees to create and submit expense reports—and for managers to view and approve or reject them. Robert first reviews the app as you last saw it and then discusses what changes he made to the app's user interface and why he thinks those changes resulted in a much better app. He also announces that the app is now available for your downloading pleasure on CodePlex.

It’s worth considering the open source and free software movements. Does Richard Stallman fall under the banner of activist engineer? Linus Torvalds? The Ubuntu and Apache teams? Have they not changed the world? I’d argue that most open source software is about craft rather than seeking societal change. Rails and Django are terrific at improving the lives of the developers who work with them every day. But they’re ultimately tools used to build things, not outcomes in and of themselves.

Ask not what your code can do for you, ask what your code can do for everyone.

Google, with Android, is the biggest threat to Microsoft. Apple operates on a similar principle to Microsoft – still taking a cut at the point of sale, although in Apple’s case they count hardware and software together, where Microsoft focuses on the software side of things. But Google, with their ‘free’ software, is playing a completely different game. The recurring revenue from users through advertising is the key. The more users in Google’s world, the better Mountain View’s bottom line. What better way to expand their reach than to claim not just the web browser, but the whole desktop?

The first thing I did, which presumably all of you have already got covered, was to learn about computers, the Internet, and Internet culture. I read a bunch of books, I read enormous numbers of web pages, and I tried stuff. First I joined mailing lists and tried to understand the discussions until I felt comfortable jumping in and trying to participate for myself. Then I looked at web sites and tried to build my own. And finally I learned how to build web applications and I started building them. I was thirteen....

It's really sad about his suicide. I'm really impressed by that young man, who did such tremendous work at such an early age.

Sadly, I only came to know about him two days ago. There's a lot to learn from him. Below is what I was just reading:

Aaron Swartz is a contributor to the MusicBrainz project, especially its metadata initiative. He is also a member of the W3C's RDF Core Working Group and a co-founder of SWAG: The Semantic Web Agreement Group. You'll likely find him in the RDF IRC channel, working on some interesting new Semantic Web software. His website is at http://www.aaronsw.com/

Swartz was eulogized by his friend and sometime attorney, Lawrence Lessig, calling his prosecution an abuse of proportionality and noting, "the question this government needs to answer is why it was so necessary that Aaron Swartz be labeled a 'felon'."

Quote:

Alex Stamos, CEO of Artemis Internet, is a computer forensics investigator employed by the Swartz legal defense as an expert witness. On January 12, 2013, he posted a summary of the expert testimony he was prepared to present in the JSTOR case, concluding, "I know a criminal hack when I see it, and Aaron’s downloading of journal articles from an unlocked closet is not an offense worth 35 years in jail."

I agree with both of these. The Government and MIT overreacted.

Quote:

The government, however, has interpreted the anti-hacking provisions to include activities such as violating a Web site's terms of service or a company's computer usage policy, a position a federal appeals court in April said means "millions of unsuspecting individuals would find that they are engaging in criminal conduct."

That is a good thing. The government went nuts. I am wondering if someone at MIT has something to do with all these allegations and things... hmm.....

I also saw earlier that the government said that only Swartz was guilty in that interpretation, which reeks of foul play. Just have to find where I saw that. (Chrome crashed and deleted my history, so I can't find it through that.)

So, during the Summer of 2002, several bloggers and tech websites speculated that Dave must be bringing Chimera to the Mac. Except that Chimera was already a Mac application and didn’t need to be ported. So what the hell was Dave doing at Apple? Building another Gecko-based Mac browser? No one knew. And none of this made much sense. Which is probably why the rumors subsided so quickly. But people would remember all of this when Safari debuted at Macworld in San Francisco on January 7, 2003...

Team leader Don Melton tells the story behind building and launching Apple's Safari browser.

Let me express the only feature I really desire in the next version of Visual Studio: Replace the format of all project and solution files with PowerShell scripts. I hear you groaning – just hear me out. I have many reasons for wanting this – too many to list all but the highlights here. In a nutshell they all boil down to the notion of simplicity.

What new features would you like to see in the next version of Visual Studio?

More options in its formatting preferences, e.g. always put a SPACE before a semi-colon, don't put a SPACE before empty braces, brackets, or parentheses.

I do a lot of my development at the command line and I create files for my library in a hierarchy. I do use Visual Studio when I have to and I have a Library project that is supposed to include all the files in the hierarchy, but of course it's often missing the latest files. Finding and adding such orphan files within VS is tedious. I have written a utility to try to add them to the project file, but I forget to use it and there's always the fear that it will corrupt the project file. I would like VS to be able to find and add orphan files.

The ability to build and run (and debug) a simple console app without requiring solution and project files -- like Turbo languages and Quincy*. A year ago I finally got around to writing a simple editor that will do that (except for the debugging part); I have it configured for C, C#, and VB.net.

Trim trailing whitespace on save and load (my editor does this too).

* I assume Quincy still doesn't require project files, but I haven't used it since last millennium. Hmmm... same with Turbo...

What new features would you like to see in the next version of Visual Studio?

The performance, fast load time, and simplicity of Visual Studio 97.

Color.

Normal menus, not SHOUTING menus (yes, I know those can be changed back)

A lean & mean IDE. I don't need a server explorer. I don't need a WPF editor. I don't need ASP.NET. I don't need wizards up the wazoo for things I never do. Hell, I don't even need a form designer most of the time. I don't need all that refactoring crap. I don't need integrated unit testing. Make the IDE truly component based so it loads only what I tell it to load and not all this bloated, implemented better by third parties, functionality.

That is why I like SharpDevelop. As it is open source, it can be tailored to the needs of the person using it. I have done that, and I am using its code base to create a simple, embedded scripting IDE. (Which may take some time, as I still have to learn more about WPF and stuff).

Microsoft was quite literally founded on BASIC. Few of us who were doing software development in the '90s could deny that Visual Basic successfully lowered the bar for entry such that just about anybody could write a simple program. I would even go so far as to say that Visual Basic was a key to the success of Windows in the '90s and '00s.

Old languages never die... and they don't seem to fade away much, either.

The requirement for developers isn't the same as a neural surgeon. In a free market you get what you pay for and if customers want to pay less for crap, it's their choice.

With a surgeon you can't afford to take this risk, but if an 18-year-old high school graduate could perform surgery with the help of advanced robotic tools, then it wouldn't really matter.

To reduce overhead for highly qualified staff, a lot of jobs in the medical sector are already being replaced by mere operators who control a machine and the doctor just sits in his office analyzing the data that comes back.

JavaScript is not the answer. The problem with Visual Basic is when they did a complete redo for .NET. Somebody thought that VB should parallel C#. The result is a language that is not as easy for beginners.

Yes. It justifies the existence of managers. Especially managers who hire VB programmers, replacing their C#/C++ staff, because they are more plentiful, cheaper, and don't use complicated architectures that the rest of us use. (Yes, that was actually the reason given to me.)