What makes for a fast development PC

When choosing a PC for development work, there are several factors to consider to get maximum productivity:

CPU Speed

Storage (e.g. SSD)

Memory

Monitors

For each of the above, I will also provide some guidance on return on investment. After all, there is no point in overpaying for a PC, and we need to consider how much the average developer earns. Based on developer salaries in the USA, the average is close to $100,000 a year, or around $50 per hour.

Developers are expensive, so it’s important to give them great equipment so you get the best value from them. A slower, less expensive computer is a false economy, as the small cost difference of getting a fast computer is quickly recovered through the developer’s increased productivity.

One of the worst things for developer productivity is interruptions, as it can take a developer 15+ minutes to get back to being productive. The slower the PC, the higher the chance a developer will be interrupted. Waiting minutes for a computer to finish a task will result in the developer’s mind wandering (after all, we are all human), and it then takes time to get back up to speed. These interruptions need to be avoided at all costs.

Modern development is very demanding on computers: most build processes are multi-core now, unit tests run on every build or code change, web pages reload with the latest code, mobile emulators are updated, and so on. It’s very easy for a development computer to become slow, which introduces interruptions into the development workflow and makes the developer unproductive.

CPU Speed

This can make a surprising difference in how much time a developer ends up waiting on the PC. Here is an example of the difference it can make.

I have an HP Workstation Z420 with a Xeon E5-1620 and a PC with an Intel Skylake i7-6700K overclocked to 4.6GHz. Both are quad-core processors. The i7 is 2x as fast as the HP workstation at compiling code (5 minutes vs 2.5 minutes)!! I was very surprised when I ran the benchmarks; I thought the Xeon CPU would give the consumer i7 CPU a run for its money. I was very wrong.
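If you want to compare machines yourself, timing your real build beats any synthetic benchmark. Here’s a rough sketch in Python; the msbuild command is just a placeholder, so substitute your own project’s build command:

```python
import subprocess
import time

# Placeholder: substitute your project's real build command here.
BUILD_COMMAND = ["msbuild", "MySolution.sln", "/m", "/t:Rebuild"]

RUNS = 3
timings = []
for i in range(RUNS):
    start = time.perf_counter()
    subprocess.run(BUILD_COMMAND, check=True)  # raises if the build fails
    timings.append(time.perf_counter() - start)
    print(f"Run {i + 1}: {timings[-1]:.1f}s")

print(f"Best of {RUNS}: {min(timings):.1f}s")
```

Run it on each machine and compare the best-of-three figure; the first run is often slower due to cold caches.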

Since it only costs a few hundred dollars when ordering a computer to upgrade to the fastest i7 processor, it is well worth the investment, especially considering most computers are good for three years or more of use.

For the CPU I would recommend the fastest quad-core i7. If, like me, you find your CPU sitting between 50% and 100% utilization much of the time, then consider Intel’s latest 10-core i9 processor. The productivity gains from the increased performance will pay for the upgrade.

Storage

This one is a no-brainer: just get SSDs. For the random I/O that development workloads generate, they are orders of magnitude faster than an HDD. Why wait minutes for a reboot when it can happen in seconds, with apps opening around 10x quicker as well?
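If you want to see the gap on your own drives, here’s a rough random-read test in Python (a sketch, not a rigorous benchmark; the file size and read count are arbitrary). Run it once with the scratch file on the SSD and once on the HDD:

```python
import os
import random
import time

TEST_FILE = "testfile.bin"      # place this on the drive being measured
FILE_SIZE = 256 * 1024 * 1024   # 256 MB scratch file (arbitrary)
READS = 2000

# Create a scratch file to read from.
with open(TEST_FILE, "wb") as f:
    f.write(os.urandom(FILE_SIZE))

# Time random 4KB reads; random I/O is where HDDs fall furthest behind.
# Note: the OS file cache can flatter these numbers; a file larger than
# your RAM (or a reboot between runs) gives a fairer comparison.
with open(TEST_FILE, "rb") as f:
    start = time.perf_counter()
    for _ in range(READS):
        f.seek(random.randrange(0, FILE_SIZE - 4096))
        f.read(4096)
    elapsed = time.perf_counter() - start

print(f"{READS} random 4KB reads in {elapsed:.2f}s ({READS / elapsed:.0f} reads/sec)")
os.remove(TEST_FILE)
```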

I would recommend the following setup:

One SSD for the boot drive – 512GB

One SSD for the source code – 512GB or larger

One SSD for virtual PCs – 1TB or larger (if required)

One large HDD.

The boot drive should always have its own SSD, with the source code and all development files on another SSD. This setup provides a clean separation between the OS and the source code: when the PC needs to be rebuilt, the OS drive can be safely formatted with no risk of losing any source code.

Also, since virtual PCs tend to take up a lot of space, I give them their own SSD. A common problem is that virtual PCs can consume all the disk space; when they are on their own SSD they can’t cause the OS or development drive to run out of space.

The HDD in my system is used for backups. I use CrashPlan to keep all my changes backed up both locally and remotely.

Memory

There are a number of factors to consider for this.

For web or mobile development, 32GB is a great starting point. I normally use around 24GB of RAM when doing web/desktop development, so 32GB leaves a bit of space for other tools.

When using virtual PCs to simulate other servers or development environments, 64GB is best. That gives you the headroom to simulate and test some quite complex environments.

For virtual PCs I also have another computer under my desk dedicated to running them, as virtual PCs can be demanding on resources. It is my old development PC, but it is more than up to the task, and it is also a great way of recycling old equipment.

What you do not want is for your computer to run out of memory and start paging to disk. Your computer will then run massively slower, as paging, even to an SSD, is far slower than using RAM.
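A quick way to keep an eye on this is the third-party psutil package (`pip install psutil`); this is just a sketch, and the 90% warning threshold is my arbitrary pick:

```python
import psutil  # third-party: pip install psutil

# Physical RAM: total, in use, and percentage used.
vm = psutil.virtual_memory()
print(f"RAM:  {vm.used / 2**30:.1f} / {vm.total / 2**30:.1f} GiB ({vm.percent}% used)")

# Swap / page file: if this keeps growing, you have run out of RAM
# and the machine is paging to disk.
sw = psutil.swap_memory()
print(f"Swap: {sw.used / 2**30:.1f} / {sw.total / 2**30:.1f} GiB ({sw.percent}% used)")

if vm.percent > 90:  # arbitrary threshold
    print("Warning: nearly out of physical memory; paging is likely.")
```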

Monitors

Adding extra monitors boosts your productivity. There are a number of studies showing that a second monitor can boost productivity by around 35%!! That’s an extra $30,000+ worth of work each year from the average developer, for a $500 monitor. This is an amazing return on investment.
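The arithmetic behind that claim, using the salary figure from earlier (the 35% boost and the $500 price are the assumptions here):

```python
salary = 100_000        # average developer salary from above (USD/year)
boost = 0.35            # claimed productivity gain from a second monitor
monitor_cost = 500      # assumed price of a decent monitor (USD)

extra_output = salary * boost                        # value of extra work/year
payback_days = monitor_cost / (extra_output / 260)   # ~260 working days/year

print(f"Extra output per year: ${extra_output:,.0f}")           # $35,000
print(f"Monitor pays for itself in {payback_days:.1f} working days")
```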

I run three monitors, which I find is the perfect number for web development. One monitor has the IDE (e.g. Visual Studio), the next has the website I am working on, and the third has the other tools I need, like database tools, specifications, etc.

Tip: Always hide or close email and team chat windows (e.g. Slack) so you won’t get distracted while working. Having them open on one of the monitors is just asking to be distracted.

Having upgraded to 27″ 4K monitors, I have found them brilliant to use; they lower eye strain and allow you to concentrate for longer.

Summary

For a new development PC I would recommend the following specs:

Intel i7 or i9

2 to 3 SSDs, one HDD

32GB to 64GB RAM

2 to 3 27″ 4K IPS monitors.

The extra $$ to upgrade to the fastest computer components will quickly pay for themselves through the increased developer productivity.



When you start a new project, there is a massive number of ways to store your data. Here are some examples:

Relational Databases

Document Databases

Key-Value Databases

File System

etc

From experience, 99% of the time the best data store for a new project is a relational database. Having said that, the key is to understand your data and how you will store and access it, and then choose the appropriate storage system. Don’t make the big mistake of using a technology just because it’s the current fad, as this will most likely end in a massive rewrite when the fad doesn’t pan out. I’ve been wanting to use MongoDB for years, but have yet to find a project where it is a good match. Let the data choose the storage method, not what you want to use.

Every system will work and perform well with a small sample data set, so unless you have done lots of careful planning you are not going to find issues until you are live in production, and by then it may be too late to fix them.

Here is an article on the issues caused by using the wrong data storage system.

Relational databases make an excellent starting point, and I will explain why.

Relational Databases scale really easily.

Relational databases scale very, very well. For example, stackoverflow.com, which is in the world’s top 50 websites, is powered by a relational database (Microsoft SQL Server). In fact, it works so well that their database servers run at close to 0% CPU usage!! That’s insane!

One common technique with relational databases is to add read replicas, so that all queries (e.g. reports) go to a secondary server (you can add more if required). This reduces the load on the primary server. As reports tend to be the heaviest database workload, adding read replicas lets you scale those operations.
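The routing itself can be as simple as picking a connection string per operation. Here’s a minimal sketch of the idea; the hostnames and the round-robin policy are made up for illustration, and many ORMs and drivers can do this for you:

```python
import itertools

class ConnectionRouter:
    """Send writes to the primary, spread reads across replicas."""

    def __init__(self, primary, replicas):
        self.primary = primary
        self.replicas = itertools.cycle(replicas)  # simple round-robin

    def for_write(self):
        return self.primary

    def for_read(self):
        # Reports and other heavy queries go to a replica,
        # keeping load off the primary.
        return next(self.replicas)

# Hypothetical connection strings, for illustration only.
router = ConnectionRouter(
    primary="Server=db-primary;Database=app",
    replicas=["Server=db-replica1;Database=app",
              "Server=db-replica2;Database=app"],
)

print("INSERT goes to:", router.for_write())
print("Report goes to:", router.for_read())
print("Report goes to:", router.for_read())
```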

With modern hardware (SSDs, terabytes of memory, tens of CPU cores), not many systems are going to hit the limits of how far a relational database can scale vertically. It’s a lot easier to scale vertically than horizontally, and it’s also a lot easier to maintain one server (vertical scaling) than tens of servers (horizontal scaling).

Relational Databases are easy to program against.

The tools for relational databases are very easy to work with. When you use tools like Entity Framework for .NET or Hibernate for Java, they are so simple to use that you almost forget you are using a relational database. The ecosystem for relational databases is incredibly rich: monitoring servers, reporting engines, code generators and more, so there is no shortage of tools. Also, if you hit an issue, someone else is bound to have hit it too, so sites like stackoverflow.com have tons of helpful tips.
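Entity Framework and Hibernate are the examples above; SQLAlchemy plays the same role in Python, so here’s a minimal sketch of the idea (assuming `pip install sqlalchemy`):

```python
from sqlalchemy import create_engine, Column, Integer, String, select
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Customer(Base):
    __tablename__ = "customers"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

# In-memory SQLite for the demo; swap in your real connection string.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)  # emits CREATE TABLE for you

with Session(engine) as session:
    session.add(Customer(name="Ada"))
    session.commit()

    # The ORM turns this into a parameterised SELECT behind the scenes.
    ada = session.execute(
        select(Customer).where(Customer.name == "Ada")
    ).scalar_one()
    print(ada.id, ada.name)
```

You never hand-write the CREATE TABLE or the SELECT, which is exactly the convenience the ORMs above provide.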

Relational Databases keep your data safe

An important feature of relational databases is referential integrity. It helps ensure that any data you add or delete remains valid, leaving no broken (orphaned) records. Each field can also have extra checks to help ensure that the data entered is correct.

It can be a pain at times, but the extra safety these checks provide is well worth the investment of using them. Nothing in life is free, but for the low effort of enabling referential integrity the gains are fantastic, and they have saved me more than once. Other features relational databases use to keep your data safe include transactions, write-ahead logging and built-in backup tools.
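Here’s a quick demonstration of referential integrity using Python’s built-in sqlite3 module (note that SQLite needs the foreign_keys pragma switched on; server databases enforce it by default). The tables are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires explicit opt-in

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total REAL NOT NULL CHECK (total >= 0)   -- per-field sanity check
    )""")

conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Ada')")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 9.99)")  # fine

try:
    # No customer 42 exists, so this would create a broken (orphaned) record.
    conn.execute("INSERT INTO orders (customer_id, total) VALUES (42, 5.00)")
except sqlite3.IntegrityError as e:
    print("Rejected:", e)  # FOREIGN KEY constraint failed
```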

Relational Databases are fast (very fast)

They have been designed and refined over 30+ years of optimization, which has made them very, very fast at querying data. Some of Microsoft’s upgrades to their SQL Server have been stunning: I have seen 2x to 10x performance improvements just by upgrading to the latest version!!

One database I work on takes hundreds of millions of updates every year and hits peaks of 30 to 100 inserts/updates per second, and it isn’t even breaking a sweat. The database server is a Microsoft Azure P1 instance, so the actual hardware it uses is very modest (my laptop is faster). You can buy hardware that is 100x more powerful, so there is a lot of room for growth!

Next Steps

For your next project, I would recommend first modelling out your data structure on a whiteboard. Think about what reports you will require and what indexes they will need. I am sure you will find that for the majority of applications a relational database is the best starting point, and you can always supplement it with other data storage engines later if required. I use Azure Blobs (files) and Tables (key/value) to store data that doesn’t need to be in the database.
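As a tiny illustration of planning indexes around your reports, here’s a hypothetical orders table with an index matching a monthly sales report query (sqlite3 again, purely for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        ordered_at TEXT NOT NULL,   -- ISO-8601 date
        total REAL NOT NULL
    )""")

# The report filters on ordered_at, so index that column up front.
conn.execute("CREATE INDEX ix_orders_ordered_at ON orders (ordered_at)")

conn.executemany(
    "INSERT INTO orders (customer_id, ordered_at, total) VALUES (?, ?, ?)",
    [(1, "2017-06-03", 10.0), (2, "2017-06-17", 25.0), (1, "2017-07-02", 5.0)],
)

# Monthly sales report: the index lets the range scan skip unrelated rows.
for row in conn.execute(
    "SELECT COUNT(*), SUM(total) FROM orders "
    "WHERE ordered_at >= ? AND ordered_at < ?",
    ("2017-06-01", "2017-07-01"),
):
    print("June orders:", row)  # (2, 35.0)
```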


Firewalls: ClearOS Review

Recently I have been hunting for a firewall to use in my internal network. Primarily, I needed to create a DMZ for some web servers so that they could not access my internal LAN. It also needed to work with Hyper-V.

After looking at (and trying) a number of options, I came across ClearOS and was pleasantly surprised. It ticked all of the major boxes for me, was simple to set up, and has a fantastic UI.

So many firewall products have UIs that are very difficult to use; setting up even the simplest firewall rule can take ages, simply because finding the correct option takes so long.

In comparison, ClearOS is a breeze to use. The default setup comes with a very clean UI that has only a few options, so it takes just minutes to get up and running. A lot of vendors could learn from how ClearOS has implemented its UI. The default setup quickly provides a firewalled LAN with no fuss. Perfect!

When you need to add extra functionality, like 1-to-1 NAT, you simply open the ClearOS Marketplace and add the feature you want. The new functionality loads quickly without a reboot, and there are over 100 modules to choose from, so there is no shortage of functionality!!

When configuring your network you have the usual options of External (for your internet connection), LAN (for your internal PCs) and DMZ (for any web servers), but ClearOS also adds a “Hot LAN”, which is a mixture of a LAN and a DMZ. You can place your servers in the Hot LAN and know that your LAN will still be safe. The Hot LAN is very useful when you don’t have direct access to your public IPs (mine are hidden behind another router) or need to use NAT.

So not only is ClearOS simple to get started with, it can easily be expanded to handle much more complex environments. It’s the best of both worlds!

If you are looking for a new firewall, you should put ClearOS at the top of your list. They also provide hardware with ClearOS pre-installed, along with both free and paid support.



Microsoft’s 23 Year Old Data Loss Issue

Whenever you create a file in Windows with a long path (more than 255 characters) you risk losing that file and its data forever, since Windows cannot safely copy, move or even delete such files!

First, a bit of history. Over 30 years ago, when Microsoft created the second version of the FAT file system, it could only handle file names and paths up to 255 characters long. Today we are still stuck with that limitation.

Over time Microsoft has made massive changes to its file systems. The introduction of NTFS in 1993 removed these limitations. Well, sort of.

The problem is that Microsoft hasn’t updated its OS or tools to correctly handle these long file paths in the past 23 years.

So, 23 YEARS later, we still run the risk of losing our data whenever a file path exceeds 255 characters, because the underlying Microsoft tools can’t perform even simple operations, such as copying or deleting files, on long paths!

I call it a bug: the problem has been known for 23+ years, but to this day Microsoft has made no effort to apply the simple fixes that would keep your data safe.

So the crazy bit is that Microsoft has had the ability to work around these limitations built into the OS for 23+ years: the Win32 API accepts paths of up to roughly 32,767 characters when you use the extended-length \\?\ prefix. They have simply chosen not to update their tools to use this ability.

Even their more modern products, like OneDrive and PowerShell, are still affected by this bug. I can’t understand why they still choose to write modern tools with the same limitations, when the workaround is built into the OS. It’s just pure madness.

To avoid this problem, keep your file paths under 255 characters.
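If you do get stuck with one, the OS-level workaround mentioned above is that extended-length \\?\ prefix. Here’s a Windows-only sketch in Python that creates and then removes a directory tree deeper than 255 characters (the folder names are arbitrary):

```python
import os
import shutil
import sys

assert sys.platform == "win32", "the \\\\?\\ prefix is Windows-specific"

# Build a directory path well past the classic 255/260-character limit.
base = os.path.abspath("longpath_demo")
deep = os.path.join(base, *(["a" * 50] * 10))  # 500+ characters deep

# Prefixing an absolute path with \\?\ tells Windows to use
# extended-length path handling (up to ~32,767 characters).
os.makedirs("\\\\?\\" + deep, exist_ok=True)
print(f"Created a path {len(deep)} characters long")

# Remove it again through the same prefix; File Explorer would choke here.
shutil.rmtree("\\\\?\\" + base)
print("Removed cleanly")
```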

Fingers crossed Microsoft will fix these issues and take steps to keep our data safe.

Update 6/8/2016

Microsoft has updated the .NET API to support long file paths by default in .NET 4.6.2!!

Windows 10 Anniversary Update

File Explorer still can’t delete folders with long file paths, but since you can now install a Linux Bash shell on Windows 10, you can use Linux to delete, copy and move folders with long file names in Windows 10. That’s crazy!!!
