Welcome to my website.
Here you can find my two blogs: this one in English and another one in Italian, where I share my thoughts about computers, electronics, software development, information/cyber security, and technology in general.

This site, like some others I own, was hosted on a DirectAdmin-managed virtual server that I started renting in 2015.

I have used many server management panels over the years, mainly because running a mail server is a hassle. But now I have decided to use a cloud service for my e-mail, and I can easily configure and manage the web servers using Ansible, so I don't really need a server management panel anymore.

Farewell DirectAdmin, you served me well.

Due to the DNS transfer there could be some minor issues with the domain until the information propagates.

As you can see, the "cf-cache-status" header is missing; this should happen only when the file type is "not something they would ordinarily cache" (see this article).

The resource showed the proper headers and was set as public, just like other website resources that Cloudflare cached correctly. So I tried enforcing a "Cache Everything" page rule, without any effect.

Then I spent some more time looking for differences in the headers, without noticing anything new, until, after a while, rereading this article, I noticed this phrase:

caches the following types of static content by extension

So I realized that the only difference between cached and uncached content was the presence of the file extension in the URL! This even though the files have the correct MIME types and the headers contain the file name with its extension.

So I ran an experiment: I changed the URL of the previous resource to include the extension, and the result was this:

The resource was now cacheable by Cloudflare. Strangely, though, the page rules still did not enforce caching of the resource.
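The experiment is easy to reproduce with a small script. This is just a sketch: `cf-cache-status` is the real header Cloudflare sets, but the example URL is hypothetical and you would substitute your own resource.

```python
import urllib.request
from urllib.parse import urlparse, urlunparse


def with_extension(url, ext):
    """Return the same URL with the given file extension appended to the path."""
    parts = urlparse(url)
    return urlunparse(parts._replace(path=parts.path + ext))


def cf_cache_status(url):
    """HEAD the URL and return Cloudflare's cf-cache-status header (None if absent)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("cf-cache-status")


# Usage (hypothetical extensionless resource served through Cloudflare):
#   cf_cache_status("https://example.com/images/logo")        -> header missing
#   cf_cache_status(with_extension("https://example.com/images/logo", ".png"))
#                                                             -> e.g. MISS / HIT
```

Comparing the two responses makes the difference immediately visible without digging through a browser's network tab.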

My hypothesis is that the function that decides cacheability first extracts the file extension from the URL and then does the actual evaluation; if the resource has no file extension, it simply skips all the other phases, no matter what page rules you set.
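That hypothesis can be written down as a minimal model. The extension set below is a partial, illustrative subset of Cloudflare's actual defaults, and the whole function is only my guess at the behavior, not their real implementation:

```python
from os.path import splitext
from urllib.parse import urlparse

# Illustrative subset of extensions cached by default (not the full official list)
DEFAULT_CACHED_EXTENSIONS = {
    ".css", ".js", ".jpg", ".jpeg", ".png", ".gif", ".ico", ".svg", ".pdf",
}


def is_default_cacheable(url):
    """Hypothesized decision: extract the extension first; no extension, no caching."""
    ext = splitext(urlparse(url).path)[1].lower()
    if not ext:
        # The guessed early exit: all later phases (page rules included) are skipped
        return False
    return ext in DEFAULT_CACHED_EXTENSIONS
```

Under this model, `/img/logo.png` is cacheable while `/img/logo` never is, which matches what I observed.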

This is not an issue, just something I think is useful to be aware of, especially if you serve a lot of "cacheable" content in a REST fashion.

Privacy is one of those concepts that most people do not really understand.

Most people don’t read the privacy policies and terms of service of the various websites/apps they use; moreover, many of those who do read them don’t understand their meaning, and among those who do understand them, most don’t really have the freedom to refuse them.

Most of the time, ToS and policies are endured, not accepted, because if you don’t use those services you are cut off from the world.
I have neither a WhatsApp account nor a Facebook account, and this means I am cut off from most social interaction. When asked why I don’t have WhatsApp “since it’s free”, I answer: “I would not mind using WhatsApp if the price were only a fair amount of money”. For me, the price of using those services is too high, and there are also many more privacy-conscious alternatives, like Signal.

There is a person who never contacts me simply because she only communicates via WhatsApp and Facebook, so I understand the price of my choices, and why most people just don’t want to know what happens to their data. Nevertheless, I consider this a form of violence from those companies, which use the unawareness of the many to force the others into submission.

I had a network outage at home from the 1st of February until the 11th. It was a long period (the longest since at least 2010), and this event made me think about how much we depend on the internet and the cloud for so many things.

Luckily, most of my home systems don’t need cloud services to work (for example, I use ownCloud for file sync and Mercurial for code versioning on a server at home), but of course I could not watch Netflix or download games from PSN, and I only had my mobile for news sites and casual browsing (and I almost depleted its bandwidth).

But I was thinking about those people who buy systems that depend heavily on the cloud to work (there are even lamps that need the cloud to be turned on and off!): what will happen to them during an outage?

The internet has become a service we depend on, like electricity, and it relies on standards that make it fairly easy to replace one provider with another.

On the other hand, most cloud services are not based on standards, so it is not trivial to move from one to another. Of course, many of them provide some export functionality, but again not based on any standard, so you don’t see as much import functionality. There are exceptions: for services like Dropbox, it’s just a matter of moving files from one directory to another (losing history, though), but most are not that easy.

Let’s return to the cloud lamps: if the company that provides the service ceases operation, the lights will stop working and you will have to replace them all. And what if all your house lights depend on it? You will be left alone in the dark…​

The cloud is a valuable resource but also a risk, due to the lack of standards and to security/privacy concerns. I’m not saying that people should avoid it, but I think we should all be aware of the related risks.

The truth is that I abandoned it in 2013 due to lack of time, and until today all the time I’ve spent on it went into Drupal maintenance.
So I decided to drop Drupal and move to something that requires little maintenance effort.

My choice was jBake, which generates a static website that needs no patching or similar security maintenance. Another advantage of jBake is that it uses a template system common in the JEE world, which I already know.

Just a final note: Drupal is a really good platform, but it is just too powerful for a simple website like this one is now.

At home I have a server that I use for file serving, development, and experiments, but it is really old: it has no built-in virtualization support (I'm using VirtualBox, but that is desktop-oriented and I need something more server-oriented), and the available HDD space is almost gone.

So my requirements list is:

at least 4 TB of RAID-protected HDD space

low power consumption at idle (at least no higher than my current setup)

lowest possible noise

full HW virtualization support

two gigabit Ethernet cards

full CentOS 6 compatibility (that is the hardest part)

It was not easy, especially the noise requirement, since that information is seldom provided in hardware specifications. My main source for this kind of info is Silent PC Review, but you can still have bad surprises.

My current server configuration has 3 HDDs (RAID 5), plus a 2 TB HDD for "slow-changing data" and an SSD for booting.

To achieve larger capacity and lower power consumption I decided to use only 2.5" HDDs. This setup is now common for enterprise servers too; obviously I don't want to buy enterprise-grade 2.5" HDDs (those are pricey), and I will also need a proper backplane to ensure good heat dissipation (so no plastic case).

I selected the Samsung Spinpoint M8 1 TB HDD; the Spinpoint series has always been a silent one, so I ordered a sample and ran some tests, and the results in a USB enclosure were quite satisfactory. To reach 4 TB in RAID 5 I need 5 HDDs, so to fit those plus the boot SSD I needed a motherboard with at least 6 SATA ports.
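The capacity arithmetic behind the disk count is simple (RAID 5 loses one disk's worth of space to parity), and can be sketched as a quick check:

```python
def raid5_usable_tb(n_disks, disk_tb):
    """Usable RAID 5 capacity: parity consumes the equivalent of one disk."""
    return (n_disks - 1) * disk_tb


# Five 1 TB Spinpoint M8 drives hit the 4 TB target:
print(raid5_usable_tb(5, 1.0))  # 4.0
```

This ignores filesystem overhead and the decimal-vs-binary terabyte difference, so real usable space will be somewhat lower.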

Another important subject was the CPU: this is not a gaming PC, I just want to host some virtual guests for clustering experiments, so two physical cores should be enough, and power consumption has to be as low as possible. This seemed simple, but I soon discovered that there are not many low-power desktop CPUs.

I have to admit that I liked the idea of using the E3, especially for the ECC memory support, but I was unable to find a motherboard that officially supported it (and met my other requirements), so I had to buy the i5.

Joomla 1.5 has reached end of life, so I had to migrate to a new product.
Joomla is really nice, but I found myself liking Drupal much more, so I've started the migration.
It'll take some time to complete.

Google removed the share functionality from Google Reader to push Google+. I'm really upset by this choice, because I used that functionality to share news and articles I found interesting on this website. Today I realized that Twitter could be an effective replacement for it, so from now on I'll be a Twitter user too.