Building together with Autogespot


Autogespot is an international platform where car enthusiasts meet, share photos and follow the latest news about exclusive cars. The site has now existed for fourteen years, twelve of them hosted by PCextreme.
The heart of the website is the spots: anyone with a camera who 'spots' an exclusive car can send in photos to share with other admirers.

Autogespot visitors are some of the most active people on any of our services. On a daily basis, an incredible number of awesome spots are posted by you and your fellow admirers.
Autogespot currently has a total of 865,000 unique visitors per month worldwide, 225,000 of those in the Netherlands, 110,000 in Germany and 88,000 in Belgium. Even from Vietnam we draw more than 8,000 visitors every month!

This vast amount of traffic requires a smart configuration, one that relies heavily on our Objects platform. As the numbers above might indicate, Autogespot is by far our largest Objects user. We would love to share our journey towards this point with you.

Humble beginnings

The site started out running on a simple Shared Hosting package, but quickly outgrew its limitations.

After several platform changes, PCextreme and Autogespot sat down together to develop a brand new cloud platform. It has taken several months, but now we can confidently say that we have a setup that everyone can be proud of.

Housing a website with 3 million hits a day and 900,000 images poses a huge challenge for any server configuration. Our wonderful system administrators therefore had to come up with a simple, robust and scalable solution: one that is easy to maintain and strikes a balance between reliability and speed.

We decided to build a cloud platform in which each and every aspect of the website has its own specific compartment, much like every part of a car has its own purpose. Here's how it works:

The cluster

At the heart of the cluster is the Load Balancer: a highly available cloud server that divides all traffic amongst several different servers, using Varnish caching for optimal performance. Hitch terminates the many thousands of encrypted connections, courtesy of Let's Encrypt certificates.
The great thing about this highly available server is that it is scalable in size, and if it ever goes down for one reason or another, it automatically restores itself.
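Conceptually, the Varnish layer in front of the webservers behaves like a short-lived response cache: serve a stored copy while it is fresh, and only bother a backend on a miss. A minimal sketch of that idea in Python (real Varnish is configured in VCL, and the names and TTL here are made up for illustration):

```python
import time

class TTLCache:
    """Tiny response cache: serve a stored copy until its TTL expires."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (expires_at, body)

    def get(self, url, fetch_from_backend):
        entry = self.store.get(url)
        if entry and entry[0] > time.time():
            return entry[1]             # cache hit: the backend is not touched
        body = fetch_from_backend(url)  # cache miss: ask a webserver once
        self.store[url] = (time.time() + self.ttl, body)
        return body

# A popular page is fetched from a backend once, then served from memory.
cache = TTLCache(ttl_seconds=120)
page = cache.get("/spots/latest", lambda url: "<html>...</html>")
```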

The Load Balancer selects one of six cloud servers to handle requests, using 'sticky' load balancing. This means that a visitor whose session cookies are tied to a certain webserver, for instance, is automatically directed to that same server when revisiting. This ensures previously cached data doesn't need to be recalculated, making for a much quicker user experience.
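Sticky balancing can be approximated by hashing a session cookie onto the list of backends, so the same visitor deterministically lands on the same server. A hedged sketch of that mapping (the server names and cookie value are hypothetical, and a production balancer would also handle backend failures):

```python
import hashlib

BACKENDS = ["web1", "web2", "web3", "web4", "web5", "web6"]  # hypothetical names

def pick_backend(session_cookie: str) -> str:
    """Map a visitor's cookie to one of the six webservers, deterministically."""
    digest = hashlib.sha256(session_cookie.encode()).digest()
    index = int.from_bytes(digest[:4], "big") % len(BACKENDS)
    return BACKENDS[index]

# The same cookie always yields the same server, so cached state is reused.
assert pick_backend("visitor-42") == pick_backend("visitor-42")
```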
The six Apache-based cloud servers are arranged so that if one becomes unavailable, the remaining five can easily handle the web traffic. These servers are geographically spread amongst three different datacenters and are under constant monitoring, so problems can be solved well before visitors ever experience any issues. On top of that, Memcache is utilised to minimise the number of times the databases need to be accessed, speeding up the processes even more.
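Memcache sits between the webservers and the databases in the classic cache-aside pattern: check the cache first, fall back to the database only on a miss, and store the result for the next request. A minimal sketch, where a plain dict stands in for a real memcached client (an assumption, as are the key format and function names):

```python
cache = {}  # stands in for a real memcached client

def get_spot(spot_id, db_lookup):
    """Fetch a spot, going to the database only when it isn't cached yet."""
    key = f"spot:{spot_id}"
    if key in cache:
        return cache[key]        # served from memory, no database round-trip
    value = db_lookup(spot_id)   # only a cache miss reaches the database
    cache[key] = value
    return value
```

A real client would also set an expiry time on each key, so stale entries age out rather than being served forever.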

Speaking of databases: these are accommodated on several separate cloud servers, in a so-called 'master/slave' setup. This means the MySQL databases are replicated across these servers, which spreads read access for scalability and lets us run the Bacula backup service against a slave without interrupting the master.
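In a master/slave replication setup, application code typically sends all writes to the master and spreads reads over the slaves. A hedged sketch of such a query router (the connection names are illustrative, and a real client would execute the query on the chosen server rather than return it):

```python
import itertools

class ReplicatedDB:
    """Route writes to the master, round-robin reads over the slaves."""

    def __init__(self, master, slaves):
        self.master = master
        self._reads = itertools.cycle(slaves)

    def execute(self, sql):
        verb = sql.lstrip().split(None, 1)[0].upper()
        if verb in ("INSERT", "UPDATE", "DELETE"):
            target = self.master          # writes must hit the master
        else:
            target = next(self._reads)    # reads spread over the slaves
        return target, sql  # a real client would run `sql` on `target`
```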

Delivering content using Objects

Now, all of these services would be irrelevant without the actual storage and content delivery for 'spots', which amounts to roughly 10 TB of image and video files. Obviously, these files should not be stored on the main webservers; that would slow down the website tremendously. This is where Objects comes in.

All of these images and videos are stored on and retrieved from our Objects platform, which serves specific images to the webservers on request, without taxing the webservers themselves. Using Ceph storage technology, the data is replicated three times across multiple, scalable servers to ensure data preservation and leave headroom for future growth. Files are sent out over a 10 Gbit/s fiber connection to present visitors with their spots as quickly as possible. To put this into perspective: approximately 1 terabyte of data is transferred daily, amounting to about 3 million hits!
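Those two figures imply an average object size: roughly 1 TB spread over 3 million hits works out to a few hundred kilobytes per request. A quick back-of-the-envelope check:

```python
daily_bytes = 1 * 10**12   # ~1 terabyte served per day
daily_hits = 3 * 10**6     # ~3 million hits per day

avg_kb_per_hit = daily_bytes / daily_hits / 1000
print(round(avg_kb_per_hit))  # prints 333 (kB per request, on average)
```

That average is consistent with a workload dominated by web-sized photos rather than full-resolution originals.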

There's more!

Lastly, a separate utility server processes all uploaded spots: it converts images to the required formats asynchronously, so as not to interrupt the viewers' experience, and runs cronjobs for mailing purposes and the like.
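The utility server's workflow, accepting an upload immediately and resizing it later, is a standard producer/consumer queue. A minimal sketch using only the standard library (the function names and the resize placeholder are hypothetical):

```python
import queue
import threading

uploads = queue.Queue()
processed = []

def resize(filename):
    # placeholder for real image conversion (several target sizes, etc.)
    return f"{filename} resized"

def worker():
    while True:
        name = uploads.get()
        if name is None:        # sentinel value: stop the worker
            uploads.task_done()
            break
        processed.append(resize(name))
        uploads.task_done()

# The upload handler only enqueues; the visitor never waits for the resize.
threading.Thread(target=worker, daemon=True).start()
uploads.put("spot-001.jpg")
uploads.put(None)
uploads.join()
```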
All of these services are managed using Saltstack, which eliminates manual processes and thereby reduces the chance of human error. It allows us to easily deploy new servers and make changes to existing ones. All in all, this configuration gives us an easy-to-maintain, highly available, lightning-fast platform that lets 900,000 car enthusiasts enjoy a vast collection of exotic cars through Autogespot.