“Email on the Internet can be forged in a number of ways. In particular, existing protocols place no restriction on what a sending host can use as the “MAIL FROM” of a message or the domain given on the SMTP HELO/EHLO commands. This document describes version 1 of the Sender Policy Framework (SPF) protocol, whereby Administrative Management Domains (ADMDs) can explicitly authorize the hosts that are allowed to use their domain names, and a receiving host can check such authorization.”
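As a sketch of what that authorization looks like, an SPF policy is just a DNS TXT record published on the sending domain. The domain and IP range below are placeholders, not a recommendation:

```
example.com.  IN  TXT  "v=spf1 ip4:203.0.113.0/24 include:_spf.google.com -all"
```

The `-all` suffix tells receivers to fail mail from any host not listed; `~all` (softfail) is a gentler option while you are still confirming which hosts legitimately send for the domain.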

“DomainKeys Identified Mail (DKIM) permits a person, role, or organization that owns the signing domain to claim some responsibility for a message by associating the domain with the message. This can be an author’s organization, an operational relay, or one of their agents. DKIM separates the question of the identity of the Signer of the message from the purported author of the message. Assertion of responsibility is validated through a cryptographic signature and by querying the Signer’s domain directly to retrieve the appropriate public key. Message transit from author to recipient is through relays that typically make no substantive change to the message content and thus preserve the DKIM signature.”

“You can help prevent spoofing by adding a digital signature to outgoing message headers using the DKIM standard. This involves using a private domain key to encrypt your domain’s outgoing mail headers, and adding a public version of the key to the domain’s DNS records. Recipient servers can then retrieve the public key to decrypt incoming headers and verify that the message really comes from your domain and hasn’t been changed along the way.”

Click Generate the Domain Key

Follow the steps and generate a key

Generate a new record

Add the DKIM key to your DNS record
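For reference, the public key ends up as a TXT record at `<selector>._domainkey.<your domain>` (Google’s default selector is `google`; the domain and the truncated key below are placeholders):

```
google._domainkey.example.com.  IN  TXT  "v=DKIM1; k=rsa; p=MIGfMA0GCSq...AB"
```

You can later confirm the record is visible with `dig TXT google._domainkey.example.com`.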

DNS changes can take a while to propagate, so wait a few hours before checking again with the CheckMX tool.

“Domain-based Message Authentication, Reporting, and Conformance (DMARC) is a scalable mechanism by which a mail-originating organization can express domain-level policies and preferences for message validation, disposition, and reporting, that a mail-receiving organization can use to improve mail handling.

Originators of Internet Mail need to be able to associate reliable and authenticated domain identifiers with messages, communicate policies about messages that use those identifiers, and report about mail using those identifiers. These abilities have several benefits: Receivers can provide feedback to Domain Owners about the use of their domains; this feedback can provide valuable insight about the management of internal operations and the presence of external domain name abuse.

DMARC does not produce or encourage elevated delivery privilege of authenticated email. DMARC is a mechanism for policy distribution that enables increasingly strict handling of messages that fail authentication checks, ranging from no action, through altered delivery, up to message rejection.”
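A DMARC policy is also just a DNS TXT record, published at `_dmarc.<your domain>`. The example below is a sketch (domain and report address are placeholders) showing the middle setting of that range; `p=none` (monitor only), `p=quarantine` and `p=reject` map to the “no action, altered delivery, message rejection” options in the quote:

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com; pct=100"
```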

“Spammers can sometimes forge the “From” address on email messages so the spam appears to come from a user in your domain. To help prevent this sort of abuse, Google is participating in DMARC.org, which gives domain owners more control over what Gmail does with spam email messages from their domain.

G Suite follows the DMARC.org standard and allows you to decide how Gmail treats unauthenticated emails coming from your domain. Domain owners can publish a policy telling Gmail and other participating email providers how to handle unauthenticated messages sent from their domain. By defining a policy, you can help combat phishing to protect users and your reputation.“

Snip from here: “DNS: Internet’s Directory Nearly everything on the Internet starts with a DNS request. DNS is the Internet’s directory. Click on a link, open an app, send an email and the first thing your device does is ask the directory: Where can I find this? Unfortunately, by default, DNS is usually slow and insecure. Your ISP, and anyone else listening in on the Internet, can see every site you visit and every app you use — even if their content is encrypted. Creepily, some DNS providers sell data about your Internet activity or use it to target you with ads.”

I am not sure if Cloudflare is any more private than using ISP DNS but I’ll happily use it.

Several people have asked me about Cloudflare’s new 1.1.1.1 privacy DNS service. To be clear: it DOES NOT stop your ISP from collecting your browsing history. ISPs can still see the sites you’re connecting to, even if the site is over HTTPS; your device still sends the hostname in the clear (for example in the TLS SNI field).

Backblaze has a cloud storage solution that costs as little as $0.005 per GB per month, and the first 10 GB is free. Backblaze say “From bytes to petabytes Backblaze B2 is the lowest cost high-performance cloud storage in the world.”

Backblaze state “The B2 command-line tool is available from the Python Package Index (PyPI) using the standard pip installation tool. Your first step is to make sure that you have either Python 2 (2.6 or later) or Python 3 (3.2 or later) installed.”
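Using the quoted rate, a quick back-of-envelope cost estimate is easy. This is a sketch only: the $0.005/GB/month rate and 10 GB free tier come from the quote above, but the exact billing rules are my assumption, so check Backblaze’s pricing page for specifics.

```shell
# Rough monthly Backblaze B2 storage cost (rate and free tier from the quote;
# real billing rules may differ).
GB=100   # hypothetical amount stored
COST=$(awk -v gb="$GB" 'BEGIN { free = 10; rate = 0.005;
  billable = (gb > free) ? gb - free : 0;
  printf "%.2f", billable * rate }')
echo "Storing ${GB} GB costs roughly \$${COST}/month"
```

So a hypothetical 100 GB works out to well under a dollar a month at the quoted rate.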

I set up the following caching rules to cache everything for 8 hours, except WordPress admin pages:

“fearby.com/wp-*” Cache level: Bypass

“fearby.com/wp-admin/post.php*” Cache level: Bypass

“fearby.com/*” Cache Everything, Edge Cache TTL: 8 Hours

Cache Results

Cache hit rate appears to be sitting at 50% after 12 hours. Having cached dynamic pages out there is OK unless I need to fix a typo; then I need to log in to Cloudflare and clear the cache manually (or wait 8 hours).

Performance after a few hours

DNS times in GTmetrix have now fallen to sub-200 ms (YSlow is now a respectable A; it was a C before). I just need to wait for caching and minification to kick in.

No more logging into Namecheap to perform DNS management (I now go to Cloudflare; Namecheap are awesome).

Cloudflare Support was slow/confusing (I ended up figuring out the redirect problem myself).

Some sort of “verify Cloudflare setup/DNS/CDN” tool would be nice. After I set this up, my GTmetrix load times were unchanged and I was not sure whether DNS still needed to propagate. Changing minify settings in Cloudflare did not seem to take effect.

“Kali Linux is an open source project that is maintained and funded by Offensive Security, a provider of world-class information security training and penetration testing services. In addition to Kali Linux, Offensive Security also maintains the Exploit Database and the free online course, Metasploit Unleashed.”

Download Kali

I downloaded the torrent version, as the HTTP download kept stopping (even on a 50/20 NBN connection).

After the download finished I checked the SHA sum to verify its integrity.
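The check works the same way for any download: compute the hash locally and compare it with the published one. A minimal sketch, using stand-in files (for a real Kali ISO you would download the published SHA256SUMS file rather than generate one):

```shell
# Verify a download against a SHA-256 checksum list.
printf 'pretend ISO contents' > kali.iso   # stand-in for the real ISO
sha256sum kali.iso > SHA256SUMS            # stand-in for the published checksum file
sha256sum -c SHA256SUMS                    # prints "kali.iso: OK" when hashes match
```

If the file was corrupted in transit, `sha256sum -c` reports FAILED instead and exits non-zero.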

Parallels will not install; I think I need to upgrade to a newer version of Parallels (12 or later), as the printer driver detection is not working (even though the driver is installed).

Installing Google Chrome

I used the video below

I have to run Chrome with:

/usr/bin/google-chrome-stable --no-sandbox --user-data-dir &

It works.

Running your first remote vulnerability scan in Kali

I found this video useful in helping me scan and check my systems for exploits

Simple exploit search in Armitage (metasploit)

A quick scan of my server revealed three open ports (22, 80 and 443). Port 80 redirects to 443, and port 22 is firewalled. I have WordPress, and the exploits I tried failed to work thanks to patching (always stay ahead of patching and updating your software and the OS).

Without knowing what I was doing I was able to check my WordPress against known exploits.

If you open the Check Exploits menu at the end of the Attacks menu you can do a bulk exploit check.

WP Scan

Kali also comes with a WordPress scanner

wpscan --url https://fearby.com

This will try to enumerate everything it can about your web server and WordPress plugins.

/xmlrpc.php was found, and I was advised to deny access to that file in NGINX. xmlrpc.php is a legitimate WordPress file, but it can be abused in denial-of-service and brute-force attacks.
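A minimal NGINX rule for that advice might look like the following (it goes inside the site’s `server` block; reload NGINX afterwards):

```nginx
# Deny all requests to xmlrpc.php (returns 403)
location = /xmlrpc.php {
    deny all;
}
```

If something you use legitimately needs XML-RPC (Jetpack, for example), allow specific source IPs with `allow` lines before the `deny all` instead of blocking it outright.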

First, decide what problem(s) you are aiming to solve. Once the problem(s) are known, you can discuss and validate possible solutions before any coding is done (and billed). The business requirements will drive the technical requirements and direction.

Letting technology goals drive the project is a bad idea. Why do you need XYZ? Is an app required after all? Does an “off the shelf solution” exist already? Can existing processes or software be tweaked?

You will need stakeholders (or stakeholder proxies)

Choose customer stakeholders (or customer proxies) that are engaged and positive; build a team of about 5 people who can drive the decisions and who understand the problems. Involve key stakeholders early and capture problems in a formal setting.

Identify the stakeholders and establish a team

Document the problems (as it is and not how it should be)

Identify and specify improvement points

Plan solutions and Iterate

Improving a process or product should be measurable and agreed upon. The main aim of the customer/stakeholder work is to create validated user stories that define the problem(s) and solution(s) from the end user’s perspective.

Whiteboarding is a great way to get everyone on board (and on the same page). One person may assume that something is a certain way, but another person from the trenches may say that is not the case at all. It is key that problems reported lower down the managerial structure are listened to and valued. Accept any problems and add them to the backlog (prioritizing comes later).

Whiteboarding should build a backlog of ideas and a common understanding of what the landscape of issues is. I use a tool called MindNode from https://mindnode.com/ to map things electronically. Within minutes you can have a map of connected (or unconnected) areas of interest.

Stories are smaller things like “deliver back-end server infrastructure”, “Deploy a working prototype of feature X”, “Build an API (Application Programmer Interface)”. A user story can usually be completed in 2 weeks.

Tasks are often subtasks to stories and can be done quite quickly. A task could be “Change the colour scheme”, “Set up a testing environment” or “Deploy the alpha release”.

The following graphic is well known within the Project Management community.

How would you feel as the customer knowing that you have not been understood and that you may not get what you need?

How would you feel as the developer not knowing fully what you need to deliver?

No one likes to double back or waste time and resources unnecessarily. Define and plan early. “Failing to plan is planning to fail.” Benjamin Franklin

Technology Requirements

The backlog will drive the requirements; e.g:

mobile app or web app

type of technology needed

the location and number of servers

number of prototypes

scalability and redundancy

Knowing the requirements drives the technology. Technology drives the work and the budget.

TIP: Be wary of contractors who quote before listening to your full requirements. I was once told by a contractor that their main job was to “Con” and “Insult” you. Accepting a quote before fully stating what you need virtually guarantees that you are paying a load of extra profit.

Now What

If you have defined the problem well, have good user stories and validated with stakeholders you are on the right path.

Be prepared to pivot, and set frequent milestones to launch value (RERO: release early, release often).

Obviously, centralised web-based technology is easier to update than compiled and distributed mobile apps, so choosing the right technology early is key to success.

Security and Quality

This is the project triangle, where you can choose two sides (not three).

The ABS website was pulled down on the night of the study, with it then being unavailable for about 40 hours.

The ABS is judged to have failed to communicate with the public properly about major

IBM is deemed to have failed to properly test its disaster recovery processes

IBM had failed to properly plan for how to bring systems back up again.

IBM’s failure to have tested a router restart, or have a backup synchronised and in place, appears to have been significant contributing factors to the failure of the eCensus

In his report, meanwhile, Mr MacGibbon also found that the long-term, almost exclusive, relationship between the ABS and IBM had contributed to the problems by meaning any external questioning or oversight was absent.

As a minimum, you should plan and discuss early on the performance, reliability and uptime targets for any app. Personally, I have moved between multiple cloud providers (Cloudant, Digital Ocean, AWS and Vultr) to comfortably meet cost vs performance (each app WILL have different requirements).

Type of app

The type of app (Web, Desktop, Mobile) you will need will come after discussing the problem; focus on the problem(s), not the solution(s).

Conclusion

Do’s

Keep Source ownership and IP (never let contractors own the code or processes)

Insist on frequent handovers

Do plan for updated versions with major OS releases

Do secure the app

Do plan for maintenance of the app.

Don’t

Lock into expensive maintenance contracts.

Choose a vendor who is more passionate about solving your problem than about the invoice they will submit.

Below is my quick blog post on using the EWWW IO plugin in WordPress to set up ExactDN, a global Content Delivery Network (CDN), to distribute images to my site’s visitors and shrink (optimize) images in posts.

I am hoping adding a CDN will make things faster. Over two-thirds of what my blog delivers is images, which is perfect for a CDN; this is why I am trying it out.

TIP: Check where your customers/readers are located, and how many are New versus Returning customers? Do you need a CDN to deliver content (Images) that are closer to your customers/readers or do you need to move your web server somewhere else? The more you know the more you can help them. Worst case you will be supporting a positive experience (and potentially turning a one time visitor into a returning visitor).

I looked at my Google Analytics data to see where my visitors are. Whether good or bad, they are all over the world (hello!).

Other data is available in Google Analytics. I can see that traffic has grown over the last few years and I am getting more returning visitors; now is the time to ensure my site is ready for more traffic and returning visitors.

Note: The fall in traffic in the Audience overview (right-hand side of the left image) is the unfinished month (not a reader fall off).

Personally, I set a goal to bring my high page bounce rate of 90% way lower (at present it is about 80% and falling, which is good), and my page read time has gone from 40 seconds to 1 minute 40 seconds. Every bit you can do will help create a positive experience and help your visitors. I can see from the data above that the content is being read, I am building returning visitors, and they are geographically spread out. A CDN will be great. After you know where your visitors are, it is good to know the times of day that they hit your site. Lucky for me it is spread evenly over a 24-hour period.

Before I started to optimize my WordPress site (hosted on a shared cPanel server) I had the following Web Page Test score. I tested from Singapore, as that was where my server was originally (and the closest location to me).

My initial scores were bad across the range of tests (before optimisations). On the upside, I was manually compressing images with a desktop tool before uploading them, and this showed an “A”, but the scorecard overall was really bad.

Here are the results after quick optimizations (EWWW.io image compression, moving the server, reorganizing the site and lazy-loading images).

Now I get these results after speeding up my site (after using the EWWW.io image resizing, reorganizing the site, minifying, lazy load images, moving servers etc.).

Even at 4 seconds, the web page “First Byte” time is considered not good. My brain says I want sub-1-second, but I doubt this is achievable with WordPress served from thousands of miles away over SSL (read here about scalability).

I know HTTPS and geographically unfavourable servers add about half a second. SSL adds processing overhead and latency. If you only want speed, don’t set up SSL, but if you want SEO and security then set up SSL.

WebPageTest.org test reveals there is no effective use of Content Delivery Networks (CDN) on fearby.com (that’s why I am about to install EWWW.io ExactDN).

ExactDN (Content Delivery Network)

I did try to set up a number of caching and CDN plugins in the past (e.g. MaxCDN, W3 Total Cache, WP Fastest Cache, Cache Enabler, WP Rocket, WP Super Cache, etc.) but they either made results worse or were impossible to set up.

Now that EWWW.io has a CDN let’s give that a go.

What is EWWW.io’s ExactDN?

You can read more about EWWW.io’s two-pronged approach to delivering one plugin to A) “compress images” and B) “add a Content Delivery Network (CDN)” here: https://ewww.io/resize/

Now you can go back to the EWWW.io ExactDN product page (https://ewww.io/resize/) and purchase a subscription (Make sure you are logged in with an EWWW.io account before you purchase ExactDN).

I purchased an ExactDN monthly subscription for $9.00 monthly (with a $1 signup fee for the cloud compression service).

Purchase confirmation screen.

Post-purchase, I was advised to find my “Site Address (URL)” in WordPress and add it to EWWW.io Manage Sites screen.

I now noticed I had a CDN for my domain ( fearby-com.exactdn.com ). Nice.

EWWW.io said to tick the CDN option in the Image Resize area of the EWWW.io plugin. But before I do that, I will update WordPress core and the WordPress plugins, as they are out of date.

I checked the DNS replication for fearby-com.exactdn.com with this DNS checker. It’s setup and ready to go across the globe 🙂

A quick CNAME check reveals its upstream provider is fearbycom-8ba8.kxcdn.com. KeyCDN has been around since 2012. It is great that EWWW.io has partnered with KeyCDN and included their image compression magic in a single WordPress plugin.

Configuring the EWWW.io plugin in WordPress.

Before you enable the CDN below, do check https://www.whatsmydns.net/ to see whether your CDN hostname has propagated around the world (Australia can be a bit slow at DNS propagation from time to time). Wait at least 10 minutes before proceeding.

FYI: I have always used Web Page Test from Singapore and I have done the same here to compare apples to apples. Singapore is not an ideal test location for a server in Australia (mine is): it is about 200 ms away, and the added layer of SSL adds latency to the connection (read here). It is a real-world (worst-case) test though.

FYI: https://www.webpagetest.org wants all other static content to be on a CDN before it gives a score higher than 42 out of 100; sorry, I did not capture a subscore before enabling the CDN.

I was expecting a green A here for CDN but Webpage Test has given me more items to investigate to ensure WordPress plugins are also fast (minify or move to CDN). It appears WordPress includes are my next target for optimizing.

Web Page Test indicates the following WordPress assets should also be in CDN.

It would be a huge effort to track and keep static plugin- and theme-related files in a CDN. I’ll ask the EWWW.io developer whether this is possible in a future version (that would be nice). The developer did promptly point me here to opt in to using the CDN to deliver CSS, JS etc.

Is that it? It can’t be that simple!

I can load my site at https://fearby.com and my images load from https://fearby-com.exactdn.com 🙂 I am impressed, blog posts are now loading images from the CDN network and I did not have to edit posts. I did have to subscribe to ExactDN and tick a checkbox in WordPress though.

Effective Use of CDN: X (I should have captured the subscores). Plugins now need to be on a CDN too.

The Web Page Test site gives detailed scores and recommendations if you scroll down or click a score card (A to F). Do read the recommendations to see what you may need to do next. I am happy that I now have a CDN via EWWW.io. Clicking the First Byte and CDN buttons at Web Page Test reveals sub-scores that let you see whether you have made improvements; regrettably, I did not know about or capture the sub-scores until after installing the CDN (I suggest you capture them first).

https://gtmetrix.com is giving a good score across the board (86%). It hints I should optimize JavaScript files and “Remove query strings from static files”, as some proxy servers do not cache URLs with “?” in them. That is (fortunately) not a problem with ExactDN, as the servers are configured to properly handle query strings.

Gtmetrix.com does give some optimisation tips too (however, it reports a low CDN score if even a single file is not delivered over a CDN).

I do like Gtmetrix.com email reports; you can see if your site performance is degrading.

Page Speed Insights

I am now looking at the Google PageSpeed Insights test for things to fix next. I think I can tweak my NGINX a little (adding caching). Read more about Google Page Speed Insights here.
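The NGINX caching tweak I have in mind is browser-cache (expires) headers for static assets. A sketch only; the extensions and the 30-day age are assumptions to adjust for your own site (it goes inside the site’s `server` block):

```nginx
# Let browsers cache static assets for 30 days
location ~* \.(css|js|png|jpe?g|gif|svg|ico|woff2?)$ {
    expires 30d;
    add_header Cache-Control "public, no-transform";
}
```

Long cache ages mean returning visitors skip re-downloading unchanged assets entirely, which is exactly what PageSpeed Insights’ “leverage browser caching” hint is asking for.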

FYI: My Google Page Speed Insights score (desktop) along with another local big corporate site I tested.

I refreshed my site in a browser and now CSS, JS and fonts are loaded from the CDN too.

To reduce files served from my website I re-enabled the Fast Velocity Minify plugin in WordPress and pointed it at my CDN (https://fearby-com.exactdn.com/)

Now I am getting a GTmetrix Page Speed Score of 93% (A) and a YSlow Score of 74% (C), with a 2.6-second load and a much lower 35 requests (post-minification and after pointing the minified files at the CDN).

I think I’m done (the ExactDN CDN, the image optimiser and the other tweaks have worked their magic).

How do I compare to other sites?

Given Apple, Microsoft and NBC speed scores are worse than mine I’m happy for now. 🙂

GTmetrix Graphs

GTmetrix graphs show the improvement. The YSlow report does indicate I could reduce the website’s DOM elements to speed things up and tweak plugins, but I’d rather keep my design for SEO and not play with plugins in case they break.

I could tweak the server side (NGINX/MySQL/cache or DNS) but I don’t need to.

Conclusion

I still can’t believe setting up a CDN is a one-click solution plus a wp-config.php line (after you subscribe). Best of all, the BJ Lazy Load plugin still works. I am very happy with EWWW.io ExactDN. I now have a CDN and it has lowered my First Byte time, Start Render time and Speed Index (on more than just the front page), and all I did was subscribe and tick a checkbox.

It is nice that EWWW.io has not charged for data delivered from the CDN network on top of the monthly subscription at this stage, although they have left the option open in the terms and conditions. It is important that a service can sustain itself, so this is a good sign. FYI: my favorite Agile toolkit (Atlaz.io), which I reviewed here, closed shop today, so it is wise for as-a-service shops to plan ahead. I know how hard it can be to spin up servers and allocate time to keep them running. Good luck EWWW.io!

Also, I now have a clear set of steps (below) to resolve other non-optimized assets that are outside of the CDN.

View the Vultr pricing calculator here. Vultr does say that you can resize your block storage volume, but there are manual actions and risks involved, so get the space you need early on and avoid resizing later.

Darn, I can’t choose Sydney yet as a location to create a block storage volume (I have asked Vultr when we can), so I’ll continue this guide with my existing (free) 50 GB volume in New Jersey and mount it in a server in NY/NJ (and later Sydney).

Vultr says “Block storage is connected to your server as /dev/vdb. We do not create any filesystems on it by default.” Official block storage documentation is located here.

5. Run the commands listed in the Block Storage screen (above)

Error: In my case, the echo command failed to add the configuration to the /etc/fstab file (even with sudo), and the mount command failed:

mount: can't find /mnt/blockstorage in /etc/fstab

I checked the /etc/fstab file contents

sudo cat /etc/fstab
# ..missing mount commands from Vultr..

I manually edited the /etc/fstab file and added the mount point configuration as suggested by Vultr.

sudo nano /etc/fstab

Contents

# /etc/fstab: static file system information.
#
# Use 'blkid' to print the universally unique identifier for a
# device; this may be used with UUID= as a more robust way to name devices
# that works even if disks are added and removed. See fstab(5).
#
# <file system> <mount point> <type> <options> <dump> <pass>
# / was on /dev/vda1 during installation
UUID=removedGUID / ext4 errors=remount-ro 0 1
/dev/fd0 /media/floppy0 auto rw,user,noauto,exec,utf8 0 0
/dev/vdb1 /mnt/blockstorage ext4 defaults,noatime 0 0
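Incidentally, the reason the quoted "echo ... >> /etc/fstab" failed even with sudo is that the >> redirection is performed by your unprivileged shell before sudo ever runs, so the write is denied. Piping through "sudo tee -a" appends with root privileges instead. A sketch, demonstrated on a scratch file (for the real thing, replace ./fstab.demo with /etc/fstab and prefix tee with sudo):

```shell
# Append an fstab entry with tee -a instead of shell redirection.
FSTAB=./fstab.demo
: > "$FSTAB"   # start with an empty scratch file for the demo
echo '/dev/vdb1 /mnt/blockstorage ext4 defaults,noatime 0 0' | tee -a "$FSTAB" > /dev/null
cat "$FSTAB"
```

After appending the real entry, `sudo mount -a` should mount the volume without the “can't find /mnt/blockstorage in /etc/fstab” error.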
