How I Would Solve Plugin Dependencies (24 May 2015)

lol, I wouldn’t¹.

WordPress isn’t (and will never be) Linux

ZYpp, the dependency solver used by openSUSE (its SAT solver was also ported to PHP for Composer), was born of the need to untangle complex dependency trees. The good news is that WordPress doesn’t have the same problem, and we shouldn’t create that problem for ourselves.

One of the most common-yet-complex issues is deciding how to handle different version requirements from different packages. If My Amazing Plugin requires WP-API 1.9, but Your Wonderful Plugin requires WP-API 2.0, we have a problem. There are two established ways to solve it – Windows does so by installing multiple versions of the dependency side by side, and loading the correct version for each package. This isn’t a particularly viable option in PHP, because trying to load two different versions of the same code in the same process is not my idea of a fun time.

The second option, the one ZYpp implements, is to try to find a mutually compatible version of the dependency that every plugin can use. The biggest problem with this method is that it can’t always find a solution: if there’s no compatible combination of versions to install, it has to throw the decision back to the user. That isn’t viable either, as 99.999% of users wouldn’t be able to tell the difference between WP-API versions 1.9 and 2.0, nor should they have to.

But there’s a third option.

Technical Debt as a Service

Code libraries are, by their nature, developer facing. A user never really needs to know that they exist, in the same way that they don’t need to know about WP_Query. In WordPress Core, we strive for (and often achieve) 100% backwards compatibility between major versions. If we were going to implement plugin dependencies, I would make it a requirement that the code libraries shoulder the same burden: don’t make a user choose between upgrades, just always keep the code backwards compatible. If you need to make architectural changes, include a backwards compatible shim to keep things working nicely.

This intentionally moves the burden of upgrading to the developer, rather than the end user.

What Version?

If we’re going to require library developers to maintain backwards compatibility, we can do away with version requirements (and, with them, the need for a dependency solver). If a plugin needs a library, it can simply specify the library’s slug.
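No such mechanism exists in WordPress today, but under this proposal a plugin could declare its dependencies with nothing more than a header – the “Requires” header name and the wp-api slug below are both invented for illustration:

```php
<?php
/**
 * Plugin Name: My Amazing Plugin
 * Requires:    wp-api
 */
```

Because there are no version constraints to satisfy, the installer would only ever need to fetch the latest version of each listed library.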

Better Living Through Auto Updates

If we were to implement plugin dependencies, I think it’d be a great place to introduce auto updates enabled by default. There’s no existing architecture for us to take into account, so we can build it around current WordPress best practices. On top of that, it’s a step towards enabling auto updates for all Core releases, and it encourages developers to keep their libraries backwards compatible, because a library will almost certainly be updated before the plugins that use it.

Let’s Wrap This Up

I’m still not convinced that plugin dependencies are a good thing to put in Core – they introduce significant complexity to plugin updates, as well as adding another Core dependency on WordPress.org. But it’s definitely a conversation worth having.

Indoor Skydiving (23 January 2015)

Today I went skydiving, whilst barely leaving the ground. It was a bunch of fun.

<3, Gary.

JSON Encoding in WordPress 4.1 (18 December 2014)

Earlier in the year, we noticed a problem with JSON in WordPress. The JSON spec very explicitly notes that it only supports UTF-8, whereas WordPress can use any character set that MySQL supports. So, for sites that didn’t use MySQL’s utf8 or utf8mb4 character sets, this generally presented itself as json_encode() returning false, which resulted in either invalid JSON being returned from an Ajax request, or a JavaScript error in some embedded code.

To fix this, WordPress 4.1 includes a shiny new function, wp_json_encode(), which we recommend for all plugins and themes:
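A minimal usage sketch – this assumes WordPress 4.1+ is loaded, and $content stands in for a string fetched from the database:

```php
// $content may come from a database using a non-UTF-8 character set,
// which can make a plain json_encode() call return false.
$data = array(
	'title'   => 'Hello, world',
	'content' => $content,
);

// Same arguments and return values as json_encode().
echo wp_json_encode( $data );
```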

Usage of wp_json_encode() is identical to json_encode(). It works by trying json_encode() first, then checking whether the data encoded properly. If it failed, wp_json_encode() will walk through whatever lump of data you passed to it, convert it to UTF-8, and then return it as JSON.

Have fun with WordPress 4.1, and see you next year for new and exciting functionality coming to a WordPress install near you!

Let’s have a chat about Reddit (10 September 2014)

Before I start, I should warn you that I’ll be commenting on some of the awful things that Reddit implicitly condones, which include sex crimes, animal abuse, and what can euphemistically be described as “disrespectful” behaviour towards the dead. I know these topics can be traumatic for people, so if you’d prefer to avoid reading about them, please close this window.

(For reference, when I say “Reddit” here, I’m referring to the owners and administrators, not the vast majority of users. I’ll also be naming example subreddits – though not linking to them – which I strongly advise you not to visit.)

Many of you will be aware of “the fappening” that occurred a couple of weeks ago – this is the name for the mass leak of celebrities’ private photos to the internet. While the photos were hosted on various sites around the internet, it was primarily Reddit.com that was a focal point for dissemination of these photos, on the /r/TheFappening subreddit (“subreddit” is the term that Reddit uses for “sub forum”).

Reddit’s initial response to this was minimal. They were clearly aware of it – they apparently set new traffic records thanks to people trying to view these photos – but they chose to do nothing about it. This has been Reddit’s modus operandi for its entire history: hiding behind weak “freedom of speech” arguments until the pressure becomes too much. For example, they refused to delete /r/underage (a child porn subreddit) until massive public outcry forced their hand. It wasn’t until it became apparent that some of the photos in “the fappening” were of underage celebrities that they actually started to delete them en masse, having previously chosen to force the victims to submit DMCA requests to have them taken down. The DMCA is a fairly useless tool in this respect, as Reddit users would simply re-upload the photos to a new location, and re-post them.

In this case (unlike many other cases), the victims of “the fappening” are actually able to take action, as they can afford lawyers to force Reddit’s hand. Victims of /r/PhotoPlunder, a revenge porn subreddit, often cannot afford lawyers, or aren’t even aware that their private photos have been posted to a public forum. There are subreddits like /r/PicsOfDeadKids and /r/CuteFemaleCorpses, which, while not containing any illegal content, would clearly cause trauma for friends or families of the deceased.

Then there’s the plainly illegal content, such as /r/SexWithDogs and /r/BeatingWomen2, which contain exactly the content their names describe. Despite being made aware of such subreddits many times, Reddit’s leadership refuses to act on them.

So, what can you do?

If you’re a celebrity (and I know my little blog has many celebrity readers!), it’s likely that you’ve been approached by Reddit, or your publicist, to do an AMA (“Ask Me Anything”) thread. Until they change their policies, I’d encourage you to refuse all such requests. If Reddit will happily exploit your privacy for a few extra page views, why should you legitimize them?

If you’re a person who contributes to one of the many legitimate subreddits, perhaps it’s time to look for a new home for your contributions. The StackExchange network has many sites (both technical and non-technical) that reward experts for their contributions. If you’re after a creative outlet, there are many communities built up around writing, art, music, and every other creative activity you can think of! (If you have a favourite, post it in the comments!)

And if you just read Reddit for the entertainment on the front page, perhaps it’s time to reconsider supporting them with your clicks and page views. I promise you, there are many other sites that are willing to provide you with entertaining cat videos, without also implicitly supporting illegal or abusive behaviour. (For example, Fark have excellent policies, not allowing abusive or illegal content.)

Finally, remember that Reddit won’t be around forever. Just like Digg before them, something else will come up and become the new place for pop culture on the internet. With a bit of luck, you could be one of the people who get to influence the next big thing for the better.

The Next Adventure (29 August 2014)

Over my past few years at Automattic, I’ve worked on a bunch of different teams and projects – VideoPress, the WordPress iOS app, various social projects, and most recently, o2. I even took a few months to work on WordPress core, helping build the auto-update functionality that we now see rolling out security updates within hours of their release.

The few months I spent working on WordPress core made me realise something – there’s a lot more I have to contribute. So, with the WordPress 4.0 RC out the door, I’m super excited to be moving to my next project – working on WordPress core full time!

Automattic naturally puts a lot of people-hours into WordPress, with over 30 of us contributing to WordPress 3.9. I’m looking forward to being a bigger part of that, and giving more back to the WordPress community!

My Media Server (25 August 2014)

Over the years, I’ve experimented with a bunch of different methods for media servers, and I think I’ve finally arrived at something that works well for me. Here are the details:

The Server

An old Dell Zino HD I had lying around, running Windows 7. Pretty much any machine will do; this is just the one I had available. Dell doesn’t sell micro PCs anymore, so choose your favourite brand that sells something small with low power requirements. The main things you need are a reasonable processor (fast enough to transcode a few video streams at least in realtime) and lots of hard drive space. I don’t bother with RAID, because I won’t be sad about losing videos that I can easily re-download (the internet is my backup service).

Downloading

I make no excuses, nor apologies for downloading movies and TV shows in a manner that some may describe as involving “copyright violation”.

If you’re in a similar position, there are plenty of BitTorrent sites that allow you to register and add videos to a personal RSS feed. Most BitTorrent clients can then subscribe to that feed, and automatically download anything added to it. Some sites even allow you to subscribe to collections, so you can subscribe to a TV show at the start of the season, and automatically get new episodes as soon as they arrive.

For your BitTorrent client, there are two features you need: the ability to subscribe to an RSS feed, and the ability to automatically run a command when the download finishes. I’ve found qBittorrent to be a good option for this.

Sorting

Once a file is downloaded, you need to sort it. By using a standard file layout, you’ll have a much easier time loading your files into your media server later. For automatically sorting files as they download, nothing compares to the amazing FileBot, which will automatically grab information about the download from TheMovieDB or TheTVDB, and pass it on to your media server. It’s entirely scriptable, but you don’t need to worry about that, because there’s already a great script that does all of this for you, called Automated Media Center (AMC). The initial setup was a bit annoying, so here’s the command I use (tweak the file locations for your own server, and fix the %n if you use something other than qBittorrent):
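A typical AMC invocation, run from qBittorrent’s “run external program on torrent completion” setting, looks something like the sketch below – the output path, the Plex host, and the %f/%n placeholders are illustrative, so adjust them for your own setup:

```shell
# %f = content path, %n = torrent name (qBittorrent placeholders).
# "plex=localhost" tells AMC to refresh the Plex library when it's done.
filebot -script fn:amc --output "D:/Media" --action move -non-strict \
    --def "plex=localhost" "ut_dir=%f" "ut_kind=multi" "ut_title=%n"
```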

Watching

Plex is the answer to the obvious next question: how do you watch it all? It looks great, it’ll automatically download extra information about your media, and it has really nice mobile apps for remote control. Extra features include offline syncing to your mobile device, so you can take your media with you when you’re flying, and Chromecast support, so you can watch everything on your TV.

The FileBot command above will automatically tell Plex that a new file has arrived, which is great if you choose to store your media on a NAS (Plex may not be able to watch a NAS directory for newly added files).

Backup

Having a local server is great for keeping a local backup of things that do matter – your photos and documents, for example. I use CrashPlan to sync my most important things to my server, so I have a copy immediately available if my laptop dies. I also use CrashPlan’s remote backup service to keep an offsite backup of everything.

Conclusion

While I’ve enjoyed figuring out how to get this all working smoothly, I’d love to be able to pay a monthly fee for an Rdio or Spotify style service, where I get the latest movies and TV shows as soon as they’re available. If you’re wondering what your next startup should be, get onto that.

WordPress and UTF-8 (7 April 2014)

For many years, MySQL’s utf8 character set supported only a small part of Unicode: plane 0, commonly referred to as the “Basic Multilingual Plane”, or BMP. Unicode is divided into “planes”, and plane 0 contains the most commonly used characters. For a long time, this was reasonably sufficient for MySQL’s purposes, and WordPress made do with this limitation.

It has always been possible to store the raw bytes of any UTF-8 character in the latin1 character set, though latin1 has shortcomings. While it recognises the connection between upper- and lower-case characters in Latin alphabets (such as English, French and German), it doesn’t recognise the same connection for other alphabets. For example, it doesn’t know that ‘Ω’ and ‘ω’ are the upper- and lower-case versions of the Greek letter omega. This creates problems for searching text, where you generally want to match characters regardless of their case.

With the release of MySQL 5.5, however, the utf8mb4 character set was added, and a whole new world opened up. Plane 1 contains many practical characters from historic scripts, musical notation and mathematical symbols. It also contains fun characters, such as Emoji and various game symbols. Plane 2 is dedicated to CJK ideographs, an attempt to create a common library of Chinese, Japanese and Korean characters.

For many websites, being able to use Emoji without installing an extra plugin is an excellent reason to switch your WordPress database to utf8mb4, but unfortunately it’s not quite that simple. MySQL still has a few more limitations that cause problems with utf8mb4.

Without further ado, here’s how to configure MySQL, so that WordPress can use utf8mb4. If you don’t have the ability to configure your MySQL server directly, you should speak to your host. If they don’t want to, it’s probably time to look for a new host.

Upgrade MySQL

You need to be running MySQL 5.5.14 or higher. If you’re not already on at least MySQL 5.5 (ideally 5.6), you should be anyway, as newer versions provide significant performance and stability improvements. For help with upgrading MySQL, check out the MySQL manual.

Configure MySQL

Before we convert your tables, we need to configure MySQL correctly.

In your my.cnf file, add the following settings to the [mysqld] section. Remember to double-check that you’re not duplicating settings in your my.cnf file – if any of these options are already set to something different, change the existing setting rather than adding a new one.
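A minimal set of settings, assuming MySQL 5.5/5.6 – the server-wide character set defaults, plus the InnoDB options that utf8mb4 needs in order to index WordPress’s longer key columns:

```ini
[mysqld]
character-set-server = utf8mb4
collation-server     = utf8mb4_unicode_ci

# Required for indexes longer than 767 bytes on utf8mb4 columns.
innodb_file_per_table = 1
innodb_file_format    = Barracuda
innodb_large_prefix   = 1
```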

Use InnoDB

You’ll need to run these two queries for each WordPress table; I’ve used wp_posts as an example. This is a bit tedious, but the good news is that you’ll only ever need to run them once. A word of warning: be prepared for some downtime if you have particularly large tables.
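A sketch of the two queries, assuming the default wp_ table prefix – the first moves the table to InnoDB, the second converts its contents to utf8mb4:

```sql
ALTER TABLE wp_posts ENGINE = InnoDB;
ALTER TABLE wp_posts CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
```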

Configure WordPress

Finally, you can tell WordPress to use utf8mb4 by changing your DB_CHARSET setting in your wp-config.php file.

define( 'DB_CHARSET', 'utf8mb4' );

And there you have it. I know, it’s not pretty. I’d really like to add this to WordPress core so you don’t need to go through the hassle, but currently only a very small percentage of WordPress sites are using MySQL 5.5+ and InnoDB – to justify it, we need to see lots of sites upgrading! You can head on over to the core ticket for further reading, too – log in and click the star at the top to show your support; there’s no need to post “+1” comments.

PayPal is Still Bad at Account Security (3 April 2014)

A couple of months ago, following the news of PayPal being partially responsible for a person’s identity theft, I activated two-factor authentication (2FA) on my PayPal account. First up, I was fairly unimpressed with their configuration options. To use 2FA, my options were to buy a dongle that generates the security codes, or to have the codes SMSed to me. Neither of these is particularly good – I don’t want to pay for and carry around a dongle everywhere, and SMS isn’t a secure channel, as SIM cards can be cloned or hijacked. If someone really wanted to get into my account, this wouldn’t present much of a barrier.

Then there’s the login process. For some reason, PayPal doesn’t automatically send me an SMS; I need to click an extra button for that while logging in. This isn’t so much a security problem as it is weird UX. Also, the Android app doesn’t support 2FA, so I can’t use it at all.

The real fun started last night, however. I tried to login to my PayPal account, and was prompted to enter my security code. No problem, I clicked the Send SMS button, and waited. And waited. I clicked it again. Waited. Tried to login again, and repeated the process a few times. No luck.

Okay, so their SMS service was having issues. Apart from its security problems, SMS is also a notoriously unreliable protocol, regularly causing problems exactly like this. While I was pondering this, I noticed there was an option to bypass the 2FA. I clicked the button, and was prompted to answer my two security questions: my favourite author, and my favourite movie. Unfortunately, I’d set these questions 10 years ago when I first created my PayPal account, and never thought about them since. It turns out that 22-year-old me had very different taste in film and literature to 32-year-old me, and I had no idea what the answers were. Defeated, I went to bed.

This morning, I decided to try again, with the same result. This time, I called their customer support centre, to see if they could at least give me an update on when SMSes would be working again. Unfortunately, the customer support representative wasn’t familiar with how PayPal’s 2FA worked, so after a bit of back-and-forth explaining the situation, the CSR said they’d “reset my account” (I don’t know what this means), and that it should be working again in 15 minutes.

Half an hour later, it still wasn’t working, so I called back. Fortunately, this CSR was aware of the SMS issues they were having, and was able to fill me in. Unfortunately, it seems PayPal hadn’t really thought through the implications of their policy for this situation, as he immediately offered to disable 2FA on my account for me.

I’ll just let that sink in for a moment. At this point, I’d only loosely identified myself – I had an identity code from the PayPal support site, that I was able to get with just my username and password. The support systems probably showed my current phone number as matching my 2FA phone number, but they shouldn’t be relying on that at all – the source phone number can be easily spoofed, Skype even offers this as a service.

Sadly, it’s clearly evident that PayPal’s 2FA is broken in a bunch of different ways. You can still keep your account secure by choosing a strong password, and by making sure you only log in to your PayPal account on devices you trust.

—

Even if PayPal are in no hurry to mend their ways, here are some things developers can do to make sure their own 2FA systems are secure:

Don’t offer SMS as the only option. SMS-based 2FA is okay for guarding against mass account hijacking, but cannot prevent a targeted attack. As we’ve seen, it’s also wildly unreliable.

Use a standard method for generating your 2FA codes, such as TOTP (RFC 6238), which is used by a bunch of different websites, including Google and WordPress.com.
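As a sketch of what RFC 6238 specifies – a standard HMAC-SHA1 TOTP, written here in plain PHP for illustration rather than taken from any production system:

```php
<?php
// TOTP (RFC 6238): HMAC-SHA1 over a time-step counter, dynamically truncated.
function totp( $secret, $time, $step = 30, $digits = 6 ) {
	// 64-bit big-endian counter: number of steps since the Unix epoch.
	$counter = pack( 'N2', 0, (int) floor( $time / $step ) );
	$hash    = hash_hmac( 'sha1', $counter, $secret, true );
	$offset  = ord( $hash[19] ) & 0x0F; // dynamic truncation (RFC 4226)
	$code    = unpack( 'N', substr( $hash, $offset, 4 ) )[1] & 0x7FFFFFFF;
	return str_pad( (string) ( $code % pow( 10, $digits ) ), $digits, '0', STR_PAD_LEFT );
}

// RFC 6238 Appendix B test vector: key "12345678901234567890", T = 59, 8 digits.
echo totp( '12345678901234567890', 59, 30, 8 ); // 94287082
```

Both ends only need the shared secret and a reasonably accurate clock, which is why the same codes work across so many different sites and authenticator apps.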

Make your 2FA system as easy to use as possible – your users should want to use it, because it doesn’t get in their way, but makes their account safe.

Teach your support reps the 2FA mantra: “Something you have, and something you know”. In the case of PayPal, they’d already confirmed something I know (my password), so they could’ve easily confirmed something I have, like my ID or my credit card.

If you’re going to use security questions, prompt your users to re-enter them occasionally, so they don’t forget.