BADCamp holds a special place in the history of weKnow. Kenny Abarca and Enzo Garcia, co-founders of the company, had their first foray into the Drupal community at BADCamp seven years ago. Enzo and Jesús met at BADCamp, and that meeting changed the trajectory of the Drupal Console as we know it. So this time around, Jesús and Omar had the ‘grueling’ task of representing the weKnow team in the Bay Area. We highly recommend placing BADCamp on your bucket list; there are no words to describe the atmosphere, maybe … ‘the Woodstock of the Drupal calendar’?

So here are our highlights after interviewing Jesús and Omar. Besides the social events, BADCamp offers a lot of insight into innovations and the general direction of the Drupal project.

Omar goes to BADCAMP 2017

Omar noticed that the camp focused on, and was split between, two themes currently abuzz in the Drupal kingdom: theming and DevOps. For theming, he was quick to point out how Twig has continued to make a massive impact on the frontend sector.

“The arrival of template engines like Twig has helped us avoid ‘drupalisms’ in how we create themes for Drupal, which is a great step for productivity during development. Theming now focuses on components, allowing us to work on the theming layer independently of the Drupal backend and site building, without having to think from an integrated-architecture perspective. The complexity Drupal theming presented to large groups of frontend developers without Drupal knowledge was one of the main reasons to adopt a decoupled architecture in Drupal 8; it’s definitely one of the best things Drupal 8 has to offer,” Omar added.

According to Omar, discussions around the DevOps subject seemed to revolve mostly around enterprise development needs. This ranged from workflow to testing and using bots to automate the quality assurance.

Jesús goes to BADCAMP 2017

Jesús took the opportunity to give a talk on creating, building, testing and deploying Drupal 8. He shared tools that are helpful in the process of setting up the local development environment and how to make use of them for a successful deployment.

“Everything is Docker!”

That was Jesús’s simple reply to the question “What were your BADCamp highlights?”

Jesús added that seemingly every Drupal developer was attracted to Docker’s flexibility, and that Docker was quickly becoming the preferred setup for local development environments. He based his observation on the emergence of a myriad of tools that simplify Docker setup, such as Lando, Docksal and DrupalVM.

“This is a natural evolution when you realize that creating and building a Drupal 8 site is more complex than in previous versions. In order to start working on a Drupal 8 site, your local environment requires some tools, such as a package manager like Composer for handling the site dependencies. Something similar is happening in theming, since you will probably be using gulp or npm,” Jesús added.

Serverless + FaaS

As Jesús has come to learn, a lot of the nuggets of innovation are hidden deep in the hallways of Drupal events. One such nugget was acquired in a conversation with Thom Toogood from Australia. Thom introduced him to ‘Composer as a Service’, a project he is working on built on an open source framework that can run any CLI-driven binary program embedded in a Docker container, making it a Function as a Service.

GatsbyJS Drupal Plugin

One of the main highlights for Jesús came in the last session he attended, about a static site generator called GatsbyJS. It’s based on ReactJS and packs some awesome benefits.

Most Interesting Sessions of BADCAMP 2017

The weKnow team is excited to announce that we’ll be attending BADCamp October 18-21. BADCamp is a celebration of open source software and one of the most prominent events in the Drupal universe. We take great pride in our track record of giving back to the open source community, so we are also happy to announce that Jesús and Omar will be holding a hands-on training session on Drupal 8 module development using Drupal Console.

Recently, I had to build a Drupal site that had to be fast for both registered and non-registered users. One of the most frequently used modules for caching, Boost, can offer caching for unregistered users only. But what can we do for the rest of our visitors? Why can't we offer them more speed?

The solution

The module I used is authcache, and it gives us a plain and straightforward configuration that can be used with or without advanced caching server software. For example, even if our web server has no Varnish, opcache, memcache, etc., the module delivers a very satisfying out-of-the-box result, providing fast page surfing for our registered visitors.

In a nutshell, the module is similar to the well-known Boost. The first time a page is requested, an HTML version of it is stored for future requests. Each user role has its own HTML version of the requested page. Once the cached page is stored, whenever it is served as a static page, an additional lightweight AJAX request is made to the server to update the usage statistics of the requested page; in response, the module returns personalized data for the page, such as FAPI tokens and FAPI default values.

And where are the cached files?

Authcache gives us the ability to expand its functionality with extra features that allow us to use any advanced caching software the server may have. By adding some contrib modules, we can store the cached content in files, Redis, Memcache, MongoDB, etc. By default, the module stores the content in our database. That may be a solid approach, but we suggest using the File Cache module so that the content is stored in the sites/default/files folder. With that approach, MySQL stores only the URL and the age of each stored page, which keeps our database cleaner and easier to maintain. Like many caching modules, authcache also has configurable settings such as minimum/maximum lifetime, cache update triggers and debugging modes.
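As a rough sketch, wiring a file-based cache backend into settings.php looks something like the following. The class, variable names and module path below are taken from the File Cache module's conventions and should be double-checked against its README for your version; the cache bin authcache uses may also differ:

```php
// settings.php -- hypothetical sketch; verify names against the
// File Cache (filecache) module's README before using.

// Register the File Cache backend (path assumes the module lives
// in sites/all/modules/filecache).
$conf['cache_backends'][] = 'sites/all/modules/filecache/filecache.inc';

// Store cached pages on disk instead of in MySQL.
$conf['cache_class_cache_page'] = 'DrupalFileCache';

// Optional: where the cached files are written (assumed variable name).
$conf['filecache_directory'] = 'sites/default/files/filecache';
```

With this in place the database only tracks cache metadata, while the heavy cached markup lives on the filesystem.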

What do the numbers say?

If a registered user requests a page that has 2 views, 3 blocks, 2 menus and some more "plain" content, the average rendering time is ~650ms. That means a registered user has to wait ~650ms after clicking before the server can send the first byte of our page. With authcache and File Cache, we got down to ~48ms!

These numbers were recorded on a 64-bit Windows 7 laptop with an SSD and plenty of memory (8GB). You are never going to host your site on something like that, but you get the picture. And if you use any of the contrib modules with the appropriate server software, you WILL get even better response times.

If you use authcache, Boost, Block Cache Alter and the native caching of Drupal and Views, you will turn your sluggish Drupal installation into a rocket!

There will be times when you need your app to know the visitor's IP, whether for debugging or for functionality purposes.

Possible workarounds

If you are a newbie, you will probably go for something like this:

if ($_SERVER['REMOTE_ADDR'] == '1.2.3.4') { /* Your code here */ }

If you are a bit more experienced, you'll do something like this:

if (ip_address() == '1.2.3.4') { /* Your code here */ }

The problem

Yes! That would do the job all right. But what will happen if your website is behind a reverse proxy, such as Varnish or Nginx? Or even behind advanced caching and proxy services like Cloudflare or Akamai? And what would you do if the server is not yours and you cannot install mod_rpaf on it? And it's 03:00 in the morning and your server administrator is asleep? None of the above approaches will reveal your visitor's IP address!

The solution

But guess what! Drupal has a solution for that too.

The solution lies in the sites/default/settings.php file, in the reverse proxy configuration section. All you need to do is follow these steps:

1. Uncomment the line $conf['reverse_proxy'] = TRUE;

2. Put the known reverse proxy IPs in the following array. In the case of Varnish or Nginx, the IP would normally be the server's own IP. In the case of Cloudflare or any other service, you will need the IPs that the service uses; store them in the array: $conf['reverse_proxy_addresses'] = array('a.b.c.d', ...);

3. Store the header that the proxy uses to carry the visitor's IP. By default it is HTTP_X_FORWARDED_FOR, but if it isn't, you can easily find it by executing the following:

print '<pre>'; print_r($_SERVER); print '</pre>';

Find your IP in the output, and set the corresponding header in the variable: $conf['reverse_proxy_header'] = 'HTTP_X_CLUSTER_CLIENT_IP';

Follow the steps above, and the ip_address() function will always return your visitor's IP address.
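Putting the three steps together, the relevant portion of sites/default/settings.php ends up looking roughly like this (the proxy address below is a placeholder; substitute your own values):

```php
// sites/default/settings.php (Drupal 7) -- reverse proxy support.

// Tell Drupal it sits behind a reverse proxy.
$conf['reverse_proxy'] = TRUE;

// IPs of the trusted proxies: the server's own IP for a local
// Varnish/Nginx, or the published ranges of a service like Cloudflare.
$conf['reverse_proxy_addresses'] = array('127.0.0.1');

// Header carrying the real client IP. HTTP_X_FORWARDED_FOR is the
// default; only set this if your proxy uses a different header.
$conf['reverse_proxy_header'] = 'HTTP_X_FORWARDED_FOR';
```

Once these are set, ip_address() resolves the client IP from the trusted header instead of returning the proxy's address.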

Earlier today, Propeople (together with our parent company, Intellecta) made two big announcements that I am really excited about. The first is that Blink Reaction is joining our larger Intellecta family, and will be merging with Propeople - creating the largest professional Drupal services company in the world! The second piece of news is that Propeople is also merging with Bysted, one of Denmark’s foremost creative agencies. Together, these two deals are strategic in starting an exciting new chapter for Propeople.

Intellecta’s purchase of an 80 percent stake in Blink Reaction also means that Blink Reaction will be operating under my direct management. Our companies will initially operate as independent units but will join together as a new company under a unified brand by the deal’s completion on March 31st, 2015. This new agency will have a truly unique international reach - with 350+ employees worldwide, working across 9 countries. The new agency’s global footprint and working capacity will be unmatched in the Drupal space.

It is an honor to lead the start of a whole new chapter for Propeople and Blink Reaction. I hold the company that Nancy Stango (Founder and CEO of Blink Reaction) and her team have built in very high esteem, and can’t wait to see what we’ll be able to achieve together.

Welcoming Blink Reaction into the Intellecta family will greatly expand our technical capacity, especially when it comes to the development of digital solutions built on Drupal. At the same time, bringing Bysted into Propeople will bring our creative, design, and strategy offerings to new levels. Both of these developments will prove to be strategic in charting the future direction in which Propeople is heading.

This future direction is driven by the vision of being a full-service agency for the digital age. Propeople is an agency that has had a truly technical upbringing; this is at the core of our identity, manifests itself through our entire organization’s culture and touches everything that we do.

The majority of the prominent agencies in the larger digital space tend to come from strong creative or communication backgrounds. The fact that Propeople comes from a technology background is a significant distinction in a world where technical matters have become increasingly more important for organizations as they develop their brand’s digital presence. And I’m not just talking about the importance of coding - but the larger way that analytics, data, and integrations with a variety of systems seem to be the common threads running through everything that a company does.

Today marks an important milestone for me, the Propeople team around the world and our customers. I, for one, can’t wait to get started!

What is Turnip?

Turnip is a Drupal starter kit created by OpenSourcery as a starting point for scratch-built Drupal installs. Turnip provides a basic setup that puts site builders several steps ahead of a vanilla Drupal install.

Furthermore, Turnip adds a host of community contributed modules that make up the core sitebuilding & development functionality at OpenSourcery. This post is the first of a series highlighting the tools & methodologies that make up Turnip.

What are the components?

The main components of Turnip are:

Drush make

Profiler

Features

Behat

Deploy

Individually they represent codebase, installation, configuration management, testing and content staging tools. Together they allow devs to build complex websites that can be deployed and built on nearly any platform with little adaptation.

Drush make

Drush make is a sub-command of Drush. (If you haven't heard of Drush, stop right now and go check it out; it will change the way you develop with Drupal forever.) Drush make is a package manager (like Composer, apt-get, or gem) that, given a manifest of modules, can download them to a central location ready for use.

These manifests, or lists of modules, are composed into make files. Drupal doesn't interface with these directly, which is why you need Drush installed.

Unlike the old days of Drupal, when a distribution contained not only the custom code created by the maintainers but also Drupal core and all the contrib modules the custom code depended upon, Drush make files simply track the core and contrib versions and download them. All that is needed is a path to put the contrib modules in.

Since OpenSourcery uses install profiles, we prefer to put any and all contrib modules in /profiles/PROJECT_NAME/modules/contrib. Custom modules can be put into custom and any features in features (more on features later).

Furthermore, Drush make is even more useful when modules aren't exactly perfect in their stable release state. Given enough time working on Drupal projects, one is bound to find a bug in a module. One of the greatest things about Open Source Software, and doubly so for Drupal, is that bugs are discussed publicly and the fixes are available immediately. Bugs are first fixed using files called patches. Oftentimes patches exist for a module long before the maintainers see fit to roll a new release. If the fix exists but isn't part of the codebase yet, what's a dev to do? Patching the module and entering it into source control is one option, but then any and all changes to the module must be tracked for the duration of the project. Luckily, Drush make allows patches to be applied to a given project. This is an even bigger win if you discover a bug in a contrib module, figure out the fix, and submit the patch to Drupal.org! You get a working module, street cred for posting a patch, and the community gets a more stable, workable product. EVERYONE WINS!
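A minimal make file illustrating both pinned versions and a patch might look like the sketch below. The project versions and the patch URL are placeholders for illustration, not real releases:

```ini
; project.make -- hypothetical Drush make manifest (API 2 syntax).
core = 7.x
api = 2

; Drupal core itself.
projects[drupal][version] = 7.22

; Contrib modules, pinned to known-good releases.
projects[views][version] = 3.7
projects[ctools][version] = 1.3

; Apply a fix from the issue queue before a new release is rolled
; (placeholder URL -- point this at the real patch file on Drupal.org).
projects[views][patch][] = "https://drupal.org/files/example-fix.patch"
```

Because the patch is declared in the manifest, every rebuild reapplies it automatically, so nothing patched needs to live in source control.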

To build the site, simply run drush make PATH/TO/MAKEFILE DRUPAL_DIRECTORY. If you forget the exact command, no worries: just use bin/rebuild and the site will be built in the /drupal directory.

Profiler

Profiler is a neat library that extends the functionality of the install profile. Gone are the days when a custom install profile meant long and largely unreadable database updates, followed by obscure Drupal function calls and more database queries. With Profiler, creating basic placeholder content is done in a human-readable format. Even custom fields can be included, given the proper set of parameters. It all starts with the install profile's .info file. Adding a placeholder node is as simple as this.
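A hedged sketch of what such a .info entry can look like follows; the keys are paraphrased from Profiler's conventions and the node values are invented placeholders, so check the library's README for the exact format:

```ini
; PROJECT_NAME.info -- Profiler-style placeholder content (sketch).
name = Project Name
core = 7.x

; A basic placeholder node, declared declaratively in the .info file.
nodes[welcome][type] = page
nodes[welcome][title] = Welcome
nodes[welcome][body] = Placeholder body text for the front page.
```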

The above example isn't exactly easy, but it is quite a bit more straightforward than creating a node object and calling node_save().

Features

Features is a way to assemble and export site components into a custom module. Features can be exported via the interface in the site itself. The feature can then be tracked in source control, making it more straightforward to transfer settings from dev to test sites and on to deployment.

Module development, point & click style

Features shine at capturing things that are built in the UI, like permissions or views. The components are exported to include files specific to the components in the feature. Features can be managed via Drush, using commands like features-update or features-revert to move features out of the database into code, or the reverse.

Behat

Behat is a framework, independent of Drupal, that tests whether or not software behaves in a given way. Since the software is tested at a more interface-driven level, the tests can be less specific than verifying the state of a variable. The upshot of Behat is that the tests are designed to be human readable and writable.

Testing on the client side

Since tests can be almost directly ported from user stories, the responsibility of writing the tests can be reassigned from the developer or the QA lead to a Project Manager or possibly even the client. Tests can be tagged to differentiate levels of depth, or even browser versions.
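As a sketch of that readability, a Behat scenario reads almost like the user story it came from. The paths, labels and tags below are invented placeholders using standard Mink step definitions:

```gherkin
# features/login.feature -- hypothetical scenario, tagged by depth.
@smoke @javascript
Feature: User login
  In order to manage my content
  As a registered user
  I need to be able to log in

  Scenario: Logging in from the user page
    Given I am on "/user"
    When I fill in "Username" with "editor"
    And I fill in "Password" with "secret"
    And I press "Log in"
    Then I should see "Log out"
```

Tags like @smoke let a PM or client run just the shallow checks, while the full suite runs in CI.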

Deploy

Deploy is a project that uses Universally Unique Identifiers (UUIDs) to export staged content: much in the same way that Features identifies and exports site architecture, Deploy exports and manages content.

While the weather didn't completely cooperate (it's still cloudy, cold, & misting BTW), the overall atmosphere was polite, the keynotes were interesting, the sessions were varied and the attendance was record setting. I feel like I learned a ton.

Here's a short list of some of my favorite sessions:

Anthony Ferara's session on code complexity was instructive, delightful and inspiring. I spoke with another dev about it and we both started geeking out on the tools he was using. phploc in particular was something I could easily start using in short order.

Michael Lopp's keynote was thought provoking, and focused. He spoke on the three archetypes critical for the success of a startup, the engineer, the designer, and the dictator.

The old & new Field API in D8 session was another mind-expanding session that incorporated much of the new D8 hotness. For example, it was reported that field instances are now CMI compliant, and work on field widgets and display settings is underway.

Ryan Weaver’s Behat session was splashy, inspiring, and raucous. It covered how Behavior Driven Development has the possibility to convert user stories into tests that can be executed as QA steps.

The Takeaway

What was truly memorable was the sheer scale of DrupalCon. With 3300 recorded attendees, the logistics of coordinating all the speakers, the food, coffee and extra time activities was staggering. The continued growth of the Drupal Community at times can be hard to quantify, but when you’re standing shoulder to shoulder with 3000+ drupaleros for a group photo, the exhilaration is palpable. What’s even more impressive is that not long ago, DrupalCon North America could have fit inside of one of the smaller conference rooms with space to spare. The sprints on Friday alone had higher attendance than several early DrupalCons, as claimed by chx.

More people at the sprints than whole early DrupalCons

All in all, it was an exciting week and a productive one at that.

DrupalCon Portland produced numberless new connections and strengthened friendships, a FEMA-affiliated disaster logistics site built in under 24 hours to support victims of the tornado in Oklahoma, Twig support committed into Drupal core, and testing servers completely swamped with core patches.

For a short, rainy week, it was good work, good work indeed. All of this stems from a central Drupal truism: our strength is in our community. Our codebase is one effect of that strength.

The pitch:

I'm excited that Drupaleros the world over will be invading my hometown, and that I'm involved in the process as the track chair for Coding & Development. What I'm looking for is encapsulated in the track description:

Sessions that are original, exciting and leave attendees raring to get involved.

Sessions that present new ways to accelerate the process of putting code on disk.

Sessions that use entities in dramatic ways to extend the Drupal experience.

Sessions that help Devs bridge the communication gap with PMs/Clients.

The clincher:

The cutoff date to submit sessions is the 15th of February. It's on the other side of the holidays, but the more awesome session submissions, the better. So if you don't already have a session planned but have a development-related topic that keeps you up at night, organize your thoughts and get those proposals in.

The Pacific Northwest Drupal Summit was last weekend at the University of Washington campus and what a weekend it was!

First, the summit probably set a record for OpenSourcery with 5 sessions. Secondly, this was my second summit and my first that was out of town (the 2011 summit was held here in Portland). The nine of us (Brian, Jessica, Andrea, Heather, Adam, Jason, Jesse, Simon, and myself) piled into a steel blue 15 passenger van for the 3+ hour drive up to Seattle. A fun time was had by all, especially during bowling on Saturday night.

The second Drupal Commerce session I attended was presented by another member of the Commerce Guys team, Pedro Cambra. Pedro co-maintains several major Commerce contrib modules, such as Commerce Feeds, Commerce Reorder and Commerce Extra Panes.

Many of these modules, while not technically required for the operation of an online store, make the online experience much better. For example, Commerce Reorder allows users to create a new order from the contents of an existing order. For B2B commerce this is a killer feature, since many businesses order the same set of products over and over again. Commerce VBO Views brings Views Bulk Operations into Commerce, making mass deletion of products, profiles and orders possible; this makes unclogging the order queue as easy as cleaning old nodes out of the content section.

Importing content from another system is a particularly thorny challenge. Luckily, there exists a pair of solutions. Whether you're a Migrate or a Feeds user, Drupal Commerce has a module that imports data into Commerce automatically. One Commerce contrib module that I have a particular fondness for is Commerce Feeds. It leverages the Feeds module, which takes a feed in XML, RSS or CSV format and turns the data into products. The feeds are exportable via Features, and new products can be added automatically as they're created. This obviates the need to manually import products into Drupal Commerce. Commerce Migrate takes a similar approach, but utilizes the Migrate module to create the product entities.

Another neat Commerce contrib module is the Commerce Physical module, which utilizes the Physical Fields module to give products physical characteristics such as dimensions and weight. These can be used in shipping calculations to return the exact shipping cost calculated by courier APIs.

Payment methods are probably the most important aspect of eCommerce. Drupal Commerce has this particular part in spades. The payment method modules run the gamut from PayPal to Purchase Order, and everything in between.

There also exist Commerce contrib wishlists that add an 'Add to Wishlist' button to products, coupons that offer discounts, an address book that simplifies the checkout process, multi-currency support for international commerce, and so much more.

The Commerce sessions were just a few of the amazing and wildly varied panoply of concepts, companies and community that is DrupalCon.

It was quite the experience, especially with the backdrop of the sunshine and the warm weather. I loved being able to introduce myself to people I've only met in IRC previously, and reconnecting with old Drupal buddies.

As said previously, this was my first DrupalCon and I was very happy to have gone. I now have a greater idea of the depth of the passion and the breadth of the community. It's pretty amazing to pass groups of people between sessions speaking so many languages and yet all talking about the open web and Drupal's role in that ecosystem. I've always been a big fan of Drupal, but going to DrupalCon has definitely added to that appreciation.

Going to DrupalCon Denver was a welcome break from the cold and gloom of a northwest spring. The Rocky Mountain sunshine provided a backdrop for the largest DrupalCon on record.

This DrupalCon was especially notable since it was my first opportunity to attend in person. I've made it a point in the past to watch the keynotes, especially Dries, as close to real time as possible. As well as jealously following the Twitter feeds of those in attendance to glean any breaking news from the forefront of the Drupal community. As expected, there were too many good sessions to choose just one to attend.

One particular track that caught my attention was the eCommerce track. For some time I've felt that Drupal's greatest untapped potential was in this sphere, and it was a welcome change to see eCommerce receive the kind of attention given to other Drupal application classes. Completing my excitement for this track, I noted Ryan Szrama's Drupal Commerce session on Wednesday morning. Since Ryan is the project lead for the very popular Drupal Commerce project and a long-time Drupal eCommerce guru, sitting in on his session was my chance to get a sneak peek at arguably the fastest growing sector of Drupal.

All about Commerce

For a 30-second introduction to Drupal Commerce, it's important to know the past. Drupal Commerce grew out of Ubercart, a highly successful set of eCommerce modules for Drupal 6 that provided a simplified eCommerce solution.

Shortly thereafter, Ryan Szrama started a new project that expanded the horizons of eCommerce in Drupal. New to Drupal 7 was the concept of entities: objects that behaved like nodes, but could be displayed and manipulated more flexibly.

Furthermore, entities could have fields placed upon them, and thus the Commerce concept of using entities as the components of a commerce website was born. With entities, products could be managed privately, abstracting the stock-keeping and managerial facets of commerce away from the display side. In addition, products could be uniquely identified in Drupal by a Stock Keeping Unit, or SKU, much in the same way products are tracked by SKU in warehouses.

Since entities could be referenced in a node, creating a display page became as simple as creating a product display node and referencing the products that belonged to that particular display. Whereas Ubercart was designed as a starter store in a box, more of a plug-and-play solution, Drupal Commerce provides a fully customizable framework that allows businesses to modify it to fit their particular business model. Drupal Commerce was designed to leverage the full potential of Drupal 7 for power and flexibility.

As a framework, Commerce core is as lean as possible; so much so that shipping functionality doesn't even ship with Commerce core. The rationale is that not all stores have physical products that need to be shipped. As a side note, there exists a Commerce Shipping module, maintained by the same crew that works on Commerce core, that is easily downloadable for shipping physical products. Since Commerce is so flexible, there exists a myriad of neat and possibly ready-to-use solutions for nontechnical users. Contributed modules such as a wishlist, reorder functionality and stock keeping extend the power of Commerce and make it easier for those new to Drupal and Commerce to utilize these resources, with the end result of increasing the adoption of the Drupal Commerce framework.

What's next in Commerce

While Drupal Commerce is receiving a lot of attention for the flexibility of its framework, the learning curve can be a little steep, especially for those new to Drupal.

One of the main initiatives undertaken by Commerce Guys to help increase adoption is to grow the overall amount of documentation, with a focus on how to do basic tasks like calculating a tax rate or creating a discount for certain customers. A fair amount of documentation already exists, both in the form of step-by-step guides and in screencast or video format.

Another sticking point in using Drupal Commerce is the lack of a clear, easy to use UI. One of the powerful things about Drupal Commerce is that it's built on existing Drupal tools and infrastructure. One such example is the pervasive use of the Views and Rules contrib modules that make it easy to create and customize data displays of products, shopping carts and lists of line items. The catch with using some of the more complex sets of contrib modules is that in order for store administrators to make simple changes, they must first learn the basics of things like Views.

One perfect example would be a store administrator, someone who normally uses the administrative side of the site to manage transactions, needing to adjust the percentage of a discount. This discount might be offered to individuals who purchase more than a set amount of items. Editing the discount amount requires diving no less than three levels into the commerce product pricing rules interface, and then an additional two levels via the Rules interface directly. While this might be a trivial task for a lifelong Drupalista, for someone more at home in a WYSIWYG editor it could be scary and confusing.

The aim is to make the simple tasks that site managers do regularly more straightforward, so that familiarity with Drupal is not a requirement for administering a Drupal Commerce site. In short, to drive adoption of the Drupal CMS via Commerce installs by introducing people to Drupal in the course of building a Commerce site.

To help accelerate the adoption of Commerce, another initiative is to release a new version of Commerce Kickstart. Commerce Kickstart is an introductory install profile containing a basic store, with a trio of example products, an example payment method and not much else. In my experience I've used it more as a testing suite or a playground to get a feel for what Commerce is and what it can offer me in terms of tools and examples for building a commerce setup for a client. This focus on developers was helpful to me, but probably isn't for someone looking to set up a small eCommerce site with little to no custom code. This will no longer be the case for Commerce Kickstart 2.0, whose focus will be squarely on site builders.

In that vein, Commerce Kickstart 2.0 will be redesigned from the ground up. Commerce core, by design, lacks anything not considered absolutely essential for eCommerce. This includes things like shipping, since not all stores sell physical products. However, Commerce Kickstart 2.0 will have the shipping module built in.

One other sticking point is appearance. Again, as Drupalistas, we're used to seeing an unthemed site, something perhaps in Bartik or maybe even -gasp- Garland. That said, the rest of the world may not share this forgiving view of Drupal's theme layer. As such, Commerce Kickstart 2.0 will have its own Omega-based sub-theme that will function as an introductory theme for people just looking to get an eCommerce site up and running as quickly as possible.

To make this even more exciting, Commerce Guys has pledged to release Commerce Kickstart 2.0 by DrupalCon Munich this August, quite the ambitious timeline.

Needless to say, I'm very excited about the direction Drupal Commerce is taking. I'd love to see Drupal take off in eCommerce, much in the same way it has taken off in government and the NGO space.

The workshop portion of the Drupal Career Starter Program wrapped up in mid-December, just in time for participants to catch their breath and enjoy the holidays before embarking on the second phase of the DCSP. 15 of the 18 students are currently beginning internships with organizations around the United States in an effort to further their education and training to the point where they can begin careers in Drupal.

Conception and Funding

Anello Consulting (parent company of DrupalEasy) conceived, designed, and implemented the DCSP as a way to help Brevard County, Florida (home of NASA's Kennedy Space Center) retain some of the over 7,000 skilled workers who lost their jobs as part of the retirement of the Space Shuttle program. Brevard Workforce funded the program and has been a fantastic supporter and champion of it.

"Well paying jobs are available now, and growing fast in areas of ICT that may not be familiar to the general population, nor have formal certifications," said Lisa Rice, president of Brevard Workforce. "The data on the growth of Drupal, and the tremendous opportunity for those people who learn the software and gain experience convinced us we have to step outside the box a bit so we can offer our workforce, fast turnaround programs that provide well-paying options quickly" (See the complete Brevard Workforce news release).

Training and Internships

The inaugural session of the DCSP began with a rigorous application process to identify qualified individuals. Selected participants then underwent 10 weeks (70 hours) of classroom training in Cocoa Village, Florida during the fall of 2011. The training included a strong foundation on the basic building blocks of Drupal, the use of commonly-used contributed modules, writing custom modules, and creating custom themes. Participation in the Drupal community was also encouraged (and assigned!) in the form of participating in various issue queues and contributing to documentation.

At the conclusion of the classroom training, qualified students were placed in internships with various organizations around the United States. Each internship entails about 300 hours of on-the-job training with interns working both on-site and telecommuting, depending on the location and needs of the employer. The DCSP is the first-of-its-kind multi-week program to provide participants with everything they need to change careers into the opportunity-rich field of Drupal.

Participating employers include:

The vast majority of the internships were provided at no cost to the employers through "Adult Work Experience," which is a workforce-paid on the job training program.

Post-Training Survey

At the conclusion of the workshop portion of the program, the participants were surveyed to determine the quality of the instruction. Of the 18 responses:

94%, or 17 of the 18 responses, indicated that the pace of the course was good, while 1 indicated it was too fast

Responses were evenly split on the length of the course, with half indicating it should remain seven hours per week for ten weeks and half suggesting it be longer

94% agreed or strongly agreed that the course provided them with a strong foundation to begin a career in Drupal, with 1 response remaining neutral.

94% of responses indicated overall satisfaction with the course, with 64% of those indicating extreme satisfaction.

89% feel they now have the skills and confidence to become part of the Drupal Community, with one neutral and one strongly disagreeing.

100% indicated they foresee themselves pursuing, or are considering, a career in Drupal

83% plan to attend local/regional Drupal Camps and meetups

Lessons Learned

As this was the first time any of us had participated in this type of program, we all went into it with an open mind and the knowledge that it would be a learning process.

We learned that our application process needs to be further refined to not only find and identify potential participants with the necessary technical background, but also those with an "entrepreneurial spirit". While there are many full-time Drupal jobs available around the world, there are a limited number in our local area, so potential participants must be comfortable with (or willing to learn) telecommuting, contract work, consulting work, networking, business relationships, and anything else that goes along with being in business for yourself.

In addition, as a career in Drupal can be a very independent pathway, often requiring commitment, determination, and self-discipline through contracting and consulting work, it is not for everyone. A required, or at least strongly encouraged, entrepreneurship course for participants would alleviate the lack of knowledge and trepidation regarding becoming a contractor or starting a business.

Participants were required to provide their own laptop to use during the course and were encouraged to utilize the most recent versions of their operating systems. We encountered numerous delays during the workshops due to older versions of Windows not being fully "cooperative" with software we were using (17 of our 18 participants were on Windows-based machines).

Conclusions and the Future

The DCSP has been an unqualified success to this point. The real yardstick of success will come out in several months, at which point we'll resurvey the participants to see how many have transitioned into a Drupal-related career. We've submitted a proposal for a second session, and are looking forward to introducing the career opportunities in Drupal to 20 more participants.

Anello Consulting is proud to announce the start of the first session of our Drupal Career Starter Program (DCSP). This Drupal training and internship program is designed to teach the basics of Drupal, including a strong foundation on community involvement and practical experience.

The DCSP kicks off on October 4 on Florida’s Space Coast to give laid-off IT-savvy Space Shuttle workers an opportunity at new careers. Brevard Workforce, the local workforce development board funded by the state of Florida, is using federal grant funds to provide scholarships, and potentially paid internships, for the 19 carefully selected participants. The goal is to keep these skilled workers in the area while expanding their skills from the shrinking aerospace industry to self-guided, opportunity-rich careers in Drupal.

We carefully designed an application process that allowed for a maximum number of people to qualify for consideration. We weren't looking for a particular skill set - we wanted to be able to accept students who had, at the very least, a small set of applicable skills, a strong desire to learn something new, and a desire to be part of an open-source community.

We developed this idea as an ideal combination of our skills at Anello Consulting. While I specialize in Drupal development, training, and media, Gwendolyn Anello's specialty is economic and market development for government and industry. With the shortage of qualified Drupalers and the plethora of local folks looking for new careers, our proposal to the workforce board brought Drupal training into the fold of funded career development training. Over the course of several months, we were able to secure a contract for a pilot program for the DCSP with the support of several companies willing to lend support and possibly provide internships.

We designed the 10-week, 20-session (70 hours total) course to give students a strong Drupal foundation, the confidence to dive deeper into their areas of interest, a strong community-involvement aspect, and enough background to home in on the specific Drupal niches that appeal to each of them. In addition to the classroom training and self-study, we've also teamed up with BuildAModule.com to provide steeply discounted memberships for attendees to access Drupal video tutorials for the duration of the course. Anello Consulting is planning on using this course as a pilot to build on for other regions that have high unemployment rates. Midway through and after the course, we'll monitor and report on the success of the program with participants, and a few weeks after the course, touch base with the host intern companies to see how we can improve things.

The program could set a precedent with workforce boards, and hopefully help to mainstream training programs through funding from other agencies.

If you are interested in supporting the DCSP by hosting one or more of our future interns, especially telecommuting jobs for the Space Coast workers, please let us know.

I've been heads-down for the past 4+ months working with Bonnier in an effort to migrate FieldAndStream.com and OutdoorLife.com to Drupal 5. I'm proud to say that earlier this week, both sites went live! In case you're not familiar with Bonnier, they're also the parent company of Popular Science, a well-known Drupal site.

As a contractor to the development team, I took part in adding new and expanding existing features to the sites. The previous incarnation of both sites was based on a proprietary content management system that had reached its limits.

The two sites have virtually the same feature set, but each one has its own custom theme and vocabularies (and not just in the Drupal-ese meaning of the word!). Therefore, most of the work we did had to be written in a generic enough way to work in both places.

As with most content-heavy Drupal sites, the workhorse modules we used include CCK, Views, Pathauto, and Taxonomy. We wrote a significant number of custom modules (most of them rather small) in order to tweak different aspects of the site, and spent a considerable amount of time theming the sites to match the meticulously prepared mockups.

My role in the project focused mainly on the Photos and Videos areas of the site (lots of ViewsAPI and form-related jQuery), but I also had a hand with the Answers section, implementing the Nodecarousel module, working with VotingAPI, and using hook_form_alter in virtually everything I touched!

If I take one thing away from my experience with this project, it is the importance of having a strong taxonomy system from the very beginning of the project. Amazing things can be done with taxonomy (and Pathauto) when it is treated as the backbone of the site.

Overall, it was a great project and I think the results speak for themselves!

It is often desirable to add a node count to the title of a view to help users gauge how many nodes have been returned - especially when using exposed filters.

In Views 1.x, this is a fairly simple process using the "hook_views_pre_view(&$view, &$items)" function (documentation).

Let's say you have a view that displays the titles and authors of all content of type "story" on your site in table format. Furthermore, you've exposed the "Node: Author Name" field as an exposed filter for the view:

As the user plays with the "Author Name" exposed filter, the title of the view, as expected, doesn't change.

To modify the title, we can implement the hook_views_pre_view function as follows:
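A minimal sketch of that hook might look like the following; "view_title" and "test_view" are the module and view names from this example, and the mock $view object at the bottom only stands in for the real Views 1.x object so the sketch can run outside Drupal:

```php
<?php
// Sketch of the hook described above. "view_title" is the module name
// and "test_view" the view name from this example; the exact shape of
// the $view object in Views 1.x may differ slightly.
function view_title_views_pre_view(&$view, &$items) {
  if ($view->name == 'test_view') {
    // total_rows is populated by Views once the query has run.
    $view->page_title .= ' (' . $view->total_rows . ' nodes)';
  }
}

// Quick check with a mock view object (outside of Drupal).
$view = new stdClass();
$view->name = 'test_view';
$view->page_title = 'Test View';
$view->total_rows = 42;
$items = array();
view_title_views_pre_view($view, $items);
print $view->page_title . "\n"; // Test View (42 nodes)
```

The name check keeps the title change scoped to this one view, so other views on the site are untouched.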

For this example, I created a new module called "view_title" and a view called "test_view". The "total_rows" value is part of the $view array, so it is as simple as appending the total_rows to the existing page_title. The result is then:

While it is possible to affect the view title without writing code when using arguments, I don't believe it is possible to add the node count to the view's title, regardless of whether you're using arguments or exposed filters.

Feel free to download the "view_title_count" module I wrote for this post below.

As far as doing this same type of thing in Views 2.x, I haven't yet figured it out - it does appear that the "total_rows" variable is not used in Views 2.x, so an alternative way will need to be found. If you know how to do this, feel free to leave it in the comments.

Collecting and using user profile information has always been a popular aspect of the Drupal module scene. The Profile module (part of Drupal core) has always been a relatively straight-forward way of collecting additional profile data about users, but its lack of default Views and CCK integration has been problematic for most users.

Saving user data as nodes has been possible using a variety of methods for quite a while, but it seems that with Drupal 6.x, things are coalescing around the Content Profile module. This allows you to set a particular CCK content type as a user profile (the module actually creates a default "profile" content type automatically for you) - thus gaining all the advantages of CCK and Views (and their associated universe of modules) when dealing with user profile data. This is extremely powerful and lets you do all sorts of wacky things with your user's profile data (don't be evil).

This article talks about the (relatively easy) process of getting the Content Profile module configured for a Drupal 6.x site. Then, I'll go through the process of making one of the profile fields available to Views and a template file for use when displaying a node. This might be useful if one of the Content Profile fields you're collecting is a short biography of the user that you want displayed within any nodes the user has authored. Then, your standard node view can look like this:

The first step is to download, install, and enable the modules you'll need:

When the Content Profile module is installed, it will automatically create a "Profile" content type. You can go to "admin/node/types" to confirm this. Clicking to "edit" the Profile type, you'll see that this is a standard CCK content type with one exception - near the bottom of the edit page, you'll see a "Content Profile" fieldset. Inside the fieldset, you'll see a single selected checkbox: "Use this content type as a content profile for users" - this basically links each node of the Profile type to a particular user automatically with automatic integration on each user's "My account" page.

In the "Submission Form Settings" fieldset, let's go ahead and set the Profile's title to hold the user's full name. Just change the "Title field label" to "Full Name". Let's also change the "Body field label" to "Short Biography" - this is the field we'll eventually have displayed under the byline of the Story content type:

In the "Workflow settings" fieldset, uncheck the "Promoted to front page" box.

If you have the comment module enabled, in the "Comment settings" fieldset, set the "Default comment settings" to "disabled".

Once these changes are made, let's save the Profile content type.

Clicking to "edit" the Profile content type again, you'll see that there are some additional Content Profile settings under the "Content Profile" tab - we'll leave the default settings for now. One complaint I have about the "Content Profile" module is that even when the "User page display style" is set to "Display the full content", the "title" field (or the "Full Name" field, in this example) is not displayed on the "My Account" page. I'm not sure if this is a bug or the module was designed this way.

Like any other content type, you can add additional fields to your heart's content - we're going to keep things simple for now and go with what we have.

Next up, we need to visit admin/user/permissions and give "authenticated users" permission to "create profile content" as well as "edit own profile content" and "delete own profile content". Feel free to modify this step to meet the particular needs of your site.

You can check out what we have so far by clicking on the "my account" link in the main menu. Since all of this is brand new, you won't have any profile data entered yet, so you'll see a "Create your Profile" link. Clicking this will bring you to a standard Drupal node form where you can enter your Full Name and Short Biography. Go ahead and fill these out before continuing.

So, at this point, we have the Content Profile module up and running and collecting user data. You can also enable the "Content Profile User Registration" module (it is included with the main Content Profile module) to collect the profile data during user registration, much like the standard Profile module does.

Now, we want to set up a view that displays a list of nodes along with a field that displays the author's "short biography". To start, you'll need to take a few minutes to create 2 or 3 "story" nodes (be sure the "author" of these nodes is the same user as you used when entering your content profile data). Go ahead, I'll wait here.

Ok - all set? Good. Let's go to admin/build/views and click to "Add" a new view. Set the first set of form fields to:

If you haven't used Views 2 before, this next part might be tricky (check out these screencasts for a quick primer). First, let's set up the "Defaults" part of the view:

Add Filter: Node: Published = Published

Add Filter: Node: Type = Is One of Story

Add Field: Node: Title (link this field to its node)

Add Field: Node: Teaser

Add Field: User: Author (Label: Author)

At this point, in the "Live Preview", you should see your test nodes appearing. Let's go ahead and add the author's short biography. This is done using a Views "Relationship":

This sets things up so that the author's content profile is linked to the story node via the user id (author) of the node. The next step is a bit tricky. We need to add the "Node: Body" field - but not the "body" of the story node, we're actually going to add the "body" (which is our "short biography") of the Profile node via the Content Profile relationship we just added. Click to add a field, then select the "Node: Body" field - after you click the "Add" button, you'll be prompted to configure the "Node: Body" field. This is where you select the "Content Profile" relationship, telling views that you want the "Node: Body" of the author's Profile node, not of the Story node:

Click the "Update" button and Voila! Checking out the Live Preview, you'll see that Short Biography for the author of each node is now listed. You may want to adjust the order of the fields using the up/down arrows in the "Fields" section. Go ahead and save the view at this point.

Let's go ahead and add a "Page" display for this view. Click the "Add display" button (assuming that the associated select box is set to "Page"). Under "Page settings" set:

path: contentprofiletest

At this point, your "edit view" page should look something like this:

Click to save the view, and then go to: /contentprofiletest to see the results so far.

The last step is to make the "short biography" field available to the appropriate node template file so that when a user clicks to view the full node, the "short biography" is still visible. Right now, clicking on a node's title brings us to a standard node page, but without the "short biography" field. The reason it isn't being displayed is that the "short biography" field isn't part of the "story" content type - it is actually part of the "profile" content type, so Drupal doesn't know to include it as part of its node loading behavior. We'll have to write a small module to accomplish this.

I'm only going to show the "meat" of the module within the article, but you can download the entire module below. Basically, all we need to do is add the "short biography" field data to the story node's data whenever the story node is loaded (we could also do this using the "view" $op, but using the "load" $op gives us more flexibility if we want other modules to be able to modify the Short Biography data as well). We'll do this using hook_nodeapi():
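A rough sketch of that hook follows; the module name "short_bio" is an assumption, and the stub at the top stands in for the Content Profile module's content_profile_load() only so the sketch runs outside Drupal:

```php
<?php
// Sketch only: "short_bio" is an assumed module name. On a real site,
// content_profile_load() is provided by the Content Profile module;
// this stub exists only so the example runs outside Drupal.
if (!function_exists('content_profile_load')) {
  function content_profile_load($type, $uid) {
    $profile = new stdClass();
    $profile->body = 'A short biography.';
    return $profile;
  }
}

function short_bio_nodeapi(&$node, $op, $a3 = NULL, $a4 = NULL) {
  if ($op == 'load' && $node->type == 'story') {
    // Load the author's "profile" node and copy its body (our
    // "Short Biography") onto the story node being loaded.
    $profile = content_profile_load('profile', $node->uid);
    $node->short_bio = $profile ? $profile->body : '';
  }
}

// Quick check with a mock node object.
$node = new stdClass();
$node->type = 'story';
$node->uid = 1;
short_bio_nodeapi($node, 'load');
print $node->short_bio . "\n"; // A short biography.
```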

The Content Profile module provides the handy "content_profile_load()" function that gets the data based on user id - we simply pass it the name of the profile content type ("profile", in our case) and the user id of the author of the node. Then, we set a new "short_bio" field to the node that is equal to the "body" field (our "short biography") of our profile node. If we had additional profile fields that we wanted to add to the node object, we could add them in a similar manner.

At this point, you can now access the author's short biography in your template file using:

<?php print $node->short_bio; ?>

You can quickly test this by altering your theme's node.tpl.php (or node-story.tpl.php, if you have one) to include this new variable.

I'm fairly certain that you can use the Panels module to override the default Drupal node display to include the Short Biography data within the full node view as well, but that exercise is for another day.

Seeing how Drupal consulting is about 90% of my business, I figured it would be a good idea to upgrade AnelloConsulting.com to not only the latest version of Drupal, but the first commercially supported version - Acquia Drupal. While most of my clients are still focused on 5.x, I have begun recommending to new clients that they use 6.x and consider using Acquia's support services.

Previous to the upgrade, this site was running Drupal 5.5. The update process actually went fairly smoothly - no more difficult than any other "major" Drupal upgrade. Since I use a manageable number of contributed modules that aren't part of the Acquia Drupal distribution (subscriptions, fasttoggle, global redirect, google analytics, image, meta tags, page title, persistent login, smtp, and xml sitemap) with no custom code, there weren't a whole lot of tedious steps to take other than what is recommended.

Since Acquia Drupal ships with a number of contributed modules, I decided to replace some of the modules I was using in 5.x with the Acquia-supported ones. I dropped reCAPTCHA in favor of Mollom. I'll also be dropping the Image module in favor of its CCK equivalent.

I did run into a couple of issues. First, for some reason, I lost both the uid=0 (anonymous) and uid=1 (main admin) users from the "users" table. I'm pretty sure this was due to my particular server and file transfer situation, but it had me scratching my head for a bit before I tracked it down.

Second, I had to re-create a couple of views since I was going from Views 1 to Views 2. As the author of views writes in the release notes, "The upgrade path from Views 1 to Views 2 is rather painful."

Overall, I was quite pleased with the upgrade. I'll be posting a follow-up to this article in a month or so once I've had some time to fully understand and test-drive Acquia's network and support services.

I attended BarCampChaos at Orlando's Marriott World resort on Monday evening, October 13, 2008. The BarCamp was piggy-backed on Create Chaos, a conference for creative professionals. Ryan Price gave a presentation on developing an online portfolio site using Drupal, and I gave a 20-minute presentation titled "Anatomy of a Drupal Theme".

My presentation was a thinly-veiled attempt to recruit local designers to learn Drupal theming. As a Drupal developer, I am constantly looking for talented local designers and themers to work with. While the title of the presentation was "Anatomy of a Drupal Theme", it might as well have been called "Anatomy of a Drupal Themer" as I spent a good amount of time talking about the difference between a designer and a "themer" and how at the intersection of the two, there is lots of money. Feel free to download my presentation.

There were a lot of great presentations - the thing that makes BarCamps unique (and so much fun) is the wide range of topics that are discussed back-to-back. Here's a sampling of what was presented last night:

Technology employment statistics for Florida and Orlando - the economy sucks, but technology jobs are a bit insulated and growing.

The decentralization of the desktop computer - applications on a stick. Somebody (I forgot to write down his name, if someone could enlighten me, I'd appreciate it) gave a presentation of using GIMP Portable running off a USB stick. Other portable apps are available from PortableApps.com.

Life Hackery - Getting Things Done with Room for More - Eric Marden - First tip: use GTD (software that helps: Things (simpler) and OmniFocus (advanced)). The basic idea of GTD is breaking things up into Projects, Contexts, Items, and Actions. Second tip: he's a firm believer in "Inbox Zero" - a vow not to spend more time on a given email than it deserves. With every new email, either Delete, Delegate, Reply, or Process immediately.

Jeremy from Whoiscarrus talked about creating artwork digitally using a combination of drawing and code. Nodebox is another application that does similar stuff. Wow - really, really cool stuff.

Seth talked about Facebook application development and his experiences while developing the "Running Expo" application. He talked about the limitations of the Facebook Markup Language (FBML) and Facebook JavaScript (FBJS) as well as the advantages of developing a Facebook application (viral marketing, installed user base).

Bensan George talked about some useful iPhone applications, including Things, Twitterrific, last.fm, and Aurora Feint.

Ryan Price talked about setting up a Drupal-based photo gallery web site using CCK, Embedded Media Field, and Views. In 20 minutes, Ryan used Drupal to set up a couple of new content types, showed how to use Embedded Media Field to pull content from Flickr and YouTube, tag content using taxonomy, then how to tie everything together using Views. Plus, he did this all while battling his apparently overworked, uncooperative laptop.

Gregg Pollack of Orlando Barcamp fame talked about the Technology of Sound. He basically taught the basics of sound (Hz, dB, sampling rate, compression, etc...) and how it relates to podcasting. I learned more about sound in his 20+ minute presentation than really should be possible.

The guys from Hydra showcased a site they've been working on all summer that is based on PHP's Zend Framework. They talked about what they felt were the advantages of developing custom content management systems.

Finally, Dan Kinchen gave a talk about making money on the internet. Dan's talk covered the multitude of ways he's leveraged the power of search keywords, AdSense, and Amazon's API to build low-maintenance sites that provide steady income.

As a Drupal developer and recent memcache convert, I now know the joy of speedy caching.

Memcache actually comes with 2 modules: the main memcache code as well as "memcache admin" which, as far as I can tell, is really only necessary during development and testing of the site (sort of the same way the Views UI module can be disabled after a site goes live).

When using Memcache with Drupal 5.x (it hasn't been ported to 6.x yet, but there is some ongoing work), there's a big "gotcha" that has gotten me on more than one occasion - the "show memcache statistics at the bottom of each page" option on admin/settings/memcache.

This works a lot like the devel module where it provides some useful stuff at the bottom of every page, but since it attaches itself to the bottom of every Drupal page load, it can mess with some AJAX and AHAH calls.

The upload module is one place where it can wreak havoc, pretty much causing any uploads to fail (endless-barber-pole-of-death style). Disabling the statistics or the entire "memcache admin" module returns things to normal.

In looking at the memcache_admin.module code, the problem is (relatively) easy to spot - in the memcache_admin_init() function, there are a couple of cases where the statistics aren't displayed: when the user is anonymous, on update.php, and during an autocomplete AJAX call. The devel_init() function, by comparison, includes a bunch more exceptions where the statistics aren't displayed; "upload/js" is one of them.
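The shape of the fix is simple: skip appending statistics on any path that answers an AJAX/AHAH request. A hedged sketch follows - the function name and path list are illustrative, not the actual memcache_admin code:

```php
<?php
// Illustrative only - not the actual memcache_admin code. The idea is
// to suppress the statistics footer on paths that serve AJAX/AHAH
// responses, where extra markup corrupts the payload.
function memcache_admin_suppress_stats($path) {
  $skip = array('update.php', 'upload/js', 'autocomplete');
  foreach ($skip as $fragment) {
    if (strpos($path, $fragment) !== FALSE) {
      return TRUE;  // This page load should not get statistics appended.
    }
  }
  return FALSE;
}

var_dump(memcache_admin_suppress_stats('upload/js'));  // bool(true)
var_dump(memcache_admin_suppress_stats('node/1'));     // bool(false)
```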

I've gone ahead and created an issue for this fix, hopefully it'll make it into the next release (and prevent me from any more head-smacking). In the meantime, if you're using memcache and file uploads are failing, this could be your solution.

One thing I do while developing Drupal sites is make frequent trips to the "Recent Log Entries" page (admin/reports/dblog) to see what kind of trouble any custom code (or rogue themes and modules) I've added is causing. It is a quick "sanity check" to make sure things are moving in the right direction.

A pet peeve of mine when using a custom theme has always been the "page not found" log entry for the "favicon.ico" file. When using a custom theme (or an overridden favicon.ico file), while Drupal outputs the correct path for the link tag that specifies the file, some browsers still look for the favicon.ico file on the root of the web site. Normally, I "fix" this issue by making a duplicate of the favicon.ico file and placing it on the root of the site.

The teeny-tiny (29 lines, including comments!) favicon module takes care of this by automatically redirecting the /favicon.ico path to the site's actual favicon.ico file. Once enabled, there's no configuration, access permissions, nothing. Superfantastic.
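For the curious, a module like this can be sketched in just two hooks; this is an illustration of the mechanism, not the favicon module's actual code, and the stubs at the top stand in for what Drupal core provides:

```php
<?php
// Illustrative sketch of how such a module can work in Drupal 5/6 -
// not the favicon module's actual code. The constant and the two
// function stubs below exist only so this runs outside Drupal, where
// core provides them.
if (!defined('MENU_CALLBACK')) {
  define('MENU_CALLBACK', 4);
  function theme_get_setting($name) { return 'sites/default/themes/mytheme/favicon.ico'; }
  function drupal_goto($path) { print "Redirecting to $path\n"; }
}

// Register a path for /favicon.ico that browsers request directly.
function favicon_menu() {
  $items['favicon.ico'] = array(
    'page callback' => 'favicon_redirect',
    'access callback' => TRUE,
    'type' => MENU_CALLBACK,
  );
  return $items;
}

// Redirect the request to the theme's actual favicon file.
function favicon_redirect() {
  drupal_goto(theme_get_setting('favicon'));
}
```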

The brand-new Incoming module is something that every site owner should hope that they need. This module, only available for Drupal 6, will send you an email alert when traffic on your site suddenly increases. You set the alert thresholds based on a change in the number of visitors over a specific time span. For example, it can be used to send you an alert when your site gets 100 more users in the last 10 minutes than it got in the previous 10 minutes.

The module also does an admirable job of staying out of the way. It includes a "probability limiter" that lets you set how often it samples traffic. You can set it to only sample on a certain percentage of all page views - this way it won't be running on every page view (and thus reduces overhead).
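A probability limiter like that boils down to a cheap random check at the top of the page hook; the function name and percentage below are illustrative assumptions, not the module's actual API:

```php
<?php
// Illustrative sketch of a "probability limiter": only sample traffic
// on a given percentage of page views so the overhead isn't paid on
// every single request. Names and values here are assumptions.
function incoming_should_sample($percent) {
  return mt_rand(1, 100) <= $percent;
}

// e.g. sample roughly 10% of page views:
if (incoming_should_sample(10)) {
  // ...record the hit and compare visitor counts over the window...
}
```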

I haven't had a chance to test it in a real-world situation yet, but just the idea of it is enough to put it on my "install on every site" list for now.

I have a new client that has hired me to build a small, Drupal-powered web site for his scholarship foundation. Since the site is very straight-forward and not slated to go live for a few weeks, I decided to see if I could build the site using Drupal 6. The big challenge is seeing if I can get all of the image functionality working (and stable) using the bevy of development- and alpha-versions of the necessary modules.

The idea is to be able to add an image field to the standard story and page content types and use ImageCache and Lightbox2 to display the images. As of this writing, the required modules are at various stages of development:

Installation of the modules wasn't as smooth as I hoped. After moving the modules into the sites/all/modules directory, I decided to tempt fate and try to enable all of the relevant modules at once. I immediately got a white screen of death with an error message indicating that the "content_notify" function was not defined. I went back and enabled CCK first, then FileField and ImageAPI, then ImageField, then the rest. It appears that one of the modules I was trying to install tried to call the "content_notify" function from CCK before it was loaded (based on a cursory search of the code, both FileField and ImageField call the "content_notify" function from their install files).

Regardless, once I had all of the modules installed, I set up my configuration as follows:

I created two imagecache presets: "thumbnail" and "large".

I added an imagefield to the "story" content type. For the most part, I used all default values in the imagefield's configuration other than enabling custom title text, allowing for unlimited images per story, and disabling the description field.

I set the teaser and full node display options on "display" tab of the edit "story" content type (admin/content/node-type/story/display) to use Lightbox2. In my case, I used "Lightbox2: thumbnail->500_pixels_tall". This will initially display the thumbnail of the image on the page and when the image is clicked, the Lightbox effect will be used to display the larger image.

Once the configuration was done, I went ahead and added some new "story" nodes with various photos and everything worked great - I haven't seen a single issue yet. This is a simple application of these modules, but I'm quite pleased that these 6 modules - with 3 different maintainers (and a whole bunch of contributors) - are working so smoothly despite the fact that 4 of the modules are alpha-level code and one is a development version. Thanks and great work!