Clean Transportation Funding From The MSRC

Fri, 12/22/2017 - 15:33

The MSRC provides funding for clean fuel alternatives primarily in California. 

The website required a number of customizations beyond what existing modules provide. I also needed to import ten years of legacy data and documents.

http://cleantransportationfunding.org/

The User Interface

The previous website was built in Expression Engine and was apparently not well received by the employees. Well, at least from what they told me, it had its flaws. As a developer I think I have heard this from users of just about every content management system. Regardless, we wanted to simplify the admin interface and make it a bit more elegant.

Drupal 8 allows custom themes and has a well-developed system for creating custom content types. Mix that with some CSS and JavaScript, and what we came up with was rather nice. I set the system up with a single Super Administrator account that uses the normal Drupal admin GUI. This login has access to everything: installing templates and modules, creating blocks, views, and so on. Basically, the things that the average user, including an admin, probably had no business ever seeing, much less modifying.

The regular admin still has access to the permissions page, but I hid the various super-admin permissions. Anyone who has used Drupal 8 knows there are about 50 permissions out of the box; after creating custom types and adding a list of modules, this went up to nearly 100. So with a little simple JavaScript I added the ability to search the page by keyword.
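The idea is just a client-side filter over the permissions table. A minimal sketch, assuming a search box exists above the form (the input id and the row selector here are hypothetical, not Drupal's actual markup):

```javascript
// Filter the permissions page as the user types a keyword.
// The input id and row selector are hypothetical — they would need to
// match the actual markup of the permissions form.
(function () {
  var searchBox = document.getElementById('permission-search');
  var rows = document.querySelectorAll('table#permissions tbody tr');

  searchBox.addEventListener('keyup', function () {
    var term = searchBox.value.toLowerCase();
    for (var i = 0; i < rows.length; i++) {
      var matches = rows[i].textContent.toLowerCase().indexOf(term) !== -1;
      rows[i].style.display = matches ? '' : 'none';
    }
  });
})();
```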

Another area where a little JavaScript made the interface more manageable was the Projects content type. This type had some 50 fields available, most of which were apparently never used. I, of course, couldn't simply discard the fields that seemed never to be used, and yet adding or editing a project was daunting. My solution was to provide a cloud of potential fields in the sidebar. When a user creates a new project they are presented with about five fields, based on the most common fields used in previous projects. If they need more, they simply click the necessary field in the cloud and it is added to the form.
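The cloud itself is simple: every optional field is rendered but hidden, and clicking its tag in the sidebar reveals it. A rough sketch, with made-up class names, data attributes, and field names:

```javascript
// Hide all but a default handful of fields on the Project form and reveal
// the rest when their tag in the sidebar "field cloud" is clicked.
// The class names, data attributes, and field names are hypothetical.
(function () {
  var defaults = ['field_title', 'field_sponsor', 'field_amount',
                  'field_status', 'field_summary'];

  var wrappers = document.querySelectorAll('.project-form .field-wrapper');
  for (var i = 0; i < wrappers.length; i++) {
    var name = wrappers[i].getAttribute('data-field-name');
    if (defaults.indexOf(name) === -1) {
      wrappers[i].style.display = 'none';
    }
  }

  var tags = document.querySelectorAll('.field-cloud .field-tag');
  for (var j = 0; j < tags.length; j++) {
    tags[j].addEventListener('click', function () {
      var target = this.getAttribute('data-field-name');
      var wrapper = document.querySelector(
        '.project-form .field-wrapper[data-field-name="' + target + '"]');
      if (wrapper) {
        wrapper.style.display = '';
      }
    });
  }
})();
```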

The Contractor / Reviewer Proposal Interface

Requests for proposals are posted to the website, and prospective contractors are given an interface where they can upload proposals (provided the RFP allows for online submissions). Previously, contractors could only upload a single document, and any subsequent information or reports were uploaded in the same manner. These documents were simply added to a document library; there was no association between the document, the contractor, or the RFP outside of the document naming convention. If a contractor failed to follow the upload process, the documents could be lost among all the other documents in the library.

The new system allows multiple documents, as well as images, to be uploaded. Contractors can, if they prefer, type or paste their proposal directly into a form. Every proposal must be associated with a currently open RFP; a drop-down menu lists all open RFPs that allow online submission. All subsequent documents are also associated with the initial proposal, so contractors can see and modify their previous submissions. When a proposal is created or modified, or a subsequent report is added, an email is dispatched to all members of the review committee.

Review committee members are provided with a separate interface allowing them to search and view proposals based on a number of fields. 

The contractor interface and the reviewer interface are separated from each other and from all other administrative login types on the website, with the exception of the Super Administrator.

Importing Data

The previous website was based on Expression Engine. Expression Engine is open source, but because the production version is a paid product there is not a lot of documentation or many examples available for exporting the data. For this reason I had to map out the database and reverse engineer it. Not a problem; I am quite adept at MySQL.

The data, for the most part, is stored in one large table with about 100 columns. Each entry is given an ID and mapped to a number of other tables based on its post type and any extra data that needed to be stored. It was a bit more complicated than it sounds, actually. Figuring out which foreign keys matched which content was difficult because the columns can be dynamic. To determine a foreign key I had to consult another table containing the list of keys to match, and in some cases that would match again to yet other tables.

In any case I was able to map it out and create a process for exporting the data, normalizing it and putting it into the new database. This entire process took about 8 hours.
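The export itself was done directly against MySQL, but the mapping step boils down to translating generic field_id_N columns into named fields using the field-definition table. A sketch of that idea in JavaScript, with hypothetical table contents and field names:

```javascript
// Expression Engine stores custom field values in generic columns
// (field_id_1, field_id_2, ...) and keeps the meaning of each column in a
// separate field-definition table. The rows below are hypothetical
// stand-ins for the exported data.
var fieldDefs = [
  { field_id: 3, field_name: 'project_sponsor' },
  { field_id: 7, field_name: 'award_amount' }
];

var entries = [
  { entry_id: 101, title: 'Clean Fleet Grant',
    field_id_3: 'City of Example', field_id_7: '250000' }
];

// Build a lookup of generic column name -> human-readable field name.
var columnMap = {};
fieldDefs.forEach(function (def) {
  columnMap['field_id_' + def.field_id] = def.field_name;
});

// Produce normalized records ready for import into the new database.
var normalized = entries.map(function (entry) {
  var record = { legacy_id: entry.entry_id, title: entry.title };
  Object.keys(entry).forEach(function (column) {
    if (columnMap[column] !== undefined) {
      record[columnMap[column]] = entry[column];
    }
  });
  return record;
});

console.log(normalized);
```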

The Email Newsletter And Notification System

In the previous website the newsletter was built in the CMS using basically the same system used to create a page or article, and they still wanted the newsletter to be created with that same idea. I have built email newsletters using Constant Contact, Granicus, and Mail Chimp, as well as building email from scratch, for a number of years. To me it made sense to give them a better interface, one that felt like creating a newsletter with multiple articles rather than creating a simple article that happened to be sent in an email.

The interface I gave them allowed the user to render the email before actually sending it, as well as to send a preview. Rendering the email not only gave the user an opportunity to view it as it would appear, it also allowed me to inject the styles inline based on how the browser interpreted the CSS. The problem is that email clients are far worse at rendering than browsers ever were. To this day many email clients do not use CSS, or use it sparingly, and the consistency between versions is pretty insane. Using Litmus (a tool for previewing email across clients for debugging), I have seen an email render its CSS correctly in one version of Outlook, fail in the next, work again in the version after that, and then fail again.

My solution: when the user clicks "render email", a JavaScript routine runs that loops over the email's DOM nodes and looks at what the browser has actually computed for each one. I then inject that CSS inline into each node. This way every part of the email carries its own style and does not depend on how the client handles inheritance. For even greater backward compatibility, my code injects a specific width into the main table and into each image using old-fashioned inline HTML attributes.
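A simplified sketch of that render step (the container id, the list of properties copied inline, and the fixed table width are illustrative choices, not the production code):

```javascript
// Walk the rendered email and copy the browser's computed styles inline so
// the markup no longer relies on stylesheets or CSS inheritance.
function inlineEmailStyles() {
  var container = document.getElementById('email-body'); // hypothetical id
  var properties = ['font-family', 'font-size', 'color', 'background-color',
                    'padding', 'margin', 'text-align', 'line-height'];

  var nodes = container.querySelectorAll('*');
  for (var i = 0; i < nodes.length; i++) {
    var computed = window.getComputedStyle(nodes[i]);
    for (var j = 0; j < properties.length; j++) {
      nodes[i].style.setProperty(
        properties[j], computed.getPropertyValue(properties[j]));
    }
  }

  // Old-fashioned width attributes for clients that ignore CSS widths.
  var mainTable = container.querySelector('table');
  if (mainTable) {
    mainTable.setAttribute('width', '600');
  }
  var images = container.querySelectorAll('img');
  for (var k = 0; k < images.length; k++) {
    images[k].setAttribute('width', images[k].width);
  }
}
```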

The Email Template System

Each email type has an associated template, and the type is attached to Drupal's user type. When new users register an account they are given an opportunity to select, from a list of checkboxes, the notifications they would like to receive. Custom templates can be created easily using basic HTML and CSS (with a stylesheet), following a few simple rules and using pre-defined shortcodes.
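As a rough illustration of how shortcode substitution works (the [[code]] syntax and the code names here are made up, not the actual codes used on the site), replacement is just token substitution against the values of the newsletter being built:

```javascript
// Illustration only: swap pre-defined shortcodes in a template for values
// from the newsletter being assembled. Syntax and names are hypothetical.
function applyShortcodes(template, values) {
  return template.replace(/\[\[(\w+)\]\]/g, function (match, code) {
    return values[code] !== undefined ? values[code] : match;
  });
}

var html = applyShortcodes(
  '<h1>[[newsletter_title]]</h1><div>[[article_list]]</div>',
  { newsletter_title: 'MSRC Updates', article_list: '<p>First article…</p>' }
);
```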

I included instructions for creating an email template in the developer's user guide.

Sending The Email

We decided to use Amazon as the SMTP service. Amazon provides the Simple Email Service (SES) API, which is capable of sending multiple emails in one push, but they don't provide any method in the API for tracking email opens, clicks, and so on. I know it doesn't seem to make sense; I researched for a day looking for some legitimate method. I think this is because of a growing consensus that email tracking is kind of unreliable.

But, of course, explain that to a customer. So the best method was to send the emails in batches, record the initial result, and add a tracking pixel to each email. The pixel works by hashing the article ID and linking to the hashed ID as a JPEG on the server. The JPEG does not actually exist, so each open simply shows up as a 404 request in the server logs. When the cron job runs, it counts the occurrences of the article hash and records them as opens.

This process records sends, errors, and opens, and a redirect injected into all the links in the email allows tracking of click-throughs.
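To illustrate how the tracking gets wired into a rendered email (the pixel path, redirect endpoint, and parameter names are hypothetical; the hashing and the log counting happen server-side):

```javascript
// Add an open-tracking pixel and rewrite links through a redirect.
// Paths and parameter names are hypothetical.
function addTracking(container, articleHash) {
  // Open tracking: a 1x1 "pixel" pointing at a JPEG that does not exist,
  // so every open leaves a 404 entry in the server logs.
  var pixel = document.createElement('img');
  pixel.src = 'http://cleantransportationfunding.org/pixels/' + articleHash + '.jpg';
  pixel.width = 1;
  pixel.height = 1;
  container.appendChild(pixel);

  // Click tracking: route every link through a redirect that logs the hit.
  var links = container.querySelectorAll('a');
  for (var i = 0; i < links.length; i++) {
    var original = links[i].getAttribute('href');
    if (!original) { continue; }
    links[i].setAttribute('href',
      'http://cleantransportationfunding.org/redirect?e=' + articleHash +
      '&url=' + encodeURIComponent(original));
  }
}
```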

I was concerned that sending so many emails would bog down the server, so I designed the system to run from a basic PHP script kicked off by cron.

This script asks the server whether it has resources available. If it does, the script gathers any messages scheduled to be sent and the email addresses of all users who signed up to receive that notification. Before each batch the server checks again that resources are available; if they are not, the system retries after 30 seconds. If the second attempt also fails, the list of unsent addresses and the rendered message are stored in a file.

The next time the cron job runs, if it has no other tasks, the system grabs all of the unsent messages and sends them in a batch.
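The production job is a basic PHP script fired by cron; purely to illustrate the batching and retry flow described above, here is the control logic sketched in JavaScript with made-up helper names:

```javascript
// Sketch of the send loop only. The "mailer" argument bundles hypothetical
// helpers for checking server resources, sending a batch, and spooling
// unsent messages to a file; the real job is a plain PHP cron script.
async function runSendJob(queue, mailer) {
  // Nothing new scheduled? Retry whatever was parked on the last run.
  if (queue.length === 0) {
    queue = await mailer.loadUnsent();
  }

  while (queue.length > 0) {
    if (!(await mailer.serverHasResources())) {
      // Wait 30 seconds and check once more before giving up.
      await new Promise(function (resolve) { setTimeout(resolve, 30000); });
      if (!(await mailer.serverHasResources())) {
        // Park the remaining recipients and the rendered message in a file.
        await mailer.saveUnsent(queue);
        return;
      }
    }
    var batch = queue.splice(0, 50);   // batch size chosen arbitrarily here
    await mailer.sendBatch(batch);     // record sends and errors per batch
  }
}
```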