We're proud to provide Development services.

An easy-to-use API key management tool.

Processes
  • Agile/Kanban

To remove the need for a deep understanding of AWS, LCM built a serverless API that interacts with API Gateway. This allows the EOTSS team to manage API keys without having to interface with AWS directly.

The Commonwealth of Massachusetts maintains several APIs used by a variety of internal and external teams.

The APIs are built leveraging AWS API Gateway, which allows each granted user to be assigned an API key with functionality such as rate limiting and access to one or more of the state’s APIs. This system is great to work with, but it requires the user managing the applications to be very familiar with AWS and API Gateway in order to add or modify a user’s access to each API.

The application requires OAuth2 authentication through GitHub in order to access the tool, and has a ReactJS frontend that allows authenticated users to easily administer the keys.
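
The handler logic can be sketched as follows. This is a hypothetical sketch, not the actual EOTSS tool: the event fields and actions are invented, and the AWS client is injected so the logic can be exercised without AWS credentials. The boto3 `apigateway` operations shown (`create_api_key`, `create_usage_plan_key`, `update_api_key`) are real.

```python
# Hypothetical sketch of a key-management handler. Field names and
# actions are illustrative; the AWS client is injected for testability.

def handle_key_request(event, apigw_client):
    """Dispatch a key-management action to an API Gateway client.

    `apigw_client` is expected to expose the boto3 `apigateway` client
    interface (create_api_key, create_usage_plan_key, update_api_key).
    """
    action = event["action"]
    if action == "create":
        key = apigw_client.create_api_key(name=event["name"], enabled=True)
        # Attaching the key to a usage plan is what enables rate limiting.
        apigw_client.create_usage_plan_key(
            usagePlanId=event["usage_plan_id"],
            keyId=key["id"],
            keyType="API_KEY",
        )
        return {"status": "created", "keyId": key["id"]}
    if action == "disable":
        apigw_client.update_api_key(
            apiKey=event["key_id"],
            patchOperations=[
                {"op": "replace", "path": "/enabled", "value": "false"}
            ],
        )
        return {"status": "disabled", "keyId": event["key_id"]}
    raise ValueError(f"unknown action: {action}")
```

In production, the handler would receive `boto3.client("apigateway")` as its client.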

Enabling Pega's developers to confidently and quickly respond to customer feedback

Processes
  • Continuous Delivery
Team Leadership
  • Senior Developer
    Chris Shelton

Pegasystems Inc (Pega) is the leader in cloud software for customer engagement and operational excellence. Businesses rely on Pega’s AI-powered software to optimize customer interactions while ensuring their brand promises are kept.

Pega’s development team grew rapidly and was suddenly faced with the technical challenges of delivering value to customers confidently and at speed. Pega hired Last Call Media to help improve, automate, and scale their deployment pipelines and implement DevOps best practices.

How we did it

After studying Pega’s workflow to identify areas of improvement and discuss pain points, we found that the team’s main challenges were around deployment. Developers did not have a strong sense of confidence in the deployability of the code, and team members still relied heavily on manual testing during the development cycle. This caused delays that frustrated the team and impacted business agility.

We all agreed that the development team should be able to write automated tests to rely less on tedious manual testing. With this goal in mind, we looked into removing blockers for the Pega team.


Pega’s hosting provider at the time made it prohibitively expensive to spin up the desired number of environments to test on, meaning we had to find a cost-effective way to spin environments up and down to enable automated testing. Providing those initial environments would also unlock the ability to demo and review work more easily before deploying to production.

Pega is a highly technical group, and we wanted to fully empower them to do their best work confidently. After reviewing the various options on the market and their associated costs, both internally and with Pega, we chose to develop a custom solution built on Kubernetes in AWS. We made sure the entire system's configuration was built with Terraform and kept in a GitHub repository, meaning anybody on Pega’s team could dig into the codebase to understand it and make changes in the future. Fully documenting the new system and how to use it has always been a priority for any client engagement.


With that foundation set up, we were quickly able to load Pega’s application into Docker images, write sample automated tests, and give their development team the ability to deploy an unlimited number of environments, one per GitHub branch, at a reasonable cost.
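
One small but essential piece of a branch-per-environment setup is deriving valid names from arbitrary Git branch names. The scheme below is an illustrative assumption, not Pega's actual convention; the underlying constraint is real (DNS labels allow lowercase alphanumerics and hyphens, up to 63 characters).

```python
import re

# Illustrative sketch: derive a Kubernetes namespace and preview hostname
# from a Git branch name. The naming scheme is an assumption; the DNS
# label constraint (lowercase alphanumerics/hyphens, max 63 chars) is real.

def env_name_for_branch(branch: str, max_len: int = 63) -> str:
    name = branch.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    name = re.sub(r"[^a-z0-9]+", "-", name).strip("-")
    return name[:max_len].rstrip("-")

def preview_host(branch: str, base_domain: str = "preview.example.com") -> str:
    # base_domain is a placeholder, not a real deployment domain.
    return f"{env_name_for_branch(branch)}.{base_domain}"
```

With this, `feature/ABC-123_new-login` maps to a predictable, DNS-safe environment name.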

 

As a result of our work, Pega’s team gained confidence in their automated build and testing systems, bringing the company closer to continuous deployment. Pega can now confidently deliver incremental improvements in small batches. This improved the quality of their product and helped uncover bugs before they reached production.


Releasing smaller batches of work led to low-risk deployments and allowed improvements to reach the site’s users faster. Automating testing lowered costs and improved quality. The Pega team was able to rely on computers to do the repetitive tasks computers are good at and freed up Pega team members to do what humans are good at: listening to customers, solving problems, and being creative.

Engaging and retaining the next generation of experiential learners.

Team Leadership
  • Art Director
    Colin Panetta

Wentworth Institute of Technology is a technical design and engineering university in Boston, Massachusetts. Wentworth is continuously investing in creating transformational educational experiences for its students and embracing a culture of innovation and creativity. 

The goals of the institute translate directly into the digital experience they need to provide to visitors of wit.edu: their current and prospective students, as well as faculty and staff. Last Call Media has been a close partner of Wentworth for years, helping them accomplish important milestones and solve challenging problems: creating an inspiring and meaningful user experience, developing a seamless process for hundreds of content authors, and tackling inefficient development processes.

How we did it

Design to turn a vision into reality

We began by really thinking outside the box of what a college website “should” be. The institute was looking to showcase their commitment to innovation, diversity and inclusion, as well as exceptional educational experiences. Last Call Media collaborated with the Wentworth team to create their new branding direction and to turn this vision into reality. The Last Call Media design team created a fresh, vibrant new look for wit.edu to capture the attention and excitement of the school’s two key audiences: prospective students and their parents.

After translating existing static graphic designs into UX/UI prototypes with an eye toward accessibility, we introduced the Wentworth team to a new and improved method of designing and site-building that utilizes a design system, so the site could be built systematically rather than having every page conceived and built individually. Building the Wentworth site this way set us up to create a great-looking site with an efficient development phase, playing to Drupal’s strengths of flexibility and customizability.

WIT-homepage

A consistent, sitewide visual language

To avoid the need to uniquely design and theme every element on every page, Last Call Media created a maximally efficient workflow for Wentworth. We created a consistent sitewide visual language that becomes more intuitive the more users interact with the site.

We implemented a design system-based strategy for the redesign, defining a number of styles and components that would be reusable throughout the site. It included:

  • 40 styles (for things like colors and text styles),
  • 50 elements (which can range in scope from things like buttons to entire sections of pages),
  • 10 fully-responsive page designs—many of these involving multiple design iterations.

Authoring experience for hundreds of content creators

Within an organization like a college or university with often hundreds of content authors, it’s key to remember that site and content changes will often be coming from multiple people throughout the school, who all have varying levels of experience and comfort working within a CMS. This was absolutely the case for Wentworth, and so we varied the flexibility of each content type based on who would be building them. Content types that would be handled by the more experienced members of Wentworth’s web and marketing teams were allowed maximum flexibility—they can essentially put any component on any page they want, in any order they want. Content types that would be used by a wide array of authors with inconsistent levels of technical expertise offered a more structured layout, in order to minimize complexity and maintain the integrity of the website’s information architecture and UX/UI.

The impact of continuous delivery and maintenance

Over the years, we’ve worked with Wentworth to implement industry best practices in terms of programmatic coding, database architecture, and staged deployment and automated testing workflows. Our first engagement with Wentworth involved completing a large set of outstanding updates, setting up Single-Sign-On with Shibboleth, configuring Apache Solr with multicore for development along with boosted “more like this” search results, and other search enhancements.

Our Core Services team was also able to complete Wentworth’s site migration to Pantheon on a tight timeline of less than a week, prior to the school’s Thanksgiving holiday in 2017. This has led to significant cost savings for the school, and provided a clearer path to empowering their small internal development team; the workflows on the new platform were easier for Wentworth to manage, and they benefited greatly from the multidev environment feature.

Our work with the Institute has spanned everything from regular core and module updates and migrating their site to a new hosting platform, to refactoring user groups and URL structures, accessibility updates, and SEO best-practice improvements. We maintain a close working relationship with Wentworth in order to continue offering advice and guidance on how to make the most of Drupal.

Through a refreshed online presence, the new wit.edu represents Wentworth’s commitment to creating transformational educational experiences for students, and a culture of innovation and creativity. Wentworth is now able to provide site visitors with a more accurate picture of what Wentworth is really about to help them decide whether they can see themselves or their child succeeding at Wentworth. 

Last Call Media is proud to have been Wentworth’s partner in improving user experience, creating a seamless process for making content changes for hundreds of content authors, and modernizing development processes.

A faster-than-lightning addressing API.

Processes
  • Agile/Kanban

The Commonwealth of Massachusetts deals with addresses from many organizations, each entering addresses in its own unique way. This means a single address can be entered in potentially dozens of different forms. The Massachusetts Bureau of Geographic Information (GIS) maintains a database that uniquely identifies each individual address within the state in a consistent format, so they needed a way to take the many possible variants of each address and reconcile them to a canonical address entry within their internal data set, which was stored in a format that is difficult to query against (FGDB). Because of the volume of data the state deals with, using an external service was outside the allowed budget.

LCM worked with the Commonwealth to build an extract-transform-load (ETL) tool that runs whenever an updated data set is provided by GIS. The ETL takes millions of records from the FGDB dataset, normalizes them, and imports them into an AWS RDS database. The state had a proof-of-concept process for this that took hours to run. By leveraging AWS Fargate and the open-source GDAL library, we brought the required time to under 10 minutes.
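
A normalization step like the one the ETL performs might look like the following sketch. The abbreviation table and rules here are illustrative, not the project's actual logic.

```python
import re

# Minimal sketch of address normalization before loading into RDS.
# The suffix abbreviation table is illustrative, not the real rule set.
SUFFIXES = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD", "DRIVE": "DR"}

def normalize_street(raw: str) -> str:
    """Uppercase, strip punctuation, collapse whitespace, abbreviate suffixes."""
    tokens = re.sub(r"[^\w\s]", "", raw.upper()).split()
    return " ".join(SUFFIXES.get(t, t) for t in tokens)
```

Normalizing both the canonical records and incoming queries the same way is what makes exact matching possible later.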

After the ETL, we built an API endpoint using Serverless.js (AWS) that takes in addresses in a typical mailing address format (which allows for many potential variations). We leverage libpostal to separate the address into distinct address components, and then perform a query against the RDS database to see if any matches are returned.
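
The matching step can be sketched as below. In production, libpostal's parser produces labeled components from the free-form input; here the parsed components are passed in directly, and the table and column names are invented, so only the query-building shape is shown.

```python
# Sketch of the match-query builder. Components mimic libpostal's labeled
# output; table and column names are illustrative assumptions.

def build_match_query(components: dict):
    """Build a parameterized SQL query from address components."""
    column_for = {
        "house_number": "house_number",
        "road": "street_name",
        "city": "city",
        "postcode": "zip_code",
    }
    clauses, params = [], []
    for label, value in components.items():
        column = column_for.get(label)
        if column:
            clauses.append(f"{column} = %s")
            params.append(value.upper())
    sql = "SELECT address_id FROM addresses WHERE " + " AND ".join(clauses)
    return sql, params
```

Parameterized queries keep user-supplied address fragments out of the SQL text itself.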

The serverless architecture of both the ETL and the API endpoints are highly scalable and secure, and the state is only charged while they are actually being used. This allowed us to create a flexible address matching system that took advantage of their internal dataset of unique addresses, and allowed user input in a wide variety of formats, for a very small cost.

Outcomes: The API has gone through internal testing, is fully functional, and is being rolled out across the Commonwealth.

Measurements: Delivery of most API responses in under 500 milliseconds.

Finder’s performance boost with an improved release process

Finder helps millions of people worldwide make better decisions by allowing them to compare a wide range of products and services. Finding the right credit card, buying a home, or getting health insurance can be a daunting task. Finder.com makes the research more straightforward, and consequently, can save users time and money.  

Since a major part of Finder’s business is comparing numerous products, Finder.com needs to provide users with tools that are quick and easy to use while still displaying a wealth of dynamic information. To accomplish this goal and give users a great experience, Finder’s team of developers needs to be able to continuously improve the platform and iterate quickly to regularly evaluate what delights users and drives engagement.

Last Call Media supports Finder’s efforts to build an exceptional digital experience for its users within the WordPress platform. We also helped their software development team rethink the way they deploy to production.

How we did it

On a daily basis, dozens of developers work simultaneously on Finder.com, adding new functionality, fixing bugs, and creating new ways to provide value to users. As the team scales, it is faced with several challenges around managing the deployment pipeline. 

Until recently, developers were often limited and blocked by a complicated build process, where any code change took about 12–16 minutes to go live. Making a change often required modifying multiple repositories, since all the themes and plugins were split and could not be fully decoupled. Deploying to staging environments was a manual process, and code reviews required lengthy instructions and were thus error-prone. The developer who authored a change needed to remember to merge all repositories for a successful deployment, or risk bringing down the site.

Speeding up the deployment process 

An ineffective deployment pipeline can easily add up to hundreds of hours a month spent waiting for builds to complete, time that could have been spent delivering other functionality instead. The development workflow should be smooth, with stumbling blocks removed. Since Finder’s engineering team is growing rapidly, this problem needed to be addressed first.

To speed up time to market, we overhauled the build process at Finder. We implemented Buildkite to enable continuous integration in a more efficient way. As a result, build time decreased by more than 50%, to only 8 minutes.

This increased efficiency sped up the process for Finder’s developers to get their work onto staging environments and out to production, and in the hands of customers.

Finder-main

Offering stability to the development team

Another challenge for Finder was the stability of deployments and the time it took to deploy a hotfix to production. If multiple developers tried to deploy to production at the same time, they would often frustratingly block each other, further increasing everyone’s time to production.

To address this, we identified and rearranged certain key jobs so they could run in parallel. We also identified build steps that could be made more efficient. For example, many jobs downloaded the same libraries, and when the final Docker image was built, these libraries were downloaded yet again, synchronously. Eliminating this duplication improved speed significantly.
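
The payoff of parallelizing jobs can be illustrated with a toy model: a pipeline's wall-clock time drops from the sum of all job durations to the length of the longest dependency chain. The job names, durations, and dependencies below are made up.

```python
# Toy model of pipeline scheduling: sequential time is the sum of all
# job durations; parallel time is the longest dependency chain.

def sequential_time(durations: dict) -> int:
    return sum(durations.values())

def parallel_time(durations: dict, deps: dict) -> int:
    """Finish time when independent jobs run concurrently.

    `deps[job]` lists the jobs that must finish before `job` starts.
    """
    finish = {}
    def finish_time(job):
        if job not in finish:
            finish[job] = durations[job] + max(
                (finish_time(d) for d in deps.get(job, [])), default=0
            )
        return finish[job]
    return max(finish_time(j) for j in durations)
```

For example, if only the Docker build depends on an asset-compilation step, the three other jobs can run alongside it, and total time collapses to that one chain.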

In the end, speeding up the build process meant increased stability and a decreased time to deploy a hotfix.

Building comparison tables

In addition to platform work, Finder brought Last Call Media on board to help improve their comparison tables. Finder’s business model requires them to accurately connect users with relevant products. If a user with a “poor” credit score tries to apply for a credit card that only accepts “very good” scores, they’ll waste their time, and Finder will not be succeeding in its mission to connect users to relevant products. The tables were written in legacy JavaScript, HTML, and CSS, and we worked to rebuild them in modern React, adding modals on top of the tables to surface the most important data filters to the user first.

Now, after users make an initial selection, they can see the most relevant results for their case. We added advanced interactive features such as calculators to the top of tables so users can input data relevant to their personal circumstances and see automatically calculated savings for each product.
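
A calculator of this kind can be sketched as follows, assuming simple annual interest; the actual formulas and product data are Finder's own.

```python
# Hedged sketch of a product-comparison calculator: given a balance and
# each product's rate, compute and rank yearly earnings. Simple annual
# interest is an assumption; real products may compound differently.

def yearly_interest(balance: float, apy_percent: float) -> float:
    return round(balance * apy_percent / 100, 2)

def rank_products(balance: float, products: dict) -> list:
    """Return (product, interest) pairs sorted by highest earnings."""
    earnings = {name: yearly_interest(balance, apy) for name, apy in products.items()}
    return sorted(earnings.items(), key=lambda kv: kv[1], reverse=True)
```

The user's input (their balance) personalizes the same table for every visitor.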

Finder-tables

Taking advantage of the WordPress platform and Buildkite, Last Call Media empowered Finder’s development team to deploy to production efficiently. The improved build process enabled developers to get their work to production, and into the hands of customers, much more quickly. Additionally, Finder’s comparison tables now present personalized results, making the user experience far more satisfying.

Now, Finder has a path forward to build stable, interactive comparison tools for users in a highly iterative way.

Generating complex product comparison pages dynamically.


A major part of the Finder application is how it allows users to discover and compare various products. Finder produces thousands of pages for individual products and pages that help visitors compare two or more products. Historically, these comparison pages had to be created manually in Finder’s CMS, WordPress, by its editorial staff. Comparison pages are now built dynamically, without manual editorial work, meaning Finder can quickly publish all required pages, especially before major events like Black Friday.

Last Call Media developed a new way to quickly and programmatically surface thousands of pages for Finder. All the data for these products is contained in two custom-built APIs, which also pull data from outside APIs. We were able to build out Finder’s APIs and create a custom WordPress plugin to render pages solely from data contained in the APIs.
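
The plugin's core idea, pages derived entirely from API records, can be sketched like this. The field names and slug scheme are assumptions for illustration, not the plugin's actual data model.

```python
import re

# Illustrative sketch: pages are generated from API records instead of
# hand-created CMS entries. Field names and the slug scheme are assumed.

def slugify(text: str) -> str:
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def coupon_pages(api_records: list) -> list:
    """Map API records to renderable page definitions."""
    pages = []
    for record in api_records:
        pages.append({
            "path": f"/coupons/{slugify(record['merchant'])}",
            "title": f"{record['merchant']} coupons and promo codes",
            "offers": record["offers"],
        })
    return pages
```

Because the pages are pure functions of the API data, refreshing the data refreshes every page at once.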

This means that editors no longer need to create WordPress pages for coupon pages, allowing Finder to quickly enable hundreds of coupon pages in time for Black Friday, decreasing costs associated with publishing efforts, and increasing revenue by having more coupon pages up faster.

In addition to the coupon pages, work was done to expand this system to replace Finder’s manually created pages that compare two currencies. By replacing these pages with API-backed pages, Finder can reduce costs and increase accuracy, as well as develop centralized templates for displaying the information, allowing for quick, global changes and a unified design across these pages.

Now Finder can move faster and provide visitors the freshest information.

Learning Page Redesign.

Processes
  • Agile/Kanban
  • Agile/Scrum
Team Leadership
  • Senior Producer
    Kelly Albrecht

Working with the International Land Coalition gave us another opportunity to use our agile design process, generating ideas and a few unique solutions.

ILC has a lot of work going on around the world and we wanted to give users a way to explore the content on the site about each specific project that’s going on and the members involved.

Our approach was to focus on the exploration aspects. What would give users the best sense of the context of these global projects? How do we give attention to the members involved? What’s the best experience for sorting through the variety of project categories, subcategories, and locations? And how do we present all this information without overwhelming the user?

How we did it

Building a relationship at the start

It was crucial to let ILC know that this was a journey we would take together, and that they had our full support, as well as our expertise for any questions or insights. They had many different goals for the project, but with the specifics undefined, we worked with them to help figure out the details and create a fully realized vision.

The project’s dynamic was a healthy, constant dialogue between our creative team and everyone on the ILC team. We listened to their insights, heard what we needed to consider, and learned the parameters we should work within. As things progressed, we started checking off the points on everyone’s checklists so we could be confident about the direction we were going.

First problem: Tools for finding relevant content

As mentioned before, ILC has many internal organizational projects. We needed an experience that helped users easily find content relevant to them. We mocked up a few ideas for a filter mechanism then iterated, each time pointing out the pros and cons then making changes specific to the drawbacks. This was our solution:

Image of ILC Filter
Image of ILC Learning Page Filter

The thought process behind it was straightforward: present a lot of options and information without overwhelming the user, make it accessible while browsing, and allow some filter terms to tell their own story on hover. It took a few tries to get here, but we believe it accomplishes those goals.

Second problem: The Map

As mentioned earlier, context was key: during the project we decided a map did a great job of showing the global reach of ILC. We utilized the core framework of an existing map on their website but completely redesigned the visuals and added a little more functionality. On first load, we show users the globe with a few markers showing the number of members in each area. From there, users can zoom in on an area or region of interest to find which members are doing work there and what projects are going on. Here’s the breakdown:

Image of ILC Learning Page map

Our work was well received by the ILC board, and our solutions for giving site users the best sense of the context of resource content from all over the globe were deployed on time and on budget. Site users are now learning about ILC members through new ways of exploring a variety of categories and locations.

Localization means more than multilingual.

Processes
  • Agile/Kanban

Blackboard is a global education technology company whose product offerings differ in various parts of the world, and in different languages. In order to provide relevant information for site visitors in different markets, Blackboard’s new Drupal 8 site had to not only provide content in different languages, but also content specific to each visitor’s geographic location. It also needed to be faster and cheaper for Blackboard to spin up sites of varying complexity in new markets.

Goals

The goal of localization for Blackboard’s corporate site was to provide site visitors from around the world with content specific to their region in their region’s language. This means that in addition to displaying content in different languages, we also needed to be able to incorporate regional and dialect-specific (think “color” vs “colour”) versions of content. Also, since not all product offerings are available in every region we needed region-specific navigation as well (to avoid linking to irrelevant content for some users). All of this variation needed to exist between each regional site while allowing some content types to be shared across each region, such as dynamic “resources”, “case studies” and partner content.

Implementation

Location Detection: Not only did we need the ability to display regional content, we also wanted users to be aware of it. To do this, we use Acquia’s GeoIP service to determine where the user is visiting from. If their country doesn’t match the regional content they are viewing, we present a modal dialog showing them that we have a section of the site that may be better suited to them. Once they either follow the link or indicate that they are happy where they are, we set a cookie so that they don’t continually see the alert. If they leave the section they selected, we alert them again but provide the option to stay where they have navigated to.
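
The decision logic for the modal can be sketched as a small pure function. The cookie name and the country-to-region mapping below are illustrative; in the real site, the visitor's country comes from Acquia's GeoIP service.

```python
# Sketch of the region-modal decision. The mapping and cookie name are
# illustrative assumptions; the country code would come from GeoIP.
REGION_FOR_COUNTRY = {"US": "us", "GB": "uk", "AU": "anz"}

def should_show_region_modal(country: str, current_region: str, cookies: dict) -> bool:
    if cookies.get("region_choice"):
        # The visitor already chose (or dismissed); don't alert again.
        return False
    suggested = REGION_FOR_COUNTRY.get(country)
    return suggested is not None and suggested != current_region
```

Setting the cookie after the first interaction is what keeps the alert from reappearing on every page.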

Regional Sections: To provide the regional sections of the site, we relied heavily on the “group” module. Each region is a “group” entity with its own content. Several “group-level” fields allow us to define things like language, the navigation menus that appear in each section, region-specific 4xx error pages, and the path prefix that begins the alias of each page belonging to the group.

Each page node is a part of exactly one group. There may be pages in different groups with similar titles and content, but this model allowed us to have content in the same language and still handle regional colloquialisms, dialects, etc. While the page nodes were distinct for each region, we still had to recognize that some types of content had to be reused across regional sections, because creating new educational resource nodes with identical content for each region would not be sustainable. Those content types are allowed to belong to multiple groups. Content listings are built so that they only display content belonging to the regional section currently being viewed.
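
The listing rule reduces to a simple membership filter. The data shapes below are illustrative, not Drupal's actual entity API:

```python
# Sketch of group-scoped listings: a node appears in a region's listings
# only if it belongs to that region's group. Shared content types (like
# resources) may belong to several groups. Data shapes are illustrative.

def listing_for_region(nodes: list, region: str) -> list:
    return [n for n in nodes if region in n["groups"]]
```

A region-specific page node lists one group; a shared resource lists every group it should appear in.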

Multilingual: Possibly the most obvious part of localization is enabling content to be displayed in multiple languages. Drupal’s standard multilingual functionality doesn’t play nicely with a content model that supports multiple versions of content in the same language (product pages can vary between markets; think spelling differences between American and British English). To accommodate the model Blackboard required, we decided to use Drupal’s language modules but leave content translation out of the equation. Instead, we create a different node for each language the content is to be displayed in. On the administrative side this approach cost us the “translate” operation for nodes, but in turn gave us a huge amount of flexibility. We were still able to create a site that supports content entered in 10+ languages (including RTL languages like Arabic) and accommodates localized nuance for each region as well.

 

Blackboard’s requirements were complex and caused us to rethink our typical multilingual strategy. Instead of simply creating a site that supports content in multiple languages, the approach taken here enables Blackboard’s internal team to create new regional sections of their site, taking into account available product offerings as well as language and regional nuance, creating a platform for a site that is not only multilingual, but truly global.
 

Making it easier to deploy frontend applications.

Processes
  • Agile/Kanban

Last Call Media developed a microfrontend strategy with Finder’s in-house developers. Using Next.js and Docker, Last Call Media worked to make it easier for engineers at Finder to build, test, and deploy small frontend applications, like tables, that are completely independent of each other, while styling them with Finder’s sitewide library of React components.

Finder needs to provide users with tools, like interest calculators, that are quick and easy to use while still displaying a wealth of information. Because this is a core part of the business, Finder’s large global team of engineers needs to be able to iteratively improve these comparison tools based on customer feedback and test changes to see what works best to drive user engagement.

As an example of this, Finder asked Last Call to improve their comparison tables. The tables were written in legacy JavaScript, HTML, and CSS. We worked to rebuild the tables in modern React, adding modals on top of the tables to surface the most important data filters to the user first, so that, after selecting, they’d see the most relevant results. We added advanced interactive features such as calculators to the top of tables so users can input data relevant to their personal circumstances and see automatically calculated savings for each product.

Finder now has a path forward to build stable, highly interactive comparison tools for users in a highly iterative way.

Migrating from Workbench Moderation to Content Moderation.


When the Commonwealth of Massachusetts first built their Drupal 8 site, they used Workbench Moderation, which allowed their authors to write content and put it into moderation states such as “draft” or “needs review”. With over 600 content authors on the platform, this is a vital piece in ensuring content meets their requirements. At the time of the build, Content Moderation had not yet reached a stable release in Drupal core, so the contributed Workbench Moderation module was the right choice. Drupal 8 later introduced Content Moderation in core, and as part of our ongoing engagement with Mass.gov, we were asked to help with the heavy lifting associated with the upgrade, in an effort to keep the site supported and up-to-date. The Commonwealth found this to be a greater challenge than expected and relied on LCM to facilitate the right migration path.

Our first step was to investigate what we were dealing with. At the time, mass.gov had around 600,000 revisions with a moderation state that needed to be migrated from Workbench Moderation to Content Moderation. We started by digging into Content Moderation’s code to fully understand the complexities and layers of the switch. We found the systems to be very different, and with no pre-designed migration path, we determined that we needed to create a custom migration path from scratch, outside of a standard Drupal 8 migration. Once the initial configuration of this path was set up, we kept building the migration through trial and error, figuring out the fastest plan of action.

We knew that switching from Workbench Moderation to the new core module, Content Moderation, would benefit the mass.gov site and its authors. For a government site, security is always a concern and therefore a top priority. A core module is actively supported for security and receives updates along with core, as opposed to a contributed module.

Once we felt confident in the coding portion of the switch, we made sure we were aligned on expected workflows, transitions, non-transitions, and revision states. We started with around 20 transitions in Workbench Moderation and were able to consolidate them to 5 transitions in Content Moderation for a more streamlined workflow.
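
The consolidation can be pictured as a many-to-one mapping applied to every revision during the migration. The state names below are illustrative, not mass.gov's actual workflow:

```python
# Sketch of the state-mapping step: each legacy Workbench Moderation
# state maps onto one consolidated Content Moderation state. The state
# names here are illustrative assumptions.
LEGACY_TO_NEW = {
    "draft": "draft",
    "needs_review": "needs_review",
    "reviewed": "needs_review",
    "approved": "published",
    "published": "published",
    "archived": "unpublished",
}

def migrate_revision_state(legacy_state: str) -> str:
    try:
        return LEGACY_TO_NEW[legacy_state]
    except KeyError:
        # Fail loudly: an unmapped state means the mapping is incomplete.
        raise ValueError(f"no mapping for legacy state: {legacy_state}")
```

Failing loudly on an unknown state is deliberate: with 600,000 revisions, a silently dropped state would be far worse than an aborted run.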

We also worked on rebuilding some views, such as the “all content” view and mass.gov’s dashboard, called “MyContent Block,” which contains all the content the logged-in author is watching.

After a successful switch, mass.gov users are leveraging Content Moderation to moderate content. We also implemented a patch to Content Moderation’s “filter by moderation state” view. The underlying tables were missing indices, causing MySQL to incorrectly do a full table scan instead of an indexed scan, which caused significant performance issues. Our patch added the missing indices, bringing the query time on the “MyContent Block” view down from 15,000 milliseconds to 200 milliseconds. We plan to make this patch available to other sites that have as much content as Mass.gov and experience the same issue.
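
The effect of a missing index can be demonstrated with SQLite (standing in for MySQL here; the table and column names are illustrative): without an index on the filtered column, the planner falls back to a full table scan, and adding one turns the query into an indexed search.

```python
import sqlite3

# Demonstration of the missing-index problem using SQLite's query
# planner. Table and column names are illustrative, not Drupal's schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE content_moderation (entity_id INTEGER, state TEXT)")

def plan_for_state_filter(conn):
    """Return the query plan for filtering revisions by moderation state."""
    rows = conn.execute(
        "EXPLAIN QUERY PLAN "
        "SELECT entity_id FROM content_moderation WHERE state = ?",
        ("published",),
    ).fetchall()
    return " ".join(row[-1] for row in rows)

plan_before = plan_for_state_filter(conn)  # full table scan
conn.execute("CREATE INDEX idx_state ON content_moderation (state)")
plan_after = plan_for_state_filter(conn)   # indexed search
```

On tables with hundreds of thousands of rows, the difference between these two plans is exactly the 15,000ms-to-200ms gap described above.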

Overall, the goal was to make the switch as seamless as possible, with little to no change to the content authoring experience. Content authors did see some small UI changes in the module itself, such as dropdowns for moderation states in the submission process. During deployment, authors trying to make changes would have experienced downtime, so we strategically initiated the migration over Memorial Day weekend, and we experienced no content loss.

When we started the migration path, the expected migration time was 15–20 hours. By the time we executed the deployment, our optimization efforts had brought it down to 4 hours. The migration was incredibly successful, and we experienced no issues!