
Building a brand new Drupal 7 site


Société du Figaro is one of France’s largest publishing groups and home to the internationally famous French-language newspaper, Le Figaro. Drupal was already popular within the organisation - their single sign-on, user personalisation and commenting systems are all Drupal-powered - so Société du Figaro took the strategic decision to move their ‘bourse’ website, which displays financial news and stock market data, onto Drupal as well.

The project was undertaken by a partnership of companies: Société du Figaro provided some of the developers and the main project manager, while two external companies, Open Web Solutions (Paris) and Code Enigma (London), provided consultancy, technical architecture and development resource.

Visit bourse.lefigaro.fr →
Screenshot of the Le Figaro Bourse front page.

Why Drupal was chosen

There were no doubt broader strategic and financial reasons for Le Figaro undertaking various projects in Drupal over the past few years, but for this case study we will focus on the specific reasons Drupal was selected for the Bourse project:

  • Existing Drupal experience within Société du Figaro
  • Good links to trustworthy external specialists to help delivery
  • Good synergy with other projects already in Drupal, such as the commenting system
  • Positive results on previous projects
  • Ability to develop features more quickly than with the previously used proprietary CMS

Le Figaro’s Bourse website has the primary goal of displaying financial news and data, combined with other financial services, so website visitors have a fast, reliable, single source for all their stock market and financial information.

From a CMS technology perspective, there were only two options on the table for Société du Figaro - either continue with the current proprietary CMS or move to Drupal, a CMS they already had internal experience with. After much deliberation, they decided to make the leap to Drupal, largely because the pace of development of new features had been historically slow with the previous CMS.

Initial timelines were aggressive and the project was run using a hybrid of the Scrum framework for agile software development, with the Scrum Master provided by Code Enigma and a dedicated Product Owner provided by Société du Figaro. Once the stories required for the minimum viable product (MVP) were defined (the features the website simply must have for launch), initial delivery was very quick, happening in a matter of weeks and a handful of sprints.

The primary requirement for launch, and the focus of the MVP, was to use Drupal to create a “mash-up” of content, coming from a variety of sources for data and editorial, and present that content in a navigable and consumable way. That might sound straightforward, but with a number of data sources in double digits, plus intense caching requirements to protect the site against outages in third-party services, this was no simple task. Alongside the technical complexities, Le Figaro’s web designers had to figure out how to present a potentially overwhelming amount of financial data on the page without it crowding the news content and making the site feel too busy.

Reliability

Ensuring crucial data remains available, even during short outages in the services providing it, was critical to Le Figaro and something they had struggled with in the past. Being able to trust financial data is just as important, so stale data is equally unacceptable. Rather than ‘harvesting’ content and storing it in Drupal for display, the development team therefore opted to use Drupal’s caching API. They devised a set of custom modules to cache important data from key services, so that it remains available in the event of a single failed service call or a short outage of a few minutes (both common occurrences), but expires once it becomes unreliably old. The caching of the different types of data can be controlled in a very granular way: data unlikely to change frequently (such as company names) is cached for weeks, while volatile data (such as share prices) expires in minutes.
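The custom modules themselves are not published with this case study, but the pattern can be sketched with Drupal 7’s standard cache API. In the hypothetical helper below every name (function, data service, cache IDs) is illustrative: data is refreshed once it passes a “soft” lifetime, yet a stale copy is still served if the remote service fails, right up to a “hard” lifetime after which it is considered unreliably old.

  <?php

  /**
   * Hypothetical helper illustrating granular cache lifetimes with the
   * Drupal 7 cache API. Function, service and cache ID names are
   * illustrative only; the real custom modules are not published here.
   *
   * Data is refreshed once it is older than $soft_ttl seconds, but a stale
   * copy is still served if the remote service fails, up to $hard_ttl
   * seconds, after which it is considered unreliably old.
   */
  function bourse_data_get($type, $key, $soft_ttl, $hard_ttl) {
    $cid = "bourse:$type:$key";
    $cached = cache_get($cid);
    $now = REQUEST_TIME;

    // Fresh enough: serve straight from cache, no remote call at all.
    if ($cached && $cached->data['fetched'] > $now - $soft_ttl) {
      return $cached->data['value'];
    }

    // Stale or missing: try the (hypothetical) remote data service.
    $value = bourse_data_remote_fetch($type, $key);
    if ($value !== FALSE) {
      cache_set($cid, array('value' => $value, 'fetched' => $now), 'cache', $now + $hard_ttl);
      return $value;
    }

    // The service call failed: fall back to the stale copy, provided it
    // has not become unreliably old.
    if ($cached && $cached->data['fetched'] > $now - $hard_ttl) {
      return $cached->data['value'];
    }
    return FALSE;
  }

  // Example usage: volatile share prices versus slow-moving company names.
  // $price = bourse_data_get('price', 'XYZ.PA', 180, 3600);
  // $name  = bourse_data_get('company', 'XYZ.PA', 14 * 86400, 30 * 86400);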

Breaking news

Another challenge was the “time to live” (TTL) of breaking news, which previously took several minutes to appear because content was imported on a schedule. The development team devised and wrote a lightweight Bash-based Linux daemon that monitors several editorial ‘drop’ directories independently, allowing it to react as soon as a new piece of content arrives; there is also no longer any need to start heavy processes within Drupal on a schedule when there may not even be any content ready to import. There are as many instances of the daemon as there are custom Drush commands (and sources), so one process never blocks another and numerous new content events can be handled simultaneously. When a file lands, the daemon executes the custom Drush command for the affected ‘drop’ directory and source (Drush is Drupal’s command-line shell), importing the new content into the website almost instantly.
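The daemon and Drush code are likewise not published, but the Drush side of the design might look something like the sketch below, assuming a hypothetical bourse_import module with one command per source; the command name, content type and field structure are all assumptions.

  <?php

  /**
   * Implements hook_drush_command() in a hypothetical bourse_import.drush.inc.
   *
   * One command exists per editorial source, so an import for one source
   * never blocks another.
   */
  function bourse_import_drush_command() {
    $items['bourse-import-news'] = array(
      'description' => 'Import a single dropped news file into the site.',
      'arguments' => array(
        'file' => 'Absolute path to the file found in the drop directory.',
      ),
      'callback' => 'bourse_import_news_file',
      'bootstrap' => DRUSH_BOOTSTRAP_DRUPAL_FULL,
    );
    return $items;
  }

  /**
   * Command callback: parse a dropped XML file and save it as a node.
   * The content type and field structure below are assumptions.
   */
  function bourse_import_news_file($file) {
    $xml = simplexml_load_file($file);
    if ($xml === FALSE) {
      return drush_set_error('BOURSE_IMPORT', dt('Could not parse @file.', array('@file' => $file)));
    }

    $node = new stdClass();
    $node->type = 'news';
    $node->language = LANGUAGE_NONE;
    $node->uid = 1;
    $node->status = NODE_PUBLISHED;
    $node->title = (string) $xml->title;
    $node->body[LANGUAGE_NONE][0]['value'] = (string) $xml->body;
    node_save($node);

    drush_log(dt('Imported @title.', array('@title' => $node->title)), 'ok');
  }

Each daemon instance then only has to watch its own drop directory and run, for instance, drush bourse-import-news /path/to/new-file.xml the moment something arrives.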

Integration

There are no fewer than fourteen individual data sources being pulled into Drupal to drive the Bourse website, all in different formats, from custom PHP APIs to different flavours of XML, with transport mechanisms ranging from FTP drops to SOAP web services. Drupal excels at dealing with these different types of data, taming them and importing them into its own database. The development team spent a significant amount of time writing the ‘glue’ to handle these integrations and to ensure the data was suitably sanitised and ready to use when Drupal needed to display it.
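As an illustration of that ‘glue’, the hypothetical function below normalises two imaginary sources - one SOAP web service and one XML file arriving by FTP drop - into the same sanitised PHP structure before anything is stored or displayed. Source names, endpoints and allowed tags are assumptions, not the real integrations.

  <?php

  /**
   * Hypothetical normalisation "glue": whatever the transport and format,
   * every source is reduced to the same sanitised PHP array before Drupal
   * stores or displays it. Source names, endpoints and allowed tags are
   * illustrative only.
   */
  function bourse_feeds_fetch($source) {
    switch ($source) {
      case 'indices_soap':
        // A SOAP web service returning market index levels.
        $client = new SoapClient('https://example.com/indices?wsdl');
        $raw = $client->__soapCall('getIndices', array());
        $items = json_decode(json_encode($raw), TRUE);
        break;

      case 'newswire_xml':
        // An XML file delivered via an FTP drop.
        $xml = simplexml_load_file('/var/drop/newswire/latest.xml');
        $items = array();
        foreach ($xml->item as $item) {
          $items[] = array(
            'title' => (string) $item->title,
            'body' => (string) $item->body,
          );
        }
        break;

      default:
        return array();
    }

    // Sanitise every string before it can reach the render layer.
    array_walk_recursive($items, function (&$value) {
      if (is_string($value)) {
        $value = filter_xss($value, array('p', 'a', 'em', 'strong'));
      }
    });
    return $items;
  }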

Complex layouts

The sheer amount of data on some pages demanded fairly complex three- and four-column page layouts, which would have been unwieldy using the traditional Drupal approach of blocks and regions. Using Panels and a custom theme, the team created the discrete content areas and spaces necessary to present all the data in a flexible, manageable and - crucially - editable way, so Le Figaro webmasters can control the content properly, with a view of its context that the core Drupal block admin page does not afford.

The site also makes extensive use of jQuery, including the jQuery UI tabs library that ships with Drupal core, which is rarely used for anything outside the core administrative interface. By using these bundled components, developers can cut down on dependencies on third-party JavaScript libraries and integrations. However, to meet real-time table filtering requirements the team used the DataTables module for Drupal, which integrates an additional jQuery library for sorting and filtering tabular data with JavaScript. This proved very handy for letting users control how their data is displayed.
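As a sketch of how the core-bundled component can be wired in, the render array below attaches jQuery UI tabs plus a small custom behaviour; the theme hook, module name and file path are assumptions rather than the site’s actual code.

  <?php

  // Wherever the tabbed market-data markup is built, the core-bundled
  // jQuery UI tabs component can be attached to the render array directly.
  // The theme hook, module name and file path below are assumptions.
  $build['market_tabs'] = array(
    '#theme' => 'bourse_market_tabs',
    '#attached' => array(
      'library' => array(
        array('system', 'ui.tabs'),
      ),
      'js' => array(
        drupal_get_path('module', 'bourse_ui') . '/js/market-tabs.js',
      ),
    ),
  );

  // js/market-tabs.js would then hold a standard Drupal behaviour:
  // Drupal.behaviors.bourseMarketTabs = {
  //   attach: function (context) {
  //     jQuery('.market-tabs', context).once('market-tabs').tabs();
  //   }
  // };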

Performance

With a mix of up-to-the-minute stock data and relatively static editorial and informational content, it is important that the site can operate without Drupal core page caching. Couple this with a high number of authenticated users and you have quite a performance challenge. There are numerous mechanisms at play to achieve the necessary performance levels within these constraints:

The Memcache module for Drupal moves Drupal’s cache tables out of MySQL, Drupal’s default store for cached data, and into Memcached, a distributed in-memory caching application for Linux. We do this because Memcached serves cached data significantly faster than MySQL, resulting in a marked performance gain.
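A typical settings.php configuration for the Memcache module looks something like the snippet below; the module path, cache bins and server list are illustrative rather than Le Figaro’s actual configuration.

  <?php
  // settings.php: route Drupal's cache bins through the Memcache module.
  $conf['cache_backends'][] = 'sites/all/modules/contrib/memcache/memcache.inc';
  $conf['cache_default_class'] = 'MemCacheDrupal';
  // The form cache must stay in the database so in-progress form state
  // survives a memcached restart or eviction.
  $conf['cache_class_cache_form'] = 'DrupalDatabaseCache';
  // One or more memcached instances to spread the load across.
  $conf['memcache_servers'] = array(
    '127.0.0.1:11211' => 'default',
  );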

The pages themselves mostly use the Panels module, which has an inbuilt Simple Cache mechanism for time-based caching of individual Panels panes. In this way we can get Panels to cache data we know we don’t want to frequently refresh, saving a lot of database calls by fetching cached panes from Memcached instead of MySQL in a fraction of the time.

Varnish is also heavily employed, with custom caching rules to cache pages for both anonymous and authenticated users. Caches are aggressive but expire regularly (every three minutes) to allow critical market data to remain fresh. Le Figaro is also sponsoring the continuation of the Edge-Side Includes (ESI) project for Drupal, so in the future discrete pieces of content can be pushed through to the end-users without the need to expire the entire page cache in Drupal or Varnish, though this work is not yet in production at the time of writing.
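The real rules live in custom Varnish VCL, which is not reproduced here, but purely as an illustration the three-minute lifetime could be hinted to Varnish from the Drupal side with a header along these lines (the module name and path check are hypothetical):

  <?php

  /**
   * Implements hook_init() in a hypothetical bourse_cache module.
   *
   * Illustration only: one way of telling a reverse proxy such as Varnish
   * that a page may be kept for three minutes.
   */
  function bourse_cache_init() {
    // Keep personal pages out of the shared cache.
    if (arg(0) != 'user') {
      drupal_add_http_header('Cache-Control', 'public, max-age=180');
    }
  }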

Summary

Just a few short months after embarking on the first iteration of development, the Drupal version of Le Figaro Bourse was live. The pace of development astounded senior managers within the group. The project continues to this day: Drupal 7 is performing strongly under demanding conditions, and new features are added to Bourse on a regular basis, often weekly, by a cross-company team of experts. Ongoing sprints have built on the original MVP, integrating more data sources, providing commenting on content, integrating single sign-on with other Le Figaro properties, and much more besides. Development continues and an exciting roadmap lies ahead.
