Optimize Drupal under slow network speeds


Site loading time is critical, especially when your website is accessed over a network with slow or fluctuating speeds. If a request times out or a page takes too long to load, the user may lose interest and switch to another site. After one bad experience on your website, it is highly unlikely that the user will visit again, and slow-loading sites invariably result in lost business traffic. This is why giving your website visitors a pleasant experience is critical.


According to surveys done by Akamai and other content hosts, most web users expect a site to load in 2 seconds or less, and they tend to abandon a site that hasn’t loaded within 3 seconds. 79% of web shoppers who have dealt with a poorly performing website say that they would not return to the site, and of those, 44% would tell a friend that they had a poor experience shopping there.


Page loading time is determined by the size of the page and the conditions under which it is delivered, which depend on:

  1. The size of the text content.
  2. The number and size of the external files the page references (JavaScript, style sheets, images and multimedia).
  3. The user’s internet connection (bandwidth and latency).


Since you don’t have any control over the user’s network conditions, all optimizations have to be performed at the server and the Drupal application level. There are no magical, out-of-the-box methodologies or technologies that can be applied to improve Drupal site performance when network speeds are erratic. You have to carefully examine user expectations and optimize the performance of your website to meet those expectations.

What are some of the key website experience expectations of site users?

  1. The website should perform well even at the slowest network speed.
  2. Web pages should load in 3 seconds or less.
  3. If the user is working on a form and loses network connectivity, the information already filled out should not be lost. Rather, the user should be able to continue from where he/she left off upon returning to the site.
  4. If for some valid reason a page has to take longer than 3 seconds to load, the user should be notified and provided with an explanation.

How can user expectations be met?

By following some web design best practices and by using recommended Drupal performance optimization techniques, you can provide the user with a first-rate website experience. Drupal’s built-in caching is the easiest way to improve performance on your site but is not sufficient by itself. For large amounts of content under heavy traffic, the Drupal application doesn’t scale well. To optimize performance under these conditions, you need to apply multiple caching strategies as well as server level optimization.


The following is a list of optimizations at the Drupal application and the server levels:


Application Level Optimization

  • Save Form State / Save Form Data as Draft
    • Auto-save data entered in a Drupal form without actually submitting the form. The saved data is used to restore the form if internet connectivity is lost while the user is filling it out.
    • You can also save form data as a ‘draft’, which can be used to restore the form at a later point.
    • Utilize contributed modules such as “Save Form State” and “AutoSave” to save the form data.
  • Cut down on multiple HTTP requests
    • Loading a single web page involves sending multiple HTTP requests to the server for the different elements of the page (JavaScript, style sheets, images, etc.). Loading fewer resources cuts down on parallel HTTP requests and thereby improves site loading time.
  • Smaller Page Sizes
    • Keep page size to a minimum to avoid slow-loading web pages.
    • Avoid heavy, resource-consuming themes; keep the theme as light as possible, since theme overhead adds to page load time.
  • Show notifications for slow loading pages
    • If a page load time is unacceptable, notify the user about the cause of the delay.
    • Provide an alternative link such as a lighter version of the page.
  • Show File sizes for links/downloadable items
    • Show the file size for every downloadable item, so that the user is aware of the bandwidth a download will consume.
  • Defer the parsing of JavaScript
    • Load JavaScript files in the footer scope, so that the HTML elements load first on a slow connection (see the footer-scope sketch after this list).
  • Pages should not Auto refresh
    • Avoid auto-refreshing pages in order to save bandwidth.
  • Page should have limited content
    • Use pagination to avoid displaying a long listing of content.
    • If you want to load a long listing of content on a single page, use ‘lazy’ loading to cut down the page load time.
  • Aggregate and compress CSS files
    • Use the compression utilities available in Drupal 7 core.
  • Aggregate JS files / use minified JS files
    • Use the JS aggregation functionality available in Drupal 7 core (a settings.php sketch after this list shows how these options can be enabled).
  • Turn Views Caching on
    • Views caching can be enabled from the Views ‘settings’ page.
  • Enable Block level caching
    • This feature is also provided with Drupal 7 core.
  • Disable modules that you are not using
    • Uninstall modules such as ‘Color’ that are resource intensive and not commonly used.
  • Move Images, Videos, Static files to CDN (Content Delivery Network)
    • You can use the contributed modules that are available for CDN integration.
  • Implement caching in your custom modules as needed
    • Try to cache the output of custom modules, especially when it involves complex business computations (see the cache_get()/cache_set() sketch after this list).
  • Keep the Drupal core and contributed modules updated
    • Keeping the core and contributed modules updated lets you benefit from the latest performance improvements implemented by the Drupal community.
  • Enable caching for authenticated users
    • Enable caching of data for authenticated users using the ‘memcache’ module (a settings.php sketch for this follows the list).
  • Cache panel content
    • Cache panel content with ‘panels content cache’ or ‘panels hash cache’
  • Load Images only when needed with lazy loading
  • Use performance-related Drupal modules such as:
    • Performance and scalability checklist.
    • Performance logging and monitoring.
    • XHProf PHP profiler.
  • Rewrite slow-performing Views queries.
  • Use image sprites to reduce the number of image requests.
  • Reduce 404 & 403 errors.
  • Reduce Image Size
    • Use image styles to reduce image sizes on page load (an image_style_url() sketch follows this list).
    • Use style sheets to replace images that serve no purpose other than layout.
    • In general, due to the compression techniques used, the JPEG format is better suited to photographs, while the GIF and PNG formats are better for bitmap graphics such as logos and images that contain areas of discrete colours.
  • Turn page caching (Drupal default) on
    • It is available by default with Drupal core.
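
Several of the items above (page caching, block caching, CSS/JS aggregation and page compression) map directly onto Drupal 7’s standard performance settings. The following is a minimal sketch of a sites/default/settings.php excerpt, assuming a stock Drupal 7 install; the lifetimes shown are placeholder values.

    <?php
    // Hypothetical excerpt from sites/default/settings.php (Drupal 7).
    // These $conf overrides mirror the options on the
    // admin/config/development/performance page.
    $conf['cache'] = 1;                     // Cache pages for anonymous users.
    $conf['block_cache'] = 1;               // Cache blocks.
    $conf['page_compression'] = 1;          // Gzip-compress cached pages.
    $conf['preprocess_css'] = 1;            // Aggregate and compress CSS files.
    $conf['preprocess_js'] = 1;             // Aggregate JavaScript files.
    $conf['cache_lifetime'] = 0;            // Minimum cache lifetime (seconds).
    $conf['page_cache_maximum_age'] = 300;  // Max age for browser/proxy caches (seconds).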
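
As a sketch of deferring JavaScript (the footer-scope item above), a module or theme can pass a scope option to drupal_add_js() in Drupal 7. The module name and file path below are hypothetical.

    <?php
    // Hypothetical Drupal 7 snippet: load a script in the footer scope so the
    // HTML elements render before the script is downloaded and parsed.
    drupal_add_js(drupal_get_path('module', 'mymodule') . '/js/mymodule.js', array(
      'scope' => 'footer',   // Emit the <script> tag just before </body>.
      'every_page' => TRUE,  // Allow aggregation with other footer scripts.
    ));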
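
For caching the output of a custom module, Drupal 7’s cache_get()/cache_set() API can wrap an expensive computation, as in the minimal sketch below; mymodule_build_report() is a hypothetical helper standing in for the complex business logic.

    <?php
    // Hypothetical Drupal 7 example: cache the result of an expensive
    // computation so that repeat requests skip the work.
    function mymodule_expensive_report() {
      $cid = 'mymodule:expensive_report';

      // Serve the cached copy if one exists and has not expired.
      if ($cache = cache_get($cid)) {
        return $cache->data;
      }

      // Otherwise build the output (the complex business computation)...
      $output = mymodule_build_report();

      // ...and keep it in the default cache bin for one hour.
      cache_set($cid, $output, 'cache', REQUEST_TIME + 3600);

      return $output;
    }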
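
For caching data for authenticated users with the contributed memcache module, the wiring also lives in settings.php. This sketch assumes the module is installed under sites/all/modules/memcache and that memcached is listening on its default port.

    <?php
    // Hypothetical settings.php excerpt for the contributed memcache module.
    $conf['cache_backends'][] = 'sites/all/modules/memcache/memcache.inc';
    $conf['cache_default_class'] = 'MemCacheDrupal';
    // Keep the form cache in the database; it must be persistent.
    $conf['cache_class_cache_form'] = 'DrupalDatabaseCache';
    $conf['memcache_servers'] = array('127.0.0.1:11211' => 'default');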
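
For the image-size item, a preset image style can be applied in code as well as in field display settings. The snippet below is only a sketch; the file URI is hypothetical and ‘medium’ is one of Drupal 7 core’s default image styles.

    <?php
    // Hypothetical Drupal 7 snippet: render a derivative sized by an image
    // style instead of the full-resolution original.
    $uri = 'public://photos/example.jpg';
    $resized_url = image_style_url('medium', $uri);
    print theme('image', array('path' => $resized_url, 'alt' => t('Example photo')));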



Server Level Optimization

  • Enable Browser Caching
    • Enable caching by keeping a copy of data that has already been received, to avoid the need to request it again.
    • Use static content where possible.
  • Turn On page Compression
    • HTTP compression is a technique supported by most web browsers and web servers. When enabled on both sides, it can automatically reduce the size of text downloads (including HTML, CSS and JavaScript) by 50-90%. It is enabled by default in all modern browsers, but many web servers do not enable it by default, so it may need to be switched on explicitly.
    • Enable gzip compression; it can make the HTML elements load around 50% faster.
  • Enable APC (Alternative PHP Cache)
    • Also consider PHP-FPM (the FastCGI implementation) instead of mod_php.
  • Use syslog
    • Send logs to the operating system’s syslog instead of writing them to the database.
  • Enable a reverse proxy such as Varnish to cache your static content; Redis can serve as an alternative cache backend
    • Contributed modules are available that facilitate the integration of Drupal with Varnish and Redis (a settings.php sketch for Varnish follows this list).
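
As a sketch of the reverse-proxy settings, Drupal 7 ships settings.php variables for running behind Varnish; the address and max-age values below are placeholders.

    <?php
    // Hypothetical settings.php excerpt for Drupal 7 behind a Varnish reverse proxy.
    $conf['reverse_proxy'] = TRUE;
    $conf['reverse_proxy_addresses'] = array('127.0.0.1');  // Varnish host(s).

    // Let the proxy cache anonymous pages: send Cache-Control headers and skip
    // Drupal's own page-cache hooks for externally cached responses.
    $conf['cache'] = 1;
    $conf['page_cache_invoke_hooks'] = FALSE;
    $conf['page_cache_maximum_age'] = 3600;  // Max-age advertised to the proxy.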



You need optimization at both the Drupal application level as well as the server level to tune the performance of your website. If done right, this will provide the user with smoother functionality even over an unreliable, low bandwidth network.

Ashley Madison: When the Cheaters got Hacked

Independent of the obvious moral and ethical challenges that the recent hack of the Ashley Madison online cheating and adultery website raises, it is clear we have entered a new era of malware, viruses, worms, ransomware, trojans, phishing attacks and botnets.

Cryptolocker, a trojan virus by design, ushered in this new era of cyber bribery, extortion, corruption and ransomware. In the first 9 months of its release into “the wild”, Cryptolocker affected over 400,000 individuals, who were told to pay $300 within a 3-day period after the malware encrypted most of the data on their systems. If the ransom was not paid, the infected user’s files would remain encrypted and inaccessible forever.

Ashley Madison, which boasts “Life is short, Have an Affair”, was compromised on July 11 by a group called the Impact Team. This event resulted in a data breach of up to 10 GB of data and the compromise of approximately 30 million user accounts. The data elements compromised in this breach included first and last names, street addresses, phone numbers, account names, hashed passwords, e-mail addresses, credit card information and, in some cases, GPS coordinates, along with Windows domain accounts and other data related to Ashley Madison’s internal network, suggesting a much broader compromise of their infrastructure. Although Ashley Madison is not disclosing technical details about this breach, we can assume with a fairly high degree of certainty that multiple control failures may have occurred across their web server, perimeter network, firewall(s), operating system(s), backend database and identity infrastructure.

As with so many organizations, it is clear that the need to embrace and embed best practices into our networks and operating procedures is more essential than ever. Constant vigilance and adherence to industry standards such as NIST 800-122 (Protecting the Confidentiality of Personally Identifiable Information (PII)), NIST 800-144 (Security and Privacy in Public Cloud Computing), the ISO 2700X series and the 12 primary control objectives of PCI DSS 3.1 are the minimum that must be embraced today.

Unfortunately, as in the case of Cryptolocker, various cyber exploitation and ransom schemes are now surfacing, including cyber extortion and ransoms requiring bitcoin payments, among many others.

Although Ashley Madison may want to reconsider its business model and undertake a total revamp of its security infrastructure, what I might suggest for its end users is that they consider taking their own partners out for an intimate dinner and a nice movie rather than someone else’s. It might lead to far fewer complications in their lives.

Security Breach Headliners: A Closer Look at the OPM Breach


The first half of 2015 has been a season of information security breaches…and the biggest of all was a massive data breach at the U.S. Office of Personnel Management (OPM). OPM was impacted by two separate but related cybersecurity incidents involving data of Federal government employees, contractors, and others. In April 2015, OPM found that the personnel data of 4.2 million current and former Federal government employees had been stolen. While investigating this incident, in early June 2015, OPM identified that some additional information had also been compromised, including background investigation records of current, former, and prospective Federal employees and contractors. OPM and an interagency team from the Department of Homeland Security (DHS) and the Federal Bureau of Investigation (FBI) have been investigating these incidents, and are working to put in place changes that should prevent similar thefts in the future.

Fatigued by the enormous effort required to ensure a quick, effective and orderly response to information security incidents, organizations may sometimes lose sight of a broader, more holistic approach to information security management. Often this narrow approach leads to a mad rush to aggressively add numerous tools and technologies, which may not be an optimal, comprehensive and risk-based approach toward preserving the confidentiality, integrity and availability of organizational information.

So what should organizations do?

Without doubt, when a breach occurs, the organization should immediately respond to take charge, try to correct what’s gone wrong, deal with the outcomes of what has already happened, conduct root cause analysis and implement additional or modified controls (preventive, detective, corrective, compensatory, deterrent), so that a similar breach does not recur or occur elsewhere in the organization.

However, in the interest of a more robust and comprehensive information security approach, organizations should also consider following a Risk & Compliance Lifecycle:

Harmonize – Map requirements and identify a mutually exclusive and collectively exhaustive list of controls, specified in:

  • Risk registers, vulnerabilities log, incidents log and audit reports
  • Industry standards (such as ISO, NIST, etc.)
  • Legal, statutory, regulatory and contractual obligations
  • Organization’s corporate & business unit policies, procedures, controls list

Assess – Conduct risk and compliance assessments based on harmonized requirements, using a combination of questionnaires and data analytics (correlation and prediction). Additionally, use combinations of Brainstorming, HAZOP, Structured “What-if” Technique (SWIFT), Scenario analysis, Business impact analysis (BIA), Root cause analysis (RCA), Failure modes and effects analysis (FMEA), Fault tree analysis (FTA), etc. to identify additional risks, keeping both a top-down and a bottom-up view.

Risk & Compliance Lifecycle

Strengthen – Update and implement management systems, policies, procedures, guidelines, design documents, etc. to reflect the additional or modified controls identified as part of the above assessment.

Getting to the CORE of Fast App Development

TruOps Business Integration Platform Defined – 40 X’s Faster App Development


If you are looking to build apps that give your organization a differentiated customer experience, then look no further.   In deploying technology to support and automate internal processes and to replicate functionalities online, the objective is typically to reduce costs and make the customer experience easier, more convenient and more engaging.

But if you don’t have a business integration platform, that objective is easier said than done. SDG’s Big Data Framework – TruOps CORE – enables an enterprise to exceed that stated objective and to grow quickly because SDG provides an instant technology stack.  The result is a significant reduction in development time and time-to-market.

Today, time-to-market is the be-all and end-all. Beating your competitor to market can provide your organization with unprecedented growth and revenue.  With TruOps Core, we’re finding that we can accelerate development by 40% and achieve comparable cost savings over developing an enterprise application from scratch.

TruOps CORE is not just for application development.  Many organizations have multiple business systems that need to be integrated into a single business process.  To manually or separately collate and correlate data from individual systems is a huge and ongoing challenge.

Without a business integration framework, most organizations’ unification efforts die a slow death. Executives sometimes focus only on their own process, while other executives work at cross-purposes.

With TruOps CORE it is easy to unify processes and easier yet to provide a unified view with engaging graphics to all executives.  TruOps configurable dashboards and visualizations enable the enterprise to see and react as one, and to react in real time through a proprietary notification engine.

TruOps Core works for enterprises across the globe and across industry sectors:

  • A multi-national Fortune 10 financial services company used the TruOps Core platform to collect business metrics from functions and business units around the world to populate a globally available dashboard and reporting interface.
  • A media analytics firm uses an application built on TruOps Core to analyze TV long form and short form advertising data to make media buy/sell recommendations to their clients.
  • A garment manufacturer and retailer used TruOps Core to build an availability and performance dashboard for their business systems.

BPM, The Future And You

by Anurag Gupta, Collaboration Practice Leader


Effective BPM aligns IT and business in a highly productive manner. This, in turn, pays off in faster development and better implementation of solutions. Ultimately this makes for a more responsive organization.

Automating Is Sometimes Faster With A BPM Tool

In some cases, BPM can accelerate the automation process, although this is not its purpose. Because processes evolve continually, it is important to have the ability to make rapid changes. Those seeking rapid process automation might be better served with a focused software solution. BPM is for those who anticipate rapid, frequent, ongoing changes.

The Truth About BPMN Standards

With so many standards, acronyms and buzzwords, IT can seem like a jungle. But make sure you always make decisions based on the right criteria. Capability, flexibility and simplicity are most important, not the version of BPMN supported. Even the well-known J2EE will not run on all web servers.

Do you need a perfectly defined process yet?

Experts frequently stress process definition and standardization. But remember, well-implemented BPM can bring speed and flexibility to any process.  Often it is better to define a subset of the process and implement it first. Processes can always be improved later via BPM.

BPM Does Not Make IT Redundant

BPM can make it easy for business users to modify and even implement processes, sometimes without IT.  However, managing, monitoring and optimizing the BPM environment still requires IT. For example, IT can help choose the most efficient of the various ways to implement a solution. IT delivers the know-how and strategies needed to utilize BPM more effectively.

BPM Is Here To Stay

The key word in BPM is “Management.” Of course, management involves elements of planning, organizing, execution, monitoring and control. While a new name or label may emerge, management will remain the key to the process. Dynamic organizations will see rapid evolution and ever-faster changes in their business processes. BPM, by any name, ensures that they can answer the challenge of change while maximizing future opportunity.