Ashley Madison: When the Cheaters Got Hacked

Setting aside the obvious moral and ethical questions raised by the recent hack of the Ashley Madison online cheating and adultery website, it is clear we have entered a new era of malware, viruses, worms, ransomware, trojans, phishing attacks and botnets.

CryptoLocker, a trojan by design, ushered in this new era of cyber bribery, extortion, corruption and ransomware. In the first nine months after its release into "the wild," CryptoLocker infected over 400,000 individuals, who were told to pay $300 within a three-day window after the malware encrypted most of the data on their systems. If the ransom was not paid, the infected user's files remained encrypted and inaccessible forever.

Ashley Madison, which boasts "Life is short. Have an affair," was compromised on July 11 by a group calling itself The Impact Team. The breach exposed up to 10 GB of data and approximately 30 million user accounts. Compromised data elements included first and last names, street addresses, phone numbers, account names, hashed passwords, e-mail addresses, credit card information and, in some cases, GPS coordinates, along with Windows domain accounts and other data related to Ashley Madison's internal network, suggesting a much broader compromise of their infrastructure. Although Ashley Madison is not disclosing technical details about this breach, we can assume with a fairly high degree of certainty that multiple control failures occurred across their web servers, perimeter network, firewalls, operating systems, backend database and identity infrastructure.
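
One small consolation is that the passwords were hashed rather than stored in plaintext. For reference, here is a minimal sketch of salted, adaptive password hashing using Python's bcrypt library; it illustrates the general best practice, not Ashley Madison's actual implementation:

```python
# Minimal sketch: adaptive, salted password hashing with bcrypt.
# Illustrates general best practice only -- not Ashley Madison's
# actual implementation.
import bcrypt

def hash_password(plaintext: str) -> bytes:
    # gensalt() embeds a random salt and a work factor in the result,
    # so identical passwords still produce different hashes.
    return bcrypt.hashpw(plaintext.encode("utf-8"), bcrypt.gensalt(rounds=12))

def verify_password(plaintext: str, stored_hash: bytes) -> bool:
    # checkpw re-derives the hash using the salt embedded in stored_hash.
    return bcrypt.checkpw(plaintext.encode("utf-8"), stored_hash)

stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", stored)
assert not verify_password("wrong guess", stored)
```

The work factor makes each guess expensive, which is what slows down offline cracking of a stolen hash database.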

As with so many organizations, the need to embrace and embed best practices in our networks and operating procedures is more essential than ever. Constant vigilance and adherence to industry standards such as NIST SP 800-122 (Guide to Protecting the Confidentiality of Personally Identifiable Information (PII)), NIST SP 800-144 (Guidelines on Security and Privacy in Public Cloud Computing), the ISO 27000 series and the 12 primary control objectives of PCI DSS 3.1 are the minimum that must be embraced today.

Unfortunately, as CryptoLocker foreshadowed, cyber exploitation and ransom schemes continue to surface, including extortion campaigns and ransoms demanding payment in bitcoin, among many others.

Although Ashley Madison may want to reconsider its business model and undertake a total revamp of its security infrastructure, what I might suggest for its end users is that they take their own partners out for an intimate dinner and a nice movie rather than someone else's. It might lead to far fewer complications in their lives.

Security Breach Headliners: A Closer Look at the OPM Breach


The first half of 2015 has been a season of information security breaches…and the biggest of all was a massive data breach at the U.S. Office of Personnel Management (OPM). OPM was impacted by two separate but related cybersecurity incidents involving data of Federal government employees, contractors, and others. In April 2015, OPM found that the personnel data of 4.2 million current and former Federal government employees had been stolen. While investigating this incident, in early June 2015, OPM identified that some additional information had also been compromised, including background investigation records of current, former, and prospective Federal employees and contractors. OPM and an interagency team from the Department of Homeland Security (DHS) and the Federal Bureau of Investigation (FBI) have been investigating these incidents, and are working to put in place changes that should prevent similar thefts in the future.

Fatigued by the enormous effort required to ensure a quick, effective and orderly response to information security incidents, organizations may sometimes lose sight of a broader, more holistic approach to information security management. This narrow approach can lead to a mad rush of aggressively adding tools and technologies, which may not be an optimal, comprehensive, risk-based way to preserve the confidentiality, integrity and availability of organizational information.

So what should organizations do?

Without doubt, when a breach occurs the organization should immediately respond: take charge, correct what has gone wrong, deal with the outcomes of what has already happened, conduct root cause analysis, and implement additional or modified controls (preventive, detective, corrective, compensatory, deterrent) so that a similar breach does not recur or occur elsewhere in the organization.

However, in the interest of a more robust and comprehensive information security approach, organizations should also consider the following Risk & Compliance Lifecycle:

Harmonize – Map requirements and identify a mutually exclusive and collectively exhaustive list of controls (a brief mapping sketch follows this list), specified in:

  • Risk registers, vulnerability logs, incident logs and audit reports
  • Industry standards (such as ISO, NIST, etc.)
  • Legal, statutory, regulatory and contractual obligations
  • Organization’s corporate & business unit policies, procedures, controls list
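
To make the harmonization step concrete, the following sketch maps requirements from several frameworks onto one deduplicated internal control list. The control IDs and mappings are illustrative only, not an authoritative crosswalk:

```python
# Minimal sketch: harmonizing overlapping requirements from multiple
# frameworks into one deduplicated internal control list. The control
# IDs and mappings below are illustrative, not an official crosswalk.
from collections import defaultdict

# Each external requirement maps to a single internal control.
requirement_to_control = {
    ("ISO 27001", "A.9.2.1"):  "CTRL-IAM-01",  # user registration
    ("NIST 800-53", "AC-2"):   "CTRL-IAM-01",  # account management
    ("PCI DSS 3.1", "8.1"):    "CTRL-IAM-01",  # identify users
    ("ISO 27001", "A.12.3.1"): "CTRL-BCK-01",  # information backup
    ("NIST 800-53", "CP-9"):   "CTRL-BCK-01",  # system backup
}

# Invert the mapping: one control satisfies many requirements.
control_to_requirements = defaultdict(list)
for (framework, req_id), control in requirement_to_control.items():
    control_to_requirements[control].append(f"{framework} {req_id}")

for control, reqs in sorted(control_to_requirements.items()):
    print(control, "->", ", ".join(reqs))
```

Testing the shared control once then satisfies every mapped requirement, which is the payoff of a mutually exclusive, collectively exhaustive list.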

Assess – Conduct risk and compliance assessments against the harmonized requirements, using a combination of questionnaires and data analytics (correlation and prediction). Additionally, use combinations of brainstorming, HAZOP, the Structured What-if Technique (SWIFT), scenario analysis, business impact analysis (BIA), root cause analysis (RCA), failure modes and effects analysis (FMEA), fault tree analysis (FTA) and similar techniques to identify additional risks, maintaining both a top-down and a bottom-up view.
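
To make one of these techniques concrete, the sketch below ranks failure modes by the standard FMEA Risk Priority Number (RPN = severity × occurrence × detection, each scored 1 to 10, where a higher detection score means the failure is harder to detect). The failure modes and scores are invented for illustration:

```python
# Minimal sketch: ranking hypothetical failure modes by FMEA Risk
# Priority Number (RPN = severity * occurrence * detection, each 1-10).
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Unpatched web server exposed to the internet", 9, 6, 4),
    ("Stale user accounts never deprovisioned",      7, 8, 5),
    ("Backup restore procedure never tested",        8, 4, 9),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, sev, occ, det in ranked:
    print(f"RPN={sev * occ * det:4d}  {desc}")
```

In practice the scores come from workshops with process owners; the script only makes the ranking repeatable.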

[Figure: Risk & Compliance Lifecycle]

Strengthen – Update and implement management systems, policies, procedures, guidelines, design documents, etc. to reflect the additional or modified controls identified in the above assessment.

Getting to the CORE of Fast App Development

TruOps Business Integration Platform Defined – 40x Faster App Development


If you are looking to build apps that give your organization a differentiated customer experience, look no further. In deploying technology to support and automate internal processes and to replicate functionality online, the objective is typically to reduce costs and make the customer experience easier, more convenient and more engaging.

But if you don’t have a business integration platform, that objective is easier said than done. SDG’s Big Data Framework – TruOps CORE – enables an enterprise to exceed that stated objective and to grow quickly because SDG provides an instant technology stack.  The result is a significant reduction in development time and time-to-market.

Today, time-to-market is the be-all and end-all. Beating your competitor to market can provide your organization with unprecedented growth and revenue. With TruOps CORE, we're finding that we can accelerate development by 40% and achieve a comparable cost savings over developing an enterprise application from scratch.

TruOps CORE is not just for application development. Many organizations have multiple business systems that need to be integrated into a single business process, and manually collating and correlating data from those individual systems is a huge and ongoing challenge.
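
As a generic illustration of that collation problem (not TruOps CORE's actual API), the following sketch correlates records from two hypothetical business systems on a shared key to produce one unified view:

```python
# Minimal sketch: joining records from two separate business systems
# on a shared key. The system names, fields and records are hypothetical.
crm_records = [
    {"customer_id": "C-100", "name": "Acme Corp", "region": "EMEA"},
    {"customer_id": "C-200", "name": "Globex",    "region": "APAC"},
]
billing_records = [
    {"customer_id": "C-100", "outstanding": 12500.00},
    {"customer_id": "C-200", "outstanding": 0.00},
]

# Index one system by key, then enrich the other with it.
billing_by_id = {rec["customer_id"]: rec for rec in billing_records}
unified = [
    {**crm, "outstanding": billing_by_id[crm["customer_id"]]["outstanding"]}
    for crm in crm_records
    if crm["customer_id"] in billing_by_id
]
for row in unified:
    print(row)
```

Multiply this by dozens of systems, add scheduling, error handling and access control, and the value of a purpose-built integration platform becomes clear.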

Without a business integration framework, most organizations' unification efforts die a slow death, with each executive focusing only on his or her own process while others work at cross-purposes.

With TruOps CORE it is easy to unify processes, and easier yet to provide a unified view with engaging graphics to all executives. TruOps' configurable dashboards and visualizations enable the enterprise to see and react as one, in real time, through a proprietary notification engine.

TruOps CORE works for enterprises across the globe and across industry sectors:

  • A multi-national Fortune 10 financial services company used the TruOps CORE platform to collect business metrics from functions and business units around the world to populate a globally available dashboard and reporting interface.
  • A media analytics firm uses an application built on TruOps CORE to analyze long-form and short-form TV advertising data and make media buy/sell recommendations to its clients.
  • A garment manufacturer and retailer used TruOps CORE to build an availability and performance dashboard for its business systems.

BPM, The Future And You

by Anurag Gupta, Collaboration Practice Leader


Effective BPM aligns IT and business in a highly productive manner. This, in turn, pays off in faster development and better implementation of solutions. Ultimately this makes for a more responsive organization.

Automating Is Sometimes Faster With A BPM Tool

In some cases, BPM can accelerate the automation process, although this is not its purpose. Because processes evolve continually, it is important to have the ability to make rapid changes. Those seeking rapid process automation might be better served with a focused software solution. BPM is for those who anticipate rapid, frequent, ongoing changes.

The Truth About BPMN Standards

With so many standards, acronyms and buzzwords, IT can seem like a jungle. But make sure you always base decisions on the right criteria: capability, flexibility and simplicity matter most, not which version of BPMN is supported. Even the well-known J2EE will not run on all web servers.

Do You Need a Perfectly Defined Process Yet?

Experts frequently stress process definition and standardization. But remember, well-implemented BPM can bring speed and flexibility to any process. Often it is better to define a subset of the process and implement it; the process can always be improved later via BPM.

BPM Does Not Make IT Redundant

BPM can make it easy for business users to modify and even implement processes, sometimes without IT. However, managing, monitoring and optimizing the BPM environment still requires IT. For example, IT can help choose the most efficient of several ways to implement a solution, delivering the know-how and strategies to utilize BPM more effectively.

BPM Is Here To Stay

The key word in BPM is "Management." Of course, management involves elements of planning, organizing, execution, monitoring and control. While a new name or label may emerge, management will remain the key to the process. Dynamic organizations will see rapid evolution and ever-faster changes in their business processes. BPM, by any name, assures that they will answer the challenge of change while maximizing future opportunity.


An Early Post-Mortem: The Case for Quality Assurance and Testing


by Ajay Gupta, President & CEO

Remove the veil of politics and the finger pointing. At the end of the day, HealthCare.gov is a large implementation of enterprise software.

President Obama said in his October 22 Rose Garden address that more than 20 million people had visited HealthCare.gov. There are dozens of examples of much larger websites that have far more transactions and higher traffic. The day after Christmas in 2012, Amazon had 25 million visitors and Wal-Mart had 8 million, both without a hitch. At this early post-mortem, there is not enough data to postulate the root cause(s) of the problem.

While the exchange was a complex implementation with connections to insurance companies, Medicaid, Medicare and state marketplaces, HealthCare.gov shouldn't have broken in such spectacular fashion. A pertinent question should be asked: Would a rigorous quality assurance and testing program have improved the likelihood of implementation success, or at least a launch with few or minor bugs?

Now it is about isolating and fixing performance and data issues. If the fixes are put into production with a robust quality assurance process and real-world test cases, insurers and users will likely get far better results. As with other complex implementations, fixing the obvious issues will likely uncover more trouble that won't surface until most of the site's functions are exercised. Fixing a faulty design takes time and hard work.

Quality Assurance, an often overlooked and understaffed function, is critical to the success of every software implementation in an enterprise and, importantly, in any government institution. In fact, Health and Human Services Secretary Kathleen Sebelius concurred when she pointed to insufficient testing as one of the reasons the site is not performing to expectations.

Quality assurance – more commonly referred to in the industry as QA – includes developing a unified testing and validation framework, designing a self-sustaining and scalable service-based model, developing consistent and tangible metrics, and enforcing rigorous compliance.

QA includes Independent Validation Testing (IVT) – an experienced team and an end-to-end, comprehensive process – whose primary objectives are to identify and quickly fix every software, human-interaction and hardware risk and business exposure; deliver a comprehensive, independent verification and validation audit; validate with third-party testing; and perform full life-cycle testing.
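
As a small illustration of what one automated check in such a suite might look like, the sketch below uses Python's unittest module against a hypothetical enrollment function. The function and its behavior are stand-ins, not the actual HealthCare.gov service:

```python
# Minimal sketch: one automated end-to-end check in an independent
# validation suite. enroll() is a hypothetical stand-in; a real IVT
# suite would exercise a staging deployment of the actual system.
import unittest

def enroll(applicant: dict) -> dict:
    # Hypothetical enrollment logic used only to make the test runnable.
    if not applicant.get("ssn"):
        return {"status": "rejected", "reason": "missing SSN"}
    return {"status": "enrolled", "plan": "silver"}

class EnrollmentTests(unittest.TestCase):
    def test_valid_applicant_is_enrolled(self):
        result = enroll({"name": "Jane Doe", "ssn": "000-00-0000"})
        self.assertEqual(result["status"], "enrolled")

    def test_missing_ssn_is_rejected(self):
        result = enroll({"name": "John Doe"})
        self.assertEqual(result["status"], "rejected")

if __name__ == "__main__":
    unittest.main()
```

Multiply checks like this across every interface, data feed and failure path, run them against production-scale loads, and you have the outline of the independent testing layer the launch lacked.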

Lessons Learned from the HealthCare.gov Launch

With such a complicated software structure, the rollout should have been phased rather than launched in full. States like California appear to have rolled out their healthcare marketplaces with little fanfare.

When integrating databases from dozens of contractors and 55 separate platforms, there should have been an entire layer of independent testing, quality assurance and oversight before launch.

Agility is required to solve software development challenges, and government institutions typically don't work fast enough to react to and fix glitches.