Secure SDLC Review
If it is true that ‘data is the new oil,’ then applications are the refineries, databases are the digital oil tankers, and the networks they use to communicate are the modern-day pipelines. As with the physical analogy, the components required to refine raw data into valuable information rely on a set of highly interdependent, synchronized, and complex processes. Both forms of refinement depend on uninterrupted public utilities, communication networks, specialized facilities, hardware, software, people, knowledge, policies, and procedures to keep the entire production line running safely and as efficiently as possible. For many organizations, application development is a black box that is neither well understood nor well managed. Understanding how your organization takes in data, processes it, licenses Independent Software Vendor (ISV) software or Platform as a Service (PaaS) offerings, and creates, inspects, and tests internally developed source code to keep core business processes running is critically important.
For many organizations, the only applications and supporting infrastructure they care about are the ones related to business processes that fall under the purview of either government (e.g., HIPAA, PII, GDPR) or industry (e.g., Payment Card Industry) regulations. In legacy environments, where assets were typically hosted in company-owned facilities on company-controlled hardware, this was already a daunting task. Now that infrastructures have migrated to third-party hosting environments, access to sensitive data occurs via portable BYOD platforms, and the number of suppliers in your digital supply chain has grown exponentially, the task of remaining secure is more challenging than ever.
Most business processes are based on applications built by an external organization with which you have likely signed an end-user license agreement, running on a platform you do not directly control, communicating over unencrypted public networks, or built from source code written by your in-house development team. It is a low bar, but an all-too-common industry norm, to provide extra attention and scrutiny only to the applications that handle sensitive data and are ‘in scope’ for your organization’s regulators. This stands in stark contrast to the spirit of the ‘Prudent Man Rule’: comprehensive cyber hygiene in the current environment would require that all digital assets and applications be ‘in scope’ for at least some minimal level of inspection, detection, and protection against data breaches and the potential impact of bad actors.
You cannot control what you cannot see, so start by creating a comprehensive software, application, and network inventory. If an application, asset, piece of code, or network connection is required to support a business process, it should be accounted for in your inventory. Next, review the types and sensitivity of the data processed by each application. Document the application’s logical data flow, inclusive of the network and the digital assets on which it runs or stores data. Then step through common use and abuse cases, or perform formal threat modeling. These foundational data elements will allow you to develop a testing strategy appropriate for every application in your portfolio. Application testing can include source code review, open-source component analysis, penetration testing, continuous testing, and fuzzing, to name a few techniques.
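The inventory-to-testing-strategy flow above can be sketched in code. This is a minimal illustration, not a standard: the field names, sensitivity tiers, and the mapping from sensitivity to test types are all hypothetical assumptions to be adapted to your organization’s own data classification and testing policies.

```python
from dataclasses import dataclass, field

# Hypothetical schema for one application inventory entry.
# Field names and sensitivity tiers are illustrative assumptions.
@dataclass
class Application:
    name: str
    owner: str                       # business process owner
    data_sensitivity: str            # e.g. "public", "internal", "regulated"
    hosted_on: str                   # asset or platform the app runs on
    network_paths: list = field(default_factory=list)  # logical data flows
    in_regulatory_scope: bool = False

def recommend_tests(app: Application) -> list:
    """Map data sensitivity to a (hypothetical) baseline testing strategy."""
    tests = ["open-source component analysis"]     # every app gets a minimum
    if app.data_sensitivity in ("internal", "regulated"):
        tests += ["source code review", "continuous testing"]
    if app.data_sensitivity == "regulated" or app.in_regulatory_scope:
        tests += ["penetration testing", "fuzzing", "threat modeling review"]
    return tests

inventory = [
    Application("payroll-portal", "HR", "regulated", "vendor PaaS",
                ["browser -> PaaS -> payroll DB"], in_regulatory_scope=True),
    Application("internal-wiki", "IT", "internal", "on-prem VM"),
]

for app in inventory:
    print(f"{app.name}: {', '.join(recommend_tests(app))}")
```

The key design point is that every entry, even a ‘public’ or out-of-scope one, receives at least a minimal level of inspection, mirroring the argument that all assets should be ‘in scope’ for basic hygiene.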
There are several best-practice, prescriptive, and descriptive software security lifecycle resources that can be used to determine the current maturity of your organization’s application development and testing capabilities. With this baseline established, 12- to 48-month organizational capability improvement plans can be developed to achieve a level of sophistication and assurance that the applications and platforms your business and clients rely on are well coded and running as securely as possible. Anything short of this comprehensive approach will introduce ‘blind spots’ into your view of your organization’s risk universe. Do not be naïve enough to believe that just because your senior leadership is not protecting an asset, a bad actor is not exploiting it. Anchoring on the false premise that the absence of imminent harm implies the absence of threat creates a disabling bias and is a dangerous mindset for any senior leader. Trust, but verify, that your purview and understanding of your applications’ and infrastructure’s current risk profile are accurate and complete.
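The baseline-then-improvement-plan step can be made concrete with a simple gap analysis. The domain names and 0–3 scoring below are illustrative assumptions only, loosely inspired by maturity frameworks such as OWASP SAMM and BSIMM, not an excerpt from either.

```python
# Hypothetical current-state maturity baseline (scores 0-3 are assumptions).
current = {
    "governance":     1,   # policies exist but are unevenly applied
    "design":         0,   # no formal threat modeling yet
    "implementation": 2,   # code review and composition analysis in the pipeline
    "verification":   1,   # ad-hoc pen tests on regulated apps only
    "operations":     1,   # basic patching, no incident playbooks
}

# Target state for a 12- to 48-month improvement plan (also an assumption).
target = {domain: 2 for domain in current}

def maturity_gaps(current: dict, target: dict) -> dict:
    """Return the per-domain gap between target and current maturity."""
    return {d: target[d] - current[d] for d in current if target[d] > current[d]}

for domain, gap in sorted(maturity_gaps(current, target).items()):
    print(f"{domain}: raise by {gap} level(s)")
```

Domains that already meet the target drop out of the plan; the remaining gaps become the backlog for the multi-year capability roadmap.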