As global demand for user-facing and internal enterprise apps continues to grow, the mobile development industry has focused largely on developer productivity to keep pace. Huge advancements in cross-platform tooling and rapid development workflows have allowed enterprises to cut mobile development cycles and reach market far sooner than in previous years. But this quick-to-market delivery model carries an unintended consequence: security becomes an afterthought, if it is considered at all.
A recent Ponemon Institute study revealed that only 29% of companies invest in mobile app vulnerability testing, so it shouldn't be surprising that an estimated 85% of App Store apps fail the OWASP Mobile Top 10 Risks list. Compounding this lack of investment, many development teams and product owners simply don't understand basic mobile app security. Reading the entire OWASP Mobile Security Testing Guide will give you a comprehensive knowledge base on the subject, but if you don't have a weekend to spare, here is a list of the bare-minimum tasks a developer can perform to establish a security baseline.
Leverage secure data storage
If you look up how to persist session data on iOS or Android, most examples will show you how to use the device's local storage: it is easy to implement and lets the developer quickly check off the acceptance criteria for the user story, commit, and deliver to QA. But login and web service credentials, secure tokens, and other sensitive data are not safe in local storage. On iOS, an attacker simply needs access to an iTunes backup to view all local app storage in plain text. Local storage is your mobile app's garage: tools, lawn mowers, and canned goods go in there, not valuables sitting on the workbench in plain sight.
The best advice here is to simply keep data in RAM and never persist sensitive data to the device. If that is not possible, or the user experience dictates some form of persistence, our recommendation is to encrypt the sensitive data using a tool like Facebook Conceal and store the decryption key in the operating system's secure credential store.
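As a sketch of that approach, the snippet below uses plain Java with only the JDK's javax.crypto package (so it applies to Android as well) to encrypt a value with AES-GCM before it is persisted. The SecureStore class name and its helpers are illustrative assumptions; in a production app the key would be generated inside, and never leave, the platform's secure credential store (Android Keystore or the iOS Keychain) rather than being held as a plain SecretKey object.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class SecureStore {
    private static final int GCM_TAG_BITS = 128;
    private static final int IV_BYTES = 12;

    // Illustrative only: in a real app the key lives in the Android Keystore /
    // iOS Keychain and is never hard-coded or written to local storage.
    static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        return kg.generateKey();
    }

    // Returns IV || ciphertext so the random IV travels with the payload.
    static byte[] encrypt(SecretKey key, String plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        byte[] ct = c.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }

    static String decrypt(SecretKey key, byte[] blob) throws Exception {
        byte[] iv = Arrays.copyOfRange(blob, 0, IV_BYTES);
        byte[] ct = Arrays.copyOfRange(blob, IV_BYTES, blob.length);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        return new String(c.doFinal(ct), StandardCharsets.UTF_8);
    }
}
```

Only the opaque IV-plus-ciphertext blob ever touches disk; without the key from the credential store, a backup dump reveals nothing.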
Ensure sensitive data is not logged or included in any telemetry or crash monitoring
Developers commonly use print/log statements to record events, errors, and other information to the console for debugging and general monitoring during development. See the code snippet below for an example of what you hopefully wouldn't find in a hospital's patient management system.
print("Patient apt confirmed \(patient.ssn) \(patient.name) confirmed apt at \(appointment.aptDate) for \(appointment.procedure)")
That line of code, while useful to a developer, is certainly not HIPAA compliant: it exposes a patient's name and SSN and describes a medical procedure they are about to have, all in the app's log file! Any local logging that outputs sensitive data must be disabled in release builds, as these entries are easily retrieved with a binary file reader after a device sync. Custom app telemetry via App Center, Firebase, or Crashlytics should never include sensitive data, because you're essentially handing that information to a third party.
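One common mitigation is a small logging wrapper that is a no-op in release builds and redacts obviously sensitive patterns even while debugging. The Java sketch below is hypothetical: the SafeLog name, the DEBUG flag, and the SSN-only redaction rule are illustrative assumptions, not a complete scrubbing solution.

```java
import java.util.regex.Pattern;

public class SafeLog {
    // Assumption: this flag is flipped to false for release builds
    // (e.g. driven by BuildConfig.DEBUG on Android).
    static final boolean DEBUG = true;

    // Hypothetical rule: mask anything shaped like a US SSN.
    private static final Pattern SSN = Pattern.compile("\\b\\d{3}-\\d{2}-\\d{4}\\b");

    static String redact(String msg) {
        return SSN.matcher(msg).replaceAll("***-**-****");
    }

    static void d(String msg) {
        if (DEBUG) {                         // silent in release builds
            System.out.println(redact(msg)); // redacted even in debug builds
        }
    }
}
```

Routing every log call through a wrapper like this makes it a one-line change to silence logging entirely at release time.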
When using cross-platform tools like React Native or NativeScript, make sure the code is obfuscated
The JavaScript bundle shipped inside these apps is easy to extract and read, so run it through an obfuscation/minification step before release to avoid handing an attacker your app's logic and string constants in the clear.
Retrieve sensitive data on-demand only
One common way mobile developers achieve a smooth, responsive user experience is to download all pertinent data at login, cache it in RAM, and display that local data from screen to screen. This creates a vulnerability: if the device falls into the wrong hands and is rooted, its memory can be scraped. Instead, retrieve private data only on demand, reducing the lifetime of sensitive information on the device. To take this a step further and protect in-memory data, you can leverage libraries like SecureString, Obfuscator, or the Android NDK to encode your strings.
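To illustrate the string-encoding idea, here is a minimal Java sketch that XORs a secret against a random one-time pad while it sits in memory and decodes it only at the moment of use. The ObfuscatedString class is a hypothetical stand-in for the libraries named above, and it carries a real caveat: Java String objects are immutable, so plaintext copies may linger until garbage collection; dedicated libraries handle this more thoroughly.

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class ObfuscatedString {
    private final byte[] data; // secret XORed with the pad
    private final byte[] pad;  // random one-time pad

    ObfuscatedString(String secret) {
        byte[] raw = secret.getBytes(StandardCharsets.UTF_8);
        pad = new byte[raw.length];
        new SecureRandom().nextBytes(pad);
        data = new byte[raw.length];
        for (int i = 0; i < raw.length; i++) {
            data[i] = (byte) (raw[i] ^ pad[i]);
        }
        Arrays.fill(raw, (byte) 0); // wipe our plaintext copy
    }

    // Decode only at the moment of use; callers should discard the result quickly.
    String reveal() {
        byte[] out = new byte[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = (byte) (data[i] ^ pad[i]);
        }
        return new String(out, StandardCharsets.UTF_8);
    }
}
```

A memory scrape now has to find both the encoded bytes and the pad and combine them, rather than grepping for a recognizable token.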
Always use HTTPS
Roughly 30% of apps still use HTTP instead of HTTPS. Always assume network communications are susceptible to eavesdropping, spoofing, and tampering. Even if the data itself isn't particularly sensitive, an insecure channel can open secondary attack vectors.
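A cheap guard is to refuse any endpoint that isn't HTTPS before a request is ever issued. The Java sketch below shows one hypothetical form of that check (the TransportGuard name is illustrative); platform mechanisms such as Android's network security config (`cleartextTrafficPermitted="false"`) and iOS App Transport Security enforce the same rule at the OS level and should be enabled as well.

```java
import java.net.URI;

public class TransportGuard {
    // Fail fast on any endpoint that is not HTTPS, so an accidentally
    // configured http:// URL can never reach the network stack.
    static URI requireHttps(String url) {
        URI uri = URI.create(url);
        if (!"https".equalsIgnoreCase(uri.getScheme())) {
            throw new IllegalArgumentException(
                "Insecure scheme in " + url + "; HTTPS is required");
        }
        return uri;
    }
}
```

Funneling every base URL through a check like this turns a silent downgrade into an immediate, visible failure during development.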
Vet third-party libraries and frameworks
Most contemporary apps are not developed entirely from scratch. Rather, they are to some degree "assembled" from any number of open-source libraries and frameworks, which boosts developer productivity but also creates potential vulnerabilities, because the consuming app inherits its libraries' security deficiencies. If you find a great library for annotating images but it hasn't been updated since July 2015, buyer beware. So please be sure to vet all third-party libraries and frameworks.
Most of us have a basic grasp of home security. We may not go the extra mile of home alarms, full-time monitoring, or a 12-gauge shotgun mounted above the bed, but we do observe the basics. We don't leave the back door and windows unlocked overnight, post the garage door code beside the keypad, or gratuitously share our vacation schedule on social media. We gain a degree of safety simply by locking the doors and using common sense. Securing your mobile app is no different: you gain a degree of safety simply by picking off the low-hanging vulnerabilities. These won't deter a motivated, determined attacker, but, returning to the home analogy, they will frustrate the opportunistic thief prowling for an unlocked door.
The battle to secure mobile apps is a constant one. It will never end, and while it cannot be completely won, it can be lost, and the penalty for losing can be steep in terms of both public trust and legal fallout. If you are unsure of your app's vulnerabilities, reach out to us to learn more.