Case Study: Grab & The PDPA – How Policies & Audits Might Have Prevented Fines


Business logic vulnerabilities are arguably the most dangerous of all vulnerabilities, because they can be silent yet just as damaging as technical flaws. Some are exploited by malicious threat actors; others are internally self-inflicted.

Let’s take a look at two cases that happened close to home in Singapore, both involving one of our favourite apps to have on our mobile phones these days – Grab.

Case #1

What Happened

In December 2017, Grab launched a targeted email campaign to its customers. 399,751 emails were sent, but 120,747 of them contained the name and mobile number of another customer. Under Singapore’s PDPA, an individual’s name and mobile number constitute personal data, so Grab was found to have breached the PDPA and was fined $16,000.

According to Grab, the error was caused by information being incorrectly assembled from across different database tables. From an administrative perspective, however, the root cause was a self-inflicted failure to check for sound business logic in Grab’s marketing application.

How It Was Resolved

Grab implemented new policy measures, including having “a third person to perform sanity checks of the data before triggering any new campaigns” and masking mobile numbers in its email content.
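As an illustration of how inexpensive the masking measure is, here is a minimal Python sketch. The function name and the convention of exposing only the last four digits are assumptions for illustration, not Grab’s actual implementation:

    import re

    def mask_mobile(number: str) -> str:
        """Mask all but the last four digits of a mobile number.

        Keeps only the digits, then replaces everything except the final
        four with '*'. Numbers with four or fewer digits are fully masked.
        """
        digits = re.sub(r"\D", "", number)
        if len(digits) <= 4:
            return "*" * len(digits)
        return "*" * (len(digits) - 4) + digits[-4:]

    print(mask_mobile("+65 9123 4567"))  # prints ******4567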

How It Could Have Been Prevented

  1. Data classified as Personally Identifiable Information (PII) should always be masked or omitted where possible to reduce risk exposure. That includes the full name of an individual.

    The first name (or last name) of an individual by itself will generally not be considered PII – which is a good reason to collect first and last names as separate fields rather than as a single full name.

    However, when coupled with other pieces of information such as a complete mobile number, the combined set of information would be considered PII. Had the mobile number been masked (or omitted from the marketing email entirely) and the name limited to the first name only, the PDPA might not have been breached.
  2. Due care should have been exercised before the send by first executing the marketing campaign against a full set of test or internal data. This means not just sending one email to the first entry in the database, but also running other business logic tests such as random selections and group selections.
  3. For large data processing jobs, due diligence should also have been done to ensure that the system does not malfunction partway through. In this case, that could have been done by performing sample checks on the content of the emails being sent (see the sketch after this list). With that process in place, Grab’s exposure would not have reached 120,747 sets of PII, but perhaps only a few hundred or a few thousand.
  4. Finally, a policy requiring senior leadership approval for the use of PII in campaigns should be in place. This ensures that the marketing team’s reporting metrics cover not only the response to a campaign, but also whether PII was used responsibly. With the extra effort from the campaign team to verify that the system functions correctly before seeking senior management approval, the chances of such an error occurring become slimmer.
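As flagged in points 2 and 3 above, a pre-send sample check is also cheap to script. The sketch below is illustrative only: the customers source-of-truth table, the merged campaign_rows, and the field names are all hypothetical. It randomly samples the queued rows and aborts the send if any sampled row’s name and mobile number do not belong to the customer it is addressed to.

    import random

    # Hypothetical source-of-truth table, keyed by customer ID.
    customers = {
        "C001": {"first_name": "Alice", "mobile": "91234567"},
        "C002": {"first_name": "Ben",   "mobile": "98765432"},
    }

    # Hypothetical merged rows queued for sending; the second row simulates
    # the kind of bad join that caused the December 2017 incident.
    campaign_rows = [
        {"customer_id": "C001", "first_name": "Alice", "mobile": "91234567"},
        {"customer_id": "C002", "first_name": "Alice", "mobile": "91234567"},
    ]

    def sample_check(rows, source, sample_size=100):
        """Return sampled rows whose name/mobile do not match the source record."""
        sample = random.sample(rows, min(sample_size, len(rows)))
        mismatches = []
        for row in sample:
            record = source.get(row["customer_id"])
            if record is None or (record["first_name"], record["mobile"]) != (
                row["first_name"], row["mobile"]
            ):
                mismatches.append(row)
        return mismatches

    bad = sample_check(campaign_rows, customers, sample_size=2)
    if bad:
        raise SystemExit(f"Aborting send: {len(bad)} sampled row(s) mismatched")

A check like this, run in batches during the send as well as before it, is the sort of control that would have limited the exposure to a small fraction of the 120,747 affected emails.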

Case #2

What Happened

Also in 2017, Grab launched a Google Form that allowed Grab Hitch drivers to submit appeals for various offences or acceptable-usage violations. In the form, Grab requested various PII, including the drivers’ full names, NRIC numbers, mobile numbers and vehicle license plates.

Due to a misconfiguration in the form’s settings, every person who submitted the form was able to view all the submissions that had been made to it, including all the PII those submissions contained.

It is the responsibility of an organization to have a sufficient understanding and appreciation of a product before using it. Without this understanding, and without any additional security arrangements to protect personal data, Grab was found to be in breach of its responsibilities under the PDPA again. However, the data was exposed for only 7 hours and, together with other factors, the PDPC fined Grab $6,000 after appeals.

How It Was Resolved

There does not appear to be any official statement about how Grab intended to prevent such business logic vulnerabilities from recurring.

How It Could Have Been Prevented

  1. Similar to the previous case, due care should have been taken to ensure that the project team understood the complete user experience. The link to view all the results is visible in the email a user receives upon successfully submitting the Google Form. Had the team tested a form submission and followed through the interactions that flow from it, they would have spotted the issue before customers did.
  2. Again, as in the previous case, a policy requiring senior leadership approval for the collection and processing of PII might have prompted greater effort in ensuring that the data privacy policy was conformed to.

Why These Matter

Neither of the two campaigns/projects above probably had much budget assigned to its execution. Yet the losses accumulated from the data breaches far outweighed what it would have cost to audit the campaigns’ information security measures.

In the first case, there was a $16,000 fine. Legal fees and the opportunity cost of the time that both staff and management had to commit might have pushed the total loss upwards of $25,000.

In the second case, there was a $6,000 fine. The same considerations of legal fees and opportunity costs might have pushed the loss upwards of $15,000.

Neither figure takes into account the negative publicity, nor the steep increase in financial penalties the PDPC has been dishing out since then. Furthermore, both fines had already been reduced for mitigating factors, such as the presence of a data protection policy. Both incidents could have been avoided by adopting reasonable information security practices and engaging in process auditing – testing the business logic and verifying that the data protection policy was adhered to.
