Big Data Causes Legal Dilemmas

Regardless of the industry, the companies that handle big data best tend to be the ones on top. There is more digital information to collect and analyze than ever before, and the amount of data in the virtual world is expanding exponentially. Companies that can take in this deluge and turn it into new, sophisticated ways of making the right decisions (whether about whom to hire or whom to offer a discount) have discovered new markets and new tools for increasing their efficiency.

The drawback to our era’s newest and most helpful tool is that, wielded without proper caution, Big Data can create major problems for the companies that use it. National regulators such as the United States Federal Trade Commission (FTC) have repeatedly cautioned that the use of Big Data can perpetuate and even amplify societal biases. Big Data can be used in ways that screen certain groups out of opportunities for employment, credit, or other forms of advancement to which all citizens have a right.

At this point in the discussion it makes sense to introduce the term “disparate impact.” Disparate impact is a well-established legal theory under which companies can be held responsible for illegal discrimination that occurs within processes that are supposed to be neutral. Methods of screening candidates and/or consumers can be, and have been, found illegal due to their disparate impact on particular groups of people, especially if it can be shown that these methods had a disproportionately adverse effect on individuals based on race, age, gender, or other protected characteristics.

Companies whose use of big data has unforeseen effects that fall under the umbrella of “disparate impact” will not be forgiven in a legal sense, even if they had no knowledge of the unlawfully discriminatory practices their data analysis was encouraging. Plaintiffs do not need to demonstrate that a defendant company intended to discriminate or knew that discrimination was occurring; they need only show that the company’s policies or actions had the discriminatory effect of excluding protected classes of people from key opportunities.

How could big data facilitate illegal discrimination at a company? It’s a worryingly easy process. Say that research demonstrates that employees who live closer to work stay at the company longer. A company hears of this and adopts a policy of screening potential employees by zip code. Depending on where the company is based, that policy may also inadvertently exclude candidates by race, and it may then trigger claims of discrimination, as the sketch below illustrates.
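To make that mechanism concrete, here is a minimal sketch in Python of how such a zip-code screen might be checked for disparate impact. Everything in it is invented for illustration (the applicants, zip codes, and group labels are hypothetical), and the check shown is the EEOC’s “four-fifths” rule of thumb, not a definitive legal test.

```python
# Hypothetical sketch: how a facially neutral zip-code screen can produce
# disparate impact. All names, zip codes, and numbers are invented.

# Applicants: (group, zip_code) -- group labels are illustrative only.
applicants = [
    ("group_a", "19104"), ("group_a", "19104"), ("group_a", "19139"),
    ("group_a", "19139"), ("group_a", "19139"),
    ("group_b", "19087"), ("group_b", "19087"), ("group_b", "19087"),
    ("group_b", "19010"), ("group_b", "19010"),
]

# The "neutral" screen: only zip codes near the office pass.
preferred_zips = {"19087", "19010"}

def passes_screen(zip_code):
    return zip_code in preferred_zips

# Selection rate per group.
rates = {}
for group in {g for g, _ in applicants}:
    members = [z for g, z in applicants if g == group]
    rates[group] = sum(passes_screen(z) for z in members) / len(members)

print("selection rates:", rates)

# Four-fifths (80%) rule of thumb: the lowest selection rate should be at
# least 80% of the highest; otherwise the screen may show adverse impact.
impact_ratio = min(rates.values()) / max(rates.values())
if impact_ratio < 0.8:
    print(f"impact ratio {impact_ratio:.2f} -> potential adverse impact")
else:
    print(f"impact ratio {impact_ratio:.2f} -> passes four-fifths rule")
```

In this invented data set, no one in one group lives in a “preferred” zip code, so the screen selects them at a rate far below four-fifths of the other group’s rate, even though zip code was never intended as a proxy for race.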

Governments around the world have noted that big data usage can create these kinds of problems. However, Big Data can also be used to clarify issues that were previously obscured by a lack of research and information. The Council of Economic Advisers acknowledged that Big Data “provides new tools for detecting problems, both before and perhaps after a discriminatory algorithm is used on real consumers,” and the FTC has actually held companies responsible for failing to use the tool properly.

For example, the FTC brought an action against the lending company Gateway Funding Diversified Mortgage Services, stating that the company had failed to “review, monitor, examine or analyze the loan prices, including overages, charged to African-American and Hispanic applicants compared to non-Hispanic white applicants.” In other words, even though the company did not apply such analysis in that situation, it was still held responsible when particular protected groups were harmed as a result, because it had shown that it could and would use the same tools in other situations involving white customers.
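For illustration only, the kind of monitoring the FTC faulted the company for omitting might look something like the sketch below. The records, group labels, and dollar figures are invented, and a genuine fair-lending review would involve loan-level data and far more rigorous statistical controls.

```python
# Hypothetical sketch of basic fair-lending monitoring: compare loan
# pricing (e.g., overages) across demographic groups. All figures and
# group labels are invented for illustration.
from statistics import mean

# (group, overage charged in dollars) -- illustrative records only.
loans = [
    ("non_hispanic_white", 150), ("non_hispanic_white", 0),
    ("non_hispanic_white", 200), ("non_hispanic_white", 100),
    ("african_american", 450), ("african_american", 300),
    ("african_american", 500),
    ("hispanic", 400), ("hispanic", 350),
]

# Group the overages by demographic group.
by_group = {}
for group, overage in loans:
    by_group.setdefault(group, []).append(overage)

# Compare each group's mean overage against a baseline group.
baseline = mean(by_group["non_hispanic_white"])
for group, overages in by_group.items():
    avg = mean(overages)
    print(f"{group}: mean overage ${avg:.0f} ({avg - baseline:+.0f} vs. baseline)")
```

Even a simple comparison like this would surface large pricing gaps between groups, which is precisely the kind of review the FTC said the lender never performed.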
