Amy Keller
  • Partner
  • DiCello Levitt
United States of America

I came to the data field shortly after I graduated from law school, when a friend of mine came to me with a horrifying story: adulterated photos of her had shown up on a website, and the hosts of the website were making disgusting and disparaging jokes about how the size of her breasts would cause her “future lower back problems.” It was embarrassing and nothing short of harassment. She had not submitted the photos and certainly had not consented to their use.

The website was based in California, but we tried suing them in Illinois, where she lived. Although the court dismissed the case for lack of personal jurisdiction, the website’s owners adopted significant reforms, stopped publishing photos of other people without their consent, and took down the photos of my friend. I realised that my friend’s lawsuit had uncovered only the tip of the iceberg of the privacy issues at stake, and I started exploring legal theories related to privacy, technology, and cybersecurity after that.

A highlight of my career is that I’ve been appointed to lead more cases in the past two years than any other woman in history.

I joined DiCello Levitt in 2017 because the firm wanted to build a top-notch team of professionals dedicated to consumer protection issues – including privacy, security, and technology. Shortly after I joined the firm, Equifax announced that it suffered a massive data breach. With my partners’ support, I applied for appointment to lead hundreds of class actions that were filed against the consumer reporting giant and became the youngest woman ever appointed to co-lead a multi-district, nationwide class action. The settlement we achieved in that case – $1.5 billion – is the largest ever data breach settlement by several orders of magnitude, and it was upheld by the Eleventh Circuit Court of Appeals and survived a challenge at the United States Supreme Court as well.

My 40th birthday is still a year and some change away, so I feel so grateful and fortunate to have had this level of success so early on in my career.

When thinking about challenges in data law, I typically look to how the laws in place affect consumers. The problem is that, with the exception of a couple of state statutes, the law does very little to protect data or privacy. I try my best to work with lawmakers to suggest revisions to their statutory framework and advise them when proposed amendments would do nothing more than help Silicon Valley. Unfortunately, I find that more often than not, elected officials have no idea how technology works, and rather than ask intelligent questions about how companies use our data, they use hearings as an opportunity to grandstand about “censorship” by social media companies and conflate two distinct constitutional protections.

Unfortunately, consumers often pay the price for congressional gridlock. And when technology outpaces the law (as it does all the time), big problems arise. I can think of two examples when it comes to cybersecurity. First, companies are amassing mountains of information on consumers; second, they are simultaneously ignoring lingering security threats associated with end-of-life systems and the complex servers on which that information is squirreled away. Threat actors know this, and unless companies are incentivised to develop good cybersecurity hygiene and data minimisation (and threatened with serious consequences if they don’t), more and more of our information will become available to cybercriminals over time. Although paying for cybersecurity is never sexy, and rarely marketable for most companies, it’s necessary to protect customers.

The obvious answer to the above concerns, of course, is to not only elect intellectually curious politicians but to also encourage lawmakers to work with subject matter experts to better understand which policies would make sense when dealing with technology.

Emerging trends I’m very excited about include the “personal information as a property right” line of cases that my colleagues and I are developing because the United States does not have a statutory scheme to adequately compensate consumers for privacy violations and data breaches. I am also excited that consumers are taking notice of the way companies use their information and the greater transparency that some companies are building into their apps and products. Finally, I am excited to see how technology is being used to make our lives more convenient – provided, of course, that consent is obtained and adequate disclosures are made about how our information is used!

An example of how gender gave me a different perspective on a data protection issue at work happened when a colleague of mine handled a data breach case that involved the unauthorised disclosure of homeowners’ insurance information and files to hackers. The information seemed rather innocuous – such as marital status and even whether someone had a dog. But the information disclosed presents far more severe challenges to women: discovery revealed that criminals were more likely to target single women without dogs. Suddenly, a data breach that involved “publicly-available information,” as the defendant argued, provided a blueprint for which homes to burglarise. It opened my eyes to how collecting information on things that may seem routine can have very serious consequences when that information is not protected or is exposed to the wrong people.

One obvious change in the data field over the last decade is that more and more women are pursuing careers in tech. The change is encouraging; women offer different perspectives, more collaborative leadership styles, and almost always have a positive impact on corporate culture. One big problem remains, though: how do we accommodate the childcare gap? When women have children, even if they are supported by amazing partners, most of the time they are the ones taking time off work to care for young children. Women are also expected to accept accommodations to have a family – such as “stepping back” from leadership roles, or going part-time and receiving less pay, to do the same amount of work that other employees do (except they do it at home, after the children go to bed). Every single industry – even those outside of tech – must face an evergreen problem: how do you ensure that talented and qualified women have the option to remain employed (or return to work after a long absence) when they decide to have children? Without an answer to that question, we’ll continue to have problems recruiting and retaining amazing talent.

A piece of advice I would give to aspiring data lawyers and professionals is to ask the “dumb” questions. I’ve had a lot of meetings with subject-matter experts, and there have been many times when what the expert said just didn’t make sense to me. Every single time I asked the expert to explain, others thanked me or said, “I was wondering the same thing!” Asking questions is often the best way to learn, and it also keeps you more engaged with the issues in front of you.
