ESG Investing: Is Facebook a Responsible Investment?

Facebook (NASDAQ:FB) is the world’s largest social network, with 2.7 billion people around the globe using at least one of the company’s social media products monthly. Of those, at least 2.1 billion log in daily to message friends, relatives, and strangers. Nearly 40,000 people work at the company full-time, and tens of thousands more serve as contractors. With the possible exception of Alphabet’s Google or Amazon, no company touches as many people or exerts as much influence as Facebook.

And it does this with a relatively small number of products:

  • The original Facebook platform, unveiled on Feb. 4, 2004.
  • Facebook Messenger, the direct-messaging service created in 2011 that links to a Facebook account and mirrors texting.
  • Instagram, founded by Kevin Systrom and Mike Krieger in October 2010 and acquired by Facebook for $1 billion on Sept. 6, 2012.
  • Internet.org, a nonprofit arm founded in August 2013 to bring the internet to underserved communities.
  • Oculus, a virtual and augmented reality company founded in 2012 and acquired by Facebook for $2 billion in March 2014.
  • WhatsApp, arguably the world’s most popular messaging and texting app, which is present in 180 countries today after being founded by Jan Koum and Brian Acton in 2009. Facebook acquired the company for $19 billion in February 2014.

Facebook has been a cultural phenomenon almost from the beginning, so much so that in 2010, Aaron Sorkin wrote the screenplay for The Social Network, a drama about the company’s founding. The film went on to gross $224.9 million worldwide on a $40 million production budget and win three Oscars (for adapted screenplay, editing, and original score).

Among business biopics, only 2013’s The Wolf of Wall Street ($392 million in worldwide ticket sales on a $100 million production budget) has outdrawn Sorkin’s on-screen character study of Facebook CEO Mark Zuckerberg (played by Jesse Eisenberg).

Facebook’s global influence has grown exponentially in the years since. In 2018, on his weekly show Last Week Tonight, comedian John Oliver ran a segment on the company’s lax efforts to prevent hate speech in Myanmar. The segment (not safe for work) tells the full story; the abridged version amounts to this: Facebook is by far the most accessible and most trusted source of news in the country, which made it easy for members of the Buddhist majority to inflame anti-Muslim sentiment, which in turn fueled an escalating genocide against Myanmar’s Muslim population.

Whether as investors or as human beings, it’s fair to say we can no longer think of Facebook merely as a profit-making company. Its global footprint makes it the world’s first interconnected public square — and that introduces opportunities to do both good and bad, at scale. Examples of both are easy to find.

On the one hand, Facebook created the Open Compute Project and pioneered greener data centers with its Prineville, Oregon, facility. More than 75% of the energy the company consumes in its quest to connect people around the globe comes from renewables, according to Facebook’s sustainability reporting. On the other hand, Facebook spent extraordinary amounts of time and capital downplaying its role in the Cambridge Analytica scandal, in which the personal data of up to 87 million people was compromised in a sweeping effort to influence the outcome of a U.S. election.

Investors take Facebook’s privacy practices seriously because poor governance can depress performance and erode returns on invested capital (ROIC). Accordingly, in June, Standard & Poor’s removed Facebook from its Environmental, Social, and Governance (ESG) index during its annual rebalancing. The company also has yet to join the elite ESG outperformers on the Dow Jones Sustainability Index. These are important, if subjective, measures of Facebook’s ESG commitment.

Ultimately, how good (and how bad) Facebook has become at serving both shareholders and the public interest is a matter of perspective and debate. So in that spirit, we decided to look deeper by evaluating the company using The Motley Fool’s 10-question framework for ESG attributes. Here’s how Facebook stacks up.

1. Does the company treat its employees, customers, community, and other stakeholders well?

Maybe. If this answer feels like hedging, it’s for a good reason. For full-time workers, Facebook can be an amazing experience. Glassdoor ranks the company seventh on its 2019 list of Best Places to Work. LinkedIn is the only tech-driven company to rank higher, and Google — a premium employer in its own right — ranks eighth.

Here’s how one hardware engineer, a current Facebook employee, describes the cons of working at Facebook in a July 2018 Glassdoor review:

It’s so good that it would be hard to work anywhere else. I’m destroyed for life. If [Facebook] is your first job I would feel bad because this is NOT a true view of the working world. If you’re humble enough to appreciate that then you should be ok. I believe people can develop a degree of entitlement and hubris in this environment. You can run into folks like that, but the majority of others are great.

Talk about a powerful testament. When your core criticism is that the benefits and work environment are so good that a handful of workers have become spoiled, then you’ve found a truly great place to spend your weekdays. If only contractors and customers were as lucky.

Sadly, they aren’t. According to two blockbuster exposés published by The Verge in February and June, conditions at the sites employing the tens of thousands of contractors who perform content moderation for Facebook — searching for, reviewing, and potentially deleting offensive content — include abnormally long hours, low pay, and high stress. At least one contractor has died on the job, according to The Verge. Others suffer from post-traumatic stress disorder (PTSD) brought on by the avalanche of gruesome images of violence and child pornography they’re tasked with viewing and removing as quickly as possible.

To be fair, these contractors do not work for Facebook but for Cognizant Technology Solutions (NASDAQ:CTSH). Legally, Facebook doesn’t owe its contractors the same duty of care it owes its full-time workers. And yet these workers are quite literally on the front lines: without them, Facebook users around the globe would be bombarded with offensive and illegal content. That the company isn’t doing more to protect the workers who feed its success is both telling and tragic.

There’s hope for that to change, according to The Verge. Arun Chandra, Facebook’s vice president of scaled operations, told the publication that he plans to include worker well-being in the equation when evaluating contracting firms such as Cognizant. That’s a hopeful promise, but Facebook has broken promises before: customers who were assured their data was safe got the Cambridge Analytica scandal instead. I can’t score Facebook as a “yes” on this question until there’s proof of better treatment.

Areas for improvement: As it now stands, Facebook is a divided community. Some employees are rich, enjoying the fruits of association with a multibillion-dollar global enterprise, while contractors struggle to make a living wading through images that could be better handled by artificial intelligence or other computerized screening. Evidence of better treatment of contractors, including a clear implementation of Chandra’s plan to score contracting firms on worker well-being, could flip the answer to “yes.”

2. Is the company a good steward of the environment?

Yes. While Google deserves credit for being the first major tech company with a data center footprint to invest directly in renewables — solar, in this case, all the way back in 2006 — Facebook helped usher in the era of green data centers nine years ago. In January 2010, the company publicly committed to opening a reimagined data center, built from the ground up to be energy efficient in ways no one had previously considered. The Prineville, Oregon, facility opened in April 2011 and is still serving users today.

Facebook’s Prineville, Oregon, data center is one of the centerpieces of a strategy for making data center computing greener and more efficient. Image source: Facebook.

A WIRED article from that year underscores just how important Facebook’s contribution was. Ranking nine of the “more innovative facilities that came online in 2011,” the magazine found that only Prineville scored under 1.1 for power usage effectiveness (PUE). (A facility with a PUE of 1.0 is perfectly efficient, consuming no extra power for lighting, power distribution, or cooling.)
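That threshold is easy to check: PUE is simply total facility energy divided by the energy delivered to IT equipment, so any overhead for cooling, lighting, or power distribution pushes the ratio above 1.0. Here's a minimal sketch with illustrative numbers, not Facebook’s actual figures:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers (not Facebook's reported figures): a facility drawing
# 10,900 kWh in total while its servers consume 10,000 kWh.
print(round(pue(10_900, 10_000), 2))  # 1.09 -- under the 1.1 mark WIRED cited
```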

Data centers have been getting greener in the years since; just look at the work ESG leader Microsoft (NASDAQ:MSFT) is doing with undersea data centers via Project Natick. Facebook deserves at least some credit for inspiring Microsoft and others to think bigger. Why? The Open Compute Project, which Facebook unveiled around the same time it opened the doors at Prineville. Think of the initiative as what happens when the spirit of shared innovation that embodies the open-source software movement is applied to data center hardware. In this case, Facebook freely published the designs that made Prineville a state-of-the-art facility nine years ago. Today, there are roughly 200 organizations participating in and contributing to the project.

For its part, Facebook has joined RE100 and says on its sustainability pages that the company ran on 75% renewable power in 2018 and is on track to reach 100% in 2020. Zuckerberg sees it as a priority, having written a withering criticism, in a public statement, of the U.S. decision to withdraw from the Paris climate accord:

“Withdrawing from the Paris climate agreement is bad for the environment, bad for the economy, and it puts our children’s future at risk,” he wrote. “For our part, [at Facebook] we’ve committed that every new data center we build will be supported by 100% renewable energy. Stopping climate change is something we can only do as a global community, and we have to act together before it’s too late.”

We know from Facebook’s years of commitment to better data center computing that Zuckerberg means what he says, and it’s reasonable to conclude his company will continue to be an advocate for environmental sustainability.

3. Does the company promote diversity and inclusion?

Yes. In June 2014, nine months after joining the company as global chief diversity officer, Maxine Williams published Facebook’s diversity figures for the first time. The blog post was striking if predictable: 69% of employees were male and 57% were white. Among technical employees (primarily developers and hardware engineers), 85% were male and 53% were white. Those numbers had to change.

Williams wrote at the time:

Research … shows that diverse teams are better at solving complex problems and enjoy more dynamic workplaces. So at Facebook we’re serious about building a workplace that reflects a broad range of experience, thought, geography, age, background, gender, sexual orientation, language, culture and many other characteristics.

Maxine Williams has been Facebook’s chief diversity officer since September 2013. Image source: Facebook.

Where is Facebook on the diversity spectrum today, five years later? Doing better: 63.1% of the workforce is male versus 36.9% female. More important, white staff no longer make up a majority: white workers now account for 44.2% of all roles and 40% of technical roles. Also of note: 32.6% of senior leadership positions at Facebook are now held by women, up from 23% in 2014.

Progress is less visible at the top. Of the four executives Facebook profiles on its investor relations site, COO Sheryl Sandberg is the only woman, and no people of color are yet represented. Of the eight-person board, three members, including Sandberg, are women, and one of the men, former American Express CEO Kenneth Chenault, is nonwhite.

Not surprisingly, Williams believes she and the company as a whole can do better, writing in a blog post about the 2019 survey results:

We envision a company where in the next five years, at least 50% of our workforce will be women, people who are Black, Hispanic, Native American, Pacific Islanders, people with two or more ethnicities, people with disabilities, and veterans. In doing this, we aim to double our number of women globally and Black and Hispanic employees in the US. It will be a company that reflects and better serves the people on our platforms, services, and products. It will be a more welcoming community advancing our mission and living up to the responsibility that comes with it.

Prioritizing diversity and inclusion requires audacious thinking and the wherewithal to follow through. So far, Williams and her team appear empowered to provide both to a workforce hungry for greater representation. The company ranks 71st on Forbes’ Best Employers for Women and doesn’t appear at all on Forbes’ Best Employers for Diversity, which suggests that for all of Facebook’s progress on workplace gender inclusion, it still has plenty of room to improve.

4. Does the company have ethical corporate governance principles?

No, and it’s a major disappointment. Rewind to May 2012 and Facebook’s landmark initial public offering. Most will remember the feeding frenzy over shares and the extraordinary market value of the company at its debut ($100 billion), or the 50% haircut in value that investors endured over the ensuing 90 days. I remember Zuckerberg’s letter and the powerful, hopeful vision he penned about a more connected, more just, more peaceful world:

These days I think more and more people want to use services from companies that believe in something beyond simply maximizing profits. Facebook exists to make the world more open and connected, and not just to build a company. We expect everyone at Facebook to focus every day on how to build real value for the world in everything they do.

Ethical practices, in other words, were to be central to the Facebook ethos as Zuckerberg foresaw it in 2012. Seven years later, he’s leading one of the world’s most scandal-plagued enterprises. It’s so bad that BuzzFeed published a list of fresh scandals and new controversies for every month of 2018. Unfortunately, if you look at history, scandal is business as usual for Facebook.

A partial list of Facebook missteps and privacy problems

  • 2007: Facebook’s Beacon tracking app sends data from partner websites to Facebook, including publishing information to users’ public News Feeds. Complaints follow as users who bought gifts for spouses have their private purchases aired on their News Feeds, among other unwanted revelations.
  • 2010: The Wall Street Journal publishes an investigation revealing that popular apps such as FarmVille were transmitting “identifying information … to dozens of advertising and internet tracking companies.”
  • 2014: Facebook tests how negative or positive words in users’ News Feeds can affect mood, changing content on more than 689,000 users’ home pages. The verdict: Facebook can manipulate how we feel.
  • 2015: British newspaper The Guardian reveals the first details of the Cambridge Analytica scandal. Four years and many in-depth investigations later, the story of Facebook’s role as a tool in manipulating the U.S. electorate is still being told.
  • 2018: Unidentified hackers expose roughly 29 million users’ data in a September attack orchestrated by stealing “access tokens” that let Facebook users log into third-party sites. Affected users have since filed suit in federal court; the case is ongoing.

The Cambridge Analytica scandal is arguably the most far-reaching of these, and the most demonstrative of Facebook’s influence and privacy practices. The now-defunct U.K.-based political consulting firm obtained private data on about 87 million Facebook accounts worldwide with the help of an app developed by a Russian-American academic who has since sued Facebook for defamation. The data became part of an orchestrated campaign to influence the outcome of the 2016 U.S. elections.

Amid pressure from investigative reports by The New York Times and others, on April 4, 2018, Facebook shared staggering news: 87 million users’ private data had been obtained and misused without their consent. Chart source: Facebook.

Zuckerberg has since testified before U.S. and European lawmakers about Facebook’s business practices and the protections it provides to users. His testimony to EU regulators in May 2018 drew pointed questioning from lawmakers across the political spectrum, but it may be Guy Verhofstadt, a Belgian politician and a leading voice in liberal EU politics, who best captured what many people, not just users but also investors, have been thinking since the Cambridge Analytica story first broke:

You have to ask yourself how you will be remembered. As one of the three big internet giants together with Steve Jobs and Bill Gates who have enriched our world and societies, or on the other hand, the genius that created a digital monster that is destroying our democracies and our societies?

That this is even a legitimate question — and it is — precludes me from considering Facebook an ethical operator in a business that badly needs ethical oversight. Or at least that seems to be the opinion of the Federal Trade Commission. In July, the agency imposed a $5 billion fine related to Facebook’s privacy failures.

Areas for improvement: Too often, Facebook seems reluctant to take responsibility for privacy breaches impacting its users. The company is reactive rather than proactive, and sometimes in the worst ways; the FTC fine speaks to this pattern. Facebook can and should take the initiative and develop collaborative agreements with governments around the globe to watch for and respond to incursions. I’m not suggesting Facebook give governments direct access to user data. Rather, instead of performing its own internal analysis and reporting what it wants much later, the company should work hand in glove with regulators to report suspicious activity as it occurs, so there are fewer surprises and more cooperation in protecting the citizens who use Facebook’s apps and tools.

5. Do the company’s business model and its investments promote ESG principles?

No, and it’s not even close. Since the days of its missteps with Beacon, Facebook has been apologizing for its privacy issues and then returning to business as usual. And it’s not just Zuckerberg at fault. In a November 2018 exposé, The New York Times revealed a series of disturbing decisions management made during the height of the Cambridge Analytica crisis.

Two decisions by Chief Operating Officer Sheryl Sandberg, in particular, raise the question of whether Facebook can ever fulfill Zuckerberg’s original intent of building a customer-first business for the social good.

  1. In January 2018, Sandberg asked Facebook’s communications team to investigate financier George Soros’ “financial interests,” according to The Times, following a speech he gave at the World Economic Forum in which Soros suggested that Facebook and Google posed a “menace” to society.
  2. Four months earlier, in a September 2017 board meeting, Sandberg berated Facebook security chief Alex Stamos for informing directors that efforts to contain Russian infiltration of the social network weren’t complete — news that should have come directly from Zuckerberg or Sandberg but apparently hadn’t. “You threw us under the bus!” The Times reports Sandberg saying in the meeting, directing visible anger at Stamos for the disclosure.

In both instances, Sandberg appears to be taking steps to protect not only her own reputation but also that of the company. Trust is currency for Facebook; the business model doesn’t work without it. In attacking those who could undermine that trust, she’s playing a justifiable if unseemly form of defense. Trading in people’s data is an inherently conflicted business that can force leaders to choose between profit and protecting users.

Knowing this, Zuckerberg in March 2019 wrote a blog post in which he laid out a vision for a private social network governed by Facebook:

Public social networks will continue to be very important in people’s lives — for connecting with everyone you know, discovering new people, ideas and content, and giving people a voice more broadly. People find these valuable every day, and there are still a lot of useful services to build on top of them. But now, with all the ways people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first.

And where are we in the development of a private version of the world’s most popular social network? We don’t know. Facebook hasn’t yet provided a roadmap, and in his post, Zuckerberg conceded it could be a while before a privacy-focused network goes live.


In the meantime, private messaging service Telegram had more than 200 million users worldwide as of March 2018 and gained 3 million more in a single 24-hour span a year later, when all of Facebook’s messaging services experienced outages around the globe. Such brittle loyalty appears to stem from a long-term, systemic breakdown in trust between Facebook and its users. A good ESG actor wouldn’t have let the relationship sour without taking more direct action, and sooner.

There’s little evidence Facebook has learned its lesson. Recently, the company unveiled plans to formally enter the dating business, which could go wrong in numerous ways given Facebook’s history with handling private data. Imagine, for example, that in signing up, you unknowingly give permission to a third party to post on your News Feed — including notifications when someone requests to connect to you on the dating app. What happens when an ex reaches out and your current partner sees the notification? And that’s just one possibility. Policing usage by minors is certain to be an issue as well. Trusting Facebook with romantic data when it’s proven untrustworthy in too many other areas seems like a stretch.

Areas for improvement: It’s pretty simple. Zuckerberg can and should make good on a private, subscription-based version of Facebook as soon as possible. When he does, it’ll give users the option to choose what experience they most want, which is what every company — especially a company with ESG ambitions — should be striving for.

6. Does the company have a healthy balance sheet?

Yes. In fact, Facebook has one of the strongest balance sheets you’ll find. The company has more than $48 billion in excess cash and investments versus $7.8 billion in debt as of this writing. That’s a profoundly healthy ratio and more than enough capital to fund Facebook’s operations for the next two and a half years without adding a dime of new free cash flow. It also ensures that fines from the FTC and other agencies around the globe — for now — amount to little more than the corporate equivalent of a bee sting.
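That runway estimate is simple arithmetic: $48 billion in cash spread over 2.5 years implies annual spending of roughly $19 billion. A back-of-envelope sketch (the implied annual spend is derived here, not a figure Facebook reports):

```python
# Figures from the discussion above, in billions of dollars.
excess_cash_b = 48.0   # excess cash and investments
runway_years = 2.5     # estimated runway with zero incremental free cash flow

# Implied annual operating spend if the cash pile alone funded operations.
implied_annual_spend_b = excess_cash_b / runway_years
print(implied_annual_spend_b)  # 19.2
```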

7. Can the company generate organic revenue growth supported by long-term tailwinds?

Yes. Facebook’s growth is inexorably tied to the growth of smartphone and internet usage around the globe, and both should continue to rise. Market research provider Newzoo put total smartphone usage at 3 billion in 2018, rising to 3.8 billion by 2021.

In other words, roughly half the world is still getting the basic infrastructure that leads to a Facebook account, getting connected to friends, using messaging, and getting news — all steps that boost Facebook’s revenue and profit profile because they increase the addressable audience to which Facebook can deliver ads. (In its SEC filings, Facebook says that it generates “substantially all of our revenue from selling advertising placements to marketers.”)

Wall Street has reached a similar conclusion. According to financial analysts polled by S&P Global Market Intelligence, Facebook is expected to grow earnings by roughly 19.54% annually over the next three to five years. Revenue will at least double over that same period, analysts project.
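Those two projections are consistent with each other: compound growth of roughly 19.5% a year doubles a quantity in just under four years, inside the analysts’ three-to-five-year window. A quick sketch of the arithmetic:

```python
import math

def doubling_time(annual_growth: float) -> float:
    """Years to double under compound growth: solve (1 + g)^t = 2 for t."""
    return math.log(2) / math.log(1 + annual_growth)

# At the ~19.54% annual growth analysts project, a doubling takes about 3.9 years.
print(round(doubling_time(0.1954), 1))
```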

8. Can the business generate growing FCF and sustain high ROIC?

Yes. Facebook is a remarkably good cash generator, having more than tripled annual cash from operations between 2015 and the trailing 12 months. Free cash flow has more than doubled over the same period, with the difference due to a sharp sevenfold increase in capital spending in that time. Spending on principal for finance leases, share repurchases, and data center equipment, among other things, seems to be the cause.

In the meantime, Facebook’s return on invested capital (ROIC) has improved every year from 2015 to 2018 and sits at a heady 30% over the trailing 12 months versus an 8.5% weighted average cost of capital (WACC), according to New Constructs data. Zuckerberg and Sandberg deserve credit for an impressive record of allocating capital for the benefit of shareholders.
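That ROIC-versus-WACC spread is the heart of the argument: a company creates economic value only when its return on invested capital exceeds its cost of capital. A sketch of the comparison, using hypothetical inputs chosen only to reproduce the roughly 30% ROIC and 8.5% WACC cited above (New Constructs’ actual inputs aren’t shown here):

```python
def roic(nopat_b: float, invested_capital_b: float) -> float:
    """Return on invested capital: net operating profit after tax over invested capital."""
    return nopat_b / invested_capital_b

# Hypothetical inputs in $ billions, chosen to yield the ~30% ROIC in the text.
nopat_b = 21.0
invested_capital_b = 70.0
wacc = 0.085  # weighted average cost of capital, per the article

spread = roic(nopat_b, invested_capital_b) - wacc
print(round(spread, 3))  # 0.215 -- about 21.5 points above the cost of capital
```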

Source: New Constructs.

9. Is the management team focused on driving long-term profitable growth?

Yes. Growth has never been a problem for Facebook. Despite its well-publicized privacy issues, the company has grown both its daily active user (DAU) and monthly active user (MAU) counts every quarter from Q2 2015 through Q2 2019. Average revenue per user (ARPU) has grown 155.45% over the same period, from $2.76 to $7.05. Clearly, Facebook is finding ways to deliver ads its customers value.
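The ARPU figure checks out: moving from $2.76 to $7.05 is an increase of about 155%, as a quick computation confirms:

```python
arpu_start, arpu_end = 2.76, 7.05  # average revenue per user, per the figures above

growth_pct = (arpu_end / arpu_start - 1) * 100
print(round(growth_pct, 1))  # 155.4
```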

How those ads perform is important, because Facebook has no one else to blame. Like Alphabet, Facebook is a “walled garden”: advertisers can build campaigns and spend ad dollars only through Facebook’s own tools and platform. Third-party tools, such as those provided by The Trade Desk, can’t plug into Facebook to bid programmatically on ad space. By keeping control, Facebook continues to grow fast and produce tens of billions of dollars’ worth of annual free cash flow. Management has been laser-focused on making the most of its ad inventory — Zuckerberg even has a 10-year growth plan — and I don’t see that changing soon.

Management is also focused on creating optionality to boost revenue from sources other than ads. Recent add-ons beyond dating include Facebook Marketplace, for selling secondhand items, and Libra, a cryptocurrency the company says will serve the 31% of the world’s population that remains unbanked.

10. Does the company have a medium- or lower-risk profile?

Maybe. The company’s financial strength makes it one of the current era’s titans of industry. At the same time, at least 3% of Facebook profiles are fake. In an age when “fake news” is a verifiably real problem, Facebook is one of its main distribution channels, even if it doesn’t mean to be. Therein lies a risk that’s often overlooked: since Facebook can’t properly police itself, it must be regulated.

If that sounds harsh, consider that as it’s currently structured, Facebook needn’t answer to shareholders or the board. Zuckerberg has voting control of the company, and there’s virtually no scenario in which he can be forced to make changes that others deem good for shareholders, customers, or the public.

Source: Facebook’s April 2019 proxy statement.

Here’s what this chart means:

Controlled Company Status

Because Mr. Zuckerberg controls a majority of our outstanding voting power, we are a “controlled company” under the corporate governance rules of The Nasdaq Stock Market LLC (Nasdaq). Therefore, we are not required to have a majority of our board of directors be independent, nor are we required to have a compensation committee or an independent nominating function. We have nevertheless opted to have a majority of our board of directors be independent and to have a compensation & governance committee comprised of independent directors, as more fully described below. In light of our status as a controlled company, our board of directors has determined not to have an independent nominating function and to have the full board of directors be directly responsible for nominating members of our board.

Were Facebook actually a state — and now with its own currency, it’s increasingly looking like one — we’d call it a dictatorship. It’s hard to see governments standing aside while Facebook emerges as a global economic superpower.

More likely is that regulators will write new laws and enforce old ones to govern how the platform serves people in the years to come. Already, seven U.S. state attorneys general and the attorney general of the District of Columbia are cooperating on an antitrust probe. “The investigation focuses on Facebook’s dominance in the industry and the potential anticompetitive conduct stemming from that dominance,” New York Attorney General Letitia James said in a press release. Regulators could seek to seriously curtail the company’s business practices or break it up. Either way, the version of Facebook you’re investing in today could look very different even five years from now.

Areas for improvement: Regulation is coming. The only question is whether Facebook will participate in the process or have new laws foisted upon it. Instead of pouring dollars into lobbying — Facebook has spent more than $31 million on lobbying in the last three years alone — Zuckerberg and Sandberg could be spending time and effort discussing commonsense ideas for improving how we govern the social web around the world. They’ve been reluctant witnesses so far.

A 6 of 10 score raises questions about Facebook’s commitment to ESG

Facebook checks the boxes on six questions in The Motley Fool’s ESG Compounder Checklist. But the two “maybe” answers and two “no” answers, on critical questions relating to management choices, the company’s ongoing collisions with regulators, and its failures on public policy, keep Facebook from scoring high on our ESG framework.

The good news? Facebook’s ESG standing will improve materially if Zuckerberg and Facebook’s other leaders make good on promises to bring to market a fully private Facebook experience that’s built on a subscription model, giving customers choices in how they use and experience the world’s most feature-rich social media platform. Unfortunately, that day may never come — and I can’t see Facebook as an ESG compounder unless it does.