Big Tech Faces Big Test on ESG Issues

As big tech and media companies face growing scrutiny of their market power, more questions about environmental, social and governance (ESG) issues are likely to be raised. Social and governance issues, in particular, deserve greater attention amid increasing regulatory pressure on industry giants.

ESG issues are attracting greater investor interest across industries today. Tech companies enjoy a relatively strong reputation on the environmental front. Most don’t consume huge amounts of natural resources, and their products and services support technological advances that give people better access to information while fostering economic growth.

But many tech companies face social and governance risks that shouldn’t be ignored. Mounting political pressure to regulate technology and media giants should compel investors to ask what these companies are doing to manage their exposure. Recent reports in the New York Times and Wall Street Journal suggest that regulators will be widening the scope of their inquiries into big tech companies beyond antitrust law. Investors should watch for lesser-known potential ESG hazards by asking these three questions:

Is Antitrust the Right Battle?

The regulation of big technology companies, like Alphabet Inc. (Google’s parent company) and Facebook, is widely perceived to be a US antitrust problem. Antitrust laws are statutes designed to protect consumers from predatory business practices by large, dominant companies. However, US antitrust actions may face challenges because, in our view, Alphabet and Facebook aren’t harming consumers.

But we think additional regulatory questions will be asked. For example, Google-owned YouTube has 2 billion monthly unique users globally, and the Facebook ecosystem (Facebook + Instagram + WhatsApp + Messenger) has amassed 2.2 billion daily users. The sheer scale of users allows both companies to benefit from a powerful network effect. At the same time, this has created unintended social implications.

Content distribution is a case in point. Facebook and Google’s dominance over content distribution could clash with a long-standing US tradition to protect the plurality of opinions. Even today, the Federal Communications Commission enforces strict ownership restrictions over traditional media outlets, such as national and local TV stations, to ensure that no single entity has too much influence over the “media voices” in a given market. There are also rules in place to prevent the merger of national broadcast networks, such as CBS and NBC. Given the massive scale of their audiences, YouTube and Facebook far exceed the reach of mainstream networks. What’s more, in the US, traditional news and advertisement publishers are obligated to verify the authenticity of information in their publications, or face exposure to liability.

Google and Facebook don’t face such liability today. Since they say they aren’t publishers, they aren’t responsible for editorial content. Both companies consider their technologies to be platforms that connect users and publishers in an “open internet” environment. That’s why Mark Zuckerberg recently said at a roundtable discussion when talking about the spread of misinformation, “I don’t believe that our platform should take [content] down.”

What Does It Cost to Be a Publisher?

The thin line between “free speech” and “fake news” will continue to provoke discussion. With increasing evidence that misinformation through Google’s and Facebook’s platforms continues to stir controversy in politics, science and other areas, regulators may ultimately be inclined to require media giants to censor content distributed on their sites. If regulators don’t take action, the platforms themselves may suffer reputational consequences as users start to question the authenticity of content.

Should companies become responsible for removing “fake news” from their platforms, they would assume de facto responsibility for deciding what information reaches the public. In other words, these companies would be transformed overnight from platforms to publishers that influence public opinion. In our view, this role may lead to regulatory oversight and corporate responsibility that they haven’t faced before.

It would also be extremely expensive. As publishers, these companies would be forced to assume the same liability and responsibility for authenticity as any traditional media publisher. With millions of hours of user-generated content constantly uploaded to the system, big tech companies would be forced to expend huge amounts of capital to monitor and authenticate information, which could hurt their profitability. Artificial intelligence may help, but probably isn’t fully capable of the task just yet; indeed, Facebook added 15,000 contract workers as content moderators to review and remove images of violence and other content deemed harmful. Investors should consider the potential business implications of these social risks when evaluating any company in their portfolio.

Do Founders Have Too Much Control?

Over the past 25 years, technology founders have increasingly negotiated for more power. For example, Google’s founders structured the company in a way that gave them outsized voting rights and the ability to retain control even if they sold their stock. Mark Zuckerberg owns just over a quarter of Facebook stock but controls around 60% of shareholder votes. Among the many recent internet and tech IPOs, dual-class shareholder structures and founder control have been the norm, leaving limited rights for public investors.

Of course, many founder/CEOs continue to play instrumental roles in shaping their companies’ journeys. However, as tech founders wield more power and influence than ever anticipated, the question of accountability is paramount. Independent boards, separation of the CEO and chairman roles, and single-class share structures are strong corporate governance practices that have yet to be widely adopted in this industry. In our view, investors need to keep pressing companies to move in this direction and hold directors accountable. Poor governance practices can be a serious business risk for investors.

These are just some of the ESG risks that big tech and media may face in the coming years. As big tech companies continue to enjoy the benefits of huge networks, we believe they must also live up to the social responsibility that comes with their immense power to shape public discourse—or their businesses will ultimately suffer. Integrating an analysis of such social and governance risks in fundamental research and valuation estimates is essential for investors seeking positions in the sector.


Lei Qiu is a Portfolio Manager on the International Technology Portfolio and a Senior Research Analyst for Thematic & Sustainable Equities at AllianceBernstein.

Dan Roarty is Chief Investment Officer of Thematic & Sustainable Equities at AllianceBernstein.


The views expressed herein do not constitute research, investment advice or trade recommendations and do not necessarily represent the views of all AB portfolio-management teams.
