Audio of this conversation is available via your favorite podcast service.
This episode focuses on the role of shareholder activism in pursuing transparency and accountability from tech firms. In a week in which board resolutions are up for a vote at Meta and Alphabet related to each company’s development and deployment of artificial intelligence, Justin Hendrix spoke to five individuals working at the intersection of sustainable investing in tech accountability:
Michael Connor, Executive Director of Open MIC
Jessica Dheere, Advocacy Director at Open MIC
Natasha Lamb, Chief Investment Officer at Arjuna Capital
Jonas Kron, Chief Advocacy Officer at Trillium Asset Management
Christina O’Connell, Senior Manager for Shareholder Engagement and Investments at Ekō
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
I feel like the whole space of shareholder activism, investor activism around tech accountability issues is under-known, under-explored, perhaps by most of the tech accountability community. I want to go around the room and hear from each of you about your organization and what you’ve been up to lately. And Michael, let’s start with you.
Michael Connor:
Open MIC’s mission is to foster greater accountability in the tech and media sectors, and we’ve been doing that, amazingly, now for 17 years. It’s hard to believe. And 17 years ago, frankly, people in the tech sector thought we were a little bit nuts because they thought we were straddling two things: we were talking about social issues, but then we were talking about financial issues. The people who wanted change thought we were in the pocket of Wall Street, and the Wall Street people thought we were in the pocket of all those progressive people.
But that’s changed a lot, and I think one of the things that we’ve done is encouraged both advocates and investors to deal with facts, to encourage fact-based research and base our advocacy on facts, because that’s important to investors as well as to advocates in social justice organizations. And so over the years we’ve embarked on a whole range of issues, ranging from the need for federal privacy laws writ large to artificial intelligence, facial recognition, all sorts of issues. And more recently, working with both Arjuna and with my colleague Jessica at Open MIC on questions of artificial intelligence and what that means for misinformation and disinformation, and dealing with Ekō and Christina as well. So the three organizations have been involved in a big effort lately around artificial intelligence.
Justin Hendrix:
I’ll come to Natasha, because I know you founded Arjuna to focus on sustainability and on building a fossil-free future. Perhaps you can explain how you got into tech accountability.
Natasha Lamb:
Sure. So Arjuna Capital is a wealth management firm, and we care about environmental issues. As you said, we’re fossil fuel free, but we actually widen the aperture a bit more. We’re not just looking at financial issues for our clients, and we’re not just looking at the environment. We’re also looking at the implications for society, the implications for business, governance, the stability of our institutions, the underlying fabric of our society that makes all of this possible. And it’s really important that companies’ actions are not compromising that, because our economy is dependent on it. So we started engaging directly with tech firms on issues of material importance in 2016, after the 2016 presidential election, when we learned that people in the United States were not consuming mainstream media. Those weren’t the top-performing stories before that election.
We went to Facebook and we said, what’s going on right now? Why are these the stories that are popping up to the top for folks? And we ended up learning that 126 million Americans were viewing Russian propaganda in the lead-up to that election. Unfortunately, those problems have continued through today, and we are still engaging with, well, it’s now called Meta, and engaging with Microsoft and with Alphabet, or Google, on these same issues of distributing dis- and misinformation. And now with AI, as Michael alluded to, what are the issues that we have to deal with? If it’s so much easier to generate and propagate content that is false or invented, how are tech companies going to handle that? How is our society going to handle that? And this year, 2024, when we’ve got 60 elections around the globe, what are the implications for societal harm?
Justin Hendrix:
Jonas, Trillium Asset Management, perhaps you can say a word about what it is and how you got involved in shareholder activism related to technology.
Jonas Kron:
So Trillium has been around for over 40 years at this point. We’ve been a sustainable and responsible investor for that time. Currently, we’re a little over $7 billion in assets under management. And basically what that means is that we’re going to be looking to invest in companies that have a strong ESG profile. And that can take a lot of different forms in terms of the environmental performance, social and governance, but we also take our ownership stakes in these companies as an opportunity to really push them to do better. There’s no company that is perfect, there’s no company that gets it right. And a lot of times companies could really benefit from some encouragement from investors to not backslide even. So we take our ownership stake in those companies as an opportunity to really engage the companies to help them either become more sustainable, more responsible, or to maintain the programs that they do have in place.
I guess Trillium has been involved in the tech space going back to probably about 2008 or 2009, with issues at the time like net neutrality, if folks remember that one, really encouraging companies like Verizon and AT&T to adopt net neutrality practices, and being involved not only on a company engagement basis but also actually participating in the public policy process, providing the investor perspective to the regulatory structure.
We also were very involved in looking at government surveillance and government requests for data from tech companies, and helped encourage and support companies in issuing what have become called transparency reports, which are reports about requests for data, information, and metadata from governments around the world, and about compliance with those requests, because these companies are repositories for that information.
Also looking at issues like, say for example, whistleblowers. I think we’ve seen that whistleblowers have had an important role to play in tech company activities, allowing the public and investors to really understand what’s going on at these companies.
And then the other part of it is governance. I think, as we recently saw with OpenAI, governance really matters. The way an organization is set up from a governance point of view can have very serious consequences for the way that the company balances the competing demands on it for profitability and growth, but also its obligations to society and its employees. And so those are all different ways that we’ve become involved, really trying to bring the investor perspective to the fore with these technology companies.
Justin Hendrix:
Christina, let’s come to you and hear about Ekō and exactly how you think about your community, which I think of as a more traditional movement-oriented activist organization. And how did you get involved in working with the people on this call?
Christina O’Connell:
Absolutely, Justin. Ekō is a global campaigning organization focused on corporate accountability issues. And so we have members virtually everywhere around the world and those members are concerned about a variety of issues from environmental impacts to standard human rights issues, workers’ rights, et cetera, quite a range of issues. Our members are interested in having their voices heard by the companies that we address and where they see issues. And many, if not most of our members, are in fact shareholders in one way or another, though maybe not at the scale of our friends at Arjuna and Trillium, but retail shareholders themselves or investors through mutual funds, pension funds, IRAs, et cetera.
So we’re interested in providing them with an opportunity to make their voices heard, both as shareholders and as concerned citizens and activists. And the opportunity for shareholders, as the owners of the company, to speak to the executive leadership of a company to point out issues of risk, issues of potential harm, issues of concern is one that we take very seriously. So our members raise issues with us, we organize their participation in shareholder proposals, and they have an opportunity both to meet directly with the companies and to advocate for strong votes at proxy season.
Justin Hendrix:
Jessica, I’ll come to you next as well. I came to know you when you had your role at Ranking Digital Rights. So you’ve had a period of looking deeply at how tech companies perform in a variety of ways, both online platforms, as well as telecom companies. What are you learning about this space of shareholder activism?
Jessica Dheere:
I was really attracted to the shareholder activism space when I was at Ranking Digital Rights because I saw it as, new to me at the time, but also an under-leveraged opportunity for civil society. Our tech policy work relies on sound data, data of the sort that Ranking Digital Rights produces, looking at company policies and what they say they do. We can then put it into shareholder proposals where we really have a foundation of evidence that companies say they do some things, but when you go in and look at the policy and then the practice out in the world, they’re not actually doing those things. So it was very eye-opening. The companies are always asking us to take their word for it in terms of their self-regulation, and efforts like Ranking Digital Rights help establish the evidence base that we need as advocates to say they’re actually not doing what they say.
And then organizations like Open MIC and Ekō and investors like Arjuna and Trillium allow us to take that case and make it directly to the boards of these companies and to the investors in these companies.
As a journalist, I was always told to follow the money. And so for me, looking at the money part of the equation, both from the investor perspective and from the profits that the companies are making while not actually being able to show the receipts on their policies, signaled an opening for stronger advocacy, advocacy that is ultimately aligned with a lot of the good regulatory development and other policymaking that’s going on right now.
Justin Hendrix:
So I want to go around the room and ask a question about accomplishments in this area. And I want to say, I understand this is not about necessarily getting everything you want every time when it comes to shareholder proposals. It’s not a binary, yes, we won or no, we lost. You’re often up against pretty difficult circumstances, especially when it comes to governance of companies like Meta. But let’s talk about biggest wins and maybe Natasha, I’ll start with you and come around to Michael and Jonas and then Christina and Jessica.
Natasha Lamb:
I think just focusing a bit on what’s happening today: in this past year, with AI entering stage left and taking up everybody’s attention immediately, that for us was something that raised a lot of red flags. There’s so much opportunity that can come from AI, and there’s such risk in terms of the downside. And as investors, we really wanted to make sure that we understood what those risks are, what they will be, and that companies have accountability mechanisms in place. As for past wins, we’ve worked with Meta, Google, and Microsoft to do better reporting, to have transparency reports, as Jonas alluded to earlier, to have that transparency to investors. If things are going wrong, let your investors know what it is, report out on it. We want to see how you’re addressing it. We want to see how it changes year over year, and that accountability mechanism is so important.
So when we looked at AI, we saw this ethos yet again of move fast and break things. The innovation is moving so quickly that instead of listening to their AI ethics teams, companies started suppressing what their ethics teams were saying, and there was this arms race that started. And the reason for that, if you think about it: Google owns 90% of the search market, and for every 1% that Microsoft can take from Google, that’s $2 billion in revenue. So there’s a huge incentive to be a leader, to move even if you’re not ready. But in doing so, companies are not considering what the risks are. We’ve seen that even with the introduction of Bard at Google, which was seen as botched and rushed; the market value of the company dropped $100 billion. This past February, we saw Gemini producing images of Black Nazis, and again about a $100 billion drop in shareholder value when that happened.
So those are examples of things going awry as investors absorb that these technologies are really not ready for prime time. We see that happening. Nobody’s paused. We’ve heard a lot of policies, a lot of platitudes, everybody’s talking about responsible AI, but what are the accountability mechanisms?
So last year in the early summer, we filed the first proposal with Microsoft asking for just that: an accountability mechanism. We want to know what the risks are; we want to know, if there are harms, how you’re going to address them, how you’re going to remediate them; and then have that accountability mechanism of transparent reporting. That proposal went to a vote late last year, and we just saw Microsoft come out with a very lengthy report. They’re not quite there, but they’re moving in the right direction. It’s very thorough. It’s the best reporting we’ve seen so far. There are some metrics in there, but there’s more work to do to really make that an accountability mechanism.
Justin Hendrix:
And you see the connection between your advocacy and that action by Microsoft.
Natasha Lamb:
Yeah, exactly. And I think it’s the transparency that we’re all looking for as investors. In order for us to make the best decisions that we can for our clients, we need to know what’s happening and how companies are addressing this, because there are these downside risks, and at the end of the day, who gets left holding the bag if there are lawsuits, or if companies are not adhering to regulation, like the EU’s new AI Act? If you have a violation there, 7% of revenue is at risk. So those are the kinds of things that we’re concerned about, and we just want to make sure that through our dialogue and our shareholder proposals we’re pressing for that transparency.
Justin Hendrix:
What about you, Jonas? What would you point to as a victory?
Jonas Kron:
Sure. So I guess maybe three points that I’d like to touch on here. One is, I think, Justin, your point at the top here is really an important one, which is to see shareholder advocacy not in isolation, but within a constellation of other pressure points. These companies don’t really exist in isolation either; they are subject to a lot of different cross pressures, and investors can play an important role in the overall process. And so when you think about the successes and where you see successes, I think that’s an important piece to keep in mind.
I guess what I’d like to do is talk about some tech wins, but then also wrap up with a “win” in another area that I think can really help illustrate the ways in which investors can make a difference, if I can step out of the tech zone for a moment.
From Trillium’s perspective, in our work, there are a couple of different places where we’ve seen some real good progress. Some of them are a little on the older side, some are newer. At Apple, for example, back in 2012, so 12 years ago at this point, we engaged the company in a similar way to how we’re engaging Google now, which is looking at its audit committee charter and what the responsibilities of the audit committee are in terms of oversight of the company. One of the things that we saw was lacking there was any reference to user privacy, consumer privacy. We filed a shareholder proposal and worked quite closely with the company to get that language into the audit committee charter and make it really one of the key responsibilities of the audit committee. And as we saw over the following decade, Apple really did build its brand around privacy for its users, culminating in a number of different ways in terms of how they were expressing that.
I don’t want to say that Trillium’s advocacy is single-handedly responsible for Apple’s prioritizing user privacy, but it played an important role in bringing the investor perspective, showing why investors think it would be important, and getting that baked into the governance structure.
Other things that we’ve seen are at Alphabet, for example, shareholder proposals and other shareholder engagement leading to greater board diversity, linking pay to employee diversity, which is not the bread and butter of tech engagement I think more generally, but it really does set the stage for companies to be able to be more responsive to the communities that they are serving and that they should be responsible to.
But if I can just give one other example of a recent shareholder engagement that I think can really help people understand the ways in which investors can make a big difference. For the last almost two years, a pretty large group of investors that Trillium was a part of engaged Starbucks around the way that it was treating its workers who were trying to organize. We used dialogue and we used letters; we organized groups of investors with over $3 trillion in assets to send letters to Starbucks questioning the way that they were approaching organized labor, eventually escalating to shareholder proposals focused on worker rights that got a majority vote at the company’s annual meeting. And eventually the company and the union, back in February, came to an agreement on the ground rules for moving forward toward collective bargaining. That was seen as a very big development and really an illustration of the company changing its approach to unions, understanding the value of stability that comes with unions and the value of worker loyalty that comes with unions.
The workers identified three key features of that development for them. One was that the workers were organized and were leading the efforts. Two, there was a real focus on the consumer base, and young consumers really holding the company to its public reputation. And third was the investor engagement, in terms of the shareholder proposals that we had filed, as well as other investor activity focused on the board of directors. All of those things came together to help the company recognize that it needed a different approach to workers. I think that’s the power of shareholder engagement. Sometimes it takes years and years of work. Sometimes it takes slowly escalating through quiet dialogue, then noisy dialogue, then shareholder proposals, but when that work is done well, and when the other pressure points in the constellation around the company also come into alignment, it can be a very powerful tool.
Justin Hendrix:
Michael, in my past conversations with you, I’ve thought of you as one of those people slowly pushing the rock up the hill. Can you speak to your experience perhaps? I know in particular working on companies like Meta is frustrating given the ownership structure, but what keeps you coming back?
Michael Connor:
One of the things that I think shareholders can do most effectively, one of the requests shareholders can make, and really one of the best tools we have, is disclosure. I’ll put one of our wins in personal terms. About a decade ago, Edward Snowden had revealed that AT&T and Verizon had basically created a pipeline for all of their metadata to be sent to the NSA. At the time, working with Trillium and with Jonas, we had requests in at both companies to issue what were called transparency reports. This was about a decade ago, I think, and that was a very novel thing. Companies just weren’t doing transparency reports. And I remember my phone ringing one day, and it was Jonas, and he was shouting, “They’re going to do it, they’re going to do it. We won.”
And within days, both companies had agreed to publish transparency reports, and that really set off a whole wave: virtually every tech company now has a transparency report. They are imperfect in many ways, but that gives you another tool, which is to go back to companies and say, we know you have a transparency report, but now we want you to disclose more information, and we want you to do it better. To me, that was one of the more fundamental wins in the tech space, certainly: getting transparency reports and establishing that baseline, that companies did have to disclose a certain level of information.
Justin Hendrix:
Christina, I want to come to you. There’s an election going on in India right now. I think one of the more significant campaigns that I’m aware of that Ekō helped lead recently was around urging an inquiry into political bias at Meta in India. Can you speak to the efficacy of that campaign? Of course it ultimately failed, you didn’t get that inquiry, but I assume you’ve raised awareness quite a lot in collaboration with the good folks over at the Internet Freedom Foundation and a couple of other places as well.
Christina O’Connell:
Very timely question, because this morning one of my colleagues, along with ICW, released a report, which I think the Guardian picked up on, about Meta’s failure to catch both AI-generated and Islamophobic, violence-inciting advertisements in India. They were able to place 14 out of 22 sample ads that violated all of Meta’s terms in India. Now, the ads never actually ran, because we pulled them before they would go public, but as of today, this is accurate.
And about two weeks ago, we were also able to report on a $1 million shadow network of fake advertisers that were placing similarly, truly vile Islamophobic and misogynistic advertisements on Meta. These were all things that Meta claims to moderate, but that its attempts at content moderation have not caught. And one of the resolutions that we’re involved with this year is one lead-filed by AkademikerPension, an academic pension fund out of Denmark, and Storebrand, a major Norwegian asset management firm, asking Meta for a fuller report on its content moderation in its five largest non-US markets.
I think when we talk about technology issues, disinformation, et cetera, we all too often forget that the biggest markets are often not the US. I think we tend to have our blinders on: if we think about dangers and harms, we’ll think about kids seeing something on Instagram or something. We don’t think about the really, incredibly dangerous material that is fed to massive audiences in countries like India and Brazil. We’re very concerned about that, concerned both on the AI issue, and happy to work with Arjuna and Open MIC on the AI resolution, and also working with our Scandinavian friends on the issue of content moderation and the problems with Meta’s content moderation. I think a lot of what we talk about in shareholder engagement, shareholder activism, et cetera, is the whole issue of transparency.
And I think it’s very important to understand that transparency is somewhat of a two-way street: we’re asking for this openness, but the openness is also valuable to the company. So it’s not just that we’re being nosy busybodies wanting to know things. Before I worked for Ekō, I worked in sustainability consulting for almost 25 years with Fortune 500 companies, including Walmart and Levi’s and the World Bank and such. And what I saw, from the other side of the table, was that shareholder proposals and shareholder engagements led to companies looking at their own performance and discovering things about their performance that they had missed. Winning with a shareholder proposal is not getting a certain percentage of the vote; it’s raising an issue with the company and giving them the incentive to think more carefully about it.
When the board has to take notice of a proposal, they then turn around and say to the managers, “Hey, what’s going on here?” And I always thought that you could see, over several years of proposals, the interest and concern growing. By the time you hit 30, 40%, folks on the board are going, “We really better deal with this in some way.” So it’s a very interesting dialogue between the owners of the company, who are both investors and also consumers and citizens, and the company’s management about the things that they might miss.
I think for a victory, in addition to the work we’ve done in India, which I’m very excited by (I’m very proud of my colleagues’ work there), and the work we’ve done in Brazil, we also worked for several years on a proposal with Apple on the issue of Uyghur forced labor; there were reports of forced labor in China in Apple’s supply chain. We filed the proposal several years in a row, and just a year ago we were able to reach an agreement and withdraw our proposal, because Apple agreed to finally give much more transparent reporting, not just on whether they had forced labor in their supply chain, but on how they were able to actually investigate that and demonstrate it.
So it wasn’t just take us on our word, we’re going to tell you we’re being good, but just how could they actually find out? And so we were able to bring expertise that we have, work that I had done in the past, for example, on supply chain into the conversation on behalf of shareholders and actually ask for clear information on how they were doing with their supply chain. We’re very proud of that.
I think that at Microsoft, certainly, the report, as Natasha mentioned, is a good first step. Next, and I’ll say this for Ekō, we want to see numbers: metrics and measurements. That’s what I always told my clients was needed when I was consulting, not just policies, and that’s what we expect to see going forward.
Meta, unfortunately, seems with India to be so closely embedded with the Modi government that it’s a big problem, but one that we’re working with a number of Indian civil society organizations to address as much as we can.
Justin Hendrix:
Certainly sounds like some real victories there amongst the three of you. And I want to ask a couple of questions in the few minutes we have left. One is how you see your work in shareholder activism connected to other accountability work that’s going on around tech, more traditional public activism or accountability activism. And I’m also interested in how regulatory regimes, especially in Europe, that are also demanding certain types of transparency are changing the game a bit, or if they are changing the way that you work.
Jessica Dheere:
I think without a doubt, it’s having an effect, certainly the fines and the repercussions, and at least having the framework out there that the EU AI Act follows, which is a sort of high-, medium-, and low-level risk framework. I don’t think it’s perfect, and there are a lot of criticisms that civil society has, but it is a first. And I think on the issues of transparency and disclosure, most of the regulation, or the proposed regulation, that we see is asking essentially for many of the same things that we are asking for in our resolutions, which are coming up next week and the week after at Meta and Alphabet: asking for metrics about how these companies are handling the risks and how they’re measuring, basically, the effectiveness of the policies and the principles that they’ve put in place. We’re asking for more data on that, more structured data and year-over-year data, as Natasha was saying. I think the Microsoft report that Natasha mentioned is really important.
They had already said that they were going to issue a report based on the voluntary commitments that they made with the White House last summer, but the language that they used was very reflective of the language that we included in the resolution. And to me, that was the indication that they were listening. The vote we earned on Microsoft, I think in December, was 21%. That may not sound like a lot, but that’s a pretty strong minority, and it definitely signals to the company, in the midst of all the other media activity around AI, that they need to pay attention. There was another vote on an Apple proposal that the AFL-CIO presented earlier this spring. It was slightly different, but it earned a 37.5% vote. And so we’re really hoping, because we have gotten some endorsements from proxy advisors, at least on the Meta proposal so far, that we’re going to see stronger votes. What we’re asking for is what regulators are asking for, and that kind of alignment, I think, really only points in one direction.
Justin Hendrix:
Natasha, do you want to jump in there?
Natasha Lamb:
I think the relationship between the investor and the company is such an important one. To be able to have open and free communication with companies and with other investors, to think about these ideas, to crack open the echo chamber between management and the board, where so often the CEO is also the chair of the board, it’s really important to have other voices coming to the table that are identifying the risks, identifying the opportunities, and bringing forward proposals that are non-binding. But it’s all about sending a signal and having open communication. And as Jessica was just talking about, the votes at the end of the day, okay, it’s 21%, it’s 37%, and for an emerging issue it could be 10%. That does not matter as much as having that open communication, sending the signal, having real shareholder democracy where investors, as owners of companies, can vote their shares.
The board is working for its investors, and investors have a really important voice, and we’ve seen low votes, we’ve seen high votes. We had a proposal at Microsoft a couple of years before that got a 78% vote. It was on sexual harassment, and that’s a unicorn. I think at the time we had $20 million in stock for our clients, and $2 trillion voted in favor of that proposal, and that’s 100,000 times the leverage. And Microsoft responded: they did an investigation, they then became the leader in the United States on how to address sexual harassment at a company, and they’re setting an example for others.
So that is such high value to a company that had been rife with issues for years. It’s hard to attract and retain talent when you have sexual harassment issues, and now they were able to create a new playbook for how you deal with this and how you become a leader. And so in our engagements with these companies, what we’re asking is for them to take leadership, and that’s just going to make them better investments at the end of the day.
Justin Hendrix:
Jonas, do you want to come back around to that question, especially around Europe and the extent to which maybe regulatory regimes are driving the types of transparency that you want to see?
Jonas Kron:
Yeah, absolutely. I will keep this quick and just make a couple of different points. I do think that good public policy, good regulation that is informed by the democratic process, and regulatory processes that take into account various stakeholders is definitely the best way to be moving forward on this. And I think what’s happening in Europe can really set the stage, because of the nature of the internet, for what’s going to be happening everywhere in the world. Those developments are critically important, and the US has a lot to learn from them, and investors can be a vehicle to bring those lessons to the United States, to regulators.
I think more generally, and to your point about how shareholder advocacy interacts with other players in society, I am reminded of the trope that when it comes to some of these big problems, there is no silver bullet, there’s only silver buckshot, and everybody needs to do what they can and use the tools that they have to the best effect to try to address the issue.
And then lastly, I think Natasha made a really important point in terms of signals. What investors are doing is conveying market signals to management and to boards. If you spend any time in investor communities, there’s a lot of talk about what the investor signal is, what it is that the investors want here. And what we are talking about today, the things that we are doing, these are market signals, and we are just deploying them in a way that I think most people aren’t familiar with or don’t necessarily have experience with. But these are really important market signals that have been part of our markets and market processes for decades and decades. And, as I think either Natasha or Jessica said earlier, we are opening the aperture.
Justin Hendrix:
Michael, last word from you.
Michael Connor:
I think you asked about the relationship between shareholder activism and other forms of accountability activism. That’s a good question, and it’s important to keep in mind, as Jonas just hinted at, that shareholder activism is really only one tool in the activism toolbox. Sustainable investors certainly have their victories, as we’ve discussed, but it’s rare that shareholders score a victory on an issue all by themselves. Shareholder advocacy works best when it’s in sync with, and in the context of, a larger movement. So shareholders need to be informed, they need to be connected with, and importantly represent, the rights holders who are affected on a particular issue. We can’t, and shouldn’t, do this in a vacuum. Open MIC, for example, takes part in the civil rights and technology table; we’ve been a member for over a decade now. We meet every month with about 30 different organizations. None of them are involved in shareholder advocacy, but their work informs us and informs our agenda. They provide us with a firsthand view, whether it be about an issue like facial recognition, the need to enact federal privacy protections, or artificial intelligence. They provide the really important human rights and civil rights perspective that we need. So it’s critically important that shareholder advocacy be aligned and work with other forms of advocacy, because that’s when you can bring about real change.
Justin Hendrix:
Well, I hope my listeners have a little better sense of this lane of activity when it comes to tech accountability. Is there a place where folks can go to learn more about each of your organizations’ work?
Michael Connor:
www.openmic.org.
Jonas Kron:
trilliuminvest.com.
Christina O’Connell:
And we’re eko.org. And if folks would like to see the reports on India, they’ve just come out, just drop a note to info@eko.org and we’ll make sure you get a copy.
Natasha Lamb:
And we’re at Arjuna-capital.com.
Justin Hendrix:
I appreciate you all taking the time to speak to me today.
Natasha Lamb:
Thanks Justin.
Jessica Dheere:
Thank you.
Christina O’Connell:
Thanks Justin.