I think what Twitter and X have done with community notes is just a better program. Rather than having a small number of fact checkers, you get the whole community to weigh in. When people who usually disagree on something tend to agree on how they're voting on a note, that's a good sign to the community that there's actually a broad consensus on this, and then you show it. And you are showing more information, not less. Right. So you're not using the fact check as a signal to show less; you're using the community note to provide real context and show additional information. So I think that's better.
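The mechanism described here, surfacing a note only when raters who usually disagree both find it helpful, is the core of Community Notes' "bridging" ranking. A minimal sketch of that idea, with viewpoint scores and a threshold that are purely illustrative stand-ins (the production system learns rater factors via matrix factorization rather than taking them as given):

```python
# Illustrative sketch of "bridging"-style note ranking: a note is surfaced
# only when raters who usually disagree (opposite viewpoint scores) both
# rate it helpful. Viewpoint scores and the 0.5 threshold are hypothetical
# stand-ins, not the actual Community Notes algorithm.

def note_is_helpful(ratings, rater_viewpoint, min_support=0.5):
    """ratings: dict rater_id -> bool (True = rated the note helpful).
    rater_viewpoint: dict rater_id -> float in [-1, 1], learned elsewhere."""
    left = [r for r in ratings if rater_viewpoint.get(r, 0) < 0]
    right = [r for r in ratings if rater_viewpoint.get(r, 0) > 0]
    if not left or not right:
        return False  # no cross-viewpoint evidence yet
    left_support = sum(ratings[r] for r in left) / len(left)
    right_support = sum(ratings[r] for r in right) / len(right)
    # Require majority support on *both* sides, not just in aggregate.
    return left_support >= min_support and right_support >= min_support


if __name__ == "__main__":
    ratings = {"a": True, "b": True, "c": True, "d": False}
    viewpoints = {"a": -0.8, "b": -0.3, "c": 0.7, "d": 0.9}
    # True: the note draws helpful ratings from both sides of the spectrum.
    print(note_is_helpful(ratings, viewpoints))
```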
When you're talking about nation states, or people interfering, a lot of that stuff is best rooted out at the level of, kind of, accounts doing phony things. So whether it's China, or Russia, or Iran, or one of these countries, they'll set up these networks of fake accounts and bots, and they coordinate and post on each other's stuff to make it seem like it's authentic, and kind of convince people like, wow, a bunch of people must think this. And the way that you identify that is you build AI systems that can basically detect that those accounts are not behaving in a way that a human would.
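As a rough illustration of the kind of behavioral signal described above, and not Meta's actual detection stack, coordinated fake-account networks tend to leave statistical fingerprints that ordinary users don't, for example many accounts posting and amplifying on near-identical schedules. A toy sketch with made-up thresholds:

```python
# Toy sketch of coordination detection: flag account pairs whose posting
# times line up far more often than chance. Real systems combine many more
# signals (device, network, content, graph structure); the window and
# overlap thresholds here are invented for illustration.
from itertools import combinations

def coordinated_pairs(post_times, window=5, min_overlap=0.8):
    """post_times: dict account_id -> list of posting timestamps (seconds).
    Returns pairs whose posts fall within `window` seconds of each other
    at least `min_overlap` of the time."""
    flagged = []
    for a, b in combinations(post_times, 2):
        hits = sum(
            1 for ta in post_times[a]
            if any(abs(ta - tb) <= window for tb in post_times[b])
        )
        if post_times[a] and hits / len(post_times[a]) >= min_overlap:
            flagged.append((a, b))
    return flagged

if __name__ == "__main__":
    times = {
        "bot_1": [0, 60, 120, 180],
        "bot_2": [2, 61, 118, 183],
        "human": [15, 400, 9000],
    }
    print(coordinated_pairs(times))  # [('bot_1', 'bot_2')]
```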
A lot of what we've seen too, I mean, there are the anonymous accounts, but also I think over time a lot of the kind of more interesting conversations have shifted from the public sphere to more private ones. So WhatsApp groups, private groups on Facebook. I'm sure you've had this experience where maybe 10 years ago you would have posted your kind of quick takes on whatever social media you're using. Now, you know, the stuff I post on Facebook and Instagram, I put time into making sure that's good content that I want to be seen broadly. And then most of the jokes that I make are with my friends in WhatsApp, right, in groups.
I was really worried from the beginning about basically becoming this sort of decider of what is true in the world. Right. That's kind of a crazy position to be in when billions of people are using your service.
[Podcasts] Well, it's a new medium. I mean, I'm sure you know the history on this. It's like when people transitioned from radio to TV, the initial TV anchors were the same radio people, just being filmed while speaking as if on the radio. But it turned out it was actually a completely different type of person that you need. Because on radio it's just your voice and your cadence, and all that. You know the whole phrase, "you've got a good radio voice." Right. Okay, on TV you need to be telegenic, right, you need to have charisma. In that medium it's a completely different thing. I think that's going to be true for the internet too.
The Supreme Court has this clear precedent. It's like, all right, you can't yell "fire" in a crowded theater. There are times when, if there's an emergency, your ability to speak can temporarily be curtailed in order to get the emergency under control. So I was sympathetic to that at the beginning of Covid.
I think at some level you only start one of these companies if you believe in giving people a voice. Right, I mean, the whole point of social media is basically, you know, giving people the ability to share what they want.
In the beginning [of Covid] it kind of seemed like, okay, we should give a little bit of deference to the government and the health authorities on how we should play this. But then it went from, you know, two weeks to flatten the curve, to, in the beginning it was like, okay, there aren't enough masks, masks aren't that important, to then, oh no, you have to wear a mask, and everything was shifting around. It just became very difficult to follow. And this really hit the most extreme, I'd say, during the Biden administration when they were trying to roll out the vaccine program.
I'm generally, like, pretty pro rolling out vaccines. I think on balance the vaccines are more positive than negative. But I think that while they were trying to push that program, they also tried to censor anyone who was basically arguing against it. And they pushed us super hard to take down things that, honestly, were true.
It's so complicated, this system, that I could spend every minute of all of my time doing this and not actually focus on building any of the things we're trying to do: AI, glasses, the future of social media, all that stuff. So I get involved in this stuff, but in general we have a policy team. There are people I trust. They're the people working on this on a day-to-day basis.
Basically, these people from the Biden administration would call up our team and, like, scream at them and curse. And these documents are all kind of out there. ... The emails are published. It's all kind of out there. And basically it just got to this point where we were like, no, we're not going to take down things that are true, that's ridiculous. ... We just said no. We're not going to take down humor and satire. We're not going to take down things that are true. And then at some point, I guess, I don't know, it flipped a bit. I mean, Biden gave some statement at some point, I don't know if it was at a press conference or to some journalist, where he basically was like, these guys are killing people. And, I don't know, then all these different agencies and branches of government basically just started investigating and coming after our company. It was brutal. It was brutal.
1984 is like an instruction manual. It shows you how things can go that way, with "wrong speak" and with bizarre distortion of facts. And when it comes down to it, in today's day and age, the way people get information is through your platform [Facebook], through X. This is how people are getting information. They're getting information from YouTube. They're getting information from a bunch of different sources now, and you can't censor real, legitimate information just because it's not ideologically convenient for you.
Today’s social networks, Facebook chief among them, were built to encourage the things that make them so harmful. It is in their very architecture.
Megascale is nearly the existential threat that megadeath is. No single machine should be able to control the fate of the world’s population—and that’s what both the Doomsday Machine and Facebook are built to do.
The cycle of harm perpetuated by Facebook’s scale-at-any-cost business model is plain to see. Scale and engagement are valuable to Facebook because they’re valuable to advertisers. These incentives lead to design choices such as reaction buttons that encourage users to engage easily and often, which in turn encourage users to share ideas that will provoke a strong response.
Every time you click a reaction button on Facebook, an algorithm records it, and sharpens its portrait of who you are. The hyper-targeting of users, made possible by reams of their personal data, creates the perfect environment for manipulation—by advertisers, by political campaigns, by emissaries of disinformation, and of course by Facebook itself, which ultimately controls what you see and what you don’t see on the site.
At megascale, this algorithmically warped personalized informational environment is extraordinarily difficult to moderate in a meaningful way, and extraordinarily dangerous as a result.
The biggest social platforms claim to be similarly neutral and pro–free speech when in fact no two people see the same feed. Algorithmically tweaked environments feed on user data and manipulate user experience, and not ultimately for the purpose of serving the user.
I don’t even know what others with personalized experiences are seeing.
Facebook’s megascale gives Zuckerberg an unprecedented degree of influence over the global population. If he isn’t the most powerful person on the planet, he’s very near the top. “It’s insane to have that much speechifying, silencing, and permitting power, not to mention being the ultimate holder of algorithms that determine the virality of anything on the internet. ... The thing he oversees has such an effect on cognition and people’s beliefs, which can change what they do with their nuclear weapons or their dollars.”
Right now, too many people are allowing algorithms and tech giants to manipulate them, and reality is slipping from our grasp as a result.
You have to have a significant volume of users in the ecosystem, tens to hundreds of millions of users, before advertising makes sense. Facebook and Twitter did not launch with feeds filled with ads. All the major platforms lost money for years, slowly growing and bringing in more users, until they became indispensable features of modern life. Then, and only then, did they introduce advertising.
If the adversary cannot entice us to misuse our physical bodies, then one of his most potent tactics is to beguile you and me as embodied spirits to disconnect gradually and physically from things as they really are. In essence, he encourages us to think and act as if we were in our premortal, unembodied state. And, if we let him, he can cunningly employ some aspects of modern technology to accomplish his purposes. Please be careful of becoming so immersed and engrossed in pixels, texting, earbuds, twittering, online social networking, and potentially addictive uses of media and the Internet that you fail to recognize the importance of your physical body and miss the richness of person-to-person communication.
Twitter! Never have lives less lived been more chronicled!
In, like, ’08, ’09, smartphones came on, and kids stopped living their lives and started watching people live their lives, and so we saw the biggest spike and the highest levels of depression, anxiety, loneliness, and suicidality since records have been kept, and it's just continued on and on and on.
This is just my strong, intuitive sense ... that having a public platform that is maximally trusted and broadly inclusive is extremely important to the future of civilization
I play the fool on Twitter and often shoot myself in the foot and cause myself all sorts of trouble ... I don't know, I find it vaguely therapeutic to express myself on Twitter. It's a way to get messages out to the public.
The reason I acquired Twitter is because it is important to the future of civilization to have a common digital town square, where a wide range of beliefs can be debated in a healthy manner, without resorting to violence. There is currently great danger that social media will splinter into far right wing and far left wing echo chambers that generate more hate and divide our society... That is why I bought Twitter. I didn't do it because it would be easy. I didn't do it to make more money. I did it to try to help humanity, whom I love ... That said, Twitter obviously cannot become a free-for-all hellscape where anything can be said with no consequences!
Is there a conspiracy theory about Twitter that didn’t turn out to be true? So far they’ve all turned out to be true — if not more true than people thought.
Better to talk to people than communicate via tweet.
Best way to fight misinformation is to respond with accurate information, not censorship
I'm sort of worried that hey, civilization, if we don't make enough people to at least sustain our numbers, perhaps increase a little bit, then civilization's going to crumble.
I'll say what I want to say, and if the consequence of that is losing money, so be it.
I would urge parents to limit the amount of social media that children can see because they're being programmed by a dopamine-maximizing AI.
Facebook’s promised metaverse is about distracting us from the world it’s helped break.
Big Tech’s critics now include both Democrats who fear manipulation by domestic and foreign extremists and Republicans who think the large platforms are biased against conservatives.
Since digital platforms both wield economic power and control communication bottlenecks, these companies have become a natural target...
It is true that digital markets exhibit certain features that distinguish them from conventional ones. For one thing, the coin of the realm is data. Once a company such as Amazon or Google has amassed data on hundreds of millions of users, it can move into completely new markets and beat established firms that lack similar knowledge. For another thing, such companies benefit greatly from so-called network effects. The larger the network gets, the more useful it becomes to its users, which creates a positive feedback loop that leads a single company to dominate the market. Unlike traditional firms, companies in the digital space do not compete for market share; they compete for the market itself. First movers can entrench themselves and make further competition impossible. They can swallow up potential rivals, as Facebook did by purchasing Instagram and WhatsApp.
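The feedback loop described above is often given a rough formalization as Metcalfe's law; the exact exponent is contested, but the quadratic intuition captures why first movers become so hard to displace:

```latex
% Metcalfe's law (a common, contested approximation of network effects):
% with $n$ users there are $\binom{n}{2}$ possible pairwise connections,
% so the value of the network is modeled as growing roughly quadratically
% rather than linearly in $n$:
\[
  V(n) \;\propto\; \binom{n}{2} \;=\; \frac{n(n-1)}{2} \;\approx\; \frac{n^{2}}{2},
\]
% which is why each additional user makes the incumbent more useful to all
% the others, and why such firms compete for the market itself rather than
% for a share of it.
```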
Since 2016, Americans have woken up to the power of technology companies to shape information. These platforms have allowed hoaxers to peddle fake news and extremists to push conspiracy theories. They have created “filter bubbles,” an environment in which, because of how their algorithms work, users are exposed only to information that confirms their preexisting beliefs. And they can amplify or bury particular voices, thus having a disturbing influence on democratic political debate. The ultimate fear is that the platforms have amassed so much power that they could sway an election, either deliberately or unwittingly.
Consider also that the platforms—Amazon, Facebook, and Google, in particular—possess information about individuals’ lives that prior monopolists never had. They know who people’s friends and family are, about people’s incomes and possessions, and many of the most intimate details of their lives. What if the executive of a platform with corrupt intentions were to exploit embarrassing information to force the hand of a public official?
Digital platforms’ concentrated economic and political power is like a loaded weapon sitting on a table. At the moment, the people sitting on the other side of the table likely won’t pick up the gun and pull the trigger. The question for U.S. democracy, however, is whether it is safe to leave the gun there, where another person with worse intentions could come along and pick it up. No liberal democracy is content to entrust concentrated political power to individuals based on assumptions about their good intentions. That is why the United States places checks and balances on that power.
In view of the dim prospects of a breakup, many observers have turned to “data portability” to introduce competition into the platform market. Just as the government requires phone companies to allow users to take their phone numbers with them when they change networks, it could mandate that users have the right to take the data they have surrendered from one platform to another. The General Data Protection Regulation (GDPR), the powerful EU privacy law that went into effect in 2018, has adopted this very approach, mandating a standardized, machine-readable format for the transfer of personal data.
Data portability faces a number of obstacles, however. Chief among them is the difficulty of moving many kinds of data. Although it is easy enough to transfer some basic data—such as one’s name, address, credit card information, and email address—it would be far harder to transfer all of a user’s metadata. Metadata includes likes, clicks, orders, searches, and so on. It is precisely these types of data that are valuable in targeted advertising. Not only is the ownership of this information unclear; the information itself is also heterogeneous and platform-specific. How exactly, for example, could a record of past Google searches be transferred to a new Facebook-like platform?
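To make the contrast between portable "basic data" and platform-specific metadata concrete, here is a hypothetical GDPR-style machine-readable export; every field name is invented for illustration. The profile block would move to a new service cleanly, while the behavioral records are only meaningful in terms of the originating platform's own object IDs and ranking model:

```python
# Hypothetical machine-readable export in the spirit of GDPR data
# portability. Field names are invented for illustration. The "profile"
# block ports cleanly to another service; the "metadata" block is
# platform-specific: post IDs, ad IDs, and search strings carry no meaning
# outside the platform that generated them.
import json

export = {
    "profile": {                      # portable "basic data"
        "name": "Jane Doe",
        "email": "jane@example.com",
    },
    "metadata": {                     # platform-specific behavioral data
        "likes": ["post/8675309", "page/acme-widgets"],
        "ad_clicks": [{"ad_id": "a-442", "ts": "2020-10-02T14:03:00Z"}],
        "searches": ["best hiking boots", "flights to Lisbon"],
    },
}

print(json.dumps(export, indent=2))
```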
Such rules are designed to address one of the most potent sources of platform power: the more data a platform has, the easier it is to generate more revenue and even more data.
More broadly, such laws would close the door on a horse that has long since left the barn. The technology giants have already amassed vast quantities of customer data. As the new Department of Justice lawsuit indicates, Google’s business model relies on gathering data generated by its different products—Gmail, Google Chrome, Google Maps, and its search engine—which combine to reveal unprecedented information on each user. Facebook has also collected extensive data about its users, in part by allegedly obtaining some data on users when they were browsing other sites. If privacy laws prevented new competitors from amassing and using similar data sets, they would run the risk of simply locking in the advantages of these first movers.
The public should be alarmed by the growth and power of dominant Internet platforms, and there is good reason why policymakers are turning to antitrust law as a remedy. But that is only one of several possible responses to the problem of concentrated private economic and political power.
I thought about building new social features into my clone until I heard my friend’s story. The first rule of social software design is that more engagement is better, and that the way you get engagement is by adding stuff like Like buttons and notifications. But the last thing I wanted was to somehow hurt the conversation that was happening, because the conversation was the whole reason for the thing.
Imagine if podcasts were Twitterized in the sense that people cut up and reacted to individual segments, say a few minutes long. The content marketplace might shift away from the bundle—shows that you subscribe to—and toward individual fragments. The incentives would evolve toward producing fragments that get Likes. If that model came to dominate, such that the default was no longer to subscribe to any podcast in particular, it seems obvious that long-running shows devoted to niches would starve.
Lanier's been lecturing on one topic or another for decades, and his arguments have gradually become less abstract and more pointed as time has gone on and certain platforms, like Facebook, Instagram, Google, Twitter, and YouTube, have annexed more and more of our lives. His thoughts on this subject have been influential enough that they may sound familiar to you by now: That anytime you are provided with a service, like Facebook, for free, you are in fact the product being sold. That social media companies are basically giant behavior-modification systems that use algorithms to relentlessly increase “engagement,” largely by evoking bad feelings in the people who use them. That these companies in turn sell the ability to modify your behavior to “advertisers,” who sometimes come in the old form of people who want to persuade you to buy soap but who now just as often come in the form of malevolent actors who want to use their influence over you to, say, depress voter turnout or radicalize white supremacists. That in exchange for likes and retweets and public photos of your kids, you are basically signing up to be a data serf for companies that can make money only by addicting and then manipulating you. That because of all this, and for the good of society, you should do everything in your power to quit.
One of the most depressing theories Lanier proposes in Ten Arguments for Deleting Your Social Media Accounts Right Now concerns the efficacy of social-media-based activism. In the book, he suggests that the very same media used to organize and connect people with a shared viewpoint—this powerful resource for activists looking to foment change—can end up emboldening their opponents. The way it works, according to Lanier, is that the algorithm takes a positive social movement, such as Black Lives Matter, and shows it to a bunch of people who are inclined to be enraged by it, introduces them to one another, and then continues to rile them up for profit, until they're even more fearsome and effective than the movement to which they were reacting.
Every day Google and Facebook and other tech companies become more powerful and sophisticated by analyzing you and your choices—what you click on, how long you pause to watch an ad or a YouTube video—and the stories you write and the songs you record, and they charge advertisers money to access this information, and grow their own companies with it, but they don't pay you for your contribution. They don't even really acknowledge that you are contributing, as if artificial intelligence came from nowhere, instead of from data derived from you and me.
You tag people you claim to love in unflattering photos where they look terrible... But you know who doesn't look terrible in those photos? You. Never you. Too many photos of your food, your body, your vacation, your feet, your feet on vacation. You've been warned. You won't hear from me again - until your birthday.
And thanks to social media, we're no longer just keeping up with the Joneses, but we're keeping up with every single person on the planet.
Elder Pearson said the Church’s effort to take the gospel of Jesus Christ to all the world presents a “massive challenge” due to a lack of awareness of the Church. He said as many as 6.6 billion people of the 7.6 billion who inhabit the earth have never heard about the gospel. “Of the remaining 1.0 billion people who likely have heard of the Church, approximately half have an unfavorable impression about us.” “Technology will be a key to overcoming the challenge of awareness,” explained Elder Pearson. “Much of the content needed for websites and YouTube must come from independent sources … and individual members of the Church. Your voices must be heard on social media, wherever you live.” He continued: “We must be a voice for truth. We must have the faith and courage to speak up and engage in social media in a positive, responsible, non-contentious and effective way. We can simply share what we know and believe with others.”
Because platforms like Instagram and Facebook present highly curated versions of the people we know and the world around us, it is easy for our perspective of reality to become distorted.
Facebook is a time vampire that doesn't offer you much beyond connecting with friends you likely wouldn't miss over time anyway.
They banned a sitting president from Twitter, impeached him twice, jailed his supporters, raided his home, and indicted him four times... and THEN got on T.V. and told us that WE'RE the FASCISTS!
And I also think it’s wise to keep the government away from it. Governments have force as their only real weapon. You don’t want force deciding the art of persuasion or deciding the art of communication with social media.
Yes, yes, that’s exactly the point. And that’s why I say you don’t want to open that door. Because even if you would like the policies that the current administration might employ if it started stepping into this arena, that’s good for now, if you agree with it. But it’s not good for, whether it’s a few months or a few years from now, whenever circumstances might change. And it’s just terrible precedent long term. This stuff doesn’t belong to the government. It’s not the government’s tool to play with. We need to keep the two of them separated.
Sen. Mike Lee (R-UT) stated that the government should be kept “as far away” from the political handling of social media bias as possible, and warned that “even if you would like the policies that the current administration might employ if it started stepping into this arena,” you probably won’t like the policies that would be enacted by a Democratic administration using the same power.
We live in a world of comparison. Social media has made this worse as we go online and compare our seemingly less exciting lives with the “fake lives” we see online. Many of those fake lives are edited, boastful, and unreal. Some may have unrealistic expectations that they should be happy all the time, and if they are not, they feel like something is wrong with them.
The tragedy of Quora is not just that it crushed the flourishing communities it once built up. It’s that it took all of that goodwill, community, expertise, and curiosity and assumed that it could automate a system that equated it, apparently without much thought to how pale the comparison is.
It’s the dynamic at play on Facebook, where the company throws family members, lifelong friends, and chance acquaintances — strong ties and weak ties, to use the sociological terminology — into your feed so that, over time, you stop being able to distinguish them, stop being able to tell who your real friends are or what a real friend even is.
In an op-ed published by CoinDesk, Janine Yorio and Zach Hungate of Everyrealm, “a metaverse-focused innovation firm and investment fund,” argue that the metaverse “will allow us to do things we cannot do in reality, much as video games do. We can destroy things and kill people without fear of punishment or retribution. We can be risqué and push cultural and societal norms beyond traditional boundaries, cloaked by anonymity and invincibility in the metaverse. We can fly, experiment with drugs, and cheat on our partners.” To be clear, these are people who think the metaverse is a good idea. The primary attraction of the metaverse, per Yorio and Hungate, is that none of the normal rules and obligations we have to one another apply. The real world, with its endless laws and limitations, is mainly there to showcase the endless plasticity of the virtual one;
In my experience, though, this upending of social norms has a strange flattening effect on interactions in virtual reality... You can see that same flattening effect brought to life, if that’s the word, in Horizon Worlds, where users choose their own avatars, but with Meta’s template, all end up looking somehow the same: joyless, determinedly winsome cartoons of themselves, like something from an Intro to French textbook. Everybody’s the same height here in Horizon Worlds; everybody’s face is symmetrical. Almost nobody is fat or old, age usually being signified only by white hair, as if it were just some nonintuitive fashion choice.
And privacy? Forget about it. We are destined to become like tagged bears, constantly tracked, but too addicted to the data stream to switch our intimate devices off.
The influencer industry is simply the logical endpoint of American individualism, which leaves all of us jostling for identity and attention but never getting enough.
The world has shrunk since the advent of the smartphone. The women of Tokyo dress the same as the women in Paris dress the same as the women in New York. We scroll and scroll through our individualized content streams, but we’ll never reach the end; in this atomization of experience, it is hard to see the collective point of it all.
Magazines and corporations and political campaigns and non-profits alike all needed someone to run the social media accounts they now relied on while simultaneously professing to disdain.
There is a latent conservatism to all this fantasizing about a life lived without the corrupting influence of the internet—but the cat is out of the bag. The internet is here to stay. The cage has descended, and the forms in which we write have changed—our sentences are different from the long and twisted parentheticals of Henry James.
Twitter and Facebook and LinkedIn and Google+ were never intended to be your virtual storefront. It’s not what they were built for and it’s not appropriate. Instead, you use them as outposts to start talking with people who may eventually become customers. (Digital Sharecropping)
Landing pages are the key to measuring the effectiveness of what you do with social media. If you’re having trouble figuring out whether your social media marketing is effective, it’s because you haven’t thought through your landing page strategy.
Whether a piece of news spreads online does not depend on whether it is true and coherent, but whether it is surprising, shocking and confirms prejudices. It can bounce endlessly in virtual echo-chambers—even if it is patently false.
The rise of fake news and spread of filter bubbles, where people see their pre-conceptions reinforced online, have probably disillusioned many voters. Facebook has had a hand in spreading misinformation, terrorism and ethnic violence around the world. But it has also spurred civil engagement.
...in 2019 Facebook boasted nine times the users, 21 times the revenue and 12 times the profit of Twitter (see table). More importantly, the strong network effects are a prime asset that Facebook has defended vigorously: it has spent vast sums on buying firms it considers likely future competitors, such as Instagram, acquired in 2012 for $1bn, and WhatsApp, for which it paid $19bn in 2014.
For something to go viral on a social network, people had to choose to share it. Now they endorse it simply by watching, as the algorithm rewards content that attracts the most engagement.
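A bare-bones sketch of the shift described here: ranking purely by engagement, where passive watching counts as endorsement. The weights are invented for illustration, and real feed-ranking models are far more elaborate:

```python
# Minimal sketch of engagement-weighted ranking: items that attract the most
# engagement, including passive watch time, rise to the top. Weights are
# invented for illustration.
def engagement_score(item, w_watch=1.0, w_share=5.0, w_comment=3.0):
    return (w_watch * item["watch_time_s"]
            + w_share * item["shares"]
            + w_comment * item["comments"])

feed = [
    {"id": "calm-explainer", "watch_time_s": 40, "shares": 1, "comments": 0},
    {"id": "outrage-clip",   "watch_time_s": 90, "shares": 12, "comments": 30},
]
feed.sort(key=engagement_score, reverse=True)
print([item["id"] for item in feed])  # ['outrage-clip', 'calm-explainer']
```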
Twitter is great to connect with fans and be transparent. I enjoy that aspect about it. But really, I'm still trying to figure it out.