[flagged] How YouTube Radicalized Brazil (nytimes.com)
47 points by otterley 7 days ago | 12 comments

Apparently lies and fake news on WhatsApp played a huge role in Bolsonaro’s election. https://www.washingtonpost.com/news/theworldpost/wp/2018/11/...

Disclaimer: All the views expressed here are solely my own, and not those of any company or corporation.

Doubtless, this is an incredibly complex issue, and companies behind recommendation algorithms do have some degree of culpability for the outcomes of the algorithms they design, but I can't help but feel that most of this is an issue with human beings more than it is with algorithmics.

One could presume YouTube has an extreme ideological agenda and has developed its algorithms in service of that agenda, yet a more likely possibility seems to be that YouTube's algorithm simply maximizes on the content that humans are drawn to engage with strongly, which, surprise, surprise, all of history will show you happens to be vicious, inelegant, untruthful, and ultimately disastrous ideologies.
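
To make that concrete, here is a minimal sketch (entirely hypothetical, not YouTube's actual system; the predict_watch_seconds scorer is assumed) of what an engagement-maximizing recommender reduces to. The point is that nothing in the objective penalizes falsehood or harm:

    # Hypothetical sketch of an engagement-driven ranker. The objective
    # rewards predicted watch time only; truthfulness never enters it.
    def rank_recommendations(candidates, user, predict_watch_seconds):
        scored = [(predict_watch_seconds(user, v), v) for v in candidates]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        # Whatever keeps this user watching longest floats to the top,
        # whether it is a cooking tutorial or a conspiracy theory.
        return [v for _, v in scored]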

The problem is not the algorithm. The problem is the masses that believe everything they watch without questioning it. The problem is not internet communities. The problem is the utter lack of localized communal structures in the modern world, which leaves people defenseless in the face of efforts to galvanize them toward some other, hateful view--there are no longer institutional anchors sheltering unwitting individuals from being duped, and passions, blame, and fury still speak loudest among the figures of rhetoric. The problem is not the dissolution of truth and the spread of misinformation, which has always existed. The problem is the increased efficiency with which that misinformation spreads, without an equivalent compensatory development of the intellectual and critical-thinking armature we need to defend ourselves and keep our reason sound and free of influence.

You cannot call on these companies to be stewards of the world's information, grand censors, stiflers of expression in the name of the greater good. That's a tyranny that will leave the blind blind. What we should be focusing on is how we can equip the citizenry to better inform themselves, better guard against new means of propaganda and misinformation, and better imbue them with the critical sense to second-guess the radical and largely unsupported claims they encounter daily. "Fake news" is not a problem solved at the level of the distribution channels, which will always be susceptible to dastardly uses--it is a problem solved at the level of the receivers of the information, who need to be imbued with the critical foresight and reasoning skills required to defend themselves from manipulation.

Yet no one seems to focus on that.

Instead, the pundits and essayists of our day assume the general population will remain a dark and unenlightened herd, susceptible to every trick and puff piece, slaves to the whims of algorithmic wizardry. It's a narrative that buys into the very power structure it seems to want to question, and instead of lauding and promoting the dignity of human thought, the possibility of enlightenment, and the pursuit of a rational future, it has abandoned all to a dark technological determinism, assuming that we've already made the devil's trade of human thinking for algorithmic obedience. We'll never solve this problem so long as we continue treating human subjects as irrational receptors of whatever ideology comes their way instead of trying to promote intellectual edification. The very title of this article reveals this bleak worldview that assumes human beings have no free will against the larger movements of technology--note that the title is not "How Radical Far-Right Politicians Used YouTube to Radicalize Brazil" but rather "How YouTube Radicalized Brazil."

We live in a techno-dystopian fantasy that posits we've already lost our freedom to the "machines".


>The problem is not the algorithm. The problem is the masses that believe everything they watch without questioning it.

I see your point, and I agree about all those sociological problems you point out.

However, that does not mean "the algorithm" (or tech in general) is blameless. People and societies have had these weaknesses for millennia. Technology should work to help us _overcome our weaknesses_ as a society. Instead, tech that ends up favoring flat-earth or extreme political hate videos _preys upon our weaknesses_ for a profit.

The recommendation system is probably entirely "neutral" in the sense that it doesn't pick "sides" itself, sure. But that isn't nearly good enough. We should strive for something much better.


Congratulations on making the most archetypically HN comment ever! You have managed to deflect all responsibility for any misuse of a system--one optimized for feeding ads to people without any care for what content those ads appear on--away from the people who created it and onto the rest of the world.

> One could presume YouTube has an extreme ideological agenda and has developed its algorithms in service of that agenda

Their agenda is "more time spent watching" = "more revenue", but they've induced a public-health crisis in the process. In translating engagement into dollars, YouTube has an ethical obligation not to actively promote content that is provably, unequivocally false and actively harmful to public health, even if it continues to provide a platform for that content.

> You cannot call on these companies to be stewards of the world's information

YouTube and Google already do these things. Not recommending videos is not censorship.
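
A crude sketch of that distinction (all names hypothetical, not any platform's real API): a video can stay hosted and reachable by direct link while simply being excluded from the recommendation pool.

    # Hypothetical sketch: demotion is not deletion. Flagged videos stay
    # hosted and watchable by direct URL; they just aren't promoted.
    def recommendation_pool(candidates, flagged_ids):
        return [v for v in candidates if v["id"] not in flagged_ids]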


> The problem is not the algorithm. The problem is the masses that believe everything they watch without questioning it.

This is wrong. You're basically saying people should have been educated against an emerging technology, a process that can take a generation. You let the technologists off the hook on the basis that if they don't do it someone else will, and you blame the negative outcomes on the uneducated for having failed to educate themselves.

The same argument was used to dismiss the way Facebook facilitated the genocide in Myanmar. For sure, social media companies are not the originators of human evil, nor in any way its monopolists, but they are massively profiting from it despite having the education and knowledge to see how their tools are being abused, and that makes them culpable.


That’s a fair point. It does seem morally reprehensible to be aware of the fact that your platform has become a megaphone for genocide and to profit from it nonetheless. So I agree that on those grounds we should hold Facebook accountable.

Still, this is not a new problem. In 1939 an entire nation was susceptible enough to propaganda to accept global war and genocide, and several other nations were willing to turn a blind eye and ally themselves with that nation. Mass propaganda has functioned successfully for a long time; the only difference is that it used to be driven primarily by national interests, while now it's driven by global iconoclasts on platforms that have economic interests but are otherwise neutral. The reason populations in '39 were susceptible to harmful ideology is the same as the reason they are today--they are cultivated in environs that, whether because of material limits or cultural emphasis, do not emphasize instilling in the wider population the ability to conduct critical analysis of argumentation.

It's important to recognize that while technology has certainly accelerated the spread of harmful thought and populist rhetoric, it has not fundamentally changed the nature of that rhetoric or its function. Totalitarianism today looks much the same as it did a few years ago, and as it did even further back in history--only its vehicle has changed. In fact, one of the prime errors in the analysis of modern ills seems to lie in willful ignorance of history, which differs from our day in flavor but remains the same in essence.

Also, I'm not making the argument that we should have adapted to technologies we couldn't have foreseen--I'm saying the time to adapt is now, not a few years ago. I simply think that focusing too heavily on attacking the technologists (now) is the wrong angle and will fail to address the whole issue (though of course, holding them accountable is definitely a big part of the story), simply ensuring the problem lives on in another form.

The goal should be better education systems, particularly ones that focus on producing rational human subjects, not economic cogs (what public education, for the most part, produces today) that, while plenty efficient (and pleasingly expendable) for corporations, have none of the discerning judgement or humanistic reasoning needed to save them from political radicalization.


We're in general agreement, but the problem with the educational approach is the relatively long timeframe over which it takes place. Sticking with your WW2 example: some people certainly saw the risks by the time of the Nazi regime's ascent to power in 1933, but they lacked the time, as well as the will, to adapt quickly enough. We have less of an excuse, because while a fully networked society is certainly novel, we have far more historical perspective than earlier generations from which to understand the downside risks.

> but I can't help but feel that most of this is an issue with human beings more than it is with algorithmics.

That's a little like saying we should do away with vaccines and antibiotics because the problem isn't disease, it's a poor immune system.

> maximizes on the content that humans are drawn to engage with strongly, which, surprise, surprise, all of history will show you happens to be vicious, inelegant, untruthful, and ultimately disastrous ideologies.

I don't agree with this view at all. The analogy I typically use is that people will keep eating potato chips and never feel full if they never get a good meal. Vilifying potato chips makes no sense, but vilifying people for only ever serving potato chips and no real meals makes perfect sense. That's where the real blame lies.

> What we should be focusing on is how we can equip the citizenry to better inform themselves, better guard against new means of propaganda and misinformation, and better imbue them with the critical sense to second-guess the radical and largely unsupported claims they encounter daily. "Fake news" is not a problem solved at the level of the distribution channels, which will always be susceptible to dastardly uses--it is a problem solved at the level of the receivers of the information, who need to be imbued with the critical foresight and reasoning skills required to defend themselves from manipulation.

> Yet no one seems to focus on that.

It's something I try to do, but I'm just one person and can't seem to get any traction. So I don't know what to tell you. I keep on keeping on, perhaps stupidly so given the general lack of interest and the state of my finances.


It's a bit late right now; I just logged in to say this is very well written. I am tired these days of everyone blaming YouTube, Facebook, Google, or some other corporation--and if not them, then some cabal. Maybe we should learn not to believe everything we read or hear.

While I mostly agree, your post also reminds me of the "dissolve the people and elect a new one" quote. How do we equip the citizenry to navigate the rapidly changing media and propaganda landscape?

Schools used to teach epistemology (they didn't call it that, but it was practical learning); now they teach "practical" stuff, meaning the job skills needed today that will be obsolete by the time the kids graduate.


