Dai and Jack’s Week in Tech: Don’t Trust the Internet

Dai: Did you see any coasters for the HomePods or people walking into glass?

Jack: The meeting was off the record.

Don’t Believe What You See on the Internet

Jack: Well, another week, and another example of the big tech platforms promoting misleading or downright false information. On Wednesday morning, YouTube’s No. 1 trending video was a clip that seemed to push a conspiracy theory that some survivors of the Parkland, Fla., school shooting who have called for stricter gun control are actually paid actors. The video, of a TV news broadcast that included Parkland student David Hogg, carried the description: “DAVID HOGG THE ACTOR….”

Dai: It’s repugnant that anyone would keep pushing these conspiracy theories about kids who have already suffered so much, but this video was a little tricky. It was footage from a legitimate news story involving the student. It wasn’t false or doctored, and the content itself didn’t push a conspiracy. However, the title framed this otherwise innocuous clip as evidence of a broader conspiracy. Even the description wasn’t obviously inflammatory unless you understood the context of why this student was being called an actor. Man, YouTube can’t bring on those 10,000 human checkers fast enough.

YouTube took down the video before I had my morning coffee in California, but only after it attracted more than 200,000 views. YouTube said it has been trying to make changes so things like this don’t happen, but “in some circumstances these changes are not working quickly enough.”

Jack: This example also highlights another issue with YouTube: Its video recommendations can push people to misleading videos and conspiracy theories. YouTube’s video-recommendation engine is engineered to keep you on the site. It’s pretty darn good at it. YouTube said recommendations fueled a 10-fold increase in use of the site from 2012 to 2016, to more than 1 billion hours a day.

After I finished watching the trending two-minute David Hogg video on Wednesday, YouTube immediately began playing a new recommended video: a clip of Mr. Hogg speaking to reporters with the title, “David Hogg Can’t Remember His Lines When Interviewed for Florida School Shooting.” YouTube’s recommendation algorithm seeks out videos that are drawing traffic, and company engineers admit that approach can tend to reward sensational and salacious videos.

YouTube can also contribute to so-called filter bubbles by feeding you videos with perspectives similar to those in videos you already watched. That also means its recommendations can be an embarrassing window into your psyche. I mean, I wouldn’t want to reveal my recommendations to thousands of readers, like clips from comically bad movies or 1990s pro wrestling.

Dai: Maybe that ‘90s wrestling knowledge will be useful when President Rock is in the White House.

Issues at Facebook and Twitter, Too

Jack: It wasn’t just YouTube that spread false information about the shooting. There were also Facebook posts claiming survivors were so-called crisis actors, which were shared thousands of times. And your reporting with Sheera Frenkel this week showed that Twitter accounts suspected of being Russian bots mobilized to tweet about the Parkland school shooting shortly after it happened, often pushing both sides of the gun-control debate.

Dai: The sad reality is that whenever one of these tragic events takes place, it’s going to be a test of how these platforms weather the deluge of conspiratorial and false stories. It’s gotten to the point where the first instinct for tech reporters is to search for key phrases on Google, YouTube, Facebook and Twitter. Much of the time, the results are disappointing.

Jack: Meanwhile, the executives who run these sites have been slow to acknowledge the severity of the problem. After federal prosecutors last week indicted 13 Russians and three companies for spreading misinformation online in an attempt to subvert the 2016 election, Facebook’s vice president for advertising, Rob Goldman, posted a series of tweets that downplayed their impact.

Dai: Of course, he walked it back after President Trump amplified the comments to his followers, saying this was evidence that there was no collusion between his campaign and the Russians. Kevin Roose’s column on this really brought it home. Those tweets from Mr. Goldman made it look like the company’s P.R. strategy calls for prominent executives not named Zuckerberg to defend Facebook aggressively on social media — and it backfired in a major way.

Jack: Before we sign off, can I pass along this additional bit of good news about the internet? Twitter and Facebook are rife with impostor accounts, and — stop me if you’ve heard this before — they are slow to crack down on them.

Dai: Sometimes this feels like Sisyphus and the boulder. Farhad will be back next week with answers from many readers about the types of technology stories they want to read. I’m already preparing my five Logan Paul stories.

Daisuke Wakabayashi writes about Alphabet and its many arms, including Google. Jack Nicas covers Apple. Both also write about broader technology trends and Silicon Valley. You can follow them on Twitter: @daiwaka and @jacknicas
