Social Media
Facebook dithered in curbing divisive user content in India
Facebook in India has been selective in curbing hate speech, misinformation and inflammatory posts, particularly anti-Muslim content, according to leaked documents obtained by The Associated Press, even as its own employees cast doubt over the company’s motivations and interests.
From research as recent as March of this year to company memos dating back to 2019, the internal documents on India highlight Facebook’s constant struggles in quashing abusive content on its platforms in the world’s biggest democracy and the company’s largest growth market. Communal and religious tensions in India have a history of boiling over on social media and stoking violence.
The files show that Facebook has been aware of the problems for years, raising questions over whether it has done enough to address these issues. Many critics and digital experts say it has failed to do so, especially in cases where members of Prime Minister Narendra Modi’s ruling Bharatiya Janata Party, or the BJP, are involved.
Across the world, Facebook has become increasingly important in politics, and India is no different.
Read: Amid the Capitol riot, Facebook faced its own insurrection
Modi has been credited with leveraging the platform to his party’s advantage during elections, and reporting from The Wall Street Journal last year cast doubt over whether Facebook was selectively enforcing its policies on hate speech to avoid blowback from the BJP. Both Modi and Facebook chairman and CEO Mark Zuckerberg have exuded bonhomie, memorialized by a 2015 image of the two hugging at the Facebook headquarters.
The leaked documents include a trove of internal company reports on hate speech and misinformation in India, much of it intensified by the platform’s own “recommended” feature and algorithms. But they also include company staffers’ concerns over the mishandling of these issues and their discontent with the viral “malcontent” on the platform.
According to the documents, Facebook saw India as one of the most “at risk countries” in the world and identified both Hindi and Bengali as priorities for “automation on violating hostile speech.” Yet Facebook didn’t have enough local-language moderators or content-flagging in place to stop misinformation that at times led to real-world violence.
In a statement to the AP, Facebook said it has “invested significantly in technology to find hate speech in various languages, including Hindi and Bengali,” which has “reduced the amount of hate speech that people see by half” in 2021.
“Hate speech against marginalized groups, including Muslims, is on the rise globally. So we are improving enforcement and are committed to updating our policies as hate speech evolves online,” a company spokesperson said.
This AP story, along with others being published, is based on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organizations, including the AP.
Back in February 2019, ahead of a general election when concerns about misinformation were running high, a Facebook employee wanted to understand what a new user in the country saw on their news feed if all they did was follow pages and groups recommended solely by the platform itself.
Read: Facebook unveils new controls for kids using its platforms
The employee created a test user account and kept it live for three weeks, a period during which an extraordinary event shook India — a militant attack in disputed Kashmir had killed over 40 Indian soldiers, bringing the country to near war with rival Pakistan.
In the note, titled “An Indian Test User’s Descent into a Sea of Polarizing, Nationalistic Messages,” the employee whose name is redacted said they were “shocked” by the content flooding the news feed which “has become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”
Seemingly benign and innocuous groups recommended by Facebook quickly morphed into something else altogether, where hate speech, unverified rumors and viral content ran rampant.
The recommended groups were inundated with fake news, anti-Pakistan rhetoric and Islamophobic content. Much of the content was extremely graphic.
One included a man holding the bloodied head of another man covered in a Pakistani flag, with an Indian flag in the place of his head. Its “Popular Across Facebook” feature showed a slew of unverified content related to the retaliatory Indian strikes into Pakistan after the bombings, including an image of a napalm bomb from a video game clip debunked by one of Facebook’s fact-check partners.
“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the researcher wrote.
Read: Ex-Facebook manager criticizes company, urges more oversight
It sparked deep concerns over what such divisive content could lead to in the real world, where local news outlets at the time were reporting on Kashmiris being attacked in the fallout.
“Should we as a company have an extra responsibility for preventing integrity harms that result from recommended content?” the researcher asked in their conclusion.
The memo, circulated with other employees, did not answer that question. But it did expose how the platform’s own algorithms or default settings played a part in spurring such content. The employee noted that there were clear “blind spots,” particularly in “local language content.” They said they hoped these findings would start conversations on how to avoid such “integrity harms,” especially for those who “differ significantly” from the typical U.S. user.
Even though the research was conducted during three weeks that weren’t representative of typical conditions, they acknowledged that it did show how such “unmoderated” and problematic content “could totally take over” during “a major crisis event.”
The Facebook spokesperson said the test study “inspired deeper, more rigorous analysis” of its recommendation systems and “contributed to product changes to improve them.”
“Separately, our work on curbing hate speech continues and we have further strengthened our hate classifiers, to include four Indian languages,” the spokesperson said.
Likee’s #KnowledgeMonth campaign ends; 5,470 videos uploaded by 1,904 users
Likee Bangladesh’s recent #KnowledgeMonth initiative, organized in partnership with 10 Minute School, has ended. The campaign drew wide acclaim from different regions and proved successful, thanks to its potential to create an enabling and healthy online atmosphere for knowledge sharing.
Likee launched the campaign on September 3 to encourage its users to create and share videos focusing on various academic and co-curricular skills, which would not only enlighten people but also help them showcase their creative side.
During the campaign, a total of 5,470 videos were uploaded by 1,904 users, and a staggering 35.8 million engagements were recorded. Likee users uploaded videos in categories such as English learning, football, art and painting, and cooking, with a concentrated focus on two streams: #howto and #education.
Read: Likee teams up with 10 Minute School
Many well-known figures from different fields, including teachers, researchers, sportspersons, artists, culinary and life-skill enthusiasts and renowned nutritionists, took part in the campaign and came up with enlightening videos.
Abdullah Al Shihab, 10 Minute School’s PR and Communication Manager, who managed the overall operations of the collaborative campaign, said, “We started this #KnowledgeMonth campaign with a goal of ensuring educational value on a short-video platform like Likee, which people typically use for entertainment purposes.”
The campaign’s segments were designed around users’ needs so that each category of user could be reached easily.
Read: Likee launches campaign to promote cyber safety
Tamanna Chowdhury, a clinical dietitian and nutritionist, said about the campaign, “Through my short videos I usually talk about diet tips and nutrition. Of late, I have come across Likee Bangladesh's #KnowledgeMonth campaign, which seems to be quite a gem for me. I have shared many videos portraying different pertinent aspects of nutritional needs. I hope people will see those and be aware of their health.”
Iffat, another participant, said, “Apart from showcasing my cooking skills, I have also got to learn about different life skills including cooking, art, math, spoken English, etc. I am happy, as after joining #KnowledgeMonth I was able to create a new cooking account, and it got verified as well.”
Encouraged by the response the campaign received, Likee plans to arrange similar campaigns in the future, giving video creators a platform to display their talents and helping them expand their career options. Such campaigns are expected to create an environment where content creators can acquire and develop knowledge together and add value to fellow users’ experience.
Read: Short video platforms are new way of building communities: Head of Likee Operations in Bangladesh
Facebook unveils new controls for kids using its platforms
Facebook, in the aftermath of damning testimony that its platforms harm children, will be introducing several features, including prompting teens to take a break from using its photo-sharing app Instagram and “nudging” teens if they are repeatedly looking at the same content that’s not conducive to their well-being.
The Menlo Park, California-based company is also planning to introduce, on an optional basis, new controls that let parents or guardians supervise what their teens are doing online. These initiatives come after Facebook announced late last month that it was pausing work on its Instagram for Kids project. But critics say the plan lacks details, and they are skeptical that the new features would be effective.
The new controls were outlined on Sunday by Nick Clegg, Facebook's vice president for global affairs, who made the rounds on various Sunday news shows including CNN's “State of the Union" and ABC's “This Week with George Stephanopoulos" where he was grilled about Facebook's use of algorithms as well as its role in spreading harmful misinformation ahead of the Jan. 6 Capitol riots.
Read: Could Facebook sue whistleblower Frances Haugen?
“We are constantly iterating in order to improve our products,” Clegg told Dana Bash on “State of the Union" Sunday. “We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use."
Clegg said that Facebook has invested $13 billion over the past few years in keeping the platform safe and that the company has 40,000 people working on these issues. And while Clegg said that Facebook has done its best to keep harmful content off its platforms, he said he was open to more regulation and oversight.
“We need greater transparency,” he told CNN’s Bash. He noted that the systems that Facebook has in place should be held to account, if necessary, by regulation so that “people can match what our systems say they’re supposed to do from what actually happens.”
The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens and of being dishonest in its public fight against hate and misinformation. Haugen’s accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
Read: Ex-Facebook manager criticizes company, urges more oversight
Josh Golin, executive director of Fairplay, a watchdog for the children and media marketing industry, said he doesn't think introducing controls to help parents supervise teens would be effective, since many teens set up secret accounts anyway. He was also dubious about how effective it would be to nudge teens to take a break or move away from harmful content. He noted that Facebook needs to show exactly how it would implement these tools and offer research showing they are effective.
“There is tremendous reason to be skeptical," he said. He added that regulators need to restrict what Facebook does with its algorithms.
He said he also believes that Facebook should cancel its Instagram project for kids.
When Clegg was grilled by both Bash and Stephanopoulos in separate interviews about the use of algorithms in amplifying misinformation ahead of the Jan. 6 riots, he responded that if Facebook removed the algorithms, people would see more, not less, hate speech and more, not less, misinformation.
Read: Whistleblower: Facebook chose profit over public safety
Clegg told both hosts that the algorithms serve as “giant spam filters."
Democratic Sen. Amy Klobuchar of Minnesota, who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust, and Consumer Rights, told Bash in a separate interview Sunday that it's time to update children's privacy laws and offer more transparency in the use of algorithms.
“I appreciate that he is willing to talk about things, but I believe the time for conversation is done," said Klobuchar, referring to Clegg's plan. “The time for action is now.”
Could Facebook sue whistleblower Frances Haugen?
Facebook has recently taken a harsher tone toward whistleblower Frances Haugen, suggesting that the social network could be considering legal retaliation after Haugen went public with internal research that she copied before leaving her job earlier this year.
U.S. law protects whistleblowers who disclose information about potential misconduct to the government. But that protection doesn't necessarily cover taking corporate secrets to the media.
Facebook still has to walk a fine line. The company has to weigh whether suing Haugen, which could dissuade other employees who might otherwise speak out, is worth casting itself as a legal Godzilla willing to stomp on a woman who says she's just doing the right thing.
Haugen may face other consequences. Whistleblowers often put themselves at risk of professional damage — other firms may be reluctant to hire them in the future — and personal attacks from being in the public eye.
Read: Ex-Facebook manager criticizes company, urges more oversight
Facebook did not respond to emailed questions.
WHAT DID HAUGEN DO?
Haugen secretly copied a trove of internal Facebook documents before leaving the company and subsequently had her lawyers file complaints with the Securities and Exchange Commission alleging that Facebook hides what it knows about the negative effects of its platform.
John Tye, her lawyer, said the team gave redacted documents to Congress, where Haugen testified on Tuesday, and also informed officials in California. Haugen also shared documents with the Wall Street Journal, which she started talking to in December, leading to a series of explosive stories that began in mid-September.
WHAT WAS FACEBOOK'S RESPONSE?
The company says it has been mischaracterized. “I think most of us just don’t recognize the false picture of the company that is being painted,” CEO Mark Zuckerberg wrote to employees on Tuesday.
Some company officials have also begun using harsher language to describe Haugen's actions that could be interpreted as threatening.
In an Associated Press interview Thursday, Facebook executive Monika Bickert repeatedly referred to the documents Haugen copied as “stolen,” a word she has also used in other media interviews. David Colapinto, a lawyer for Kohn, Kohn and Colapinto who specializes in whistleblower cases, said that language was threatening.
In the same interview, asked if Facebook would sue or retaliate against the whistleblower, Bickert said only, “I can’t answer that.”
A week earlier, Antigone Davis, Facebook’s head of global safety, testified in the Senate that Facebook “would never retaliate against someone for speaking to Congress,” which left open the possibility that the company might go after her for giving documents to the Journal.
Read: Whistleblower: Facebook chose profit over public safety
IS HAUGEN PROTECTED?
Various laws offer whistleblower protection at both the state and federal levels. The federal laws applicable to Haugen are the Dodd-Frank Act, a 2010 Wall Street reform law, and the Sarbanes-Oxley Act, a 2002 law that followed the collapse of Enron and other accounting scandals.
Dodd-Frank expanded protections for whistleblowers and empowered the SEC to take action against a company that threatens a whistleblower. Protections exist for both employees and former employees, experts say.
Asked about her risk because she went to the media, Haugen's lawyer, Tye, maintains that because Haugen went to the SEC, Congress and state authorities, she’s entitled to whistleblower protections. He said any suit from Facebook would be “frivolous" and that Facebook has not been in touch.
WHAT ABOUT HER LEAKS TO THE MEDIA?
Courts haven't tested whether leaking to the media is protected under Dodd-Frank, but Colapinto said the U.S. Secretary of Labor determined decades ago that environmental and nuclear-safety whistleblowers' communications with the media were protected. He argues that the language of Sarbanes-Oxley is modeled on those earlier statutes, and Haugen should have the same protections for any of her communications with reporters.
Read: Ex-Facebook manager alleges social network fed Capitol riot
Facebook could allege that Haugen broke her nondisclosure agreement by sharing company documents with the press, leaking trade secrets or just by making comments Facebook considers defamatory, said Lisa Banks of Katz, Marshall and Banks, who has worked on whistleblower cases for decades. "Like many whistleblowers, she’s extraordinarily brave and puts herself at personal and professional risk in shining a light on these practices,” she said.
Haugen effectively used leaks to the media to turn up the pressure on Congress and government regulators. Colapinto said her disclosures had a public-interest purpose that could complicate enforcing the NDA if Facebook chose to do so.
COULD FACEBOOK FACE BLOWBACK?
Facebook probably wants its veiled threats to unnerve other employees or former employees who might be tempted to speak out. “If they go after her, it won't be because they necessarily think they have a strong case legally, but sending a message to other would-be whistleblowers that they intend to play hardball,” Banks said.
But she said it would be a “disaster” for Facebook to go after Haugen. Regardless of potential legal vulnerabilities, Facebook might look like a bully if it pursued a legal case against her.
“The last thing Facebook needs is to rouse the ire of governmental authorities and the public at large by playing the role of the big bad giant company against the courageous individual whistleblower,” said Neil Getnick, whose firm, Getnick and Getnick, represents whistleblowers.
Ex-Facebook manager criticizes company, urges more oversight
While accusing the giant social network of pursuing profits over safety, a former Facebook data scientist told Congress Tuesday she believes stricter government oversight could alleviate the dangers the company poses, from harming children to inciting political violence to fueling misinformation.
Frances Haugen, testifying to the Senate Commerce Subcommittee on Consumer Protection, presented a wide-ranging condemnation of Facebook. She accused the company of failing to make changes to Instagram after internal research showed apparent harm to some teens and being dishonest in its public fight against hate and misinformation. Haugen’s accusations were buttressed by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
But she also offered thoughtful ideas about how Facebook’s social media platforms could be made safer. Haugen laid responsibility for the company’s profits-over-safety strategy right at the top, with CEO Mark Zuckerberg, but she also expressed empathy for Facebook’s dilemma.
Haugen, who says she joined the company in 2019 because “Facebook has the potential to bring out the best in us,” said she didn’t leak internal documents to a newspaper and then come before Congress in order to destroy the company or to call for its breakup, as many consumer advocates and lawmakers of both parties have urged.
Haugen is a 37-year-old data expert from Iowa with a degree in computer engineering and a master’s degree in business from Harvard. Prior to being recruited by Facebook, she worked for 15 years at tech companies including Google, Pinterest and Yelp.
Read: Outage highlights how vital Facebook has become worldwide
“Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.”
“Congressional action is needed,” she said. “They won’t solve this crisis without your help.”
In a note to Facebook employees Tuesday, Zuckerberg disputed Haugen’s portrayal of the company as one that puts profit over the well-being of its users, or that pushes divisive content.
“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” Zuckerberg wrote.
He did, however, appear to agree with Haugen on the need for updated internet regulations, saying that would relieve private companies from having to make decisions on social issues on their own.
“We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress,” Zuckerberg wrote.
Democrats and Republicans have shown a rare unity around the revelations of Facebook’s handling of potential risks to teens from Instagram, and bipartisan bills have proliferated to address social media and data-privacy problems. But getting legislation through Congress is a heavy slog. The Federal Trade Commission has taken a stricter stance toward Facebook and other tech giants in recent years.
“Whenever you have Republicans and Democrats on the same page, you’re probably more likely to see something,” said Gautam Hans, a technology law and free speech expert at Vanderbilt University.
Haugen suggested, for example, that the minimum age for Facebook’s popular Instagram photo-sharing platform could be increased from the current 13 to 16 or 18.
She also acknowledged the limitations of possible remedies. Facebook, like other social media companies, uses algorithms to rank and recommend content to users’ news feeds. When the ranking is based on engagement — likes, shares and comments — as it is now with Facebook, users can be vulnerable to manipulation and misinformation. Haugen would prefer the ranking to be chronological. But, she testified, “People will choose the more addictive option even if it is leading their daughters to eating disorders.”
Haugen said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.
Read: Whistleblower: Facebook chose profit over public safety
Despite the enmity that the new algorithms were feeding, she said Facebook found that they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate the vast majority of its revenue.
Haugen said she believed Facebook didn’t set out to build a destructive platform. “I have a huge amount of empathy for Facebook,” she said. “These are really hard questions, and I think they feel a little trapped and isolated.”
But “in the end, the buck stops with Mark,” Haugen said, referring to Zuckerberg, who controls more than 50% of Facebook’s voting shares. “There is no one currently holding Mark accountable but himself.”
Haugen said she believed that Zuckerberg was familiar with some of the internal research showing concerns for potential negative impacts of Instagram.
The subcommittee is examining Facebook’s use of information its own researchers compiled about Instagram. Those findings could indicate potential harm for some of its young users, especially girls, although Facebook publicly downplayed possible negative impacts. For some of the teens devoted to Facebook’s popular photo-sharing platform, the peer pressure generated by the visually focused Instagram led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed.
One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.
She also has filed complaints with federal authorities alleging that Facebook’s own research shows that it amplifies hate, misinformation and political unrest, but that the company hides what it knows.
After recent reports in The Wall Street Journal based on documents she leaked to the newspaper raised a public outcry, Haugen revealed her identity in a CBS “60 Minutes” interview aired Sunday night.
As the public relations debacle over the Instagram research grew last week, Facebook put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12.
Read: Ex-Facebook manager alleges social network fed Capitol riot
Haugen said that Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump in last year’s presidential election, alleging that doing so contributed to the deadly Jan. 6 assault on the U.S. Capitol.
After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. That was the moment, she said, when she realized that “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”
Haugen says she told Facebook executives when they recruited her that she wanted to work in an area of the company that fights misinformation, because she had lost a friend to online conspiracy theories.
Facebook maintains that Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarization.
“Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with (top) executives – and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about,” the company said in a statement.
Outage highlights how vital Facebook has become worldwide
The six-hour outage at Facebook, Instagram and WhatsApp was a headache for many casual users but far more serious for the millions of people worldwide who rely on the social media sites to run their businesses or communicate with relatives, fellow parents, teachers or neighbors.
When all three services went dark Monday, it was a stark reminder of the power and reach of Facebook, which owns the photo-sharing and messaging apps.
Around the world, the breakdown at WhatsApp left many at a loss. In Brazil, the messaging service is by far the most widely used app in the country, installed on 99% of smartphones, according to tech pollster Mobile Time.
WhatsApp has become essential in Brazil to communicate with friends and family, as well as for a variety of other tasks, such as ordering food. Offices, various services and even the courts had trouble making appointments, and phone lines became overwhelmed.
Read: Facebook services restored after worldwide outage
Hundreds of thousands of Haitians in their homeland and abroad fretted over the WhatsApp outage.
Many of the country’s more than 11 million people depend on it to alert one another about gang violence in particular neighborhoods or to talk to relatives in the U.S. about money transfers and other important matters. Haitian migrants traveling to the U.S. rely on it to find each other or share key information such as safe places to sleep.
Nelzy Mireille, a 35-year-old unemployed woman who depends on money sent from relatives abroad, said she stopped at a repair shop in the capital of Port-au-Prince because she thought her phone was malfunctioning.
“I was waiting on confirmation on a money transfer from my cousin,” she said. “I was so frustrated.”
“I was not able to hear from my love,” complained 28-year-old Wilkens Bourgogne, referring to his partner, who was in the neighboring Dominican Republic, buying goods to bring back to Haiti. He said he was concerned about her safety because of the violence in their homeland.
“Insecurity makes everyone worry,” he said.
In rebel-held Syria, where the telecommunication infrastructure has been disrupted by war, residents and emergency workers rely mostly on internet communication.
Naser AlMuhawish, a Turkey-based Syrian doctor who monitors coronavirus cases in rebel-held territory in Syria, said WhatsApp is the main communication method used with over 500 workers in the field.
They switched to Skype, but WhatsApp works better when internet service is shaky, he said. If there had been an emergency such as shelling that he needed to warn field workers about, there could have been major problems, he said.
"Luckily this didn’t happen yesterday during the outage,” he said.
But hospitals treating COVID-19 patients in the region were thrown into panic. They lost contact with oxygen suppliers, who have no fixed location and are normally reached via WhatsApp. One hospital sent staff members searching for oxygen at nearly two dozen facilities, said Dr. Fadi Hakim of the Syrian American Medical Society.
In Lima, Peru, the breakdown complicated dental technician Mary Mejia’s job. Like most Peruvian medical workers, she uses WhatsApp for a multitude of tasks, including scheduling appointments and ordering crowns.
“Sometimes the doctor will be working on a patient and I need to contact a technician for a job,” she said. “To have to step away and make a phone call? It trips us up. We’ve become so accustomed to this tool.”
Millions of Africans use WhatsApp for all their voice calls, so “people felt they were cut off from the world,” said Mark Tinka, a Ugandan who heads engineering at SEACOM, a South Africa-based internet infrastructure company.
Read: Whistleblower: Facebook chose profit over public safety
Many Africans also use WhatsApp to connect with relatives in other countries. Tinka’s stepdaughter lives in Caldwell, Idaho, and lost her father on Sunday, but could not speak with her family back in Dar Es Salaam, Tanzania, to arrange travel for the funeral.
“It’s amazing just how little folks understand the impact of three or four content companies on the utility of the Internet,” Tinka said.
Facebook said the outage was due to an internal error related to a “configuration change” but gave no details.
The outage came amid a crisis at Facebook, accused by a whistleblower on “60 Minutes” and on Capitol Hill of profiting from hate and division and suppressing research showing that Instagram contributes to body-image problems, eating disorders and thoughts of suicide in young women.
For small businesses, the outages meant hundreds or thousands of dollars in lost revenue.
Andrawos Bassous is a Palestinian photographer in the Israeli-occupied West Bank whose Facebook page has more than 1 million followers. He has worked with companies including Samsung and Turkish Airlines to create social media content. He said the social media blackout meant he was unable to book appointments or share videos online for companies that employ him.
“Imagine if you promised one of the companies you work for to share their product at a specific time and there is a blackout,” Bassous said.
Sarah Murdoch runs a small Seattle-based travel company called Adventures with Sarah and relies on Facebook Live videos to promote her tours. She estimated the breakdown cost her thousands of dollars in bookings.
“I’ve tried other platforms because I am wary of Facebook, but none of them are as powerful for the type of content I create,” Murdoch said. As for her losses, “it may only be a few people, but we are small enough that it hurts.”
Heather Rader runs How Charming Photography in Linton, Indiana. She takes photographs for schools and sports teams and makes yard signs with the photos. She has her own website but said parents and other customers mostly try to reach her through social media.
She said she might have lost three or four bookings for photo sessions at $200 a client.
“A lot of people only have a specific window when they can do ordering and booking and things like that,” she said. “If they can’t get a direct answer, they go to someone else.”
Read: Facebook, WhatsApp, Instagram suffer worldwide outage
Tarita Carnduff of Alberta, Canada, said she connects with other parents on Facebook just about every day, and the outage drove home for her how crucial that support is.
“As a parent with special needs kids, it is the only space I found others in similar positions,” she said. “There’s a lot of us that would be lost without it.”
But for others, the breakdown led them to conclude they need less Facebook in their lives.
Anne Vydra said she realized she was spending too much free time scrolling and commenting on posts she disagreed with. She deleted the Facebook app on Tuesday.
“I didn’t want it to come back,” said Vydra, who lives in Nashville, Tennessee, and does voiceover work. She added: “I realized how much of my time was wasted.”
Facebook services restored after worldwide outage
Facebook services were restored Monday afternoon after users worldwide experienced hours-long outages across the social networks Facebook owns, including Messenger, Instagram and WhatsApp.
"We've been working hard to restore access to our apps and services and are happy to report they are coming back online now," Facebook said in a post.
Read: Ex-Facebook manager alleges social network fed Capitol riot
During the outage, Facebook said, "Sorry, something went wrong. We're working on it and we'll get it fixed as soon as we can." Instagram showed a "5xx server error."
The outage appeared to have been caused by a DNS (Domain Name System) failure, according to a report by TechCrunch.
"Not only are Facebook's services and apps down for the public, its internal tools and communications platforms, including Workplace, are out as well. No one can do any work," Ryan Mac, a technology reporter for The New York Times, said earlier in a tweet.
Likee teams up with 10 Minute School
Short-video app Likee has partnered with the online educational platform 10 Minute School to encourage users to create and share videos on academic and co-curricular skills.
Likee launched the knowledge community development initiative #KnowledgeMonth recently.
The campaign, which will run until September 30, aims to create an encouraging community sphere, where a healthy and competitive spirit will boost the users' learning. It will award several lucky winners at the end of the month-long activities.
Also read: Learning together with Likee’s #Steps2learn
The collaboration between Likee and 10 Minute School is set to encourage teachers, researchers, sportspersons, artists, and culinary, life-skill and other academic and non-academic enthusiasts to come together as a community and share their knowledge.
Users can create interesting videos to share their learning, tips and tricks on different topics like Bengali, Bengali dialects, spoken English, science, puzzles, trivia, economics, culture, humanities, health, medicine, mathematics, physics, chemistry and biology.
These videos will have to be uploaded using the hashtags #AcademicKnowledge, #ArtisticKnowledge, #LifeKnowledge and #UnpopularKnowledge. The videos need to be original and more than 10 seconds in length, according to a media statement.
Also read: Likee launches campaign to promote cyber safety
"In this era of online learning, Likee believes that it has the power to connect learners through educational video content. Our #KnowledgeMonth campaign is already seeing enthusiastic traffic, and we thank our partner 10 Minute School for their cooperation," said a Likee spokesperson.
Ayman Sadiq, founder and CEO of 10 Minute School, said, "We are glad to have partnered with Likee as together, we can positively use this opportunity to encourage millions of Likee users to share valuable knowledge."
Read: CID arrests five, including Bigo and Likee executives, for app-based crimes
Social Media Addiction: How to detach yourself from the cycle
In today’s world of constant internet connectivity, it is only natural that social media usage keeps surging. There are about 4.48 billion social media users worldwide, more than half the planet’s population. But the proliferation of social media brings a new set of problems. Younger generations in particular are increasingly prone to compulsive scrolling. In this article, we will take a deep dive into social media addiction and how to get rid of it.
What is social media addiction?
Have you ever gone to bed and impulsively reached for your phone? Spent hours scrolling through different social media and messing up your sleep schedule in the process? If it’s a yes, then chances are you’re suffering from a case of social media addiction. You’re not alone, however.
Read: How social platforms are dealing with the Taliban
In 2017, a survey found that about 210 million people suffered from social media addiction. That was before the world had to grapple with the pandemic, which has since pushed much of office work, education and other activities onto online platforms, making people increasingly prone to becoming addicted to social media.
It might not feel like it at first, but this is a behavioral addiction. At some point, scrolling through news feeds, posts and pictures will start to hamper other parts of your life.
So, is there a way out of this? Well, there are a lot of ways to go about social media detoxification. We’ll talk about some of the most effective ways to get your life back to normalcy.
Read: A set of directives for judicial officials for using social media
How to Get Rid of Social Media Addiction
Turn off notifications
The first thing you can do to break the cycle is to turn off notifications. You cannot even begin the process of detachment if your phone lights up with a notification every other hour. The urge to check who tweeted or shared a post will eventually start a domino effect in which you end up spending more time than you should.
Some people obsess over the likes and interactions on their posts. If you have that tendency, we suggest you stay away from your phone for a few hours after posting anything. Do not let the urge to check interactions get to you. Turning off notifications will minimize distractions and pave the way for a gradual transition away from the addiction.
Read: Dating changed during the pandemic; apps are following suit
Do not go to sleep with your phone
More than 45% of social media users say they scroll through their phones in bed before falling asleep. This is a very bad practice with serious side effects. You not only deprive your brain of rest, but the blue light from the screen also strains your eyes and disrupts your sleep schedule.
The only way around this problem is to stop taking your phone to bed. Whether it is an addiction or simply an ingrained habit, leaving the phone out of bed won't be easy.
You can keep the phone on the other side of your bedroom or somewhere that’s out of your reach when you’re in bed. It’ll take some time getting used to. But once you do, you will be able to gradually cut off the impulsive need to browse the phone.
Read: How to Stop Your Online Shopping Addiction?
Say no to social media when you wake up
Just as some people go to bed with their phones, too many people instinctively reach for their phone the moment they wake up. This is another bad habit and a form of compulsive behavior. Extreme dependency on social media content means you are being dragged into the platform without even realizing it.
Try not to use your phone until after you've completed your morning routine. Too much social media content in the morning will only leave you distracted throughout the day.
Avoid overthinking social media appearance
This is probably the most common problem when it comes to social media platforms. You are not alone if you spend too much time thinking about what to post and how to post it. Most social media users feel the urge to look and sound better on social media in order to get more validation. In reality, you are using up precious time from your daily life overthinking your social media presence. This is a form of passive addiction to social media.
Read: Dutch data protection authority fines TikTok over privacy
The stress and anxiety tied to this need for validation can be too much to handle, and some people may even slip into depression because of it. As a social media user, the last thing you should worry about is what people think of what you post. De-stressing starts with detaching yourself from your social media appearance.
Find a replacement
Think of a time when neither digital devices nor social media existed. Handheld digital devices only became common in the 1990s, and social media in the 2000s. So what about the people before that time? Didn't they have fulfilling lives? They arguably lived just as well, if not better, with the analog options they had.
You need to do the same. Rather than getting engrossed in social media, try to rediscover your hobbies. There are effective and productive alternatives to social media, like reading books or gardening. The more you indulge in analog hobbies, the easier it becomes to detach yourself from social media.
Read: Boithok: Bangladeshi video conferencing app for online meeting
Digital detox
The best solution is to stay away from social media completely. Once you've learned to take the small steps, it's time to focus on taking considerable time off from all types of social media platforms. This process is known as a digital detox.
It won't happen overnight. At the start, you will feel the urge to go back. But the more time you spend away from social media, the more productive and stress-free your life becomes. You can gradually reduce your usage to a healthy level or cut it off completely; the choice is yours.
Bottom Line
Overcoming social media addiction is easier said than done. Just like other behavioral addictions, you'll have to take small steps to get away from it completely. The steps may sound simple, but you still have to follow through with each of them for a successful detox. We hope this article helps you break free of the addiction.
Read: How To Be A Popular Instagram Influencer
How social platforms are dealing with the Taliban
As the Taliban negotiates with senior politicians and government leaders following its lightning-fast takeover of Afghanistan, U.S. social media companies are reckoning with how to deal with a violent extremist group that is poised to rule a country of 40 million people.
Should the Taliban be allowed on social platforms if they don’t break any rules, such as a ban on inciting violence, but instead use it to spread a narrative that they’re newly reformed and are handing out soap and medication in the streets? If the Taliban runs Afghanistan, should they also run the country’s official government accounts?
And should tech companies in Silicon Valley decide what is — and isn’t — a legitimate government? They certainly don’t want to. But as the situation unfolds, uncomfortable decisions lie ahead.
Also read: Who are the Taliban?
DOES THE TALIBAN USE SOCIAL MEDIA?
The Taliban quickly seized power in Afghanistan two weeks before the U.S. was set to complete its troop withdrawal after a two-decade war. The insurgents stormed across the country, capturing all major cities in a matter of days, as Afghan security forces trained and equipped by the U.S. and its allies melted away.
The last time the Taliban was in power in Afghanistan, Facebook, Twitter and YouTube did not exist. Neither did MySpace, for that matter. Internet use in the country was virtually nonexistent with just 0.01% of the population online, according to the World Bank.
In recent years, that number has vastly increased. The Taliban have also increased their online presence, producing slick videos and maintaining official social media accounts. Despite bans, they have found ways to evade restrictions on YouTube, Facebook and WhatsApp. Last year, for instance, they used WhatsApp groups to share pictures of local health officials in white gowns and masks handing out protective masks and bars of soap to locals.
On Twitter, Taliban spokesman Zabihullah Mujahid has been posting regular updates to more than 300,000 followers, including international media. Twitter suspended another account, @AfghPresident, which has served as the nation’s de facto official presidential account, pending verification of the account holder’s identity.
“There’s a realization that winning the war is as much a function of a nonmilitary tool like social media as it is about the bullets,” said Sarah Kreps, a law professor at Cornell University who focuses on international politics, technology and national security. “Maybe these groups, even from just an instrumental perspective, have realized that beheading people is not a way to win the hearts and minds of the country.”
Also read: Taliban promise women's rights, security under Islamic rule
WAIT, THE TALIBAN WERE ALLOWED ON TWITTER?
Facebook and YouTube consider the Taliban a terrorist organization and prohibit it from operating accounts. Twitter has not explicitly banned the group, though the company said Tuesday that it will continue to enforce its rules, in particular policies that bar “glorification of violence, platform manipulation and spam.”
This essentially means that until the accounts violate Twitter’s rules — for instance, by inciting violence — they are allowed to operate.
While the Taliban is not on the U.S. list of foreign terrorist organizations, the U.S. has imposed sanctions on it. Facebook said Tuesday that the group is banned from its platform under its “dangerous organization” policies, which also bar “praise, support and representation” of the group and accounts run on its behalf. The company emphasized in a statement that it has a dedicated team of Afghanistan experts who are native speakers of Dari and Pashto, Afghanistan’s official languages, to help provide local context and to alert the company of emerging issues.
Facebook has a spotty record when it comes to enforcing its rules. Doing so on WhatsApp, also owned by Facebook, could prove more difficult given that the service encrypts messages so that no one but senders and recipients can read them.
Twitter said it is seeing people in Afghanistan using its platform to seek help and that its top priority is “keeping people safe.” Critics immediately questioned why the company continues to ban former President Donald Trump even as it allows Mujahid to post.
“They certainly decided to silence a former U.S. president,” said Alex Triantafilou, chairman of the Hamilton County Republican Party in Cincinnati, Ohio, who called Twitter’s decision “preposterous.”
Twitter permanently suspended Trump following the deadly insurrection at the U.S. Capitol on Jan. 6, saying his posts glorified and could lead to more violence. The company has long insisted that it suspends accounts based on behavior and whether they violate its rules on the service, and not on offline actions and affiliation.
While he understands that social media companies operate in a global economy, Triantafilou said, “it seems to me that supporting America and our own interest” would make more sense for a U.S. company.
Also read: What Taliban's return means for Bangladesh
WHAT HAPPENS NOW?
As the situation unfolds, the major companies are grappling with how to respond. It’s not an entirely unique situation — they have had to deal with groups such as Hamas and Hezbollah, for instance, which hold considerable political power but are also violent and have carried out acts of terrorism.
“For the past decade, Hamas has used social media to gain attention, and convey their messages to international audiences in multiple languages,” wrote Devorah Margolin, senior research fellow at the Program on Extremism at The George Washington University, in a July report. For example, she wrote, both the political and military wings of Hamas operated official accounts on Twitter.
Despite attempts to use its English-language account to make its case to the international community, Margolin said the group still used Twitter to call for violence. In 2019, Twitter closed the official accounts, @HamasInfo and @HamasInfoEn, for violating its rules, saying there is “no place on Twitter for illegal terrorist organizations and violent extremist groups.”
Facebook declined to say specifically if it would hand over Afghanistan’s official government accounts to the Taliban if it is recognized as the country’s government. The company pointed to an earlier statement saying it “does not make decisions about the recognized government in any particular country but instead respects the authority of the international community in making these determinations.”
Twitter declined to answer questions beyond its statement. YouTube, meanwhile, provided a boilerplate statement saying it complies with “all applicable sanctions and trade compliance laws” and bans the incitement of violence.
All that effectively leaves the door open for the social platforms to eventually hand over control of the official accounts, assuming the Taliban behave and U.S. sanctions are lifted. “That seems like a reasonable approach, because I think the social media platforms don’t necessarily want to be adjudicating which groups are legitimate themselves,” said Kreps, who served in the U.S. Air Force from 1999 to 2003, partly in Afghanistan.
At the same time, she noted, the companies, especially Facebook, have learned a great deal, and paid a price, for the way social media helped incite genocidal behavior in Myanmar. And they’re unlikely to want a repeat of those horrors.