Ahead of the presidential election, Facebook quietly stopped recommending political and social-issue groups to users in the United States. Mark Zuckerberg, Facebook's chief executive, mentioned the move in passing at a Senate hearing on Wednesday, and a Facebook spokesperson confirmed it to BuzzFeed News. The company declined to say when the change took effect or when it would end.

"This is a measure we took ahead of Election Day," said Facebook spokesperson Liz Bourgeois, who added that new groups have also been filtered out of the recommendation tools. "We will assess when to lift them afterward, but they are temporary."

Facebook did not publicly announce the change before Zuckerberg was questioned by members of the Senate Commerce, Science, and Transportation Committee about Facebook groups and the polarization and radicalization that can take place within them. Testifying alongside Twitter CEO Jack Dorsey and Google CEO Sundar Pichai about content moderation on their platforms, Facebook's chief executive became the focus of Massachusetts Senator Ed Markey's questions about whether the company would stop recommending groups on its platform until the results of the US presidential election are certified.

"Senator, we have taken the step of stopping recommendations for all groups dealing with political content or social issues as a precaution," Zuckerberg replied.

Facebook uses algorithms to automatically identify and recommend groups for people to join in order to increase engagement. Researchers have long warned that these recommendations can push people down a path of radicalization, and that the groups themselves can reinforce like-minded views and abet the spread of misinformation and hate.

With more than a billion people on Facebook belonging to groups, the company has pushed users to join them by elevating groups' prominence in people's news feeds.
Announcing the company's new focus on groups in 2017, Zuckerberg said the social network had built artificial intelligence to "see if we can better recommend groups that are meaningful to you."

"And it works," he wrote in a post titled "Bringing the World Closer Together." "In the first six months, we helped 50% more people join meaningful communities. And there is a lot more to do here."

According to Claire Wardle, co-founder of First Draft, a nonprofit that researches misinformation, group recommendations can be harmless for dog lovers but become a problem when they steer people toward groups spreading conspiracy theories or scientific misinformation. Based on the anecdotal evidence she has seen, she said, Facebook's automated group suggestions can push people down radicalizing recommendation paths.

"If I was in a Wisconsin group" protesting stay-at-home precautions, she asked, "what other groups would I be recommended? Anti-vaccine groups? 'Yellow vest' groups?" It is impossible to research this at scale, she said, because it happens in people's personal news feeds.

In May, the Wall Street Journal reported that Facebook's internal researchers had found in 2016 that "64% of all extremist group joins are due to our recommendation tools," including the platform's "Groups You Should Join" and "Discover" algorithms. "Our recommendation systems grow the problem," the researchers wrote. When Michigan Senator Gary Peters asked about the internal research at Wednesday's Senate hearing, Zuckerberg said he was "not familiar with that specific study," although he had criticized the Journal's reporting internally to employees, according to a recording of a recent company-wide meeting obtained by BuzzFeed News.
However, Zuckerberg did point out at the Senate hearing that Facebook has taken steps to keep groups that contribute to extremism or spread misinformation out of its recommendations.

Despite those changes, organizations that violate Facebook's own rules can still maintain groups on the platform. After Facebook banned right-wing extremist groups and pages in August, a watchdog group found dozens of extremist groups and pages still on the platform. Earlier this month, federal and state prosecutors in Michigan charged 14 people in an alleged plot, coordinated in part on Facebook, to kidnap and possibly kill Michigan Governor Gretchen Whitmer. A day after authorities announced the plot, BuzzFeed News reported that the social network's recommendation tools were still advising users to follow pages promoting extremist content.

It is not clear how many groups were affected by Facebook's pre-election restriction on recommending groups dealing with political and social issues. Facebook spokesperson Bourgeois declined to provide more details or say when the temporary change would be lifted.

Tests of political groups on Facebook show that while the algorithmically generated group suggestions may have been removed, group administrators can still manually suggest groups to members. Facebook's search tool also continues to surface political and social-issue groups as usual.

Wardle wondered why Facebook, which has publicly announced a number of election-related changes to its platform, including a temporary ban on political advertising around the general election, chose not to publicly announce the change to group recommendations.
On Thursday, Facebook-owned Instagram announced that it would temporarily remove the "Recent" tab from hashtag pages, which collects recently uploaded posts with a given hashtag, "to reduce the real-time spread of potentially harmful content" before and after the election.