Social media can give the news media an almost unlimited number of eyeballs for stories. But they can also be used to target the media and put pressure on journalists. Mediawatch meets an award-winning editor fighting back against fake news and online attacks in the Philippines.
Soon after the number of Facebook users in Malaysia passed 18 million in 2016, the social media network opened an office there.
It was a smart move. Not only was more than half the population on the platform; 6.5 million Malaysians were also on Instagram, the picture-sharing service owned by Facebook.
Two years later, the picture's even rosier.
Recently released official figures show just under 24 million Malaysians - 97.3 percent of the country's internet users - have a Facebook account and almost 14 million are on Instagram.
A Reuters Institute survey in 2017 found more than half of the respondents in Malaysia said they used messaging app WhatsApp - also owned by Facebook - for sharing or discussing news in a given week.
That's handy for Malaysia’s biggest digital-only news platform Malaysiakini. It has more than 4.5 million followers on Facebook pages in three languages.
But all these platforms are used to share fake news as well as the real thing.
And it travels fast.
During a recent international summit in Singapore called What is News Now?, Malaysiakini's social media editor Norman Goh got calls from home about a story purportedly from his site claiming the government was planning to make civil servants work six days a week.
"It was fake. It was never published but it was spread through Facebook and WhatsApp," he told Mediawatch.
"You can't really pinpoint who does it and Malaysians spend three to four hours a day on Facebook alone," he said.
"They hold a major share when it comes to the spread of news. It may not affect New Zealand now but it will one day," he warned.
"We want a safer place - or at the end of the day we will have to find another solution or another platform," he said.
But switching won't be easy when so many readers are such heavy users of one company's platforms.
It's a similar story in the Philippines where - according to business news agency Bloomberg - there are more smartphones than people and 97 percent of those with a smartphone also have a Facebook account.
President Rodrigo Duterte came to power in 2016 off the back of an intense social media campaign. Once he was in power, an army of dedicated followers formed online to push his messages.
One was a blogger with more than 5 million followers who became President Duterte's assistant communications secretary.
Mocha Uson - the pop star leader of girl band Mocha Girls - was appointed as the President’s assistant secretary for social media in May last year.
She puts 20-30 political messages a day on Facebook for her 4 million followers - and calls journalists "presstitutes".
Social media has become a tool to directly attack journalism in the Philippines, and one big target is the popular online news outlet Rappler, known for bold and blunt reporting of controversial issues like President Duterte's deadly war on drugs.
Rappler recently highlighted the harassment of its journalists in a widely shared video.
Rappler's chief executive, Maria Ressa, was formerly a long-time reporter for CNN and a news editor at the largest broadcaster in the Philippines, ABS-CBN.
In late 2016 she lifted the lid on a sustained campaign of online attacks on Rappler in a series of articles called Weaponizing the internet.
This triggered a fresh wave of violent threats and online abuse, detailed unflinchingly in a compelling address at the What is News Now? conference, where she was honoured with a special award for courageous and persistent reporting.
Instead of shrugging off or complaining impotently about the attacks, Rappler used online technology to prove how this was being done.
They tracked the Facebook accounts originating the abuse, insults and lies, and automated the collection of that information, creating a database of more than 12 million accounts spreading it.
"Propaganda that's said a million time becomes true and that is what technology enables today," she told Mediawatch.
"Once we started taking down the data we realised exactly what he was doing. It was occupying social media. Facebook is the public space in the Philippines and the algorithms allowed this to happen," she said.
"For decades journalists and news groups developed the standards and ethics to protect the public space. All of that has been up-ended and social media has replaced it with an algorithm that rewarded mob rule. The despots and authoritarian leaders figured it out," she said.
"In Myanmar, in Sri Lanka, in Cambodia, in Vietnam - every day Facebook doesn't act means somebody dies. That is a reality," she said.
Is it really a matter of life and death?
"I absolutely believe that," said Alan Soon, the co-founder of The Splice Newsroom which reports from Singapore on media trends in Asia.
"Facebook has walked into this not knowing that at this point of their development they would have created a system that rewards emotional responses. The testing they do is often in a US context - not understanding that if you're in Myanmar and buying your first mobile phone today, you may be opening Facebook because your entire family is on it and seeing all this stuff and not knowing how to respond to it," said Mr Soon, previously a managing editor for tech company Yahoo in Southeast Asia.
Facebook is not ignoring this problem. Indeed, Rappler and Maria Ressa are working with Facebook as one of its two partners in the Philippines, fact-checking and identifying sources of misinformation.
Facebook's public policy chief for Southeast Asia, Alvin Tan, told What is News Now? that Facebook had closed almost 800 million bogus accounts at the point of registration already this year, and that third-party fact-checking in 14 countries had reduced the distribution of specific items of fake news by 80 percent.
Alvin Tan said Facebook is expanding that work into other countries and getting to know the needs of each national market better.
"These are not platitudes. We are hiring people to help us know countries better. We need to be smarter about handling billions of pieces of content. We need humans and machine learning to reduce disinformation. I assure you we are doing all we can. Work with us as we are try to do more to get disinformation off our platforms," he said.
Next week Mediawatch looks at Facebook's efforts to confront the problems of misinformation and fake news and its relationships with the news media industry in the Asia Pacific region. Last week Mediawatch looked at responses from governments in some Asian countries.
Colin Peacock attended the What Is News Now? conference in Singapore with a travel grant from the Asia New Zealand Foundation.