’tween the devil and the deep blue sea

The devil mentioned in the President’s executive order on preventing online censorship is clearly the advertising and trending algorithm, now being targeted by powers inimical to democratic systems: “They have also amplified China’s propaganda abroad, including by allowing Chinese government officials to use their platforms to spread misinformation regarding the origins of the COVID-19 pandemic, and to undermine pro-democracy protests in Hong Kong.” The order expands on examples without naming companies.

The deep blue sea is Section 230, not unlike Section 66 of the Indian IT Act, which holds people responsible for the content they share on online platforms. India’s first debate around the rights of aggregation platforms versus content creators played out in 2004, with the arrest and jail term of the CEO of an online auction platform for allowing the upload and possible sale of pornography on his website. But the jury is still out!

So, what does this mean?

Since Cambridge Analytica blew open the role small campaign firms have played in influencing election results on social media, and since the EU’s adoption of the GDPR guidelines, the silent monitoring of users’ browsing habits via cookies, for marketing and other purposes, has become permission-based. On the other hand, with countries like Germany enacting their own Network Enforcement Act (NetzDG) to enable and enforce citizens’ rights to have offensive content monitored and removed, the opaque algorithms embedded by social media companies to promote or hide content for profit or bias are facing increasing scrutiny and censure.

On the back of Covid-19, and of vicious political attacks on social media being either amplified or hidden in pre-election periods, currently underway in the US and visible in India before the Delhi elections, the question of these online platforms’ responsibilities is now center-stage.

A quote from the executive order is telling: “When large, powerful social media companies censor opinions with which they disagree, they exercise a dangerous power. They cease functioning as passive bulletin boards, and ought to be viewed and treated as content creators.”

The urgency in the US, as in India, comes from the clear leanings of these platforms toward authoritarian regimes, possibly hoping to score an entry into China, the one country that has banned them from its population ab initio.

China, on the other hand, has allowed and enabled its tech sector to acquire a global presence, with a caveat that came into force with its 2017 National Intelligence Law. Article 14 of the law mandates that Chinese intelligence agencies “may ask relevant institutions, organizations and citizens to provide necessary support, assistance and cooperation.” That this has had a signaling effect on global telcos and governments, which have slow-pedaled orders to prominent Chinese 5G hardware makers, is no accident.

In India, a spoof on China and the coronavirus was pulled off TikTok, a Chinese-promoted video social media platform already on the verge of censure after videos exhorting youngsters to commit gender atrocities were discovered on it. Twitter, on the other hand, briefly pulled down a popular butter brand’s handle for sharing a spoof cartoon on “made in India” which lampooned Chinese imports. Authoritarians, whether they are censoring content or running countries, usually cannot take a joke!

In my opinion,

This is simply a chain of events unfolding on a path to community and global responsibility. No longer will “opaque algorithms” and the IPRs around them be accepted as data localization becomes the norm around the world. As nations realize the worth of their citizens’ data, more questions will be asked about what is being published, by whom, and to what effect. And certainly, global platforms that owe no explanation to anyone but their own engineers and funders will be under constant scrutiny as they try to hold close what they have promoted or censored, and for what purpose.

Social media is, at the end of the day, a public square, and all platforms must ensure they transparently follow a universal code of acceptable human conduct, even if business gains are compromised.

Parts of this post appear as a column in the June 2020 issue of Voice & Data magazine.
