Koo rolls out new safety features for proactive content moderation

Posted on 9:30 am

Microblogging platform Koo on Thursday announced the launch of new proactive content moderation features designed to give users a safer social media experience. The features, developed in-house, can proactively detect and block any form of nudity or child sexual abuse material in under five seconds, label misinformation, and hide toxic comments and hate speech on the platform, Koo said in a release.