Recently, YouTube, the world’s second-largest search engine and most popular video streaming platform, launched a new feature that warns users before they post potentially abusive or toxic comments.
The feature was rolled out to encourage people to be more respectful online and avoid unnecessary squabbles. As YouTube works to curb offensive comments and online abuse by issuing warning notes to users, it is also preparing more effective tools in YouTube Studio to help content creators automatically flag unwanted comments.
“Is this something you really want to share?”
This is the question YouTube wants commenters to pause over, because what seems normal to one person may not be to another. The feature notifies users that the comment they are about to post “may be offensive to others,” giving them a chance to review and edit it before publishing.
Power of the new YouTube comment warning feature
While the tool warns users about comments the platform’s algorithm deems offensive, it does not block them from posting.
A rise in hate speech has led to the termination and banning of thousands of creators and pieces of content on the platform. Last year, YouTube’s automatic filtering system flagged and removed over 46 times more hate speech content than ever before. These problems pushed the company to take steps to curb such a toxic atmosphere across the platform.
YouTube has also introduced a few other features to make the platform more inclusive and supportive of diverse communities.
“Today, we are introducing a fresh reminder in comments to help encourage respectful interactions. Now on Android, the reminder may pop up before posting a comment that may be offensive to others, giving the commenter the option to reflect before posting. However, users can continue to comment if they want to. Apart from the new reminder feature, YouTube has also made a few more updates to make YouTube a more inclusive platform,”
says a Google employee, as quoted on the Google support page.
The feature will soon be complemented by another filter update that holds potentially inappropriate comments for review, keeping them hidden from creators unless they opt to see them. This will further streamline comment-moderation tools and improve the user experience on the platform.
Future of content creation on YouTube
YouTube is trying to fight online abuse, partiality, discrimination, burnout, and other issues obstructing channel growth. Its teams are engaging with users from different walks of life, and the company will be asking YouTubers about the impact of different types of content on the site.
Creators will be surveyed regularly for information about their gender, age, interests, sexual orientation, race, ethnicity, culture, and traditions. This will help YouTube develop more inclusive rules, policies, guidelines, tools, and systems to protect user interests. The research will focus on the issues that negatively affect users and help ensure a safer place for them.