ROAR to help decode Facebook emoticons
A novel computer model called robust label ranking, or ROAR, could help mine social emotions by predicting emotional reactions such as love, haha and angry on Facebook posts, researchers say
While the trusty "like" button remains the most popular way to signal approval of Facebook posts, ROAR can help navigate the increasingly complicated ways people express how they feel on social media.
It could also lead to better analytics packages for social media analysts and researchers. "We want to understand the user's reactions behind these clicks on the emoticons by modelling the problem as a ranking problem -- given a Facebook post, can an algorithm predict the right ordering among six emoticons in terms of votes?" said Jason Zhang, a research assistant at Pennsylvania State University. "This is a step in the direction of creating a model that could tell, for instance, that a Facebook posting made in 2015 with a million likes, in fact, consists of only 80 per cent likes and 20 per cent angry," Lee said.
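The ranking problem Zhang describes can be made concrete with a short sketch. The snippet below is not ROAR itself (the paper's model is not detailed here); it is a minimal illustration, with hypothetical vote counts, of what the task asks a model to do: turn per-reaction votes into an ordering over the six emoticons, and score a predicted ordering against the true one with a pairwise agreement measure such as Kendall's tau.

```python
from itertools import combinations

REACTIONS = ["like", "love", "haha", "wow", "sad", "angry"]

def to_ranking(votes):
    # Order reactions from most- to least-voted; votes maps reaction -> count.
    return sorted(REACTIONS, key=lambda r: -votes[r])

def kendall_tau(rank_a, rank_b):
    # Kendall's tau in [-1, 1]: +1 for identical rankings, -1 for reversed.
    pos_a = {r: i for i, r in enumerate(rank_a)}
    pos_b = {r: i for i, r in enumerate(rank_b)}
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        # A pair is concordant if both rankings order x and y the same way.
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) > 0:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical vote counts for one post (not data from the study).
true_votes = {"like": 800_000, "angry": 200_000, "love": 50_000,
              "haha": 20_000, "wow": 10_000, "sad": 5_000}
# A hypothetical model's predicted ordering -- it swaps the last two reactions.
predicted = ["like", "angry", "love", "haha", "sad", "wow"]

print(to_ranking(true_votes))  # ['like', 'angry', 'love', 'haha', 'wow', 'sad']
print(round(kendall_tau(to_ranking(true_votes), predicted), 4))  # 0.8667
```

One mis-ordered pair out of fifteen still yields a high tau, which is why ranking metrics like this are a natural fit for evaluating reaction prediction.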
In early 2016, Facebook added five more buttons -- love, haha, wow, sad and angry -- alongside its like button as official emoticon reactions. But merely counting clicks fails to acknowledge that some emoticons are less likely to be clicked than others. For example, users tend to click the like button the most because it signals a positive interaction and is also the default emoticon on Facebook. For social media managers and advertisers, who spend billions buying Facebook advertisements each year, this imbalance may skew their analysis of how their content is actually performing on Facebook, said Dongwon Lee, Associate Professor at the varsity. The new model was trained using four Facebook post data sets including public posts from ordinary users, the New York Times, the Wall Street Journal and the Washington Post.
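Lee's point about imbalance can be illustrated with a toy calculation. The base rates and post counts below are entirely hypothetical (the article does not report such figures), and the base-rate division is a deliberately naive adjustment, not the method used by ROAR; it simply shows how raw click counts can hide a strong minority reaction behind the dominant default "like".

```python
# Hypothetical global click shares: "like" dominates because it is the default.
base_rate = {"like": 0.70, "love": 0.12, "haha": 0.08,
             "wow": 0.04, "sad": 0.03, "angry": 0.03}

# Hypothetical reaction counts on a single post.
post = {"like": 7000, "love": 480, "haha": 240,
        "wow": 100, "sad": 45, "angry": 600}

# Naive ranking: sort reactions by raw click count.
raw_rank = sorted(post, key=lambda r: -post[r])

# Crude de-biasing: divide each count by that reaction's global base rate,
# so a click on a rarely used button counts for more than a default "like".
adjusted = {r: post[r] / base_rate[r] for r in post}
adj_rank = sorted(adjusted, key=lambda r: -adjusted[r])

print(raw_rank)  # ['like', 'angry', 'love', 'haha', 'wow', 'sad']
print(adj_rank)  # ['angry', 'like', 'love', 'haha', 'wow', 'sad']
```

Under raw counts the post looks overwhelmingly liked, but once the default-button bias is discounted, anger emerges as the strongest signal -- the kind of hidden skew the researchers want their model to surface.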
The findings, presented at the 32nd AAAI Conference on Artificial Intelligence today in New Orleans, showed that the new model significantly outperformed existing solutions in precisely understanding social emotions.