Images are the New Text in Social Media

“A picture is worth a thousand words” is an old adage that holds as true today as it did in 1911 when it was coined. Perhaps even more so: according to Twitter, 77% of all tweets about soft drinks include no textual reference to the subject; an image makes whatever point the author wants to make instead.

There are over 1,000 social media monitoring tools out there which use text analytics to analyse social media posts; if the soft drink statistic above holds for other product categories too, then these tools are failing their users - the marketing professionals.

What does the future of Market Research look like? Can it be democratised?

MROCs (Market Research Online Communities) and the ability to harvest posts from the social web and analyse them using artificial intelligence are changing the face of market research. No suspense here: Market Research will be democratised in the (hopefully not too distant) future. It will be available and affordable not just for blue chip multinationals with huge revenues but for companies of all sizes, including startups and SMEs (small and medium enterprises).

The way Market Research is conducted will of course differ from the past. For this industry and its services to be easy to access, affordable for any company, and fast to conduct, all of its processes will need to be adapted through and through.

Ideas such as ‘self service’ will become increasingly accepted and common, and companies will be able to order their market research in a seamless online process, just like consumers order products and pay using their credit card with the click of a button. One great success story of a ‘self service’ product in Market Research is of course SurveyMonkey, which has grown tremendously in its lifetime and is now moving into the corporate space.

I believe that the use of technology will be a huge factor in this shift in how the industry works, as it has the power to make research cheaper, faster, and easier. If it costs less (in both money and time) for a Market Research agency to conduct accurate research and analysis and reach actionable insights for its clients, the agency can viably offer its services to smaller companies with small MR budgets. That in turn means that companies can benefit from market research at any stage of their development and make better informed decisions. Market Research will no longer be a “luxury” that only the big players can afford and enjoy.

Furthermore, the possibilities of analysing and integrating data from various sources using technology are endless. This is not even about the future; it has already started. Technology, specifically machine learning, makes it possible to analyse text, audio, and visuals (images/videos) in a fast, affordable, automated way. Online communities enable agile and on demand MR. We can now ask, listen, and view in order to collect customer perceptions.

As described in the book ‘Zero to One’ by Peter Thiel, individuals and companies can create a monopoly for themselves instead of competing in an existing industry: create something new and be the only ones offering it. This is about starting a business with an idea that no one else had before – I think there is room to do exactly that in any industry, including MR. Going back to the SurveyMonkey example, even though they do have competition nowadays, when they first started it was a Zero to One situation; they were offering MR in the form of online surveys to individuals and companies who could previously not afford it. It would be very interesting to see (more) new ideas, new processes, and new technologies for MR in the future. So here’s to listening247, the SurveyMonkey of “social listening”, and communities247, the SurveyMonkey of “private online communities”, which will democratise the new market research industry even further.

Originally published on the Greenbook blog as part of The Big Ideas series.

Integration of Surveys, Social and Sales
The Triple S integration was the Holy Grail, and now it has been found

Integrating data from different sources is admittedly nothing new. Marketing Mix Modelling (MMM), albeit difficult and expensive, has been done before, with some really impressive results. The one kind of data that was and still is missing is unsolicited consumer posts on social media and other public websites, such as e-commerce and review sites. The reason this data has been missing is that it is unstructured, and traditional MMMs can only deal with time series.

DigitalMR has for the longest time been advocating that there is a way to accurately measure what people say on the web and use it to produce customer insights. Yes, there can be millions of posts out there, it is tough to harvest only the ones that are relevant, and there are so many languages; we say: so what? We have machine learning on our side, and if it is used properly we can expect wonders in the very near future.

DigitalMR and Nielsen presented the results of an R&D project at the ESOMAR MENAP 2017 conference, which aimed to discover correlations between social media monitoring metrics derived from posts in Arabic, retail sales, and tracking survey KPIs. The wonderful news is that the R squared between sentiment and sales is 0.81. What is more impressive is that the beta coefficient for positive sentiment and sales is double that of negative sentiment. There were not enough data points to establish a correlation between social metrics and survey KPIs in this instance. DigitalMR has done two other brand tracking integrations with social posts in the past (in the Dutch language), and both delivered very positive results in terms of visual correlation of trends. More important still, because the social data is as granular as we want it to be in terms of time periods, it can act as an early indicator for monthly brand health or NPS trackers.
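The kind of model behind numbers like these can be sketched in a few lines. The data below is made up for illustration (it is not the Nielsen/DigitalMR dataset), but the mechanics – an ordinary least squares regression of sales on positive and negative sentiment volumes, with R squared measuring goodness of fit – are the same:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.
    X is a list of predictor rows; an intercept column of 1s is prepended."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Solve the k x k system by Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, k))) / xtx[i][i]
    return beta  # [intercept, b_positive, b_negative]

def r_squared(X, y, beta):
    """Share of variance in y explained by the fitted model."""
    preds = [beta[0] + sum(b * x for b, x in zip(beta[1:], row)) for row in X]
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - p) ** 2 for yi, p in zip(y, preds))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Illustrative monthly data: [positive mentions, negative mentions] -> sales
X = [[120, 30], [150, 25], [90, 50], [200, 20], [170, 35], [110, 45]]
y = [1050, 1220, 880, 1500, 1330, 960]
beta = ols(X, y)
print("intercept, b_pos, b_neg:", beta)
print("R squared:", r_squared(X, y, beta))
```

In practice this would be run on real aligned time series, and the relative size of the positive and negative betas tells you which side of the sentiment ledger moves sales more.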

The results above suggest there is enough evidence for a new approach that we would like to coin: SSS = Surveys+Social+Sales – the triple S combo. This will be an ongoing tracking approach that combines social listening tracking with brand health or NPS trackers, as well as retail sales and distribution data from Nielsen reports. At some point a few years down the line, the role of tracking surveys will be reduced to a handful of questions, making SSS much more efficient. Social posts and retail sales from POS scanning can be tracked down to the minute, so we could soon be looking at daily or even hourly reporting… tracking surveys will need to either evolve into real time intercepts or become extinct altogether.
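The granularity point is the practical heart of SSS: daily (or hourly) social metrics can be rolled up to whatever period the slower-moving data sources use. A minimal sketch with made-up numbers, assuming a simple list of dated net-sentiment scores:

```python
from collections import defaultdict
from datetime import date

def monthly_rollup(daily):
    """Average daily net-sentiment scores by calendar month, so the
    social series can sit alongside monthly trackers and sales reports."""
    buckets = defaultdict(list)
    for day, score in daily:
        buckets[(day.year, day.month)].append(score)
    return {month: sum(scores) / len(scores)
            for month, scores in sorted(buckets.items())}

# Illustrative daily net sentiment (positive minus negative share of posts)
daily = [
    (date(2017, 1, 5), 0.30), (date(2017, 1, 20), 0.50),
    (date(2017, 2, 2), 0.10), (date(2017, 2, 25), 0.30),
]
print(monthly_rollup(daily))  # one value per month, ready to chart
```

The same daily series can be re-cut weekly or hourly without fielding a single extra survey question, which is what makes it usable as an early indicator.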

This post feels more like the announcement of a big scientific discovery than it probably should. A lot more work is needed to prove beyond reasonable doubt that social listening correlates with representative survey trackers. Just in case SSS becomes a thing, don’t forget that you learned about it here first… (This sounds like something a news channel anchor would say…!) One of our predictions, along with other futurists (years ago), was that the Marketing Director of the future is a journalist. Here we are, almost behaving like journalists in our effort to perform well at inbound marketing.

The role of Image Analytics in Social Media Monitoring

Until now, beyond the ability to access the relevant posts, text analytics and Natural Language Processing (NLP) have been the main disciplines required by social media monitoring tools. That is not the case anymore.

Tweets with an image get retweeted 150% more than those without one; they also get liked 89% more. According to Twitter, 77% of all tweets about soft drinks do not have a textual reference to a soft drink brand or anything related to the product category. What?!

So if you are Coca Cola or Pepsi Cola, using social media monitoring tools to crawl the web and harvest all the relevant posts about your brands, you will miss out on a great deal of them if you search based on keywords only. Even if you get hold of the 23% that do include one of your keywords, you will still have no idea what the images in those posts are about.

What if the author of a post wrote: “Music and beer…great combination!” and posted the image below?

[Image: a photo featuring a Heineken beer, with no brand name in the post text]

A social media monitoring tool would never tag this post as one about Heineken. Only a social listening tool specifically developed for marketing insights purposes, such as listening247, can offer this capability.

Given all the stats shared above – about the use of images in social media – it is unimaginable that any serious brand owner would continue to monitor only text in social media. Clearly, image analytics needs to become part of the insight management process of any company or organisation using social media listening.

How is it done?

Easy! You need a Data Scientist who can build you a convolutional neural network with over 15 layers – Deep Learning is anything over 4 layers – you find or create a training data set of at least 100,000 images to start with, you get your hands on a VERY powerful computer with multiple graphics processors (GPUs) and lots of RAM, or access to a Big Data infrastructure in the Cloud, and you train a model for a few days. Piece of cake!
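For the curious, the core operation that gives a convolutional network its name is simple enough to sketch in plain Python. A real 15-plus-layer network stacks many such filters, learns their weights from those 100,000+ images, and runs on GPUs; the toy kernel below is a hand-crafted vertical edge detector, not a learned one:

```python
def conv2d(image, kernel):
    """Valid (no padding) 2D convolution of a grayscale image with a kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [sum(image[i + di][j + dj] * kernel[di][dj]
             for di in range(kh) for dj in range(kw))
         for j in range(out_w)]
        for i in range(out_h)
    ]

def relu(feature_map):
    """ReLU activation, applied after each convolution layer in a CNN."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A tiny image: bright left half, dark right half. The vertical-edge
# filter responds strongly at the boundary between the two halves.
image = [[1, 1, 0, 0]] * 4
edge_kernel = [[1, -1], [1, -1]]
print(relu(conv2d(image, edge_kernel)))
```

Early layers of a trained network detect edges and textures much like this; deeper layers combine them into shapes, objects, and eventually logos and scenes.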

What Image Elements can be analysed and why do we analyse them?

There are a number of things that can be analysed in images that can be useful for market research:

  1. Logo detection
  2. Text extraction
  3. Object recognition
  4. Facial recognition to detect emotions
  5. Theme detection and captioning

Let’s talk about the use of each one of them separately:

  1. Logo detection is an obvious one; we need to be able to find the images that include the logos of the brands in the competitive set so that we can extract information and analyse it.
  2. Some of the Twitterati game the system by using text in an image, in addition to the 140 characters that Twitter allows for a tweet.
  3. Object recognition is useful when looking for a specific item or product where the logo is not visible.
  4. Detecting consumer emotions in images where brand logos appear can be very useful in understanding how consumers feel about a brand.
  5. By knowing what an image is about, we can cluster it under its respective discussion driver “bucket”. An image can now be part of a topic taxonomy for holistic semantic analysis.

The obvious conclusion is that text analytics alone no longer cuts it for social media monitoring; image analytics as described above is necessary in order to understand what consumers think and feel when they post online. 3rd Generation Social Listening is here. Click here to experience the DigitalMR “Magic Captioner” and its A.I. magic yourself.

The Market Research industry is finally catching up with Artificial Intelligence

During the past three weeks I travelled 10 time zones east and west of GMT, to present at three different ESOMAR events:

  1. MENAP Forum 2017 in Dubai on March 22nd
  2. UK member meet-up in London on March 30th
  3. LATAM Forum 2017 in Mexico City on April 7th

As a souvenir from Mexico City I brought back a broken foot, but… hey… no regrets, it was all worth it.

I have been an ESOMAR member for many years, initially as an agency side executive, and now as an entrepreneur, and overall I mainly have positive things to say about the premier organisation of our industry. I have to admit I was a bit worried at the beginning of this decade, mainly about the pace of adoption of innovation, but I think ESOMAR has now fully recovered and is on the ball again.

This is what I spoke about at the three events:

  1. The integration of social analytics with retail sales reports and brand survey tracking - together with Nielsen
  2. The importance of image processing for theme detection in social listening and analytics
  3. Social media listening case studies in LATAM

In all three of my presentations artificial intelligence and machine learning occupied centre stage. The one thing that makes me even more pleased than getting a speaker slot in those events is that DigitalMR was not the only agency that had something to say about the use of AI in discovering customer insights.

The hardest thing when innovation is introduced in an industry is to educate clients to use it effectively. The inertia that we had to endure during the past few years was relentless. Thankfully, the feeling I have after participating in these events is that there is change in the air. The fact that more people are talking about AI now means that we will finally get some traction in adopting these new methods in mainstream market research. After four years of hard work doing R&D and running pilots with early adopters, we may be nearing the phase in which the early majority starts to kick in.

Those of us in this field can use all the help we can get to establish machine learning as an acceptable way of analysing big data and integrating it with surveys and behavioural data. Having said that, we have to be really careful as an industry and set boundaries that will not allow aspiring tech companies to damage the image and reduce the value of what market research offers to its clients.

The same way ESOMAR once created the 28 questions that a client has to ask a vendor before they engage in online research (using access panels), we now need to define the parameters of acceptable market research standards around social listening, the use of natural language processing (NLP), and by extension artificial intelligence. Here is a list of 20 questions that DigitalMR proposes ESOMAR should use as a starting point to create those standards, in a way that is simple and hopefully easy to understand:

  1. Are the sentiment classifying algorithms based on Natural Language Processing (NLP) linguistic or statistical methods or both?
  2. What is the average sentiment accuracy achievable with the method used?
  3. How is sentiment accuracy defined?
  4. How exactly is the algorithm trained (if one is used) and how long does it take to get to the maximum achievable accuracy?
  5. In which languages can the vendor analyse sentiment and topics in an automated way?
  6. How long does it take to introduce a new language?
  7. How is noise (irrelevant posts) due to homonyms removed from the data set to be reported on?
  8. Are search terms used or is it an open ended inductive approach?
  9. Are posts weighted according to author influence? If yes, how?
  10. Is the profiling of people who post (by demographics and other variables) available?
  11. How are the harvesting sites selected?
  12. How are comments gauged and classified for sarcasm?
  13. Is the pricing based on the number of search terms researched?
  14. How is the reporting done; what are the deliverables?
  15. If Natural Language Processing is used, are adjectives classified as positive or negative in a library (rule based approach to define sentiment)?
  16. Is the vendor a technology company or a specialized market research agency?
  17. Can specific emotions be detected and analysed? If yes, which ones?
  18. Does the vendor use topic taxonomies to identify discussion drivers? What is their semantic accuracy?
  19. Is image processing for brand logo and more importantly theme detection available?
  20. Can the vendor integrate the harvested data flow with your brand tracking surveys, Nielsen retail reports, or other in-house data sources?

Let us know if you support this initiative and if you have any other questions that you would like to add. As always, feel free to tweet to @DigitalMR and @DigitalMR_CEO.