Originally Published on Unplugtech.com
Video is one of the most important marketing assets for any company, whether service- or product-based. But it is hard to find out what a user actually feels while watching a video. Obviously, we cannot ask everyone, and our devices have no built-in sensors for this task. Today we have a startup that has developed a headset, the 'Affect Lab EEG Headset', to understand the psychological responses of test users (and analyze them using AI) while they experience the content. Isn't it interesting? Let's dive into it –
How does it work?
Affect Lab has an 'Affect Lab EEG Headset' which you put on your test user while they consume the content. During consumption, the headset tracks the user's brainwave impulses to map emotional and cognitive responses. This data is then stored, analyzed using AI, benchmarked and shown visually on a dashboard. Check out the image below –
Live demo –
They have a demo on the website as well, showing the average pattern generated by test users while watching OLX's ad. The pattern is categorized into Attention, Appreciation, Complexity, Valence and Activation. The pattern for each frame/second is recorded and shown on the dashboard. You can see the demo here or click on the GIF below.
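To get a feel for what such a dashboard aggregates, here is a minimal sketch of averaging the five per-second metrics across test users. Only the metric names come from the demo; the data layout, function name and sample values are my own assumptions, not Affect Lab's actual format.

```python
# Hypothetical sketch of dashboard aggregation. Metric names are from the
# demo; everything else (structure, values) is assumed for illustration.

METRICS = ["attention", "appreciation", "complexity", "valence", "activation"]

def average_patterns(user_patterns):
    """user_patterns: list of per-user readings, each a list (one entry per
    second of video) of {metric: score} dicts. Returns the per-second
    pattern averaged across all users, as a dashboard might plot it."""
    n_users = len(user_patterns)
    seconds = len(user_patterns[0])
    return [
        {m: sum(u[t][m] for u in user_patterns) / n_users for m in METRICS}
        for t in range(seconds)
    ]

# Two made-up test users watching a 2-second clip:
users = [
    [{m: 0.5 for m in METRICS}, {m: 1.0 for m in METRICS}],
    [{m: 0.75 for m in METRICS}, {m: 0.5 for m in METRICS}],
]
averaged = average_patterns(users)
# averaged[0] holds the crowd's scores for second 1, averaged[1] for second 2
```

In reality the raw EEG signal would go through far heavier processing (filtering, artifact removal, model inference) before any such scores exist; this only illustrates the per-second, per-metric shape of the output.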
What’s more interesting –
They have a case study on this page describing how the tool can help advertisers, taking OLX's ad as an example. They took two videos – one short version and one long version.
The short version was made by cutting clips out of the long video, guided by the analysis from their algorithm. Check out the image below –
By doing this, the viewership rate of the short video came out 123% higher than that of the longer video, and attention increased by 19.25 points as well.
This is something I found interesting because it can save plenty of time for advertisers and makers – though of course, it still depends on the effectiveness of the algorithm and the data set. It could become a go-to tool for advertisers, small marketers, focused social media content makers, and even for research purposes in the long run.
Who made this?
Affect Lab is made by team Entropik, the same team that made chromo.io – which uses touch-based emotion-recognition technology, machine learning and the same 'EEG headset' to understand users' emotional responses while they experience your app.
(Yes! I know you must be thinking they have just changed the implementation but have nearly the same product. Yeah! But it's still good. What do you think?)
Ranjan Kumar, Founder & Chief Executive Officer, Entropik adds, “Knowing your consumer has always been the core priority for any business. But the approach to building Artificial Intelligence that learns on personalized wearables, smartphones and IoT sensors has opened a new depth to know consumers. Smartphones have been the first technology since the advent of human history that have outnumbered the human population and we are still scratching just the surface. IoT likewise has started peaking globally as a data source.”
So far they have raised $200k (nearly Rs 1.36 crore) in their first round of funding from Dileep Bhatt (president of downstream operations at JSW Steel Ltd) and Milind Chaudhary, director of Sea Global Services Pvt. Ltd.
Competitor – Affectiva
Do you know of any other competitor to this startup? Please let us know in the comments below; we will include (and link) it in this blog.
I found Affect Lab on Tracxn; you can use that tool to find some interesting startups as well.