Social Labs Newsletter, OCTOBER 2025
Five Minutes Well Spent
What's new in October 2025?
Highlights, insights, and exciting opportunities—let's dive in!
Social Advisors
Conversational Mission: From Conversation to Action, Qualitative Research that Moves the Market
Mariana arrived five minutes early to the video call. She's the mother of two, has lived in the same neighborhood for a decade, and knows supermarket prices and the brands that "really work" like the back of her hand. Next to her, on separate screens, are four other women from her city: different stories, same daily challenge. They're not here to "answer questions"; they're here to converse.
On the other side, a member of your team opens the session. There's no improvisation: an AI Conversational Agent, trained on the client's problem, whispers (via audio and discreet prompts) the script that was generated by AI minutes earlier. It's not a rigid questionnaire; it's a living roadmap that adapts in real time to tone, emotions, and the silences that also speak.
This Conversational Mission isn’t “a long survey with cameras.” It’s a hyper-segmented qualitative community: for example, 5 homemakers in a specific city, with fine-grained criteria (age ranges, purchase ticket, lifestyles). The meeting happens on the video platform you prefer (Zoom, Meet, Teams). Social Advisors doesn’t tie you to a tool; it frees you to listen.
During the conversation, participants see and react to:
- Videos (prototypes, ads, A/B tests).
- Products and photos (packaging, new variants, claims).
- Conversation scripts generated by AI and refined by the human team.
Everything (audio, video, screen sharing, gestures, laughter, and doubts) is recorded. But here's where the magic happens: it doesn't stay raw.
As soon as the session ends, Social Advisors automatically imports the material. Then what used to take weeks happens:
- Transcription and cleaning: AI converts audio to text, identifies speakers and timestamps.
- Intelligent decomposition: detects topics, subtopics, key moments, memorable quotes, and reactions to materials.
- Emotional and subjective analysis: maps feelings (joy, frustration, surprise), intensity, and changes throughout the session.
- Concern detection: “price,” “performance,” “safety,” “ease of use”… what worries them and why.
- Semantic clustering: groups opinions by affinity to reveal patterns, not anecdotes (see the sketch below).
- Real-time actionable insights: what to do now with stoplights, expected impact, and recommendations.
The result: a living qualitative dashboard that doesn't stop at "what was said" but surfaces what matters for decision-making.
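Curious what the semantic clustering step looks like under the hood? Here is a minimal sketch of the general technique, grouping quotes by meaning rather than keywords using sentence embeddings; the model name, cluster count, and sample quotes are illustrative assumptions, not Social Advisors internals.

```python
# Minimal sketch: grouping participant quotes by semantic affinity.
# Assumes the sentence-transformers and scikit-learn packages;
# model, cluster count, and quotes are illustrative assumptions.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

quotes = [
    "The price feels too high for what you get",
    "I don't trust the discount, it looks fake",
    "The new packaging is much easier to open",
    "My kids love the smaller pack for school",
    "It costs more than the brand I usually buy",
]

# Encode each quote into a dense vector that captures meaning, not keywords.
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(quotes)

# Group vectors by affinity: semantically close quotes share a cluster.
labels = KMeans(n_clusters=2, n_init="auto", random_state=0).fit_predict(embeddings)

for label, quote in sorted(zip(labels, quotes)):
    print(f"cluster {label}: {quote}")
```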
What is the Conversational Agent?
It’s a cognitive interface that lets you explore results and generate analysis through natural language:
- During the session: suggests routes, follow-ups, and dynamics to the Advisor if it detects signals (e.g., confusion or excitement).
- After the session: you can ask it with your voice or text:
  - "At what minute did distrust about the price appear?"
  - "Compare the reaction to claim A vs. claim B."
  - "Summarize in 5 bullets the main barriers and suggest a creative test."
 
It works both in live conversations with the agent and in queries over data or knowledge already indexed on the platform. In short: you talk to your data as you would to your team, and you get actionable answers with context and evidence.
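To picture how a question like the first one above could be answered over indexed transcripts, here is a rough retrieval sketch using embedding similarity; the segments, speakers, and timestamps are invented for illustration and don't reflect the platform's actual agent.

```python
# Sketch of answering a natural-language query over indexed transcript
# segments via embedding similarity. Data layout and timestamps are
# invented for illustration; this is not the platform's actual agent.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

# Indexed transcript segments: (timestamp, speaker, text).
segments = [
    ("00:04:12", "Mariana", "Honestly, that price makes me suspicious."),
    ("00:11:30", "Lucia",   "The pack looks nice, I'd try it once."),
    ("00:17:05", "Carmen",  "If it's that cheap, is the quality any good?"),
]

def search(query: str, top_k: int = 2):
    """Return the transcript segments most similar to the query."""
    texts = [text for _, _, text in segments]
    emb = model.encode(texts + [query])
    doc_vecs, query_vec = emb[:-1], emb[-1]
    # Cosine similarity between the query and every segment.
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    best = np.argsort(sims)[::-1][:top_k]
    return [segments[i] for i in best]

for ts, speaker, text in search("distrust about the price"):
    print(ts, speaker, text)
```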
Why does it work better than “traditional” qualitative?
- Speed → From “we had the group yesterday” to “we have the insights today.”
- Quality → Emotional and strategic analysis with clips anchored to each insight.
- Flexibility → From 5 homemakers to hard-to-recruit B2B niches.
- Built for action → Recommendations focused on decisions, not pages.
We’ll show you live how a conversation with a qualitative group from your segment can give you the clarity you’re missing today.
Social Advisors — emotional, strategic, and actionable qualitative in real time.
Write to us to start your first Qualitative Mission.
Social Labs
IA Recognition + Pattern Recognition: From Scattered Data to Clear Decisions
Each week your teams generate surveys, opinions, and visual evidence from communities and activations. Volume grows; time doesn’t.
At Social Labs, we’re taking research and community activation to a new level. IA Recognition and Pattern Recognition turn surveys, opinions, and visual evidence into actionable insights and advanced KPIs in a matter of minutes.
Below we explain what they are, how they work, and how they apply to your missions.
IA Recognition — What is it?
Multimodal cognitive analysis technology that detects patterns in social, textual, and visual data. It natively integrates with Social Advisors to transform responses and evidence into AI-powered reports, heuristic insights, and recommendations.
Functionality (how it operates)
- Active multimodal recognition: today it supports text and images; the next version will add audio and video.
- Mission-adaptive learning: tunes models to the context (segment, city, category).
- Automated reports: produces KPIs, insights, and recommendations ready to share.
- Integration with Social Advisors: enriches segmentations and community profiles.
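For the text side of that multimodal pipeline, a zero-shot classifier gives a feel for how open responses can be tagged with mission themes without training a model per study; the themes, sample response, and model choice below are assumptions for illustration only, not IA Recognition internals.

```python
# Illustrative sketch: tagging an open survey response with mission
# themes via zero-shot classification. Themes and model choice are
# assumptions for the example, not IA Recognition internals.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

themes = ["price", "performance", "safety", "ease of use"]
response = "It cleans well, but I worry it could damage delicate fabrics."

# multi_label=True scores each theme independently.
result = classifier(response, candidate_labels=themes, multi_label=True)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
```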
Benefits
- Depth + context: interprets intent, themes, and sentiment, not just metrics.
- Analysis speed: from days to minutes.
- Consistency: unifies criteria across missions and teams.
- Scalability: useful for one-off studies or always-on programs.
Which missions does it apply to?
- Open/closed surveys (opinions, NPS, satisfaction).
- Photographic evidence (product in use, shelf, activations).
- Combined missions (text + image), and soon video.
Recommended mission types
- Concept & Product Fit (pre/post test).
- Purchase/Consumption Experience.
- Retail & Merchandising (visibility, execution).
- Qualitative Community (short task series with open prompts).
Pattern Recognition — What is it?
Technology for discovering and understanding patterns in social, textual, and visual data that transforms scattered signals into interpretive insights, comparable KPIs, and actionable recommendations.
How it works:
- Ingestion: consolidates open/closed responses, photographic evidence (and soon audio/video).
- Pattern detection: identifies themes, drivers, and barriers, and how they co-occur.
- Prioritization: weights impact by segment, channel, city, or wave (tracking).
- Evolution: tracks changes and breaks over time to anticipate trends.
- Output: dashboard + AI report with KPIs, explanations, and next actions.
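A toy version of the pattern-detection step might look like this: count how often coded themes appear together across responses and rank the pairs. The themes and data are invented for the example.

```python
# Toy sketch of theme co-occurrence detection: count how often coded
# themes appear together across responses and rank the pairs.
# Themes and data are invented for illustration.
from collections import Counter
from itertools import combinations

# Each response has already been coded with one or more themes.
coded_responses = [
    {"sugar-free", "small pack", "on-the-go"},
    {"sugar-free", "on-the-go"},
    {"price", "small pack"},
    {"sugar-free", "small pack", "on-the-go"},
    {"price"},
]

pair_counts = Counter()
for themes in coded_responses:
    # Sorted tuples so ("a", "b") and ("b", "a") count as one pair.
    pair_counts.update(combinations(sorted(themes), 2))

# Surface the combinations that co-occur most often.
for pair, count in pair_counts.most_common(3):
    print(f"{' + '.join(pair)}: {count} responses")
```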
Features and differentiators
- Thematic clustering with intent: groups by motivations, not just keywords.
- Driver/barrier map: quantifies what propels/inhibits and where to act first (illustrated in the sketch after this list).
- Useful co-occurrences: detects winning combinations (e.g., sugar-free + small pack + on-the-go).
- Weak signals and anomalies: early surfacing of changes by subgroup or channel.
- Temporal evolution: trajectories of themes and sentiment across waves, cities, and targets.
- Mission-adaptive learning: tunes models to the specific context (category, shopper, market).
- Integration with Social Advisors: enriches segmentations and ensures consistency across missions.
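As a sketch of how a driver/barrier map could be quantified, one common approach (not necessarily the one the product uses) relates theme presence to an outcome such as purchase intent and reads the signed coefficients as drivers or barriers; the data below is invented.

```python
# Illustrative driver/barrier sketch: logistic regression of purchase
# intent on theme presence; positive coefficients suggest drivers,
# negative ones barriers. Data and model choice are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

themes = ["price concern", "easy to use", "safe for kids"]

# Rows: respondents. Columns: did the theme appear in their answers (1/0)?
X = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 1],
    [0, 1, 1],
    [1, 0, 1],
    [0, 0, 1],
])
# Outcome: stated purchase intent (1 = would buy).
y = np.array([0, 1, 1, 1, 0, 1])

coefs = LogisticRegression().fit(X, y).coef_[0]
for theme, weight in sorted(zip(themes, coefs), key=lambda t: t[1]):
    kind = "driver" if weight > 0 else "barrier"
    print(f"{theme}: {weight:+.2f} ({kind})")
```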
Where is it applied?
- Concept & Product Fit (pre/post test).
- Purchase/Consumption Experience (occasions, rituals).
- Retail & Merchandising (visibility, price/promo, facing).
- Brand/Category Tracking (trends and consistency across waves).
- Qualitative communities (short series with open prompts).
- Combined missions (text + image; and soon video).
Recommended mission types
- Post-activation: identify which messages/elements explain the real uplift.
- Packaging optimization: detect attributes that co-occur with choice and repeat purchase.
- Promos and pricing: see combinations that trigger trial by segment/channel.
- CX & loyalty: locate frictions that erode NPS and what to fix first.
What’s next — Video Recognition
What is it? An extension to interpret visual and expressive information in interviews and mission videos. It combines verbal content, tone of voice, and micro-expressions to enrich the insight.
What it unlocks:
- Nonverbal emotional layer: validates whether a claim truly resonates beyond what’s said.
- Advanced creative testing: compares spontaneous reactions to communication pieces and detects inconsistencies.
- Real usage diaries: observes context and consumption rituals without recall bias.
Example output:
“Segment A reacts positively to claim X (↑ subtle smiles, ↑ intonation) but verbalizes doubts about price; adjust value message and pack size to convert interest into trial.”
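A simplified way to picture that readout: per time window, compare a verbal sentiment score against nonverbal signals and flag the windows where they disagree. The scores, weights, and thresholds below are invented to mirror the example above, not the upcoming feature's actual model.

```python
# Simplified sketch of fusing verbal and nonverbal signals per time
# window and flagging mismatches (e.g., smiling while voicing doubts).
# All scores, weights, and thresholds are invented for illustration.
windows = [
    # (start, verbal_sentiment, facial_valence, vocal_arousal), each in [-1, 1]
    ("00:02:00", 0.6, 0.7, 0.5),
    ("00:05:30", -0.4, 0.6, 0.4),  # says "too expensive" but smiles
    ("00:09:10", -0.5, -0.3, 0.1),
]

for start, verbal, facial, arousal in windows:
    nonverbal = 0.7 * facial + 0.3 * arousal  # weighted nonverbal read
    if verbal < 0 < nonverbal:
        print(f"{start}: verbal doubts but positive nonverbal signals "
              f"(verbal={verbal:+.1f}, nonverbal={nonverbal:+.1f})")
```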
The question is no longer “what do the data say?”, but “what are they asking us to do?”
With IA Recognition you transform responses and photos into meaning and speed; with Pattern Recognition you turn that meaning into concrete levers for product, marketing, retail, and experience. And with Video Recognition, arriving soon, you’ll add the emotional and expressive layer that completes the picture.
Go from observing to orchestrating.
Book a demo and take your next Social Advisors mission from scattered data to clear decision in one week.
Social Advisors
Platform Updates
In addition to IA Recognition and Pattern Recognition, this month we elevated the platform with targeted improvements that connect conversation, analysis, and execution.
The Smart Corporate Messenger is now fully integrated into the dashboard: a new unified interface for admins and users reduces clicks and makes navigation clearer. We added chatbot management to deploy and query conversational agents within conversations, and we introduced quick actions that execute operational tasks (such as assigning a mission or sharing a report) without leaving the chat. We also improved group collaboration with greater stability and clearer statuses, making it easier to track decisions and to-dos in real time.
The Missions Dashboard has been refreshed with a one-page view that concentrates strategic KPIs on a single interactive screen. New visual components—bars, pie charts, and geospatial maps—allow you to read trends and regional differences quickly, while KPI Cards highlight what’s critical for instant comprehension. For integrations, we enabled KPI APIs that expose mission metrics to your external systems, and we added data labels on charts to interpret results effortlessly and speed up decision-making.
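As a taste of what pulling mission KPIs into an external system could look like, here is a hypothetical integration sketch; the endpoint URL, token, and field names are placeholders, not the documented API contract.

```python
# Hypothetical sketch of pulling mission KPIs into an external system.
# The endpoint, auth header, and field names are placeholders and NOT
# the documented Social Advisors API contract.
import requests

BASE_URL = "https://api.example.com/v1"   # placeholder, not the real host
TOKEN = "YOUR_API_TOKEN"                  # placeholder credential

resp = requests.get(
    f"{BASE_URL}/missions/MISSION_ID/kpis",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()

# Iterate over whatever KPI entries the endpoint returns.
for kpi in resp.json().get("kpis", []):
    print(kpi.get("name"), kpi.get("value"))
```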
Exciting News
Social Labs helps you understand and connect better with your market. Join us and co-invest in initiatives that drive impact and transformation.
Thank you for continuing to trust us!
Best regards, and see you in the next newsletter.
The Social Labs Team
Passion for Software & Love for Innovation




