SOYL R&D Roadmap

Our staged R&D roadmap moves from a feasibility MVP (real-time emotion sensing plus an AR commerce demo) to a unified affect foundation model and a commercial SDK for B2B licensing. Key milestones: a functional adaptive AI salesperson within 18 months (Phase 3) and a proprietary foundation model in months 18–24.

Phase 1: Foundation MVP

Months 1–6

Build real-time emotion sensing with face, voice, and text detection, and create an AR commerce demo showcasing emotion-aware interactions (a minimal sketch of the target representation follows the milestone list below).

Key Milestones:

  • Real-time multimodal emotion detection pipeline
  • AR commerce proof-of-concept
  • Initial dataset collection and validation
  • Basic Emotion State Vector representation
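
As a minimal sketch of what a basic Emotion State Vector could look like, assuming per-modality probability scores over a fixed label set; the class, label set, and stub scores below are illustrative assumptions, not SOYL's actual design:

```python
import time
from dataclasses import dataclass, field

# Illustrative sketch: the fields, label set, and stub scores are
# assumptions for exposition, not SOYL's actual pipeline.

EMOTION_LABELS = ("anger", "fear", "joy", "neutral", "sadness", "surprise")

@dataclass
class EmotionStateVector:
    """Per-modality probability scores over a fixed emotion label set."""
    face: dict[str, float]   # e.g. from a facial-expression model
    voice: dict[str, float]  # e.g. from a speech-prosody model
    text: dict[str, float]   # e.g. from a text emotion classifier
    timestamp: float = field(default_factory=time.monotonic)

    def dominant(self, modality: str) -> str:
        """Return the highest-scoring label for one modality."""
        scores: dict[str, float] = getattr(self, modality)
        return max(scores, key=scores.get)

# Stub usage: real detectors would populate these scores per video frame.
esv = EmotionStateVector(
    face={"joy": 0.7, "neutral": 0.3},
    voice={"joy": 0.4, "neutral": 0.6},
    text={"joy": 0.8, "neutral": 0.2},
)
print(esv.dominant("face"))  # -> "joy"
```
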
Phase 2: Cognitive Signal Layer

Months 6–12

Develop a unified Emotion State Vector that fuses the multimodal signals into a coherent affect representation, and build the signal fusion architecture behind it (a fusion sketch follows the milestone list below).

Key Milestones:

  • Unified Emotion State Vector architecture
  • Signal fusion algorithms
  • Improved emotion detection accuracy
  • API v1 for emotion detection
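
One common approach, shown here purely as an assumption about how the fusion could work, is late fusion: each modality produces a probability vector over the same label set, and the vectors are combined with confidence weights:

```python
import numpy as np

# Illustrative late-fusion sketch; the fixed modality weights are an
# assumption, not the roadmap's actual design (which could instead
# learn the weights or use cross-modal attention).

EMOTION_LABELS = ["anger", "fear", "joy", "neutral", "sadness", "surprise"]

def fuse(per_modality: dict[str, np.ndarray],
         weights: dict[str, float]) -> np.ndarray:
    """Confidence-weighted average of per-modality probability vectors."""
    fused = np.zeros(len(EMOTION_LABELS))
    total_weight = 0.0
    for modality, probs in per_modality.items():
        w = weights.get(modality, 0.0)
        fused += w * probs
        total_weight += w
    return fused / max(total_weight, 1e-8)

fused = fuse(
    {
        "face": np.array([0.05, 0.05, 0.6, 0.2, 0.05, 0.05]),
        "voice": np.array([0.1, 0.1, 0.3, 0.4, 0.05, 0.05]),
        "text": np.array([0.0, 0.0, 0.8, 0.1, 0.05, 0.05]),
    },
    weights={"face": 0.5, "voice": 0.3, "text": 0.2},
)
print(EMOTION_LABELS[int(np.argmax(fused))])  # -> "joy"
```
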
Phase 3: Agentic Layer

Months 12–18

Create an adaptive AI salesperson, powered by LLMs, that responds dynamically to detected emotion states, targeting a functional agent by the end of this phase (an affect-adaptive prompting sketch follows the milestone list below).

Key Milestones:

  • Functional adaptive AI salesperson
  • LLM integration with emotion context
  • Dialogue manager with affect adaptation
  • Pilot deployments with partners
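
One way a dialogue manager could adapt to affect, sketched here as an assumption (the tone map and prompt wording are invented for illustration, not SOYL's actual dialogue manager), is to inject the fused emotion state into the LLM's system prompt:

```python
# Illustrative affect-adaptive prompting; TONE_BY_EMOTION and the
# prompt wording are assumptions, not SOYL's dialogue manager.

TONE_BY_EMOTION = {
    "joy": "match the customer's enthusiasm and suggest complementary items",
    "anger": "acknowledge frustration, slow down, and offer concrete help",
    "sadness": "use a warm, low-pressure tone and avoid upselling",
    "neutral": "be concise and informative",
}

def build_messages(user_utterance: str, detected_emotion: str) -> list[dict]:
    """Build a chat-style message list with the emotion state injected."""
    tone = TONE_BY_EMOTION.get(detected_emotion, TONE_BY_EMOTION["neutral"])
    return [
        {
            "role": "system",
            "content": (
                "You are a retail sales assistant. The customer currently "
                f"appears to feel '{detected_emotion}'. Adapt your style: {tone}."
            ),
        },
        {"role": "user", "content": user_utterance},
    ]

# The resulting message list can be passed to any chat-completion LLM API.
messages = build_messages("These headphones keep disconnecting.", "anger")
```
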
Phase 4: Foundation Model

Months 18–24

Develop a proprietary emotion-aware foundation model, trained on established multimodal emotion datasets such as IEMOCAP, CMU-MOSEI, and AffectNet (a toy architecture sketch follows the milestone list below).

Key Milestones:

  • Foundation model training and validation
  • Multimodal emotion dataset integration
  • Model performance benchmarks
  • Open-source contributions
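
As a toy illustration only, assuming precomputed per-modality embeddings and a simple concatenation-fusion head (the roadmap does not specify the actual architecture or embedding dimensions), a multimodal affect classifier could look like this:

```python
import torch
import torch.nn as nn

# Toy sketch: the embedding dimensions, projection sizes, and
# concatenation-fusion head are assumptions for illustration.

class AffectFusionModel(nn.Module):
    def __init__(self, face_dim=512, voice_dim=256, text_dim=768, n_emotions=7):
        super().__init__()
        self.proj = nn.ModuleDict({
            "face": nn.Linear(face_dim, 128),
            "voice": nn.Linear(voice_dim, 128),
            "text": nn.Linear(text_dim, 128),
        })
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(3 * 128, n_emotions))

    def forward(self, face, voice, text):
        # Project each modality into a shared space, then fuse by concatenation.
        h = torch.cat([self.proj["face"](face),
                       self.proj["voice"](voice),
                       self.proj["text"](text)], dim=-1)
        return self.head(h)  # logits over emotion classes

model = AffectFusionModel()
logits = model(torch.randn(4, 512), torch.randn(4, 256), torch.randn(4, 768))
loss = nn.functional.cross_entropy(logits, torch.randint(0, 7, (4,)))
```
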
Phase 5: Productization

Months 24+

Package the technology as commercial SDK and API offerings under a B2B licensing model, with enterprise integrations and partnerships (a hypothetical SDK sketch follows the milestone list below).

Key Milestones:

  • Commercial SDK release
  • Enterprise API platform
  • B2B licensing agreements
  • Scaled infrastructure
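
Purely as a hypothetical sketch of the developer experience (the client class, method names, and endpoint URL below are all invented for illustration; no such SDK exists yet per the roadmap):

```python
from dataclasses import dataclass

# Hypothetical SDK surface: SoylClient, its methods, and the endpoint
# URL are invented for illustration only.

@dataclass
class EmotionResult:
    label: str
    confidence: float

class SoylClient:
    """Sketch of what a B2B emotion-detection client could look like."""

    def __init__(self, api_key: str, endpoint: str = "https://api.soyl.ai/v1"):
        self.api_key = api_key
        self.endpoint = endpoint

    def detect_text(self, text: str) -> EmotionResult:
        # A real client would POST `text` to the endpoint; stubbed here.
        return EmotionResult(label="neutral", confidence=0.0)

client = SoylClient(api_key="sk-...")
result = client.detect_text("I love this product!")
```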

Success Metrics

  • Emotion Detection Accuracy: >90%
  • Real-time Latency: <100 ms
  • Multimodal Fusion Accuracy: >85%
  • Agent Response Relevance: >80% user satisfaction
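
One way the <100 ms real-time target could be verified, sketched under the assumption of a single pipeline entry point (`detect_emotion` below is a placeholder, not a real function), is to measure tail latency over repeated calls:

```python
import statistics
import time

# Latency-check sketch: detect_emotion stands in for the real
# multimodal pipeline; the sleep simulates per-frame inference cost.

def detect_emotion(frame):
    time.sleep(0.01)  # placeholder for actual inference

latencies_ms = []
for _ in range(100):
    start = time.perf_counter()
    detect_emotion(frame=None)
    latencies_ms.append((time.perf_counter() - start) * 1000)

# quantiles(n=20) yields 19 cut points; the last is the 95th percentile.
p95 = statistics.quantiles(latencies_ms, n=20)[-1]
print(f"p95 latency: {p95:.1f} ms (target: <100 ms)")
```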

Team & Partnerships

Our R&D team includes AI researchers, ML engineers, and product specialists working on cutting-edge emotion AI. We welcome partnerships with academic institutions and industry leaders.

Research References: Our work builds on established datasets and methodologies including IEMOCAP, CMU-MOSEI, and AffectNet.

For R&D partnerships or inquiries, contact: hello@soyl.ai