1. Research Evolution
Throughout the development of this research, I gradually realised that the challenges AI brings to brand visual communication are far more complex—and far more fundamental—than I initially imagined.
At the beginning, my observations came from a practitioner’s perspective: AI-generated visuals often exhibited unstable colour shifts, stylistic drift, and inconsistent atmospheres. However, as I engaged with literature, industry analysis, and real cases, I found that these issues are not merely technical imperfections. Instead, they directly challenge the core foundations of a brand’s visual identity system. Visuals inherently carry a brand’s emotional expression, values, and personality. When AI enters the visual production process, its efficiency does not naturally translate into emotional resonance. Rather, it amplifies long-overlooked variables such as “authenticity,” “consistency,” and “cultural context.”
Subsequently, through expert interviews, I gained a more grounded industry perspective. A brand visual specialist noted that colour accuracy, stylistic consistency, and emotional precision are the three most fundamental, and most difficult to control, risk points when using AI in brand communication. Without robust mechanisms to safeguard consistency, the efficiency advantages of AI may come at the cost of weakening a brand’s visual identity. This made me realise that the uniqueness of brand visuals becomes even more fragile in the age of AI.
Through this process, the evolution of my thinking became increasingly clear:
From technical issues → to visual deviation → to emotional expression → to trust and authenticity → ultimately returning to the brand–consumer relationship itself.
In other words, “visual quality” is only a surface-level variable. What is truly being disrupted is the emotional connection mechanism between brands and their audiences.
As the research deepened, a more central question emerged:
How can fashion brands ensure that AI-generated visual content preserves, rather than undermines, consumers’ emotional connection to the brand?
This question became the anchor that shaped the next phase of the research: a series of three interventions.
2. Interventions
Intervention 1: Testing Emotional Acceptance
- Aim: Test whether consumer acceptance of AI-generated visuals changes depending on realism (verisimilitude).
- Findings:
- High realism reduced rejection.
- Brand trust moderated acceptance: consumers were more tolerant of AI visuals from brands they trusted.
- Emotional warmth and imperfection remained weak points.
- Context and generational differences shaped perceptions.
- Conclusion: Consumer attitudes shift quickly depending on visual elements; setting boundaries around realism and trust can reduce negative effects.
Intervention 2: Emotional Boundary Experiment
- Aim: Identify factors that define the “emotional non-harm boundaries” of AI in brand visuals.
- Key Dimensions:
- Platform/Media fit – effectiveness depends on context and quality.
- Transparency – disclosure increases trust but may lower warmth.
- Human models – replacement provokes backlash.
- Over-perfection/eeriness – reduces authenticity and trust.
- Conclusion: AI is welcomed in supporting roles (backgrounds, props, retouching) but replacing human figures undermines consumer connection.
Intervention 3: Participatory AI Visual Generation Experiment
- Aim: Explore whether participatory co-creation increases emotional connection.
- Method: Participants provided prompts (colour, scenario) integrated into AI visuals.
- Findings:
- Pre-test: Realism rated highest, but emotional resonance weak.
- Post-test: Warmth, resonance, and realism all declined. Yet 52.6% agreed “the brand understands me.”
- Participation shifted perception but did not generate strong emotional bonds.
- Conclusion: Co-creation fosters recognition (“I feel understood”), but without higher-quality outputs, it does not translate into genuine resonance.
3. Overall Insights
- Realism does not equal emotional resonance — AI can produce convincing visuals, but they often lack human warmth.
- Boundaries are clear — Consumers accept AI for backgrounds, props, and retouching, but reject it for core human figures.
- Transparency is a double-edged sword — boosts credibility but can reduce emotional warmth if overemphasised.
- Participation has potential but limits — being part of the process increases perceived understanding, yet quality gaps prevent deeper connection.
4. Practical Contribution
The findings across interventions form the basis of a Brand AI Usage Boundary Guide. This guide can help brands:
- Enhance efficiency while safeguarding emotional connection.
- Define safe vs. sensitive application zones.
- Combine participatory design with technical quality to balance perceived understanding with authentic resonance.
One-sentence summary:
AI-generated visuals can enhance efficiency and realism, but without clear boundaries and emotional safeguards, they risk undermining the uniqueness and connection that fashion brands rely on.