Revisiting Emotional Authenticity: Reflections on Unit 3 Interventions

1. Rediscovering My Question

At the beginning of Unit 3, my central research question was:

“How can fashion brands ensure that AI-generated visuals maintain — rather than weaken — emotional connection with consumers?”

This question arose from my curiosity and concern about the growing adoption of AI within fashion’s creative production.
While AI imagery offers efficiency and limitless aesthetic possibilities, I observed an emerging discomfort: brand visuals, though increasingly polished, seemed to lose warmth, trust, and emotional resonance (Gu et al., 2024).
Unit 3 therefore became an opportunity to test this tension between efficiency and empathy through action research.


2. Intervention 1 – Realism vs. Warmth

Method: Participants viewed highly realistic AI-generated fashion visuals and described their feelings before and after exposure.

Findings: Greater realism reduced initial rejection and improved perceived credibility.

Value: When brands establish clear creative boundaries in AI use, they can reduce negative effects on trust and emotional attachment.

These results echoed Gu et al. (2024): excessively perfect or synthetic imagery can provoke a mild uncanny effect, diminishing perceived warmth.
Perfection, I realised, is not synonymous with authenticity — brands must navigate this emotional threshold carefully.


3. Intervention 2 – Boundaries and Transparency

Building on the first study, I explored which factors most influence emotional acceptance of AI visuals.

Method: Participants compared three conditions:

  1. AI replacement of a human model
  2. AI enhancement of background or detail
  3. AI-generated visuals openly labeled as such

Findings:

  • Replacing human models was the most emotionally sensitive “red line.”
  • Background enhancement was widely accepted.
  • Transparency increased trust yet slightly reduced warmth.

Value: This defined the safe and sensitive zones of AI integration.
Moderate enhancement plus clear disclosure built confidence, while full AI substitution triggered rejection.
Transparency and creative restraint jointly sustain brand trust.


4. Intervention 3 – Participatory Co-Creation

Inspired by Rindfleisch and O’Hern (2015) and by Rezwana and Maher (2022), I invited participants to co-create brand visuals with simple AI tools.

Findings:

  • Co-creation enhanced participants’ sense of being understood by the brand.
  • Low-quality AI output quickly reduced warmth and trust.
  • High-quality generation was essential for emotional resonance.

Value: Participation itself became emotional engagement.
The process revealed that connection grows not only from outcomes but from shared authorship.
Used collaboratively, AI can bridge the gap between creator and audience rather than widen it.


5. Evolving Understanding of Emotional Connection

Across the three interventions, my view of emotional connection evolved profoundly — from a reaction to visuals toward a relational process among designers, brands, and audiences.

Key insights:

  • Connection arises when people feel seen and understood, not when images appear flawless.
  • Open, transparent, and collaborative AI use strengthens empathy between humans and brands.
  • When AI replaces or conceals human creativity, the connection weakens or disappears.

This shift refocused my inquiry: from analysing AI-generated imagery to examining how AI reshapes emotional dynamics in creative communication.
My refined guiding question became:

“Can AI serve as a medium for empathy — rather than merely a tool for efficiency — within brand storytelling?”


6. Reflections on Practice and Learning

Unit 3 pushed me from theoretical curiosity into tangible experimentation.
I learned to capture emotional data beyond surveys, noting tone, gestures, and subtle behavioural cues.

What I learned:

  • Creative research is cyclical and uncertain; discovery often emerges through ambiguity.
  • Emotional evidence is qualitative and relational, not numerical.
  • Designing emotionally aware AI experiments built my methodological confidence.

This phase prepared me to extend my research in Unit 4 — from perception to creative behaviour — to observe how AI alters design processes, authorship, and emotional expression in real-time contexts.


References (Harvard Style)

Gu, C., Jia, S., Lai, J., Chen, R. and Chang, X. (2024) ‘Exploring Consumer Acceptance of AI-Generated Advertisements: From the Perspectives of Perceived Eeriness and Perceived Intelligence’, Journal of Theoretical and Applied Electronic Commerce Research, 19(3), pp. 2218–2238. doi: 10.3390/jtaer19030108. Available at: https://www.mdpi.com/0718-1876/19/3/108 (Accessed: 19 October 2025).

Rezwana, J. and Maher, M.L. (2022) ‘Identifying Ethical Issues in AI Partners in Human-AI Co-Creation’, arXiv preprint. Available at: https://arxiv.org/abs/2204.07644 (Accessed: 19 October 2025).

Rindfleisch, A. and O’Hern, M. (2015) ‘Customer co-creation: A typology and research agenda’, Review of Marketing Research, 12, pp. 275–296. Available at: https://experts.illinois.edu/en/publications/customer-co-creation-a-typology-and-research-agenda (Accessed: 19 October 2025).

