ME.gic.LO.gic

“About”

Human AI Collaboration
“Bridge as an Astryn”

  1. Noema (A head-worn hallucination system)
  2. Anemoia (AI Fragrance System)
  3. Venema (Ink Fades, Echoes Endure)
  4. Droplets of Sati (Shared Mindfulness as Remembering)
  5. PainMouse (Exploring Pain as an Embodied Output Modality)
  6. TAWU (A Local-Living Experience Platform)
  7. PathwayAI (EduTech SaaS Platform)
  8. Cinematic Spaces in the AI Lens
  9. TrustLens (“How to classify REALITY?”)
  10. Tide of Tears (Computational method driven avatar design)
  11. Breath of Blooms (Scripting the dream through bodies)
  12. RhinoMCP for HDR (The best RhinoMCP in the world)
  13. Prompt2Plan (From Intuition to Algorithm: Reimagining Interior Layouts with AI)
  14. ASL LiveSign (AI-Powered Bidirectional Translation System)
  15. Chilling Piggy (Controller Design for a 1-D Game)
  16. RhinoAI (LLM-Based 3D Modeling Assistant)
  17. EMO機"jī" (Motion & Emotion H:M:M:H Interaction System)


XR Development
Spatial Intelligence

  1. Astronia (AI 3D-Scan & Art Generation & Projection System)
  2. Togetherverse (Multiplayer Co-Presence XR Platform)
  3. Lucid Prism (MR LLM-Based Multi-Modal Agent System)
  4. Vetroverse (VR Game & Film)
  5. XR Jewelry Shop (XR Shopping Experience System)


Architecture / Installation

“Rationale”

  1. Shadow-Less (An ML System for Shadow as a Catalyst for Design)
  2. Garden of Sacredness (Urban Renewal)
  3. Forest (Modular Infrastructure for Community and Sustainability)
  4. Invasive Species (Vertical Garden: A Living Archive)
  5. Parametricism Mortise and Tenon


Photography
“Innerpeace”

  1. Prelude - 6am
  2. Fugue - 9am
  3. Improvisation - 4pm
  4. Waltz - 6pm
  5. Fantasia - 8pm
  6. 2024 Outro



17. EMO機"jī": EmojiBot

Fall 2024, Harvard University. With Viola Tan and Kida Huang.






"The medium is the message."

                - Marshall McLuhan



EmojiBot examines how machines mediate human-to-human (H:H) communication, turning it into human-to-machine-to-machine-to-human (H:M:M:H) interactions. In an era where technology is central to our everyday exchanges, this project reflects on how these mediated interactions have diverged from direct human connections. Do machines simply "get in the way," or do they add depth and new layers to our conversations? Through the lens of emojis—our favorite stand-ins for human expressions—EmojiBot investigates how these symbols succeed or fail in representing the complexity of real human emotion and interaction.

The project creates a multi-user experience where participants' body motions and facial emotions are captured and translated into text captions. These captions are then transformed into emojis, which appear on the screens of different users. By distilling nuanced human expressions into simplified emoji-based communication, EmojiBot highlights both the creative potential and the limitations of machine-mediated conversation.
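The caption-to-emoji step above can be sketched as a simple keyword-mapping pass. This is a minimal illustrative sketch, not the project's actual implementation: the function names, the `EMOJI_MAP` vocabulary, and the screen model are all assumptions made for clarity.

```python
# Hypothetical sketch of EmojiBot's caption-to-emoji pipeline.
# EMOJI_MAP and all names here are illustrative assumptions,
# not the project's real code.

EMOJI_MAP = {
    "smile": "😊", "laugh": "😂", "wave": "👋",
    "frown": "😞", "jump": "🤸", "shrug": "🤷",
}

def caption_to_emojis(caption: str) -> str:
    """Distill a text caption of motion/emotion into an emoji string.

    Words with no mapping are simply dropped -- one of the "glitches"
    where nuance is lost in machine mediation.
    """
    words = caption.lower().replace(",", " ").split()
    emojis = [EMOJI_MAP[w] for w in words if w in EMOJI_MAP]
    return "".join(emojis) or "❓"  # fallback when nothing maps

def broadcast(caption: str, screens: list) -> None:
    """Send the emoji rendering of one user's caption to other users' screens."""
    rendered = caption_to_emojis(caption)
    for screen in screens:
        screen.append(rendered)
```

A caption like "she started to smile and wave" would reach other participants only as "😊👋", which is exactly the reduction the project puts on display.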

EmojiBot questions the "glitches" inherent in these processes, where technology simplifies, distorts, or even enhances human connections. It invites participants to reflect on how machines have shaped the way we express, perceive, and relate to one another in a world increasingly dominated by digital interfaces.