About
YVIC Research Lab studies how pretrained representations shape what language models can and cannot do after deployment. The work focuses on the geometry of hidden states and how this structure constrains compression, adaptation, and prompt control. A central goal is to understand when semantic structure survives aggressive model compression, and how inherited representation geometry limits what downstream training or prompting can realistically change.
Research interests
- Preserving semantic structure in highly compressed multilingual models
- Understanding prompt and prefix control through representation-level analysis
- Characterizing how inherited representation geometry constrains post-pretraining adaptation
- Practical deployment of compact models under strict efficiency and privacy limits
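The question behind the first and third interests, whether similarity structure survives aggressive compression, can be made concrete with a small experiment: quantize a set of hidden-state vectors and measure how well pairwise cosine similarities are preserved. The sketch below uses synthetic data and a generic symmetric int8 quantizer as a stand-in baseline; it is illustrative only and does not describe the lab's actual methods.

```python
import numpy as np

def cosine_sim_matrix(X):
    # Normalize rows; pairwise dot products then give cosine similarities.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def int8_quantize(X):
    # Symmetric per-tensor int8 quantization, a common compression baseline.
    scale = np.abs(X).max() / 127.0
    return np.round(X / scale).astype(np.int8) * scale

rng = np.random.default_rng(0)
H = rng.normal(size=(64, 256))            # stand-in for model hidden states
S_full = cosine_sim_matrix(H)
S_quant = cosine_sim_matrix(int8_quantize(H))

# Off-diagonal agreement: how much of the similarity structure survived.
mask = ~np.eye(64, dtype=bool)
r = np.corrcoef(S_full[mask], S_quant[mask])[0, 1]
print(f"similarity-structure correlation: {r:.3f}")
```

A correlation near 1.0 means the quantized representation still orders pairs of states the same way; the interesting regimes are lower bit widths or structured pruning, where this agreement starts to break down.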
Luna (on-device system)
Luna is an offline research prototype used to test manifold-preserving compression methods in realistic deployment conditions. The system runs entirely on device and integrates local retrieval with Metal-accelerated inference on consumer hardware.
Luna is presented as a research artifact rather than a commercial system.
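The retrieval side of a system like Luna can be illustrated with a toy sketch: embed a small local corpus, then answer queries by cosine-similarity search with no network calls. Everything here is hypothetical, the placeholder hashing embedder and the tiny corpus are inventions for illustration, and none of it reflects Luna's actual implementation; it only shows the shape of an on-device retrieval step.

```python
import hashlib
import numpy as np

def embed(texts, dim=128, seed=0):
    # Placeholder embedder: average of random projections of hashed words.
    # Purely illustrative; it only fixes the interface, not Luna's embedder.
    rng = np.random.default_rng(seed)
    proj = rng.normal(size=(2**16, dim))
    vecs = []
    for t in texts:
        ids = [int.from_bytes(hashlib.md5(w.encode()).digest()[:2], "big")
               for w in t.lower().split()]
        v = proj[ids].mean(axis=0)
        vecs.append(v / np.linalg.norm(v))
    return np.array(vecs)

docs = ["metal accelerated inference on apple silicon",
        "manifold preserving compression of embeddings",
        "local retrieval keeps user data private"]
index = embed(docs)  # built once, stored on device

def retrieve(query, k=1):
    # Cosine-similarity search over the local index; no network calls.
    scores = index @ embed([query])[0]
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

print(retrieve("compression of embeddings"))
```

The design point the sketch captures is that both the index and the query embedding live on the same machine, so retrieval adds no privacy surface beyond local storage.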