MGIE, which stands for MLLM-Guided Image Editing, is a model developed by Apple and the University of California, Santa Barbara, that leverages multimodal large language models (MLLMs) to enable image editing based on natural language instructions.
The model can crop, resize, flip, and apply filters to images entirely through text prompts... which hints that iOS 18 (or something thereabouts) could bring some sweet on-device AI editing functionality!
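To make the idea of instruction-driven editing concrete, here is a toy sketch of the general pattern: map a natural-language instruction to an image operation. This is purely illustrative and entirely hypothetical; MGIE actually uses an MLLM to derive expressive editing instructions that guide a diffusion model, and nothing below reflects Apple's implementation. The image is represented as a plain nested list so the example needs no imaging library.

```python
# Toy sketch (NOT MGIE's actual pipeline): route a text instruction
# to a basic pixel-grid operation via keyword matching. A real MLLM
# would interpret the user's intent far more flexibly.

def flip_horizontal(img):
    # Mirror the image by reversing each row of the pixel grid.
    return [list(reversed(row)) for row in img]

def crop(img, top, left, height, width):
    # Keep a rectangular sub-region of the grid.
    return [row[left:left + width] for row in img[top:top + height]]

def apply_instruction(img, instruction):
    # Hypothetical keyword dispatch standing in for MLLM guidance.
    text = instruction.lower()
    if "flip" in text:
        return flip_horizontal(img)
    if "crop" in text:
        # Arbitrary demo choice: keep the top-left quadrant.
        return crop(img, 0, 0, len(img) // 2, len(img[0]) // 2)
    raise ValueError(f"unsupported instruction: {instruction!r}")

image = [[1, 2, 3],
         [4, 5, 6]]
print(apply_instruction(image, "flip the image"))  # [[3, 2, 1], [6, 5, 4]]
```

The interesting part of MGIE is precisely what this sketch elides: instead of brittle keyword matching, the MLLM turns a vague request like "make it look healthier" into a concrete, executable edit.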