This Swift Playground provides an interactive experience for applying artistic style transfer to images using Apple's CoreML framework. Inspired by apps like Prisma and Lucid, this playground allows users to experiment with neural style transfer in just a few hundred lines of code.
- Real-time Style Transfer: Apply predefined artistic styles to images.
- Powered by CoreML: Utilizes machine learning models optimized for iOS.
- Minimal Code Complexity: Achieves high-quality style transfer with efficient and concise Swift code.
- Interactive Playground: Designed for experimentation within Xcode's playground environment.
- Xcode (Latest version recommended)
- macOS with support for CoreML
- Compatible with iOS 11+ devices
- Clone this repository:
git clone https://github.com/llogaricasas/CoreML-StyleTransfer.git
- Open the .playground file in Xcode.
- Run the playground and interact with the style transfer feature.
This playground leverages a CoreML model trained for neural style transfer. The model takes an input image and applies a predefined artistic style using a convolutional neural network (CNN). Inference is optimized for real-time execution on Apple devices, providing an engaging and seamless user experience.
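The pipeline described above can be sketched in a few lines of Swift using the Vision framework, which wraps CoreML inference and handles image scaling. This is a minimal illustration, not the playground's actual source: the model class name `StyleTransferModel` and the function `applyStyle(to:completion:)` are assumptions standing in for whatever model is bundled with the playground.

```swift
import CoreML
import Vision
import UIKit

// Sketch of a style-transfer call, assuming a compiled Core ML model
// class named `StyleTransferModel` (hypothetical — the real class name
// comes from the .mlmodel file added to the playground).
func applyStyle(to image: UIImage, completion: @escaping (UIImage?) -> Void) {
    guard let cgImage = image.cgImage,
          let mlModel = try? StyleTransferModel(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        completion(nil)
        return
    }

    // Vision runs the CNN and returns the stylized frame as a pixel buffer.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let observation = request.results?.first as? VNPixelBufferObservation else {
            completion(nil)
            return
        }
        // Convert the stylized pixel buffer back into a UIImage.
        let ciImage = CIImage(cvPixelBuffer: observation.pixelBuffer)
        let context = CIContext()
        guard let output = context.createCGImage(ciImage, from: ciImage.extent) else {
            completion(nil)
            return
        }
        completion(UIImage(cgImage: output))
    }
    request.imageCropAndScaleOption = .scaleFill

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Using Vision rather than calling the model's `prediction` method directly is a common design choice here, because it takes care of resizing and cropping the input image to the dimensions the network expects.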
Developed by Llogari Casas.
This project is open-source under the MIT License. Feel free to use, modify, and contribute!

