Integrated Llama3.2 model for AI features
The new highlight of the open source version of EGOCMS is the integration of the Llama3.2 model. This preconfigured model is started automatically in the development environment, allowing developers to test and use the AI features of EGOCMS locally. We have adapted the Docker Compose environment and the initialisation script for this purpose.
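As a rough illustration of what such a Docker Compose adaptation can look like, here is a hypothetical sketch using Ollama as the model runtime. The service name, image, and volume name are illustrative assumptions, not taken from the actual EGOCMS repository:

```yaml
# Hypothetical sketch: wiring a local model runtime (here: Ollama)
# into a Docker Compose development environment.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # default Ollama API port
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded model weights

volumes:
  ollama-data:
```

An initialisation script would then typically pull the configured model once (e.g. `ollama pull llama3.2:1b`) so that it is ready when the environment starts.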
Local AI development: With the integrated Llama3.2 model, developers can test AI functions directly in their local development environment. This speeds up the development process and enables seamless integration of AI-supported functions into their projects.
Preconfigured and ready to use: The model comes preconfigured, so no additional setup steps are required. Developers can get started immediately and take advantage of the AI features of EGOCMS.
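To give an idea of how a locally running model can be addressed, here is a minimal sketch that builds a request for Ollama's `/api/generate` endpoint, which serves models such as Llama3.2 on `localhost:11434` by default. The endpoint and field names follow Ollama's public API; the helper function name and the exact setup in EGOCMS are assumptions:

```python
import json

# Default address of a locally running Ollama instance (assumption:
# the development environment exposes the standard port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(prompt: str, model: str = "llama3.2:1b") -> str:
    """Serialise a non-streaming generate request for Ollama as JSON."""
    return json.dumps({
        "model": model,       # which locally pulled model to use
        "prompt": prompt,     # the text to complete
        "stream": False,      # return one JSON object instead of chunks
    })

# Sending the request (requires the development environment to be running):
#
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL,
#       data=build_generate_payload("Summarise this page.").encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```

Because the model runs entirely on the local machine, no content leaves the development environment during such tests.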
The Llama3.2:1b model is configured by default, as it requires only modest resources (approx. 1 GB of main or graphics memory). Other, larger models (e.g. Llama3.1:8b) can also be integrated; their resource consumption is significantly higher, but so is the quality of their output.
The release of EGOCMS as open source under the GPLv3 licence and the integration of the Llama3.2 model are important milestones on our way to shaping the future of content management. We invite all developers and interested parties to become part of our community, explore the source code and contribute to the further development of EGOCMS.
Click here to go directly to the gitlab/egotec/egocms project.