[ol10_u0_developer_EPEL] python3-ramalama-0.6.1-1.el10_0.noarch

Name: python3-ramalama
Version: 0.6.1
Release: 1.el10_0
Architecture: noarch
Group: Unspecified
Size: 398015
License: MIT
RPM: python3-ramalama-0.6.1-1.el10_0.noarch.rpm
Source RPM: python-ramalama-0.6.1-1.el10_0.src.rpm
Build Date: Thu Feb 27 2025
Build Host: build-ol10-x86_64.oracle.com
Vendor: Oracle America
URL: https://github.com/containers/ramalama
Summary: RamaLama is a command line tool for working with AI large language models (LLMs)
Description:
RamaLama is a command line tool for working with AI large language models (LLMs).

On first run, RamaLama inspects your system for GPU support, falling back to CPU
support if no GPUs are present. It then uses a container engine such as Podman to
pull an OCI image containing all of the software necessary to run an AI model on
your system's setup, eliminating the need for the user to configure the system
for AI themselves. After this initialization, RamaLama runs AI models inside a
container based on that OCI image.
