Small changes in the large language models (LLMs) at the heart of AI applications can result in substantial energy savings, according to a report released by the United Nations Educational, Scientific ...
Intel has disclosed a maximum severity vulnerability in some versions of its Intel Neural Compressor software for AI model compression. The bug, designated as CVE-2024-22476, provides an ...
Large language models (LLMs) such as GPT-4o, along with other state-of-the-art generative models like Anthropic's Claude, Google's PaLM, and Meta's Llama, have dominated the AI field recently.
Pruna AI, a European startup that has been working on compression algorithms for AI models, is making its optimization framework open source on Thursday. The startup has been creating a framework that ...
Small language models, known as SLMs, create intriguing possibilities for higher education leaders looking to take advantage of artificial intelligence and machine learning. SLMs are miniaturized ...
Nota AI's model compression and optimization technology enables device-level deployment of the high-performance LLM EXAONE. The partnership leverages Nota AI's solution portfolio to expand EXAONE adoption ...
Donostia, Spain – April 8, 2025 – Multiverse Computing today released two new AI models compressed by CompactifAI, Multiverse’s AI compressor: 80 percent compressed versions of Llama 3.1-8B and Llama ...
Amazon has acquired edge chip company and AI model compression business Perceive for $80 million in cash. The cloud and ecommerce giant bought the division from publicly-listed technology company ...
San Sebastian, Spain – June 12, 2025: Multiverse Computing has developed CompactifAI, a compression technology capable of reducing the size of LLMs (Large Language Models) by up to 95 percent while ...
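CompactifAI's internals are not described in these reports, but the arithmetic behind such compression claims can be illustrated with one widely used technique: low-bit weight quantization. The sketch below is a hypothetical, generic example (not Multiverse's method, which reportedly uses tensor networks): storing float32 weights as int8 plus a per-tensor scale cuts raw weight storage by roughly 4x, and techniques such as pruning and low-rank factorization are typically stacked on top to reach higher compression ratios.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric 8-bit quantization: map floats to int8 with one scale factor."""
    scale = float(np.abs(weights).max()) / 127.0
    if scale == 0.0:
        scale = 1.0  # avoid division by zero for an all-zero tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 codes."""
    return q.astype(np.float32) * scale

# Toy "weight matrix" standing in for one layer of a model.
rng = np.random.default_rng(0)
w = rng.standard_normal((1024, 1024)).astype(np.float32)

q, scale = quantize_int8(w)
ratio = w.nbytes / q.nbytes   # int8 storage is a quarter of float32
max_err = float(np.abs(dequantize(q, scale) - w).max())
```

Here `ratio` is exactly 4.0 (1 byte per weight instead of 4), and the reconstruction error is bounded by half the scale step; real compression pipelines trade a small accuracy loss like this for the memory savings the articles describe.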