Oracle adds AI capabilities to core database and launches a lakehouse platform (siliconangle.com)

🤖 AI Summary
Oracle announced general availability of Oracle AI Database 26ai and Autonomous AI Lakehouse, extending its strategy of embedding AI capabilities directly into core data platforms to support training and inference across cloud and on-premises environments. Database 26ai (LTS, replacing 23ai) brings native AI functions—AI vector search, Model Context Protocol server support, and an in-database agent framework—plus ONNX embedding compatibility, open agent/LLM integrations, and Apache Iceberg support. Security gets quantum-safe encryption for data at rest and in flight; performance is boosted by Exadata for AI, RDMA, tiered storage, and a Private AI Services Container for running model instances on customer infrastructure. Developers gain a no-code AI Private Agent Factory and natural-language app building in Application Express; existing 23ai customers can update at no charge (AI Vector Search included).

The Autonomous AI Lakehouse pairs the Autonomous AI Database with Apache Iceberg for a multi-cloud/hybrid data platform (OCI, AWS, Azure, GCP, Exadata Cloud@Customer) that's compatible with Databricks, Snowflake, and AWS Glue. It supports vector search on Iceberg tables, JSON/relational duality, property graph analytics, Select AI (NL→SQL), and a Data Lake Accelerator for dynamic scaling (limited availability). New cataloging, Exadata table caching, GoldenGate for Iceberg streaming, and table hyperlinking aim to simplify governance, reduce data movement, and enable complex AI workflows across structured and unstructured data—positioning Oracle to compete on integrated, enterprise-grade AI infrastructure.