Analytics company Cloudera has launched Cloudera AI Inference, an inference service powered by Nvidia NIM microservices and accelerated computing (Tensor Core GPUs) that the company claims boosts LLM speeds by up to 36x. Integration with Cloudera’s AI Model Registry strengthens security and governance by managing access controls for both model endpoints and operations. Cloudera AI Inference protects sensitive data from leaking to […]
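As context for the endpoints mentioned above: Nvidia NIM microservices serve models behind OpenAI-compatible HTTP APIs, so a client typically posts a JSON chat-completion body to a governed endpoint. The sketch below only builds such a request body; the endpoint URL and model name are illustrative placeholders, not details from the article.

```python
import json

# Hypothetical endpoint URL -- a placeholder, not from the article.
NIM_ENDPOINT = "https://inference.example.com/v1/chat/completions"

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON body for an OpenAI-compatible chat completion call,
    the request shape NIM-style microservices commonly expose."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return json.dumps(payload)

# Example: request body for a hypothetical hosted model.
body = build_chat_request("meta/llama-3.1-8b-instruct", "Summarize this report.")
```

In a governed deployment like the one described, the actual POST to `NIM_ENDPOINT` would also carry an authorization token checked against the model registry's access controls.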