Weka AI™ Wins “Most Innovative AI Application Award” at Flash Memory Summit 2020
Shailesh Manjrekar. November 11, 2020
AI/ML applications are the fastest-growing applications in the data center today. These applications and their data pipelines are inherently different from traditional file-based applications. Artificial Intelligence/Machine Learning applications require vast amounts of low-latency, high-throughput flash storage. Cloud and enterprise data center architectures must be optimized to train deep neural networks and analyze petabyte-scale datasets, all while satisfying critical cost constraints. The rapid adoption of AI/ML applications is fueling tremendous growth in demand for flash memory. According to IDC, the combined flash memory and SSD (Solid State Drive) markets will grow to almost $90 billion in 2022. Advances like persistent memory, computational storage, and QLC technology must be combined with progress in 3D NAND flash to meet AI/ML applications' need for more data, faster.
Weka AI™ Is the Industry’s First Software-Defined Solution for Accelerated DataOps
Weka AI™ is designed to democratize the use of AI, ML, and Deep Learning for businesses with an AI-first strategy, by working with our ecosystem partners. The solution framework delivers an end-to-end solution for data scientists, Chief Data Officers, and Chief Analytics Officers to embark on their AI journeys. The framework enables them to leverage AI to monetize their data, launch new business models, and seamlessly embed AI in their existing business applications across verticals like Autonomous Vehicles / Mobility as a Service, Healthcare and Life Sciences, and the Financial Services Industry.
Weka AI™: The Industry’s Most Innovative Solution for AI/ML Applications
- Customer traction – Weka AI powers the Mobility-as-a-Service solution stack for marquee data-driven businesses: from AV sensor companies like Innoviz, to connected-car companies like Cerence doing conversational AI and NLP/NLU (natural language processing/understanding), to TuSimple using Weka AI for ADAS, all the way to mobility service providers like WeRide.
- Personas and workflow – Weka AI empowers data-driven personas to derive actionable intelligence using popular MLOps platforms and complements their workflows by enabling Accelerated DataOps.
- AI first – Weka AI™ works with leading MLOps platforms like Valohai, Open Data Hub, HPE Ezmeral, and Kubeflow to simplify customers’ AI journeys.
- 50x data pipeline acceleration – Weka AI™ is the industry’s only software-defined solutions framework to support GPU-accelerated computing, delivering 50x performance acceleration compared to CPU-based solutions. It provides a production-ready, end-to-end storage solution for the entire AI data pipeline: from ingest, to batch feature extraction, to hyperparameter optimization, to inference and versioning.
- Accelerated DataOps – Gartner defines DataOps as “a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and data consumers across an organization.” Weka AI enables Accelerated DataOps by delivering actionable intelligence, operationalizing exascale datasets, and providing governance with end-to-end security.
- Cloud first – Weka AI™ enables edge-to-core-to-cloud workflows by working with innovative hybrid platforms like AWS Outposts, as well as cloud frameworks like Amazon SageMaker.
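To make the pipeline stages named above concrete, here is a minimal, hypothetical Python sketch of an AI data pipeline flowing from ingest through batch feature extraction, hyperparameter optimization, inference, and versioning. All function names, the toy transforms, and the data structures are illustrative assumptions, not Weka AI APIs.

```python
# Hypothetical sketch of the pipeline stages listed above:
# ingest -> feature extraction -> hyperparameter tuning -> inference -> versioning.
# Everything here is a toy illustration; none of these names are Weka AI APIs.
from dataclasses import dataclass, field


@dataclass
class PipelineRun:
    raw: list                                       # ingested samples
    features: list = field(default_factory=list)    # extracted features
    best_params: dict = field(default_factory=dict) # tuned hyperparameters
    predictions: list = field(default_factory=list) # inference results
    version: int = 0                                # dataset/model version tag


def ingest(source):
    """Stage 1: pull raw samples from a data source."""
    return PipelineRun(raw=list(source))


def extract_features(run):
    """Stage 2: batch feature extraction (toy transform: double each value)."""
    run.features = [x * 2 for x in run.raw]
    return run


def tune(run, candidates):
    """Stage 3: hyperparameter optimization (toy: pick the largest scale)."""
    run.best_params = {"scale": max(candidates)}
    return run


def infer(run):
    """Stage 4: run inference with the tuned hyperparameters."""
    scale = run.best_params["scale"]
    run.predictions = [f * scale for f in run.features]
    return run


def version(run):
    """Stage 5: tag the run so the dataset/model lineage is reproducible."""
    run.version += 1
    return run


# Drive the whole pipeline end to end on a toy dataset.
run = version(infer(tune(extract_features(ingest([1, 2, 3])), [0.5, 1.0])))
```

In a real deployment each stage would read from and write to shared flash-backed storage rather than pass an in-memory object, but the hand-off pattern between stages is the same.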
Flash Memory Summit (FMS) is among the most respected flash storage programs in the industry, so to have Weka AI recognized with the coveted Best of Show Award is a tremendous honor. We launched Weka AI to simplify the use of AI/ML for businesses with an AI-first strategy. The reference architectures built with companies in the WTAP program deliver end-to-end pipeline acceleration for data scientists, Chief Data Officers, and Chief Analytics Officers embarking on their AI journeys. The high performance and low latency delivered with NVMe technology are imperative for modern AI applications, both on-premises and in the cloud. Customers using Weka AI today are already reaping the benefits within their AI data pipelines.