This on-demand webinar, presented by AIM in collaboration with Intel, shows attendees how to get the most out of a deep learning neural network model while achieving fast AI inference on a CPU.
The session introduces Intel’s OpenVINO™ toolkit, an open-source toolkit for optimizing neural network models and deploying them across a range of hardware platforms. A live demonstration walks through setting up and running OpenVINO to achieve real-time AI inference on a CPU.
Participants will cover three key topics:
- Accelerating AI inference with your CPU
- Exploring OpenVINO’s toolkit for optimizing and running inference on deep learning models
- Getting started with OpenVINO in just 5 minutes
Speaker: Zhuo Wu, AI Evangelist, Intel