NVIDIA AI Workbench running a project on Windows in WSL and exploring the workbench WSL file system

  • Published: 4 Jun 2024
  • NVIDIA's AI Workbench containerizes your ML and data science workloads on Windows using Docker and WSL. The Workbench itself runs in its own WSL instance.
    We clone an AI Workbench-compatible Python project from Git and run it in a local container. We then open a terminal into the AI Workbench WSL VM/instance and find our cloned project in its file system, where we can perform finer-grained Git or other operations beyond what Workbench supports out of the box.
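The workflow above can be sketched from the command line. This is a hedged sketch, not an official procedure: the distro name `NVIDIA-Workbench` and the project location are assumptions you should confirm on your own machine with `wsl --list`.

```shell
# Sketch: stepping into the AI Workbench WSL instance from Windows.
# The distro name is an assumption; confirm it with `wsl --list --verbose`.
WORKBENCH_DISTRO="NVIDIA-Workbench"

# On the Windows side (PowerShell or cmd):
#   wsl --list --verbose     # shows the Workbench distro among any others
#   wsl -d NVIDIA-Workbench  # opens a shell inside that distro
#
# Inside the distro, locate cloned project repositories (path assumed):
#   find ~ -maxdepth 3 -type d -name .git

# Print the command you would run on the Windows side:
echo "wsl -d $WORKBENCH_DISTRO"
```

Once inside the distro you are in an ordinary Linux environment, so standard tools (`git`, `ls`, `find`) work on the cloned project directly.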
    Related Videos
    * NVIDIA AI Workbench Topology on Windows and Linux: a first local project
    * NVIDIA AI Workbench running a project on Windows in WSL and exploring the workbench WSL file system
    * NVIDIA AI Workbench running a project on a remote server and ssh into the machine to see the files
    Git repository used github.com/fre...
    Blog: joe.blog.freem...
    I DO NOT see Workbench as an appropriate tool for casual users at the time of this video (2024/05), primarily because:
    1. An existing project can only be made Workbench-compatible by adding Workbench-specific files to your repository.
    2. The Git interface gives you no control over which changes are committed.
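Point 2 is one reason to drop into the WSL shell: from there you can stage exactly the files (or hunks, via `git add -p`) you want, rather than committing everything. A self-contained demo in a throwaway repository (file names are illustrative):

```shell
# Demo: selective staging, the finer-grained control the Workbench UI lacks.
set -e
repo=$(mktemp -d)                 # throwaway repo so this runs anywhere
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

printf 'model code\n'    > train.py
printf 'local scratch\n' > notes.txt

git add train.py                  # stage only the file you intend to commit
git commit -qm "commit train.py only"
git status --short                # notes.txt stays untracked: "?? notes.txt"
```

Inside the Workbench WSL instance, the same `git add` / `git add -p` workflow applies to your cloned project directory.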

Comments • 1

  • @iukeay
    1 month ago

    Have you played with WSL2 Dev Drives? I am working with 80-144 models, and I'm noticing that my read speeds just loading the models into GPU memory are about 10x slower than they should be.
    Just curious.
    Was going to move all my docker volumes to