- Videos: 59
- Views: 334,880
Jesper Dramsch - Non-hype Machine Learning
Germany
Joined 19 Sep 2015
I build neural networks in Earth science and physics
Danger, here be Pythons. 🐍
I do real-world machine learning and here I share insights and nuggets I find while doing so.
Beyond the Hype: Content with ChatGPT and Midjourney
ChatGPT is pretty bad at certain tasks, but what if we could actually make it more useful in our daily lives?
--------------------------------------
Join the community:
🎁 latent.club
--------------------------------------
Watch it here: skl.sh/4cKzvqB
--------------------------------------
🎥 My Camera Gear
dramsch.net/r/gear
🎼 My Music
dramsch.net/r/epidemic
--------------------------------------
⏱️ Timestamps
0:00 Welcome!
--------------------------------------
👋 Social
💙 Linkedin: dramsch.net/linkedin
💜 Mastodon: dramsch.net/mastodon
🖤 Github: dramsch.net/github
🧡 Google Scholar: dramsch.net/scholar
🌍 Main Website: dramsch.net
--------------------------------------
📝 Disclaimer
Jesper Dramsch is a participa...
--------------------------------------
Views: 137
Videos
Unlock your Creative Potential with AI: ChatGPT for Content Creators @Skillshare
417 views · 1 year ago
Are you a content creator looking to take your creativity to the next level? Do you want to harness the power of generative AI to never run out of ideas again? Look no further! In this Skillshare course, led by AI expert Jesper Dramsch, you'll dive into the world of generative AI and learn how to harness the power of ChatGPT to revolutionize your content creation process. Whether you're a write...
Discussion at Real-world Perspectives in Machine Learning for Science
208 views · 2 years ago
Integrating ML in experimental pipelines - Gemma Turon
245 views · 2 years ago
How to Test your Machine Learning models - Goku Mohandas
1.8K views · 2 years ago
The most common issues applying ML - Mike Walmsley
180 views · 2 years ago
Evaluating Machine Learning Models - @ValerioMaggio
185 views · 2 years ago
Why and how we make ML reproducible - Jesper Dramsch
152 views · 2 years ago
Intro to Workshop on Real-world perspectives on Machine Learning
140 views · 2 years ago
Stop using random Splits - NormConf "Hallway Track"
341 views · 2 years ago
Put Art into Artificial Intelligence with Stable Diffusion 🎨 now on @Skillshare
498 views · 2 years ago
How to Guarantee No One Understands Your Machine Learning Project | @PyData Global Talk 2021
621 views · 2 years ago
100 Machine Learning tips and TRICKs to celebrate YouTube Partner💥
5K views · 2 years ago
the SECRET to get FREE research papers legally 🆓
579 views · 2 years ago
Craft a Top-Tier Data Science Resumé for your Career Transition into Tech on @Skillshare
1.3K views · 2 years ago
SSI Fellowship Application 2022 for Jesper Sören Dramsch
191 views · 3 years ago
Ugh. Stop spending $1597 on BootCamps?! 💻
7K views · 3 years ago
Data Science projects that KILL your job application 🔪
4.5K views · 3 years ago
How to HACK zero tech experience to a JOB 👔
1.3K views · 3 years ago
access Nvidia cloud GPU for FREE - 3 ways for Machine Learning in the cloud 💸
53K views · 3 years ago
Watch this BEFORE buying a LAPTOP for Machine Learning and AI 🦾
150K views · 3 years ago
STOP writing bad Data Science CODE with these 10 tools in VS Code 🐍
8K views · 3 years ago
Machine Learning in Geoscience - PhD Defence of Jesper Dramsch
10K views · 4 years ago
The No-Code Data Science Masterclass on @Skillshare
898 views · 4 years ago
Pydata Global is getting virtual conferences right with Gather.Town
1.1K views · 4 years ago
Data Science and Business Analytics in Python on @Skillshare
1.3K views · 4 years ago
science-backed strategies to LEARN ANYTHING | How I learned machine learning
2.1K views · 5 years ago
Machine Learning for Science [Trailer]
1.1K views · 5 years ago
Avoid LOSING your JOB to Artificial Intelligence - Humans vs AI - where humans excel
215 views · 5 years ago
Can I connect like 8 bad CPUs and train with that?
Watch the full course here: skl.sh/4cKzvqB
Thank you so much for this video
The music. Make it stop! jk. But please consider toning it way down. Thanks.
This was very helpful!!! Thank you ❤
Thanksssss 🎉
Technically, the G in GPU stands for general; it was just used for graphics a lot, so the meaning shifted. Now it's returning to its true, i.e. general, purpose, but you need CUDA and tensor cores.
Anything changed since this video? Any plans to make complete beginner videos on how to take your own data and train a model on it (using local and/or cloud GPUs like AWS, GCP, or Google Colab)? I see so many (open-source?) models on Hugging Face (LLaMA, Qwen-2.5-Coder-32B, etc.), but have no idea how many billions of parameters I need, or what they represent! And assuming you download one of these terabyte-sized models that have been pre-trained, do you still have to train it some more with your own local data? How? Alternatively, do you upload your data and train it in the cloud? At what point do you need to know/learn things like PyTorch or TinyGrad? I assume the models themselves are binary blobs, and so you must work via some kind of API? How do ChatGPT's MyGPTs and Claude's "Personal Projects" fit into this?
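For readers with the same question: pretrained weights from the Hugging Face Hub are not opaque binary blobs you call directly; they are loaded through a Python library, and further fine-tuning on your own data usually goes through that same library (for example its Trainer utilities or parameter-efficient methods). A minimal sketch, assuming the `transformers` package is installed and using a deliberately tiny example model (distilgpt2, chosen only for illustration, not a recommendation):

```python
# Minimal sketch: load a small pretrained model from the Hugging Face Hub
# and generate text. Assumes `pip install transformers torch`.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "distilgpt2"  # tiny example model, not a recommendation

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Machine learning on a laptop is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```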
The Minisforum MS-01 with an i9-13900H, 32 GB DDR5 RAM (max 96 GB) and 1 TB for 900 euros is a great base to start with if you don't want a big machine ... (with a cloud service)
I LOVE how you just got straight to the point
Appreciate it!
Instructions unclear: bought an AMD RX 7900 XT
I'm watching in 2024 and my machine, an i7-11800H with an RTX 3060, works like a beast 😊😊. So for now, no need to upgrade to a new machine.
Wait you got back early August
@JesperDramsch I'm just starting out with some ML/AI projects, and I already know I'll be working with fairly large datasets. I'd be interested in hearing your opinion on whether it would make sense to use my current Mac Pro with these specs by installing an Nvidia RTX 4090 and running Linux: 3.3 GHz 12-Core Intel Xeon W 256 GB 2933 MHz DDR4 Or do you think it would be more efficient to build a separate setup? Obviously, building a separate setup would cost me more since I work with Macs and I wouldn’t want to sell my current Mac. What do you think?
What about RTX 4090 for deep learning?
I'd say that a MacBook Air 15" and a Ryzen-based PC, both together at the price of a single MacBook Pro, offer the best value. And as far as an upgrade is considered: first, do the math, Ph.D.
Your GeForce RTX is an NVIDIA card, right? Got confused!
Wow, thank you a lot, I feel like I understand a lot better.
2024: is this advice still relevant? E.g. using only NVIDIA for deep learning.
Mostly yes. The situation is better with AMD, and LLMs make GPUs more central to the equation.
My bro, how do I set up my 2x 3090 for machine learning? They're collecting dust now. Sub + like even if you don't answer.
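On the two idle 3090s: a minimal sketch of getting both cards doing work, assuming PyTorch with CUDA is installed (the toy model and batch sizes are placeholders, and DataParallel is only the simplest option; DistributedDataParallel is the more common choice for serious training):

```python
# Minimal sketch: check that both GPUs are visible to PyTorch, then wrap a
# toy model in DataParallel so each batch is split across the two cards.
import torch
import torch.nn as nn

print("GPUs visible:", torch.cuda.device_count())  # expect 2
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))        # expect two RTX 3090 entries

model = nn.Linear(128, 10)                         # stand-in for a real model
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.to("cuda")

x = torch.randn(64, 128, device="cuda")
print(model(x).shape)                              # forward pass uses both GPUs
```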
How is the RTX 2050?
good video! keep up the good work
I am a complete computer novice and this video was so helpful. Thank you.
I'm so glad to hear this!
Hello sir, I have an i7 12th-gen laptop with no GPU and am looking to learn deep learning on it. Can my laptop handle it with Colab or Kaggle?
When you use generative AI features in Colab, Google collects your prompts, related code, generated output, related feature usage information, and of course your feedback.
What if I'm a physician and need data privacy? I can't just use an online notebook and lie about it in the paper 😅
Is it good to use the X Elite notebooks for machine learning and deep learning?
This is a really good video! I stumbled here and got close to buying an NPU mini PC. However, I think I am okay running Gemma 2 on Ollama and learning. Thanks. Subscribed!
Thank you 😊 I got all 3 now
Tremendous research work, thank you for taking the time to gather all this information and summarize it.
For GPUs: unless you're a deep learning guy with loaded pockets, it is not worth it to buy some RTX laptop or something with low VRAM, simply because these things require a LOT of compute. The free Kaggle GPUs are better in that regard.
Hey, just read your words. I'm new to this IT sector (web development and other things like AI/ML), like a freelancer you can say, for jobs and for exploring and studying purposes in this endless field, and I like this field. I want a laptop under 80k or 90k which can go to around 70k in a sale. First, my requirement is that it should last 5 or 6 years; being deep in this field, you know the future of AI and all of that. Should I buy a laptop with a dedicated GPU or not? I know for certain that I'm going to go deep into this field. Could you suggest a laptop with future-proofing, or laptops which I must avoid (e.g. a particular company or something like that)?
Can I do Ho 15 s
The YouTube algo gods finally decided to give me a relevant video rather than clickbait.
Neat! Welcome!
Most people keep pondering whether to buy an SUV when what they actually need is a cycle.
No one cares if you have a PhD or anything of that sort; just don't say it right at the beginning of the video. No one cares.
Well, it depends on the complexity of the task. Suppose you're working on time series forecasting: you could do that with either scikit-learn or TF/PyTorch, which are ML or DL respectively. By following the TF/PyTorch approach you may get better results thanks to the NNs, but it demands heavy requirements like CUDA for parallel computation to accelerate the process. Meanwhile, if you are satisfied with slightly worse results, you can just stick to classical ML, which doesn't demand much.
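To make the "classical ML is enough for many forecasting tasks" point above concrete, here is a minimal sketch assuming scikit-learn and a synthetic series; the lag-feature setup and model choice are only illustrative:

```python
# Minimal sketch: time series forecasting as supervised learning with lag
# features and a scikit-learn regressor, no GPU or deep learning required.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)

n_lags = 10  # predict the next value from the previous 10
X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
y = series[n_lags:]

# Time-ordered split: train on the past, evaluate on the future (no shuffling).
split = int(0.8 * len(X))
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out future:", model.score(X_test, y_test))
```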
I am interested in 4D seismic inversion for predicting Sw, Sg, or pressure. I went through your GitHub but I do not see the data input; could you please share details about that? I am also researching applying ML in oil & gas, especially in geoscience. Many thanks for sharing!
Could you please recommend any old workstation laptops?
Hugging Face, Keras, Kaggle 🎉🎉
Hi! I am also starting to learn AI and ML now. Can you please help me with a few things? 1) After what amount of time will I need a better laptop, or can I do it on my current laptop? Right now I have an office laptop with an Intel i5 10th-gen U-series processor and integrated graphics. 2) Since I am starting to learn, where should I start for AI & ML? 3) Is the Asus ROG Flow X13 2023 a good option? It has a Ryzen 9 7940HS and an Nvidia RTX 4050 6GB (60W). I want this one because it is super portable and would also help in taking notes since it is a touchscreen. Also, is 16GB RAM enough in the laptop? It would be great if you could help me out a bit. Thanks!
Awesome post. Can you elaborate a bit more on the mechanics of a content-based filter? Thanks a lot.
Thank you for the wonderful work and for sharing the application of ML in geoscience. Really appreciate it.
Thanks man
He is so pretty...
Want a video on classification vs clustering, ummm, more of a supervised giant vs unsupervised giant!!
Do I need to upgrade my GTX 1660 Ti GPU?
Was looking for an EDC work laptop that can handle personal-scoped AI/ML on my own air-gapped network at home. That way I can keep the models and data private and connect the laptop by cable to train the model when needed. Appreciate this extremely detailed information and how it all relates to AI and ML. My very own private air-gapped AI 😀 Sounds like I'm better off setting up a tower AI build and going with maybe a laptop with one of the new Intel Core Ultras for on-the-go EDC.
Such a helpful video! I need an update for 2024 products. Can't decide on which laptop to buy.
Thanks! Honestly it mostly still holds 😅
Great explanation video! One thing I would have liked to hear more about is the dominance of the Nvidia CUDA framework. It seems to me that a lot of ML Python libraries are compiled to work with the CUDA framework, and therefore one would need to run them on Nvidia hardware. That's the advantage Nvidia has, because it started developing the CUDA framework 20 years ago and was miles ahead of everyone else in the field of deep learning. As you said, things like TensorFlow are just starting to support Apple silicon / aarch64 / arm64 architectures. But Nvidia continues to innovate with RAPIDS (cuDF vs. Pandas) and with NVLink on their DGX A100 and DGX H100 (8 GPUs with 80GB VRAM each, all linked together). However, with respect to a laptop for ML, would it make sense from the perspective of a DevOps use case? Rather than using the laptop to train a huge LLM (Llama 2, Falcon 40B, Mistral, etc.), what if I just want to test a few of the prepackaged Nvidia NGC containers in Docker, add some additional Python packages/libraries to them, and test training on a smaller dataset and smaller model to confirm that things work, and then move the container over to a cloud like Amazon AWS and run it on Nvidia A100 or DGX A100 resources to do the full training? Would laptops with Nvidia GPUs (for Docker, Kubernetes, VMware) for DevOps testing purposes be useful or not at all? Thanks.
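On the "test locally, then move to cloud GPUs" workflow described above: a minimal smoke test one might run inside such a container before committing to bigger hardware, assuming PyTorch with CUDA (the layer sizes and fake data are arbitrary placeholders):

```python
# Minimal sketch: verify the CUDA stack works and that one training step runs,
# before scaling the same container/script up on cloud GPUs.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
print("CUDA available:", torch.cuda.is_available())
if device == "cuda":
    print("Device:", torch.cuda.get_device_name(0))

model = torch.nn.Sequential(
    torch.nn.Linear(32, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(256, 32, device=device)   # fake batch as a placeholder
y = torch.randn(256, 1, device=device)

loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
print("One training step OK, loss:", loss.item())
```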
Thank you Jesper!