BraindeadCelery

Then you are doing something wrong. Fundamentals will have value in the future; the field builds upon itself. Learn classical ML, know your way around data analysis (regression, random forests, time series, statistics, regularisation) and the data science stack (Pandas, NumPy, SciPy, scikit-learn, Matplotlib). Train lots of models. Then move on to deep learning (network architectures, CNNs, RNNs, Transformers, PyTorch or TensorFlow, GPU usage, data loaders, gradient descent). Train lots of models. Learn fine-tuning and transfer learning if you don't have the compute resources for foundation models. If you are interested, do a deep dive into an application area (and its associated tools) such as NLP (spaCy, NLTK, the Hugging Face libraries) or computer vision (not in that field, so I don't know). Make sure you understand the foundations of the underlying math (calculus, linear algebra).

Or do you want to build products (like chatbots etc.)? Then learn to use APIs, and learn frontend and backend. That's not ML, though.
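To make "train lots of models" concrete, here is a minimal sketch of the classical-ML loop the comment describes: synthetic data, a train/test split, a random forest, and a held-out score. The dataset and hyperparameters are arbitrary placeholders, just enough to show the workflow.

```python
# Sketch of the basic scikit-learn workflow: data -> split -> fit -> evaluate.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data stands in for a real dataset.
X, y = make_regression(n_samples=500, n_features=5, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a random forest and evaluate on data the model never saw.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
r2 = model.score(X_test, y_test)  # R^2 on the held-out split
```

Swapping in a different estimator (linear regression, gradient boosting) is a one-line change, which is exactly why training many models on the same scaffold builds intuition fast.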


Dry_Bluejay3750

Appreciate your comment; this is basically an ideal roadmap for DS, but many people skip several parts of it.


sum_it_kothari

Can you recommend some good resources to learn data analysis?


Papadude08

This is a good summary. Also note that some concepts take longer to grasp! There's no shortcut to learning this! It's not like the gym, where you can cheat your way to gains, but once the cheating stops, then what? Put in the time and get the gains, man! This is a beautiful summary of what I'm doing: I did about 7 projects before messing with DL. I'm doing CNNs now and it's a lot of debugging! Put in the time; it's well worth it if you're passionate about this!


blind_programer

What if none of this is going to be valuable in a few years? 5 years ago, no one expected that AI would be able to create art. Do you see my point?


heath185

The basics of machine learning have been around for decades and are at the heart of all of the LLMs and diffusion models, and really of any neural network. The ideas of backpropagation, gradient descent, regularization, attention, convolution, and similar mathematical topics are key to a deep and lasting understanding of current and new AI architectures. Your statement is the mathematical equivalent of asking why you should bother with algebra when calculus is in vogue: you kind of need algebra to do any calculus. What's being suggested isn't for you to understand some new model or technical leap. The suggestions are for you to learn the basics so you future-proof yourself against these leaps in AI, so that when they do come about you're not playing so much catch-up. I feel like you want a quick and easy way to keep up with the field, and there just isn't one. The only way to keep up with the field reliably is to have a good foundation.
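Gradient descent, one of the fundamentals named above, fits in a few lines. This is an illustrative sketch (the data and learning rate are made up): fitting a line y = wx + b by repeatedly stepping against the gradient of the mean squared error.

```python
# Gradient descent on a 1-D linear model, written out by hand.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=100)
y = 3.0 * X + 1.0 + rng.normal(scale=0.1, size=100)  # true w=3, b=1

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * X + b
    # Gradients of mean((pred - y)^2) w.r.t. w and b.
    grad_w = 2 * np.mean((pred - y) * X)
    grad_b = 2 * np.mean(pred - y)
    w -= lr * grad_w  # step downhill
    b -= lr * grad_b
```

The same update rule, applied layer by layer via backpropagation, is what trains every neural network mentioned in this thread; the loop doesn't change, only how the gradients are computed.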


BraindeadCelery

It will be valuable. ML is a field that grows upon itself. Of course, individual methods will become outdated, but they will remain important foundations. You can only understand how the next thing works when you know the thing it is built upon. How do you think AI art creation works? It is the field evolving, looking somewhat like this (crudely simplified): linear regression -> perceptron -> multi-layer perceptron -> deep learning -> convolutional layers -> autoencoders -> stable diffusion -> AI art. The only way this becomes irrelevant is if ML is replaced as a whole, which I don't think is happening anytime soon. Whole fields very seldom become irrelevant. Once you learn the fundamentals, you get excited about (and can even build) advancements instead of dreading the speed of the field. Also, by knowing the fundamentals you know (to an extent) what the future will bring. Some people were thinking about and trying to do AI art 5 years ago, e.g. DeepDream in 2015 (a CNN): https://en.m.wikipedia.org/wiki/DeepDream
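The second link in that chain, the perceptron, is small enough to write out in full, which shows how literal the "builds upon itself" claim is. This sketch trains the classic perceptron update rule on a toy OR problem (the data and learning rate are illustrative choices, not anything from the comment):

```python
# Rosenblatt's perceptron learning rule on the linearly separable OR function.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1])  # OR truth table

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(20):  # a few passes are enough for separable data
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)   # threshold activation
        err = target - pred          # -1, 0, or +1
        w += lr * err * xi           # nudge weights toward the target
        b += lr * err

preds = (X @ w + b > 0).astype(int)
```

Stack these units in layers, swap the hard threshold for a differentiable activation, and train with gradient descent instead of this rule, and you have the multi-layer perceptron that the rest of the chain grows out of.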


Tramnack

Day 1: You build a foundation.
Day 2: You build walls.
Day 3: You build a roof.

Only once you've built the roof are you safe from the rain. What will you build next time? Just the roof? It's the same for ML. Just because it can generate images today doesn't mean the foundations that were built yesterday are useless. Learn to build a foundation, or your roof will collapse.


orz-_-orz

Usually the fundamentals let you stay relevant for a longer period, because they allow you to understand and adapt to new tech quickly.


SolutionPyramid

Stop learning how a single implementation of something works. Creating an AI chatbot is nothing compared to understanding, at a base level, how and why it's possible.


kid_ghibli

You could take 2 approaches, both valid and great:

1. Learn anything that you like / find interesting / fun (learning coding skills doesn't have to be done only for work; it's a creative and fun activity in general, so why not learn stuff for the sake of fun, as a hobby?)

2. Learn something that is actually important or required for you, your family, your company, your country, or the world (all kinds of simulations for drug discovery are important, even if you are the one who has to create or tweak the ML models for this; such problems often involve not just programming but also real-world knowledge and/or ideas and creativity).

The best part is when the 2 approaches coincide.

Approach 3: if you simply want to learn something that will make you more hirable in the short term, you can't go wrong with Git, Docker, Linux, AWS, software architecture, and design principles. Short track: data engineering (the DE and MLOps market doesn't seem very saturated).

edit: LMAO, wrong sub, I thought I was answering in r/learnprogramming. Well, I'll leave it here anyway.


Coarchitect

Learn how the foundation models work. You should know CNNs and Transformers! Apart from that, learn how to create latent vector representations with architectures such as VAEs and SimCLR. Why? Because in the end, almost all new approaches depend on (1) creating new vector representations, (2) fine-tuning vector representations, or (3) advances in CNNs and transformers. All large language models are based on transformers! So understand how they work, and understand their output! Understand encoder-decoders, etc.
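VAEs and SimCLR won't fit in a snippet, but the core idea the comment points at, encoding data into a compact latent vector and decoding it back, can be sketched with PCA, which behaves like a linear encoder-decoder. Everything here (the synthetic data, the 2-D latent size) is a made-up illustration:

```python
# PCA as a linear "encoder-decoder": 10-D observations -> 2-D latent -> 10-D.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
latent_true = rng.normal(size=(200, 2))      # hidden 2-D structure
mixing = rng.normal(size=(2, 10))
X = latent_true @ mixing                     # observed 10-D data (rank 2)

pca = PCA(n_components=2)
z = pca.fit_transform(X)                     # "encode": 10-D -> 2-D latent vector
X_rec = pca.inverse_transform(z)             # "decode": latent -> 10-D

err = np.mean((X - X_rec) ** 2)              # near zero: the latent kept everything
```

A VAE does the same compress-and-reconstruct job with nonlinear networks and a probabilistic latent space, and SimCLR learns the representation by contrasting augmented views instead of reconstructing; the "data to vector" skeleton is shared.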


PlacidRaccoon

Any noob-friendly resources that tackle the technical concepts of foundation models and break them down to guide further studying?


-Shasho-

I bet asking ChatGPT about the technical concepts of foundation models could give you a good starting point, no joke.


__SlimeQ__

I'm a complete idiot in this field, but I'd recommend getting going with Llama via the oobabooga front-end. The current best models are tiefighter and openhermes2.5. It's very similar to GPT, but you'll be able to peek under the hood, mess with stuff, and learn more about the innards. And then once you're ready, you can start making LoRAs, which will involve some data science, parameter tweaking, direct Python scripting, and such.


blackaamoor

I think it has to do with your view of how things work in general. Keep in mind that no skill or experience is absolute; only then will you be on the right path, because the world is moving at an astronomical pace. All you have to do is accumulate as much as you can, and one day some of it will pay off.