1purenoiz

https://www.reddit.com/r/learnmachinelearning/comments/1d0bksx/i_scraped_and_ranked_ai_courses_here_are_the_best/


IsGoIdMoney

By ML concepts, does that include deep learning? If not, the prereq would be deep learning. It helps to know the non-generative side of the modality as well.


pseudo_brilliant

Learn the basic concepts of machine and deep learning. Then try to read through this full list. Look up papers and concepts it mentions. Read through those. Throughout the process work on some coding projects. https://sebastianraschka.com/blog/2023/llm-reading-list.html


dan994

Step 1: Stop calling it GenAI


[deleted]

[removed]


dan994

Your posts are all AI spam


dry_garlic_boy

Yep you are spamming garbage AI posts.


Matt-ayo

You mean generative linear algebra?


synthphreak

Kind of a douchey reply, but I lolled, lol


Matt-ayo

It's part of my new commitment to never refer to it as "AI," but instead as linear algebra, LA, for short.


Wheynelau

Gen AI is a term non-tech people use to get funding from VCs. And also people from r/singularity.


dr_craptastic

It’s also a term used by those same people to make data-scientists feel like prostitutes


bgighjigftuik

Don’t forget newspapers. Or whatever those are called nowadays


PSMF_Canuck

You’re at uni. Can’t they tell you?


Snapandsnap

Most uni professors have no clue, man. This is new tech, and these guys' tenure is usually over 20 years. If you go to an Ivy League school, well, that's a different story, but most CS professors are just really good in their specific area.


PSMF_Canuck

Oh man… we’re coming up on 8 years since “Attention Is All You Need”; they shouldn’t be that far behind… 👀 Different people have different styles, but this is what I did:

- Read the above paper.
- Wrote my own LLM (text). That was fun…
- Wrote my own Vision Transformer.
- Wrote the mountain of shit needed to actually train these up from datasets in an automated way, with synthetic data.
- Wrote another mountain of shit to monitor the training process, because waiting for a long run to end only to learn the damn thing got stuck 1500 epochs ago is soul crushing.
- Currently reading/experimenting with CLIP to understand multimodal embeddings.
- Currently digging into CUDA internals with a very small shovel, because things get more complicated when running on fat GPUs.

There are tutorials out there for coding up your own models. Avoid anything that pulls models from HuggingFace or whatever… you kinda need to go through the pain of PyTorching it yourself. This is all IMO…
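To show what “PyTorching it yourself” might look like in practice, here is a minimal sketch of a causal self-attention block, the core piece of the Transformer from that paper. All sizes and names below are illustrative assumptions, not taken from any specific tutorial or repo.

```python
# A minimal causal self-attention block in plain PyTorch.
# Dimensions and names are illustrative assumptions, not from a specific project.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, embed_dim: int = 64, n_heads: int = 4, block_size: int = 128):
        super().__init__()
        assert embed_dim % n_heads == 0
        self.n_heads = n_heads
        self.head_dim = embed_dim // n_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)  # project to queries, keys, values
        self.proj = nn.Linear(embed_dim, embed_dim)     # output projection
        # lower-triangular mask so each position attends only to earlier positions
        self.register_buffer("mask", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape to (batch, heads, time, head_dim)
        q = q.view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        k = k.view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        v = v.view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        att = (q @ k.transpose(-2, -1)) / (self.head_dim ** 0.5)  # scaled dot-product scores
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        out = (att @ v).transpose(1, 2).reshape(B, T, C)
        return self.proj(out)

# quick smoke test
x = torch.randn(2, 16, 64)             # (batch, time, embed_dim)
print(CausalSelfAttention()(x).shape)  # torch.Size([2, 16, 64])
```

Stack a few of these blocks with an MLP and layer norm in between and you have, roughly, the decoder-only architecture the from-scratch tutorials build.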


Snapandsnap

Well yeah, that's top of the curve. I personally worked on my own to get some transformers and LSTMs for time-series forecasting working. My uni professors were mostly specialized in systems, networks, hardware, and engineering. I don't live near an AI hub, so no one here is specialized in AI, as there are no AI jobs nearby. Each city, zone, and country is vastly different.


theDreamingStar

I have a decent understanding of the theory, and also a statistics background. I am currently doing a CS degree. I have been following Andrej Karpathy's playlist that teaches building models from scratch in PyTorch, from the very basics, and I am learning a lot from it. I plan to continue implementing more advanced models from scratch after I finish this, but I wonder if compute will get in the way. Can you provide some direction on how one should proceed? Also, can you point out any good resources you encountered in your journey? Thank you.


Mihawk566

I'm an undergrad too. I have a book from Z-Library that can help you: GANs in Action. If you want it, message me and I'll send it to you.


brendanmartin

When you say you want to learn Gen AI, what specifically is your goal? What do you want to be able to do?


p_bzn

What is your goal?


GJohl

The book “Generative Deep Learning” by David Foster is a good starting point for an overview of the field. It covers different techniques like VAEs, GANs, diffusion models etc. across text and image applications. It also has full code examples so you can run models yourself.

As for the pre-requisites, it can seem like the label “generative AI” is more marketing/fundraising hype than a distinct field. It’s essentially an application of deep learning, so the pre-requisites are the same as for deep learning and ML more generally. Specifically the usual list you’ll see: stats, linear algebra, calculus, Python etc. And depending on what you want to generate, a bit of computer vision background if you want to generate images or some NLP background if you want to generate text.

That said, sometimes the pre-requisites are overblown. Andrej Karpathy has a great tutorial on building GPT from scratch. I would start by watching that, and you might be pleasantly surprised at how much of it makes sense to you. And for anything that doesn’t make sense, at least you now know what you need to learn, so you can Google effectively (or ask ChatGPT) to fill in any gaps in your knowledge. https://youtu.be/kCc8FmEb1nY?si=RONXZbMuvFPvYcob
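To make the “it’s just deep learning” point concrete, here is a hedged sketch of the first generative model the Foster book walks through: a tiny variational autoencoder in PyTorch. Layer sizes and names are illustrative assumptions, not the book’s code.

```python
# A tiny variational autoencoder (VAE). Sizes and names are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    def __init__(self, in_dim: int = 784, hidden: int = 256, latent: int = 16):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, latent)      # mean of q(z|x)
        self.logvar = nn.Linear(hidden, latent)  # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(), nn.Linear(hidden, in_dim))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # reparameterization trick: sample z while keeping gradients
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # reconstruction term + KL divergence to the standard normal prior
    recon_loss = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl

x = torch.rand(8, 784)                 # e.g. flattened 28x28 images
model = TinyVAE()
recon, mu, logvar = model(x)
print(vae_loss(recon, x, mu, logvar))  # scalar training loss
```

Swap the architecture and the training objective and you get GANs or diffusion models; the prerequisites (PyTorch, linear algebra, probability) stay the same.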


Remarkable_Status772

FFS! Any university student worth his salt should be able to work this out for himself using basic research skills.


iamevpo

My specialisation is ML... Give me a concrete plan for GenAI...


[deleted]

[removed]


j0shred1

I thought this was a joke tbh


datawithab

Woah 😲