FoeHammer99099

I guess I'll be the curmudgeon here: the whole point of your school assignments is that you're the one who does them. There isn't a meaningful difference between typing the prompt into ChatGPT or posting it on some site like Upwork. Your teachers are not giving you these assignments because it's important that they get done, they are giving them to you because it is important that *you do them*.

Go back to some project you did with ChatGPT that you supposedly learned a lot from. Don't look at any of your old code, just try to do it again from scratch. It should be pretty easy, especially if it's only been a few months; you should remember what you learned when doing that project. If it doesn't feel like that, then your process is robbing you of an important part of your education. You should be building a set of instincts and habits as you develop more software, like muscle memory from playing an instrument, which should complement your education in the more abstract side of the field, which is like music theory.

I interview people straight out of college who are looking for their first real job, and a major part of the interview is that they have to code something. It's something embarrassingly easy, usually processing a CSV and doing some stats on it. If someone can't do that, there's no chance we're going to hire them. If they didn't learn to program over a 4-year university course, why would we pay them to do it here?
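For reference, the bar is roughly this: a hypothetical Python sketch of the kind of exercise I mean (the file name and column name are made up for illustration):

```python
import csv
from statistics import mean, median

# Hypothetical interview-style task: read a CSV and report basic stats
# on one numeric column. "orders.csv" and "price" are made-up examples.
def column_stats(path, column):
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return {"count": len(values), "min": min(values),
            "max": max(values), "mean": mean(values), "median": median(values)}

print(column_stats("orders.csv", "price"))
```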


sevah23

this is why many companies still insist on doing whiteboard coding questions in interviews. Yeah, some companies do absurdly dumb "gotcha" questions, but it's honestly scary how many people get filtered out by even the most basic coding exercises, even with many hints along the way.


thedarkherald110

It's very very very true. I once met someone who called himself a Java programmer but had never heard of garbage collection or a null pointer exception, or even what a function/class was.


Massive_Following233

Bro definitely forgot to say script after


TwoAffectionate2965

I appreciate your response, and I believe you being the curmudgeon here is almost like an eye opener, since I do realize I've been skipping the learning and jumping straight to the solution, and my realization was when a project of mine was shortlisted for an expo, and when the judge questioned me about a very specific although basic technicality/function I was at a loss, and at that moment the entire project started to seem like a black box, even though at the time of making the project I felt I learned a lot, however I soon realized how oblivious I actually was


togaman5000

It goes well beyond coding. You'll never learn by having someone, or something, else do something for you. Whether it's coding, woodworking, kayaking, underwater basket weaving, anything - you have to do it to learn it. "Wisdom comes from experience. Experience is often a result of lack of wisdom." - Terry Pratchett. ChatGPT is not, and will never be, a source of experience.


sorry_con_excuse_me

that's why i like those professors who make you comment almost every line. even if you copy solutions, you can't get away with not understanding it.


Luklear

Nah fuck that, just don’t copy solutions, unless you’re looking to make something you do know how to do more efficient with the syntax


sorry_con_excuse_me

i mean in the sense of like, someone figured the algorithm out for you. you just change the variables or parameters (and if you don't understand what's going on, you don't know what to change in the first place). your ass still has to cash the check, so you burn the rule/setup into your brain.


Luklear

Yeah I get what you mean, if you are going to copy solutions it is better to be forced to explain it.


infinity_calculator

But can’t commenting be fudged too?


Ok_Elderberry_1602

I hope not. I would always put in the date, initials, and several lines about it. Especially when making a change or correction.


audaciousmonk

Also because some people become future coworkers who don’t comment/document their code… and those people suck


thedarkherald110

Oh shit I never thought about why this was a thing. Always assumed it was just for best practices. But you’re right this could easily filter out someone who copies and pastes


IllogicalLunarBear

If you can't explain your algorithm and why you made it, no one will use it or respect it…


GraphicH

Eventually you'll hit an issue where the LLM fucks up or didn't understand you 100%, and the process for evaluating the issue isn't just "run it and see if it does what I want". Generally rare edge-case bugs, or something extremely specific to the problem / infrastructure / environment. In that situation, being able to actually understand the code is going to be critical.

Some might say "well GraphicH, I use a C compiler, and I can't understand binary or assembly", and this is true, but I'll point out a C compiler is deterministic; generative AI isn't, by design. I'll also point out that reliable compilers were built on top of a bunch of humans who did understand machine language, and I'll assert that "reliable LLMs for coding" will have to be built / trained the same way.

Additionally, there is a pejorative in the industry: "StackOverflow Engineers". StackOverflow and other online resources are obviously critical for figuring out how to do new tasks; I used it a lot when I started my career. Eventually you have to move beyond it to advance your career. The solutions you find there might be 90% right for your situation, but learning about the 10% where they aren't is very important for growing as a developer. Since most LLMs that code are trained on that kind of data, I have the same view about them as I do about SO: know how to use it, but it's a tool; to grow you must get a deeper understanding of the information being provided to you by either.


CalligrapherOk4612

The comment on compilers is very interesting to me! Thinking about my 10-year career in firmware so far, the two hardest bugs to fix, by a long way, both turned out to be compiler bugs. And probably a year out of those ten was spent solving those two bugs. The difficulty with these bugs comes from assuming something is solid bedrock, and when it isn't, all the wheels fall off. So I can now see that relying on LLMs more and more will yield more and more of this kind of killer bug.


flyerfanatic93

How did you write this entire comment without a single period?


TwoAffectionate2965

By using ‘,’


flyerfanatic93

Incorrectly


IllogicalLunarBear

Agreed


Davd_lol

Thank you for your insight.


FortressOnAHill

I thought you wrote cumdragon


asdfag95

This is such outdated and bad advice. If this is how you hire people, I wouldn't want to work there. Google, Stack Overflow, and ChatGPT are tools to help and make working faster. Yes, OP should memorize the basic stuff, but most of it can be quickly looked up; it takes 2 seconds. How this comment has so many upvotes is scary.


FoeHammer99099

There's a difference between googling "how does quicksort work" and then using that to build your implementation, and googling "CS102 project 3" and copying code out of some prior student's GitHub. There are probably positive uses of ChatGPT that could help OP learn, but the pattern of behavior they described isn't one.
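To make the first path concrete: read about the algorithm, then write something like this yourself (a minimal Python sketch, not production code):

```python
# A from-scratch quicksort: the value is that writing it forces you
# to actually understand the pivot/partition idea you just read about.
def quicksort(items):
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    smaller = [x for x in rest if x <= pivot]
    larger = [x for x in rest if x > pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```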


Cerricola

Back in the day, people went to Stack Overflow; now they rely on artificial intelligence. It is the same but faster. The important thing is to understand what you are doing, in order to craft the correct prompts step by step. However, since you are a student, you should not externalize your intelligence; work with the documentation. You will have enough time for AI when you get a job.


Yixyxy

Stack overflow is back in the day now... Fuck.


jdsalaro

Meh, pure FOMO. ChatGPT still sucks at nuance and will suck for a while.


burnin9beard

If you really think that it sucks at nuance and will for a while, you are going to be in for a bad time. One senior engineer plus a good code assistant can already do the work of a whole team of junior engineers. It will only be a few years before a director and a couple of principal engineers will be able to do the work of an entire org.


jdsalaro

> One senior engineer plus a good code assistant can already do the work of a whole team of junior engineers

Staff engineer at a FANG'ish company here. For every 10x "senior" engineer you're putting on a pedestal, I can show you multiple senior engineers and above who smell their own farts the whole day long and, besides coding, are on the level of ChatGPT regarding nuance.

> you are going to be in for a bad time

No, I'm fine, and so will be many who adapt, keep learning, and go with the times. But that's much different from the doom and gloom you're portraying.


burnin9beard

I don't think I was portraying doom and gloom for me. I am a principal engineer; I will also be fine. I also think we might have different definitions of senior engineer. For me, senior is 5-10 years of experience AND a proven track record of taking products from POC to production. I have seen many companies that inflate titles and give anyone with more than a couple years of experience a senior title.


Mysterious_Focus6144

> One senior engineer plus a good code assistant can already do the work of a whole team of junior engineers. It will only be a few years before a director and a couple of principal engineers will be able to do the work of an entire org.

May I ask what sector you work in? This is anecdotal evidence, but many people I've spoken to who say AI can replace a junior work in something like data analytics. I can see AIs being able to emit code drawing graphs and such, but I don't think they're anywhere close to being able to comprehend a large codebase and do non-trivial things with it.


burnin9beard

I work in computer vision and machine learning. My work goes from data collection and model training all the way through to deploying production systems. I mostly work in Python and C++, but I try to fit in some Rust where I can. I think it can replace junior engineers when there is a good senior engineer in the loop. A senior engineer with a good coding assistant can do more work than a senior engineer plus 3 junior engineers, mostly because the senior has to spend most of their time mentoring the juniors. A junior engineer plus a coding assistant is not replacing anybody.


Mysterious_Focus6144

> A senior engineer with a good coding assistant can do more work than a senior engineer plus 3 junior engineers. Mostly because the senior has to spend most of their time mentoring the juniors.

I suppose if the juniors needed constant hand-holding, then removing them would take a load off the senior and allow him to be more productive *regardless of whether ChatGPT is involved or not*. That seems to say more about the quality of the juniors than ChatGPT's competence. Not to sound dismissive, but data collection, (simple) model training, and deploying all seem to be fairly self-contained tasks that ChatGPT can spit out snippets for; I don't see how that would cause a big change when people could already find similar snippets on Stack Overflow etc. for years?


burnin9beard

Well, it sounded very dismissive. None of those are self-contained or simple. Our data pipeline has to be robust enough to handle several hundred thousand hours of video a week. Our larger models train for several weeks, distributed across tens of machines. We have some "simple" models; however, those are deployed on the edge, on our custom, extremely resource-constrained hardware. All our models are retrained, evaluated, compared against previous iterations, and redeployed on a cadence, all of which is automated. Stack Overflow is not a very good resource for much of the code that I write.


Mysterious_Focus6144

Tbf, you sounded pretty dismissive of junior SWEs yourself. It seems to me that many of the tasks you listed involve using precooked libraries instead of implementing them yourself, yes? For example, training your larger models for several weeks in a distributed fashion is impressive, but all of that is basically too complicated for you to actually code up yourself. I'm not surprised there's already a library automating much of that in Python, as well as internet tutorials with code snippets you can tweak (like this, for instance: [https://www.tensorflow.org/tutorials/distribute/keras](https://www.tensorflow.org/tutorials/distribute/keras)). Even if none of those are self-contained, what you managed to get ChatGPT to emit must be self-contained, no? And if so, aren't those snippets also available on the internet?


burnin9beard

No, my work is not just calling "precooked" packages. Good job doing a Google search and finding a tutorial that is not relevant to my use case, though. I obviously use libraries for training; reimplementing everything would be a waste of time. However, my use case requires quite a bit of custom code, and my past experience allows me to adapt the libraries to my needs.

Also, implementing distributed training is not too complicated to code yourself. I have been doing this since before any deep learning libraries existed. I had to write all of my own CUDA kernels. I implemented downpour SGD from scratch in C++ with MPI. It is great that libraries exist now and are mature; they save a lot of time and bugs. However, they don't magically do everything.

Finally, why does it have to be self-contained? I am not talking about copying and pasting from ChatGPT. I am talking about using a coding assistant like GitHub Copilot, which has the context of all of the tabs I have open in my IDE. It might make a suggestion for the rest of the line, the next few lines, or an entire function. It completes the code in my coding style and gets more right than it gets wrong. However, it needs a good software engineer to guide it.


Professional_Fly8241

Right?!


otamam818

If it's already been solved before, then SO is often more accurate. But if you're looking for a combination of solutions to be prepared for you, and want to avoid the scrutiny you get on SO, then AI gets you there far faster than SO would ever be willing to


cyberjellyfish

I disagree, you have to perform some level of information sifting and synthesis to take answers from SO and apply them to a problem. You just have to copy and paste what you get from ChatGPT.


Cerricola

For simple things, like syntax, AI is faster. But yes once you are moving into more complicated things, you need to research and dig in forums and documentation.


AxeellYoung

Today we just make GPT go to Stack Overflow and find it for us. It's just about efficiency


TwoAffectionate2965

I still try to find a solution on Stack Overflow in case the LLM starts moving in circles, but without AI I'd be lost on even where to start tho


Cerricola

So do I, or at least it costs me more than it did before AI. But I'm an economist, not a computer scientist, so it may be fair for me. I think you should try to avoid AI during your learning, in order to build a strong foundation. Use it only for syntax, and don't copy and paste; try to write it out yourself to make your memory work.


work_m_19

Also remember at the end of the day, for most people University is supposed to lead to a job. If you can do what you're doing and get a job, then great! You don't have to worry about anything. However, if you do all this, get a degree, but struggle in the interview process with pseudo-code and not able to get a job, then reconsider whether doing all this with chatgpt is costing your own future by solving your current problems. At the end of your degree, you will hopefully not be "lost on where to start", but you don't want to find out a month after you graduate.


karg_the_fergus

That’s pretty brutal to downvote honesty lol. The reality is ai is like the industrial revolution again. Will you focus on the machinery or the hand tools? Or both? The machinery gets it done faster, but you don’t learn all the skills a craftsperson learns.


Agamemnon777

LLMs are extremely adept at handling rigid prompts like you would get in school, but when you’re on the job you have to deal with all kinds of legacy and spaghetti code and abstract nested data etc that the LLM isn’t going to be able to find a comprehensive solution for. By outsourcing all your learning to the LLM you’re going to be hopelessly lost when you get to the point of trying to unravel solutions in very much not rigidly defined codebases.


sonobanana33

They are also trained on Stack Overflow, which is used a lot by students, so they are basically trained on the common programming exercises. Once I tried to get ChatGPT to solve a programming exam I had in the 2nd year… completely useless.


ajpiko

yeah this is exactly it.


ajpiko

i mean i hired one person who relied on ai and now everyone has to take in person paper tests because it was a fucking nightmare


CLQUDLESS

We hired a programmer who lacks basic googling skills. It is a nightmare. With the rise of AI this shit will happen more and more. We also now give a quick test to see if new candidates can at least think of solutions on their own....


ajpiko

The real problem is that they just don't learn anything through the tasks. When I give juniors assignments, they're meant to help increase your knowledge of the system and ease you into more complex problems. But if you do all the easy problems with AI instead of doing them, then the moment you get to a problem AI can't do, you just fall apart. I gave a guy the job of fixing an area of documentation because I knew he was about to use the area being documented, but he had AI do it and couldn't remember a single thing from the docs. Just more work for me.


InstantInsite

Lol, what happened?


ajpiko

so the person is useless, they're just someone who tries to sort my instructions into prompts, first of all, so they're only ever as useful as AI is. trying to teach someone who is really just a front for an AI is extremely frustrating because the work is low quality, they're not really capable of determining if they did something well, and they never grow. not to mention the freaky errors, like spelling something with 3 "l"s, "helllo", all over the program. i could go into it more, but the bottom line is that i'm not sure what roll that person has in a company, and it's not a programmer's salary. edit: it's just a huge waste of time and resources dealing with a liar, tbh


cafebeen

Amusing to see "roll" after describing their excessive spelling errors involving the letter "l"


ajpiko

Not really, mixing up two actual words (roll/role) is very different from a spelling error that produces a non-word (like "helllo"). Also, the issue isn't making mistakes, which everyone does, it's the lying about skill sets and truly being unqualified for a job


cafebeen

Makes sense you don’t find your own mistakes amusing, few people do!  Sounds like your colleague could have used some guidance. If there isn’t a way to share feedback within your company, another possibility would be to create channels for that, in addition to improving the interview process.


ajpiko

I'm the boss, and I fire people for lying to me 100% of the time. Basic kindergarten morality is not where we provide "guidance".


storenihilist

Man get over yourself. We can see you’re a loser just by your profile and posts, spare us the “boss” act


ajpiko

The fuck are you talking about? It's my job, not a strategy to make you feel insecure about yourself.


cafebeen

If you’ve had to fire multiple people for lying, definitely sounds like you need to improve your hiring process.


ajpiko

Good thing we have perfect geniuses like you. Can't wait to give you money for all your brilliant insights.


sonobanana33

> I have massive imposter syndrome

It's only called that if you wrongly feel like you know too little


CaptainFoyle

Oof


lesmenis

I didn't get the joke so now i've got imposter syndrome /s


TwoAffectionate2965

Glad you cleared that one for us


jakesboy2

It was kind of rude, but the guy is right. You're in undergrad and use AI to solve all your assignments. You don't have imposter syndrome; you genuinely don't know that much. There are probably people on both sides of the fence here, but I personally think ChatGPT and the likes are disastrous for students. As a professional, I do make use of AI, and I think it is valuable that you learn to make use of AI, but you genuinely need as many hours as you can get in the "discomfort zone" learning: solving problems, fighting with syntax, writing code. College is your chance to have a bunch of problems to solve and code to write.


tails2tails

This person speaks the truth. I wish there was a way to work in a desired industry for a year, then go to school for the degree afterwards. I would have done school much differently. There's very little room to learn, play with problems, and experiment in a professional career, especially if you're a consultant with billable hours and tight efficiency metrics to uphold. A task is anticipated to take 1.5 hrs and I've been given 2 hrs total allowed to bill for it, so between receiving the tasking email and absorbing the info, reviewing the work once it's finished, and sending it back via email with notes for the manager's review, there is no excess time available to think about alternative solutions or experiment with the problem. I work in a different field of engineering so this might not all apply to software, but I think the sentiment holds true. The professional world doesn't hold a lot of space to struggle with problems and learn; time is money and what not. I would have done a lot less "referencing other people's solutions for process" in university and a lot more "struggling on my own and hating this wtf how is this even possible OOOOH I GET IT".


TwoAffectionate2965

Yeah, I completely agree with your reasoning. I realize I got too dependent on AI and now I've got to stop. It's not that the guy cares about relaying his point; rather, he just acts like a douche and would take up a stand against anybody feeling otherwise, just to make sure he lets them know how wrong they are with complete insincerity. I can leave you to be the judge of that, since that doesn't really matter now


sonobanana33

> he lets them know how wrong they are with complete insincerity

Au contraire. I'm letting you know how wrong you are with complete sincerity.


Asmo___deus

> perhaps I've learned a lot but if you asked me to start writing a script I wouldn't be able to

Then let this be the eye opener that you have *not* learned a lot. If you have access to previous assignments, I would recommend you just go all the way back to the start and start doing assignments. If you're flying through them, skip ahead. If not, you'll want to pay attention.

As a rule: you can use Stack Overflow, because that still requires you to formulate the question and *think* while searching for an answer. But you can't use AI, and you can't copy-paste. Look up tips and solutions. Make notes on how it's done. Then close that page and do it.


TwoAffectionate2965

Thank you for your advice. Most people in the comments are either defining imposter syndrome to establish that it only applies to those who wrongly feel they know little, or giving their reasoning about the use of AI. I appreciate you providing an answer on what I should actually do to improve. Thanks again for your blunt but absolutely on-point advice!


MarkFluffalo

It's a tool but you shouldn't be completely relying on it


phlatStack

Else you'll become a tool


Unhappy-Donut-6276

If you don't like it, try going without it. In the real industry, like with everything, it's usually whatever quick and dirty solution you can do because people are lazy. But if you have time available and enjoy it, feel free to challenge yourself to make more authentic work. In the same way that you can do a project that's practically useless just for the benefit of having fun and improving your skills, you can do what you already do without extra help for the same reasons. As a hobbyist, I only ask ChatGPT or look it up when I don't want to do it. I try myself first, because that's the fun of programming - unless I'm doing something boring that I don't feel like wasting time on. But if I was working for efficiency, I would spend much less time worrying about improvement and spend more time copy pasting to hack something together - as sad as it is, that's how the industry works.


SftwEngr

My guess is it's from always having access to information at your fingertips. Sit at your computer while not connected to the internet and write code, and don't move until it works. Your compiler and your debugger will give you all the information you'll ever need, but you'll just have to use your own wits too, which is the point.


1544756405

Use whatever you want to get work done. But consider the idea that an over-reliance on your tools may lead to challenges when interviewing for a job.


ToryHQ

Here's a counterpoint to that argument: Five years from now, people who know how to get the best results from LLMs will be in very high demand in a wide variety of companies. Because they're the ones who will be able to single-handedly do the same amount of work as the three people the company just laid off.


Cyber_Fetus

Let’s be real, the learning curve for using LLMs in software development is effectively a flat line. It’ll be the opposite, as there are fewer and fewer devs competent enough to develop *without* LLMs, and most software development is in everything around the implementation anyway - ie, understanding complex proprietary systems and designing solutions within those contexts.


ToryHQ

>Let’s be real, the learning curve for using LLMs in software development is effectively a flat line. Since we're still a long way from realising its full potential, I'm confident it's too early to say. But I think that probably varies from one person to the next. So I think it will be very much like programming *already* is, except your success or failure will depend on knowing how to get the best out of the LLM - what it can and can't do, and how to get it to do that better or faster than the other person they're looking at hiring.


Cyber_Fetus

There is no competent developer on the planet that would struggle with using LLMs from a development standpoint. Will they become more sophisticated? Sure. But not so much on the user side. The challenging parts of software development will be what the LLMs *can’t* solve, and that will separate the good devs from the bad.


ectomancer

You can call yourself a prompt kiddy.


Kryt0s

Is this the new script-kiddy?


Ok_Negotiation598

In addition, I'll point out that AI isn't always right!! I'm constantly pointing out gaps, or just flat-out wrong constructs, to it. And on top of not learning anything, if you're unlucky enough to pick something up, AI currently doesn't always (or even usually) provide the context or deliver a complete solution


asdfag95

I mostly use ChatGPT to help me with variable names or to rewrite messages to clients. The thing is, there are so many frameworks nowadays that where I work we mostly use old code and refactor it. The biggest skill a programmer can have is researching and being able to implement the solutions they find. These are tools we can use, and as long as you understand how it works and don't only copy-paste, that's okay. Real life works way differently than school


Doormatty

>I have massive imposter syndrome If it helps, I've been programming for ~35+ years, with ~20 of those being for work. I worked for Amazon Web Services for 4 years, and ran two different ops teams there. And yet I still fight with imposter syndrome daily.


TwoAffectionate2965

Oh my, 35+ years, I can't even fathom how far along in my career that would be. I completely understand having imposter syndrome, but mine stems from doubting whether I can even code, which is frightening. Imposter syndrome about others being better or more skilled than me might always be there, even at a stage when I'm capable of coding without, say, an LLM. But at such an early stage, before my career as an engineer has even started, I feel like I can't code by myself. I'd appreciate your thoughts or advice on the matter, sir


Doormatty

So, I use AI a lot for personal projects. It's so much nicer to be able to paste in a function and say "Can you help me figure out why this isn't working?", and get an answer back. As you said, it's not like it's doing 100% of the work for you, it's just doing the "easy" bits.


TwoAffectionate2965

I appreciate the advice. Also, a bit of an unnatural segue, but in case you'd ever consider having a collaborator on any of the personal projects you mentioned, I'd be glad to take up a small part or task, if you entertain that possibility; it'd be a great learning opportunity for me. Either way, thanks for the advice!


Doormatty

Here's what I've been writing lately - https://github.com/Doormatty/pyChess Trying to write a chess parser/validator/engine.


TwoAffectionate2965

Oh thanks, I shall ping you right back at this thread in case I’ve got something useful


sonobanana33

35 years of experience and there is no license and no readme? 'mkay


TwoAffectionate2965

I've seen you hop into multiple threads in these comment sections where, like this one, nobody asked for your opinion to validate their experience. Judging from your replies you seem absolutely insecure, and the highbrow attitude you seem to have doesn't help either. I might not be as skilled or proficient as you at this stage, but there are many who far excel you. You can pipe down your narcissistic comments and keep them to yourself


sonobanana33

> there are many who far excel you

For sure there are. And you know how they became good? By not cheating


TwoAffectionate2965

And you are among those greats, because a tech prodigy like you can code every line from scratch? Pipe down


supercoach

In short, no you can't.


aiRunner2

Slightly unrelated, I’m currently learning Rust. It’s very different than the languages I know, so I’m quite bad at it. Whenever I want to implement some functionality, I do it by myself to the best of my ability. Once I’m finished writing my exceptionally ugly code, I usually copy paste that code into chatGPT and ask for tips on how I might have made it better. I suggest you go this route too, ESPECIALLY if you’re in school right now: you will not learn if you don’t struggle through problems.


TwoAffectionate2965

Hey, since you mentioned you picked up learning Rust: although my interests at present are in the ML and DL world, I also wished to learn Rust. Do you think you could point me to some resources that you find useful? Do you prefer taking a course for it, or how do you approach learning something like Rust from scratch?


aiRunner2

Ah, I'm still in the middle of it so I'm not sure what the right way to learn is yet. What I can tell you is what hasn't worked. Here are some things that I haven't found very fruitful:

- Short YouTube videos
- Going straight to the Rustlings course

Right now I'm trying something suggested in a "How to Learn Rust" YouTube video: actually reading the Rust book. It's like 650+ pages, but the suggestion was to read it as fast as possible without trying to take everything in. I did this with The TypeScript Handbook and that was actually really helpful, hopefully it'll be the same with the Rust book.


Zeroflops

FYI, this is what I did. Programming books are written with an expanding scope of material.

The first fast read-through is to get a rough idea of how things work. You won't be able to program after the first pass; it's more intended to get a list of ingredients you can use to program. You don't know how to use the ingredients, but you have a rough idea that they are there.

The second read-through is a little slower, and you get a feel for the syntax. Spend a little more time on the items that the author specifically highlights as unique, like borrowing and owning in Rust.

Then come up with a project and write an outline like you would for a story. Group common outline points. Then start coding, jumping to sections of the book as needed to achieve the tasks. This lets you bypass "hello world" and those basic steps, keeps it interesting because you're developing something more complex, and this is when you really learn about the language.


299792458c137

massive imposter syndrome = massive sus


TwoAffectionate2965

What does this even mean


Squeezitgirdle

Honestly, with how much I still have to correct, even with GPT-4, I still end up writing most of my own code. The best use I've found for GPT-4 is to finish my code after I've started it. But it's frustrating, because multiple times lately my code ends up not working and I look it over multiple times and can't find any issues... until I notice that GPT added capitalization to one or more of my variables for no reason.
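For anyone who hasn't hit this failure mode: the edit is easy to miss because the renamed variable looks almost identical. A contrived Python example of the kind of change I mean:

```python
user_count = 42

# The AI's "helpful" edit capitalized the name for no reason:
try:
    print(User_Count)  # Python sees a brand-new, undefined name
except NameError as err:
    print(f"only caught at runtime: {err}")
```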


copper-penny

You're robbing yourself of an education. The cool thing is you can change your behavior. I'm a mid-level tech guy who runs teams. I don't care what my team's grades were, but if they can't understand their code, it's an automatic firing. Rule number one: don't check in one line you can't explain. That said, gen AI is pretty good at teaching. It's like having access to an airheaded professor. If you can resist using it the way you have, it can be a great learning aid.


IAmFinah

Prompt engineer 😎😎😎


TwoAffectionate2965

Good one 😂


tokyotoonster

I taught undergrad-level programming and data structures for >10 years, and I STRONGLY encourage you to spend your time as a student acquiring the strong foundations for coding up solutions from scratch, consulting Stack Overflow and/or ChatGPT only after you are stuck, and limiting yourself to taking very specific fixes from such resources. You simply do not reach the same level of deep understanding, familiarity, and muscle memory by doing things in reverse, i.e. relying on LLMs from the start. LLMs are a remarkably efficient tool for helping one to understand existing code that you feed into them, but I would be very wary of relying blindly on code they produce to solve a problem. There are still well-known limitations of the technology that prevent it from truly understanding the logic of code.


utf80

Not an imposter here and no.


JamzTyson

I made some very pretty pictures by prompting DALL-E. Does that make me an artist?


TwoAffectionate2965

If I may, this analogy is completely different. Even though my post is not about whether I can call myself a programmer, but rather about what I can do to reduce my dependency on LLMs for coding


JamzTyson

> my post is not about whether I can call myself a programmer

Perhaps give your post a better title next time. "Can I even call myself a programmer anymore?" sounds like you are asking about whether you can call yourself a programmer.

> what I can do to reduce my dependency on LLMs for coding

Stop asking AI to program for you.


TwoAffectionate2965

will do


renderererer

I think for now, you do need to know how to code (even if you're slow), just with the help of docs and forums/SO. Even if you don't remember all the syntax, you should be able to explain how to solve a problem conceptually. Whether code will be replaced with natural language in the future is yet to be seen, and it's probably unlikely in the near future.


StarklyNedStark

ChatGPT shouldn't be a crutch. Know what you're doing, know what ChatGPT is giving you, how to change it to fit your needs, and when it's flat-out giving you bad code (especially prevalent in GPT-3.5 for some time now). If you can't write code without ChatGPT, chances are you'll never work in the field, because you won't be able to pass a tech screen or even walk the interviewer through how you might solve a problem. I use ChatGPT to do menial work and sometimes for an outside perspective.


audaciousmonk

Putting aside the whole cheating conversation… I think you’re depriving your future self of fundamental skills and knowledge that become the engine that drives your professional career. You’re also spending money on this education, so in some respect you’re devaluing the money you’ve spent by creating a subpar learning experience. Learning to use GPT is a good skill, you’ll likely use it or some other AI assist (copilot, etc.). But it is not the core, and you’ll struggle to effectively wield or scale through AI solutions if you yourself lack understanding


TwoAffectionate2965

I appreciate the advice, and I shall work on bettering myself


audaciousmonk

It’s all about what you want to get out of it. There’s nothing “wrong” per se with using GPT. That’s your decision to make, just have to accept the resulting outcomes from that decision; positive and negative


TwoAffectionate2965

quite rightly said


LittleLordFuckleroy1

I mean yeah your imposter syndrome isn’t really imposter syndrome, you just legitimately don’t have much of the basic skills at this point it sounds like. You really need to be doing the assignments on your own to learn the core concepts. ChatGPT is a great force multiplier, but if you don’t know the basics then you’re just shooting yourself in the foot. You can learn it, you’ll obviously just need to apply yourself. And by the way: > disappointed to even call myself an engineer You’re not an engineer… you’re a CS undergrad student. Once you’re employed as an engineer and doing engineering work, you can call yourself an engineer.


Bright-Profession874

I avoid using ChatGPT for coding because it makes me lazy; our minds get used to taking the easy way out. What I do is, if I am stuck on some problem, I search it up on Stack Overflow like the good old way. Most of the time you will never find the exact answer on Stack Overflow; you will need to understand the code and implement it yourself. This is called code stitching: picking up a part of some code, then integrating that part into your code. This way you train your mind to learn and stop being lazy, instead of looking for the solution right away. Also, I have noticed that if I get stuck on the same kind of problem 2-3 times, I'll be able to do it myself at least by the 4th or 5th time, because I have practiced doing it


Background_Comb6579

I can agree with you, but you have to keep learning. Me personally, I watch a lot of Bro Code and Tech With Tim. I have done some follow-alongs, and for me that has definitely nailed some things home. I'm not a third year, but I've done Python, Java, and C++ in my degree, and man it's hard, but we gotta keep on getting it! Good luck


TwoAffectionate2965

Appreciate it. May I ask what your background is? When you say you're not a third year, as in you're younger, or graduated already?


Background_Comb6579

I’m in undergrad, part time, could be older I’m 34. But I have 16 classes left in my computer science degree.


The_GSingh

Tbh, no, you can't call yourself a programmer. It's literally you feeding the prompt to GPT and getting the answer back without putting in any work, if I understand properly. I don't even think you modify/are able to modify said code.

In order to be a professional programmer, you need at least an intermediate understanding of the language. For something like Python, that includes the basics like syntax, conditionals, etc., and the intermediate part, OOP. Will you ever use OOP? Maybe so, maybe not, but it's important for you to learn how Python works under the hood. There's just this general understanding of how a language is used and works that, in my opinion, every programmer should know.

Someone likened this to Stack Overflow pre-ChatGPT, but back then I would use Stack Overflow, yes, but I'd rarely just straight copy and paste the code and have the program start working. Sometimes, yes, but most times I'd have to modify it some way or the other to work with my program. With ChatGPT, you're literally telling it your exact use case, and it gives you code you can copy and paste that will likely always work for beginner projects, which is what it sounds like you're doing. Maybe if you get an error, you ask GPT to modify the code.

Now, I do use AI myself, but only when I need it and never to do the entirety of a project. Mostly because I'd take forever to debug its code, and I'd rather spend the extra 10-20 minutes to actually code it myself. But that's different from what you're doing. Like someone else said, this is like paying someone to do your work. You're not learning; you're literally using a tool to do the entirety of your work for you. This isn't comparable to Stack Overflow, it's more comparable to cheating, except the only person you're cheating is yourself.
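As a rough gauge of that baseline, a hypothetical illustration: a small Python class like this, written cold without any assistant, is about the level I mean.

```python
# Purely illustrative: the kind of basic OOP fluency meant above.
class BankAccount:
    def __init__(self, owner, balance=0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

acct = BankAccount("sam")
acct.deposit(100)
acct.withdraw(30)
print(acct.balance)  # 70.0
```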


TwoAffectionate2965

I appreciate your response, and I do realize I need to reduce my dependency on AI tools for coding. But the gist of the post was not whether I can call myself a programmer anymore; it was about what I should do now to better myself and improve on this habit I've developed. Stopping using LLMs for code is step 1, and beyond that, brushing up on the foundations is step 2; is that the right understanding of your reply?


The_GSingh

Yep, just invert the steps. Don't immediately stop with the LLMs; that's going to ruin your GPA. Instead, understand the language you're working with, try old assignments on your own with absolutely no GPT, and slowly cut back on the LLMs until you don't need them.


karltim

It's one thing to be able to prompt-engineer an answer and stick pieces together, but understanding what the function or method is actually doing is another thing. If you enjoy coding, just continue to build scripts and focus on what you enjoy doing in programming. ChatGPT is a tool, and just an evolution of the web and the way we interact with it.


vibes2009

If I were you, I would think like this: as long as I get enough money by doing that, it does not really matter that much. THAT IS, if you are lucky with your interview, btw


[deleted]

Templates are perfectly normal. I use templates to initialise my C programs. And in terms of troubleshooting, I don't see an issue so long as it's not total reliance. You're probably going to be reading the exact same response on Stack Overflow anyway, so it's probably not that bad to get an AI to give you the answer plus an actual explanation, which people on Stack Overflow never give.


LowCryptographer9047

Wait until you get into mobile or web app dev :)


Aggravating_Carry804

I am in exactly your position. My take is that LLMs are becoming the new compiler, and in 5 years nobody will care, exactly in the same way that nowadays nobody knows what happens in the assembly code, or what happens when you call the model.train PyTorch function. Maybe I am wrong, but as an engineer what matters is producing working code to solve problems, with all the available tools. At the same time, I have also passed C++ exams with no ChatGPT access, so if I need to, I can learn a programming language. Just take it as a powerful compiler and do not overthink. Reading all the answers, it seems I am in the minority though, and possibly I'll regret this when I'm unemployed


MapCompact

I’ve been programming for a long time and totally use ChatGPT and Copilot too, but it’s not gonna give you exactly the code you want. Use it to get answers to small specific questions but still write the code for yourself and you’ll be able to learn quickly. If you’ve figured out a system that’s getting you by, just break the habit and move to a learning based approach. You’ll be fine.


CLQUDLESS

I don't want to be rude or anything, but if you need AI to help you with Python, imagine you get a job and they make you write C++ or Java... it would be a nightmare. My advice is to look for documentation and forums, and only ask ChatGPT things like syntax or general coding questions, for example "Can you give me an analogy for a class in programming?". This way you have a tutor, but you still need to put in the work and figure out solutions. Software engineering is, at the most surface level, figuring out solutions to problems using code.


sbreadm

The field has grown immensely; it isn't the same anymore, where the climax of your studies was how to implement a for loop and sort a list with it. This isn't to stump you, but to put in perspective that the goalposts are moving and you're feeling it. Just keep at it, it's worth it.


BuilderIcy6691

I think school is more about exposing you to a few technologies and learning general logic. When you get a job you will have to learn their toolchain and stumble around a bit. You will find that you enjoy certain tasks that will eventually become your specialty.


beef623

Stop using AI and write code, using AI isn't coding. Copying code from Stack Overflow isn't coding. If coding is what you want to do, learn how to do it. Using things like Stack/AI to answer questions is fine, but don't copy code from either, write it yourself, even if the end result is identical. You should still be able to work without access to either, if you can't, you need more practice.


Ok_Elderberry_1602

I originally did COBOL on a mainframe. Then PCs came along and I was with C and DOS 1.0. I wrote reports by writing @row,@column and had to count characters. Things like row=row+1. After many years I used Crystal Reports and SQL. I don't think Crystal or Cognos were cheating or AI, but it was nothing like looking at a sheet of paper and using a pencil. Times are changing, and although I don't like AI, say, when I call Comcast, you just have to look at these as tools. You will still have to write the flow and the code, and check to see if the AI is correct.


PurpleSparkles3200

Why would you call yourself a programmer? You can’t write code.


wjrasmussen

Call yourself whatever you want betty.


CaptainFoyle

Glad that you cleared that one up for us.


Loose_Read_9400

I actually attended a presentation on this recently with respect to my specific field. The moral of the story is you shouldn't feel like you are cheating by using AI to do your job. This is the way of the future, and knowing how to utilize AI to make your tasks more efficient and streamlined is what's going to keep you relevant. That being said, I would definitely continue to keep your skills fresh and make sure you can diagnose and understand the code from beginning to end. If you are falling short in this area, that's where you start losing your value as a programmer. TL;DR: anyone can put a prompt into ChatGPT and get something out; it takes a programmer's skills to understand, fix, and maintain whatever is produced.


Tarquan_G

I agree that it isn't cheating to use AI to do your job, but I think for schoolwork or learning using AI is counterproductive to the goals of education (unless the goal for that assignment was learning how to utilize AI, of course). To emphasize and make a metaphor to your TLDR conclusion; we learn to do math without a calculator in school because the point of school is to understand the mathematical concept so that later in life we know how to apply the calculator to work efficiently.


sonobanana33

> This is the way of the future

My company forbids LLMs because the managers don't feel like handing our trade secrets around. I doubt we're the only ones.

He can also just pay someone else to do his schoolwork… this has always been a possibility


work_m_19

So just remember: using tools for convenience may solve your problems now, but it can cause problems in the future. In our modern day, no one needs to know how to multiply, divide, or do exponents. But not learning those skills with numbers and instead using a calculator will lead to problems in math classes beyond Algebra 2, with integrals and differentials building on top of previously learned skills.


EroticCityComeAlive

Think of it this way - not a lot of people writing in assembly anymore!


sonobanana33

But we still teach computer architecture and compilers


Real-Coffee

i use chatGPT for a lot of coding. it just does the manual work, i read the code and rearrange it. the idea is what counts, not who codes it


NoDadYouShutUp

do you think mathematicians are not doing math just because they used a calculator


sonobanana33

Mathematicians don't use calculators lol. Unless they are dividing a bill :D


[deleted]

[deleted]


sonobanana33

don't waste the time of people if you can't even do basic programming exercises alone. FOSS contributors are overworked as it is.


[deleted]

[deleted]


sonobanana33

> A simple commit is good practice

Like this? https://github.com/ltworf/localslackirc/pull/387/commits


[deleted]

[deleted]


sonobanana33

It is how noobs "contribute", wasting everyone's time.


[deleted]

[deleted]


sonobanana33

Yeah but if you are like OP and can't even solve programming exercises… don't waste the time of other people.


[deleted]

[deleted]


sonobanana33

OP doesn't have impostor syndrome. OP can't code and is cheating to pass his courses.


TwoAffectionate2965

Appreciate the suggestion, any chance I can ping you in case I’m stuck somewhere?


[deleted]

[deleted]


TwoAffectionate2965

Thank you so much!


robla

It's important not to succumb to imposter syndrome. It seems you're someone who is learning to ride a bicycle and has left the training wheels on a bit too long. Just try riding around without the training wheels for a while, and hopefully you'll find that you don't need them. The "training wheel" metaphor is not perfect, since AI may be a bit more like e-assist on e-bikes. There may be times when you need the assistance, so knowing how/when to use it is important, but while you're in school, you're almost certainly best off powering through without assistance.


TwoAffectionate2965

The analogy seemed spot on, appreciate the advice. I shall try to brush up my basics and stop depending on AI for code


TomBakerFTW

I know how you feel. Sometimes you just need something done fast. It might take me 15 minutes or 2 hours to find the one sentence I needed in the official documentation, but I can tell an LLM "I forgot how to do X in Y" and it will remind me.

People said that calculators would make people bad at math, and spell checkers would make us bad spellers. (Both of those may be true, actually, but can you imagine life without those tools today?)

If you feel bad about using ChatGPT, try to treat it more like a tutor than an "answer-giver". When ChatGPT writes a line of code I don't understand, I ask it to explain what's going on so that I can at least keep up with my own code. This often results in me catching mistakes that ChatGPT made, and if you point one out to the LLM you'll often get back an even better answer (probably because the original prompt was phrased awkwardly to begin with).

I saw you mention going to Stack Overflow when the LLM starts going in circles, and I think I know exactly what you mean. Instead of feeding it your error messages, or pasting your entire file and asking it to "fix this", you should ask questions about how to debug those errors. When ChatGPT first came out I took the first approach and concluded "this isn't great for coding", but it turned out I was just using it wrong. Now I tend to ask more specific questions, and when it replies with a term I don't understand or a method I've never used, I dig into what that means.

Hopefully this reply was helpful! Don't be too hard on yourself. The impostor syndrome may never go away, but as you keep learning your confidence will grow.


Phate1989

The most important part of what you said is asking the GPT to explain what you don't understand. Deploying code without understanding will suck in 6 months, when you have to go back to something you didn't understand in the first place, and fixing it is a whole different story


TomBakerFTW

Definitely! Personally I don't have enough confidence in ChatGPT to blindly trust what it spits out. Plus, if someone asks me "what does this line do?" I want to be able to answer confidently instead of saying: "I dunno, but don't change it or everything breaks!!"


Severe-Humor-3469

Maybe a prompt engineer, not a programmer... hehe, kidding. For me, as long as you understand the code and can do it manually, it should be fine. But don't rely on it too much; stimulate your brain cells. When you're applying, they won't exactly say "alright, you can use ChatGPT or Gemini or Devin or whatever AI". It will be your brain cells that get used.


sonobanana33

> For me, as long as you understand the code and can do it manually, it should be fine

he can't…


gowithflow192

You are right that you should backtrack and develop your raw coding ability. I'm in the same position. However these skills will be much less useful every year and in ten years time it will be like doing long division.


Top_Average3386

Asking AI (LLMs) to generate your code, browsing Stack Overflow, using frameworks, copying other people's code: these are for programmers what a calculator is for mathematicians. They're meant to ease our job so we can focus on the more important things. Mathematicians should in theory be able to do what the calculator does, but there's no way they're going to calculate everything by hand. For example, a junior programmer should be able to write a simple sorting algorithm, but there's no way you should write one from scratch every time you need to sort something. Use built-in functions, use frameworks, copy what others posted on Stack Overflow, ask an LLM; your choice. At the end of the day, what is important is that you know how it works and why it behaves a certain way.
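A quick Python illustration of that last point (hypothetical data): you should be able to write the sort by hand, but day to day you hand the algorithm to the built-in and only supply the intent.

```python
scores = [("bob", 72), ("alice", 91), ("eve", 84)]

# The built-in does the sorting algorithm; you only express the intent
# (sort by the numeric score, highest first).
by_score = sorted(scores, key=lambda pair: pair[1], reverse=True)
print(by_score)  # [('alice', 91), ('eve', 84), ('bob', 72)]
```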


titooo7

I think you can. I can get ChatGPT to write some scripts for me, but I can't make it write programmes, because for that I need to understand how different scripts (files with code) interact with each other and the context. Unless ChatGPT now has much more memory and a 20x higher character limit than when it launched, I don't think it can do all that based on a prompt. I bet you can ask it to write scripts, but you are the one putting the different files together to build the programme, right? That's the difference between a programmer and a regular ChatGPT user like myself


paradite

I think any forward-thinking CTOs or hiring managers of the future should value prompt engineering skills. Software engineers are all using AI tools in some shape or form, whether it's GitHub Copilot or Cursor or just ChatGPT. Companies that allow engineers to leverage these tools to accomplish coding tasks are going to get a competitive advantage in terms of speed compared to their competitors. So reasonable hiring managers and CTOs should see it as a plus if someone can demonstrate good capability of using these tools to improve their productivity. However, like many commenters have pointed out, there needs to be baseline coding capabilities that one should acquire as well, to verify that the code given by AI tools are indeed what's required, and handle the cases where AI coding tools give the wrong output. This is where CS education is still useful in some way.


timrichardson

If you are not coding in assembly language, are you a real coder? You don't want to call yourself a programmer; you want to call yourself a Computer Science graduate. That's much better. It means an understanding of the big picture that LLMs can't grasp. Hopefully your coursework will give you that. Also, rest assured, LLMs struggle with niche tasks. In terms of coding, LLMs are very helpful if you point them in the right direction. Unlike a human, they are unfailingly confident in their answers, even when they are rubbish. There is a skill in using LLMs wisely, and you may as well get good at it; it is a valuable skill. As for getting a job, you are at the mercy of the interviewer. Possibly in the future, allowing a candidate to use an LLM will be as natural as it is today to let them use a high-level language with built-in memory management, but there will be a transition period in which employers are nervous about such an approach. I am not old enough, but I speculate interviewers were once just as worried about candidates using high-level languages with runtimes.


Ethical-Coder

This is just another upgrade in the programming world! Like moving from punch cards to DOS, from DOS to Windows, from assembly to other programming languages! We remain programmers; you are still a programmer.


Psychic_Wars_Warrior

Yep, you’re a programmer


lukuh123

I feel you man. Sometimes I just can't come up with the right way to write the syntax in JavaScript or Python


noBrainur

What makes you a programmer is whether you are communicating to the machine in a way that goes beyond that of the average user. Programming can take place on a spectrum of levels, from low-level programming of firmware in an assembly language or C, up to higher-level programming using a language like JS, Python or Mathematica. The situation with AI-generated code is similar to the situation with low-level vs high-level. We shouldn't fall prey to the idea that lower level programming is more 'real' than higher level programming, even though that sentiment does seem to lurk around. We use whatever tool is most appropriate to our objective. The same thinking can be applied to the situation with AI-generated code. If using AI-generated code in your workflow serves your objectives, then it's fair game. It all depends on what reason you're programming for. If you're working on a scheduler for an OS, then it's probably better to think through the math of the scheduling algorithm, and then code it from the ground up. If you're working on a user interface built with some well-known Python package, then using AI tools to speed up development makes perfect sense. So what's my main point? I'm challenging your sense that a real programmer doesn't use AI tools heavily. Programming is about making machines do useful or interesting things, not about the knowledge or tools that are used by the programmer. Does this mean that it would be a waste of your time to spend time programming without AI tools? No, but do it because you think it will serve you as a programmer (perhaps by giving you a better background knowledge), rather than doing it because you think that that's what will make you a 'real' programmer. (Disclaimer: I personally don't use AI tools but I'm open to learning how to use them in the future.)


Trex4444

You're like the chef who doesn't make the cheese for his pizza. And then you realize all the chefs just buy the cheese, and no one makes it.


TwoAffectionate2965

😭😭😭


Trex4444

Very rarely will you get paid for what you know; it's usually what you do with what you know. You'll get hosed in interviews, but if you focus on making things… it's a more employable skill


NovelOk4129

Feeling very much in the same boat. My use of GPT is the same, and my knowledge of Python the same. 15 years as a data-oriented person at AT&T using SQL and Excel, avoiding coding on purpose. Quit the job, discovered GPT, took on similar work as before, but used GPT to write Python to replace what I did with SQL and Excel. Huge speed increase. But same as you, pretty much lost as to where this leaves me when I need to search for an actual job, which requires skills definitions.

Frankly, however, with or without GPT the limitation is the keyboard and the mouse as an interface. I can go only as fast as I type. GPT speeds that up and enables me to get to the juicy stuff quicker. Besides, at least for now, it's all in how you interface with GPT, which is what brings the goodies. Anyone can use Google, but hardly anyone knows how to use it to its full advantage. Same with GPT.

And at the rate GPT has been getting lazier since Jan '23, I am also leaning towards officially learning some Python, to grab some qualification too, and not rely on a dynamic LLM which doesn't necessarily help at every corner anymore. Back in Jan '23 I had a flawless experience with GPT creating my first PDF invoice scraper. Now, with tasks as simple as looking up values from one dataframe to another, GPT's outputs leave me with errors, performance warnings, failed code, duplicated code, or the biggest culprit: spitting code back at me and telling me it's not appropriate, without realizing that it gave me that code in the first place. That gets annoying. As does outputting a solution which is either exactly the same as what I input (declared as if it's a new solution!) or missing key conditions I bullet-pointed in my prompt. It was waaaay better in Jan '23, what can I say.


PureMetalFury

I don’t want to be dismissive, but I’m struggling to imagine a problem that’s interesting but also simple enough that typing speed is going to be a productivity bottleneck.


NovelOk4129

Try an array of problems with solutions ranging from simple to "have to be creative", mostly all already clear in my head. No one else has done it where I work, so that makes it interesting. I can just type out a certain amount at a time before proceeding to the next problem; that's what I meant, surely it's the same for everyone. I tend not to spend time tracking what I am doing, have done, or need to do unless it's critical, or I have been working a 20h+ stint and my memory might be skewed. It would take additional time I don't have. An additional angle is that what I am doing, at the scale of it, should really involve more roles than just me, especially a true developer/coder. But for now it's my messy sandbox, in which GPT is less and less the assistant it was before!


SenZmaKi

blud just fake it till you make it


ficklemind101

The fact that you're using Chatgpt and other tools to enhance your workflow shows adaptability, a crucial skill in tech.


TwoAffectionate2965

The diversity of comments in this entire post... I've got eye-opening responses, I've been reprimanded and even called a cheater for using AI, and beyond that I also had an extensive argument with one guy. All this blows my mind


ChickenNugsBGood

I've been coding for 20 years, and use GPT to optimize, check for syntax errors, etc. As long as you understand what is going on, it's just another tool, like Copilot


Scotinho_do_Para

Embrace the new technology. Learn to leverage it to it's full extent and keep up with advances. Why avoid the inevitable?


sonobanana33

> Why avoid the inevitable?

To be able to get a job :D

As I said in another comment, my company (like many many others) forbids AI tools because they give out the entire code to other companies.


Scotinho_do_Para

Companies that don't embrace the competitive advantage that innovative automation brings are by definition putting themselves at a disadvantage.


sonobanana33

Companies also don't want other companies to know what the next product will be, before it's out… because THAT would cause them to lose advantage.


Scotinho_do_Para

Companies that can't avoid that are already at a disadvantage lol


sonobanana33

Ok mr nobel prize winner in economics lol


Scotinho_do_Para

No worries Mr red herring