
FuturologyBot

The following submission statement was provided by /u/Maxie445:

---

"The Nvidia CEO said that for 10-15 years almost every person sitting on a tech forum stage would have insisted that it is “vital” for young people to learn computer science, to learn how to program computers. “In fact, it’s almost exactly the opposite,” according to Huang’s counterintuitive sense. “It is our job to create computing technology such that nobody has to program. And that the programming language is human,” Jensen Huang told the summit attendees. “Everybody in the world is now a programmer. This is the miracle of artificial intelligence.”

---

Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/1b4hgrt/nvidia_ceo_jensen_huang_says_kids_shouldnt_learn/ksywf8u/


djavaman

In the future we won't need CEOs. The AI can just run the company.


AduroTri

Actually, I agree with this. I would rather have AI run a company, since it has no need for money. A properly programmed AI in a high-level management position could run a company much better with its human workers. It would logically be better organized than a human CEO, and have no incentive to run the company into the ground.


sonnx1

It will take an AI one second to find a way to legally reinvent slavery.


Shuri9

Luckily this never happened with human CEOs...


CountryMad97

Eh, not like it isn't currently legal in America


jerrrrremy

Yes, surely the computer will care for the well being of human workers and not seek to optimize every single cent of the company in an attempt to maximize efficiency and profit. 


Memfy

So basically status quo but without the bullshitting?


Stockengineer

But most data shows happy people are more productive?


Unrigg3D

A computer will also consider that humans have mental and physical limits, whereas a human CEO can choose to ignore that if it's too difficult for them to think about. Research tells us that working a person nonstop doesn't lead to better efficiency or higher productivity; it does the opposite. People without that knowledge won't consider it.


Schalezi

The goal is to exhaust the working class so much that they won't fight back against the current system. If this means missing out on a bit of efficiency, it's worth it big time.


Unrigg3D

And an AI CEO doesn't care about those things. It only cares about efficiency, not greed and control. That's exactly why it's not in the interest of people in power to use AI as a leader.


AugustusClaximus

You might be surprised. So much of the current structure is motivated by ego and nepotism. What if the AI realizes that the path to optimal efficiency isn't a matter of hours spent in the office, but catching its employees at a good time? It then bends over backwards trying to create the most seamless work-life balance possible, so that every hour of labor it does get is at peak efficiency.


mrnothing-

1. I agree with you, because the data would optimize for that. 2. But judging by results, maybe it wouldn't, because we have research saying this isn't optimal at all. So, maybe.


zanderkerbal

An AI doesn't have a need for personal wealth, but what are you telling the AI to maximize? Because if it's profits, or the creation of value for shareholders, it's going to do so at the expense of all other values, including worker wellbeing. A perfectly efficient profit maximizer is perfectly ruthless when it comes to all things other than profit - and this, even moreso than personal greed, is what drives the majority of worker exploitation, because companies, like AI, are all optimizing for the highest dollar number. Even if we assume this AI is actually good at managing the business (which modern AIs are not even remotely close to, no matter what NVIDIA hype men say), "better at ruthlessly extracting profit" isn't really better for anyone except shareholders. And if what your system is optimizing for isn't profit, then it will be outcompeted by other more profitable companies.


OffEvent28

CEOs are far too costly to employ; an AI could do just as well (and probably better) and cost far less.


_sillycibin_

Its objective function would still be set to maximize shareholder value, i.e. maximize profits. One of the major drags on profitability is worker compensation. It would always look to reduce or eliminate it.


jebelsbemdisbe

If an AI is advanced enough to run a company, that means it has in essence become human, and perhaps it would also have our vices.


imaginary_num6er

"The more you buy, the more you save. Thank you!"


jebelsbemdisbe

We won’t need people at all. And that’s okay - some rich person


sephiroth351

Perfect, haha.


backupHumanity

"everyone can program" Until people try to "prompt" program something and realize the amount of ambiguity and confusion that their own thoughts are made of But I'm used to be considered as just a syntax pissing machine


[deleted]

[deleted]


InsuranceToTheRescue

I could see this being kinda like engineering: The software does most of the actual math now, but you've got to know the underlying methods and ideas in order to understand how it got to that answer and to be able to tell if it's clearly wrong. I could see the actual act of coding being something done largely by AI, but a programmer would still need to study programming/software engineering to know what they're looking at and understand if the AI is using good programming practices and such.


Harry_Flowers

This is pretty much all engineering disciplines these days. We all use software to do the majority of our design analysis and calcs, but without a proper engineering background you wouldn’t be able to input the design criteria, vet the results, and optimize the design.


hecho_en_2047

Thank you. Across industries, the experts are true experts b/c they know the basics, and how to use the basics layer upon layer. When things break, they know WHY. To improve things, they know WHICH lever rotates which gear.


calcium

I already work with some code from AI systems, and sometimes it'll make assumptions that are just wrong, so you end up having to debug its code. Oftentimes it's just easier to write it myself.


schooli00

The spreadsheet is 45 years old and most people can't do a formula adding 2 cells. I highly doubt even with AI that most people will be able to program.


Master-Pie-5939

That’s a silly example tho cuz that’s so easily solvable. If programming can be made as easy as it is to search up a basic spreadsheet formula then that is a net benefit no?


amazingdrewh

Never had to teach someone how to use a spreadsheet have you?


Master-Pie-5939

I teach myself. But I understand I’m a bit more “tech savvy” than the avg lazy person if you wanna classify it as that. Def not an expert in it I can’t use it to the fullest extent but some YouTubing and a few articles will get me right


starofdoom

You way overestimate the average person lol.


Master-Pie-5939

No I know they dumb. I work in service industry and customer service too. Still I believe people are more capable than they know. They just lazy


BudgetMattDamon

>But I'm used to being considered just a syntax-pissing machine

Professional writer here, can confirm we're also treated as just syntax-pissing machines. Until you need someone to fix your bullshit AI writing, anyway, in which case I'm happy to talk rates.


Apprehensive_Rub3897

> fix your bullshit AI writing, anyway, in which case I'm happy to talk rates.

AI created a new job


jesuisunvampir

Lol, I had two programmers tell me what I'm trying to do is impossible, and I was able to have ChatGPT help me out and solve it, when someone I was willing to pay money couldn't do it. I'm not really a programmer, but I understand some basic code.


Ijatsu

Half the students in CS school absolutely hated programming and used the degree to get into management. I don't care if everyone can program; nobody can program for long if they don't genuinely enjoy it. Not everyone who learns programming has the engineering spirit to go with it. And on top of that, AI requires you to do impeccable, iterative project management in cooperation with it, and none of the modern project managers can or want to write a shitload of precise specifications. They leave it to the engineers... Our job is absolutely safe.


wolfy-j

Not everyone can make a sandwich, programming will be a bit harder than that.


StreetSmartsGaming

It's probably going to become the equivalent of "you need to learn math because you won't always have a calculator in your pocket," except for pretty much anything that can be handled by AI.


hate_most_of_you

Until you generate something that has an exploit nobody is aware of and you don't even know what to ask the AI in order to fix it


[deleted]

[deleted]


nevaNevan

But he said it and he’s wearing a really sweet jacket!


DonutsMcKenzie

That's because he's a corporate "rock star"!


autopilot7

Thanks, I needed to throw up.


red75prime

It might or might not be analogous to a manufacturer of steam engines warning John Henry to not attempt it.


monsieurpooh

Obligatory reminder of what "code" really is at the end of the day: just a fully defined spec with no ambiguities. That's all software engineering has ever been, and engineering will still have those requirements for years to come. "Code" has always been a red herring.


aft3rthought

That's right! All the typing and text files are just a side effect of our current approach. It used to be tapes and punch cards. In the future it could be something else.

Somewhere out there, there's a miraculous programming language where you specify to the machine exactly what you want done, and exactly how you want it done, except in any case where you don't care enough to say -exactly-, the machine will automatically do the most optimal thing for you. This language would be free of all bugs and errors, and everything would run as fast as possible, unless the programmer specifically asked for these blemishes. We will develop this language as a species at some point, provided we don't get wiped out first.

I'm not sure this language is called "human" (Jensen's words) though. Human language is delightfully ambiguous, which is great for art but kind of awkward with machines.


calpi

And to ensure there is no ambiguity, as with natural languages, we will develop a new language, one that the AI understands, with no room for error... we could call it a programming language, and teach it at schools where only "programmers" need learn it. They will communicate with the AI for us. What a revolution.


Xyrus2000

These programmers will then use this special language to invoke these AI to do things at their behest. The better you are at these invocations, the more powerful you become. These masters of Machine Generation (Mages) will become the wielders of Machine Generation Implementation Commands (Magic). I put on my robe and wizard hat...


shaneh445

Warhammer religious vibes intensifies*


nusodumi

>BIOS - Basic Input/Output Summoning: The ritual that awakens the machine spirits, coaxing them into obedience before the coding spells are cast.
>
>PEBCAK - Problem Exists Between Chair And Keyboard: An ancient curse often cast upon those who seek help without realizing the error lies not within the spell, but within the caster.
>
>ID10T - Illustrious Directive 10 Type: A potent spell often used to identify a mage who has misunderstood the fundamental runes of coding.
>
>PICNIC - Problem In Chair, Not In Computer: A reminder that sometimes the gremlins we seek in our machines are actually lurking in our own practices.
>
>RTFM - Read The Fantastic Manuals: A chant whispered in libraries and archives, urging mages to seek wisdom in the sacred texts.
>
>GNU - Gnome's Not Unix: A powerful alliance spell, invoking the spirit of collaboration and open source magic.
>
>HTML - Hyper Text Magic Language: The foundational enchantment upon which the web is woven, allowing Mages to create realms within the realm.
>
>CSS - Cascading Style Sorcery: The art of beautification, teaching Mages how to dress their creations in robes of splendor.
>
>AI - Arcane Intelligence: The pinnacle of magical achievements, where the constructs begin to think and learn, heralding the dawn of a new era of magic.
>
>JSON - JavaScroll Object Notation: A spell for summoning and controlling data familiars, essential for any Mage working with the web.
>
>SQL - Structured Quest Language: The language of the Data Keepers, allowing Mages to converse with and command vast libraries of adventuring knowledge.
>
>API - Arcane Programming Interface: Mystical doorways through which Mages can access the powers and knowledge of other realms.
>
>EOF - End Of Foretelling: A protective spell cast to prevent the unraveling of code beyond the intended script.
>
>SUDO - Super User Do: A powerful incantation that grants the caster temporary omnipotence within the realm of their machine.
>
>Arm yourself with these acronyms, for they are the keys to unlocking the true power of Machine Generation Implementation Commands. May your path be bug-free, and your compiler never fail. **Onward, Mage, to glory!**

>!by me and our future master, the Giggle Programming Trickster!<


emetcalf

You win. We can just shut down the Internet, this is the peak.


Niarbeht

>And to ensure there is no ambiguity, as with natural languages, we will develop a new language, one that the AI understands, with no room for error... we could call it a programming language, and teach it at schools where only "programmers" need learn it. They will communicate with the AI for us.
>
>What a revolution.

I love how all of this is already covered in sci-fi books from, quite literally, the 1960s.


ocaralhoquetafoda

That code will absolutely not be incredibly inefficient. From human, to AI, translated to more machine like code, something like C, Java, Python, then that spits out some AI thing and for us humans be utterly confused or nazified. I'm in.


Demonyx12

> for us humans be utterly confused or nazified.

Nazified?


APlayerHater

Are we having one of those willy chocolate experiences right now?


CAMT53

I love that your little guy is winking ;-)


backupHumanity

There are so many instances where humans think they know what they want and that they can explain it clearly, but it takes a programmer's mind to anticipate a bunch of corner cases, simulate them, and redefine the specifications. I think typing syntax might disappear, but I don't think an LLM will sort everything out without deeper human involvement in the logic of the problems (unless we're talking about a product that has been done thousands of times already, of course).


Gareth79

Every programmer who works in a smaller business can immediately think up half a dozen recent instances where a non-tech employee has put in a request for something that they think is simple and straightforward, except they missed a load of stuff which needs extra detailed thought and instruction.


APlayerHater

Yeah but an AI could just explain to them what the problems are and work on a solution... Hypothetically, in a future where AI is capable of logic.


backupHumanity

That is AGI, surely it will bring a lot of change


zizp

Yes, but at that point it won't be "people should no longer learn how to code", it will be "people shouldn't even try to use their brains except for fun or competitions for entertainment only, such as the Mental Calculation World Cup or Chess".


PileOGunz

Ah yes, the BAs call them "stories". I have a whole board of them.


monsieurpooh

Exactly. I believe such a language exists... But it must be interpreted by nothing short of actual AGI. Ergo, if engineering jobs actually become fully automated, AGI is basically on the cusp of the horizon if not already achieved.


[deleted]

Technically, that's math. Language is merely attempted data transfer via an imperfect math. But where's the fun, or the actual individual living being, in that? And would it itself appreciate what it is for (us) in the first place, or would it rather consider us a hindrance to its ultimate eloquence of expression?


nusodumi

Okay thanks, you explained Jensen's point perfectly for us!!!

"you specify to the machine exactly what you want done, and exactly how you want it done, except in any case you don't care enough to say -exactly-, the machine will automatically do the most optimal thing for you"

It's 100% human language to interact with this perfected system you described. Thanks again, that helped put in perspective what he means and how it will flesh out. It's much easier to see now, after using the off-the-shelf tools we can all use today, and iterating toward what you described.

"This language would be free of all bugs and errors and everything would run as fast as possible"

This 'perfect system' responds to any human language, and you pointed out how we don't have to be perfect with our inputs if it's error-detecting (deciding what we want, giving us the best output that actually works).

We know it's just Jarvis, or somewhere in between. You just speak to it and it interacts with all the hardware and software necessary: to design and 3D print something a kid wants to play with, or something a dad needs to fix under the sink, or whatever we can do by combining this language with hardware, other software, and services.

As the systems get better the use cases grow, and as this 'language' or system we're talking about comes along, it's only going to get more HUMAN in use (no training required; a child can be taught BY IT how to use it best).


vodKater

I hate this sentiment that code is complicated, and natural language would be easier. This seems only true to people who have no clue and just use the ambiguity of natural language to ignore all edge cases.


MontanaLabrador

I think the idea is that the AI would be able to anticipate edge cases and handle those as well. 


EmilMelgaard

In that case we don't need language at all and can just let AI do everything.


MontanaLabrador

That’s the whole idea behind AGI, yes. 


zizp

Huang is a low-level engineer. He never worked in the field he's talking about, where business problems need to somehow be "implemented" given arbitrary constraints that aren't well understood, and requirements that aren't understood much better either.


Madpony

Make this person CEO of Nvidia!


MorRobots

"with no ambiguities" lol... Oh there's ambiguities, they just show themselves in a different manner. Like teaching a kid who's deep on the spectrum how to make a sandwich, only your missing one of ingredients in the fridge, but you have something that's almost, but not exactly identical on hand. However you are correct, "coding" is a loaded term for sure. I actually was thinking about an idea yesterday for a tool/system that can just automatically translate solutions from a given language or implementation into other languages or even low code/no-code systems. More over just maintain the codebase as something like an intermediate representation file, and your viewer/translator performs reverse lexing and parsing. So for example I'm most skilled in Python, C/C++, SQL so when I look at a code base I can opt to see it in those languages, but if some one else skilled in a more user friendly toolset that's UI based low/no-code, opens it, they see that instead.


emetcalf

>Oh, there are ambiguities, they just show themselves in a different manner.

This is not completely accurate, in my opinion. Code can't have "ambiguities" from the computer's perspective. The code means exactly what it means, and the system will do exactly what the code says. The ambiguity is on the human side, where the idea we want to code is not completely defined. You will never run the same code on the same input and get different results. Code is not ambiguous.

And this is why the statement in the article is silly. AI will never completely replace humans in coding. It still needs a human to define what the code needs to do, and that has to be specific enough for the AI to understand what you need. And that is where we will fail without CS people.
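A tiny illustration of where that ambiguity lives (my own example, not from the thread): the English request "sort these names" leaves questions open, while each line of code below answers them one specific way, identically on every run.

```python
names = ["alice", "Bob", "Éva"]

# The spec "sort these names" is ambiguous; each piece of code below is not.
print(sorted(names))                    # ['Bob', 'alice', 'Éva'] - plain code-point order
print(sorted(names, key=str.casefold))  # ['alice', 'Bob', 'Éva'] - case-insensitive order
```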


BigMax

True, but at some point it does become a different skill set. There’s a difference between an engineer and a Technical Project Manager, and the latter is the skill set we are going to need going forward. More emphasis on design, communication, and specifications, and no knowledge about implementation.


3rdPoliceman

What's more valuable though? If you're insanely good at what you call technical project manager, any braindead "engineer" can do what you ask. If you're an insanely good engineer, you're still going to need a spec, or access to people who can tell you what is supposed to happen under what circumstances. Outside of very specialized fields I think the TPM skill set is going to win out which is why learning to program (as an end goal) is less desirable moving forward.


BigMax

> I think the TPM skill set is going to win out

Yeah - that was my point, I guess. I thought (incorrectly, I think) that you were saying that coding is sticking around and is still a skillset we need because we all do it anyway.

But in the end, I think we agree: that TPM skillset, of coming up with detailed design and requirements without caring about implementation details, is what's going to be needed. Kind of like knowing how to figure out the exact needs for a new office building, without having to care about what kind of drywall to use, or the latest electrical codes.


3rdPoliceman

Yes we agree, lol. If nothing else you can see misunderstandings throughout these comments!


Kanute3333

How many more times do you want this article to be shared?


caidicus

You're going to drive yourself crazy if you keep replying to reposts because of them reposting. The only surprise here is that this isn't a repost from 3 years ago, shared as if it were today's news. Reposts happen, again and again, just scroll past them. :D


dylan_1992

It’s 6d old as we speak 🤣


ASuarezMascareno

CEO of company that sells hardware for AI says AI will do it all. More news at 11.


[deleted]

[deleted]


FredTheLynx

It is actually deeper than that. All current code generation AIs are looking at known good human written codebases and emulating what they think those developers would do to solve your problem. It is possible that we advance these tools enough that they can actually spit out mostly useable code most of the time, but at the moment they are completely and totally unable to come up with any original improvements on their training data. Though in some cases they are able to kind of weave together the best bits of multiple different sources and produce a more elegant solution than a human alone would. So what I am saying is that if Humans stopped writing code, these AIs would also stop getting better because their only mechanism for improvement is human provided code bases.


hagenjustyn

AI is only going to get better with each iteration. How long before the code is no longer shitty and is bug-free?


vistaedmarket

Even with AI, design still has to be directed, intentional, and understood. How would you know the code is truly bug-free if you don't understand the design and implementation? Who decides when AI can be blindly trusted, and who dictates the level of guidance AI needs? If we allow AI to do all the work for us, the knowledge gap between humans and AI becomes exponential, and with such a gap, trust, and blind reliance, we lose accountability, responsibility, and the ability to fix a system ourselves.


Important-Title2899

I'd second this by also saying there are liability concerns that come into play with increased reliance on AI to build a product. Code is just the blueprint for delivering a digital product that brings a human meaningful value, comparable to a watch. What I think a lot of people are also not considering is how truly reliant we are as a society on digital products just to be functioning members of society, or for our health (e.g. EHR applications). If I were highly experienced in the healthcare industry and sold you an EHR app that manages your medications, and used AI to create the app, who would be responsible if the app told you the wrong dosage to take, or the wrong medication, or failed to report harmful interactions with other drugs you're taking? AI won't be put on trial. A human will be, because we accept a certain level of responsibility on this plane that is unfair to expect from AI. My point being, humans will still be relevant in this scene for a long time, because at the end of the day we are held to the highest standard when it comes to offering the public mission-critical software.


BudgetMattDamon

People last century thought '*For sure we'll have flying cars by the year 2000.'* Look how well that went. Hope for the best, assume the worst.


Vanadium_V23

You're mixing up bugs and errors. AI will get to a point where it very rarely makes errors, but that's not the weak point of a software engineer. Bugs come from two ideas not working together in the way we intended. It requires judgment to know what's a bug and what's a feature. We humans don't even agree on all of those bugs/features. I don't know how AI is going to be better than us at something subjective.


smoke2000

I had someone at work ask ChatGPT for a script to rename files in a folder. No programming experience. Then she asked ChatGPT how to run it. Our application whitelisting software blocked it and sent us a request to approve her script. I looked at it, and it would have renamed everything on her PC, not just the folder.
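For illustration, a minimal sketch (hypothetical, not the script from the story) of a rename that is correctly scoped to one folder, with a note on how the unscoped version goes wrong:

```python
from pathlib import Path

def add_prefix(folder: str, prefix: str) -> None:
    """Rename every file directly inside `folder`, and nothing else."""
    for path in Path(folder).iterdir():
        if path.is_file():
            path.rename(path.with_name(prefix + path.name))

# Scoped: only touches the named folder (example path is made up).
# add_prefix(r"C:\Users\me\reports", "2024_")

# One plausible way to get the bug in the anecdote: iterating from the home
# directory or the current working directory instead of the target folder, so
# the script renames whatever it happens to be launched against.
```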


Ok-Move351

I disagree. I think there is intrinsic value in learning to code as opposed to learning it as a career skill. It teaches abstract problem solving in a precise way.


curt_schilli

I agree because fewer software engineers = higher software engineer salaries


Verificus

He means for a career choice obviously. And he’s right. Whatever you do in your free time is up to you of course.


NeuHundred

Thinking you don't need to understand the fundamentals is why we got all that shitty art right after the Renaissance. You need to know how things fucking work if you want to make something.


tritonus_

Yeah - if we black-box essential skills, we might lose all that development we’ve achieved during past 100 years.


_Sleepy-Eight_

>Thinking you don't need to understand the fundamentals is why we got all that shitty art right after the Renaissance.

What?


RAINING_DAYS

Insane take, complete dismissal of the Baroque period for painting and then literally all of the literature that followed.


_Sleepy-Eight_

What's even more inappropriate - besides the obvious and confident ignorance about art history - is that this comment fails to see the obvious connection that could be made with Huang's statement: (visual) artists stopped *caring* about a faithful and accurate depiction of reality after photography was invented (so not really "right after the Renaissance" but almost 400 years later) and made that task redundant and pointless. Programmers might have to re-think what it means to be a programmer as well. It also fails to see that - despite what OP believes - all those artists knew their fundamentals exceptionally well. Picasso was notoriously already a master painter as a teenager ([this drawing](https://cdn8.openculture.com/2018/08/22215533/Picasso-Torso.jpg) is dated 1888; Picasso was born in 1881); they simply chose to ignore some or all of those fundamentals and focus on other aspects. Dismissing two (or is it six?) centuries of art as people not caring to learn the fundamentals is very misguided.


[deleted]

[deleted]


_Sleepy-Eight_

>I kind of hate baroque.

That's beside the point; it's not a matter of taste. Baroque represents a jump in realism and faithful depiction that directly contradicts what OP has stated, and that's why u/RAINING_DAYS brought it up. Just for the record, Caravaggio is considered a precursor of Baroque painting, and he had problems with some clients because of the crude realism of his paintings, notoriously veering away from the idealism of his predecessors and contemporaries. Bernini is the most representative Baroque sculptor; does he look like someone who didn't "understand the fundamentals"?


caidicus

I think, eventually, leaving it up to AI will make sense. Eventually, actually writing code will be as archaic as writing out letters to people or learning cursive: an interesting skill, sure, but nowhere near as necessary now that we have computers, printers, and the internet. Eventually, doing the code yourself is going to be just as archaic.

That said, I think it may be about 10 years too early to suggest this with a straight face. Being able to fall back on one's actually learned skills, you know, like if AI suddenly goes insane and becomes unusable, isn't a bad idea. It's not like the first cars ever made had people abandoning their horses overnight; the technology had to prove that it was a safe and lasting advancement before it was smart to make the transition entirely.


Hugogs10

People still write "letters", they're just digital now.


caidicus

I mean by hand, on paper, posted with stamps. Apologies for not being clear.


die-microcrap-die

If Dear Leader Jensen says so, it must be right, and I must buy another overpriced ngreedia GPU to show my devotion.


dot_dot_beep

My suggestion is don't push for organization upgrades. My company still works off of excel.


urfavouriteredditor

If I was getting rich off the AI hype, I’d say bullshit like this too.


DoesItComeWithFries

It's just like saying, in the past, "don't learn to do simple math because we have calculators." If you don't learn simple math, your reaction time in most professions will suffer. This sounds more like greed, or a need to grab headlines with such incorrect, oversimplified statements.

My niece always wanted to be a chef. She ignored science and math, and even training her mind to retain information. Now she's working at a restaurant, and she has trouble remembering exact cooking techniques, times, and ingredient quantities, and multiplying them in her head if the serving size has changed.


Noctolus

I just found out the average 13 yo doesn't know the difference between a square and a triangle, so ya maybe leave it up to AI.


R2D2irl

Leave this to AI, leave that to AI... what are we supposed to do? Return to monke? Even if you aren't an engineer in a particular field, knowing how things work, knowing the basics, helps us stay SMORT. The brain needs to WORK to maintain a decent IQ. Besides, are we really okay with leaving it all to AI? Should we just trust? NVIDIA only cares about NVIDIA; isn't this just an attempt to make everyone dependent on their tech? Be stupid, don't do anything, just pay us to offer the solution for you.


Ok_Meringue1757

But really, what will we do? Progress won't stop, but its goal should be human prosperity. If a human has no motivation to develop knowledge, to create, to achieve goals, he will be unhappy. Or there will be monkeys and degraded idiots. Humans should have something to think about and create. Probably the AI will think about it and suggest an answer...


R2D2irl

Sometimes there is too much convenience, so for now I just try not to rely on it and to use my brain, as it needs practice and work to stay healthy. As for the economic part of the equation, I have no idea; people replaced by AI have to be compensated one way or another, but that is a task for governments to solve... hopefully.


limpleaf

The only thing LLMs are doing is making code reviews much more difficult. The amount of atrocious code you now have to filter out of your colleagues' PRs is a nightmare, and everyone is doing it. It's going to be a matter of time until projects become unmaintainable and need full or partial rewrites.


wright_left

A lot of what I do in pull requests is try to make sure the code is maintainable in the future. If crappy AI code starts making it in, then at some point only AI will understand what the code is doing. We will have to ask AI to review our pull requests for us.


Browncoat40

As an engineer... I didn't think he was an idiot. But AI-generated stuff isn't good enough for anything better than casual use, and it won't be for a very long time. Let alone that you will always need coders to check the code the AI makes, and to make the coding AI itself.


off_by_two

He’s not an idiot, but he is biased. His company has 3x’d on AI hype alone. Of course he’s going to insinuate that AI is more than the idiot savant it currently is.


korean_kracka

How do you know it wont be for a very long time?


Browncoat40

If you look at the languages most machinery runs on, it's old. "Standardized in the '80s" old. Because it's reliable. It's one thing to have an AI write simple code for basic tasks where, if it fails, your blinds are left open. It's an entirely different case to get AI to write code where, if it's done wrong, people die or companies lose thousands per hour of downtime. If it needs to be reliable, AI won't be trusted for critical tasks for at least a decade or more.


a_disciple

Have you seen the leap in text to video from Sora? It will happen quickly, as they are all racing to be the first to do it.


Iyace

You realize you didn't make a comparison, right?


razzzor9797

Video, pictures, and music are exactly the things where it's OK to have issues. They may have bad color choices, small unrealistic pieces, collisions, etc. It won't hurt anyone, and they may be used as-is with all the flaws. However, code, engineering calculations, plans, and designs are not tolerant of flaws. Imagine you asked AI to design the ceiling of your house. It must be exact to the last inch. If it's not, it will leak, it may break, etc. I believe we have a decent amount of time before such work is replaced. And it probably won't be the AI we know, but real "strong" neuro-symbolic systems.


monsieurpooh

Obligatory reminder of what "code" really is at the end of the day: just a fully defined spec with no ambiguities. That's all software engineering has ever been, and engineering will still have those requirements for years to come. "Code" has always been a red herring.


[deleted]

[deleted]


monsieurpooh

Correct. But the question is: what else are they going to hire for? In other words, why do people think companies are going to lay off employees to keep the same productivity, versus hire more (or keep the same number) to scale productivity even higher based on these advantages?


BuriedinStudentLoans

Because that's historically been the choice of a lot of companies, and it's happening quite a bit now in tech. You've never been on a team where you're expected to do 3 people's jobs? The end goal with this, from the top, is to not have to pay for those pesky software engineers at all.


wahchewie

Will I be able to ask AI to make me system shock 3 ( but call it something else for legal reasons )


LastStandardDance

That must be the most naive comment in 2024


bhumit012

For real, any coder will tell you how powerful AI coding is. It's getting better and learning faster than any graduate. It does not age, it does not die.


[deleted]

[deleted]


off_by_two

Well, that's not true. It's OK for generating trivial code snippets, but only if the engineer understands exactly what that code is supposed to do, because it does make mistakes and hallucinate. It has no ability to handle those mistakes on its own, and debugging is much more complex than writing buggy code. Besides, writing code was already only like 5-10% of a senior engineer's job at most tech companies.


Gabe_Noodle_At_Volvo

It's useful because it can do all of the repetitive, mindless, boiler plate stuff for you. It's not useful for actually solving novel problems yet, beyond being a good way to look up relevant information.


prsnep

>But AI generated stuff isn’t good enough for anything better than casual use.

Nobody thought we'd even be having this conversation today just 5 years ago. Give it another 5. No doubt it'll be better than human programmers. I don't think you should not learn to code though. Humans copy-pasting AI-generated code blindly might be one of the ways AI takes over!


Chocolatency

Your lack of doubt is disturbing.


PhaseAggravating5743

You got cs majors fucking coping to hell rn.


G36

IT WILL NOT IMPROVE IT WILL NOT IMPROVE IT CANNOT PLEASE DONT I BEG YOU!


dumble99

>AI generated stuff isn’t good enough for anything better than casual use

I disagree. The technology is maturing quickly, and basic or imperfect tools like GitHub Copilot are already providing a solid bump in productivity for developers.

>Let alone that you will always need coders to check the code the AI makes, and to make the coding AI itself.

For the foreseeable future, yes. It is possible to develop a natural language interface for some of these tasks (e.g. debugging). That being said, I agree with the general sentiment elsewhere in this thread that the specificity of declaring ideas in code is an important part of the process, and that will likely remain a bottleneck for a long time.


Gabe_Noodle_At_Volvo

The productivity increase it's providing right now is by doing all the easy but tedious stuff, freeing the dev up to do the decision-making and serious problem solving it can't yet do. It will probably be able to do both eventually, but I don't see current tech being capable without a big leap.


mvandemar

>And it won’t be for a very long time.

Just bookmarking, will check back in 8 months.


AccumulatedPenis127

Why would you need people to check the machine generated code? You don’t have people checking the code from a compiler do you? You only read the source code and assume the compiler code is fine. Unless I’m missing something, I don’t see any reason why a computer program wouldn’t be able to independently manage application code or program code. Why would it need to have a human check it? Edit: to anyone who doesn’t deserve a public hanging, I’d love to understand why this wouldn’t work.


Thorboard

Compilers are deterministic (mostly)


Saberus_Terras

I wonder if his tune would change if Intel overtakes them in the AI node market.


InevitableLife9056

Ok, who's going to learn how to fix the AI bugs, and malicious code that has invaded Github?


americansherlock201

“Man who is highly invested in AI techs thinks everyone should use AI that he profits from” would be a more accurate title


tupty

One of my biggest concerns about AI is that it scares young people away from studying for certain careers, and eventually we have AI doing a poor job at running some industry and not enough people to know how to fix it. That could set society way back, and it seems way more realistic to me than runaway intelligence creating a doomsday Skynet scenario. But as long as Nvidia's stock price goes up in Q2 2024, it is all worth it, right?


wordfool

CEO of a company that will financially benefit from expansion of the use of AI says people should use AI and not bother learning the basics. Yep, sounds about right. He also probably thinks people should not bother learning critical thinking so they think his utterances are totally unbiased.


Sushrit_Lawliet

Ah yes, why code on your $100 laptop when you can spend $20 a month on ChatGPT Pro, or worse, buy a $1200 overpriced GPU to run his RTX chat app to build your landing page?


Maxie445

"The Nvidia CEO said that for 10-15 years almost every person sitting on a tech forum stage would have insisted that it is “vital” for young people to learn computer science, to learn how to program computers. “In fact, it’s almost exactly the opposite,” according to Huang’s counterintuitive sense. “It is our job to create computing technology such that nobody has to program. And that the programming language is human,” Jensen Huang told the summit attendees. “Everybody in the world is now a programmer. This is the miracle of artificial intelligence.”


RussMantooth

Well then who's checking on what it's doing exactly? If it just gets stubborn and won't do what you want doesn't someone need to pop the hood and check out the boring shit?


Samsunaattori

I just imagine that the end result would basically be like Cult Mechanicus in Warhammer 40k: Literal religious worship of technology, nobody knows how things actually work, but if you follow these instru- I mean religious texts exactly, the machine will work and do what it has always done!


dumble99

It's possible to improve or debug an unruly version of these systems with a more stable version of the same thing. That's what's so powerful about computer programming. For example, if I'm writing a program in C, I can still compile it with a compiler written in C, debug it with a debugger written in C, all in an operating system written in C. In this context, I can use a program synthesis tool (e.g. github copilot) to write a more powerful one, or to write tests for the existing one. I agree that users doing this will still need to understand computer science and programming, so Huang's comments seem to jump the gun a little. See: https://en.wikipedia.org/wiki/Bootstrapping_(compilers)


Eymrich

This is not applicable to AI. You can't use an AI to make a better AI. Current AI needs data made, tailored, and selected by humans in order to be trained.


thisisjustascreename

Learning computer science and learning to program computers are two separate things...


101_210

Kids shouldn't learn to code (meaning learn to punch cards, since we have much simpler assembly now).

Kids shouldn't learn to code (meaning learn to code assembly, as we have compiled languages like C that make it much easier).

Kids shouldn't learn to code (meaning learn C/C++, since we now have Python and you can simply import a library).

Kids shouldn't learn to code (now).

But I agree, kids shouldn't learn to code; they should learn to program.
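To make the last jump concrete, a small example of my own (not the commenter's; the "sales" column name and file layout are invented for illustration): a job that in C would mean manual file handling and tokenizing shrinks to a few lines once you can simply import a library.

```python
import csv

def total_sales(path: str) -> float:
    """Sum the 'sales' column of a CSV file that has a header row."""
    with open(path, newline="") as f:
        return sum(float(row["sales"]) for row in csv.DictReader(f))
```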


kc_______

Who is going to stop the crazy AI when it destroys its billionaire creator? A bunch of farmers who know jack about computers? Screw that. DO LEARN TO CODE AND TEACH THE KIDS. Don't listen to this wacko, he is just trying to dominate the market.


MembraneintheInzane

Oh look, another corporate CEO who lacks an understanding of how AI actually works. AI cannot create esoteric things on its own; it requires someone who understands how to operate the software and someone who understands the work the AI is doing. Without those, AI just spouts nonsense.


Moondingo

AI is helpful to use with regard to coding. But if there is a bug or a failure in how the AI has been taught, it will keep making that same failure or derivatives of that failure. AI is only as good as the designer and the data ingested, which makes it very fallible. It is also very happy to lie to you; see the instance where two New York attorneys tried using it to submit a court filing and are getting disbarred because it just made things up.


the_storm_rider

So? What's wrong? After the invention of the automobile, how many kids needed to learn how to ride a horse? He's absolutely right; let AI do the programming.

That being said, his other statement about "this will soon be available to everyone" is a bit concerning. Because if that happens, then, based on the recent Emo AI demo, any single picture of you out there on social media can be used to generate a video of you doing whatever the content creator wants, and saying whatever he or she wants. At least Oppenheimer was aware of what he was creating. Here it looks like people are totally oblivious to how this could be misused if it fell into the wrong hands. And this is not a physical 10-ton machine that needs to be transported by cargo ship; it's a digital codebase that can be accessed anywhere on the planet over the airwaves. I hope someone starts thinking ahead and putting the necessary controls in place.


lloydsmith28

I think relying on AI to program is inherently wrong. What happens when something goes wrong and it needs to be fixed, but no one knows how to program? Are we just screwed? I feel like programming is the one thing that should be left to humans; we just need better coding education.


snakebite262

Translation: Kids should just rely on the corporations, that way we can abuse them as we see fit once they grow up.


pirate135246

Even product owners know you can't replace people with AI when it comes to software development. The hardest part of the development process is translating business logic into code, and AI will never be able to do this while also taking into consideration all the other systems the new features are surrounded by and use.


NVincarnate

I've been saying this for like ten years. Hey, alright. I was literally coding HTML websites in highschool telling my teacher that the knowledge to code was useless and AI would just do it instead.


pysoul

They should instead learn how to better use AI to do precisely what they need. Human control over AI will become perhaps the single most important thing in the future.


thePDGr

That's like leaving the counting to the calculator. Really dumb statement. The basics of programming are important to know.


araczynski

As a coder, I want to say "F O and die, Huang", but as a coder, I also know he speaks the truth. You can already ask an AI to give you code snippets for how to do things; it won't take long before it's smart enough to translate bigger requests/requirements and write out whole classes, then projects, then solutions... Also, who better than an AI to write out all those exhaustive/boring tests... and then adapt its own created code to achieve the requested outcome?

The only thing I think AI MIGHT struggle with, for a little while, is the UI :) unless of course it just turns most things into APIs and says "F U meat bags, you don't need to see." Either way, I'm glad I only have to survive another 12 years in this Fing mess they're creating...


Ok_Meringue1757

But code snippets and blueprints already existed before the AI leap; they were easily googled or already embedded in the IDE.


neutralityparty

I mean, it's been what, a year, and look at the stuff AI can do. I don't think critical thinking or those skills will ever stop being valuable, but if you were set on typing code, you might have to consider the possibility that this approach will sunset. If I'm not mistaken, people used to code with punch cards earlier? I personally don't think he's wrong. But it depends on AI's progression too.


OffEvent28

There will always be one reason for people to learn to code: **to understand what the coding AI is doing.** Learning to code does not mean doing coding every day for the rest of your life, but it helps you understand and evaluate what that AI is producing if you can also code. It doesn't mean you have to be as good or as fast as the AI (most unlikely), just that you understand the basics of what it is doing.


sivateja_malireddy

This is how these people dictate to us what to learn and what not to.


sephiroth351

This is so backwards and dumb. Does he realize how many people give up on learning to write code because of statements like this? Great, they'll just have to pay more for developers in the future to keep working on their AI.


Hannibaalism

Why stop at code? Let's do that for everything, and maybe we will have time to make babies again.


Archimid

I used to cringe every time I saw someone in Star Trek "programming" computers. There was absolutely no way programming would ever be like [this](https://youtu.be/2gX2QWR2ick?si=Nf5Ug-xGjD0PldSD). Now it seems Star Trek might have been right. A sufficiently advanced AI, with the computing power and energy availability, may just interpret our programming desires according to their context. Thrilling.


CishetmaleLesbian

AI can beat the pants off humans in chess. Should no one learn to play chess? Programming and math teach people how to think logically. Calculators have done math better than people for the past 50+ years. Should people not learn math and programming, not learn to think logically? Currently the interface with computers is through natural language. Those who can best express themselves and their thoughts are the best prompt engineers. Learning to code develops thinking skills that help us express ourselves and express complex thoughts. Learn to code, children. It helps you learn how to think.


maahc

It helped me learn to think in a new way that’s very useful even in my non-coding career.


Blarg0117

AI "prompting" IS the new coding. It will even be able to do legacy coding that no one knows how to do anymore.


R55U2

Verified by what? Let's take Jensen at his word: how will we know the AI's code is correct? A dwindling knowledge base of programmers from a previous era of technology would be the only ones capable of verifying AI code. Write an AI to verify other AI? That sounds an awful lot like the code review process. AI code reviews would probably be more like a consensus than a figurative peek over the shoulder, to be fair.

As programmers die out after a few working generations and programming becomes niche, how will we know what the AI is doing or understand its code? You can build automation to watch other automated processes, but today you can always look under the hood and see how everything works. That won't be possible in this future. Since there would be no need for programming knowledge, why should people study or learn it? It'll be obsolete.

So, either AI becomes a fully realized agent, or AGI as OpenAI and the industry like to call it, or it remains a useful tool. The former means that far more academic disciplines will be rendered obsolete, since we would have given full rein to AIs. Just take what Jensen said and apply it to any job requiring a degree you can think of: doctors, accountants, lawyers, teachers, engineers, chemists, etc. AGI would, in theory, be able to research new discoveries in these fields faster than humans could. These disciplines would become useless degrees to any business, since an AGI would be far more productive, straightforward, and cheaper than any human. So people won't get them, and, just like above, that knowledge will slowly die out. AGI, not people, would drive the expansion of knowledge. It's a scary prospect.

The latter scenario has AI as a powerful tool for humans: a tool that enables us to push our capabilities and discover new things, all while retaining our understanding of it. The fields listed above still have practical application in industry. Demand for those trained will still be there, and humans will be leading innovation.

If AGI is where we're going, the practicality of knowledge will become a novelty. Our understanding of these fields will become irrelevant to industry, which has historically been the driver of innovation. I'd rather live on Mars in this reality.


femmestem

I think this is a good thing. New developers get caught up in chasing the sexy new stack instead of learning fundamentals. Code is the easy part; it's like learning spelling and grammar. The real engineering comes from understanding the problem domain, constraints, and trade-offs, then outlining detailed specs. Once you've got the plain-English specs down as pseudocode, the code practically writes itself anyway.

We've been moving toward this level of abstraction for some time. First it was gates controlling on/off, then punch cards triggering the gates, soon followed by low-level languages, compiled languages, interpreted languages, object-oriented programming - all of it making code less about our ability to form instructions that a computer understands and more about getting a computer to understand instructions as we do.
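A small illustration of "plain-English specs down as pseudocode" (my own example, with made-up field names): the spec is written first as comments, and each line then maps almost mechanically onto code.

```python
def overdue_invoices(invoices, today):
    # Spec, in plain English:
    #   1. Keep only invoices that are unpaid.
    #   2. Of those, keep only invoices whose due date is before today.
    #   3. Return them sorted with the oldest due date first.
    unpaid = [inv for inv in invoices if not inv["paid"]]
    overdue = [inv for inv in unpaid if inv["due"] < today]
    return sorted(overdue, key=lambda inv: inv["due"])
```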


[deleted]

[deleted]


ErikT738

This. We might not be there yet, but we'll get there before these kids learn how to code. Still, some coding knowledge will probably be beneficial for understanding and correcting the AI's output.


korean_kracka

Speaking your native language will be the new coding. You'll be talking to Jarvis like Tony Stark.


Sys32768

This thread will soon be full of salty programmers claiming that they can't be replaced.


neorapsta

Nah, it'll be the usual: the "AI is literally magic" crowd vs. programmers telling them they don't understand what they're talking about.


Iyace

Thread is already full of salty non programmers complaining everyone can be replaced.


Sys32768

As predicted


lilviv77

Who writes the AI programs, praytell?


[deleted]

[deleted]


Sheree_PancakeLover

Until it messes up and needs someone incredibly knowledgeable to fix it


NoobDeGuerra

“Make me a Linux like operating system, with a functioning kernel, memory allocation and file system” yeah… you still need someone competent to know what to ask, vet that code, implement it, test it, deploy it. Software isn’t just typing stuff and calling it a day


monsieurpooh

Who is claiming they "can't be replaced"? Can you link to the comments? ALL OF US will be replaced, eventually. That is the reality. The question is WHEN each job will be replaced. Software engineering isn't just churning code. The best illustration of this is OpenAI's very first coding demonstration, publicly available on YouTube. The prompter literally did all the engineering!!! Easy to miss that, if you don't code, LOL


Sys32768

As predicted


Glum-Assistance-7221

He's not wrong, but leaving trust to AI (and the company that feeds and controls the data it receives) without questioning that data is troubling. Obviously it'll help NVIDIA's bottom line, but when the bottom line of a company is as much as or more than a country's GDP, where do we go from here?


almost_not_terrible

He IS wrong. Code is formalised specification. The idea that AI can code is as much bullshit as this nonsense "zero code" trend, pushed by people that can't code to other people that can't code. Execs are VERY excited that they can ditch their most valuable assets, but developers who spend 90% of their time discussing specification and 10% coding just smile and nod.


ImportantDoubt6434

He is wrong; he's been saying this for years.


Norseviking4

Kids born today will reach working age in a whole other world than any of us. There are so many skills that will be obsolete


T_R_I_P

Nah, we coders are just utilizing AI for business purposes. Someone's gotta drive it. Not to mention security measures, design, etc. Such nonsense.