by steve_adams_86 on 10/6/2024, 6:27:39 PM
I’ve come to the same conclusion about my own learning, even after 15 years of doing this.
When I want a quick hint for something I understand the gist of, but don’t know the specifics, I really like AI. It shortens the trip to google, more or less.
When I want a cursory explanation of some low level concept I want to understand better, I find it helpful to get pushed in various directions by the AI. Again, this is mostly replacing google, though it’s slightly better.
AI is a great rubber duck at times too. I like being able to bounce ideas around and see code samples in a sort of evolving discussion. Yet AI starts to show its weaknesses here, even as context windows and model quality have evidently ballooned. This is where real value would exist for me, but progress seems slowest.
When I get an AI to straight up generate code for me I can’t help but be afraid of it. If I knew less I think I’d mostly be excited that working code is materializing out of the ether, but my experience so far has been that this code is not what it appears to be.
The author’s description of ‘dissonant’ code is very apt. This code never quite fits its purpose or context. It’s always slightly off the mark. Some of it is totally wrong or comes with crazy bugs, missed edge cases, etc.
Sure, you can fix this, but it feels a bit too much like using the wrong tool for the job and then correcting it after the fact. Worse still, in the context of learning, you’re getting all kinds of false positive signals that X or Y works (the code ran!!), when in reality it’s terrible practice, or not actually working for the right reasons, or not doing what you think it does.
The silver lining of LLMs and education (for me) is that they demonstrated something to me about how I learn and what I need to do to learn better. Ironically, this does not rely on LLMs at all, but almost the opposite.
by boredemployee on 10/6/2024, 6:28:22 PM
Well, I must admit, LLMs made me lose the joy of learning programming and made me realize I like to solve problems.
There was a time I really liked to go through books and documentation, learn and run the code, etc., but those days are gone for me. I prefer to enjoy my free time and go to the gym now.
by tekchip on 10/6/2024, 7:27:31 PM
While I don't disagree, and I understand the author's concern, the bottom line is that the author, and others of the same mind, will have to face facts. LLMs are a genie that isn't going back in the bottle. Humans have LLMs and will use them. The teaching angle needs to change to acknowledge this. "You need to learn long-hand math because you won't always have a calculator in your pocket." Whoopsie! Everyone has a smartphone. Now I'm going back to school for my degree and classes are taught expecting calculators, even encouraging the use of various math and graphing websites.
By all means urge folks to learn the traditional, arguably better, way, but also teach them to use the available tools well and safely. The tools aren't going away and the tools will continue to improve. Endeavour to produce coders who use the tools well and can turn out valuable, well-written code at 2x, 5x, 8x, 20x the rate of today's coders.
by yumraj on 10/6/2024, 7:07:28 PM
I’ve been thinking about this, since LLMs helped me get something done quickly in languages/frameworks that I had no prior experience in.
But I realized a few things. While they are phenomenally great when starting new projects and on small code bases:
1) one needs to know programming/soft engineering in order to use these well. Else, blind copying will hurt and you won’t know what’s happening when code doesn’t work
2) production code is a whole different problem that one will need to solve. Copy pasters will not know what they don’t know and need to know in order to have production quality code
3) Maintenance of code, adding features, etc. is going to become n times harder the more of the code is LLM-generated. Even large context windows will start failing, and hallucinations may screw things up without one even realizing
4) debugging and bug fixing, related to maintenance above, is going to get harder.
These problems may get solved, but till then:
1) we’ll start seeing a lot more shitty code
2) the gap between great engineers and everyone else will become wider
by lofaszvanitt on 10/6/2024, 6:30:12 PM
When a person is using LLMs for work and the result is abysmal, that person must go. So easy. LLMs will make people dumber in the long term, because the machine thinks instead of them and they will readily accept the result it gives if it works. This will have horrifying results in 1-2 generations. Just like social media killed people's attention spans.
But of course we don't need to regulate this space. Just let it go, all in wild west baby.
by Rocka24 on 10/6/2024, 6:45:56 PM
I strongly disagree. I was able to learn so much about web development by using AI; it streamlines the entire knowledge-gathering and dissemination process. By asking for general overviews and then poking into the specifics of why things work the way they do, it's possible to get extremely functional and practical knowledge of almost any application of programming. For the driven and ambitious hacker, LLMs are practically invaluable when it comes to self-learning. I think you have a case where you're simply dealing with the classic self-inflicted malady of laziness.
by wkirby on 10/6/2024, 7:11:09 PM
The reason I am a software engineer — why it keeps me coming back every week — is the satisfying click when something I didn't understand becomes obvious. I've talked to a lot of engineers over the last 15 years of doing this, and most of them possess some version of the same compulsion. What makes good engineers tick is, imo, a tenacity and knack for solving puzzles. LLMs are useful when they let you get to the meat of the problem faster, but as the article says, they're a hindrance when they are relied on to solve the problem. Knowing the difference is hard; a heuristic I work on with my team is "use an LLM if you already know the code you want to write." If you don't already know the right answer, you won't know if the LLM is giving you garbage.
by xyst on 10/6/2024, 7:07:31 PM
Anybody remember the days of Macromedia? I think it was Dreamweaver that spat out WYSIWYG trash for people who didn't know better.
For a period of time there was a segment of development cleaning up this slop or just redoing it entirely.
The AI-generated slop reminds me of that era.
by xnx on 10/6/2024, 6:21:49 PM
"Modern" web development is so convoluted I'm happy to have a tool to help me sort through the BS and make something useful. In the near future (once the thrash of fad frameworks and almost-databases has passed) there may be a sane tech stack worth knowing.
by btbuildem on 10/6/2024, 7:14:50 PM
I disagree with the premise of the article -- for several reasons. You could argue that an LLM-based assistant is just a bigger footgun, sure. Nothing will replace a teacher who explains the big picture and the context. Nothing will replace learning how to manage, handle and solve problems. But having a tireless, nimble assistant can be a valuable learning tool.
Web development is full of arbitrary, frustrating nonsense, layered on and on by an endless parade of contributors who insist on reinventing the wheel while making it anything but round. Working with a substantial web codebase can often feel like wading through a utility tunnel flooded with sewage. LLMs are actually a fantastic hot blade that cuts through most of the self-inflicted complexities. Don't learn webpack, why would you waste time on that. Grunt, gulp, burp? Who cares, it's just another in a long line of abominations careening towards a smouldering trash heap. It's not important to learn how most of that stuff works. Let the AI bot churn through that nonsense.
If you don't have a grasp on the basics, using an LLM as your primary coding tool will quickly leave you with a tangle of incomprehensible, incoherent code. Even with solid foundations and experience, it's very easy to go just a little too far into the generative fairytale.
But writing code is just a small part of software development. While reading code doesn't seem to get talked about as much, it's the bread and butter of any non-solo project. It's also a very good way to learn -- look at how others have solved a problem. Chances that you're the first person trying to do X are infinitesimally small, especially as a beginner. Here, LLMs can be quite valuable to a beginner. Having a tool that can explain what a piece of terse code does, or why things are a certain way -- I would've loved to have that when I was learning the trade.
by dennisy on 10/6/2024, 6:25:54 PM
I feel this idea extends past just learning, I worry using LLMs to write code is making us all lazy and unfocused thinkers.
I personally have banned myself from using any in-editor assistance where you just copy the code directly over. I do still use ChatGPT, but without copy-pasting any code, more along the lines of how I would use search.
by orwin on 10/6/2024, 6:39:17 PM
For people who, like me, mostly do backend/network/system development and who disagree about how helpful LLMs are (basically a waste of time if you're using them for anything other than rubber ducking, writing test cases, or autocomplete): LLMs can write a working front-end page/component in 10 seconds. Not an especially well-designed one, but "good enough". I find they especially shine at writing the HTML/CSS parts. They cannot write an FSM on their own, so when I write a page I still write the states, the actions, and the reducer, but then I can generate the rest, and it's really good.
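Roughly, the hand-written part of that split might look like this (a minimal sketch with made-up state and action names, not any particular page):

```typescript
// Hand-written: the states, the actions, and the reducer.
// The surrounding markup/CSS is the part left for the LLM to generate.
type State =
  | { kind: "idle" }
  | { kind: "loading" }
  | { kind: "loaded"; items: string[] }
  | { kind: "error"; message: string };

type Action =
  | { type: "FETCH" }
  | { type: "FETCH_SUCCESS"; items: string[] }
  | { type: "FETCH_FAILURE"; message: string };

function reducer(state: State, action: Action): State {
  switch (action.type) {
    case "FETCH":
      return { kind: "loading" };
    case "FETCH_SUCCESS":
      return { kind: "loaded", items: action.items };
    case "FETCH_FAILURE":
      return { kind: "error", message: action.message };
    default:
      return state;
  }
}
```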
by gwbas1c on 10/6/2024, 7:07:48 PM
All the mistakes Ben describes smell like typical noob / incompetent programmer mistakes.
All the LLM is doing is helping people make the same mistakes... faster.
I really doubt that the LLM is the root cause of the mistake, because (pre LLM) I've come across a lot of similar mistakes. The LLM doesn't magically understand the problem; instead a noob / incompetent programmer misapplies the wrong solution.
by aatarax on 10/6/2024, 8:53:28 PM
This section sums it up, and I agree with the author here:
> LLMs are useful if you already have a good mental model and understanding of a subject. However, I believe that they are destructive when learning something from 0 to 1.
Super useful if you have code in mind and you can get an LLM to generate that code (eg, turning a 10 minute task into a 1 minute task).
Somewhat useful if you have a rough idea in mind, but need help with certain syntax and/or APIs (eg, you are an experienced python dev but are writing some ruby code).
Useful for researching a topic.
Useless for generating code where you have no idea if the generated code is good or correct.
by weitendorf on 10/6/2024, 8:52:57 PM
I can’t help but think part of the problem is that web development is also an impediment to learning web development.
IME there is a lot more arcana and trivia necessary to write frontend/web applications than most other kinds of software, mostly because it's both regular programming and HTML/CSS/browser APIs. While you can build a generalized intuition for programming, the only way to master the rest of the content is through sheer exposure - mostly through tons of googling, reading SO, web documentation, and trial and error getting it to do the thing. If you're lucky you might have a more experienced mentor to help you. And yes, there are trivia and arcana needed to be productive in any programming domain, but you can drop a freshly minted CS undergrad into a backend engineering role and expect them to be productive much faster than with frontend (perhaps partly why frontend tends to have a higher proportion of developers with non-CS backgrounds).
It doesn’t help that JavaScript and browsers are typically “fail and continue”, nor that there may be several HTML/CSS/browser features all capable of implementing the same behavior but with caveats and differences that are difficult to unearth even from reading the documentation, such as varying support across browsers or bad interactions with other behavior.
LLMs are super helpful for dealing with the arcana. I've recently been writing a decent amount of frontend and UI code after spending several years doing backend/systems/infra, and I am so much more productive with LLMs than without, especially when it comes to HTML and CSS. I kind of don't care that I'll never know the theory behind "the right way to center a div": as long as the LLM is good enough at doing it for me, why does it matter? And if it isn't, I'll begrudgingly go learn it. It's like caring that people don't know the trick to check "is a number even" in assembly.
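(For the curious, one common answer the models reach for is a flex container; a throwaway sketch with a hypothetical component name:)

```tsx
// One common way to center a child both horizontally and vertically: flexbox.
// Hypothetical component; inline styles keep the example self-contained.
import * as React from "react";

export function Centered({ children }: { children: React.ReactNode }) {
  return (
    <div
      style={{
        display: "flex",
        justifyContent: "center", // horizontal centering
        alignItems: "center",     // vertical centering
        minHeight: "100vh",
      }}
    >
      {children}
    </div>
  );
}
```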
by cladopa on 10/6/2024, 7:58:21 PM
I disagree. I am a programmer and entrepreneur myself, with an engineering education. I know lots of languages very well (C, C++, Scheme, Python) and founded my own tech company, so managing it takes up a big amount of my time.
I always wanted to program (and understand deeply) the web and could not. I bought books and videos, I went to courses with real people, but I could not progress. I had limited time, and there were so many different things, like CSS and JS and HTML and infinite frameworks, that you had to learn all at once.
Thanks to ChatGPT and Claude I have understood web development, deeply. You can ask both general and deep questions and it helps you like no teacher could (the teachers I had access to).
Something I have done is create my own servers to understand what happens under the hood. No jQuery teacher could help with that. But ChatGPT could.
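(As an illustration of that kind of under-the-hood exercise, a bare-bones server with Node's built-in http module might look like this; it's a sketch, not the commenter's actual code:)

```typescript
// A bare-bones server with no framework, to see the raw request/response cycle.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  // The raw method and URL that frameworks usually hide behind routing.
  console.log(`${req.method} ${req.url}`);

  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end("<h1>Hello from a hand-rolled server</h1>");
});

server.listen(8080, () => {
  console.log("Listening on http://localhost:8080");
});
```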
AI is a great tool if you know how to use it.
by userbinator on 10/6/2024, 7:43:28 PM
AI is an impediment to learning.
Also ask yourself this the next time you feel compelled to just take an AI solution: what value are you providing, if anyone can simply ask the AI for the solution? The less your skills matter, the more easily you'll be replaced.
by csallen on 10/6/2024, 6:59:01 PM
AI is an impediment to learning high-level programming languages. High-level programming languages are an impediment to learning assembly. Assembly is an impediment to learning machine code. Machine code is an impediment to learning binary.
by BinaryMachine on 10/6/2024, 6:22:35 PM
Thank you for this post.
I use LLMs sometimes to understand a step-by-step mathematical process (this can be hard to search for on Google). I believe getting a broad idea by asking someone is the quickest way to understand any sort of business logic related to the project.
I enjoyed your examples, and maybe there should be a dedicated site just for examples of web-related code where an LLM generated the logic. The web changes constantly, and I wonder how these LLMs will keep up with the specs, specific browsers, frameworks, etc.
by jMyles on 10/6/2024, 7:35:38 PM
I've been engineering (mostly backend but lots of full stack too) web technologies for almost two decades. Not the world's greatest sage maybe, but I have some solid contributions to open source web frameworks, have worked on projects of all different scales from startups to enterprise media outfits, etc.
And I have this to say: any impediment to learning web development is probably a good thing insofar as the most difficult stumbling block isn't the learning at all, but the unlearning. The web (and its tangential technologies) are not only ever-changing, but ever-accelerating in their rate of change. Anything that helps us rely less on what we've learned in the past, and more on what we learn right in the moment of implementation, is a boon to great engineering.
Every one of the greatest engineers I've worked with doesn't actually know how to do anything until they're about to do it, and they have the fitness to forget what they've learned immediately so that they have to look at the docs again next time.
LLMs are lubricating that process, and it's wonderful.
by camillomiller on 10/6/2024, 6:39:09 PM
> For context, almost all of our developers are learning web development (TypeScript, React, etc) from scratch, and have little prior experience with programming.
To be fair, having non programmers learn web development like that is even more problematic than using LLMs. What about teaching actual web development like HTML + CSS + JS, in order to have the fundamentals to control LLMs in the future?
by Buttons840 on 10/6/2024, 7:45:55 PM
Does anyone else feel that web technologies are the least worthy of mastery?
I mean, a lot of effort has gone into making poorly formed HTML work, and JavaScript has some really odd quirks that will never be fixed because of backwards compatibility, and every computer runs a slightly different browser. True mastery of such a system sounds like a nightmare. Truly understanding different browsers, and CSS, and raw DOM APIs, none of this feels worth my time. I've learned Haskell even though I'll never use it because there are useful universal ideas in Haskell I can use elsewhere. The web stack is a pile of confusion; there's no great insight that follows from learning how JavaScript's if-statements work, just more confusion.
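(A sampling of the sort of quirks being alluded to; the behavior is the same in every mainstream engine:)

```typescript
// A few long-standing oddities, none of which will ever be "fixed".
console.log(typeof null);               // "object"  (a 1995 bug kept for compatibility)
console.log([10, 9, 1].sort());         // [1, 10, 9] (default sort compares as strings)
console.log(0.1 + 0.2 === 0.3);         // false      (IEEE 754 floating point)
console.log(Number.NaN === Number.NaN); // false      (NaN never equals itself)
```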
If there was ever a place where I would just blindly use whatever slop a LLM produces, it would be the web.
by Krei-se on 10/6/2024, 6:28:09 PM
I like AI helping me fix bugs and look up errors, but I usually architect everything on my own, and I'm glad I can use it for everything I would've put off to some coworker who can do the lookups and work on a view or something that has no connection back to the base system architecture.
So he's not wrong, you still have to ask the right questions, but with later models that think about what they do, this could become a non-issue sooner than some of those breathing a sigh of relief expect.
We are bound to a maximum of around 8 working units in our brain; a machine is not. Once AI builds a structure graph like Wikidata next to the attention vectors, we are so done!
by infinite-hugs on 10/6/2024, 7:19:41 PM
Certainly agree that copy-pasting isn't a replacement for teaching, but I can say I've had success learning coding basics just by asking Claude or GPT to explain the code output line by line.
by raxxorraxor on 10/7/2024, 8:15:06 AM
I don't believe it is more of an impediment than looking things up on Stack Overflow, to be honest.
I do write my code myself, but LLMs offer help. Less with suggesting code and more with comments, improvements, or common errors. They aren't entirely reliable, so you need to know what you are doing anyway.
That said, there is little investment in new developers, and some rely on AI too much. Companies need to invest more in training across all engineering domains. This is where the value of companies resides, for the most part.
by jt2190 on 10/6/2024, 7:58:12 PM
> Use of LLMs hinders learning of web development.
I’m sure this is true today, but over time I think this will become less true.
Additionally, LLMs will significantly reduce the need for individual humans to use a web browser to view advertisement-infested web pages or bespoke web apps that are difficult to learn and use. I expect the commercial demand for web devs is going to slowly decline for these college-aged learners as the internet transitions, so maybe it’s ok if they don’t become experts in web development.
by synack on 10/6/2024, 6:35:19 PM
I learned web development in the early '00s with the Webmonkey tutorials, which were easy, accessible, and fun. I don't know what the modern equivalent is, but I wish more people could have that experience.
https://web.archive.org/web/20080405212335/http://webmonkey....
by xp84 on 10/6/2024, 9:28:38 PM
> API calls from one server-side API endpoint to another public API endpoint on localhost:3000 (instead of just importing a function and calling it directly).
> These don’t seem to me like classic beginner mistakes — these are fundamental misunderstandings of the tools
This all sounds on par with the junior-level developers I feel AI has pretty quickly replaced.
I still feel sad though :( how are people meant to get good experience in this field now?
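(Concretely, the quoted mistake is roughly the difference between these two handlers; the module and function names are hypothetical:)

```typescript
// Hypothetical module with the shared logic, available in the same process.
import { getUser } from "./users";

// The misunderstanding: a server-side handler calling its own public endpoint
// over HTTP on localhost, paying for serialization and a network round trip.
export async function getUserRoundabout(id: string) {
  const res = await fetch(`http://localhost:3000/api/users/${id}`);
  return res.json();
}

// What understanding the tool looks like: import the function and call it directly.
export async function getUserDirect(id: string) {
  return getUser(id);
}
```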
by kazinator on 10/7/2024, 7:17:35 AM
I would say that the people who are learning from AI are precisely the ones who are training for jobs that will never exist for them due to AI.
by heisenbit on 10/6/2024, 8:46:13 PM
It's not just learning web development that is affected. Students are handing in AI-generated homework in all kinds of courses. The problem, of course, is that part of learning depends on spaced repetition (ask any AI how it learned ;-) ), so skipping that part, all across the board, is already having an impact.
by elicksaur on 10/6/2024, 6:28:06 PM
If it’s true for the beginner level, then it’s true for every level, since we’re always learning something.
by fimdomeio on 10/6/2024, 8:05:48 PM
I'm the person who was copy-pasting schoolwork in the '90s for things I wasn't interested in. I'm also the person who spent years learning things I was passionate about without an end goal in mind. The issue here is not AI; it's motivation.
by trzy on 10/6/2024, 9:20:16 PM
That’s ok. Having to learn web development is an impediment to the flourishing of the human spirit.
by yhoots on 10/7/2024, 2:27:19 AM
Sounds more like this happened because the instructors failed to tell the students not to just ChatGPT all the answers, OR the students didn't listen.
This is somewhat analogous to learning arithmetic but just using a calculator to get the answer. But coding is more complicated, at least in terms of the sheer number of concepts.
Yet we don't ban calculators from the classroom. We just tell students to use them mindfully. The same should apply to LLMs.
I wish I had had LLMs when I was learning coding. But you do need to have the desire to understand how something really works. And also, the pain of debugging something trivial for hours does help retention :).
by calibas on 10/6/2024, 8:57:06 PM
Having someone else do something for you is an impediment to learning anything.
by manx on 10/6/2024, 7:12:03 PM
Humanity was only able to produce one generation who knows how computers work.
by FpUser on 10/6/2024, 8:50:21 PM
Don't we already have enough self-certified prophets telling everyone how to do things "properly"? Nobody pushes you to use LLMs. As for us, we'll figure out what works to our benefit.
by tetha on 10/6/2024, 6:58:45 PM
I very much agree with this.
If I have a structured code base, and I understand the patterns and the errors to look out for, something like Copilot is useful for banging out code faster. Maybe the frameworks suck, or the language could be better and require less code, but eh. A million dollars would be nice to have too.
But I do notice that colleagues use it to get some stuff done without understanding the concepts. Or in my own projects where I'm trying to learn things, Copilot just generates code all over the place I don't understand. And that's limiting my ability to actually work with that engine or code base. Yes, struggling through it takes longer, but ends up with a deeper understanding.
In such situations, I turn off the code generator and at most, use the LLM as a rubber duck. For example, I'm looking at different ways to implement something in a framework and like A, B and C seem reasonable. Maybe B looks like a deadend, C seems overkill. This is where an LLM can offer decent additional inputs, on top of asking knowledgeable people in that field, or other good devs.
by travisgriggs on 10/6/2024, 9:23:07 PM
You had me at "AI is an impediment to learning..."
I use GPT all the time. But I do very little learning from it. GPT is like having an autistic 4-year-old with a vast memory as your sidekick. It can be like having a superpower when asked the right questions. But it lacks experience. What GPT does is allow you to get from some point As to other point Bs faster. I work in quite a few languages. I like that when I haven't done Python for a few days, I can ask "what is the idiomatic way to flatten nested collections in Python again?". I like that I can use it to help me prototype a quick idea. But I never really trust it. And I don't ship that code till I've learned enough to vouch for it myself or can ask a real expert about what I've done.
But for young programmers, who feel the pressure to produce faster than they can otherwise, GPT is a drug. It optimizes getting results fast. And since there is very little accountability in software development, who cares? It's a short term gain of productivity over a long term gain of learning.
I view the rise of GPT as an indictment of how shitty the web has become, how sad the state of documentation is, and what a massive sprint of layering crappy complicated software on top of crappy complicated software has wrought. Old timers mutter "it was not always so." Software efforts used to have trained technical writers to write documentation. "Less is more" used to be a goal of good engineering. AI tools will not close the gap in having well-written, concise documentation. They will not simplify software so that the mental model needed to understand it is more approachable. But they do give us a hazy approximation of what the current internet content has to offer.
(I mean no offense to those who have autism in my comment above, I have a nephew with severe autism, I love him dearly, but we do adjust how we interact with him)
by ellyagg on 10/6/2024, 6:53:25 PM
Or is learning web development an impediment to learning AI?
by monacobolid on 10/6/2024, 7:02:59 PM
Web development is an impediment to learning web development.
by menzoic on 10/6/2024, 6:36:12 PM
Learning how to use ̶C̶a̶l̶c̶u̶l̶a̶t̶o̶r̶s̶ LLMs is probably the skill we should be focusing on.
by j45 on 10/6/2024, 7:19:31 PM
This article feels out to lunch.
If you use AI to teach you HTML and programming concepts first, and then to support you as you use them, that is learning.
Having AI generate an answer and then not have it satisfy you usually means the prompt could use improvement. In that case, the prompter (and perhaps the author) may not know the subject well enough.
by blackeyeblitzar on 10/6/2024, 6:22:19 PM
Almost every student I know now cheats on assignments using ChatGPT. It’s sad.
by meiraleal on 10/6/2024, 6:44:25 PM
Code School employee says: AI is an impediment to learning web development
by seydor on 10/6/2024, 6:36:27 PM
I don't think the thing called 'modern web development' is defensible anyway
by wslh on 10/6/2024, 6:33:56 PM
As Python is an impediment to learning assembler?
by cush on 10/6/2024, 6:25:23 PM
I find it particularly ironic when someone who goes to a top university with $70k/yr tuition attempts to gatekeep how learning should be. LLMs are just another tool to use. They're ubiquitously accessible to everyone and are an absolute game-changer for learning.
Folks in an academic setting particularly will sneer at those who don't build everything from first principles. Go back 20 years, and the same article would read "IDEs are an impediment to learning web development"
by kgeist on 10/6/2024, 8:00:26 PM
>API calls from one server-side API endpoint to another public API endpoint on localhost:3000 (instead of just importing a function and calling it directly).
>LLMs will obediently provide the solutions you ask for. If you’re missing fundamental understanding, you won’t be able to spot when your questions have gone off the rails.
This made me think: most of the time, when we write code, we have no idea (and don't really care) what kind of assembly the compiler will generate. If a compiler expert looked at the generated assembly, they’d probably say something similar: "They have no idea what they’re doing. The generated assembly shows signs of a fundamental misunderstanding of the underlying hardware," etc. I'm sure most compiled code could be restructured or optimized in a much better, more "professional" way and looks like a total mess to an assembly expert—but no one has really cared for at least two decades now.
At the end of the day, as long as your code does what you intend and performs well, does it really matter what it compiles to under the hood?
Maybe this is just another paradigm shift (forgive me for using that word) where we start seeing high-level languages as just another compiler backend—except this time, the LLM is the compiler, and natural human language is the programming language.
by MicolashKyoka on 10/6/2024, 8:10:19 PM
Sure, let's hear what the "head of engineering" of an academic club with "9-12" intern-level devs, who has barely 2 years of experience as a dev himself, thinks about the industry. I mean, it's fine to have an opinion, and I'm not particularly hating on the guy, but why is it given any credence and making the front page? Are people this afraid?
LLMs are a tool; if you can't make them work for you or learn from using them, sorry, but it's just a skill/motivation issue. If the interns are making dumb mistakes, then you need to guide them better, chop up the task into smaller segments, and contextualize it for them as needed.
by faizshah on 10/6/2024, 6:39:51 PM
The copy-paste programmer will always be worse than the programmer who builds a mental model of the system.
LLMs are just a faster and more wrong version of the copy-paste Stack Overflow workflow; now you don't even need to ask the right question to find the answer.
You have to teach students and new engineers never to commit a piece of code they don't understand. If you stop at "I don't know why this works", then you will never be able to get out of the famous multi-hour debug loop that you get into with LLMs, or, similarly, the multi-day build-debugging loop that everyone has been through.
The real thing LLMs do that is bad for learning is that you don't need to ask them the right question to find your answer. This is good if you already know the subject, but if you don't, you're not getting that reinforcement in your short-term memory, and you will find that things you learned through LLMs are not retained as long as if you had done more of the work yourself.