• by frankmcsherry on 7/23/2018, 2:57:21 PM

    https://en.wikipedia.org/wiki/Iterative_deepening_depth-firs...

    Edit: for context, this wasn't meant as "zomg how you so dumb" so much as "everyone should also read, because it's relevant".
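
    Not from the linked page or the article, just a minimal sketch of the idea, assuming an adjacency-list graph (a dict of node -> list of neighbors): run a depth-limited DFS with an increasing depth cap, which gives you the shallowest-first behavior of BFS while only keeping the current path in memory.

        def depth_limited_dfs(graph, node, goal, limit):
            # Depth-limited DFS: is `goal` reachable from `node`
            # within `limit` edges?
            if node == goal:
                return True
            if limit == 0:
                return False
            for neighbor in graph.get(node, []):
                if depth_limited_dfs(graph, neighbor, goal, limit - 1):
                    return True
            return False

        def iddfs(graph, start, goal, max_depth):
            # Iterative deepening: re-run the depth-limited search with
            # a growing cap, so the shallowest hit is found first while
            # memory stays proportional to the current depth.
            for limit in range(max_depth + 1):
                if depth_limited_dfs(graph, start, goal, limit):
                    return limit  # depth at which `goal` was first found
            return None

    (This is the tree-search form; on graphs with lots of cycles you would also want to avoid revisiting nodes on the current path.)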

  • by stcredzero on 7/23/2018, 7:10:57 PM

    Something has happened to Comp Sci programs over the past 3 decades. Based on what is admittedly too small a sample size (the graduates I've been interviewing in SF), it seems like a very large number of graduates from CS programs with 3.75 GPAs or above can't do much more than glue together libraries, can't practically design a system on their own, and, if ever confronted with a graph theory problem, can't do much more than name-drop algorithms, falling far short of being able to implement them.

    There are literally problems that 1) were once covered in freshman year, 2) could once be recognized and solved by CS grads in seconds, 3) stump recent CS graduates, 4) prompt HN commenters to say they could solve them if given a few days, and 5) come up in conversation if you go to meetups and talk to people doing actual work.

    How does this relate to BFS? It used to be that someone trained as a computer scientist would look at a data structure or a graph and start running some quick gedankenexperiments: What would happen if I tried to find that with DFS? What would happen if I tried to find that with BFS? Those aren't going to be suitable solutions for all problems, but it's a good place to start thinking. There seem to be a large number of recent grads who can't even get that far.
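
    For instance, a plain BFS over an adjacency list is about ten lines, and it's the kind of thing you'd expect someone to be able to sketch on the spot. The `graph` dict-of-neighbor-lists representation below is just an assumption for illustration, not anything from the article:

        from collections import deque

        def bfs_order(graph, start):
            # Visit everything reachable from `start` in breadth-first
            # order; `graph` maps each node to a list of its neighbors,
            # e.g. {0: [1, 2], 1: [3], 2: [], 3: []}.
            seen = {start}
            queue = deque([start])
            order = []
            while queue:
                node = queue.popleft()
                order.append(node)
                for neighbor in graph.get(node, []):
                    if neighbor not in seen:
                        seen.add(neighbor)
                        queue.append(neighbor)
            return order

    Swap the deque for a stack (or recursion) and you have DFS; which one is appropriate depends on whether you care about shortest paths or memory, which is exactly the kind of trade-off the thought experiment is supposed to surface.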

  • by megaman22 on 7/23/2018, 2:55:59 PM

    > 100GB of memory would be trivial at work, but this was my home machine with 16GB of RAM. And since Chrome needs 12GB of that, my actual memory budget was more like 4GB. Anything in excess of that would have to go to disk (the spinning rust kind).

    I hope this is a joke, although it's a little scary totalling up how much memory Chrome is chewing through right now, with just one window and seven tabs open...