Show HN: Every problem and solution in Beyond Cracking the Coding Interview
We just compiled every problem (and solution) in the book and made them available for free. There are ~230 problems in total. Some of them are classics like n-queens, but almost all are new and not found in the original CTCI.
You can read through the problems and solutions, or you can work them with our AI Interviewer, which is also free. I'd recommend doing the AI Interviewer before you read the solutions, but you can do it in whichever order you like. (When you first get into the AI Interviewer, you can configure which topics you want problems on and at what difficulty level, and you can add topics and change difficulty levels as you go.)
Here's the link: https://start.interviewing.io/beyond-ctci/all-problems/techn... (You'll have to create an account if you don't already have one, but there's nothing else you need to do to access all the things.)
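For anyone who hasn't seen n-queens before, the usual approach is backtracking. A minimal sketch of the standard idea (not the book's solution, just an illustration):

    def solve_n_queens(n):
        """Return one placement of n queens as a list of column indices, one per row."""
        cols, diag1, diag2 = set(), set(), set()
        placement = []

        def place(row):
            if row == n:
                return True
            for col in range(n):
                # A queen at (row, col) is safe if no earlier queen shares its
                # column or either diagonal.
                if col in cols or (row - col) in diag1 or (row + col) in diag2:
                    continue
                cols.add(col); diag1.add(row - col); diag2.add(row + col)
                placement.append(col)
                if place(row + 1):
                    return True
                # Undo the choice and keep searching (the "backtracking" part).
                cols.remove(col); diag1.remove(row - col); diag2.remove(row + col)
                placement.pop()
            return False

        return placement if place(0) else None

    print(solve_n_queens(8))  # [0, 4, 7, 5, 2, 6, 1, 3] with this column ordering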
Actors for sure, with auditions, and maybe chefs and male pornstars.
Not that I agree with the absurd interview process in software development, but developers often see themselves as more akin to attorneys than tradesmen. The difference being that attorneys have to pass a bar exam, and even trades have journeyman cards to provide credibility.
Software development has none of that. Real engineering has PE licenses, but how do you achieve that in a field as broadly scoped as software development?
We either play the interview game or find a way around it.
Maybe this varies from country to country. I have seen this in India and the UK at least.
In the UK at least, there are trades like roofing, landscaping, exterior building cleaning, masonry, tree surgery, etc., which from what I have seen are mostly family-based.
staging [1] is very much a thing for kitchen staff
[1] https://en.wikipedia.org/wiki/Staging_(cooking)
I’m currently preparing for interviews myself, so having access to high-quality, free resources like this is incredibly helpful. The AI interviewer feature, in particular, looks like it will be very useful for me. Thanks again to the author for making these resources available!
But I did have one job working on an AI graph compiler that used fancy algorithms all over the place. In practice, though, I found that the space between "use the standard library" and "it's NP-complete; use heuristics", where the answer is "you can use this neat dynamic programming trick", is basically nonexistent.
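(For concreteness, by "neat dynamic programming trick" I mean something like classic coin change; a throwaway sketch, not anything from that compiler:)

    def min_coins(coins, amount):
        """Classic DP: fewest coins summing to `amount`, or -1 if impossible."""
        INF = float("inf")
        best = [0] + [INF] * amount          # best[a] = fewest coins for amount a
        for a in range(1, amount + 1):
            for c in coins:
                if c <= a and best[a - c] + 1 < best[a]:
                    best[a] = best[a - c] + 1
        return best[amount] if best[amount] != INF else -1

    print(min_coins([1, 5, 10, 25], 63))  # 6  (25 + 25 + 10 + 1 + 1 + 1)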
The whole thing is broken.
If you've ever hired a plumber or electrician, you might have gotten a crew of younger apprentices, maybe a journeyman, and an older "master" plumber or electrician. Most of the master's time is spent on the critical mechanical tasks, solving problems that come up, and directing the other tradesmen. The principal is the one person who can do _any_ of the others' jobs if they are unavailable. They are also the one person (and ideally the only person) who makes "the plan" for how the work will proceed, and who decides when a project is complete.
The crucial difference (well, one of many?) between a principal-level engineer and any type of management...is that the principal-level engineer should be able to do every junior engineer's job in a pinch - expertly, and with confidence and adaptability to problems.
I've had some interviews (not at the principal level) where we had a couple of candidates who were very good during the informal interviews and could hold a conversation about technology, but they couldn't code the simplest of problems. I know folks don't like it, but in my humble opinion this could happen at all levels.
What is key is letting the candidate decide the format they're best at.
Leetcode's signal is pretty bad compared to pair coding/PR reviews IMHO. And if the job genuinely involves writing algorithms, you can put algorithms in the code and have them go over that.
A take-home is probably the most vulnerable to cheating, but if you have them code review it afterwards, cheating is fairly easy to detect.
But I've never seen anyone fail upwards as far as a Principal/Staff Engineer level. Last time I interviewed at that position, no one even asked anything about code. They were more concerned about my position on architectural choices, pros and cons of various approaches, knowledge of applicable standards and regulations (I'm in the medical device field), mentoring and team leadership issues and how to resolve them, etc.
I'm sure the manager was great, but we've all heard of some less-desirable aspects of working at Amazon, and I wouldn't want to go there without a sign that I'd be shielded a bit.
So, I've made the "corporate drone coding screen", and Leetcode interviews in general, my own metric. If a company does it, they fail the interview.
And if I'm having a moment of weakness, and considering submitting to some techbro frat hazing, I remind myself that, if I was willing to do that, I would've gone to Google already, which usually would've been preferable to whatever opportunity this other company is dangling.
They could cheat on the take-home but it isn't meant to be difficult and you hopefully figured out at the in-person that they're someone who wouldn't need to bother cheating.
I can better understand the emphasis on leetcode problems for juniors, but a timed session without an observer (perhaps with browser tracking) makes a lot more sense than adding the anxiety of being watched, as you've noted. It sucks having to spend mental energy wondering how your problem-solving looks to whoever's watching, and it seems actively detrimental to assessing talent for an IC role.
My name is Aline, and I'm the founder of interviewing.io. Thanks so much for your interest. It looks like you’re not in a country where we’re open for business yet, so we can’t create your account, but we’ll add you to our waitlist.
1) We want people to read the book (To wit, we've also made 9 chapters of the book free: http://bctci.co/free-chapters)
2) We want people to use interviewing.io
In my career, I've written a lot of stuff about hiring, and I've shared a lot of interview-related materials (e.g., full length interview replays). I hate paywalls for content, and you probably do too... and I have never regretted making it free. In my experience, putting good stuff out there is the best way to market to an eng audience.
Usually low-level engineers don't have a say in what projects they get to work on. They show impact by completing whatever projects were handed to them and hope/pray those projects take off and become visible to upper management and/or tied directly to revenue.
The grind aspect is real though. 99% of FAANG engineers aren't building the next google maps or LLM, they are doing Enterprise CRUD + ELT + jira tickets. Companies like Meta and Amazon have enormous workloads and thus a grinder is preferable.
After a few years of experience, CRUD and ETL can be done while sleepwalking, so the only missing component is someone willing to grind, e.g. someone who will spend 100 consecutive days doing leetcode.
Today, the CS student's idea of an industry interview has turned into an extortion-racket cottage industry, with people not only selling ritual prep books, but now also selling mock-interview rituals with techbros who got into the best-paying companies.
Youse has a lovely career potential; it'd be a shame if somethin' was ta happen to your job interview.
What more does it take to realize this is very time-consuming and expensive theatre, and a terrible metric for hiring good software engineers?
And if you're an employer who doesn't care that students spend many hundreds of hours rehearsing for the interview theatre, to the exclusion of getting more experience building things, and that your interviews aren't actually selecting for software engineering aptitude, what happens when the hire takes that same misaligned hoops-jumping mindset to their work?
We built it this way on purpose, though... the intent is to mimic an interview and gently force you to talk through your thought process, not to have yet another LeetCode clone.
The writing is on the wall for Leetcode-style interviewing. The signal-to-noise ratio is diminishing in the age of AI (cheating). These sorts of puzzle challenges might no longer play a meaningful role going forward.
What is hopefully dying is companies asking verbatim LeetCode questions and candidates having to memorize a bunch of questions. We wrote this book largely because we wanted to teach people how to think. And knowing how to think is only going to get more valuable.
It’s FAR easier for companies to stick with the interview process they’ve used for decades—just mandate in-person interviews again—than to reinvent the wheel with some new, unproven format. Sure, there’s a growing need to assess more than just DS&A in initial screenings, but let’s be honest: those interviews aren’t going anywhere.
The REAL reason to make these resources free? Because it’s not a competitive advantage to offer problems to practice. There are already tons of free problems online. The real value isn’t in giving people a place to do problems—that already exists. The value is in the book. If you already know enough to do well on problems without the book, then you shouldn't have to pay to practice it.
I bet the genuine answer to your question is that she knows it's a resource that could help tons of people (at a time when tons of people need that help) and paywalling it means that it won't serve that same purpose.
Am I the only one who has interviewed people with lengthy resumes full of programming experience who, when asked to do a simple programming exercise, fell flat on their face? I've seen a resume claiming C experience, given the candidate a two-hour take-home exam, and he couldn't even get anything to compile. What he meant was that he took a class a few years back.
You see it in other domains too: a resume with extensive Excel experience, the guy gets hired, and he's never heard of a VLOOKUP.
I think some of the stuff is overkill, but you need to select for people who know how to program.
I for one am glad they exist because I don't have a CS degree but learned on my own. I lucked into this profession through an online leetcode-style screener, and your book helped me immensely, so thank you.
You don't need Leetcode style tests to weed those out. Much simpler problems will do it.
My experience is the opposite - developers eager to squeeze every last drop, using some exotic data structure [*] or sorting algorithm which takes way too long to implement and makes code review a nightmare. For a feature that doesn't need it.
[*] not saying that linked list is exotic, it is just rarely needed in $DAYJOB in my experience.
(Maybe I am just bitter because I have more than once bombed a leet-code interview myself)
I interview a lot of people and my go-to coding question is actually a pretty simple question that might be found in a 2-year coding course. What I am looking for is production ready code, good error handling, tidy design, and understandable code. All things that leet-coding specifically discourages.
Maybe we have different things in mind when we say "leet-code questions".
I don't know why leet-code style interviews would discourage the things you mentioned.
1. Raw mental horsepower
2. The ability to just repeatedly do focused learning, aka just grinding
And sure, it probably does favor #2 these days - but that is a critically important skill. You can trade one for the other, but everybody has some amount of both, and these questions figure out, roughly, your aggregate score across the two.
They have a very high false negative rate, but an exceptionally low false positive rate for a 60 minute interview, so it works very well in companies with large interview candidate pipelines.
Can you think of anything we do for, say, the first two decades of our life, that could send this signal?
The furthest I've ever seen it go in practice: binary search, BFS/DFS, hash tables. I've never seen any more obscure algorithmic trick than standard uses of these algorithms and data structures.
I'm not saying leetcode doesn't have more insane questions, but interviews tend to be straightforward.
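To be concrete about "standard uses": I mean something on the order of a plain BFS for shortest path in an unweighted graph, nothing cleverer. A rough, made-up sketch:

    from collections import deque

    def shortest_path_len(graph, start, goal):
        """Standard BFS over an adjacency-list graph; returns edge count or -1."""
        seen = {start}
        queue = deque([(start, 0)])
        while queue:
            node, dist = queue.popleft()
            if node == goal:
                return dist
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, dist + 1))
        return -1

    g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
    print(shortest_path_len(g, "a", "d"))  # 2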
Sure, some interviews are pretty hard and some algorithms/data structures are not as common on the job. But given a complex enough system, you'll run into lots of situations where having this foundation will pay off. I mean, it's just computer science.
That's the thing about software engineering. You can get a lot done without knowing the foundational stuff. But then you're just a blunt instrument. To a hammer, everything looks like a nail.
And if you have a specific industry need to invert a binary tree or fill 8 containers with 32 differently sized boxes or whatever then go nuts. But I have found working through a 30 minute exercise with the candidate, asking them to explain their thought processes, and listening to what questions they ask is much better than just giving them 30 minutes to bash their head against a problem.
It is more effort for the interviewer though because it cannot be automated. But it does allow you to scale the interview by asking things like "What would it take to make your code thread safe? Imagine 4 threads. Now imagine 1000, what would you change?", etc.
Honestly, 30-45 minutes in which you can establish whether the person can code and whether they have the basic foundational knowledge to crack efficiency problems is pretty hard to beat.
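To give a concrete flavor of that thread-safety follow-up: at 4 threads, the answer I'm usually fishing for is simply "put a lock around the shared state." A bare-bones sketch (names made up):

    import threading

    class Counter:
        """Shared counter; the lock is what makes increments safe across threads."""
        def __init__(self):
            self._value = 0
            self._lock = threading.Lock()

        def increment(self):
            with self._lock:  # without this, += on shared state can race
                self._value += 1

        @property
        def value(self):
            return self._value

    counter = Counter()
    threads = [threading.Thread(target=lambda: [counter.increment() for _ in range(10_000)])
               for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter.value)  # 40000

At 1000 threads, the interesting part of the discussion usually shifts from locking to batching or sharding the work, which is exactly the kind of trade-off conversation you want.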
Whilst there are probably diminishing returns on making the actual challenge more and more difficult, the general concept is a lot fairer than the majority of other interview types I've had thrown at me (usually something they've solved internally, where they expect you to regurgitate the same answers without the same context).
This is in part due to job titles being meaningless. Senior Software Engineer has a very large dynamic range of technical ability needed when looking industry-wide. For example, if "Developer" was the non-leetcode title, and "Engineer" was the leetcode title (with harder interviews and higher pay), it'd make things a lot more understandable for everyone.
How we got to this point: as average candidates train more on interview coding, interviewers pick harder and harder questions. It's gotten to the point where the only way to reliably pass is to have pre-canned, memorized solutions to hundreds of existing questions. It's an arms race divorced from the reality of the job, which is done with real-world tasks, privacy, little time pressure, and access to reference materials.