Ask HN: What trick of the trade took you too long to learn?
32 unsupp0rted 37 8/4/2025, 5:39:59 PM
Every week for the last 3 months I’ve learned a new trick when it comes to getting whatever LLM I’m using at the time to produce better output. That’s my trade, but lots of HNers have more interesting trades than that.
In my case, only recently I learned the value of getting an LLM to write and refine a plan.md architecture doc first, and for it to break that doc down into testable phases, and then to implement phase by phase.
Seems obvious in hindsight. But it took me too long to learn that this should be my approach. I had been going phase by phase myself, with no overarching plan.md for the LLM.
What Trick of the Trade took you too long to learn?
An investing prof at Chicago puts this on the whiteboard at the start of the semester, saying it's really all most people need to know, and that they're unlikely to learn anything in his class, or any other, that will let them, personally, do better.
I was lucky: my physics department administrator told me the same thing when I was graduating.
The second-best piece of advice is to roll over your 401k when you move to a new company -> not doing this cost me at least $500k, because old accounts effectively stagnate when your company isn't paying the maintenance cost (AIUI).
Mostly on the tax side. Some specific examples:
- After maxing out your 401k, what should you do next? IRA? Mega backdoor Roth? Something else?
- If you have kids, how best to save for future education expenses? Hint: consider a 529 plan.
- The HSA is technically the best tax-advantaged account, and most high earners don't realize it and "waste" HSA funds reimbursing routine medical bills instead of letting them grow. The HSA has triple tax benefits: contributions are tax-free, growth is tax-free, and withdrawals for qualified medical expenses are tax-free at any age. After 65 you can also withdraw for any other reason without penalty (those withdrawals are just taxed as ordinary income, like a traditional 401k). So it's basically investing with little to no tax obligation.
I could go on… Investing is great, but reducing your tax obligation is an even more powerful technique if you want to grow your net worth.
I wouldn't consider those options to require much motivation or research. The key with all of them is investing early and leaving it alone.
I do agree people should call Vanguard. But just blindly following the steps they give you is unlikely to be productive if you don't understand why you're doing those steps. Furthermore, people who don't understand _why_ will freak out every time there's a huge market correction. They get scared because they don't understand any of it.
I'm also curious: do they offer financial advice for accounts outside Vanguard? Genuinely asking, since I'm unsure.
529 plans can get a bit more complicated, because you'll generally want one from your own state (if your state has an income tax, since the contribution deduction usually applies only to the in-state plan) and the state may offer several. But then it's less about knowing tax-code specifics than about what the differences are between their offerings.
It's really not that hard, and I don't understand why more people aren't interested. Let's reframe for a minute… if I said a high earner could retire a year earlier, or maybe even a few years earlier, just by learning some semi-advanced tax strategies, should they do so? Yeah. They'd be crazy not to, lol.
"Everything worth doing is worth doing badly"
And as a corollary, every complex system that works came from a simple system that works.
I learned this in programming, but now I apply it to everything from motorcycle maintenance and home appliance repair to parenting.
--
Often the easier way to fix a complex system is to pretend that it could be simpler and then reintroduce the complexity-inducing requirements.
I had a professor who taught debugging as a whole separate skill from programming, and who used to say, "Most of programming is starting from an empty editor and debugging until your code works."
The debugging "lab" in a Java course (in the year 2000) was one of my transformational after-school classes: I was given a printed Java program that fit within 2-3 pages and contained a bug, told to find it on paper for ~20 minutes, and then given 40 minutes with a debugger instead.
Using timing coincidences in particle physics experiments is incredibly powerful. If multiple products from the same reaction can be measured at once, it's usually worth looking into.
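For anyone curious what that looks like offline, here's a minimal sketch of coincidence filtering between two detectors; the timestamps and the 10 ns window are invented, and real acquisition pipelines are far more involved:

    #include <stdio.h>

    /* Walk two time-sorted hit lists and report pairs that fall within
       the coincidence window (a simple two-pointer scan). */
    static void find_coincidences(const double *a, size_t na,
                                  const double *b, size_t nb,
                                  double window_ns)
    {
        size_t i = 0, j = 0;
        while (i < na && j < nb) {
            double dt = a[i] - b[j];
            if (dt > window_ns)       j++;    /* B hit is too early */
            else if (dt < -window_ns) i++;    /* A hit is too early */
            else {                            /* within the window  */
                printf("coincidence: A=%.1f ns, B=%.1f ns, dt=%.1f ns\n",
                       a[i], b[j], dt);
                i++; j++;
            }
        }
    }

    int main(void)
    {
        /* Invented timestamps (ns) from two detectors. */
        double det_a[] = { 10.0, 55.2, 120.4, 300.1 };
        double det_b[] = { 12.5, 80.0, 121.0 };
        find_coincidences(det_a, 4, det_b, 3, 10.0);   /* 10 ns window */
        return 0;
    }

Everything outside the window is treated as uncorrelated background, which is where the power of the technique comes from.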
Circular saws using wood cutting blades with carbide teeth can cut aluminum plates.
You can handle and attach atomically thin metal foils to things by floating them on water.
Use library search tools and academic databases. They are entirely superior to web search and AI.
The book's main example: you're considering leasing new equipment that might save you money. What's the risk that it will actually cost more, given ranges (and distributions) for the various uncertain numbers?
I think it's harder to apply to software, since there are more unknowns (or the unknowns are fatter-tailed), but I still liked the book just for the philosophical framing at the beginning: you want to measure things because measurements help you make decisions, and you don't need perfect measurements, since reducing the range of uncertainty is often enough to make the decision.
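A toy version of that lease question, not taken from the book; all the ranges and dollar figures below are invented, and a real model would use better-informed distributions than uniform ones:

    #include <stdio.h>
    #include <stdlib.h>

    /* Uniform draw in [lo, hi]. */
    static double uniform(double lo, double hi)
    {
        return lo + (hi - lo) * ((double)rand() / RAND_MAX);
    }

    int main(void)
    {
        const int trials = 100000;
        int costs_more = 0;

        srand(42);
        for (int i = 0; i < trials; i++) {
            double annual_lease   = uniform(9000.0, 11000.0);  /* lease cost  */
            double hours_saved    = uniform(100.0, 600.0);     /* labor saved */
            double value_per_hour = uniform(25.0, 60.0);       /* $ per hour  */
            if (hours_saved * value_per_hour < annual_lease)
                costs_more++;
        }
        printf("chance the lease costs more than it saves: ~%.1f%%\n",
               100.0 * costs_more / trials);
        return 0;
    }

The exact number doesn't matter much; the point, per the book's framing, is that even crude ranges give you a defensible probability to weigh against the decision.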
There may exist an analytical solution for this, but I wouldn't trust myself to derive it correctly. It would certainly be a huge mess.
If we add that the source is a right cylinder instead of a point source, and we want first-order attenuation of the emitted gammas by the source itself, the spreadsheet becomes only a bit more complex, but there will be no pen-and-paper equation solution.
In this example every row of the spreadsheet would represent a hypothetical ray. One could randomly choose a location in the source, a random trajectory, and check if the photon intersects the detector. An alternative approach would be randomly choosing points in both target and detector, then doing additional math.
The results are recovered by making histograms and computing stats over the outputs of all the rows. You probably need at least a few thousand rows for most things; remember that, roughly speaking, 10k hits gets you ~1% statistics (counting error goes as 1/sqrt(N)).
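As a concrete sketch of the one-ray-per-row idea, here's the simplest case (isotropic point source, circular disk detector on axis) written as a loop instead of a spreadsheet; the radius, distance, and ray count are made up, and the analytic line is only there as a cross-check for this special case:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    static double uniform01(void) { return (double)rand() / RAND_MAX; }

    int main(void)
    {
        const double PI = acos(-1.0);
        const double R  = 2.5;       /* detector radius (cm), assumed    */
        const double D  = 10.0;      /* source-to-detector distance (cm) */
        const long   n  = 1000000;   /* rays; ~10k hits gives ~1% stats  */
        long hits = 0;

        srand(42);
        for (long i = 0; i < n; i++) {
            /* Isotropic direction: cos(theta) uniform in [-1,1], phi uniform. */
            double cos_t = 2.0 * uniform01() - 1.0;
            double phi   = 2.0 * PI * uniform01();
            if (cos_t <= 0.0) continue;        /* heading away from the disk */

            /* Where the ray crosses the detector plane z = D. */
            double sin_t = sqrt(1.0 - cos_t * cos_t);
            double x = (D / cos_t) * sin_t * cos(phi);
            double y = (D / cos_t) * sin_t * sin(phi);
            if (x * x + y * y <= R * R) hits++;
        }

        printf("geometric efficiency ~ %.4f (%ld hits)\n", (double)hits / n, hits);
        printf("analytic (on-axis point source, disk): %.4f\n",
               0.5 * (1.0 - D / sqrt(D * D + R * R)));
        return 0;
    }

The cylindrical-source and self-attenuation versions are the same loop with a few more sampled quantities per ray, which is exactly why they stop having a closed-form answer while the simulation barely gets harder.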
Also, in general, Bayesian statistics.
Random Medium article: https://medium.com/pythoneers/monte-carlo-simulation-ideas-a...
What is likely to happen if I do (or don't do) this thing one thousand days (or times) in a row?
Examples:
- exercising 2h per day and eating right --> I'm going to look and feel great and my health will be far better than that of my peers
- Should I buy these cookies along with the rest of my groceries? If I do that 1,000 grocery trips in a row …
- spending 30+ minutes per day reading the highest quality material I can find; taking notes; and figuring out ways to implement the knowledge and ideas I gain --> …
I think you'd like Atomic Habits [1] if you haven't read it already.
[1] https://www.amazon.com/Atomic-Habits-Proven-Build-Break/dp/0...
1. Make PRs small, but not as small as you can possibly make them.
2. Intend to write at least one nice-ish test suite that covers maybe 50-80% of the LOC in the PR. Don't try to unit test the entire thing; that's not a unit test. And if something is intrinsically hard to test (requires extensive mocking, etc.), let it go instead of writing a really brittle test.
3. Tests only do two things: help you make the code correct right now, or help the company keep the code right long term. If your test doesn't do either, don't write it.
4. Ok - now code. And just keep in mind you're the poor sod who's gonna have to test it, so try to factor most of the interesting stuff to not require extensive mocking or shit tons of state.
I've found this workflow works almost anywhere and on almost any project or code. It doesn't require any dogmatic beliefs about PR sizes or test coverage, and it helps prevent the evils that dogmatic beliefs often lead you into. It just asks you to keep your eyes open and not paint yourself into a corner, short term or long term.
Simple example: Can you get more done working 12 hours a day than 8? Sure, for the first day. Second day maybe. But after weeks, you're worse off in one way or another.
It's easy to chase imaginary gains, like automating repetitive tasks, where the payoff never actually materializes, but basics like sleep, nutrition, and happiness are 100% going to affect you going forward.
* I actually hate that word, and prefer saying "effectiveness". Productivity implies the only objective is more, more, more, endlessly. Effectiveness opens up the possibility that you achieve better results with less.
1. Discarding the bullshit. A consistent practice of weighing assumptions and conclusions against evidence and numbers helps identify other people's biases and motives.
2. Measures let you identify value and performance. Most people just guess at this; guessing is wrong more than 80% of the time and often wrong by multiple orders of magnitude.
Most people don't think like this and find this line of thinking completely foreign, so I often just keep my conclusions to myself. To see a movie about this, watch Moneyball.
And then iterating on it over time; that's how I find what's valuable and what's not.
There are still a few special cases where macros are useful, such as the multiple-#include trick, where a macro #defined before the #include determines what the macro invocations in the included file do -- really helpful for building certain kinds of tables.
The #include trick is called the "X Macro". I used it extensively, and eventually just removed it.
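For readers who haven't seen it, here's a minimal sketch of the trick; the file name and the color table are invented. The list lives in a guard-less .def file, and each #include expands it with a different definition of X:

    /* colors.def -- the shared table; deliberately no include guard. */
    X(RED,   0xFF0000)
    X(GREEN, 0x00FF00)
    X(BLUE,  0x0000FF)

    /* main.c -- expands colors.def twice with different meanings of X. */
    #include <stdio.h>

    #define X(name, rgb) COLOR_##name,
    enum color {
    #include "colors.def"
        COLOR_COUNT
    };
    #undef X

    #define X(name, rgb) { #name, rgb },
    static const struct { const char *name; unsigned rgb; } color_table[] = {
    #include "colors.def"
    };
    #undef X

    int main(void)
    {
        for (int i = 0; i < COLOR_COUNT; i++)
            printf("%-5s = #%06X\n", color_table[i].name, color_table[i].rgb);
        return 0;
    }

The appeal is that the enum and the name/value table can never drift out of sync; the cost is that the expansion is invisible to most tooling and unfamiliar to many readers, which may be part of why it eventually gets ripped out.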
(I have never played with it -- I saw the ads in Byte but I never met anybody who had tried it. It seemed so ridiculously cheap that I felt it had to be a scam ;) )