My Most Embarrassing Mistakes as a Programmer (so far)
I agree with the saying: “If you’re not embarrassed by your old code then you aren’t progressing as a programmer.” I began programming recreationally more than 40 years ago and professionally more than 30 years ago, so I have made a lot of mistakes. As a computer science professor, I encourage students to learn from mistakes, whether their own, mine, or famous examples. I feel it’s time to shine a light on my own mistakes to keep myself humble and in the hope that someone can learn from them.
Third Place: Microsoft C compiler
I had a high school English teacher who argued that Romeo and Juliet was not a tragedy, since the protagonists did not have tragic flaws: the reason they acted foolishly was that they were teenagers. I didn’t like the argument then but see the truth in it now, especially as it pertains to programming.
After completing my sophomore year at MIT, I was both literally a teenager and a programming adolescent when I began a summer internship at Microsoft on the C compiler team. After doing some grunt work, such as adding support for profiling, I got to work on what I consider the most fun part of a compiler: back-end optimizations. Specifically, I got to improve the x86 code generated for switch statements.
I went hog wild, determined to generate the optimal machine code in every case I could think of. If the case values were dense, I used them as indices into a jump table. If they had a common divisor, I would use that to make the table denser (but only if the division could be done by bit shifting). I had another optimization when all the values were powers of two.
If the full set of values didn’t satisfy one of my conditions, I divided the cases up and called my code recursively.
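To give a flavor of the special-casing, here is a minimal sketch of the kind of strategy selection involved. This is my reconstruction in modern C++, not the 1988 compiler code; the strategy names and the density threshold are invented:

#include <numeric>  // std::gcd
#include <vector>

// Hypothetical lowering strategies; the names are mine, not the compiler's.
enum class Strategy { JumpTable, ScaledJumpTable, BitTest, CascadedCompares };

static bool isPowerOfTwo(unsigned v) { return v != 0 && (v & (v - 1)) == 0; }

// Choose how to lower a switch, given its sorted, distinct, non-empty case values.
Strategy chooseStrategy(const std::vector<unsigned>& values) {
    unsigned span = values.back() - values.front() + 1;
    if (span <= 3 * values.size())        // dense: index straight into a jump table
        return Strategy::JumpTable;

    // Common divisor: if it's a power of two, dividing is just a shift,
    // so the table can be made denser by that factor.
    unsigned divisor = 0;
    for (unsigned v : values) divisor = std::gcd(divisor, v);
    if (divisor > 1 && isPowerOfTwo(divisor) && span / divisor <= 3 * values.size())
        return Strategy::ScaledJumpTable;

    // Another special case: all the values are powers of two.
    bool allPowersOfTwo = true;
    for (unsigned v : values) allPowersOfTwo = allPowersOfTwo && isPowerOfTwo(v);
    if (allPowersOfTwo)
        return Strategy::BitTest;

    // Otherwise, split the cases and recurse -- which is where the mess began.
    return Strategy::CascadedCompares;
}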
It was a mess.
I heard years later that the person who inherited my code hated me.
Lessons
As David Patterson and John Hennessy write in Computer Organization and Design, one of the great principles of computer architecture (also software engineering) is “make the common case fast”:
Making the common case fast will tend to enhance performance better than optimizing the rare case. Ironically, the common case is often simpler than the rare case. This common sense advice implies that you know what the common case is, which is only possible with careful experimentation and measurement.
In my defense, I did try to find out what switch statements looked like in practice (i.e., how many cases there were and how spread out the constants were), but the data just wasn’t available back in 1988. That did not, however, give me license to keep adding special cases whenever I could come up with a contrived example for which the existing compiler didn’t generate optimal code.
I should have sat down with an experienced compiler writer or developer to come up with our best guesses of what the common cases were and then cleanly handled only those. I would have written fewer lines of code, but that’s a good thing. As Stack Overflow co-founder Jeff Atwood has written, software developers are their own worst enemies:
I know you have the best of intentions. We all do. We’re software developers; we love writing code. It’s what we do. We never met a problem we couldn’t solve with some duct tape, a jury-rigged coat hanger, and a pinch of code….
It’s painful for most software developers to acknowledge this, because they love code so much, but the best code is no code at all. Every new line of code you willingly bring into the world is code that has to be debugged, code that has to be read and understood, code that has to be supported. Every time you write new code, you should do so reluctantly, under duress, because you completely exhausted all your other options. Code is only our enemy because there are so many of us programmers writing so damn much of it.
If I had written simple code that handled common cases, it could have been easily modified if the need arose, rather than leaving a mess that nobody wanted (or dared) to touch.
Source: https://twitter.com/ThePracticalDev/status/710156980535558144
Second Place: Social Network Ads
When working on social network ads at Google (remember Myspace?), I wrote some C++ code that looked something like this:
for (int i = 0; i < user->interests->length(); i++) {
  for (int j = 0; j < user->interests(i)->keywords->length(); j++) {
    keywords->add(user->interests(i)->keywords(i));
  }
}
Readers who are programmers probably see the mistake: the last argument should be j, not i. My unit tests didn’t catch the mistake, nor did my reviewer.
After going through the launch process, my code was pushed late one night — and promptly crashed all the computers in a data center.
It wasn’t a big deal, however. There were no outages, since code is tested in a single data center before being pushed globally. It just meant that the SREs had to stop playing pool and roll back some code. The next morning, I got an email telling me what had happened, along with a stack dump of the crash. I fixed the code and added unit tests that would have caught the error. Since I followed proper procedure — and there’s no way my code would have gone live if I hadn’t — that was that.
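For the curious, here is a sketch of the kind of test that catches this class of bug: give each interest a different list of keywords, so that reusing i where j belongs produces visibly wrong output. The types and functions are hypothetical stand-ins (in GoogleTest syntax), not the actual Google code:

#include <gtest/gtest.h>
#include <string>
#include <vector>

// Hypothetical stand-in for the real data model: each interest is a list of keywords.
struct User {
    std::vector<std::vector<std::string>> interests;
};

// Function under test: collect every keyword of every interest.
std::vector<std::string> collectKeywords(const User& user) {
    std::vector<std::string> keywords;
    for (size_t i = 0; i < user.interests.size(); i++) {
        for (size_t j = 0; j < user.interests[i].size(); j++) {
            keywords.push_back(user.interests[i][j]);  // with the bug, [j] was [i]
        }
    }
    return keywords;
}

// Give the interests *different* keyword lists and counts: with the [i][i] bug,
// the second interest would contribute "soccer" three times instead of its
// three distinct keywords, so the assertion fails.
TEST(CollectKeywords, InterestsWithUnequalKeywordCounts) {
    User user{{{"jazz"}, {"tennis", "soccer", "golf"}}};
    std::vector<std::string> expected = {"jazz", "tennis", "soccer", "golf"};
    EXPECT_EQ(expected, collectKeywords(user));
}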
Lessons
Some people think that making a mistake that big could cause someone to lose their job, but (a) programmers make mistakes and (b) the programmer is unlikely to make that mistake again.
Actually, I do know a programmer who was fired for a single honest mistake, despite being an excellent engineer. He was then hired (and later promoted) by Google, which didn’t care about the mistake, which my friend openly admitted during the interview process.
There’s a story about Thomas Watson, the legendary Chairman and CEO of IBM:
“A very large government bid, approaching a million dollars, was on the table. The IBM Corporation — no, Thomas J. Watson Sr. — needed every deal. Unfortunately, the salesman failed. IBM lost the bid. That day, the sales rep showed up at Mr. Watson’s office. He sat down and rested an envelope with his resignation on the CEO’s desk. Without looking, Mr. Watson knew what it was. He was expecting it.
He asked, “What happened?”
The sales rep outlined every step of the deal. He highlighted where mistakes had been made and what he could have done differently. Finally he said, “Thank you, Mr. Watson, for giving me a chance to explain. I know we needed this deal. I know what it meant to us.” He rose to leave.
Tom Watson met him at the door, looked him in the eye and handed the envelope back to him, saying, “Why would I accept this when I have just invested one million dollars in your education?”
I have a t-shirt that says “If people learn from their mistakes, I must have a Master’s degree by now.” I have a PhD.

First Place: App Inventor API
To be really mortifying, a mistake should affect a large number of users, be public, exist for a long period of time, and come from someone who should have known better. My biggest mistake qualifies on all counts.
Worse is Better
I read The Rise of Worse is Better by Richard Gabriel when I was a grad student in the nineties, and I liked it so much that I now assign it to my students. If you haven’t read it recently, do so now. It’s short.
The essay contrasts doing “the right thing” with “the worse-is-better philosophy” along a number of dimensions, including simplicity:
The Right Thing: The design must be simple, both in implementation and interface. It is more important for the interface to be simple than the implementation.
Worse is Better: The design must be simple, both in implementation and interface. It is more important for the implementation to be simple than the interface.
Set that aside for a moment. I set it aside for years, unfortunately.
App Inventor
I was part of the team at Google that created App Inventor, an online drag-and-drop programming environment that enables beginners to create Android apps.
Back in 2009, we were rushing to release an alpha version in time for teacher workshops in the summer and classroom use in the fall. I volunteered to implement sprites, fondly remembering writing games with them on the TI-99/4 in my youth. For those not familiar with the term, a sprite is an object with a 2D representation and the ability to move and interact with other program elements. Some examples of sprites are spaceships, asteroids, balls, and paddles.
We implemented App Inventor, which is itself object-oriented, in Java, so it’s objects all the way down. Since balls and image sprites are very similar in behavior, I created an abstract Sprite class with properties (fields) such as X, Y, Speed, and Heading, along with common methods for collision detection, bouncing off the edge of the screen, etc.
The main difference between a ball and an image sprite is what is drawn: a filled-in circle or a bitmap. Since I implemented image sprites first, it was natural to make the x- and y-coordinates specify the upper-left corner of where the image was placed on the enclosing canvas.
This is a reasonable design decision.
Image credit: Yun Miao
Once I got sprites working, I realized that it would be simple to implement a ball object with very little code. The problem was that I did so in the simplest way (from the point of view of the implementer): having the x- and y-coordinates specify the upper-left corner of the bounding box containing the ball.
This is a terrible design decision.
What I should have done is have the x- and y-coordinates specify the center of the circle, as is done in every single math book and everywhere else circles are specified.
This is the right design decision.
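To see the three decisions side by side, here is a sketch of the drawing layer. App Inventor is written in Java; this is illustrative C++ with a made-up Canvas and Bitmap, not the actual source:

// Hypothetical stand-ins for the real drawing layer.
struct Bitmap { /* pixels */ };
struct Canvas {
    void drawBitmap(int x, int y, const Bitmap& image) { /* ... */ }
    void fillCircle(int centerX, int centerY, int radius) { /* ... */ }
};

// Image sprite: (x, y) as the upper-left corner is natural for a bitmap.
void drawImageSprite(Canvas& canvas, int x, int y, const Bitmap& image) {
    canvas.drawBitmap(x, y, image);
}

// What I shipped: (x, y) is the upper-left of the ball's bounding box,
// so the center has to be computed.
void drawBallAsShipped(Canvas& canvas, int x, int y, int radius) {
    canvas.fillCircle(x + radius, y + radius, radius);
}

// What I should have shipped: (x, y) is the center, as in every math book.
void drawBallRight(Canvas& canvas, int x, int y, int radius) {
    canvas.fillCircle(x, y, radius);
}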
Unlike my other mistakes, which primarily affected my colleagues, this affected the millions of App Inventor users, many of them children or otherwise new to programming. They had to do extra work in every app they created that used the ball component. While I can laugh off my other mistakes, I am truly mortified by this one.
I finally patched the mistake just recently, ten years later. I say “patched” and not “fixed” because, as the great Joshua Bloch says, “APIs are forever”. We couldn’t make any change that would affect existing programs, so we added a property OriginAtCenter, which defaults to false in old programs and to true going forward. Users will be right to wonder why on earth the origin would ever be anywhere but the center. Answer: Ten years ago, one programmer was lazy and didn’t create the obvious API.
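Roughly, the shape of the patch (again an illustrative C++ sketch; only the OriginAtCenter name comes from the real system, whose sources are in Java):

// Sketch of the compatibility shim. Old projects load with the default,
// false, and keep their old behavior; new projects get true.
class Ball {
public:
    void moveTo(int newX, int newY) { x = newX; y = newY; }
    void setOriginAtCenter(bool value) { originAtCenter = value; }
    int getRadius() const { return radius; }

    // Upper-left corner that the drawing code actually uses.
    int drawX() const { return originAtCenter ? x - radius : x; }
    int drawY() const { return originAtCenter ? y - radius : y; }

private:
    int x = 0, y = 0;             // the user-visible X and Y properties
    int radius = 5;
    bool originAtCenter = false;  // false = legacy upper-left origin
};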
Lessons
If you ever develop an API (which almost all programmers do), you should follow best practices, which you can learn from Joshua Bloch’s video “How To Design a Good API And Why It Matters” or the bumper sticker summary, which includes:
APIs can be among your greatest assets or liabilities. Good APIs create long-term customers; bad ones create long-term support nightmares.
Public APIs, like diamonds, are forever. You have one chance to get it right so give it your best.
Early drafts of APIs should be short, typically one page with class and method signatures and one-line descriptions. This makes it easy to restructure the API when you don’t get it right the first time.
Code the use-cases against your API before you implement it, even before you specify it properly. This will save you from implementing, or even specifying, a fundamentally broken API.
If I’d written even a one-page proposal with a sample use case, I probably would have realized my design mistake and fixed it. If not, one of my teammates would have. Any decision that people will have to live with for years deserves at least a day’s consideration (whether or not we’re talking about programming).
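For example, even a use case as small as “center the ball on a 300 by 300 canvas”, written against the sketched Ball above, would have made the flaw jump out:

// Hypothetical draft use case: center a ball on a 300 x 300 canvas.
void centerBall(Ball& ball) {
    // Upper-left-origin API: the user must know to subtract the radius.
    ball.moveTo(150 - ball.getRadius(), 150 - ball.getRadius());

    // Center-origin API: the call would say exactly what it means.
    // ball.moveTo(150, 150);
}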
The title of Richard Gabriel’s essay “Worse is Better” refers to the benefit of being first to market, even with a flawed product, rather than taking forever to create something perfect. When I looked back at the sprite code, however, I saw that doing things the right way wouldn’t have even been more code. By all measures, I made a poor decision.
Conclusion
Programmers make mistakes every day, whether it’s writing buggy code or failing to try new things that would increase their skill and productivity. While it is possible to be a programmer without making mistakes as big as mine, it is not possible to be a good programmer without admitting and learning from your mistakes.
As a teacher, I often encounter students who fear they’re not cut out for computer science because they make mistakes, and I know that impostor syndrome is rampant in tech. While I hope readers will remember all of the lessons from this article, the one I most hope you’ll remember is that whether we laugh, cry, blush, or shrug them off, all of us make mistakes. Indeed, I’ll be surprised and disappointed if I can’t write a future sequel to this article.
77 Comments
Excellent! You have more street cred than even the TechLead (only ex-Google and ex-Facebook).
Minor: “leaving a mess than” → “leaving a mess that”
Thanks a lot! I fixed the mistake. If only you’d reviewed my code and not just my post.
😂😂😂
I laughed out loud at this 🙂
Imagine if every widget with top-left-corner positioning originated from the App Inventor philosophy. You have just become a legend, in a way, although not the preferred way… still a legend nonetheless.
Fantastic article!
`rm * ~` instead of `rm *~`, with rm aliased to rm -f. In a directory holding a week’s worth of non-source-controlled code. In front of my boss.
Spent the rest of the day sifting through unused hard drive sectors for stuff that looked like C++.
Hey, I did that! Perl, not C++, though.
On the plus side, I learned a lot about file systems that day.
I physically cringed reading that
This happened to me; luckily it was just 2 simple files in a folder that I rewrote in half an hour. Now I do this instead: `mv *~ /tmp`. Way too scared of making a typo.
Nick Craver can get in on this too: https://meta.stackoverflow.com/questions/345280/what-if-i-see-someone-like-nick-craver-doing-something-bad/
😂
“so it’s(sic) objects” I hope you also teach your students that language (not just computer language) is important. My experience is more projects fail at the specification stage than any other. It doesn’t matter how good the implementation if the objective is not understood.
Why the “(sic)”? “It’s” (the contraction for “it is”) is completely correct here. “It’s [Xyz] all the way down” is, in my estimation, a fairly commonly used phrase (see https://en.m.wikipedia.org/wiki/Turtles_all_the_way_down).
“so it’s(sic) objects”: this is not a mistake. She is saying that App Inventor, “…itself object-oriented,” is therefore made thoroughly of objects. Read carefully, especially before coming down as a militant grammarian.
Inspirational write-up. It requires a lot of courage to admit our mistakes ☺️
“And all I got was this lousy T-shirt” 🙂
Thank you for the essay!
Another minor: A square image contain[ing] a ball (twice).
The “make the common case fast” principle is quite far from being universal – there are a lot of cases (especially in real-time and near-real-time systems) where the principle should instead be “make the worst possible case acceptable”.
For an ever wider class of systems, “make the system’s reaction time predictable” outweighs (to some extent) “make the common case fast” (a good example is tree sort vs. quick sort).
A quick note: it seems most of the image links are broken?
Love the article, but a lot of the images aren’t loading because of our company firewall… Could they be hotlinked, or hosted by SO?
Also I had to hunt down the comic (http://www.threepanelsoul.com/comic/on-perl) that didn’t load…could you add the name into the alt text?
This is today’s mistake that won’t happen again next time 😀
Nice article. Refreshing. Since I switched to another programming job I increasingly suffer from impostor syndrome. I have been put to work on projects that were built years ago, and I feel I don’t know what I’m doing every single day :(. It’s mostly bug fixes in someone else’s code, in systems that might win awards because they are done by the book but are overengineered in my opinion. So you have to go down many classes and subclasses to find out what stuff does.
They must have noticed because we both agreed it would be best if I got a job elsewhere.
So now I’m trying to switch jobs to be a programming teacher. I have had enough stress from the commercial jobs for a while.
But how can I convince myself I am not an impostor when my current boss tells me I take too long to solve bugs?
I’m starting to think my skill has limits, and fixing bugs in systems I didn’t build myself is beyond them.
That’s exactly what happened with me, and I am still suffering from impostor syndrome. I don’t even know if that’s a real syndrome, but that’s what I feel now. Although I resigned from that job a month ago, I have still not regained my old confidence. I am still trying to figure out whether I am really that slow or others are really that competent.
I’m never embarrassed by my old code; if I were, I’d feel I wasn’t progressing. It’s a shame for anyone to be embarrassed by where they came from or how much they’ve learned, because you’re going to learn quickly that life is full of mistakes and you will always learn new ways to do things.
YACodeMistake:
`keywords->add(user->interests(i)->keywords(i)) {`
should be
`keywords->add(user->interests(i)->keywords(i));`
The App Inventor story is… interesting (or not). Shouldn’t testing have screened out this kind of problem in the first place?
To Hopeless Programmer, tell your current boss that when you want to have your confidence destroyed you’ll let him know. Until then he should keep his feedback to himself. We all have to learn and no pain no gain as they say.
This was a very enjoyable read!
From my experience, with programming, it’s very easy to feel like the “dumbest person in the room” – it’s nice to remember that we all make mistakes. We’re all learning and growing….and that’s the point. Not to write perfect code every time, but to learn and grow from our mistakes. Thank you for sharing!
I am curious to know in what rough time frame you worked on the switch optimizer. If it was in the mid 1990s, fun fact, I was on the scripting team at the time and we ended up writing our own macro system to produce an optimized switch because there was a bug in the C compiler’s optimizer that caused a large slowdown in the script engine’s inner loop — which was of course a switch on the script bytecode. It is surprisingly difficult to get a switch optimization right, which is all the more reason to ensure that the optimization code is extremely readable! 🙂
Accepting a programmer’s job in languages and platforms I did not know, with no peers to bounce off. The software outlasted the hardware (14 years; demise due to Microsoft pulling the peer-to-peer protocol and new laws regarding batteries), with three SLA callouts in its lifetime – all to recover data on failed (drowned) hardware. Those were the days… Or writing an inter-systems protection routine in CL on the AS/400, trans-application and trans-program, only for the systems architect to ask “why CL?”. Or, in a fit of pique after being dropped from all ‘projects’, becoming the bug fixer (fixing 6-8 fully ISO5750-documented bugs daily for 8 months), ending up with four full-time validators confirming code fixes, and causing problems for one project because they had not booked out ownership of any programs – 43 fixes in the core program dented that manager’s nose somewhat. (Projects are meant to also own the bug set of their programs and fix them as part of the project changes; the manager decided unilaterally that bugs were beyond the programmers’ remit, as was booking out the programs.) Think I may be dangerous…
Ian, I believe “it’s objects all the way down” is correct. It’s an allusion to “turtles all the way down”.
Thanks for caring about language! I am indeed strict with my students about proper writing.
Jeroen, thanks for the correction! I’m delighted that someone is reading the alt text.
Thanks, Craig. I added the URL to the alt text.
Eric, thanks for the story. I wrote the code in the summer of 1988.
Lovely article! I vividly remember in my first job putting some code into production in a hurry, then looking at it, and thinking ‘there’s a possible divide by zero there – but surely that occurrence must be pretty unlikely’. Of course, I got a phone call from the end user within about 10 minutes.
Speaking of divide by zero…
when I was an undergrad in CS my former econ prof asked me to help debug his friend’s econometric simulation, which was “erroneously crashing”, according to its author. I called said author (a PhD economist) and explained that his program was aborting because it was dividing by zero. “So?”, comes the reply. “Um, well dividing by zero is illegal, because the result is undefined”, I say, trying not to sound snotty. “What do you mean?”, says he. “When you divide by zero you get zero”, he says.
After thinking for a long time, I finally ask him, “well, this kind of computer doesn’t do that. So when the denominator is zero, what do you want the result of the calculation to be?”
“Oh”, he says. “Just make it zero.”
That was 1983. I still smile whenever I hear that “the models” are forecasting this or that amount of inflation, unemployment, economic growth, etc.
Nice story!
“APIs are forever” seems to contradict the worse-is-better principle that “simple implementations are better than simple interfaces”. Non-simple interfaces often lead to incorrect usage patterns, and just like “APIs are forever”, bad usage patterns are forever, especially in the age of GitHub/Stack Overflow where one bad usage pattern is copied forever. I’m still having to correct people on bad examples from 9 years ago that just won’t die.
Maybe it depends on what you think of when hearing “simple interface”. A simple interface could mean a complex implementation. That’s bad. A simple interface could also just mean well thought out.
“Sprites should have origins in center”. Not in my experience. Where you need the origin is game specific and art specific. Pretty much all platform games put the origin between the feet of the character. Usually this is either settable or is done by hierarchy. Parent the sprite to some other object and use that object as the center.
The top example of the compiler optimizations was also confusing. My thoughts as I read it were: (a) code review should have caught this, but it was 1988, so there was probably no code review. Lesson = there should always be code review so someone with more experience can catch your hard-to-understand code. (b) It’s easy to imagine bad code, but the description of the code didn’t sound obviously bad. I can imagine easy-to-understand code in the form of something like
if (indicesAreConsecutive) { emitLookUpTable(spacing = 1); }
else if (indicesAreEvenlySpaced) { emitLookUpTable(spacing); }
else { emitCascadingIfStatements(); }
Or something along those lines.
I’ve certainly written my share of embarrassing code.
Nice article. I don’t understand what distinguishes the bitmap from the circle, though. Why is it not equally logical to refer a bitmap object to the centre rather than the top left?
Thank you for this article. Funnest item I have encountered in the Stack universe during the entire month of October. I thoroughly enjoyed reading the article.
As a former user of the TI 99/4A I would award a 50-point bonus for mentioning the first 16-bit home computer, if only I could.
This article presents a well-selected trio of war stories around software engineering mistakes that is both educational and entertaining. As an older software engineer, I can very much relate.
This is the experience I share with younger colleagues: Test everything. Never assume anything.
Very nice article. Only strong people can speak about their mistakes like this.
I have a question about this.
“If you ever develop an API (which almost all programmers do)”
I develop VBA programs inside Excel for a living. So far I thought I was also a programmer. But I never developed an API. So it seems I was wrong.
Awesome read- I learned more about API creation and how to be easier on myself as a dev! Thanks Ellen
Thank you for writing an inspirational article.
I will point out that some of the cases mentioned here can mislead a programmer. For example, “make the common case fast” is not going to work in most real-life situations. Rather, a programmer must be careful about the worst possible cases. If I am not wrong, a European space mission failed (with billions of dollars lost) just because of a floating-point error, where the programmer assumed that such a case would never happen.
(P.S. sorry for my weak English)
If I had a square meal for every time I’ve screwed up a nested for loop I would need a gastric bypass.
These seem to me like mistakes anyone and everyone would make (maybe not the compiler one actually – that DOES sound like a support horrorshow!)
So if you’re new to software development reading this thread, don’t have nightmares!
Your article reminded me of an allegedly true story of a guy who worked at Elliott Brothers in the 60s on a particular contract and is credited with inventing virtual memory. Since 8k would have been a helluva lot of RAM in those days, he used tape to spool stuff in and out of memory, just as it’s done today with hard drives. Unfortunately it was incredibly slow, and the contract was canceled. He was called to the MD’s office. He thought he would be fired, but instead the MD said to him: you got us into this mess, now get us out of it.
warna, if you write functions that other people (or you at a later time) call, that’s an API.
Thank you for the reply.
Yes, I normally use my own VBA class files to do the coding quickly. So they have lots of functions and subroutines which I use frequently.
I got my degree in a different field (I had a VB6 subject for one semester and C for one semester; that’s all).
I learnt the rest from experience and from free online tutorials. So my knowledge of technical terms is poor.
Killed a herd of cows with an extra 0.
On a DEC VAX, coding from a coding sheet (so hand-written code), keyed in on-site at an animal feed company.
The feed formulation’s active-ingredient potency calculation (drug amounts, basically) had an extra 0 in one of the calculations, causing a 10x increase in the amount of the ingredient.
The software company I worked for was not liable; the feed company were. Due to the nature of the project, they were contracted to do side-by-side testing with existing systems, which would have highlighted the issue, but they cut corners and didn’t.
It highlights the importance of testing.
Programming is about 3 things ..
Refactor, Refactor and Refactor
When I revisit previously written code, I almost feel like …
Re the Microsoft C compiler tale, as Brian Kernighan put it, “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?”
Great read, thank you.
I’m kind of puzzled that you view the circle as a mistake. You don’t really make an argument that it was, other than that some people thought it was.
Math books may place circles from the center, but nothing else in your API is like that. The system you used was both consistent with your other calls and would make it far easier to place the circle under an object.
With the center placement you have to perform a couple of minor calculations to place the circle aligned with or below one of the sprites.
I think understanding that you have to add 1/2 the circle’s width to the X or Y to align with a sprite is actually more daunting for a novice than what you did.
You only have one set of feedback: the people who thought it should be from the center. Maybe you didn’t hear from everyone for whom it worked just fine. I think if you had done it the other way, you would have had the same or a greater “mistake”.
I’m probably the biggest “imposter” here. I didn’t take computer science courses in college except for intro to Pascal and the bonehead intro to MSDOS (5.0), Lotus 1-2-3, WordPerfect, etc. class that everyone had to take. (In high school I learned WordStar–which was better than Word, in my opinion, and it’s a shame it died so quickly.) I graduated as a teacher. But I loved to do programming on the side.
My most horrifying programming moment came in about 1997 when I had managed to download all of my emails to date and wanted a way to keep them private, readable only by me via a password. I programmed a simple encryption algorithm (I no longer recall if I was using Pascal or QuickBasic at the time) that accounted for essentially any ASCII character that might be in the emails, and made it so that running it on a file would simply encrypt or decrypt it, toggle style, when armed with the correct password. I then aimed the thing at my email file…and was shocked to see that it had worked beautifully, but only for the first email of hundreds! I had entirely forgotten that at the end of each email there was an EOF character that the code would see as the end of the file. Because it had overwritten the old file, and because I had foolishly made no backup, I lost years of email in one fell swoop!
I guess it didn’t affect others much; but, boy, was I chagrined.
Your article has it all. It’s fun to read, it’s inspiring in admitting mistakes, and it teaches us to learn from them. Very nice write-up.
Check this out: https://imgur.com/gallery/VHc9g
Code errors 😀
Ellen, I think you’ve just lifted the most crushing, debilitating weight of imposter syndrome off me! Thank you!
I’m a beginner in programming, and up to now I have made a lot of mistakes that keep me going and exploring more and more.
Thanks for sharing your experience, it means a lot to me.
I really enjoyed reading your post this morning.
As a self-taught programmer who started out spending hours after school at the mall (programming on display-model TRS-80’s, Atari 800’s, and the TI-99/4A as well!) I’ve written some doozies over the years.
I’ve been amazed how even the throwaway example code in our documentation and help files is critical. The number of systems that have been compromised by bad SQL Server examples using hard-coded connection strings with ‘SA’ [blank] for the username and PW boggles the mind.
As Bill Joy once said (also commenting about bad code he’d written) “Examples must be exemplary!”
Your students are most fortunate!
Thank you again!
@jason C
Why in the heck was rm aliased to ‘rm -f’? That’s like removing the safety from a firearm and later changing your mind to make it another trigger.
The GNU -I (capital i) option is useful to prevent accidental large deletions. Though I haven’t deleted a whole directory full of stuff without tab-expanding the glob in a while.
I loved this. I have been a professional programmer for over 50 years. I hope no one has a list of all the mistakes I have made.
One of my favorite sayings is ” The dark side of every optimizer is a pissimizer.”.
Bah… these examples are nothing. Wait till quantum computing, especially if used (it will be) to control ICBM silos.
Awesome article and I love the comic sketch. Wish you the best. Peace & Love
Zibri, thanks for sharing the cartoon! I’ll share it with my students.
Thanks for reassuring us that making mistakes is an inevitable part of learning and growth – it’s easy to forget that when it seems like everyone around you is supremely confident and never admits to mistakes. Doesn’t help being female in this field either – as this article describes:
https://www.nytimes.com/2018/06/12/smarter-living/dealing-with-impostor-syndrome-when-youre-treated-as-an-impostor.html
Thanks for the post!
This is a great article, and you’re a great writer. Thank you for sharing.
Loved it, this whole thing was beyond painfully relatable. I fuck up on a daily basis & I think everyone should too, thanks for the inspiration
Loved reading the article, “The best code is no code at all” being the thought that most resonates with me. Thank you for sharing your experience.
The “Worse is better” text might be the most horrible thing I’ve ever read, and I wish for the world that everyone reading it will instantly and wholeheartedly reject every single sentence in there, because that would be the right thing to do.
Or to say it with the words of Linus Torvalds: I’d suggest printing out a copy of that text, and NOT reading it. Burn it, it’s a great symbolic gesture.
My best ever mistake was running an SQL query like the following:
DELETE FROM someTable WHERE id = 1 || id = 2 || id || id = 4 || id = 5;
Great article for the 95% of programmers who are in permanent roles. For seasoned IT contractors, mistakes that make it into production are nothing short of complacency and bad practices. Mistakes can be made, but these should never make it into production. But they do, and this is normally the result of the aforementioned shortcomings in a developer, nothing to do with it being “part of the journey”. Unfortunately, human nature results in this being fairly ubiquitous while the skilled contractors must tolerate it and clean up the constant pipeline of mess.
I want to link directly to your first place story so that I can point people directly to this when they are designing APIs.
But there’s no anchor on the heading. So now I’m considering putting up a page on the web with a summary, a link to the blog, and an explanation to scroll down.
Even for people who are just bookmarking, it feels like an obvious use case for the blog that many users may face…?
(great article btw)
Good article, thank you for sharing your own personal experiences.
Questions: When did you go back to academia, and why? And how did working at Mozilla, Google, and Microsoft, and the roles you were in there, fit into that?
Just read the first paragraph; couldn’t resist commenting on your teacher “who argued that Romeo and Juliet was not a tragedy, since the protagonists did not have tragic flaws.”
I think they missed the point: when two teenagers plus four adults all end up dead because of their foolishness, it’s the honor culture they live in that has tragic flaws. The persistence of this kind of tribalism to this day is deeply embedded in many of our most serious problems.
For example, US politics. It’s not a straight line, but one could argue that Confederate mythology is a major contributor to global warming, due to the increasing derision of science and reason in the name of preserving (a bad) culture, leading our politicians to make highly irrational decisions.
OK, one more comment after reading the article: worse-is-better vs. the right thing strikes me as the fundamental attitudes of Microsoft and Apple, respectively. Especially poignant at the time Gabriel wrote it.
https://www.businessinsider.com/how-apple-really-lost-its-lead-in-the-80s-2012-12
OMG… The App Inventor is what introduced me to programming all those years ago. I still remember clearly how I struggled with the ball component, so reading this is mind-boggling for me. Thanks for your work on the App Inventor, without it I wouldn’t be a developer today.
In my opinion, the worst mistake that a programmer can make is to not go through the full process of programming:
Requirements->Analysis->Design->Implementation->Testing->Maintenance.
Most programmers get an idea and then immediately start implementing, and then realize that they didn’t take something into account. Then they re-implement, and before long their code becomes unorganized spaghetti. Then they tweak and add code until it works. Once it works and goes to SQA, SQA will find bugs, and the programmer will add more crap to get the program to work. By the time it reaches the customer, it is a totally un-debuggable, unmaintainable, incomprehensible mess. Eventually, the programmer goes to work at another company, and a new programmer comes to maintain this code.
The waterfall model is insufficient for software development. Programming is the hardest craft. Often you will find that you didn’t anticipate something in an earlier step, so the model must be modified to allow backtracking. It is somewhat inappropriate for software development anyway, as making prototypes is cheap, in sharp contrast to the automotive industry where it was created.
The process of my pet projects is like:
0. I have an idea I want to try.
1. I revise my aims, feasibility and design.
2. If I have done something similar before, I check the possibility of reusing the code, possibly in part only.
3. Implementing and testing go hand in hand. If I find something I didn’t anticipate, I go back to step 1.
4. Dogfooding for QA. As I begin to depend on it, I can’t just backtrack anymore after a while in this step.
5. The interface is codified and can only be extended now. Extensions follow a similar process.
6. Maintenance: what can be fixed may be fixed but nothing else. A replacement exists.