In the words of the greatest television character of all time, Omar Little:
A man's got to have a code.
Over the last few years, I've been collecting a set of pithy statements that capture the most fundamental beliefs guiding the way I think, work, and move through the world. I'll collect them here and expound on what they mean. These aren't political or theological or scientific statements, but they inform where all of those things come from.
#1: You have something to learn from everybody.
There are a lot of reasons you might be tempted to dismiss someone. In my day job, I'm an academic. I don't think of myself as particularly smart, more passionately curious, but you might be fooled if you like brand-name colleges and judge me purely on my resume. I can see from within the academy that a lot of folks with PhDs are tempted to think that input from people who don't have the right credentials, or even the right kind of credentials, is not going to be useful. This attitude, according to Principle #1, is a mistake. People don't always have expert knowledge, but they always know something that you don't. If you don't find ways of communicating with people whose knowledge, training, education, or style doesn't match yours, you miss out on what they have to teach you.
This is not merely a statement about formal interactions in academia, but about all interactions. People have strange and interesting lives; not a single one of them is the same as yours. I look at baseball research through this lens, for example. There's a trope of analysts and coaches being at loggerheads: though they share the same goal– winning baseball games– they find themselves incapable of understanding each other. Principle #1 suggests an approach completely unlike the trope: as an engineer or analyst, start by developing a collegial relationship with one or more coaches. The mistake to avoid is asserting that I– the analyst with my fancy degrees in engineering– know baseball better than the coach, just because I can describe it probabilistically. It sounds ridiculous, especially if you imagine the scenario with an MLB coach, with all the experience it takes to become one, and me. But it's a much easier mistake to make than it sounds. People of all stripes make this sort of mistake all the time, whether it's the analyst and the coach or the engineer and the machinist.
A good example is one of my bosses from a foodservice job, who I legitimately think is one of my smartest friends. He's not particularly educated; I think he might have a college degree, certainly not a technical one. Every once in a while, in the middle of a sleepy midafternoon shift, he'll come over and say something like, "Don't get caught flat-footed, I think we're gonna get a rush 30 minutes early today." Imagine a literal tumbleweed rolling past the window, just a dead shift. "Mike being Mike," I think to myself, every time. Without fail, he's right. Sometimes it's because he spotted a catering truck for some kind of gala nearby and predicted the after-partiers coming our way. Other times, it's because of a calendar event he noticed. But the best is when his reasoning is something like, "Well, it was an unseasonably warm day, so I think it will be busier than normal, and it's cloudy and the sun sets at 4:30 this time of year, so I think people will want to get out of their offices earlier than usual." As a math nerd and an analyst, it's obvious to me that all of these things could be measured, and we could probably project turnout in real time pretty decently with a simple model and the right data stream. But the statistical and analytical knowledge is completely useless if I don't know where to look, and it's my day job to know the math, not to watch for the signals. Principle #1 is all about finding ways of getting more than the sum of the parts in collaboration, and that will never happen if you think you're above someone else because of the particular skill that you have.
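To be concrete about what such a model might look like, here's a minimal sketch in Python. Every feature, weight, and name here is invented for illustration– the real value Mike provides is knowing which signals matter at all, which no amount of math replaces.

```python
from dataclasses import dataclass

# A toy "early rush" predictor in the spirit of Mike's reasoning.
# The features and weights are hypothetical placeholders, not a fitted model.

@dataclass
class ShiftSignals:
    temp_anomaly_f: float      # degrees above seasonal normal
    cloud_cover: float         # 0.0 (clear) to 1.0 (overcast)
    hours_until_sunset: float  # time remaining before sunset
    nearby_event: bool         # catering truck, gala, etc.

def rush_score(s: ShiftSignals) -> float:
    """Crude linear score; higher means an early rush is more likely."""
    score = 0.0
    score += 0.05 * s.temp_anomaly_f                      # warm day -> busier
    score += 0.30 * s.cloud_cover                         # gloomy -> leave early
    score += 0.20 * max(0.0, 3.0 - s.hours_until_sunset)  # early winter sunset
    score += 0.50 if s.nearby_event else 0.0
    return score

# A warm, overcast winter afternoon with an event nearby scores high.
print(rush_score(ShiftSignals(8.0, 0.9, 1.5, True)))
```

With real arrival data, the hand-set weights above would be replaced by a fitted regression; the hard part is not the fitting but knowing to record cloud cover and catering trucks in the first place.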
My last thing to say about Principle #1 is that it doesn't just apply to my friends and confreres. Some people have deep ideological disagreements with me, fundamental disagreements about life or work. It's easy to ignore these folks, and in the social media era it's even easier to talk past them. I think that's a lost opportunity, though. Iron sharpens iron, for one: if my ideas are right, subjecting them to good-faith criticism– and even thinking hard about them to defend them against bad-faith criticism– will only make them better. Even in the latter case, when someone truly proves themselves unworthy of interaction, Principle #1 still begs that I try not to miss the learning opportunity; the truly nihilistic are few and far between, and the rest are trying and failing to act in good faith somewhere along the line. Principle #1 suggests that if I can learn nothing else from someone, I should at least learn from them how not to become like them.
#2: Confidence about ideas is good for you; absolute certainty can only hurt you
A bunch of the stuff I work on, whether in aeronautics research or in baseball, involves Bayesian statistical methods. The details aren't the point here, but one idea from this domain of statistics– one that I think is closely related to how our brains function– is relevant.
Consider trying to predict the probability of one event– like hitting a home run– among a set of events of which only one can happen at a time– like all the different things that can happen in an at-bat in baseball. You know each outcome is weighted with a probability, but you don't know what the weighting is. It turns out that you can prove, with mathematical certainty, that by incorporating data to improve an initial guess, you will get better and better estimates of the true probability no matter how bad the initial guess is. There's a requirement, though: if you are 100% certain or 0% certain about something, the new information you gather will never get you closer to the real probability of the event you're trying to estimate. Your initial belief doesn't have to be much: it can be tiny! You just have to hold that all of the things that could happen are not impossible, merely of tiny probability. If you do so, eventually, you will approach the truth.
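A toy simulation makes this concrete. The true home-run rate and the grid of candidate values below are invented for illustration; the point is that an open-minded prior homes in on the truth from data, while a prior that is 100% certain of the wrong answer never moves, no matter how much data arrives.

```python
import random

random.seed(0)
TRUE_P = 0.06  # assumed true home-run probability (illustrative)
grid = [0.0, 0.02, 0.04, 0.06, 0.08, 0.10]  # candidate values we'll weigh

def update(prior, homer):
    """One Bayesian update over the grid: posterior is proportional to
    likelihood times prior, then renormalized to sum to 1."""
    post = [w * (p if homer else 1 - p) for p, w in zip(grid, prior)]
    total = sum(post)
    return [w / total for w in post]

open_minded = [1 / len(grid)] * len(grid)       # every candidate possible
closed_minded = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]  # 100% certain that p = 0.04

for _ in range(5000):                # simulate 5000 at-bats
    homer = random.random() < TRUE_P
    open_minded = update(open_minded, homer)
    closed_minded = update(closed_minded, homer)

estimate = sum(p * w for p, w in zip(grid, open_minded))
print(f"open-minded estimate:  {estimate:.3f}")  # concentrates near 0.06
print(f"closed-minded weights: {closed_minded}")  # still all on 0.04
```

The closed-minded prior multiplies its zero weights by every new likelihood and gets zeros back forever– exactly the "100% or 0% certain" failure mode described above.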
I like to apply this way of thinking to my life and work. I try not to fear uncertainty, but at the same time, I also don't fear having confidence. Like all of these principles, #2 is one that I am trying and constantly failing to live by, and it's definitely not my strong suit. In practice, this is reflected in my willingness to have opinions; those who know me will attest I have many. But it also means, I hope, that it is always well understood that though I often have a strong point of view– professionally or philosophically or aesthetically– I am not above changing it, and should always be open to discussing it with those who disagree.
A key part of Principle #2 is building a framework of knowledge that incorporates new data and observations appropriately: data should not be trusted in a vacuum, but taken as a whole with the prior state of belief in mind. Simultaneously, the state of belief before an observation should not be the same after it. The whole of understanding incorporates both the prior framework, constructed so as not to exclude possibilities, and the observations, which in life as in engineering are always imperfect.
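The update being described here is just Bayes' rule, which in standard notation reads:

```latex
% Posterior belief in hypothesis H after seeing data D:
% the prior P(H) is reweighted by how well H explains D.
P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)}
```

Note what happens at the extremes: if the prior P(H) is exactly zero, the posterior is zero no matter what data D arrives– the mathematical version of a mind that no evidence can change.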
#3: The elementary particle of every success and failure is the team, not the individual
I will elucidate this idea... eventually.