2011
11.15

It seems to be fashionable of late to malign the value of a college education. Tech pundits like Peter Thiel are offering bright students a pile of cash to either not go to school, or to drop out and start a company. For some really bright people, this approach might make perfect sense, but to suggest it in general is incredibly shortsighted. It’s undeniable that many, many people waste their opportunity when they go to college. They go to the wrong schools, for the wrong reasons, without putting thought into what could be a critical investment. Some, through poor instruction, disability, or simple lack of motivation, arrive without enough of an educational foundation to gain anything from college. The bulk of the counterarguments ultimately comes down to “It costs too much,” “You can learn all those things outside of college,” or “Most people don’t work in the field they got their degree in anyway.” All true, all well documented, and all ultimately misdirection.

All these criticisms, in my opinion, are aimed at implementation issues rather than the idea of university learning itself. A coworker made this point best; to paraphrase him, college, at its best, is academic boot camp. I’d call it a short-term sacrifice for a long-term gain. At its best, getting a college degree is a chance to dive into knowledge at a depth the demands of day-to-day life would never afford. It’s a chance to learn how to write with grace, and to construct well-thought-out arguments. Most importantly, it’s the one time in most people’s lives when they can dive deeply enough into learning to uncover ideas that induce a paradigm shift.

So why doesn’t it work out that way? Everyone wants to blame one thing, which, in my opinion, is why the discussion is all hand-wringing and no progress. The truth is more layered. The cost of a college education is undeniably increasing quickly. American institutions are admitting an increasing number of (often wealthy) foreign students. Then there’s the fact that lots of people treat college as a four-year party, interrupted only briefly by classes and studying. Top-notch researchers aren’t always the best instructors, which puts a greater teaching burden on TAs and graduate students. To top it off, a college education is no longer a guaranteed ticket to employment, if it ever was. And there are a host of other problems with how the higher education system currently functions.

So why defend the system? In the end, there are some things that simply require a lot of time and immersion to learn. It’s certainly possible to learn these things on one’s own, but for most people, carving out enough time to do so is hard. Our society is full of distractions; add the mental load of full-time employment, and most people don’t have the discipline left to push themselves through college-level material that requires hours of study to master. And since people’s skills generally appreciate in value over the course of a career, the years right after high school, when most people’s skills and employment value are at their lowest, are the cheapest time to make an investment in the future. Wait too long, and most people won’t be able to afford the time for the kind of learning that would ultimately make them better employees, citizens, and human beings.

On top of these pragmatic reasons, college campuses are still a breeding ground for creativity. They’re a place for people to collaborate and rub elbows with other intelligent people without the time pressure of the workplace. Many successful startups and businesses have been built from connections made, and research done, on the grounds of universities. Even Thiel’s PayPal had its roots on University Avenue, adjacent to Stanford. This unstructured collaboration and research often pays dividends in personal connections made, even if the student doesn’t end up working in their original field of study.

Higher education certainly doesn’t have to be the path for everyone. There is a shortage of skilled tradesmen, and there are plenty of other paths, including self-education and entrepreneurship, by which young people can develop into knowledgeable adults. None of this changes the fact that for many people, a college degree is still a great investment. In short, let’s not get hasty and throw away a great thing over some implementation issues.

2011
11.10

Sometimes I’ll come across something that I read, or hear, or see that causes a permanent shift in my thinking. I was reminded of this today while idly browsing my copy of Stranger in a Strange Land by Robert A. Heinlein. The plot of this excellent piece of science fiction revolves around the reintroduction to Earth of a man raised by Martians. When I first read the book, in high school, I can clearly remember how shocking it was to put myself in the mind of someone experiencing money, religion, and other very common human concepts for the first time. While I certainly haven’t adopted all of the philosophies espoused in the book, these ideas definitely altered my thinking, I’d like to believe for the better. In college, breakthroughs in my understanding of induction, recursion, and functional programming radically altered the way I approach problem solving, in ways that reach beyond my programming skill. I remember a similar effect the first time I really wrapped my head around the concept of “focus” in the martial arts.

The common thread between all these personal mental shifts was a pile of hard work. There’s currently no easy way to shortcut the learning process, and I can’t see one appearing in the near future. This presents a real problem, considering that the amount of knowledge available to the average person is increasing exponentially. Most of us don’t have the luxury of unlimited time for study and exploration. Technology might make this information more accessible, but it isn’t making it any easier for our brains to process, collate, and learn from the growing river of useful knowledge, nor does it make it easier to find which nuggets in the sea of content will actually spark a new idea.

I wonder if it would be possible to speed up this process by collecting a list of books, activities, and conceptual “big ideas” that other people have found to create a paradigm shift in their own thinking. It seems to me that this would be a great way to keep working out my mental muscles without having to find good ideas by luck or coincidence. I’d be much more motivated to study a subject outside my expertise on my own if I knew what the “big idea” was that I was trying to wrap my head around. Admittedly, this approach would yield knowledge “a mile wide and an inch deep,” but I think that’s a worthy sacrifice to avoid the motivation-killing problem of drowning in the details of a new area of study. The goal wouldn’t be to become a master of every domain, but rather to force the mind to expand and grow by mastering new core ideas that change how one views the universe.

Does anyone have any suggestions for content that meets the standard of “paradigm shifting”? Or is it possible that this kind of thing is too personal to generalize well?

2011
11.01

Programmers write code for a living. Everyone who writes about programming agrees that programmers need to write more than just code, but even that consensus dramatically underestimates how much we as a profession should be writing.

It’s become a mantra to “show me the code,” but the code isn’t always enough. Code is a slippery and imperfect representation of a particular programmer’s mental model, spelled out in the language of his or her tools at a particular point in time. Its meaning shifts as languages and frameworks change, and as the programmers working on it change. Codebases live on 2, 5, 10 years past their intended lifespans. We talk about producing literate code, testable code, readable code, but imagine the code being written now viewed from the vantage point of a programmer 10 years in the future. Most likely, it will look the way COBOL looks to most programmers today.

The writing I mean isn’t documentation for the codebase, comments inside the code, or lists of features and technical specifications. The truth of the matter is, useful though all those things may be, people don’t read them unless it’s absolutely necessary to solve a particular problem with your software. The kind of writing I’m talking about is crafting the story of what your software was intended to do, and telling the origin of how that software came to be written. It’s describing the technologies used, not by diving into technical details, but by explaining how they are being used to solve a particular problem. In fact, a good litmus test for whether something belongs in this type of document is “Would a non-programmer find this useful?” If so, you’re probably on the right track; if not, keep writing it, but put it in one of the aforementioned types of documentation.

Good writing isn’t boring, and it isn’t always entirely efficient. That’s okay. Developing a voice and keeping it light means someone might actually read your writing, rather than using Ctrl-F to get to the point immediately. Just as source control lets you track the development of software over time, the writing I’m describing should document the changing goals and trials, both technical and business-specific, that the software is adapting to solve over time. For a completely contrived, and short, example:

As load increased and our application grew to a dozen servers, we started having serious concerns about how to quickly get messages between the different parts of the application. We opted to deploy the third-party NServiceBus library as a quick and easy way to pass events between the different parts of our product. This gave us a stable way to reliably queue up messages, ensuring that the email system always sent out responses after a customer sent in an order, and preventing embarrassing situations like last week, when a silently failing service meant four people didn’t get their Friday deals email until Sunday morning…
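(If you’re curious what the code behind a story like that might look like, here’s a minimal sketch of the queue-and-worker pattern the paragraph describes. It’s in Python rather than the C# an NServiceBus shop would actually write, the queue is in-memory rather than the durable broker NServiceBus provides, and names like place_order and OrderReceived are hypothetical illustrations, not NServiceBus’s real API.)

    import queue
    import threading

    # Hypothetical sketch of the pattern described above: the ordering code
    # publishes events onto a queue, and a separate worker drains it to send
    # email. In a real deployment the queue would be a durable broker, so
    # messages survive a failing email service instead of being dropped.

    events: "queue.Queue[dict]" = queue.Queue()

    def place_order(customer_email: str, order_id: int) -> None:
        # The front end never sends email directly; it just records the event.
        events.put({"type": "OrderReceived",
                    "customer": customer_email,
                    "order_id": order_id})

    def send_confirmation_email(to: str, order_id: int) -> None:
        print(f"emailing {to}: thanks for order #{order_id}")

    def email_worker() -> None:
        # Drain the queue; on failure, re-queue the event for a later retry
        # rather than silently losing it.
        while True:
            event = events.get()
            try:
                send_confirmation_email(event["customer"], event["order_id"])
            except IOError:
                events.put(event)
            finally:
                events.task_done()

    threading.Thread(target=email_worker, daemon=True).start()
    place_order("alice@example.com", 42)
    events.join()  # block until the confirmation email has gone out

The point of the decoupling is exactly what the story says: taking an order never fails just because email is down, and the queue remembers what still needs to be sent.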

This kind of documentation helps everyone. It gives the business people an idea of how the application is working, it gives future programmers an idea of what problems were being solved when features were written, and it adds to the corpus of knowledge that sales and management can use to better understand what the software team has been working on. For consulting, this type of writing can quickly clear up points that otherwise get glossed over in discussion. I’ve found that misunderstandings that persist for days of verbal discussion often get highlighted immediately when someone puts their model of the situation down in prose. And unlike more specific technical documentation, when paired with a time and a date, this piece of writing never becomes outdated; it’s just one more thing that paints a picture of how an application came to be what it is today.

The critical point in all of this is that only the people intimately involved in writing the application can produce this kind of documentation. Our frequent failure to do so contributes to engineering’s too-often strained relationship with management and business, and it doesn’t need to be this way. We can do better, and we can start by communicating our work in a way that everyone else can understand and appreciate. Write it up, stand up, and be recognized.