Sometime in 2011, I stumbled across an unassuming site called Hacker News. At the time, I was a data analyst working mostly with Excel and SAS, and almost all of the headlines there were foreign to me. Git diffs? PyPy? Real-time APIs?
I realized quickly, though, that HN was the heartbeat of the tech industry, the place where many of the threads and points of interest the industry cared about were discussed. There’s a lot to dislike about the commenters and some of the discussion they generate, but I still don’t know of any other outlet like it, in any other industry, that allows an emerging practitioner to become quickly familiar with the latest news and slang of their profession at such a high discussion volume. (Other than maybe r/programming.)
I still give newbies the advice to read Hacker News every day, click on links that interest them, and Google things they don’t know until it starts to make sense.
But, in the beginning, the more I read, the more disgusted I became with both myself (why didn’t I know everything?) and everything in my work environment.
Why were we still using Oracle? Why did I have a crummy Windows machine? Why were we not using Gmail instead of Outlook? Why was our team not called a “data science” team, and why were we not doing “data science”?
The more I learned, the angrier I became at both how little I personally knew, and how terrible my company was.
What I didn’t realize was that I was falling prey to a cognitive bias that in today’s world might be known as a filter bubble, a combination of anchoring, availability, and framing effects that leads you to believe that the information you’re most exposed to is the way the entire world actually works.
And what Hacker News was teaching me was that the entire IT industry was working on brand-new MacBooks, using Go and Rust, Docker and CI, to build amazing products.
My hobby: Adding “in Rust”, “in Go”, and “in Tensorflow” to the end of Hacker News headlines and seeing if it makes sense. Always does. pic.twitter.com/N0lfLpAjY6
— Vicki Boykis (@vboykis) August 5, 2017
After more than ten years in tech, in a range of different environments, from Fortune 500 companies to startups, I’ve finally come to realize that most businesses and developers simply don’t revolve around whatever’s trending on HN.
Someday, I want to write about the state of 80% of the IT market, for both companies and candidates. 80% of the market is not using K8s or microservices or doing interviews inverting binary trees.
Two cases in point:
One: https://t.co/IvOYO21mJy
Two: https://t.co/OJr4y9Za5l
— Vicki Boykis (@vboykis) May 7, 2019
Most developers – and companies – are part of what Scott Hanselman, a while ago, dubbed the 99%:
My coworker Damian Edwards and I hypothesize that there is another kind of developer than the ones we meet all the time. We call them Dark Matter Developers. They don’t read a lot of blogs, they never write blogs, they don’t go to user groups, they don’t tweet or facebook, and you don’t often see them at large conferences.
Lots of technologies don’t iterate at this speed, nor should they. Embedded developers are still doing their thing in C and C++. Both are deeply mature and well understood languages that don’t require a lot of churn or panic on the social networks.
Where are the dark matter developers? Probably getting work done. Maybe using ASP.NET 1.1 at a local municipality or small office. Maybe working at a bottling plant in Mexico in VB6. Perhaps they are writing PHP calendar applications at a large chip manufacturer.*
While some companies are using Spark and Druid and Airflow, others are still using ColdFusion:
You want to work in tech, get paid and never want for work. Learn some legacy shit. I was working on the only cold fusion project, deployed on the only Linux servers, from the first project on git at a company with 80b in assets under mgt. Then I got transferred to Visual Basic.
— Douglas King (@douglascodes) May 7, 2019
Or telnet:
TV Guide used to have a database of movie reviews bigger than IMDB’s. It was basically a text database running on a sparc. The speed with which editors could telnet into it and update stuff without a GUI could never be matched by a web interface.
— Deadprogrammer (@deadprogrammer) May 7, 2019
or TFS:
I worked with a mid-market consulting firm and our team was struggling to get GitHub Enterprise but it was a huge fight because (1) they exclusively used Microsoft TFS and (2) they never before supported Linux-based products
— Jacqueline @ csv,conf (@skyetetra) May 7, 2019
At one company where I worked, we went through a migration from SVN to Git. In 2014. In many industries where regulation remains an important consideration, developers don’t have administrative access to their own computers. I’ve developed in environments where Java 8 was still the latest available version because of security concerns around upgrades.
Java 8 is still the dominant development environment, according to the 2018 JVM Ecosystem Report.
If you think that’s bad, check out Oracle:
Oracle Database 12.2. It is close to 25 million lines of C code. What an unimaginable horror! You can’t change a single line of code in the product without breaking 1000s of existing tests.
If you think it’s only old, staid companies that do this, here’s a story about Tesla’s infrastructure.
A former Tesla employee, who worked on their IT infrastructure, is posting in a subforum of a subforum, a little-known place for funny computers forgotten by time. His NDA has expired.
He has such sights to show us. Join me and I will be your silent guide into a world of horror. pic.twitter.com/uFDOj0x5Zy
— fuckiest of nuggets (@atomicthumbs) August 24, 2018
Hanselman, in his post, focuses on developers, but there are really several parts to this issue.
The first is that technology doesn’t move as quickly as people do. Companies have politics. There are reasons updates are not made. In some cases, it’s a matter of national security (like at NASA). In others, people get used to what they know.
In some cases, the old tech is better:
Not me, but my mom’s a COBOL programmer for a Fortune 50. They spent a ton of money trying to port their stuff to C/C++ and couldn’t get the same efficiency/speed.
— Hannah (@story645) May 7, 2019
In some cases, it’s both a matter of security and a matter of IT not being a priority. This is the reason many government agencies return data in PDF format, or in XML.
But it’s not only government agencies that are running XML.
This is the saddest thing I’ve read today, and I just recently started Crime and Punishment. https://t.co/SYgM063p3s
— Vicki Boykis (@vboykis) May 5, 2019
For all of these reasons and more, the majority of companies that are at the pinnacle of success in America are quietly running Windows Server 2012 behind the scenes.
And, not only are they running Java on Windows Server 2012, they’re also not doing machine learning, or AI, or any of the sexy buzzwords you hear about. Most business rules are still just that: hardcoded case statements decided by the business, passed down to analysts, and done in Excel sheets, half because of bureaucracy and inertia, and sometimes because you just don’t need machine learning.
we once changed an ml algorithm into a sql query once we knew the table existed.
— Vincent D. Warmerdam (@fishnets88) May 7, 2019
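To make that concrete, here’s a minimal, entirely hypothetical sketch of what such a business rule tends to look like in practice. The field names and thresholds below are invented, but the shape (a hardcoded conditional the business agreed on, sometimes later pushed down into SQL once someone finds the right table) is the point.

```python
# Hypothetical example: a "business rule" as a hardcoded conditional.
# Field names and thresholds are made up purely for illustration.

def credit_limit_tier(annual_income: float, years_as_customer: int) -> str:
    """Return a tier label using rules handed down from the business."""
    if annual_income >= 100_000 and years_as_customer >= 5:
        return "gold"
    if annual_income >= 50_000:
        return "silver"
    return "standard"

# The same rule, expressed as the SQL CASE statement it often ends up as
# once someone discovers the table already exists:
RULE_AS_SQL = """
SELECT customer_id,
       CASE
           WHEN annual_income >= 100000 AND years_as_customer >= 5 THEN 'gold'
           WHEN annual_income >= 50000 THEN 'silver'
           ELSE 'standard'
       END AS tier
FROM customers;
"""

print(credit_limit_tier(72_000, 2))  # prints "silver"
```

No model, no feature store, no platform: just a conditional that an analyst can read, audit, and change.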
Finally, the third piece of this is the “dark matter” effect. Most developers are simply not talking about the mundane work they’re doing. Who wants to share their C# code moving fraction-of-a-cent transactions between banking systems when everyone is doing TensorFlow.js?
Tried to do some simple emoji detection with @tensorflow.js
What do you think? pic.twitter.com/mP0RDsPHMd
— Nick Bourdakos (@bourdakos1) May 6, 2019
In today’s economy, where thousands of developers are online, the person whose voice is the loudest gets the most weight. The loudest people aren’t going to be those working with legacy systems (unless they’re doing something extremely new).
This piece of the puzzle is the one that worries me the most. What I’m worried about is that places like Hacker News, r/programming, the tech press, and conferences expose us to a number of tech-forward biases about our industry: they are overenthusiastic about the promises of new technology without talking about tradeoffs. The loudest voices get the most credibility, and, as a result, we listen to those complicated set-ups and overengineer systems of distributed networking and queues and serverless and microservices and machine learning platforms that our companies don’t need, and that most other developers who pick up our work can’t relate to, let alone work with.
I’ve spoken and written about it at length, but most times, easier is best.
And, if the tech is, in fact, old and outdated, but the cost of replacing it outweighs the cost of keeping it, we shouldn’t be jumping to replace it with the latest and greatest. While we should evaluate new tools evenhandedly, most times, Postgres works just fine.
For better or worse, the world still runs on Excel, Java 8, and SharePoint, and I think it’s important for us as technology professionals to remember that and to be empathetic about it.