Trying New Technology

I had someone ask me about DuckDB recently. Did I think it was a good choice for a database? I don't really know. From their blog and some online research, maybe, but it's also a minority player in a niche space. I also chatted recently with someone who had implemented ArangoDB, a graph database. Why that and not Neo4J, I asked? Someone at the company had tried the database and recommended it. That's not a bad reason, as I think hands-on experience with technology is important, but it's not the most important thing.

As I've aged, and maybe matured, I think less about the ability of a technology to work and more about the ability of a technology to be maintained over time. Not by me, but by everyone in my organization. Not everyone needs to master it, but can anyone on our staff learn and use it, including the future employees we haven't yet hired?

There seems to be no shortage of new niche technologies. I subscribe to a few newsletters, and I see new projects and new solutions appearing every day: new tools, utilities, frameworks, even databases. Some of these might be amazing and incredibly useful, but will they exist in a few years? In fact, that's a question I ask myself about plenty of the Microsoft technologies that appear. Will they really be around in 5 years? Long-term, or at least medium-term, supportability is important.

I also worry about the training and learning required for new technology. I've seen companies adopt too many products in their tech stack, and it becomes hard to hire experienced people. Even if we hire smart people who can learn, we have a lot to teach them. The more we need to teach, the slower they are to become productive. It can take even longer for us to trust them to work independently, especially in a crisis.

I think that most organizations should limit the number of technologies they use. This applies to frameworks, languages, and more, including databases. Don't add something new just because a developer, DBA, or even an executive likes it. Certainly, be careful about changing technologies when the change isn't adding value to your organization. Every change has costs, every new advantage contains a disadvantage, and every additional thing creates training requirements. Some people might pick things up quickly, easily, and during their off hours. That person might be you, but how many others will be able to do that? Not many. That's been my experience.

The world is full of average people, by definition. While the average level (skill, capability, experience, etc.) at your organization might be higher than the industry's, over time that will change. As our organizations grow, and as we change staff, we often become more average. Our choices, our methodologies, our architecture, and more must survive the average employee, not the high-performing ones.

Every organization ought to limit tech choices. There ought to be a process for adding new technologies: employees ought to be able to submit a request, make a case, and have others decide if taking on a new technology makes sense. If so, great, but do so carefully. I like seeing new technologies built and adopted, but I also try not to just adopt the latest shiny things. Experiment in a time-boxed fashion, and make decisions consciously, adopting only when the benefits outweigh the costs. And not just slightly outweigh the costs, but substantially. In all likelihood, whoever proposes the new tech isn't thinking about the downsides, and there will always be more of them than you can see right now.
Steve Jones - SSC Editor