Container Development Work

  • Comments posted to this topic are about the item Container Development Work

  • Containers are great for many situations, but they add complications and require a certain level of infrastructure before you can serve applications at all. Also, be ready for many, many long and laborious debug sessions when security etc. doesn't quite work! Once things do work, you get great deployment and consistency. Until something breaks it all again 🙄

  • Containers are a technology I would like to get into. I tried once, about a year ago, but then life got in the way and I had to put it aside. Where I work, adopting anything current (notice, I didn't say new) is greatly resisted and sometimes outright prevented by policy. I don't know why, because whenever anyone asks, the question is never answered. My guess is that someone, somewhere, is threatened by the adoption of current technology, so it's better to prevent it from ever being used. But I could be wrong.

    Kindest Regards, Rod. Connect with me on LinkedIn.

  • I found that building a container to run Postgres was a useful training exercise. It was hard enough to be interesting but not so hard that I threw the monitor out of the window. The technology hasn't been designed to be difficult.

    The company I work for uses containers for dbt, Python and various other apps. It is something I do infrequently, so there is always a period of re-acclimatisation, but not as much as when I step away from SSIS.

    I find containers particularly useful for training labs. I have images for different versions of Postgres and also for different databases.

    • Images represent a fresh starting point.
    • Containers keep long-lived data, provided a mount point is created from the container to your machine.

    This means that I can learn over several days or weeks without going back to square one every time. I can spin up different containers from the same image if I want, as in the sketch below.
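
    A minimal sketch of that workflow, assuming Docker and the official postgres image; the volume name, container names and password below are placeholders:

    ```bash
    # Create a named volume so the database files outlive any one container
    docker volume create pgdata-lab

    # Spin up a lab container from the official postgres image (version pinned)
    docker run -d --name pg16-lab \
      -e POSTGRES_PASSWORD=lab_only_password \
      -v pgdata-lab:/var/lib/postgresql/data \
      -p 5432:5432 \
      postgres:16

    # Stop at the end of the day; the data stays in the named volume
    docker stop pg16-lab

    # Pick up tomorrow exactly where you left off
    docker start pg16-lab

    # A second, throwaway container from the same image: a fresh starting point
    docker run -d --name pg16-scratch -p 5433:5432 \
      -e POSTGRES_PASSWORD=lab_only_password postgres:16
    ```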

    Containers work well with the cloud too. Sometimes a company is reluctant to upgrade software, but in the cloud they may not get the choice. When AWS decides that the minimum version of Python or Postgres needs to increase, you have to follow. Having containers to run those versions gets around the hard limits imposed by the cloud vendor. I wouldn't recommend keeping ancient versions of languages and DBs alive for too long, especially when they are not dying technologies.
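
    To illustrate, an image tag pins the exact runtime version regardless of what the managed service supports; the tags below are just examples:

    ```bash
    # Run a specific, older Python that a managed service may no longer offer
    docker run --rm python:3.8-slim python --version

    # Same idea for a database engine
    docker run -d --name pg12-legacy \
      -e POSTGRES_PASSWORD=lab_only_password postgres:12
    ```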

    The appeal for me is:

    • No DLL, library or version conflict hell.
    • Docker Desktop allows containers to be security scanned. This can be very useful if you are arguing for an upgrade (see the sketch after this list).
    • Speed of spinning up a new container or shutting one down.
    • Only running what you need, when you need it.
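
    A quick sketch of those last three points, assuming a recent Docker Desktop with Docker Scout (older releases shipped a `docker scan` command instead):

    ```bash
    # Scan an image for known CVEs; handy evidence when arguing for an upgrade
    docker scout cves postgres:12

    # Spinning up and tearing down takes seconds, so run things only when needed
    docker run -d --name quick-pg -e POSTGRES_PASSWORD=lab_only_password postgres:16
    docker stop quick-pg && docker rm quick-pg
    ```
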
  • I started using containers last year, both to make sure all of our varying projects could be built from a "base" (the current git version) with each project's features applied on top, and to make developer setup easier. We have multiple developers working on multiple projects, and with the images I created they can switch between projects with a fully set-up database server for whatever they are working on. They also don't need SQL Server installed on their machines; only whatever IDE they choose to use for querying the data outside of their application work. One additional benefit is having no databases on their machines at all. I actually put the database inside the image (it does make it larger), but the devs know they have everything they need with each image they pull and use, and they have "sa" rights on those containers.
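
    One hedged way to bake a database into a per-project SQL Server image like that, assuming a hypothetical ProjectX.bak backup and registry name (sqlcmd paths differ between image versions; 2022 images ship it under /opt/mssql-tools18 and need a -C flag):

    ```bash
    # Start a scratch container from the official SQL Server image
    docker run -d --name projx-build \
      -e ACCEPT_EULA=Y -e MSSQL_SA_PASSWORD='Dev_0nly_Passw0rd!' \
      mcr.microsoft.com/mssql/server:2019-latest
    sleep 20   # give SQL Server time to come up

    # Copy the project backup in and restore it
    # (backups taken elsewhere may need WITH MOVE to relocate the data files)
    docker exec projx-build mkdir -p /var/opt/mssql/backup
    docker cp ProjectX.bak projx-build:/var/opt/mssql/backup/
    docker exec projx-build /opt/mssql-tools/bin/sqlcmd -S localhost -U sa \
      -P 'Dev_0nly_Passw0rd!' \
      -Q "RESTORE DATABASE ProjectX FROM DISK='/var/opt/mssql/backup/ProjectX.bak'"

    # Freeze the container, database included, as an image the devs can pull
    docker stop projx-build
    docker commit projx-build registry.example.com/dev/projx-db:latest
    docker rm projx-build
    ```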
