So you’ve written Infrastructure As Code, Automated All The Things, and drank deeply of the DevOps kool-aid on your current project – what’s next?

You’ve been reaping the rewards of DevOps on a small scale, and your one-off DevOps effort has earned a lot of positive attention – now management would like you to implement DevOps for all the projects! So how do you spread the DevOps wealth, and what do you need to be aware of?

Delivering DevOps

What is DevOps?

For this article, we’ll use the shorthand “DevOps” to mean the code bits – such as bash/posh/cmd for scripting, TF/CLI for IaC, and YAML for Azure DevOps CI/CD pipelines. Start by identifying the valuable bits of your current DevOps efforts and use them as a basis for what you want to disseminate to all other projects.

Cross-Project DevOps

Not all projects will have the exact same DevOps requirements. Still, over time you will build up a collection of useful scripts and templates that are generic enough to provide value across all software projects in your organization.

But you can’t simply copy/paste these files into every repo, as that would be a massive headache to manage and keep updated. Instead, you’ll want to version and package these scripts so that every project that adopts the enterprise DevOps approach can track and plan for DevOps package updates.

Custom Package Repository

The easiest way to distribute file packages is through a custom package repository. Chances are your software project is already using at least one of NuGet, NPM, Maven, or PyPI. Azure DevOps can create all of these kinds of repositories, which is how you can seamlessly distribute your company-proprietary DevOps package without making it publicly available. These custom repositories are also handy as a local cache for public packages.
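
If your projects consume the package via NPM, wiring a repo up to a private Azure Artifacts feed is typically just a couple of lines in an .npmrc file. The organization, scope, and feed names below are placeholders, not real values:

```ini
; .npmrc – route the company scope to a private Azure Artifacts feed
; (organization, scope, and feed names are placeholders)
@mycompany:registry=https://pkgs.dev.azure.com/mycompany/_packaging/devops-feed/npm/registry/
always-auth=true
```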

DevOps Integration

Usually, downloaded packages are not committed to the repo – only the reference to the package – and the build/release agents download them as needed. But DevOps packages should be committed to the repo for a variety of reasons. To do that, include an install script with your package, like this example for an NPM package:

#!/usr/bin/env node

const path = require('path');
const fse = require('fs-extra');
const rootFolder = path.resolve('../..'); // backwards from “./node_modules/<package>”
const installFolder = `${rootFolder}/.devops`;
const oldInstallFolder = `${rootFolder}/.devops.backup`;
const nodePkgFolder = path.resolve('.');
const srcFolder = `${nodePkgFolder}/src`;

let oldFolderRenamed = false;

// rename .devops to .devops.backup
if (fse.existsSync(installFolder)) {
    oldFolderRenamed = true;
    if (fse.existsSync(oldInstallFolder)) {
        console.log('removing last [.devops.backup] folder...');
        fse.removeSync(oldInstallFolder);
    }
    console.log('renaming [.devops] as [.devops.backup]...');
    fse.renameSync(installFolder, oldInstallFolder);
}

// copy package src folder to install folder
console.log('installing devops package...');
fse.copySync(srcFolder, installFolder);

// read version from package.json and save to installFolder
const packageJson = fse.readFileSync(`${nodePkgFolder}/package.json`);
const pkg = JSON.parse(packageJson); // "package" is a reserved word in strict mode
fse.writeFileSync(`${installFolder}/pkg_version.txt`, pkg.version);

if (oldFolderRenamed) {
    console.warn('Existing devops configuration has been backed up and replaced, please manually merge your configuration back into the new devops package!');
    console.log(`Please read ${nodePkgFolder}\\changelog.md`)
} else {
    console.log(`Devops package has been installed, please read ${nodePkgFolder}\\readme.md`)
}

This script copies the DevOps package from the node_modules (NPM package cache) directory into the project’s root directory. If the DevOps package directory is already present, it renames the old directory before copying in the new one. It is then trivial to diff the old and new directories for changes and merge them.

For the NPM install, use the --no-save option – we are using NPM only as a downloader/installer, so it doesn’t need to save the reference in package.json.
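
Assuming the package is published under a company scope to your private feed (the package name here is hypothetical), the whole install/update step for a consuming project looks like:

```shell
# Downloads the package and runs its install script,
# without recording a dependency in package.json.
npm install --no-save @mycompany/devops-package@latest
```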

Package Versioning

DevOps package files should be tracked in a version control system like Git. Develop each feature in a separate branch and merge it through a PR, with each PR creating a new version of the DevOps package. You can then DevOps your DevOps by setting up CI pipelines that automatically publish a new package version whenever the master branch changes.
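
As a sketch, an Azure DevOps YAML pipeline for this could look like the following – the feed wiring, versioning scheme, and branch name are assumptions to adapt to your setup:

```yaml
# azure-pipelines.yml – publish a new package version on every change to master
trigger:
  branches:
    include:
      - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  # Authenticate npm against the private Azure Artifacts feed named in .npmrc
  - task: npmAuthenticate@0
    inputs:
      workingFile: .npmrc
  # Derive a unique version from the build (one possible scheme, not the only one)
  - script: npm version "1.0.$(Build.BuildId)" --no-git-tag-version
    displayName: Set package version
  - script: npm publish
    displayName: Publish to private feed
```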

Don’t forget to document! Include a readme for first-time consumers and a changelog for updates.

Adopt and Migrate Incrementally

Generally, DevOps development will follow this kind of flow:

  1. Implement a project-specific DevOps functionality
  2. Test, Fix and Refine
  3. Generalize and extract useful bits of functionality
  4. Test, Fix and Refine
  5. Repeat

When integrating an enterprise DevOps package, a similar process can be followed:

  1. Migrate project-specific DevOps functionality to use the enterprise DevOps package
  2. Test, Fix and Refine
  3. Generalize and extract useful bits of functionality
  4. Test, Fix and Refine
  5. Merge the generically useful bits into the enterprise DevOps package
  6. Repeat

Standardize and Prevent Breaking Changes

A critical benefit of the DevOps package approach is that it allows for the standardization of DevOps processes across the enterprise and provides a straightforward way to keep all projects in sync. A bug fix or new process can be quickly rolled out to all consumers of the package.

Standardization could also be accomplished using Azure DevOps Pipelines or Task Groups, but any change to those will affect all consumers immediately and can invisibly break things. If the DevOps package is stored in the project repo instead, those projects are insulated from breaking changes and bugs.

Track Changes and Update Easily

Each project will likely have a separate set of custom configuration files that must be modified from the package baseline to customize the build and release pipelines. It is essential to separate these files and keep track of changes to them after a package update. Conversely, all the other files in the DevOps package should not be modified, to ensure a smooth update. If one of the package files must be modified, the change should be either A) temporary, with the expectation of merging it back up into the package, or B) copied locally to the custom configuration directory so that it is evident that it must be inspected during the package update process.

Enterprise DevOps Principles

To sum everything up, there are several themes here that will ensure a successful enterprise DevOps approach:

  • Consistency – standardize DevOps processes across the enterprise
  • Modularity – design DevOps processes so that each component is focused and reusable – follow DRY and SRP guidelines
  • Resiliency/Stability – make DevOps processes resistant to unexpected changes
  • Traceability – easily understand changes to DevOps processes and merge quickly

Example

Consider this NPM package:

  • scripts/install.js – module install script
  • src/config – all project-specific customization and scripts
  • src/pipelines – enterprise-wide CI/CD pipelines
  • src/scripts – enterprise-wide bash/posh/cmd scripts
  • src/tf – enterprise-wide terraform templates
  • src/Update-DevopsPackage.ps1 – helper script that triggers the download of the latest package
  • changelog.md – description of differences between versions, upgrade notes
  • package.json – NPM publish metadata, contains version
  • readme.md – introduction to DevOps package, contains getting started instructions, directory, and file descriptions

And here’s what a consuming project may look like:

  • .devops/config – src/config directory from package
  • .devops/pipelines – src/pipelines directory from package
  • .devops/scripts – src/scripts directory from package
  • .devops/tf – src/tf directory from package
  • .devops/version.txt – text file containing the package version (handy for tracking updates)
  • .devops/Update-DevopsPackage.ps1 – copied from src directory from package
  • src/ – project source code, etc.
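
Since each consuming repo carries its installed package version in a plain text file, checking whether a project is behind the feed is a simple version comparison. A minimal sketch – the semver-comparison helper below is generic illustration, not part of the package:

```javascript
// Compare two semver-style version strings (e.g. "1.4.2" vs "1.10.0").
// Returns true when `latest` is newer than `installed`.
function isUpdateAvailable(installed, latest) {
    const a = installed.trim().split('.').map(Number);
    const b = latest.trim().split('.').map(Number);
    for (let i = 0; i < Math.max(a.length, b.length); i++) {
        const x = a[i] || 0; // missing segments count as 0
        const y = b[i] || 0;
        if (x !== y) return y > x;
    }
    return false; // versions are equal
}
```

For example, compare the contents of .devops/version.txt against the newest version reported by npm view for the package.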

Why Stream?

If you don’t have school-aged children, you might not know that live streaming is the hot thing at the moment. Imagine telling your younger self that you could not only get paid for playing video games but that people would enjoy watching you play games. Honestly, I was born in the wrong era.

Traditionally, the way developers communicated their craft was through the written word; we already type thousands of lines a day, have a WPM of 90, and write extensive documentation. Writing instructional text is a natural progression.

But some ideas are best communicated visually and personally – this is why developers go to conferences, sit in crowded rooms, and stare at a wall of text. However, in the age of COVID-19, in-person meetings have been put on hold indefinitely – so the next best way to recreate the experience is live streaming.

This may sound daunting; most people have a fear of public speaking, and many developers are introverts, compounding the fear. If you’re terrified of speaking to a dozen people, imagine potentially thousands of people watching and listening to you online! But the best way to overcome fear is through familiarity.

Live streaming is no more difficult than sharing your screen in Microsoft Teams and talking your coworkers through your code. Sure, you could write a multi-page email or extensive documentation, but that lacks immediacy and leaves things open to misinterpretation. A live stream, with a live audience responding and asking questions, is a great way to communicate ideas.

Getting Started

Content

This is the simplest but sometimes the hardest part – coming up with something to discuss. In your daily grind, it’s easy to talk about what you’re working on with your coworkers, but once you clock out, it can be hard to come up with a topic. This is the curse of every communicator, and it’s up to you to figure out your muse. But if you can figure out a way to talk about what you’re working on for work without giving up any company secrets, then that’s a great place to start.

Audience

Now that you have an idea of what you want to talk about, you have to figure out who you want to talk to! Who’s your audience? It’s simple in a work setting: your office roommate, your pair programming buddy, and your team members. But outside of the 9-5, it can be difficult to find an audience. This is a topic in and of itself, but a great first step is to find dev streamers that you like, then follow in their footsteps and appeal to their kind of audience. Over time, you’ll develop your niche.

Tools

Now that you know what you want to talk about, and who you want to talk to, it’s time to figure out how to bring it together. Each area will be broken down in greater detail in the following section.

  • Computer hardware (of course)
  • Broadcasting software
  • Audio/Video equipment
  • Streaming services
  • Miscellaneous gear

Computer Hardware

Since we’re specifically talking about live streaming software development-related topics, it follows that having a computer is required. You’ll want a decently capable computer with a moderate CPU, RAM, and GPU. Essentially, if it’s a decent gaming system, then it’ll be a decent devstreaming system. A good starting point would be a Ryzen 3600, 16GB RAM, 2 or more monitors, and an Nvidia RTX GPU*.

* The GTX 1650 Super is also a good choice – it’s built on the same Turing architecture as the RTX cards and encodes just as well as the RTX 2080 Ti, so don’t overspend. The other important factor is GPU memory, which affects the number of monitors you can connect.

Broadcasting Software

I’ve only experimented with a couple of apps, so this isn’t an exhaustive list – do your research as to which one suits you best.

Open Broadcaster Software (OBS) Studio

It’s free, has a lot of features, is cross-platform, and is very popular. This is an excellent app for getting started with streaming and is pretty straightforward. It has a thriving plugin ecosystem. There’s also an alternative version of OBS called StreamLabs OBS (or SLOBS), which streamlines the OBS interface and provides additional functionality.

vMix

vMix is not free, but it has many professional features, which makes it very well suited for professional broadcasts and personal live streaming. It’s got a steeper learning curve than OBS but is much more flexible and customizable. You can try it out free for 60 days, and after that, the basic HD version is just fine for live streaming and costs $60.

How to Choose?

I’d suggest starting with OBS to get acquainted with live streaming, and it’s sufficient for screen sharing and integrating a few cameras. But once you start trying more advanced tricks that are difficult to do in OBS, look into vMix (or other paid apps).

Audio and Video Equipment

At a minimum, you’ll need an audio input – there’s no point in live streaming without one! Hopefully, as we’ve all become remote workers, you’ve been provided with a headset by your company; that’s all you need to get started. The important part is that your voice is clear and intelligible, so if your coworkers can understand you during your daily standups, then you’re good to go.

A camera is not strictly necessary, but recommended – you want your audience to connect with you quickly, and your face on the stream facilitates that. An example would be giving a presentation in a room full of people; they’re focused on you just as much as your content. A webcam can fulfill both video and audio needs, but the webcam microphone will probably pick up a lot of ambient or background noise, so be sure to test all your audio sources to figure out which sounds the best.

If you have a camera, the next most important thing is lighting – make sure you have some strong neutral ambient light sources. If you don’t have a nearby window or are broadcasting at night, you may have to buy some key lights.

As your live streaming journey progresses, you’ll discover a whole new world of AV gear, but for now, stick to the basics before you start chasing that dragon. A headset and a webcam are sufficient to begin.

Streaming Services

Up to this point, we’ve addressed everything required to produce a live stream – now, we need a service to receive and distribute it.

  • Twitch.TV – The current popular service for live streamers of all kinds. Good interactivity with your audience. The negatives are that the video and audio quality (bitrate) is limited.
  • YouTube.com – YouTube is the gold standard of internet video distribution. There are no limitations on bitrate here – they’ll take as high a quality as you can provide. It’s also a great place to organize and store older streams. Live interaction with your audience is not as good as on other services.
  • Facebook – Very easy to get started, with good interactivity with your audience, but it can be hard to find an audience outside your immediate circle of friends. Limited bitrate dramatically affects video and audio quality.
  • Restream.IO – Restream is a site that receives your stream feed and simultaneously retransmits it to other services, like FB, YT, Twitch, etc. The free tier can handle up to 3 output streams—a great way to increase your reach for free.

Collaboration Apps

Everything that applies to streaming also applies to screen-sharing or teleconference apps! You can stream into applications like Microsoft Teams, Zoom, Slack, and Discord by either dedicating an entire display to the broadcast output and then screen sharing, or adapting the broadcast output as a webcam.

Miscellaneous Gear

These are optional tools that I’ve found useful for improving the ease and quality of a live stream.

HDMI Dummy Plugs

These little HDMI plugs trick your computer into thinking there’s a real display connected to it, which you can then use as a virtual display for screen captures. Sometimes, two screens aren’t enough to produce a live stream, so I plug one of these in, set the screen capture to the fake display, and composite my applications.

Elgato Stream Deck

These are great little devices that can be used to control your broadcasting software like OBS or vMix, execute macros, launch apps, etc. Each button is an LCD and can provide visual feedback to the state of your stream. Here’s my layout for when I’m streaming with vMix:

First row: CPU/GPU usage & temps
Second row: vMix inputs
Third row: vMix overlays, cough (hold to mute), Fade To Black (cuts feed)


Webaround Green Screen

The downside of putting yourself on camera is that you also have to put your background on camera – unless you get a green screen. These green screens by WebAround attach around your chair and provide the utility of a green screen without the messy setup.

Then you go into your broadcasting software, set the chroma key, and now you have a clean image of yourself that you can overlay on top of the screen share.

Producing a Live Stream

Giving a successful presentation is outside the scope of this article, but you can check out a blog from my co-worker, Andrea Pinillos, on top considerations when presenting at a conference. Below are some practical steps to follow when live streaming:

  1. Clean up your screens, and remove or hide anything that might be a distraction. As a viewer, seeing dozens of unorganized icons on the desktop brings me almost physical pain. Close all apps not needed for streaming or development. Prevent desktop notifications from popping up during the stream.
  2. If you’re displaying a browser window, have a clean one prepared ahead of time with no bookmarks or tabs.
  3. Test your broadcasting software ahead of the scheduled broadcast and give yourself enough time to fix any technical issues.
  4. If you have a smaller audience, take some time at the beginning of the live stream for small talk and connect.
  5. Switch things up occasionally (visually and aurally) to keep the audience engaged.
  6. Be mindful of your audience’s time. Avoid doing things that take a long time unless you have content to fill that gap. If possible, just like on a baking show, have a precooked result ready to go.
  7. When ending the stream, wrap it up coherently, summarizing everything discussed and the main points you wanted to convey.
  8. Solicit feedback and adjust accordingly.

Do’s and Don’ts

Lastly, here are some tips as you begin your live streaming journey:

DO: Show your face.
DON’T: Be a slob – check your appearance before you show your face.

DO: Be prepared and have an outline.
DON’T: Have any “dead air” where you’re waiting on something and have nothing exciting going on – have a precooked result and skip right to it.

DO: Place the camera above your screen so that you’re naturally looking at it while developing.
DON’T: Have a distracting background behind you – tidy it up or use a green screen to hide it.

DO: Know how to cut the audio and/or video feed if you need to cough or handle an interruption.
DON’T: Multitask – stay focused on the subject matter.

DO: Show your mistakes – nobody’s perfect, and showing how to recover from a mistake is also helpful.
DON’T: Trash other languages, frameworks, etc. Everybody has their own opinions and favorites. In general, avoid negativity.

DO: Have a drink handy – you’ll be talking a lot.
DON’T: Talk endlessly – give yourself and your audience breaks, at least once an hour.

Summary

So why stream? Because you’re most of the way there and you didn’t even know it.

You already have a good foundation of experience for it because you’ve been working from home for a few months. It doesn’t require a lot of specialized hardware, and your employer has most likely already provided you with the minimum (headset and/or webcam). All the software and services you need to get started are free. There are no technical or financial hurdles to start devstreaming.

But the real benefit of devstreaming is personal growth. By attempting to explain and teach a topic to others, you’ll learn it much more thoroughly – you end up teaching yourself. Every benefit that applies to public speaking also applies to streaming: you’ll learn to overcome stage fright, be more organized in your thought process and speech, and get better at communication – always a plus at work!

Coding is relatively easy – clearly conveying ideas is hard. But it’s a skill that you can master with practice – so get streaming!

Another new year, another Codemash in the books. Codemash is a fantastic family-friendly developer conference that brings in speakers and attendees from all over the world, with a wonderfully diverse range of topics and expertise. If there’s a hot new language or technology, it’ll be discussed here, in the sessions or in the hallways. Despite being held in Sandusky, OH in the middle of winter, it’s a blast for the whole family since it’s hosted at the Kalahari indoor waterpark resort.

Pro-Tips

This was the fourth Codemash I’ve been able to attend, so I’ve learned a few tricks. The pre-compilers are where you’ll actually learn the most – these are half- or whole-day sessions preceding the main conference, and allow you to deep-dive into a specific subject and get hands-on experience with expert guidance. Bring a laptop with enough power to play – you’ll be spinning up new VMs, IDEs, and hardware.

(Also, bring plenty of hand sanitizer, Emergen-C, and antacids – because The Crud seems to spread very fast at these indoor winter conferences!)

Networking

There are plenty of opportunities to expand your professional network – from the experts teaching the sessions, to fellow session attendees, to your dining table companions, to board-game players. I met a lot of great people doing interesting things from all over the Midwest and my neck of the woods – I even met local developers who had preceded me at my local office! Some were even familiar with AIS and our reputation for doing great work in the .NET/Azure space.

Session Recaps

Next, I’ll break down the most memorable sessions that I attended and the major takeaways from each.