I got the privilege of picking up the CI/CD task from the backlog. I have some background in building CI/CD pipelines with YAML, but that was mainly from the application test and deployment perspective.
Creating pipelines that also create and update resources in Azure was going to be entirely new for me.
Luckily, I had access to Copilot at work, so I could put it to the test a bit.
This was a recent project, and we wanted to take the opportunity to learn something new and build something future-proof, since there are plans to move from Azure DevOps to GitHub Enterprise.
The first task was understanding what infrastructure as code was. We had existing projects with pipelines built using Azure classic pipelines, as well as ARM or Terraform templates.
There was a lot of looking through the code, and Copilot was useful in some instances where I wanted a segment of code explained.
After I finally got my head around what the existing code did, I also had a better understanding of how we were organizing our resources in the new solution, since we were using the same patterns.
I had initially created a PR pipeline and a complete CI/CD pipeline that created the resources as well as built and deployed the code, but then I realized we did not need to update resources every time we release our app. So, I separated out the infra pipeline and triggered it only on changes to the infra folder.
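In Azure DevOps YAML, that separation can be expressed with a path filter on the trigger. A minimal sketch, assuming the Bicep files live in an `infra/` folder (the folder name and branch are placeholders):

```yaml
# azure-pipelines-infra.yml
# Runs only when something under infra/ changes on main.
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - infra/*
```

With this filter, commits that only touch application code no longer redeploy the infrastructure.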
It was straightforward to ask Copilot, for example, 'How would I add a task that creates new resources using Bicep?', and it produced ready-to-use snippets where I just needed to update names and so on.
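The kind of snippet you get back is typically a variation of the ARM template deployment task, which can deploy a `.bicep` file directly. A sketch, where the service connection, subscription variable, resource group, and file path are all placeholders:

```yaml
steps:
  # Deploys infra/main.bicep to an existing resource group.
  # Microsoft-hosted agents ship with the Bicep CLI, so the
  # .bicep file is transpiled automatically.
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-service-connection'
      subscriptionId: '$(subscriptionId)'
      resourceGroupName: 'rg-demo'
      location: 'westeurope'
      templateLocation: 'Linked artifact'
      csmFile: 'infra/main.bicep'
      deploymentMode: 'Incremental'
```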
There were cases where the code would not work straight off, and when I said, 'That didn't work', it would reply that the feature is not supported in the current version of Bicep and suggest an alternative.
To be fair, the Bicep syntax is simple to understand. I asked Copilot things like: I want to access an existing resource in a different resource group, and it provided the code for creating modules and specifying the scope. The same went for accessing secret values and connection strings: it said to use modules with secure parameters, complete with code samples and an explanation.
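Roughly, that pattern looks like the following sketch. The vault name, resource group, module file, and secret name are all made up for illustration:

```bicep
// main.bicep
// Reference a Key Vault that lives in a different resource group.
resource kv 'Microsoft.KeyVault/vaults@2023-07-01' existing = {
  name: 'kv-shared'
  scope: resourceGroup('rg-shared')
}

// Pass a secret into a module; getSecret can only be used to
// feed a module parameter marked @secure().
module app 'app.bicep' = {
  name: 'appDeploy'
  params: {
    sqlConnectionString: kv.getSecret('sql-connection-string')
  }
}
```

The receiving `app.bicep` then declares `@secure() param sqlConnectionString string`, which keeps the value out of deployment history and logs.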
I became quite reliant on it, because all the code it provided worked right off the bat. That is, until I wanted to programmatically add key-values to Azure App Configuration.
First it provided some code that simply failed when deployed. Then I told it that it failed, and it claimed this was not supported in ARM or Bicep at this time, which I found hard to believe, since I could do it for Key Vault.
By this time, it was the tail end of the day and my brain was no longer braining.
I stopped for the day and asked again the next morning, and it spat out some Bicep code that worked. I think if I actually knew what I was doing, I probably would have seen that right away. I tried googling it, and the answer was one of the results above the fold.
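For the record, key-values are a child resource of the configuration store, so this is expressible in Bicep. A sketch, with the store name, key, and value made up:

```bicep
// Assumes an App Configuration store named 'appcs-demo' already
// exists in the target resource group.
resource configStore 'Microsoft.AppConfiguration/configurationStores@2023-03-01' existing = {
  name: 'appcs-demo'
}

// Adds (or updates) a single key-value in the store.
resource greeting 'Microsoft.AppConfiguration/configurationStores/keyValues@2023-03-01' = {
  parent: configStore
  name: 'Greeting'
  properties: {
    value: 'Hello from Bicep'
  }
}
```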
The lesson I took from that is that Copilot is great for the initial learning phase, but once you get an idea of how things work, it's sometimes better to use Google, so you find the documentation that helps you do the thing you want to do, with up-to-date information. And if you just want to do some monkey typing, the inline Copilot feature is actually useful for that. The chat Copilot is a bit smarter than the inline one, or at least I've heard people comment on that.
Overall, it was a fantastic way to learn something new. It really is like having someone to pair program with who will not be offended if you interrupt them while they explain why something is the way it is.
It was also extremely helpful for figuring out the LINQ statement that would get the right shape of data out of a bunch of collections.
There are a lot of people saying that we programmers are going to lose our jobs very soon. As someone put it, that will probably be true when customers are able to perfectly describe their requirements.
Jokes aside though, at its current stage it's very much a tool: you still need to know and understand its output, otherwise you end up in a lot of brute-force back and forth of 'no, that didn't work' and 'here, try this', which wastes a lot of time. It should also be just one of many tools you use. Search is still the best place to go for up-to-date content.
I’m very excited to see where the technology is headed and how my workflow evolves with it.
