I’m delighted to say that I passed AZ-400 Designing and Implementing Microsoft DevOps Solutions on Monday, November 23rd 2020. After MS-500, this was “fun”, if such a thing can be said about a Microsoft exam.
My method is to create a OneNote section for each exam I target, then build lists of links to training along with the estimated time each course says it will take. The handy thing about the learning paths and profile page is that the remaining time counts down as you complete sections, so it feels like you are making progress. When I feel I am on the home stretch I book the exam, leaving however much time I feel I need. With this exam I’ve had the benefit this year of supporting an iOS mobile application development project on the Microsoft platform, which really helped me to understand the build and distribution aspects.
I make sure that I do every exercise I can, and it was excellent to see how good GitHub is – I’m quite old and have worked with Microsoft technologies since Visual SourceSafe, long before the Microsoft acquisition of GitHub. The training that GitHub offers is really good for a free-to-use resource.
Other than that, my take is that this is a wide-ranging exam: it is essentially DevOps where one part of the solution is a Microsoft product. This means it pivots – you won’t necessarily be using a “Microsoft” build solution like Azure DevOps or GitHub to build your solution just because you are developing in Visual Studio, and so on.
I enjoyed the training for this exam as it is heavy on automation but also straightforward to follow along with the setup I have. I’m fortunate to have a Visual Studio subscription allocated to me as part of working for a Microsoft Partner, and the tooling and Azure subscription that come with it were essential in completing a number of the exercises on Microsoft Learn, following along in VS Code, GitHub, Azure DevOps and Azure to build solutions.
I’m up to date with my other Azure exams, so I also achieved an Expert certification with this pass, which is a nice feeling!
Being male, I’m probably obsessed with fiddling when I could spend my time doing something productive. One obsession is car tuning (I spent more than the value of my first car on “performance parts”), which I’ve slowly cured over the years – I now refuse to modify my cars.
Funnily enough, one thing that has made a real difference to my internet performance now that I have broadband is to ignore the popular opinion on websites, actually benchmark my home DNS performance, and then stick with the winner.
In my case I’m fortunate to have a decent router with a caching DNS service (rather than a simple pass-through), and I’ve set it to query the fastest DNS server I can reach on my connection. All clients on my LAN point to the DNS server on my router by receiving the setting through DHCP.
Rather than blindly pointing to Google or Cloudflare, please benchmark your performance using a DNS performance tool. As I’m very old I like to use GRC’s DNS Benchmark. Yes, I’m a Windows user, so I’ve probably excluded a bunch of readers, but for the rest of us it is a simple .exe and it creates an INI file (remember those?) when you create a custom resolvers list.
If you are like me and don’t live in the US, then run the program (this is what apps used to be called), create a custom resolver list, then run a benchmark and adjust your network accordingly.
Read it and weep – in my case Cloudflare takes ten times as long to resolve a cached DNS lookup as my router, and Google thirty times as long. Unfortunately, the cliché holds: the fastest DNS is the one provided by the vendor of my connection. Your mileage may vary, which is why you should test.
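If you want a rough, scriptable version of the same comparison, the idea can be sketched in a few lines of Python using only the standard library (this is my illustration, not GRC’s tool – the resolver addresses in the comment are just examples; point it at your router and whichever public resolvers you are considering):

```python
import socket
import struct
import time

def build_query(hostname: str, txn_id: int = 0x1234) -> bytes:
    """Build a minimal DNS A-record query packet (RFC 1035 wire format)."""
    # Header: id, flags (recursion desired), 1 question, 0 answer/authority/additional
    header = struct.pack(">HHHHHH", txn_id, 0x0100, 1, 0, 0, 0)
    # QNAME: each label prefixed with its length, terminated by a zero byte
    qname = b"".join(bytes([len(p)]) + p.encode() for p in hostname.split("."))
    question = qname + b"\x00" + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def time_lookup(server: str, hostname: str, timeout: float = 2.0) -> float:
    """Return the round-trip time in milliseconds for one query to `server`."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        start = time.perf_counter()
        s.sendto(build_query(hostname), (server, 53))
        s.recvfrom(512)  # we only care how long the answer takes, not its content
        return (time.perf_counter() - start) * 1000

# Example (needs network access): compare your router against public resolvers
# for resolver in ("192.168.1.1", "1.1.1.1", "8.8.8.8"):
#     print(resolver, round(time_lookup(resolver, "example.com"), 1), "ms")
```

Run each lookup a few times and compare the warm (cached) results – a single cold query can be misleading.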
Now to deal with the challenge of DNS settings when you have failover between two ISPs!
Although I expected a decent result, I had the usual trepidation before the exam and woke up really early on the day. The nerves built naturally as the date approached, and in the days leading up to the exam I noticed that its length was the longest I have seen at 210 minutes. (Part of my preparation method is to put an appointment in my diary for the exam and location – I’ll write this up as part of my exam prep post one day!) The length got me thinking about labs, and this was confirmed when I got the announcement at the beginning of the sitting that it included one lab.
A “lab” is a practical test of your skills on a particular subject. Although it’s getting on for a year or so since Microsoft Azure exams started including labs, so far I’d not had any, and I was a bit nervous.
I progressed through the various sections steadily and kept an eye on the clock. I’d read a few horror stories of candidates’ timekeeping going awry and them running out of time. As it was, I didn’t get too bogged down and proceeded at my usual pace. The curve of dread was quite amusing (in hindsight): it peaked about a third of the way into the exam as I got a bit stressed at what I didn’t know, then settled down as I progressed through the questions and encountered elements I was confident in.
The practical test came at the end, I had over two hours left, and I actually began to enjoy that part. I’ll admit that I just used the portal to complete my activities, but I was reassured that the directions given acknowledged that certain parts would take time to complete and that I could progress with the other tasks while it waited. I’m fortunate that my “day job” has a lot of hands-on work and I’m logged in to an Azure subscription almost every day (after elevating my permissions through Privileged Identity Management!). I applied the same deliberate pace and double-checked each setting and user. If I were to build a test system against a live portal, I imagine the marking process would interrogate the Azure Resource Manager model to check that the configuration had been carried out correctly. This is just the same as naming-convention and Azure Policy checking, so at each pivotal point I paused and made sure that I was reading things correctly – just like following a technical design. In a real-life situation I would also use scripting as a confirmation step, but I took a pragmatic approach with the tools I had.
I was ambivalent at the end (it doesn’t do to be overconfident), and the lab introduced one final twist. I clicked the Finish Exam button and the response came back almost immediately:
‘Thank you for taking this Microsoft Certification exam. Your test results will be available once scoring is complete. You may exit the exam now without affecting the scoring process by clicking the “End” button. Your score report will be available online in your Microsoft Learning dashboard at www.microsoft.com/learning/dashboard’
Talk about an anti-climax – it even sent the message to the printer (the chap at the test centre asked if I really wanted to keep it!). So I was left a little high and dry; while in limbo I decided to get the bus back to the office, so I collected my stuff from the locker and fired up my work phone for the colleague support network on Microsoft Teams!
Anyway, to wind forward: I was about ten minutes into my bus journey when the congratulatory email came through on my phone and I was able to see my score report. Although it doesn’t really matter, the score was a good 100 points over the pass mark, which I’m happy about as it’s content I should know in my day job.
My thoughts on the exam – here’s a summary without any NDA-busting:
Like the admin exam, the exam outline calls out the Azure services that will be covered, and these are what appear in the exam. Inevitably this is not everything the extensive platform provides, which is a relief!
The exam has good coverage of the built-in protection in Virtual Networks and Azure AD. Unlike the real world, where you might have federation or Network Virtual Appliances in the mix, this exam rightly focuses on the “out of the box” provision.
Time management is crucial in giving yourself space to address the lab. That said, my first lab was a really good experience – it was actually the easiest part of the whole exam to understand and answer, as it covered things I do almost every day. The flip side was that the single lab took me as long to complete as all the other sections did to answer.
And finally, as well as building on the other hands-on work (and exams), the preparation material I used for this exam was:
Skylines Academy AZ-500 Course – Nick Colyer’s course on Udemy has good step-by-step coverage of most of the content. As ever, remember to follow along in your own portal. I bought it months ago during one of the regular sales on the platform.
Skylines Academy AZ-500 Practice Questions – this came through as I was in the latter stages of preparation. About 60-odd questions and a good way to shake off exam fatigue. Not a huge number, but again so cheap that it was a no-brainer to further my learning.
Of course you should spend lots of time in the Azure documentation, as it is an awesome reference and gets lots of feedback through GitHub. I also found a Pluralsight path for AZ-500, but at a total of 42 hours there was no way I would be able to cover it all in the time I wanted to spend.
As part of my work in Azure architecture and operations we make extensive use of Azure DevTest Labs, as they are a useful way to facilitate end-user compute for advanced users like developers and data scientists.
For that we tend to use the Azure Data Science Virtual Machine, as it includes a whole bunch of tools that cover 90% of our end-user needs, and it is very easy to provide secure, self-service access while maintaining control and managing the demand on our small team.
Recently I was preparing a lighter machine based on a Windows 2016 image with just the tools we required for six months of Python-related development. Many of the sample artifacts make use of Chocolatey, which is really handy for deploying applications as there is a great library of packages.
I developed and tested the artifact set last week, but when it went to initial UAT it failed with: ERROR: Exception calling “DownloadString” with “1” argument(s): “The request was aborted: Could not create SSL/TLS secure channel.”
I traced this to the Ensure-Chocolatey function, specifically the line that downloads and runs install.ps1. Hunting around the internet led me to a discussion about TLS versions and the fact that WebClient defaults to TLS 1.0. I wasn’t able to confirm this in the environment I had, but I was able to check the TLS configuration of the Chocolatey endpoint using SSL Labs, i.e. https://www.ssllabs.com/ssltest/analyze.html?d=chocolatey.org
This indicated that the server the machine was connecting to was only accepting TLS 1.2 and above. I forced the script to use TLS 1.2 by adding [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12 above the WebClient call, and this fixed the issue for the time being.
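The same idea applies in other stacks: pin the client’s minimum protocol version rather than relying on a legacy default. As a hedged illustration (this is not the DevTest Labs artifact itself), here is the equivalent in Python, along with a small probe you could use as a poor man’s SSL Labs to check which versions a server will accept:

```python
import socket
import ssl

def make_tls12_context() -> ssl.SSLContext:
    """Build a client context that refuses anything below TLS 1.2,
    mirroring the ServicePointManager fix in the PowerShell artifact."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def server_accepts(host: str, version: ssl.TLSVersion, port: int = 443) -> bool:
    """Attempt a handshake pinned to exactly one TLS version; False on failure."""
    try:
        ctx = ssl.create_default_context()
        ctx.minimum_version = version
        ctx.maximum_version = version
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ValueError, ssl.SSLError, OSError):
        # ValueError covers builds of OpenSSL that no longer allow old versions
        return False

# Example (needs network access):
# print(server_accepts("chocolatey.org", ssl.TLSVersion.TLSv1_2))
```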
I’m happy to say that I’ve passed AZ-900 as part of my employer’s initiative to have everyone go through the Azure Fundamentals exam. This is a recognition that cloud is a core part of their business.
My thoughts? I perhaps underestimated the exam, and although I passed well I didn’t ace it. I’ve scored higher in other “harder” exams, so I’d recommend what I try to tell myself: look through the actual product being tested (the Azure portal features), and if you want to score higher you’ll have to remember some of the detail of features and charging structures. I think the classic learning tips of What? How? When? for each exam objective will serve you well.
I’m beginning to realise that all of the exams are treated seriously and a pass (even for Fundamentals) actually means something. Respect to my non-technical colleagues, and a little nudge to myself to treat things seriously!
I’m really happy to say that I (finally) passed 70-339 Managing Microsoft SharePoint Server 2016 on Friday after a couple of failed attempts. This was my 32nd exam pass, and my first-time pass percentage is quite high, mainly because I tend to be very careful about booking exams only when I think I am well and ready. So what was different this time?
1. I didn’t respect the exam
I think a run of first-time passes made me a little complacent, and I relied too much on the good results I got with the official practice exam. I should have remembered how hard I found the breadth of the previous generation of SharePoint exams and thought about the implications of a single exam for the whole product (there used to be two administrative exams for each version of SharePoint). I fell short and should have thought harder about the implications of the elements in the exam outline.
Having the product in front of you to try things out is also a proper lesson well remembered.
2. Study and exams don’t exist in a bubble
When I failed the first time I took the standard approach and booked for a couple of weeks later, on the basis that my fail mark was just short of the required pass mark. Then some family stuff came up, which meant that I didn’t get a lot of sleep the night before the exam and had a lot on my mind. This happens and there isn’t a lot that can be done; life is unpredictable, and it’s important to work to live rather than get things the wrong way around. Reflecting on this made me think about my attitude during preparation and what techniques and methods might help with all the aspects of my life.
3. Sit exams when you know stuff
This inelegant heading refers to my experience that sitting exams on subjects that directly relate to your day job is so much easier than sitting others. I’ve not worked daily with SharePoint 2016 since my last job, and even then the work was focused on a narrow band of deployment. Both this exam and 70-532 Azure development were tough because I didn’t have the day-to-day depth in the subject area that I have with Azure architecture and administration. Stretch targets are good, but they need the work.
4. Sit exams when they are current
What I mean by this is that there is a natural curve to an exam’s lifetime. Some Microsoft exam areas are particularly current, like the Azure administration and architecture exams, and apart from tweaks to the platform they will stay active and up to date. I think the perfect set of circumstances is a year or so after an exam goes live in a technology that is in wide use. Contrast this with 70-339, which has been available since mid-2016 and relates to a product that has undergone a fundamental change in delivery – most users of SharePoint will now use the online product.
Like my car driving test (I love driving!), sometimes I have to work hard to achieve something, and sticking at it is a test of personality. Unfortunately, due to what must be a bit of a personality defect, it can take a couple of fails for me to realise that I have to buckle down and examine my strategy. In the case of 70-339 I waited a month or two after my second fail to have a think, see how things were going and take a bit more time out. In what I think resembles a classic retry pattern, I introduced a delay. Of course, in development the delay would be a bit more regular in nature, but hopefully you get my point.
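For the developers reading, the pattern I’m alluding to looks something like this – a minimal sketch of retry with an increasing delay (the function name is mine, not from any particular library):

```python
import time

def retry_with_backoff(action, attempts=3, base_delay=1.0):
    """Call `action` until it succeeds, waiting longer after each failure.

    Re-raises the last exception if every attempt fails.
    """
    for attempt in range(attempts):
        try:
            return action()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure
            # Exponential backoff: 1s, 2s, 4s, ... (my exam "delay" was months)
            time.sleep(base_delay * (2 ** attempt))
```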
I changed jobs just over a year ago and an awesome part of my new Employer’s approach to staff development is that they send members of the team to conferences and training.
I had the privilege of attending a boot camp for cloud architects in Bellevue, run by Microsoft for their Partners and also had a day in between travelling from Scotland to Seattle to have a look at the city with my colleagues.
We basically got lost after visiting Pike Place Market; we thought we were heading towards Lake Union but hadn’t read the map quite correctly. So we looked around, spotted the Space Needle fairly nearby (it’s a bit of a spottable landmark) and headed towards that.
The majority view was that we didn’t want to spend the money on going up the Space Needle, so we went next door into MoPOP, the Museum of Pop Culture. As I discovered, this has a fairly significant connection to Microsoft: I saw one exhibit after another from the Paul G. Allen collection, and it slowly dawned on me that a founder of the museum was also a co-founder of Microsoft.
This place is amazing; it starts with the swoopy architecture which has a monorail bursting through it. Then the inside is all modern clean lines with doors and stairs leading to themed exhibits.
There are closed-off exhibitions behind doors that cover themes like films or people, and open areas that open out to the full height of the museum.
I failed a Microsoft exam last Friday – yes, it’s true, on occasion I fail an exam. One fantastic attitude (amongst many) at my current employer is that a Microsoft exam fail is part of the journey of discovery. A couple of my new colleagues also remark that any significant score over the pass mark is a waste of study time, and I can kind of see where that comes from.
If you’ve booked exams in the last few years you will have been informed of the latest retake policy, which has been tweaked and firmed up to give candidates a proper chance between resits and to stop attempts being brute-forced. At the time of writing the exam retake policy states:
If a candidate does not achieve a passing score on an exam the first time, the candidate must wait at least 24 hours before retaking the exam.
This time I was particularly keen to book my resit as soon as possible. The practicalities of availability in Edinburgh meant that I was expecting to have to wait at least a couple of weeks for a slot, so I didn’t expect the 24 hours to be a problem. I went through the exam details page, clicked “Schedule Exam”, confirmed my details and the link-accounts page, and got redirected back to the same page with a light yellow banner: “50055: This exam is not currently offered. Please select another exam.”
So I tried a few different ways without success: InPrivate browsing, different devices – all gave the same error. I tried telephoning, only to be told that I would have to wait 24 hours to book. So I waited 24 hours after the end of my exam and still couldn’t book.
I was finally able to rebook through the Pearson VUE site at 18:30 on the Monday after my Friday exam; the exam had been scheduled to end at 12:30 (my times are BST). The half hour seems more than coincidental, and the takeaway is that the systems will prevent the booking from taking place until at least a certain number of hours have passed on business days.
I don’t fail exams often and I certainly don’t plan to, and hopefully you don’t either. So when the unthinkable happens don’t panic and take time to regroup and make plans. And wait a day and a bit before you try to rebook!
An interesting one today – while standing up a Cosmos DB instance to record the output of a CycleCloud job run (which happened to be written in C++), I started getting “Failed to read item”. Data Explorer stopped showing the results for the item when browsing.
The issue was that our new id had been delimited with slashes, and Cosmos DB didn’t like it. If you get “Failed to read item” when clicking through, then you might have a character in your document id that Cosmos doesn’t allow.
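A cheap guard against this is to validate ids before the document is ever written. To the best of my knowledge the Cosmos DB documentation lists ‘/’, ‘\’, ‘?’ and ‘#’ as illegal in the id property, with a 255-character limit; a small sanity check (the helper name here is mine) looks like:

```python
# Characters the Cosmos DB docs list as illegal in a document id.
FORBIDDEN_ID_CHARS = set('/\\?#')

def is_valid_cosmos_id(doc_id: str) -> bool:
    """Return True if `doc_id` is safe to use as a Cosmos DB document id:
    non-empty, at most 255 characters, and free of the forbidden characters."""
    return (
        0 < len(doc_id) <= 255
        and not any(ch in FORBIDDEN_ID_CHARS for ch in doc_id)
    )
```

In our case a slash-delimited id like “jobs/42/output” would have failed this check before ever reaching the database.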
There are some awesome folks out there who share their hard work so the rest of us can have an easier job. A few resources that have been really useful to me are built around working against the REST APIs of key Azure services.
My days of day-in, day-out development are over, so a lot of my automation “glue” for mashing up deployments relies on PowerShell with the odd bit of CLI. Most of it is a little scaffolding to deploy ARM templates, but occasionally a requirement to work with the data plane of a resource appears and I have to resort to manual configuration.
ARM template support for configuring resources is always improving, but due to timing this isn’t always an option. Sometimes it is really helpful to understand what is going on under the hood, and sometimes the only option is REST.
For the latter I thoroughly recommend Postman if you need to interact directly, though Azure is also improving its native API-exploring support. I discovered Postman through an Azure Friday video with Steven Lindsay, who has some really useful modules on GitHub. This is really helpful for Cosmos DB (DocumentDB as it was) and really helped me debug some Gremlin issues.
Next is the PowerShell module for CosmosDB, which sits over the REST API and, as well as being an awesome example of its kind, is also a really helpful module for checking interactions with Cosmos DB.
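To give a flavour of what these modules wrap for you: every Cosmos DB REST call needs an Authorization header derived from the account’s master key. A sketch of the documented master-key signing scheme, here in Python rather than PowerShell (enough to see what the module does on your behalf, not production code):

```python
import base64
import hashlib
import hmac
import urllib.parse

def cosmos_master_key_token(verb: str, resource_type: str,
                            resource_link: str, date_rfc1123: str,
                            master_key_b64: str) -> str:
    """Build the Authorization header value for a Cosmos DB REST request,
    following the documented master-key HMAC-SHA256 scheme."""
    # Payload: lowercase verb, resource type and date, plus the resource link
    payload = "{}\n{}\n{}\n{}\n\n".format(
        verb.lower(), resource_type.lower(), resource_link, date_rfc1123.lower())
    key = base64.b64decode(master_key_b64)
    signature = base64.b64encode(
        hmac.new(key, payload.encode("utf-8"), hashlib.sha256).digest()).decode()
    # The whole token string is URL-encoded before going into the header
    return urllib.parse.quote("type=master&ver=1.0&sig=" + signature, safe="")

# Example shape (the key below is a made-up base64 string, not a real account key):
# token = cosmos_master_key_token(
#     "GET", "docs", "dbs/db1/colls/c1/docs/doc1",
#     "Tue, 01 Jun 2021 10:00:00 GMT",
#     base64.b64encode(b"not-a-real-key").decode())
```

The same date string must also be sent in the x-ms-date header, which is one of the fiddly details the PowerShell module quietly handles for you.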