I’m pleased to say that I recently passed DP-201: Designing an Azure Data Solution. I did this a week after the DP-200 exam and gained my Microsoft Certified: Azure Data Engineer Associate Certification as a result.
As I mentioned in my post about DP-200, the learning paths listed on the exam pages for DP-200 and DP-201 are identical, but one thing I discovered after DP-200 is that the certification page has additional learning paths which were helpful in augmenting my knowledge.
I found the exam slightly easier going, and my score (which means little) was higher than for my DP-200 pass. I think this is fair on a number of levels; I don’t work with Azure data every day, so implementation was always going to be the tougher part.
I’m pleased to say that I recently passed Microsoft Exam DP-200: Implementing an Azure Data Solution. Although a new single exam is in the wings, I was ready to progress and had read that one benefit of the “current” Data Engineering Associate exams was that the renewal period would still be two years rather than the new style of one.
As you’ll read on, I sat DP-201 the week after this exam. The material available on Microsoft Learn was the same for both, so there was little more to be gained in the time.
Although I expected a decent result, I had the usual trepidation before the exam and woke up really early on the day. The nerves built naturally as the date approached, but in the days leading up to it I noticed that the exam, at 210 minutes, was the longest I have seen. (Part of my preparation is to put an appointment in my diary for the exam and location – I’ll write this up as part of my exam prep post one day!) The length got me thinking about labs, and this was confirmed by the announcement at the beginning of the sitting that the exam included one lab.
A “lab” is a practical test of your skills on a particular subject. Although Microsoft Azure exams have included labs for getting on for a year now, I hadn’t encountered one so far and was a bit nervous.
I progressed through the various sections steadily and kept an eye on the clock; I’d read a few horror stories of candidates’ timekeeping going awry and them running out of time. As it was, I didn’t get too bogged down and proceeded at my usual pace. The curve of dread was quite amusing (in hindsight): it peaked about a third of the way into the exam as I got a bit stressed at what I didn’t know, then settled down as I encountered elements I was confident in.
The practical test came at the end, and with over two hours left I actually began to enjoy that part. I’ll admit that I just used the portal to complete my activities, but I was reassured that the directions given acknowledged that certain parts would take time to complete and that I could progress with other tasks while I waited. I’m fortunate that my “day job” involves a lot of hands-on work and I’m logged in to an Azure subscription almost every day (after elevating my permissions through Privileged Identity Management!). I applied the same deliberate pace and double-checked each setting and user. If I were to build a test system against a live portal, I can imagine the kind of process I would use: interrogating the Azure Resource Manager model to check that configuration had been carried out correctly. This is just the same as naming-convention and Azure Policy checking, so at each pivotal point I paused and made sure that I was reading things correctly – just like following a technical design. In a real-life situation I would also use scripting as a confirmation step, but I took a pragmatic approach with the tools I had.
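The scripted confirmation step I mention boils down to diffing the design against what Resource Manager reports back. Purely as a toy illustration (the property names and values here are made up, not a real ARM response):

```python
def config_drift(desired: dict, actual: dict) -> dict:
    """Return the settings whose deployed value differs from the design,
    as {setting: (designed_value, deployed_value)}."""
    return {k: (v, actual.get(k)) for k, v in desired.items() if actual.get(k) != v}

# Hypothetical design vs what a deployment reported back:
design = {'httpsOnly': True, 'minTlsVersion': '1.2'}
deployed = {'httpsOnly': True, 'minTlsVersion': '1.0'}
print(config_drift(design, deployed))  # {'minTlsVersion': ('1.2', '1.0')}
```

In practice the "actual" side would come from querying Resource Manager, but the principle is the same: a mechanical check beats eyeballing the portal.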
I was ambivalent at the end (it doesn’t do to be overconfident), and the lab introduced another twist. I clicked the Finish Exam button and the response came back almost immediately:
‘Thank you for taking this Microsoft Certification exam. Your test results will be available once scoring is complete. You may exit the exam now without affecting the scoring process by clicking the “End” button. Your score report will be available online in your Microsoft Learning dashboard at www.microsoft.com/learning/dashboard‘
Talk about an anti-climax – and it even sent the message to the printer (the chap at the test centre asked if I really wanted to keep it!). So I was left high and dry; while in limbo I decided to get the bus back to the office, collected my stuff from the locker and fired up my work phone for the colleague support network on Microsoft Teams!
Anyway, to wind forward: I was about 10 minutes into my bus journey when the congratulatory email came through on my phone and I was able to see my score report. Although it doesn’t really matter, the score was a good 100 points over the pass mark, which I’m happy about as it’s content I should know in my day job.
My thoughts on the exam – here’s a summary without any NDA busting:
Like the admin exam, the exam outline calls out the Azure services that will be included, and those are what you will see in the exam. Inevitably this is not everything the extensive platform provides, which is a relief!
The exam has good coverage of the built-in protection in Virtual Networks and Azure AD. Unlike the real world, where you might have federation or Network Virtual Appliances in the mix, this exam rightly focuses on the out-of-box provision.
Time management is crucial in giving yourself space to address the lab. That said, my first lab was a really good experience – it was actually the easiest part of the whole exam to understand and answer, as it covered things I do almost every day. The flip side was that the single lab took me as long to complete as all the other sections put together.
And finally, as well as building on other hands-on work (and exams), the preparation material I used for this exam was:
Skylines Academy AZ-500 Course – Nick Colyer’s course on Udemy has good step-by-step coverage of most of the content. As ever, remember to follow along in your own portal. I bought it months ago during one of the regular sales on the platform.
Skylines Academy AZ-500 Practice Questions – this came through as I was in the latter stages of preparation. About 60-odd questions and a good way to shake off exam fatigue. Not a huge number, but again so cheap that it was a no-brainer to further my learning.
Of course you should spend lots of time in the Azure documentation, as it is an awesome reference and gets lots of feedback through GitHub. I also found a Pluralsight path for AZ-500, but at a total of 42 hours there was no way I would be able to cover it all in the time I wanted to spend.
As part of my work in Azure architecture and operations we make extensive use of Azure DevTest Labs, as they are a useful way to facilitate end-user compute for advanced users like developers and data scientists.
In that we tend to use the Azure Data Science Virtual Machine, as it includes a whole bunch of tools that cover 90% of our end-user needs, and it is very easy to provide secure access with a self-service element and maintain control while managing the demand on our small team.
Recently I was preparing a lighter machine based on a Windows 2016 image with just the tools we required for six months of Python-related development. Many of the sample artifacts make use of Chocolatey, which is really handy for deploying applications as there is a great library of packages.
I developed and tested the artifact set last week, but when it went to initial UAT it failed with: ERROR: Exception calling “DownloadString” with “1” argument(s): “The request was aborted: Could not create SSL/TLS secure channel.”
I traced this to the Ensure-Chocolatey function, specifically the line that downloads and runs install.ps1. Hunting around the internet led me to a discussion about TLS versions and the fact that WebClient defaults to TLS 1.0. I wasn’t able to confirm this in the environment I had, but I was able to check the TLS support of the Chocolatey endpoint using SSL Labs, i.e. https://www.ssllabs.com/ssltest/analyze.html?d=chocolatey.org
This indicated that the server the machine was connecting to was only accepting TLS 1.2 and above. I forced the script to use TLS 1.2 by adding [Net.ServicePointManager]::SecurityProtocol = “tls12” above the WebClient call, and this fixed the issue for the time being.
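For what it’s worth, the same pinning idea expressed in Python (the language these lab machines were ultimately being built for): create a client context with a TLS 1.2 floor. This is a general sketch of the technique, not part of the Chocolatey artifact itself:

```python
import ssl
import urllib.request

# Build a client context that refuses anything below TLS 1.2,
# mirroring the ServicePointManager fix in the PowerShell artifact.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# The context is then attached to an opener; any HTTPS request made
# through it will negotiate TLS 1.2 or better, or fail outright.
opener = urllib.request.build_opener(
    urllib.request.HTTPSHandler(context=context)
)
```

The nice property of pinning at the context/protocol level is that every download in the script inherits it, rather than fixing each call site.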
I’m happy to say that I’ve passed AZ-900 as part of my employer’s initiative to have everyone go through the Azure Fundamentals exam. This is a recognition that cloud is a core part of their business.
My thoughts? I perhaps underestimated the exam, and although I passed well I didn’t ace it. I’ve scored more in other “harder” exams, so I’d recommend what I try to tell myself: look through the actual product being tested (the Azure portal features), and if you want to score higher you’ll have to remember some of the detail of features and charging structures. I think the classic learning tips of What? How? When? for each exam objective will serve you well.
I’m beginning to realise that all of the exams are treated seriously and a pass (even for Fundamentals) actually means something. Respect to my non-technical colleagues, and a little nudge to myself to treat things seriously!
Interesting one today – standing up a Cosmos DB instance to record the output of a CycleCloud job run (which happened to be written in C++) when we started getting “Failed to read item”. Data Explorer stopped showing the results for the item when browsing.
The issue was that our new id had been delimited with slashes, and Cosmos DB didn’t like that. If you get “Failed to read item” when clicking through, you might have a character in your document id that Cosmos doesn’t accept.
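The Cosmos DB naming rules disallow ‘/’, ‘\’, ‘?’ and ‘#’ in an item id. A tiny guard like this (a hypothetical helper of my own, not anything from the job code) would have saved us the debugging session:

```python
# Characters that Cosmos DB rejects in a document id
# ('/', '\', '?', '#') per the resource naming rules.
INVALID_ID_CHARS = set('/\\?#')

def safe_cosmos_id(raw: str, replacement: str = '-') -> str:
    """Return an id with the characters Cosmos DB rejects swapped out."""
    return ''.join(replacement if ch in INVALID_ID_CHARS else ch for ch in raw)

# A job-run id delimited with slashes, like the one that bit us:
print(safe_cosmos_id('run/2019-05/node-3'))  # run-2019-05-node-3
```

Sanitising at write time is much cheaper than discovering the problem later, because an item with a bad id can be created but then fails on the point reads that Data Explorer relies on.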
There are some awesome folks out there who share their hard efforts so the rest of us can have an easier job. A few of the most useful to me centre on working against the REST APIs of key Azure services.
My days of day-in, day-out development are over, so a lot of my automation “glue” mashing up deployments relies on PowerShell with the odd bit of CLI. Most of it is a little scaffolding to deploy ARM templates, but occasionally a requirement to work with the data plane of a resource appears and I have to resort to manual config.
ARM template support for configuring resources is always improving, but due to timing this isn’t always an option. Sometimes it is really helpful to understand what is going on under the hood, and sometimes the only option is REST.
For the latter I thoroughly recommend Postman if you need to interact directly, though Azure is also improving its native API exploration support. I discovered Postman through an Azure Friday video with Steven Lindsay, who has some really useful modules on GitHub. This is really helpful for Cosmos DB (DocumentDB as it was) and really helped me debug some Gremlin issues.
Next is the PowerShell module for Cosmos DB, which sits over REST and, as well as being an awesome example of its kind, is also a really helpful module for checking interactions with Cosmos DB.
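When I do have to drop to raw REST, the calls all have the same shape: a bearer token from Azure AD plus a JSON body. A minimal sketch of assembling such a call – the endpoint, token and body below are placeholders for illustration, not a real resource:

```python
import json
import urllib.request

def build_request(url: str, token: str, body: dict) -> urllib.request.Request:
    """Assemble an authenticated Azure REST call; nothing is sent yet."""
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode('utf-8'),
        headers={
            'Authorization': f'Bearer {token}',   # token from Azure AD
            'Content-Type': 'application/json',
        },
        method='PUT',
    )

# Placeholder values purely for illustration:
req = build_request(
    'https://management.azure.com/subscriptions/.../resource?api-version=2019-05-01',
    '<token-from-azure-ad>',
    {'location': 'uksouth'},
)
```

Separating "build the request" from "send it" also makes the glue easy to check in isolation, which is exactly what tools like Postman and the Cosmos DB module help with.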
Kubernetes, and AKS in particular, is becoming more and more important to us at work. In our experimental facility we have to stand up varying compute platforms; my main project is examining a specific workload on HPC, and part of it needs Kubernetes for some supporting work.
Then I stumbled across a blog by Chris Johnson. I’ve met Chris (officially a “good guy”) exactly twice in person: once in 2010 in Berlin at an Ignite session (when Ignite was a smaller-scale affair) for SharePoint 2010, where he presented a session on Microsoft Certified Master, and again at Ignite in Orlando last year, when I made a point of catching him before he presented a session of the Microsoft Cloud Show with Andrew Connell (also officially a “good guy”) and Julia White (yes, that Julia White).
Anyway, this is one of those posts which is as much for my benefit as yours!
Working in the Microsoft cloud ecosystem (OK, Azure) and working for a Microsoft Partner steers me heavily towards the tools that the vendor provides. This works on a number of levels, mainly around depth of knowledge, and personally it means getting ready for the next exam.
For code and script storage this means Azure DevOps and GitHub. The choice has got harder lately due to the tweak to the “free” tier on GitHub and private repos, but we all love Azure DevOps because of Pipelines and all the other features, even though my primary day-to-day use is as a Git repo.
Of course I’ve been using Visual Studio for years and the online version for as long as it has existed. The rebrand to Azure DevOps also brought a new URL option, going from <org>.visualstudio.com to dev.azure.com/<org>, and the latter has created some new joy. I really recommend multi-factor authentication and love using the latest and greatest tech from Microsoft, including its security features, as that’s about the only way to keep up with the threats we face out there on the internet.
Of course it comes back to bite you from time to time, and this morning has been a classic case. The current Git for Windows release is 2.21.0, but a key component for me as a multi-factor-protected user of Azure AD and Azure DevOps is the Git Credential Manager for Windows, and there are a bunch of fixes relating to the new dev.azure.com URL in version 1.19. Unfortunately, Git for Windows 2.21.0 bundles an older release of the Git Credential Manager for Windows, so you’ll need to install in strict order to get this the correct way round.
My symptoms included the following:
No prompt for credentials when cloning my repo – just a couple of HTTP errors, then a prompt for a password.
No prompt for credentials even though I had removed the PAT tokens and emptied Windows Credential Manager.
Errors thrown at the Git level (I tend to live in VS Code or Visual Studio).
One of the (many) great aspects of my current role, working in the innovations area of a UK bank, is a relentless introduction to new features in Microsoft Azure. At my stage with Azure, in practice and exams, it is usually a new feature or behaviour that has dropped as part of a second generation (e.g. Storage vs Data Lake), an evolution of features, or more subtly a change to the defaults of a combination (e.g. Automation and the Desired State Configuration extension). Then there are the “never heard of it” moments, when a term gets mentioned and I rattle straight off to a search engine.
One of these a few months back was Azure CycleCloud: one of our projects involved input from Microsoft, and their HPC specialist proposed it as a key component of the platform being evaluated. In our case it is acting as an orchestrator/scheduler, keeping tabs on a handful of low-priority virtual machine scale sets.
I’ve not had any direct exposure to HPC beyond the awareness required by the Microsoft architectural exams I’ve done in the past, for on-premises Windows and latterly Azure. The good news is that, within those parameters, Azure CycleCloud appears straightforward, and being predominantly IaaS-based it is fairly easy to secure within our patterns. My thoughts so far are:
The web admin interface is fairly sensitive to its environment – I’ve lost about a day to Internet Explorer (it doesn’t work) and to the reverse proxy on our firewall appliances mangling page scripts.
The manual install is straightforward and reliable in my limited experience – we have a vnet model that it sits in quite nicely and the documentation is good on required ports and cluster communications.
Azure CycleCloud, being HPC and batch and so on, comes from open-source land, so there is lots of command line and Linux – quite ironic that my career includes so many loops (my first job, at an accountants in the 1980s, included being the guy who wrote SQL reports using vi on a Unix practice-management system).
Following on from the previous point, Azure CycleCloud integrates with Active Directory and has its own RBAC model – very important to understand if you are trying to secure it.
I have a few concerns about the quickstart deployment, mainly due to the public IP address bound to a server, but that probably reflects our use cases and my background. (Googling “cyclecloud initial setup” reinforces this concern, as a number of servers still at initial setup pop up.)
The cloud account relies on a service principal with fairly broad permissions, so it’s important to keep on top of that, bearing in mind the last two points.
About 50% of the time I get the name wrong and call it Azure CloudCycle. This hit rate is slowly improving.