A nice addition to your #Azure daily hammers: Azure Dockit, a toolbelt essential!

The one thing that always keeps lingering around is that single Post-It note with the todo: Generate Documentation. Maintaining environment and architecture docs can be time consuming, troublesome and … well, for most people kind of annoying, you’ve got to admit.

Well, for that last group of people I have some great news :-) , especially when you’re involved in Azure deployments. There’s a new tool on the market, named Azure Dockit. What this tool can help you accomplish is actually pretty amazing. And I’m not exaggerating here.

Let me walk you through this, the capabilities and the end result.

 

We start off by (after creating a subscription, and thus paying for it) selecting the subscription you want to generate documentation for, and pressing Generate Document. And actually, that’s it …

Now this may look a little simple, but the devil is in the settings button; there’s more to it than meets the eye.

 

The settings button allows you to configure what your report will look like. First, the “Azure Workloads” tab: this allows you to select which of the services implemented so far you want to generate documentation for.

Now the cool thing is that these guys are awesome when it comes to implementing new services for documentation. When I first started testing this, there were only four or five (and that was mid-January). As you can tell from the current workloads list, these guys haven’t been sitting still, and that’s a general rule of thumb here: they implement new features at a very high tempo. Kudos for that!

So after selecting all the workloads you want to generate docs for, select the “Document” tab. Depending on the version you’ve bought, you can either upload a custom template or just use the full one provided by Dockit. You can also choose to generate a ToC and even set a time constraint for picking up the document.

It’s not only the services themselves that can be documented, but also some of their content, as we can tell from the “SQL Server” tab.

This allows you to generate database structure information as well as info on stored procedures, tables, and so on.

In the “Storage” tab you can enter the desired number of containers and blobs you wish to be scanned. This is a useful feature, as you can (so far) not select which storage account containers you want to document. This setting avoids documentation overload for storage accounts or blobs that contain diagnostics, logging and monitoring data.

On the identity and access management side, you can choose to document either all AAD users or just a selected number of them (with the accompanying changes).

One of the more interesting features is the documentation of your VM infrastructure (aka IaaS). A new capability was recently added here: you can document the internals of your VMs by installing and running a custom script extension, which uses a PowerShell script to gather internal VM info.
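I haven’t seen the exact script Dockit pushes through the extension, but just to give you an idea, here’s a minimal sketch of the kind of internal info such a PowerShell script could collect (purely illustrative; the property selection and output path are my own assumptions, not Dockit’s):

    # Hypothetical example of an internal-info script, not the actual Dockit script
    $os    = Get-WmiObject Win32_OperatingSystem
    $disks = Get-WmiObject Win32_LogicalDisk -Filter "DriveType=3"

    # Build a small report object and write it to a local file for pickup
    $report = [ordered]@{
        ComputerName = $env:COMPUTERNAME
        OSVersion    = $os.Caption
        LastBoot     = $os.ConvertToDateTime($os.LastBootUpTime)
        Disks        = $disks | Select-Object DeviceID, @{n='SizeGB';e={[math]::Round($_.Size/1GB,1)}}
        RunningSvc   = (Get-Service | Where-Object Status -eq 'Running').Count
    }
    $report | ConvertTo-Json | Out-File C:\vm-internals.json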

From a billing perspective this proves interesting as well, since it works for both EA and normal subscriptions, allowing you to get a cost overview.

The last set of settings covers how you want the report to become available: you can choose to drop it in either an Azure Storage account (CORS-enabled, from their end) or in an O365 tenant storage library. There’s even a capability to engage with UManage to create follow-ups in this online team management tool. Here you can also choose to get advanced logging on all the analysis calls the service does for you.
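If you let Dockit drop the report into a storage account, you can of course also pull it down yourself afterwards with the classic Azure PowerShell storage cmdlets. A minimal sketch (the account, container and blob names are placeholders; check what Dockit actually creates in your subscription):

    # Build a storage context for the account the report is dropped into (names are placeholders)
    $ctx = New-AzureStorageContext -StorageAccountName "mydockitstorage" -StorageAccountKey "<key>"

    # List the blobs in the (hypothetical) container and download the generated report
    Get-AzureStorageBlob -Container "documentation" -Context $ctx
    Get-AzureStorageBlobContent -Container "documentation" -Blob "AzureDocumentation.docx" `
        -Destination "C:\docs\" -Context $ctx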

 

Once you’ve made your documentation choices, you return to the main screen and press the Generate Document button to get the docs the way you want them. This process might take a while, so have some patience here.

 

When finished, you’ll be presented with a downloadable Word document.

 

When looking deeper into the document, you’ll see that it contains some nice graphs and schemas:

You get maps of where you’ve deployed, generic overviews in tables and so on. When looking in detail at the VMs, for instance, you get nice Visio-esque diagrams which (with a certain SKU of the product) are even editable in Visio afterwards:

Not only that, but the fact that it generates warnings and best-practice suggestions is a valuable feature too.

It even generates stats on websites:

 

Among the schemas you’ll even get a layout of your VNETs, including gateways and connections.

If you want to check a full demo file, they have one ready for you on their site: http://www.azuredockit.com/examples/

 

Conclusion:

This product can save you time, period! The fact that it is a cloud-based service mostly offers benefits, although you might find it a disadvantage that you need to regenerate the documentation more often as new features get added. I must say this company has put a lot of effort into making this an easy-to-use tool, and with this many features it’s a must-have. The only downside for me is that you need to buy it per subscription, which makes it hard to reuse for multiple customers or in a consulting role. A version for consultants could be a great addition to their licensing scheme, as I see a lot of business potential there.


Cool Windows 10 update.. and there goes my network … NOW WHAT? #lifehack

For those as edgy as me, living on the Windows Insiders Fast Ring can bring a surprise from time to time. Today I was handed the latest build (14279), and of course I was very happy with it :-) (geek as I am). Unfortunately, the network adapters, Hyper-V and the update itself had other plans.

 

So what happened: I had a full Hyper-V setup active on the machine, with a couple of vAdapters and a vSwitch belonging to it. After the update and the reboot, I suddenly could no longer connect to the internet. What happened? I didn’t get it… I checked the Wi-Fi connection… it seemed connected, although I was now getting a 169.x.x.x autoconfiguration (APIPA) address (you know, the ones that come for free :-( ).

 

I went to check out the adapters… and guess what, my network bridge was gone… weird… So I decided to remove it from the adapters… but… COMPUTER SAYS NO.

All of a sudden my adapters started misbehaving and having an attitude towards me. Luckily I remembered something from the old days… netcfg… :-)

 

So what do you need to do:

  1. Open a command prompt in Admin mode
  2. Run netcfg /? to get all the netcfg help info. The key lies in netcfg -d, as this will reset and remove all adapters and put them back in zero configuration
  3. Run the command netcfg -d
  4. Reboot the device

The only thing left now, of course, is to re-create the vSwitch, and all should be back to normal.
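If you’d rather script that last bit instead of clicking through Hyper-V Manager, something along these lines should do it; the adapter name "Wi-Fi" is an assumption, so check yours first:

    # Find the physical adapter to bind the new external switch to
    Get-NetAdapter

    # Re-create the external vSwitch (adapter name is an assumption, adjust to your machine)
    New-VMSwitch -Name "ExternalSwitch" -NetAdapterName "Wi-Fi" -AllowManagementOS $true

After that, re-attach your VMs to the new switch and you should be back in business.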

 

It’s not the best solution around (it shouldn’t be happening in the first place), but it does the trick.

I hope I could save you all some time here.

M.

things to do in Sweden … on a Saturday night … #powerbi quickstart demo with #github

By means of a joke, thanks to a tweet, I created a small getting-started video on how to use Power BI to get the GitHub stats of any project :-). Enjoy!


PS: bad sound quality perhaps, since it was shot under a time constraint in a hotel room
watch the video

a quick tip for monitoring: #azure application insights has a plugin for WordPress

When you want monitoring, stats and diagnostics for your sites and you use a tool like Application Insights, you usually need to add code or script lines to every page you want to track. The Redmondians have made it easy for us when using the world’s most popular CMS: WordPress (as I do too, yes I’m a fan, so sue me :-) ). This makes monitoring sooooo easy, there’s no more excuse NOT TO DO IT!!

 

 

So… enter your WP site, select the Plugins side menu and click “Add New”. Type in “Application Insights” and there it is. Now all you have to do is click “Install Now”.

Once done, it will appear in the installed plugins list. Activate it if it didn’t activate automatically.

Once activated, the only thing left to do is to take the Application Insights instrumentation key (the green indication) and paste it into the Application Insights settings page of the plugin.

And that’s it. Give it a couple of minutes and all your stats will be dripping into your beautiful Azure Ibiza portal in no time. Hope this helps :-)

 

Bigger, Bolder, Better! Join the Global Windows Azure Bootcamp on 29/03!

After last year’s success, a second, even more successful round of GWAB is en route. At the end of this month we’ll have about 26 hours of around-the-world bootcamps in more than 135 locations. Sheer madness!!! (Or fricking awesome, as @Noopman always says :-) ) This year will be even more special. Last year we spent about 9,000 compute hours on rendering Windows Azure Security (read this as sharks with fricking lasers :-) ), and therefore this year we joined forces with a charity to put the compute capacity to good use. What’s it about? Well, read this small excerpt from the GWAB site (http://global.windowsazurebootcamp.com/charity/ ):

The Global Windows Azure Boot Camp event will help advance this endeavor by hosting a globally distributed lab in which attendees of the event will deploy virtual machines in Windows Azure which will help analyze data needed for this research. We’re aiming at discovering how our body’s serum protein glycosylation works. We want to know how high blood sugar levels present in diabetes patients affect the complex sugar production systems required for our health and ability to fight disease. We want to prove the theory that when small changes in this process start occurring, the disease can progress and lead to Type 2 diabetes. The results from this work will not only help understand the human diabetic state at the molecular level but also lead the way for early detection of diabetes.


Looking at the map of all the locations just dazzles my mind!

Having said that, I hope you will either join us or, even better, organize your own location (if you aren’t doing so already). There are even two locations in Belgium: one in Genk and another one in Kortrijk; see our AZUG page for more info: http://www.azug.be/events/2014-03-29—global-windows-azure-bootcamp-in-belgium

I’m proud of being part of the global organizer group next to these terrific gents:

Maarten Balliauw - Windows Azure MVP - @maartenballiauw
Magnus Mårtensson - Windows Azure MVP - @noopman
Mike Martin - Windows Azure MVP - @TechMike2kX
Alan Smith - Windows Azure MVP - @alansmith
Michael Wood - Windows Azure MVP - @mikewo

I hope March 29th will be as community-filled for you as it will be for me!!!

Yours truly

M.

Toying around with Visual Studio Monaco on Windows Azure … some side dishes

2.5 months ago MSFT released Visual Studio Online to the masses (see my first-look blog post from 13/11/13 here). It’s a great system and it works like a charm, but there are a few things you need to know besides just the basics.

It’s an extension, Jim, but not as we know it.

First things first: the foundation and architecture. When you’re using Monaco on one of your websites, you’re actually using… well… your website. Monaco is installed as a private site extension, living entirely in the same sandbox environment as your website, only with a different endpoint.

When we take a look at the structure, it becomes clearer:

As you can see, both the website and the Monaco site extension point to the same inetpub root on the PaaS IIS role (yes, websites do run on Windows Azure web roles). The only difference is that the VSO endpoint does the editing and the website’s entry point does the runtime execution.

Knowing the structure of this technology, something should spring to mind: since it runs on the same node and workspace, you should realize that if you activate and use it a lot, it comes with a cost.

Because it runs on the same PaaS role, it also uses that role’s compute and bandwidth. Compilation time is accumulated together with the compute hours of your web calls on the website role. This means that, when running free-tier websites for instance, your website might get locked from time to time. So this is important to know and realize.

How they’ve built it is actually quite cool. The whole solution is Node.js based and consists of a little more than 200,000 lines of TypeScript code, which gets compiled to JavaScript, so the editor itself runs client-side in the browser. For those who pay close attention: yes, it is based on the same solution that is used for “Napa”, TFS Online, SkyDrive, Windows Azure Mobile Services, and so on.

Fast responsiveness during coding, semantic references, the AST (Abstract Syntax Tree) running in the browser, and so on are all handled through TypeScript in the browser, giving you flexibility and performance at your fingertips.

As stated, it’s based on Node.js, and the following technologies have been integrated/used for developing the back-end environment(s):

As you can see, a fair set of tools and technologies is used to bring you a rich environment.

Crouching Tiger, Hidden Dragon.

Being aware of all the goodness Monaco hides, let’s take a look at some non-conventional scenarios. And by non-conventional I mean scenarios it perhaps wasn’t even meant to be used for.

  1. I got the Power(Shell)

    Since we are running in a Windows environment, one could start looking for the underlying technologies. One of the more important things in Windows today is PowerShell. Does it exist? Typing powershell doesn’t seem to do much, but when typing help we do see there’s a powershell entry:

    But then again, it’s not a real, live console as we know it.

    So let’s try something basic and get all the processes from the machine(s). I add a new file to the explorer tree and edit it with the single line get-process.


    I then enter the command ps powershell.ps1 in the console and… ta-daaa:

    All the running processes are showing.

    I can even get all the variable information out of it by using get-variable:

    Or the PS version:

    BUUUUT no services…:

    This probably has to do with the sandboxing (and can probably be used as proof of it too, I think).

    Then again, some powerful info can still be gained from dir‘ing the env: PSDrive, which shows us that we are running on a Red Dog (the old/original Windows Azure codename) machine, and what kind of machine we’re running on in shared mode.

    Then of course we could also use this power to create stuff, like web pages with that info and much more. I haven’t found out yet which cmdlets are able to run and which ones are not, but I think you could put the PS shell to good use somehow (see the sketch after this list).

  2. The Cloud Atlas … aka azure-cli

    Another thing available in this environment is npm (so it seems, when we run help from the console). Now, I don’t know about you, but when you say npm to me the first thing that jumps to mind is… you guessed it: the Windows Azure xplat CLI tools :-) !! Now you’re probably thinking “wait, you’re not saying you want to manage the cloud… from the cloud…?” Yes, I actually do want to do that: imagine git + npm + xplat tools = continuous availability of all your management scripts! No need to have a device with all the tools installed: you just need a browser. “Yes, but then you can also just use the portal”… well, not quite true, because now you can use your own on-site created scripts… EVERYWHERE (even from a smartphone!)

    So let’s see if we can pull this one off:

    Well, not with the -g switch (sandbox), but we can do it without it. And as you can see, it created all my node tools in a separate folder inside my website:

    It installed all right, but does it work?

    It seems so now, doesn’t it?

    So the only thing missing is a publish settings file or an account login. Since the azure-cli tools don’t support the account login yet, we need to go for a settings file. I guessed launching a browser window from an emulated console inside a browser would be impossible, but I gave it a try anyway:

    No luck there, so I just downloaded a fresh one from my own stack and dragged it into the file explorer pane in Monaco:

    Let’s try to import the settings:

    This means I can access my Windows Azure assets, right? Let’s try something that is not websites-related:

    Cool :-) , can I do more?

    Let’s stop a VM

    Do note: CASE SENSITIVE ;-)

    WORD OF ADVICE: when doing the above operations, DO NOT FORGET TO DELETE YOUR .publishsettings file afterwards! It’s a precious thing! (The sketch right after this list sums up the whole flow.)
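To sum up both experiments, here’s roughly what the console session boils down to; the file names, the VM name and the exact npm package name are placeholders, so your mileage may vary:

    # --- 1. PowerShell inside the Monaco console ---
    # powershell.ps1 is a file created in the explorer tree, containing for example:
    #   get-process
    #   get-variable
    #   dir env:
    # then run it from the console with:
    ps powershell.ps1

    # --- 2. azure-cli (xplat tools) via npm ---
    npm install azure-cli                       # no -g switch, the sandbox won't allow a global install
    azure account import mysub.publishsettings
    azure vm list
    azure vm shutdown MyVmName                  # mind the casing!

    # And finally: remove mysub.publishsettings again (via the console or the explorer pane)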

[UPDATED/ADDED 28-01-2014] The tale of two Git-ies …

When considering VSO Monaco, try to use it only with DEV or more static web pages, and not directly on heavy production websites. Try to set up an ALM stream from your sources just as you would in any other situation. The easiest way to achieve this is to activate the new Staging preview on websites, which will actually help you with this. How to proceed?

  1. Initiate a new website and activate VSO Monaco on it. This site will function as your DEV area.
  2. Clone or initialize (depending on what you already have, of course) a Git repo.
    DO NOTE: this is not to be confused with publishing from source control; this has to be seen as a LOCAL REPO, so do it through the VSO MONACO INTERFACE (a minimal sketch of these git steps follows after this list).
    Once that’s done and you have a stable solution, you can proceed to the next step in the flow.
  3. Initiate a second website (create a quick one, without anything).
  4. Activate the Staging preview on this one.
    DO NOTE: you’ll need a standard mode website to achieve this; your dev site can still be in free mode (but consider the accompanying cost remark made earlier).
  5. Once that’s done, open the staged site, choose to deploy from source control on it and activate your earlier created git repo there.
  6. When everything is in place, you can do a VIP swap whenever staging is finished and you need it in production.
    If you want more info on the underlying technology and a deeper ALM take on this, have a look at the Monaco blog: http://blogs.msdn.com/b/monaco/archive/2013/12/06/using-monaco-for-in-depth-modifications.aspx
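For step 2, the local repo part from the Monaco console boils down to plain git; a minimal sketch (the commit messages are obviously just examples):

    # Inside the Monaco console of the DEV site: turn the site into a local git repo
    git init
    git add .
    git commit -m "stable version, ready to flow to staging"

    # Later changes follow the same routine
    git add .
    git commit -m "next iteration"

The deploy-from-source-control wiring on the staging site (step 5) and the VIP swap (step 6) are then done through the portal as described above.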

As you can see, VSO Monaco is not limited to coding alone; it even makes a great environment for some continuously available DevOps work. As with many things Azure, one should think outside of the box (and a big box it is indeed). Monaco is a standard tool, but one you can (ab)use for your needs, especially when you know that the entire environment is sandboxed and totally locked down for your privacy.

Happy tinkering!

Yours truly,

Techmike2kx