Disk Space Reporting through Lambda Functions - Windows Servers

1. Solution Objective:

The solution provides a detailed report of hard disk space for all the Windows EC2 instances in the AWS environment.

2. Requirements:

The solution should be able to fulfil the following requirements:

  • Gather information about all mount points on all the Windows EC2 instances in the environment.
  • Generate a cumulative report covering all instances in the environment.

3. Assumptions:

The following assumptions have been made:

  • All the EC2 instances have the SSM agent installed.
  • The personnel responsible for the configuration have some understanding of IAM roles, S3 buckets and Lambda functions.

4. Solution Description:

The following AWS services will be utilized to generate the report:

  • PowerShell Scripts
  • AWS S3
  • AWS Lambda
  • AWS IAM Roles
  • Maintenance Windows

4.1      PowerShell Script

A PowerShell script will be utilized to gather the instance ID and the mount points' space utilization.
The script below needs to be executed on all Windows EC2 instances to generate the mount point information.

# Retrieve the instance ID from the EC2 instance metadata service
$instanceId = Invoke-WebRequest -Uri http://169.254.169.254/latest/meta-data/instance-id -UseBasicParsing
$instanceId.Content
# List each logical disk with its total size and free space
Get-WmiObject Win32_LogicalDisk | Select-Object DeviceID,Size,FreeSpace,VolumeName | Format-Table -AutoSize
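For reference, the stdout this script produces is simple to parse: the first line is the instance ID and the drive table rows follow the header lines. A minimal Python sketch (the sample output and line offsets below are illustrative, matching the slicing the Lambda function later in this document performs):

```python
# Sketch: parse the script's stdout. The first line is the instance ID,
# and the drive rows start after the blank line and two table header lines.
def parse_stdout(body):
    lines = body.splitlines()
    instance_id = lines[0]
    # Skip the instance ID, the blank line and the two header lines
    rows = [line.split() for line in lines[4:] if line.split()]
    return instance_id, rows

# Illustrative sample of the Format-Table output
sample = (
    "i-0abc12345def67890\n"
    "\n"
    "DeviceID        Size    FreeSpace VolumeName\n"
    "--------        ----    --------- ----------\n"
    "C:     107374182400  53687091200 OS\n"
)
print(parse_stdout(sample))
```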

4.2      AWS S3

The output of the PowerShell script will be posted to an S3 bucket for further use.
The EC2 instances will need write access to the nominated S3 bucket to store the script output.
S3 Bucket Name: eomreport (sample name)
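One way the output reaches the bucket is SSM Run Command's S3 output options. The sketch below illustrates this with boto3's `send_command`; the bucket name, key prefix and instance IDs are sample values, not fixed by the solution, and the client is passed in so the call can be exercised offline:

```python
# Sketch: run the PowerShell script on the instances via SSM Run Command
# and have stdout delivered to the report bucket. The SSM client is
# injected so the call can be tested without AWS credentials.
def send_disk_report_command(ssm, instance_ids, script_lines,
                             bucket="eomreport", prefix="diskspace"):
    return ssm.send_command(
        InstanceIds=instance_ids,
        DocumentName="AWS-RunPowerShellScript",  # SSM document for Windows scripts
        Parameters={"commands": script_lines},
        OutputS3BucketName=bucket,   # stdout/stderr are stored under this bucket
        OutputS3KeyPrefix=prefix,
    )
```

In a real run, `ssm` would be `boto3.client('ssm')` and the instance IDs would come from your environment.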

4.3      AWS Lambda Functions

Lambda Functions will be used to perform the following activities.

  • Acquire the result of the Shell script from the S3 bucket
  • Generate a Report
  • Email the report to the relevant recipient

The Lambda function needs read access to the S3 bucket and access to AWS SES to send emails to recipients.
The Lambda function below performs the tasks mentioned above.

import boto3

def lambda_handler(event, context):
    s3 = boto3.resource('s3')
    mybucket = s3.Bucket('diskspacewindows')
    # Build the HTML body of the report email.
    resulthtml = ['<html><body><h1>Report : Hard disk Space Client Name</h1>']
    resulthtml.append('<table border="1">')
    resulthtml.append('<tr><td><b>InstanceID</b></td><td><b>Drive Letter</b></td><td><b>Total Space</b></td><td><b>Free Space</b></td></tr>')
    for file_key in mybucket.objects.all():
        # Only the Run Command stdout objects hold the script output.
        if "stdout" not in file_key.key:
            continue
        body = file_key.get()['Body'].read().decode('utf-8')
        complete = body.splitlines()
        instance_id = complete[0]   # the first line of the output is the instance ID
        details = complete[4:]      # drive rows start after the table header lines
        resulthtml.append('<tr><td>{}</td><td></td><td></td><td></td></tr>'.format(instance_id))
        for line in details:
            output_word = line.split()
            if output_word:  # skip blank lines
                resulthtml.append('<tr><td></td><td>{}</td><td>{}</td><td>{}</td></tr>'.format(
                    output_word[0], output_word[1], output_word[2]))
    resulthtml.append('</table></body></html>')
    final = "".join(resulthtml)
    print(final)
    sender = "syed.naqvi@kloud.com.au"
    recipient = "syed.naqvi@kloud.com.au"
    awsregion = "us-east-1"
    subject = "Client Hard Disk Space - Windows"
    charset = "UTF-8"
    client = boto3.client('ses', region_name=awsregion)
    try:
        client.send_email(
            Destination={'ToAddresses': [recipient]},
            Message={
                'Body': {
                    'Html': {'Charset': charset, 'Data': final},
                    'Text': {'Charset': charset, 'Data': 'Hard disk space report (HTML only).'},
                },
                'Subject': {'Charset': charset, 'Data': subject},
            },
            Source=sender,
        )
    # Display an error if something goes wrong.
    except Exception as e:
        print("Error: ", e)
    else:
        print("Email sent!")

 
4.4 AWS IAM Roles

Roles will be used to grant:

  • AWS S3 write access to all the EC2 instances, as they will submit the script output to the S3 bucket
  • AWS SES access to Lambda Functions to send emails to relevant recipients.

4.5 AWS SES

Amazon Simple Email Service (Amazon SES) evolved from the email platform that Amazon.com created to communicate with its own customers. In order to serve its ever-growing global customer base, Amazon.com needed to build an email platform that was flexible, scalable, reliable, and cost-effective. Amazon SES is the result of years of Amazon’s own research, development, and iteration in the areas of sending and receiving email.( Ref. From https://aws.amazon.com/ses/).
We would be utilizing AWS SES to send emails from AWS Lambda.
The configuration of the Lambda function can be modified to send emails to a distribution group for disk space reporting, or to a ticketing system to provide alerting and ticket creation when free disk space crosses a configured threshold.

5. Solution Configuration

5.1 Configure IAM Roles

The following roles should be configured:

  • IAM role for the Lambda function
  • IAM role for EC2 instances for S3 bucket access

5.1.1 Role for Lambda Function

The Lambda function needs the following access:

  • Read data from the S3 bucket
  • Send emails using Amazon SES

To accomplish the above, the following policy should be created and attached to the IAM role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1501474857000",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::S3BucketName/*"
            ]
        },
        {
            "Sid": "Stmt1501474895000",
            "Effect": "Allow",
            "Action": [
                "ses:SendEmail"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}

5.1.2  Role for EC2 instance

All EC2 instances should have access to store the script output in the S3 bucket.
To accomplish this, the following policy should be attached to the EC2 instance role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1501475224000",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::eomreport"
            ]
        }
    ]
}

5.2 Configure Maintenance Window

The following tasks need to be performed for the maintenance window:

  • Register a Run Command task using the AWS-RunPowerShellScript document with the script in section 4.1
  • Register targets based on the requirements
  • Select the schedule based on your requirement

Maintenance Window Ref : 
http://docs.aws.amazon.com/systems-manager/latest/userguide/what-is-systems-manager.html
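The task registration step above can also be scripted. Below is a sketch using boto3's `register_task_with_maintenance_window`; the window ID, target ID, role ARN and bucket name are placeholders you would substitute, and the client is injected so the call can be exercised offline:

```python
# Sketch: register the disk space script as a Run Command task on an
# existing maintenance window. The SSM client is injected so the call can
# be tested without AWS credentials; IDs and ARNs are placeholders.
def register_disk_report_task(ssm, window_id, target_id, service_role_arn,
                              script_lines, bucket="eomreport"):
    return ssm.register_task_with_maintenance_window(
        WindowId=window_id,
        Targets=[{"Key": "WindowTargetIds", "Values": [target_id]}],
        TaskArn="AWS-RunPowerShellScript",   # the Run Command document to execute
        ServiceRoleArn=service_role_arn,
        TaskType="RUN_COMMAND",
        TaskInvocationParameters={
            "RunCommand": {
                "Parameters": {"commands": script_lines},
                "OutputS3BucketName": bucket,  # stdout lands here for the Lambda to read
            }
        },
        MaxConcurrency="5",
        MaxErrors="1",
    )
```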

5.3 Configure Lambda Function:

The following tasks need to be performed for the Lambda function:

  • Create a blank Lambda function with the S3 put event as the trigger
  • Click Next
  • Enter the name and description
  • Select the Python 3.6 runtime
  • Copy and paste the Lambda function from section 4.3
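Because the trigger is an S3 put event, the Lambda receives the bucket name and object key of the uploaded file in its `event` argument. A small sketch of pulling them out (the sample event below is abbreviated to the fields of interest, with illustrative names):

```python
# Sketch: extract the bucket name and object key from an S3 put event.
# A Lambda could use this to process only the object that triggered it,
# instead of scanning the whole bucket on every invocation.
def parse_s3_event(event):
    record = event["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]

sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "diskspacewindows"},
                "object": {"key": "diskspace/i-0abc12345def67890/stdout"}}}
    ]
}
print(parse_s3_event(sample_event))
```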

5.4 Configuring AWS SES

The following tasks need to be completed before the Run Command is executed:

  • Email addresses should be added to the AWS SES section of the tenant.
  • The email addresses should be verified.
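Verification can also be requested programmatically with SES's `verify_email_identity`. A sketch with the client injected so the loop can be tested offline (the addresses would be your nominated recipients); each address becomes usable only after its owner clicks the confirmation link SES emails them:

```python
# Sketch: request SES verification for each nominated report address.
# SES sends a confirmation email per address; the address is only usable
# for sending once the link in that email has been clicked.
def verify_recipients(ses, addresses):
    for address in addresses:
        ses.verify_email_identity(EmailAddress=address)
    return len(addresses)
```

In a real run, `ses` would be `boto3.client('ses', region_name='us-east-1')`.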

6. Result:

Based on the above configuration, whenever the run command is executed, the following report is generated and sent to the nominated email account.

Sharing a report using a Power BI app

History

You have created reports and built dashboards in Power BI Desktop to surface your data from multiple data sources, and now it is time to share those dashboards with a wider audience in your organisation. The Power BI service offers the powerful feature of Power BI apps to cater for such scenarios.
If you have not yet created reports or set up a gateway for your on-premises data, please follow my earlier posts Setup a Power BI Gateway and Create reports using a Power BI Gateway to do so.

Approach

Sharing and collaborating in the Power BI service is a three-step process; each step is explained in this blog post. At a surface level, the tasks are as follows:

  1. Creation of an App Workspace
  2. Publishing reports to an App Workspace
  3. Publishing a Power BI App

A typical usage scenario for Power BI apps in Office 365 services is depicted below:

1) Create an App Workspace

An app workspace is a new concept introduced in Power BI with which you can collaborate on datasets, reports and dashboards (authored by members) and build/package Power BI apps to be distributed to your wider audience.

  1. Log in to your Power BI service https://app.powerbi.com and click on your Workspace list menu on the left

    If this is your first-time login, you need to create a new app workspace. (it’s just a new name for group workspaces)

  2. A form needs to be filled in inside your Office 365 Power BI service to create the workspace, and a unique name is required for each app workspace
  3. Whilst creating the workspace, you need to set the privacy, which can’t be changed later – so please decide carefully.
  4. You also need to set permission levels for your workspace accordingly; please only add members who can edit content, as viewers can be added later when publishing your Power BI app.

  5. The next step is to add users to it and set admins for the workspace. (The default role is Member; change it to Owner for users you intend to give administrator permissions.) Note: you can only add individual users to this list; security group and modern group support is not yet available at the time of writing this post.
  6. Upon reaching this step, your app workspace has been created successfully and is ready for use.

2) Publishing Reports to an App Workspace

A Power BI app workspace is a collaboration tool; any member can create a model using their Power BI Desktop and then publish it to a workspace so members can take advantage of existing datasets, reports and dashboards. Follow the steps listed below to share your model in an app workspace.

  1. Open the Power BI Desktop file (*.pbix) you created earlier and hit the Publish button
  2. Select the app workspace you want to publish your reports to, and Power BI Desktop will start publishing reports to your Office 365 Power BI service
  3. Subsequent publishing to the same app workspace will remind you if your dataset already exists.
  4. Depending on the size of your data and your internet speed, it may take some time to publish reports to the Power BI service. Sooner or later you will receive a success message
  5. Upon reaching this step your reports, datasets and dashboards are published and available in your Power BI service.

3) Publishing Power BI App

  1. Login into your Power BI service and go to your app workspaces list and select your newly created workspace from the list
  2. On the right top, you will see a button to publish an app
  3. Provide a description for the app in the ‘Details’ tab, as your Power BI app will get the same name as your app workspace
  4. In the next ‘Content’ tab, you will see a list of all contents within app workspace that will be published within this app. In this step, you can set a landing page of a Power BI app which users will see when they click on your Power BI app. I have selected a specific dashboard to be shown
  5. You will then need to set the audience for your app in the ‘Access’ tab; it can be either the whole organisation or a combination of users and groups. In the top right corner, it will show you how many artefacts will be published within this Power BI app.
  6. Once you publish it, the Power BI service will advise you of the URL of your app as shown below:

AppSource and Power BI

Power BI users intending to use apps shared by other users or organisations must first get the apps in order to use the dashboards and reports in them.

  1. Go to the ‘Apps’ menu in the Power BI service (in the left menu)
  2. Selecting Apps from the menu will list the apps you are subscribed to; if you are using it for the first time it is usually empty, and you need to click ‘Get apps’ to get Power BI apps from the AppSource store
  3. You can then select which apps you want to subscribe to from the list; they are listed by category

Behind the Scenes

The moment a user creates an app workspace, an Office 365 group is created in the background with the same name as the app workspace, and the users are maintained as Office 365 group users.

  • Admins of the workspace will become Owners of the group
  • Members of the workspace will become Members of the group


A SharePoint site will be created as well, with the same members as the Power BI app workspace and Office 365 group.

You can see the details of users (admins/members) by checking the ‘Site permissions’ menu under site settings.


Create reports using a Power BI Gateway

Background

Once you have a Power BI gateway set up to ensure data flows from your on-premises data sources to the Power BI service in the cloud, the next step is to create reports using Power BI Desktop and build them using data from multiple on-premises data sources.
Note: If you don’t have a gateway set up already, please follow my earlier post to set one up before you continue reading this post.

Scenario

All on-premises data is stored in SQL Server instances, spread across a few data warehouses and multiple databases built and managed by your internal IT teams.
Before building reports, you need to ensure the following key points:

  1. Each data source should have connectivity to your gateway with minimum latency
  2. Every data source intended to be used within reports needs to be configured within a gateway in the Power BI service
  3. A list of people needs to be configured against each data source who can publish reports using that data source

An interaction between on-premises data sources and cloud services is depicted below:

Pre-requisites

Before you build reports, you need to set up on-premises data sources in the gateway so the Power BI service knows which data sources the gateway administrator has allowed for pulling data from on-premises sources.
Log in to https://app.powerbi.com with Power BI service administrator credentials.

  1. Click on Manage gateways to modify settings
  2. You will see a screen with the gateway options that you set up earlier while configuring the gateway on-premises
  3. The next step is to set up gateway administrators, who will have permission to set up on-premises data sources as and when required
  4. After gateway configuration, you need to add data sources one by one so published reports can use the on-premises data sources (pre-configured within the gateway)
  5. You need to set up users against each data source within the gateway who can use it to pull data from on-premises sources within their published reports
  6. Repeat the above steps for each of your on-premises data sources, selecting the appropriate data source type and allowing the users who can use them while building reports

Reports

Upon reaching this step, you are all good to create reports.

  1. Open Power BI Desktop
  2. Select the sources you want to retrieve data from
  3. While creating reports, ensure the data source details are the same as what was configured in the Power BI service when you set up the data sources.
  4. Great! Once you publish reports to your Power BI service, your gateway will be able to connect to the relevant on-premises data sources if you have followed the steps above.

 

Setup a Power BI Gateway

Scenario

So, you have explored Power BI (free) and want to start some action in the cloud. Suddenly you realise that your data is stored in an on-premises SQL data source, and you still want to get insights up in the cloud and share them with your senior business management.

Solution

Microsoft’s on-premises data gateway is a bridge that can securely transfer your data to Power BI service from your on-premises data source.

Assumptions

  • Power BI Pro licenses have been procured already for the required number of users (this is a MUST)
  • Users are already part of Azure AD and can sign in to Power BI service as part of Office 365 offering

Pre-requisites

You can build and setup a machine to act as a gateway between your Azure cloud service and on-premises data sources. Following are the pre-requisites to build that machine:

1) Server Requirements

Minimum Requirements:
  • .NET 4.5 Framework
  • 64-bit version of Windows 7 / Windows Server 2008 R2 (or later)

Recommended:

  • 8 Core CPU
  • 8 GB Memory
  • 64-bit version of Windows 2012 R2 (or later)

Considerations:

  • The gateway is supported only on 64-bit Windows operating systems.
  • You can’t install the gateway on a domain controller
  • Only one gateway can be installed on a single machine
  • Your gateway should not be turned off, disconnected from the Internet, or put to sleep by a power plan – in all cases, it should be ‘always on’
  • Avoid wireless network connection, always use a LAN connection
  • It is recommended to have the gateway as close to the data source as possible to avoid network latency. The gateway requires connectivity to the data source.
  • It is recommended to have good throughput for your network connection.

Notes:

  • If you need to change the name of a gateway once it is configured, you will need to reinstall and configure a new gateway.

2) Service Account

If your company/client is using a proxy server and your gateway does not have a direct connection to the Internet, you may need to configure a Windows service account for authentication purposes and change the default log on credential (NT SERVICE\PBIEgwService) to a service account of your choice with the ‘Log on as a service’ right.

3) Ports

The gateway creates an outbound connection to Azure Service Bus and does not require inbound ports for communication. It is required to whitelist IP addresses listed in Azure Datacentres IP List.

Installation

Once you have met the pre-requisites listed in the previous paragraph, you can proceed to gateway installation.

  1. Log in to Power BI with your organisation credentials and download your data gateway setup
  2. While installing, you need to select the highlighted option so a single gateway can be shared among multiple users.
  3. As listed in pre-requisites section, if your network has a proxy requirement – you can change the service account for the following windows service:

  4. You will notice gateway is installed on your server
  5. Once you open a gateway application, you can see a success message

Configuration

Post installation, you need to configure a gateway to be used within your organisation.

  1. Open gateway application and sign in with your credentials
  2. You need to set a name and a recovery key for a gateway that can be used later to reconfigure/restore gateway
  3. Once it is configured successfully, you will see a message window that now it is ready to use
  4. Switch to Gateway’s Network tab and you will see its status as Connected – great!
  5. You are all set; the gateway is up and running, and you can start building reports that use data from your on-premises server via the gateway you just configured.

 
