Office365-AzureHybrid: Building an automated solution to pull Office 365 Audit logs

Custom reporting for Office 365 Audit logs is possible using data fetched from the Security and Compliance Center. In previous blogs, we have seen how to use PowerShell and the Office 365 Management API to fetch the data. In this blog, we will look at the planning, prerequisites and rationale that help decide between the two approaches.

The Office 365 Audit logs are available from the Security and Compliance Center once auditing is enabled. At present, audit logging is not enabled by default: turn it on (if not done already) via Start recording user and admin activity on the Audit log search page in the Security & Compliance Center. Microsoft has indicated it will be turned on by default in future. Once enabled, audit information is tracked across all Office 365 services.

The Audit log search in the Security and Compliance Center lets you search the audit logs, but it is limited in what it provides, and it can take a long time to return results. All the cases below therefore need custom hosting for better efficiency and performance.

Planning and prerequisites:

A few considerations for custom processes are as follows:

  1. Need additional compute to process the data – Create a periodic job to fetch the data from the Office 365 Audit log, since the audit log data is large and queries take a long time. The options are a PowerShell job or an Azure Function App, as detailed below.
  2. Need additional hosting to store the Office 365 Audit log data – Record volumes can range from 5,000 to 20,000 per hour depending on the data sources and record size. To make later retrieval easier, store the data in a custom database. Since the storage cost could be significant, use either dedicated hosting or NoSQL hosting such as Azure Tables/Cosmos DB (Azure) or SimpleDB/DynamoDB (AWS).
  3. Might need an additional service account or Azure AD app – The data will be retrieved using an elevated account at runtime, so use an Azure AD app or a service account to gather it. For more information, please refer to the blog here.
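For the storage consideration above, one practical question is how to key the records. Below is a minimal Python sketch of mapping an audit record to an Azure Table entity; the field names (Id, CreationTime, Workload, Operation, UserId) come from the audit record common schema, while the entity shape and key convention are just one assumed design, not the only option.

```python
from datetime import datetime, timezone

def to_table_entity(record):
    """Map one Office 365 audit record to an Azure Table entity.

    PartitionKey groups records by UTC date so one day's audit data
    can be read with a single partition scan; RowKey reuses the
    record's unique Id so a re-run of the fetch job overwrites
    duplicates instead of storing them twice.
    """
    created = datetime.fromisoformat(record["CreationTime"]).replace(tzinfo=timezone.utc)
    return {
        "PartitionKey": created.strftime("%Y%m%d"),
        "RowKey": record["Id"],
        "Workload": record.get("Workload", ""),
        "Operation": record.get("Operation", ""),
        "UserId": record.get("UserId", ""),
    }

# Example with a hypothetical audit record
entity = to_table_entity({
    "Id": "b4f1-0000",                      # hypothetical record id
    "CreationTime": "2019-01-15T10:30:00",  # audit timestamps are GMT
    "Workload": "SharePoint",
    "Operation": "FileAccessed",
    "UserId": "user@contoso.com",
})
print(entity["PartitionKey"])  # 20190115
```

Date-based partition keys suit the reporting scenarios below, where queries are almost always "activity for a given day or range".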

Scenarios:

Some scenarios where the Office 365 Audit log data could be useful:

  1. Create custom reports for user activities and actions
  2. Store audit log data for longer than 90 days
  3. Custom data reporting and alerts that are not supported in the Security and Compliance Center

Approaches:

Below are a few approaches to pull data from the Office 365 Audit logs, along with the benefits and limitations of each to help decide on an implementation.

Using PowerShell

The Search-UnifiedAuditLog cmdlet in Exchange Online PowerShell can be used to retrieve data from the Office 365 Audit log. More implementation details can be found in the blog here.

Benefits:

  1. Doesn’t need additional compute hosting. The PowerShell job can run on a local system or a server with a service account
  2. A one-off data pull is possible, and the results can be stored for later use
  3. Able to retrieve data more than 90 days old from the Office 365 Audit log
  4. No session timeout constraints as long as the PowerShell console stays active
  5. Local date filtering applies while searching; no need to convert dates to GMT

Limitations:

  1. Needs tenant admin rights when connecting to Exchange Online PowerShell to download the cmdlets
  2. Needs an active connection to Exchange Online PowerShell every time it runs
  3. Cannot currently run on Azure or AWS, as connecting to the Exchange Online PowerShell cmdlets is not possible in a serverless environment
  4. Needs a longer active window, as the job could run for hours depending on the data volume

Using the Office 365 Management API:

The Office 365 Management API provides another way to retrieve data from the Office 365 Audit logs, using a subscription service and an Azure AD app. For more detailed information, please check the blog here.
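To make the subscription model concrete, here is a minimal Python sketch of the call sequence, assuming an Azure AD app with the ActivityFeed.Read permission and an OAuth bearer token have already been set up; the tenant id and content type values used below are placeholders.

```python
from datetime import datetime

BASE = "https://manage.office.com/api/v1.0"

def start_subscription_url(tenant_id, content_type):
    # One-time call per content type (e.g. Audit.SharePoint):
    # POST this URL with the bearer token to begin receiving content.
    return f"{BASE}/{tenant_id}/activity/feed/subscriptions/start?contentType={content_type}"

def list_content_url(tenant_id, content_type, start_utc, end_utc):
    # GET this URL to list available content blobs; the time window
    # must be in UTC and may span at most 24 hours.
    return (
        f"{BASE}/{tenant_id}/activity/feed/subscriptions/content"
        f"?contentType={content_type}"
        f"&startTime={start_utc:%Y-%m-%dT%H:%M:%S}"
        f"&endTime={end_utc:%Y-%m-%dT%H:%M:%S}"
    )

# Each item in the list-content response carries a contentUri; a second
# GET against that URI (with the same bearer token) returns the actual
# audit records - this is the multi-level pull noted in the limitations.
print(list_content_url("tenant-guid", "Audit.SharePoint",
                       datetime(2019, 1, 15, 0, 0),
                       datetime(2019, 1, 15, 1, 0)))
```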

Benefits:

  1. Supports any language, such as C#, JavaScript, Python, etc.
  2. Parallel processing allows greater speed and flexibility in data management
  3. The data pull can be sized to the data volume, increasing efficiency and performance

Limitations:

  1. Needs additional compute hosting (serverless workloads or web jobs) to process the data
  2. Needs an Azure AD app or an OAuth layer to connect to the subscription service
  3. Needs additional time zone processing, since all dates must be in GMT when retrieving data
  4. Session timeouts might occur when pulling large datasets, so it is advisable to use smaller time-slot windows for the data pull
  5. A multi-level data pull is required to fetch the audit log. Please check the blog here for more information
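The GMT conversion and smaller-time-slot points above can be handled together. Below is a minimal sketch: the 30-minute slot size and the UTC+10 office time zone in the example are assumptions, not values mandated by the API.

```python
from datetime import datetime, timedelta, timezone

def utc_windows(start_local, end_local, slot=timedelta(minutes=30)):
    """Convert a local time range to UTC and split it into small slots.

    Each slot becomes one request window: small windows finish before
    the service times out, and a failed slot can be retried on its own
    without redoing the whole pull.
    """
    start = start_local.astimezone(timezone.utc)
    end = end_local.astimezone(timezone.utc)
    windows = []
    while start < end:
        windows.append((start, min(start + slot, end)))
        start += slot
    return windows

# Example: a two-hour local range in an assumed UTC+10 time zone
aest = timezone(timedelta(hours=10))
wins = utc_windows(datetime(2019, 1, 15, 10, 0, tzinfo=aest),
                   datetime(2019, 1, 15, 12, 0, tzinfo=aest))
print(len(wins))               # 4 half-hour windows
print(wins[0][0].isoformat())  # 2019-01-15T00:00:00+00:00
```

Because each window is independent, these slots are also a natural unit for the parallel processing listed under the benefits.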

Final Thoughts

Both PowerShell and the Office 365 Management Activity API are great ways to fetch Office 365 Audit log data for custom reports. The points above can help decide on an approach to fetch and process the data efficiently. For more details on each process, please check the blogs here (PowerShell) and here (Office 365 Management API).

Selectively prevent and secure content from External sharing using Labels and DLP policies in Office 365

In a recent project, we had a requirement to prevent specific content from being shared externally while still allowing the flexibility of external sharing for all users. We were able to achieve this through the Security and Compliance Center. There are a few ways to do it: auto-classification (see the conclusion section below for more info), selective application via labels, or both.

Note: Until recently (Dec 2018), a bug in Office 365 prevented this DLP policy with labels from working. This is fixed in the latest release, so it is now available for use.

In this blog, we will look at the process where business users decide whether content can be shared externally. This is a nifty feature, because there are cases where content should be classified as secure even though it contains no sensitive information, such as contracts (without business info) or invoices (with only a business name). Likewise, there are cases where content can be public even though the document contains sensitive information, because the company has decided to make it public. In the end, it is up to the owner's discretion to decide the content's privacy, which makes this feature very valuable in these scenarios.

Note: If you would like to auto-classify content using sensitive info types, please refer to the great article here. That process leverages the machine learning capabilities of the Office 365 engine to identify secure content and automatically apply a security policy to it.

The first step is to create a retention label (somehow this doesn't work with security labels, so a retention label must be used). After creating the label, publish it to the selected locations; for our use case we will publish it to SharePoint sites only. While the label is being published, we can go ahead and create a DLP policy to prevent sharing with external users (I was not able to make it work while set to Test with notifications, so turn it to the On state to test as well). After this, when you apply the label to a document, it takes about 1-2 minutes to take effect, after which the content can no longer be shared with external users. Let's look at each of these steps in detail below.

Steps:

  1. The first step is to create a retention label in the Security and Compliance Center. To my astonishment, the selective process doesn’t work with security labels, only retention labels, so we will create a retention label. Applying a retention period to the content is optional and not required for this exercise, so it can be left blank.


  2. Secondly, we will publish the label to SharePoint sites, per our requirement. I haven’t tried the process with other sources such as Outlook and OneDrive, but it should work the same when applied.
    Note: It takes about a day for retention labels to be published to SharePoint sites, so please wait for them to become available. We can move to the next configuration step right away, but sharing won’t be blocked until the label is published.
  3. Next, we create a DLP policy for the content. To create a DLP policy, we need to follow the configuration steps below. Once created, we might have to turn it on in order to test it.
    SecurityAndCompliance_DLPPolicy1
  4. The first step of policy creation is to select Custom Policy for the DLP policy and give it a name.
  5. Then, we select the sources the policy will apply to. In our case, it is only SharePoint.
    SecurityAndCompliance_DLPPolicy2
  6. After the above, we set the rule settings for the DLP policy: select the label the policy applies to, then configure the policy tips, block-sharing rules and override rules as shown in the screenshots below. We can also set the specified admins to be notified when such content is shared externally.
     
  7. Next, we could allow users to override the policy if needed. For this blog and our requirement, we decided not to allow it.
     
  8. After this is set up, we can turn on the DLP policy so that it starts applying the rules. There doesn’t seem to be any wait time for the policy to apply, but give it some time if you don’t see it take effect right away.
  9. Now that the policy is enabled, once the labels are published a user can apply a label to content as shown in the screenshot below.
    Note: In some cases it takes about 2-3 minutes for the policy to take effect on the content after the label is applied, so give it some time.
  10. After the label takes effect (about a 2-3 minute wait), if the same content is shared with an external user, we get the following error.
    SharingFix1
