This is the 12th section, continuing with more Sentinel. This time we are classifying entities, creating a custom log in Log Analytics, and parsing the content.
So let’s do some designs and configurations.
Table of Contents
Classify and analyze data by using entities
You can do the following in Entity pages:
- View related entities: The entity page lists every entity connected to the one you’ve chosen. For instance, the entity page for a person may show connected IP addresses and devices.
- View entity fields: The entity page lists each field connected to the entity you chose. For instance, the name, IP address, and operating system of a device may be displayed on its entity page.
- Check out the latest activity: All of the most recent activity connected to the chosen entity is shown on the entity page. For instance, a user’s entity page may show recent logins and file accesses.
When you create a new rule under Sentinel -> Analytics or edit an existing one, you will find the Entities under Rule logic.
Microsoft Sentinel recognizes the following entity types:
- User account
- Host
- IP address
- Malware
- File
- Process
- Cloud application
- Domain name
- Azure resource
- File hash
- Registry key
- Registry value
- Security group
- URL
- IoT device
- Mailbox
- Mail cluster
- Mail message
- Submission mail
For example, the following are strong identifiers of a User entity:
- Name + UPNSuffix
- AadUserId
- Sid + Host (required for SIDs of builtin accounts)
- Sid (except for SIDs of builtin accounts)
- Name + NTDomain (unless NTDomain is a builtin domain, for example “Workgroup”)
- Name + Host (if NTDomain is a builtin domain, for example “Workgroup”)
- Name + DnsDomain
- PUID
- ObjectGuid
And this presents a weak identifier of an account entity:
- Name
And here is a reference table from Learn on the entities.
You can use the entities to map data fields to them.
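As a sketch of how this looks in practice, an analytics rule query typically projects the fields that will later be mapped to entities under Rule logic. The table and columns below (SigninLogs with UserPrincipalName and IPAddress) are standard Azure AD sign-in log fields, but treat the query as an illustrative assumption, not a production detection:

```kql
// Hypothetical rule query: many failed sign-ins per user and source IP.
// The projected columns are what you would map to entities in the rule.
SigninLogs
| where ResultType != "0"          // non-zero ResultType = failed sign-in
| summarize FailedCount = count() by UserPrincipalName, IPAddress
| where FailedCount > 10
// Example entity mapping in the rule wizard:
//   Account entity -> UserPrincipalName (Name + UPNSuffix, a strong identifier)
//   IP entity      -> IPAddress
```

The point is that each entity in the rule's Entity mapping needs a query column to bind to, so project the identifier fields explicitly.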
Create custom logs in Azure Log Analytics to store custom data
What should you consider when creating a custom log:
- Define the data you wish to gather. It might be challenging to handle and interpret data that has been gathered in excess.
- Don’t complicate the log schema. Use the fewest possible fields and data types. Unless absolutely required, avoid utilizing sophisticated data types like arrays or objects.
- Use descriptive field names. Field names ought to be clear and informative.
- Maintain a standard naming convention. Use a naming scheme that is consistent for both the log and field names.
- Send information in a standardized manner to the log. Ensure that the information you provide to the log complies with the log schema.
Go to Log Analytics -> Tables and choose New custom log, either DCR-based or MMA-based.
Just for clarity:
- MMA is the Microsoft Monitoring Agent (the legacy Log Analytics agent)
- DCR means Data Collection Rule, used by the newer Azure Monitor Agent (AMA)
To better understand what DCR is, see this table from Microsoft.
| Collection type | Configuration options | Description |
| --- | --- | --- |
| Custom logs | Configure custom logs by using the Azure portal; configure custom logs by using Azure Resource Manager templates and the REST API | Send custom data by using a REST API. The API call connects to a data collection endpoint and specifies a DCR to use. The DCR specifies the target table and potentially includes a transformation that filters and modifies the data before it’s stored in a Log Analytics workspace. |
Then select the data collection endpoint and a previously created data collection rule, or create a new one.
Note that _CL is added automatically as suffix.
You need at least the Log Analytics Contributor role to create the new schema.
See from Learn on how to create a custom log and remove it.
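Once the custom table exists, you query it like any other Log Analytics table. The table name `MyApp_CL` and its hourly-count logic below are hypothetical; only the auto-appended `_CL` suffix and the standard `TimeGenerated` column are guaranteed:

```kql
// Query a hypothetical custom table (note the auto-appended _CL suffix)
MyApp_CL
| where TimeGenerated > ago(24h)
| summarize Events = count() by bin(TimeGenerated, 1h)
| order by TimeGenerated asc
```

If the DCR includes a transformation, remember that you are querying the post-transformation shape of the data, not what the API call originally sent.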
Read Jeffrey’s post on AMA and DCR for deeper understanding on the subject.
Query Microsoft Sentinel data by using Advanced SIEM Information Model (ASIM) parsers
Some reasons why you would use ASIM:
- Cross-source detection. Attacks such as brute force or impossible travel between systems, such as Okta, AWS, and Azure, are caught by normalized analytics rules that operate across sources, both on-premises and in the cloud.
- Any source that supports ASIM is immediately included in the coverage of both built-in and bespoke content that uses ASIM, even if the source was added after the content was generated. Process event analytics, for instance, support any data source a client may employ, including Microsoft Defender for Endpoint, Windows Events, and Sysmon.
- Support for your custom sources in built-in analytics
And here is a good visualization from Microsoft on the ASIM framework.
And see here for ASIM schemas from Learn.
Note! If it isn’t clear by now, ASIM parsers are KQL functions.
Now that we have the basic information, we can see how to query.
First, you can open the Sentinel GitHub repository.
For example, under the repo there is this parser that you can deploy.
Click Deploy to Azure
And once deployed, you can find it under Logs and Functions
And then load the Function
And then run the function you just loaded
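Because ASIM parsers are KQL functions, invoking one is just calling the function, optionally with filtering parameters so the filtering happens inside the parser rather than afterwards. The query below uses the built-in unifying DNS parser `_Im_Dns`; the exact parameter set depends on the schema version in your workspace, so treat it as a sketch:

```kql
// Invoke the built-in ASIM DNS unifying parser (a KQL function),
// passing filtering parameters instead of filtering after the fact
_Im_Dns(starttime=ago(1h), responsecodename='NXDOMAIN')
| summarize NxDomainCount = count() by SrcIpAddr
| top 10 by NxDomainCount
```

Passing `starttime` and `responsecodename` into the parser is the recommended pattern, since parameter-aware (filtering) parsers can prune data early and perform better than piping the full normalized output into a `where` clause.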
Here is a Lab for deploying ASIM parsers
And from here you can find an interactive simulation for creating parsers
Develop and manage ASIM parsers
The process of developing parsers could look like this:
- Gather test logs.
- Determine which schema(s) the events coming from the source represent. See Schema overview for further details.
- Connect the fields from the source event to the identified schema or schemas.
- Create a parser or parsers for ASIM for your source. For each schema pertinent to the source, you must create a filtering parser and a parameter-free parser.
- Run a parser test.
- Install the parsers in the workspaces for Microsoft Sentinel.
- The new custom parser should be referenced in the appropriate ASIM unifying parser. Managing ASIM parsers has further details.
- Additionally, you might wish to add your parsers to the main ASIM distribution. Contributed parsers may also be included as built-in parsers in all workspaces.
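To make step 4 concrete, here is a minimal sketch of a parameter-less source-specific parser that maps a custom table onto ASIM DNS schema fields. Everything here is hypothetical: the source table `MyDns_CL`, its columns, and the function name are assumptions for illustration; real parsers follow the ASIM naming conventions and fill many more mandatory schema fields:

```kql
// Sketch of a parameter-less ASIM DNS parser for a hypothetical custom
// table MyDns_CL. Save this as a KQL function (e.g. ASimDnsMyProvider)
// and reference it from the ASIM DNS unifying parser.
MyDns_CL
| project-rename
    SrcIpAddr = ClientIP_s,        // map source fields...
    DnsQuery  = QueryName_s        // ...to ASIM Dns schema column names
| extend
    EventType    = 'Query',        // constant fields required by the schema
    EventProduct = 'MyDnsProvider',
    EventSchema  = 'Dns'
```

For step 4 you would build this twice: once like above, and once as a filtering parser that accepts parameters such as `starttime` and `domain_has_any` and applies them inside the function body.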
Here is an excellent presentation from Microsoft on ASIM parsers and how to query, manage, and develop them.
And here is a list of ASIM content from Learn
Closure
As a recap, you should understand entity mapping and the entity types that you can use:
- User account
- Host
- IP address
- Malware
- File
- Process
- Cloud application
- Domain name
- Azure resource
- File hash
- Registry key
- Registry value
- Security group
- URL
- IoT device
- Mailbox
- Mail cluster
- Mail message
- Submission mail
What should you consider when creating a custom log?
How can custom logs be populated?
What does the ASIM framework look like, what are ASIM parsers (KQL functions), and how do you deploy them?
What is the process of developing your own? What should you consider?