Using CIF to create content for ArcSight – Part 1

If you use ArcSight, hopefully by now you have come across the great ArcOSI project for generating content for use within ArcSight. I have used it in the past and liked it, but I found myself having to look for more context around the alerts it generated. I recently came across the Collective Intelligence Framework (CIF), and I really like that it aggregates many intel sources like ArcOSI does and that it stores the data from each intel source; I think it too can be a great source of content for ArcSight. I have previously blogged about integrating CIF and ArcSight, but that was just using CIF as a tool for looking up data from within ArcSight, not using CIF to create content to be used by ArcSight.

EDIT: 6/10/2012 If you haven’t seen @kylemaxwell’s post Introduction to the Collective Intelligence Framework, I highly recommend checking it out!


I think the content CIF can provide could be great for ActiveLists and Correlation rules on those active lists. I came up with a few possible scenarios on how this content could be used:

  • Malicious Domain Queries – DNS Logs
  • Malicious Domain Web Traffic – Proxy Logs
  • Malicious IP Traffic – Firewall/Proxy Logs
  • Scanner Traffic – SSH/Firewall Logs

For the Scanner Traffic, instead of reporting on the noise of someone knocking on your door, you might report only on traffic that was accepted (meaning authentication happened), but that is up to you.

I have been working on a Python script that assumes you are using the CIF Perl client to generate feed data in CSV format. The script parses the CSV files and, like ArcOSI, sends the entries to ArcSight as CEF over syslog. I have posted the script and a quick tutorial on it over at the Google Code project cif-csv-parse-to-cef.
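The core of such a script can be sketched in a few lines. This is a minimal illustration, not the real cif-csv-parse-to-cef code: the CSV column names (address, source, confidence, description) and the exact CEF field mapping are assumptions here, so check the project page for the actual script.

```python
import csv
import io
import socket

def row_to_cef(row):
    """Build a CEF message from one CIF CSV row (hypothetical field mapping)."""
    return (
        "CEF:0|CIF|CIF 0.1|100|1|CIF Malicious Domain|1|"
        "shost={address} cs1={source} cs1Label=Source "
        "cs2={confidence} cs2Label=ConfidenceLevel "
        "cs3={description} cs3Label=Description".format(**row)
    )

def send_feed(csv_text, host, port=514):
    """Parse the CSV text and send each row as CEF over UDP syslog."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for row in csv.DictReader(io.StringIO(csv_text)):
        # "<29>" is the syslog priority header (facility 3, severity 5)
        sock.sendto(("<29>" + row_to_cef(row)).encode(), (host, port))
```

The nice part of this approach is that any syslog SmartConnector can pick the messages up; no custom parser work is needed on the ArcSight side.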

A quick example for this post will be to generate the domain/malware feed using the medium severity and a confidence level of 85, send it to ArcSight, and have ArcSight add the feed data to an Active List. Part 2 of this post will cover writing a correlation rule to monitor the Active List for actionable data.

Let’s start by first creating the Active List and the Correlation Rule to populate the Active List:

In the Navigation Panel, go to Active Lists, right click your personal folder, and select New Active List.

New Active List

Next, in the Inspect/Edit Panel, modify the Active List to meet your needs. In this example it is named “Malicious Domains”, it does not expire, and 100,000 entries are allowed (these settings can be changed later). Now set the fields the Active List will use. I have entered:
Domain, Source, Confidence, Description

Active List Edit Panel

Click Apply and all that is left is to create the correlation rule to populate the Active List.

New Rule

Next add a name for your rule

Rule Name

Then click on the Conditions field and create the following filter.

Rule Conditions

Next, click on the Actions tab and make sure you deactivate the On First Event trigger. Then activate the On Every Event trigger.

Deactivate Trigger

After activating the On Every Event trigger, right click and select Add -> Active List -> Add to Active List.

Select the Active List you previously created, in this case Malicious Domains.

Select Active List

After selecting the Active List you will have to map ArcSight event fields to the corresponding Active List fields.

Active List Action

Once you click OK, you will most likely get a pop-up message similar to this asking if you want to add all the ArcSight fields you mapped in the previous step to the aggregation tab. Click Yes; if you don’t, your Active List will be blank after the rule fires.

Aggregation Question

Now deploy the rule as a real time rule. Your account will need privileges to do that; if you don’t have them, ask your ArcSight admin to deploy the rule for you.

Now that the rule and Active List have been created, let’s generate content for the rule to populate the Active List with.

Let’s start by generating the csv:

$ cif -q domain/malware -s medium -c 85 -p csv > dom_malware.csv

Now run the cifcsv.py script

$ ./cifcsv.py -f dom_malware.csv -s 192.168.100.154 -p 514 -t Domain

You will see output on the screen similar to this

<29>CEF:0|CIF|CIF 0.1|100|1|CIF Malicious Domain|1|shost=7daily-homebusiness7.net cs1=www.spamhaus.org/sbl/sbl.lasso?query=sbl112756 cs1Label=Source cs2=85 cs2Label=ConfidenceLevel cs3=malware cs3Label=Description
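For reference, everything after the seventh pipe in that message is the CEF extension: space-separated key=value pairs. A rough parser (illustrative only, and naive in that it assumes no spaces inside values) shows how those keys line up with the Active List columns we created earlier:

```python
def parse_cef_extensions(msg):
    """Split a CEF message's extension part into a dict (naive space split)."""
    extensions = msg.split("|", 7)[7]  # everything after the 7th pipe
    return dict(kv.split("=", 1) for kv in extensions.split())

msg = ("<29>CEF:0|CIF|CIF 0.1|100|1|CIF Malicious Domain|1|"
       "shost=7daily-homebusiness7.net "
       "cs1=www.spamhaus.org/sbl/sbl.lasso?query=sbl112756 cs1Label=Source "
       "cs2=85 cs2Label=ConfidenceLevel cs3=malware cs3Label=Description")

fields = parse_cef_extensions(msg)
# shost -> Domain, cs1 -> Source, cs2 -> Confidence, cs3 -> Description
```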

Now if you have an Active Channel up and running with a filter for Device Vendor = CIF and Name = CIF Malicious Domain, you should see something similar to this.

CIF Active Channel

Now if you right click your Active List and select Show Entries, you should see that your Active List is being populated with data.

Populated Active List

This concludes Part 1 – Part 2 will cover writing a correlation rule to monitor the Active List for actionable data.

Happy Hunting!


CIF Integration with ArcSight

I have been playing with and testing the Collective Intelligence Framework (CIF), and after seeing these great posts by Martin Holste and Brad Shoop on integrating CIF into ELSA and Splunk, I got motivated to do the same thing with the ArcSight ESM console. EDIT: 6/10/2012 If you haven’t seen @kylemaxwell’s post Introduction to the Collective Intelligence Framework, I highly recommend checking it out!

There are several steps to integrating CIF with ArcSight. Before you start, make sure you know the following:

  • CIF API Key
  • CIF API url

To start off, go to the Integration Commands Navigation Panel, right click your personal Integration Commands folder, and select New Command.

New Integration Command

From there in the Inspect/Edit Panel you will want to select URL as the type of command:

Selecting New Command Type URL

Give the new command a name; for this example I used CIF (original, huh?). Then double click the URL field and enter the CIF API URL and your CIF API key. In the screenshot below you will also see $selectedItem; this is the field that will get populated by whatever you select in the ArcSight Console. Once you have your API URL, key, and $selectedItem set, click OK, then click Apply.

You are now halfway there :). Your command is set, but you still need to create the Integration Command Configuration. In the Navigator Panel, click the Configuration tab, right click your Integration Configuration folder, and select New Configuration.

New Integration Configuration

In the Inspect/Edit Panel you can now set the name of the configuration and whether it will be rendered in an internal or external browser.

Configuration Name and Browser

Next, click on the Context tab and select which ArcSight contexts you want this CIF search enabled for. In this example I have selected the Editor and Viewer locations. You may choose others, so play around and see what works best for you.

Config Context Examples

Next, click the Commands tab, click Add, and select the command you created earlier, in this case the CIF command.

Selecting Config Command

Click OK then Apply and your Integration Command is ready to go.

Fire up an Active Channel that you use, select either an IP address field or a host name field, then right click and select Integration Commands -> CIF. In the example below we have right clicked an Attacker Host Name and selected the CIF Integration Command.

Selecting Integration

In a few seconds your external browser (if that is what you chose and configured) should load, and you should see something similar to this:

CIF Query in External Browser

Now you are done, and you have CIF integrated into your ArcSight Console. If you have tried out Martin’s CIF-REST-Sphinx addon, you could configure that as an Integration Command as well.

Now go have fun hunting!

Hunting: Finding lateral movement using Snare and ArcSight Logger

Once again I received inspiration for this post from the Mandiant M-Trends 2012: An Evolving Threat report, and from reflecting on a previous work engagement where the attackers leveraged lateral movement to move around and deeper into the network. On page 12 the report highlights attackers leveraging at.exe (task scheduler) to install malware and take control of systems. This post will hopefully help you get an idea of what your current scheduled tasks look like and get you thinking about ways to find badness when it occurs. Yes, it will occur!

In the M-Trends example the attacker first creates a NetBios session and then runs the at.exe command to schedule the malware they previously uploaded over the NetBios session. These two steps should create some events in the Windows event logs, assuming auditing is turned on. For the NetBios connection, an event ID of 540 should be created on XP and Server 2003 systems, and an event ID of 4624 on Vista and Server 2008 systems. For at.exe, it is event ID 602 on XP and Server 2003 systems and event ID 4702 on Vista and Server 2008 systems.
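To keep those IDs straight when building searches later, a small lookup table helps. The event IDs below come straight from the paragraph above; the dict layout itself is just one convenient way to organize them:

```python
# Windows event IDs for the two steps of the M-Trends at.exe scenario,
# keyed by OS generation (pre-Vista vs. Vista/2008 and later)
EVENT_IDS = {
    "network_logon": {       # the NetBios session being established
        "xp_2003": 540,
        "vista_2008": 4624,
    },
    "scheduled_task": {      # the at.exe job being created
        "xp_2003": 602,
        "vista_2008": 4702,
    },
}

def ids_for(step):
    """Return all event IDs to search for a given step, across OS versions."""
    return sorted(EVENT_IDS[step].values())
```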

To get to the point where you can actually hunt for at.exe events, you must do a little leg work.

Set the audit policy:

At a minimum you need a few things turned on in the Local Security Policy for auditing. You will want to enable Success and Failure audits for the following audit policies:

  • Audit account logon events
  • Audit logon events
  • Audit Object Access

It should be noted that, where possible, you should turn on as many of the available audit policies as your environment allows. Having these logs helps find not only badness but also misconfigurations and other issues that might creep up. They are easy to set up and push out via Group Policy Objects (GPO), but make sure you watch for changes to the GPO in your logs. Some attackers have been known to modify GPOs and turn settings off.

Now that you have your Auditing Policy in place you need to enable logging.

Enabling Logging:

For central Windows logging that should work with almost any commercial or open source central log collection tool, I recommend using Snare as your agent for getting the logs from your Windows systems to whatever central log system you have. You do have one, right?

The install for Snare is pretty straightforward and is covered well in their documentation, as is adding a remote syslog host, so I won’t cover that here. What I will cover is one minor addition I have found needs to be made to capture and send at.exe related event logs. Start by logging in to the Snare configuration page and select Objectives Configuration on the left hand side. When editing the auditing configuration, here is what I have used in my testing to get the logs I am interested in for this hunting trip:

Sample Snare Config

After adding the configuration above, go to the left navigation bar and select Apply the Latest Audit Configuration. Now you may want to test by creating an at.exe event on the system you just applied this configuration to. To test it you will need to be either a domain admin or a local admin of the system. A sample test you could run from the command prompt is:

at.exe \\srv1 07:30 cmd /c ping.exe 8.8.8.8

Replace srv1 with your host name and change what is after cmd /c to something you want the system to execute. The command above creates a task that runs at 7:30 in the morning and executes a ping to 8.8.8.8.

Now go back to your Snare Configuration and look at the Latest Events and you should see near the top the scheduled task you just created.

Hunting with Logger

Now that you have configured the audit policy and Snare, it’s time to go hunting for the logs. In this hunt we are using the free version of ArcSight Logger (in future posts I will explore using Snare, ELSA, and maybe a few other tools). I am going to assume your Logger instance is already set up and you have a SmartConnector in place to receive logs from Snare.

Quick Initial Search

A quick and dirty search for scheduled tasks is as simple as entering the filter below and hitting Go!:

Logger Search Filter

Search: (externalId=602 or externalId=4702)

Now if you have any hits you might get output similar to this:

Logger Search Results

Now that you have results, you should go do a search for network logon events (event ID 540 or 4624) around those times to see where the commands originated from. This will help you find lateral movement.
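If you pull the results out of Logger, that pivot can also be scripted. The sketch below assumes you have already parsed events into dicts with a timestamp, an event ID, and a source host (those field names are illustrative, not Logger's own); it pairs each scheduled-task event with logon events seen shortly before it:

```python
from datetime import datetime, timedelta

TASK_IDS = {602, 4702}    # at.exe scheduled-task event IDs
LOGON_IDS = {540, 4624}   # network logon event IDs

def correlate(events, window_minutes=5):
    """Pair each scheduled-task event with logons seen shortly before it.

    `events` is a list of dicts with 'time' (datetime), 'event_id', and
    'source_host'. Returns (task_event, [nearby logon events]) pairs, the
    raw material for spotting lateral movement.
    """
    window = timedelta(minutes=window_minutes)
    tasks = [e for e in events if e["event_id"] in TASK_IDS]
    logons = [e for e in events if e["event_id"] in LOGON_IDS]
    return [
        (t, [l for l in logons if t["time"] - window <= l["time"] <= t["time"]])
        for t in tasks
    ]
```

A logon from an unexpected workstation immediately before a scheduled-task creation is exactly the pattern the M-Trends report describes.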

What to do after your initial search?

We have run our initial search, and hopefully all of your events are ones that were planned and not ones created by someone on your network. Perhaps you don’t want to run this query every day, or maybe you don’t want to log in every day to run it. You could quickly turn this into a report, have it run at a set interval, and have it email you the results. Below I will quickly cover how to create the query and report that is needed on Logger.

Creating the Query:

Let’s create a quick and dirty query and report that can be touched up later if needed 🙂

Under the Reports function tab on the left hand side of Logger, go to Design and select Queries, then click Add New at the top. Give it a name, and you could start with a query similar to this:

select events.arc_endTime AS 'Time', events.arc_name AS 'Name', events.arc_destinationUserName AS 'Dest. User Name', events.arc_destinationHostName AS 'Dest. Host Name', events.arc_message AS 'Message' from events where (events.arc_externalId = 602 OR events.arc_externalId = 4702) group by events.arc_endTime

Below is what my quick Query Object looks like:

Logger Query

After creating the query you will need to create the report: simply give it a name, select the query you just created, and then select the fields you want displayed. Save the report and run it. Here is what my quick and dirty report design looks like.

Logger Report

Depending on your Report Start and End Times you might get data similar to your quick logger search above.

Report output

You should now have a way to find badness if there was any (assuming you have historical logs), or at least a path toward monitoring for badness and responding faster.

If you have other suggestions or tricks you use for these types of searches, I would love to see them. We are all one big community; let’s help each other out where we can.

As always Happy Hunting!