OSQuery, Splunk and PCI

A couple of years ago, Facebook open sourced OSQuery, a tool that lets you run SQLite-style SQL queries against tables containing information about a running Linux or OS X host. One massive advantage of this is that a wide range of system attributes can be queried using a single, uniform syntax; just imagine building (and maintaining!) even a modest-sized bank of queries using native Linux tools, let alone trying to coerce their collective outputs into a common format.
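If you want a quick feel for the syntax before deploying anything, the interactive shell queries the same tables the daemon does. A minimal sketch (table and column names are from the osquery schema):

$ osqueryi
osquery> .tables
osquery> SELECT username, uid, shell FROM users WHERE uid = 0;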


You can check out the available tables here: https://osquery.io/docs/tables/. You'll notice the file_events table, and if you're faced with PCI requirement 11.5, you'll probably find your interest piqued...


PCI 11.5: "Deploy file-integrity monitoring software to alert personnel to unauthorized modification of critical system files, configuration files, or content files, and configure the software to perform critical file comparisons at least weekly."


Previously, this involved either something like OSSEC, or a Wire which an attacker may Trip over... Both are well-established tools, but both have some disadvantages on lean hosts within scalable deployments. The commercial offering by its nature costs money, and can be tricky to configure. OSSEC is an excellent project, but it is very much a full-blown server-client application with its own management and processes, which may not be desirable if you already have config and log management in place.

So then, back to OSQuery. By now, you've hopefully checked out the project docs and can see how it all hangs together. Let's look at a really simple config file:


{
  /* Configure the daemon below */
  "options": {
    "config_plugin": "filesystem",
    "logger_plugin": "filesystem",
    "events_expiry": "3600",
    "verbose": "false",
    "worker_threads": "2",
    "enable_monitor": "true"
  },
  "schedule": {
    "ports": {
      "query": "SELECT * FROM listening_ports;",
      "interval": 300
    },
    "users": {
      "query": "SELECT * FROM logged_in_users;",
      "interval": 300
    },
    "file_events": {
      "query": "SELECT * FROM file_events;",
      "interval": 300
    }
  },
  "file_paths": {
    "my_app": [
      "/var/www/html/my_app/checkout/%%"
    ]
  }
}
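With the config saved, point the daemon at it. The path and flags below are an assumption based on osquery's documented defaults; depending on your osquery version you may also need --disable_events=false for file_events to populate:

$ sudo osqueryd --config_path=/etc/osquery/osquery.conf --verbose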


[disclaimer] As you can see, OSQuery is a highly capable tool, and can return a wealth of information about a host. PCI is a very thorough compliance standard, covering all aspects of payment card security. The example in this blog is very simple, and provided for your interest only. For further information, consult your QSA. Your mileage may vary.


This runs three basic queries:


users: Returns columns from the logged_in_users table, notifying when a user logs in to, or out of, the host. (By default, OSQuery's daemon logs an "added" value when a query returns a previously unseen result, and a "removed" value when a previously seen result is no longer returned — see the sample log line below this list.)


ports: Returns columns from the listening_ports table when ports are added or removed.


file_events: Returns columns from the file_events table when the attributes of a file on the monitored paths change.
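To make the added/removed behaviour concrete, a results line from osqueryd's log looks roughly like this (a hand-written sketch of the documented log format; the hostname and column values are invented for illustration):

{"name":"users","hostIdentifier":"webhost01","calendarTime":"Mon Mar 6 09:14:12 2017 UTC","unixTime":"1488791652","columns":{"user":"alice","tty":"pts/0","host":"10.0.0.5","time":"1488791640","pid":"2471"},"action":"added"}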


The final stanza defines the paths we want to monitor. The % and %% operators are the wildcard and recursive wildcard, respectively.
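For example (hypothetical paths), % matches files directly under a directory, while %% recurses into everything beneath it:

"file_paths": {
  "configs": ["/etc/nginx/%"],
  "webroot": ["/var/www/html/my_app/%%"]
}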


Each time a query is run, the results are output to the default log location, where in our example the Splunk Forwarder picks them up and indexes them as per the inputs.conf below:


[monitor:///var/log/osquery/osqueryd.results.log]
disabled = false
index = test
sourcetype = _json

JSON data... yeah, Splunk knows it.


If you are using Splunk Enterprise Security, you can now create a simple correlation search and generate a notable event on the fields of your choice. As we're talking FIM, let's look at all "added" events where a hash value for a file on the monitored path has changed (and map some fields to their Splunk CIM names so they show up in the notable event):


index=test sourcetype=_json name=file_events action="added" "columns.action"="updated"
| `get_event_id`
| stats values("columns.sha256") as "file_hash",
        values("columns.target_path") as "path",
        values("columns.mode") as "file_permission",
        values("hostIdentifier") as "orig_host",
        values("action") as "action",
        values("calendarTime") as "file_modify_time",
        values("columns.size") as "file_size",
        values("event_id") as "orig_event_id"
| `map_notable_fields`


The query returns all file change attributes, but we are only alerting on the "updated" events where a new hash value is added.
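If you don't run Enterprise Security, the same detection works without the ES macros (`get_event_id` and `map_notable_fields` are ES conveniences); a plain-Splunk sketch might be:

index=test sourcetype=_json name=file_events action="added" "columns.action"="updated"
| table _time, hostIdentifier, "columns.target_path", "columns.sha256", "columns.mode", "columns.size"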


So great, we now have a notable event which tells us when a monitored file's contents have changed, or a new file (ergo, a new hash) has been created. Wouldn't it be nice to see what happened prior to that? Well, remember the inputs.conf? We're bringing in all OSQuery results as the same sourcetype, and as the column names are consistent, we can return some interesting fields in a simple drill-down search, with stats sorted by time to form a quick and dirty timeline:


index=test sourcetype=_json name=* name!=info name!=rpm_packages "hostIdentifier"=$orig_host$
| `get_event_id`
| table "name", "hostIdentifier", "columns.port", "columns.address", "columns.user",
        "columns.target_path", "columns.mode", "columns.sha256", "columns.host",
        "columns.action", "columns.name", "columns.tty", "columns.pid",
        action, "calendarTime", "event_id", "unixTime"
| stats values(*) as * by event_id
| sort - "unixTime"
| fields - event_id


This uses (in addition to fields higher up in the JSON objects) fields from:


users: columns.user, columns.pid, columns.tty, columns.address, columns.host
ports: columns.port, columns.address, columns.pid
file_events: columns.target_path, columns.mode, columns.sha256


We're also using get_event_id to create a unique ID for each matching event. Beware of creating "where columns.foo=bar" conditionals: ALL results from tables which have no "columns.foo" key will be excluded.
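One way around that pitfall (a sketch on my part, not part of the original searches) is to fill the missing key with a placeholder before filtering, so events from tables lacking the column survive:

index=test sourcetype=_json
| fillnull value="n/a" "columns.action"
| search "columns.action"="updated" OR "columns.action"="n/a"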


Run this up, and we can see:

Timelines rule, dashboards drool. 

1. The attacker logs into the host, possibly using a stolen credential?
2. The file payment.html is changed.
3. The file newpayment.html is created, along with a PHP shell, because why not.
4. A listening port is opened, maybe to land shells from pivots to other hosts?


This, of course, is just the start. Want to see shell history? Add queries from the shell_history table or the more detailed user_events table. Processes? Yup, there's a table for that.
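Extending the schedule is just another stanza; a sketch (the interval is arbitrary):

"shell_history": {
  "query": "SELECT * FROM shell_history;",
  "interval": 300
}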


Try some queries, then play the attacker and watch the results roll in. Keep tweaking the queries until you have enough data for a really nice timeline.

[edit] OK then, who spotted the deliberate mistake? The payment.html page didn't exist before I started this demo, so you only saw the empty file get created in the previous screenshot. I logged in again (just for you, you lucky people) and tampered with the payment.html file. You can see my login, the "removed" event for the old hash, and the "added" event for the new one:



[another edit] Did you spot another strange thing? There seem to be more events than lines in the stats table. This is thanks to 'get_event_id': each matching event gets a hash value as its event ID, and duplicate events with the same hash value are assumed to be the same event. Screenshots are from my development setup, where something in forwarding or indexing is causing duplicated events.
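If you hit the same duplication, one workaround (an assumption, not something from my original setup) is to dedup on the generated ID before building the timeline:

index=test sourcetype=_json
| `get_event_id`
| dedup event_id
| sort - "unixTime"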


G.
