Monday, January 27, 2014

Logstash configuration for collecting OpenAM and OpenIDM logs


Following on from my previous post, here is a logstash configuration that collects logs from both OpenAM and OpenIDM and feeds them into Elasticsearch:



 input {
  # OpenIDM reconciliation audit log
  file {
   type => "idmRecon"
   start_position => "beginning"
   path => "/opt/openidm/audit/recon.csv"
  }
  # OpenIDM activity audit log
  file {
   type => "idmActivity"
   start_position => "beginning"
   path => "/opt/openidm/audit/activity.csv"
  }
  # OpenAM authentication logs
  file {
   type => "amAccess"
 #  start_position => "beginning"
   path => "/opt/openam/openam-config/openam/log/amAuthentication.*"
  }
 }
 filter {
     # parse OpenIDM reconciliation audit entries
     if [type] == "idmRecon" {
         csv {
             columns => [
                     "_id","action","actionId","ambiguousTargetObjectIds","entryType","message","reconciling","reconId",
                     "rootActionId","situation","sourceObjectId","status","targetObjectId","timestamp"
                     ]
         }
         date {
             match => ["timestamp", "ISO8601"]
         }
     }
     # parse OpenIDM activity audit entries
     if [type] == "idmActivity" {
         csv {
             columns => [
             "_id","action","activityId","after","before","changedFields","message","objectId","parentActionId",
             "passwordChanged","requester","rev","rootActionId","status","timestamp"
             ]
         }
         date {
             match => ["timestamp", "ISO8601"]
         }
     }
     # parse OpenAM authentication log entries and geo-locate the client IP
     if [type] == "amAccess" {
         csv {
             columns => ["time","Data","LoginID","ContextID","IPAddr","LogLevel",
                 "Domain","LoggedBy","MessageID","ModuleName","NameID","HostName"]
             separator => " "
         }
         date {
             match => ["time", "yyyy-MM-dd HH:mm:ss"]
         }
         geoip {
             database => "/usr/share/GeoIP/GeoIP.dat"
             source => ["IPAddr"]
         }
     }
 }
 output {
  # stdout with the rubydebug codec shows what logstash makes of each event
  # (handy while debugging; remove in production)
  stdout {
   codec => rubydebug
  }
  elasticsearch { embedded => true }
 }
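
To try this out, save the configuration to a file and point logstash at it. A minimal sketch, assuming the flat-jar distribution (the config path here is illustrative):

 # run logstash as an agent with the configuration above
 java -jar logstash.jar agent -f /opt/logstash/logstash.conf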



Now we can issue Elasticsearch queries across all of the data sets. Here is a very simple Kibana dashboard showing events over time and their source:

[Kibana dashboard screenshot: events over time, broken down by source]

While this configuration is quite basic, it allows us to find and correlate events of interest across OpenAM and OpenIDM.

Try searching for a sample user "fred" by entering the string into the top search box. You will see all OpenAM and OpenIDM events that contain this string in any field. You can of course build more specific queries, but the default free-form search does an excellent job.
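
You can also issue the same free-form query directly against the Elasticsearch REST API. A minimal sketch, assuming the embedded Elasticsearch instance is listening on its default port (9200) and the standard daily logstash-* indices:

 # free-form search for "fred" across all logstash indices
 curl 'http://localhost:9200/logstash-*/_search?q=fred&pretty=true'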

Thursday, January 9, 2014

Collecting OpenAM logs with logstash


Logstash is a general-purpose log collector that can read, transform, and ship logs from a variety of sources.

The following logstash configuration will collect OpenAM access logs. The default target here is Elasticsearch, a document-oriented NoSQL database optimized for text search (perfect for log files).

In a future blog post I will show you how to use Kibana to make some sexy charts of your access data.

 input {
  file {
   type => "amAccess"
   start_position => "beginning"
   path => "/path_to_your_install/openam/openam/log/amAuthentication.access"
  }
 }
 filter {
     if [type] == "amAccess" {
         csv {
             columns => ["time","Data","LoginID","ContextID","IPAddr","LogLevel",
                 "Domain","LoggedBy","MessageID","ModuleName","NameID","HostName"]
             separator => " "
         }
         # the csv filter names the timestamp column "time", so the date
         # filter must reference that same field
         date {
             match => ["time", "yyyy-MM-dd HH:mm:ss"]
         }
         geoip {
             database => "/path_to_your/GeoIP.dat"
             source => ["IPAddr"]
         }
     }
 }
 output {
     # feed parsed events into the embedded Elasticsearch instance
     elasticsearch { embedded => true }
 }
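
Once logstash is running, a quick way to confirm that events are actually being indexed is a count query. This assumes the embedded Elasticsearch instance is listening on its default port (9200):

 # count the OpenAM access events indexed so far
 curl 'http://localhost:9200/_count?q=type:amAccess&pretty=true'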


Here is an upstart config file to start logstash:

 # logstash - indexer instance  
 #  
 description   "logstash indexer instance"  
 start on virtual-filesystems  
 stop on runlevel [06]  
 respawn  
 respawn limit 5 30  
 limit nofile 65550 65550  
 # set HOME to point to where you want the embedded elasticsearch  
 # data directory to be created and ensure /opt/logstash is owned  
 # by logstash:adm  
 env HOME=/opt/logstash  
 #env JAVA_OPTS='-Xms512m -Xmx512m'  
 chdir /opt/logstash  
 setuid ubuntu  
 setgid ubuntu  
 #setuid logstash  
 #setgid adm  
 console log  
 # for versions 1.1.1 - 1.1.4 the internal web service crashes when touched  
 # and the current workaround is to just not run it and run Kibana instead  
 script  
     exec /opt/java/bin/java -jar logstash.jar agent -f /opt/logstash/access.conf --log /opt/logstash/log.out   
 end script
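
To install the job, copy the file into /etc/init/ and control it with the standard upstart commands (assuming Ubuntu; the job name follows the file name):

 sudo cp logstash.conf /etc/init/logstash.conf
 sudo start logstash
 sudo status logstash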