Alerts

Alerting can be configured in two ways.

  • Insight email alerts - The simplest and recommended approach that sends email notifications when an Insight is created. Configured as part of the Insight Monitor Configuration.
  • Native Splunk alerts - Allows use of additional alert actions, and alerting on more granular levels of detail in the Prelert results. Documented on this page.

Insight email alerts

Insight email alerts are the simplest to configure and are the recommended option.

An Insight is a collection of anomalies that tells the story of your data. They can be automatically created based upon anomalous activity performed by a particular influencer (e.g. a user) or when anomalies occur close together in time.

The email alert is sent when an Insight is automatically created or significantly updated.

The HTML alert email includes a link to the Insight View (which contains a visual timeline of anomalies) along with useful information for troubleshooting.

For Insight email alerts, please jump straight to Insight Monitor Configuration.

Native Splunk alerts

Native Splunk alerting is much more complicated to configure as it works directly off the data in the prelertresults index. Alerts can be triggered for the following levels of summarization:

  • Insights - This is the most summarized level of alerting and is similar to Insight alerts described above; however, this allows the use of Splunk alert actions such as Run a script or Webhook.
  • Summary - Summarized alerting based upon the anomalous activity for a type of influencer. For example, alert when clientip values behave strangely.
  • Detailed - Detailed alerting based upon the anomalous activity for a specific influencer. For example, alert when the user Bob is unusual.

Some general notes on alert configuration are below, followed by detailed steps for each of the above. The example searches and text included in the configuration steps are based around data from a fictional flight comparison website, and should be modified to suit the data sets you are alerting against.

For further assistance on configuring alerts please contact Prelert Support.

General Notes on Configuration

  • The first step in configuration is to run the supplied search in the Search view of the Prelert app, then use the standard Splunk Save As > Alert wizard to set up the alerting search.
  • We recommend use of the Scheduled Splunk alert type rather than Real Time alerts. See http://docs.splunk.com/Documentation/Splunk/latest/Alert/Definescheduledalerts for further information on scheduled alerts.
  • The alert search should run at least as frequently as the bucketspan of the Anomaly Search(es) being monitored. For example, if the bucketspan of an Anomaly Search is 10 minutes, it is fine to schedule the alert search to run every 1 minute if desired, as long as the window of time the alert search runs over is wide enough to catch events from the Anomaly Search.
  • Emails generated by the alerts will include links to views in the Prelert app. We encourage using these links rather than Splunk’s Triggered Alerts list or the links to view the results in the Search view, as the views in the Prelert app provide a rich user interface for exploring the data.
  • Splunk allows you to use token substitution to insert fields from the job, results and server environment into various parts of the email, including the subject and message - see http://docs.splunk.com/Documentation/Splunk/latest/Alert/Emailnotification for further details.
  • Make sure that any result fields used for token substitution are included as-is in the list of fields in the table command used to build the results table included in the email, i.e. without subsequent rename commands (see the sketch after this list).
  • The generated email will be in HTML format. Do not insert HTML markup when configuring the message input, since whatever you put in the form input field goes as-is into the body of the email.
  • For full details on configuring alerts in Splunk, see http://docs.splunk.com/Documentation/Splunk/latest/Alert/Aboutalerts.
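
As an illustration of the token substitution and table requirements above, the following fragments are a sketch based on the Insight alerting search later on this page. The description field is renamed to insight_description before the table command, the table command then lists insight_description unchanged, and the email subject references it as $result.insight_description$:

... | rename description as insight_description | ... | table start, insight_description, rule_description, score, status, link, insight_id

Splunk Alert: $name$ - $result.insight_description$

A rename applied after the table command, or a field omitted from the table command, would leave the corresponding $result.<field>$ token unpopulated in the email.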

Native Splunk Alerts - Insights

Alerts can be configured to notify users of new Insights, whether they have been created automatically or manually in the Prelert app. Emails will be generated with summary information on the Insight including the description, score and a link to the Insight View.

Note that for automatically created Insights, there can be a lag between the time of the anomalies which triggered the Insight and the generation of the alert; the length of this lag depends on the Insight Monitor frequency, and on the bucketSpan and bufferSpan of the related Anomaly Searches.

Switch to the Search page of the Prelert app and enter the search to be used for alerting in the form:

index=prelertresults prelert.resulttype=insight (`PrelertSearchGroupTerms("*","prelert.searchnames{}")`) | `PrelertInsightLookup` | search description="*" score >= 1 (status="new" OR status="open") | eval score=floor(score) | eval start=strftime(start_time, "%b %e, %Y %H:%M:%S") | rename description as insight_description | eval link="http://localhost:8000/en-US/app/prelert/prl_insight_view?form.insightid=".'prelert.id'."&earliest=".start_time."&latest=now" | eval insight_id='prelert.id' | table start, insight_description, rule_description, score, status, link, insight_id

The following parts of the search should be set to appropriate values:

  • Splunk host and port used in the link to the Insight View.
  • Insight score filter threshold - a value from 1 to 100. Set to 0 to be alerted for all Insights (see the sketch after this list).
  • Note that the latest time used for the Insight View search is set to now, as the _time field of an Insight is set to the time of the latest anomaly, and anomalies may have been added to the Insight since the alert was triggered.
  • Optionally filter for Insights with anomalies from a particular Anomaly Search group e.g. to alert on Insights with searches from security_group, the first part of search pipe would be:
index=prelertresults prelert.resulttype=insight (`PrelertSearchGroupTerms("security_group","prelert.searchnames{}")`)
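
Similarly, to raise the score threshold and point the link at a different Splunk instance, adjust the score filter and the link eval. As a sketch only (splunk.example.com:8000 and the threshold of 75 are placeholder values), the relevant parts of the search would become:

index=prelertresults prelert.resulttype=insight (`PrelertSearchGroupTerms("*","prelert.searchnames{}")`) | `PrelertInsightLookup` | search description="*" score >= 75 (status="new" OR status="open") | ... | eval link="http://splunk.example.com:8000/en-US/app/prelert/prl_insight_view?form.insightid=".'prelert.id'."&earliest=".start_time."&latest=now" | ...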

Run the Search over the past 24 hours, and then click Save As > Alert, next to the search bar time picker, to open the Splunk Save As Alert wizard.

Creating Insight alert

Step 1 of the Save As Alert wizard for creating an Insight alert

Creating Insight alert

Step 2 of the Save As Alert wizard for creating an Insight alert

  • Enter earliest/latest times according to the longest (bucketSpan*2) + bufferSpan of all the Anomaly Searches off which you are generating Insights, plus the Insight Monitor frequency.

    • earliest = (bucketSpan*2) + bufferSpan + insight monitor frequency
    • latest = bucketSpan + bufferSpan + insight monitor frequency
    • e.g. longest Anomaly Search has 10 minutes bucket span and 2 minute buffer span, insight frequency is 10 minutes - earliest: -32m@m, latest: -22m@m
  • Pick an appropriate cron schedule. We recommend that the alert search runs at least as frequently as the bucketSpan used above (see the sketch after this list).

  • Message input which forms the body of the email is:

    The alert condition for ‘$name$’ was triggered due to the creation of a new Prelert Insight with description ‘$result.insight_description$’. You can view details on the Insight in the Insight View in the Prelert app at $result.link$.

  • The Insight description can be included in the subject, for example by entering:

    Splunk Alert: $name$ - $result.insight_description$

  • Throttling on insight_id ensures you get just one alert, when the Insight is first created, and not, for example, whenever new anomalies are added to the Insight.

  • For more information on configuring a scheduled alert see the Splunk docs at http://docs.splunk.com/Documentation/Splunk/latest/Alert/Definescheduledalerts.
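
The timing, throttling and email settings chosen in the wizard are stored as a scheduled saved search. The following stanza is an illustrative sketch only, assuming the -32m@m/-22m@m window and 10 minute schedule from the example above; the stanza name, recipient address and suppression period are placeholders, and the exact settings written to savedsearches.conf may vary with your Splunk version:

[Prelert Insight Alert]
# Run every 10 minutes over a window wide enough to catch newly created Insights
enableSched = 1
cron_schedule = */10 * * * *
dispatch.earliest_time = -32m@m
dispatch.latest_time = -22m@m
# Throttle on insight_id so only the first alert for each Insight is sent
alert.suppress = 1
alert.suppress.fields = insight_id
alert.suppress.period = 24h
# Email action; the subject uses the token substitution described in the General Notes
action.email = 1
action.email.to = ops-team@example.com
action.email.subject = Splunk Alert: $name$ - $result.insight_description$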

An email will be sent for each new Insight that is created in the format:

Insight Alert email

Sample email that is generated by an Insight alert

Clicking on the link will show the new Insight in the Insight View:

Insight View for farequote insight

Native Splunk Alerts - Summary

To be notified when a high anomaly score is observed across an influencer field, summary alerts should be configured. The Prelert analytics ensure these will be rate limited, and the email generated will include a link to the Anomaly Search Results view to explore the anomalies for that influencer type.

Switch to the Search page of the Prelert app and enter the search to be used for alerting in the form:

index=prelertresults prelert.resulttype=influencerbucket prelert.searchname="farequote" prelert.influencerfield="airline" prelert.initialscore >= 50 | eval searchname='prelert.searchname' | eval influencerfield='prelert.influencerfield' | eval score=floor('prelert.initialscore') | eval linklatest=_time+1 | eval linkearliest=_time-14400 | eval link= "http://marple:10000/en-US/app/prelert/prl_anomaly_search_results?form.minThreshold=25&form.searchname=farequote&form.influencertype=airline&earliest=".linkearliest."&latest=".linklatest | table _time, searchname, influencerfield, score, link

The following parts of the search should be set to appropriate values:

  • Anomaly Search name in the prelert.searchname field.
  • Name of the influencer field in prelert.influencerfield. If the search has multiple influencers then you need to consider which one you want to be alerted on. If you do want to be alerted on multiple influencers then set up a separate summary alert search for each (see the sketch after this list).
  • Score threshold via the prelert.initialscore field. We recommend starting off with a threshold of 50, i.e. major severity and above.
  • Splunk host and port used in the link to the Anomaly Search Results view.
  • form.minThreshold in the link field determines the anomaly score threshold used when opening the Anomaly Search Results view. Set it according to the threshold you set in the original search, or you may want to set it to 25 (minor), or even 0 (warning) regardless.
  • form.searchname and form.influencertype in the link field.
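
For example, here is a sketch of the same search adapted to a hypothetical Anomaly Search named weblogs with clientip as the influencer field (the host splunk.example.com:8000 is also a placeholder):

index=prelertresults prelert.resulttype=influencerbucket prelert.searchname="weblogs" prelert.influencerfield="clientip" prelert.initialscore >= 50 | eval searchname='prelert.searchname' | eval influencerfield='prelert.influencerfield' | eval score=floor('prelert.initialscore') | eval linklatest=_time+1 | eval linkearliest=_time-14400 | eval link="http://splunk.example.com:8000/en-US/app/prelert/prl_anomaly_search_results?form.minThreshold=25&form.searchname=weblogs&form.influencertype=clientip&earliest=".linkearliest."&latest=".linklatest | table _time, searchname, influencerfield, score, link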

Run the Search over the past 24 hours, and then click Save As > Alert, next to the search bar time picker, to open the Splunk Save As Alert wizard.

Creating Summary alert

Step 1 of the Save As Alert wizard for creating a Summary alert

Creating Summary alert

Step 2 of the Save As Alert wizard for creating a Summary alert

  • You need to enter earliest/latest times according to the bucketSpan and bufferSpan of the Anomaly Search off which you are alerting.

    • earliest = (bucketSpan*2) + bufferSpan
    • latest = bucketSpan + bufferSpan
    • e.g. 10 minutes bucket span, 2 minute buffer span, alert every 1 minute - earliest: -22m@m, latest: -12m@m
  • Pick an appropriate cron schedule. We recommend that the alert search runs at least as frequently as the bucketSpan of the Anomaly Search.

  • Message input which forms the body of the email in this example is set to:

    The alert condition for ‘$name$’ was triggered due to Prelert detecting anomalies in farequote response times influenced by one or more airlines. View the anomalies influenced by airlines in the Anomaly Search Results view in the Prelert app by clicking on the links in the table below.

  • Configure an appropriate Throttling interval if desired. For example, you may not want to be alerted again for this search for 1 hour (see the sketch after this list).

  • For more information on configuring a scheduled alert see the Splunk docs at http://docs.splunk.com/Documentation/Splunk/latest/Alert/Definescheduledalerts.
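
For example, a 1 hour throttle corresponds to the following settings in the saved search (an illustrative sketch; the stanza name is a placeholder):

[Prelert Summary Alert]
# Suppress re-triggering of this alert for 1 hour after it fires
alert.suppress = 1
alert.suppress.period = 60m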

The user will receive one email per alert, showing the time, search name and score, with a link to the Anomaly Search Results view, viewed by the influencer type of interest, showing the four hours leading up to the time in the alert:

Summary Alert email

Sample email that is generated by a Summary alert

Clicking on the link in the email opens the Anomaly Search Results view:

Anomaly Search Results View for airline influencer

Native Splunk Alerts - Detailed

If you want to be notified whenever a single influencer value, such as a particular user or machine, is highly anomalous, detailed alerts can be configured. In contrast to the summary alerts described above, these will not be rate limited so the score threshold should be set high enough to limit noise. The email generated by the alert will include links to the Entity View, allowing you to explore the anomalies for the particular influencer value.

Switch to the Search page of the Prelert app and enter the search to be used for alerting in the form:

index=prelertresults prelert.resulttype=influencer prelert.searchname="farequote" prelert.influencerfield="airline" prelert.initialscore >= 80 | eval searchname='prelert.searchname' | eval score=floor('prelert.initialscore') | eval linklatest=_time+1 | eval linkearliest=_time-14400 | eval link="http://marple:10000/en-US/app/prelert/prl_entity_view?form.searchGroup=*&form.entity=".airline."&earliest=".linkearliest."&latest=".linklatest | table _time, searchname, airline, score, link

The following parts of the search should be set to appropriate values:

  • Anomaly Search name in the prelert.searchname field.
  • Name of the influencer field in prelert.influencerfield. If the search has multiple influencers then you need to consider which one you want to be alerted on. If you really do want to be alerted on multiple influencers then you should set up a separate alerting search for each.
  • Score threshold via the prelert.initialscore field. We recommend starting off with a threshold of 80, as results at this level are not rate limited.
  • Splunk host and port used in the link to the Entity View.
  • The entity field name used in the URL, and the corresponding field name in the table clause, as shown in the sketch below.
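
As an illustration, if the influencer field were user rather than airline, the link and table parts of the search would change as follows (a sketch; the field name must match the influencer field of your own Anomaly Search):

... | eval link="http://marple:10000/en-US/app/prelert/prl_entity_view?form.searchGroup=*&form.entity=".user."&earliest=".linkearliest."&latest=".linklatest | table _time, searchname, user, score, link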

Run the Search over the past 24 hours, and then click Save As > Alert, next to the search bar time picker, to open the Splunk Save As Alert wizard.

Creating Detailed alert

Step 1 of the Save As Alert wizard for creating a Detailed alert

Creating Detailed alert

Step 2 of the Save As Alert wizard for creating a Detailed alert

  • You need to enter earliest/latest times according to the bucketSpan and bufferSpan of the Anomaly Search off which you are alerting.

    • earliest = (bucketSpan*2) + bufferSpan
    • latest = bucketSpan + bufferSpan
    • e.g. 10 minutes bucket span, 2 minute buffer span, alert every 1 minute - earliest: -22m@m, latest: -12m@m
  • Pick an appropriate cron schedule. We recommend that the alert search runs at least as frequently as the bucketSpan of the Anomaly Search.

  • Message input which forms the body of the email in this example is set to:

    The alert condition for ‘$name$’ was triggered due to Prelert detecting one or more airlines with anomalous response times in the farequote data. You can view the anomalies for each anomalous airline over the past 4 hours by clicking the link in the table below.

  • Configure an appropriate Throttling interval if desired. For example, you may not want to be alerted again for this search for 1 hour.

  • For more information on configuring a scheduled alert see the Splunk docs at http://docs.splunk.com/Documentation/Splunk/latest/Alert/Definescheduledalerts.

The user will receive one email per alert, with details on all the influencers that exceeded the threshold score. The email will contain links to the Entity View and be of the form:

Detailed Alert email

Sample email that is generated by a Detailed alert

Clicking on the links in the table opens the Entity View over the four hours leading up to the time of the alert:

Entity View