This article applies to:
- Trustwave MailMarshal (SEG)
- Trustwave ECM/MailMarshal Exchange
Introduction
Most organizations want to protect their email users from offensive images distributed by email. Companies may also want to limit email images to reduce time-wasting.
Furthermore, in some jurisdictions there is a legal requirement to block access to offensive images.
This article provides strategies for limiting exposure to offensive or unwanted images. For full implementation details, refer to the User Guide and Help for your product version.
Strategy
There are three parts to the strategy:
- Detecting Offensive Images
- Quarantining Unknown Images
- Allowing Company or Trusted Images
Detecting Offensive Images
Image Analyzer is a third-party deep image analysis product that is fully integrated into the MailMarshal content scanning engine. This integration allows MailMarshal to assess the content of images that pass through the email gateway. Trustwave also provides integrated licensing for this product.
- In MailMarshal 10.1.0 and above, Image Analyzer can be configured to detect several types of inappropriate or time-wasting content, including pornography.
- In earlier versions, the main target content that Image Analyzer attempts to detect is pornographic images.
Because MailMarshal unpacks the content of a message, extracting the attachments and the content inside archive files, Microsoft Word documents, and other packed formats, Image Analyzer can scan the image content from all components of the target message. Rules to use this technology are included with the default ruleset.
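To illustrate why unpacking matters, the sketch below recursively lists image files inside nested ZIP archives. This is a simplified stand-in for MailMarshal's unpacking of archives, documents, and other packed formats; the function name and extension list are illustrative assumptions, not part of the product.

```python
import io
import zipfile

# Illustrative extension list; MailMarshal's actual type detection
# inspects file structure, not just names.
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".bmp"}

def collect_image_names(data: bytes, prefix: str = "") -> list:
    """Recursively list image files inside a message component,
    descending into nested ZIP archives (a simplified stand-in for
    MailMarshal's content unpacking)."""
    found = []
    if zipfile.is_zipfile(io.BytesIO(data)):
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            for info in zf.infolist():
                inner = zf.read(info)
                found += collect_image_names(inner, prefix + info.filename + "/")
    else:
        name = prefix.rstrip("/")
        if any(name.lower().endswith(ext) for ext in IMAGE_EXTENSIONS):
            found.append(name)
    return found
```

Because the function calls itself on every archive member, an image hidden inside a ZIP within a ZIP is still found, which is the behavior the scanning rules rely on.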
It is important to note that detection of this type of content is not an exact science, and the level of technology available today means that there will be some false-positive and false-negative detections. However, Image Analyzer lets your organization use leading technology, and provides evidence of due diligence in protecting your employees from receiving material that may be offensive or, in some cases, illegal.
The default Image Analyzer rule included in MailMarshal 10.1 and above is as follows:
Content Analysis Rule: Block Suspected Pornographic Images
This rule utilizes MailMarshal's Image Analyzer to scan attached images for suspected pornographic content. Image Analyzer must be licensed for this rule to work correctly.
When a message arrives
Where message is incoming
Where the attached image matches the category 'pornography'
Send a 'Suspect Image In' notification message
And move the message to 'Suspect Images' with release action "continue processing"
Quarantining Unknown Images
If you have an obligation to block unknown images, you can easily create a rule in MailMarshal to do so. A properly constructed rule will block messages containing images even if the images are contained in archive files or documents.
Blocking all images could create a heavy administrative load to deal with user requests to release messages. Fortunately MailMarshal includes an optional web-based End User Management system (SQM or Spam Quarantine Management) that can allow you to grant users the ability to release their own messages. Using SQM, you can prevent images from being passed directly to users, while allowing them some control over the content and minimizing administrative load.
Two Block Image rules are included in the MailMarshal default ruleset. Trustwave suggests that you use the rule that provides an exception for known images using the "fingerprinting" feature described below.
Note: When enabling user self-management, the folder used by Image Analyzer (by default, 'Suspect Images') should NOT be enabled. Only the folder used by the Block IMAGES rule ('Attachment Type - Images') should be enabled for self-management. This prevents offensive images from being displayed to users while still allowing them to access other unknown images. To prevent confusion, ensure that the folder 'Suspect Images' is not used by any other custom rules.
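The folder policy in the note above can be summarized as a simple mapping. The folder names match the MailMarshal defaults; the mapping and function are an illustrative assumption, not a product API.

```python
# Which quarantine folders should allow end-user self-release via SQM.
# 'Suspect Images' holds Image Analyzer detections (admin review only);
# 'Attachment Type - Images' holds unknown but not suspect images.
FOLDER_POLICY = {
    "Suspect Images": False,
    "Attachment Type - Images": True,
}

def self_release_allowed(folder: str) -> bool:
    """Return True if end users should be able to release their own
    messages from the given quarantine folder; default to False for
    any folder not explicitly listed."""
    return FOLDER_POLICY.get(folder, False)
```

Defaulting to False for unlisted folders follows the conservative intent of the note: a folder is only self-manageable when deliberately enabled.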
Allowing Company or Trusted Images
MailMarshal can exclude from evaluation images used in company documents, graphics, or signatures, as well as images from trusted sources. This feature is known as Fingerprinting. A rule that adds a Fingerprint image exclusion to the normal Block Image action is included in the default ruleset.
The default rule in MailMarshal SMTP is as follows:
Content Analysis Rule: Block IMAGES (unless fingerprint is known)
This rule blocks messages with IMAGE attachments, except where the fingerprint for that image is known to MailMarshal. This can be a useful way to allow company logos to be excluded from an image-blocking rule.
When a message arrives
Where message is incoming
Where message attachment is of type 'IMAGE'
And where attachment fingerprint is not known
Send a 'Multimedia in' notification message
And move the message to 'Attachment Type - Images' with release action "continue processing"
Note: For more information about fingerprinting and adding images to the known fingerprints, see the User Guide. The fingerprinting functionality should be used with discretion. Fingerprinting very large numbers of images can slow MailMarshal configuration reloads.
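Conceptually, the fingerprint exclusion works like a lookup against a set of known images. The sketch below uses a plain SHA-256 digest as a stand-in fingerprint; MailMarshal's actual fingerprint format is internal to the product, and all names here are illustrative assumptions.

```python
import hashlib

# Hypothetical store of fingerprints for known, trusted images
# (for example, the company logo).
known_fingerprints = set()

def fingerprint(image_bytes: bytes) -> str:
    """Compute a stand-in fingerprint for an image attachment
    (SHA-256 digest used here for illustration)."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_trusted_image(image_bytes: bytes) -> None:
    """Add an image to the known set, as an administrator would
    when registering a company logo."""
    known_fingerprints.add(fingerprint(image_bytes))

def should_block(image_bytes: bytes) -> bool:
    """Mirror the default rule's logic: block IMAGE attachments
    unless their fingerprint is already known."""
    return fingerprint(image_bytes) not in known_fingerprints
```

This also shows why very large fingerprint collections carry a cost: every registered image adds to the data MailMarshal must load when the configuration is reloaded.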
Summary
The strategy and MailMarshal capabilities described above should permit you to block most offensive images propagated by email, while allowing those from company or trusted sources, and with minimal ongoing administrative load.
To complete the picture, ensure that an Acceptable Use policy for email is established in the organization and made known to users. Individuals who exchange inappropriate material tend to do so repeatedly. MailMarshal allows you to report on blocked material and can notify the user when it detects inappropriate content. Even if MailMarshal does not detect every instance, users learn that email content is analyzed and monitored, which sharply increases the perceived risk of disciplinary action or embarrassment. Most users will then comply with the policy voluntarily.