
Max Message Size

Printed From: LogSat Software
Category: Spam Filter ISP
Forum Name: Spam Filter ISP Support
Forum Description: General support for Spam Filter ISP
Printed Date: 20 October 2017 at 8:55pm

Topic: Max Message Size
Posted By: yapadu
Subject: Max Message Size
Date Posted: 20 June 2009 at 6:11pm
Can the maximum message size per domain be set?

I've got one domain which processes mail normally, but if someone sends a large message (maybe over 10 MB), the message never gets delivered to their SMTP server.

SpamFilterISP will try to deliver and eventually fail with Socket Error #10054 -- Connection reset by peer.

This client now has three large emails in the queue and has used up about 8 GB of bandwidth in the last few days while a couple of messages are repeatedly sent for delivery, failing each time.

If I could reduce the max message size for individual domains, then those who can handle large messages could have a higher limit than those whose servers will only accept 1 or 2 MB.

Posted By: LogSat
Date Posted: 20 June 2009 at 9:25pm

The max message size is one of the very few global settings and can't be modified per-domain. If you could, however, zip and email us SpamFilter's activity logfile for one of the days this happened, along with the to/from email addresses involved, we can try to see if there are any workarounds or solutions for this issue. If the zip is over 8MB in size, please upload it to our FTP server (I'll provide you with the login info via PM).

Roberto Franceschetti - LogSat Software - Spam Filter ISP

Posted By: yapadu
Date Posted: 20 June 2009 at 11:37pm

Continuing to play around with this problem, I've found that the system will time out and receive Socket Error #10054 -- Connection reset by peer exactly 10 minutes after connecting to the remote server.

So it seems to be a timeout issue on one side or the other. I don't see anything in the spamfilter.ini file that looks like a 10-minute timeout.

The closest I could find is this:


The session would not exactly be idle while transferring a message, and the timeout is 10 minutes, not 15.

Posted By: LogSat
Date Posted: 21 June 2009 at 10:57pm

We definitely see the pattern of 10-minute timeouts that is causing the connection from SpamFilter to be dropped.

There are three settings in the SpamFilter.ini file that could in theory apply here (the first is measured in minutes, the others in seconds):

;Force disconnect of sessions after they have remained connected for this long

;Force disconnect of sessions if a command has not been received within the last nn seconds

;Timeout when delivering emails to the destination SMTP server (in seconds)

If none of these are configured for 10 minutes, then it's very likely that the remote server (or their firewall) is configured with a 10-minute timeout. Please note that companies in the specific country the email is being forwarded to often have rather slow internet connections (I'm seeing an average of 500ms pings to their mail server), so large emails several MB in size frequently take a long time to be delivered there, increasing the chances of such timeouts if their administrators have implemented them.

As a side note, by default SpamFilter will retry delivering emails to the destination SMTP server forever until they are successful, retrying every 60 minutes (from your logs it seems you decreased this to 30-minute retries).

You may alter this "forever" behavior through the following setting in SpamFilter.ini:

;Number of hours SpamFilter will retry to deliver messages in queue to your destination SMTP server if it was unreachable. Enter 0 to try forever until back online.
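Based on the ExpireRetryQueueHours key name that comes up later in the thread, the entry would look something like the sketch below (the value 36 matches what yapadu chose; treat the exact syntax as an assumption, not a verified excerpt from SpamFilter.ini):

```ini
; Number of hours SpamFilter will retry to deliver messages in queue
; to the destination SMTP server. 0 = retry forever until back online.
ExpireRetryQueueHours=36
```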

Roberto Franceschetti - LogSat Software - Spam Filter ISP

Posted By: yapadu
Date Posted: 22 June 2009 at 12:52am
Thanks Roberto,

I have restored the delivery retry interval back to 60 minutes (I actually had it at 15) and set ExpireRetryQueueHours to 36 hours. This should help keep the bandwidth bill down.

The ping time from me to that server is about 300ms, so not as bad as 500. I was able to get a couple of 13 MB files to go through within the 10-minute limit by making sure only one was being delivered at a time.

If I tried two at a time, neither would ever make it. So the problem was:

1) A large 20 MB file came in that could never be delivered, retrying forever every 15 minutes.
2) Two additional 13 MB files came in, and all of them were being delivered at once - with the limited bandwidth they were all failing every 10 minutes and restarting 5 minutes later.

By removing everything from the queue and placing just a single 13 MB file there, it was at least able to deliver the messages one at a time.
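The arithmetic behind this behavior can be sketched quickly: two concurrent deliveries each get roughly half the link, which doubles the transfer time past the cutoff. This is a back-of-the-envelope illustration, not SpamFilter code, and the 256 kbit/s effective throughput is an assumed figure chosen to match the observed behavior:

```python
# Rough sketch: why one 13 MB message squeaks under a 10-minute cutoff
# while two concurrent ones do not. LINK_KBPS is an assumption.

TIMEOUT_S = 10 * 60   # observed remote-side cutoff, in seconds
LINK_KBPS = 256       # assumed effective throughput of the slow link

def transfer_seconds(size_mb: float, kbps: float) -> float:
    """Seconds needed to push size_mb megabytes at kbps kilobits/second."""
    return size_mb * 1024 * 1024 * 8 / (kbps * 1000)

one_at_a_time = transfer_seconds(13, LINK_KBPS)      # full bandwidth
two_at_once = transfer_seconds(13, LINK_KBPS / 2)    # link shared two ways

print(f"single 13 MB delivery: {one_at_a_time:.0f}s (limit {TIMEOUT_S}s)")
print(f"concurrent 13 MB delivery: {two_at_once:.0f}s (limit {TIMEOUT_S}s)")
```

With these assumed numbers a lone 13 MB message finishes in about 7 minutes, while each of two concurrent ones needs about 14 minutes and hits the 10-minute cutoff, matching what was observed.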

Looks like compression will be the next technology for email servers, given the large number of attachments people try to send these days.

Guess I will have to research the IIS timeout for the client, as I think that is the server they are using.

