Looping on mailbox URL messages with no content #988
Comments
Hi, thanks for reporting that.
I also think that these cases should be logged, as they indicate some problem: either in the configuration, on the network side, or at the source webpage. A warning therefore seems appropriate, since this is an error we can handle.
Sorry for my late reply. Marking the mail as sent is a quick fix, but risks losing reports. So, contrary to @wagner-certat, I do expect a different result on another run. Proposal:
read, not sent
That risk is communicated to the user by logging a warning.
## 1.0.5

### Core
- `lib/message`: `Report()` can now create a Report instance from Event instances (#1225).
- `lib/bot`:
  * The first word in the log line `Processed ... messages since last logging.` is now adaptable and set to `Forwarded` in the existing filtering bots (#1237).
  * Kills itself again after proper shutdown if the bot is the XMPP collector or output (#970). Previously these two bots needed two stop commands to actually stop.
- `lib/utils`: log: set the name of the `py.warnings` logger to the bot name (#1184).

### Bots

#### Collectors
- `bots.collectors.mail.collector_mail_url`: handle empty downloaded reports (#988).
- `bots.collectors.file.collector_file`: handle empty files (#1244).

#### Parsers
- Shadowserver parser:
  * SSL FREAK: Remove optional column `device_serial` and add several new ones.
  * Fixed HTTP URL parsing for multiple feeds (#1243).
- Spamhaus CERT parser:
  * add support for `smtpauth`, `l_spamlink`, `pop`, `imap`, `rdp`, `smb`, `iotscan`, `proxyget`, `iotmicrosoftds`, `automatedtest`, `ioturl`, `iotmirai`, `iotcmd`, `iotlogin` and `iotuser` (#1254).
  * fix `extra.destination.local_port` -> `extra.source.local_port`.

#### Experts
- `bots.experts.filter`: Pre-compile regex at bot initialization.

### Tests
- Ensure that the bots did process all messages (#291).

### Tools
- `intelmqctl`:
  * `intelmqctl run` has a new parameter `-l`/`--loglevel` to overwrite the log level for the run (#1075).
  * `intelmqctl run [bot-id] message send` can now send report messages (#1077).
- `intelmqdump`:
  * now has command completion for bot names, actions and queue names in the interactive console.
  * automatically converts messages from events to reports if the queue the message is being restored to is the source queue of a parser (#1225).
  * can now read messages in dumps that are dictionaries as opposed to serialized dicts as strings, and does not convert them in the show command (#1256).
  * truncated messages are no longer used/saved to the file after being shown (#1255).
  * now again denies recovery of dumps if the corresponding bot is running. The check was broken (#1258).
  * now sorts the dump by the time of the dump. Previously, the list was in random order (#1020).

### Known issues
- no known issues
IntelMQ newbie here: I wanted to get feedback on the right way to do this rather than submit a pull request just yet.
I had an issue with an email report containing a URL whose content is 0 bytes. This caused lib/message.py:212 to raise InvalidValue, and the bot to exit and restart.
Since the problem report didn't get marked as read before the bot restart, the collector_mail_url.py bot would get permanently stuck in the mailbox trying and failing to process that one report.
A quick fix was to modify this code in collector_mail_url.py:
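The modified snippet itself was not included in the report. The following is only a minimal sketch of the idea, assuming the collector downloads the report with `requests`; the helper name `fetch_report_content` and the logger are illustrative, not the bot's actual code.

```python
from typing import Optional
import logging

import requests

logger = logging.getLogger(__name__)


def fetch_report_content(url: str, timeout: int = 60) -> Optional[bytes]:
    """Download the report referenced in the mail body.

    Returns None for an empty (0-byte) download instead of passing the
    empty value on, so the caller can log a warning, still mark the mail
    as read, and continue with the next message.
    """
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    if not response.content:
        logger.warning("Report downloaded from %s is empty, skipping.", url)
        return None
    return response.content
```

In the bot, such a check would sit just before the Report is built, so lib/message.py never sees an empty raw value and the mail can still be marked as read, which breaks the loop described above.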
I'd imagine there are more situations where a report URL fails permanently but the problem mail is never marked as read. Perhaps we need to distinguish between temporary failures and a permanent inability to process a report; I'm wondering if this is the right place to do it.
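One way to express that distinction, as a minimal sketch only: the helper below uses `requests` exception classes to separate failures worth retrying from those that are not. The classification itself is an assumption for illustration, not an agreed policy, and `classify_fetch` is a hypothetical function, not part of the bot.

```python
import requests


def classify_fetch(url: str, timeout: int = 60) -> str:
    """Fetch a report URL and classify the outcome.

    'ok'        -> content available, build and send the report
    'temporary' -> leave the mail unread so the next run retries it
    'permanent' -> log a warning, mark the mail as read, move on

    The mapping below is only an example of how such a policy could look.
    """
    try:
        response = requests.get(url, timeout=timeout)
    except (requests.ConnectionError, requests.Timeout):
        return 'temporary'   # network problem: likely to succeed later
    if response.status_code >= 500:
        return 'temporary'   # server-side error: may recover
    if response.status_code >= 400 or not response.content:
        return 'permanent'   # dead link or empty report: retrying won't help
    return 'ok'
```

The collector would then mark the mail as read for both 'ok' and 'permanent' outcomes, so transient network problems stay retryable while a dead or empty URL can no longer wedge the mailbox.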