
[Bug] Virustotal Analyzer Docker stuck "In Progress" #1239

Closed
padey opened this issue Jan 1, 2024 · 3 comments
Assignees: jeromeleonard
Labels: category:bug (Issue is related to a bug), category:enhancement (Issue is related to an existing feature to improve)
Milestone: 3.3.4

Comments


padey commented Jan 1, 2024

Request Type

Bug

Work Environment

Question | Answer
OS version (server) | Ubuntu
OS version (client) | 22.04
Cortex version / git hash | 3.1.8-1
Package Type | Docker

Problem Description

Since 28.12., the VirusTotal GetReport / Scan Docker containers seem not to work.

Edit: when submitting about 30 report requests, 25 deliver results and 5 remain stuck "In Progress".

Get Domain = does not work, stuck "In Progress"
Get FQDN = does not work, stuck "In Progress"
Get IP = does not work, stuck "In Progress"
Get Hash = works
Get File = works

Scan File = does not work, stuck "In Progress"

On Docker Hub, I can see a new version was released on 28.12.23:

28.12.23: cortexneurons/virustotal_getreport:3.0 (from Tooom)
28.22.23: cortexneurons/virustotal_getreport:3.1 (from Tooom)
2 Years old: cortexneurons/virustotal_getreport:3 (From StrangeBee)

Steps to Reproduce

  1. Start an analysis with the VirusTotal GetReport analyzer.
  2. Observe that the job stays "In Progress".

Complementary information

The Cortex log shows the following:

2023-12-29 11:22:57,866 [ERROR] from org.thp.cortex.services.JobSrv in application-akka.actor.default-dispatcher-10 - Job 0CdQtYwBF66zXQbdCAeq has failed
com.fasterxml.jackson.core.io.JsonEOFException: Unexpected end-of-input within/between Object entries
 at [Source: (sun.nio.ch.ChannelInputStream); line: 1, column: 3818]
        at com.fasterxml.jackson.core.base.ParserMinimalBase._reportInvalidEOF(ParserMinimalBase.java:682)
        at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._skipColon2(UTF8StreamJsonParser.java:3202)
        at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._skipColon(UTF8StreamJsonParser.java:3117)
        at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:802)
        at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:229)
        at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:143)
        at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:138)
        at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323)
        at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4650)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2831)
        at play.api.libs.json.jackson.JacksonJson$.parseJsValue(JacksonJson.scala:288)
        at play.api.libs.json.StaticBinding$.parseJsValue(StaticBinding.scala:21)
        at play.api.libs.json.Json$.parse(Json.scala:175)
        at org.thp.cortex.services.JobRunnerSrv.extractReport(JobRunnerSrv.scala:163)
        at org.thp.cortex.services.JobRunnerSrv.$anonfun$run$13(JobRunnerSrv.scala:247)
        at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
        at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
        at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
        at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387)
        at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1311)
        at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1841)
        at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1806)

2023-12-29 11:31:11,030 [ERROR] from org.thp.cortex.services.JobSrv in application-akka.actor.default-dispatcher-12 - Job 1ydXtYwBF66zXQbdiQcS has failed
com.fasterxml.jackson.core.io.JsonEOFException: Unexpected end-of-input within/between Object entries
 at [Source: (sun.nio.ch.ChannelInputStream); line: 1, column: 3818]
        (stack trace identical to the one above)
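Both failed jobs point at the same root cause: Cortex's JobRunnerSrv.extractReport parses the analyzer's output.json, and Jackson hits end-of-input because the file was cut short. A minimal Python sketch (the truncated string below is illustrative, not the actual job output) shows how a truncated JSON document produces exactly this class of parse error:

```python
import json

# Illustrative payload cut off mid-object, similar in shape to an
# analyzer report (not the actual job output).
truncated = '{"success": true, "summary": {"taxonomies": [{"level": "info"'

try:
    json.loads(truncated)
except json.JSONDecodeError as exc:
    # Python's parser, like Jackson, reports an unexpected end of input.
    print(f"parse failed: {exc.msg}")
```

So the Cortex-side error is a symptom: something inside the analyzer container stopped writing output.json partway through.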
ebsd commented Jan 4, 2024

Hi,
Just trying to help. Testing the analyzer gives this:

$ docker run --rm -v ${PWD}:/job cortexneurons/virustotal_scan:3.1
Traceback (most recent call last):
  File "/worker/VirusTotal/virustotal.py", line 407, in <module>
    VirusTotalAnalyzer().run()
  File "/worker/VirusTotal/virustotal.py", line 365, in run
    self.report(results)
  File "/usr/local/lib/python3.9/site-packages/cortexutils/analyzer.py", line 110, in report
    super(Analyzer, self).report({
  File "/usr/local/lib/python3.9/site-packages/cortexutils/worker.py", line 203, in report
    self.__write_output(output, ensure_ascii=ensure_ascii)
  File "/usr/local/lib/python3.9/site-packages/cortexutils/worker.py", line 127, in __write_output
    json.dump(data, f_output, ensure_ascii=ensure_ascii)
  File "/usr/local/lib/python3.9/json/__init__.py", line 179, in dump
    for chunk in iterable:
  File "/usr/local/lib/python3.9/json/encoder.py", line 431, in _iterencode
    yield from _iterencode_dict(o, _current_indent_level)
  File "/usr/local/lib/python3.9/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/local/lib/python3.9/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/local/lib/python3.9/json/encoder.py", line 405, in _iterencode_dict
    yield from chunks
  File "/usr/local/lib/python3.9/json/encoder.py", line 438, in _iterencode
    o = _default(o)
  File "/usr/local/lib/python3.9/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type WhistleBlowerDict is not JSON serializable
Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x7efece4a3070>
Unclosed connector
connections: ['[(<aiohttp.client_proto.ResponseHandler object at 0x7efece422ac0>, 14112064.793170864)]']
connector: <aiohttp.connector.TCPConnector object at 0x7efed06af580>

and the resulting output.json seems truncated:

$ cat output/output.json
{"success": true, "summary": {"taxonomies": [{"level": "info", "namespace": "VT", "predicate":
"Scan", "value": "0/91"}]}, "artifacts": [], "operations": [], "full": {"type": "analysis", "attributes": {"date": 1704374672, "status": "completed", "stats":
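The TypeError above suggests the analyzer handed Python's json encoder a mapping type it does not recognize: vt-py's WhistleBlowerDict is a Mapping but not a plain dict, so json.dump aborts partway through and leaves output.json truncated. A minimal sketch reproduces this, using a hypothetical WhistleBlowerLike stand-in rather than the real vt-py class, and shows the usual workaround of coercing unknown mappings via the default hook:

```python
import io
import json
from collections.abc import Mapping


class WhistleBlowerLike(Mapping):
    """Hypothetical stand-in for vt-py's WhistleBlowerDict: a Mapping
    that is not a dict subclass, so json refuses to serialize it."""

    def __init__(self, data):
        self._data = data

    def __getitem__(self, key):
        return self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)


report = {"stats": WhistleBlowerLike({"malicious": 0, "harmless": 60})}

buf = io.StringIO()
try:
    json.dump(report, buf)  # raises TypeError partway through
except TypeError as exc:
    print(exc)
print(buf.getvalue())  # partial output already written: the file ends up truncated

# Workaround sketch: convert unknown mappings to plain dicts before encoding.
fixed = json.dumps(report, default=lambda o: dict(o) if isinstance(o, Mapping) else str(o))
```

Because json.dump streams chunks to the file as it encodes, everything written before the failing value survives on disk, which matches the truncated output.json seen above.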

@jeromeleonard jeromeleonard added this to the 3.3.4 milestone Jan 10, 2024
@jeromeleonard jeromeleonard self-assigned this Jan 10, 2024
@jeromeleonard (Contributor) commented:
Hello. This should be fixed in the release we made today. Please test and feel free to share your feedback.

@jeromeleonard added the category:bug and category:enhancement labels Jan 10, 2024
@padey
Copy link
Author

padey commented Jan 10, 2024

@jeromeleonard - looking good. :) will close the issue!

@padey padey closed this as completed Jan 10, 2024