
[Bug] TheHive4 fails to start #1745

Closed
ilyesilli opened this issue Jan 12, 2021 · 1 comment
Labels: bug, TheHive4 (TheHive4 related issues)

Comments


ilyesilli commented Jan 12, 2021


Request Type
Bug

Work Environment

| Question | Answer |
|----------|--------|
| OS version (server) | CentOS |
| OS version (client) | 7 |
| TheHive version / git hash | 4 |
| Package Type | RPM |

Problem Description

The TheHive service fails to start.

Steps to Reproduce

  1. Install TheHive from the RPM packages
  2. Configure TheHive (the `application.conf` is included below)
  3. Start Elasticsearch
  4. Start the service using `bin/thehive -Dconfig.file=/etc/thehive/application.conf`
  5. Observe the following error in the logs:
21:21:38.879 [main] INFO ScalligraphApplication - Loading application ...
[error] a.a.OneForOneStrategy - Unable to provision, see the following errors:

1) Error injecting constructor, java.lang.IllegalArgumentException: Could not find implementation class: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
  at org.thp.scalligraph.janus.JanusDatabase.<init>(JanusDatabase.scala:76)
  at org.thp.scalligraph.janus.JanusDatabase.class(JanusDatabase.scala:61)
  while locating org.thp.scalligraph.janus.JanusDatabase
  while locating org.thp.scalligraph.models.Database
    for the 2nd parameter of org.thp.thehive.models.DatabaseProvider.<init>(SchemaUpdaterActor.scala:19)
  at org.thp.thehive.models.DatabaseProvider.class(SchemaUpdaterActor.scala:18)
  while locating org.thp.thehive.models.DatabaseProvider
  while locating org.thp.scalligraph.models.Database annotated with @com.google.inject.name.Named(value=with-thehive-schema)
    for the 4th parameter of org.thp.thehive.services.AuditSrv.<init>(AuditSrv.scala:29)
  at org.thp.thehive.services.AuditSrv.class(AuditSrv.scala:28)
  while locating org.thp.thehive.services.AuditSrv
    for the 3rd parameter of org.thp.thehive.services.notification.NotificationActor.<init>(NotificationActor.scala:78)
  while locating org.thp.thehive.services.notification.NotificationActor

1 error
akka.actor.ActorInitializationException: akka://application/user/notification-actor: exception during creation
        at akka.actor.ActorInitializationException$.apply(Actor.scala:196)
        at akka.actor.ActorCell.create(ActorCell.scala:661)
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:513)
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:535)
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:295)
        at akka.dispatch.Mailbox.run(Mailbox.scala:230)
        at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
Caused by: com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) Error injecting constructor, java.lang.IllegalArgumentException: Could not find implementation class: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
  at org.thp.scalligraph.janus.JanusDatabase.<init>(JanusDatabase.scala:76)
  at org.thp.scalligraph.janus.JanusDatabase.class(JanusDatabase.scala:61)
  while locating org.thp.scalligraph.janus.JanusDatabase
  while locating org.thp.scalligraph.models.Database
    for the 2nd parameter of org.thp.thehive.models.DatabaseProvider.<init>(SchemaUpdaterActor.scala:19)
  at org.thp.thehive.models.DatabaseProvider.class(SchemaUpdaterActor.scala:18)
  while locating org.thp.thehive.models.DatabaseProvider
  while locating org.thp.scalligraph.models.Database annotated with @com.google.inject.name.Named(value=with-thehive-schema)
    for the 4th parameter of org.thp.thehive.services.AuditSrv.<init>(AuditSrv.scala:29)
  at org.thp.thehive.services.AuditSrv.class(AuditSrv.scala:28)
  while locating org.thp.thehive.services.AuditSrv
    for the 3rd parameter of org.thp.thehive.services.notification.NotificationActor.<init>(NotificationActor.scala:78)
  while locating org.thp.thehive.services.notification.NotificationActor

1 error
        at com.google.inject.internal.InternalProvisionException.toProvisionException(InternalProvisionException.java:226)
        at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1097)
        at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1131)
        at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:436)
        at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:431)
        at play.api.inject.ContextClassLoaderInjector.$anonfun$instanceOf$2(Injector.scala:119)
        at play.api.inject.ContextClassLoaderInjector.withContext(Injector.scala:128)
        at play.api.inject.ContextClassLoaderInjector.instanceOf(Injector.scala:119)
        at play.api.libs.concurrent.ActorRefProvider.$anonfun$get$1(Akka.scala:281)
        at akka.actor.TypedCreatorFunctionConsumer.produce(IndirectActorProducer.scala:91)
Caused by: java.lang.IllegalArgumentException: Could not find implementation class: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
        at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:60)
        at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:440)
        at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:411)
        at org.janusgraph.graphdb.configuration.builder.GraphDatabaseConfigurationBuilder.build(GraphDatabaseConfigurationBuilder.java:50)
        at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:161)
        at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:132)
        at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:112)
        at org.thp.scalligraph.janus.JanusDatabase$.openDatabase(JanusDatabase.scala:56)
        at org.thp.scalligraph.janus.JanusDatabase.<init>(JanusDatabase.scala:77)
        at org.thp.scalligraph.janus.JanusDatabase$$FastClassByGuice$$113881e3.newInstance(<generated>)
Caused by: java.lang.ClassNotFoundException: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:56)
        at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:440)
        at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:411)
[...]

Complementary information

Elasticsearch is running; Cassandra is not installed because I want to use Elasticsearch.
The output of the command `java -version`:

openjdk version "1.8.0_275"

OpenJDK Runtime Environment (build 1.8.0_275-b01)

OpenJDK 64-Bit Server VM (build 25.275-b01, mixed mode)

PS: TheHive and Elasticsearch are on the same server, and TheHive was installed from the RPM package (`yum install thehive4`).

  • Below is the `application.conf` for TheHive4:
// Documentation is available at https://github.com/TheHive-Project/TheHiveDocs/TheHive4


// Include Play secret key
// More information on secret key at https://www.playframework.com/documentation/2.8.x/ApplicationSecret
include "/etc/thehive/secret.conf"

// Database configuration

//db.janusgraph

// Elasticsearch
search {
  // Basic configuration
  // Index name.
  index = the_hive
  // ElasticSearch instance address.
  uri = "http://127.0.0.1:9200/"

  // Scroll keepalive
  keepalive = 1m
  // Size of the page for scroll
  pagesize = 50
  // Number of shards
  nbshards = 5
  // Number of replicas
  nbreplicas = 1
  // Arbitrary settings
 settings {
    // Maximum number of nested fields
    mapping.nested_fields.limit = 100
  }

  // Authentication configuration
  //search.username = ""
  //search.password = ""

  // SSL configuration
  //search.keyStore {
  //  path = "/path/to/keystore"
  //  type = "JKS" # or PKCS12
  //  password = "keystore-password"
  //}
  //search.trustStore {
  //  path = "/path/to/trustStore"
  //  type = "JKS" # or PKCS12
  //  password = "trustStore-password"
  //}
 }



 //storage {
    // Cassandra configuration
    // More information at https://docs.janusgraph.org/basics/configuration-reference/#storagecql
    //backend: cql
    // hostname: ["ip1", "ip2"]
    // Cassandra authentication (if configured)
    // username: "thehive"
    // password: "password"
    //cql {
      //cluster-name: thp
      //keyspace: thehive
   // }
  //}

  // For test only !
  // Comment Cassandra settings before enable Berkeley database
  // storage.backend: berkeleyje
  // storage.directory: /path/to/berkeleydb
  // berkeleyje.freeDisk: 200 # disk usage threshold
}

// Attachment storage configuration
storage {
  // Local filesystem
  // provider: localfs
  // localfs.location: /path/to/files

 // Hadoop filesystem (HDFS)
  // provider: hdfs
  // hdfs {
  //   root: "hdfs://localhost:10000" # namenode server hostname
  //   location: "/thehive"           # location inside HDFS
  //   username: thehive              # file owner
  // }

// Datastore
datastore {
  name = data
  // Size of stored data chunks
  chunksize = 50k
  hash {
    // Main hash algorithm /!\ Don't change this value
    main = "SHA-256"
    // Additional hash algorithms (used in attachments)
    extra = ["SHA-1", "MD5"]
  }
  attachment.password = "malware"
}


}

//Authentication configuration
//More information at https://github.com/TheHive-Project/TheHiveDocs/TheHive4/Administration/Authentication.md
auth {
  providers: [
    {name: session}               # required !
    {name: basic, realm: thehive}
    {name: local}
    {name: key}
  ]
//The format of logins must be valid email address format. If the provided login doesn't contain `@` the following
//domain is automatically appended
  defaultUserDomain: "thehive.local"
}
//CORTEX configuration
//More information at https://github.com/TheHive-Project/TheHiveDocs/TheHive4/Administration/Connectors.md
//Enable Cortex connector
// play.modules.enabled += org.thp.thehive.connector.cortex.CortexModule
// cortex {
//  servers: [
//    {
//      name: "local"                # Cortex name
//      url: "http://localhost:9001" # URL of Cortex instance
//      auth {
//        type: "bearer"
//        key: "***"                 # Cortex API key
//      }
//      wsConfig {}                  # HTTP client configuration (SSL and proxy)
//    }
//  ]
// }

//MISP configuration
//More information at https://github.com/TheHive-Project/TheHiveDocs/TheHive4/Administration/Connectors.md
//Enable MISP connector
// play.modules.enabled += org.thp.thehive.connector.misp.MispModule
// misp {
//  interval: 1 hour
//  servers: [
//    {
//      name = "local"            # MISP name
//      url = "http://localhost/" # URL of MISP
//      auth {
//        type = key
//        key = "***"             # MISP API key
//      }
//      wsConfig {}               # HTTP client configuration (SSL and proxy)
//    }
//  ]
//}

//Streaming
stream.longpolling {
//Maximum time a stream request waits for new element
  refresh = 1m
//Lifetime of the stream session without request
  cache = 15m
  nextItemMaxWait = 500ms
  globalMaxWait = 1s
}


//Max textual content length
play.http.parser.maxMemoryBuffer=1M
//Max file size
play.http.parser.maxDiskBuffer=1G

//Define maximum size of attachments (default 10MB)
//play.http.parser.maxDiskBuffer: 1GB

I hope you can help me; I am at your disposal if you need any additional information.

ilyesilli added the `bug` and `TheHive4` labels on Jan 12, 2021
To-om (Contributor) commented Jan 14, 2021

You have applied the configuration of TheHive 3. Please use this documentation instead.
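For context: TheHive 4 stores its data in JanusGraph, so the database must be declared under `db.janusgraph`. In the posted file that block is left commented out, which is why JanusGraph falls back to its default in-memory backend (`InMemoryStoreManager`) that is not shipped in the package, producing the `ClassNotFoundException` above. A minimal sketch of the TheHive 4 style block, assembled from the hints already commented out in the posted file — hostnames, keyspace, directory, and index name are placeholders to adapt:

```hocon
db.janusgraph {
  storage {
    // Production: Cassandra via the cql backend
    backend: cql
    hostname: ["127.0.0.1"]
    cql {
      cluster-name: thp
      keyspace: thehive
    }
    // For a quick local test, a file-based backend can be used
    // instead of Cassandra (comment out the cql settings first):
    // backend: berkeleyje
    // directory: /path/to/berkeleydb
  }
  // Optional full-text index on Elasticsearch (sketch; see the
  // TheHive 4 documentation for the exact index settings)
  index.search {
    backend: elasticsearch
    hostname: ["127.0.0.1"]
    index-name: thehive
  }
}
```

Note that the top-level `search { ... }` block in the posted file is TheHive 3 / elastic4play configuration and is ignored by TheHive 4.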

To-om closed this as completed on Jan 14, 2021