diff --git a/AUTHORS b/AUTHORS index 27de4530b7..7db3412333 100644 --- a/AUTHORS +++ b/AUTHORS @@ -11,6 +11,11 @@ Contributors * CERT Banque de France (CERT-BDF) * Nabil Adouani +Contributed Analyzers +--------------------- + +* URLCategory: Eric Capuano + Copyright (C) 2014-2016 Thomas Franco Copyright (C) 2014-2016 Saâd Kadhi Copyright (C) 2014-2016 Jérôme Leonard diff --git a/CHANGELOG.md b/CHANGELOG.md new file mode 100644 index 0000000000..ab2ca13705 --- /dev/null +++ b/CHANGELOG.md @@ -0,0 +1,42 @@ +# Change Log + +## [2.9.1](https://github.com/CERT-BDF/TheHive/tree/2.9.1) + +**Implemented enhancements:** + +- Update logo and favicon [\#45](https://github.com/CERT-BDF/TheHive/issues/45) +- Inconsistent wording between the login and user management pages [\#44](https://github.com/CERT-BDF/TheHive/issues/44) +- MaxMind Analyzer 'Short Report' has hard-coded language [\#23](https://github.com/CERT-BDF/TheHive/issues/23) +- Don't update imported case from MISP if it is deleted or merged [\#22](https://github.com/CERT-BDF/TheHive/issues/22) + +**Fixed bugs:** + +- NPE occurs at startup if conf directory doesn't exists [\#41](https://github.com/CERT-BDF/TheHive/issues/41) +- Resource not found by Assets controller [\#38](https://github.com/CERT-BDF/TheHive/issues/38) +- Systemd startup script does not work [\#29](https://github.com/CERT-BDF/TheHive/issues/29) +- MISP event parsing error when it doesn't contain any attribute [\#25](https://github.com/CERT-BDF/TheHive/issues/25) +- Phantom tabs [\#18](https://github.com/CERT-BDF/TheHive/issues/18) +- The Action button of observables list is blank [\#15](https://github.com/CERT-BDF/TheHive/issues/15) +- Description becomes empty when you cancel an edition [\#13](https://github.com/CERT-BDF/TheHive/issues/13) +- Metric Labels Not Showing in Case View [\#10](https://github.com/CERT-BDF/TheHive/issues/10) +- chrome on os x - header alignment [\#5](https://github.com/CERT-BDF/TheHive/issues/5) +- Tags not saving when creating observable. [\#4](https://github.com/CERT-BDF/TheHive/issues/4) + +**Closed issues:** + +- Statistics based on Tags [\#37](https://github.com/CERT-BDF/TheHive/issues/37) +- Statistics on a per case template name / prefix basis [\#31](https://github.com/CERT-BDF/TheHive/issues/31) +- Observable Viewing Page [\#17](https://github.com/CERT-BDF/TheHive/issues/17) +- Case merging [\#14](https://github.com/CERT-BDF/TheHive/issues/14) +- Give us something to work with! 
[\#2](https://github.com/CERT-BDF/TheHive/issues/2) + +**Merged pull requests:** + +- New analyzer to check URL categories [\#24](https://github.com/CERT-BDF/TheHive/pull/24) ([ecapuano](https://github.com/ecapuano)) +- Fix "Run from Docker" [\#9](https://github.com/CERT-BDF/TheHive/pull/9) ([2xyo](https://github.com/2xyo)) +- Fixing a Simple Typo [\#6](https://github.com/CERT-BDF/TheHive/pull/6) ([swannysec](https://github.com/swannysec)) +- Fixed broken link to Wiki [\#1](https://github.com/CERT-BDF/TheHive/pull/1) ([Neo23x0](https://github.com/Neo23x0)) + + + +\* *This Change Log was automatically generated by [github_changelog_generator](https://github.com/skywinder/Github-Changelog-Generator)* \ No newline at end of file diff --git a/README.md b/README.md index 7597b15d4c..673d0e29f8 100644 --- a/README.md +++ b/README.md @@ -41,13 +41,14 @@ TheHive is written in Scala and uses ElasticSearch 2.x for storage. Its REST API ![](images/Architecture.png) ## Analyzers -The first public release of TheHive is provided with 7 analyzers: +TheHive 2.9.1 is provided with 8 analyzers: + DNSDB*: leverage Farsight's [DNSDB](https://www.dnsdb.info/) for pDNS. + DomainTools*: look up domain names, IP addresses, WHOIS records, etc. using the popular [DomainTools](http://domaintools.com/) service API. + Hippocampe: query threat feeds through [Hippocampe](https://github.com/CERT-BDF/Hippocampe), a FOSS tool that centralizes feeds and allows you to associate a confidence level to each one of them (that can be changed over time) and get a score indicating the data quality. + MaxMind: geolocation. + Olevba: parse OLE and OpenXML files using [olevba](http://www.decalage.info/python/olevba) to detect VBA macros, extract their source code etc. + Outlook MsgParser: this analyzer allows to add an Outlook message file as an observable and parse it automatically. ++ URLCategory: checks the Fortinet categories of URLs. + VirusTotal*: look up files, URLs and hashes through [VirusTotal](https://www.virustotal.com/). The star (*) indicates that the analyzer needs an API key to work correctly. We do not provide API keys. You have to use your own. diff --git a/analyzers/MaxMind/report/success_short.html b/analyzers/MaxMind/report/success_short.html index 2289ffa02c..b0b410cbdb 100644 --- a/analyzers/MaxMind/report/success_short.html +++ b/analyzers/MaxMind/report/success_short.html @@ -1 +1 @@ -IP location: {{content.country.names.fr}} / {{content.continent.names.fr}} \ No newline at end of file +IP location: {{content.country.name}} / {{content.continent.name}} diff --git a/analyzers/URLCategory/report/success_long.html b/analyzers/URLCategory/report/success_long.html new file mode 100644 index 0000000000..a48ebbb6da --- /dev/null +++ b/analyzers/URLCategory/report/success_long.html @@ -0,0 +1,18 @@ +
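The URLCategory analyzer listed in the README above reads an artifact description as JSON on stdin and writes its report as JSON on stdout; the report templates that follow (success_long.html and success_short.html) render that output, and the script itself, `analyzers/URLCategory/urlcategory.py`, appears further down in this diff. A minimal sketch of driving it by hand — the observable value, the `python2` interpreter name and the working directory are illustrative assumptions, not part of this patch:

```python
# Hand-rolled harness for the new analyzer (assumed to run from the analyzers/
# directory). The regex search in fortinet_category() runs over r.content, so a
# Python 2 interpreter is the safer choice for the script itself; this harness
# can run under Python 3 (subprocess.run needs 3.5+) and requires `requests`
# to be installed for the analyzer.
import json
import subprocess

artifact = {
    "dataType": "domain",
    "data": "example.com",                         # hypothetical observable
    "tlp": 2,                                      # amber, the script's default
    "config": {"service": "query", "max_tlp": 10}  # mirrors URLCategory_1.0.json
}

proc = subprocess.run(
    ["python2", "URLCategory/urlcategory.py"],
    input=json.dumps(artifact).encode("utf-8"),
    stdout=subprocess.PIPE,
)

# On success the script prints e.g. {"fortinet_category": "..."}; on failure it
# prints {"errorMessage": "..."} and exits with status 1.
print(json.loads(proc.stdout.decode("utf-8")))
```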
+
+ URL Categories of {{artifact.data}} +
+
+
+
Fortinet URL Category:
+
{{content.fortinet_category}}  + + + View Full Report + + + Request Recategorization +
+
+
+
diff --git a/analyzers/URLCategory/report/success_short.html b/analyzers/URLCategory/report/success_short.html new file mode 100644 index 0000000000..bc42a51644 --- /dev/null +++ b/analyzers/URLCategory/report/success_short.html @@ -0,0 +1,4 @@ + + URLCat: + {{content.fortinet_category}}  + diff --git a/analyzers/URLCategory/urlcategory.py b/analyzers/URLCategory/urlcategory.py new file mode 100755 index 0000000000..5d8fd4463a --- /dev/null +++ b/analyzers/URLCategory/urlcategory.py @@ -0,0 +1,85 @@ +#!/usr/bin/env python +# encoding: utf-8 +import sys +import os +import json +import codecs +import time +import re +import requests + +if sys.stdout.encoding != 'UTF-8': + if sys.version_info.major == 3: + sys.stdout = codecs.getwriter('utf-8')(sys.stdout.buffer, 'strict') + else: + sys.stdout = codecs.getwriter('utf-8')(sys.stdout, 'strict') +if sys.stderr.encoding != 'UTF-8': + if sys.version_info.major == 3: + sys.stderr = codecs.getwriter('utf-8')(sys.stderr.buffer, 'strict') + else: + sys.stderr = codecs.getwriter('utf-8')(sys.stderr, 'strict') + +# load artifact +artifact = json.load(sys.stdin) + +def error(message): + print('{{"errorMessage":"{}"}}'.format(message)) + sys.exit(1) + +def get_param(name, default=None, message=None, current=artifact): + if isinstance(name, str): + name = name.split('.') + if len(name) == 0: + return current + else: + value = current.get(name[0]) + if value == None: + if message != None: + error(message) + else: + return default + else: + return get_param(name[1:], default, message, value) + +def debug(msg): + #print >> sys.stderr, msg + pass + +def fortinet_category(data): + debug('>> fortinet_category ' + str(data)) + pattern = re.compile("(?:Category: )([\w\s]+)") + baseurl = 'http://www.fortiguard.com/iprep?data=' + tailurl = '&lookup=Lookup' + url = baseurl + data + tailurl + r = requests.get(url) + category_match = re.search(pattern, r.content, flags=0) + return category_match.group(1) + +http_proxy = get_param('config.proxy.http') +https_proxy = get_param('config.proxy.https') +max_tlp = get_param('config.max_tlp', 1) +tlp = get_param('tlp', 2) # amber by default +data_type = get_param('dataType', None, 'Missing dataType field') +service = get_param('config.service', None, 'Service parameter is missing') + +# run only if TLP condition is met +if tlp > max_tlp: + error('Error with TLP value ; see max_tlp in config or tlp value in input data') + +# setup proxy +if http_proxy != None: + os.environ['http_proxy'] = http_proxy +if https_proxy != None: + os.environ['https_proxy'] = https_proxy + +if service == 'query': + if data_type == 'url' or data_type == 'domain': + data = get_param('data', None, 'Data is missing') + json.dump({ + 'fortinet_category': fortinet_category(data) + }, sys.stdout, ensure_ascii=False) + else: + error('Invalid data type') +else: + error('Invalid service') + diff --git a/analyzers/URLCategory_1.0.json b/analyzers/URLCategory_1.0.json new file mode 100644 index 0000000000..a98d46127a --- /dev/null +++ b/analyzers/URLCategory_1.0.json @@ -0,0 +1,13 @@ +{ + "name": "URLCategory", + "version": "1.0", + "report": "URLCategory/report", + "description": "URL Category query: checks the category of a specific URL or domain", + "dataTypeList": ["url", "domain"], + "baseConfig" : "URLCategory", + "config": { + "service": "query", + "max_tlp": 10 + }, + "command": "URLCategory/urlcategory.py" +} diff --git a/conf/keepme b/conf/keepme new file mode 100644 index 0000000000..e69de29bb2 diff --git a/install/thehive.service 
b/install/thehive.service index 5ec6e60a21..baee3b740f 100644 --- a/install/thehive.service +++ b/install/thehive.service @@ -5,15 +5,19 @@ Wants=network-online.target After=network-online.target [Service] +Environment=PID_DIR=/var/run/thehive WorkingDirectory=/opt/thehive + User=thehive Group=thehive -ExecStart=/opt/thehive/bin/thehive \ - -Dconfig.file=/etc/thehive/application.conf \ - -Dhttp.port=9000 \ - -Dpidfile.path=/var/run/thehive/pid +RuntimeDirectory=thehive +RuntimeDirectoryMode=0750 +ExecStart=/opt/thehive/bin/thehive \ + -Dconfig.file=/etc/thehive/application.conf \ + -Dhttp.port=9000 \ + -Dpidfile.path=${PID_DIR}/pid StandardOutput=journal StandardError=inherit diff --git a/project/BuildSettings.scala b/project/BuildSettings.scala index 50f79d45c6..b5c0d92356 100644 --- a/project/BuildSettings.scala +++ b/project/BuildSettings.scala @@ -7,7 +7,7 @@ object BasicSettings extends AutoPlugin { override def projectSettings = Seq( organization := "org.cert-bdf", licenses += "AGPL-V3" -> url("https://www.gnu.org/licenses/agpl-3.0.html"), - version := "2.9.0", + version := "2.9.1", resolvers += Resolver.bintrayRepo("cert-bdf", "elastic4play"), scalaVersion := Dependencies.scalaVersion, scalacOptions ++= Seq( diff --git a/project/Dependencies.scala b/project/Dependencies.scala index 17e2796bac..45555525e1 100644 --- a/project/Dependencies.scala +++ b/project/Dependencies.scala @@ -29,7 +29,7 @@ object Dependencies { val reflections = "org.reflections" % "reflections" % "0.9.10" val zip4j = "net.lingala.zip4j" % "zip4j" % "1.3.2" val akkaTest = "com.typesafe.akka" %% "akka-stream-testkit" % "2.4.4" - val elastic4play = "org.cert-bdf" %% "elastic4play" % "1.1.0" + val elastic4play = "org.cert-bdf" %% "elastic4play" % "1.1.1" object Elastic4s { private val version = "2.3.0" diff --git a/thehive-backend/app/controllers/Case.scala b/thehive-backend/app/controllers/Case.scala index ef0c4643e6..41a219dd05 100644 --- a/thehive-backend/app/controllers/Case.scala +++ b/thehive-backend/app/controllers/Case.scala @@ -24,10 +24,13 @@ import org.elastic4play.services.JsonFormat.{ aggReads, queryReads } import models.{ Case, CaseStatus } import services.{ CaseSrv, TaskSrv } +import services.CaseMergeSrv +import scala.util.Try @Singleton class CaseCtrl @Inject() ( caseSrv: CaseSrv, + caseMergeSrv: CaseMergeSrv, taskSrv: TaskSrv, auxSrv: AuxSrv, authenticated: Authenticated, @@ -39,46 +42,53 @@ class CaseCtrl @Inject() ( val log = Logger(getClass) @Timed - def create() = authenticated(Role.write).async(fieldsBodyParser) { implicit request => + def create() = authenticated(Role.write).async(fieldsBodyParser) { implicit request ⇒ caseSrv.create(request.body) - .map(caze => renderer.toOutput(CREATED, caze)) + .map(caze ⇒ renderer.toOutput(CREATED, caze)) } @Timed - def get(id: String) = authenticated(Role.read).async { implicit request => - caseSrv.get(id) - .map(caze => renderer.toOutput(OK, caze)) + def get(id: String) = authenticated(Role.read).async { implicit request ⇒ + val withStats = for { + statsValues <- request.queryString.get("nstats") + firstValue <- statsValues.headOption + } yield Try(firstValue.toBoolean).getOrElse(firstValue == "1") + + for { + caze ← caseSrv.get(id) + casesWithStats ← auxSrv.apply(caze, 0, withStats.getOrElse(false)) + } yield renderer.toOutput(OK, casesWithStats) } @Timed - def update(id: String) = authenticated(Role.write).async(fieldsBodyParser) { implicit request => + def update(id: String) = authenticated(Role.write).async(fieldsBodyParser) { implicit request ⇒ 
val isCaseClosing = request.body.getString("status").filter(_ == CaseStatus.Resolved.toString).isDefined for { // Closing the case, so lets close the open tasks - caze <- caseSrv.update(id, request.body) - closedTasks <- if (isCaseClosing) taskSrv.closeTasksOfCase(id) else Future.successful(Nil) // FIXME log warning if closedTasks contains errors + caze ← caseSrv.update(id, request.body) + closedTasks ← if (isCaseClosing) taskSrv.closeTasksOfCase(id) else Future.successful(Nil) // FIXME log warning if closedTasks contains errors } yield renderer.toOutput(OK, caze) } @Timed - def bulkUpdate() = authenticated(Role.write).async(fieldsBodyParser) { implicit request => + def bulkUpdate() = authenticated(Role.write).async(fieldsBodyParser) { implicit request ⇒ val isCaseClosing = request.body.getString("status").filter(_ == CaseStatus.Resolved.toString).isDefined - - request.body.getStrings("ids").fold(Future.successful(Ok(JsArray()))) { ids => + + request.body.getStrings("ids").fold(Future.successful(Ok(JsArray()))) { ids ⇒ if (isCaseClosing) taskSrv.closeTasksOfCase(ids: _*) // FIXME log warning if closedTasks contains errors - caseSrv.bulkUpdate(ids, request.body.unset("ids")).map(multiResult => renderer.toMultiOutput(OK, multiResult)) + caseSrv.bulkUpdate(ids, request.body.unset("ids")).map(multiResult ⇒ renderer.toMultiOutput(OK, multiResult)) } } @Timed - def delete(id: String) = authenticated(Role.write).async { implicit request => + def delete(id: String) = authenticated(Role.write).async { implicit request ⇒ caseSrv.delete(id) - .map(_ => NoContent) + .map(_ ⇒ NoContent) } @Timed - def find() = authenticated(Role.read).async(fieldsBodyParser) { implicit request => + def find() = authenticated(Role.read).async(fieldsBodyParser) { implicit request ⇒ val query = request.body.getValue("query").fold[QueryDef](QueryDSL.any)(_.as[QueryDef]) val range = request.body.getString("range") val sort = request.body.getStrings("sort").getOrElse(Nil) @@ -91,21 +101,21 @@ class CaseCtrl @Inject() ( } @Timed - def stats() = authenticated(Role.read).async(fieldsBodyParser) { implicit request => + def stats() = authenticated(Role.read).async(fieldsBodyParser) { implicit request ⇒ val query = request.body.getValue("query").fold[QueryDef](QueryDSL.any)(_.as[QueryDef]) val aggs = request.body.getValue("stats").getOrElse(throw BadRequestError("Parameter \"stats\" is missing")).as[Seq[Agg]] - caseSrv.stats(query, aggs).map(s => Ok(s)) + caseSrv.stats(query, aggs).map(s ⇒ Ok(s)) } @Timed - def linkedCases(id: String) = authenticated(Role.read).async { implicit request => + def linkedCases(id: String) = authenticated(Role.read).async { implicit request ⇒ caseSrv.linkedCases(id) .runWith(Sink.seq) - .map { cases => + .map { cases ⇒ val casesList = cases.sortWith { - case ((c1, _), (c2, _)) => c1.startDate().after(c2.startDate()) + case ((c1, _), (c2, _)) ⇒ c1.startDate().after(c2.startDate()) }.map { - case (caze, artifacts) => + case (caze, artifacts) ⇒ Json.toJson(caze).as[JsObject] - "description" + ("linkedWith" -> Json.toJson(artifacts)) + ("linksCount" -> Json.toJson(artifacts.size)) @@ -113,4 +123,11 @@ class CaseCtrl @Inject() ( renderer.toOutput(OK, casesList) } } + + @Timed + def merge(caseId1: String, caseId2: String) = authenticated(Role.read).async { implicit request ⇒ + caseMergeSrv.merge(caseId1, caseId2).map { caze ⇒ + renderer.toOutput(OK, caze) + } + } } \ No newline at end of file diff --git a/thehive-backend/app/models/Case.scala b/thehive-backend/app/models/Case.scala index 
4dea21f1bc..dcc69e8634 100644 --- a/thehive-backend/app/models/Case.scala +++ b/thehive-backend/app/models/Case.scala @@ -13,20 +13,22 @@ import play.api.libs.json.Json import play.api.libs.json.Json.toJsFieldJsValueWrapper import org.elastic4play.JsonFormat.dateFormat -import org.elastic4play.models.{ AttributeDef, AttributeFormat => F, AttributeOption => O, BaseEntity, EntityDef, HiveEnumeration, ModelDef } +import org.elastic4play.models.{ AttributeDef, AttributeFormat ⇒ F, AttributeOption ⇒ O, BaseEntity, EntityDef, HiveEnumeration, ModelDef } import org.elastic4play.services.{ FindSrv, SequenceSrv } import JsonFormat.{ caseImpactStatusFormat, caseResolutionStatusFormat, caseStatusFormat } import services.AuditedModel +import services.CaseSrv +import play.api.Logger object CaseStatus extends Enumeration with HiveEnumeration { type Type = Value - val Ephemeral, Open, FalsePositive, TruePositive, Resolved, Deleted = Value + val Open, Resolved, Deleted = Value } object CaseResolutionStatus extends Enumeration with HiveEnumeration { type Type = Value - val Indeterminate, FalsePositive, TruePositive, Other = Value + val Indeterminate, FalsePositive, TruePositive, Other, Duplicated = Value } object CaseImpactStatus extends Enumeration with HiveEnumeration { @@ -34,7 +36,7 @@ object CaseImpactStatus extends Enumeration with HiveEnumeration { val NoImpact, WithImpact, NotApplicable = Value } -trait CaseAttributes { _: AttributeDef => +trait CaseAttributes { _: AttributeDef ⇒ val caseId = attribute("caseId", F.numberFmt, "Id of the case (auto-generated)", O.model) val title = attribute("title", F.textFmt, "Title of the case") val description = attribute("description", F.textFmt, "Description of the case") @@ -50,56 +52,111 @@ trait CaseAttributes { _: AttributeDef => val resolutionStatus = optionalAttribute("resolutionStatus", F.enumFmt(CaseResolutionStatus), "Resolution status of the case") val impactStatus = optionalAttribute("impactStatus", F.enumFmt(CaseImpactStatus), "Impact status of the case") val summary = optionalAttribute("summary", F.textFmt, "Summary of the case, to be provided when closing a case") + val mergeInto = optionalAttribute("mergeInto", F.stringFmt, "Id of the case created by the merge") + val mergeFrom = multiAttribute("mergeFrom", F.stringFmt, "Id of the cases merged") } @Singleton class CaseModel @Inject() ( artifactModel: Provider[ArtifactModel], taskModel: Provider[TaskModel], + caseSrv: Provider[CaseSrv], sequenceSrv: SequenceSrv, findSrv: FindSrv, - implicit val ec: ExecutionContext) extends ModelDef[CaseModel, Case]("case") with CaseAttributes with AuditedModel { caseModel => + implicit val ec: ExecutionContext) extends ModelDef[CaseModel, Case]("case") with CaseAttributes with AuditedModel { caseModel ⇒ + + lazy val logger = Logger(getClass) override val defaultSortBy = Seq("-startDate") override val removeAttribute = Json.obj("status" -> CaseStatus.Deleted) override def creationHook(parent: Option[BaseEntity], attrs: JsObject) = { - sequenceSrv("case").map { caseId => + sequenceSrv("case").map { caseId ⇒ attrs + ("caseId" -> JsNumber(caseId)) } } override def updateHook(entity: BaseEntity, updateAttrs: JsObject): Future[JsObject] = Future.successful { (updateAttrs \ "status").asOpt[CaseStatus.Type] match { - case Some(CaseStatus.Resolved) if !updateAttrs.keys.contains("endDate") => + case Some(CaseStatus.Resolved) if !updateAttrs.keys.contains("endDate") ⇒ updateAttrs + ("endDate" -> Json.toJson(new Date)) - case Some(CaseStatus.Open) => + case 
Some(CaseStatus.Open) ⇒ updateAttrs + ("endDate" -> JsArray(Nil)) - case _ => + case _ ⇒ updateAttrs } } - override def getStats(entity: BaseEntity): Future[JsObject] = { + private[models] def buildArtifactStats(caze: Case): Future[JsObject] = { import org.elastic4play.services.QueryDSL._ - for { - taskStatsJson <- findSrv( - taskModel.get, - and( - "_parent" ~= entity.id, - "status" in ("Waiting", "InProgress", "Completed")), - groupByField("status", selectCount)) - (taskCount, taskStats) = taskStatsJson.value.foldLeft((0L, JsObject(Nil))) { - case ((total, s), (key, value)) => - val count = (value \ "count").as[Long] - (total + count, s + (key -> JsNumber(count))) + findSrv( + artifactModel.get, + and( + parent("case", withId(caze.id)), + "status" ~= "Ok"), + selectCount) + .map { artifactStats ⇒ + Json.obj("artifacts" -> artifactStats) + } + } + + private[models] def buildTaskStats(caze: Case): Future[JsObject] = { + import org.elastic4play.services.QueryDSL._ + findSrv( + taskModel.get, + and( + parent("case", withId(caze.id)), + "status" in ("Waiting", "InProgress", "Completed")), + groupByField("status", selectCount)) + .map { taskStatsJson ⇒ + val (taskCount, taskStats) = taskStatsJson.value.foldLeft((0L, JsObject(Nil))) { + case ((total, s), (key, value)) ⇒ + val count = (value \ "count").as[Long] + (total + count, s + (key -> JsNumber(count))) + } + Json.obj("tasks" -> (taskStats + ("total" -> JsNumber(taskCount)))) + } + } + + private[models] def buildMergeIntoStats(caze: Case): Future[JsObject] = { + caze.mergeInto() + .fold(Future.successful(Json.obj())) { mergeCaseId ⇒ + caseSrv.get.get(mergeCaseId).map { c ⇒ + Json.obj("mergeInto" -> Json.obj( + "caseId" -> c.caseId(), + "title" -> c.title())) + } } - artifactStats <- findSrv( - artifactModel.get, - and( - "_parent" ~= entity.id, - "status" ~= "Ok"), - selectCount) - } yield Json.obj("tasks" -> (taskStats + ("total" -> JsNumber(taskCount))), "artifacts" -> artifactStats) + } + + private[models] def buildMergeFromStats(caze: Case): Future[JsObject] = { + Future + .traverse(caze.mergeFrom()) { id ⇒ + caseSrv.get.get(id).map { c ⇒ + Json.obj( + "caseId" -> c.caseId(), + "title" -> c.title()) + } + } + .map { + case mf if !mf.isEmpty ⇒ Json.obj("mergeFrom" -> mf) + case _ ⇒ Json.obj() + } + } + override def getStats(entity: BaseEntity): Future[JsObject] = { + + + entity match { + case caze: Case ⇒ + for { + taskStats <- buildTaskStats(caze) + artifactStats <- buildArtifactStats(caze) + mergeIntoStats <- buildMergeIntoStats(caze) + mergeFromStats <- buildMergeFromStats(caze) + } yield taskStats ++ artifactStats ++ mergeIntoStats ++ mergeFromStats + case other ⇒ + logger.warn(s"Request caseStats from a non-case entity ?! 
${other.getClass}:$other") + Future.successful(Json.obj()) + } } override val computedMetrics = Map( diff --git a/thehive-backend/app/models/Job.scala b/thehive-backend/app/models/Job.scala index f90c715948..b5f5b895f5 100644 --- a/thehive-backend/app/models/Job.scala +++ b/thehive-backend/app/models/Job.scala @@ -23,7 +23,7 @@ trait JobAttributes { _: AttributeDef => val analyzerId = attribute("analyzerId", F.stringFmt, "Analyzer", O.readonly) val status = attribute("status", F.enumFmt(JobStatus), "Status of the job", JobStatus.InProgress) val artifactId = attribute("artifactId", F.stringFmt, "Original artifact on which this job was executed", O.readonly) - val startDate = attribute("startDate", F.dateFmt, "Timestamp of the job start", O.model) + val startDate = attribute("startDate", F.dateFmt, "Timestamp of the job start") // , O.model) val endDate = optionalAttribute("endDate", F.dateFmt, "Timestamp of the job completion (or fail)") val report = optionalAttribute("report", F.textFmt, "Analysis result", O.unaudited) diff --git a/thehive-backend/app/services/ArtifactSrv.scala b/thehive-backend/app/services/ArtifactSrv.scala index cab31e429e..39ba43234a 100644 --- a/thehive-backend/app/services/ArtifactSrv.scala +++ b/thehive-backend/app/services/ArtifactSrv.scala @@ -13,6 +13,8 @@ import org.elastic4play.services.{ Agg, AuthContext, CreateSrv, DeleteSrv, Field import models.{ Artifact, ArtifactModel, ArtifactStatus, Case, CaseModel, JobModel } import org.elastic4play.utils.{ RichFuture, RichOr } +import models.CaseStatus +import models.CaseResolutionStatus @Singleton class ArtifactSrv @Inject() ( @@ -102,14 +104,14 @@ class ArtifactSrv @Inject() ( def findSimilar(artifact: Artifact, range: Option[String], sortBy: Seq[String]) = find(similarArtifactFilter(artifact), range, sortBy) - private def similarArtifactFilter(artifact: Artifact): QueryDef = { + private[services] def similarArtifactFilter(artifact: Artifact): QueryDef = { import org.elastic4play.services.QueryDSL._ val dataType = artifact.dataType() artifact.data() match { // artifact is an hash case Some(d) if dataType == "hash" => and( - not(parent("case", "_id" ~= artifact.parentId.get)), + parent("case", and(not(withId(artifact.parentId.get)), "status" ~!= CaseStatus.Deleted, "resolutionStatus" ~!= CaseResolutionStatus.Duplicated)), "status" ~= "Ok", or( and( @@ -119,7 +121,7 @@ class ArtifactSrv @Inject() ( // artifact contains data but not an hash case Some(d) => and( - not(parent("case", "_id" ~= artifact.parentId.get)), + parent("case", and(not(withId(artifact.parentId.get)), "status" ~!= CaseStatus.Deleted, "resolutionStatus" ~!= CaseResolutionStatus.Duplicated)), "status" ~= "Ok", "data" ~= d, "dataType" ~= dataType) @@ -128,7 +130,7 @@ class ArtifactSrv @Inject() ( val hashes = artifact.attachment().toSeq.flatMap(_.hashes).map(_.toString) val hashFilter = hashes.map { h => "attachment.hashes" ~= h } and( - not(parent("case", "_id" ~= artifact.parentId.get)), + parent("case", and(not(withId(artifact.parentId.get)), "status" ~!= CaseStatus.Deleted, "resolutionStatus" ~!= CaseResolutionStatus.Duplicated)), "status" ~= "Ok", or( hashFilter :+ diff --git a/thehive-backend/app/services/CaseMergeSrv.scala b/thehive-backend/app/services/CaseMergeSrv.scala new file mode 100644 index 0000000000..4406cf1a57 --- /dev/null +++ b/thehive-backend/app/services/CaseMergeSrv.scala @@ -0,0 +1,278 @@ +package services + +import java.util.Date + +import javax.inject.{ Inject, Singleton } + +import scala.concurrent.{ ExecutionContext, Future } 
+import scala.math.BigDecimal.long2bigDecimal + +import akka.Done +import akka.stream.Materializer +import akka.stream.scaladsl.Sink + +import play.api.libs.json.{ JsArray, JsBoolean, JsNull, JsNumber, JsObject, JsString, JsValue } +import play.api.libs.json.JsValue.jsValueToJsLookup +import play.api.libs.json.Json + +import org.elastic4play.controllers.{ AttachmentInputValue, Fields } +import org.elastic4play.models.BaseEntity +import org.elastic4play.services.AuthContext +import org.elastic4play.services.JsonFormat.log +import org.elastic4play.services.QueryDSL + +import models.{ Artifact, ArtifactStatus, Case, CaseImpactStatus, CaseResolutionStatus, CaseStatus, JobStatus, Task } +import play.api.Logger +import scala.util.Success +import scala.util.Failure +import models.TaskStatus +import models.LogStatus + +@Singleton +class CaseMergeSrv @Inject() (caseSrv: CaseSrv, + taskSrv: TaskSrv, + logSrv: LogSrv, + artifactSrv: ArtifactSrv, + jobSrv: JobSrv, + implicit val ec: ExecutionContext, + implicit val mat: Materializer) { + + lazy val logger = Logger(getClass) + + import QueryDSL._ + private[services] def concat[E](entities: Seq[E], sep: String, getId: E ⇒ Long, getStr: E ⇒ String) = { + JsString(entities.map(e ⇒ s"#${getId(e)}:${getStr(e)}").mkString(sep)) + } + + private[services] def concatCaseDescription(cases: Seq[Case]) = { + val str = cases + .map { caze ⇒ + s"#### ${caze.title()} ([#${caze.caseId()}](#/case/${caze.id}/details))\n\n${caze.description()}" + } + .mkString("\n \n") + JsString(str) + } + + private[services] def firstDate(dates: Seq[Date]) = Json.toJson(dates.min) + + private[services] def mergeResolutionStatus(cases: Seq[Case]) = { + val resolutionStatus = cases + .map(_.resolutionStatus()) + .reduce[Option[CaseResolutionStatus.Type]] { + case (None, s) ⇒ s + case (s, None) ⇒ s + case (Some(CaseResolutionStatus.Other), s) ⇒ s + case (s, Some(CaseResolutionStatus.Other)) ⇒ s + case (Some(CaseResolutionStatus.FalsePositive), s) ⇒ s + case (s, Some(CaseResolutionStatus.FalsePositive)) ⇒ s + case (Some(CaseResolutionStatus.Indeterminate), s) ⇒ s + case (s, Some(CaseResolutionStatus.Indeterminate)) ⇒ s + case (s, _) ⇒ s //TruePositive + } + resolutionStatus.map(s ⇒ JsString(s.toString)) + } + + private[services] def mergeImpactStatus(cases: Seq[Case]) = { + val impactStatus = cases + .map(_.impactStatus()) + .reduce[Option[CaseImpactStatus.Type]] { + case (None, s) ⇒ s + case (s, None) ⇒ s + case (Some(CaseImpactStatus.NotApplicable), s) ⇒ s + case (s, Some(CaseImpactStatus.NotApplicable)) ⇒ s + case (Some(CaseImpactStatus.NoImpact), s) ⇒ s + case (s, Some(CaseImpactStatus.NoImpact)) ⇒ s + case (s, _) ⇒ s // WithImpact + } + impactStatus.map(s ⇒ JsString(s.toString)) + } + + private[services] def mergeSummary(cases: Seq[Case]) = { + val summary = cases + .flatMap(c ⇒ c.summary().map(_ -> c.caseId())) + .map { + case (summary, caseId) ⇒ s"#$caseId:$summary" + } + if (summary.isEmpty) + None + else + Some(JsString(summary.mkString(" / "))) + } + + private[services] def mergeMetrics(cases: Seq[Case]): JsObject = { + val metrics = for { + caze ← cases + metrics ← caze.metrics() + metricsObject ← metrics.asOpt[JsObject] + } yield metricsObject + + val mergedMetrics: Seq[(String, JsValue)] = metrics.flatMap(_.keys).distinct.map { key ⇒ + val metricValues = metrics.flatMap(m ⇒ (m \ key).asOpt[BigDecimal]) + if (metricValues.size != 1) + key -> JsNull + else + key -> JsNumber(metricValues.head) + } + + JsObject(mergedMetrics) + } + + private[services] def baseFields(entity: 
BaseEntity): Fields = Fields(entity.attributes - "_id" - "_routing" - "_parent" - "_type" - "createdBy" - "createdAt" - "updatedBy" - "updatedAt" - "user") + + private[services] def mergeLogs(oldTask: Task, newTask: Task)(implicit authContext: AuthContext): Future[Done] = { + logSrv.find("_parent" ~= oldTask.id, Some("all"), Nil)._1 + .mapAsyncUnordered(5) { log ⇒ + logSrv.create(newTask, baseFields(log)) + } + .runWith(Sink.ignore) + } + + private[services] def mergeTasksAndLogs(newCase: Case, cases: Seq[Case])(implicit authContext: AuthContext): Future[Done] = { + val (tasks, futureTaskCount) = taskSrv.find(and(parent("case", withId(cases.map(_.id): _*)), "status" ~!= TaskStatus.Cancel), Some("all"), Nil) + futureTaskCount.foreach(count ⇒ logger.info(s"Creating $count task(s):")) + tasks + .mapAsyncUnordered(5) { task ⇒ taskSrv.create(newCase, baseFields(task)).map(task -> _) } + .flatMapConcat { + case (oldTask, newTask) ⇒ + logger.info(s"\ttask : ${oldTask.id} -> ${newTask.id} : ${newTask.title()}") + val (logs, futureLogCount) = logSrv.find(and(parent("case_task", withId(oldTask.id)), "status" ~!= LogStatus.Deleted), Some("all"), Nil) + futureLogCount.foreach { count ⇒ logger.info(s"Creating $count log(s) in task ${newTask.id}") } + logs.map(_ -> newTask) + } + .mapAsyncUnordered(5) { + case (log, task) ⇒ + val fields = log.attachment().fold(baseFields(log)) { a ⇒ + baseFields(log).set("attachment", AttachmentInputValue(a.name, a.hashes, a.size, a.contentType, a.id)) + } + logSrv.create(task, fields) + } + .runWith(Sink.ignore) + } + + private[services] def mergeArtifactStatus(artifacts: Seq[Artifact]) = { + val status = artifacts + .map(_.status()) + .reduce[ArtifactStatus.Type] { + case (ArtifactStatus.Deleted, s) ⇒ s + case (s, _) ⇒ s + } + .toString + JsString(status) + } + + private[services] def mergeJobs(newArtifact: Artifact, artifacts: Seq[Artifact])(implicit authContext: AuthContext): Future[Done] = { + jobSrv.find(and(parent("case_artifact", withId(artifacts.map(_.id): _*)), "status" ~= JobStatus.Success), Some("all"), Nil)._1 + .mapAsyncUnordered(5) { job ⇒ + jobSrv.create(newArtifact, baseFields(job)) + } + .runWith(Sink.ignore) + } + + private[services] def mergeArtifactsAndJobs(newCase: Case, cases: Seq[Case])(implicit authContext: AuthContext): Future[Done] = { + val caseMap = cases.map(c ⇒ c.id -> c).toMap + val caseFilter = and(parent("case", withId(cases.map(_.id): _*)), "status" ~= "Ok") + // Find artifacts hold by cases + val (artifacts, futureArtifactCount) = artifactSrv.find(caseFilter, Some("all"), Nil) + futureArtifactCount.foreach { count ⇒ log.info(s"Found $count artifact(s) in merging cases") } + artifacts + .mapAsyncUnordered(5) { artifact ⇒ + // For each artifact find similar artifacts + val dataFilter = artifact.data().map("data" ~= _) orElse artifact.attachment().map("attachment.id" ~= _.id) + val filter = and(caseFilter, + "status" ~= "Ok", + "dataType" ~= artifact.dataType(), + dataFilter.get) + + val (artifacts, futureArtifactCount) = artifactSrv.find(filter, Some("all"), Nil) + futureArtifactCount.foreach { count ⇒ + logger.debug(s"${count} identical artifact(s) found (${artifact.dataType()}):${(artifact.data() orElse artifact.attachment().map(_.name)).get}") + } + artifacts.runWith(Sink.seq) + } + .mapAsync(5) { sameArtifacts ⇒ + // Same artifacts are merged + val firstArtifact = sameArtifacts.head + val fields = firstArtifact.attachment().fold(Fields.empty) { a ⇒ + Fields.empty.set("attachment", AttachmentInputValue(a.name, a.hashes, a.size, 
a.contentType, a.id)) + } + .set("data", firstArtifact.data().map(JsString)) + .set("dataType", firstArtifact.dataType()) + .set("message", concat[Artifact](sameArtifacts, "\n \n", a ⇒ caseMap(a.parentId.get).caseId(), _.message())) + .set("startDate", firstDate(sameArtifacts.map(_.startDate()))) + .set("tlp", JsNumber(sameArtifacts.map(_.tlp()).min)) + .set("tags", JsArray(sameArtifacts.flatMap(_.tags()).map(JsString))) + .set("ioc", JsBoolean(sameArtifacts.map(_.ioc()).reduce(_ || _))) + .set("status", mergeArtifactStatus(sameArtifacts)) + // Merged artifact is created under new case + artifactSrv + .create(newCase, fields) + .map(a ⇒ List(a -> sameArtifacts)) + // Errors are logged and ignored (probably document already exists) + .recover { + case e ⇒ + logger.warn("Artifact creation fail", e) + Nil + } + } + .mapConcat(identity) + .mapAsyncUnordered(5) { + case (newArtifact, sameArtifacts) ⇒ + // Then jobs are imported + mergeJobs(newArtifact, sameArtifacts) + .recover { + case error ⇒ + logger.error("Log creation fail", error) + Done + } + } + .runWith(Sink.ignore) + } + + private[services] def mergeCases(cases: Seq[Case])(implicit authContext: AuthContext): Future[Case] = { + logger.info("Merging cases: " + cases.map(c ⇒ s"#${c.caseId()}:${c.title()}").mkString(" / ")) + val fields = Fields.empty + .set("title", concat[Case](cases, " / ", _.caseId(), _.title())) + .set("description", concatCaseDescription(cases)) + .set("severity", JsNumber(cases.map(_.severity()).max)) + .set("startDate", firstDate(cases.map(_.startDate()))) + .set("tags", JsArray(cases.flatMap(_.tags()).distinct.map(JsString))) + .set("flag", JsBoolean(cases.map(_.flag()).reduce(_ || _))) + .set("tlp", JsNumber(cases.map(_.tlp()).max)) + .set("status", JsString(CaseStatus.Open.toString)) + .set("metrics", mergeMetrics(cases)) + .set("isIncident", JsBoolean(cases.map(_.isIncident()).reduce(_ || _))) + .set("resolutionStatus", mergeResolutionStatus(cases)) + .set("impactStatus", mergeImpactStatus(cases)) + .set("summary", mergeSummary(cases)) + .set("mergeFrom", JsArray(cases.map(c ⇒ JsString(c.id)))) + caseSrv.create(fields) + } + + def markCaseAsDuplicated(cases: Seq[Case], mergeCase: Case)(implicit authContext: AuthContext): Future[Done] = { + Future.traverse(cases) { caze ⇒ + val s = s"Merge into : ${mergeCase.title()} ([#${mergeCase.caseId()}](#/case/${mergeCase.id}/details))" + val summary = caze.summary().fold(s)(_ + s"\n\n$s") + caseSrv.update(caze.id, Fields.empty + .set("mergeInto", mergeCase.id) + .set("status", CaseStatus.Resolved.toString) + .set("resolutionStatus", CaseResolutionStatus.Duplicated.toString) + .set("summary", summary)) + } + .map(_ ⇒ Done) + .recover { + case error ⇒ + log.error("Case update fail", error) + Done + } + } + + def merge(caseIds: String*)(implicit authContext: AuthContext): Future[Case] = { + for { + cases ← Future.sequence(caseIds.map(caseSrv.get)) + newCase ← mergeCases(cases) + _ ← mergeTasksAndLogs(newCase, cases) + _ ← mergeArtifactsAndJobs(newCase, cases) + _ ← markCaseAsDuplicated(cases, newCase) + } yield newCase + } +} \ No newline at end of file diff --git a/thehive-backend/app/services/CaseSrv.scala b/thehive-backend/app/services/CaseSrv.scala index 4ffc531278..a1accb73ac 100644 --- a/thehive-backend/app/services/CaseSrv.scala +++ b/thehive-backend/app/services/CaseSrv.scala @@ -17,6 +17,8 @@ import org.elastic4play.controllers.Fields import org.elastic4play.services.{ Agg, AuthContext, CreateSrv, DeleteSrv, FindSrv, GetSrv, QueryDSL, QueryDef, UpdateSrv } import 
models.{ Artifact, ArtifactModel, Case, CaseModel, Task, TaskModel } +import models.CaseStatus +import models.CaseResolutionStatus @Singleton class CaseSrv @Inject() ( @@ -25,7 +27,6 @@ class CaseSrv @Inject() ( taskModel: TaskModel, createSrv: CreateSrv, artifactSrv: ArtifactSrv, - taskSrv: TaskSrv, getSrv: GetSrv, updateSrv: UpdateSrv, deleteSrv: DeleteSrv, @@ -45,7 +46,7 @@ class CaseSrv @Inject() ( } } - def get(id: String)(implicit authContext: AuthContext): Future[Case] = + def get(id: String): Future[Case] = getSrv[CaseModel, Case](caseModel, id) def update(id: String, fields: Fields)(implicit authContext: AuthContext): Future[Case] = @@ -76,7 +77,7 @@ class CaseSrv @Inject() ( def linkedCases(id: String): Source[(Case, Seq[Artifact]), NotUsed] = { import org.elastic4play.services.QueryDSL._ - findSrv[ArtifactModel, Artifact](artifactModel, parent("case", withId(id)), Some("all"), Nil) + findSrv[ArtifactModel, Artifact](artifactModel, parent("case", and(withId(id), "status" ~!= CaseStatus.Deleted, "resolutionStatus" ~!= CaseResolutionStatus.Duplicated)), Some("all"), Nil) ._1 .flatMapConcat { artifact => artifactSrv.findSimilar(artifact, Some("all"), Nil)._1 } .groupBy(20, _.parentId) diff --git a/thehive-backend/app/services/JobSrv.scala b/thehive-backend/app/services/JobSrv.scala index bde442797d..d5153da094 100644 --- a/thehive-backend/app/services/JobSrv.scala +++ b/thehive-backend/app/services/JobSrv.scala @@ -67,8 +67,7 @@ class JobSrv(analyzerConf: JsValue, def create(artifact: Artifact, fields: Fields)(implicit authContext: AuthContext): Future[Job] = { createSrv[JobModel, Job, Artifact](jobModel, artifact, fields.set("artifactId", artifact.id)).map { - case job if job.status() != JobStatus.InProgress => job - case job => + case job if job.status() == JobStatus.InProgress => val newJob = for { analyzer <- analyzerSrv.get(job.analyzerId()) (status, result) <- analyzer.analyze(attachmentSrv, artifact) @@ -83,6 +82,7 @@ class JobSrv(analyzerConf: JsValue, case t => log.error("Job execution fail", t) } job + case job => job } } @@ -122,4 +122,4 @@ class JobSrv(analyzerConf: JsValue, } def stats(queryDef: QueryDef, agg: Agg) = findSrv(jobModel, queryDef, agg) -} \ No newline at end of file +} diff --git a/thehive-backend/conf/routes b/thehive-backend/conf/routes index 89582a054b..57a4dd57cc 100644 --- a/thehive-backend/conf/routes +++ b/thehive-backend/conf/routes @@ -18,6 +18,7 @@ GET /api/case/:caseId controllers.CaseCtrl.get(cas PATCH /api/case/:caseId controllers.CaseCtrl.update(caseId) DELETE /api/case/:caseId controllers.CaseCtrl.delete(caseId) GET /api/case/:caseId/links controllers.CaseCtrl.linkedCases(caseId) +POST /api/case/:caseId1/_merge/:caseId2 controllers.CaseCtrl.merge(caseId1, caseId2) POST /api/case/template/_search controllers.CaseTemplateCtrl.find() POST /api/case/template controllers.CaseTemplateCtrl.create() diff --git a/thehive-misp/app/connectors/misp/JsonFormat.scala b/thehive-misp/app/connectors/misp/JsonFormat.scala index 87c63697e0..ad59cf0cce 100644 --- a/thehive-misp/app/connectors/misp/JsonFormat.scala +++ b/thehive-misp/app/connectors/misp/JsonFormat.scala @@ -20,7 +20,7 @@ object JsonFormat { eventId <- (json \ "id").validate[String] optTags <- (json \ "EventTag").validateOpt[Seq[JsValue]] tags = optTags.toSeq.flatten.flatMap(t => (t \ "Tag" \ "name").asOpt[String]) - attrCountStr <- (json \ "attribute_count").validate[String] + attrCountStr <- (json \ "attribute_count").validate[String].recover { case _ => "0" } attrCount = attrCountStr.toInt 
timestamp <- (json \ "timestamp").validate[String] date = new Date(timestamp.toLong * 1000) diff --git a/thehive-misp/app/connectors/misp/MispSrv.scala b/thehive-misp/app/connectors/misp/MispSrv.scala index 054be87e26..7e65ebabec 100644 --- a/thehive-misp/app/connectors/misp/MispSrv.scala +++ b/thehive-misp/app/connectors/misp/MispSrv.scala @@ -36,7 +36,7 @@ import net.lingala.zip4j.exception.ZipException import net.lingala.zip4j.model.FileHeader import JsonFormat.{ attributeReads, eventReads, eventStatusFormat, eventWrites } -import models.{ Artifact, Case, CaseModel, CaseStatus } +import models.{ Artifact, Case, CaseModel, CaseStatus, CaseResolutionStatus } import services.{ ArtifactSrv, CaseSrv, CaseTemplateSrv, UserSrv } case class MispInstanceConfig(name: String, url: String, key: String, caseTemplate: Option[String], artifactTags: Seq[String]) @@ -131,13 +131,13 @@ class MispSrv @Inject() (mispConfig: MispConfig, */ def getEvents: Future[Seq[MispEvent]] = { Future - .sequence(mispConfig.instances.map { c => + .traverse(mispConfig.instances) { c => getEvents(c).recoverWith { case t => log.warn("Retrieve MISP event list failed", t) Future.failed(t) } - }) + } .map(_.flatten) } @@ -150,10 +150,13 @@ class MispSrv @Inject() (mispConfig: MispConfig, .asOpt[Seq[JsValue]] .getOrElse(Nil) val events = eventJson.flatMap { j => - j.asOpt[MispEvent].map(_.copy(serverId = instanceConfig.name)) orElse { - log.warn(s"MISP event can't be parsed\n$j") - None - } + j.asOpt[MispEvent] + .map(_.copy(serverId = instanceConfig.name)) + .orElse { + log.warn(s"MISP event can't be parsed\n$j") + None + } + .filter(_.attributeCount > 0) } val eventJsonSize = eventJson.size val eventsSize = events.size @@ -265,6 +268,22 @@ class MispSrv @Inject() (mispConfig: MispConfig, } } + def getSuccessMispAndCase(misp: Seq[Try[Misp]]): Future[Seq[(Misp, Case)]] = { + val successMisp = misp.collect { + case Success(m) => m + } + Future + .traverse(successMisp) { misp => + caseSrv.get(misp.id).map(misp -> _) + } + // remove deleted and merged cases + .map { + _.filter { + case (misp, caze) => caze.status() != CaseStatus.Deleted && caze.resolutionStatus != CaseResolutionStatus.Duplicated + } + } + } + /* for all misp servers, retrieve events */ getEvents /* sort events into : case must be updated (Left) and case must be created (Right) */ @@ -276,15 +295,14 @@ class MispSrv @Inject() (mispConfig: MispConfig, updatedMisp <- updateSrv(updates) createdMisp <- createSrv[MispModel, Misp](mispModel, creates) misp = updatedMisp ++ createdMisp - importedMisp = updatedMisp.collect { - case Success(m) if m.caze().isDefined => m - } + importedMisp <- getSuccessMispAndCase(updatedMisp) // update case status - _ <- updateSrv.apply[CaseModel, Case](caseModel, importedMisp.flatMap(_.caze()), Fields.empty.set("status", CaseStatus.Open.toString)) + _ <- caseSrv.bulkUpdate(importedMisp.map(_._2.id), Fields.empty.set("status", CaseStatus.Open.toString)) // and import MISP attributes - _ <- Future.sequence(importedMisp.map { m => - importAttributes(m).fallbackTo(Future.successful(Nil)) - }) + _ ← Future.traverse(importedMisp) { + case (m, c) => + importAttributes(m, c).fallbackTo(Future.successful(Nil)) + } } yield misp } } diff --git a/ui/app/images/favicon.png b/ui/app/images/favicon.png index 36906f912c..19340ab62c 100644 Binary files a/ui/app/images/favicon.png and b/ui/app/images/favicon.png differ diff --git a/ui/app/images/logo.png b/ui/app/images/logo.png index 499c0b7003..e40000743e 100644 Binary files a/ui/app/images/logo.png and 
b/ui/app/images/logo.png differ diff --git a/ui/app/images/logo.white.png b/ui/app/images/logo.white.png index df86b8100c..a76b527274 100644 Binary files a/ui/app/images/logo.white.png and b/ui/app/images/logo.white.png differ diff --git a/ui/app/index.html b/ui/app/index.html index d600d555a1..417199527d 100644 --- a/ui/app/index.html +++ b/ui/app/index.html @@ -114,6 +114,7 @@ + diff --git a/ui/app/scripts/app.js b/ui/app/scripts/app.js index c986312755..e1e3e93c15 100644 --- a/ui/app/scripts/app.js +++ b/ui/app/scripts/app.js @@ -130,7 +130,8 @@ angular.module('thehive', ['ngAnimate', 'ngMessages', 'ui.bootstrap', 'ui.router var deferred = $q.defer(); CaseSrv.get({ - 'caseId': $stateParams.caseId + 'caseId': $stateParams.caseId, + 'nstats': true }, function(data) { deferred.resolve(data); @@ -169,7 +170,23 @@ angular.module('thehive', ['ngAnimate', 'ngMessages', 'ui.bootstrap', 'ui.router .state('app.case.tasks-item', { url: '/tasks/{itemId}', templateUrl: 'views/partials/case/case.tasks.item.html', - controller: 'CaseTasksItemCtrl' + controller: 'CaseTasksItemCtrl', + resolve: { + task: function($q, $stateParams, CaseTaskSrv, AlertSrv) { + var deferred = $q.defer(); + + CaseTaskSrv.get({ + 'taskId': $stateParams.itemId + }, function(data) { + deferred.resolve(data); + }, function(response) { + deferred.reject(response); + AlertSrv.error('taskDetails', response.data, response.status); + }); + + return deferred.promise; + } + } }) .state('app.case.observables', { url: '/observables', @@ -244,7 +261,7 @@ angular.module('thehive', ['ngAnimate', 'ngMessages', 'ui.bootstrap', 'ui.router }) .config(['markedProvider', 'hljsServiceProvider', function(markedProvider, hljsServiceProvider) { 'use strict'; - + // marked config markedProvider.setOptions({ gfm: true, diff --git a/ui/app/scripts/controllers/RootCtrl.js b/ui/app/scripts/controllers/RootCtrl.js index c3de017ee1..a500deac80 100644 --- a/ui/app/scripts/controllers/RootCtrl.js +++ b/ui/app/scripts/controllers/RootCtrl.js @@ -52,6 +52,13 @@ angular.module('theHiveControllers').controller('RootCtrl', AlertSrv.error('RootCtrl', data, status); }); + $scope.$on('metrics:refresh', function() { + // Get metrics cache + MetricsCacheSrv.all().then(function(list) { + $scope.metricsCache = list; + }); + }); + $scope.$on('misp:status-updated', function(event, enabled) { $scope.mispEnabled = enabled; }); diff --git a/ui/app/scripts/controllers/admin/AdminMetricsCtrl.js b/ui/app/scripts/controllers/admin/AdminMetricsCtrl.js index c85e378b3f..15ce6dfb3b 100644 --- a/ui/app/scripts/controllers/admin/AdminMetricsCtrl.js +++ b/ui/app/scripts/controllers/admin/AdminMetricsCtrl.js @@ -35,6 +35,8 @@ $scope.initMetrics(); MetricsCacheSrv.clearCache(); + + $scope.$emit('metrics:refresh'); }, function(response) { AlertSrv.error('AdminMetricsCtrl', response.data, response.status); diff --git a/ui/app/scripts/controllers/case/CaseMainCtrl.js b/ui/app/scripts/controllers/case/CaseMainCtrl.js index e7a9687afd..3d9f25408e 100644 --- a/ui/app/scripts/controllers/case/CaseMainCtrl.js +++ b/ui/app/scripts/controllers/case/CaseMainCtrl.js @@ -96,6 +96,13 @@ field: 'status' }); + $scope.$on('tasks:task-removed', function(event, task) { + CaseTabsSrv.removeTab('task-' + task.id); + }); + $scope.$on('observables:observable-removed', function(event, observable) { + CaseTabsSrv.removeTab('observable-' + observable.id); + }); + $scope.openTab = function(tabName) { var tab = CaseTabsSrv.getTab(tabName), params = angular.extend({}, $state.params, tab.params || {}); @@ -170,6 
+177,20 @@ }); }; + $scope.mergeCase = function() { + $modal.open({ + templateUrl: 'views/partials/case/case.merge.html', + controller: 'CaseMergeModalCtrl', + controllerAs: 'dialog', + size: 'lg', + resolve: { + caze: function() { + return $scope.caze; + } + } + }); + }; + /** * A workaround filter to make sure the ngRepeat doesn't order the * object keys diff --git a/ui/app/scripts/controllers/case/CaseMergeModalCtrl.js b/ui/app/scripts/controllers/case/CaseMergeModalCtrl.js new file mode 100644 index 0000000000..e19b3585e6 --- /dev/null +++ b/ui/app/scripts/controllers/case/CaseMergeModalCtrl.js @@ -0,0 +1,86 @@ +(function () { + 'use strict'; + + angular.module('theHiveControllers') + .controller('CaseMergeModalCtrl', CaseMergeModalCtrl); + + function CaseMergeModalCtrl($state, $modalInstance, $q, SearchSrv, CaseSrv, UserInfoSrv, AlertSrv, caze, $http) { + var me = this; + + this.caze = caze; + this.pendingAsync = false; + this.search = { + type: 'title', + placeholder: 'Search by case title', + minInputLength: 1, + input: null, + cases: [] + }; + this.getUserInfo = UserInfoSrv; + + this.getCaseByTitle = function(type, input) { + var defer = $q.defer(); + + SearchSrv(function (data /*, total*/ ) { + defer.resolve(data); + }, { + _string: (type === 'title') ? ('title:"' + input + '"') : ('caseId:' + input) + }, 'case', 'all'); + + return defer.promise; + } + + this.format = function(caze) { + if(caze) { + return '#' + caze.caseId + ' - ' + caze.title; + } + return null; + } + + this.clearSearch = function() { + this.search.input = null; + this.search.cases = []; + } + + this.onTypeChange = function(type) { + this.clearSearch(); + + this.search.placeholder = 'Search by case ' + type; + + if(type === 'title') { + this.search.minInputLength = 3; + } else if(type === 'number') { + this.search.minInputLength = 1; + } + } + + this.onSelect = function(item, model, label) { + this.search.cases = [item]; + } + + this.merge = function () { + // TODO pass params as path params not query params + this.pendingAsync = true; + CaseSrv.merge({}, { + caseId: me.caze.id, + mergedCaseId: me.search.cases[0].id + }, function (merged) { + + $state.go('app.case.details', { + caseId: merged.id + }); + + $modalInstance.dismiss(); + + AlertSrv.log('The cases have been successfully merged into a new case #' + merged.caseId, 'success'); + }, function (response) { + this.pendingAsync = false; + AlertSrv.error('CaseMergeModalCtrl', response.data, response.status); + }); + }; + + this.cancel = function () { + $modalInstance.dismiss(); + }; + } +})(); diff --git a/ui/app/scripts/controllers/case/CaseObservablesCtrl.js b/ui/app/scripts/controllers/case/CaseObservablesCtrl.js index 62956f30d8..5048eb79ae 100644 --- a/ui/app/scripts/controllers/case/CaseObservablesCtrl.js +++ b/ui/app/scripts/controllers/case/CaseObservablesCtrl.js @@ -290,10 +290,12 @@ }; - $scope.dropArtifact = function(artifact) { + $scope.dropArtifact = function(observable) { // TODO check result ! 
CaseArtifactSrv.api().delete({ - artifactId: artifact.id + artifactId: observable.id + }, function() { + $scope.$emit('observables:observable-removed', observable); }); }; diff --git a/ui/app/scripts/controllers/case/CaseTasksCtrl.js b/ui/app/scripts/controllers/case/CaseTasksCtrl.js index 14b9d291c8..8dab80be60 100644 --- a/ui/app/scripts/controllers/case/CaseTasksCtrl.js +++ b/ui/app/scripts/controllers/case/CaseTasksCtrl.js @@ -71,7 +71,9 @@ 'taskId': task.id }, { status: 'Cancel' - }, function() {}, function(response) { + }, function() { + $scope.$emit('tasks:task-removed', task); + }, function(response) { AlertSrv.error('taskList', response.data, response.status); }); }); diff --git a/ui/app/scripts/controllers/case/CaseTasksItemCtrl.js b/ui/app/scripts/controllers/case/CaseTasksItemCtrl.js index 162e96c4bb..b142f73233 100644 --- a/ui/app/scripts/controllers/case/CaseTasksItemCtrl.js +++ b/ui/app/scripts/controllers/case/CaseTasksItemCtrl.js @@ -1,7 +1,7 @@ (function() { 'use strict'; angular.module('theHiveControllers').controller('CaseTasksItemCtrl', - function($scope, $state, $stateParams, CaseTabsSrv, CaseTaskSrv, PSearchSrv, TaskLogSrv, AlertSrv) { + function($scope, $state, $stateParams, CaseTabsSrv, CaseTaskSrv, PSearchSrv, TaskLogSrv, AlertSrv, task) { var caseId = $stateParams.caseId, taskId = $stateParams.itemId; @@ -16,39 +16,8 @@ attachmentCollapsed: true, logMissing: '' }; - $scope.task = {}; - - CaseTaskSrv.get({ - 'taskId': taskId - }, function(data) { - - var task = data, - taskName = 'task-' + task.id; - - // Add tabs - CaseTabsSrv.addTab(taskName, { - name: taskName, - label: task.title, - closable: true, - state: 'app.case.tasks-item', - params: { - itemId: task.id - } - }); - - // Select tab - CaseTabsSrv.activateTab(taskName); - - // Prepare the scope data - $scope.initScope(data); - - }, function(response) { - AlertSrv.error('taskDetails', response.data, response.status); - CaseTabsSrv.activateTab('tasks'); - }); - $scope.initScope = function(task) { - $scope.task = task; + $scope.initScope = function() { $scope.logs = PSearchSrv(caseId, 'case_task_log', { 'filter': { @@ -128,6 +97,27 @@ return true; }; + + // Initialize controller + $scope.task = task; + var taskName = 'task-' + task.id; + + // Add tabs + CaseTabsSrv.addTab(taskName, { + name: taskName, + label: task.title, + closable: true, + state: 'app.case.tasks-item', + params: { + itemId: task.id + } + }); + + // Select tab + CaseTabsSrv.activateTab(taskName); + + // Prepare the scope data + $scope.initScope(task); } ); }()); diff --git a/ui/app/scripts/services/CaseSrv.js b/ui/app/scripts/services/CaseSrv.js index 1b8f78afa4..6b7288a6a8 100644 --- a/ui/app/scripts/services/CaseSrv.js +++ b/ui/app/scripts/services/CaseSrv.js @@ -10,6 +10,14 @@ method: 'GET', url: '/api/case/:caseId/links', isArray: true + }, + merge: { + method: 'POST', + url: '/api/case/:caseId/_merge/:mergedCaseId', + params: { + caseId: '@caseId', + mergedCaseId: '@mergedCaseId', + } } }); }); diff --git a/ui/app/scripts/services/CaseTabsSrv.js b/ui/app/scripts/services/CaseTabsSrv.js index b3002a1e22..17ca824830 100644 --- a/ui/app/scripts/services/CaseTabsSrv.js +++ b/ui/app/scripts/services/CaseTabsSrv.js @@ -59,15 +59,19 @@ }, removeTab: function(tab) { - var currentIsActive = tabs[tab].active; + var tabItem = tabs[tab]; + + if(!tabItem) { + return; + } + + var currentIsActive = tabItem.active; delete tabs[tab]; if (currentIsActive) { - console.log('Closing active tab, switch to details'); return true; } else { - 
console.log('Closing non active tab, stay in current tab'); return false; } diff --git a/ui/app/scripts/services/Constants.js b/ui/app/scripts/services/Constants.js index 2bc9177d55..f68fbf4d45 100644 --- a/ui/app/scripts/services/Constants.js +++ b/ui/app/scripts/services/Constants.js @@ -5,6 +5,7 @@ Indeterminate: 'Indeterminate', FalsePositive: 'False Positive', TruePositive: 'True Positive', + Duplicated: 'Duplicate', Other: 'Other' }) .value('CaseImpactStatus', { diff --git a/ui/app/styles/case.css b/ui/app/styles/case.css index 3ce6ed970d..517890f6b6 100644 --- a/ui/app/styles/case.css +++ b/ui/app/styles/case.css @@ -23,6 +23,10 @@ span.link-id { text-align: center; } +.merge-hints { + padding-left: 35px; +} + .indicent-header h2.background { position: relative; z-index: 1; @@ -49,3 +53,15 @@ span.link-id { padding: 0 15px; } + + +.merge-dialog .merge-case { + background-color: #f5f5f5; + padding: 10px; + overflow: hidden; +} + +.merge-dialog .search-field ul.dropdown-menu { + width: 100%; + left: 0 !important; +} \ No newline at end of file diff --git a/ui/app/styles/main.css b/ui/app/styles/main.css index becf846d69..2a147debd2 100644 --- a/ui/app/styles/main.css +++ b/ui/app/styles/main.css @@ -17,9 +17,13 @@ body { top: 0; z-index: 1030; } +.navbar-brand { + padding: 6px 20px !important; +} .navbar-brand>img { display: inline; } + .main-navbar .nav>li>a { padding: 15px 8px; } diff --git a/ui/app/views/app.html b/ui/app/views/app.html index f98b09b390..8f518c8c16 100644 --- a/ui/app/views/app.html +++ b/ui/app/views/app.html @@ -9,7 +9,7 @@ - The Hive + The Hive
- +
@@ -45,7 +45,7 @@

User management

Login
- User name
+ Full Name
Roles
Password / API key
Lock
diff --git a/ui/app/views/partials/case/case.close.html b/ui/app/views/partials/case/case.close.html
index 3f1603b2a6..4760e8b518 100644
--- a/ui/app/views/partials/case/case.close.html
+++ b/ui/app/views/partials/case/case.close.html
@@ -78,13 +78,13 @@

Incident

Investigation clearly demonstrates that there is something malicious (scam, phishing, malspam, malware, cybersquatting...)
- Investigation shows that there is nothing malicious (unlock email with clean attachment ...)
+ Investigation shows that there is nothing malicious (email with clean attachment ...)
- There is not enough elements to tell that there is something malicious (original message has been delete and not transmitted, IOC lookup with 0 hit ...)
+ There aren't enough elements to tell that there is something malicious (original message has been deleted and not transmitted, IOC lookup with 0 hits ...)
- Everything that does not need analysis (not an incident)
+ Everything that does not require an investigation (not an incident)

@@ -105,7 +105,7 @@

Incident

Something altered availability, integrity or confidentiality
- Security measures blocked the attack on infection
+ Security measures blocked the attack or infection

This field is required

diff --git a/ui/app/views/partials/case/case.details.html b/ui/app/views/partials/case/case.details.html index 087c1c8f0b..9574c2a44c 100644 --- a/ui/app/views/partials/case/case.details.html +++ b/ui/app/views/partials/case/case.details.html @@ -134,6 +134,7 @@

+
No metrics need to be set
diff --git a/ui/app/views/partials/case/case.merge.html b/ui/app/views/partials/case/case.merge.html new file mode 100644 index 0000000000..7bcea20037 --- /dev/null +++ b/ui/app/views/partials/case/case.merge.html @@ -0,0 +1,56 @@ + + + diff --git a/ui/app/views/partials/case/case.panelinfo.html b/ui/app/views/partials/case/case.panelinfo.html index fe011081ef..e17da5c8b7 100644 --- a/ui/app/views/partials/case/case.panelinfo.html +++ b/ui/app/views/partials/case/case.panelinfo.html @@ -9,22 +9,27 @@

- {{caze.title}} - - - - + + + - - - + + + + - - - + + + + + + + +

@@ -35,16 +40,14 @@

- {{caze.startDate | showDate}}   - (Closed at + (Closed at {{caze.endDate | showDate}} as {{CaseResolutionStatus[caze.resolutionStatus]}}) - @@ -55,3 +58,9 @@

+
+

+ This case has been closed as a duplicate and merged into
+ +

+
diff --git a/ui/app/views/partials/index-closedcases.html b/ui/app/views/partials/index-closedcases.html index 6602fbf656..6687192f8c 100644 --- a/ui/app/views/partials/index-closedcases.html +++ b/ui/app/views/partials/index-closedcases.html @@ -13,7 +13,10 @@ - #{{closedCase.caseId}} - {{closedCase.title}} + #{{closedCase.caseId}} - {{closedCase.title}} + diff --git a/ui/app/views/partials/index-currentcases.html b/ui/app/views/partials/index-currentcases.html index ab790d6712..2653f92004 100644 --- a/ui/app/views/partials/index-currentcases.html +++ b/ui/app/views/partials/index-currentcases.html @@ -16,6 +16,12 @@ #{{currentCase.caseId}} - {{currentCase.title}} + diff --git a/ui/app/views/partials/observables/list/artifacts-list-export.html b/ui/app/views/partials/observables/list/artifacts-list-export.html index ee3e951375..1737bd172a 100644 --- a/ui/app/views/partials/observables/list/artifacts-list-export.html +++ b/ui/app/views/partials/observables/list/artifacts-list-export.html @@ -4,7 +4,7 @@
- + Back diff --git a/ui/bower.json b/ui/bower.json index e1b589cb22..2ea4e4ab1b 100644 --- a/ui/bower.json +++ b/ui/bower.json @@ -1,6 +1,6 @@ { "name": "thehive", - "version": "2.9.0", + "version": "2.9.1", "license": "AGPL-3.0", "dependencies": { "angular": "^1.5.8", diff --git a/ui/package.json b/ui/package.json index ff8ba547a9..e0e7d65c44 100644 --- a/ui/package.json +++ b/ui/package.json @@ -1,6 +1,6 @@ { "name": "thehive", - "version": "2.9.0", + "version": "2.9.1", "license": "AGPL-3.0", "repository": { "type": "git",
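Beyond the UI wiring, the backend changes in this diff expose two REST-visible behaviours: `GET /api/case/:caseId` now honours an `nstats` flag (the case view requests `nstats: true`) so the returned case carries task, observable and merge statistics, and the new route `POST /api/case/:caseId1/_merge/:caseId2` drives `CaseMergeSrv`, which builds a merged case and marks both sources as Resolved/Duplicated. A hedged sketch of calling both with `requests` — the host, credentials and case ids are placeholders, and HTTP basic authentication is an assumption about the deployment, not something this patch introduces:

```python
# Illustrative client for the two endpoints touched by this change.
import requests

BASE = "http://localhost:9000"      # -Dhttp.port=9000, as in install/thehive.service
AUTH = ("analyst", "password")      # hypothetical account
case_id_1 = "<elasticsearch id of the first case>"   # placeholders, not real ids
case_id_2 = "<elasticsearch id of the second case>"

# Fetch a case together with its aggregated statistics (what the UI now requests).
case = requests.get("{}/api/case/{}".format(BASE, case_id_1),
                    params={"nstats": "true"}, auth=AUTH)
print(case.json())

# Merge the two cases; the response body is the newly created case.
merged = requests.post("{}/api/case/{}/_merge/{}".format(BASE, case_id_1, case_id_2),
                       auth=AUTH).json()
print(merged["caseId"], merged["title"])
```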