{"id":25866,"date":"2024-02-06T13:46:11","date_gmt":"2024-02-06T12:46:11","guid":{"rendered":"https:\/\/nr.stage.dekodes.no\/en\/?post_type=bc_project&#038;p=25866"},"modified":"2025-04-02T08:53:55","modified_gmt":"2025-04-02T06:53:55","slug":"cogmar","status":"publish","type":"bc_project","link":"https:\/\/nr.stage.dekodes.no\/en\/projects\/cogmar\/","title":{"rendered":"Advancing marine services with computer vision (COGMAR)"},"content":{"rendered":"\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p><strong>Fisheries and aquaculture are major industries in Norway, and marine image data are acquired in different formats for a wide range of tasks. Together with our project partners, we have developed automatic solutions for marine image analysis that enhance knowledge and efficiency in the marine sector, and also enable continuous monitoring of ecosystems to ensure sustainable fisheries and harvest quotas.   <\/strong><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Automatic extraction of information from marine image data<\/h2>\n\n\n\n<p>Today, vast amounts of marine data are being collected, encompassing optical images, videos, and acoustic surveys. These extensive datasets contain information that is crucial for sustainable fisheries and resource management. As marine services shift towards real-time analysis, data volumes are anticipated to surge. Manual processes are not equipped to handle such volumes. Our objective has been to develop automatic solutions that are capable of extracting valuable information from these complex images. This will not only enhance efficiency and accuracy, but also drive advancements in both marine science and deep learning. 
<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><div>\n<figure class=\"wp-block-image alignright size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"307\" height=\"392\" src=\"https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/sandeel.png\" alt=\"\" class=\"wp-image-26005\" srcset=\"https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/sandeel.png 307w, https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/sandeel-235x300.png 235w\" sizes=\"auto, (max-width: 307px) 100vw, 307px\" \/><figcaption class=\"wp-element-caption\"><em>We have developed a method for detecting and classifying sandeel in acoustic data.<\/em> <em>Figure: NR.<\/em><\/figcaption><\/figure>\n<\/div><\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<h2 class=\"wp-block-heading\">Using deep learning to detect acoustic targets<\/h2>\n\n\n\n<p>For acoustic data, our goal is to automatically detect fish and identify their species in order to estimate abundance. We have achieved this with a deep learning-based method for detecting acoustic targets, placing special emphasis on handling data variations across surveys. The methods have been further enhanced by incorporating contextual information such as depth and distance to the seabed. <\/p>\n<\/div>\n<\/div>\n\n\n\n<h2 class=\"wp-block-heading\">Automated age estimation and transparent AI<\/h2>\n\n\n\n<p>The Institute of Marine Research (IMR) has developed methods for automated age estimation of Greenland halibut using otolith images. In parallel with this pioneering work, NR has explored Explainable Artificial Intelligence (XAI) methods to understand how the network interprets these images. 
Our study revealed that deep learning techniques interpret images differently than humans do. We also observed reduced performance on otolith data from laboratories other than those used for training. To address this, we have developed methods that adapt the model trained on Norwegian otolith images so that it also performs well on otolith images from other labs. Our results can be accessed on the DeepOtolith web portal. <\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"655\" height=\"421\" src=\"https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/cogmar-figure.png\" alt=\"The figure shows an illustration of a deep neural network for age prediction of fish\" class=\"wp-image-26004\" srcset=\"https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/cogmar-figure.png 655w, https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/cogmar-figure-300x193.png 300w\" sizes=\"auto, (max-width: 655px) 100vw, 655px\" \/><figcaption class=\"wp-element-caption\"><em><em>What does a neural network for predicting the age of fish from images of otoliths focus on? 
Figure: NR<\/em>.<br>&nbsp;<\/em><\/figcaption><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\t\t<div id=\"post-type-multi-block_50037c05c1fe4487213e4715a202edf1\" class=\"wp-block-post-type-multi type-manual style-card-bc_employee t2-grid\">\n\t\t\t\t\t\t\t<div class=\"t2-grid-item-col-12\">\n\t\t\t\t\t\t<a href=\"https:\/\/nr.stage.dekodes.no\/en\/employees\/arnt-borre-salberg\/\" class='card-employee'>\n\t\t\t\t\t<figure>\n\t\t\t\t<img decoding=\"async\" src=\"https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/05\/arnt-borre-salberg-7.jpg\" alt=\"\">\n\t\t\t<\/figure>\n\t\t\t\t<div class=\"card-employee__content\">\n\t\t\t<p class=\"card-employee__name\">Arnt-B\u00f8rre Salberg<\/p>\n\t\t\t\t\t\t\t<p class=\"card-employee__position\">Chief Research Scientist<\/p>\n\t\t\t\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 24 24\" height=\"24\" width=\"24\" class=\"t2-icon t2-icon-arrowforward\" aria-hidden=\"true\" focusable=\"false\"><path d=\"M15.9 4.259a1.438 1.438 0 0 1-.147.037c-.139.031-.339.201-.421.359-.084.161-.084.529-.001.685.035.066 1.361 1.416 2.947 3l2.882 2.88-10.19.02c-8.543.017-10.206.029-10.29.075-.282.155-.413.372-.413.685 0 .313.131.53.413.685.084.046 1.747.058 10.29.075l10.19.02-2.882 2.88c-1.586 1.584-2.912 2.934-2.947 3-.077.145-.085.521-.013.66a.849.849 0 0 0 .342.35c.156.082.526.081.68-.001.066-.035 1.735-1.681 3.709-3.656 2.526-2.53 3.606-3.637 3.65-3.742A.892.892 0 0 0 23.76 12a.892.892 0 0 0-.061-.271c-.044-.105-1.124-1.212-3.65-3.742-1.974-1.975-3.634-3.616-3.689-3.645-.105-.055-.392-.107-.46-.083\"\/><\/svg>\n\t\t<\/div>\n\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\n\n\n\n\n<div class=\"wp-block-group has-primary-200-background-color has-background\">\n<p>Project: COGMAR<\/p>\n\n\n\n<p>Partners: The Institute of Marine Research (IMR), UiT The Arctic University of Norway and 
DeepVision<\/p>\n\n\n\n<p>Funding: The Research Council of Norway<\/p>\n\n\n\n<p>Period:  2017-2022<\/p>\n<\/div>\n\n\n\n\n\n<div class=\"wp-block-group\">\n<div class=\"wp-block-group has-background\" style=\"background-color:#cdf1f1\">\n<p><strong>Further reading:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/app.cristin.no\/projects\/show.jsf?id=627070\" data-type=\"link\" data-id=\"https:\/\/app.cristin.no\/projects\/show.jsf?id=627070\" target=\"_blank\" rel=\"noreferrer noopener\">Cristin<\/a><\/li>\n\n\n\n<li><a rel=\"noreferrer noopener\" href=\"http:\/\/otoliths.ath.hcmr.gr\/\" data-type=\"link\" data-id=\"http:\/\/otoliths.ath.hcmr.gr\/\" target=\"_blank\">DeepOtolith<\/a><\/li>\n\n\n\n<li><a rel=\"noreferrer noopener\" href=\"https:\/\/prosjektbanken.forskningsradet.no\/en\/project\/FORISS\/270966?Kilde=FORISS&amp;distribution=Ar&amp;chart=bar&amp;calcType=funding&amp;Sprak=no&amp;sortBy=date&amp;sortOrder=desc&amp;resultCount=30&amp;offset=0&amp;Prosjektleder=Eivind%20Egeland\" data-type=\"link\" data-id=\"https:\/\/prosjektbanken.forskningsradet.no\/en\/project\/FORISS\/270966?Kilde=FORISS&amp;distribution=Ar&amp;chart=bar&amp;calcType=funding&amp;Sprak=no&amp;sortBy=date&amp;sortOrder=desc&amp;resultCount=30&amp;offset=0&amp;Prosjektleder=Eivind%20Egeland\" target=\"_blank\">Project Bank<\/a><\/li>\n<\/ul>\n<\/div>\n<\/div>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"180\" height=\"135\" src=\"https:\/\/nr.stage.dekodes.no\/content\/uploads\/sites\/2\/2024\/02\/cogmar-logo.png\" alt=\"The image shows  the COGMAR logo in blue, capitalised font. 
The logo is shaped like an eye and there are three waves inside the outline.\" class=\"wp-image-25992\" \/><\/figure>\n<\/div>\n<\/div>\n","protected":false},"featured_media":26013,"template":"","meta":{"_acf_changed":false,"_trash_the_other_posts":false,"editor_notices":[],"footnotes":""},"class_list":["post-25866","bc_project","type-bc_project","status-publish","has-post-thumbnail"],"acf":[],"_links":{"self":[{"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/bc_project\/25866","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/bc_project"}],"about":[{"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/types\/bc_project"}],"version-history":[{"count":5,"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/bc_project\/25866\/revisions"}],"predecessor-version":[{"id":34422,"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/bc_project\/25866\/revisions\/34422"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/media\/26013"}],"wp:attachment":[{"href":"https:\/\/nr.stage.dekodes.no\/en\/wp-json\/wp\/v2\/media?parent=25866"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}