Squashed commit of the following:

commit 840e6c89a1
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Thu Nov 16 20:36:14 2023 +0000

    R2.02.09d

    Fix matched template classifier import
    fix azigram help file image display

commit acc806d375
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Wed Nov 15 13:08:02 2023 +0000

    Updated X3 library to 2.2.6

commit a4f484c76c
Merge: 8e60ad2e d7c4c278
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Wed Nov 15 09:44:09 2023 +0000

    Merge branch 'main' of https://github.com/PAMGuard/PAMGuard.git into main

commit 8e60ad2eff
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Wed Nov 15 09:43:39 2023 +0000

    Update to array diagnostics and sensor control

commit d7c4c278d8
Author: m2oswald <45486636+m2oswald@users.noreply.github.com>
Date:   Wed Nov 15 09:17:49 2023 +0000

    Added code to Rocca for training/testing classifier (#114)

    * allow Rocca to run without classifiers

    Fixed bug that threw an error if no classifier files were specified in Rocca Params dialog

    * add rocca switch to enable dev mode

    currently only shows/hides extra buttons in the Params dialog, but will
    extend to more options in the future

    * Fix memory issue with RoccaContourDataBlocks not being released for
    garbage collection

    Set RoccaContourDataBlock objects to null and stop PamObserver Timer to
    force release

    * Fix problem tracing whistles in Rocca spectrogram pop-up

    Whistle and raw data were being cleared before the user had time to trace out the whistle, causing PAMGuard to throw an exception.  Both were already being cleared when the pop-up window is closed, so no need to do it here.

    * updated for training/testing classifiers

commit d5f504dcd1
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Fri Nov 10 18:08:31 2023 +0000

    Bearing localiser offline save

    Bug stopping it correctly saving data from the offline task to the
    database fixed.

commit 7a44d49e27
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Fri Oct 27 09:59:28 2023 +0100

    X3 Version 2.2.3

    Add maven jar files for X3/SUD version 2.2.3

commit fa5fe9943d
Author: Douglas Gillespie <50671166+douggillespie@users.noreply.github.com>
Date:   Thu Oct 26 14:44:41 2023 +0100

    Update sud file management to skip more efficiently to the correct part
    of a sud file when reading data offline.

commit 60435e567a
Author: Brian S Miller <93690136+BrianSMiller@users.noreply.github.com>
Date:   Fri Sep 8 21:54:40 2023 +1000

    Fixes issues #111 and fixes #112 (DIFAR module crashes and ability to use Deep Learning Detections in DIFAR module) (#110)

    * Bugfix for OverlayMarks

    Check for a null pointer exception in OverlayMarks that was causing a crash on startup.

    * Bugfix for null pointer in symbol manager

    Fix a bug that I found where the DIFAR module was crashing the symbol manager. Seems that this was due to this subclass of clip generator having a null value for its uniqueName. I've fixed it by checking for null values and assigning a generic symbol when null.

    * DeepLearning detections canGenerateClips=true

    Set flag in DeepLearning detector so that detections are considered 'clips' by Pamguard. This allows them to be processed automatically in the DIFAR Localisation module (and maybe others).

    * DIFAR: bugfix frequency limits for auto-detections

    Fix a bug in DIFAR module where the frequency limits of automated detections were not being set properly by the DIFAR module.

    * DeepLearning - Bugfix to detection duration

    Fix bug in deep learning detector where duration (in samples) was being set to number of samples in a hop instead of the number of samples in a segment.
This commit is contained in:
Jamie Mac 2023-12-12 12:51:13 +00:00
parent a235619edd
commit 54264a689b
22 changed files with 499 additions and 241 deletions


@@ -8,6 +8,7 @@
 	</classpathentry>
 	<classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-17">
 		<attributes>
+			<attribute name="module" value="true"/>
 			<attribute name="maven.pomderived" value="true"/>
 		</attributes>
 	</classpathentry>


@@ -4,7 +4,7 @@
 	<groupId>org.pamguard</groupId>
 	<artifactId>Pamguard</artifactId>
 	<name>Pamguard Java12+</name>
-	<version>2.02.09e</version>
+	<version>2.02.09c</version>
 	<description>Pamguard for Java 12+, using Maven to control dependcies</description>
 	<url>www.pamguard.org</url>
 	<organization>
@@ -53,16 +53,16 @@
 		<plugins>
 			<plugin>
 				<artifactId>maven-compiler-plugin</artifactId>
-				<version>3.11.0</version>
+				<version>3.8.1</version>
 				<dependencies>
 					<dependency>
 						<groupId>org.eclipse.tycho</groupId>
 						<artifactId>tycho-compiler-jdt</artifactId>
-						<version>4.0.3</version>
+						<version>1.5.1</version>
 					</dependency>
 				</dependencies>
 				<configuration>
-					<release>17</release>
+					<release>11</release>
 					<compilerId>jdt</compilerId>
 					<compilerArguments>
 						<properties>.settings/org.eclipse.jdt.core.prefs</properties>
@@ -72,11 +72,16 @@
 			<plugin>
 				<groupId>org.openjfx</groupId>
 				<artifactId>javafx-maven-plugin</artifactId>
-				<version>0.0.8</version>
+				<version>0.0.6</version>
+				<configuration>
+					<source>17</source>
+					<target>17</target>
+					<release>17</release>
+				</configuration>
 			</plugin>
 			<plugin>
 				<artifactId>maven-shade-plugin</artifactId>
-				<version>3.5.1</version>
+				<version>3.2.1</version>
 				<executions>
 					<execution>
 						<phase>package</phase>
@@ -189,8 +194,8 @@
 			</plugins>
 		</reporting>
 	<properties>
-		<maven.compiler.target>17</maven.compiler.target>
-		<maven.compiler.source>17</maven.compiler.source>
-		<javafx.version>17</javafx.version>
+		<maven.compiler.target>11</maven.compiler.target>
+		<maven.compiler.source>11</maven.compiler.source>
+		<javafx.version>16</javafx.version>
 	</properties>
 </project>

pom.xml

@@ -4,7 +4,7 @@
 	<modelVersion>4.0.0</modelVersion>
 	<groupId>org.pamguard</groupId>
 	<artifactId>Pamguard</artifactId>
-	<version>2.02.09e</version>
+	<version>2.02.09d</version>
 	<name>Pamguard Java12+</name>
 	<description>Pamguard for Java 12+, using Maven to control dependcies</description>
 	<url>www.pamguard.org</url>
@@ -15,9 +15,9 @@
 	<properties>
-		<javafx.version>17</javafx.version>
-		<maven.compiler.source>17</maven.compiler.source>
-		<maven.compiler.target>17</maven.compiler.target>
+		<javafx.version>16</javafx.version>
+		<maven.compiler.source>11</maven.compiler.source>
+		<maven.compiler.target>11</maven.compiler.target>
 	</properties>
 	<build>
@@ -49,10 +49,11 @@
 			-->
 				<groupId>org.apache.maven.plugins</groupId>
 				<artifactId>maven-compiler-plugin</artifactId>
-				<version>3.11.0</version>
+				<version>3.8.1</version>
 				<configuration>
 					<!-- set compiler to use Java version 11 API https://docs.oracle.com/javase/9/tools/javac.htm#JSWOR627 -->
-					<release>17</release>
+					<release>11</release>
 					<compilerId>jdt</compilerId>
 					<compilerArguments>
 						<properties>.settings/org.eclipse.jdt.core.prefs</properties> <!-- make sure to use same params as what Eclipse uses -->
@@ -64,7 +65,7 @@
 					<dependency>
 						<groupId>org.eclipse.tycho</groupId>
 						<artifactId>tycho-compiler-jdt</artifactId>
-						<version>4.0.3</version>
+						<version>1.5.1</version>
 					</dependency>
 				</dependencies>
@@ -73,7 +74,12 @@
 			<plugin>
 				<groupId>org.openjfx</groupId>
 				<artifactId>javafx-maven-plugin</artifactId>
-				<version>0.0.8</version>
+				<version>0.0.6</version>
+				<configuration>
+					<source>17</source>
+					<target>17</target>
+					<release>17</release>
+				</configuration>
 			</plugin>
 			<!-- Maven Shade plugin - for creating the uberjar / fatjar -->
@@ -81,7 +87,7 @@
 			<plugin>
 				<groupId>org.apache.maven.plugins</groupId>
 				<artifactId>maven-shade-plugin</artifactId>
-				<version>3.5.1</version>
+				<version>3.2.1</version>
 				<configuration>
 					<transformers>
 						<transformer
@@ -312,7 +318,7 @@
 		<dependency>
 			<groupId>io.github.macster110</groupId>
 			<artifactId>jpamutils</artifactId>
-			<version>0.0.57</version>
+			<version>0.0.56</version>
 		</dependency>
 		<!--jpam project - Deep learning java library
@@ -323,7 +329,7 @@
 		<dependency>
 			<groupId>io.github.macster110</groupId>
 			<artifactId>jdl4pam</artifactId>
-			<version>0.0.97</version>
+			<version>0.0.94</version>
 		</dependency>
 		<!-- https://mvnrepository.com/artifact/gov.nist.math/jama -->
@@ -389,7 +395,7 @@
 		<dependency>
 			<groupId>net.synedra</groupId>
 			<artifactId>validatorfx</artifactId>
-			<version>0.4.2</version>
+			<version>0.4.0</version>
 		</dependency>
 		<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-compress -->
@@ -776,14 +782,14 @@
 		<!-- not in Maven repository
 		you may need to copy files from your downloaded PAMGuard source code, e.g. C:\Users\*yourreposfolder*\source\repos\PAMGuardPAMGuard\repo\pamguard\org\x3\2.2.2 to
-		C:\Users\*yourusername*\.m2\repository\pamguard\org\x3\2.2.2
-		-->
+		C:\Users\*yourusername*\.m2\repository\pamguard\org\x3\2.2.2-->
 		<dependency>
-			<groupId>org.pamguard</groupId>
+			<groupId>pamguard.org</groupId>
 			<artifactId>x3</artifactId>
 			<version>2.2.6</version>
 		</dependency>
 		<!-- https://mvnrepository.com/artifact/it.sauronsoftware/jave -->
 		<dependency>
 			<groupId>it.sauronsoftware</groupId>


@@ -0,0 +1,4 @@
+#NOTE: This is a Maven Resolver internal implementation file, its format can be changed without prior notice.
+#Wed Nov 15 12:43:42 GMT 2023
+x3-2.2.6.jar>=
+x3-2.2.6.pom>=

Binary file not shown.


@@ -0,0 +1,9 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
+    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
+  <modelVersion>4.0.0</modelVersion>
+  <groupId>pamguard.org</groupId>
+  <artifactId>x3</artifactId>
+  <version>2.2.6</version>
+  <description>POM was created from install:install-file</description>
+</project>


@@ -20,13 +20,12 @@ import PamUtils.worker.PamWorker;
 /**
  * Opens a .sud audio file.
  * <p>
- * Sud files contain X3 compressed audio data. The sud
- * file reader opens files, creating a map of the file and saving
- * the map as a.sudx file so it can be read more rapidly when the file
- * is next accessed.
+ * Sud files contain X3 compressed audio data. The sud file reader opens files,
+ * creating a map of the file and saving the map as a.sudx file so it can be
+ * read more rapidly when the file is next accessed.
  * <p>
- * The SudioAudioInput stream fully implements AudioInputStream and so
- * sud files can be accessed using much of the same code as .wav files.
+ * The SudioAudioInput stream fully implements AudioInputStream and so sud files
+ * can be accessed using much of the same code as .wav files.
  *
  * @author Jamie Macaulay
  *
@@ -35,14 +34,12 @@ public class SudAudioFile extends WavAudioFile {
 	private Object conditionSync = new Object();
 	private volatile PamWorker<AudioInputStream> worker;
 	private volatile SudMapWorker sudMapWorker;
 	public SudAudioFile() {
 		super();
-		fileExtensions = new ArrayList<String>(Arrays.asList(new String[]{".sud"}));
+		fileExtensions = new ArrayList<String>(Arrays.asList(new String[] { ".sud" }));
 	}
@@ -50,96 +47,96 @@ public class SudAudioFile extends WavAudioFile {
 		return "SUD";
 	}
 	@Override
 	public AudioInputStream getAudioStream(File soundFile) {
-		synchronized(conditionSync) {
-			//System.out.println("Get SUD getAudioStream : " + soundFile.getName());
+		synchronized (conditionSync) {
+			// System.out.println("Get SUD getAudioStream : " + soundFile.getName());
 			if (soundFile.exists() == false) {
 				System.err.println("The sud file does not exist: " + soundFile);
 				return null;
 			}
 			if (soundFile != null) {
-				if (new File(soundFile.getAbsolutePath()+"x").exists()) {
-					System.out.println("----NO NEED TO MAP SUD FILE-----" + soundFile);
+				if (new File(soundFile.getAbsolutePath() + "x").exists()) {
+					// System.out.println("----NO NEED TO MAP SUD FILE-----" + soundFile);
 					try {
 						return new SudAudioFileReader().getAudioInputStream(soundFile);
 					} catch (UnsupportedAudioFileException | IOException e) {
 						// TODO Auto-generated catch block
 						e.printStackTrace();
 					}
-				}
-				else {
-					System.out.println("----MAP SUD FILE ON OTHER THREAD-----" + soundFile);
+				} else {
+					// System.out.println("----MAP SUD FILE ON OTHER THREAD-----" + soundFile);
 					/**
-					 * We need to map the sud file. But we don't want this to just freeze the current GUI thread. Therefore
-					 * add a listener to the mapping process and show a blocking dialog to indicate that something is happening.
-					 * The mapping is put on a separate thread and blocks stuff from happening until the mapping process has completed.
+					 * We need to map the sud file. But we don't want this o just freeze the current
+					 * GUI thread. Therefore add a listener to the mapping process and show a
+					 * blocking dialog to indicate that something is happening. The mapping is put
+					 * on a separate thread and blocks stuff from happening until the mapping
+					 * process has completed.
 					 */
-					if (sudMapWorker==null || !sudMapWorker.getSudFile().equals(soundFile)) {
+					if (sudMapWorker == null || !sudMapWorker.getSudFile().equals(soundFile)) {
 						sudMapWorker = new SudMapWorker(soundFile);
-						worker = new PamWorker<AudioInputStream>(sudMapWorker, PamController.getInstance().getMainFrame(),1, "Mapping sud file: " + soundFile.getName());
+						worker = new PamWorker<AudioInputStream>(sudMapWorker,
+								PamController.getInstance().getMainFrame(), 1,
+								"Mapping sud file: " + soundFile.getName());
 						// System.out.println("Sud Audio Stream STARTED: " + soundFile.getName());
-						SwingUtilities.invokeLater(()->{
+						SwingUtilities.invokeLater(() -> {
 							worker.start();
 						});
-						//this should block AWT thread but won't block if called on another thread..
+						// this should block AWT thread but won't block if called on another thread..
 					}
-					//this is only ever called if this function is called on another thread other than the event dispatch thread.
-					while (sudMapWorker==null || !sudMapWorker.isDone()) {
-						//do nothing
-						System.out.println("Waiting for the SUD file map: " + soundFile.getName() + " worker: " + worker);
+					// this is only ever called if this function is called on another thread other
+					// than the event dispatch thread.
+					while (sudMapWorker == null || !sudMapWorker.isDone()) {
+						// do nothing
+						// System.out.println("Waiting for the SUD file map: " + soundFile.getName() + " worker: " + worker);
 						try {
 							// Thread.sleep(100);
 							Thread.sleep(100);
 						} catch (InterruptedException e) {
 							// TODO Auto-generated catch block
 							e.printStackTrace();
 						}
 					}
 					AudioInputStream stream = sudMapWorker.getSudAudioStream();
 					// sudMapWorker= null;
 					// worker = null;
-					System.out.println("----RETURN SUD FILE ON OTHER THREAD-----" + stream);
-
+					// System.out.println("----RETURN SUD FILE ON OTHER THREAD-----" + stream);
 					return stream;
 				}
 			}
 		}
 		return null;
 	}
 	public class SudMapProgress implements SudMapListener {
 		PamWorker<AudioInputStream> sudMapWorker;
 		public SudMapProgress(PamWorker<AudioInputStream> sudMapWorker) {
-			this.sudMapWorker=sudMapWorker;
+			this.sudMapWorker = sudMapWorker;
 		}
 		@Override
 		public void chunkProcessed(ChunkHeader chunkHeader, int count) {
-			//System.out.println("Sud Map Progress: " + count);
-			if (count%500 == 0) {
-				//don't update too often or everything just freezes
-				sudMapWorker.update(new PamWorkProgressMessage(-1, ("Mapped " +count + " sud file chunks")));
+			// System.out.println("Sud Map Progress: " + count);
+			if (count % 500 == 0) {
+				// don't update too often or everything just freezes
+				sudMapWorker.update(new PamWorkProgressMessage(-1, ("Mapped " + count + " sud file chunks")));
 			}
 			if (count == -1) {
 				sudMapWorker.update(new PamWorkProgressMessage(-1, ("Mapping sud file finished")));
@@ -149,12 +146,13 @@ public class SudAudioFile extends WavAudioFile {
 	}
 	/**
-	 * Opens an sud file on a different thread and adds a listener for a mapping. This allows
-	 * a callback to show map progress.
+	 * Opens an sud file on a different thread and adds a listener for a mapping.
+	 * This allows a callback to show map progress.
+	 *
 	 * @author Jamie Macaulay
 	 *
 	 */
-	public class SudMapWorker implements PamWorkWrapper<AudioInputStream>{
+	public class SudMapWorker implements PamWorkWrapper<AudioInputStream> {
 		private File soundFile;
@@ -180,26 +178,24 @@ public class SudAudioFile extends WavAudioFile {
 		public AudioInputStream runBackgroundTask(PamWorker<AudioInputStream> pamWorker) {
 			AudioInputStream stream;
 			try {
-				System.out.println("START OPEN SUD FILE:");
+				// System.out.println("START OPEN SUD FILE:");
 				this.sudMapListener = new SudMapProgress(pamWorker);
 				stream = new SudAudioFileReader().getAudioInputStream(soundFile, sudMapListener);
-				System.out.println("END SUD FILE:");
-
-				//for some reason - task finished may not be called on other
-				//thread so put this here.
+				// System.out.println("END SUD FILE:");
+				// for some reason - task finished may not be called on other
+				// thread so put this here.
 				this.result = stream;
 				this.done = true;
 				return stream;
-			}
-			catch (UnsupportedAudioFileException e) {
-				System.err.println("UnsupportedAudioFileException: Could not open sud file: not a supported file " + soundFile.getName());
+			} catch (UnsupportedAudioFileException e) {
+				System.err.println("UnsupportedAudioFileException: Could not open sud file: not a supported file "
+						+ soundFile.getName());
 				System.err.println(e.getMessage());
 				// e.printStackTrace();
 			} catch (IOException e) {
 				System.err.println("Could not open sud file: IO Exception: " + soundFile.getName());
@@ -210,7 +206,7 @@ public class SudAudioFile extends WavAudioFile {
 		@Override
 		public void taskFinished(AudioInputStream result) {
-			System.out.println("TASK FINSIHED:");
+			// System.out.println("TASK FINSIHED:");
 			this.result = result;
 			this.done = true;
 		}
@@ -221,7 +217,4 @@ public class SudAudioFile extends WavAudioFile {
 	}
 }


@@ -37,16 +37,21 @@ public class SudAudioFileReader {
 	}
 	/**
-	 * Get the audio input stream for a sud file.
+	 * Get the audio input streamn.
 	 * @param file - the .sud file to open.
 	 * @return the sud AudioStream.
 	 * @throws UnsupportedAudioFileException
 	 * @throws IOException
 	 */
 	public AudioInputStream getAudioInputStream(File file) throws UnsupportedAudioFileException, IOException {
-		return getAudioInputStream( file, null);
+		try {
+			sudAudioInputStream = SudAudioInputStream.openInputStream(file, sudParams, false);
+		} catch (Exception e) {
+			String msg = String.format("Corrupt sud file %s: %s", file.getName(), e.getMessage());
+			throw new UnsupportedAudioFileException(msg);
+		}
+		return sudAudioInputStream;
 	}
 	/**
 	 * Get the audio input stream for a sud file.
 	 * @param file - the .sud file to open.
@@ -68,7 +73,4 @@ public class SudAudioFileReader {
 		return sudAudioInputStream;
 	}
 }
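The reworked one-argument `getAudioInputStream` above rethrows any low-level failure as a checked `UnsupportedAudioFileException` whose message names the offending file. A minimal, self-contained sketch of that wrapping pattern (the class and the stand-in reader here are hypothetical, not the real sud-reading classes):

```java
import java.io.File;
import javax.sound.sampled.UnsupportedAudioFileException;

// Sketch of the error-wrapping pattern used above: any exception from the
// low-level reader is rethrown as UnsupportedAudioFileException with the
// file name and the original cause's message combined.
public class SudOpenSketch {

    // Hypothetical stand-in for SudAudioInputStream.openInputStream(...);
    // always fails so the wrapping path is exercised.
    static Object openLowLevel(File file) {
        throw new IllegalStateException("bad block header");
    }

    static Object open(File file) throws UnsupportedAudioFileException {
        try {
            return openLowLevel(file);
        } catch (Exception e) {
            // keep the file name and the underlying message together
            String msg = String.format("Corrupt sud file %s: %s", file.getName(), e.getMessage());
            throw new UnsupportedAudioFileException(msg);
        }
    }

    public static void main(String[] args) {
        try {
            open(new File("example.sud"));
        } catch (UnsupportedAudioFileException e) {
            // -> Corrupt sud file example.sud: bad block header
            System.out.println(e.getMessage());
        }
    }
}
```

The benefit is that callers catch one checked audio-format exception while the diagnostic still says which file broke and why.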


@@ -7,7 +7,6 @@ import org.pamguard.x3.sud.SudAudioInputStream;
 public class SUDFileTime {
 	private static long sudTime;
 	private static String lastFilePath = "";
 	/**
 	 * Temp measure to get the time from the first available sud record.
@@ -15,9 +14,7 @@
 	 * @return
 	 */
 	public static long getSUDFileTime(File file) {
 		//System.out.println("Get sud file time: " + file.getName());
 		if (file == null || file.exists() == false) {
 			return Long.MIN_VALUE;
 		}
@@ -48,15 +45,12 @@
 //			return Long.MIN_VALUE;
 //		}
 //		long t = sudMap.getFirstChunkTimeMillis();
 		long t = SudAudioInputStream.quickFileTime(file);
 		t=t/1000; //turn to milliseconds.
 		if (t != 0) {
 			sudTime = t;
 		}
 //		sudAudioInputStream.addSudFileListener((chunkID, sudChunk)->{
 //			ChunkHeader chunkHead = sudChunk.chunkHeader;
 //			if (chunkHead == null || sudTime != Long.MIN_VALUE) {
@@ -78,12 +72,10 @@
 //		sudAudioInputStream.close();
 //		long t2 = System.currentTimeMillis();
 //		System.out.printf("SUD file time %s extracted in %d milliseconds\n", PamCalendar.formatDBDateTime(sudTime), t2-t1);
 		} catch (Exception e) {
 			System.err.println("Error getting time from SUD file: " + file + " " + e.getMessage());
 			e.printStackTrace();
 		}
 		return sudTime;
 	}


@@ -24,19 +24,19 @@ public class PamguardVersionInfo {
 	 * PAMGuard can work with.
 	 */
 	static public final String minJavaVersion = "11.0.0";
-	static public final String maxJavaVersion = "21.99.99";
+	static public final String maxJavaVersion = "19.99.99";
 	/**
 	 * Version number, major version.minorversion.sub-release.
 	 * Note: can't go higher than sub-release 'f'
 	 */
-	static public final String version = "2.02.09e";
+	static public final String version = "2.02.09c";
 	/**
 	 * Release date
 	 */
-	static public final String date = "29 June 2023";
+	static public final String date = "10 November 2023";
 //	/**
 //	 * Release type - Beta or Core


@@ -89,7 +89,8 @@ public class TxtFileUtils {
 				//5/08/2022 - there was a bug here where there is some sort of invisible character that does not appear on the
 				//print screen - the only way you can tell is the char array is greater than the number of digits - removed all non numeric
 				//characters.
-				String number = new String(recordsOnLine[i].strip().replaceAll("[^\\d.]", ""));
+				// updated again on 15/11/23 to include - signs, or you end up with the abs(of every number!)
+				String number = new String(recordsOnLine[i].strip().replaceAll("[^\\d.-]", ""));
 				dat = Double.valueOf(number);
 				//dat = DecimalFormat.getNumberInstance().parse(new String(recordsOnLine[i].strip().toCharArray())).doubleValue();
 			}
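The comment in the hunk above is worth unpacking: the old character class `[^\\d.]` deletes every character that is not a digit or a dot, which also deletes a leading minus sign, so every negative value silently parses as its absolute value. Adding `-` to the class keeps the sign while still stripping invisible junk characters. A small self-contained illustration (hypothetical class name):

```java
// Demonstrates the sign-stripping bug fixed above. The input contains a
// non-breaking space (U+00A0), the kind of invisible character the original
// comment describes: strip() does not remove it, but the regex does.
public class RegexSignDemo {
    public static void main(String[] args) {
        String raw = "\u00A0-3.5";
        // old class deletes '-' along with the junk -> abs value
        double oldWay = Double.valueOf(raw.strip().replaceAll("[^\\d.]", ""));
        // new class keeps '-' -> correct sign
        double newWay = Double.valueOf(raw.strip().replaceAll("[^\\d.-]", ""));
        System.out.println(oldWay + " " + newWay); // 3.5 -3.5
    }
}
```

Note the trailing `-` inside `[^\d.-]` is literal (no range), which is why this placement is safe.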


@@ -10,7 +10,7 @@ public class ArraySensorParams implements Serializable, Cloneable, ManagedParame
 	public static final long serialVersionUID = 1L;
-	public int readIntervalMillis = 1000;
+	public volatile int readIntervalMillis = 1000;
 	private ArrayDisplayParameters arrayDisplayParameters;


@@ -45,7 +45,8 @@ public class ArraySensorProcess extends PamProcess {
 		while(true) {
 			readData();
 			try {
-				Thread.sleep(analogSensorControl.getAnalogSensorParams().readIntervalMillis);
+				int slptime = analogSensorControl.getAnalogSensorParams().readIntervalMillis;
+				Thread.sleep(slptime);
 			} catch (InterruptedException e) {
 				e.printStackTrace();
 			}
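The two sensor-control hunks above belong together: `readIntervalMillis` becomes `volatile` so the polling thread is guaranteed to see interval changes made from the settings dialog, and the value is copied into a local before sleeping so one pass uses one consistent value. A minimal sketch of that pattern (a hypothetical stand-alone class, not PAMGuard's actual sensor process):

```java
// Sketch: a polling loop on its own thread re-reads a volatile interval
// field each pass. Without volatile, the JIT may hoist the field read out
// of the loop and the worker would never see updates from another thread.
public class PollingSketch {
    // analogous to ArraySensorParams.readIntervalMillis
    public volatile int readIntervalMillis = 1000;
    public volatile boolean running = true;
    public Thread worker;

    public void start() {
        worker = new Thread(() -> {
            while (running) {
                // readData() would go here
                try {
                    // copy to a local so sleep uses one consistent value
                    int slptime = readIntervalMillis;
                    Thread.sleep(slptime);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return; // woken by stop()
                }
            }
        });
        worker.start();
    }

    public void stop() throws InterruptedException {
        running = false;     // visible to the worker because volatile
        worker.interrupt();  // wake it if it is mid-sleep
        worker.join(2000);
    }

    public static void main(String[] args) throws InterruptedException {
        PollingSketch p = new PollingSketch();
        p.start();
        p.readIntervalMillis = 10; // settings dialog shortens the interval
        Thread.sleep(50);
        p.stop();
        System.out.println("worker alive after stop: " + p.worker.isAlive());
    }
}
```

The interrupt in `stop()` matters: a plain flag check could leave the worker asleep for a full interval before it notices the shutdown.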


@@ -139,7 +139,7 @@ public class BrainBoxDevices implements AnalogDeviceType, PamSettings{
 			double sensData = BBED549.hexToEngineering(bbRanges[item], sensInts);
 			double paramValue = calibration.rawToValue(sensData, calibrationData[item]);
 			analogDevicesManager.notifyData(new ItemAllData(item, sensInts, sensData, paramValue));
-			// System.out.printf("Read item %d, chan %d, int %d, real %3.5f, param %3.5f\n", iChan, chan, sensInts, sensData, paramValue);
+			// System.out.printf("Read item %d, chan %d, int %d, real %3.5f, param %3.5f\n", 0, chan, sensInts, sensData, paramValue);
 			sayError(null);
 			return new AnalogSensorData(sensData, paramValue);


@@ -221,7 +221,7 @@ public class AnalogDiagnosticsDisplay extends UserDisplayComponentAdapter implem
 			break;
 		case 3:
 			if (lastUpdate[rowIndex] > 0) {
-				return PamCalendar.formatTime(lastUpdate[rowIndex]);
+				return PamCalendar.formatTime(lastUpdate[rowIndex], true);
 			}
 			break;
 		case 4:


@@ -1041,6 +1041,9 @@ public abstract class SQLLogging {
 	 * @return a result set
 	 */
 	protected ResultSet createViewResultSet(PamConnection con, PamViewParameters pamViewParameters) {
+		if (con == null) {
+			return null;
+		}
 		String viewerClause = getViewerLoadClause(con.getSqlTypes(), pamViewParameters);
 		return createViewResultSet(con, viewerClause);
 	}


@@ -36,11 +36,11 @@ The Azigram plugin is version 0.0.1 and has been tested on Pamguard version 2.01
<p>The sample rate of the Azigram output can be&nbsp;chosen from the&nbsp;Output panel of the settings. The plugin uses frequency domain downsampling in order to achieve the&nbsp;selected sample rate. When selecting the output sample rate, the output FFT length and FFT hop will be altered in order to maintain&nbsp;the same time and frequency resolution as the upstream FFT module.</p>
-<p><img src="AzigramSettings.png" /></p>
+<p><img src="./images/AzigramSettings.png" /></p>
<p>The Azigram can be viewed on a Spectrogram Display.&nbsp;The HSV colour model is recommended for viewing Azigrams. This&nbsp;colour model&nbsp;is circular&nbsp;so will better illustrate the&nbsp;circular nature of the angular data (e.g.&nbsp;sounds from&nbsp;359 degrees will be similar in colour to sounds from 1 degree).&nbsp;The limits of the Amplitude Range on the &quot;Scales&quot; tab of the&nbsp;&quot;Spectrogram Parameters&quot; should be manually set to Min 0 and Max 360. While this tab&nbsp;suggests that the&nbsp;Min and Max are in dB,&nbsp;the&nbsp;Azigram module will treat these&nbsp;values as degrees if an Azigram is being displayed.</p>
-<p><img src="AzigramDisplay.png" /></p>
+<p><img src="./images/AzigramDisplay.png" /></p>
<p>&nbsp;</p>
@@ -48,7 +48,7 @@ The Azigram plugin is version 0.0.1 and has been tested on Pamguard version 2.01
<p>The screenshot below shows the DIFAR Azigram Module displaying the Azigram output (top graph) and regular spectrogram output (middle panel) of a synthetic test signal. The test signal is a simulated DIFAR source with 8 short FM downsweeps arriving from 0-315 degrees in 45 degree increments. The bottom panel shows the PAMGuard Data Model.</p>
-<p><img src="AzigramExample.png" /></p>
+<p><img src="./images/AzigramExample.png" /></p>
<p>&nbsp;</p>
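The help text above recommends the HSV colour model because hue wraps around at 360 degrees, matching the circular bearing data. As a hypothetical illustration only (the `AzimuthColour` class is invented for this sketch and is not part of the plugin), the wrap-around behaviour can be seen with `java.awt.Color.getHSBColor`:

```java
import java.awt.Color;

class AzimuthColour {
    /** Map an azimuth in degrees to a fully saturated, fully bright HSV hue. */
    static Color forAzimuth(double degrees) {
        // Hue is cyclic: 0.0 and 1.0 are the same colour, so 359 deg ends up
        // visually adjacent to 1 deg, as the help text describes.
        float hue = (float) ((degrees % 360.0) / 360.0);
        return Color.getHSBColor(hue, 1.0f, 1.0f);
    }

    public static void main(String[] args) {
        System.out.println(forAzimuth(0));    // pure red
        System.out.println(forAzimuth(359));  // very close to red
        System.out.println(forAzimuth(180));  // opposite side of the hue wheel
    }
}
```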


@@ -32,6 +32,7 @@ import java.io.IOException;
import java.util.EnumMap;

import javax.swing.JFileChooser;
+import javax.swing.filechooser.FileNameExtensionFilter;

import PamUtils.PamCalendar;
@@ -126,31 +127,34 @@ public class RoccaClassifyThis {

    /** the field in the RoccaContourStats object which contains all the stats measures */
    private EnumMap<RoccaContourStats.ParamIndx, Double> contourStats;

-   private String dirIn;
-
-   /** the input filename */
-   private String csvIn;
-
-   /** the input file */
-   private File statsFileIn;
-
-   /** the output filename */
-   private String csvOut;
-
-   /** the output file */
-   private File statsFileOut;
-
-   /** Constructor */
+   /**
+    * Constructor used when allowing user to select training dataset
+    * */
    public RoccaClassifyThis(RoccaProcess roccaProcess) {
-
-       // initialize the BufferedReader
-       BufferedReader inputFile = null;
+       File statsFileIn = getTheFile();
+       if (statsFileIn!=null) {
+           runTheClassifier(statsFileIn, roccaProcess);
+       }
+   }
+
+   /**
+    * Constructor when we pass in the training dataset
+    */
+   public RoccaClassifyThis() {
+   }
+
+   /**
+    * Ask the user to select the file containing the testing dataset
+    *
+    * @return File the csv file containing the testing dataset
+    */
+   public File getTheFile() {

        // set the directory
//      this.dirIn = new String("C:\\Users\\Mike\\Documents\\Work\\Java\\EclipseWorkspace\\testing\\RoccaClassifyThis_testing");
//      this.dirIn = new String("C:\\Users\\Mike\\Documents\\Work\\Tom\\Atlantic Classifier\\manual 2-stage data");
//      this.dirIn = new String("C:\\Users\\Mike\\Documents\\Work\\Tom\\Hawaii dataset problems");
-       this.dirIn = new String("C:\\Users\\SCANS\\Documents\\Work\\Biowaves\\ONR classifier");
+//      this.dirIn = new String("C:\\Users\\SCANS\\Documents\\Work\\Biowaves\\ONR classifier");

        // Define the input and output filenames
        // Hard-coded for now. To Do: query the user for the filename
@@ -158,28 +162,44 @@ public class RoccaClassifyThis {
//      this.csvIn = new String("Manual_5sp_April 9 2013.csv");
//      this.csvIn = new String("CombinedContourStats-fixed.csv");
//      this.csvOut = new String("RoccaContourStatsReclassified.csv");
-       this.csvIn = new String("Atl_TestDFNoTrain_Call_W_160831.csv");
-       statsFileIn = new File(dirIn, csvIn);
-       this.csvOut = new String("Atl_TestDFNoTrain_Call_W_160829-classified.csv");
-       statsFileOut = new File(dirIn, csvOut);
+//      this.csvIn = new String("Atl_TestDFNoTrain_Call_W_160831.csv");
+//      statsFileIn = new File(dirIn, csvIn);
+//      this.csvOut = new String("Atl_TestDFNoTrain_Call_W_160829-classified.csv");
+//      statsFileOut = new File(dirIn, csvOut);

-//     JFileChooser fileChooser = new JFileChooser();
-//     fileChooser.setDialogTitle("Select spreadsheet to recalculate...");
-//     fileChooser.setFileHidingEnabled(true);
-//     fileChooser.setApproveButtonText("Select");
-//     fileChooser.setFileSelectionMode(JFileChooser.FILES_ONLY);
-//     int state = fileChooser.showOpenDialog(this.dirIn);
-//     if (state == JFileChooser.APPROVE_OPTION) {
+       // let the user select the arff file
+       JFileChooser fileChooser = new JFileChooser();
+       fileChooser.setDialogTitle("Select spreadsheet to recalculate...");
+       fileChooser.setFileHidingEnabled(true);
+       fileChooser.setApproveButtonText("Select");
+       fileChooser.setFileSelectionMode(JFileChooser.FILES_ONLY);
+       FileNameExtensionFilter restrict = new FileNameExtensionFilter("Only .csv files", "csv");
+       fileChooser.addChoosableFileFilter(restrict);
+
+       int state = fileChooser.showOpenDialog(null);
+       File statsFileIn = null;
+       if (state == JFileChooser.APPROVE_OPTION) {
+           // load the file
+           statsFileIn = fileChooser.getSelectedFile();
+           return statsFileIn;
+       } else {
+           return null;
+       }
+   }
+
+   /**
+    * Run the classifier
+    * @param statsFileIn the File containing the testing dataset
+    * @param roccaProcess the RoccaProcess instance
+    */
+   public void runTheClassifier(File statsFileIn, RoccaProcess roccaProcess) {
+       int index = statsFileIn.getAbsolutePath().lastIndexOf(".");
+       String csvOut = statsFileIn.getAbsolutePath().substring(0,index) + "-classified.csv";
+       File statsFileOut = new File(csvOut);
// load the classifier // load the classifier
@@ -187,6 +207,9 @@ public class RoccaClassifyThis {
        roccaProcess.setClassifierLoaded
            (roccaProcess.roccaClassifier.setUpClassifier());

+       // initialize the BufferedReader
+       BufferedReader inputFile = null;
+
        // open the input file
        try {
            System.out.println("Opening input file "+statsFileIn);
@@ -263,12 +286,45 @@ public class RoccaClassifyThis {
        contourStats.put(RoccaContourStats.ParamIndx.FREQPOSSLOPEMEAN, Double.parseDouble(dataArray[34]));
        contourStats.put(RoccaContourStats.ParamIndx.FREQNEGSLOPEMEAN, Double.parseDouble(dataArray[35]));
        contourStats.put(RoccaContourStats.ParamIndx.FREQSLOPERATIO, Double.parseDouble(dataArray[36]));
-       contourStats.put(RoccaContourStats.ParamIndx.FREQBEGSWEEP, Double.parseDouble(dataArray[37]));
-       //contourStats.put(RoccaContourStats.ParamIndx.FREQBEGUP, Double.parseDouble(dataArray[38]));
-       //contourStats.put(RoccaContourStats.ParamIndx.FREQBEGDWN, Double.parseDouble(dataArray[39]));
-       contourStats.put(RoccaContourStats.ParamIndx.FREQENDSWEEP, Double.parseDouble(dataArray[40]));
-       //contourStats.put(RoccaContourStats.ParamIndx.FREQENDUP, Double.parseDouble(dataArray[41]));
-       //contourStats.put(RoccaContourStats.ParamIndx.FREQENDDWN, Double.parseDouble(dataArray[42]));
+       // Note that we have to modify the FREQBEGSWEEP value. Weka is trained with the FREQBEGSWEEP param
+       // as -1=down, 0=flat and 1=up, and that would be how the test data comes through as well. HOWEVER,
+       // Weka assumes that for nominal parameters, the value is the index location (0,1 or 2) and NOT the actual trained
+       // value (-1,0 or 1). So if the whistle has a down sweep, Weka needs the FREQBEGSWEEP value to be 0 indicating the
+       // first location in the array (which was 'down'). If it was up, the value would need to be 2 indicating the third
+       // location in the array (which was 'up').
+       // Ideally we would map the values in the test data to the positions in the training array, but as a quick and
+       // dirty hack we'll simply add 1 to the value since the difference between the nominal values (-1,0,1) and the
+       // index positions (0,1,2) is an offset of 1
+       // Note also that we don't have to do the same thing for FREQBEGUP and FREQBEGDWN since, by coincidence, the training
+       // values of 0 and 1 happen to match the index locations of 0 and 1
+       //contourStats.put(RoccaContourStats.ParamIndx.FREQBEGSWEEP, Double.parseDouble(dataArray[37]));
+       double tempVal = Double.parseDouble(dataArray[37]);
+       tempVal++;
+       contourStats.put(RoccaContourStats.ParamIndx.FREQBEGSWEEP, tempVal);
+       contourStats.put(RoccaContourStats.ParamIndx.FREQBEGUP, Double.parseDouble(dataArray[38]));
+       contourStats.put(RoccaContourStats.ParamIndx.FREQBEGDWN, Double.parseDouble(dataArray[39]));
+       // Note that we have to modify the FREQENDSWEEP value. Weka is trained with the FREQENDSWEEP param
+       // as -1=down, 0=flat and 1=up, and that would be how the test data comes through as well. HOWEVER,
+       // Weka assumes that for nominal parameters, the value is the index location (0,1 or 2) and NOT the actual trained
+       // value (-1,0 or 1). So if the whistle has a down sweep, Weka needs the FREQENDSWEEP value to be 0 indicating the
+       // first location in the array (which was 'down'). If it was up, the value would need to be 2 indicating the third
+       // location in the array (which was 'up').
+       // Ideally we would map the values in the test data to the positions in the training array, but as a quick and
+       // dirty hack we'll simply add 1 to the value since the difference between the nominal values (-1,0,1) and the
+       // index positions (0,1,2) is an offset of 1
+       // Note also that we don't have to do the same thing for FREQENDUP and FREQENDDWN since, by coincidence, the training
+       // values of 0 and 1 happen to match the index locations of 0 and 1
+       //contourStats.put(RoccaContourStats.ParamIndx.FREQENDSWEEP, Double.parseDouble(dataArray[40]));
+       tempVal = Double.parseDouble(dataArray[40]);
+       tempVal++;
+       contourStats.put(RoccaContourStats.ParamIndx.FREQENDSWEEP, tempVal);
+       contourStats.put(RoccaContourStats.ParamIndx.FREQENDUP, Double.parseDouble(dataArray[41]));
+       contourStats.put(RoccaContourStats.ParamIndx.FREQENDDWN, Double.parseDouble(dataArray[42]));
+       // end of hack
        contourStats.put(RoccaContourStats.ParamIndx.NUMSWEEPSUPDWN, Double.parseDouble(dataArray[43]));
        contourStats.put(RoccaContourStats.ParamIndx.NUMSWEEPSDWNUP, Double.parseDouble(dataArray[44]));
        contourStats.put(RoccaContourStats.ParamIndx.NUMSWEEPSUPFLAT, Double.parseDouble(dataArray[45]));
@@ -285,8 +341,8 @@ public class RoccaClassifyThis {
        contourStats.put(RoccaContourStats.ParamIndx.INFLMEANDELTA, Double.parseDouble(dataArray[56]));
        contourStats.put(RoccaContourStats.ParamIndx.INFLSTDDEVDELTA, Double.parseDouble(dataArray[57]));
        contourStats.put(RoccaContourStats.ParamIndx.INFLMEDIANDELTA, Double.parseDouble(dataArray[58]));
-       contourStats.put(RoccaContourStats.ParamIndx.INFLDUR, Double.parseDouble(dataArray[59]));
-       contourStats.put(RoccaContourStats.ParamIndx.STEPDUR, Double.parseDouble(dataArray[60]));
+       //contourStats.put(RoccaContourStats.ParamIndx.INFLDUR, Double.parseDouble(dataArray[59]));
+       //contourStats.put(RoccaContourStats.ParamIndx.STEPDUR, Double.parseDouble(dataArray[60]));

        // Run the classifier
        roccaProcess.roccaClassifier.classifyContour2(rcdb);
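The "quick and dirty hack" in this hunk depends on the nominal values (-1, 0, 1) being exactly one less than their Weka index positions (0, 1, 2). A more explicit mapping can be sketched as below, assuming the training order was {down, flat, up}; the class `SweepIndexMapper` is hypothetical and not part of the commit:

```java
// Hypothetical sketch: map a nominal sweep value to the index position Weka
// expects, instead of relying on the fixed "+1" offset.
class SweepIndexMapper {
    /** Nominal sweep values in the order the classifier was trained with (assumed). */
    private static final double[] TRAINED_ORDER = {-1.0, 0.0, 1.0};

    /** Map a nominal sweep value (-1=down, 0=flat, 1=up) to its Weka index (0, 1, 2). */
    static int toWekaIndex(double nominalValue) {
        for (int i = 0; i < TRAINED_ORDER.length; i++) {
            if (TRAINED_ORDER[i] == nominalValue) {
                return i;
            }
        }
        throw new IllegalArgumentException("Unknown sweep value: " + nominalValue);
    }

    public static void main(String[] args) {
        // For this particular training order, the result always equals value + 1,
        // which is why the commit's offset hack works.
        System.out.println(toWekaIndex(-1.0)); // 0 (down)
        System.out.println(toWekaIndex(0.0));  // 1 (flat)
        System.out.println(toWekaIndex(1.0));  // 2 (up)
    }
}
```

Unlike the offset, an explicit lookup fails loudly on an out-of-range value rather than silently shifting it.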


@@ -169,6 +169,7 @@ public class RoccaParametersDialog extends PamDialog implements ActionListener,
    JButton classifier2Button;
    JButton recalcButton;
    JButton reclassifyButton;
+   JButton trainThenTestButton;
    JButton clearClassifier;
    JComboBox<DefaultComboBoxModel<Vector<String>>> stage1Classes;
    DefaultComboBoxModel<Vector<String>> stage1ClassModel;
@@ -513,6 +514,10 @@ public class RoccaParametersDialog extends PamDialog implements ActionListener,
        reclassifyButton.addActionListener(this);
        reclassifyButton.setToolTipText("Load the whistle data from the contour stats output file, and run it through the current Classifier");
        reclassifyButton.setVisible(true);
+       trainThenTestButton = new JButton("Train then Test");
+       trainThenTestButton.addActionListener(this);
+       trainThenTestButton.setToolTipText("Train a classifier on a set of training data, then test it with a set of testing data");
+       trainThenTestButton.setVisible(true);

        // ******** THIS LINES CONTROLS THE VISIBILITY ********
        if (RoccaDev.isEnabled()) {
@@ -528,13 +533,15 @@ public class RoccaParametersDialog extends PamDialog implements ActionListener,
            extraPanelLayout.createParallelGroup(GroupLayout.Alignment.LEADING)
            .addGroup(extraPanelLayout.createSequentialGroup()
                .addComponent(recalcButton)
-               .addComponent(reclassifyButton))
+               .addComponent(reclassifyButton)
+               .addComponent(trainThenTestButton))
        );
        extraPanelLayout.setVerticalGroup(
            extraPanelLayout.createSequentialGroup()
            .addGroup(extraPanelLayout.createParallelGroup(GroupLayout.Alignment.BASELINE)
                .addComponent(recalcButton)
-               .addComponent(reclassifyButton))
+               .addComponent(reclassifyButton)
+               .addComponent(trainThenTestButton))
        );

        classifierPanel.add(extraButtonsSubPanel);
@@ -892,7 +899,9 @@ public class RoccaParametersDialog extends PamDialog implements ActionListener,
        } else if (e.getSource() == recalcButton) {
            RoccaFixParams recalc = new RoccaFixParams(roccaControl.roccaProcess);
        } else if (e.getSource() == reclassifyButton) {
-           RoccaClassifyThisEvent reclassify = new RoccaClassifyThisEvent(roccaControl.roccaProcess);
+           RoccaClassifyThis reclassify = new RoccaClassifyThis(roccaControl.roccaProcess);
+       } else if (e.getSource() == trainThenTestButton) {
+           RoccaTrainThenTest trainThenTest = new RoccaTrainThenTest(roccaControl.roccaProcess);
        } else if (e.getSource() == fftButton) {
            roccaParameters.setUseFFT(true);
            this.enableTheCorrectSource();


@@ -145,6 +145,7 @@ public class RoccaRFModel implements java.io.Serializable {
        } catch (Exception ex) {
            System.err.println("1st Classification failed: " + ex.getMessage());
+           ex.printStackTrace();
            rcdb.setClassifiedAs("Err");
        }
    }


@@ -24,10 +24,14 @@
package rocca;

import java.io.BufferedReader;
+import java.io.File;
import java.io.FileReader;
import java.util.Date;
import java.util.Enumeration;

+import javax.swing.JFileChooser;
+import javax.swing.filechooser.FileNameExtensionFilter;
+
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.SerializationHelper;
@@ -42,13 +46,64 @@ import weka.core.SerializationHelper;
 */
public class RoccaTrainClassifier {

+   /**
+    * Standalone implementation
+    *
+    * @param args
+    */
    public static void main(String[] args) {
+       RoccaTrainClassifier rtc = new RoccaTrainClassifier();
+       File arffFile = rtc.getArff();
+       if (arffFile!=null) {
+           String modelName = rtc.trainClassifier(arffFile);
+       }
+   }
+
+   /**
+    * Let user choose arff file training dataset
+    *
+    * @return File the arff file containing the training dataset
+    */
+   public File getArff() {
+
+//     String arffFile = "C:\\Users\\SCANS\\Documents\\Work\\Biowaves\\ONR classifier\\TP_TrainEvtDF_170408";
+
+       // let the user select the arff file
+       JFileChooser fileChooser = new JFileChooser();
+       fileChooser.setDialogTitle("Select arff file containing training data");
+       fileChooser.setFileHidingEnabled(true);
+       fileChooser.setApproveButtonText("Select");
+       fileChooser.setFileSelectionMode(JFileChooser.FILES_ONLY);
+       FileNameExtensionFilter restrict = new FileNameExtensionFilter("Only .arff files", "arff");
+       fileChooser.addChoosableFileFilter(restrict);
+
+       File arffFile;
+       int state = fileChooser.showOpenDialog(null);
+       if (state == JFileChooser.APPROVE_OPTION) {
+           // load the file
+           arffFile = fileChooser.getSelectedFile();
+           return arffFile;
+       } else {
+           return null;
+       }
+   }
+
+   /**
+    * Actual code to train the classifier
+    *
+    */
+   public String trainClassifier(File arffFile) {

        RandomForest model = new RandomForest ();
        Instances trainData = null;
-       String arffFile = "C:\\Users\\SCANS\\Documents\\Work\\Biowaves\\ONR classifier\\TP_TrainEvtDF_170408";

        // load the ARFF file containing the training set
-       System.out.println("Loading data...");
+       System.out.println("Loading data..." + arffFile.getAbsolutePath());
        try {
            trainData = new Instances
                (new BufferedReader
@@ -56,10 +111,13 @@ public class RoccaTrainClassifier {
//              ("C:\\Users\\Mike\\Documents\\Work\\Java\\WEKA\\allwhists 12 vars 8sp update 1-28-10.arff")));
//              ("C:\\Users\\Mike\\Documents\\Work\\Java\\WEKA\\weka vs R\\ETP_orcawale_whists2 modified-subset110perspecies-no_harm_ratios.arff")));
//              ("C:\\Users\\SCANS\\Documents\\Work\\Biowaves\\ONR classifier\\Atl_TrainDF_Event_160829.arff")));
-               (arffFile + ".arff")));
+//              (arffFile + ".arff")));
+               (arffFile)));
            trainData.setClassIndex(trainData.numAttributes()-1);
        } catch (Exception ex) {
            System.out.println("Error Loading...");
+           ex.printStackTrace();
+           return null;
        }
// set the classifier parameters // set the classifier parameters
@@ -78,6 +136,8 @@ public class RoccaTrainClassifier {
            model.setOptions(options);
        } catch (Exception ex) {
            System.out.println("Error setting options...");
+           ex.printStackTrace();
+           return null;
        }
// train the classifier // train the classifier
@@ -90,23 +150,29 @@ public class RoccaTrainClassifier {
                new Date());
        } catch (Exception ex) {
            System.out.println("Error training classifier...");
+           ex.printStackTrace();
+           return null;
        }

        // save the classifier
-       String[] curOptions = model.getOptions();
-       Enumeration test = model.listOptions();
-       System.out.println("Saving Classifier...");
+//     String[] curOptions = model.getOptions();
+//     Enumeration test = model.listOptions();
        Instances header = new Instances(trainData,0);
+       int index = arffFile.getAbsolutePath().lastIndexOf(".");
+       String modelName = arffFile.getAbsolutePath().substring(0,index) + ".model";
+       System.out.println("Saving Classifier..." + modelName);
        try {
            SerializationHelper.writeAll
//              ("C:\\Users\\Mike\\Documents\\Work\\Java\\WEKA\\weka vs R\\RF_8sp_54att_110whistle-subset.model",
-               (arffFile + ".model",
+//              (arffFile + ".model",
+               (modelName,
                new Object[]{model,header});
+           System.out.println("Finished!");
+           return modelName;
        } catch (Exception ex) {
            System.out.println("Error saving classifier...");
+           ex.printStackTrace();
        }
-
-       System.out.println("Finished!");
+       return null;
    }
}
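`trainClassifier` now derives the `.model` output name from the `.arff` input path via `lastIndexOf(".")` and `substring`. A minimal sketch of that derivation is below; the helper class `ModelNameUtil` is invented for illustration, and note that the commit's version assumes the path contains a dot:

```java
class ModelNameUtil {
    /** Replace a file's extension with ".model", as trainClassifier does. */
    static String modelNameFor(String arffPath) {
        int index = arffPath.lastIndexOf(".");
        // Guard the no-extension case; the code in the commit would throw
        // StringIndexOutOfBoundsException here (substring with index -1).
        if (index < 0) {
            return arffPath + ".model";
        }
        return arffPath.substring(0, index) + ".model";
    }

    public static void main(String[] args) {
        System.out.println(modelNameFor("TP_TrainEvtDF_170408.arff")); // TP_TrainEvtDF_170408.model
    }
}
```

A remaining caveat (shared with the commit) is that `lastIndexOf(".")` on a path like `dir.name/file` would find the dot in the directory name.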


@@ -0,0 +1,109 @@
/*
* PAMGUARD - Passive Acoustic Monitoring GUARDianship.
* To assist in the Detection Classification and Localisation
* of marine mammals (cetaceans).
*
* Copyright (C) 2006
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU General Public License
* as published by the Free Software Foundation; either version 3
* of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/
package rocca;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import javax.swing.JFileChooser;
import javax.swing.filechooser.FileNameExtensionFilter;
public class RoccaTrainThenTest {
RoccaTrainClassifier roccaTrainClassifier;
RoccaClassifyThis roccaClassifyThis;
/**
* Main Constructor
* @param roccaProcess
*/
public RoccaTrainThenTest(RoccaProcess roccaProcess) {
// let the user select the csv file containing the training and testing dataset(s)
JFileChooser fileChooser = new JFileChooser();
fileChooser.setDialogTitle("Select csv file with the training/testing pairs");
fileChooser.setFileHidingEnabled(true);
fileChooser.setApproveButtonText("Select");
fileChooser.setFileSelectionMode(JFileChooser.FILES_ONLY);
FileNameExtensionFilter restrict = new FileNameExtensionFilter("Only .csv files", "csv");
fileChooser.addChoosableFileFilter(restrict);
int state = fileChooser.showOpenDialog(null);
if (state == JFileChooser.APPROVE_OPTION) {
// load the file
try {
File csvDataPairs = fileChooser.getSelectedFile();
BufferedReader br = new BufferedReader(new FileReader(csvDataPairs));
String curPath = csvDataPairs.getParent();
// main loop
// read through the csv file one line at a time. The first column should contain the training dataset filename,
// and the second column the testing dataset filename. Paths should be relative to the path containing
// the csv file
String line = "";
String splitBy = ",";
while ((line=br.readLine())!=null) {
String[] filenames = line.split(splitBy);
// train the classifier
File arffFile = new File(curPath + File.separator + filenames[0]);
roccaTrainClassifier = new RoccaTrainClassifier();
String modelName = roccaTrainClassifier.trainClassifier(arffFile);
if (modelName == null) {
System.out.println("ERROR: could not create classifier model from "+arffFile);
continue;
}
// set the classifier as the current one in RoccaParameters
roccaProcess.roccaControl.roccaParameters.setRoccaClassifierModelFilename(new File(modelName));
// test the classifier with the testing dataset
File testFile = new File(curPath + File.separator + filenames[1]);
roccaClassifyThis = new RoccaClassifyThis();
roccaClassifyThis.runTheClassifier(testFile, roccaProcess);
}
} catch (FileNotFoundException e) {
e.printStackTrace();
return;
} catch (IOException e) {
e.printStackTrace();
return;
}
} else {
return;
}
}
}
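The constructor above reads training/testing pairs one CSV row at a time, resolving both filenames against the folder containing the control file. A standalone sketch of just that parsing step follows; the class `TrainTestPairReader` and the fold filenames are hypothetical, not part of the commit:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

class TrainTestPairReader {
    /** One train/test filename pair from the control csv. */
    static class Pair {
        final File trainArff;
        final File testCsv;
        Pair(File trainArff, File testCsv) {
            this.trainArff = trainArff;
            this.testCsv = testCsv;
        }
    }

    /** Parse "train,test" rows, resolving names against basePath as the loop above does. */
    static List<Pair> readPairs(BufferedReader br, String basePath) throws IOException {
        List<Pair> pairs = new ArrayList<>();
        String line;
        while ((line = br.readLine()) != null) {
            String[] filenames = line.split(",");
            if (filenames.length < 2) {
                continue; // skip malformed rows (the commit would throw ArrayIndexOutOfBoundsException)
            }
            pairs.add(new Pair(new File(basePath + File.separator + filenames[0].trim()),
                               new File(basePath + File.separator + filenames[1].trim())));
        }
        return pairs;
    }

    public static void main(String[] args) throws IOException {
        String csv = "fold1_train.arff,fold1_test.csv\nfold2_train.arff,fold2_test.csv";
        List<Pair> pairs = readPairs(new BufferedReader(new StringReader(csv)), "data");
        System.out.println(pairs.size()); // 2
        System.out.println(pairs.get(0).trainArff.getName());
    }
}
```

Separating the parsing from the train/test loop also makes this step unit-testable without showing a `JFileChooser`.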