# camera-traps
The Camera Traps application is both a simulator and IoT device software for utilizing machine learning on the edge in field research. The first implementation specializes in applying computer vision (detection and classification) to wildlife images for animal ecology studies. Two operational modes are supported: "simulation" mode and "demo" mode. When executed in simulation mode, the software serves as a test bed for studying ML models, protocols and techniques that optimize storage, execution time, power and accuracy. Simulation mode accepts two different input types: (1) an input dataset of images that stand in for the images that would be generated by an IoT camera device, or (2) an input video file, standing in for footage captured by a camera, which is processed by an image detecting plugin that saves frames containing motion. In either case, the resulting images drive the simulation.
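The first input type, an image dataset, can be pictured as a directory of image files that the simulator walks through as if a camera had produced them. The sketch below is purely illustrative and not the actual camera-traps code: the function name `iter_image_events` and the idea of pairing each file with a fresh UUID are assumptions for demonstration.

```python
import pathlib
import uuid

# Extensions we treat as camera images in this illustrative sketch.
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png"}

def iter_image_events(dataset_dir):
    """Yield one (image_uuid, path) pair per image file, as if an
    IoT camera device had just produced that image."""
    for path in sorted(pathlib.Path(dataset_dir).iterdir()):
        if path.suffix.lower() in IMAGE_EXTENSIONS:
            yield uuid.uuid4(), path
```

A simulation driver could consume this generator one event at a time, feeding each image into the detection/classification pipeline exactly as a live camera feed would.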
Conversely, when run in "demo" mode, the application serves as software that can be deployed onto actual, Linux-based camera trap devices in the wild. In this case, the Camera Traps software relies on a digital camera accessible over a Linux device mount (the default `/dev/video0` location can be re-configured), and it drives the camera directly using the Linux Motion activation software, which comes bundled as a plugin with Camera Traps. It includes a detection reporter plugin and MQTT component which coordinate to communicate in real time when a configurable object of interest has been detected (above a configurable confidence threshold). As a proof of concept of the capabilities of the software, we are producing a demo integration with drone software developed by the Stewart Lab at OSU which enables the Camera Traps software to communicate over a local network to a nearby drone whenever an object of interest is detected.
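The "object of interest above a confidence threshold" rule can be sketched as a small filter. This is an illustrative stand-in, not the detection reporter plugin's actual code; the function name and the `label`/`score` field names are assumptions.

```python
def detections_to_report(detections, object_of_interest, threshold):
    """Keep only detections of the configured label whose confidence
    score meets or exceeds the configured threshold."""
    return [
        d for d in detections
        if d["label"] == object_of_interest and d["score"] >= threshold
    ]
```

Only the detections this filter passes would trigger a real-time MQTT notification.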
- CI4AI
- Animal Ecology
---
# Explanation
## Architectural Overview
The *image_uuid* and *image_format* are from the NewImageEvent. The *image_file_prefix* can be the empty string, and the *image_format* is always lowercased when used in the file name.
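The naming rule above can be sketched as a small helper. The exact concatenation pattern is an assumption for illustration; what the text guarantees is that the prefix may be empty and that the format is lowercased in the file name.

```python
def image_file_name(image_file_prefix, image_uuid, image_format):
    """Build an output file name from a (possibly empty) prefix, the
    image's UUID, and its format lowercased as the extension."""
    return f"{image_file_prefix}{image_uuid}.{image_format.lower()}"
```

For example, with an empty prefix and format `PNG`, the file name would be just the UUID followed by `.png`.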
---
# How-To Guide
## Quick Start
In-memory representations of events are translated into flatbuffer binary streams plus a leading two-byte sequence that identifies the event type. These statically defined byte sequences are specified in the [events.rs](https://github.com/tapis-project/camera-traps/blob/main/src/events.rs) source file and repeated here for convenience.
Each event is assigned a binary prefix that zmq uses to route incoming binary streams to all of the event's subscribers.<br>
Each event sent or received begins with its two-byte prefix followed by its serialized form as defined in the camera-traps flatbuffer definition file ([events.fbs](https://github.com/tapis-project/camera-traps/blob/main/resources/events.fbs)). The following section describes how to generate Rust source code from this definition file; a similar process can be used for any language supported by flatbuffers.
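The prefix-based framing can be sketched as follows. The prefix byte values below are made up for illustration; the real assignments live in [events.rs](https://github.com/tapis-project/camera-traps/blob/main/src/events.rs), and the serialized payload would be flatbuffer bytes rather than the placeholder used here.

```python
# Hypothetical two-byte prefixes; the real values are defined in events.rs.
EVENT_PREFIXES = {
    "NewImageEvent": b"\x01\x02",
    "ImageScoredEvent": b"\x01\x03",
}
PREFIX_TO_EVENT = {v: k for k, v in EVENT_PREFIXES.items()}

def frame(event_type, payload):
    """Prepend the event's two-byte prefix to its serialized bytes."""
    return EVENT_PREFIXES[event_type] + payload

def route(stream):
    """Split an incoming binary stream into (event_type, payload)
    using its leading two-byte prefix."""
    return PREFIX_TO_EVENT[stream[:2]], stream[2:]
```

Subscribers can then filter on the first two bytes of each message, which is exactly the kind of prefix matching zmq subscriptions support.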
# Camera Traps Release Notes
## Version 0.6.0
This release includes three main improvements:
1. A Video Generating plugin that uses a video file and a v4l2loopback virtual device to simulate a mounted camera stream, for use in conjunction with the Image Detecting plugin.
2. Improved metric reporting.
3. A configurable minimum time between image events when using the Image Detecting plugin.
## Version 0.5.0
This major release expands the camera-traps application with a new functional mode, referred to as `demo` mode.