Files changed (1)
  1. README.md +9 -11
README.md CHANGED
@@ -191,13 +191,12 @@ This is what _you_ did to it following collection from the original source; it w
 -->
 
 #### Who are the source data producers?
- [More Information Needed]
 <!-- This section describes the people or systems who originally created the data.
 
 Ex: This dataset is a collection of images taken of the butterfly collection housed at the Ohio State University Museum of Biological Diversity. The associated labels and metadata are the information provided with the collection from biologists that study butterflies and supplied the specimens to the museum.
 -->
 
- The beetles were preserved in ethanol following collection from NEON sites in 2018. They were were sent by [NEON](https://www.neonscience.org/) to Sydne Record and Isadora Fluck for imaging in 2022. Information regarding the sites from which they were collected and the taxonomic labels were provided by NEON. Site information (names and ecoclimatic domains) can be matched to `siteID` from the `NEON_Field_Site_Metadata_20240802.csv`, which is available on [NEON's field sites information page](https://www.neonscience.org/field-sites/explore-field-sites) (click `Download Field Site Table (CSV)`).
+ This dataset is a collection of images of beetles collected by [NEON](https://www.neonscience.org/) field technicians and preserved in ethanol following collection from NEON sites in 2018. A subset of difficult-to-identify taxa was pinned and sent to experts for identification; the images provided here are of more common species that NEON field technicians could identify without expert review. After being preserved in ethanol, the beetles were archived at the [NEON Biorepository](https://biorepo.neonscience.org/portal/). They were sent by Nico Franz, Kelsey Yule, and Andrew Johnston from the NEON Biorepository to Ben Baiser, Sydne Record, and Isadora Fluck for imaging in 2022. Information regarding the sites from which they were collected and the taxonomic labels were provided by NEON. Site information (names and ecoclimatic domains) can be matched to `siteID` from the `NEON_Field_Site_Metadata_20240802.csv`, which is available on [NEON's field sites information page](https://www.neonscience.org/field-sites/explore-field-sites) (click `Download Field Site Table (CSV)`).
 
 
  ### Annotations
@@ -206,20 +205,19 @@ If the dataset contains annotations which are not part of the initial data colle
 
 Ex: We standardized the taxonomic labels provided by the various data sources to conform to a uniform 7-rank Linnean structure. (Then, under annotation process, describe how this was done: Our sources used different names for the same kingdom (both _Animalia_ and _Metazoa_), so we chose one for all (_Animalia_). -->
 
- Annotations (elytra length and width) were completed in Zooniverse by Isadora Fluck and two summer REU students (Riley Wolcheski and Isha Chinniah); annotator is indicated by their NEON system usernames in the `BeetleMeasurements` CSV. The annotations (2 per beetle) were repeated for a subset of the images to measure observation error introduced in the annotation process. The taxonomic labels were provided by NEON with the samples.
+ Annotations (elytra length and width) were completed in Zooniverse by Isadora Fluck and two Harvard Forest summer REU students (Riley Wolcheski and Isha Chinniah); the annotator is indicated by their NEON system username in the `BeetleMeasurements` CSV. The annotations (2 per beetle) were repeated for a subset of the images to measure the observation error introduced in the annotation process. The taxonomic labels were provided by NEON with the samples.
 
 #### Annotation process
- [More Information Needed: URL for Zooniverse?]
 <!-- This section describes the annotation process such as annotation tools used, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
 
- The images were downsized to 1MB (preserving aspect ratio) for annotation in Zooniverse. Overall, 234 images were annotated by all three annotators and 1 was annotated by just two. Beetles were indicated as not lying flat if they tended to one or the other side, as this can result in a smaller width measurement than expected. See the [sample image](https://huggingface.co/datasets/imageomics/BeetlePalooza/blob/main/beetles.png) (also in Figure 1) for indicators of length and width, as well as the centimeter mark in pixels on the scalebar.
+ The images were downsized to 1MB (preserving aspect ratio) for annotation in [Zooniverse](https://www.zooniverse.org/). Overall, 234 images were annotated by all three annotators and 1 was annotated by just two. Beetles were flagged as not lying flat if they leaned to one side, as this can result in a smaller width measurement than expected. See the [sample image](https://huggingface.co/datasets/imageomics/BeetlePalooza/blob/main/beetles.png) (also in Figure 1) for indicators of length and width, as well as the centimeter mark in pixels on the scalebar.
 
 #### Who are the annotators?
 <!-- This section describes the people or systems who created the annotations. -->
 
- - Isadora E. Fluck: Ph.D. Candidate at Baiser Lab of Community Ecology University of Florida annotated all samples, and led annotation effort of two summer REU students:
- - Riley Wolcheski
- - Isha Chinniah
+ - Isadora E. Fluck: Ph.D. Candidate in the Baiser Lab of Community Ecology, University of Florida; annotated all samples and led the annotation effort of two summer REU students under the supervision of Sydne Record:
+   - Riley Wolcheski (University of Connecticut)
+   - Isha Chinniah (Mount Holyoke College)
 
 ### Personal and Sensitive Information
  <!--
@@ -234,13 +232,13 @@ Things to consider while working with the dataset. For instance, maybe there are
 - The `NEON_sampleID` RMNP_014.20180709.CALADV.01 is repeated because there were too many individuals in the sample to organize them all in one picture. Thus, the individuals from this sample are split between two pictures: `A00000051555_1` and `A00000051555_2`.
 - The `NEON_sampleID` MOAB_001.S.20180724 was provided without scientific name identification.
 - The `individual` indicator is not unique to `pictureID` since Zooniverse restarted IDs after individual number 99, so individuals are indicated by measurements annotated by `user_name == "IsaFluck"` since she annotated each image once, with the other annotators just labeling a subset for comparison.
-
+ - These images do not include all beetles sampled at all sites in 2018 because they do not include the pinned specimens.
+ - Dorsal images may not provide all the information needed to ID a beetle.
 
 ### Bias, Risks, and Limitations
- [More Information Needed]
 <!-- This section is meant to convey both technical and sociotechnical limitations. Could also address misuse, malicious use, and uses that the dataset will not work well for.-->
 
- This dataset does not have a balanced representation of genera.
+ This dataset does not have a balanced representation of genera. In particular, rare or more difficult-to-identify taxa are not included here. These images also do not include all beetles sampled at all sites in 2018 because they do not include the pinned specimens.
 <!-- For instance, if your data exhibits a long-tailed distribution (and why). -->
 
  ### Recommendations
 
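The site-matching step described in the source data section above can be scripted with a simple join. Below is a minimal sketch, assuming the `BeetleMeasurements` CSV exposes a `siteID` column as described, and that the downloaded NEON field site table names its identifier and descriptive columns `field_site_id`, `field_site_name`, and `field_domain_id` (verify these against the actual header of `NEON_Field_Site_Metadata_20240802.csv` before relying on them):

```python
# Sketch: attach NEON site names and ecoclimatic domains to the beetle measurements.
# Assumes BeetleMeasurements.csv has a `siteID` column and that the NEON field site
# table uses `field_site_id`, `field_site_name`, and `field_domain_id`; these column
# names are assumptions, so check them against the downloaded CSV.
import pandas as pd

measurements = pd.read_csv("BeetleMeasurements.csv")
sites = pd.read_csv("NEON_Field_Site_Metadata_20240802.csv")

merged = measurements.merge(
    sites[["field_site_id", "field_site_name", "field_domain_id"]],
    left_on="siteID",
    right_on="field_site_id",
    how="left",  # keep all beetle rows even if a site is missing from the table
)
print(merged.head())
```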
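Because each image's scalebar records how many pixels span one centimeter, converting annotated elytra measurements from pixels to centimeters is a single ratio per image. The sketch below uses placeholder names and values (`length_px`, `width_px`, `cm_in_pixels`), not the actual column names in the `BeetleMeasurements` CSV:

```python
# Sketch: convert pixel measurements to centimeters using the per-image scalebar.
# `cm_in_pixels` is the number of pixels spanning one centimeter on that image's
# scalebar; the names and example values here are placeholders, not CSV columns.
def px_to_cm(measurement_px: float, cm_in_pixels: float) -> float:
    """Convert a measurement in pixels to centimeters."""
    return measurement_px / cm_in_pixels

length_px, width_px, cm_in_pixels = 512.0, 198.0, 260.0  # example values
print(f"elytra length: {px_to_cm(length_px, cm_in_pixels):.2f} cm")
print(f"elytra width:  {px_to_cm(width_px, cm_in_pixels):.2f} cm")
```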
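Following the consideration above that `individual` is only reliable within measurements made by `user_name == "IsaFluck"`, a minimal pandas sketch for recovering one set of measurements per beetle is shown here; it assumes the `BeetleMeasurements` CSV exposes `user_name`, `pictureID`, and `individual` columns as described in the considerations:

```python
# Sketch: keep one set of measurements per beetle by restricting to the annotator
# who measured every image exactly once (`user_name == "IsaFluck"`), then treat
# (pictureID, individual) as the per-beetle key within that subset.
import pandas as pd

df = pd.read_csv("BeetleMeasurements.csv")
primary = df[df["user_name"] == "IsaFluck"].copy()

# Depending on how length and width are stored, each beetle may contribute one or
# two rows here; a (pictureID, individual) pair identifies an individual beetle
# within this subset.
per_beetle = primary.groupby(["pictureID", "individual"], as_index=False).size()
print(per_beetle.head())
```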