Grants

Social Science Grants

PACIFIC CLIMATE WORKSHOP

Grant awarded to: Dr. Michelle Goman
Funded by: United States Geological Survey (USGS)
 

[Image: PACLIM 2019 logo]


Dr. Goman has been co-organizing the Pacific Climate Workshop (PACLIM), along with colleague Dr. Scott Mensing (University of Nevada, Reno), since 2015. PACLIM is a multidisciplinary workshop that broadly addresses the climatic phenomena occurring in the eastern Pacific Ocean and western North America. The purpose of the workshop is to understand climate effects in this region by bringing together specialists from diverse fields, including the physical, social, and biological sciences. Time scales from weather to the Quaternary are addressed in oral and poster presentations. The conference occurs biennially.

The 2019 PACLIM workshop was held at the Asilomar Conference Center in Pacific Grove, California, in February 2019. The theme of this year’s meeting was “Extreme Events,” inspired by the recent fires and floods in the West, which followed an extended drought. The three-day conference attracted over 90 participants from across the United States. Nine invited keynote speakers addressed a range of perspectives on fire, flooding, drought, and snow extremes. In addition to the invited speakers, there were a further twenty-four oral presentations and fifty poster presentations. The presenters used tree rings, marine sediments, historical documents, remote sensing, and future modeling to bring insights into the changing patterns of climate extremes through time.

PACLIM 2019 was awarded a grant from the United States Geological Survey (USGS) to provide support for students and early career scientists to present and participate in the meeting. Over thirty individuals from across the country received support. For more information about this year’s conference and past years, go to Paclim.org.

 

Soundscapes to Landscapes (S2L): Monitoring Animal Biodiversity from Space Using Citizen Scientists

Grant awarded to: Dr. Matt Clark

Funded by: NASA

Soundscapes to Landscapes (S2L) is a NASA-funded project led by Dr. Matthew Clark of the Center for Interdisciplinary Geospatial Analysis (CIGA) in Geography, Environment and Planning. Project collaborators are affiliated with Point Blue Conservation Science, Northern Arizona University, UC Merced, Audubon California, Pepperwood Preserve, and the Sonoma County Agricultural Preservation and Open Space District. The broad goal of the project is to advance animal diversity monitoring with the next generation of Earth-observing satellites. The project uses sounds recorded by low-cost recorders placed in the field (i.e., soundscapes) and bioacoustic analysis to identify bird species by their calls and to measure overall avian diversity. The science team uses these bird diversity data to explore the benefits and trade-offs of using new and existing sensors in space for mapping bird diversity and for conservation planning. One new sensor the team is using is NASA’s Global Ecosystem Dynamics Investigation (GEDI), mounted on the International Space Station, which uses a laser to provide detailed measurements of vegetation canopy height and internal structure that can be related to bird habitat. The S2L project is currently focused on Sonoma County, an area with a diverse mix of natural vegetation and urban and agricultural landscapes, including areas recovering from the October 2017 fires.
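As a rough, hypothetical sketch of the kind of summary the soundscape data support (not the project’s actual pipeline), the Python example below computes species richness and the Shannon diversity index from per-site lists of bird-call detections; the site names and detections are invented for illustration.

    import math
    from collections import Counter

    # Hypothetical bird-call detections per recording site (example data only).
    site_detections = {
        "site_A": ["Spotted Towhee", "Acorn Woodpecker", "Spotted Towhee", "Oak Titmouse"],
        "site_B": ["Acorn Woodpecker", "Acorn Woodpecker", "California Quail"],
    }

    def diversity_summary(detections):
        """Return species richness and Shannon diversity (H') for one site."""
        counts = Counter(detections)
        total = sum(counts.values())
        richness = len(counts)
        shannon = -sum((n / total) * math.log(n / total) for n in counts.values())
        return richness, shannon

    for site, detections in site_detections.items():
        richness, shannon = diversity_summary(detections)
        print(f"{site}: richness={richness}, Shannon H'={shannon:.2f}")

Site-level summaries like these are the sort of response variables that can then be compared against satellite-derived habitat measurements.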

A critical component of S2L is a partnership with “citizen scientists,” volunteers in the community who help place recorders on public and private lands, assist with data management, identify bird calls in sound recordings, and train machine learning models. Since the project started in spring 2017, fifty citizen scientists have donated approximately 2,400 hours of their time. Working with landowners, public agencies, and land conservation organizations, citizen scientists have placed recorders at over 260 sites around the county. The science team has just begun to tap the information stored in these audio data. A team at Northern Arizona University is exploring how forest complexity, as measured remotely by GEDI, relates to bird diversity and to aggregate measures of soundscape diversity. The project is partnering with Sieve Analytics, the company that makes the web-based bioacoustics analysis platform Arbimon, to create a citizen science interface for rapid bird-call validation. In addition, collaborators at UC Merced are exploring cutting-edge “deep learning” (a form of artificial intelligence) methods for improving automated bird-call detection in recordings. Soundscapes to Landscapes is funded for three years, but the project seeks to build a citizen science and data processing system that can scale to larger regions with private funding.
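The UC Merced detection models themselves are not described here; purely as an illustration of the general deep-learning approach, the hedged sketch below defines a small convolutional network in PyTorch that scores fixed-size mel-spectrogram clips for the presence or absence of a target bird call. The input dimensions, layer sizes, and two-class output are assumptions made for the example.

    import torch
    import torch.nn as nn

    class CallDetector(nn.Module):
        """Toy CNN that scores a spectrogram clip for a target bird call."""
        def __init__(self, n_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
            )

        def forward(self, spectrogram):
            return self.classifier(self.features(spectrogram))

    # A batch of 8 single-channel mel spectrograms (64 mel bands x 128 time frames), random data.
    clips = torch.randn(8, 1, 64, 128)
    logits = CallDetector()(clips)
    print(logits.shape)  # torch.Size([8, 2]): scores for "call present" vs. "call absent"

In practice, a model of this kind would be trained on spectrogram clips labeled by the citizen scientists during call validation.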

[Image: Bird map]

Example species distribution model outputs. Maps show probability of occurrence from 2 years of simulated GEDI at 250 m spatial resolution. Bird data were from existing eBird, Breeding Bird Survey, and Point Blue data. The maps will be improved as the project collects more field data from soundscape analysis.
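As a hedged sketch of how a probability-of-occurrence map like the one above can be produced (not the project’s actual species distribution model), the example below fits a logistic regression on synthetic GEDI-style canopy predictors and predicts occurrence probabilities for new 250 m grid cells; all variables and values are invented.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic training data: canopy height (m) and canopy cover (%) for surveyed cells,
    # with 1 = species detected, 0 = not detected (all values invented).
    X_train = np.column_stack([rng.uniform(0, 40, 200), rng.uniform(0, 100, 200)])
    y_train = (X_train[:, 0] > 15).astype(int)  # toy rule standing in for real survey labels

    model = LogisticRegression().fit(X_train, y_train)

    # Probability of occurrence for new grid cells described by the same predictors.
    X_cells = np.array([[5.0, 20.0], [25.0, 70.0], [35.0, 90.0]])
    print(model.predict_proba(X_cells)[:, 1])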

[Image: Citizen science bird enthusiast]

Citizen science bird enthusiasts (i.e., birders) help identify bird calls in sound recordings.

[Image: Citizen scientist recorder]

Citizen scientists deploy recorders in remote areas.

INFERENCE OF INTERNAL ATTENTIONAL FOCUS BY EXAMINATION OF SCALP EEG IN HUMANS

Grant awarded to: Dr. Jesse Bengson
 
Funded by: California State University Program for Education and Research in Biotechnology (CSUPERB)

 

In the United States alone, millions of individuals suffer from conditions in which they are consciously aware but unable to move any part of their body, even their eyes. Students in Sonoma State’s Electrophysiology Lab are using machine learning and electroencephalography (EEG) to develop a brain-computer interface that could give paralyzed individuals the freedom of communication and possibly even movement. One significant challenge to this goal has been the isolation of a robust, externally observable, and non-invasive biomarker that could be used to infer an internal cognitive state in real time. Students in our lab accomplished this feat and successfully implemented an innovative experimental paradigm in which the state of a person’s internal attentional focus could be reliably inferred in real time by decoding frequency-specific changes in neural activity in response to internal decisional processes. More specifically, we were able to infer the internal decisional state of individuals up to 90% of the time using a machine learning algorithm combined with EEG. This ground-breaking demonstration is a critical step in the development of a robust brain-computer interface that could liberate millions of people who are currently severely limited by Lou Gehrig’s disease and locked-in syndrome. For more information, visit https://bengsonlab.com/ or email Dr. Bengson at bengson@sonoma.edu.
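The lab’s exact paradigm and decoding pipeline are not reproduced here; as a hedged sketch of the general approach, the example below extracts alpha-band power features from simulated EEG epochs with Welch’s method and cross-validates a linear classifier that predicts a binary internal-state label. The sampling rate, band limits, channel count, and simulated data are assumptions for illustration only.

    import numpy as np
    from scipy.signal import welch
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    fs = 250                                            # assumed sampling rate (Hz)
    n_epochs, n_channels, n_samples = 100, 32, 2 * fs   # 2-second epochs

    # Simulated EEG epochs and binary internal-state labels (stand-ins for real recordings).
    epochs = rng.standard_normal((n_epochs, n_channels, n_samples))
    labels = rng.integers(0, 2, n_epochs)

    def band_power(epoch, fs, low=8.0, high=12.0):
        """Mean power per channel in a frequency band, estimated with Welch's method."""
        freqs, psd = welch(epoch, fs=fs, nperseg=fs)
        band = (freqs >= low) & (freqs <= high)
        return psd[:, band].mean(axis=1)

    features = np.array([band_power(e, fs) for e in epochs])  # shape: (epochs, channels)
    scores = cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5)
    print("Cross-validated decoding accuracy:", scores.mean())

With random data the accuracy hovers near chance; with real frequency-specific EEG signatures of attentional state, this same band-power-plus-classifier pipeline is the kind of building block a real-time brain-computer interface can be built on.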