Hello everybody. Now I have a job posting up for a Research Technician to do all of the mouse neuroscience things. Re-toot s'il vous plaît. Here is the info: https://monell.org/research-technician-bolding-lab/
Asst. Prof. in Neuroscience at UCSB Psychological and Brain Sciences
Re-upping the postdoc search hoping for a new set of eyes:
https://apply.interfolio.com/139308
A full-time postdoctoral fellow position is available ASAP to contribute to a funded project examining olfactory sensory processing for social recognition in mice.
boldinglab.org
Is there a floxed ChRmine mouse line?
Are you interested in how neuronal populations code external and internal signals to drive behavior? Join our lab as a postdoc! We are looking for people with a diverse set of skills, from computational to experimental. Our lab offers a friendly and collaborative environment and access to the most advanced techniques. For more information, see https://www.ucl.ac.uk/cortexlab/positions
@desdemonafricker thanks Desdemona!!
@andyalexander Best illustration for a Head Direction Cell recording, by Preston-Ferrer, Coletta, Frey & Burgalossi, eLife 2016, video 1. https://doi.org/10.7554/eLife.14592
Anyone have a good head direction cell video they'd be willing to share?
Hi all, we would like to introduce you to Spyglass (https://github.com/LorenFrankLab/spyglass) - our software framework for reproducible data analysis and data sharing in neuroscience research (spearheaded by Kyu Hyun Lee and myself, but really a group effort by the Frank lab).
Try it for yourself without any setup at https://spyglass.hhmi.2i2c.cloud/ thanks to support from @2i2c_org, @RapidScience, @HHMINEWS. Note that there might be a slight wait for things to load.
We all know how hard it is to keep track of all the parameters and code that go into processing neuroscience data. These choices fundamentally affect the outcomes of a paper, but we have few reliable ways of recording what those choices are.
One reason for this is that neuroscience data are complex, and writing good code that keeps track of these choices is hard. Researchers typically build ad-hoc pipelines around existing tools, but this is time-consuming and potentially error-prone.
We built Spyglass to make it easy for researchers to process and track their data. Users can spike-sort and curate their data using different spike sorters via @spikeinterface, track the pose of animals via @DeepLabCut, or run more complex analyses like decoding.
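None of the underlying pipeline code appears in this thread, so as a toy illustration only: the sketch below mimics the detection step that the sorters wrapped by @spikeinterface automate, on a synthetic trace with a crude threshold rule. This is not Spyglass's or SpikeInterface's actual API; the trace, spike waveform, and -5σ threshold are all made-up for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 30_000                       # sampling rate, Hz
n = fs                            # 1 s of synthetic data
trace = rng.normal(0.0, 1.0, n)   # background noise, arbitrary units

# inject 5 large negative "spikes" at known times (toy waveform, not real data)
spike_times = np.array([3000, 8000, 13000, 20000, 27000])
for s in spike_times:
    trace[s:s + 10] -= 10.0       # crude 10-sample deflection

# threshold detection: samples below -5 sigma, collapsed to run onsets
thresh = -5 * np.std(trace)
below = trace < thresh
onsets = np.flatnonzero(below & ~np.roll(below, 1))
print(onsets)                     # one onset index per injected spike
```

Real sorters go much further (whitening, template matching, clustering, curation), which is exactly the bookkeeping Spyglass is built to track.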
We make all this possible using the @NeurodataWB format. We believe that starting with data in NWB and keeping analyses within this format unlocks huge potential to take advantage of tools that rely on this standard. This makes it easy to share your data on @DANDIarchive.
We realized that simply processing data is not enough. You have to visualize your data to know that the processing worked, but there can be a lot of data spanning many data types. We make this easy using figurl - an interactive web-based visualization tool by Jeremy Magland (@FlatironInst).
For example, you can visualize spike sorting curation: https://figurl.org/f?v=gs://figurl/spikesortingview-10&d=sha1://1fa0b4a1663323b49b6f1934d79ca9f67779bda8&s=%7B%22initialSortingCuration%22:%22sha1://51b950cad7d97f26aaf807ba234e4b41ffade4ef%22,%22sortingCuration%22:%22gh://LorenFrankLab/sorting-curations/main/mcoulter/molly20220316_.nwb_r1_r2/15/curation.json%22%7D&label=molly20220316_.nwb_r1_r2_15_franklab_tetrode_hippocampus%20molly20220316_.nwb_r1_r2_15_franklab_tetrode_hippocampus_13f7a6a2_spikesorting
or visualize ripple detection: https://figurl.org/f?v=gs://figurl/spikesortingview-10&d=sha1://f94ea807087b446aa0ff7f1993fbafe7a9066f79&label=Ripple%20Detection&zone=franklab.default
or even visualize decoding of hippocampal mental representations: https://figurl.org/f?v=gs://figurl/spikesortingview-10&d=sha1://3990d47cfcfbe426fae203659479e55d7b08980f&label=j1620210710_clusterless_decode&zone=franklab.default
Finally, neuroscience research is becoming more collaborative within and across labs, but sharing data is still difficult. Spyglass makes it easy for you to share data by allowing collaborators to access the database and seamlessly download data via the cloud.
If you want to find out more, please read our preprint: https://www.biorxiv.org/content/10.1101/2024.01.25.577295v2
or view our documentation and tutorials: https://lorenfranklab.github.io/spyglass/latest/
We made a Slack-like user group for 2p imaging in neuroscience and for Diesel2p and Cousa users.
Just a place to share information, tips, discuss troubleshooting, techniques, etc.
If you want in, let me know. Or just try this link:
https://2p.ece.ucsb.edu/signup_user_complete/?id=6ozestbxr3ggt8j7et45pmuafr&md=link&sbr=sa
We're self-hosting it using Mattermost, so the posts will stay indefinitely (no expiration date like the free version of Slack). It's very much like Slack: there are apps, and I use one on my phone, but on desktops and laptops I use the web interface, which works great.
We spun out a company. If you're doing multiphoton imaging and hitting some limits, let me know. Maybe we can help you.
Anyone know what this artifact is? This is the power spectrum for one channel out of 128 recorded with Neuronexus polytrodes using a Neuronexus Smartbox Pro. That weird regular set of peaks is on every channel. It's at subdivisions of the sampling frequency (30000 Hz). There is ostensibly a hardware antialiasing filter in effect at 15000 Hz. My internet searches have not proved enlightening. #neuroscience #electrophysiology #ephys
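For anyone wanting to check whether their own peaks fall at subdivisions of the 30 kHz sampling rate, here is a minimal sketch of locating outlier bins in a power spectrum. The data are synthetic, with artifact tones injected at fs/4 and fs/8 purely for illustration; this is not the Smartbox recording, and the 100x-above-median-floor threshold is an arbitrary choice for the demo.

```python
import numpy as np

fs = 30_000                              # sampling rate, Hz
t = np.arange(fs) / fs                   # 1 s of synthetic data
rng = np.random.default_rng(0)

# white noise plus hypothetical artifact tones at fs/4 and fs/8
x = (rng.normal(0.0, 1.0, fs)
     + 0.5 * np.sin(2 * np.pi * (fs / 4) * t)
     + 0.5 * np.sin(2 * np.pi * (fs / 8) * t))

# one-sided periodogram (1 Hz bins for a 1 s window)
freqs = np.fft.rfftfreq(len(x), 1 / fs)
psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))

# flag bins far above the median noise floor
floor = np.median(psd)
peaks = freqs[psd > 100 * floor]
print(peaks)                             # artifact frequencies, Hz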
Looking for a full-time lab technician. This is a career appointment with UC benefits! Job 40709 #neuroscience https://hr.jobs.ucla.edu/applicants/jsp/shared/frameset/Frameset.jsp?time=1705877260566
@NicoleCRust @MatteoCarandini @thetransmitter @eLife @avaskham
A similar behavioral explanation for drift in hippocampal spatial tuning can be found in this paper from the Yartsev lab.
@sls fear and large language modeling in neurophotonics
@sls amazing, truly unhinged
More Money Than God and a Death Wish. http://labrigger.com/blog/2024/01/11/more-money-than-god-and-a-death-wish/
Postdoc position at UC Santa Barbara. http://labrigger.com/blog/2024/01/10/postdoc-position-at-uc-santa-barbara/
my dogs i cannot lie when i say that i have seriously investigated to the brink of implementation the possibility of doing adversarial metadata manipulation and paper farming under a pen name, the attack surface is the size of the sun https://www.nature.com/articles/d41586-023-03865-y
A #preprint (#tootprint?, whatever), on how the theories we build depend on the problems we use them to solve.
#pragmatism #philosophyofscience #neuroscience
#cognitivescience
Multiphoton imaging through 20 mm of air with the new Cousa objective.
This is a new type of objective: ultra long working distance, huge field-of-view, optimized for two-photon imaging.
And it fits on standard microscopes neuroscientists use for in vivo imaging (e.g., from Bruker, Thorlabs, Neurolabware, INSS, CoSys, etc.)!
The Cousa provides excellent data and even three-photon imaging over a huge field-of-view!
The paper has data from marmosets, tree shrews, both young and adult ferrets, intact pig eye, the first in vivo calcium imaging from mouse cochlear hair cells (!), beautiful input mapping / dendritic spine imaging, and more!
The paper was published just now: https://doi.org/10.1038/s41592-023-02098-1 It's open access. The Cousa objective can be purchased from a company that is being spun out of the lab: https://pacificoptica.com