The workshop “High-Performance Computing, Stochastic Modeling and Databases in Neuroscience,” which FAPESP's Research, Innovation and Dissemination Center for Neuromathematics (RIDC NeuroMat) held in the last week of April, was an occasion for strengthening ties among international brain-science consortia. Specifically, the event helped engage NeuroMat's technology-transfer team within the network facilitated by the International Neuroinformatics Coordinating Facility (INCF), through the forthcoming creation of an INCF Special Interest Group on “stochastic modeling and statistical analysis of neural systems” and a formal collaboration with the INCF Program on Standards for Data Sharing, particularly with task forces on electrophysiology and on neuroimaging. INCF's scientific director Sean Hill attended NeuroMat's workshop and gave a talk on the computational challenges of understanding the brain.
Brain initiatives around the world require the development of new neuroscience data-management standards and interchange tools, and the best way to tackle this need is to share efforts at a global level. This was the leading discussion point of the roundtable “'Big science': the case for neuroscience. What are the goals? What are the research questions? How can neuroscience benefit from the big-science approach?,” held on April 25 with Markus Diesmann (Research Centre Jülich, Germany), Stefan Mihalas (Allen Institute, US) and Sean Hill. During this roundtable, a common understanding emerged that small laboratories are generally unable to make impactful advances in brain science, insofar as they have little capacity to deal with the volume of data this line of research must address. The idea of setting up a Global Brain Initiative was put forward, though its real prospects were not specifically addressed; a follow-up to this idea might be establishing a formal connection between NeuroMat and the Human Brain Project, in Europe.
A pillar of any possible collaborative network in brain science is open data, which was the general topic of discussion at the roundtable “Open databases and open source in neuroscience. Why open? What are the challenges and bottlenecks?,” held on April 28 with Padraig Gleeson (University College London, UK), Viktor Jirsa (INSERM, France) and Claudia Vargas (NeuroMat, UFRJ). The shared understanding is that any collaboration around brain science requires sharing data and tools.
Computational challenges in advancing brain science were a special focus of the workshop. Antonio Carlos Roque, a NeuroMat PI and member of the workshop's Scientific Committee, presented the scientific objectives associated with NeuroMat's high-performance computer: essentially, to serve as a hub for researchers interested in stochastic modeling in neuroscience and as a learning reference for students. Specific objectives of NeuroMat's supercomputer were presented in a recent radiocast (in Portuguese). These challenges were also the general topic of the roundtable “HPC in neuroscience. What to expect from large-scale brain computer simulations? What are the computational and neurobiological challenges and bottlenecks?,” held on April 26 with Tomoki Fukai (RIKEN Brain Science Institute, Japan), William Lytton (State University of New York, USA) and Antonio Roque (NeuroMat, USP).
This piece is part of NeuroMat's Newsletter #27.