A specter is haunting the modern world, the specter of crypto anarchy. Computer technology is on the verge of providing the ability for individuals and groups to communicate and interact with each other in a totally anonymous manner. Two persons may exchange messages, conduct business, and negotiate electronic contracts without ever knowing the True Name, or legal identity, of the other. Interactions over networks will be untraceable, via extensive re-routing of encrypted packets and tamper-proof boxes which implement cryptographic protocols with nearly perfect assurance against any tampering. Reputations will be of central importance, far more important in dealings than even the credit ratings of today. These developments will alter completely the nature of government regulation, the ability to tax and control economic interactions, the ability to keep information secret, and will even alter the nature of trust and reputation.
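The untraceable routing anticipated here can be made concrete with a small sketch of layered encryption, the idea later realized in onion routing. The sketch below is illustrative only: it uses symmetric Fernet keys from the Python `cryptography` package where a real mix network would use each relay's public key, and all names are invented.

```python
# Minimal sketch of layered-encryption message routing ("onion" style).
# Symmetric Fernet keys stand in for the per-relay public keys a real
# mix network would use; all names here are illustrative.
from cryptography.fernet import Fernet

# Each relay holds its own key; the sender knows all of them.
relay_keys = [Fernet.generate_key() for _ in range(3)]
relays = [Fernet(k) for k in relay_keys]

def wrap(message: bytes, relays) -> bytes:
    """Encrypt innermost-first, so relay 0 peels the outermost layer."""
    for f in reversed(relays):
        message = f.encrypt(message)
    return message

def route(onion: bytes, relays) -> bytes:
    """Each relay removes one layer; no relay sees both sender and plaintext."""
    for f in relays:
        onion = f.decrypt(onion)
    return onion

onion = wrap(b"meet at the usual place", relays)
assert route(onion, relays) == b"meet at the usual place"
```

Because every relay only ever sees one layer, tracing a message end to end requires compromising the entire chain, which is the property the passage describes.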
Fundamental science and applied research and technology development (RTD) are facing significant challenges that compound into the well-known credibility, reproducibility, funding and sustainability crises. These serious underlying shortcomings are substantially amplified by a metrics-obsessed publication culture and a growing cohort of academics competing for largely stagnant (public) funding budgets. This work presents a strategy to address these issues: it leverages the distributed ledger technology (DLT) “blockchain” to capitalize on cryptoeconomic mechanisms such as tokenization, consensus, crowdsourcing, smart contracts and reputation systems, as well as staking, reward and slashing mechanisms. This toolbox, so far largely unfamiliar to traditional scientific and RTD communities (“TradSci”), is combined with the exponentially growing computing capabilities for virtualizing experiments through digital-twin methods in a future scientific “metaverse”. Project contributions such as hypotheses, methods, experimental data, modelling, simulation, assessment, predictions and directions are crowdsourced via the blockchain and captured in non-fungible tokens (“NFTs”). The resulting, highly integrative approach, termed decentralized science (“DeSci”), is poised to move research out of its present silos and to markedly enhance the quality, credibility, efficiency, transparency, inclusiveness, impact, and sustainability of a wide spectrum of academic and commercial research initiatives.
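To make the staking, reward and slashing mechanisms concrete, the following minimal sketch models a crowdsourced review pool in plain Python. It illustrates the general mechanism only, not any particular chain or contract; the class, rates and names are assumptions.

```python
# Minimal sketch of a staking / reward / slashing scheme for crowdsourced
# review, as the abstract proposes. Plain Python stands in for an on-chain
# smart contract; all parameters are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ReviewPool:
    reward_rate: float = 0.10   # reward as a fraction of stake (assumed)
    slash_rate: float = 0.50    # penalty fraction for a bad-faith review
    stakes: dict = field(default_factory=dict)
    reputation: dict = field(default_factory=dict)

    def stake(self, reviewer: str, amount: float) -> None:
        self.stakes[reviewer] = self.stakes.get(reviewer, 0.0) + amount

    def settle(self, reviewer: str, honest: bool) -> float:
        """Reward honest reviews, slash dishonest ones; update reputation."""
        s = self.stakes.pop(reviewer, 0.0)
        if honest:
            self.reputation[reviewer] = self.reputation.get(reviewer, 0) + 1
            return s * (1 + self.reward_rate)
        self.reputation[reviewer] = self.reputation.get(reviewer, 0) - 1
        return s * (1 - self.slash_rate)

pool = ReviewPool()
pool.stake("alice", 100.0)
payout = pool.settle("alice", honest=True)   # 110.0, reputation +1
```

The economic point is that reviewers put capital at risk, so honest effort is the profit-maximizing strategy; reputation accumulates alongside the payouts.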
Modern science is a main driver of technological innovation. The efficiency of the scientific system is of key importance for the competitiveness of a nation or region. However, the scientific system we use today was devised centuries ago and is inadequate for our current ICT-based society: the peer-review system encourages conservatism, journal publications are monolithic and slow, data are often not available to other scientists, and the independent validation of results is limited. The resulting scientific process is hence slow and sloppy. Building on the Innovation Accelerator paper by Helbing and Balietti [1], this paper takes up that initial global vision and reviews the theoretical and technological building blocks for implementing an innovation accelerator platform (first and foremost, for science) driven by a re-imagining of the science system. The envisioned platform rests on four pillars: (i) redesign the incentive scheme to reduce behaviors such as conservatism, herding and hyping; (ii) advance scientific publications by breaking up the monolithic paper unit and introducing other building blocks such as data, tools, experiment workflows and resources; (iii) use machine-readable semantics for publications, debate structures, provenance, etc., in order to include the computer as a partner in the scientific process; and (iv) build an online platform for collaboration, including a network of trust and reputation among the different types of stakeholders in the scientific system: scientists, educators, funding agencies, policy makers, students and industrial innovators, among others. Any such improvement to the scientific system must support the entire scientific process (unlike current tools, which chop it up into disconnected pieces), must facilitate and encourage collaboration and interdisciplinarity (again unlike current tools), must enable the inclusion of intelligent computing in the scientific process, and must accommodate not only the core scientific process but also other stakeholders such as science policy makers, industrial innovators, and the general public. We first describe the current state of the scientific system together with up to a dozen key new initiatives, including an analysis of the role of science as an innovation accelerator. Our brief survey shows that many separate ideas, concepts and diverse stand-alone demonstrator systems exist for different components of the ecosystem, while many parts remain unexplored and overall integration is lacking. By analyzing a matrix of stakeholders versus functionalities, we identify the required innovations. We discuss a few of them (non-exhaustively): publications that are meaningful to machines, innovative reviewing processes, data publication, workflow archiving and reuse, alternative impact metrics, tools for the detection of trends, community formation and emergence, as well as modular publications, citation objects and debate graphs. To summarize, the core idea behind the Innovation Accelerator is to develop new incentive models, rules, and interaction mechanisms to stimulate true innovation, revolutionizing the way in which we create knowledge and disseminate information.
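As one hypothetical illustration of pillars (ii) and (iii), the sketch below decomposes a publication into typed, machine-readable nodes (claims, data, methods) linked by explicit relations, forming a small debate graph. The schema, relation names and identifiers are invented for illustration.

```python
# A minimal sketch of the "modular publication" / debate-graph idea:
# publications broken into typed nodes linked by explicit, machine-
# readable relations. The schema is an illustrative assumption.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    kind: str          # e.g. "claim", "dataset", "method", "rebuttal"
    content: str
    provenance: str    # author or source identifier

@dataclass
class DebateGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)  # (src, relation, dst)

    def add(self, node: Node) -> None:
        self.nodes[node.node_id] = node

    def link(self, src: str, relation: str, dst: str) -> None:
        """Relations such as 'supports', 'contradicts', 'uses-data'."""
        self.edges.append((src, relation, dst))

    def supporters(self, claim_id: str):
        """All nodes that explicitly support a given claim."""
        return [s for s, r, d in self.edges if d == claim_id and r == "supports"]

g = DebateGraph()
g.add(Node("c1", "claim", "Drug X lowers blood pressure", "lab-A"))
g.add(Node("d1", "dataset", "RCT, n=400", "lab-B"))
g.link("d1", "supports", "c1")
print(g.supporters("c1"))  # ['d1']
```

Because the relations are explicit, a machine can traverse the graph to answer questions such as "which claims rest on retracted data", which is exactly the kind of computer-as-partner functionality the pillar describes.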
Moderate alcohol consumption is widespread, but its impact on brain structure and function is contentious. The relationship between alcohol intake and structural and functional neuroimaging indices, the threshold intake for associations, and whether population subgroups are at higher risk of alcohol-related brain harm all remain unclear. A total of 25,378 UK Biobank participants (mean age 54.9 ± 7.4 years, 12,254 female) underwent multi-modal MRI 9.6 ± 1.1 years after study baseline. Alcohol use was self-reported at baseline (2006–10). T1-weighted, diffusion-weighted and resting-state images were examined. Lower total grey matter volumes were observed in those drinking as little as 7–14 units (56–112 g) weekly. Higher alcohol consumption was associated with multiple markers of white matter microstructure, including lower fractional anisotropy and higher mean and radial diffusivity, in a spatially distributed pattern across the brain. Associations between functional connectivity and alcohol intake were observed in the default mode, central executive, attention, salience and visual resting-state networks. Relationships between total grey matter and alcohol were stronger than those for other modifiable factors, including blood pressure and smoking, and were robust to unobserved confounding. Frequent binge drinking, higher blood pressure and higher BMI steepened the negative association between alcohol and total grey matter volume. In this large observational cohort study, alcohol consumption was associated with multiple structural and functional MRI markers in mid- to late life.
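The reported steepening of the association by binge drinking, blood pressure and BMI corresponds to interaction terms in an adjusted regression. The sketch below shows the general form of such a model on synthetic data; it is not the study's actual analysis, and all variable names, covariates and coefficients are assumed.

```python
# A minimal sketch of the kind of adjusted regression such a cohort
# analysis implies: grey matter volume on weekly alcohol units with
# covariates and an alcohol-by-binge interaction (the "steepening"
# reported above). Data are synthetic; names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "alcohol_units": rng.gamma(2.0, 5.0, n),   # weekly units
    "binge": rng.integers(0, 2, n),            # frequent binge drinking
    "age": rng.normal(55, 7, n),
    "systolic_bp": rng.normal(135, 15, n),
})
# Synthetic outcome with a steeper alcohol slope among binge drinkers.
df["gm_volume"] = (600 - 0.4 * df.alcohol_units
                   - 0.3 * df.alcohol_units * df.binge
                   - 0.5 * (df.age - 55) + rng.normal(0, 10, n))

model = smf.ols(
    "gm_volume ~ alcohol_units * binge + age + systolic_bp", data=df
).fit()
print(model.params["alcohol_units:binge"])  # interaction ≈ -0.3
```

A negative interaction coefficient means the alcohol slope is steeper in the binge subgroup, which is how "steepened the negative association" is expressed statistically.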
Complex cognition requires the binding together of stimulus, action, and other features across different time scales. Several implementations of such binding have been proposed in the literature, most prominently synaptic binding (learning) and synchronization. Biologically plausible accounts of how these different types of binding interact in the human brain are still lacking. To address this, we adopt a computational approach to investigate the impact of learning and synchronization on both behavioral (reaction time, error rate) and neural (θ power) measures. We train four models, varying in their ability to learn and to synchronize, for an extended period of time on three seminal action-control paradigms of varying difficulty. Learning, but not synchronization, proved essential for behavioral improvement. Synchronization, however, boosts performance on difficult tasks, avoiding the computational pitfall of catastrophic interference. At the neural level, θ power decreases with practice but increases with task difficulty. Our simulation results bring new insights into how different types of binding interact in different types of tasks, and how this is translated into both behavioral and neural metrics.
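As a toy illustration of the two binding types (not the authors' model), the sketch below combines slow delta-rule learning of stimulus-response weights with fast binding by synchronization, modeled here as a gain applied only to the currently task-relevant mapping; all parameters and names are invented.

```python
# Toy illustration (not the authors' model) of the two binding types:
# slow synaptic binding via a delta rule, and fast binding by
# synchronization, modeled as a gain on task-relevant weights.
import numpy as np

rng = np.random.default_rng(1)
n_stim, n_resp = 4, 4
W = np.zeros((n_stim, n_resp))          # learned synaptic binding
lr, sync_gain = 0.1, 2.0                # assumed learning rate / gain

def trial(stim: int, target: int, task_mask, synchronize: bool) -> float:
    """Run one trial; return a reaction-time proxy (higher drive = faster)."""
    gain = np.where(task_mask, sync_gain, 1.0) if synchronize else 1.0
    drive = (W * gain)[stim]
    rt_proxy = 1.0 / (1e-3 + drive[target])      # weak binding -> slow RT
    # Delta-rule learning: strengthen the correct stimulus-response link.
    W[stim, target] += lr * (1.0 - W[stim, target])
    return rt_proxy

# Two overlapping tasks; synchronization gates the relevant mapping so
# learning for one task does not overwrite the other (no interference).
mask_a = np.zeros_like(W, dtype=bool)
mask_a[:2, :2] = True
for _ in range(50):
    rt = trial(rng.integers(0, 2), rng.integers(0, 2), mask_a, synchronize=True)
print(f"late-practice RT proxy: {rt:.2f}")       # drops as W is learned
```

The design choice mirrors the abstract's conclusion: the weights carry the slow improvement with practice, while the synchrony gain provides the fast, selective boost that keeps overlapping task mappings from interfering.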