We use MATSim along with our own tools for traffic modelling and simulation. MATSim provides a toolbox for implementing large-scale agent-based transport simulations. The toolbox consists of several modules which can be combined or used stand-alone. Currently, MATSim offers modules for demand modelling, agent-based mobility simulation (traffic flow simulation), re-planning, a controller to run simulations iteratively, as well as methods to analyse the output generated by the modules.

Researchers at MIT have developed a synthetic gene circuit that triggers the body’s immune system to attack cancers when it detects signs of the disease. The circuit, which will only activate a therapeutic response when it detects two specific cancer markers, is described in a paper published today in the journal Cell. Immunotherapy is widely seen as having considerable potential in the fight against a range of cancers. The approach has been demonstrated successfully in several recent clinical trials, according to Timothy Lu, associate professor of biological engineering and of electrical engineering and computer science at MIT. “There has been a lot of clinical data recently suggesting that if you can stimulate the immune system in the right way you can get it to recognize cancer,” says Lu, who is head of the Synthetic Biology Group in MIT’s Research Laboratory of Electronics. “Some of the best examples of this are what are called checkpoint inhibitors, where essentially cancers put up stop signs [that prevent] T-cells from killing them. There are antibodies that have been developed now that basically block those inhibitory signals and allow the immune system to act against the cancers.” However, despite this success, the use of immunotherapy remains limited by the scarcity of tumor-specific antigens — substances that can trigger an immune system response to a particular type of cancer. The toxicity of some therapies, when delivered as a systemic treatment to the whole body, for example, is another obstacle. What’s more, the treatments are not successful in all cases. Indeed, even in some of the most successful tests, only 30-40 percent of patients will respond to a given therapy, Lu says. As a result, there is now a push to develop combination therapies, in which different but complementary treatments are used to boost the immune response. 
So, for example, if one type of immunotherapy is used to knock out an inhibitory signal produced by a cancer, and the tumor responds by upregulating a second signal, an additional therapy could then be used to target this one as well, Lu says. “Our belief is that there is a need to develop much more specific, targeted immunotherapies that work locally at the tumor site, rather than trying to treat the entire body systemically,” he says. “Secondly, we want to produce multiple immunotherapies from a single package, and therefore be able to stimulate the immune system in multiple different ways.” To do this, Lu and a team including MIT postdocs Lior Nissim and Ming-Ru Wu have built a gene circuit encoded in DNA designed to distinguish cancer cells from noncancer cells. The circuit, which can be customized to respond to different types of tumor, is based on the simple AND gates used in electronics. Such AND gates will only switch on a circuit when two inputs are present. Cancer cells differ from normal cells in the profile of their gene expression. So the researchers developed synthetic promoters — DNA sequences designed to initiate gene expression but only in cancer cells. The circuit is delivered to cells in the affected area of the body using a virus. The synthetic promoters are then designed to bind to certain proteins that are active in tumor cells, causing the promoters to turn on. “Only when two of these cancer promoters are activated, does the circuit itself switch on,” Lu says. This allows the circuit to target tumors more accurately than existing therapies, as it requires two cancer-specific signals to be present before it will respond. Once activated, the circuit expresses proteins designed to direct the immune system to target the tumor cells, including surface T cell engagers, which direct T cells to kill the cells. The circuit also expresses a checkpoint inhibitor designed to lift the brakes on T cell activity. 
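
The two-input AND-gate behaviour described above can be sketched as a toy decision function (the promoter names, signal values, and threshold below are hypothetical illustrations, not from the study):

```python
# Toy model of the two-promoter AND gate described above.
# Signal values and the activation threshold are illustrative assumptions.

def circuit_active(promoter_a_signal, promoter_b_signal, threshold=0.5):
    """The therapeutic output switches on only when BOTH
    cancer-associated promoters are activated (AND logic)."""
    return promoter_a_signal > threshold and promoter_b_signal > threshold

# A cell activating only one promoter does not trip the circuit:
print(circuit_active(0.9, 0.1))  # False
# A tumor cell activating both promoters does:
print(circuit_active(0.9, 0.8))  # True
```

Requiring two independent signals is what gives the circuit its specificity: a single marker shared with healthy tissue is not enough to trigger the response.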
When the researchers tested the circuit in vitro, they found that it was able to detect ovarian cancer cells from amongst other noncancerous ovarian cells and other cell types. They then tested the circuit in mice implanted with ovarian cancer cells, and demonstrated that it could trigger T cells to seek out and kill the cancer cells without harming other cells around them. Finally, the researchers showed that the circuit could be readily converted to target other cancer cells. “We identified other promoters that were selective for breast cancer, and when these were encoded into the circuit, it would target breast cancer cells over other types of cell,” Lu says. Ultimately, they hope they will also be able to use the system to target other diseases, such as rheumatoid arthritis, inflammatory bowel disease, and other autoimmune diseases. This advance will open up a new front against cancer, says Martin Fussenegger, a professor of biotechnology and bioengineering at ETH Zurich in Switzerland, who was not involved in the research. “First author Lior Nissim, who pioneered the very first genetic circuit targeting tumor cells, has now teamed up with Timothy Lu to design RNA-based immunomodulatory gene circuits that take cancer immunotherapy to a new level,” Fussenegger says. “The design of this highly complex tumor-killing gene circuit was made possible by meticulous optimization and integration of several components that target and program tumor cells to become a specific prey for the immune system — this is very smart technology.” The researchers now plan to test the circuit more fully in a range of cancer models. They are also aiming to develop a delivery system for the circuit, which would be both flexible and simple to manufacture and use.

Every cell needs a shell. The cell interior is separated from its surroundings by a membrane made up of fat molecules, helping to create the environment needed for the cell to survive. Development of artificial cells is similarl

Recent years have witnessed an explosion of extensive geolocated datasets related to human movement, enabling scientists to quantitatively study individual and collective mobility patterns, and to generate models that can capture and reproduce the spatiotemporal structures and regularities in human trajectories. The study of human mobility is especially important for applications such as estimating migratory flows, traffic forecasting, urban planning, and epidemic modeling. In this survey, we review the approaches developed to reproduce various mobility patterns, with the main focus on recent developments. This review can be used both as an introduction to the fundamental modeling principles of human mobility, and as a collection of technical methods applicable to specific mobility-related problems. The review organizes the subject by differentiating between individual and population mobility and also between short-range and long-range mobility. Throughout the text the description of the theory is intertwined with real-world applications.
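
As a minimal illustration of one classic modelling principle covered by such surveys, a gravity model predicts the flow between two locations from their populations and separation distance (all parameter values below are illustrative assumptions, not taken from the survey):

```python
def gravity_flow(pop_i, pop_j, distance, k=1.0, alpha=1.0, beta=1.0, gamma=2.0):
    """Classic gravity model of mobility: flow grows with origin and
    destination population and decays as a power of distance."""
    return k * (pop_i ** alpha) * (pop_j ** beta) / (distance ** gamma)

# With gamma = 2, doubling the distance quarters the predicted flow:
f_near = gravity_flow(100_000, 50_000, 10.0)
f_far = gravity_flow(100_000, 50_000, 20.0)
print(f_near / f_far)  # 4.0
```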

It will enter clinical trials to prevent and treat the infection next year.

ComplexInsight's insight:

This is possibly the most promising news in HIV treatment research in over a decade. A research collaboration between the US National Institutes of Health and the pharmaceutical company Sanofi has produced a new antibody for the treatment of HIV. Developed from three "broadly neutralising antibodies" that a small number of patients develop in response to HIV infection, the new antibody has been shown to be effective against 99% of HIV strains in vitro and in tests with monkeys. Human trials start next year. To see the BBC article, click the picture. To see the full paper, see here: http://science.sciencemag.org/content/early/2017/09/22/science.aan8630

Network science offers a powerful language to represent and study complex systems composed of interacting elements from the Internet to social and biological systems. In its standard formulation, this framework relies on the assumption that the underlying topology is static, or changing very slowly as compared to dynamical processes taking place on it, e.g., epidemic spreading or navigation. Fuelled by the increasing availability of longitudinal networked data, recent empirical observations have shown that this assumption is not valid in a variety of situations. Instead, often the network itself presents rich temporal properties and new tools are required to properly describe and analyse their behaviour. A Guide to Temporal Networks presents recent theoretical and modelling progress in the emerging field of temporally varying networks, and provides connections between different areas of knowledge required to address this multi-disciplinary subject. After an introduction to key concepts on networks and stochastic dynamics, the authors guide the reader through a coherent selection of mathematical and computational tools for network dynamics. Perfect for students and professionals, this book is a gateway to an active field of research developing between the disciplines of applied mathematics, physics and computer science, with applications in others including social sciences, neuroscience and biology.

We present a model of contagion that unifies and generalizes existing models of the spread of social influences and micro-organismal infections. Our model incorporates individual memory of exposure to a contagious entity (e.g., a rumor or disease), variable magnitudes of exposure (dose sizes), and heterogeneity in the susceptibility of individuals. Through analysis and simulation, we examine in detail the case where individuals may recover from an infection and then immediately become susceptible again (analogous to the so-called SIS model). We identify three basic classes of contagion models, which we call the "epidemic threshold", "vanishing critical mass", and "critical mass" classes, where each class of models corresponds to different strategies for prevention or facilitation. We find that the conditions for a particular contagion model to belong to one of these three classes depend only on memory length and the probabilities of being infected by one and two exposures, respectively. These parameters are in principle measurable for real contagious influences or entities, thus yielding empirical implications for our model. We also study the case where individuals attain permanent immunity once recovered, finding that epidemics inevitably die out but may be surprisingly persistent when individuals possess memory.

A generalized model of social and biological contagion, by Peter Sheridan Dodds and Duncan J. Watts
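
A minimal sketch of the memory-and-dose idea in the abstract above (an illustrative toy, not the authors' exact model; every parameter value is an assumption):

```python
import random
from collections import deque

def simulate_dose_contagion(n=500, steps=100, p_exposure=0.2,
                            memory=3, dose_threshold=2, seed=42):
    """Each individual remembers its last `memory` unit-dose exposures;
    it counts as infected on steps where the remembered doses reach
    dose_threshold, and is immediately susceptible again otherwise
    (SIS-like). Returns the infected fraction at the final step."""
    random.seed(seed)
    histories = [deque(maxlen=memory) for _ in range(n)]
    infected_fraction = 0.0
    for _ in range(steps):
        infected = 0
        for h in histories:
            h.append(1 if random.random() < p_exposure else 0)
            if sum(h) >= dose_threshold:
                infected += 1
        infected_fraction = infected / n
    return infected_fraction

print(simulate_dose_contagion())
```

Varying `memory` and `dose_threshold` is what moves such a toy between the threshold-like and critical-mass-like regimes the abstract describes.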

In this paper we present a thorough analysis of the nature of news in different media across the ages, introducing a unique mathematical model to fit the characteristics of information spread. This model enhances the information diffusion model to account for conflicting information and the topical distribution of news in terms of popularity for a given era. We translate this information to a separate graphical node model to determine the spread of a news item given a certain category and relevance factor. The two models are used as a base for a simulation of information dissemination for varying graph topologies. The simulation is stress-tested and compared against real-world data to demonstrate its relevance. We are then able to use these simulations to deduce some conclusive statements about the optimization of information spread.

Characterizing information importance and the effect on the spread in various graph topologies, by James Flamino, Alexander Norman, and Madison Wyatt

How cooperation can evolve between players is an unsolved problem of biology. Here we use Hamiltonian dynamics of models of the Ising type to describe populations of cooperating and defecting players to show that the equilibrium fraction of cooperators is given by the expectation value of a thermal observable akin to a magnetization. We apply the formalism to the Public Goods game with three players, and show that a phase transition between cooperation and defection occurs that is equivalent to a transition in one-dimensional Ising crystals with long-range interactions. We also investigate the effect of punishment on cooperation and find that punishment acts like a magnetic field that leads to an "alignment" between players, thus encouraging cooperation. We suggest that a thermal Hamiltonian picture of the evolution of cooperation can generate other insights about the dynamics of evolving groups by mining the rich literature of critical dynamics in low-dimensional spin systems.
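
The thermal picture in this abstract can be illustrated with a toy Metropolis simulation of Ising spins on a ring (nearest-neighbour interactions only, unlike the long-range model in the paper; all parameter values are illustrative assumptions):

```python
import math
import random

def equilibrium_cooperation(n=50, beta=2.0, J=1.0, h=0.0,
                            steps=5000, seed=0):
    """Toy Metropolis sampler: players are Ising spins on a ring,
    +1 = cooperate, -1 = defect. The field h plays the role of
    punishment, 'aligning' players toward cooperation. Returns the
    cooperator fraction, (1 + magnetization) / 2."""
    random.seed(seed)
    s = [random.choice([-1, 1]) for _ in range(n)]
    for _ in range(steps):
        i = random.randrange(n)
        # Energy change for flipping spin i (neighbours on the ring):
        dE = 2 * s[i] * (J * (s[i - 1] + s[(i + 1) % n]) + h)
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            s[i] = -s[i]
    m = sum(s) / n
    return (1 + m) / 2

# A strong punishment field drives the population toward cooperation:
print(equilibrium_cooperation(h=3.0))
```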

Advancements in science and technology take place on a global scale without much consideration of the implications they may have for our species or our planet. Over the last few decades, things have been moving very fast, and not always in a good way. The climate of the planet is changing drastically. Ice caps are melting faster than ever. Known animal species around the world are declining at rates faster than any previously seen in recorded history. We humans might have intelligent individuals amidst us. However, collectively, to any external observer, we would perhaps seem to act more like mindless scavengers, stripping the planet of its resources faster than she can ever replenish them. And all of this seems to be intrinsically linked with our seemingly insatiable "collective" urge to satisfy immediate needs. So, while the technological revolution has greatly benefited humankind, our continual reliance on technology also has considerable collateral effects on the planet.

This year marks the tenth anniversary of the identification of the biological function of CRISPR–Cas as adaptive immune systems in bacteria. In just a decade, the characterization of CRISPR–Cas systems has established a novel means of adaptive immunity in bacteria and archaea and deepened our understanding of the interplay between prokaryotes and their environment, and CRISPR-based molecular machines have been repurposed to enable a genome editing revolution. Here, we look back on the historical milestones that have paved the way for the discovery of CRISPR and its function, and discuss the related technological applications that have emerged, with a focus on microbiology. Lastly, we provide a perspective on the impacts the field has had on science and beyond.

Observational data about human behavior is often heterogeneous, i.e., generated by subgroups within the population under study that vary in size and behavior. Heterogeneity predisposes analysis to Simpson's paradox, whereby the trends observed in data that has been aggregated over the entire population may be substantially different from those of the underlying subgroups. I illustrate Simpson's paradox with several examples coming from studies of online behavior and show that aggregate response leads to wrong conclusions about the underlying individual behavior. I then present a simple method to test whether Simpson's paradox is affecting results of analysis. The presence of Simpson's paradox in social data suggests that important behavioral differences exist within the population, and failure to take these differences into account can distort the studies' findings.
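
The aggregation pitfall described above can be reproduced with a few lines of synthetic data (the numbers below are the classic kidney-stone illustration of Simpson's paradox, not data from the study):

```python
# Each subgroup shows the treatment performing better, yet the
# aggregate over the whole population shows the reverse.

groups = {
    # subgroup: (treated_successes, treated_total,
    #            control_successes, control_total)
    "mild_cases": (81, 87, 234, 270),
    "severe_cases": (192, 263, 55, 80),
}

def rate(successes, total):
    return successes / total

# Within every subgroup, the treated success rate is higher:
for ts, tt, cs, ct in groups.values():
    print(rate(ts, tt) > rate(cs, ct))  # True, True

# Aggregated over the whole population, the trend reverses:
ts = sum(g[0] for g in groups.values())
tt = sum(g[1] for g in groups.values())
cs = sum(g[2] for g in groups.values())
ct = sum(g[3] for g in groups.values())
print(rate(ts, tt) > rate(cs, ct))  # False
```

The reversal happens because the subgroups differ both in size and in baseline success rate, which is exactly the heterogeneity the text warns about.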

A new statistical model has enabled researchers to pinpoint 27 novel genes thought to prevent cancer from forming, in an analysis of over 2000 tumours across 12 human cancer types. The findings could help create new cancer treatments that target these genes, and open up other avenues of cancer research.

Results are in from the first phase 1 clinical trial for a Zika vaccine, and they are very promising. A new generation DNA-based Zika vaccine, developed by Wistar scientists in collaboration with Inovio Pharmaceuticals and GeneOne Life Science, was found to be safe and well tolerated by all study participants and was able to elicit an immune response against Zika, opening the door to further and larger trials to move this vaccine forward.

Most digital sensors are easy to connect, but those tucked away in hard-to-access spots will need to harvest ambient energy

ComplexInsight's insight:

Powering sensors over long time periods is key to sensor-network developments in environmental monitoring and the industrial internet of things. By harvesting ambient energy as a power source for sensors, deployment times can be measured in years and decades rather than weeks and months. A good article on the changes in sensor power sources that designers can expect to become available soon.

Poor economies not only produce less; they typically produce things that involve fewer inputs and fewer intermediate steps. Yet the supply chains of poor countries face more frequent disruptions (delivery failures, faulty parts, delays, power outages, theft, government failures) that systematically thwart the production process. To understand how these disruptions affect economic development, we model an evolving input-output network in which disruptions spread contagiously among optimizing agents. The key finding is that a poverty trap can emerge: agents adapt to frequent disruptions by producing simpler, less valuable goods, yet disruptions persist. Growing out of poverty requires that agents invest in buffers to disruptions. These buffers rise and then fall as the economy produces more complex goods, a prediction consistent with global patterns of input inventories. Large jumps in economic complexity can backfire. This result suggests why "big push" policies can fail, and it underscores the importance of reliability and of gradual increases in technological complexity.

CompleNet is an international conference that brings together researchers and practitioners from diverse disciplines—from sociology, biology, physics, and computer science—who share a passion to better understand the interdependencies within and across systems. CompleNet is a venue to discuss ideas and findings about all types of networks, from biological, to technological, to informational and social. It is this interdisciplinary nature of complex networks that CompleNet aims to explore and celebrate.

One hundred and ten Zika virus genomes from ten countries and territories involved in the Zika virus epidemic reveal rapid expansion of the epidemic within Brazil and multiple introductions to other regions.

Biological organisms must perform computation as they grow, reproduce, and evolve. Moreover, ever since Landauer's bound was proposed it has been known that all computation has some thermodynamic cost, and that the same computation can be achieved with greater or smaller thermodynamic cost depending on how it is implemented. Accordingly, an important issue concerning the evolution of life is assessing the thermodynamic efficiency of the computations performed by organisms. This issue is interesting both from the perspective of how close life has come to maximally efficient computation (presumably under the pressure of natural selection), and from the practical perspective of what efficiencies we might hope that engineered biological computers might achieve, especially in comparison with current computational systems. Here we show that the computational efficiency of translation, defined as free energy expended per amino acid operation, outperforms the best supercomputers by several orders of magnitude, and is only about an order of magnitude worse than the Landauer bound. However, this efficiency depends strongly on the size and architecture of the cell in question. In particular, we show that the "useful" efficiency of an amino acid operation, defined as the bulk energy per amino acid polymerization, decreases for increasing bacterial size and converges to the polymerization cost of the ribosome. This cost of the largest bacteria does not change in cells as we progress through the major evolutionary shifts to both single and multicellular eukaryotes. However, the rates of total computation per unit mass are nonmonotonic in bacteria with increasing cell size, and also change across different biological architectures including the shift from unicellular to multicellular eukaryotes.

The thermodynamic efficiency of computations made in cells across the range of life, by Christopher P. Kempes, David Wolpert, Zachary Cohen, and Juan Pérez-Mercader
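
The Landauer bound referenced in the abstract above is straightforward to evaluate; the sketch below computes it at roughly physiological temperature (the temperature choice is an assumption for illustration):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # approximate physiological temperature, K

# Landauer's bound: minimum free energy required to erase one bit
# of information at temperature T.
landauer_bound_J = k_B * T * math.log(2)
print(landauer_bound_J)  # ~2.87e-21 J per bit
```

An amino-acid operation that is "about an order of magnitude worse" than this bound, as the abstract states, would therefore cost on the order of 1e-20 J.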

Is Chinese urbanization going to take a long time, or can its development goal be achieved by the government in a short time? What is the highest stable urbanization level that China can reach? When can China complete its urbanization? To answer these questions, this paper presents a system dynamics (SD) model of Chinese urbanization, whose validity is established by a stock-flow test and a sensitivity analysis using real data from 1998 to 2013. Setting the initial conditions of the simulation by referring to the real data of 2013, the multi-scenario analysis from 2013 to 2050 reveals that Chinese urbanization will reach a level higher than 70% in 2035 and then proceed to a slow urbanization stage regardless of the population policy and GDP growth rate settings; in 2050, Chinese urbanization levels will reach approximately 75%, which is a stable equilibrium level for China. Thus, it can be argued that Chinese urbanization is a long social development process that will require approximately 20 years to complete and that the ultimate urbanization level will be 75–80%, which means that in the distant future, 20–25% of China's population will still settle in rural regions of China.
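
The saturating trajectory described in this abstract resembles a logistic curve; the toy function below reproduces the qualitative shape (the carrying capacity, growth rate, and midpoint year are illustrative guesses, not the paper's calibrated SD model):

```python
import math

def urbanization_level(year, L=0.75, k=0.1, year0=2005):
    """Toy logistic curve saturating at L = 75% urbanization.
    Parameters are illustrative, not fitted to the paper's data."""
    return L / (1 + math.exp(-k * (year - year0)))

print(round(urbanization_level(2035), 3))  # above 0.70
print(round(urbanization_level(2050), 3))  # approaching 0.75
```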

This web collection showcases the potential of interdisciplinary complexity research by bringing together a selection of recent Nature Communications articles investigating complex systems. Complexity research aims to characterize and understand the behaviour and nature of systems made up of many interacting elements. Such efforts often require interdisciplinary collaboration and expertise from diverse schools of thought. Nature Communications publishes papers across a broad range of topics that span the physical and life sciences, making the journal an ideal home for interdisciplinary studies.
