
FreeBSD has had packages for years. It's not transitioning, it's allowing another option.

In my experience, FreeBSD ports, if you follow a production-like attitude rather than an ADD OH-NOEZ-THIS-PORT-IS-30-SECONDS-OUT-OF-DATE-MUST-UPDATE methodology, work better than any package manager I've seen (rpm, deb, yum, apt).

The BSD package systems tend to be more like apt or yum than plain rpm/deb: they grab your binary packages and their dependencies automatically.

IANAL, but I suspect it's a case where a public domain work remains public domain even if used in a copyrighted work; however, the copyrighted work, including the arrangement of the public domain items within it, is still copyrighted.

For example, if you take each individual character or word, by itself, to be public domain, then a copyrighted novel contains nothing but public domain bits; it's how they are organized that makes it copyrightable.

As long as he only accidentally shoots himself, that's fine. However, he can't guarantee that.

But it's irresponsible behavior like that that gives fuel to the anti-gun crowd.

I agree with the concern in TFS about the biometric identification, though. It's too easy for it to fail, even if it's merely the battery going dead; although if the power requirement is low enough, the trigger action itself might generate enough charge that firing doesn't actually need a battery. I can see cases where someone would want this on their guns, and it's a responsible thing in those cases. People shouldn't force their desire for irresponsibility onto others: if you don't want this, don't buy it. I know I won't, but that's because I don't have a situation where it would be relevant (my guns are only loaded at the range and when camping). Just because I wouldn't want this, and it doesn't add any safety benefit in my situation, doesn't mean it shouldn't be around, because it sure as hell isn't hard to think of reasons why people would want this and it would provide safety to themselves and/or others.

However, there are more potential points of failure with the cloud, so a well-maintained in-house solution is generally better, because it eliminates many of the external network points of failure. It also allows for multiple points of on-site failover, as well as off-site failover in the case of an on-site catastrophe.

Have you actually worked in a scientific field, let alone a biological one?

Yes, articles are rarely written with fewer than 5 authors, and almost never with fewer than 3. However, the process of working through everything requires math at most stages; if you don't understand it, you will take a lot longer to complete the project, and someone else will beat you to publication. While you could constantly go and bother your mathematician/statistician, that is still a horribly inefficient use of resources, and will still slow you down.

The biologists should not be math experts any more than the mathematicians should be biology experts. However, both need to dip into the other's field and become at least somewhat experienced to be effective. To suggest that biologists need not even be math literate is idiotic, if you want proper, accurate research that progresses at more than a snail's pace.

While most groups work in teams, some generalization in all members is a good idea. Everyone should have basic mathematical knowledge; otherwise they are going to waste the time of the math and stats experts, as well as their own, with stupid questions and requests (everyone I worked with had that problem with our local math dunce). We'd answer some odd question, the person would go back, do something, ask more questions, and halfway through we'd finally find out what they were spending so much time on, and realize how bass-ackward they were going about it.

To put it in your terms, it'd be like someone doing the 3D programming without knowing much about trig or vector math. There are libraries around that would do the grunt work for them, but they are still going to cause problems. Math and stats are integrated into every aspect of science.
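To make that analogy concrete, here is the kind of trig/vector grunt work a 3D library hides; a toy sketch of my own, not anything from the post:

```python
import math

# The angle between two vectors via the dot product: the kind of thing a
# 3D library does for you, but that you'll misuse sooner or later if you
# don't understand why it works.
def angle_between(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return math.acos(dot / (norm_a * norm_b))

# Perpendicular axes: 90 degrees (up to floating point).
print(math.degrees(angle_between((1, 0, 0), (0, 1, 0))))
```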

However, at the same time you can't specialize in everything, and almost everything is done in a team these days, so having specialists in different topics is better than having everyone be a generalist, so long as you can communicate and work together. Where I am now, a very large number of research teams have statisticians on board for that very reason: a lot of scientists are more familiar with their specific fields than with the nuances and tools of statistics.

Given the comment on town size and bandwidth, I suspect the GP was referring to monthly bandwidth more than dev costs. It's anonymous, so they probably won't have to bother with dev costs. Even assuming cheap VMs and whatnot, the bandwidth for such a site could be an issue and, including the VM to run it, cost $30-$60 a month (I was looking at a relatively cheap Linux/FreeBSD host with 500GB/month). I think the top plan was $100 for 2TB/month. Still not at or beyond the $2000/year number, but not exactly cheap.
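Annualizing the monthly plan prices quoted above shows how they stack up against the $2000/year figure (trivial arithmetic, just for scale):

```python
# Hosting plans mentioned above, annualized. The monthly prices are the
# ones quoted in the comment; nothing else here is from the source.
cheap_low = 30 * 12    # cheap VM + 500GB/month, low end: $360/year
cheap_high = 60 * 12   # same tier, high end: $720/year
top_plan = 100 * 12    # 2TB/month plan: $1200/year

print(cheap_low, cheap_high, top_plan)  # all under $2000/year
```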

Actually, mild/low inflation is important to the economy. It provides an incentive to invest or spend rather than hoard, and that keeps the economy moving. The key is mild/low: if it is too high, then there is uncertainty in how much is appropriate for compensation.

That's where the problem comes in with Bitcoin, as described above: it fluctuates too much, and backing it is a problem. Of course, in a few years, when we are more comfortable with it, it might well become a viable form of currency, but for now it's better described as a volatile commodity.

That, and even if they did have a good metric, I suspect the first few million to billion years would have seen more rapid development.

Simply put, from an evolutionary perspective: the more precisely a genetic material copies itself, the more it will propagate, until you run into the wall of needing to adapt to changing conditions in the environment.

Assuming that exact replication is not trivial, you can conclude that for the initial period of life, mutations would have been more frequent than they are now, and therefore any calculation covering the rate of change would have a negative second derivative with respect to time; possibly something like T + log(T). If T + log(T) were the growth rate of the complexity, then complexity vs. time would look exponential, and the log(T) factor could easily be overlooked if you aren't looking towards the beginning of the process.
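One way to read that last claim (my interpretation, not the poster's exact model): if log(complexity) grows like T + log(T), then C(T) = T * e^T, which is hard to tell apart from a plain exponential except near the start:

```python
import math

# If log(complexity) ~ T + log(T), then C(T) = T * e^T. Note that the
# growth rate T + log(T) has second derivative -1/T**2, which is negative,
# matching the argument above.
def complexity(t):
    return t * math.exp(t)

# The ratio to a plain exponential e^T is just T: a factor that grows so
# slowly compared to e^T that it is easy to overlook at large T.
for t in [1.0, 10.0, 50.0]:
    print(t, complexity(t) / math.exp(t))
```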