
Something is very, very wrong if a state of 3 million people only has 6 CS teachers.

I doubt it's that bad.

What it probably means is that only six teachers have bothered to attain the needed certifications to teach CS in high school there so far, probably because there's been no demand for it.

If demand appeared, there are probably quite a few more teachers who already have the needed skills (perhaps they have a CS degree but are teaching math or science now, or they have a degree in a closely related field that would also work, etc.) but never got certified because there was no call for it -- and they could get certified fairly quickly.

This "only six qualified teachers in the state" sounds scary, but it probably just means that the call has never gone out for qualified teachers before.

To expand on the other post I just made, it's quite interesting the dangers that the R/C hobby has encountered lately.

A few decades ago, young people stopped getting into the hobby largely due to video games and so the average modeller was getting older and older.

R/C sites have always been at risk from encroachment by new neighbors who don't like the noise. The same effect has been steadily wiping out general aviation airports over the last several decades, and it continues.

But then electric planes came, greatly improving the noise situation. Still, fields are always being lost and created.

Then the park flier came... this helped bring the casual flier into the fold, and many youth. It also meant that people were often flying in parks and baseball fields rather than formal fields -- not really a risk to the hobby (but a big risk to the AMA itself, as these fliers don't need the AMA!), but a pretty big change.

But now it's the rise of the FPV plane (though those are still relatively rare) and especially the camera-carrying quadcopter (sometimes semi-autonomous, usually not). These things are bringing all sorts of people to the hobby, interested in flying and photography, but people are all riled up by the idea of these being used to take pictures of them, and so the models are being banned all over the place, laws enacted, etc.

And people fly them in places where models generally weren't flown in the past (to take pictures), then something happens, it's all over the news, and lawmakers have knee-jerk reactions and ban things.

It's a good time for the hobby -- lots of new things to do, new technologies to play with -- but it's a bad time for the hobby, with the hammer coming down and lots of new regulations appearing. The AMA is fighting the good fight, but I think they're going to ultimately lose, and the FAA and local governments will continue to greatly restrict the hobby -- it'll be done in the name of safety, but the reality is that it'll mostly be about preventing photography.

On the bright side, they will probably open some avenues for commercial use of unmanned aircraft, with lots of red tape attached -- which is good, as it wasn't allowed at all before, but the red tape is likely to be as heavy as, or even heavier than, that associated with full-scale manned aircraft.

The AMA bars flight over populated areas, encouraging people to find a cow pasture or something.

The AMA rules (not binding, but they can refuse to pay insurance claims if you violate them) say that you will not fly RC planes "directly over unprotected people, vessels, vehicles or structures". Not quite the same as you put it -- flying in a populated area is fine, as long as you aren't flying directly over people and aren't flying in a careless or reckless manner.

It may seem odd that a private club has effectively been given authority to make law

Again, it has not. The AMA rules are even *less* restrictive than the FAA circular in one way -- the AMA rules say not to fly over 400 feet near an airport without notifying the airport, and the FAA suggestions say not to fly over 400 feet above the surface, period. And note that R/C pilots, especially those flying gliders, fly over 400 feet quite often.

any doctor violating these generally accepted standards is likely to lose any court case.

Now, that part rings true... the AMA safety code is basically the industry standard and if you're sued for hurting somebody, not following those standards will hurt you in court.

And indeed, it seems that whatever new *mandatory* standards the FAA comes up with will be largely influenced by the AMA safety code... but we are not there yet.

So make them larger -- as large as you can get with the tips not quite going supersonic. (The speed of sound on Mars -- probably around 540 mph at ground level -- is a bit lower than it is on Earth thanks to the low temperatures and the mostly-CO2 atmosphere, so that's an even bigger problem.)
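As a rough sketch of that tip-speed limit (assumed values throughout: CO2 with a heat capacity ratio of about 1.29, a daytime surface temperature of roughly 240 K, and a made-up rotor speed of 2400 rpm just to illustrate the sizing constraint):

```python
import math

# Speed of sound in an ideal gas: a = sqrt(gamma * R * T / M)
# Note that pressure drops out -- only temperature and composition matter.
gamma = 1.29        # heat capacity ratio for CO2 (assumed)
R = 8.314           # universal gas constant, J/(mol*K)
T = 240.0           # assumed daytime Mars surface temperature, K
M = 0.04401         # molar mass of CO2, kg/mol

a = math.sqrt(gamma * R * T / M)   # m/s
a_mph = a * 2.23694
print(f"Speed of sound on Mars: {a:.0f} m/s ({a_mph:.0f} mph)")

# Largest rotor radius whose tips stay subsonic at a given rpm
rpm = 2400.0                         # hypothetical rotor speed
omega = rpm * 2 * math.pi / 60       # angular velocity, rad/s
r_max = a / omega                    # tip speed = omega * r
print(f"Max radius at {rpm:.0f} rpm before the tips go supersonic: {r_max:.2f} m")
```

Spin slower and the blades can be bigger; spin faster and they have to shrink -- which is why "make them larger" and "keep the tips subsonic" pull against each other.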

More blades as well -- not just 2, but 3, 4, 5, 6, whatever. There's diminishing returns past two (well, one!) but it can help when you don't mind using a lot more power for a little more thrust.

Go for fatter blades and higher pitches as well -- more diminishing returns, but it could still help.

If you're thinking of a multicopter as I imagine they are, go with more than four propellers. Putting propellers on top of other propellers could help as well, but again... diminishing returns.

It's not trivial, but it should be doable.

Or maybe they could even let the tips go supersonic... it might be less of a problem with such a thin atmosphere than it would be here. I'm not so sure about this.

so simply applying a scaling law like that isn't very accurate.

It's a good "back of the napkin" first order approximation. I'd expect NASA to take everything into consideration, model it exactly, and then actually build it and fly it in a chamber that approximates the atmosphere of Mars.

It doesn't need high performance or duration -- just enough to go almost straight up and pan around and take pictures and then land back and charge up again.

We have the right to a jury trial. The jury has to be impartial. It has to be in the state where the crime was committed. And that's it.

The only way we get a jury "of our peers" is if you consider that the American ideal says that we are all peers, regardless of gender, race, religion, education, experience, etc.

In the case of this specific trial, given that detailed knowledge of the Internet is rare, I imagine that the attorneys involved were asking questions designed to find out whether any potential jurors had a deep understanding of these things. I'm not sure which side would be doing it, but one side or the other would decide that deep knowledge of these things was bad for their case, and since such people are rare, they'd use their peremptory challenges to keep them off the jury.

Without this system, you might have a person or two on the jury who understands such things pretty well. But with the system... such people would have been excluded by one side or the other.

Cheaper way would be a large high altitude jet to carry the rocket to the edge of space.

The problem is, it's not really cheaper. Fuel is cheap; large high-altitude jets aren't.

More to the point, the high altitude jet doesn't help much.

Let's suppose we need to send something to the ISS. The ISS averages around 260 miles above sea level and orbits at about 17,000 mph.

So, our plane takes off at the equator and flies at 700 mph up to 11 miles (60,000 feet) above the ground. We launch rockets near the equator and to the East if possible to take advantage of the 1000 mph rotational velocity and our plane should do so as well -- so that means we need 16,000 mph more speed.

So, our high altitude high speed jet has provided 1/23rd of the speed and 1/23rd of the altitude needed to reach the ISS, and our rocket needs to provide the rest. (The fact that both worked out to 1/23 is just a coincidence.)

I haven't worked all of this out exactly, but it looks like putting your rocket on a plane and taking it up to 60,000 feet at 700 mph before launching saves less than 1% of the total energy needed to get to the ISS -- so it sounds good, but in practice it makes a lot more sense to just make your rocket a little bigger and launch from the ground.
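The back-of-the-napkin version of that claim can be worked through directly (per-kilogram energies, treating g as constant and ignoring drag and gravity losses, so this understates neither side much):

```python
MPH_TO_MS = 0.44704
MILES_TO_M = 1609.34
g = 9.81                           # m/s^2, treated as constant (approximation)

# What reaching the ISS requires (after the ~1000 mph from Earth's rotation)
v_needed = 16000 * MPH_TO_MS       # m/s
h_needed = 260 * MILES_TO_M        # m

# What the carrier jet contributes
v_plane = 700 * MPH_TO_MS
h_plane = 11 * MILES_TO_M          # 60,000 feet

speed_frac = v_plane / v_needed    # ~1/23
alt_frac = h_plane / h_needed      # ~1/23 as well (coincidence)

# Energy per kg: kinetic + potential
e_needed = 0.5 * v_needed**2 + g * h_needed
e_plane = 0.5 * v_plane**2 + g * h_plane
energy_frac = e_plane / e_needed

print(f"speed fraction:    1/{1 / speed_frac:.0f}")
print(f"altitude fraction: 1/{1 / alt_frac:.0f}")
print(f"energy saved:      {energy_frac:.1%}")
```

Because kinetic energy goes with the square of velocity, contributing 1/23rd of the speed buys only about 1/500th of the kinetic energy -- which is why the total saving comes out under 1%.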

The reason they use older laptops is not because of the density of the chips but simply because they're known commodities -- any quirks they have have already been figured out and they get the job done. Getting anything certified (for mission critical purposes) is a very time consuming process, and once it's done... the item is no longer state of the art, that's just the nature of the beast.

The Raspberry Pis don't have to go through the same certification process, though of course if they were expected to only work "for eight seconds" I think NASA would have told the people sending them up to pick something older. I'm guessing that NASA knows a bit about the radiation environment up there and advises people who send up experiments accordingly.

And as others have said... humans are living in the same environment for months at a time -- it can't be *that* bad.

The ISS is well below the Van Allen radiation belts and well within the Earth's magnetic field (which deflects many of the charged particles headed towards the Earth), so the level of cosmic radiation it gets is not *that* high, and the metal of the ISS blocks most of that.

And if a Raspberry Pi does get its registers corrupted by cosmic rays... it's not a tragedy. Nobody dies -- it's not mission critical.

In any event, they use pretty standard (but old -- last I heard, they still ran Windows 95) laptops on the ISS and they work fine. It would be interesting to know how much more often they experience failures and errors on the ISS due to radiation compared to how much they experience here, but I don't know if anybody has measured that. (My guess is that NASA has, though I wouldn't know where to look for the data.)

If you have a single process that needs to use more than about 2 GB of memory (in practice, 1.6-2.0 GB)... you need the 64-bit version. And on top of that, if you've got 4 GB of memory, a 32-bit OS can only use about 3 GB (total) due to the way Windows maps devices into the 32-bit address space.
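The arithmetic behind both limits, as a sketch (the ~2 GB figure is the default 32-bit Windows user/kernel address-space split; the ~3 GB of usable RAM comes from device mappings eating into the 4 GB physical address space -- the 1 GB of mappings below is a hypothetical figure, as the real amount varies by machine):

```python
GB = 2**30

# Per-process limit: a 32-bit pointer can only address 4 GiB,
# and Windows reserves half of that for the kernel by default.
address_space = 2**32
kernel_reserved = 2 * GB
user_space = address_space - kernel_reserved
print(f"Per-process user address space: {user_space / GB:.1f} GiB")

# System limit: with 4 GiB of RAM installed, device mappings
# (video memory, PCI devices, firmware) occupy part of the 32-bit
# physical address space, so a 32-bit OS sees only ~3 GiB of RAM.
ram_installed = 4 * GB
device_mappings = 1 * GB      # hypothetical; varies by machine
usable_ram = ram_installed - device_mappings
print(f"Usable RAM on 32-bit Windows: {usable_ram / GB:.1f} GiB")
```

(The gap between the 2 GiB theoretical user space and the 1.6-2.0 GB seen in practice is fragmentation and the process's own code and DLLs taking up part of the space.)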

I've used lots of multi-user linux boxes over the years and never noticed that a few bad users ruined the experience for everybody else.

I did... but that was 25 years ago in college, when hardware was scarce (we had 1 MB disk quotas!) and the computer system was used to do all sorts of things that people just couldn't do from their own personal computers (e.g. accessing mail, news or the Internet).

Users policed each other back then to a degree, but there wasn't much you could do to make a bad user behave unless the sysadmins backed you, and they'd only back you if the user had explicitly broken the rules. And often you didn't even know who a user was -- if they sat at a console you'd know, but if they dialed in you might only know their user name, which often gave no clue who they really were. (The sysadmins knew, but they wouldn't share.)

But now... most of the things that caused problems can be done from anybody's own computer, or from a PC down in a lab somewhere. True multiuser systems are kind of rare nowadays, and most users probably don't deal with them, whereas back then we had little choice.